This project has received funding from the European Union’s Horizon 2020 research and innovation programme under the Marie Sklodowska-Curie grant agreement No 101030275
First, clone this repository to your PC by running the following in your terminal (Python 3.9 recommended):
git clone https://github.com/MarnixMeersman/DocumeNDT_dataprocessing
Then install all required libraries using the command below. (If prophet gives you errors, don't worry: it is only needed for machine-learning time-series predictions and is not vital for running the rest.)
pip install -r requirements.txt
Then proceed with reading this README. Have fun!
The script requires you to upload MATLAB ".mat" files into the ./raw_data folder. Make sure each filename is only a code number, e.g. "11.mat", as the script searches for these particular names while processing. The second step is to open main.py and adjust the parameters at the top of the file to your liking; the locations list you define there should correspond to the files you just uploaded. Parameters to adjust include the following (see the sketch after this list):
- location IDs
- number of stickers (the program assumes 10 laser points per sticker)
- signal-to-noise ratio (SNR) threshold to disregard bad-quality readings
- variance threshold for when to trigger T1
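For illustration, here is a minimal sketch of what this parameter block at the top of main.py might look like. The variable names and values are assumptions, not the repository's actual code:

```python
# Hypothetical parameter block for the top of main.py.
# The actual variable names in the repository may differ.
locations = [11, 12, 13]    # location IDs; must match 11.mat, 12.mat, 13.mat in ./raw_data
n_stickers = 4              # the program assumes 10 laser points per sticker
snr_threshold = 5.0         # readings below this signal-to-noise ratio are discarded
variance_threshold = 0.01   # variance level at which T1 is triggered
make_plots = True           # write figures to ./plots (almost doubles the runtime)
```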
The output of main.py updates ./results/time_differences.csv. velocities.csv is not generated automatically: you should calculate it yourself, e.g. in an Excel file, from the time differences and the known locations of emission and reception. If make_plots = True, you can find the results under ./plots; be careful, as this almost doubles the processing time (a short runtime is around 5 minutes). A visual overview of main.py is shown further below.
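If you prefer scripting over Excel, here is a minimal sketch of the velocity calculation. It assumes time_differences.csv holds the travel times in a column named time_difference, in seconds; the column name, units and coordinates are all assumptions, not the repository's actual schema:

```python
import numpy as np
import pandas as pd

# Sketch only: the column name, units and coordinates below are assumptions.
df = pd.read_csv("results/time_differences.csv")

emission = np.array([0.00, 0.00, 0.00])    # known emission location [m]
reception = np.array([0.25, 0.10, 0.05])   # known reception location [m]
distance = np.linalg.norm(reception - emission)

df["velocity"] = distance / df["time_difference"]   # v = d / Δt  [m/s]
df.to_csv("results/velocities.csv", index=False)
```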
To manually check any of the waveforms, use manual_checker.py as shown below. Adjustments to T0 or T1 can be made through the GUI, and clicking on the legend lets you show and hide the different transformations and signals.
To visualize your results, go to the ./visualization folder. Make sure you first update your ./visualization/emission_reception_velocities folder, since the visualization uses only these values. This folder must be UPDATED MANUALLY. Below is a video showing how to use the interface:
!!! In addition, for 3D visualizations, please upload a DOWNSAMPLED .obj file of the object. The scaling might sometimes be off; try exporting in millimetres instead of metres. You can shift and scale the .obj file by playing with the scaling and shifting operators for each axis in lines 62, 63 and 64 of visualisation_function.py. !!!
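As an illustration, the per-axis scale-and-shift in those lines could look something like the sketch below. The function and variable names are assumptions, not the repository's code:

```python
import numpy as np

def scale_and_shift(vertices, scale=(1.0, 1.0, 1.0), shift=(0.0, 0.0, 0.0)):
    """Apply a per-axis scale and shift to an (N, 3) array of .obj vertices."""
    v = np.asarray(vertices, dtype=float)
    return v * np.asarray(scale) + np.asarray(shift)

# Example: convert metre-based coordinates to millimetres and recentre along X.
verts = np.array([[0.0, 0.0, 0.0], [0.1, 0.2, 0.3]])
verts_mm = scale_and_shift(verts, scale=(1000, 1000, 1000), shift=(-50.0, 0.0, 0.0))
```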
Allows you to take sections of the interpolated volume using the slicers in X, Y and Z. The button in the top left allows for different color mappings. The histogram represents the distribution over the WHOLE volume; therefore it doesn't change when playing with the sliders.
It gives you an interactive way to see, side by side, the interpolated volume and the three-dimensional object containing the rays that pass through your volume (requires an .obj file).
DocumeNDT_dataprocessing is available under the CC0 license. See the LICENSE file for more info.