Library for event-based vision
Clone the repo:

```
git clone https://gitlab.iit.it/gscarpellini/event-library
```

Install `event_library`:

```
cd event-library
python setup.py install
```
If you want to contribute to `event-library`, we suggest creating a virtualenv first. You can use `pipenv` for that:

```
git clone https://gitlab.iit.it/gscarpellini/event-library
cd event-library
pipenv install
pre-commit install
```
- Start by opening an issue on GitLab describing the feature you're going to implement or the bug you want to fix.
- Fork the project and start coding! 🔥
- We use `pre-commit` to check that code respects PEP8 and good practices.
- Submit your pull request to the main repository :)
To build the documentation:

```
python -m pip install .
python -m pip install -r requirements/docs.txt
sphinx-build -b html docs/source docs/build
```
```
python scripts/generate.py frames_dir={INPUT_DIR} output_dir={OUTPUT_DIR} upsample=true emulate=true representation=voxel
```
Tree:

```
+-- inputdir
|   +-- videodir1
|
+-- outputdir
|   +-- videodir1
|       +-- part_0
```
```
python scripts/generate.py frames_dir={FRAME_VIDEO_DIR} output_dir={OUTPUT_DIR} upsample=true extract=false emulate=true representation=voxel
```
Each video has an `imgs` directory, where you put the set of frames. Create an `fps.txt` file in each video directory where you specify the frame rate of the video as a single integer (e.g., 30).
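For example, the layout described above can be prepared like this (the directory names are placeholders):

```shell
# Create one video directory with its frames folder and frame-rate file
mkdir -p inputdir/videodir1/imgs
# fps.txt holds the frame rate as a single integer
echo 30 > inputdir/videodir1/fps.txt
```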
Tree:

```
+-- inputdir
|   +-- videodir1
|       +-- imgs
|       +-- fps.txt
+-- outputdir
|   +-- videodir1
|       +-- part_0
```
You can visualize an `npy` events file using the `visualize` script:

```
python scripts/visualize.py file_path={YOUR_FILE.npy} representation={REPRESENTATION}
```
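You can also inspect a generated events file directly with NumPy. Note that the 4-column `(x, y, timestamp, polarity)` layout and the sensor resolution used below are assumptions for illustration only — check the layout of the files `generate.py` actually produces:

```python
import numpy as np

# Build a synthetic events array for demonstration.
# Assumed columns: x, y, timestamp, polarity -- verify against real files.
rng = np.random.default_rng(0)
n = 1000
events = np.column_stack([
    rng.integers(0, 346, n),           # x coordinate (hypothetical width)
    rng.integers(0, 260, n),           # y coordinate (hypothetical height)
    np.sort(rng.uniform(0.0, 1.0, n)), # timestamps, sorted
    rng.choice([-1, 1], n),            # polarity
])
np.save("events_demo.npy", events)

# Loading works the same way for files produced by the library.
loaded = np.load("events_demo.npy")
print(loaded.shape)  # (1000, 4)
```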
You can get help for any tool with `python {TOOL}.py --help`.

Supported representations:

- voxelgrid
- constant-count
- raw
- `upsample`: if true, upsample frames to a higher fps using the SuperSloMo model
- `emulate`: if true, create the output events files as `npy` using the chosen `representation` strategy
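For intuition, the voxel-grid representation accumulates event polarities into a fixed number of temporal bins. The sketch below is a simplified version of this common technique, not necessarily the library's exact implementation (which may, e.g., use bilinear temporal interpolation instead of hard binning):

```python
import numpy as np

def events_to_voxel_grid(events, num_bins, height, width):
    """Accumulate events (x, y, t, polarity) into a (num_bins, H, W) grid.

    Simplified sketch: each event's polarity is added to the temporal bin
    containing its (normalized) timestamp.
    """
    voxel = np.zeros((num_bins, height, width), dtype=np.float32)
    x = events[:, 0].astype(int)
    y = events[:, 1].astype(int)
    t = events[:, 2]
    p = events[:, 3].astype(np.float32)
    # Normalize timestamps to [0, num_bins - 1] and take the containing bin.
    t_norm = (t - t.min()) / max(t.max() - t.min(), 1e-9) * (num_bins - 1)
    b = t_norm.astype(int)
    # Scatter-add polarities; np.add.at handles repeated (bin, y, x) indices.
    np.add.at(voxel, (b, y, x), p)
    return voxel

# Tiny usage example with three events.
events = np.array([
    [0, 0, 0.0, 1],
    [1, 1, 0.5, -1],
    [2, 2, 1.0, 1],
])
grid = events_to_voxel_grid(events, num_bins=3, height=4, width=4)
print(grid.shape)  # (3, 4, 4)
```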