EyeGestures is an open-source eye-tracking library that uses native webcams and phone cameras. The aim of the library is to make eye-tracking and eye-driven interfaces accessible without requiring expensive hardware.
Our Mission!
Important
EyeGestures is a fully volunteer-based project and exists thanks to your donations and support.
We are looking for business partnerships and sponsors!
For enterprises that want to avoid GPL-3 licensing, a commercial license is available!
We offer custom integration and managed services. For businesses requiring invoices, message us at [email protected].
Sponsor us and we can add your link, banner or other promo materials!
- EyePilot
- EyePather
- EyeFocus <- 4th best product on Product Hunt!
- Add your project! Email [email protected] or open a PR.
Subscribe and get access to our software:
python3 -m pip install eyeGestures
Warning
Some users report that mediapipe, scikit-learn, or opencv does not install automatically alongside eyegestures. To fix this, install the missing packages directly with pip.
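For example (assuming the standard PyPI package names; note that opencv is published on PyPI as opencv-python):
python3 -m pip install mediapipe scikit-learn opencv-python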
The tracker works best when your camera or laptop is at arm's length, similar to how you would typically use it. If you are further away, it may be less responsive for now - we are currently working on solving this issue.
python3 examples/simple_example_v2.py
python3 examples/simple_example.py [legacy tracker, will become obsolete]
from eyeGestures.utils import VideoCapture
from eyeGestures.eyegestures import EyeGestures_v2

# Initialize gesture engine and video capture
gestures = EyeGestures_v2()
cap = VideoCapture(0)

calibrate = True
screen_width = 500
screen_height = 500

# Process each frame
while True:
    ret, frame = cap.read()

    event, cevent = gestures.step(frame,
                                  calibrate,
                                  screen_width,
                                  screen_height,
                                  context="my_context")
    if event:
        cursor_x, cursor_y = event.point[0], event.point[1]
        fixation = event.fixation
        # calibration_radius: radius for data collection during calibration
You can customize your calibration points/map to fit your solution. Simply copy the snippet below and place your calibration points on the x,y plane, with coordinates from 0.0 to 1.0. The map will then be automatically scaled to your display.
gestures = EyeGestures_v2()
gestures.uploadCalibrationMap([[0,0],[0,1],[1,0],[1,1]])
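If you need more points, you can also generate the map programmatically. A minimal sketch, assuming uploadCalibrationMap accepts any list of normalized [x, y] pairs:

import numpy as np

from eyeGestures.eyegestures import EyeGestures_v2

gestures = EyeGestures_v2()

# Sketch: build a 5x5 grid of calibration points in normalized (0.0-1.0) coordinates
xx, yy = np.meshgrid(np.linspace(0.0, 1.0, 5), np.linspace(0.0, 1.0, 5))
calibration_map = np.column_stack([xx.ravel(), yy.ravel()]).tolist()

gestures.uploadCalibrationMap(calibration_map)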
V2 is a two-stage tracker. It runs V1 under the hood, uses it as a feature extractor for V2's machine-learning component, and combines both outputs to generate a new gaze point. You can control how much V1 affects V2 with:
gestures.setClassicImpact(N) # setting N = 2 works best in my testing
This averages the sample obtained from V2 with N copies of the sample from V1 (the same sample duplicated that many times). As a result, V2 contributes 1/(N+1) of the output and V1 contributes N/(N+1). For example, with N = 2, V2 contributes 1/3 and V1 contributes 2/3.
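As a quick standalone illustration of that weighting (not library code; the gaze points here are made up):

# Sketch: blend a V2 sample with N copies of the V1 sample (hypothetical points)
N = 2
v1_point = (100.0, 100.0)  # gaze estimate from V1
v2_point = (130.0, 160.0)  # gaze estimate from V2

blended = (
    (v2_point[0] + N * v1_point[0]) / (N + 1),
    (v2_point[1] + N * v1_point[1]) / (N + 1),
)
print(blended)  # (110.0, 120.0): V2 contributes 1/3, V1 contributes 2/3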
It is also worth knowing that you can enable hidden calibration for V1 (the same calibration as when using only V1, but now invisible to the user):
gestures.enableCNCalib()
from eyeGestures.utils import VideoCapture
from eyeGestures.eyegestures import EyeGestures_v1

# Initialize gesture engine with RoI parameters
gestures = EyeGestures_v1()

cap = VideoCapture(0)
ret, frame = cap.read()

calibrate = True
screen_width = 500
screen_height = 500

# Obtain estimations from camera frames
event, cevent = gestures.estimate(
    frame,
    "main",
    calibrate, # set calibration - switch to False to stop calibration
    screen_width,
    screen_height,
    0, 0, 0.8, 10
)

if event:
    cursor_x, cursor_y = event.point[0], event.point[1]
    fixation = event.fixation
    # calibration_radius: radius for data collection during calibration
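In practice, estimate is called once per camera frame, as in the V2 example. A minimal loop sketch reusing the setup above:

# Sketch: run V1 estimation frame by frame (reuses gestures/cap from above)
while True:
    ret, frame = cap.read()
    if not ret:
        break

    event, cevent = gestures.estimate(
        frame,
        "main",
        calibrate,
        screen_width,
        screen_height,
        0, 0, 0.8, 10
    )
    if event:
        cursor_x, cursor_y = event.point[0], event.point[1]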
Feel free to copy and paste the relevant code snippets for your project.
If you are building a publicly available product and have no commercial license, please mention us somewhere in your interface.
- RSS
- discord
- daily.dev
- email: [email protected]
We will be extremely grateful for your support: it helps keep the server running and fuels my brain with coffee.
Support the project on Polar (if you want to help, we provide access to alpha versions and premium content!):