
Releases: pupil-labs/pupil

Pupil Capture, Service and Player release

27 Sep 14:04

This release incorporates our latest round of improvements and bugfixes.

  • Added support for macOS 10.12 "Sierra".
  • New fixation detector.
  • Improved user interface for the Pupil Remote settings.
  • Implemented z-ordering of menus in Pupil Player.
  • Added icons for thumb elements in Pupil Player.
  • Added a realtime fixation detector.
  • Minor fixes.

We hope you find these new features useful and look forward to feedback!

Best,
The Pupil Dev Team

Pupil Capture, Player, and Service release

29 Aug 19:40

This release fixes a few issues and improves the binocular 3d gaze mapper.

Pupil Capture and Player Hotfix release

13 Jul 14:14

This release fixes a few bugs of the v0.8 release in Pupil Capture.

  • The surface tracker did not work with the IPC.
  • Inverted markers did not work when using robust detection.
  • Shutdown of the IPC has been improved.
  • Pupil Player exports work again (this was broken on macOS only).

Pupil Capture, Player, and Service Release

13 Jul 13:56

We are happy to release v0.8 of the Pupil platform!

Jump to Downloads

Introduction of the IPC Backbone

To allow realtime interaction with other actors on the same machine and across machine boundaries we have completely rewritten the way Pupil exchanges data internally.

From now on all exchanged data (pupil data, control messages, log messages) flow through a message bus we call the IPC Backbone. All messaging has been unified and is now sent using the ZMQ PUBSUB pattern. One of the greatest features of the IPC backbone is that it can be accessed from outside Pupil Capture and across the network. As an outside actor you can write to the IPC Backbone and subscribe to (parts of) the message stream.

Message Format

Messages are defined as multi-frame zmq messages. Some message topics we are using as of now are `logging`, `pupil`, `gaze`, `notify`, and `delayed_notify`. More info can be found here: Pupil-Interprocess-and-Network-Communication
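As a rough illustration of the multi-frame layout, a message can be thought of as a topic frame followed by a serialized payload frame. The sketch below uses json as a dependency-free stand-in for msgpack, and the topic and payload fields shown are illustrative, not a definitive schema:

```python
import json

def make_message(topic, payload):
    """Build a multi-frame message: frame 0 carries the topic as bytes,
    frame 1 carries the serialized payload (json stands in for msgpack)."""
    return [topic.encode(), json.dumps(payload).encode()]

def parse_message(frames):
    """Recover the topic string and payload dict from raw frames."""
    return frames[0].decode(), json.loads(frames[1].decode())

frames = make_message("pupil", {"timestamp": 1.23, "confidence": 0.9})
topic, payload = parse_message(frames)
```

Keeping the topic in its own frame is what lets subscribers filter messages before deserializing the payload.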

Tapping into the IPC from outside

Any external actor (on the same box or remote) can now tap into the IPC Backbone and subscribe to all (or parts) of the message stream. It can also send messages that will be received by all subscribers in the app.
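ZMQ SUB sockets filter by topic prefix, so a subscription to `pupil` also delivers subtopics such as `pupil.0`. A minimal, dependency-free sketch of that prefix semantics (the real transport uses pyzmq SUB sockets; `delivered` and the example topics here are illustrative):

```python
def delivered(subscriptions, topic):
    """Mimic ZMQ SUB filtering: a message is delivered if its topic
    starts with any active subscription prefix."""
    return any(topic.startswith(prefix) for prefix in subscriptions)

subs = {"pupil", "notify"}
hit = delivered(subs, "pupil.0")    # matched by the "pupil" prefix
miss = delivered(subs, "gaze")      # no subscription covers "gaze"
```

With pyzmq, the equivalent filter is set via `socket.setsockopt_string(zmq.SUBSCRIBE, "pupil")`; an empty prefix subscribes to the whole stream.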

Deprecation of Pupil Server

Since the IPC Backbone is basically Pupil Server on steroids, we no longer need Pupil Server in Pupil Capture, and it has been removed from the app.

This also means that the Pupil Server data format is deprecated. The IPC Backbone gives you similar data; the only difference is that gaze, surface, and pupil data are no longer grouped. All pupil helper scripts have been updated accordingly.

Gaze Mapping

All gaze mappers have been restructured to handle temporal correlation of pupil data explicitly. Two base classes for mono and binocular mappers have been added.

Plugin API changes

Plugins will continue working as before. There are no plugin API changes except for delayed notifications. See plugin.py for details.

Plugin notifications are now published on the IPC Backbone. Note that all notification subjects have changed!

More structure to Notification names

Notification subjects are grouped by category: `category.command_or_statement`. Example: `recording.should_stop`.
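A plugin that wants to dispatch on notifications can split the subject at the first dot to recover the category. A small sketch (the helper name is hypothetical):

```python
def parse_subject(subject):
    """Split a notification subject of the form
    'category.command_or_statement' into its two parts."""
    category, _, command = subject.partition(".")
    return category, command

category, command = parse_subject("recording.should_stop")
```

Splitting only at the first dot keeps any dots inside the command part intact.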

New Dependencies

We use msgpack for message serialization:

pip install msgpack-python

New App: Pupil Service

We have added a third app to the Pupil project: Pupil Service. It is like Pupil Capture, except it has no world video feed and no GUI.

It is intended for AR and VR eye tracking setups: Pupil Service runs in the background and is controlled via network commands only.

The tools introduced in the HMD-eyes project are made to work with Pupil Service and Pupil Capture alike.

Bugfixes and Enhancements

  • Tweaked USB camera bandwidth settings.
  • Made eye window ROI interaction more intuitive.
  • New marker design for screen marker calibration.
  • Renamed the `surfaces` event to `surface`.
  • Added 3d location data to surface events.
  • Added a jump-to-next-fixation button in Pupil Player.
  • Added an option to use inverted markers in the realtime surface tracker.

We hope you find these new features useful and look forward to feedback!

Best,
The Pupil Dev Team

Pupil Capture and Pupil Player release

11 May 09:51

We are happy to release v0.7.6 of the Pupil platform!

Jump to Downloads

New Features

This is a minor release, it reflects the latest changes and bug fixes made to the pupil source code since v0.7.5.

Calibration Marker Detection

The detector has been improved to work with smaller markers like the one below

[image: manual_calibration_marker-01]

The simple marker above, printed with a diameter of 40mm, can be used at distances of 2m with current Pupil hardware. The old marker is still supported and will continue working.
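As a rough back-of-the-envelope check (not from the release notes), the apparent angular size of a flat marker seen face-on follows from simple trigonometry; a 40 mm marker at 2 m subtends a bit over one degree:

```python
import math

def angular_diameter_deg(size_m, distance_m):
    """Apparent angular diameter of a flat target viewed face-on:
    2 * atan(size / (2 * distance)), converted to degrees."""
    return math.degrees(2 * math.atan(size_m / (2 * distance_m)))

angle = angular_diameter_deg(0.04, 2.0)  # 40 mm marker at 2 m distance
```

The improved detector therefore has to find a feature that covers only a degree or so of the world camera's field of view.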

Capture Robustification

  • The pupil `video_backend` now tries to restart the capture device after all attempts to read from the device have failed.
  • The world process will fall back to a fake capture if all attempts to restart and recover communication with the camera have failed. This behavior provides better feedback to the user than a shutdown of the capture program.
  • The world process will stop recording when the fallback to fake capture occurs.

Revived Plugin: Pupil Remote

Improved Logging

  • All Pupil Capture logging is now globally received by the world process, displayed in the world window, and saved to `capture.log`.
  • New plugin: Log History shows the last 50 log messages in the world window.

We hope you find these new features useful and look forward to feedback!

Best,
The Pupil Dev Team

Pupil Capture and Player release

19 Apr 16:16

We are happy to release v0.7.5 of the Pupil platform!

Jump to Downloads

New Features

This is a minor release, it reflects the latest changes and bug fixes made to the pupil source code since v0.7.4.

Surface Tracker

We have improved the surface tracker.

  • Renamed marker detector to surface tracker.
  • 'min_markers_perimeter' is settable in the offline surface tracker.
  • Individual markers can now be added and removed from surfaces.
  • The UI has been improved.
  • A few bugs in the marker detector have been fixed.
  • The surface tracker now compensates for lens distortion.

3D gaze mapping data

3D gaze mappers now report 3d gaze data.

Performance improvements

Thanks to @mikeboers' work, a critical memory leak in PyAV has been fixed: Pupil Player and Pupil Capture now use considerably less memory on multiprocessor Linux machines.

Small improvements and bug fixes

  • Improved notifications for recording and calibration.
  • Small tweaks and bug fixes.

We hope you find these new features useful and look forward to feedback!

Best,
The Pupil Dev Team

Pupil Capture and Pupil Player Release

24 Mar 08:13

We are happy to release v0.7 of the Pupil platform! These notes cover the new features introduced in v0.7 and v0.7.3, and serve as the official v0.7 release notes.

Jump to Downloads

Contributors

We want to thank everybody that has contributed! Be it code, bug reports, feature requests, or feedback! Pupil would not be possible without you! Check out the list of contributors here.

New features

Pupil Detection

We ported the existing detector to C++ and improved its performance. The detector CPU load has been cut in half!
v0.7 introduces our new 3d model based pupil detector. We call it the 3d detector and have renamed the old detector the 2d detector.

3d Pupil Detector

The old pupil detector and gaze mapper hit the limits of regression based calibration: it assumed the headset was "screwed to the head". When the headset moves, the gaze mapping shifts. This is usually referred to as drift.
Additionally, a model-less search for the ellipse yields poor results when the pupil is partially obstructed by reflections, eyelashes, and/or eyelids.

With the use of a 3d model of the eye and a pinhole model of the camera, based on Swirski’s work in “A fully-automatic, temporal approach to single camera, glint-free 3d eye model fitting” (PETMEI 2013), we model the eyeball as a sphere and the pupil as a disk on the sphere. The sphere size is based on an average human eyeball diameter of 24mm. The state of the model is the position of the sphere in eye camera space and two rotation vectors that describe the location of the pupil on the sphere.
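The model state described above (sphere position plus two rotation angles) can be sketched with basic spherical coordinates. This is an illustration of the parameterization, not the actual detector code; the function name and angle convention are assumptions:

```python
import math

EYEBALL_RADIUS_MM = 12.0  # half of the assumed 24 mm eyeball diameter

def pupil_center(sphere_center, theta, phi, radius=EYEBALL_RADIUS_MM):
    """Place the pupil disk's center on the eyeball sphere's surface,
    given the sphere center in eye camera space and two rotation
    angles (spherical coordinates)."""
    cx, cy, cz = sphere_center
    return (cx + radius * math.sin(theta) * math.cos(phi),
            cy + radius * math.sin(theta) * math.sin(phi),
            cz + radius * math.cos(theta))

# With both angles at zero, the pupil sits on the sphere's +z pole.
p = pupil_center((0.0, 0.0, 50.0), 0.0, 0.0)
```

The key point is that only five numbers (three for the sphere center, two angles) describe every admissible pupil pose, which is what makes temporal fitting tractable.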

Using temporal constraints and competing eye models we can detect and compensate for slippage events when 2d pupil evidence is strong. In case of weak 2d evidence we can use constraints from existing models to robustly fit pupils with considerably less evidence than before.

Using the 3d model we can now compute some exciting and very useful properties from the eye video footage:

  • 3d pupil normal: The normal vector of the pupil in relation to the 3d location of the eye camera.
  • 3d pupil diameter: Using the 3d model we correct the pupil diameter for perspective scale and transform it to millimeter (mm) space, using the assumption that the eyeball has a diameter of 24mm.
  • 3d sphere location: The location of the eyeball in relation to the eye camera.
  • Augmented 2d detector confidence: Using a strong 3d model we have a better handle on false positives.
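The perspective-scale correction for the pupil diameter can be illustrated with the pinhole relation: an object of true size S at depth z projects to S · f / z pixels. Inverting that recovers millimetres. This sketch uses hypothetical values and is not the detector's actual code path:

```python
def pupil_diameter_mm(diameter_px, depth_mm, focal_length_px):
    """Invert the pinhole projection (size_px = size_mm * f / z)
    to recover a metric pupil diameter from its pixel diameter,
    the pupil's depth, and the camera focal length in pixels."""
    return diameter_px * depth_mm / focal_length_px

# Hypothetical numbers: a 40 px pupil at 60 mm depth, f = 600 px.
d = pupil_diameter_mm(40.0, 60.0, 600.0)
```

Without the 3d model there is no depth estimate, which is why the 2d detector could only report pupil size in pixels.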

This new 3d pupil detector still has areas that we will continue to work on, but we feel that it already outperforms the 2d detector and recommend giving it a try!

3d Gaze Mapper & Calibration

With a 3d location of the eye and 3d vectors of gaze we don’t have to rely on a generic polynomial or similar for gaze mapping.

Instead we use a geometric gaze mapping approach: we model the world camera as a pinhole camera with distortion, and project pupil line-of-sight vectors onto the world image. For this we need to know the rotation and translation between the world and eye cameras. This rigid transformation is obtained in a 5 point calibration routine. Under the hood we use a custom bundle adjustment routine.
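The projection step above can be sketched with the standard pinhole equations, ignoring lens distortion for brevity. The function and the camera intrinsics used here are illustrative, not the actual mapper implementation:

```python
def project(point_3d, focal_px, center_px):
    """Project a 3d point (in world camera coordinates, z forward)
    onto the image plane of an ideal, distortion-free pinhole camera:
    u = fx * x / z + cx, v = fy * y / z + cy."""
    x, y, z = point_3d
    fx, fy = focal_px
    cx, cy = center_px
    return (fx * x / z + cx, fy * y / z + cy)

# A gaze direction transformed into world camera space, then projected.
uv = project((0.1, 0.0, 1.0), (600.0, 600.0), (640.0, 360.0))
```

In practice the gaze ray has no single depth, so a point along the ray is chosen (or the ray is intersected with the scene) before projecting.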

Since we are now detecting and modeling the location of the eyeball we are taking movement of the headset into account. This means that the 3d model in combination with the geometric gaze mapper can compensate for slippage. You can take off the headset and put it back on without having to re-calibrate! We are still working on this feature, and you can expect this to become better over time.

For robust 3d pupil detection and fast slippage compensation we found that detecting the eye(s) at 90 Hz or greater is very helpful.

Restructured Pupil Server & Updated pupil-helper scripts

  • Pupil Server now sends JSON data that conforms to a standard, topic-structured format.
  • Every subject in the events, like pupil and gaze data, is broadcast as a separate topic.
  • Motivated by @zenithlight’s pull request: #326
  • See the updated (and much easier to use) pupil helper scripts: https://github.com/pupil-labs/pupil-helpers

Introducing Plugin Notifications

This scheme allows plugins to notify their in-app environment and, when using Pupil Sync, other programs and machines.

  • This simplifies developing more intertwined behaviour while keeping plugins isolated.
  • Plugin notifications that carry the `record` flag are saved during recording and can be retrieved.

Pupil Sync

This is not a new feature but we added some cool new stuff!

Full Support for Binocular Operation

  • Start and stop left and right eye during runtime.
  • 2d and 3d gaze mappers support binocular mapping.

Better Visual Feedback in Pupil Capture

Pupil and Gaze confidence is represented as transparency in the gaze dot in the world window and the pupil dot in the eye window(s). More opaque dot = higher confidence.

Changes to the Recording Format

We have deprecated the `pupil_positions.npy` and `gaze_positions.npy` files and now record all events and notifications that happen during recording into the `pupil_data` file. `pupil_data` is a pickled Python dictionary. All raw data can be easily exported as .csv from Pupil Player 👇
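Since `pupil_data` is a pickled Python dictionary, it can be read back with the standard library alone. The sketch below writes and reloads a stand-in file; the dictionary keys shown are assumptions for illustration, and the exact contents depend on the recording:

```python
import os
import pickle
import tempfile

# Hypothetical stand-in for a recording's pupil_data contents.
data = {"pupil_positions": [], "gaze_positions": [], "notifications": []}

path = os.path.join(tempfile.mkdtemp(), "pupil_data")
with open(path, "wb") as f:      # write the dict as Pupil Capture would
    pickle.dump(data, f)

with open(path, "rb") as f:      # read it back as Pupil Player would
    loaded = pickle.load(f)
```

Note that unpickling executes arbitrary code from the file, so only load `pupil_data` files from recordings you trust.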

Exporting Data

We have restructured export logic in Pupil Player.

  • Now you can press `e` or click the `e` button in the UI, and all open plugins that have export capability will export.
  • All exports are separated from your raw data and contained in the exports directory. The exports directory will live within your recording folder. Structural change prompted by @cpicanco's issue.
  • We have added an exporter in Pupil Player that will give you access to all raw data that was acquired during the recording.

Bugfixes and Small improvements

  • GLFW macOS fixes for render issues and crashes
  • Player GUI design is more consistent
  • Various pyglui fixes and improvements
  • Audio modes are settable
  • Camera default values for 120fps capture
  • Fixed bugs in libuvc and pyuvc that appeared when using Logitech C525/B525 cameras.
  • Fixed bug in Pupil Player that prevented exports when world.wav file was present.

Licence Change

On 2016-12-15 we transitioned to a new license. Pupil is now licensed under LGPLv3. For more information see the License page on the wiki.

We hope you find these new features useful and look forward to feedback!

Best,
The Pupil Dev Team

Pupil Capture Bugfix release

26 Jan 13:11

This release fixes a bug that crippled the eye model lifecycle management.

Pupil Capture and Player release

23 Jan 14:03

This is the biggest release of Pupil to-date!

Some details below:

Limits of regression based calibration: It assumes the headset is "screwed to the head". You need to compensate for the movement of the headset. Additionally, a model-less search for the ellipse yields poor results when the pupil is partially obstructed by reflections, eyelashes, or eyelids.
With the use of a 3D model of the eye and a pinhole model of the camera based on Swirski’s work in [“A fully-automatic, temporal approach to single camera, glint-free 3D eye model fitting” PETMEI 2013] we can model the eyeball as a sphere and the pupil as a disk on that sphere. The sphere used is based on an average human eyeball diameter of 24mm. The state of the model is the position of the sphere in eye camera space and two rotation vectors that describe the location of the pupil on the sphere.

Using temporal constraints and competing eye models we can detect and compensate for slippage events when 2D pupil evidence is strong. In case of weak 2D evidence we can use constraints from existing models to robustly fit pupils with considerably less evidence than before.

With a 3D location of the eye and 3D vectors of gaze we don’t have to rely on polynomials for gaze mapping. Instead we use a geometric gaze mapping approach: we model the world camera as a pinhole camera with distortion, and project pupil line-of-sight vectors onto the world image. For this we need to know the rotation and translation between the world and eye cameras. This rigid transformation is obtained in a 3 point calibration routine. At the time of writing we simply assume that the targets are equally far away and minimize the distance of the obtained point pairs. This will be extended to infer distances during calibration.

Pupil Capture Hotfix release for Mac

05 Jan 12:05

Updated the GLFW binary to fix fullscreen crashes.
Testing a new icon design.