
What-You-See-Is-What-You-Get Indoor Localization (WIL)

[Image: wysiwyg_overview - overview of the WIL system]

Here is a simple demonstration: a human and a mobile robot (Vstone MegaRover 3.0) move around while two stationary objects sit on the floor, and the sensed pose of each is projected directly onto it.

[Animation: move - the demonstration described above]

HTC Vive Trackers / Controllers are used to gather the raw pose data. The WIL system transforms the raw poses to align with the (previously calibrated) real space and generates the image to be projected. Supported raw pose data sources: SteamVR, libsurvive, ROS messages, and UDP packets (see the "RemoteUDP" files). Configure which source to use in wil_config.py, and list the IDs of the trackers you want to use (LHR-xxxxxxxx) there as well. Buttons on the Vive Controller are also supported.
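For illustration only, the relevant entries in wil_config.py might look like the sketch below. The names POSE_SOURCE and TRACKER_IDS are hypothetical placeholders; the actual option names and accepted values are defined in wil_config.py itself:

  # Hypothetical sketch of a wil_config.py fragment -- the real option
  # names may differ, so check the comments in wil_config.py itself.

  # Raw pose data source, e.g. SteamVR, libsurvive, ROS, or UDP
  POSE_SOURCE = "steamvr"

  # Serial numbers (IDs) of the Vive Trackers / Controllers to use
  TRACKER_IDS = [
      "LHR-11111111",  # e.g. worn by the human
      "LHR-22222222",  # e.g. mounted on the mobile robot
  ]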

Calibration is simple: run wil_calibration_with_floorproj.py and you should see something like this:

[Image: calib - the calibration screen]

Usage:
  mouse dragging with left button pressed: adjust x,y offset
  mouse scrollwheel: adjust world orientation offset
  mouse scrollwheel with left button pressed: adjust tracker orientation offset
  keys + - : adjust pixel ratio
  keys x y r : toggle x swap / y swap / rotation direction (rotdir)
  key z : in individual tracker adjustment mode: take the selected tracker as the reference zero height
  key z : in world adjustment mode: take the tracker with the lowest height as the reference zero height
  key c : start/stop pose data capture to file
  keys 1..0 : select tracker no. 1..10 for individual adjustment
  key Space: pause / unpause
  key BackSpace: select world / unselect individually adjusted tracker
  key Esc : exit without saving calibration parameters
  key Enter : save calibration parameters and exit

First adjust the world offsets, then (if required) adjust trackers individually (select with the 1..0 keys, then use mouse drag + scrollwheel). If you place one tracker directly on the floor, selecting it and pressing "z" makes the height of that tracker the reference (zero) height for the world. By default the calibration parameters are stored in a file called wil_calibparams.txt (be sure to copy or symlink your previous calibparams file when working with recorded data). To adjust heights (z axis), edit the saved wil_calibparams.txt manually. To use the adjusted poses in your own application, see the wil_talker_* files as examples.
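To make the role of these parameters concrete, here is a minimal conceptual sketch (not the repository's actual code) of how a raw 2D tracker pose could be mapped into projector pixel coordinates using the kinds of parameters adjusted above: x/y offset, world orientation offset, pixel ratio, and axis swapping. All names are illustrative:

  import math

  # Illustrative only: map a raw tracker pose (meters, radians) to
  # projector pixel coordinates with WIL-style calibration parameters.
  # The keys of 'calib' are hypothetical names, not those used by WIL.
  def raw_pose_to_pixels(x, y, yaw, calib):
      # rotate by the world orientation offset
      c, s = math.cos(calib["world_rot"]), math.sin(calib["world_rot"])
      xr, yr = c * x - s * y, s * x + c * y
      # optionally swap axes so the image matches the projector mounting
      if calib["swap_xy"]:
          xr, yr = yr, xr
      # apply x/y offsets, then scale meters to pixels
      px = (xr + calib["offset_x"]) * calib["pixel_ratio"]
      py = (yr + calib["offset_y"]) * calib["pixel_ratio"]
      # the per-tracker orientation offset corrects the drawn heading
      return px, py, yaw + calib["tracker_rot"]

In effect, the interactive tool lets you tune parameters like these by eye until the projected markers coincide with the physical trackers.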

Even without floor projection, WIL can also serve as a "Room Setup" tool for libsurvive (and for SteamVR too, though SteamVR already includes a rough Room Setup component).

Requires Python 3 and PySimpleGUI (pip install pysimplegui). If you want to use SteamVR as the localization data source, openvr is also required (pip install openvr).
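When SteamVR is the data source, the raw poses ultimately come from the openvr bindings. The rough sketch below shows what polling tracker poses and serial numbers can look like with pyopenvr; exact call signatures vary between pyopenvr versions, so treat it as an approximation rather than the code WIL actually uses:

  import openvr

  vr = openvr.init(openvr.VRApplication_Other)  # background client, no HMD scene
  poses = vr.getDeviceToAbsoluteTrackingPose(
      openvr.TrackingUniverseStanding, 0, openvr.k_unMaxTrackedDeviceCount)
  for i, pose in enumerate(poses):
      if not pose.bPoseIsValid:
          continue
      # serial numbers are the "LHR-xxxxxxxx" IDs listed in wil_config.py
      serial = vr.getStringTrackedDeviceProperty(
          i, openvr.Prop_SerialNumber_String)
      m = pose.mDeviceToAbsoluteTracking  # 3x4 row-major pose matrix
      print(serial, m[0][3], m[1][3], m[2][3])  # x, y, z position
  openvr.shutdown()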

More details can be found in the following paper. If you find WIL useful and use it in your research, please cite:

D. Vincze, M. Niitsuma, "What-You-See-Is-What-You-Get Indoor Localization for Physical Human-Robot Interaction Experiments", in Proc. of the 2022 IEEE/ASME International Conference on Advanced Intelligent Mechatronics (AIM2022), 2022, Sapporo, Japan, pp. 909-914.

DOI: 10.1109/AIM52237.2022.9863359
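
For convenience, the same reference as a BibTeX entry (fields taken from the citation above; the entry key is arbitrary):

  @inproceedings{vincze2022wysiwyg,
    author    = {Vincze, D. and Niitsuma, M.},
    title     = {What-You-See-Is-What-You-Get Indoor Localization for Physical Human-Robot Interaction Experiments},
    booktitle = {Proc. of the 2022 IEEE/ASME International Conference on Advanced Intelligent Mechatronics (AIM2022)},
    year      = {2022},
    address   = {Sapporo, Japan},
    pages     = {909--914},
    doi       = {10.1109/AIM52237.2022.9863359}
  }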
