Vive controller - remote control #116

Open
noorbeast opened this issue May 31, 2016 · 18 comments

@noorbeast

I would be very interested in being able to use the Vive controller as a remote controller, as with the space craft on the table in The Lab.

I already have the craft set up in the Unity Scene, controlled by the WASD keys, so any idea how I can grab and translate Vive controller position to those?

@thestonefox
Member

Check out this example scene:

https://www.youtube.com/watch?v=4J8abeLzH58&index=16&list=PLRM1b2lKjTbdFJtYv_SNAb3NvYp-Gl7nZ

It should do what you want.

@noorbeast
Author

Thanks for that.

Yes, I had a play with the touchpad controller, but personally I find it nowhere near as intuitive as an input, nor does it give as fine control as the Vive controller's own movement, as per The Lab drone.

Kind regards

Gary


@thestonefox
Member

@noorbeast what do you mean by it not being as intuitive?

The example shows off how you use the touch locations on the touchpad as an input. If you want to use those values to control an object then that's up to the developer to implement it.

The toolkit isn't really intended to be a bunch of assets people can use, but a bunch of concepts people can take and build on as a platform.

@noorbeast
Author

Yes I get that.

What I am trying to convey is that using the whole controller as input, like the table drone example in The Lab, is really intuitive and gives fine control for that sort of gameplay, compared to traditional input like the touchpad. I hope that makes some sort of sense.


@thestonefox
Member

@noorbeast ah ok, I see what you mean: you're talking about the controller actually moving the thing based on its rotation, like one big gyroscopic remote control?

thestonefox reopened this May 31, 2016
@noorbeast
Author

Absolutely, sorry I was so awkward trying to describe what I meant.
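For anyone wanting to experiment with this, the underlying idea (read the controller's tilt, ignore small angles, and scale the rest into a velocity) can be sketched independently of any engine. A minimal Python sketch, where the function name, deadzone and 45-degree clamp are all illustrative assumptions rather than anything from VRTK:

```python
def drone_velocity(pitch_deg, roll_deg, max_speed=2.0, deadzone_deg=5.0):
    """Map controller tilt (degrees) to a drone velocity, ignoring small tilts."""
    def axis(angle):
        if abs(angle) < deadzone_deg:
            return 0.0  # inside the deadzone: treat the controller as level
        sign = 1.0 if angle > 0 else -1.0
        # Scale the remaining tilt into [-1, 1], clamping at 45 degrees.
        return sign * min((abs(angle) - deadzone_deg) / (45.0 - deadzone_deg), 1.0)
    return (axis(roll_deg) * max_speed,    # strafe left/right
            axis(pitch_deg) * max_speed)   # forward/back

print(drone_velocity(2.0, 3.0))     # -> (0.0, 0.0): tiny tilts are ignored
print(drone_velocity(45.0, -20.0))  # roll left a little, pitch fully forward
```

The deadzone matters in practice: without it, natural hand tremor constantly nudges the controlled object.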


@ghost

ghost commented Jun 1, 2016

@noorbeast As a quick side note, we implemented this in our game but ended up going with the touchpad for accuracy. The controller's gyro rotation can only really be used for imprecise movement and requires a fairly high tolerance. Expect a lot of false positives, as you'll get different results depending on whether the player is sitting or standing, holding the controllers close or with arms stretched, or facing the monitor straight on or not.
Just my 2 cents. Hope you can get it working in your situation though :)

Cheers

@noorbeast
Author

That is interesting.

Everyone I have had try out the drone in The Lab, including a drone pilot, has commented on how simply and well it works for them, though that does not need precision; most get the biggest kick out of teasing the Robo Dog with the drone.


@ghost

ghost commented Jun 1, 2016

@noorbeast Yes, for general movement it may be OK. For something like gestures using the gyro, it just wasn't working in our setup. The accuracy of the controllers is great, but how differently players hold them throws the math out a bit. We started with a tolerance of 10%, then upped it to 20%, which worked OK but wouldn't engage more often than not; at 30% we were getting too many false positives in the gesture recognition. We would set it up one way, then the player would hold the controllers slightly differently or not face the monitor directly, and it wouldn't engage or would engage too much. For general movement it might be OK, as you have a wide margin to play with (in degrees) and it doesn't really need much accuracy.
I guess if you have a calibration step at the beginning of your game, you can capture what the player considers horizontal, vertical or pitched, then apply that as an offset to the controller readings in your game :)
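That calibration idea can be sketched very simply: capture the angles of the player's "neutral" pose once, then subtract them from every subsequent reading. A minimal Python sketch (names and the sample values are illustrative, not VRTK API):

```python
def make_calibrated(neutral_pitch, neutral_yaw, neutral_roll):
    """Return a reader that re-centres raw controller angles on the
    player's captured neutral pose."""
    def calibrated(pitch, yaw, roll):
        return (pitch - neutral_pitch, yaw - neutral_yaw, roll - neutral_roll)
    return calibrated

# During calibration the player holds the controller "level" for a moment:
read = make_calibrated(neutral_pitch=12.0, neutral_yaw=-4.0, neutral_roll=1.5)
print(read(12.0, -4.0, 1.5))  # -> (0.0, 0.0, 0.0): the neutral pose reads as zero
```

This compensates for exactly the seated/standing and close/outstretched variations described above, since each player's own resting pose becomes the zero point.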

@noorbeast
Author

That is exceedingly helpful advice.

It does flag some potential issues for something else on my todo list that I wanted to experiment with: motion cancellation for motion simulators, where the motion of the sim is tracked by mounting the controllers to it, and the simulator's movement is then subtracted from the HMD visuals.

I don't know if it would work at all, but as the most active member of the DIY motion sim community I do know it is desperately needed to make the most of VR and motion simulation. The alternative for the Rift (VectionVR) is not practical, as it requires compiling with the game source code.

From what you have said, at a minimum there would need to be a calibration process for the controllers mounted to the motion rig. Do you think the tracking would be accurate enough to cancel the motion from the HMD view in real time? Latency is the killer.

Kind regards

Gary


@ghost

ghost commented Jun 2, 2016

@noorbeast That's a very interesting project. I assume the motion rigs are all custom built by community members, so there is no set standard for where to place the motion controllers; I'd therefore have to agree you'd need a calibration setup in your game, which wouldn't be difficult to implement at all.
The tracking is incredibly accurate once calibrated. Using the controllers' tracking data to offset the HMD view in real time shouldn't be an issue.
If I understand correctly what you're trying to achieve, you'd simply be grabbing the controller's rotation (perhaps with a middle man for calibration/offset), inverting it, and applying that rotation to the in-game (HMD) camera every frame. Much like how video stabilization software works: track the motion, invert it, apply it to the original camera, and voila, stable videos.
There's probably a bit more to it than this, but this is how I'd start.
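One note on the inversion step: negating Euler/Vector3 angles component-wise does not in general produce the inverse rotation, because the axis rotations are applied in order. Composing with the conjugate (inverse) of the rig's rotation quaternion is the robust version of the same idea. A minimal Python sketch of that math, with no engine API assumed and quaternions as unit (w, x, y, z) tuples:

```python
# Quaternions as (w, x, y, z) tuples; all rotations assumed to be unit quaternions.
def q_mul(a, b):
    """Hamilton product: the rotation 'apply b, then a'."""
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (aw*bw - ax*bx - ay*by - az*bz,
            aw*bx + ax*bw + ay*bz - az*by,
            aw*by - ax*bz + ay*bw + az*bx,
            aw*bz + ax*by - ay*bx + az*bw)

def q_conj(q):
    """Conjugate of a unit quaternion = its inverse rotation."""
    w, x, y, z = q
    return (w, -x, -y, -z)

def cancel_rig_motion(hmd_rot, rig_rot):
    """Remove the rig's tracked rotation from the HMD rotation, so only
    the player's own head movement remains."""
    return q_mul(q_conj(rig_rot), hmd_rot)
```

If the HMD and rig rotate together (the player holds still), this returns the identity rotation, which is exactly the "stable camera" the video-stabilization analogy describes.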

Cheers

@noorbeast
Author

Yes @x-mugen-x that is a good summary of what I had in mind.

@thestonefox
Member

Is it worth having some new events on ControllerEvents, things like:

Yaw(rotation) - when you move the controller in a left/right motion (0 degrees being straight forward)
Pitch(angle) - when you tilt the controller in an up/down motion (0 degrees being straight forward)
Roll(rotation) - when you rotate the controller in a left/right motion (whilst looking ahead)

[image: diagram of the controller yaw/pitch/roll motions]

Then these values could be used to control something.

It's basically the rotation values of the controller, but exposed in a neater way.
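For reference, yaw/pitch/roll values like these can be recovered from the controller's rotation quaternion with the standard ZYX Euler extraction. Axis conventions differ between engines (Unity is y-up, for instance), so treat the axis naming here as an assumption; a minimal Python sketch:

```python
import math

def yaw_pitch_roll(w, x, y, z):
    """Standard ZYX Euler extraction from a unit quaternion, in degrees.
    (0, 0, 0) corresponds to the 'pointing straight forward' pose."""
    roll  = math.atan2(2 * (w*x + y*z), 1 - 2 * (x*x + y*y))
    # Clamp guards against tiny numeric overshoot outside asin's domain.
    pitch = math.asin(max(-1.0, min(1.0, 2 * (w*y - z*x))))
    yaw   = math.atan2(2 * (w*z + x*y), 1 - 2 * (y*y + z*z))
    return (math.degrees(yaw), math.degrees(pitch), math.degrees(roll))

print(yaw_pitch_roll(1.0, 0.0, 0.0, 0.0))  # -> (0.0, 0.0, 0.0): identity rotation
```

An event layer would then just fire Yaw/Pitch/Roll callbacks whenever these values change beyond some threshold.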

@ghost

ghost commented Oct 22, 2016

@thestonefox This would definitely be useful.
Not sure if it would be doable, but a smoothing (interpolation) multiplier would be nice: if set to 0.0 there'd be no smoothing (which could be jittery when applied to objects); if set to 1.0, interpolation would smooth it out.
This would help in a lot of cases, especially ones like your plane demo image: vehicle control, or anything where you're controlling something of significant size or needing accuracy, like a combination lock on a bank vault.
Just an idea. I just noticed in my project the values were jittery.
Cheers!
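The multiplier described above maps naturally onto simple exponential smoothing, where 0.0 passes raw values straight through and values approaching 1.0 damp jitter (and add lag) ever more heavily. A minimal per-frame Python sketch; a frame-rate-independent variant would factor in delta time:

```python
def smooth(prev, raw, smoothing):
    """Exponential smoothing: smoothing=0.0 returns raw unchanged,
    values near 1.0 move only a small step toward raw each frame."""
    return prev + (1.0 - smoothing) * (raw - prev)

# A steady raw reading of 10.0 is approached gradually at smoothing=0.5:
value = 0.0
for raw in (10.0, 10.0, 10.0, 10.0):
    value = smooth(value, raw, smoothing=0.5)
print(value)  # -> 9.375, converging toward the raw reading
```

The same filter applied per axis would take the edge off the jittery yaw/pitch/roll values without hiding deliberate movement.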

@paulc755

paulc755 commented Oct 23, 2016

@thestonefox Excellent idea. Would you hold the controller as you would a joystick? In fact, is it possible to get the controller to emulate a 5-button joystick?

@benbega

benbega commented Sep 27, 2017

@thestonefox I think this can be closed.

@thestonefox
Member

I still quite like this idea though.
