

Experimental Projects

Head Tracking with YEI 3-Space Sensor in UDK

Using a YEI 3-Space Embedded Sensor and a Sony HMZ-T1 headset, a simple head-tracking demo was built in UDK. Taking advantage of UDK's ability to bind external DLLs, getting orientation data into Unreal requires only a few lines of uscript and the 3-Space C API DLL; a sketch of the DLL side follows the feature list below.

  • Real-time head tracking using an off-the-shelf HMD
  • Works with the entire 3-Space Sensor family
  • Model for the Embedded sensor clip available for download
  • Built and tested with Unreal Development Kit (May 2012)
  • Free and open source, so the demo can be used and modified without restriction
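
As a rough illustration of what the DLL side of that binding could look like, the sketch below exports two C functions that UDK's DLLBind feature can call from uscript (declared there as dllimport final functions). The sensor-access routines are placeholders, not the actual 3-Space C API; the real calls should be taken from the SDK header that ships with the sensor.

    // headtrack.cpp -- sketch of a DLL that UDK's DLLBind could load.
    // The two helper functions are stand-ins for the real 3-Space C API calls
    // (names and signatures here are illustrative only).

    struct FQuatOut { float X, Y, Z, W; };   // mirrors the struct declared in uscript

    // Placeholder: open the first attached 3-Space sensor via the C API.
    static bool openFirstSensor()
    {
        return true;
    }

    // Placeholder: read the tared orientation quaternion (x, y, z, w) via the C API.
    static void readTaredQuaternion(float q[4])
    {
        q[0] = 0.0f; q[1] = 0.0f; q[2] = 0.0f; q[3] = 1.0f;   // identity stand-in
    }

    extern "C" __declspec(dllexport) int OpenHeadTracker()
    {
        // Called once from uscript before the game loop starts.
        return openFirstSensor() ? 1 : 0;
    }

    extern "C" __declspec(dllexport) void GetHeadOrientation(FQuatOut* out)
    {
        // Called every tick from uscript; DLLBind passes the out struct by pointer.
        float q[4];
        readTaredQuaternion(q);
        out->X = q[0]; out->Y = q[1]; out->Z = q[2]; out->W = q[3];
    }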


Android 3-Space Bluetooth App

This app was created to demonstrate the capability of using the 3-Space Bluetooth sensor on Android. It is designed as an example of how to communicate with a 3-Space Sensor from an Android app; a sketch of the command framing involved follows the list below. A basic understanding of Android development is required, and only the source code is provided.

  • Real-time visualization of orientation
  • Set axis directions and LED color
  • Get Sensor info from your phone
  • Built and tested with multiple phones
  • Free and open source, so the app can be used and modified without restriction
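
The app talks to the sensor over a standard Bluetooth serial (SPP) socket and uses the 3-Space binary command protocol. As a hedged illustration (in C++ for consistency with the other sketches here, even though the app itself is written in Java), the code below frames a command packet and parses a quaternion response; the 0xF7 start byte, the 0x00 "tared orientation as quaternion" command, the checksum rule, and the big-endian float layout reflect my reading of the protocol and should be checked against the 3-Space User's Manual.

    #include <cstdint>
    #include <cstring>
    #include <vector>

    // Build a 3-Space binary command packet.
    // Assumed framing: 0xF7 start byte (no response header), then the command
    // byte and its data, then a checksum = (command + data bytes) % 256.
    std::vector<uint8_t> buildCommand(uint8_t command,
                                      const std::vector<uint8_t>& data = {})
    {
        std::vector<uint8_t> pkt;
        pkt.push_back(0xF7);
        pkt.push_back(command);
        pkt.insert(pkt.end(), data.begin(), data.end());

        unsigned sum = command;
        for (uint8_t b : data) sum += b;
        pkt.push_back(static_cast<uint8_t>(sum & 0xFF));
        return pkt;
    }

    // Parse the assumed 16-byte response to the "get tared orientation as
    // quaternion" command (0x00): four big-endian floats, x y z w.
    void parseQuaternion(const uint8_t resp[16], float quat[4])
    {
        for (int i = 0; i < 4; ++i) {
            uint8_t le[4] = { resp[i * 4 + 3], resp[i * 4 + 2],
                              resp[i * 4 + 1], resp[i * 4 + 0] };   // byte-swap
            std::memcpy(&quat[i], le, sizeof(float));
        }
    }

On the Android side, the same bytes would simply be written to and read from a BluetoothSocket opened with the standard SPP UUID.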


Head Tracking in Simulations and Games

Using YEI 3-Space Sensors in existing games and simulators for basic head tracking

  • 2DOF head tracking working in ArmA2 and Day-Z; requires no additional software
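
Here "2DOF" means only two look axes, yaw and pitch, are taken from the sensor. As an illustration of that step (not the demo's actual code), the sketch below extracts those two angles from the sensor's orientation quaternion, assuming a convention in which yaw is about the vertical axis and pitch about the side axis.

    #include <cmath>

    // Extract yaw and pitch (radians) from an orientation quaternion (x, y, z, w),
    // using the standard aerospace-style conversion. Only these two angles are
    // needed for the 2DOF head look described above.
    void quatToYawPitch(float x, float y, float z, float w,
                        float& yaw, float& pitch)
    {
        yaw = std::atan2(2.0f * (w * z + x * y),
                         1.0f - 2.0f * (y * y + z * z));

        float sinp = 2.0f * (w * y - z * x);
        if (sinp >  1.0f) sinp =  1.0f;    // clamp to avoid NaN at the poles
        if (sinp < -1.0f) sinp = -1.0f;
        pitch = std::asin(sinp);
    }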


Real-time Mocap and VR in UDK

This is a video of the live demo shown at ION GNSS 2012, September 17-21.

Using 17 YEI 3-Space Wireless Sensors and 3 3-Space Wireless Dongles, motions from an actor can be applied to a mesh in real time and used to interact with a virtual scene. Adding an HMD to the suit allows for an immersive first-person experience with head and body tracking. One common way to turn the per-segment sensor orientations into bone rotations is sketched after the list below.

  • Uses UDK to render impressive and interactive scenes
  • Adding a head-mounted display allows for more immersive VR
  • System does not rely on line of sight to function
  • Sensor fusion is done on-chip, greatly simplifying implementation
  • Real-time mocap with low CPU usage
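
A common way to drive a skeleton from orientation-only sensors like these (not necessarily the exact pipeline used in this demo) is to capture a calibration offset per bone while the actor holds a known pose, then each frame combine the live sensor reading with that offset and re-express the result relative to the parent bone. A minimal sketch, with an ad-hoc quaternion type:

    // Minimal quaternion type (x, y, z, w) with just the operations needed here.
    struct Quat {
        float x, y, z, w;
        Quat operator*(const Quat& r) const {              // Hamilton product
            return { w * r.x + x * r.w + y * r.z - z * r.y,
                     w * r.y - x * r.z + y * r.w + z * r.x,
                     w * r.z + x * r.y - y * r.x + z * r.w,
                     w * r.w - x * r.x - y * r.y - z * r.z };
        }
        Quat conjugate() const { return { -x, -y, -z, w }; }   // inverse for unit quats
    };

    // Captured once while the actor stands in a known pose (e.g. a T-pose):
    // the rotation that takes the sensor's reading to the bone's rest orientation.
    Quat calibrationOffset(const Quat& sensorAtCalibration, const Quat& boneRestWorld)
    {
        return sensorAtCalibration.conjugate() * boneRestWorld;
    }

    // Each frame: the bone's world orientation follows the sensor, and its local
    // rotation is that orientation re-expressed relative to the parent bone.
    Quat boneLocalRotation(const Quat& sensorNow, const Quat& offset,
                           const Quat& parentWorld)
    {
        Quat boneWorld = sensorNow * offset;
        return parentWorld.conjugate() * boneWorld;
    }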


Real-time Motion Capture with YEI Mocap Bundle in Ogre3d

This is a video of the live demo shown at Sensors Expo 2012 in Chicago.

  • Uses Ogre3d for a completely open source example (a minimal bone-driving sketch follows this list)
  • System does not rely on line of sight to function
  • Sensor fusion is done on-chip, greatly simplifying implementation
  • Real-time mocap with low CPU usage
  • Includes an early test of simple translation (position) tracking
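
For the rendering side, Ogre3d lets a bone be marked as manually controlled and then fed orientations directly. The fragment below is a minimal sketch of that idea; the entity, the bone name, and the mapping between sensor and bone coordinate frames are placeholders that depend on the mesh and calibration used.

    #include <Ogre.h>

    // Drive one bone of an Ogre entity from a sensor quaternion.
    // Marking the bone as manually controlled (normally done once at setup)
    // takes it away from any skeletal animation so it can be posed directly.
    void applySensorToBone(Ogre::Entity* actor, const Ogre::String& boneName,
                           float qx, float qy, float qz, float qw)
    {
        Ogre::Bone* bone = actor->getSkeleton()->getBone(boneName);
        bone->setManuallyControlled(true);
        bone->setOrientation(Ogre::Quaternion(qw, qx, qy, qz));   // Ogre takes (w, x, y, z)
    }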