Microsoft’s Kinect motion sensor brought affordable, controller-less motion input to game consoles. However, its limited resolution restricts how much body motion can be detected and used in a game. If you want finer detection you need more sensors, and a demo using the Unreal Engine has shown just how good full-body motion sensing can be with the right kit.
At the Institute of Navigation (ION) GNSS 2012 conference held last month, a system was set up that allowed full-body, real-time motion tracking of a person walking around a room and performing a range of actions. The system relied on 17 YEI 3-Space Wireless Sensors attached to the body and three 3-Space Wireless Dongles.
The information gathered from tracking those sensors was fed into Epic’s Unreal Engine and applied to a character mesh that copied the actions. As this was all happening in real-time, the on-screen character mimicked the actions almost perfectly, suggesting such a system could work in-game as a control solution.
The tracking system uses YEI’s miniature, high-precision sensors coupled with an attitude and heading reference system. Each sensor includes a triaxial gyroscope, accelerometer, and compass, along with on-board Kalman filtering algorithms that allow both position and orientation to be estimated precisely.
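YEI’s on-board filter is proprietary, so the details aren’t public, but the basic idea of fusing a drifting gyroscope with a noisy-but-absolute gravity reference can be illustrated with a simpler complementary filter. The sketch below is a minimal single-axis example under assumed units and a hypothetical function name; a real AHRS would run a full Kalman filter over all three axes plus the compass.

```python
import math

def complementary_filter(gyro_rates, accel_samples, dt, alpha=0.98):
    """Estimate a pitch angle (radians) by fusing gyro and accelerometer data.

    gyro_rates:    angular rate about one axis per timestep (rad/s)
    accel_samples: (ax, az) accelerometer readings per timestep (m/s^2)
    dt:            timestep in seconds
    alpha:         weight on the integrated gyro (high-pass) versus the
                   accelerometer tilt estimate from gravity (low-pass)
    """
    angle = 0.0
    history = []
    for rate, (ax, az) in zip(gyro_rates, accel_samples):
        # Absolute (but noisy) tilt estimate from the gravity vector.
        accel_angle = math.atan2(ax, az)
        # Blend: integrate the gyro for short-term accuracy, lean on the
        # accelerometer long-term to cancel gyro drift.
        angle = alpha * (angle + rate * dt) + (1 - alpha) * accel_angle
        history.append(angle)
    return history

# A stationary sensor tilted 0.1 rad: gyro reads zero, accelerometer sees
# the tilted gravity vector. The estimate converges toward the true tilt.
g = 9.81
tilt = 0.1
n = 500
angles = complementary_filter(
    gyro_rates=[0.0] * n,
    accel_samples=[(g * math.sin(tilt), g * math.cos(tilt))] * n,
    dt=0.01,
)
```

The design point worth noting is that neither sensor alone is enough: integrating the gyro accumulates drift, while the accelerometer is corrupted by motion; a Kalman filter does the same trade-off optimally by weighting each source according to its estimated noise.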
The other question to ask here is: even if the price of such a system came down to a consumer-friendly level, would you be willing to wear 17 sensors every time you wanted to play? I think if the game became much more immersive and offered a serious workout through play, most gamers wouldn’t mind, as long as they had the space to move around and play effectively.
Source: Geek