A Body-Gesture-Controlled Virtual Tour


Introduction

This article is based on a project I was involved in during my internship at Arimac 360° Digital
Interactive Studio.
The project was to implement a virtual tour of the proposed Hambanthota heritage museum, to be presented to Namal Rajapaksha, a Member of Parliament for the Hambanthota district. The goal of this project was to introduce our products to the government and to win government business for projects suited to our company. If we succeeded, we would be able to build and sell Kedapatha touch tables to the museum. The challenging thing in this project was that we had to implement a methodology that allows the user to walk through the museum without touching a keyboard or mouse.

Project Plan
Since we had to implement a way to walk through the museum without a keyboard or mouse, we needed a natural user interface to control the application. For that we had two choices: we could use either Leap Motion or Kinect. Both technologies have their own strengths and weaknesses, and initially we didn't have a good understanding of which to use. So I decided to implement an input-device-independent application, so that the selected device could later be integrated as the input source. This approach also allowed the design team to continue their development while I researched the natural interface.
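
As an illustration, the abstraction boils down to something like the following minimal sketch. The interface and class names here (ITourInput, TourController) are hypothetical stand-ins, not the actual project code:

    using UnityEngine;

    // Hypothetical device-independent input abstraction: the tour controller
    // only ever sees this interface, never a concrete device SDK.
    public interface ITourInput
    {
        Vector3 GetMoveDirection();   // forward/backward movement for this frame
        Vector2 GetLookDelta();       // x = turn left/right, y = look up/down
    }

    // The tour controller consumes ITourInput; swapping Kinect for Leap Motion
    // only means swapping the implementation assigned to 'input'.
    public class TourController : MonoBehaviour
    {
        public float moveSpeed = 2f;   // walking speed, meters per second
        public float lookSpeed = 60f;  // turning speed, degrees per second
        private ITourInput input;      // assigned at startup to the chosen device

        void Update()
        {
            if (input == null) return;
            transform.Translate(input.GetMoveDirection() * moveSpeed * Time.deltaTime);
            Vector2 look = input.GetLookDelta();
            transform.Rotate(0f, look.x * lookSpeed * Time.deltaTime, 0f);
            Camera.main.transform.Rotate(-look.y * lookSpeed * Time.deltaTime, 0f, 0f);
        }
    }

With this separation, the design team's 3D work and the input-device research could proceed independently.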
Kinect
Kinect is a motion-sensing input device developed by Microsoft. It consists of an RGB camera, a depth sensor, and a multi-array microphone, and it provides full-body motion capture, facial recognition, and voice recognition capabilities. For our project, we only needed the full-body motion capture capability. The device was also within an affordable price range, so it was a suitable option for our purpose.
Figure 2-4: Kinect
In order to get Kinect working with Windows and Unity 3D, I used the OpenNI framework. OpenNI (Open Natural Interaction) is an open-source SDK (software development kit) used to develop 3D sensing applications and middleware. OpenNI provides:

  • Voice and voice command recognition
  • Hand gestures
  • Body Motion Tracking
Figure 2-5 represents the SDK architecture of OpenNI. According to the diagram, Kinect belongs to the Devices layer and the application we develop belongs to the Application layer.

Figure 2-5: OpenNI SDK architecture



Implementing the Body Gesture Capturing Mechanism
Using Kinect + OpenNI, we can get the 3D coordinates of a set of recognized body points. Figure 2-6 shows the points that Kinect can recognize, and a rough code sketch of reading them follows the figure.

Figure 2-6: Kinect skeleton
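
As a rough sketch, reading a joint position via the OpenNI .NET wrapper looks roughly like this. The API names are recalled from the OpenNI 1.x wrapper and may differ slightly between versions, and the config file name is a placeholder:

    using OpenNI;

    // Minimal sketch of reading skeleton joints through the OpenNI .NET wrapper.
    // Assumes an OpenNI XML config file that sets up the Kinect depth node.
    ScriptNode scriptNode;
    Context context = Context.CreateFromXmlFile("OpenNIConfig.xml", out scriptNode);
    UserGenerator userGenerator = new UserGenerator(context);
    userGenerator.SkeletonCapability.SetSkeletonProfile(SkeletonProfile.All);
    context.StartGeneratingAll();

    // Once a user is calibrated and tracked, each joint yields a 3D point:
    context.WaitOneUpdateAll(userGenerator);
    int userId = 1; // first tracked user, for illustration
    SkeletonJointPosition rightHand =
        userGenerator.SkeletonCapability.GetSkeletonJointPosition(userId, SkeletonJoint.RightHand);
    Point3D p = rightHand.Position; // p.X, p.Y, p.Z in millimeters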
Using the input coordinates of a few joints, I implemented a mechanism inside Unity to capture the following body gestures. Figure 2-7 shows the gestures the application can identify; each gesture is associated with a particular action the user can perform in the 3D tour. From left to right, figure 2-7 shows the gestures used to move forward, move backward, look up, look down, turn left, and turn right.

Figure 2-7: body gestures
The problem I faced when implementing the body gestures was that the application should not misinterpret a person's normal behavior as a command. To avoid this, I first studied the common poses of a person and then defined gestures that do not resemble them. For example, when we stand normally, we keep both hands below shoulder level, so it is not practical to define a body gesture using hand coordinates alone. Therefore I had to use as many body points as possible to define each gesture: the more points used, the more accurate the gesture recognition becomes.
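
For example, a "move forward" check can combine several joints rather than a single hand. This is a simplified sketch, and GetJoint(...) is a hypothetical placeholder for the tracker's joint lookup:

    using UnityEngine;

    // Simplified gesture check: "move forward" only triggers when BOTH hands
    // are above their respective shoulders, a pose far from normal standing.
    public bool IsMoveForwardGesture()
    {
        Vector3 leftHand = GetJoint("LeftHand");
        Vector3 rightHand = GetJoint("RightHand");
        Vector3 leftShoulder = GetJoint("LeftShoulder");
        Vector3 rightShoulder = GetJoint("RightShoulder");

        // Using more joints (both hands AND both shoulders) reduces the chance
        // of misreading a casual pose as a command.
        return leftHand.y > leftShoulder.y && rightHand.y > rightShoulder.y;
    }

    // Hypothetical stub: in the real application this would query the
    // OpenNI skeleton, as in the earlier sketch.
    private Vector3 GetJoint(string jointName) { return Vector3.zero; }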
As I had implemented the museum application in a way that could be easily integrated with any input device, integrating the Kinect input was straightforward, and the application worked perfectly with Kinect. But I still hadn't tested the application with Leap Motion, so that was the next thing I did.
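
Plugging Kinect in then only meant writing an adapter for the hypothetical ITourInput interface sketched earlier, along these lines:

    using UnityEngine;

    // Hypothetical Kinect adapter: translates recognized body gestures into
    // the device-independent ITourInput calls the tour controller expects.
    public class KinectTourInput : ITourInput
    {
        public Vector3 GetMoveDirection()
        {
            if (IsMoveForwardGesture()) return Vector3.forward;
            if (IsMoveBackwardGesture()) return Vector3.back;
            return Vector3.zero;
        }

        public Vector2 GetLookDelta()
        {
            float turn = IsTurnLeftGesture() ? -1f : IsTurnRightGesture() ? 1f : 0f;
            float look = IsLookUpGesture() ? 1f : IsLookDownGesture() ? -1f : 0f;
            return new Vector2(turn, look);
        }

        // IsMoveForwardGesture() etc. are joint-based checks like the one
        // sketched above; their bodies are omitted here.
    }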

Leap Motion
Leap Motion is a device which acquires hand and finger motion as input. This was also suitable for our museum project, so I tested the museum application with Leap Motion too. The major advantage of Leap Motion over Kinect is its higher resolution, but its observation area is much smaller than Kinect's.
Figure 2-8: Leap Motion (image source: http://www.xbitlabs.com/images/news/2013-01/leapmotion_2.jpg)

Leap Motion uses two monochromatic IR cameras and three infrared LEDs. The 2D IR images captured by the device are sent to the software on the host computer, which then calculates the 3D positions of the hand and its points of interest from the two views.
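
Leap's exact algorithm is proprietary, but the underlying principle is standard stereo triangulation: a point seen in both cameras appears with a pixel disparity that is inversely proportional to its depth. A minimal illustration of the textbook formula, not Leap's actual code:

    // Standard stereo triangulation (illustrative only):
    //   depth = focalLengthPx * baselineMm / disparityPx
    // focalLengthPx: camera focal length in pixels,
    // baselineMm:    distance between the two IR cameras in millimeters,
    // disparityPx:   horizontal pixel offset of the same point in the two images.
    public static float DepthFromDisparity(float focalLengthPx, float baselineMm, float disparityPx)
    {
        return focalLengthPx * baselineMm / disparityPx;
    }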
Figure 2-9 shows the points of the hand for which Leap Motion calculates 3D coordinates.

Figure 2-9: Leap Motion hand points

Using those coordinates, I could define custom hand gestures which allow the user to walk through the museum. The following diagrams show the hand gestures with their corresponding actions. As with Kinect, the more points used to define the calculation, the more accurate the gesture.
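
With the Leap C# SDK of that time (names such as Controller, Frame, and PalmNormal are recalled from the v1/v2-era API and may differ in later versions), a simple "walk forward" hand gesture might look like this:

    using Leap;

    // Sketch using the older Leap Motion C# SDK: tilting the palm forward
    // is read as "walk forward".
    public bool IsWalkForwardGesture(Controller controller)
    {
        Frame frame = controller.Frame();   // most recent tracking frame
        if (frame.Hands.IsEmpty) return false;
        Hand hand = frame.Hands[0];
        // PalmNormal points out of the palm; in Leap coordinates -z is away
        // from the user, so a forward tilt crosses this threshold.
        return hand.PalmNormal.z < -0.5f;
    }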

But when we tested the application with Leap Motion, we discovered that it is a bit harder to control using hand gestures: because of Leap Motion's limited observation area, the user has to keep a hand above the sensor at all times. Kinect, on the other hand, was more robust and easier to use, so we decided to move on with Kinect.

Problem – Very Low Frame Rate in Museum Application
While I was researching the natural user interface for the museum application, the design team finished their job. They had completely designed the 3D model of the museum building and placed the exhibits in their proper places. The museum environment looked really nice, but when running the 3D tour, the frame rate was very low. Therefore, I had to optimize the application.

