DIY Project – Wearable IMU Tracking Sensor

Last Update : 05/15/2017

Project Motivation

In this post I present a self-made wearable IMU (inertial measurement unit) tracker which can be used for Virtual Reality applications as well as for any other application that makes use of real-time acceleration or orientation data.

I was inspired by the wearables that came onto the market a few years ago. There are also companies like xsens selling tracking sensors for Motion Capture applications. However, those sensors are quite costly, and I began thinking: “Why not try to create a cheap version yourself?” Here is what I got after a couple of months of development in my free time.

It is important to clarify that my tracking sensor does not achieve the performance or tracking quality of professional products from companies like xsens. I think xsens spends considerable manpower and money on advanced calibration methods and hardware to get the best out of their sensors. This largely reduces drift, which is one of the main disadvantages of IMU (inertial measurement unit) sensors. Additionally, xsens uses positional tracking to correct the remaining drift.

However, my sensors are still useful for many tasks in which drift is not a big deal or in which the drift can be more or less corrected by exploiting meaningful constraints.


Applications

    • Motion Capture

The presented wearable sensors provide occlusion-free orientation data that can be recorded and used for animating a kinematic body model afterwards. However, for full-body capture in 3d space positional data is necessary. This could be provided by an additional calibrated external sensor, e.g. a Microsoft Kinect. Alternatively, an outside-in tracking camera with a wearable tracking marker could complete the MoCap setup.

    • Real-time avatar animation for VR

For real-time avatar animation in VR, a set of wearable IMU sensors would provide the required body orientations. In a sitting pose no additional data is needed, since the body position does not change. For positional movement in space a single additional body position is sufficient; every other point can be derived from the orientation data provided by the IMU sensors. The body position could be provided by a head-mounted camera tracking the environment (inside-out tracking).

    • 3d interaction device: orientation and position mapping

The sensor can also be used as an interaction device. The orientation of the sensor, worn e.g. on the palm of the hand, could be mapped to a rotational menu whose entries can then be selected. Additionally, if the acceleration data is used, a relative position can be computed by double integrating the acceleration (see the small sketch below). However, the drift grows quickly, so some form of drift compensation is recommended in this case. The relative change in position could intuitively control simple as well as more complex (multidimensional) parameters, e.g. a light position in a 3d modeling tool or a parameter in a digital audio workstation.
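To make the double integration concrete, here is a tiny illustrative snippet (the code and variable names are mine, not part of the project; it assumes the acceleration sample is already gravity-compensated and given in m/s², with the sample interval dt in seconds):

// Illustrative only: double integration of a gravity-compensated
// acceleration sample. Any constant bias in 'accel' grows quadratically
// in 'position' over time, which is why drift compensation is needed here.
struct Vec3 { float x, y, z; };

Vec3 velocity = {0, 0, 0};
Vec3 position = {0, 0, 0};

void integrateStep(const Vec3 &accel, float dt) {
  velocity.x += accel.x * dt;    // first integration: acceleration -> velocity
  velocity.y += accel.y * dt;
  velocity.z += accel.z * dt;
  position.x += velocity.x * dt; // second integration: velocity -> position
  position.y += velocity.y * dt;
  position.z += velocity.z * dt;
}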

    • Point and click device, simple gestures, drawing

If the sensor is imagined as a pointing device (like a laser pointer), it could be used instead of a mouse or keyboard in a presentation. Simple gestures like a “wipe” could intuitively trigger the next slide. The sensor would also allow painting or sculpting in 2d and 3d.

    • Monitoring of sports activities

Many activities could be monitored with the presented sensor to train, practice and improve certain movements, or just for fun. The sensors are not waterproof (which could be an interesting road for an improved version, by the way), but they are robust, making them suitable for outdoor usage as well. The sensor data can be recorded, monitored and analyzed in real time. I have implemented a rowing simulator which uses the IMU data to animate the rowing paddles; monitoring only the rowing movement, however, is a much simpler task.

    • Mixed Reality

With a combination of IMU sensors and a VR/AR display the user could use his arms and hands even without explicit finger tracking. This setup enables a variety of Mixed Reality applications (a real or rendered environment with augmented information and body interaction).

    • Video stabilization and Panoramic photo capture with pose guidance

Video stabilization algorithms very often use image features to solve for the camera pose. In case of blurred images or images with few visible features the camera pose estimation may fail. Acceleration and orientation data from the IMU sensor may provide support for more robust or faster convergence of the camera pose estimation. Another application which requires camera pose estimation is panoramic photo capture. There are already applications using the IMU data of a smartphone, e.g. HTC PanoOMG. A combined setup of IMU sensor and camera could enable similar applications for learning purposes or extended applications.

    • Performance art

Drawing lines in 3d space using your own body movements is a very expressive method for painting which is not possible with traditional forms of art. If also experienced in 3d, the presented IMU sensors represent an effective novel tool to create fascinating art in space and time.

Wearable IMU Sensors

Advantages of the Wearable IMU Sensor

  • independent sensor that does not suffer from occlusions, in contrast to optical systems (external cameras with markers)
  • high temporal resolution, high and constant accuracy
  • non-obtrusive and 3d-printable case
  • wearable, self-contained, flexible sensor (external data storage on a smartphone or desktop computer)
  • simple creation/assembly (soldered cables, no special circuits)
  • simple setup
  • low costs (less than 30 Euros for one sensor)

Disadvantages

  • drift caused by required integration of acceleration/velocity over time
  • using many sensors in parallel may introduce latency or ‘jaggy’ motion
  • the wearable requires power; charging is needed after about 8 h of usage

Sensor Development

The project started with the orientation sensor. I use an MPU-6050 breakout board with an Arduino. In the first tests I used the Arduino UNO (5 V), which is appropriate for rapid prototyping, but I later had to switch to a smaller version (Arduino Pro Mini, 3.3 V) so that the sensor is small enough to be wearable.

First tests with the orientation sensor

After this first test I started a small-scale student project. The idea was to create a MoCap suit with a set of self-made orientation sensors on Arduino basis so that torso, arms and legs can be tracked in real-time.

Here you can see a prototype of the first generation of the sensor.

Early prototype

A sensor contains a Bluetooth breakout (HC-06), an Arduino Pro Mini and an MPU-6050 breakout (accelerometer + gyroscope). You can see the soldering layout and wire connections in the next image.

Soldering layout

The software for calibration, sensor reading and output via Bluetooth uses the basic Arduino library as well as the i2cdevlib library from Jeff Rowberg.
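The actual sketch can be found in the Download section. As a rough idea of what the firmware does, here is a minimal sketch along the lines of i2cdevlib’s MPU6050_DMP6 example, simplified to poll the DMP FIFO instead of using the interrupt pin; the pin assignment matches wiring #3 further below, while the baud rate and the polling are my assumptions:

#include <Wire.h>
#include <SoftwareSerial.h>
#include "I2Cdev.h"
#include "MPU6050_6Axis_MotionApps20.h"

MPU6050 mpu;                       // MPU-6050 on the I2C bus (A4/A5)
SoftwareSerial btSerial(10, 11);   // HC-05: TXD -> D10 (RX), RXD -> D11 (TX)

uint16_t packetSize;
uint8_t fifoBuffer[64];
Quaternion q;                      // orientation computed by the onboard DMP

void setup() {
  Wire.begin();
  btSerial.begin(9600);            // must match the HC-05 data-mode baud rate
  mpu.initialize();
  if (mpu.dmpInitialize() == 0) {  // 0 = DMP firmware loaded successfully
    mpu.setDMPEnabled(true);
    packetSize = mpu.dmpGetFIFOPacketSize();
  }
}

void loop() {
  uint16_t count = mpu.getFIFOCount();
  if (count == 1024) {
    mpu.resetFIFO();               // FIFO overflow: drop the stale data
  } else if (count >= packetSize) {
    mpu.getFIFOBytes(fifoBuffer, packetSize);
    mpu.dmpGetQuaternion(&q, fifoBuffer);
    btSerial.print(q.w); btSerial.print(' ');   // stream the quaternion
    btSerial.print(q.x); btSerial.print(' ');
    btSerial.print(q.y); btSerial.print(' ');
    btSerial.println(q.z);
  }
}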

After building one sensor we extended the system to combine a set of sensors. We decided to connect every sensor to a single wearable battery.

One issue with Bluetooth has been the number of parallel connections, which is limited to 7. One way to get around this problem is to increase the number of Bluetooth dongles on the receiving side (desktop computer, smartphone, etc.). However, to track the arms, legs and torso more than 15 sensors are required. Keeping all these connections online with low latency and without missing data can be very painful depending on the stack being used. From my point of view Linux is currently the best system for this task since each Bluetooth dongle gets its own stack. On Windows there is just one stack if the default drivers are used. Using other drivers (from Toshiba, etc.) in parallel has also been successful, but it is less comfortable.

We therefore decided to connect two orientation trackers to a single Bluetooth module to halve the number of necessary Bluetooth connections.

2 sensor pairs with 2 Bluetooth units

You can see the extended soldering layout in the following image.

Sensor pair soldering layout

The performance and tracking quality of the first sensor generation was already very promising. Here you can see a video I shot when I persuaded a colleague (thanks Ben!) to play around with the sensors in a simple MoCap application I wrote for the Unreal Engine.

Unfortunately the student project ended at the state of the first sensor generation. However, I wanted to make more out of this project and pushed the development further in my free time. Primarily, I improved the form factor of the case and avoided any cables outside of the case by integrating a battery within the sensor.

This is the 4th (and current) generation of the case, which has several benefits over the first prototypes.

Module cover (created in Solid Edge)
Module base (created in Solid Edge)

 


Creation Guidance

So, if you have decided to recreate this sensor for your own applications, make sure to have everything together. The required components are:

  • acceleration/gyro breakout board (MPU-6050 on Amazon)
  • bluetooth breakout (HC-05 on Amazon)
  • Arduino Pro Mini (3.3V on Amazon)
  • 3.7 V battery (Li-Ion 3.7 V, 850 mAh, e.g. this one)
  • switch, charge connector (e.g. on Amazon)
  • (colored) cables
  • hot glue
  • fixation strap + hook and loop fastener
  • four screws for the cover

Additionally make sure you have the tools you need to assemble the sensor:

  • wire cutter
  • soldering station
  • hot glue gun + glue
  • screw driver
  • needle+thread for the fixation straps

Now, let’s begin to create the sensor piece by piece:

    1. configure the Bluetooth board (set a descriptive name, use wiring #1)
    2. download the software and the models for 3d printing (see Download section below)
    3. upload the software to the Arduino Pro Mini (use wiring #2)
    4. 3d print the cover and the fixation strap clip

Preparing the 3d print in Cura
3d-printed sensor case (4th generation)

    5. check the provided layout to arrange the components on the bottom cover
    6. glue the parts onto the bottom
    7. glue the battery onto the top cover
    8. solder the Arduino to the Bluetooth breakout (use wiring #3)
    9. solder the Arduino to the MPU-6050
    10. solder the switch to the Arduino, set the switch to OFF
    11. solder the switch to the charge connector
    12. solder the battery to the switch

Completed soldering of components
The wired components are finally fixed to the module base using hot glue.

    13. insert the fixation strap and sew on the hook and loop fastener
    14. use the four cover screws to finish the assembly (do not overtighten the screws)

Finished sensor

Initial operation

Before usage, connect the power supply to the charge connector to charge the battery. I used an 850 mAh battery which takes about an hour to charge with a standard power supply. Please monitor the charging behavior using a multimeter! If the battery gets hot and ‘overcharged’ due to a high current, it may explode! And you probably don’t want this to happen.

Programming the HC-05 Bluetooth Breakout

I suggest giving the Bluetooth breakout board a name that describes the sensor. You can use the Arduino Pro Mini for this task; however, it is easier to quickly connect an Arduino UNO to the Bluetooth breakout.

For programming the HC-05 you need the hardware address of the module. The best way to find it is to check it with your smartphone or computer. I am using the application BlueTerm on my smartphone and TeraTerm on my computer. Both applications are very handy. In this example BlueTerm shows that the Bluetooth module has the hardware address 00:14:01:03:55:35. Note the address now; we will use it later in the programming procedure.

After that the actual programming procedure begins.

  • load the SoftwareSerialExample sketch in the Arduino IDE
  • set the mySerial variable to 38400 baud
  • connect the Arduino UNO via USB to the computer
  • upload the sketch to the Arduino UNO while the Bluetooth module is NOT connected
  • disconnect USB (!)
  • connect the Bluetooth module, set up the following wiring for programming

Wiring #1:
HC-05 TXD -> Arduino Digital 10 (RX)
HC-05 RXD -> Arduino Digital 11 (TX)
HC-05 VCC -> Arduino 5V
HC-05 GND -> Arduino GND
HC-05 WakeUp -> Arduino 3.3V

  • connect Arduino UNO via USB, disconnect USB after 5 seconds (!!!)
  • now the HC-05 is in programming mode
  • Open Serial Monitor via Arduino IDE (Baud 9600)
  • insert “AT” and send -> you should get an “OK” reply from the Bluetooth module
  • bind the module to its hardware address (in my case 00:14:01:03:55:35) by sending the command “AT+BIND=0014,01,035535”. Please note the format of the address with the two commas.
  • you may not get an answer to the bind command; at least my breakout did not reply
  • finally, name the module (e.g. NewName in this case): “AT+NameNewName” -> you should get “OKsetname” as a reply from the module
  • disconnect Arduino

Make sure to choose a descriptive name for the sensor (e.g. “UpperLeftArm”) in case you want to use it for real-time MoCap. You can check whether setting the name worked either with your computer or with your smartphone.
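For reference, the modified SoftwareSerialExample described above essentially just relays characters between the Serial Monitor and the HC-05. This is my reconstruction of the steps, not necessarily the exact sketch:

#include <SoftwareSerial.h>

SoftwareSerial mySerial(10, 11);   // RX from HC-05 TXD, TX to HC-05 RXD (wiring #1)

void setup() {
  Serial.begin(9600);              // Serial Monitor side
  mySerial.begin(38400);           // HC-05 AT command mode runs at 38400 baud
}

void loop() {
  // relay characters in both directions
  if (mySerial.available()) Serial.write(mySerial.read());
  if (Serial.available())   mySerial.write(Serial.read());
}

Note that with most HC-05 firmwares the Serial Monitor also has to be set to send both NL and CR line endings, otherwise the AT commands are not accepted.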

Wiring #2: Program the Arduino Pro Mini
Arduino Uno Reset -> Arduino Pro Mini GND (short edge)
Arduino Uno 3.3V -> Arduino Pro Mini VCC
Arduino Uno GND -> Arduino Pro Mini GND
Arduino Uno TX -> Arduino Pro Mini TXD
Arduino Uno RX -> Arduino Pro Mini RXI

Wiring #3: Arduino Pro Mini, MPU-6050, HC-05, Battery
Arduino Pro Mini Digital 10 -> HC-05 TXD
Arduino Pro Mini Digital 11 -> HC-05 RXD
Arduino Pro Mini GND -> HC-05 GND
Arduino Pro Mini 3.3V -> HC-05 VCC
Arduino Pro Mini Digital 2 -> MPU-6050 INT
Arduino Pro Mini 3.3V -> MPU-6050 VCC
Arduino Pro Mini GND -> MPU-6050 GND
Arduino Pro Mini A5 -> MPU-6050 SCL
Arduino Pro Mini A4 -> MPU-6050 SDA
Arduino Pro Mini RAW -> Battery +3.7V
Arduino Pro Mini GND -> Battery GND


Future Work

  • Efficient calibration and drift compensation

There are a bunch of possible improvements for the calibration algorithm. One aspect is the preset parameters for the DMP given in the i2cdevlib library for the MPU-6050. Those calibration parameters are empirical and require fine-tuning for the sensors at hand. I think the drift of the DMP could be reduced if the drift correction parameters were derived in a dedicated calibration step. A first approach could be to measure the raw acceleration drift of the sensor in motion and while fixated, and then simply subtract the measured acceleration bias for drift compensation, roughly as sketched below.
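A minimal version of such a bias-measurement step could look like the following (my own sketch using i2cdevlib’s raw accelerometer interface, not the project’s implemented method; the sensor has to be kept fixated while it runs):

#include <Wire.h>
#include "I2Cdev.h"
#include "MPU6050.h"

MPU6050 mpu;
float biasX = 0, biasY = 0, biasZ = 0;

void estimateAccelBias(int samples) {
  int16_t ax, ay, az;
  for (int i = 0; i < samples; ++i) {
    mpu.getAcceleration(&ax, &ay, &az);   // raw accelerometer readings
    biasX += ax; biasY += ay; biasZ += az;
    delay(2);
  }
  biasX /= samples; biasY /= samples; biasZ /= samples;
  // note: one axis also measures gravity (about 16384 LSB at the default
  // +/-2 g range), which has to be handled separately from the bias
}

void setup() {
  Wire.begin();
  mpu.initialize();
  estimateAccelBias(500);   // afterwards, subtract the bias from each sample
}

void loop() {}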

  • Replacement of Bluetooth by Wifi to enhance the possible number of connections

Currently both the bandwidth and the number of parallel connections are limited by the Bluetooth specification. Wifi is less restrictive in this respect and could be better suited for real-time usage of a larger number of sensors (> 7 parallel connections).

  • Onboard sd-card enables independent tracking recording

If the tracker is to be usable stand-alone, a MicroSD reader should be integrated. This breakout from Adafruit measures just 3×2 cm in length and width and could therefore be integrated by slightly enlarging the sensor case.
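Such a stand-alone logging mode could look roughly like this (a sketch based on the assumption that the standard Arduino SD library is used; the chip-select pin is a placeholder and the SPI pins would have to be coordinated with the pins already occupied by the Bluetooth module):

#include <SPI.h>
#include <SD.h>

const int chipSelect = 4;           // placeholder CS pin, depends on the wiring

void setup() {
  SD.begin(chipSelect);             // returns false if no card is found
}

void logSample(float w, float x, float y, float z) {
  File f = SD.open("IMULOG.TXT", FILE_WRITE);   // opens in append mode
  if (f) {
    f.print(w); f.print(' ');
    f.print(x); f.print(' ');
    f.print(y); f.print(' ');
    f.println(z);
    f.close();
  }
}

void loop() {}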


 

Downloads

3d printing models: Sensor case (STL for 3d print)

Download software components on github

  • Arduino/Teensy sketch: IMU sensor sender (requires i2cdevlib for Arduino IDE)
  • Unreal example project for a simple rotating cube

16 thoughts on “DIY Project – Wearable IMU Tracking Sensor”

  1. Have you thought about attaching these sensors to your feet/ankles for use with an omnidirectional treadmill (such as the virtuix omni)? Do you think these sensors would be accurate enough to provide good input? I have been doing some research lately, as I hope to be able to build my own omnidirectional treadmill for use with the oculus rift cv1 when it comes out.

    Regards,
    Alex


    1. Hi Alex. I think that, since you can exploit an additional ground constraint, your idea should be possible in terms of temporal and angular resolution and latency. However, I would replace the MPU-6050 with a 9-axis sensor breakout for this application (it costs a little more), which also includes a magnetometer. This helps to avoid drift in the yaw angle (about the y-axis).


      1. Do you think an MPU-9250 would be sufficient? Also, do you have any tips as to how I would go about doing this on the software side? I have not written any kind of code before, so this would be my first time. Essentially all I need is a way to use the sensor to emulate a gamepad joystick.


      2. Yes, the MPU-9250 is a nice piece of hardware which also has a DMP onboard. With regard to the software side, you can compute the position data on the Arduino with high temporal resolution and also apply a more or less complex filter (e.g. a Kalman filter) without introducing too much latency. Then transfer the motion data using a serial writer/reader to the render system and inject it into the game engine (e.g. Unreal Engine) which runs your game. There you can trigger every event you need for player control.


  2. Hi Michael,
    I’ve just come across your project and it looks incredibly interesting! Is it still ongoing? Would you at all be able to share the arduino / linux software code that you’ve developed for sensor sending – receiving? My experience is very limited in this field, but I would love to try this out for my own diy project!
    Thank you!


    1. The code is online on github. The serial reader is currently not really portable, but this shouldn’t be a big deal since it is basically just reading 4 floats representing the quaternion from the USB in a loop.


      1. Actually full details on the assembly process and code/integration for full utilization would be Awesome. 😀
        Well, you asked 😏😜
        Thank you for providing what you have to everyone – it is GREATLY appreciated !!!!


  3. Your project is amazing…
    I have done a similar thing for a university project using an Arduino 101, which has integrated IMU sensors, and I was looking for a nice tool to plot the IMU values; your Unreal Engine MoCap is gorgeous, so I’ll try to use it… you uploaded it to github, right?
    The thing that I’ll have to change is that the values it has to use and read are already stored in a txt file; do you think that is possible?
    Thanks so much
    Very good job


      1. I was looking in the github folder, but where is the code of the “blu man” sample? I have found just the cube example…. Unfortunately the readme isn’t very helpful…

