[youtube http://www.youtube.com/watch?v=Y-z9cT9MOKU&rel=0&w=640&h=360] [Microsoft Kinect-controlled VTOL aircraft control simulator using LabVIEW]
In the three years I spent struggling with my Pharmacology BSc, I came to realise that researching new and innovative drug molecules wasn’t quite my thing. The new and innovative bit was cool, the drugs and science bit not so.
Prior to leaving university, I started FatKidOnFire as a way of distracting myself from my finals. Upon leaving (with a degree), I continued the blog while I did what every graduate does after the Student Loan runs out – I started looking for a job.
Except I didn’t want to work in a lab or do a Masters (which, at the time, seemed like my only options). So I did a bit of hunting around and ended up doing an internship with a lovely little digital agency on the edges of East London’s Silicon whatsit.
It was at Hoop, where I’ve been for the past 14 months, that I discovered my love for new and innovative technology. One of my main responsibilities at the company is keeping their blog ticking over – with industry insight pieces and a few series on startups/ internetz finds.
Getting to the point (I’m not particularly good at being succinct – props if you made it this far and please keep reading!), I’m leaving Hoop in ten days’ time and joining a four-person technology startup in North London. I’ll be losing the blogging steam I’ve been running on at Hoop – but luckily the nice people at Ubelly have offered me a spot here! So I’ll be spouting my love for all things cool and innovative here instead (if you’ll have me).
I’ve been scratching my head about how to kick my Ubelly career off, but luckily (as things often do) it all fell into place. One of my old school friends is at the University of Leeds, studying for a Mechanical Engineering Masters. And it just so happens his final-year research project involves a rather nifty use of the Kinect SDK…
[Chris demonstrating part of a video suite that Kinesthesia have developed. When patients are doing the traditional ARAT test, the Kinect is able to video and record the entire session. The video is recorded as an AVI with the skeleton position data encoded into the video, allowing clinicians to review through the video footage to get a better, 3D view of the patient's movement. They can also select areas of interest in the video and append them together so that time between exercises can be ignored.]
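The splicing step the caption describes is easy to picture in code. Kinesthesia’s toolkit is built in LabVIEW, but a rough Python sketch – all names and data shapes here are my own illustrative assumptions, not their implementation – of keeping only the marked intervals might look like this:

```python
# Illustrative sketch: splice the 'areas of interest' out of a recorded
# session. Each frame pairs a timestamp (in seconds) with skeleton data.

def splice_intervals(frames, intervals):
    """Keep only frames whose timestamp falls inside one of the
    (start, end) intervals, dropping idle time between exercises."""
    return [
        (t, skeleton)
        for t, skeleton in frames
        if any(start <= t <= end for start, end in intervals)
    ]

# A toy 10-second session at 2 frames per second; the 'skeleton' is a
# dict of joint name -> (x, y, z) position, a purely hypothetical layout.
session = [(i * 0.5, {"hand_right": (i * 0.01, 1.0, 2.0)}) for i in range(20)]

# The clinician marks two intervals of interest (in seconds).
clip = splice_intervals(session, [(0.0, 2.0), (7.0, 8.0)])
print(len(clip))  # 8 frames survive; everything between 2 s and 7 s is dropped
```

The same idea scales up: once each video frame carries a timestamp and its skeleton data, appending the selected clips is just concatenating the surviving frames.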
Who are you guys? We are Chris, Barnaby and Dom. We’re all in our final year of study for an MEng in Mechanical Engineering at the University of Leeds. Barnaby and I have focused largely on tech-based subjects – a lot of programming and mechatronics – while Dom has focused his studies on Biomedical Engineering (Leeds is a world leader in biomedical engineering research).
Tell me a bit about your research project. Who/ what/ why/ how? During the summer of 2011, Barnaby and I were both on independent summer placements. Over the summer, a large number of Kinect hacks started going viral – stimulating our imagination. Through a long stream of emails, we started to come up with some ideas and think about how we could get the project approved as a final year project. When we approached the university, two academics were keen to investigate the potential of the Microsoft Kinect: Prof. Martin Levesley (who is researching the next generation of healthcare devices) and Dr. Peter Culmer (who is head of Surgical Technology). We were linked to National Instruments as our industrial mentor. National Instruments’ LabVIEW software is used extensively within the department, so developing code that could be used in the department was pretty damn vital.
[The toolkit Kinesthesia have developed. The code you see here is in the LabVIEW graphical programming language – it is all the code you need in order to get the Video and Depth Images from the camera and produce a 3D rendering of the skeleton.]
What are you researching? The primary focus of our project is to develop a toolkit that opens up the functionality of the Microsoft Kinect and interfaces it with National Instruments’ LabVIEW. Beyond this, the project has three main aims:
- We are developing a system that allows a clinician to assess a patient’s progress throughout rehab (for instance after a stroke) by viewing video and skeleton data. This is showing a lot of promise and we are extending this section to produce a video analysis suite so that video and skeleton data captured can be edited, viewed and analysed programmatically. We are currently working on providing a virtual ARAT test (Action Research Arm Test) that is used to assess a patient’s capacity to perform reaching and grasping tasks (particularly after a stroke).
- The second project goal is looking into the possibilities of using the Kinect for new technology within surgery. We initially looked at using the skeletal tracking of the Kinect to measure the movement of a surgeon while working on a laparoscopic training tool (to give an objective measure of a surgeon’s skill through their efficiency of movement and time taken). We are also aiming to use the depth-measurement capabilities to assess the room a surgeon has to work in within the abdomen of a patient.
- The third area of research is using the Kinect’s skeleton tracking to provide a gait analysis suite for the assessment of musculoskeletal diseases.
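The “efficiency of movement” idea from the surgical goal above can be sketched with a toy metric – this is my own illustrative Python, not the project’s LabVIEW code: the straight-line distance between start and end of a reach, divided by the distance actually travelled.

```python
import math

def path_length(points):
    """Total distance travelled through a sequence of 3D points."""
    return sum(math.dist(a, b) for a, b in zip(points, points[1:]))

def movement_efficiency(points):
    """Crude efficiency score: straight-line distance between start and
    end divided by the path actually travelled. An economical movement
    scores close to 1.0; a wandering one scores lower."""
    travelled = path_length(points)
    return math.dist(points[0], points[-1]) / travelled if travelled else 0.0

# A direct reach vs. a wandering one between the same endpoints,
# as tracked hand positions in metres (made-up sample data).
direct_reach = [(0, 0, 0), (0.1, 0, 0), (0.2, 0, 0)]
wandering    = [(0, 0, 0), (0.1, 0.1, 0), (0.2, 0, 0)]
print(movement_efficiency(direct_reach))  # 1.0
print(movement_efficiency(wandering))     # < 1.0
```

Combined with the time taken per task, a score like this gives the kind of objective skill measure the training tool is after.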
What hardware do you use in the course of your research? The main focus of the work is obviously the Microsoft Kinect. In order to evaluate and validate the measurements we obtain from it, we’re also using various pieces of hardware (such as the OptoTrak Certus infrared tracking system and 3D scanners) to validate the skeleton tracking and depth map measurements. Additionally, some of our examples of Kinect-controlled systems interface with external hardware, such as the VTOL demonstration.
When did you adopt the Kinect SDK? Why? The Microsoft Kinect is game-changing technology. Through the combination of its depth map and its skeleton tracking abilities, it enables a massive range of applications. It was a clear choice to attempt to use this incredible technology to produce a low-cost and effective toolkit for biomedical engineers. Working with it, we – and the academic staff supporting us – are consistently blown away by the capabilities of the hardware and by how such a capable device has been made affordable and open to anyone.
[An example of charts that physios could use to plot either the position, velocity or acceleration of any joint of the skeleton in the X, Y and Z axes based on the options the user chooses. It's quite a good tool for the physios Kinesthesia have been talking to!]
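Deriving velocity and acceleration from the Kinect’s joint positions, as those charts do, is essentially numerical differentiation of the sampled position data. A minimal Python sketch (illustrative only – the real toolkit does this in LabVIEW; 30 Hz is the Kinect’s nominal frame rate):

```python
def finite_difference(samples, dt):
    """First derivative of a uniformly sampled signal, using simple
    forward differences - enough for a plotting sketch like this."""
    return [(b - a) / dt for a, b in zip(samples, samples[1:])]

# Made-up X-axis positions (metres) of one joint, sampled at 30 Hz.
dt = 1.0 / 30.0
x_pos = [0.0, 0.01, 0.03, 0.06, 0.10]

x_vel = finite_difference(x_pos, dt)  # velocity in m/s
x_acc = finite_difference(x_vel, dt)  # acceleration in m/s^2
print(x_vel)
print(x_acc)
```

Differentiating once per chart like this, a physio can plot position, velocity or acceleration of any tracked joint in X, Y or Z from the same recorded session.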
How does the Kinect facilitate your research? The SDK has enabled us to gain access to the full capabilities of the Kinect, controlling camera functions and collecting all data streams from it. We have built Virtual Instruments (sub-programmes within LabVIEW) that provide a simplified interface for people wanting to develop Kinect-based applications in LabVIEW and gain access to the data they need. Through these VIs we are developing software to analyse the data collected from the Kinect, so that clinicians and researchers can assess patients.
What implications does your research with the Kinect have for changing the current tools used for treatment/ research? Already, within the School of Engineering, we are looking to replace the use of the Optotrak system to assess the patients that have been trialling the use of the intelligent Pneumatic Arm Movement (iPAM) robot. We are also looking into adding a Kinect to the iPAM robots, as well as another robot developed for children with Cerebral Palsy to ensure exercises are being performed correctly. Finally, the virtual ARAT environment may eventually provide rehabilitation exercises and assessments that can be performed in the patient’s house.
Any final words? Expect to see a lot more physical systems being controlled via the Kinect. The quantity and variety of systems operated via LabVIEW is staggering, and by providing a user-friendly interface between the two, we are opening up the Kinect to a whole host of new applications.
We are also entering the project into the 2012 National Instruments Student Design Competition, so if anyone likes what we’re doing, feel free to visit our project page and even give it a ‘like’!
[Barnaby demonstrating the Virtual ARAT system Kinesthesia are developing – people sit in front of it and can have 'their' arms rendered on-screen. According to Chris: "the camera angle moves with the user's head so you get some cool effects of the perspective etc as you move around".]
So there you go. I was pretty blown away by Chris and the guys’ research project with the Kinect SDK – and it seems their work has some pretty massive implications for the rehab treatment of stroke patients, bearing in mind stroke is the “leading cause of disability in adults in the USA and Europe”! Pretty damn cool.
*I was trying to think of a title for this series of posts – until I remembered a nickname Sara had given me. Problem solved!