Melbourne University RoboGals Reasons.

Each girl entering the RoboGals competition had to answer some questions. My answers were:

Why did you choose this project?

Every time I see a person confined to a wheelchair, they’re either pushing their wheels with their hands or, in the case of a complete quadriplegic, being pushed around by others. So I thought: what if I did an experiment that allowed complete quadriplegics to have independence when controlling their wheelchairs? It looked like a difficult investigation, and I thought it would be quite challenging to complete in time. Most importantly, it looked like lots of fun to do, especially since I got to use my mentor as my quadriplegic model! You may have noticed that the above video is longer than the specified 4 minutes, so if you would like to see the shorter version, which has less information about the sensors used, please go here: http://www.youtube.com/watch?v=Ax5Y4dnYNgg

What did you enjoy most about the project?

I enjoyed researching the needs of quadriplegics and incorporating some of them into my prototype wheelchair. It was very difficult to do these things, but the challenge is part of the fun! I enjoyed using and investigating the different sensors (LEGO, MindSensors, HiTechnic, Firgelli, Dexter Industries), and this gave me more experience in RobotC. But the thing I enjoyed most was the feeling that I could help others – if this worked, so many people could benefit from it! Overall, it has been a very enjoyable and challenging investigation for me to do. :)

What have you learned from the project?

There were so many things I learnt from this experience. I couldn’t possibly fit them in the video, so I listed them here.

  1. This is my first big RobotC project. I wasn’t familiar with this programming language before, as my previous programming language was RoboLab.
  2. This is my first time using MindSensors Sumo Eyes, which were the sensors I used to detect the obstacle and the cliff (BONUS: SAFETY FEATURES).
  3. This is my second time using HiTechnic EOPD sensors, but my first time using two EOPD sensors at the same time (see the sketch after this list).
  4. This is my first time using a Firgelli Linear Actuator (used for the up-and-down movement of the chair).
  5. This is my first time using radio communication between robots (the Dexter Industries NXTBee device). This is also a very important step for me, because haven’t we all had trouble with tangled cables and knotted wires?
  6. I learnt a lot about quadriplegics, and about how inconvenient it is to have to rely on others whenever you want to go somewhere.
  7. I learnt that some people can train their eyebrows and ears to move – it doesn’t have to be an inborn ability. So unless the person in the wheelchair has facial paralysis (e.g. Möbius syndrome), it shouldn’t be too difficult for them to get their facial features moving if they practise.
  8. I learnt to use Camtasia Studio and SnagIt for video and image editing.
  9. I learnt how to upload videos to YouTube.
  10. And of course, I learnt that it is possible to give complete quadriplegics independent control of a wheelchair using their facial features.
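For anyone wondering what “using two EOPD sensors at the same time” might look like in RobotC, here is a very simplified sketch of the basic idea. It is not my actual competition program: the sensor ports, motor assignments, threshold and speeds are all placeholders (the real values depend entirely on how the headset is adjusted), the two sensors are read directly on a single NXT, and the NXTBee radio link is left out completely.

    // Simplified sketch only - every value below is a placeholder.
    #pragma config(Sensor, S1, leftEOPD,  sensorAnalogActive)
    #pragma config(Sensor, S2, rightEOPD, sensorAnalogActive)

    const int THRESHOLD = 300;   // placeholder raw reading that counts as "feature moved"

    task main()
    {
      int leftReading;
      int rightReading;

      while (true)
      {
        leftReading  = SensorValue[leftEOPD];    // EOPD watching one facial feature
        rightReading = SensorValue[rightEOPD];   // EOPD watching the other

        // Each sensor switches one wheel of the prototype on or off.
        if (leftReading > THRESHOLD)
          motor[motorB] = 40;
        else
          motor[motorB] = 0;

        if (rightReading > THRESHOLD)
          motor[motorC] = 40;
        else
          motor[motorC] = 0;

        wait1Msec(50);                           // short pause between checks
      }
    }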

Keep in mind that this is just a prototype wheelchair, so the final product will be different. :)

How did your parent or mentor help you?

My mentor had previously written some tutorials for the NXTBee, Sumo Eyes and EOPD sensors (www.DrGraeme.net), as they aren’t part of the standard RobotC compiler. He also showed me how to use Camtasia Studio and SnagIt, and how to upload videos to YouTube. He said he learnt how to move his ears when he was my age, after reading Enid Blyton’s Famous Five books, in which Uncle Quentin could move one ear. He learnt how to move his eyebrows separately at that age too! He said you can learn to move your eyebrows and ears with practice, and the references I read on the internet agree.

For the senior age group (13-18): briefly explain the underlying scientific theories behind the project, why you chose this method and equipment, and whether the experiment is repeatable and why, and any formulas or calculations used in the project.

I couldn’t afford a real wheelchair (they can cost many thousands of dollars), so I prototyped the wheelchair using a LEGO MindStorms NXT robot instead, which provides a cost-effective way to investigate wheelchair independence. The LEGO MindStorms NXT set is commercially available, making this investigation repeatable for anyone with the kit and the sensors. However, the adjustment of the headset is critical: any movement could cause variation in the readings and create problems with the functioning of the wheelchair. The movement of the nose is extremely small, and right at the limit of the EOPD sensor’s range. I did manage to get it to work, but I’m not sure if it is reliable enough for a commercial environment.
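Because the nose movement is so small and so close to the limit of the sensor’s range, even small variations in the readings matter. One simple idea for making the readings steadier (shown below purely as an illustration, not as part of my competition program; the number of samples is an arbitrary choice) is to average a few raw readings before comparing them with the threshold:

    // Illustration only: average several readings to smooth out the small
    // variations that appear near the limit of the EOPD's range.
    int smoothedReading(tSensors port)
    {
      int total = 0;
      int i;
      for (i = 0; i < 5; i++)        // 5 samples is an arbitrary choice
      {
        total += SensorValue[port];
        wait1Msec(5);
      }
      return total / 5;
    }

The trade-off is speed: averaging makes each reading take longer, so there is a balance between smoother readings and how quickly the wheelchair responds.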

Yaya Lu.

YayaLu.net - Some of my robots - 2011