This school project was built with Arduino and Java. Our team was given free choice of problem domain, so we decided to take on accessibility.
American Sign Language suffers from a shortage of fluent signers and interpreters, so we researched ways to reduce the communication friction between signers and non-signers. We settled on a glove that responds to the extension and contraction of the wearer's fingers, translating each handshape into an alphanumeric character that is displayed and spoken on a terminal.
Within eight working days, we had a working prototype ready for presentation.
My responsibilities included electrical circuitry, wireless interfacing, data visualization, ASL translation systems, and diagramming.
It became obvious early on that the glove was not going to be pretty. We were constrained from the start by the electrical kits we were issued, and by the need to fit all of the sensor data collection onto the back of a glove before handing the data off for processing. All of our wires were of uniform length, as is standard for the supplied Arduino kits, and our only wiring options were a slightly bulky breadboard and the Arduino unit itself.
Our solution was to mount the Arduino unit on top of the breadboard, covering most of the breadboard's openings but leaving enough room for the functionality we needed. We also purchased five bend sensors, one for each finger.
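To illustrate how bend-sensor readings can be turned into per-finger states, here is a minimal sketch of the quantization step. The calibration bounds, threshold, and bitmask layout are assumptions for the example, not our actual calibration; on the Arduino itself, the raw values would come from analogRead() on a voltage divider per sensor.

```cpp
#include <array>
#include <cstdint>

// Hypothetical calibration bounds for one flex sensor read through a
// voltage divider on a 10-bit ADC (0-1023). Real values depend on the
// sensor, the divider resistor, and the wearer's hand.
constexpr int kStraightReading = 300; // finger fully extended
constexpr int kBentReading = 700;     // finger fully contracted

// Map a raw ADC reading to a bend fraction in [0.0, 1.0].
double bendFraction(int raw) {
  double t = double(raw - kStraightReading) /
             double(kBentReading - kStraightReading);
  if (t < 0.0) t = 0.0;
  if (t > 1.0) t = 1.0;
  return t;
}

// Quantize all five fingers into a bitmask: bit i set = finger i bent.
uint8_t handState(const std::array<int, 5>& readings,
                  double threshold = 0.5) {
  uint8_t mask = 0;
  for (int i = 0; i < 5; ++i) {
    if (bendFraction(readings[i]) >= threshold) mask |= (1u << i);
  }
  return mask;
}
```

Quantizing to a bitmask keeps the downstream translation step simple: each static handshape reduces to a single small integer.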
The combined mount (minus the battery) was, at minimum, about the size of an average palm.
It was important for us to have wireless functionality, both to satisfy our own technical curiosity and to avoid tethering the wearer to the terminal. In short, we wanted to make sure our prototype wasn't introducing constraints, since its entire goal was to remove them.
We used XBee radios as an Arduino-compatible wireless solution.
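In transparent mode, an XBee pair behaves like a wireless serial cable: bytes written on the glove side arrive verbatim at the receiver. One simple way to stream the sensor data over such a link is one newline-terminated CSV record per sample. The sketch below (with hypothetical helper names, not our actual protocol) shows that framing and the matching parse on the receiving end.

```cpp
#include <array>
#include <sstream>
#include <string>

// Frame five raw sensor readings as one newline-terminated CSV record.
// Over a transparent-mode XBee link, whatever the glove side writes to
// its serial port arrives unchanged at the host.
std::string frameReadings(const std::array<int, 5>& r) {
  std::ostringstream out;
  for (int i = 0; i < 5; ++i) {
    if (i) out << ',';
    out << r[i];
  }
  out << '\n';
  return out.str();
}

// Parse one received record back into readings on the host side.
std::array<int, 5> parseRecord(const std::string& line) {
  std::array<int, 5> r{};
  std::istringstream in(line);
  std::string field;
  for (int i = 0; i < 5 && std::getline(in, field, ','); ++i) {
    r[i] = std::stoi(field);
  }
  return r;
}
```

A newline-delimited text format keeps the link easy to debug: the stream is human-readable in any serial monitor, and a dropped byte corrupts at most one record.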
Moving forward, our team would have liked to add accelerometer support for more gestural communication, since ASL words (and some letters) use sweeping or arcing movements as a meaningful variable within the language. Two-hand support would be a natural extension to cover more of the language.
Because of the language's complexity, scaling the project up would be a good opportunity to train the terminal-side translation software with artificial neural networks. Within our intended scope, we managed to programmatically account for all numerical characters and the subset of the alphabet that doesn't depend on motion. Handling motion and a second hand programmatically would be dramatically harder, since the space of hand states grows combinatorially.
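The programmatic approach for static handshapes can be sketched as a lookup from a quantized five-finger state to an output character. The bitmask encoding and the table entries below are placeholders for illustration, not accurate ASL handshapes; the real table would be built by calibrating each sign against recorded sensor data.

```cpp
#include <cstdint>
#include <map>

// Illustrative lookup from a five-finger bend bitmask (bit i set =
// finger i contracted; bit 0 = thumb ... bit 4 = pinky) to an output
// character. The entries are placeholders, not real ASL handshapes.
char translate(uint8_t mask) {
  static const std::map<uint8_t, char> kSigns = {
      {0b11111, 'A'}, // placeholder: all fingers contracted
      {0b00001, 'B'}, // placeholder: thumb contracted, others extended
      {0b00000, '5'}, // placeholder: open hand
  };
  auto it = kSigns.find(mask);
  return it == kSigns.end() ? '?' : it->second;
}
```

This works only while each sign maps to a single static state; once motion or a second hand enters the picture, the input becomes a trajectory rather than one integer, which is where a learned classifier would replace the table.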
We were happy with our proof of concept in the time we were given, and we received 100% on the project for successfully tackling wireless data streaming in addition to hitting our functionality goals.