Re-Glove





Introduction

The glove-based controller is a new approach to interacting with the digital world. Its design tracks the wearer's hand in real time, allowing for truly immersive experiences. With a wide range of possible applications, the controller offers a dynamic experience that enhances existing digital technology.


Summary

Traditional input devices such as the keyboard and mouse are unable to keep up with the rapidly growing world of Augmented Reality (a technology that superimposes a computer-generated image on a user's view of the real world, producing a composite view). They offer limited functionality and are not user-friendly in computing environments developed specifically to control objects in the physical world.

The proposed glove-based controller aims to supplement traditional input methods by using physical gestures to control a device through intuitive, natural movements. The key idea behind this prototype is to pave the way for entirely new input devices. With sustained growth in Augmented Reality and Virtual Reality, such controllers will emerge as ideal wearable devices for communicating with computers, devices, robots and environments, both wirelessly and over a wired connection. In the future, this kind of technology is ideal for interacting with immersive 3D Virtual Reality environments and for modelling the movement of body parts within them. The glove can provide a level of control that traditional input devices cannot rival.

Apart from controlling a computer, the glove can be used to control robotic devices such as remote-controlled cars or, most naturally, a robotic arm. Home devices such as smart TVs (with extensions that allow control of older televisions), lighting systems, and other household electrical appliances can be controlled with simple hand gestures.


Product details

The controller is a skeleton glove that enables the wearer to control electronic devices such as computers, robots, drones and IoT-enabled home appliances. Designing the device is a challenge, since it must fit the hand comfortably while remaining portable and lightweight.

The device consists of five flex sensors, one placed over each finger, and a central 6/9-degree-of-freedom Inertial Measurement Unit (IMU) breakout board (MPU 6050/9150) attached to the dorsal surface of the hand. Each flex sensor is designed so that its resistance increases as the sensor is bent.

The resulting change in resistance is read as a voltage by an on-board microcontroller (Arduino Nano / Arduino Pro Mini). The board also reads the raw values from the IMU and processes the data into human-readable instantaneous state variables. These values are sent wirelessly (over WiFi or Bluetooth) or over a wired connection, as required, to a computer*.
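
As a concrete illustration of this read-and-transmit loop, the following is a minimal Arduino-style firmware sketch rather than the final firmware: the pin assignments, the comma-separated packet format, the complementary-filter constants and the assumption that the Serial port is bridged by a Bluetooth module (e.g. an HC-05) or a USB cable are all illustrative choices.

// Illustrative firmware sketch: read the five flex-sensor voltage dividers and
// the MPU 6050, fuse accelerometer and gyroscope data into pitch/roll, and
// stream one comma-separated state line per update.
#include <Wire.h>

const uint8_t MPU_ADDR = 0x68;                     // MPU 6050 default I2C address
const int FLEX_PINS[5] = {A0, A1, A2, A3, A6};     // assumed wiring; A4/A5 are reserved for I2C

float pitch = 0.0f, roll = 0.0f;                   // fused orientation estimate, degrees
unsigned long lastMicros = 0;

void setup() {
  Serial.begin(115200);                            // USB cable or HC-05 style serial Bluetooth bridge
  Wire.begin();
  Wire.beginTransmission(MPU_ADDR);                // wake the IMU out of sleep
  Wire.write(0x6B);                                // PWR_MGMT_1 register
  Wire.write(0);
  Wire.endTransmission(true);
  lastMicros = micros();
}

void loop() {
  // Burst-read 14 bytes starting at ACCEL_XOUT_H: accel XYZ, temperature, gyro XYZ.
  uint8_t buf[14];
  Wire.beginTransmission(MPU_ADDR);
  Wire.write(0x3B);
  Wire.endTransmission(false);
  Wire.requestFrom(MPU_ADDR, (uint8_t)14, (uint8_t)true);
  for (int i = 0; i < 14; i++) buf[i] = Wire.read();

  int16_t ax = (buf[0] << 8) | buf[1];
  int16_t ay = (buf[2] << 8) | buf[3];
  int16_t az = (buf[4] << 8) | buf[5];
  int16_t gx = (buf[8] << 8) | buf[9];             // bytes 6-7 hold temperature, skipped
  int16_t gy = (buf[10] << 8) | buf[11];
  int16_t gz = (buf[12] << 8) | buf[13];

  // Convert to physical units (+/-2 g and +/-250 deg/s default full-scale ranges).
  float axg = ax / 16384.0f, ayg = ay / 16384.0f, azg = az / 16384.0f;
  float gxd = gx / 131.0f,   gyd = gy / 131.0f,   gzd = gz / 131.0f;

  // Complementary filter: integrate the gyro, correct drift with the accel angles.
  unsigned long now = micros();
  float dt = (now - lastMicros) / 1.0e6f;
  lastMicros = now;
  float accPitch = atan2(-axg, sqrt(ayg * ayg + azg * azg)) * 57.2958f;
  float accRoll  = atan2(ayg, azg) * 57.2958f;
  pitch = 0.98f * (pitch + gyd * dt) + 0.02f * accPitch;
  roll  = 0.98f * (roll  + gxd * dt) + 0.02f * accRoll;

  // One human-readable state line: pitch, roll, yaw rate, five finger bends.
  Serial.print(pitch); Serial.print(',');
  Serial.print(roll);  Serial.print(',');
  Serial.print(gzd);
  for (int i = 0; i < 5; i++) {
    Serial.print(',');
    Serial.print(analogRead(FLEX_PINS[i]));        // raw 0-1023 reading of each voltage divider
  }
  Serial.println();
  delay(10);                                       // roughly 100 Hz update rate (assumed)
}

Streaming one plain-text line per update keeps the link easy to debug with any serial monitor; a compact binary packet would be the natural next step once bandwidth matters.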

The computing unit will process the data, interpret the gestures and accordingly perform the required tasks based on the current operating mode of the controller.
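
The operating mode determines how a recognised gesture is interpreted. A minimal sketch of such mode-based dispatch is shown below; the mode names, gesture labels and printed actions are hypothetical placeholders for the real actions (mouse events, key presses, rover commands).

#include <iostream>
#include <string>

// Hypothetical operating modes of the controller.
enum class Mode { Cursor, Presentation, Rover };

// Dispatch a recognised gesture according to the active mode.
void handleGesture(Mode mode, const std::string& gesture) {
  switch (mode) {
    case Mode::Cursor:
      if (gesture == "pinch") std::cout << "mouse click\n";
      break;
    case Mode::Presentation:
      if (gesture == "swipe_right") std::cout << "next slide\n";
      if (gesture == "swipe_left")  std::cout << "previous slide\n";
      break;
    case Mode::Rover:
      if (gesture == "tilt_forward") std::cout << "drive forward\n";
      break;
  }
}

int main() {
  handleGesture(Mode::Presentation, "swipe_right");  // prints "next slide"
}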


Impact and utility

The controller is an entirely new input method and is expected to soon be incorporated into almost all VR-enabled devices. The main thrust behind developing the glove is to enable a more natural way of interacting with technology. In this manner, we will be able to better understand newer methods of controlling devices and appliances.

In the near future, this technology could be used to control robotic devices remotely, replacing every dedicated remote control with a single universal controller. Its applications are potentially limitless.

It can be used to control robots on a manufacturing line in areas humans are not allowed to enter, to drive vehicles, to control office and home electronics, and to control playback on connected media-streaming devices. It can also be used to instruct intelligent robotic objects. Gesture recording is also possible, and gestures can be user-independent.


Target Customers

Since the product can control virtually every aspect of technology, its uses range from the most basic, such as controlling a PowerPoint presentation by swiping left or right to change slides or acting as a virtual laser pointer on screen, to the most advanced, such as controlling complex machinery and robotic systems like quadcopters or remotely operated robots. It can interact with 3D drawings and objects in the virtual world and be used to play games in immersive environments.
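
To make the presentation-control case concrete, the sketch below detects a quick left or right swipe from the yaw-rate stream and maps it to an arrow-key press. The threshold, the refractory interval and the Windows-only key-injection call are assumptions made for the sketch, not the behaviour of the current prototype.

#include <cstdio>

#ifdef _WIN32
#include <windows.h>
#endif

// Send a Left/Right arrow key so a presentation application changes slides.
// Key injection is OS-specific; only the Windows path is sketched here.
void sendArrow(bool right) {
#ifdef _WIN32
  BYTE vk = right ? VK_RIGHT : VK_LEFT;
  keybd_event(vk, 0, 0, 0);                    // key down
  keybd_event(vk, 0, KEYEVENTF_KEYUP, 0);      // key up
#else
  std::printf("arrow %s\n", right ? "right" : "left");
#endif
}

// Flag one yaw-rate sample (deg/s) as a swipe once it crosses a threshold,
// then ignore further samples for a short refractory period to avoid repeats.
bool detectSwipe(float yawRate, int& cooldown, bool& right) {
  const float THRESHOLD = 120.0f;              // assumed swipe speed, deg/s
  if (cooldown > 0) { --cooldown; return false; }
  if (yawRate > THRESHOLD)  { right = true;  cooldown = 30; return true; }
  if (yawRate < -THRESHOLD) { right = false; cooldown = 30; return true; }
  return false;
}

int main() {
  // Stand-in for the live yaw-rate stream arriving from the glove.
  float samples[] = {5, 10, 160, 40, 3, -150, -20, 0};
  int cooldown = 0;
  bool right = false;
  for (float s : samples) {
    if (detectSwipe(s, cooldown, right)) sendArrow(right);
  }
}

On other platforms the stub branch would be replaced by the corresponding injection API (e.g. XTest on Linux).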

The glove can also be used for home automation and for controlling smart devices. More complex gestures can support extensive interaction with computers and VR systems. In future iterations, we plan to integrate the product with 3D CAD software such as SolidWorks or Autodesk Fusion.


Project Development

The project began in September 2017. In the initial month, we discussed design, interfacing and programming decisions regarding the glove. We elected to use the Arduino Nano for our processing needs, both because its 5-volt logic level matched the sensors' voltage requirements and for its robustness. We chose not to use the Teensy, as it was too expensive for the prototyping phase.

Starting in October, we formalized the pipeline for transmitting and processing the sensor data generated by the MPU 6050. A few simple programs were written that allowed the wearer to navigate a computer and move the cursor via the glove. We established that the data remained reliable and correct at the transmission rates and distances we required.
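
A minimal sketch of the receiving end of that pipeline is given below. It parses the comma-separated state line assumed in the earlier firmware sketch and converts pitch and roll into cursor deltas; the packet layout, dead zone and gain are illustrative assumptions, and the actual cursor movement would go through an OS-specific call (e.g. SetCursorPos on Windows) rather than the printf placeholder.

#include <cmath>
#include <cstdio>
#include <sstream>
#include <string>
#include <vector>

// One parsed state line from the glove: orientation plus five finger bends.
struct GloveState {
  float pitch = 0, roll = 0, yawRate = 0;
  int flex[5] = {0};
};

// Parse "pitch,roll,yawRate,f0,f1,f2,f3,f4" into a GloveState.
bool parseLine(const std::string& line, GloveState& out) {
  std::stringstream ss(line);
  std::string field;
  std::vector<float> vals;
  while (std::getline(ss, field, ',')) vals.push_back(std::stof(field));
  if (vals.size() != 8) return false;
  out.pitch = vals[0]; out.roll = vals[1]; out.yawRate = vals[2];
  for (int i = 0; i < 5; i++) out.flex[i] = (int)vals[3 + i];
  return true;
}

// Map hand tilt to a cursor velocity, with a small dead zone so the cursor
// holds still while the hand is roughly level.
void tiltToCursorDelta(const GloveState& s, int& dx, int& dy) {
  const float DEAD_ZONE = 5.0f;   // degrees, assumed
  const float GAIN = 0.6f;        // pixels per degree per update, assumed
  dx = (std::fabs(s.roll)  > DEAD_ZONE) ? (int)(s.roll   * GAIN) : 0;
  dy = (std::fabs(s.pitch) > DEAD_ZONE) ? (int)(-s.pitch * GAIN) : 0;
}

int main() {
  GloveState s;
  if (parseLine("12.5,-3.0,0.4,512,498,505,770,760", s)) {
    int dx, dy;
    tiltToCursorDelta(s, dx, dy);
    std::printf("move cursor by (%d, %d)\n", dx, dy);  // placeholder for the OS cursor call
  }
}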

By November, we were able to estimate the pose of a hand purely from MPU readings, as well as deftly maneuver a radio-controlled rover. A simple housing for the components was developed, which enabled us to collect sensor data and explore the possibility of gesture recognition. Four iterations of the housing were designed and 3D printed in the Sandbox. Models trained on the collected data indicated that offline gesture recognition was entirely possible. A flight simulator was also interfaced with the glove. Users who tried the rover and the simulator reported that glove control felt more natural and intuitive than traditional methods such as a radio controller or a keyboard-and-mouse setup.
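
To illustrate what offline gesture recognition can look like in its simplest form, the sketch below classifies a fixed-length window of recorded features by its distance to per-gesture mean templates (a nearest-centroid classifier). The feature layout and gesture labels are toy assumptions; the models actually trained on the collected data are not reproduced here.

#include <cmath>
#include <cstdio>
#include <string>
#include <vector>

// A gesture template: a label plus the mean feature vector of its training windows.
struct GestureTemplate {
  std::string label;
  std::vector<float> centroid;
};

// Euclidean distance between a feature window and a template centroid.
float distance(const std::vector<float>& a, const std::vector<float>& b) {
  float d = 0;
  for (size_t i = 0; i < a.size(); ++i) d += (a[i] - b[i]) * (a[i] - b[i]);
  return std::sqrt(d);
}

// Nearest-centroid classification: return the label of the closest template.
std::string classify(const std::vector<float>& window,
                     const std::vector<GestureTemplate>& templates) {
  std::string best = "unknown";
  float bestDist = 1e30f;
  for (const auto& t : templates) {
    float d = distance(window, t.centroid);
    if (d < bestDist) { bestDist = d; best = t.label; }
  }
  return best;
}

int main() {
  // Toy 4-dimensional features (e.g. mean pitch, mean roll, mean flex, peak yaw rate).
  std::vector<GestureTemplate> templates = {
    {"fist",       {0.0f, 0.0f, 900.0f, 0.0f}},
    {"swipe_left", {0.0f, 5.0f, 300.0f, -150.0f}},
  };
  std::vector<float> window = {1.0f, 4.0f, 320.0f, -140.0f};
  std::printf("gesture: %s\n", classify(window, templates).c_str());  // prints "swipe_left"
}

Each centroid would in practice be the mean feature vector of many recorded examples of that gesture, which is also what allows gestures to remain user-independent.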

We suffered a setback in December and January due to a delay in the delivery of new components, such as flex sensors and the MPU 9150, which would provide richer, higher-confidence sensor readings. We also experienced a malfunction that rendered one iteration of the glove defunct. During this period, we interfaced the glove with traditional video games such as Pong and Super Mario, developed a preliminary pipeline for online gesture recognition, and made extensive use of the PCB printing machine to reduce the size and weight of the module.

Currently, the prototype must be tethered to a power source, communicates over line-of-sight radio frequencies, and uses an MPU 6050 to collect acceleration and angular-velocity readings. Ideally, a battery module should be incorporated into the glove. A WiFi module should be used so that transmission is no longer restricted to line of sight and data can be sent to a central server, a capability that would be immensely useful when adapting the glove for home automation. An upgrade to the MPU 9150 would lend greater reliability to the sensor data.


Role of Sandbox

We used the 3D printer to make the outer case for a compact assembly. The consumables provided in Sandbox were used to build the circuits. The CNC PCB milling machine was used to make the PCBs for the third and fourth iterations, and the CNC laser cutter was used to make parts of the chassis of the robot vehicle. Oscilloscopes were used to check various aspects of the circuitry. The mechanical workshop in Sandbox is a priceless addition to the facility, with all the tools needed to manufacture mechanical components.

Apart from the equipment, Sandbox gives us an ideal working space in which to execute and implement our ideas, and an environment that helps us focus on our work. The financial aid Sandbox provides for ordering project components is indispensable to the project. 3D printing makes assembly easy, rapid prototyping has become straightforward, and PCBs can be produced in a matter of hours at low cost.


Name                 ID              Role
Aakanksh Zarapkar    2016A7PS096G    Gesture Recognition
Mihir Kulkarni       2016A4PS150G    Electronics and Assembly
Utkarsh Mahajan      2016A4PS299G    Design and Manufacturing
Anish Bhobe          2016A7PS0030G   Game Development and VR Interfacing