Sixth Sense Technology

Last Updated: 22 May, 2020

Steve Mann, who built a wearable computing device in 1990, is known as the “Father of Sixth Sense Technology.” Integration between humans and robots has been advancing through various technologies, and controlling robotic vehicles with gestures is a prominent way of enhancing it. Hand gestures improve human-robot interaction by making it independent of conventional input devices. A camera built into the system captures real-time video of the gestures. The approach combines technologies such as hand-gesture recognition and image capture, processing, and manipulation. Color markers worn on the fingers make the gesture movements easy to track, and a Zigbee Series 2 wireless module is generally used to send the gesture commands.


Components

  1. Camera – Key input device; acts as a digital eye and captures the user's gestures.
  2. Projector – Key output device; projects digital information from the device onto a surface or wall.
  3. Mobile computing device – The brain of the setup; a web-enabled device pre-programmed to complete the task.
  4. Mirror – Reflects the projection to the desired surface.
  5. Microphone – An optional component, required when a sheet of paper is used as an interactive surface.
  6. Coloured markers – Worn on the fingertips; their movements are tracked and interpreted as gestures, for example to click or to frame a picture.

Working Procedure

  • The Zigbee coordinator is connected to the serial port of the system through a USB Explorer board.
  • An Arduino board is used. The user wears colored tape on the fingers to provide input gestures to the system.
  • The camera captures the real-time video at a fixed frame rate and resolution which is determined by the hardware of the camera.
  • Color plays an important role in image processing. Each image is composed of an array of M*N pixels, with M rows and N columns of pixels. Each pixel holds a value for each of the primary colors red, green, and blue. Based on threshold values, the RGB colors can be differentiated, and in the color-detection process the required color can be extracted from an image.
  • In order to control the robotic vehicle according to the input gestures given by the user, an Arduino Uno board is used.
  • Serial communication between Arduino and MATLAB is established wirelessly through Zigbee.
  • The RGB image is converted into a grayscale image.
  • From the RGB image the required color (here, red) is extracted by image subtraction: a red, green, or blue object is detected by subtracting the grayscale image from the corresponding color channel. The gray region in the image obtained after subtraction then needs to be converted to a binary image to find the region of the detected object.
  • The conversion to binary is required to find the properties of the monochromatic image.
  • For the user to control the vehicle, it is necessary to determine a point whose coordinates can be sent to the system. With these coordinates, the system can perform the robotic movements.
  • The centroid is calculated for the detected region. The output is a matrix consisting of the X and Y coordinates of the centroid, which are transmitted to the Zigbee coordinator via the COM port of the system.
  • Zigbee routers present in the wireless network receive the data from the coordinator and transmit it to the Arduino.
  • The Arduino transmits commands to the robotic vehicle, and the appropriate movements, such as forward, reverse, and turning left or right, are executed.
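The detection steps above (grayscale conversion, channel subtraction, binary thresholding, centroid of the detected region) can be sketched in Python with NumPy. The article's implementation uses MATLAB, so the function name, threshold value, and the synthetic test frame below are illustrative assumptions, not the original code:

```python
import numpy as np

def detect_red_centroid(rgb, threshold=60):
    """Detect a red marker and return its (row, col) centroid, or None.

    Mirrors the procedure above: grayscale conversion, channel
    subtraction, binary thresholding, then centroid of the region.
    """
    rgb = rgb.astype(float)
    # 1. Convert the true-color RGB image to grayscale (luminosity weights).
    gray = 0.2989 * rgb[..., 0] + 0.5870 * rgb[..., 1] + 0.1140 * rgb[..., 2]
    # 2. Subtract the grayscale image from the red channel: red regions
    #    stand out, while neutral (gray/white) regions cancel out.
    red_only = rgb[..., 0] - gray
    # 3. Threshold to a binary (monochromatic) image.
    binary = red_only > threshold
    if not binary.any():
        return None
    # 4. Centroid = mean coordinates of the detected pixels.
    rows, cols = np.nonzero(binary)
    return float(rows.mean()), float(cols.mean())

# Illustrative 100x100 frame with a pure-red square marker at rows/cols 40-59.
frame = np.zeros((100, 100, 3), dtype=np.uint8)
frame[40:60, 40:60, 0] = 255
print(detect_red_centroid(frame))  # (49.5, 49.5)
```

A real system would feed live camera frames (e.g. via OpenCV's `VideoCapture`) into this function at the camera's fixed frame rate instead of a synthetic array.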
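The last steps, turning the centroid into a movement command for the vehicle, can be sketched as follows. The quadrant thresholds and the single-character command codes (`F`, `B`, `L`, `R`, `S`) are assumptions for illustration, not the article's actual protocol; in a real setup the returned byte would be written to the Zigbee coordinator's COM port (e.g. with pyserial) and relayed through a router to the Arduino:

```python
def centroid_to_command(centroid, width=100, height=100, dead_zone=10):
    """Map a marker centroid to a robot movement command.

    Moving the hand up/down/left/right relative to the frame centre
    yields forward/reverse/left/right; near the centre the vehicle
    stops. Command letters are illustrative placeholders.
    """
    if centroid is None:
        return "S"  # no marker detected: stop
    row, col = centroid
    dr, dc = row - height / 2, col - width / 2
    if abs(dr) < dead_zone and abs(dc) < dead_zone:
        return "S"  # inside the dead zone: stop
    # The dominant axis of displacement decides the command.
    if abs(dr) >= abs(dc):
        return "F" if dr < 0 else "B"  # up = forward, down = reverse
    return "L" if dc < 0 else "R"      # left / right turn

print(centroid_to_command((20.0, 50.0)))  # "F": marker in the upper half
print(centroid_to_command((50.0, 90.0)))  # "R": marker to the right
```

On the Arduino side, a small sketch would read one byte from its serial port and drive the motor pins accordingly.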


Applications

  1. Viewing a map (calling up a map on any surface)
  2. Checking the time
  3. Creating a multimedia reading experience
  4. Taking pictures
  5. Drawing applications (zooming in and out using hand movements)
  6. Making calls and interacting with physical objects
  7. Getting flight updates and product information


Advantages

  1. Portable
  2. Supports multi-touch and multi-user interaction
  3. Cost-effective: a device costs up to about $300
  4. Data from machines can be accessed directly in real time
  5. Ideas can be mind-mapped anywhere
  6. Open-source software


Disadvantages

  1. Hardware limitations of the devices that we currently carry with us
  2. For example, many phones will not permit the external camera feed to be manipulated in real time
  3. Some processing can therefore only happen after capture (post-processing), not live

Conclusion & Future Enhancements

The technology uses a wearable gestural interface that superimposes digital information onto the material world.
Alongside the five natural senses of sight, hearing, smell, taste, and touch, this technology acts as an extra, sixth sensory channel for interacting with the physical world. Sixth Sense technology will remove the burden of carrying laptops and other heavy devices. Bridging the physical world and the digital world is a huge milestone for mankind. There is scope for further enhancement of this technology:

  1. To get rid of the color markers
  2. To embed the camera and projector inside mobile computing devices
  3. To have 3D gesture tracking
  4. To make Sixth Sense work as a fifth sense for disabled people