Profession, Research, Case-Study
July 28, 2023
N/A
Project Research
Status: Submitted

SERVO DRIVEN FINGER GESTURE CONTROL BASED ON COMPUTER VISION

Based on computer vision and Arduino board control, the program runs automatically after execution. It first decides whether a palm is detected; if it is, the actor can go ahead and show a gesture using the fingers only, and the physical palm will mimic the actor's palm. Otherwise the model terminates the detection and reopens the webcam screen. The loop keeps running until we quit it manually. The aim of this project is to show the many wonderful application possibilities of computer vision combined with robotics. The project uses computer vision techniques and some essential Python libraries to process the image of the palm and mimic the movement of the fingers within the degrees of freedom provided to the appendage.
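
As a rough illustration of the loop described above, here is a minimal sketch, assuming OpenCV for the webcam feed and MediaPipe Hands for palm detection; it is not the submitted prototype code, and the window name, confidence thresholds and the 'q' quit key are illustrative choices.

    # Minimal sketch of the described loop; thresholds, window name and quit key are assumptions.
    import cv2
    import mediapipe as mp

    hands = mp.solutions.hands.Hands(max_num_hands=1,
                                     min_detection_confidence=0.7,
                                     min_tracking_confidence=0.7)
    cap = cv2.VideoCapture(0)  # default webcam

    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB input; OpenCV delivers BGR frames.
        result = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if result.multi_hand_landmarks:
            # Palm detected: read the finger landmarks and drive the servos
            # so the physical palm mirrors the shown gesture.
            pass
        # If no palm is found, the loop simply shows the next webcam frame.
        cv2.imshow("Palm Mimic", frame)
        if cv2.waitKey(1) & 0xFF == ord('q'):  # quit manually, as described above
            break

    cap.release()
    cv2.destroyAllWindows()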

 
Introduction of Project
----------------------------------
Computer vision is a technology that is increasingly in the spotlight, and it is
important for everyone involved in the field of technology to understand the
possibilities it presents and its current progress. Progressively more complex and
powerful cameras are being integrated into many devices, and this in turn pushes the
development of more complex vision systems. The possibilities of computer vision have
grown along with the advancement of artificial intelligence. Computer vision is the
automation of the human visual system, so that computers can obtain a high-level
understanding of the environment from digital images and videos.
Robot vision is made up of complex algorithms, cameras and any other hardware that
enables a robot to develop visual perception. In that sense, computer vision is the eye
of a robot. It allows machines to carry out complex visual tasks, such as a robotic arm
or palm programmed to mimic a gesture shown by a human.

Motivation
----------------------------------
In this project we aim to reduce human effort while requiring minimal human
intervention. Today's systems are shrinking in size and cost, so we interfaced the hand
prototype with an Arduino UNO.
Our project provides a safer approach, with the aim of lowering the risk to human life.
This risk includes working in hazardous environments such as mines, radioactive areas
and so on. Inaccessible places such as outer space and extreme-temperature areas can
also be interacted with, with human-like precision.

Problem Statement
----------------------------------
The problem is to develop a system that mimics the fingers of a palm shown to the
webcam. For this mimicking system we need to build the palm model using computer
vision, pyfirmata and MediaPipe. Once the model is ready, the system will mimic the
movement of our palm's fingers in the same way. This model works for simple gestures
and mimics.
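
To extract which fingers are shown, one common approach with MediaPipe's 21 hand landmarks is the tip-versus-joint heuristic sketched below; this is an assumed illustration of the idea, not the authors' exact rule, and the simple thumb test ignores handedness.

    # Hedged sketch: judge each finger as raised or folded from MediaPipe hand landmarks.
    TIP_IDS = [4, 8, 12, 16, 20]   # thumb, index, middle, ring, pinky fingertips
    PIP_IDS = [3, 6, 10, 14, 18]   # joint below each fingertip

    def finger_states(hand_landmarks):
        """Return five booleans, True where a finger appears to be raised."""
        lm = hand_landmarks.landmark
        states = []
        # Thumb: compare x of tip and IP joint (crude; assumes one hand orientation).
        states.append(lm[TIP_IDS[0]].x < lm[PIP_IDS[0]].x)
        # Other fingers: tip above the PIP joint (smaller y is higher in the image).
        for tip, pip in zip(TIP_IDS[1:], PIP_IDS[1:]):
            states.append(lm[tip].y < lm[pip].y)
        return states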

Objectives
----------------------------------
• To detect the palm using computer vision.
• To mimic the respective gesture of the fingers with the servos (a pyfirmata sketch follows this list).
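
For the mimicking objective, the servo side can be driven through pyfirmata roughly as in the sketch below; the serial port, the PWM pin numbers and the 0°/180° angle limits are assumptions for illustration, not the project's actual wiring.

    # Hedged sketch of the pyfirmata servo side; port, pins and angle limits are assumed.
    from pyfirmata import Arduino, SERVO

    board = Arduino("/dev/ttyUSB0")        # serial port of the Arduino UNO (assumed)
    SERVO_PINS = [3, 5, 6, 9, 10]          # one PWM pin per finger (assumed mapping)
    for pin in SERVO_PINS:
        board.digital[pin].mode = SERVO

    def mimic(finger_states):
        """finger_states: five booleans, True when a finger is raised."""
        for pin, raised in zip(SERVO_PINS, finger_states):
            # Open finger -> 180 degrees, folded finger -> 0 degrees (assumed limits).
            board.digital[pin].write(180 if raised else 0)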

Scope and Applications
----------------------------------
This finger-mimicking system can be applied in many fields such as medicine, physics,
chemistry and others, where a robotic method can replace humans in hazardous and
harmful zones.

Conclusion & Result
----------------------------------

After the model was executed, we were able to achieve an efficiency above 90%. Despite signal conversion, signal transformation time, signal transfer time, webcam video quality, the distance of the palm from the webcam and other miscellaneous issues, we were satisfied with the accuracy of our result. The finger model performs with a 12° to 20° margin between its mimicked response and the human finger's actual deflection over the finger's full degree of freedom. While operating the model in real time, we repeatedly measured an average mimicking response-time difference of 0.58 seconds.
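
A response-time figure such as the 0.58-second average above could be logged per frame with a small wrapper along the lines of this hedged sketch, where detect() and mimic() are hypothetical stand-ins for the project's own detection and servo routines.

    # Hedged sketch: log the delay between grabbing a frame and issuing the servo command.
    import time

    latencies = []

    def timed_mimic(frame, detect, mimic):
        start = time.perf_counter()
        states = detect(frame)             # landmark extraction (e.g. MediaPipe)
        if states is not None:
            mimic(states)                  # servo commands over pyfirmata
            latencies.append(time.perf_counter() - start)

    # After a run, the average mimicking response time is:
    # sum(latencies) / len(latencies)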

Authors

  • Er. Mahesh Bogati (Team-Work)

    Tribhuvan University

    076bei018@ioepc.edu.np

    Personal-Website

  • Er. Sangram Bhandari (Team-Work)

    Tribhuvan University

    076bei030@ioepc.edu.np

  • Er. Sarthak Chaudary (Team-Work)

    Tribhuvan University

    076bei034@ioepc.edu.np

Related Important Files

1. SERVO DRIVEN FINGER GESTURE CONTROL BASED ON COMPUTER VISION.pdf

About file: Submitted project report (PDF).

2. Hand Prototype Code.py

About file: Executed prototype Python program.

Technologies & Tags

Python, Pyfirmata, Computer Vision, Arduino, Electronics
Electronics Communication, Robotics, Computer Vision, Signal Analysis

Code Repository

servo-driven-finger-gesture-control-based-on-computer-vision.github

Gallery

All_Fingers_Respective_Servo_Mapping_Design.jpeg
Figure: Electronic circuit board design for all five fingers and their respective servos
Demo_Implemented_Model
Figure: Executed demo view of the model
Project_working_Diagram
Figure: Working diagram of the finger model
