Control Mouse with Hand Gesture Detection in Python


Introduction

Gesture Controlled Virtual Mouse simplifies human-computer interaction through hand gestures and voice commands, requiring almost no direct contact with the computer. All I/O operations can be controlled virtually using static and dynamic hand gestures together with a voice assistant. The project uses state-of-the-art machine learning and computer vision techniques to recognize hand gestures and voice commands, and it runs smoothly without any additional hardware. It leverages models such as the CNN implemented by MediaPipe, running on top of pybind11. It consists of two modules: one works directly on the hands using MediaPipe hand detection, and the other uses gloves of any uniform color. Currently, the project works on the Windows platform.

Video Demonstration: link
Note: Use Python version: 3.8.5

Gesture Recognition:

Neutral Gesture

Neutral Gesture. Used to halt/stop the execution of the current gesture.

Move Cursor

The cursor is assigned to the midpoint of the index and middle fingertips. This gesture moves the cursor to the desired location. The speed of the cursor movement is proportional to the speed of the hand.
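As a rough sketch of how this mapping can work: MediaPipe reports hand landmarks as normalized coordinates in the range 0.0 to 1.0 (the index and middle fingertips are landmarks 8 and 12), so the midpoint just needs to be scaled to the screen resolution. The function name and the pyautogui pairing below are illustrative assumptions, not taken from the repository.

```python
def midpoint_to_screen(index_tip, middle_tip, screen_w, screen_h):
    """Map the midpoint of two normalized (0.0-1.0) landmarks to screen pixels."""
    mx = (index_tip[0] + middle_tip[0]) / 2
    my = (index_tip[1] + middle_tip[1]) / 2
    # Clamp so a hand near the frame edge never pushes the cursor off-screen.
    x = min(max(int(mx * screen_w), 0), screen_w - 1)
    y = min(max(int(my * screen_h), 0), screen_h - 1)
    return x, y
```

In a live loop, the result would be fed to something like `pyautogui.moveTo(x, y)` each frame; smoothing (e.g. averaging over a few frames) is usually added so the cursor does not jitter with small hand tremors.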

Left Click

Gesture for single left click

Right Click

Gesture for single right-click

Double Click

Gesture for double click

Scrolling

Dynamic Gestures for horizontal and vertical scroll. The speed of the scroll is proportional to the distance moved by the pinch gesture from the start point. Vertical and Horizontal scrolls are controlled by vertical and horizontal pinch movements respectively.
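The "speed proportional to distance" idea can be sketched as a small helper that turns pinch travel into scroll units. The function name and the `sensitivity` parameter are illustrative assumptions; the real project may scale things differently.

```python
def scroll_amount(start_y, current_y, sensitivity=120):
    """Return scroll units proportional to vertical pinch travel.

    start_y / current_y are normalized hand y-coordinates
    (0.0 = top of frame, 1.0 = bottom). Moving the hand up
    gives a positive result (scroll up), down gives negative.
    """
    delta = start_y - current_y
    return round(delta * sensitivity)
```

Each frame, the result could be passed to `pyautogui.scroll(...)`; the same idea with x-coordinates and `pyautogui.hscroll(...)` covers horizontal scrolling.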

Drag and Drop

Gesture for drag and drop functionality. Can be used to move/transfer files from one directory to another.
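Drag and drop is essentially a small state machine: press the button when the pinch closes, hold it while the hand moves, and release when the pinch opens. The class below is a hypothetical sketch (not the repository's implementation); the press/release callables are injected so the logic can be tested without a GUI, and in practice they would be `pyautogui.mouseDown` and `pyautogui.mouseUp`.

```python
class DragController:
    """Hold the mouse button while a pinch gesture stays closed."""

    def __init__(self, press, release):
        self.press, self.release = press, release
        self.dragging = False

    def update(self, pinch_closed):
        """Call once per frame with the current pinch state."""
        if pinch_closed and not self.dragging:
            self.press()        # pinch just closed: start the drag
            self.dragging = True
        elif not pinch_closed and self.dragging:
            self.release()      # pinch opened: drop at the cursor position
            self.dragging = False
```

Cursor movement continues independently between `update` calls, which is what lets a grabbed file be carried from one directory to another.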

Multiple Item Selection

Gesture to select multiple items

Volume Control

Dynamic Gestures for Volume control. The rate of increase/decrease of volume is proportional to the distance moved by the pinch gesture from the start point.
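The proportional adjustment can be sketched as a clamped linear mapping from pinch travel to a 0-100 level. The function name and `gain` value are illustrative assumptions; on Windows the resulting level would typically be handed to an audio API such as pycaw, and the same mapping works for the brightness gesture described below.

```python
def adjust_level(current, pinch_delta, gain=100):
    """Nudge a 0-100 control level by an amount proportional to pinch travel.

    pinch_delta is the normalized distance the pinch has moved from its
    start point (positive = increase, negative = decrease).
    """
    return max(0, min(100, round(current + pinch_delta * gain)))
```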

Brightness Control

Dynamic Gestures for Brightness control. The rate of increase/decrease of brightness is proportional to the distance moved by the pinch gesture from the start point.

Getting Started


The following are the requirements needed to run this gesture-controlled virtual mouse on your system.

Pre-requisites

Python: (3.6 – 3.8.5)
Anaconda Distribution: To download click here.

Procedure

git clone https://github.com/xenon-19/Gesture-Controlled-Virtual-Mouse.git

For detailed information about cloning visit here.

Step 1:

conda create --name gest python=3.8.5

Step 2:

conda activate gest

Step 3:

pip install -r requirements.txt

Step 4:

conda install PyAudio
conda install pywin32

Step 5:

cd into the cloned repository, down to the src folder.

The command may look like this: cd C:\Users\.....\Gesture-Controlled-Virtual-Mouse\src

Step 6:

For running Voice Assistant:

python Proton.py

( You can enable Gesture Recognition by using the command “Proton Launch Gesture Recognition” )

Or to run only Gesture Recognition without the voice assistant:

Uncomment the last two lines of code in the file Gesture_Controller.py, then run:

python Gesture_Controller.py


Author: Harry
