- Meet Variya - https://github.com/meetvariya
- Many disabled people find it difficult to type on a physical keyboard; also, in a COVID-like situation, many people prefer not to touch shared surfaces and devices.
- A mouse or keyboard that does not work properly, especially on a laptop, is another common difficulty.
- We built a gesture-based keyboard interface driven by a webcam, so that one can type in a new, interactive way. It works by tracking hand gestures and eye movements to operate an on-screen (GUI) keyboard and the mouse, as sketched after this list.
- Developed a Python application with a Tkinter GUI, packaged as an .exe setup, that fully controls mouse and keyboard actions through human hand and eye gestures.
- It works with the help of libraries and packages such as OpenCV, MediaPipe, cvzone, dlib, Haar cascades, etc.
- Voice and basic storage features are also included. This is helpful in circumstances such as COVID-19.
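As a rough illustration of that flow (a sketch under stated assumptions, not the project's exact code), the loop below moves the OS cursor with the index fingertip (hand landmark 8) reported by cvzone's HandTrackingModule. The camera resolution, detection confidence, and the `(hands, img)` return signature of `findHands()` (cvzone 1.5+) are assumptions here.

```python
# Rough sketch (assumptions, not the project's exact code): move the OS cursor
# with the index fingertip (hand landmark 8) found by cvzone's HandTrackingModule.
# Assumes cvzone >= 1.5, where findHands() returns (hands, img).
import cv2
import numpy as np
import pyautogui
from cvzone.HandTrackingModule import HandDetector

CAM_W, CAM_H = 640, 480                         # webcam frame size (assumed)
SCREEN_W, SCREEN_H = pyautogui.size()           # actual desktop resolution

cap = cv2.VideoCapture(0)
cap.set(3, CAM_W)
cap.set(4, CAM_H)
detector = HandDetector(detectionCon=0.8, maxHands=1)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    frame = cv2.flip(frame, 1)                  # mirror so movement feels natural
    hands, frame = detector.findHands(frame)    # detect one hand, draw landmarks
    if hands:
        x, y = hands[0]["lmList"][8][:2]        # landmark 8 = index fingertip
        # map webcam coordinates onto the screen and move the cursor there
        sx = np.interp(x, (0, CAM_W), (0, SCREEN_W))
        sy = np.interp(y, (0, CAM_H), (0, SCREEN_H))
        pyautogui.moveTo(sx, sy)
    cv2.imshow("Gesture mouse", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):       # press q to quit
        break

cap.release()
cv2.destroyAllWindows()
```

Click and key-press actions are then layered on top of a loop like this one, using the face and eye landmarks described below.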
The model detects face and hand landmark points as follows (see the sketch after this list):
- The mouth can be accessed through points [48, 68].
- The right eyebrow through points [17, 22].
- The left eyebrow through points [22, 27].
- The right eye using [36, 42].
- The left eye with [42, 48].
- The nose using [27, 35].
- And the jaw via [0, 17].
- 4, 8, 12, 16, 20 - the tips of the five fingers, from the thumb tip to the pinky-finger tip respectively.
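For illustration, the sketch below shows how those eye regions can be sliced out of dlib's 68-point output using imutils' predefined index ranges, which match the point ranges listed above; the helper name and frame handling are assumptions, not code from the repository.

```python
# Minimal sketch (assumed, not the project's exact code): slice the eye regions
# out of dlib's 68-point shape using imutils' predefined index ranges.
import cv2
import dlib
from imutils import face_utils

detector = dlib.get_frontal_face_detector()
predictor = dlib.shape_predictor("shape_predictor_68_face_landmarks.dat")

(L_START, L_END) = face_utils.FACIAL_LANDMARKS_IDXS["left_eye"]    # (42, 48)
(R_START, R_END) = face_utils.FACIAL_LANDMARKS_IDXS["right_eye"]   # (36, 42)

def eye_contours(frame):
    """Return the 6-point left/right eye contours for the first detected face."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = detector(gray, 0)                   # dlib frontal face detector
    if not faces:
        return None
    shape = face_utils.shape_to_np(predictor(gray, faces[0]))  # 68 (x, y) points
    return shape[L_START:L_END], shape[R_START:R_END]
```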
Different packages:
cvzone's HandTrackingModule detects hands. Dlib's shape predictor (shape_predictor_68_face_landmarks.dat) and a Haar cascade classifier (frontal_face_detcetor.xml) detect the face, and imutils fetches the eye landmarks from dlib's 68-point facial detection. The eye aspect ratio (EAR) is used to detect blinking, pynput controls keyboard events, and pyautogui handles mouse events; a sketch of the EAR-based blink check follows the package list below.
- CvZone
- Pynput
- HaarCascade
- PyAutoGui
- Tkinter
- Mediapipe
- auto-py-to-exe
- Dlib
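To tie the pieces together, here is a hedged sketch of the EAR-based blink check mentioned above, reusing the eye contours from the earlier face sketch. The EAR averages two vertical eye distances over the horizontal one; the 0.21 threshold, the 3-frame debounce, and mapping a blink to a left click are illustrative assumptions rather than the project's tuned values.

```python
# Hedged sketch (assumed values, not the repository's tuned ones): average the
# EAR of both eyes each frame and treat a few consecutive low-EAR frames as a
# deliberate blink that triggers a mouse click via pyautogui.
import numpy as np
import pyautogui
from pynput.keyboard import Controller

EAR_THRESHOLD = 0.21        # eyes treated as closed below this value (assumed)
CONSEC_FRAMES = 3           # closed frames required to count as a blink (assumed)

keyboard = Controller()     # pynput controller for synthetic keyboard events
closed_frames = 0

def eye_aspect_ratio(eye):
    """eye: the 6 (x, y) landmark points of one eye, in dlib order p1..p6."""
    a = np.linalg.norm(eye[1] - eye[5])     # vertical distance p2-p6
    b = np.linalg.norm(eye[2] - eye[4])     # vertical distance p3-p5
    c = np.linalg.norm(eye[0] - eye[3])     # horizontal distance p1-p4
    return (a + b) / (2.0 * c)

def on_frame(left_eye, right_eye):
    """Call once per frame with the contours returned by eye_contours() above."""
    global closed_frames
    ear = (eye_aspect_ratio(left_eye) + eye_aspect_ratio(right_eye)) / 2.0
    if ear < EAR_THRESHOLD:
        closed_frames += 1
    elif closed_frames >= CONSEC_FRAMES:
        pyautogui.click()                   # pyautogui fires the mouse event
        # the key currently highlighted on the GUI keyboard could instead be
        # sent with pynput, e.g. keyboard.press('a'); keyboard.release('a')
        closed_frames = 0
    else:
        closed_frames = 0
```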