Real-Time Computer Vision Based Hand Gesture Recognition
DOI:
https://doi.org/10.47392/irjash.2023.S052

Keywords:
Hand gesture recognition, Palm landmark point, Static gestures, Dynamic gestures, MediaPipe-Hands, Key-Point based Classification, Point-History based Classification, LSTM

Abstract
The ability to recognize the shape and movement of hands can improve the user experience across a wide range of technical domains and platforms. For example, it can support sign language understanding and hand-gesture control, and it can enable digital information and content to be overlaid on the physical world in augmented reality. This paper presents a real-time, on-device hand gesture recognition solution that controls a system's graphical user interface (GUI) with static and dynamic hand gestures, which can be trained to perform a set of actions similar to those performed with a mouse and keyboard. The solution is built on MediaPipe-Hands, which detects the palm landmark points. The landmark data are passed through a pipeline of data-preprocessing functions and used to train two models: one for static gesture recognition and one for dynamic gesture recognition. The trained models are then run in real time to detect the corresponding gestures on-device from a video-capturing device such as a webcam.
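As a rough illustration of the landmark-extraction and preprocessing stage described above, the sketch below uses the MediaPipe-Hands Python API to read webcam frames, extract the 21 hand landmark points, and convert them into a wrist-relative, scale-normalized feature vector of the kind a key-point classifier could consume. The `classify_static_gesture` stub and the specific normalization steps are illustrative assumptions, not the authors' exact pipeline.

```python
# Sketch: extract MediaPipe hand landmarks from a webcam and build a
# wrist-relative, scale-normalized feature vector for a static-gesture
# classifier. The classifier itself is left as a stub (assumption).
import cv2
import mediapipe as mp
import numpy as np

mp_hands = mp.solutions.hands


def preprocess_landmarks(hand_landmarks):
    """Convert 21 (x, y) landmarks to coordinates relative to the wrist,
    scaled so the largest absolute value is 1 (assumed normalization)."""
    points = np.array([(lm.x, lm.y) for lm in hand_landmarks.landmark],
                      dtype=np.float32)
    points -= points[0]                      # wrist (landmark 0) as origin
    max_abs = float(np.max(np.abs(points))) or 1.0  # avoid division by zero
    return (points / max_abs).flatten()      # 42-dimensional feature vector


def classify_static_gesture(features):
    """Placeholder for a trained key-point classifier."""
    return "unknown"


cap = cv2.VideoCapture(0)
with mp_hands.Hands(max_num_hands=1,
                    min_detection_confidence=0.7,
                    min_tracking_confidence=0.5) as hands:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB input; OpenCV captures frames in BGR.
        results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.multi_hand_landmarks:
            features = preprocess_landmarks(results.multi_hand_landmarks[0])
            print(classify_static_gesture(features))
        cv2.imshow("hand gesture recognition", frame)
        if cv2.waitKey(1) & 0xFF == 27:      # press Esc to quit
            break
cap.release()
cv2.destroyAllWindows()
```

For dynamic gestures, the same per-frame landmark features would be buffered over a short window and fed to a sequence model such as the LSTM mentioned in the keywords; that model is not shown here.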
License

This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.