Welcome to the GestureFlow project! This project aims to revolutionize interactive gaming by introducing a novel hand gesture control system. By leveraging advanced computer vision techniques, we have developed a system that allows players to control games using natural hand movements, providing a more immersive and engaging gaming experience.
- Introduction
- Literature Survey
- Methodology
- Camera Image Acquisition
- Background and Hand Segmentation
- Hand Detection Framework
- Hand Detection Mechanism
- Development of Game
- Game Environment and Logic
- User Interface and Management
- Mapping Function
- Gesture-Based Directional Control
- Mapping Gestures to Control
- Optimization and Performance Considerations
- Validation and User Feedback
- Results and Discussion
- Conclusion
- References
The gaming industry is at a pivotal point where alternative input methods are being explored to enhance user interaction. GestureFlow aims to improve the immersive gaming experience by providing interfaces that capture hand gestures to control actions within games. This approach integrates computer vision, hand segmentation, and gesture recognition to offer a natural and engaging way to interact with virtual environments.
Our research builds upon several key studies in the field of hand gesture recognition and interactive gaming. Notable contributions include:
- Multi-modal zero-shot dynamic hand gesture recognition using Transformer and other advanced models.
- Real-time hand gesture to text translation with high accuracy.
- Gesture-controlled virtual artboards for communication and education.
We use OpenCV for capturing images and video from cameras, backed by DirectShow on Windows and Video4Linux2 (V4L2) on Linux. These tools provide a robust framework for real-time image and video processing.
Two primary techniques are used:
- Skin Colour Detection: Identifies hand pixels based on skin colour.
- Background Subtraction: Uses a stable background to isolate hand movements.
Google’s MediaPipe framework is employed for real-time hand detection using hand landmark feature points. It provides accurate 2D and 3D hand landmark coordinates for gesture recognition.
- Landmark Identification: Detects key points on the hand such as the index finger tip and wrist.
- Angle Calculation: Calculates angles formed by the fingers to determine gestures.
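The angle calculation reduces to vector geometry on landmark coordinates. A sketch (in MediaPipe's hand model, landmark 0 is the wrist and landmark 8 is the index fingertip; the helper function below is illustrative):

```python
import math

def angle_between(p1, p2, p3):
    """Angle in degrees at vertex p2 formed by points p1-p2-p3,
    e.g. fingertip-knuckle-wrist, from 2D landmark coordinates."""
    v1 = (p1[0] - p2[0], p1[1] - p2[1])
    v2 = (p3[0] - p2[0], p3[1] - p2[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    cos = dot / (math.hypot(*v1) * math.hypot(*v2))
    return math.degrees(math.acos(max(-1.0, min(1.0, cos))))

# Sketch of feeding MediaPipe landmarks into the helper:
# with mp.solutions.hands.Hands(max_num_hands=1) as hands:
#     results = hands.process(rgb_frame)
#     lm = results.multi_hand_landmarks[0].landmark
#     wrist, tip = (lm[0].x, lm[0].y), (lm[8].x, lm[8].y)
```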
Using libraries like MediaPipe, OpenCV, and Pygame, we developed a game in which gestures control the game actions. Pygame handles game events, logic, and graphics.
The game environment is initialized with assets like car images and background. The main game loop handles player input, updates game state, and displays the game.
Messages like “Game Over” are displayed based on game events. The background is continuously updated based on game speed and screen dimensions.
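The update step of such a loop can be sketched as pure game logic, separate from Pygame's rendering calls (all names, dimensions, and speeds below are illustrative, not the project's actual values):

```python
def update_state(state, move, speed=5, car_width=60, screen_width=480):
    """One tick of illustrative game logic: shift the car, scroll the
    background, and flag game over when the car leaves the road."""
    state["car_x"] += {"left": -speed, "right": speed}.get(move, 0)
    # Wrap the background offset so scrolling loops with the screen height.
    state["bg_y"] = (state["bg_y"] + speed) % state["screen_height"]
    if state["car_x"] < 0 or state["car_x"] + car_width > screen_width:
        state["game_over"] = True  # trigger the "Game Over" message
    return state
```

In the real loop this update would run once per frame, followed by blitting the background at `bg_y` and drawing the car sprite at `car_x`.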
A custom mapping function associates each recognized gesture with a specific control command. This function uses data structures like hash maps and arrays for efficient gesture-action mappings.
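A minimal sketch of such a mapping, using a Python dict as the hash map (the gesture labels and command names are hypothetical; the real labels depend on the recognizer's output vocabulary):

```python
# Hypothetical gesture-to-command table; lookups are O(1) on average.
GESTURE_TO_ACTION = {
    "point_left": "MOVE_LEFT",
    "point_right": "MOVE_RIGHT",
    "open_palm": "ACCELERATE",
    "fist": "BRAKE",
}

def map_gesture(gesture, default="IDLE"):
    """Translate a recognized gesture label into a game command,
    falling back to a safe default for unrecognized input."""
    return GESTURE_TO_ACTION.get(gesture, default)
```

Keeping the table as data rather than branching logic makes it easy to remap gestures from user feedback without touching the game loop.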
The system monitors hand movements and maps them to game controls. The car game can also be controlled using a keyboard as an alternative input method.
The mapping method improves the user experience and the intuitiveness of gesture-based interactions within the game.
Optimization techniques like parallel processing and hardware acceleration minimize latency and enhance system performance.
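One common form of such parallelism is running camera capture on its own thread so recognition never blocks acquisition. A sketch using the standard library (the helper names are illustrative; `read_frame` stands in for any frame source, e.g. a wrapper around `cv2.VideoCapture.read`):

```python
import queue
import threading

def capture_loop(read_frame, frames: queue.Queue, stop: threading.Event):
    """Producer thread: pull frames from the camera into a bounded queue.
    `read_frame` is any callable returning a frame, or None to stop."""
    while not stop.is_set():
        frame = read_frame()
        if frame is None:
            break
        try:
            frames.put(frame, timeout=0.1)
        except queue.Full:
            pass  # drop stale frames rather than add latency

def start_capture(read_frame, maxsize=2):
    """Start the producer thread; the consumer calls frames.get()."""
    frames = queue.Queue(maxsize=maxsize)
    stop = threading.Event()
    t = threading.Thread(target=capture_loop, args=(read_frame, frames, stop),
                         daemon=True)
    t.start()
    return frames, stop, t
```

The small `maxsize` bounds end-to-end latency: when recognition falls behind, old frames are dropped instead of queuing up.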
Performance tests and user feedback loops are used to improve the gesture-action mapping scheme based on gameplay scenarios and user interactions.
Preliminary results show that the gesture-based control method is effective and engaging. Users found the system natural and easy to use, though some lags and errors were noted. Future improvements will focus on enhancing system response and accuracy.
GestureFlow represents a significant advancement in integrating hand gesture control systems with interactive gaming. By providing a comprehensive understanding of the technical intricacies involved, this research contributes to the advancement of gesture-based interaction paradigms in gaming.
For a detailed list of references and further reading, please refer to the full research paper included in this repository.
We hope you find GestureFlow to be an exciting and innovative approach to interactive gaming. Thank you for exploring our project!