# Hand Keypoints Dataset
The hand-keypoints dataset contains 26,768 images of hands annotated with keypoints, making it suitable for training models like Ultralytics YOLO for pose estimation tasks. The annotations were generated using the Google MediaPipe library, ensuring high accuracy and consistency, and the dataset is compatible with Ultralytics YOLO26 formats.
<p align="center"> <iframe loading="lazy" width="720" height="405" src="https://www.youtube.com/embed/fd6u1TW_AGY" title="YouTube video player" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" allowfullscreen> </iframe><strong>Watch:</strong> Hand Keypoints Estimation with Ultralytics YOLO26 | Human Hand Pose Estimation Tutorial
</p>

## Keypoints

Each hand is annotated with a total of 21 keypoints, following the standard hand-landmark layout: one point for the wrist and four joints for each of the five fingers (thumb, index, middle, ring, and pinky).
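The 21 indices below follow the Google MediaPipe hand-landmark convention used to generate the annotations; the exact ordering in a given label file should be confirmed against the dataset YAML.

```python
# The 21 hand keypoints in MediaPipe hand-landmark order: index 0 is the
# wrist, and each finger contributes four joints, listed base to tip.
HAND_KEYPOINTS = [
    "wrist",
    "thumb_cmc", "thumb_mcp", "thumb_ip", "thumb_tip",
    "index_mcp", "index_pip", "index_dip", "index_tip",
    "middle_mcp", "middle_pip", "middle_dip", "middle_tip",
    "ring_mcp", "ring_pip", "ring_dip", "ring_tip",
    "pinky_mcp", "pinky_pip", "pinky_dip", "pinky_tip",
]

print(len(HAND_KEYPOINTS))  # → 21
```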
## Dataset Structure

The hand keypoint dataset is split into two subsets: a training set and a validation set.

## Applications
Hand keypoints can be used for gesture recognition, AR/VR controls, robotic manipulation, and hand movement analysis in healthcare. They can also be applied in animation for motion capture and biometric authentication systems for security. The detailed tracking of finger positions enables precise interaction with virtual objects and touchless control interfaces.
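As a taste of what the keypoints enable, the sketch below implements a simple (and deliberately naive) gesture heuristic: a finger counts as extended when its tip lies farther from the wrist than its PIP joint. The index layout is an assumption based on the MediaPipe hand-landmark ordering, and real applications would use more robust angle-based checks.

```python
import math

# Assumed 21-point layout: index 0 is the wrist, and each finger's joints
# run base to tip (e.g. the index finger occupies indices 5-8).
FINGERS = {"thumb": (2, 4), "index": (6, 8), "middle": (10, 12), "ring": (14, 16), "pinky": (18, 20)}


def extended_fingers(kpts):
    """Return fingers whose tip is farther from the wrist than their PIP joint.

    kpts: list of 21 (x, y) tuples in image coordinates.
    """
    wrist = kpts[0]
    return [
        name
        for name, (pip, tip) in FINGERS.items()
        if math.dist(kpts[tip], wrist) > math.dist(kpts[pip], wrist)
    ]


# Toy example: only the index finger points away from the wrist
kpts = [(0.0, 0.0)] * 21
kpts[6], kpts[8] = (0.0, 30.0), (0.0, 60.0)
print(extended_fingers(kpts))  # → ['index']
```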
## Dataset YAML

A YAML (YAML Ain't Markup Language) file is used to define the dataset configuration. It contains information about the dataset's paths, classes, and other relevant information. In the case of the Hand Keypoints dataset, the [hand-keypoints.yaml](https://github.com/ultralytics/ultralytics/blob/main/ultralytics/cfg/datasets/hand-keypoints.yaml) file is maintained in the Ultralytics repository.
!!! example "ultralytics/cfg/datasets/hand-keypoints.yaml"

    ```yaml
    --8<-- "ultralytics/cfg/datasets/hand-keypoints.yaml"
    ```
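For orientation, a pose dataset YAML of this kind defines the image paths, the keypoint geometry, and the class names. The fragment below is an illustrative sketch only; the path values are assumptions, not the actual file contents.

```yaml
# Illustrative sketch of a pose dataset YAML (values are assumptions)
path: ../datasets/hand-keypoints  # dataset root directory
train: images/train               # training images, relative to path
val: images/val                   # validation images, relative to path

# Keypoint geometry: 21 points, each stored as (x, y, visibility)
kpt_shape: [21, 3]

# Classes: a single "hand" class
names:
  0: hand
```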
## Usage

To train a YOLO26n-pose model on the Hand Keypoints dataset for 100 epochs with an image size of 640, you can use the following code snippets. For a comprehensive list of available arguments, refer to the model Training page.
!!! example "Train Example"

    === "Python"

        ```python
        from ultralytics import YOLO

        # Load a pretrained model (recommended for training)
        model = YOLO("yolo26n-pose.pt")

        # Train the model
        results = model.train(data="hand-keypoints.yaml", epochs=100, imgsz=640)
        ```

    === "CLI"

        ```bash
        # Start training from a pretrained *.pt model
        yolo pose train data=hand-keypoints.yaml model=yolo26n-pose.pt epochs=100 imgsz=640
        ```
## Sample Images and Annotations

The Hand Keypoints dataset contains a diverse set of images with human hands annotated with keypoints. Here are some examples of images from the dataset, along with their corresponding annotations:
The example showcases the variety and complexity of the images in the Hand Keypoints dataset and the benefits of using mosaicing during the training process.
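To make the annotation format concrete, the sketch below parses one line of a YOLO-style pose label file. It assumes the standard layout of class index, normalized bounding box, then an (x, y, visibility) triplet per keypoint; verify against your generated labels before relying on it.

```python
def parse_pose_label(line, num_kpts=21):
    """Parse one line of a YOLO pose label file.

    Assumed layout: class cx cy w h, then (x, y, visibility) per keypoint,
    with box and keypoint coordinates normalized to [0, 1].
    """
    values = [float(v) for v in line.split()]
    cls = int(values[0])
    box = values[1:5]  # cx, cy, w, h
    kpts = [tuple(values[5 + 3 * i : 8 + 3 * i]) for i in range(num_kpts)]
    return cls, box, kpts


# Toy example with 2 keypoints for brevity
cls, box, kpts = parse_pose_label("0 0.5 0.5 0.2 0.3 0.40 0.45 2 0.42 0.50 2", num_kpts=2)
print(cls, box, kpts)
```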
## Citations and Acknowledgments

If you use the hand-keypoints dataset in your research or development work, please acknowledge the following sources:
!!! quote ""

    === "Credits"

        We would like to thank the following sources for providing the images used in this dataset:

        - [11k Hands](https://sites.google.com/view/11khands)
        - [2000 Hand Gestures](https://www.kaggle.com/datasets/ritikagiridhar/2000-hand-gestures)
        - [Gesture Recognition](https://www.kaggle.com/datasets/imsparsh/gesture-recognition)

        The images were collected and used under the respective licenses provided by each platform and are distributed under the [Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License](https://creativecommons.org/licenses/by-nc-sa/4.0/).

        We would also like to acknowledge the creator of this dataset, Rion Dsilva, for his great contribution to Vision AI research.
## FAQ

### How do I train a YOLO26 model on the Hand Keypoints dataset?

To train a YOLO26 model on the Hand Keypoints dataset, you can use either Python or the command line interface (CLI). Here's an example for training a YOLO26n-pose model for 100 epochs with an image size of 640:
!!! example

    === "Python"

        ```python
        from ultralytics import YOLO

        # Load a pretrained model (recommended for training)
        model = YOLO("yolo26n-pose.pt")

        # Train the model
        results = model.train(data="hand-keypoints.yaml", epochs=100, imgsz=640)
        ```

    === "CLI"

        ```bash
        # Start training from a pretrained *.pt model
        yolo pose train data=hand-keypoints.yaml model=yolo26n-pose.pt epochs=100 imgsz=640
        ```
For a comprehensive list of available arguments, refer to the model Training page.
### What are the key features of the Hand Keypoints dataset?

The Hand Keypoints dataset is designed for advanced pose estimation tasks: each of its 26,768 images is annotated with 21 keypoints per hand using the Google MediaPipe library, and the annotations are compatible with Ultralytics YOLO formats.
For more details, you can explore the Hand Keypoints Dataset section.
### What applications can benefit from the Hand Keypoints dataset?

The Hand Keypoints dataset can be applied in various fields, including gesture recognition, AR/VR controls, robotic manipulation, hand movement analysis in healthcare, motion capture for animation, and biometric authentication for security.
For more information, refer to the Applications section.
### How is the Hand Keypoints dataset structured?

The Hand Keypoints dataset is divided into two subsets: a training set and a validation set.
This structure ensures a comprehensive training and validation process. For more details, see the Dataset Structure section.
### How do I use the dataset YAML file for training?

The dataset configuration is defined in a YAML file, which includes paths, classes, and other relevant information. The [hand-keypoints.yaml](https://github.com/ultralytics/ultralytics/blob/main/ultralytics/cfg/datasets/hand-keypoints.yaml) file is maintained in the Ultralytics repository.
To use this YAML file for training, specify it in your training script or CLI command as shown in the training example above. For more details, refer to the Dataset YAML section.