Tokenizers
==========

Fast, state-of-the-art tokenizers, optimized for both research and production.
🤗 Tokenizers_ provides an implementation of today's most used tokenizers, with
a focus on performance and versatility. These tokenizers are also used in
🤗 Transformers_.
.. _Tokenizers: https://github.com/huggingface/tokenizers
.. _Transformers: https://github.com/huggingface/transformers
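As a quick illustration of the library in action (a minimal sketch, assuming the ``tokenizers`` Python package is installed; the corpus and token names below are made up for the example), a small BPE tokenizer can be trained from raw text in a few lines:

```python
from tokenizers import Tokenizer
from tokenizers.models import BPE
from tokenizers.pre_tokenizers import Whitespace
from tokenizers.trainers import BpeTrainer

# Build an (untrained) BPE tokenizer with whitespace pre-tokenization.
tokenizer = Tokenizer(BPE(unk_token="[UNK]"))
tokenizer.pre_tokenizer = Whitespace()

# Train on a tiny in-memory corpus; real training would use files or a dataset.
trainer = BpeTrainer(special_tokens=["[UNK]"])
tokenizer.train_from_iterator(["hello world", "hello tokenizers"], trainer=trainer)

# Encode a sentence and inspect the resulting tokens and their ids.
encoding = tokenizer.encode("hello world")
print(encoding.tokens, encoding.ids)
```

The quicktour and pipeline pages below cover each of these steps (normalization, pre-tokenization, model, post-processing) in detail.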
.. toctree::
    :maxdepth: 2
    :caption: Getting Started

    quicktour
    installation/main
    pipeline
    components
.. toctree-tags::
    :maxdepth: 3
    :caption: Using 🤗 Tokenizers
    :glob:

    :python:tutorials/python/*
.. toctree::
    :maxdepth: 3
    :caption: API Reference

    api/reference
.. include:: entities.inc