*(figure: Choose your HanLP version)*
```{hint}
**New to NLP?** Just install the RESTful packages and call {meth}`~hanlp_restful.HanLPClient.parse` without pain.
```
For beginners, the RESTful packages are the easiest way to start; the only requirement is an auth key. We have officially released the following language bindings:

- Python:

  ```bash
  pip install hanlp_restful
  ```

- Java: see the Java instructions.
- Golang: see the Golang instructions.
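For the Python binding, usage boils down to a few lines, following the official README; `auth=None` falls back to a limited anonymous quota, so substitute your own key in production (the call below needs network access):

```python
from hanlp_restful import HanLPClient

# Endpoint and parameters per the official README; auth=None uses
# a limited anonymous quota -- pass your own auth key instead.
HanLP = HanLPClient('https://www.hanlp.com/api', auth=None, language='zh')

doc = HanLP.parse('HanLP为生产环境带来次世代最先进的多语种NLP技术。')
print(doc)
```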
The native package, which runs models locally, can be installed via pip.
```{note}
See [developer guideline](https://hanlp.hankcs.com/docs/contributing.html#development).
```
```bash
pip install hanlp
```
HanLP requires Python 3.6 or later. A GPU or TPU is recommended but not mandatory. Depending on your preference, HanLP offers the following flavors:
```{note}
Installation on Windows is **perfectly** supported. No need to install Microsoft Visual C++ Build Tools anymore.
```
```{note}
HanLP also perfectly supports accelerating on Apple Silicon M1 chips, see [tutorial](https://www.hankcs.com/nlp/hanlp-official-m1-support.html).
```
| Flavor | Description |
|---|---|
| default | This installs the default version which delivers the most commonly used functionalities. However, some heavy dependencies like TensorFlow are not installed. |
| tf | This installs TensorFlow and fastText. |
| amr | To support Abstract Meaning Representation (AMR) models, this installs AMR related dependencies like `penman`. |
| full | For experts who seek to maximize efficiency via TensorFlow and C++ extensions, `pip install hanlp[full]` installs all the above dependencies. |
In short, you don't need to manually install any models. Instead, they are automatically downloaded to a directory called `HANLP_HOME` when you call `hanlp.load`.
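If the default location is unsuitable (e.g., a small home partition), the model directory can be redirected through the `HANLP_HOME` environment variable; the path below is hypothetical:

```python
import os

# Must be set before `hanlp` is imported; models will then be
# downloaded to this directory instead of the default location.
os.environ["HANLP_HOME"] = "/data/hanlp"  # hypothetical path
```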
Occasionally, some errors might occur the first time you load a model, in which case you can refer to the following tips.
If the auto-download of a HanLP model fails, you can manually download the model and extract the zip file to a particular path.

If the auto-download of a Hugging Face 🤗 Transformers model fails, e.g., the following exception is thrown:
```
  lib/python3.8/site-packages/transformers/file_utils.py", line 2102, in get_from_cache
    raise ValueError(
ValueError: Connection error, and we cannot find the requested files in the cached
path. Please try again or make sure your Internet connection is on.
```
You can either:

- Retry, as the Internet is quite unstable in some regions (e.g., China).
- Force Hugging Face 🤗 Transformers to use cached models instead of checking for updates, provided you have successfully loaded the model before, by setting the following environment variable:
```bash
export TRANSFORMERS_OFFLINE=1
```
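If exporting a shell variable is inconvenient (e.g., inside a notebook), the same switch can be flipped from Python, as long as it happens before `transformers` is imported:

```python
import os

# "1" tells Hugging Face Transformers to use only locally cached
# files and never hit the network; set it before importing transformers.
os.environ["TRANSFORMERS_OFFLINE"] = "1"
```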
If your server has no Internet access at all, debug your code on your local PC first, then copy the following directories to your server via a USB disk or similar:
- `~/.hanlp`: the home directory for HanLP models.
- `~/.cache/huggingface`: the home directory for Hugging Face 🤗 Transformers.

Some TensorFlow/fastText models will ask you to install the missing TensorFlow/fastText modules, in which case you'll need to install the full version:
```bash
pip install hanlp[full]
```
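The directory copy described above can be sketched with Python's standard library; the function name and destination path are hypothetical, while the cache locations are the defaults mentioned earlier:

```python
import shutil
from pathlib import Path

def copy_model_caches(caches, dest_root):
    """Copy each existing cache directory into dest_root (e.g. a mounted USB disk)."""
    for cache in map(Path, caches):
        if cache.exists():
            # dirs_exist_ok=True (Python 3.8+) lets a partial copy be resumed.
            shutil.copytree(cache, Path(dest_root) / cache.name, dirs_exist_ok=True)

# Default locations mentioned above; "/mnt/usb" is a hypothetical mount point:
# copy_model_caches([Path.home() / ".hanlp",
#                    Path.home() / ".cache" / "huggingface"], "/mnt/usb")
```

On the server, copy the directories back to the same locations under the home directory.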
NEVER install third-party packages (TensorFlow, fastText, etc.) yourself, as higher or lower versions of these packages have not been tested and might not work properly.