Install RESTful Packages¶
For beginners, the RESTful packages are recommended because they are the easiest to start with: the only requirement is an auth key. We officially release the following language bindings:
pip install hanlp_restful
Install Native Package¶
The native package running locally can be installed via pip.
pip install hanlp
HanLP requires Python 3.6 or later. A GPU or TPU is recommended but not mandatory. Depending on your preference, HanLP offers the following flavors:
This installs the default version, which delivers the most commonly used functionalities; heavy dependencies such as TensorFlow are left out. Experts who seek to maximize efficiency via TensorFlow and C++ extensions can install the full version (pip install hanlp[full]) instead.
In short, you don’t need to manually install any model. Instead, models are automatically downloaded to a directory called HANLP_HOME when you call hanlp.load(...).
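The download location can be redirected by setting the HANLP_HOME environment variable before importing hanlp. A sketch of how the directory resolves — the helper function is ours for illustration, not part of the HanLP API:

```python
import os
from pathlib import Path

def resolve_hanlp_home() -> Path:
    # Illustrative helper: HANLP_HOME wins if set; otherwise models
    # land in ~/.hanlp, the documented default.
    return Path(os.environ.get('HANLP_HOME', str(Path.home() / '.hanlp')))

print(resolve_hanlp_home())
```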
Occasionally, errors might occur the first time you load a model; in that case, refer to the following tips.
If the auto-download fails, you can:

- Retry, as our file server might be busy serving users from all over the world.
- Follow the message on your terminal, which often guides you to manually download a zip file to a particular path.
- Use a mirror site, which could be faster and more stable in your region.
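When you follow the terminal message and fetch the zip yourself, any HTTP client works. A stdlib-only sketch — the URL and destination are placeholders standing in for whatever the error message prints:

```python
import urllib.request
from pathlib import Path

def fetch_model_zip(url: str, dest: str) -> Path:
    # Download the zip named by the error message to the exact path it asks for.
    target = Path(dest).expanduser()
    target.parent.mkdir(parents=True, exist_ok=True)
    urllib.request.urlretrieve(url, target)
    return target

# Example with placeholder values copied from a hypothetical error message:
# fetch_model_zip('https://<mirror>/model.zip', '~/.hanlp/mtl/model.zip')
```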
Server without Internet¶
If your server has no Internet access at all, debug your code on your local PC first, then copy the following directories to your server via a USB disk:
- ~/.hanlp: the home directory for HanLP models.
- ~/.cache/huggingface: the home directory for Hugging Face 🤗 Transformers.
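The copy itself is plain directory duplication. A hedged stdlib sketch — the function name and USB mount point are illustrative, not part of any HanLP tooling:

```python
import shutil
from pathlib import Path

def export_model_dirs(usb_root: str, home: Path = Path.home()) -> None:
    # Copy the two cache directories listed above onto the USB disk,
    # preserving their relative layout so they can be restored under $HOME.
    for rel in ('.hanlp', '.cache/huggingface'):
        src = home / rel
        if src.exists():
            shutil.copytree(src, Path(usb_root) / rel, dirs_exist_ok=True)
```

On the server, copy the directories back to the same relative paths under the home directory.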
Some TensorFlow/fastText models will ask you to install the missing TensorFlow/fastText modules, in which case you’ll need to install the full version:
pip install hanlp[full]
DO NOT install TensorFlow/fastText yourself, as higher or lower versions of TensorFlow have not been tested and might not work properly.