Install
Install RESTful Packages
The RESTful packages are recommended for beginners since they are easier to start with. The only requirement is an auth key. We officially release the following language bindings:
Python
pip install hanlp_restful
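Once installed, the service can be called with your auth key. The sketch below assumes the HanLPClient interface shipped with hanlp_restful; the auth key is a placeholder you obtain from the HanLP website.

```python
# A minimal sketch of calling the RESTful service; replace the placeholder
# auth key with the one issued to you.
from hanlp_restful import HanLPClient

HanLP = HanLPClient('https://www.hanlp.com/api', auth='YOUR_AUTH_KEY', language='zh')
print(HanLP.parse('商品和服务'))
```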
Java
See Java instructions.
Golang
See Golang instructions.
Install Native Package
The native package, which runs locally, can be installed via pip:
pip install hanlp
HanLP requires Python 3.6 or later. A GPU/TPU is suggested but not mandatory. Depending on your preference, HanLP offers the following flavors:
| Flavor | Description |
|---|---|
| default | This installs the default version, which delivers the most commonly used functionalities. However, some heavy dependencies like TensorFlow are not installed. |
| tf | This installs TensorFlow and fastText. |
| amr | To support Abstract Meaning Representation (AMR) models, this installs AMR-related dependencies. |
| full | For experts who seek to maximize efficiency via TensorFlow and C++ extensions, this installs the full set of optional dependencies. |
Install Models
In short, you don’t need to manually install any model. Instead, models are automatically downloaded to a directory called HANLP_HOME when you call hanlp.load.
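For instance, the first call below downloads the model into HANLP_HOME (which defaults to ~/.hanlp, see below) and later calls reuse the cached copy; the pretrained identifier is only one example picked for illustration.

```python
import hanlp

# The first load triggers an automatic download into HANLP_HOME (~/.hanlp by default);
# subsequent loads reuse the cached files. The identifier below is just one example.
tok = hanlp.load(hanlp.pretrained.tok.COARSE_ELECTRA_SMALL_ZH)
print(tok('商品和服务'))
```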
Occasionally, some errors might occur the first time you load a model, in which case you can refer to the following tips.
Download Error
HanLP Models
If the auto-download of a HanLP model fails, you can either:

- Retry, as our file server might be busy serving users from all over the world.
- Follow the message on your terminal, which often guides you to manually download a zip file to a particular path.
- Use a mirror site, which could be faster and more stable in your region.
Hugging Face 🤗 Transformers Models
If the auto-download of a Hugging Face 🤗 Transformers model fails, e.g., the following exception is thrown:
lib/python3.8/site-packages/transformers/file_utils.py", line 2102, in get_from_cache
raise ValueError(
ValueError: Connection error, and we cannot find the requested files in the cached
path. Please try again or make sure your Internet connection is on.
You can either:

- Retry, as the Internet is quite unstable in some regions (e.g., China).
- Force Hugging Face 🤗 Transformers to use cached models instead of checking for updates from the Internet (provided you have successfully loaded the model before) by setting the following environment variable:
export TRANSFORMERS_OFFLINE=1
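If editing the shell environment is inconvenient, the same effect can be obtained from Python, provided the variable is set before transformers (which hanlp imports) is loaded; a minimal sketch:

```python
import os

# Equivalent to `export TRANSFORMERS_OFFLINE=1`; must run before transformers
# (or hanlp, which imports it) is imported for the first time.
os.environ['TRANSFORMERS_OFFLINE'] = '1'

import hanlp  # imported only after the variable is set
```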
Server without Internet
If your server has no Internet access at all, debug your code on your local PC first, then copy the following directories to your server via a USB disk or similar:

- ~/.hanlp: the home directory for HanLP models.
- ~/.cache/huggingface: the home directory for Hugging Face 🤗 Transformers.
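If you copy these directories somewhere other than their default locations, a sketch like the following can point the libraries at them before anything is loaded; it assumes HANLP_HOME and HF_HOME are read from the environment, and the paths are placeholders.

```python
import os

# Placeholder paths: wherever you copied the two directories on the server.
os.environ['HANLP_HOME'] = '/data/hanlp'       # copied from ~/.hanlp
os.environ['HF_HOME'] = '/data/huggingface'    # copied from ~/.cache/huggingface

import hanlp  # import only after the environment is configured
```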
Import Error
Some TensorFlow/fastText models will ask you to install the missing TensorFlow/fastText modules, in which case you’ll need to install the full version:
pip install hanlp[full]
Danger
NEVER install third-party packages (TensorFlow, fastText, etc.) by yourself, as higher or lower versions of third-party packages have not been tested and might not work properly.