amr2text

AMR captures “who is doing what to whom” in a sentence. Each sentence is represented as a rooted, directed, acyclic graph whose edges are labeled with relations and whose leaf nodes are labeled with concepts. The goal of AMR-to-Text Generation is to recover a fluent sentence that realizes a given AMR, making the task the inverse of the structured prediction performed in AMR parsing. Before loading an AMR model, make sure to install HanLP with the amr dependencies:

pip install hanlp[amr] -U
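
Before moving on to generation, it can help to see the graph structure concretely. The sketch below uses the standalone penman library (an assumption of this example only: penman is a common AMR toolkit, installable via pip install penman, and nothing else in this section depends on it) to decode a small AMR into its root variable and its (source, role, target) triples:

import penman

# A tiny AMR for "The boy wants to sleep" in Penman notation.
amr = '''
(w / want-01
   :ARG0 (b / boy)
   :ARG1 (s / sleep-01
            :ARG0 b))
'''

g = penman.decode(amr)
print(g.top)      # w -- the root variable of the graph
print(g.triples)  # concept and relation triples, e.g. ('w', ':ARG0', 'b')

Note how the re-entrant variable b makes the structure a graph rather than a tree: the boy is both the wanter and the sleeper.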

To generate a sentence given an AMR:

import hanlp

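# Load the pre-trained AMR-to-Text generator; the identifier is documented below.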
generation = hanlp.load(hanlp.pretrained.amr2text.AMR3_GRAPH_PRETRAIN_GENERATION)
print(generation('''
(z0 / want-01
    :ARG0 (z1 / boy)
    :ARG1 (z2 / believe-01
              :ARG0 (z3 / girl)
              :ARG1 z1))
'''))
The boy wants the girl to believe him.
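
The loaded component is an ordinary callable, so several graphs can be realized by invoking it once per AMR string. The loop below is a minimal sketch that reuses exactly the call shown above; the two AMR strings are made up for illustration:

# Realize multiple AMRs by repeating the single-graph call.
graphs = [
    '(z0 / sleep-01 :ARG0 (z1 / boy))',
    '(z0 / believe-01 :ARG0 (z1 / girl) :ARG1 (z2 / win-01 :ARG0 z1))',
]
for amr in graphs:
    print(generation(amr))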

All the pre-trained generators and their scores are listed below.

hanlp.pretrained.amr2text.AMR3_GRAPH_PRETRAIN_GENERATION = 'https://file.hankcs.com/hanlp/amr2text/amr3_graph_pretrain_generation_20221207_153535.zip'

A seq2seq AMR-to-Text generator (Bevilacqua et al. 2021) built on BART-large (Lewis et al. 2020) and trained on Abstract Meaning Representation 3.0 (Knight et al. 2014) with graph pre-training (Bai et al. 2022). Its SacreBLEU is 50.38 according to the official repository of Bai et al. (2022).