What are Abstract Meaning Representation graphs?

Joseph Gatto
Published in Analytics Vidhya · 4 min read · Dec 5, 2021


If you work in NLP, you might come across a graph-based semantic representation known as Abstract Meaning Representation. In this article, I will give a high-level introduction to Abstract Meaning Representation graphs, or AMRs.

It feels like everywhere you read about AMRs, you are told that the graphs encode “who is doing what to whom” in a given sentence [1][2]. That’s because it’s true! An AMR is a graph representation of a piece of text, where the nodes are concepts and the edges denote the relationships between those concepts. Note that the nodes in the graph are not necessarily words from the sentence, but rather high-level abstractions of the concepts within that sentence. Let’s take a look at the example graph below and walk through the exact interpretation of some of its relations, to help you read AMRs more easily in the future.

[Figure: AMR graph for the sentence “Tony told Bella that he could not come to her party”, generated using a SOTA AMR parser.]
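Since the relations below are easier to follow in text form, here is a hand-written PENMAN serialization that is consistent with the graph discussed in this article. Note that this is my own reconstruction, not the parser’s verbatim output: in particular, the :ARG2 (hearer) branch for Bella and the :ARG4 (destination) branch for the party are my assumptions about how “told Bella” and “to her party” would be annotated. We can load it with the penman Python library:

```python
import penman  # pip install penman

# Hand-reconstructed PENMAN string, consistent with the relations
# discussed in this article. A real parser's output may differ in
# details; the :ARG2 and :ARG4 branches are my assumptions.
AMR_STRING = """
(t / tell-01
   :ARG0 (p / person
            :name (n / name :op1 "Tony"))
   :ARG2 (p3 / person
             :name (n2 / name :op1 "Bella"))
   :ARG1 (p2 / possible-01
             :polarity -
             :ARG1 (c / come-01
                      :ARG1 p
                      :ARG4 (p4 / party
                                :poss p3))))
"""

g = penman.decode(AMR_STRING)
print(g.top)               # 't' -- the root variable
for src, role, tgt in g.triples:
    print(src, role, tgt)  # e.g. t :ARG0 p
```

If you want to produce PENMAN strings like this for your own sentences, open-source toolkits such as amrlib ship pretrained sentence-to-graph models that output them directly.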
[Figure: Close-up of the first node in the full graph.]

Looking at the node furthest to the left, we see the concept “t / tell-01”. This notation means that the node has the variable label “t”, which represents the concept “tell-01”. But what is “tell-01”? It is one of the concepts in the AMR vocabulary. Each concept has its own definition, and an extensive list can be found within the AMR dataset. What you need to know is this: AMR concepts have definitions; a single word may map to different sense-numbered concepts based on its use in a given sentence (which is why we have tell-01 and there also exists a tell-02); and these concepts are used to help abstract away from the exact words in a sentence.
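To make the sense numbering concrete, here is a rough sketch of what two framesets for “tell” look like. These are my approximate paraphrases, not the authoritative definitions, which live in the PropBank-style frame files distributed with the AMR dataset:

```python
# Approximate, hand-paraphrased framesets for illustration only.
# The authoritative definitions ship with the AMR dataset.
FRAMES = {
    "tell-01": {  # the "communicate" sense, as in our example sentence
        ":ARG0": "the speaker",
        ":ARG1": "the thing said",
        ":ARG2": "the hearer",
    },
    "tell-02": {  # roughly the "discern" sense, e.g. "I can't tell the difference"
        ":ARG0": "the one discerning",
        ":ARG1": "the thing discerned",
    },
}
```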

In this example, we understand “tell-01” to refer to the word “tell” in the original sentence. Let’s break down the first relation involving “tell-01” in this graph.

  • tell-01 -> :ARG0 -> p/person

For those unfamiliar, A -> :ARG0 -> B means that B is the agent of A, i.e. B is the one doing A (at least, that’s how I think of it). This relation says that there is a person in this sentence who is “telling” something. Note that nowhere in the original sentence do we find the word “person”. The AMR parser has inferred that Tony is a person and that he is the one doing the telling. This is a great example of how AMRs abstract away from low-level details like explicit names and provide something more general like “person”.
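Given the decoded graph g from the earlier sketch, you can pull these agent edges out mechanically:

```python
# Every (source, :ARG0, target) triple marks target as the agent of source.
agents = [(src, tgt) for src, role, tgt in g.triples if role == ":ARG0"]
print(agents)  # [('t', 'p')] -- the person p is the one doing the telling
```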

  • tell-01 -> :ARG1 -> p2/possible-01

A few things to note here. First, A -> :ARG1 -> B means that B is the patient or theme of A, i.e. B is the thing that A acts upon; here, the “possibility” is the thing being told. Second, possible-01 is labeled p2 because the label p is already occupied by person. Now, in order to fully understand this relation, we need to peek ahead to what p2 relates to.

  • p2/possible-01 -> :polarity -> -

Here, we see that possible-01 has a negative polarity modifier, which means it is not possible. The concept “possible-01” carries a negative modifier here because the original sentence uses the language “could not”.
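Negation is easy to detect mechanically for exactly this reason: it is just another edge. Continuing with the decoded graph g from earlier:

```python
# Variables carrying a :polarity - edge denote negated concepts.
negated = {src for src, role, tgt in g.triples
           if role == ":polarity" and tgt == "-"}
print(negated)  # {'p2'} -- possible-01 is negated: "not possible"
```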

  • p2/possible-01 -> :ARG1 -> c/come-01

This reads as “it is not possible to come”, where the “not” comes from the aforementioned negative modifier.

Next, let’s look at a few relations pointing to p/person.

  • p2/possible-01 -> :ARG1 -> c/come-01 -> :ARG1 -> p/person
  • t/tell-01 -> :ARG0 -> p/person

So here we have two different takeaways about the person involved: 1) it is not possible for that person to come, and 2) that same person, who is not able to come, is the one doing the telling. This is pretty cool, right? We are fully capturing the semantics of this sentence with regard to Tony’s actions.
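The fact that both relations point at the same variable p, rather than at two separate copies of “person”, is known as a reentrancy, and it is what lets the graph assert that the teller and the would-be attendee are one and the same. You can spot reentrancies in the decoded graph g by counting how often each variable appears as the target of an edge (note that in my reconstructed graph Bella’s variable is also reentrant, since she is both the hearer and the owner of the party):

```python
from collections import Counter

# Variables are the sources of :instance triples.
variables = {src for src, role, _ in g.triples if role == ":instance"}

# A variable that is the target of more than one edge is reentrant.
in_degree = Counter(tgt for _, role, tgt in g.triples
                    if role != ":instance" and tgt in variables)
reentrant = [v for v, count in in_degree.items() if count > 1]
print(reentrant)  # ['p', 'p3'] -- Tony (teller and comer), Bella (hearer and host)
```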

As an exercise, I encourage you to walk through various relations in the provided graph and explain exactly what they are encoding.

In this article, we talked about how to read an AMR. These graphs are useful for a variety of NLP tasks that require deep semantic understanding of text. For more detailed information, I suggest the AMR guidelines in [1], which provide a low-level breakdown of AMRs.

References

[1] AMR guidelines: https://github.com/amrisi/amr-guidelines/blob/master/amr.md

[2] Abstract Meaning Representation (AMR) Annotation Release 3.0 (LDC2020T02): https://catalog.ldc.upenn.edu/LDC2020T02
