Bidirectional Attentive Memory Networks for Question Answering over Knowledge Bases

Figure 1: Overall architecture of the BAMnet model.
Figure 3: KB-aware attention module. CAT: concatenation, SelfAtt: self-attention, AddAtt: additive attention.
Figure 4: Attention heatmap generated by the reasoning module. Best viewed in color.
Table 3: Predicted answers of BAMnet w/ and w/o bidirectional attention on the WebQuestions test set.
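
The Figure 3 caption names three operations (concatenation, self-attention, additive attention). The sketch below is a minimal, generic illustration of how such operations can be composed over question and KB representations; it is not the exact BAMnet KB-aware attention module, and the module names, dimensions, and wiring used here are illustrative assumptions.

```python
# Generic sketch only: NOT the exact BAMnet KB-aware attention module.
import torch
import torch.nn as nn
import torch.nn.functional as F


class AdditiveAttention(nn.Module):
    """Bahdanau-style additive attention: score(q, k) = v^T tanh(W_q q + W_k k)."""

    def __init__(self, query_dim, key_dim, hidden_dim):
        super().__init__()
        self.w_q = nn.Linear(query_dim, hidden_dim, bias=False)
        self.w_k = nn.Linear(key_dim, hidden_dim, bias=False)
        self.v = nn.Linear(hidden_dim, 1, bias=False)

    def forward(self, query, keys):
        # query: (batch, query_dim); keys: (batch, seq_len, key_dim)
        scores = self.v(torch.tanh(self.w_q(query).unsqueeze(1) + self.w_k(keys)))
        weights = F.softmax(scores.squeeze(-1), dim=-1)              # (batch, seq_len)
        context = torch.bmm(weights.unsqueeze(1), keys).squeeze(1)   # (batch, key_dim)
        return context, weights


class SimpleSelfAttention(nn.Module):
    """Scaled dot-product self-attention over a single sequence."""

    def forward(self, x):
        # x: (batch, seq_len, dim)
        scores = torch.bmm(x, x.transpose(1, 2)) / x.size(-1) ** 0.5
        weights = F.softmax(scores, dim=-1)
        return torch.bmm(weights, x)                                 # (batch, seq_len, dim)


if __name__ == "__main__":
    batch, q_len, kb_len, dim = 2, 6, 10, 32
    question = torch.randn(batch, q_len, dim)    # question token representations (assumed inputs)
    kb_memory = torch.randn(batch, kb_len, dim)  # KB candidate representations (assumed inputs)

    # Self-attend over the question, then summarize it (mean pooling here).
    q_ctx = SimpleSelfAttention()(question).mean(dim=1)                # (batch, dim)

    # Attend over the KB memory using the question summary as the query.
    kb_ctx, attn = AdditiveAttention(dim, dim, dim)(q_ctx, kb_memory)  # (batch, dim)

    # Concatenate the two views into a single KB-aware representation.
    fused = torch.cat([q_ctx, kb_ctx], dim=-1)                         # (batch, 2 * dim)
    print(fused.shape, attn.shape)
```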