Code-to-Sequence (code2seq) Neural Network


A Code-to-Sequence (code2seq) Neural Network is a seq2seq architecture with attention that generates a natural language sequence (such as a method name or a documentation sentence) from a source code snippet, which it represents as a set of paths through the snippet's abstract syntax tree (AST).
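
The snippet is not consumed as raw text: it is first parsed into an abstract syntax tree and decomposed into pairwise paths between the tree's terminals. The following is a minimal sketch of that input representation, not the authors' extraction pipeline (which targets Java and C#): purely for illustration it uses Python's own ast module, treats Name, arg, and Constant nodes as value-carrying terminals, and keeps only node-type names, whereas real code2seq also records each terminal's subtokenized values and randomly samples k paths per example.

    import ast
    import itertools

    # Nodes treated as value-carrying terminals (an illustrative choice).
    TERMINALS = (ast.Name, ast.arg, ast.Constant)

    def path_to_root(node, parents):
        """Return the chain of AST nodes from `node` up to the module root."""
        chain = [node]
        while chain[-1] in parents:
            chain.append(parents[chain[-1]])
        return chain

    def extract_paths(source):
        """Yield terminal-to-terminal AST paths as lists of node-type names."""
        tree = ast.parse(source)
        parents = {child: parent
                   for parent in ast.walk(tree)
                   for child in ast.iter_child_nodes(parent)}
        terminals = [n for n in ast.walk(tree) if isinstance(n, TERMINALS)]
        for a, b in itertools.combinations(terminals, 2):
            up = path_to_root(a, parents)
            down = path_to_root(b, parents)
            # Pop shared ancestors so only the lowest common ancestor remains;
            # the path then runs terminal -> common ancestor -> terminal.
            while len(up) > 1 and len(down) > 1 and up[-2] is down[-2]:
                up.pop()
                down.pop()
            yield [type(n).__name__ for n in up + list(reversed(down))[1:]]

    for path in extract_paths("def contains(xs, t):\n    return t in xs"):
        print(" -> ".join(path))

For the one-line function above, this prints paths such as arg -> arguments -> FunctionDef -> Return -> Compare -> Name, the kind of structural context the encoder consumes.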



References

2020


The code2seq model was demonstrated on the task of method name prediction in Java (on which it performed significantly better than code2vec); on the task of predicting natural language StackOverflow questions given their source code answers (a task first presented by Iyer et al., 2016); and on the task of predicting documentation sentences (JavaDocs) given their Java methods. On all tasks, code2seq was shown to perform much better than strong Neural Machine Translation (NMT) models that treat these tasks as "translating" code, read as plain text, into natural language.

2018

[Figure 3 from "code2seq: Generating Sequences from Structured Representations of Code" (2018)]
Figure 3: Our model encodes each AST path with its values as a vector, and uses the average of all of the $k$ paths as the decoder's start state. The decoder generates an output sequence while attending over the $k$ encoded paths.
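
As a concrete reading of this caption, here is a minimal PyTorch sketch of the scheme for a single example with k paths; it is not the authors' implementation, and the layer sizes, the summed terminal-subtoken embeddings, and the dot-product attention are simplifying assumptions.

    import torch
    import torch.nn as nn

    class Code2SeqSketch(nn.Module):
        def __init__(self, subtoken_vocab, node_vocab, target_vocab,
                     embed_dim=128, hidden_dim=256):
            super().__init__()
            self.subtoken_embed = nn.Embedding(subtoken_vocab, embed_dim)
            self.node_embed = nn.Embedding(node_vocab, embed_dim)
            # A bidirectional LSTM encodes the node sequence of each AST path.
            self.path_lstm = nn.LSTM(embed_dim, hidden_dim // 2,
                                     bidirectional=True, batch_first=True)
            # Fuse the path encoding with both terminals' subtoken embeddings.
            self.combine = nn.Linear(hidden_dim + 2 * embed_dim, hidden_dim)
            self.target_embed = nn.Embedding(target_vocab, embed_dim)
            self.decoder = nn.LSTMCell(embed_dim + hidden_dim, hidden_dim)
            self.out = nn.Linear(hidden_dim, target_vocab)

        def forward(self, paths, starts, ends, targets):
            # paths: (k, path_len) node ids; starts/ends: (k, sub_len)
            # subtoken ids of each path's terminals; targets: (t,) gold
            # output ids beginning with a start-of-sequence token.
            _, (h, _) = self.path_lstm(self.node_embed(paths))
            path_vec = torch.cat([h[0], h[1]], dim=-1)            # (k, hidden)
            start_vec = self.subtoken_embed(starts).sum(dim=1)    # (k, embed)
            end_vec = self.subtoken_embed(ends).sum(dim=1)        # (k, embed)
            z = torch.tanh(self.combine(
                torch.cat([path_vec, start_vec, end_vec], dim=-1)))
            # The decoder's start state is the average of the k path vectors.
            hx = z.mean(dim=0, keepdim=True)
            cx = torch.zeros_like(hx)
            logits = []
            for t in range(targets.size(0)):
                # Dot-product attention over the k encoded paths at each step.
                attn = torch.softmax(z @ hx.squeeze(0), dim=0)    # (k,)
                context = (attn.unsqueeze(1) * z).sum(dim=0, keepdim=True)
                inp = torch.cat([self.target_embed(targets[t:t + 1]),
                                 context], dim=-1)
                hx, cx = self.decoder(inp, (hx, cx))   # teacher forcing
                logits.append(self.out(hx))
            return torch.stack(logits)                 # (t, 1, target_vocab)

    # Smoke test with dummy ids: 8 paths of 9 nodes each, 3 subtokens per
    # terminal, and a 4-step target sequence.
    model = Code2SeqSketch(subtoken_vocab=1000, node_vocab=64, target_vocab=300)
    out = model(torch.randint(64, (8, 9)), torch.randint(1000, (8, 3)),
                torch.randint(1000, (8, 3)), torch.randint(300, (4,)))
    print(out.shape)  # torch.Size([4, 1, 300])

A real implementation would batch examples, mask padded paths, and decode with beam search at inference time; this sketch keeps a single example and teacher forcing for readability.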