The goal, as I understand it, is to move beyond the fixed, feed-through execution of ordinary neural networks and have networks induce programs on their own. The ultimate product of this line of research would be a Gödel machine. The work is actively pursued by researchers at Google and Facebook, among others.

  1. Neural Turing Machine (NTM) (Graves et al., 2014)[1]
  2. Stack-Augmented recurrent neural networks (Joulin & Mikolov, 2015[2]; Grefenstette et al., 2015[3])
  3. Grid-LSTM (Kalchbrenner et al., 2015)[4]
  4. Neural Random-access machines: Kurach et al. (2015)[5]
  5. Learning simple algorithms from examples: Zaremba et al. (2015)[6]
  6. Neural programmer: Neelakantan et al. (2015)[7]
  7. Neural GPUs: Kaiser & Sutskever (2015)[8]
  8. Structured memory for Neural Turing Machines: Zhang et al. (2015)[9]
  9. Neural Programmer-Interpreters: Reed & de Freitas (2015)[10]
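The architectures above differ in detail, but most share one mechanism: a controller that reads from and writes to an external memory through differentiable ("soft") attention, so the whole system can be trained with gradient descent. Below is a minimal NumPy sketch of the content-based addressing and erase/add write described in the Neural Turing Machine paper; the function names and the demo values are my own, not from any of the cited papers.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax.
    e = np.exp(x - x.max())
    return e / e.sum()

def content_address(memory, key, beta):
    """Soft read/write weights over memory rows: softmax of
    cosine similarity to the key, sharpened by beta (NTM-style)."""
    sims = memory @ key / (np.linalg.norm(memory, axis=1)
                           * np.linalg.norm(key) + 1e-8)
    return softmax(beta * sims)

def read(memory, w):
    # Soft read: weighted sum of memory rows.
    return w @ memory

def write(memory, w, erase, add):
    # NTM write: erase a fraction of each addressed row, then add.
    memory = memory * (1.0 - np.outer(w, erase))
    return memory + np.outer(w, add)

# Demo: write a vector into the best-matching slot, then read it back.
rng = np.random.default_rng(0)
M = rng.normal(size=(8, 4)) * 0.1        # small random memory, 8 slots of width 4
key = np.array([1.0, 0.0, -1.0, 0.5])

w = content_address(M, key, beta=50.0)   # sharp focus on one slot
M = write(M, w, erase=np.ones(4), add=key)

w2 = content_address(M, key, beta=50.0)  # addressing with the same key
r = read(M, w2)                          # recovers (approximately) the stored key
```

Because every step here is differentiable with respect to the memory, the key, and the weights, gradients flow through the read and write operations; this is what lets the models above learn memory-access "programs" end to end.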

References

  1. Graves, Alex, Wayne, Greg, and Danihelka, Ivo. Neural turing machines. arXiv preprint arXiv:1410.5401, 2014.
  2. Joulin, Armand and Mikolov, Tomas. Inferring algorithmic patterns with stack-augmented recurrent nets. arXiv preprint arXiv:1503.01007, 2015.
  3. Grefenstette, Edward, Hermann, Karl Moritz, Suleyman, Mustafa, and Blunsom, Phil. Learning to transduce with unbounded memory. arXiv preprint arXiv:1506.02516, 2015.
  4. Kalchbrenner, Nal, Danihelka, Ivo, and Graves, Alex. Grid long short-term memory. arXiv preprint arXiv:1507.01526, 2015.
  5. Kurach, Karol, Andrychowicz, Marcin, and Sutskever, Ilya. Neural random-access machines. arXiv preprint, 2015.
  6. Zaremba, Wojciech, Mikolov, Tomas, Joulin, Armand, and Fergus, Rob. Learning simple algorithms from examples. arXiv preprint, 2015.
  7. Neelakantan, Arvind, Le, Quoc V., and Sutskever, Ilya. Neural programmer: Inducing latent programs with gradient descent. arXiv preprint, 2015.
  8. Kaiser, Łukasz and Sutskever, Ilya. Neural GPUs learn algorithms. arXiv preprint, 2015.
  9. Zhang, Wei, Yu, Yang, and Zhou, Bowen. Structured memory for neural Turing machines. arXiv preprint, 2015.
  10. Reed, Scott and de Freitas, Nando. Neural programmer-interpreters. arXiv preprint, 2015.