Sequence-to-Sequence and Attention from Scratch Using TensorFlow
The goal is to build a sequence-to-sequence, LSTM-based model from scratch in TensorFlow, without using any of TensorFlow's existing contrib library implementations. Probably the only thing not built from scratch in this project is backpropagation, which TensorFlow's automatic differentiation handles.
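To illustrate what "from scratch" means here, the following is a minimal NumPy sketch of a single LSTM step, the gate equations that such a model would implement with raw TensorFlow ops. The weight layout (one matrix producing all four gate pre-activations) and the toy dimensions are assumptions for illustration, not the project's actual code.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, b):
    """One LSTM step: W maps [x; h_prev] to the four gate pre-activations."""
    z = np.concatenate([x, h_prev]) @ W + b        # shape (4 * hidden,)
    i, f, o, g = np.split(z, 4)
    i, f, o = sigmoid(i), sigmoid(f), sigmoid(o)   # input, forget, output gates
    g = np.tanh(g)                                 # candidate cell state
    c = f * c_prev + i * g                         # new cell state
    h = o * np.tanh(c)                             # new hidden state
    return h, c

# Toy unroll over a short random sequence (sizes are illustrative)
rng = np.random.default_rng(0)
input_dim, hidden = 3, 4
W = rng.standard_normal((input_dim + hidden, 4 * hidden)) * 0.1
b = np.zeros(4 * hidden)
h, c = np.zeros(hidden), np.zeros(hidden)
for t in range(5):
    h, c = lstm_step(rng.standard_normal(input_dim), h, c, W, b)
```

In the actual project, the same equations would be expressed with TensorFlow tensors and variables so that backpropagation through time comes for free.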
A neural attention mechanism was also implemented, and its results were compared with those of the plain sequence-to-sequence model. This project was done for learning purposes as part of the Deep Learning by Google course on Udacity.
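For context, a minimal sketch of one common attention variant (dot-product attention over the encoder states) is shown below in NumPy. This is an assumed, generic formulation for illustration; the project's actual scoring function may differ.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())  # subtract max for numerical stability
    return e / e.sum()

def attention(decoder_h, encoder_hs):
    """Dot-product attention over encoder states.

    decoder_h:  current decoder hidden state, shape (hidden,)
    encoder_hs: stacked encoder hidden states, shape (T, hidden)
    """
    scores = encoder_hs @ decoder_h   # one score per encoder time step, shape (T,)
    weights = softmax(scores)         # attention distribution over time steps
    context = weights @ encoder_hs    # weighted sum of encoder states, shape (hidden,)
    return context, weights
```

The context vector is then typically concatenated with the decoder state before predicting the next output token.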