
Sequence to Sequence Model and Attention from Scratch

The goal is to build an LSTM-based sequence-to-sequence model from scratch in TensorFlow, without using any of TensorFlow's existing contrib library implementations. Arguably the only part of this project not written from scratch is backpropagation, which is handled by TensorFlow's automatic differentiation.
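The repository's exact code is not shown here, but a minimal sketch of the "from scratch" idea is a single LSTM step composed only of basic ops (matmul, sigmoid, tanh) rather than a prebuilt cell class. The class and variable names below are illustrative assumptions, not the project's actual API:

```python
import tensorflow as tf

class LSTMCellScratch:
    """One LSTM step built from raw ops instead of a library cell (illustrative sketch)."""

    def __init__(self, input_size, hidden_size):
        # Single weight matrix covering the four gates: input, forget, cell candidate, output.
        self.W = tf.Variable(
            tf.random.truncated_normal([input_size + hidden_size, 4 * hidden_size], stddev=0.1))
        self.b = tf.Variable(tf.zeros([4 * hidden_size]))
        self.hidden_size = hidden_size

    def __call__(self, x, h, c):
        # Concatenate input and previous hidden state, then compute all gates in one matmul.
        gates = tf.matmul(tf.concat([x, h], axis=1), self.W) + self.b
        i, f, g, o = tf.split(gates, 4, axis=1)
        c_new = tf.sigmoid(f) * c + tf.sigmoid(i) * tf.tanh(g)   # update cell state
        h_new = tf.sigmoid(o) * tf.tanh(c_new)                   # new hidden state
        return h_new, c_new
```

Unrolling this cell over the input time steps gives the encoder, and a second instance driven by the previous output token gives the decoder.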

A neural attention mechanism was also implemented and its results were compared against the plain sequence-to-sequence model. This project was done for learning purposes as part of the Deep Learning by Google course (Udacity).
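The specific attention variant used in the repository is not stated here; as one possible sketch, a dot-product attention step scores every encoder state against the current decoder state and returns a weighted context vector (function and argument names below are assumptions for illustration):

```python
import tensorflow as tf

def attention_context(decoder_state, encoder_states):
    """decoder_state: [batch, hidden]; encoder_states: [batch, time, hidden]."""
    # Score each encoder time step against the current decoder state.
    scores = tf.einsum('bh,bth->bt', decoder_state, encoder_states)
    # Normalize scores into an attention distribution over time steps.
    weights = tf.nn.softmax(scores, axis=1)
    # Context vector = attention-weighted sum of encoder states.
    context = tf.einsum('bt,bth->bh', weights, encoder_states)
    return context, weights
```

At each decoding step the context vector is typically concatenated with the decoder's hidden state before predicting the next token.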

