CS7643 Deep Learning In Python 3 Assignment 4 Answer


Question :

CS7643: Deep Learning Assignment 4

In this assignment, we will work with Python 3. If you do not have a Python distribution installed yet, we recommend installing Anaconda (or Miniconda) with Python 3. Since you should already have PyTorch installed in your local Anaconda environment from Assignment 2, we do not provide environment files for setting up the conda environment. Make sure you have the following packages installed:

pip install torchtext==0.8.1
pip install torch==1.7.1
pip install spacy==2.3.5
pip install tqdm
pip install numpy

Additionally, you will need the Spacy tokenizers in English and German language, which can be downloaded as such:

$ python -m spacy download en

$ python -m spacy download de

In your assignment you will see the notebook Machine_Translation.ipynb which contains test cases and shows the training progress. You can follow that notebook for instructions as well.

RNNs and LSTMs

In models/naive you will see the files necessary to complete this section. In both of these files you will complete the initialization and the forward pass.

1.1 RNN Unit

You will be using PyTorch Linear layers and activations to implement a vanilla RNN unit. Please refer to the following structure and complete the code in RNN.py:

RNN Unit
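As a rough illustration only, here is a minimal sketch of a single vanilla RNN step built from two nn.Linear layers and a tanh activation. The class name, constructor arguments, and return value below are placeholders; the interface you must implement is fixed by the provided RNN.py and the notebook's test cases.

```python
import torch
import torch.nn as nn

class VanillaRNNUnit(nn.Module):
    """One vanilla RNN step: h_t = tanh(W_ih x_t + b_ih + W_hh h_{t-1} + b_hh)."""

    def __init__(self, input_size, hidden_size):
        super().__init__()
        # Two Linear layers hold the input-to-hidden and hidden-to-hidden weights.
        self.i2h = nn.Linear(input_size, hidden_size)
        self.h2h = nn.Linear(hidden_size, hidden_size)
        self.activation = nn.Tanh()

    def forward(self, x, h_prev):
        # x: (batch, input_size); h_prev: (batch, hidden_size)
        return self.activation(self.i2h(x) + self.h2h(h_prev))
```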


1.2 LSTM

You will be using PyTorch nn.Parameter and activations to implement an LSTM unit. You can simply translate the following equations using nn.Parameter and PyTorch activation functions to build an LSTM from scratch:

i_t = σ(W_ii x_t + b_ii + W_hi h_{t-1} + b_hi)

f_t = σ(W_if x_t + b_if + W_hf h_{t-1} + b_hf)

g_t = tanh(W_ig x_t + b_ig + W_hg h_{t-1} + b_hg)

o_t = σ(W_io x_t + b_io + W_ho h_{t-1} + b_ho)

c_t = f_t ⊙ c_{t-1} + i_t ⊙ g_t

h_t = o_t ⊙ tanh(c_t)

Here’s a great visualization of the above equations from Colah’s blog to help you understand the LSTM unit.
LSTM

If you want to see nn.Parameter used in an example, check out this tutorial from PyTorch.
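For intuition, here is a minimal from-scratch sketch that translates the equations above into nn.Parameter weights and PyTorch activations. The attribute names, weight layout, and initialization are illustrative assumptions; follow the structure and naming required by the provided file.

```python
import math
import torch
import torch.nn as nn

class LSTMUnit(nn.Module):
    """One LSTM step built from raw nn.Parameter weights, following the gate
    equations above. Weights are stored as (in_features, hidden) so each gate
    is computed with x @ W; attribute names mirror the math, not the file."""

    def __init__(self, input_size, hidden_size):
        super().__init__()

        def gate_params():
            # Input-to-hidden weight, hidden-to-hidden weight, and two biases.
            return (nn.Parameter(torch.empty(input_size, hidden_size)),
                    nn.Parameter(torch.empty(hidden_size, hidden_size)),
                    nn.Parameter(torch.zeros(hidden_size)),
                    nn.Parameter(torch.zeros(hidden_size)))

        self.W_ii, self.W_hi, self.b_ii, self.b_hi = gate_params()  # input gate
        self.W_if, self.W_hf, self.b_if, self.b_hf = gate_params()  # forget gate
        self.W_ig, self.W_hg, self.b_ig, self.b_hg = gate_params()  # cell candidate
        self.W_io, self.W_ho, self.b_io, self.b_ho = gate_params()  # output gate

        stdv = 1.0 / math.sqrt(hidden_size)
        for p in self.parameters():
            if p.dim() > 1:
                nn.init.uniform_(p, -stdv, stdv)  # simple uniform init for the weights

    def forward(self, x, h_prev, c_prev):
        # x: (batch, input_size); h_prev, c_prev: (batch, hidden_size)
        i_t = torch.sigmoid(x @ self.W_ii + self.b_ii + h_prev @ self.W_hi + self.b_hi)
        f_t = torch.sigmoid(x @ self.W_if + self.b_if + h_prev @ self.W_hf + self.b_hf)
        g_t = torch.tanh(x @ self.W_ig + self.b_ig + h_prev @ self.W_hg + self.b_hg)
        o_t = torch.sigmoid(x @ self.W_io + self.b_io + h_prev @ self.W_ho + self.b_ho)
        c_t = f_t * c_prev + i_t * g_t
        h_t = o_t * torch.tanh(c_t)
        return h_t, c_t
```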

Seq2Seq Implementation

In models/seq2seq you will see the files needed to complete this section: Encoder.py, Decoder.py, and Seq2Seq.py. In each of these files you will complete the initialization and forward pass in the __init__ and forward functions.
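The sketch below shows one common way an encoder, a decoder, and a wrapper Seq2Seq module fit together for translation with teacher forcing. All class names, constructor arguments, and tensor shapes here are assumptions for illustration; the actual signatures are dictated by the provided files and the notebook's test cases.

```python
import torch
import torch.nn as nn

class Encoder(nn.Module):
    def __init__(self, vocab_size, emb_size, hidden_size):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, emb_size)
        self.rnn = nn.LSTM(emb_size, hidden_size, batch_first=True)

    def forward(self, src):
        # src: (batch, src_len) token ids -> final hidden/cell state summarizing the source
        outputs, (hidden, cell) = self.rnn(self.embedding(src))
        return hidden, cell

class Decoder(nn.Module):
    def __init__(self, vocab_size, emb_size, hidden_size):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, emb_size)
        self.rnn = nn.LSTM(emb_size, hidden_size, batch_first=True)
        self.out = nn.Linear(hidden_size, vocab_size)

    def forward(self, token, hidden, cell):
        # token: (batch, 1) previous target token id
        output, (hidden, cell) = self.rnn(self.embedding(token), (hidden, cell))
        logits = self.out(output.squeeze(1))          # (batch, vocab_size)
        return logits, hidden, cell

class Seq2Seq(nn.Module):
    def __init__(self, encoder, decoder):
        super().__init__()
        self.encoder, self.decoder = encoder, decoder

    def forward(self, src, trg):
        # Assumes trg[:, 0] is the start-of-sentence token.
        hidden, cell = self.encoder(src)
        outputs, token = [], trg[:, :1]
        for t in range(1, trg.size(1)):
            logits, hidden, cell = self.decoder(token, hidden, cell)
            outputs.append(logits)
            token = trg[:, t:t + 1]                   # teacher forcing: feed the gold token
        return torch.stack(outputs, dim=1)            # (batch, trg_len - 1, vocab_size)
```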

2.1 Training and Hyperparameter Tuning

Train seq2seq on the dataset with the default hyperparameters. Then perform hyperparameter tuning and include the improved results in a report explaining what you have tried. Do NOT just increase the number of epochs or change the model type (RNN to LSTM), as this is too trivial.
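If it helps to organize the tuning, a small grid search over a few settings is often enough. The sketch below is only a template: the hyperparameter names are examples, and train_and_evaluate is a hypothetical stand-in for whatever training and validation routine the notebook provides.

```python
import itertools

def train_and_evaluate(learning_rate, hidden_size, dropout):
    # Hypothetical helper: build the seq2seq model with these hyperparameters,
    # train it with the notebook's loop, and return the validation loss.
    raise NotImplementedError

grid = {
    "learning_rate": [1e-3, 5e-4],
    "hidden_size": [256, 512],
    "dropout": [0.2, 0.5],
}

keys, values = zip(*grid.items())
results = []
for combo in itertools.product(*values):
    cfg = dict(zip(keys, combo))
    print("Candidate configuration:", cfg)
    # results.append((cfg, train_and_evaluate(**cfg)))  # uncomment once wired up

# Sort by validation loss to pick the best configuration for your report.
# results.sort(key=lambda item: item[1])
```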

Transformers

We will be implementing a one-layer Transformer encoder which, similar to an RNN, can encode a sequence of inputs and produce a probability distribution over tokens in the target language. The architecture can be seen below. You can refer to the original paper. In models you will see the file Transformer.py. You will implement the functions in the TransformerTranslator class.


Transformers
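To see how the pieces connect end to end, here is a compact single-layer encoder sketch that leans on built-in modules such as nn.MultiheadAttention. This is only for orientation: the assignment asks you to implement the embedding, multi-head self-attention, feed-forward, and final projection stages yourself inside TransformerTranslator, and the dimensions below are placeholders.

```python
import torch
import torch.nn as nn

class OneLayerTransformerEncoder(nn.Module):
    """Illustrative single-layer Transformer encoder mapping source tokens to
    per-position scores over the target vocabulary."""

    def __init__(self, src_vocab, trg_vocab, max_len, d_model=128, num_heads=2, dim_ff=512):
        super().__init__()
        self.token_emb = nn.Embedding(src_vocab, d_model)
        self.pos_emb = nn.Embedding(max_len, d_model)
        self.attn = nn.MultiheadAttention(d_model, num_heads)
        self.norm1 = nn.LayerNorm(d_model)
        self.ff = nn.Sequential(nn.Linear(d_model, dim_ff), nn.ReLU(), nn.Linear(dim_ff, d_model))
        self.norm2 = nn.LayerNorm(d_model)
        self.out = nn.Linear(d_model, trg_vocab)      # scores over target-language tokens

    def forward(self, src):
        # src: (batch, seq_len) source token ids
        positions = torch.arange(src.size(1), device=src.device).unsqueeze(0)
        x = self.token_emb(src) + self.pos_emb(positions)     # token + positional embeddings
        x_t = x.transpose(0, 1)                               # nn.MultiheadAttention expects (seq, batch, d_model)
        attn_out, _ = self.attn(x_t, x_t, x_t)                # self-attention sublayer
        x = self.norm1(x + attn_out.transpose(0, 1))          # residual connection + layer norm
        x = self.norm2(x + self.ff(x))                        # feed-forward sublayer
        return self.out(x)                                    # (batch, seq_len, trg_vocab) logits
```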

3.1 Embeddings

We will format our input embeddings similarly to how they are constructed in [BERT (source of figure)](https://arxiv.org/pdf/1810.04805.pdf). Recall from lecture that, unlike an RNN, a Transformer does not include any positional information about the order in which the words in the sentence occur. Because of this, we need to append a positional encoding token at each position. (We will ignore the segment embeddings and [SEP] token here, since we are only encoding one sentence at a time.) We have already appended the
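For the embedding step itself, the idea is simply to add a learned positional embedding to the token embedding at every position, as in the BERT figure. The sizes below are arbitrary placeholders rather than the assignment's actual vocabulary size or maximum sequence length.

```python
import torch
import torch.nn as nn

# Placeholder sizes; the real values come from the dataset and the
# TransformerTranslator constructor.
vocab_size, max_len, d_model = 10000, 64, 128

token_emb = nn.Embedding(vocab_size, d_model)   # one vector per word id
pos_emb = nn.Embedding(max_len, d_model)        # one vector per position index

src = torch.randint(0, vocab_size, (2, max_len))                 # (batch, seq_len) token ids
positions = torch.arange(max_len).unsqueeze(0).expand_as(src)    # 0..seq_len-1 for each example
embeddings = token_emb(src) + pos_emb(positions)                 # (batch, seq_len, d_model)
```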

