
Commit 2cf2a38: Update Readme.md
1 parent d27e286

1 file changed: 17 additions, 3 deletions
Chapter-wise code/Code - PyTorch/6. Natural-Language-Processing/6. Machine Translation/NMT-Basic/Readme.md
# Machine Translation using Basic Linear Algebra

## 1. Generate French and English Word Embedding

Here, we have 3 given parameters:
1. `en_embeddings`: English words and their corresponding embeddings.<br>
…

Now, we have to create an English embedding matrix and French embedding matrix:<br>

```python
# …
# stack the vectors of Y_l into a matrix Y
Y = np.vstack(Y_l)
```
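
Only the tail of that snippet appears in this diff. A minimal sketch of the full construction, assuming `en_fr` (an English-to-French training dictionary) and `fr_embeddings` as names for the two parameters not visible in this hunk:

```python
import numpy as np

def get_matrices(en_fr, en_embeddings, fr_embeddings):
    """Stack aligned English/French embeddings into matrices X and Y.

    `en_fr` and `fr_embeddings` are assumed names; only `en_embeddings`
    is spelled out in this diff.
    """
    X_l, Y_l = [], []
    for en_word, fr_word in en_fr.items():
        # keep only pairs for which both embeddings exist
        if en_word in en_embeddings and fr_word in fr_embeddings:
            X_l.append(en_embeddings[en_word])
            Y_l.append(fr_embeddings[fr_word])

    # stack the vectors of X_l into a matrix X
    X = np.vstack(X_l)
    # stack the vectors of Y_l into a matrix Y
    Y = np.vstack(Y_l)
    return X, Y
```

Row `i` of `X` and row `i` of `Y` then hold the embeddings of an English word and its French translation, which is what the minimization below relies on.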

## 2. Linear Transformation of Word Embeddings

Given dictionaries of English and French word embeddings, we will create a transformation matrix `R`. In other words, given an English word embedding `e`, we multiply `e` by `R` (i.e., `eR`) to generate a new word embedding `f`.
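
As a toy illustration of the shapes involved (made-up 3-dimensional vectors; real word embeddings are much higher-dimensional):

```python
import numpy as np

rng = np.random.default_rng(0)

e = np.array([[1.0, 0.5, -0.2]])  # one English embedding as a row vector, shape (1, n)
R = rng.standard_normal((3, 3))   # transformation matrix, shape (n, n)

f = e @ R                         # predicted French embedding, shape (1, n)
```

Because `e` is a row vector, an entire training set can be transformed at once as `X @ R`, which is the form the loss below uses.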

### Describing the Translation Problem as a Minimization Problem

We can describe our translation problem as finding a matrix `R` that minimizes the following equation:<br>
<img src="./images/translation_problem.png"><br>
For this, we calculate the loss by modifying the original *Frobenius norm*:<br>
Original Frobenius norm: <img src="./images/original_forbenius_norm.png"><br>
Modified Frobenius norm: <img src="./images/modified_forbenius_norm.png"><br>
Finally, our loss function will look something like this:<br>
<img src="./images/final_loss_function.png"><br>
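
The loss is only given as images above. Assuming the modified Frobenius norm squares the norm and divides by the number of training pairs `m` (a common variant that drops the square root and makes the loss independent of dataset size), a sketch of the computation:

```python
import numpy as np

def compute_loss(X, Y, R):
    """Average squared Frobenius norm of (XR - Y).

    Assumes the 'modified Frobenius norm' in the images above is the
    squared norm divided by the number of training pairs m.
    """
    m = X.shape[0]
    diff = X @ R - Y      # difference between predicted and actual French embeddings
    return np.sum(diff ** 2) / m
```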