From d50535ea02e4b520d104ba93d2743ec09b5da814 Mon Sep 17 00:00:00 2001
From: Klaus Ondrag <18405064+klausondrag@users.noreply.github.com>
Date: Thu, 20 Dec 2018 19:32:20 +0100
Subject: [PATCH] fixed typos

---
 README.md | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/README.md b/README.md
index 1689006..0a83c54 100644
--- a/README.md
+++ b/README.md
@@ -252,9 +252,9 @@ right-to-left model, e.g.:
 
 P(store | </s>) * P(a | store </s>) * ...
 
-Now you have  two representations of each word, one left-to-right and one right-to-left, and you can concatenate them together for your downstream task.
+Now you have two representations of each word, one left-to-right and one right-to-left, and you can concatenate them together for your downstream task.
 
-But intuitively, it would be much better if we could train a single model that was deeply bidirectional.
+But intuitively, it would be much better if we could train a single model that was deeply bidirectional.
 It's unfortunately impossible to train a deep bidirectional model like a normal LM, because that would create cycles where words can indirectly
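
The context text in this hunk describes two unidirectional chain-rule factorizations and the "shallow" trick of concatenating their per-word representations. The following toy Python sketch illustrates both ideas; it is not code from the BERT repository, and the example sentence, the placeholder cond_prob scorer, and the hidden sizes are invented for illustration only.

import numpy as np

tokens = ["the", "man", "went", "to", "a", "store"]

def cond_prob(word, context):
    # Placeholder: a real language model would score `word` given `context`.
    # A fixed dummy value is returned here just to show the chain rule.
    return 0.1

# Left-to-right factorization: P(the | <s>) * P(man | <s> the) * ...
p_l2r = 1.0
for i, w in enumerate(tokens):
    p_l2r *= cond_prob(w, tokens[:i])

# Right-to-left factorization: P(store | </s>) * P(a | store </s>) * ...
p_r2l = 1.0
for i, w in reversed(list(enumerate(tokens))):
    p_r2l *= cond_prob(w, tokens[i + 1:])

# "Shallow" bidirectionality: run two separate unidirectional encoders and
# concatenate their per-token vectors (128 is an arbitrary hidden size).
h_l2r = np.random.randn(len(tokens), 128)  # stand-in for forward LM states
h_r2l = np.random.randn(len(tokens), 128)  # stand-in for backward LM states
h_concat = np.concatenate([h_l2r, h_r2l], axis=-1)  # shape: (6, 256)

Each word's concatenated vector sees its left context through one half and its right context through the other, but no single model ever conditions on both sides at once, which is exactly the limitation the README text contrasts with a deeply bidirectional model.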