---
colorTo: indigo
---
# English → Hindi Translation with Seq2Seq + Multi-Head Attention

This Streamlit Space demonstrates the **power of LSTMs combined with attention mechanisms** for sequence-to-sequence (Seq2Seq) tasks. Specifically, it showcases **multi-head cross-attention**, in which the decoder attends over the encoder's outputs, in a translation setting.
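The cross-attention step can be sketched as follows. This is an illustrative NumPy implementation, not the Space's actual code: the function name, head count, and dimensions are assumptions, and random matrices stand in for learned projection weights. Decoder hidden states act as queries over the encoder's output sequence, split across several heads.

```python
import numpy as np

def multi_head_cross_attention(dec_states, enc_outputs, num_heads=4, seed=0):
    """Scaled dot-product cross-attention: decoder states (queries)
    attend over encoder outputs (keys/values), split into heads.

    dec_states:  (T_dec, d_model) decoder hidden states
    enc_outputs: (T_enc, d_model) encoder output sequence
    """
    rng = np.random.default_rng(seed)
    d_model = dec_states.shape[-1]
    assert d_model % num_heads == 0
    d_head = d_model // num_heads

    # Random projections stand in for the learned Wq/Wk/Wv/Wo parameters.
    Wq, Wk, Wv, Wo = (rng.standard_normal((d_model, d_model)) / np.sqrt(d_model)
                      for _ in range(4))

    def project(x, W):
        # (T, d_model) -> (num_heads, T, d_head)
        p = x @ W
        return p.reshape(x.shape[0], num_heads, d_head).transpose(1, 0, 2)

    Q = project(dec_states, Wq)
    K = project(enc_outputs, Wk)
    V = project(enc_outputs, Wv)

    # Attention scores per head: (num_heads, T_dec, T_enc)
    scores = Q @ K.transpose(0, 2, 1) / np.sqrt(d_head)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over encoder steps

    # Weighted sum of encoder values, then merge heads back to d_model.
    context = weights @ V                            # (num_heads, T_dec, d_head)
    context = context.transpose(1, 0, 2).reshape(dec_states.shape[0], d_model)
    return context @ Wo, weights

# Toy shapes: 5 encoded English tokens, 3 Hindi tokens decoded so far.
enc = np.random.default_rng(1).standard_normal((5, 64))
dec = np.random.default_rng(2).standard_normal((3, 64))
ctx, attn = multi_head_cross_attention(dec, enc)
```

Each head produces its own alignment between decoder steps and encoder steps, so `attn` has one `(T_dec, T_enc)` weight matrix per head; in a trained model these alignments tend to track which source words the model is translating.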
Daksh Bhardwaj
Email: [email protected]
GitHub: [Daksh5555](https://github.com/daksh5555)