A powerful Python library for building and training Sequence-to-Sequence models with attention mechanisms.
- Multiple RNN Types: LSTM, GRU, RNN, and bidirectional variants
- Attention Mechanisms: Bahdanau and Luong-style attention
- Custom Model Format: Save/load models in `.ace` format
- Advanced Tokenization: Flexible preprocessing and vocabulary management
- Production Ready: Comprehensive training utilities and inference tools
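To make the attention bullet concrete: Luong-style (dot-product) attention scores each encoder state against the current decoder state, softmaxes the scores into weights, and returns a weighted-average context vector. The sketch below is illustrative pure Python, not the aceflow API:

```python
import math

def luong_dot_attention(decoder_state, encoder_states):
    """Illustrative Luong-style (dot-product) attention.

    decoder_state: list[float], the current decoder hidden state.
    encoder_states: list[list[float]], one hidden state per source step.
    Returns (weights, context): weights sum to 1, and context is the
    weighted average of the encoder states.
    """
    # Alignment scores: dot product of the decoder state with each encoder state
    scores = [sum(d * e for d, e in zip(decoder_state, h)) for h in encoder_states]
    # Softmax over the scores (max-subtracted for numerical stability)
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    weights = [e / total for e in exps]
    # Context vector: weights applied to the encoder states, dimension by dimension
    context = [
        sum(w * h[i] for w, h in zip(weights, encoder_states))
        for i in range(len(decoder_state))
    ]
    return weights, context

# Toy example: the decoder state aligns with the first encoder state,
# so the first attention weight dominates.
weights, context = luong_dot_attention(
    [1.0, 0.0],
    [[1.0, 0.0], [0.0, 1.0]],
)
```

Bahdanau-style attention differs only in the scoring step (a small learned feed-forward network instead of a raw dot product); the softmax and context computation are the same.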
```python
from aceflow import Seq2SeqModel
from aceflow.utils import Tokenizer

# Initialize model
model = Seq2SeqModel(
    src_vocab_size=1000,
    tgt_vocab_size=1000,
    hidden_size=256,
    rnn_type='lstm',
    use_attention=True
)

# Train and save
model.save("model.ace")

# Load model
loaded_model = Seq2SeqModel.load("model.ace")
```

For detailed installation instructions, see the Installation Guide.
This project is licensed under the MIT License - see the LICENSE file for details.
Made with ❤️ by Maaz Waheed