KazBERT
KazBERT is a BERT-based model designed and fine-tuned for Kazakh-language tasks. It is trained with the Masked Language Modeling (MLM) objective on a multilingual text corpus containing Kazakh, Russian, and English.
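The MLM objective corrupts a fraction of input tokens and trains the model to recover them. As a minimal sketch, assuming KazBERT follows the standard BERT recipe (select ~15% of positions; of those, 80% become `[MASK]`, 10% a random token, 10% stay unchanged), the corruption step looks like this; the toy vocabulary is illustrative, not KazBERT's actual vocabulary:

```python
import random

MASK = "[MASK]"
TOY_VOCAB = ["сәлем", "әлем", "тіл", "кітап"]  # toy Kazakh tokens for random replacement

def mlm_mask(tokens, mask_prob=0.15, rng=None):
    """Standard BERT-style MLM corruption.

    Returns (corrupted, labels): labels[i] holds the original token at
    positions the model must predict, and None elsewhere.
    """
    rng = rng or random.Random(0)
    corrupted, labels = list(tokens), [None] * len(tokens)
    for i, tok in enumerate(tokens):
        if rng.random() >= mask_prob:
            continue  # position not selected for prediction
        labels[i] = tok
        r = rng.random()
        if r < 0.8:
            corrupted[i] = MASK                 # 80%: replace with [MASK]
        elif r < 0.9:
            corrupted[i] = rng.choice(TOY_VOCAB)  # 10%: random token
        # else: 10% keep the original token (still predicted)
    return corrupted, labels

tokens = ["Қазақ", "тілі", "өте", "бай", "тіл"]
corrupted, labels = mlm_mask(tokens, mask_prob=0.5)
```

During training, the loss is computed only at the positions with non-`None` labels, which is why unselected positions are excluded.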
Features
- Custom tokenizer optimized for the Kazakh language
- Trained on a diverse Kazakh text corpus
- Supports downstream NLP tasks in Kazakh
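To illustrate what a Kazakh-optimized tokenizer does, here is a minimal sketch of greedy longest-match-first WordPiece segmentation, the scheme BERT tokenizers use; the vocabulary below is a hypothetical toy example, not KazBERT's real vocabulary (a Kazakh-specific vocabulary reduces over-segmentation of agglutinative word forms):

```python
def wordpiece_tokenize(word, vocab, unk="[UNK]"):
    """Greedy longest-match-first WordPiece segmentation.

    Continuation pieces carry the '##' prefix; if no match is found
    for some span, the whole word maps to the unknown token.
    """
    pieces, start = [], 0
    while start < len(word):
        end, match = len(word), None
        while start < end:
            sub = word[start:end]
            if start > 0:
                sub = "##" + sub  # mark word-internal pieces
            if sub in vocab:
                match = sub
                break
            end -= 1  # shrink the candidate span and retry
        if match is None:
            return [unk]
        pieces.append(match)
        start = end
    return pieces

# Toy vocabulary; "кітаптарда" = "кітап" (book) + plural + locative suffixes.
vocab = {"кітап", "##тар", "##да", "тіл"}
print(wordpiece_tokenize("кітаптарда", vocab))  # → ['кітап', '##тар', '##да']
```

A tokenizer trained on Kazakh text keeps common stems and suffixes like these as single pieces, which shortens sequences compared with a multilingual vocabulary.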
