# Gonyai-v1: A Poetic Konkani Language Model
Gonyai-v1 is a 160M parameter transformer model specifically designed for Konkani text generation. It features a custom architecture (KonkanGPT) utilizing Rotary Positional Embeddings (RoPE), RMSNorm, and SwiGLU activation functions.
Unlike general-purpose models, Gonyai-v1 is a linguistic specialist focused on the cultural and poetic nuances of the Konkani language.
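To make the architecture choices concrete, here is a minimal pure-Python sketch of two of the components named above, RMSNorm and SwiGLU. This is an illustration of the general techniques only, not the actual KonkanGPT implementation; all function names and shapes here are hypothetical.

```python
import math

def rms_norm(x, gain=None, eps=1e-6):
    """RMSNorm: rescale x by its root-mean-square.

    Unlike LayerNorm, no mean is subtracted; only the scale is normalized.
    """
    rms = math.sqrt(sum(v * v for v in x) / len(x) + eps)
    g = gain if gain is not None else [1.0] * len(x)
    return [gi * v / rms for gi, v in zip(g, x)]

def swiglu(x, W, V):
    """SwiGLU: SiLU(x @ W) gated elementwise by (x @ V)."""
    def matvec(M, v):
        return [sum(m * vi for m, vi in zip(row, v)) for row in M]
    def silu(v):
        return v / (1.0 + math.exp(-v))  # SiLU / "swish" activation
    a = matvec(W, x)   # gate branch
    b = matvec(V, x)   # value branch
    return [silu(ai) * bi for ai, bi in zip(a, b)]

# After rms_norm, the output's own RMS is ~1 regardless of the input scale.
y = rms_norm([3.0, 4.0])
print(round(math.sqrt(sum(v * v for v in y) / len(y)), 3))  # → 1.0
```

In practice these would operate on tensors with learned `gain`, `W`, and `V` parameters; the sketch just shows the arithmetic each layer performs.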
## Model Details
- Architecture: KonkanGPT (Custom Transformer)
- Parameters: ~160 Million
- Tokenizer: Custom 32k Byte-Level BPE (Optimized for Devanagari/Konkani)
- Training Data: Curated Konkani literature, news, and artistic works.
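Why a Devanagari-optimized tokenizer matters: in UTF-8, every Devanagari codepoint occupies 3 bytes, so a byte-level BPE without script-aware merges fragments Konkani text into far more tokens than an equal-length Latin string. A small illustration (not the actual Gonyai-v1 tokenizer, just raw byte counts):

```python
# Devanagari codepoints (U+0900–U+097F) encode to 3 bytes each in UTF-8.
konkani = "गोंय"  # "Goa" in Devanagari: 4 codepoints
latin = "Goav"    # 4 ASCII codepoints

print(len(konkani), len(konkani.encode("utf-8")))  # → 4 12
print(len(latin), len(latin.encode("utf-8")))      # → 4 4
```

A byte-level BPE trained on Konkani text learns merges over those 3-byte sequences, which is what drives the token-efficiency gap shown in the benchmarks below.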
## 📊 Benchmarks (Sub-1B Category)
In Feb 2026 benchmarks, Gonyai-v1 was tested against global heavyweights SmolLM2-360M and Qwen2.5-0.5B. Despite its smaller size, Gonyai-v1 demonstrates superior linguistic efficiency for Konkani.
| Metric | Gonyai-v1 (160M) | SmolLM2-360M | Qwen2.5-0.5B |
|---|---|---|---|
| Token Efficiency (Lower is Better) | 5.00 | 7.85 | 6.57 |
| Generation Speed (Tokens/Sec) | 65.96 | 27.00 | 33.27 |
| Vocabulary Diversity | 0.80 | 0.91 | 0.93 |
Key Takeaway: On Konkani script, Gonyai-v1 generates roughly 2x faster than Qwen2.5-0.5B (and ~2.4x faster than SmolLM2-360M) while needing significantly fewer tokens per text than either larger generic model.
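The speed and efficiency ratios in the takeaway follow directly from the table:

```python
# Generation speed (tokens/sec) from the benchmark table.
gonyai_tps, qwen_tps, smollm_tps = 65.96, 33.27, 27.00
print(round(gonyai_tps / qwen_tps, 2))    # → 1.98 (vs. Qwen2.5-0.5B)
print(round(gonyai_tps / smollm_tps, 2))  # → 2.44 (vs. SmolLM2-360M)

# Token efficiency (lower is better): relative token savings.
gonyai_eff, qwen_eff, smollm_eff = 5.00, 6.57, 7.85
print(round(1 - gonyai_eff / qwen_eff, 2))    # → 0.24 (~24% fewer tokens)
print(round(1 - gonyai_eff / smollm_eff, 2))  # → 0.36 (~36% fewer tokens)
```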
## ⚠️ Known Limitations
- Factual Accuracy: At 160M parameters, the model is a creative artist, not an encyclopedia. It may hallucinate historical facts or dates.
- Logical Reasoning: Not suitable for complex math or coding tasks.
- Topic Drift: In long-form generations, the model may drift from the prompt into poetic repetition.
## 🚀 How to Use
Use the script below for inference. Note: because KonkanGPT is a custom architecture, you must pass `trust_remote_code=True` when loading both the tokenizer and the model.
```python
from transformers import AutoModelForCausalLM, AutoTokenizer
import torch

model_id = "omdeep22/Gonyai-v1"

# trust_remote_code=True is required to load the custom KonkanGPT architecture.
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True)
model = model.to("cuda" if torch.cuda.is_available() else "cpu")

# Prompt (Konkani): "Write a line about Goa's nature."
response = model.chat(tokenizer, "गोंयच्या निसर्गाविशीं एक ओळ बरय.")
print(response)
```