```python
# Load model directly
from transformers import AutoModel

model = AutoModel.from_pretrained("illian64/madlad400-7b-mt-ct2-bfloat16", dtype="auto")
```
Disclaimer: illian64, who was not involved in this research, converted the original model to a CTranslate2-optimized model and wrote this model card based on google/madlad400-7b-mt.
Convert params:

```shell
ct2-transformers-converter --model google/madlad400-7b-mt --quantization bfloat16 --output_dir madlad400-7b-mt-ct2-bfloat16
```
Model tree for illian64/madlad400-7b-mt-ct2-bfloat16
- Base model: google/madlad400-7b-mt
```python
# Use a pipeline as a high-level helper
# Warning: Pipeline type "translation" is no longer supported in transformers v5.
# You must load the model directly (see above) or downgrade to v4.x with:
#   pip install "transformers<5.0.0"
from transformers import pipeline

pipe = pipeline("translation", model="illian64/madlad400-7b-mt-ct2-bfloat16")
```
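Since this repository holds CTranslate2 weights rather than a standard transformers checkpoint, the usual approach is to load it with the `ctranslate2` runtime instead of the snippets above. Below is a minimal sketch, assuming a local directory named `madlad400-7b-mt-ct2-bfloat16`, CPU inference, and the `<2xx>` target-language prefix documented on the upstream google/madlad400-7b-mt card; the `translate` helper and its defaults are illustrative, not part of this repository:

```python
# MADLAD-400 expects the target language as a "<2xx>" prefix on the source text.
def build_source(text: str, target_lang: str) -> str:
    return f"<2{target_lang}> {text}"


def translate(text: str, target_lang: str,
              model_dir: str = "madlad400-7b-mt-ct2-bfloat16") -> str:
    # Imports kept local so the prefix helper above stays dependency-free.
    import ctranslate2
    import transformers

    # The tokenizer comes from the original checkpoint; the converted
    # directory holds only the CTranslate2 weights.
    tokenizer = transformers.AutoTokenizer.from_pretrained("google/madlad400-7b-mt")
    translator = ctranslate2.Translator(model_dir, device="cpu")

    # CTranslate2 consumes token strings, not ids.
    tokens = tokenizer.convert_ids_to_tokens(
        tokenizer.encode(build_source(text, target_lang))
    )
    result = translator.translate_batch([tokens])[0]
    return tokenizer.decode(tokenizer.convert_tokens_to_ids(result.hypotheses[0]))
```

For example, `translate("Hello, world!", "de")` would request a German translation; change the prefix (`"<2fr> …"`, `"<2es> …"`) to target other languages.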