Serve the model with Docker:
docker run --gpus all \
--shm-size 32g \
-p 30000:30000 \
-v ~/.cache/huggingface:/root/.cache/huggingface \
--env "HF_TOKEN=<secret>" \
--ipc=host \
lmsysorg/sglang:latest \
python3 -m sglang.launch_server \
--model-path "FreedomIntelligence/Apollo-6B" \
--host 0.0.0.0 \
--port 30000

# Call the server using curl (OpenAI-compatible API):
curl -X POST "http://localhost:30000/v1/completions" \
-H "Content-Type: application/json" \
--data '{
"model": "FreedomIntelligence/Apollo-6B",
"prompt": "Once upon a time,",
"max_tokens": 512,
"temperature": 0.5
}'

Covering English, Chinese, French, Hindi, Spanish, and Arabic so far.
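The same completion request can be issued from Python with only the standard library. This is a minimal client sketch: the host, port, and sampling parameters mirror the curl example above, and the helper names are ours, so adjust them to your deployment.

```python
import json
from urllib import request

def build_completion_payload(prompt, model="FreedomIntelligence/Apollo-6B",
                             max_tokens=512, temperature=0.5):
    """Assemble the JSON body expected by the /v1/completions endpoint."""
    return {"model": model, "prompt": prompt,
            "max_tokens": max_tokens, "temperature": temperature}

def complete(prompt, base_url="http://localhost:30000"):
    """POST a completion request and return the generated text."""
    body = json.dumps(build_completion_payload(prompt)).encode("utf-8")
    req = request.Request(base_url + "/v1/completions", data=body,
                          headers={"Content-Type": "application/json"})
    with request.urlopen(req, timeout=120) as resp:
        return json.load(resp)["choices"][0]["text"]
```

For example, `complete("Once upon a time,")` returns the continuation produced by the server started above.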
👨🏻‍💻 Github • 📃 Paper • 🌐 Demo • 🤗 ApolloCorpus • 🤗 XMedBench
🤗Apollo-0.5B • 🤗 Apollo-1.8B • 🤗 Apollo-2B • 🤗 Apollo-6B • 🤗 Apollo-7B
🤗 Apollo-0.5B-GGUF • 🤗 Apollo-2B-GGUF • 🤗 Apollo-6B-GGUF • 🤗 Apollo-7B-GGUF
User:{query}\nAssistant:{response}<|endoftext|>
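A small formatting helper, assuming the template above: every finished turn is rendered as `User:{query}\nAssistant:{response}<|endoftext|>`, and the generation prompt ends right after `Assistant:` so the model continues from there. The function name is ours, not from the repository.

```python
def render_prompt(history, query):
    """Build a prompt in the Apollo template.

    history: list of (query, response) pairs from earlier turns.
    query: the new user query the model should answer.
    """
    # each completed turn ends with the <|endoftext|> separator
    parts = [f"User:{q}\nAssistant:{r}<|endoftext|>" for q, r in history]
    # the final turn stops after "Assistant:" so the model fills it in
    parts.append(f"User:{query}\nAssistant:")
    return "".join(parts)
```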
Dataset 🤗 ApolloCorpus
Pretraining format (a JSON list of text strings):
[
"string1",
"string2",
...
]
SFT format (a JSON list of conversations, each a flat list of alternating questions and answers):
[
[
"q1",
"a1",
"q2",
"a2",
...
],
...
]
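The two layouts above can be read with a few lines of Python: pretraining files hold a flat JSON list of text strings, while SFT files hold a list of conversations, each a flat `[q1, a1, q2, a2, ...]` list. This is a sketch with placeholder paths, not code from the repository.

```python
import json

def iter_pretrain_texts(path):
    """Yield each raw training string from a pretraining JSON file."""
    with open(path, encoding="utf-8") as f:
        yield from json.load(f)

def iter_sft_pairs(path):
    """Yield (question, answer) tuples from an SFT JSON file."""
    with open(path, encoding="utf-8") as f:
        for conv in json.load(f):
            # pair up the alternating question/answer entries
            yield from zip(conv[0::2], conv[1::2])
```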
Evaluation 🤗 XMedBench
EN:
ZH:
ES: Head_qa
FR: Frenchmedmcqa
HI: MMLU_HI
AR: MMLU_Ara
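The benchmarks above are multiple-choice question sets, so the headline metric is exact-match accuracy over option letters. A toy scorer is sketched below as an illustration of the metric only; it is not the paper's evaluation harness.

```python
def mcqa_accuracy(predictions, golds):
    """Case-insensitive exact-match accuracy over option letters."""
    if len(predictions) != len(golds):
        raise ValueError("predictions and golds must have equal length")
    hits = sum(p.strip().upper() == g.strip().upper()
               for p, g in zip(predictions, golds))
    return hits / len(golds)
```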
Waiting for Update
Please use the following citation if you intend to use our dataset for training or evaluation:
@misc{wang2024apollo,
title={Apollo: Lightweight Multilingual Medical LLMs towards Democratizing Medical AI to 6B People},
author={Xidong Wang and Nuo Chen and Junyin Chen and Yan Hu and Yidong Wang and Xiangbo Wu and Anningzhe Gao and Xiang Wan and Haizhou Li and Benyou Wang},
year={2024},
eprint={2403.03640},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
Install from pip and serve model
# Install SGLang from pip:
pip install sglang

# Start the SGLang server:
python3 -m sglang.launch_server \
--model-path "FreedomIntelligence/Apollo-6B" \
--host 0.0.0.0 \
--port 30000

# Call the server using curl (OpenAI-compatible API):
curl -X POST "http://localhost:30000/v1/completions" \
-H "Content-Type: application/json" \
--data '{
"model": "FreedomIntelligence/Apollo-6B",
"prompt": "Once upon a time,",
"max_tokens": 512,
"temperature": 0.5
}'