meditron

Medical language model pretraining

MEDITRON is a family of large language models (LLMs) aimed at democratizing access to medical knowledge. It comprises open-source models with 7B and 70B parameters, adapted specifically to the medical domain. The models build on Llama-2 and continue pretraining on a curated medical corpus, including selected PubMed articles and internationally recognized medical guidelines.
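As a minimal sketch of what "open-source" means in practice here, the published checkpoints can be loaded with the Hugging Face `transformers` library. The repository id `epfl-llm/meditron-7b` is the name under which the 7B checkpoint was released; downloading the weights requires accepting the model license on the Hub, so the loader below only defines the call and does not run it.

```python
# Hypothetical usage sketch, assuming the transformers library is installed
# and the MEDITRON license has been accepted on the Hugging Face Hub.
MODEL_ID = "epfl-llm/meditron-7b"  # 70B variant: "epfl-llm/meditron-70b"

def load_meditron(model_id: str = MODEL_ID):
    """Return (tokenizer, model) for a MEDITRON checkpoint.

    Weights are downloaded on first call (tens of GB for the 70B variant),
    so the import and the download are deferred until the function runs.
    """
    from transformers import AutoModelForCausalLM, AutoTokenizer
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id)
    return tokenizer, model
```

After loading, the pair can be used like any causal LM in `transformers` (tokenize a prompt, call `model.generate`, decode the output).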

Tags: Machine Learning, Medical, Natural Language
Key facts
  • Maturity: Technical
  • C4DT support: Inactive
  • Lab: Unknown

Natural Language Processing Lab

Prof. Antoine Bosselut

The NLP lab focuses on advanced NLP research areas such as knowledge representation, reasoning, narrative understanding, and biomedical NLP.

This page was last edited on 2024-02-20.