
Introducing MPT-7B: A New Standard for Open-Source, Commercially Usable LLMs

  • Article
  • May 5, 2023
  • #MachineLearning
www.mosaicml.com

Introducing MPT-7B, the latest entry in our MosaicML Foundation Series. MPT-7B is a transformer trained from scratch on 1T tokens of text and code. It is open source, available for commercial use, and matches the quality of LLaMA-7B. MPT-7B was trained on the MosaicML platform in 9.5 days with zero human intervention at a cost of ~$200k. Starting today, you can train, finetune, and deploy your own private MPT models, either starting from one of our checkpoints or training from scratch. For inspiration, we are also releasing three finetuned models in addition to the base MPT-7B: MPT-7B-Instruct, MPT-7B-Chat, and MPT-7B-StoryWriter-65k+, the last of which uses a context length of 65k tokens!
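To make the "start from one of our checkpoints" path concrete, here is a minimal Python sketch of loading the base model for inference. The Hugging Face repository id mosaicml/mpt-7b, the EleutherAI/gpt-neox-20b tokenizer, and the transformers API shown are assumptions about how the checkpoints are distributed rather than details from this announcement; MPT's custom architecture is the reason for trust_remote_code=True.

import transformers

# Assumed checkpoint location on the Hugging Face Hub (not stated in the announcement).
MODEL_ID = "mosaicml/mpt-7b"

# Assumption: MPT reuses the GPT-NeoX tokenizer rather than shipping its own.
tokenizer = transformers.AutoTokenizer.from_pretrained("EleutherAI/gpt-neox-20b")

# MPT defines custom modeling code, so transformers must be told to trust it.
model = transformers.AutoModelForCausalLM.from_pretrained(
    MODEL_ID,
    trust_remote_code=True,
)

# Quick smoke test: generate a short continuation of a prompt.
inputs = tokenizer("MosaicML trains large language models", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))

The same pattern would apply to the finetuned variants by swapping in the corresponding repository id; finetuning itself would go through the MosaicML platform or a standard training loop on top of this checkpoint.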

Mentions
Jesse Dodge @JesseDodge · May 5, 2023 (from Twitter)
Fantastic work from MosaicML open sourcing some large models!