Model Insights
One of the first instruction-tuned decoder models in the industry, released by MosaicML. It was built by finetuning MPT-7B on open-source datasets.
| Model Details | |
| --- | --- |
| Developer | MosaicML (Databricks) |
| License | CC-BY-SA-3.0 |
| Model parameters | 7B |
| Pretraining tokens | 1T |
| Release date | May 2023 |
| Supported context length | 2,048 tokens |
| Price for prompt tokens* | $0.15/million tokens |
| Price for response tokens* | $0.15/million tokens |

*Note: Data as of 11/14/2023
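Given the per-token prices listed above, a quick sketch of how to estimate the cost of a single request (the token counts below are illustrative, not from the source):

```python
# Prices from the table above (as of 11/14/2023), in USD per million tokens.
PROMPT_PRICE_PER_M = 0.15
RESPONSE_PRICE_PER_M = 0.15

def request_cost(prompt_tokens: int, response_tokens: int) -> float:
    """Estimate the USD cost of one request at the listed MPT-7b-instruct rates."""
    return (prompt_tokens * PROMPT_PRICE_PER_M
            + response_tokens * RESPONSE_PRICE_PER_M) / 1_000_000

# Example: a 1,500-token prompt with a 500-token response
print(f"${request_cost(1_500, 500):.6f}")  # → $0.000300
```

Since prompt and response tokens are priced identically here, the cost reduces to total tokens times $0.15 per million.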
Here's how MPT-7b-instruct performed across all three task types:
Digging deeper, here's a look at how MPT-7b-instruct performed across specific datasets:
💰 Cost insights
The model scores low across all task types. It is 13x cheaper than GPT-3.5 and 6x cheaper than the Llama 70B variant. We suggest using Zephyr-7b-beta instead of this model.