Jin Daily Ai Trivia : Mistral Medium 3 aims to reduce costs—but is it truly a SOTA model?
Mistral AI has released yet another "open" model (though not actually available to the public :P), and this time it comes with even more restrictions: it's only available for self-hosting by enterprise clients.
Here’s what they claim:
“Mistral Medium 3 outperforms leading open models like LLaMA 4 Maverick and even enterprise models such as Cohere Command R+. In terms of pricing, it beats cost leaders like DeepSeek V3 - both for API access and self-hosted setups.”
But the reality?
It's not looking great. In early benchmarks, it couldn't even outperform Qwen QwQ-32B. And with no open weights available to verify the cost claims, I doubt it's significantly cheaper than DeepSeek.
