Developers can modify and deploy Codestral Mamba from its GitHub repository and through HuggingFace. It is available under the open-source Apache 2.0 license. Mistral claims that an earlier version of Codestral outperformed rival code generators such as CodeLlama 70B and DeepSeek Coder 33B. Code generation and coding assistants have become increasingly popular, with platforms like GitHub’s Copilot, Amazon’s CodeWhisperer, and Codeium gaining traction.
Mistral’s second model launch is Mathstral 7B, designed specifically for math-related reasoning and scientific discovery. Developed in collaboration with Project Numina, Mathstral has a 32K-token context window and outperforms other models built for math reasoning. It can achieve significantly better benchmark results when given more inference-time computation. Users can access Mathstral through Mistral’s la Plateforme and HuggingFace.
Mistral, known for its openly available models, competes with AI developers such as OpenAI and Anthropic. It recently raised $640 million in Series B funding at a valuation of close to $6 billion. Tech giants including Microsoft and IBM have also invested in the company.
Overall, Mistral’s new models, Codestral Mamba 7B and Mathstral 7B, demonstrate the company’s commitment to building powerful, efficient AI models for specific use cases. Both offer faster response times, longer context windows, and better performance than rival models, making them valuable tools for programmers, developers, and users who need math-focused reasoning. With its recent funding and backing from tech giants, Mistral is poised to make a significant impact on the AI industry.