Magistral Small
Magistral Small is Mistral's entry-level reasoning model, bringing structured chain-of-thought reasoning to a compact, cost-effective package. It's designed for applications that benefit from deliberate, step-by-step reasoning without the compute overhead of larger thinking models.
Magistral Small produces visible reasoning traces that show its work, making it particularly useful for educational applications, explainable AI systems, and any use case where users need to verify the model's logic. It handles arithmetic, basic logic puzzles, and structured problem-solving reliably.
Key Features
Structured chain-of-thought reasoning with visible thinking traces
Cost-effective reasoning — deliberate thinking at small-model prices
128K token context for processing longer analytical problems
Good performance on arithmetic, logic, and structured problem-solving
Fast time-to-first-token compared to larger reasoning models
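Because the model returns visible thinking traces alongside its answer, client code often wants to separate the trace from the final response. A minimal sketch in Python, assuming the trace is delimited by `<think>...</think>` tags in the response text (the delimiter is an assumption for illustration; check the actual output format of your deployment):

```python
import re

def split_reasoning(text: str) -> tuple[str, str]:
    """Split a model response into (reasoning trace, final answer).

    Assumes the visible trace is wrapped in <think>...</think> tags;
    the real delimiter may differ depending on how the model is served.
    """
    match = re.search(r"<think>(.*?)</think>", text, flags=re.DOTALL)
    if match is None:
        # No visible trace found: treat the whole text as the answer.
        return "", text.strip()
    trace = match.group(1).strip()
    answer = (text[: match.start()] + text[match.end():]).strip()
    return trace, answer

sample = "<think>17 * 3 = 51, plus 4 is 55.</think>The answer is 55."
trace, answer = split_reasoning(sample)
```

Keeping the trace separate lets you show it on demand (for tutoring or grading UIs) while displaying only the final answer by default.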
Ideal Use Cases
Educational tutoring with step-by-step explanations
Lightweight analytical queries — quick math, logic checks, fact verification
Explainable AI applications where reasoning transparency is required
Automated grading and assessment with visible scoring rationale
Technical Specifications
| Specification | Value |
| --- | --- |
| Context Window | 128K tokens |
| Modality | Text → Text |
| Provider | Mistral |
| Category | Reasoning |
| Thinking Mode | Structured chain-of-thought |
| Latency | Low — optimized for interactive use |
| Best For | Cost-effective reasoning tasks |
API Usage
```bash
curl -X POST https://api.vincony.com/v1/chat/completions \
  -H "Authorization: Bearer YOUR_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "mistral/magistral-small",
    "messages": [
      { "role": "user", "content": "Hello, Magistral Small!" }
    ]
  }'
```
Replace YOUR_API_KEY with your Vincony API key. The endpoint is OpenAI-compatible, so it works with any OpenAI SDK.
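The same call can be sketched from Python using only the standard library. The endpoint URL and model id come from the example above; the `VINCONY_API_KEY` environment variable name and the helper functions are illustrative assumptions, and any OpenAI SDK pointed at this base URL would work equally well:

```python
import json
import os
import urllib.request

# Endpoint and model id as shown in the curl example above.
API_URL = "https://api.vincony.com/v1/chat/completions"

def build_chat_request(prompt: str, model: str = "mistral/magistral-small"):
    """Build (headers, payload) for an OpenAI-style chat completion call."""
    headers = {
        # VINCONY_API_KEY is an assumed env var name for this sketch.
        "Authorization": f"Bearer {os.environ.get('VINCONY_API_KEY', 'YOUR_API_KEY')}",
        "Content-Type": "application/json",
    }
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return headers, payload

def send_chat(prompt: str) -> str:
    """Send the request and return the assistant's reply (performs a network call)."""
    headers, payload = build_chat_request(prompt)
    req = urllib.request.Request(
        API_URL, data=json.dumps(payload).encode(), headers=headers
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]

headers, payload = build_chat_request("Hello, Magistral Small!")
```

Calling `send_chat("Hello, Magistral Small!")` with a valid key returns the model's reply text from the standard OpenAI-style `choices[0].message.content` field.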
Try Magistral Small now
Start using Magistral Small instantly — 100 free credits, no credit card required. Access 343+ AI models through one platform.
More from Mistral
Devstral 2
Top-tier agentic coding model with 256K context, multi-file understanding, and autonomous planning.
Devstral Small 2
Second-gen compact code model with improved contextual awareness.
Devstral Small
Original lightweight code assistant optimized for low-latency autocomplete.
Mistral Large 3
Flagship 128K-context enterprise model with strong multilingual fluency.