Temperature
A setting that controls how creative or conservative AI output is. Low temperature = predictable and focused. High temperature = varied and creative.
Temperature is a parameter that controls the randomness of AI output. It determines how adventurous the model is when selecting the next word in its response. The metaphor comes from thermodynamics — higher temperature means more molecular movement, and in AI, more variability in output.
How temperature works
When an AI model generates text, it calculates a probability for every possible next token. Temperature adjusts these probabilities:
- Low temperature (0.0-0.3): The model strongly favours the highest-probability tokens. Output is consistent, focused, and predictable. Ask the same question twice and you will get nearly identical answers.
- Medium temperature (0.4-0.7): A balance between consistency and creativity. The model mostly picks high-probability tokens but occasionally selects less likely alternatives.
- High temperature (0.8-1.0+): The model gives more weight to lower-probability tokens. Output is more varied, creative, and surprising — but also more likely to be inconsistent or drift off-topic.
At temperature 0, the model is effectively deterministic — it always picks the single most probable next token. At temperature 1, the probability distribution is used as-is. Above 1, the distribution is flattened further, giving lower-probability tokens relatively more weight.
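The scaling described above can be sketched in a few lines of Python. This is a toy illustration with made-up scores, not real model output: logits are divided by the temperature before being converted to probabilities, so low temperatures concentrate probability on the top token and high temperatures spread it out.

```python
import math

def softmax_with_temperature(logits, temperature):
    """Convert raw model scores (logits) into probabilities,
    scaled by temperature. Toy values, not real model output."""
    scaled = [l / temperature for l in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

# Toy logits for four candidate next tokens
logits = [4.0, 2.0, 1.0, 0.5]

for t in (0.2, 1.0, 2.0):
    probs = softmax_with_temperature(logits, t)
    # Low t: nearly all probability on the first token.
    # High t: probability spreads across all four tokens.
    print(t, [round(p, 3) for p in probs])
```

Running this shows the top token's probability shrinking as temperature rises, which is exactly the "more weight to lower-probability tokens" behaviour described above.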
Choosing the right temperature
The right temperature depends on what you are doing:
Use low temperature (0.0-0.3) for:
- Factual question answering
- Data extraction and formatting
- Code generation
- Legal or compliance documents
- Anything where accuracy and consistency are paramount

Use medium temperature (0.4-0.7) for:
- Business writing (emails, reports, presentations)
- Summarisation
- General-purpose assistant conversations
- Content that needs to sound natural without being unpredictable

Use high temperature (0.8-1.0) for:
- Brainstorming and ideation
- Creative writing (stories, poetry, advertising copy)
- Generating diverse options or alternatives
- Exploratory conversations where you want surprising perspectives
Temperature in practice
Most AI interfaces do not expose temperature as a setting — the platform chooses a default that works for general conversation (typically 0.5-0.7). However, if you access AI through an API or a tool that exposes the parameter, understanding temperature lets you tune output for your specific task.
Some practical observations:
- Setting temperature to 0 does not guarantee perfectly consistent output — there are other sources of variation — but it minimises randomness.
- Very high temperatures (above 1.2) tend to produce incoherent or nonsensical output.
- The optimal temperature is task-specific. There is no universally "best" setting.
Related parameters
Temperature is one of several parameters that control AI output. Others include:
- Top-p (nucleus sampling): Instead of reshaping the whole distribution, this restricts the model to the smallest set of most likely tokens whose probabilities add up to p. Top-p 0.9 means the model samples only from the most likely tokens that together account for 90% of the probability mass.
- Top-k: Limits the model to the k most likely tokens.
- Frequency penalty: Reduces the likelihood of the model repeating words or phrases.
These parameters interact with each other. In practice, most users only need to think about temperature, and many tasks work well with the default setting.
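As a rough sketch of how the top-k and top-p filtering described above work, the toy function below trims a token probability distribution and renormalises what remains. The token names and probabilities are invented for illustration, not taken from any real model:

```python
def top_k_top_p_filter(probs, top_k=None, top_p=None):
    """Filter a token->probability distribution the way top-k and
    top-p (nucleus) sampling do, then renormalise. Toy illustration."""
    items = sorted(probs.items(), key=lambda kv: kv[1], reverse=True)
    if top_k is not None:
        items = items[:top_k]           # keep only the k most likely tokens
    if top_p is not None:
        kept, cumulative = [], 0.0
        for token, p in items:
            kept.append((token, p))
            cumulative += p
            if cumulative >= top_p:     # smallest set covering top_p mass
                break
        items = kept
    total = sum(p for _, p in items)
    return {token: p / total for token, p in items}

probs = {"the": 0.5, "a": 0.3, "an": 0.15, "xylophone": 0.05}
print(top_k_top_p_filter(probs, top_p=0.9))  # drops the unlikely tail
```

Note the difference from temperature: temperature reshapes every token's probability, while top-k and top-p simply cut off the unlikely tail and leave the survivors' relative weights unchanged.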
Why This Matters
Temperature is the most accessible lever for shaping AI output. If your AI responses feel generic and safe, raising the temperature can produce more interesting results. If your AI is making things up or going on tangents, lowering the temperature can help rein it in. Even if you never touch the setting directly, knowing that temperature exists helps you understand why AI sometimes produces wildly different responses to similar prompts — and how to get more consistent results.
Continue learning in Essentials
This topic is covered in our lesson: Why Your Prompts Fail (And How to Fix Them)