Inference

In artificial intelligence (AI), inference refers to the process by which a trained machine learning model makes predictions or draws conclusions from new, unseen data. Unlike training, inference does not require labeled examples of the correct output: the model applies what it has already learned to make decisions. In essence, inference is the trained model in action. For example, a self-driving car recognizing a stop sign on a road it has never encountered before demonstrates inference: the model identifies the stop sign in a new setting, using its learned knowledge to make a decision in real time.
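The training/inference split can be sketched with a toy classifier. This is a hypothetical illustration (a 1-nearest-neighbor model, not anything from this roadmap): `train` sees labeled examples once, and `infer` is then applied to a new, unseen input without any label.

```python
# Toy sketch of training vs. inference using a
# 1-nearest-neighbor classifier (illustrative only).

def train(examples):
    # "Training" here simply stores the labeled examples:
    # a list of (feature_vector, label) pairs.
    return list(examples)

def infer(model, point):
    # Inference: apply the learned model to a new input
    # and return a prediction -- no label is provided.
    def sq_dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    nearest = min(model, key=lambda ex: sq_dist(ex[0], point))
    return nearest[1]

# Training phase: the model sees labeled data.
model = train([((0.0, 0.0), "not_stop_sign"),
               ((1.0, 1.0), "stop_sign")])

# Inference phase: a never-before-seen input is classified.
print(infer(model, (0.9, 0.8)))  # -> stop_sign
```

Real systems (e.g. the stop-sign detector in the example above) use far larger learned models, but the shape is the same: an expensive training phase, then cheap repeated calls to the inference step on fresh inputs.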

Visit the following resources to learn more: