Feature Scaling & Normalization

Feature scaling is a preprocessing technique in machine learning that transforms numerical features onto a common scale, so that no feature dominates the model simply because its values span a larger range. This is particularly important for algorithms sensitive to feature magnitudes, such as gradient descent-based methods. The two key techniques are standardization, which transforms each feature to a mean of 0 and a standard deviation of 1 and is generally less affected by outliers, and normalization (min-max scaling), which rescales each feature to a fixed range, typically 0 to 1.
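As a minimal sketch, both techniques can be applied column-wise with NumPy (the toy feature matrix below is an illustrative assumption, not data from the original text):

```python
import numpy as np

# Toy feature matrix: two features on very different scales
# (hypothetical example data, chosen for illustration).
X = np.array([[1.0, 1000.0],
              [2.0, 1500.0],
              [3.0, 2000.0]])

# Standardization (z-score): each feature ends up with mean 0, std 1
standardized = (X - X.mean(axis=0)) / X.std(axis=0)

# Normalization (min-max): each feature is rescaled to the [0, 1] range
normalized = (X - X.min(axis=0)) / (X.max(axis=0) - X.min(axis=0))

print(standardized)
print(normalized)
```

In practice, libraries such as scikit-learn provide `StandardScaler` and `MinMaxScaler` for the same operations, with the added benefit that statistics are fit on the training set and reused on test data.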

Visit the following resources to learn more: