# Parametric vs Non-Parametric Models

## Parametric Models

### Definition
A parametric model assumes a fixed functional form and has a fixed number of parameters regardless of the size of the dataset.
**Examples:**

- Linear Regression
- Logistic Regression
- Naive Bayes
- Neural Networks (fixed architecture)
### Key properties

- Assumes a specific shape (e.g., a line or logistic curve)
- Model complexity is fixed
- Fast to train, easy to interpret
- Needs fewer data points
- Risk of underfitting if the true function is more complex
### Example

Linear regression: $y = w_1 x + w_0$. Only 2 parameters ($w_1$, $w_0$) no matter how much data you have.
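A minimal NumPy sketch of this point (the function name `fit_linear` is illustrative): however large the dataset grows, the fitted model is always exactly two numbers.

```python
import numpy as np

def fit_linear(x, y):
    """Least-squares fit of y = w1*x + w0; always returns exactly 2 parameters."""
    w1, w0 = np.polyfit(x, y, deg=1)  # highest-degree coefficient first
    return w1, w0

rng = np.random.default_rng(0)
for n in (10, 1_000, 100_000):
    x = rng.uniform(-1, 1, n)
    y = 3 * x + 2 + rng.normal(0, 0.1, n)  # noisy samples of y = 3x + 2
    params = fit_linear(x, y)
    print(n, len(params))  # parameter count stays at 2 for every dataset size
```

More data only makes the two estimates more precise; it never changes the shape or size of the model.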
## Non-Parametric Models

### Definition
A non-parametric model does not assume a fixed functional form. The number of parameters grows with data, allowing the model to become more complex as data increases.
**Examples:**

- k-Nearest Neighbors
- Decision Trees
- Random Forest
- Gaussian Processes
- Kernel SVM
- Histogram or Kernel Density Estimation
### Key properties

- Flexible; the data determines the model's shape
- Can learn very complex patterns
- Needs more data to generalize well
- Higher computation cost
- Risk of overfitting without regularization
### Example

k-Nearest Neighbors (kNN): Prediction depends on stored data points. More data → the model becomes larger and more complex.
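A toy 1-D kNN regressor makes this concrete (the class below is an illustrative sketch, not a library API): "training" just memorizes the data, so the model's size is literally the training set's size.

```python
import numpy as np

class KNNRegressor:
    """Illustrative 1-D kNN regressor: the stored training set IS the model."""

    def __init__(self, k=3):
        self.k = k
        self.X = None
        self.y = None

    def fit(self, X, y):
        # "Training" is just memorizing the data.
        self.X = np.asarray(X, dtype=float)
        self.y = np.asarray(y, dtype=float)
        return self

    def predict(self, x):
        # Average the targets of the k nearest stored points.
        dists = np.abs(self.X - x)
        nearest = np.argsort(dists)[: self.k]
        return self.y[nearest].mean()

    def model_size(self):
        # Number of stored values -- grows with every new training point.
        return self.X.size

model = KNNRegressor(k=3).fit([0, 1, 2, 3], [0, 1, 4, 9])
print(model.predict(2.1))   # averages the targets of the 3 closest points
print(model.model_size())   # 4 -- add more training data and this grows
```

Contrast this with the linear-regression example above: there the model stays two numbers forever; here every extra training point makes the model bigger.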
## Side-by-Side Comparison

| Aspect | Parametric | Non-Parametric |
|---|---|---|
| Assumes fixed form? | Yes | No |
| Number of parameters | Fixed | Grows with data |
| Flexibility | Low to medium | High |
| Data requirement | Low | High |
| Computation | Fast | Slower |
| Risk | Underfitting | Overfitting |
| Examples | Linear/Logistic regression, Naive Bayes | kNN, Decision Trees, GPs, Random Forest |
## Intuitive Summary

- **Parametric = fixed recipe.** You decide the shape of the function; data only adjusts the parameters.
- **Non-parametric = flexible recipe.** The model adapts its shape based on how much data you provide; there is no fixed structure.