Inductive bias is any explicit or implicit assumption, or prior information, built into a model that permits it to generalize beyond the training data.
Examples of inductive bias:
In decision tree learning, shorter trees are preferred over longer ones.
In linear regression, the response variable (y) is assumed to depend linearly on the predictors (X).
More generally, the belief that the simplest hypothesis consistent with the data is more likely to be correct than a more complicated one (Occam's razor).
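The linear regression example above can be illustrated with a small sketch (an assumed example, not from the source, using only NumPy): whatever the data look like, the model can only represent a straight line y = w*x + b, and when that assumption matches the data-generating process, the fit generalizes well beyond the training inputs.

```python
import numpy as np

# Inductive bias of linear regression: the hypothesis space contains
# only straight lines y = w*x + b, an assumption about the data.
rng = np.random.default_rng(0)
X = np.linspace(0, 1, 50)
y = 2.0 * X + 1.0 + rng.normal(0, 0.05, size=X.shape)  # truly linear data

# Fit w and b by ordinary least squares.
A = np.column_stack([X, np.ones_like(X)])
(w, b), *_ = np.linalg.lstsq(A, y, rcond=None)

# Because the linearity assumption matches the data, the learned
# coefficients recover the true slope and intercept closely.
print(w, b)
```

If the true relationship were nonlinear, the same bias would cause systematic underfitting: the model cannot represent anything outside its assumed hypothesis space.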