As we know, cross-validation can be used for model error estimation as well as for model hyperparameter optimization.
If we are doing only one of these tasks, either the model’s error estimation or the model’s hyperparameter optimization, then plain (non-nested) k-fold cross-validation is sufficient.
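For concreteness, here is a minimal sketch of plain (non-nested) k-fold cross-validation used only for error estimation, assuming scikit-learn; the SVM classifier, the iris dataset, and the fixed hyperparameter values are arbitrary choices for illustration.

```python
# Non-nested k-fold cross-validation: estimate the error of a model
# whose hyperparameters are fixed in advance (illustrative values only).
from sklearn.datasets import load_iris
from sklearn.model_selection import KFold, cross_val_score
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)

cv = KFold(n_splits=5, shuffle=True, random_state=0)
scores = cross_val_score(SVC(C=1.0, kernel="rbf"), X, y, cv=cv)
print("Non-nested CV accuracy: %.3f" % scores.mean())
```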
However, if cross-validation is used simultaneously for the model’s error estimation AND its hyperparameter optimization, nested cross-validation is required.
This is because, if the same cross-validation splits are used both to optimize the hyperparameters and to estimate the model’s error, the hyperparameters end up tuned to those very splits, so the resulting error estimate tends to be optimistically biased.
One way to remove this bias is to nest the hyperparameter optimization cross-validation procedure (the inner loop) inside the error estimation cross-validation procedure (the outer loop).
This is called nested cross-validation.
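Below is a minimal sketch of nested cross-validation, again assuming scikit-learn; the parameter grid, fold counts, and dataset are placeholder choices for illustration. The inner loop (GridSearchCV) performs the hyperparameter optimization, while the outer loop (cross_val_score) estimates the error of the whole tuning procedure.

```python
# Nested cross-validation: the inner CV tunes hyperparameters,
# the outer CV estimates the error of the tuned model.
from sklearn.datasets import load_iris
from sklearn.model_selection import GridSearchCV, KFold, cross_val_score
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)

param_grid = {"C": [0.1, 1, 10], "gamma": [0.01, 0.1, 1]}
inner_cv = KFold(n_splits=3, shuffle=True, random_state=0)
outer_cv = KFold(n_splits=5, shuffle=True, random_state=0)

# Inner loop: hyperparameter optimization via grid search.
tuned_svm = GridSearchCV(SVC(kernel="rbf"), param_grid, cv=inner_cv)

# Outer loop: unbiased error estimation of the tuning procedure.
nested_scores = cross_val_score(tuned_svm, X, y, cv=outer_cv)
print("Nested CV accuracy: %.3f" % nested_scores.mean())
```

Each outer fold re-runs the entire inner grid search on its training portion, so the outer score measures the performance of the whole tuning procedure rather than of a single fixed hyperparameter setting.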
Please watch the video Nested vs Non-nested Cross-Validation in Machine Learning for a more detailed explanation.