September 25, 2023

Topics covered in today’s lecture:

  1. There are two resampling methods: (a) cross-validation and (b) the bootstrap.
  2. The difference between test error and training error: training error is often quite a bit lower than the test error.
  3. Training error is not a good surrogate for test error.
  4. Understand the concept of overfitting through graphs of test and training error.
  5. Bias – how far, on average, the model is from the truth.
  6. Variance – how much the estimate varies around its average.
  7. Bias and variance together give the prediction error.
  8. K-fold cross-validation – a technique used to estimate the test error.
  9. In this technique, the data is divided into K equal-sized parts (folds).
  10. True and estimated test mean squared error for the data.
  11. An issue with cross-validation: the estimate of prediction error is biased upward.
  12. Bias can be minimized, but in that situation the variance will be high.
  13. Understand when we should use cross-validation.
  14. Right and wrong ways to apply cross-validation.
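The K-fold procedure from items 8–9 can be sketched in code: shuffle the data, split it into K equal-sized folds, hold out one fold at a time, fit on the rest, and average the held-out mean squared errors. This is a minimal Python sketch on made-up toy data (the function names, the simple least-squares line fit, and the dataset are illustrative, not from the lecture):

```python
import random

def k_fold_indices(n, k, seed=0):
    """Shuffle indices 0..n-1 and split them into k roughly equal folds."""
    idx = list(range(n))
    random.Random(seed).shuffle(idx)
    return [idx[i::k] for i in range(k)]

def fit_line(xs, ys):
    """Ordinary least squares for y = a + b*x (closed-form simple regression)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    b = sxy / sxx
    return my - b * mx, b

def cv_mse(xs, ys, k=5):
    """Estimate test error: average held-out mean squared error over k folds."""
    errs = []
    for fold in k_fold_indices(len(xs), k):
        hold = set(fold)
        tr_x = [x for i, x in enumerate(xs) if i not in hold]
        tr_y = [y for i, y in enumerate(ys) if i not in hold]
        a, b = fit_line(tr_x, tr_y)
        mse = sum((ys[i] - (a + b * xs[i])) ** 2 for i in fold) / len(fold)
        errs.append(mse)
    return sum(errs) / k

# Toy data (hypothetical): y = 2x + Gaussian noise with sd 0.5.
rng = random.Random(1)
xs = [i / 10 for i in range(100)]
ys = [2 * x + rng.gauss(0, 0.5) for x in xs]
print(cv_mse(xs, ys, k=5))
```

Because every point is predicted by a model that never saw it, the averaged error tracks the test error rather than the (optimistic) training error, which is why K-fold is used as the estimate in item 8.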
