How does a model with very high capacity tend to shape its decision boundary during training?
Answer
It contorts its decision boundary to wrap tightly around nearly every individual training point, including noise and outliers.
Because an excessively flexible model can afford highly specific rules for individual training instances, its boundary snakes precisely around each data point instead of capturing the general pattern.
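The effect is easy to observe by comparing a high-capacity model against a constrained one on noisy data. The sketch below is illustrative rather than from the original answer: it assumes scikit-learn, uses make_moons with added label noise as stand-in data, and contrasts an unconstrained decision tree with a depth-limited one. The unconstrained tree typically reaches near-perfect training accuracy by bending its boundary around the noise, while scoring noticeably worse on held-out data.

```python
# Minimal sketch: an unconstrained (high-capacity) decision tree bends its
# boundary around every noisy training point, while a depth-limited one
# captures only the broad pattern. The library choice (scikit-learn) and all
# parameter values are illustrative assumptions, not from the source.
from sklearn.datasets import make_moons
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Two interleaving half-moons with substantial noise stand in for real
# data containing outliers.
X, y = make_moons(n_samples=300, noise=0.35, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.5, random_state=0
)

for name, depth in [("high capacity (no depth limit)", None),
                    ("limited capacity (max_depth=3)", 3)]:
    model = DecisionTreeClassifier(max_depth=depth, random_state=0)
    model.fit(X_train, y_train)
    print(f"{name}: train acc = {model.score(X_train, y_train):.2f}, "
          f"test acc = {model.score(X_test, y_test):.2f}")

# Typical outcome: the unconstrained tree hits ~1.00 training accuracy but a
# clearly lower test accuracy (its boundary snaked around the noise), while
# the shallow tree scores similarly on both sets.
```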

Related Questions
- What condition defines the presence of overfitting in a machine learning model?
- What significant performance disparity is a hallmark symptom of overfitting?
- What primary model characteristic sets the potential for overfitting?
- How does insufficient training data volume contribute to the overfitting problem?
- What specific risk arises from poor data quality when using an overly complex model?
- In iterative training algorithms, what diagnostic observation signals the onset of overfitting due to prolonged training?
- How does high dimensionality in the feature space increase susceptibility to overfitting?
- Within the bias-variance tradeoff, what characteristic fundamentally describes an overfit model?
- According to the error comparison table, what primary issue characterizes a model where both training error and test error are high?