Principle v1
Track calibration by recording confidence levels on predictions and measuring correspondence between stated confidence and actual frequency of correctness.
Why This Is a Principle
This PRINCIPLE operationalizes the insights of Domain-Specific Calibration Development (calibration develops through feedback) and Systematic Overconfidence Taxonomy (overconfidence is systematic), addressing the problem identified in Hindsight Bias and Calibration Necessity (hindsight bias). It prescribes HOW to build calibration (systematically track predictions against outcomes) rather than merely stating that calibration exists.
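A minimal sketch of what "track predictions vs outcomes systematically" could look like in practice: log each prediction's stated confidence along with whether it proved correct, then bucket the records by confidence level and compare stated confidence to observed accuracy per bucket. The `CalibrationLog` class and its bin width are illustrative choices, not prescribed by the principle.

```python
from bisect import bisect_right
from collections import defaultdict

EDGES = [i / 10 for i in range(11)]  # bin edges 0.0, 0.1, ..., 1.0

class CalibrationLog:
    """Record stated confidence vs. actual correctness, then compare per bin."""

    def __init__(self):
        self._records = []  # (stated_confidence, was_correct) pairs

    def record(self, confidence, correct):
        """Log one prediction: confidence in [0, 1] and whether it proved correct."""
        self._records.append((confidence, bool(correct)))

    def report(self):
        """Map each confidence bin's lower edge to (count, observed accuracy).

        Well-calibrated predictions show observed accuracy close to the
        bin's stated confidence, e.g. roughly 90% correct in the 0.9 bin.
        """
        bins = defaultdict(list)
        for conf, correct in self._records:
            idx = min(bisect_right(EDGES, conf) - 1, 9)  # clamp 1.0 into top bin
            bins[EDGES[idx]].append(correct)
        return {edge: (len(v), sum(v) / len(v)) for edge, v in sorted(bins.items())}

log = CalibrationLog()
log.record(0.9, True)
log.record(0.9, False)
log.record(0.7, True)
print(log.report())  # {0.7: (1, 1.0), 0.9: (2, 0.5)}
```

Here the 0.9 bin shows only 50% observed accuracy, the kind of confidence/correctness gap this principle is designed to surface.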