Calibration of Judges
I'm a bit new at this, but I am a CBJ and have cooked my first contest. It seems to me that judges could be calibrated by taking their scores and, after a few contests, giving them a report on how they stack up against the rest of the judging population. I remember being paranoid the first time I judged about whether I was in line with the others or was going to be the #6 that got thrown out.

I just had an experience where I got horribly inconsistent judging. How can you get 999866? Other folks at the same contest had similar observations.

I think a calibration tool for judges would go a long way toward fairness and objectivity. Has this ever been considered? Or was it too time consuming and/or expensive?

I have been involved with Statistical Process Control for years in a manufacturing environment. It seems to me that judges falling outside of one standard deviation should be informed, whether high or low.
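Just to make the SPC idea concrete, here's a rough sketch of the kind of check I have in mind. This is not an existing tool; the function name, the one-sigma threshold, and reading 999866 as six judge scores (9, 9, 9, 8, 6, 6) are my own illustrative choices:

```python
# Minimal sketch of an SPC-style calibration check for one table's
# scores on a single entry. Threshold of 1 standard deviation is
# illustrative only; a real tool would aggregate over many contests.
from statistics import mean, stdev

def flag_outliers(scores, n_sigma=1.0):
    """Return judges whose score falls more than n_sigma standard
    deviations from the table mean, with the direction of the drift."""
    mu = mean(scores.values())
    sigma = stdev(scores.values())
    flags = {}
    for judge, s in scores.items():
        if sigma > 0 and abs(s - mu) > n_sigma * sigma:
            flags[judge] = "high" if s > mu else "low"
    return flags

# The 999866 spread from above, read as six individual judge scores:
table = {"J1": 9, "J2": 9, "J3": 9, "J4": 8, "J5": 6, "J6": 6}
print(flag_outliers(table))  # flags J5 and J6 as "low"
```

Run this over a judge's whole season instead of a single entry and you'd have exactly the kind of report card I'm describing.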
Two Modified WSM 22, iQue 110, CyberQ, Traeger Texas, Smokey Joe, Weber Gasser