PDA

View Full Version : Calibration of Judges


accuseal
05-25-2013, 01:55 PM
I'm a bit new at this but I am a CBJ and have cooked my first contest. It seems to me that judges could be calibrated by taking their scores and, after a few contests, giving them a report on how they stack up against the rest of the judging population. I remember the first time I judged being paranoid about whether I was in line with the others or was going to be the #6 that was thrown out. I just had an experience where I got horribly inconsistent judging. How can you get 999866? Other folks at the same contest had similar observations. I think a calibration tool for the judges would go a long way towards fairness and objectivity. Has this ever been considered? Or was it too time consuming and/or expensive? I have been involved with Statistical Process Control for years in a manufacturing environment. It seems to me that judges falling more than a standard deviation outside the mean should be informed, whether high or low.
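For the curious, the SPC idea above can be sketched in a few lines of Python. The one-standard-deviation threshold is just an illustration of the concept, not any actual KCBS rule:

```python
from statistics import mean, stdev

def flag_outliers(scores, k=1.0):
    """Flag judges whose score sits more than k standard deviations
    from the table mean for one entry (k=1 is an illustrative threshold)."""
    m, s = mean(scores), stdev(scores)
    return [(judge, x) for judge, x in enumerate(scores, start=1)
            if abs(x - m) > k * s]

# The 999866 entry mentioned above:
print(flag_outliers([9, 9, 9, 8, 6, 6]))  # flags judges 5 and 6
```

Run across a season's worth of score sheets, a report like this could show each judge how often they land outside the table's spread.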

drbbq
05-26-2013, 12:01 PM
IMO that's the kind of thinking that is needed. But no cooks ever want to accept anything but 8s and 9s and an occasional 7, so all that's ever done is nudge judges into using only the top three numbers. No point in trying to sort that out. I'm guessing a real statistician would want to use most of the scale.

Smoke'n Ice
05-26-2013, 02:58 PM
Rumor has it that the new "Score" program, while very limited in its ability, has this as a feature. It can't keep track of comment cards, hence their elimination, but it can compare judges and tables. Remember this is just rumor, and maybe someone on the software committee could post the SRS if one exists.

carlyle
05-26-2013, 03:21 PM
Not on the committee or the board. I was there at Kansas City in January when there was a presentation about the new program.

New software is supposed to track judges. That is why score cards now have table and seat number plus judge number.

This kind of feedback is essential. Everybody needs to get an idea of how they are doing. Outliers, either too high or too low, need to be identified and informed, and efforts made to rehabilitate them.

Right now - at least at our contests - I encourage talk between judges after cards have been turned in. Table captains keep their eyes open, and reps have a quiet talk with judges who are more than 2 points off from the rest of the table. Sometimes there's a valid reason, sometimes not.

Before the contest, I pre-assign tables in an effort to balance experience across them. I do not want a hot table or a cold table. Right now my only tools are my previous experiences with the judges, plus their experience level. With inexperienced judges or judges I don't know, I can fill them in around known quantities.

When people are involved who are doing subjective tasks - either cooking or judging - the idea that you can calibrate them like a piece of machinery or a robot is an illusion, IMO.

Does not mean we should not make efforts to improve the status quo.

I am hopeful that the new scoring system will take a step in the right direction.

Time will tell.

Smokin' D
05-26-2013, 03:42 PM
I am a judge and some things are difficult to control. Some judges, just like regular people, like what they like and do not like what they don't like. Here are my rib taste scores from last weekend: 899949. Really? Texture: 997959. Appearance was pretty even, though. Looks like Judge #5 was in a hateful mood, or was judge #6 in disguise.

I do like the idea of calibrating the judging, if for no other reason than letting the outliers know they are off the norm.

ModelMaker
05-27-2013, 07:42 PM
Just had an experience where I got horribly inconsistent judging. How can you get 999866? ... It seems to me that Judges falling outside of standard deviation should be informed; whether high or low.


What makes you think the three 9s aren't the oddballs and the 8 was a gift?
Why is it so hard to accept that two judges thought your entry was just average and the rest were the dreaded high-scoring, meat-nibbling, cooler-toting sort?
I have no problem scoring your entry as average if it is lacking somewhere; the "kiss of death" 6 just means you need to up your game some. I'm not a 7-8-9 judge, but rest assured, if you give me an excellent piece of meat you get the 9. I give out lots of 9s and also a lot of 6s.
The day I have to justify a 6 is the day I just go back to eatin' and judgin' my own stuff, because a 6 is just average, not bad BBQ.
Ed

Porcine Perfection
06-10-2013, 12:10 PM
Here is my rib taste score from last weekend: 899949. Really? Texture: 997959.

Perfect example of why the low score is thrown out.
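For anyone unfamiliar with the rule being referenced: the lowest of the six judge scores is discarded before the totals are tallied. A minimal sketch (applied here to the raw criterion scores quoted above, and omitting any weighting the real scoring program may apply):

```python
def drop_low_total(scores):
    """Sum the five highest of six judge scores -- i.e. throw out
    the single lowest, as described above."""
    return sum(sorted(scores)[1:])

# The rib taste scores 899949: the stray 4 is discarded.
print(drop_low_total([8, 9, 9, 9, 4, 9]))  # 44
```

So one judge in a hateful mood costs the entry nothing, as long as there's only one of them.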