Admittedly, I am not the mathematician that many of you are, but I understand and agree that there is not enough of a constant in the equation to use the new data to predict a statistical probability of future outcomes. The new data can, however, provide an excellent analytical review of an event that has already taken place, and I think that is where its value is.
The new scoring system will allow me to see, for example, that of the 24 entries judged by table X, only 1 received a top-ten score in any category. Understandably, I don't get to sample any of the food, but I can further determine that of those 24 entries, 3 were submitted by current or former Jack/AR champs, 10 were from teams currently in the Top 50 in their respective category, 1 was a Sam's regional champion, and 5 others were from 3 teams with 11 combined GCs this year. Looking at the quality of the teams submitting these entries, I think it is very reasonable to conclude those results are more likely a scoring anomaly than a deviation in your process/recipe. By the same token, I can see that the brisket I way overcooked but still finished 2nd with was more than likely due to my hitting the table that put 14 entries into the top ten from 11 teams that have never had a top ten, rather than judges suddenly liking brisket cooked to 215°.
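The kind of table-by-table review described above is easy to automate once the score sheets are in hand. Here is a minimal sketch of the idea; the table names, team names, and counts are made up for illustration and only mirror the numbers mentioned in this thread, not real contest data:

```python
from collections import Counter

# Hypothetical entry records: (judging_table, team, scored_top_ten)
# table_x: 24 entries, only 1 top-ten score
# table_y: 24 entries, 14 top-ten scores
entries = (
    [("table_x", f"team_{i}", i == 0) for i in range(24)]
    + [("table_y", f"team_{24 + i}", i < 14) for i in range(24)]
)

def top_ten_rate(entries):
    """Return each table's share of entries that scored in the top ten."""
    totals = Counter()
    tops = Counter()
    for table, _team, is_top_ten in entries:
        totals[table] += 1
        if is_top_ten:
            tops[table] += 1
    return {table: tops[table] / totals[table] for table in totals}

rates = top_ten_rate(entries)
# A large gap between tables is a hint of a scoring anomaly, not proof:
# with samples this small, it is an after-the-fact review, not a prediction.
print(rates)
```

A rate near 4% at one table and near 58% at another, drawn from comparable fields of teams, is exactly the kind of gap that points at the table rather than at anyone's recipe.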
"What sort of people are these charcoal masters? They behaved badly and were unconcerned with appearances. Their hair was long and unkempt and their clothes were wrinkled and old. They drank beer to and from the crab house and they made rude noises while we cooked." Tao of Charcoal