Amended Nationals Appeals Policy

nicholasmaurer
Coach
Posts: 422
Joined: May 19th, 2017, 10:55 am
Division: Grad
State: OH
Has thanked: 1 time
Been thanked: 22 times

Re: Amended Nationals Appeals Policy

Post by nicholasmaurer »

Skink wrote:
SciNerd42 wrote:The problem (that has no solution) is that teams can tell if a mistake was made with building events (you knew you had a higher time or score but were ranked lower), but with academic events you have no evidence, and you just have to assume everyone was scored right.
Yeah. And, this is a feature, not a bug. There's a beast of a slippery slope the moment they open up any subjectively scored events to prying eyes.
Yes and no. There are many valid reasons for wanting to keep the scoring process - particularly of test events - as a black box. If, for example, tests were passed back or raw scores were released, competitors would no doubt find numerous mistakes in answer keys, point totals, and rubrics. Some of the ensuing criticisms would be entirely valid. Others would likely be nit-picky, unreasonable, or plain wishful thinking.

However, I did have an experience at a National Tournament where I placed far worse (think 30+ places lower) in a test event than was reasonably possible. There is, of course, no "compelling evidence" to support this claim; it truly does rely on what the arbitration policy specifically excludes: "thinking that your team 'did better'." The difference in my performance from both that year's Ohio State Tournament, and the previous National Tournament, was far too wide to be entirely explained by a bad day or careless errors on my part. Most likely, a page of my exam was lost during scoring, or a math error was made when totaling points. As an ES, I know these mistakes happen more often than anyone would like.

Unfortunately, the current system offers no redress, or even closure/understanding, for competitors in this situation. I am not necessarily sure there is a better option available. When, as an ES or Tournament Director, you are handling thousands (or hundreds of thousands) of data points, many will doubtless be in error. However, it does concern me that the opacity of the scoring for test events leaves little incentive for quality grading or careful totaling. I'm sure the overwhelming majority of supervisors are diligent and careful, and I am sure all have good intentions. But the skeptic in me is always wary of a system whose accountability relies on one principle: trust us.
Assistant Coach and Alumnus ('14) - Solon High School Science Olympiad
Tournament Director - Northeast Ohio Regional Tournament
Tournament Director - Solon High School Science Olympiad Invitational

Opinions expressed on this site are not official; the only place for official rules changes and FAQs is soinc.org.
windu34
Staff Emeritus
Posts: 1383
Joined: April 19th, 2015, 6:37 pm
Division: Grad
State: FL
Has thanked: 2 times
Been thanked: 40 times

Re: Amended Nationals Appeals Policy

Post by windu34 »

nicholasmaurer wrote:
Yes and no. There are many valid reasons for wanting to keep the scoring process - particularly of test events - as a black box. If, for example, tests were passed back or raw scores were released, competitors would no doubt find numerous mistakes in answer keys, point totals, and rubrics. Some of the ensuing criticisms would be entirely valid. Others would likely be nit-picky, unreasonable, or plain wishful thinking.

However, I did have an experience at a National Tournament where I placed far worse (think 30+ places lower) in a test event than was reasonably possible. There is, of course, no "compelling evidence" to support this claim; it truly does rely on what the arbitration policy specifically excludes: "thinking that your team 'did better'." The difference in my performance from both that year's Ohio State Tournament, and the previous National Tournament, was far too wide to be entirely explained by a bad day or careless errors on my part. Most likely, a page of my exam was lost during scoring, or a math error was made when totaling points. As an ES, I know these mistakes happen more often than anyone would like.

Unfortunately, the current system offers no redress, or even closure/understanding, for competitors in this situation. I am not necessarily sure there is a better option available. When, as an ES or Tournament Director, you are handling thousands (or hundreds of thousands) of data points, many will doubtless be in error. However, it does concern me that the opacity of the scoring for test events leaves little incentive for quality grading or careful totaling. I'm sure the overwhelming majority of supervisors are diligent and careful, and I am sure all have good intentions. But the skeptic in me is always wary of a system whose accountability relies on one principle: trust us.
Definitely agree with this. I have certainly made grading mistakes on tests, and I don't know how many went uncaught by me or the rest of my team. I have my own system for scoring tests for a study event:

1. I grade all of the FRQ/math/subjective questions
2. Volunteer #1 grades multiple choice and short answer
3. Volunteer #2 double-checks volunteer #1's grading and totals the points
4. Volunteer #3 checks volunteer #2's point totaling

I believe this is the most thorough way to score tests with a limited number of volunteers: every objective answer is looked at by two people, while each subjective answer is graded by only one person to keep partial credit consistent. I don't know first-hand how other supervisors grade their tests, but I suspect most don't use as much of an "assembly line."

The grading process for tests does need to be a black box, but that doesn't mean it cannot be standardized and optimized to minimize mistakes as much as possible.
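The double-entry totaling step in the workflow above (volunteer #3 checking volunteer #2's totals) can be sketched in code. This is purely illustrative; the function and data are hypothetical, not from any real tournament software:

```python
# Hypothetical sketch of double-entry totaling: two volunteers
# independently total each exam, and any disagreement is flagged
# for a hand re-check before scores are finalized.

def flag_total_mismatches(totals_pass1, totals_pass2):
    """Compare two independent point totals keyed by team number.

    Returns the set of team numbers whose totals disagree (or that
    appear in only one pass) and should be re-totaled by hand.
    """
    mismatches = set()
    for team in set(totals_pass1) | set(totals_pass2):
        if totals_pass1.get(team) != totals_pass2.get(team):
            mismatches.add(team)
    return mismatches

# Example: team 17's two totals disagree, so it gets re-checked.
pass1 = {12: 88.5, 17: 74.0, 23: 91.0}
pass2 = {12: 88.5, 17: 71.0, 23: 91.0}
print(flag_total_mismatches(pass1, pass2))  # {17}
```

The point of the design is that a mistake only survives if both volunteers make the same error independently, which is much less likely than a single arithmetic slip.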
Boca Raton Community High School Alumni
University of Florida Science Olympiad Co-Founder
Florida Science Olympiad Board of Directors
[email protected] || windu34's Userpage
Ender1982
Member
Posts: 41
Joined: February 4th, 2018, 8:10 pm
Division: C
State: IN
Has thanked: 0
Been thanked: 0

Re: Amended Nationals Appeals Policy

Post by Ender1982 »

Ideally, for every event (at all competitions, not just nationals), there would be two people in the room: one scorer and one double-checker. We all make mistakes, but the truth is that a lot of us have spent hours/days/months preparing for competition, and we are owed the chance to have our scores be accurate. I totally get that giving tests back would be a slippery slope, so the solution is just to have everything double-checked per event before it goes to scoring; then we as competitors can "trust the process."

Now, for this to happen, perhaps there should be an official "Event Supervisors" rule book. Many tournaments say the 1-2 pages of event rules are the only official rules, and things we discuss on this board don't get included. Science Olympiad puts out a ton of materials for event supervisors, but in general they are not read. Requiring that "event supervisors follow guidelines produced by Science Olympiad" would help minimize incorrect scoring, bad tests, limited questions, etc. This is especially important because we don't get the tests back, and we need to "trust the process."

Yes, I'm aware that even then, some event supervisors will do what they want (volunteers and all), but it would help minimize at least state/national test discrepancies and help us all "trust the process."
Unome
Moderator
Posts: 4336
Joined: January 26th, 2014, 12:48 pm
Division: Grad
State: GA
Has thanked: 235 times
Been thanked: 85 times

Re: Amended Nationals Appeals Policy

Post by Unome »

Ender1982 wrote:Now, for this to happen, perhaps there should be an official "Event Supervisors" rule book. Many tournaments say the 1-2 pages of event rules are the only official rules, and things we discuss on this board don't get included. Science Olympiad puts out a ton of materials for event supervisors, but in general they are not read. Requiring that "event supervisors follow guidelines produced by Science Olympiad" would help minimize incorrect scoring, bad tests, limited questions, etc. This is especially important because we don't get the tests back, and we need to "trust the process."
This assumes that such a book can be produced at good quality. Based on my experiences with the Event Logistics manual produced by NSO, I personally would take any such NSO resource with several grains of salt.
Userpage

Ender1982
Member
Posts: 41
Joined: February 4th, 2018, 8:10 pm
Division: C
State: IN
Has thanked: 0
Been thanked: 0

Re: Amended Nationals Appeals Policy

Post by Ender1982 »

Unome wrote:This assumes that such a book can be produced at good quality. Based on my experiences with the Event Logistics manual produced by NSO, I personally would take any such NSO resource with several grains of salt.
But at least it would be something "official" that can be critiqued and that all supervisors can be required to follow; and if they don't, proper arbitration can happen.
hippo9
Member
Posts: 271
Joined: March 12th, 2018, 9:35 am
Division: C
State: IN
Has thanked: 1 time
Been thanked: 6 times

Re: Amended Nationals Appeals Policy

Post by hippo9 »

I think he just wants there to be a system that allows for possible corrections in study events, because currently there is none.
2018: Battery Buggy, Road Scholar, Roller Coaster
2019: Chem Lab, Code, Disease, Fossils, Geo Maps, Sounds
2020 and 2021: Astro, Chem Lab, Code, Fossils, Geo Maps, Sounds

When you miss nats twice by a combined two points :|
Unome
Moderator
Posts: 4336
Joined: January 26th, 2014, 12:48 pm
Division: Grad
State: GA
Has thanked: 235 times
Been thanked: 85 times

Re: Amended Nationals Appeals Policy

Post by Unome »

Ender1982 wrote:
This assumes that such a book can be produced at good quality. Based on my experiences with the Event Logistics manual produced by NSO, I personally would take any such NSO resource with several grains of salt.
But at least it would be something "official" that can be critiqued and that all supervisors can be required to follow; and if they don't, proper arbitration can happen.
I intended my statement to be a little stronger than that. There are parts of the Event Logistics manual that I would consider outright counterproductive to running a good tournament (or at least, there were last year - I haven't looked at this year's manual).

Edit: I probably should be more specific. I refer in particular to the info on the number of helpers, which is significantly overstated for the vast majority of tournaments.
Userpage

chalker
Member
Posts: 2107
Joined: January 9th, 2009, 7:30 pm
Division: Grad
State: OH
Has thanked: 1 time
Been thanked: 56 times

Re: Amended Nationals Appeals Policy

Post by chalker »

Unome wrote:The obvious question is of course "what prompted this?"
Other than the fact that we have a webform now, nothing is really changing with the process. We are spelling out certain things more explicitly, but everything follows the same criteria and processes we have followed from year to year. I was the one who pushed for this to happen, for two reasons: 1. we didn't have a defined post-ceremony appeal process (which meant many coaches who were "in the know" would run me down after the ceremony to file an appeal, since appeals mostly filtered through me), and 2. many of the appeals we got were rejected because there was no evidence, or they were of the type where someone "thought" they should have done better.

Student Alumni
National Event Supervisor
National Physical Sciences Rules Committee Chair
nicholasmaurer
Coach
Posts: 422
Joined: May 19th, 2017, 10:55 am
Division: Grad
State: OH
Has thanked: 1 time
Been thanked: 22 times

Re: Amended Nationals Appeals Policy

Post by nicholasmaurer »

chalker wrote:2. many of the appeals we got were rejected because there was no evidence, or they were of the type where someone "thought" they should have done better.
I do wish there was a better process for handling these types of situations. If I remember correctly, your scoring system does include some useful metrics that can help spot outlier data (e.g. if an event should be ranked with scores low-to-high, but is accidentally entered as the reverse).

However, I wonder if some additional review could be done if, for example, a team that is otherwise placing consistently in the top 10 has one event entered where they placed 50th. Is it possible they genuinely did that poorly? Sure. Anyone can have an off day. But I would think that result would be something worth reviewing to make sure there wasn't a clerical error along the way. This is less practical at invitationals, but since State and National Tournament awards are typically at a set time later in the evening, it is more possible there. Just a thought - maybe this already happens behind the scenes and I'm just not aware!
Assistant Coach and Alumnus ('14) - Solon High School Science Olympiad
Tournament Director - Northeast Ohio Regional Tournament
Tournament Director - Solon High School Science Olympiad Invitational

chalker
Member
Posts: 2107
Joined: January 9th, 2009, 7:30 pm
Division: Grad
State: OH
Has thanked: 1 time
Been thanked: 56 times

Re: Amended Nationals Appeals Policy

Post by chalker »

nicholasmaurer wrote:
I do wish there was a better process for handling these types of situations. If I remember correctly, your scoring system does include some useful metrics that can help spot outlier data (e.g. if an event should be ranked with scores low-to-high, but is accidentally entered as the reverse).

However, I wonder if some additional review could be done if, for example, a team that is otherwise placing consistently in the top 10 has one event entered where they placed 50th. Is it possible they genuinely did that poorly? Sure. Anyone can have an off day. But I would think that result would be something worth reviewing to make sure there wasn't a clerical error along the way. This is less practical at invitationals, but since State and National Tournament awards are typically at a set time later in the evening, it is more possible there. Just a thought - maybe this already happens behind the scenes and I'm just not aware!
We do indeed look for major issues like reverse sort order, unexpected No Shows, etc. However, looking for more nuanced outliers, like a top-10 team placing low in an event, is much harder. For example, if you look at the Div C nationals results from last year, you'll see that:

1. The gold medal team had 2 events where they placed in the 20's
2. The silver medal team had 2 events where they placed in the 30's
3. The 6th place team had 3 events in the 20's, 1 in the 30's, and 1 in the 50's
4. The 8th place team had 2 events in the 20's and 3 in the 40's
5. Conversely, the 38th place team got a silver medal in one event

I don't see an obvious algorithm we can implement beyond our normal process of carefully checking things multiple times. Of course, if someone has a good idea, I'd be happy to try to implement it.
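For illustration only, one naive flag along these lines would compare each of a team's placements to that team's median placement. As the examples above show, any such threshold will mostly surface genuine bad days, so at best it could feed a human review list, never change a score. The function and numbers here are hypothetical:

```python
# Sketch of a possible outlier flag, not an endorsed algorithm:
# flag placements far from the team's median placement, measured
# in multiples of the median absolute deviation (MAD).

def review_candidates(placements, threshold=3.0):
    """placements: one team's event placements, e.g. [3, 5, 1, 52, 4].

    Returns the placements whose deviation from the team's median
    exceeds threshold * MAD, as candidates for a human re-check.
    """
    ranked = sorted(placements)
    median = ranked[len(ranked) // 2]
    deviations = sorted(abs(p - median) for p in placements)
    mad = deviations[len(deviations) // 2]
    if mad == 0:  # all placements identical; nothing stands out
        return []
    return [p for p in placements if abs(p - median) > threshold * mad]

# A consistently top-5 team with one 52nd-place result gets flagged.
print(review_candidates([3, 5, 1, 52, 4]))  # [52]
```

Even this toy version would flag several events per medal-contending team in the data described above, which is exactly the false-positive problem with automating the check.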

Student Alumni
National Event Supervisor
National Physical Sciences Rules Committee Chair