Balancing Difficulty and Accessibility in Test Writing

Adi1008
Moderator
Posts: 478
Joined: December 6th, 2013, 1:56 pm
Division: Grad
State: TX
Location: Austin, Texas

Balancing Difficulty and Accessibility in Test Writing

Postby Adi1008 » September 26th, 2018, 10:46 am

Hi all,

I'm writing tests (mainly for Astronomy) at a few tournaments. I've written tests for tournaments in the past, and I've always struggled with balancing difficulty with accessibility for less experienced teams, often leading to very lopsided score distributions.

As someone who has loved Astronomy for as long as I can remember, the last thing I want to do is write a test that discourages students from pursuing the subject further or makes them think Astronomy is too esoteric or difficult to understand. On the other hand, I remember the frustration I'd feel as a competitor when tests were too easy and failed to create any separation between competitors, especially among the top teams (and even more so when a trip to Nationals is at stake!). More importantly, I personally loved the thrill of taking a challenging test (e.g. Nationals Astronomy 2016, Princeton Astronomy 2017, etc.) that pushed me to my limits. Taking hard tests is what made me love Astronomy even more, and I want to recreate that feeling for everyone, including the best competitors, who might find easier tests boring or lackluster.

Over the past few years, I've come across some interesting takes on test writing, difficulty, format, and so on, such as prioritizing "gettable" questions over "gimme" ones, or writing longer tests with easier problems in the hope that the length/speed of the test will separate competitors rather than its difficulty. I'm curious what others think works best, and how their personal test-writing philosophies shape the tests they write.

In short, to others who write tests: how do you try to balance these two elements of test writing?
University of Texas at Austin '22
Seven Lakes High School '18
Beckendorff Junior High '14

JoeyC
Member
Posts: 232
Joined: November 7th, 2017, 1:43 pm
Division: C
State: TX

Re: Balancing Difficulty and Accessibility in Test Writing

Postby JoeyC » September 26th, 2018, 11:31 am

I have written tests on occasion, and I feel that while a few "gimme" questions are necessary to establish that the competitors have at least a basic knowledge of the topic (and if they don't, it will prompt them to learn), the most important parts of the test should be in-depth questions that require a strong understanding of the principles of the subject: application questions. As a test taker, easy tests disappoint me and don't prompt me to learn anything; only when something hard and unexpected appears am I pushed to up my game.
Ohayo!
Dynamic Planet, Protein Modeling, Fast Facts, Thermodynamics
Dynamic Planet, Compound Machines, Chem Lab, WaQua (maybe), Ornith (maybe)
1 Corinthians 13:4-7
Scientia Potentia Est

windu34
Moderator
Posts: 1339
Joined: April 19th, 2015, 6:37 pm
Division: Grad
State: FL
Location: Gainesville, Florida

Re: Balancing Difficulty and Accessibility in Test Writing

Postby windu34 » September 26th, 2018, 11:46 am

I too have been exploring this subject with much interest, and I plan to try a multi-part question format for the next test I write. I think the best tests help students learn how to actually apply their knowledge by asking "real-world scenario" problems that really make them think. I plan to try a format with 8-15 questions of 4-6 parts each (150-300 points available in total, depending on the competitiveness and number of teams at the tournament), where each question presents the competitor with a scenario they have to work through. The parts in each question will get progressively more difficult, and most will build off the previous part in some meaningful way. My Gen Chem 2 and Orgo 1 professors used a format like this incredibly effectively (IMO), and I am excited to apply it to Science Olympiad to hopefully make my tests more engaging and interesting to take. Tests that just consist of a pile of unrelated questions don't really force students to understand the concepts; they just reward the students who have seen similar problems before. Of course, the exact methodology for writing a test will depend on the event, so this strategy may not hold for all events, but I am excited to try it out for the physics events I will be supervising.
Boca Raton Community High School Alumni
Florida State Tournament Director 2020
National Physical Sciences Rules Committee Member
kevin@floridascienceolympiad.org || windu34's Userpage

Circuit Lab Event Supervisor for 2020: UT Austin (B/C), MIT (C), Solon (C), Princeton (C), Golden Gate (C), Nationals (C)

Unome
Moderator
Posts: 4122
Joined: January 26th, 2014, 12:48 pm
Division: Grad
State: GA
Location: somewhere in the sciolyverse

Re: Balancing Difficulty and Accessibility in Test Writing

Postby Unome » September 26th, 2018, 12:13 pm

My attempts so far have mostly worked out how I intended, with the exception of the first Astronomy test I wrote, which resulted in one ~60% score and everyone else below 35%. I find that a lot of the difficulty comes from the fact that, on many occasions, there really is no meaningful difference between the knowledge of the bottom 50-70% of teams: no matter how gradated the questions are, most teams will fall within a relatively narrow band (for my past tests, usually the 20-40% range), with a few teams really low and a wider distribution near the top.

As windu talked about, I definitely try to relate sequences of questions to each other, although I tend not to explicitly format it that way very often (Astro being the exception).
Userpage
Chattahoochee High School Class of 2018
Georgia Tech Class of 2022

Opinions expressed on this site are not official; the only place for official rules changes and FAQs is soinc.org.

nicholasmaurer
Coach
Posts: 382
Joined: May 19th, 2017, 10:55 am
Division: Grad
State: OH
Location: Solon, OH

Re: Balancing Difficulty and Accessibility in Test Writing

Postby nicholasmaurer » September 26th, 2018, 3:53 pm

For some tests I have written, I structured them explicitly by difficulty: for each topic, I would create subsections labelled easy, moderate, or difficult, with questions to match.

Generally, I aim for the low score to be ~20% and the high score to be ~80%. There may be an outlier, but it's generally possible to get almost all of the teams distributed in this range if you're careful with your approach.
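As a rough sanity check (with entirely hypothetical numbers), you could tally a trial grading against that band in a few lines:

```python
# Hypothetical trial-grading scores for an 11-team field, out of 200 points.
trial_scores = [35, 48, 52, 60, 71, 80, 95, 110, 128, 155, 170]
MAX_POINTS = 200

def fraction_in_band(scores, max_points, low=0.20, high=0.80):
    """Fraction of teams whose percentage score lands inside [low, high]."""
    pcts = [s / max_points for s in scores]
    return sum(low <= p <= high for p in pcts) / len(pcts)

print(f"low score:  {min(trial_scores) / MAX_POINTS:.1%}")
print(f"high score: {max(trial_scores) / MAX_POINTS:.1%}")
print(f"in 20-80% band: {fraction_in_band(trial_scores, MAX_POINTS):.0%}")
```

If most teams fall inside the band with only an outlier or two on either end, the difficulty curve is probably about right; if half the field is below 20%, the test needs easier entry points.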
Assistant Coach and Alumnus ('14) - Solon High School Science Olympiad
Tournament Director - Northeast Ohio Regional Tournament
Tournament Director - Solon High School Science Olympiad Invitational

Opinions expressed on this site are not official; the only place for official rules changes and FAQs is soinc.org.

TheChiScientist
Member
Posts: 676
Joined: March 11th, 2018, 11:25 am
Division: Grad
State: IL
Location: Suffering in a college class

Re: Balancing Difficulty and Accessibility in Test Writing

Postby TheChiScientist » September 26th, 2018, 4:50 pm

The most effective tests I have seen are the ones where no one ever scores 100%. I have had a hand in both writing and taking tests, and a good format looks something like this. Concepts and plug-and-chug equations should make up 20-40% of a test; these are normally the "easy ones." FRQs and deep-thinking questions should make up about 30-50% of a test; these tend to be moderately hard to hard. Finally, 10-30% of the test should be college-level questions that require in-depth knowledge of the subject. These questions should stay within what the rules allow, but they should also make the well-prepared teams go "WTH is this!!!!!"; they serve to separate the top performers from the rest. Any other space you feel you need to fill can be gimme questions, but try to avoid these, as they teach very little about the event. That's my 50 cents in a nutshell. :D
A Science Olympian from 2015 - 2019
Medal Count:30 8-)
School:Crystal Lake Central High School Wiki
Assassinator #119 and Co-Conspirator in #120
President of The Builder Cult. Builders rise up!

windu34
Moderator
Posts: 1339
Joined: April 19th, 2015, 6:37 pm
Division: Grad
State: FL
Location: Gainesville, Florida

Re: Balancing Difficulty and Accessibility in Test Writing

Postby windu34 » September 26th, 2018, 5:02 pm

TheChiScientist wrote: ↑September 26th, 2018, 4:50 pm
I would disagree with this approach. Throwing in random, sparsely related subject matter that isn't relevant to the big picture of the event really just rewards the teams that have perfected their cheat sheets. How is that a good way to assess which teams truly UNDERSTAND what is on their cheat sheet? The hardest questions on the test should consist of applying various interrelated concepts of the event to solve a problem (or series of problems). The first reaction shouldn't be "What is this?", but rather "How the heck am I going to approach this?".
Boca Raton Community High School Alumni
Florida State Tournament Director 2020
National Physical Sciences Rules Committee Member
kevin@floridascienceolympiad.org || windu34's Userpage

Circuit Lab Event Supervisor for 2020: UT Austin (B/C), MIT (C), Solon (C), Princeton (C), Golden Gate (C), Nationals (C)

dxu46
Exalted Member
Posts: 798
Joined: April 11th, 2017, 6:55 pm
Division: C
State: MO

Re: Balancing Difficulty and Accessibility in Test Writing

Postby dxu46 » September 26th, 2018, 5:04 pm

Give it to your partner (or some other knowledgeable person) and if they get 85% or more, it's a good test.

TheChiScientist
Member
Posts: 676
Joined: March 11th, 2018, 11:25 am
Division: Grad
State: IL
Location: Suffering in a college class

Re: Balancing Difficulty and Accessibility in Test Writing

Postby TheChiScientist » September 26th, 2018, 5:18 pm

windu34 wrote: ↑September 26th, 2018, 5:02 pm
Whoops, I probably should have worded that differently. The main idea I'm trying to get at is that you should have a section of concept-understanding questions: the "do you know what you are doing" questions. Harder questions should ask: do you understand what the question is asking and how you must solve it? The hardest questions should be the ones that really make teams think; they should have to understand the question wholeheartedly, using their cheat sheet in conjunction with real comprehension, to work out how to solve it. These are the questions that initially make teams go "WTH is this!!!", but if they truly understand the concepts in question, they should have an "aha" moment. Overall, teams have to understand what is in front of them through prior background knowledge, not just what is on their cheat sheet. I think that should clear up my thinking. :?
A Science Olympian from 2015 - 2019
Medal Count:30 8-)
School:Crystal Lake Central High School Wiki
Assassinator #119 and Co-Conspirator in #120
President of The Builder Cult. Builders rise up!

Unome
Moderator
Posts: 4122
Joined: January 26th, 2014, 12:48 pm
Division: Grad
State: GA
Location: somewhere in the sciolyverse

Re: Balancing Difficulty and Accessibility in Test Writing

Postby Unome » September 26th, 2018, 5:38 pm

nicholasmaurer wrote: ↑September 26th, 2018, 3:53 pm
This is basically what I try to do, although I use 15% and 70% as my benchmarks, since hitting 80% in Georgia (excluding outliers) would mean removing almost every question that requires thinking.
Userpage
Chattahoochee High School Class of 2018
Georgia Tech Class of 2022

Opinions expressed on this site are not official; the only place for official rules changes and FAQs is soinc.org.

