I agree with your opinion on the Remote Sensing test. Unfortunately, I am not strong in physics (at all).

EdwardMMNT wrote:Disease Detectives (10): Supervisors only gave us 40 minutes to complete the test, so it was a time crunch until the very end. It was completely different from any prior Nationals test — there weren't two distinct case scenarios but rather one general knowledge section and one case study. The questions were not very difficult, and they provided a significant amount of filler information. They also included a strange patient information chart that we needed to fill out based on background information given to us. The test was very strange and didn't appear very difficult, but there were indeed some tricky and abstract questions. We pushed ourselves to finish the test and were hoping to medal, but alas, the field was strong (as always) and we came up short.
Remote Sensing (10): This test was very disappointing. They asked optics and physics questions unrelated to the nature of the event. I thought last year's test was relatively fair (albeit very difficult), but this year's test was outright off-topic in several sections. They included Snell's Law problems, Doppler speed questions (not too egregious), and other unrelated physics jargon. The test was lacking in climate information and relevant physics questions. They sprinkled in a few good questions here and there, but I expected so much more from a Nationals test.
Forensics (14): The test was long and hard. The powders were varied and well chosen, and the test was a trademark Nationals test. There were so many things to do in 50 minutes, and we needed every second of it. We didn't have enough time to finish (I'm not sure anyone did) but plowed through a lot of it, although we messed up on chromatography and getting proper measurements. I was told Forensics at Nationals would be crazy, and it really was. No amount of preparation could've equipped me for the time crunch. That said, the test itself and the frantic atmosphere were pretty exhilarating and thrilling. A good test, and I had a lot of fun taking it (even if my hands were shaking the entire time).
Towers (12): Very nice supervisors. I heard the stand was not level, but aside from that, it was smoothly run.
Nationals Event Discussion
Re: Nationals Event Discussion
2019 Interests: Anatomy, Disease Detectives, Fossils, Experimental Design, Geologic Mapping, Designer Genes
Anatomy/Disease/Experimental/Fossils/Circuit Lab:
MIT: 12/25/13/22
Regionals: 1/1/x/x/1
State: 1/1/2/1/x
Nationals:
Re: Nationals Event Discussion
I apologize for being so late to this forum, but I thought it would be best as a graduating senior to first take the time to look back at my entire Science Olympiad career and the experiences I've had. I've enjoyed every minute of competition over the years, and I plan to continue being involved in SO in any way I can. With that said, I hope that those involved with SO and the National Tournament will seriously consider the thoughts that I and many others on this forum put out so that SO can continue to improve at the highest level. I apologize in advance for the length!
I'll start with my events!
Anatomy & Physiology: (21) - I'll be completely honest - this was not good at all. This test failed to represent the rules manual, as many topics in each system were completely left out. For the most part, the stations were far too easy, but there were also questions that were either too random or too irrelevant to the heart of the event (e.g., pink puffers, blue bloaters). Although the case study was a good idea, it just wasn't executed well enough, and the patient gave too many contradictory responses.
One of the beautiful parts of an event like Anatomy & Physiology is the multitude of ways in which an event supervisor can test the competitors' knowledge. I am always impressed by invitational tests like MIT's that are able to cover all of the rule book and really separate teams based on their knowledge of A&P. This test was truly embarrassing for a national-level tournament in its inability to accomplish that. I thought last year's A&P test was better - it had parts that were easy, medium, and hard, as well as sections that forced the competitors to dig into the knowledge of A&P that they had built over many months. You could tell that last year's test was written by someone with a doctor's perspective and a solid grasp of the rules. I cannot say the same about Ms. Palmietto's exams from both 2015 and 2018. For someone like me, who made this event my first true love over the years and was fortunate enough to attend the SfN Conference through SO last year, it was just so frustrating to end on such a sour note. However, I know I would've had the same complaints even if I had placed in the top 6. Congrats to Mason on a phenomenal job in this event and as a team all year.
Disease Detectives (4) - The CDC does a good job with this event on a yearly basis. The exams are always long, challenging, and test the competitors' ability to adapt to different scenarios and use critical thinking. All in all, I thought this year's event was done well, but probably not as well as previous years. The general knowledge section at the beginning of the test was an interesting twist (especially with the bias and confounding part), but it was a little too straightforward for a national test. The second section on plague had some new questions that I enjoyed answering and thinking through, but it still felt too easy overall (e.g., the Bradford-Hill Criteria listing). Although I think topics like occupational health and plague are interesting inclusions in Nationals case studies, it's still strange that foodborne disease was hardly ever addressed.
The test was definitely doable in the 40 minutes given, but the national supervisors need to do a much better job with checking teams in and handing out booklets so that all teams have 50 minutes to work, not 40 or fewer. This test was OK overall, but I'd point to the tests from the last three years as much better examples of how a nationals DD test should be structured. Congrats to TJHSST for winning this event.
Materials Science: (25) - Just as an NBA game can be a "tale of two halves," this MatSci exam was a tale of two parts. I'll start with the test. For the most part, I thought it was quite good for a Nationals test. It had varied multiple choice questions that covered most, if not all, of the rule book. For example, I really liked the questions on the materials testing equipment, because it's something mentioned in the rules that takes a good amount of research to fully understand. The matching section could've been written and spaced out better, and there was probably too much on certain topics like IR. Like Unome said earlier, there was also probably too much focus on individual polymers rather than how polymers behave as a class.
The lab, on the other hand, was quite flawed. The lab asked competitors to melt some cheese and then extrude it through a syringe to form a string that had to be carried to a station for judging. Next, it asked us to make a hashtag from the extruded cheese that also had to be carried. Seriously? I understand that challenging labs can be expensive to create and difficult to complete in the allotted time, but I've seen many interesting and intuitive labs at tournaments all year that blew these out of the water. I was definitely expecting better from the event supervisor, because my partner had a blast doing the metals-based labs at Wright State Nationals.
Also, the scoring system for the cheese lengths was just bad. I don't remember the numbers exactly, but the scoring ranges were unreasonably large. For example, a team with a string of 21 cm would get the same number of points as a team with a 99 cm string, a significant difference considering how much cheese we were initially given. It just didn't capture the essence of the event's focus on polymers and seemed rather silly in the grand scheme of things. Frankly, it seemed like the event supervisor felt the need to force cheese into the event in some way because he's from Wisconsin. The rest of the lab section dealt with opacity and general polymer stuff, both of which were OK. This event was decent, but it had the potential to be great overall with a better lab component. Hats off to Troy for winning this event and the overall tournament.
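To illustrate the problem (the numbers below are invented for the sake of the sketch, not the actual rubric): any scoring scheme with wide bands collapses meaningfully different results into the same score, so it can't separate teams.

```python
def bucket_score(length_cm):
    """Tiered scoring with wide bands (hypothetical numbers).

    Everything in a band earns the same points, no matter how
    different the actual results are.
    """
    if length_cm >= 100:
        return 10
    elif length_cm >= 20:  # 21 cm and 99 cm land in the same band
        return 5
    else:
        return 1

# A 99 cm string earns no more than a 21 cm one:
assert bucket_score(21) == bucket_score(99) == 5
```

With bands this wide, the lab result barely affects ranking; a continuous score (e.g., points proportional to length) would have rewarded the extra 78 cm.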
Again, I don't want to give the impression that I'm complaining or suffering from sour grapes because I didn't do as well as I had hoped in all my events. I've thought long and hard about each one, and I feel that those events and others should be analyzed in an unbiased manner that accounts for all the factors involved.
With that said, I thought the overall tournament was run pretty well. The event staff and volunteers were great as usual, and the builders on my team didn't seem to have any glaring problems with how events were run. Colorado State had a great campus, and I didn't really have any issues with traveling between events. I will say that there seemed to be very few things that the competitors could do on Friday and Saturday other than walk around and/or compete. The Noosa was great, however, and I haven't eaten yogurt since then because other brands are just a letdown in quality. I'd rank this tournament as better than Nebraska (for obvious reasons) and around the same as Wright State and UW-Stout. In my opinion, the UCF Nationals from 2014 and 2012 were just awesome and unparalleled in quality.
I'll preface my final part by first saying that Science Olympiad does a LOT of things right. I've always loved picking up a variety of events that have allowed me to grow in so many ways. SO is also incredibly rewarding, and its benefits obviously go far beyond a few medals at States and Nationals. However, the National Tournament and SO have some pressing issues that need to be tackled. Nationals should reflect the highest standards that SO has to offer. I think a lot of people on these forums believe that the quality of Nationals, especially in the bio events, has been declining a bit over the years. It's far too common to see tests at invitationals like MIT, GGSO, and SOUP in January and February that are significantly better than Nationals tests. That shouldn't happen, plain and simple. I understand that the National event supervisors have responsibilities outside of SO, but it's simply unacceptable when a college student or professor writes a test that demonstrates a greater understanding of the event and all it entails than the National supervisor does. This is especially concerning since the event supervisors have almost an entire year to prepare an exam that they know students have been diligently preparing for. There are too many exams like Ecology last year and Herpetology and Anatomy & Physiology this year that leave competitors asking for much, much more. The National supervisor committee as a whole should honestly reconsider their approach, especially since there's a solid argument that tests from invites like MIT are better nearly across the board.
I've been truly honored to attend 7 national tournaments, and as a result, it feels like SO has become ingrained in my DNA. I'm a little sad that my team and I were not able to do as well as we had wished in my final competition ever. In the end, however, I'll remember all the incredible memories I've created from SO. I love my teammates to death, and I know they'll be back next year with a vengeance. We all have a great deal of respect for teams like Troy, Solon, and Harriton that consistently churn out winning teams that capture what SO is all about. Another shoutout goes to all the people on these forums for their awesome performances and dedication to Science Olympiad. You guys rock.
I'd love to hear what others think. Feel free to reply or PM me if you'd like to discuss.
Northville '18, UMich '22
Vice Executive Director - University of Michigan Science Olympiad Invitational (UMSO)
Website: www.umichscioly.org
Instagram: @umichscioly
Twitter: @umichscioly
- windu34
Re: Nationals Event Discussion
jaguarhunter wrote: I'll preface my final part by first saying that Science Olympiad does a LOT of things right. I've always loved picking up a variety of events that have allowed me to grow in so many ways. SO is also incredibly rewarding, and its benefits obviously go far beyond a few medals at States and Nationals. However, the National Tournament and SO have some pressing issues that need to be tackled. Nationals should reflect the highest standards that SO has to offer. I think a lot of people on these forums believe that the quality of Nationals, especially in the bio events, has been declining a bit over the years. It's far too common to see tests at invitationals like MIT, GGSO, and SOUP in January and February that are significantly better than Nationals tests. That shouldn't happen, plain and simple. I understand that the National event supervisors have responsibilities outside of SO, but it's simply unacceptable when a college student or professor writes a test that demonstrates a greater understanding of the event and all it entails than the National supervisor does. This is especially concerning since the event supervisors have almost an entire year to prepare an exam that they know students have been diligently preparing for. There are too many exams like Ecology last year and Herpetology and Anatomy & Physiology this year that leave competitors asking for much, much more. The National supervisor committee as a whole should honestly reconsider their approach, especially since there's a solid argument that tests from invites like MIT are better nearly across the board.
I find this quite interesting in particular. I would also be interested in hearing people's perspectives and speculation on this.
Boca Raton Community High School Alumni
University of Florida Science Olympiad Co-Founder
Florida Science Olympiad Board of Directors
[email protected] || windu34's Userpage
Re: Nationals Event Discussion
windu34 wrote:jaguarhunter wrote: I'll preface my final part by first saying that Science Olympiad does a LOT of things right. I've always loved picking up a variety of events that have allowed me to grow in so many ways. SO is also incredibly rewarding, and its benefits obviously go far beyond a few medals at States and Nationals. However, the National Tournament and SO have some pressing issues that need to be tackled. Nationals should reflect the highest standards that SO has to offer. I think a lot of people on these forums believe that the quality of Nationals, especially in the bio events, has been declining a bit over the years. It's far too common to see tests at invitationals like MIT, GGSO, and SOUP in January and February that are significantly better than Nationals tests. That shouldn't happen, plain and simple. I understand that the National event supervisors have responsibilities outside of SO, but it's simply unacceptable when a college student or professor writes a test that demonstrates a greater understanding of the event and all it entails than the National supervisor does. This is especially concerning since the event supervisors have almost an entire year to prepare an exam that they know students have been diligently preparing for. There are too many exams like Ecology last year and Herpetology and Anatomy & Physiology this year that leave competitors asking for much, much more. The National supervisor committee as a whole should honestly reconsider their approach, especially since there's a solid argument that tests from invites like MIT are better nearly across the board.
I find this quite interesting in particular. I would also be interested in hearing people's perspectives and speculation on this.
There is a nuance here that many people don't seem to be aware of. Part of the general 'deal' with being a host site for the National Tournament is that you get to designate a certain number of local people as National Event Supervisors for the tournament. As such, every year there are many events run by people who haven't had experience running a national event before. I'm not implying there is a direct correlation with the concerns some of you have about some national events, but it's likely a significant factor, even though we do a variety of things to try to help them prepare.
Student Alumni
National Event Supervisor
National Physical Sciences Rules Committee Chair
Re: Nationals Event Discussion
I do wish that all tournaments, not just nationals, involved younger student alumni (e.g., college students) more frequently. I feel that students can sometimes write some of the best tests because they are the ones who have actually competed in the events dozens of times and know what works and what doesn't. Obviously I'm not saying adults/professors don't do a good job, but I wish college students were more a part of the process at state and national tournaments. I think having student alumni help with the test writing/supervising could increase the quality of the tests substantially.
University of Michigan Science Olympiad Executive Board
- Unome
Re: Nationals Event Discussion
MIScioly1 wrote:I do wish that all tournaments, not just nationals, involved younger student alumni (e.g., college students) more frequently. I feel that students can sometimes write some of the best tests because they are the ones who have actually competed in the events dozens of times and know what works and what doesn't. Obviously I'm not saying adults/professors don't do a good job, but I wish college students were more a part of the process at state and national tournaments. I think having student alumni help with the test writing/supervising could increase the quality of the tests substantially.
I'd say that, on average, the proportion of good event supervisors is similar between the two groups (younger student alumni and everyone else). I've seen plenty of not-so-good alumni event supervisors.
Re: Nationals Event Discussion
Unome wrote:MIScioly1 wrote:I do wish that all tournaments, not just nationals, involved younger student alumni (e.g., college students) more frequently. I feel that students can sometimes write some of the best tests because they are the ones who have actually competed in the events dozens of times and know what works and what doesn't. Obviously I'm not saying adults/professors don't do a good job, but I wish college students were more a part of the process at state and national tournaments. I think having student alumni help with the test writing/supervising could increase the quality of the tests substantially.
I'd say that, on average, the proportion of good event supervisors is similar between the two groups (younger student alumni and everyone else). I've seen plenty of not-so-good alumni event supervisors.
Oh, for sure. I still wish that scioly were more welcoming to younger supervisors at higher levels of competition.
University of Michigan Science Olympiad Executive Board
Re: Nationals Event Discussion
MIScioly1 wrote:Unome wrote:MIScioly1 wrote:I do wish that all tournaments, not just nationals, involved younger student alumni (e.g., college students) more frequently. I feel that students can sometimes write some of the best tests because they are the ones who have actually competed in the events dozens of times and know what works and what doesn't. Obviously I'm not saying adults/professors don't do a good job, but I wish college students were more a part of the process at state and national tournaments. I think having student alumni help with the test writing/supervising could increase the quality of the tests substantially.
I'd say that, on average, the proportion of good event supervisors is similar between the two groups (younger student alumni and everyone else). I've seen plenty of not-so-good alumni event supervisors.
Oh, for sure. I still wish that scioly were more welcoming to younger supervisors at higher levels of competition.
My understanding is that the new Science Olympiad Alumni Task Force is designed to better retain and integrate alumni, especially at state and regional tournaments. Participation/volunteering at these tournaments, and the relationships formed as a result, often serve as a stepping stone to becoming involved at the national level.
Assistant Coach and Alumnus ('14) - Solon High School Science Olympiad
Tournament Director - Northeast Ohio Regional Tournament
Tournament Director - Solon High School Science Olympiad Invitational
Opinions expressed on this site are not official; the only place for official rules changes and FAQs is soinc.org.
Re: Nationals Event Discussion
nicholasmaurer wrote:My understanding is that the new Science Olympiad Alumni Task Force is designed to better retain and integrate alumni, especially at state and regional tournaments. Participation/volunteering at these tournaments, and the relationships formed as a result, often serve as a stepping stone to becoming involved at the national level.
Good to hear!
University of Michigan Science Olympiad Executive Board