2019 Harvard Undergraduate Science Olympiad Invitational
- windu34
- Staff Emeritus
- Posts: 1383
- Joined: April 19th, 2015, 6:37 pm
- Division: Grad
- State: FL
- Has thanked: 2 times
- Been thanked: 40 times
Re: 2019 Harvard Undergraduate Science Olympiad Invitational
Good luck to all teams competing today!
Boca Raton Community High School Alumni
University of Florida Science Olympiad Co-Founder
Florida Science Olympiad Board of Directors
[email protected] || windu34's Userpage
- TheSquaad
- Member
- Posts: 166
- Joined: March 18th, 2017, 5:14 pm
- Division: Grad
- Has thanked: 0
- Been thanked: 5 times
Re: 2019 Harvard Undergraduate Science Olympiad Invitational
1. Acton-Boxborough B (53)
2. Acton-Boxborough A (91)
3. Newton North B (227)
4. Hillsborough A (229)
5. Belmont A (243)
6. Newton South A (292)
Note: AB had a superscore of 32
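For readers unfamiliar with the term: a "superscore" is usually computed by taking, for each event, the best (lowest) placement achieved by any of a school's teams and summing those. A minimal sketch under that assumption, with made-up placements rather than the real Harvard results:

```python
# Hypothetical placements, not the actual Harvard results.
# placements[event][team] = place earned by that team in that event.
placements = {
    "Sounds of Music": {"AB A": 1, "AB B": 3},
    "Mission Possible": {"AB A": 4, "AB B": 2},
    "Fermi Questions": {"AB A": 5, "AB B": 1},
}

# Superscore: best place across the school's teams, summed over events.
superscore = sum(min(places.values()) for places in placements.values())
print(superscore)  # 1 + 2 + 1 = 4
```

This is why a superscore (32 here) can beat even the winning team's raw score: it cherry-picks the stronger result from A or B in every event.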
-
- Member
- Posts: 56
- Joined: November 11th, 2018, 5:19 pm
- State: FL
- Has thanked: 1 time
- Been thanked: 9 times
Re: 2019 Harvard Undergraduate Science Olympiad Invitational
This is amazing and scary at the same time. Such dominance! Are the official (or unofficial) scores up anywhere?
TheSquaad wrote:
1. Acton-Boxborough B (53)
2. Acton-Boxborough A (91)
3. Newton North B (227)
4. Hillsborough A (229)
5. Belmont A (243)
6. Newton South A (292)
Note: AB had a superscore of 32
- sciolyperson1
- Exalted Member
- Posts: 1074
- Joined: April 23rd, 2018, 7:13 pm
- Division: C
- State: NJ
- Pronouns: He/Him/His
- Has thanked: 529 times
- Been thanked: 601 times
- Contact:
Re: 2019 Harvard Undergraduate Science Olympiad Invitational
https://app.avogadro.ws/invitational/harvard-c/
?demir wrote:
This is amazing and scary at the same time. Such dominance! Are the official (or unofficial) scores up anywhere?
*Note, it's blank as of now.
SoCal Planning Team & BirdSO Tournament Director
WW-P HSN '22, Community MS '18
Sciolyperson1's Userpage
-
- Member
- Posts: 14
- Joined: March 20th, 2017, 3:28 pm
- Division: C
- State: MA
- Has thanked: 0
- Been thanked: 0
Re: 2019 Harvard Undergraduate Science Olympiad Invitational
go.sciolyharvard.org/divc-results
Here they are.
2020 MIT Sounds of Music Event Co-Supervisor
Acton-Boxborough Regional High School '19
2019 Nationals: 1st Anatomy and Physiology, 1st Designer Genes, 2nd Chemistry Lab, 2nd Sounds of Music, 3rd Forensics
2018 Nationals: 1st Chemistry Lab, 6th Forensics, 8th Herpetology, 9th Anatomy and Physiology
- TheSquaad
- Member
- Posts: 166
- Joined: March 18th, 2017, 5:14 pm
- Division: Grad
- Has thanked: 0
- Been thanked: 5 times
Re: 2019 Harvard Undergraduate Science Olympiad Invitational
A quick review of the events I did:
Sounds: The test was probably the best I've ever taken in my Scioly career (shoutouts to windu34). The instrument testing, however, had an extremely bad decibel meter (the entire testing room could hear my build through the walls, yet it only scored 82 dB when I know it hits over 100).
Mission: good ES'ing and fairly efficient, though the impound area was the floor and people had to step over builds to retrieve theirs. Other than that, well run.
Expedes: a decent experiment prompt, and it ran in an orderly way. I didn't like that they printed the official answer sheet double-sided, but they gave us spare paper we could staple on, which made it fine.
Boomi: Good rig, though nothing was checked for legality (measurements, contact line, etc.). Also, they scooped sand out of the bucket before weighing to "compensate" for extra sand that poured into the bucket after it broke. I understand that some extra sand gets added (it's just inevitable), but leaving it to an ES to estimate how much to adjust your score by just seems wrong.
Overall, my events were run pretty well. Some minor annoyances, but nothing that broke the competition for me.
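On the decibel-meter complaint: readings are very sensitive to distance and calibration, so a meter across the room can legitimately report far less than what a player hears up close. A rough sketch of how sound pressure level is computed from raw pressure samples (hypothetical numbers, not the event's actual setup):

```python
import math

def spl_db(samples, ref=20e-6):
    """Sound pressure level in dB from pressure samples (Pa), re 20 uPa."""
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    return 20 * math.log10(rms / ref)

# For a point source, doubling the distance roughly halves the pressure,
# which drops the reading by about 6 dB. Hypothetical samples in pascals:
near = [0.4, -0.4, 0.4, -0.4]
far = [s / 2 for s in near]
print(round(spl_db(near) - spl_db(far), 1))  # ~6.0 dB lost per doubling
```

So an instrument measured at 82 dB from a few meters away could plausibly exceed 100 dB at the player's ear; meter placement matters as much as the meter itself.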
- windu34
- Staff Emeritus
- Posts: 1383
- Joined: April 19th, 2015, 6:37 pm
- Division: Grad
- State: FL
- Has thanked: 2 times
- Been thanked: 40 times
Re: 2019 Harvard Undergraduate Science Olympiad Invitational
I'm so glad you enjoyed it! I certainly spent a lot of time trying to write a balanced exam (not just pure physics or music theory) that focused on application, with relatively few questions that could be answered from a binder. Admittedly, I may have put a little too much saxophone and jazz theory on it, but hopefully that gave it a uniqueness that may not appear at other tournaments. I thought the woodwind questions were particularly interesting (and difficult: only 1 or 2 teams got more than half credit on each of them), and I dug through many PhD dissertations to figure out the physics myself. The exam had a pretty good, mostly even score distribution, with a high of ~150 and a low of 15 out of 240.
TheSquaad wrote:
Sounds: The test was probably the best I've ever taken in my Scioly career (shoutouts to windu34). The instrument testing however had an extremely bad decibel meter (the entire testing room could hear my build through the walls and it only scored 82 dB when I know it hits over 100).
I thought I wouldn't have the same troubles MIT Sounds had, since I had 8 instead of 12 teams per time block and I focused on grading and proctoring rather than scoring devices. But as many of you learned, I was wrong, and we were backed up by about 20 minutes most of the day and about 40 by the end. After speaking to numerous other tournament directors, it is clear that Sounds REQUIRES two sets of device-testing personnel to get through all the teams in one time block. I'm slightly disappointed I didn't foresee this, seeing as I helped run one of the TWO optics boards at nationals. This will be a lesson I carry into supervising future physics events.
That said, I know several teams were not pleased with the dynamic volume test, but we did have one full score (trombone) and several in the 80s. I was actually quite pleased not to have an overwhelming number of high dynamic scores like I expected, and the microphone Harvard Scioly purchased for me was quite easy to use; I definitely think it helped with pitch test consistency.
Now for the pitch test: plenty of teams seemed not to have practiced under testing conditions, during which the AVERAGE pitch is what matters. A lot of teams with otherwise in-tune devices didn't do so well. I won't go into any more depth regarding what device designs I saw and what worked best, but I will say that my volunteers running the device testing thought the teams that had put in the most effort ended up scoring the best, and that the trombone was clearly the best instrument for the event (in their opinion; I cannot vouch for that).
Finally, I hope everyone who attended, and everyone who ends up taking my test in the future (as I'm sure it will circulate throughout the "black market"), will enjoy it and learn something new from it, as I put a lot of effort into making sure it wasn't "just another binder-regurgitating Sounds test" of the kind I have seen so much of.
Boca Raton Community High School Alumni
University of Florida Science Olympiad Co-Founder
Florida Science Olympiad Board of Directors
[email protected] || windu34's Userpage
-
- Member
- Posts: 19
- Joined: September 3rd, 2018, 7:18 pm
- Division: C
- State: FL
- Has thanked: 0
- Been thanked: 0
Re: 2019 Harvard Undergraduate Science Olympiad Invitational
windu34 wrote:
Now for the pitch test - plenty of teams seemed to have not practiced under testing conditions, during which the AVERAGE pitch is what matters. A lot of teams with otherwise in-tune devices didn't do so well.
Hi Windu! I wasn't at Harvard, but my B team was. They did say they had some issues with in-tune notes reading as out of tune. Could you perhaps elaborate on what you mean by not practicing under testing conditions? Do you mean just not practicing playing the note for 5 seconds? Thanks
scio alum
- TheSquaad
- Member
- Posts: 166
- Joined: March 18th, 2017, 5:14 pm
- Division: Grad
- Has thanked: 0
- Been thanked: 5 times
Re: 2019 Harvard Undergraduate Science Olympiad Invitational
I think he means that most people only tune their instrument with a traditional tuner, which has a low refresh rate and always centers on the actual pitch. Google Science Journal (which is what was used) picks up the pitch continuously, including any overtones and points with no audible pitch. Those often create huge spikes and bowls in the pitch curve, messing up the average pitch.
scienceisfunalil wrote:
Could you perhaps elaborate on what you mean by not practicing under testing conditions? Do you mean just not practicing playing the note for 5 seconds?
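A tiny sketch of the spike/bowl effect described above, using hypothetical pitch readings (the app's actual detection behavior is not specified here): a handful of bad frames is enough to pull an otherwise in-tune average far off the target note.

```python
# Hypothetical continuous pitch readings (Hz) while sustaining A4 = 440 Hz.
target = 440.0
steady = [439.0, 441.0, 440.0, 439.5, 440.5]   # in-tune sustain
with_spike = steady + [880.0]                   # one overtone frame (octave up)
with_dropout = steady + [0.0]                   # one silent/undetected frame

def mean(xs):
    return sum(xs) / len(xs)

print(round(mean(steady), 1))        # 440.0 -- scores as in tune
print(round(mean(with_spike), 1))    # 513.3 -- one spike wrecks the average
print(round(mean(with_dropout), 1))  # 366.7 -- one dropout drags it flat
```

This is why practicing against a low-refresh tuner is misleading: the tuner locks onto the dominant pitch, while a continuous average is hostage to every frame, so builds need to sustain a clean, unbroken tone for the full measurement window.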
-
- Member
- Posts: 19
- Joined: September 3rd, 2018, 7:18 pm
- Division: C
- State: FL
- Has thanked: 0
- Been thanked: 0
Re: 2019 Harvard Undergraduate Science Olympiad Invitational
TheSquaad wrote:
I think he means that most people only tune their instrument with a traditional tuner, which has a low refresh rate and always centers on the actual pitch. Google Science Journal (which is what was used) picks up the pitch continuously, including any overtones and points with no audible pitch. Those often create huge spikes and bowls in the pitch curve, messing up the average pitch.
Ahh okay thanks. So you can solve the silences by building an instrument that can play continuously through the five seconds, but aren’t overtones unavoidable?
scio alum