SOLVI 2020

CookiePie1
Exalted Member
Posts: 428
Joined: February 15th, 2018, 5:05 pm
Division: C
State: NJ
Pronouns: He/Him/His
Has thanked: 121 times
Been thanked: 94 times

Re: SOLVI 2020

Post by CookiePie1 »

SOLVI Review (Competed on SBHS team Chips)

Protein Modeling (8th) - The test was sufficiently long. I mainly did the Jmol section, so that was fine. There was a good mix of easy questions and ones that took more time and thinking; I was able to work through the section initially in about 35 minutes. The topic-specific questions were good. There was a significant section on enzyme kinetics, which I'm not totally sure falls within the scope of the event, but the rules are vague enough that I guess it's justified.

Sounds of Music (8th) - The test was actually a bit shorter than I expected, but we still didn't end up finishing early. There were very few multiple choice questions, which is appropriate for the miniSO format. It's good that we got the graded tests back along with the worked-through solutions, which helps us review and study what we did wrong.

Overall (20th) - There was a good amount of communication, especially as the tournament date approached. For a while, we weren't totally sure our registration went through because we didn't get an initial confirmation, but our emails were promptly answered. Many students reported that some answers entered into Scilympiad were not saved even though they clicked out of the prompt as instructed. Not sure how that happened, but I guess there's not much we can do. Awards, as mentioned, were the fastest I had ever been to. Scilympiad overall works fairly well and the UI is pretty nice, but I would personally prefer a more responsive design. When inputting events into Scilympiad, the overflowing part of the table wasn't scrollable, so you basically need two monitors to be efficient, or you have to keep tabbing back and forth, which I really don't like. I usually do build events, so it kind of sucked that we didn't do builds, but it probably would have been a lot to handle as a lone person, especially with school really slamming me this year.
South Brunswick High School Captain '22
2020 Events: Protein Modeling, Ping Pong Parachute, Wright Stuff, Sounds of Music
2021 Events: Protein Modeling, Sounds of Music, Ornithology
2022 Events: TBD

Reality is merely an illusion, albeit a very persistent one.
-Albert Einstein
sciolyperson1
Exalted Member
Posts: 1074
Joined: April 23rd, 2018, 7:13 pm
Division: C
State: NJ
Pronouns: He/Him/His
Has thanked: 529 times
Been thanked: 601 times

Re: SOLVI 2020

Post by sciolyperson1 »

Takeaways:
  • Mountain View looks strong but is inconsistent - 24th in Sounds of Music, placements between 6th and 10th in the first half of events in a seemingly strong hard stack. A consistent Mira Loma or Mission San Jose can beat them at states.
  • WWPN is kinda washed... no golds, one silver, one bronze unstack. Inconsistent even when superscored (Circuit, Code, Detector)
  • Mason is strong - close 2nd superscore splitting against three teams. Got 5 golds total - pretty good imo
  • Enloe underperformed - they did well apart from Astronomy (which they also bombed at PCHS) and Geomapping (technical issues, I assume?). Considering that's their stack, it's not looking great, especially placing lower than WWPN's unstack.
  • Ward is pretty underrated as well, they have their downs (Circuit, Disease, etc). Perhaps technical issues? Teams typically don't have this many study bombs.
  • Tesla STEM came off a Camas/Bothell/Clark win at Camas Invy the weekend prior - 45th in Chem Lab, 24th in Forensics (got 1st and 2nd at Camas Invy) were their main bombs, putting them at 6th overall. Even though they had these bombs, I'd put them as favorites to win Washington states.
  • TJ's performance reminds me of CMU 2020, even unstack and mediocre overall until you superscore - then they do well.
The consistent trend is random bombs across the board in study events - not quite sure why, though.

I'm not surprised MTV won superscore. Their team was a pretty hard 17 for 17 stack (A team beating B team in all events by a significant margin). Mason managed to split up partners and still got a 2nd superscore - with a few events having 2 teams in the top 5 (strong indicator for good performance at future higher levels of competition). WWPN was worse, splitting between just two teams and placing behind Mason in superscore (we're not looking great). TJ was similar to Mason: splitting up three teams and placing 13/16/18 (Mason got 4/12/17). If all teams stacked here, the placements would look like:

1) Mason (previously 105 superscore across 3 teams)
2) WWPN (previously 113 superscore across 2 teams)
3) Mountain View (previously 101 superscore, full stack)
4) TJ (previously 163 superscore across 3 teams)
5) Enloe (previously 157 superscore, full stack)
6) Ward (previously 166 superscore, full stack)
7) Tesla STEM (previously 179 superscore, full stack)
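For readers unfamiliar with the term, superscoring can be sketched like this; this assumes the usual convention (best placement per event across all of a school's teams, summed over events), and the placements shown are hypothetical:

```python
def superscore(placements, teams):
    """Sum, over events, the best (lowest) place any of the school's teams got."""
    return sum(
        min(places[t] for t in teams if t in places)
        for places in placements.values()
    )

# Hypothetical example: one school splitting two teams across three events.
placements = {
    "Astronomy": {"A": 4, "B": 12},
    "Machines": {"A": 7, "B": 3},
    "Geologic Mapping": {"A": 2, "B": 9},
}
print(superscore(placements, ["A", "B"]))  # 4 + 3 + 2 = 9
```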
SoCal Planning Team & BirdSO Tournament Director
WW-P HSN '22, Community MS '18
Sciolyperson1's Userpage
BennyTheJett
Exalted Member
Posts: 462
Joined: February 21st, 2019, 2:05 pm
Division: Grad
Pronouns: He/Him/His
Has thanked: 95 times
Been thanked: 281 times

Re: SOLVI 2020

Post by BennyTheJett »

Lumosityfan and I wrote the Dynamic Planet test, and I was the Event Supervisor. Our distributions are included below.
Screenshot_2020-12-20 SOLVI DP-C 2020 Distributions - Google Docs(2).png
Screenshot_2020-12-20 SOLVI DP-C 2020 Distributions - Google Docs(1).png

Overall I'd say it was fairly well distributed, with a fairly accurate linear curve that turns slightly exponential at the very top. The entire curve was a tad lower than I had anticipated, indicating that Lumo and I made the test slightly too hard ( :twisted: ). Test reviews (or complaints) can be handled by filling out the form that was sent to coaches for event reviews, or by contacting Lumo or me on here or on Discord. Thanks again to all the teams who competed! It was a pleasure to write for a fairly large and competitive tournament, and it's something I look forward to hopefully doing more of in the future!
Menomonie '21 UW-Platteville '25

Division D and proud. If you want a Geology tutor hmu.
sciolyperson1
Exalted Member
Posts: 1074
Joined: April 23rd, 2018, 7:13 pm
Division: C
State: NJ
Pronouns: He/Him/His
Has thanked: 529 times
Been thanked: 601 times

Re: SOLVI 2020

Post by sciolyperson1 »

sciolyperson1 wrote: December 11th, 2020, 7:36 pm
SOLVI Superscore predictions but corrected with my biases
Mason High School	103
West Windsor Plainsboro High School North	145
William G. Enloe High School	181
Mountain View High School	212
Thomas Jefferson HS for Science and Technology	230
Tesla STEM High School	302
Ward Melville High School	326
Ed W. Clark High School	329
Hillsborough High School	344
South Brunswick High School	364
Glen A. Wilson High School	420
Valencia High School (OC)	466
Milpitas High School	510
North Hollywood High School	515
Abington Heights High School	525
Lake Braddock High School	551
Brooklyn Technical High School	559
Walton High School	560
Shady Side Academy	583
Cardinal Gibbons High School	603
Canyon High School	621
Cerritos High School	659
Texas Academy of Mathematics and Science	693
East Grand Rapids High School	717
Lincoln Southwest High School	741
Jericho High School	758
Blue Valley North High School	800
Eastview High School	819
White Station High School	850
The College Preparatory School	873
Lexington High School	932
Potomac Falls High School	954
Green Valley High School	974
Venice High School	1049
Los Alamitos High School	1065
Barrington High School	1094
Sheldon High School	1142
Parsippany High School	1175
West Career and Technical Academy	1221
Cedar Ridge High School	1263
Rancho High School	1339
Schools and Prediction/Actual Rank Differences
Mountain View HS	3
Mason High School	1
West Windsor Plainsboro HS North	1
William G. Enloe High School	1
Thomas Jefferson HS for Science and Technology	0
Ward Melville High School	1
Tesla STEM High School	1
Lexington High School	22
Ed W. Clark High School	1
Hillsborough High School	1
North Hollywood High School	3
Jericho High School	13
Valencia High School (OC)	1
Glen A. Wilson High School	3
South Brunswick High School	5
Shady Side Academy	3
Walton High School	1
White Station High School	10
Brooklyn Technical HS	2
Texas Academy of Mathematics and Science	2
Canyon High School	1
Lake Braddock High School	6
Blue Valley North High School	3
Barrington High School	11
East Grand Rapids High School	2
Lincoln Southwest HS	2
Milpitas HS	14
Venice High School	5
Sheldon High School	7
Los Alamitos High School	4
Abington Heights High School	16
West Career and Technical Academy	6
Eastview High School	6
The College Preparatory School	5
Cerritos High School	14
Green Valley High School	4
Parsippany High School	0
Potomac Falls High School	7
Cedar Ridge High School	0
Rancho High School	0
Schools that had no data and how much they were off by
Thomas Jefferson HS for Science and Technology 0
Ward Melville High School 1
South Brunswick High School 5
Abington Heights High School 16
Brooklyn Technical High School 2
Cardinal Gibbons High School (NS)
Jericho High School 13
White Station High School 10
Lexington High School  22
Potomac Falls High School 7
Green Valley High School 4
Parsippany High School 0
West Career and Technical Academy 6
Rancho High School 0
Prediction accuracy:
Out of the 40 schools that participated, 23 were predicted within 3 places. This includes both Cedar Ridge and Rancho High Schools, which were predicted to the exact placement (the last two placing teams, excluding Cardinal Gibbons, which no-showed). 28 schools were predicted within 5 places, and 34 within 10 places. The remaining differences could be attributed to schools sending their B team or an incomplete team, or to having no data at all (forcing me to pick from past data - Lexington, Abington Heights, and White Station had no data and were off by 22, 16, and 10, for instance).
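The tally can be reproduced directly from the rank-difference table in this post:

```python
# |predicted - actual| rank differences for the 40 participating schools,
# copied in order from the table above (Cardinal Gibbons, a no-show, excluded).
diffs = [3, 1, 1, 1, 0, 1, 1, 22, 1, 1, 3, 13, 1, 3, 5, 3, 1, 10, 2, 2,
         1, 6, 3, 11, 2, 2, 14, 5, 7, 4, 16, 6, 6, 5, 14, 4, 0, 7, 0, 0]

def within(k):
    """Number of schools predicted within k places of their actual rank."""
    return sum(d <= k for d in diffs)

print(within(3), within(5), within(10))  # 23 28 34
```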
SoCal Planning Team & BirdSO Tournament Director
WW-P HSN '22, Community MS '18
Sciolyperson1's Userpage
Umaroth
Exalted Member
Posts: 398
Joined: February 10th, 2018, 8:51 pm
Division: C
State: CA
Pronouns: He/Him/His
Has thanked: 167 times
Been thanked: 325 times

Re: SOLVI 2020

Post by Umaroth »

Lexington popped off, could this be a fun year for MA?
Cal 2026
Troy SciOly 2021 Co-Captain
Proud Padre of the Evola SciOly Program 2018-now
Dank Memes Area Homeschool Juggernaut 2018-now
Sierra Vista SciOly Co-Head Coach 2020-now

Umaroth's Userpage
sciolyperson1
Exalted Member
Posts: 1074
Joined: April 23rd, 2018, 7:13 pm
Division: C
State: NJ
Pronouns: He/Him/His
Has thanked: 529 times
Been thanked: 601 times

Re: SOLVI 2020

Post by sciolyperson1 »

Umaroth wrote: December 21st, 2020, 9:48 am Lexington popped off, could this be a fun year for MA?
I don't believe they popped off hard enough to beat Acton, but we'll wait and see I guess
SoCal Planning Team & BirdSO Tournament Director
WW-P HSN '22, Community MS '18
Sciolyperson1's Userpage
MadCow2357
Exalted Member
Posts: 774
Joined: November 19th, 2017, 9:09 am
Division: C
State: RI
Has thanked: 211 times
Been thanked: 56 times

Re: SOLVI 2020

Post by MadCow2357 »

sciolyperson1 wrote: December 21st, 2020, 10:10 am
Umaroth wrote: December 21st, 2020, 9:48 am Lexington popped off, could this be a fun year for MA?
I don't believe they popped off hard enough to beat Acton, but we'll wait and see I guess
MIT should be the best indicator of how MA states will play out. I'm not too familiar with either of the teams, but does anybody know if the lack of builds is disproportionately affecting AB or Lexington this year?
MadCow2357's Userpage
Gallagher MS '19
Barrington HS '23
malikaow1004
Member
Posts: 4
Joined: July 31st, 2017, 6:48 am
Division: C
State: NV
Has thanked: 30 times
Been thanked: 16 times

Re: SOLVI 2020

Post by malikaow1004 »

Congratulations to all the teams who competed at SOLVI this past weekend! We are thankful for all those who participated as well as our many volunteers who helped write, supervise, and grade!
Going to bloom eventually!
Giantpants
Member
Posts: 190
Joined: February 7th, 2019, 5:42 am
Division: Grad
State: NY
Pronouns: He/Him/His
Has thanked: 150 times
Been thanked: 160 times

Re: SOLVI 2020

Post by Giantpants »

Hi friends!

On Saturday I had the big honor of being the Geologic Mapping event supervisor for SOLVI! So I just wanted to give some stats/results/thoughts for the event, and for SOLVI as a whole.

First off, big congrats to everyone who competed. Online tournaments continue to present new and unique challenges, and competing in them is a really awesome way of staying connected with Science Olympiad even in unusual times. I am continuously impressed by the resilience of the Science Olympiad community, which turns out massive fields for these competitions every time. It's a true honor to serve such a committed group!

For my test, I tried to follow a similar-ish format to my BEARSO test (which you can read about and find here!) since that was pretty well received, both in feedback and in its solid score distribution. So, similar to BEARSO, the test was sort of long and again intended to be pretty challenging. Most reviews I have gotten so far agree that it was pretty hard but good, which I love to see! It's good to see my skill as a writer continuing to improve, and that high-level teams can attest to the test's (haha, see what I did there) quality. It means a lot coming from everyone who took it, especially those who gave me feedback!!! So glad that so many people enjoyed it!!

Distributions will be attached at the end of the post, and they came out pretty nice! Scores were relatively low, which was the intent. It was awesome to see the 1st and 2nd place teams doing so well, achieving incredible scores, but everyone who took the test definitely deserves immense commendation. Congratulations to Ward Melville on the well-earned victory (hooray for New York!! haha) and to everyone else who participated on an awesome job.

Quick stats:
Average of 41.0779 (30.89%)
Median of 41.375 (31.11%)
Standard deviation of 15.14

On the test itself, teams did well in the general geology and map interpretation questions. This was awesome to see, since I feel like map interpretation is a part of the event some supervisors forget about sometimes. I also placed a good bit of the free response on fold geometry, and it was nice to see teams doing well there.

I received some feedback that the map was illegible. I'm sorry to hear this; it passed my checks when I added it to the test, as even the tiniest numbers were easily legible in the PDF attached to the exam. The map was also deemed easily readable by the 5 people I asked before the competition (and 2 afterwards), so hearing that teams had trouble reading it is frustrating. Going forward, I'll do my best to select maps that are unambiguously readable. Again, my apologies.

I also made an entire section devoted to figuring out the orientation of the stresses that acted on faults using a stereonet, something I was definitely not familiar with in high school, so seeing some teams make progress on that section was amazing! I understand the difficulty of doing stereonet problems on a computer, so if that hindered you, my apologies again. I would recommend investing in a paper stereonet and some tracing paper so you can simply replicate the problem on paper when you need to, since that is almost certainly easier.
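For teams practicing digitally, the plane-to-pole conversion that underlies these stereonet problems is easy to script. This is a generic structural-geology helper, not a question from the test: under the right-hand rule (dip direction = strike + 90°), the pole to a plane plots at trend = strike − 90° (mod 360°) and plunge = 90° − dip.

```python
def pole_of_plane(strike, dip):
    """Return (trend, plunge) in degrees of the pole to a plane,
    given its right-hand-rule strike and dip in degrees."""
    return ((strike - 90) % 360, 90 - dip)

# A plane striking 030 and dipping 45 (dip direction to the SE)
# has its pole at trend 300, plunge 45.
print(pole_of_plane(30, 45))  # (300, 45)
```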

Final Thoughts

This was a super fun event to run, and since the Scilympiad system has been getting constantly updated and improved, the interfaces were a lot smoother than a few months back, which made event supervising a lot simpler. Once again, it was awesome to hear such praise for the test, and I'm very thankful for all of it.

If you want to leave feedback for my test (which I would appreciate a great deal!), feel free to do so at this link, and give as much or as little as you want! It's always helpful to hear what competitors thought of the exam, for next time of course! It would be really appreciated, and I hope to get more responses (thank you to everyone who already submitted!). My contact information is included too, so feel free to reach out with requests for help on problems, or just to talk.

The funny answers this time were incredible, and BEARSO's were a tough act to follow. Everyone who submitted a funny answer, though I couldn't leave a response there, earned my undying gratitude, as it all helped to make grading a little bit more fun. That little bit can make all the difference!

Huge thanks to malikaow1004 and ArchdragoonKaela for being such awesome team reps for Clark and for answering all my questions regarding event supervising and tournament procedure, and being so kind. They, along with the rest of the Clark team, deserve immense thanks for being such amazing hosts! (also random side note but Clark's team website is soooooo nice)

And of course, big big thanks to my fellow event supervisors (sophisSyo, BennyTheJett, Lumosityfan, RobertYL, Nydauron, and so many others lol) for making grading and supervising such a fun, collaborative effort. Being able to readily communicate with fellow supervisors was such a nice touch by the Clark squad, and it definitely made supervising a blast.

As promised, here is a link to a Google Drive folder, containing the test, key, the geologic map, and the distributions.

And if you don't feel like clicking that, here are the distributions anyway.

Thanks for participating in Geologic Mapping at SOLVI! I had a great time running it, and hope everyone enjoyed taking it. Being able to say I supervised for a competition in Las Vegas is such a cool conversation starter, and I wouldn't have had it any other way.
Attachments
SOLVI Geologic Mapping Combined Distributions.jpg
Haverford College, Class of 2024!
Former President, Kellenberg, 2018-2020
Bro. Joseph Fox, 2014-2017

Events I'm Writing in 2023: Sounds of Music, Rocks and Minerals
Events I've Written in Years Past: Geologic Mapping, Remote Sensing
Giantpants's Userpage
RobertYL
Member
Posts: 28
Joined: May 26th, 2018, 9:53 pm
Division: Grad
State: CA
Has thanked: 17 times
Been thanked: 73 times

Re: SOLVI 2020

Post by RobertYL »

Event Supervisor Review

This is probably a little late, but better now than never.

Hi everyone! I am the Astronomy event supervisor and the Machines co-event supervisor alongside Jessica Shah. I wanted to go through all of the statistics graphs (that we all love) and list some of my thoughts on each event. But first, I want to congratulate all of the teams who took my tests! Not often do event supervisors get the opportunity to have 83 competitive teams take their test, and I was fortunate enough to be one of the lucky few (so much data!).

Astronomy

Statistics:
Mean: 60.8 (30.4%)
Median: 54.5 (27.3%)
St. Dev.: 26
Max: 133 (66.5%)
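As a quick sketch of how these summary numbers are produced (the percentages are simply score over the 200-point maximum implied by the figures above), using a hypothetical list of raw scores:

```python
import statistics

MAX_POINTS = 200
scores = [133, 101, 88, 72, 54.5, 41, 30, 22]  # hypothetical raw scores

mean = statistics.mean(scores)
median = statistics.median(scores)
# Population st. dev.; sample st. dev. (statistics.stdev) is the other common choice.
stdev = statistics.pstdev(scores)

print(f"Mean: {mean:.1f} ({mean / MAX_POINTS:.1%})")
print(f"Median: {median:.2f} ({median / MAX_POINTS:.1%})")
print(f"Max: {max(scores)} ({max(scores) / MAX_POINTS:.1%})")
```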

Graphs:
Astronomy_C-Distributions.png
More in-depth statistics and graphs pertaining to sections and specific questions can be found at this link.

Thoughts:
This was my first time writing and supervising an Astronomy test solo, and everything worked out as well as it could have. During the testing period, there was a surprising lack of questions, which meant everything was going either perfectly or catastrophically wrong. Fortunately, it was the former.

Overall, I am extremely pleased with the distribution of scores. There were minimal ties and a good distribution, albeit skewed to the left, with the first and second place teams pulling well away from the trendline. I expected teams to be unable to reach all the way to the end of Section C, so I pretty much treated the test as a 150 point one, with 50 points extra for teams to practice and learn from after the exam.
  • Section A (General Knowledge) was an easier section overall, with teams scoring well in the Stellar Evolution and Galaxy sections. But as expected, the score dipped on the last Cosmology section. Surprisingly, many teams missed question A5, which asks about the elements fused in main sequence stars (only hydrogen!).
  • Section B (Deep-Sky Objects) went as expected, with high scores on recall questions (identifications, numbers, key phrases) and lower scores on more complex, application-based questions (i.e. B4c-d). I kept the number of questions low, with only 7 DSOs, since the 12 DSO questions at BEARSO led to teams spending too much time on the DSO section and missing out on the other sections.
  • Section C (Astrophysics) was where many of the top teams shone through, grabbing a lot of points from the large pool of 90. My general process when writing calculation questions is to use real-world data (generally from research papers) to better reflect realistic values and to make more unique questions than the basic plug-and-chug ones. However, this results in answers predicated on reading the graph to a certain degree of accuracy, and I saw many teams on the edge of the error range. Some teams indicated that one or two non-stimulus-based questions would be more accessible, which I agree with. These three questions were fairly involved, and that was reflected in the distribution, with the majority of teams ending up in the 0 to 15 point range. However, this may also be due to many teams being unable to reach the final section and even attempt the questions.
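A tolerance-based check is one common way such graph-reading answers get graded; this is a hypothetical sketch (the 5% window and example values are made up), not the actual rubric:

```python
def within_tolerance(answer, key, rel_tol=0.05):
    """Accept an answer if it falls within ±rel_tol (here ±5%) of the key value."""
    return abs(answer - key) <= rel_tol * abs(key)

print(within_tolerance(4.0e26, 3.9e26))  # True: within 5% of the key
print(within_tolerance(4.5e26, 3.9e26))  # False: off by more than 5%
```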
In order to give teams more options, I made an external submission option available through email (in hindsight that was a poor choice, see below). Even though only four teams used this external submission process, I hope I was able to relieve some of the stress of competing digitally. I plan on making this option available for all events I run, to keep some of the in-person testing experience alive. Let me know your thoughts in the feedback form below!

Finally, I want to thank my team of graders who helped me get through all 83 tests in a timely manner. Thank you to pb5754 and Name! I couldn't have done this without you two.

Test Folder:
The exam and all other materials can be found in this folder. At the time of this post, only the statistics document is available; everything should be uploaded and accessible within the week.

Machines

Statistics:
Mean: 59.7 (39.8%)
Median: 59 (39.3%)
St. Dev.: 22.8
Max: 109.5 (73.0%)

Graphs:
Machines_C-Distributions.png
More in-depth statistics and graphs pertaining to sections and specific questions can be found at this link.

Thoughts:
Overall, I am extremely pleased with the distribution of scores (much more balanced than BEARSO). There were minimal ties and a good distribution, with a slight grouping towards the right. As in Astronomy, the top four teams pulled away from the trendline. Exceptional performance from those teams!
  • Section A (Multiple Choice) was definitely easier, with teams scoring near the maximum of 60 points. This was the intention; I hoped it would let teams get off to a running start on the exam until they hit the FRQ portion. Sadly, there were a few errors in this section, so my apologies to the teams who got confused.
  • Section B (Free Response) was where things ramped up and where the top teams were separated. Similar to BEARSO, I wanted to write unique scenarios to evaluate, with questions that pulled from all parts of mechanics to really challenge competitors; I didn't want to reuse the basic setups that show up on every test. In addition, I always want to make sure every type of simple machine is tested, which is harder than it sounds. Teams were able to score well on the first question but were stumped on the second and fourth. Impressively, a few teams fully solved (FRQ) B2b and came up with the exact pressure-depth function. For (FRQ) B2c.iv, my main intention was to assess competitors' physics problem-solving techniques; being able to explain your process is as important as finding the answer. I plan on including these types of open-ended problems in my future exams. In all, I am pleased with the results from Section B, with at least one team scoring high on each problem.
I think the most interesting aspect of the test was (FRQ) B3, a design-based question focused on the mass ratio device. I created this question in hopes of recreating the on-site testing portion of the event that is lost in an online setting. The idea was actually suggested in feedback from BEARSO, so I was able to test it out here. Fortunately, many teams enjoyed this part of the test, and almost half of the teams submitted a design. Unfortunately, my email inbox did not enjoy it, and a Google Form submission option is sorely needed. Since this was just a trial run, I plan on making this question more involved, with additional design specifications (lever-less, perhaps?) along with error analysis from specified sources rather than the open question asked here. I hope teams are up to the task!
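The B2 setup itself isn't reproduced here, but a pressure-depth function of the kind mentioned typically reduces to the standard hydrostatic relation p(h) = p0 + ρ·g·h; a minimal sketch with hypothetical fluid values:

```python
RHO_WATER = 1000.0  # kg/m^3, fluid density (hypothetical: fresh water)
G = 9.81            # m/s^2, gravitational acceleration
P_ATM = 101325.0    # Pa, atmospheric pressure at the surface

def pressure_at_depth(h, rho=RHO_WATER, p0=P_ATM):
    """Absolute pressure in Pa at depth h (meters) in a fluid of density rho."""
    return p0 + rho * G * h

print(pressure_at_depth(10.0))  # pressure at 10 m of water depth
```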

Finally, I want to thank my co-ES, Jessica, for being a great partner and making the event run so smoothly.

Test Folder:
The exam and all other materials can be found in this folder. At the time of this post, only the statistics document is available; everything should be uploaded and accessible within the week.

Test Feedback

If you have feedback for either test, feel free to leave it here! I would appreciate it a ton, since feedback helps a lot with gauging what I need to adjust in my tests. The test codes are as follows:
  • Astronomy: 2021SOLVI-AstronomyC-Spin
  • Machines: 2021SOLVI-MachinesC-Shear