fanjiatian wrote: IMO, not being able to update scores in real-time and allow multiple people to input scores greatly slows productivity and increases room for error.
For what it's worth, I don't have any issue with people using other programs / developing other software. I think it's great some people are willing to contribute to the SO ecosystem in this way and I'll be the first to advocate for using a truly excellent system.
At nationals we generally use my system, but we've used Avogadro a few times in recent years. We'll be using Ezra next year since it's at Cornell. As the person ultimately responsible for ensuring scoring is as effective and error-free as possible, I do take issue with the statement I've quoted above. The main thing I emphasize over and over again is that we have a best-practices PROCESS that is very important to follow (and is documented in this file available on the national website:
https://www.dropbox.com/s/8n8i0ojswlk7n ... 1.pdf?dl=0 ). The actual tools used don't really matter (I happen to show my system in the guide, but it's easily adapted to other systems).
The key thing is we WANT to slow certain things down and have multiple sets of eyes cross-checking things at various steps. I've either run or been involved in scoring at about half of the National Tournaments over the ~3-decade history of SO, and I can tell so many stories of mistakes, errors, and weird things that have cropped up. Almost all of the steps you see in our scoring guidelines are direct results of situations we've encountered in the past.
The biggest concern I have is that far too many people want to 'streamline' everything and get to the point of scores essentially going straight from the event supervisors / volunteers to the awards ceremony. That's a big mistake in my opinion, and I'd strongly advocate against it! You have to have objective, independent score counselors checking what the event supervisors do; otherwise, mistakes are bound to creep in.
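To illustrate the kind of independent cross-check I'm talking about (this is just a hypothetical sketch, not the actual national scoring workflow or any of the tools mentioned above): two people enter the same event's results separately, and a simple comparison flags every disagreement for a score counselor to resolve before anything moves toward awards.

```python
# Hypothetical double-entry cross-check: two volunteers independently
# enter the same event's placements; any disagreement between the two
# entry sheets is flagged for a score counselor to resolve by hand.

def cross_check(entry_a: dict, entry_b: dict) -> list:
    """Return a list of (team, value_a, value_b) discrepancies."""
    discrepancies = []
    # Walk the union of teams so an omission on either sheet is caught too.
    for team in sorted(set(entry_a) | set(entry_b)):
        a = entry_a.get(team)  # None means the team is missing from this sheet
        b = entry_b.get(team)
        if a != b:
            discrepancies.append((team, a, b))
    return discrepancies

# Example: the two entries disagree on team "B-12", and one omits "C-03".
sheet_1 = {"A-07": 1, "B-12": 2, "C-03": 3}
sheet_2 = {"A-07": 1, "B-12": 3}
for team, a, b in cross_check(sheet_1, sheet_2):
    print(f"CHECK {team}: first entry={a}, second entry={b}")
```

The point isn't the code, it's the structure: the two entries are made independently, and nothing advances until the discrepancy list is empty.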