Poster Session

The virtual poster session is scheduled for 11:00-13:00 on Saturday, Oct. 9, 2021. The poster session also serves as a social networking opportunity. We invite submissions from everyone working on sports analytics, especially students.

Each poster will be presented in an individual virtual room during the poster session (11:00-13:00) and will also appear in a Twitter gallery before the event. Viewers at the event can enter and leave virtual rooms of their choice, just as they would visit posters at an in-person poster session. Each virtual room will be assisted by a student volunteer.

Submission

Submissions are now closed.

Student Poster Award

  • Posters submitted by students are entered into the Student Poster Award competition.
  • The poster review committee scores the student posters (see criteria below).
  • The Student Poster Award will be presented at the closing ceremony.

Student Poster Evaluation Criteria

The evaluation should focus on four key points:

  • Motivation: a clearly presented answer to “Why do we care?”
  • Innovation: a clearly presented answer to “Why is this new or different?”
  • Execution: a clearly presented answer to “How well did they succeed?”
  • Contribution: a clearly presented answer to “What did they leave behind?”

Each of those four facets is worth 10 “points”, for 40 points total. Scores for each facet are “free form”, but all four must be considered independently. Some aspects to consider for each of the above:

  • Motivation:
    • Is it clear what problem is being tackled?
    • Is the problem “important” to the community?
    • If so, is it clear from the poster why it’s an important consideration?
  • Innovation:
    • Is something novel? (The approach, the results, the problem space, etc.? Not all need to be, but something should be!)
    • Is it clearly stated how this work fits into the broader context of what came before?
    • Are potential limitations presented openly?
  • Execution:
    • Is it clear what the result of the work is (regardless of its significance or “success”)?
    • Does this result improve the state of the art? (Note: validating prior work, aggregating previous results, etc. certainly qualifies!)
    • Are the “results” validated somehow, and do you “trust” them?
  • Contribution:
    • Is there some artifact “left behind”? (Source code is best, but web apps, etc. qualify.)
    • Could you validate or reproduce this work if you wanted to?
    • Do you feel the poster and leave-behinds help build and foster further engagement? (I.e., are they good teaching tools, or do they have other hooks that will persist past the conference?)