Can web design affect a show’s success at Fringe?

Rating systems are tricky things, especially when they allow anybody to submit a rating. Because people often take them at face value, rating systems carry a lot of weight, so it’s important to consider this when designing one. This year the Winnipeg Fringe Festival launched a supplementary website called fringe.exchange, a place where theatre patrons can write reviews and read reviews by other patrons.

On the side of the website there is a list of the top-rated shows by patrons. One thing to notice is that a show needs a minimum of 2 ratings before it will appear on this list. How the number 2 was decided on is another matter, but it’s important to understand why a minimum is needed. Without a minimum it would be easy to assume a show is brilliant because it has received 10 hearts and miss the fact that only one person has rated it so far. Likewise, someone could get the impression that a show with one heart is awful even though only one person thought so.

A casual user may simply glance at a show’s average rating before deciding whether or not to buy a ticket, and this must be taken into account when displaying averages. A bad average could drive patrons away from a good show that has one bad rating, and a good average could lead to an increase in ticket sales for a bad show with a single good rating. So a minimum rating requirement is essential to maintaining the trust of users, who don’t want to feel misled by the information they see on a site.
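To make the idea concrete, here is a minimal sketch of how such a top-rated list might be built. The data shape, the MIN_RATINGS constant, and the function names are all hypothetical; this is not fringe.exchange’s actual code, just an illustration of the filtering rule described above.

```typescript
interface Show {
  title: string;
  ratings: number[]; // each entry is one patron's rating, e.g. 1–5 hearts
}

// Hypothetical threshold: a show needs at least this many ratings
// before it is eligible for the top-rated list.
const MIN_RATINGS = 2;

function average(ratings: number[]): number {
  return ratings.reduce((sum, r) => sum + r, 0) / ratings.length;
}

// Only shows that meet the minimum are ranked, so a single enthusiastic
// (or hostile) patron cannot put a show at the top or bottom by themselves.
function topRatedShows(shows: Show[]): Show[] {
  return shows
    .filter((show) => show.ratings.length >= MIN_RATINGS)
    .sort((a, b) => average(b.ratings) - average(a.ratings));
}
```

Under this rule, a show with a single 5-heart rating is left off the list entirely rather than displayed with a perfect average, which is exactly the protection the minimum is meant to provide.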


The Winnipeg Free Press reviews many of the Fringe shows with their own team of reporters. They also let readers submit their own ratings and compare how the public’s rating differs from the critic’s.

The Average Reader Rating gives no indication of how many readers have rated the show, but that is not my focus here. The Winnipeg Free Press rating system has a much larger flaw that may lead users to give plays completely inaccurate ratings. Here is an example of someone rating a show two stars:

A user, intending to give a play 2 stars, clicks two stars, as one would expect. But when the user then moves to click the RATE IT button, every star they mouse over on the way to the button is also filled in. A user may not notice in time to go back and ensure their intended rating is recorded. In this case the user erroneously gives the show a 5-star rating rather than the 2 stars they intended.

This is another area where web design can affect whether someone wants to see a show or skip it for another. If a user expects their rating to stick after they click a star, a behavior that is common on websites such as IMDB, Facebook and Google Maps, they may give a show an incorrect rating when the website does not meet that expectation.
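A common way to avoid this problem is to treat mouse-over as a preview only and to commit the rating on click, reverting the display to the committed value when the pointer leaves the widget. The sketch below is a hypothetical, simplified star widget illustrating that pattern; it is not the Winnipeg Free Press’s actual code.

```typescript
// A minimal star-rating widget: hovering previews a value, but only a
// click changes the rating that will actually be submitted.
function createStarWidget(container: HTMLElement, starCount = 5): () => number {
  let committed = 0; // the rating that "sticks" — only a click changes it
  const stars: HTMLSpanElement[] = [];

  const render = (value: number) => {
    stars.forEach((star, i) => {
      star.textContent = i < value ? "★" : "☆";
    });
  };

  for (let i = 0; i < starCount; i++) {
    const star = document.createElement("span");
    star.style.cursor = "pointer";

    // Hovering only previews; it never changes the committed rating.
    star.addEventListener("mouseover", () => render(i + 1));

    // Clicking commits the rating, matching what users expect from
    // sites like IMDB or Google Maps.
    star.addEventListener("click", () => {
      committed = i + 1;
      render(committed);
    });

    stars.push(star);
    container.appendChild(star);
  }

  // Leaving the widget reverts the display to the committed value, so
  // mousing over stars on the way to a "RATE IT" button cannot silently
  // change the rating that gets submitted.
  container.addEventListener("mouseleave", () => render(committed));

  render(committed);
  return () => committed; // the submit handler reads the committed value
}
```

With this structure, the RATE IT button would read the committed value at submit time, so any stars the pointer crosses on the way to the button have no effect on the recorded rating.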

When a disconnect exists between how a user expects a site to behave and how a web designer expects a user to behave, unintended consequences may occur. This is especially important to consider because performers rely on word of mouth to build an audience, and they may be negatively affected by how rating systems obtain and display ratings from patrons.