Ludum Dare 21 Results!

Posted by (twitter: @ludumdare)
September 12th, 2011 7:00 pm

It’s that time! Three weeks and a whopping 599 games later, here are the results:

Top 50 Games

Due to our HUGE recent increase in submissions, we’ve bumped the top 20 to a top 50. Check out the best competition games here:

Compo Top 50:

Winners are decided by the Overall category. In addition to the top 50 compo games (solo, stricter rules), here are the top 50 jam games (solo and teams, relaxed rules):

Jam Top 50:

Congratulations to all the winners.

NEW: The lists have gotten so big lately. So to keep the site fast and snappy during the heavy loads (events and results), we had to truncate them at 50. Don’t worry though, you can still see your individual category ratings on your game’s page.

Categorical Top 25s

Here at Ludum Dare, being the best game isn’t the only way to win. Games are rated in 7 additional categories, with a special “Coolness” category highlighting people who went above and beyond to make sure your game got votes.

Categorical Top 25s:

(And for the press, a shorter Top 5s list is available here)

*NOTE*: You can click on the titles of the categories for Top 50 style lists per category.

More Ludum Dare 21 links

Keynote!, with Breakdance McFunkypants and special guest Sos
Theme Voting Results
Post Event Post
Wallpaper of all 599 games, by ExciteMike

Interesting Tags: montage, motivation, foodphoto, food, deskphoto, desk, timelapse

October Challenge 2011!

Yes, we’re doing it again this year. Details about the upcoming event will be posted soon.

In summary, yes, basically the same thing as last year. Go make money. 😀

Ludum Dare 22 – Coming December 2011!!

Stop by again this December for our next regularly scheduled event. We’ll try to have a date nailed down a month or two ahead of time. Don’t forget the mailing list, and Twitter.

September Mini LD, hosted by increpare

Still got that Ludum Dare fever?

Tune in Friday for a brand new Mini LD event hosted by increpare. Unfamiliar with Mini LDs? It’s like a regular LD without the weeks of voting (and waiting).

A Busy Busy September

Mini LD isn’t the only thing going on this weekend. Breakdance McFunkypants has posted a comprehensive list of 7 game jams going on this weekend (and/or ending/crossing this weekend). Check it out.

Don’t let the URL fool you. There was more going on than initially thought. :)


If you have any suggestions for us (website, observations, etc), we continue to collect them in the comments here:

Thanks everyone for coming out and making Ludum Dare 21 such a HUGE success! We hope to see you again soon!

– Mike Kasprzak (PoV)

36 Responses to “Ludum Dare 21 Results!”

  1. ChevyRay says:

    It’d be awesome if we could somehow sync these jams together or something, like so we can chat with each other and all announce our shit at once or something.

    I’ll be at OrcaJam in Victoria BC this weekend, and really looking forward to it! There’ll be a few Ludum Darers there too, so it’s bound to be an awesome time.

    Either way, I’ll get everybody on IRC or livestreaming or something so we can wave at each other as we jam our pants off.

  2. digital_sorceress says:

    “The lists have gotten so big lately. So to keep the site fast and snappy during the heavy loads (events and results), we had to truncate them at 50.”

    Do you think we could have the full lists in a couple of days time, when traffic drops?

I’d like to graph the correlations between the categories, and I’d need the full list to do that. :-)

Preferably just tabulated text with no images or links (i.e., rows = users, columns = category scores). It doesn’t even need to be dynamically generated – generate it once and save the table as plain HTML.

  3. RichMakeGame says:

    how many games were there in the compo vs the jam?

  4. Vlad says:

Is it possible to see what votes you got on your entry, like it was before? It was a really helpful option…

    • PoV says:

I dunno if it was particularly helpful. It made anyone who got an all-1’s rating mad, and that was most people. For the ratings, we now remove the best and worst rating and average the rest to get the results. Still, I’ll talk with Phil about showing scores once he gets back from Denver.
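(In code, the trimming described above amounts to something like this sketch; the site’s actual implementation isn’t shown here, so this is just an illustration of the idea.)

```python
def trimmed_mean(ratings):
    """Average after dropping the single best and single worst
    rating, as described above. With fewer than three ratings
    there is nothing to trim, so fall back to a plain mean."""
    if len(ratings) < 3:
        return sum(ratings) / len(ratings)
    trimmed = sorted(ratings)[1:-1]
    return sum(trimmed) / len(trimmed)

# A stray troll 1 and a friendly 5 no longer move the score:
print(trimmed_mean([1, 4, 4, 4, 5]))  # 4.0
```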

      • Felipe Budinich says:

        “we remove the best and worst rating and average the rest to get the results now”

While it’s better than just averaging everything, *removing* votes isn’t such a good idea. I would use a weighted mean, ranking each value by mode: give more weight to the value that occurs most often, and use the range as a tiebreaker.

        Something like this:

        Value weight ranked by Mode:
        A.- Value 45% weight
        B.- Value 25% weight
        C.- Value 15% weight
        D.- Value 10% weight
        E.- Value 5% weight

        A + B + C + D + E = Total Score

        So if someone gets:

        1 Star = 1 Vote
        2 Star = 3 Votes
        3 Star = 5 Votes
        4 Star = 4 Votes
        5 Star = 2 Votes

It would get a score of: (3*0.45) + (4*0.25) + (2*0.15) + (5*0.1) + (1*0.05) = 3.2

        Funny thing, if you average you get the same result, maybe I should skew the weights a little more, but you get the idea.
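In code, the weighting scheme above looks something like this (a sketch; the range tiebreaker is left out, and Python is used purely for illustration):

```python
def mode_weighted_score(counts):
    """Rank star values by how many votes each received (most
    frequent first) and apply the fixed 45/25/15/10/5% weights
    proposed above. The range tiebreaker is not implemented here."""
    weights = [0.45, 0.25, 0.15, 0.10, 0.05]
    # Sort star values by vote count, most-voted value first.
    ranked = sorted(counts, key=counts.get, reverse=True)
    return sum(w * value for w, value in zip(weights, ranked))

# The example from the comment: ranked by frequency -> 3, 4, 2, 5, 1
counts = {1: 1, 2: 3, 3: 5, 4: 4, 5: 2}
print(mode_weighted_score(counts))  # ~3.2, the same as the plain mean in this case
```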

        • digital_sorceress says:

          I don’t like the idea of removing votes. Removing the best and worst means that nearly 1200 votes have been thrown away!

          But I don’t like the idea of weighting votes either. The weights are chosen arbitrarily, and by nudging those weights slightly, the top 20 could be made to look quite a bit different.

          And this means that the top 20 rankings would be equally arbitrary.

          • Felipe Budinich says:

Yes, I’m aware that it would be just as arbitrary, but it’s an improvement over just removing votes; if someone draws some 1’s or 5’s, there must be something to it.

        • Kvisle says:

          Removing the votes that say “1-1-1-1-1-1” or “5-5-5-5-5-5” is OK with me – those are never serious votes anyway.

  5. hdon says:

    Still no distinction between Flash and Web-standards games in the listing of submissions…

  6. Vladp995 says:

    PoV look here!!
..Anyway, I think the voting is unbalanced, because most people aren’t playing all the games (the average is 20 games.. the total is 600!!!). Therefore a “good” game can get a lower score than a “bad” game just because the two were judged by different people (think what happens when one person judges only 5 “good” games and another person only 5 “bad” games, given the 5-star limit). That couldn’t happen if both of them played all 10 games at once..
I’m saying this because I saw many games that I think deserved a much better place than they got, and I’m a little disappointed..
So maybe next time you should give more freedom to the people judging the games and raise the limit to 100 (like Coolness), because it’s really hard to give a rating within the 1-5 star limit.. (and there are so many different games). I think upgrading the voting system that way would be great, and many would be satisfied.
Or, as my friend told me: usually when judging, half of the final score comes from the judges and the rest from the voters.. and since there is almost always someone who judges all the games, you could choose a team of judges to decide half of the final score for each game and leave the rest as it is. It probably won’t work, but anyway..

    Hope you’ll consider this..

    Please support me if you agree.

    • digital_sorceress says:

>>> I’m saying this because I saw many games that I think deserved a much better place than they got, and I’m a little disappointed.

      Just because you think a game deserves more does not mean that all people will think that. We all have different tastes.

>>> usually when judging, half of the final score comes from the judges and the rest from the voters.

      LD is a community driven event, and I don’t feel that an elite panel of judges would fit with that.

But it’s inevitable that the people casting the most votes will skew the results towards their own tastes more so than the people who vote little. Short of compelling people to vote, or capping the number of games a person can rate, there’s little that can be done about that.

      Having said that, I do have some ideas on how the rating system can be improved, but not in the ways you are thinking about. :)

    • Felipe Budinich says:

      I only agree that the granularity of the vote should be expanded, from 1 to 10 maybe, but judges? naaaaah

    • Vladp995 says:

That’s why I said it probably won’t work.. but expanding the vote range is still a very good idea!

      • digital_sorceress says:

        When scoring goes from 1-10, the scale becomes distorted as 7 is commonly considered to be average. This makes discrimination poor at the top end (8-10 has width 3) compared to the lower end (0-6 has width 7). That’s a non-linear calibration, making it easier to drag a game down than pull it up, so calculated averages don’t mean as much.

Up to 7 stars would be okay (as everyone would agree that 4 stars is an average score), but I definitely wouldn’t stretch it further than that.

        Another option is to add helper words after the stars when you click on them, so that everyone perceives scores in the same way.

        So when you click x number of stars it says :
*------ (fail, for games that don’t work)
**----- (poor, for games that are well below average; no more than 10% of games should go in here)
***---- (below average)
****--- (average; as many games should be above this as below it)
*****-- (above average, for games that stand out a little from the crowd)
******- (excellent, for games that strongly stand out from the crowd; the top 10%)
******* (A+, for games that are really something special)

        • Felipe Budinich says:

          This I like, good idea. (tooltips + scale of 7)

        • johnfn says:

I think the best way to do this isn’t to ask users to normalize their votes, but to force their votes to be normalized. This is pretty hard to do without giving users a bigger space of votes than just the 1-5 we currently have.

          • digital_sorceress says:

            Do you mean like an x->ax+b function to shift the mean and standard deviation of the scores that each person awards?

            If so, it will introduce as many problems as it attempts to fix.

            Not everyone is able to play all entries (by which I mean they don’t have access to all operating systems)

            Yet there may be correlations between operating systems and game quality.

            So normalising scores risks having the effect of artificially eliminating those correlations from the scoring.

The same sort of phenomenon can occur with other things, such as when a person’s computer is older and won’t run Unity, or HTML5, or whatever. Unity may produce the most awesome games, but that person will never get to play them or rate them, so their ratings are expected to be slightly below average.
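For what it’s worth, the x -> ax + b rescaling mentioned above would look roughly like this per-voter standardization (purely illustrative; the site does nothing of the sort):

```python
import statistics

def normalize(voter_scores):
    """Per-voter affine rescale (the x -> ax + b idea): shift one
    voter's scores to mean 0 and spread 1 before aggregating."""
    mu = statistics.mean(voter_scores)
    sigma = statistics.pstdev(voter_scores)
    if sigma == 0:
        # A voter who gave everything the same score carries no signal.
        return [0.0 for _ in voter_scores]
    return [(s - mu) / sigma for s in voter_scores]

# A harsh voter and a generous voter who agree on the ranking end up
# with identical normalized scores:
print(normalize([1, 2, 3]) == normalize([3, 4, 5]))  # True
```

Which is exactly why it can erase real correlations: any systematic difference between two voters’ score levels is rescaled away, whether it was bias or signal.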

  7. Shadow says:

This is a bit too crazy of an idea, but it’s the best thing I can come up with at 4 AM while feeling sick as hell…

What about using some sort of pool for voting? I mean having a “selection” of games that need votes and only allowing those to be voted on. Initially, all games would be available for voting. Once a game receives a vote, it can’t receive any further score (i.e. it is “removed from the pool of choices”) until every single game has 1 vote. This will happen for every game, so the selection narrows until every game has a vote; then all games become available again for a second round of votes.

YES. Games will have their scores based on a ridiculously low number of votes (as all the “voting power” of the community will be forcefully distributed among all games), but that is still better than having games with hundreds of votes while others have 1.. or even 0.

    NO. It won’t help with vote-trolling and won’t deal with unfair rating. Also, it will make the process more frustrating to people, because they won’t be able to vote for the game they want unless it’s available for voting.

    Games will remain playable at all times though. Just the vote submission would be affected.
Anyway, perhaps the idea needs a little polish. Perhaps it’s trash. I don’t know.
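The pool rule above reduces to something very small in code (a sketch; the function name and data shapes are made up for illustration):

```python
def next_votable(games, vote_counts):
    """Only the games tied for the fewest votes so far are open
    for rating, so vote coverage stays even across all entries."""
    fewest = min(vote_counts[g] for g in games)
    return [g for g in games if vote_counts[g] == fewest]

# With one game already rated, only the un-rated ones are votable:
print(next_votable(['a', 'b', 'c'], {'a': 1, 'b': 0, 'c': 0}))  # ['b', 'c']
```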

    • digital_sorceress says:

      With this idea, there’s a problem if a person makes a game for an obscure platform, like on their AtariXL.

There may not be enough people who could play it, and the voting would grind to a halt.

      • Shadow says:

Well, yeah. But that raises an interesting point. If you are making a game for an obscure platform, you can’t really expect to be voted on and reviewed as massively as, let’s say, a web game. So *perhaps* the “pool” idea can still work if you make it per platform, i.e. Windows, Linux, Mac, Web, Other, with each category having its own pool of games for voting. Of course, if a game is multi-platform, it should be removed from all the platform pools once it receives a vote in any of them.

Anyway, I still think it’s a little bit too crazy to actually be practical.

  8. Shadow says:

Another alternative (I just thought of it) would be to reward players with points after they have rated a game. The fewer votes a game has (compared to other games, perhaps on a per-platform basis), the more points it grants to the user who rates it. After the compo ends, a “high score” list would be displayed with the top voters (this would serve as motivation to rate games, and since users will try to reach a high score, they will tend to rate games with fewer votes to obtain more points). It would be like a game. Actually, it’s like an enhancement of the “medals” thing.
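One possible point formula for this (my own assumption, not anything from the post): make the reward scale with how under-voted the game is.

```python
def rating_points(votes_on_game, most_voted):
    """Hypothetical reward formula: the fewer votes a game has
    relative to the most-rated game, the more points rating it earns."""
    return 1 + (most_voted - votes_on_game)

print(rating_points(0, 100))    # 101: big payoff for a game nobody has rated yet
print(rating_points(100, 100))  # 1: minimal payoff for the most-rated game
```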

    • digital_sorceress says:

      I’m not sure that high scores motivate people in quite that way.

Computer games can be difficult to master, so as a player, your score measures how elite you are. People like to get high scores because they can take pride in being elite.

Rating games is a time sink more than anything else. Scores would only measure time invested, not skill or eliteness. So I don’t think people would derive much esteem from it, and I don’t expect that scores for rating games would motivate people to do so.
