NeoLemmix used to
Really? What was the system?
Instead, the question I would ask is: does a scoring system need to be watertight? ... my point is, if you want to foster competition for scores, all you need is for a scoring system to exist and for it to have leaderboards
It doesn't need to be watertight, agreed. But it does need to be meaningful, otherwise players won't be interested in trying to improve it. Leaderboards can be a good way to encourage a bit of light competition, and they give a player an idea of their skill level as well (which can be as demotivating as it is compelling). Which brings me to Simon's question:
What is the use case?
Probably, a player wanting to get an idea of how well they've performed on a particular level; not necessarily as compared to other people, but according to the level's expectations/standards.
I think what I'm realising is that these expectations and standards evolve over time, and so are not possible to quantify at the outset. Even if we could assign a "maximum possible score" to a level, this would present the same problem as a leaderboard: if a player completes the level but falls way short of the maximum possible score, it could be demotivating.
Ideally, a score system should feel rewarding, motivating, and compel the player to come back and try again for better. I suppose that depends massively on the individual player, though.
Does the score have to be an integer?
Not necessarily. S, A, B, C etc. work nicely in many games to give players a bit of performance feedback.
surely the only metric worth measuring is "how many skills did the player use?"
Agreed; NL reports fewest skills, and SLX now displays it on the postview screen. It's an interesting thing to try and improve upon.
I suppose the downside of a scoring system is that the player doesn't necessarily know what contributed to the score. The individual stats definitely work better in this regard, and would still very much need to be prominently tracked and displayed.
Saving one skill on "No added colours or lemmings" ... is much more impressive than finding the solution that saves a builder on "With a twist of lemming, please". But how much more impressive? Does the question even make sense?
Suppose we have two replays, both save 100% of lemmings.
- Case A: This replay uses only 3 skill assignments, but takes 5 minutes to complete.
- Case B: This replay uses 25 skill assignments, but is completed in 45 seconds.
Which of these should be awarded more points? Which is more impressive?
I suppose that's the unanswerable question of Lemmings, and why this topic is intriguing.
We need a way of measuring "The L Factor" - how interesting, unique, clever, unexpected, innovative, and perhaps above all how elegant (with a capital "L") a solution is. Basically, how cool does it look in a replay?
Score = (%Lemmings saved * 1000) - (Number of skills used * 10) - (Number of frames elapsed in replay)
...
Thus, players are incentivised to save as many lemmings as possible, whilst minimising both skill usage and time taken.
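As a rough sketch of how that formula plays out against Cases A and B from earlier (the 17 physics frames per second is an assumption, and %saved is treated as an integer 0-100):

```python
def score(percent_saved, skills_used, frames_elapsed):
    """Proposed formula: (%saved * 1000) - (skills * 10) - frames."""
    return percent_saved * 1000 - skills_used * 10 - frames_elapsed

# Case A: 100% saved, 3 skills, 5 minutes (at an assumed 17 fps)
case_a = score(100, 3, 5 * 60 * 17)   # 100000 - 30 - 5100 = 94870

# Case B: 100% saved, 25 skills, 45 seconds
case_b = score(100, 25, 45 * 17)      # 100000 - 250 - 765 = 98985
```

Interestingly, under this weighting Case B comes out ahead, because the flat per-frame penalty swamps the per-skill penalty on long solutions.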
Agreed.
Perhaps it would be most appropriate to gather this score from different playthroughs of the same level; then we can reward both the effort that saves all lemmings but uses a million skills, and the one that saves only 50% but uses just 1 skill. As long as the same player did both, their overall score would reflect this.
In NL/SLX, we already have ways to track a player's best performance fairly comprehensively in multiple game elements (maximum saved, quickest time, fewest skills, fewest skill types, fewest of each individual skill, etc.). It makes sense to make use of this data to calculate a player's score.
Such a system incentivises repeat play, and rewards finding multiple ways to solve the same level. In turn, this hopefully incentivises designers to provide multiple possible solutions as well (something I'm a huge advocate of, so I would hope any scoring system reflects this).
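A minimal sketch of the "sum of best" idea - combining the best tracked stat from each playthrough rather than scoring any single run (the record fields and weights here are hypothetical placeholders, not NL/SLX's actual tracking format):

```python
def sum_of_best_score(records):
    """Combine a player's best stats across all playthroughs of a level.
    Each record is a dict of per-run results; keys and weights are
    placeholders for illustration."""
    best_saved  = max(r["percent_saved"] for r in records)
    best_skills = min(r["skills_used"] for r in records)
    best_frames = min(r["frames"] for r in records)
    return best_saved * 1000 - best_skills * 10 - best_frames

plays = [
    {"percent_saved": 100, "skills_used": 30, "frames": 2000},  # saves all, many skills
    {"percent_saved": 50,  "skills_used": 1,  "frames": 900},   # 1-skill partial save
]
combined = sum_of_best_score(plays)  # takes 100%, 1 skill, 900 frames
```

Neither run alone would score this well, so the player is rewarded for both efforts.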
Perhaps we could take the square root of the number of frames? Or apply some other kind of function that gives diminishing returns?
Please elaborate!
Upon spawning into a level, a lemming has a hidden "score count" variable ...
When a lemming is saved, its score value is added to the total.
This would work excellently for calculating the score of a single playthrough, for sure. It seems important for a scoring system to take individual playthroughs into account, but this shouldn't necessarily affect a player's overall score for that level - for that, we're always interested in tracking what is essentially the "sum of best".
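A minimal sketch of the per-lemming counter idea quoted above (the accumulation rule is invented for illustration, since the original post is elided; only the "add score on save" step is taken from the quote):

```python
class Lemming:
    """Each lemming carries a hidden score counter. The starting value
    and the cost per skill assignment are hypothetical placeholders."""
    def __init__(self, start_value=100):
        self.score_count = start_value

    def on_skill_assigned(self):
        # Hypothetical rule: each skill assignment reduces the counter.
        self.score_count = max(0, self.score_count - 10)

def level_score(saved_lemmings):
    # From the quoted proposal: when a lemming is saved,
    # its score value is added to the total.
    return sum(lem.score_count for lem in saved_lemmings)

saved = [Lemming(), Lemming()]
saved[1].on_skill_assigned()
total = level_score(saved)  # 100 + 90 = 190
```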