Disclaimer: A lot of this was written as it entered my head. My stream of consciousness isn’t necessarily logical. If you can follow then kudos.
Last time I looked at replacement level performance and determined that the Premiership teams are essentially fighting for a share of the 410 ‘non replacement level’ points that are scored each season (link). Here I’m going to have a stab at narrowing this down to determine just how many points the best and worst players are worth. This time I’m going to do it using points; at some point in the future I’ll use shots and see if I come up with a similar answer.
We know that each season there are a total of ~800,000 ‘player minutes’ played*. The ~400 ‘non replacement level’ points I mentioned in the previous paragraph must somehow be divided between these 800,000 minutes. Divide the points by minutes and we find that an average Premiership player is worth ~ 0.0005 PAR per minute, or 1.80 PAR per season**.
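The arithmetic above can be sketched in a few lines. This is just a restatement of the figures from the text and the footnotes (22 players on the pitch, 380 games a season, 95 minutes a game, ~400 non-replacement points); nothing here is new data.

```python
# Per-minute and per-season PAR, using the figures from the text.
PLAYERS_ON_PITCH = 22        # both teams combined
GAMES_PER_SEASON = 380       # league-wide
GAMES_PER_TEAM = 38          # what one ever-present player actually plays
MINUTES_PER_GAME = 95        # per the footnote

player_minutes = PLAYERS_ON_PITCH * GAMES_PER_SEASON * MINUTES_PER_GAME
non_replacement_points = 400  # ~410 from the previous post, rounded

par_per_minute = non_replacement_points / player_minutes

# A player who plays every minute of the season accumulates:
par_per_season = par_per_minute * GAMES_PER_TEAM * MINUTES_PER_GAME

print(player_minutes)              # ~794,200 player minutes
print(round(par_per_minute, 4))    # ~0.0005 PAR per minute
print(round(par_per_season, 2))    # ~1.8 PAR per season
```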
The best players
The best Premiership team ever, in terms of points, was Chelsea in 2004-05, scoring 95 points (~63 PAR). Let’s imagine a theoretical scenario whereby, for every minute of that season, the eleven Chelsea players on the pitch were both a) equally talented and b) the best players in the league. In this scenario each player contributed equally to the PAR earned by Chelsea that season and so the PAR earned by Chelsea should be distributed equally amongst those players.
63 PAR / 11 players ~ 6 PAR / player
In reality the players were neither all equally talented nor all the best in the league, and as such some players must have been worth more than 6 PAR. Thus this value serves as the lower bound for the PAR of the best player in the league.
The step to determine the upper bound is not perfect but I think it’s justifiable nonetheless (in my head it works as a logical jump). Let’s assume that ten of the players on the ’04-05 Chelsea team were exactly league average and that the extra PAR they earned was entirely due to one player. That one player would thus be worth:
63 PAR – (10 average players x 1.80 PAR per player) ~ 45 PAR
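Both bounds for the best player follow from the same two lines of arithmetic. A minimal sketch, using the Chelsea 2004-05 figures from the text:

```python
# Bounding the value of the best player, per the Chelsea 2004-05 example.
chelsea_par = 63            # 95 points, ~63 above replacement
players = 11
avg_par_per_player = 1.80   # an ever-present league-average player

# Lower bound: all eleven equally good, so the PAR is shared equally.
lower_bound = chelsea_par / players                    # ~5.7, i.e. ~6 PAR

# Upper bound: ten league-average players, one star takes the rest.
upper_bound = chelsea_par - 10 * avg_par_per_player    # 45 PAR

print(round(lower_bound), upper_bound)
```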
The worst players
This time we’ll use the 11 points scored by Derby in 2007-08. They finished that season 21 points below replacement, a remarkable feat achieved through a combination of incompetence and bad luck. Following the same logic as earlier, let’s assume that the players on the pitch were equally talented and we can distribute the (negative) PAR between them.
-21 PAR / 11 players ~ -2 PAR / player
Thus the ‘best’ that the worst player can be is -2 PAR. In reality, as earlier, this isn’t the case and the worst players will actually be worse than this.
As before, to determine the outer bound of their value, I’ll assume that ten of the players on the team were exactly league average and that the (lack of) performance was down to one player.
In this case that player is worth:
-21 PAR – (10 average players x 1.80 PAR per player) ~ -39 PAR
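The same sketch as before works for the worst player; only the sign of the team total changes. Using the Derby 2007-08 figures from the text:

```python
# Bounding the value of the worst player, per the Derby 2007-08 example.
derby_par = -21             # 11 points, 21 below replacement
players = 11
avg_par_per_player = 1.80   # an ever-present league-average player

# Best case: the blame is shared equally amongst all eleven.
upper_bound = derby_par / players                      # ~-1.9, i.e. ~-2 PAR

# Worst case: ten average players, one player responsible for the lot.
lower_bound = derby_par - 10 * avg_par_per_player      # -39 PAR

print(round(upper_bound), lower_bound)
```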
All of this leaves us with a pair of inequalities.
The value of the best player in the league is therefore summarised as: 6 < PAR < 45
The value of the worst player in the league is therefore summarised as: -39 < PAR < -2
Now, the ranges here are massive, but the point of this isn’t to give a specific value; it’s more of a guide. Basically, if this reasoning is taken to its logical conclusion and results in a ratings system, then the values for the best and worst players should fall within these ranges. If they fall outside these bounds then there’s a good chance that, somewhere along the line, something’s been screwed up.
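That sanity check is easy to automate. A hypothetical helper for a future ratings system (the function name and structure are mine, not anything from the previous posts): any single player's seasonal value should sit between the worst player's lower bound and the best player's upper bound.

```python
# Derived bounds from the Chelsea and Derby examples above.
BEST_PLAYER_RANGE = (6, 45)     # lower/upper bound for the league's best
WORST_PLAYER_RANGE = (-39, -2)  # lower/upper bound for the league's worst

def par_is_plausible(par: float) -> bool:
    """Hypothetical sanity check: a single player's seasonal PAR should
    never exceed the best-player upper bound or undercut the
    worst-player lower bound."""
    return WORST_PLAYER_RANGE[0] <= par <= BEST_PLAYER_RANGE[1]

print(par_is_plausible(20))   # a strong but believable season
print(par_is_plausible(60))   # better than any plausible best player
```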
I think I could make a pretty good guess at where the best and worst players lie within these ranges. The question is how best to use our knowledge to narrow the range?
*22 players x 380 games per season x 95 minutes per game = 794,200 player minutes per season
**presuming a player plays every minute of the season