Tuesday, 19 June 2012

Feedback and Input Welcome

In our Round 12 review (Round 12: Thursday is the new Friday), we mentioned toward the foot of the post that feedback had been coming in that our rankings were slow to react and did not reflect current form.

While we continue to review the system and the inputs to the calculations, we will take the following actions:
1) The blog and current system will continue as is until season's end, to allow for consistency in tipping and ranking.
2) Additional data will be processed and published, with adjusted calculations that better reflect the current status of teams to date.
We feel the need to proceed 'as is', as it is not practical to change the system and rules mid-stream. Instead, any changes will be considered at season's end, ready for Season 2013.
What we can do differently for the remainder of this year, though, is offer an alternate ranking-point calculation.
The additional data will be published for information, feedback, review and criticism only.
We encourage you, our loyal reader*, to consider and contribute as the remainder of the season progresses.


Option: 'Upping The Ante'
The most logical way to make the ratings more responsive to current form is to boost the weighting for each game, thereby attributing more (or fewer) points to teams after each result.
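To make the idea concrete, here is a minimal sketch of an Elo-style update with a configurable game weight. The function name, the home-advantage value, the margin-per-ratings-point factor and the clamping are all illustrative assumptions, not the blog's actual datasheet formula; the point is only that doubling the game weight doubles the points swapped for the same surprise in the margin.

```python
# Sketch only: assumed formula, not the blog's actual datasheet.
def update_ratings(home, away, actual_margin,
                   game_weight=30,       # doubled from the current 15
                   home_advantage=12.0,  # hypothetical value
                   margin_per_point=0.1):
    """Return new (home, away) ratings after one game (zero-sum swap)."""
    predicted_margin = (home - away) * margin_per_point + home_advantage
    # Positive surprise -> home side gains points.
    surprise = actual_margin - predicted_margin
    # Clamp so one blowout can't move ratings absurdly far.
    surprise = max(-100.0, min(100.0, surprise))
    delta = game_weight * surprise / 100.0
    return home + delta, away - delta
```

Because the swap is zero-sum, the total points in the competition stay constant; only the game weight controls how quickly points move between teams.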

With that method locked in at double the current rate, the following two tables are our starter for the review process.
The season would have started with the teams ranked as follows...


This itself makes for interesting reading, as:
- The grand finalists were clearly head and shoulders above anyone else.
- The top 8 teams as ranked are very much in line with the top 8 of the AFL ladder at the end of Season 2011, with two exceptions...
     ::West Coast sit lower in our estimation, as they were coming off a spoon year, and
     ::the season's 8th-placed team (by the AFL ladder), Essendon, was according to our system only the 10th best.
- The teams outside the eight also follow the sequence of the AFL table. There are some exceptions, such as Adelaide sitting higher and Melbourne lower than on the AFL table, and these are mainly due to late-season performances.


After 12 rounds, the current standings would be at left. 

In this scenario we can see:
- Hawthorn take over leadership of the ranking table.
- Collingwood and Geelong are still highly ranked, but have been heavily penalised through losses, or wins under the projected margins.
- West Coast are closing in on positions 3 and 2.
- Reward for a good season start goes to Essendon, Richmond and Adelaide (now 8th through 10th), and Port Adelaide. Note that more points have been attributed to Essendon and Richmond, as they have had tougher draws than Adelaide (i.e. played better-ranked opponents).
- Falls for North Melbourne, Gold Coast, Melbourne and the Bulldogs, who have had poor showings in 2012.
- Interestingly, the GWS Giants are holding their own under this scenario, with a modest 10pt increase to date.


Compare these adjusted figures with the numbers generated by the current system we have employed since Round 1.

* Well... 'loyal reader' may be underplaying it, as this blog gets a regular 30-40 hits each post. A modest readership, most appreciated.

8 comments:

  1. Interesting alternative.

    Perhaps you could simply incorporate a 'trend' component to your original approach that would reflect recent form (and possibly include short-term future projections). Maybe it would cover a rolling window of three rounds either side of the current game? It could be presented as a number or, more simply, a colour spectrum [e.g. red (= poor) to green (= great)].
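A rough sketch of what that suggestion might look like, assuming ratings are recorded per round: average the rating change over a rolling window, then bucket the result into the suggested red-to-green spectrum. The function names, window size and colour thresholds are all illustrative assumptions.

```python
# Sketch of the suggested 'trend' component (names and thresholds assumed).
def trend(rating_history, window=3):
    """Average per-round rating change over the last `window` rounds."""
    if len(rating_history) < 2:
        return 0.0
    recent = rating_history[-(window + 1):]
    changes = [b - a for a, b in zip(recent, recent[1:])]
    return sum(changes) / len(changes)

def trend_colour(t, scale=10.0):
    """Crude colour bucket: red = poor form, green = great form."""
    if t <= -scale:
        return "red"
    if t >= scale:
        return "green"
    return "amber"
```

Extending the window forward (the "three rounds either side" idea) would need projected results for future games, which is where the numbers would get rubbery.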

    I'm not sure if this would be difficult to implement, but it could save on a complete rejig of your methodology (which seems to be more indicative of form over the season than the alternative).

    Hope that helps!

    ReplyDelete
    Replies
    1. That's actually a good idea that I will try to implement for the second half of the year... as long as I can automate it!
      Certainly easy for past form; future projection is a bit more difficult. The numbers might get a bit rubbery! I will try though!

      As to a "complete re-jig", the proposal above is a one-number change to the datasheet, so it's fairly basic... the problem for me would be retro-fitting the new numbers into the blog, hence why I was thinking to keep the current system until year's end.

      Thanks for the feedback!

      Delete
    2. No worries!

      I think that if you use a trend (similar to the 'three game rolling average' used by Virtual Sports in their numerous fantasy football competitions) then you shouldn't have to run the alternative system this season. To be fair, I think your original system reflects season-long form much better anyway.

      Delete
    3. The system (or at least the one I sent you originally) already has a form variable - the protected rating. So you could just use that. I tend to think of it as a +/- on how confident I am in a team: sometimes they are having a run of poor luck, and sometimes it is a sign of difficulties.

      You can also play with the amount put into the protected rating (make it double or triple the game weight, for instance) and the amount taken out (I think it is set at 25%) to emphasise trends more and smooth out variation.

      On your comment below. Where is the sweet-spot for tipping variation? And is your home advantage still correct with a bigger weight - that might make a difference.

      Delete
  2. The difficulty of course is that at the start of the season two things are guaranteed:
    - Teams will be of different ability than they were 6 months previously.
    - You have almost no way of knowing what that difference is.

    In theory, some regression to the mean is likely, so one option is to write off half of each team's difference from 1000 before round 1. But that is arbitrary, rather than based on any mathematical consideration.
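    As a one-line sketch of that pre-season adjustment (the 0.5 factor being the arbitrary choice described above, and the function name my own):

```python
# Sketch: shrink each pre-season rating halfway back toward the 1000 baseline.
def regress_to_mean(rating, mean=1000.0, factor=0.5):
    return mean + (rating - mean) * factor
```

    A factor of 1.0 would keep last season's ratings untouched; 0.0 would reset everyone to 1000.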

    Ultimately the proof should be in the tipping: do you get better results with a faster change or other adjustments? Notwithstanding that some tips are bound to be wrong because they defy form, ability and logic.

    ReplyDelete
    Replies
    1. "Ultimate proof is the tipping"... well here are the numbers. (An attempt to find a sweet spot, if you like)

      Game Wt [Correct]
      15 [73] ...current
      25 [71]
      30 [71] ...above proposal
      45 [72]
      60 [71]
      100 [75]
      200 [79]
      250 [80] ...sweetspot!
      300 [77]
      350 [73]
      400 [72]

      But the best performing teams list when set at 250... well, it's up for debate. See below.

      St Kilda [1560.8]
      Hawthorn [1470.9]
      SthMelb/Syd. [1437.7]
      Adelaide [1430.3]
      Collingwood [1314.0]
      Essendon [1298.6]
      Richmond [1203.1]
      Fremantle [1145.4]
      West Coast [1004.2]
      Brisbane Lions [927.6]
      Geelong [906.2]
      Carlton [862.0]
      Port Adelaide [859.2]
      Foots./W.Bulldogs [839.5]
      Melbourne [823.4]
      GWS [655.0]
      North Melbourne [395.0]
      Gold Coast [332.1]

      Delete
    2. Also, the tips are way out at 250 as well... tipping correct is one factor, tipping margins accurately is a different story.

      Delete
  3. I quite like the adjusted rankings, but probably because they're closer to my own!

    One rough rule of thumb that I've used to check my system is vaguely sensible is to check things against the premiership betting markets. Not that they should match up exactly of course - if the betting market is wrongly overvaluing or undervaluing certain teams then you want a system that does better than that. But there's some wisdom in crowds, and so it's reassuring if the two sets of tables match up to some extent.

    ReplyDelete