Predicting the Future?

Mark Upton
 
Jul 29, 2016

‘It’s tough to make predictions, especially about the future.’

– Yogi Berra

Whilst there are many definitions and discussions around the terms Talent ID/Development/Selection, the vision driving them is a distant future where those “talented” players display high levels of performance at the pinnacle of their sport.

(This provocatively titled post “Is Talent ID Bullshit?” from Mladen Jovanovic raises many good points about the (flawed) current approaches to Talent ID and Development. Mark O Sullivan also frequently posts on this topic on his blog.)

Generally, the capacity to see far into the future with any clarity has eluded us. As in many industries at the moment, there is a line of thinking that “data” will be the saviour, specifically that a significant volume of data combined with intricate algorithms will lead to the holy grail – predicting the future with a high level of certainty.

When these claims are contested, a common way to brush them aside is to label those doing the contesting as “old school”, not keeping up with the times and/or resistant to using data – particularly if they do not have a science/academic background, i.e. many coaches and other experienced folk in sport. Given this, recent comments from Tim Cable, Director of Sport Science at the acclaimed Aspire Academy in Qatar, are quite interesting. Tim’s views cannot be dismissed with the logic mentioned above, and clearly Aspire have been exploring this area for some time…

“We (Aspire) go out and test every 11 year old boy in the country and have done so for the last 10 years. We’ve got a data-set of around 60,000 points. We’re trying to apply algorithms to that data-set that would allow us to predict future performance….but I’m not sure we’re ever going to get there with that, with those real complex questions”

– Tim Cable, Director of Sport Science @ Aspire Academy

I find it interesting that Tim used the term “complex” in what I perceived as a layman’s fashion. Yet he hits on a key issue here – when dealing with Complexity (big ‘C’, as in the theory/science of complex systems), more data and better algorithms will fail to eliminate the inherent uncertainty (I’m deliberately using the term ‘uncertainty’ as opposed to ‘risk’ – the two are very different).
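To make that point a bit more concrete, here is a deliberately toy sketch (in Python/NumPy). Everything in it is invented for illustration – the five-measure “age-11 test battery”, the unmeasured developmental trajectory and all the coefficients are assumptions of the sketch, not Aspire’s data or model. It simply demonstrates that if the adult outcome depends heavily on factors you cannot measure at 11, and on how those factors interact with what you can measure, then feeding more of the same early data into the model does not shrink the uncertainty.

```python
# Toy illustration (not Aspire's data or model): when the outcome depends
# heavily on unmeasured, path-dependent factors, collecting more of the same
# early-age data does not reduce the irreducible uncertainty.
import numpy as np

rng = np.random.default_rng(0)

def simulate(n):
    """Hypothetical age-11 test battery (5 measures) and an adult outcome."""
    tests = rng.normal(size=(n, 5))        # measured at age 11
    trajectory = rng.normal(size=n)        # unmeasured developmental path
    # Adult performance: a little signal from the tests, a lot from the
    # unmeasured trajectory and its nonlinear interaction with the tests.
    outcome = 0.3 * tests[:, 0] + np.tanh(tests[:, 1] * trajectory) + trajectory
    return tests, outcome

def out_of_sample_r2(n_train, n_test=10_000):
    X, y = simulate(n_train)
    Xt, yt = simulate(n_test)
    # Ordinary least squares on the age-11 measures only.
    A = np.column_stack([np.ones(n_train), X])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    pred = np.column_stack([np.ones(n_test), Xt]) @ coef
    ss_res = np.sum((yt - pred) ** 2)
    ss_tot = np.sum((yt - yt.mean()) ** 2)
    return 1 - ss_res / ss_tot

for n in (500, 5_000, 60_000):
    print(f"training points: {n:>6}  out-of-sample R^2: {out_of_sample_r2(n):.2f}")
```

The printed out-of-sample R² plateaus at a low value whether you fit on 500 or 60,000 players – by construction, because the part of the outcome driven by the unmeasured trajectory was never in the data. That is the uncertainty point, not a claim about any real data-set.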

If we take Tim‘s doubts, born of significant applied experience, and marry them up with theoretical ideas from Complexity, then serious questions have to be raised about the merits of this approach (yet another example of using data but not being evidence/theory-based). However, there are pressures to hold up the illusion of using data & predictive modelling as a proxy for an evidence-based approach. At a recent talent seminar I was invited to be involved in, two senior figures from national sporting bodies both rolled out lines about building models to enable them to identify future stars and predict performance levels. Yet in the same breath they essentially acknowledged that their current high performers, including world champions, would not have been identified by these models, and that they would continue to look for “outliers”. So they are really saying “we are doing this data stuff to keep up appearances, but we actually recognise the flaws and will happily make decisions that conflict with what the models are telling us”.

Tim Cable also mentioned that a significant investment is required to acquire and update data platforms, technology and expertise. Could this investment instead be directed toward a more fruitful strategy – perhaps “as many as possible, as long as possible, in the best environment possible” (Mark O Sullivan)?

(This respects various rules of thumb for managing Complexity – “avoid premature convergence”, “don’t put all your eggs in one basket”, “place many small bets”.)

Ultimately, we may be better off embracing an uncertain future and directing our energy at “managing the evolutionary potential of the present” (Dave Snowden). This is where data (both quantitative and qualitative) is being, and has even more scope to be, used to good effect.