
Falling Prey to The Data
Technology and data companies are becoming very adept at exploiting familiar narratives: that “if you can’t measure it, you can’t manage it”, that data provides “objective, cold hard facts*”, and that if you have data you are using an “evidence-based” approach.
(a further thought — the allure of automating data capture is also very persuasive)
Whilst rarely stating it explicitly, companies do this to create the perception that their offering is the key to enhancing player/team development.
However, to make these claims you must, at the very least, have a sound theoretical model of how human development works. In the most recent example I came across, a ball projection machine for football was being promoted as enabling players to improve through PRECISE repetition of striking a ball (precise in terms of how the ball was being delivered). And of course their product could produce all sorts of data about a session.
Just one small problem — the “evidence” in skill acquisition and learning science (whether from schema theory or dynamical systems theory), whilst valuing repetition, suggests that some level of VARIABILITY is usually helpful for developing skill. So this is actually nothing like an evidence-based approach; if anything, it sits towards the opposite end of the spectrum. Worse, the real danger is coaches moving away from what may be sound (often tacit) models/theories of learning design to something less so, because they fall prey to the narratives mentioned above.
Key point: Just because you can produce data doesn’t make an intervention/action evidence based**. And it certainly doesn’t make your program “elite”, “high performance” or any other buzzword you fancy.
Start by thinking about this question: “what is your model of the learner/learning process?” In my experience, if you are well informed on this (by theory, experiential knowledge and critical reflection) you are in a good position to exploit both simple and advanced applications of data/technology to enhance human development.
(occasionally reminding ourselves we are trying to develop people, not engineer machines… that is useful too)
* Ben Alamar has made a very insightful observation that one of the biggest misconceptions about metrics is that they provide the “facts”, a single truth. This is rarely the case.
** given the complexity of humans, there are also dangers with strictly adhering to (seemingly) legitimate evidence-based approaches… maybe I’ll address that in a future post.