The Myth of Future Jobs

By Professor Mark Brown

For over a decade we have been told that today’s universities risk preparing a generation of students for jobs that don’t yet exist, using out-of-date teaching methods and old learning technologies. In a similar vein, we often hear claims from respected international agencies and generally trusted academic sources that 65% of the jobs of the future have yet to be invented. This claim, for example, features prominently in Professor Cathy N. Davidson’s 2011 book on the future of education, Now You See It, which was reviewed in The Atlantic. More recently, in the context of the perceived disruptive potential of robotics and Artificial Intelligence (AI), we face a new threat: the claim that many existing jobs will disappear in the future.


Such claims raise important questions in today’s rapidly changing world about the currency, relevance and usefulness of completing a university degree. Does a university qualification still matter? This basic question leads to a number of deeper ones: How seriously should we take predictions of the future? Should we be alarmed by some of the claims about the future of work? Are our current jobs safe? Although it is easy to be seduced by the hype surrounding technology-infused visions of the future, just how accurate are these predictions? What is their factual basis? What is the evidence behind the predicted obsolescence of many traditional jobs? Does a university degree help to future-proof your job? More to the point, especially in the context of the rising costs of higher education, is a degree still relevant in today’s rapidly changing world?

You can read more of this article on Professor Mark Brown’s LinkedIn account.

Methodological Issues in Learning Analytics: Critical Insights and Reflections

By Professor Mark Brown

This brief opinion paper raises a number of conceptual and methodological issues associated with attempts to evaluate institutional initiatives in the area of learning analytics. It frames the discussion around three recent works that invite a more critical reading of learning analytics research and the potential of interventions and data-driven decisions for successful, sustainable and scalable impact on an institution-wide basis.

Firstly, the emerging field of learning analytics would benefit from more critical engagement with some of the points raised by Paul Kirschner (2016) in his keynote at the 6th International Conference on Learning Analytics and Knowledge (LAK16). More specifically, Kirschner warns that naïve understandings of learning and narrow conceptions of learning analytics can potentially do a great deal of harm.

More recently, Kirschner and Neelen (2017) argue that many so-called learning analytics initiatives: (i) view education as a simple process that is easily modelled; (ii) base decisions and interventions on rich data but weak theory; (iii) inform decisions and interventions using wrong or even invalid variables; (iv) make interpretations and arrive at conclusions that confuse correlation with causality; and (v) produce unintended and unwanted effects that pigeonhole and stereotype learners, which may be counterproductive to enhancing student engagement and learner success. Arguably, to date there has not been a serious or comprehensive response to these justifiable concerns.

You can read more of this opinion piece on the ICDE website, where the full version of this paper was first published as part of the two-day ICDE Leadership Summit held in Nancy, France, in May 2017.