Independent learning or isolated learning? Squaring the student satisfaction circle

Our Dean of Students was telling me today about a presentation he recently gave, in which he highlighted views expressed by students on what ‘independent learning’ meant.  They essentially said that it meant ‘learning on my own’.  The sense is of an ‘absence’ or ‘lack’ – so independent learning is the same thing that happens in a seminar, lecture or tutorial, just without the support that such contexts provide.  Independence seems to mean isolation, not autonomy.

In the humanities, we emphasise the importance of independent study.  It’s part of learning and practising our disciplines.  It develops skills that we describe to students as valuable for their future lives: defining a problem or question, pursuing a line of enquiry, synthesising evidence, critiquing others’ interpretations, constructing an argument.  The products of such a process – still often essays – provide the raw materials for assessing students’ academic achievement.  Far from being a deficit model, independence is part of being intellectually resourceful and productive (which can be brought to bear not just in individual but also in collaborative work).

The apparent disjuncture between the student and the academic understanding of ‘independent learning’ could be a real issue, not least in terms of students’ capacity to develop that ‘graduateness’ that seems to be associated with an ability to be reflective and to take responsibility for, and ownership of, their learning.

The reflections of recent UH graduates Lewis Stockwell and Florence Afolabi, in their post on the Guardian learning and teaching hub on collaborative/partnership vs consumerist/transmission models of HE delivery, seem relevant here.  In a context where student satisfaction stands as the main indicator of ‘quality’, the belief that they’re just being ‘left on their own’ is a potentially damaging one.  Maybe it’s akin to the perennial issue with feedback.  Do we need to invest more in explaining what independent study actually is, what it looks and feels like, its dead ends and u-turns as well as its moments of insight and discovery – and, importantly, its place at the heart of academic practice?  Or is something more fundamental involved?  Can we square the circle, remodelling and describing independent study in a way that enables students to develop academic initiative and independence while allowing them to feel supported (and, yes, satisfied) in doing so?

‘Science’ and ‘arts’: should we play in each other’s fields a bit more?

I find science and maths a real draw.  I often listen to The Life Scientific, Material World and More or Less podcasts ahead of more predictable favourites Making History, The Long View and History Today (though maybe not Friday Night Comedy…)

It was interesting to hear the recent interview on TLS with Sunetra Gupta, novelist and professor of theoretical epidemiology, in which she refused to recognise a division between science and arts, seeing only different ways of expressing ideas.

From this perspective, the early commitment of Hatfield Technical College to Liberal Studies seems ahead of its time.  All students were to have 10 per cent of teaching time allocated to subjects such as History, Economics, Politics, Geography and Modern Languages.  It was thought that educating the next generation of engineers and technologists in this balanced way would serve the national interest.  So it was rather fitting that C P Snow became the then Hatfield Polytechnic’s first Visitor in 1972.

The humanities have since come into their own as the institution broadened its scope, and the model of reserving time for accessing another ‘culture’ did not survive.  Now it seems an unrealisable ideal – and student choice may be delivering a narrower range of experiences than was imposed in the 1950s.  Would we now be prepared to mandate that all students should take a module a year from different Schools or Departments?  What would be the demands on lecturers or the effects on the ‘home’ students?  What would be the implications for students’ grades?  Then again, if we did do it, what might be the returns?

While academics tend to have a strong sense of disciplinary identity, many of us also have an inclination for greater integration of different ways of expressing ideas.  And these may well currently be manifested only in podcast preferences.

But there is often more that unites than divides us.  I often find Alice Bell’s blog Through The Looking Glass setting off some mental sparks, and work aligning scientific and historical method has proved hugely interesting and useful. The case for interdisciplinarity between ‘science’ and ‘arts’ in meeting some of the biggest challenges we face, such as climate change or an ageing society, is now being made in stronger terms. But how often do we actually bridge the divide? Or if we do, do we tend to contribute to the greater whole from our respective positions as specialists in our disciplines, rather than getting to ‘play in each other’s fields’?

Public philosophy 2: experts and climate change

Once we accept the expert authority of climate science, we have no basis for supporting the minority position.

So argues Gary Gutting, a professor of philosophy at the University of Notre Dame, in ‘The Stone’, a forum within the New York Times Opinionator section for contemporary philosophy ‘on issues both timely and timeless’.  In essence, he’s doing some public philosophy, applying ‘critical thinking to information and events that have appeared in the news.’

His position is based on the ‘logic of appeals to the authority of experts’.  If we accept who the experts are on a particular topic, and that our own status as non-experts excludes us from adjudicating disputes among said experts, then we must also recognise that we have no basis for rejecting the truth of any claim that is backed by a strong consensus within that community.

In the case of climate change, neither the existence of an expert academic field of climate science nor that of a strong consensus that human activities are causing the planet to warm can be challenged.  So, argues Gutting, the only way a non-expert can legitimately challenge climate change is by proposing that climate science ‘lacks the scientific status needed to be taken seriously in our debates about public policy’.  In passing, he notes that such a critique – though unlikely to find much traction in the case of climate science – may well prove more promising for ‘various sub-disciplines of the social sciences’.

Let’s say we accept this, but we then arrive at a problem.  How does expert knowledge translate into policy?  What is its role?  As Gutting acknowledges, scientific conclusions don’t have absolute authority in democratic debates, though his reasoning is based on logic rather than questions of accountability: the fact of global warming exists separately from, and therefore doesn’t imply, any particular policy response to that fact.

This is a sequential model – the experts generate consensus, then effectively turn the body of knowledge over to ‘us’ to make the value judgements their science cannot and formulate policy accordingly.  I’m not sure it works like this even in the so-called ‘hard’ sciences, but even if it does, the process of policymaking is itself one that calls for forms of expertise.  Returning to climate change, the importance of people’s behaviour, their beliefs, practices and ways of making meaning of their lives, is increasingly being discussed.  Once we get into the humanities and social sciences, the disciplines with much to offer in this dimension, we get into highly contested debates and lose the consensus to which Gutting refers.

But rather than seeing this as a problem (where a perceived lack of scientific status leads to a lesser status in policy debates), can we instead recognise a process to which these forms of expertise have distinctive and important contributions to make?  Can the lack of consensus be productive?  Policymaking involves reconciling interests, beliefs and evidence that sometimes overlap, sometimes conflict.  It involves holding in mind at the same time different levels of human organisation and considering how those levels interact, how policy might affect that interaction.  It’s conditioned by institutions, with all their complexity of structures and relationships.  It’s many other things besides, but as a process it could surely benefit from forms of expertise that fundamentally engage with those kinds of issues.  A sequential model has its attractions, but the role of expertise in policymaking isn’t that simple, because policymaking isn’t simple.  Question is, can the humanities and social sciences turn complexity and lack of consensus into a strength?

History ventures: skills vs knowledge in the public history marketplace

“The skills of doing history are more frequently used, needed, and recompensed than the expertise of knowing history”

This is Darlene Roth, writing in the NCPH’s Public History News.  Roth goes on to talk about the successful model of ‘developmental history’ work her consultancy The History Group undertook for planners, developers and government agencies.  She also refers to corporate histories and museum curation.  These examples open our eyes to the range of tasks and projects that can be done well – or best – by historians, whether academics working ‘across borders’ or the historically trained working in professional contexts.

But for me it also suggests the need to open our minds.  Can we articulate clearly what ‘the skills of doing history’ are and be creative in identifying tasks that are not necessarily explicitly historical in character but would be done well – or best – by historians?  There are fields where there is often a ‘history gap’, such as in policy development, marketing or organisational strategy, but these should not constitute the limits of our imagination.

Working this out is not just a self-serving exercise.  Humanities applications for the first year of the new funding system are down in many institutions.  It’s too early to say whether concerns about employability in the context of higher debt are a major factor, but it’s a strong possibility – particularly for certain student groups – that we need to consider (league tables of salaries are rather unhelpful here).  More needs to be done to ensure prospective students and their parents understand the student finance system, but universities have a role too, and not just their recruitment and marketing departments.  Open days and school visits are important opportunities for university staff to meet students and parents and discuss what studying a particular subject at a particular institution is like.  If, as historians, we can share with them the many ways in which the skills of doing history can be meaningfully and usefully applied in the world of work, and our commitment to helping students develop those skills, we can start to counteract the belief that a humanities degree ‘just equips them for standing in the dole queue’ (as one Tweeter said to me recently).

Students come to university for many reasons.  To further their job prospects may only be one reason, but it’s a legitimate one, and one with which we need to engage.  We shouldn’t give in to the cynicism that divides knowledge and skills and denigrates the latter as empty, instrumental or devalued.  Nor should we section off ’employability skills’ in the curriculum; by teaching students to be historians, we are developing skills needed for work – we just need to bring awareness of that connection to the surface (see my earlier Parallel Tracks blog post).  I hope the emerging field of public history can provide a context to help us frame the terms of the debate rather differently.

Roth goes on:

I am saying that it pays to look at how you do what you do as a historian, and how you think as a historian, and follow those routes to marketability, not just the standard one of equating historical knowledge as the thing being sold.  Ergo: “I am an entrepreneur, and history is my product” becomes “I am an entrepreneur and history is the source of my products”… If history is the answer, what is the question?  Who needs it and why?

We may prefer a somewhat different language in this country, but I think we can take on the idea of entrepreneurship and interpret it for our own context.  Can we be entrepreneurs for the discipline, for the practice of history, but also for our students so that they can see history as their future?

Tackling belief is the key to overcoming climate change scepticism

Adam Corner’s great piece today on Guardian.co.uk (via @alicebell) highlights the importance of belief in determining people’s position on climate change.  For me, this opens up debate on a vital role for the humanities:

…we should not be looking to science to provide us with the answer to a problem that is social in nature. The challenge is to find a way of explaining why climate change matters using language and ideas that don’t alienate people. Simply repeating the scientific case for climate change is – unfortunately – not going to cut it.

In fact, the more we know, the less it seems that climate change scepticism has to do with climate science at all. Climate change provokes such visceral arguments because it allows ancient battles – about personal responsibility, state intervention, the regulation of industry, the distribution of resources and wealth, or the role of technologies in society – to be fought all over again.

The last sentence is particularly resonant:

It follows that the answer to overcoming climate change scepticism is to stop reiterating the science, and start engaging with what climate change scepticism is really about – competing visions of how people see the world, and what they want the future to be like.