The Matter With Things. Chapter 13. Institutional science and truth.

McGilchrist opens this chapter with two aptly chosen quotes which pinpoint the key arguments he makes in it.

‘Most human institutions, by the purely technical and professional manner in which they come to be administered, end by becoming obstacles to the very purposes which their founders had in view.’ (William James, 1909, A Pluralistic Universe)

‘Specialization is for insects.’ (Robert Heinlein, 1973, Time Enough for Love)

This chapter is a bit different to others in The Matter With Things as it focusses less on hemisphere correlates and is not heavily reliant on the hemisphere hypothesis. Instead McGilchrist explores the question of whether we can trust science’s claims on truth and examines the limitations of the institution of science.

I recognised most of the key points McGilchrist makes in this chapter, as I think would anyone who has a background in research and publication, especially but not solely if these are related to science. Many of the same issues arise in humanities disciplines. McGilchrist discusses the limitations of the institutions of science under four headings: specialisation and its impact on original thinking; how reliable is scientific evidence?; the problems of publication; and peer review.

Specialisation and its impact on original thinking.

‘Science is a victim of its own extraordinary success’ (p.502). The explosion of scientific knowledge has led to increasing specialisation, such that a scientist can only be an expert in a small area. When even a very good scientist talks about science outside his own area, he is taking it on trust/authority. We need to think about whether that trust/authority is deserved.

Specialisation drives scientific disciplines apart and leads to ‘narrowness, technicalisation and fragmentation, at the expense of breadth, humanity, and synthesis’ (p.508). McGilchrist argues that whilst of course we need specialists, we also need generalists. We need both the fly’s eye view and the bird’s eye view. He describes this as follows:

‘If perceiving shapes is how maths and science progress, as I believe it is, you will see those shapes only by rising above the hole where you are digging. The view in the valley floor is good, but if you never climb, you will not know that there are many other valleys, and mountain ranges nearby, which are not only beautiful in themselves, but help you see why good work needs to be done down in the valley floor at all.’ (p.504)

Specialisation also leads to specialised jargon.

‘Increasingly, the heavily acronymic jargon of research papers seems to me to present an almost impenetrable barrier to anyone other than the most highly specialised reader, and even then, if they are to get anything out of the exercise, they must have a huge capacity to tolerate boredom’. (p.507)

(McGilchrist’s writing often makes me smile 😊)

How reliable is scientific evidence?

In this section McGilchrist discusses the problems involved in interpreting data, taking brain imaging (a way of knowing what is going on in the minds and brains of people) as an example. Brain activity scans are difficult to read accurately. The data require interpretation and therefore cannot be assumed to be objective or truthful. Every way of looking at the brain has its limitations. Whenever you are looking at a complex system, you can’t assume that the bit you are looking at is the crucial one. We should use as many ways as possible to look at the brain, and not rely solely on scanning.

In scientific research, for a result to count as important it must be replicable, reliable, and reproducible, but ‘A survey of 1,576 researchers across scientific disciplines published in Nature revealed that more than 70% of researchers had tried and failed to reproduce another scientist’s experiments, and more than half had failed to reproduce their own experiment.’ (p.513)

A widely cited paper by John Ioannidis (2005) – ‘Why most published research findings are false’, concludes that most research is not adequately designed to prove what it claims to show, that ‘The hotter the scientific claim, the less likely the research findings are to be true’, and that ‘the greater the financial and other interests and prejudices in a scientific field the less likely the research findings are to be true’.
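It can help to see the simple arithmetic that underpins Ioannidis’s conclusions. The sketch below is my own illustration rather than anything taken from the paper (and it leaves out the bias term Ioannidis also models): it combines the prior odds that a tested hypothesis is true, the statistical power of a study, and the conventional 5% false-positive threshold, and shows how a ‘hot’ field chasing unlikely hypotheses with underpowered studies can end up with most of its positive findings being false.

```python
# Illustrative sketch of the arithmetic behind Ioannidis (2005).
# All numbers below are my own assumptions, chosen only to show the shape of the argument;
# the bias and multiple-teams terms in the original paper are omitted.

def positive_predictive_value(prior_odds, power, alpha):
    """Proportion of statistically 'significant' findings that are actually true.

    prior_odds: ratio of true to false hypotheses being tested (R in Ioannidis's notation)
    power:      probability of detecting a true effect (1 - beta)
    alpha:      false-positive rate (significance threshold)
    """
    true_positives = power * prior_odds   # true effects correctly detected
    false_positives = alpha               # false hypotheses wrongly 'confirmed'
    return true_positives / (true_positives + false_positives)

# A 'hot' field testing unlikely hypotheses with small, underpowered studies:
print(positive_predictive_value(prior_odds=0.1, power=0.4, alpha=0.05))  # ~0.44: most positive findings false
# A field testing plausible hypotheses with well-powered studies:
print(positive_predictive_value(prior_odds=1.0, power=0.8, alpha=0.05))  # ~0.94: most positive findings true
```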

There are now huge temptations (in terms of financial and reputational rewards) to commit anything from a minor misdemeanour to recognisable fraud (fabrication of results), and McGilchrist provides examples of these in this chapter.

McGilchrist also includes an Appendix (3) on the reliability of public health policy which makes for interesting reading. If you have cut salt out of your diet, you might want to think again, or read the Appendix!

The problems of publication

Most academics will recognise the exhortation to Publish or Perish!

Institutions put enormous pressure on their staff to publish, whether or not they have anything to say; quantity matters more than quality, as does publishing in high impact journals. This leads to corner cutting and inflation of claims. It also leads to a focus on writing short papers rather than books, which take a long time to write and require fallow periods. McGilchrist’s view is that this is ‘inimical to free thinking’.

‘Scientific thinking gets crystallised too early, before it has had a chance to broaden and deepen; there is no longer a chance for ideas to evolve, to enter the necessary fallow period of unconscious gestation, without being prematurely forced into explicit form, and worse still in sliced form, so that what might have come to be a dawning new Gestalt is forever lost. And in the end, science is not about producing data so much as thinking, to which the acquisition of data can be only a prelude or addendum.’ (pp.516-517)

This pressure to publish can also lead to deliberate gaming of the system, where authors chase citations by working in ‘highly populated’ areas of science (even though ‘it is estimated that only 20% of cited papers have actually been read’, p.158), or even pay to have their work published in predatory open access journals (see Beall’s list). As soon as there is payment for publication, the whole system is corrupted.

And then, there are the fake papers. McGilchrist devotes Appendix 2 to some of these – papers such as those that are created by computer programs but nevertheless succeed in getting published, despite being, literally, gibberish.

So, we may ask, what happened to peer review?

Peer review

How effective is peer review? Richard Smith (former editor of the BMJ) wrote that far from being an objective, reliable and consistent process, peer review is ‘a subjective and, therefore, inconsistent process … something of a lottery’ (Smith, 2006, ‘Peer review: a flawed process at the heart of science and journals’). As McGilchrist states:

‘Bias is intrinsic to human life. We just waste a lot of time and money pretending we’re avoiding it, and then kid ourselves that the outcome was ‘objective’ – a more dangerous position, because it introduces complacency and is a much more difficult thing to fight, precisely because of its appearance of objectivity’. (p.529)

Peer review is a laborious process which takes up researchers’ time, which is given for free, and so takes them away from their own work. Interestingly, McGilchrist tells us that until the 1930s/40s peer review was not part of the publication process; papers were reviewed by the editorial committee. Einstein, for example, refused to subject his work to peer review – only one of his 310 publications underwent peer review. Presumably once was enough to convince him of the flaws in the process.

There is also evidence that peer review can be prone to bias against innovation and radical new ideas, such that no one wants to publish a paper that will rock the boat. Those who step out of line pay a huge price. In addition, reviewers have been shown to regularly fail to spot major errors in research, leaving the process open to fraud.

The bottom line is that science is not exempt from human fallibility.

McGilchrist ends this chapter by discussing the need for a new paradigm, one that recognises that the essence of good science is originality and original thinking takes time. Science cannot avoid operating under the existing paradigm, ‘because, without such a paradigm, its findings could not cohere’ (p.536) but working within the prevailing paradigm also ‘militates against those great insights that change the direction of scientific history, despite this being widely believed to be precisely what science is about.’ (p.536)

McGilchrist believes that contemporary science is not scientific enough in that it is not willing to be aware of its limitations. On the concluding page of this chapter, McGilchrist defends science in the following terms:

‘Science is, or should be, a source of wonder that opens out our understanding of the world and gives us one of the touchstones on the path towards truth. Just because science cannot answer all our questions does not mean that it is not the very best way to answer some of them, and a helpful contributor to answering many more. And that there is corrupt practice in science does not make it different from any other human enterprise.’ (p.544)

For discussion of this chapter between Iain McGilchrist and Alex Gomez-Marin, see

References

McGilchrist, I. (2021). The Matter With Things. Our Brains, Our Delusions, and the Unmaking of the World. Perspectiva Press.

You are Jürgen Habermas!

Jürgen Habermas in 2014 at the age of 84

I have spent the start of the new year trying to bring some order to the hundreds of documents on my laptop and was surprised to find a document in my 2014 folder with the title – ‘You are Jürgen Habermas’ – which included the following text:

Author of The Logic of the Social Sciences, you recognize that the primary activity of human beings is to interpret the meaning of things in the world around them. As human beings themselves, researchers also interpret meanings and cannot therefore keep their own perspective separate from their research. Since there is no absolute truth, research must instead use reason and argument to arrive at the best interpretation. Go use your hermeneutics to conquer the world!

It turns out that this was the result of an online quiz – ‘What’s your epistemology?’

I don’t take these quizzes seriously. They are just a bit of light-hearted fun to occupy a spare moment, or when procrastinating, but I was interested that six years later I got the same result, and given that I haven’t written a blog post for a couple of months, sharing this seemed like a gentle restart.

I don’t know a huge amount about Habermas, but I do like his advocacy of communicative action and the Ideal Speech Situation.

Keith Morrison (2008) makes these ideas accessible in his paper ‘Educational Philosophy and the Challenge of Complexity Theory’, where he writes:

A complexity-informed pedagogy requires communication that includes:

  • Freedom to enter a discourse, check questionable claims, evaluate explanations and justifications;
  • Freedom to modify a given conceptual framework and alter norms;
  • Mutual understanding between participants;
  • Equal opportunity for dialogue that abides by the validity claims of truth, legitimacy, sincerity and comprehensibility, and recognises the legitimacy of each subject to participate in the dialogue as an autonomous and equal partner;
  • Equal opportunity for discussion, and the achieved—negotiated—consensus resulting from discussion deriving from the force of the better argument alone, and not from the positional power of the participants;
  • Exclusion of all motives except for the cooperative search for truth.

All this feels very relevant at the beginning of 2020.

And, as an aside, the quiz included this lovely image:

A painting by a Swedish artist new to me – Bruno Liljefors (1860-1939)

Happy New Year to anyone reading this post.

Reference
Morrison, K. (2008). Educational Philosophy and the Challenge of Complexity Theory. Educational Philosophy and Theory, 40(1). https://doi.org/10.1111/j.1469-5812.2007.00394.x

The Value and Limits of Science

A bit of background

On the recent Field and Field four-day course (June 8th – 11th 2019), Iain McGilchrist discussed key ideas from his book The Master and His Emissary. The Divided Brain and the Making of the Western World, talking for an hour on each of the topics below. For the most part these talks were familiar, as I have attended this course before.

  • Introduction to the Hemispheres
  • Brain Disorders of the Hemispheres
  • What is Language For?
  • Are we Becoming Machines?
  • What Does it Mean to Think?
  • The Power of No

I have blogged about these topics after attending previous courses. See my page on The Divided Brain on this blog.

But Iain is now writing a new book which will have the title (proposed, but not yet confirmed) – “The Matter With Things”. It was good to get this update, as on the last course I attended we were told that the title of the book would be There are no Things. I think Iain feels that his philosophical position is clearer with the newer title. This new book will argue against reductionism and materialism and for betweenness.

In the second part of this new book, which Iain is still working on, he will discuss what he told us are the four main paths to knowledge: science, reason, intuition and imagination. He stressed that we need all four, but that intuition and imagination have been downgraded in favour of science and reason, a result of left hemisphere dominance. So we were very fortunate to hear five one-hour talks about these most recent ideas.

  • The Value and Limits of Science
  • The Value and Limits of Reason
  • The Value and Limits of Intuition
  • The Value and Limits of Imagination
  • Everything Flows

The value and limits of science  (These are the notes from Iain’s talk. Any errors are mine and I do not at all mind being corrected in the comments).

Collingwood wrote: Science and metaphysics are inextricably united, and stand or fall together.

And Heidegger wrote: Science does not think, science does not venture in the realm of philosophy. It is a realm, however, on which without her knowing it, she is dependent. (translated from the original by Iain McGilchrist)

(I cannot find these quotes online to verify them, and I learned on this course that my note-taking has slowed down, so I am not absolutely sure of their accuracy, but, as written, they provide the gist of Iain’s argument. For more on this, see the Update – 17-06-19 – at the end of this post.)

The word science simply means knowledge. We need science, but we rely too much on the left hemisphere. Public science is different to what good science is telling us.

The two hemispheres find two different worlds. Objectivity is not about what is out there. There isn’t a thing out there that we can know. Things only come into being through interaction with our consciousness. The more you dig into a tiny hole, the less you can see the whole. So the question is: What constitutes evidence in life? The ‘howness’ of the ‘what’ matters a lot. Objectivity is a ‘howness’ – a disposition towards the world. You try to be just and truthful, to bring an understanding. This reminded me of the work of Gayle Letherby et al. on Objectivity and Subjectivity in Social Research.

There are no things that are not unique. How does science cope with this? In science when we say we understand something, we are comparing it to something else. Everything is built on analogy.

Science is not chaste (pure and virtuous). It starts from certain axioms/assumptions, e.g. the world is fully comprehensible physically. This is an unlikely but reasonable assumption. But why do we want to understand the physical?  Iain thinks this is related to ‘the matter with things’, the title of his new book, so I expect we will learn more about this when the new book is published (hopefully by the end of 2020).

Science is reluctant to accept anything that can’t be measured. It is based on a false dichotomy between facts and values. There is always a value involved in seeking any kind of truth. We try to rise to meet this through objectivity. Many things in science can’t be separated from value, but there is value involved in appreciating what is a fact.

Problems with science

There are three problems with science:

  • Intrinsic problems built on assumptions
  • Problems of the model of the machine
  • Institutional problems – the way science promulgates what it is doing

Intrinsic problems built on assumptions

There is no one truth, only more or less truth, but we must be loyal and faithful to truth. (See Where Can we go for Truth? for more of Iain’s thoughts on truth). So how do we decide which questions are worth asking?

Values, judgement and insights are very important in science. Great scientists allow ideas to incubate for a long time. Science eliminates the idea of purpose; it is a tenet of science that there is no purpose. Science cannot address things like love or an understanding of God. We can see these in operation, but they cannot be explained by science. But science itself is teleological – things happen for a reason, although the value of reason itself can’t be reasoned.

An example of a problem built on assumptions is DNA. DNA is not a building block; there is just not enough information in DNA. DNA is a resource from which the cell can draw. It is not a script. Only 2% of it expresses anything. A quarter of a million new neurones a minute are developed in the brain. We cannot get this from a linear script. The genome is not the answer.

Problems of the model of the machine

We are not machines. A machine can be switched off, but life is constant and cannot be switched off. A machine operates close to equilibrium; you have to put energy in to make it change. Life is the exact opposite. It is always changing, but how does it remain stable enough to keep going? Through homeostasis. Human beings and living things change. Natural selection stops change; it doesn’t cause change.

Organisms are not on/off. They involve inconceivably complex reactions to maintain stability between motion and stasis. They are non-linear; action is not one-way as in machines. The parts of organisms themselves are changing. This doesn’t happen in machines. The genome restructures itself all the time. DNA is not the robot master. The same genes can give rise to different effects, e.g. Pax6 gives rise to different eyes in the fly, the frog and humans. Some animals can regenerate parts of their body. If you cut off the head of a planarian worm, it will grow a new head with the same memories. Living organisms are not machines. The instructions for life are within the organism.

See also a previous post – The Human Versus the Machine 

Institutional problems

Science is carried out by normal people with egos etc. Fashions of thinking dominate. Science depends on results, safety, conformity, narrowness. There are many dogmas that can’t be broken.

Scientists are expected to publish or perish. This is destructive to morale. Scientists are rated on the number of papers they can churn out, but they need fallow periods, and they can get caught up in administration, particularly if they get promoted.

Lots of science papers need to be retracted because they have been made up. And Ceci and Peters’ research raised doubts about the reliability of the peer review process.

Scientists are also subject to predatory journals, to the extent that Jeffrey Beall published a list of journals which researchers should avoid.

Truth matters, but these problems with science show that finding out what is true is more difficult than we might assume. We need more replication work; the amount currently done is very low.

Why is truth important? We are here to engage with the world. If that is pointless, why bother with truth?

Update (17-06-19) re the Heidegger quotes (with thanks to Iain McGilchrist for this information)

The first part, » Die Wissenschaft denkt nicht «, is originally from page 4 of Heidegger’s Was heißt denken?, the version of his lectures given in Freiburg in 1951-2, published by Max Niemeyer Verlag, Tübingen (1954), and later translated into English by F.D. Wieck & J.G. Gray as What is Called Thinking? (Harper & Row, 1968).  Heidegger then repeated it in a conversation with his pupil the German philosopher Richard Wisser on the 17th September 1969, in which he follows it by another phrase in explanation, thus: » Und dieser Satz: die Wissenschaft denkt nicht, der viel Aufsehen erregte, als ich ihn in einer Freiburger Vorlesung aussprach, bedeutet: Die Wissenschaft bewegt sich nicht in der Dimension der Philosophie. Sie ist aber, ohne daß sie es weiß, auf diese Dimension angewiesen «. In H Heidegger (ed), Martin Heidegger: Gesamtausgabe, Part One, Veröffentlichte Schriften 1910-1976, vol 16, Reden und Andere Zeugnisse eines Lebensweges, Vittorio Klostermann, Frankfurt am Main, 2000, 702-710 (705).

Publishing in open access journals

Periodically I receive a message from Taylor and Francis about how often a paper I published with Mariana Funes has been read. This week they sent me the following message:

Of course Taylor and Francis can’t know whether or not the article has been read. They can only know how many times the article has been clicked on or downloaded. And, yes, sharing the article on social media (as we did when it was first published on Feb 28 2018) may well increase the article’s reach.

It is gratifying to see that the article has been accessed more than 500 times on the Taylor and Francis website, but of course Learning, Media and Technology is a closed journal so the reach of the article will necessarily be confined to those with access.

But the journal did allow open publication of the pre-print of the article, which Mariana and I posted on this blog, where the (virtually identical) pre-publication version can be accessed for free. This version of the article has had a much wider reach.

In 2018, this blog post was clicked on 4,070 times. This year to date it has been clicked on 231 times.

Again, it is not possible to say whether or not the article has been read, only that there is sufficient interest in it for people to click on the blog post.

As yet, there hasn’t been a mad rush to cite this paper. Google Scholar shows that it has been cited twice this year. Whilst it may be that this is not a paper that will be much cited, I do know from experience that it can take a year or two for papers to come to the attention of other researchers, so there is time yet for it to be more widely cited.

It is possible that open journals still don’t have the kudos of closed journals. Someone recently told me that it wouldn’t be worth my while applying for a job that required a PhD and 4 papers published in ranked journals, because most of my papers have been published in open journals, which, because many are fairly new journals, are still building their reputation. It has always been my preference to publish in open journals. I appreciate Taylor and Francis wanting their journal articles to reach a wider audience, but I suspect that their reasons for this are different to mine.

Kudos or not, the point is that it is not simply using social media to disseminate research that makes a difference to its reach. My experience suggests that extending a paper’s reach depends at least in part on whether or not the paper can be openly accessed.

But I am grateful to the Learning, Media and Technology journal for not only publishing our article, but also allowing us to openly publish the final pre-print version, and I will now follow their advice and tweet this blog post, so that, hopefully, the article will continue to reach a wider audience. The prize for every author is to be read.

Reference

Funes, M. & Mackness, J. (2018). When inclusion excludes: a counter narrative of open online education. Learning, Media and Technology. https://doi.org/10.1080/17439884.2018.1444638

Crafting Research

The seminar on crafting research that I attended last week was very interesting. It was organised by the Department of Organisation, Work and Technology at Lancaster University, UK, and delivered by Professor Hugh Willmott from City University London. Hugh Willmott has been working with Professor Emma Bell from the Open University. His talk was based on a paper they are working on, in which they are exploring the significance of crafting research in business and management, although, having heard the talk, the ideas presented seem relevant to social science research in general.

The essence of the work lies in an interest in how to produce well-crafted research and avoid Baer and Shaw’s (see reference list) criticism:

As editors, we are often surprised by the lack of “pride and perfection” in submitted work, even when there is a kernel of a good idea somewhere in the manuscript. Submitted manuscripts that report results from research designs in which many shortcuts have been taken are rather commonplace. In addition, many papers seem to have been hastily prepared and submitted, with obvious rough edges in terms of grammar and writing style.

In their article Baer and Shaw quote C.W. Mills as follows:

Scholarship is a choice of how to live as well as a choice of career; whether he knows it or not, the intellectual workman forms his own self as he works toward perfection of his craft; to realize his potentialities, and any opportunities that come his way, he constructs a character which has at its core the qualities of the good workman. —C. W. Mills, 1959

The seminar started with a look at the online etymology dictionary, where we can see that the word craft has, over time, shifted in meaning from ‘power, physical strength, might’ to ‘skill, dexterity’.

The thrust of the argument made was that researchers should shift towards being craftsmen who are dedicated to the community, have a social conscience and are aware of and acknowledge the ethical and political dimensions of their research. Such an approach would also openly acknowledge uncertainty and bias in research and the role of embodiment and imagination.

The image that ran through this presentation was Simon Starling’s art work ‘Shedboatshed’.

Starling was the winner of the Turner Prize in 2005.  For this work he dismantled a shed and turned it into a boat; loaded with the remains of the shed, the boat was paddled down the Rhine to a museum in Basel, dismantled and re-made into a shed. See:  http://www.tate.org.uk/whats-on/tate-britain/exhibition/turner-prize-2005/turner-prize-2005-artists-simon-starling for further information.

I’m not sure that I fully understand the significance of using Starling’s work in relation to crafting research unless it’s that his work has been described as research-based and clearly involves research and craft. Maybe it’s simply that Starling deconstructs familiar things to recreate them in different forms?

Interestingly, in the questions that followed the seminar, the thorny issue of having to write in a prescriptive way to be accepted in high ranking journals was discussed. Some members of the audience seemed to accept this as a given constraint which cannot be surmounted, i.e. we need to write and present research in a way which will not only be accepted by the given journal, but also will meet the requirements of the University’s REF. For some of the seminar participants there seemed to be no room for embodied, imaginative research which embraces uncertainty. My suggestion that we should perhaps look for alternative publishing outlets, blogs being one example, was met with an outcry of protest from one or two in the audience. ‘No one reads blogs’ they said, and besides ‘Blogging is cowardly’. I neither understood this nor agreed. The conversation seemed to endorse these sentences from Baer and Shaw’s paper:

Our goal was to reaffirm the notion that scholarly pursuit in the management sciences is a form of craftsmanship—we are craftsmen! Some may dismiss our arguments as idealistic or romantic. The realities of life as an academic, the pressures we are under—to publish in order not to perish—offer an all-too-convenient excuse to dismiss our ideas.

What a sorry state of affairs, but I do know from experience that many journals are not prepared to take a chance on non-conventional styles of presentation: Introduction, Literature Review, Method, Results, Conclusion remains the format most likely to be accepted, and to suggest that the research endeavour might have failed, or that there is a degree of uncertainty around the results, is unlikely to lead to a favourable response. It seems there’s a long way to go before the idea of crafting research in the terms presented by Hugh Willmott is widely accepted.

A wide range of literature was referred to in this seminar, which will be interesting to follow up on. See the references below.

References

Adamson, G. (2013). The Invention of Craft. Bloomsbury Academic.

Alley, M. (2018, 4th edition). The Craft of Scientific Writing. Springer.

Baer, M. & Shaw, J.D. (2017). Falling in love again with what we do: Academic craftsmanship in the management sciences. Academy of Management Journal, 60(4), 1213-1217.

Bell, E., Kothiyal, N. & Willmott, H. (2017). Methodology-as-Technique and the Meaning of Rigour in Globalized Management Research. British Journal of Management, 28(3), 534–550.

Burrell, G. & Morgan, G. (1979). Sociological paradigms and organisational analysis: elements of the sociology of corporate life. Heinemann

Cunliffe, A. (2010). Crafting qualitative research: Morgan and Smircich 30 years on. Organizational Research Methods, OnlineFirst.

Delamont, S. & Atkinson, P.A. (2001). Doctoring uncertainty: mastering craft knowledge. Social Studies of Science, 31(1), 87-107.

Frayling, C. (2017, reprint edition). On Craftsmanship: Towards a New Bauhaus. Oberon Books

Kvale, S. & Brinkmann, S. (2008, 2nd edition). InterViews: Learning the Craft of Qualitative Research Interviewing. SAGE Publications.

Wright Mills, C. (2000). The Sociological Imagination. Oxford University Press.

#SOCRMx Week 4: ‘half-way’ tasks and reflections


This post should have been made at the end of last week. We are now at the end of Week 5 and I am attempting to catch up.

We are now half-way through this 8-week Introduction to Social Research Methods course. I continue to be impressed by the content, but the course doesn’t really lend itself to much discussion. I am grateful that it is open and that I have access to the excellent resources, but the course has been designed for Edinburgh University Masters and PhD students – the rest of us must fit in where we can.

There are two tasks for Week 4. I have completed one – rather hurriedly – but will report on both.

The first task for Week 4 was to consider one of the research methods we explored in Weeks 2 and 3 and answer the following questions in a reflective blog post.

  • What three (good) research questions could be answered using this approach?
  • What assumptions about the nature of knowledge (epistemology) seem to be associated with this approach?
  • What kinds of ethical issues arise?
  • What would “validity” imply in a project that used this approach?
  • What are some of the practical or ethical issues that would need to be considered?
  • And finally, find and reference at least two published articles that have used this approach (aside from the examples given in this course). Make some notes about how the approach is described and used in each paper, linking to your reflections above.

So far, I have explored the resources related to Surveys, Working with Images, Discourse Analysis and Ethnography. All have been extremely useful and I have written posts about the first three. I will move on to Interviews next and hope to explore the remaining methods (Focus groups, Experimental interventions, Social network analysis and Learning Analytics) before the end of the course.

I have decided not to do this week’s reflection task which requires answering the questions above. For me these questions will be useful when I am working on an authentic research project, but I don’t want to spend time working through them for a hypothetical project. As I mentioned in a previous post I tend to work backwards on research, or at least backwards and forwards, i.e. I get immersed and see what happens rather than plan it all out ahead of time. That doesn’t mean that the questions above are not important and useful, they are, but for me they are ongoing questions rather than up-front questions. This approach to research doesn’t really fit with traditional Masters or PhD research.  I did do a traditional Masters but felt I was ‘playing the game’ in my choice of dissertation topic.  My PhD by publication was a much better fit with the way I work, but even that was playing the game a bit! My independent research has never felt like ‘playing the game’. It has always stemmed from a deep personal interest in the research question.

The second task for Week 4 was to review a “published academic journal article, and answer a set of questions about the methods employed in the study”. I have completed this task, but not submitted it for assessment, since I am not doing this course for assessment. The assessment is a set of multiple-choice questions.

At this point it’s worth mentioning that there are a lot of multiple-choice quizzes in this course and that I am hopeless at them! I rarely get a full score, although I think I have answered these Week 4 task questions correctly. Most of the quizzes in this course allow you to have multiple attempts and sometimes I have needed multiple attempts. Thank goodness for a second computer monitor, where I can display the text being tested at the same time as trying to answer the quizzes. Having two monitors is essential to the way I work and even more essential for my research work. I’m not sure that multiple-choice quizzes do anything for my learning, other than to confirm that I have completed a section. I would prefer an open controversial question for discussion, but in this course there is so much content to cover that there would be no time for this.

But again, some excellent resources have been provided for this week. Particularly useful is the reference to this open textbook: Principles of Sociological Inquiry – Qualitative and Quantitative Methods, with specific reference to Chapters 14.1 and 14.2.

I am copying this helpful Table (from the open textbook) here for future reference: Table 14.2 Questions Worth Asking While Reading Research Reports

  • Abstract: What are the key findings? How were those findings reached? What framework does the researcher employ?
  • Acknowledgments: Who are this study’s major stakeholders? Who provided feedback? Who provided support in the form of funding or other resources?
  • Introduction: How does the author frame his or her research focus? What other possible ways of framing the problem exist? Why might the author have chosen this particular way of framing the problem?
  • Literature review: How selective does the researcher appear to have been in identifying relevant literature to discuss? Does the review of literature appear appropriately extensive? Does the researcher provide a critical review?
  • Sample: Was probability sampling or nonprobability sampling employed? What is the researcher’s sample? What is the researcher’s population? What claims will the researcher be able to make based on the sample? What are the sample’s major strengths and major weaknesses?
  • Data collection: How were the data collected? What do you know about the relative strengths and weaknesses of the method employed? What other methods of data collection might have been employed, and why was this particular method employed? What do you know about the data collection strategy and instruments (e.g., questions asked, locations observed)? What don’t you know about the data collection strategy and instruments?
  • Data analysis: How were the data analyzed? Is there enough information provided that you feel confident that the proper analytic procedures were employed accurately?
  • Results: What are the study’s major findings? Are findings linked back to previously described research questions, objectives, hypotheses, and literature? Are sufficient amounts of data (e.g., quotes and observations in qualitative work, statistics in quantitative work) provided in order to support conclusions drawn? Are tables readable?
  • Discussion/conclusion: Does the author generalize to some population beyond her or his sample? How are these claims presented? Are claims made supported by data provided in the results section (e.g., supporting quotes, statistical significance)? Have limitations of the study been fully disclosed and adequately addressed? Are implications sufficiently explored?

Finally – some of the course participants have completed the first task and posted their reflections on their blogs. See

Now to see if I can make a start on Week 5 which finished today!

#openedMOOC Week 5: Research on OER Impact and Effectiveness


I have not done any research into OER Impact and Effectiveness and I don’t, in my career as a teacher, remember ever heavily relying on a textbook that students would have to buy. I do remember having to buy them myself during my own undergraduate studies, but that was back in the mid 60s. When I was teaching in Higher Education at the end of the 90s and early 2000s, we would recommend textbooks which the students could buy if they wished, or could take out of the library (we tried to ensure multiple copies were in the library), but mostly we wrote our own materials and gave students hand-outs in the sessions. I must have written many textbooks’ worth of hand-outs during my career. It never occurred to us at that time to share these online, but even if we had wanted to it would not have been possible, because all the materials we produced belonged to the institution. Of course, we also referred students to open online sites where they could explore further materials and dig deeper. So I haven’t had a lot of experience of this heavy reliance on expensive textbooks for teaching, although I have in my own research had difficulty accessing materials behind paywalls (see below).

It seems that, at least in the US, there is this reliance on expensive textbooks, and that explains the push for further research that David Wiley talks about in this week’s video. He tells us that whereas in the early days of this research the focus was on surveys and finding out what OERs were being used, and what happens when you use OERs, there is now a need for more nuanced research into what difference they make to student outcomes. According to David Wiley, research into OER adoption is still at an early stage and there is a need for further research into how OERs are produced and used, and how they are used in teaching.

Stephen Downes in his video for this week once more gets to the nub of the issue when he questions what we mean by impact and effectiveness. He tells us that research has shown that the medium makes no difference to student outcomes, i.e. it makes no difference whether the student learning environment is open or closed. The obvious difference that OERs will make to the student is cost.

As an aside here, from my own perspective, I doubt I would be a researcher if there weren’t OERs. I remember when we submitted our first paper on the CCK08 learner experience in 2009, the reviewers criticised the number of blog posts we referenced. There were two reasons we referenced blog posts: first, at that time there were no research papers on MOOCs to reference; second, even if there had been, if they were in closed journals (and there were not many open access journals back then), as independent researchers we would not have been able to access them. I still have these issues, particularly in relation to books, which I often cannot afford; there are not enough open access e-books.

Returning to Stephen’s video, the point where I think he really nails it is his discussion of what we mean by impact. He thinks, and I agree, that impact is more than grades, graduation and course completion. For him we should be looking at a person’s ability to:

  • Play a role in society
  • Live a happy and productive life
  • Be healthy
  • Engage in positive relationships with others
  • Live meaningfully
  • Have a valuable impact as seen through their own eyes and through the eyes of society

He asks how using open content changes any of these. If OERs are only used by the teacher then there won’t be much change. He says open is how you do things, open is when you share how you work with other people, open is when you take responsibility for ensuring that knowledge is carried forward into the next generation. This is the long-term impact of your value and worth in society. Stephen asks where the research on this is – we need research on how open resources help society. This seems to me like the big picture and quite a challenge for research.

#openedmooc participants have responded to this week’s resources in different ways.

Matthias Melcher has questioned what we mean by research effectiveness in his blog post for this week.

Geoff Cain also reminds us not to forget the role of Connectivism and the role of history in open education (see also Martin Weller’s recent post; Katy Jordan has done some amazing work in relation to this – I wish I had her technical skills!).

Merle Hearns  has done a great job of commenting on this video A review of the Effectiveness & Perceptions of Open Educational Resources as Compared to Textbooks  and further discusses Martin Weller’s paper The openness-creativity cycle in education as well as sharing her own work.

Benjamin Stewart keeps plugging away asking for a more critical perspective, both in his tweets, e.g. https://twitter.com/bnleez/status/924735752336039936 and in his blog.

For more blog posts see the course site.

Finally there are some great resources provided this week, which I have copied here for future reference. These are links to publication lists. For anyone doing research into OER, they would be a great help.

#SOCRMx: Week 4 – Discourse Analysis


In Week 4 the Introduction to Social Research Methods course requires participants to move on and 1) reflect on a chosen method, and 2) test their ability to identify specific information about methods in a given research paper. I hope to get round to this but I am behind and am not ready to do it yet. I still want to explore some of the methods that I haven’t had time to engage with and take advantage of the resources provided.

In this post I will share my notes from watching Sally Wiggins’ video introducing Discourse Analysis. I have not attempted to complete the associated task, or to synthesise the other resources and information provided by the course. There are many more resources in the Week 2/3 materials of the course site. And some participants have tackled this as a course task. See for example these blog posts:

http://lizhudson.coventry.domains/general-blog-posts/research-method-option-1-discourse-analysis/

https://screenface.net/week-3-socrmx-discourse-analysis/

http://www.brainytrainingsolutions.com/discourse-analysis-facebook-conversation/#.WfL87hNSxTY

http://focusabc.blogspot.co.uk/2017/10/discourse-analysis-in-focus-example.html

Discourse analysis is not a method I have used, but it seems to be relevant to the research I have done and my interests.

My notes

Discourse analysis is a method for collecting qualitative data through the analysis of talk and text. It works from the premise that talk constructs rather than reflects reality, that talk is a social act, and that talk and writing are never neutral.

Sally Wiggins in her video introducing discourse analysis tells us there are 5 types:

  1. Conversation analysis
  2. Discursive psychology
  3. Critical discursive psychology
  4. Foucauldian discourse analysis
  5. Critical discourse analysis

She explained that conversation analysis and discursive psychology approaches look at the detail of discourse (with a zoom lens), whilst critical discursive psychology and Foucauldian discourse analysis are interested in a broader perspective (wide angle lens). Critical discourse analysis is between these two. Before using discourse analysis as a method, we must decide which lens to use.

Conversation analysis (CA): uses tape recorders and other technologies to capture the detail of conversation. All aspects are captured, including body language, to explore how social interactions work. CA is all about illuminating the things we take for granted, all those intricate everyday social actions, and exploring them in great detail.

Discursive psychology (DP): examines the detail of interaction but also explores issues such as identities, emotions and accountabilities. Like CA it also uses technologies, such as video, to record interactions, but is used to explore how psychological states are invoked.

Critical discursive psychology (CDP): seeks a perspective which is somewhere between the zoom and wide angle lenses, blending the detail of interaction with broader social issues. It can’t be reduced to a line by line analysis, but instead examines patterns in the data in terms of culturally available ways of talking (interpretative repertoires). It explores familiar ways of talking about issues that shape and structure how we understand concepts in a particular culture. It uses interviews and focus groups to explore everyday, common-sense ways of understanding issues as they are produced in everyday talk.

Foucauldian discourse analysis (FDA): emerged from post-structuralism. It takes a wide angle perspective on how discourses are connected to knowledge and power. It draws on textual and visual images, such as advertisements, as well as conversations, interviews and focus groups. FDA is interested in the implications of discourse for our subjective experience, how discourse and knowledge change over time and how this affects people’s understanding of themselves.

Critical discourse analysis (CDA): takes a wide angle perspective and is the most critical form of discourse analysis. Its foundations lie in critical linguistics, semiotics and sociolinguistics. CDA seeks to reveal hidden ideologies underlying particular discourses, and how discourses are used to exert power over individuals and groups. CDA is used when we want to focus on a social problem of some kind. It draws heavily on semiotics and on how words and images are combined to convey meaning in particular ways. It tries to unpack layers of meaning. CDA has a political vision, e.g. it is used to explore how individuals or groups are marginalized or dominated by other groups in society. It uses broad texts and images and seeks to expose ideologies that underpin a particular discourse. CDA sheds light on social inequalities and how these are produced through certain discourses, but it also illuminates ways to challenge these discourses.

Just a minimal amount of wider reading around discourse analysis reveals there to be a wealth of literature related to this research method. I suspect it is not a method to be taken up lightly. I would have liked further examples of research questions that have been addressed using each of the five types of discourse analysis. Of the five types, I am most drawn to critical discourse analysis and critical discursive psychology.

#SOCRMx: Week 3 – Working with images

I have found the working with images resources in the Introduction to Social Research Methods MOOC very stimulating. According to the information provided in this course, visual methods are becoming increasingly popular.  I have always been interested in images, knowing that they can elicit ideas and feelings that words cannot. John Berger in his series of programmes on “Ways of Seeing” showed that the relation between what we see and what we know is never settled.

There are three kinds of visual data:

  • researcher created, e.g. diagrams, maps, videos, photos
  • participant created, e.g. video diaries
  • researcher curated, e.g. a photo essay, cultural anthropology

Digital technologies have greatly increased the possibilities for working with each of these kinds of data. Images can also be used to elicit information in interviews.

Key considerations when working with visual images for research are: Why use this method? How can it address the research question? What are the best images for the given question? How can the image/s be accessed? What are the ethical implications of using images, e.g. research participant anonymity and right to privacy?

With respect to photos, further considerations relate to how a photo is conceptualised. Is it a copy or is it a more complex construction? Does the camera never lie or do the eye and brain perceive differently to the camera? Do we accept that the photo is evidence or do we consider how the photo was produced, what choices were made, what is included/excluded, what was around the photo that cannot be seen?

The strengths of visual research methods are thought to be that they can:

  • Generate more talk
  • Evoke sensory, affective and emotional responses
  • Encourage reflection on what is taken for granted, what is hidden, what is visible, what is not visible
  • Engage with people who find talk challenging
  • Reduce power differentials
  • Be inherently collaborative and interpreted through communication

This week’s task

The task for this method is to spend an hour or two engaging in a small-scale image-creation research activity. I have not taken a photo specifically for this task, but have trawled back through my own photos to find one that might fit the task and raise some of the issues that need to be addressed.

I have selected this photo that was taken in 2012. I could envisage this photo being used for example with Indian tourism students to explore perceptions of inequality.


We have been asked to consider six questions.

  • What is depicted in the image(s)?

I think this would be an interesting question to ask the tourism students. For me the image shows an Indian woman carrying a small child, apparently unaffected by a white woman sunbathing nearby. This appears to be a normal situation and each appears oblivious of the other, maybe indicating that they live in separate worlds even though they are inhabiting the same space.

  • What were you trying to discover by creating your image(s)?

At the time I was on holiday in Mamallapuram, south of Chennai in India. This photo was not planned, but I noticed the incongruity suggested by the scene, probably because I am a white woman and was a tourist. Neither subject was aware of me taking the photo. I don’t think there were any ethical issues involved in taking the photo – lots of unknown people appear in my holiday photos. I’m not sure what the ethics would be of using this photo for a real research project, given that there is no way that I could identify or contact either of the subjects.

  • What did the process of image creation involve?

I was in the right place at the right time with my camera ready. This photo was not staged. It was a snapshot in time, but nevertheless I was aware at the time that it conveys a message beyond a beach scene.

  • What is not seen, and why?

The photo is as it was taken. It might have been cropped and sharpened – I don’t remember, but just looking at it through this frame makes it appear that there are just two people on the beach. In fact I was sitting in a restaurant on the edge of the beach, full of tourists, and the beach was full of people, both Indians and tourists from around the world. There were also fishermen with their boats on the beach. It was a lively location and was situated within walking distance of the exquisite Mahabalipuram stone carvings. Does knowing this change how the photo is perceived?

  • How is meaning being conveyed?

Through the proximity of the two subjects who are so near but so far from each other. They are back to back, facing in opposite directions, but don’t appear concerned, or even to have noticed this ambiguity. Further opposites are conveyed through their clothing and through their posture – one is walking and the other lying.

  • With respect to the photograph, how might the image convey something different from your experience of ‘being there’?

The image appears still and quiet, without sound, but in fact it was busy and there was plenty of noise: chatter, laughter, shouting, music, the sound of the sea and so on. Indian tourism students may have seen this type of scene so often that they do not notice it, or if they do it may not concern them. Alternatively it may concern them greatly. As tourism students, are the contradictions evident in this photo something they should be concerned about? What issues are raised?

#SOCRMx End of Week 3 Reflections

This is the third week of the Introduction to Social Research Methods MOOC, which I am finding both very useful and frustrating. It is very useful because the resources provided (as mentioned in a previous post) are really excellent, but unfortunately some of them are locked down in closed systems and so are only accessible to course participants. I wish there was more time to engage with them all properly. Their high quality has left me wondering whether I should spend time making sure I have seen them all or whether I should focus on the weekly tasks and on trying to follow other participants.

The course is frustrating because there is little social interaction – or have I missed it? The majority of participants seem to be doing a Masters or a PhD at the University of Edinburgh, so completing the tasks and getting feedback from a tutor on those tasks must be a high priority for them, and the tasks take quite a bit of time, not leaving much time for discussion. In addition, it’s difficult to respond to the task requirements in short posts, leading to long pages of text which are demotivating in terms of discussion. I find the design of the edX discussion forums terrible – very time consuming and difficult. I feel as though I have wasted time trying to follow what little discussion there is in these forums.

I wondered whether there was more discussion on participants’ blogs than in the forums, so I have spent some time collating all the blogs I could find. If blogs are going to be used in MOOCs, then my view is that it’s essential that these are centrally aggregated. This was realized as long ago as 2008 in the first MOOC – CCK08. This is the list of bloggers I have found.

There are probably more than this. I am finding it very difficult to get a sense of who is doing this MOOC, from where and why. The map that we were all asked to add our names to in the first week no longer seems to be on the site (or if it is, I can no longer find it), so I have no sense of how many people are on the course. From the forum posts that I have read, there seem to be people from the States, Latin America, Australia and Europe, but I’m not clear about whether they are students of Edinburgh University or not.

I am going to persevere with the MOOC because of the high quality of the resources and I will also try and follow the blogs I have found, although I suspect that not all participants are blogging that much.

However, on reflection I have decided that I probably won’t engage fully with the tasks. My response to last week’s task on Surveys was, I acknowledge, quite half-hearted, whereas I can see that some participants made a really good job of it. One participant has commented that it is difficult to engage in tasks for which there doesn’t seem to be a real purpose. I agree. I find it difficult to get motivated to write survey questions or complete some of the other tasks with no intention of doing this for an actual research project. This is not helped by the fact that I am actually, at this very time, finishing writing a research paper, so my ‘head’ is in another zone.

Nevertheless this process and reflection have been helpful – because I have realized, even more clearly than before, that in all my research I have worked backwards rather than forwards. This means that I haven’t decided ‘I am going to go out and research that; these are my questions; this is the methodology I will adopt; and these are the methods I will use’. All my research has emerged, almost serendipitously, from my experience – mostly experience of participating in MOOCs. At the end of the MOOC (or equivalent experience) I find I have met people who, like me, have unanswered questions and want to probe further, and then it goes from there. It is messy. The questions keep changing, the data are difficult and messy to gather, and it takes months and months to make sense of them. The survey we designed to research the use of blogs and forums in the CCK08 MOOC took months and months of convoluted discussion. We didn’t concoct these questions from thin air; we drew them from our data, endless hours of trawling blogs and forums for what participants had said. We then spent further endless hours debating these statements, their language, whether they made sense – and yet we have been asked in this MOOC to write a set of hypothetical survey questions in one week. In addition, all my research has been collaborative, so it feels strange to be working on the methods tasks in isolation, however half-heartedly.

To end on a more positive note, I have thoroughly enjoyed going through all the Visual Methods and Ethnography resources this week, which have been very informative.

And to end on a fun note, one of the participants, Helen Walker (@helenwalker7), has just posted an infographics quiz on her blog – the ‘Who old are you?’ quiz shows me to be at the limits of my creative zenith, career and worldly success. Maybe that accounts for this post 🙂