#SOCRMx Week 7: Objectivity and Subjectivity in Social Research

The past two weeks in the Introduction to Social Research Methods MOOC (SOCRMx) have covered ‘How to Lie with Statistics’ (Week 6 Quantitative data analysis) and ‘Interpretation and Trustworthiness in Qualitative Data Analysis’ (Week 7).

Lots of resources have been provided for the Week 6 topic on quantitative data analysis. I have saved them for future reference, should I ever go down that track. It was interesting to see that more course participants got involved in discussion in the Week 6 forums. I’m not sure if that’s because a positivist approach comes more easily to those new to research, or whether it was because once again the resources were extremely good and the tutor for this week – Rory Ewins – actually engaged in the forums and provided feedback!

The resources for Week 7 on qualitative data analysis are very familiar given that all my research has taken this approach, but once more the resources provided on what we mean by trustworthiness and how to code qualitative data are helpful and thought-provoking. Again I have saved these resources for future reference, when I can explore them with more time. I think both these topics deserve more than a single week to get to grips with.

The topic of qualitative analysis reminded me of an excellent video that was shown in the first week of the course under the title of theoretical considerations in research design. This is a SAGE research methods video and therefore not open access, but if anyone from SAGE visits this post, I would urge them to make this video publicly available. I think both novice and experienced researchers would find it hugely helpful.

The video shows John Scott (Professor of Sociology, Plymouth University), Malcolm Williams (Professor in the School of Social Sciences, Cardiff University) and Gayle Letherby (Professor of Sociology, Plymouth University) discussing objectivity and subjectivity in social research, having published a book about this in 2014:

Scott, J., Williams, M. & Letherby, G. (2014). Objectivity and Subjectivity in Social Research. London: SAGE Publications Ltd.

These are the notes I made from watching the video.

John Scott starts by drawing on the work of Kant and Mannheim to discuss how subjectivity, objectivity, relativity and truth have multi-faceted meanings. We see the world differently according to our social location (our perspective) and we construct knowledge relative to that location. But if this is so, how, as researchers, can we present a truthful representation of the world? One way we can do this (which Iain McGilchrist has also discussed in his work on the Divided Brain) is to try and synthesise different standpoints (alternative perspectives) to achieve a better model for understanding the world.

The three authors each approach research from a different perspective. Malcolm Williams adopts a socially situated objectivity approach, Gayle Letherby a theorised subjectivity approach, and John Scott is somewhere in the middle of these two. The point was made that it’s not necessarily a question of either/or, as shown by mixed methods approaches.

Malcolm Williams sees objectivity as a value which is socially constructed. The pursuit of truth is a key research value and he argues that a necessary condition of objectivity is to begin with subjectivity.

Gayle Letherby recognises the personhood of research and the complex relationship between researcher and respondent. From this perspective research is subjective, power-laden, emotional and embodied. A theorised subjectivity approach is concerned with how identity plays out in the research process. Gayle Letherby tells us that interrogation of the self in research, with reference to the ‘other’, gets us closer to a position that we might call objective, and that autobiography is always relevant. We need to interrogate our biases, but we should avoid the demonisation of subjectivity. Subjectivity is not a redefinition of objectivity, but starts from a different place. Objectivity and subjectivity are interrelated.

Whether we take a subjective or objective approach, social science research does have validity and needs to be defended. This does not mean that one account is as good as any other. Researchers must be responsible and explain how we got what we got, and how what we did affects what we got. Research always involves a view from somewhere, and we need to write about our subjective positions. Where did the research question come from? Why was a particular approach adopted? Can we justify why our findings are important? What is our ethical position? Have we acknowledged that ideas about knowledge can’t be separated from ideas about power? (Foucault’s work is relevant here.) The social and political role of research means that there is no escape from issues of power.

Scott, Williams and Letherby conclude that the validity of knowledge depends on having:

  • Open societies (Karl Popper)
  • An ideal speech situation (Habermas)
  • Free discussion (Mannheim)
  • Leaving power outside the door as much as possible.

#SOCRMx Week 5: Data Analysis

In this second half of the Introduction to Social Research Methods course the focus shifts from data generation and research methods to data analysis.


The main task for this week has been to look at the way in which the authors of two research papers (provided by the course) have analysed their data and presented their subsequent findings. One paper took a qualitative approach to data generation and the other a quantitative approach. Useful prompt questions have been provided to support this task.

For me, more interesting than this task are two points raised in the course text, which it was suggested we discuss in our blogs, though to my knowledge no-one has done so.

The first is related to the messiness of research which is drawn to our attention through a quote from Hardy and Bryman’s text – Handbook of Data Analysis – which incidentally is not open access.

This is the quote:

active researchers seldom march through the stages of design, data collection, and data analysis as if they were moving through security checkpoints that allowed mobility in only one direction. Instead, researchers typically move back and forth, as if from room to room, taking what they learn in one room and revisiting what was decided in the previous room, keeping the doors open. (Hardy and Bryman 2004, p.2)

This is very much in keeping with my experience, but I would suggest that it is even more messy than Hardy and Bryman suggest. Richard Feynman talked about living and working with doubt and uncertainty…

… and Paul Feyerabend argued ‘Against Method’. The Stanford Encyclopedia of Philosophy writes of Feyerabend:

‘…. whereas he had previously been arguing in favour of methodology (a “pluralistic” methodology, that is), he had now become dissatisfied with any methodology. He emphasised that older scientific theories, like Aristotle’s theory of motion, had powerful empirical and argumentative support, and stressed, correlatively, that the heroes of the scientific revolution, such as Galileo, were not as scrupulous as they were sometimes represented to be. He portrayed Galileo as making full use of rhetoric, propaganda, and various epistemological tricks in order to support the heliocentric position.’  

More recently Stephen Downes has also argued against method, suggesting that traditional approaches to research do not account for the horribly messy, complex, always changing world in which we are now living and conducting research. See Digital Research Methodologies Redux for his presentation.

A course like this Introduction to Social Research Methods necessarily presents an orderly sequenced set of resources and activities, but research ‘in the wild’ is far from orderly.

The second point made in the Week 5 course text that stood out for me was this one:

Hardy and Bryman (2004) also discuss **data reduction** as a core element of analysis (my bold): to analyze or to provide an analysis will always involve a notion of reducing the amount of data we have collected so that capsule statements about the data can be provided. (p.4)

This for me is an enormously significant statement. As McGilchrist says (The Master and His Emissary, p. 28):

The kind of attention we bring to bear on the world changes the nature of the world we attend to … Attention changes what kind of a thing comes into being for us: in that way it changes the world.

It follows then that how we choose to reduce the amount of data we have collected will determine what research outcomes ‘come into being’, what we learn from the research and will have implications for how the findings are used.
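Hardy and Bryman’s notion of data reduction can be made concrete with a small sketch. This is purely illustrative: the interview fragments and thematic codes below are invented, and real qualitative analysis would be far richer, but the reduction step is essentially this collapse of many coded fragments into a few capsule counts:

```python
from collections import Counter

# Hypothetical coded interview fragments: each fragment has been
# assigned one or more thematic codes by the researcher.
coded_fragments = [
    {"id": 1, "codes": ["isolation", "workload"]},
    {"id": 2, "codes": ["workload"]},
    {"id": 3, "codes": ["isolation", "community"]},
    {"id": 4, "codes": ["community", "workload"]},
    {"id": 5, "codes": ["community"]},
]

# Reduction: collapse the fragments into capsule statements,
# here simply counts of how often each theme occurs.
code_counts = Counter(
    code for fragment in coded_fragments for code in fragment["codes"]
)

for code, count in code_counts.most_common():
    print(f"{code}: appears in {count} of {len(coded_fragments)} fragments")
```

Which codes we define, and which we choose to count, is exactly the act of attention McGilchrist describes: a different coding scheme applied to the same fragments would bring a different set of findings ‘into being’.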

Both these statements in the Week 5 course materials, concerning the messiness of research and the reduction of data, seem to me to perhaps warrant more attention than they have received in the course.

References

Hardy, M. and Bryman, A. (2004). Introduction: common threads among techniques of data analysis. In Hardy, M. & Bryman, A. (Eds.), Handbook of Data Analysis (pp. 1-13). London: SAGE Publications Ltd. doi: 10.4135/9781848608184

McGilchrist, I. (2010). The Master and His Emissary: The Divided Brain and the Making of the Western World. Yale University Press.

#SOCRMx Week 4: ‘half-way’ tasks and reflections


This post should have been made at the end of last week. We are now at the end of Week 5 and I am attempting to catch up.

We are now half-way through this 8-week Introduction to Social Research Methods course. I continue to be impressed by the content, but the course doesn’t really lend itself to much discussion. I am grateful that it is open and that I have access to the excellent resources, but the course has been designed for Edinburgh University Masters and PhD students – the rest of us must fit in where we can.

There are two tasks for Week 4. I have completed one – rather hurriedly – but will report on both.

The first task for Week 4 was to consider one of the research methods we explored in Weeks 2 and 3 and answer the following questions in a reflective blog post.

  • What three (good) research questions could be answered using this approach?
  • What assumptions about the nature of knowledge (epistemology) seem to be associated with this approach?
  • What kinds of ethical issues arise?
  • What would “validity” imply in a project that used this approach?
  • What are some of the practical or ethical issues that would need to be considered?
  • And finally, find and reference at least two published articles that have used this approach (aside from the examples given in this course). Make some notes about how the approach is described and used in each paper, linking to your reflections above.

So far, I have explored the resources related to Surveys, Working with Images, Discourse Analysis and Ethnography. All have been extremely useful and I have written posts about the first three. I will move on to Interviews next and hope to explore the remaining methods (Focus groups, Experimental interventions, Social network analysis and Learning Analytics) before the end of the course.

I have decided not to do this week’s reflection task which requires answering the questions above. For me these questions will be useful when I am working on an authentic research project, but I don’t want to spend time working through them for a hypothetical project. As I mentioned in a previous post I tend to work backwards on research, or at least backwards and forwards, i.e. I get immersed and see what happens rather than plan it all out ahead of time. That doesn’t mean that the questions above are not important and useful, they are, but for me they are ongoing questions rather than up-front questions. This approach to research doesn’t really fit with traditional Masters or PhD research.  I did do a traditional Masters but felt I was ‘playing the game’ in my choice of dissertation topic.  My PhD by publication was a much better fit with the way I work, but even that was playing the game a bit! My independent research has never felt like ‘playing the game’. It has always stemmed from a deep personal interest in the research question.

The second task for Week 4 was to review a “published academic journal article, and answer a set of questions about the methods employed in the study”. I have completed this task, but not submitted it for assessment, since I am not doing this course for assessment. The assessment is a set of multiple-choice questions.

At this point it’s worth mentioning that there are a lot of multiple-choice quizzes in this course and that I am hopeless at them! I rarely get a full score, although I think I have answered these Week 4 task questions correctly. Most of the quizzes in this course allow multiple attempts, and sometimes I have needed them. Thank goodness for a second computer monitor, where I can display the text being tested while trying to answer the quiz questions. Having two monitors is essential to the way I work, and even more essential for my research work. I’m not sure that multiple-choice quizzes do anything for my learning, other than to confirm that I have completed a section. I would prefer an open, controversial question for discussion, but in this course there is so much content to cover that there would be no time for this.

But again, some excellent resources have been provided for this week. Particularly useful is the reference to this open textbook: Principles of Sociological Inquiry – Qualitative and Quantitative Methods, with specific reference to Chapters 14.1 and 14.2.

I am copying this helpful Table (from the open textbook) here for future reference: Table 14.2 Questions Worth Asking While Reading Research Reports

  • Abstract: What are the key findings? How were those findings reached? What framework does the researcher employ?
  • Acknowledgments: Who are this study’s major stakeholders? Who provided feedback? Who provided support in the form of funding or other resources?
  • Introduction: How does the author frame his or her research focus? What other possible ways of framing the problem exist? Why might the author have chosen this particular way of framing the problem?
  • Literature review: How selective does the researcher appear to have been in identifying relevant literature to discuss? Does the review of literature appear appropriately extensive? Does the researcher provide a critical review?
  • Sample: Was probability sampling or nonprobability sampling employed? What is the researcher’s sample? What is the researcher’s population? What claims will the researcher be able to make based on the sample? What are the sample’s major strengths and major weaknesses?
  • Data collection: How were the data collected? What do you know about the relative strengths and weaknesses of the method employed? What other methods of data collection might have been employed, and why was this particular method employed? What do you know about the data collection strategy and instruments (e.g., questions asked, locations observed)? What don’t you know about the data collection strategy and instruments?
  • Data analysis: How were the data analyzed? Is there enough information provided that you feel confident that the proper analytic procedures were employed accurately?
  • Results: What are the study’s major findings? Are findings linked back to previously described research questions, objectives, hypotheses, and literature? Are sufficient amounts of data (e.g., quotes and observations in qualitative work, statistics in quantitative work) provided in order to support conclusions drawn? Are tables readable?
  • Discussion/conclusion: Does the author generalize to some population beyond her or his sample? How are these claims presented? Are claims made supported by data provided in the results section (e.g., supporting quotes, statistical significance)? Have limitations of the study been fully disclosed and adequately addressed? Are implications sufficiently explored?

Finally – some of the course participants have completed the first task and posted their reflections on their blogs. See

Now to see if I can make a start on Week 5 which finished today!

#SOCRMx: Week 4 – Discourse Analysis


In Week 4 the Introduction to Social Research Methods course requires participants to move on and 1) reflect on a chosen method, and 2) test our ability to identify specific information about methods in a given research paper. I hope to get round to this, but I am behind and not ready to do it yet. I still want to explore some of the methods I haven’t had time to engage with, and take advantage of the resources provided.

In this post I will share my notes from watching Sally Wiggins’ video introducing Discourse Analysis. I have not attempted to complete the associated task, or to synthesise the other resources and information provided by the course. There are many more resources in the Week 2/3 materials of the course site. And some participants have tackled this as a course task. See for example these blog posts:

http://lizhudson.coventry.domains/general-blog-posts/research-method-option-1-discourse-analysis/

https://screenface.net/week-3-socrmx-discourse-analysis/

http://www.brainytrainingsolutions.com/discourse-analysis-facebook-conversation/#.WfL87hNSxTY

http://focusabc.blogspot.co.uk/2017/10/discourse-analysis-in-focus-example.html

Discourse analysis is not a method I have used, but it seems to be relevant to the research I have done and my interests.

My notes

Discourse analysis is a method for analysing qualitative data in the form of talk and text. It starts from the premise that talk constructs rather than reflects reality: talk is a social action, and talk and writing are never neutral.

Sally Wiggins in her video introducing discourse analysis tells us there are 5 types:

  1. Conversation analysis
  2. Discursive psychology
  3. Critical discursive psychology
  4. Foucauldian discourse analysis
  5. Critical discourse analysis

She explained that conversation analysis and discursive psychology approaches look at the detail of discourse (with a zoom lens), whilst critical discursive psychology and Foucauldian discourse analysis are interested in a broader perspective (wide angle lens). Critical discourse analysis is between these two. Before using discourse analysis as a method, we must decide which lens to use.

Conversation analysis (CA): uses tape recorders and other technologies to capture the detail of conversation. All aspects are captured, including body language, to explore how social interactions work. CA is all about illuminating the things we take for granted, all those intricate everyday social actions, and exploring them in great detail.

Discursive psychology (DP): examines the detail of interaction but also explores issues such as identities, emotions and accountabilities. Like CA it also uses technologies, such as video, to record interactions, but is used to explore how psychological states are invoked.

Critical discursive psychology (CDP): seeks a perspective which is somewhere between the zoom and wide angle lenses, blending the detail of interaction with broader social issues. It can’t be reduced to a line by line analysis, but instead examines patterns in the data in terms of culturally available ways of talking (interpretative repertoires). It explores familiar ways of talking about issues that shape and structure how we understand concepts in a particular culture. It uses interviews and focus groups to explore everyday, common-sense ways of understanding issues as they are produced in everyday talk.

Foucauldian discourse analysis (FDA): emerged from post structuralism. It takes a wide angle perspective on how discourses are connected to knowledge and power. It draws on textual and visual images, such as advertisements, as well as conversations, interviews and focus groups. FDA is interested in the implications of discourse for our subjective experience, how discourse and knowledge change over time, and how this affects people’s understanding of themselves.

Critical discourse analysis (CDA): takes a wide angle perspective and is the most critical form of discourse analysis. Its foundations lie in critical linguistics, semiotics and sociolinguistics. CDA seeks to reveal hidden ideologies underlying particular discourses, and how discourses are used to exert power over individuals and groups. CDA is used when we want to focus on a social problem of some kind. It draws heavily on semiotics and on how words and images are used to convey meaning in particular ways. It tries to unpack layers of meaning. CDA has a political vision, e.g. it is used to explore how individuals or groups are marginalized or dominated by other groups in society. It uses broad texts and images and seeks to expose ideologies that underpin a particular discourse. CDA sheds light on social inequalities and how these are produced through certain discourses, but it also illuminates ways to challenge these discourses.

Just a minimal amount of wider reading around discourse analysis reveals there to be a wealth of literature related to this research method. I suspect it is not a method to be taken up lightly. I would have liked further examples of research questions that have been addressed using each of the five types of discourse analysis. Of the five types, I am most drawn to critical discourse analysis and critical discursive psychology.

#SOCRMx: Week 3 – Working with images

I have found the working with images resources in the Introduction to Social Research Methods MOOC very stimulating. According to the information provided in this course, visual methods are becoming increasingly popular.  I have always been interested in images, knowing that they can elicit ideas and feelings that words cannot. John Berger in his series of programmes on “Ways of Seeing” showed that the relation between what we see and what we know is never settled.

There are three kinds of visual data:

  • researcher created, e.g. diagrams, maps, videos, photos
  • participant created, e.g. video diaries
  • researcher curated, e.g. a photo essay, cultural anthropology

Digital technologies have greatly increased the possibilities for working with each of these kinds of data. Images can also be used to elicit information in interviews.

Key considerations when working with visual images for research are: Why use this method? How can it address the research question? What are the best images for the given question? How can the image/s be accessed? What are the ethical implications of using images, e.g. research participant anonymity and right to privacy?

With respect to photos, further considerations relate to how a photo is conceptualised. Is it a copy or is it a more complex construction? Does the camera never lie or do the eye and brain perceive differently to the camera? Do we accept that the photo is evidence or do we consider how the photo was produced, what choices were made, what is included/excluded, what was around the photo that cannot be seen?

The strengths of visual research methods are thought to be that they:

  • Generate more talk
  • Evoke sensory, affective and emotional responses
  • Encourage reflection on what is taken for granted, what is hidden, what is visible, what is not visible
  • Engage with people who find talk challenging
  • Reduce power differentials
  • Are inherently collaborative and interpreted through communication

This week’s task

The task for this method is to spend an hour or two engaging in a small-scale image-creation research activity. I have not taken a photo specifically for this task, but have trawled back through my own photos to find one that might fit the task and raise some of the issues that need to be addressed.

I have selected this photo, taken in 2012. I could envisage it being used, for example, with Indian tourism students to explore perceptions of inequality.


We have been asked to consider six questions.

  • What is depicted in the image(s)?

I think this would be an interesting question to ask the tourism students. For me the image shows an Indian woman carrying a small child, apparently unaffected by a white woman sunbathing. This appears to be a normal situation and each appears oblivious of the other, maybe indicating that they live in separate worlds even though they inhabit the same space.

  • What were you trying to discover by creating your image(s)?

At the time I was on holiday in Mamallapuram, south of Chennai in India. This photo was not planned, but I noticed the incongruity suggested by the scene, probably because I am a white woman and was a tourist. Neither subject was aware of me taking the photo. I don’t think there were any ethical issues involved in taking the photo – lots of unknown people appear in my holiday photos. I’m not sure what the ethics would be of using this photo for a real research project, given that there is no way that I could identify or contact either of the subjects.

  • What did the process of image creation involve?

I was in the right place at the right time with my camera ready. This photo was not staged. It was a snapshot in time, but nevertheless I was aware at the time that it conveys a message beyond a beach scene.

  • What is not seen, and why?

The photo is as it was taken. It might have been cropped and sharpened – I don’t remember, but just looking at it through this frame makes it appear that there are just two people on the beach. In fact I was sitting in a restaurant on the edge of the beach, full of tourists, and the beach was full of people, both Indians and tourists from around the world. There were also fishermen with their boats on the beach. It was a lively location and was situated within walking distance of the exquisite Mahabalipuram stone carvings. Does knowing this change how the photo is perceived?

  • How is meaning being conveyed?

Through the proximity of the two subjects who are so near but so far from each other. They are back to back, facing in opposite directions, but don’t appear concerned, or even to have noticed this ambiguity. Further opposites are conveyed through their clothing and through their posture – one is walking and the other lying.

  • With respect to the photograph, how might the image convey something different to your experience of ‘being there’?

The image appears still and quiet, without sound, but in fact the beach was busy and there was plenty of sound: chatter, laughter, shouting, music, the sound of the sea and so on. Indian tourism students may have seen this type of scene so often that they do not notice it, or if they do, it may not concern them. Alternatively it may concern them greatly. As tourism students, are the contradictions evident in this photo something they should be concerned about? What issues are raised?

#SOCRMx End of Week 3 Reflections

This is the third week of the Introduction to Social Research Methods MOOC, which I am finding both very useful and frustrating at the same time. It is very useful, because the resources provided (as mentioned in a previous post) are really excellent, but unfortunately some of them are locked down in closed systems so only accessible to course participants. I wish there was more time to engage with them all properly. Their high quality has left me wondering whether I should spend time making sure I have seen them all or whether I should focus on the weekly tasks and trying to follow other participants.

The course is frustrating because there is little social interaction, or have I missed it? The majority of participants seem to be doing a Masters or a PhD at the University of Edinburgh, so completing the tasks and getting feedback from a tutor on those tasks must be a high priority for them and the tasks take quite a bit of time, not leaving much time for discussion. In addition, it’s difficult to respond to the task requirements in short posts, leading to long pages of text which are demotivating in terms of discussion. I find the design of the edX discussion forums terrible – very time consuming and difficult. I feel as though I have wasted time trying to follow what little discussion there is in these forums.

I wondered whether there was more discussion on participants’ blogs than in the forums, so I have spent some time collating all the blogs I could find. If blogs are going to be used in MOOCs, then my view is that it’s essential that these are centrally aggregated. This was realized as long ago as 2008 in the first MOOC – CCK08. This is the list of bloggers I have found.

There are probably more than this. I am finding it very difficult to get a sense of who is doing this MOOC, from where and why. The map that we were all asked to add our names to in the first week, no longer seems to be on the site (or if it is, I can no longer find it), so I have no sense of how many people are on the course. From the forum posts that I have read, there seem to be people from the States, Latin America, Australia and Europe, but I’m not clear about whether they are students of Edinburgh University or not.

I am going to persevere with the MOOC because of the high quality of the resources and I will also try and follow the blogs I have found, although I suspect that not all participants are blogging that much.

However, on reflection I have decided that I probably won’t engage fully with the tasks. My response to last week’s task on Surveys was, I acknowledge, quite half-hearted, whereas I can see that some participants made a really good job of it. One participant has commented that it is difficult to engage in tasks for which there doesn’t seem a real purpose. I agree. I find it difficult to get motivated to write survey questions or complete some of the other tasks with no intention of doing this for an actual research project. This is not helped by the fact that I am actually, at this very time, completing writing a research paper, so my ‘head’ is in another zone.

Nevertheless this process and reflection have been helpful, because I have realized, even more clearly than before, that in all my research I have worked backwards rather than forwards. This means that I haven’t decided ‘I am going to go out and research that’, these are my questions, this is the methodology I will adopt, and these are the methods I will use. All my research has emerged, almost serendipitously, from my experience – mostly experience of participating in MOOCs. At the end of the MOOC (or equivalent experience) I find I have met people who, like me, have unanswered questions and want to probe further, and then it goes from there. It is messy. The questions keep changing, the data is difficult and messy to gather, and it takes months and months to make sense of.

The survey we designed to research the use of blogs and forums in the CCK08 MOOC took months and months of convoluted discussion. We didn’t concoct these questions from thin air; we drew them from our data, from endless hours of trawling blogs and forums for what participants had said. We then spent further endless hours debating these statements, their language, and whether they made sense – and yet we have been asked in this MOOC to write a set of hypothetical survey questions in one week. In addition, all my research has been collaborative, so it feels strange to be working on the methods tasks in isolation, however half-heartedly.

To end on a more positive note, I have thoroughly enjoyed going through all the Visual Methods and Ethnography resources this week, which have been very informative.

And to end on a fun note, one of the participants, Helen Walker (@helenwalker7), has just posted an infographics quiz on her blog. The ‘Who old are you?’ quiz shows me to be at the limits of my creative zenith, career and worldly success. Maybe that accounts for this post 🙂

SOCRMx Week 2 Surveys

I have divided this post into two parts. The first part reports on information from the course site. The second part is an attempt to complete one of the suggested activities.

Surveys: Part 1

The Social Research Methods MOOC (SOCRMx) being run by Edinburgh University on edX is a treasure trove of content and resources. For Weeks 2/3 of the course we have been asked to ‘choose our own adventure’, and an adventure it is certainly proving to be. We have been given a list of methods with extensive associated resources for each (videos, reading, examples of research), and it has been suggested that we choose two and write a blog post about our understanding of each approach. There are also specific activities suggested for each method. This is the list of methods:

  • Surveys
  • Discourse analysis
  • Learning analytics
  • Interviews
  • Focus groups
  • Ethnography – Participant observation
  • Social Network Analysis (SNA)
  • Experimental intervention
  • Working with images

The videos associated with these resources are really excellent, especially the SAGE videos, but they are only available to course participants or SAGE subscribers, and they come with clearly and strongly stated restrictions:

You shall not:

  • print, email, publicly post, display, sell, distribute or otherwise share the content with non-registered users without further permission from SAGE.
  • transfer, sell or share login details for the express purpose of providing access to content for non-registered users.
  • translate, adapt, delete, alter or apply any other treatment to the content without further permission from SAGE.
  • be guaranteed access to the content after completion of the course.

This doesn’t at all fit with my aspirations for open education, but they are really good videos, so I am going to watch as many as I can, while I have the chance.

I started at the top of the list with ‘Surveys’ simply out of curiosity, with no real initial intention of choosing this as a method to follow up on, but I quickly found myself drawn in by the excellent videos. As one of the presenters said, there are now 30-40 years of research into questionnaire design to draw on, and plenty of information out there about using surveys for research. I first used a survey in my MA research, 20 years ago. At that time we were encouraged to use Judith Bell’s book ‘Doing Your Research Project’, and looking at it now I can see that the information it provides is very similar to the information provided by this course. There are, however, a couple of topics that Judith Bell doesn’t discuss:

  1. The difficulties associated with Agree/Disagree questions, which can be problematic as they require respondents to go through extra cognitive steps and can lead to acquiescence response bias. This link from the Survey Monkey blog explains this well – https://www.surveymonkey.com/blog/2016/12/21/lets-agree-not-use-agreedisagree-questions/

This link also mentions recent developments in questionnaire design that use technology to collect paradata, e.g. measuring how long a respondent takes to answer a question, or using eye-tracking technology ‘to track respondents’ attention to each type of question’.

  2. The difficulty of getting honest responses when dealing with sensitive topics. Two techniques are used, with varying degrees of success, to address this: the item count technique and randomised response.
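Randomised response is easier to grasp with a quick simulation. The sketch below is my own illustration, not from the course materials, and uses the forced-response variant: each respondent privately flips a fair coin and answers “yes” on heads regardless of the truth, or honestly on tails. No individual “yes” is incriminating, yet the aggregate still reveals the true rate, because P(yes) = 0.5 + 0.5 × true rate.

```python
import random

def randomized_response_estimate(true_rate, n, seed=0):
    """Simulate a forced-response randomised response survey.

    true_rate: the (unknown in practice) proportion holding the
    sensitive trait; n: number of simulated respondents.
    """
    rng = random.Random(seed)
    yes_answers = 0
    for _ in range(n):
        holds_sensitive_trait = rng.random() < true_rate
        coin_is_heads = rng.random() < 0.5
        # Heads forces a "yes"; tails means answer honestly.
        if coin_is_heads or holds_sensitive_trait:
            yes_answers += 1
    observed_yes_rate = yes_answers / n
    # Invert P(yes) = 0.5 + 0.5 * true_rate to recover the estimate.
    return 2 * observed_yes_rate - 1

# With a true rate of 30% and 100,000 simulated respondents,
# the estimate lands close to 0.30.
estimate = randomized_response_estimate(true_rate=0.30, n=100_000)
```

The trade-off is precision: half the answers carry no information, so the estimate is noisier than a direct question would be, which is why the technique suits large samples.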

I have found all this fascinating, but the truth of the matter is that I prefer collecting qualitative data to quantitative data. Whilst surveys are usually used to generate quantitative data (large amounts of data in a standardized format, from which generalizable statements about a population can be made), it is possible to use surveys to generate qualitative data through open-ended questions.

Surveys: Part 2

In the few surveys I have conducted, I have always ended up being more interested in the responses to the open-ended questions that provide a text box for the answer, but what is the point of asking open-ended questions in a survey? I found some answers to this question in this blog post. What the blog post doesn’t say is that asking open-ended questions in a big survey, e.g. a survey sent out to hundreds of MOOC participants, is going to lead to an equally massive amount of work!

Reflecting on the resources provided about surveys has made me think about the number of times we ask survey-type questions without formal research intentions, e.g. in student course evaluations. The same guidelines about good questions apply. I have also been reminded of how we used to agonise over how to ask our primary trainee teachers questions about their knowledge of science concepts which would reveal their misconceptions.

For this social research methods course, we have been asked to design a simple survey which will explore some aspect of the use of Web 2.0 tools by a certain group of people, but I’m going to do my own thing and instead I’m going to revisit the problems we had in eliciting students’ science misconceptions. (I already know this will probably not lead to a satisfactory conclusion!).

Some of the issues already identified in Part 1 of this post, relating to honesty and the sensitivity of the subject, are relevant here. It is important not to make students feel stupid; on the other hand, it is extremely important to make them aware of their own misconceptions, so that they do not pass these on to the children they teach. It will also be important to try to avoid responses that are simply guesses.

By asking the questions below I want to elicit students’ misconceptions. I am imagining that I am asking a cohort of approx. 200 students to respond to these questions. For this task I have simply focused on the questions and at this stage have not thought about presentation, introductory information etc. Questions could be asked about each scientific concept in the primary school curriculum. I have settled for just two here.

Questions:

Which of the following three statements is correct?  Put a tick in the box beside the correct statement.

1.   Putting a coat on a snowman will keep it cold and stop it melting so fast.
2.   Putting a coat on a snowman will warm it up and make it melt faster.
3.   Putting a coat on a snowman will make no difference to how fast it melts.
Using your knowledge of materials and their properties, explain your choice of correct answer.

Which of the following three statements is correct?  Put a tick in the box beside the correct statement.

1.   A book standing on a table does not move because there are no forces acting on it.
2.   A book standing on a table does not move because the forces acting on it are balanced.
3.   A book standing on a table does not move because of the friction between the book and the table.
Using your knowledge of forces and motion, explain your choice of correct answer.

Practical issues

The objective of this questionnaire is to find out:

  • How many students still have science misconceptions at the end of the course (I could do this for any scientific concept).
  • Which misconceptions students still hold at the end of the course.
  • Whether there are any patterns to these misconceptions.
  • Whether students can use their knowledge and understanding to explain their responses.

The main issue with this survey is that if the student doesn’t explain their response in the text box provided then they could simply have guessed the answer and there would be no way of knowing this.
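The first three objectives boil down to counting and cross-tabulating coded responses, with unexplained answers flagged as possible guesses. A minimal sketch of the tallying, using entirely invented data (the response list and the coding scheme are hypothetical, for illustration only):

```python
from collections import Counter

# Hypothetical coded responses for the snowman question:
# (statement chosen, whether a written explanation was given).
# Statement 1 is the scientifically correct answer.
responses = [
    (1, True), (2, True), (1, False), (3, True), (2, False),
    (1, True), (2, True), (1, True), (3, False), (2, True),
]

# How many students chose each statement, i.e. which
# misconceptions persist and how often?
choice_counts = Counter(choice for choice, _ in responses)

# Answers without an explanation may simply be guesses.
unexplained = sum(1 for _, explained in responses if not explained)

misconception_count = choice_counts[2] + choice_counts[3]
```

Even a rough tally like this would show whether one misconception (the coat warming the snowman, say) dominates the others, though the explanations themselves would still need qualitative analysis.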

The epistemological assumption here is that there is a correct answer, which the student should know if the course has been effectively taught and if the student has engaged with it. I don’t think there would be any serious ethical issues associated with these questions, and they could be answered anonymously. A possible ethical issue is whether students would find them threatening to their self-esteem, but this is unlikely if the taught course has focused on misconceptions.

Perhaps of more concern is validity. Will the questions really give me the information I need? Perhaps there needs to be a statement up front explaining why an explanation for a response is so important.

Finally, if this questionnaire depends on the open-ended question where students are asked to give an explanation for their choice of statement, then there is clearly the practical issue of the work-load of reading and analyzing all those explanations once collected.

If this type of questionnaire were scaled up, there would be the potential to find out a lot about primary school trainee teachers’ scientific misconceptions, but I’m not sure that it would scale up, and I suspect it might need a follow-up questionnaire.

I have not done what was asked this week, but I have learned more about how not to design a survey 🙂

Update 131017

Of course, I should have mentioned that this type of survey may need to be run face-to-face; otherwise, students could simply look up the answers on the internet or in books. This was a problem we always had when trying to gather information about the knowledge of our distance learning students. As mentioned above, one way round this these days would be to measure students’ response times to each question, but presumably that needs specialist knowledge and technology.

Acknowledgement

The snowman statements come from the following sources:

Concept Cartoons

Image from: http://slideplayer.com/slide/4332605/ (Slide 18)

For the original cartoon, see also Keogh, B. & Naylor, S. (1996). Teaching and learning in science: a new perspective. Retrieved from: http://www.leeds.ac.uk/educol/documents/000000115.htm