SOCRMx Week 2 Surveys

I have divided this post into two parts. The first part reports on information from the course site. The second part is an attempt to complete one of the suggested activities.

Surveys: Part 1

The Social Research Methods MOOC (SOCRMx), run by the University of Edinburgh on edX, is a treasure trove of content and resources. For Weeks 2/3 of the course we have been asked to ‘choose our own adventure’, and an adventure it is certainly proving to be. We have been given a list of methods, each with extensive associated resources (videos, reading, examples of research), and it has been suggested that we choose two and write a blog post about our understanding of each approach. There are also specific activities suggested for each method. This is the list of methods:

  • Surveys
  • Discourse analysis
  • Learning analytics
  • Interviews
  • Focus groups
  • Ethnography – Participant observation
  • Social Network Analysis (SNA)
  • Experimental intervention
  • Working with images

The videos associated with these resources are really excellent, especially the SAGE videos, but they are only available to course participants or people who subscribe to SAGE, and come with clearly and strongly stated restrictions:

You shall not:

  • print, email, publicly post, display, sell, distribute or otherwise share the content with non-registered users without further permission from SAGE.
  • transfer, sell or share login details for the express purpose of providing access to content for non-registered users.
  • translate, adapt, delete, alter or apply any other treatment to the content without further permission from SAGE.
  • be guaranteed access to the content after completion of the course.

This doesn’t at all fit with my aspirations for open education, but they are really good videos, so I am going to watch as many as I can, while I have the chance.

I started at the top of the list with ‘Surveys’ simply out of curiosity, with no real initial intention of choosing this as a method to follow up on, but quickly found myself drawn in by the excellent videos. As one of the presenters said, there are now 30-40 years of research into questionnaire design to draw on, and plenty of information out there about using surveys for research. I first used a survey in my MA research, 20 years ago. At that time we were encouraged to use Judith Bell’s book ‘Doing Your Research Project’, and looking at it now I can see that the information it provides is very similar to the information provided by this course. There are, however, a couple of topics that Judith Bell doesn’t discuss:

  1. The difficulties associated with Agree/Disagree questions, which require respondents to go through extra cognitive steps and can lead to acquiescence response bias. This post from the SurveyMonkey blog explains this well – https://www.surveymonkey.com/blog/2016/12/21/lets-agree-not-use-agreedisagree-questions/

This post also mentions recent developments in questionnaire design which use technology to collect paradata, e.g. measuring how long it takes a respondent to answer a question, or using eye-tracking technology ‘to track respondents’ attention to each type of question’.

  2. The difficulty of getting honest responses when dealing with sensitive topics. Two techniques are used, with varying degrees of success, to achieve this: the item count technique and randomised response.
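The randomised response idea is easier to grasp with a small simulation. The sketch below uses the forced-response variant (one common version of the technique): each respondent privately flips a coin, answering honestly on heads and answering ‘yes’ regardless on tails, so no individual ‘yes’ reveals anything about them, yet the true rate can still be recovered from the aggregate. The function name, parameters, and the 20% example rate are my own illustration, not from the course materials.

```python
import random

def estimate_prevalence(true_prevalence, n, p_truth=0.5, seed=42):
    """Forced-response variant of randomised response: with probability
    p_truth the respondent answers honestly; otherwise they must answer
    'yes' regardless of the truth, giving them plausible deniability."""
    rng = random.Random(seed)
    yes_count = 0
    for _ in range(n):
        has_trait = rng.random() < true_prevalence
        if rng.random() < p_truth:
            yes_count += 1 if has_trait else 0   # honest answer
        else:
            yes_count += 1                       # forced 'yes'
    observed = yes_count / n
    # Observed yes-rate = p_truth * pi + (1 - p_truth); solve for pi:
    return (observed - (1 - p_truth)) / p_truth

# Simulate 100,000 respondents where the true (hidden) rate is 20%;
# the estimate recovers roughly 0.20 without any honest 'yes' being identifiable.
estimate = estimate_prevalence(0.20, 100_000)
```

The price of the privacy protection is statistical noise: only half the answers carry information, so a larger sample is needed for the same precision.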

I have found all this fascinating, but the truth of the matter is that I prefer collecting qualitative data to quantitative data. Whilst surveys are usually used to generate quantitative data (large amounts of data in a standardized format, from which generalizable statements about the entire population can be made), it is possible to use surveys to generate qualitative data through the use of open-ended questions.

Surveys: Part 2

In the few surveys I have conducted, I have always ended up being more interested in the responses to the open-ended questions that provide a text box for the answer. But what is the point of asking open-ended questions in a survey? I found some answers to this question in this blog post. What the blog post doesn’t say is that asking open-ended questions in a big survey, e.g. one sent out to hundreds of MOOC participants, is going to lead to an equally massive amount of work!

Reflecting on the resources provided about surveys has made me think about the number of times we ask survey-type questions without actually having formal research intentions, e.g. student course evaluations. The same guidelines about good questions apply. I have also been reminded of how we used to agonise over how to ask our primary trainee teachers questions about their knowledge of science concepts that would reveal their misconceptions.

For this social research methods course, we have been asked to design a simple survey which will explore some aspect of the use of Web 2.0 tools by a certain group of people, but I’m going to do my own thing and instead I’m going to revisit the problems we had in eliciting students’ science misconceptions. (I already know this will probably not lead to a satisfactory conclusion!).

Some of the issues already identified in Part 1 of this post, relating to honesty and the sensitivity of the subject, are relevant here. It is important not to make students feel stupid; on the other hand, it is extremely important to make them aware of their own misconceptions, so that they do not pass these on to the children they teach. It will also be important to try to avoid receiving responses which are guesses.

By asking the questions below I want to elicit students’ misconceptions. I am imagining that I am asking a cohort of approximately 200 students to respond. For this task I have simply focused on the questions, and at this stage have not thought about presentation, introductory information, etc. Questions could be asked about each scientific concept in the primary school curriculum; I have settled on just two here.

Questions:

Which of the following three statements is correct? Put a tick in the box beside the correct statement.

1.   Putting a coat on a snowman will keep it cold and stop it melting so fast.
2.   Putting a coat on a snowman will warm it up and make it melt faster.
3.   Putting a coat on a snowman will make no difference to how fast it melts.
Using your knowledge of materials and their properties, explain your choice of correct answer.

Which of the following three statements is correct? Put a tick in the box beside the correct statement.

1.   A book standing on a table does not move because there are no forces acting on it.
2.   A book standing on a table does not move because the forces acting on it are balanced.
3.   A book standing on a table does not move because of the friction between the book and the table.
Using your knowledge of forces and motion, explain your choice of correct answer.

Practical issues

The objective of this questionnaire is to find out:

  • How many students still have science misconceptions at the end of the course (I could do this for any scientific concept).
  • Which misconceptions students still hold at the end of the course.
  • Whether there are any patterns to these misconceptions.
  • Whether students can use their knowledge and understanding to explain their responses.

The main issue with this survey is that if a student doesn’t explain their response in the text box provided, then they could simply have guessed the answer, and there would be no way of knowing this.
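Once collected, the tick-box data for the objectives above would be straightforward to analyse, and unexplained correct answers can at least be flagged as possible guesses. A minimal sketch of that analysis, with invented example responses to the snowman question (the record format and field names are my own assumptions):

```python
from collections import Counter

# Hypothetical response records for the snowman question (statement 1 is correct).
responses = [
    {"choice": 1, "explanation": "The coat insulates, slowing heat transfer to the snow."},
    {"choice": 1, "explanation": ""},   # correct but unexplained: possibly a guess
    {"choice": 1, "explanation": "Insulation works both ways."},
    {"choice": 2, "explanation": "Coats are warm, so the snowman melts faster."},
    {"choice": 2, "explanation": ""},
]

CORRECT = 1
tally = Counter(r["choice"] for r in responses)          # which statements were chosen
misconception_rate = 1 - tally[CORRECT] / len(responses)  # how many still hold a misconception
possible_guesses = [r for r in responses                  # correct answers with no explanation
                    if r["choice"] == CORRECT and not r["explanation"].strip()]
```

The explanations themselves would still need reading and coding by hand, which is exactly the workload problem with open-ended questions noted in Part 1.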

The epistemological assumption here is that there is a correct answer, which the student should know if the course has been effectively taught and if the student has engaged with it. I don’t think there would be any ethical issues associated with these questions, and they could be answered anonymously. A possible ethical issue might be whether students find them threatening to their self-esteem, but this is unlikely if the taught course has focused on misconceptions.

Perhaps of more concern is validity. Will the questions really give me the information I need? Perhaps there needs to be a statement up front explaining why an explanation for a response is so important.

Finally, if this questionnaire depends on the open-ended question where students are asked to give an explanation for their choice of statement, then there is clearly the practical issue of the workload of reading and analyzing all those explanations once collected.

If this type of questionnaire could be scaled up, there is the potential to find out a lot about primary school trainee teachers’ scientific misconceptions, but I’m not sure that it would scale up, and I suspect it might need a follow-up questionnaire.

I have not done what was asked this week, but I have learned more about how not to design a survey 🙂

Update 131017

Of course I should have mentioned that this type of survey may need to be run face-to-face. Otherwise, students could simply look up the answers on the internet or in books. This was a problem we always had when trying to gather information about the knowledge of our distance learning students. As mentioned above, one way round this these days would be to measure students’ response time to each question, but presumably that needs specialist knowledge and technology.
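Capturing the timestamps does need the survey platform’s cooperation, but the analysis side of response-time paradata is simple. A sketch under invented assumptions: the event format and the thresholds (under 2 seconds looks like a careless click; over 60 seconds might indicate looking the answer up) are illustrative, not from any real tool.

```python
# Each event records when a question was shown and when it was answered,
# in seconds from the start of the session (hypothetical data).
events = [("Q1", 0.0, 4.2),     # (question_id, shown_at, answered_at)
          ("Q2", 4.2, 4.9),     # answered in under a second
          ("Q3", 4.9, 95.0)]    # a very long pause before answering

times = {qid: answered - shown for qid, shown, answered in events}
too_fast = [qid for qid, t in times.items() if t < 2.0]    # possible careless clicks
too_slow = [qid for qid, t in times.items() if t > 60.0]   # possible looked-up answers
```

Such flags could not prove anything on their own, but they would indicate which responses to treat with caution.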

Acknowledgement

The snowman statements come from the following sources:

Concept Cartoons

Image from: http://slideplayer.com/slide/4332605/ (Slide 18)

For the original cartoon, see also Keogh, B. & Naylor, S. (1996). Teaching and learning in science: a new perspective. Retrieved from: http://www.leeds.ac.uk/educol/documents/000000115.htm
