25 February 2012

On starting from the question

Background: Recently I went to observe a training session for nursery nurses (staff at day-care facilities for pre-school children). They are a major constituency for vocational education, but the job is low-status and traditionally undertaken as a process of drift, from leaving school with few qualifications to motherhood. Not "parenthood"--not only are young men rare in the sector, they are often regarded with suspicion. The pressure for training comes from well-meaning government regulation, and from Ofsted's inspection of everyone from registered child-minders to nursery schools.

So, in practice the class consists of eight young women. Because of the nature of the scheme, which requires the accumulation of assessed units of competence--taken in no particular order--into a portfolio which can be presented for accreditation (roughly), this group has come together only for this morning. Some of them know each other and have met before; most don't and haven't. Similarly the tutors (I forgot--there are two of them, and they are [in-service] students on our course) have met some of them before and even know their names, and in a couple of cases know something of their background and work setting, but are meeting others for the first time. And this particular group may never meet again.

The topic concerns the systematic observation of children to... Do what?

The EYFS sets out recommended procedures for the systematic observation of children, and for recording those observations.

So... how does one teach that? Boring, but simple.

First, ask why "we" might want to undertake such observations.

The students don't know each other and so they don't trust each other. Come to that, they don't know or trust the tutors--there is no track record of knowing whether wrong answers will be met with encouragement to get closer to the answer, or humiliating scorn for monumental ineptitude.

Good move from the tutor: everyone has to write down their answer, and then she goes round the group asking them to read their answers out. Everyone says, "to assess development" in one way or another. But because they are reading what they wrote, the last person is not humiliated by failing to come up with an eighth distinctive reason.

So we now know that they know an acceptable answer. (And further questioning elicits more, and a later pair exercise makes them more comfortable with each other...)  Yes, an "acceptable" answer. The criterion is fundamentally about the process in the class.

I don't want to get too precious about this. This kind of closed-question, guess-what-I'm-thinking, Q and A has been the staple of such courses from time immemorial (for all the much-vaunted "Socratic method", he was a serious perpetrator, too). And it works.

I've just been watching Call the Midwife on BBC1. A newly-trained midwife on a solo call is faced with a breech birth; you can see her (no tricks, just Miranda Hart's acting) recalling her training as she carefully follows the recommended procedure (which may well have been updated since). She learned how to do that from just the same kind of instruction, I'm sure, especially given that the series is set in the early 'fifties.

And as I was writing I came across this real-life example. That crew may well have learned the same way, too.

And yet. The dramatic, high-stakes situation depicted concentrates the mind wonderfully; she would have reached back into her training and drawn everything she could from it, no matter how badly it was done, as would the fire crew.

Systematically observing children is not like that. It's more of a chore. Quite rightly, the staff would rather be playing with the children and responding to them moment by moment. (Indeed, is there any evidence that, other than in gross and obvious cases of "abnormal" behaviour--which tend to force themselves on the observer--the effort expended on such surveillance has any payoff? The session I observed implied that it was principally about generating evidence for a report for the Special Educational Needs Co-ordinator [a.k.a. SENCO], which could then be produced for Ofsted as evidence of due diligence.)

So--leaving aside reservations about the whole idea, and living with what we have got--how do we get the class members to engage with the requirements?


The template for the session is information-transmission. That first of all requires motivation, and in this session the students were asked to come up with reasons why we might observe. Then they were provided with standard observation tools, and encouraged to familiarise themselves with them and to use them with a video clip. I was quite interested to see how detailed (and time-consuming) the tools are.


I wonder if this couldn't be flipped. How would it work if the students were shown the video first, and encouraged to pay close (but non-specific) attention to it, and then asked what they observed, perhaps in an open format first, and then testing them on specific points? (I'm reminded of the Sherlock Holmes trick of asking Watson what he can tell about their latest client, and then showing off his skill by reeling off a lot more information he has picked up by observation and deduction--on Sherlock they do it very neatly with key-words labelling the tell-tale signs.)  

The point is that they won't report much to begin with, because they don't know what to look for. But this technique would make them aware of that, and create a need (i.e. motivation) to get hold of the tools, so they would be more likely to take them in.

And similarly, later they could be shown completed examples of the observation sheets, and asked to interpret what story they tell, and what hypotheses (or, in Sherlock's case, conclusions) they might draw from the raw data.

Perhaps one could use selected clips from CSI or a similar forensics-based show?

(23.01.12)

A few days later, the students and I discussed the session and the alternative strategies. They have done something similar in the past, and agree it is more engaging--but it all takes so long to "cover" the material. Yet again, the tyranny of the syllabus, which is all about "knowing about". It matters more that the material has been seen to have been taught, than that it has been learned. And paradoxically that puts more of a burden on the trainees and their mentors at work, because everything has to be addressed again there, often without the benefit of understanding the underpinning principles which could be addressed in class, and which would have increased the probability of the practical instruction "sticking".
