The screening test

The questionnaires were sent out with the screening test during the autumn of 2008 to all teachers who had agreed to participate in this study. The questions were designed to elicit both qualitative and quantitative data, each consisting of a closed question followed by an open question. The closed questions were rated on a three-point scale (yes - quite - no) and analysed quantitatively using SPSS Version 17. Not all respondents answered every question in the questionnaires; missing answers were coded as 'Nil response'. The open questions invited individual comments that extended the first answer and added depth to the information gathered. The qualitative data were coded according to theme and searched using NVivo 8 software for Windows XP to show the range of answers and the differences and commonalities between responses.

To ensure anonymity, each person who completed a questionnaire was given an identification number so that the data could be cross-referenced by school, authority and teaching position. These attributes were chosen to see whether there was any consensus between schools within authorities as well as between authorities, and to find out what teachers holding different positions within the schools felt about phonological awareness and its role in literacy acquisition. The following code information was assigned to each respondent:

CT (+number) = Class teacher.

PT (+number) = Principal teacher.

SfL (+number) = Support for learning.

DHT (+number) = Depute head teacher.

HT (+number) = Head teacher.

H = Highland Council.

WI = Western Isles Council.

O = Other Council.

Thus PT20WI refers to a principal teacher from School 20 within Western Isles Council.
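As an illustration of how these identifiers encode teaching position, school number and council, the sketch below shows one way such codes could be decomposed for cross-referencing. It is a hypothetical Python helper written for this explanation only; the study itself carried out the cross-referencing in SPSS and NVivo, and the function and its name are not part of the research materials.

```python
import re

# Illustrative only: the study cross-referenced respondents in SPSS and NVivo.
# This hypothetical helper shows how an identifier such as 'PT20WI' encodes
# teaching position, school number and council.

POSITIONS = {
    "CT": "Class teacher",
    "PT": "Principal teacher",
    "SfL": "Support for learning",
    "DHT": "Depute head teacher",
    "HT": "Head teacher",
}
COUNCILS = {"H": "Highland Council", "WI": "Western Isles Council", "O": "Other Council"}

# School numbers occasionally carry a letter suffix (e.g. CT40aWI), so allow one.
CODE_PATTERN = re.compile(r"^(CT|PT|SfL|DHT|HT)(\d+[a-z]?)(H|WI|O)$")

def parse_respondent_code(code: str) -> dict:
    """Split a respondent identifier into teaching position, school and council."""
    match = CODE_PATTERN.match(code)
    if match is None:
        raise ValueError(f"Unrecognised respondent code: {code}")
    position, school, council = match.groups()
    return {
        "position": POSITIONS[position],
        "school": school,
        "council": COUNCILS[council],
    }

print(parse_respondent_code("PT20WI"))
# {'position': 'Principal teacher', 'school': '20', 'council': 'Western Isles Council'}
```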

There was frequent opportunity for individual comments to be made in both the questionnaire and the interviews. The questions investigated in the questionnaire can be seen in Appendix 8, which gives full details of the questionnaire responses. A total of 45 questionnaire responses were received from Gaelic-medium units and schools. Two schools returned the test results but did not return a completed questionnaire. No reason was given, but three teachers who were interviewed said they had not been given the full package at first, only the test materials, and had received the questionnaire and consent form from the Head Teacher at a later date. Three schools had photocopied the questionnaire and returned two copies each, completed by the different teachers who had conducted the tests: one teaching Primary 2, the other Primary 3.

Figure 11: Returns of questionnaires by teaching position in schools.

The questionnaire respondents held various roles in the schools, as shown in Figure 11. As the screening test had been intended for use by classroom teachers, it was encouraging to find that 62.2% of respondents were indeed classroom teachers. In some of the small Gaelic units there is often only one teacher, so it was to be expected that some respondents would hold a promoted post. Only three support for learning teachers were involved in the questionnaires. The author's previous research (Lyon, 2003) found that very few support for learning staff are available in schools where Gaelic-medium education takes place, and MacLeod and MacLeod (2001) also indicated that teachers sometimes had only very limited access to support. Such staff are usually employed in larger primary schools, including larger primary schools that contain a Gaelic unit.

It had been hoped that ten per cent of the respondents would volunteer to be interviewed by telephone. Surprisingly, over one third of the respondents elected to take part in the telephone interviews, which examined more closely the efficacy of the screening test and its ability to measure phonological awareness. The eighteen interviewees consisted of ten classroom teachers, seven teachers in promoted posts ranging from head teacher and depute head teacher to principal teacher, and two support for learning teachers.

Figure 12: Interviews by teaching position in schools.

In all cases a semi-structured format was used, which allowed the interviewees to be led through a series of pre-planned questions. Interviewees were made aware that their responses were being recorded and transcribed so that they could be accurately represented during analysis. They were also made aware that the research results would be anonymous. Full transcripts were made of these interviews, and the responses to the questions are grouped according to teaching position in Appendix 10. The texts of these interviews were all typed into the NVivo 8 software program for further analysis and coding.

The screening tool was designed for use by class teachers; as previously discussed, in the majority of the schools in question, Gaelic-medium teachers do not often have the services of a support for learning teacher to carry out such assessments. It was therefore expected that the majority of respondents would be classroom teachers. From a research point of view, the fact that the respondents who volunteered to be interviewed were drawn from different roles and posts of responsibility within the school system provided a wider perspective on the difficulties involved in assessing phonological awareness.

Seven out of the eleven local councils were represented among the interviewees. Details are given in Figure 13. The highest proportion of volunteers who were interviewed came from Western Isles Council, representing half of the interviewees, although the pupils from that council made up less than a third of the total number of pupils screened. Western Isles and Highland Councils together contained more than fifty per cent of the pupils involved, but 72% of the interviewees came from these two councils. This pattern can be explained by looking at the number of schools returning questionnaires from each council: twenty of the participating schools were in Western Isles Council and thirteen in Highland Council, together representing 73% of the schools involved, compared with three in Argyll and Bute and nine in other councils. The remaining interviewees came from five separate councils. One reason that more interviewees volunteered from these seven councils could be that they live in areas where more Gaelic is spoken.

Figure 13: Councils represented by interviewees.

The coding of the interviews and of the open questions from the questionnaires using NVivo 8 software involved examining each question and searching for similar themes. The software is intended for qualitative analysis, but it was also possible to group information quantitatively. Once the texts of the interviews had been typed into the software, a pattern of information began to appear. This, when added to the quantitative information gathered, enabled several broad themes to emerge. Gibbs (2007) stated that if data are categorised by theme in this way, then not only can comparisons and summaries be made but new phenomena can also be identified. The value of considering quantitative data alongside qualitative data is recommended by several authors (Cohen & Manion, 1994; Robson, 2000). The following themes appeared: Identification of Phonological Weaknesses, Screening Test and Administration, Teacher Knowledge of Phonological Awareness, Previous Identification, and Gaelic Fluency Differences. Following an examination of these themes, some of the common difficulties presented within them were gathered together.
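To make the quantitative grouping of qualitative codes described above concrete, the following is a minimal sketch in Python. It is illustrative only: the study itself used NVivo 8, and the coded comments shown are invented stand-ins for the exported coding, used here simply to show how theme frequencies and a theme-by-position cross-tabulation could be produced.

```python
from collections import Counter

# Illustrative only: the study coded interview and questionnaire texts in NVivo 8.
# The coded comments below are invented stand-ins for exported coding, used to show
# how qualitative codes can also be grouped quantitatively.
coded_comments = [
    {"respondent": "CT33WI", "theme": "Identification of Phonological Weaknesses"},
    {"respondent": "SfL32H", "theme": "Identification of Phonological Weaknesses"},
    {"respondent": "PT44O", "theme": "Screening Test and Administration"},
    {"respondent": "CT10H", "theme": "Screening Test and Administration"},
    {"respondent": "DHT46WI", "theme": "Previous Identification"},
]

# Frequency of each theme across all coded comments.
theme_counts = Counter(comment["theme"] for comment in coded_comments)

def position_of(code: str) -> str:
    """Return the teaching-position prefix of a respondent code (e.g. 'CT33WI' -> 'CT')."""
    for prefix in ("DHT", "SfL", "CT", "PT", "HT"):
        if code.startswith(prefix):
            return prefix
    return "Unknown"

# Cross-tabulation of theme by teaching position.
theme_by_position = Counter(
    (position_of(comment["respondent"]), comment["theme"]) for comment in coded_comments
)

for theme, count in theme_counts.items():
    print(f"{theme}: {count}")
for (position, theme), count in theme_by_position.items():
    print(f"{position} / {theme}: {count}")
```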

Identification of Phonological Weaknesses:

In the questionnaire, 87% of teachers thought that the test accurately identified areas of phonological weakness; 9% of teachers thought that it identified areas of weakness to some extent; 2% thought that it did not identify areas of weakness and 2% did not respond. Table 20 shows the numbers of respondents to this question according to teaching position.

Positive responses were received to this question from all interviewees: "It made me home in on the areas of weakness." CT33WI. Seven interviewees stated that the screening test had been very helpful, and the two support for learning teachers both agreed that it had identified the weaknesses and would "try to work on them." "It has helped already with several children." SfL32H. Other respondents believed that the test was a good general indicator of where problems might lie. Only four respondents made any reference to particular aspects of phonological awareness, namely initial and final phoneme deletion, initial sounds and sounds; however, one classroom teacher referred to rhyming, blending and initial/final phonemes and went into some detail describing some of her methodology in teaching phonics in class. Further discussion of this will follow. Two interviewees referred to what they thought they knew about the phonological development of their pupils before the test by saying: "It reinforced what we felt" DHT46WI, and "you pick up so much in class but it was good to do it in a more structured way." CT4WI.

One principal teacher stated: "By the end of the test I knew which children knew their sounds, which children knew their blends, which children didn't and I found that very helpful." PT37O.

The second part of the question was intended to establish whether enough information had been gained from the screening to enable specific programmes of intervention to be put in place. The identification of weaknesses and the possibility of specific intervention planning were considered together in order to gain an indication of teachers' knowledge of phonological awareness and its importance in the steps towards literacy. Ideally such a screening tool should enable specific intervention to take place, and it was interesting to note that the support for learning teachers all said that the pupils' results would be helpful in planning future intervention.

In the questionnaire, 73% of teachers believed that the pupils' results would help to plan the next steps in their teaching, 16% stated that the results would help to plan next steps to some extent, 9% did not think they would help plan next steps and 2% did not respond. Many teachers referred to the types of intervention that they would be putting into place. All interviewees thought that the screening test was a useful tool that could be used to plan specific intervention. Responses included:

"We have used the results to help plan our language better." HT40WI. "Being able to provide programmes of work more to their needs as opposed to just general phonics lessons. Definitely good for planning next steps." PT44O. "We used the results when planning language." CT40WI. "I would take the mistakes, the weaknesses that I found and try to work on them." SfL22WI.

One classroom teacher stated that it helped her set homework: "I was able to use that information for their homework and focus on their sounds going home." CT27WI.

Two teachers felt that it would have been beneficial for planning specific intervention if they had had any pupils with difficulties; it was encouraging that they acknowledged the test's potential even though their results showed that no pupils had difficulties. One principal teacher went on to describe how she would use the results to discuss the teacher's use of the phonics scheme - when and how often it was being used in the class.

While responses were generally very positive, there was one principal teacher who felt that some more information or background knowledge on what each subtest was testing would be needed to plan next steps. Further into the interview, however, she thought: "in terms of targeting support in an immersion class, which is difficult, that it (the test) would be very helpful." PT1O. She clarified her previous comment by saying that if each subtest could be described more specifically, for example, "this element tests whether a child can hear the rhyme identified" then she could find something specific to use or make up something herself "to plug the gap."

67% of teachers found the results informative. 18% thought that the results were informative to some extent. 6% did not think that the results were informative and 9% did not respond.

This question attempted to find out whether pupils' performance in class was reflected in their scores on the screening tool. Had there been a great deal of difference, it could be said that the screening tool was testing something other than what it was designed for. It was hoped that scores of phonological awareness would reflect a pupil's attainment in literacy. For the most part there was agreement that test results corroborated achievement.

In the questionnaire, 67% of teachers believed that the screening test accurately tested pupils' phonological skills, 27% stated that it tested pupils' phonological skills to some extent, no teachers thought that pupils' phonological skills had not been accurately tested and 6% declined to respond. All interviewees replied in the affirmative, some stating that there were "no surprises". Responses included:

"Those that do well with reading did well with the phonics test." CT30H. "There were a couple - their needs had come up from ancillary staff - and it showed up when we did the assessment." SfL12H. "What it did was it reinforced what we had already come to the conclusion about." DHT46WI. "Results did corroborate attainment pretty much as expected." HT40WI.

Two teachers each said that one pupil in their class had scored better than expected. One of these pupils received speech and language therapy, but the teacher thought that the better-than-expected performance was due to the fact that the teacher, rather than the child himself, had to say the words.

One teacher remarked that she had noticed a difference between one pupil tested and the other pupils and that this pupil had "a wee bit of difficulty in the Gaelic reading." CT11O. She noticed a marked difference in test performance between him and the other pupils. Another described a pupil who "had a special skill in language, had achieved Level A (National Test in Reading) ahead of the others and she did really well in your test." CT33WI. One principal teacher knew that a pupil struggled with phonics and Gaelic language and said that it had been useful to "home in on the specific phonological weaknesses as opposed to just sort of more general he struggles with reading." PT44O. She went on to say that she had not tried before to determine what particular difficulties he was having. One head teacher described a pupil who was not working at quite the same level as the other pupils and thought that the test results confirmed things for her.

One class teacher commented that the test "quickly identified areas of difficulty and also showed where more practice was required." CT40aWI. She noticed that there were things that the test picked up on that she hadn't, especially rhyming words. Another teacher thought that her pupils were doing well this year "but it gave me an idea for a couple of them who need more work on things. It has made quite a difference since I identified them and they've been working on it." CT10H.

Screening Test and Administration:

Responses to this initial question about the screening test were encouraging, with 78% of teachers stating that the test was easy to administer and 16% stating that it was quite easy to administer. No teacher stated that it was not easy to administer and 6% gave no response. Responses on test administration were generally positive: "Very straightforward - good instructions - folder layout made it easy." HT40WI. "Yes, probably yes and it was very helpful and easy to administer." CT17O. "Was very straightforward." CT11WI. "Was easy & pupils enjoyed doing it." CT10H.

As well as written instructions on each subtest, a CD containing verbal instructions and a sample test by a teacher and pupil was sent to every school. "It was very easy to administer, I hardly needed to listen to CD." CT2WI. "The test was quite simple for a class teacher once you read up on it. It's important to read the instructions and about it rather than dive in." CT18WI. "A good indicator of potential problems - this was comprehensive but short and easy for the child to do." CT13O. "It was very straightforward with good instructions. The layout was easy to follow and it quickly identified areas of difficulty." HT8WI. "I think the more you do it the more au fait you are with it." SfL14WI.

Some teachers commented on pupils' reactions to the test: "The children didn't seem at all bothered by it anyway. They were absolutely fine with it - they didn't feel under any pressure." CT7WI. "The children thoroughly enjoyed doing it. They didn't think that they were doing a test at all. They weren't aware of testing." CT1H.

While interviewees found the test itself straightforward enough to administer, there were also comments that made reference to the time required to administer the test to the children. These have been recorded below together with the responses from the questionnaire on test length.

In the questionnaire, 85% of teachers agreed that the length of the test was good, 13% stated that the length of the test was acceptable to some extent, no teachers disagreed with the length of the test and 2% gave no response (see Table 18). Written comments included: "It was of the right length." CT17O. "Length was fine as I only had 4 children in P2 to test." CT25aWI. Other comments concerned the time required: "It was time consuming but worth doing." PT41O. "Easy to administer, but time consuming." CT45H. "I found it quite difficult time wise." PT11O. One principal teacher explained how she managed to administer the test to a large number of pupils: "I was fortunate to have a student, which enabled me to take children out individually. Otherwise it would have been very difficult to find the time to administer." PT44O.

In contrast to some previous comments on pupil reactions to the test, one teacher commented that "because children had not been tested in this format before they were worried about giving an incorrect answer. Therefore it took longer to administer." CT35O. Another class teacher thought that "the length was alright - it was difficult to administer in a multi-composite class - because it had to be done individually." CT8H. "It took quite a long time to do. I suppose if you were only doing one or two children it would be OK." PT6O. One teacher commented: "It would have been easier to administer the test if I had been given time out of class." CT30H.

Nineteen respondents added further comments in answer to the second question, What would make it easier? These ranged in theme from the time allotted to administer the test to the content of the test itself.

Which items would you exclude or add?

What other subset would improve the test?

In the questionnaire, 16% of teachers thought that they would like to see changes made to the screening test. 9% said they would like to see changes made to some extent. 49% of teachers said that they did not want to see any changes made to the test. 26% of teachers declined to make a comment.

What else would help identify future difficulties with reading?

There was no consensus among the interviewees about when it would be best to administer the test. Table 26 gives a brief summary of the preferences of the interviewees by teaching position. Only two thought that Primary 1 would be a suitable time to screen the children, stating that the children would have covered all the sounds in the phonics scheme Facal is Fuaim by the end of January.

Views were divided among the interviewees as to who would be best to conduct the test, although the majority, 77.7%, suggested that the screening test would be best conducted by the class teacher; two also thought that the support for learning teacher, where available, could assist. Some class teachers commented: "Except for time, my first reaction would be the class teacher to carry out the test as the one who knows the pupils well" CT33WI; "The children were comfortable with me doing the test" CT6H; "The children are more relaxed with the class teacher" HT40WI; the class teacher is the best person "who knows the child and the child is more comfortable with that person and more likely to perform better" HT40WI; and "they obviously know the children best in terms of their learning." PT44O.

Another two felt that "ideally a learning support teacher would probably be the best person because they're more qualified and even have more time out of class to sit with a child." CT30H. One support for learning teacher thought that ideally the class teacher would be best but that "they often don't have the time and I found that in support for learning I have more time than they have." SfL22H. Two principal teachers suggested that it was difficult for a class teacher, especially with a large number of pupils, to administer the test to the whole class. They both commented that the class teacher would probably be able to assess one or two pupils they had concerns about.

In the questionnaire, 53% of teachers thought that it would be helpful to use tables or graphs to represent the information gathered from the test results. 25% thought that it would be helpful to some extent. 13% did not think it would be useful to use tables or graphs to represent information gathered from the results and 9% did not respond.

The majority of interviewees expressed a preference for a graph to represent the results. "Graph form would be very useful" but also "A table of norms would be useful" CT3WI; "It would be useful to have a graph or tables with norms like psychologists and support for learning teachers use and you can see if he's above or below or in the middle" PT2O.

Not all teachers wanted a graph: "Some form of 'at a glance' display which would show areas of strengths and weaknesses" DHT46WI; "Probably a pictorial profile." CT45H; "Maybe have an overall score with a profile showing areas of strength and weakness" SfL22WI. One reason offered for not wanting a graph or pictorial representation was that "one to one interaction was more useful" HT40WI.
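As a purely illustrative sketch of the kind of 'at a glance' profile the teachers describe, the following Python/matplotlib snippet plots a hypothetical pupil profile showing areas of strength and weakness. It is not part of the study: the subtest labels are borrowed loosely from aspects of phonological awareness mentioned by respondents, and the scores and maximum are invented.

```python
import matplotlib.pyplot as plt

# Illustrative only: a hypothetical 'at a glance' pupil profile. The subtest labels
# are borrowed loosely from aspects of phonological awareness mentioned by the
# respondents (rhyme, blending, phoneme deletion, initial sounds); the scores and
# the notional maximum of 10 are invented for the sake of the sketch.
subtests = ["Rhyme", "Blending", "Initial phoneme deletion",
            "Final phoneme deletion", "Initial sounds"]
scores = [8, 6, 4, 3, 7]
maximum = 10

fig, ax = plt.subplots(figsize=(6, 3))
ax.barh(subtests, scores)               # horizontal bars: one per subtest
ax.set_xlim(0, maximum)
ax.set_xlabel("Score")
ax.set_title("Pupil profile (illustrative)")
fig.tight_layout()
plt.show()
```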

Previous Identification:

Previous to using this test, how did you identify a pupil who you felt was struggling with phonics/reading and how did you deal with them?

60% of teachers stated that they did not use any other tests either in Gaelic or English.

20% of teachers stated that they used similar tests to identify phonological awareness skills. Of these respondents, five said they used tests in English: the Edinburgh Baseline Test, Sound Linkage, the Aston Index and the Nelson Screening Test were named, as was the Special Needs Assessment Profile (SNAP), which has recently been translated into Gaelic. One teacher referred to SNAP as being a good overall assessment of needs "but doesn't go into so much detail phonologically" CT7H. 7% said they used other tests to some extent. 13% of teachers did not respond to this question.

Would you envisage using this test in future years?

Were there any noticeable differences in performance between fluent Gaelic speakers and learners?

Further comments.

56% of the teachers made further comments about the screening test.

32% of the respondents stated that they would be willing to take part in a short telephone interview about the screening test and gave their contact details. Eighteen of the total forty-five respondents were interviewed.

Teacher Knowledge of Phonological Awareness:

What do you use to teach phonics?

What do you use to identify pupils' rhyming skills?

It was useful to hear teachers' perceptions and knowledge of phonological awareness and its impact on early literacy. The influence of the English-medium primary schools within which the majority of Gaelic-medium units are embedded was apparent in most cases. The current arrangements for identification are gradually improving, but not a great deal of difference was found from the researcher's earlier findings (Lyon, 2003). Teachers gave their thoughts on the future use and relevance of the test as a diagnostic tool to enhance attainment through early intervention. It was heartening to hear the teachers' responses to this test as it is the first of its kind.
