Levels Of Teacher Questions And Student Responses Education Essay


This study aimed to determine the types of questions asked by college teachers and the types of responses given by their students during classroom interactions. It hypothesized a significant relationship between the teachers' types of questions and the students' types of responses across disciplines. It also hypothesized significant differences in the two variables according to teachers' gender, educational attainment and length of service.

The study employed a descriptive-correlational design using classroom observation and a survey questionnaire, with tally sheets based on Bloom's six (6) levels of questioning: knowledge, comprehension, application, analysis, synthesis and evaluation. The respondents were thirty-one (31) teachers randomly selected from the faculty of the College of Liberal Arts (CLA) at Western Mindanao State University (WMSU). During the classroom observations, the teachers' questions and the students' responses were tallied and categorized according to Bloom's model. A tape recorder was also used to record the classroom interactions for validation purposes.

Using Pearson r, the t-test and ANOVA, the main results revealed (a) that there was a significant relationship between teachers' types of questions and students' types of responses, and (b) that there were no significant differences in the teachers' types of questions when the data were grouped according to gender, educational attainment and length of service.

Introduction

The art of asking questions is one of the basic skills of good teaching. Choosing the right types of questions to ask is necessary to spark thought-provoking answers and to engage students in productive discussions. Socrates believed that knowledge and awareness were an intrinsic part of each learner. Thus, in exercising the craft of good questioning, an educator must reach into the learner's current levels of knowing and awareness in order to help the learner reach new levels of thinking (Erickson, 2007).

Recent studies on the art of questioning cited in Wolf (2010) and Winter (1987) disclosed that teachers regard questions as one of their most familiar, and perhaps most powerful, tools. Santos (1998), as cited in Rosaldo (2002), averred that what is urgently needed in the 21st century is a thinking instructor. Teachers must make every classroom a thoughtful classroom, where students are provided opportunities to engage in critical thinking rather than mere memorization of factual information. However, observations revealed that teachers in the College of Liberal Arts, WMSU, tend to confine themselves to asking low-level questions. This was evident when the researcher went through the test questions of most of the English, Literature and Humanities teachers. When asked why, one teacher said, "It is easy to check and will not consume much time in checking when one needs immediate result of final examination."

To add to the research literature on these perspectives and to confirm the observations, this study aimed to determine the types of questions used in the classrooms by the teachers of the College of Liberal Arts at Western Mindanao State University. It also looked into the types of responses that the students gave in answering the teachers' questions. Moreover, it sought to establish whether there was a significant relationship between the types of teacher questions and the types of student responses, and to find out whether there were significant differences in the types of teacher questions when the data were grouped according to gender, educational attainment and length of service.

Method

This study used a descriptive-correlational research design with a sample of 31 teachers randomly selected from the 108 College of Liberal Arts faculty of Western Mindanao State University, Philippines. Data gathering lasted for more than five months and went through three main stages. The preliminary stage included (1) the preparation of the instruments: the survey questionnaire for the demographic variables, the tally sheets based on Bloom's taxonomy of levels of questions, and the tape recorder; (2) securing the dean's permission to conduct the study in the college concerned and obtaining the voluntary consent of the teacher and student respondents to participate in the study; and (3) orienting the teacher-respondents on the objective of the study and the manner of the classroom observation.

The next stage was the actual data gathering. Here, the research team, composed of three judges and a cassette operator, observed the respondents' classrooms one at a time for 30 minutes of classroom interaction each, beginning at the start of the lesson proper. The three judges tallied the teachers' questions and the students' responses during the interaction using Bloom's taxonomy of levels of questions. For example, "what" and "who" questions were tallied under Knowledge, "why" questions under Comprehension, "which part and why" questions under Analysis, "how would you apply" questions under Application, "can you summarize" questions under Synthesis and "is it justifiable or not" questions under Evaluation. The interactions were recorded simultaneously. The teacher-respondents were also asked to fill out the survey questionnaire covering the demographic variables of gender, educational attainment and length of service. In the last stage, all the tallies made by the three judges were validated by an expert against the tape-recorded interactions.
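The judges' tallying rule can be sketched as a simple keyword classifier. This is an illustrative reconstruction, not the study's actual instrument: the stem-to-level mapping is taken from the examples above, while the function name and keyword lists are hypothetical.

```python
# Illustrative sketch of the judges' tallying rule: map a question stem
# to a Bloom level using the example stems given in the text.
# The keyword lists and function name are hypothetical.

BLOOM_STEMS = [
    ("Knowledge", ("what", "who")),
    ("Comprehension", ("why",)),
    ("Analysis", ("which part",)),
    ("Application", ("how would you apply",)),
    ("Synthesis", ("can you summarize",)),
    ("Evaluation", ("is it justifiable",)),
]

def classify_question(question: str) -> str:
    """Return the Bloom level whose stem first matches the question."""
    q = question.lower()
    # Check longer, more specific stems before the generic "what"/"who"/"why".
    ordered = sorted(BLOOM_STEMS, key=lambda kv: -max(len(s) for s in kv[1]))
    for level, stems in ordered:
        if any(stem in q for stem in stems):
            return level
    return "Unclassified"

# Tally a few sample questions the way the judges tallied a 30-minute observation.
tally = {}
for q in ["What is a metaphor?", "Why does the poem use irony?",
          "Can you summarize the essay?"]:
    level = classify_question(q)
    tally[level] = tally.get(level, 0) + 1
print(tally)  # {'Knowledge': 1, 'Comprehension': 1, 'Synthesis': 1}
```

In practice the human judges also weighed context, so a pure stem match like this is only a rough approximation of their categorization.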

For the treatment of the data, means and percentages were used to determine the types of teacher questions and student responses under Bloom's model. Pearson r was used to establish the relationship between teacher questions and student responses, while the t-test and ANOVA were used to test for significant differences across the moderator variables of the study.

Results

Types of Questions Asked by the College of Liberal Arts (CLA) Teachers

The data in Table 1 revealed that, of the average of 21 questions asked, 9 (42.9%) were knowledge questions, 4 (19%) were comprehension questions and 2 (9.5%) were application questions. The remaining questions asked were analysis (9.5%), synthesis (9.5%) and evaluation questions (9.5%).

These data implied that almost half of the questions asked by the CLA teachers were knowledge questions, with the remaining share of a little more than half distributed across the other types: comprehension, application, analysis, synthesis and evaluation questions.
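The percentage distribution in Table 1 follows directly from the average tallies; a minimal sketch of the arithmetic (the tally values are from the table, the variable names are my own):

```python
# Recompute the Table 1 percentage distribution from the average tallies.
tallies = {"Knowledge": 9, "Comprehension": 4, "Application": 2,
           "Analysis": 2, "Synthesis": 2, "Evaluation": 2}
total = sum(tallies.values())  # 21 questions on average per observation
percentages = {level: round(100 * n / total, 1) for level, n in tallies.items()}
print(total)        # 21
print(percentages)  # {'Knowledge': 42.9, 'Comprehension': 19.0, ...}
```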

Table 1. Types of Questions Asked by CLA Teachers

Type of Question    No.     %
Knowledge             9    42.9
Comprehension         4    19.0
Analysis              2     9.5
Application           2     9.5
Synthesis             2     9.5
Evaluation            2     9.5
Total                21

Types of Responses Given by the CLA Freshman Students

As shown in Table 2, of the average of 20 student responses, 9 (45%) were classified as knowledge, 4 (20%) were at the comprehension level, 2 (10%) at analysis, 3 (15%) at application, and 1 (5%) each at synthesis and evaluation. This implied that almost half of the students' responses were at the knowledge level, which involves memory and recall only.

Table 2. Types of Student Responses Based on Bloom's Taxonomy

Type of Response    No.     %
Knowledge             9    45
Comprehension         4    20
Analysis              2    10
Application           3    15
Synthesis             1     5
Evaluation            1     5
Total                20   100

Relationship between the Types of Teacher Questions

and the Types of Student Responses

The data in Table 3, the correlation matrix between teachers' questions and students' responses, revealed that all of the r values except that for evaluation (p = .055) were significant at alpha .05. This meant that there was a significant correlation between the teachers' types of questions and the students' responses. It implied that if teachers asked knowledge-based questions, students would ultimately respond at the same level, giving simple recall or knowledge-based responses.
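The correlations in Table 3 were computed from per-teacher tallies at each Bloom level. The raw per-teacher counts are not reported in the paper, so the sketch below uses hypothetical counts purely to illustrate the computation with scipy.stats.pearsonr:

```python
from scipy.stats import pearsonr

# Hypothetical per-teacher tallies for one Bloom level (illustrative only;
# the study's raw per-teacher counts are not reported in the paper).
knowledge_questions = [12, 7, 9, 15, 4, 10, 8, 11]   # questions asked
knowledge_responses = [11, 6, 9, 14, 5, 9, 8, 10]    # responses at same level

r, p = pearsonr(knowledge_questions, knowledge_responses)
print(f"r = {r:.3f}, p = {p:.5f}")
```

With per-teacher tallies as closely matched as these, r comes out near 1, mirroring the strong question-response correspondence the study reports.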

Table 3. Correlation Matrix between Teachers' Types of Questions and Students' Responses

Variables               r      Sig.    Interpretation
Knowledge and SR       .948   .000    Significant
Comprehension and SR   .989   .000    Significant
Application and SR     .642   .000    Significant
Analysis and SR        .80    .000    Significant
Synthesis and SR       .756   .000    Significant
Evaluation and SR      .348   .055    Not significant

SR = students' responses

Difference in Teachers' Types of Questions by Gender,

Educational Attainment and Length of Service

In terms of gender, as shown in Table 4, none of the t values for any type of question was significant at the .05 alpha level. This meant that male and female respondents did not differ in the types of questions they asked in their classes. Perhaps these teachers used common types of questions to elicit responses from the students. Thus, gender did not influence the teachers' types of questions used in the classroom.
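Each t value in Table 4 can be reproduced from the reported summary statistics alone with scipy.stats.ttest_ind_from_stats. The group sizes (11 male, 20 female) are not stated in the paper; they are inferred here from the reported means and are therefore an assumption.

```python
from scipy.stats import ttest_ind_from_stats

# Reproduce the Knowledge row of Table 4 from the reported means and SDs.
# Group sizes n_male = 11 and n_female = 20 are an assumption inferred from
# the reported means (8.7273 ~ 96/11, 9.1500 = 183/20); total n = 31.
t, p = ttest_ind_from_stats(mean1=8.7273, std1=4.90083, nobs1=11,
                            mean2=9.1500, std2=5.09153, nobs2=20)
print(f"t = {t:.3f}, p = {p:.3f}")  # t = -0.224, p = 0.824 (matches Table 4)
```

The default pooled-variance (equal-variance) t-test reproduces the tabled t of -.224 exactly, which suggests that is the form the study used.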

Table 4. Differences in the Teachers' Types of Questions by Gender

Type of Question   Sex      Mean     Std. Deviation    t       Sig.   Interpretation
Knowledge          Male     8.7273   4.90083          -.224    .824   Not significant
                   Female   9.1500   5.09153
Comprehension      Male     3.5455   2.58316           .050    .961   Not significant
                   Female   3.5000   2.35081
Application        Male     2.5455   2.20743           .625    .537   Not significant
                   Female   2.1000   1.71372
Analysis           Male     3.3636   2.80260          1.698    .100   Not significant
                   Female   2.1500   1.18210
Synthesis          Male     1.3636   1.91169          -.455    .659   Not significant
                   Female   1.6500   1.59852
Evaluation         Male     1.9091   3.08073           .367    .716   Not significant
                   Female   1.6000   1.63514

When the data were grouped by educational attainment, ANOVA results revealed that none of the F values was significant at alpha .05 (Table 5). This meant that the teachers did not differ in the types of questions they asked in the classroom; in other words, educational attainment did not influence the types of questions they employed. However, the F value of 2.524 for analysis, with a corresponding p value of .065, while not significant at the .05 level, was meaningful. According to Pedhazur, a result is meaningful when the p value is greater than .05 but less than .20; there is then good reason not to drop the variable but to affirm or disaffirm its contribution in future research. Further analysis of the results suggested that, in general, educational attainment cannot account for much of the variance in the teachers' types of questions asked in the classrooms.
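Each Sig. value in Table 5 follows directly from the reported sums of squares and degrees of freedom via the F distribution; a sketch for the Knowledge and Analysis rows using scipy.stats.f (the helper function name is my own):

```python
from scipy.stats import f

def anova_from_ss(ss_between, df_between, ss_within, df_within):
    """Recover F and its p value from an ANOVA table's sums of squares."""
    F = (ss_between / df_between) / (ss_within / df_within)
    p = f.sf(F, df_between, df_within)  # upper-tail probability
    return F, p

# Knowledge row of Table 5: F = 1.334, Sig. = .284
F_k, p_k = anova_from_ss(132.234, 4, 644.154, 26)
# Analysis row: F = 2.524, Sig. = .065 -- "meaningful" in Pedhazur's sense
F_a, p_a = anova_from_ss(20.353, 4, 52.421, 26)
print(f"Knowledge: F = {F_k:.3f}, p = {p_k:.3f}")
print(f"Analysis:  F = {F_a:.3f}, p = {p_a:.3f}")
```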

Table 5. ANOVA Results on the Differences in the Teachers' Types of Questions by Educational Attainment

Type of Question   Source of Variation   Sum of Squares   DF   Mean Squares    F      Sig.
Knowledge          Between Groups            132.234       4      33.058      1.334   .284
                   Within Groups             644.154      26      24.775
                   Total                     776.387      30
Comprehension      Between Groups              9.198       4       2.299       .392   .812
                   Within Groups             152.351      26       5.860
                   Total                     161.548      30
Application        Between Groups             10.234       4       2.558       .573   .685
                   Within Groups             116.154      26       4.467
                   Total                     126.387      30
Analysis           Between Groups             20.353       4       5.088      2.524   .065
                   Within Groups              52.421      26       2.016
                   Total                      72.774      30
Synthesis          Between Groups             13.950       4       3.487      1.164   .350
                   Within Groups              77.921      26       2.997
                   Total                      77.921      30
Evaluation         Between Groups              9.339       4       2.335       .625   .649
                   Within Groups              97.048      26       3.733
                   Total                     106.387      30

On the differences in the teachers' types of questions by length of service, ANOVA results showed that none of the F values for any type of question was significant at the .05 alpha level. This meant that the teachers did not differ in the types of questions they asked in their classes (Table 6).

This implied that the teachers' length of service was not a factor affecting the types of questions they asked in their classes. It further implied that those who had been in the teaching profession for a long time and those who had served only a short period seemed to draw on the same teaching experience, since they employed the same types of questions in the classroom. This result runs counter to several studies which revealed that teachers' length of service may interact with educational opportunities.

Table 6. ANOVA Table on the Differences in the Teachers' Types of Questions by Length of Service

Type of Question   Source of Variation   Sum of Squares   DF   Mean Squares    F      Sig.   Interpretation
Knowledge          Between Groups              3.115       3       1.038       .038   .990   NS
                   Within Groups             730.885      27      27.070
                   Total                     734.000      30
Comprehension      Between Groups              9.089       3       3.030       .503   .683   NS
                   Within Groups             162.653      27       6.024
                   Total                     171.742      30
Application        Between Groups              2.954       3        .985       .258   .855   NS
                   Within Groups             102.981      27       3.814
                   Total                     105.935      30
Analysis           Between Groups             14.896       3       4.965      1.332   .285   NS
                   Within Groups             100.653      27       3.728
                   Total                     115.548      30
Synthesis          Between Groups              2.946       3        .982       .320   .810   NS
                   Within Groups              82.731      27       3.064
                   Total                      85.677      30
Evaluation         Between Groups             13.247       3       4.416       .895   .456   NS
                   Within Groups             133.140      27       4.931
                   Total                     146.387      30

NS = not significant

Discussion

On the types of questions asked by college teachers, the study concluded that almost half of the questions asked by the CLA teachers were knowledge questions, with the remaining share of a little more than half distributed across the other types: comprehension, application, analysis, synthesis and evaluation questions. These data implied that the questions asked by the CLA teachers of WMSU across disciplines were generally low-level questions. Why is this so? According to Gall (1970), as cited in Rosaldo (2002), there is little or no research on the vital issue of what fuels or explains this persistent pattern of questioning. But teachers freely admit that there are hurdles: it takes skill and practice to build a climate of inquiry among students of widely varying backgrounds. In addition, Corteza (2003) revealed in her study that teachers asked knowledge questions most often, followed by comprehension questions, both considered low-level questions. Application, analysis, synthesis and evaluation questions, which are considered high-level questions, were not often used by teachers in the classroom. The result also affirmed the finding of Eumague et al. (2009) that teachers used low-level questions (72.3%) in their literature tests. When interviewed about why, the teachers cited the convenience of low-level questions, particularly in scoring; fifty percent (50%) of the interviewees said that answers to low-level questions are easy to check and to score.

As to the student responses, the results disclosed that almost half of the students' responses were at the knowledge level, and this correlated with the teachers' questions, among which the knowledge type was the most frequent. This finding confirmed what Cooper and others have said about students' knowledge-level responses. Cooper et al. (1977), as cited in Corteza (2003), stated that students are able to answer knowledge questions very easily because they can simply remember what they have read, the information being explicitly stated in the article or text. Such questions do not require students to manipulate information or to think, as they are among the simplest question types, allowing only one right response. They call only for recall and rote memory of previously learned facts.

Regarding students' difficulty with higher-level questions, Brown and Cooper offered these explanations. Brown (1994) stated that in order to answer a comprehension-level question, the student must go beyond recall of information: the student must demonstrate a personal grasp of the material by rephrasing it, describing it in his or her own words, and using it to make comparisons. Likewise, Cooper et al. (1977) stated that most students are poor at answering application questions. Many of them cannot use data and principles to complete a problem task with minimal direction, perhaps because they are not much exposed to application questions and find it difficult to apply previously learned information in order to reach an answer to a problem. Students were also poor at analysis questions. This may be because, according to Cooper et al. (1977), several answers are possible for analysis questions; furthermore, they take time to think through and cannot be answered quickly or without careful thought. Analysis questions not only help students learn what happened but also help them search for the reasons behind what happened; they require students to analyze information in order to identify causes, reach conclusions and find supporting evidence.

As to the correlation between the teachers' types of questions and the students' responses, the data revealed that all of the r values except that for evaluation were significant at alpha .05. This meant that there was a significant correlation between teachers' types of questions and students' responses, implying that if teachers asked knowledge-based questions, students would ultimately respond at the same level, giving simple recall or knowledge-based responses. This result supported Lambert and Pearl (1986), who asserted that effective questioning yields effective responses: high-level questioning yields high-level responses, and low-level questioning yields low-level responses. The finding also supported the theory of Martin et al. (1988), as cited in Corteza (2003), that asking questions at a certain level will indeed elicit responses at the same level. They said that the questions teachers ask set the limits and influence the progression of the class; teachers expected students to think at a certain level, and they did.


Furthermore, the study showed no significant differences in the teachers' types of questions with respect to gender, educational attainment and length of service: none of the test statistics was significant at the .05 alpha level. This meant that teachers asked the same low-level questions whether they were male or female, bachelor's or doctorate degree holders, and new or long-serving in the teaching profession. The result for gender affirmed Sunga's (2000) study, "Questioning Strategies Through Interactive Learning Vis-a-Vis the Levels of Comprehension of the Students in terms of Knowledge, Comprehension, Application, Inference, Analysis, Synthesis, and Evaluation Levels", which revealed sex to have no influence on teachers' questioning skills. The finding of no significant difference in teachers' questioning levels by educational attainment contradicted Corteza's (2003) study, which revealed that teachers' educational attainment contributed to their questioning skills. One explanation may be that in the present study the teachers used varied instructional materials but, perhaps, with similar objectives for basic knowledge acquisition, since the lessons were developmental ones and each observation lasted only 30 minutes.

On the basis of the aforementioned findings, it can be inferred that teachers across disciplines used different types of questions, such as knowledge, comprehension and application, during the teaching-learning process, but used the higher-level types only infrequently. This suggests that the teachers may have used materials that did not trigger higher-order thinking skills, or the pattern may reflect the limitation of only one observation per teacher. Thus, the study recommended that teachers be encouraged to use higher-level questions to help develop students' critical thinking, and that the art of questioning be made part of the input and sharing in the college faculty professional circle or similar gatherings. For future researchers, it was recommended that a similar study be conducted with three observations per teacher, one follow-up observation and another for confirmation, to obtain more reliable results.
