The purpose of this proposed study is to examine the effects of technology interventions, compared to math drill interventions, on building math fact fluency. The proposed research question asks whether technology-based instructional interventions are more effective and efficient than standard paper-pencil math drill interventions at improving second graders' math fact fluency. Previous literature has failed to find consistent evidence for the effectiveness of digital mathematics interventions on learning outcomes. This study expands on the literature's contradictory results and also explores the efficiency of technology interventions compared to standard math drill exercises. A cumulative acquisition design will be utilized across four second-grade participants to compare two conditions: FASTT Math software instruction and standard paper-pencil math drills. The results could provide further evidence on which instructional modes are most effective for mastering math fact fluency skills.
Examining Effects of Intervention Modes on Building Math Facts Fluency
Unfortunately, international math assessments continue to rank 4th- and 8th-grade U.S. students at disappointing levels in comparison to same-grade students in other countries. Similarly, the National Assessment of Educational Progress (NAEP) reports that only 40 percent of public school students in grade 4 met mathematics proficiency levels and, even more discouraging, only 35 percent of students in grade 8 did so (NAEP, 2017). Consequently, educational researchers have explored various factors in an effort to improve students' mathematics performance. One common area of study is how different modes of instruction can positively influence academic success. The current study expands this line of work by analyzing the effectiveness of technology interventions compared to traditional paper-pencil modes of learning, using a single-case research design to provide scientifically based evidence.
In mathematics, common modes of instruction include math drills and math games such as Quick Images and Ten Frames (Kling, 2011). Implementing technology as a mode of instruction is a fairly recent area of exploration in mathematics; it is more frequently explored in other academic subjects such as reading and science. In today's generation, a high number of children interact with digital games on a daily basis, sparking educators' interest in applying digital games to academic learning. Instructional games have been linked to many interesting findings about how they perpetuate a new learning culture. Positive effects of technology-assisted modes have been found for computerized flashcards, drill practices, and even homework (Reynolds, 2010; Williams, 2000; Mendicino, Razzaq, & Heffernan, 2009). As a result, studies have sought to explain why instructional modes involving technology seem to benefit learning outcomes.
Positive correlations between instructional games and students' habits and interests have been examined, suggesting a more personalized style of learning. In addition, learning through digital games may increase attention spans because the material is delivered in an engaging way. Technology game-based learning has also shown consistent evidence of increasing motivation to learn, especially for content that is difficult to understand or teach (Prensky, 2001). Relatedly, Chang, Chen, and Huang (2008) found significant growth in motivation to learn basic math facts in 6th and 8th graders, according to the students' own responses. In sum, teaching through instructional games makes learning more fun for students. Given these benefits, instructional games are commonly used within school settings across the United States.
For example, a national PBS survey found that 43% of classroom computing involves playing educational digital games (PBS.org, 2012). This demonstrates the wide use of technology in academic settings and stresses the importance of exploring the effectiveness of these learning tools. From an educator's standpoint the topic elicits controversy; however, some research suggests teachers feel digital games increase motivation and can make learning more personalized for students (Joan Ganz Cooney Center, 2012). From these findings, we can conclude that technology is being used as an educational intervention, bringing to light the importance of analyzing its effectiveness on learning outcomes.
Clark, Tanner-Smith, and Killingsworth (2014) conducted a meta-analysis of 77 peer-reviewed journal articles studying the effects of using digital games versus no digital games in STEM subjects. The findings suggest a moderate to strong effect in support of digital games on broad cognitive learning outcomes. In addition, the use of technology as a learning tool has been widely researched and reveals positive learning outcomes in the social sciences (Huizenga, Admiraal, Akkerman, & ten Dam, 2009), in science education (Annetta, Mangrum, Holmes, Collazo, & Cheng, 2009), and for language skill acquisition (Liu & Chu, 2010). However, its effects on mathematics performance are significantly less studied, identifying a gap in the literature.
Much of the current literature on instruction through technology has studied its effects on building math fact fluency. Math fact fluency is taught early in education and concerns the ability to compute addition and subtraction problems fluently. Mastery of math fact fluency in addition and subtraction is critical because it serves as a building block for future mathematics skill acquisition. Although we know the importance of this skill, which types of math interventions are best for building math fact fluency is highly debated in the research. Findings have been largely inconsistent as to whether students' mathematics performance increases (Ke & Grabowski, 2007; Kebritchi, Hirumi, & Bai, 2010) or shows no effect (Gelman, 2010; Jones, 2011; Ritzhaupt, Higgins, & Allred, 2011) as a result of technology interventions.
Codding, Burns, and Lukito (2011) examined seventeen single-case design studies with the intent of analyzing the type of intervention, level of intensity, and feasibility for math fact fluency. The results showed that drill and practice with modeling produced the largest effect sizes. In 2012, this study was expanded to analyze differences in effect sizes, and the variables moderating the results included student age, time spent in intervention, and intervention type (Methe, Kilgus, Neiman, & Tillman, 2012). Notably, none of the articles studied included technology interventions. Despite these results, many other sources advocate for moving away from math drill interventions.
As indicated previously, from a global perspective, U.S. students are significantly behind in mathematics compared to other countries. The Trends in International Mathematics and Science Study (TIMSS) has repeatedly ranked Singapore at the top in mathematics scores. In 2017, Singapore was ranked number one in fourth- and eighth-grade math performance; comparatively, the United States was ranked 24th. As a result, educators, parents, and researchers have wondered what Singapore is doing differently in math instruction. Singapore math emphasizes not memorizing math skills, but mastering them.
While the United States commonly practices drills and memorization of math facts, Singapore educators work on children understanding the material on a deeper level, through visualizations with pictures and graphs (Keierleber, 2015). In addition, Singapore's mathematics assessment scores are not the only data suggesting that memorizing math facts through drills may not be effective. Consistent results examining the effectiveness of math fact memorization through automaticity drills have found that a process-driven approach is better supported, as it builds more familiarity and strategies (McGee, Brewer, Hodgson, Richardson, Gonulates, & Weinel, 2017). From this information, we can conclude that alternative interventions to math drills can elicit better learning outcomes.
Beyond the contradictory results in the literature, methodological limitations in empirical studies are another contributor to literature gaps. Single-case design (SCD) studies conducted in educational settings are strongly advocated because educational professionals can make defensible judgments about the impact of an intervention on outcome data with only one student. The intervention can then be generalized to students with similar characteristics and needs. Successful interventions are imperative for students to fulfill their full academic potential; therefore, a defensible methodology for determining whether an intervention is effective can have a crucial impact (Riley-Tillman & Burns, 2009). Unfortunately, SCDs that analyze digital mathematics interventions on math fact fluency are nearly nonexistent. Furthermore, a cumulative acquisition design (CAD) analyzing differences in intervention efficiency has not been conducted.
Notably, none of the articles in the meta-analytic reviews previously discussed included technology interventions. In addition, studies that have found positive effects of math drill interventions mainly used qualitative data, descriptive statistics, and quasi-experimental designs to analyze the results. This study is designed to expand the use of SCDs for evaluating effectiveness by incorporating digital technology as a mode of instruction. Furthermore, data collection in a CAD allows us to evaluate both which intervention is more effective and which elicits positive results more quickly.
This study will analyze the effects of learning mathematics by technology (FASTT Math software) versus learning by paper-pencil modes of instruction on math fact fluency. AIMSweb's CBM math fact fluency assessments will serve as our measure of math fact fluency skills. Using a cumulative acquisition design (CAD), we can form two separate hypotheses. First, we expect to see a larger increase in math fact fluency for the technology condition (FASTT Math software) compared to the paper-pencil condition. Second, considering the gap in the literature on instructional efficiency, the design allows us to analyze which condition results in students learning math facts at a quicker rate. While it is unclear which condition will yield faster learning, we would expect one condition to result in more quickly acquired skills than the other.
A small group of four second-grade students will be selected to participate in a study evaluating the effectiveness of standard mathematics interventions versus a technology intervention on mastery of basic math facts. The sample will be drawn from an urban Millard Public Schools elementary school and will include two boys and two girls, all eight years old. Parents will fill out a demographic questionnaire and, if parental consent is given, Caucasian students of middle socioeconomic status with no known genetic or environmental factors inhibiting academic ability will be selected. This inclusion criterion was chosen because it can best be generalized to students in surrounding locations.
The proposed classroom is highly structured and consists of 15–20 total students. Two of the four participating students will work in the hallway during implementation of the math drill intervention. Considering the common use of math drills for mastering math fact fluency, as established in the literature, this group can provide relative baseline information. The remaining two students who fit the preferred demographics will engage with the FASTT software program for 10 minutes at a small table in the back of the classroom, equipped with two computers.
AIMSweb CBM math fact fluency probes will be used to screen for the four students who have not met mastery levels in math fact fluency. In addition, the same measure will be used as the intervention data to evaluate progress. AIMSweb M-CBM math fact fluency probes are nationally normed at grade level and focus on basic math skills (addition and subtraction). Two separate probes will be given, one with subtraction problems and one with addition problems. Each probe consists of 84 problems, for a total of 132 possible digits correct, and involves solving single- and double-digit problems as well as regrouping (see Appendix C). Each probe is given for 2 minutes, and the mastery level is 80 digits correct, or 40 per minute, on each probe. In the technical manual, a reliability coefficient of 0.82, split-half reliability of 0.85, and high interrater reliability of 0.99 were reported for second-grade probes. The math problems are grade appropriate and are a valid representation of students' current skill levels. The overall administration of the AIMSweb CBM math fact fluency probes takes about 5 minutes.
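The scoring arithmetic above (digits correct on a 2-minute probe, with an 80-digits-correct mastery criterion) can be sketched as a small helper. This is purely illustrative; the function names and structure are assumptions, not part of AIMSweb.

```python
# Illustrative sketch of the probe scoring described above.
# Constants reflect the text: 2-minute probes, mastery at 80 digits correct
# (equivalently 40 digits correct per minute).

PROBE_MINUTES = 2
MASTERY_DIGITS = 80

def fluency_rate(digits_correct: int, minutes: float = PROBE_MINUTES) -> float:
    """Digits correct per minute on one probe."""
    return digits_correct / minutes

def meets_mastery(digits_correct: int) -> bool:
    """Whether a probe score reaches the 80-digits-correct criterion."""
    return digits_correct >= MASTERY_DIGITS

# A score of 80 digits correct corresponds to 40 per minute and meets mastery;
# a score of 64 (32 per minute) would flag the student for intervention.
```

A screening pass would simply apply `meets_mastery` to each student's addition and subtraction probe scores and select students who fall below the criterion on either probe.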
Equipment and Independent Variables
This study will require two computers or tablets, each with the FASTT Math Next Generation software installed (Hasselbring & Goin, 2005). Access to a computer and a printer will also be necessary for the needed AIMSweb materials. FASTT Math is a computer-based math intervention program designed to support students in grades 2 and up in mastering math fact fluency in addition and subtraction. Its adaptive software automatically differentiates instruction based on continuous assessment of the students' math fact fluency. Although I was unable to find reliability and validity data, the program has been recognized by Scholastic.com, is utilized throughout U.S. schools, and research evaluations suggest positive effects on mathematics performance (Clawson, Grier, & Iocoangeli, 2011).
First, we need to identify the students who need additional support to master math fact fluency. All students will be administered AIMSweb M-CBM math fluency probes in addition and subtraction. For those whose basic math skills have not been mastered, a follow-up questionnaire will be given to the parents (see Appendix A) to gather data on whether they are suitable candidates who meet the criteria. After four students have been identified as good candidates for the experimental design, a consent form will be given to the parents to sign.
This research study will have two conditions. In Condition A, two students will engage in 10 minutes of extra daily math drills with subtraction and addition problems, administered by a school psychologist. This approach represents standard educational practice due to how commonly it is implemented in schools. Each day the sheets will change and include different problems, in order to prevent carryover effects. In Condition B, two students will engage in 10 minutes of the FASTT program on a daily basis. Data will be collected weekly by assessing each student's math fact fluency score on two AIMSweb M-CBM math fact fluency probes, one for addition and one for subtraction. The total digits correct will serve as the dependent variable. The Friday of the 15th week will mark the last day of the study, after which we will report, analyze, and interpret our data to determine which condition improved the most and which showed improvement at a quicker rate.
Research Design and Data Analysis
This study uses a CAD to address the effectiveness of different instructional modes on math fact fluency. Considering that approximately 10–15% of students can be identified as not meeting mastery level in math fact fluency, implementing complex single-case research designs for all of these students would present many challenges (Riley-Tillman & Burns, 2009). In this CAD, two conditions are compared and two different graphs are produced. The first graph shows differences in the effectiveness of the interventions; in relation to our hypothesis, we would hope to see Condition B produce a higher trendline than Condition A. The X-axis is labeled weeks, and the Y-axis is labeled number of math digits correct. For CAD designs, we could continue collecting data until a clear difference is seen between conditions (Riley-Tillman & Burns, 2009); however, because we did not collect ME design data beforehand, the intervention will be implemented for 15 weeks. Unique to a CAD study, a second graph will display the number of instructional minutes for each condition. If Condition A's trendline is higher than Condition B's, we would conclude that instruction through math drills is the more efficient of the two interventions.
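The trendline comparison described above can be summarized numerically by fitting a least-squares slope of digits correct over weeks for each condition and comparing the two. The sketch below uses invented placeholder scores, not study data, purely to illustrate the comparison.

```python
# Illustrative comparison of two conditions' trendlines in a CAD-style graph:
# the condition whose scores rise faster per week has the steeper slope.
# All numbers below are hypothetical placeholders.

def slope(weeks, scores):
    """Least-squares slope of scores regressed on weeks."""
    n = len(weeks)
    mean_x = sum(weeks) / n
    mean_y = sum(scores) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(weeks, scores))
    den = sum((x - mean_x) ** 2 for x in weeks)
    return num / den

weeks = [1, 2, 3, 4, 5]
condition_a = [20, 24, 27, 31, 34]  # hypothetical math-drill digits correct
condition_b = [21, 27, 33, 40, 46]  # hypothetical FASTT digits correct

gain_a = slope(weeks, condition_a)  # ~3.5 digits correct gained per week
gain_b = slope(weeks, condition_b)  # ~6.3 digits correct gained per week
faster_condition = "B" if gain_b > gain_a else "A"
```

The same slope comparison applied to the second graph (cumulative facts mastered against instructional minutes) would summarize efficiency rather than effectiveness.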
Based on previous literature showing inconsistent findings for technology as an effective intervention, our expectations for the results of this study are not concrete. However, taking into account the research supporting going beyond math drill instruction (Keierleber, 2015), we hypothesize that the technology condition will show more improvement than the math drill condition. Furthermore, considering the literature suggesting increased learning motivation with technology instruction, our hypothesis in favor of the technology group would remain (Chang et al., 2015). As for which condition will show greater instructional efficiency, we make no directional hypothesis. A CAD comparing these two conditions has yet to be conducted; however, rate of improvement has been examined for math drills, with moderate effect sizes (Methe et al., 2012).
If we find the technology condition to be more effective and efficient than the math drill condition, several implications can be made. Clear scientific evidence linking assistive technology to improved math performance could reform schools at both the national and microsystem levels. More school policies may provide funding for computer-based mathematics software. In addition, since learning would be applied in a more interesting, visual, engaging, and effective way, we could expect math proficiency scores to increase at the national level. At the classroom level, more teachers would teach and support learning through technology use, compared to paper-pencil methods. In terms of the chosen experimental design, the need to establish instructional efficiency may lend itself to implementing a CAD more frequently (Riley-Tillman & Burns, 2009).
Lastly, this study design does present some weaknesses and areas of concern. For starters, the needed materials present a couple of challenges. FASTT Math Next Generation is, unfortunately, not free; if the study were carried out, we would need not only classroom access to computers or tablets but also funding for the software. In addition, the design poses an internal validity concern because random assignment is not applicable. While a CAD can inform us of instructional efficiency on the studied topic, the ability to generalize beyond the four participants, settings, or stimuli is limited. Overall, the study's results should be interpreted with caution, but they can serve a valuable purpose in exploring intervention effectiveness on math fact fluency.
Demographic Questionnaire About Child
- Basic Information
Do you have any concerns about your child’s health?
Do you have any concerns about your child’s academic ability?
Are there any known genetic or environmental factors which could inhibit your child’s academic success?
Did your child attend pre-school?
(AIMSweb math fact fluency probes would appear here; if a hard copy of this document is presented, the probes will be attached.)
- Chang, M., Evans, M., Kim, S., Norton, A., & Samur, Y. (2015). Differential effects of learning games on mathematics proficiency. Educational Media International, 52(1), 47–57.
- Clark, D., Tanner-Smith, E., & Killingsworth, S. (2014). Digital games, design and learning: A systematic review and meta-analysis (executive summary). Menlo Park, CA: SRI International.
- Clawson, K., Grier, V., & Iocoangeli, A. (2011). Evaluation of FASTT Math. Graduate Capstone Projects, 1. http://commons.emich.edu/cap/1
- Codding, R. S., Burns, M. K., & Lukito, G. (2011). Meta-analysis of math basic-fact fluency interventions: A component analysis. Learning Disabilities Research & Practice, 26, 36–47.
- Gelman, A. (2010). Mario math with millennials: The impact of playing the Nintendo DS on student achievement. Retrieved from ProQuest Digital Dissertations (UMI No. 3411899).
- Hasselbring, T. S., Lott, A. C., & Zydney, J. M. (2006). Technology-supported math instruction for students with disabilities: Two decades of research and development. Washington, DC: CITEd, Center for Implementing Technology in Education (www.cited.org).
- Keierleber, M. (2015, July 15). 6 reasons why Singapore math might just be the better way. Retrieved from https://www.the74million.org/article/6-reasons-why-singapore-math-might-just-be-the-better-way/
- Kling, G. (2011). Fluency with basic addition. Teaching Children Mathematics, 18(20), 80–88.
- McGee, D., Brewer, M., Hodgson, T., Richardson, P., Gonulates, F., & Weinel, R. (2017). A districtwide study of automaticity when included in concept-based elementary school mathematics instruction. School Science & Mathematics, 117(6), 259–268. doi:10.1111/ssm.12233
- Mendicino, M., Razzaq, L., & Heffernan, N. T. (2009). A comparison of traditional homework to computer-supported homework. Journal of Research on Technology in Education, 41(3), 331–359.
- Methe, S. A., Kilgus, S. P., Neiman, C., & Tillman, C. (2012). Meta-analysis of interventions for basic mathematics computation in single-case research. Journal of Behavioral Education, 21, 230. https://doi.org/10.1007/s10864-012-9161-1
- Prensky, M. (2001). Digital game-based learning. New York: McGraw-Hill.
- Reynolds, J. (2010). The effects of computerized instruction and systematic presentation and review on math fact acquisition and fluency. ProQuest LLC.
- Riley-Tillman, T. C., & Burns, M. K. (2009). Evaluating educational interventions: Single-case design for measuring response to intervention. New York: Guilford Press.
- Williams, L. (2000). The effect of drill and practice software on multiplication skills: "Multiplication Puzzles" versus "The Mad Minute." Johnson Bible College.