
Using Technology to Enhance RtI Implementation


Response to Intervention (RtI) is perhaps the most significant educational reform initiative in the United States in the past 50 years. However, there is a long history of failed educational innovations, often due to a lack of supporting research or to implementation difficulties (Ellis, 2005). Research has consistently supported the effectiveness of RtI approaches in enhancing student learning (Burns, Appleton, & Stehouwer, 2005), but several scholars have suggested that fidelity of implementation will be an important obstacle to overcome for large-scale RtI implementation (Gansle & Noell, 2007; Ysseldyke, 2005).

 

Technology offers a potential medium through which RtI implementation could be made easier and more likely to occur (Ysseldyke & McLeod, 2007). The use of technology makes ongoing data collection, data analysis, and data-based decision making a more plausible proposition, and it can keep these important aspects of RtI from monopolizing teacher time. Previous research found that the use of technology substantially facilitated collecting, managing, and analyzing educational data (McIntire, 2002; McLeod, 2005; Pierce, 2005; Wayman, 2005). Thus, technology-enhanced assessment (TEA) would likely support RtI implementation, but applying technology to other aspects of RtI would likely enhance the implementation of those components as well. The RTI Action Network of the National Center for Learning Disabilities has identified high-quality classroom instruction, tiered instruction/intervention, ongoing student assessment, and family involvement as the essential components of RtI. Below is information about those essential components and suggestions for ways in which technology could facilitate the successful implementation of each. It should be noted that mention of a product below does not constitute an endorsement.

 

High-quality Classroom Instruction

 

An effective core curriculum is the foundation of successful RtI implementation. Small-group supplemental interventions cannot be successful if too many students demonstrate skill deficits, because schools usually do not have sufficient resources to implement supplemental interventions for more than 20% of the student population. Recent meta-analyses found moderate to large effects for various technologies, including personal computers, game-like curricula, and interactive simulations (Blanchard & Stock, 1999; Vogel et al., 2006), which suggests that schools could use technology to improve core instruction.

 

To fully examine the different technologies would go beyond the scope of this article. However, technology could help facilitate implementation of an effective Tier 1 through various applications and by assessing the quality of the core instruction. Version 3 of the Ecobehavioral Assessment System Software (EBASS) is based on the research of Greenwood, Carta, and Atwater (1991) and provides a technology-enhanced assessment of the instructional environment. EBASS is a software system that school personnel can use to conduct systematic classroom observational assessments with laptop, notebook, or hand-held computers.

 

The classroom observer can use one or more components of the EBASS: the Code for Instructional Structure and Student Academic Response (CISSAR); the Ecobehavioral System for Complex Assessments of Preschool Environments (ESCAPE); the Mainstream-CISSAR (MS-CISSAR), for students with disabilities in general education; and the Ecobehavioral System for the Contextual Recording of Interactional Bilingual Environments (ESCRIBE). Each system involves classroom observations in 10-second intervals using a portable computer to gather the data. Reports from an observation may consist of the percentage of occurrence for all events or the probabilities of student behavior given specific arrangements of the classroom ecology, using data from a single observation or data gathered over a period of time. Among the information that the EBASS provides are academic engaged time, the occurrence of inappropriate behavior, and the occurrence of task-management responses. The strength of EBASS is that it provides precise information on the frequency of occurrence of specific kinds of behaviors and on the kinds of contextual factors associated with the occurrence of each (Ysseldyke & Burns, 2009). Thus, the EBASS could be used to provide feedback to classroom teachers about how to improve their instructional practice and the core curriculum.
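The kind of summary such an observational report contains can be illustrated with a short sketch. The categories, data, and variable names below are hypothetical; this is not EBASS code, only an illustration of how a percentage of occurrence and an ecology-conditioned probability of behavior are computed from 10-second interval records.

```python
# Illustrative sketch (not EBASS itself): summarizing momentary
# time-sampling records like those gathered in 10-second intervals.
from collections import Counter

# Each record: (instructional arrangement, student response) for one
# 10-second interval. All categories here are hypothetical examples.
intervals = [
    ("whole_group", "engaged"), ("whole_group", "off_task"),
    ("small_group", "engaged"), ("small_group", "engaged"),
    ("independent", "off_task"), ("independent", "engaged"),
]

# Percentage of occurrence for each student response across all intervals.
response_counts = Counter(resp for _, resp in intervals)
total = len(intervals)
pct = {resp: 100 * n / total for resp, n in response_counts.items()}

# Probability of engagement given each classroom arrangement: the kind
# of ecology-to-behavior probability an observational report provides.
arrangements = {arr for arr, _ in intervals}
p_engaged = {
    arr: sum(1 for a, r in intervals if a == arr and r == "engaged")
    / sum(1 for a, _ in intervals if a == arr)
    for arr in arrangements
}

print(pct)        # percentage of intervals for each response type
print(p_engaged)  # engagement rate by instructional arrangement
```

A report built this way would show, for instance, that engagement is higher during small-group work than during independent seatwork, which is exactly the sort of feedback a teacher could act on.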

 

Tiered Instruction/Intervention

 

Approximately 20% of students are not successful in the core instruction despite a good curriculum and effective instructional practices. Students who need additional support beyond core instruction receive small-group (Tier 2) or individualized (Tier 3) interventions in addition to the daily classroom instruction. Effective Tier 2 interventions should be targeted to the student deficit, occur 3 to 5 times per week, and last approximately 30 minutes per day (Burns, Hall-Lande, Lyman, Rogers, & Tan, 2006). Currently the most common application of technology to tiered interventions is the use of Web sites to identify interventions (e.g., Intervention Central and the IES What Works Clearinghouse). However, there are many well-constructed and research-based technology-enhanced interventions that both target the student deficit and allow for automated delivery. Until recently, the Florida Center for Reading Research (FCRR) reviewed supplemental interventions, rated which of the five areas identified by the National Reading Panel (phonemic awareness, phonics, fluency, vocabulary, and comprehension) each addressed and whether all aspects of that area were taught and/or practiced, and noted whether each intervention was technology based. Listed in Table 1 are the supplemental interventions rated as technology based and as addressing all aspects of one or more National Reading Panel instructional categories.

 

Table 1: Reading Interventions Rated as Technology Based and as Addressing a Specific Reading Skill

Program | Grades | Area Highly Rated by FCRR | For More Information
Discover Intensive Phonics for Yourself | K-12 | Phonics | Reading Horizons
Fast ForWord Language | K-12 | Phonemic awareness | Fast ForWord Language
The Literacy Center | K-2 | Phonemic awareness and phonics | Leap Frog School
My Reading Coach | 2-12+ | Phonics | MindPlay
Read Naturally | 1-12+ | Fluency | Read Naturally
Read On | 9-12+ | Vocabulary | Read On!
ReadAbout | 3-6 | Comprehension | ReadAbout
Thinking Reader | 6-8 | Comprehension | Thinking Reader

 

Although the information presented by the FCRR is extremely helpful for practitioners, substantially less information is available about technology-enhanced math interventions. Three of the more commonly used technology-enhanced interventions for math are VmathLive from Voyager Learning and Renaissance Learning's Accelerated Math and Math Facts in a Flash. All three of these programs have interactive software that appears appealing to students, allows for automated interventions with little supervision, and can be used to target specific skills and objectives.

 

Ongoing Student Assessment

 

Assessment is perhaps the cornerstone of RtI. Although schools are engaging in assessment practices more frequently, some of the tools being used are psychometrically less than desirable. Assessment data within an RtI framework should be used to screen student skills in order to identify those who require additional support and to monitor the progress of students receiving interventions. There are many well-constructed assessment measures that can be used to conduct universal screenings three times each year, but there are comparatively few that can be used to adequately monitor progress.
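The basic screening decision rule described above can be sketched in a few lines. The student names, scores, and cut score below are hypothetical; in practice the benchmark would come from the screening measure's published norms.

```python
# Illustrative sketch of a universal-screening decision rule: flag
# students for supplemental (Tier 2) support when a screening score
# falls below a benchmark. All names and numbers are hypothetical.
benchmark = 40  # hypothetical benchmark, words correct per minute

screening = {"Student A": 55, "Student B": 38, "Student C": 61, "Student D": 29}

# Students whose scores fall below the benchmark are referred for
# additional support and more frequent progress monitoring.
needs_support = sorted(
    name for name, score in screening.items() if score < benchmark
)
print(needs_support)
```

A data-management system automates exactly this comparison across an entire grade level three times a year, which is what makes school-wide screening feasible without consuming teacher time.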

 

The National Center on Student Progress Monitoring rated several measures to report whether there was sufficient evidence to address seven standards: reliability, validity, alternate forms, sensitivity to student improvement, adequate yearly progress benchmarks, improvement of student learning or teacher planning, and specified rates of improvement. Listed in Table 2 are all of the measures that had a technology component and were rated as demonstrating sufficient evidence for all seven standards. The table also includes the number of alternate forms that can be used to monitor progress, and whether the tool a) can be administered automatically with little supervision, b) includes a data-management system to facilitate use of the data, c) reports summary data, and d) provides instructional suggestions.

 

As can be seen in Table 2, all of the technology-enhanced assessment tools provide sufficient data management systems, but only six can be administered with little teacher supervision. Moreover, many of the tools provide data that could be used to design interventions, but only five provide specific instructional suggestions.
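Many of these tools report a rate of improvement, which is commonly computed as the least-squares slope of a student's scores over time. A minimal sketch of that computation, using hypothetical weekly oral-reading-fluency probes:

```python
# Illustrative computation of a progress-monitoring "rate of
# improvement": the ordinary least-squares slope fitted to weekly
# scores. Data below are hypothetical.
def rate_of_improvement(weeks, scores):
    """Return the least-squares slope (score units gained per week)."""
    n = len(weeks)
    mean_w = sum(weeks) / n
    mean_s = sum(scores) / n
    num = sum((w - mean_w) * (s - mean_s) for w, s in zip(weeks, scores))
    den = sum((w - mean_w) ** 2 for w in weeks)
    return num / den

# Eight weekly probes, scored in words correct per minute (wcpm).
weeks = [1, 2, 3, 4, 5, 6, 7, 8]
wcpm = [42, 45, 44, 48, 51, 50, 54, 57]
slope = rate_of_improvement(weeks, wcpm)
print(f"{slope:.2f} words correct per minute gained per week")
```

Comparing this observed slope against an expected rate of improvement is how teams decide whether an intervention is working or needs to be changed.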

 

Table 2: Technology-Enhanced Progress-Monitoring Tools That Met All Seven National Center on Student Progress Monitoring Standards

Measure | Stimulus Sets Available | Automated Administration | Data Management | Summary of Data | Instructional Suggestions
Renaissance Learning Accelerated Math | Unlimited | X | X | X | X
AIMSweb Early Literacy | 33 per grade | | X | X |
AIMSweb Maze | 33 per grade | | X | X |
AIMSweb Reading | 33 per grade | | X | X |
DIBELS Nonsense Word Fluency | 26 | | X | X |
DIBELS Oral Reading Fluency | 30 per grade | | X | X |
DIBELS Phoneme Segmentation Fluency | 25 | | X | X |
EdCheckup Maze | 23 | | X | X |
EdCheckup Reading | 23 | | X | X |
iSTEEP Reading Fluency | 50 per grade | | X | X |
Monitoring Basic Skills Progress: Basic Math | 30 per grade | | X | X | X
Monitoring Basic Skills Progress: Basic Reading | 30 per grade | | X | X | X
Vital Indicators of Progress-Nonsense Word Fluency | 11 (K), 12 (grade 1), 9 (grade 2) | X | X | X | X
Vital Indicators of Progress-Phoneme Segmentation | 11 (K), 12 (grade 1) | X | X | X | X
Yearly Progress Pro Math | 52 per grade | X | X | X |
Yearly Progress Pro Reading | 33 per grade | X | X | X |

Family Involvement

 

Research has consistently suggested that family–school partnerships improve student outcomes (Eccles & Harold, 1993; Jeynes, 2007), but this knowledge has yet to fully inform the RtI process. Reschly (n.d.) provided an overview of parental involvement in RtI and suggested that the five family–school co-roles outlined by Christenson and Sheridan (2001; co-communicators, co-supporters, co-learners, co-teachers, and co-decision makers) map onto the three tiers of intervention within RtI. Parents of students who are successful in the core instruction are engaged in their children's learning through communication and support. However, co-learning, co-teaching, and co-decision making are needed between families and schools for students receiving Tier 2 or Tier 3 interventions.

Technology can certainly assist with co-decision making in that the technology-enhanced assessment tools described above often generate reports with graphs and other figures that make data easier for parents to understand. However, there is surprisingly little available for the other partnering roles. Most Web-based communication systems (e.g., e-mail, parent/student portals for access to school performance indicators, etc.) are unidirectional and only communicate from school to home. Thus, schools are encouraged to develop technology-enhanced methods for bidirectional communication.

 

Schools are also encouraged to explore Web-based interventions that parents could use at home, in collaboration with teachers, to support daily instruction. For example, the Waterford Early Reading Program and Headsprout are two seemingly user-friendly instructional programs that parents could implement at home. Moreover, there may be Web-based components of some of the intervention programs listed above that parents could implement at home while recording data that could be shared with the student's classroom teacher. Having parents implement interventions, and providing them the resources to do so, could create a bidirectional partnership that would likely enhance student learning.

 

Conclusion

 

From spreadsheets like those created with Microsoft Excel to advanced intervention, data-management, and communication systems, technology may well be the answer to successful RtI implementation. Although the basic tenets of RtI should always be in place, each school system will need to modify its implementation plan to address unique needs. Thus, some of the tools listed above might exactly address the needs of one school, while other approaches might be more advantageous for a different school. The first step toward implementation should be forming an implementation team (or task force) that includes principals, teachers, school psychologists, and parents. Ideally, the team should also include someone who is knowledgeable about technology and who can help review potential tools.

 

The focus of this article has been on the academic side of the tiered-intervention triangle. However, technology may also facilitate implementation of behavioral RtI systems. Practitioners are encouraged to explore technology alternatives for behavioral assessment and interventions. Two potential options could be Direct Behavior Ratings to collect behavioral data and School-Wide Information System to manage data. These two resources seem promising, but the review of behavioral tools is far less extensive than examinations of various technology-enhanced interventions or assessments for academic deficits.

 

There is still much to be learned about RtI implementation. Research addressing many of the important implementation issues is ongoing, but schools will have to answer some questions based on their own data and experience. Technology could provide a critical tool for RtI, but only if schools examine various applications, pilot them on a small scale while collecting data, and use the data to guide subsequent implementation decisions. Unfortunately, schools may be tempted by the marketing schemes of test and intervention publishers, which seems especially likely for technological applications, but the implementation process and student outcome data need to guide decisions. Moreover, schools need to commit appropriate resources to train staff in using the specific application, because relatively few teachers are well trained in technology and most applications bring complex implementation issues (Pfohl & Pfohl, 2008). Making RtI implementation easier while enhancing student learning should be the goal, and informed decisions and training make it an attainable one.

 

References

 

Blanchard, J., & Stock, W. (1999). Meta-analysis of research on a multimedia elementary school curriculum using personal and video-game computers. Perceptual and Motor Skills, 88, 329–336.

 

Burns, M. K., Appleton, J. J., & Stehouwer, J. D. (2005). Meta-analysis of response-to-intervention research: Examining field-based and research-implemented models. Journal of Psychoeducational Assessment, 23, 381–394.

 

Burns, M. K., Hall-Lande, J., Lyman, W., Rogers, C., & Tan, C. S. (2006). Tier II interventions within response-to-intervention: Components of an effective approach. Communiqué, 35(4), 38–40.

 

Christenson, S. L., & Sheridan, S. M. (2001). Schools and families: Creating essential connections for learning. New York: Guilford Press.

 

Eccles, J. S., & Harold, R. D. (1993). Parent–school involvement during the early adolescent years. Teachers College Record, 94, 568–587.

 

Ellis, A. K. (2005). Research on educational innovations (4th ed.). Larchmont, NY: Eye on Education.

 

Gansle, K. A., & Noell, G. H. (2007). The fundamental role of intervention implementation in assessing resistance to intervention. In S. Jimerson, M. K. Burns, & A. M. VanDerHeyden (Eds.), Handbook of response to intervention: The science and practice of assessment and intervention (pp. 244–254). New York: Springer.

 

Greenwood, C. R., Carta, J. J., & Atwater, J. (1991). Ecobehavioral analysis in the classroom: Review and implications. Journal of Behavioral Education, 1, 59–77.

 

Jeynes, W. H. (2007). The relationship between parental involvement and urban secondary school student academic achievement: A meta-analysis. Urban Education, 42, 82–110.

 

McIntire, T. (2002). The administrator’s guide to data-driven decision making. Technology & Learning, 22(11), 18–28, 32–33.

 

McLeod, S. (2005). Technology tools for data-driven teachers. Retrieved February 20, 2009, from Microsoft Innovative Teachers Thought Leaders.

 

Pfohl, W. F., & Pfohl, V. A. (2008). Best practices in technology. In A. Thomas & J. Grimes (Eds.), Best practices in school psychology (5th ed., pp. 1885–1900). Bethesda, MD: National Association of School Psychologists.

 

Pierce, D. (2005). Formative assessment rates high at FETC. Retrieved February 20, 2009, from eSchoolNews Online.

 

Reschly, A. (n.d.). Schools, families, and response to intervention. Retrieved February 13, 2009, from the RTI Action Network Web site.

 

Vogel, J. J., Vogel, D. S., Cannon-Bowers, J., Bowers, C. A., Muse, K., & Wright, M. (2006). Computer gaming and interactive simulations for learning: A meta-analysis. Journal of Educational Computing Research, 34, 229–243.

 

Wayman, J. C. (2005). Involving teachers in data-driven decision-making: Using computer data systems to support teacher inquiry and reflection. Journal of Education for Students Placed At Risk, 10, 295–308.

 

Ysseldyke, J. E. (2005). Assessment and decision making for students with learning disabilities: What if this is as good as it gets? Learning Disability Quarterly, 28, 125–128.

 

Ysseldyke, J., & Burns, M. K. (2009). Functional assessment of instructional environments for the purpose of making data-driven instructional decisions. In T. B. Gutkin & C. R. Reynolds (Eds.), The handbook of school psychology (4th ed.). Hoboken, NJ: Wiley.

 

Ysseldyke, J. E., & McLeod, S. (2007). Using technology tools to monitor response to intervention. In S. Jimerson, M. K. Burns, & A. M. VanDerHeyden (Eds.), Handbook of response to intervention: The science and practice of assessment and intervention (pp. 396–407). New York: Springer.

