
Quality Assurance: Monitoring and Evaluation to Inform Practice and Leadership

Transformation Framework

[Figure: the ten components of education transformation — Establishing a Vision; Leadership and Policy; Developing a Learning Community; Curriculum and Assessment; Teacher and Leader Capacity; 21st Century Pedagogy; Quality Assurance: Monitoring and Evaluation to Inform Practice and Leadership; Designing Technology for Efficient and Effective Schools; Physical Learning Environments; Organizational Capacity, Strategic Planning and Quality Assurance; Partnerships and Capacity and Sustainability]

Introduction

This paper examines one of ten critical components of effective transformation in schools and education systems. Each paper is produced by an expert author, who presents a global perspective on their topic through current thinking and evidence from research and practice, as well as showcase examples. Together, the papers document the contributions of ‘anytime, anywhere’ approaches to K-12 learning and explore the potential of new technology for transforming learning outcomes for students and their communities.

What is the Education Transformation Framework?

This paper provides monitoring and evaluation guides and examples for leaders. Monitoring and evaluation is used by governments worldwide to improve school systems and educational results – and it can play an integral role in holistic education transformation.

The Microsoft Education Transformation Framework helps fast-track system-wide transformation by summarizing decades of quality research. It includes a library of supporting materials for ten components of transformation, each underpinned by an executive summary and an academic whitepaper detailing global evidence. This provides a short-cut to best practice, speeding up transformation and avoiding the mistakes of the past.
Microsoft also offers technology architectures and collaborative workshops to suit your needs.

Education leaders at all levels can benefit from applying the planning, monitoring and evaluation cycle and outcomes-based planning and evaluation to education transformation initiatives. Monitoring and evaluation can help educational transformation programs define and measure quality indicators and measures of the education transformation process, gauge progress toward desired educational outcomes, increase stakeholder participation, and empower school leaders and teachers to build and sustain transformation in schools.

As each educational system is unique, evaluators should be prepared to vary their evaluation approach based on program purpose and context. Technology is playing an increasingly important role in increasing data access, and serves as a tool for school leaders and teachers to inform instruction and improve student outcomes in education transformation initiatives.

About the author

Dr. Tom Clark
President, TA Consulting

Dr. Tom Clark, President of TA Consulting, provides evaluation, research, and strategic services for education clients. He has led evaluations of a wide range of online, blended and distance learning programs, from district-wide and state-wide programs to complex multi-state and postgraduate programs. He also provides strategic consultation, developing policy analysis and whitepapers for clients. Dr. Clark is an accomplished author, having co-written one of the first American textbooks on postsecondary distance learning, and authored one of the seminal works in K-12 online learning.

The first steps to success

How can I help ensure our transformation is a success?

When you’re looking to ensure a successful and sustainable education transformation initiative, monitoring and evaluation for quality assurance (M&E) plays an important role. According to James & Miller, “the M&E process should be an integral component of any planned ICT in education program and should be factored into planning before a project starts.”1 Furthermore, planning for M&E is considered one of the ten critical components needed to bring about educational transformation.2

What does monitoring and evaluation achieve?

M&E can help kickstart education programs, by:

• Developing clear, attainable outcomes and goals for education transformation, and flexible strategies for achieving them
• Promoting high levels of engagement by local school stakeholders
• Promoting ongoing communication about roles, expectations, progress, and performance
• Helping programs identify and remedy implementation problems early on

M&E can keep education programs on track, by:

• Monitoring program implementation and progress toward desired outcomes
• Helping sustain effective program implementation over time
• Documenting program success for educational stakeholders and funders
• Helping staff, teachers and partners learn from their experiences, allowing them to make more informed decisions, be accountable, and reposition their efforts.3

“The M&E process should be factored into planning before a project starts.” — James & Miller, 2005

1 James, T., & Miller, J. (2005). Developing a monitoring and evaluation plan for ICT. In Wagner, D. A., Day, B., Jones, T., Kozma, R. B., Miller, J., & Unwin, T. (Eds.), Monitoring and evaluation of ICT in education projects (pp. 32-42). Washington, DC: World Bank.
2 Cavanaugh, C., McCarthy, A., & East, M. (2014). An innovation framework for holistic school transformation: ten critical conversations for the 21st century. Redmond, WA: Microsoft World Public Sector.
3 United Nations Development Programme. (2009). Handbook on planning, monitoring and evaluating for development results. New York.

Management Strategies

Results-Based Management

What exactly do you mean? Together with planning, M&E creates the Planning, Management and Evaluation cycle. A number of management approaches incorporate this cycle. Results-Based Management (RBM) is one of the most widely known, used by many international development agencies. In management approaches like RBM, stakeholders create a vision, define desired results, plan the project, monitor implementation, then evaluate whether desired results were achieved and make improvements or changes as necessary. Here is an illustrated guide to the cycle, as used by the United Nations:

[Figure: the planning, monitoring and evaluation cycle — Setting the vision; Defining the results map and M&E framework; Planning for monitoring and evaluation; Managing and using monitoring and evaluation]

Planning is addressed in more depth in the first and second white papers, on Vision and on Enabling Transformation with Strategic Planning.

Monitoring

Monitoring may be defined as “an ongoing process by which stakeholders obtain regular feedback on the progress being made towards goals and objectives.” Monitoring is an important source of information for program evaluation.

Evaluation

Evaluation is defined by Patton as “the systematic collection of information to make judgments, improve program effectiveness and/or generate knowledge to inform decisions about future programs.”

“Poor implementation is usually a direct result of the ‘policy lever approach,’ but poor alignment is often blamed.”

Evaluation may be formative, providing feedback for improvement, or summative, assessing merit or worth.
It may be internal, conducted by program staff such as ‘M&E Officers’ in development programs, or external, conducted by outside evaluators who provide third-party validation or examine questions of special interest.

“Results-based management (RBM) is a management strategy that uses feedback loops to achieve strategic goals.”

Logical Framework Approach

A first step in building an effective education transformation program is defining desired results or outcomes (the measurable changes in people or organizations) that will help address critical needs or problems that fall within your organizational mission. Thinking about your program needs and desired results should precede planning of program, monitoring or evaluation activities.

Once you have desired outcomes that address the problem, it’s time to consider how the program will achieve them. “Identifying the theory underlying the program and the contextual factors that affect its operations and success is critical”.4 This realization led to the development of the logic model, which provides a theory of action for how the program is intended to work. Some agencies refer to logic models as logical frameworks or “logframes”.5

Here is an example logic model for an education transformation initiative that is tackling low educational attainment due to limited educational access. The example logic model for educational transformation shows a situation (what the program is intended to address), inputs (existing resources), processes (program activities and services), outputs (who and how many are served), and outcomes (short, medium, and long-range changes in people and organizations attributed at least in part to the program). Monitoring and evaluation schemes typically follow such an approach, whether or not they have an explicit logic model or logframe.
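The situation → inputs → activities → outputs → outcomes structure described above can be sketched as a simple data model. This is an illustration only; the class and field names below are not part of any M&E standard, and the example values are drawn from the logic model discussed in this paper:

```python
from dataclasses import dataclass, field

@dataclass
class LogicModel:
    """Illustrative logic model for a transformation program."""
    situation: str                                  # what the program addresses
    inputs: list = field(default_factory=list)      # existing resources
    activities: list = field(default_factory=list)  # program activities and services
    outputs: list = field(default_factory=list)     # who and how many are served
    outcomes: dict = field(default_factory=dict)    # short/medium/long-range changes

model = LogicModel(
    situation="Low educational attainment due to limited educational access",
    inputs=["Funding", "Staff", "Partners", "Facilities"],
    activities=["School leader training", "Teacher PD & peer mentoring"],
    outputs=["Students", "Teachers", "Leaders"],
    outcomes={
        "short": ["Teacher knowledge & skills"],
        "medium": ["School leader and teacher practices"],
        "long": ["Educational attainment"],
    },
)
print(model.outcomes["long"])  # ['Educational attainment']
```

Writing the model down in this explicit form makes the theory of action inspectable: each outcome can be traced back to the activities and inputs intended to produce it.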
A typical M&E scheme for ICT in education includes 1) input indicators, 2) process indicators, 3) outcomes, 4) assessment instruments, and 5) a monitoring and evaluation plan.6 The plan is the most critical part, as it includes a schedule for monitoring the indicators, administering the assessments and undertaking evaluation activities.

“A Logical Framework (Logframe) lays out the different types of events that take place as a project is implemented: Inputs, Processes, Outputs and Outcomes.”

[Figure: example logic model. Situation: low educational attainment due to limited educational access. Inputs (what fuels the program): funding, staff, partners, facilities. Activities (what the program does): school leader training; tech infrastructure building; teacher PD & peer mentoring; web & content development. Outputs (who the program serves): students, teachers, leaders, schools. Outcomes (improvements in): teacher knowledge & skills; student attitudes; school technology access; school leader and teacher practices; student learning; school policies; school innovation level; teacher retention; educational attainment; school effectiveness.]

Source for the planning, monitoring and evaluation cycle figure: United Nations Development Programme (2009), p. 10.

4 Wholey, J. S., Hatry, H. P., & Newcomer, K. E. (Eds.). (2010). Handbook of practical program evaluation (3rd ed.). San Francisco: Jossey-Bass.
5 World Bank. (2004). Monitoring & evaluation: some tools, methods & approaches. Washington, DC.
6 Rodríguez, P., Nussbaum, M., López, X., & Sepúlveda, M. (2010). A monitoring and evaluation scheme for an ICT-supported education program in schools. Educational Technology & Society, 13(2), 166–179.
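The five-part scheme above hinges on the plan's schedule: each indicator is paired with an assessment instrument and a collection frequency. The sketch below is a hypothetical illustration of that idea, not a prescribed format; all indicator names, instruments, and frequencies are invented examples:

```python
# Illustrative M&E plan: each entry pairs an indicator with its type
# (input, process, or outcome), its assessment instrument, and how
# often it is collected, in months. All values are examples only.
me_plan = [
    {"indicator": "Devices per student", "type": "input",
     "instrument": "Tech inventory audit", "frequency_months": 12},
    {"indicator": "Teacher PD hours delivered", "type": "process",
     "instrument": "Training logs", "frequency_months": 3},
    {"indicator": "Student attainment scores", "type": "outcome",
     "instrument": "Custom-designed assessment", "frequency_months": 6},
]

def indicators_due(plan, months_elapsed):
    """Return the indicators scheduled for collection at this point."""
    return [entry["indicator"] for entry in plan
            if months_elapsed % entry["frequency_months"] == 0]

print(indicators_due(me_plan, 3))   # ['Teacher PD hours delivered']
print(indicators_due(me_plan, 12))  # all three indicators are due
```

Even a minimal schedule like this makes the monitoring cadence explicit, which is what lets a program detect implementation problems early rather than at the final evaluation.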

What to watch out for

Challenges to effective monitoring and evaluation

While outcomes-based models and results-based management can be valuable tools, how they are implemented impacts their effectiveness as methods for managing education transformation. MANGO, a UK charity that works in international development, identifies a number of ways in which logic models/logframes and results-based management may fail to be used effectively:7

• Planners may assume that complex social issues can be reduced to a simple overarching problem, but often they cannot. Also, some important goals cannot be easily measured.
• Results and impact may be beyond the agency’s control, take years to achieve, and be influenced by many factors besides the program.
• There may be multiple viewpoints about problems and competing interests, not a single view or interest easily expressed as an outcome.
• Logframes may focus the project on the agency’s actions rather than the local program and the people served, and tend to exclude local people from planning, especially marginalized people.
• Initial plans are never completely accurate, and circumstances and priorities may change, but logframes may reduce flexibility to make changes later to fit project realities.
• Logframes are often not used by field staff after initial planning, because they do not fit how the project actually works on the ground.

Lessons learned and effective practices

To make monitoring and evaluation effective in education transformation:

• Desired program outcomes should be developed with local school leader and teacher input, be realistic, and, where possible, within the control of the program.
• Strategies and activities selected to attain desired outcomes should be flexible and open to revision as needed by empowered local program managers and school leaders to reflect the evolving program in practice.
• Monitoring and evaluation in local schools should be participatory, to build local buy-in and capacity to sustain an effective program.
• Planning for monitoring and evaluation should start early on. For example, identifying key data indicators and how data will be gathered.
• Finally, the program can benefit from distributing information on these measures throughout the program’s implementation, rather than just at the end.8

Role of monitoring and evaluation in government

“Governmental performance in Australia, Canada, Ireland, and the U.S. shows a shared emphasis on monitoring outcomes and outputs, rather than activities.”

According to Kozma & Wagner’s analysis of prior international and national monitoring and evaluation studies of ICT use, some of the most effective practices for evaluators are as follows:8

• Program evaluations should concentrate on [outcome] measures of student and teacher learning. The most sensitive are those custom-designed for the program. Include measures of learning that are likely to result from the systematic use of ICT.
• Evaluators need to document and measure baseline inputs to the program.
• Evaluators need to acknowledge and describe the educational, technological, social, and economic factors that enable and constrain the program.
• Direct measures of M&E indicators are the most reliable sources of information