Standard 5.3 Program Evaluation

Candidates design and implement program evaluations to determine the overall effectiveness of professional learning on deepening teacher content knowledge, improving teacher pedagogical skills, and/or increasing student learning. (PSC 5.3/ISTE 4c)
This artifact is a current reality evaluation that I completed for my school in October 2016 for ITEC 7460. The evaluation contains an overview and assessment of the initiatives and professional development currently in use at McConnell Middle School. The evaluative portion of the document presents evidence of our professional development's efficacy with regard to deepening teacher content knowledge, improving teacher pedagogical skill, and/or increasing student learning. The document also offers recommendations to enhance our professional development programs.
Standard 5.3, Program Evaluation, is concerned with the design and implementation of program evaluations to determine the effectiveness of professional learning experiences. The artifact further details the default design of most of our evaluations, which always combine raw data and surveys with other metrics such as observations and focus groups. This combination is, I believe, much more powerful than numbers alone, because it allows us to paint a more detailed and holistic picture of what is generally a complex and nuanced situation.
The creation of this document helped make me a better technology administrator by providing an inventory of current initiatives and giving me a bird's-eye view of a process I am normally buried inside, where the light is poor and the results too subjective. The other advantage of conducting this analysis was the perspective I gained from reviewing the evaluation methodology itself for what worked and what did not, rather than only reviewing the subject of the process. The staff benefited from this review as well, since the better our evaluation of training becomes, the more likely the training is to improve and respond to their needs. In the future, I hope to streamline the feedback process so that suggestions and enhancements can be folded into initiatives in a more timely manner. This should also encourage more participation in the feedback process, as stakeholders see swift action taken on their concerns.
While professional development remains a crucial part of most technology initiatives, its role and methods should be as dynamic as the people and the technology it serves. There must be constant assessment to ensure that dollars and time are being spent wisely. The movement toward individualized coaching and away from large-scale professional development, which this evaluative process seems to support, is, I believe, a simple assertion that the quality of development is far more important than the quantity. This experience helped me improve my own practice, which, in turn, improves instruction in the school as a whole. Even teachers who were initially uncomfortable with small-group or individualized instruction have begun to adjust and embrace this new model, sometimes without even being aware of what they are embracing. For example, a couple of teachers on staff are now implementing and loving technology strategies that they learned from another teacher, without being burdened by the knowledge that their instructor learned the strategy from me. This has been very important to the dispersion of knowledge and skills here, as there will always be a handful of staff who either do not click with my style or have other reasons not to accept my help. However, as long as they learn, it does not matter who they learn from. Of course, we will need to wait for staff survey results, lab reports, and student account usage metrics to tell us how much of an impact this change has made this year. My strong hunch (and current anecdotal evidence) is that we will see a marked increase in all of the above.