Program Assessment Tool (PAT)
Programs are the foundation of Extension’s educational strategies. Yet, despite the extensive literature and expertise available, Extension faculty and administrators often find it difficult to assess a program’s stage of development. Extension organizations and educators often describe a program as “signature” without criteria defining what that term means. It is also difficult to take an objective view of a program and decide whether further resources are warranted to move it from a good idea with limited applicability to a statewide effort that meets a critical public need or issue. A national environmental and literature scan of Extension resources did not produce a tool that established criteria for making informed program assessments. For these and other reasons, the UME Program Assessment Tool (PAT) was developed.
The PAT is based on two well-known and widely used educational tools: rubrics and logic models. Like a rubric, the PAT provides criteria that can be used to help make decisions or judgments about where a program stands in the development process. Like a logic model, the PAT can be read from left to right: the emerging and developing stages appear on the left, where Extension efforts focus more on outputs, and the signature and evidence-based stages appear on the right, where the focus is on outcomes.
Impact Teams will use this tool to decide which programs will be sent forward for peer review for signature status, and to determine which emerging and developing programs will be priorities for further investment. Some programs will also be critiqued for evidence-based status.
Definitions of Terms
Many terms in this tool could be interpreted in multiple ways. For the purposes of the PAT, we’ve provided a short list of terms and our definitions.
- Curriculum - A specific learning program with targeted learners, goals and objectives, learning activities and materials.
- Educational Intervention - The programming done by Extension salaried and volunteer faculty and staff.
- Evaluation Methods - The evaluation strategies that will be used to determine program outcomes.
- Evaluation Use - What type of data will be collected and how it will be used.
- Needs Assessment - “A systematic way … for identifying education and training problems, needs, issues, and the like” (Caffarella, 2002, p. 123).
- Programs
- Informational - A UME-branded program that delivers research-based information.
- Developing - A UME-branded program in early stages of demonstrating its public value.
- Signature - A UME-branded, research-based program known for its demonstrated public value.
- Evidence-Based - A UME-branded program that can be replicated with similar outcomes, based on scientific measures of effect and judged by external reviewers to meet standardized assessments.
- Program Scholarly Outputs - Products that document the educational intervention, including theory, findings, and effectiveness measures. Refereed reviews are the gold standard for judging the quality of educational interventions.
- Research Base - The science of the curriculum content, delivery, and evaluation.
Instructions for Using PAT
A. Individuals, Impact Teams, and other teams should use the PAT under these conditions:
- When assessing a current program for the extent to which it meets the criteria and deciding
- what to do to strengthen the program to remain in that category, or
- if it’s time to end the program or hand it off to a non-Extension entity.
- When determining what would need to be done to advance the program into a next category.
B. In some cases, stakeholders and partners should be included in completing the assessment. In other cases, an external review may be helpful.
C. ALL boxes need to be checked for a program to meet the requirements of its category.
We recognize that programs are constantly evolving and go through cycles, perhaps moving forward and backward among the four categories we have established. Programs need to change as the needs of the individuals and communities we serve change. Program evaluations often bring forth evidence that program changes are needed. This understanding is best described in the Cornell Office for Research on Evaluation (CORE)’s The Guide to the Systems Evaluation Protocol (2012):
“Each iteration of a program is related to the program’s history but is also shaped by decisions based on new information about how and how well the program works, and about what is needed by the target audiences or community; and by purely external factors like funding availability. The process of evolution involves learning, changing, and ultimately strengthening the larger system as a program is run, evaluated and revised and re-run over time” (p. 18).
University of Maryland Extension Program Assessment Tool
Category | Informational | Developing | Signature | Evidence-Based |
---|---|---|---|---|
Needs Assessment: Fit with UME Mission (Program Design) | ▢ Represents an emerging public issue or need that could be addressed by UME. ▢ Based on some evidence of the issue and/or need. ▢ Included in at least one IEP. ▢ Not yet included in a TEP. ▢ Minimal or no specific UME funding or other resources dedicated to addressing the emerging issue or need through a formal UME program. | ▢ Represents a developing public issue or need that can be addressed by UME. ▢ Based on substantive evidence of the public issue or need AND the capacity of UME to make an impact. ▢ Included in multiple IEPs. ▢ Included in at least one TEP for development. ▢ Start-up UME funding or other resources committed to addressing the issue or need through a formal program. | ▢ Represents a priority of UME based on identified public issues and/or needs of the people of the state. ▢ Provides sufficient evidence of impact to justify commitment of resources to conduct the program. ▢ Defines the distinctiveness of UME from other organizations in addressing the public issue and/or particular need of the people of the state. ▢ Included in multiple IEPs across multiple disciplines. ▢ Identified as a signature program in at least one TEP. ▢ Adequate funding and other resources from UME and others to have an impact on the issue or need through a program that is known outside of UME among public decision-makers and the people of the state. | ▢ Represents one or more ongoing priorities of UME based on identified public issues and needs of the people of the state. ▢ Provides sufficient evidence to justify commitment of the resources needed to substantially address the issue or need over time. ▢ Documents the distinctiveness of UME from other organizations in addressing the public issue and/or particular needs of the people of the state or beyond. ▢ Included in multiple IEPs across multiple disciplines. ▢ Included as a signature program in at least one TEP. ▢ Adequate and sustained funding and other resources from UME and others, including states that replicate the program, to address the national issue or need and provide scientifically rigorous evidence of impact. |
Educational Intervention: Meets Critical Clientele Needs (Program Development) | ▢ Exchange of information to answer questions and address concerns. ▢ Information is transferred to the client for immediate use. ▢ Information is research-based. | ▢ Exchange of information is for immediate use and could lead to change over time in an individual’s knowledge, attitude, skills, and aspirations (KASA). ▢ Information and methods of teaching/learning are research- and theory-based. ▢ Contact time with client is of a short-to-medium duration and may be face-to-face and/or through different types of media. ▢ May involve key partners or stakeholders. | ▢ Exchange of information leads to documented change in an individual’s knowledge, attitude, skills, and aspirations (KASA). ▢ Exchange of information is used to aid in the solution of a public issue or need of individuals, families, and communities. ▢ Information and methods of teaching/learning are research- and theory-based. ▢ Contact time with client is of a medium-to-long duration and uses multiple methods of contact, including face-to-face and different types of media. ▢ Involves key partners and stakeholders. | ▢ Exchange of information leads to scientifically rigorous, documented change in an individual’s knowledge, attitude, skills, and aspirations (KASA) over time. ▢ Exchange of information is used to aid in the solution of a public issue or need of individuals, families, and communities. ▢ Information and methods of teaching/learning are research- and theory-based. ▢ Contact time with client is of a medium-to-long duration and uses multiple methods of contact, including face-to-face and different types of media. ▢ Involves key partners and stakeholders. ▢ Uses program strategies that have been scientifically tested and proven successful for public issues and needs of people. |
Curriculum | ▢ No curriculum. | ▢ Program curriculum under development is tested based on the UME Extension Curriculum Assessment Tool (CAT) and, when appropriate, the Materials Assessment Tool (MAT). ▢ Program curriculum changes have been made based on the UME Extension CAT and, when appropriate, the MAT. ▢ Curriculum has been pilot-tested using appropriate testing methods. ▢ If the curriculum is adapted from another source, it is subjected to the CAT and, if appropriate, the MAT, pilot-tested for appropriateness in the state, and modified as needed. | ▢ Program curriculum developed using the UME Curriculum Assessment Tool (CAT) review guidelines. ▢ Program curriculum adapted from another state has been peer-reviewed using the UME Extension CAT and, when appropriate, the MAT, and modified to meet Maryland needs. ▢ Curriculum has been both internally and externally peer-reviewed. ▢ Curriculum has been published with a UME signature-program endorsement. ▢ Curriculum is available to other states to use and adapt. | ▢ Program curriculum developed using the UME Curriculum Assessment Tool (CAT) review guidelines. ▢ Program curriculum adapted from another state has been peer-reviewed using the UME CAT and, when appropriate, the MAT. ▢ Curriculum produces evidence-based results. |
Research Base: Research & Scholarship (Program Development & Delivery) | ▢ Uses research-based information. | ▢ Theory- and research-based information is explicitly explained and incorporated into the development of the program. | ▢ Theory- and research-based information is used to explain impact measures and outcomes. ▢ Provides information that can be used to build additional intervention strategies and research questions. | ▢ Theory, research-based information, and empirical evidence are explicitly integrated in the explanation of program intervention impacts on intended outcomes. ▢ Program research results provide evidence to build additional theoretical models. ▢ Program research results provide evidence that allows further research funding to be generated. |
Program Scholarly Outputs | ▢ Program activities cited in CVs and annual faculty reports for merit review. | ▢ Program activities cited in CVs and annual faculty reports for merit review. ▢ Conference and professional association posters. ▢ Conference and professional association workshops and presentations based on preliminary data. ▢ Contributions to eXtension Communities of Practice (COP). ▢ UME peer-reviewed Extension Briefs and/or Factsheets. | ▢ Program scholarship findings cited in CVs and annual faculty reports for merit reviews. ▢ Program scholarship findings used in promotion and tenure packages for decisions about Senior or Principal Agent advancement and for merit reviews. ▢ Program results presented at professional association meetings, workshops, panels, and through other types of delivery methods, both refereed and non-refereed. ▢ Invited presentations and articles about program results. ▢ Contributions to eXtension Communities of Practice (COP). ▢ Refereed articles in subject-based journals. ▢ UME peer-reviewed Extension Briefs, Factsheets, Bulletins, Manuals, and Curricula. | ▢ Program scholarship findings cited in CVs and annual faculty reports for merit reviews. ▢ Program scholarship findings used in promotion and tenure packages for decisions about Senior or Principal Agent advancement and for merit reviews. ▢ Invited presentations and articles about program results from other states, regions, and countries. ▢ Evaluation results add to a national evidence-based database. ▢ Primary authorships in eXtension Communities of Practice (COP). ▢ Journal editorial board memberships. ▢ Refereed articles in highly acclaimed journals. ▢ UME peer-reviewed Extension Briefs, Factsheets, Bulletins, Manuals, and Curricula. ▢ Books or book chapters. |
Evaluation Use: Program Evaluation | ▢ Data collected and evaluated to determine participant knowledge gain and satisfaction level with the interaction experience. ▢ Evaluation results are used to communicate the reach of the Educator’s work. | ▢ Data collected and evaluated to determine participants’ short-term KASA outcomes and clientele satisfaction level with the interaction experience. ▢ Evaluation results used to determine program effectiveness and to communicate the effectiveness of the Educator’s work in meeting clientele needs. | ▢ Data collected and evaluated to determine medium-term outcomes achieved that benefit clientele and/or the community. ▢ Evaluation results used to communicate UME’s value in addressing societal, economic, and environmental needs. ▢ Evaluation results used to communicate the effectiveness of the Educator’s work in meeting clientele needs in Maryland. | ▢ Data collected and evaluated to determine long-term outcomes achieved that benefit clientele. ▢ Evaluation results used to communicate UME’s impact on compelling societal, economic, and environmental issues in Maryland. ▢ Evaluation results used to communicate state and national impacts on compelling societal, economic, and environmental issues. |
Evaluation Methods | ▢ End-of-session instruments used to determine client satisfaction. ▢ No IRB approval required if client satisfaction results will not be published. | ▢ Basic logic model developed. ▢ End-of-session instruments used for program improvement. ▢ Paired or unmatched pretest and posttest assessments of KASA changes. ▢ Qualitative methods incorporated where appropriate (structured observations, interviews). ▢ IRB approved. | ▢ Logic model is fully developed. ▢ End-of-session instruments used for program improvement. ▢ Paired or unmatched pretests and posttests for assessment of KASA changes. ▢ Qualitative methods incorporated where appropriate (structured observations, interviews). ▢ Follow-up survey research used to assess medium-term outcomes. ▢ Control and comparison groups used where appropriate. ▢ Findings are used to improve programs. ▢ Findings are peer-reviewed and published when appropriate. ▢ IRB approved. | ▢ Logic model is fully developed and tested for utility over time. ▢ Results of evaluations have been subjected to critical peer review. ▢ Empirical evidence exists about program effectiveness. ▢ Program results grounded in rigorous evaluations using experimental or quasi-experimental studies with randomized control groups. ▢ Program can be replicated by other states with confidence in program effectiveness. ▢ Findings are published in peer-reviewed journals and other publications. ▢ IRB approved. |
Adoption & Replication (Program Dissemination) | ▢ Potential for adoption and replication unknown. | ▢ Has potential to become a program that can be replicated by Extension or others in the state. | ▢ Recognized by respected agencies and organizations as an effective program. ▢ Adopted by other organizations or Extension services. | ▢ Program is promoted and adopted nationally as an empirically tested intervention with identified short-, medium-, and long-term outcomes. ▢ Program materials (curriculum, protocols, evaluation instruments) exist that make adoption and replication possible. |
Marketing & Communication (Program Dissemination) | ▢ No formal marketing plan, but the program is advertised at the local level through flyers, newspaper articles, newsletters, or word-of-mouth. | ▢ No formal marketing plan, but advertising has extended beyond the local community. | ▢ Formal marketing plan in place and evaluated for effectiveness. | ▢ Effective components of a formal marketing plan are used. |
Public Value (Program Dissemination) | ▢ Program value is evident to the individual participants using the information. | ▢ Program value is evident to the individual participants using the information and participating in the program. | ▢ Program’s value is evident to individuals, families, and the community-at-large. | ▢ Program’s value is evident to individuals, families, and the community-at-large. ▢ Program’s public value is determined by people or agencies outside of UME, using this assessment tool or an agency’s standardized tool and/or process for judging value. |
Sustainability (Organizational Commitment) | ▢ Minimum resources are required to initiate elements of a program. ▢ Internal resources used to launch the program. | ▢ Short-term resources committed from Impact Teams to assist the program in developing into a signature program. ▢ Short-term external funding secured to assist in developing the program. ▢ Potential partners identified. | ▢ Medium-term resources committed from the UME budget to support the program, pending evidence of potential for impact. ▢ External funders may be involved in ongoing support of the program. ▢ Partners involved in the program when appropriate. | ▢ Long-term funding in the UME budget due to evidence of impact. ▢ External, long-term funding or partners secured to maintain programming. ▢ National partners involved in the program when appropriate. |
References:
- Boyle, P. (1981). Planning better programs. New York: McGraw-Hill.
- Caffarella, R. S. (2002). Planning programs for adult learners. San Francisco: Jossey-Bass.
- Cornell Office for Research on Evaluation. (2012). The guide to the systems evaluation protocol. Ithaca, NY: Cornell Digital Print Services. Available from https://core.human.cornell.edu/research/systems/protocol/index.cfm
Acknowledgements:
The UME Program Assessment Tool (PAT) was developed by Teresa McCoy, Assistant Director, Evaluation & Assessment, and Dr. Bonnie Braun, Professor and Extension Family Policy Specialist, with the assistance of Nicole Finkbeiner, M.S., Graduate Research Assistant. This tool is based, in part, on the Curriculum Assessment Tool (CAT) and the Materials Assessment Tool (MAT) created by Bonnie Braun and Nicole Finkbeiner, November 2012. All three assessment tools are contained in the Extension Education Theoretical Framework Manual, to be published in 2013 by the University of Maryland Extension.
The PAT was reviewed as part of a formative evaluation by the following members of the UME Health Smart Team: Karen Aspinwall, Virginia Brown, Nancy Lewis and Elizabeth Maring.
The PAT was also reviewed by the UME program leadership team of Dr. Patsy Ezell, Assistant Director, Family & Consumer Sciences; Dr. Jeff Howard, Assistant Director, 4-H Youth Development; Dr. Andy Lazur, Assistant Director, Agriculture & Natural Resources; Dr. Doug Lipton, Director, Maryland Sea Grant Program; and Tom Miller, Assistant Director of Operations. The need for this tool was identified during the leadership of Dr. Nick Place, Associate Dean/Associate Director of UME, now the Dean and Director, University of Florida Institute of Food and Agricultural Sciences.
Program Assessment Worksheet
Category | Signature | Comments |
---|---|---|
Needs Assessment: Fit with UME Mission (Program Design) | ▢ Represents a priority of UME based on identified public issues and/or needs of the people of the state. ▢ Provides sufficient evidence of impact to justify commitment of resources to conduct the program. ▢ Defines the distinctiveness of UME from other organizations in addressing the public issue and/or particular need of the people of the state. ▢ Included in multiple IEPs across multiple disciplines. ▢ Identified as a signature program in at least one TEP. ▢ Adequate funding and other resources from UME and others to have an impact on the issue or need through a program that is known outside of UME among public decision-makers and the people of the state. | Meets Criteria: ▢ Yes ▢ Marginal ▢ No |
Educational Intervention: Meets Critical Clientele Needs (Program Development) | ▢ Exchange of information leads to documented change in an individual’s knowledge, attitude, skills, and aspirations (KASA). ▢ Exchange of information is used to aid in the solution of a public issue or need of individuals, families, and communities. ▢ Information and methods of teaching/learning are research- and theory-based. ▢ Contact time with client is of a medium-to-long duration and uses multiple methods of contact, including face-to-face and different types of media. ▢ Involves key partners and stakeholders. | Meets Criteria: ▢ Yes ▢ Marginal ▢ No |
Curriculum | ▢ Program curriculum developed using the UME Curriculum Assessment Tool (CAT) review guidelines. ▢ Program curriculum adapted from another state has been peer-reviewed using the UME Extension CAT and, when appropriate, the MAT, and modified to meet Maryland needs. ▢ Curriculum has been both internally and externally peer-reviewed. ▢ Curriculum has been published with a UME signature-program endorsement. ▢ Curriculum is available to other states to use and adapt. | Meets Criteria: ▢ Yes ▢ Marginal ▢ No |
Research Base: Research & Scholarship (Program Development & Delivery) | ▢ Theory- and research-based information is used to explain impact measures and outcomes. ▢ Provides information that can be used to build additional intervention strategies and research questions. | Meets Criteria: ▢ Yes ▢ Marginal ▢ No |
Program Scholarly Outputs | ▢ Program scholarship findings cited in CVs and annual faculty reports for merit reviews. ▢ Program scholarship findings used in promotion and tenure packages for decisions about Senior or Principal Agent advancement and for merit reviews. ▢ Program results presented at professional association meetings, workshops, panels, and through other types of delivery methods, both refereed and non-refereed. ▢ Invited presentations and articles about program results. ▢ Contributions to eXtension Communities of Practice (COP). ▢ Refereed articles in subject-based journals. ▢ UME peer-reviewed Extension Briefs, Factsheets, Bulletins, Manuals, and Curricula. | Meets Criteria: ▢ Yes ▢ Marginal ▢ No |
Evaluation Use: Program Evaluation | ▢ Data collected and evaluated to determine medium-term outcomes achieved that benefit clientele and/or the community. ▢ Evaluation results used to communicate UME’s value in addressing societal, economic, and environmental needs. ▢ Evaluation results used to communicate the effectiveness of the Educator’s work in meeting clientele needs in Maryland. | Meets Criteria: ▢ Yes ▢ Marginal ▢ No |
Evaluation Methods | ▢ Logic model is fully developed. ▢ End-of-session instruments used for program improvement. ▢ Paired or unmatched pretests and posttests for assessment of KASA changes. ▢ Qualitative methods incorporated where appropriate (structured observations, interviews). ▢ Follow-up survey research used to assess medium-term outcomes. ▢ Control and comparison groups used where appropriate. ▢ Findings are used to improve programs. ▢ Findings are peer-reviewed and published when appropriate. ▢ IRB approved. | Meets Criteria: ▢ Yes ▢ Marginal ▢ No |
Adoption & Replication (Program Dissemination) | ▢ Recognized by respected agencies and organizations as an effective program. ▢ Adopted by other organizations or Extension services. | Meets Criteria: ▢ Yes ▢ Marginal ▢ No |
Marketing & Communication (Program Dissemination) | ▢ Formal marketing plan in place and evaluated for effectiveness. | Meets Criteria: ▢ Yes ▢ Marginal ▢ No |
Public Value (Program Dissemination) | ▢ Program’s value is evident to individuals, families, and the community-at-large. | Meets Criteria: ▢ Yes ▢ Marginal ▢ No |
Sustainability (Organizational Commitment) | ▢ Medium-term resources committed from the UME budget to support the program, pending evidence of potential for impact. ▢ External funders may be involved in ongoing support of the program. ▢ Partners involved in the program when appropriate. | Meets Criteria: ▢ Yes ▢ Marginal ▢ No |
For further information, contact Dr. Debasmita Patra, Program Director, Evaluation and Assessment, University of Maryland Extension, 0322B Symons Hall, College Park, MD 20742, 301-405-0929, dpatra@umd.edu.