What should evaluations include?

Do you see evaluation as an invaluable tool to improve your program? Or do you find it intimidating because you don't know much about it? The purpose of this introductory section is to provide you with some useful background information on evaluation. Evaluation is a process that critically examines a program.

Why conduct evaluations?

It is important to periodically assess and adapt your activities to ensure they are as effective as they can be. Evaluation can help you identify areas for improvement and ultimately help you realize your goals more efficiently. Additionally, when you share your results about what was more and less effective, you help advance environmental education.

The information you collect allows you to better communicate your program's impact to others, which is critical for public relations, staff morale, and attracting and retaining support from current and potential funders.

Evaluations fall into one of two broad categories: formative and summative. Formative evaluations are conducted during program development and implementation and are useful if you want direction on how to best achieve your goals or improve your program.

Summative evaluations should be completed once your programs are well established and will tell you to what extent the program is achieving its goals.

For additional information on the differences between outcomes and impacts, including lists of potential EE outcomes and impacts, see MEERA's Outcomes and Impacts page.

A well-planned and carefully executed evaluation will reap more benefits for all stakeholders than an evaluation that is thrown together hastily and retrospectively. Though you may feel that you lack the time, resources, and expertise to carry out an evaluation, learning about evaluation early on and planning carefully will help you navigate the process. MEERA provides suggestions for all phases of an evaluation.

But before you start, it will help to review the following characteristics of a good evaluation (a list adapted from a resource formerly available through the University of Sussex Teaching and Learning Development Unit's Evaluation Guidelines and from John W. Evans' Short Course on Evaluation Basics). Your evaluation should be crafted to address the specific goals and objectives of your EE program. It is likely, however, that other environmental educators have created and field-tested similar evaluation designs and instruments.

Rather than starting from scratch, looking at what others have done can help you conduct a better evaluation. Input should also be sought from all of those involved in and affected by the evaluation, such as students, parents, teachers, program staff, or community members. This ensures that diverse viewpoints are taken into account and that results are as complete and unbiased as possible. One way to ensure your evaluation is inclusive is by following the practice of participatory evaluation.

Evaluation results are likely to suggest that your program has strengths as well as limitations. Your evaluation should not be a simple declaration of program success or failure. Evidence that your EE program is not achieving all of its ambitious objectives can be hard to swallow, but it can also help you learn where to best put your limited resources.

Outcomes are the impacts on the customers or clients receiving services, that is, the benefits they gain from participating in the program. Often, management wants to know everything about their products, services, or programs. However, limited resources usually force managers to prioritize what they need to know to make current decisions.

Your program evaluation plans depend on what information you need to collect in order to make major decisions.

Usually, management is faced with having to make major decisions due to decreased funding, ongoing complaints, unmet needs among customers and clients, the need to polish service delivery, and so on. For example, do you want to know more about what is actually going on in your programs, whether your programs are meeting their goals, or the impact of your programs on customers?

You may want other information or a combination of these. Ultimately, it's up to you. There are trade-offs, too, in the breadth and depth of information you get. The more breadth you want, the less depth you usually get, unless you have a great deal of resources to carry out the evaluation. On the other hand, if you want to examine a certain aspect of a program in great detail, you will likely not get as much information about other aspects of the program.

Those starting out in program evaluation, or who have very limited resources, can use various methods to get a good mix of breadth and depth of information. They can understand more about certain areas of their programs without going bankrupt doing so.

Consider the following key questions when designing a program evaluation:

1. For what purposes is the evaluation being done, i.e., what do you want to be able to decide as a result of the evaluation?
2. Who are the audiences for the information from the evaluation, e.g., funders, management, staff, clients?
3. From what sources should the information be collected, e.g., staff, clients, program documentation?
4. How can that information be collected in a reasonable fashion, e.g., questionnaires, interviews, observation, reviewing documentation?
5. When is the information needed (so, by when must it be collected)?
6. What resources are available to collect the information?

When designing your evaluation approach, it may be helpful to review the following three types of evaluations, which are rather common in organizations. Note that you should not design your evaluation approach simply by choosing which of the three types you will use; design it by carefully addressing the key considerations above.
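One lightweight way to work through these questions is to write the answers down as a simple, structured plan before any data are collected. The sketch below is only illustrative; the field names and example answers are assumptions made for demonstration, not part of any standard evaluation framework.

```python
# Illustrative sketch: recording answers to the key design questions as a simple plan.
# Field names and example values are assumptions for demonstration only.

evaluation_plan = {
    "purpose": "Decide whether to expand the after-school program next year",
    "audiences": ["board", "funders", "program staff"],
    "information_sources": ["participating students", "teachers", "attendance records"],
    "collection_methods": ["questionnaire", "interviews", "documentation review"],
    "needed_by": "end of the current program year",  # when decisions must be made
    "available_resources": "one staff member, two days per week for a month",
}

# A quick completeness check before starting data collection.
missing = [field for field, value in evaluation_plan.items() if not value]
print("Plan is complete." if not missing else "Still to answer: " + ", ".join(missing))
```

Keeping the plan this explicit makes it easier to notice, before any data are gathered, which of the key questions still have no answer.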

Often programs are established to meet one or more specific goals. These goals are often described in the original program plans. Goal-based evaluations assess the extent to which programs are meeting predetermined goals or objectives. Questions to ask yourself when designing an evaluation to see if you reached your goals include:

1. How were the program goals and objectives (if applicable) established? Was the process effective?
2. What is the status of the program's progress toward achieving the goals?
3. Will the goals be achieved according to the timelines specified in the program implementation or operations plan? If not, then why?
4. Do personnel have adequate resources (money, equipment, facilities, training, etc.) to achieve the goals?
5. How should priorities be changed to put more focus on achieving the goals? (Depending on the context, this question might be viewed as a program management decision more than an evaluation question.)
6. How should timelines be changed? (Be careful about making these changes; know why efforts are behind schedule before timelines are changed.)
7. How should goals be changed? (Be careful about making these changes; know why efforts are not achieving the goals before changing the goals.) Should any goals be added or removed?
8. How should goals be established in the future?

Process-based evaluations are geared to fully understanding how a program works -- how it produces the results that it does. These evaluations are useful if programs are long-standing and have changed over the years, if employees or customers report a large number of complaints about the program, or if there appear to be large inefficiencies in delivering program services. They are also useful for accurately portraying to outside parties how a program truly operates, e.g., for replication elsewhere.

There are numerous questions that might be addressed in a process evaluation. These questions can be selected by carefully considering what is important to know about the program. For example:

1. What is required of employees in order to deliver the product or services?
2. How are employees trained to deliver the product or services?
3. How do customers or clients come into the program?
4. What is required of customers or clients?
5. How do employees select which products or services will be provided to the customer or client?
6. What is the general process that customers or clients go through with the product or program?
7. What do customers or clients consider to be strengths of the program?
8. What do staff consider to be strengths of the product or program?

Program evaluation with an outcomes focus is increasingly important for nonprofits and is often asked for by funders. An outcomes-based evaluation helps you ask whether your organization is really doing the right program activities to bring about the outcomes you believe (or, better yet, have verified) to be needed by your clients, rather than just engaging in busy activities that seem reasonable to do at the time.

Outcomes are benefits to clients from participation in the program. Outcomes are often confused with program outputs or units of service, e.g., the number of clients who went through a program. The following is a top-level summary of the approach. To accomplish an outcomes-based evaluation, you should first pilot, or test, the approach on one or two programs at most before applying it to all programs. The general steps follow. First, identify the major outcomes that you want to examine or verify for the program under evaluation.

You might reflect on your mission (the overall purpose of your organization) and ask yourself what impacts you will have on your clients as you work toward that mission. For example, if your overall mission is to provide shelter and resources to abused women, ask yourself what benefits those women will gain if you effectively provide them shelter and other services or resources.

As a last resort, you might ask yourself, "What major activities are we doing now?" and work backward from those activities to the outcomes they seem intended to produce. This "last resort" approach, though, may just end up justifying ineffective activities you are doing now, rather than examining what you should be doing in the first place. Next, choose the outcomes that you want to examine, prioritize them and, if your time and resources are limited, pick the top two to four most important outcomes to examine for now. For each outcome, specify what observable measures, or indicators, will suggest that you're achieving that key outcome with your clients.

This is often the most important and enlightening step in outcomes-based evaluation. However, it is often the most challenging and even confusing step, too, because you're suddenly going from a rather intangible concept, such as a benefit to clients, to concrete, observable indicators of that benefit.

It helps to have a "devil's advocate" during this phase of identifying indicators, i.e., someone who challenges whether each proposed indicator really reflects the outcome. Next, specify a "target" goal for clients, i.e., what number or percent of clients you aim to achieve the outcome with, and identify what information is needed to show these indicators, e.g., data on how many clients in the target group actually achieved the outcome. If your program is new, you may also need to evaluate the process in the program to verify that the program is indeed carried out according to your original plans. Michael Patton, a prominent researcher, writer, and consultant in evaluation, suggests that the most important type of evaluation to carry out may be this implementation evaluation, verifying that your program ended up being implemented as you originally planned.
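To make the indicator-and-target idea concrete, here is a minimal sketch of how observed indicator counts could be checked against target percentages. The outcome names, targets, and client counts are hypothetical assumptions for illustration only, not figures from any real program.

```python
# Minimal sketch: checking hypothetical outcome indicators against target goals.
# All outcomes, targets, and counts below are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class OutcomeIndicator:
    outcome: str            # the client benefit being examined
    indicator: str          # the observable measure chosen for it
    target_percent: float   # the "target" goal, as a percent of clients
    clients_total: int      # clients measured during the evaluation period
    clients_achieving: int  # clients who met the indicator

    def achieved_percent(self) -> float:
        return 100.0 * self.clients_achieving / self.clients_total

    def target_met(self) -> bool:
        return self.achieved_percent() >= self.target_percent


indicators = [
    OutcomeIndicator(
        outcome="Increased self-sufficiency",
        indicator="Client secures stable housing within six months",
        target_percent=70.0,
        clients_total=60,
        clients_achieving=45,
    ),
    OutcomeIndicator(
        outcome="Improved safety planning",
        indicator="Client completes a written safety plan",
        target_percent=90.0,
        clients_total=60,
        clients_achieving=51,
    ),
]

for ind in indicators:
    status = "met" if ind.target_met() else "not met"
    print(f"{ind.outcome}: {ind.achieved_percent():.0f}% achieved "
          f"(target {ind.target_percent:.0f}%) -> target {status}")
```

A summary like this also makes it easy to report, outcome by outcome, how far short of (or beyond) each target the program landed.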

Decide how that information can be efficiently and realistically gathered (see Selecting Which Methods to Use below). Consider program documentation, observation of program personnel and clients in the program, questionnaires and interviews about clients' perceived benefits from the program, case studies of program failures and successes, etc.

You may not need all of the above. Finally, analyze and report the findings (see Analyzing and Interpreting Information below). The major methods used for collecting data during evaluations include the questionnaires, interviews, documentation review, observation, and case studies mentioned above; also consider Appreciative Inquiry and careful survey design. Note that if your evaluation will focus on and report personal information about the customers or clients participating in it, you should first gain their consent to do so.

They should understand what you're doing with them in the evaluation and how any information associated with them will be reported. You should clearly convey terms of confidentiality regarding access to evaluation results.

They should have the right to decide whether or not to participate. Have participants review and sign an informed consent form.

The overall goal in selecting evaluation methods is to get the most useful information to key decision makers in the most cost-effective and realistic fashion. Consider, for example: What information is needed to make current decisions about a product or program?
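Whichever methods you choose, the raw responses eventually have to be condensed into something decision makers can act on. The sketch below is purely illustrative; the questionnaire items, rating scale, and responses are made-up assumptions, not data from any actual evaluation.

```python
# Illustrative sketch: summarizing hypothetical questionnaire results for decision makers.
# The questions and responses below are invented for demonstration only.

from statistics import mean

# Responses on a 1-5 scale (1 = strongly disagree, 5 = strongly agree).
responses = {
    "The program met my needs": [4, 5, 3, 4, 5, 2, 4],
    "Staff were well prepared": [5, 5, 4, 4, 3, 5, 4],
    "I would recommend the program": [4, 3, 4, 5, 4, 4, 2],
}

for question, scores in responses.items():
    pct_agree = 100 * sum(s >= 4 for s in scores) / len(scores)
    print(f"{question}: mean {mean(scores):.1f}/5, "
          f"{pct_agree:.0f}% agree or strongly agree")
```

Even a simple per-question summary like this is usually enough to show decision makers where a program is strong and where it needs attention.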

Furthermore, this outburst could be overheard from the reception room. If this occurs again, a report will be written up and placed in your file. Do you understand the importance of this?

Written warning. How you handle the written warning plays a critical role in the success of your disciplinary and termination procedures. This is the time to make it clear to the employee just how serious his or her performance problem is.

Once mishandled in this way, the written warning no longer has any merit. A standard written warning form should include the following: a description of the behavior or problem, including objective findings; the measurable actions and changes expected of the employee; the support the employer will provide for improvement; a description of what will occur (e.g., termination) if the problem is not corrected; and the signature of the employee and the appraiser, along with the date of the warning.

If it comes to termination, explain the reason for the termination, but do so briefly and objectively to avoid getting into an elaborate discussion that puts you in a defensive position.

Also, let the employee know what will become of any accrued vacation or sick leave, pension benefits, etc. Finally, ask whether the employee has any further questions, and then assist the employee in retrieving all of his or her belongings and leaving with as much dignity as possible.

Set an evaluation schedule. However you decide to schedule the evaluations, ensure that each appraiser consistently meets the deadline.

A performance evaluation system should be a key component of your practice structure.
