Kathryn E. H. Race, President
Race & Associates, Ltd.
4430 N. Winchester Ave.
Chicago, IL 60640
(773) 878-8535 (Phone)
(773) 878-2746 (Fax)
Our Approach to Evaluation
As reflected on the first page of this website, we divide our evaluation work into three categories: program-theory-based evaluations; surveys supported by psychometrics (demonstrating the reliability and validity of measures); and mixed-methods approaches. Generally, we begin with a program model. We then support these activities with a mixed-methods methodology tailored to the particular needs of a given evaluation effort, selecting measures that align with program outcomes. We also assess the reliability and validity of specifically designed evaluation measures as the need arises for given programs.
First Understand the Program
We believe it is essential to understand the program being evaluated before a design is developed. We strive to work with program stakeholders who see the merit of spending initial planning time articulating the program in important ways. That is, starting with a program model (which is broader than a logic model), we articulate the core strategies that underlie the program, which provides an avenue to understand and see the logical links from these strategies to program outcomes. The model also helps us understand the context in which the program operates, such as the connection between the program and the mission of the organization, or the school environment, which can affect the program and its outcomes. In addition, an implemented program is likely to differ in many ways from the program as designed. Understanding the program as designed helps ensure that program modifications are made consciously, and understanding the implemented program helps us determine the level of intervention participants received, so that we can better interpret any gains related to the program.
We approach evaluation from a "methods-neutral" standpoint. That is, we seek to develop a program model that will drive the design and subsequent implementation of the evaluation. Selection of evaluation measures is guided by the program outcomes identified in the program model. Thus, we are likely to use a mixed-methods approach as suggested by the specific needs of the program and its assessment. Nearly all of the evaluation work we have conducted incorporates both qualitative and quantitative methods.
We take a hybrid, academic-business approach to evaluation work. For example, we appreciate the importance of accuracy while recognizing the need for efficiency. We have extensive experience in evaluation and applied research, having conducted evaluations in a variety of venues including safety and health, vocational services, healthcare, and education. The principal, Kathryn Race, is a long-standing member of the American Evaluation Association (AEA), and we use AEA's Guiding Principles to reflect on our practice and on evaluation issues that may arise. When needed, we assemble a team drawn from a network of experienced, credentialed colleagues to assist us in our work. And we seek to listen to our clients and be empathetic to program concerns while maintaining the integrity and quality of the evaluation work.
An effective evaluation begins with a strong understanding of the program and its expected outcomes. A program model directs the evaluation in ways that are known to, and guided by, the parties engaged in the program and its assessment. In addition, the selected measures align with program outcomes, which helps ensure that priority areas are assessed during the evaluation. Finally, an effective evaluation works with program stakeholders to facilitate the usefulness of the information learned, whether it relates to the program (and its underlying theory) or to the outcomes associated with program participation.
Effectiveness of a Project
Many evaluations can be quite challenging given modest evaluation budgets and time constraints. We gauge the effectiveness of a project by its ability to provide useful information about the program and its impact to clients and funder(s) while realistically staying within the budget, designated deadlines, and project time frame, and maintaining the quality and integrity of the evaluation.
Effectiveness of a Client Relationship
Much of our evaluation work is based on repeat business and word of mouth referrals.
We build strong working relationships with the clients and stakeholders associated with a given project. Clients are treated with respect and recognized for their individual expertise and professional knowledge. Critical listening is an important part of our approach: when engaged with clients, we work to understand the issues from their viewpoint while respecting our own evaluation knowledge and expertise. Also important is describing the scope and limitations of an evaluation so that expectations related to program outcomes and evaluation efforts are realistic.