Since its launch 3 decades ago, the Center for Community Health and Evaluation has conducted more than 300 evaluations, with its evaluators traveling by plane, train, car, and dog sled to assist communities in 48 states.
The center's origins can be traced to 1990, when the Group Health Center for Health Promotion was funded by the Kaiser Family Foundation to contribute to the evaluation of a national health promotion initiative. In the years since, it has grown from an informal interest group of researchers and health promotion staff at Group Health Cooperative into a dedicated team of 18 staff and local evaluators that comprises a department within Kaiser Permanente Washington Health Research Institute.
Today CCHE works with foundations, nonprofit organizations, health care providers, and government agencies to improve the health of communities through collaborative approaches to planning, assessment, and evaluation.
To mark its 30th anniversary, we spoke with CCHE director Maggie Jones, MPH, and Allen Cheadle, PhD, a KPWHRI senior investigator who was CCHE director from 2011 to 2018, about its current mission, some of its past highlights, and challenges ahead.
Q: What sets apart community health evaluations from other types of studies?
AC: What launched CCHE was a 5-year study of community health promotion funded by the Kaiser Family Foundation. It was designed as a randomized trial with 22 communities, essentially trying to treat a community the way you would treat a person in a drug trial. The “pill” in this case was $150,000 to form a coalition and carry out community health promotion activities. Our biggest lesson was probably that randomized controlled trials are not the best way to understand the impact of and learn from community health investments, and that a different approach was needed.
MJ: CCHE was aware from its beginnings that the context of community work is more complicated. It’s very difficult to meaningfully quantify the results in the timeframe of most program funding cycles. You can’t control all of the factors that are in play, and as a result you can’t attribute the results to any one specific intervention. We use a logic model or theory of change to understand the contribution and focus on shorter-term outcomes so that we can assess aspects of the intervention that are more likely to change during the grant or investment period.
Q: Do CCHE’s evaluations also differ in terms of your relationship with the subjects?
MJ: While research is often driven by investigator inquiry, our work is driven by the questions of our partners/clients about their work. Our primary goal is to provide them with useful, reliable data and insights to improve their interventions and inform future investments or programming.
Q: And I guess you don’t just publish your findings and leave it at that, true?
MJ: We focus on using data for learning and improvement, and as such we need to design our evaluations and reporting to provide regular—near real-time—feedback. This requires more rapid cycle data collection, analysis, and reporting than what typically happens on a research study.
Q: So, what are some examples of your work helping to effect change?
AC: The work we did with the Kaiser Permanente Community Health Initiative, a 15-year community-based obesity prevention initiative, contributed to our knowledge of how to do effective local policy and environmental change work. What we learned was published in a special 12-article supplement to the May 2018 issue of the American Journal of Preventive Medicine: “Building Thriving Communities Through Comprehensive Community Health Initiatives: Evaluations from 10 Years of Kaiser Permanente’s Community Health Initiative to Promote Healthy Eating and Active Living.”
There was a menu of strategies, ranging from changing what’s stocked in grocery stores to improving the food served at schools to community-wide media campaigns. In all, we tried to fit together more than 20 different strategies across some 60 communities. One of the findings was the importance of focusing on youth in schools for population-level impact, particularly on physical activity. All the observed population health changes related to the presence of strong, or what we call “high-dose,” interventions took place in schools, not community settings.
Q: What do you mean by “dose”?
AC: CCHE has devised a number of approaches to evaluating community health initiatives, and “dose” is a significant one. We have a toolkit describing it that we can share with partners.
Dose methods give us a way to add and compare different kinds of community strategies using a common yardstick to estimate impacts. For example, using dose we can compare a strategy like building more sidewalks to increase walkability (a high-reach, but low-strength strategy) to a strategy like a walking group that meets every day (low-reach, but high-strength). Dose lets us add up the impact of different strategies that target the same outcome and group of people. And it gives us a way of talking about how to increase the impact of our strategies—increasing the number of people reached and/or finding ways to make them stronger.
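The arithmetic behind the dose idea can be sketched in a few lines. In CCHE’s published framing, a strategy’s population dose combines its reach (the share of the target population exposed) with its strength (the expected effect per exposed person), and doses of strategies aimed at the same outcome and population can be summed. The numbers below are hypothetical, chosen only to illustrate the high-reach/low-strength versus low-reach/high-strength comparison from the example above:

```python
# Illustrative sketch of the "dose" idea: dose = reach x strength.
# Reach: fraction of the target population exposed to a strategy.
# Strength: expected effect size per exposed person.
# All numbers below are hypothetical, for illustration only.

def dose(reach: float, strength: float) -> float:
    """Population dose of a single strategy."""
    return reach * strength

# A high-reach, low-strength strategy (e.g., building more sidewalks):
sidewalks = dose(reach=0.60, strength=0.02)

# A low-reach, high-strength strategy (e.g., a daily walking group):
walking_group = dose(reach=0.05, strength=0.30)

# Strategies targeting the same outcome and population can be summed,
# giving one yardstick for the combined initiative.
total = sidewalks + walking_group
print(f"sidewalks: {sidewalks:.3f}, walking group: {walking_group:.3f}, total: {total:.3f}")
```

On these made-up numbers, the daily walking group contributes slightly more dose than the sidewalks despite reaching far fewer people, and the framework suggests two levers for increasing impact: raise a strategy’s reach or raise its strength.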
Q: There’s a great graphic timeline that your colleague Carol Cahill produced that shows the major projects that CCHE has been involved in since its inception. Looking at the chart, how has the organization evolved?
MJ: In the early days, most of our evaluations were large, multi-site coalition-based health improvement efforts. While we continue to evaluate and learn from coalition-based initiatives, we have also expanded to working with other types of organizations (e.g., safety net clinics) and on other topics (e.g., community development, trauma-informed systems). We recently posted some lessons we’ve learned from working with groups trying to improve screening for adverse childhood experiences (ACEs) in pediatric clinical settings.
Another big shift has been increased attention to using evaluation to inform strategy, which requires our team to have strong skills in mixed-methods evaluation, but also to be comfortable facilitating learning and strategic planning so that we can translate our results into strategy recommendations. We’ve expanded our toolbox of interactive learning strategies and data visualization to share results in a way that prompts action.
Q: It’s like the old proverb. Give a man a fish, he eats for a day. Teach him to fish, he eats for life. What does such a success look like for a CCHE client?
MJ: We worked with the Healthcare Georgia Foundation to build the Georgia Evaluation Resource Center, which helps health-related nonprofits demonstrate the impact they have in their communities, along with informing the foundation’s grant decisions. This entailed building processes to ensure that evaluation was embedded into all aspects of the grantmaking process and developing a training and coaching curriculum to bolster the evaluation capacity of its grantees. CCHE worked with them to create an online toolkit; to provide guides to using logic models and conducting focus groups; and to make available other evaluation tools. As a result, the Foundation is better able to understand the difference that their grantees are making in the health of their communities.
Q: That foundation is one of the largest funders of community health programs in Georgia. Your work is often associated with Kaiser Permanente initiatives. How common is it for you to work with other partners?
MJ: It’s a good mix. We have been working with Kaiser Permanente to evaluate its national and regional community health investments since 2003, and that continues to be a substantial part of our project portfolio. We also work closely with health foundations, nonprofits, and governmental agencies that are investing in community health. For example, we have an evaluation funded by the Robert Wood Johnson Foundation with the National 4-H Council that supports rural health coalitions and youth leadership to advance a culture of health. We had a project with the Washington State Health Care Authority to establish and develop 9 regional, multi-sector collaborative organizations called Accountable Communities of Health (ACHs), and we continue to provide evaluation support to individual ACHs. We’ve worked with foundations that invest in health, such as the California Endowment, the Kresge Foundation, and the Foundation for a Healthy Kentucky, as well as national and local public health agencies, such as the Centers for Disease Control and Prevention and Public Health–Seattle & King County.
Q: CCHE, like everyone, is facing new challenges with the coronavirus. What are you doing?
MJ: We are adjusting projects in the community and with safety net clinics to be supportive of their responses to the pandemic. While what that means for our work is still emerging, we’ve been talking with our partners about how our evaluations can be adapted to answer questions that are relevant to this time of crisis and response. We have been exploring how to support learning and peer sharing during the response and, for some evaluations, are planning to facilitate postmortems once the acute response has ended to capture lessons learned. We’ve also been extending timelines and altering data collection plans to be sensitive to the priorities of communities right now.
Q: I can’t end the interview without asking you about the dogsleds CCHE evaluators have used as part of their work. What was that about?
AC: We were engaged in a project to understand innovative approaches to dental care. One program in particular was being implemented only in a tribal clinic in rural Alaska. It was quite a harrowing flight to get there; our small plane pulled scarily close to a mountain as it descended to land during a storm. Our staff then got in a dog sled to reach the clinic. Visiting the clinic offered helpful insights about the role mid-level dental practitioners can play in remote areas.
MJ: A lot of our work requires going to the community. You can talk to people on the phone, but you can't really understand what the situation is in that community until you go there and see it for yourself. You don't really understand, for instance, what a community garden looks like in different places: It can be a small plot in the inner city, or it could be an acre of farmland. There's such a value to understanding the context, which is why we have had local evaluators in places like Kentucky and Georgia and California in the past, as well as sent our staff to visit places like the clinic in rural Alaska. We believe that a boots-on-the-ground perspective is really important to effectively evaluate the impact of community investments.