Beyond Compliance: Making Sense of Your ACGME Survey Data

December 11, 2025

Every spring, the arrival of the ACGME Resident/Fellow and Faculty Survey results marks a pivotal moment for GME leadership. For Designated Institutional Officials (DIOs), Program Directors, and institutional administrators, these results offer more than compliance data—they provide a timely reflection of how residents, fellows, and faculty experience the learning environment.

Survey findings highlight key strengths and areas of concern across five core areas of the clinical learning environment. When approached with clarity and purpose, survey results become essential tools for assessing your GME program culture and identifying growth opportunities.

This article outlines a practical approach to interpreting your ACGME survey results, offering strategies for institutional review, stakeholder communication, and continuous improvement planning.

Reframing the Purpose: How To Read ACGME Results

Before diving into numbers, pause to remember the “why.” The ACGME surveys exist to monitor the educational environment and promote continuous improvement—not to grade programs. A healthy first step is shifting from defensive review to curious interpretation.

As we explored in the Partners® Pulse article “Powering Excellence,” data has value only when it drives reflection. The same principle applies here. Approach the results with curiosity: What are they telling us about our learners’ experience, our culture, and our systems?

As highlighted in our January webinar, “Effective GME Oversight for Site Visits,” compliance alone isn’t the goal. Oversight that focuses on curiosity, connection, and continuous improvement transforms survey data into insight—and insight into change. 

What the ACGME Surveys Really Measure

The ACGME Resident and Faculty Surveys measure five core aspects of the clinical learning environment: supervision, teamwork, professionalism, well-being, and communication. These domains align with the ACGME Common Program Requirements and reflect the day-to-day experiences of residents, fellows, and faculty, not just program structure.

Understanding those domains helps ensure that the discussion stays focused on meaning, not metrics. Consider sharing the full list of domains with your leadership and faculty teams. Connecting each question to the ACGME Common Program Requirements builds transparency and demystifies what the data actually reflects.

Related Webinar: Promoting IPE in GME explores how teamwork and collaboration appear within survey domains.

How To Interpret ACGME Survey Data & Avoid Common Traps

To interpret ACGME survey data, look for multi-year patterns rather than focusing on a single year’s scores. Long-term trends offer a clearer picture of your program’s cultural trajectory and whether improvements are sustained over time.

Programs sometimes fall into “data whiplash”—reacting strongly to small shifts year to year. But surveys are snapshots influenced by timing, cohort size, and local context. One year’s data rarely represents a trend; two to three years of data do.

As we discussed in Partners® Pulse: “Let’s Get Ahead of Ourselves,” the story lies in where your culture is heading, not where it stood this year. 

Setting Up an Institutional ‘First Look’ at Your GME Program’s Survey Results

Before sharing results widely, consider an internal “first look” meeting among your GME Office, Program Directors, and key stakeholders. Reviewing together allows context and nuance to shape the narrative before conclusions take hold. This also reinforces institutional transparency and shared accountability.

Strong oversight means connecting the dots—not just reacting to scores. When resident feedback, faculty evaluations, and milestones are reviewed together, patterns start to emerge. Those patterns tell the real story: where support is needed, what’s working well, and where early intervention could make a difference. The best institutions don’t wait for a site visit to notice a problem—they use data as a continuous feedback loop to keep learning, adapting, and improving. 

Framing Your Results with Context

When presenting results to leadership, remember that not all gaps between program and national means are significant. A small program with 10 residents will naturally show wider variation than a large program with 60. Use comparisons wisely—pair quantitative data with qualitative insight, such as internal surveys or resident focus group feedback.

The Partners® Pulse article “Refreshing Your Clinical Competency Committee (CCC)” highlights a similar principle: context transforms data into understanding. Present your findings as part of a broader narrative of learning and improvement.

Our new ACGME Survey Interpretation Worksheet helps programs move beyond instinctive reactions to survey scores by guiding them through structured interpretation. This resource includes sample data and narrative examples, showing how to translate raw numbers into meaningful patterns and actionable insights. By documenting key findings, identifying contributing factors, and framing results constructively, programs can approach GMEC discussions with clarity and confidence—turning data review into a shared learning process rather than a defensive exercise. 

Download the worksheet.

Turn ACGME Survey Results into a Strategic Advantage

Approaching survey results with curiosity and context is just the beginning. By pairing thoughtful interpretation with strategic action, GME leaders can turn survey results into meaningful change. For guidance tailored to your institution, Partners® offers expert support in data analysis, stakeholder engagement, and compliance planning.

Explore our GME consulting services or contact us to discuss how we can support your GME team. We also encourage you to stay tuned for Part 2 in January: “From Data to Dialogue.”