Volume 54 | Number 5 | October 2019

Abstract List

Megan K. Beckett Ph.D., Marc N. Elliott, Q Burkhart MS, Paul D. Cleary Ph.D., Nate Orr M.A., Julie A. Brown, Sarah Gaillot Ph.D., Karin Liu MPH, Ron D. Hays


Objective

To assess the effect of changing survey questions on plan‐level patient experience measures and ratings.

Data Source

2015 Medicare Advantage Survey respondents.

Study Design

Ninety‐three randomly selected beneficiaries in each of 40 plans received a revised (5.0) survey; 38 832 beneficiaries received version 4.0. Linear mixed‐effects regression predicted each measure from fixed effects for survey version and beneficiary characteristics, with a random intercept for plan and a random plan‐by‐version slope.
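The mixed‐effects model described above can be sketched as follows. This is a minimal illustration on simulated data, not the authors' analysis: the variable names (`score`, `version5`, `age65plus`) and the simulated effect sizes are hypothetical, and the actual survey data are not reproduced here.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulate hypothetical plan-level survey data (40 plans, as in the study design).
rng = np.random.default_rng(0)
n_plans, n_per_plan = 40, 50
n = n_plans * n_per_plan
df = pd.DataFrame({
    "plan": np.repeat(np.arange(n_plans), n_per_plan),
    "version5": rng.integers(0, 2, n),   # 1 = revised (5.0) survey, 0 = version 4.0
    "age65plus": rng.integers(0, 2, n),  # stand-in beneficiary characteristic
})
plan_intercept = rng.normal(0, 2, n_plans)[df["plan"]]
df["score"] = 80 + 2 * df["version5"] + plan_intercept + rng.normal(0, 10, n)

# Fixed effects: survey version + beneficiary characteristics.
# Random effects: plan intercept plus a plan-by-version random slope
# (re_formula adds the random slope for version within each plan).
model = smf.mixedlm(
    "score ~ version5 + age65plus",
    data=df,
    groups="plan",
    re_formula="~version5",
)
result = model.fit()
print(result.params["version5"])  # estimated version effect on the 0-100 scale
```

The fitted `version5` coefficient corresponds to the mean shift attributed to the survey revision; the variance of the random slope indicates whether that shift differs across plans.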

Principal Findings

Response rates were 42 percent for both versions. Removal of “try to” from screeners increased the percentage of respondents eligible for follow‐up questions. Version 5.0 caused a small increase (1‐3 points on a 0‐100 scale, P < 0.05) in the mean of three altered measures and a moderate increase (>3 points) in one. There was a small statistically significant increase in two unaltered measures. These changes were uniform across plans; relative to the legacy survey, score distributions would be expected to shift only by uniform mean amounts, with no effect on summary measures.


Conclusions

These analyses illustrate how other national surveys considering changes can assess the impact of seemingly minor modifications, and they highlight the importance of screeners in instrument design.