HRMS Frequently Asked Questions


1)   What is the Health Reform Monitoring Survey (HRMS)?


The Health Reform Monitoring Survey, or HRMS, is an innovative Urban Institute research program that explores the value of cutting-edge Internet-based survey methods for monitoring the Affordable Care Act (ACA) before data from federal government surveys are available. It relies on the KnowledgePanel® to provide timely assessments of the implementation of the ACA and to identify trends in key outcomes related to the changes under the ACA.


The HRMS provides data on health insurance coverage, access to and use of health care, health care affordability, and self-reported health status. Where possible, its questions are based on questions used in federal government surveys—including the American Community Survey, the Behavioral Risk Factor Surveillance System, the Annual Social and Economic Supplement to the Current Population Survey, and the National Health Interview Survey—and the data collected are benchmarked against those federal data.


Beginning in the second quarter of 2013, each round of the HRMS also contains topical questions focusing on timely ACA policy issues. In the first quarter of 2015, the HRMS shifted from a quarterly fielding schedule to a semiannual schedule.


The HRMS does not replace the ongoing federal government surveys; it supplements them by providing previously unavailable and valuable early feedback on the ACA well before federal survey data are available. While the HRMS carries with it more risks and potential errors than federal government surveys, its timely findings could give federal and state policymakers early insights into ACA implementation, allowing them to fine-tune their policy choices in real time and maximize the benefits of the ACA. By addressing the gap in data availability from federal surveys, the HRMS can provide well-timed feedback on the effects of health care reform, helping identify emerging issues and challenges. A core component of the HRMS research program is assessing the reliability of the early feedback against both qualitative sources and stronger quantitative sources as those data become available.


The HRMS instruments are available here.


2)   What is the KnowledgePanel®?


The KnowledgePanel® is a nationally representative, probability-based Internet panel maintained by Ipsos. The approximately 55,000 people in the panel include households with and without Internet access; laptops and free Internet access are provided to those who lack their own Internet access to ensure their participation in the panel.


KnowledgePanel® members agree to participate regularly in surveys like the HRMS. Panel members typically stay in the panel about two years, although some leave earlier and some stay longer.


Ipsos also gathers a wide range of information—including detailed demographic and socioeconomic characteristics as well as health and disability status—on panel members and their households through supplemental surveys.


The KnowledgePanel® household recruitment rate (where someone in a sampled household expresses willingness to participate in the panel) is currently about 14 percent.1 Those individuals are asked to provide background information by completing an initial survey. The completion rate for that background survey is about 64 percent. The people in that 64 percent become the sample pool for surveys based on the KnowledgePanel®.


3)   How is the HRMS sample selected from the KnowledgePanel®?


Each quarter's HRMS sample of nonelderly adults is drawn from active KnowledgePanel® members to be representative of the US population. In the first quarter of 2013, the HRMS provided an analysis sample of about 3,000 nonelderly (age 18–64) adults. In subsequent quarters, the HRMS sample was expanded to provide analysis samples of roughly 7,500 nonelderly adults, with oversamples added to better track low-income adults and adults in selected groups of states chosen based on (1) the potential for gains in insurance coverage in the state under the ACA (as estimated by the Urban Institute's microsimulation model) and (2) specific interest in the state among the HRMS funders. Beginning in the third quarter of 2015, only low-income adults have been oversampled. In the first quarter of 2017, the HRMS sample was expanded again to provide analysis samples of approximately 9,500 nonelderly adults.


Although fresh samples are drawn each quarter, the same individuals may be selected for different rounds of the survey. Because each panel member has a unique identifier, it is possible to control for the overlap in samples across quarters.


4)   How are the KnowledgePanel® members who are selected for the HRMS informed of the survey?


The selected KnowledgePanel® members are emailed an invitation to participate in the HRMS that includes a link to the online questionnaire. Follow-up emails and, if needed, automated telephone reminders are sent to individuals who do not respond to the initial invitation.


5)   Is the HRMS available in languages other than English?


Yes; it is available in Spanish.


6)   Does the HRMS collect information on children?


The HRMS questionnaire was expanded in the second quarter of 2013 to include questions about a randomly selected child in respondents' households, if the household included children. The information is collected for about 2,400 children and includes the topics covered for adults (i.e., insurance coverage, access to health care, affordability of health care, and health status), as well as additional topics related to children's health and disability status, access to dental care, emergency room visits, adults' satisfaction with children's health insurance coverage, and experiences with Medicaid and CHIP.


7)   How long does it take respondents to complete the HRMS questionnaire?


It takes about six minutes to complete the core questions on nonelderly adults and another four minutes to complete the topical questions. In the third quarter of 2015, the topical module was expanded to six minutes. The questions on a randomly selected child, if one lives in the household, take another six minutes.


8)   What is the response rate for the HRMS?


For surveys based on Internet panels, the overall response rate incorporates the survey completion rate as well as the rates of panel recruitment and panel participation over time. The American Association for Public Opinion Research (AAPOR) cumulative response rate for the HRMS is the product of the panel household recruitment rate, the panel household profile rate, and the HRMS completion rate2—roughly 5 percent each quarter.3
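The cumulative rate is a simple product of the stage-specific rates. A minimal sketch, using the rates cited in this FAQ and its footnotes (14 percent household recruitment, 64 percent profile completion, and a 58.1 percent HRMS completion rate in quarter 3 2013):

```python
# Sketch: AAPOR cumulative response rate for a panel-based survey, computed
# as the product of the stage-specific rates (each expressed as a fraction).
# The specific rates below are the ones cited in this FAQ.
def cumulative_response_rate(recruitment_rate, profile_rate, completion_rate):
    """Product of the panel recruitment, profile, and survey completion rates."""
    return recruitment_rate * profile_rate * completion_rate

rate = cumulative_response_rate(0.14, 0.64, 0.581)
print(f"{rate:.1%}")  # roughly 5 percent, the figure cited above
```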


While low, this response rate does not necessarily imply inaccurate estimates; a survey with a low response rate can still be representative of the sample population, although the risk of nonresponse bias is, of course, higher.4 Other factors—such as levels of bias, levels of missing data, and conformity with other research findings—are also important.5 Consistent with that, studies assessing nonresponse to panel recruitment in KnowledgePanel® have found little evidence of nonresponse bias on core demographic and socioeconomic variables.6 Similarly, studies comparing KnowledgePanel® and traditional random-digit-dial telephone surveys have yielded comparable estimates for a range of measures related to demographic and socioeconomic characteristics, health status and health behaviors, and other characteristics.7 Finally, comparing early HRMS estimates and federal government survey data yields comparable results, especially for the key outcome of interest in the HRMS—health insurance coverage.8


Reflecting the assessment of the panel by a range of users, KnowledgePanel® has a track record of supporting timely policy research across academia, research organizations, and government agencies.9


Nonetheless, the HRMS carries with it more risks and potential errors than federal government surveys. A number of potential sources of bias, including nonresponse bias, likely are only partly mitigated through the survey weighting (see question 9).


9)   Do HRMS estimates account for sample design and survey nonresponse and non-coverage?


All tabulations from the HRMS are based on weighted estimates. The HRMS weights reflect the probability of sample selection from the KnowledgePanel® and post-stratification to the characteristics of nonelderly adults and children in the United States based on benchmarks from the American Community Survey (ACS), the Current Population Survey (CPS), and the Pew Hispanic Center Survey. Because the KnowledgePanel® collects in-depth information on panel members, the post-stratification weights can be based on a rich set of measures, including gender, age, race/ethnicity, education, household income, homeownership, Internet access, primary language (English/Spanish), residence in a metropolitan area, and region. Given the many potential sources of bias in survey data in general, and in data from Internet-based surveys in particular, the survey weights for the HRMS likely reduce, but do not eliminate, potential biases.
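In its simplest one-dimensional form, post-stratification scales each respondent's weight so that weighted cell shares match known population benchmarks. A minimal sketch of that idea (the cells and population shares below are hypothetical, invented for illustration; the actual HRMS weighting uses the many dimensions listed above):

```python
# Sketch of one-dimensional post-stratification: each respondent's weight is
# population share / sample share for their cell, so weighted cell shares
# match the benchmarks. Cells and shares are hypothetical, not HRMS values.
from collections import Counter

def poststratify(sample_cells, population_shares):
    """Return one weight per respondent so weighted cell shares match benchmarks."""
    n = len(sample_cells)
    counts = Counter(sample_cells)
    return [population_shares[c] * n / counts[c] for c in sample_cells]

# Example: the sample over-represents households with Internet access.
sample = ["internet"] * 8 + ["no_internet"] * 2
shares = {"internet": 0.7, "no_internet": 0.3}
weights = poststratify(sample, shares)
# After weighting, the "no_internet" share equals the 30 percent benchmark.
```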


The quarter 3 2013 HRMS has a design effect of 1.47 for nonelderly adults and a sampling margin of error of +/- 1.3 percentage points for a 50 percent statistic with 95 percent confidence for the nonelderly adult sample.
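The margin of error follows from the standard formula for a proportion, with the simple-random-sample variance inflated by the design effect. A sketch assuming an analysis sample of roughly 7,500 nonelderly adults (the approximate sample size described in question 3 for that period); it yields a value close to the +/- 1.3 figure cited above, with the exact number depending on the precise sample size used:

```python
import math

# Sketch: margin of error for a proportion at 95 percent confidence,
# inflating the simple-random-sample variance by the design effect.
# n = 7,500 is an assumption based on the sample size in question 3.
def margin_of_error(p, n, deff, z=1.96):
    return z * math.sqrt(deff * p * (1 - p) / n)

moe = margin_of_error(p=0.5, n=7500, deff=1.47)
print(f"+/- {moe * 100:.1f} percentage points")
```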


In January 2017, all rounds of the HRMS from the first quarter of 2013 through the third quarter of 2016 were reweighted because of a change to the CPS question on household Internet access that was used to create benchmarks for the original poststratification weights. Under the new weighting procedure, the data are weighted to be representative of the nonelderly adult population in terms of lack of Internet access by age group (18 to 34, 35 to 44, 45 to 54, and 55 to 64) based on benchmarks derived from a more stable set of questions on household Internet access from the ACS. Other CPS and Pew Hispanic Survey questions used in the weighting process are unchanged. The transition to the updated weights has a small effect on national estimates. For instance, in the quarter 3 2016 round of the survey, the estimated uninsurance rate is 0.23 percentage points lower under the new weights than under the original weights. The effect of reweighting on estimated changes in key outcomes over time is limited because the new weighting procedure was applied to all previous rounds of the data.


10)   How are missing data handled in the HRMS?


Values are imputed for observations with missing information for family size and family income in the HRMS using regression-based methods. For the remaining measures, item nonresponse is generally low (less than 3 percent) and missing values are not imputed. When reporting on estimates from the HRMS, analyses on this web page include a category for missing data.
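A minimal sketch of regression-based imputation: fit a model on complete cases, then predict the missing values. The single-predictor ordinary-least-squares version below is purely illustrative; the actual HRMS imputation model is not documented here:

```python
# Sketch of regression-based imputation: fit ordinary least squares on
# complete cases, then predict income where it is missing. One predictor
# only, for illustration; the real HRMS procedure is richer than this.
def fit_ols(xs, ys):
    """Return (slope, intercept) of the least-squares line through (xs, ys)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum(
        (x - mx) ** 2 for x in xs
    )
    return slope, my - slope * mx

def impute_income(records):
    """records: list of (family_size, income or None); returns filled incomes."""
    complete = [(s, y) for s, y in records if y is not None]
    slope, intercept = fit_ols(*zip(*complete))
    return [y if y is not None else slope * s + intercept for s, y in records]

data = [(1, 30000), (2, 45000), (3, 60000), (4, None)]
filled = impute_income(data)  # the missing income is predicted from family size
```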


11)   What is the timetable for fielding the HRMS?


The HRMS was initially intended to be a monthly survey. In early 2013, however, the research team decided to shift to a quarterly fielding. The first quarter 2013 HRMS combines data from the January–February and February–March 2013 surveys. The quarter 2 2013 HRMS was fielded in June–July 2013. Subsequent rounds of the HRMS were fielded in September (quarter 3), December (quarter 4), March (quarter 1), and June (quarter 2). In the first quarter of 2015, the HRMS shifted from a quarterly fielding schedule to a semiannual schedule.


12)   Are public-use files for the HRMS available?


Public-use files for the HRMS data on nonelderly adults are available on the Survey Resources page.


13)   How is the HRMS funded?


Core funding for the HRMS is provided by the Robert Wood Johnson Foundation and the Urban Institute. The Ford Foundation provided core funding during 2014. The addition of the questions on children has been supported by The Atlantic Philanthropies, the David and Lucile Packard Foundation, and an anonymous donor and has been conducted in partnership with the Center for Children and Families at Georgetown University. Other donors provide supplemental funding to support targeted oversamples and special analyses.


14) Does the HRMS support state-specific estimates?

The HRMS does not support state-specific estimates. However, several funders have contracted with the panel vendor to obtain samples to support state-specific estimates based on the HRMS instrument. For HRMS-Minnesota, HRMS-New Jersey, and HRMS-Texas, the supplemental state samples are based on KnowledgePanel®, the nationally representative, probability-based Internet panel that underlies the HRMS. For HRMS-Louisiana and HRMS-Maine, KnowledgePanel® does not provide sufficient sample for state-specific estimates, so the samples for those states include “opt-in” respondents who volunteer to participate in surveys. As a result, the HRMS-Louisiana and HRMS-Maine samples are not comparable to the HRMS sample.


1We report on quarter 3 2013 since quarter 3's design reflects the ongoing design of the survey.

2The survey completion rate for the HRMS sample in quarter 3 was 58.1 percent, which is similar to that of other health studies that have relied on the KnowledgePanel®. See, for example, Sarah E. Gollust, Amanda F. Dempsey, Paula M. Lantz, Peter A. Ubel, and Erica Franklin Fowler, "Controversy Undermines Support for State Mandates on the Human Papillomavirus Vaccine," Health Affairs 29, no. 11 (2010): 2041–46; and Lindsey Murtagh, Thomas H. Gallagher, Penny Andrew, and Michelle M. Mello, "Disclosure-and-Resolution Programs That Include Generous Compensation Offers May Prompt a Complex Patient Response," Health Affairs 31, no. 12 (2012): 2681–89.

3For context, the response rates for federal government surveys are generally quite high (for example, 97.3 percent [weighted] for the American Community Survey [ACS] and 77.6 percent for the National Health Interview Survey [NHIS]), while response rates for telephone surveys used for polling, like those conducted by Pew Research and Gallup, are much lower. The response rates for a typical telephone survey by Pew Research and for the Gallup Daily Tracking Poll are both 9 percent. For information on response rates in the national surveys, see "Response Rates — Data," U.S. Census Bureau, and "2012 National Health Interview Survey (NHIS) Public Use Data Release — NHIS Survey Description," National Center for Health Statistics. For information on the Pew Research response rate, see "Assessing the Representativeness of Public Opinion Surveys," Pew Research Center for the People and the Press. Information on the response rate for the Gallup Daily Tracking Poll was obtained through personal communication with Gallup.

4Robert M. Groves, "Nonresponse Rates and Nonresponse Bias in Household Surveys," Public Opinion Quarterly 70, no. 5 (2006): 646–75; Jonathon R. B. Halbesleben and Marilyn V. Whitman, "Evaluating Survey Quality in Health Services Research: A Decision Framework for Assessing Nonresponse Bias," Health Services Research 48, no. 3 (2013): 913–30; and J. Michael Brick, "The Future of Survey Sampling," Public Opinion Quarterly 75, no. 5 (2011): 872–88.

5"Response Rate - An Overview," American Association for Public Opinion Research, accessed September 2013.

6Timothy Heeren, Erika M. Edwards, J. Michael Dennis, Sergei Rodkin, Ralph W. Hingson, and David L. Rosenbloom, "A Comparison of Results from an Alcohol Survey of a Prerecruited Internet Panel and the National Epidemiologic Survey on Alcohol and Related Conditions," Alcoholism Clinical & Experimental Research 32, no. 2 (2008): 222–29; and J Garret, JM Dennis, and CA DiSogra, "Non-response Bias: Recent Findings from Address-Based Panel Recruitment," paper presented at the annual conference of the American Association for Public Opinion Research, Chicago, May 2010.

7See Linchiat Chang and Jon A. Krosnick, "National Surveys via RDD Telephone Interviewing versus the Internet," Public Opinion Quarterly 73, no. 4 (2009): 641–78; and David S. Yeager, Jon A. Krosnick, Linchiat Chang, Harold S. Javitz, Matthew S. Levendusky, Alberto Simpser, and Rui Wang, "Comparing the Accuracy of RDD Telephone Surveys and Internet Surveys Conducted with Probability and Non-Probability Samples," Public Opinion Quarterly 75, no. 4 (2011): 709–47.

8Sharon K. Long, Genevieve M. Kenney, Stephen Zuckerman, Dana E. Goin, Douglas Wissoker, Fredric Blavin, Linda J. Blumberg, Lisa Clemans-Cope, John Holahan, and Katherine Hempstead, “The Health Reform Monitoring Survey: Addressing Data Gaps to Provide Timely Insights into the Affordable Care Act,” Health Affairs 33, no. 1 (2014).

9The Time-Sharing Experiments for the Social Sciences (TESS), which is supported by nine different divisions of the National Science Foundation and housed at Northwestern University's Institute for Policy Research, has provided more than 400 researchers access to the KnowledgePanel® to support innovative research studies. TESS was awarded the 2007 Warren J. Mitofsky Innovators Award by the American Association for Public Opinion Research.
