Hospitals function within an environment in which regulation influences their daily operations, reporting, and reimbursement. This study tests whether regulatory reporting is affected by significant regulatory change. To examine this relationship, we utilize a comprehensive sample of more than 22,000 Medicare cost reports (MCRs) and corresponding MCR status changes spanning 2007–2014, surrounding the passage of the Patient Protection and Affordable Care Act of 2010 (“ACA”). We find that hospitals restate more Medicare cost reports in the post-ACA (2012–2014) period, suggesting that regulatory reporting accuracy declines overall. We expand the analysis to examine reporting accuracy across hospital types. MCR restatements increase for all hospital types following the ACA, but the timing varies. Further analysis reveals that the increase in restatements following the ACA is due to an increase in amended reports that outweighs a decline in MCRs that are reopened following their settlement.

Medicare constitutes a key pillar in the financial composition of the U.S. healthcare industry. In 2017, Medicare expenditures topped $700 billion for the first time ever, representing 20 percent of total national health expenditures (CMS 2017). This trend is not expected to reverse as the U.S. population trends older. Overall, 58 million individuals were enrolled in the Medicare program during 2017, including 3.8 million new enrollees. The emergence of the novel coronavirus in 2020 has only served to sharpen society's collective focus on the U.S. healthcare system and its efficacy. Given the significance of the Medicare program as a component of national healthcare expenditures and the critical importance of healthcare services and providers to our collective well-being, participating healthcare organizations are subject to intense scrutiny and extensive regulation.

One such piece of recent regulation is The Patient Protection and Affordable Care Act of 2010 (ACA), which significantly altered reporting requirements and reimbursement structures for hospitals. Specifically, following passage of the ACA, hospitals are required to report new quality metrics offering deeper insights into patient care and outcomes. Further, there were significant shifts in the reimbursement structures and required reporting metrics, resulting in direct changes to the Medicare reporting process. For instance, payments for Medicare disproportionate share (DSH), which reimburses hospitals for treating a large portion of uninsured patients, were reduced by 75 percent under the ACA and funds were instead made available for hospitals to claim for uncompensated care. As a result, the ACA changed the reporting requirements and the economic incentives, and in turn may have affected the Medicare cost reports (MCR) regulatory reporting process.

The implementation of the ACA offers a natural experiment for an evaluation of the influence of regulatory changes on hospital MCR reporting processes and accuracy. On the one hand, increased scrutiny and the economic incentives accompanying the ACA may lead hospitals to report more accurately post-implementation. Conversely, periods of significant regulatory change are often accompanied by uncertainty and adjustments to existing structures and processes, which could result in less accurate regulatory reporting in the post-implementation period. This study leverages a robust set of MCR panel data that capture MCR status changes throughout the report submission and review processes over an eight-year period spanning the implementation of the ACA in 2012. The MCR data we leverage in this study were obtained from the Research Data Assistance Center (ResDAC) and include every MCR and the corresponding status changes during the MCR review period for all U.S. Short-Term Acute Care (STAC) hospitals filing cost reports between 2007 and 2014. This unique sample construction differs from other contemporary cost report research, which typically incorporates final MCR reports as of a particular point in time (i.e., static). In contrast, our MCR sample offers a dynamic glimpse into the reporting process itself, which should yield new insights into regulatory reporting accuracy surrounding the implementation of the ACA in 2012.

By analyzing subsequent changes in reporting to correct the original MCR submissions, a reporting error rate can be determined. We evaluate the restatement of reports during the pre- and post-ACA periods and offer unique insight into the role of regulatory change in hospital MCR reporting accuracy. Our primary variable of interest is the number of MCR submissions for a reporting year. Multiple MCR submissions indicate that a report was amended or reopened for a hospital in a particular year. Overall, the number of MCRs is higher for the full post-ACA period (2012–2014) as compared to the pre-ACA baseline (2007–2011). We also find that the number of submitted MCRs increases by 0.06 in the year of ACA implementation (2012) versus the pre-ACA period, but then declines in the latter post-ACA period (2013–2014), while remaining above pre-ACA levels. The overall sample, as well as each of the hospital-type (for-profit, nonprofit, government, and hospital district) subsamples, experiences a statistically significant increase in the number of MCRs submitted following implementation of the ACA, but the timing of observed reporting changes varies by hospital type. In addition, the nature of restated submissions shifts post-ACA: hospitals are more likely to amend their MCRs but less likely to reopen settled MCRs. The increased occurrence of MCR amendments post-ACA suggests regulatory reporting accuracy may have declined for initial MCR submissions. At the same time, hospitals appear to be working diligently to correct these reporting errors by filing amended MCRs and, overall, hospitals appear to be reopening MCRs less frequently post-ACA.

We examine several alternative error rate specifications and document consistent evidence that regulatory change is negatively associated with MCR reporting accuracy in the ACA's implementation year (2012) and mixed evidence of the effects thereafter. Elements of the ACA were phased in over several years for the broader healthcare industry, but the hospital MCR changes went into effect in 2012. Collectively, the results show an increased propensity for hospitals to restate MCRs in the year of ACA implementation and declining restatements thereafter, indicative of adjustments in hospital regulatory reporting processes during this period of regulatory change. Altogether, these findings should be of interest to regulators and analysts who use MCR data to set future reimbursement rates and to academic researchers who use MCR data in their work, as the results of these analyses suggest that the MCR regulatory reporting and review processes changed significantly after the ACA was implemented in 2012. Medicare cost reporting accuracy is a topic of keen interest for policy makers who seek to balance public health, sustainability of the Medicare system, and other political interests, and this study sheds new light on reporting accuracy during periods of regulatory change. Policy makers might also find this study useful in their efforts to understand the consequences of regulatory change, which in turn affects healthcare providers, consumers, and other key stakeholders. For example, hospital contractual arrangements with private/third-party payers are likely to be influenced by Medicare cost report information, which in turn affects consumer insurance rates, premiums, and other important factors.

These findings also have implications for the broader community of capital market participants and researchers in accounting and finance. Research often relies on publicly available reporting information; this paper should serve as a reminder that periods of regulatory change may introduce volatility into reporting processes. However, in our sample, for-profit hospital chains reported more consistently throughout the ACA implementation, suggesting that these organizations may navigate periods of regulatory change better than nonprofit entities.

The remainder of this paper is organized as follows. In Section II, the research question is provided along with relevant background information; Section III describes the research methodology and the sample; Section IV presents and discusses the empirical results and Section V concludes.

In 2007, total national health expenditures (NHEs) constituted 15.9 percent of U.S. Gross Domestic Product, as compared to 17.3 percent in 2014 and 17.9 percent in 2017 (CMS 2019a). NHEs grew 3.9 percent in 2017 to nearly $3.5 trillion, down slightly compared to the 4.3 percent cumulative average growth over the preceding decade from a baseline of $2.3 trillion NHEs in 2007 (CMS 2019b). Health expenditures in the U.S. are predominantly funded through one of four sources (corresponding percent of 2017 NHEs): (1) private health insurance (34 percent); (2) Medicare (20 percent); (3) Medicaid (17 percent); and (4) out of pocket, or self-pay (10 percent). Across all sources, the federal government sponsored 28 percent of aggregate NHEs in 2017, as compared to 28 percent sponsored by private households and 20 percent by private businesses (CMS 2019a). Federal government-sponsored healthcare plans draw considerable attention from academics, lawmakers, regulators, and other key stakeholders given the magnitude, growth trajectory, and public interest in these massive taxpayer-funded programs (Brand et al. 2012; Allison 2013). Among the federal healthcare programs, Medicare is the largest by a wide margin. In 2017, Medicare expenditures were $705.9 billion, almost double the $361.2 billion disbursed through the federal Medicaid program (CMS 2019c). Medicare expenditures currently comprise approximately 15 percent of total government expenditures and are projected to rise to 18 percent of government expenditures in the coming decade (Cubanski, Neuman, and Freed 2019).

With approximately one in every six dollars spent by the federal government flowing to Medicare, it should be of little surprise that the program is a focal point for lawmakers and regulators. Lawmakers and regulators serve a critical function in shaping the reporting framework for U.S. healthcare organizations and their respective stakeholders. Collectively, lawmakers and other healthcare regulators establish requirements for the timing and composition of financial reports, impose audit requirements to ensure reports are credible and accurate, and compile and analyze the information contained in hospital financial reports to evaluate organizational and program performance in order to shape future laws and policies that affect the operating and reporting environment for the entire healthcare sector.

Among healthcare providers, hospitals are a central component of healthcare delivery in the United States. According to a 2017 American Hospital Association (AHA) survey, there were approximately 6,200 hospitals and 930,000 staffed hospital beds nationwide (AHA 2019). According to the same survey, total admissions to U.S. hospitals tallied more than 36.5 million in 2017 and total expenses across all hospitals nationwide topped $1 trillion.

In a recent comprehensive literature review of management accounting and control research within the hospital industry, Eldenburg, H. Krishnan, and R. Krishnan (2017) present a theoretical framework for accounting research conducted in the hospital setting. Specifically, the authors identify multiple important linkages that must be accounted for in research design, with organizational governance and control at the center. Management accounting and control research within the hospital industry is particularly compelling because it offers researchers a unique opportunity to examine how organizational governance and control influence organizational outcomes in ways that cannot be easily replicated in other settings. As Eldenburg et al. (2017) highlight, a hallmark of the U.S. hospital industry is that the hospitals themselves represent several unique forms of ownership and control (i.e., for-profit, nonprofit, governmental), but all share important commonalities in their core production function(s). However, the goals of hospitals vary based on type of control. For-profit hospitals are answerable to investors/stakeholders, while nonprofit and governmental hospitals have a responsibility to the communities they serve. Therefore, a hospital's mission and operational approach will likely differ based on the type of control. In addition, hospital financial reporting and operating performance data are often publicly available, which facilitates analysis and research.

The ACA altered reporting requirements and reimbursement structures for hospitals. Specifically, hospitals are now required to report new quality metrics relative to patient care and outcomes. Payments for Medicare DSH, which reimburses hospitals for treating a large number of uninsured patients, were reduced by 75 percent under the ACA. While many hospitals could offset this reduction against the newly initiated reimbursement for uncompensated care cost (UCC), the reporting changes were significant and may have precipitated a reduction in overall reporting quality, especially in the years immediately surrounding the ACA's implementation. These changes have significant implications for hospitals, and their response to this regulatory change could differ based on the hospital's type of control and mission. MCR reporting accuracy is particularly important given the enormous scope of the program and because regulators and numerous other stakeholders make use of this data to assess the current state of the U.S. healthcare system and to evaluate future policy and reimbursement strategies. Interestingly, hospitals have been identified as being markedly behind other industries in the implementation of superior and up-to-date accounting and reporting practices (Langabeer, DelliFraine, and Helton 2010). Reporting deficiencies may be especially problematic during times of significant regulatory change, such as the time period surrounding the passage of the ACA of 2010 and its implementation in 2012.

The MCR preparer (i.e., the hospital or a designated representative) groups expenses and revenues into cost centers consistent with Centers for Medicare & Medicaid Services (CMS) regulations. During cost finding, expenses are adjusted to remove non-allowable expenses and revenues. A paid-claims detail, known as the Provider Statistical and Reimbursement Report (PS&R), is used to capture the payments received from CMS for Medicare beneficiaries. These payments are grouped by cost center to determine cost-to-charge ratios for each cost center; this information forms the basis for determining the final Medicare settlement for the preparer in each reporting period.
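To make the mechanics concrete, the sketch below works through a stylized version of this apportionment: allowable costs and charges are grouped by cost center, a cost-to-charge ratio is computed for each center, and Medicare's apportioned cost is compared with interim payments to arrive at a tentative settlement. The cost-center names, dollar amounts, and the simplified ratio-times-charges apportionment are illustrative assumptions, not the CMS worksheet logic.

```python
# Illustrative sketch (not the CMS worksheet logic) of cost finding:
# allowable costs and charges are grouped by cost center, a cost-to-charge
# ratio is computed for each center, and Medicare's share of cost is
# estimated by applying that ratio to Medicare charges from the PS&R.
# All figures and cost-center names below are hypothetical.

cost_centers = {
    #                      allowable_cost, total_charges, medicare_charges (from PS&R)
    "Adults & Pediatrics": (12_000_000, 30_000_000, 11_000_000),
    "Intensive Care":      ( 6_500_000, 13_000_000,  5_200_000),
    "Emergency":           ( 4_200_000, 14_000_000,  3_500_000),
}

medicare_interim_payments = 9_600_000  # hypothetical interim payments already received

medicare_cost = 0.0
for name, (allowable_cost, total_charges, medicare_charges) in cost_centers.items():
    ccr = allowable_cost / total_charges      # cost-to-charge ratio for the center
    medicare_cost += ccr * medicare_charges   # apportion cost to Medicare

# A positive settlement means Medicare owes the hospital; a negative settlement
# means the hospital owes Medicare (an overpayment to be returned).
settlement = medicare_cost - medicare_interim_payments
print(f"Apportioned Medicare cost: {medicare_cost:,.0f}")
print(f"Tentative settlement:      {settlement:,.0f}")
```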

Medicare administrative contractors (MACs) are contracted by CMS to conduct desk reviews and audits of submitted MCRs to improve reporting accuracy prior to issuing a Notice of Program Reimbursement (NPR), which is the final settlement for each cost report. The initial “as submitted” (status code 1) MCR is provided to the MAC by the end of the fifth month after the hospital's year-end. The MAC then performs a desk review, and the MCR can be “settled without audit” (status code 2) or “settled with audit” (status code 3) by the MAC. Once the MAC settles the MCR, an NPR is issued, finalizing the settlement for the MCR. Hospitals may “amend” (status code 5) an MCR up to the issuance of the NPR. Once an NPR is issued, hospitals can “reopen” (status code 4) a report for up to three years after the issuance of the NPR. However, a reopening must relate to an item adjusted during the MAC's review. Such a review would primarily focus on DSH payments, bad debts, and/or census and payment updates based on revised or more current data becoming available after a time lag. Cost report reopenings focus solely on the issues outlined in the reopening request and do not reopen the entire cost report to an additional review. In addition, the financial impact of the reopening must be $15,000 or greater. Figure 1 depicts the MCR preparation, filing, and review process.

FIGURE 1

MCR Life Cycle
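The sequence of status codes just described can be summarized as a simple state machine. The sketch below encodes only those transitions to make the life cycle concrete; the three-year window and $15,000 threshold are omitted, and allowing a reopened report to be reopened again is our simplifying assumption rather than a rule stated in the text.

```python
# A minimal sketch of the MCR status-code life cycle described above.
# Codes: 1 = as submitted, 5 = amended (pre-NPR), 2/3 = settled
# (without/with audit, NPR issued), 4 = reopened (post-NPR).

ALLOWED = {
    None: {1},        # a report history must start with an initial submission
    1:    {5, 2, 3},  # after submission: amend, or settle
    5:    {5, 2, 3},  # further amendments are possible until the NPR is issued
    2:    {4},        # once settled, only a reopening can follow
    3:    {4},
    4:    {4},        # assumption: additional reopenings on other items
}

def is_valid_history(codes):
    """Return True if a sequence of status codes follows the life cycle above."""
    prev = None
    for code in codes:
        if code not in ALLOWED.get(prev, set()):
            return False
        prev = code
    return True

print(is_valid_history([1, 5, 2]))  # submitted, amended, settled        -> True
print(is_valid_history([1, 2, 4]))  # submitted, settled, reopened       -> True
print(is_valid_history([1, 4]))     # cannot reopen before settlement    -> False
```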

Once filed, cost reports are maintained by the Research Data Assistance Center (ResDAC) in the Healthcare Cost Report Information System (HCRIS) database. CMS analyzes these data as they evaluate establishing or modifying the requirements and regulations to which facilities must adhere to maintain eligibility for participation in the Medicare and Medicaid programs (P.L. 105-33 1997). CMS also uses the data submitted in MCR submissions to set future reimbursement rates, make revisions to payment models, and evaluate other potential program changes. CMS utilizes a trailing three-year approach to setting reimbursement rates and making revisions. Therefore, if errors exist within the MCR, future legislation and payments within the Medicare program could be influenced, which could critically affect hospitals and the communities they serve. Further, CMS makes this information available to the public at large, so reporting accuracy is a topic of broad interest for academics, healthcare industry administrators and analysts, and numerous other interested parties who make use of this data in their work. For instance, private/third-party payers are likely to make use of this information as they negotiate contracts with hospitals.

Regulatory changes like the ACA create challenges for the U.S. healthcare delivery system in general and for hospitals specifically (Longest 2012). The ACA initiated numerous reimbursement and reporting changes that directly affected MCR content and composition. For instance, the legislation established new health insurance exchanges and mandated that individuals maintain health insurance (Eldenburg et al. 2017). The ACA also reduced Medicaid DSH payments to hospitals. Collectively, these marketplace changes, insurance mandates, and proposed payment changes generated significant uncertainty among hospital managers and executives tasked with budgeting and planning (Eldenburg et al. 2017).

On the reporting front, the ACA implemented additional quality reporting metric requirements that include UCC, Value-Based Purchasing (VBP), and Hospital-Acquired Conditions (HAC), to note a few examples (P.L. 111-148 2010; Navathe et al. 2012). These new reporting metrics carry reimbursement reductions for hospitals not meeting the minimum requirements set by CMS. Further, the ACA imposed reimbursement implications for hospitals utilizing a “pay-for-performance,” quality-driven structure (Pearson and Bach 2010). This regulatory approach follows other countries' efforts to reduce the cost of providing healthcare and shifts the responsibility for operational efficiency from the payer (i.e., Medicare) to the healthcare providers (Cardinaels and Soderstrom 2013). Finally, the ACA introduced bundled payments for the entire patient care cycle, which hospitals must split appropriately among the various parties engaged in patient care (Eldenburg et al. 2017). Collectively, these regulatory changes present hospitals with unique reporting and operating challenges and serve as an important backdrop for this study.

Eldenburg et al. (2017) present a useful framework for conducting accounting research within the hospital setting. Collectively the authors identify several important factors that may influence accounting outcomes either directly or indirectly. Direct linkages are drawn between accounting outcomes and (1) governance and control; (2) management accounting system design; (3) market structure; (4) public policy; and (5) institutional forces. Further, the authors identify indirect linkages for each of these forces and accounting outcomes occurring through the management accounting system design element. In other words, governance and control, market structure competition, public policy, and institutional forces are likely to shape accounting outcomes directly and indirectly via the accounting systems implemented and resources allocated by hospital managers.

Hospital governance and control is at the center of this proposed framework and several studies specifically examine the role of governance and control in shaping operating and reporting outcomes. Holzhacker, Krishnan, and Mahlendorf (2015) leverage economic theory to propose and test a model for changes in cost behavior surrounding regulatory change. Using a large sample of German hospitals, the authors show that ownership and control structures influence how hospitals respond to price regulation. Specifically, for-profit hospitals demonstrate greater flexibility than nonprofit and governmental hospitals, given that these other control types face greater institutional pressures against changes in cost structure.

Two studies—Kennedy, Burney, Troyer, and Stroup (2010) and Krishnan and Yetman (2011)—analyze nonprofit hospitals to better understand how institutional factors influence decision making for this important subset of hospitals. Kennedy et al. (2010) leverage a sample of Texas nonprofit hospitals to examine how state regulation establishing a minimum threshold on charity care influenced charity care spending. The authors find that hospitals operating below the threshold increased spending on charity care, while stronger-performing hospitals above the benchmark actually decreased charity care spending. In particular, the reduced charity care spending among nonprofit hospitals above the benchmark suggests that profit motives compete with institutional forces and constraints on managerial decision making in these organizations. Krishnan and Yetman (2011) examine the role of institutional factors on the practice of cost shifting in a sample of California nonprofit hospitals. Specifically, the authors identify a range of normative and regulative pressures influencing nonprofit hospitals and find that normative pressure leads to increased cost shifting, whereas regulative institutional forces reduce cost-shifting behaviors.

Several other academic studies analyze hospital accounting and performance outcomes in periods of regulatory change. Barnes and Harp (2018) provide evidence that established DSH thresholds affect hospital capacity planning. In their study, the authors focus the analysis on a sample of nearly 200 hospitals operating near the 100-bed DSH add-on payment threshold. Specifically, the authors observe a discontinuity in the distribution of urban hospital bed capacities, suggesting that hospitals manage their available beds to maximize DSH payments. The observed behavior appears to be most pronounced among the subset of for-profit hospitals. Barniv, Danvers, and Healy (2000) find that the Medicare capital prospective payment system (CPPS) regulation influenced capital expenditures, using a set of audited financial statements for a large, national sample of U.S. hospitals (n = 1,949) spanning the pre-CPPS (1988–1991) and post-CPPS (1992–1996) periods.

Lynch (2003) studies hospital financing decisions and finds that hospitals reduced debt (driven largely by a reduction in capital expenditures) in response to the 1992 Medicare capital prospective payment regulatory changes for a sample of California hospitals. The change eliminated incentives to issue debt and hospitals responded accordingly. Eldenburg and Vines (2004) study 98 Florida hospitals across a three-year time period (1989–1991) and find evidence consistent with opportunistic reclassification of bad debt expense as charity care and find that high cash, low operating margin hospitals appear to engage in this reclassification more aggressively. This study shows that reporting accuracy deteriorates in relatively predictable ways given the incentives and disincentives embedded within healthcare regulation. Eldenburg and Kallapur (1997) examine patient mix and cost allocation decisions for a sample of Washington state hospitals surrounding a Medicare reimbursement change in 1983. Specifically, inpatient services were reimbursed at a fixed rate (i.e., the current model) while outpatient services were reimbursed at cost. The analysis showed that the ratio of outpatient revenues to total revenues increased more for Medicare patients than for non-Medicare patients following this regulatory change, suggesting that hospitals utilize a variety of operating levers to manage cash flows favorably during periods of regulatory change.

Extant research identifies the ACA payment structure changes surrounding the delivery of healthcare for uninsured patients (UCC now being emphasized over DSH) as presenting significant performance risks for U.S. hospitals (Chaikind 2012; Gardner 2013; Eldenburg et al. 2017). A recent study by McGowan, Chan, Yurova, Liu, and Wong (2018) finds that external audit quality improved following the passage of the Sarbanes-Oxley Act of 2002 and other nonprofit regulations and disclosure requirements. The paper draws from institutional theory and posits that increased regulatory scrutiny leads to shifts in auditor behavior, causing them to work diligently to preserve their reputation and to ensure compliance with newly imposed government mandates and/or regulatory requirements. While the study focuses on the work of the external auditor, the findings suggest that company financial reporting and internal controls are likely to be affected because of the heightened scrutiny.

In summary, the hospital industry presents a compelling opportunity for researchers to leverage natural experiment research designs to examine how hospital managers respond to regulatory change. We similarly apply institutional theory to analyze hospital regulatory reporting processes and accuracy in the years surrounding the passage of the ACA. Given the shifting regulatory requirements, it seems plausible that reporting risks and obstacles may increase during periods of regulatory change, which in turn has implications for regulatory reporting accuracy.

This study evaluates the influence of regulatory changes (i.e., the ACA) on the MCR regulatory reporting process among U.S. short-term acute care (STAC) hospitals. The passage of the ACA offers a compelling natural experiment in which to evaluate the influence of regulatory changes on hospital reporting processes and accuracy. Our study draws from institutional theory, which predicts that organizations and their key stakeholders endeavor to conform to newly established requirements in order to maintain legitimacy and stability (Krishnan and Yetman 2011; Eldenburg et al. 2017; McGowan et al. 2018). Specifically, the ACA implements new reporting requirements with a reimbursement structure that is punitive to hospitals that fail to operate efficiently (Rosenbaum 2011). Further, the ACA changes certain reimbursement metrics, which could have negative reimbursement implications for hospitals that underperform or report incorrect data on their MCRs. Therefore, it is reasonable to expect that hospitals will have a heightened awareness of the importance of accurate reporting under the ACA and would be expected to exhibit fewer restatements and errors over time given these possible consequences of misreporting. Hospitals that fail to report known overpayments to CMS could be barred from participating in the Medicare and Medicaid programs and could be fined. These penalties could be levied for any known overpayment regardless of the amount.

At the same time, the reporting changes mandated by the ACA likely require hospitals to alter their reporting processes and structures. As with any significant regulatory changes, there may also be uncertainty about reporting requirements and expectations during the initial years of implementation. Thus, hospitals may exhibit a greater number of errors and/or restatements in their MCRs given the uncertainties and changes contained within this significant new regulation and the corresponding changes to their internal reporting processes. Given these competing sets of expectations, we propose the following non-directional research question:

RQ1: Is the MCR regulatory reporting process affected by the implementation of the Patient Protection and Affordable Care Act (ACA)?

To evaluate RQ1, we utilize two primary regression specifications to test for evidence of an association between regulatory change and reporting accuracy using two different proxies: (1) the number of MCR submissions and restatements and (2) the probability of restating an MCR. The various models and variables are described in the following sections.

The Number of MCR Submissions and Restatements

This analysis utilizes the number of MCRs submitted prior to final settlement. By quantifying the initial submission (status code 1) plus the number of restatements (status codes 4 or 5) for hospitals both pre- and post-ACA, it is possible to evaluate the influence of regulatory change on this important process using the following model:
\[
\#Reports_{it} = \beta_0 + \beta_1 ACA_t + \boldsymbol{\gamma}'X_{it} + u_i + \varepsilon_{it} \quad (1)
\]

where \(\#Reports_{it}\) for hospital \(i\) at time \(t\) is the number of MCRs submitted by the hospital prior to final settlement and \(X_{it}\) is a vector of hospital-year controls that include Medicare intensity, natural log of beds, natural log of patient revenue, natural log of employees, and several institutional and environmental control factors discussed further below. The key variables of interest pertain to the passage of the ACA. In the baseline regressions, a dummy variable (\(ACA_t\)) equals 1 for hospital-years beginning in 2012, after ACA implementation. Subsequent analyses utilize separate indicator variables for the year of ACA implementation (2012) and the post-implementation years (2013 and 2014). The term \(u_i\) is a hospital fixed effect. Given the importance of hospital ownership and control for accounting and regulatory reporting outcomes, we include indicator variables for the significant “nonprofit” control types (i.e., private nonprofit, governmental, and public hospital district) in the full sample and also run the regressions separately on samples partitioned for each major control type (including for-profit hospitals).

Extant research has established linkages between hospital accounting and several different institutional and environmental factors, including hospital setting (rural or urban) (Wilson, Kerr, Bastian, and Fulton 2012), hospital status (teaching or non-teaching) (Dalton, Norton, and Kilpatrick 2001), and chain/system membership (Wilcox-Gök 2002; Chang and Chang 2017) and so we include these variables in our model to control for the various other institutional and environmental forces that exert themselves on hospitals. A complete variable list is provided in Appendix A. We first estimate the above model utilizing linear regression and then using a count model (Poisson regression).
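As a minimal illustration of how Equation (1) can be estimated, the sketch below fits the linear fixed-effects specification and its Poisson counterpart with statsmodels. The file name and column names (e.g., `n_reports`, `aca_2012`, `medicare_intensity`) are hypothetical placeholders for the ResDAC extract, and clustering by hospital is our assumption; this is a sketch of the estimation approach, not the authors' own programs.

```python
# Sketch of Equation (1): number of MCR submissions regressed on ACA indicators
# and hospital-year controls, with hospital fixed effects. Column names are
# hypothetical; clustering by hospital is an assumption, not stated in the text.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("hcris_panel.csv")  # hypothetical hospital-year panel extract

controls = ("medicare_intensity + ln_beds + ln_patient_revenue + ln_employees + "
            "teaching + chain + urban")
rhs = f"aca_2012 + aca_2013_2014 + {controls} + C(provider_id)"  # C() absorbs hospital FE

# Linear fixed-effects model (with thousands of hospitals, a within/demeaned
# estimator would be faster than explicit dummies, but the slope estimates match).
ols = smf.ols(f"n_reports ~ {rhs}", data=df).fit(
    cov_type="cluster", cov_kwds={"groups": df["provider_id"]})

# Count-model counterpart: Poisson regression with the same fixed effects.
poisson = smf.poisson(f"n_reports ~ {rhs}", data=df).fit()

print(ols.params[["aca_2012", "aca_2013_2014"]])
print(poisson.params[["aca_2012", "aca_2013_2014"]])
```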

Restatement of MCR

Hospitals are motivated to restate their MCRs to improve reporting. For instance, hospitals review their bad debts and DSH payments to ensure accuracy. If a hospital determines a reporting error exists, it should amend the MCR, and in some circumstances it may be required to do so. For instance, if a hospital determines that a reporting error resulted in an overpayment to the hospital, it is required to notify Medicare and refund the funds within 90 days of becoming aware of the error. This requirement is based on an assessment of when the organization should “reasonably” be aware of the overpayment. Failure to comply could result in the hospital being removed from participation in the program and fines and interest being levied. As previously stated, there is no minimum threshold amount for notifying CMS of an overpayment. Hospitals cannot plead ignorance of the overpayment to avoid penalties. This means hospitals must be diligent in monitoring and reconciling their accounts.

Once an overpayment is identified, the provider must correct and resubmit the affected MCR along with a restitution payment to CMS. Although the ACA did not impose harsher penalties on hospitals, other reporting changes in the ACA create the need for increased monitoring of any potential reporting errors in the MCRs. Such a restatement can take place as an amended report (status code 5) or a reopened report (status code 4). We consider both status codes a “restatement” and analyze the probability of hospitals restating their MCRs using the following conditional logit model:

\[
\Pr\left(Restate_{it} = 1 \mid ACA_t, X_{it}, u_i\right) = \Lambda\left(\beta_1 ACA_t + \boldsymbol{\gamma}'X_{it} + u_i\right) \quad (2)
\]

where

\[
Restate_{it} =
\begin{cases}
1 & \text{if hospital } i \text{ amended (status code 5) or reopened (status code 4) its MCR for year } t,\\
0 & \text{otherwise,}
\end{cases}
\]

\(\Lambda(\cdot)\) is the logistic function, and the remaining variables and sample partitions are similar to those described in Equation (1). In addition, we analyze amended and reopened reports separately using a similar framework.
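For completeness, a minimal sketch of how Equation (2) can be fit as a conditional (fixed-effects) logit is shown below. As before, the file and column names are hypothetical placeholders, and the restatement indicator is built from assumed `amended` and `reopened` flags; this is a sketch of the method, not the authors' code.

```python
# Sketch of Equation (2): conditional logit of MCR restatement on ACA indicators
# and controls, with hospital fixed effects handled by the conditional likelihood.
import pandas as pd
from statsmodels.discrete.conditional_models import ConditionalLogit

df = pd.read_csv("hcris_panel.csv")  # hypothetical hospital-year panel extract
df["restated"] = ((df["amended"] == 1) | (df["reopened"] == 1)).astype(int)

exog_cols = ["aca_2012", "aca_2013_2014", "medicare_intensity", "ln_beds",
             "ln_patient_revenue", "ln_employees", "teaching", "chain", "urban"]

# Hospitals whose restatement outcome never varies over the sample window
# contribute nothing to the conditional likelihood, which is one reason the
# reported sample size shrinks in these models.
clogit = ConditionalLogit(df["restated"], df[exog_cols],
                          groups=df["provider_id"]).fit()
print(clogit.summary())
```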

The data utilized capture financial, census, and demographic elements submitted on the MCR for each STAC hospital and reporting year. These data are longitudinal in nature. Only MCRs that fall within the respective periods (pre-ACA, ACA 2012, and post-ACA) were included in the analysis. The data also include every MCR submission and status change for each hospital participating in the Medicare program during the period under investigation. Contemporaneous research incorporating MCR data often utilizes the final (and static) reports, which are publicly available. With those data, researchers can only observe the final status code included with the file (e.g., status code “2”, settled without audit). Instead, we leverage a more dynamic sample of MCRs, which yields visibility into the entire MCR reporting and review process. Specifically, we are able to track all applicable status codes for an individual MCR (e.g., report submitted [“1”] → amended report filed [“5”] → settled without audit [“2”]). Cowles (1991) obtains similarly granular MCR data to analyze changes arising from the MCR review process. Specifically, the author calculates changes between the original “as submitted” MCR and the final “settled” MCR and finds that the review process generally results in minimal changes to the reported numbers.

For this study, MCRs were selected from the HCRIS database to identify STAC hospitals reimbursed under CMS's Inpatient Prospective Payment System (P.L. 105-33 1997). The study utilizes a linked sample of “as submitted” and subsequently “amended,” “restated,” or “as finalized” MCRs for STAC hospitals over a time period spanning the implementation of the ACA. The MCR data used in this study are arranged according to Federal Fiscal Years (FFY). CMS defines its FFY as “cost reports beginning on or after October 1 each year” (ResDAC: FFY). Thus, some hospitals file their MCRs for a year-end other than a calendar year-end (e.g., June 30 or August 31).

The final sample utilized for this evaluation includes 22,755 MCRs submitted by 3,261 hospitals for the fiscal years 2007–2014 (see Table 1). The sample was converted to a panel dataset to allow for evaluating filings subsequent to the “As Submitted” report. Hospitals not submitting a 12-month MCR were dropped, reducing the sample by 3,740 MCRs submitted by 356 hospitals. Eleven hospitals that failed to report full-time equivalents (FTEs) were dropped from the sample. The sample was further restricted to hospitals reporting in both the pre- and post-ACA periods, which removed 333 hospitals and 843 MCRs from the sample. Forty-two hospitals submitted MCRs that were never settled, and the 96 corresponding MCRs were excluded from the sample. Finally, two providers had obvious reporting errors for the number of beds, which were corrected in the sample after confirming the errors and the correct number of beds with ResDAC. For various reasons, such as regulatory appeals, some MCRs do not get settled until all issues have been adjudicated, so we use FFY 2014 as a cut-off for our sample to ensure all MCRs have been fully settled.
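The sketch below mirrors these sample-construction steps in pandas. The file and column names (`report_months`, `fte`, `ffy`, `settled`) are hypothetical placeholders for the ResDAC extract, so this should be read as an outline of the filters rather than the exact code used to build the sample.

```python
# Sketch of the sample filters described above, applied to a hypothetical
# report-level extract with one row per MCR.
import pandas as pd

mcr = pd.read_csv("hcris_reports.csv")  # hypothetical extract

# 1. Keep only 12-month cost reports.
mcr = mcr[mcr["report_months"] == 12]

# 2. Drop hospitals that fail to report full-time equivalents (FTEs).
has_fte = mcr.groupby("provider_id")["fte"].transform(lambda s: s.notna().all())
mcr = mcr[has_fte]

# 3. Require each hospital to report in both the pre-ACA (2007-2011) and
#    post-ACA (2012-2014) periods.
pre_ids = set(mcr.loc[mcr["ffy"].between(2007, 2011), "provider_id"])
post_ids = set(mcr.loc[mcr["ffy"].between(2012, 2014), "provider_id"])
mcr = mcr[mcr["provider_id"].isin(pre_ids & post_ids)]

# 4. Exclude cost reports that were never settled (no NPR issued).
mcr = mcr[mcr["settled"] == 1]

print(len(mcr), mcr["provider_id"].nunique())  # report and hospital counts
```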

TABLE 1

Sample Composition


Table 2 describes the patterns of initial, amended, and reopened MCR submissions during each time period in our sample, broken out across the primary hospital control types. As our focus is on submissions and restatements by hospitals, we remove the status codes indicating settlement of the MCR (status codes 2 and 3), as these codes do not indicate an additional report submission by the hospital. Therefore, the percentages shown for status code “1” (as submitted) will not equal 100 percent. For example, the total proportion shown for the 2007–2011 period (all hospital types) is 60.4 percent, suggesting that three in five hospitals submitted a single MCR that was subsequently settled with or without an audit (status codes 2 or 3). The remaining 40 percent of hospitals either amended (status code 5), reopened (status code 4), or both amended and subsequently reopened their respective MCRs during the pre-ACA time period.

TABLE 2

Report Patterns


For reference, approximately 17 percent of the submitted MCRs in our sample were settled with audit (status code: 3; not tabulated). The result is four status code patterns: status code pattern (1) indicates that the initial submission was settled with or without an audit (i.e., accepted as submitted); status code pattern (1, 5) indicates the report was submitted then amended before it settled; status code pattern (1, 4) indicates the report was submitted, settled, and then reopened; and status code pattern (1, 5, 4) indicates that the report was submitted, amended, and then reopened after settlement.
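The sketch below shows one way these submission patterns can be derived from a hospital-year's chronological status-code history. The file and column names are hypothetical, and collapsing repeated codes (e.g., two amendments mapping to pattern (1, 5)) is our simplifying assumption rather than a procedure described in the text.

```python
# Sketch: collapse each hospital-year's status-code history into one of the
# submission patterns tabulated in Table 2. Settlement codes (2, 3) are dropped
# because they do not represent an additional submission by the hospital.
import pandas as pd

history = pd.read_csv("hcris_status_history.csv")  # hypothetical: one row per status change

def submission_pattern(codes):
    """Map a chronological status-code sequence to a submission pattern tuple."""
    kept = [c for c in codes if c in (1, 4, 5)]  # keep submissions only
    pattern = []
    for c in kept:
        if not pattern or pattern[-1] != c:      # collapse consecutive repeats
            pattern.append(c)
    return tuple(pattern)

patterns = (history.sort_values("status_date")
                   .groupby(["provider_id", "ffy"])["status_code"]
                   .apply(lambda s: submission_pattern(list(s))))

# Shares of patterns (1,), (1, 5), (1, 4), and (1, 5, 4), as in Table 2.
print(patterns.value_counts(normalize=True))
```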

We note that the incidence of multiple submissions (a pattern that includes status codes 4 or 5) increases from about 40 percent in the pre-ACA time period (2007–2011) to nearly 50 percent post-ACA (2012–2014). This pattern generally holds for both the overall sample, as well as for each hospital type subsample. We also note that across the entire period nonprofit hospitals are the most likely to submit multiple MCRs.

Interestingly, hospitals appear much more likely to file amended reports (status code 5) post-ACA, whereas the practice of reopening reports (status code 4) declines post-ACA, especially in the years following implementation (2013–2014). Collectively, the percent of MCR submissions that are reopened stays fairly flat from the pre-ACA period to the implementation year (25.5 versus 26.8 percent) before falling in the years post-implementation (to 15.2 percent, or roughly one in seven MCRs). The decline in the number of MCRs reopened post-ACA appears to be a fairly consistent trend for all hospital control types, although the reduction is fairly minimal for the subset of for-profit hospitals in this univariate analysis. Nonprofit, government, and public hospital district hospitals all reopened significantly fewer MCRs in the post-ACA (2013–2014) period as compared to the other periods.

While the propensity to reopen the MCR dropped collectively post-ACA, hospitals appear much more likely to amend their reports in the years following the passage of the ACA (19.2 percent of reports include status code 5 pre-ACA versus 28.9 percent in 2012 and 40.8 percent in the years post-ACA implementation). The observed increase in filing amended reports occurred fairly consistently among hospital control types. For-profit hospitals and hospital districts were the least likely to file amended MCRs pre-ACA (roughly 13 percent, or one in eight), while nonprofit and government hospitals filed amended reports more often during that time period (roughly 22 percent, or one in five). However, the propensity to file amended MCRs rose steadily through the implementation year to approximately 40 percent (two in five) across all hospital groups in the post-ACA time period (2013–2014).

Table 2 demonstrates a trend toward more frequent amended report submissions in the post-ACA period. As shown in Figure 1, amended reports occur earlier in the process, before an NPR is issued, suggesting that hospitals and MACs may be undertaking a proactive approach to correcting misstated regulatory reports. However, these statistics alone do not address the reason that a report was amended. To further investigate the underlying causes, we examine the probability of four key report entries changing between the initial submission (status code 1) and the final submission that occurs when the report is amended (status code 5) or reopened (status code 4). Specifically, we examine changes in the number of Medicare Days, Total Days, DSH, and Bad Debt. The calculation of the first two of these remained the same after the implementation of the ACA, while the calculation of the latter two changed. Given the changes made to DSH under the ACA, changes to this entry are understandable. Additionally, some hospitals may have dropped below the DSH threshold, thus becoming ineligible to receive DSH payments. Also, with the heightened scrutiny over bad debts and MACs reviewing these claims to ensure the bad debts were written off in the period, hospitals could be reporting less bad debt. Table 3 summarizes the number of these four entries that were changed from the first to the last submission, calculated from the subsample of reports with multiple submissions (i.e., patterns (1, 5), (1, 4), and (1, 5, 4) in Table 2).
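The comparison underlying Table 3 can be sketched as follows: for each restated report, compare the four entries on the first and last submissions and count how many changed. The file, column, and key names below are assumptions about the extract layout.

```python
# Sketch of the Table 3 calculation: count, per restated report, how many of the
# four key entries differ between the first and last submissions.
import pandas as pd

subs = pd.read_csv("hcris_submissions.csv")  # hypothetical: one row per MCR submission
entries = ["medicare_days", "total_days", "dsh", "bad_debt"]

ordered = subs.sort_values("submission_date").groupby(["provider_id", "ffy"])
first, last = ordered[entries].first(), ordered[entries].last()

changed = first != last                  # True where an entry was corrected
n_corrections = changed.sum(axis=1)      # 0-4 corrected entries per report

# Restrict to reports with multiple submissions, then tabulate as in Table 3.
multi = ordered.size() > 1
print(n_corrections[multi].value_counts(normalize=True).sort_index())
```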

TABLE 3

Number of Corrections


Overall, the results in Table 3 show that the number of entries corrected was substantially reduced following the ACA. In the pre-ACA period, 98 percent of the reports that were restated included a change to one or more of these four items, including 17.9 percent that changed all four entries and 48.9 percent that changed three entries. These proportions drop to 8.0 percent and 24.6 percent, respectively, post-ACA (2013–2014).

While Table 3 shows a reduction in the number of entry corrections that occurred with restatements following the ACA, it does not provide details on the changes to each entry item. The statistics for these underlying changes are expanded in the four panels of Appendix B, which show the proportion of reports in which each item (Medicare Days, Total Days, DSH, and Bad Debt) was changed. Panel A shows the portion of multiple-submission reports that result in a change in the entry for Medicare Days. Overall, 85.9 percent of the reports with multiple submissions result in a change in Medicare Days. As mentioned in Section II, this measure is often updated with more current data that are obtained after a time lag, which is apparent when comparing cases where the report is reopened (patterns (1, 4) and (1, 5, 4)) versus those for which the report is amended (pattern (1, 5)) before it is settled. A notable trend occurs across all hospital types: the portion of corrections for the number of Medicare Days drops substantially in the later 2013–2014 post-ACA period for those hospitals that are only amending their report (pattern (1, 5)). Panel B displays the portion of corrected reports that include a corrected number of Total Days in the later submission. Across all hospital types and time periods, 22 percent of corrected reports include an updated entry for Total Days. However, a general downward trend exists, with a drop from 25 percent of reports experiencing a correction in Total Days in the pre-ACA period to 20.1 and 16.8 percent correction rates in the 2012 and 2013–2014 sample periods, respectively. Panels C and D also show that a substantially smaller proportion of reports correct entries for Bad Debt and DSH following the ACA.

Overall, a synthesis of Tables 2 and 3 and Appendix B suggests a trend toward corrections occurring before the MCR is settled and the NPR is issued. These tables suggest a post-ACA trend toward fewer entry item corrections that occur via amended (submission pattern (1, 5)) reports earlier in the process. While the proportion of reports with a correction for the entry for Medicare Days remained relatively constant in the pre- and post-ACA periods, a substantial decrease in the proportion of reports correcting the other three entries (Total Days, Bad Debt, and DSH) occurred.

Table 4 provides descriptive statistics for additional variables in our sample. The mean bed count across the hospital-year observations is 222 and the mean number of full-time equivalent employees is approximately 1,300. Just over 17 percent of observations are classified as teaching hospitals and approximately three-quarters are designated as part of a chain organization. The proportion of chain hospitals is in line with the AHA statistics for hospital systems according to the 2019 “Fast Facts on U.S. Hospitals” release (roughly two-thirds per this publication). While the AHA and CMS system/chain designations are not identical, they are similar, so the composition of our sample is in line with expectations based on the U.S. hospital landscape. Approximately 25 percent of the sample is for-profit hospitals and 60 percent is nonprofit hospitals, with the remainder composed of city, county, state, or federal government hospitals (11 percent) or hospital districts (4 percent). Table 4 also shows the proportion of MCRs that are restated, which we note is highest for nonprofit hospitals and is identical to the sum of the shares of the (1, 4), (1, 5), and (1, 5, 4) status code patterns shown in Table 2.

TABLE 4

Descriptive Statistics


Table 5, Panel A provides the initial results of the linear regressions with fixed effects at the hospital level (Equation (1)) for RQ1. The first column presents the full sample with hospital-type dummies included. The results indicate that the ACA (2012–2014) regulatory change has a positive and significant relationship with the number of MCR submissions. The magnitude of the regulatory effect in the baseline model is equal to 0.049 reports. This effect remains positive and significant when each hospital-type subsample is examined separately in columns (2) through (5). Additionally, Medicare intensity has a consistently significant and negative relationship with MCR submissions, indicating that the more Medicare business a hospital conducts, the fewer MCRs it files. However, this is not the case for hospital-district hospitals (column 5), which suggests Medicare intensity is not a driver of reporting accuracy for these hospitals.

TABLE 5

Number of Report Submissions—Fixed-Effects Regressions


The number of beds and employees, as well as total revenue, serve as controls for scale in the model. Larger, more complex organizations may face additional coordination issues in their reporting structure but may also possess employees specializing in the reporting process. Overall, the results for the full sample in Table 5, Panel A are mixed, with a negative and significant effect for the natural log of beds but a positive and significant effect for the natural log of total revenue. In both cases, significance varies across the hospital-type subsamples in columns (2) through (5). The final scale effect, the natural log of employees, is insignificant in all cases except the for-profit subsample. Last, the indicators for teaching and chain hospitals are also insignificant for the full sample and the majority of subsamples. Table 5 also reports the within R-squared measures for the respective columns. These values represent the portion of the variance explained by the regressors beyond the fixed effects. The relatively low values suggest that much of the variation in the reporting process remains unexplained. While this is not unexpected in an investigation of reporting error rates, we nonetheless note this as a limitation of the analysis.

The results in Table 5, Panel B examine the regulatory change across time by separating the implementation year, ACA (2012), and the post-ACA period, ACA (2013–2014). The regulatory effects, represented by the coefficients on the ACA (2012) and ACA (2013–2014) indicators, are positive and significant in both periods, albeit smaller in magnitude in the latter period. Such a result suggests a period of adjustment following the implementation, which ultimately led to the number of restatements subsiding somewhat as many hospitals became familiar with the new reporting process.

The remaining columns of Table 5, Panel B indicate that the marginal influence of regulatory change on the number of submitted reports was largest for hospital districts in both periods (column 5). Across the subsamples, the significance and magnitude of the regulatory effect is greater in the immediate period, ACA (2012), for nonprofit, government, and hospital district hospitals. However, the impact is larger and significant in the later period, ACA (2013–2014), in the case of for-profit hospitals shown in the second column.

These results were further confirmed via Poisson regression analysis with fixed effects at the hospital level, shown in Table 6, Panels A and B. Overall, the results remain similar in sign and significance, albeit with some differences in magnitude, to the previous linear regression results presented in Table 5. The pseudo R-squared measures reported in Table 6 are calculated as in McFadden (1973). Altogether the results of these analyses suggest that the number of reports submitted by hospitals is influenced by the regulatory change imposed by the ACA.

TABLE 6

Number of Report Submissions—Poisson Regressions


Table 7 shows the average marginal effects of the logit regression (Equation (2)), calculated with independent variables similar to those in Tables 5 and 6. The dependent variable indicates whether a hospital restated its MCR for a given year (0 = no restatement; 1 = restated), where a hospital is considered to have restated its MCR if it either submitted an amended MCR (status code 5) or reopened the MCR (status code 4). Overall, the results suggest that the regulatory change led to a greater likelihood of restatements. Panel A of Table 7 examines the entire ACA period, ACA (2012–2014). Specifically, the results estimate an increase in the likelihood of restatement of 11.9 percent, ceteris paribus, for the overall sample post-ACA. The regulatory impact is significant at the 5 percent or better level for all subsamples. Medicare intensity and the scale effects of beds and total revenue have similar signs and levels of significance as in the previous regressions in Tables 5 and 6. The sample size declines because hospitals whose restatement outcome does not vary within the panel are dropped (i.e., perfect prediction of restatement within a panel).

TABLE 7

MCR Restatement—Logit Regressions


Panel B of Table 7 shows that the regulatory effect on MCR restatements is slightly larger in the latter post-ACA period, ACA (2013–2014), as compared to the implementation year, ACA (2012). The statistical significance remains at the p < 0.01 level for both periods. Columns (2) through (5) indicate that the timing of the regulatory influence is different across the hospital-type subsamples. The results suggest that for-profit and nonprofit hospitals are more likely to restate their MCRs to correct reporting errors in the later implementation years. In the case of government and hospital-district hospitals, the regulatory influence on the restatement of MCRs is larger and more significant in the period immediately following implementation of the ACA (2012). Governmental and district hospitals are public entities and uniquely answerable to their constituencies. Thus, these types of hospitals may be more immediately focused on adjusting their reporting processes in the wake of new regulations.

Overall, the findings on the probability of restating MCRs, combined with the previous results regarding the number of MCR submissions, suggest that hospitals may be more likely to file one or more restatements following the ACA's implementation. However, the timing of this propensity to restate differs across the hospital-type subsamples. For for-profit hospitals, the number of submissions and the probability of a restatement increase only in the later 2013–2014 time period. The results for nonprofit hospitals are mixed. The number of amended or reopened reports rose in the first year of implementation before declining in the years following implementation as hospitals became more familiar with the regulations. However, the overall probability of at least one restatement increased in the later post-ACA period, ACA (2013–2014), suggesting that nonprofit hospitals were more likely to restate their MCRs after the ACA was implemented. In the case of government and hospital-district hospitals, the effect on the number of MCR submissions and the probability of one or more restatements is significant in the immediate ACA period, ACA (2012), but then falls in magnitude and is insignificant in the latter ACA period.

The prior section notes an increase in the likelihood of a restatement of an MCR post-ACA. As depicted in Figure 1, this restatement could occur prior to settlement in the form of an amended MCR (status code 5) or, alternatively, following settlement if the MCR is reopened (status code 4). The prior results suggest a period of learning that occurs around the implementation of the ACA, resulting in additional restatements to the MCRs submitted by hospitals. However, the submission patterns described in Table 2 indicate a decline in reopened MCRs when examined in a univariate framework. Thus, we expand our analysis in this subsection to investigate the amending and reopening of MCRs separately.

To conduct this investigation, we utilize a conditional logit framework similar to that in Equation (2) but substitute distinct indicators for amend or reopen. In Panel A of Table 8 the dependent variable is equal to 1 when the MCR has been amended, and is 0 otherwise. The results presented include the two distinct ACA periods, ACA (2012) and ACA (2013–2014). Overall, we find results similar in sign to the combined analysis of restatements in Table 7, Panel B. However, the magnitude of the marginal effect is substantially larger in both post-ACA periods for the entire sample of amended reports shown in column (1). Similarly, for hospital-type subsamples, the magnitude of the average marginal effect is larger for both ACA periods and all hospital types, except for ACA (2012) in the case of government and hospital-district hospitals.

TABLE 8

MCR Amend and Reopen—Logit Regressions


Panel B of Table 8 presents the results of a similar model that uses an indicator for reopen (status code 4) as the dependent variable. The results are substantially different as compared to those from the models using restated or amended reports. The average marginal effect of ACA (2013–2014) is negative and significant for the overall sample and all hospital-type subsamples. The effect of ACA (2012) is also negative and significant in the case of government and hospital-district hospitals, but insignificant for the overall sample and the for-profit and nonprofit subsamples. Overall, this additional analysis suggests that the impact of the ACA on the regulatory reporting process is nuanced. There was an overall increase in the number of reports and an increased probability of restating MCRs. However, the overall post-ACA increase in amended MCRs was partially offset by a decline in reopened MCRs.

This investigation provides insights into the influence of regulatory change that could assist MCR preparers and reviewers, regulators, and academics. Specifically, these analyses highlight consequences of regulatory change (i.e., increased reporting errors). As such, regulators should be mindful that future regulation can have consequences for regulatory reporting processes and accuracy. This investigation identifies patterns of reporting behavior observed during a period of significant regulatory change; the results may assist regulators in understanding how hospitals and other preparers respond to such changes. In addition, this investigation helps academics further understand the constructs within CMS reimbursement and the implications for future research on the effects of regulatory change on for-profit, nonprofit, and governmental hospitals. Hospitals appear to be much more likely to file amended MCRs post-ACA, but also appear to be making only minor corrections to these amended reports. On the other hand, hospitals are less likely to reopen MCRs in the post-implementation years (2013–2014), suggesting that hospitals may be trying to avoid this costly and time-consuming process by more proactively submitting amended reports to correct regulatory reporting errors.

Overall, the findings suggest that hospitals are diligent in their efforts to submit accurate MCR data, highlighting the impact of regulation on hospitals' MCR regulatory reporting processes. MCR preparers and reviewers appear to recognize that significant regulatory changes may impair reporting accuracy and to have adjusted their practices accordingly. Regulators and academics who use MCR data in practice-based and academic research should be aware that reporting accuracy may differ across reporting periods, especially during periods of regulatory change.

This analysis is subject to a few limitations. First, the MCR information is self-reported. Kane and Magnus (2001) note the need for additional focus on accuracy given the reduced reliability of the self-reported data contained in the annual filings, and self-reported data have been identified as a concern in empirical research because they may limit the probative value of research results (V. Richman and A. Richman 2012). For example, although the MAC review resembles a formal audit, the MAC does not audit the financial statement information reported in the MCR. Additionally, missing data for variables of interest limit the potential generalizability of these results.

Future research could extend this study by evaluating hospital financial performance surrounding the implementation of the ACA. Additional research may also develop a more robust understanding of the factors that preserve regulatory reporting accuracy during periods of regulatory change. Additionally, comparing the financial data disclosed in the Medicare cost report with other sources of financial data (e.g., annual audited financial statements and IRS Form 990) to identify additional irregularities in reporting could extend our understanding of reporting error. An analysis of this nature may provide additional insight for regulators as they evaluate the need for a centralized database of audited hospital financial reports.

REFERENCES

Allison, M. 2013. The effect of public policy on health service providers. Economics Dissertations Paper 99. Available at: https://surface.syr.edu/ecn_etd/99
American Hospital Association (AHA). 2019. Fast facts on U.S. hospitals.
Barnes, B. G., and N. L. Harp. 2018. The U.S. Medicare disproportionate share hospital program and capacity planning. Journal of Accounting and Public Policy 37 (4): 335–351.
Barniv, R., K. Danvers, and J. Healy. 2000. The impact of Medicare capital prospective payment regulation on hospital capital expenditures. Journal of Accounting and Public Policy 19 (1): 9–40.
Brand, C. A., A. L. Barker, R. T. Morello, M. R. Vitale, S. M. Evans, I. A. Scott, J. U. Stoelwinder, and P. A. Cameron. 2012. A review of hospital characteristics associated with improved performance. International Journal for Quality in Health Care 24 (5): 483–494.
Cardinaels, E., and N. Soderstrom. 2013. Managing in a complex world: Accounting and governance choices in hospitals. European Accounting Review 22 (4): 647–684.
Centers for Medicare & Medicaid Services (CMS). 2017. National health expenditure data (NHE) tables.
Centers for Medicare & Medicaid Services (CMS). 2019a. National health expenditure data (NHE) fact sheet.
Centers for Medicare & Medicaid Services (CMS). 2019b. National health expenditure data (NHE) table 1—National health expenditures; aggregate and per capita amounts. Available at: https://www.cms.gov/Research-Statistics-Data-and-Systems/Statistics-Trends-and-Reports/NationalHealthExpendData/Downloads/Tables.zip
Centers for Medicare & Medicaid Services (CMS). 2019c. National health expenditure data (NHE) table 3—National health expenditures, by source of funds.
Chaikind, H. R. 2012. ACA: A Brief Overview of the Law, Implications, and Legal Challenges. Washington, DC: Congressional Research Service.
Chang, K., and G. H. Chang. 2017. An examination of economic efficiency in California hospitals. Journal of Governmental & Nonprofit Accounting 6 (1): 30–51.
Cowles, C. M. 1991. Review effect on cost reports: Impact smaller than anticipated. Health Care Financing Review 12 (3): 21–25.
Cubanski, J., T. Neuman, and M. Freed. 2019. The facts on Medicare spending and financing.
Dalton, K., E. C. Norton, and K. Kilpatrick. 2001. A longitudinal study of the effects of graduate medical education on hospital operating costs. Health Services Research 35 (6): 1267–1291.
Eldenburg, L. G., and S. Kallapur. 1997. Changes in hospital service mix and cost allocations in response to changes in Medicare reimbursement schemes. Journal of Accounting and Economics 23 (1): 31–51.
Eldenburg, L. G., and C. C. Vines. 2004. Nonprofit classification decisions in response to a change in accounting rules. Journal of Accounting and Public Policy 23 (1): 1–22.
Eldenburg, L. G., H. A. Krishnan, and R. Krishnan. 2017. Management accounting and control in the hospital industry: A review. Journal of Governmental & Nonprofit Accounting 6 (1): 52–91.
Gardner, D. 2013. ACA implementation: A vulnerable and misunderstood endeavor. Nursing Economics 31 (6): 307–309.
Holzhacker, M., R. Krishnan, and M. D. Mahlendorf. 2015. The impact of changes in regulation on cost behavior. Contemporary Accounting Research 32 (2): 534–566.
Kane, N. M., and S. A. Magnus. 2001. The Medicare cost report and the limits of hospital accountability: Improving financial accounting data. Journal of Health Politics, Policy and Law 26 (1): 81–106.
Kennedy, F. A., L. L. Burney, J. L. Troyer, and J. C. Stroup. 2010. Do non-profit hospitals provide more charity care when faced with a mandatory minimum standard? Evidence from Texas. Journal of Accounting and Public Policy 29 (3): 242–258.
Krishnan, R., and M. H. Yetman. 2011. Institutional drivers of reporting decisions in nonprofit hospitals. Journal of Accounting Research 49 (4): 1001–1039.
Langabeer, J. R., II, J. L. DelliFraine, and J. R. Helton. 2010. Mixing finance and medicine: The evolution of financial practices in healthcare. HFM Magazine 92 (6): 27–34.
Longest, B. B., Jr. 2012. Management challenges at the intersection of public policy environments and strategic decision making in public hospitals. Journal of Health and Human Services Administration 35 (2): 207–230.
Lynch, L. J. 2003. The effect of Medicare capital prospective payment regulation: Additional evidence from hospital financing decisions. Journal of Accounting and Public Policy 22 (2): 151–173.
McFadden, D. 1973. Conditional logit analysis of qualitative choice behavior. In Frontiers in Econometrics, edited by P. Zarembka, 105–142. New York, NY: Academic Press.
McGowan, M. M., S. H. Chan, Y. V. Yurova, C. Liu, and R. M. K. Wong. 2018. The influence of institutional regulatory pressure on nonprofit hospital audit quality. Journal of Governmental & Nonprofit Accounting 7 (1): 1–23.
Navathe, A. S., K. G. Volpp, R. T. Konetzka, M. J. Press, J. Zhu, W. Chen, and R. C. Lindrooth. 2012. A longitudinal analysis of the impact of hospital service line profitability on the likelihood of readmission. Medical Care Research and Review 69 (4): 414–431.
Pearson, S. D., and P. B. Bach. 2010. How Medicare could use comparative effectiveness research in deciding on new coverage and reimbursement. Health Affairs 29 (10): 1796–1804.
Research Data Assistance Center (ResDAC). Federal Fiscal Year (FFY). Centers for Medicare & Medicaid Services. Available at: https://resdac.org/articles/creation-fiscal-year-cost-report-files
Richman, V., and A. Richman. 2012. A tale of two perspectives: Regulation versus self-regulation. A financial reporting approach (from Sarbanes-Oxley) for research ethics. Science and Engineering Ethics 18 (2): 241–246.
Rosenbaum, S. 2011. The Patient Protection and Affordable Care Act: Implications for public health policy and practice. Public Health Reports 126 (1): 130–135.
Wilcox-Gök, V. 2002. The effects of for-profit status and system membership on the financial performance of hospitals. Applied Economics 34 (4): 479–489.
Wilson, A. B., B. J. Kerr, N. D. Bastian, and L. V. Fulton. 2012. Financial performance monitoring of the technical efficiency of critical access hospitals: A data envelopment analysis and logistic regression modeling approach. Journal of Healthcare Management 57 (3): 200–213.
APPENDIX A

Variable Descriptions

APPENDIX B

Change from Initial Submission
1. Amended MCRs are reports that are submitted to the Medicare Administrative Contractor (MAC) prior to a Notice of Program Reimbursement (NPR) being issued. Reopened MCRs are reports that must be approved by the MAC for acceptance and are related to item(s) noted on the audit adjustments issued when a report is final settled or "NPR'd."

2. Audits are conducted annually based on individual contractual arrangements between CMS and the respective MACs. According to CMS, audits are conducted using "a statistically valid stratified random sample of claims" (CMS 2019a). The exact terms and nature of the sampling stratification are not precisely known.

3. In the case of multiple resubmissions (i.e., pattern (1, 5, 4)), we compare the first and last submissions.
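For readers reconciling the "change from initial submission" comparison in note 3 and Appendix B, a minimal sketch of one way to implement the first-versus-last comparison is shown below. The file and column names (mcr_status_changes.csv, rpt_rec_num, submission_seq, total_costs) are hypothetical and do not reflect the authors' data layout.

```python
# Illustrative only: for MCRs filed more than once (e.g., status pattern
# 1 -> 5 -> 4), keep each report's first and last submissions and compare them.
import pandas as pd

status = pd.read_csv("mcr_status_changes.csv")  # hypothetical status-change panel
status = status.sort_values(["rpt_rec_num", "submission_seq"])

# Difference between the last and first reported value for each cost report
by_report = status.groupby("rpt_rec_num")["total_costs"]
change = (by_report.last() - by_report.first()).rename("change_from_initial")
print(change.describe())
```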