Global stakeholders have expressed interest in increasing the use of data analytics throughout the audit process. While data analytics offer great promise in identifying audit-relevant information, auditors may not use this information to its full potential, resulting in a missed opportunity for possible improvements to audit quality. This article summarizes a study by Koreff (2022) that examines whether conclusions from different types of data analytical models (anomaly versus predictive) and data analyzed (financial versus non-financial) result in different auditor decisions. Findings suggest that when predictive models are used and identify a risk of misstatement, auditors increase budgeted audit hours more when financial data are analyzed than when non-financial data are analyzed. However, when anomaly models are used and identify a risk of misstatement, auditors' budgeted hours do not differ based on the type of data analyzed. These findings provide evidence that different data analytics do not uniformly impact auditors' decisions.

In this article, we summarize a study that attempts to explain how auditors' prior experience using different types of analyses impacts how they respond to conclusions drawn from different data analytical models, depending on the type of data analyzed (Koreff 2022).

Despite advances in technology that enable accounting firms to develop more sophisticated data analytics to identify audit-relevant information and potentially improve audit quality, auditors' use of these tools is often inconsistent. Reasons include concerns over inspection risk (Eilifsen, Kinserdal, Messier, and McKee 2020), the Public Company Accounting Oversight Board (PCAOB) not explicitly requiring the use of these tools (PCAOB 2021b), and the restrictive nature of the technology (Dowling and Leech 2014). Koreff (2022) shows that even when the same output is presented, auditors' experience (familiarity) with the combination of the type of model and data used to arrive at that conclusion can result in inconsistent decision making.

The interview data in Koreff (2022) show that auditors report a comparable amount of experience analyzing financial and non-financial data when using anomaly models, which explains why decisions do not differ when auditors use anomaly models that analyze different data. Thus, when firms develop more advanced anomaly-based analytics, the type of data analyzed is not expected to result in inconsistent auditor decision making. However, the same cannot be said for predictive analytics, as interviewees reported that predictive analytics tend to focus on financial rather than non-financial data. Accordingly, Koreff (2022) demonstrates that when predictive analytics are used, auditors are more likely to incorporate the findings into their decisions when financial data are analyzed than when non-financial data are analyzed. These findings are in line with the PCAOB's data and technology research project, which expresses a concern that auditor experience with, and understanding of, analytics are important factors in the effective use of these tools (PCAOB 2021a) and, ultimately, in the improvement of audit quality.

Taken together, Koreff (2022) observes that two attributes of analytics, the model and the data, do not individually impact auditors' decisions. However, the combination of these two attributes does impact auditors' decisions.

Advances in technology have resulted in the development of data analytical tools that can perform a range of analyses, such as population testing, identifying outliers based on specified criteria, predictive modeling, and analysis of non-traditional unstructured data. In fact, the American Institute of Certified Public Accountants (AICPA) Assurance Services Executive Committee (ASEC) has developed a "Guide to Audit Data Analytics" that suggests data analytics are an outgrowth and expansion of analytical procedures (AICPA 2015; Appelbaum, Kogan, and Vasarhelyi 2017; AICPA 2017). Furthermore, Statement on Auditing Standards (SAS) 142 (titled "Audit Evidence") permits auditors to use automated tools and techniques to enhance the evaluation of audit evidence, including the analysis of non-financial data.1

Although data analytics can be seen as an extension of analytical procedures (Appelbaum et al. 2017), auditors do not always use analytical procedures effectively (PCAOB 2007, 2008, 2013, 2014; Barr-Pulliam, Brazel, McCallen, and Walker 2020; Brazel, Leiby, and Schaefer 2022a; Cao, Duh, Tan, and Xu 2022). As an additional barrier to consistent implementation, PCAOB standards do not require the use of data analytics (PCAOB 2021b). Documented shortcomings include users not considering risks beyond those the analytics identified (Seow 2011) and not properly evaluating false positives (Koreff, Weisner, and Sutton 2021). Auditors prefer simpler analytics, such as comparing current-year balances to prior-year balances, and thus may be reluctant to use more sophisticated analytics (Ameen and Strawser 1994; Trompeter and Wright 2010; Schmidt, Riley, and Church 2020b; Schmidt, Church, and Riley 2020a; Brazel et al. 2022b). Yet the PCAOB encourages the use of these tools to improve the audit process and audit quality (PCAOB 2016, 2018). One way to promote auditors' use of analytics may be to provide auditors with analytics that use familiar analyses.

When auditors use familiar analyses, they are expected to experience cognitive fit. Cognitive fit refers to the congruence between the process used by a decision maker and the decision-aiding tool (Vessey and Galletta 1991; Al-Natour, Benbasat, and Cenfetelli 2008). Auditors will experience greater cognitive fit with data analytics that use combinations of analytical models and data types with which they are more familiar, since cognitive fit is correlated with experience (Dunn and Grabski 2001; Goodhue and Thompson 1995).2 Data analytics can analyze a multitude of data types, but auditors will experience different levels of cognitive fit depending on their experience with the analyses the analytics employ (i.e., the combination of model and data). Thus, when auditors view the results of an analytic that uses familiar analyses, they will experience greater cognitive fit with that analytic and therefore be more likely to incorporate its results into their decision-making process.

Koreff (2022) examined two analytical models: anomaly models and predictive models. Anomaly models perform a distributional (bell curve) analysis to identify outliers (SAS Institute 2014). Predictive models analyze patterns associated with previously identified issues and compare them with current patterns (Kuenkaikaew and Vasarhelyi 2013). Koreff (2022) illustrates that auditors' experience using these two types of models does not differ substantially. As a result, Koreff (2022) predicts that auditors' cognitive fit will depend not only on the analytical model used, but also on the data analyzed by the model.
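
To make the distinction concrete, the two model types can be sketched in a few lines of Python. Everything below (the data, features, and thresholds) is hypothetical and illustrative; it is not drawn from Koreff (2022) or from any firm's actual tool.

import numpy as np
from sklearn.linear_model import LogisticRegression

# Anomaly model: flag values that fall far from the center of the distribution.
# Hypothetical monthly gross margins; 0.12 is a planted outlier.
margins = np.array([0.41, 0.39, 0.43, 0.40, 0.42, 0.12, 0.41, 0.38])
z_scores = (margins - margins.mean()) / margins.std()
print("Outliers:", margins[np.abs(z_scores) > 2])  # beyond 2 std. deviations

# Predictive model: score current-year patterns against previously labeled
# issues. Hypothetical features: period-end entry count, round-amount ratio;
# label 1 = a misstatement was previously identified.
X_history = np.array([[12, 0.05], [40, 0.30], [15, 0.08], [38, 0.25]])
y_history = np.array([0, 1, 0, 1])
clf = LogisticRegression().fit(X_history, y_history)
print("Risk score:", clf.predict_proba(np.array([[35, 0.28]]))[0, 1])

Note how the anomaly sketch uses only current data, while the predictive sketch depends on historical, labeled patterns; this mirrors the manipulation descriptions in Appendix B.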

Koreff (2022) assessed two types of data: financial data and non-financial data. Predictive models focus primarily on analyzing financial data (Dechow, Ge, Larson, and Sloan 2011; Sinclair 2015; Perols, Bowen, Zimmermann, and Samba 2017), whereas anomaly models are more capable of analyzing both types of data (Glover, Prawitt, and Wilks 2005; Hobson, Mayew, and Venkatachalam 2012; Brazel, Jones, and Prawitt 2014). See Figure 1 for a graphical depiction of auditors' experiences using the four combinations of the different types of analyses. The depiction in Figure 1 suggests that auditors have comparable experience using predictive and anomaly models (hence the two bars rising to the same level), yet they overwhelmingly use predictive analytics to analyze financial data rather than non-financial data. This lack of experience using predictive analytics to analyze non-financial data is expected to result in auditors resisting the incorporation of results from this combination of model and data into their decisions. The same cannot be said for anomaly models, as auditors' experience using financial and non-financial data is approximately the same (hence the more balanced split in the bar on the right side of the graph).

As a result, considering only the type of model or the type of data used by an analytic, rather than the combination of the two, could paint an incomplete picture of auditors' willingness to use the findings of analytics in their decisions. This difference in experience is expected to impact auditors' cognitive fit and, in turn, their decision making. When predictive models identify a risk of misstatement, auditors are expected to increase budgeted audit hours more (and presumably see a greater improvement in audit quality) when financial data are analyzed than when non-financial data are analyzed. When anomaly models identify a risk, no such difference is expected.

Koreff (2022) employed an experiment to test these expectations; participants were 98 auditors of all ranks employed by firms of various sizes.3 Follow-up interviews were conducted with 26 of the auditors who completed the experiment to obtain insights on their experiences using different types of analytics (described in Figure 1).

Participants were provided with background information related to their role as an in-charge auditor of a privately held, mid-sized sporting equipment manufacturer. Participants were told that their firm's Central Data Analytics Group had identified a potential misstatement with an estimated range whose upper end just exceeded performance materiality of $304,000. The conclusion stated that the use of predictive/anomaly models to analyze journal entries/e-mails indicated a 56 percent risk that revenue was overstated by an amount between $270,000 and $310,000. As such, the risk identified was held constant; only the process used to arrive at that risk varied.4

The experiment manipulated the type of analytical model used (predictive or anomaly) and the type of data analyzed (financial or non-financial).5 See Appendix B for specific descriptions of these manipulations. The participants were asked: “Assume 30 hours were initially budgeted to audit revenue. How would you adjust the budgeted hours for the revenue account in percentages (every 5 percent change results in a change of 1.5 hours)?”
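
The 5 percent/1.5 hour mapping follows directly from the 30-hour base budget (0.05 × 30 = 1.5 hours). A minimal sketch of the conversion, with an illustrative response:

def adjusted_budget(base_hours: float, pct_change: float) -> float:
    # Convert a percentage response into budgeted audit hours.
    return base_hours * (1 + pct_change / 100)

# A 20 percent increase equals four 5-percent steps of 1.5 hours each.
print(adjusted_budget(30, 20))  # 36.0 hours (30 + 6)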

Koreff (2022) illustrated that auditors with experience using analytics report comparable experience using anomaly models and predictive models when answering, "How experienced are you in using data analytics that identify statistical outliers such as unusually high/low fluctuations or ratios (anomaly models) as part of your job function?" and "How experienced are you in using data analytics that compare current data against previously identified issues/occurrences to identify similarities (predictive models) as part of your job function?" Both questions were measured on five-point Likert scales with endpoints of 1 = "Not at all experienced" and 5 = "Extremely experienced." No significant difference was identified between these measures, with means of 2.590 (for anomaly models) and 2.559 (for predictive models).
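
The comparison of the two experience ratings can be illustrated with a simple paired test. The responses below are made up for illustration; they are not the study's data, and Koreff (2022) may have used a different test statistic.

from scipy import stats

# Hypothetical 5-point Likert responses from the same participants to the
# anomaly-model and predictive-model experience questions.
anomaly_exp = [3, 2, 3, 2, 3, 2, 3, 3, 2, 3]
predictive_exp = [3, 2, 2, 3, 3, 2, 3, 2, 3, 3]
t, p = stats.ttest_rel(anomaly_exp, predictive_exp)  # paired t-test
print(f"t = {t:.3f}, p = {p:.3f}")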

Results in Koreff (2022) also showed that the type of model used and the type of data analyzed did not individually impact auditors' determination of budgeted audit hours; however, budgeted hours were impacted by the combination of these two factors. See Figure 2 for a graphical depiction of the results. When employing predictive analytics, auditors increased their budgeted hours more when financial data were used than when non-financial data were used (a 19.48 percent versus an 11.38 percent increase, p = 0.01). However, when anomaly models were used, Koreff (2022) observed no statistically significant difference in auditors' responses across the two data types (an 18.42 percent versus a 14.16 percent increase, p > 0.10).6 Additionally, when financial data were analyzed, auditors increased budgeted audit hours more when predictive models were used (a 19.48 percent versus a 14.16 percent increase, p = 0.09). Conversely, when analytics used non-financial data, auditors increased budgeted audit hours more when anomaly models were used (an 18.42 percent versus an 11.38 percent increase, p = 0.07).
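
The pattern just described is a crossover interaction: neither factor matters on its own, but their combination does. Below is a sketch of how such a model × data interaction could be tested, using made-up responses; Koreff (2022) itself reports ANCOVA results (see footnote 6), and nothing here reproduces the study's data.

import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical percentage-increase responses mirroring the 2 x 2 design.
resp = pd.DataFrame({
    "model_type": ["predictive"] * 4 + ["anomaly"] * 4,
    "data_type": ["financial", "financial", "nonfinancial", "nonfinancial"] * 2,
    "pct_increase": [20, 19, 11, 12, 14, 15, 18, 19],
})
fit = smf.ols("pct_increase ~ model_type * data_type", data=resp).fit()
print(sm.stats.anova_lm(fit, typ=2))  # model_type:data_type row tests the interaction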

For additional insights, we conducted additional analyses replicating the primary results in Koreff (2022) while adding control variables for auditor age, years of audit experience, years of professional experience, title, and prior experience using data analytics. In all cases, the primary results of Koreff (2022) hold. We also considered the possibility that industry expertise impacted auditors' use of the analytics by controlling for the percentage of time auditors spend auditing manufacturing clients and an indicator for whether the auditor audits any manufacturing clients. These variables did not significantly impact results, which remain consistent with the main results of Koreff (2022). Finally, we conducted an analysis including only auditors employed by national and international firms. The primary results remained supported, consistent with the results reported by Koreff (2022).
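
Such robustness checks amount to an ANCOVA: the same interaction model with covariates added to the right-hand side. A sketch follows; the data and covariate names are again hypothetical, not the variables from our analyses.

import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical data: the 2 x 2 cells plus illustrative control variables.
resp = pd.DataFrame({
    "model_type": ["predictive"] * 4 + ["anomaly"] * 4,
    "data_type": ["financial", "financial", "nonfinancial", "nonfinancial"] * 2,
    "pct_increase": [20, 19, 11, 12, 14, 15, 18, 19],
    "audit_years": [4, 8, 6, 15, 3, 10, 7, 20],
    "analytics_exp": [2, 3, 2, 4, 3, 2, 3, 4],
})
fit = smf.ols(
    "pct_increase ~ model_type * data_type + audit_years + analytics_exp",
    data=resp,
).fit()
print(fit.params)  # inspect whether the interaction term persists with controls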

Koreff (2022) conducted interviews of auditors who completed the experiment to provide additional insights into auditors' varying levels of experience using different types of analytics. When asked about prior experience using predictive analytics (specifically, "How would you describe the amount of experience you had using predictive analytics that analyzed financial vs. non-financial data?"), interviewees generally reported greater experience analyzing financial versus non-financial data. When asked about prior experience using anomaly analytics (specifically, "How would you describe the amount of experience you had using anomaly analytics that analyzed financial vs. non-financial data?"), auditors generally reported comparable experience analyzing financial and non-financial data.

Despite the promise that data analytics have for improving the audit process, simply providing these tools to decision makers is insufficient to induce adoption (Messier 1995; Venkatesh et al. 2003; Schmidt et al. 2020a; Schmidt et al. 2020b). Although firms are developing more advanced analytics, the results of Koreff (2022) suggest that auditors may not use these tools consistently. Effective implementation of these data analytics should account for auditors' prior experiences related to combinations of analytical models and the data processed by these models.

The findings of Koreff (2022) suggest that even if a firm deems new analytics effective, it still needs to be cognizant of auditors' lack of experience using the underlying analysis as a barrier to adoption (and, potentially, to improving audit quality). Although auditors have comparable experience using the two types of analytics examined by Koreff (2022), consideration of the type of data these models tend to analyze reveals a disparity in how much time auditors spend analyzing different data with each type of model. While predictive analytics tend to focus on analyzing financial data, auditors reported that anomaly models incorporate a more balanced mix of financial and non-financial data. This disparity ultimately impacts auditors' decisions. Therefore, public accounting firms should train their employees on how predictive models can be effective using both financial and non-financial data to encourage consistent decision making. Firms should consider appropriately matching analytical models to the data being analyzed, or determine ways to ensure that auditors' experiences with the different model/data combinations employed in practice do not vary substantially (e.g., through training sessions illustrating the use of analytic tools).

REFERENCES

Al-Natour, S., I. Benbasat, and R. T. Cenfetelli. 2008. The effects of process and outcome similarity on users' evaluations of decision aids. Decision Sciences 39 (2): 175–211.

Ameen, E. C., and J. R. Strawser. 1994. Investigating the use of analytical procedures: An update and extension. Auditing: A Journal of Practice & Theory 13 (2): 69–76.

American Institute of Certified Public Accountants (AICPA). 2015. Audit Data Standards—Base Standard. New York, NY: AICPA.

American Institute of Certified Public Accountants (AICPA). 2017. Guide to Audit Data Analytics. New York, NY: AICPA.

Appelbaum, D., A. Kogan, and M. A. Vasarhelyi. 2017. Big data and analytics in the modern audit engagement: Research needs. Auditing: A Journal of Practice & Theory 36 (4): 1–27.

Barr-Pulliam, D., J. F. Brazel, J. McCallen, and K. Walker. 2020. Data analytics and skeptical actions: The countervailing effects of false positives and consistent rewards for skepticism. Working paper, University of Louisville, North Carolina State University, University of Georgia, and Virginia Tech.

Brazel, J. F., K. L. Jones, and D. F. Prawitt. 2014. Auditors' reactions to inconsistencies between financial and nonfinancial measures: The interactive effects of fraud risk assessment and a decision prompt. Behavioral Research in Accounting 26 (1): 131–156.

Brazel, J. F., J. Leiby, and T. Schaefer. 2022a. Do rewards encourage professional skepticism? It depends. The Accounting Review 97 (4): 131–154.

Brazel, J. F., K. L. Jones, and Q. Lian. 2022b. Auditor use of benchmarks to assess fraud risk: The case for industry data. Working paper, North Carolina State University and The University of Kansas.

Cao, T., R.-R. Duh, H.-T. Tan, and T. Xu. 2022. Enhancing auditors' reliance on data analytics under inspection risk using fixed and growth mindsets. The Accounting Review 97 (3): 131–153.

Dechow, P. M., W. Ge, C. R. Larson, and R. G. Sloan. 2011. Predicting material accounting misstatements. Contemporary Accounting Research 28 (1): 17–82.

Dowling, C., and S. A. Leech. 2014. A Big 4 firm's use of information technology to control the audit process: How an audit support system is changing auditor behavior. Contemporary Accounting Research 31 (1): 230–252.

Dunn, C., and S. V. Grabski. 2001. An investigation of localization as an element of cognitive fit in accounting model representations. Decision Sciences 32 (1): 55–94.

Eilifsen, A., F. Kinserdal, W. F. Messier, Jr., and T. McKee. 2020. An exploratory study into the use of audit data analytics on audit engagements. Accounting Horizons 34 (4): 75–103.

Glover, S. M., D. F. Prawitt, and T. J. Wilks. 2005. Why do auditors over-rely on weak analytical procedures? The role of outcome and precision. Auditing: A Journal of Practice & Theory 24 (Supplement): 197–220.

Goodhue, D. L., and R. L. Thompson. 1995. Task-technology fit and individual performance. Management Information Systems Quarterly 19 (2): 213–236.

Hobson, J. L., W. J. Mayew, and M. Venkatachalam. 2012. Analyzing speech to detect financial misreporting. Journal of Accounting Research 50 (2): 349–392.

Koreff, J. 2022. Are auditors' reliance on conclusions from data analytics impacted by different data analytic inputs? Journal of Information Systems 36 (1): 19–37.

Koreff, J., M. Weisner, and S. G. Sutton. 2021. Data analytics (ab) use in healthcare fraud audits. International Journal of Accounting Information Systems 42: 100523.

Kuenkaikaew, S., and M. A. Vasarhelyi. 2013. The predictive audit framework. The International Journal of Digital Accounting Research 13: 37–71.

Messier, W. F. 1995. Research in and development of audit decision aids. In Judgment and Decision-Making Research in Accounting and Auditing, edited by R. H. Ashton and A. H. Ashton, 207–227. Cambridge, U.K.: Cambridge University Press.

Perols, J., R. M. Bowen, C. Zimmermann, and B. Samba. 2017. Finding needles in a haystack: Using data analytics to improve fraud prediction. The Accounting Review 92 (2): 221–245.

Public Company Accounting Oversight Board (PCAOB). 2007. Report on the PCAOB's 2004, 2005, and 2006 Inspections of Domestic Triennially Inspected Firms. Washington, DC: PCAOB.

Public Company Accounting Oversight Board (PCAOB). 2008. Report on the PCAOB's 2004, 2005, 2006, and 2007 Inspections of Domestic Annually Inspected Firms. Washington, DC: PCAOB.

Public Company Accounting Oversight Board (PCAOB). 2013. Report on 2007–2010 Inspections of Domestic Firms That Audit 100 or Fewer Public Companies. Washington, DC: PCAOB.

Public Company Accounting Oversight Board (PCAOB). 2014. In the Matter of KPMG LLP's Quality Control Remediation Submissions. Washington, DC: PCAOB.

Public Company Accounting Oversight Board (PCAOB). 2016. Preview of Observations from 2015 Inspections of Auditors of Issuers. Staff Inspection Brief. Washington, DC: PCAOB.

Public Company Accounting Oversight Board (PCAOB). 2018. Strategic Plan 2018–2022. Washington, DC: PCAOB.

Public Company Accounting Oversight Board (PCAOB). 2021a. Spotlight: Data and Technology Research Project Update. Washington, DC: PCAOB.

Public Company Accounting Oversight Board (PCAOB). 2021b. Data and Technology Research Project Update. Washington, DC: PCAOB.

SAS Institute. 2014. How a Hybrid Anti-Fraud Approach Could Have Saved Government Benefit Programs More than $100 Million. Cary, NC: SAS Institute.

Schmidt, P. J., K. S. Church, and J. Riley. 2020a. Clinging to Excel as a security blanket: Investigating accountants' resistance to emerging data analytics technology. Journal of Emerging Technologies in Accounting 17 (1): 33–39.

Schmidt, P. J., J. Riley, and K. S. Church. 2020b. Investigating accountants' resistance to move beyond Excel and adopt new data analytics technology. Accounting Horizons 34 (4): 165–180.

Seow, P. S. 2011. The effects of decision aid structural restrictiveness on decision-making outcomes. International Journal of Accounting Information Systems 12 (1): 40–56.

Sinclair, N. 2015. How KPMG is Using Formula 1 to Transform Audit. Edinburgh, U.K.: Institute of Chartered Accountants of Scotland.

Trompeter, G., and A. Wright. 2010. The world has changed—Have analytical procedure practices? Contemporary Accounting Research 27 (2): 669–700.

Venkatesh, V., M. Morris, G. Davis, and F. Davis. 2003. User acceptance of information technology: Toward a unified view. Management Information Systems Quarterly 27 (3): 425–478.

Vessey, I., and D. Galletta. 1991. Cognitive fit: An empirical study of information acquisition. Information Systems Research 2 (1): 63–84.

APPENDIX A: Quotes From Interviewees Discussing Prior Experience and Process Familiarity Impacting Cognitive Fit

Appendix A shows the interviewees' prior experience using a technology-enabled tool (e.g., data analytics) and the process familiarity that induces use of that tool. The first column ("Interviewee") gives the interviewee number. The second column ("Prior experience") includes the quote that best depicts the interviewee's discussion of how prior experience induces use of a tool. The third column ("Process familiarity") includes the quote that best depicts the interviewee's discussion of how familiarity with a certain analysis induces use of a tool.

APPENDIX B: Manipulation Descriptions

In the predictive models condition, participants were provided a description of the predictive models used that read:

The Central Data Analytics Group employs predictive analytical models to identify patterns that are similar to previously identified issues. Predictive models rely on prior historical data to identify patterns and predict future events. Predictive models compare information in the data collected from clients associated with previously identified events/occurrences to current information. Predictive models may be used in the audit process to identify a pattern over several years associated with a previously identified material misstatement that may be indicative of a current material misstatement.

In the anomaly models condition, participants were provided a description of anomaly models used that read:

The Central Data Analytics Group employs anomaly analytical models to identify statistical outliers. Anomaly models rely only on current year (non-historical) data to identify statistical outliers. Anomaly models compare information in the data collected from your firm's client base to identify very high or low amounts or ratios. Anomaly models may be used in the audit process to identify very high or low ratios (i.e., gross margin, debt to equity, current ratio) that may be indicative of a current material misstatement.

The second variable manipulated between participants was the type of data analyzed. In the financial data condition, participants were told:

The Central Data Analytics Group is capable of identifying journal entries that affect revenue. For the Madison audit, the Central Data Analytics Group used this financial information to identify the number of journal entries that include revenue and were made just below the performance materiality threshold. Although the Central Data Analytics Group has explained what criteria they use for “just below the performance materiality” for the journal entries, this explanation contained substantial statistical jargon and was not well understood by your audit team. Several of your colleagues have reported similar issues with explanations received from the Central Data Analytics Group.

In the non-financial data condition, participants were told:

The Central Data Analytics Group is capable of identifying sentences in the e-mails that discuss revenue. For the Madison audit, the Central Data Analytics Group used this non-financial information to identify optimistic language used in internal and external e-mails for sentences that discuss revenue. Although the Central Data Analytics Group has explained what criteria they use for “optimistic language” in the e-mails, this explanation contained substantial statistical jargon and was not well understood by your audit team. Several of your colleagues have reported similar issues with explanations received from the Central Data Analytics Group.
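
For intuition only, a keyword-based version of this kind of screen might look like the following Python sketch. The keyword list and sentence-splitting rule are hypothetical; Koreff (2022) does not disclose the criteria actually described to participants.

import re

# Hypothetical list of "optimistic" words; not the study's actual criteria.
OPTIMISTIC = {"strong", "record", "exceed", "confident", "growth"}

def flag_sentences(email_text: str) -> list:
    # Split on sentence-ending punctuation, then keep sentences that both
    # mention revenue and contain at least one optimistic keyword.
    sentences = re.split(r"(?<=[.!?])\s+", email_text)
    return [s for s in sentences
            if "revenue" in s.lower()
            and any(w in s.lower() for w in OPTIMISTIC)]

email = ("Q4 revenue growth was strong and we are confident we will exceed "
         "targets. The audit committee meets Tuesday.")
print(flag_sentences(email))  # only the first sentence is flagged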

APPENDIX C: Quotes From Interviewees Discussing Experience Using Anomaly and Predictive Models

Appendix C shows the interviewees' descriptions of their experience using different inputs for different analytics. The first column ("Interviewee") gives the interviewee number. The second column ("Predictive") includes the quote that best depicts the interviewee's experience using predictive analytics to analyze financial versus non-financial data. The third column ("Anomaly") includes the quote that best depicts the interviewee's experience using anomaly analytics to analyze financial versus non-financial data. The fourth column ("Difference") includes the quote that best depicts the interviewee's comparison of the proportion of time predictive versus anomaly analytics use financial versus non-financial data.

1. For examples of SAS 142 permitting the use of automated tools, see paragraphs A3, A4, A43, A45, A46, A47, and A61. See paragraph A59 for permission to analyze non-financial data.

2. While we acknowledge that these studies are not from recent years, their findings are echoed by interviewees in Koreff (2022), who noted that experience using an analysis increases the likelihood of using it: "it all comes down to experience using it … so I'd say those are probably the largest one [resistance to using analytics] is the lack of experience" and "anytime there's new data, I'm a little bit nervous … If the auditor has experience with the process or with the client I think there can probably be higher willingness to use certain analytics." See Appendix A for a complete list of quotes from interviewees in Koreff (2022) discussing cognitive fit and prior experience impacting use of analytics.

3. On average, participants had 9.0 years of audit experience. Sixty of the auditors were employed by national or international firms.

4. The Central Data Analytics Group was described as consisting of non-CPAs without an accounting background. The likelihood of someone without an accounting background identifying an accounting misstatement is low. To make for a more realistic case, a risk of misstatement (as opposed to an actual misstatement) was said to have been identified by the Central Data Analytics Group.

5. In both manipulations, the background information provided was limited in an effort to keep the case short. Future research may seek to examine the impact of providing additional detailed information.

6. Statistical analyses (i.e., ANCOVA results) documented in Koreff (2022) confirm that the evidence supports these conclusions.