Training is one of the most important factors affecting acceptance and use of technology (Venkatesh and Bala 2008). We investigate the timing of technology training as a potential intervention for auditors' resistance to use of optional technology. Appropriate timing may reduce time-related pressures, increasing willingness to train when pressures are lower (during the non-busy season) and reducing resistance to technology use. However, training long before use (again, during the non-busy season) may raise concerns of memory decay. We manipulate the availability of training across three time periods (July, November, and December), which vary in both time pressure and proximity to the time when the technology would actually be applied to an audit task. We then elicit perceptions of two pressures commonly recognized in the accounting literature (time pressure and confidence in memory) as well as intentions to train on, and use, the new technology. We find intentions to train are greater when training is available earlier, suggesting that busy season pressure is of greater concern than memory retention. Additionally, intentions to train are directly influenced by intentions to use the technology, ease of use, confidence in memory, task experience, gender, and position in the firm.


Technology acceptance has been widely researched in the information systems field and to some extent in accounting (Curtis and Payne 2008; Gonzalez, Sharma, and Galletta 2012). Because individual-level characteristics and IT have already received significant research attention in the broad areas of IS and AIS, many call for new research aimed at organizational-level decisions (interventions) that may enhance the acceptance and use of technology (Venkatesh and Bala 2008; Venkatesh, Davis, and Morris 2007). The auditing profession is not immune to technology resistance. In fact, a survey of practicing auditors revealed that although certain technology applications are used extensively in the field (e.g., electronic work papers and internet search tools), other applications such as fraud review and computer-aided audit techniques (CAATs) meet with more resistance (Janvrin, Bierstaker, and Lowe 2008).1 These authors recently replicated this earlier work, finding that the use of all technology applications investigated increased over the decade. However, there remains a large difference between the perceived importance and actual use of audit testing applications (Lowe, Bierstaker, and Janvrin 2016). This situation is particularly troublesome considering the Center for Audit Quality predicts technology will drive future changes in auditing that will be necessary to keep up with the expectations of financial statement users (CAQ 2015).

The resistance is due, in part, to the fact that firms often allow auditors to decide when to employ technologies, so their use is optional rather than mandated. While it may seem reasonable that auditors would use any available technology that could improve the quality of their results and the efficiency of their audits, the many pressures faced by auditors in the field, as well as lack of knowledge regarding the tool, may create resistance. In fact, lack of IT audit training and auditors' lack of understanding of, and expertise in, IT are cited as the main constraints on the use of technology in the audit process (Abou-El-Sood, Kotb, and Allam 2015). Indeed, training is one of the most important factors affecting acceptance and use of technology in numerous professions (Venkatesh and Bala 2008). This study answers the call for research to aid our understanding of factors influencing voluntary use of audit technology by financial statement auditors (Curtis, Jenkins, Bedard, and Deis 2009), and to investigate whether training is a factor in determining auditors' use of technology (Lowe et al. 2016), by investigating the timing of technology training as a potential intervention for auditors' resistance to use of optional technology. Our experimental design employs CAATs as the specific technology considered by participants, although other types of audit technology meet these criteria.

While technology training studies are limited in auditing (Bedard, Jackson, Ettredge, and Johnstone 2003; Loraas and Wolfe 2006), appropriate timing of training may reduce the impact of many time-related pressures, increasing auditors' willingness to undergo training at a time when pressures are lower (during the non-busy season) and thereby reducing resistance to new tools and technologies made available by the firm. However, an alternative theory of pressure suggests a "use it or lose it" concern: one might forget what was learned during training if the length of time between training and use is too great. We develop competing hypotheses and then vary the time at which training for a new technology is available to determine the influence of these various pressures on auditors' intentions to take training on the new tool.

In an experiment with 104 experienced auditors, we manipulate availability of training between July, November, and December for a calendar year-end client, and elicit auditor intentions to train on, and use, the new technology. We also elicit several covariates of intention to train and intention to use, including perceptions of two pressures commonly recognized in the accounting literature (time pressure and confidence in memory), as well as traditional technology acceptance factors. We find auditors' intentions to train on a new technology are greater when training is made available earlier, thus suggesting busy season pressure is of greater concern than memory retention. In addition, intentions to train are directly influenced by intentions to use the technology, ease of use, confidence in memory, task experience, gender, and position in the firm, while intentions to use are influenced by ease of use and usefulness.

This research contributes to the accounting literature by exploring the impediments to use of technology in the audit through evaluation of the impact of optional training dates. Our results show the pressures of the busy season outweigh the pressure of memory retention when auditors consider training on optional technology. Furthermore, our results show auditors in lower positions in the firm are more reluctant to train on a new technology, suggesting a misalignment of individual-level and firm-level goals. Altogether these results provide important and useful information to administrators aiming to develop or improve firm-related technology acceptance interventions, since removing the barriers to training will increase use of technology (Venkatesh and Bala 2008).

In the next section of the paper we discuss prior research and theory relating to technology training and use and develop hypotheses. In the “Research Method” section following, we describe the participants, research materials, and procedures. Next we present the results of our statistical analyses and conclude the paper with a discussion of these results and implications for practice and future research.

The hypotheses we develop in this section are depicted in Figure 1. Technology acceptance, which explores the acceptance of software by computer users, is a common topic of research in the information systems field (Davis 1989). However, despite the significant differences between this context and public accounting, only a few studies have considered technology acceptance in auditing. For example, Bedard et al. (2003) find training leads to increases in preparers' computer self-efficacy and task self-efficacy, which are both positively associated with perceptions of system ease of use. Mascha and Smedley (2007) show the importance of matching the appropriate level of auditor skill, decision-aid feedback, and problem complexity when using decision aid technology. Curtis and Payne (2008) find firm interventions, including a longer budget and evaluation period and influence from superiors, increase intention to use technology, but in the absence of firm influences, individual characteristics, including risk aversion and perceptions of budgetary pressure, affect intentions.

FIGURE 1

Hypothesized Effects on Intention to Train on Optional Technology

H3 is indicated by the dashed lines (USE partially mediates the relationships between USEF and EOU, and TRAIN).

Variable Definitions:

USEF = sum of the six Usefulness questions in Table 2;

EOU = sum of the six Ease of Use questions in Table 2;

USE = “How likely would you be to actually use K-SUL during the upcoming busy season?” (ten-point scale with endpoints of Very Unlikely and Very Likely);

DATE = manipulated training date (July = 1, November = 2, or December = 3);

TP = sum of the two Time Pressure questions in Table 2;

MEM = “I do not worry about the time lag between when I learn something and when I use it.” (nine-point scale with endpoints of Completely Disagree and Completely Agree); and

TRAIN = “How likely would you be to sign up for the training that is available?” (ten-point scale with endpoints of Very Unlikely and Very Likely).


In addition, the training factor has received relatively little attention, even within the broader technology acceptance literature (Marler, Liang, and Dulebohn 2006). Agarwal and Prasad (1999) found attitudes toward a technology fully mediated the relationship between training and intentions to use. Marler et al. (2006) found both the extent and quality of training were positively related to intentions to practice new, mandatory technology before the mandated implementation date. However, both of these relationships were mediated by beliefs about employee resources such as time to practice, supervisor support for learning, documentation, and expert help. Within the audit domain, Loraas and Wolfe (2006) identify several pressures or costs that affect when an auditor is willing to self-train on a new technology. In the next section we discuss technology training from a theoretical perspective and relate certain aspects of training to the audit environment.

Technology training is defined as transferring knowledge and operational skills to IT users (Huang 2002). Effective technology training should facilitate learning and positively affect users' attitudes regarding the technology itself (Klein, Hall, and Laliberte 1990). As is the case with skill acquisition in general, practice between the dates of formal IT training and actual implementation is crucial to reduce training decay and facilitate retention (Anderson 2000; Arthur, Bennett, Stanush, and McNelly 1998). Thus, in order to maximize training effectiveness, concerns of memory decay suggest organizations should provide employees the opportunity for such practice to facilitate learning, maintenance, and transfer of knowledge (e.g., Baldwin and Ford 1988; Yelon and Ford 1999).

Unlike the taxation area of public accounting, where training for new technologies and updates is typically provided very near the beginning of tax season, CAAT training is typically provided to auditors on one occasion via eLearning (Rowe 2008). Given the nature of public accounting's busy season for auditors (generally early November through early March, with the peak between mid-December and late January), most training is provided and taken during the off-season, or summer months.2 Unfortunately, the pressure to report billable hours does not likely leave auditors with time to continue "practicing" skills learned in summer training classes.

Anderson (2000) describes three general stages of skill acquisition. The first is the cognitive stage, in which people use the factual knowledge they have acquired along with problem-solving efforts. For example, they may work from instructions or examples of how to solve a particular problem. With increasing amounts of practice, a person moves to the second, associative stage. Here, general problem-solving methods are abandoned as domain-specific methods take over, a process known as proceduralization. The cognitive component of a skill is greatly reduced and skill performance greatly improves. In the final, autonomous stage, the skill becomes completely automatic with eventual elimination of cognitive effort. The inability to practice prevents one from advancing through the stages of skill acquisition to achieve automation of processes, and the passage of time causes already learned information to be forgotten (Anderson 2000; Baddeley 1990).

Thus, the time gap between training and use can lead to a frustrating decay of newly learned skills (Aguinis and Kraiger 2009). This situation suggests auditors are more likely to take optional technology training courses that are offered closest to the date they will actually use the technology in order to avoid such decay.

  • H1a: 

    The intention to train on optional technology is greater when training is available closer to the time of actual technology use.

The nature of busy season workloads creates pressure for auditors. DeZoort and Lord (1997) describe workload as an organizational-type job pressure that affects stress, which in turn leads to attitudinal and behavioral outcomes such as job dissatisfaction and judgment and decision variations. Workload pressure may be of the quantitative type in which a person has too much to do, or the qualitative type in which a task is too complex for a person to perform comfortably (DeZoort and Lord 1997). Quantitative overload pressure is closely related to time pressure, and qualitative overload can be caused by many factors, including pressure to keep up with technology changes (DeZoort and Lord 1997).

Although the construct of workload pressure has been studied very little apart from its related construct of time pressure, the time pressure studies (including both time budgets and deadlines)3 are abundant. In general, these studies suggest that although some degree of time pressure may have positive outcomes because auditors become more efficient, moderate to high levels of time pressure lead to dysfunctional behavior (including inefficiency) and negative attitudes (see DeZoort and Lord [1997] for a detailed review of these studies).

During the busy season, when auditors experience moderate to high levels of workload and time pressures, they are likely to avoid additional activities that would increase those pressures, such as non-mandatory technology training. In addition, training may not be as effective during these times of high pressure (compared to times of lower pressure), particularly when it is delivered electronically via eLearning (Rowe 2008; Santhanam, Sasidharan, and Webster 2008; Wagner 2008). For example, effective eLearning should include self-regulated learning (SRL) strategies, such as forethought of goals (or a learning plan), note taking and organization of study materials, and self-reflection/monitoring of the learning process. However, research indicates people have difficulty applying SRL and often use inadequate learning strategies (Santhanam et al. 2008). This may be particularly applicable to auditors, who are unlikely to spend the additional time and effort to engage in SRL strategies when training is taken during periods of pressure.

The previous discussion suggests that auditors will avoid training during the busy season in order to reduce workload pressure. This reasoning leads to a prediction that is in opposition to H1a:

  • H1b: 

    The intention to train on optional technology is greater when training is available during the least busy times of the year.

Previous research finds support for training as a determinant of use (e.g., Agarwal and Prasad 1999; Marler et al. 2006). However, given the optional nature of CAAT training for auditors, we focus on intentions rather than actual behavior in order to discover factors affecting intentions to train. Goal theory supports our assertion that intention to use a new technology affects the intention to take training on that technology prior to use. Gollwitzer (1999, 494) differentiates goal and implementation intentions in that goal intentions are related to attainment of specific goals, whereas "implementation intentions are subordinate to goal intentions and specify the when, where, and how of responses leading to goal attainment." So, if an auditor sets a goal of using CAAT technology on future audit engagements in order to increase efficiency, improve his/her performance evaluation, etc., then s/he will next form implementation intentions to achieve that goal, including an intention to take the necessary training. Commitment to a goal is thus a motivational factor that increases the effort exerted and maintained over time and decreases the likelihood of abandoning a goal in the face of a challenge (Lee, Keil, and Kasi 2012).

For auditors who intend to use a new technology, this becomes a committed goal. Their motivation makes it much more likely they will take the necessary training in order to achieve the goal. Those with no goal of using the new technology in the foreseeable future are not likely to sign up for technology training because the likelihood of forgetting makes taking a training class an inefficient use of time. Furthermore, auditors may perceive that training on the technology would lead to expectations by superiors to actually use the technology. Goal commitment helps auditors overcome these obstacles. This discussion leads to the following hypothesis:

  • H2: 

    The intention to use optional technology positively affects intention to train on the technology.

Technology acceptance studies have identified numerous perceptions of the environment that affect the intention to use new technology either directly or indirectly. The classic factors of the Technology Acceptance Model (TAM) are ease of use and usefulness of the technology under consideration (Davis 1989). A significant body of research on TAM has shown that perceived ease of use and perceived usefulness are strong determinants of technology user acceptance, adoption, and usage behavior (Gonzalez et al. 2012; Venkatesh et al. 2007; Jeyaraj, Rottman, and Lacity 2006).

As discussed previously, our study examines intentions, not actual use. Thus, goal theory suggests perceptions of ease of use and usefulness should affect intention to train (Morris and Venkatesh 2000). Specifically, perceptions of ease of use and usefulness will lead to an auditor setting a goal to use CAAT technology on future audit engagements. However, considering also the strong link between contextual perceptions and intention to use established by prior research, we expect intention to use to at least partially mediate the effect of contextual perceptions on intention to train.

  • H3: 

    Intention to use mediates the relationship between contextual perceptions (ease of use and usefulness) and intention to train.

Ease of use of online training resources affects the efficacy of online training (Lim, Lee, and Nam 2007). Prior research provides support for this relationship based on motivation theory (Compeau and Higgins 1995; Hicks and Klimoski 1987), media richness theory (Daft and Lengel 1986), technology acceptance theory (Davis 1989), and institutional theory (Orlikowski 1992). Therefore, we include this predicted link in our model:

  • H4: 

    Ease of use positively affects perceptions of usefulness.

Technology acceptance studies have identified numerous individual characteristics that affect intention to use new technology either directly or indirectly (e.g., Technology Acceptance Model 3 (TAM3)—Venkatesh and Bala 2008). Individual characteristics may also have a direct impact on training intention, without a corresponding influence on technology-related use.

We examine memory retention because individuals differ in their performance of various cognitive functions (Necka 1999), and auditors' confidence in memory retention is likely to affect their intentions to train on a new technology. Auditors who are less confident in their memories may be less likely to train, and even more so the further the training date is from actual use. We also examine individual levels of time pressure. Higher levels of perceived time pressure may enhance the intention to use technology if auditors believe the technology's benefits will lead to long-term reductions in pressure. However, a focus on busy season time pressure would likely reduce the intention to train because training reduces available time and therefore increases pressure in the short run.

In addition to confidence in memory and time pressure, we also explore the effects of demographic variables including task experience, gender, and position in the firm (Venkatesh, Morris, G. Davis, and F. Davis 2003; Curtis and Payne 2008). We study individual difference variables in an exploratory fashion rather than making direct predictions due to the lack of prior research documenting the effects and the complexity of these relationships, including the potential for higher-order interactions.

One hundred fifteen auditors from several accounting firms of various sizes participated in the case study.4 One participant did not complete the study, leaving a sample size of 114. The experiment was administered via two methods. In one method, the experimental materials were placed online. We contacted several accounting firms that agreed to participate in the study and then provided access instructions to their auditors. In the other method, we administered a written version of the instrument to auditors attending a national training class for their firm.5

The case study was presented in the experimental materials, followed by measures of our dependent and independent variables and demographics (see Appendix A). The case describes a hypothetical in-charge auditor who has the opportunity to take training on a new CAAT tool (called K-SUL) that can be used for the calendar year-end audit test to search for unrecorded liabilities. The CAAT is described as somewhat complex to implement but with the potential to reduce budgeted hours for liabilities by 50 percent for most clients, thereby establishing both risks and rewards, which are typical in audit situations. Furthermore, use of the tool is described as optional and, to avoid demand effects, no reference is made as to superiors' preferences since these are found to influence auditors' decisions to implement (Curtis and Payne 2008).

Participants were informed when training for the software would become available. The experimental design is between subjects and manipulates the training date. Participants then answered questions about their intentions to train on the new technology and to use it. The experimental materials ended with a questionnaire that included manipulation check questions, measures of individual characteristics, and demographic questions.

The manipulated independent variable in this study is the timing of optional technology training. Three dates were manipulated: the third week in July, November, or December. July represents the non-busy season for most auditors. November and December are typical busy season months for auditors, as interim work is performed prior to year-end, with December typically the busier of the two due to inventory counts and other year-end-related audit tests.6 The three dates were chosen because they represent low pressure/high forgetting (July), moderate pressure/moderate forgetting (November), and high pressure/low forgetting (December). When we test our competing hypotheses to determine which is the greater concern for training (forgetting the information or increased pressure during the busy season), this timing allows us to determine whether one concern overrides the other (higher intentions to train on and use the technology in July or December) or neither does (higher intentions to train on and use the technology in November).

To measure our dependent variable, intention to train, we asked participants “How likely would you be to sign up for the training that is available?” Additionally, in order to measure and control for the intention to use technology, we asked participants “How likely would you be to actually use K-SUL during the upcoming busy season?” These questions were both measured on a scale from 1–10 with endpoints of Very Unlikely to Very Likely.

The scale questions we used to measure contextual perceptions of ease of use and usefulness, and individual differences in time pressure are included in Table 2. The ease of use and usefulness questions were adapted from the Technology Acceptance Model (Davis 1989; see Legris, Ingham, and Collerette [2003] for a literature review). Consistent with DeZoort and Lord's (1997) distinction between the two aspects of time pressure, we included two questions to measure budget pressure and deadline pressure. The budget pressure question was taken from Curtis and Payne (2008) and the deadline pressure question was developed specifically for this study. One question was constructed to measure confidence in memory for skill/technology acquisition—“I do not worry about the time lag between when I learn something and when I use it.” All of these questions were measured on a nine-point scale with endpoints of Completely Disagree and Completely Agree. We also obtained numerous demographic measures including task experience, gender, and position in the firm.
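To make the scoring concrete, the following is a minimal sketch of how the summed composite scores could be computed in Python with pandas. The file name and item column names (e.g., usef_1, tp_budget) are hypothetical placeholders, not the authors' actual variable names.

import pandas as pd

# Hypothetical response file with one row per participant
df = pd.read_csv("responses.csv")

# Sum the six nine-point Usefulness and Ease of Use items (Table 2)
df["USEF"] = df[[f"usef_{i}" for i in range(1, 7)]].sum(axis=1)
df["EOU"] = df[[f"eou_{i}" for i in range(1, 7)]].sum(axis=1)

# Sum the two Time Pressure items (budget and deadline pressure)
df["TP"] = df[["tp_budget", "tp_deadline"]].sum(axis=1)

# Single-item measures are used as-is
df["MEM"] = df["mem_confidence"]  # nine-point confidence-in-memory item
df["TRAIN"] = df["intent_train"]  # ten-point intention-to-train item
df["USE"] = df["intent_use"]      # ten-point intention-to-use item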

A manipulation check question was included in the questionnaire following the case study. Participants were asked, “When did the firm offer the training class?” and were given three responses to choose from—the third week in July, November, or December. Ten participants answered incorrectly and were excluded from further analysis, leaving a final sample of 104 participants.7 Two additional questions measured whether the dates used in the manipulations were perceived by participants as periods that would lead to more or less forgetting, or would impose higher or lower pressure. On a nine-point scale with endpoints of Extremely Unlikely to Extremely Likely, participants responded to the following two statements: (1) I will forget how to use this software before I need it at year-end, and (2) I will be working under a great deal of pressure at the time I take this training class.

Table 1 depicts these question means (all p-values reported below are two-tailed). For the forgetting question, ANOVA results show estimated marginal means of 4.1, 4.3, and 3.4 for July, November, and December, respectively. The differences between training dates are as follows: July/November (p = 0.64) is not significant, July/December (p = 0.08) is marginally significant, and November/December is significant (p = 0.02). When all potential covariates are included in the ANOVA (TP, MEM, TASK, GEND, POSITION) the respective results are p = 0.80, p = 0.05, and p = 0.02. For the pressure question, the means are 3.7, 5.6, and 5.8 for July, November, and December, respectively. The difference between November/December is not significant (p = 0.64), but the differences between July/November and July/December are significant (p = 0.00 for each). When all potential covariates are included in the ANOVA (TP, MEM, TASK, GEND, POSITION) the respective results are p = 0.24, p = 0.00, and p = 0.00. These results suggest our manipulated training dates generally reflect the levels of anticipated forgetting and pressure that were intended. These results also support our contention that the July training date is preferable when considering anticipated busy season pressure, and December is preferable when anticipated forgetting is considered.
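As an illustration, the manipulation-check ANOVA (and the version with covariates) could be run in Python with statsmodels, continuing with the df and hypothetical column names from the sketch above; the forget item column is likewise an assumption.

import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

# One-way ANOVA on the anticipated-forgetting item across training dates
base = smf.ols("forget ~ C(DATE)", data=df).fit()
print(anova_lm(base, typ=2))  # omnibus test of the DATE manipulation

# The same model with the individual-difference covariates included
full = smf.ols("forget ~ C(DATE) + TP + MEM + TASK + GEND + POSITION",
               data=df).fit()
print(full.summary())  # pairwise date contrasts appear as C(DATE) dummies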

TABLE 1

Cell Means for Manipulation Comprehension Questions (Significance of the Cell Mean and Scale Midpoint Difference)

TABLE 2

Confirmatory Factor Analysis Results


We also compared forgetting and pressure within each time period. Paired sample t-tests show anticipated pressure is a greater concern than anticipated forgetting for all groups combined, and in the November and December training periods (p = 0.00 for each of these). Thus, there appears to be more concern overall for workload pressure than forgetting. However, these analyses do not include the effects of potential covariates, so any assertion regarding support for our hypotheses is premature.
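A minimal sketch of these paired comparisons with scipy, again reusing the hypothetical pressure and forget columns from the earlier sketches:

from scipy import stats

# Anticipated pressure vs. anticipated forgetting, all groups combined
t, p = stats.ttest_rel(df["pressure"], df["forget"])
print(f"all groups combined: t = {t:.2f}, p = {p:.3f}")

# The same comparison within each manipulated training-date condition
for date, grp in df.groupby("DATE"):
    t, p = stats.ttest_rel(grp["pressure"], grp["forget"])
    print(f"date condition {date}: t = {t:.2f}, p = {p:.3f}")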

To confirm validity of the constructs used in our study we performed a confirmatory factor analysis using LISREL 8.80. Results are shown in Table 2. The final solution converged in 16 iterations with the following fit statistics: χ2 = 229.475 (p = 0.00, 74 df); Standardized Root Mean Square Residual (SRMR) = 0.0677; and Comparative Fit Index (CFI) = 0.936. Together these fit statistics demonstrate a relatively good fit.8 Convergent validity is obtained since all factors load significantly on their intended constructs (all p-values < 0.02) (Kline 2005). All factor loadings exceed the 0.40 acceptability threshold. All scale composite reliabilities are higher than 0.90, and the variance extracted exceeds 0.74 for all constructs. Additionally, to test for common method bias, we employed Harman's single-factor test (Malhotra, Kim, and Patil 2006; P. Podsakoff, MacKenzie, Lee, and N. Podsakoff 2003). When all scale questions were loaded on one factor, we obtained fit statistics indicating poor fit: χ2 = 690.683 (p = 0.00, 77 df); SRMR = 0.178; and CFI = 0.755, suggesting our study does not suffer from common method bias.
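The authors estimated the CFA in LISREL 8.80, which is proprietary. As a rough open-source analogue, a similar measurement model, along with the Harman single-factor comparison, could be sketched in Python with the semopy package, assuming its lavaan-style model syntax and the hypothetical item names used earlier.

import semopy

# Three-construct measurement model (14 items, mirroring Table 2)
cfa_desc = """
USEF =~ usef_1 + usef_2 + usef_3 + usef_4 + usef_5 + usef_6
EOU  =~ eou_1 + eou_2 + eou_3 + eou_4 + eou_5 + eou_6
TP   =~ tp_budget + tp_deadline
"""
cfa = semopy.Model(cfa_desc)
cfa.fit(df)
print(semopy.calc_stats(cfa))     # chi-square, CFI, and related fit indices
print(cfa.inspect(std_est=True))  # standardized loadings per construct

# Harman's single-factor test: load every item on one common factor; a
# markedly worse fit argues against pervasive common method bias
items = ([f"usef_{i}" for i in range(1, 7)]
         + [f"eou_{i}" for i in range(1, 7)]
         + ["tp_budget", "tp_deadline"])
one_factor = semopy.Model("F =~ " + " + ".join(items))
one_factor.fit(df)
print(semopy.calc_stats(one_factor))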

Demographic statistics are shown in Table 3. On average, participants have performed the search for unrecorded liabilities 16.48 times. The sample is 63 percent male. Position in the firm is represented as follows: staff accountant with no in-charge experience (23.5 percent), staff accountant with some in-charge experience (9.8 percent), senior (52.0 percent), manager (10.8 percent), and partner (3.9 percent).9

TABLE 3

Demographic Statistics


Table 4 shows, by treatment group and in total, the means and standard deviations of intention to train and intention to use the new technology, as well as the contextual variables (usefulness and ease of use) and the individual difference variables (time pressure and confidence in memory). ANOVA results show no significant differences between treatment groups for the individual difference variables, suggesting effective random assignment and no artificial treatment effects. Given a scale midpoint of 5.5, the means indicate moderately high intentions to train on (7.77) and use (7.13) the new technology.

TABLE 4

Descriptive Statistics*


Relative to a combined scale midpoint of 30.0, these auditors believe the technology will be useful (41.17) with moderate ease of use (33.84). They report a moderately high level of time pressure (14.40; midpoint 10.0) and moderate confidence in memory (4.31; midpoint 5.0). Intention to train is significantly higher in July than in December (p = 0.04, two-tailed). Intention to use does not differ significantly between any of the treatment groups. However, these analyses do not include the effects of potential covariates, so any assertion regarding support for our hypotheses is premature.

Pearson Correlations are shown in Table 5 (all p-values discussed below are two-tailed). There is a significant, negative correlation between training date and intention to train (r = −0.20; p = 0.04). There is also a strong, positive, and expected correlation between intention to train on the new technology and intention to use (r = 0.51; p = 0.00). In addition, intention to train is significantly correlated with usefulness (r = 0.28; p = 0.01), ease of use (r = 0.39; p = 0.00), and time pressure (r = 0.22; p = 0.03). Intention to use is also significantly correlated with usefulness (r = 0.50; p = 0.00), ease of use (r = 0.40; p = 0.00), and time pressure (r = 0.24; p = 0.01). Ease of use is significantly correlated with time pressure (r = 0.24; p = 0.01) and usefulness is significantly correlated with ease of use (r = 0.54; p = 0.00). The significant correlations between the contextual variables, individual characteristic variables, and the independent and dependent variables demonstrate the need to include these characteristics in our hypotheses tests.
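A short sketch of these pairwise correlations with pandas and scipy, using the hypothetical column names from the earlier sketches:

from scipy.stats import pearsonr

model_vars = ["TRAIN", "USE", "DATE", "USEF", "EOU", "TP", "MEM"]
print(df[model_vars].corr())  # Pearson r for every pair of model variables

# Example: r and p-value for one pair (training date vs. intention to train)
r, p = pearsonr(df["DATE"], df["TRAIN"])
print(f"DATE vs. TRAIN: r = {r:.2f}, p = {p:.2f}")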

TABLE 5

Pearson Correlations (p-values)


To test the hypothesized relationships shown in Figure 1, we ran path analyses in LISREL 8.80 using the means of our constructs. Our initial results included a modification index of 7.955 for the unhypothesized path between ease of use and intention to train. We added this path, reran the analysis, and obtained a significant improvement in our model fit (Δχ2 = 7.511, 1 df; p < 0.01). The results of our path analysis, including this additional path, are shown in Figure 2, and Table 6 shows the indirect, direct, and total effects. The model converged in six iterations and has the following fit statistics: χ2 = 15.081 (p = 0.302, 13 df); Standardized Root Mean Square Residual (SRMR) = 0.0431; and Comparative Fit Index (CFI) = 0.985, indicating overall good fit (see footnote 8). All p-values discussed below are two-tailed.
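As with the CFA, the path model can be approximated outside LISREL. The sketch below assumes semopy's lavaan-style syntax and the hypothetical variable names used earlier; it also verifies the p-value implied by the reported 1 df chi-square difference for the added ease-of-use path.

import semopy
from scipy.stats import chi2

# Figure 2 path model, including the added EOU -> TRAIN path
path_desc = """
USEF  ~ EOU
USE   ~ USEF + EOU
TRAIN ~ USE + EOU + DATE + TP + MEM + TASK + GEND + POSITION
"""
path_model = semopy.Model(path_desc)
path_model.fit(df)
print(path_model.inspect(std_est=True))  # standardized direct effects
print(semopy.calc_stats(path_model))     # overall model fit statistics

# p-value for the reported model improvement after adding EOU -> TRAIN
print(chi2.sf(7.511, 1))  # ~0.006, consistent with the reported p < 0.01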

FIGURE 2

Path Analysis Results

*, **, *** Indicate p < 0.10, p < 0.05, and p < 0.01, respectively (two-tailed).

Effects are standardized effects. H3 is indicated by the dashed lines (USE partially mediates the relationships between USEF and EOU, and TRAIN).

Variable Definitions:

EOU = sum of the six Ease of Use questions in Table 2;

USEF = sum of the six Usefulness questions in Table 2;

USE = “How likely would you be to actually use K-SUL during the upcoming busy season?” (ten-point scale with endpoints of Very Unlikely and Very Likely);

DATE = manipulated training date (July = 1, November = 2, or December = 3);

TP = sum of the two Time Pressure questions in Table 2;

MEM = “I do not worry about the time lag between when I learn something and when I use it.” (nine-point scale with endpoints of Completely Disagree and Completely Agree);

TASK = “How many times have you performed a search for unrecorded liabilities?”;

GEND = “What is your gender?” (0 = female, 1 = male);

POSITION = Staff—no in-charge exp., Staff—some in-charge exp., Senior, Manager, Partner; and

TRAIN = “How likely would you be to sign up for the training that is available?” (ten-point scale with endpoints of Very Unlikely and Very Likely).

TABLE 6

Path Analysis Indirect, Direct, and Total Effects


H1a predicts auditors' concerns regarding memory retention will result in a preference for training closer to the date of actual use, whereas the competing hypothesis, H1b, predicts busy season pressures will lead to a preference for training during the non-busy season. The significant negative effect of training date on intention to train (β = −0.136; p < 0.10) provides support for H1b. As discussed previously, anticipated pressure was rated significantly higher than anticipated forgetting (Table 1), indicating that these busy season pressures, rather than forgetting concerns, drive the H1 results (support for H1b as opposed to H1a). Intention to use is significantly related to intention to train (β = 0.377; p < 0.01), supporting H2. H3 predicts intention to use mediates the relationship between contextual perceptions (ease of use and usefulness) and intention to train. The results support mediation of the relationship between usefulness and intention to train. However, the significant, unhypothesized path between ease of use and intention to train (β = 0.239; p < 0.05) does not support mediation between ease of use and intention to train. H4 is supported by the significant path from ease of use to usefulness (β = 0.537; p < 0.01).

For the individual characteristics, time pressure is not significant (β = 0.016; p > 0.10) and confidence in memory is significant (β = −0.259; p < 0.01) in relation to intention to train. Note the negative relationship, suggesting auditors with less confidence in their memories are more likely to train. All three of the demographic variables are significant in the model. Auditors with more task experience are less likely to train (β = −0.290; p < 0.01). Males are more likely than females to train (β = 0.165; p < 0.10). Finally, auditors in higher positions in the firm are more likely to train (β = 0.348; p < 0.01). We tested all two- and three-way interactions within the model and none were significant.10

Prior research suggests auditors are resistant to the adoption of optional technology, despite evidence that CAATs and other tools can increase both the effectiveness and efficiency of the audit. Prior research also suggests training can mitigate technology reluctance, in some circumstances. Our goal in this study was to explore the impact of the timing of training and individual characteristics on auditors' decisions to train on new, optional technology. We identified two competing pressures (busy season and memory retention) and designed a study to explore the impact that these competing pressures pose when training is offered at various points in the auditor's work year. We theorized a trade-off between busy season pressure and memory retention pressure and constructed competing hypotheses. Specifically, we hypothesized that if busy season was the prevailing pressure, then a non-busy season training date would encourage training; alternatively, if memory retention was the prevailing pressure, then training closer to actual usage (i.e., during busy season) would encourage training. We manipulated three training dates (July, November, and December) because they represent periods of low pressure/high forgetting, moderate pressure/moderate forgetting, and high pressure/low forgetting, respectively. This manipulation allowed us to determine whether either the busy season or memory retention pressure was prevalent, and to test our competing hypotheses. We also predicted intention to use technology positively affects intention to train on the technology. Furthermore, we predicted intention to use would mediate the relationship between contextual variables (ease of use and usefulness) and intention to train. Also predicted was the influence of ease of use on usefulness. We explored the effects of two individual characteristics not previously tested in technology acceptance studies (time pressure and confidence in memory) on intention to train, as well as several demographic variables.

Our results suggest busy season pressure is of more concern to auditors than memory retention, since intention to train on the technology was significantly greater when training was offered in July versus December. Greater intentions to train were also indicated by greater intentions to use the technology, less confidence in memory, less task experience, higher positions in the firm, and by males. Our results support mediation by intention to use between usefulness and intention to train. However, mediation between ease of use and intention to train is not supported because ease of use directly and positively affects intention to train in the model, in addition to its indirect effect through usefulness. Ease of use also positively affects usefulness.

One possible limitation of this study is that we did not directly measure the types of training methods employed at the auditors' home firms, and this variable could affect the overall results. However, we do include a control variable for type of firm, as well as many other technology training variables, and it is likely the type-of-firm variable captures the training methods employed at the auditors' home firms. Of course, without an exact measure, we cannot be sure. Another possible limitation is that we did not perform a true pilot test for this study, although our instrument was subjected to comments from colleagues and practitioners prior to administration.

These findings have implications for future research. Those interested in further exploration of technology reluctance should consider the role of training in the scenarios they construct. In contexts where training is not manipulated, perceptions of its availability and timing should be controlled. Future research is needed to investigate the effects of different training methods on auditors' intentions to train. In addition, because our study used only one type of CAAT, future research is needed to determine whether our results generalize to other CAATs and other audit applications.

There are practical implications of our results as well. Because numerous firm-level interventions can be implemented to enhance technology acceptance (incentive structures, mandates, providing resources, etc. [Jasperson, Carter, and Zmud 2005]), audit firms must understand the factors creating reluctance to take optional technology training. Reducing or removing the reluctance will increase training intentions and eventual use of technology. Our results indicate firms need to roll out training for new, optional technology well in advance of the upcoming busy season to induce higher training rates. Also, even though our results show auditors are far more concerned about busy season pressure than memory retention, memory decay may still be of concern for some and will likely affect the performance of many when the technology is eventually used. To address this problem, firms can develop and implement short, refresher training modules to be accessed as needed just prior to implementation and use of the new technology. Firms may also need to make many of the optional online training courses mandatory during the non-busy season or provide incentives to take the optional courses.

Understanding the individual characteristics associated with technology training reluctance will enable firms to focus training interventions on specific users as well. For example, firms need to demonstrate the effectiveness and efficiency gains of using specific CAATs so that less experienced auditors who currently perceive little benefit from training on the related technology will be more inclined to do so. Similarly, such education (and perhaps mandates) will increase training and usage for those with higher levels of manual task experience and for females. In summary, understanding and then removing obstacles to training should improve the ultimate utilization of software within audit firms.

Abou-El-Sood, H., A. Kotb, and A. Allam. 2015. Exploring auditors' perceptions of the usage and importance of audit information technology. International Journal of Auditing 19 (3): 252–256. doi:10.1111/ijau.12039

Agarwal, R., and J. Prasad. 1999. Are individual differences germane to the acceptance of new information technologies? Decision Sciences 30: 361–391. doi:10.1111/j.1540-5915.1999.tb01614.x

Aguinis, H., and K. Kraiger. 2009. Benefits of training and development for individuals and teams, organizations, and society. Annual Review of Psychology 60: 451–474. doi:10.1146/annurev.psych.60.110707.163505

Anderson, J. R. 2000. Learning and Memory. 2nd edition. New York, NY: John Wiley & Sons.

Arthur, W., W. Bennett, P. Stanush, and T. McNelly. 1998. Factors that influence skill decay and retention: A quantitative review and analysis. Human Performance 11: 57–101. doi:10.1207/s15327043hup1101_3

Baddeley, A. 1990. Human Memory: Theory and Practice. Needham Heights, MA: Allyn and Bacon.

Baldwin, T. T., and J. K. Ford. 1988. Transfer of training: A review and directions for future research. Personnel Psychology 41: 63–105. doi:10.1111/j.1744-6570.1988.tb00632.x

Bedard, J. C., C. Jackson, M. L. Ettredge, and K. M. Johnstone. 2003. The effect of training on auditors' acceptance of an electronic work system. International Journal of Accounting Information Systems 4: 227–250. doi:10.1016/j.accinf.2003.05.001

Center for Audit Quality (CAQ). 2015. Center for audit quality launches "Profession in Focus" web video series. CAQ Snapshot 9 (5): 1.

Compeau, D. R., and C. A. Higgins. 1995. Computer self-efficacy: Development of a measure and initial test. MIS Quarterly 19 (2): 189–211. doi:10.2307/249688

Curtis, M. B., and E. A. Payne. 2008. An examination of contextual factors and individual characteristics affecting technology implementation decisions in auditing. International Journal of Accounting Information Systems 9 (2): 104–121. doi:10.1016/j.accinf.2007.10.002

Curtis, M. B., J. G. Jenkins, J. C. Bedard, and D. R. Deis. 2009. Auditors' training and proficiency in information systems: A research synthesis. Journal of Information Systems 23 (1): 79–96. doi:10.2308/jis.2009.23.1.79

Daft, R. L., and R. H. Lengel. 1986. Organizational information requirements, media richness and structural design. Management Science 32 (5): 554–571. doi:10.1287/mnsc.32.5.554

Davis, F. D. 1989. Perceived usefulness, perceived ease of use, and user acceptance of information technology. MIS Quarterly 13 (3): 319–339. doi:10.2307/249008

DeZoort, F. T., and A. T. Lord. 1997. A review and synthesis of pressure effects research in accounting. Journal of Accounting Literature 16: 28–85.

Gollwitzer, P. M. 1999. Implementation intentions: Strong effects of simple plans. American Psychologist 54 (7): 493–503. doi:10.1037/0003-066X.54.7.493

Gonzalez, G. C., P. N. Sharma, and D. F. Galletta. 2012. The antecedents of the use of continuous auditing in the internal auditing context. International Journal of Accounting Information Systems 13 (3): 248–262. doi:10.1016/j.accinf.2012.06.009

Hair, J. F., R. E. Anderson, R. L. Tatham, and W. C. Black. 1998. Multivariate Data Analysis with Readings. 5th edition. Englewood Cliffs, NJ: Prentice-Hall.

Hicks, W. D., and R. J. Klimoski. 1987. Entry into training programs and its effects on training outcomes: A field experiment. Academy of Management Journal 30: 542–552. doi:10.2307/256013

Huang, A. H. 2002. A three-tier technology training strategy in a dynamic business environment. Journal of End User Computing 14 (2): 30–39. doi:10.4018/joeuc.2002040103

Janvrin, D., J. Bierstaker, and D. J. Lowe. 2008. An examination of audit information technology use and perceived importance. Accounting Horizons 22 (1): 1–21. doi:10.2308/acch.2008.22.1.1

Jasperson, J. S., P. E. Carter, and R. W. Zmud. 2005. A comprehensive conceptualization of the post-adoptive behaviors associated with IT-enabled work systems. MIS Quarterly 29: 525–557.

Jeyaraj, A., J. W. Rottman, and M. C. Lacity. 2006. A review of the predictors, linkages, and biases in IT innovation adoption research. Journal of Information Technology 21: 1–23. doi:10.1057/palgrave.jit.2000056

Klein, K. J., R. J. Hall, and M. Laliberte. 1990. Training and the organizational consequences of technological change: A case study of computer-aided design and drafting. In Technological Innovation and Human Resources: End-User Training, edited by U. Gattiker and L. Larwood, 31–79. Greenwich, CT: JAI Press.

Kline, R. B. 2005. Principles and Practice of Structural Equation Modeling. 2nd edition. New York, NY: The Guilford Press.

Lee, J. S., M. Keil, and V. Kasi. 2012. The effect of an initial budget and schedule goal on software project escalation. Journal of Management Information Systems 29 (1): 53–77. doi:10.2753/MIS0742-1222290102

Legris, P., J. Ingham, and P. Collerette. 2003. Why do people use information technology? A critical review of the technology acceptance model. Information & Management 40: 191–204. doi:10.1016/S0378-7206(01)00143-4

Lim, H., S. Lee, and K. Nam. 2007. Validating eLearning factors affecting training effectiveness. International Journal of Information Management 27: 22–35. doi:10.1016/j.ijinfomgt.2006.08.002

Loraas, T., and C. J. Wolfe. 2006. Why wait? Modeling factors that influence the decision of when to learn a new use of technology. Journal of Information Systems 20 (2): 1–23. doi:10.2308/jis.2006.20.2.1

Lowe, D. J., J. Bierstaker, and D. Janvrin. 2016. Audit Information Technology Use and Perceived Importance: Have the Big 4 Lost Their Advantage? Working paper, Arizona State University.

Malhotra, N. K., S. S. Kim, and A. Patil. 2006. Common method variance in IS research: A comparison of alternative approaches and a reanalysis of past research. Management Science 52 (12): 1865–1883. doi:10.1287/mnsc.1060.0597

Marler, J. H., X. Liang, and J. H. Dulebohn. 2006. Training and effective employee information technology use. Journal of Management 32 (5): 721–743. doi:10.1177/0149206306292388

Mascha, M. F., and G. Smedley. 2007. Can computerized decision aids do "damage"? A case for tailoring feedback and task complexity based on task experience. International Journal of Accounting Information Systems 8 (2): 73–91. doi:10.1016/j.accinf.2007.03.001

Morris, M. G., and V. Venkatesh. 2000. Age differences in technology adoption decisions: Implications for a changing work force. Personnel Psychology 53 (2): 375–403. doi:10.1111/j.1744-6570.2000.tb00206.x

Necka, E. 1999. Learning, automaticity and attention: An individual-differences approach. In Learning and Individual Differences: Process, Trait, and Content Determinants, edited by P. L. Ackerman, P. C. Kyllonen, and R. D. Roberts, 161–181. Washington, DC: American Psychological Association.

Orlikowski, W. J. 1992. The duality of technology: Rethinking the concept of technology in organizations. Organization Science 3 (3): 398–427. doi:10.1287/orsc.3.3.398

Podsakoff, P. M., S. B. MacKenzie, J. Y. Lee, and N. P. Podsakoff. 2003. Common method biases in behavioral research: A critical review of the literature and recommended remedies. Journal of Applied Psychology 88 (5): 879–903. doi:10.1037/0021-9010.88.5.879

Rowe, R. 2008. Discussion of an examination of contextual factors and individual characteristics affecting technology implementation decisions in auditing. International Journal of Accounting Information Systems 9 (2): 127–129. doi:10.1016/j.accinf.2007.10.004

Santhanam, R., S. Sasidharan, and J. Webster. 2008. Using self-regulatory learning to enhance e-learning-based information technology training. Information Systems Research 19 (1): 26–47. doi:10.1287/isre.1070.0141

Venkatesh, V., and H. Bala. 2008. Technology Acceptance Model 3 and a research agenda on interventions. Decision Sciences 39 (2): 273–315. doi:10.1111/j.1540-5915.2008.00192.x

Venkatesh, V., F. D. Davis, and M. G. Morris. 2007. Dead or alive? The development, trajectory and future of technology adoption research. Journal of the Association for Information Systems 8 (4): 267–286.

Venkatesh, V., M. G. Morris, G. B. Davis, and F. D. Davis. 2003. User acceptance of information technology: Toward a unified view. MIS Quarterly 27 (3): 425–478.

Wagner, E. D. 2008. Delivering on the Promise of eLearning. White paper. San Jose, CA: Adobe Systems Incorporated.

Whitecotton, S. M. 1996. The effects of experience and confidence on decision aid reliance: A causal model. Behavioral Research in Accounting 8: 194–216.

Yelon, L., and K. Ford. 1999. Pursuing a multidimensional view of transfer. Performance Improvement Quarterly 12: 58–78. doi:10.1111/j.1937-8327.1999.tb00138.x
1. We differentiate between CAATs and decision aids (DA). While CAATs may generate audit evidence, they only provide inputs into the actual audit judgment. DA, on the other hand, facilitate the integration and application of evidence into the judgment process. Thus, CAATs provide input while DA provide or facilitate outputs of audit judgment (Whitecotton 1996).

2. Aguinis and Kraiger (2009) point out that technology training is increasingly delivered on demand and online. They state one potential drawback of this approach is that it transfers more control to learners to make decisions about what, when, and how to learn.

3. Time budget pressure arises in engagements when resources allocated to a task are limited, while time deadline pressure relates to specific completion dates (DeZoort and Lord 1997).

4. The authors obtained approval for this study from their institutional review boards.

5. The significance (and non-significance) of model variables in our reported results is robust to inclusion of an administration method variable.

6. Certainly individual auditors' busiest time periods will vary depending on the types of engagements/clients they have, the fiscal year-ends and reporting deadlines of these clients, etc. Therefore, we emphasized in the case study: "Because most all of your clients have December 31 year-ends, your busy season typically runs from late October to early March."

7. Because the significance of some tests differs depending upon whether the ten participants who answered the manipulation check question incorrectly are included, we exclude these participants from the results discussed below and from all further analyses.

8. Hair, Anderson, Tatham, and Black (1998) provide the following recommendations for our sample size and number of variables: significant χ2 p-values can result even with good fit; SRMR should be below 0.08 (with a CFI of 0.95 or higher); and CFI should be 0.95 or better.

9. None of these demographic variables have a significant effect on the dependent variable, Intention to Train: Task Experience (Pearson correlation = −0.14; p = 0.15); Gender (Mann-Whitney U test p = 0.43); and Position (Kruskal-Wallis test p = 0.19).

10. The following questions/demographics were also tested and did not significantly alter the reported results: "When new audit technologies are introduced at the firm I feel pressure to use them"; "IT auditors should be in charge of technology on audit engagements"; number of college classes in MIS and AIS; number of hours of technology training received since leaving college; time since last technology training; years of public accounting experience; age; years of non-public accounting audit experience; years of public accounting IT audit experience; years of non-public accounting non-audit experience; years of non-public IT audit experience; audit industry specialization; education level; certifications (CPA, CFE, CIA, other); firm size; and number of CAATs used previously.

APPENDIX A

Assume you are an in-charge auditor with a large public accounting firm. Your firm has recently developed a new CAAT tool (which the firm has nicknamed "K-SUL") that is used for the year-end search for unrecorded liabilities. If you are trained on the new tool (K-SUL) and are able to successfully implement it, then you will reduce the budgeted hours for liabilities by 50 percent for most of your clients, even with the extra time it will take to implement the software. You should also be able to use the software on these clients next year. However, K-SUL is considered to be somewhat complex, so there is a slight possibility that implementation may be difficult. Because most all of your clients have December 31 year-ends, your busy season typically runs from late October to early March.

The firm has made K-SUL available but is not requiring its use. In order to use K-SUL effectively, however, you will need training. The training will be done online and is estimated to take about six hours to complete. You have heard good preliminary reports about this particular online class, but the class will not be available until the third week in July (the third week in November, or the third week in December).

Please write a short explanation for your decision.