ABSTRACT
Facing a more competitive environment, institutions in the higher education sector increasingly deploy enterprise resource planning (ERP) systems to facilitate better decision making. Of more recent origin, business analytics approaches are supplementing this technology. However, based on anecdotal accounts, many of these organizations have not reaped the advantages that were sought from these advances. The current research explores this conundrum by proposing and testing a model of perceived ERP effectiveness. Using data collected in a survey of colleges in the U.S., the results show that although distinctions between information quality and systems quality tend not to be made, overall perceived input quality is associated with ERP effectiveness. ERP effectiveness is only indirectly affected by general information technology competence. Here, perceived organizational support exists as an important mediating construct, but business analytics are not perceived to play a consequential role.
JEL Classifications: C3; L3; I22.
Data Availability: Survey data available upon request.
I. INTRODUCTION
The state of higher education is changing. Although higher education has become an economic juggernaut with revenues of $634 billion, including $391 billion for public and $243 billion for private institutions (National Center for Education Statistics [NCES] 2020), the economic outlook does not suggest a continuation of growth and prosperity. External forces are influencing the continuation of traditional sources of funding for both public and private schools. In over half the states in the U.S., public institutions now compete for governmental funding based on student performance (Tandberg and Hillman 2014; Rabovsky 2012). Meanwhile, private institutions have weathered a 10 percent decline in net tuition revenue over the past decade because private schools across the country are awarding more institutional aid to students in an unsustainable effort to maintain enrollments (Vanover Porter 2015). According to a 2018 survey, the discount rate for first-time, full-time college freshmen at private universities exceeded 52 percent, meaning that the net tuition institutions collect is less than half of what the posted tuition rate would imply (Valbrun 2019). In sum, financial pressures on higher education are significant and growing (Powell, Gilleland, and Pearson 2012). This situation is exacerbated by unfavorable enrollment trends through the next decade (Barshay 2020). Demographic trends suggest an enrollment cliff in the year 2025, when the number of college students is predicted to fall by more than 15 percent (Kline 2019).
As institutions of higher learning adapt to a less beneficial external environment, they are increasingly seeking better information and more efficient means of using it. Institutions are more cognizant that better decision making will be demanded when resources are not as plentiful (Campbell and Fogarty 2018; McCourt 2015). Toward this objective, institutions appear to be aware of an increased need for competitive intelligence (Norris and Baer 2013) that can be used in a more fluid and consistent manner. Many feel that higher education institutions must become more data driven in order to be more effective in achieving mission objectives as well as satisfying increasing regulatory requirements (Grajek 2016).
Research in information systems and accounting information systems has long recognized the importance of decision making (Kelton, Pennington, and Tuttle 2010), although most of this work has been focused on for-profit organizations. Not to be outdone, institutions of higher learning have invested significantly in technology for the promise of data-driven decisions, spending upward of 5 percent of their budgets on information processing technologies1 (Then and Amaria 2013). Similar to corporations, higher education is in the process of turning toward heightened centralized decision making in the pursuit of coordination and focus (AICPA 2013; Bolt-Lee and Moody 2008). The sector remains challenged by the rapid expansion of relevant data (AICPA 2013) and by the ability to marshal the resources to understand them (LaValle, Hopkins, Lesser, Shockley, and Kruschwitz 2010; Norris and Baer 2013). Many believe that data analytics could help enable schools to better see trends, ask critical “what-if” questions, discern correlations, apply predictive models, and use all these new insights to take those actions that would constitute superior strategic capability (Grajek 2016).
Given the unique nature of higher education, the sudden change in its financial environment, and its nascent move toward modern information technologies, a consideration of ERP effectiveness is appropriate. This study first proposes a model of relationships responding to calls for such an exercise (e.g., Ferguson and Seow 2011). An empirical study is offered as a means to evaluate the information environment of higher education.
Using data collected in a national survey of collegiate finance executives, this study's results show a selective embrace of innovation routinely found in corporate applications. Information technology (IT) distinctions do not seem as precisely delineated, and business analytics have not yet made a clear contribution to the perception of organizational effectiveness. Higher education has been characterized as a laggard sector in regard to technology adoption (Manyika et al. 2011). Additionally, the importance of organizational support to ERP effectiveness constitutes another noteworthy contingency that limits the ability to fully exploit information. The contribution of this study is to identify the perils of not embracing the unique environment of higher education as it attempts to operate more fully exposed to dynamic market forces.
II. LITERATURE REVIEW AND HYPOTHESES DEVELOPMENT
The Information Success Model (IS Model) has been well established in prior literature, starting with work by DeLone and McLean (1992, 2003). The IS Model identifies and describes the relationships among six critical dimensions of information success: information quality, system quality, service quality, system use/usage intentions, user satisfaction, and net system benefits. In the IS Model, net system benefits result from attributes of the system and how it is used by people (Gable, Sedera, and Chan 2008; Sedera and Gable 2004; Kanellou and Spathis 2013; DeLone and McLean 1992). The literature streams that have flowed from the IS Model form the conceptual backbone of the current study.
The dependent variable in the information success model is “net benefits.” However, other research has considered “information technology accounting benefits” as opposed to “net benefits” (Kanellou and Spathis 2013). The role of accountants has changed due to ERP implementation (Kanellou and Spathis 2013), and the relationship between technologies and accountants has become increasingly intertwined (Grabski, Leech, and Schmidt 2011). In the context of this research study, ERP effectiveness is defined in ways that would be valued by those charged with accounting-type responsibilities. This includes flexibility in information generation, increased integration with accounting applications, improved quality of reports, improved decisions based on timely and reliable information, and the reduction of time to close annual accounts (Kanellou and Spathis 2013). While “effectiveness” is capable of other meanings, and ERP systems possess capabilities of other sorts not embraced by this definition, this research's conception of the construct offers both breadth and importance.
Direct Effects on ERP Effectiveness
As brutally captured by the dictum “garbage in, garbage out,” information system effectiveness is strictly limited by the organization's ability to capture transactions and other important events. Prior research has suggested the key dimensions of information quality include accuracy, completeness, currency, and format (Nelson, Todd, and Wixom 2005; Wang and Strong 1996). Accuracy is commonly defined as correctness in the mapping of stored information (Nelson et al. 2005). Completeness refers to the degree to which all possible states relevant to the user population are represented in the stored information (Nelson et al. 2005; Huh, Keller, Redman, and Watkins 1990). Currency refers to the degree to which information is timely (Nelson et al. 2005). Format refers to the degree to which information is presented in ways that are understandable to the user (Nelson et al. 2005; Wang and Strong 1996). Thus, information quality comprises many characteristics that enhance or limit value, and these can be treated as empirical assertions (Trites 2013).
Relatively new to the need to gather knowledge about their environments on more than a passive basis, colleges and universities cannot be assumed to be efficient information gatherers. In fact, survey results indicate that 90 percent of business officers believe that greater importance should be placed on using information more effectively to create value in understanding operational efficiencies, cost of instruction, internal academic process assessments, and faculty workload planning (Wayt 2019). If it is not captured from the environment of higher education, information cannot be part of a decision. The following statement captures the expectation that perceptions of data quality limit the perceived success of information processing.
H1: Perceived information quality positively affects perceived ERP effectiveness in higher education applications.
System quality reflects the attributes of the information processing system that are required to produce desired output (DeLone and McLean 2003). Prior research suggests that system quality is derived from response time, flexibility, reliability, accessibility, and integration (Nelson et al. 2005; Gorla, Somers, and Wong 2010). Sedera and Gable (2004) introduced a comprehensive instrument to measure system quality for a literature that treats integration as one of the key attributes. Integration refers to how information is aggregated from various sources. Due to the strong reliance on both structured and unstructured data experienced by most institutions, integration is a key component of system quality (Wixom and Todd 2005).
Since ERP systems were created for business organizations, their fit to the unique needs of higher education is an important empirical question. Although incentives exist to extend the range of ERP applications, we cannot assume complete success. Knowledgeable parties might believe in the quality of the system and its impressive abilities, but yet not believe that worthwhile information will result. This issue can be expressed as:
H2: Perceived system quality positively affects perceived ERP effectiveness in higher education applications.
The existence of good data and an appropriate processing technology does not guarantee effectiveness. Much depends upon what users ask of the data with their system. In recent times, many observers have reported the existence of a watershed moment that has promoted a decision-making format that many call “Big Data” (Breur 2016). Those who advocate the new approach to decision making believe that valid inferences can be made from sufficiently large datasets, some of which are only marginally connected on their face to the phenomenon of interest, but nonetheless reveal consequential correlations of interest. Big Data approaches are managed by tools called analytics. Business analytics refers to the extensive use of data, statistical and quantitative analysis, explanatory and predictive modeling, and fact-based management that drive decisions and actions (Davenport and Harris 2017). “Advances in technology have enabled business to develop innovative ways to collate data from both internal and external sources. This leads to unprecedented challenges of Big Data characterized by high volume and high velocity” (Cao, Duan, and Li 2015, 384). Analytic tools are useful in extracting data directly from the tables in an organization's ERP system. Additionally, analytic tools can identify higher-risk situations by highlighting deviations from the expected business process and relationships (Capriotti 2014). Analytics use more information and demand more of the ERP system's data processing capabilities, and therefore are likely to be associated with stronger perceptions of effectiveness. Whereas we have reason to believe that for-profit corporations have been working in the direction of analytic tools and reasoning for some time (Acito and Khatri 2014), the same journey cannot be assumed in higher education. Those who believe in the new approach to what constitutes data will probably also believe in the heightened power of this data integration. This leads to the following hypothesis:
H3: The perceived value of business analytics positively affects perceived ERP effectiveness in higher education applications.
Although IT competency is defined expansively as the processes, tasks, and technology needed to transform inputs to outputs (Grover, Jeong, Kettinger, and Teng 1995), the most unique element of this concept is the human one. Consistent with this, Chakravarty, Grewal, and Sambamurthy (2013) use IT capabilities as a measure of the understanding and technical proficiency needed to leverage IT systems, reflecting the extent to which a firm has strong technical skills and IT-related knowledge. While IT competency artifacts are the tools and methods that improve the capacity for work, much of their value resides in the know-how of staff. Information technology has proven to be so dynamic an area that those who work in it are challenged to match the evolving expectations that new technology poses; what passed for state-of-the-art one day is passé the next (Brooks 2015). CIOs in higher education identify hiring and retaining qualified staff and updating the knowledge and skills of existing technology staff as a top priority (Brooks 2015). If people are organizations' most valuable asset, their skills should be central to perceptions of overall effectiveness. We expect that people will believe that an alignment exists between an organization's human capabilities and its technical processing abilities. This is captured as:
H4: Perceived IT competency positively affects perceived ERP effectiveness in higher education applications.
Indirect Effects
The four direct effects posited in the previously stated hypotheses suggest a rather simple additive model. Along those lines, ERP effectiveness should be the product of having the right information processed by a system with strong capabilities, operated by people who know what they are doing and have adopted the modern analytic perspective on decision making. By specifying partial mediation effects, such direct effects are made more circumspect and conditional.
We suspect the analytics effect is different from the other hypotheses in that it represents the adoption of a new way of thinking. Whereas the positioning and acquisition of resources have always underscored the ability to accomplish desired results, the use of analytic techniques represents some degree of uncharted risk. These efforts have been viewed as a discretionary supplement to ERP platforms (Prasad and Green 2015). Top management support for such ventures in other areas has been viewed as essential (Liang, Saraf, Hu, and Xue 2007; Lee, Elbashir, Mahama, and Sutton 2014). People who generally trust their organizations can be expected to believe that the organization will do what it takes to step up to modern levels of IT value maximization. This contingency recognizes the important and pervasive role of top-level organizational support in higher education. Such a perspective is also consistent with efforts to create a more comprehensive appreciation of the information systems function (Myers, Kappelman, and Prybutok 1997). This expectation essentially qualifies the application of H3 to those organizations that provide employees space and latitude, making them more likely to believe in new methods and in integrated decision making:
H5a: Perceived organizational support mediates the positive effect of the perceived value of business analytics upon perceived ERP effectiveness in higher education applications.
The management literature has routinely found that perceived organizational support helps explain employees' emotional commitment to their organization (Rhoades and Eisenberger 2002; Becker, Klein, and Meyer 2012). An employee's treatment by the organization shapes perceived organizational support, which in turn influences the employee's interpretation of organizational motives (Eisenberger, Huntington, Hutchinson, and Sowa 1986). Elbashir, Collier, Sutton, Davern, and Leech (2013) show that the relationship between IT and business personnel is “crucial,” because only a positive relationship opens the organization to better business intelligence assimilation. Therefore, we would expect that perceived organizational support partially mediates, in a positive direction, the relationship between IT competency and ERP effectiveness. Increasing the nuance of previous expectations, we suggest that only organizations that convincingly signal their willing assistance can leverage the beliefs in analytics and integrated information, as suggested by the following:
H5b: Perceived organizational support mediates the positive effect of IT competency upon perceived ERP effectiveness in higher education applications.
As a new way of looking at the decision making that must be done, business analytics cannot be completely separated from the information that will be used and the people that will use it. An adoption of the proactive analytic approach can be expected to put new meaning into what constitutes quality information, along the contingency lines suggested by Morton and Hu (2008). In a similar vein, an organization that has committed to business analytics will place higher demands on people. What constitutes competency gradually will take on a more stylized and esoteric meaning. Whether business analytics has changed everything resides not only in its direct effects, but also in its conditioning of relationships between other variables. Thus, business analytics, as a new worldview regarding data and their use, can be expected to shape the role of other effectiveness antecedents, as expressed in these two propositions:
H6a: The perceived value of business analytics moderates the relationship between information quality and ERP effectiveness such that information quality has a stronger positive effect on perceived ERP effectiveness in situations where business analytics are more involved in higher education applications.
H6b: The perceived value of business analytics moderates the relationship between IT competency and ERP effectiveness such that IT competency has a stronger positive effect on perceived ERP effectiveness in situations where business analytics are more involved in higher education applications.
Figure 1 depicts the conceptual model that includes the hypotheses stated above. For these purposes, four direct effects and four indirect effects have been included; the latter comprise two partial mediation expectations and two moderation expectations.
In light of the fact that higher education is highly diversified as a sector, control variables are needed to capture the variation that may be attributable to these divisions. While many classifications of institutions might be suggested, the most obvious pertains to the major source of an institution's funding (see Perry and Rainey 1988). The U.S. has a rich tradition of private and public institutions, a division that has had pervasive consequences upon these organizations (Geiger 1988). Although governmental funding has declined in recent years, public schools remain creatures of the state, legally accountable to external officials. Public schools, in partial recognition of their political nature, may be less flexible and willing to adopt new managerial ideas (Hegde 2005).
A second control variable pertains to the enrollment trend of the institution. Schools facing declining enrollments are likely to be under more pressure to adopt new management practices, including those related to information technology. Schools with a more stable flow of tuition-paying students may be more content to continue old practices. To some extent, relative success may breed a complacency, or at least allow more time to react to change (Christensen and Eyring 2011).
The two control variables are included in the model to guard against the possibility that the definition of information system effectiveness might vary within segments of the higher education field. To the extent that these classifications capture economic affordability, they might also temper the interpretation of antecedents.
III. METHODOLOGY
The topic of information system effectiveness requires the collection and evaluation of the perceptions of knowledgeable organizational participants. Therefore, a survey was designed and administered for that purpose.
Measures
The study of ERP effectiveness in business organizations has produced a set of usable scales that could be adopted for use in the higher education field. Where necessary, new questions were designed and screened for suitability. Responses for each of the constructs were collected using a five-point response Likert scale [1 = strongly disagree, 5 = strongly agree]. All measured items are reproduced and grouped by construct in Appendix A.
ERP effectiveness is approached as the sum of critical benefits produced by the ERP system. Following Kanellou and Spathis (2013), these include increased flexibility in information generation, increased integration of accounting applications, improved quality of reports, and reduction of time to close annual accounts. For these purposes, nine items from a 21-item scale developed by Kanellou and Spathis (2013) are used.
Prior research shows an a priori model of enterprise system success that defines information quality and systems quality among other mutually exclusive success dimensions (Sedera and Gable 2004). In this sense, both information quality and system quality are IT artifacts that can help measure individual and/or organizational impact. Following Sedera and Gable (2004), information quality is defined in generic ways to capture the caliber of raw information. These include availability, usability, understandability, relevance, format, and conciseness dimensions.
System quality is defined as how the system performs from a technical and design perspective (Gable et al. 2008). The six-item scale was adapted from Sedera and Gable (2004). The items question the system's ability to capture the necessary input, meet user needs, and possess desired features and options. Also included are questions pertaining to system integration, access, and customizability.
The scale used to consider the perceived value of business analytics is adapted from Torkzadeh and Xia's (1992) effort to capture management's understanding, development, and use of this modern technology in strategic planning. Six of their eight items are used to explore management's perceptions of importance, range, and their willingness to support use.
Perceived organizational support is defined as the global belief concerning the extent to which the organization values employee contributions and cares about their wellbeing (Eisenberger et al. 1986). This research adapted the two most global questions from a much larger scale developed by Eisenberger et al. (1986) to measure employee perceptions about this aspect of their organization.
IT competency is defined as the processes, tasks, and technology needed to transform inputs to outputs (Grover et al. 1995). IT competency is measured with a seven-item scale that was adapted from Grover et al. (1995). The items sought responses about basic IT expertise in the organization, specific knowledge about existing applications, and the extent to which the organization was proactive about emerging capabilities. Additional questions about IT capabilities measures were adopted from a scale from Chakravarty et al. (2013).
Participant Sample and Data Collection
In order to identify the factors that influence ERP effectiveness in higher education, working practitioners in the field were surveyed. The participants in this study are business officers at institutions of higher learning in the U.S.2 Participants include business officers from private as well as public non-profit colleges and universities. Employees of for-profit institutions were deliberately excluded. Participants in the study held primary roles as chief financial officer or were at an associate vice-president level or higher. The dataset consists of 333 survey participants who were identified through their affiliations with the National Association of College and University Business Officers (NACUBO) and other regional higher education business officer associations. Additionally, most states maintain a consortium of private independent colleges as a collaborative resource, and a number of these organizations agreed to disseminate the survey instrument to individual members. Of the approximately 1,900 surveys that were distributed, 476 (25.0 percent) were returned. Of these, 333 (70 percent) were sufficiently complete for analysis, amounting to a 17.5 percent effective yield. To ensure homogeneity of the sample, participants were asked whether their role was that of their school's chief business officer. Based on a Chi-square test of differences, the responses did not significantly vary by job classification and therefore were treated as a single sample.
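For transparency, the reported yield percentages follow directly from these counts:

\[
\frac{476}{1{,}900} \approx 25\ \text{percent (raw return rate)}, \qquad
\frac{333}{476} \approx 70\ \text{percent (usable responses)}, \qquad
\frac{333}{1{,}900} \approx 17.5\ \text{percent (effective yield)}.
\]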
Descriptive Statistics
Table 1 provides information about the respondents. Since identifying information was optional, some of what we know is limited to those who chose to provide the information. We do know that the effort to obtain a national sample was successful, with survey participants from 41 states. There is variability among the institutions represented by the respondents in terms of both student enrollment magnitude and the size of the institutional endowment. Student enrollments characterized as large (greater than 15,000 students), midsize (between 5,000 and 15,000 students), and small (less than 5,000 students) comprise 35 percent, 37 percent, and 38 percent of the respondents who chose to identify. The size of the institutions' endowments, among those who chose to identify their institution, also showed great variety. For example, 50 institutions report endowments in excess of $500 million, while 66 institutions report endowments of less than $25 million. There are 69 institutions whose endowments fall between $50 million and $250 million. The breadth and scope of the sample pool suggest that a relatively normal distribution of institutions of higher education participated in the study.
There were six variables with some missing values, all of which constitute less than 1 percent of all responses. These missing values were replaced with the median for ordinal scales and the mean for continuous scales. This conventional method did not produce results different than if the missing observations had been deleted.
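A minimal pandas sketch of this imputation rule, assuming a flat response file and hypothetical item prefixes:

```python
import pandas as pd

# Hypothetical flat file of survey responses; Likert items are ordinal, size measures continuous.
df = pd.read_csv("survey_responses.csv")  # assumed file name

ordinal_cols = [c for c in df.columns if c.startswith(("IQ", "SQ", "BA", "ITC", "POS", "ERP"))]
continuous_cols = ["endowment", "enrollment"]  # hypothetical continuous fields

# Median replacement for ordinal scales, mean replacement for continuous scales.
df[ordinal_cols] = df[ordinal_cols].fillna(df[ordinal_cols].median())
df[continuous_cols] = df[continuous_cols].fillna(df[continuous_cols].mean())
```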
During data screening, the data were examined for skewness and kurtosis. All variables exhibited modest skewness and kurtosis, but the values fell within acceptable ranges (Muthén and Kaplan 1985). A large number of the variables have a negative kurtosis, a condition that indicates that there are not many outliers. The dataset also was sufficiently large to reduce the effects of skewness and kurtosis when using Likert scales. Descriptive statistics for the variables are found in Table 2.
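The screening step can be reproduced with a short script; the file name and item prefixes are assumptions, and the |2| guideline shown is a common rule of thumb rather than the study's stated cutoff:

```python
import pandas as pd
from scipy import stats

items = pd.read_csv("survey_responses.csv").filter(regex="^(IQ|SQ|BA|ITC|POS|ERP)")  # assumed names

screening = pd.DataFrame({
    "skewness": items.apply(stats.skew),
    "kurtosis": items.apply(stats.kurtosis),  # excess kurtosis; negative values imply light tails
})

# Flag any item outside a common |2| rule of thumb for Likert-type data.
print(screening[(screening.abs() > 2).any(axis=1)])
```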
As shown in Table 2, the mean scores for the nine items that measure perceived ERP effectiveness merit our attention. All are below the 3.0 midpoint of the five-point Likert scale upon which they were measured. They average 2.608, indicating that most respondents disagree with the conclusion that their ERP system is effective. In results not shown, the perception that ERP systems are not effective also generally holds for the major subsets of the data. More perceived ineffectiveness exists at public schools [global average = 2.574]. However, effectiveness did not seem to systematically vary by enrollment size, although the lowest effectiveness rating was found at the midsized schools [enrollments between 2,500 and 5,000 students]. The standard deviation of the nine measures is also quite low, again suggesting that respondents who perceive more than average effectiveness are rather rare. Given the cost of these technologies, these descriptive results increase our motivation to investigate antecedents.
Factor Structure
An exploratory factor analysis (EFA) was conducted, using maximum likelihood extraction with a promax rotation, in order to determine whether the factors loaded adequately in this model. The communalities of the items were assessed using a threshold of 0.40, as recommended by MacCallum, Widaman, Zhang, and Hong (1999). Items were eliminated sequentially as the EFA output was continuously reassessed. Once the communalities were deemed acceptable, the freely estimated model loaded to a five-factor solution. Items that exhibited cross-loadings in excess of 0.32 were removed from the pattern matrix (Reio and Shuck 2014; Costello and Osborne 2003), as were items with loadings below the threshold acceptability level of 0.40 (Brown 2015). Using Reio and Shuck's (2014) recommendation, eigenvalues greater than 1.0 are used in conjunction with a scree plot test as criteria to assess the number of factors. The loadings of the emergent factors are provided in Table 3.
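A sketch of this EFA workflow, using the open-source factor_analyzer package as a stand-in for the study's software (item prefixes and file name are assumptions):

```python
import pandas as pd
from factor_analyzer import FactorAnalyzer

items = pd.read_csv("survey_responses.csv").filter(regex="^(IQ|SQ|BA|ITC|POS|ERP)")  # hypothetical prefixes

# Maximum likelihood extraction with promax (oblique) rotation, five factors retained.
fa = FactorAnalyzer(n_factors=5, rotation="promax", method="ml")
fa.fit(items)

communalities = pd.Series(fa.get_communalities(), index=items.columns)
low_communality = communalities[communalities < 0.40]   # candidates for sequential removal

eigenvalues, _ = fa.get_eigenvalues()                    # Kaiser criterion: retain eigenvalues > 1
loadings = pd.DataFrame(fa.loadings_, index=items.columns)

# Items with primary loadings below 0.40 or cross-loadings above 0.32 would be dropped and the EFA rerun.
print(low_communality, eigenvalues[:6], loadings.round(2), sep="\n")
```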
Although system quality and information quality are two well-established constructs in the information success model (Gable et al. 2008; Sedera and Gable 2004; Kanellou and Spathis 2013; DeLone and McLean 1992), these two variables converged in the EFA. This result suggests that chief business officers perceive system quality and information quality to be essentially the same thing in the context of higher education.
The Bartlett's Test of Sphericity is significant, and the Kaiser-Meyer-Olkin (KMO) measure is adequate at 0.91. The five-factor EFA explained 59.6 percent of the variance with non-redundant residuals of less than 8.0 percent. As evidence of convergent validity, all the loadings in the pattern matrix are greater than 0.50 (Hair, Black, Babin, and Anderson 2010). As evidence of discriminant validity, the questions comprising the constructs have no large cross-loadings in the final pattern matrix. Curve estimation of our model relationships suggests that all relationships were sufficiently linear. The variance inflation factor (VIF) values were all less than 2.0, indicating that the variables are distinct and that multicollinearity was negligible (Stine 1995; O'Brien 2007).
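These adequacy and collinearity checks can be reproduced along the following lines; the construct-level composites used for the VIF calculation are an illustrative assumption rather than the study's documented procedure:

```python
import pandas as pd
import statsmodels.api as sm
from factor_analyzer.factor_analyzer import calculate_bartlett_sphericity, calculate_kmo
from statsmodels.stats.outliers_influence import variance_inflation_factor

items = pd.read_csv("survey_responses.csv").filter(regex="^(IQ|SQ|BA|ITC|POS|ERP)")  # assumed names

chi2, p_value = calculate_bartlett_sphericity(items)  # a significant p supports factorability
_, kmo_total = calculate_kmo(items)                   # sampling adequacy (0.91 reported above)

# Collinearity check on construct-level composite scores (illustrative construction).
composites = pd.DataFrame({
    "input_quality": items.filter(regex="^(IQ|SQ)").mean(axis=1),
    "analytics": items.filter(regex="^BA").mean(axis=1),
    "it_competency": items.filter(regex="^ITC").mean(axis=1),
    "org_support": items.filter(regex="^POS").mean(axis=1),
})
X = sm.add_constant(composites)
vifs = {col: variance_inflation_factor(X.values, i)
        for i, col in enumerate(X.columns) if col != "const"}
print(chi2, p_value, kmo_total, vifs)
```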
Using the final pattern matrix from the EFA, a confirmatory factor analysis (CFA) was estimated using AMOS software (v25). In this model, no unidirectional path was specified between any of the latent variables. We estimated a covariance model in which all of the latent variables were correlated with one another. The modification indices showed several strong relationships, which were evaluated to determine which items should be eliminated in order to achieve better model fit. Three items from information quality, two items from business analytics, and one item from ERP effectiveness were eliminated in this process.
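The CFA was run in AMOS; as an open-source analogue, a lavaan-style specification in semopy with freely correlated latent variables might look like the following sketch (item names are hypothetical):

```python
import pandas as pd
import semopy

df = pd.read_csv("survey_responses.csv")  # assumed file; item names below are hypothetical

# Measurement model only: no structural paths, so the latent variables simply covary.
desc = """
InputQuality =~ IQ1 + IQ2 + IQ3 + SQ1 + SQ2 + SQ3
Analytics    =~ BA1 + BA2 + BA3 + BA4
ITCompetency =~ ITC1 + ITC2 + ITC3 + ITC4
OrgSupport   =~ POS1 + POS2
ERPEffect    =~ ERP1 + ERP2 + ERP3 + ERP4
"""

cfa = semopy.Model(desc)
cfa.fit(df)
print(cfa.inspect())           # loadings and latent covariances
print(semopy.calc_stats(cfa))  # fit indices analogous to those reported from AMOS
```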
Following procedures recommended by P. Podsakoff, MacKenzie, and N. Podsakoff (2012) and Bagozzi (2011), the model was evaluated for the influence of common method bias. Finding that some portion of the shared variance could be attributed to common method bias, the model was amended to incorporate an unmeasured latent factor to which significant paths were linked. Table 4 contains the correlation matrix for the final model.
Validity and Reliability
Acceptable Cronbach's alpha statistics were produced (Eisinga, Grotenhuis, and Pelzer 2013). The dependent variable's (ERP effectiveness) alpha was 0.72. Several independent variable scales achieved or exceeded the 0.90 level. These included the information quality-systems quality composite, made necessary by the factor analysis results. The other scales all exceeded the 0.80 level, with most above 0.85. These results suggest strong reliability (MacKenzie, P. Podsakoff, and N. Podsakoff 2011). Convergent validity was deemed acceptable by an inspection of the average variance extracted (AVE) metric. All variables had scores exceeding the conventional 0.5 benchmark (MacKenzie, Podsakoff, and Jarvis 2005) except IT competency, where an AVE of 0.471 approached the benchmark. Following Borsboom, Mellenbergh, and van Heerden (2004), shortfalls of this magnitude do not necessarily indicate serious validity concerns. Composite reliability was also evaluated, with values ranging from 0.778 to 0.889; this metric always exceeded the 0.70 standard recommended by Hair et al. (2010).
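The reliability and validity metrics reported above follow standard formulas, sketched below; the loadings in the example call are illustrative rather than the study's estimates:

```python
import numpy as np
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """alpha = k/(k-1) * (1 - sum of item variances / variance of the summed scale)."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)
    total_variance = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_variances.sum() / total_variance)

def ave_and_cr(loadings: np.ndarray) -> tuple[float, float]:
    """Average variance extracted and composite reliability from standardized loadings."""
    lam_sq = loadings ** 2
    ave = lam_sq.mean()
    cr = loadings.sum() ** 2 / (loadings.sum() ** 2 + (1.0 - lam_sq).sum())
    return ave, cr

# Illustrative standardized loadings (not the study's estimates) for a single factor.
ave, cr = ave_and_cr(np.array([0.62, 0.68, 0.71, 0.70, 0.66]))
print(round(ave, 3), round(cr, 3))
# cronbach_alpha() would be applied to each construct's block of item responses.
```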
Model Fit
The hypothesized model shown in Figure 1 was tested using a structural equation modeling approach, as operationalized by AMOS. Following procedures recommended by Hayes (2013), bias-corrected resamples were run 2,000 times using a 95 percent two-tail confidence interval and uncorrelated error terms.
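AMOS handles the resampling internally; the following regression-based percentile bootstrap is a simplified sketch of the same logic for a single indirect effect (column names are hypothetical composite scores, and this approximates rather than replicates the bias-corrected SEM bootstrap):

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

def bootstrap_indirect(df: pd.DataFrame, x: str, m: str, y: str,
                       n_boot: int = 2000, seed: int = 0):
    """Percentile bootstrap of the indirect effect x -> m -> y (a*b) with a 95 percent CI."""
    rng = np.random.default_rng(seed)
    estimates = []
    for _ in range(n_boot):
        sample = df.sample(len(df), replace=True, random_state=int(rng.integers(1 << 31)))
        a = sm.OLS(sample[m], sm.add_constant(sample[[x]])).fit().params[x]
        b = sm.OLS(sample[y], sm.add_constant(sample[[m, x]])).fit().params[m]
        estimates.append(a * b)
    lower, upper = np.percentile(estimates, [2.5, 97.5])
    return float(np.mean(estimates)), (float(lower), float(upper))

# Example call with hypothetical columns:
# indirect, ci = bootstrap_indirect(scores, "analytics", "org_support", "erp_effectiveness")
```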
Model fit statistics before the common method adjustment (shown in greater detail in Table 4) included a Chi-square ratio of 3.91 and an RMSEA3 of 0.094. The final model fit statistics with the unmeasured latent factor are: CMin/df = 1.432; GFI = 0.937; CFI = 0.991; TLI = 0.988; and NFI = 0.955 (Hair et al. 2010; Brown 2015; Hu and Bentler 1998, 1999; Hooper et al. 2008). Judged against established benchmarks (see Hair et al. 2010; Hooper et al. 2008; Hu and Bentler 1998), adequate model fit has been achieved.
IV. RESULTS
As shown in Figure 2, the squared multiple correlations for the endogenous variables show R2 values for ERP Effectiveness = 0.61 and the R2 for Organizational Support = 0.51, indicating that sufficient variance is explained for each of these outcome variables in the model.
FIGURE 2
Structural Equation Model
*, ** Indicate significance at p < 0.05 and p < 0.001, respectively.
n = 333.
Direct Effects
H1 concerns whether information quality is positively associated with ERP effectiveness. In results not shown, information quality as originally measured and hypothesized had a significant effect on ERP effectiveness. However, the merger of system quality and information quality, discovered in the factor analysis results above, moots such an independent test. System quality contributes to what now could be thought of as input quality, and has been tested in a reformulated H1-H2. As evidenced on Figure 2 by a coefficient of 0.733 (p < 0.001), high levels of perceived input quality are associated with positive perceptions of ERP system effectiveness. Therefore, the composite H1-H2 is supported.
The third hypothesis pertains to the influence of the use of business analytics upon the ability to extract more positive ERP effectiveness perceptions. Here the thinking was that those respondents who more profoundly accepted analytics would also reach a new plateau in their appreciation of ERP systems effectiveness. However, the lack of significance for this path (β = −0.039; p > 0.10) indicates that this is not the case. The coefficient is not even in the hypothesized direction. People who appreciate the importance and role of business analytics do not tend to have higher ERP effectiveness perceptions. No support was found for H3.
The last direct effect (H4) proposes that organizations that possess more IT competency will have higher perceptions of ERP effectiveness. The results suggest that this is not the case. The negative sign of the coefficient tells us that lower levels of competence tend to be associated with higher perceptions of ERP effectiveness, albeit at the weaker level of significance (β = −0.086; p < 0.05). As stated, H4 is not supported.
Mediation and Moderation Effects
H5a posits that business analytics will work in an indirect fashion, through perceived organizational support, along a pathway that could be written as: perceived analytics value → perceived organizational support → perceived ERP effectiveness. This expectation is sustained with the significant indirect path from business analytics value to perceived organizational support at β = 0.589 (p < 0.001), and from perceived organizational support to perceived ERP effectiveness β = 0.243 (p < 0.001). This sequence of significant effects supports H5a.
The indirect path between perceived IT competency and perceived ERP effectiveness (H5b) is found to be significant. Specifically, perceived organizational support partially mediates the relationship from IT competency to ERP effectiveness (ITComp → POS → ERP). The path from perceived IT competency to perceived organizational support is significant (β = 0.212; p < 0.001), as is the relationship between perceived organizational support and ERP effectiveness (β = 0.243; p < 0.001). This result supports H5b.
Both H6a and H6b relate to the moderation of direct effects. These were tested using interaction terms created in SPSS for the composite Input (information/system) Quality ∗ Analytics, as well as IT Competency ∗ Analytics. A review of the estimates determined there are no significant interaction effects present on ERP effectiveness, as judged by these two interaction terms. The effect of Business Analytics ∗ Input Quality is not significant (β = −0.007; p > 0.10), nor is the effect of IT Competency ∗ Analytics (β = −0.003; p > 0.10).
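A sketch of how such interaction terms can be constructed and tested, using mean-centered composite scores (the study built these terms in SPSS; the file and column names below are hypothetical):

```python
import pandas as pd
import statsmodels.formula.api as smf

scores = pd.read_csv("construct_scores.csv")  # hypothetical per-respondent composite scores

# Mean-center the predictors before forming product terms.
for col in ["input_quality", "it_competency", "analytics"]:
    scores[col + "_c"] = scores[col] - scores[col].mean()

scores["iq_x_ba"] = scores["input_quality_c"] * scores["analytics_c"]
scores["itc_x_ba"] = scores["it_competency_c"] * scores["analytics_c"]

# H6a/H6b reduce to tests of the two interaction coefficients.
model = smf.ols(
    "erp_effectiveness ~ input_quality_c + it_competency_c + analytics_c + iq_x_ba + itc_x_ba",
    data=scores,
).fit()
print(model.summary())
```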
Control Variables and Additional Tests
Neither control variable in the model was significantly related to ERP effectiveness (Enrollment Trend: β = −0.031; p > 0.10 and School Type (public versus private): β = 0.003; p > 0.10). Despite no direct significance as controls, further consideration was given to the effects of these variables. A multi-group analysis separating public and private schools showed that the path from information quality to ERP effectiveness is stronger for private institutions than public schools (p < 0.01). Meanwhile, the path from perceived organizational support to ERP effectiveness is stronger for public schools than private schools (p < 0.01). A similar multi-group analysis on enrollment size (more than/less than 10,000 students) was conducted with a Chi-square difference test, in which the models are freely estimated except for the path from perceived organizational support to ERP effectiveness, which is constrained to be equal across the two groups. Although there were no differences at the path level (p > 0.01), the explanatory power of the model for larger schools was higher.
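The Chi-square difference test behind this multi-group comparison reduces to comparing the constrained and freely estimated models; the fit values in this sketch are illustrative placeholders rather than the study's statistics:

```python
from scipy import stats

# Chi-square difference test: the constrained model fixes the POS -> ERP effectiveness path
# to be equal across groups. The fit values below are illustrative placeholders only.
chi2_free, df_free = 310.2, 220
chi2_constrained, df_constrained = 316.9, 221

delta_chi2 = chi2_constrained - chi2_free
delta_df = df_constrained - df_free
p_value = stats.chi2.sf(delta_chi2, delta_df)  # p < .05 implies the path differs between groups
print(delta_chi2, delta_df, round(p_value, 3))
```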
V. DISCUSSION
This study sought to investigate factors that influence perceived ERP system effectiveness in higher education. This inquiry included linkages among IS Model antecedents, and was supplemented with business analytics. Overall, support was found for some, but not all of the hypotheses. This underlines the unique setting of higher education.
Although system quality and information quality are two well-established constructs in the information success model (Gable et al. 2008; Sedera and Gable 2004; Kanellou and Spathis 2013; DeLone and McLean 1992), these two variables converged in the EFA, suggesting that chief business officers perceive system quality and information quality to be tightly intertwined. Information quality refers to the usefulness, timeliness, and clarity of the information, whereas system quality deals with how the system performs from a technical and design perspective. The fact that key finance personnel view these synonymously was a surprising revelation that requires additional investigation. Such a confluence may reflect a certain degree of user naivety about modern information processing that should be anticipated by system designers. Prior work with the dimensions of the IS Model in the literature suggests that this particular convergence is not unprecedented (see Seddon 1997).
The current study suggests that perceived organizational support serves to fully mediate the effect of perceived IT competency on ERP effectiveness as well as business analytics on ERP effectiveness. The positive relationship between business analytics and ERP effectiveness only exists when mediated through perceived organizational support. Likewise, the relationship between IT competency and ERP effectiveness only exists when mediated through perceived organizational support. Thus, perceived organizational support appears to be a powerful construct. Higher education could leverage this contingency to facilitate strategic information system initiatives. Although not hypothesized, multi-group level analysis also found perceived organizational support especially important for public institutions.
In the absence of perceived organizational support, the relationship between perceived IT competency and ERP effectiveness is negative and modestly significant. In other words, organizations lacking talent in this area are operating on blind faith. This finding seems consistent with a main conclusion from the practitioner literature that higher education has not kept pace with the demand for more actionable and comparable information (Norris and Baer 2013). Even as businesses are forced to rely more on technology to be competitive, IT executives struggle to create meaningful IS strategic alignment. Ample levels of organizational support appear critical to bridge the knowledge gap between business and IT people at the operational level, thus improving strategic IT alignment (Elbashir et al. 2013). IT professionals in higher education understand the need for alignment and the improved utilization of IT information (Norris and Baer 2013), but the practical factors that would ensure such alignment emerges in the higher education sector have yet to be determined.
The direct path between business analytics and ERP effectiveness is not significant. This result is surprising because the literature indicated that additional business analytics would increase overall ERP effectiveness for organizations. Apparently, the promise of business analytics has yet to be realized in higher education practice. This result is compounded by the finding that the interactions of business analytics with both input quality and IT competency on ERP effectiveness were also not significant. This was somewhat surprising, but consistent with findings from a McKinsey report on Big Data, which suggests that higher education “faces hurdles because of a lack of data-driven mind-set, and available data” (Manyika et al. 2011, 9). Corresponding to higher education previously being a lagging sector in terms of ERP adoption and implementation (Kvavik et al. 2002; Dodds, Fleagle, Patterson, and Denna 2014), we have to question whether institutions of higher education yet clearly recognize the value of analytics as an enhancement to those systems. This conclusion is consistent with a report that higher education has “made no measurable progress in analytics maturity in the past two years as fewer than 15 percent of institutional analytical programs might be described as strong or excellent” (Grajek 2016, 44). The literature suggesting that schools will likely place an increased focus on streamlined decision making through analytics in the near future (Norris and Baer 2013) may have to revise its timelines. The prevailing trend for schools to divest from inefficient solutions and reinvest to develop the necessary capabilities to use technology in order to create differentiation from others (Grajek 2016) may be slower to materialize than first thought.
The results have direct bearing upon real-world decision making in higher education. The merger of information and system quality suggests that currently people are treating their ERPs as more of a “black box” than they should, and that this can be improved with better technical education. The results also suggest that higher education has excessive skepticism about the use of the analytic tools that many believe will allow systems to reach more of their true potential. To the extent this reluctance is based on ignorance about what is already possible in the search for students and other resources, it should be overcome. This is exacerbated as many universities continue to depend on antiquated client-server or mainframe-based applications that lack the features that are required in today's interconnected world (Panettieri 2007). Finally, we find that universities have not provided their IT professionals with unwavering support for their efforts. The results show that the positive values of this support are not being achieved across the board. This finding is consistent with the literature that includes the finding that 80 percent of business officers think that culture and capacity barriers intensify the challenges to the integration and use of analytics in higher education (Wayt 2019).
This study had several limitations. First, not all possible measures of ERP effectiveness have been examined in this study. Second, higher education's approach to defining analytics may be inconsistent with conventional usage, as some definitions are more conceptual while others are more functional (Norris and Baer 2013). Taking data from high-level college officials may not have fully captured organizational capacity for analytics. Furthermore, there may be varying levels of implementation. Colleges may be in what has been called a first stage of static reporting, which could be quickly followed by dynamic analysis, intervention, and finally optimization (Norris and Baer 2013). To the extent that self-designed measures of some variables were used, improvements are always possible. Here, the measure of perceived organizational support could be made much more specific to IT personnel if we had known the perspective that respondents were most likely to adopt when answering survey items.
Notwithstanding the results of this paper, implementing analytics and applying data-driven decisions is likely to be a major differentiator between high-performing and low-performing organizations (Norris and Baer 2013). A hint of this might exist in this paper's multi-group level analysis between schools of different enrollment. An opportunity for future research exists to study whether modern information technology will empower smaller schools, or speed the rate at which they are left behind.
Another area for future research should be the apparent inability of financial decision makers to differentiate information quality and system quality. This lacuna may compromise the ability to make intelligent technology acquisitions going forward. Even more important is the state of IT competency in the sector. As data become more comprehensive, and as processing becomes more powerful, interpretation and action imperatives remain very human.
1 Some of this spending pays for customer relationship management (CRM) or learning management system (LMS) platforms such as Blackboard Learn and Canvas and should therefore be discounted.
2 Institutional Review Board approval was granted by the corresponding universities.
3 The acronyms used in this section stand for the following: RMSEA = root mean square error of approximation; CMin/df = Chi-square minimum discrepancy per degree of freedom; GFI = goodness of fit index; CFI = comparative fit index; NFI = normed fit index; TLI = Tucker-Lewis index (see Hooper, Coughlan, and Mullen 2008).
REFERENCES