ERIC ED594959: Institutional Data Quality and the Data Integrity Team. Professional File. Article 140, Summer 2017
PROFESSIONAL FILES | SUMMER 2017 VOLUME
Supporting quality data and decisions for higher education.
© Copyright 2017, Association for Institutional Research

Letter from the Editor

Summer brings time to reflect and recharge. The Summer 2017 volume of AIR Professional Files presents four articles with intriguing ideas to consider as you plan for the next academic year.

Data governance is a pressing issue for many IR professionals, as sources of data proliferate and challenge our ability to control data integrity. In her article, Institutional Data Quality and the Data Integrity Team, McGuire synthesizes and interprets results from 172 respondents to an AIR-administered survey of postsecondary institutions on their data integrity efforts. She describes the current state of data governance and offers strategies to encourage institutional leaders to invest in data quality.

Those of us who work in assessment often take it for granted that assessment results will be used for learning improvement. Fulcher, Smith, Sanchez, and Sanders challenge this assumption by analyzing information from program assessment reports at their own institution. Needle in a Haystack: Finding Learning Improvement in Assessment Reports uncovers many possible reasons for the gap between obtaining evidence of student learning and using that evidence for improvement. The authors suggest ways to promote learning improvement initiatives, and share a handy rubric for evaluating assessment progress.

Institutional researchers are beset with requests to form peer groups, and it seems that no one is ever satisfied with the results. Two articles in this volume present very different methodologies for forming sets of comparison institutions. In her article, A Case Study to Examine Three Peer Grouping Methodologies, D'Allegro compares peer sets generated by different selection indices. She offers guidance for applying each index and encourages cautious interpretation of results. Rather than rummaging around for the perfect peer set, Chatman proposes creating a clone, or doppelganger university, one that is constructed from disaggregated components drawn from diverse data sources. In Constructing a Peer Institution: A New Peer Methodology, he walks us through the process of creating peers for faculty salaries, instructional costs, and faculty productivity. While the constructed peer approach has its challenges, the appeal of achieving a perfect fit peer is undeniable.

I hope your summer "reflection" inspires you to share your work with your IR colleagues through AIR Professional Files.

Sincerely,
Sharron L. Ronco

IN THIS ISSUE...

Article 140: Institutional Data Quality and the Data Integrity Team
Author: Katherine A. McGuire

Article 141: Needle in a Haystack: Finding Learning Improvement in Assessment Reports
Authors: Keston H. Fulcher, Kristen L. Smith, Elizabeth R. H. Sanchez, and Courtney B. Sanders

Article 142: A Case Study to Examine Three Peer Grouping Methodologies
Author: Mary Lou D'Allegro

Article 143: Constructing a Peer Institution: A New Peer Methodology
Author: Steve Chatman

EDITORS

Sharron Ronco, Coordinating Editor, Marquette University
Leah Ewing Ross, Managing Editor, Association for Institutional Research
Lisa Gwaltney, Editorial Assistant, Association for Institutional Research

ISSN 2155-7535

PROFESSIONAL FILE ARTICLE 140
© Copyright 2017, Association for Institutional Research

INSTITUTIONAL DATA QUALITY AND THE DATA INTEGRITY TEAM
Katherine A. McGuire
About the Author
Katherine A. McGuire is director of institutional research at Oxford College of Emory University.

Abstract
Data quality has become a pressing issue for many campuses in recent years, as colleges struggle to extract timely, accurate, and consistent information from ever-proliferating institutional data sources in order to meet strategic decision-making and accountability demands. In this mixed methods study, a survey and semi-structured interviews were used to examine data integrity teams, which are groups that try to improve the accuracy and usefulness of data in computing systems at institutions of higher education. A survey sent to a random sample of institutional researchers revealed that many campuses did not have data integrity teams. Where campuses had them, those teams frequently did not engage in activities like data auditing, creating or maintaining data standards documentation, or training staff on data standards issues. Interview participants from campuses with an established team reported that the greatest benefits were greater communication, collaboration, and awareness of data quality issues. Both survey respondents and interviewees reported that more data governance resources, including dedicated staff time, were needed to improve data quality. The implications of these findings for strategic data quality and best practices for institutions are discussed.

Keywords: Data quality, data governance

BACKGROUND

Computerized database systems have created a revolution in the capacity of organizations to store and rapidly retrieve information about their processes and people. The routine operations of colleges and universities have been profoundly affected by these broad-based changes in information management. All administrative and academic departments on a campus require access to information contained in institutional databases for their daily activities, whether it be directory information, student enrollment and academic records information, financial aid data, accounting and billing data, faculty and staff personnel data, donor records, grants management data, or facilities and scheduling information. In recent years, demand for information for accountability, institutional decision-making, and planning has placed increased scrutiny on data quality and data processes at postsecondary institutions.

Since early in the development of the field of institutional research, practitioners have expressed concern about the accuracy of data contained in student information systems. In a 1989 Association for Institutional Research (AIR) Professional File paper entitled "Data Integrity: Why Aren't the Data Accurate?," Gose described a number of major types of data errors, and noted that the human element was essential in maintaining data systems free from various types of "data corruption." By "human element," he presumably meant that improving communication between departments and individuals about data problems and data standards is crucial to improving data quality.

McGilvray points out that a persistent problem with data quality is that data management is one area where the trend toward greater integration and collaboration in organizations has lagged behind: "Our applications and business needs for information are integrated, but our behavior has not changed to work effectively in this world. For example, your company may need information to support end-to-end processes and enterprise decision-making, but the information is being created by an individual contributor from the business who has no visibility to other needs for the same information" (2006, p. 2).
Thus, data entry responsibilities frequently fall to the lowest-ranking and newest member of a department, someone who does not understand the needs of end users and in whom just enough training is invested to get the job done at a basic transactional level. Such employees tend to be rewarded for speed rather than accuracy, and often the department where data entry occurs is not directly impacted by data errors.

Colleges have adopted various strategies for improving campus data, all of which could be described by the umbrella term "data governance." Such strategies might include codifying data standards, creating standard operating procedures for data processes, developing master data sets for reporting, and assigning to specific personnel oversight of data in campus functional areas. All these strategies require that critical stakeholders regularly communicate and collaborate to identify problems, set standards and policy, oversee and review data and data processes, and help manage change that impacts data integrity. Some college campuses have instituted data integrity teams to serve this function. Data integrity teams are groups of stakeholders from diverse functional areas on campus that meet regularly to try to collaboratively address data problems as they arise, as well as to proactively implement improved data management policies and procedures.

Young and McConkey (2012) and McLaughlin, Howard, Cunningham, and Payne (2004) have described many of the activities that are appropriate for data integrity teams in higher education. Teams should first identify data stakeholders and their needs. They should institute consistent data definitions across the institution, such as by creating a data dictionary, and they should establish data use rules. They should draft data policies, communicate the importance of those policies, and monitor and report both the status of data quality efforts and compliance with standards. They should assign data stewards or custodians so that there is no ambiguity about who is responsible for data in a given area, and they should update such assignments when necessary. They should seek to understand external accountability and internal research and planning data needs, and should incorporate these needs into data standards decisions. Teams should be aware of data quality issues surrounding documentation, process gaps, and missing data. They should address issues of access, security, and integration of multiple data systems. Finally, data integrity teams should track how data decisions are made, as well as how conflicts between departments or members are resolved.
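Activities like these, particularly maintaining a data dictionary, naming data stewards, and monitoring data quality, lend themselves to partial automation. The sketch below is purely illustrative and is not drawn from the study or from any particular campus system; the field name, codes, steward, and sample records are hypothetical. It shows how a machine-readable data dictionary entry might be paired with a routine check that flags missing values and nonstandard codes for the responsible steward.

"""Illustrative sketch only (not from the study): a minimal, machine-readable
data dictionary entry paired with a routine data-quality check. All field
names, codes, stewards, and records below are hypothetical."""

from collections import Counter

# One data dictionary entry: a single agreed-upon definition for a field,
# its valid codes, and the steward responsible for it.
DATA_DICTIONARY = {
    "enrollment_status": {
        "definition": "Student enrollment status as of the census date.",
        "valid_codes": {"FT", "PT", "WD"},  # full-time, part-time, withdrawn
        "steward": "Office of the Registrar",
        "source_system": "student information system",
    },
}


def audit_field(records, field):
    """Flag missing values and codes outside the dictionary's valid list:
    the kind of gap or inconsistency a data integrity team would route back
    to the field's steward."""
    spec = DATA_DICTIONARY[field]
    missing = [r["id"] for r in records if not r.get(field)]
    invalid = Counter(
        r[field] for r in records
        if r.get(field) and r[field] not in spec["valid_codes"]
    )
    return {
        "field": field,
        "steward": spec["steward"],
        "missing_ids": missing,
        "invalid_code_counts": dict(invalid),
    }


if __name__ == "__main__":
    # Hypothetical extract from a student records table.
    sample_records = [
        {"id": "S001", "enrollment_status": "FT"},
        {"id": "S002", "enrollment_status": "F/T"},  # nonstandard code
        {"id": "S003", "enrollment_status": None},   # missing value
    ]
    print(audit_field(sample_records, "enrollment_status"))

A team adopting something along these lines would typically run the check against periodic extracts and route the results to the steward named in the dictionary entry.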
The present study examined the staffing, scope of activities, institutional environments, and effectiveness of data integrity teams on college campuses by means of a concurrent mixed methods research design, including an online survey and semi-structured interviews of postsecondary data users. Some of the research questions the study addressed were these:

1. What percentage of postsecondary institutions have formal data integrity teams? Can any institutional characteristics or organizational conditions be identified that seem to promote the development of data integrity teams?
2. Who typically serves on data integrity teams? Which institutional departments play leadership roles in data governance?
3. How well are data integrity teams supported by executive leadership, and what authority do teams have to make and enforce data policy?
4. What are the typical tasks undertaken by a data integrity team? How effective do team members believe their teams are at solving various types of data quality problems?
5. What do team members perceive as the barriers to institutional data quality? How do they think these might be overcome? Are there any types of data problems that are insurmountable or unavoidable?

In the first phase of the study, randomly selected members of the higher education professional association, AIR, were invited to participate in a 20-minute online survey that asked questions about the demographic characteristics of their institution and whether it had a data integrity team. If the institution had such a team, questions followed as to who served on the team, core team activities, and team accomplishments and challenges. A second qualitative phase of the study interviewed individual data integrity team members at postsecondary institutions about their teams' activities and challenges. This study differs from previous data integrity research done by higher education information technology (IT) groups like EDUCAUSE (see Yanosky 2009) by focusing on the perceptions of professional institutional researchers rather than on IT leadership or staff, as well as in having a qualitative component.

METHODOLOGY

The quantitative phase of the study consisted of an online survey created and maintained in the online web survey tool SurveyMonkey (www.surveymonkey.com) and administered by AIR. The survey contained item tracking so that AIR members whose institutions did not have data integrity teams, or who were not members of their schools' data integrity teams, answered a different set of questions than respondents who were on campuses with data integrity teams and/or served on those teams. A sample of 519 randomly selected members of AIR were sent an e-mail from AIR explaining the purpose of the survey and inviting them to participate by clicking on a hyperlink in the e-mail message. Descriptive data analysis was performed using the Statistical Package for the Social Sciences (SPSS).

The qualitative phase of the study consisted of semi-structured individual interviews. Each interview subject was a data integrity team member from a different postsecondary institution. Participants were recruited through the e-mail lists of two institutional research groups: the Georgia Association for Institutional Research, Planning, Assessment, and Quality (GAIRPAQ) and the Higher Education Data Sharing Consortium (HEDS). Additional potential subjects were located by a Google search of terms such as "university data integrity team," "college data governance," etc., and e-mail contact was made with relevant staff at institutions for which data integrity team information was found online. Subjects were interviewed by phone using the online tool Skype, and interviews were recorded to MP3 files using the Skype recording tool Evaer. All interviews were transcribed manually from the MP3 files, and the resultant data were coded and analyzed in QDA Miner Lite. Both thematic and content analyses were performed where appropriate.
RESULTS

Survey Results
A total of 205 AIR member respondents submitted the survey, for a 39% response rate. Of these, 197 responded to at least one item on data integrity and were included in the final analysis of survey results.

The majority (87%) of respondents were employed at postsecondary institutions. Of the 172 respondents employed on postsecondary campuses, by far the largest group was at institutions with both undergraduate and postgraduate programs (66%). Smaller percentages of respondents were from institutions with two-year (22%), four-year only (9%), and graduate-only (3%) programs. There were slightly more respondents from public (55%) than from private institutions; only four respondents (2%) were from private proprietary schools. The diversity of institutional student enrollment sizes represented in the sample can be seen in Table 1. Exactly half of the respondents were at multicampus systems, illustrating the potential complexity of data management at the institutions in the study.

Table 1. FTE Enrollment of Respondents' Institutions
Fewer than 1,000: 18 (11%)
1,000–2,999: 36 (21%)
3,000–9,999: 55 (32%)
10,000–19,999: 36 (21%)
20,000 or more: 27 (16%)
Total: 172
Note: FTE = full-time equivalent.
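For orientation, the response rate and the Table 1 distribution above can be recomputed directly from the counts reported in the article. The short sketch below is illustrative only; the study's descriptive analysis was done in SPSS, not Python, and a published percentage can differ from a recomputed share by a point because of rounding.

# Illustrative recomputation of figures reported above (not the authors' code).
invited = 519      # AIR members e-mailed (Methodology)
submitted = 205    # surveys submitted (Results)
print(f"Response rate: {submitted / invited:.0%}")  # prints "Response rate: 39%"

fte_counts = {                    # frequencies from Table 1
    "Fewer than 1,000": 18,
    "1,000–2,999": 36,
    "3,000–9,999": 55,
    "10,000–19,999": 36,
    "20,000 or more": 27,
}
total = sum(fte_counts.values())  # 172 respondents at postsecondary campuses
for band, count in fte_counts.items():
    # Table 1 reports 11% for the smallest band; the recomputed share is 10%.
    print(f"{band}: {count} ({count / total:.0%})")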
Fewer than half (44%) of the 172 respondents from postsecondary institutions reported that their school had a data integrity team, and only 38 respondents (22%) reported leading or serving on a data integrity team. Table 2 shows the institutional characteristics of institutions that had data integrity teams.

Table 2. Characteristics of Institutions with Data Integrity Teams
(Number and percent of institutional respondents reporting a data integrity team; "I don't know" and "No response" omitted from numerator and denominator.)
Institutional type
  Two year: 18 (47%)
  Four year only: 5 (46%)
  Four year plus graduate and/or professional: 48 (52%)
  Graduate and/or professional only: 4 (80%)
Institutional control
  Private for-profit: 1 (25%)
  Private not-for-profit: 36 (58%)
  Public: 38 (49%)
Institutional FTE
  Fewer than 1,000: 5 (36%)
  1,000–2,999: 19 (59%)
  3,000–9,999: 27 (53%)
  10,000–19,999: 15 (54%)
  20,000 or more: 9 (50%)
Note: FTE = full-time equivalent.

Executive Advocacy of Data Quality Efforts
Respondents indicated they believed that campus executive leaders were overall supportive of efforts to improve data quality (see Table 3). With the exception of the chief business officer, whose rating decreased slightly when disaggregated, this confidence in leaders' support of data integrity was even more pronounced for respondents who were members of their institutions' data integrity teams.

Table 3. Support of Campus Leaders for Data Integrity Efforts
("The following campus leaders support efforts to address data integrity at my institution"; percent "Strongly agree" or "Agree." All institutional respondents, n=169; data integrity team members only, n=32.)
President/Chief executive officer: 56% (all), 76% (team members)
Provost/Chief academic officer: 69% (all), 90% (team members)
Chief business officer/Chief financial officer: 68% (all), 61% (team members)
Chief student affairs officer: 56% (all), 68% (team members)
Chief information officer: 71% (all), 84% (team members)

Respondents' Ratings of Institutional Data Quality
Sixty-six percent of all institutional respondents said that they "Agreed" or "Strongly agreed" with the statement, "The overall quality of data in my institution's administrative computing system is high." There was virtually no difference in the percentage of respondents who rated institutional data quality highly between those who were on data integrity teams and those who were not. Respondents who reported that their campus did not have a data integrity team were asked why they thought it did not (see Table 4).

Table 4. Reasons Respondents' Institutions Do Not Have Data Integrity Teams
("To the best of your knowledge, what are the reasons that your institution does not have a data integrity team?" Check all that apply; n=70.)
Data quality is not a problem at my institution: 14%
Data quality issues are too contentious/political: 20%
Decision-makers are not aware of data quality issues: 27%
Decision-makers are not interested in data quality issues: 20%
Decision-makers do not have time to devote to data quality issues: 40%
Decision-makers do not have resources to devote to data quality issues: 43%

Data Integrity Team Characteristics
Over 80% of the respondents who served on their institution's data integrity team had been on the team for more than three years, and only about 15% had served for less than a year. The most common regular meeting schedules were monthly (24%) or quarterly (18%); a combined 32% said they met either irregularly or on an as-needed basis rather than keeping a regular schedule.

About 30% of the respondents said their data integrity team reported to the institutional research, institutional effectiveness, or assessment functional area. Another 16% reported to IT, 13% reported to academic affairs, and about 10% reported to the president or chief executive officer. A few other teams reported to executive cabinets or other entities. Several respondents said that their team either did not report to anyone or that they were not sure who their team reported to. Respondents indicated that the team reported to the individual or entity that oversaw it by face-to-face meetings or presentations (42%), memos or reports (13%), or both methods (40%). Most teams reported that they had only a limited range of data policy-making authority and that they referred data policy violators to another entity or person (see Table 6).

Composition and Leadership of Data Integrity Teams
Over 80% of survey respondents who were on data integrity teams worked in institutional research or assessment offices, as might be expected given the population sampled. As shown in Table 5, by far the most common functional area of team leaders was institutional research and related departments, followed by IT. Various other leader functional areas were mentioned in the open-ended comments for this survey item, including associate vice president and bursar, as well as cochairing arrangements.

Additional team members mentioned in the open-ended comments sections were online or e-learning coordinators, athletics, career services, the veterans' affairs office, and student life.
Table 5. Team Leader's Department and Representation on Team
(Team leader's department, n=32; represented on team, n=41.)
Institutional research/Institutional effectiveness/Assessment: leader 47%, represented 100%
IT/Computing: leader 24%, represented 71%
Other (please specify): leader 16%, represented 13%
Academic affairs/Faculty: leader 3%, represented 58%
Admissions/Enrollment management: leader 3%, represented 71%
Development/Advancement: leader 3%, represented 34%
Registrar: leader 3%, represented 79%
Business/Accounting: leader 0%, represented 66%
Financial aid: leader 0%, represented 74%
Human resources: leader 0%, represented 45%

Table 6. Team Authority to Make and Enforce Data-related Policy
Which best describes the team's authority to make data-related policy on your campus? (n=33)
  We have a broad range of policy-making authority: 23%
  We have a limited range of policy-making authority: 45%
  We can make recommendations only: 29%
Which best describes the team's authority to enforce data-related policy on your campus? (n=32)
  We have policy enforcement authority (e.g., can limit data systems access): 13%
  We refer individuals who violate data policies to other entities (e.g., their supervisors): 53%
  We have no authority to enforce policy: 27%
Note: "Other" responses not included.

Team Activities and Effectiveness
Data integrity team members reported their team doing a variety of common data quality–related activities, as summarized in Table 7. The activities that were most often cited as a focus of the team were identifying data gaps and inconsistencies, identifying data stewards, and considering institutional strategic reporting needs. The two items that respondents cited least often as being a focus of the team concerned data auditing and policy assessment.

Team members also reported on institutional and departmental environments and outcomes for data quality, as shown in Table 8. Although respondents indicated that advocacy and awareness of data quality issues existed on their campuses, only slightly over half agreed that having a data integrity team had improved institutional data quality. Many of the typical activities associated with data integrity teams, such as creating data documentation, training staff, documenting data steward responsibilities, and monitoring data quality, were occurring at a third or fewer of the institutions. Only a quarter of the respondents agreed that data users knew the procedure for reporting data problems.

Table 7. Frequency of Data Integrity Team Activities and Team Effectiveness
("How often does the data integrity team focus on the following issues, and how effective is the team in each area?" Frequency: percent "Sometimes" or "Often," n=32. Effectiveness: percent "Effective" or "Highly effective," n=31.)
Identify data gaps and inconsistencies: frequency 97%, effectiveness 66%
Identify data stewards (people responsible for maintaining data quality and reporting data issues): frequency 97%, effectiveness 68%
Consider internal strategic data reporting needs: frequency 93%, effectiveness 54%
Create new data policies: frequency 90%, effectiveness 55%
Review current data policies: frequency 87%, effectiveness 71%
Align data policies between departments: frequency 87%, effectiveness 54%
Seek input from data stakeholders: frequency 86%, effectiveness 57%
Address compliance or regulatory issues: frequency 86%, effectiveness 61%
Establish needs, roles, and responsibilities of data stewards: frequency 86%, effectiveness 58%
Determine who has or needs access to data: frequency 79%, effectiveness 61%
Assess effectiveness of data policies: frequency 79%, effectiveness 48%
Monitor data quality: frequency 79%, effectiveness 57%
Table 8. Institutional Environments and Activities for Data Quality Reported by Data Integrity Team Members
("Indicate your level of agreement with the following statements about your institution"; percent "Agree" or "Strongly agree," n=37.)
My supervisor is aware of the importance of data quality: 90%
Data integrity team members serve as advocates for good data in their departments: 77%
Data quality is a strategic priority: 65%
Data stewards/managers exist in each functional unit that has data access and responsibilities: 58%
Having a data integrity team on my campus has improved data quality: 55%
Data quality is continuously monitored: 48%
Significant resources are devoted to data quality improvement efforts: 42%
The institution has a usable and complete data dictionary: 33%
All data users have easy access to data field documentation: 32%
Staff who work with data receive training about data standards: 32%
Data steward/manager responsibilities are clearly documented: 30%
There are regularly scheduled comprehensive data quality audits: 26%
Individuals who use data know how to report a problem or issue with data quality: 26%

Views of Non-Team Members on Data Integrity Practices
As noted previously, many of the AIR member respondents either did not serve on their campus data integrity team, were employed on a campus that did not have a data integrity team, or were not employed on a college campus. Respondents who reported that they were not currently on data integrity teams answered opinion questions about data quality issues on college campuses. Of these respondents, 85% agreed with the statement, "Every college or university should have a data integrity team." The majority of respondents (55%) believed that data integrity teams should report to the office of institutional research or institutional effectiveness; only 11% stated that the team should report to an IT function.

Respondents were also asked what they thought the activities of a data integrity team should be (see Table 9). The activities that respondents not on a data integrity team were likely to think most important differed somewhat from the activities that data integrity team members reported as teams' most frequently addressed issues, with data auditing and policy assessment assuming greater importance to the non-team-member respondents.

About a third of the respondents not currently on data integrity teams had served on one in the past; of these respondents, 65% rated their previous data integrity team to be highly or moderately effective.

Open-Ended Survey Comments
Around two dozen respondents gave additional reasons or commentary about why their institution did not have a data integrity team. About a third of the comments indicated that data quality issues were handled in an informal, ad hoc manner in response to specific problems or projects with whatever departments were impacted by the particular issue. Similarly, several other respondents indicated that data quality issues were handled in a decentralized fashion within departments. Three participants said that they had previously had a data integrity team that had stopped meeting, and several others said that their institution was in the process of forming a data integrity team. Two respondents expressed the belief that data integrity teams were not useful because data quality issues were too complex to be solved by a single team.

Most of the respondents who served on data integrity teams commented on how data integrity could be improved at their own institution. Typical comments cited the need for more buy-in by both senior leadership and staff. More centralization of data quality efforts and user accountability for data quality were also mentioned by several respondents. Training for data users was one of the most frequently mentioned needs, as was creating or updating a data dictionary. The need for additional staff was a concern, and several respondents said that they believed their institution needed dedicated staff to oversee data integrity issues.

About 40% of the respondents not currently serving on a data integrity team answered the open-ended question, "How can data integrity be improved at institutions?" Twenty-five percent of the comments mentioned the need for greater executive buy-in and accountability, and nearly 20% of comments mentioned the need for some kind of accountability for data entry or data reporting staff. As Table 10 shows, team members and non-team members mentioned similar data quality solutions.
There were also several comments from both team members and non-team members about the need to understand the origins of information and filter out bad data before such data got into centralized data systems,
