Research Management Review, Volume 20, Number 2 (2015)

Would You Recommend Your Institution’s Effort-Reporting Process to Others? Determining Best Practices in Effort-Reporting Compliance

Ashley E. Whitaker
Nova Southeastern University

ABSTRACT

Effort-reporting compliance at higher education institutions was examined to discern best practices from those that would recommend their effort-reporting process. Data were derived from a survey of effort administrators—the research administrators responsible for the effort-reporting compliance program at their respective higher education institutions. The research was conducted in the fall of 2012, before the implementation of the OMB (2013) Uniform Guidance. Data were separated into two focus groups for greater applicability: Doctoral/Research Universities (DRUs) and Predominantly Undergraduate Institutions (PUIs). These effort administrators were generally confident about their institution’s compliance with current effort-reporting regulations and believed that, even aside from the regulations, they properly documented compensation costs charged to sponsoring agencies. These data provide information on best practices in effort-reporting compliance for these two types of higher education institutions and expand the body of knowledge in the field of research administration. Data derived from this study can also be used as a baseline from which to compare future studies on effort-reporting compliance after the implementation of the OMB (2013) Uniform Guidance.

INTRODUCTION

Effort reporting is one of the most challenging compliance areas faced by research administrators. Effort reporting, or documenting compensation for personnel services, is a federal requirement mandating institutions to verify that personnel costs on sponsored projects are reasonable when taking into account the actual work performed on the project (Anthony & Gindhart, 2009; Council on Governmental Relations, 2007). The process of reporting effort also verifies that time commitments made to a sponsoring agency are met. Personnel charges typically represent a large portion of sponsored project costs. Auditors have always focused on effort reporting, but there has been a recent increase in auditor oversight due to federal audit findings and multimillion dollar settlements, institutional disclosures, and whistleblower lawsuits brought under the False Claims Act (1863); these have created concerns that the policies and procedures in place at universities are inadequate or out of compliance (Blevens, 2013; Council on Governmental Relations, 2007; Fife, 2006). An examination of the Summary of University Audits, Settlements and Investigations Related to Federal Programs (Blevens, 2013) shows that effort-reporting findings represent the largest proportion, constituting over 25% of all compliance areas. Effort-reporting findings in annual A-133 audits, high-profile large settlements, and false-claims whistleblower lawsuits have motivated continued auditor oversight of universities (Fife, 2006; Stanley & McCartney, 2009). These circumstances demonstrate the need for universities to have a sound effort-reporting compliance program.

At the time of this study, the federal requirements governing compensation for personnel services were found in the U.S. Office of Management and Budget (OMB) Circular A-21, section J.10, Cost Principles for Educational Institutions (OMB, 2004). The federal regulations for compensation costs have since changed to the OMB (2013) Uniform Administrative Requirements, Cost Principles, and Audit Requirements for Federal Awards ("Uniform Guidance"), section 200.430. Based on either set of requirements, a sound effort-reporting compliance program should have a policy, procedure, and system that address the federal requirements.
Carried forward from Circular A-21 (2004) to the OMB (2013) Uniform Guidance is a required “after-the-fact” review process of personnel costs, rather than reliance on budget estimates to document costs. Further, both documents state that costs must be reasonable, accurate, and based on all activities represented in an employee’s institutional-based salary. With the new focus on stringent internal controls in the OMB (2013) Uniform Guidance, institutions should continue vigilance to ensure they have a strong effort-reporting compliance program.

External monitoring programs cannot be solely relied upon to ensure compliance (Fedor, Yaussy, & Cola, 2008). Compliance programs should be implemented into daily operations, and the policies and procedures put in place to foster compliance should be followed (Saputelli & Smith, 2010). An effective compliance program serves to protect an institution from liability, mitigate risk, and foster the proper stewardship of external funds and institutional resources (Erickson & Tangredi-Hannon, 2006). The assessment of compliance programs at an institution must be a constant priority and continually monitored (Erickson & Tangredi-Hannon, 2006). Effective programs should ultimately lessen administrative burdens while allowing for the early identification and prevention of compliance issues (Fedor et al., 2008). The increase in compliance costs is a large component of research and development expenses for higher education institutions; streamlining regulations is a way to decrease these costs for all higher education institutions (National Science Board, 2012). Rockwell (2009) noted that universities “go beyond the regulations” (p. 36) because they experience an audit, fear they will be audited, or encounter differing interpretations by auditors, thus further exacerbating administrative burdens. Developing and utilizing best practices is one way that institutions can work together to lessen administrative burdens (National Science Board, 2014).

Institutions can point to multiple resources for developing sound compliance programs. For example, the Draft OIG Compliance Program (2005) for recipients of Public Health Service awards offered the following guidance for a good compliance program as a means to promote strong internal controls:

1. Implementing written policies and procedures,
2. Designating a compliance officer and compliance committee,
3. Conducting effective training and education,
4. Developing effective lines of communication,
5. Conducting internal monitoring and auditing,
6. Enforcing standards through well-publicized disciplinary guidelines,
7. Responding promptly to detected problems and undertaking corrective action, and
8. Defining roles and responsibilities and assigning oversight responsibility (p. 71313).
The two internal control documents cited in the OMB (2013) Uniform Guidance, Internal Control Integrated Framework, issued by the Committee of Sponsoring Organizations of the Treadway Commission (COSO), and Standards for Internal Control in the Federal Government, issued by the Comptroller General of the United States, are provided for best-practice guidance and can also be used to design a sound effort-reporting compliance program (Committee of Sponsoring Organizations, 2013; Comptroller General of the United States, 2014; Office of Management and Budget, 2014). Internal control is defined as “a process effected by an entity’s oversight body, management, and other personnel that provides reasonable assurance that the objectives of an entity will be achieved” (Comptroller General of the United States, 2014, p. 5). Strong internal controls ultimately allow institutions to quickly respond to change, such as the changes in regulation that the research community is now experiencing (Committee of Sponsoring Organizations, 2013).

Although the above examples are straightforward, achieving compliance is not. Research administrators still rely on their colleagues through listservs and conferences for assistance with the ever-changing compliance environment and with specific areas of interest, such as effort reporting (Saputelli & Smith, 2010). The regulatory environment has evolved to have “strict” and “reactive” requirements that come with little guidance or time to implement (Saputelli & Smith, 2010, p. 23). Further, since the guidance is not always clear, institutions are responsible for clarifying some information in their policies (Saputelli & Smith, 2010). Due to this ambiguity, best practices are essential to developing or evaluating a compliance program. Best practices have changed over the past ten years due to new competitors, pressures from the government on “cost containment,” increased regulatory oversight by sponsors, and the technology age (Kirby & Waugaman, 2005, p. 5).

A goal of this study was to assist in identifying best practices by examining some of the common issues that institutions must address when designing an effort-reporting compliance program. Further, since institutions are unique in their size and culture, there is no one best compliance program (Draft OIG Compliance, 2005). As such, this paper summarizes the effort-reporting characteristics of a sampling of Doctoral/Research Universities (DRUs) and Predominantly Undergraduate Institutions (PUIs) that would recommend their effort-reporting process, as a means of identifying best practices in effort-reporting compliance for these types of institutions. A DRU is defined as a higher education institution that awards at least 20 research doctoral degrees (adapted from Carnegie Foundation for the Advancement of Teaching, n.d.).
A PUI is defined by the National Council of University Research Administrators (NCURA, 2013) as follows:

    The PUI Neighborhood members provide research administration information to our colleagues at “predominantly undergraduate institutions”—two-year, four-year, masters-level, and small doctoral colleges and universities that grant baccalaureate degrees, or provide programs of instruction for students pursuing such degrees with institutional transfers (e.g., two-year schools), where undergraduate enrollment exceeds graduate enrollment, and no more than 10 Ph.D. or D.Sc. degrees are awarded per year (adapted from the National Science Foundation’s description of PUIs) (para. 12).

METHODS

Study participants were research administrators responsible for the effort-reporting compliance program at their respective higher education institutions (i.e., effort administrators). Participants were recruited from both the REASADM-L (Research Administration Discussion List) listserv and three of NCURA’s Collaborate membership communities (Predominantly Undergraduate Institutions, Compliance, and Financial Research) in order to access the largest number of eligible respondents from the population. To further expand the number of participants, listserv and community members were encouraged to send the survey communication on to the appropriate person at their institution. The sample was drawn from this portion of the population of effort administrators and was composed of those individuals who completed the web-based survey. Nonprobability sampling was utilized, since the groups described above were used to collect the sample and it was not known if all universities subject to effort-reporting requirements were represented in these groups. A random sampling method was not used, because the respondents were only sought out via the groups and not individually selected to participate in the study. The survey was anonymous; no identifying information was collected on the participants or their institutions. The number of participants was not limited in this study.

Demographic information was collected on both the institutions and the individual respondents. The institutional information collected included institutional classification (Doctoral/Research University, Master’s College or University, Predominantly Undergraduate Institution, Associate’s or Technical College, or Other), public versus private status, total amount of annual sponsored funding expenditures, and the office that oversaw effort reporting. Respondent information collected was the respondent’s position title and years of experience working in effort administration. Institutions were grouped by their institutional classification in order to compare types of institutions.

The instrument was a web-based questionnaire built with the Survey Gizmo software program; it consisted of predominantly closed-ended questions, with a small number of semi-closed-ended questions and one open-ended question. The instrument was separated into four sections consisting of demographic data, current data on the institution’s effort-reporting compliance program, data on past audit influences, and perceptions of future changes to the effort-reporting regulations. The survey was preceded by a participant letter that included a participant rights statement and a statement of consent. The survey adhered to Nova Southeastern University Institutional Review Board consent compliance requirements, and all applicable information was included in the e-mail invitation and survey introduction.

A cross-sectional survey design was used in this study. On September 25, 2012, an e-mail invitation to participate in the research study was sent to the REASADM-L listserv and posted in the three NCURA Collaborate communities. To prevent multiple individuals from the same institution from responding, the invitation specified that only the person responsible for effort-reporting compliance for the institution was eligible to take the survey. Users were limited from responding to the survey more than once by using software features. The invitation directed eligible participants to a link to the survey instrument. To reduce nonresponse error and ensure a high response rate, a follow-up invitation was sent one week after the initial invitation, on October 2, 2012.
A final request was sent two weeks after the initial invitation, on October 9, 2012.

Each of the survey question response choices was coded prior to data collection. Once collected, the data were exported to IBM Corporation’s SPSS (Versions 20 and 21) software for analysis. A response rate was not calculated because the number of eligible potential respondents was not known. To determine best-practice characteristics of effort-reporting compliance programs by type of institution, the data from respondents who indicated that they would recommend their effort-reporting process were separated from the master data set and then further divided into two groups, Doctoral/Research Universities (DRUs) and Predominantly Undergraduate Institutions (PUIs). All variables were covered individually by the survey questions, and percentages were calculated based on the strength of the responses to the variables. Descriptive statistics were also used to analyze the variables; frequencies of the responses were calculated.

A total of 114 responses were received. Of these, eight responses (six complete and two partial) were ineligible because the respondents were not self-classified as institutions of higher education; these responses were omitted from data analysis. Of the 106 remaining responses, 38 were partial responses for which not enough data were collected; these were discarded from the final analysis. The analysis therefore included 67 or 68 responses, depending on the variable. Two classifications of higher education institutions (DRUs and PUIs) represented the majority of institutions in this study (91.1%, or 62 institutions).
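For readers who want to reproduce this style of analysis outside of SPSS, the cleaning and grouping steps described above amount to a few lines of code. The sketch below is a minimal illustration in Python with pandas; the column names (higher_ed, complete, classification, would_recommend) and the data values are hypothetical stand-ins for the study’s coded variables, not its actual data set.

    import pandas as pd

    # Hypothetical coded responses; names and values are illustrative only.
    responses = pd.DataFrame({
        "higher_ed":       [1, 1, 1, 1, 0, 1],   # self-classified as higher education
        "complete":        [1, 1, 1, 0, 1, 1],   # complete (1) vs. partial (0) response
        "classification":  ["DRU", "PUI", "DRU", "PUI", "Other", "PUI"],
        "would_recommend": [1, 0, 1, 1, 1, 1],   # stand-in for the pREC variable
    })

    # Mirror the cleaning steps described above: drop responses that are not
    # from higher education institutions, then drop partial responses.
    eligible = responses[(responses["higher_ed"] == 1) & (responses["complete"] == 1)]

    # Separate respondents who would recommend their effort-reporting process,
    # then divide them into the two focus groups.
    recommenders = eligible[eligible["would_recommend"] == 1]
    dru = recommenders[recommenders["classification"] == "DRU"]
    pui = recommenders[recommenders["classification"] == "PUI"]

    # Descriptive statistics: frequencies and percentages of responses.
    counts = recommenders["classification"].value_counts()
    print(counts)
    print((100 * counts / counts.sum()).round(1))

The same filtering and frequency logic extends to each of the coded c- and d-variables reported in the next section.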
Of the 68 total institutions represented in the analysis, only 30, or 44.1% of respondents, would recommend their effort-reporting process to others (variable: pREC). Of these 30 institutions, 28 were DRUs or PUIs, with an equal number (14) representing each group. A higher percentage of DRU respondents (14 of 19 institutions, or 73.6%) than PUI respondents (14 of 43 institutions, or 32.5%) would recommend their effort-reporting process. The results presented here on DRUs and PUIs are specific to only those respondents who would recommend their institution’s effort-reporting process.

RESULTS

Results are designed to shed light on best practices among DRUs and PUIs that can be adopted by other institutions looking to update their effort-reporting compliance program. The analysis was conducted separately for DRUs and PUIs to identify best practices for each type of institution. Throughout this section, variable labels are listed in parentheses.

The first section of the survey instrument collected demographic data on the types of respondents (position title), their institutions (public versus private status and research expenditures), and the respondent institution’s effort-reporting compliance program (office that oversaw effort reporting, effort-reporting system, OMB Circular A-21 method, frequency of certification, number of effort certifications per reporting period, and source of funding reported on). Most DRU respondents were public institutions (78.6%; d3). As expected, they reported higher research expenditures and more effort certifications for their institutions than PUIs (d5, d11). They expended more than $10 million in research expenditures for the last fiscal year, with most over $50 million (d5). This correlates with a higher number of effort certifications; they all reported above 500 certifications per the institutionally specified certification period, and the majority reported over 1,000 (78.6%; d11). Further, effort-reporting compliance programs were most often administered at DRUs by a central effort administrator (42.9%), with the next highest response being a central post-award research administrator (21.4%; d1). They were divided on which office oversaw effort reporting, with sponsored accounting offices a slight majority (57.1%; d6) over sponsored programs offices (42.9%; d6). Most DRU respondents indicated that their institution used a software system to report effort; an equal number of institutions chose off-the-shelf software and institutionally developed software (85.8%; d8). The majority (64.3%; d9) of DRU respondents indicated an after-the-fact method for reporting effort. DRU respondents were divided on the frequency of certification at their institutions; the most common response was semiannually (42.9%; d10). Finally, almost all DRU respondents indicated that their institution reported effort for all sponsored funding, versus only federal funding or federal and state funding (92.9%; d12).
Most PUI respondents were public institutions (64.3%; d3). As expected, PUI respondents reported lower research expenditures and fewer effort certifications for their institutions than DRU respondents (d5, d11). Most PUI respondents indicated that their institutions expended less than $50 million in funds for research in the last fiscal year, with half of these respondents reporting under $5 million (d5). This correlates with a lower number of effort certifications, with 85.7% having fewer than 1,000 certifications per certification period and 71.4% having fewer than 500 (d11). A clear trend was not observed on who administered effort-reporting compliance programs at these PUIs, although the most common positions included a central effort administrator, a non-effort-specific post-award research administrator, and a generalist research administrator with varying functions (d1). PUI respondents were also divided about which office oversaw effort reporting at their institutions—sponsored programs offices were the majority, at 64.3%, followed by sponsored accounting offices, at 28.6% (d6). In contrast to the DRU respondents, only half of PUI respondents used a software system at their institution, and the others utilized paper (both 42.9%; d8). The majority (85.7%) of PUI respondents used an after-the-fact method for reporting effort at their institutions (d9), although they were mixed on the frequency of certification (d10). Finally, PUI respondents differed from DRU respondents in regard to types of sponsored funding reported—64.3% reported effort for all sponsored funding, 14.3% reported on federal and state funding only, and 21.4% reported on federal funding only (d12).

The second section of the survey collected current data on institutions’ effort-reporting compliance programs. Factors included: having an effort policy (c1), defining who can attest to effort or has “suitable means of verification” (c2), allowance of certification by administrators (c3), training (c4), consequences in place for those who do not certify (c5), a process in place to track late or overdue statements (c6), commitment management (c7), a maximum effort policy (c8), a minimum effort policy for principal investigators (c9), defining significant change per OMB Circular A-21 (c10), whether sponsors are charged correctly (c11), conducting independent internal evaluations (c12), timeliness of certification (c13), and allowance of recertification of effort (c14). In addition, the following self-analysis questions were examined: overall OMB Circular A-21 compliance (c15), having no federal audit findings (c16), having accurate certification (c17), and having an effective compliance program (c18).

DRU respondents who would recommend their effort-reporting process demonstrated best practices on which other institutions could model their effort-reporting compliance programs (Table 1). All of these respondents reported having an effort-reporting policy and conducting independent internal evaluations (c1, c12). The majority defined in their policy who could attest to effort or who had suitable means of verification (78.6%; c2), and what constituted a significant change per OMB Circular A-21 requirements (71.4%; c10). DRU policies did not let administrators certify another individual’s effort for which they did not have suitable means of verification (71.4%), or they allowed it only with supporting documentation (21.4%; c3). A formal training program on effort reporting was also common at DRUs (85.7%), although the trend was towards a non-mandatory program (64.3% versus 21.4%; c4). A slight majority reported that their institution had consequences in place for those who failed to certify effort (57.1%; c5). Almost all respondents said they tracked down late or overdue statements to achieve compliance (92.9%; c6). DRU respondents also managed commitments of effort (85.7%; c7), with the majority having a policy on minimum effort (specific to principal investigators; 64.3%; c9) and maximum effort (78.6%; c8) charged to sponsored projects. The majority of effort certifications at their institutions were completed on time (71.4%; c13). However, responses were mixed regarding the allowance of recertification, indicating no clear trend (c14).
Table 1
Practices of Doctoral Research University Respondents Who Would Recommend Their Effort-Reporting Process

Variable   Description                                      Percentage who demonstrated the practice
c1         Effort policy                                    100.0
c2         Define suitable means of verification             78.6
c3         Do not allow certification by administrators      71.4
c4         Mandatory or non-mandatory training               85.7
c5         Consequences for not certifying                   57.1
c6         Track late or overdue statements                  92.9
c7         Commitment management                             85.7
c8         Maximum effort policy                             78.6
c9         Minimum principal investigator effort policy      64.3
c10        Defining significant change                       71.4
c12        Conducts independent internal evaluations        100.0
c13        Timeliness                                        71.4

Note: For Variable c3, an additional 21.4% would allow certification by administrators with supporting documentation. For Variable c13, the response choices always, very often, and fairly often are included in the percentage.

PUI respondents who would recommend their effort-reporting process also demonstrated best practices for that process (Table 2). Similar to DRU respondents, almost all reported that their institution had an effort-reporting policy and conducted independent internal evaluations (both 92.9%; c1, c12), although there were mixed results about providing a definition of suitable means of verification in that policy, or of who has suitable means of verification to certify effort (c2, d15). The majority of PUI respondents indicated that their institution defined what constituted a significant change (71.4%; c10). They also did not let administrators certify another individual’s effort for which they did not have suitable means of verification (85.7%; 14.3% allowed it only with supporting documentation; c3). No trends were observed with regard to a formal effort-reporting training program (c4) or having consequences in place for those who did not certify effort (c5). All respondents indicated that their institution tracked down late or overdue statements to achieve compliance (c6). Like DRU respondents, the majority of PUIs formally managed commitments at their institutions (84.6%; c7), but in contrast to the DRU respondents, PUIs did not have policies on minimum or maximum effort (57.1% and 78.6%, respectively; c9, c8). They were positive with regard to timely completion of effort certifications at their institutions (71.4%; c13). However, responses were mixed regarding the allowance of recertification, indicating no clear trend (c14).
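As a point of reference, the percentages in Table 1 are consistent with the 14 DRU respondents in the recommending group: 92.9% corresponds to 13 of 14 respondents, 78.6% to 11 of 14, and so on. A Table 1-style summary can be computed directly from coded yes/no variables; the sketch below, again in Python with pandas, uses hypothetical values rather than the study’s actual data.

    import pandas as pd

    # Hypothetical yes/no (1/0) practice variables for the 14 DRU respondents
    # in the recommending group; values are illustrative, not the study's data.
    dru_practices = pd.DataFrame({
        "c1": [1] * 14,            # effort policy: 14 of 14 -> 100.0
        "c6": [1] * 13 + [0],      # track late or overdue statements: 13 of 14 -> 92.9
        "c5": [1] * 8 + [0] * 6,   # consequences for not certifying: 8 of 14 -> 57.1
    })

    # The mean of a 1/0 column is the share of respondents demonstrating the
    # practice; multiplying by 100 yields a Table 1-style percentage.
    print((100 * dru_practices.mean()).round(1))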
