
DOCUMENT RESUME

ED 375 912    JC 940 632

TITLE: Institutional Effectiveness Program. Pima County Community College District Institutional Effectiveness Series: 1.
INSTITUTION: Pima County Community Coll. District, AZ.
PUB DATE: [94]
NOTE: 35p.; For numbers 2-6 of the series, see JC 940 633-637.
AVAILABLE FROM: Pima Community College, Office of the Vice Chancellor, 4907 E. Broadway, Tucson, AZ 85701-1030.
PUB TYPE: Reports - Descriptive (141)
EDRS PRICE: MF01/PC02 Plus Postage.
DESCRIPTORS: *College Planning; Community Colleges; Community Involvement; Evaluation Methods; *Institutional Mission; Institutional Research; *Organizational Effectiveness; Outcomes of Education; Personnel Evaluation; Program Evaluation; *Research Utilization; *Self Evaluation (Groups); Two Year Colleges
IDENTIFIERS: Pima Community College AZ

ABSTRACT
Describing Pima Community College's (Arizona) institutional effectiveness program, this report provides related board policy, an overview of the program, and an analysis of each of the five program components. Following introductory materials and a board statement indicating the college's commitment to ensuring institutional effectiveness through continuous assessment and quality improvement, an overview of the program is provided, including a schematic diagram of program components and a list of documents in which review results are made available.
Individual descriptions are then provided for the following five components of the institutional effectiveness program: (1) an evaluation, every 5 years, of the mission statement involving broad-based community involvement; (2) annual reviews of programs and services with respect to students, faculty, curriculum, and financial operation, as well as 5-year reviews of goal achievement, faculty development, and curriculum modifications; (3) periodic evaluations of faculty, the chancellor, administrators, staff, and the board of governors; (4) student outcomes assessment, including classroom mini-grants, general education, occupational education, and student information system reports; and (5) continuous evaluation of the planning process. For each section, information is provided on participants in each component, the form of documentation, and the utilization of results. Timelines, the executive summaries of four research reports, and other supporting materials are appended. (KP)

Reproductions supplied by EDRS are the best that can be made from the original document.
Pima County Community College District
Institutional Effectiveness Series: 1

Table of Contents

Preface
Introduction
Institutional Effectiveness Board Policy
Overview of the Program
Mission Statement Evaluation
Program and Service Review
Board and Employee Evaluation
    Faculty
    Chancellor
    Administration
    Classified Staff
    Board of Governors
Student Outcomes Assessment
    Classroom Mini-Grants
    General Education
    The DACUM Process
    Student Information System Reports
Planning Process & Evaluation of Planning
Conclusion
Appendix I. Examples of Mission Success Indicators
Appendix II. Chart of Responsibility for Program Review
Appendix III. Sample Program Review Level I Report
Appendix IV. Sample Page from Program Review Level II Status Report
Appendix V. Board and Employee Evaluation: Chart of Responsibility
Appendix VI. Action Plan & Timetable for Student Outcomes Assessment Program
Appendix VII. Student Outcomes Assessment Data Flowchart
Appendix VIII. Sample of Mini-Grant Abstract
Appendix IX. Executive Summary: Longitudinal Study of Minority Student Retention and Transfer Success
Appendix X. Executive Summary: A Study of Students Concurrently Enrolled at PCC and University of Arizona
Appendix XI. Executive Summary: Employer Needs Assessment Results
Appendix XII. Executive Summary: 1992 Employer Assessment of Recent PCC Graduates

Preface

To the Pima College Community:

In the past few years, we have seen change in the College on a scale that we will probably never see again. As we struggled with all that change, as we rushed from deadline to deadline, we may have felt often that we were doing it only to remedy deficiencies rather than to build a better future. Now, pausing to look back on where we have been and where we have come, we can see things in a better light.
Our Institutional Effectiveness Program does represent a timely response to the pressures of the recent past, and that is accomplishment enough. But it is even more timely, and significant, because it is our response to our community and our students. Together, we have fashioned a plan for the future, not an apology for the past. The program lays out a set of guidelines to make the most of what we have, and it will prove its real value when it leads to later programs for an ever better future. We set out on this plan because we knew that change has no value unless it leads to improvement, and real improvement happens only after we look at ourselves and decide where we need to go. As you reflect on your role in getting the College this far, perhaps you will come to value the place we are at, and we will all have good reason to hope that our community and our students will like where we are leading them.

Carol A. Gorsuch
Vice Chancellor

Introduction

The purpose of this document is to describe Pima Community College's commitment to support and evaluate the five areas of the Institutional Effectiveness Program and the principal means by which assessment and improvement will occur. It also contains the board policy on institutional effectiveness, an overview of the program and its development, and a summary analysis of each of the program components (including information on the participants, documentation, and utilization of results). Also included are example documents which illustrate the impact of the institutional effectiveness program on the life of Pima Community College. More details on the procedures, regulations, and types of information collected for the Institutional Effectiveness Program are presented in documents cited at the end of each section.
Institutional Effectiveness Board Policy

The program described in this document was formulated in response to the institutional effectiveness policy, which was adopted by the PCC Board of Governors on March 11, 1992:

The College is committed to ensuring institutional effectiveness through continuous assessment and quality improvement. Accordingly, the College will establish responsive and integrated planning, evaluation, development and project-support systems to help the College fulfill its mission in the most effective and efficient manner. The College administration is authorized to establish regulations and procedures to implement this policy.

Overview of the Program

The Institutional Effectiveness Program is composed of evaluative activities in five different areas of institutional life, as follows:

Mission success
Program and service review
Student outcomes assessment
Board of Governors and employee performance evaluations
Planning and planning evaluation

These processes have developed somewhat independently, but they now frequently relate to one another, as depicted in the schematic on the following page. At the center of the Institutional Effectiveness Program is Student Outcomes Assessment, which shares data with each of the other processes. A summary of the results of these evaluations will be made available in an annual Report on Institutional Effectiveness beginning at the end of the 1993/94 year. The responsibility for monitoring and coordinating the Institutional Effectiveness Program, as well as for the annual report, is assigned to the Office of the Vice Chancellor.

As illustrated in the schematic, student outcomes assessment is the central focus of the Institutional Effectiveness Program in that it is deemed to be the key measure of the College's success. The other four program components use student outcomes data, either directly or indirectly, as a part of their respective processes.
Each of the processes also interacts with one another.

[Schematic: Student Outcomes Assessment at the center, connected to Mission Statement Evaluation, Board and Employee Evaluation, Program and Service Review, and Evaluation of Planning.]

The results of the Institutional Effectiveness Program are made available to the College community (and, in the case of the mission success evaluation, to the community at large) via a variety of publications and documents. Among them are the following, which are available for review in the district central offices indicated.

Mission Success Indicators: Outcomes Status, Spring 1993 (Research & Planning Office)
Program and Service Review (Vice Chancellor's Office)
    Instructional Programs and Disciplines
    Student Development Services
    Instructional Support Services
    Official Program and Service Review Cycle
Employee Handbooks (Human Resources Office)
    Faculty Personnel Policy Statement, Appendix B: Comprehensive Faculty Evaluation Program
    Administrative Personnel Policy Statement
    Classified Employees Exempt Policy Statement
    Classified Employees Non-Exempt Policy Statement
Faculty Evaluation Aggregate Reports (Human Resources Office)
1993 Board of Governors Evaluation (Institutional Research Office)
Classroom Mini-grants (Academic Affairs and Student Development Office)
    RFP for mini-grants
    Abstracts and Summaries (Selected Bulletin issues)
Curriculum Procedures Manual (Curriculum Services Office)
    General Education Designation Process
Developing a Curriculum, DACUM (Occupational Education Office)
Longitudinal Study of Minority Student Retention and Transfer Success (Institutional Research Office)
A Study of Students Concurrently Enrolled at PCC and University of Arizona (Research and Planning Office)
1992 Employer Needs Assessment Results (Institutional Research Office)
1992 Employer Assessments of Recent PCC Graduates (Institutional Research Office)

This listing represents only a sample of currently available assessment data.
Other less formalized assessment techniques are being discovered and implemented throughout the College.

I. Mission Statement Evaluation

Pima Community College reviews its mission and reports to the community annually. It also conducts a comprehensive evaluation approximately once every five years with broad-based community involvement to ensure that the College continues to address the needs and expectations of its publics and that it is succeeding in fulfilling its mission.

Participants

Annually, the Research and Planning Office is responsible for determining appropriate measures of the success indicators either through data already available, surveys like the Personnel Assessment of the College Environment (PACE), other surveys designed by the College, or data collected by outside consultants for this purpose.

Typically, a five-year comprehensive evaluation review committee of approximately 100 members is formed with 50 percent College personnel (including faculty, administrators, staff, and students) and 50 percent community leaders who:

assess the current mission statement by examining the success indicators of previous years,
identify topic areas that are or should be expressed as major commitments in the statement,
modify the statement, if necessary, and
develop success indicators with measurable outcomes in each area identified.

Documentation

The Office of Research and Planning provides data from a variety of sources, including survey and registration information, to substantiate the degree of progress made in each of the areas. Copies of the report are provided to all faculty. The Chancellor reviews these outcomes and presents a progress report to the community. The most recent progress report, in pamphlet form, is entitled Mission Success Indicators: Outcomes Status, Spring 1993, and is available from the Office of Research and Planning.
Utilization of Results

As part of the Program Review process, faculty relate mission success outcomes to their programs or disciplines and indicate what changes they intend to make, based upon the aggregate data. The results of the Mission Statement Evaluation are thus used by administrators, departments, and faculty to recommend program changes. They also provide the foundation for subsequent modifications of the Mission Statement. Examples of mission success indicator outcomes are noted in Appendix I.

II. Program and Service Review

Pima Community College conducts regular evaluations of programs and services to promote the educational quality, equity, vitality, and efficiency of the College. These evaluations measure the extent to which programs and services fulfill the College's mission. Program and Service Review consists of two levels.

All credit instructional programs receive Level I data annually in four major categories:

Students
Faculty
Curriculum
Financial/Operational Budget Characteristics

Level II is a comprehensive review combining both a quantitative and a qualitative evaluation of each program and service, conducted every fifth year by program faculty. Reviews are staggered over that period. Level II evaluations are conducted utilizing an instruction booklet with guidelines for the self-evaluation. Topics for evaluation include (1) how well the program meets mission goals, facility and personnel needs, (2) examination and evaluation of faculty professional development, (3) curriculum modifications, and (4) evaluation of course scheduling and student outcomes assessment.

Participants

The roles of various participants are summarized in the chart of responsibility which appears in Appendix II. The Chancellor's Cabinet coordinates the college-wide review program. The Office of Research and Planning is responsible for the database and survey instruments, and also analyzes, interprets, and reports survey results.
The responsibility for monitoring Program and Service Review is assigned to the Assistant Vice Chancellor for Academic Affairs and Student Development. The faculty or staff in each participating unit conduct reviews and write reports. A lead administrator, selected from among the deans of instruction or student development, is assigned to coordinate each program or service review and to work with department chairs or staff persons from all participating campuses. For programs based on a single campus, the area dean responds to faculty and staff recommendations, prioritizes the recommendations, and sends them to the provosts for action. For multi-campus programs, the lead dean meets with all the deans, reviews and responds to the recommendations, prioritizes them, and sends them to the provosts. The provosts approve recommendations, allocate funds for improvements, and resolve inter-campus issues. Any major change in program direction, as well as any consolidation or elimination of programs, is proposed to the Chancellor's Cabinet.

Documentation

Procedures and forms for Level I and Level II Program and Service Reviews are provided in packets designed for each of the three types of operations reviewed: Instructional Programs and Disciplines, Student Development Services, and Instructional Support Services. These are available in the office of the Assistant Vice Chancellor for Academic Affairs and Student Development.

Utilization of Results

Level I materials provide aggregate data to all departments and programs. This information is used to suggest adjustments that can be made at the departmental level in curriculum, scheduling, or services. Appendix III contains an example of a Level I review.
Level II reviews, conducted every fifth year, require listing and prioritizing recommendations based on the results of the review. Special funds are designated each year for capital equipment ($150,000 in 1993/94) and faculty/staff/curriculum development ($70,000 in 1993/94) in response to these recommendations. The recommendations also provide the basis for budgeting and curriculum decisions for departments and programs. Progress in implementing Level II recommendations is reported every six months on summary forms, and these are presented in annual reports to departmental faculty. See Appendix IV for a sample page indicating the status of recommendations resulting from a Level II report.
