
STANDARD 2. ASSESSMENT SYSTEM AND UNIT EVALUATION
The unit has an assessment system that collects and analyzes data on applicant qualifications, candidate and graduate performance, and unit operations to evaluate and improve the unit and its programs.

2a. Assessment System

2a.1. How does the unit ensure that the assessment system collects information on candidate proficiencies outlined in the unit's conceptual framework, state standards, and professional standards?

The Professional Education Unit Assessment System (PEUAS) ensures that teacher candidates at Shepherd University are prepared to serve as professional teachers who create effective learning environments for meaningful learning and student engagement. Assessments address and align with the Professional Education Unit's (PEU) conceptual framework, the Interstate New Teacher Assessment and Support Consortium (INTASC) standards, and SPA-specific standards for individual programs. The PEU assesses in four domains: qualifications of applicants and candidates, candidate proficiencies, competence of graduates, and unit and program operations and quality. Addendum A - Assessment Domains displays the alignment of each assessment domain with the major unit assessments. The conceptual framework, Teacher as Reflective Problem Solver, exemplifies the PEU's belief that the process of action, interpretation, and reflection is inherent in one's education and professional life.

Candidate assessment occurs at several transition points using multiple assessments in each of the four assessment domains. Assessments of the teacher candidate through the transition/juncture points provide evidence of the candidate's growth and development. Each transition point uses various formative and summative assessments that collect sufficient data to determine the professional and pedagogical content knowledge of candidates and graduates. Formative evaluation (by course instructors, clinical supervisors, and advisors) gives the candidate ongoing feedback on performance to identify aspects of performance that need improvement and to offer corrective suggestions. The summative assessments identify patterns and trends in candidate and unit performance and judge these summary statements against criteria to obtain performance ratings. Aggregate unit data and disaggregated program-specific data are analyzed and distributed on an annual basis to members of the Professional Education Unit Council (PEUC) at retreats and regular meetings by the DAA and the Field Placement Coordinator (FPC).

The PEU views this assessment process as an opportunity to revise specific content curricula and make program changes that result in furthering the development and growth of candidates, as they become Teachers as Reflective Problem Solvers. This process also provides candidates with opportunities for self-assessment, reflection, and feedback from University faculty and public school practitioners.

2a.2. What are the key assessments used by the unit and its programs to monitor and make decisions about candidate performance at transition points such as those listed in Table 6? Please complete Table 6 or upload your own table at Prompt 2a.6 below.

The PEU uses internal and external assessment tools to measure candidate performance and competence. Assessments of teacher candidates through the transition/juncture points provide evidence of the candidate's growth and development. Each transition point uses various formative and summative assessments that collect sufficient data to determine the professional and pedagogical content knowledge of candidates and graduates. Formative evaluation gives the candidate ongoing feedback on performance to identify aspects of performance that require improvement and to offer corrective suggestions. The summative assessments identify patterns and trends in candidate and unit performance and judge these summary statements against criteria to obtain performance ratings. Within Attachment J, the following are presented: Addendum B, Professional Education Unit Assessment System: Transition Point Assessments, outlines the assessment tools for Juncture I (Admission to the Teacher Education Program), Juncture II (Admission to Student Teaching), and Juncture III (Recommendation for Licensure); Addendum C, Decision Outcomes at Transition Points, outlines the decision outcomes at each transition point and the options for candidates who do not meet each standard.

Transition Points

The PEUAS uses three transition points as its primary assessment structure to make decisions regarding the progress of initial teacher candidates through the program:

* Juncture I: Admission to Teacher Education
* Juncture II: Admission to Student Teaching
* Juncture III: Recommendation for Licensure

Juncture I: The PEUC deliberates the recommendations of the specialization coordinators to determine that teacher candidates demonstrate the academic knowledge, the dispositions, and performance skills necessary to enter teacher education.

Juncture II: The PEUC deliberates the recommendations of the specialization coordinators to determine that teacher candidates are ready to enter a full-time clinical experience.

Juncture III: The DTE determines if teacher candidates meet the requirements for recommendation for certification.

PRAXIS I and II content scores (for MAT candidates), GPA, and course grades qualify applicants and candidates. The evaluation of SPA-specific assessments, portfolios, development and delivery of unit plans and Teacher Work Samples (TWSs), student teaching evaluations, and disposition evaluations determines candidate proficiencies. In addition, the assessments developed and implemented for the Master of Arts in Curriculum and Instruction (MACI) align with the NBPTS. Candidates for the MACI also complete an assessment portfolio, a comprehensive exam (for candidates admitted after January 2009), the Diversity Field Project, and the action research thesis. Survey analysis of cooperating teachers, alumni, and principals helps determine the competence of graduates. The quality of the unit and program is determined by course evaluations and surveys completed by graduates, alumni, cooperating teachers, and principals.

Table 6
Unit Assessment System: Transition Point Assessments

PEU Table Six

2a.3. How is the unit assessment system evaluated? Who is involved and how?

The Unit Heads, the Vice President for Academic Affairs (VPAA) and the Dean of Education and Professional Studies (DEPS), authorized the creation of the position of Director of Assessment and Accountability (DAA) for the collection, management, and distribution of unit data. The DAA distributes aggregate unit data and disaggregated program-specific data to the Unit Heads and the PEUC for the purpose of unit and program improvement.

The PEUC, under the leadership of the DTE, created and implemented the assessment system. The council reviews and discusses the unit and program assessments during meetings and retreats. Specialization coordinators or any member of the council can bring forward questions and recommendations related to assessment during bimonthly meetings. Specialization coordinators report policy changes and needed curricular changes back to their respective departments for discussion, review, and implementation. Revisions may include curricular and assessment changes that further align programs with SPA standards (see SPA reports), changes in WVDE requirements for teacher education programs (see additions since last visit), and/or other changes the PEUC deems necessary to improve the assessment of the unit, such as the implementation of the Tk20 system.

In addition, these changes may include suggested revisions or questions about the administration of unit assessments, or may relate to a program-specific assessment that a particular specialty area would like the unit to examine to improve outcomes for candidates. In addition to its regular meetings, the PEUC regularly holds retreats (which often include P-12 professionals), typically the week before fall classes begin, the week before the spring semester begins, and at the end of the spring semester. Reviews of unit assessments and assessment procedures are often topics at these meetings. When needed, the PEUC forms sub-committees that draft suggested revisions to policies and procedures, which are presented to the PEUC for final determination.

The review and revision of the PEUAS promotes equity and fairness in the instruction and assessment of programs as well as the assessment of candidates and the unit. The assessment system is multifaceted and comprehensive as the unit strives to produce highly qualified candidates.

2a.4. How does the unit ensure that its assessment procedures are fair, accurate, consistent, and free of bias?

The PEUC works to ensure that the evaluation of major unit assessments is fair, accurate, consistent, and free of bias. Candidates learn the assessment system of the program through a variety of sources. Initially, candidates learn about the Unit Assessment System through the orientation and advisement program, during EDUC 150, the introductory course, and via catalogs, specialization handbooks, and online materials. Candidates meet their advisors and learn the juncture process (the system they will use to apply for admission to teacher education and student teaching) during the EDUC 150 course. Candidates continue to meet with their advisors to learn the required elements for portfolios, PRAXIS requirements, and any other content-specific assessments. This process helps candidates understand the major unit assessments and achieve the conceptual framework goals as they move through the program.

PRAXIS I and PRAXIS II exams are national assessments that have undergone validity and reliability studies. Other major unit assessments utilized by PEU members use scoring guides with detailed rubrics that address issues of fairness, accuracy, consistency, and bias. The rubrics provide candidates with clear expectations for the work they submit and help reduce the potential bias of the rater. Detailed rubrics are the basis of the Pro-05 qualitative evaluations (both initial licensure and MACI programs), the Unit Plan (initial licensure, completed in special methods classes), the Teacher Work Sample (TWS) (initial licensure), and the Student Teaching Evaluation (ST-76) (initial licensure). In addition, rubrics ensure fair evaluation for three of the major assessments for the MACI: the Diversity Field Project, the Assessment Portfolio, and the Action Research Thesis. Candidates admitted to the MACI program after January 2009 must pass comprehensive exams in order to enroll in EDUC 580: Action Research Thesis.

To address interrater reliability for the TWS, course instructors of the student teaching seminar participate in a training session in which the DAA defines and describes the expectations for each of the components of the TWS. Course instructors rate and discuss a sample TWS together to develop a consistent rating process. To develop greater interrater reliability for the Student Teaching Evaluation (ST-76), the PEU developed a manual that provides detailed descriptors for each indicator at each rating level. P-Adult mentor teachers use the same evaluation form as the university supervisors. P-Adult mentor teachers and PEUC members continue to provide feedback on this manual so that the rubric elements address PEU evaluation targets for candidate proficiencies in clinical practice.

2a.5. What assessments and evaluation measures are used to manage and improve the operations and programs of the unit?

Shepherd University uses multiple measures of student learning to improve its operations and programs. The Collegiate Learning Assessment (CLA) measures an institution's contributions to student learning; the 2008-09 CLA results placed Shepherd University in the 96th percentile for "value added." The National Survey of Student Engagement (NSSE) assesses the extent to which students engage in learning experiences on and off campus. Shepherd University is currently conducting an NSSE survey online, as the last survey was completed in 2005. The Center for Teaching and Learning (CTL) provides results online and in the Center.

The PEU uses a variety of assessments to improve the operations and programs of the unit. The University Assessment Review (UAR) is department based rather than unit based; therefore, the UAR is a component of the PEUAS but does not specifically evaluate candidates. The UAR, conducted through the CTL, maintains a consistent cycle of data analysis and program improvement. Each department submits an assessment plan that identifies goals, intended student outcomes, and means of assessment to measure intended student outcomes, with performance levels identified. Plans are submitted in the spring semester every two years, and data collection occurs for the next four semesters. Data analysis determines whether intended student outcomes achieve target performance levels. If assessment outcomes do not meet identified performance levels, the department develops and submits plans for improvement of student performance.

Unit assessments used to evaluate program and unit operations for both initial licensure and advanced programs include course evaluations, graduating student surveys, alumni surveys, cooperating teacher surveys, and principals' surveys evaluating recent graduates. The cooperating teacher survey was added in the spring semester of 2010. All surveys developed for the initial licensure and MACI programs include indicators that address program operations and quality. Course evaluations are completed for every professional education course and included in the faculty member's annual review. Data analyses for these evaluations are distributed to the course instructor, the chairperson of the DOE, the DGSCE, the DEPS, and the VPAA. If weaknesses are identified in these evaluations, the chair and the dean, during the annual review process, encourage faculty members to develop a plan for improvement. Items added to the graduating candidate survey in the spring semester of 2010 provide data regarding the operations of the university outside of the unit.

2a.6. (Optional Upload for Online IR) Tables, figures, and a list of links to key exhibits related to the unit's assessment system may be attached here. [Because BOE members should be able to access many exhibits electronically, a limited number of attachments (0-5) should be uploaded.]

Links:
Shepherd Course Evaluation Form http://www.shepherd.edu/ir/Documents//evaluation/Evaluation_directions.pdf

Exhibits for 2a:
* Attachments B, C, and D: Revised Conceptual Framework
* Attachments E, F, and G: Alignment of Programs with Standards and Assessments
* Attachment H Matrix of Alignment of Standards and Courses
* Attachment I Overview of Unit Assessment System
* Attachment J Unit Assessment System
* Campus Workroom: Minutes from PEUC Meetings

2b. Data Collection, Analysis, and Evaluation

2b.1. What are the processes and timelines used by the unit to collect, compile, aggregate, summarize, and analyze data on candidate performance, unit operations, and program quality?

* How are the data collected?
* From whom (e.g., applicants, candidates, graduates, faculty) are data collected?
* How often are the data summarized and analyzed?
* Whose responsibility is it to summarize and analyze the data? (dean, assistant dean, data coordinator, etc.)
* In what formats are the data summarized and analyzed? (reports, tables, charts, graphs, etc.)
* What information technologies are used to maintain the unit's assessment system?

Data Collection
The PEU implemented a data collection system that provides information for evaluating candidates, programs, and unit functions. Methods for data collection and analysis improved over the past year with the implementation of the Tk20 data management system. This system, piloted in the fall semester of 2009, was implemented in all education courses during the spring semester of 2010. Components of the PEUAS, including key course assessments, student teaching evaluations, and surveys, are accessed, completed, and evaluated through Tk20. The development of an electronic Campus Workroom is in process following the successful implementation of Tk20. The PEUAS outlines the schedule of assessments and data recipients.

Initial Licensure
Multiple sources provide data on applicant and candidate qualifications for initial licensure. Grade and GPA data, submitted by applicants/candidates, are verified by the certification analyst. The DAA receives PRAXIS (I and II) exam score reports directly from ETS. ETS aggregates data, matches score reports, and provides a summary for the Title II report. The PEUC and Shepherd University administration receive aggregate data reports annually. The DAA distributes PRAXIS II data to specialization coordinators to share with departments to improve curriculum and candidate performance.

Data collection for assessments evaluating candidate proficiencies emanates from course instructors. Instructors for EDUC 400, Student Teaching Seminar, submit TWS data to the DAA via Tk20. Instructors evaluate each candidate using the Pro-05, the disposition qualitative evaluation, at the conclusion of each professional education course. A copy of this evaluation is provided to the candidate and the candidate's advisor, and a copy is filed in the candidate's record file. The Field Placement Coordinator (FPC) creates reports based upon the Pro-05s completed in EDUC 150, 320, and 400 for the undergraduate program and for the student teaching semester for MAT candidates in order to identify possible trends related to candidate dispositions. Aggregate unit reports as well as reports for each licensure program emanate from these data. The specialization coordinator for each licensure program collects, maintains, and analyzes data for assessments specific to individual licensure programs, such as the unit plan assignment, the portfolio review, and other SPA-specific assessments. These specialization coordinators are encouraged to share these data regularly with their departments.

The DOE collects survey data that address both program operations and the competence of graduates. The graduating senior survey, developed by DOE faculty and approved by the NCATE Task Force, is administered to graduating candidates via Tk20. The DOE and the PEUC reviewed and approved the surveys. The FPC generates reports that the DAA distributes to the PEUC annually. The FPC also created surveys sent to alumni (graduates over the past five years) and principals (evaluating recent graduates/hires) via regular mail and email biannually in order to assess perceptions of graduates' competence.

Course evaluations are completed for every professional education course every semester. The Office of Institutional Research compiles evaluation results each semester and provides these results to the course instructor, the chair of the DOE, the DGSCE, the DEPS, and the VPAA. The Office of Institutional Research does not provide aggregate data for course evaluations for the PEU; therefore, course evaluation data for undergraduate and graduate courses are not analyzed at the unit level.

Advanced Licensure
Multiple sources provide data on applicant and candidate qualifications for the advanced MACI program. MACI applicants submit documentation of their undergraduate degree, teaching license, grades, and GPA (as well as Miller Analogies Test or Graduate Record Exam scores if the applicant's GPA is 2.5 or higher, but less than 2.75). The graduate studies administrative assistant verifies data submitted by candidates. The MACI program coordinator monitors data for current candidates using grade data provided by the Office of the Registrar to ensure that candidates meet all the GPA, comprehensive exam (for candidates admitted after January 2009), and grade requirements for admission to the action research thesis and for program completion.

Course instructors collect data from assessments designed to assess MACI candidate proficiencies. Course instructors evaluate the Diversity Field Project, Assessment Portfolio, and Action Research Thesis and complete rubrics through Tk20. The FPC develops reports based upon these data, which the DAA distributes to the MACI coordinator and members of the DOE. The FPC collects survey data that address both program operations and the competence of MACI graduates. The graduating survey is completed by graduating MACI candidates at the conclusion of the action research course. The FPC generates reports and distributes them to the DAA and the PEUC annually. The FPC also created surveys sent to MACI alumni (graduates over the past five years) and principals (evaluating recent graduates/hires) via regular mail and email biannually. Data from these reports are distributed to PEUC members biannually and examined for program strengths and weaknesses.

Course evaluations must be completed for every professional graduate education course every semester. The Office of Institutional Research compiles evaluation results each semester and provides these results to the course instructor, the chair of the DOE, the DGSCE, and the VPAA.

Data Analysis
The DOE uses the university assessment review process to maintain a consistent cycle of data analysis and program improvement. The university assessment process operates on a two-year cycle. Each department submits an assessment plan that identifies goals, intended student outcomes, and means of assessment to measure intended student outcomes, with performance levels identified. Plans are submitted in the spring semester every two years. Data are then collected for the next four semesters (fall, spring, fall, spring). At the end of the four semesters of data collection, data are analyzed to determine whether intended student outcomes were reached at the identified performance levels. If assessment outcomes do not meet identified performance levels, the department develops and submits plans for improvement to foster higher levels of student performance. The MAT coordinator submitted all data reports for all DOE programs for the 2008-09 reporting cycle.

The DOE develops assessment plans for the elementary and secondary initial licensure programs (undergraduate and MAT) and the advanced MACI program. The DOE collaborates to identify goals, outcomes, assessments, and performance levels. The current plan has goals for candidate knowledge, candidate performance, and candidate dispositions. Following data collection, the DOE will analyze the data and develop the report, with plans for improvement, beginning in the fall 2010 semester. The assessment plan and report were distributed to the PEUC for feedback in the spring semester of 2010. The plan and report are shared with the Educational Personnel Preparation Advisory Committee (EPPAC), which is comprised of P-12 partners and administrators.

The assessment plans developed by the DOE for elementary and secondary initial licensure programs do not address specific secondary licensure programs. Specialization coordinators are encouraged to include candidate proficiencies specific to their content areas in their departmental assessment plans and SPAs/CARs.

2b.2 How does the unit disaggregate candidate assessment data for candidates on the main campus, at off-campus sites, in distance learning programs, and in alternate route programs?

Initial and advanced candidates receive disaggregated data reports regarding their progress from multiple sources. These sources include the institution, the Educational Testing Service (ETS), course instructors, university supervisors, cooperating teachers, and self and peer evaluations. Candidates discuss progress with instructors and academic advisors. Candidates receive grade reports from the Office of the Registrar via RAIL. Candidates learn their PRAXIS exam results from the ETS website, while Shepherd University receives paper reports of candidates' PRAXIS scores. Course instructors provide evaluations of candidates' assignments via Tk20, directly in class, and via observation reports of candidates' performance in public school settings. Candidates communicate with course instructors via Sakai, Tk20, and email regarding assessment of assignments and field performances.

Course instructors, university supervisors and cooperating teachers evaluate candidates' performances in classrooms and provide feedback during post-observation conferences and using evaluation instruments. Candidates write self-reflections on their growth in field experiences as they complete coursework in the program.

There are no discrete off-campus or distance learning programs at this time. Individual sections of courses, such as MUSC 111 or ART 103, are taught online, but those courses are not part of a discrete online program. Individual graduate courses that are part of the advanced MACI program (EDUC 502) are at times taught off campus in the community to foster increased enrollment. Again, these courses are part of the campus-based MACI program.

2b.3. How does the unit maintain records of formal candidate complaints and their resolutions?

The DOE chairperson maintains a file of departmental candidate complaints and their resolutions. Instructors inform candidates of expectations for assignments and course grades on syllabi, Sakai, and/or Tk20. If candidates do not agree with assignment or course grades, they have the right to appeal the grade to the course instructor. If the candidate does not find satisfaction at this first level, the candidate appeals to the department chairperson. The chairperson meets with the candidate and investigates. If the candidate is not satisfied with the decision of the chairperson, the candidate appeals to the dean. The candidate can appeal the decision of the dean to the VPAA.

The PEUC maintains records of candidate withdrawals from clinical practice in candidates' permanent files located in Knutti 104 and 108. Letters documenting the candidate's voluntary or involuntary withdrawal are located in the candidate's permanent record. Candidates have the right to appeal involuntary removal from a field placement and clinical practice. The candidate writes a letter to the DTE as chairperson of the PEUC. Members of the PEUC volunteer to serve on a PEUC sub-committee that investigates the removal of the candidate from the field experience. The sub-committee investigates and makes recommendations to the PEUC, which votes on whether to accept the recommendations of the sub-committee. The candidate receives a letter from the DTE informing the candidate of the decision of the PEUC. If the candidate is denied readmission to the teacher education program, the candidate can complete a degree in another program outside of teacher education. If the candidate is readmitted to the program, the candidate may have to complete additional courses, assignments, and hours in the field to increase the potential for success. The PEUC minutes include the decisions related to candidate withdrawal.

2b.4. (Optional Upload for Online IR) Tables, figures, and a list of links to key exhibits related to the unit's data collection, analysis, and evaluation may be attached here. [Because BOE members should be able to access many exhibits electronically, a limited number of attachments (0-5) should be uploaded.]

Links:
University Assessment Review Process http://www.shepherd.edu/ctl/assess_learning.html

Exhibits for 2b:
* Attachment I Overview of Unit Assessment System
* Attachment J Unit Assessment System
* Attachments Y and Z: Policies and Procedures Manuals
* File of student complaints and the unit's response (KN 108)

2c. Use of Data for Program Improvement

2c.1. In what ways does the unit regularly and systematically use data to evaluate the efficacy of and initiate changes to its courses, programs, and clinical experiences?

To improve outcomes for students in P-12 settings, the PEU is committed to using data to evaluate and improve candidate preparation and graduate competencies, as well as the operations of the unit and its programs. Assessment data indicate that candidates are performing well, and the data provide information for growth at the candidate, faculty, program, and unit levels. Data, shared with faculty, staff, candidates, EPPAC, administrators, and P-12 partners, demonstrate that the PEU uses data for program improvement.

All programs at Shepherd University, including the PEU, are part of the university assessment process. The purpose of the university assessment system is to improve the quality of programs, identify needs for support, and help set priorities for the reallocation of resources to improve the delivery of instruction to Shepherd University students. Each department submits an assessment plan that identifies goals, intended student outcomes, and means of assessment to measure intended student outcomes, with performance levels identified. Each department collects and analyzes data over a two-year period. The Center for Teaching and Learning receives data-driven Assessment Reports from all departments for all programs. The Dean of Teaching and Learning and the Assessment Task Force Committee review the submissions and suggest changes, and the VPAA and Dean meet with the Deans' Council and program chairs to discuss the strengths and needs of each program. Examples of DOE reviews are available in the documents room. The university assessment process works on a two-year cycle; however, the collection and analysis of data from unit assessments occurs on a regular basis.

The DOE discusses PRAXIS scores, juncture applications, survey results, and program data during department meetings. Data results inform the department of candidates' strengths and needs and potential needed changes in coursework and strategies within courses. The PEUC receive data during faculty retreats and during the bimonthly meetings. Specialization coordinators and other faculty members develop plans for program improvement as needed.
The DOE piloted Tk20 as its data management system in the fall semester of 2009 in order to aggregate and store data. The data management system provides a mechanism for collection, storage, analysis, and distribution of assessment data related to coursework and fieldwork throughout the teacher education program. Aggregate data are used for analysis of student performance and program improvement. The implementation of Tk20 in all education courses occurred during the spring 2010 semester. Components of some key assessments, identified on the course matrix, changed during the implementation process to improve the assessment process.
Academic advisors and the coordinator of the MAT program evaluate candidate portfolios to ensure candidate understanding of program philosophy, conceptual framework, and SPA-specific standards; and to determine entrance and retention in the teacher education program. Instructors evaluate each candidate using the Pro-05 qualitative evaluation at the end of each professional education course.
The specialization coordinator for each licensure program collects, maintains, and analyzes data for assessments that are specific to individual licensure programs, such as the unit plan, the portfolio review, and other SPA-specific assessments. Specialization coordinators share these data with their respective departments when needed to prepare candidates to pass PRAXIS and succeed in clinical practice. The Department of Computer Science, Engineering and Mathematics added PRAXIS II preparation to its capstone course in order to improve candidates' performance on the PRAXIS II exam in mathematics (0061). The Department of Health, Education, Recreation, and Sports recently added a new course to its curriculum, Human Sexuality, to improve performance on the PRAXIS II Health Education exam (0550) and to increase candidates' comfort in teaching sexuality as future health education teachers.
Data for assessments evaluating candidate proficiencies come directly from course instructors. Instructors for Special Methods and Pedagogy courses evaluate candidates' unit plans and provide data that indicate whether candidates have mastered SPA-specific standards and understand how to plan and assess instruction. Instructors for EDUC 400, Inclusion in the Regular Education Classroom, submit TWS data to the DAA. Departments receive data from the specialization coordinators. The DOE reflects on candidate performance and implements strategies to strengthen candidates' academic and field-based performance.

The FPC collects survey data addressing program operations and the competence of graduates. In addition, university student teaching supervisors and public school cooperating teachers complete observation evaluations on Tk20. Alumni complete surveys via email and traditional mail, with self-addressed stamped envelopes provided to facilitate the return of the surveys. The DAA and the FPC administer, through Tk20, the graduating senior survey to graduating candidates during an EDUC 400 class session and an EDUC 580 class session. The survey was piloted at the conclusion of the spring 2009 semester; the PEUC analyzed the data and revised the survey during the spring 2009 retreat. The FPC, as the instructor of EDUC 580, administers the graduating survey during a final class session for MACI candidates. Beginning in the spring semester of 2010, the DAA and the FPC distribute the results to the PEUC each semester during the final retreat of the academic year.
The DOE develops assessment plans and tools for elementary and secondary initial licensure programs, undergraduate and MAT, as well as for the MACI. The department identifies goals, outcomes, assessments, and performance levels. The current assessment plan has goals for candidate knowledge, candidate performance, and candidate dispositions. After data are collected, the department analyzes the data and develops plans for improvement.

2c.2. What data-driven changes have occurred over the past three years?

The unit has made several important data-driven changes over the past three years. One of the major changes relates to the TWS. In 2002, the teacher education program began using the TWS assessment during the student teaching seminar. Analysis of data in subsequent years demonstrated that candidates were not developing the proficiencies required to perform well on this assessment. The DOE held a faculty retreat in August of 2006 to address this issue. The DOE decided to embed elements of the TWS throughout all of the professional education courses so that, by the time candidates complete the comprehensive TWS in the student teaching seminar, they would have developed the skills to perform well on this assessment. Analyses of subsequent TWS data demonstrate that candidate performance on this assessment has improved (see program reports for TWS data).

Another data-driven change relates to the student teaching evaluation (ST-76). When reviewing data reports, the DOE recognized that candidate ratings were probably higher than they should have been, particularly for evaluations completed early in the student teaching semester. To address rater inflation, the DOE developed the ST-76 manual to provide specific criteria for each indicator and to obtain a more accurate evaluation of candidate performance. The manual was developed with input from the PEUC as well as P-Adult mentors/partners. Based on feedback from school-based partners, the PEU asked P-Adult mentor teachers to complete the same evaluation form to facilitate data comparison. In addition, the PEUC decided to have candidates complete a self-evaluation using the same form in order to encourage self-analysis and reflection.

A third major change was based on feedback received from candidates on the graduating student survey administered in the fall semester of 2008. A number of students suggested that the student teaching seminar course should be offered online, since many candidates had to drive from other states. When the PEUC reviewed these data during the January 2009 faculty retreat, a recommendation that came from this session was to consider moving the course to an online format. The course instructors for EDUC 400 made the decision that spring to move the course to a hybrid in-class/online format. Feedback from candidates on the spring 2009 graduating student survey provided guidance as to whether to continue with this format. The class was offered in the spring semester of 2010 with two hybrid sections and one traditional section, and candidates knew which section was offered in which format when they registered for the class. As a result, in the fall semester of 2010, two sections of EDUC 400 are offered face-to-face and one section is offered in a hybrid format.

In addition to unit-wide changes, individual programs have made data-driven changes based on data collection and analysis. See Part V of the SPA/CAR reports for a description of these changes.

2c.3. What access do faculty members have to candidate assessment data and/or data systems?

Faculty members have access to their advisees' transcripts, scores, and disposition evaluations (Pro-05). They also have access to the data of the students in their classes. The certification analyst, the DAA, or the FPC can provide faculty with additional data when requested. Faculty members receive candidate assessment data from major unit assessments at department and PEUC meetings. Some aggregate data are provided on a yearly basis, while other data are provided after the end of each semester. The Tk20 data management system provides faculty with streamlined access to candidate data from major unit assessments as well as other course embedded assessments.

2c.4. How are assessment data shared with candidates, faculty, and other stakeholders to help them reflect on and improve their performance and programs?

Typically, aggregate data for unit assessments are not shared with candidates; however, there are two candidates who serve as representatives to the PEUC. One candidate serves as a student representative for elementary majors and the other candidate represents secondary education majors. When data are shared with this body, these two students have access to aggregate data. The student representatives report this information to the Shepherd Education Student Association.

Individual assessment data are provided to each candidate for all assessments. Candidates obtain copies of the Pro-05 disposition evaluations for each education course from their advisor. In addition, candidates receive copies of all student teaching evaluations. Other data from course-embedded assessments are distributed to candidates directly by the course instructor. Data are shared regularly with the PEUC either during bimonthly meetings or during special faculty retreats. These faculty retreats often include P-Adult professionals who work closely with Shepherd University in the former Professional Development Schools or other partner schools. In addition, assessment data are shared with the Educational Personnel Preparation Advisory Committee (EPPAC), which is composed of P-Adult partners and administrators. Data are shared with this body through the presentation of the University Assessment Reports.

2c.5. (Optional Upload for Online IR) Tables, figures, and a list of links to key exhibits related to the use of data for program improvement may be attached here. [Because BOE members should be able to access many exhibits electronically, a limited number of attachments (0-5) should be uploaded.]

Links:
University Assessment Review Process http://www.shepherd.edu/ctl/assess_learning.html

Exhibits for 2c:
* Attachment J Unit Assessment System
* Attachment R Student Teaching Data (ST 76)
* Campus Workroom: Minutes from PEUC Meetings and EPPAC Meetings

 

Shepherd University | P.O. Box 5000 | Shepherdstown, West Virginia | 25443-5000 | 304-876-5000 | 800-344-5231