STANDARD 4: The provider demonstrates the impact of its completers on P-12 student learning and development, classroom instruction, and schools, and the satisfaction of its completers with the relevance and effectiveness of their preparation.
NOTE 1: All components must be met for Standard 4
NOTE 2: Standard 4 and the “8 annual reporting measures”
The CAEP January requests for provider annual reports include questions about data on each of the 4.1 through 4.4 components. The request to EPPs defines the minimum expectation each year until reporting across providers can be complete and consistent. Trends in the provider’s cumulative reports since the last accreditation cycle will be included and interpreted as part of the self-study report. Providers are expected to supplement that annual reporting information with other, more detailed, data on the same topics from their own sources. Unconstrained by CAEP’s longer-term goal for consistently defined and commonly reported annual measures, EPPs will have greater flexibility to assemble their best documentation for Standard 4 by employing sources available in their own state, or documentation that they created, if any.
4.1 The provider documents, using multiple measures, that program completers contribute to an expected level of student-learning growth. Multiple measures shall include all available growth measures (including value-added measures, student-growth percentiles, and student learning and development objectives) required by the state for its teachers and available to educator preparation providers, other state-supported P-12 impact measures, and any other measures employed by the provider.
4.2 The provider demonstrates, through structured and validated observation instruments and student surveys, that completers effectively apply the professional knowledge, skills, and dispositions that the preparation experiences were designed to achieve.
4.3 The provider demonstrates, using measures that result in valid and reliable data and including employment milestones such as promotion and retention, that employers are satisfied with the completers’ preparation for their assigned responsibilities in working with P-12 students.
4.4 The provider demonstrates, using measures that result in valid and reliable data, that program completers perceive their preparation as relevant to the responsibilities they confront on the job, and that the preparation was effective.
Evidence provided for Standards 1, 2, and 3 shows that candidates complete a rigorous, progressive, and comprehensive program in order to graduate and be recommended for state teacher licensure. Once candidates become completers, they may find positions in the teaching field in West Virginia, though many of our graduates find employment in other nearby states or opt not to work directly in the field. As with many EPPs, the Professional Education Unit Council has grappled with how to quantify the success of Shepherd University completers who are in the field, particularly when it comes to acquiring data about their P-12 students. Part of the issue is that many of our graduates teach outside of WV. Beyond that, for those completers who do remain in WV, there is no way to access P-12 student data generally, let alone data specific to an individual teacher, and there are no plans at the state level to provide EPPs with such information.
In previous years, the PEUC has looked at options for tracking completers and, via Post-Graduate surveys, has gathered data on whether the prior year's graduates are teaching full-time, part-time, or not at all. To reach as many completers as possible, records of who responds are matched to the master list, and a staff member follows up by email, social media, and individual phone call with those who do not respond. While the rate of response well exceeds CAEP requirements (67% for the 2015-2016 report, for example), the information gathered is sparse and does not speak to the impact completers are having on their students, their impressions of how prepared they were to be in the classroom, or their supervisors' opinions of the quality of their work. The process of reaching out to completers and keeping track of who responds is also time consuming and not sustainable in the long run.
Therefore, the PEUC has considered a number of options to specifically address the requirements of Standard 4, as seen in the Discussion Notes. In measuring impact on P-12 learning, SU's EPP has taken purposeful and strategic steps to develop its assessment of impact. While candidate, completer, supervising teacher, university supervisor, and employer surveys have been administered for years, data has not been collected systematically or on a regular enough basis to yield reliable results. In formulating a cohesive plan for measuring impact, the EPP is determining how to use these data to inform the broader programmatic decision-making process in a systematic, intentional manner.
Collaborative efforts fostered by participation in the Education Provider Program Advisory Council to gather exit data and survey program completers over an extended period, together with a state data survey (discussed later), will provide the EPP with comparative data from throughout the region we serve.
In summary, SU’s EPP seeks to emphasize the emerging and evolving use of the data gathered to drive decision making internally and, with valuable external feedback, to assess the quality and impact of its programs against national norms. This transformation involves technical, personnel, procedural, and cultural changes on the part of the EPP to increase the use of multiple measures to assess candidate outcomes and to draw conclusions regarding candidate quality, program viability, and impact. These data are also useful in providing evidence of the success of graduates for several other compliance components:
1.) The program review process required by the West Virginia Higher Education Policy Commission.
2.) Criterion 4.6 of the criteria for accreditation from SU’s regional accreditor, the Higher Learning Commission.
As a whole, these measures work collaboratively and systematically to ensure excellent preparation of educators and continuous improvement of program components.
Plan for gathering P-12 student impact data from program completers to satisfy Standard 4.1 (As a note, while the state is discussing ways to make P-12 learning data available in the future, at present, no state system is in place to provide this data.)
SU-EPP PLAN FOR GATHERING DATA FROM PROGRAM COMPLETERS:
1. In EDUC 400, before student teachers graduate, identify those who plan to apply and eventually teach in WV. Since some attrition is to be expected, include ALL those who plan to apply/teach in WV and are willing to participate in this project. Goal: Initial cohort of 15-17 students representing a cross section of content areas.
2. Develop a variety of tools to measure value-added data for SU's EPP. In addition to a variety of surveys, annual evaluations are linked to personal goals and student learning. Follow-up would include identifying clear learning targets for students in the class that are appropriate to the developmental level and specialization area of the students, and subsequent measurement of learning against those targets. Examples include adapting survey instruments from the Gates Foundation. Shepherd's LMS, SAKAI, or Tk20 (accounts are active for 7 years) will be used to administer these surveys and analyze the resulting data. The faculty member(s) assigned to this project will receive course release time to oversee its development and implementation.
3. Offer an incentive to completers for participating in the project: professional development credits for licensure re-certification (1 credit for each year of participation; value: $60) or a waiver of 1 graduate-level degree credit for the 3-year study (value: $420-$650). The Dean of Graduate Studies and Continuing Education has agreed to work with completers participating in the program to ensure completion of registration forms and application of fee waivers.
4. SU DoE faculty follow up with the teachers and/or administrators (if this is not a DAA responsibility, this time should be factored into faculty teaching load, in a way similar to student teacher supervision).
5. Utilize information from the Voluntary System of Accountability Survey (VSA) administered by SU Institutional Research to compare satisfaction and future plans of education majors as compared to the overall SU student body.
6. Segment data received from the graduation survey and alumni surveys to determine completer perception of preparation and quality of the program.
7. Use this data in conjunction with findings from the NExT Exit Survey to determine the same items.
INDICATORS OF TEACHING EFFECTIVENESS: Plan for gathering data regarding teacher effectiveness and value-added data.
1. Work through EPPAC members and county superintendents to evaluate test scores in individual classrooms and overall schools (if the State determines a way to provide this information). Personal data such as complete names and ID numbers will be redacted from reports. This will be accomplished through reports of student achievement scores.
2. Rather than developing a separate rubric of questions and evaluations, use current evaluations employed by the state system with personnel information redacted.
3. Part of teaching effectiveness is also tied to teacher retention and the issues facing first-year or novice teachers. Survey participation will further strengthen connections to the institution and familiar advisors, and will provide a forum for supporting teacher success and impact on student learning. There are resources and data that could provide significant information for a potential case study of this vulnerable population:
Rodger, E., Prater, G., & Blocher, J. (2014). Alumni perceptions of their school/university partnership programs. In J. Ferrara, J. Nath, & I. Guadarrama (Eds.), Creating visions for university/school partnerships: A volume in professional development school research. Information Age Publishing.
Ross, V., & Prior, J. (2014, April). Early-career teacher retention: Stories of staying. Presentation at the American Educational Research Association Annual Conference, Philadelphia, PA.
Prior, J., Ross, V., & Powell, P. (2013, May). Pre-service teachers, children, and life: Using stories to reconceptualize curriculum. Presentation at the American Educational Research Association Annual Conference, San Francisco, CA.
It is possible that students pursuing the thesis option in Shepherd’s Master of Arts in Curriculum and Instruction could pursue a case-study project with this area of focus.
Beyond gathering data from at least 20% of completers with a focus on their impact on P-12 learning, there is a need for the deeper overall level of feedback from completers required by CAEP, which our current efforts to gather completer and administrator survey responses are not providing. To not only comply with CAEP expectations but also to grow and improve our EPP, SU is taking the next step to assess the quality, competency, and effectiveness of the program, along with other related information, through participation in the NExT Exit Survey developed by North Dakota State University. SU, along with several other institutions across the state of West Virginia, has signed an MOU to participate in this survey, which begins Spring 2017. The employer survey is sent out every year. Principals and completers must fill out an end-of-year report that has been in place for several years, and the state is now going to embed a link to this survey within that report. Institutions that have opted to participate, as SU has, will receive data if the principal has employed a teacher from our institution. When the completer submits the survey, the information goes to the state, which sends it to NDSU, which in turn separates out the data and sends the information to participating EPPs. We will receive data on 100% of our completers who are teaching in WV, which will easily take us well beyond the 20% response rate required by CAEP.
The university confirmed participation in this statewide initiative midway through the Fall 2016 semester and shared the information with student teachers at that time. These students were given the opportunity to complete the survey. The data contained in the NExT Exit Survey Report Fall 2016 show 22 of 33 student teachers responding, a 67% response rate. Of these respondents, 19 of 27 undergraduates (70% response rate) and 3 of 6 graduate students (50% response rate) participated. The data were received by the EPP just prior to the submission of the Self-Study Report, leaving little time for deep analysis of the results. However, the PEUC did have a chance to review both the Survey and the Survey responses and offer feedback. Additionally, the PEUC anticipates an even higher percentage of completers participating moving forward, as students will receive information about the Survey, and about the importance of participating, early in their student teaching experience.
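The response-rate percentages above can be verified with a short calculation (a minimal sketch; the respondent and cohort counts are those reported in the NExT Exit Survey Report Fall 2016, and the function name is illustrative only):

```python
# Sanity check of the Fall 2016 NExT Exit Survey response rates
# reported in the narrative above.

def response_rate(responded: int, invited: int) -> int:
    """Return the response rate as a whole-number percentage."""
    return round(100 * responded / invited)

overall = response_rate(22, 33)    # all student teachers -> 67
undergrad = response_rate(19, 27)  # undergraduate student teachers -> 70
graduate = response_rate(3, 6)     # graduate student teachers -> 50

print(overall, undergrad, graduate)  # prints: 67 70 50
```

Note that the subgroup counts are consistent with the overall figures (19 + 3 = 22 respondents out of 27 + 6 = 33 student teachers).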