Accreditation

The College of Education serves as the Educator Preparation Provider (EPP) at Southern Connecticut State University. The College received continuing accreditation from the National Council for Accreditation of Teacher Education (NCATE) in 2014 and is currently working toward continuing accreditation by the Council for the Accreditation of Educator Preparation (CAEP).

Third-Party Comments

The College of Education at Southern Connecticut State University is hosting an accreditation visit by the Council for the Accreditation of Educator Preparation (CAEP) on March 28-30, 2021. Interested parties are invited to submit third-party comments to the site team. Please note that comments must address substantive matters related to the quality of the professional education programs offered and should specify the party's relationship to SCSU (e.g., graduate, present or former faculty member, employer of graduates, cooperating teacher).

CAEP accepts written comments up to six weeks before the site visit — that is, until February 14, 2021. We invite you to submit written testimony to the address below:

CAEP
1140 19th Street NW, Suite 400
Washington, DC 20036
Or by email to: callforcomments@caepnet.org

Comments must be submitted within the specified period and must be based on the core tenets of CAEP's accreditation standards of excellence, which recognize that:

  1. In CAEP's performance-based system, accreditation is based on evidence that demonstrates that teacher candidates know the subject matter and can teach it effectively so that students learn. In the CAEP system, EPPs must prove that candidates can connect theory to practice and be effective in an actual P-12 classroom.

  2. A professional education provider that is accredited by CAEP is expected to be involved in ongoing planning and evaluation; to engage in continuous assessment and development; to ensure that faculty and programs reflect new knowledge, practices, and technologies; and to pursue continuous development in response to the evolving world of education and educational reform.

  3. Comments must address substantive matters related to the quality of professional education programs offered, and should specify the respondent's relationship, if any, to the institution (e.g., graduate, present or former faculty member, employer of graduates). Copies of all correspondence received will be sent to the university for comment prior to the review. No anonymous testimony will be considered.

For more information, visit: http://caepnet.org/accreditation/caep-accreditation/third-party-comments.

EPP Programs

The EPP consists of 58 programs at the initial and advanced levels, which are run by departments in the College of Education, the College of Arts & Sciences, and the College of Health & Human Services.

The status of each program with respect to NCATE/CAEP accreditation can be found at this link.

The following advanced graduate programs have earned accreditation within their specialty areas outside of CAEP:

The following programs achieved National Recognition by their Specialized Professional Associations:

  • SYC Educational Leadership - Full recognition by ELCC/NELP (recognized through 2025)

  • EdD - EDLA - Full recognition by ELCC/NELP (recognized through 2025)

  • BS English (7-12) - Full recognition by NCTE (recognized through 2027)

  • Post-Baccalaureate English (7-12) - Full recognition by NCTE (recognized through 2027)

  • MS Reading - Full recognition by ILA/IRA (recognized through 2025)

  • SYC Reading - Full recognition by ILA/IRA (recognized through 2025)

  • SYC School Psychology - Conditional recognition by NASP

CAEP Annual Reporting Measures 

CAEP requires every Educator Preparation Provider (EPP) to report and prominently display on its website data relating to the eight Annual Reporting Measures established by CAEP.

At this time, Connecticut legislation explicitly prohibits the linking of any state student-testing database with state educator databases, thereby precluding the use of value-added methodologies for the evaluation of teacher performance based on student achievement.

However, given CAEP standards and federal Title II requirements regarding measures of teaching effectiveness, Connecticut EPPs continue to work with the Connecticut State Department of Education (CSDE) to develop alternative reliable and valid methodologies for measuring EPP program impact on student growth.

The EPP at SCSU uses a variety of measures to demonstrate the effectiveness of our programs in preparing candidates to have an impact on P-12 student learning. We employ both quantitative and qualitative methods, including employer and completer satisfaction surveys, interviews with school leaders, and teacher evaluation data gathered during our graduates' induction phase of teaching. The EPP prepares day-one-ready teachers who apply the knowledge, skills, and dispositions developed in their programs. We continuously assess the effectiveness of our initial EPP programs through active partnerships with several local school districts and regional educational service centers. These partners make up a twenty-person external advisory board, chaired by the executive director of Cooperative Educational Services, that reviews our syllabi, assessments, the breadth and depth of our placements, and survey data reports. Feedback from this board and from the Superintendents' Consortium, chaired by the Dean of the College of Education, helps the EPP continuously improve its programming and stay current with the needs and challenges of the regional school systems where many of our candidates are hired.

 

Teacher Effectiveness Case Studies 

A shared goal of our various teacher preparation programs is to prepare future teachers who will have a positive impact on P-12 student learning. The Council for the Accreditation of Educator Preparation (CAEP) requires EPPs to demonstrate the impact of their program completers on P-12 student learning and development.

Over the past two years, the College of Education at Southern Connecticut State University has conducted case studies of program completers in the field to gain empirical insight into the effectiveness of their teaching and their impact on the learning of their P-12 students.

 

Results of Case Study 1: 2017 - 2018

The EPP conducted its first case study during the 2017-2018 academic year, focusing on the teaching effectiveness of program completers, with particular attention to their classroom application of professional knowledge, skills, and dispositions. The study was qualitative and designed following the tenets of participatory action research. Ethnographic data collection methods and a survey were used to gather data on participants' perceptions of their teaching effectiveness. Within this study, program completers evaluated their own teaching effectiveness and impact on P-12 students. They created individualized goals and plans for improvement and participated in ongoing reflection on their methods and growth. Participants were able to identify areas in need of growth and challenges faced within the school and classroom setting. Overall, this study provided helpful feedback regarding curriculum-based learning to support the success of pre-service teachers and provided rich information for our teacher preparation programs.

Key Findings of the Study

  • Initially, program completers self-identified areas of instructional delivery as needing improvement. Researchers were able to identify common instructional themes among participants, and to monitor the processes that program completers underwent to improve their effectiveness in identified areas.

  • Program completers were able to self-identify areas in need of improvement, suggesting that teacher effectiveness is a work in progress. In fact, regardless of their years of teaching experience, data reflects that all completers can identify areas to improve upon.

  • Through a process of reflection, shared feedback, and strategy, program completers reported improvement in self-identified areas of teacher effectiveness (e.g., checking for understanding, meaningful use of technology).

  • Program completers identified the following areas of preparation as in need of further support: collaboration with paraprofessionals, differentiation/individualization of instruction, assessment, and writing individualized education plans (IEPs).

  • Other themes emerged suggesting that program completers' ratings of teaching effectiveness were affected by current employment factors. These included large class sizes, district mandates, diversity in learners' needs, lack of technology, and limited time to complete job responsibilities.

Results of Case Study 2: 2020

The EPP conducted a second case study in Spring 2020, working with one of our providers of clinical field work, a Regional Educational Service Center (RESC) for the twenty-five school districts in south central Connecticut. Approximately 70 of our teacher education candidates are placed in the RESC schools each semester through either clinical field experience or student teaching placement. This core partner offers a representative sample of our graduates and their impact on P-12 student learning.

This case study consisted of qualitative interviews with school leaders based on their evaluations of our program completers. School leaders completed the SEED evaluation to assess teachers across the following four domains:

  • Domain 1: Classroom Environment, Student Engagement and Commitment to Learning

  • Domain 2: Planning for Active Learning

  • Domain 3: Instruction for Active Learning (teachers implement instruction in order to engage students in rigorous and relevant learning and to promote their curiosity about the world at large)

  • Domain 4: Professional Responsibilities and Teacher Leadership

Qualitative highlights, including areas for continued improvement, are presented by domain below.

 

Domain 1: Classroom Environment, Student Engagement and Commitment to Learning

Across all interviews, school leaders reported that our completers developed strong relationships with students and created welcoming learning environments. Completers were commended for being open and approachable and for putting forth effort to get to know students outside of school. Completers were also observed acting as mentors to students and celebrating their successes. Beyond making all students feel welcome, completers take pride in creating comfortable learning environments.

School leaders identified establishing working relationships with co-teachers as an area of improvement for SCSU completers. Specifically, they noted missed opportunities to manage classrooms through small groups rather than whole-class instruction because completers did not know how to utilize the co-teacher in the room. School leaders believe that, with training on developing and utilizing working relationships with co-teachers in the classroom, completers' classroom management would be even stronger.

Domain 2: Planning for Active Learning

School leaders elaborated on the planning practices SCSU completers engage in on a daily basis within their schools. Whether it be daily lesson planning with mentors or grade level colleagues, or unit planning with department members across the district, all SCSU completers are responsible for creating and using lesson plans within their classrooms.

School leaders commended SCSU completers for being thoughtful and well prepared. Completers take advantage of both collaborative and individual planning time. This is an area of strength for our graduates.

Domain 3: Instruction for Active Learning

SCSU completers were noted for using data throughout their planning practices; however, one school leader stated that teachers could benefit from using data more often.

Schools also noted that while SCSU completers are competent in modifying and differentiating work for students working at lower academic levels, they at times struggle to differentiate for students functioning at higher academic levels.

Domain 4: Professional Responsibilities and Teacher Leadership

Through PowerSchool, program completers communicate regularly with parents, on both a daily and weekly basis. In addition, teachers communicate with parents about individual student progress through a Google Doc log.

In regard to professional growth, all interviewees agreed that SCSU completers work to better themselves. For example, one teacher wrote a grant within their first year that was awarded $10,000. Through this grant, the teacher embraced the school's theme and created connections between STEM and class content.

The employer survey is not administered every year. The EPP is currently revising the survey; moving forward, the new survey will be administered every other year.

  • Employer Satisfaction Survey: 2020

  • Employer Satisfaction Survey: 2015 

The alumni survey is not administered every year. The EPP is currently revising the survey; moving forward, the new survey will be administered every other year.

  • Completers Satisfaction Survey: 2020

  • Completers Satisfaction Survey: 2015 

  • Completion Rate = Completers (graduated students) / (Total Admitted Students - Active Students), as illustrated in the example after this list

  • Graduation Date: Summer 2020 or Earlier
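To make the completion rate formula concrete, the short Python sketch below reproduces two rows of the undergraduate table that follows. The figures come directly from that table; the function name completion_rate is introduced here only for illustration and is not part of any SCSU or CAEP reporting tool.

    def completion_rate(completers: int, admitted: int, active: int) -> float:
        """Completion rate as defined above: completers divided by the
        admitted students who are no longer active in the program."""
        return completers / (admitted - active) * 100

    # AY 13-14 undergraduate cohort: 157 admitted, none still active, 139 completers
    print(f"{completion_rate(139, 157, 0):.2f}%")   # 88.54%

    # AY 18-19 undergraduate cohort: 194 admitted, 74 still active, 93 completers
    print(f"{completion_rate(93, 194, 74):.2f}%")   # 77.50%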

 

Completion Rate for Undergraduate Programs
Cohorts are defined as the total number of students admitted to the CoE (sophomore year) who began their program during an academic year (Fall, Spring, or Summer).

Cohort      Admitted   Active   Withdrew   Completers   Completion Rate
AY 13-14    157        -        18         139          88.54%
AY 14-15    155        -        23         132          85.16%
AY 15-16    144        -        22         122          84.72%
AY 16-17    175        -        22         153          87.43%
AY 17-18    182        6        27         149          84.66%
AY 18-19    194        74       27         93           77.50%
Total       1007       80       139        788          85.01%

 

Completion Rate for Post-Baccalaureate Programs

Cohort      Admitted   Active   Withdrew   Completers   Completion Rate
AY 13-14    94         -        14         80           85.11%
AY 14-15    79         -        9          70           88.61%
AY 15-16    68         -        5          63           92.65%
AY 16-17    114        -        14         100          87.72%
AY 17-18    82         2        16         64           80.00%
AY 18-19    68         18       17         33           66.00%
Total       505        20       75         410          84.54%

 

Completion Rate for Graduate Programs
 

SYC - School Psychology

Cohort      Admitted   Active   Withdrew   Completers   Completion Rate
AY 15-16    8          -        -          8            100.00%
AY 16-17    14         -        -          14           100.00%
AY 17-18    13         -        -          13           100.00%
AY 18-19    12         -        1          11           91.67%
Total       47         -        1          46           97.87%

SYC - Educational Leadership (Administration)

Cohort      Admitted   Active   Withdrew   Completers   Completion Rate
AY 15-16    56         -        12         44           78.57%
AY 16-17    51         1        7          43           86.00%
AY 17-18    90         1        10         79           88.76%
AY 18-19    83         12       5          66           92.96%
Total       280        14       34         232          87.22%

MS - Reading

Cohort      Admitted   Active   Withdrew   Completers   Completion Rate
AY 15-16    23         -        2          21           91.30%
AY 16-17    17         -        1          16           94.12%
AY 17-18    28         4        1          23           95.83%
AY 18-19    14         6        -          8            100.00%
Total       82         10       4          68           94.44%

SYC - Reading

Cohort      Admitted   Active   Withdrew   Completers   Completion Rate
AY 15-16    20         2        5          13           72.22%
AY 16-17    15         -        4          11           73.33%
AY 17-18    14         2        8          4            33.33%
AY 18-19    4          4        -          -            -
Total       53         8        17         28           62.22%

State Licensure Exams for Initial Programs (as reported in Title II)

 

 

Group                              Institution Data                                    Statewide
                                   Number Taking (1)  Number Passing (2)  Pass Rate    Number Taking (1)  Number Passing (2)  Pass Rate
All program completers, 2018-19    237                193                 81%          1227               1079                88%
All program completers, 2017-18    202                185                 92%          1173               1109                95%
All program completers, 2016-17    139                131                 94%          1234               1153                93%

Note: In cases where fewer than ten students take the assessment or license/certificate, the number passing and pass rate are not reported.

  1. Number of completers taking one or more assessments within their area of specialization.

  2. Summary level “Number Taking Assessment” may differ from assessment level “Number Taking Assessment” because each student is counted once at the summary level but may be counted in multiple assessments at the assessment level.

Download Full Report

 

State Licensure Exams for Advanced Programs

The following pass rates are calculated based on the scores of program completers that we received from Pearson and ETS. The State of Connecticut does not provide official reports for the advanced licensure exams.

   

                                 AY 17-18   AY 18-19   AY 19-20
Reading Specialist (008)
  Test Takers                    25         35         16
  Total Passing                  24         33         15
  Pass Rate                      96%        94%        94%

Praxis II 6412 - Administrator
  Test Takers                    -          53         36
  Total Passing                  -          53         36
  Pass Rate                      -          100.00%    100.00%

 

Title II Reports: Federal Reporting