
Program Review & Assessment Committee
Minutes

November 9, 2000

Committee Members Present: C. Coron (chair), C. Durwin, B. Gelbach, W. Shyam, J. Yang    Also Present: R. Gerber, E. O'Sullivan

Recorder: C. Durwin

The meeting convened at 11:00.

The purpose of this meeting was to discuss how the assessment process was working, what improvements could be made, and to discuss the roles of the committee and administration.

Gerber began by stating that his major concern was to discuss how the assessment process is coordinated between the administration and the committee.

He mentioned that the original committee had a much more supervisory role in the assessment process than the current committee, and that faculty should take ownership of this process.

Gerber identified three major issues for discussion:

  1. clarifying the current assessment process
  2. clarifying the role of the committee
  3. determining what the faculty responsibility should be over time

Coron commented that the determination of the committee's role (supervisory or advisory) would be a matter for UCF to decide.

Gelbach also commented that we are long overdue in asking UCF to re-ratify the original assessment standards (and corresponding rubric).

Gelbach then offered some historical perspective of the role of the committee as a means of identifying aspects of the assessment process that needed to be improved:

  • The original committee developed the standards and proposed the rubric.
  • As time went on, the committee thought it was best to serve in the capacity of monitoring the assessment process that programs were undergoing.
  • However, his observation was that the committee took on responsibilities that were beyond its capacity to fulfill.
     

1.  The committee attempted to train departments in how to use the standards to conduct the self-study. However, the committee felt they lacked the resources (time and otherwise) for doing this.

2.  The committee was supposed to be notified at certain points in the assessment process.

  • For example, the committee was supposed to review each program's proposal for self-study (early in the assessment process) to determine if it met the standards. However, for the most part, they haven't received such proposals.
  • The committee was also supposed to see written reports before external examiners came to campus for a site visit. Again, for the most part, they didn't receive such material, making it difficult to review the reports.

3.  The committee was also aware that site visitors sometimes did not receive the criteria for evaluating programs. Evaluation without the criteria prevented all of those involved from benefiting fully from the assessment process.

4.  Departments were supposed to receive a written record from administration on the outcome of the assessment process, and the committee was supposed to be a repository for these memos.

O'Sullivan also added some history on the assessment process from the perspective of the administration's experiences.

  • The intent was to make it a meaningful process for programs and departments. However, 10 years ago when this was first implemented, the process was very threatening to people.
  • Over the past 10 years, administration has progressively been trying to help departments "get where they need to go" in this process.
  • Currently, of those who have undergone the assessment process, some departments have "done nothing," others have "complied only," and some have "made a good start" as the basis for their next assessment. 
  • O'Sullivan emphasized that this has been an evolving process, with administration being supportive of departments, and that at every phase administration has tinkered with implementation in order to make improvements based on experience (e.g., correcting problems regarding site visits by external examiners, ensuring written reports are received at the same time that the site visitor receives them, etc.).
  • O'Sullivan also mentioned the recent pilot implementation of providing a written record of the assessment outcome that would have to be signed by the appropriate Dean and VP.

Gerber then noted his perspective:

  • The problems in implementation of this process have been due to the flux in membership of the committee and the increased load that it was required to handle (20 assessments per academic year). Originally, the assessment process called for a full-time Dean-level position to handle this.
  • He emphasized that written criteria are available and provided to examiners.
  • He agreed that the committee has not been the repository for information.
  • With respect to the written record of the assessment outcome (which is supposed to "close the loop" of the assessment process by providing resources to departments for their future improvements), Gerber also emphasized that administration does in fact have an implementation that they sign off on, but that the resources cannot always be provided instantly (e.g., faculty allocation). However, the resources are always provided in the budget cycle.

One other problem in implementation that Gelbach noted was that this committee differs greatly from the Graduate Council in authority. The Graduate Council has the authority to conduct the assessment and make a binding decision about whether a reviewed program should continue to exist as a graduate program. This committee has no authority to require departments to comply with the assessment in a timely manner.

Gerber disagreed with this notion, saying that it was never the intent of administration to be coercive in the process or judgmental in reviewing the outcome. Rather, the approach was "where do you see yourselves in 5 years and how are you going to get there?" Gerber further acknowledged that this approach has been met with resistance.

Coron responded that administration and the committee have philosophical differences of opinion on the assessment process.

  • We can never eliminate the "threatening" feelings on the part of some of the faculty despite all of our good intentions.
  • Rather, we have to "encourage" faculty to engage in the process despite their intuitions, and we should do this by emphasizing the good that can come out of the assessment process for them, and by facilitating the process.
  • The committee has a mandate to represent the faculty, and the main issue is for UCF to decide how we (the committee) should represent the faculty.

Gerber reiterated the fact that assessment findings have in no way been used against departments or faculty. The findings serve two purposes: (1) information for departments for further growth and improvement; (2) information for the assessment office to serve as a basis for the budget.

Coron suggested that the committee and the assessment office should work to agree on the assessment approach and guidelines so that the two parties can cooperate.

Yang also mentioned that what the new committee is interested in is what is and is not working in the process so that we can work to improve it.

Gerber brought up the fact that if we were going to revise the standards, we now have to keep in mind that NEASC has 11 standards by which they evaluate universities on "institutional effectiveness," and that it would be beneficial if our assessment and NEASC's were aligned so that the process would not have to be duplicated.

Shyam asked Gerber and O'Sullivan if they had any feedback from faculty on the assessment process (regarding what worked and did not work), and how the training workshops were perceived by departments.

O'Sullivan responded that they did a self-study and the results were generally positive.

Durwin asked if the committee could review the data as a basis for determining what aspects of the assessment process needed improvement.

O'Sullivan agreed to present their data at a future meeting, and also suggested that the committee invite some departments who have recently undergone the assessment to share their feedback, since the data is already a few years old and does not reflect more recent improvements by the administration.

Coron then asked if there were plans to increase the assessment office's staff.

Gerber replied that there are no such plans at the moment, but that money is available for release time for doing the self-study, for workshops, and for external examiners.

Gerber acknowledged that it does cost money to provide the assistance to departments. One solution would be to train new chairs in the assessment process during the course of the Orientation for new chairs.

O'Sullivan also suggested that we make it a policy for someone from the assessment office to appear at a department's meeting in order to orient the department to the assessment process and initiate the process.

All participants agreed that another meeting needed to be scheduled for further discussion of solutions as well as sharing of data.

The meeting adjourned at 12:20.