College of Education


Systematic Program Evaluation Summary

Consistent with the 2016 CACREP Standards regarding data-driven program evaluation and modification, the Counselor Education program has designated four key sources of outcome data for consideration. These sources are not to be confused with evaluation of the program’s pedagogy, which is assessed via measurement of student learning outcomes based on course evaluation rubrics.

(Report compiled by Dr. Joshua M. Gold)

These data reflect student performance and feedback after completion of the program and will be used to guide program practice and policy. Each section integrates the data for each program, noting indications of program strength and areas for enhancement, followed, where appropriate, by the intended program modifications. For the purposes of this report, the sources of data include:

  • Counselor Education Program Alumni Surveys sent to graduates of each program (School Counseling, MCFC & Ph.D.) by the Program in the spring of each year;
  • Input from Advisory Boards and employers of program graduates;
  • Input from site supervisors (data collected in the spring term);
  • EdS student performances on the National Counseling Exam (NCE); and
  • EdS student performances on the Praxis School Counseling Exam.

 

Feedback on the EdS Program

For this reporting period, the Program received feedback from 13 out of 13 School Counseling graduates (100%) and 6 out of 8 MCFC graduates (75%).

 

EdS Program Strengths

Based on the quantitative data, a “strength” was a topic on which at least 80% of respondents (5 out of 8) rated the experience as “strongly agree”:

 

MCFC Program

Strengths

  • Item 1.1: academic competence of faculty
  • Item 1.11: faculty responsiveness to student concerns
  • Item 4.2: appropriateness of internship sites

Strengths (written comments)

  • Faculty expertise (4)
  • Cohort model (1)
  • Systemic orientation (1)

Recommendations

  • Directed training on treatment implementations (3)
  • Organization (1)
  • More internship sites (1)
  • Fewer online classes (1)
  • Consider a 3-year program to space out classes (1)

Comments on specific faculty or courses

  • Carlson (3)
  • Gold (3)
  • Phelp’s teaching of EDCE 715: Human Sexuality

Based on the quantitative data, an “area for enhancement” was a topic on which at least 80% of respondents (5/8) rated the experience as “strongly disagree.” Based on the feedback gathered for this reporting period, there were no such topics or areas of concern.

 

School Counseling Program

Based on the quantitative data, a “strength” was a topic on which at least 80% of respondents (10 out of 13) rated the experience as “strongly agree”:

Strengths

  • Item 1.13: The grading system was fair
  • Item 1.15: Ethical standards were continually taught and maintained by faculty and supervisors
  • Item 1.19: I would recommend the USC School Counseling EdS program to others interested in preparation to work as a school counselor

Strengths (written comments)

  • Faculty (7)
  • Preparation for interviews (3)
  • Comprehensiveness of program
  • Quality of clinical supervision
  • Emphasis on evidence-based practice
  • Emphasis on ASCA model and alignment

Recommendations

  • Improve communication between sites and faculty re: course expectations (3)
  • Upgrade and maintain technology in Wardlaw classrooms (2)
  • Simplify site placement process for practicum & internship (2)
  • Include observations of counseling during Year 1

Based on the quantitative data, an “area for enhancement” was a topic on which at least 80% of respondents (10/13) rated the experience as “strongly disagree.” Based on the feedback gathered for this reporting period, there were no such topics or areas of concern.

 

Advisory Board Input

MCFC Advisory Board: April 17, 2017

Present:
Community Representatives & Program Graduate Employers

  • LRADAC: N. Deems
  • Palmetto Health Counseling: D. Garnett
  • USC Counseling Center: B. Sheridan
  • Family Intervention Services: G. Lindsey & J. Wilson

USC Representatives

  • R. Carlson
  • R. Haber
  • J. Gold

Identified areas of program strength

  • Professionalism on the part of students
  • Students highly motivated
  • Students self-starting
  • Excellent skills in joining with clients
  • Communication system with program representatives

Identified areas for program growth

  • Additional training in clinical intentionality and treatment planning
  • Specialized training relevant to client populations
  • Effective use of supervision and observation of live sessions
  • How to handle an impaired intern

 

School Counseling Advisory Board

This data was not available at the time of the preparation of this report.

 

EdS student performances on the National Counseling Exam

This data is based on the reports of student performances on the fall 2016 and spring 2017 National Counseling Exam. The data reported here reflect 8 MCFC students and 1 School Counseling student who sat for the exam in spring 2017. All 8 MCFC students (100%) passed the exam, with an average score of 111.30 as compared to the minimum criterion of 94. The School Counseling student did not pass the exam, scoring 88 as compared to the minimum criterion of 94. Given that all EdS students take the same courses specific to the NCE requirements, no interpretations toward program amendments can be made from this data, as overall, 8 of the 9 students (89%) were successful.

 

Performances of 8 MCFC Students, Spring 2017

CACREP Area                                   Average Score (% correct)   Items
Human Growth & Development                    8.5 (71%)                   12
Social & Cultural Diversity                   7.5 (68%)                   11
Helping Relationships                         23.75 (66%)                 36
Group Work                                    12.50 (78%)                 16
Career Development                            13.50 (68%)                 20
Assessment                                    13.38 (67%)                 20
Research & Program Evaluation                 10.50 (66%)                 16
Professional Orientation & Ethical Practice   21.63 (75%)                 29
Total                                         111.30 (70%)                160

 

It can be surmised that the April 2017 USC results confirm the efficacy and effectiveness of the program’s pedagogy and learning experiences across all 8 CACREP core curricular areas. With a national passing mark of 94 out of 160 items (approximately 59%), a comparison between that cut-off score and core-area performances can indicate any specific content area in which student performances did not meet or exceed that mark. Based on this data, student performance in each category exceeded the 59% minimum criterion, which supports the efficacy of the current pedagogical practices specific to this curriculum.

 

EdS student performances on the Praxis School Counseling Exam

The source for this data is the office of the Graduate Director, College of Education. The data reported here reflect 12 students who sat for the exam in spring 2017. All 12 students (100%) passed the exam, with an average score of 172 as compared to the minimum criterion of 156. Once again, these scores can be viewed as confirmation of the efficacy and effectiveness of the program’s pedagogy and learning experiences for the School Counseling students.

 

Feedback on the PhD Program

Of the 6 students who completed the PhD program, feedback was received from 4 (67%). Given the small number of potential respondents and the even smaller number of completed surveys, program strengths are noted as those items that received a unanimous (100%) rating of “strongly agree,” and areas for program growth are determined likewise (a unanimous rating of “strongly disagree”).

Strengths

  • Item 1.8: Faculty were highly competent in the supervision process
  • Item 1.11: In general, faculty members were well prepared for class
  • Item 1.15: Ethical standards were continually taught and maintained by faculty and supervisors
  • Item 1.17: The entire program of academic and clinical education provided a solid foundation for a professional career in private practice, higher education, agency and government settings
  • Item 2.4: Curricular experiences in the area of multicultural counseling
  • Item 2.5: Curricular experiences in the area of theory and practice of counselor education
  • Item 2.7: Curricular experiences in the area of ethical/legal issues in counselor education
  • Item 3.2: Appropriateness of practicum site to fulfill course requirements
  • Item 3.3: Quality of practicum supervision offered by faculty members

 

Identified areas for program growth
No items were endorsed with a frequency that would allow them to be noted in this section.

 

Written comments

Program strengths

  • Supportive faculty (3)
  • Faculty inviting students to collaborate on scholarly activities (3)

Areas for modification

  • Better funding for doctoral full-time study (2)
  • Better funding for participation in scholarly presentations etc. (2)
  • Question need and purpose of doctoral cognate (1)

The complete data sets from which this report was summarized are available in Room 266, Wardlaw College. The documents are in a folder entitled “Program Evaluation Summaries” and are collated by year.