Part V: The Impact Report of the Quality Enhancement Plan
Plan title and brief description of plan elements as originally presented
Lamar State College-Orange's (LSC-O) Quality Enhancement Plan (QEP) is entitled Lamar State College-Orange Advances Critical Thinking Skills (LSC-O ACTS). The original plan goal was simple to express: student critical thinking will improve. This goal did not change during the various stages of implementation, although the objectives and strategies for accomplishing it did change over time. The plan goal was to be accomplished through the achievement of three objectives:
- full-time students will improve their critical thinking skills over a two-year period;
- graduates will report an improvement in their critical thinking skills; and
- employers of LSC-O graduates will report an improvement in critical thinking skills.
Initially the plan called for all faculty and staff to be involved in designing and facilitating instructional and extracurricular activities that would promote or advance student critical thinking skills; enhanced instructional activities would serve as the principal strategy for accomplishing all three objectives. For the first objective, assessment measures would include the Watson-Glaser test, developmental course success rates, student satisfaction survey results, and CCSSE results. To assess the second plan objective, the primary measure would be the graduate follow-up survey conducted by the Office of Institutional Research. Assessment of the third objective would be accomplished through the employer follow-up survey. Appropriate benchmarks were established and approved for each assessment measure.
Plan leadership structure as originally established
The plan's two co-directors, one faculty member and one staff member, would assume direct leadership functions and would work closely with a Critical Thinking Resource Team, whose members would derive from all divisions--instructional, support, and administrative. The co-directors would report to the President, work closely with the institution's Chief Academic Officer, and serve as members of a QEP Oversight Committee, a subcommittee of the Campus Planning Committee (whose members include all department heads as well as alumni and local civic and industry representatives).
QEP submission, approval, and implementation
The plan was submitted to the SACS Commission on Colleges in fall 2004, at the same time it was implemented, and was formally approved the following January (2005) with no recommendations and three suggestions. One of the suggestions had been voiced by the site visit team at the time: the team believed that the plan as submitted was too ambitious, specifically in the extent to which it proposed to involve the college's non-instructional staff members in the advancement of student critical thinking. The plan's leadership took this suggestion into account during the administrative preparation and piloting phases, making the enhancement of classroom instruction the plan's primary emphasis during implementation, with the involvement of non-instructional units to be effected as opportunities presented. The QEP Steering Committee, convinced that staff represented a significant informational and creative resource, retained the co-director structure and complement. The other two suggestions (see page 3) were addressed in a monitoring report submitted in April 2006.
Implementation of the plan was to occur in three phases: an administrative preparation/dissemination phase, a piloting phase, and an integration phase during which the elements of the plan would become embedded in standard operating practices. The first phase was initiated in fall 2004, after the plan had been approved by the Texas State University System Board of Regents but prior to being reviewed by the Commission on Colleges.
Administrative preparation/dissemination phase operations
The most important aspect of this phase was the creation of an apparatus to assist faculty in developing enhanced classroom instructional methods or activities. To this end, the Steering Committee first developed a broad view of the plan's function. Faculty would design enhancement projects or activities which their students would complete, and the institution (rather than individual faculty members) would assess the work with respect to critical thinking. Information from the assessment would feed back to the instructors, who would use it formatively to improve student learning by informing project design and instructional enhancement. It was necessary to create standards to govern project design and a means to assess student performance. This work involved the convening of the Critical Thinking Resource Team (CTRT).
Creating a critical thinking rubric to assess student performance
First the co-directors and the members of the CTRT created a rubric for evaluating student performance on critical thinking projects. The components of this rubric corresponded closely to the levels of Bloom's taxonomy in the cognitive domain and described student performance in general terms with respect to each element, criterion, or intellectual competency on a 1-4 scale, with 4 representing high-level thought and 2.65 defined as the minimum standard for student success. This rubric was disseminated to faculty and staff and discussed in instructional meetings involving all full-time faculty members.
Creating student learning outcomes
Student learning outcomes were directly identified with the elements of the rubric: identification, exploration, inference, analysis, explanation, and reflection (synthesis was added in Fall 2008). Identification of student learning outcomes with the various aspects of cognitive function provided sufficient distinction to facilitate the creation and assessment of learning activities.
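As an illustration of how element-level rubric scores relate to the 2.65 success threshold, a single student's scores might be tabulated as follows. The scores below are hypothetical, and the simple-average aggregation is an assumption, since the plan as described here does not specify the exact tabulation method:

```python
# Hypothetical element scores on the rubric's 1-4 scale; the 2.65
# success threshold comes from the plan, but the simple average is
# an assumed aggregation method.
element_scores = {
    "identification": 3,
    "exploration": 3,
    "inference": 2,
    "analysis": 3,
    "explanation": 2,
    "reflection": 3,
}

average = sum(element_scores.values()) / len(element_scores)
meets_standard = average >= 2.65  # plan-defined minimum for success
print(f"average: {average:.2f}, meets standard: {meets_standard}")
```

With these illustrative scores the average of 2.67 just clears the 2.65 minimum.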
Creating instruments and means to facilitate project design
Simultaneously, the co-directors were preparing an instructional form for creating descriptions of enhanced assignments and drawing up a calendar according to which faculty would submit design descriptions of the critical-thinking-enhanced activities they would integrate into their course curricula. To support those instructional enhancement and course redesign efforts, a Critical Thinking Resource Center was established in a designated area of the stacks in the Ron E. Lewis Library; the library staff solicited faculty and staff input on the range of acquisitions necessary to enhance the collection sufficiently to support the initiative. On-campus training was conducted at which full-time faculty attendance was mandatory. This phase of the plan was complete by December 2004.
Piloting phase operations
The administrative preparation/dissemination and piloting phases overlapped, occurring over the course of the next five semesters, during which time 88% of the full-time faculty and 10% of the adjunct faculty participated in the plan, submitting project descriptions conforming to plan criteria, completing projects in class, and submitting student work in compliance with a schedule developed by the plan co-directors and approved by the college administration.
Each semester, faculty scheduled to participate received a brief orientation from the co-directors covering the forms and their function, as well as communication practices, logistics, duties, and deadlines. Each faculty member would then create a project consistent with the plan criteria and submit it by an established deadline. The CTRT would review the projects and make any necessary suggestions for revision, after which the directors would return the project descriptions to the faculty, following up individually as needed to ensure that each project met the plan's specifications. Each faculty member would then complete the project, collect the student work product, and submit it (or duplicates of it) to the co-directors for double-blind assessment by the members of the CTRT and the co-directors, who would determine whether, and the extent to which, each student's work product met the critical thinking criteria described in the rubric. This information would be processed and reported to the faculty members (usually early in the ensuing semester) and, in the aggregate, to the college administration. The tabulated information would also be incorporated into the plan's assessment system.
Means to facilitate plan elements
A newsletter published to the campus kept faculty and staff apprised of the plan's progress, communicated information about upcoming development opportunities, and offered tips on best practices. Other information, more tips on best practices, a concise handbook of critical thinking, and various miscellaneous teaching resources were also distributed to faculty and staff at convocations and via the campus's intranet platform, while the library staff continued to grow its collection of resources for enhancing critical thinking instruction and curriculum.
Commission review and suggestions
While the practices and procedures described above did not vary significantly from Fall 2005 to Spring 2007 (four full terms), and while neither the goal nor the practical emphasis of the plan changed during that time, the objectives whereby LSC-O planned to achieve that goal did change, in part as a result of three suggestions made in a Commission on Colleges review received in Spring 2006. These suggestions, made subsequent to the site visit, noted that the plan's administrative leadership needed to:
- identify specific student learning outcomes appropriate to the critical thinking skills;
- revise the plan goals and objectives to be more directly related to the improvement of critical thinking skills; and
- revise the assessment plan to provide more direct evidence of improvement in critical thinking skills rather than relying on surveys of opinions and perceptions.
2006 Monitoring Report
In April 2006, LSC-O submitted a monitoring report to the SACS Commission on Colleges detailing the institution's response to the Commission's suggestions for changes to the plan as originally submitted. One of the suggestions had already been addressed: the operational elements of the plan would focus on enhanced classroom instruction, deemphasizing extracurricular activities. The other two suggestions called for a revision of the plan objectives to be more directly related to the improvement of critical thinking skills, and a revision of the assessment plan to provide more direct evidence of improvement in critical thinking skills rather than relying on indirect measures such as surveys of opinions and perceptions.
To this end, the objectives were rewritten in Spring 2006. LSC-O ACTS would:
- improve student critical thinking skills over a two-year period;
- integrate augmented teaching methods and/or activities into every course; and
- improve student engagement through critical thinking.
Under the modified objectives, faculty would still create enhanced instructional activities as the primary strategy for accomplishing the plan's first objective. To facilitate the second objective, integrating critical thinking activities or teaching methods into every course, the new strategies included the provision of specialized training, professional development, incentives to promote innovative instructional design, the collection of specialized resources, and leadership in revising program- and course-level student learning outcomes to reflect the emphasis on enhancing critical thinking skills. Objective three, involving student engagement, called for staff development focusing on the design of student learning activities offered outside the classroom and on the regular scheduling of student activities to directly support or enhance student critical thinking. A variety of internal and external direct and indirect measures were correlated to the strategies for achieving each objective, but survey results were used as a measure for only one objective, and appropriate benchmarks were established for each. This phase was completed by the end of the Spring 2007 semester.
More changes to the plan occurred in 2007 and beyond. A leadership change was necessary because both of the original co-directors had left the institution in Spring 2007. Since the plan seemed to have acquired momentum over its three-year lifetime to that point, the President and his executive staff decided to replace the co-directors with a single director, a faculty member, 40% of whose load was devoted to administering LSC-O ACTS. As part of a strategy for plan assessment and possible revision, the director conducted a series of informal interviews from August to November 2007 to survey faculty and staff attitudes and perspectives on the LSC-O ACTS plan. Twelve FTE faculty (about 25% of the institutional complement) and two long-standing members of the CTRT (25%) participated in the focus groups and interviews.
At a meeting on November 7, 2007 attended by the institution's SACS liaison, the Director of Institutional Effectiveness, the LSC-O ACTS director, and the CTRT, the following suggestions for changes were presented, discussed, and approved:
Critical Thinking Resource Team membership, structure, and role
The director proposed expanding the Critical Thinking Resource Team by adding more faculty, to a ratio of eight faculty to four staff, on the supposition that new members would bring new ideas and, over time, broaden the campus's base of expertise in teaching and assessing critical thinking. Attendees also discussed rotating membership to ensure that the team retained adequate experience to mentor new members. Appointment to the team might be coordinated through division chairs.
Attendees discussed the feasibility of forming several action teams that could do specialized work of a type to be determined and assigned by the full team, as opposed to having the whole team address every issue brought before it.
Over time the role of the CTRT had come to be limited to the approval of projects and the assessment of student performance. Discussions centered on shifting the team's function to faculty training and development rather than project assessment, and on making designated action teams responsible for performing such work. Examples of action team work discussed included helping or mentoring faculty in designing projects and means of assessment, and designing or investigating development opportunities for team members or for faculty and staff (for example, workshops on how to teach others to design a project). Action teams could meet more frequently than the main group, which would meet as needed, usually two to four times per semester.
Suggestions for changes in critical thinking projects
The group agreed that project scope should be narrower than in the past (originally every project was required to include every element of critical thinking addressed by the rubric). The group recommended simpler projects, or perhaps multiple projects in a given course, each addressing different elements of critical thinking. The consensus was also that the plan should no longer require each course's project complement to include all critical thinking elements, but rather that projects should address between two and four elements. Further discussion concluded that complete integration of LSC-O ACTS would require that a suitable project be seamlessly incorporated into every section of every course, every semester.
Given the attendees' agreement on the preceding point, the methodology for assessing student performance would necessarily change: each faculty member would assess his or her own projects and students. The resource team projected that within a few semesters it would develop a process for providing oversight by reviewing sample project assessments to confirm that assessments were not inflated. The team would also verify that a concrete means of assessment was built into each project during its design, and the critical thinking activities would be clearly identified in course syllabi.
At the January 2008 faculty convocation, these proposed changes were described to faculty and endorsed by the institution's executive leadership. Two workshops were also conducted during faculty convocation for reviewing the plan's elements with veteran faculty and introducing them to new and adjunct faculty.
Adaptations during implementation
Further modifications to strategies for accomplishing plan objectives
While the plan goal and objectives have remained unchanged from Spring 2007 to the present, the strategies for accomplishing those objectives, along with the associated assessment, have evolved as a result of ongoing assessment. To accomplish the first objective, new strategies were proposed for implementation in Fall 2007. In addition to using enhanced instruction, faculty and administration would revise program and course outcomes to reflect the inclusion of critical thinking as a core competency in every program, and curricular modifications would be facilitated by adding the LSC-O ACTS director to the curriculum committee as a non-voting member. A new strategy for accomplishing the second objective involved changing the New Student Orientations to emphasize student understanding of the teaching/learning process in general and the campus focus on critical thinking across the disciplines in particular. Additionally, more emphasis would be placed during the annual service unit planning cycle on tying student development activities to the critical thinking initiative, and campus study skills seminars would be used to emphasize critical thinking as a resource for college success. To accomplish objective three, a policy review and revision would be conducted, with changes made to the institution's syllabus template, to the contents of the syllabi themselves, and to the faculty evaluation instrument. Each syllabus would note the plan goal and the corresponding critical thinking activities or assignments embedded in the course, and the faculty evaluation instrument would be revised to reflect the extent to which each faculty member met the requirements of the plan. Additionally, each instructor would be required to design and implement one new critical thinking activity per section and course each semester until every section of every course included an activity or assignment meeting the plan requirements.
New projects would not require approval, which streamlined the design process, but the CTRT recommended quality control measures to be implemented in the Fall 2008 semester. In terms of faculty training and development, annual QEP workshops would be offered and opportunities would be made available for faculty and staff to engage in intensive off-campus training or development at conferences or workshops dedicated to teaching or assessing critical thinking.
Participant compliance: resources, processes, instruments, and quality assurance
The emphasis in Spring 2008 was on educating new faculty members and reeducating veteran faculty members on the elements of the plan, and on requiring everyone to use the modified design specifications approved in Fall 2007 to create a new or updated critical thinking project for a class to be offered in Fall 2008. Changes to LSC-O ACTS subsequent to January 2008 have been practical rather than conceptual, focusing on methods of assuring a high level of quality in the design of critical thinking activities, on developing strategies faculty can use to assess student performance with respect to critical thinking, on developing non-burdensome ways to report the results of student performance assessment to plan leadership, and on determining appropriate means of validating those assessments. The CTRT formed action teams responsible for achieving these ends: promoting faculty and staff development specifically in critical thinking instruction and assessment; creating, revising, and editing the informational, design, reporting, and assessment documents; helping faculty design learning activities that effectively promote high-level critical thinking and assessment measures that are specific and objectively quantifiable; and implementing double-blind reading processes to assure quality in design and in student performance assessment.
Data on results of changes made to the plan as originally presented
The results of these changes were encouraging. By the end of Spring 2008, the CTRT had become a true resource for faculty rather than a resource for administration and assessment. Data for Spring 2008 revealed that 56% of the syllabi had been modified to meet the minimum criteria and that 57% of the FTE faculty had participated in the plan by offering enhanced critical thinking instruction that semester. Although the implementation strategy for Spring 2008 called for faculty to design projects according to the new criteria, faculty who had participated in the pilot phase and who already had projects conforming to the preexisting criteria conducted those enhanced activities. Partial assessment results, derived from 34 sections of 9 courses in three areas of the core curriculum, indicated that the projects for which data were available did promote high-level critical thinking (the institutional average was 2.71, against the threshold for success defined from the plan's inception as 2.65). The faculty evaluation instrument was modified according to the needs of the plan, and faculty were cooperative and even enthusiastic in implementing the graduated process for adding suitable activities and assessments to each course and section. To facilitate further resource collection development, a collection development librarian was added to the CTRT. The college's student services division created and promoted critical-thinking-specific development activities, workshops, opportunities to attend conferences, and lecture series selections that advanced the plan's goal. Although sequential course success did not meet the 5% benchmark, the success rate did improve, and transfer student success rates also showed improvement (from 2.48 in 2006-07 to 2.87 in 2007-08).
The Fall 2008 semester brought Hurricane Ike, which devastated the area, rendering all but one building on the LSC-O campus unusable and closing the campus for almost a month. Even so, once the campus reopened the CTRT met before classes had resumed, and the action teams received their charges.
CTRT action team operations since Spring 2008
In response to faculty requests from the previous year, the action team devoted to designing and editing documents and forms reviewed and simplified the language of the scoring rubric and added examples. It also created two forms for faculty to use in reporting the results of their assessments of student performance (later consolidated into a single form with an embedded spreadsheet). The faculty development team planned a series of ten project design workshops, offered at a variety of times to fit faculty schedules; by the end of the academic year, 73% of the full-time faculty had completed and reported results for at least one course and 44% had added new projects. Project design quality control measures recommended by the CTRT in Fall 2007 were enacted: 55 projects were evaluated in a double-blind reading by members of the resource team, with only three projects failing to meet the standards set by the plan's leadership (those projects were returned to the faculty members with explicit suggestions for modification). The project design team made mentors available to faculty who needed assistance in designing activities and assessments, and student performance on the critical thinking projects (from the plan's inception the most important piece of information) averaged 2.96 on a 4-point scale for Fall 2008 and 3.07 for Spring 2009.
In Spring 2009, the action teams, working in response to faculty input, made more changes. The development team distributed both a brief and a comprehensive bibliography of discipline-specific critical thinking instructional resources available through the library and crafted a set of recommendations for ensuring and improving training and development, particularly for adjunct faculty. The document review team made further modifications to the forms and made simplified project design resources available via the campus LAN. Perhaps most importantly, this team also added a self-assessment section to the reporting form used by faculty to report the results of student performance assessment for each section of each course; the added section asks faculty to incorporate the results of student performance assessment into an evaluation of their own projects and to enumerate and explain any data-driven improvements they would make in ensuing projects or upgrades.
Process for validating student performance assessments
Since the scores describing student performance were determined and reported by the same faculty members who designed and administered the projects, the CTRT was concerned about quality control (primarily instructor inflation of student performance assessments) and recommended that an ad hoc team be formed to create a monitoring apparatus. This was accomplished in 2009-10 and was the last major change enacted in the plan's integration.
For this exercise, conducted on May 14, 2010, four to six samples of ungraded student work generated in completion of a critical thinking project were collected from each of eight faculty members (a total of 36 projects). The project description forms, each of which included a description of the assessment criteria (a brief rubric), were attached. Each sample was read double-blind by two faculty members. One member of each reading pair was a CTRT member; the other was a faculty member whose assessments were slated to be evaluated (using this same process) in the ensuing semester and who had been invited to participate. Each pair read the project descriptions and then reviewed the four to six samples of the students' work product. Each reviewing faculty member then assessed each project against the assessment criteria described in the project description forms, using a 1-to-4 scale, and the results were compared to the scores issued by the course instructors. Variances of 2 or more points were noted. In only 6.6% of the assessments did the course instructor score the student work more than one point higher than one of the two reviewers, while in 8.4% of the assessments one or both reviewers scored the sample higher than the instructor had. Overall, 87% of the reviewer assessments fell within the one-point variance regarded as normal. These results led the CTRT members interpreting them to cautiously regard faculty assessments as valid.
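The variance tabulation described above can be sketched as follows. The sample records and the exact categorization logic are illustrative assumptions, not the actual May 2010 data:

```python
# Each record pairs the course instructor's score with the two
# reviewers' scores for one work sample, all on the 1-4 scale.
# The records are hypothetical.
samples = [
    (3, 3, 2),  # within the normal one-point variance
    (4, 2, 3),  # instructor more than one point above a reviewer
    (2, 3, 2),  # a reviewer scored the sample higher
    (3, 3, 3),  # exact agreement
]

inflated = reviewer_higher = within_variance = 0
for instructor, rev_a, rev_b in samples:
    if any(instructor - r > 1 for r in (rev_a, rev_b)):
        inflated += 1          # possible grade inflation
    elif any(r > instructor for r in (rev_a, rev_b)):
        reviewer_higher += 1   # reviewers more generous
    else:
        within_variance += 1   # agreement within one point

n = len(samples)
print(f"inflated: {inflated/n:.1%}, reviewers higher: {reviewer_higher/n:.1%}, "
      f"within variance: {within_variance/n:.1%}")
```

A tabulation of this kind yields the percentages reported above (6.6%, 8.4%, and 87%) when run over the full set of paired assessments.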
Institutionalization of the plan elements
In the Fall 2009 semester, the plan had been in process for five years, and the institution's executive leadership scheduled a review. The plan's assessment system had evolved to employ multiple measures in determining the extent to which the goal and objectives had been realized, and the results of this review, conducted early in 2010, were generally positive.
Results of the review
Faculty had been mobilized and educated about critical thinking in general, and about facilitating the achievement of course-level student learning outcomes for critical thinking in particular. Every LSC-O faculty member had come to know what a learning outcome was, could define critical thinking in precise and concrete terms, and had a good grasp of what constitutes higher-order critical thinking. Every LSC-O faculty member and most staff members could not only explain what a rubric is and how to use one, but could make one for assessing a specific critical thinking learning outcome. Many courses had been enhanced with engineered critical thinking activities, and many instructors had improved their teaching as a result of training, knowledge, and experience gained through participating in the plan. Syllabi, courses, and programs had been reviewed, and unit-level, course-level, and program-level outcomes revised to reflect the importance of critical thinking as a necessary competency in today's graduate. Although the results of the Watson-Glaser test administered in the Spring 2010 semester showed a nine-point decrease in the percentage of students meeting the benchmark for critical thinking (74% in 2010 vs. 83% in 2007), the numbers were still three percentage points higher than the 2006 scores, and 74% success is a respectable performance. The latest Transfer Student Study results (from 2007, two years into the plan) show not only an increase of 0.41 grade points from 2005 but also that LSC-O transfer students outperformed native Lamar University students, 2.87 to 2.70. Finally, the aggregate Sequential Course Study information shows an increase in success rates from 82% in 2006 to 97% in 2008 (the most recent available results).
Migrating the plan elements into existing curricula and administrative units
The plan's original projection was that any specialized apparatus created to administer the plan would become unnecessary once the plan had become a part of the college's culture, with possibly some specialized tools, processes, or forms being retained and put to a different use. In consequence of the institutional review, the executive leadership agreed that the plan was complete and already largely integrated into the college's curricula and culture. In April 2010, an action team of the CTRT formulated a set of recommendations for institutionalizing LSC-O ACTS. In sum: the curricular revisions and program-level learning outcomes made in furtherance of the objectives of LSC-O ACTS will be retained; faculty will continue to complete critical thinking projects every semester, but the burden of reporting assessment results will be reduced through a rotation schedule, with each faculty member reporting results and submitting sample work for an assessment review once every three years; the Critical Thinking Resource Team will continue to serve as an informational and mentoring resource for new and adjunct instructors; annual faculty evaluations and regularly scheduled program reviews will ensure continued completion and periodic upgrading of projects and means of assessment; and any remaining administrative or operational functions will be addressed by existing administrative structures, such as the Division Directors, the Curriculum Committee, or the Campus Planning Council.
Lessons learned
- Gaps in the data collected and weaknesses in assessment strategy suggest that the Office of Institutional Effectiveness should have been involved more closely with the plan from its inception, and it will be for our next QEP, whatever the focus.
- Communication via memos and newsletters proved less effective than we had hoped, recipients mentioning things like "memo fatigue" in feedback sessions. We will use audio and video files more effectively in conjunction with more traditional resources to facilitate communication around campus-wide initiatives and projects henceforth.
- Effective faculty and staff leaders are essential to the process of promoting change, and leadership is a skill in which people can be trained; to ensure a leadership pool deep enough to supply our campus's needs, we will make conscientious efforts to enhance leadership development among all campus constituencies.
- Faculty and staff at LSC-O are cooperative in spirit but not always in actual fact, so we will consider from the outset the best way to tie faculty and staff compliance with initiatives to annual performance reviews.
- The process for selecting the QEP and the leadership structure proved to be effective and efficient, and provides a viable process for selecting the next QEP.
Evaluating the plan's impact
The goal of Lamar State College-Orange's Quality Enhancement Plan was to improve students' critical thinking. Considerable effort, thought, time, money, and human resources were devoted to enacting the plan and evaluating its effects. Most of the measures established by the plan support the interpretation that student critical thinking did improve, although admittedly the difficulty involved in determining the effects of independent variables on the learning and practice of critical thinking is considerable, and the impact of different factors problematic to quantify at best. Nevertheless, LSC-O can say with confidence that the conscientious, methodical and careful work that faculty, staff, and administration devoted to the execution of the plan yielded positive results for the students and for the institution.
For the students, LSC-O ACTS provided direct instruction in the elements and practice of critical thinking; multiplied the opportunities afforded them for practicing critical thinking in a variety of disciplines; and over the terms of their study at LSC-O provided specific, concrete, quantified feedback on their critical thinking performance using an array of assessment criteria.
For faculty, participation in the plan helped them see their own teaching and disciplines from a researcher's or instructional engineer's perspective, put them in the position of consciously thinking about and assessing their own teaching methods and strategies, acquainted them with a variety of discipline-specific resources for improving student learning, and gave many an opportunity to develop professionally. Membership on the various resource teams provided opportunities for faculty members to learn and practice leadership skills in pursuit of an important goal and to learn about assessment as a practical exercise in improving curriculum and instruction.
For the institution at large, LSC-O ACTS sparked campus-wide discussions about academic rigor that led to the establishment of a Committee for Academic Excellence, whose full benefits are yet to be explored. Participation in the plan has educated the entire faculty about assessment and curriculum design, at least at the course level, and thus helped prepare them to contribute to the general education assessment initiative the campus is embracing. It helped create partnerships between faculty and staff, especially among those who worked together on the CTRT, and it has left many of the faculty and staff at LSC-O engaged in, even excited about, exploring ways to improve student learning.
Overall, the Quality Enhancement Plan has been successful, conferring an array of benefits on all the members of the LSC-O community of learners, thinkers, contributors, and leaders.
To view supporting documents, please refer to the Document Index.
- QEP Monitoring Report
- QEP Progress Report