Faculty Projects

 

Revision of the Objective Portion of the Writing Skills Test (WST)

Gloria Collins

The Proposal:

My project will produce new, homegrown SJSU versions of the objective portion of the Writing Skills Test (WST). The objective (or multiple-choice) portion contains questions about grammar and rhetorical choices (editing). SJSU currently pays ACT $13.10 per exam, which covers all materials and scoring. By creating our own versions of the objective test, SJSU will save money and control the content of the test, which we can customize for our student population.

The SJSU community and I have these questions about the objective test from ACT:

  • What is the objective part of the WST testing? Rhetoric? Grammar?
  • How does the objective test relate to our composition courses?
  • Are the items on the test appropriate for our demographics?
  • Can the scores be reported in a way that makes more sense to students?
  • Must we have 72 items in 45 minutes of test time?

The Road Ahead:

In order to create an objective test for junior-level students, I will need to do the following:

  • interview instructors who teach LLD 1, LLD 2, English 1A and 1B.
  • study the course objectives for those courses.
  • interview Derrick Koh in Testing for his advice and wisdom.
  • study the Barron’s Guide for CSU Writing Proficiency Exams.
  • consider data like WST pass rates.

Devising an exam is complex, and I will have to ask questions that we cannot currently answer. What are the university's standards for writing overall? How literate should a graduate of SJSU be? How do we measure that literacy?

On the more technical front, how would SJSU administer its own objective test? We would need to devise answer forms, run data analysis to confirm that the questions are valid, develop a score-reporting system, and maintain test security.

Derrick Koh (from the SJSU Testing Office) mentioned important factors I had never considered before. ACT and ETS do not want students to figure out which questions they got right or wrong; therefore, testing agencies never reveal raw scores. If scores were reported too transparently, students could piece together the right and wrong answers, create an "answer guide" for the test, and make that information public. But currently, the score from ACT means nothing to students. If a student earns a 60 on the test, that does not mean they got 60 items right out of 72: each question is weighted differently, and the test is curved for the group that took it on that particular test day. The score report begins at 50, giving some students the false impression that they got 50 of 72 answers right, when they might have gotten ten items right. I suggest that scores be reported on a 100% scale, so students have more meaningful data.
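As an illustration of what a more transparent report could look like, here is a minimal sketch in Python of percent-scale reporting. The item total of 72 reflects the current test length; the function name and sample numbers are hypothetical, and the sketch does not reproduce ACT's weighting or curving, which is not public.

  TOTAL_ITEMS = 72  # current number of objective items on the WST

  def percent_score(items_correct, total_items=TOTAL_ITEMS):
      # Report the score as a simple percentage of items answered correctly.
      return round(100.0 * items_correct / total_items, 1)

  # Example: a student who answers 48 of 72 items correctly sees 66.7 (percent),
  # rather than an opaque scaled score that begins at 50.
  print(percent_score(48))  # 66.7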

The Advantages of a Custom-made Test:

In the long run, the university would save money because the development work would already have been done. Right now ACT charges us $13.10 per test. With an in-house test, we would not have to mail test materials back and forth out of state, wait for mail deliveries, or check that nothing was missing.

Like a custom-made suit or shirt, an in-house test would spare us the bad fit of off-the-rack merchandise. Questions can be tailor-made to reflect the learning objectives of our composition and writing courses.

Instructors, administrators, and even students can weigh in during the development phase. Over time, the test can be changed or adjusted to fit our academic needs. We can report the scores in a way that makes more sense to students and faculty.

The Devil’s Advocate Speaks:

If the essays are scored with a discrepancy rate of half a percent or less, why do we need the objective part? This is a question that could launch a thousand Writing Requirement Committee meetings, and we have discussed the possibility in the recent past. But here are some reasons we might want to keep the multiple-choice part of the WST. What happens if students earn a total holistic score of six or seven on the WST? Currently, students with those two scores can pass if they earn an appropriate objective score. Without an objective test, students who once passed with those lower scores would now fail, increasing the overall fail rate by approximately 20% per test administration. How do we help these additional failing students? Do they crowd into English and LLD 100A courses? Take the test again? Go to community colleges for more writing courses?

Instructors, the WRC, and all concerned parties must talk about the standards and expectations we want to have for our students and how we will coordinate these expectations across the university.

 

Assessing Students' Writing

Dr. Martin Leach

Goal:

The goal is to improve the writing skills of students at both the graduate and undergraduate levels.

Objectives:

To meet the goal, the objectives of this proposal address students' skills at two levels: the surface level and the meaning level. The surface level includes grammar, spelling, and punctuation. The meaning level includes clarity and coherence, encompassing sentence construction, paragraph building, and the organization of entire documents.

Background:

There are two classes that will be included in this study, one for graduate students and one for undergraduate students. The graduate class is METR-202, Research Methods, and the undergraduate class is METR-100W, Writing Workshop: Meteorological Reports.

Method:

The Research Methods class is divided into five three-week sessions. Different professors will conduct the sessions, covering their individual areas of expertise. At the end of each session the students will be responsible for a literature review, typically a 5- to 8-page paper. Students will be asked to hand in two copies of their papers; one will go to the individual professor and one will go to me. The professors' focus in grading the papers will be primarily on the meaning level, i.e., how well the students grasped the science topic and how well they assimilated the background material. I do expect some professors to comment on surface-level errors as well. My focus will be on both the surface and meaning levels. I will track individual students' progress through the semester, quantifying the number of surface-level errors and assessing the meaning level. Students will receive comments from both the professor and me on each paper, and they will have those comments before they write the next paper.
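To make the tracking concrete, here is a minimal sketch in Python of how surface-level error counts might be recorded per student and per paper. The error categories, paper names, and per-page normalization are illustrative assumptions, not a fixed rubric.

  from collections import defaultdict

  # surface_errors[student][paper] = counts for that paper (hypothetical structure)
  surface_errors = defaultdict(dict)

  def record_paper(student, paper, pages, grammar=0, spelling=0, punctuation=0):
      # Store surface-level error counts for one paper, normalized per page.
      total = grammar + spelling + punctuation
      surface_errors[student][paper] = {
          "grammar": grammar,
          "spelling": spelling,
          "punctuation": punctuation,
          "errors_per_page": round(total / pages, 2),
      }

  # Example: one student's first and second literature reviews
  record_paper("Student A", "Review 1", pages=6, grammar=14, spelling=3, punctuation=9)
  record_paper("Student A", "Review 2", pages=7, grammar=9, spelling=1, punctuation=6)

  for paper, stats in surface_errors["Student A"].items():
      print(paper, stats["errors_per_page"])  # shows the per-page trend across papers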

I am the instructor for the Meteorological Reports class. The students are given weekly assignments as well as two longer literature reviews, one at mid-term and the other as a final assignment at the end of the semester. The objectives of the weekly assignments vary, sometimes focusing on individual components of a technical report (e.g., an abstract) and other times emphasizing specific aspects of writing (e.g., paragraph coherence). In the first week of class, the students are given a three-page assignment, which will serve as a benchmark for improvement. I will assess the students' progress throughout the semester; the primary comparison, however, will be between the benchmark and the mid-term and final assignments.

Outcome:

I will produce a report assessing the students in both classes, including surface-level and meaning-level improvements. I expect the students in the graduate class to be better writers at the beginning of the semester, likely leaving less room for improvement; moreover, the focus of that class is not solely on developing writing skills. The focus of the undergraduate class is improving writing skills, and I expect the improvement there to be much greater. The final analysis will include a comparison of how the students in the two classes improved.

 

Past Faculty Projects

Gloria Collins, 2011-2012