The Assistant Writing Specialist (AWS) Program
The Writing Center implemented the Assistant Writing Specialist (AWS) program in January 2011, at the start of the spring 2011 semester. Applicants for the Assistant Writing Specialist position must complete the same intensive training as full upper-division and graduate-level Writing Specialists. However, AWSs are sophomore-level students who have not yet taken the WST, completed 100W, or achieved upper-division standing. Once an AWS has completed these three milestones, he or she can be promoted to a full Writing Specialist.
As part of this ongoing project, I have completed the following tasks:
1. Created the program for Assistant Writing Specialists.
- Decided upon pay scale, general qualifications, and hiring criteria for the new position.
- Created the full job description.
- Posted the full job description on the Writing Center web page.
- Conducted all steps of the hiring process with all prospective Assistant Writing Specialists.
- Hired four Assistant Writing Specialists during the 2010-2011 academic year.
2. Maintained the program for Assistant Writing Specialists.
- Assisted AWSs with their daily tutoring responsibilities.
- Answered questions from the Assistant Writing Specialists as they adjusted to their new positions at the Center.
- Helped the AWSs deal with difficult tutees, problematic tutoring sessions, and instructor concerns.
- Provided critical commentary about the e-mail messages and "Homegrown Handouts" created by the AWSs.
- Discussed ways to improve tutoring sessions with the AWSs on an individual basis.
- Helped AWSs reach high evaluation scores for their tutoring sessions.
The Assistant Writing Specialist program has benefited the university and the Writing Center in a few key ways: (1) it has helped draw in freshman- and sophomore-level (lower-division) tutees; (2) it has allowed exceptional lower-division students to become tutors at the Center and gain excellent professional job experience; (3) it has allowed the Center to retain high-quality tutors for a longer period of time; and (4) it has increased the diversity of the Center, since the AWSs are ethnically diverse and come from a range of major departments.
The Web Presence of the SJSU Writing Center
One of my primary responsibilities is to maintain and promote the web presence of the Center. As part of this ongoing project, I have accomplished the following tasks:
1. Updated the official Writing Center website.
- Reorganized the website.
- Made all necessary updates.
- Uploaded new Homegrown Handouts.
- Maintained the workshop schedule.
- Posted pictures and bios of new staff and faculty.
- Posted new pictures from the Center.
- Collected and posted "testimonials" from instructors and tutees.
2. Created and updated the Facebook fan page and the Twitter account for the Writing Center.
- Publicized the Facebook fan page and the Twitter account.
- Posted "tips of the week" from our tutors.
- Posted relevant updates about the Center (announcements, hours, services, etc.).
- Shared relevant links.
- Reviewed page statistics.
- Helped the Center gain 281 individual Facebook "fans" and 83 Twitter "followers" (as of September 2012).
- Helped the Center gain many "fans" and "followers" from a variety of sources (other SJSU groups, a PR firm in South Dakota, an editing service in Oregon, and other Writing Centers throughout the nation, including Texas A&M, private university WCs in New York and North Carolina, a state college in Florida, etc.).
These online projects have been crucial public relations activities for the Center. They have publicized the Center both to SJSU faculty and students and to other universities, Writing Centers, and scholars throughout the nation.
Revision of the Objective Portion of the Writing Skills Test (WST)
My project will produce new, homegrown SJSU versions of the objective portion of the Writing Skills Test (WST). The objective (multiple-choice) portion contains questions about grammar and rhetorical choices (editing). SJSU currently pays ACT $13.10 per exam, which includes all materials and scoring. By creating our own versions of the objective portion, SJSU will save money and control the content of the test, which we can customize for our student population.
The SJSU community and I have these questions about the objective test from ACT:
- What is the objective part of the WST testing? Rhetoric? Grammar?
- How does the objective test relate to our composition courses?
- Are the items on the test appropriate for our demographics?
- Can the scores be reported in a way that makes more sense to students?
- Must we have 72 items in 45 minutes of test time?
The Road Ahead:
In order to create an objective test for junior-level students, I will need to do the following:
- interview instructors who teach LLD 1, LLD 2, and English 1A and 1B.
- study the course objectives for those courses.
- interview Derrick Koh in the Testing Office for his advice and wisdom.
- study the Barron’s Guide for CSU Writing Proficiency Exams.
- consider data like WST pass rates.
Devising an exam is very complex. I will have to ask questions that we cannot currently answer. What are the university’s standards in writing overall? How literate should a graduate of SJSU be? How do we measure that literacy?
On the more technical front, how would SJSU administer its own objective test? We would need to design answer forms, run data analyses to confirm that the questions are valid, devise a score-reporting system, and maintain test security.
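To make the data-analysis step concrete, below is a minimal sketch (in Python) of a classical item analysis, assuming we have scored answer sheets as a 0/1 matrix; the function name and sample data are my own illustration, not an existing SJSU tool. For each question it computes the difficulty (the proportion of students answering correctly) and the point-biserial discrimination (how strongly success on that question correlates with success on the rest of the test). Questions with very low or negative discrimination would be candidates for revision or removal.

    import numpy as np

    def item_analysis(responses):
        """Classical item analysis for a 0/1-scored answer matrix.

        responses: one row per student, one column per question;
        1 = correct, 0 = incorrect.
        """
        responses = np.asarray(responses, dtype=float)
        difficulty = responses.mean(axis=0)          # proportion correct per question
        totals = responses.sum(axis=1)               # each student's total score
        discrimination = np.empty(responses.shape[1])
        for j in range(responses.shape[1]):
            rest = totals - responses[:, j]          # total score excluding question j
            # Point-biserial: correlation of the question with the rest of the test
            discrimination[j] = np.corrcoef(responses[:, j], rest)[0, 1]
        return difficulty, discrimination

    # Hypothetical answer sheets: 5 students, 4 questions
    scores = [[1, 1, 0, 1],
              [1, 0, 0, 1],
              [0, 1, 1, 1],
              [1, 1, 0, 0],
              [1, 0, 1, 1]]
    difficulty, discrimination = item_analysis(scores)
    for j, (p, r) in enumerate(zip(difficulty, discrimination), start=1):
        print(f"Question {j}: difficulty = {p:.2f}, discrimination = {r:+.2f}")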
Derrick Koh (from the SJSU Testing Office) mentioned important factors I had not considered before. ACT and ETS do not want students to figure out which questions they got right or wrong; therefore, testing agencies never reveal raw scores. If scores were reported too transparently, students could piece together their right and wrong answers, create an "answer guide" for the test, and make that information public. But currently, the score from ACT means nothing to students. If a student earns a 60 on the test, that does not mean he or she got 60 items right out of 72: each question is weighted differently, and the scores are curved for the group that took the test on that particular test day. The score report also begins at 50, giving some students the false impression that they got 50 of the 72 answers right, when they might have gotten only ten items right. I suggest the scores be reported on a 100% scale, so students have more meaningful data; for example, a student who answered 54 of 72 items correctly would simply see 75%.
The Advantages of a Custom-made Test:
In the long run, the university would save money because we would have done the work of making the test ourselves. Right now ACT charges us $13.10 per test. With an in-house test, we would not have to mail test materials back and forth out of state, wait for mail deliveries, or verify that nothing was missing.
As with a custom-made suit or shirt, we would no longer suffer the poor fit of off-the-rack merchandise: questions can be tailor-made to reflect the learning objectives of our composition/writing courses.
Instructors, administrators, and even students can weigh in during the development phase. Over time, the test can be changed or adjusted to fit our academic needs, and we can report the scores in a way that makes more sense to students and faculty.
The Devil’s Advocate Speaks:
If the essays are scored with a discrepancy rate of half a percent or less, why do we need the objective part at all? This is a question that could launch a thousand Writing Requirement Committee (WRC) meetings, and we have discussed the possibility in the recent past. But here are reasons we might want to keep the multiple-choice part of the WST. What happens to students who earn a total holistic score of six or seven on the essay? Currently, those two scores can pass when combined with an appropriate objective score. Without an objective test, students who once passed with those lower scores would now fail, increasing the overall fail rate by approximately 20% per test administration. How do we help these additional failing students? Do they crowd into English and LLD 100A courses? Take the test again? Go to community colleges for more writing courses?
Instructors, the WRC, and all concerned parties must talk about the standards and expectations we want to have for our students and how we will coordinate these expectations across the university.
Assessing Students' Writing
Dr. Martin Leach
The goal is to improve the writing skills of students at both the graduate and undergraduate levels.
To meet the goal, the objectives of this proposal address students' skills at two levels: the surface level and the meaning level. The surface level includes grammar, spelling, and punctuation. The meaning level includes clarity and coherence, encompassing sentence construction, paragraph structure, and the organization of entire documents.
Two classes will be included in this study, one for graduate students and one for undergraduate students. The graduate class is METR-202, Research Methods, and the undergraduate class is METR-100W, Writing Workshop: Meteorological Reports.
The Research Methods class is divided into five three-week sessions, conducted by different professors covering their individual areas of expertise. At the end of each session, the students will be responsible for a literature review, typically a 5- to 8-page paper. Students will be asked to hand in two copies of their papers; one will go to the individual professor and one will go to me. The professors will focus primarily on the meaning level in grading the papers, i.e., how well the students have grasped the science topic and assimilated the background material, though I expect some professors to comment on surface-level errors as well. My focus will be on both the surface and meaning levels. I will track each student's progress through the semester, quantifying the number of surface-level errors and assessing the meaning level (a sketch of this bookkeeping follows below). Students will receive comments from both the professor and me on each paper, in time to use them before writing the next paper.
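As one concrete way of quantifying surface-level errors, here is a minimal sketch in Python; the record format and numbers are hypothetical illustrations, not actual class data. It normalizes each paper's error tally by its page count, so papers of different lengths are comparable, and reports each student's change in error rate from the first paper to the last.

    from collections import defaultdict

    # Hypothetical records: (student, paper number, pages, surface-level errors)
    records = [
        ("Student A", 1, 6, 24),
        ("Student A", 2, 7, 18),
        ("Student A", 3, 5, 11),
        ("Student B", 1, 8, 30),
        ("Student B", 2, 6, 27),
    ]

    # Errors per page for each paper, grouped by student
    rates = defaultdict(list)
    for student, paper, pages, errors in records:
        rates[student].append((paper, errors / pages))

    for student, papers in sorted(rates.items()):
        papers.sort()                                # order papers chronologically
        first, last = papers[0][1], papers[-1][1]
        change = (last - first) / first * 100        # negative means improvement
        trend = "; ".join(f"paper {n}: {r:.1f}/page" for n, r in papers)
        print(f"{student}: {trend} ({change:+.0f}% change in errors per page)")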
I am the instructor for the Meteorological Reports class. The students are given weekly assignments as well as two longer literature reviews, one at mid-term and the other as a final assignment at the end of the semester. The objectives of the weekly assignments vary, sometimes focusing on individual components of a technical report (e.g., an abstract) and other times emphasizing specific aspects of writing (e.g., paragraph coherence). In the first week of class, the students are given a three-page assignment, which will serve as a benchmark for improvement. I will assess the students' progress throughout the semester, comparing the benchmark in particular against the mid-term and final assignments.
I will produce a report assessing the students in both classes, covering both surface-level and meaning-level improvements. I expect the students in the graduate class to be better writers at the beginning of the semester, likely leaving less room for improvement. However, the focus of that class is not solely on developing writing skills, whereas the focus of the undergraduate class is improving writing skills; I therefore expect the improvement there to be much greater. The final analysis will compare the progress of students in the two classes.