3.1. Analysis

The biggest lesson from Horstmann's experiment is that assessment needs to be task-based. Indeed, this was the motivation for creating the task-based rubric for SLO2 that was used in Fall 2009; going forward, all rubrics will be task-based. The only question that remains is how specific the tasks should be. To preserve some flexibility for instructors, it is envisioned that tasks will be somewhat abstract. For example:

Write a Java program that modifies an array, for example, a program that sorts an array.

In the future we might want to turn this into a specific array-sorting problem and ask faculty to include it on their final exams.
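
As a rough illustration only (this sketch is not taken from the report), a task stated at that level of abstraction could be satisfied by a short student program along the following lines, which modifies an integer array by sorting it in place with selection sort:

    // Hypothetical illustration of an SLO2-style task: a small Java program
    // that modifies an array by sorting it in place (selection sort).
    public class ArraySortTask {
        // Sorts the given array into ascending order, modifying it in place.
        static void selectionSort(int[] values) {
            for (int i = 0; i < values.length - 1; i++) {
                // Find the index of the smallest remaining element.
                int minIndex = i;
                for (int j = i + 1; j < values.length; j++) {
                    if (values[j] < values[minIndex]) {
                        minIndex = j;
                    }
                }
                // Swap it into position i.
                int temp = values[i];
                values[i] = values[minIndex];
                values[minIndex] = temp;
            }
        }

        public static void main(String[] args) {
            int[] data = { 29, 5, 17, 3, 11 };
            selectionSort(data);
            System.out.println(java.util.Arrays.toString(data)); // [3, 5, 11, 17, 29]
        }
    }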

By giving pre-semester and post-semester tests, Horstmann was able to pinpoint CS151 as the problem class. However, under the modified assessment schedule using task-based rubrics, the problems in CS151 would still be noticed if, for example, students emerging from CS46B did well on SLO5-B tasks but students emerging from CS151 did poorly on SLO5-I tasks.

So how should the department address the issue of tool usage? Horstmann's report [6] suggests that the CS46A/B labs be redesigned; this work is underway. He also suggests that the CS151 syllabus be redesigned, which has not yet been done. The Assessment Committee will ask the Programming and Software Engineering course committee (the committee responsible for the CS151 syllabus) to begin this task in the near future.

Another solution might be for the department to fix a standard set of tools, much as it fixes a standard language. For example, designating Eclipse as the department IDE, Trac as the standard issue-tracking tool, and Subversion as the standard version-control tool would let CS151 instructors assume that entering students are familiar with these particular tools at a beginning level. The department would also have to ensure support for such tools.