
Provost's Assessment Award - 2008

Applied Sciences and Arts - Library and Information Science Program

The Library and Information Science (SLIS) program provides a model for the college in the assessment of student learning, with special recognition to Ken Haycock, Jane Fisher, and Linda Main.

SLIS is a model for effectively using assessment data to drive programmatic change. One example is the program-wide standards for grading it has established. In addition, SLIS regularly holds retreats to review and upgrade its assessment practices and plan for the future. This semester the program moved to a focus on program-wide student learning objectives that are carefully integrated into its Master of Library and Information Science core competencies. Finally, the college selection committee was impressed that the department routinely seeks student feedback on its assessment practices as part of the students' culminating experience in the program.

Business - Dr. Mallory McWilliams & Dr. Robert Sibley

Dr. Robert E. Sibley chairs the College of Business Assessment Committee and has led the college's assessment efforts since assuming the newly created position of Director of Assessment last summer. In addition to managing the college's ongoing assessment work, Dr. Sibley headed the COB's assessment retreat effort, which brought a highly regarded AACSB representative to campus this spring. He has also led the development of an open and transparent rubric describing how assessment data will be used to drive curricular development within the College of Business.

Dr. Mallory McWilliams has been a long-standing member of the COB Assessment Committee, and she serves as a constant voice for doing assessment the right way and for the right reason: not because accreditation agencies require it, but because it helps us serve our students better. Mallory has been a driving force behind program assessment, working hard to ensure a strong linkage between our program objectives and student learning outcomes.

She is also a constant source of strength on the small things that have to happen to make assessment work. For example, she volunteered to vet the CSU-BAT questions in Accounting and Finance, handling those in her own area and making sure that the Finance questions were reviewed by someone with more expertise there. She has also been a key driver in making sense of concentration assessment, taking on the task of ensuring that each of the four A & F concentrations had an assessment coordinator (all of the positions other than hers had gone vacant with personnel changes since the previous year). In a sense, Mallory, with her no-nonsense style, functions as our conscience. She is constantly asking all of us who work on assessment in the COB: Is this the best way to accomplish this goal? Does this measure student learning? Does this really reflect our program goal(s)? Is there a better way to do this?

In sum, Mallory is committed to assessment and to the use we can make of assessment activities in assuring student learning. She is a good team player, yet she is capable of "taking the ball and running with it" when we need her to do something. I unreservedly offer my highest recommendation of Mallory McWilliams as the COB's nominee for the Provost's Outstanding Assessment Award.

Education - Department of Child and Adolescent Development

Toni Campbell, Amy Strage, Kathryn Lindholm-Leary, Mary McVey, Robin Love, Maureen Smith, Ravisha Mathur, Nadia Sorkhabi

The Child and Adolescent Development department has played an important role in modeling strategies for assessment, in designing assessments that evaluate students' competencies and their ability to meet the learning outcomes, in consistently and diligently collecting and analyzing student performance data, and in using these data to inform program improvement practices.

Engineering - The College Assessment Committee

The following individuals, as members of the College of Engineering Assessment Committee, have made outstanding efforts in continuously improving the department and college assessment processes and methodologies that benefit student learning: Ahmed Hambaba, Ali Zargar, Kurt McMullin, Gregory Young, Weider Yu, Robert Morelos-Zaragoa, Minnie Patel, Raymond Yee, Len Wesley.

The CoE Assessment Committee meets every month to discuss assessment topics such as streamlining the assessment processes, refining direct assessment methods, and adopting technology-based assessment processes. The committee holds an annual assessment retreat at which assessment experts are invited to share their knowledge and help improve the college's assessment of student learning. It also holds an annual workshop to support its efforts in streamlining assessment processes, refining direct assessment techniques, and exploring and implementing technology-based assessment processes.

The committee ensures that each department is engaged in activities related to monitoring and enhancing student learning. To this end, the committee organized a workshop on April 4, 2008, to learn about ways to use technology in assessment and to streamline processes so that assessment is easier to carry out and more sharply focused on student learning. The committee also discusses how existing direct assessment methods can be refined and new ones implemented to evaluate student learning.

Students are the future of this country, and training them to meet the challenges of a dynamic global environment is more important now than ever before. The committee has a clear understanding of this need and continuously strives to ensure that the CoE programs are evolving and developing to achieve the necessary student learning.

Humanities and the Arts - Dr. Scot Guenter

Science - Dr. Peter Beyersdorf

Dr. Peter Beyersdorf, an Assistant Professor in the Department of Physics, has been a leader in establishing a culture of assessment in a department that, like many on campus, had no formal programmatic assessment procedures in place prior to the recent WASC accreditation visits. Assessment of major degree programs can never be the sole result of a single individual's efforts, but it is clear that his department would not be where it is today without his understanding, expertise, and skill at implementing effective program assessment strategies. Dr. Beyersdorf grasped the essentials of assessment early on and put an assessment plan into practice at a time when many, if not most, faculty in the College of Science were reluctant participants at best. The Department of Physics has created a practical model that other departments might well emulate. Its four Student Learning Outcomes (SLOs) follow logically from a four-point mission statement, and each SLO has several sub-SLOs that are clearly defined to be programmatic and assessable. Assessments include questions from a nationally developed standardized exam, so the department can benchmark some of its SLOs against an external standard. Most importantly, when certain aspects of the assessment plan were clearly not working well, they were abandoned and replaced with other, hopefully more effective, methods of measuring the SLOs.

As if assessment were not a significant enough contribution, Dr. Beyersdorf has also pioneered podcasting via iTunes U at SJSU. He is currently serving as the CFD Faculty-in-Residence for Technology Innovations <http://www.sjsu.edu/cfd/programs/facultyinresidence/podcasting/>. His physics colleagues benefit directly from his pedagogical expertise in making lectures and homework assistance available to students electronically.

Social Science - Dr. Ron Rogers