Assessment Possibilities Beyond First Year Writing

Constructing a rubric for assessment beyond the coursework of first year composition is outside the scope of this essay. In addition, any rubric used at the program level would need to be locally constructed, taking into account individual program goals. My purpose here is merely to demonstrate that such an assessment tool is possible, given the flexibility of hypertext and the imagination of the writing program in constructing its goals and assessment strategies.

           Computer applications as such are hardly new to the assessment game. Text editors, spell checkers, and grammar checkers have been used since the late 1980s and early 1990s to assess student achievement. The early focus on multiple-choice and short-answer tests as both direct and indirect tools of measurement lost the computer more ground than it gained in the field of assessment, however, since these instruments could not measure the kinds of learning that writing instructors considered their primary goals. Programs developed by W. Webster Newbold (Ball State) attempted to bridge this gap between instructor goals and outcomes by creating algorithmic programs that used word-stringing capabilities to measure primary conceptual relationships (4-5); current work at Ball State is also summarized in this issue. Linda Flower and John Hayes studied fundamental cognitive processes by using software that collected and preserved keystrokes as students composed, coupled with oral reports in which the students described their thoughts during the process of composition. Hawisher (1989) and Sirc and Bridwell-Bowles (1988) improved on this model by creating a terminate-and-stay-resident program so that students would not be distracted by oral reporting during the composing process. John Smith's (1989) innovations to this experimental model included a parsing program that not only analyzed the data but also reduced it to a manageable size for later study.

           These studies have all contributed to a growing pool of knowledge about how students compose – an essential tool for teaching writing. In 1990, Newbold observed that "the most novel and exciting dimension of networking in writing instruction is the encouragement it gives to self assessment" (9). He stated that in a networked environment, as students "get a sense of writing as communicating with people like themselves, which is fostered by network-based activities, they begin to be able to assess their writing more realistically and successfully" (9), and he found that the networked environment supported text-based, cognitive, and social approaches to writing. The Web portfolio takes the network environment one step further along this same path. In the hypertext environment, readers can access a full range of comments from the entire classroom community of writers in addition to the students' own assessments of their work.

           Beyond the classroom level, such portfolios can also be used to assess a particular writing program's effectiveness; to be effective at that level, they will need to demonstrate that instructors are achieving the higher-level goals that programs have set for themselves, such as critical thinking and problem solving. The concept of the Writer's Web can be applied at the program level for programs with a two-course core, a writing-across-the-curriculum component, or rising-junior/senior capstone requirements. Archived and re-accessed portfolios will generate a tracking mechanism through which a program can monitor progress toward meeting individual student goals, setting new ones as the original goals are achieved or as situational variables demand different strategies. Thus a student's Writer's Web grows with her, creating and maintaining a history of her progress as a reader, thinker, and writer for as long as she continues to add to it. A chosen symbol or metaphor may become trite or inaccurate as she advances; in that case, she might change the shape, size, and style of the Web to better suit her evolving image of herself as a writer, researcher, and thinker. At that time, she would necessarily add a self-reflective "process" page explaining why changing needs or goals necessitated a new process of metaphorization in her Web. When she reaches the end of her program of study, that information can conceivably be made available for assessment purposes at the program level.

           Using the webbed environment as a performance-assessment tool incorporates and builds upon earlier techniques. The Writer's Web is an expanded form of the traditional portfolio without the weight of paper and the inconvenience of storage, with the added advantage of being dynamic and interactive. It can take advantage of performance-assessment techniques demonstrated by White and Faigley: criterion-based scoring conducted by trained raters and guided by a locally constructed rubric tailored to the demands of the assignment (Newbold 6). It continues to avoid the problematic one-time, single-draft essay sample, while adding the student's own assessment of what (s)he has accomplished over time. It supports discourse-community approaches to writing by supplying vital contextual references from the entire "community" of readers/writers from the course(s) within its structure (Faigley 1985; Rosenthal 1983). Raters can evaluate elements crucial to writing instruction and writers' development: the texts themselves, their placement within a community of writers (through peer and instructor reviews of draft documents), and the students' self-reported processes and progress.

           To suggest that a Writer's Web completely solves the problems of teaching and assessing student writing would be foolish. The Web portfolio raises many of the same problematic issues that plague its more traditional paper-bound counterpart. Where and how would such documents be stored? Who would have access to them, and how could instructors, raters, or other assessors be certain that these portfolios accurately represented the students' own work? The plagiarism and ownership issues that apply to commercial, organizational, and educational Web documents apply equally to student webs; perhaps even more so, given the high stakes involved in their use to assign grades and accurately reflect student performance. These problems will need solutions before Web portfolios can achieve their potential for teaching and learning.

           However, since current outcomes-assessment demands highlight the need for more accurate and reliable methods of demonstrating what students have learned in their writing courses, the availability of a "Writer's Web" could supply valuable insight into areas of assessment that have proven difficult to measure. Instead of saving or revising "old" essays at the rising-junior or senior level, a Writer's Web could make available not just final products but an evolving body of documents produced and revised over time, with a clear view of students' reasoning behind the revisions and their personal views of their progress toward concrete, self-determined goals. While no instrument can deliver 100% reliability and validity in measuring such a complex cognitive process, a Writer's Web places the emphasis where it belongs: on writing as a process of reading, learning, and revising, which instructors stress as the most important aspect of first year writing courses. This kind of assessment tool can accentuate the process of writing, keeping the flash and taking out the trash.

Kairos 6.2, Fall 2001