Everyday Assessment Practices in Writing Centers: A Cultural Rhetorics Approach

Marilee Brooks-Gillies
IUPUI
mbrooksg@iupui.edu

Trixie Smith
Michigan State University
smit1254@msu.edu

Abstract

We write about a cultural rhetorics approach to writing center assessment at two different institutions where we think about assessment as everyday practice that enables us to tell multiple stories about our centers. We share how we create assessment committees within the center and collaboratively develop and revise assessment approaches and instruments, particularly with consultant input. Then, we discuss the various communities that inform and benefit from our assessments, including consultants, a broad range of writing center stakeholders, and writing center administrators. Assessment as everyday practice means that we are better informed and prepared when these constituents ask questions, make requests, or operate from (false) assumptions. We hope this view of assessment leads readers to build relationships with the individuals in their centers and universities in order to create assessments that matter in particular times and spaces as well as assessments that morph and change as the readers’ cultural communities change.

A cultural rhetorics orientation is to enact a set of respectful and responsible practices to form and sustain relationships with cultural communities and their shared beliefs and practices including texts, materials, and ideas. This orientation rejects the idea that “everything is a text” to be read and instead engages with the material, embodied, and relational aspects of research and scholarly production. One engages with texts, bodies, materials, ideas, or space knowing that these subjects are interconnected to the universe and belong to a cultural community with its own intellectual tradition and history. --Andrea Riley Mukavetz

As we’ve said, for us, the core of cultural rhetorics practices is an orientation and embodied storying of the maker in relation to what is being made. The makers of the pieces gathered here [in this special issue of Enculturation focused on Entering the Cultural Rhetorics Conversations] engage in that storying in different ways, through different means, hearing the voices of different ancestors and elders, honoring different kinds of stories. Their stories build and are theories, forming a web of relations—making, re-making, and extending the brief outline of cultural rhetorics practices we’ve talked about in this introduction. --Phil Bratta & Malea Powell

Our individual Writing Centers and the field of Writing Center Studies can be understood as specific cultural communities where writers write; where writers talk about writing with other writers; where writers are encouraged and empowered to enter a community of practice with other writers around making meaning through conversations, texts, and most importantly relationships. As Riley Mukavetz notes, a cultural community, such as the writing center, is a space where “one engages with texts [tutor manuals, intake forms, client surveys, theoretical articles read independently or in concert], bodies [tutors, clients, faculty who send their students to the WC, administrators], materials [essays, computers, candy bowls, tables and chairs], ideas [non-directive tutoring, process not product, ghosts in the center, the faces of the center], or space [basements, libraries, satellites, online, websites] knowing that these subjects are interconnected to the universe and belong to a cultural community with its own intellectual tradition and history” (109). Likewise, as Bratta and Powell tell us, it is through honoring all of these different aspects of community and storytelling that we build our theories and thus our practices. In shaping our inquiry into our communities by recognizing the relationships among the makers and makings, we come to writing centers as cultural rhetorics scholars. 

From this cultural rhetorics positionality and our own lived experiences, we posit that assessment is a practice of everyday life in the writing center. Something we do, and are called to do, on a regular basis, day in and day out. In The Practice of Everyday Life, Michel de Certeau’s goal is to create an approach to make everyday practices visible; he writes, “This goal will be achieved if everyday practices, ‘ways of operating’ or doing things, no longer appear as merely the obscure background of social activity, and if a body of theoretical questions, methods, categories, and perspectives, by penetrating this obscurity, make it possible to articulate them” (xi). We see this as a helpful understanding of assessment. In developing ways to make visible and understand our everyday practices, we begin to assess them. Through approaching assessment as an everyday practice and applying it to our everyday practices, we create a way to talk about our work in systematic, clear ways. We can use it to draw attention to practices that work well and to address practices that need modification(s).

Many writing center theorists and practitioners have previously noted the need for assessment, as well as the fear of assessment. Neal Lerner, for example, says that assessment is one of the words that “haunt” writing center professionals, alongside “research.” He then argues for reclaiming the words, and the acts, of assessment and research in order to “put into a critical context the common call to investigate how well we are doing” (58) and to “begin to assess our work in ways that we feel are meaningful and useful” (73).¹ Similarly, Joan Hawthorne indicates her surprise that writing center professionals “have had to be pushed and pulled into assessment, resisting all the way” (237). Hawthorne then makes an argument for assessment as a set of scholarly activities that provide us with evidence and information (and we would add stories) to be used with a range of stakeholders at every level. Jon Olson, Dawn J. Moyer, and Adelia Falda have also argued for both the importance and ease of conducting assessment in our centers, comparing assessment to the tutoring work we already do: “addressing questions, through conversation, that help people see more clearly what they’ve been doing so they can then do more effectively what they need to achieve” (111). It is the everydayness of this work that leads them to argue for student-centered research as a way to provide valuable insight into the work of writing centers while giving students valuable experiences with research and assessment and all that these processes entail. 

For decades, writing center publications have both celebrated the possibilities of research in the writing center and called for more research at the same time. Alice Gillam’s survey of the scholarship on research argued that the literature fell into three categories: “the call to research, the survey or overview of research, and the critique, that is, in its own way, a call to research differently” (4-5). Often these calls have asked for more empirical methods. Driscoll and Perdue, for example, surveyed three decades’ worth of research in The Writing Center Journal and concluded that writing center professionals need to do more replicable, aggregable, and data-supported (RAD) research. More recent scholars, however, point out some of the problems with RAD research. Alexandria Lockett, when focusing on the experiences of Black women in writing centers, warns us to “be cautious about privileging replicable and aggregable data (RAD) research over more explicitly subjective methods. The language of RAD tends to strip the human experience of its nuance and may risk diminishing the various ways we might interpret experience as data” (33). Likewise, when discussing his research with queer-identified writing center directors, Travis Webster writes, 

From a RAD lens, my methods are replicable, and I do hope they could offer a lens for other queer, transgender, and raced projects. However, I caution against just any researcher replicating my methods and instruments given that my queer body played a critical role in the development and framework for the project, including and especially linked to recruitment and establishing trust with my participants. (18)

Writing center assessment, then, is a form of research that needs to be considered and conducted in and with the community. Decisions about questions to explore and how to explore them need to be made through engaging with the center, its stakeholders, its needs, its practices, its history. Cultural rhetorics research focuses on the maker in relation to what is being made. In writing centers, we foster relationships and support writing in a myriad of ways, which differ from center to center and change over time. 

How, then, do we create an approach to writing center assessments that embraces this emphasis on relationships? Relationships that exist in a particular context and historical moment? What kinds of questions do we need to ask and how do we go about answering them? Who should be part of the asking and answering? What methods do we use to create assessments that will provide us with the ability to learn more about our community, our relationships, in responsible and reciprocal ways? In Building Writing Center Assessments that Matter, William J. Macauley, Jr. writes, “Our work is unique, which means that others cannot reasonably be expected to handle the assessment of that work. In order to assess and accurately represent that work, we cannot work from lore or anecdote alone” (29). He calls for more deliberate assessments and research, so we can not only tell our own stories but also show our stories, stories collected in and through intentional, structured, and empirically sound methods². The stories of the center include stories of consultants, student writers, administrators, engagement partners, and other stakeholders. Our approach to assessment includes the writing center community of each of our centers and all of the various stakeholders’ investments in and connections to each of our writing centers.

To run effective writing centers under this broad idea of community and story, we need to understand our writing centers as composed of various communities of practice, where groups of learners operate on common ground. Like Malenczyk et al., “we [as directors] admit to being strongly influenced by our many collective years tutoring in, directing, and researching writing centers, sites in which students teaching and learning from other students is woven into the fabric” (70). From the beginning of their time working in our particular centers, consultants recognize their voices are encouraged as part of ongoing conversations about what our centers are and can be. To successfully create, implement, and revise meaningful assessments, we need to see ourselves as working together to tell compelling stories about what we do and how we do it. Central to developing a community of practice within the writing center, then, is a strong writing center education program that provides consultants with information about writing center history, a student-centered pedagogy, an emphasis on teambuilding and community, guidance on intercultural and workplace communication, attentiveness to diversity, equity, and inclusion as values that guide our practices, and other orientations important to the local contexts of our particular universities and centers. While models vary, from credit-bearing courses to required staff meetings to self-paced modules, writing center education matters for creating a common understanding of our values and goals and for recognizing the ways our daily practices support those goals. 

In this article, we focus on the part of this education that stresses to consultants and staff, who represent the culture of the center, that assessment relies on the input of everyone in the center acting in community. This focus includes sharing how we create assessment committees and collaboratively develop assessment approaches and instruments, as well as how this information is then shared with the community in various ways and forms. We then discuss the diverse cultural communities that inform and benefit from our assessments, including consultants, writing center administrators, and a broad range of writing center stakeholders—from students to campus administrators.

Creation of Assessment Committees

Dear Amber³,

I am forming a new committee to help with the work of the Writing Center—the assessment committee. I would like for this committee to have a good mixture of new and experienced consultants, undergrad and graduate consultants, fields of study, etc. Therefore, I am inviting you to be a part of this committee, which will consider ways to evaluate the work of the center, the work of individual consultants, and the work of the administrators.

If you are willing to serve the writing center in this way, please write back and let me know. Also let me know all of your available meeting times (we'll move around schedules if we need to).

Thanks,

Trixie

For us, ongoing writing center education and professional development includes participating in committees. At both of our centers, the committees are sites where consultants across experience levels work together to build, sustain, and revise writing center programming and the structures that make that programming possible. One such committee is an assessment committee. The assessment committee at The Writing Center at Michigan State University (MSU) was established by Trixie soon after she came to the university as a way to pull in multiple student voices, to expand existing assessments, and to make assessment a common occurrence across programs and initiatives. In fact, Marilee served on the assessment committee at The Writing Center at MSU when she was a doctoral student, working with undergraduate and graduate consultants, the Director (Trixie), and Associate Director to create and implement various assessments. The assessment committee at The Writing Center at MSU continues in this form, and Marilee has started similar committees at both writing centers she has directed since graduating from MSU.

When establishing the assessment committee at MSU, Trixie assembled a group of writing consultants that represented a wide spectrum of writing center experiences; some were in their first semester of consulting, while some had consulted at multiple writing centers over the course of many years. Some students were studying Rhetoric and Writing, while others were preparing for careers in different disciplines and professional capacities. The committee was intentionally organized to represent disparate and varied experiences and perspectives; each member was valued and each member contributed to the development of assessment instruments and strategies used in our writing center. The aim was to develop a range of approaches that would help us not only understand but advance our work together. We started by reading some of the existing writing center and writing program literature on assessment to give us some common ground and a place from which to ask questions. Members of the committee also brought in readings from their own fields to help inform our discussions. Trixie recalls conversations about statistics from an accounting major, how-to guides for focus groups from a communications/human resources major, and validity testing of initial instruments from consultants in linguistics and sociology. Following Ed White’s advice, we benefited from a team “representing different discourse communities” and “multiple measures” (199). 

From the beginning, assessment was framed as a practice of the everyday life of the center, countering the narrative that assessment is some big scary monster. Instead, we choose to greet assessment as a friend who is always around to help us out. In Strategies for Writing Center Research, Jackie Grutsch McKinney, for example, argues that we conduct research for a wide range of reasons: to participate and contribute to a larger conversation, to interrogate practices, to make better decisions, to make stronger arguments, to complicate received narratives, to gain academic or professional cachet, and to enjoy our work more (or again) (xix-xx). Assessment is not some big project that happens every few years when accreditation documents are due or when we’re trying to argue for new programming, space, or funds. It is, instead, a part of our daily sustenance—both food for thought and our exercise regime.

We had these very ideas in mind when we first began thinking strategically about assessment in the MSU Writing Center. We looked at what information we already had, such as client intake data, assorted emails and messages from both clients and consultants, and individual appointment information. We then asked ourselves what else we wanted to know. Multiple brainstorming and planning sessions led us to develop four key approaches to assessment, utilizing four different assessment/research tools: short client surveys administered in a two-week blitz once a semester, a lengthier consultant self-assessment completed once a year during a fall staff meeting, group interviews or focus groups with consultants, focused on writing center administrators and facilitated by the assessment committee once a year during a spring staff meeting, and session observations of each consultant scheduled throughout the year. We viewed each of the individual instruments as pieces of a puzzle designed to collect stories from different constituents and perspectives; when put together these puzzle pieces would create a more robust (but never complete) picture of the work of the center. We also saw these as middle pieces of the puzzle with openings and nubs reaching out to other assessments in our center, our university, the larger writing center community, and to places as yet unknown (see Schendel and Macauley for their very useful articulation of ways to consider broader assessments in higher education in relation to writing center assessment design and implementation). We brought them to the larger staff for feedback that would help us clarify the purpose of each tool/approach/instrument and to be open and transparent about those purposes. 
We purposefully worked these different assessments into the everyday work of our center over the course of three years, layering them into existing structures and practices to cultivate understanding of each of these approaches with the entire staff as they were implemented. We wanted to show how assessment is a way to look closely at what we were doing and how we could do it better—in other words, our everyday practices. 

Marilee has since implemented similar models at two different writing centers, both of which had fewer existing assessments prior to her taking on the role of director. In each writing center, she started conversations about assessment with existing staff in ongoing professional development staff meetings and workshops and with incoming consultants in the writing center education course. Consultants read about writing center assessments, reviewed writing center missions, values, and goals statements, and were given time to reflect on the purpose of the writing center and how that purpose is evident in our everyday practices. In addition to local writing center archives and models from writing center literature, Marilee shared the MSU Writing Center assessment instruments with newly established assessment committees to consider how they might be useful as models in developing their own. 

Several years ago, Marilee encountered significant pushback from the broader staff in the process of trying to create a more collaborative environment in the Writing Center she directed at IUPUI, including engaging in assessments. Her initial moves to increase the number of staff meetings and to meet individually with each consultant were understood as a significant revision when they were intended as a way to get to know consultants and the Center and learn more about the everyday practices of the Center. Her hope was that by engaging with consultants and learning about the parts of the job that were most exciting and satisfying for them and the things that left them frustrated or concerned, she could work with them to make shifts to consultant education, strengthen the newly created committees, and develop an assessment plan. She was concerned when some consultants saw these overtures as “surveillance” and indicated that the center did not have a history or culture of “supervision and accountability.” While Marilee didn’t want her actions to be understood as surveillance, she most certainly did want to create a culture of accountability in the Center. 

Looking back at these early stages in the development of our thinking about the communities of our centers, we can now say that assessment as a practice of everyday life in both of our centers is more attractive to our staff members generally, but this change does not happen instantaneously. It takes time for assessment to be seen as part of the everyday; however, being seen as an everyday practice is the only way for assessment to become familiar, inviting even. For instance, as Marilee introduced the committee structure at IUPUI, she worked with staff to cultivate a culture of inquiry and curiosity, where consultants were encouraged to seek out answers to questions about consulting through reading scholarship and observing each other. Staff shared readings and began to see that learning how to do the work of writing centers was something that was never over. They were never “done” learning, and that made assessment of our practices less scary and, instead, helpful and interesting.

Over time, staff members can and will interact with the assessment process; they can ask questions, revise questions, revise answers, share results in multiple forms and venues; they can learn from it and grow with it, seeing it as a part of ongoing professional development and growth. For example, in Trixie’s center, the assessment committee now collaborates with all of the other committees in the center to generate questions for the semesterly client surveys; one semester when focusing on Dear Client Letters in Professional Development, we added a couple of questions about the usefulness of the letters and whether or not clients shared the letters with others (i.e., professors and advisors). When the assessment committee analyzes the collected data from various instruments, they then share out general feedback through the staff newsletter and/or blog and specific feedback to committees or programs that need to integrate it into their future engagements. Interacting with the feedback from assessment on a regular basis reinforces the everydayness of the process and builds a familiarity that encourages staff members to move stories around, re-constellating results or thoughts on a regular basis.  

Revising and Responding to Changing Contexts

Through listening, and listening carefully, we have gained a better understanding of our different lived, embodied experiences and the ways our various identity positions (including but not limited to race, gender, institutional status, and disciplinary backgrounds) and prior experiences inform how we engage in the work of the writing center and the ways we negotiate relationships with and among one another within and beyond the UWC. --Marilee Brooks-Gillies, et al.

Lipari’s notion of listening offers a way for those of us situated in the academy to attend to both when listening leads to understanding and also when listening leads to the recognition that we might not understand another person and therefore have a responsibility to make space for their experiences and perspectives. --Kathryn Valentine

As mentioned previously, communities of practice are multiply-situated, across centers, people, spaces, and places. In part that means that the everyday practices of assessment, of necessity, must change on a regular basis. Assessments respond to the ever-changing staff of the writing center, both administrators and students; they adjust to the needs of clients and stakeholders; and, if done correctly, they morph based on previous assessments. The questions we ask change because we need to know different things at different times. In addition, the process of engaging in assessment and the assessments themselves support and impact members of the communities connected to our centers in different ways. In what follows, we share the ways various communities contribute to or are informed by our everyday assessment approach.

Consultants

As the ones on the ground interacting with writers on a daily basis in personal and embodied ways, our consultants have a different vantage point from which to assess their work and the work of the writing center more broadly. It behooves us, then, to treat them as experts in this arena and to build connections across our experiences and theirs, our theoretical positions and theirs.

Engaging in the assessment process can be valuable to consultants even before the assessment data can be reviewed and used to make changes to writing center programming. For instance, consultants observe each other using our observation instrument, which provides them with a framework for the kinds of value-driven practices that are common in our one-to-one sessions with writers. Through observing their peers with the guidance of the observation instrument, they consider what practices are serving the writer, how writing center pedagogy is evident in our sessions, and what activities and strategies may be useful in particular contexts. From the observations, they get new ideas for activities and strategies to implement in their sessions. They can share ideas about their approaches with their peers as well, and they can reflect on their own practices. In addition, the process leads them to ask more questions about what happens in writing center sessions and what could happen in writing center sessions. As Blazer shares, “While we may be immersed in scholarship and conversation, our challenge is to figure out abbreviated but meaningful ways to engage with big ideas alongside tutors. And, as most of us facilitate staff education in periodic meetings throughout the semester, we have to stay mindful of the magnitude of this project and the challenge of sharing limited time with our tutors” (24).

For example, in the MSU Writing Center, we have begun focusing our staff professional development on a year-long theme. When we first released our WC’s Language Statement Policy, we used all of our staff meetings that academic year to think about, grapple with, and reflect on linguistic diversity and issues of language justice (see Aguilar-Smith, Pouncil, and Sanders for more on this process). This, in turn, meant that many of our assessments were focused on the learning around linguistic diversity. For instance, staff surveys became focused on changes in attitudes about language justice. In addition, observation instruments were revised to add questions about how consultants talked about issues of language in their sessions and techniques used to implement the language statement in their consulting. This emphasis on language justice changed the questions asked and thus the data/stories collected, which in turn led to new reflections and new questions. For example, these reflective moments led our multilingual writers committee to consider indicating on consultant bios the languages consultants speak or read, to encourage clients to seek out consultants who could help them in languages other than English if they so desired.

As illustrated, assessment can lead to shifts in thinking about writing and about what we value and what we teach in the writing center. Consequently, such changes can have a domino effect. Changes in writing center programming in response to assessment often lead to new/additional discussions with faculty who send their students to the writing center, for example, or conversations with advisory boards and other administrators in similar types of programs and services. In other words, we then need to think about our communities of stakeholders.

Stakeholders

If consultants are the ones on the ground, then stakeholders are often the shadows looming around us: exerting influence, though perhaps not directly; asking for or expecting particular programs or kinds of services; and often controlling budgets and personnel. Assessment, then, is needed to educate this group of people, offices, and communities so they can understand the stories of the writing center we are sharing. Such stakeholders may include writing faculty, other faculty across disciplines, and administrators such as chairs, deans, and directors. Stakeholders may also include accrediting and licensing bodies, advisory boards, regional/national/international disciplinary organizations, and institutional and community partners–both local and global. These stakeholders often represent larger movements on our campuses and in the world at large, and these movements prompt us to ask new and different questions while simultaneously pushing us for particular answers, figures, and/or stories.

As we engage in assessments and share stories/data from assessments, we can have an impact on the way stakeholders understand the work of our centers. As Grutsch McKinney writes, “[W]hen we repeatedly tell outsiders writing centers do x or writing centers are x, they expect x. They assess x. They fund x. And if x is an explanation, a story, meant to make the work understandable to outsiders but really only the beginning of what writing centers are, do, did, or could do, we might find ourselves having trouble getting reorganized or getting funding for y or z” (Peripheral 8-9). By sharing assessments with stakeholders and engaging them in parts of the assessment process, we begin to (re)shape understandings of what our centers are and can be to a much broader audience than the consultants who work within our centers and writers who visit our centers.

While assessments help us understand our centers better and communicate those understandings to stakeholders, we cannot always anticipate what assessments will be necessary to inform point-of-need decisions within our centers. Sometimes stakeholders ask questions we are not ready to answer. For instance, in 2020 when all of us were pivoting to online versions of our writing centers due to the COVID-19 pandemic, the needs and expectations of our stakeholders changed quickly and loudly. Can you keep the center open? How will you move online? How can we accommodate the needs of our students? How will you help the faculty in their transitions? Both of our centers were already offering online appointments, so we had an infrastructure for moving online, but it was still a task to move all of our in-person schedules to online, and this was before thinking about other services such as workshops and writing groups, or thinking about the needs of our staff, as both employees and students. 

In the IUPUI center, we found that while operations were fully online, consultants were having a hard time connecting with each other in informal ways that came easily in an in-person environment. They could no longer quickly ask each other questions, share consulting strategies, or connect about collaborative projects. To address this need, we restructured our communication platforms and procedures (see Hull and Pettit). While this work made it possible for us to improve communication, morale, and daily operations, it was hard to make visible to stakeholders. All they noticed was that we had fewer writing consultations than usual. Marilee shared with her department chair and dean the Center’s staff focus group report, which summarized consultant experiences, concerns, frustrations, and celebrations from across the year. While the report provided helpful information about the ways the Center was impacted by the pandemic, information her department chair and dean found compelling, it was not consistently compelling to other stakeholders. One unit responded by providing the Center with additional support for graduate writers, while another pulled financial support for undergraduate writers, demanding the Center open for in-person operations in the fall and expressing concerns about a drop in usage.

At MSU the fully online consulting schedule was complicated by the needs of international students who had been sent home due to the pandemic and, later, of international students who couldn’t get back into the US because of COVID-19 protocols. These needs were compounded by those of their professors, who were struggling to meet the range of demands represented by all of their students and the university’s mandate (rightly so) to support them in any way possible. Our nascent plans for adding an asynchronous modality to our schedule were then fast-tracked. Some of our consultants were quickly trained in this form of consulting, and the schedule opened. However, we knew from the start that this new offering would require a new form of assessment as we reflected on what was working, what needed to be revised, and how we needed to proceed with future training and assessments. Our staff focus groups were immediately revised to focus on those conducting asynchronous sessions. Then, we added a set of client focus groups to get a more complete picture of how effective (or not) the asynchronous consulting was and what needed to be improved. These results were shared with the MSU Writing Center’s Advisory Committee, which then shared with us other faculty needs and struggles, particularly around workshops. At the same time, these consultant and client evaluations helped us revamp our training of asynchronous consultants, which was important as the affordances of asynchronous consulting became appealing to new groups of students, such as third-year med students working on clinical applications in the middle of the night shift.

In both of these instances, our centers needed to respond quickly to changing institutional contexts. Existing assessments supported some of the decisions we needed to make but were insufficient in addressing others. Based on these experiences, we developed new assessments or adjusted existing assessments to help us create or shift programming within our centers and make (sometimes) compelling arguments about our work to stakeholders outside our centers. While the examples we’ve shared focus on a hopefully rare global pandemic, more common considerations pertaining to relationships with stakeholders include adjusting to changing leadership and administrative structures within the institution, partnering with community organizations or other institutions, and the inevitable restructuring of our own centers.

Writing Center Administrators

Writing Center administrators–directors, coordinators, associate directors, faculty supervisors–are the bridge between the consultants on the ground and the looming shadows of stakeholders and the interests they represent. Administrators, such as ourselves, collect the various pieces of assessment and share them out, (re)framed for different audiences and purposes, theories turned into practices and enacted through policies and procedures. As Laura Greenfield says, “our practices inform our theories, which inform our practices, which inform our theories in an ongoing dynamic process” (141).

Within our centers and with co-administrators, assessments provide stories and data that inform program revision and creation, helping us understand if and how our programs do what we designed them to do. They provide us with information about client/writer experiences as well as consultant experiences. We share assessment data with staff to guide conversations about adjustments we might make to programs and to develop new assessments that help us learn more about the practices in our centers. As administrators, we also use assessment data and stories to consider changes to writing center education and mentoring. Change can be hard, and assessments give us data that enables productive changes and helps consultants see the need for moving out of their comfort zones to make necessary changes. As Sarah Blazer writes, “To make even small shifts . . . requires time, reflection, and careful attention to the ways that everyday, small acts relate to our philosophical stances” (46). In this way, assessments support writing center administrators in making data-informed decisions and communicating those decisions to consultants.

One aspect of programming that the MSU Writing Center has been reflecting on, revising, and continuing to assess is workshops–the workshops that consultants facilitate for faculty classes, programmatic meetings, and even community groups. Originally, our thought was that anyone who was a trained writing consultant could (and should be able to) facilitate any of the workshops on our menu. After all, they were all focused on some aspect of writing. However, feedback from consultant self-assessments as well as focus groups indicated that this was not true. Some consultants felt unprepared for these workshops; others felt totally uncomfortable in this position and mentioned introversion and social anxiety as reasons they should be excused. Our first thought was that we needed additional training for everyone; we needed to address workshop literacies, so folks had a better toolkit for facilitating workshops. This training really did help some of our consultants, but pushback and feedback persisted–not everyone was cut out for workshop facilitation. This story was corroborated by the assessment forms from faculty who had engaged our workshop services for their classes, particularly those who requested the same workshop across multiple sections of a course, which meant that they saw multiple consultants in action as facilitators. We, as administrators of the center, needed to reflect even more, ask more questions of our staff and our stakeholders, and find a more viable solution.

At present, the MSU Writing Center has a new way of addressing this programming: we are still conducting workshop literacies training, but only for those who have opted in and taken on workshop facilitation as one of their jobs in the center. We have a more limited pool of workshop facilitators, but they are a group that is prepared, trained, and willing, which makes the workshops run more smoothly and makes our workshop coordinator’s job much more reasonable. We even have consultants seeking this training because they want to push their own boundaries and comfort zones and learn to facilitate workshops for groups of people. The multiple layers and types of assessment feedback worked together to highlight a needed revision to policy and procedure and made for more effective programming and happier consultants.

The MSU Writing Center administrators in this story were often caught in the middle of multiple requests, needs, complaints, and suggestions. There were the needs of the consultants to consider as we didn’t want workshop facilitation to traumatize consultants or put them at risk. We also had the learning goals of faculty in the equation as well as the learning goals of the students participating in the workshops. Likewise, there were other stakeholder pressures to reach more students through class and programmatic workshops. All of these needs and pressures had to be mediated and the data from various assessments helped to inform the decision making, particularly when combined with pedagogical and theoretical knowledge about programs, assessments, student needs, and writing/centers. 

Another common mediation in our administrative position is located in often confusing lines of reporting and general center assessment. The accomplishments, as well as the needs, of the center become the accomplishments and the needs of the administrator. The “we” of the writing center becomes the “I” of the director; consequently, the director is often viewed as needy or selfish, and the actual labor of the individual administrator is masked by the plurality of the center. As Geller and Denny explain, “writing center directors who find themselves in administrative appointments realize their academic route for advancement is unclear. Writing center directors in tenure-track positions question how intellectual labor is understood and how academic membership is conferred locally and disciplinarily” (103). Grutsch McKinney further explains that “Our collective research agenda [as writing center administrators] is stymied both because scholars are not hired as administrators (or administrators are not paid to be scholars) and because our gaze has more or less kept us, in large part, from substantial theoretical and empirical research on aspects of writing center work beyond tutoring” (Peripheral 85). This unclear and inconsistent positioning of writing center administrators points to a need for both broad and specific assessments that allow for telling and showing various stories depending on what is being asked of us and by whom. Various assessments also provide space and data to illustrate what we, as administrators, as researchers, as teachers and mentors, are requesting in the name of the writing center but also what we are requesting as individuals who work in/with centers.

As the first tenure-line director of the IUPUI University Writing Center, Marilee has encountered many instances where her career trajectory and the goals and objectives of the writing center were conflated. Concerns about the writing center having sufficient administrative support after the assistant director stepped down became less about what was good for the writing center in the long term and more about supporting Marilee through the tenure process. Certainly, carrying the required amount of administrative labor was necessary for Marilee to earn tenure, but seeing administrative support only in light of that short-term consideration was insufficient. The writing center would need consistent administrative support of more than Marilee’s one-course reallocation each semester even after she earned tenure. In this framing of the situation, the conflation of Marilee’s success and the writing center’s success, while certainly related, privileged Marilee’s tenure case over the long-term health of the center. Marilee worked to remedy this way of framing her work in relation to the writing center through the administrative focus groups assessment the writing center facilitated each spring and an external program review. Both the assessments and the review provided significant data about the need for long-term investment in an administrative team for the IUPUI Writing Center to maintain its programming, especially its emphasis on providing high-impact learning practices for student consultants. Assessments can help us more clearly distinguish when we are asking for resources that can support ongoing development of the center and when we are asking for resources that support our individual careers.

Conclusion

Assessment offers us an exciting opportunity to organize around our shared values and to engage in empirical research with our writing center communities. The work of assessment is an engaged learning, high-impact practice that helps writing center communities answer questions about their work and provides consultants with experiences they can carry beyond their work in writing centers. It can lead consultants to see themselves as researchers with important questions and the skills needed to begin answering them. It can foster a culture of inquiry and curiosity across our centers and our discipline.

That said, assessment cannot solve all of our writing center or personal career problems, but assessment as everyday practice means that we are better informed and prepared when stakeholders ask questions, make requests, or operate from (false) assumptions. However, assessment is not something you build in isolation. As Hannah Arendt wrote, “For excellence, the presence of others is always required” (198)—their voices, their stories, their perspectives must all be taken into account for the big picture to emerge. In writing centers, a great many voices are needed to get a fuller picture of our operations, our successes, our limitations. An important set of voices within writing center assessment are those of students, both student writers who visit our centers and student consultants who work in our centers. As Malenczyk et al. share, “We have often gone where students have led us—in our teaching, in our research, in our writing centers, in our program development—and we have followed others in the field who have championed students as knowledge makers” (70). In addition, at times we go where our upper administrators and university leaders direct us through their questions and requests, guided by campus strategic plans and university values statements. Likewise, we go where the field itself leads or, perhaps more importantly, where we want to lead the field.

It is our intention that this essay leads readers to embrace assessment in their own writing centers as a regular part of their daily practices. We want you to see assessment as one of many tools used in writing centers to help us better serve whoever our stakeholders are in our individual centers: students, writers, consultants, faculty, administrators, community members, and others. We know that this assessment will look and act differently in every writing center and will draw from and collaborate through the various methods and methodologies that are important to the people planning and conducting the assessments. Therefore, we also hope it leads you to build relationships with the individuals in your centers and universities in order to create assessments that matter for (all of) you in particular times and spaces as well as assessments that morph and change as your circumstances and personnel change.

Notes

  1. It’s important to note here that in this article Lerner critiques his own earlier work in the oft-cited “Counting Beans and Making Beans Count” for being statistically and logically unsound (62), a critique he revisits in the WLN article “Counting Beans Wisely.”

  2. For more on cultural rhetorics methodologies see Anderson, Bratta and Powell, Brooks-Gillies et al., Cox et al., Powell et al., Riley Mukavetz, Webster, and Wilson.

  3. This is a copy of an actual email sent to a number of student consultants in the MSU writing center in 2008. It was strategically designed to recruit a range of consultants for the newly forming assessment committee. 

  4. This is the present form of the client feedback form in the Writing Center at MSU. We use the genre of the letter in order to make the summary and feedback more personal and to recognize the client/writer as the primary audience for the client report.

  5. As cultural rhetorics scholars, we understand our position within a “constellation of relationships” through which we create and maintain structures and practices that make up our everyday practices and the discipline of writing center studies (constellated itself in relation to the fields of writing studies, writing across the disciplines, etc.). As Powell et al. explain, “A constellation, however [unlike an intersection], allows for all the meaning-making practices and their relationships to matter. It allows for multiply-situated subjects to connect to multiple discourses at the same time, as well as for those relationships (among subjects, among discourses, among kinds of connections) to shift and change without holding the subject captive” (5).

  6. Outlining, for example, rapport-building, agenda setting, sharing resources, encouraging writer agency, concluding moves, and even the types of roles the consultant plays in the session.

Works Cited 

Aguilar-Smith, Stephanie, Floyd Pouncil, and Nick Sanders. “Departing to a Better World: Advancing Linguistic Justice Through Staff Professional Development.” The Peer Review, vol. 6, no. 1, 2022. https://thepeerreview-iwca.org/issues/issue-6-1/departing-for-a-better-writing-center-advancing-language-justice-through-staff-professional-development/

Anderson, Joyce Rain. “To Listen You Must Silence Yourself.” Decolonial Conversations in Posthuman and New Material Rhetorics. Eds. Jennifer Clary-Lemon and David M. Grant. The Ohio State University Press, 2023, pp. ix-xi.

Arendt, Hannah. “Vita Activa and the Human Condition.” The Human Condition in The Portable Hannah Arendt. Penguin, 2003.

Blazer, Sarah. “Twenty-first Century Writing Center Staff Education: Teaching and Learning toward Inclusive and Productive Everyday Practice.” The Writing Center Journal, vol. 35, no. 1, 2015, pp. 17-55.

Bratta, Phil, and Malea Powell. “Introduction to the Special Issue: Entering the Cultural Rhetorics Conversations.” Cultural Rhetorics, a special issue of Enculturation, vol. 21, 2016. http://enculturation.net/entering-the-cultural-rhetorics-conversations

Brooks-Gillies, Marilee, Varshini Balaji, KC Chan-Brose, and Kelin Hull. “Listening Across: A Cultural Rhetorics Approach to Understanding Power Dynamics within a University Writing Center.” Have We Arrived? Revisiting and Rethinking Responsibility in Writing Center Work, a special issue of Praxis, 2022, vol. 19, no. 1, https://www.praxisuwc.com/191-brooksgillies-et-al 

Cox, Matthew, Elise Dixon, Katie Manthey, Rachel Robinson, and Trixie G. Smith. “Embodiment, Relationality, and Constellation: A Cultural Rhetorics Story of Doctoral Writing.” Re-Imagining Doctoral Writing. Eds. Cecile Badenhorst, Brittany Amell, and James Burford. WAC Clearinghouse, 2021, pp. 145-166.

de Certeau, Michel. The Practice of Everyday Life. Trans. Steven Randall. University of California Press, 1984.

Driscoll, Dana Lynn, and Sherry Wynn Perdue. “RAD Research as a Framework for Writing Center Inquiry: Survey and Interview Data on Writing Center Administrators’ Beliefs about Research and Research Practices.” The Writing Center Journal, vol. 34, no. 1, 2014. DOI: https://doi.org/10.7771/2832-9414.1787

Driscoll, Dana Lynn, and Sherry Wynn Perdue. “Theory, Lore, and More: An Analysis of RAD Research in The Writing Center Journal, 1980-2009.” The Writing Center Journal, vol. 32, no. 2, 2012. DOI: https://doi.org/10.7771/2832-9414.1744

Geller, Anne Ellen, and Harry Denny. “Of Ladybugs, Low Status, and Loving the Job: Writing Center Professionals Navigating Their Careers.” The Writing Center Journal, vol. 33, no. 1, 2013, pp. 96-129.

Gillam, Alice. “The Call to Research: Early Representations of Writing Center Research.” Writing Center Research: Extending the Conversation. Eds. Paula Gillespie, Alice Gillam, Lady Falls Brown, and Byron Stay. Lawrence Erlbaum Assoc, 2002. pp. 3-21.

Greenfield, Laura. Radical Writing Center Praxis: A Paradigm for Ethical Political Engagement. Utah State University Press, 2019.

Grutsch McKinney, Jackie. Peripheral Visions for Writing Centers. Utah State University Press, 2013.

---. Strategies for Writing Center Research. Parlor Press, 2016.

Hawthorne, Joan. “Approaching Assessment as if It Matters.” The Writing Center Director’s Resource Book. Eds. Christina Murphy and Byron L. Stay. Lawrence Erlbaum Associates, 2006. pp. 237-245.

Hull, Kelin and Cory Pettit. “Making Community through the Utilization of Discord in a (Suddenly) Online Writing Center.” The Peer Review, vol. 5, no. 2, Autumn 2021. https://thepeerreview-iwca.org/issues/issue-5-2/making-community-through-the-utilization-of-discord-in-a-suddenly-online-writing-center/

Lerner, Neal. “Writing Center Assessment: Searching for the ‘Proof’ of Our Effectiveness.” The Center Will Hold: Critical Perspectives on Writing Center Scholarship. Eds. Michael A. Pemberton and Joyce Kinkead. Utah State University Press, 2003. pp. 58-73.

Lockett, Alexandria. “A Touching Place: Womanist Approaches to the Center.” Out in the Center: Public Controversies and Private Struggles. Eds. Harry Denny, Robert Mundy, Liliana M. Naydan, Richard Sévère, Anna Sicari. Utah State University Press, 2018, pp. 28-42. 

Macauley, William J., Jr. “Getting from Values to Assessable Outcomes.” Building Writing Center Assessments that Matter. Eds. Ellen Schendel and William J. Macauley, Jr. Utah State University Press, 2012, pp. 25-56.

Malenczyk, Rita, Neal Lerner, and Elizabeth H. Boquet. “Learning from Bruffee: Collaboration, Students, and the Making of Knowledge in Writing Administration.” Composition, Rhetoric, and Disciplinarity. Utah State University Press, 2018, pp. 70-84.

Olson, Jon, Dawn J. Moyer, and Adelia Falda. “Student-Centered Assessment Research in the Writing Center.” Writing Center Research: Extending the Conversation. Eds. Paula Gillespie, Alice Gillam, Lady Falls Brown, and Byron Stay. Lawrence Erlbaum Assoc, 2002, pp. 111-131.

Powell, Malea, Daisy Levy, Andrea Riley-Mukavetz, Marilee Brooks-Gillies, Maria Novotny, and Jennifer Fisch-Ferguson. “Our Story Begins Here: Constellating Cultural Rhetorics Practices.” Enculturation, vol. 18, 2014. http://enculturation.net/our-story-begins-here

Riley Mukavetz, Andrea. “Towards a Cultural Rhetorics Methodology: Making Research Matter with Multi-Generational Women from the Little Traverse Bay Band.” Rhetoric, Professional Communication and Globalization, vol. 5, no. 1, 2014, pp. 108-124.

---. “Females, the Strong Ones: Listening to the Lived Experiences of American Indian Women.” Studies in American Indian Literatures, vol. 30, no. 1, Spring 2018, pp. 1-23.

Schendel, Ellen, and William J. Macauley, Jr. Building Writing Center Assessments that Matter. Utah State University Press, 2012.

Valentine, Kathryn. “Listening to the Friction: An Exploration of a Tutor’s Listening to the Community and Academy.” Have We Arrived? Revisiting and Rethinking Responsibility in Writing Center Work, a special issue of Praxis, vol. 19, no. 1, 2022. https://www.praxisuwc.com/191-valentine 

Webster, Travis. Queerly Centered: LGBTQA Writing Center Directors Navigate the Workplace. Utah State University Press, 2021. 

White, Edward M. “Language and Reality in Writing Assessment.” College Composition and Communication, vol. 41, no. 2, 1990, pp. 187-200.

Wilson, Shawn. Research is Ceremony: Indigenous Research Methods. Fernwood Publishing, 2008.