For the past few days, I have temporarily placed my primary hat of librarianship on the rack and taken up the Stetson hat of assessment to attend the 2012 SACS Annual Meeting in Dallas with some colleagues. With assessment firmly on the brain, I am finally getting around to a post that I have been meaning to share for some time. My institution completed a reaccreditation site visit this past spring, concluding a lengthy, two-year self-study process. Or so one might think. The site visit does bring a sense of finality to a self-study, but there is another way to view the assessment activity that drives the self-study process.
At the initial meeting for the site visit, the following statement about the reaccreditation process was offered: “It’s not a destination; it’s a journey.” Off and on since that time, I have reflected on that statement. And I consistently end up thinking about one thing:
Anyone who has ever played a flight sim game or actually flown an airplane is familiar with the concept of the waypoint. In flight navigation, a waypoint is a specific point along a flight route that serves as a marker or guidepost to help keep you on your correct flight path. One dictionary defines a waypoint as “the co-ordinates of a specific location as defined by a GPS.” Another dictionary offers this definition: “An intermediate point on a route or line of travel.” In other words, it’s not an (ultimate) destination, but rather a (point along the) journey.
The reaccreditation site visit is typically seen as the “capstone” event of a self-study process, bringing much-needed closure to a long-suffering process that includes massive amounts of data collection and review, many sleepless nights, more meetings than you can shake a stick at, a few more sleepless nights, hours upon hours of writing, and even more sleepless nights. Our minds need some finality to the whole process. We need time to breathe. To quote Jack Nicholson from “A Few Good Men” [with some translative license]: You WANT this to be the end! You NEED this to be the end!
There is an ongoing nature to the whole process. Even after the site visit, for example, there still remain those follow-up reports in response to recommendations of the visiting team. In many ways, today’s accreditation process never really ends. It shouldn’t end. The end-game for assessment is improvement and effectiveness, and I believe we will never reach the bottom of the jar of improvement and greater effectiveness. Yes, the site visit could be seen as a singular event in time. Underneath that event, however, flows a steady stream of ongoing activity–a river of assessment. Assessment is an ongoing process of identification, collection, measurement, and review followed by a determination of what level of success is reflected in the outcomes and a plan for using what is learned to benefit and guide going forward. And that is followed by another round of identification, collection, measurement, review, and so on.
So on the return flight back to South Carolina tomorrow afternoon, my colleagues and I will be in a plane that will (hopefully) be hitting its waypoints in order to effectively reach the GSP airport. Likewise, when we get back to campus, we will be aiming for waypoints to guide us through the continual journey of assessment. Happy flying, everyone.
OK, it’s time to come out of hiding and re-enter the blogging world. It’s been an extremely busy summer, but an experience today has motivated and called me out. So here goes.
This morning I read an article on survey fatigue in The Chronicle and shared a link to it on Twitter along with another post asking the twitter-peeps if–outside of surveys–they use any creative ways of collecting feedback data. Almost immediately, I was engaged in a Twitter conversation with Ned Potter (@theREALwikiman) about a real interest in hearing how folks might respond to such a question. (Once again, evidence of the power of social connections.)
Anywho, Ned suggested that writing a blog post on the subject might help to solicit responses. And he did just that. In the post, he asks:
I’m really interested in how to get feedback – not just from students in academic libraries, but from all patrons for all types of libraries.
And later in the post:
So what are you doing to ascertain what your patrons are thinking? Is there something more reliable than surveys? And if you’re asking them via social media, how did you find out what social media platforms they used in the first place…?
I share his interest, so I ask: If people are burning out on surveys, what are some other ways of gathering feedback from those we serve? Are you using any creative/innovative ways of soliciting feedback that are working and giving you a healthy response rate?
And I, too, am thinking of libraries–those of all types–and their engagement with library patrons. But I would extend the question to areas outside libraries. Do we see non-survey feedback strategies being successfully employed in other places that could be ventured perhaps in the library environment?
So let’s hear from you! Respond to this post. Respond to Ned Potter’s post. Share your creative solutions. Yes, the irony is thick with a feedback solicitation on the topic of feedback fatigue. But, hey, it’s Friday and comic relief is good for everyone, right?
Challenging. Time-consuming. The first semester of this academic year has been…well, just that. I can’t remember a busier time in my career as a librarian since 1998/99. (That’s another “perfect storm” story altogether.) And most of what I have been entwined with recently comes from outside my typical sphere of duties. Our institution is currently involved in the re-accreditation process, and I have landed on several self-study committees either as a chair or a resource person. Anyone who has been through the re-accreditation process (this is my 2nd go-around) understands what that means.
Honestly, most of my work energies over the last 3-4 months have been devoted to something outside of the library, and I kinda miss my job. (I should also note that I lament being socially MIA on Twitter, etc. with my peeps.) Special activities like re-accreditation are beneficial and much-needed. Nevertheless, at times I feel like a school kid wandering the streets in the middle of a weekday looking over my shoulder for a truant officer. (Am I abandoning my post?) Other times, I feel the way Cinderella must have felt, left scrubbing the floors while her sisters went out to the big event. (Am I missing the fun?)
I’m ready to be a librarian again…and in more ways than one. I’m ready to get back to what I know and love best. At the same time, I have been reflecting on just what it is that I know and love best.
Perhaps one of the benefits of this time away from my normal duties has been the ability to step out of the mix somewhat and reflect. I have been doing some soul-searching, or–more precisely–some mission-searching. Actually, I’ve been reflecting on “mission,” “purpose,” and the like for about a year now. Maybe this semester was the match to throw on the charcoals that I have been soaking in lighter fluid. When I heard from some of the library staff that they had a good conversation this week about the library’s purpose and identity, I knew that I was onto something.
So here’s what we as a library staff are going to do. In January we are going to hold an informal library staff forum to talk about our library and its role in our institution and higher education in general. We will reflect on:
- Who we (the library) are.
- What we do.
- How we do it.
A family meeting, so to speak. Who knows? We may even invite the academic dean and the president. (Open communication is golden.) The plan is simple: Talk, listen, and respond and then see what happens.
New year resolution. Spring cleaning. A first step. Utter nonsense. Call it what you will. We’re going to talk and listen, and hopefully we’ll come out on the other end all the better for having done so.
Time to go. I’ve got more re-accreditation work to do before breaking for the holidays.
This post continues a series on five thoughts with which I have been tussling concerning my library. To get caught up, first go here and then here. Thought #3 is perhaps the most overarching of all, touching on every aspect of what we do. It actually has two parts:
Thought #3: (Part A) Are our current methods of assessment and evaluation effectively doing their job? And (Part B) are we using assessment and evaluation outcomes to their fullest potential?
As a library administrator, I fully understand the role and value of assessment. Apart from mandates and professional responsibilities, I appreciate assessment simply because I care. I believe that any of us with genuine concern about the impact of our efforts grasps the merit of evaluation. It is from this vantage point that I have been conducting a mental assessment (so to speak) of our assessment efforts. This includes evaluation of the resources and services provided by the library as well as evaluation of the library staff. I am wondering if we can make any changes in our assessment methods that will make the process more meaningful.
With our assessment efforts:
- Are we asking the right questions? Quite simply, are we assessing the right things? Are we leaving anything on the table? Is there something that we do or offer about which our users would be more than willing to provide feedback if we only asked? Would the library staff find greater interest and value in staff evaluations if we totally redesigned the process?
- Are we asking those questions the right way? Are we approaching our assessment efforts from the best angle to yield the best results? When assessment involves feedback from users, do we pose our questions in a way that they understand what we are asking? I love the recent blog post by Andy Burkhardt (Information Tyrannosaur) about librarians seeing the library with fresh eyes. Sometimes we need to remove ourselves from our everyday role and see what we do from a library user’s perspective. Andy offers some great suggestions on how to give it a try, including a reference to a brilliant idea posed by Brian Herzog (Swiss Army Librarian).
- Are we asking the right people? When seeking feedback concerning a particular resource/service, are we asking the people who are actually using that resource/service? Are we considering input from every possible user group (e.g. students, faculty, staff, alumni, freshmen, athletes, music majors, etc.)?
- Are we closing the loop? I’ll be honest; I have been guilty of going to great lengths to gather evaluative data only to let it collect dust. You can have the richest collection of assessment data in the universe. You can even prepare the sharpest and clearest report of evaluation findings known to mankind. But all of that means very little if you do nothing with it. Assessment for assessment’s sake generates a file of data. Assessment for the sake of improvement generates value. We must do something with that data that we collect.
I must confess that short of minor tweaks, many of our library’s assessment tools have changed very little over the past several years. When assessment is one of many tasks in a roster of duties, it is easy to just continue using the same metrics, collecting them the same way year after year. The reality, however, is that the playing field continues to change. It stands to reason that our assessment efforts must often do the same in order to remain in step and retain their relevance.
Are you trying any innovative methods of assessment that draw useful participation and feedback?
Are you conducting staff evaluations in a fresh way that is resonating with those being evaluated?
What steps are you taking to ensure that you are doing something to “close the loop” with your assessment data?
Pic credit: swannman