Libraries and the CCSSE, Part II: Efforts to Change the CCSSE

In my last post, I provided some background about the Community College Survey of Student Engagement (CCSSE), and the troubling fact that libraries, librarians, and information literacy are not mentioned at all on the 80-item instrument.

I first realized the omission thanks to a 2012 post by Troy Swanson, who notes that libraries “are all about engagement.” Although Swanson was part of a Community and Junior College Section task force that lobbied CCSSE and proposed four library questions in 2009, CCSSE declined to adopt any of them.

Fired up, I joined my campus Data Team, a committee of faculty and staff charged with discussing institutional research findings and sharing them with the rest of the campus community. That first year, Data Team members were as shocked as I was to learn that the library didn’t come up once on the CCSSE. The following year, I chaired the Data Team and had the opportunity to really sink my teeth into CCSSE items, benchmarks, and findings. I wanted to understand the CCSSE well enough to lobby for change.

I had a brief ray of hope when I started attending Institutional Effectiveness Council meetings and brought the issue up, but our Director of Institutional Research soon brought me up short. He explained apologetically that adding custom items to the CCSSE is tremendously expensive and, in the end, wouldn’t tell us much anyway, because we wouldn’t be able to compare those results to other colleges’ data.

At least one community college has added its own custom questions. Glendale Community College, which I've long admired for its work assessing short- and long-term student success outcomes of library instruction, added three custom questions about library usage when it administered the CCSSE in 2011.

Students answered each of the following items on a Likert scale (Frequently, Sometimes, Rarely, Never):

  • In the current semester, how often have you checked out a book from the campus library?
  • In the current semester, how often have you used the electronic resources (Online Journals, Magazines, E-Books, Ask-A-Librarian 24/7 Chat, etc.)?
  • How often do you access campus library resources (Online Journals, Magazines, E-Books, Ask-A-Librarian 24/7 Chat, etc.) from a location other than the campus library?

While these questions yield interesting descriptive data (e.g., “62.7% never checked out a book from the campus library”), the drawback of adding custom questions – besides the cost – is that without other colleges’ data, we can’t draw correlations between these items and student engagement measures. So we wouldn’t be able to tell, for example, if colleges where students use electronic library resources in greater numbers have higher measures of student engagement.
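To make the missing piece concrete: if cohort-level data pairing a library-use item with an engagement benchmark did exist, testing for the correlation described above would be straightforward. Here is a minimal sketch in Python; every number in it is invented for illustration, and the "Student Effort" pairing is just one hypothetical example.

```python
# Hypothetical sketch: given per-college pairs of (library e-resource use,
# engagement benchmark score), a correlation coefficient shows whether the
# two move together across colleges. All figures below are invented.
from math import sqrt

# Percent of students frequently using electronic library resources (invented)
library_use = [38.0, 45.0, 52.0, 61.0, 47.0, 55.0]
# Corresponding "Student Effort" benchmark scores (invented)
student_effort = [48.2, 50.1, 51.9, 54.3, 50.8, 52.5]

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

r = pearson_r(library_use, student_effort)
print(f"r = {r:.3f}")
```

A correlation alone wouldn't prove the library *causes* engagement, of course, but it is exactly the kind of evidence a single college asking a custom question can never produce on its own.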

Our colleagues at the National Council for Learning Resources (NCLR), an affiliate council of the American Association of Community Colleges, have had some success proposing questions for the National Survey of Student Engagement (NSSE), CCSSE’s counterpart at 4-year universities.

I recently spoke with Dr. David Wright, the chair of NCLR, and he shared with me the questions they wrote for NSSE’s topical module on information literacy. Students are asked about the frequency of eight actions relating to research and information-seeking behaviors, about how often instructors emphasize five different critical thinking skills relating to research, and a final question on how much the students’ experience at their institution contributed to their “knowledge, skills, and personal development in using information effectively.”

Can we do something similar with the CCSSE? If CCSSE rebuffed librarians once, and adding questions is prohibitively expensive and not necessarily informative for an individual institution, what recourse do librarians have for changing the CCSSE?

It turns out there is some good news. Last March I attended the League for Innovation in the Community College Innovations conference and spoke with Emmet Campos, CCSSE’s High-Impact Practices Project Coordinator, on the exhibits floor. Dr. Campos told me that they were planning on revamping the CCSSE in the next six months to a year, so there may be an opportunity for change soon.

When I followed up with Dr. Campos upon my return, he put me in touch with Catherine Cunningham and Katie Mitchell, CCSSE college liaisons who are working to refresh the instrument with the development of an information literacy module. Catherine described their current work in an email to me:

“[We] are currently working on a review of the literature on information literacy in order to start the process of developing a module on questions surrounding information literacy and student engagement. We are working to find questions that will point to practices that have been shown to increase student success and student engagement as it relates to information literacy, be it in the library or in the classroom. We are not sure what those questions will look like yet, but we are definitely in the process of reviewing the literature and beginning to formulate some ideas around these topics. If you have any resources you’d like to point us toward, we’re happy to review any research you might be able to point us towards.”

I immediately sent back the Glendale Community College data I linked to above, as well as ACRL’s Value of Academic Libraries report. While it doesn’t seem as though we’re at the stage to propose potential questions like NCLR did for the NSSE, this is a great opportunity for librarians to get in on the ground floor and point the CCSSE researchers toward studies correlating information literacy and library use with measures of student success and student engagement.

To share your input, please consider contacting Catherine Cunningham and Katie Mitchell at cunningham@cccse.org and mitchell@ccsse.org.

I’m sure I’ve missed some important work other librarians are doing on this front. Have you or a colleague worked on researching or proposing questions for the CCSSE, the NSSE, or another instrument? If so, please leave a note in the comments section to let us know.

If you’re interested in learning more or getting involved, feel free to email me at jane.stimpson@sjcd.edu. With interest in an information literacy module building, and the success of librarians on NCLR with adding NSSE items, this could be the year libraries finally get some representation on the CCSSE.

Libraries and the CCSSE, Part I: What is the CCSSE, and Why do We Care?

If you work at one of the 718 colleges that administered the Community College Survey of Student Engagement (CCSSE) in 2013, you may have noticed something missing: The library. Students answered 80 questions about their experiences with campus resources and services, but the word “library” or “librarian” didn’t show up once.

Librarians, myself included, are understandably upset about the omission. I joined my campus's Data Team, a committee that liaises between administration and faculty to share institutional data and their implications, in order to learn more about the CCSSE and bring the issue to light on my campus.

I’ll write about the efforts of librarians to get representation on the CCSSE in my next post. For now, a little background on the CCSSE instrument: What it seeks to measure, and what, specifically, it asks students.

The CCSSE is administered every other year during the spring semester. It’s designed to measure student engagement on community college campuses.

Why student engagement? Research finds that the more engaged students are with their college, with the people there and with what they're learning, "the more likely they are to persist in their college studies and to achieve at higher levels." So CCSSE data can help administrators and institutional researchers determine where students are making connections on campus ("sticky points," as we call them at my college) that help them stay in classes and succeed in their coursework.

CCSSE items are grouped into five benchmarks, representing different areas of student engagement: Active and collaborative learning, student effort, academic challenge, student-faculty interaction, and support for learners. A CCSSE validation study has correlated those benchmarks with various student outcomes, including GPA, retention, and persistence. For more detail, check out the validation study or take a look at this table I created for a Data Team presentation to my campus to map those correlations.

Librarians care deeply about student success, and we very much want good data on how library resources and services engage students and contribute to that success. The library's absence from the CCSSE is an enormous oversight that needs to be rectified.

What specifically does the CCSSE ask students? There are 80 questions total, but I’ve highlighted some of the items from the 2013 instrument that may be of particular interest to librarians, or seem to be particularly well-suited for the addition of the library.

  • Item 4: “In your experiences at this college during the current school year, about how often have you done each of the following?”

There are 21 actions under this item, and students rate each one on a Likert scale: Never, Sometimes, Often, or Very often.

Action 4d asks students how often they have "worked on a paper or project that required integrating ideas or information from various sources." This item seems to be the closest the CCSSE comes to asking students about research skills. I don't think it necessarily refers to traditional research papers (the sources could be course readings provided by the instructors, for example), but synthesizing information from sources into a finished product is a skill librarians teach.

This item falls under the “Student Effort” benchmark, so why not go a step further and ask how often students sought help from a library staff member on a paper or project?

An engaged student makes the effort to seek help from available resources on campus; library staff are one of them, so we should be on the CCSSE.

  • Item 12: “How much has your experience at this college contributed to your knowledge, skills, and personal development in the following areas?”

There are 15 different areas under this item, and students rate each one on a Likert scale: Very little, Some, Quite a bit, or Very much.

The areas that most closely align with the skills students develop when they use the library and research might be areas 12e (“Thinking critically and analytically”), 12g (“Using computing and information technology”), or even 12i (“Learning effectively on your own”).

Research skills and information literacy do not show up on the list at all, but this is another area where the library could be added without much effort. CCSSE could add new areas, or substitute existing ones, such as "Evaluating information sources to determine their credibility" or "Researching a topic using specialized search tools like library catalogs or databases."

  • Items 13.1 – 13.3: These items ask students about their use and perception of various college services:
    • Item 13.1: “How often do you use the following services at this college?” Answered on a Likert scale: Rarely/Never, Sometimes, or Often
    • Item 13.2: “How satisfied are you with the following services at this college?” Answered on a Likert scale: Not at all, Somewhat, Very
    • Item 13.3 asks, “How important are the following services to you at this college?” Answered on a Likert scale: Not at all, Somewhat, Very

There are 11 services listed in these items, and I include the complete list here to note the range of services students are asked to consider:

    • Academic advising/planning
    • Career counseling
    • Job placement assistance
    • Peer or other tutoring
    • Skill labs (writing, math, etc.)
    • Child care
    • Financial aid advising
    • Computer lab
    • Student organizations
    • Transfer credit assistance
    • Services to students with disabilities

I know libraries aren’t considered a student service in the same way counseling and advising are, but some of these services overlap with the library. My library was recently renovated according to a Learning Commons model, so all of our peer tutoring areas (two of which are a Writing Center and a Math Lab) are in the library. I’m sure some students use the library as a computer lab as well; we probably have the second-largest number of computers on campus, after the “real” computer lab one building over. When computer labs and skill labs and tutoring centers are located in the library, where do we draw boundaries between using a lab and using the library? And do our students draw the same boundaries? What would it take to add “Reference Services” or “Library” to this list?

Not surprisingly, it takes a lot of effort and a lot of time to change the CCSSE to incorporate the library. While individual colleges can add items, that can’t be a permanent solution, for a couple of reasons.

First, it is very expensive to add items, prohibitively so for many colleges. Second, and even more important, we can’t tell much if we’re the only college in the cohort asking the question. Without benchmarks and other colleges’ data, we can’t tell how we stack up against other institutions. If 33% of my college’s students report having consulted a reference librarian for a project or paper, I can’t know whether that’s more or less than at other colleges if they’re not asking the same thing.
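The 33% figure shows why a cohort benchmark matters. If other colleges asked the same question, a college could test whether its rate genuinely differs from the cohort's. The sketch below uses a standard two-proportion z-test; the sample sizes and the 28% cohort rate are invented purely for illustration.

```python
# Hypothetical sketch: does my college's 33% "consulted a reference
# librarian" rate differ from an (invented) 28% cohort rate? A standard
# two-proportion z-test answers that, but only if both rates exist.
from math import erf, sqrt

def two_proportion_z(x1, n1, x2, n2):
    """Two-proportion z-test; returns (z statistic, two-sided p-value)."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# My college: 132 of 400 respondents (33%); cohort: 5,600 of 20,000 (28%).
# All four numbers are invented for this example.
z, p = two_proportion_z(132, 400, 5600, 20000)
print(f"z = {z:.2f}, p = {p:.4f}")
```

With these invented numbers the difference would be statistically detectable; without the cohort side of the comparison, the local 33% is just a number floating in space.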

So if we want the CCSSE to incorporate questions about the library, we’re going to need the entire instrument to be redesigned to accommodate them. Luckily, that redesign is coming, and librarians have a chance to get involved. I’ll describe past efforts and future opportunities for input on the CCSSE in my next post.

For more information on CCSSE in the meantime, start here, or contact the institutional research office on your campus.

Two Year Talk is Seeking Contributors

Two Year Talk is currently seeking contributors. Do you have something to say about two-year college libraries, but no personal blog? Do you find that you don’t have the time to maintain a personal blog, but might like to contribute a post once in a while? We encourage those who are interested to sign up as blog authors. If you would like to read our submission guidelines, please see our about page.

The URL for the blog is http://twoyeartalk.wordpress.com/. To be set up as a blog author, go to http://wordpress.com/ and create an account.  Then send your WordPress username or e-mail address here. When you submit a post for publication, it will go into “Pending” status until an editor reviews it and “Publishes” it. We have a roster of volunteer editors in charge of proofreading and publishing the posts on weekdays.

There is also a Two Year Talk Google Group, which works as a forum to discuss issues, concerns, topics, etc.  To join the Google Group, contact this email address.

The mission of Two Year Talk: A Blog for Librarians at Two-Year Colleges is to communicate news, promote discussion of relevant issues and serve as a visible development vehicle for two-year college libraries.

Give the People What They Want

We’ve just passed the mid-term point, and the rush of in-class IL sessions has subsided. This term I experimented with a concept I read about in a 2012 College & Research Libraries News article, “Google Spreadsheets and real-time assessment: Instant feedback for library instruction,” in which the author, Shannon R. Simpson, reported on using Google Forms to shape her classes while they were actually taking place. I liked the idea of making sessions more relevant to students, rather than trying to guess what they would need. So, in a couple of my sessions this term, I implemented a simplified version of this technique, with mixed results.

For one 200-level writing class, the instructor told me ahead of time that students were having difficulty finding peer-reviewed resources, even though it was something she knew they’d all had to do in their 100-level writing courses. For that session, I created a quick survey (using Google Forms, though you could probably use other tools as well) that asked students what their topic was, plus an open-ended question about what they saw as their biggest problem in finding the peer-reviewed resources they needed. I placed a link to the survey in the online research guide for the class and asked students to fill it out in the first four minutes or so of class. I then opened the Google spreadsheet collecting the results, showed it to the whole class, and ran the IL session largely around the problems they reported. In this scenario, the technique worked great. I was able to address conceptual problems they identified in the survey during the lecture and demonstration part of the session, as well as identify individual students to help with more specific problems during the practice portion. The session felt more relevant and engaging to the students, and the instructor sent me a couple of follow-up comments indicating that the students found it useful as well.

The second time I used it was in a 100-level writing class, where the IL session focused on evaluating web sources for an evaluative essay (lots of evaluation going on!). Again, I asked students for their topics (I always like to use their topics as examples in my lecture and demonstration), and I asked a closed-ended question about the criteria they were developing for their evaluation. In retrospect (though I should have realized it beforehand), the closed-ended question was a bad idea: it didn’t relate closely to the information literacy outcomes of the session, so it did little to direct my instruction. I pretty much ended up giving my normal web-resource evaluation lecture, with practice time in the second half. I may have fallen prey to the temptation of incorporating a new (to me) technological technique just because I could, not because it was really needed.

I will definitely try this technique again in the future, though, with these lessons learned from my very brief experience:

  • Think about the questions to ask, and make sure they align with the session topic and outcomes
  • Avoid closed-ended questions, as they don’t provide enough direction on where to take the session
  • Keep it short – 2-3 questions
  • Have a backup for if/when the technology fails