Presentation season at UNCG is wrapping up for the business classes requiring those big, pesky team research projects. The final set of Export Odyssey presentations in International Marketing is this afternoon, followed by the final team presentations in our new Entrepreneurship Living Learning Community flagship class, ENT 130. Those freshmen are planning the relaunch of an on-campus store as a student-run venture.
I’ve enjoyed helping grade the presentations in those classes. It’s always valuable to see what students end up doing with the resources, strategies, and sources they (hopefully) learn about through their research workshops. Observing final presentations, like looking over final written reports, can be a useful form of assessment even if you don’t end up with quantifiable data afterwards.
So I examined the new information literacy special issue of the Journal of Business & Finance Librarianship with interest. The issue begins with a literature review, “Business Information Literacy: A Synthesis for Best Practices” by Ann Fiegen of California State University San Marcos. Ann’s article ends with a list of “Best Practices for Business Information Literacy Instruction Derived from the Literature, 1980-2009.” Many of the best practices should elicit immediate head-nods from teaching librarians who have been around the block a few times. One example: “Schedule in-class library instruction just after project start date and provide adequate time for project completion.”
I question another best practice, though: “Plan your MBA graduate instruction to the higher order information literacy competencies. MBA students are independent learners and knowledgeable evaluators. They are interested in the more advanced features of research tools.” MBA students are often working adults with families whose classes are centered on case studies. I haven’t seen many research-intensive MBA classes, at least at UNCG. Meanwhile, there are plenty of undergraduates and students in other business school graduate programs taking research-intensive classes that demand strong growth in info lit skills.
Another article in the special issue is “Course-Integrated Information Literacy Instruction in Introduction to Accounting” by Anne Kelly, Teresa Williams, Brad Matthies and J. Burdeane Orris of Butler University. I’m jealous of the required freshman class the Butler librarians describe. All UNCG business school freshmen take BUS 105, “Introduction to Business Skills Development,” which includes a class session I teach, but the real objective of this class is to orient new students to university life, not to teach business principles. So its research requirements are light. (Entrepreneurship and CARS students get into research early in required classes, at least.) More on this article in a minute.
“Broad Focus, Narrow Focus: A Look at Information Literacy Across a School of Business and Within a Capstone Course” by Diane Campbell of Rider University was the other article I’ve looked at closely. Diane discusses ENT 348, a feasibility class, and ENT 410, a business plan class. We have a similar pairing here.
Both the Butler and Rider articles describe multiple choice surveys used by the librarians to assess information literacy. Their survey questions fall pretty evenly into two categories:
- Research strategies
- Research sources or tools
For example, asking about industry classification systems or the purpose of trade magazines falls under research strategies; asking about the best database to cover topic X or what the library research guides are called (answer: Libguides) would fall under research sources or tools.
I find the questions the librarians chose for their surveys very interesting. Both libraries certainly had to think hard about what to ask. (Diane notes that “It goes without saying that these five questions leave many types of business information and many BIL skills still to be addressed” (p. 319).) It’s a very different thought process than designing graded exercises or worksheets intended to assess the narrow learning outcomes of a single class.
I wonder, though — if we really hope to develop lifelong learning in our students, should we ask questions about tools and sources, like specific vendor databases, that are probably not accessible to graduates?
Certainly we want students to become familiar with our business databases. (Five years ago or so a marketing major who went on to get a master’s degree in CARS once exclaimed to me and her classmates “I love Mintel!” It brought a tear to my eye.) But our field is one where the research students do for class projects is very similar to the research they will have to do as entrepreneurs, marketing managers, or investors. This is unlike some other majors around campus, for whom research means searching for scholarly books and articles to use in a paper — a type of research most students never need to do again after graduation unless they go to grad school. In addition to questions about business research strategies, the many free resources students can use after graduation (Census data, SEC filings, BLS data, etc.) would make good targets for information literacy survey questions. If we want to test for long-term learning, would it be more useful to ask students about those types of sources rather than about subscription databases?
Now I should be honest and admit that I’ve never tried an info lit assessment with statistical analysis like the Butler and Rider librarians have done. A big thank you to them for sharing their planning and survey results with the rest of us.