Page history last edited by Megan 11 years, 7 months ago

Journal Club Digest 2010-2011

The following report summarizes the journal club's meetings from its first year.

September 2010
Dill, E. (2008). Do clickers improve library instruction? Lock in your answers now. The Journal of Academic Librarianship, 34(6), 527-529.
Facilitator: Lorie Kloda
Liked
The study employed some good design elements, for example, randomization.
Needed Improvement
The article was brief, and did not include information about the learning objectives, the instruction session or the assessment tool used. These would have been useful for librarians interested in emulating the study.
Bottom Line
If the clickers are low cost and make the session more enjoyable to the students, they may still be a worthwhile investment.

October 2010
Antelman, K. (2004). Do open-access articles have a greater research impact? College & Research Libraries, 65(5), 372-382.
Facilitator: Megan Fitzgibbons
Liked
This article offers a concrete methodology, though imperfect, for addressing the complex relationship between OA publishing and research impact (i.e., as measured by citations).

Needed Improvement
Many of the limitations of the data weren’t addressed. At points, the author implied a cause-and-effect relationship between OA publishing and citations that wasn’t proven by the data. Also, the different types of OA (green, gold, depositing, etc.) are conflated, obscuring the results.
Bottom Line
The article’s age limits its applicability now. Without causation proven, it can’t be used on its own as an argument to encourage open access practices, but it can still open conversations and dispel misconceptions about how and why people share their research through OA.

November 2010
Cawthorne, J. E. (2010). Leading from the middle of the organization: An examination of shared leadership in academic libraries. The Journal of Academic Librarianship, 36(2), 151-157.
Facilitator: Genevieve Gore
Liked
The article provided an example of how a theoretical framework about leadership practices could be tested in a library setting by using a survey methodology.
Needed Improvement
The description of the data analysis methods was unclear and at times contradictory. In addition, the quantitative results were not presented uniformly, nor did the figures have appropriate labels. It was also unclear whether a quantitative study was in fact an appropriate approach to address the question of the extent of shared leadership in library management.
Bottom Line
The study does not address whether organizations are shifting from hierarchical models to shared leadership models, so the article's application to librarians' practice is minimal.


January 2011
Bowman, L. L., Levine, L. E., Waite, B. M., & Gendron, M. (2010). Can students really multitask? An experimental study of instant messaging while reading. Computers & Education, 54(4), 927-931.
Facilitators: Lindsey Sikora & Graham Lavender
Liked
Written by researchers in psychology and information systems, this study used a true experimental design, which is not particularly common in LIS literature.
Needed Improvement
The execution of the design was found to have some weaknesses, such as the use of an “unauthentic” situation with regard to time limits and cognitive tasks. Also, the study did not ultimately address the idea of multitasking, but instead considered the ability of students to switch tasks.
Bottom Line
The topic addressed in the paper prompted interesting discussion on the implications of students’ study behaviour in terms of how they seek information. Research on task switching is also highly relevant to librarians’ own work practices, and this article pointed to further directions that could be pursued in the area.


February 2011
Belliston, C. J., Howland, J. L., & Roberts, B. C. (2007). Undergraduate use of federated searching: A survey of preferences and perceptions of value-added functionality. College & Research Libraries, 68(6), 472-486.
Facilitators: Vincci Lui & Jan Sandink
Liked
The article provided several examples of methodologies that could be applied in assessing the perennially important question of how federated search tools are used.
Needed Improvement

The study was lacking in many areas, particularly in population selection and data gathering. It tried to do too much at once and as a result was neither well designed nor well executed. This is unfortunate, as some of the study's questions, most notably regarding differences in citation quality, had great potential but were not answered clearly enough to inspire confidence.

Bottom Line

The study does open up many possibilities for further research. In addition to knowing whether students are satisfied with federated search tools, it is perhaps even more important to understand how they use and interact with such tools. Such insight would allow librarians to offer and teach federated searching in a way that appeals to students and assists them in their research. It would also provide guidance for developing evaluative criteria relevant to various user needs, and not just librarians' perceived needs, when selecting or renewing a federated search tool for an institution.



March 2011
Cordell, R. M., & Fisher, L. F. (2010). Reference questions as an authentic assessment of information literacy. Reference Services Review, 38(3), 474-481.
Facilitators: Katherine Hanz & Jessica Lange
Liked
The authors took an innovative approach to assessing information literacy by judging the complexity level of their reference queries.
Needed Improvement
Unfortunately, the methodological weaknesses made it difficult to draw conclusions about the long-term effectiveness of the information literacy sessions. In addition, it would have been beneficial for the authors to share their instructional activities in more detail.
Bottom Line
The taxonomy created by Cordell and Fisher is original and potentially very useful to other librarians who are looking for a method of categorizing students' questions according to a knowledge hierarchy. Otherwise, the conclusions of the study are not strong enough to make definite links between information literacy efforts and lifelong learning.


April 2011
Eldredge, J. D., Carr, R., Broudy, D., & Voorhees, R. E. (2008). The effect of training on question formulation among public health practitioners: Results from a randomized controlled trial. Journal of the Medical Library Association, 96(4), 299-309.
Facilitator: Jill Boruff
Liked
This was a very well designed study, and the journal club found its design to be an excellent model for those undertaking similar research.
Needed Improvement
The paper would have benefited from a description of the instruction that took place, so librarians working with similar populations could adapt it for their own use.
Bottom Line
Though the results are not statistically significant, they do suggest the importance and effectiveness of instruction in evidence-based medicine.


May 2011
Anderson, K., & May, F. A. (2010). Does the method of instruction matter? An experimental examination of information literacy instruction in the online, blended, and face-to-face classrooms. Journal of Academic Librarianship, 36(6), 495-500.
Facilitator: Brian McMillan
Liked
The study administered a pre-test and a post-test (5 weeks later) to three different groups to determine a cause-and-effect relationship between instruction type and level of learning.
Needed Improvement
The pre- and post-test, which was included as an appendix in the publication, would have benefited from better design, as pre-test scores were already high.
Bottom Line
This study provides a useful design for comparing instruction methods, but much thought is required in the preparation of assessment materials. The study does not provide evidence that one method of instruction is superior to any other, but rather that students entering university already demonstrate some basic information literacy skills.


June 2011
Swanson, T. A., & Green, J. (2011). Why we are not Google: Lessons from a library web site usability study. The Journal of Academic Librarianship, 37(3), 222-229. doi: 10.1016/j.acalib.2011.02.014
Facilitator: Julie Jones
Liked
The methodology of the study was very clearly described and could easily be replicated. The usability test instrument is included and seemed to represent authentic tasks for undergraduate library users.
Needed Improvement
One of the research questions involved a comparison between past and present usability studies. However, the earlier study was not described, and thus readers cannot evaluate whether the conclusions drawn with regard to the comparison were supported.
Bottom Line
The study provided solid evidence that a single-search style of federated searching is not necessarily the most effective solution for improving access to library resources.


July 2011
Hayslett, M. M., & Wildemuth, B. M. (2004). Pixels or pencils? The relative effectiveness of Web-based versus paper surveys. Library & Information Science Research, 26(1), 73-93. doi: 10.1016/j.lisr.2003.11.005
Facilitator: Megan Fitzgibbons
Liked
This meeting's discussion focused more generally on survey design, specifically best practices and methodological considerations. The assigned article served to prompt this larger discussion.
Needed Improvement
Participants at the meeting were skeptical that the same results would be found if the study were replicated today, since web-based surveys are much more common now than when the research was conducted in 1999. Sampling bias and the inability to assess response rate were discussed as limitations of web-based surveys.
Bottom Line
Additional readings that reiterate best practices are posted on the wiki. It was noted that researchers often use surveys when another method (e.g., focus groups) would be more appropriate.


August 2011
Arndt, T. S. (2010). Reference service without the desk. Reference Services Review, 36(1), 71-80.
Facilitator: Susan Murray
Liked
This study demonstrated that with proper training of front-line staff, an on-call reference service can increase the number of questions referred to librarians compared to having a separate reference desk.
Needed Improvement
This study took place at a small institution. Combined with some methodological weaknesses, the findings are not necessarily transferable to other settings.
Bottom Line
The study presents useful suggestions for the types of data that can be collected when evaluating a change in any public service. However, having unbiased information about all reference transactions (including those that may already take place away from the reference desk) is essential for evaluating the effectiveness of a new model.
