Cabaniss Usability Study

From UA Libraries Digital Services Planning and Documentation
 
*IRB approval for the Cabaniss usability study was obtained in May 2010.  
 
'''Usability Study Flier and FAQs:'''
 
*Advertising began on August 6 and ran through August 30, 2010. Ads were placed on the University Libraries' main web page, on the student information portal called MyBama, and on the Digital Collections web page. In addition, information was emailed to the History, American Studies, and Political Science Departments, which forwarded it to their graduate students and faculty members. Fliers were also displayed in the reading room, and those who work at the reference desk were told about the study and asked to encourage patrons to take a flier.
 
http://www.lib.ua.edu/wiki/digcoll/index.php/Image:Usability_Flyer_w_FAQs.pdf
 
*Response to the advertising was fairly positive. We received more volunteers than we needed, but all were contacted and invited to participate, in hopes that enough would take part to reach our goal of 20 participants. We recruited nearly 30 volunteers before taking down the ads but, due to no-shows, had to recruit a second time. We ultimately tested 21 participants, though one set of results was disqualified from analysis because the participant was unwilling to complete the test.
 
'''Usability Study Informed Consent:'''
  
http://www.lib.ua.edu/wiki/digcoll/images/1/1c/Informed_Consent_Cabaniss.docx
  
*We used the Morae software to perform the usability studies. More information about this software can be found at [http://www.techsmith.com/morae.asp http://www.techsmith.com/morae.asp].
  
*Usability testing was completed in September 2010. The study consisted of asking participants to perform a total of 8 tasks: 4 within the Cabaniss collection (http://u0003_0000252) and 4 within a similar collection that was laid out in our "traditional" fashion (The Robert Jemison Jr. Papers Collection: http://acumen.lib.ua.edu/c/u0003_0000753).
  
'''Usability Study Questions:'''
  
http://www.lib.ua.edu/wiki/digcoll/images/e/e4/Usability_Test_Questions.docx
  
After the participants performed these 8 tasks, they were asked to fill out a survey giving them the opportunity to provide feedback on their experience and their opinions about the interfaces in general and the Cabaniss collection in particular.
  
== '''Short Discussion on Results''' ==
 
The usability study and its results are detailed in an article (currently submitted to JCDL 2011) entitled "Low Cost Digitization of Manuscript Collections: Trading Usability for Access?"
 
'''One of the more interesting findings was that participants without previous digital collection experience found the finding aid interface significantly easier to use than did those who claimed familiarity with the traditional digital library interface.'''
 
Educational experience clearly improved participants' ability to locate content via the finding aid, as did a background in history or familiarity with special collections.
 
''Not surprisingly, non-native English speakers had difficulty with both interfaces.''
 
We did note differences in the apparent learnability of the two interfaces, implying that the finding aid interface may not be as learnable as the usual digital library interface (based on item-level description searching).
 
'''We determined, however, that a true test of learnability should cover a longer period of time and far more queries. '''
 
We also concluded that it is not really fair to compare the finding aid interface against the item-level interface in a short-term test. The finding aid presents far more information and context, requiring more time to read and become acquainted with the organization of the content. It is primarily a browsing interface rather than a searching one.
 
Once acquainted with a collection, however, we expect that locating appropriate content via the finding aid would be as speedy as (or speedier than) searching item-level metadata, which may or may not contain the necessary search terms.
 
'''The question arose as to where the boundary lies between testing the interfaces and testing the delivered information.'''
 
The majority of (primarily undergraduate) participants preferred the item-level description interface.  However, 40% of those with a background in history, and a third of those with a background in special collections, preferred the finding aid interface. 
 
<u>Previous research has shown that the finding aid as interface to digitized manuscript collections is preferred by scholars and experienced researchers, whereas undergraduates prefer item-level description</u> (see footnotes below). Given that our study focused on undergraduates, our results are not surprising.
 
Since delivery of digitized content via the finding aid is by far the cheaper method of digitization and web delivery, we believe this is the preferred path for future digitization of large manuscript collections. More research needs to be done on the learnability of finding aid interfaces and on modifications to those interfaces that improve navigation and terminology (replacing archival terms with user-friendly ones).
 
'''Suggested modifications to web delivery based on usability results:'''
 
# Revise current EAD terminology for the web display to exchange archivist terms for user-friendly labels. This would require further research and user testing.
# Add a “help” option explaining how to navigate the typical finding aid, for those who have never encountered one.
# Add a search-within-file feature for (at least) large finding aids (see the first sketch after this list).
# Add a sidebar navigation box for EADs similar to that used by Princeton [http://diglib.princeton.edu/ead/getEad?eadid=C0290&kw=] to facilitate location of items, particularly the dropdown list of series titles.
# Add a “recently viewed” list on the side or bottom of the screen, so the user can return to previously viewed items (see the second sketch after this list).
# Enlarge the thumbnails, as they are not large enough to really preview the item.
# Allow the user to submit a search term by hitting the Enter key instead of requiring a click on the “search” button (see the third sketch after this list).
# Add a browse option for subject headings, with a method of narrowing down a topic under a broader heading (perhaps a “search within results” option).
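As a rough illustration of item 3, here is a minimal TypeScript sketch of client-side search within a single, already loaded finding aid, hiding non-matching entries. The ".ead-entry" selector is a hypothetical placeholder, not taken from the actual Acumen markup:

<pre>
// Minimal sketch: search within a large finding aid on the client side.
// NOTE: the ".ead-entry" selector is a hypothetical placeholder.
function searchWithinFindingAid(query: string): void {
  const needle = query.trim().toLowerCase();
  const entries = document.querySelectorAll<HTMLElement>('.ead-entry');
  entries.forEach((entry) => {
    const matches =
      needle === '' || (entry.textContent ?? '').toLowerCase().includes(needle);
    entry.style.display = matches ? '' : 'none'; // hide non-matching entries
  });
}
</pre>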
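For item 5, a minimal sketch of a “recently viewed” list, assuming browser localStorage is an acceptable place to keep it between page loads (the storage key and list length are illustrative, not a specification):

<pre>
// Minimal sketch: remember the last few items the user opened.
// NOTE: the storage key and list length are assumptions for illustration.
interface ViewedItem {
  id: string;
  title: string;
}

const STORAGE_KEY = 'recentlyViewed';
const MAX_ITEMS = 5;

function recordView(item: ViewedItem): void {
  const list: ViewedItem[] = JSON.parse(localStorage.getItem(STORAGE_KEY) ?? '[]');
  // Move the current item to the front and drop any older copy of it.
  const updated = [item, ...list.filter((e) => e.id !== item.id)].slice(0, MAX_ITEMS);
  localStorage.setItem(STORAGE_KEY, JSON.stringify(updated));
}

function recentlyViewed(): ViewedItem[] {
  return JSON.parse(localStorage.getItem(STORAGE_KEY) ?? '[]');
}
</pre>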
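And for item 7, a minimal sketch of Enter-key submission that simply reuses whatever the existing “search” button already does (the element IDs are hypothetical, not taken from the actual Acumen markup):

<pre>
// Minimal sketch: submit the search when the user presses Enter.
// NOTE: the element IDs below are hypothetical placeholders.
const searchInput = document.getElementById('search-input') as HTMLInputElement | null;
const searchButton = document.getElementById('search-button') as HTMLButtonElement | null;

searchInput?.addEventListener('keydown', (event: KeyboardEvent) => {
  if (event.key === 'Enter') {
    event.preventDefault(); // suppress any default form handling
    searchButton?.click();  // trigger the existing "search" behavior
  }
});
</pre>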
 
'''Data Tables from Usability Study:'''
(Note: Kevin W. Walker performed the statistical analyses of the results.)
 
http://www.lib.ua.edu/wiki/digcoll/images/3/33/Data_Tables_from_Usability_Study.pdf
 
'''Footnotes:'''
 
Nimer, C. and Daines, J. G., III. "What do you mean it doesn't make sense? Redesigning finding aids from the user's perspective." Journal of Archival Organization 6:4 (2008), 216-232. DOI: http://dx.doi.org/10.1080/15332740802533214
 
West, T., Fesenko, K., and Brown, L. C. "Extending the Reach of Southern Sources: Proceeding to Large-Scale Digitization of Manuscript Collections." Final Grant Report for the Andrew W. Mellon Foundation, Southern Historical Collection, University Library, University of North Carolina at Chapel Hill (June 2009).
  
''Updated September 7, 2010.''
[[User:Jlderidder|Jlderidder]]
