Crowdsourced Data: Accuracy, Accessibility, and Authority
Improving accessibility and data integration practices for cultural heritage crowdsourcing work
This Early Career Research Development project (RE-252344-OLS-22) is funded by the Laura
Bush 21st Century Librarian Program from the Institute of Museum and Library Services (IMLS).
Learn more about this project at the Institute of Museum & Library Services.
Victoria Van Hyning; Mason Jones; J. Bern Jordan
Actual and anticipated outcomes of this project:
- A Landscape Review of current academic and gray literature on crowdsourced data
quality and management, and on accessibility and usability testing.
- Mixed-methods survey data from cultural heritage practitioners working with
crowdsourcing projects and crowdsourced data outputs.
- 12 individualized partner reports, based on demonstrations of each partner’s website and
testing with 3-4 screen reader users, detailing the benefits, opportunities, and areas of
need within each content management system.
- A 30-page summative white paper of the findings geared toward a cultural heritage audience.
- Open Access publications in several target journals detailing the findings and
implications of this research.
Crowdsourcing is the practice of obtaining information or input into a task or project by
enlisting the services of a large number of people, either paid or unpaid, typically via the
internet. [Oxford Languages]
Accessibility is the capability of individuals to discover and make use of information, ideally
in an equitable manner. This includes access for people who use screen readers: software
that converts textual and image data into synthesized speech or braille.
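Screen readers can only voice image content that has a textual equivalent, so accessible data practices include supplying alt text for images. As a minimal illustration (a hypothetical sketch, not part of this project's tooling), the following Python snippet flags `<img>` tags in an HTML fragment that lack alt text:

```python
from html.parser import HTMLParser

class AltTextAudit(HTMLParser):
    """Collect the src of any <img> tag that lacks alt text."""
    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attr_map = dict(attrs)
            # Empty or absent alt text leaves screen reader users
            # with no information about the image.
            if not attr_map.get("alt"):
                self.missing.append(attr_map.get("src", "(no src)"))

# Hypothetical sample markup, e.g. scanned pages on a transcription site
sample = ('<p><img src="scan1.jpg" alt="Page 1 of a diary">'
          '<img src="scan2.jpg"></p>')
audit = AltTextAudit()
audit.feed(sample)
print(audit.missing)  # → ['scan2.jpg']
```

Automated checks like this catch only missing alt text, not whether the text is meaningful; that is one reason this project pairs them with testing by screen reader users.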
- Victoria Anne Van Hyning and Mason A. Jones, “Data’s Destinations: Three Case Studies in
Crowdsourced Transcription Data Management and Dissemination,” Startwords, no. 2
(December 1, 2021), https://doi.org/10.5281/zenodo.5750691.