Addressing "Documentation Debt" in Machine Learning: A Retrospective Datasheet for BookCorpus

Published: 29 Jul 2021, Last Modified: 20 Oct 2024
NeurIPS 2021 Datasets and Benchmarks Track (Round 1)
Keywords: bookcorpus, datasheet, dataset, documentation, data, text
TL;DR: A datasheet that provides documentation for the popular (yet heretofore fairly mysterious) BookCorpus dataset, which helped train Google's BERT models and OpenAI's GPT-N models.
Abstract: This paper contributes a formal case study in retrospective dataset documentation and pinpoints several problems with the influential BookCorpus dataset. Recent work has underscored the importance of dataset documentation in machine learning research, including by addressing "documentation debt" for datasets that have been used widely but documented sparsely. BookCorpus is one such dataset. Researchers have used BookCorpus to train OpenAI's GPT-N models and Google's BERT models, but little to no documentation exists about the dataset's motivation, composition, collection process, etc. We offer a retrospective datasheet with key context and information about BookCorpus, including several notable deficiencies. In particular, we find evidence that (1) BookCorpus violates copyright restrictions for many books, (2) BookCorpus contains thousands of duplicated books, and (3) BookCorpus exhibits significant skews in genre representation. We also find hints of other potential deficiencies that call for future research, such as lopsided author contributions. While more work remains, this initial effort to provide a datasheet for BookCorpus offers a cautionary case study and adds to growing literature that urges more careful, systematic documentation of machine learning datasets.
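For readers curious how duplicated books like those reported in finding (2) can be surfaced, the sketch below groups plain-text book files by a content hash. This is a generic, minimal approach and not necessarily the authors' actual method (see the linked repository for that); the directory name `books_txt_files` and the `*.txt` layout are hypothetical assumptions.

```python
import hashlib
from collections import defaultdict
from pathlib import Path


def find_duplicate_books(corpus_dir: str) -> dict:
    """Group plain-text book files by the SHA-256 hash of their contents.

    Any hash mapped to more than one path indicates exact-duplicate books.
    Near-duplicates (e.g., differing only in whitespace) would need fuzzier
    matching and are not caught by this sketch.
    """
    groups = defaultdict(list)
    for path in Path(corpus_dir).rglob("*.txt"):
        digest = hashlib.sha256(path.read_bytes()).hexdigest()
        groups[digest].append(path)
    # Keep only hashes that occur more than once, i.e., exact duplicates.
    return {h: paths for h, paths in groups.items() if len(paths) > 1}


if __name__ == "__main__":
    # "books_txt_files" is a hypothetical corpus directory, not a real path.
    duplicates = find_duplicate_books("books_txt_files")
    print(f"Found {len(duplicates)} sets of exact-duplicate books")
```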
Supplementary Material: zip
URL: https://github.com/jackbandy/bookcorpus-datasheet
Community Implementations: [1 code implementation (CatalyzeX)](https://www.catalyzex.com/paper/addressing-documentation-debt-in-machine/code)