This is a joint post by Tom Richardson of BookNet Canada and Brian O’Leary of Magellan Media Partners.
The BISG just released a report on “Development, Use, and Modification of Book Product Metadata” that was jointly sponsored by BookNet Canada. Its purpose was to track what happens to metadata: how it’s created, how it’s transformed as it moves through the supply chain, where it works, and where the problems are. It’s an ambitious agenda, and one that Brian O’Leary of Magellan Media Partners met head-on, achieving a remarkable synthesis of the issues and clear recommendations that need to be discussed.
A cynic might dismiss the whole system: Too many publishers produce metadata poorly, forcing receivers to change and manipulate the data using proprietary definitions and standards, and the standards change too quickly to allow new players to accommodate them.
Any way you look at it, the business case (the ROI) founders for those who try to make it: publishers see their expensive data go unused or, worse, get changed, and receivers can try to educate but lack the resources to do it. Even so, there is plenty of evidence that metadata sells, so the industry must rise to the metadata challenge.
Brian has created a remarkable report that is not written by or for cynics: it clearly details the problems and grounds them in interviews and feedback from the industry in both Canada and the US. He doesn’t sanitize the problems; he breaks them down and proposes solutions that can work, covering how and when data should be checked, how to create feedback loops, and the need for transparency.
What struck me in reading it is that the Canadian publishing industry can already provide case studies for some of the things he proposes. An obvious one is BNC BiblioShare’s File Quality reports, which give automated feedback on files submitted to it, but there are even better ones: BNC CataList and 49th Shelf are two data receivers that rigorously apply the standards as they are written. More than that, they use publisher data largely unchanged. And because they do, the business case in Canada has changed: publishing companies see ROI in metadata.
Brian’s report is a blueprint for doing this on a broader scale and should be read widely. In the guest post below, he gives us some highlights from the report. Canadian BNC stakeholders should email us for a free copy of the executive summary and exclusive pricing on the full report.
Since January, BookNet Canada (BNC) has been working with the U.S.-based Book Industry Study Group (BISG) to study the development, use and modification of metadata across the book industry supply chain. With BNC serving as the project’s coordinating partner, BISG hired us to conduct and report on the results of the research, which covered both the United States and Canada.
The study included in-depth interviews with 30 supply-chain participants as well as an online survey that gathered 125 additional responses. The project builds upon prior best-practice work done by BISG, NISO and OCLC.
The interviews uncovered several consistent themes, including:
- Supplier (publisher) concerns with added or modified metadata
- Recipient reports that metadata quality still needs work
- Separate feeds for physical and digital products
- Significant “forking” in using the ONIX standard
- A slow start for ONIX 3.0 migration in the U.S.
As a result of our research, we were able to make two sets of recommendations: process recommendations, such as having senders develop an internal feedback loop by comparing prepared metadata to the actual product; and “future-proofing” recommendations, including discontinuing the use of style tags, which many recipients do not support.
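To make the feedback-loop idea concrete, here is a minimal sketch assuming a simple key-value representation of a title’s metadata. The field names and sample values are hypothetical and not drawn from the report; the point is simply that a sender can catch drift between the prepared feed and the finished product before the feed goes out.

```python
# A hypothetical sketch of an internal feedback loop: compare the metadata
# prepared for a title against the finished product and flag mismatches.
# Field names and values below are illustrative only.

def compare_metadata(prepared: dict, actual: dict) -> list:
    """Return the fields where the prepared feed disagrees with the finished product."""
    discrepancies = []
    for field, sent_value in prepared.items():
        actual_value = actual.get(field)
        if actual_value is not None and actual_value != sent_value:
            discrepancies.append(
                f"{field}: feed says {sent_value!r}, product says {actual_value!r}"
            )
    return discrepancies

# Example: the page count and on-sale date drifted after the feed was prepared.
prepared_feed = {"page_count": 320, "on_sale_date": "2012-09-15", "price_cad": 24.95}
finished_book = {"page_count": 336, "on_sale_date": "2012-10-01", "price_cad": 24.95}

for issue in compare_metadata(prepared_feed, finished_book):
    print(issue)
```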
The research also identified three categories of metadata that may benefit from additional review. These categories include:
- Additional metadata elements (39 were suggested)
- Discarded or modified elements (BISAC codes are mentioned most frequently)
- Problematic uses of metadata elements (potentially differing definitions)
Those who suggested additional elements are not necessarily looking for a new field. Of the 39 suggestions, the majority (23) were ideas that could help better market physical and digital products. In general, requests reflected an interest in data not effectively supported by the respondents’ current ONIX 2.1 data feeds. Sources confirm that ONIX 2.1 can support many of the marketing-related data requests, though changes to sender systems and practices would likely be needed.
Our research found that little data is truly discarded, but it can be set aside. We found instances in which data is not used, or it is not used fully, as is the case when a recipient chooses a subset of BISAC codes to roll up into a retailing category. We also found instances in which data is modified. Age ranges and styled text were mentioned frequently as examples here.
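As an illustration of that roll-up behaviour, here is a small, hypothetical sketch in which a recipient maps incoming BISAC codes onto its own retail categories and sets aside the codes it doesn’t use. The mapping and category names are invented for the example; real retailers use their own schemes.

```python
# A recipient keeps only the BISAC codes it can map to its own store
# categories; the rest are set aside rather than truly discarded.
# The mapping below is invented for this sketch.
RETAIL_CATEGORY_MAP = {
    "FIC022000": "Mystery & Suspense",  # FICTION / Mystery & Detective / General
    "FIC027000": "Romance",             # FICTION / Romance / General
    "CKB000000": "Cooking",             # COOKING / General
}

def roll_up(bisac_codes):
    """Map incoming BISAC codes onto store categories; return the unused codes too."""
    categories = {RETAIL_CATEGORY_MAP[c] for c in bisac_codes if c in RETAIL_CATEGORY_MAP}
    set_aside = [c for c in bisac_codes if c not in RETAIL_CATEGORY_MAP]
    return categories, set_aside

categories, set_aside = roll_up(["FIC022000", "FIC031000", "CKB000000"])
print(categories)  # categories the recipient actually displays
print(set_aside)   # codes received but set aside, not truly discarded
```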
Finally, the research uncovered 16 instances in which senders and receivers seem to be interpreting data requirements differently. The most frequently mentioned examples include page count, age range, and on-sale and pub dates, as well as several aspects of territorial rights.
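One way to picture that interpretation gap is the on-sale versus pub date case: a receiver that ignores or never receives the on-sale date and falls back to the pub date effectively puts a title on sale early. The sketch below is hypothetical, with illustrative field names and dates, not an example taken from the report.

```python
# A hypothetical illustration of differing interpretations of on-sale vs pub date.
from datetime import date

record = {
    "pub_date": date(2012, 9, 1),       # announced publication date
    "on_sale_date": date(2012, 10, 2),  # date the title may actually be sold
}

def effective_on_sale(rec):
    """What a receiver treats as the on-sale date, falling back to pub date if absent."""
    return rec.get("on_sale_date") or rec["pub_date"]

print(effective_on_sale(record))                            # honours the on-sale date
print(effective_on_sale({"pub_date": record["pub_date"]}))  # fallback reads a month early
```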
The final report, which runs 37 pages, is based on the findings obtained through 30 interviews and a survey of 125 supply-chain participants, supplemented with our analysis of the strengths and opportunities evident in the existing book industry supply chain. It is being made available through BookNet Canada in both PDF and EPUB 2 formats.
In addition to BookNet Canada, project sponsors included: Disney Publishing Worldwide, Hachette Book Group, John Wiley & Sons and Random House (Silver Sponsors); Readerlink, Sensible Solutions and Sourcebooks (Supporting Sponsors); and Safari Books Online and Sally Dedecker Enterprises (Friend Sponsors).
Brian O’Leary is founder and principal at Magellan Media Partners. To read more from Brian on the book industry, visit the Magellan Media blog.
For a free copy of the executive summary and exclusive pricing on the full report—or just to talk about metadata—contact us.