ONIX 3.0: Schema getting stricter

Getting ready for ONIX 3.0? You should take note of these recent and upcoming changes to schema validation.

Anyone who produces ONIX files should be aware that ONIX 3.0 is designed for validation by schema. We’ve previously promoted schema validation for ONIX 2.1 to help pave the way for this, but mostly we ask because schema validation finds real data problems. There are some links below if you don’t know anything about schema validation, but simply put, it ensures your ONIX file is loadable and readable in other companies’ systems: it’s the minimum acceptable standard for trading data.

You might not know that the ONIX 3.0 schema is already stricter than the one you’re used to for 2.1. Try it for yourself: put a page return in SubjectHeadingText (b070) and watch your 3.0 file fail validation. That’s useful because data aggregators and retailers shouldn’t have to deal with page returns in the middle of keyword lists, so data senders should clean them up as part of the transition to 3.0. With the recent release of codelists Issue 30, EDItEUR has added newly updated “strict” versions of the schema to help you get ready. There are the standard “formal” schemas, plus two marked as “strict” that will eventually become the norm.
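If you’d like to script that check, here’s a minimal sketch using Python and lxml (our choice here, not a requirement; any schema-aware XML tool will do the job). The schema and file names are placeholders: point them at the strict XSD from the Issue 30 package and at your own ONIX 3.0 output.

```python
# Minimal schema-validation sketch. The file names are placeholders:
# substitute the strict XSD from EDItEUR's Issue 30 package and your own
# ONIX 3.0 output.
from lxml import etree

schema = etree.XMLSchema(etree.parse("ONIX_BookProduct_3.0_strict.xsd"))
doc = etree.parse("my_onix_30_file.xml")

if schema.validate(doc):
    print("Valid against the strict schema.")
else:
    # Each log entry gives the line number and the rule that was broken,
    # e.g. a stray page return inside <SubjectHeadingText> (b070).
    for error in schema.error_log:
        print(f"Line {error.line}: {error.message}")
```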

If you have an ONIX 3.0 file, you should test it against either of the strict versions. Start by reviewing this document (PDF) to see whether what’s in your database can pass these stricter measures. Some elements are being restricted to integers, positive numbers, or real numbers. Here’s a quick tip: measurement values and price amounts of “0” won’t pass. They have always been poor practice, and soon they’ll be less than the minimum acceptable standard.
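Before you send anything, a quick scan for those zeroes can save a round trip. Here’s a rough sketch, again in Python with lxml; it assumes ONIX 3.0 reference tag names (<Measurement>, <PriceAmount>), so adjust it if your files use short tags.

```python
# Rough pre-check for zero measurement and price values, which the strict
# schema will reject. Assumes reference tag names; the file name is a
# placeholder for your own ONIX 3.0 output.
from lxml import etree

doc = etree.parse("my_onix_30_file.xml")

for element in doc.iter():
    if not isinstance(element.tag, str):
        continue  # skip comments and processing instructions
    name = etree.QName(element).localname  # ignore any ONIX namespace
    if name in ("Measurement", "PriceAmount"):
        value = (element.text or "").strip()
        try:
            if float(value) == 0:
                print(f"Line {element.sourceline}: {name} is {value!r}")
        except ValueError:
            pass  # non-numeric content will be caught by schema validation
```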

We’ll be posting more about this, because BiblioShare will adopt the strict schema once it becomes EDItEUR’s formal standard. If you want to know more about schema validation, there’s an exhaustive and exhausting document here.