
Dossier: Disruptive Technologies

The WCO Data Model: guidance for efficient implementation

12 October 2022
By Alejandro Rinaldi, CEO, Customs-hub.com

A fundamental requirement for information to flow seamlessly across different IT systems is that they all speak the same language. While cross-border regulatory agencies retain the right to determine the data they require, they should all use a similar language and harmonize the way in which that data is submitted. This is also critical for enabling economic operators to use the same system to comply with the requirements of different administrations and countries.

Such a common language exists: the WCO Data Model (DM). Developed and maintained through the WCO Data Model Projects Team (DMPT), the Model is a compilation of clearly structured, harmonized, standardized and reusable sets of data definitions and electronic messages, to meet the operational and legal requirements of cross-border regulatory agencies, including Customs[1].

The WCO Data Model and service-oriented architecture

The steps to be taken by Customs administrations which are considering the adoption of the WCO DM have been described in an article published in March 2022 in this magazine[2].

The article explains that they should:

  • identify the areas for implementation (e.g. Imports, Exports, Transit, Cargo report, Manifest, Authorized Economic Operators, Origin, Phytosanitary, Food safety, Animal health, Endangered species, Environment, Cultural goods);
  • identify the data requirements of the selected process and harmonize them;
  • map the list of national (or regional) data requirements to the WCO DM and develop a “My Information Package” (MyIP); and
  • implement the MyIP in the IT system to ensure that the system can receive and/or produce data that complies with the WCO DM technical specifications.

Customs processes are complex, and it is very unlikely that a single IT solution will efficiently deal with all their aspects and complications. The same is true of many business processes. For this reason, more and more administrations and companies are developing IT systems based on a service architecture, or service-oriented architecture (SOA), an approach that assembles computer solutions from existing services (self-contained units of software) and applications.

One of the great advantages of a service architecture is its modularity: developers can build a technological solution step by step, considering the system part by part and component by component. Other advantages include high scalability (the system can easily be adapted to an increased workload or to market demands), ease of adjustment, and lower implementation costs compared with other IT system development methods.

SOA facilitates the building of robust solutions by enabling specific and specialized solutions to be developed separately to respond to the needs and constraints of processes and procedures. Components related to risk management, Single Window, the AEO programme, tariff classification or inspections, for example, can be designed and built individually and separately, using the best technology and the best service provider.

But this brings with it the great challenge of ensuring compatibility and interoperability between so many components and providers. Adoption of the WCO Data Model is therefore fundamental here too, and this is why the upcoming Version 4.0 of the Model will provide for the use of standards and syntaxes used in systems based on SOA.

We advise anyone implementing the WCO Data Model to use SOA and the standards supporting it, such as OpenAPI and JSON, which are described in the following paragraphs.

API and Version 4.0

Another trend to have emerged alongside SOA is the use of application programming interfaces (APIs) as an alternative to electronic data interchange (EDI) for document exchange. Both methods allow data to be exchanged quickly and securely from system to system.

With EDI, computer systems are able to understand the information exchanged because each party uses the standard EDI document format. EDI data is stored and then transmitted, so there may be some limitations to real-time access and responsiveness. Moreover, EDI documents do not transmit updated information, but a sequential version of the same document: it is up to the receiver to parse the document, compare it to the prior version and pass any changes detected to the database.

An API is a set of programming instructions and standards for accessing web-based software applications, allowing software platforms to communicate with each other. Unlike EDI, there are no predefined standard business document formats for API-based data exchange: API transactions use JSON, XML, YAML and other data-serialization formats, which are generic rather than business-specific like EDI document formats. Moreover, unlike EDI, APIs enable real-time data exchange, transferring data in less than a second.
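By way of illustration, the minimal Python sketch below pushes a JSON-serialized declaration fragment to a hypothetical REST endpoint. The endpoint, field names and token are illustrative only; they are not taken from any actual Customs system or from the WCO Data Model specifications.

    import requests  # widely used HTTP client; any HTTP library would do

    # A declaration fragment serialized as JSON. The field names below are
    # purely illustrative and are not WCO Data Model element names.
    declaration = {
        "declarationId": "BR-2022-000123",
        "declarationType": "EX",  # hypothetical code for an export declaration
        "totalGrossMass": {"value": 1250.5, "unit": "KGM"},
    }

    # Unlike a batch EDI transfer, the payload is pushed to the receiving
    # system immediately and the response (acceptance, errors, etc.) comes
    # back within the same call.
    response = requests.post(
        "https://customs.example.org/api/v1/declarations",  # hypothetical endpoint
        json=declaration,
        headers={"Authorization": "Bearer <token>"},
        timeout=10,
    )
    response.raise_for_status()
    print(response.json())  # e.g. {"status": "accepted", "reference": "..."}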

The DMPT is well aware of such technological changes, and Version 4.0 of the WCO Data Model will include guidelines for the use of OpenAPI and JSON syntaxes, which are considered the basic pillars of a modern SOA. OpenAPI is an initiative which aims to provide a new way to describe electronic services in a form that is agnostic of the programming language or of the technology supporting those services. It has become the lingua franca of the API world, making it possible to build services which are technologically independent from one another but remain compatible and interoperable. JSON, in turn, provides a lightweight data interchange format that is easily readable by machines as well as by humans, and that has been devised to support very large volumes of information.
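To give an idea of what such a service description looks like, the sketch below builds a minimal OpenAPI 3.0 document for the hypothetical declaration service used above and serializes it as JSON. The paths and schema are illustrative and do not reproduce the actual WCO Data Model information packages.

    import json

    # A minimal OpenAPI 3.0 description of a hypothetical declaration service.
    openapi_doc = {
        "openapi": "3.0.3",
        "info": {"title": "Declaration service (illustrative)", "version": "1.0.0"},
        "paths": {
            "/declarations": {
                "post": {
                    "summary": "Lodge a declaration",
                    "requestBody": {
                        "required": True,
                        "content": {
                            "application/json": {
                                "schema": {"$ref": "#/components/schemas/Declaration"}
                            }
                        },
                    },
                    "responses": {"201": {"description": "Declaration accepted"}},
                }
            }
        },
        "components": {
            "schemas": {
                "Declaration": {
                    "type": "object",
                    "properties": {
                        "declarationId": {"type": "string"},
                        "totalGrossMass": {"type": "number"},
                    },
                }
            }
        },
    }

    # Because the description itself is plain JSON, any consumer - whatever its
    # programming language - can generate a client or validate messages from it.
    print(json.dumps(openapi_doc, indent=2))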

By adopting SOA, OpenAPI and JSON syntaxes, Customs and other regulatory agencies will be able to build their platforms component by component, with the WCO Data Model as the universal language between those components.

It is already possible to implement the Data Model using such syntaxes as part of an SOA. By way of example, Brazil’s Single Window implemented the WCO DM using OpenAPI and JSON, and currently supports 2 million Export Declarations and 2.4 million Import Declarations per year. Including OpenAPI and JSON as supported syntaxes in the WCO DM keeps the Model technologically current while preserving its technological agnosticism.

The electronic message templates of the WCO Data Model currently use UN/EDIFACT data formats (GOVCBR), as well as XML, a language used to create and exchange structured data. The use of OpenAPI and JSON does not make XML obsolete. Both standards supplement XML and XML schemas, and offer new options for the implementation of XML specifications. Electronic messages built in XML are fully compatible with any system that uses OpenAPI/JSON, and vice versa.
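The short sketch below illustrates this interoperability by re-expressing the same (illustrative) message fragment, first written in XML, as JSON. The element names merely imitate the Model's naming style and are not the official definitions.

    import json
    import xml.etree.ElementTree as ET

    # The same (illustrative) message fragment expressed in XML...
    xml_message = """
    <Declaration>
      <ID>BR-2022-000123</ID>
      <TotalGrossMassMeasure unitCode="KGM">1250.5</TotalGrossMassMeasure>
    </Declaration>
    """

    root = ET.fromstring(xml_message)
    mass = root.find("TotalGrossMassMeasure")

    # ...can be re-expressed as JSON for a component that consumes OpenAPI/JSON.
    json_message = {
        "Declaration": {
            "ID": root.findtext("ID"),
            "TotalGrossMassMeasure": {
                "unitCode": mass.get("unitCode"),
                "value": float(mass.text),
            },
        }
    }
    print(json.dumps(json_message, indent=2))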

Conformity framework

The WCO Data Model is a universal language, agnostic of technology and providers. When developing a system using the Model, each entity involved and each provider must be able to confirm that the components of the system speak this universal language perfectly.

To facilitate this work, a key tool, the “Conformity Framework”, has been developed. It clearly defines the WCO Data Model’s technical specifications, removing any ambiguities in order to ensure that solutions are compliant with the Model. This tool enables Customs authorities, government agencies, financial entities and donors lacking expertise in the WCO Data Model to request compliance with the “Conformity Framework” as a basic criterion for accepting proposals and granting contracts.

The WCO Data Model is a toolbox containing interrelated components: information models, codes for international standards, harmonized data sets, and business process models. Many systems in use around the world have been in place for years, which means that, in the majority of instances, it would be almost impossible for a party deciding to adopt the WCO Data Model to be fully compliant with all its elements without building from the ground up. On top of this, many implementers will also face challenges arising from the need to integrate with legacy applications and business processes, as these need to continue to be supported.

Consequently, the Conformity Framework is a graduated system which provides for four degrees of conformity with the Model:

  • Level 1: each data element in the message uses the WCO data element name and format representation.
  • Level 2: each data element in the message meets the Level 1 criteria, and its structure conforms with the structure of the WCO Data Model UML class diagram.
  • Level 3: meets the Level 2 criteria and uses WCO-recommended code lists.
  • Level 4: meets the Level 3 criteria, and is based on a message format supported by the WCO Data Model, e.g. EDIFACT GOVCBR, XML, OpenAPI/JSON.
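To make the levels more concrete, the sketch below shows what a Level 1-style check might look like: every element of a message is verified against a data dictionary of names and formats. The dictionary entries are illustrative only and do not reproduce the actual WCO data sets; higher levels would additionally check the class structure, the code lists and the message format.

    # Illustrative dictionary of element names and formats (not the WCO data sets).
    ILLUSTRATIVE_DICTIONARY = {
        "DeclarationID": {"type": str, "max_length": 35},
        "TotalGrossMassMeasure": {"type": float},
        "InvoiceAmount": {"type": float},
    }

    def check_level_1(message: dict) -> list[str]:
        """Return a list of findings; an empty list means the check passes."""
        findings = []
        for name, value in message.items():
            spec = ILLUSTRATIVE_DICTIONARY.get(name)
            if spec is None:
                findings.append(f"'{name}' is not a recognized data element name")
                continue
            if not isinstance(value, spec["type"]):
                findings.append(f"'{name}' does not match the expected format")
            elif "max_length" in spec and len(value) > spec["max_length"]:
                findings.append(f"'{name}' exceeds the maximum length")
        return findings

    print(check_level_1({"DeclarationID": "BR-2022-000123", "GrossWeight": 1250.5}))
    # -> ["'GrossWeight' is not a recognized data element name"]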

The “Conformity Framework” should be considered a quality assurance certificate which guarantees the long-term sustainability of the system being built. The investment required to build Customs IT solutions is significant, and the notion of sustainability is key.

“My Information Package”: two options for effective mapping

When developing a “My Information Package”, Customs administrations must map the list of national or regional data requirements against the WCO Data Model data sets. There are two ways to do this.

The first option is to do so process by process, focusing on each piece of data to find its counterpart in the Model. When building its Single Window for Foreign Trade (VUCE), Costa Rica applied this method to define the information that the importer must provide to enable the government agencies involved in the clearance process to carry out their duties. Costa Rica’s MyIP is estimated to cover 129 processes and includes 250 data elements of the WCO Data Model.

The second option is to create two interfaces: one to encapsulate the data components which currently exist in a Customs system, and another to translate that data into WCO Data Model data. This allows Customs administrations to implement the Data Model while giving them more time to decide whether to adjust or rebuild their IT system, or to conclude that the system does not need to be rebuilt at all, only made compliant with the WCO standard. This option was used in Uruguay, including by private sector companies wishing to align their management systems with the WCO Data Model and especially with the regional MyIP titled “MODDA”, developed by Mercosur (a Customs union between Argentina, Brazil, Paraguay, Uruguay and Venezuela).
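A minimal sketch of this second option is shown below: a thin translation layer exposes the data of an existing system under WCO-aligned names, without touching the underlying system itself. Both the legacy field names and the target names are hypothetical.

    # Hypothetical mapping from legacy field names to WCO-aligned names.
    LEGACY_TO_WCO = {
        "decl_no":      "DeclarationID",
        "gross_wgt_kg": "TotalGrossMassMeasure",
        "exporter_nm":  "ExporterName",
    }

    def to_wco(legacy_record: dict) -> dict:
        """Translate a legacy record into a WCO-aligned structure,
        leaving the underlying Customs system untouched."""
        return {wco_name: legacy_record[legacy_name]
                for legacy_name, wco_name in LEGACY_TO_WCO.items()
                if legacy_name in legacy_record}

    legacy = {"decl_no": "UY-2022-0456", "gross_wgt_kg": 980.0, "exporter_nm": "ACME S.A."}
    print(to_wco(legacy))
    # -> {'DeclarationID': 'UY-2022-0456', 'TotalGrossMassMeasure': 980.0, 'ExporterName': 'ACME S.A.'}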

Perspectives

In a service-oriented architecture, software components are called services. Each service provides a business capability, and services can communicate with each other across platforms and languages thanks to standardized syntaxes and data structures.

In this article, we argued for the implementation of the WCO Data Model using an SOA as well as OpenAPI and JSON. In such a configuration, the different components in an IT solution may ultimately be considered as services. So far, the WCO Data Model has focused on defining a harmonized data dictionary to be used by border regulatory agencies. In the future, it could include services as a sort of extension to the Model. The possible services applicable to the data represented by the WCO Data Model are endless. They include services for receiving declarations from the private sector, as well as services allowing those declarations to be amended and corrected.

Going one step further, the various business rules applicable to a procedure or process could themselves become services. Business rules are directives that define (or constrain) business activities and provide the foundation for automation systems by translating documented or undocumented information into conditional statements. Regulatory authorities define not only the data required, but also a set of business rules applicable to that data. These rules range from checks on the information provided, to verification that data uses valid codes, to pattern rules, such as the pattern a text must match to be a valid e-mail address. A risk assessment module, for instance, could be seen as a service that returns a risk value for a transaction.
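The sketch below gives a flavour of what such rule services could look like: a pattern rule for e-mail addresses and a stub risk assessment service. The pattern, the field names and the scoring logic are purely illustrative.

    import re

    # Pattern rule: the text must have the shape of an e-mail address.
    EMAIL_PATTERN = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

    def validate_email(value: str) -> bool:
        return bool(EMAIL_PATTERN.match(value))

    def assess_risk(declaration: dict) -> float:
        """Risk service stub: return a risk value for a transaction.
        A real module would combine many rules, profiles and data sources."""
        score = 0.0
        if declaration.get("originCountry") in {"XX"}:  # hypothetical watch-list code
            score += 0.5
        if declaration.get("invoiceAmount", 0) > 1_000_000:
            score += 0.3
        return min(score, 1.0)

    print(validate_email("trader@example.com"))                       # True
    print(assess_risk({"originCountry": "XX", "invoiceAmount": 50}))  # 0.5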

These rules could supplement the MyIP or even be part of it, enabling anyone who must submit a declaration to be aware of the required data and the corresponding rules. Making the rules part of the WCO Data Model would bring harmonization to a new level and enable further cooperation between regulatory agencies. This would turn the WCO Data Model into an SOA services specification for cross-border regulatory purposes. We strongly believe that this is where the future of the Model lies.

More information
info@customs-hub.com

[1] See the article in WCO News: https://mag.wcoomd.org/magazine/wco-news-97-issue-1-2022/making-digital-collaboration-possible-the-wco-data-model-latest-developments-and-implementation-guidance/

[2] Idem