Master data management (MDM) projects are largely viewed as data consolidation projects: they replace separate, scattered departmental databases of key business data, such as customer information, with a single master file. Well-defined metadata clarifies what each field means, and data governance and validation rules protect the data’s integrity, with the hope that the entire organization will come to view the data uniformly.

Much of the work in a master data management project goes into creating the data: determining where the master data will come from, defining the metadata, and implementing the master data model. Implementation typically focuses on selecting tools to clean, merge, and store the master data, and the project may also define a process for updating master records.
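To make those pieces concrete, here is a minimal sketch of what a consolidated master record and its field-level metadata might look like. The `MasterCustomer` type, the field names, the steward assignments, and the validation rules are hypothetical illustrations, not the schema of any particular MDM tool.

```typescript
// A minimal, hypothetical sketch of a consolidated master customer record.
// Field names are illustrative; a real model would follow the organization's
// own governance decisions.
interface MasterCustomer {
  masterId: string;        // golden-record identifier shared by all departments
  legalName: string;
  billingAddress: string;
  countryCode: string;     // ISO 3166-1 alpha-2
  lastVerified: string;    // ISO 8601 date of the most recent validation
}

// Metadata that clarifies each field's meaning, names its steward, and
// attaches a simple integrity check.
interface FieldMetadata {
  description: string;
  steward: string;
  validate: (value: string) => boolean;
}

const customerMetadata: Record<keyof MasterCustomer, FieldMetadata> = {
  masterId:       { description: "Immutable golden-record key", steward: "Data governance", validate: v => v.length > 0 },
  legalName:      { description: "Registered legal name",       steward: "Finance",         validate: v => v.trim().length > 0 },
  billingAddress: { description: "Primary billing address",     steward: "Finance",         validate: v => v.trim().length > 0 },
  countryCode:    { description: "ISO 3166-1 alpha-2 code",     steward: "Data governance", validate: v => /^[A-Z]{2}$/.test(v) },
  lastVerified:   { description: "Date of last validation",     steward: "Data governance", validate: v => !Number.isNaN(Date.parse(v)) },
};

// Example integrity check applied when cleaning or merging incoming records.
function isValidRecord(record: MasterCustomer): boolean {
  return (Object.keys(customerMetadata) as (keyof MasterCustomer)[])
    .every(field => customerMetadata[field].validate(record[field]));
}
```

Keeping the validation rule next to each field’s description is one simple way to tie the metadata work directly to the data integrity checks the project defines.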

What’s often given short shrift is an analysis of how the business will use the master data. Or, rather, when “use” is considered, it’s considered in business terms: how the meaning of the master data’s fields will be understood and mapped to departmental needs. The question of how departments will actually access the data goes unasked.

That question of access is critical. Business units need to be able to access the data easily, integrate their own operational data with master records, and still meet their applications’ performance requirements. Without that ability, departments will quickly go back to maintaining their own local, “enhanced” copies of the master database, defeating the point of the project.
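One way to picture that integration is for a department to keep only its operational fields locally and reference master records by their identifier, enriching them at read time instead of copying the master data. The sketch below builds on the hypothetical `MasterCustomer` type above; the `SupportTicketAccount` type and the `fetchMasterCustomer` lookup are likewise illustrative assumptions, not features of any specific product.

```typescript
// Hypothetical departmental record: only operational data lives locally,
// plus a reference to the golden record via its master identifier.
interface SupportTicketAccount {
  masterCustomerId: string;         // foreign key into the master data
  supportTier: "basic" | "premium"; // department-specific operational fields
  openTickets: number;
}

// Assumed lookup against whatever access layer the MDM project exposes
// (a service call, a cloud query, a managed copy); the signature is illustrative.
declare function fetchMasterCustomer(masterId: string): Promise<MasterCustomer>;

// Combine departmental data with the master record at read time, so no
// "enhanced" local copy of the master data is ever created.
async function enrichAccount(local: SupportTicketAccount) {
  const master = await fetchMasterCustomer(local.masterCustomerId);
  return { ...master, ...local };
}
```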

Solutions for Sharing Master Data

Companies can take several approaches to ensure that their master data is easily accessible and that applications rely on it as the golden source:

  • Create a service-oriented architecture. By exposing services that let applications query and retrieve master data, the project makes it simple for client applications to pull exactly the records they need (see the sketch after this list).
  • Deploy the master data to the cloud. A cloud deployment makes the data accessible to every business unit, and placing deployments in cloud regions near those units minimizes performance concerns. Data fabrics such as NetApp’s make it easy to move data between cloud and on-premises environments.
  • Allow departments to make local copies. Outside of development and test environments, those copies should be read-only. Copy data management tools such as Veritas Velocity can help track and manage departmental copies of the data.
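As one concrete flavor of that service layer, the sketch below stands up a tiny read-only HTTP endpoint over an in-memory stand-in for the master store. Node’s built-in http module, the /customers/{id} path, and the masterCustomers map are all assumptions made for illustration; a real service-oriented architecture could just as easily use SOAP, gRPC, or a vendor’s API gateway.

```typescript
import { createServer } from "node:http";

// Illustrative in-memory stand-in for the governed master data store.
const masterCustomers = new Map<string, { masterId: string; legalName: string; countryCode: string }>([
  ["C-1001", { masterId: "C-1001", legalName: "Acme Corp", countryCode: "US" }],
]);

// A read-only endpoint: GET /customers/<masterId> returns the golden record,
// so client applications consume master data without keeping local copies.
createServer((req, res) => {
  const match = req.url?.match(/^\/customers\/([^/]+)$/);
  const record = match ? masterCustomers.get(decodeURIComponent(match[1])) : undefined;

  if (req.method !== "GET" || !match) {
    res.writeHead(404).end();
  } else if (!record) {
    res.writeHead(404, { "Content-Type": "application/json" }).end(JSON.stringify({ error: "unknown master id" }));
  } else {
    res.writeHead(200, { "Content-Type": "application/json" }).end(JSON.stringify(record));
  }
}).listen(8080);
```

A client application then retrieves a golden record with an ordinary HTTP GET (for example, curl http://localhost:8080/customers/C-1001) rather than holding its own copy.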

Beyond technical architectures that support data sharing, strong data governance policies are also required. Departmental technology teams need to understand and support the effort to maintain the master data, which means training them on the business importance of that data. Without both that understanding and an infrastructure that supports application access to the master data, the pressure for rapid development of business applications will quickly push business units back to building their own local data sets.

dcVAST offers analysis and implementation services that help businesses tackle their data management challenges, including building infrastructure and deploying tools such as Veritas and NetApp to support your master data management projects. Contact us to learn how we can help you get control of your data.