Managing into Mastering

William L. Weaver
Published in TL;DR Innovation · Apr 17, 2018 · 5 min read

The Rise of Master Data Integration

Among the many benefits of teaching in the School of Arts & Sciences is the opportunity to observe the interplay between these two academic traditions. Systems theory describes an artistic ability as being able to appreciate the differences among a collection of things that appear similar, and a scientific ability as being able to appreciate the similarities among a collection of things that appear different. Instances of this dichotomy appear throughout our organizations in the form of Scientists & Engineers, Research & Development, and Analysis & Synthesis. Regardless of your preferred orientation, Right-Brained or Left-Brained, we all recognize the powerful pattern that uses differentiation to understand the parts, followed by integration to form an understanding of the whole.


This pattern was particularly evident at the 2011 International Consumer Electronics Show, where exhibitors introduced a slew of smartphones, touch-screen tablets, and 3D televisions, all vying for popularity and adoption while simultaneously touting their differentiation from competitors and sporting integration of the latest technology and standards. At a higher level, the information sector is also witnessing an integration cycle that has us returning to centralized computing in the form of the “cloud.” Back when speed and capacity required large amounts of matter and space, centralized IT was king and users submitted computational requests via slow terminals. The ensuing evolution toward ever faster speeds and higher capacities in shrinking devices led to decentralized IT in the form of personal hard drives and portable memory, and along with it an imperative to manage all of the disparate data.

Beyond the need to store and locate the voluminous information generated by an organization’s various sectors, such as research, development, production, management, marketing, sales, procurement, and regulatory affairs, lies the requirement to keep that information consistent and accurate across the entire organization. This Single Version of the Truth, or SVOT, is an ideal that permits each sector to access the correct information and to contribute back to the SVOT through error corrections or by providing additional information about an entity. In this way, the purpose of IT evolves from the managing task of collecting, storing, and retrieving information in relational databases into the facilitation of organizational learning, a process that involves the integration of differential information into knowledge.

The business sectors of our organizations have come to rely on customer data to enhance their ability to innovate new products and satisfy future demand. Master Data Management (MDM) that provides a “single customer view” across separate business units using solutions from multiple vendors has recently been analyzed by the information technology research and advisory firm Gartner, Inc. Gartner’s Magic Quadrant for Master Data Management of Customer Data, published in October 2010, summarizes the inability of Customer Relationship Management (CRM), Enterprise Resource Planning (ERP), and independent vertical-industry systems to solve the problem of inconsistent master data.[1]

While MDM seeks to provide a single view of the data, there are currently four different approaches to the solution. In the Consolidation style, the Master Data is authored in the source systems and then copied into a central “hub,” where it is processed, analyzed, and ultimately amalgamated into a single “golden” copy of the truth. The golden copy continues to be updated while the source systems continue to operate independently. The Registry style of MDM does not copy the source data but instead creates a registry of pointers to it; in response to a query, the different versions of the truth are assembled into a point-in-time composite view. The Centralized style creates a central repository of all the Master Data, a “hub” that permits access and authoring via “spoke” applications in a collaborative environment. This style requires the most system redesign for existing installations but provides the most control over the Master Data. Finally, the Coexistence style recognizes that Master Data may be authored and stored in different systems across a heterogeneous, distributed environment and is a hybrid of the previous three styles. This style most closely resembles “cloud computing,” as it is difficult to determine the actual location of the golden copy of the Master Data at any point in time. Gartner points to Oracle, IBM, and Informatica as leaders in the development and deployment of Master Data Management and integration systems.
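To make the contrast concrete, here is a minimal sketch of the two simplest styles, using hypothetical source systems, record keys, and merge rules invented purely for illustration. The Consolidation function copies source records into the hub and merges them into one golden record; the Registry function stores only pointers and assembles a point-in-time composite view on demand.

```python
from datetime import datetime, timezone

# Hypothetical source systems (names, keys, and fields invented for
# illustration), each holding its own version of the same customer.
crm = {"cust-42": {"name": "Acme Corp.", "phone": "555-0100", "region": None}}
erp = {"cust-42": {"name": "ACME Corporation", "phone": None, "region": "NA"}}
sources = {"crm": crm, "erp": erp}

def consolidate(customer_id):
    """Consolidation style: copy each source record into the hub and merge
    them into one golden record (here, first non-empty value wins)."""
    golden = {}
    for system in sources.values():
        for field, value in system.get(customer_id, {}).items():
            if value is not None and field not in golden:
                golden[field] = value
    return golden

# Registry style: the hub holds only pointers to where the data lives.
registry = {"cust-42": [("crm", "cust-42"), ("erp", "cust-42")]}

def composite_view(customer_id):
    """Assemble a point-in-time composite view by following the pointers;
    the source data is fetched live rather than copied."""
    view = {"assembled_at": datetime.now(timezone.utc).isoformat()}
    for system_name, key in registry[customer_id]:
        view[system_name] = sources[system_name][key]
    return view

print(consolidate("cust-42"))     # {'name': 'Acme Corp.', 'phone': '555-0100', 'region': 'NA'}
print(composite_view("cust-42"))  # per-source versions stitched together on demand
```

The Centralized and Coexistence styles differ mainly in where the data is authored and stored rather than in the query mechanics, so they are omitted from this sketch.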

However, these corporations are most often associated with the business side of our organizations. The research laboratory faces the same challenge of maintaining a single version of the truth: integrating the information residing in the Laboratory Information Management System (LIMS), Electronic Laboratory Notebooks (ELNs), and the various instruments and data-collection points throughout the laboratory. Analogous to their CRM/ERP business counterparts, LIMS/ELN systems too often get the management of data correct but remain inadequate when it comes to integrating that information into the big picture.

The Institute for Laboratory Automation has put forth a research proposal titled The Integration of Laboratory Systems. The proposal seeks to examine one existing solution and to determine how its success may be implemented on a broader basis to provide smoother laboratory workflow, an easier path to meeting regulatory requirements, reduced cost of development and support, less duplication of records, and greater flexibility to upgrade for changing requirements. All of the recent activity around cloud computing and Master Data Management, together with the proliferation of broadband networks and communication standards, points to an integration phase that may be the best chance in decades to achieve the automated laboratory envisioned when personal computers were first introduced into the laboratory.

[1] John Radcliffe, “Magic Quadrant for Master Data Management of Customer Data,” Gartner RAS Core Research Note G00206031, 4 October 2010.

________

This material originally appeared as a Contributed Editorial in Scientific Computing January/February 2011, pg. 14.

William L. Weaver is an Associate Professor in the Department of Integrated Science, Business, and Technology at La Salle University in Philadelphia, PA, USA. He holds a B.S. degree with double majors in Chemistry and Physics and earned his Ph.D. in Analytical Chemistry with expertise in ultrafast laser spectroscopy. He teaches, writes, and speaks on the application of Systems Thinking to the development of new products and innovation.
