New releases of the technology have advanced it into a true enabling platform for both operational and analytical MDM. My view of the state of the technology is that many vendors in the marketplace initially went to market with an application meant to solve a specific niche problem.
Those were the vendors with a CDI product, or the ones with a PIM-specific product. They are realizing that the market is evolving, and that cross-domain data management is what people want. Customers want to start with CDI and then move on to a product domain, a supplier domain, and so on. So the platforms now need to be able to support those domains together.
And the reality is that in most MDM implementations, it's not as if I can bring in an application and immediately it's in place and up and running. Because of the nature of the business rules and business processes associated with any MDM implementation, customization is needed within that framework.
So the platforms really need to be flexible enough to map to the specific business processes and business rules that a given company needs. I think that within the market there is some confusion about whether to have a master hub or the federated type, what you would call a registry style. Vendors who provide databases and a data platform naturally believe you need to have that data in a centrally located place.
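To make the hub-versus-registry distinction concrete, here is a minimal sketch. All names and data structures are hypothetical illustrations, not taken from any specific product: a registry stores only cross-references to source-system keys and assembles records on the fly, while a centralized hub stores the consolidated record itself.

```python
# Registry (federated) style: the hub keeps only cross-references (keys),
# and the attribute data stays in the source systems.
registry = {
    "master-001": [("CRM", "c-42"), ("ERP", "e-9")],  # master key -> source keys
}

# Centralized hub style: the hub stores the consolidated "golden record" itself.
hub = {
    "master-001": {"name": "Acme Corp", "country": "US"},
}

def resolve_registry(master_id, sources):
    """Assemble a record on the fly by fetching from each source system."""
    record = {}
    for system, local_id in registry[master_id]:
        record.update(sources[system](local_id))
    return record

# Stand-ins for live source-system lookups (would be queries or service calls).
sources = {
    "CRM": lambda _id: {"name": "Acme Corp"},
    "ERP": lambda _id: {"country": "US"},
}

print(resolve_registry("master-001", sources))  # assembled virtually
print(hub["master-001"])                        # read from the central store
```

The trade-off the speaker alludes to falls out of the sketch: the registry style avoids physically moving data but depends on the source systems at read time, while the hub style centralizes the data, which is the approach database vendors naturally favor.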
Let me see if I can simplify it even more. I see three major components in an MDM environment. The first one is the application. You touched on it: you have got to have some application, whether it's customer data integration or product information management, that manages the data coming in and out.
The second piece is the database, without a doubt. If you are really going to have a full-blown MDM environment, you are ultimately going to store the data somewhere, along with the metadata that goes with it, and that means a database is required.
But in between the application and the database is what I call MDM services, and there is a whole lot in those services. There is data quality. There is ETL processing. There is business rules management. There is delivery of master data to wherever it needs to go. There is archiving.
There are all kinds of services, and they can be virtual or physical. We may deliver the master data virtually, as opposed to physically moving it into some other environment.
I think many of the applications didn't really supply that layer, or they pieced it together in some way, shape, or form. And many of them don't have their own DBMS; they certainly don't supply the database. But I think all three of those major components need to be part of a master data management environment.
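The three components described above can be sketched in code. This is an illustrative outline under assumed names (`CustomerApp`, `MDMServices`, `MasterStore` are hypothetical), showing an application passing data through a services layer of quality and rules processing before it lands in the store:

```python
class MasterStore:
    """The database component: persists master records and metadata."""
    def __init__(self):
        self.records = {}

    def save(self, key, record):
        self.records[key] = record

class MDMServices:
    """The middle layer: data quality, business rules, and delivery."""
    def standardize(self, record):
        # A simple data-quality step: trim and normalize case on strings.
        return {k: v.strip().title() if isinstance(v, str) else v
                for k, v in record.items()}

    def apply_rules(self, record):
        # A stand-in business rule: every record must carry a country value.
        record.setdefault("country", "UNKNOWN")
        return record

class CustomerApp:
    """The application component: manages data coming in and out."""
    def __init__(self, services, store):
        self.services, self.store = services, store

    def ingest(self, key, raw):
        record = self.services.apply_rules(self.services.standardize(raw))
        self.store.save(key, record)
        return record

app = CustomerApp(MDMServices(), MasterStore())
print(app.ingest("c-1", {"name": "  acme corp "}))
# e.g. {'name': 'Acme Corp', 'country': 'UNKNOWN'}
```

The point of the layering, as the speaker describes it, is that the application never writes raw data directly to the database; everything flows through the services layer, which is also where a platform could delegate to external tools instead of its own modules.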
And then, going back to the services component, I think a key part of a solution is identifying whether data quality is an issue. If a customer already has something like Trillium or FirstLogic and is already leveraging it in their environment, then the MDM provider doesn't have to bring in a data quality module.
If I have the ability, through web services or other methods, to leverage those existing technologies in my environment, then there is a real benefit. So it's not always about having all the pieces in one package; it's about being able to call out, within those workflow and business processes, to the specific technologies that add value in the MDM environment, such as a data quality provider, or being able to call out to another database and bring that data back into the workflow. I think that's a critical component.
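The plug-in idea above can be sketched as a workflow step that delegates to whichever data-quality provider the customer already runs. This is a hedged sketch: the interfaces, the `endpoint` parameter, and the simulated remote call are all hypothetical stand-ins for what would in practice be a web-service call to an existing tool such as Trillium.

```python
from typing import Callable

# A provider is just a callable that cleanses a record; in a real system this
# might wrap a SOAP or REST call to an external data-quality service.
CleanseFn = Callable[[dict], dict]

def builtin_cleanse(record: dict) -> dict:
    """Fallback data-quality module shipped with the MDM platform."""
    return {k: v.strip() if isinstance(v, str) else v for k, v in record.items()}

def make_remote_cleanse(endpoint: str) -> CleanseFn:
    """Wrap an external service; simulated here instead of a real HTTP call."""
    def cleanse(record: dict) -> dict:
        # A real implementation would POST `record` to `endpoint`.
        cleaned = builtin_cleanse(record)
        cleaned["_cleansed_by"] = endpoint
        return cleaned
    return cleanse

def workflow(record: dict, cleanse: CleanseFn) -> dict:
    """The MDM workflow step delegates to whichever provider is wired in."""
    return cleanse(record)

# If the customer already runs a quality tool, plug it in; otherwise default.
print(workflow({"name": " Acme "}, make_remote_cleanse("https://dq.example/api")))
print(workflow({"name": " Acme "}, builtin_cleanse))
```

The design choice mirrors the speaker's point: the workflow depends only on the callable's shape, so an existing Trillium-style service and a bundled module are interchangeable without changing the business process.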