This is the third of a series of three pieces from the inaugural issue of Nielsen’s Perspectives on Retail Technology, a publication of Nielsen’s Buy Technology group.
In the early 2000s, the maturity of enterprise software meant that IT was seen as a commodity, not a source of competitive advantage. This received wisdom has been turned on its head in the last decade: the digital economy created a huge number of business opportunities that could only be realized through differentiating software.
As the digital revolution took hold, big data blurred the data boundary of the enterprise. In theory, a haulage company should be able to combine its own fleet information with weather, traffic, satellite and competitor data, and then analyze it using the best available predictive models to optimize its fleet distribution. Some of the data and algorithms that companies need for this type of 360-degree analysis are freely available or can be bought, but the majority of the most useful data is still locked behind the firewall. While companies are keen to use third-party data for their own benefit, most are understandably unwilling to make their own commercially sensitive data available for others to use.
How can this circle be squared? Nielsen has already solved the problem for market research, acting as a broker to make sensitive retailer data available to manufacturers in a way that is commercially acceptable. It is possible that analytic marketplaces will arise to provide a generic solution to this problem, encompassing all types of data, and to allow companies to share and monetize sensitive information without undue risk.
The same analytic marketplaces also have the potential to redefine the way software is built and sold. The rise of mobile devices has led to the creation of small, single-purpose, consumerized apps. Apps make it easy to accomplish individual tasks, in marked contrast to traditional enterprise applications, such as resource planning, which attempt to automate a broad range of tasks for a diverse group of users. There is still a need to accomplish multistage, multi-part tasks in the app world, but users have become adept at assembling and choreographing multiple app interactions to do this, aided by improved app-to-app integration and tools like Zapier and IFTTT that automate workflows between apps. Monolithic enterprise software has the same potential to be consumerized and “appified”: an analytic marketplace could promote the creation of modular components that can be snapped together to create new business capabilities. And if enterprises move their IT wholesale to the cloud, traditional software vendors will need to follow, either into cloud-based software marketplaces or by providing their software as a service in the cloud.
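The “snap-together” idea can be made concrete with a minimal sketch. The component names (`clean`, `enrich`, `summarize`, `snap_together`) are hypothetical stand-ins for marketplace components, not a real product API; the point is simply that small, single-purpose pieces compose into a new capability:

```python
from functools import reduce

# Hypothetical single-purpose marketplace components.
def clean(rows):
    # Drop records with missing unit counts.
    return [r for r in rows if r.get("units") is not None]

def enrich(rows):
    # Derive revenue from units and price.
    return [{**r, "revenue": r["units"] * r["price"]} for r in rows]

def summarize(rows):
    # Aggregate to a single business figure.
    return sum(r["revenue"] for r in rows)

def snap_together(*components):
    # "Snapping together" components is just left-to-right composition.
    return lambda data: reduce(lambda acc, f: f(acc), components, data)

pipeline = snap_together(clean, enrich, summarize)
sales = [{"units": 3, "price": 2.0},
         {"units": None, "price": 5.0},
         {"units": 1, "price": 4.0}]
print(pipeline(sales))  # 10.0
```

Swapping a component for a competitor’s version leaves the rest of the pipeline untouched, which is the flexibility argument made later in this piece.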
SHARING SENSITIVE INFORMATION
The approach of building a data warehouse (or a data lake or a logical data warehouse) as the foundation of an enterprise’s business intelligence system is based on the premise that all the data an enterprise needs to analyze is freely available. In reality, the range of data of interest to a company extends into areas that its competitors, partners or customers may consider sensitive or confidential.
Conventional wisdom holds that enterprises must keep their intellectual property and commercially sensitive information confidential. There has been a recent counter-current: Tesla gave up its electric car patents, and many technology companies (such as Cloudera, Google, Facebook, Yahoo! and IBM) have open-sourced some of their proprietary software, including foundational components of the big data revolution such as Hadoop. But these companies appear to be exceptions, and there is no obvious trend towards open sourcing intellectual property outside the software industry. Most data owners have limited appetite for sharing transactional data, and no appetite for sharing their more commercially sensitive information, such as pricing and margins. On the other hand, companies realize that there are large potential benefits to combining their data with that of partners and third parties so that it can be better mined for insights. They are also increasingly aware that their enterprise data is an asset, and they would like to be able to monetize it, provided they can do so without compromising commercial confidentiality.
One conceptual solution would be for the data owner (or its agent) to provide an analytic software appliance that implements its acceptable use policy. The appliance connects to the data and allows it to be processed as certified by the owner; all other usage is disallowed. The data is encrypted at rest, so even an administrator with access to the data is only able to read it through the analytic appliance.
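The appliance concept can be sketched in a few lines. This is an illustration of the idea only: the class and function names are hypothetical, and a toy XOR cipher stands in for real at-rest encryption (in practice, something like AES). The essential properties are that the data is only readable through the appliance, and only owner-certified analyses can run:

```python
import json
from itertools import cycle

def toy_cipher(data: bytes, key: bytes) -> bytes:
    # Toy XOR cipher as a stand-in for real at-rest encryption.
    # XOR is symmetric, so the same function encrypts and decrypts.
    return bytes(b ^ k for b, k in zip(data, cycle(key)))

class AnalyticAppliance:
    """Hypothetical appliance enforcing the data owner's acceptable use policy."""
    def __init__(self, encrypted: bytes, key: bytes, certified: dict):
        self._encrypted = encrypted   # data stays encrypted at rest
        self._key = key
        self._certified = certified   # owner-certified analyses only

    def run(self, analysis: str):
        if analysis not in self._certified:
            raise PermissionError(f"'{analysis}' is not certified by the data owner")
        rows = json.loads(toy_cipher(self._encrypted, self._key))
        return self._certified[analysis](rows)

# The owner certifies one aggregate that reveals no row-level detail.
certified = {"avg_margin": lambda rows: sum(r["margin"] for r in rows) / len(rows)}
raw = json.dumps([{"sku": "A", "margin": 21}, {"sku": "B", "margin": 27}]).encode()
appliance = AnalyticAppliance(toy_cipher(raw, b"secret"), b"secret", certified)
print(appliance.run("avg_margin"))  # 24.0
```

Even an administrator holding the encrypted bytes sees only ciphertext; asking the appliance for anything other than `avg_margin` is refused.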
In principle, such an analytic appliance could be installed in the consumer’s data center. This would effectively also require the data to be copied into the data center, with all the known drawbacks of that approach. In addition, the data owner would need to certify the software for the consumer’s environment on an ongoing basis and maintain and support the appliance on-site; this would be a burden both for the data owner and the consumer.
Given that the dataset is almost certainly already in the cloud, it makes sense to deploy the analytic software appliance to the cloud, too. Cloud environments support software containers (for example, Docker) that package applications for easy, secure and isolated deployment. The containerized appliance would be deployed to the enterprise’s virtual private cloud inside its security perimeter.
The fact that the analytic software appliance, inside its container, is essentially independent of the external environment makes support, maintenance, troubleshooting and upgrades much easier for all parties. Upgrades and bug fixes simply require the deployment of a new version of the container.
The solution described is point-to-point: it makes a connection between one data provider and one data consumer. In an environment in which there are many providers and many consumers, point-to-point solutions tend to break down. This is where marketplaces have a potential role to play.
In theory, the major cloud platform providers could create analytic marketplaces that would solve the problem of many-to-many point connections. Analytic marketplaces would not sell data, because data is sensitive. They would sell the outcome of running an analytic model against a dataset. The result might be a narrative summary of the data or a set of aggregations, which do not reveal sensitive information. The marketplace owner would curate the third-party models that can run against the data, and would ensure they were certified against the constraints specified by the data owner. Consumers would purchase the ability to run a model against a dataset.
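A minimal sketch of this many-to-many broker, with hypothetical names throughout (no real marketplace exposes this exact API): providers list datasets with the models certified against them, developers list models, and consumers purchase only the outcome of a run, never the raw rows:

```python
class AnalyticMarketplace:
    """Hypothetical broker: sells model-run outcomes, never the data itself."""
    def __init__(self):
        self._datasets = {}   # name -> (rows, set of certified model names)
        self._models = {}     # name -> callable returning an aggregate

    def list_dataset(self, name, rows, certified_models):
        self._datasets[name] = (rows, set(certified_models))

    def list_model(self, name, fn):
        self._models[name] = fn

    def purchase_run(self, dataset, model):
        rows, certified = self._datasets[dataset]
        if model not in certified:
            raise PermissionError(f"model '{model}' is not certified for '{dataset}'")
        return self._models[model](rows)   # the outcome leaves; the rows never do

mkt = AnalyticMarketplace()
# A retailer (provider) lists sales data, certifying a single summary model.
mkt.list_model("weekly_total", lambda rows: sum(r["units"] for r in rows))
mkt.list_dataset("retailer_sales", [{"units": 120}, {"units": 95}], ["weekly_total"])
# A manufacturer (consumer) buys the outcome of running the model.
print(mkt.purchase_run("retailer_sales", "weekly_total"))  # 215
```

Each party connects once, to the marketplace; adding a second retailer or a tenth manufacturer requires no new point-to-point agreement.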
In this environment the data provider and consumer would both have a single connection with the marketplace owner, rather than having to manage many connections with different providers and consumers.
CHANGING THE SOFTWARE PARADIGM
Some industry-watchers believe that the conjunction of consumerization, big data and marketplaces could force a revolution in the way software is developed, sold and used:
- Apps and software consumerization have created users who don’t need the security blanket of monolithic applications and one-vendor-does-everything technology stacks.
- Big data and predictive analytics offer huge opportunities to any business able to assemble a holistic data view of its internal and external environment.
- Companies are increasingly aware of the value of their data and algorithms—and are looking for ways to monetize them without compromising their commercial interests.
Analytic marketplaces could exploit these trends.
Developers could use the marketplace platform to build components or even complete apps for sale. They could also sell computational algorithms and data visualizations.
Data owners would use the marketplace to sell their data in an encapsulated form, as already described.
Marketplace customers would be able to purchase an app, algorithm, analytic component or data capsule to meet a particular need. More ambitious users could assemble a complete solution combining multiple components from different vendors.
Described like this, it is clear that the analytic marketplace is an ultra-sophisticated B2B app store. The potential for positive network effects, and the concomitant explosive growth in the marketplace, is obvious. But the potential for network effects exists in every marketplace, and only a minority of digital marketplaces have succeeded. What factors might predispose analytic marketplaces to success?
The benefits to application and data providers are obvious: they have the opportunity to monetize otherwise idle assets. However, they will only be able to realize this value if the marketplace attracts sufficient consumers—what is in it for the consumer? The consumer gains access to data, algorithms and analytic components that he or she would not otherwise have; more importantly, these components come pre-implemented and pre-integrated as part of the marketplace platform. In theory a consumer could assemble exactly the same solution outside a marketplace, but this would be technically far more challenging and very time-consuming.
The consumer would also have to negotiate usage, licensing and confidentiality agreements with the owner of each component, whereas this is handled for them by the marketplace owner. The benefits are huge, so long as the model of assembling software works.
In the short term, marketplaces are probably going to be centers of experimentation and innovation. They should allow companies to assemble leading-edge proofs of concept more quickly, more cheaply and with less risk than if they did everything themselves from scratch. The improved ability to innovate will be very attractive to visionary companies.
Once marketplaces have achieved a degree of maturity, they should become attractive to a much broader group of companies, with more pedestrian use cases. The marketplace could allow a business to assemble a vertical solution tailored to its specific requirements, as an alternative to implementing a monolithic vendor application containing a lot of functionality it never uses. Marketplace assembly would be easier, faster and potentially cheaper than the vendor route. It would also confer greater flexibility and agility: everyone knows the frustration of having to wait for an incumbent full-stack vendor to upgrade their product to incorporate the latest innovations. Marketplaces will allow companies to stay constantly up to date simply by swapping components.
The shift from vendor applications to analytic marketplaces will be analogous to the shift from on-premises computing to the cloud: it may not be cheaper than doing things the traditional way, but the flexibility, agility and improved quality that result will be compelling reasons to move.
It is notoriously difficult to predict the success of marketplaces, otherwise we would all be Uber millionaires. Analytic marketplaces are barely beyond the concept stage at the moment, and it is likely they will undergo major, unpredictable evolution before they mature—if they ever do. That said, the confluence of the forces of consumerization, big data and digital marketplaces makes it likely that there will be a major shake-up in the way that software is developed and used in the next five years.
For additional insight and perspective, download Nielsen’s Perspectives on Retail Technology report.