
Five data friction points that block innovation


By Abhishek Prabhakar



Data velocity is the speed at which data is collected, processed, and utilized. For businesses that integrate data-driven decision-making into their processes, high data velocity delivers tangible returns. There are many situations where data velocity has a direct impact on an organization’s objectives and bottom line, such as:

  • Spotting sales trends early enough to secure suppliers ahead of competitors.
  • Recognizing and patching usage issues in a newly released app.
  • Accurately balancing service provision and staffing to meet load demands.
  • Shortening development iteration cycles through faster feedback and testing input.
  • Collaborating easily with other organizations so their resources or technology can deliver immediate impact.

Unfortunately, for organizations at all stages of data adoption, there are several obstacles that can slow down data velocity. 

Five data friction points that block innovation 

These obstacles are often referred to as data friction points, and they can block innovation, obstruct collaboration, and reduce the returns that data can deliver to your organization. Below, we take a look at five of the biggest data friction issues. 

1. Regulatory issues

Legal risk surrounding data comes in a variety of forms, including industry regulation and national data laws. The largest risk, however, comes from consumer data protection legislation, such as Europe’s GDPR and California’s CCPA. These outline clear restrictions on data storage and usage, with significant financial penalties for regulatory non-compliance. 

As a point of data friction, these regulatory issues increase the risk associated with internal and external data sharing, which has a negative impact on both innovation and collaboration.

2. Disjointed organizational approach

The larger an organization becomes, the harder it is to maintain a unified and coherent approach to data. Different departments have their own data goals, use varying formats and analysis tools, and store data across a mix of on-site, cloud, and hybrid options. It’s easy to see how data strategy can quickly become misaligned.

This creates data friction that affects every aspect of basic data strategy. Storage issues impact collection and processing, while poor communication of needs and objectives means that processed data doesn’t deliver insights quickly enough. The opportunity cost of underutilized data might not be a headline-making amount, but it damages data’s ROI and is an avoidable business cost.

3. Data quality

The term “data” covers a vast array of information collected in a multitude of ways. Data from different sources may not describe the same things, or it may describe exactly the same information and still be collected again, leading to duplication. Understandably, this creates major data friction issues for processing and analysis. These issues can be overcome at an organizational level, but doing so requires significant resources and financial investment.
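The duplication problem is easy to sketch, even if it is expensive to solve at scale. The snippet below is purely illustrative (the source names, field names, and sample records are hypothetical, not taken from any particular system): it maps two exports onto one canonical schema, normalizes the matching key, and drops records that describe the same entity.

```python
# Illustrative only: cross-source deduplication on a hypothetical customer record.
import pandas as pd

crm_export = pd.DataFrame({
    "Email": ["anna@example.com", "bob@example.com"],
    "Full Name": ["Anna Li", "Bob Reyes"],
})
billing_export = pd.DataFrame({
    "email_address": ["ANNA@example.com ", "cara@example.com"],
    "name": ["Anna Li", "Cara Novak"],
})

# Map each source onto one canonical schema before comparing records.
canonical = pd.concat([
    crm_export.rename(columns={"Email": "email", "Full Name": "name"}),
    billing_export.rename(columns={"email_address": "email"}),
], ignore_index=True)

# Normalize the matching key, then drop records that describe the same person.
canonical["email"] = canonical["email"].str.strip().str.lower()
deduplicated = canonical.drop_duplicates(subset="email", keep="first")
print(deduplicated)
```

The hard part in practice is not the drop step but agreeing on the canonical schema and matching keys across teams, which is exactly where the resources and investment go.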

4. Collaboration and supply chain

Very few organizations can produce and maintain all of their technology and component solutions in-house, which means data has to be shared across a supply chain. When data is shared this way, security threats can be imported along with it, creating significant data breach risks. The situation is similar when collaborating with partners: control is lost over the data that is shared. This creates data friction, as the integration of new technology or collaboration opportunities is held back by data security concerns.

5. Data integration

Different software produces and collects data in a vast range of formats. Data friction arises when analysts want to query the data, because they must first convert it to a single format. On top of this, where and how data is stored also affects speed of access, since data generally has to be migrated to the same location before it can be queried. This process can take significant time and slow data velocity.
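As a rough illustration of that conversion step, the sketch below (with made-up file contents and field names) pulls a CSV export and a JSON export into one table so a single query can run across both. In real systems the reconciliation work, not the final query, is where most of the time goes.

```python
# Illustrative only: combining two hypothetical exports (CSV and JSON)
# into one queryable table, assuming both carry "region" and "revenue" fields.
import io
import pandas as pd

csv_source = io.StringIO("region,revenue\nEMEA,1200\nAPAC,950\n")
json_source = io.StringIO('[{"region": "AMER", "revenue": 1430}]')

combined = pd.concat(
    [pd.read_csv(csv_source), pd.read_json(json_source)],
    ignore_index=True,
)

# Once the formats are reconciled, one query can span every source.
print(combined.groupby("region")["revenue"].sum())
```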

The essential features of a data friction solution

Data friction is a complex issue that organizations need to solve if they are to maximize the value gained from their data. To do this effectively, a data friction solution needs a number of features, which we’ll dive into below.

Flexibility

A primary focus of any data friction solution has to be the integration of data regardless of its format or storage location. For large organizations with dispersed data functions, removing these integration hurdles is one of the fastest ways to increase data velocity, which means data-driven decisions can be made better and faster.

Security

Data breaches are a huge business risk – they can trigger loss of consumer trust and regulatory fines. To reduce the security risks associated with data collection, migration, and collaboration, a data friction solution should add an extra security layer rather than expanding the potential attack surface.

Promoting innovation

Reducing data friction means that decision-makers are better able to identify needs, both organizational and customer-focused, and create solutions for them through innovation. The data friction solution should also be able to easily integrate third-party analysis tools to improve insight delivery.

Enabling collaboration

Creating secure environments for data exchange means that data can be ring-fenced, reducing the risk of data leaks or breaches. This allows collaboration to happen more freely without creating additional regulatory risk.

Cost effectiveness

A data friction solution loses its value if its deployment ends up costing as much as (or more than) the returns gained from improved data velocity. The right data friction solution should instead enable organizations to use cheaper storage options and work with data as it is, without the need for significant migration or transformation.

Intertrust Platform: the ultimate data friction solution

Intertrust’s secure data exchange platform helps organizations maximize their data’s value by tackling the pain points that prevent secure, fast, and cost-effective data usage. Intertrust Platform creates a virtualized data layer that allows data, no matter what format it is in or where it is held, to be brought together and queried without the need for migration. As a result, organizations can use more affordable data storage options without losing analysis performance.
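The sketch below is not Intertrust Platform code and does not reflect its APIs; it only illustrates the general data-virtualization pattern the paragraph describes, using hypothetical connector classes that answer queries against data where it already lives rather than copying it into a central store first.

```python
# Illustrative pattern only, not Intertrust Platform code: each connector
# answers queries against its own source, so nothing is migrated centrally.
from typing import Dict, Iterable, List

Row = Dict[str, object]


class SourceConnector:
    """Hypothetical connector wrapping one data source (file, API, warehouse)."""

    def __init__(self, name: str, rows: Iterable[Row]):
        self.name = name
        self.rows = list(rows)  # stands in for reading the source in place

    def query(self, column: str, value: object) -> List[Row]:
        return [row for row in self.rows if row.get(column) == value]


def federated_query(sources: List[SourceConnector], column: str, value: object) -> List[Row]:
    """Fan a single query out to every source and merge the answers."""
    results: List[Row] = []
    for source in sources:
        results.extend(source.query(column, value))
    return results


onsite = SourceConnector("onsite_erp", [{"sku": "A1", "units": 40}])
cloud = SourceConnector("cloud_sales", [{"sku": "A1", "units": 15}, {"sku": "B2", "units": 7}])
print(federated_query([onsite, cloud], "sku", "A1"))
```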

Intertrust Platform ensures security and regulatory compliance by creating trusted execution environments so data collaboration is performed in secure containers, reducing the chance of breaches or data leakage. It also enables fine-grained access controls to limit what people can see and do with data, simultaneously creating an audit trail to further improve governance and transparency.
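Again as a generic illustration rather than the Platform’s actual API, fine-grained access control with auditing typically reduces to a check like the one below: every request is evaluated against a policy and recorded in an audit trail, whether or not it is allowed. The policy shape and field names here are hypothetical.

```python
# Generic illustration, not the Intertrust Platform API: evaluate each data
# request against a simple role/column policy and record it in an audit trail.
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Dict, List, Set


@dataclass
class AccessPolicy:
    # role -> columns that role may read (hypothetical policy shape)
    allowed_columns: Dict[str, Set[str]]
    audit_trail: List[dict] = field(default_factory=list)

    def request(self, user: str, role: str, column: str) -> bool:
        allowed = column in self.allowed_columns.get(role, set())
        self.audit_trail.append({
            "at": datetime.now(timezone.utc).isoformat(),
            "user": user,
            "role": role,
            "column": column,
            "allowed": allowed,
        })
        return allowed


policy = AccessPolicy({"analyst": {"region", "revenue"}, "auditor": {"revenue"}})
print(policy.request("dana", "analyst", "revenue"))          # True, and logged
print(policy.request("dana", "analyst", "customer_email"))   # False, and logged
print(policy.audit_trail)
```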

To find out more about how Intertrust Platform reduces data friction and improves innovation and collaboration through data, you can read more here or reach out to our team.


About Abhishek Prabhakar

Abhishek Prabhakar is a Senior Manager (Marketing Strategy and Product Planning) at Intertrust Technologies Corporation, primarily involved in the global product marketing and planning function for the Intertrust Platform. He has extensive experience in the field of new-age enterprise transformation technologies and is actively involved in market research and strategic partnerships in the field.
