
Canada wants to phase out data copying

The following excerpt is taken from the original article, posted on March 1, 2023, in Blocks & Files.


We talked to several suppliers, whose products and services currently involve making data copies, about the implications for them.


WANdisco’s CTO Paul Scott Murphy told us: “At first glance, the standard may appear to push against data movement, but in fact, it supports our core technology architecture. Our data activation platform works to move data that would otherwise be siloed in distributed environments, like the edge, and aggregates it in the cloud to prevent application-specific copies from proliferating.


“Our technology eliminates the need for application-specific data management and ensures it can be held as a single physical copy, regardless of scale.”


He added: “Notably, there’s a particularly important aspect of the new guidance on preventing data fragmentation by building access and collaboration into the underlying data architecture. Again, our technology supports this approach, dealing directly with data in a single, physical destination (typically a cloud storage service). Our technology does not rely on, require, or provide application-level interfaces.


“In response to the standard, Canadian organizations will need to adopt solutions and architectures that do not require copies of data in distributed locations – even when datasets are massive and generated from dispersed sensor networks, mobile environments, or other complex systems.”
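To make that architecture concrete, here is a minimal Python sketch of the pattern Murphy describes: records produced at distributed edge sites are aggregated into a single cloud-resident copy instead of persisting as per-application copies at each site. The names (cloud_object_store, edge_site, aggregate_to_cloud) are our illustration, not WANdisco's API.

# Hypothetical sketch: edge data flows to one cloud copy.
cloud_object_store = []    # the single physical destination

def edge_site(site_id, readings):
    """Simulate a sensor site emitting records tagged with provenance."""
    for value in readings:
        yield {"site": site_id, "value": value}

def aggregate_to_cloud(streams, store):
    """Move each record to the cloud copy; nothing durable is kept at the edge."""
    for stream in streams:
        for record in stream:
            store.append(record)   # the only durable copy

aggregate_to_cloud(
    [edge_site("yyz-01", [0.7, 0.9]), edge_site("yvr-02", [1.2])],
    cloud_object_store,
)
print(len(cloud_object_store))  # 3 records, one physical copy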


W Curtis Preston, Chief Technical Evangelist at Druva, told us: “This is the first I’ve heard of the standard. However, I believe it’s focusing on a different part of IT, meaning the apps themselves. They’re saying if you’re developing a new app you should share the same data, rather than making another copy of the data. The more copies of personal data you have, the harder it is to preserve the privacy of the personal data in that data set. I don’t have any problem with that idea as a concept/standard.


“I don’t see how anyone familiar with basic concepts of IT could object to creating a separate copy of data for backup purposes. That’s an entirely different concept.”


We asked the DGC how it would cope with the examples above and what it would recommend to organizations wanting to develop their IT infrastructure in these ways. Dan DeMers told us:


“One of the most important concepts within the Zero-Copy Integration framework is the emphasis on data sharing via granting access for engagement on uncopied datasets (collaboration) rather than data sharing via the exchange of copies of those datasets (cooperation).


“But as you point out, many IT ecosystems are entirely reliant upon the exchange of copies, and that is why Zero-Copy Integration focuses on how organizations build and support new digital solutions.”
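To illustrate the collaboration-versus-cooperation distinction DeMers draws, here is a minimal Python sketch in which consumers are granted scoped access to one shared dataset instead of receiving their own copies. The names (SharedDataset, DataView, grant) are illustrative stand-ins, not anything defined by the standard.

class DataView:
    """Read-only, column-scoped window onto a shared dataset."""
    def __init__(self, source, columns):
        self._source = source
        self._columns = columns

    def rows(self):
        # Project each row down to the granted columns at read time;
        # no second physical copy of the data is ever materialized.
        for row in self._source._rows:
            yield {c: row[c] for c in self._columns}

class SharedDataset:
    """The single physical copy, plus a record of access grants."""
    def __init__(self, rows):
        self._rows = rows
        self._grants = {}

    def grant(self, consumer, columns):
        # Collaboration: record an entitlement instead of exporting data.
        self._grants[consumer] = DataView(self, columns)
        return self._grants[consumer]

customers = SharedDataset([
    {"id": 1, "name": "Ada", "email": "ada@example.com"},
])

# The marketing app is granted access to names only; the email
# column never leaves the shared store as a copy.
view = customers.grant("marketing-app", ["id", "name"])
print(list(view.rows()))   # [{'id': 1, 'name': 'Ada'}]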


Legacy apps can contribute, he said. “One capability that is not defined in the standard (but we are seeing in new data management technologies such as dataware) is the ability to connect legacy data sources into a shared, zero-copy data architecture. These connections are bi-directional, enabling the new architecture to be fueled by legacy apps and systems on a ‘last time integration’ (final copy) basis.


“It’s like building the plane while it’s taking off in that sense – you’re using a Zero-Copy data architecture to build new solutions without silos or data integration, but it’s being supplied with some of its data from your existing data ecosystem.


“It’s all about making a transition, not destroying what’s working today, so the scenarios you outlined would not be an issue in its adoption.”
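As a hedged sketch of the “last time integration” idea DeMers describes, the following Python fragment copies a legacy system into a shared store exactly once, then routes subsequent legacy writes into that same store, so new solutions read it directly rather than spawning further integration copies. All names here are hypothetical.

shared_store = {}          # the single physical copy (e.g. a cloud table)
legacy_crm = {101: {"name": "Ada"}, 102: {"name": "Grace"}}

def last_time_integration(legacy, shared):
    """The one final bulk copy of legacy data into the shared store."""
    shared.update(legacy)

def sync_legacy_write(shared, key, record):
    """Bi-directional connection: a legacy app's write lands in the
    shared store instead of spawning another application copy."""
    shared[key] = record

last_time_integration(legacy_crm, shared_store)
sync_legacy_write(shared_store, 103, {"name": "Lin"})

# New solutions are built against shared_store directly, so no
# further integration copies are needed.
print(sorted(shared_store))  # [101, 102, 103]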

