Administration, Azure, Editorials, Ethics

Analytics, Reporting and the Fine Line of Privacy

It can be very challenging to pull together reporting, analytics, data privacy expectations, and legislative compliance for the information in your systems. People – those users that you support – are learning to request more and more from reporting. They're expecting systems to be smarter about the information stored in your databases.

Azure does amazing things with analytics, looking at your information, providing tools to see what’s really happening and giving you options to see how that all plays out in your databases.

Other tools are continuing to evolve as well, providing detailed drill-down experiences, inferences that need factual backing, and so on.

All of this tends to require knowing the underpinnings of the information being presented. One of the ways this impacts you as a data person is that you're expected to be able to provide the detail-level data behind the calculated information your reporting tools are showing.

This leads to a tug of war of sorts between the users of that data and the owners of it.

This is where we come in – the data platform folks that are responsible for knowing what must be managed, protected, and provided. It's very common anymore to have to run interference between the requests for more information and the protection of that information. It's a great opportunity, however, to learn how people are using the information. We've had several cases where we were able to re-architect the presentation of information a bit, provide a solid level of detail, and still respect the protection of that information. Many times this means a new view on the data, literally or figuratively.

The change that we're seeing is that end users of data are far less concerned about the data and privacy protection efforts happening in the world. At the same time, their expectations about what they have access to, and the level of detail that includes, are expanding rapidly. Those expectations can be at odds with adhering to the standards of the GDPR and the CCPA, not to mention work by other countries and jurisdictions to clamp down on information use.

So, what can you do?

It's the old "seek first to understand…" thing. Make sure you know the end goals and requirements, rather than the steps along the way. Most of the time, you can create the views and logical data constructs to support the reporting and extract requirements, while still providing protection and accountability for the information.

It’s also possible that you can provide “enough” information to be helpful and meaningful, and to provide next steps that still honor the data owners.

For example, if a report is showing the next targets of an email campaign based on purchase history, try showing an ID or abstracted information that can be used to validate the dataset, then handed off to the email process to determine the emails to be included. Essentially, you can provide a meaningful abstraction layer between the information and the end-user doing reporting tasks, and again between the marketing process and the raw personal information.
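A minimal sketch of that idea might look like the following. This is a hypothetical example, not anything tied to a specific platform: it derives a stable, opaque ID for each contact, so the reporting layer sees only IDs and campaign-relevant attributes, while a separate lookup (held by the email process, never exposed in reports) resolves validated IDs back to addresses at send time.

```python
import hashlib

# Hypothetical raw table: purchase history keyed to personal email addresses.
purchases = [
    {"email": "ann@example.com", "last_purchase": "gadget"},
    {"email": "bob@example.com", "last_purchase": "widget"},
]

def pseudonymize(email: str, salt: str = "campaign-salt") -> str:
    """Derive a stable, opaque ID so reports never carry the raw address."""
    return hashlib.sha256((salt + email).encode()).hexdigest()[:12]

# What the reporting layer sees: opaque IDs plus campaign attributes only.
report_rows = [
    {"target_id": pseudonymize(p["email"]), "last_purchase": p["last_purchase"]}
    for p in purchases
]

# What the email process holds: the ID-to-address map, kept out of reports.
id_to_email = {pseudonymize(p["email"]): p["email"] for p in purchases}

# After users validate the dataset, only then are IDs resolved to addresses.
validated_ids = [r["target_id"] for r in report_rows]
send_list = [id_to_email[i] for i in validated_ids]
```

The design choice here is that the pseudonymous ID is still useful for validation and joins, but the raw personal information only surfaces in the one process that genuinely needs it. The salt value and helper name are illustrative assumptions.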

It's a simple example, but we've had good success with several variations of what amounts to an abstraction layer, while still providing solid reporting and functionality.

It's critical to stay plugged in to the planning around information use, so you can anticipate what requests are coming and, at the same time, know the necessary outcomes for the information in your systems.