Next generation tech brands have a responsibility to change data privacy standards

Travis Montaque
Published in Holler Developers
Jun 3, 2021 · 6 min read

Consumer trust is being tested with every new report on security breaches and misuse of personal data. And although digital data concerns have always been a complicated topic, recent privacy-forward moves by Apple and Google, and Facebook’s response, suggest that big change is bound to come.

But, if the slow-moving debate taking place between tech giants and the government is any indication, the process of regulating the tech industry will be a long and arduous one. There are many privacy puzzles to solve and no clear consensus on what is or isn’t acceptable, but one thing is clear: tech practices of the past won’t have a place in the future. It’s time for a new generation of tech companies to step up and make privacy-safe practices an industry standard.

For established legacy tech leaders, improving tech practices often involves an overhaul of existing systems, which requires extensive capital, resources, and time. Retroactively building safeguards into existing technology is no easy task. However, young tech brands have the opportunity and responsibility to make privacy-safe decisions from the start.

The next generation of tech must weave security into their DNA, regardless of whether it’s mandated by law or not. With early infrastructure still being built, there is opportunity to invest early in innovative, responsible technologies built to foster a data-safe future. Doing so won’t only benefit end users; next gen tech brands are bound to gain a competitive edge by being able to seize opportunities that might be more challenging for companies with legacy practices.

Building a privacy-safe future with the help of AI

When done right, artificial intelligence is helping to make meaningful strides toward privacy-safe tech. It’s crucial for tech companies that invest in AI to use innovative methodologies to limit the use of personal user data, and to do so at the concept and design stages rather than rely on reverse engineering later. In many cases, private data isn’t necessary to achieve a similar desired result. AI practitioners need to consciously break the habit of collecting as much personal data as possible, take a step back, and ask themselves whether collecting personal data truly makes a positive impact on a user’s experience. For example: is knowledge of a user’s age, gender, and location actually needed to solve most of their problems? No, and yet it is still “normal” practice to capture this information and more.

At Holler, we’re designing a new breed of next generation content suggestion systems that can feel extremely personalized, without the use of personal data. This may sound like a contradiction, but it’s simply a new method of problem solving. When you break away from antiquated patterns and do not allow the collection of personal data to be an option, viable solutions will be found; this was the case within our own research science team. Through AI advancements based on context (not identity), we’re able to suggest useful and relevant content to people in messaging without knowing anything personal about them.
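The idea can be illustrated with a toy sketch (hypothetical, and not Holler’s actual system): candidate content is ranked purely by its similarity to the words of the current message, with no user profile, ID, or history consulted anywhere in the pipeline.

```python
from collections import Counter
import math

def _vector(text):
    """Bag-of-words term counts for a piece of text."""
    return Counter(text.lower().split())

def _cosine(a, b):
    """Cosine similarity between two term-count vectors."""
    overlap = sum(a[t] * b[t] for t in a if t in b)
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return overlap / norm if norm else 0.0

def suggest(message, catalog):
    """Rank candidate content against the message text alone.

    The only input is the current message (the 'context') -- no user
    profile, identifier, or history is ever read.
    """
    msg = _vector(message)
    best = max(catalog, key=lambda item: _cosine(msg, _vector(item["tags"])))
    return best["title"]

# Illustrative catalog of shareable content, tagged with context words.
catalog = [
    {"title": "pizza-gif", "tags": "pizza food dinner hungry slice"},
    {"title": "congrats-sticker", "tags": "congrats party celebrate win"},
]
suggest("so hungry want pizza for dinner", catalog)  # -> "pizza-gif"
```

A real system would use learned embeddings rather than word overlap, but the privacy property is the same: the ranking function never takes identity as an input.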

A revolution towards privacy-safe advertising

Developments in AI and machine learning have led to the use of contextual AI for safe data collection. This is not only a step toward a more privacy-safe future of tech, but creates a more useful tool for advertisers.

Historically, advertisers have relied on first-party data deeply tied to an individual user. By collecting a growing ‘profile’ of information, including things like location, age, and personal interests, advertisers can easily target ads to a laser-focused group of relevant people. These practices became common in recent years, especially with the rise of social media as a marketing tool, and they quickly blurred the line between which marketing tactics were acceptable and which consumers viewed as invasive. Soon, there was no line left at all, and consumers started to question what it all meant for them.

Contextual understanding opens up new opportunities for the tech industry and for advertisers; breakthroughs in natural language processing and sophisticated AI make contextual targeting a viable method of monetization that prioritizes consumer experience and trust. Today, the technology can understand the context around a phrase or conversation, instead of homing in on just one keyword to trigger a response. If a user is talking about how they want an apple, the technology can use the context around it to determine correctly whether they are talking about a fruit or a computer, for instance.
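A toy illustration of that apple example (the cue lists and scoring here are hypothetical, not any production system): the sense is chosen from the words surrounding the keyword rather than from the keyword alone.

```python
# Illustrative cue words for each sense of "apple".
SENSE_CUES = {
    "fruit": {"eat", "snack", "juice", "pie", "fresh", "crunchy", "hungry"},
    "computer": {"laptop", "macbook", "keyboard", "screen", "store", "upgrade"},
}

def disambiguate(sentence):
    """Pick the sense of 'apple' whose cue words best match the context."""
    words = {w.strip(".,!?").lower() for w in sentence.split()}
    return max(SENSE_CUES, key=lambda sense: len(words & SENSE_CUES[sense]))

disambiguate("I'm hungry, I could eat an apple right now")  # -> "fruit"
disambiguate("My apple needs a new keyboard and screen")    # -> "computer"
```

Modern NLP models do this with learned representations of the whole sentence rather than hand-written cue lists, but the principle is the same: the signal comes from the conversation’s context, not from who the user is.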

These new developments in AI will help put user privacy first without sacrificing advertisers’ relevance or reach. The focus is instead on the context, not the user identity — because it doesn’t matter who you are as long as a service understands what you want or need. Understanding intent and providing the right content in the right moment without knowing who you are or what you’ve clicked on recently is a difficult task, but worth it when it comes to user privacy.

My phone, my privacy

The second half of this puzzle involves access to data. Traditionally, data and the analysis of that data have lived in the cloud: information left the device, was processed there, and a response or service based on that information was sent back to the device to deliver a personalized or targeted user experience.

Now it is possible for AI to do its job on the device alone, understanding an “action” and sending a response or experience without having to be sent to the cloud. That means that any personal data about a user never leaves their phone — and there is no danger of that data being intercepted as it travels to the cloud and back.
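A minimal sketch of that on-device pattern (the category lists and function names here are hypothetical): the raw message is analyzed entirely locally, and only a coarse, non-identifying label would ever be a candidate to leave the phone.

```python
# Illustrative on-device categories, keyed by context words.
CATEGORIES = {
    "food": {"pizza", "dinner", "hungry", "lunch"},
    "travel": {"flight", "hotel", "trip", "airport"},
}

def classify_on_device(message):
    """Runs entirely on the device; returns only a category label."""
    words = {w.strip(".,!?").lower() for w in message.split()}
    return max(CATEGORIES, key=lambda c: len(words & CATEGORIES[c]))

def payload_for_server(message):
    """The only thing the device would ever share upward: the label.

    The raw message text never appears in the payload, so there is
    nothing personal to intercept in transit.
    """
    return {"context": classify_on_device(message)}

payload_for_server("booked my flight, heading to the airport")
# -> {"context": "travel"}
```

In practice the local step would be a compact neural model shipped with the app, but the architectural point is the boundary: raw data stays inside the device, and only derived, identity-free context crosses the network.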

Combining the advancements of contextual AI with the ability to do AI work on-device gives user privacy a double boost. Not only is the AI designed to rely on contextual information rather than identity, but the raw data never leaves the device, so it is never exposed in transit to potential bad actors.

The obligation to consumers and the future

For any emerging tech brand, now is a critical time to build a sustainable startup that can develop better products and powerful experiences without the exploitation of users. Young tech companies that position themselves as partners to consumers from the beginning will earn trust and gain an advantage over legacy brands who are busy trying to reverse decisions.

Contextual understanding through AI is one data-safe path forward, but it isn’t the only one. Amazon Macie, for example, uses machine learning to identify sensitive information stored where it doesn’t belong so it can be dealt with properly. Adding capabilities for AI to function fully on-device is another option. We’ll likely see additional innovations as data science teams push themselves to think of other creative options.

It’s up to the next generation of tech to realize what else we can do to make our digital world safer for everyone.

Putting users at the forefront of privacy policies needs to be at the top of all of our to-do lists. We simply can’t operate and thrive at the cost of user privacy, so putting the right systems in place early on is essential. We can’t erase the past, but we can innovate toward a more ethical and transparent future.

For more insights about data privacy, check out my recent episodes of Conversation Nation:

My chat with Sara Morrison, Data Privacy Reporter at Vox
My chat with Meredith Grauer, Chief Privacy Officer at Nielsen
My chat with Kelly Wulff, General Counsel at Vouch
