Introducing Privacism

Are you a privacist?

What’s the first thing that comes to mind when you think about fundamental human rights? Dignity? Freedom of religion? Freedom of speech? The right to privacy probably wasn’t among your first thoughts, but it should be.

According to the United Nations, all humans have a “Right to Privacy in the Digital Age”. The sad truth, however, is that even in liberal, democratic countries, privacy is still thought of as a topic best avoided by law-abiding citizens who have “nothing to hide.” But we all have things to hide, from credit card purchase history and location data to private emails and the things we ‘like’ on social media. We share intimate, revealing details about our lives with data harvesters, yet at home we draw our shades and close our doors to keep prying eyes from seeing inside.

At Xayn we wondered: why is it that we, as users, are so willing to let go of this fundamental human right when it comes to online services? Are blind eyes being turned, is privacy seen as a fair and necessary sacrifice for service, or is it a mixture of both?

We want this ambivalence towards privacy to change. Since we founded our company in 2017, we’ve been active in the privacism wave. The what now? P-R-I-V-A-C-I-S-M. If you haven’t heard the term yet, pay attention, because it’s going to shape the future.

An introduction to Privacism

So what is privacism? You won’t find it in dictionaries – at least not yet. We believe it should be part of your lexicon, starting today. Here’s how we define it:

In a sense, privacism is claiming one’s own, individual rights. But it goes beyond that – privacists fight for everyone else’s privacy rights too. It focuses on advocating for the privacy of users, as opposed to technical evolution or even the collective interests of governments. Privacism isn’t to be confused with privatism, which is about private ownership; rather, it’s an ideology built on the foundation that privacy is a basic human right.

It’s not a new idea. In the past, privacism debates have centred on government surveillance, though they’ve since shifted to include not only that but the impact of digital services on our privacy as well. These are the types of services we use daily, typically for free, that collect mountains of personal data while few are paying attention. Any modern company, any major platform, should be held to a higher standard of reasonable, ethical data collection. Even with groundbreaking privacy-protecting legislation, like the GDPR in the EU, we’re still not anywhere near where we need to be.

Never has it been easier to collect vast amounts of data on users in a relatively short time. Experts estimate that by 2025, 463 exabytes of data will be created each day globally – the equivalent of roughly 98 billion DVDs per day. Harvard Business Review even invented the term “gross data product” to measure and rank countries in terms of data production. However, while we know data is being harvested by our favourite platforms, few of us know what they’re doing with it.

What does AI have to do with it?

AI, or artificial intelligence, essentially refers to statistical algorithms that are trained on large amounts of historical data and then used to predict future outcomes. Thanks to the above-mentioned trends in data collection, AI stands to benefit greatly from the details each of us hands over to the platforms we use every day. Given the projected rise of AI in the coming years, it’s time to think about data protection now – before we reach the point of no return.
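To make that definition concrete, here is a minimal sketch of the pattern described above: an algorithm is fitted to historical data and then used to predict a future value. The dataset and model are made up for illustration and have nothing to do with any real platform’s systems.

```python
# Minimal sketch: fit a least-squares line to "historical" data,
# then use it to predict a future outcome. All numbers are invented.

def fit_line(xs, ys):
    """Ordinary least squares for y = slope * x + intercept."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    slope = cov / var
    intercept = mean_y - slope * mean_x
    return slope, intercept

# Hypothetical historical data: hours of daily app usage over five days.
days = [1, 2, 3, 4, 5]
hours = [1.0, 1.5, 2.0, 2.5, 3.0]

slope, intercept = fit_line(days, hours)
# The "prediction": extrapolate the fitted trend to day 6.
predicted_day_6 = slope * 6 + intercept
```

The more historical data a company can collect, the more patterns like this it can extract – which is exactly why data harvesting and AI feed each other.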

To illustrate the scale of the problem, let’s look at Google. The company isn’t alone in harvesting massive amounts of data from its users, but looking at how it works for them gives us a glimpse into data harvesting in general. When Google collects your data, it’s stored on a central server where it gets processed, filtered, and used to train algorithms for everything from predictive text analysis to the cards that appear at the top of search results. Though its intentions may be good, the process of handling this data is more than questionable. Yet even though we may know about this, as a society we’re willing to put up with the violation of our privacy rights for the convenience of using Google Search, Gmail, YouTube and other “free” platforms that few of us really question at this point. To sum it up, consider the quote below from former Google CEO Eric Schmidt. It is almost ten years old, and the fact that so little has changed since only underscores the severity of the problem.

“We know where you are. We know where you’ve been. We can more or less know what you’re thinking about.” - Eric Schmidt, former Google CEO

The point is this: if we’re not careful in how we treat our algorithm-training personal data, we run the risk of hammering the last nail into the coffin of individual privacy. Some users might be willing to take this risk, whether for convenience or for lack of viable alternatives, but they don’t have to. At Xayn, we want to be that alternative. We want to preserve your right to privacy, and we want to provide you with the personalized search results you’re used to while doing it.

Be the change

Advocates of privacism, called privacists, are digital natives. They advocate for their right to privacy by exercising their rights as consumers and choosing products that mesh with their principles.

We aren’t the first company standing up for your rights. The Mozilla Foundation, known for its Firefox browser, NextCloud, and DuckDuckGo – a privacy-focused, though non-personalised, search alternative – are just a few examples of companies leading the way in the fight. We’re proud to join the fight for protecting your right to privacy, one line of code at a time.

We’ve been working alongside numerous corporations to develop privacy-preserving AI since 2017. We’ve bundled our expertise from academic research and our experience in creating privacy tech to conceive our own technology stack, XayNet. But privacy shouldn’t be a luxury product, nor reserved for people who have something to hide – so we now want to open it up to everyone.

We have open-sourced XayNet so that other companies can build directly on our backend to create innovative AI use cases. Sharing is caring, after all. By empowering other companies to integrate it into their own technology solutions, we’re providing an alternative to everyone willing to use it. We hope that other companies join the wave and become privacists too.
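To sketch the general idea behind building AI on a privacy-preserving backend like this: each participant trains a model on its own data locally, and only the resulting model parameters – never the raw data – are shared with a coordinator, which averages them into a global model. This is a simplified, hypothetical illustration of the federated-averaging pattern, not XayNet’s actual API or architecture.

```python
# Hypothetical sketch of federated averaging: raw data never leaves a
# client; only locally computed model parameters are shared and averaged.

def local_train(data, rounds=100, lr=0.1):
    """Fit a single parameter w minimising sum((w - x)^2) over local data."""
    w = 0.0
    for _ in range(rounds):
        # Gradient of the mean squared error is 2 * mean(w - x).
        grad = 2 * sum(w - x for x in data) / len(data)
        w -= lr * grad
    return w  # only this number is shared, never `data`

def federated_average(client_datasets):
    """The coordinator averages the clients' parameters; it never sees raw data."""
    local_params = [local_train(d) for d in client_datasets]
    return sum(local_params) / len(local_params)

# Three users' private datasets stay on their own devices.
clients = [[1.0, 2.0, 3.0], [4.0], [5.0, 7.0]]
global_w = federated_average(clients)
```

In a real deployment the shared updates would typically be masked or encrypted as well, so that not even the coordinator can reconstruct an individual contribution – but the key privacy property is already visible here: the coordinator only ever handles model parameters.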

The fight for privacy can only be won through consumer choice. Users need to demand better, but they also need choices that don’t sacrifice their user experience. Anything less will never appeal to the masses. Only after we build privacy-oriented products with the convenience we’ve come to expect will we bring change to the world.

Introducing Xayn Search

Search engines are the gateway to information online. Everyone uses them, with Google being the world’s most-visited website. But those who hold the key to online information also have the responsibility of protecting it. They should not only present the information in a convenient, neutral way but also take great care in protecting the user data that makes it all possible.

This is why we’ve created Xayn – an AI-based search assistant that combines great user experience with the privacy you should always expect.

You can read about the technology that powers Xayn here. What’s important is that our approach keeps all of your data where it belongs: with you. The AI works behind the scenes and is designed to protect your privacy.

This may not solve all of the internet’s privacy problems, but we are addressing a massive fault in the way companies currently handle search. We’re providing access to the information you’ve come to expect without making you pay for it with your privacy. Xayn ends the trade-off between true privacy protection and the convenience you love to have.

We’re privacists. Are you?

© Photo by Jason Dent on Unsplash
