Cognos - a private AI option

Generative AI is here, and the most widely known option is probably ChatGPT, which offers a conversational interface for interacting with AI.

However, those of us concerned with the privacy of our data can see that creating databases of conversations, often containing personal or confidential information, is a ticking time bomb.

Services openly admit that data sent to them can, and will, be used for training and may also be reviewed by their staff. Even if they don't explicitly say they will use your data today, having it all on their systems opens up the possibility of accessing and using it in the future.

Google Gemini

Please don’t enter confidential information in your conversations or any data you wouldn’t want a reviewer to see or Google to use to improve our products, services, and machine-learning technologies.

Taken directly from the Google Gemini privacy policy.

OpenAI ChatGPT

We may use content submitted to ChatGPT, DALL·E, and our other services for individuals to improve model performance.
Cleared chats are deleted from our systems within 30 days, unless they have been de-identified and disassociated from your account. If you have not opted out, we may use these de-identified chats for training to improve model performance.
A limited number of authorized OpenAI personnel, as well as trusted service providers that are subject to confidentiality and security obligations, may access user content [...]

Taken directly from the OpenAI FAQ on data usage.

Host your own models

Self-hosting open-source models can address some of these concerns. You operate on servers that you own, and chats are stored only on your personal devices. However, to achieve this you need a certain level of hardware (which comes at a cost) and the technical knowledge to bring it up and operate it.
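
As an illustration, a self-hosted setup typically exposes the model through a local HTTP API. The sketch below assumes an Ollama instance running on its default port with an open-source model already pulled; the prompt and the reply never leave your machine.

```python
# Minimal sketch of chatting with a self-hosted model, assuming a local
# Ollama instance on the default port with a model already pulled.
import requests

response = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "llama2",  # whichever open-source model you pulled locally
        "prompt": "Summarise this contract in three bullet points.",
        "stream": False,    # return the full reply in a single payload
    },
    timeout=120,
)
print(response.json()["response"])  # generated entirely on your own hardware
```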

You will also likely miss out on cloud features such as cross-device syncing, unless you expose your services to the internet, in which case you need even more knowledge and time to ensure this is done safely.

Just like email: just because you can doesn't mean you should.

Adding encryption

What if we could offer the convenience and user experience of cloud services with the privacy guarantees of hosting models yourself locally?

It worked for email

Products such as Proton Mail or Tuta have addressed this problem by applying zero-access encryption to email.

Emails sent from non-encrypted providers (e.g. Gmail) arrive in plain text, are immediately encrypted, and the plain text is discarded. From then on the only person who can access your email is you, and there is no risk that your emails can be viewed by staff members at Proton or Tuta, now or in the future.
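
To make the idea concrete, here is a rough sketch of zero-access encryption using an RSA key pair. It is purely illustrative (services like Proton build on OpenPGP with hybrid encryption so that messages of any size can be handled), but the shape is the same: the server only ever holds the public key, so once the plain text is discarded nobody but the key's owner can read the mail.

```python
# Illustrative sketch of zero-access encryption (not Proton/Tuta's actual code).
# The server holds only the user's PUBLIC key: it can encrypt incoming mail,
# but only the user's device, which holds the private key, can decrypt it.
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes, serialization

# On the user's device: generate a key pair; the private key never leaves it.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key_pem = private_key.public_key().public_bytes(
    encoding=serialization.Encoding.PEM,
    format=serialization.PublicFormat.SubjectPublicKeyInfo,
)

# On the mail server: an incoming plain-text email is encrypted with the
# user's public key and the plain text is discarded.
incoming_plaintext = b"Meet me at 10am - confidential"
server_side_public_key = serialization.load_pem_public_key(public_key_pem)
oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)
ciphertext = server_side_public_key.encrypt(incoming_plaintext, oaep)
# Only `ciphertext` is stored; the server can no longer read the message.

# Back on the user's device: decrypt with the private key.
decrypted = private_key.decrypt(ciphertext, oaep)
assert decrypted == incoming_plaintext
```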

Launching Cognos - Encrypted, multi-model AI chat

Cognos has a simple aim: to offer a privacy-respecting alternative to generative AI chat applications.

You should be able to choose which provider you use, from commercial offerings like OpenAI's GPT-3.5 & GPT-4 to open-source models that may even be uncensored and NSFW.

What you choose to share with these models, and their responses back to you, should remain private, with you in full control of who has access.
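
The principle is the same one that works for email: encrypt on your device, store and sync only ciphertext. The sketch below is a toy illustration using a symmetric key, not a description of Cognos's actual implementation.

```python
# Illustrative only: client-side encryption of a chat transcript before it is
# synced anywhere. The key stays on your devices, so whoever stores the
# ciphertext (a sync server, a cloud bucket) cannot read the conversation.
from cryptography.fernet import Fernet
import json

key = Fernet.generate_key()  # kept on the user's devices only
cipher = Fernet(key)

transcript = json.dumps([
    {"role": "user", "content": "Draft a resignation letter for me."},
    {"role": "assistant", "content": "Dear ..."},
]).encode()

ciphertext = cipher.encrypt(transcript)   # this is all the sync server sees
# ...later, on another device that holds the same key:
restored = json.loads(cipher.decrypt(ciphertext))
```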

If you're interested in our planned features, you can have a look at our initial product roadmap.