If an AI can work on encrypted data, it's not encrypted.
SMH
No one is saying it's encrypted when processed, because that's not a thing that exists.
End-to-end encryption of an interaction with a chatbot would mean the company doesn't decrypt your messages to it: it operates on the encrypted text, gets an encrypted response which only you can decrypt, and sends that back to you. You then decrypt the response.
So yes. It would require operating on encrypted data.
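For illustration only, here's a minimal Python sketch of that flow (not anything Proton actually does), using Fernet from the `cryptography` package as a stand-in cipher. It shows exactly where conventional encryption breaks down: the provider only ever holds ciphertext it can't compute on.

```python
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # stays on the user's device only
client = Fernet(key)

def server_side_llm(ciphertext: bytes) -> bytes:
    # The provider receives only ciphertext. With a conventional cipher it
    # cannot run a model on this data at all -- doing so would require
    # homomorphic encryption (see the later comment).
    raise NotImplementedError("cannot run an LLM on opaque ciphertext")

prompt_ct = client.encrypt(b"What's the weather like?")   # encrypted locally
try:
    reply_ct = server_side_llm(prompt_ct)                 # server never sees plaintext
    print(client.decrypt(reply_ct))                       # decrypted locally
except NotImplementedError as err:
    print(err)
```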
The documentation says it's TLS-encrypted on the way to the LLM context window; the LLM processes it, and the context window output goes back to you via TLS.
As long as the context window is only connected to Proton servers that decrypt the TLS tunnel, the LLM runs on their servers, and, much like a VPN, they don't keep logs, then I don't see what the problem actually is here.
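That's transport encryption rather than end-to-end encryption, though. A rough sketch of what that looks like, assuming a hypothetical HTTPS chat endpoint (the hostname and path below are made up for illustration):

```python
import http.client
import json

# Hypothetical endpoint; llm.example.com and /v1/chat are made up for illustration.
conn = http.client.HTTPSConnection("llm.example.com")   # TLS protects the data in transit
body = json.dumps({"prompt": "What's the weather like?"})
conn.request("POST", "/v1/chat", body, {"Content-Type": "application/json"})
# TLS terminates at the provider's server: it decrypts the request there, and
# the model works on the plaintext prompt -- the "decrypting the TLS tunnel"
# step described above. The response is then re-encrypted over TLS on the way back.
reply = conn.getresponse().read()
print(reply.decode())
```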
homomorphic encryption?
not there yet, of course, but it is conceptually possible
@wewbull@feddit.uk
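As a concrete taste of what "conceptually possible" means: partially homomorphic schemes already allow limited arithmetic on ciphertext today. A small sketch with the python-paillier (`phe`) package, which supports adding encrypted numbers and scaling them by plaintext constants; nowhere near enough to run an LLM, but it does show a server computing on data it never decrypts.

```python
# pip install phe
from phe import paillier

# Keys live with the user; the "server" side below only ever sees ciphertext.
public_key, private_key = paillier.generate_paillier_keypair()

enc_a = public_key.encrypt(3)
enc_b = public_key.encrypt(4)

# Server side: arithmetic performed directly on the encrypted values.
enc_sum = enc_a + enc_b      # ciphertext + ciphertext
enc_scaled = enc_a * 10      # ciphertext * plaintext constant

# Client side: only the key holder can read the results.
print(private_key.decrypt(enc_sum))     # 7
print(private_key.decrypt(enc_scaled))  # 30
```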