Is this actually practically achievable or mostly theoretical in a lab? Is it confirmed that the cops have actually managed to do this?
Sir. Haxalot
- 0 Posts
- 4 Comments
Joined 6 days ago
Cake day: February 3rd, 2026
You are not logged in. If you use a Fediverse account that is able to follow users, you can follow this user.
Sir. Haxalot@nord.pubto
Technology@lemmy.world•VS Code for Linux may be secretly hoarding trashed filesEnglish
1·5 days agoI can’t really tell if you’re joking or not but no, I’m saying that it’s a bug, and at no point anything is sent off your computer
Sir. Haxalot@nord.pubto
Technology@lemmy.world•VS Code for Linux may be secretly hoarding trashed filesEnglish
522·5 days agoI like that the article excerpt clearly says that it’s simply about files not being removed when the trash bin is emptied, and it’s a problem specific to the Canonical snap system… Yet every single other comment in here rants about Microsoft spyware. Not many people read beyond the headline, lol.

Maybe i misunderstand what you mean but yes, you kind of can. The problem in this case is that the user sends two requests in the same input, and the LLM isn’t able to deal with conflicting commands in the system prompt and the input.
The post you replied to kind of seems to imply that the LLM can leak info to other users, but that is not really a thing. As I understand when you call the LLM it’s given your input and a lot of context that can be a hidden system prompt, perhaps your chat history, and other data that might be relevant for the service. If everything is properly implemented any information you give it will only stay in your context. Assuming that someone doesn’t do anything stupid like sharing context data between users.
What you need to watch out for though, especially with free online AI services is that they may use anything you input to train and evolve the process. This is a separate process but if you give personal to an AI assistant it might end up in the training dataset and parts of it end up in the next version of the model. This shouldn’t be an issue if you have a paid subscription or an Enterprise contract that would likely state that no input data can be used for training.