- cross-posted to:
- [email protected]
- [email protected]
Researchers jailbreak a Tesla to get free in-car feature upgrades::A group of researchers found a way to hack a Tesla’s hardware with the goal of getting free in-car upgrades, such as heated rear seats.
I really wonder if there’s a way to use LLMs just to point out every concerning thing in a EULA/TOS
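Something along these lines would be the obvious starting point, a minimal sketch assuming the OpenAI Python client (the model name, prompt wording, and `eula.txt` file are placeholders, and a long TOS would need to be chunked to fit the context window):

```python
# Sketch of the idea above: feed a EULA/TOS to an LLM and ask it to flag
# clauses a consumer might find concerning. Model and prompt are assumptions
# for illustration; any chat-completion API would work similarly.
from openai import OpenAI

client = OpenAI()  # expects OPENAI_API_KEY in the environment


def flag_concerning_clauses(tos_text: str) -> str:
    """Ask the model to list clauses that work against the user's interest."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # assumed model; swap in whatever is available
        messages=[
            {
                "role": "system",
                "content": (
                    "You review terms-of-service documents. List every clause "
                    "that could be concerning for a consumer (data sharing, "
                    "arbitration, unilateral changes, content licenses), "
                    "quoting the clause and briefly explaining the concern."
                ),
            },
            {"role": "user", "content": tos_text},
        ],
    )
    return response.choices[0].message.content


if __name__ == "__main__":
    with open("eula.txt") as f:  # hypothetical local copy of the agreement
        print(flag_concerning_clauses(f.read()))
```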
To what end? Probably every EULA/TOS you click through has concerning shit that is against your best interest. Either you use the product or you don’t.
Yeah but I want to know just how fucked I am when I sign it
TL;DR: If you’re the consumer, you’re always the fucked party in a TOS.
That’s why EULAs or other contracts are not necessarily legally binding if they contain specific parts that could be considered “unfair”; at least in the European Union.
You can give this a try
https://www.tosdr.org/
Probably not ChatGPT, because who knows what’s in its own EULA, and we couldn’t use it to summarize that before agreeing to it.
Bet you could, but I’m not sure what that would get you. So you don’t click agree. Now what?