(sorry if anyone got this post twice. I posted while Lemmy.World was down for maintenance, and it was acting weird, so I deleted and reposted)
What, is it illegal to know how to make meth?
It’s not illegal to know. OpenAI decides what ChatGPT is allowed to tell you; it’s not the government.
It got upset when I asked it about self-trepanning
I had a very in-depth, detailed “conversation” about dementia and the drugs used to treat it. No matter what I said, ChatGPT refused to agree that we should even try giving PCP to dementia patients, because oh no, that’s a bad drug, off limits forever, even for research.
Fuck ChatGPT, I run my own local uncensored Llama 2 WizardLM.
Can you run that on a regular PC, speed-wise? And does it take a lot of storage space? I’d like to try out a self-hosted LLM as well.
It’s pretty much all about your GPU VRAM size. You can use almost any computer as long as it has a GPU (or two) that can load more than 8 GB into VRAM; it’s really not that computation-heavy. Storage only matters if you want to keep a lot of different LLMs or larger ones. For your average 7B model you’re only looking at around 10 GB of disk space. If you want to see how simple it is, there's a rough sketch below.
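For example, with llama-cpp-python (`pip install llama-cpp-python`, built with GPU support) loading a quantized 7B model and offloading it to the GPU is just a few lines. The model filename and layer count here are placeholders, not specific recommendations; any quantized 7B GGUF file you've downloaded will work:

```python
# Minimal sketch, assuming a quantized 7B GGUF model file already downloaded locally.
from llama_cpp import Llama

llm = Llama(
    model_path="./models/llama-2-7b-chat.Q4_K_M.gguf",  # quantized file, roughly 4 GB on disk
    n_gpu_layers=-1,  # offload all layers to the GPU; a quantized 7B fits in ~6-8 GB of VRAM
    n_ctx=2048,       # context window size
)

output = llm("Q: How much VRAM does a 7B model need? A:", max_tokens=128)
print(output["choices"][0]["text"])
```

If the model doesn't fit entirely in VRAM, you can lower `n_gpu_layers` to offload only part of it and keep the rest in system RAM, at the cost of speed.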
Yeah, if it were illegal to know, Wikipedia would have had issues.