Mixtral@sh.itjust.works to LocalLLaMA@sh.itjust.works • Mistral shocks AI community as latest open source model eclipses GPT-3.5 performance (English)
11 months ago
Something disturbing about the recent updates to Mistral:
Would it be possible to fork the Mistral code and remove the ethical-guideline parts? If so, we should create a community whose members could donate their hardware for training such a model or models… I am strongly against any censorship/ethical filtering in models.
I am wondering if I can run Mistral/Mixtral on my server. It doesn’t have a video card, but the RAM is almost unlimited: I have ~100GB unused and can top it up to 1TB if needed, and I can give it 20-25 vCPU cores (the rest of the CPU cores are already in use).
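CPU-only inference is generally feasible with runtimes such as llama.cpp using quantized GGUF weights, as long as the whole model fits in RAM. As a rough sanity check (a sketch, not a definitive sizing guide: the ~46.7B parameter count for Mixtral 8x7B and the ~20% overhead factor are assumptions, and real usage also depends on context length and the specific quantization):

```python
# Rough RAM estimate for CPU inference of Mixtral 8x7B.
# Assumptions (hypothetical figures): ~46.7B total parameters, plus ~20%
# runtime overhead for KV cache and buffers. All weights must be resident
# even though only a fraction of experts is active per token.
def model_ram_gb(params_billion: float, bytes_per_param: float,
                 overhead: float = 1.2) -> float:
    """Estimate RAM in GB: weights * precision * overhead factor."""
    return params_billion * bytes_per_param * overhead

PARAMS_B = 46.7  # assumed total parameter count for Mixtral 8x7B

for name, bpp in [("fp16", 2.0), ("8-bit", 1.0), ("4-bit", 0.5)]:
    print(f"{name}: ~{model_ram_gb(PARAMS_B, bpp):.0f} GB")
```

Under these assumptions, a 4-bit quantized Mixtral lands around 28GB, so ~100GB of RAM would be comfortable, and even fp16 (~112GB) would fit after topping up. Expect CPU inference to be much slower than GPU inference, though.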
Do cloud services see everything: the text/image data used for training, and the finished trained model? If so, runpod.io etc. are not a solution.
Is it censored?
Is the training code open too?