AI-generated videos now possible with gaming GPUs with just 6GB of VRAM
(www.tomshardware.com)
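For context on how these low-VRAM video pipelines usually squeeze into a 6GB card, here's a minimal sketch using Hugging Face diffusers. The model id and generation parameters are placeholders (the article isn't quoted here naming a specific pipeline); the point is the usual combination of half-precision weights plus sequential CPU offload, which streams submodules to the GPU instead of keeping the whole model resident.

```python
# Minimal sketch: text-to-video on a ~6GB GPU via aggressive offloading.
# Assumes a diffusers-compatible video pipeline; the model id below is a
# placeholder, not necessarily the model from the article.
import torch
from diffusers import DiffusionPipeline
from diffusers.utils import export_to_video

pipe = DiffusionPipeline.from_pretrained(
    "some-org/text-to-video-model",   # placeholder model id
    torch_dtype=torch.float16,        # half precision halves weight VRAM
)

# Instead of keeping the whole model on the GPU, move submodules over
# one at a time as they are needed. Slower, but VRAM use stays low.
pipe.enable_sequential_cpu_offload()

result = pipe(
    prompt="a corgi surfing a wave at sunset",
    num_inference_steps=25,           # parameter names vary by pipeline
)
export_to_video(result.frames[0], "corgi.mp4", fps=8)
```

Whether this actually fits in 6GB depends on the model, resolution, and frame count; VAE tiling is another common trick when the decode step is what blows past the budget.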
If generating AI content is so easy now, and DeepSeek can run locally on anything, what are all those datacenters working on?
Training, but at the same time, having 100 million people asking a variety of questions every day puts a real load on the servers.
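To put a rough number on that (the 100 million figure, the peak factor, and the per-request GPU time below are illustrative assumptions, not measurements), a quick back-of-envelope estimate:

```python
# Back-of-envelope: what 100M requests/day implies for serving capacity.
# All numbers are assumptions for illustration, not measurements.
DAILY_REQUESTS = 100_000_000
PEAK_FACTOR = 3               # traffic bunches up; assume peak ~3x average
GPU_SECONDS_PER_REQUEST = 2   # assumed GPU time to generate one response

avg_rps = DAILY_REQUESTS / 86_400          # requests per second, averaged
peak_rps = avg_rps * PEAK_FACTOR
gpus_busy_at_peak = peak_rps * GPU_SECONDS_PER_REQUEST

print(f"average requests/sec: {avg_rps:,.0f}")            # ~1,157
print(f"peak requests/sec:    {peak_rps:,.0f}")            # ~3,472
print(f"GPUs busy at peak:    {gpus_busy_at_peak:,.0f}")   # ~6,944
```

So even before counting training runs, just keeping up with inference at that scale means thousands of GPUs running around the clock.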