Wait a second.
Grok's symbol is a lightning bolt?
(None of them are THAT much better than the previous ones, and even worse in some areas)
how about you grok deez nuts (it is 1 am please forgive me)
OK Tech bros, repeat after me:
"Thou shalt not make a machine in the likeness of a human mind."
I remember how we would get into trouble for copying each other's homework in high school.
Now we get in trouble for generating each other's homework 🤷
I wonder if a good fine-tuned model beats every general-purpose LLM when you need it for a really specific purpose
Yes it does
Of course, and this is why the new hotness is Mixture of Experts: one model that is effectively a bunch of experts arguing over the answer. On a different scale there's also the Combination of Agents approach, where different specialized agents perform specialized tasks.
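A toy sketch of the sparse-routing idea behind Mixture of Experts (everything here — the gate weights, the experts, the function names — is invented for illustration, not any real model's implementation): a small gate scores each expert for a given input, only the top-k experts actually run, and their outputs are blended by the renormalized gate probabilities.

```python
import math

def softmax(xs):
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def moe_forward(x, experts, gate_weights, k=2):
    """Route input x to the top-k experts chosen by a linear gate,
    then return the gate-weighted sum of their outputs."""
    # Gate score per expert: dot product of the input with that expert's gate row.
    scores = [sum(xi * wi for xi, wi in zip(x, w)) for w in gate_weights]
    probs = softmax(scores)
    # Sparse activation: only the k highest-scoring experts run.
    top = sorted(range(len(experts)), key=lambda i: probs[i], reverse=True)[:k]
    norm = sum(probs[i] for i in top)
    out = [0.0] * len(x)
    for i in top:
        y = experts[i](x)          # run just this expert
        w = probs[i] / norm        # renormalize over the selected experts
        out = [o + w * yi for o, yi in zip(out, y)]
    return out, top

# Toy demo: three "experts" that are just simple functions.
experts = [lambda x: [2 * xi for xi in x],
           lambda x: [xi + 1 for xi in x],
           lambda x: [-xi for xi in x]]
gate = [[1.0, 0.0], [0.0, 1.0], [0.5, 0.5]]
out, chosen = moe_forward([1.0, 2.0], experts, gate, k=2)
```

The point of the sparsity is cost: a model can hold many experts' worth of parameters while only paying the compute of k of them per token.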
There's a new project sharing fine-tuned ModernBERT models for specific tasks. Here's the org: https://huggingface.co/adaptive-classifier
I thought DeepSeek's selling point was efficiency?
It was beating near-frontier models at a tenth of the cost could be hosted by you or any cloud service for you and the open weights were not censored for China's PR and could be jailbroken to say write code for shady stuff which any frontier model would refuse.
You dropped these: ,,,,,,
No time for commas with how fast this tech is developing
AI as a tech product is advancing faster than any other tech I've ever seen; you mentioning DeepSeek's revelations from like 8 months ago already feels like an eternity
The most surprising thing about this image is that Grok is on it; they started way behind the 8-ball and have caught up
It’s flip-flopping between being a full-on Nazi and arguing against MAGA with facts and logic.
The RL is so good Grok changed its personality by changing a small part of its system prompt
Anakin: It's aligned.
Padme: To be good?
It changed after Grok 3
yeah i hate musk but the fact grok was launched in nov 2023 years behind the competition and has caught up is shocking
They got the whole Twitter database. It's kinda the same with Gemini. But somehow Meta isn't catching up, maybe their llama 4 architecture isn't that stable to train.
Or maybe Facebook data is even worse than Twitter?
Llama 3.3 was good, tho. For multimodal, Llama 4 also uses the Llama 3.2 approach, where image and text are handled by a single model instead of using CLIP or SigLIP.
Welcome to LocalLLaMA! Here we discuss running and developing machine learning models at home. Let's explore cutting-edge open-source neural network technology together.
Get support from the community! Ask questions, share prompts, discuss benchmarks, get hyped at the latest and greatest model releases! Enjoy talking about our awesome hobby.
As ambassadors of the self-hosting machine learning community, we strive to support each other and share our enthusiasm in a positive, constructive way.
Rules:
Rule 1 - No harassment or personal character attacks on community members, i.e. no name-calling, no generalizing entire groups of people that make up our community, no baseless personal insults.
Rule 2 - No comparing artificial intelligence/machine learning models to cryptocurrency, i.e. no comparing the usefulness of models to that of NFTs, no claiming the resources required to train a model are anything close to those needed to maintain a blockchain or mine crypto, no implying it's just a fad/bubble that will leave people with nothing of value when it bursts.
Rule 3 - No comparing artificial intelligence/machine learning to simple text prediction algorithms, i.e. statements such as "LLMs are basically just simple text prediction like what your phone keyboard autocorrect uses, and they're still using the same algorithms since <over 10 years ago>."
Rule 4 - No implying that models are devoid of purpose or potential for enriching people's lives.