Image is of Colombian President Gustavo Petro giving a speech at the UN in 2022.
Trump has arrived in office with the force of an avalanche: ~~ending~~ slowing a genocide on the one hand, while simultaneously promising a total nightmare for minorities and the poor both inside and outside the United States on the other. [edited for clarity; I do not actually think Trump has ended the Palestinian genocide, obviously - I was making a joke. But the ceasefire is a genuine improvement in conditions for millions of people right now who are on the edge of death, so it cannot be dismissed]
It's still far too early to truly compare and contrast his imperial strategy with Biden's, but initial signs suggest something of a reorientation. Biden was famous for being two-faced: ostensibly offering aid and stability, while also blowing up your pipeline to ensure you did not actually have an alternative to what he offered. Trump, meanwhile, seems only really capable of aggression, threatening several "allied" nations with tariffs that may as well be sanctions, given the economic harm they'd do. I suspect we'll be debating for a long time how much of this can be attributed to the specific characteristics of Trump, and how much he merely embodies the zeitgeist of imperial decline - a wounded empire lashing out with extreme violence to try and convince everybody, including itself, that it can still be the world's imperialist hegemon.
I'll admit it: I did not believe that Trump would actually go ahead with putting tariffs on basically anybody who annoys him. And while the threat could still prove empty with regard to countries like China and Canada, Colombia is the first indication of his strategy's potential. Despite some fiery words from President Petro, after Trump's administration revealed the punishment Colombia would face if it did not agree, it appears that Colombia will in fact be accepting deported migrants after all. It's funny how that works.
Last week's thread is here. The Imperialism Reading Group is here.
Please check out the HexAtlas!
The bulletins site is here. Currently not used.
The RSS feed is here. Also currently not used.
Israel-Palestine Conflict
Sources on the fighting in Palestine against Israel. In general, CW for footage of battles, explosions, dead people, and so on:
UNRWA reports on Israel's destruction and siege of Gaza and the West Bank.
English-language Palestinian Marxist-Leninist twitter account. Alt here.
English-language twitter account that collates news.
Arabic-language twitter account with videos and images of fighting.
English-language (with some Arabic retweets) Twitter account based in Lebanon. - Telegram is @IbnRiad.
English-language Palestinian Twitter account which reports on news from the Resistance Axis. - Telegram is @EyesOnSouth.
English-language Twitter account in the same group as the previous two. - Telegram here.
English-language PalestineResist telegram channel.
More telegram channels here for those interested.
Russia-Ukraine Conflict
Examples of Ukrainian Nazis and fascists
Examples of racism/euro-centrism during the Russia-Ukraine conflict
Sources:
Defense Politics Asia's youtube channel and their map. Their youtube channel has substantially diminished in quality but the map is still useful.
Moon of Alabama, which tends to have interesting analysis. Avoid the comment section.
Understanding War and the Saker: reactionary sources that have occasional insights on the war.
Alexander Mercouris, who does daily videos on the conflict. While he is a reactionary and surrounds himself with likeminded people, his daily update videos are relatively brainworm-free and good if you don't want to follow Russian telegram channels to get news. He also co-hosts The Duran, which is more explicitly conservative, racist, sexist, transphobic, anti-communist, etc when guests are invited on, but is just about tolerable when it's just the two of them if you want a little more analysis.
Simplicius, who publishes on Substack. Like others, his political analysis should be soundly ignored, but his knowledge of weaponry and military strategy is generally quite good.
On the ground: Patrick Lancaster, an independent and very good journalist reporting in the warzone on the separatists' side.
Unedited videos of Russian/Ukrainian press conferences and speeches.
Pro-Russian Telegram Channels:
Again, CW for anti-LGBT and racist, sexist, etc speech, as well as combat footage.
https://t.me/aleksandr_skif ~ DPR's former Defense Minister and Colonel in the DPR's forces. Russian language.
https://t.me/Slavyangrad ~ A few different pro-Russian people gather frequent content for this channel (~100 posts per day), some socialist, but all socially reactionary. If you can only tolerate using one Russian telegram channel, I would recommend this one.
https://t.me/s/levigodman ~ Does daily update posts.
https://t.me/patricklancasternewstoday ~ Patrick Lancaster's telegram channel.
https://t.me/gonzowarr ~ A big Russian commentator.
https://t.me/rybar ~ One of, if not the, biggest Russian telegram channels focussing on the war out there. Actually quite balanced, maybe even pessimistic about Russia. Produces interesting and useful maps.
https://t.me/epoddubny ~ Russian language.
https://t.me/boris_rozhin ~ Russian language.
https://t.me/mod_russia_en ~ Russian Ministry of Defense. Does daily, if rather bland, updates on the number of Ukrainians killed, etc. The figures appear to be approximately accurate; if you don't believe them, reduce all numbers by 25% as a 'propaganda tax'. Does not cover everything, for obvious reasons, and virtually never details Russian losses.
https://t.me/UkraineHumanRightsAbuses ~ Pro-Russian, documents abuses that Ukraine commits.
Pro-Ukraine Telegram Channels:
Almost every Western media outlet.
https://discord.gg/projectowl ~ Pro-Ukrainian OSINT Discord.
https://t.me/ice_inii ~ Alleged Ukrainian account with a rather cynical take on the entire thing.
![](https://hexbear.net/pictrs/image/ddf41891-a9ec-4417-8d10-a342a09dd2ce.webp)
The Short Case for Nvidia Stock
This is not a finance article, nor is it financial advice despite the title (obligatory disclaimer: I do not condone any gambling, nor do I hold any positions in Nvidia) - it's genuinely an excellent breakdown of the DeepSeek "breakthrough".
To quote from a summary I read from Twitter:
That’s it! There is nothing groundbreaking about DeepSeek. There is no fundamentally new invention here (compared to the original transformer architecture). It’s simply a bunch of Chinese interns who, looking for shortcuts to bypass the tech sanctions, stumbled upon a simple yet elegant solution to the problem.
What is groundbreaking though is that apparently none of the OpenAI engineers getting paid >500k per year managed to come up with a solution like this.
And this really is an indictment of the entire field of “AI”, of the kind of people who are getting paid a fortune, and of how overhyped, bloated and unrealistic it is to burn through $500 billion chasing their AGI scam, which is never going to work. (The real AGI is as far as you can get from the current iterations of artificial-neural-network-based architectures, and we barely have any idea what it would look like.)
The question is: will it burst the AI bubble? Are we getting another AI winter? Or will the US government find a way around it, like it did with all the crypto scam bullshit and the trillion-dollar F-35 project?
The entire AI bubble is just a massive scam lol. Their version of AGI is a pipe dream that is never going to materialize, beyond burning through $500 billion of wasted GPU hardware.
However I have been wrong about bitcoin before so I won’t celebrate just yet. The crypto idiots have earned their right to laugh back at me.
AGI is a pipe dream, but as long as bullshit jobs exist, people will use AI to do them. Those people may get fired, but then it'll be their boss running everything through ChatGPT instead.
That said, probably not worth a trillion dollars, maybe not even worth enough to make a profit. We'll see how long that pipe dream can keep glorified chatbots going
This is another case of white people video games weighing 500 GB because no one knows how to optimize their stuff
i am begging the white tech sector to be as civilized as the chinese tech sector
Theres just no fucking way no one thought of this before ahahahahahah what the actual fuck
8 bit is kinda impressive honestly. I could see going from 32 bit to 16 bit, but 8 bit requires some confidence that you're only going to be dealing with fairly small numbers. It's obviously something the US players should've done by now but I have a feeling the coding done by the Chinese to make this work was not insignificant
It’s genuinely funny. To be fair, the DeepSeek team used some impressive sleight of hand to make 8-bit floating point precision mesh well, but the OpenAI people are getting paid 500k a year and couldn’t even think of this. How.
I’ve spoken to an Nvidia engineer before who told me how they have optimized for 8- and even 4-bit FP operations using dedicated hardware and lookup tables for the explicit purpose of ML tasks. So somebody has used this before, just not in this way, I suppose.
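For what it's worth, the basic trick being discussed here - storing weights in 8 bits plus a scale factor, instead of full 32-bit floats - can be sketched in a few lines. This is a hypothetical illustration using simple symmetric integer quantization (not DeepSeek's actual FP8 scheme, which is more involved):

```python
import numpy as np

def quantize_int8(w):
    """Map each weight to an int8 in [-127, 127] plus one float scale."""
    scale = np.abs(w).max() / 127.0  # largest magnitude maps to 127
    q = np.clip(np.round(w / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    """Recover an approximate float32 tensor for use in a matmul."""
    return q.astype(np.float32) * scale

rng = np.random.default_rng(0)
w = rng.standard_normal((256, 256)).astype(np.float32)
q, scale = quantize_int8(w)

print(w.nbytes, q.nbytes)  # 4x less memory: 1 byte per weight vs 4
# rounding error per weight is bounded by half a quantization step
print(np.abs(dequantize(q, scale) - w).max() < scale)
```

The point of the thread stands: none of this is exotic, the cleverness is in making it numerically stable at scale.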
I beg to differ. Finding ways to make do within a resource-constrained environment is the very heart of innovation. Especially compared to throwing a dragon's hoard of money at bigger data centers and more stolen personal data, which has been the style in Silicon Valley for some time now.
I won't be satisfied until relativity is broken
No, see, you can't break relativity or else we'll all go back in time and suck off our grandfathers. or something, I tend to clock out when sci fi shit is happening
It’s an impressive piece of complex engineering with far reaching applications, don’t get me wrong, and it’s damn impressive to put everything together and get it to work.
But fundamentally new invention (at least in this particular field) is like when the transformer architecture was introduced that opened up an entirely new generation of neural network architectures and completely changed the field.
Think of it in terms of basic vs applied research.
32-bit weights on the panoptidataset for the GPU melter![emoji peltier-laugh peltier-laugh](https://hexbear.net/api/v3/image_proxy?url=https%3A%2F%2Fwww.hexbear.net%2Fpictrs%2Fimage%2F7f45d169-1810-4e53-ae51-6d9b385a1291.png)
did no one commit to trying even just, like, half-precision??? we need 8 decimal places for training our predictive text model because. because um. because
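To put rough numbers on the joke above (numpy as a stand-in; this is just illustrative arithmetic, not any particular model's layout):

```python
import numpy as np

# Bytes per weight at each storage width; halving the width halves
# memory and roughly halves bandwidth, which is the whole appeal of
# half- (or quarter-) precision training.
for dt in (np.float64, np.float32, np.float16, np.int8):
    d = np.dtype(dt)
    print(f"{d.name:8s} {d.itemsize} byte(s) per weight")

# And float32 only carries ~6-7 significant decimal digits anyway --
# nobody is getting "8 decimal places" out of single precision.
print(np.finfo(np.float32).precision)  # 6
```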
I wrote about this elsewhere on HB, but the reasons for this debacle are structural, it's not just a matter of crackers cracka-lackin'.
There are dozens of managers in Meta, OpenAI etc who are getting paid millions - more than DS was worth altogether. This wasn't (just) a mistake - it's how corpo hell works. The moment the AI hype cycle revved up all these careerist fucks must have started politicking their way into the next big thing. Why ask for less money and a smaller data center, when this will mean your budget next quarter will be cut? Or means that that fucker Kyle from the country club is gonna get an edge on you come promotion time. Why worry about resource constraints at all when VC and Wall Street are begging you to take their money? The only innovation capitalism breeds is grift.
Yeah, management is less than useless and is probably taking up more of the budget than the engineers. The more senior, the less they matter. They are experts in ass-kissing and justifying their bs positions, so they will likely stay while putting more of the pressure on engineers.
But not only is management part of the problem - I think the whole culture of burning through people via hire-fast/fire-fast is also going to be a problem for Americans trying to catch up. It works when you have to get barely-functioning slop out the door, but I don't think it can produce something truly innovative or optimized.
I think if America wants to actually compete, they would have to completely dismantle the corporate structure, but I don't think that will happen. What's going to happen is that these companies are going to get more money than god with little oversight, but it won't work when most of that money is going to the bloat. In my view, nothing innovative has come without government intervention (excluding maybe the transistor).
For the last five years, all the executives in the company I work for could talk about is "putting AI in that". It was a "solution" without a problem. And now we do have LLM capabilities added in...which, trust me, are absolutely useless to our platform. But they're there, so they can be sold.
Drives me fucking insane. I look forward to seeing how much DeepSeek makes them whine and cry and scramble. I do kinda hope I still have a job afterward, though.![emoji limmy-awake limmy-awake](https://hexbear.net/pictrs/image/563134a5-db17-403a-aed5-e971120e5474.png)
yep, i wonder how much of the american industry has been influenced by blind pursuit of model fit/precision scores, no matter how marginal and practically insignificant, because shareholders expect that the "best" model with the best precision is the most profitable. Why not consume 50x as much energy and memory as necessary for a 0.001% gain in one of ten performance metrics if it means doubling your shareholder value?!
put another way, 'who cares how much returns diminish when we can fund them with the money printer that turns into an infinite money printer when we win'
Fuck it, let's just do unsigned integers.
my first compsci prof essentially pulled the whole class aside to do a whole "do not, my friends, become addicted to doubles" speech in the data types lecture, and we're talking programs that were handling, like, at max a hundred kilos of data for that class.
Every starting compsci class will tell you that, which makes the whole thing that much funnier. It was probably like one or two people who made the call to use doubles without really thinking about it and here we are
to be fair: it is singles, I was referencing the doubles aside as a general "you should be aware of the scope of the data you plan to work with so you aren’t wasteful" lecture.
to not be fair: you should probably not be using that much depth for predictive text weights when you're working with petabytes of data, like, be so fucking for real here, did it not occur to you that may be an unnecessary bottleneck![emoji kiryu-stare kiryu-stare](https://hexbear.net/pictrs/image/2f3d3989-f8c3-4dab-b58c-cb7a5bcbf31f.png)
The sign bit is pretty useful in the curve fitting coefficients, and is probably too much work to try to do without. But otherwise: yeah, probably.
Any more complaints and I'm making it a bool
So a lot of people replied talking about commercial AI development, but I think the interesting thing is that even Western university AI research hadn't really caught up to this. I have a prof who's been researching ways to get deep learning models to run locally on minimal hardware and was familiar with quantization (although I think he was doing 16-bit), compression, all the tactics I've heard DeepSeek was using. I wonder if maybe the thing that makes R1 so much better is the whole chain-of-thought (CoT) thing? Maybe no one was trying CoT with a much smaller model?
That's likely because there is no capital benefit to reducing the size of the model. Sam Altman is the chair of a nuclear energy corporation, and if the models continue to require more energy with every new release, well, that's just good for business.