It's probably either waiting for approval to sell ads, or it was denied and they're adding more stuff. Google has a virtual monopoly on ads, and their approval process can take 1-2 weeks. Google's content policy basically demands that your site be full of generated trash before they'll sell ads on it. I did a case study here, in which Google denied my popular and useful website for ads until I filled it with the lowest-quality generated trash imaginable. That might help clarify what's up.
It's not a solution, but as a mitigation, I'm trying to push the idea of an internet right of way into the public consciousness. Here's the thesis statement from my write-up:
I propose that if a company wants to grow by allowing open access to its services to the public, then that access should create a legal right of way. Any features that were open to users cannot then be closed off so long as the company remains operational. We need an Internet Rights of Way Act, which enforces digital footpaths. Companies shouldn't be allowed to create little paths into their sites, only to delete them, forcing guests to pay if they wish to maintain access to the networks that they built, the posts that they wrote, or whatever else it is that they were doing there.
As I explain in the link, rights of way already exist in the physical world, so the idea is easily explained to even the less technically inclined, and it gives us a useful legal framework for how digital rights of way should work.
I do software consulting for a living. A lot of my practice is small organizations hiring me because their entire tech stack is a bunch of shortcuts taped together into one giant teetering monument to moving as fast as possible, and they managed to do all of that while still having to write every line of code.
In 3-4 years, I'm going to be hearing from clients about how they hired an undergrad who was really into AI to build the core of their codebase, and now everyone is afraid to even log into the server because the slightest breeze might collapse the entire thing.
LLM coding is going to be like every other industrial automation process in our society. We can now make a shittier thing way faster, without thinking of the consequences.
That's a bad faith gotcha and you know it. My lemmy account, the comment I just wrote, and the entire internet you and I care about and interact with are a tiny sliver of these data warehouses. I have actually done sysadmin and devops for a giant e-commerce company, and we spent the vast majority of our compute power on analytics for user tracking and advertising. The actual site itself was tiny compared to our surveillance-value-extraction work. That was a major e-commerce website you've heard of.
Bitcoin alone accounted for half a percent of the entire world's electricity consumption a couple of years ago. That's just bitcoin, not even including the other cryptocurrencies. Now, with the AI hype, companies are building even more of these warehouses to train LLMs.
We are usually not given a good example of how bad things actually happen. We imagine the barbarians storming the gate, raping and pillaging. That does happen, but more often, things getting worse is more complicated, and it affects different people at different times.
For the one in five (!!) children facing hunger, our society has failed. For a poor person with diabetes and no medical insurance, our society has already failed. For an Uber driver with no family support whose car broke down and who missed rent, facing eviction, society is about to break down. I'm a dude in my mid-thirties who writes code, so for me, things are fine, but if I get hit by a bus tomorrow and lose the ability to use my hands, society will probably fail for me.
More and more people are experiencing that failure. Most of us are fine, but our being fine is becoming incredibly fucking precarious. More often than not, society collapsing looks like a daily constitution saving throw that becomes harder and harder to pass, and more and more of us fail it after a stroke of bad luck here or there.
Understanding society this way is important, and it's why solidarity is the foundation of leftist politics. I march for people without healthcare because I care about them, and also, because there but for the grace of god go I. Bakunin put this beautifully over 150 years ago:
I am truly free only when all human beings, men and women, are equally free. The freedom of other men, far from negating or limiting my freedom, is, on the contrary, its necessary premise and confirmation.
I'd say less than a week. Capitalism is something that we have to wake up and make happen every single day. How many days' worth of food does the average person have? Definitely not 45 days. People would have to start self-organizing within 2-3 days, and in doing so, they would actively make something that isn't capitalism, which directly challenges those in power.
This is why every time there are emergencies or protests, the media is obsessed with "looting." If there's no food because of a hurricane or whatever, it is every single person's duty to redistribute what there is equitably. The news and capitalists (but I repeat myself) call that "looting," even when it's a well-organized group of neighbors going into a closed store to distribute spoiling food to hungry people.
Rebecca Solnit writes about this in detail in A Paradise Built in Hell. It's really good. She's an awesome writer.
The purpose of a system is what it does. "There is no point in claiming that the purpose of a system is to do what it constantly fails to do." These articles about how social media is broken are constant. It's just not a useful way to think about it. For example:
It relies on badly maintained social-media infrastructure and is presided over by billionaires who have given up on the premise that their platforms should inform users
These platforms are systems. They don't have intent. There's no mens rea or anything. There is no point saying that social media is supposed to inform users when it constantly fails to inform users. In fact, it has never informed users.
Any serious discussion about social media must accept that the system is what it is, not that it's supposed to be some other way, and is currently suffering some anomaly.
That sucks, but I'd argue it's even worse. Not only do they tweak your results to make more money, but because Google has a monopoly on web advertising, and (like it or not) advertising is the main internet funding model, Google gets to decide whether or not your website gets to generate revenue at all. They literally have an approval process for serving ads, and it is responsible for the proliferation of LLM-generated blogspam. Here's a thing I wrote about it, in which I tried to get my already-useful and high-quality website approved for ads, complete with the before and after, if you're curious. The after is a wreck.
Is that really all they do though? That's what they've convinced us that they do, but everyone on these platforms knows how crucial it is to tweak your content to please the algorithm. They also do everything they can to become monopolies, without which it wouldn't even be possible to start on DIY videos and end on white supremacy or whatever.
I wrote a longer version of this argument here, if you're curious.
This study is an agent-based simulation:
The researchers used a type of math called “agent-based modeling” to simulate how people’s opinions change over time. They focused on a model where individuals can believe the truth, the fake information, or remain undecided. The researchers created a network of connections between these individuals, similar to how people are connected on social media.
They used the binary agreement model to understand the “tipping point” (the point where a small change can lead to significant effects) and how disinformation can spread.
Personally, I love agent-based models. I think agent modeling is a very, very powerful tool for systems insight, but I don't like this article's interpretation, nor am I convinced the author of this article really groks what agent-based modeling is. It's a very different kind of "study" than what most people mean when they use that word, and interpreting the insights is its own can of worms.
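To make concrete what kind of "study" this is, here's a minimal sketch of a binary agreement model like the one the article describes. This is my own illustrative toy, not the researchers' code: it uses a well-mixed population instead of their network, and every parameter (population size, committed fraction, step count) is an assumption I picked for demonstration.

```python
import random

def binary_agreement(n=500, committed=0.15, steps=300_000, seed=1):
    """Toy binary agreement model in a well-mixed population.

    Agents hold opinion sets: {'A'}, {'B'}, or {'A', 'B'} (undecided).
    A committed minority holds only {'A'} and never updates; everyone
    else starts convinced of 'B'. Each step, a random speaker voices one
    of its opinions to a random listener: if the listener already holds
    it, both (if not committed) collapse to that single opinion;
    otherwise the listener just adds it to its set.
    """
    rng = random.Random(seed)
    n_c = int(n * committed)  # indices [0, n_c) are committed to 'A'
    ops = [{'A'} for _ in range(n_c)] + [{'B'} for _ in range(n - n_c)]
    for _ in range(steps):
        s, l = rng.randrange(n), rng.randrange(n)
        if s == l:
            continue
        word = rng.choice(sorted(ops[s]))  # speaker voices one opinion
        if word in ops[l]:                 # shared opinion: collapse to it
            if l >= n_c:
                ops[l] = {word}
            if s >= n_c:
                ops[s] = {word}
        else:                              # new to listener: becomes undecided
            if l >= n_c:
                ops[l] = ops[l] | {word}
    # Fraction of the population fully converted to 'A'
    return sum(1 for o in ops if o == {'A'}) / n

frac_above = binary_agreement(committed=0.15)  # above the tipping point
frac_below = binary_agreement(committed=0.03)  # below it
```

The interesting behavior is the tipping point the article mentions: below some critical committed fraction, the majority opinion is essentially stable; above it, the committed minority eventually flips the whole population. Note that the "result" is a property of these simple rules, not an empirical observation about people, which is exactly why interpreting such studies is a can of worms.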
Just a heads up, for those of you casually scrolling by.
The real problem with LLM coding, in my opinion, is something much more fundamental than whether it can code correctly or not. One of the biggest problems coding faces right now is code bloat. In my 15 years writing code, I write so much less code now than when I started, and spend so much more time bolting together existing libraries, dealing with CI/CD bullshit, and all the other hair that software projects have started to grow.
The amount of code is exploding. Nowadays, every website uses ReactJS. Every single tiny website loads god knows how many libraries. Just the other day, I forked and built an open source project that had a simple web front end (a list view, some forms -- basic shit), and after building it, npm informed me that it had over a dozen critical vulnerabilities, and dozens more of high severity. I think the total was something like 70?
All code now has to be written at least once. With ChatGPT, it doesn't even need to be written once! We can generate arbitrary amounts of code all the time whenever we want! We're going to have so much fucking code, and we have absolutely no idea how to deal with that.
I have worked at two different startups where the boss explicitly didn't want to hire anyone with kids and had to be informed that there are laws about that, so yes, definitely anti-parent. One of them also kept saying that he only wanted employees like our autistic coworker when we asked him why he had spent weeks rejecting every interviewee we had liked. Don't even get me started on the people the CEO wouldn't have a beer with, and how often they just so happened to be women or foreigners! Just gross shit all around.
It's very clear when you work closely with founders that they see their businesses as a moral good in the world, and as a result, they feel a lot of entitlement in their relationship with labor. They view labor laws as inconveniences standing in the way of their moral imperative to grow the startup.