In game dev, a binary file conflict means someone is going to have to do their work a second time.

It was truly glorious. For BF3 and BF4 I helped run a server, and we had that thing full with 64 people from 5 pm to 3 am every day.

This is definitely written by an AI, and it's pure laziness when people don't fix the styling, which is what makes it so obvious.

I ate the whole fucking thing; I didn't realize until the last sentence. It's not that this is outside the scope of reality, it's just that this dementia patient had not come up with that idea.

I have made worse. I used to do a cascading merge every day to move stuff from dev branches to staging to production. Then I did a merge in the opposite direction for a small selection of branches so they could get their updates from staging. Feature branches were rebased as needed.
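For anyone curious, here's a minimal sketch of that routine in Python; every branch name (dev, staging, production, team-a, team-b) is a made-up stand-in for the real setup:

```python
# Sketch of the daily cascading merge; all branch names are hypothetical.
import subprocess

def git(*args):
    """Run a git command, failing loudly on conflicts."""
    subprocess.run(["git", *args], check=True)

# Cascade changes downstream: dev -> staging -> production.
for src, dst in [("dev", "staging"), ("staging", "production")]:
    git("checkout", dst)
    git("merge", "--no-ff", src)

# Reverse merge: a small selection of branches pulls its updates
# back from staging.
for branch in ["team-a", "team-b"]:  # hypothetical selection
    git("checkout", branch)
    git("merge", "staging")

# Feature branches were rebased onto dev as needed, e.g.:
# git("checkout", "feature-x"); git("rebase", "dev")
```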

"During the gold rush, sells shovels model" Is a perfect analogy. It worse, OpenAi and similar companies don't find using any sluice machines profitable. It is basically a gold rush were even the big operations don't want to be involved with anything, other than selling shovels. This bubble is going to burst and there is an unimaginable about of money invested if this scam.

An undocumented feature flag in a plug-in that drastically changes the behavior in any deployment mode.

The funny thing is that reversing the decision proves that they are only interested in money, not customer satisfaction or the product itself.

I think it is too risky to deal with a business that flips on policy; it's too unpredictable.

Don't. Unless you are confident you are not adding hot garbage to the code base.

The guy at work who managed git before me, well, didn't quite have the knowledge I do and was not using LFS. In one of the main repos, a 200 MB binary was pushed 80+ times, and it's not the only file this happened to. Even if you do a shallow clone, you still need to add the commit depth eventually. It's a nightmare.
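For anyone who inherits a repo like that, here's a rough sketch of what I mean, with a placeholder URL and size threshold. A shallow clone only defers the download; a blob-filtered partial clone actually skips the old binaries:

```python
# Why shallow clones don't save you, and what does.
# The repo URL and the 10 MB threshold are placeholders.
import subprocess

def git(*args, cwd=None):
    subprocess.run(["git", *args], check=True, cwd=cwd)

# Shallow clone: history is skipped, so the 80+ old copies of the
# 200 MB binary are not downloaded up front...
git("clone", "--depth", "1", "https://example.com/big.git", "big")

# ...but the moment you need more history, deepening fetches the
# old blobs right along with the commits.
git("fetch", "--deepen", "100", cwd="big")

# A blob-filtered partial clone keeps full history but lazily
# skips large blobs until a checkout actually needs them.
git("clone", "--filter=blob:limit=10m",
    "https://example.com/big.git", "big-lite")
```

And as far as I know, `git lfs migrate import --everything` can rewrite the old commits onto LFS pointers after the fact, at the cost of everyone re-cloning.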
