In Europe cashiers don't pack your groceries: they just scan the stuff and handle the payment, and you pack the groceries yourself. And you'd better be quick, because they're gonna start scanning the next customer's stuff immediately. I think this would solve many such cases.
What's the bug?
Finite games are all determined: either player 1 has a winning strategy or player 2 does, all other "outcomes" are just mental illnesses. Get over it, math doesn't care about your feelings.
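For what it's worth, the reason this works is just backward induction: label every position of a finite game from the leaves up and each one gets a definite winner. Here's a toy sketch in Python (the game tree is made up for illustration, it's not any particular game):

```python
# Backward induction on a tiny two-player game tree with no draws.
# Internal nodes are lists of positions the player to move can reach;
# leaves are integers saying which player has won. The tree is made up.

def winner(node, player_to_move):
    """Return 1 or 2: the player who wins under optimal play."""
    if isinstance(node, int):          # leaf: the winner is written down
        return node
    opponent = 2 if player_to_move == 1 else 1
    results = [winner(child, opponent) for child in node]
    # The player to move wins if any move leads to a position they win.
    return player_to_move if player_to_move in results else opponent

# A made-up game: lists are choices, integers are final results.
toy_game = [
    [1, [2, 1]],
    [2, [2, 2]],
]

print(winner(toy_game, 1))  # always prints 1 or 2, never "undetermined"
```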
That's like... its purpose. Compilers always have a frontend and a backend. Even when a compiler is built entirely from scratch (like Java's or Go's), it's split into a frontend and a backend; that's just how they're made.
So it makes sense to invest in just a few highly advanced backends (LLVM, GCC, MSVC) and then just build frontends for those. Most projects choose LLVM because, unlike the others, it was purpose-built to be common ground, but that's not a rule. For example, there's an in-development Rust frontend for GCC.
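To make the split concrete, here's a toy sketch (everything in it is invented for illustration, it's not how LLVM or GCC actually work internally): the frontend turns source text into a shared intermediate representation, and any backend that understands that IR can be reused as-is.

```python
# Toy illustration of the frontend/backend split. All names are invented.

# --- Frontend: source text -> common IR (here, a list of stack-machine ops)
def frontend(source: str):
    """Compile expressions like '1 + 2 + 3' into a tiny IR."""
    ir = []
    for token in source.split():
        if token == "+":
            continue
        ir.append(("push", int(token)))
    ir.extend([("add",)] * (len(ir) - 1))
    return ir

# --- Backend A: IR -> pretend "assembly" text for some target
def backend_asm(ir):
    return "\n".join(" ".join(str(x) for x in op) for op in ir)

# --- Backend B: IR -> just interpret it, like a bytecode VM would
def backend_interp(ir):
    stack = []
    for op in ir:
        if op[0] == "push":
            stack.append(op[1])
        elif op[0] == "add":
            stack.append(stack.pop() + stack.pop())
    return stack.pop()

ir = frontend("1 + 2 + 3")     # one frontend...
print(backend_asm(ir))         # ...reused by multiple backends
print(backend_interp(ir))      # prints 6
```

The point is that the IR is the only thing the two halves agree on, which is why one good backend can serve many languages.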
They die. Full stop.
Not even Microsoft had the strength to maintain a browser engine; that's why they moved Edge to Chromium. They gave up.
Tech Bros make a panopticon and call it a novel approach
The USB protocol was simple by design, so it could be implemented in small dumb devices like pen drives. More specifically, it used two pairs of wires: one pair for power and the other for data (four wires in total). Having a single half-duplex data pair means you need some way of arbitrating who can send data at any given time. The easiest way to do that is to have a single machine decide who gets to send (the master), and the easiest way to pick the master is to not decide at all and have the computer always be the master. This means you couldn't connect two computers together, because they would both try to be the master.
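If you want to picture the arbitration, here's a toy simulation (not real USB framing or timing, just the "only the master ever speaks first" idea):

```python
# Toy simulation of host-arbitrated, half-duplex bus traffic.
# Not real USB -- just the idea that only the master initiates transfers.

class Device:
    def __init__(self, name):
        self.name = name
        self.outbox = []            # data the device wants to send

    def respond(self, request):
        # A device never talks on its own; it only answers the host's poll.
        if request == "IN" and self.outbox:
            return self.outbox.pop(0)
        return None                 # nothing to send right now

class Host:
    def __init__(self, devices):
        self.devices = devices      # the host knows who is on the bus

    def poll_round(self):
        # The host decides, one device at a time, who may use the data pair.
        for dev in self.devices:
            data = dev.respond("IN")
            if data is not None:
                print(f"host <- {dev.name}: {data}")

pendrive = Device("pendrive")
keyboard = Device("keyboard")
pendrive.outbox.append("block 0")
keyboard.outbox.append("key 'a'")

Host([pendrive, keyboard]).poll_round()
# Two hosts on one bus would both try to run poll_round() -- that's the clash.
```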
I used the past tense because you may have noticed that micro USB has 5 pins, not 4. That's because phones are computers, and they use the 5th pin (the ID pin) to decide how to behave. If it's left floating (a normal micro-to-type-A cable doesn't connect it) the phone acts as a slave, and if it's grounded (an OTG adapter does that) it acts as master. And if the devices are connected with that pin wired through (on some special micro-to-micro cables) they negotiate who takes which role.
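Roughly, the role decision boils down to something like this (a toy sketch of the convention described above, not real phone firmware):

```python
# Toy version of the OTG role decision, assuming the usual convention:
# ID pin grounded -> act as host, ID pin left floating -> act as peripheral.

def usb_role(id_pin: str) -> str:
    if id_pin == "grounded":     # OTG adapter ties ID to ground
        return "host"
    if id_pin == "floating":     # ordinary charge/sync cable leaves it open
        return "peripheral"
    raise ValueError("unknown ID pin state")

print(usb_role("grounded"))   # phone with an OTG adapter plugged in
print(usb_role("floating"))   # phone on a normal cable to a PC
```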
When they made USB 3.0 they realized that not having that 5th wire on the USB-A connector was stupid, so they added it (alongside some extra data lines); that's why USB 3 connectors have an odd number of wires. So with USB 3 you can connect computers together, but you need a special cable that uses the negotiation wire. Also, I don't know what software you need for it to work.
USB-C is basically two USB 3.0 links in the same cable, so you can probably connect computers with that. But often the port on the device only wires up one of them, so it might not be any faster. Originally they put in the pins for two connections just so you could flip the connector, but later they realized they could use both sets to get double the speed.
AI upscaling, I think
I am a computer scientist after all
If I got back to 2005 I could easily make more than 10 million by the time it's 2024 again. Plus all the other perks of restarting your life.
Dude, what are you talking about? It was still around less than 15 years ago. The Nintendo Wii literally had an ATI GPU.
In that instance it wasn't really training, it was crowdsourcing the transcription. reCAPTCHA would pull a word from its book archive that the OCR had failed to recognise, and if enough people identified it as the same word, that transcription would be archived. Now that reCAPTCHA has been purchased by Google, the archive and the transcriptions are available on Google Books.
They stopped doing this once AI became more effective than reCAPTCHA for book transcription.
Modern CAPTCHAs actually are about training models. But the old, classic reCAPTCHA was really just about book transcription, and those transcriptions are available.
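The consensus part is easy to picture: the unknown word only gets accepted once enough independent people type the same thing. A toy sketch (the threshold and the data are made up; the real system was more involved and also paired each unknown word with a known control word):

```python
# Toy sketch of crowd-sourced transcription by agreement.

from collections import Counter

def transcribe_by_consensus(answers, min_votes=3):
    """Accept a transcription once enough users agree on the same word."""
    counts = Counter(a.strip().lower() for a in answers)
    word, votes = counts.most_common(1)[0]
    return word if votes >= min_votes else None   # None = still unresolved

scanned_word_answers = ["morrow", "morrow", "narrow", "morrow"]
print(transcribe_by_consensus(scanned_word_answers))   # -> "morrow"
```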