GTK ported to Android — here we gooooo!
(lemmy.world)
It does that for all clients
You just need to wait for the proof of work to complete
It actually doesn't do that for all clients, according to the docs
It'll let you straight through if your user agent doesn't contain "Mozilla"
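For the curious, the gist of that check in a reverse proxy looks something like this (a rough sketch, not Anubis's actual source; the handler names and port are made up):

```go
package main

import (
	"net/http"
	"strings"
)

// challengeIfBrowserLike only sends the proof-of-work challenge to clients
// whose User-Agent claims to be a browser ("Mozilla/..."); everything else
// goes straight through to the real handler. This mirrors the behaviour
// described above, not Anubis's real source.
func challengeIfBrowserLike(challenge, upstream http.Handler) http.Handler {
	return http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		if strings.Contains(r.UserAgent(), "Mozilla") {
			challenge.ServeHTTP(w, r) // browser-like: solve the PoW first
			return
		}
		upstream.ServeHTTP(w, r) // no "Mozilla": straight through
	})
}

func main() {
	upstream := http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		w.Write([]byte("real content\n"))
	})
	challenge := http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		http.Error(w, "solve the proof of work first", http.StatusUnauthorized)
	})
	http.ListenAndServe(":8080", challengeIfBrowserLike(challenge, upstream))
}
```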
Whaaaat? Why only look for Mozilla?
All normal web browsers have Mozilla in their user agent, so it's kinda weird to key off only that. Chrome, Safari, and FF all start with Mozilla/5.0.
Because it's super common in web scrapers
I will never find the irony in this anything but pathetic.
The one legitimate grievance against Bitcoin and other PoW cryptocurrencies is the wasteful burning of energy on throwaway calculations simply to prove the work has been done. That environmental cost of meaningless CPU-cycle waste at distributed scale, purely for the sake of wasting CPU cycles, has been so eagerly grasped by people who are largely doing it to foil another energy-wasteful infotech invention.
It really is astonishing.
Do you have a better way? It is way more private than anything else I've seen.
From an energy usage perspective it also isn't bad. Spiking the CPU for a few seconds is minor, especially compared to other tasks.
Yeah, tarpits. Or even just intentionally lagging the connection a little, or putting a delay on the response for some MIME types. Delays don't consume nearly as much processing as PoW. Personally, I like tarpits that trickle out content like a really slow server, on hidden URLs that users are not likely to click on. These are about the least energy-demanding solutions that have a chance of fooling bots; a true no-response tarpit would use even less energy, but is easily detected by bots and terminated.
Proof of work is just a terrible idea, once you've accepted that PoW is bad for the environment, which it demonstrably is.
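Something in the spirit of this, say (a toy sketch only; the hidden path, iteration count, and two-second pause are arbitrary choices, not taken from any real tarpit project):

```go
package main

import (
	"net/http"
	"time"
)

// tarpit trickles a response out one byte at a time with long pauses,
// keeping the bot's connection open while costing the server almost
// nothing: a sleeping goroutine instead of hashing.
func tarpit(w http.ResponseWriter, r *http.Request) {
	flusher, ok := w.(http.Flusher)
	if !ok {
		return
	}
	for i := 0; i < 10000; i++ {
		if _, err := w.Write([]byte("a")); err != nil {
			return // the bot gave up; stop wasting our own goroutine
		}
		flusher.Flush()
		time.Sleep(2 * time.Second) // the "work" here is just waiting
	}
}

func main() {
	// The path stands in for a hidden URL real users are unlikely to click.
	http.HandleFunc("/old-archive/", tarpit)
	http.ListenAndServe(":8080", nil)
}
```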
The problem is that none of those things work well. They don't stop the bots from hammering your site; crawlers will just time out and move on.
Tarpits suck. Not worth the implementation or overhead. Instead the better strat is to pretend the server is down with a 503 code, or that the URL is invalid with a 404, so the bots stop clinging to your content.
Also, we already have non-PoW captchas that don't require JavaScript. See go-away for these implementations.
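Roughly like this (a sketch of the 503 idea only; looksLikeBot is a placeholder, and actually telling bots apart from humans is the hard part that go-away and friends deal with):

```go
package main

import (
	"net/http"
	"strings"
)

// looksLikeBot is a placeholder; real detection (behaviour, IP ranges,
// missing headers, ...) is the hard part and is not shown here.
func looksLikeBot(r *http.Request) bool {
	ua := strings.ToLower(r.UserAgent())
	return strings.Contains(ua, "bot") || strings.Contains(ua, "crawler")
}

func handler(w http.ResponseWriter, r *http.Request) {
	if looksLikeBot(r) {
		// Pretend the server is down so the crawler stops clinging to the content.
		http.Error(w, "service unavailable", http.StatusServiceUnavailable)
		return
	}
	w.Write([]byte("real content\n"))
}

func main() {
	http.HandleFunc("/", handler)
	http.ListenAndServe(":8080", nil)
}
```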
I run a service that gets attacked by AI bots, and while PoW isn't the only way to do things, none of your suggestions work at all.
The point is to make it too expensive for them, so they leave you alone (or, ideally, totally die but that's a long way off). They're making a choice to harvest data on your site. Make them choose not to. It saves energy in the long run.
They’re making way more money off the data they get from the website than they waste on the POW.
If you really wanted efficiency, then make a plain-text version of the web page that doesn't require them to run expensive JavaScript and other Ajax BS. Or shit, give them a legitimate sitemap too.
Yet there are countless examples of webmasters alleviating traffic that is crushing their sites by deploying this solution. The reasoning is up in the air, but the effectiveness is there.
either you have the service with anubis or you have no service at all
unlike pyramid coins, anubis serves a purpose
It still uses Proof-of-Work, in which the coal being burned is only to prove that you burned the coal.
Everything uses energy
Do you have any measurements on power usage? It seems very minor.
Everything a computer does uses power. The issue is the same very valid criticism of (most) cryptocurrencies: the design objective is only to use power. That's the very definition of "proof of work." You usually don't care what the work is, only that it was done.

An appropriate metaphor: for "reasons", I want to know that you moved a pile of rocks from one place to another, and back again. I have some way of proving this - a video camera watching you, a proof of a factorization that I can easily verify, something - and in return, I give you something: Monopoly money, or access to a web site. But moving the rocks is literally just a way I can be certain that you've burned a number of calories.
I don't even care if you go get a ~~GPU~~ tractor and move the rocks with that. You've still burned the calories, by burning oil. The rocks being moved has no value, except that I've rewarded you for burning the calories.
That's proof of work. Whether the reward is fake internet points, some invented digital currency, or access to web content, you're still being rewarded for making your CPU burn calories to calculate a result that has no intrinsic informational value in itself.
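For anyone who hasn't seen one, a bare-bones hash-grinding PoW looks something like this (a toy, not Anubis's actual scheme; the challenge string and difficulty are made up). The nonce it finds carries no information of its own; it only proves the hashing happened:

```go
package main

import (
	"crypto/sha256"
	"encoding/binary"
	"encoding/hex"
	"fmt"
	"strings"
)

// solve grinds through nonces until the hex digest of challenge+nonce
// starts with `difficulty` zero characters. The nonce carries no
// information of its own; it only proves that the hashing was done.
func solve(challenge string, difficulty int) uint64 {
	prefix := strings.Repeat("0", difficulty)
	buf := make([]byte, 8)
	for nonce := uint64(0); ; nonce++ {
		binary.BigEndian.PutUint64(buf, nonce)
		sum := sha256.Sum256(append([]byte(challenge), buf...))
		if strings.HasPrefix(hex.EncodeToString(sum[:]), prefix) {
			return nonce
		}
	}
}

func main() {
	// Roughly a million hash attempts on average for the client...
	nonce := solve("example-challenge", 5)
	// ...and exactly one hash for the server to verify it.
	fmt.Println("found nonce:", nonce)
}
```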
The cost is at scale. For a single person, say it's a fraction of a watt. Negligible. But for scrapers, all of those fractions add up to real electricity bill impacts. However - and this is the crux - it's always at scale, even without scrapers, because every visitor contributes to the total, global cost of that one website's use of this software. The cost isn't noticeable to individuals, but it is being incurred; it's unavoidable, by design.
If there's no cost in the aggregate of 10,000 individual browsers performing this PoW, then it's not going to cost scrapers, either. The cost has to be significant enough to deter bots; and if it's enough to be too expensive for bots, it's equally significant for the global aggregate; it's just spread out across a lot of people.
But the electricity is still being used, and heat is still being generated, and it's yet another straw on the environmental camel's back.
It's intentionally wasteful, and as such, it's a terrible design.
It doesn't need to be anywhere near as resource-intensive as a cryptocurrency, since it isn't used for security. The goal is not to stop bots altogether. The goal is to slow the crawlers down enough that the server hosting the service doesn't get pegged. The bots went from being respectful of server operators to hitting pages millions of times a second. This is made much worse by the fact that git hosting services like Forgejo have many links, many of which trigger the server to do computations. The idea behind Anubis is that a real user only has to do the PoW once, since they aren't browsing to millions of pages. A crawler, on the other hand, will have to do tons of proofs of work, which bogs down its crawling rate. PoW also has the advantage of requiring the server to hold minimal state; if you try to enforce a time delay, the server has to track all of that.
It is also important to realize that Anubis is an act of desperation. Many projects did not want to implement it, but they had no choice, since their servers were getting wrecked by bots. The only other option would be Cloudflare, which is much worse.
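To illustrate the minimal-state point (a sketch only; Anubis itself hands out a signed JWT cookie after the challenge, this just uses a bare HMAC with invented names): once a client has passed the challenge, the server can keep verifying it without remembering anything per client.

```go
package main

import (
	"crypto/hmac"
	"crypto/sha256"
	"encoding/hex"
	"fmt"
)

var serverKey = []byte("secret-key-kept-only-on-the-server")

// issueToken is called once, after the client's proof of work checks out.
// The token is just an HMAC over a client identifier, so the server never
// has to remember who already solved the challenge.
func issueToken(clientID string) string {
	mac := hmac.New(sha256.New, serverKey)
	mac.Write([]byte(clientID))
	return hex.EncodeToString(mac.Sum(nil))
}

// checkToken re-derives the HMAC and compares: no lookup table, no
// per-client state, unlike enforcing a tracked time delay per visitor.
func checkToken(clientID, token string) bool {
	got, err := hex.DecodeString(token)
	if err != nil {
		return false
	}
	mac := hmac.New(sha256.New, serverKey)
	mac.Write([]byte(clientID))
	return hmac.Equal(got, mac.Sum(nil))
}

func main() {
	tok := issueToken("client-fingerprint-123")
	fmt.Println(checkToken("client-fingerprint-123", tok)) // true
	fmt.Println(checkToken("some-other-client", tok))      // false
}
```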
Just know that you are 100% wrong on this. You don't understand what Anubis is doing, you don't understand the problem it's solving, and you need to educate yourself before having strong opinions about things
who gives a shit
Not many, but I sure wish all.
I will never find the irony in how tech-literate people still make assumptions about technology without reading up on it.
Proof of work does not equal mass power usage. The proof of work here is fingerprinting.
It really is astonishing.
Do you understand how Proof-of-Work works? Perhaps you can give me an ELI5 that shows that it does not use the client's CPU to perform calculations for only the purpose of proving that the client's CPU performed the calculations?
I do, yes; you are just too crypto-focused. Anubis uses the result of the PoW to fingerprint, i.e. to do a job greater than just the calculation. It's also an extremely efficient alg. I won't read the docs for you, but feel free to read them first if you would like to discuss it further.
Not if it's an effective proof-of-work anti-scraping mechanism. The point of these is to make it prohibitively expensive for scrapers to harvest data.
A more energy-efficient way to do this is with lags and tarpits, which do not cause CPU cycles to be wasted.
Any mechanism - any - that uses proof-of-work is by definition wasting CPU cycles. If there's a useful waste product, like BOINC, where the work being proved is science, then the PoW isn't pure wasted energy. There are certainly more efficient ways of generating fingerprints than PoW; Google and Facebook are peerless at fingerprinting without any PoW at all. The value of these ~~fingerprint coins~~ tokens is entirely incidental to the real purpose: to cost the scraper CPU cycles, cost them energy, and make scraping less profitable.
Anubis is all of the execution cost of cryptocurrency, without the financial flavoring.
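For comparison, a fingerprint that costs essentially nothing to compute might look like this (a sketch; which request attributes you mix in is an arbitrary choice here, and this is nowhere near what Google or Facebook actually do):

```go
package main

import (
	"crypto/sha256"
	"fmt"
	"net/http"
	"strings"
)

// cheapFingerprint hashes a few request attributes into an identifier.
// One SHA-256 call per request: no grinding, no proof of work.
func cheapFingerprint(r *http.Request) string {
	parts := []string{
		r.UserAgent(),
		r.Header.Get("Accept-Language"),
		r.Header.Get("Accept-Encoding"),
		r.RemoteAddr,
	}
	sum := sha256.Sum256([]byte(strings.Join(parts, "|")))
	return fmt.Sprintf("%x", sum[:8])
}

func main() {
	http.HandleFunc("/", func(w http.ResponseWriter, r *http.Request) {
		fmt.Fprintln(w, "fingerprint:", cheapFingerprint(r))
	})
	http.ListenAndServe(":8080", nil)
}
```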
I did but it told me I'm a bot :(
Edit: Yay, it worked.