
cross-posted from: https://lemmy.ml/post/18299168

Back in the day the best way to find cool sites when you were on a cool site was to click next in the webring. In this age of ailing search engines and confidently incorrect AI, it is time for the webring to make a comeback.

This person has shared the code to get started: Webring
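The mechanic behind a webring is simple: members form an ordered loop, and each site links to its neighbors. As a rough sketch (not the linked project's actual code, and with made-up member URLs), the "next"/"previous" lookup is just wrap-around indexing:

```python
# Hypothetical webring member list, in ring order.
ring = [
    "https://alice.example",
    "https://bob.example",
    "https://carol.example",
]

def neighbors(site: str) -> tuple[str, str]:
    """Return the (previous, next) sites in the ring, wrapping around."""
    i = ring.index(site)
    return ring[i - 1], ring[(i + 1) % len(ring)]
```

The "next" button on each member site would simply point at the second element of that pair.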

top 7 comments
[-] FizzyOrange@programming.dev 23 points 4 months ago

Eh, webrings were pretty lame even when they existed. There are plenty of ways to find new stuff these days. I hear they even have sites where anyone can post links and vote on which ones are good.

[-] bizarroland@fedia.io 1 point 4 months ago

Might be an interesting addition to have an aggregator aggregator: something that counts how often a particular website is linked, and in what categories.

Then you can filter by how many times that web page has gotten an upvote or downvote.

If you filtered out social media and all of the say top 100 web pages what would be left and how popular are they?
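The counting-and-filtering idea above can be sketched in a few lines. Everything here is hypothetical (the post URLs and the blocklist stand in for real aggregator data and the "top 100" list):

```python
from collections import Counter
from urllib.parse import urlparse

# Hypothetical submitted links, as an aggregator would collect them.
posts = [
    "https://blog.example/post1",
    "https://lemmy.ml/post/18299168",
    "https://blog.example/post2",
    "https://bigsocial.example/x",
]

# Stand-in for "social media and the top 100 web pages".
blocklist = {"bigsocial.example"}

# Count how often each remaining domain is linked.
counts = Counter(
    urlparse(url).netloc
    for url in posts
    if urlparse(url).netloc not in blocklist
)
# counts.most_common() now ranks the surviving domains by link frequency.
```

Upvote/downvote filtering would just add a score threshold before the domain is counted.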

[-] badcommandorfilename@lemmy.world 5 points 4 months ago

I have a vision of starting a community.

Basically building a set of tools to help people host content with just plain HTML and CSS, using static personal hosting and organically sharing links like the pioneer days of the web.

I think the shift to client-side scripting (tracking pixels, algorithmic content, infinite scrolling, targeted advertising, etc.) is how we ended up with the monoculture we see today.

Just disable JavaScript in your browser and 99% of those things go away, and we can support people building personal homepages again.

[-] benjhm@sopuli.xyz 6 points 4 months ago

I built personal webpages in the 1990s and still do now. I included JavaScript then, and still do now, to make calculations and show interactive graphics, quantitative stuff about climate change; see for example this model.
I get your concept that more websites should be written and hosted by individuals rather than big tech, but JavaScript is not the essence of the problem. JS is just calculating stuff client-side for efficiency. In theory, big tech could still serve up personalised, algorithm-driven feeds and targeted advertising with server-side page generation (like PHP) and a few cookies; it would waste more bandwidth, but that's no stress to them. Whereas disabling client-side calculations would kill what I do, as I can't, as an individual, afford to host big calculations on cloud servers (which is also technically harder).

[-] badcommandorfilename@lemmy.world 2 points 4 months ago

Yeah, this isn't supposed to be a silver bullet; it's more about democratizing the internet.

I think the priorities are:

  • A low barrier to entry
  • A focus on users owning their own content
  • Privacy over advanced functionality

I.e., if you want to start a blog, it should be easy to own and host it yourself rather than surrendering your content to Twitter or Facebook. Make it accessible to others who also want to surf the web without being targeted and tracked.

[-] yournameplease@programming.dev 1 point 4 months ago

Are you familiar with Neocities (the GeoCities revival thing)? It's not anti-scripting, but it may scratch your itch.

[-] Auzy@beehaw.org 1 points 4 months ago* (last edited 4 months ago)

They were never that cool. In fact, people only did it because they wanted more traffic.

Sorry, people stopped using them for a reason imho.

this post was submitted on 23 Jul 2024
70 points (96.1% liked)

Opensource


A community for discussion about open source software! Ask questions, share knowledge, share news, or post interesting stuff related to it!

Credits: Icon base by Lorc under CC BY 3.0, with modifications to add a gradient


