cross-posted from: https://lemmy.ml/post/18299168

Back in the day the best way to find cool sites when you were on a cool site was to click next in the webring. In this age of ailing search engines and confidently incorrect AI, it is time for the webring to make a comeback.
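The "click next" mechanic is easy to sketch: a webring is just an ordered list of member sites that wraps around at the ends. Here is a minimal sketch in Python, using hypothetical member URLs:

```python
# Hypothetical webring member list; a real ring would load this
# from a shared config or a central index.
ring = [
    "https://alice.example",
    "https://bob.example",
    "https://carol.example",
]

def next_site(current: str) -> str:
    """Return the site after `current`, wrapping back to the start."""
    i = ring.index(current)
    return ring[(i + 1) % len(ring)]

def prev_site(current: str) -> str:
    """Return the site before `current`, wrapping around to the end."""
    i = ring.index(current)
    return ring[(i - 1) % len(ring)]

print(next_site("https://carol.example"))  # wraps to the first site
```

Each member page would render its "next" and "previous" links from these two lookups; the modulo makes the list circular.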

This person has given us the code to get started: Webring

  • FizzyOrange · 5 months ago

    Eh web rings were pretty lame even when they existed. There are plenty of ways to find new stuff these days. I hear they even have sites where anyone can post links and vote on which ones are good.

    • bizarroland@fedia.io · 5 months ago

      Might be an interesting addition to have an aggregator aggregator: something that would count how often a particular website is linked, and in what categories.

      Then you can filter by how many times that web page has gotten an upvote or downvote.

      If you filtered out social media and, say, the top 100 web pages, what would be left, and how popular would it be?
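The counting-and-filtering idea above can be sketched in a few lines. This is a rough illustration with made-up data, assuming each aggregated post records a linked URL, a category, and a net vote score:

```python
from collections import Counter
from urllib.parse import urlparse

# Hypothetical aggregated posts: (linked URL, category, net votes).
posts = [
    ("https://blog.example/post1", "tech", 12),
    ("https://bigsocial.example/x", "tech", 40),
    ("https://blog.example/post2", "science", 5),
    ("https://smallsite.example/a", "tech", 3),
]

# Domains to exclude, e.g. social media or the top-100 sites.
excluded = {"bigsocial.example"}

counts = Counter()  # how often each (domain, category) pair is linked
votes = Counter()   # total net votes per domain

for url, category, net in posts:
    domain = urlparse(url).netloc
    if domain in excluded:
        continue
    counts[(domain, category)] += 1
    votes[domain] += net

print(counts.most_common())
print(votes.most_common())
```

With the big sites filtered out, what remains is exactly the long tail the comment is asking about, ranked by link frequency and votes.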

  • badcommandorfilename@lemmy.world · 5 months ago

    I have a vision of starting a <noscript> community.

    Basically building a set of tools to help people host content with just plain HTML and CSS, using static personal hosting and organically sharing links, like in the pioneer days of the web.

    I think that the shift to client-side scripting (tracking pixels, algorithmic content, infinite scrolling, targeted advertising, etc.) is how we ended up with the monoculture we see today.

    Just disable JavaScript in your browser and 99% of those things go away, and we can support people building personal homepages again.

    • Ben Matthews@sopuli.xyz · 5 months ago

      I built personal webpages in the 1990s, and I still do now. I included JavaScript then, and still do now, to make calculations and show interactive graphics: quantitative stuff about climate change. See for example this model.
      I get your concept, that more websites should be written and hosted by individuals rather than big tech. But JavaScript is not the essence of the problem; JS is just calculating stuff client-side for efficiency. In theory big tech could still serve up personalised, algorithm-driven feeds and targeted advertising with server-side page generation (like PHP) and a few cookies. It would waste more bandwidth, but that is no stress to them. Disabling client-side calculations, on the other hand, would kill what I do, since as an individual I can't afford to host big calculations on cloud servers (which is also technically harder).

      • badcommandorfilename@lemmy.world · 5 months ago

        Yeah, this isn’t supposed to be a silver bullet; it’s more about democratizing the internet.

        I think the priorities are:

        • Low barrier to entry
        • Focus on users owning their own content
        • Privacy is more important than advanced functionality

        I.e., if you want to start a blog, it should be easy to own it and host it yourself rather than surrendering your content to Twitter or Facebook. Make it accessible to others who also want to surf the web without being targeted and tracked.

    • yournameplease · 5 months ago

      Are you familiar with Neocities (a GeoCities revival)? It’s not anti-scripting, but it may scratch your itch.

  • Auzy@beehaw.org · 5 months ago

    They were never that cool. In fact, people only did it because they wanted more traffic.

    Sorry, people stopped using them for a reason imho.