Is it just me or are many independent search engines down? DuckDuckGo, my go-to engine, Qwant, Ecosia, Startpage… All down? The only hint I got was on the Qwant page…

Edit: it all seems to be related to Bing being down. I hope the independent engines will find a way to become truly independent…

    • HootinNHollerin@lemmy.world · 5 months ago

      Isn’t that searx / searxng?

      SearXNG is a fork of the well-known searx metasearch engine, which was inspired by the Seeks project. It provides basic privacy by mixing your queries with searches on other platforms without storing search data. SearXNG can be added to your browser’s search bar; moreover, it can be set as the default search engine.

      SearXNG appreciates your concern regarding logs, so take the code from the SearXNG sources and run it yourself!

      Add your instance to this list of public instances to help other people reclaim their privacy and make the internet freer. The more decentralized the internet is, the more freedom we have!
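
      For anyone who takes that advice and self-hosts, SearXNG also serves its merged results programmatically. A minimal sketch, assuming a local instance at http://localhost:8888 with the json format enabled under search.formats in settings.yml (many public instances leave it off):

      import requests

      # One request to the metasearch frontend fans out to the upstream
      # engines and comes back as a single merged result list.
      resp = requests.get(
          "http://localhost:8888/search",
          params={"q": "fediverse search", "format": "json"},
          timeout=10,
      )
      resp.raise_for_status()
      for result in resp.json()["results"][:5]:
          print(result["title"], "->", result["url"])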

    • WhatAmLemmy@lemmy.world · 5 months ago (edited)

      I was thinking about this and imagined the federated servers handling the index DB, the search algorithms, and search requests, but leveraging each user’s browser/compute to do the actual web crawling/scraping/indexing; the server would simply perform CRUD operations to move the client-processed data into the index DB (a toy sketch follows below). This approach would target the core reason search engines fail (the cost of scraping and processing billions of sites), reduce the cost of hosting a search server, and spread the expense across the user base.

      It may also have the added benefit of hindering surveillance capitalism, thanks to the sea of junk queries coming from every client, especially if the crawler requests were made from the same browser (obviously isolated from the user’s own data, extensions, queries, etc.). The federated servers would also probably need to operate as lighthouses that orchestrate which domains and IP ranges to crawl and efficiently distribute that workload across client machines.
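
      To make that division of labour concrete, here is a toy, in-process sketch of the idea. Every name in it (Lighthouse, CrawlTask, submit_postings, the canned pages) is hypothetical, not any existing project’s API; a real deployment would speak HTTP between the federated servers and users’ browsers:

      from collections import defaultdict
      from dataclasses import dataclass

      @dataclass
      class CrawlTask:
          domain: str          # assigned by the lighthouse server
          max_pages: int = 10  # cap per client so the load stays spread out

      class Lighthouse:
          """Federated server: orchestrates crawl work and owns the index DB."""
          def __init__(self, domains):
              self.pending = list(domains)   # domains nobody has crawled yet
              self.index = defaultdict(set)  # term -> set of URLs (the index DB)

          def assign_task(self):
              # Hand the next un-crawled domain to whichever client asks.
              return CrawlTask(self.pending.pop()) if self.pending else None

          def submit_postings(self, postings):
              # The CRUD step: merge client-processed postings into the index DB.
              for term, urls in postings.items():
                  self.index[term] |= urls

          def search(self, term):
              return self.index.get(term, set())

      def client_crawl(task):
          """Runs on the user's browser/compute: fetch pages, tokenize,
          and build postings. Faked with canned pages here so the sketch
          stays runnable offline."""
          fake_pages = {f"https://{task.domain}/": f"welcome to {task.domain} demo"}
          postings = defaultdict(set)
          for url, text in fake_pages.items():
              for term in text.split():
                  postings[term].add(url)
          return postings

      if __name__ == "__main__":
          server = Lighthouse(["example.org", "example.net"])
          while (task := server.assign_task()) is not None:
              server.submit_postings(client_crawl(task))  # clients do the heavy lifting
          print(server.search("demo"))  # both fake URLs, merged server-side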

    • Icr8tdThis4ccToWarn@lemmy.ml · 5 months ago

      I’ve also thought about this, but I don’t know what the costs of doing such a thing would be. (I’m ignorant on the subject.)