TL;DR - which privacy-focused search engine do people recommend, preferably one that can also easily be used as a default option in Safari?

I ditched Google around 2016 and have used DDG as my default search engine ever since.

As someone entrenched in the Apple ecosystem, it’s always seemed like a sound choice, as it’s one of the search engines built into Safari on both iOS and macOS.

After spending a bit more time recently playing around with and updating my Docker containers, I started hosting a Whoogle container. It seemed to work pretty well, but I don’t see many people talking about it, so I’m not sure how good it actually is. I then tried a SearXNG container, but either had it misconfigured or just wasn’t getting many search results back.

At the moment I’m trying out Startpage, but I know there are potential privacy concerns since a majority stake in it was bought in 2019 by System1, a US ad-tech company.

I’m also playing around with different browsers at the moment, flicking between Safari, Firefox and Brave, which is how I stumbled across Brave Search. It seems pretty promising.

So, which search engines do you all recommend?

UPDATE: Probably should’ve done a poll! But the latest tally (if I’ve captured everything correctly) is:

  • DuckDuckGo - 10
  • Qwant / SearXNG / Kagi / Brave - 4
  • Startpage / Ecosia - 2
  • Google - 1

As to my other questions around browsers:

  • Majority seem to use Firefox
  • Some mentions of Brave
  • One mention of Arc
  • schmurnanOP

    I replied to another comment on here saying that I’d tried this once before, via a Docker container, but just wasn’t getting any results back (kept getting timeouts from all the search engines).

    I’ve just revisited it, and still get the timeouts. Reckon you’re able to help me troubleshoot it?

    Below are the logs from Portainer:

     File "/usr/local/searxng/searx/network/__init__.py", line 165, in get
        return request('get', url, **kwargs)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
      File "/usr/local/searxng/searx/network/__init__.py", line 98, in request
        raise httpx.TimeoutException('Timeout', request=None) from e
    httpx.TimeoutException: Timeout
    2023-08-06 09:58:13,651 ERROR:searx.engines.soundcloud: Fail to initialize
    Traceback (most recent call last):
      File "/usr/local/searxng/searx/network/__init__.py", line 96, in request
        return future.result(timeout)
               ^^^^^^^^^^^^^^^^^^^^^^
      File "/usr/lib/python3.11/concurrent/futures/_base.py", line 458, in result
        raise TimeoutError()
    TimeoutError
    The above exception was the direct cause of the following exception:
    Traceback (most recent call last):
      File "/usr/local/searxng/searx/search/processors/abstract.py", line 75, in initialize
        self.engine.init(get_engine_from_settings(self.engine_name))
      File "/usr/local/searxng/searx/engines/soundcloud.py", line 69, in init
        guest_client_id = get_client_id()
                          ^^^^^^^^^^^^^^^
      File "/usr/local/searxng/searx/engines/soundcloud.py", line 45, in get_client_id
        response = http_get("https://soundcloud.com")
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
      File "/usr/local/searxng/searx/network/__init__.py", line 165, in get
        return request('get', url, **kwargs)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
      File "/usr/local/searxng/searx/network/__init__.py", line 98, in request
        raise httpx.TimeoutException('Timeout', request=None) from e
    httpx.TimeoutException: Timeout
    2023-08-06 09:58:13,654 ERROR:searx.engines.soundcloud: Fail to initialize
    Traceback (most recent call last):
      File "/usr/local/searxng/searx/network/__init__.py", line 96, in request
        return future.result(timeout)
               ^^^^^^^^^^^^^^^^^^^^^^
      File "/usr/lib/python3.11/concurrent/futures/_base.py", line 458, in result
        raise TimeoutError()
    TimeoutError
    The above exception was the direct cause of the following exception:
    Traceback (most recent call last):
      File "/usr/local/searxng/searx/search/processors/abstract.py", line 75, in initialize
        self.engine.init(get_engine_from_settings(self.engine_name))
      File "/usr/local/searxng/searx/engines/soundcloud.py", line 69, in init
        guest_client_id = get_client_id()
                          ^^^^^^^^^^^^^^^
      File "/usr/local/searxng/searx/engines/soundcloud.py", line 45, in get_client_id
        response = http_get("https://soundcloud.com")
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
      File "/usr/local/searxng/searx/network/__init__.py", line 165, in get
        return request('get', url, **kwargs)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
      File "/usr/local/searxng/searx/network/__init__.py", line 98, in request
        raise httpx.TimeoutException('Timeout', request=None) from e
    httpx.TimeoutException: Timeout
    2023-08-06 10:02:05,024 ERROR:searx.engines.wikidata: engine timeout
    2023-08-06 10:02:05,024 ERROR:searx.engines.duckduckgo: engine timeout
    2023-08-06 10:02:05,024 ERROR:searx.engines.google: engine timeout
    2023-08-06 10:02:05,024 ERROR:searx.engines.qwant: engine timeout
    2023-08-06 10:02:05,024 ERROR:searx.engines.startpage: engine timeout
    2023-08-06 10:02:05,024 ERROR:searx.engines.wikibooks: engine timeout
    2023-08-06 10:02:05,024 ERROR:searx.engines.wikiquote: engine timeout
    2023-08-06 10:02:05,024 ERROR:searx.engines.wikisource: engine timeout
    2023-08-06 10:02:05,025 ERROR:searx.engines.wikipecies: engine timeout
    2023-08-06 10:02:05,025 ERROR:searx.engines.wikiversity: engine timeout
    2023-08-06 10:02:05,025 ERROR:searx.engines.wikivoyage: engine timeout
    2023-08-06 10:02:05,025 ERROR:searx.engines.brave: engine timeout
    2023-08-06 10:02:05,481 WARNING:searx.engines.wikidata: ErrorContext('searx/search/processors/online.py', 118, "response = req(params['url'], **request_args)", 'httpx.TimeoutException', None, (None, None, None)) False
    2023-08-06 10:02:05,481 ERROR:searx.engines.wikidata: HTTP requests timeout (search duration : 6.457878380082548 s, timeout: 6.0 s) : TimeoutException
    2023-08-06 10:02:05,482 WARNING:searx.engines.wikisource: ErrorContext('searx/search/processors/online.py', 118, "response = req(params['url'], **request_args)", 'httpx.TimeoutException', None, (None, None, None)) False
    2023-08-06 10:02:05,484 ERROR:searx.engines.wikisource: HTTP requests timeout (search duration : 6.460748491808772 s, timeout: 6.0 s) : TimeoutException
    2023-08-06 10:02:05,485 WARNING:searx.engines.brave: ErrorContext('searx/search/processors/online.py', 118, "response = req(params['url'], **request_args)", 'httpx.TimeoutException', None, (None, None, None)) False
    2023-08-06 10:02:05,485 ERROR:searx.engines.brave: HTTP requests timeout (search duration : 6.461546086706221 s, timeout: 6.0 s) : TimeoutException
    2023-08-06 10:02:05,487 WARNING:searx.engines.google: ErrorContext('searx/search/processors/online.py', 118, "response = req(params['url'], **request_args)", 'httpx.TimeoutException', None, (None, None, None)) False
    2023-08-06 10:02:05,487 ERROR:searx.engines.google: HTTP requests timeout (search duration : 6.463769535068423 s, timeout: 6.0 s) : TimeoutException
    2023-08-06 10:02:05,489 WARNING:searx.engines.wikiversity: ErrorContext('searx/search/processors/online.py', 118, "response = req(params['url'], **request_args)", 'httpx.TimeoutException', None, (None, None, None)) False
    2023-08-06 10:02:05,489 ERROR:searx.engines.wikiversity: HTTP requests timeout (search duration : 6.466003180015832 s, timeout: 6.0 s) : TimeoutException
    2023-08-06 10:02:05,490 WARNING:searx.engines.wikivoyage: ErrorContext('searx/search/processors/online.py', 118, "response = req(params['url'], **request_args)", 'httpx.TimeoutException', None, (None, None, None)) False
    2023-08-06 10:02:05,490 ERROR:searx.engines.wikivoyage: HTTP requests timeout (search duration : 6.466597221791744 s, timeout: 6.0 s) : TimeoutException
    2023-08-06 10:02:05,490 WARNING:searx.engines.qwant: ErrorContext('searx/search/processors/online.py', 118, "response = req(params['url'], **request_args)", 'httpx.TimeoutException', None, (None, None, None)) False
    2023-08-06 10:02:05,490 ERROR:searx.engines.qwant: HTTP requests timeout (search duration : 6.4669976509176195 s, timeout: 6.0 s) : TimeoutException
    2023-08-06 10:02:05,491 WARNING:searx.engines.wikibooks: ErrorContext('searx/search/processors/online.py', 118, "response = req(params['url'], **request_args)", 'httpx.TimeoutException', None, (None, None, None)) False
    2023-08-06 10:02:05,491 ERROR:searx.engines.wikibooks: HTTP requests timeout (search duration : 6.4674198678694665 s, timeout: 6.0 s) : TimeoutException
    2023-08-06 10:02:05,491 WARNING:searx.engines.wikiquote: ErrorContext('searx/search/processors/online.py', 118, "response = req(params['url'], **request_args)", 'httpx.TimeoutException', None, (None, None, None)) False
    2023-08-06 10:02:05,492 WARNING:searx.engines.wikipecies: ErrorContext('searx/search/processors/online.py', 118, "response = req(params['url'], **request_args)", 'httpx.TimeoutException', None, (None, None, None)) False
    2023-08-06 10:02:05,492 ERROR:searx.engines.wikiquote: HTTP requests timeout (search duration : 6.468321242835373 s, timeout: 6.0 s) : TimeoutException
    2023-08-06 10:02:05,492 ERROR:searx.engines.wikipecies: HTTP requests timeout (search duration : 6.468797960784286 s, timeout: 6.0 s) : TimeoutException
    2023-08-06 10:02:05,496 WARNING:searx.engines.duckduckgo: ErrorContext('searx/engines/duckduckgo.py', 98, 'res = get(query_url, headers=headers)', 'httpx.TimeoutException', None, (None, None, None)) False
    2023-08-06 10:02:05,497 ERROR:searx.engines.duckduckgo: HTTP requests timeout (search duration : 6.47349306801334 s, timeout: 6.0 s) : TimeoutException
    2023-08-06 10:02:05,511 WARNING:searx.engines.startpage: ErrorContext('searx/engines/startpage.py', 214, 'resp = get(get_sc_url, headers=headers)', 'httpx.TimeoutException', None, (None, None, None)) False
    2023-08-06 10:02:05,511 ERROR:searx.engines.startpage: HTTP requests timeout (search duration : 6.487425099126995 s, timeout: 6.0 s) : TimeoutException
    2023-08-06 10:04:27,475 ERROR:searx.engines.duckduckgo: engine timeout
    2023-08-06 10:04:27,770 WARNING:searx.engines.duckduckgo: ErrorContext('searx/search/processors/online.py', 118, "response = req(params['url'], **request_args)", 'httpx.TimeoutException', None, (None, None, None)) False
    2023-08-06 10:04:27,771 ERROR:searx.engines.duckduckgo: HTTP requests timeout (search duration : 3.2968566291965544 s, timeout: 3.0 s) : TimeoutException
    2023-08-06 10:04:50,094 ERROR:searx.engines.duckduckgo: engine timeout
    2023-08-06 10:04:50,187 WARNING:searx.engines.duckduckgo: ErrorContext('searx/engines/duckduckgo.py', 98, 'res = get(query_url, headers=headers)', 'httpx.ConnectTimeout', None, (None, None, 'duckduckgo.com')) False
    2023-08-06 10:04:50,187 ERROR:searx.engines.duckduckgo: HTTP requests timeout (search duration : 3.0933595369569957 s, timeout: 3.0 s) : ConnectTimeout
    

    The above is a simple search for “best privacy focused search engines 2023”, followed by the same search again with the !ddg bang in front of it.

    I can post my docker-compose if it helps?
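
    One thing I might try in the meantime: SearXNG’s settings.yml has outgoing timeout settings, and the logs above show requests dying right at the 3–6 s mark. Raising them would at least tell me whether it’s slow networking or no networking at all (the values below are just illustrative, not a recommendation):

     # /etc/searxng/settings.yml (fragment) – illustrative values only
     outgoing:
       request_timeout: 10.0      # per-request timeout in seconds
       max_request_timeout: 20.0  # hard upper bound for any single request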

    • @ech0

      First thing that comes to mind: are you running it on the host network? That’s a requirement.
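
      If host networking is the issue, the compose change would be a sketch along these lines (untested; note that network_mode: host replaces the ports: mapping, since the container shares the host’s network stack):

        searxng:
          image: searxng/searxng:latest
          network_mode: host        # shares the host network; no ports: mapping
          volumes:
            - ./searxng:/etc/searxng:rw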

      • schmurnanOP

        No, I’m running it on a bridge network with my other containers. Even the official documentation shows SearXNG being run on a bridge network; see below, from https://github.com/searxng/searxng-docker/blob/master/docker-compose.yaml:

        searxng:
          container_name: searxng
          image: searxng/searxng:latest
          networks:
            - searxng
          ports:
            - "127.0.0.1:8080:8080"
          volumes:
            - ./searxng:/etc/searxng:rw
          environment:
            - SEARXNG_BASE_URL=https://${SEARXNG_HOSTNAME:-localhost}/
          cap_drop:
            - ALL
          cap_add:
            - CHOWN
            - SETGID
            - SETUID
          logging:
            driver: "json-file"
            options:
              max-size: "1m"
              max-file: "1"
        
        networks:
          searxng:
            ipam:
              driver: default
        

        The only difference between my Compose and the one above is that they suggest using Caddy as a reverse proxy, whereas I’m using Traefik.
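
        For completeness, the Traefik side is just labels on the searxng service instead of a Caddy block. Mine looks roughly like this (the router name, entrypoint and hostname are from my own setup, so treat it as a sketch):

        searxng:
          # ...rest of the service as above...
          labels:
            - "traefik.enable=true"
            - "traefik.http.routers.searxng.rule=Host(`search.example.com`)"
            - "traefik.http.routers.searxng.entrypoints=websecure"
            - "traefik.http.services.searxng.loadbalancer.server.port=8080"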