I covered Whoogle a few years ago and completely forgot to present searX, which I actually use and which strikes me as more polished.

The idea is to have a single engine that queries Google and X other engines, depending on your preferences, all self-hosted and privacy-friendly. Of course, used only by me and from an Orange IP, it's not ideal for escaping all the ad tracking… which is why I route it through my VPN, whose IP is shared by X other AirVPN clients (sponsored link on the blog by the way, the only one, thanks).
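I won't detail the VPN plumbing here, but for illustration, a minimal sketch of one way to wire it up with Docker (assuming a VPN client container named vpn is already running; not necessarily my exact setup): attach searx to the VPN container's network namespace so all its outgoing requests leave through the tunnel.

# assumption: a container named "vpn" (e.g. an AirVPN/WireGuard client) is already up
# with --network=container:..., ports must be published on the "vpn" container, not here
docker run -d \
  --name=searx \
  --restart always \
  --network=container:vpn \
  -v /mnt/Data/docker/searx:/etc/searx \
  searx/searx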
Packed with options, it notably lets you enable or disable individual search engines, sorted by category.
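Those per-engine choices can also be made permanent in settings.yml, in the engines: section. A minimal sketch, with two illustrative entries as they appear in a stock configuration:

engines:
  - name: google
    engine: google
    shortcut: go
    categories: general
    disabled: False   # keep Google active

  - name: duckduckgo
    engine: duckduckgo
    shortcut: ddg
    disabled: True    # never query this engine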

And it installs easily with Docker, following their documentation (example without VPN):
docker run -d \
  --name=searx \
  --restart always \
  -p 2932:8080 \
  -e BASE_URL=http://0.0.0.0:2932 \
  -v /mnt/Data/docker/searx:/etc/searx \
  --label=com.centurylinklabs.watchtower.enable=true \
  searx/searx
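For reference, a docker-compose equivalent of that same command, with the same port and paths (a sketch of mine, not taken from the searx documentation):

version: "3"
services:
  searx:
    image: searx/searx
    container_name: searx
    restart: always
    ports:
      - "2932:8080"
    environment:
      - BASE_URL=http://0.0.0.0:2932
    volumes:
      - /mnt/Data/docker/searx:/etc/searx
    labels:
      - com.centurylinklabs.watchtower.enable=true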
Note that you can also configure it through settings.yml; here is the beginning of the file:
cat settings.yml
general:
  debug : False # Debug mode, only for development
  instance_name : "HomesearX" # displayed name
  contact_url: False # mailto:[email protected]
  enable_stats: False # activate /stats page - note: it may leak usage data

brand:
  git_url: https://github.com/searx/searx
  git_branch: master
  issue_url: https://github.com/searx/searx/issues
  docs_url: https://searx.github.io/searx
  public_instances: https://searx.space
  wiki_url: https://github.com/searx/searx/wiki
  twitter_url: https://twitter.com/Searx_engine

search:
  safe_search : 0 # Filter results. 0: None, 1: Moderate, 2: Strict
  autocomplete : "" # Existing autocomplete backends: "dbpedia", "duckduckgo", "google", "startpage", "swisscows", "qwant", "wikipedia" - leave blank to turn it off by default
  default_lang : "" # Default search language - leave blank to detect from browser information or use codes from 'languages.py'
  ban_time_on_fail : 5 # ban time in seconds after engine errors
  max_ban_time_on_fail : 120 # max ban time in seconds after engine errors
  prefer_configured_language: False # increase weight of results in configured language in ranking

server:
  port : 8080
  bind_address : "10.0.4.64" # address to listen on
  secret_key : "032c2c1bb18d16f2eff043d17c3bcaba31d785934e2d122768274bdb82378ada" # change this!
  base_url : http://localhost:2932/ # Set custom base_url. Possible values: False or "https://your.custom.host/location/"
  image_proxy : False # Proxying image results through searx
  http_protocol_version : "1.0" # 1.0 and 1.1 are supported
  method: "POST" # POST queries are more secure as they don't show up in history but may cause problems when using Firefox containers
  default_http_headers:
    X-Content-Type-Options : nosniff
    X-XSS-Protection : 1; mode=block
    X-Download-Options : noopen
    X-Robots-Tag : noindex, nofollow
    Referrer-Policy : no-referrer

ui:
  autofocus : True # Autofocus search input
  archive_today : True # show archive.today links
  static_path : "" # Custom static path - leave it blank if you didn't change
  templates_path : "" # Custom templates path - leave it blank if you didn't change
  default_theme : oscar # ui theme
  default_locale : "" # Default interface locale - leave blank to detect from browser information or use codes from the 'locales' config section
  theme_args :
    oscar_style : logicodev # default style of oscar
  # results_on_new_tab: False # Open result links in a new tab by default
  # categories_order :
  #   - general
  #   - files
  #   - map
  #   - it
  #   - science

# Lock arbitrary settings on the preferences page.
# To find the ID of the user setting you want to lock, check
# the ID of the form on the page "preferences".
#preferences:
#  lock:
#    - language
#    - autocomplete
#    - method

# searx supports result proxification using an external service: https://github.com/asciimoo/morty
# uncomment below section if you have running morty proxy
# the key is base64 encoded (keep the !!binary notation)
# Note: since commit af77ec3, morty accepts a base64 encoded key.
#result_proxy:
#  url : http://127.0.0.1:3000/
#  key : !!binary "your_morty_proxy_key"

outgoing: # communication with search engines
  request_timeout : 2.0 # default timeout in seconds, can be override by engine
  # max_request_timeout: 10.0 # the maximum timeout in seconds
  useragent_suffix : "" # suffix of searx_useragent, could contain information like an email address to the administrator
  pool_connections : 100 # Number of different hosts
  pool_maxsize : 10 # Number of simultaneous requests by host
  # uncomment below section if you want to use a proxy
  # see https://2.python-requests.org/en/latest/user/advanced/#proxies
  # SOCKS proxies are also supported: see https://2.python-requests.org/en/latest/user/advanced/#socks
  # proxies:
  #   http:
  #     - http://proxy1:8080
  #     - http://proxy2:8080
  #   https:
  #     - http://proxy1:8080
  #     - http://proxy2:8080
  # using_tor_proxy : True
  # extra_proxy_timeout : 10.0 # Extra seconds to add in order to account for the time taken by the proxy
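With the volume mapped above, this file sits in /mnt/Data/docker/searx/settings.yml on the host; searx reads it at startup, so after editing it, restarting the container (named searx here, as in the run command above) is enough to apply the changes:

docker restart searx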