mirror of https://github.com/searxng/searxng.git synced 2024-11-23 12:31:25 +01:00
Commit Graph

43 Commits

Author SHA1 Message Date
Markus Heiser
aecfb2300d [mod] one logger per engine - drop obsolete logger.getChild
Remove the no longer needed `logger = logger.getChild(...)` from engines.

Signed-off-by: Markus Heiser <markus.heiser@darmarit.de>
2021-09-06 18:05:46 +02:00
Alexandre Flament
d14994dc73 [httpx] replace searx.poolrequests by searx.network
settings.yml:

* outgoing.networks:
   * can contain network definitions
   * properties: enable_http, verify, http2, max_connections, max_keepalive_connections,
     keepalive_expiry, local_addresses, support_ipv4, support_ipv6, proxies, max_redirects, retries
   * retries: 0 by default, the number of times searx retries sending the HTTP request (using a different IP & proxy each time)
   * local_addresses can be "192.168.0.1/24" (IPv6 is supported too)
   * support_ipv4 & support_ipv6: both True by default
     see https://github.com/searx/searx/pull/1034
* each engine can define a "network" section:
   * either a full network definition
   * or a reference to an existing network

* all HTTP requests of an engine use the same HTTP configuration (this was not the case before, see the proxy configuration in master)
2021-04-12 17:25:56 +02:00
Alexandre Flament
a4dcfa025c [enh] engines: add about variable
move meta information from comments to the about variable
so the preferences page and the documentation can show this information
2021-01-14 20:57:17 +01:00
Alexandre Flament
d703119d3a [enh] add raise_for_httperror
check HTTP response:
* detect some common CAPTCHA challenges (no solving); in this case the engine is suspended for a long time.
* otherwise raise HTTPError as before

the check is done in poolrequests.py (it was previously done in search.py).

update qwant, wikipedia and wikidata to use raise_for_httperror instead of raise_for_status
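A minimal sketch of the pattern described above, assuming hypothetical helper and exception names (this is not searx's actual API):

    class CaptchaDetected(Exception):
        """Raised when a response looks like a CAPTCHA challenge."""

    def check_response(resp):
        # hypothetical CAPTCHA detection: look for a well-known marker
        if 'g-recaptcha' in resp.text:
            # the caller would then suspend the engine for a long time
            raise CaptchaDetected(resp.url)
        # otherwise keep the old behaviour: raise HTTPError on 4xx/5xx
        resp.raise_for_status()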
2020-12-11 14:37:08 +01:00
Alexandre Flament
bef2f2efa8 [fix] wikidata: fix crash when the item has no description at all but has at least one URL. 2020-12-04 17:17:20 +01:00
Alexandre Flament
9ed3ee2beb [mod] wikidata: WDGeoAttribute class: doesn't change the method signature of get_str 2020-12-01 15:21:17 +01:00
Alexandre Flament
3038052c79 [mod] remove unused import
use
from searx.engines.duckduckgo import _fetch_supported_languages, supported_languages_url  # NOQA
so it is possible to easily remove all unused imports using autoflake:
autoflake --in-place --recursive --remove-all-unused-imports searx tests
2020-11-14 14:11:02 +01:00
Alexandre Flament
95bd6033fa [mod] wikidata engine: use one SPARQL request instead of 2 HTTP requests. 2020-10-28 08:09:25 +01:00
Alexandre Flament
2006eb4680 [mod] move extract_text, extract_url to searx.utils 2020-10-02 18:13:56 +02:00
Dalf
1022228d95 Drop Python 2 (1/n): remove unicode string and url_utils 2020-09-10 10:39:04 +02:00
Marc Abonce Seguin
0d8970c8f2
only return one url per "type" in Wikidata (#2151)
i.e. only one official website, one Twitter, etc.
2020-08-27 21:44:48 +02:00
Adam Tauber
29960aa1d9 [enh] add official site link to the top of the infobox - closes #1644 2020-06-09 23:49:13 +02:00
Dalf
85b3723345 [mod] speed optimization
* compile XPath only once (see the sketch below)
* avoid redundant calls to urlparse
* get_locale(webapp.py): avoid useless call to request.accept_languages.best_match
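To illustrate the "compile XPath only once" idea with lxml (the names below are only an example, not the engine's actual code):

    from lxml import etree, html

    # compiling the expression once is much cheaper than re-parsing it
    # on every element.xpath(...) call
    RESULTS_XPATH = etree.XPath('//div[@class="result"]//a/@href')

    def parse_links(text):
        dom = html.fromstring(text)
        return RESULTS_XPATH(dom)  # evaluate the pre-compiled expression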
2019-11-15 09:33:15 +01:00
Dalf
6e0285b2db [fix] wikidata engine: faster processing, remove one HTTP redirection.
* Search URL is https://www.wikidata.org/w/index.php?{query}&ns0=1 (with ns0=1 at the end to avoid an HTTP redirection)
* url_detail: remove the deprecated disabletidy=1 parameter
* Add eval_xpath function: compile each XPath only once (see the sketch below).
* Add get_id_cache: retrieve all HTML elements with an id in one pass, avoiding the slow-to-process dynamic XPath '//div[@id="{propertyid}"]'.replace('{propertyid}')
* Create an etree.HTMLParser() instead of using the global one (see #1575)
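A rough reconstruction of the two helpers named above (the engine's actual implementation may differ):

    from functools import lru_cache
    from lxml import etree

    @lru_cache(maxsize=None)
    def _compile(xpath_str):
        # compile each XPath expression only once, then reuse it
        return etree.XPath(xpath_str)

    def eval_xpath(element, xpath_str):
        return _compile(xpath_str)(element)

    def get_id_cache(dom):
        # one pass over the document instead of one dynamic XPath per property id
        return {e.get('id'): e for e in eval_xpath(dom, '//*[@id]')}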
2019-07-29 07:39:39 +02:00
Noémi Ványi
b63d645a52 Revert "remove 'all' option from search languages"
This reverts commit 4d1770398a.
2019-01-07 21:19:00 +01:00
Marc Abonce Seguin
5568f24d6c [fix] check language aliases when setting search language 2019-01-06 20:31:57 -06:00
Léo Bourrel
7a474db61b Fix formatting 2018-07-06 10:31:01 +02:00
Léo Bourrel
acaef6600e Update path to wikidata image 2018-07-05 10:11:45 +02:00
Marc Abonce Seguin
b12857a70d [fix] make search requests on wikidata more accurate 2018-04-08 21:17:00 -05:00
Marc Abonce Seguin
772c048d01 refactor engine's search language handling
Add a match_language function in utils to match any user-given
language code against the list of an engine's supported languages.

Also add a language_aliases dict to each engine to translate
standard language codes into the custom codes used by the engine.
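A simplified sketch of what such a matching helper can look like (the real match_language in searx.utils may differ in signature and details):

    def match_language(user_lang, supported_languages, language_aliases=None, fallback='en-US'):
        aliases = language_aliases or {}
        # translate a standard code into the engine's custom code, if any
        if user_lang in aliases:
            return aliases[user_lang]
        if user_lang in supported_languages:
            return user_lang
        # otherwise match on the language part only, e.g. 'de-AT' -> 'de'
        base = user_lang.split('-')[0]
        for lang in supported_languages:
            if lang.split('-')[0] == base:
                return lang
        return fallback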
2018-03-27 00:08:03 -06:00
marc
4d1770398a remove 'all' option from search languages 2017-12-06 01:20:15 -06:00
Adam Tauber
52e615dede [enh] py3 compatibility 2017-05-15 12:02:30 +02:00
marc
af35eee10b tests for _fetch_supported_languages in engines
and refactor the method to make it testable without making requests
2016-12-15 00:40:21 -06:00
marc
f62ce21f50 [mod] fetch supported languages for several engines
utils/fetch_languages.py gets languages supported by each engine and
generates engines_languages.json with each engine's supported languages.
2016-12-13 19:58:10 -06:00
marc
149802c569 [enh] add supported_languages on engines and auto-generate languages.py 2016-12-13 19:32:00 -06:00
marc
ad58b14be7 [fix] merge infoboxes based on weight
also minor changes in attributes and images from wikidata
2016-08-05 23:51:04 -05:00
marc
a0a1284998 wikidata refactor and more attributes (see issue #560) 2016-08-05 23:51:04 -05:00
a01200356
93ef11adc0 [enh] multilingual wikidata
disambiguation and tags are in the local language

TOFIX:
    needs to query the API every time to know each label's name
2016-08-05 23:51:04 -05:00
a01200356
8d335dbdae [enh] wikipedia infobox
creates a simple multilingual infobox using Wikipedia's API
2016-04-17 16:22:19 -05:00
Adam Tauber
bd22e9a336 [fix] pep8 compatibility 2016-01-18 12:47:31 +01:00
Adam Tauber
362c849797 [fix][mod] wikidata date handling refactor - fixes #387 2015-09-07 22:39:33 +02:00
dalf
d07cfd9089 [enh] use one single HTTP connection pool: improve response time. close #100 2015-01-21 11:33:16 +01:00
Adam Tauber
d7ea44ab8d [fix] dates before 1900 2015-01-11 13:26:42 +01:00
Adam Tauber
cc4e17b668 [fix] pep8 2015-01-02 12:33:40 +01:00
Cqoicebordel
7937218be6 Use human readable date
For DoB and DoD, Wikipedia uses a non-standard ISO format that is not easily readable.
Now the date is displayed in a human-readable form, using the language setting as the locale if available; if not, the default locale is used.
2014-12-09 02:36:53 +01:00
dalf
ffcec383b7 [fix] pep8: duckduckgo_definitions and wikidata engines 2014-12-07 16:36:20 +01:00
Adam Tauber
b0bb94fd37 [fix] wikidata: using only the first url 2014-10-12 14:33:03 +02:00
dalf
cac1761a54 [enh] infoboxes: if the result doesn't contain anything except one link, use the normal result template 2014-10-11 15:49:50 +02:00
dalf
295b1699ce [mod] return only one result from the wikidata engine 2014-10-11 12:47:30 +02:00
Adam Tauber
727c7226d9 [fix] code cleanup 2014-10-04 22:53:54 +02:00
dalf
63a0328c8b [enh] wikidata engine : add links to musicbrainz 2014-10-02 23:40:06 +02:00
Dalf
0a71525ab6 [enh] add infoboxes and answers (clean up) 2014-10-01 22:17:35 +02:00
Dalf
6bfd566353 [enh] add infoboxes and answers 2014-09-28 16:51:41 +02:00