Policy:User-Agent policy

<languages />
{{notice|1=This page is purely informative, reflecting the current state of affairs. To discuss this topic, please use the wikitech-l [[:m:Special:MyLanguage/Mailing lists|mailing list]].}}
{{policy-staff}}
As of February 15, 2010, Wikimedia sites require an '''HTTP [[{{lwp|User-Agent}}|User-Agent]] header''' for all requests. This was an operative decision made by the technical staff and was announced and discussed on the technical mailing list.<ref>[[mailarchive:wikitech-l/2010-February/thread.html#46764|The Wikitech-l February 2010 Archive by subject]]</ref><ref>[[listarchive:list/wikitech-l@lists.wikimedia.org/thread/R4RU7XTBM5J3BTS6GGQW77NYS2E4WGLI/|User-Agent: - Wikitech-l - lists.wikimedia.org]]</ref> The rationale is that clients that do not send a User-Agent string are mostly ill-behaved scripts that put a lot of load on the servers without benefiting the projects. User-Agent strings that begin with non-descriptive default values, such as <code>python-requests/x</code>, may also be blocked on Wikimedia sites (or parts of a site, such as <code>api.php</code>).

Requests (for example, from browsers or scripts) that do not send a descriptive User-Agent header may encounter an error message like this:

:''Scripts should use an informative User-Agent string with contact information, or they may be blocked without notice.''


Requests from disallowed user agents may instead encounter a less helpful error message like this:

:''Our servers are currently experiencing a technical problem. Please try again in a few minutes.''


This change is most likely to affect scripts (bots) accessing Wikimedia websites such as Wikipedia automatically, via api.php or otherwise, and command line programs.<ref>[[:mw:Special:MyLanguage/API:FAQ|API:FAQ - MediaWiki]]</ref> If you run a bot, please send a User-Agent header identifying the bot with an identifier that isn't going to be confused with many other bots, and supplying some way of contacting you (e.g. a userpage on the local wiki, a userpage on a related wiki using interwiki linking syntax, a URI for a relevant external website, or an email address), e.g.:
<pre>
User-Agent: CoolBot/0.0 (https://example.org/coolbot/; coolbot@example.org) generic-library/0.0
</pre>


The generic format is <code><client name>/<version> (<contact information>) <library/framework name>/<version> [<library name>/<version> ...]</code>. Parts that are not applicable can be omitted.
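
For instance, a bot that is not built on a separate HTTP library can drop the trailing library fields; the PHP and Python examples later on this page use this shorter form:

<pre>
User-Agent: CoolBot/0.0 (https://example.org/coolbot/; coolbot@example.org)
</pre>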


If you run an automated agent, please consider following the Internet-wide convention of including the string "bot" in the User-Agent string, in any combination of lowercase or uppercase letters. This is recognized by Wikimedia's systems, and used to classify traffic and provide more accurate statistics.


Do not copy a browser's user agent for your bot, as bot-like behavior with a browser's user agent will be assumed malicious.<ref>[[mailarchive:wikitech-l/2010-February/046783.html|[Wikitech-l] User-Agent:]]</ref> Do not use generic agents such as "curl", "lwp", "Python-urllib", and so on. For large frameworks like pywikibot, there are so many users that just "pywikibot" is likely to be somewhat vague. Including detail about the specific task/script/etc would be a good idea, even if that detail is opaque to anyone besides the operator.<ref>[[mailarchive:mediawiki-api/2014-July/003308.html|Clarification on what is needed for "identifying the bot" in bot user-agent?]]</ref>


Web browsers generally send a User-Agent string automatically; if you encounter the above error, please refer to your browser's manual to find out how to set the User-Agent string. Note that some plugins or proxies for privacy enhancement may suppress this header. However, for anonymous surfing, it is recommended to send a generic User-Agent string, instead of suppressing it or sending an empty string. Note that other features are much more likely to identify you to a website — if you are interested in protecting your privacy, visit the [//coveryourtracks.eff.org/ Cover Your Tracks project].


Browser-based applications written in JavaScript are typically forced to send the same User-Agent header as the browser that hosts them. This is not a violation of policy; however, such applications are encouraged to include the <code>Api-User-Agent</code> header to supply an appropriate agent.


As of 2015, Wikimedia sites do not reject all page views and API requests from clients that do not set a User-Agent header. As such, the requirement is not automatically enforced. Rather, it may be enforced in specific cases as needed.<ref>gmane.science.linguistics.wikipedia.technical/83870 ([//thread.gmane.org/gmane.science.linguistics.wikipedia.technical/83870/ deadlink])</ref>
== Code examples ==

On Wikimedia wikis, if you don't supply a <code>User-Agent</code> header, or you supply an empty or generic one, your request will fail with an HTTP 403 error. Other MediaWiki installations may have similar policies.


=== JavaScript ===


If you are calling the API from browser-based JavaScript, you won't be able to influence the <code>User-Agent</code> header: the browser will use its own. To work around this, use the <code>Api-User-Agent</code> header:

<syntaxhighlight lang="javascript">
// Using XMLHttpRequest
xhr.setRequestHeader( 'Api-User-Agent', 'Example/1.0' );
</syntaxhighlight>
<syntaxhighlight lang="javascript">
// Using jQuery
$.ajax( {
    url: 'https://example/...',
    data: ...,
    dataType: 'json',
    type: 'GET',
    headers: { 'Api-User-Agent': 'Example/1.0' },
} ).then( function ( data ) {
    // ..
} );
</syntaxhighlight>
<syntaxhighlight lang="javascript">
// Using mw.Api
var api = new mw.Api( {
    ajax: {
        headers: { 'Api-User-Agent': 'Example/1.0' }
    }
} );
api.get( ... ).then( function ( data ) {
    // ...
});
</syntaxhighlight>
<syntaxhighlight lang="javascript">
// Using Fetch
fetch( 'https://example/...', {
    method: 'GET',
    headers: new Headers( {
        'Api-User-Agent': 'Example/1.0'
    } )
} ).then( function ( response ) {
    return response.json();
} ).then( function ( data ) {
    // ...
});
</syntaxhighlight>


=== PHP ===


In PHP, you can identify your user-agent with code such as this:

<syntaxhighlight lang="php">
ini_set( 'user_agent', 'CoolBot/0.0 (https://example.org/coolbot/; coolbot@example.org)' );
</syntaxhighlight>


=== cURL ===

Or if you use [[{{lwp|cURL}}|cURL]]:

<syntaxhighlight lang="php">
curl_setopt( $curl, CURLOPT_USERAGENT, 'CoolBot/0.0 (https://example.org/coolbot/; coolbot@example.org)' );
</syntaxhighlight>

=== Python ===

In Python, you can use the [[{{lwp|Requests (software)}}|Requests]] library to set a header:

<syntaxhighlight lang="python">
import requests

url = 'https://example/...'
headers = {'User-Agent': 'CoolBot/0.0 (https://example.org/coolbot/; coolbot@example.org)'}

response = requests.get(url, headers=headers)
</syntaxhighlight>
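
If a script makes many requests, the same header can be attached once to a <code>requests.Session</code> so that every call carries it. This is a minimal sketch along the lines of the example above; the URL and agent string are placeholders:

<syntaxhighlight lang="python">
import requests

# Reusing one session sends the same User-Agent header with every request
session = requests.Session()
session.headers.update({'User-Agent': 'CoolBot/0.0 (https://example.org/coolbot/; coolbot@example.org)'})

response = session.get('https://example/...')
</syntaxhighlight>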

Or, if you want to use [//sparqlwrapper.readthedocs.io SPARQLWrapper] as in https://people.wikimedia.org/~bearloga/notes/wdqs-python.html:

<syntaxhighlight lang="python">
from SPARQLWrapper import SPARQLWrapper, JSON

url = 'https://example/...'
user_agent = 'CoolBot/0.0 (https://example.org/coolbot/; coolbot@example.org)'

sparql = SPARQLWrapper(url, agent=user_agent)
results = sparql.query()
</syntaxhighlight>
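
The header can also be set with only the Python standard library, for example via <code>urllib.request</code>, which otherwise sends the generic "Python-urllib" agent discouraged above. A minimal sketch with the same placeholder values:

<syntaxhighlight lang="python">
from urllib.request import Request, urlopen

url = 'https://example/...'
headers = {'User-Agent': 'CoolBot/0.0 (https://example.org/coolbot/; coolbot@example.org)'}

# Attach the header to the request object before opening it
request = Request(url, headers=headers)
with urlopen(request) as response:
    data = response.read()
</syntaxhighlight>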

== {{int string|Notes}} ==
<references />


== {{int string|See also}} ==
* [[wikitech:Robot policy|Policy for crawlers and bots]] that wish to operate on Wikimedia websites

[[Category:Global policies{{#translation:}}]]
[[Category:Bots{{#translation:}}]]
[[Category:Policies maintained by the Wikimedia Foundation{{#translation:}}]]
