I had the same question. I went to look ...
and found nothing.
Neither the rules nor any distinct "Company".
What / who is meant is not entirely clear.
Who would you even take to court here, and against whom? It's not even clear in which country.
Although the rules can still be googled:
but I don't see links to them on the site itself.
And about blocking there's certainly not a word.
# cat access.log-20200618 | wc -l
1668779
# cat access.log-20200618 | grep "+http://www.bing.com/bingbot.htm" | wc -l
113249
# cat access.log-20200618 | grep "+http://yandex.com/bots" | wc -l
109748
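The counts above re-read the whole log once per bot. A minimal sketch of doing it in a single awk pass instead (the tiny sample log and the substrings matched are illustrative, standing in for the real access.log-20200618):

```shell
# Build a tiny fake access log so the example is self-contained.
printf '%s\n' \
  'GET / "Mozilla (+http://www.bing.com/bingbot.htm)"' \
  'GET / "Mozilla (+http://yandex.com/bots)"' \
  'GET / "Mozilla Firefox"' > access.log.sample

# One pass: count total lines and each bot's lines at the same time.
awk '{ total++ }
     /bingbot/           { bing++ }
     /yandex\.com\/bots/ { yandex++ }
     END { printf "total=%d bing=%d yandex=%d\n", total, bing+0, yandex+0 }' \
    access.log.sample
# prints: total=3 bing=1 yandex=1
```

On a 1.6M-line log this reads the file once instead of three times, which matters exactly on the kind of low-bandwidth box being discussed.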
It's very, very active. And for low-bandwidth sites that's sometimes a problem.
That's what the Crawl-delay directive is for. Actually, I have it set, and the Mail.ru bot created a negligible load (I tracked this back when I was on a VPS). But Yandex, oh yes. Or there's Google.
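A minimal robots.txt sketch with the directive mentioned above (the delay values are arbitrary examples; note that support varies: Bing documents Crawl-delay, Google has never honored it, and Yandex dropped support for it around 2018):

```
User-agent: bingbot
Crawl-delay: 10

User-agent: Yandex
Crawl-delay: 5
```

The value is the minimum number of seconds the bot should wait between requests, which is why it only helps with bots that bother to read it.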
Although, if a guy has a serious site, then of course it needs resources too. And if it's some half-dead thing with a couple hundred pages, where would the load even come from?
And which bots even honor it in 2020?
I don't know about today, but 5 years ago there was no way to rein in Bing.
Yes, it's normal for a hoster to just ban Bing and assorted stray bots. Others might go: you're exceeding the load limits, blah blah, you have one day to upgrade your plan or move off to a VPS... ---------- Posted 18.06.2020 at 16:22 ---------- If I were the hoster, I'd make customers install an antibot on their websites :D The resource savings would be 3-5x at the very least.
The next step: hacking into customer sites and force-optimizing them without their knowledge :)
And more than half of them aren't even aware they need things like caching plugins.
Well, the page loads in 200 ms or so, and fine. Looks fast enough to the eye, so why do anything more? Let the page be fully regenerated on every hit, let it fire a pile of database queries, that's all "normal"...
When a site is built properly, it has nothing to fear from bots.
And depending on the host's size / laziness, for some it's easier to just block the problem than to try to understand each client's situation.
And have you seen the adherents of "I bought hosting once, so the hosting company is obliged to help me with my wordpress / joomla / drupal / whatever CMS"?
I see. Pretty pointless overall. For my sites I write rules for bots into robots.txt right away and block the unnecessary ones. Usually that's enough to avoid excessive load on the hosting.
Unfortunately, not all bots follow those directives. So customers often have to block bots like Ahrefs, for example. And Yandex can create problems too. You write here about your 10-20 visitors of traffic. But if you have 1000+ or even 10000+ pages being frequently indexed by Yandex, the hosting provider won't be thrilled :) At 10000+ pages no directives help on shared hosting. This is an especially big issue for hosts that pack 1000+ hosting accounts onto a single node. On such nodes, however well you optimize the hardware and software, you have to watch the load very closely, or the hosting turns into not just a pumpkin but a very rotten pumpkin.
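For bots that ignore robots.txt, blocking usually happens at the web server instead. A sketch assuming nginx (the user-agent list here is illustrative, picked from the bots mentioned in this thread, and the file path is hypothetical):

```
# /etc/nginx/conf.d/block-bots.conf (illustrative path)
# The map must live at http{} level; it sets a flag per request
# based on a case-insensitive User-Agent match.
map $http_user_agent $blocked_bot {
    default        0;
    ~*AhrefsBot    1;
    ~*SemrushBot   1;
    ~*MJ12bot      1;
}
```

Then inside the relevant server{} block:

```
    # Refuse flagged bots before any page generation happens.
    if ($blocked_bot) {
        return 403;
    }
```

Unlike a robots.txt directive, this costs the backend nothing: the bot gets a 403 before PHP or the database is ever touched.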