Regional sub-domains and blog

Lazy Badger
Site user since 14.06.2017
#11
luckylemon:
On the other subdomains, point the canonical to it.

Yandex does not understand cross-host canonical, by the way.

Tinplate production by the continuous annealing method
L
Site user since 30.07.2019
#12
LazyBadger:
Yandex does not understand cross-host canonical, by the way.

Got it, thanks for the info about Yandex - I'll read through the docs. Then the blog duplicates on the regional sub-domains can be removed via robots.txt.
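For reference, a minimal robots.txt sketch of that option, assuming the regional sub-domains mirror the blog under a /blog/ path (the host and path here are made up for illustration, not taken from the thread); it would be served only on the regional sub-domains, not on the main domain:

# robots.txt on a regional sub-domain only, e.g. spb.example.com (hypothetical host)
User-agent: *
Disallow: /blog/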

In general, I took the approach to handling duplicates from Google's docs: https://support.google.com/webmasters/answer/66359?hl=en

Google does not recommend blocking crawler access to duplicate content on your website, whether with a robots.txt file or other methods. If search engines cannot crawl pages with duplicate content, they cannot automatically detect that these URLs point to the same content and will therefore effectively have to treat them as separate, unique pages. A better solution is to allow search engines to crawl these URLs, but mark them as duplicates by using the rel="canonical" link element, the URL parameter handling tool, or 301 redirects. In cases where duplicate content leads to too much of your website being crawled, you can also adjust the crawl rate setting in Search Console.
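For completeness, the rel="canonical" variant Google recommends would look roughly like this on a duplicate blog page of a regional sub-domain (the URLs are made up for illustration); per the note above, Yandex may ignore a cross-host canonical, while Google does honour it:

<!-- In the <head> of the duplicate page, e.g. https://spb.example.com/blog/post-1 (hypothetical URL) -->
<link rel="canonical" href="https://example.com/blog/post-1">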
The WishMaster:
How?

A strange decision, to say the least.

I'd be glad to take your solution into account. How would you suggest solving our dilemma? The description is in the first post.

