luckylemon
Rating: 7
Registration: 30.07.2019
LazyBadger:
Yandex doesn't understand cross-host canonical, BTW

Got it, thanks for the info on Yandex - I'll read their docs. In that case, the blog duplicates on the regional subdomains can be excluded via robots.txt.
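For example, a minimal robots.txt sketch for one of the regional subdomains could look like this (the city2 subdomain and the /blog/ path are assumptions for illustration, and only Yandex is disallowed, so Google can still crawl the pages and see the canonical):

    # robots.txt on city2.domain.ru (hypothetical regional subdomain)
    # Yandex ignores the cross-host canonical, so its crawler is kept out of the blog copy.
    # Crawlers with no matching group (e.g. Googlebot) are not restricted by this file.
    User-agent: Yandex
    Disallow: /blog/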

As for marking the duplicates, I took that approach from Google's documentation: https://support.google.com/webmasters/answer/66359?hl=en

Google does not recommend blocking crawler access to duplicate content on your website, whether with a robots.txt file or other methods. If search engines can't crawl pages with duplicate content, they can't automatically detect that these URLs point to the same content and will therefore effectively have to treat them as separate, unique pages. A better solution is to allow search engines to crawl these URLs, but mark them as duplicates by using the rel="canonical" link element, the URL parameter handling tool, or 301 redirects. In cases where duplicate content leads to us crawling too much of your website, you can also adjust the crawl rate setting in Search Console.
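As an illustration of the rel="canonical" option from that doc, a rough sketch, assuming city1.domain.ru is the primary copy of the blog (which region is primary is an assumption here, not something decided in the thread):

    <!-- in the <head> of https://city2.domain.ru/blog/some-post (hypothetical URL) -->
    <link rel="canonical" href="https://city1.domain.ru/blog/some-post">

Google then generally consolidates indexing signals on the city1.domain.ru URL, though it treats the tag as a hint rather than a directive.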
The WishMaster:
How, exactly?

A strange solution, to put it mildly.

I'd be glad to take your suggestion into account. How would you advise solving our dilemma? The description is in the first post.

Thanks for the replies. We've decided to duplicate the blog on all subdomains/regions, but to have it indexed only in one region and to mark it as a duplicate in the others.

The WishMaster:
And what benefit do you expect from the blog?

Posting content that is relevant to the audience and tied to the site's main service => attracting organic traffic => growing in rankings across all the territories, i.e. city1.domain.ru, city2.domain.ru, etc.

LazyBadger:
Put the blog on domain.ru/blog

But then none of the regions will grow in the search engines, since the regional sites sit on their own subdomains while the blog is on the main domain. Could you please explain in more detail why that solution is the right one?
