Forewarned is forearmed: learn to correct basic SEO mistakes

Tips for beginner SEO specialists

SEO can be summed up in one sentence: "a powerful marketing tool, but only if you do not make mistakes." In reality, search engine optimization is a complex, multi-level process designed not only to drive traffic to the site, but also to improve the product and build relationships with customers. Avoiding missteps and choosing the right optimization strategy is quite difficult, especially if you are promoting a project on your own for the first time.

Specialists at the Rookee service analyzed the major mistakes made in site promotion and showed, with practical examples, how to fix them.

Here are 20 critical SEO mistakes and ways to correct them:

1. Ignoring robots.txt

A common mistake that hinders the promotion of the site.

Robots.txt is a text file located in the site's root directory. It is a set of rules with indexing parameters for search engine robots. Simply put, it dictates to search engines which pages of your site to index and which to skip.

It is preferable to close from indexing:

  • paid entry channels (UTM tags, etc.);
  • functional pages (search, filters, item display settings, print versions, etc.);
  • duplicates of the home page;
  • foreign-language versions of the site (if they duplicate the main site's content);
  • pdf, doc, and xls files (if they duplicate the core content);
  • service pages (cart, personal account, registration, authorization).

How to check:

In the Yandex.Webmaster service you can check whether a page is closed from indexing, even without registration.

How to close:

For example, the search functionality can generate a huge number of duplicates on the site.

Suppose the URLs generated by the search functionality look like this:

  • Site.ru/search/?search.text=мужские-джинсы
  • Site.ru/search/?search.text=женские-джинсы
  • Site.ru/search/?search.text=горячие-беляши

All of the above URLs share a common path segment:

/search

To close these pages, use the following directive:

Disallow: /search

Disallow prohibits indexing of sections or individual pages of a site.

Disallow: /search closes the search pages: it tells search engine robots that they do not need to index the internal search results.

Next, we check and form the correct new robots.txt in Yandex.Webmaster, copy it into a text file, and place it in the site's root directory.

In the same way, you can verify and close from indexing all the potential page duplicates listed above.
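You can also test the rules locally before uploading the file: Python's standard urllib.robotparser module parses robots.txt rules and checks URLs against them. A minimal sketch with illustrative rules and URLs (note that urllib.robotparser follows the basic standard and ignores wildcard patterns, so check those in Yandex.Webmaster):

import urllib.robotparser

# Illustrative rules covering some of the duplicate types listed above
rules = """User-agent: *
Disallow: /search
Disallow: /basket
Disallow: /auth
"""

parser = urllib.robotparser.RobotFileParser()
parser.parse(rules.splitlines())

# Service pages should be closed, normal catalog pages should stay open
for url in ["https://site.ru/search/?search.text=test",
            "https://site.ru/catalog/jeans/"]:
    print(url, "allowed:", parser.can_fetch("*", url))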

2. Not working with potential duplicate pages

Let us consider how to find duplicates on a site when they are not obvious and are not covered by the list in the first section. For example, duplicate pages can be generated due to the specifics of the site's CMS.

To find them, go to the "Indexing" section in Yandex.Webmaster, open "Pages in search", click the "Excluded pages" tab, and under "Status" choose "Duplicate". Then check which pages were excluded from the search because Yandex considered them duplicates.

All pages with duplicate meta tags should be checked. If the pages really are duplicates, they must be fixed: closed in robots.txt, given a rel="canonical" attribute, or redirected with a 301. If the pages are not duplicates and simply repeat each other's meta tags, that is also considered an error, and the meta tags need to be made unique.

Another way to find duplicate pages is to use the Screaming Frog SEO Spider program (unfortunately, it is paid); a scripted alternative is sketched after the list below.

For example, the program helped uncover the following problem: one product was present in several sections under three different URLs that duplicated each other.

There are 3 options to solve the problem:

  • Bring all the pages to a single URL by placing the product in one section and setting up 301 redirects.
  • Set the canonical attribute on the duplicate pages, assigning one of them as the main one.
  • Configure a 301 redirect to one of the duplicate pages.
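If Screaming Frog is not available, a rough free check can be scripted: pages whose <title> tags coincide are candidates for review. A minimal sketch using only the Python standard library; the URLs are hypothetical and would normally come from your sitemap:

import re
import urllib.request
from collections import defaultdict

urls = [
    "https://site.ru/catalog/jeans/product-1/",
    "https://site.ru/sale/product-1/",  # hypothetical duplicate
]

titles = defaultdict(list)
for url in urls:
    html = urllib.request.urlopen(url).read().decode("utf-8", "ignore")
    match = re.search(r"<title>(.*?)</title>", html, re.I | re.S)
    if match:
        titles[match.group(1).strip()].append(url)

# URLs sharing one title are candidates for a 301 or rel="canonical"
for title, group in titles.items():
    if len(group) > 1:
        print(title, "->", group)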

3. Not updating sitemap.xml

According to Yandex Help, a sitemap is a file with links to the pages of your site that tells search engines about its current structure. It helps search engines index the resource faster and more accurately. It should be kept up to date: for example, after changes are made to robots.txt, the sitemap also needs to be updated.

How to create:

With the Screaming Frog program, you can generate and update your sitemap.

This is not the only option: there are special plug-ins that automatically generate a sitemap.

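If no suitable plug-in is available, a sitemap can also be generated by a short script. A minimal sketch, assuming the page list is already collected (a real sitemap must also respect the protocol's limit of 50,000 URLs per file):

from xml.sax.saxutils import escape

pages = ["https://site.ru/", "https://site.ru/catalog/"]  # illustrative

items = "\n".join(
    f"  <url><loc>{escape(p)}</loc></url>" for p in pages
)
sitemap = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
    f"{items}\n</urlset>"
)

with open("sitemap.xml", "w", encoding="utf-8") as f:
    f.write(sitemap)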

How to check:

Run a sitemap analysis through the Yandex.Webmaster service.

4. Too much JavaScript

JavaScript (abbreviated JS) is a popular programming language that website owners use to create animations, pop-up banners, and other user interface elements. Search engines have long said that they can recognize JavaScript, so its use supposedly does not affect promotion. Experience, however, suggests otherwise: a site may have difficulties in promotion if it is not displayed correctly in the index.

Checking this is easy. Open the saved copy of the site in Yandex, click the "Text only" link, and look at the text copy:

An alternative is to add "&cht=1" to the resulting URL to get the text version of the page. This is what the search engine robot sees on the page. Check whether all elements of the site are displayed correctly; if not, fix the bugs together with a programmer.
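The same check can be approximated locally: stripping <script> and <style> blocks from the HTML leaves roughly the text available to a robot that does not execute JS. A crude sketch (the URL is illustrative), not a substitute for checking the Yandex saved copy:

import re
import urllib.request

html = urllib.request.urlopen("https://site.ru/").read().decode("utf-8", "ignore")

# Drop scripts, styles, and all remaining tags
text = re.sub(r"(?is)<(script|style).*?</\1>", " ", html)
text = re.sub(r"(?s)<[^>]+>", " ", text)
text = re.sub(r"\s+", " ", text).strip()

print(text[:500])  # if key content is missing here, it likely relies on JS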

5. Not prioritizing sections ahead of a rise in seasonal demand

As a rule, optimization, implementing improvements on the site, and indexing of all the changes take approximately 2 months. This means that sections promoted for particular queries will be most in demand roughly 2 months after the start of the SEO campaign, and those are the sections that should be promoted first.

To do this, you need to understand clearly which sections will be most in demand in 2 months, in 3, in 4, and so on. For some queries the peak in demand is obvious (for example, "buy Christmas tree toys"), while for others it is not obvious at all ("buy a refrigerator", with peak demand in July and August).

Peak demand can be checked in Yandex.Wordstat.

6. "The positions are already in the top - optimization is not needed"

SEO is an ongoing process, so the site owner's position of "the queries are already in the top - there is no need to keep paying for promotion" is not justified. Search engines evolve; new ranking factors and algorithms appear. A page in the top 3 today may, in a month or two, be unable to compete with other resources that are constantly developing and adapting to the new requirements of users and search engines.

What such an approach leads to:

  • visibility and positions decline;
  • the number of sales within the SEO KPIs shrinks.

7. Only high-frequency and competitive queries

SEO specialists divide all queries by frequency, a parameter that shows how often a query is entered into the search box:

  • low-frequency (LF);
  • medium-frequency (MF);
  • high-frequency (HF).

Popular (HF) queries are very important, and abandoning them completely is not recommended. But successful SEO does not live by them alone.

Cons of high-frequency queries:

  • A large amount of advertising. The more popular the query, the more companies want to advertise on it.
  • Big-budget brands in the search results that are difficult to compete with.

What this leads to:

  • a long time to bring the query into the search engine's top results;
  • low CTR;
  • small audience coverage;
  • a high cost of promotion.

The next mistake follows logically from these cons:

8. No low-frequency and micro-frequency queries

Users enter low-frequency queries rarely, but they are important too. Typically, low-frequency queries are more specific and bring a better-prepared buying audience. The semantic core must contain queries of all frequency levels.

What is characteristic of low-frequency and micro-frequency queries:

  • They are easier to bring into the top.
  • High refined demand (the demand is narrower: impressions are counted for the query itself, without phrases that merely include it).
  • A high CTR. According to a study by the Rookee service, low-frequency queries almost always have a higher CTR.
  • High conversion. A person who enters the low-frequency query "buy Italian women's red down jacket" most likely knows exactly what they want.

9. Not taking into account queries that already bring traffic

By collecting new queries and optimizing landing pages for them, you can lose the traffic the site already receives from other keywords, and the site will lose visitors.

To avoid this, check in Yandex.Metrica which phrases already make the site's pages visible in search, export the list of current traffic-bearing queries, add it to the collected semantics, and remove the duplicates, as in the sketch below.
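The merge itself is easy to script. A sketch assuming two plain-text files with one query per line (the file names are illustrative):

def load(path):
    with open(path, encoding="utf-8") as f:
        return [line.strip().lower() for line in f if line.strip()]

collected = load("collected_queries.txt")  # newly gathered semantics
metrika = load("metrika_queries.txt")      # queries already bringing traffic

# Keep order, drop duplicates; traffic-bearing queries go first
merged = list(dict.fromkeys(metrika + collected))

with open("semantic_core.txt", "w", encoding="utf-8") as f:
    f.write("\n".join(merged))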

10. Clustering based only on logic, not statistics

SEO specialists call clustering the grouping of queries by intent (the user's intention that drives them to enter a query into a search engine).

When grouping queries by user needs, do not rely on human logic alone; for correctness and reliability, trust only the data of Yandex and Google. People are unpredictable, and the real clustering shows in the search results. You can check the search results with the free service coolakov.ru.
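Services of this kind cluster by comparing search results: two queries fall into one group if their top 10 shares enough URLs. A simplified Python sketch of the idea, with canned illustrative SERP data and a tunable overlap threshold:

# Illustrative SERP data: query -> set of top-10 result URLs
serps = {
    "buy jeans": {"a.ru", "b.ru", "c.ru", "d.ru"},
    "jeans price": {"a.ru", "b.ru", "c.ru", "e.ru"},
    "jeans history": {"x.ru", "y.ru", "z.ru"},
}

def cluster(serps, threshold=3):
    groups = []
    for query, urls in serps.items():
        for group in groups:
            # Join a group if the SERP overlaps enough with its first query
            if len(urls & serps[group[0]]) >= threshold:
                group.append(query)
                break
        else:
            groups.append([query])
    return groups

print(cluster(serps))
# -> [['buy jeans', 'jeans price'], ['jeans history']]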

11. The necessary landing pages do not exist and are not planned

Each query requires its own landing page. If you cannot create such a page, do not try to stretch those queries onto existing pages and optimize for them.

12. Constantly promoting a query on the same page

The idea is simple: if two pages match the same query and it has proven impossible to bring the site to the top with one of them, try moving the query to the other, for example, not to the home page but to an internal one. Perhaps that way you will succeed.

13. Promoting the same pages in Yandex and Google

To explain the cause of this error, we need to dig into search engine history. Initially Google was an informational search engine and Yandex a commercial one. As a result, it is much easier for Google to find and rank a page with a sufficiently large text, while Yandex favors and knows how to rank sales pages. The recommendation: in Google, pay more attention to informational pages; in Yandex, to commercial ones.

14. Not knowing Excel

30-40% of an SEO specialist's work happens in Excel, so a basic knowledge of it is indispensable. For example, with this program you can see how many queries each page targets, determine which of them can potentially bring more traffic, and find and delete duplicates.

Basic formulas for a beginner SEO specialist (a worked example follows the list):

  1. SUM: =SUM(number1; number2).
  2. AVERAGE: =AVERAGE(number1; number2).
  3. COUNT: =COUNT(cell1:cell2).
  4. IF: =IF(logical_test; "text if true"; "text if false").
  5. SUMIF: =SUMIF(range; criterion; sum_range).
  6. COUNTIF: =COUNTIF(range; criterion).
  7. LEN: =LEN(cell_reference).
  8. TRIM: =TRIM(cell_reference).
  9. VLOOKUP: =VLOOKUP(lookup_value; table; column_num; match_type).
  10. CONCATENATE: =CONCATENATE(cell1; " "; cell2).
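For example, a typical SEO check, whether a Title fits a display limit, combines IF and LEN (the 70-character threshold here is illustrative):

=IF(LEN(A1)>70;"Title is too long";"OK")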

15. Optimizing manually

SEO can and should be automated, not only by means of specialized services but also with the capabilities of the site's CMS. This helps increase the profitability of the business and speeds up the work without losing quality. When an online store has thousands of products, it is physically impossible to quickly write a meta description and Title for each of them. In such cases we recommend, at first, generating the Title and Description automatically from a template, as in the sketch below.
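The template approach fits in a few lines. A Python sketch with hypothetical field names; in practice the CMS would supply the product catalog:

# Hypothetical catalog rows; a real CMS would provide these fields
products = [
    {"name": "Levi's 501 men's jeans", "category": "Jeans"},
    {"name": "Bosch KGN39 refrigerator", "category": "Refrigerators"},
]

for p in products:
    title = f"{p['name']} | buy in the online store, price and reviews"
    description = (f"{p['name']} in stock. {p['category']}: "
                   "delivery, warranty, customer reviews.")
    print(title)
    print(description)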

In some cases you can also generate the texts in product cards from a template, but only for a short period, gradually replacing them with unique descriptions. This type of automation is practiced, for example, by the "M.Video" site.

16. Not working with CTR

Yandex.Webmaster lets you check the CTR. If you see that it is low, you can try to remedy the situation by making the snippet more attractive, including with emoji and symbols, with the help of specialized services.
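For reference, CTR (click-through rate) is the share of impressions that turned into clicks:

CTR = (clicks / impressions) × 100%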

17. Not engaging users

Behavioral factors are among the most important ranking factors. They cover the set of visitors' actions on the site: time on site, the number of pages viewed, registration, adding to favorites, clicking on links, and so on.

Engaging content is exactly what improves behavioral factors. How to get users to actively interact with, like, and repost your publications:

  • short videos;
  • a random discount generator;
  • satisfaction questionnaires.

18. Promoting only the site

Yandex and Google are only part of the market. There are other platforms where you can promote your company. Remember that every social network, service, and electronic marketplace has its own algorithms for finding documents within the platform, and each of these systems can be used to promote your content.

19. Forgetting about commercial factors

Commercial ranking factors include concepts such as trust, quality, service, convenience, choice, and design. Based on them, the user decides whether to perform a target action: create an account on the site, make a purchase, fill out a form.

A small list of what should be on the site:

  • Contacts page
  • About Us
  • Online consultant
  • Social network buttons
  • Payment information
  • Prices
  • Warranty, returns, cancellations
  • Reviews
  • An order form for goods or services

20. Collecting the semantic core manually

Collecting the semantic core manually is a long and laborious process. It can be created automatically in Rookee: enter the site address and the promotion region, and the system will select the most relevant queries for the site and immediately forecast the traffic and budget for each of them.

You can also add queries manually or from a file if, for example, you decide to additionally use the Serpstat service, in which you can:

  • select competitors;
  • check the competitors;
  • export their visibility and the visibility of your site;
  • remove unsuitable queries;
  • check positions.