SEO NEWS
Cloudflare has released its website analytics system

It is free, does not use cookies and works without additional scripts.

Cloudflare announced its web analytics tool back in September and has now officially launched it. It is free: according to the creators, the main goal of the product is to give SEO specialists around the world access to data without compromising visitor privacy.

Cloudflare Web Analytics can be used without changing your DNS settings: you simply add an HTML snippet to the site. This simplifies setup and keeps the overhead low, so the tool works even on low-powered devices.
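Installation looks roughly like this – a hedged sketch of the beacon snippet Cloudflare generates in its dashboard (the token value is a placeholder; copy the exact snippet from your own account):

<!-- Illustrative Cloudflare Web Analytics snippet, placed before </body>.
     The token is a placeholder; use the one from your Cloudflare dashboard. -->
<script defer
        src="https://static.cloudflareinsights.com/beacon.min.js"
        data-cf-beacon='{"token": "YOUR_SITE_TOKEN"}'></script>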

However, you shouldn’t expect anything supernatural from it: the tool tracks two main metrics, page views and visits.


One handy feature is zooming into and grouping a selected time range to see which pages bring the most traffic, from which regions, and from which devices. Cloudflare’s tool also counts all visitors, including those using an ad blocker.

Cloudflare’s tool is a good fit for aspiring SEOs who don’t need complex analytics queries.

YouTube Analytics Added New Traffic Sources Report


With the new metric, you can see how users find your videos.

The new report is located in the Analytics tab in YouTube Studio. The report is titled “How viewers find your video,” and it shows where viewers come from and how many views each source brings. In total, the report uses six categories:

  • Notifications
  • Subscriptions
  • Recommendations
  • The “Up next” list
  • The “Channels” tab on other creators’ channels
  • Other

The developers did not clarify what falls into the last category. Colleagues abroad suggest it covers clicks on links in comments or on videos embedded on other sites.

A green arrow next to a source shows an increase in views relative to the previous period, and a gray one shows a decline. A dash indicates that there is not enough data yet or that the changes are insignificant.

YouTube Studio has also opened a free audio library.



As before on YouTube, it features sounds and tracks that you won’t receive a copyright strike for using.

Google has started highlighting featured results in bold


A new kind of highlighting has appeared in rich snippets.

Google usually uses bold to highlight keywords in snippets. Now it has also started bolding featured products in rich snippets.



As reported by the Alaich Telegram channel, this was first noticed abroad, but the change shows up in Russian-language results as well.

There is still no definitive explanation for this highlighting. If it were simply advertising, the same pattern would hardly appear across different queries, yet it does. Most likely, Google is bolding the options it considers most relevant.



Google has not officially commented on the change. What do you think about it? Write in the comments!

How to speed up your website: working methods


How to increase website loading speed, reduce page weight, and prepare a site for Google’s Core Web Vitals. We cover image optimization and the new metrics.

We have collected the most important and detailed articles on how to speed up website loading, covering both code and image optimization.

How to reduce the weight of website pages and speed up loading

In 2021, Google will start using a new set of metrics to measure user experience and page quality – Core Web Vitals. It includes three main elements:

  • the time it takes for the browser to render the largest visible object in the viewport – Largest Contentful Paint, LCP;
  • estimating layout shifts during page load – Cumulative Layout Shift, CLS;
  • the time between the first interaction of the user with the page and the response of the browser – First Input Delay, FID.

These metrics can be optimized so that the site is of higher quality and gets a better score from the search engine.

How to optimize your LCP score – speed up content loading

Aim for the browser to render the largest visible element in the viewport within 2.5 seconds of the page starting to load; this is considered the threshold for a site that is comfortable to use.


LCP is influenced by four factors:

  • server response time;
  • render-blocking JavaScript and CSS;
  • resource loading time;
  • client side rendering.

In this article, we have discussed how to optimize each item to arrive at a good LCP score.
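To see what LCP actually is on a live page, you can log it with the browser’s standard PerformanceObserver API. This is a minimal sketch for the console, not something from the original article:

<!-- Minimal sketch: log the Largest Contentful Paint time in the browser console -->
<script>
  new PerformanceObserver((list) => {
    const entries = list.getEntries();
    const last = entries[entries.length - 1];          // latest LCP candidate
    console.log('LCP:', (last.startTime / 1000).toFixed(2), 's', last.element);
  }).observe({ type: 'largest-contentful-paint', buffered: true });
</script>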

How to Optimize CLS: Page Layout Shifts That Disrupt Users

Content on a page can shift when some elements load asynchronously: for example, if the webmaster has not reserved space for a banner at the top of the page, the banner pushes all content down once it loads.

Image: the user misclicks because the buttons shifted during loading

CLS stands for Cumulative Layout Shift and helps you gauge how often users encounter unexpected shifts. The optimal CLS score is no more than 0.1 for 75% of sessions.


In the article, we analyze how to measure CLS, which shifts are considered acceptable, and how to optimize the score.
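The simplest defence against such shifts is to reserve space for images and embeds up front, for example with explicit width/height attributes (the browser derives the aspect ratio from them). A hedged illustration with placeholder file names:

<!-- Reserving space for an image/banner so its loading does not push content down -->
<img src="/banners/summer-sale.jpg" width="1200" height="300" alt="Summer sale banner">

<!-- Alternatively, reserve the slot with CSS before the banner is injected -->
<div style="aspect-ratio: 4 / 1;">
  <!-- banner is inserted here after load -->
</div>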

What affects website loading speed [Research 5.2 Million Pages]

The Backlinko blog team, led by Brian Dean, studied Google SERPs to see which acceleration methods the fastest pages use. The sample included 5.2 million desktop and mobile pages, so the results are worth a look.

Learn more about the findings with graphs and charts in the full blog article. A couple of interesting points:

  1. The average time to first byte (TTFB) is 1.286 seconds on desktop and 2.594 seconds on a smartphone. The average time to full page load is 10.3 seconds on desktop and 27.3 seconds on mobile.
  2. Oddly enough, the fastest pages either compress files minimally before sending them from the server or compress them as much as possible; both extremes performed better than medium compression levels.
  3. On desktop, loading speed is influenced more by the use of a CDN; on mobile, by the number of HTML requests.

More interesting information in the full article.

How to reduce website weight and speed up page loading using gzip, brotli, minification and more

Pictures, videos and various interactive elements weigh a lot and slow down the site. You can compress heavy items and speed up loading.

There are compression algorithms for this; the most popular now are gzip and brotli. Brotli compresses more tightly than gzip and has more compression levels, but at the highest levels it is slower.

Compression puts some extra load on the server because of the archiving work, but overall pages load faster: less data is transferred, which speeds up the site.

There are other ways to speed up a site as well – minification (reducing CSS, HTML, and JS), caching, and image optimization – all covered in the article.

How to speed up loading: optimizing the code at the top of the page

Another way to make loading faster is to optimize the code of the upper part of the page – the part the user sees first when they visit the site. If the top of the page is optimized, the user sees content as early as possible, and the rest can load later.

There are several methods to optimize the code at the top of the page:

  • remove unnecessary characters and scripts from the top of the code;
  • set up asynchronous loading with jQuery;
  • speed up receiving the first bytes (TTFB);
  • combine and minify JavaScript and CSS;
  • configure loading from the cache on the user side;
  • use a CDN.

All of this is covered in the article.
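As a rough illustration of several items from the list above (non-blocking scripts, inlined critical CSS, deferred stylesheet loading), the head of an optimized page might look like this; the file names are placeholders, not part of the original article:

<head>
  <!-- Critical CSS for the above-the-fold area is inlined; the rest loads later -->
  <style>/* minified critical CSS here */</style>
  <link rel="stylesheet" href="/css/main.min.css" media="print" onload="this.media='all'">

  <!-- Scripts do not block rendering: defer preserves execution order, async does not -->
  <script defer src="/js/app.min.js"></script>
  <script async src="/js/analytics.js"></script>
</head>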

How to optimize images for fast loading

Great SEO Guide for Images

A great detailed article on everything important to do with image optimization. It’s not only about compression and weight reduction, but also about requirements for size, quality, uniqueness and relevant tips for filling in meta tags.

Much of the advice is based on a webinar by Demi Murych, a technical SEO and reverse engineering specialist.

Requirements for pictures:

  • whether the number of pictures on the page matters;
  • how quality affects SEO and what the minimum image sizes on the site should be;
  • why uniqueness matters for search engines and how to use other people’s images legally;
  • how the search engine analyzes the subject of images;
  • how image placement on the page affects SEO.

Technical issues:

  • what image format to choose;
  • how to let the browser choose the right image variant – the correct way, not the way everyone does it;
  • how to set up responsive images;
  • how to set up lazy loading;
  • the best compression methods.

Filling meta tags:

  • which meta tags must be filled in, and which ones are optional;
  • how to fill in title and alt;
  • is the file name important to the search engine.
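For example, a descriptive file name plus a filled-in alt and title might look like this; the path and values are illustrative, not taken from the article:

<!-- Descriptive file name, alt text and title (illustrative values) -->
<img src="/images/red-leather-womens-jacket.jpg"
     alt="Red leather women's jacket, front view"
     title="Red leather women's jacket">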

How to set up lazy loading of images

A separate article with a detailed description of setting up deferred image loading, also called lazy loading. With this implementation, the user does not have to wait for all content to load: images are loaded as they scroll through the page.

There are several configuration options:

  1. While the user scrolls: when they reach the place where the picture should be, it is loaded.
  2. When the user clicks on an element: the picture is loaded when they follow the link or click the preview.
  3. In the background: content loads gradually, for example while the user has a document open but set aside. Usually used for large drawings and diagrams.

Pictures are loaded as they are viewed:

Image: displaying images with lazy loading

The choice of option depends on the behavior of users on the site. In the article, we will analyze whether lazy loading is really necessary, and how to configure it correctly.
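One common way to implement the first option (loading as the user scrolls) is the native loading="lazy" attribute, supported by most modern browsers. A minimal sketch with a placeholder file name:

<!-- Native lazy loading: the browser fetches the image only when it nears the viewport.
     Width and height are set so the layout does not shift when the image arrives. -->
<img src="/images/diagram-large.png" width="800" height="600"
     loading="lazy" alt="Large diagram loaded on scroll">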

WebP format: should I use it for optimization

WebP is a graphics format developed by Google in 2010 as an alternative to PNG and JPEG, with smaller file sizes at comparable image quality. WebP can also preserve background transparency and animation.

The format is more advantageous in terms of speeding up website loading, but not all browsers support it.

In this article, we have collected all the most important about the WebP format: studies of quality and weight, advantages and disadvantages of the format, browser support, conversion methods, and other topics.
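Because browser support is uneven, WebP is usually served through the picture element with a fallback. A hedged example with placeholder file names:

<!-- Serve WebP where supported, fall back to JPEG elsewhere -->
<picture>
  <source srcset="/images/product.webp" type="image/webp">
  <img src="/images/product.jpg" alt="Product photo" width="640" height="480">
</picture>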

Google spoke about how it detects duplicate content and conducts canonicalization


The developers talked about this in the new episode of the Search Off The Record podcast.


Google employees John Mueller, Martin Splitt, Gary Illyes, and Lizzie Harvey elaborated on duplicate content and Google’s canonicalization. We have selected the most important points.

How Google detects duplicate pages

Everything turned out to be quite simple: a checksum is calculated for each page – a fingerprint derived from the page text. If the checksums of two pages match, Google counts them as duplicates. (In practice, checksums are also used to verify data integrity during transmission.)

The checksum is calculated from the main indicator – the central element of the page, which contains the main content without headers, footers, and sidebars. After the checksums are calculated and compared, a cluster of duplicates is formed, and Google chooses one page from it to show in the SERP. This way the search engine can detect not only full duplicates but partial ones as well.

Martin Splitt on partial duplicate detection: “We have several algorithms that detect and ignore the template parts of pages. For example, this is how we exclude navigation from the checksum calculation and remove the footer. We are left with what we call the central element – the central content of the page, something like its very essence. After calculating and comparing checksums, we combine the pages whose checksums are fully or partially similar into a duplicate cluster.”

Reducing a page to a checksum keeps the work simple: the developers see no point in comparing the full text of every page, since that would take more resources for the same result.
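Google has not published its actual checksum algorithm, but the general idea can be illustrated with any hash over the main content only, ignoring template parts. A purely illustrative sketch using the browser’s SubtleCrypto API (the choice of SHA-256 and the "main" selector are assumptions for the example, not Google’s method):

<!-- Illustration only: hash the central content of the page, ignoring header/footer/sidebar.
     This is not Google's algorithm, just the idea of a content checksum. -->
<script>
  async function contentChecksum() {
    const main = document.querySelector('main') || document.body;  // "central element"
    const data = new TextEncoder().encode(main.innerText);
    const digest = await crypto.subtle.digest('SHA-256', data);
    return [...new Uint8Array(digest)]
      .map((b) => b.toString(16).padStart(2, '0'))
      .join('');
  }
  contentChecksum().then((sum) => console.log('Content checksum:', sum));
</script>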

How Google selects the canonical page

The podcast also drew the main distinction between duplicate detection and canonicalization: first, duplicate pages are identified and grouped together, and then the main one is chosen – that step is canonicalization.

Canonicalization is the process of selecting the main page within a cluster. To choose the canonical page objectively, Google uses more than 20 signals, with weights assigned by machine learning: when the weight of one signal decreases, the weight of another increases, and vice versa.

Martin Splitt on signals: “Obviously, one of them is the content of the page. But there may be other signals: which page has higher PageRank, which protocol the page uses (http or https), whether the page is included in the sitemap, whether it redirects to another page, whether the rel=canonical attribute is set… Each of these signals has its own weight, which we calculate using machine learning. After comparing all the signals for all pairs of pages, we arrive at the actual canonical.”
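The rel=canonical attribute mentioned in the quote is the explicit hint a webmaster can give. It is placed in the head of the duplicate page and points to the preferred URL (the address below is illustrative):

<!-- Hint to search engines which URL is the preferred (canonical) version of this page -->
<link rel="canonical" href="https://site.com/catalog/red-jacket/">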

Finally, the developers noted that canonicalization has nothing to do with ranking.

What is robots.txt: the basics for newbies


Successful indexing of a new site depends on many components. One of them is the robots.txt file, which every novice webmaster should understand. This is updated material for beginners.


Learn more about the rules for composing the file in the complete guide “How to compose robots.txt yourself”.

This material covers the basics for beginners who want to keep up with the professional terminology.

What is robots.txt


A robots.txt file is a .txt document containing site indexing instructions for search bots. It tells search engines which pages of a web resource may be indexed and which must not be.

When a search robot arrives at your site, it first looks for robots.txt. If the file is missing or composed incorrectly, the bot will crawl the site at its own discretion, and there is no guarantee it will start with the pages you most want in the index (new articles, reviews, photo reports, and so on). Indexing a new site can take a long time, so the webmaster should take care to create a correct robots.txt file in good time.

On some website builders, the file is generated automatically. For example, Wix creates robots.txt on its own. To view the file, add “/robots.txt” to the domain. If you see strange elements like “noflashhtml” and “backhtml” in there, don’t be alarmed: they relate to the structure of sites on the platform and do not affect how search engines treat the site.

Why robots.txt is needed

It would seem: why prohibit indexing some of the site’s content? Because search robots do not need everything that makes up the site: there are system files, duplicate pages, keyword rubric pages, and plenty of other things that do not need to be indexed. One caveat:

The contents of the robots.txt file are guidelines for bots, not hard and fast rules. Bots can ignore these recommendations.

Google warns that robots.txt cannot reliably keep pages out of Google’s results. Even if you block a page in robots.txt, it can still end up in the index if another page links to it. It is better to combine robots.txt restrictions with other blocking methods:

Prohibition of site indexing, Yandex
Blocking indexing, Google

However, without robots.txt, it is more likely that information that should be hidden will end up in the search results, and this can be fraught with the disclosure of personal data and other problems.
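One of those other blocking methods is the robots meta tag, which, unlike robots.txt, tells the search engine not to index the page even if it finds a link to it:

<!-- Placed in the <head> of the page that should stay out of the index -->
<meta name="robots" content="noindex, nofollow">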

What robots.txt consists of

The file must be named exactly “robots.txt”, in lowercase. It is placed in the root directory – https://site.com/robots.txt – in a single copy, and the request for it should return an HTTP 200 OK status. The file size must not exceed 32 KB: that is the maximum Yandex will accept; for Google’s robots it can be up to 500 KB.

Everything inside must be written in Latin characters; any Cyrillic names must be converted with a Punycode converter. Each URL prefix must be written on a separate line.

In robots.txt directives (commands or instructions) are written using special terms. Briefly about directives for search bots:

“User-agent:” – the main robots.txt directive

It specifies the search robot that the following instructions are addressed to. For example, User-agent: Googlebot or User-agent: Yandex.

You can also address all search engines at once; in that case the command looks like this: User-agent: *. The special character “*” is understood as “any text”.

After the main directive “User-agent:” specific commands follow.

“Disallow:” command – prohibit indexing in robots.txt

This command prohibits a search robot from indexing the web resource entirely or in part, depending on the path that follows it.

User-agent: Yandex

Disallow: /

This entry in robots.txt means that the Yandex search robot is not allowed to index the site at all, since the “/” prohibition is not narrowed by any path.

User-agent: Yandex

Disallow: /wp-admin

This time there is a qualifier: the wp-admin system folder of the WordPress CMS. The robot is instructed not to index that entire folder.

Command “Allow:” – allow indexing in robots.txt

The opposite of the previous directive. Using the same qualifying paths, this command allows the crawler to add the site elements you need to the search index.

User-agent: *

Allow: /catalog

Disallow: /

Everything that starts with “/catalog” may be crawled; everything else is prohibited.

In practice, “Allow:” is not used very often: there is no need for it because it is applied by default. In robots.txt, “everything that is not prohibited is allowed”: the site owner only has to use the “Disallow:” directive for the content that should not be indexed, and the search robot treats all other content as available for indexing.

Directive “Sitemap:” – an indication of the sitemap

“Sitemap:” tells the crawling robot the path to the sitemap – sitemap.xml, and sitemap.xml.gz in the case of the WordPress CMS.

User-agent: *

Sitemap: http://seonewsjournal.com/sitemap.xml

Sitemap: http://seonewsjournal.com/sitemap.xml.gz

Writing the command in the robots.txt file will help the search robot to index the Sitemap faster. This will speed up the process of getting resource pages into the search results.

The file is ready – what’s next

So you’ve created a robots.txt document based on your site’s needs. It can also be generated automatically, for example using our tool.

What to do next:

  1. check the correctness of the created document, for example using the Yandex validation service;
  2. upload the finished file to the root folder of your site using an FTP client. On WordPress, this is usually the public_html folder.

Then you just have to wait for the search robots to appear, examine your robots.txt, and then start indexing your site.

How to view the robots.txt of someone else’s site

If you want to look at ready-made examples of the file first, nothing could be easier: just enter seonewsjournal.com/robots.txt in the browser’s address bar, replacing “seonewsjournal.com” with the domain of the site you are interested in.

Anchor Text SEO Guide


What are link anchors and how to choose anchor text for SEO in 2020.


In this guide, we will look at what anchor text is and how to work with link anchors so that they are useful and relevant for promotion. The first sections cover the basics and will be most relevant to beginners.

In the manual:

  1. What is anchor text
  2. Why anchor text is important for SEO
  3. Types of anchor text: 9 varieties
  4. What anchors to use for links
  5. Where to post links with anchors
  6. Anchor Cycle Strategy: How to Rank with Fewer Backlinks
  7. Over-optimization of anchors: how search engines treat it and the risks

Marketer and SEO specialist Nathan Gotch compiled the Anchor Text: Definitive Guide, the PR posting team translated it, and we adapted and refined the article.

What is anchor text

Anchor text is readable and clickable text in a link. In HTML format, it looks like this:

<a href="http://seonewsjournal.com">seo news</a>

The definition of “anchor text” applies to all links, including internal and external. This tutorial will focus on external anchors.

A brief history of anchors in SEO

In 2011, all you had to do to rank quickly was build links with keyword-rich anchor text.

But on April 24, 2012, Google announced the launch of the first version of the Penguin algorithm update. From then on, sites using anchor text stuffed with keywords dropped sharply in search results.

Why anchor text is important for SEO

Anchor text is one of the signals a search engine uses to determine relevance.

Google has a patent for “indexing anchor tags in a search engine”. If you wade through the jungle of technical terms, the idea is clear: Google uses the anchor text and the text around the link to determine the relevance of the link.

Using this, you can build relevance without exact-match anchors: place links in relevant content and put your main keyword next to the link.

What should be done:

  1. Find relevant sites in your niche.
  2. Get backlinks in content relevant to your landing page.
  3. Try placing your main keyword next to your link.
  4. Use logical anchor text.

Examples for the keyword “anchor text”:

  • “If you’re looking for more information on anchor text, click here.”
  • “Anchor text is the visible and clickable text in a link. More on this in the PR-CY article.”
  • “To understand the topic of anchor text, I recommend this article: https://pr-cy.ru/news/p/8215-polnoe-seo-rukovodstvo-po-ankornym-tekstam”.

Before we see the optimization tricks, let’s go over the basics. The first thing to know is the types of anchor texts you can use.

Types of anchor text: 9 varieties

The author of the original article, Nathan Gotch, identified nine types of anchor text, from the safest for promotion to the least safe:

1. Branded anchors

Branded anchors are any link text that uses the name of your company or site, for example: “Track your project’s positions in search engines using the LINE service”.

Branded anchors are the safest type of link text if your site has a branded domain. If the domain is an exact or partial keyword match, you should be careful – more on that below.

2. Generic anchors

Generic anchors are usually calls to action: “go here”, “click on the link”, “on this site”. In the sentence “Come here if you are looking for information on SEO”, the phrase “Come here” is a generic anchor.

3. Bare link anchors

Any anchor consisting of a plain URL is considered a bare link: https://seonewsjournal.com/, www.seonewsjournal.com/, seonewsjournal.com.

4. Anchors without keywords


The easiest way to create an anchor without text is to use images. This is a little strategic trick that big brands often use. It doesn’t matter if they do it on purpose or not, it’s a good idea.

5. Links from images without anchor text

Google uses the image’s ALT attribute as the anchor text for the linking image.
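In practice this means the alt text of a linked image acts as the anchor, for example (the image path is illustrative):

<!-- The image's alt text serves as the anchor text of this link -->
<a href="http://seonewsjournal.com">
  <img src="/images/seo-news-logo.png" alt="seo news">
</a>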

6. Brand + anchor with a keyword

You can diversify your anchor profile by combining your brand name with the target keyword, for example: “seonewsjournal anchor text guide”.

7. Keyword variations

To increase the thematic relevance of the page, it is worth diversifying the profile of the anchor text with different variants of keywords on the topic.

Examples for the target keyword “backlinks”:

  • what are backlinks;
  • how to get backlinks;
  • free backlinks.

8. Anchor texts with partial occurrence

Partial match for anchors is similar to keyword variations. The main difference is that you add generic words around the main keyword phrase.

Examples for the target keyword “anchor text”:

  • anchor text guide;
  • detailed article on anchor text;
  • read this post about anchor text.

9. Anchor texts with exact occurrence

Exact match anchor text is an exact match of any target keyword and landing page. For example, if “buy backlinks” is the target keyword, then it will also be the anchor of the exact match.

Exact-match anchors are the most effective but the least safe: they can boost your rankings, but they can also lead to search engine penalties. Read on:

SEO and natural links: link promotion in Google and Yandex

What anchors to use for links

There are three things to pay attention to:

  1. Relevance.
    The link anchor and the surrounding text must be relevant to each other, to the topic of the site where they are placed, and to the site they link to.
  2. Natural sound in the text.
    An artificial-sounding anchor is a negative signal for the search engine. The algorithms understand word morphology, so they recognize unnaturally constructed text: the phrase “men’s jackets price” should read “men’s jacket prices.”
  3. Percentage of anchors.
    Keep a balance between the different types of anchors so as not to be penalized by search engines. The search engines give no exact percentage recommendations; optimizers arrive at them in practice.

Safe anchor percentages: what ratios an SEO should target

These figures are not a law – you can experiment and do what works best in your situation. Nathan Gotch derived these ratios from his own experience; they help him avoid penalties and get results without risk:

  • anchor text with brand mention – 70%;
  • Anchors with bare links – 20%;
  • generic anchors – 5%;
  • anchors with partial occurrence – <5%;
  • anchors with exact match – <1%.

Percentage of anchors for domains with exact and partial matches

Nathan advises newbies to avoid exact-match or partial-match domains. The reason is that they are very easy to over-optimize.

If you have an exact match domain, he advises you to focus on the following coefficients:

  • bare links – 70%;
  • general – 20%;
  • LSI – 5%;
  • partial entry – 1–5%;
  • anchors with brand / exact match – 1-5%.

There are several nuances here:

  1. Reduce the percentage of “branded” anchor text, because exact match domains (EMDs) aren’t really branded – they’re just keywords in the domain.
  2. Increase the number of links without text and generic anchors, which helps to combat over-optimization.

Anchors of the second and third level

Some SEOs use tiered link building for promotion on Google (it is rarely used for Yandex). The scheme looks like this: the main monetized site receives links – this is the first tier; links pointing to those links are the second tier, and so on. This is done to strengthen the posts that pass link equity to the promoted site, for example links from a PBN to guest posts.

For anchors of the second and third levels, the author allows more keywords.

For the second:

  • bare links – 40%;
  • general – 30%;
  • LSI, partial match – 25%;
  • exact match – 5%.

Concentrate keyword anchors on the best placement resources.

For the third level:

  • bare links – 10%;
  • general – 10%;
  • LSI, partial match – 50%;
  • exact match – 30%.

Where to post links with anchors

The placement of each link requires attention. Types of links and anchor text that the author recommends using:

Link anchors for the best sites

On the best placement sites, concentrate anchors with exact and partial matches. The best sites are usually the ones where placement is difficult and expensive.

1. Guest posts on relevant resources

If you can place a link in niche-relevant content, use keyword-rich anchors. Read on:

How to choose guest sites

If you can only get a link in the author bio, use branded (or non-optimized) anchors. Google has long fought against guest post spam, and one of its signals is author bios stuffed with keyword anchors.

2. Specialized pages

Specialized pages are a great opportunity to place links whose text contains keywords with an exact or partial match.

Using your website title is also an effective and safe solution.

3. Private Blogging Networks (PBNs)

Some SEOs don’t believe in the effectiveness of PBNs, but there is another opinion. If you work with site networks, use links with exact or partial matches. Read on:

10 Satellite Sites Promotion Experiments (PBN)

Anchors for other sites

All foundational (basic) links should carry unoptimized anchor text. Use only branded anchors, bare links, or generic anchors for links on these types of sites:

  • paid catalogs;
  • traditional catalogs;
  • business links;
  • press releases;
  • comments on blogs relevant to the niche;
  • forums;
  • site sidebar or links in the footer;
  • links to profiles;
  • social bookmarks;
  • donations, sponsorship.

Some people track anchors manually: if the site is small, it is easy to record the anchors of placed links in a Google Sheet or Excel file. If that is inconvenient, there are dedicated services for monitoring the anchor list.

Next, let’s take a look at the strategy that the author of the guide developed to create a natural anchor text profile.

Anchor Cycle Strategy: How to Rank with Fewer Backlinks

Optimizer Nathan Gotch came up with this strategy and uses it in his projects. The strategy consists of four steps, and on the diagram it looks like this:


Step 1. Promote your landing page with exact match anchors

Start with a few anchors containing target keywords. You may wonder: “Isn’t it dangerous to use an exact-match anchor for a new site?” The optimizer thinks not: sites are penalized for their link profile as a whole, not for one or two links. It’s like blaming a single dinner at McDonald’s for someone being overweight – obesity is the cumulative effect of a poor diet over time, and the same goes for a link profile.

The author uses an exact match anchor for the first backlink for two reasons:

  • see how the site will react;
  • Show search engine bots what the site or landing page is about.

Step 2. Promote your site with non-optimized anchors of all kinds

Use brand mentions, naked links, generic anchors, and various keyword variations at this stage.

Step 3. Monitor rankings and track progress

You will have enough information to judge your work within one to three months. If the page doesn’t grow, re-evaluate it. You may find that you need fewer exact-match anchors.

Often the pages don’t work because:

  • the site is not strong enough;
  • there are not enough backlinks on the page;
  • you have low quality backlinks;
  • the page is poorly constructed.

If you feel you rate 10/10 on these dimensions, move on to the next step.

Step 4. Promote your site with other exact match anchors (if necessary)

Repeat this until you reach the rankings you want. The essence of the anchor cycle is creating a diverse and natural profile of link texts.

The secret to a natural anchor profile is to avoid templates.

Take a look at these two examples:

Site # 1 is the typical link profile you’ll see on a site with anchor spammed content.


Google’s algorithm can easily decide that the site is creating artificial links. You don’t even need a manual check.

Site # 2 has a natural and varied link profile.


Although it has fewer backlinks and less keyword-richness, it will likely outperform site # 1 in the rankings.

We cannot vouch for the result of using Nathan’s strategy on your site and would read your opinions with interest in the comments.

Should you copy your competitors’ anchor text strategy?

As a first step in working with anchors, expert Matt Diggity recommends determining the ideal anchor distribution for your niche: look at the anchor text percentages of the top-ranking competitors and do the same.

This approach can be applied, but there are a number of problems.

  1. Copying only part of an authoritative site’s anchor text is risky.
    Authoritative sites enjoy more trust, so they can afford a higher percentage of keyword anchors. If you copy that approach, you will most likely lose rankings: your site simply doesn’t have enough authority and trust to do the same.
  2. The anchor text percentages across the entire site are not taken into account.
    How do some sites manage to avoid penalties for exact-match anchor spam? Because they have authority and trust, and their overall profile also contains a high percentage of unoptimized anchors.

For these two reasons, you shouldn’t copy competitors’ anchors page by page – you would need to model their entire anchor text profile. If you don’t have authority, trust, and unoptimized anchor text across your whole site, you shouldn’t imitate strong competitors on this point.

Over-optimization of anchors: how search engines treat it and the risks

If a site has been hit by a filter, the first step is to analyze its link anchors. Most penalized sites have an aggressive strategy of placing anchor text with commercial keywords.

Google no longer updates the Penguin and Panda algorithms separately, but the concepts still apply. The main trigger for a manual “Penguin”-style action is anchor over-spam: overusing backlinks with commercial and other keywords will most likely result in a manual penalty. We described how to remove a manual penalty in the article “Search Engine Filters: Google”; there is a similar one about Yandex.

Search engines rarely punish sites solely for over-optimized link anchors; aggressive anchor strategies are usually accompanied by other bad SEO practices.

In theory, you can go overboard with anchor optimization and not get penalized if everything else is okay. But this rarely happens. Websites are punished for doing a lot of things wrong.

Most sites with overly optimized anchors have poor quality links and content, poor user experience, and are often too aggressive in page optimization. To solve these problems, an SEO audit is required.


Check your site against 70+ parameters with the Site Analysis service: it checks optimization, links, meta tags, texts, loading speed, technical parameters, usability, and more. It looks for errors on the home page and internal pages, and also lets you track page positions in search engines.


What search engines think about changing a link’s anchor

If you want to attract Google’s unwanted attention, replace a non-keyword anchor with a target keyword: Google may decide the link is spam. Or it may not, but it isn’t worth the risk. It is better to leave the link as it is and find another place for an anchor with a commercial keyword.

Situations that do not cause concern:

  1. Replacing keyword anchor text with non-optimized text.
    Reducing the number of commercial anchors can improve rankings if there are too many of them. If exact-match or keyword-heavy anchors exceed 25%, it may be worth replacing some of them.
  2. Removing the link and placing it in another part of the article.
    If you decide to change a link, place a new one elsewhere in the article. When you do this, the anchor and link are “refreshed” in Google’s eyes: you lose the old link but gain a new one. Still, change the linked text only as a last resort.

Do I need to disavow links from SEO attacks?

Google automatically ignores link spam, as John Mueller has said, so you don’t have to do extra work disavowing spam links through the Disavow Links tool: “We take this into account and try to automatically ignore such links. I see very few people with real linking issues, so this works well. If these are just spam links, I wouldn’t bother with them. If these links keep you up at night and you want to make sure Google handles them properly, use the Disavow Links tool. This is not an admission of guilt; you are simply telling our systems that these links should not be counted for your site.”

The Google spokesperson added that for most sites there is no need to disavow spam; the Disavow Links tool should be used only in truly extreme cases. The search engine detects spam links and ignores them successfully.

How Past Sanctions Affect Website SEO

In general, there are two points of view. Optimizer Nathan Gotch believes that getting hit by a filter is like going to jail for a felony: you might get out one day, but the felony stays on your record. “Do you really think Google erases everything for a site that has previously been manually penalized? Even if your penalty gets lifted, ranking will never come easily, and you will always feel like something is holding you back.”

In some cases, he advises creating a new site because he believes that it is not that difficult and much more cost effective than trying to get out of the penalties.

Google employee Gary Illyes has a different opinion: “Rumors that manual sanctions brand a site forever are just an SEO myth. Submit the resource for a recheck, and if everything is fine, you will be off our hook.”

In other words, according to the Google employee, manual sanctions leave no trace in the site’s history. If the previous positions do not return after the sanctions are lifted, it does not mean the past is weighing on the site: perhaps the search engine has changed its algorithms, or competitors have reached the top.

Which side are you on, what does your experience say? Write in the comments!


Anchor text is a very small piece of the SEO puzzle. Most sites are rife with technical SEO, content, and backlink quality issues, and those need to be addressed first. Once you have those optimized, move on to anchor text: it can give you an edge over the competition.

Dogpile search engine


Dogpile is a metasearch engine that aggregates results from Google, Yahoo!, Yandex, and Bing, along with other popular search engines, including providers of audio and video content.

What kind of search engine is Dogpile


Dogpile, one of the most popular metasearch engines on the Web, was launched in 1996. It is now operated by InfoSpace, which recently streamlined its interface, giving it a fresh look and new features.

Using metasearch technology, Dogpile queries the Web’s leading search engines (see the list below), promising to bring back, with one click, the best results from its combined pool of search engine sources.

Note: sponsored links, although labeled, are interspersed throughout the results listings and aren’t always easy to spot.

Dogpile also displays result links on the right-hand side of the results page for clustering and refining searches even further. This lets the searcher drill down into narrower subtopics without using complex search syntax.

For the intrepid researcher, Dogpile also provides an Advanced Search page.

What is a metasearch engine?

A metasearch engine (or search aggregator) is an online information retrieval tool that uses the results of other search engines to build its own. Metasearch engines take a user’s query and immediately send it to several search engines for results.

History of metasearch engine and Dogpile

The first person to implement the idea of metasearch was Daniel Dreilinger of Colorado State University. He developed SearchSavvy, which let users query around 20 different search engines and directories simultaneously.

Although fast, the engine was limited to simple searches and was not very reliable.

University of Washington student Eric Selberg released a more refined version called MetaCrawler.

This engine improved on SearchSavvy’s accuracy by adding its own search syntax behind the scenes and matching that syntax to the syntax of the search engines it was querying.

MetaCrawler reduced the number of search engines queried to six, but although it produced more accurate results, it still wasn’t considered as accurate as running a query in a single engine.

Dogpile searches in:

  1. Google (search engine)
  2. Yahoo! Search (directory / search engine)
  3. Ask.com (search engine)
  4. MSN Search (search engine)
  5. MIVA (e-commerce directory)
  6. LookSmart (directory)


Facebook Page SEO Optimization for Business


Optimizing a website is not a hard task if the site follows the search engine’s requirements.

The same goes for Facebook. You need to optimize your Facebook page to get good results and gain quality page likes from your target audience. Here are a few powerful tips you can follow to reach that audience without problems.

This article walks through those tips and then discusses why page likes matter for a business Facebook page, giving you a practical guide to organic Facebook marketing.

Powerful Tips for Facebook Page SEO Optimization

Right name. Choosing the right name is important because your name says it all, so pick it with care and thought. Remember that SEO is not only about keywords but also about your page name: it has to catch the eye of your target audience. Choose the page name after proper discussion and exploration of ideas, ideally with expert input.

Sprinkle your Facebook page with your target keywords. Keywords are always important in Facebook page SEO. Be strategic about where you place them: add your keyword to the About section, the description, the headline, and photo captions to gain an organic advantage over competitors. You can use regular SEO tools such as SEMrush and Ahrefs to generate keywords; they also make it easy to keep an eye on your competitors.

Leverage the power of backlinks. Backlinks remain the most powerful off-page SEO strategy: they build influence with both your target audience and the search engine. Authoritative backlinks signal to Google that your Facebook page is trustworthy, which helps in the long term and attracts attention in the short term. To get backlinks, do some research and connect with influencers, and keep a separate budget for link building: it pays off as you grow into a strong brand and generate sales and revenue online.

Importance of Facebook Likes

Likes for your business Facebook page are important for growth. There are two main ways they help, covered below.

Trust and reliability. Facebook page likes are an important factor that builds trust among ordinary users.

Revenue and sales. Revenue and sales can be directly affected by an increase in likes on your business page.

If you find it hard to follow the tactics given here for increasing your Facebook page likes, you can always hire Fbpostlikes.

Why choose Fbpostlikes to buy Facebook Page Likes?

Fbpostlikes is a well-known brand that uses quality SEO to deliver high-quality Facebook page likes to its clients. It has an experienced team that handles every type of client requirement, including geo-targeting and interest targeting. Fbpostlikes is also punctual with its process, which keeps clients happy. They have been among the best in the business for the past 10 years, so you can choose them without a second thought.