New Google Update: How to Make Your Content More Helpful


Don’t you love it when Google makes waves with its major updates?

Earlier this September, the search giant finished rolling out a new core update. The immediate effects were as usual: some websites didn’t even notice anything, while others reported fluctuations in traffic and/or rankings. Thankfully, the SEO community knew why it was happening because Google warned everyone in advance.

Maybe you’ve felt the effects, too? Even if you haven’t, you want to know what’s going on, trust us. Google updates should never be ignored, and this one is particularly big (although it’s still warming up). Let’s give you a quick rundown.

What is Google’s Helpful Content Update?

As is the case with literally any algorithm update, Google wants to give users better search results. Only the method differs.

This time, Google’s preference appears to be content. Sites with better content receive more love, and vice versa: low-quality content earns you lower rankings. What’s more, it affects entire websites rather than individual pages. That’s right: all of your site rankings will be re-evaluated based on your content’s quality. All content you have on your site.

Cheerful news, isn’t it? But Google never demands the impossible from webmasters. Even the worst-case scenario can be overcome with a solid plan.

Was I hit by the Helpful Content Update?

There’s an easy way to find out. Just check how your site rankings and traffic behaved before and after September 9th. This information can be found in Google Search Console…

However, we recommend connecting your GSC account to WebCEO for an extra bit of useful information.

You see, WebCEO keeps track of Google’s updates and marks them on the chart with Search Console’s data. This is the easiest way to see if the Helpful Content Update has done anything to your site.

Here’s what the Top Pages report will show you:

In addition to that, you can view the list of the latest Google updates for more context. (You can even add your own events to this list: for example, if you have done anything notable for your SEO which could affect your site’s performance in search engines and you’d like to track your progress in the report.)

Look for September 9th on the chart. It will be marked by a G icon. If you see any unusual fluctuations past this mark, the Helpful Content Update could be the reason.

And if that’s the case, why?

Who is targeted by the Helpful Content Update?

Google’s goal is the same as always: present the most useful and valuable content. This update merely raised the bar on what it considers valuable – or, rather, made it official.

When does content fall short?

  • It offers too little or no value. If a user isn’t satisfied after visiting a page, it clearly needs improvement.
  • It’s optimized for search engines rather than for users. On-page SEO is important, but don’t forget the human whose needs matter even more.
  • It’s made without any real expertise on the subject. If others can explain it better or make a more compelling offer, they will likely outrank you.
  • It’s unrelated or too loosely related to the site’s niche. Google doesn’t think that a jack of all trades is better than a master of one. When your content steps outside of your site’s specialty, Google may see it as a cheap attempt to grab more traffic from users who don’t need what you have to offer.
  • It’s automatically generated. Content that wasn’t written by humans rarely reaches an acceptable level of quality – unless that is the selling point (e.g. AI-generated Harry Potter chapters).
  • It’s just clickbait. Pretty self-explanatory. Nobody likes clickbait content.

If you have anything like that on your site, that’s what you must improve. With that in mind, what is the next step?

How to optimize your site for the Helpful Content Update

Let’s start with what you shouldn’t do: wait until you get hit by the update. You can take preemptive measures right now to soften the impact – if it comes. And if it does come, you will find it much easier to deal with.

Since the goal is to raise the quality of your site’s content, two obvious actions come to mind:

  1. Improve what you have;
  2. Delete the pages you may deem unnecessary.

But the catch is, this is a site-wide update. At the moment, there’s no telling just how much low-quality content is enough to tank your rankings. The most bothersome scenario may call for a full content revision on your site.

And if your page count is in triple digits or higher, then I can hear you groan from way over there. Do you see now why it’s better to start early?

All right, the task is to find low-quality content on your site. Where are the easiest places to look?

  • Pages with a low word count.

Google has stated they don’t have a preferred word count. However, more often than not, you will need more words rather than fewer to explore a topic well. The tricky part is pulling up the word counts for all of your pages so you can find the offenders.

If you made your site in WordPress, there are plugins for that, such as WP Word Count. Alternatively, Screaming Frog can do it.

  • Pages with low Google rankings.

These are way easier to find.

Open the Top Pages report in WebCEO and click on the Avg. position column to sort it in descending order. All your lowest ranking content will appear at the top of the table, with a delta displaying recent changes.

In other words: if a page has lost rankings after the Helpful Content Update, it might be just what you are looking for. If it isn’t, keep digging through the table and through the pages.

  • Pages with high Google rankings.

Surprise! Well, not really. Site rankings are a fluid thing, and even the best content doesn’t stay on SERP #1 forever. Your competitors are working hard to make sure of that.

Once again in the Top Pages report, sort the Avg. position column in ascending order. Visit your top ranking pages and see if their content could be improved in any way. Most likely it can. At least you know you won’t have to delete those pages.

Wrapping up

The Helpful Content Update has only just finished rolling out. We won’t see its full effects until some time passes – weeks, or maybe months. However, Google is pretty clear about what it wants from websites and their content, and it all boils down to two simple points:

  1. Deepen your expertise;
  2. Go all out when you write.

What can you learn today? What can you do on your site today? It’s all up to you!

Find your unhelpful content! Sign Up Free

[Updated] A Full Overview of Google Algorithm Updates



Google Search algorithm updates always have been, and likely always will be, something that strikes fear into the hearts of webmasters who care about their site rankings. Google plays hardball; webmasters play whack-a-mole.

Why is this so when these updates presumably aim to provide the best user experience and results to searchers and to reward websites for high quality content?

Everybody in the SEO field knows that Google weighs more than 200 ranking factors before letting a website reach the highest positions on the SERP. Before trying to improve all of them, you are better off first learning which factors affect your website positively or negatively, and then starting your work there.

It is difficult to account for even half of Google’s ranking factors when optimizing a website, and some webmasters don’t even try. Instead, they turn to black-hat SEO for quick results. Those quick wins are eventually lost and end in a Google penalty, which simply means trouble. Google is too smart, so don’t try to cheat.

What Is a Google Penalty and Why Is It Dangerous for Your Website?

A Google penalty is a negative impact on your rankings that follows a manual review or an algorithm update. It means you did something on your website that goes against Google’s webmaster guidelines, or your site was flagged in a spam report. If your rankings and traffic suddenly drop within a few days, especially right after a new algorithm update, it usually means a penalty was applied under the new rules.

You can check whether your website was punished in Google Search Console. However, Search Console only shows the results of manual reviews. A penalty caused by the latest algorithm update can be confirmed only through ranking and traffic analysis.

[Image: Google Search Console – Security & Manual Actions report]

There are two types of Google penalty: partial matches and site-wide matches. A partial match is applied when artificial or low-quality links point to specific pages of your website, costing those pages valuable organic traffic. A site-wide match means the backlink profile of your entire website needs significant and immediate auditing and cleaning.

A rank decrease may refer to all the pages of your website, a specific keyword, or a specific page.

What Stands Behind Google Algorithm Updates

A Google Algorithm Update covers a bunch of changes for Google’s ranking algorithm, including improvements in the existing algorithm or a set of new rules on how to analyze the quality of websites and then rank them in the SERPs.

Since Google launched in 1998, a lot of algorithm updates have happened. Of course, more often people pay attention only to the major ones, e.g. Panda, Penguin, Hummingbird, and Pigeon. However, there are actually more of them which have had a significant impact on the current SERPs view and which have been a total pain in the neck for many webmasters:

Google May Day Update

Release date: April 28 – May 3, 2010

This update aimed to reward websites with high-quality, long-tail content. Its peculiarity was that Google started paying more attention to the quality of the content itself: a page could reach higher positions for specific long-tail keywords even with a weak backlink profile. Its main goal was the relevance of the presented content.

The greatest losses hit websites selling goods with generic, boilerplate descriptions. Because they focused mostly on short-tail keywords, they saw significant traffic and ranking drops for long-tail queries.

Remedy: fill your website with enough long tail keywords. This may come in the form of articles, extended reviews, long descriptions, a characteristics overview, guidance, and so on.

WebCEO’s Keywords Research Tool will help you to work on keywords and create unique and winning combinations for your website.

The WebCEO Keyword Research Tool

Google Panda Update

Release date: February 23, 2011

This update aims at rewarding websites with high quality content and punishing low quality websites. Panda looks for everything you did to get a higher position on a SERP without real quality to back it up. This could include:

  • Low quality content, whether written by employees or machine-generated, will lead you nowhere. Google has said multiple times that quality is everything, and it is better to listen if you want to reach the top and keep your positions for a long time. Make sure the text reads fluently.
  • Unnatural language, in other words a keyword “overdose”. If you put too many keywords in your text, it will be noticed almost immediately, simply because neither people nor Google’s bots are used to reading content that keeps repeating the same words. Google finds this suspicious and doesn’t delay penalties.
  • Thin content or lack of content. This is when you have a really low amount of material on one of your website’s pages. Google likes it when you spend more time and present in-depth content to searchers. Writing a short paragraph with little sense, but an overdose of keywords, is not a good decision.
  • Content farming, i.e. a method of creating a significant amount of low quality content, for example a bunch of very short articles written for popular search queries, with the aim of getting greater traffic and revenue. Google doesn’t like it when content is created with the aim of ad monetization. User experience and quality should always be first.
  • Lack of authority/trustworthiness plays against your rankings. You can assess it by analyzing your website’s signals: how often your content is updated, domain age, type and authority, a poor or bad backlink profile, visitor behavior on your website, etc.
  • Inappropriate ads. If there are a lot of advertisements on your website and those are not relevant to your content or disturbing for a website visitor, you can eventually expect a penalty from Panda. The situation may become especially risky if the amount of advertisements overtakes the amount of content (ad-to-content ratio).

Remedy: content improvement.

1. Work on the content of your website. Rewrite your material to make it high quality, put keywords only where their presence is necessary, and use only keywords that are relevant to your niche and, specifically, to the page you are trying to improve. It is also not a bad decision to remove pages with low-quality content altogether.

2. Forget about your own advertisements for a second. Go to trustworthy and popular websites and study their advertising profile: how many ads they run, whether the ads are disturbing, and how relevant they are to the website’s niche. Then come back and consider the same points on your own website. Optimize things properly and create an ideal place for visitors.

3. No black-hat SEO. Honest and “clean”, well done content will bring you success and traffic, therefore heightening your authority. Searchers will come to your place and stay there for a long time. Google needs nothing more.

Google Exact Match Domain (EMD) Update

Release date: September 2012

The Google Exact Match Domain (EMD) Update targeted websites whose domain names exactly matched a searcher’s query and that had reached the top of the SERP largely on that basis alone.

[Images: Exact Match Domain (EMD) update – SERP examples 1 and 2]

Remedy: unfortunately, no advice will help here, because you either have such a domain name or you don’t. An exact-match domain is no longer any guarantee that you will rank for the keyword it contains.

Google Penguin Update

Release date: April 24, 2012

Penguin was released to punish websites that try to improve their positions by getting links from low-quality websites and by stuffing pages and link anchors with an enormous number of keywords. Such schemes are easy to recognize, even for a user. Google only needs to check the websites those links come from and then hand out a penalty.

Nobody likes low-quality websites: by default, a user will not find any good material there and can’t trust them, so it is better not to visit them at all. Accordingly, being linked to from such a website reflects poorly on you.

Remedy: forget about black-hat SEO.

1. Don’t try to build fast, easy, or paid spammy backlinks; they will more often bring you harm than success. Take your time and earn links from websites that have already gained some popularity and high domain authority. WebCEO’s My Backlinks Tool will help you study your backlink profile from A to Z, including link texts and linking domains, and it will show you toxic pages that link to your website.

The WebCEO My Backlinks Tool

Use white hat link building techniques like high quality guest blogging, link round-ups, the skyscraper technique, etc, which will be a win for both sides. By taking these steps, webmasters will gain a decent backlink profile which will be appreciated by Google, and build up domain authority for their website.

WebCEO’s Content Submission Tool will help you to find the best places where your content can be your best advertisement.

The WebCEO Content Submission Tool

2. Avoid keyword stuffing in any form and any place. Keywords are not flowers you can scatter anywhere and simply enjoy. Their main mission is to help readers find you, not to attract Google’s attention. Pick the best variants, build relevant long-tail and short-tail keywords that work for your niche, find some synonyms, and weave them into your text so that neither users nor Google are annoyed. Google recognizes synonyms and can reward you even more for using them.

Google Hummingbird Update

Release date: August 20, 2013

Because semantic search is so complicated, the Google Hummingbird update went further than any previous update. It tried to understand what a searcher was thinking at the moment they wanted to find something on Google. This approach broadened the range of information that can be presented. For instance, Google would not just show the definition of “pizza” on its local SERPs, but information connected to that word: recipes, history, the nearest pizza places, the most popular among them, “people also ask” recommendations, etc. Hummingbird tries to deliver the most accurate results for your query by analyzing what you probably need the information for.

Hummingbird works with the Knowledge Graph, which was first presented in 2012. There, users can often find relevant answers to a query without even needing to visit a website! With this update, local search was vastly improved. Since Hummingbird gives less primitive information and digs deeper into searcher intent, local businesses got a strong incentive to improve their online presence by improving title tags and keywords in descriptions and by keeping their websites updated.

Advice: structure your content properly.

1. Because Hummingbird uses the Knowledge Graph, it has become better to write your content in a way that answers the following questions: Who? What? Where? When? Why? How? By doing this you raise the chances of your website being chosen as the answer to a searcher’s query, and you may also be selected for a Featured Snippet (technically speaking, we are suggesting that you optimize your Open Graph code and your Schema code – webmasters will know what we mean). This helps bring in more traffic.
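As a rough, hypothetical illustration of the Open Graph side of that advice (the URLs and text below are placeholder values, not a guaranteed ranking recipe), such tags live in the page’s <head>:

  <head>
    <title>How to Choose Running Shoes</title>
    <meta name="description" content="A practical guide to choosing running shoes.">
    <!-- Open Graph tags: hypothetical example values -->
    <meta property="og:type" content="article">
    <meta property="og:title" content="How to Choose Running Shoes">
    <meta property="og:description" content="A practical guide to choosing running shoes.">
    <meta property="og:url" content="https://example.com/running-shoes-guide">
    <meta property="og:image" content="https://example.com/images/shoes.jpg">
  </head>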

[Image: Featured snippet and Knowledge Graph panel on the SERP]

WebCEO’s Rank Tracking Tool will show you your results in organic search and whether your website was shown in a Featured Snippet or Knowledge Panel (the box where Knowledge Graph data is presented).

[Image: Featured snippet data in the WebCEO Rank Tracking Tool]

2. Diversify your content. Long articles with comprehensive analysis are really great and Google likes them, but for a visitor’s convenience you can write paragraphs that are easy and quick to read. Moreover, these short articles can also be used by Google in a knowledge panel.

3. Your language should follow your niche. You can write in a simple, interesting, and engaging way, but don’t forget that you must create content related to a definite niche. Don’t make your article too easy; use up-to-date terms, statistics, diagrams, and so on. Remember that all those terms are your keywords, and Hummingbird can consider your information more relevant to somebody’s query than anything else.

4. Set up Schema markup. This markup helps determine whether your page is eligible to be featured in a rich snippet. The data presented in your Schema markup can help visitors judge what your site represents: ratings, the number of reviews, and well-written descriptions. If you own a local business, you can present more data such as your working hours, menu, and phone number (see the sketch below).
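A minimal sketch of what that markup can look like, using JSON-LD for a restaurant (a LocalBusiness subtype); every business detail below is invented for illustration, and you can check your own markup with Google’s Rich Results Test:

  <script type="application/ld+json">
  {
    "@context": "https://schema.org",
    "@type": "Restaurant",
    "name": "Example Pizza Place",
    "telephone": "+1-555-010-2030",
    "address": {
      "@type": "PostalAddress",
      "streetAddress": "12 Main Street",
      "addressLocality": "Springfield",
      "addressRegion": "IL",
      "postalCode": "62701"
    },
    "openingHours": "Mo-Su 11:00-22:00",
    "hasMenu": "https://example.com/menu",
    "url": "https://example.com"
  }
  </script>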

[Image: Google Hummingbird update – local search example]

Google Pigeon Update

Release date: July 24, 2014

Pigeon brought a lot of changes to local SEO after its release:

  • With Pigeon, the 7-pack changed into the 3-pack: since the update, searchers can now see the three best results for local businesses instead of seven. You can also see a map on the SERP above the 3-pack which shows the distance to those three places;
  • Pigeon shows you results not only depending on the closeness of venues to you, but also takes into account a website’s position for that keyword in organic search results. In simple words, Pigeon sees the nearest places to you, analyzes their organic positions on the SERP considering all SEO ranking factors, rates them, and then finally presents a list of the best variants for you in local search. This is a great feature because you receive not just the ordinary results of where you can go, but the best results;

Advice: become visible.

1. Make your business visible on all local business directories: Facebook, LinkedIn, Bing, Yelp, and many others. Local ranking factors have become more and more important: reviews, citations, links, social media engagement, and so on.

2. Use local search terms as your keywords. Include them in your page snippets (meta descriptions), in title tags, and in the descriptions of your business in local directories. Of course, don’t forget to mention them in your text as well.

3. Time to think about your website optimization. As Pigeon takes into account a website’s organic SERP results, you should take care of your website performance: high quality content, backlink profile, domain authority, mobile-friendliness, etc.

4. If you have a local brick-and-mortar business, get listed on major travel websites, e.g. TripAdvisor, to gain some popularity, backlinks, and good reviews.

Google Mobile-Friendly Update

Release date: April 21, 2015

Mobile devices are everywhere nowadays. Users have traded their desktops for smartphones and prefer to chill out with them 24/7. Google sees trends and follows them. To provide users with the best experience on mobile devices too, Google released its Mobile-Friendly Update, which affected only websites that aren’t optimized for smartphones. It has been a great motivator for webmasters to make their sites convenient on any type of device.

Going into detail: this update doesn’t lower your desktop rankings at all. It concerns only mobile devices, so only your mobile rankings suffer if you haven’t optimized your website yet. And it won’t necessarily affect the whole website: pages that are already mobile-friendly will not be “touched” by Google. Google has even provided a mobile-friendly test that website owners can use to check their pages.

Advice: make your website mobile-friendly. One option is AMP, a framework built for the fast and smooth loading of your pages on mobile devices.
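Whether or not you adopt AMP, a sensible baseline (assuming a responsive, non-AMP page) is a correctly configured viewport plus flexible media, for example:

  <meta name="viewport" content="width=device-width, initial-scale=1">
  <style>
    /* Keep images and videos from overflowing small screens */
    img, video { max-width: 100%; height: auto; }
  </style>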

WebCEO’s Landing Page SEO Tool will give you detailed information concerning your website’s mobile optimization, so you can discover any issues and see instructions on how to solve them.

With Google Mobile-First Indexing, which was enabled by default for new websites on July 1, 2019, a website’s mobile-friendliness matters even more. Google has begun to crawl and index websites primarily from the point of view of their mobile version. If you run a website that Google hasn’t seen yet and it is not mobile-friendly, be ready to experience problems with your indexing and rankings.

Google Possum Update

Release date: September 1, 2016

Possum was essentially an improved version of Pigeon in which developers addressed the latter’s shortcomings:

  • This update let companies situated beyond a city’s borders be shown in a local search when searchers specified that city. Earlier this was impossible, because Pigeon focused on places which were strictly on the territory of a chosen city. Even if a website had good positions in organic search results, it could not be seen in local results because of this.
  • Google improved its results sorting. Now it doesn’t show several results which belong to one address. For example, if there are two coffee houses in one place near you, Google will show you only one of them in order to avoid duplicate content. The second result will also be presented on a list, but pushed down.
  • Now you will see different results for keyword variations. Even a slight difference between them will give a list of new places, e.g.:
[Image: Google Possum update – local search results for keyword variations]
  • Possum is more sensitive to a searcher’s physical location than it was before. Now the 3-pack shows you not just the best results for you, but also which of them are the closest.
  • Local search has become more independent from the organic results. Even despite low rankings in organic search, some businesses do really well in local search results.
[Image: Google Possum update – local results decoupled from organic rankings]

Remedy:

1. As Google still considers organic results while giving searchers the best local matches, it is important to constantly keep track of the general website’s performance: backlink profile, domain authority, etc.

2. Because of the keyword variation issue, you should do comprehensive research on the keywords you rank for. You may need to replace some of them or add new variants.

Google Fred Update

Release date: March 7-8, 2017

The codename “Fred” is not official; Gary Illyes jokingly proposed it on Twitter as the name for any update that otherwise goes unnamed. The target of this update was to punish websites that use black-hat SEO and too many advertisements for aggressive monetization. “Fred” fights websites that:

  • contain an excessive amount of advertisements;
  • have thin, low-quality content;
  • publish text on assorted topics purely to gain rankings quickly;
  • offer little benefit to users, have a lot of page issues, and hurt the user experience;
  • are not mobile-friendly.

Remedy: re-evaluate your website’s quality.

1. Your website should belong to a specific niche and fulfill a user’s needs with relevant content, which is rich and well written, without keyword stuffing and without any signs of thin or duplicate content.

2. There should be no game playing with title tags, metadata, keywords, and schema code. Any attempts to use black-hat SEO must be stopped immediately. Google doesn’t like them and you should not either.

3. Be modest with the advertisements on your website. An excessive amount will always annoy users, and they will leave your site instantly, raising your bounce rate – which Google automatically doesn’t like. Remember that a lot of people use ad blockers nowadays, so the returns from those advertisements may be minimal anyway.

Google Medic Update

Release date: August 1, 2018

The “Medic” update presumably punishes websites which can negatively influence people’s well-being. This includes: health, financial security, safety of a user, and so on. To be specific, this update affects websites which:

  • request personal information, e.g. name, date of birth, personal identification number, Social Security number, bank account number, or driver’s license – in short, the kind of information that can be used for identity theft;
  • sell goods through insecure payment flows, e.g. online shops where credit card and bank account details are processed and could potentially be stolen;
  • offer advice or general information about medicine and health that, in Google’s estimation, may be harmful;
  • give advice on major life decisions, e.g. serious purchases like cars, houses, or stocks, and other financial choices.

Remedy: raise trust among users.

1. Work on your landing pages and content – make it of high quality and erase all features which Google doesn’t like: low quality, thin, duplicate content, keyword stuffing, and everything else that Panda hunts for. Take the freshest information from trustworthy and official sources, attaching statistics, tables, diagrams, etc. With this you will show users and Google that you haven’t pulled your data out of thin air.

2. E.A.T. concept – expertise, authoritativeness, trustworthiness. Write in detail on your About page who you are, why you can be useful, and why people should trust you – prove that you are a specialist who has, for example, the necessary background and education to be an expert in your chosen niche. If you recommend information that goes against the consensus of most scientists, give your users more proof that what you say may be right (Copernicus was right, after all). Trust rises with good reviews about your website, so ask your customers, visitors, and subscribers to leave feedback about it.

3. Create an author bio which presents you as a specialist. Use this option if your About page describes the services you provide on your website. In your bio you can write about yourself as an expert in a specific sphere and explain why people can trust you. Maybe you have a Bachelor’s, Master’s, or Doctoral degree, or have completed relevant courses or an internship. Google presumably also wants to trust you, so give it such an opportunity.
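If you also want to expose that author information to machines, one optional approach is schema.org author markup in JSON-LD; the name and credentials below are placeholders, and markup by itself is no substitute for demonstrated expertise:

  <script type="application/ld+json">
  {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "Managing Seasonal Allergies",
    "datePublished": "2018-08-01",
    "author": {
      "@type": "Person",
      "name": "Dr. Jane Roe",
      "jobTitle": "Board-certified allergist",
      "sameAs": "https://example.com/about/jane-roe"
    }
  }
  </script>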

Google never stops developing and updating. And each time, website owners face more new rules and limits which they should obey in order to stay visible on the SERPs. 2019 followed this trend as well. The June Google Core Update made a lot of changes: many YMYL websites were impacted and fell in the rankings. These are “Your Money or Your Life” sites, whose content and products affect your health and financial situation. Meanwhile, educational and informational resources gained more authority. Learn more about the June 2019 Google Core Update in order to protect your website from decreasing rankings and pick up the new rules.

Google September 2019 Core Update

Release date: September 24, 2019

As the June 2019 Core Update’s successor, the September Core Update focused on websites that might in any way damage people’s well-being. Websites containing information about health, money, travel and medical topics were the target of this update. Some publishing websites, like The Daily Mail, did better after this update, having fallen with the June one. Google says that any kind of information that might influence people’s lives should be harmless. It’s still too early to describe everything this update might have brought; however, some points already cry out for your attention.

Remedy: revise your content.

1. E.A.T. concept is still necessary to follow: show your visitors that you don’t get information out of thin air. Disclose your qualifications and prove your credibility so that neither Google nor users would hesitate to come to your website and later apply knowledge you provide them with.

2. Monitor your links, both incoming and outgoing: the sources you rely on while creating your content influence people’s trust in you as a professional and as a writer. Linking to unproven or suspicious websites will make people trust you less, because they won’t be sure they can rely on the data unless you back it up with recognized specialists’ opinions.

3. Freshness is an always-winning feature: update your content with fresh and relevant information. If you use any statistics in your texts, make sure the figures are recent and accurate.

Google BERT Update

Release date: October 21, 2019

The BERT algorithm (Bidirectional Encoder Representations from Transformers) is not a simple update to the existing algorithm. It is the introduction of a new system that helps Google understand users’ natural language and provide more accurate results for their queries. In a sense, BERT can be called Hummingbird’s successor, because both updates focus on understanding search intent.

BERT goes deeper into the analysis of a searcher’s query and catches its context. It considers whole word groups, the prepositions surrounding the key word of a query, and other language units. BERT analyzes the linguistics to understand what a person really wants and delivers the most accurate results. This update also influences featured snippets.

Google has provided examples of how the BERT algorithm works:

[Image: Google’s examples of how the BERT algorithm interprets queries]

Remedy: there are no particular instructions on how to write content or optimize your website for this update. BERT was created to understand people’s way of thinking while creating a query from a linguistic point of view. The advice is to write for people in a natural way: no machine-generated content.

Google November 2019 Local Search Update

Release date: November 2019

As Google announced on their official Twitter account, neural matching will be used in delivering local search results. Neural matching is used “to better understand how words are related to concepts”. In simple words, neural matching was implemented to help the search engine build connections between the information about a local business and a searcher’s query and deliver more accurate results regarding particular businesses someone might be looking for even without specific names in a query. This might also concern similar location and business names and how to distinguish between these.

There is no remedy for this type of update. This concerns only Google’s ability to better understand what people might be looking for.

Google Link Attributes Update

Release date: September 10, 2019

Google introduced additional link attributes. Besides rel=”nofollow” there will be two more attributes to use:

  • rel=”sponsored”: an attribute to a link to specify that it is a part of a sponsored promotion;
  • rel=”ugc”: an attribute to a link to specify that it is a part of user generated content: comments or forum posts.

There is no remedy for this update, just a helpful feature for webmasters. Starting from March 2020, these attributes will be considered by Googlebots when crawling and indexing to understand a website’s content better.
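In markup, the attributes look like this (the URLs are placeholders); Google also allows combining values, e.g. rel="ugc nofollow":

  <!-- A paid or affiliate link -->
  <a href="https://example.com/partner-offer" rel="sponsored">Partner offer</a>

  <!-- A link left in a comment or forum post -->
  <a href="https://example.org/some-site" rel="ugc">Commenter's site</a>

  <!-- The classic attribute still works, alone or combined -->
  <a href="https://example.net/untrusted" rel="nofollow">Untrusted source</a>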

Google Rich Results Update

Release date: September 16, 2019

Rich snippets for the “LocalBusiness” and “Organization” schema types (including their sub-types) have become the center of attention. Google decided to “eliminate” self-serving reviews for these categories – the reviews placed on a website’s own pages with the help of widgets. You don’t have to switch off such widgets if you are currently using them; it’s just that Google will no longer show these reviews on the SERPs. This update concerns organic search alone. If there are reviews about your website in other directories – ones not managed by you – such reviews will still be shown in the SERPs.

Remedy: there is no exact remedy for such updates. Our advice is to earn and wait for 5-star reviews of your business on other websites. The name property has also become important: when reviewing a product, it’s necessary to mention its name for the feedback to be useful to users.

Google Snippet Update

Release date: October 2019

In June, France adopted a copyright reform under which Google and other large platforms have to pay media outlets even for a tiny piece of content used on their platforms, for instance an article’s abstract shown on the SERP. Due to these changes in EU legislation, Google introduced new robots meta tags that give webmasters the option to choose how their snippets will look on the SERP. Currently, Google search results for some queries in France look like an ordinary list of websites without previews.

Remedy: this is not a problem to be solved or an update that can influence your rankings. It is simply a way for webmasters to control whether, and how much of, a content snippet from their site Google may show in the SERPs. It is up to you whether to use these meta tags or not.

New robots meta tags from Google:

  • “max-snippet:[number]” – the maximum number of characters in a text snippet;
  • “max-video-preview:[number]” – the maximum length of a video preview, in seconds;
  • “max-image-preview:[setting]” – the maximum size of an image preview (none, standard, or large).

You are also free to mix these.
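Assembled into an ordinary robots meta tag in the page’s <head>, it might look like this (the numbers are arbitrary examples; -1 means no limit and 0 disables the snippet):

  <!-- Allow up to 160 snippet characters, large image previews,
       and unlimited video preview length (example values) -->
  <meta name="robots" content="max-snippet:160, max-image-preview:large, max-video-preview:-1">

  <!-- Or opt this page out of text snippets entirely -->
  <meta name="robots" content="max-snippet:0">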

Google January 2020 Core Update

Release date: January 2020

Google continues to work on presenting the best content in organic search results. Authority and relevant content of high quality were again emphasized, especially for YMYL (Your Money Your Life) websites. Some of the websites of this type have experienced a rank drop, some have gotten to the top.

Featured Snippet deduplication is now a thing. A website that earned a featured snippet in the SERP will not be seen elsewhere on the first page anymore, at least not with the identical URL and anchor. This is because the featured snippet itself will be considered as the organic result.

Remedy: create high quality and relevant content that will answer people’s questions very well. Don’t dilute your text. Creating content that covers as many topics as possible is great. However, don’t try to shed light on everything in one shot. People don’t want to read three kilometers of text. They want accurate answers. Provide them with these, and they will look for other articles of yours.
To increase your authority, try to link to and get links from authoritative websites. A kind word and mention from a big and respected name will always matter.

Google May 2020 Core Update

Release date: May 2020

Relevancy is stressed again here. High quality has become a very broad metric: it may include great text, a cool and funny style, plenty of photos and videos, statistics, and so on. However, all of that has zero value for a user if it doesn’t answer their query. Also take into account that Google highlights text on a webpage that clearly answers a user’s question, to help them get what they wanted.

Core Web Vitals – Largest Contentful Paint, First Input Delay, Cumulative Layout Shift – all these are gaining more and more importance, and eventually may become official ranking factors. The User Experience is not a joke for Google. So it shouldn’t be for you. The way a user interacts with a website plays an important role in his or her further journey on it. Take some time to ensure the best possible experience in terms of content and website functionality. 

The E-A-T concept (Expertise, Authoritativeness, Trustworthiness) hasn’t faded into oblivion. Your authority and professional expertise are the key points on which people and Google decide whether to trust your words. A lot of websites in the YMYL niche (Your Money or Your Life) experienced a massive decrease in rankings, while smaller websites with equally good content went up.

Nofollow links, which Google previously ignored, have finally gotten a job. According to Google, from March 2020 such links are used as hints for crawling and indexing.

Remedy: Provide users with relevant, clear and informative content that answers their queries directly. Optimize the UX, especially on the mobile version of your website, so visitors can comfortably read and interact with its components. Prove your authority not only with diplomas, but also through links, reputable sources of information, and fresh content.

Google December 2020 Core Update

Release date: December 2020

The Google December 2020 Update was a loud event in the SEO world. This broad update showed no regard to specific countries, languages, or niches. The SERP’s fluctuations were a matter of concern during the rollout which started on December 3 and ended on December 16. What a Christmas present from Google!

No particular niche suffered the most: according to reports from SEO agencies, this update touched all industries. Many suggest the focus was E.A.T. and the quality of content. We shouldn’t forget that content has always been the most critical thing for Google and has been touched upon in a line of previous updates.

Let’s set our eyes on Ignite Visibility’s experience with this update. According to John Lincoln, after the May 2020 Core Update the company focused on getting their Core Web Vitals scores in order and on refreshing old content to keep its data and figures up to date and accurate. As a result, they saw an increase in rankings after the December update.

Remedy: Google itself offers a universal cure. At that moment, Core Web Vitals were rapidly gaining significance; now they are among the most important ranking factors. Check your Core Web Vitals scores and optimize them to comply with Google’s requirements.

Google Passage Ranking Update

Release date: February 2021

This type of update doesn’t ask the SEO world to optimize specific aspects; it changes how Google understands content. The Passage Ranking Update has opened a door to a better understanding of people’s needs. It is a global update across all industries.

The short outline of the update is that Google now analyzes content more deeply by evaluating each passage on a page instead of only the meaning of the page as a whole. This approach helps present people with the information they really need. As usual, the main change is seen in the rankings.

Now, if your content is not entirely relevant to a searcher’s query, but has a little part that directly answers a person’s question, this page will have a chance to get higher in the SERP for that question.

You have probably noticed the yellow highlighted text fragments Google jumps you to when you look for some specific information? That’s passage ranking at work.

Remedy: the only way to prepare for such changes or comply with them is to work on your content: use only up-to-date data, properly structure text fragments and always consider relevance.

Google Product Reviews Update

Release date: April 2021

This update was not as big as the core ones. Its main focus was review content, specifically its level of expertise. A short, generic review doesn’t carry much value. If your website mainly publishes product reviews, it’s high time to reconsider your writing strategy and start relying on the Google guidelines presented below.

The retail industry has experienced the biggest losses in the rankings because reviews are one of the most important aspects of their prosperity.

Remedy: pay more attention to the reviews you publish on your website. Google offers a list of questions to answer when you write a product review. Does your review:

  • Express expert knowledge about products where appropriate?
  • Show what the product is like physically, or how it is used, with unique content beyond what’s provided by the manufacturer?
  • Provide quantitative measurements about how a product measures up in various categories of performance?
  • Explain what sets a product apart from its competitors?
  • Cover comparable products to consider, or explain which products might be best for certain uses or circumstances?
  • Discuss the benefits and drawbacks of a particular product, based on research into it?
  • Describe how a product has evolved from previous models or releases to provide improvements, address issues, or otherwise help users in making a purchase decision?
  • Identify key decision-making factors for the product’s category and how the product performs in those areas? For example, a car review might determine that fuel economy, safety, and handling are key decision-making factors and rate performance in those areas.
  • Describe key choices in how a product has been designed and their effect on the users beyond what the manufacturer says?

Google June and July 2021 Core Updates

Release date: June 2 – June 12 & July 1 – July 12

This was a global update, and its peculiarity was the two-stage rollout: the June update was the first part of the series, and the second part followed in July.

It’s important to mention that the June update had a more noticeable impact than its successor.

According to Searchmetrics, YMYL websites (finance, law, health, etc.) were harmed more than others. Industries that touch people’s well-being have experienced a lot of fluctuations over the years.

SEMrush stated that the Food & Drink, Law & Government, and Internet & Telecom sectors won the most during this update, whereas YMYL niches (Jobs & Education, Business & Industrial) went through hardships.

However, we can’t say for sure, because many websites haven’t gone through a lot of changes in their rankings.

The June update was a long-awaited event because Core Web Vitals became an official Google ranking factor.

The July update was the second part of the June Core Update, and webmasters have said it was not as tough as the first.

According to SEMrush, the niches that experienced the biggest changes during this period are Real Estate, Shopping, Beauty & Fitness, Science, and Pets & Animals, whereas Finance, Arts & Entertainment, Games, Sports, Jobs & Education, Food & Drink, and News were affected less.

Remedy: 

  • brush up on your Core Web Vitals and optimize your website to properly comply with new requirements, 
  • if your website belongs to the YMYL category of websites, work on your content to prove that it’s reliable and furnish more evidence of your high level of expertise in the niche. 

Google Spam Updates

Release dates: June 23 & June 28, 2021 

Google has been fighting spam since the beginning of time, and these particular updates were dedicated to it as well. If your website’s rankings shifted during these two days, it’s a strong signal to audit your site inside and out and remove anything Google could consider spam.

The Spam update was also a global one.

Remedy: eliminate any signs of spam from your website.

Google Page Experience Update

Release dates: June & August 2021

This update concerns the experience people have on a website: the better the experience, the better your rankings.

It was scheduled to be fully rolled out by the end of August. The name of the update states its main goal clearly: to let websites that deliver a great user experience stand out from those that don’t meet the requirements.

The following signals are important for delivering a good page experience in Google Search:

[Image: Page experience signals in Google Search]

Changes apply to the News section of the SERPs as well: the AMP format is no longer required for a website to appear in the Top Stories carousel, as long as the other conditions are met. Also, Core Web Vitals scores and page experience status will not prevent news content from appearing in this section.

To help webmasters understand how things work, Google provides a Page Experience report in Search Console.

Remedy: optimize your website for the points mentioned in the table or at least check whether you have any problems with them.


TO SUM UP: year by year, Google tries to improve its users’ experience, releasing updates that have the potential to make each search for information easier and more satisfying (you may disagree if your site dropped in the rankings, but you can work with us to bring it back up). The updates we’ve mentioned in this article had a huge impact on the SERPs, and there have been many more, plenty of which were never announced to the public. Maintain your rankings with WebCEO’s Rank Tracking Tool and protect your website from future Google algorithm updates.


Core Web Vitals: Test Your Site for a New Google Update


The new Core Web Vitals update was finally released! Earlier this year, Google released a product review update that targets only (you guessed it) product review pages, but Core Web Vitals are going to affect every website. This new update is much wider in scope, and no website will be left untouched. So you’d best be ready.

You’ve most likely already heard about Core Web Vitals over the past few months – indeed, they have been a major topic since last year. If you have already prepared your site for them, great work! If you haven’t, or if you are worried you might still be missing something, then you have come to the right place. Let’s look into what exactly Core Web Vitals are, what makes them so global, and how to optimize your website for this new Google update.

What are the Core Web Vitals?

It’s simple: Core Web Vitals are signals which measure the quality of the user experience on a page. There are three of them:

  1. Cumulative Layout Shift. It’s responsible for a page’s visual stability.
  2. Largest Contentful Paint. It’s tied to a page’s loading speed.
  3. First Input Delay. It shows the quality of your pages’ interactivity, how responsive they are.

Let’s explore each of the three.

1. Cumulative Layout Shift (CLS)

First, let’s define CLS: it’s the sum of all layout shifts that occur while a web page is opened in your browser. And what is a layout shift? It means that a visible element changes its position on a page from one rendered frame to the next.

The browser calculates CLS by looking at the viewport size and the movement of unstable elements in the viewport between two rendered frames. The layout shift score is a product of two measures of that movement: the impact fraction and the distance fraction.

So, Layout shift score = impact fraction * distance fraction.
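A quick worked example: if an element occupying half of the viewport shifts down by a quarter of the viewport height, it affects roughly 75% of the viewport (impact fraction 0.75) and moves 25% of the viewport’s height (distance fraction 0.25), so that single shift scores 0.75 × 0.25 ≈ 0.19. If you want to watch shifts accumulate yourself, browsers expose them through the Layout Instability API; the snippet below is a minimal debugging sketch you could paste into a test page:

  <script>
  // Log each unexpected layout shift and the running total.
  let cls = 0;
  new PerformanceObserver((list) => {
    for (const entry of list.getEntries()) {
      if (!entry.hadRecentInput) {   // ignore shifts caused by user input
        cls += entry.value;
        console.log('Layout shift:', entry.value.toFixed(4), 'Running total:', cls.toFixed(4));
      }
    }
  }).observe({ type: 'layout-shift', buffered: true });
  </script>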

Ideally, CLS should be under 0.1. Of course, perfect readings on websites are hard to achieve, so you should still be fine in the orange zone, which is under 0.25. If your CLS is any higher than that number, it’s time to start worrying.

Simply put:

Your CLS < 0.1: perfect

0.1 < your CLS < 0.25: optimal, but could be improved

Your CLS > 0.25: needs fixing

There are a few ways to check your CLS and the other Core Web Vitals metrics. One is to open the developer tools in your browser. In Chrome, for example, press F12 or Ctrl+Shift+I and open the Performance tab. Reload the page, and you will see something like this:

If it took you a while to find CLS there, you are not the only one. It doesn’t look any more appealing in other browsers either. There’s an easier way to find this information: with SEO tools.

Open the Desktop Speed report in WebCEO and you will see this:

Much better and straight to the point.

If you check the list of found issues in the report, you may find the types of issues which lower the quality of your CLS:

  • Large layout shifts;
  • Render-blocking resources;
  • Text hidden during web font load;
  • Not preloaded key requests;
  • Improperly sized images.

(Note: underlined issues in the picture above are related to all Core Web Vitals, not just CLS.)

Click on the issues to expand them and see the details, then fix them and rescan the page to see the improvements.

2. Largest Contentful Paint (LCP)

LCP measures the loading speed of your site’s content. Specifically, not all of the content, but only the largest element visible in the first screen of a page. For example, if you have a large banner at the top of your page, that’s what LCP will use for its judgement (unless something even bigger is rendered there).

Here’s an example of what LCP looks like in action:

The largest element is a paragraph of text that is displayed before any of the images or logo finish loading. Since all the individual images are smaller than this paragraph, it remains the largest element throughout the load process.

An ideal LCP is under 2.5 seconds. It stops being acceptable after 4 seconds, which is a definite sign that something needs fixing.

Your LCP < 2.5s: perfect

2.5s < your LCP < 4s: optimal, but could be improved

Your LCP > 4s: needs fixing
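If you are curious which element the browser picked and when it rendered, the Largest Contentful Paint API exposes both; here is a minimal debugging sketch:

  <script>
  // Report the latest LCP candidate: its render time and the element chosen.
  new PerformanceObserver((list) => {
    const entries = list.getEntries();
    const lcp = entries[entries.length - 1];   // the most recent candidate
    console.log('LCP:', (lcp.startTime / 1000).toFixed(2) + 's', lcp.element);
  }).observe({ type: 'largest-contentful-paint', buffered: true });
  </script>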

Like CLS, LCP can be checked in your browser’s developer tools (e.g. the Performance tab in Chrome). But we’ve already demonstrated a better option, haven’t we?

Open the Desktop Speed report in WebCEO again. If you don’t like what it shows for your LCP, scroll down to see what causes it to be so poor. Actually, a good practice is to check the list of found issues anyway, just in case.

Here are some of the most common issues affecting LCP that you may find in the report:

  • High request counts and large transfer sizes;
  • High network round-trip time;
  • Critical request chains;
  • High initial server response time;
  • Images not served in next-gen format.

3. First Input Delay (FID)

FID measures the time between a user’s first interaction with a page (e.g. clicking a button) and the moment the browser begins responding to that interaction. The shorter this time, the sooner users get what they want, so you want FID to be low.

To be as specific as possible: FID is not the whole time between the input and the end result (like a form appearing on the screen). It ends when the browser begins to process the response.

An FID of 100 milliseconds and lower is perfect. Any higher than 300 ms puts you into the red zone.

Your FID < 100ms: perfect

100ms < your FID < 300ms: optimal, but could be improved

Your FID > 300ms: needs fixing
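Note that FID requires a real user interaction, so lab tools usually report a proxy metric instead; in the field, the Event Timing API reports it directly. A minimal sketch:

  <script>
  // Report First Input Delay: time from the first interaction to when
  // the browser could start running its event handlers.
  new PerformanceObserver((list) => {
    const firstInput = list.getEntries()[0];
    const fid = firstInput.processingStart - firstInput.startTime;
    console.log('FID:', fid.toFixed(1) + 'ms');
  }).observe({ type: 'first-input', buffered: true });
  </script>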

FID can be estimated from the main-thread activity in the Performance panel of Chrome’s DevTools, or checked in the same Desktop Speed report we’ve been using so far.

Some of the most commonly occurring FID-related issues are:

  • Inefficient cache policy;
  • Long main-thread tasks;
  • Unused JavaScript;
  • Unused CSS;
  • Excessive Document Object Model size.

Afterword

What do you think? Optimizing your website for Core Web Vitals is much easier than you imagined, isn’t it? All it takes is keeping an eye on specific kinds of site issues. You’ve done it before; now you simply need to pay them a little more attention.

The new update is going to roll out soon, and the SEO world has been prepared for it for a long time now. You have both the information and the tools at your disposal. Check your website now and see what you can do!

Are you ready for the new Google update? Test your website's Core Web Vitals! Sign Up Free

Google Update: New Tips to Improve User Experience

What is Google's Core Web Vitals update and how should you improve your user experience to prepare for it? It's easier than you think.

The post Google Update: New Tips to Improve User Experience appeared first on SEO tools & Online Marketing Tips Blog | WebCEO.

]]>

Are you keeping up with Google’s updates?

In early May 2020, Google announced that, by 2021, they would be implementing Web Vitals. It’s a new program that will guide website owners to focus on “quality signals” that will deliver the best user experience on the web.

Google has always prioritized user experience in its search engine algorithm – this is not groundbreaking news. What’s new is that, with the introduction of the Web Vitals initiative (including Core Web Vitals), Google is defining specific on-page elements that will inform and influence SEO.

What are Google’s new Core Web Vitals, and how will this affect your SEO? How can you prepare for this shift in Google’s search algorithm? Let’s answer these questions.

What are Google’s new Core Web Vitals and how do they affect me?

So what exactly are Google’s Core Web Vitals? In a nutshell, they are a group of signals that make up a great user experience on the web. They consist of three main things: page loading speed, page responsiveness (interactivity), and visual stability.

We asked Dan Fries from Blue Tree PR, an SEO and PR agency, what this means for site owners looking to rise up the ranks of Google Search, and he says:

“These Core Web Vitals are just essentially Google telling us that, if your site provides a great user experience, then it will rank better. And while it’s obviously hard to truly measure user experience – the true test would likely be conducting customer surveys or gathering feedback about user experience on your website – Google has just made it easier by identifying these actual, measurable metrics that could point to good or bad user experience.”

Here are the new Core Web Vitals and what they point to, according to Google:

  • Largest Contentful Paint (LCP)
  • First Input Delay (FID)
  • Cumulative Layout Shift (CLS)

Because Google will be implementing improvements in their search algorithm, we can expect that, as always, websites that perform well in their ranking factors will be rewarded with better site rankings overall.

These words might not mean anything to you right now, but they will soon be a, well, vital part of your SEO strategy. In the next section, we’ll break down what each Core Web Vital tries to measure, so you will know what to improve and focus on with your own site and SEO.

What do the Core Web Vitals measure?

Let’s take a look at each Core Web Vital signal and what that means for your site. For each of these, you can use Google’s built-in tools like PageSpeed Insights to get your site’s current scores.
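If you’d rather pull those scores programmatically, the same data is exposed through the PageSpeed Insights API. Here’s a minimal sketch (it assumes a runtime with fetch available; the response fields follow the PSI v5 format and are worth double-checking against Google’s API documentation):

  // Fetch lab and field (real-user) metrics for a URL from PageSpeed Insights.
  // An API key is optional for light, occasional use.
  const PSI_ENDPOINT = 'https://www.googleapis.com/pagespeedonline/v5/runPagespeed';

  async function getCoreWebVitals(pageUrl: string): Promise<void> {
    const response = await fetch(
      `${PSI_ENDPOINT}?url=${encodeURIComponent(pageUrl)}&strategy=mobile`
    );
    const report = await response.json();
    // loadingExperience holds field data when the page has enough real traffic
    console.log(report.loadingExperience?.metrics);
    // lighthouseResult holds the lab measurements for the same run
    console.log(report.lighthouseResult?.audits?.['largest-contentful-paint']?.displayValue);
  }

  getCoreWebVitals('https://example.com/'); // placeholder URL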

Largest Contentful Paint (LCP)

  • 2-second summary: LCP measures your page’s perceived loading performance.
  • Ideal measurement: 2.5 seconds or faster
  • What this means: Perceived loading performance indicates how fast your page can load the elements that matter most to the user.

Overall page loading time still does matter, but LCP measures something slightly different.

LCP focuses instead on the first thing that users will see on your page, showing them the most interesting parts of your page that will convince them to stay.

So, for example, if users click through to a news article from a Google search, the first elements they might want to see are the article title, or a featured photo, or the first paragraph of the article.

First Input Delay (FID)

  • 2-second summary: FID measures responsiveness of a page.
  • Ideal measurement: 100ms or less
  • What this means: FID measures the time between a user taking action on your page (e.g. clicking a link or a button) and the moment the browser actually starts to respond to that interaction.

Google wants to pay attention to this because a slow FID can indicate user frustration. After all, if it takes a long time for a site to take them to a new page, open a pop-up form, or jump to another section, then users will likely not be happy.

If your FID score is close to zero, then this means your page responsiveness is performing well, and users are able to interact with each element on your site without much delay.

Cumulative Layout Shift (CLS)

  • 2-second summary: CLS measures the visual stability of your page as users view and scroll through it.
  • Ideal measurement: 0.1 or lower (0 is perfect)
  • What this means: Imagine you’re reading a blog post and are about halfway through. All of a sudden, the page appears to reload, and then the paragraph you were on has disappeared, replaced by a button or content box.

You realize that you need to scroll down to get back to that paragraph you were previously reading, and this might cause some frustration.

This is exactly what CLS aims to measure: how often these layout shifts happen to your reader while on your page.

A CLS score of zero is the ultimate ideal because it means users aren’t experiencing many (if any) layout shifts in the middle of their experience on your site.
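For reference, here’s a simplified sketch of how a running CLS total can be watched in the browser with the Layout Instability API. It’s intentionally naive – Google’s production scoring has extra nuances about how shifts are grouped – but it shows what gets counted:

  // The layout-shift entry type isn't in the standard TypeScript DOM typings,
  // so we describe just the two fields we need.
  interface LayoutShiftEntry extends PerformanceEntry {
    value: number;
    hadRecentInput: boolean;
  }

  let clsScore = 0;
  new PerformanceObserver((entryList) => {
    for (const entry of entryList.getEntries() as LayoutShiftEntry[]) {
      // Shifts that happen right after user input are expected and don't count
      if (!entry.hadRecentInput) {
        clsScore += entry.value;
        console.log('Layout shift:', entry.value.toFixed(4), '| running CLS:', clsScore.toFixed(4));
      }
    }
  }).observe({ type: 'layout-shift', buffered: true });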

How can I optimize my website for SEO when Core Web Vitals is live?

While Google will be implementing these priority shifts reportedly by 2021, it would still be good practice to prepare as much as you can.

Here are some simple steps to help you be on your way to optimizing for Core Web Vitals.

1. Measure your loading speed

Since these metrics are largely centered around a website’s loading speed, you can view them using SEO tools which measure this speed. For instance, Core Web Vitals have recently been incorporated in WebCEO’s Page Speed tool.

Here’s what they look like (marked with a blue flag):

One other WebCEO report displays this data: Mobile Optimization. And here’s the fun thing: the two reports may show completely different data for the same page!

However, when you think about it, the reason for this discrepancy is obvious. The Mobile Optimization report measures the loading speed only for mobile devices (duh). Focus on your mobile SEO as well if you want good numbers!

Also, if your Field Data is blank, don’t panic. First, it displays data from the last 30 days; and second, Google obtains Core Web Vitals data by analyzing real user activity on pages. The more users visit a page, the more accurate the data in the report will be. The opposite is also true: low user activity results in a lack of information, and Field Data may be missing some metrics or even appear completely blank.

Of course, there’s still plenty of time before 2021. Google may yet lower the amount of data it needs to compose these reports.

2. Prioritize site fixes

If these metrics signal to you that there are problems, what are the solutions?

Good news: you won’t be stumbling in the dark. Instead of trying to find and fix every single issue on your site yourself, you can count on high-quality SEO tools to point you in the right direction. The two WebCEO reports mentioned above can do just that.

They won’t just detect problematic pages – they can even tell you which files are causing your site to perform poorly. Don’t be alarmed if you see a lot of issues on your first check. It’s completely normal, especially if you’re only now starting to pay attention to them.

3. Resolve site issues based on your data and insights

The final step is to address your site’s issues based on what you’ve seen in the Page Speed and Mobile Optimization reports, as well as their recommendations for fixing those issues.

The good news for CMS users is that when you fix a problem once, the fix usually applies across the entire site.

For example: if you’re on WordPress, you can customize your WordPress site to compress images, enable Lazy Load, use a CDN, and implement an AMP framework all from one dashboard.

If you’re not using a CMS, you’ll need to talk to your developer about implementing these changes and fixes.
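To give you a feel for how small some of these fixes can be, here’s a hypothetical sketch (not a WebCEO feature) of applying native lazy loading to below-the-fold images with a few lines of script – on WordPress, a plugin would normally do the equivalent for you:

  // Defer offscreen images with the native loading="lazy" attribute.
  // Browsers that don't support it simply ignore the attribute.
  document.querySelectorAll<HTMLImageElement>('img').forEach((img, index) => {
    // Leave the first few images alone so the LCP element isn't delayed
    if (index > 2 && !img.hasAttribute('loading')) {
      img.setAttribute('loading', 'lazy');
    }
  });

Ideally the attribute is added in your templates rather than at runtime; the snippet just shows how little markup the fix involves.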

Get ready for Core Web Vitals!

Google’s updates can send some people into panic. However, the changes to its algorithm are always meant to make search a better experience for everyone. The new Core Web Vitals are the same: they will help you deliver a better user experience on your website, as well as keep your visitors happier and more engaged.

And this update actually lets you see the difference with your own eyes, which is uncommon for Google. You can check the new metrics with SEO tools any time and use them to improve your site’s performance. Loading speed is one of the major factors in UX. Tend to it now!

Sign up to check Google's Core Web Vitals on your site!

The post Google Update: New Tips to Improve User Experience appeared first on SEO tools & Online Marketing Tips Blog | WebCEO.

]]>
https://www.webceo.com/blog/improve-user-experience-google-update-2021/feed/ 5
The Google Search Algorithm Matrix: The Timeline of Crucial SEO Updates https://www.webceo.com/blog/the-google-search-algorithm-matrix-the-timeline-of-crucial-seo-updates/ https://www.webceo.com/blog/the-google-search-algorithm-matrix-the-timeline-of-crucial-seo-updates/#respond Wed, 12 Apr 2017 19:50:55 +0000 https://www.webceo.com/blog/?p=4111

With all the Google search algorithm changes, you will never know when and where your site will appear in Google search results. According to Google’s John Mueller, the search engine giant rolls out hundreds of  SEO updates to its core...

The post The Google Search Algorithm Matrix: The Timeline of Crucial SEO Updates appeared first on SEO tools & Online Marketing Tips Blog | WebCEO.

]]>

With all the Google search algorithm changes, you never know when and where your site will appear in Google search results. According to Google’s John Mueller, the search engine giant rolls out hundreds of SEO updates to its core algorithm throughout the year. Nothing scares us more than the unknown. There was a time when we knew the menace by sight: we were given a heads-up on upcoming updates and knew how to recover from them. Now Google search is frequently invaded by a legion of search algorithm updates – like Agent Smiths who keep order within the Google search ecosystem by eliminating SEO threats assumed to come from humans’ never-ending desire to use black- and grey-hat SEO techniques. Welcome to the Google Search Matrix.

What is a Google Search Algorithm Update?

Google Search Algorithm Updates were originally designed as filters to track down and penalize websites that flooded search results with low-quality content or link spam, or that provided a poor user experience. Over time, Google changed the way it rolls out updates. As a result, some algorithm changes now run in real time, while others are integrated into the core ranking algorithm with no new refreshes or public announcements from Google.

So many SEO updates have been released by Google over the last 15 years that you might easily get confused.

Google Updates Related to Content Quality

November 2003 – FLORIDA UPDATE. This was one of the most noticeable updates that blew out the high rankings of some e-commerce sites and product affiliates like a hurricane. A lot of websites lost their top rankings and even disappeared from the search index itself. Among the key aspects of the Florida Update were:

  • An SEO filter that was triggered when over-usage of search terms (keyword stuffing) was detected;
  • Stemming, which helped rank websites based on word-form matches to the search terms.

February 2004 – BRANDY UPDATE. This update resulted in the expansion of Google’s index, a better understanding of context and the consideration by Google of synonyms of search query terms. There was also an increased focus on anchor text relevance and close attention was paid to your backlink neighborhood quality.

May 2010 – MAYDAY UPDATE. As a precursor to the Panda Update, it penalized a lot of websites for using thin content.

February 2011 – The beginning of the PANDA era. The first Panda Update targeted websites which provided low-quality, thin and duplicate content.  Panda 1.0 affected 12% of search queries. Since then, this most important content-quality update has survived 26 tweaks and refreshes. The last update made Panda part of the core algorithm.

August 2013 – HUMMINGBIRD UPDATE. This content-quality SEO update was aimed at understanding a searcher’s intent and the context behind a search query. Made up of more than 200 SEO factors which affect both ranking and search, Hummingbird was designed to deal with conversational and voice search queries which are mostly popular on mobile devices. The update gave a boost to semantic SEO.

Google Updates related to Links Quality

April 2003 – CASSANDRA UPDATE. Cassandra was the first Google Update designed specifically to filter link spam. These were hard times for websites that built links from co-owned domains and hid links and text in order to manipulate their rankings.

May 2003 – DOMINIC UPDATE. The SEO community assumes that this update changed the way Google counted backlinks.

January 2005 – NOFOLLOW UPDATE. These were rainy days for black-hat SEOs and webmasters as Google, in cooperation with Microsoft and Yahoo, introduced the nofollow attribute in order to fight spam.

September-October 2005 – JAGGER UPDATES. Google rolled out this series of SEO updates to filter low-quality link schemes, like paid links, reciprocal links and links built with the help of link farms. The updates took 3 months to fully roll out.

April 2012 – The beginning of the PENGUIN era. Google initiated a war against link spam. Penguin 1.0 affected 3.1% of English search results. Over-optimization, spam links and keyword stuffing were the reasons for website penalization. The next major and heavyweight Penguin Update (2.0) was rolled out in May 2013 and affected 2.3% of search queries. In September 2016, Google released the real-time Penguin 4.0 Update and baked it into Google’s core algorithm. The key difference of Penguin 4.0 was that it re-indexed and re-evaluated sites in real time, demoting only specific pages with spam signals on them.

Google Updates related to Mobile Search Quality

April 2015 – MOBILE-FRIENDLY UPDATE. Dubbed Mobilegeddon, the Mobile-Friendly update was not actually a penalty update; it simply rewarded mobile-friendly websites with better rankings. Optimizing for mobile search has become paramount, and tools like the Website Audit can ensure your site is fully optimized for mobile users, addressing the critical factors that influence mobile search ranking.

Google Updates related to Local Search Quality

October 2005 – LOCAL MAPS UPDATE. Google merged two products, the Local Business Center and Google Maps, into one interface in order to simplify local search.

July 2014 – PIGEON UPDATE. Local SEO changed dramatically after the launch of the Google Pigeon Update. By improving its location and distance ranking parameters, this had a huge impact on both Google Maps and traditional Google search results. Among the most significant changes was the boost in rankings for local directories, such as Yelp, TripAdvisor etc.

September 2016 – GOOGLE POSSUM UPDATE. The new local search update was designed to filter out local search spam, specifically in Google Maps results.

Google Updates related to User Search Experience Quality

July 2003 – FRITZ UPDATE. With this update Google switched from monthly index refreshes to indexing on a daily basis. This was a great improvement for the average search experience because Google started to return fresher and more accurate results.

June 2005 – PERSONALIZED SEARCH UPDATE. Another enhancement in the user search experience was the usage of search history in order to deliver more personalized and relevant search results.

August 2008 – GOOGLE SUGGEST UPDATE.  Search experiences were improved to a greater extent thanks to Google Suggest which eventually evolved into Google Instant.

August 2010 – CAFFEINE UPDATE. This update was designed for users and with the help of users. Everyone was invited to test the new infrastructure of the Google search indexing system and leave feedback. The key features of the Caffeine update were an expanded index, faster crawling and real-time indexation. As a result, Caffeine provided 50% fresher results than with earlier updates.

December 2010 – NEGATIVE REVIEWS UPDATE. The update changed the way websites are ranked based on the experience reported by users, so that businesses could no longer benefit in rankings from negative reviews and bad publicity.

June 2011 – GOOGLE SCHEMA UPDATE. This SEO update aimed at enriching search results by allowing webmasters to mark up their structured data with the help of Schema.org.

November 2011 – FRESHNESS UPDATE. The update affected 35% of search queries, mainly time-sensitive ones. As a result, Google started to pay closer attention to fresh, regularly updated content, which tends to signal a good user experience. Searchers could now filter results for content from the past hour, day, week, month or year.

May 2012 – KNOWLEDGE GRAPH UPDATE. This update provided direct answers to common questions about specific people and places and other subjects. In the long run, the Knowledge Graph feature evolved into Knowledge Graph panels.

In the next post we will give some advice on how to recover from Google Penalties, provided to us by a former Google Employee.

The post The Google Search Algorithm Matrix: The Timeline of Crucial SEO Updates appeared first on SEO tools & Online Marketing Tips Blog | WebCEO.

]]>
https://www.webceo.com/blog/the-google-search-algorithm-matrix-the-timeline-of-crucial-seo-updates/feed/ 0
Semantic SEO Strategy: How to Do SEO in 2017 https://www.webceo.com/blog/semantic-seo-strategy-how-to-do-seo-in-2017/ https://www.webceo.com/blog/semantic-seo-strategy-how-to-do-seo-in-2017/#comments Wed, 07 Dec 2016 16:53:07 +0000 https://www.webceo.com/blog/?p=3511

The following SEO Guide is not a Bible, but rather a list of recommendations and SEO trends listed in our actionable Semantic SEO Strategy guide. You can download this for FREE and use it alongside the 14+ free SEO tools...

The post Semantic SEO Strategy: How to Do SEO in 2017 appeared first on SEO tools & Online Marketing Tips Blog | WebCEO.

]]>

The following SEO Guide is not a Bible, but rather a set of recommendations and SEO trends collected in our actionable Semantic SEO Strategy guide. You can download it for FREE and use it alongside the 14+ free SEO tools which you will need for your website optimization and promotion in 2017.

What you’ll learn from this step-by-step handbook:

In this free 19-page handbook we’ve collected up-to-date practical advice and how-to’s, backed by 14+ free SEO tools hand-picked by our editors, which will help you understand and implement the latest Google search optimization trends:

Chapter I. How contextual (semantic) search will change the way you manage your Content SEO Audits, and what tools will help you create user- and SEO-friendly website copy.

RankBrain forever changed the way we optimize website content for search engines. In 2015, RankBrain was used to interpret only 15-20% of search queries; now Google uses it as part of its core algorithm. Moreover, it applies this AI system to re-rank search results. RankBrain doesn’t punish websites, but rather elevates or demotes them by relevance. How can you optimize for RankBrain? Ideally, you would need to learn how to read your target audience’s minds. In real life, you will need to optimize your website for LSI keywords and contextual clues instead of standalone keywords.

In this chapter you’ll learn how to:

  • Improve your website user experience by fixing on-page technical and content SEO issues.
  • Contextualize Your Keyword Research and Optimization
  • Improve visitors’ engagement and your content search visibility with a proper Internal Links structure
  • Improve your website SERP clickability by optimizing your structured data with relevant and descriptive copy.

Chapter II. Step Up Your Mobile-First Search Index Game

Given the fact that Google, in 2017, is planning to launch a separate mobile search index which will eventually become the primary one, you should give mobile SEO your best shot…

Chapter III. Help Tools to Improve & Measure Your SEO Performance

When you plan something for tomorrow, you should draw on your previous experience, taking into account what worked best and what didn’t bring any benefit. You want to measure; we have the tools…

Sign up for a 14-day Free Trial and get exclusive access to the full guide and 14 professional SEO tools to help you apply our recommendations to your SEO strategy.

guide-2017-blog-CTA


Words of Cheer


You never know what you can do till you try. You have a website, and if you work hard on it, you will be rewarded by Google. Our SEO strategy is a little leg-up to help you get that reward in 2017.

The post Semantic SEO Strategy: How to Do SEO in 2017 appeared first on SEO tools & Online Marketing Tips Blog | WebCEO.

]]>
https://www.webceo.com/blog/semantic-seo-strategy-how-to-do-seo-in-2017/feed/ 2
RankBrain – the Wild Card of the Google Search Ranking System https://www.webceo.com/blog/rankbrain-the-wild-card-of-the-google-search-ranking-system/ https://www.webceo.com/blog/rankbrain-the-wild-card-of-the-google-search-ranking-system/#comments Thu, 27 Oct 2016 07:15:54 +0000 https://www.webceo.com/blog/?p=3240

RankBrain represents one of the most intriguing changes in the world of SEO in the last year. Initially introduced in October 2015, it still remains a mystery for most experts. However, due to numerous studies performed since then, we are...

The post RankBrain – the Wild Card of the Google Search Ranking System appeared first on SEO tools & Online Marketing Tips Blog | WebCEO.

]]>

rankbrain-joker-wildcard

RankBrain represents one of the most intriguing changes in the world of SEO in the last year. Initially introduced in October 2015, it still remains a mystery to most experts. However, thanks to the numerous studies performed since then, we are able to understand some basic features of this machine learning system.

Even though the name may imply otherwise, its priority is to provide more relevant results to searches that are completely new. After initial trials, people from Google realized that this system is actually great when dealing with ambiguous and poorly defined queries as well as natural language.

History of RankBrain

Bear in mind that this is not Google’s first attempt to implement machine learning in its systems. Previously, we could see similar technology in Google AdWords. Whenever RankBrain needs to assess a page, it focuses on the relevancy of the page itself, similarly to the Google AdWords Quality Score. Based on this score, it places different pages in different positions. Note that the Google AdWords Quality Score doesn’t use external signals such as links, and at this point we are not certain whether RankBrain relies on links when judging content. The system also has some similarities with Word2Vec.

Word2Vec is based on two model architectures: skip-gram and continuous bag-of-words. These two models allow the system to establish relationships between a main word and all of its neighboring words. Word2Vec is also able to embed words into vectors, which allows their meanings to be better understood and compared.

RankBrain works similarly. It is pretty good at establishing semantic relationships and reading user intent. Based on this, we can surmise that RankBrain is at least partially based on Word2Vec technology.

Why was RankBrain introduced in the first place?

There are about 3 billion searches every day. Out of these 3 billion, 15% are completely unique, never seen before – that amounts to 450 million queries. RankBrain was invented to process these unique queries and give a proper answer to them. The system does this by making an educated guess: based on all previously accumulated information, it is able to make semantic connections and establish a user’s intent. But in some cases, even with all its advanced technology backing it up, RankBrain can make a mistake and present a user with results they are not looking for. If that is the case, it will provide a new set of results, hopefully satisfying the individual. This is why it is called machine learning: RankBrain is able to constantly refine its results and improve its suggestions.

Many people think that RankBrain is an artificial intelligence system. This is not the case: it is a machine learning system able to improve itself without any human interference. Through advancement, RankBrain could one day reach the state of “artificial intelligence,” but there are still too many limitations for this to happen. Google product managers seem to have realized that RankBrain is excellent at dealing with ambiguous and long-tail keywords. Unlike before, when Google would focus on one word within a phrase, RankBrain is able to understand the meaning behind the words and give proper suggestions.

Relationship with Hummingbird

There are many misconceptions when it comes to Hummingbird and RankBrain. The latter is not a standalone algorithm, nor does it replace Hummingbird. In fact, the two are separate entities that are meant to work in conjunction. Experts refer to RankBrain as a modification of Hummingbird. Most likely, when processing a query, RankBrain finishes its part of the job and then Hummingbird additionally refines it.

Similarly to Word2Vec and Google AdWords, it is possible that Google took the best out of Hummingbird and implemented it within RankBrain. At this point, we can only speculate. The fact is that Google is very protective of its technology and all we can do at this point is notice similarities and differences between various systems.

RankBrain as the Wild Card of the Google Ranking System

Google has announced that RankBrain became the third most important ranking signal within the short period of its existence. But there are many questions regarding it. Unlike the other ranking signals (more than 200 of them), RankBrain most likely doesn’t represent a direct signal. Instead, it affects the way Google perceives queries and, through that, search as a whole. RankBrain is not static: it is constantly improving and refining queries according to its own perception. Additionally, unlike other ranking signals, RankBrain is completely autonomous, able to work without any human interference.

But keep in mind that this machine learning system still has limited use. It was primarily created to help users with unique queries – and to be fair, it is doing a great job. This means RankBrain will not interfere if there are already valid suggestions available. Perhaps this is why it only sits at position three in terms of ranking signal importance. We have learned that RankBrain can also help out with other long-tail queries, ambiguous keywords and slang. The system will not interfere when there is sufficient data for a query (at least as far as we understand it).

Nevertheless, this raises some questions. If RankBrain has managed to gain so much importance in so little time, it is quite possible that it will continue increasing in relevance eventually taking over common queries. This could change the whole SEO world.

How Does RankBrain Work?

As we mentioned previously, RankBrain has to embed words into vectors so it can use them properly. After that, all these vectors are put into the same virtual space. This includes all semantically related words that help the system establish relationships. Here is a good example:

rankbrain-how-it-works

In order to weigh the words properly, RankBrain has to understand the correlations between them. Some semantically close keywords will have more impact on the query, while others will have less. The importance of a keyword is established based on its distance from the main word: related keywords that are far away from the main keyword carry less weight for the query, while those close to it carry more. Based on this, RankBrain gives priority to different content.
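To make “distance” a little more concrete, here’s a toy sketch (our own illustration, not Google’s code) of how the relatedness of two word vectors is commonly scored with cosine similarity – the closer the score is to 1, the more related the words are assumed to be:

  // Toy example: cosine similarity between word vectors.
  // Real embeddings have hundreds of dimensions; these 3-D vectors are made up.
  function cosineSimilarity(a: number[], b: number[]): number {
    let dot = 0, normA = 0, normB = 0;
    for (let i = 0; i < a.length; i++) {
      dot += a[i] * b[i];
      normA += a[i] * a[i];
      normB += b[i] * b[i];
    }
    return dot / (Math.sqrt(normA) * Math.sqrt(normB));
  }

  const predator = [0.9, 0.1, 0.3];      // hypothetical vector for "predator"
  const foodChain = [0.8, 0.2, 0.4];     // hypothetical vector for "food chain"
  const shoppingMall = [0.1, 0.9, 0.2];  // hypothetical vector for "shopping mall"

  console.log(cosineSimilarity(predator, foodChain));    // ≈ 0.98 – closely related
  console.log(cosineSimilarity(predator, shoppingMall)); // ≈ 0.27 – weakly related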

But that is not all. This is only part of the equation – the part we know about. There is probably much more to it, given that RankBrain is able to provide answers to poorly defined questions and other queries that previously presented a problem.

How do you optimize for RankBrain?

At this point in time, there is little point in optimizing specifically for it. Simply put, these queries have too small a volume for us to bother with them. However, if one day RankBrain takes over a larger chunk of queries, we should start considering it. Either way, your main SEO strategy should remain the same: you have to have great content that attracts clicks and makes visitors read it. If one day RankBrain becomes the dominant element of Google search, we will have to forget about link building and concentrate more on on-page rather than off-page optimization.

RankBrain works through trial and error. Even though the system performs its own analysis before presenting results to users, that doesn’t mean users will be satisfied with its suggestions. In order for your copy to be successful and properly optimized for RankBrain, you will have to focus on things such as click-through rate, time spent on page and bounce rate.

Based on our assumptions, these statistics are the best measure of visitor satisfaction with a page and, at the same time, they send a strong signal to RankBrain regarding relevance. So, if you wish to optimize a page for RankBrain, these are our suggestions:

  • Create compelling titles and meta descriptions
  • Write longer copy
  • Include authoritative resources and studies within the text
  • Be direct and focus on the user’s benefit
  • Make your website responsive

WebCEO’s SEO Content Assistant can help optimize your content to match RankBrain’s preferences, guiding you in crafting relevant content that resonates with your audience and RankBrain’s understanding of relevance.

Bear in mind that your article still has to be relevant to the query. If it isn’t, RankBrain will not suggest it to a user in the first place. But if it is, try to make the best of it, because RankBrain will notice how the visitor reacts.

In the future, a visitor will be the one that determines whether a website lives or dies. In that regard, be sure to provide maximum value and the freshest, most relevant information.

Examples of RankBrain

Here is one of the more popular examples of RankBrain in action:

rankbrain-examples

We used the query “What’s the title of the consumer at the highest level of a food chain”. This can be seen as quite an ambiguous query. You might think that Google would answer it by presenting websites about production, shopping malls, food chains, humans as consumers, prices etc. However, this query is clearly connected to language commonly used in biology textbooks, and specifically the phrase “highest level” would refer to “predators.” By using its database, RankBrain is able to make a good guess and return results connected to this particular topic.

As it turns out, RankBrain is also good at giving results based on our own browser history and previous searches. Let us use an example with Barack Obama.

barak-obama-rankbrain

To get Barack Obama’s age, we can type “How old is President Obama?” Google easily recognizes what we are looking for and gives us data on Barack Obama. However, if we then type “How old is his wife?”, Google may understand that we are looking for data about Michelle Obama, based on our search history.

Conclusion

At this point, there are simply too many unknowns. We still cannot properly assess RankBrain or its true potential. It has shown great results so far but that doesn’t mean that Google will give it a greater role than it currently has. One thing is for sure. As always, Google is trying to improve the user experience and create new technologies that will help users get the most relevant results to their queries. In terms of SEO, things may or may not change in the future. Ultimately, it doesn’t matter. Even if they do change, Google will still be based on a certain technology, technology that can be exploited and optimized for. With this in mind, if you are working within SEO, there is no reason to be concerned.

For tracking how your site’s rankings are affected by RankBrain and other factors, use WebCEO’s Rank Tracking tool. Check your website positions on desktop, smartphone and tablet: the results may differ dramatically!

Additionally, to ensure your SEO strategies are comprehensive and up-to-date, explore WebCEO’s Online SEO tools to cover all your bases from on-page optimization to backlink analysis.

The post RankBrain – the Wild Card of the Google Search Ranking System appeared first on SEO tools & Online Marketing Tips Blog | WebCEO.

]]>
https://www.webceo.com/blog/rankbrain-the-wild-card-of-the-google-search-ranking-system/feed/ 3
Google Penguin 4.0 vs. Google Possum https://www.webceo.com/blog/google-penguin-4-0-vs-google-possum/ https://www.webceo.com/blog/google-penguin-4-0-vs-google-possum/#comments Mon, 10 Oct 2016 12:25:26 +0000 https://www.webceo.com/blog/?p=3368

Intro:  “Since the dawn of time, a secret war has been waged between two species (Penguins and Possums)… Throughout the millennia, they have kept their battle confined to the shadows; however, one brazen act has escalated this conflict to a...

The post Google Penguin 4.0 vs. Google Possum appeared first on SEO tools & Online Marketing Tips Blog | WebCEO.

]]>

Intro:  “Since the dawn of time, a secret war has been waged between two species (Penguins and Possums)… Throughout the millennia, they have kept their battle confined to the shadows; however, one brazen act has escalated this conflict to a fever pitch. As the (SEO) world careens toward Armageddon, every human on the planet will soon be forced to CHOOSE A SIDE!” (The excerpt and the image are taken from PvP, by Sebastian Kadlecik).

It seems that Google decided to create another beast and start a fight between the newly born Google Possum and our old spam fighter buddy, Penguin 4.0, for the title of the most significant Google Update of the year. If you noticed a spike or a fluctuation in your ranking history trend starting at the beginning of September, you may have been involved in the war of two species for supremacy in the Google search universe.

Tip: Keep a wary eye on the overall trend of your website rankings after every Google update rollout by using the WebCEO Rank Tracking Tool (Historical Data Report)

(click on the image to see a full version) 

google-updates-ranking-trend

Let’s take a closer look at the nature, surroundings and key targets of these beasts and the ways to keep safe from their major updates:

File on Google Possum:

google-possum-update

Nature: Google’s Possum update is a new local search algorithm update designed to filter out local search spam from the Google Map search results.

Date of Attack: September 1, 2016

Surrounding:  3-pack and Google Maps search results.

Key Target: Even though possums are not aggressive animals and are afraid of human beings, they can do considerable damage to your website results when provoked. In practice, this means that business listings in the 3-pack and Local Finder search results which are irrelevant to a searcher’s query (or which duplicate other listings) can be filtered out of the local search results.

If your website meets the following criteria, your listings may get into the filter funnel:

  • Businesses that share a physical address with a similar business. Even if your business is not associated with the other business at all, Google may still consider this as a duplicate listing and filter it out.
  • Businesses located close to a city center are now losing some ground to listings for businesses located further out, which were once penalized for not being in the center.
  • Businesses which have not optimized for localized terms and long tails may no longer rank high in 3-pack and Google Maps search, depending on the wording of the search and physical location of the searcher.

How to avoid the bite of the Google Possum:

The Google Possum iteration doesn’t penalize websites; it just filters them out. If you checked your local rankings and found that, having earlier ranked at the top of the 3-pack and Google Maps, you are now in a place where even a dead body can’t be found… then the sneaky Possum has found your business.

Of all the tactics mentioned above (changing locations can be challenging), the most practical thing you can do is revise your keyword list, do some detailed localized keyword research and optimize your website pages for local terms. Follow this step-by-step Quick Start Guide to run a quick SEO audit of your site and see how it stacks up. If even this doesn’t work, place a clove of garlic or scatter mothballs around any area where you have noticed possum activity :)

Tip: Use the WebCEO Local Rank Tracker to monitor how your local target audience sees your business listings in the Google 3-pack or Maps search.

WebCEO lets you add country-specific search engines and their mobile versions, plus a specific locality (state, city, ZIP code), in order to get location-based search results emulated for your clients’ location.

webceo-rank-tracking-tool-settings

Scan your ranking report in order to see if your business shows up in the 3-pack or Google Maps for any of your target keywords.

google-map-ranking-results

File on Google Penguin 4.0

google-penguin-4th

Nature: Google Penguin 4.0 is the 4th attempt to purge Google search of spam and low-quality links. This is the last iteration of its kind, as there will be no more Penguin updates in the future. Google has integrated the web spam algorithm into its core search engine algorithm in order to make it run on a real-time basis.

Date of Attack: September 23, 2016

Surrounding:  Google universal SERPs.

Key Target: Web pages with the smell of spam signals. Even though Google has rolled out a new iteration, the technology remains the same. Google still punishes websites which do not follow Google’s search quality guidelines.

Make sure you don’t violate them yourself:

  • Your website pages should be FREE of links coming from low-quality and irrelevant third-party websites.
  • Your website link profile should NOT be built with the help of paid and reciprocal link building (excessive sponsored articles and guest posts, link exchanges etc.).
  • Your anchor texts should NOT be over-optimized with generic terms, non-descriptive and irrelevant keywords.
  • Your website pages should NOT contain unnatural, low-quality or hidden links embedded in widgets.
  • Your website should be FREE of sneaky redirects (showing different or irrelevant content to users with the help of doorway pages or cloaking).

How to avoid the bite of the Google Penguin:

The 4th iteration of Penguin is not as scary as its predecessors. In fact, SEO community forums report only minimal ranking fluctuations caused by Penguin 4.0. The reason is its granular effect: it devalues individual web pages that spread spam rather than whole websites. And thanks to its real-time nature, you won’t have to wait long for your devalued page rankings to recover after you disavow toxic links. Google refreshes the algorithm’s data continuously:

With this change, Penguin’s data is refreshed in real time, so changes will be visible much faster, typically taking effect shortly after we recrawl and reindex a page. It also means we’re not going to comment on future refreshes.” – Google Webmaster Central Blog.


Must Read: How to Keep Your Backlink Profile Ever Clean>>


 Summary

For now we can see that the Google Possum update will have a greater impact on local search rankings compared to the effect of Google Penguin 4.0.

Have you been hit by one of these updates? If you withstood the attack, feel free to share your insights in the comment section.

The post Google Penguin 4.0 vs. Google Possum appeared first on SEO tools & Online Marketing Tips Blog | WebCEO.

]]>
https://www.webceo.com/blog/google-penguin-4-0-vs-google-possum/feed/ 6
Amp It Up: Is Your Site In Need for Google AMP Pages? https://www.webceo.com/blog/amp-it-up-is-your-site-in-need-for-google-amp-pages/ https://www.webceo.com/blog/amp-it-up-is-your-site-in-need-for-google-amp-pages/#comments Thu, 05 May 2016 12:53:29 +0000 https://www.webceo.com/blog/?p=3118

“Anything less than instant represents a decline in engagement”  Richard Gingras, Head of News at Google Today it’s crucial for your brand to go mobile. If your website is mobile, chances are you will be found in the right place...

The post Amp It Up: Is Your Site In Need for Google AMP Pages? appeared first on SEO tools & Online Marketing Tips Blog | WebCEO.

]]>

“Anything less than instant represents a decline in engagement”

 Richard Gingras, Head of News at Google

Today it’s crucial for your brand to go mobile. If your website is mobile-friendly, chances are you will be found in the right place at the right time. In this mobile optimization race, time and speed are the key advantages. Delivering content in the most accessible and user-friendly way is the number one priority for content publishers who don’t want to lose impatient mobile searchers. Industry giants like Facebook and Apple have already jumped on the bandwagon by providing mobile-friendly content delivery platforms such as Facebook Instant Articles and Apple News.

Google has catered to the trend by providing an alternative to Facebook Instant Articles and Apple News: the AMP project. “Accelerated Mobile Pages” is an open-source framework initiative designed to strip out burdensome dynamic JS elements of web pages in order to speed up their loading time in mobile search. Unlike Instant Articles and Apple News, which are technically closed RSS feed aggregators available only to the platforms’ in-app users, Google AMP Pages is an open web framework designed for creating instant mobile pages. Put simply, Google AMP pages are alternative versions of desktop pages with separate URLs. The project is now live and works for newsworthy posts which are displayed in the Google News carousel in the prime real estate of mobile SERPs. You can get a quick look at what AMP pages look like on your mobile device by browsing to g.co/amp, or emulate mobile device mode from your desktop using Chrome Developer Tools.

How Do I AMPlify My Website Content?

The Google AMP project’s goal is to simplify and minimize the requests your browser has to make to load website pages. When you create an AMP page, it contains only a specific AMP HTML file which restricts author-written scripts and slow third-party JavaScript libraries used in ads, subscription forms and call-to-action buttons.

If we break down the boilerplate, we’ll see that an AMP page is wrapped up into 3 layer components:

  • AMP HTML is the markup language based on an HTML framework that includes custom tags and a subset of restrictions for reliable and fast performance.
  • AMP JavaScript library helps with the fast and reliable rendering of HTML page components and tags
  • AMP Content Delivery Network (Cache) fetches the AMP pages, caches them in the cloud and improves their performance automatically via the Google hosted Content Delivery Network (or your own cache).

Here are some learning resources, setup guides, CMS plugins, check-and-fix tools which make AMP optimization a piece of cake.

How AMP Works:

The primer guide How AMP Works highlights the key points of what AMP is, why it is needed and how it works.

The AMP Project Official Blog will keep you updated about the latest news and updates of the Google AMP project.

Setup Guides:

The guide Get Started with Google AMP  will help you get through all the stages, from creating your first AMP HTML page and its validation to its publication and distribution

The Github Repository will provide hands-on information on AMP HTML source code, samples and documentation for webmasters.

Automatic CMS Plugins:

WordPress AMP plugin provided by Automattic enables AMP content on your blog, which makes it a powerful tool for WordPress SEO.

Drupal AMP Module provided by Acquia will help you convert your Drupal pages into pages that comply with the AMP standard.

Check-and-Fix Tools and Recommendations:

AMP Test is Google’s AMP tool which tests the validity of the AMP markup and structured data on the page related to AMP in real time.

The AMP Validator is an AMP tester and editor that comes bundled with the AMP JS library and is available on every AMPed page.

Google Search Console AMP Error Report will point to the most common issues in your site’s AMP page implementation.
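If you prefer to validate AMP markup from a build or deployment script, the AMP project also publishes a validator on npm (amphtml-validator). Here’s a minimal sketch based on its documented usage – double-check the current package docs before wiring it into your pipeline:

  // Validate an AMP HTML string with the amphtml-validator npm package.
  import amphtmlValidator = require('amphtml-validator');

  amphtmlValidator.getInstance().then((validator) => {
    // Placeholder markup; in practice, pass your rendered AMP page here
    const result = validator.validateString('<!doctype html><html amp>…</html>');
    console.log('AMP validation:', result.status); // 'PASS' or 'FAIL'
    for (const error of result.errors) {
      console.log(`line ${error.line}, col ${error.col}: ${error.message}`);
    }
  });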

What Are the Effects of Content Optimization for AMP Pages?

Google assures us that, after wrapping your content into AMP pages, they will load four times faster and render with eight times less encoded data than average mobile pages; according to Google, AMP reduced page load time by 85%. Not bad! But taking into account that AMP is a pilot project and there are still a lot of unexplored facets, consider both the positive and negative effects of the AMP project:

Advantages:

+ Lower bounce rate and higher Time on Site and Pageview rate.

+ Content is displayed in an above-the-fold area of mobile SERPs.

+ Increased click-through rate to the above-the-fold mobile AMP carousel results.

+ Improved user experience.

+ Competitors who do not use AMP will suffer a decrease in impressions and clicks as their entries in the SERPs are pushed below the fold.

Disadvantages:

– The AMP version of a regular page can cause duplicate content issues. The recommended fix is to link the two versions to each other: the regular HTML page should reference its AMP version with <link rel=”amphtml” href=”_AMP_URL_”>, while the AMP page should point back to the regular page with <link rel=”canonical” href=”_REGULAR_URL_”>.

– Content monetization is hard to execute in AMP content. Only 5 ad platforms are allowed on Google AMP pages: Google AdSense, DoubleClick, OpenX, AOL Advertising and Outbrain.

What About the Other, Non-Newsworthy Pages of Your Website?

Before the AMP era becomes the new mobile search reality and AMP is designated an official ranking factor, traditional mobile search optimization techniques will still be in place. Google doesn’t punish pages for not being AMPtimized, but it will still punish you for being mobile-unfriendly. The AMP project is aimed at article content, which means the regular landing pages of your site are on the sidelines for now.

If you haven’t heard yet, Google announced another “Mobile May” Update that will boost the effects of the mobile-friendly algorithm that was rolled out last year on April 21st.

mobilegeddon-may-2016

This means that if your pages are still not mobile-friendly to users and to Google, your competitors are probably outranking you, and this won’t change until you get with the program and make your pages mobile-friendly. Now answer a gut-check question: “Is your website mobile-friendly?” Check it now by following the recommendations generated in the WebCEO Mobile Optimization tool report, which will guide you through your website’s mobile SEO issues. The Mobile Issues Checklist will give you detailed info on what Google considers non-mobile-friendly design and content, accompanied by Google-proof recommendations.

mobile-optimization-report

Just a Matter of Time…

At the moment the Google AMP project works only with breaking news publishers like BuzzFeed, the Washington Post, The Verge, etc. Google has persuaded us, however, that the project is aimed at:

“…all published content, from news stories to videos and from blogs to photographs and GIFs, which should work using Accelerated Mobile Pages.”

Let’s see how fast the AMP project expands to more types of content providers and websites. We will keep you updated once the Google AMP project announces more opportunities for regular publishers on its dedicated AMP blog.

The post Amp It Up: Is Your Site In Need for Google AMP Pages? appeared first on SEO tools & Online Marketing Tips Blog | WebCEO.

]]>
https://www.webceo.com/blog/amp-it-up-is-your-site-in-need-for-google-amp-pages/feed/ 1
Happy SEO-News-Giving Day! https://www.webceo.com/blog/happy-seo-news-giving-day/ https://www.webceo.com/blog/happy-seo-news-giving-day/#comments Fri, 27 Nov 2015 10:03:16 +0000 https://www.webceo.com/blog/?p=2802

Happy Thanksgiving to all those of you, who regularly check your sites for issues and fix them, keep your backlinks clear and make customers happy. Here’s a brief recap of the latest SEO news so you can get back to...

The post Happy SEO-News-Giving Day! appeared first on SEO tools & Online Marketing Tips Blog | WebCEO.

]]>

Happy Thanksgiving to all of you who regularly check your sites for issues and fix them, keep your backlinks clean and make customers happy. Here’s a brief recap of the latest SEO news so you can get back to enjoying family and friends.

Google News

Google has confirmed that the new Penguin 4.0 Update will go live by the end of 2015. Luckily for many webmasters, the upcoming Penguin iteration will process, in real time, all the toxic and low-quality links that you removed or disavowed after a previous manual penalty. Remember how it took months to recover from manual penalties under the previous Penguin updates? If you have not yet refined your backlink profile for the upcoming Google Penguin 4.0, be sure to check your backlinks for domains with a low trust level (0.30 or less) and then submit them in a list to the Google Disavow tool through the WebCEO Backlink Quality Check tool.

Google has officially rolled out its new artificial intelligence algorithm RankBrain. This new machine learning system is a part of the Google Hummingbird algorithm and the third-most important Google search ranking signal that helps to interpret and better understand non-exact match search queries on the level of an individual searcher’s intent. This means that SEO will see a shift from keywords to topic-focused research and optimization.

Google+ recently updated its design with a heavy focus on Communities of Interest and Collections. Now the “ugly duckling” of the social media family strongly resembles Reddit and Pinterest. With the new design, folks noticed that Google+ local pages no longer showed business reviews. Google then officially confirmed that business reviews will no longer be displayed on Google+ local pages, but they will remain accessible when searching via Google Search and Maps.

Yet again G+ received an enthusiastic response from users, but maybe not the response they had hoped for:

Google+ redesign

twitter google+ redesign

Yandex News

Yandex has announced that it will start to crawl CSS and JavaScript content as a test, in order to better understand the content of web pages. It is now asking webmasters to stop blocking their CSS and JavaScript files and resources. In the long run, this will have an effect on Yandex SEO processes. Yandex has also added a mobile-friendly label to mobile search results and released a supporting mobile-friendliness diagnostic tool so webmasters can make sure their website pages meet mobile-friendly guidelines.

Facebook news

Facebook has launched lead ads! Yes, you heard right. Facebook now allows running lead generation ads (similar to Twitter lead generation cards) which collect user information, share lead capture content and let prospects complete a desired action without actually leaving Facebook. Read more on how to configure your lead generation program on Facebook here>>.

Local businesses can now promote their products on Facebook more effectively with local awareness ads. The new ad feature allows you to target groups of people who live near your business. Furthermore, Facebook has made the Instagram Ad Platform available to all marketers around the world. Now you can launch a photo or video ad campaign right from your Facebook ad management tools (Power Editor).

Twitter news

Twitter has undergone quite a few changes recently. One of the major ones was the change of its Favorite icon from a star to a heart.

Not long after this redesign trick, Twitter officially shut down the API that provided data on tweet share counts. Now you can’t see the number of Twitter shares of your articles in your social share button bar. This may have a negative impact on user interaction and social proof factors for your content.

For those of you who want to drive more user engagement and learn what followers think on this or that topic, you can now create polls on Twitter!

The post Happy SEO-News-Giving Day! appeared first on SEO tools & Online Marketing Tips Blog | WebCEO.

]]>
https://www.webceo.com/blog/happy-seo-news-giving-day/feed/ 1