Thursday, 30 October 2014

Mobile Usability Reports Come To Google Webmaster Tools

Google announced a new feature in Google Webmaster Tools this morning for tracking your mobile usability issues. The new report is named the Mobile Usability report and it is available within Google Webmaster Tools.

The tool will show you common usability issues with your mobile site so that you can fix them and improve the mobile experience for your users.

The specific errors the report shows are: Flash content (which triggers warnings), a missing viewport meta tag on mobile pages, tiny fonts that are hard to read on mobile, fixed-width viewports, content not sized to the viewport, and clickable links or buttons placed too close together.
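Several of these errors (the missing viewport meta tag, fixed-width viewports, and content not sized to the viewport) can typically be addressed with a couple of lines in the page head and stylesheet. The snippet below is a common, generic fix, not something the report itself prescribes:

```html
<!-- Responsive viewport declaration: addresses the "missing viewport
     meta-tag" and "fixed-width viewport" errors the report flags. -->
<meta name="viewport" content="width=device-width, initial-scale=1">

<!-- Keep media from overflowing the viewport ("content not sized to
     viewport"). -->
<style>
  img, video { max-width: 100%; height: auto; }
</style>
```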

John Mueller, a Webmaster Trends Analyst at Google based in Zurich, said they “strongly recommend you take a look at these issues in Webmaster Tools.” Why? He doesn’t mention any ranking signals, but based on Google’s recent comments, it does seem mobile UX will become a ranking signal in Google’s algorithm in the near future. This new report is just one more sign that this ranking signal is indeed coming.

Here is a screen shot of a sample report:

Either way, mobile traffic is only becoming a larger part of your overall traffic, so implementing these suggestions may make a lot of sense regardless of any ranking potential.

Wednesday, 29 October 2014

Actions in the Inbox, powered by schemas

Search engines have been using structured data for years to understand the information on web pages and provide richer search results. Today, we are introducing schemas in emails to make messages more interactive and allow developers to deliver a slice of their apps to users’ inboxes.

Schemas in emails can be used to represent various types of entities and actions. Email clients that understand schemas, such as Gmail, can render entities and actions defined in the messages with a consistent user interface. In the case of Gmail, this means that the emails can display quick action buttons that let users take actions directly from their inboxes, as in the following screenshot:

Using schemas to add quick action buttons to the emails you send is easy. All it takes is adding some markup to your HTML emails, alongside your regular content, in one of two supported formats: Microdata or JSON-LD.

As an example, the following JSON-LD markup can be used to define a movie and the corresponding one-click action to add the movie to your queue:

<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "Movie",
  "name": "The Internship",
  ... information about the movie ...
  "action": {
    "@type": "ConfirmAction",
    "name": "Add to queue",
    "actionHandler": {
      "@type": "HttpActionHandler",
      "url": "",
      "method": "POST"
    }
  }
}
</script>
Gmail renders the markup above with a button labelled “Add to queue” next to the email subject line. When the user clicks the button, Gmail sends a POST request to the URL specified in the action handler. Your app has to handle these requests and respond to the email client with an appropriate HTTP response code (200 for successful requests, 400 for invalid requests, and so on).
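The server side of that exchange can be sketched as follows. This is a minimal illustration using only the Python standard library; the `movieId` field and the validation rules are assumptions for the example, not part of Gmail's specification:

```python
# Minimal sketch of an action-handler endpoint. The required
# "movieId" field is a hypothetical example, not a Gmail requirement.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

def handle_action(body: bytes) -> int:
    """Return the HTTP status code for an incoming action request."""
    try:
        payload = json.loads(body)
    except ValueError:
        return 400                      # malformed request body
    if "movieId" not in payload:        # hypothetical required field
        return 400
    # ... add the movie to the user's queue here ...
    return 200                          # success: the action completed

class ActionHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        status = handle_action(self.rfile.read(length))
        self.send_response(status)
        self.end_headers()

# To run: HTTPServer(("", 8080), ActionHandler).serve_forever()
```

Keeping the validation in a plain function like `handle_action` makes the 200/400 decision easy to test independently of the HTTP plumbing.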

Schemas in emails currently support four different types of actions - rate/review, RSVP, one-click action and go to link - and we plan to add more types moving forward. We are collaborating with a number of partners who will launch their integrations in the coming weeks, making the messages they send more useful and interactive for Gmail users. For example, Esna is using this to inform users of missed calls and provide them with a one-click button to be called again, while Seamless is implementing the rate/review action to collect feedback about restaurants.

To learn more about all supported entities and actions and to find out how to get started with schemas in email, see Google’s developer documentation.

Google AdWords Renaming Column Names To Match Google Analytics

Google announced on Google+ that by November 10, 2014, they will be renaming a few columns within the Google AdWords reporting interface to be consistent with the Google Analytics naming convention.

Google said, "three of the Google Analytics reporting columns in AdWords will be re-named to match the corresponding column names in Google Analytics."

What is changing?

  • Pages / visit to Pages / session
  • Avg. visit duration (seconds) to Avg. session duration (seconds)
  • % new visits to % new sessions

If any of your internal reporting tools reference these old names, make sure to update them before your reports break.
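If your reporting pipeline refers to these columns by name, a small translation table is one way to bridge the change. A minimal sketch, assuming you can intercept the column names your tool reads:

```python
# Hypothetical helper that maps the old AdWords column names to the
# new Analytics-style names taking effect November 10th.
RENAMES = {
    "Pages / visit": "Pages / session",
    "Avg. visit duration (seconds)": "Avg. session duration (seconds)",
    "% new visits": "% new sessions",
}

def rename_columns(columns):
    """Translate old column names; leave unknown names untouched."""
    return [RENAMES.get(name, name) for name in columns]
```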

The changes go into effect November 10th.

Forum discussion at Google+.

Tuesday, 21 October 2014

Penguin 3.0: The Definitive Guide To Diagnosis And Recovery

It’s been over a year since Google last rolled out a new Penguin update, and now it looks like they’re ready to make up for lost time. Google recently confirmed that the most recent iteration of their large-scale “Penguin” algorithm update started rolling out late on Friday night (October 17th). One of Google’s Webmaster Trends Analysts had hinted that the latest Penguin update would be coming soon, and as of October 17, we have full confirmation that the update is live and currently unfolding.

Penguin is an algorithm, originally launched in April 2012, that identifies what Google considers “webspam” (occurring both on and off a website) and penalizes websites found guilty of spammy, manipulative tactics by drastically reducing their visibility in its search results.

As with the initial Penguin rollout and its 2.0 follow-up, there’s a chance any website could be affected. If you’ve noticed a major change in rankings and organic search traffic, you need to be proactive in diagnosing and correcting the issues that caused the Penguin penalty. In this article, I’ll explain how Penguin 3.0 works, how it connects to previous Penguin iterations, how to tell whether you’ve been affected, and of course, what you can do to recover if you’ve been hit.

The Penguin Update – to Date

The first official Penguin update rolled out on April 24, 2012 as a complementary partner to the Panda algorithm, which was designed to reward sites with a better user experience while penalizing sites with a poor one. The Panda algorithm just saw a new refresh in the form of Panda 4.1. The Penguin algorithm covers the biggest ranking factor: external links. Penguin rewards sites that have natural, valuable, authoritative, relevant links, and penalizes sites that have built manipulative links solely for the purpose of increasing rankings, or links that do not appear natural.

The original update, later dubbed “1.0,” impacted about 3.1 percent of all search queries. That may not seem like a lot, but the impact it had on the world of search optimization was stunning.

Google followed up a month later with Penguin 1.1, and again in October of 2012 with Penguin 1.2. Over the course of 2012, Google unleashed a series of “refreshes,” which updated data but did not make any major changes to the search algorithm.

The next big hit was on May 22, 2013, when Penguin 2.0 further refined the rules laid out by Penguin 1.0. Rather than making tweaks or refreshing data, 2.0 made further fundamental changes to the algorithm, impacting about 2.3 percent of all search queries. There were a handful of refreshes in 2013, but all was quiet on the Penguin front for over a year, until Penguin 3.0.

For most of 2014, there was speculation and anticipation of a new Penguin update, especially in the spring, when the pattern predicted a 3.0 revision to the algorithm. Yet throughout 2014 we didn’t hear a single official word about Penguin, not even a refresh. Now, it looks like 3.0 has hit and is sending some webmasters scrambling.

What Makes 3.0 Different

In many ways, Penguin 3.0 is similar to its predecessors. Its intention is to cut down on spam and improve search results by eliminating or penalizing links that don’t appear to be naturally built. Like with the transition from 1.0 to 2.0, the algorithm has grown more sophisticated. There’s no word yet on what percentage of search queries has been affected, but it’s reasonable to think it will reach a similar level as 2.0. Google has confirmed that this is a major algorithm change, not just a slight update or a data refresh, and webmasters should be prepared for a little turbulence (if they haven’t already seen some).

According to Pete Meyers, over at Moz, there haven’t been any major shakeups in SERPs—at least as of October 19. Since this data comes from a limited number of site observations, it’s possible that this update affects a small number of sites tremendously, rather than affecting a large number of sites on a smaller scale. If this is the case, only the worst offenders or greatest practitioners will notice any volatility—and that volatility will be massive in scale.

Looking at one of the search volatility indexes, it seems the initial rollout on the 17th was just a warm-up, but the impact on the 18th and 19th was significant (and still rising). Like most Google algorithm updates, it has taken some time to show signs of impact. According to Google’s John Mueller, as of Monday, October 20th, the rollout is complete, though there may still be ranking changes coming for some websites.


The full details on the scope and specifics of what, exactly, Penguin 3.0 targets are still being investigated. Google is not in the habit of revealing its strategies or algorithm specifics, mostly to keep people from exploiting weaknesses, so it’s unlikely that we’ll receive any information from the source.

Should You Be Concerned?

While each round of the Penguin update is significant, the most impactful updates still only hit around 3 to 4 percent of all search queries. That means unless you’re a serious offender (and if you are, you’ve probably already been hit by 1.0 and 2.0), you probably won’t be hit.

Nevertheless, take a look at your rankings, as well as your organic search traffic, and note how they’ve changed over the past several days. Since the rollout is generally considered to be complete, you should have a good idea of whether or not your website was affected. If you notice a sharp drop in rankings on Saturday or Sunday, odds are you were hit by Penguin 3.0. If you haven’t noticed any changes, or if you’ve improved in rankings, you may have recovered from a previous Penguin penalty that was holding you down, or you benefited from competitors that were previously ranking ahead of you getting hit by Penguin and falling in the rankings. At this point, it’s unlikely that you’re going to be hit by an “echo” of the rollout if you weren’t already hit. There is, however, a possibility that Google will follow up with tweaks to Penguin 3.0 (possibly in the form of Penguin 3.1, etc.) in the coming weeks or months.
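The before-and-after comparison described above can be sketched in a few lines. The 20 percent threshold below is an illustrative assumption, not a figure from Google; any analytics export that gives you daily organic sessions will do:

```python
# Rough sketch: compare average daily organic sessions before and
# after the October 17th rollout. The -20% threshold is an assumption.
def traffic_change(before, after):
    """Percent change in mean daily sessions (negative = drop)."""
    avg_before = sum(before) / len(before)
    avg_after = sum(after) / len(after)
    return 100.0 * (avg_after - avg_before) / avg_before

def likely_hit(before, after, threshold=-20.0):
    """Flag a drop sharper than the (assumed) threshold percent."""
    return traffic_change(before, after) <= threshold
```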

Penguin 3.0 Recovery Steps

If you do notice that your rankings have dropped, or if your organic traffic numbers are inexplicably low over the past few days, you need to take action to recover from the update. Do note that this is a long-term process; there is no “quick fix” for a sharp ranking drop after an algorithm update, but with time and effort, you can turn your situation around.

Step One: Identify Bad Links & On-site webspam

If you got hit with a penalty, it’s probably the result of too many “bad” inbound links pointing to your website. Bad links include:

  • Links on article directories, link farms, and other gimmicky aggregators
  • Links you paid for directly (other than advertising)
  • Links posted in irrelevant forums or conversations
  • Links in directories that aren’t specific to your industry
  • Links embedded in fluffy content, or those with spammy (i.e., exact-keyword-match) anchor text

You can look for these links in your link profile by using Google Webmaster Tools, Moz’s Open Site Explorer, Majestic SEO, or Ahrefs.
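One of the heuristics from the list above, exact-keyword-match anchor text, is easy to automate once you have a link-profile export from any of those tools. The field names below are assumptions about what your export contains:

```python
# Sketch of one audit heuristic: flag inbound links whose anchor text
# exactly matches a money keyword. "anchor"/"url" field names are
# assumptions about your link-profile export format.
def flag_exact_match_anchors(links, target_keywords):
    """Return links whose anchor text is an exact keyword match."""
    targets = {kw.lower() for kw in target_keywords}
    return [link for link in links
            if link["anchor"].strip().lower() in targets]
```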

If you’re not the DIY-type, there are professional services available to audit your inbound links and identify harmful ones.

While spammy, manipulative inbound links are the overwhelming majority of the reasons websites get hit by Penguin, on-site webspam can also cause a Penguin penalty. Keyword stuffing, link cloaking, and hidden text can all trigger Penguin penalties, so if you haven’t engaged in any manipulative link building, review your website for these possible issues.
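For the on-site review mentioned above, a crude keyword-density check can surface pages worth a closer look. The 5 percent threshold is an illustrative assumption, not a number Google has published:

```python
# Crude single-word keyword-density check for spotting possible
# keyword stuffing. The 5% threshold is an assumption, not a Google
# figure; treat flagged pages as candidates for manual review.
def keyword_density(text, keyword):
    """Fraction of words in `text` that equal `keyword`."""
    words = text.lower().split()
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)

def looks_stuffed(text, keyword, threshold=0.05):
    return keyword_density(text, keyword) > threshold
```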

Step Two: Remove the Offenders

Next, you’ll want to remove the questionable links that could be responsible for your ranking drop. First, reach out to the webmasters in charge of the source sites and ask them to remove your link. If they refuse or are unable (or ignore your request), you can use Google’s Disavow Tool, in Webmaster Tools, to notify Google that you would like to disavow your site’s relationship with those links.

Getting links removed is preferable to disavowing them, but you should disavow the ones that you’re unable to remove. If you received a manual penalty in addition to an algorithmic one, you’ll get a notice from Google Webmaster Tools (Note: you’ll only receive this notice if you have set up your website in Google Webmaster Tools already). If you have a manual penalty, you’ll need to follow steps one and two, and then also file a reconsideration request.
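When you do resort to the Disavow Tool, it expects a plain-text file with one URL or `domain:` entry per line, and `#` lines as comments. A small sketch for assembling that file from your audit results:

```python
# Build a disavow file in the format Google's Disavow Tool accepts:
# one URL or "domain:" entry per line, "#" lines as comments.
def build_disavow_file(urls, domains, note=""):
    """Return the text of a disavow file for the given URLs/domains."""
    lines = []
    if note:
        lines.append("# " + note)          # e.g. why removal failed
    lines += ["domain:" + d for d in sorted(domains)]
    lines += sorted(urls)
    return "\n".join(lines) + "\n"
```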

Step Three: Reassess your strategy

Moving forward, make an effort to build better links on better sites through a strong content strategy. Instead of building links, think about how you can earn them. Turn your website into a magnet for links, attracting them with amazing content, rather than building links that may feel forced.


Over the next few months, it’s likely that Google will roll out a few data refreshes, but the majority of Penguin 3.0’s impact has already taken root. Even if you weren’t hit, this is a great opportunity to review your existing strategy and the work that’s been done to this point, and to update your approach to protect yourself against future penalties, while also positioning yourself to recover when Google launches the next Penguin refresh.

Source: Forbes

Wednesday, 1 October 2014

How can I research a domain that I may want to purchase?

Question: How can we check to see if a domain (bought from a registrar) was previously in trouble with Google? I recently bought one, and unbeknownst to me the domain isn't being indexed, and I've had to do a reconsideration request. How could I have prevented this?