Friday, 28 March 2014

What is the best way to differentiate between ccTLDs?

A user recently asked Google: “How can I tell Google that multiple sites are related? And will genuine interlinking between these TLDs be treated as spam or paid links by Google?” In this post, we will elaborate on how Matt Cutts of Google answered this question on his weblog.

Answering this question, Matt explained that it is genuinely hard for Google to tell whether a set of TLDs is related or whether the interlinking exists for spam and paid-link purposes. He gave an example: suppose you have a main domain “” with ccTLDs such as “” for people searching in Germany and “” for people searching your website in France. He suggests using the “hreflang” annotation to distinguish the websites and to tell Google to show the right version in each country. With hreflang in place, Google can recognize the different ccTLDs as regional versions related to each other, rather than treating the interlinking as spam.
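As a rough sketch, the hreflang annotations described above might look like this. The example.com / example.de / example.fr domains are placeholders, not the domains from the original question:

```html
<!-- Placed in the <head> of every version of the page; domains are hypothetical -->
<link rel="alternate" hreflang="x-default" href="http://www.example.com/" />
<link rel="alternate" hreflang="de" href="http://www.example.de/" />
<link rel="alternate" hreflang="fr" href="http://www.example.fr/" />
```

Note that each page in the set should carry the full cluster of annotations, including one pointing back to itself, so that all versions reference each other consistently.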

He also explains that you can use a sitemap to declare the different versions of the website for different countries and languages. For more information, please refer to the weblog post from Matt Cutts below.
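The same relationships can be declared in an XML sitemap instead of in page markup. A minimal sketch, again using hypothetical domains:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <!-- One <url> entry per version; each lists the alternates for the others -->
  <url>
    <loc>http://www.example.com/</loc>
    <xhtml:link rel="alternate" hreflang="de" href="http://www.example.de/" />
    <xhtml:link rel="alternate" hreflang="fr" href="http://www.example.fr/" />
  </url>
</urlset>
```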

How to Markup Videos on your website

If your video appears in the SERP for a targeted keyword, it enhances the user experience. We already optimize our video sitemaps to increase the visibility of videos in the SERP; a video sitemap helps Google find videos and thumbnail images properly. Google is constantly improving its indexing system to provide the best results to users. You can use Yahoo! SearchMonkey to mark up your video content: if you mark up your pages with Yahoo! SearchMonkey video tags, Google will, as stated here, recognize these and list your video in the search results. Why are we using Yahoo’s technology when optimizing content for Google? The reason is pretty simple: Google supports it. Unfortunately, there is no tool you can use to generate the RDFa markup code for your videos. You have to write the code manually, but it is not a tedious task.

Unlike XML video sitemaps and MediaRSS enhancements, Yahoo! SearchMonkey code goes straight into your pages. This has the benefit that Google will recognize your videos without any additional submission of feeds or sitemaps. Depending on how you embed video, this might be fairly easy to add or might take some more work: the markup starts with attributes on the object element, and some JavaScript-based injection methods might not produce an object element at all. When optimizing videos, you can either host a video on your own website or embed one from a video hosting site such as YouTube, Vimeo or Metacafe. As far as video optimization is concerned, it is better to host videos on your own server, as a video is relatively easy to optimize when it is hosted on the same server rather than on another video hosting website. We will take both cases as examples and learn how to optimize a video in each. Google also supports Facebook shares; please refer to the linked resource to learn more about this.

Case 1: Video Hosted On Same Website:

Below is the normal HTML code used to add a video to a webpage.
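The original code block did not survive, but a typical Flash-era embed of a self-hosted video, before any markup is added, might look roughly like this. The file names, paths and dimensions are hypothetical:

```html
<!-- Plain video embed with no semantic markup; URLs and sizes are placeholders -->
<object width="480" height="385">
  <param name="movie" value="http://www.example.com/videos/player.swf?file=my-video.flv" />
  <embed src="http://www.example.com/videos/player.swf?file=my-video.flv"
         type="application/x-shockwave-flash" width="480" height="385"></embed>
</object>
```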

Now we have to embed our RDFa code in the above video content. To start, we have to specify the RDFa namespaces (refer to the introduction of RDFa above if you have forgotten about namespaces).
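The namespace declarations go on an element that wraps the video markup. A sketch, assuming the namespace URLs commonly cited for SearchMonkey media and Dublin Core (verify them against Google's documentation before using):

```html
<!-- Namespace URLs are assumptions; check the current documentation -->
<div xmlns:media="http://search.yahoo.com/searchmonkey/media/"
     xmlns:dc="http://purl.org/dc/terms/">
  <!-- video object and RDFa properties go here -->
</div>
```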



Once we are done with the namespaces, there are some properties we have to add to the markup; a few of them are compulsory.

Google recognizes the following Yahoo! SearchMonkey RDFa properties.

media:video — (Required.) A URL to the video to play when the user clicks the "play" button.
xmlns:media — (Required.) Must consist of the following URL: "".
media:thumbnail — (Required.) A URL pointing to a preview thumbnail, which must be a GIF, PNG, or JPG image. The preview thumbnail must be hosted on the same domain as the video.
xmlns:dc — A valid URL to the Dublin Core namespace. The only acceptable value is "". Required only if you are using Dublin Core metadata such as dc:description.
dc:license / cc:license — Indicates the license for this content.
dc:description — A short (up to 200 characters) description of the video. Take this description from the meta data of the page.
media:title — Specifies the title of the video. Up to 60 characters. Again, this can be taken from the title tag of the page on which the video resides.
media:width — The video's width in pixels, including any "chrome" provided by the third-party player.
media:height — The video's height in pixels, including any "chrome" provided by the third-party player.
media:type — The video's MIME type. Currently, the only acceptable value is application/x-shockwave-flash.
media:region — An international region where the video may be played. The default is * (play in all regions).
media:duration — The duration of the video, in seconds.

As mentioned above, media:video, xmlns:media and media:thumbnail are required properties that you must add to the markup. That said, it is always beneficial to add as much information as you can, so try to use all the properties.

Based on the knowledge we have gained so far, the RDFa embedded code for the video is as below:
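A sketch of what the combined markup could look like, layering the RDFa attributes over the kind of hypothetical embed code shown earlier (all URLs, dimensions and text are placeholders, and the namespace URLs are assumptions):

```html
<div xmlns:media="http://search.yahoo.com/searchmonkey/media/"
     xmlns:dc="http://purl.org/dc/terms/"
     rel="media:video" resource="http://www.example.com/videos/my-video.flv">
  <!-- original embed code, left untouched -->
  <object width="480" height="385">
    <param name="movie" value="http://www.example.com/videos/player.swf?file=my-video.flv" />
    <embed src="http://www.example.com/videos/player.swf?file=my-video.flv"
           type="application/x-shockwave-flash" width="480" height="385"></embed>
  </object>
  <!-- added RDFa properties describing the video -->
  <a rel="media:thumbnail" href="http://www.example.com/images/my-video-thumb.jpg"></a>
  <span property="media:title" content="My Video Title"></span>
  <span property="dc:description" content="A short description of the video."></span>
  <span property="media:width" content="480"></span>
  <span property="media:height" content="385"></span>
  <span property="media:type" content="application/x-shockwave-flash"></span>
  <span property="media:duration" content="120"></span>
</div>
```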

In the code above, the RDFa markup is simply layered around the original video code: we haven't altered the original code, we have just added a few extra lines. While doing this, make sure that the original embed code of the video remains intact. If you are unsure about adding the RDFa code, please feel free to post your query in the comments below.

While media:video, xmlns:media and media:thumbnail are the only required properties, you should use every property that can describe your video content.

Note: For dc:description and media:title, use the meta description and title of the page, respectively.

Case 2: When Videos are hosted on YouTube:

If you use videos on your website that are hosted on YouTube, you have to embed them through the code provided by YouTube.

Here is a sample code (the old custom embed code) provided by YouTube:
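The sample block itself did not survive, but the old-style YouTube embed code followed this general shape. The video ID 6mZShors3o0 comes from the post; the dimensions and other parameters are assumptions:

```html
<object width="560" height="315">
  <param name="movie"
         value="http://www.youtube.com/v/6mZShors3o0?version=3&amp;hl=en_US" />
  <param name="allowFullScreen" value="true" />
  <embed src="http://www.youtube.com/v/6mZShors3o0?version=3&amp;hl=en_US"
         type="application/x-shockwave-flash" width="560" height="315"
         allowfullscreen="true"></embed>
</object>
```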

We need to find the following properties from this code:
media:video --> A URL to the video you wish to be displayed when the user clicks the "play" button.
media:thumbnail --> A URL pointing to a preview thumbnail, which must be a GIF, PNG, or JPG image.

Also, the preview thumbnail must be hosted on the same domain as the video. YouTube does not provide an option to upload a preview thumbnail, as it creates one automatically. Here is the trick: to find the path to the image thumbnail, follow the steps below.

1. The thumbnail URL follows a fixed pattern that contains the ID of your video.
2. You just need to insert the ID of your video into that pattern. You can find the ID in the embed src section of the code above; the embed URL is "".
3. The / or = after the "v" is always the first delimiter, and "?version=3&" is always the second delimiter. Everything in between is the video ID.
4. In the URL above, your ID is 6mZShors3o0. Insert it into the pattern to get the thumbnail URL.

Note: This thumbnail URL will be useful in creating video sitemaps for such videos as well.
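For reference, a video sitemap entry for such an embedded video could look roughly like this. It assumes the common img.youtube.com thumbnail pattern and the standard video sitemap namespace; the page URL, title and description are placeholders:

```xml
<!-- Fragment of a video sitemap; assumes the urlset declares
     xmlns:video="http://www.google.com/schemas/sitemap-video/1.1" -->
<url>
  <loc>http://www.example.com/page-with-video.html</loc>
  <video:video>
    <video:thumbnail_loc>http://img.youtube.com/vi/6mZShors3o0/0.jpg</video:thumbnail_loc>
    <video:title>My Video Title</video:title>
    <video:description>A short description of the video.</video:description>
    <video:player_loc allow_embed="yes">http://www.youtube.com/v/6mZShors3o0?version=3</video:player_loc>
  </video:video>
</url>
```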

The media:video URL will be the URL that contains the video ID, i.e., 
Now that we have the media:video URL and the thumbnail URL, we will create the RDFa markup code in the same way as we did for the video hosted on the same site. The equivalent RDFa code for the video would look like this:
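Putting the pieces together, the RDFa markup around the YouTube embed could be sketched like this. The namespace URLs and surrounding wrapper are assumptions, as before; only the video ID comes from the post:

```html
<div xmlns:media="http://search.yahoo.com/searchmonkey/media/"
     xmlns:dc="http://purl.org/dc/terms/"
     rel="media:video" resource="http://www.youtube.com/v/6mZShors3o0?version=3">
  <!-- original YouTube object/embed code stays untouched here -->
  <a rel="media:thumbnail" href="http://img.youtube.com/vi/6mZShors3o0/0.jpg"></a>
  <span property="media:title" content="My Video Title"></span>
  <span property="dc:description" content="A short description of the video."></span>
  <span property="media:type" content="application/x-shockwave-flash"></span>
</div>
```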

Feel free to ask any queries regarding RDFa and Semantic Markup of videos on your website.

Tuesday, 25 March 2014

How to judge which algorithm hit or penalized your website

Hi friends, it is really a tough call to tell which algorithmic update hit your website, but there are a few trial-and-error methods you can use to check. One of the most popular is to look in Webmaster Tools for any message indicating a problem such as keyword stuffing, cloaking or something else. You can also check for crawl errors; if there are many crawl errors of a specific type, such as 404 Not Found or DNS errors, try fixing them and hope for better results.

The other method is to track the drop in traffic to your website using Google Analytics. Check the period in which your traffic fell dramatically, then search the internet for any algorithmic update around that time. If there was one, you can modify your website accordingly and try to rank higher again.

However, it is not always possible to pinpoint the reason for a drop in rankings and traffic. But if the cause is algorithmic, the good news is that you can update your website to comply with the new policies and rank higher again.

Below is the video from Matt Cutts answering the same question. 

Friday, 21 March 2014

Conquering Pagination – A Guide to Consolidating your Content

A topic sure to make any SEO neophyte’s head spin, approaching and handling pagination can seem a daunting prospect at first. Pagination is a wily shapeshifter, rearing its ugly head in contexts ranging from e-commerce, to newspapers, to forums. Bottom line is, if you’re in the business of on-page optimization, it’s not a question of if you’ll have to deal with pagination problems – it’s a question of when. Luckily, we’re here to give you some advice to get you started, and answer some of the more thought-provoking questions that can arise in tricky situations.
So what exactly is pagination, you ask? In a very basic sense, pagination occurs when a website segments content over the span of multiple pages. On an e-commerce site, this will take the form of product and category listings. On a news site, articles may be divided up across multiple pages or arranged in the form of a slideshow. On forums, groups and topic threads will typically span at least 2-3 pages. Even blogs, which tend to feature article previews ordered from latest to oldest, will run into pagination issues on their homepage.
“Big deal”, you may say. “I see this happening all over the place, so what is the problem with paginated content?” From an SEO perspective, pagination can cause serious issues with Google’s ability to index your site’s content. Let’s explore a few of the potential issues that arise when you paginate your content without taking the proper precautions:
  • Crawler Limitations
When Googlebot is crawling your site, the depth (or levels of clicks deeper into the content) it travels will vary depending on the site's authority and other factors. If you have a tremendous amount of paginated content, the odds that Googlebot will travel through all of it to reach and index the final pages decrease significantly.
  • Duplicate Problems
Depending on the context of the pagination, it is very likely that some elements across the series of pages may contain similar or identical content. In addition to this, you’ll often find that identical title tags and meta descriptions tend to propagate across a span of paginated content. Duplicate content can cause massive confusion for Googlebot when it comes time to determine which pages to return for search queries.
  • Thin Content
In situations (such as the aforementioned news sites) where articles or product reviews tend to be segmented into multiple pages, you run the risk of not providing enough original content for the individual pages to be indexed separately. More importantly, this also creates the risk of your content-to-advertisement ratio running too low, which can set your site up for devastating Panda penalties further down the road.

So how do you deal with Pagination?

Your best option is always optimal site design. There are a number of ways that these problems can be prevented before they begin. When planning the design of an ecommerce or similar site, consider the following measures you can take to cut down on large-scale pagination issues:
  1. Increasing the number of categories, which will decrease the depth of each paginated series
  2. Increasing the number of products per page, which will decrease the number of total pages in the paginated series
  3. Linking to all pages within the now manageable paginated series from the first page, which will alleviate any crawl-depth and link authority flow problems
However, in many real world scenarios, the damage has already been done and a site structure overhaul is not an option. Luckily, Google has given us a variety of methods to better steer the crawlers through our deep crypts of paginated content. As an SEO, you have a weapon in your arsenal to preemptively deal with any problems that may arise out of pagination:


Google now recognizes the rel=“prev” and “next” HTML attributes as a method of indicating a sequence of paginated pages. The implementation can be tricky, and you have to be exceptionally careful when applying this method. Let’s take a look at how this works.
You can refer to the link below for the same:

Suppose, you have four pages of paginated content:

By using rel="prev"/"next", you’re essentially creating a chain between all pages in the pagination series. You’ll begin the chain with Page 1, adding the following code to the <head> section of the page’s HTML:
(Page 1):
<link rel="next" href="">
That’s the only step we have to take for the beginning of the chain. Now we move on to Page 2. Consider that Page 2 is now in the middle of the chain, so we have to attach it both to the page before it, and to the next page in the sequence. Page 2 would have the following code in the <head>:
(Page 2):
<link rel="prev" href="">
<link rel="next" href="">
Now just as you might have assumed, since Page 3 is also in the center of this sequence of linked pages, we have to continue to implement the code in a similar manner:
(Page 3):
<link rel="prev" href="">
<link rel="next" href="">
And so we’ve reached Page 4, the last in our chain of paginated content. The last page should only contain a rel="prev" attribute in the <head>, as there are no further pages within the sequence:
(Page 4):
<link rel="prev" href="">
Using this complete sequence of rel="prev"/"next", Google is able to consolidate this group of paginated content into a single entry in its index. This essentially tells Google to treat the sequence of paginated content as one entry. Typically, the first page will be returned to the user, as it is usually the most relevant to a query regarding the paginated series. However, Google has noted there are scenarios where a more relevant page within the sequence is returned if the query is particularly centered on the content of that page.
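Filled in with hypothetical URLs (the originals were not preserved), the whole chain from the four-page example would look like this:

```html
<!-- Page 1 (http://www.example.com/page1.html) -->
<link rel="next" href="http://www.example.com/page2.html">

<!-- Page 2 -->
<link rel="prev" href="http://www.example.com/page1.html">
<link rel="next" href="http://www.example.com/page3.html">

<!-- Page 3 -->
<link rel="prev" href="http://www.example.com/page2.html">
<link rel="next" href="http://www.example.com/page4.html">

<!-- Page 4 -->
<link rel="prev" href="http://www.example.com/page3.html">
```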



Pros:
  • Unparalleled flexibility
  • Allows resolving pagination issues without the use of a View-All page
  • Can be executed properly with only minor changes to HTML

Cons:
  • Implementation can be complicated
  • Requires proper execution of the chain in order to be effective


Monday, 10 March 2014

How should content be designed to rank higher?

This is a really difficult question to answer, as everybody has their own opinion on what your content must have to rank higher. While designing content, you must research the audience you are targeting. If the audience is scientifically well versed in the subject, you can design your content with highly scientific terms and fill it with technical language.

But if you are writing for a general audience, you must take care that everyone who comes across your content understands what you are writing about. Here I am sharing a video from the official Google Webmaster YouTube channel, in which Matt Cutts answers the same question by explaining whether to focus on clarity or jargon while writing your content.

Wednesday, 5 March 2014

What is a "Paid Link" - From Matt Cutts

Here I am sharing a video from the official Google Webmaster blog in which Matt Cutts answers a question about paid links: how do Google and other search engines distinguish high-PR paid links from genuine high-PR links?