Posted:
Webmaster level: all

It's been a bit more than five years now that our Webmaster Help Forum has been up and running, helping webmasters around the world. Over the years, tens of thousands of users have discussed various topics in well over 100,000 threads, helping each other to improve their websites and to solve a variety of issues that web publishers are confronted with. Among those users is a group that we call the "Bionic Posters," users who have proven to be consistently helpful and knowledgeable, selflessly helping others to tackle seemingly insurmountable problems.

It's great to have such an awesome community -- but thanks is best said by those who are helped. Here is some of the feedback that we collected this year:
  • "Thank you for the time you have spent helping me. It is genuinely appreciated."
  • "Thank you to everyone who helped with my problem! My creaky old 1996 era website is all cleaned up and doing just fine now! Good Guys Rule!"
  • "WOW!!!!!!!! Thank you so much for your help! I was reluctant to post because I thought you guys might think my site is too small, too insignificant, etc. Thanks so much! To me, it's a BIG deal!"
  • "My traffic has doubled and I am now either top or close to the top in search terms"
  • "Thanks. Hopefully my late night paid off then, all the help and information has been great!!!!!!!"
  • "Wow Webado Thank You Again.. You Really Know Your stuff! (…) You Are a True professional and Seriously The Only Person That Could Even Figure This out. I even Spoke to Other Top Specialists and YOU were the Only one who told me what to do and what was wrong."
  • "You are AMAZING! Thank you so much!"
  • "Finally, thanks so much for your concern and prompt reply, Squibble. Can't imagine a person would deligated his efforts to someones you don't even know and met. (…) Thanks for your help!! It means everything to me!"
  • "Wow thanks so much! I have just been to the hot springs in Banos Ecuador, the volcano is rumbling and the town has evacuated but I am still here talking Apache, now that is dedication, I will test your helpfulness out 2moro! Thanks so much!!!"
  • "THANKYOU THANKYOU THANKYOU! YOU ARE SOOOO AMAZING! MY BLOG IS GONE FROM GOOGLE AND ALL THE OLD POSTS ASSOCIATED WITH THAT TOO! THANK YOU! I LOVE YOU!"
  • "It worked! Thank you very much for your help Cristina :)"
  • "I could NEVER have seen what's possible without this forum. I am so grateful."
  • "Thank you so much for your detailed response to my questions. In 10 years of me having a website, no one has explained these concepts better to me than you did."
  • "IT WORKED!!! Thank you so much for saving me the grief and embarrassment of this problem. I truly appreciate both your knowledge and guidance."
  • "Thank you so much for such a detailed answer and putting into terms I can easily understand. (…) Where shall I send the batch of brownies?"
  • "Thank you so much Squibble you are a hero. I have done what it says and i will check to see if i appear in google! Thanks again!"
  • "you guys are amazing! thank you so much redleg! (…) and if you happen to ever be in san carlos give me a shout - you deserve at least a beer and a lunch! "
  • "Thank you Squibble, Vanessa, Cristina, and Ishigaki for weighing in on this and helping me. I hope I can pay it forward one day."
  • "Great info Guys. I really appreciate it. It was awesome of y'all to help me out. I really appreciate it. Thank you."
  • "Your amazing :-) i love you lol xx im sorted now thanks and never in a million years would i have found that out!"
  • "THANK YOU! I love Google and appreciate that they have these safety precautions in place for those nasty hackets - especially when we can fix the problem! Thanks so much for your help. Whew."
  • "Thank you very much, Robbo! With a little tweaking it worked perfectly!"
  • "yay!!! I think it finally went through - THANK YOU SO MUCH!!!"

If you have a question that you would like to ask, a problem that you need help with, we'd love to see you in the forums! We just ask that you please take the time to read through our frequently asked questions, and search the forum before posting. Chances are high that a question like yours has already been answered. Tell us a little bit about yourself and then join us to learn more and help others!

Posted:
Webmaster Level: Intermediate to Advanced

To the fabulous, savvy audience that attended our Video Sitemap webinar several months ago, please accept our re-gift: a summary of your questions from the Video Sitemaps Q&A!

To those who were unable to attend the webinar, please enjoy our gift of the summarized Q&A -- it’s like new!

Either way, happy holidays from all of us on the Webmaster Central Team. :)


Our entire webinar covers the basics of Video Sitemaps and best practices -- nearly everything you’d need to know when submitting a video feed.

  1. Can the source/content of the video (perhaps a third-party vendor) be hosted on another site? For example, can I host my videos on YouTube and still be eligible for Video Search traffic?

    Yes, you can use a third party to host videos. Only the play page--the URL within the <loc> tag--needs to be on your site. <video:content_loc> and <video:player_loc> can list URLs on a different site or subdomain.

    For example, here’s a snippet from a valid Video Sitemap that shows content hosted on a different subdomain from the play page:

    <url>
      <loc>http://www.example.com/videos/some_video_landing_page.html</loc>
        <video:video>
          <video:thumbnail_loc>http://www.example.com/thumbs/123.jpg</video:thumbnail_loc>
          <video:title>Grilling steaks for summer</video:title>
          <video:description>Alkis shows you how to get perfectly done steaks every time</video:description>
          <video:content_loc>http://video-hoster.example.com/video123.flv</video:content_loc>
          <video:player_loc allow_embed="yes" autoplay="ap=1">http://www.example.com/videoplayer.swf?video=123</video:player_loc>
        </video:video>
    </url>
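Note that such a <url> entry sits inside a <urlset> element that declares both the Sitemap and video namespaces. A minimal complete file using the same hypothetical URLs might look like:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:video="http://www.google.com/schemas/sitemap-video/1.1">
  <url>
    <loc>http://www.example.com/videos/some_video_landing_page.html</loc>
    <video:video>
      <video:thumbnail_loc>http://www.example.com/thumbs/123.jpg</video:thumbnail_loc>
      <video:title>Grilling steaks for summer</video:title>
      <video:description>Alkis shows you how to get perfectly done steaks every time</video:description>
      <video:content_loc>http://video-hoster.example.com/video123.flv</video:content_loc>
    </video:video>
  </url>
</urlset>
```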


  2. If I’m using YouTube to host my videos, can Google verify that I’m the legitimate owner?

    Currently there's no functionality that allows you, as the uploader, to verify that you're the owner of a video. The issue of authorship is a hard problem on the web -- not just for videos, but for nearly all types of content.

  3. Because Google owns YouTube, should users who embed YouTube videos still submit Video Sitemaps or is it unnecessary?

    Google treats YouTube as just another source for video content -- though you don’t need to submit a Video Sitemap if you only want your YouTube-hosted videos indexed. If, however, you’re using YouTube as an online video platform (i.e., with play pages on your own site), then we do recommend Sitemap submission.

  4. How long does it take for Google to accept and verify a Video Sitemap?

    Video Sitemap submission is a two-part process:

    1. We fetch the Sitemap and parse it for syntax errors. This happens within minutes.

    2. We fetch the assets referenced in the Sitemap, perform checks, validate metadata, do more cool stuff, and last, index the video. This step can require varied amounts of time depending on your site and our system load.

  5. What tags and categories are most important in Video Sitemaps or mRSS? Should I create my own categories or is there a list that I should conform to?

    Currently, the most important metadata to include is title and description -- both are required. The category tag is optional, and there isn’t a list from which to select.

  6. Do I have to use HTML5 to use Video Sitemaps? Does HTML5 help with discovery? Or, if my site is HTML5 compliant, do I still need to submit a Video Sitemap?

    None of the Video Search principles change with HTML5. We still recommend using a Video Sitemap regardless of the markup on your site. HTML5 can be helpful, though, because tags like <video> make it easier for our systems to verify that video exists on the page.
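As a rough sketch (the file names here are hypothetical), a play page using the HTML5 <video> element makes the presence of a video explicit in the markup:

```html
<video controls width="640" height="360"
       poster="http://www.example.com/thumbs/123.jpg">
  <!-- Hypothetical MP4 source; provide formats your players support -->
  <source src="http://www.example.com/videos/video123.mp4" type="video/mp4">
  Your browser does not support the video element.
</video>
```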

  7. If I use an iframe rather than embedding my videos, can Google still find it?

    We do not recommend using iframes to embed video content on your pages.

  8. Can I have multiple videos on one URL?

    You can. We’ve found, however, that users may not consider it the best experience. When users click on a video search result, they most often don’t like being forced to locate the correct video among multiple videos on the resulting page.

  9. Do I need to specifically create a robots.txt file that allows Googlebot, or do I just need to make sure Googlebot isn’t blocked?

    Just make sure that Googlebot isn’t blocked.
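In other words, a robots.txt file like the following (paths hypothetical) works fine -- no Googlebot-specific section is needed, as long as no rule disallows your video files or play pages:

```
User-agent: *
Disallow: /admin/

Sitemap: http://www.example.com/video-sitemap.xml
```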

  10. I provided a thumbnail, but it’s not being used. Does Google create its own thumbnails from my videos?

    We try to use the thumbnail you provide if it’s valid. If not, we’ll try to generate a thumbnail ourselves. We recommend that you provide thumbnails that are at least 120x90 pixels. We also accept many thumbnail formats, such as PNG and JPEG.

  11. Any video filesize limitations?

    At this time, there aren’t video filesize limitations on content submitted through Video Sitemaps.

  12. Is there any way to indicate a transcript or closed captioning for a video?

    Currently there isn’t, but perhaps down the road.

  13. What if I’m using Lightbox or a popup to display a video; can it still be indexed?

    Depends on the use case and how it’s rendered, but if indexing by search engines is important to you, it’s not the safest method. In the Webmaster Help Center, we explain that “When designing your site, it's important to configure your video pages without any overly complex JavaScript or Flash setup.” Most often, for bots, simpler is safer.
Have a safe and happy holiday!

Posted:
Webmaster Level: All

We provide lots of information for webmasters across many different channels — you can stay up to date with the latest features here on our blog, browse articles in our Help Center, have discussions in our forums (in 17 languages!), watch videos on our YouTube channel, or even read in-depth interviews (in English, Portuguese, and other languages).

There’s no shortage of useful information, but sometimes the relevant bits may be a bit difficult to locate, especially for novice webmasters. We see the same questions popping up over and over again, so we’ve tried to make our most frequently searched information as accessible and visible as possible:
We analyzed the questions asked over the past year and a half and identified the issues you are most interested in. We then picked out the relevant bits from across our different resources and collected the answers to those questions in one new convenient FAQ page in our Help Center (available in 20 languages).

We also frequently get questions on how to get in touch with us, so we’ve put together all the different ways you can:
...tell us about a page you want to remove from our search results;
...tell us about spam you found;
...let us know when you’ve fixed issues on your website;
...and many more! All of these contact channels are now listed conveniently in one article with direct links to the relevant forms: Webmaster help and contacts (available from the homepage of our Help Center, also in 20 languages).

Now isn’t that a nice stocking stuffer (-:?
Happy webmastering in 2011, and keep the feedback coming!

Posted:
Webmaster Level: Intermediate to Advanced

What are the benefits of submitting feeds like Video Sitemaps and mRSS vs. the benefits of Facebook Share and RDFa? Is one better than the other? Let’s start the discussion.

Functionality of feeds vs. on-page markup

Google accepts information from both video feeds, such as Video Sitemaps and mRSS, as well as on-page markup, such as Facebook Share and RDFa. We recommend that you use both!

If you have limited resources, however, here’s a chart explaining the pros and cons of each method. The key differentiators include:
  • While both feeds and on-page markup give search engines metadata, Video Sitemaps/mRSS also help with crawl discovery. We may find a new URL through your feed that we wouldn’t have easily discovered otherwise.

  • Using Video Sitemaps/mRSS requires that the search engine support these formats and not all engines do. Because on-page markup is just that -- on the page -- crawlers can gather the metadata through organic means as they index the URL. No feed support is required.

 Feeds (Video Sitemaps & mRSS):
  • Accepted by Google
  • Helps search engines discover new URLs with videos (improves discovery and coverage)
  • Provides structured metadata (e.g. video title and description)
  • Incorporates additional metadata like “duration”

 On-page markup (Facebook Share & RDFa):
  • Accepted by Google
  • Provides structured metadata (e.g. video title and description)
  • Allows search engines without sitemap/mRSS support to still obtain metadata information (allows organic gathering of metadata)


If you’re further wondering about the benefits of specific feeds (Video Sitemaps vs. mRSS), we can help with clarification there, too. First of all, you can use either. We’re agnostic. :) One benefit of Video Sitemaps is that, because it’s a format we’re actively enhancing, we can quickly extend it to allow for more specifications.

All this said, if you’re going to start from scratch, Video Sitemaps is our recommended start.

 Video Sitemaps:
  • Accepted by Google
  • Extremely quick for the Google Video Search team to extend

 mRSS:
  • Accepted by Google
  • Been around for a long, long time and pretty widely accepted

“Starving” to start a conversation about feeds or on-page markup? Join us in the Sitemaps section of the Webmaster discussion forum.

Posted:
Webmaster level: Beginner

Cross-posted on the Google Grants Blog

In our previous post, we did some source code housekeeping -- just in time for the holidays. But once users have landed on your site, how can you make sure they’ll know how to get around?

As it turns out, easily accessible content on your site can make a big difference. Users tend to have a better experience when a site helps them find and understand its content. Having an accessible site not only empowers users, it also helps search engines understand what your site is really about.

So if you’ve resolved to boost your site’s user experience and online presence for the new year, improving your content accessibility is a great way to start. Thankfully, there are tons of features you can add to make your site more accessible. In this post, we’ll highlight three of them:
  • Intuitive navigation
  • Concise, descriptive anchor text for links
  • Unique, accurate page titles throughout the site
Intuitive navigation
Help users avoid confusion by providing them with intuitive navigation, so that when they arrive at your site, they’ll know where to click to find the information they’re looking for.

Here are three features you can implement in order to lead your users down the right path:
  • Navigational menu: Having a menu with links to the site’s most important pages is the fastest, easiest way to show users where to click next.
  • Text-based links: While drop-down menus, image-based links, and animation-based links can be appealing, keep in mind that users on text-only devices and some search engines may not be able to see or understand these links. Thus, many users prefer text-based links, which are also easier for search engines to crawl and interpret.
  • User-viewable site map: 59% of our submissions did not have a user-viewable site map. By providing one, you display the structure of your site and give the user easy one-click navigation. If users are having trouble finding specific pages on your site, a site map can help them find their way. Don’t send your users into the wild without a map!
Let’s explore how these features can make a site’s navigation more intuitive by looking at one of our submitted sites, Philanthropedia.


Thanks to this site’s clean navigational menu, users can find all of the site’s important pages within a few clicks. Wherever users end up on the site, they can always click on the “Home” button to return to the main page, or on any of the links in the menu to return to the site’s important subpages. Like all of the links on this site, the links in the navigational menu are text-based links, which make it easier for both search engines and users to access the site’s content. Finally, Philanthropedia has included a user-viewable site map, shown below, in case visitors are looking for a specific page not listed in the main menu.


Concise, descriptive anchor text for links
Anchor text -- the clickable text of a link -- can help users quickly decide which links they want to click on and find out more about. Meaningful anchor text makes it easier for users to navigate around your site and also helps search engines understand what the link’s destination page is about.

20% of our submissions could improve their sites by improving the anchor text used in some of their internal links. When writing anchor text, keep two things in mind:
  • Be descriptive: Use words that are relevant to the destination page, avoiding generic phrases like “click here” or “article.” Make sure the user can get a snapshot of the destination page’s overall content and functionality by reading the anchor text.
  • Keep it concise: Anchor text that contains a few words or a short phrase is more attractive and convenient for users to read than a sentence or paragraph-long link.
Let’s take a look at how anchor text played out in two user-submitted examples:

  • The Mosaic Project
    Anchor text examples: “Work for Mosaic”, “Order Our Curriculum Guide”, “Outdoor School”
    User friendliness: High: Users can get an accurate idea of the content on the links’ destination pages just by reading the anchor text.
    Anchor text behavior: Active verb phrases and rich nouns accurately describe the pages that the links are pointing to.

  • Asian Liver Center
    Anchor text examples: “Learn more”, “here”
    User friendliness: Low: The anchor text is too generic and does not give users an idea of what the linked-to content is.
    Anchor text behavior: Generic phrases give little insight into the pages that the links are pointing to.

You can learn more about anchor text and internal linking strategies by checking out this blog post on the importance of link architecture.

Unique, accurate page titles throughout the site
Each page on your site is different, so flaunt your site’s diversity by giving a unique title to each page. Giving each page a unique title lets search engines know how that page is distinct from others within your site. In our analysis, over 28% of sites could have improved their site quality by adding unique page titles.

Let’s check out a few more examples to see what a difference unique, accurate page titles can make:

  • VAMS International
    Page title examples: “Upcoming Events | VAMS International”, “Request Service | VAMS International”, “FAQ’s | VAMS International”
    User friendliness: High: Each page’s content is relevant to its title, and the user can get a good idea of each page’s unique offerings and functionality.
    Page title behavior: Concise, rich language joined with the organization’s name accurately describes the corresponding pages. The titles show how each page is unique while also acknowledging that they are all associated with one organization.

  • MHCD Evaluation and Research
    Page title examples: “MHCD Evaluation and Research”
    User friendliness: Low: This site contains a lot of diverse content and rich functionality; however, the uniform page titles do not convey these strengths.
    Page title behavior: This page title is too general and does not accurately describe the content on each page. The same title is used across all the pages on this site.

Wrapping things up
We hope that this blog post has given you some ideas on how to ring in the new year with improved content accessibility, which can boost the user experience and online presence for your site.

To learn more about the features discussed here and in our previous two site clinic posts, check out our SEO Report Card and SEO Starter Guide.

This blog post wraps up our website clinic for non-profits. We send our warmest regards to all the great non-profit causes you are working on, and thanks to everyone who took the time to submit their sites and read our posts!

Contributors: Aditya Goradia, Brandon Falls, Charlene Perez, Diara Dankert, Michael Wyszomierski, and Nelson Bradley

Posted:
Webmaster Level: Beginner

Cross-posted on the Google Grants Blog

As the holiday season comes around, we all have a bit of housekeeping to do. This is precisely why we wanted to focus the second post in our site clinic series on cleaning up your source code. Throughout our analysis of submitted non-profit websites, we noticed some confusion about what HTML markup, or tags, to use where, and what content to place within them, both of which can have a significant impact on users and how your website looks on the search results page.

Before you deck the halls, deck out your <title> elements
Out of all the submitted non-profit websites, 27% were misusing their <title> elements, which are critical in letting both Google and users know what’s important to your website. Typically, a search engine will display ~60 characters from your title element; this is valuable real estate, so you should use it! Before getting into the actual code, let’s first take a look at how a great title element from one of our submitted sites, Sharp, will appear in the search results page:


Ideally, a great <title> element will include the name of the organization, along with a descriptive tag line. Let’s take a look at some submitted examples:

  • Sharp
    <title> source code: <title>Top San Diego Doctors and Hospitals - Sharp HealthCare</title>
    User friendliness: Best
    Tag behavior: Includes organization’s name and a descriptive tag line

  • Interieur
    <title> source code: <title>Interieur 2010 - 15-24 October Kortrijk, Belgium</title>
    User friendliness: Good
    Tag behavior: Includes the organization’s name and a non-descriptive tag line

  • VAMS International
    <title> source code: <title>Visual Arts and Music for Society | VAMS International</title>
    User friendliness: Okay
    Tag behavior: Includes only the organization’s name


If you don’t specify a <title> tag, then Google will try to create a title for you. You can probably do better than our best guess, so go for it: take control of your <title> tag! It’s a simple fix that can make a huge difference. Using specific <title> tags for your deeper URLs is also important, and we’ll address that in our next site clinic post.

Keep an eye on your description meta tags
Description meta tags weren’t being utilized to their full potential in 54% of submitted sites. These tags are often used to populate the two-line snippet provided to users in the search results page. With a solid snippet, you can get your potential readers excited and ready to learn more about your organization. Let’s take another look at a good example from among the submitted sites, Tales of Aussie Rescue:


If description meta tags are absent or not relevant, a snippet will be chosen from the page’s content automatically. If you’re lucky and have a good snippet auto-selected, keep in mind that search engines vary in the way that they select snippets, so it’s better to keep things consistent and relevant by writing a solid description meta tag.
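As a sketch (the wording below is made up for illustration, not the site's actual tag), a description meta tag is a single line in the page's <head>:

```html
<head>
  <title>Tales of Aussie Rescue</title>
  <!-- Hypothetical description; aim for a concise, relevant summary -->
  <meta name="description" content="Read rescue stories and learn how to adopt or foster an Australian Shepherd in need of a new home.">
</head>
```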

Keep your <h> elements in their place
Another quick fix in your housekeeping is ensuring your website makes proper use of heading tags. In our non-profit study, nearly 19% of submitted sites had room for improvement with heading elements. The most common problem in heading tags was the tendency to initiate headers with an <h2> or <h3> tag while not including an <h1> tag, presumably for aesthetic reasons.

Headings give you the opportunity to tell both Google and users what’s important to you and your website. The lower the number on your heading tag, the more important the text, in the eyes of Google and your users. Take advantage of that <h1> tag! If you don’t like how an <h1> tag is rendered visually, you can always alter its appearance in your CSS.
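For instance, if the default <h1> rendering feels too large, a couple of CSS rules (the values here are just illustrative) can tone it down while preserving the heading's semantic importance:

```html
<style>
  /* Keep the semantic weight of <h1>, just render it smaller */
  h1 {
    font-size: 18px;
    font-weight: normal;
    margin: 0 0 10px 0;
  }
</style>

<h1>About Our Organization</h1>
```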

Use alt text for images
Everyone is always proud to display their family photos come holiday season, but don’t forget to tell us what they’re all about. Over 37% of analyzed sites were not making appropriate use of the image alt attribute. If used properly, this attribute can:
  • Help Google understand what your image is
  • Allow users on text-only browsers, with accessibility problems, or on limited devices to understand your images
Keep in mind, rich and descriptive alt text is the key here. Let’s take another look at some of our submitted sites and their alt attribute usage:

  • Sponsor A Puppy
    Source code: <img alt="Sponsor a Puppy logo" src=...
    User friendliness: Best: the alt text specifies the image is the organization’s main logo
    Tag behavior: Uses rich, descriptive alt text to describe images, buttons, and logos

  • Philanthropedia
    Source code: <img alt="Logo" height=...
    User friendliness: Good: the alt text specifies the image is a logo, but does not further describe it by the organization or its behavior
    Tag behavior: Uses non-descriptive alt text for images, buttons, and logos, or uses alt text only sporadically

  • Coastal Community Foundation
    Source code: <img src="...">
    User friendliness: Not ideal: alt text not present
    Tag behavior: No use of alt text, or use of text that does not add meaning (often seen in numbering the images)


A little window shopping for your New Year’s resolution
Google has some great resources to further address best practices in your source code. For starters, you can use our HTML Suggestion Tool in Webmaster Tools. Also, it’s always a good practice to make your site accessible to all viewers.

Posted:
Webmaster Level: Beginner

Cross-posted on the Google Grants Blog

A New Year’s resolution
In the spirit of the holidays, here at Google we wanted to take the time to help out those who spend their days making our world a better place: non-profit organizations. A few weeks back, we asked webmasters of non-profits to submit their organization’s site to our Search Quality team for analysis. After some number crunching and trend analysis, we’re back to report on general areas for improvement and to guide you towards some useful resources!

Making our list, checking it twice
First, we’d like to thank all of the amazing organizations who participated by submitting their sites. We got some great results, and are excited about all the diverse non-profit causes out there.

Our analysis will take place in the following two posts. The first post will focus on cleaning up HTML tags in your source code, while the second will examine improving user experience via better content accessibility.

Visions of... URLs... dancing in our heads
The great news is, every single site submitted had at least one or two areas to tweak to make it even better! So this information should be helpful to everyone out there, big or small. Just to whet your appetites, here’s a quick list of items that will not be addressed in our following posts, but that had some room for improvement in a large percentage of submitted sites:
  • Keep an eye on proper canonicalization: 56% of analyzed non-profit sites could improve their canonicalization practices. You can read more about canonicalization in this blog post from a previous site clinic.
  • Make sure your volunteer/support sections are visible: 29% of our submissions could improve their sites by making their support, volunteer, or donation sections easier to find. A great way to accomplish this is to add a donations tab to your navigation bar so it’s just one click away at all times.
  • Protect your confidential information: Lots of non-profits, especially those in the medical industry, deal with some very important and confidential information. Read up on how to control your crawled and indexed content, and remember to protect confidential content through proper authentication measures.
  • Make your Flash sites search engine friendly: We saw some beautiful sites running on Flash. Search engines have a hard time understanding Flash files, and we’re working to improve Flash comprehension on our end, but here are some discussion points on how you can help us understand your Flash content.
Contributors: Aditya Goradia, Brandon Falls, Charlene Perez, Diara Dankert, Michael Wyszomierski, and Nelson Bradley

Posted:
Webmaster level: All

Today we’ve added a new notification to our search results that helps people know when a site may have been hacked. We’ve provided notices for malware for years, which also involve a separate warning page. Now we’re expanding the search results notifications to help people avoid sites that may have been hacked and altered by a third party, typically for spam. When a user visits a site, we want her to be confident the information on that site comes from the original publisher.

Here’s what the notification looks like:


Clicking the “This site may be hacked” link brings you to an article in our Help Center which explains more about the notice. Meanwhile, clicking the result itself brings you to the target website, as expected.

We use a variety of automated tools to detect common signs of a hacked site as quickly as possible. When we detect something suspicious, we’ll add the notification to our search results. We’ll also do our best to contact the site’s webmaster via their Webmaster Tools account and any contact email addresses we can find on the webpage. We hope webmasters will also appreciate these notices, because they will help you more quickly discover when someone may be abusing your site so you can correct the problem.

Of course, we also understand that webmasters may be concerned that these notices are impacting their traffic from search. Rest assured, once the problem has been fixed, the warning label will be automatically removed from our search results, usually in a matter of days. You can also request a review of your site to accelerate removal of the notice.

If you see this notification appearing on your site’s listing, please take a look at the instructions in our Help Center to learn how you can begin to address the problem. Together, we can make the web a safer place.

Update (2:50pm PT, September 19th 2013): We've updated this post to reflect the change in the notification wording.

Posted:
Webmaster Level: All

Just in time for the holidays, the Webmaster Tools team has updated the "Search queries" and "Links to your site" features.

Search queries with top pages:
Throughout the past year we’ve made some significant changes to the search queries feature in Webmaster Tools. We've received lots of feedback about this tremendously popular feature. One frequent request we heard was that people wanted to be able to see search queries data for their site’s individual pages. Well, we totally agreed that this would be useful and promptly set out to add this functionality to search queries. The fruits of our effort have finally ripened enough on the vine and are ready for you to enjoy. Now when you visit the search queries feature in Webmaster Tools you'll see a new tab titled "Top Pages." The "Top Pages" tab lists impression, click, and position data for the top pages on your site based on their performance in Google's search results.


If you click on one of the individual pages listed you’ll see a list of the queries driving traffic to that page, along with impressions and number of clicks for each query.


Just like in the "Top queries" view, you can click on a specific query to see more detailed data and evaluate how the query is performing across the whole site.


To make filtering in search queries even easier, we've added pie charts to show visually the proportions of search type, location and traffic. Also in the "Top queries" view, you can now specify “containing” or “not containing” when filtering queries.


Links to your site with intermediate links:
Links to your site now shows when a particular URL redirects. If there's a link to your site that links to URL1 which then redirects to URL2, we are now showing the link from URL1 to URL2 as an intermediate link. We also added a "Download all links" option to all the tables in the Links to your site feature.


Now that you know about these updates, please take a few minutes to check them out for your site. We hope it will provide you with a little bit of extra joy this holiday season. Let us know what you think by submitting a comment here or posting in our Webmaster Help Forum. Happy Holidays from the Webmaster Tools team!

Posted:
Webmaster level: All

Do you know how Google's crawler, Googlebot, handles conflicting directives in your robots.txt file? Do you know how to prevent a PDF file from being indexed? Do you know Googlebot’s favorite song? The answers to these questions (except for the last one :)), along with lots of other information about controlling the crawling and indexing of your site, are now available on code.google.com:

Controlling crawling and indexing



Now site owners have a comprehensive resource where they can learn about robots.txt files, robots meta tags, and X-Robots-Tag HTTP header directives. Please share your comments, and if you have questions you can post them in our Webmaster Help Forum.
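As a quick taste of what the documentation covers: non-HTML files such as PDFs can't carry a robots meta tag, but they can be kept out of the index with the X-Robots-Tag HTTP header. A minimal sketch for Apache (assuming mod_headers is enabled; the exact directive placement depends on your server setup):

```apache
# Keep all PDF files on this server out of Google's index
<Files ~ "\.pdf$">
  Header set X-Robots-Tag "noindex, nofollow"
</Files>
```

The documentation also explains how conflicting robots.txt directives are resolved, so it's worth reading before relying on behavior like this.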

Posted:
Webmaster Level: All

Recently we made a change to show more results from a domain for certain types of queries -- this helped searchers get to their desired result even faster. Today we’re expanding the feature so that, when appropriate, more queries show additional results from a domain. As a webmaster, you’ll appreciate the fact that these results may bring targeted visitors directly to the pages they’re interested in.

Here’s an example: in the past, the query [moma] (the Museum of Modern Art), might have triggered two results from the official site:


With this iteration, our search results may show:
  • Up to four web results from each domain (i.e., several domains may have multiple results)
  • Single-line snippets for the additional results, to keep them compact
As before, we still provide links to results from a variety of domains to ensure people find a diverse set of sources relevant to their searches. However, when our algorithms predict pages from a particular site are likely to be most relevant, it makes sense to provide additional direct links in our search results.


Like the hundreds of other changes we make each year, this one is meant to help users quickly reach their desired result. Even though we’re constantly improving our algorithms, our general advice still holds true: create compelling, search-engine friendly sites in order to attract users, buzz and, often, targeted traffic!

Posted:
Webmaster Level: All

We often get questions from webmasters about how we index content designed for Flash Player, so we wanted to take a moment to update you on some of our latest progress.

About two years ago we announced that through a collaboration with Adobe we had significantly improved Google’s capability to index Flash technology based content. Last year we followed up with an announcement that we had added external resource loading to our SWF indexing capabilities. This work has allowed us to index all kinds of textual content in SWF files, from Flash buttons and menus to self-contained Flash technology based websites. Currently almost any text a user can see as they interact with a SWF file on your site can be indexed by Googlebot and used to generate a snippet or match query terms in Google searches. Additionally, Googlebot can also discover URLs in SWF files and follow those links, so if your SWF content contains links to pages inside your website, Google may be able to crawl and index those pages as well.

Last month we expanded our SWF indexing capabilities thanks to our continued collaboration with Adobe and a new library that is more robust and compatible with features supported by Flash Player 10.1. Additionally, thanks to improvements in the way we handle JavaScript, we are also now significantly better at recognizing and indexing sites that use JavaScript to embed SWF content. Finally, we have made improvements in our video indexing technology, resulting in better detection of when a page has a video and better extraction of metadata such as alternate thumbnails from Flash technology based videos. All in all, our SWF indexing technology now allows us to see content from SWF files on hundreds of millions of pages across the web.
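For sites that embed SWF content through JavaScript, a common pattern uses the open-source SWFObject library, which keeps indexable fallback content in the page. A minimal sketch (the file names and element id here are made up for illustration):

```html
<!-- Indexable fallback content; SWFObject swaps it for the movie
     when a sufficient Flash Player version is detected -->
<div id="flashContent">Text description of the Flash content</div>
<script src="swfobject.js"></script>
<script>
  swfobject.embedSWF("site.swf", "flashContent", "800", "600", "10.1.0");
</script>
```

Keeping the fallback content equivalent to what the SWF shows benefits both search engines and users without Flash Player.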

While we’ve made great progress indexing SWF content over the past few years, we’re not done yet. We are continuing to work on our ability to index deep linking (content within a Flash technology based application that is linked to from the same application) as well as further improving indexing of SWF files executed through JavaScript. You can help us improve these capabilities by creating unique links for each page that is linked from within a single Flash object and by submitting a Sitemap through Google Webmaster Tools.

We’re excited about the progress we’ve made so far and we look forward to keeping you updated about further progress.

Posted:
Webmaster Level: Intermediate to Advanced

Today Google introduced Instant Previews, a new search feature that helps people find information faster by showing a visual preview of each result. Traditionally, elements of the search results like the title, URL, and snippet—the text description in each result—help people determine which results are best for them. Instant Previews achieves the same goal with a visual representation of each page and where the relevant content is, instead of a text description. For our webmaster community, this presents an opportunity to reveal the design of your site and why your page is relevant for a particular query. We'd like to offer some thoughts on how to take advantage of the feature.

First of all, it's important to understand what the new feature does. When someone clicks on the magnifying glass on any result, a zoomed-out snapshot of the underlying page appears to the right of the results. Orange highlights indicate where highly relevant content on the page is, and text callouts show search terms in context.

Here’s the Instant Preview for the Google Webmaster Forum.

These elements let people know what to expect if they click on that result, and why it's relevant for their query. Our testing shows that the feature really does help with picking the right result—using Instant Previews makes searchers 5% more likely to be satisfied with the results they click.

Many of you have put a lot of thought and effort into the structure of your sites, the layout of your pages, and the information you provide to visitors. Instant Previews gives people a glimpse into that design and indicates why your pages are relevant to their query. Here are some details about how to make good use of the feature.

  • Keep your pages clearly laid out and structured, with a minimum of distractions or extraneous content. This is always good advice, since it improves the experience for visitors, and the simplicity and clarity of your site will be apparent via Instant Previews.
  • Try to avoid interstitial pages, ad pop-ups, or other elements that interfere with your content. In some cases, these distracting elements may be picked up in the preview of your page, making the screenshots less attractive.
  • Many pages have their previews generated as part of our regular crawl process. Occasionally, we will generate screenshots on the fly when a user needs it, and in these situations we will retrieve information from web pages using a new "Google Web Preview" user-agent.
  • Instant Previews does not change our search algorithm or ranking in any way. It's the same results, in the same order. There is also no change to how clicks are tracked. If a user clicks on the title of a result and visits your site, it will count as a normal click, regardless of whether the result was previewed. Previewing a result, however, doesn't count as a click by itself.
  • Currently, adding the nosnippet meta tag to your pages will cause them to not show a text snippet in our results. Since Instant Previews serves a similar purpose to snippets, pages with the nosnippet tag will also not show previews. However, we encourage you to think carefully about opting out of Instant Previews. Just like regular snippets, previews tend to be helpful to users—in our studies, results which were previewed were more than four times as likely to be clicked on. URLs that have been disallowed in the robots.txt file will also not show Instant Previews.
  • Currently, some videos or Flash content in previews appear as a "puzzle piece" icon or a black square. We're working on rendering these rich content types accurately.
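For reference, opting a page out of text snippets (and therefore out of Instant Previews) uses the standard robots meta tag in the page's head, for example:

```html
<meta name="robots" content="nosnippet">
```

As noted above, think carefully before adding this, since previewed results were more than four times as likely to be clicked.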

We hope you're as excited about this next step in the search results as we are. We're looking forward to many more improvements to Instant Previews in the future.

Posted:
Webmaster Level: All

At Google, we continually strive to improve our algorithms to keep search results relevant and clean. You have been supporting us on this mission by sending spam reports for websites that violate our Webmaster Guidelines, using the spam report form in Google Webmaster Tools. While you might not see changes right away, we take your reports seriously and use them to fine-tune our algorithms -- the feedback is much appreciated and helps us to protect the integrity of our search results. We also take manual action on many of these spam reports. A recent blog post covers more information on how to identify webspam.

For those of you who regularly report spam, or would like to do so, we’ve now published a Chrome extension for reporting spam that makes the process more convenient and simple. The extension adds “Report spam” links to search results and your Web History, taking you directly to the spam report form and autocompleting some form fields for you. With this extension, Google’s spam report form is always just one click away.

The Google Webspam Report Chrome extension provides further tools to help you quickly fill out a spam report:
  • a browser button to report the currently viewed page
  • an option to retrieve recent Google searches from your Chrome history
  • an option to retrieve recently visited URLs from your Chrome history
As before, you need to be logged into your Google Account to report spam. You can find a more detailed walkthrough of the use cases and features in this presentation and on the Chrome Extensions Gallery page, where you can also provide feedback and suggestions. We hope that you find this extension useful and that you continue to help us fight spam.

The extension is available in 16 languages. If your Chrome browser is set to a language supported by the extension, it will automatically use the localized version, otherwise defaulting to English.

Note: We care about your privacy. The Google Webspam Report Chrome extension allows you to access your personal Chrome history for the purpose of reporting spam, but does not send data retrieved from it to our servers. The source code of the extension has been published under an open source license.

Posted:
Webmaster level: All

Everyone who uses the web knows how frustrating it is to land on a page that sounds promising in the search results but ends up being useless when you visit it. We work hard to make sure Google’s algorithms catch as much as possible, but sometimes spammy sites still make it into search results. We appreciate the numerous spam reports sent in by users like you who find these issues; the reports help us improve our search results and make sure that great content is treated accordingly. Good spam reports are important to us. Here’s how to maximize the impact of any spam reports you submit:

Why report spam to Google?

Google’s search quality team uses spam reports as a basis for further improving the quality of the results that we show you, to provide a level playing field for webmasters, and to help with our scalable spam fighting efforts. With the release of new tools like our Chrome extension to report spam, we’ve seen people filing more spam reports and we have to allocate appropriate resources to the spam reports that are most likely to be useful.

Spam reports are prioritized by looking at how much visibility a potentially spammy site has in our search results, in order to help us focus on high-impact sites in a timely manner. For instance, we’re likely to prioritize the investigation of a site that regularly ranks on the first or second page over that of a site that only gets a few search impressions per month. A spam report for a page that is almost never seen by users is less likely to be reviewed compared to higher-impact pages or sites. We generally use spam reports to help improve our algorithms so that we can not only recognize and handle this particular site, but also cover any similar sites. In a few cases, we may additionally choose to immediately remove or otherwise take action on a site.

Which sites should I report?

We love seeing reports about spammy sites that our algorithms have missed. That said, it’s a poor use of your time to report sites that are not spammy. Sites submitted through the spam report form are reviewed for spam content only. Sites that you think should be tackled for other reasons should be submitted to us through the appropriate channel:
  • for content which you have removed, use our URL removal tools
  • for sites with malware, use the malware report form
  • for paid links that you find on sites, use the paid links reporting form
  • if you want to report spammy links for a page, make sure that you read how to report linkspam
  • if you have a complaint because someone is copying your content, we have a different copyright process--see our official documentation pages for more info
There’s generally no need to report sites with technical problems or parked domains, because these are typically handled automatically.

The same applies to redirecting legitimate sites from one top level domain to another, e.g. example.de redirecting to example.com/de. As long as the content presented is not spammy, the technique of redirecting one domain to another does not automatically violate the Google Webmaster Guidelines.
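Such a legitimate cross-domain redirect is typically set up as a permanent (301) redirect on the old domain's server. A minimal sketch for Apache (assuming mod_alias is available; the domains are the example ones from above):

```apache
# On example.de's server: permanently redirect every request to example.com/de/
Redirect 301 / http://example.com/de/
```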


If you happen to come across a gibberish site similar to this one, it’s most likely spam.

The best way to submit a compelling spam report is to take a good look at the website in question and compare it against the Google Webmaster Guidelines. For instance, these would be good reasons to report a site through the spam report form:
  • the cached version contains significantly different (often keyword-rich) content from the live version
  • you’re redirected to a completely different domain with off-topic, commercial content
  • the site is filled with auto-generated or keyword-stuffed content that seems to make no sense
These are just a few examples of techniques that might be potentially spammy, and which we would appreciate seeing in the form of a spam report. When in doubt, please feel free to discuss your concerns on the Help Forum with other users and Google guides.

What should I include in a spam report?

Some spam reports are easier to understand than others; having a clear and easy-to-understand report makes it much easier for us to analyze the issue and take appropriate actions. Here are some things to keep in mind when submitting the spam report:
  • Submit the URLs of the pages where you see spam (not just the domain name). This makes it easy for us to verify the problem on those specific pages.
  • Try to specify the issue as clearly as possible using the checkboxes. Don’t just check every single box--such reports are less likely to be reviewed.
  • If only a part of the page uses spammy techniques, for example if it uses cloaking or has hidden text on an otherwise good page, provide a short explanation on how to look for the spam you’re seeing. If you’re reporting a site for spammy backlinks rather than on-page content, mention that.
By following these guidelines, your spam reports will be reproducible and clear, making them easier to analyze on our side.

What happens next?

After reviewing the feedback from these reports (we want to confirm that the reported sites are actually spammy, not just sites that someone didn’t like), it may take a bit of time before we update our algorithms and a change is visible in the search results. Keep in mind that sometimes our algorithms may already be treating those techniques appropriately; for instance, perhaps we’re already ignoring all the hidden text or the exchanged links that you have reported. Submitting the same spam report multiple times is not necessary. Rest assured that we actively review spam reports and take appropriate actions, even if the changes are not immediately visible to you.

With your help, we hope that we can improve the quality and fairness of our search results for everyone! Thank you for continuing to submit spam reports, and feel free to post here or in our Help Forum should you have any questions.

Posted:
Webmaster Level: All

Last year, as part of Google’s initiative to make the web faster, we introduced Page Speed, a tool that gives developers suggestions to speed up web pages. It’s usually pretty straightforward for developers and webmasters to implement these suggestions by updating their web server configuration, HTML, JavaScript, CSS and images. But we thought we could make it even easier -- ideally these optimizations should happen with minimal developer and webmaster effort.

So today, we’re introducing a module for the Apache HTTP Server called mod_pagespeed to perform many speed optimizations automatically. We’re starting with more than 15 on-the-fly optimizations that address various aspects of web performance, including optimizing caching, minimizing client-server round trips and minimizing payload size. We’ve seen mod_pagespeed reduce page load times by up to 50% (an average across a rough sample of sites we tried) -- in other words, essentially speeding up websites by about 2x, and sometimes even faster.

Comparison of the AdSense blog site with and without mod_pagespeed


Here are a few simple optimizations that are a pain to do manually, but that mod_pagespeed excels at:
  • Making changes to pages built by a Content Management System (CMS) without needing to change the CMS itself,
  • Recompressing an image when its HTML context changes to serve only the bytes required (typically tedious to optimize manually), and
  • Extending the cache lifetime of the logo and images of your website to a year, while still allowing you to update these at any time.
We’re working with Go Daddy to get mod_pagespeed running for many of its 8.5 million customers. Warren Adelman, President and COO of Go Daddy, says:
"Go Daddy is continually looking for ways to provide our customers the best user experience possible. That's the reason we partnered with Google on the 'Make the Web Faster' initiative. Go Daddy engineers are seeing a dramatic decrease in load times of customers' websites using mod_pagespeed and other technologies provided. We hope to provide the technology to our customers soon - not only for their benefit, but for their website visitors as well."
We’re also working with Cotendo to integrate the core engine of mod_pagespeed as part of their Content Delivery Network (CDN) service.

mod_pagespeed integrates as a module for the Apache HTTP Server, and we’ve released it as open source, with packages for many Linux distributions. Download mod_pagespeed for your platform and let us know what you think on the project’s mailing list. We hope to work with the hosting, developer and webmaster community to improve mod_pagespeed and make the web faster.
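Once the package is installed, turning the module on comes down to a few lines of Apache configuration. A minimal sketch (the module path varies by distribution, and the filter list here is only an example of the available rewriters):

```apache
# Load and enable mod_pagespeed
LoadModule pagespeed_module /usr/lib/apache2/modules/mod_pagespeed.so
ModPagespeed on
# Choose which on-the-fly optimizations to apply
ModPagespeedEnableFilters extend_cache,combine_css,rewrite_images
```

Filters can be enabled or disabled individually, so you can start conservatively and add optimizations as you verify your pages still render correctly.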