Does Google Remove The PageRank Coming From Links On Pages That No Longer Exist?

Backlinks are in constant flux. Some get added while others get removed during the course of a year, most notably those coming from sites that have closed down for one reason or another (like Geocities, AOL member pages, and so on). When this happens, does the link juice that once flowed from them get cut off?

Matt Cutts answers in the affirmative. Google regularly recrawls the links that point to the pages in its index to keep the information fresh and relevant, so the disappearance of an old link should be detected. Since Geocities no longer exists, the probability of reaching a page through it is nil. PageRank is meant to model a user randomly surfing the Web by following links, so it only makes sense for it to reflect the current state of the link graph.

The magnitude of the effect on PageRank will vary depending on the percentage of backlinks that have disappeared. Sites that were heavily dependent on Geocities might be adversely affected, whereas those with more diverse sources of link juice should not feel much of an impact.

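To make the random-surfer idea concrete, here is a minimal sketch of the PageRank iteration. The three-page toy graph and the textbook damping factor of 0.85 are illustrative assumptions; this is not Google's implementation.

```python
# Minimal power-iteration PageRank over a toy link graph -- illustrative only.

def pagerank(graph, damping=0.85, iterations=50):
    """graph maps each page to the list of pages it links to."""
    pages = list(graph)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in graph.items():
            targets = outlinks or pages  # dangling page: spread rank evenly
            for target in targets:
                new_rank[target] += damping * rank[page] / len(targets)
        rank = new_rank
    return rank

# Before: a Geocities page and one other site both link to "mysite".
before = pagerank({"geocities": ["mysite"], "other": ["mysite"], "mysite": ["other"]})
# After: the Geocities page shuts down, so the vote it cast disappears.
after = pagerank({"geocities": [], "other": ["mysite"], "mysite": ["other"]})
print(round(before["mysite"], 3), round(after["mysite"], 3))  # the rank drops
```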

Will Google Offer Ranking Reports?

In this video, a webmaster from Michigan inquires about the possibility of Google offering a keyword rankings report to its users. He suggests that this could be done either through the Webmaster Tools service or through an API that app makers can utilize. Similar services are currently provided by scrapers like WebPosition, but it would be much better if Google came up with an official report generator.

Matt Cutts agrees that this is an interesting idea that they could explore in the future. Google does not like scrapers because they consume server bandwidth, so the suggestion has some merit. Right now, though, the team is held back by resource constraints, as the engineers are already stretched thin. They are focused on other projects that add features to existing services, such as Fetch as Googlebot, Malware Details, and the ability to ignore URL parameters, which are important as well.

Webmaster Tools continues to be improved, and a few features already in place can show keyword positioning on a limited basis. These might be expanded into an in-depth reporting tool in the future, but there are currently no concrete plans to do so.

Will Changing Hosts Cause Any SEO Concerns?

Fernando from Spain is anxious about transferring from one hosting company to another given that he has a pretty big site and so many things could go wrong, though he did mention that the new host is located in the same country. Should he be concerned from an SEO standpoint?

According to Matt Cutts, Fernando should not be too worried. He is only changing his web host, not his domain name, so everything will appear the same. The IP address may change as well, but this will have no adverse effects, especially since the new host is also in Spain. Matt has done several host transfers himself and has encountered no problems whatsoever.

His tip is to set the DNS TTL (Time To Live) to something low, like five minutes, so that resolvers will not cache the old record for long. Otherwise, people may keep getting pointed to the old address for up to 24 hours after the switch. If everything is set correctly, the new IP address should be picked up right away by both Googlebot and end users. As a best practice, consider keeping the site live on both hosts while things stabilize.

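Before and after the switch, it is worth confirming what TTL resolvers are actually being handed. A minimal check, assuming the third-party dnspython package and using example.com as a stand-in for the real domain:

```python
# Query the A record and report the TTL currently being served.
# Requires: pip install dnspython
import dns.resolver

answer = dns.resolver.resolve("example.com", "A")
print("A records:", [r.address for r in answer])
print("TTL (seconds):", answer.rrset.ttl)  # aim for ~300 before the move
```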

If I Use Multiple Versions Of A Phrase, Would It Be Seen As Keyword Spam?

A webmaster noticed that keyword arrangement is important in Google Search. In his experience, typing “Texas widget” drew results that were different from “widget Texas.” He would like to include both phrases and perhaps other permutations on a page to cast a wide net among searchers. However, he is worried about the possible implications. Would this plan be frowned upon as keyword spamming?

Well, not necessarily. Matt Cutts says that Google will not be quick to penalize people for optimizing their pages in this manner, but he does advise caution. The different combinations should appear naturally throughout the content. Having them all bunched up towards the bottom may raise a red flag and be treated as keyword stuffing.

For Matt, this practice is not even necessary. He says that webmasters are better off sticking to the most popular keyword arrangement: go for the phrase that people actually use. The others typically draw very few hits and are thus not worth the time and effort to put in place, especially since the phrases become unwieldy as the number of words increases.

Should We Create A Mobile Version Of Our Site?

Mobile browsing is becoming more prevalent thanks to better smartphones and larger screens. Tommo from London is considering the creation of a mobile version of his site, but is this really necessary? If he does push through with the plan, how should he go about it?

Matt Cutts suggests first finding a way to make the site work well on both desktop and mobile devices, which avoids the complications of maintaining two completely different versions. If this is difficult to achieve, creating a mobile version is the next best option. For recommendations on the more technical aspects of implementation, he points to two excellent blog posts on the subject written by the Google Japan team. The links can be found in the video description.

One of the posts provides helpful tips on how to ensure that the mobile site is crawled and indexed correctly, while the other deals with proper link redirection and content switching depending on the user-agent. There is also a link to an extremely useful tool for troubleshooting called “Fetch as Google” which allows webmasters to see their sites from the search giant’s point of view.

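As a rough illustration of the user-agent switching those posts cover, here is a sketch using Flask and a hypothetical m.example.com mobile mirror; treat the Google Japan posts, not this snippet, as the authoritative guidance.

```python
# Redirect likely mobile visitors to a separate mobile site -- a sketch only.
from flask import Flask, redirect, request

app = Flask(__name__)
MOBILE_HINTS = ("iphone", "android", "mobile")  # crude, illustrative check

@app.route("/")
def home():
    user_agent = request.headers.get("User-Agent", "").lower()
    if any(hint in user_agent for hint in MOBILE_HINTS):
        # 302: the mobile URL is an alternate, not a permanent replacement
        return redirect("https://m.example.com/", code=302)
    return "Desktop version"
```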

Does Google Treat “Brandname” And “Brandname®” Differently?

Ryan of Dearborn, Michigan is on a roll with his questions. This time, his inquiry is about the use of trademark symbols. Webmasters are often forced to incorporate these symbols in titles and page contents by corporate legal departments. Would this cause any keyword problems? Does Google treat those with an attached symbol any differently from a plain keyword?

Matt Cutts thinks there should not be any problem, as Google makes no distinction between the two. He explains that they typically isolate the keywords and ignore odd symbols such as dashes, pipes, trademark marks, and so on. In effect, the processed content will be the same whether or not the symbol is attached. It should also be noted that Google will recognize a brand name even without a symbol next to it.

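As a toy illustration of that kind of processing (a stand-in for Google's unpublished pipeline, not the real thing), stripping odd symbols during tokenization makes the two forms come out identical:

```python
# Toy keyword normalization: odd symbols vanish, words survive.
import re

def normalize(text):
    # Lowercase, then keep only runs of letters and digits.
    return re.findall(r"[a-z0-9]+", text.lower())

print(normalize("Brandname®"))          # ['brandname']
print(normalize("Brandname™ | Home"))   # ['brandname', 'home']
print(normalize("Brandname"))           # ['brandname'] -- same either way
```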

As for end users, they will get relevant results when they type the brand name alone. There is no need for them to append the registered symbol or trademark sign just to ensure that product-related results turn up. Google will be able to discern what they are trying to find and respond accordingly.

What Are Some Best Practices For Indicating Breadcrumbs?

Bauer from Germany noticed that Google has started adding breadcrumb URLs to search engine result pages. For those who want to optimize their pages for this, does the kind of delimiter used matter? Should people gravitate towards one symbol and ignore others? What are the best practices for this particular feature?

It is a tough question for Matt Cutts to answer, solely because the breadcrumb project is in its infancy. Google is still experimenting with different ways of executing it, and this volatility means it is not yet the right time to recommend best practices. He compares the situation to the time when sitelinks were still new. Back then, Google did not have the tools and other infrastructure to support sitelinks, but things were eventually built one by one. The same is likely to happen with breadcrumbs, since the feature is proving popular.

For now, all they can advise is that webmasters use delimited links that faithfully reflect the site's hierarchy, and Google should be able to take it from there. Any new developments will be posted on the Webmaster Central blog.

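In that spirit, here is a small sketch of breadcrumb markup generated from a page's position in the hierarchy. The example pages and the "›" delimiter are illustrative choices, not a Google recommendation:

```python
# Build a delimited breadcrumb trail that mirrors the site's hierarchy.
def breadcrumb_html(trail):
    """trail: list of (label, url) pairs from the homepage down."""
    links = [f'<a href="{url}">{label}</a>' for label, url in trail[:-1]]
    links.append(trail[-1][0])  # leave the current page unlinked
    return " › ".join(links)

print(breadcrumb_html([("Home", "/"),
                       ("Widgets", "/widgets/"),
                       ("Texas Widgets", "/widgets/texas/")]))
# <a href="/">Home</a> › <a href="/widgets/">Widgets</a> › Texas Widgets
```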

How Can A Site That Focuses On Video Or Images Improve Its Rankings?

Search engines mainly rely on textual content to determine results and site rankings. This makes articles, page titles, and other on-page text extremely important for SEO. However, some websites focus more on rich media content like video, audio and images. How can they make their nontextual content work for them so that they rank well on search results?

Matt Cutts recommends extracting metadata from the nontextual content and adding it to the page. For instance, image metadata can describe the file size, time taken, resolution, camera used, exposure settings, and more. Tags may also be added to describe the picture and anticipate relevant search terms. Flickr allows its user community to add tags to most of its images, and this crowd-sourcing effort is an enormous help in increasing their search engine visibility. Individual owners can do the same on a smaller scale and get excellent results.

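For instance, metadata of the kind Matt mentions can be pulled out of an image programmatically and surfaced as on-page text. A sketch assuming the Pillow package, with photo.jpg as a placeholder file:

```python
# Extract EXIF metadata from an image so it can be published as page text.
# Requires: pip install Pillow
from PIL import Image
from PIL.ExifTags import TAGS

img = Image.open("photo.jpg")
for tag_id, value in img.getexif().items():
    name = TAGS.get(tag_id, tag_id)  # e.g. Model, DateTime, ExposureTime
    print(f"{name}: {value}")
print(f"Resolution: {img.width}x{img.height}")
```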

As for videos, owners may want to consider having them transcribed. The transcription can be added underneath the embedded clip in a blog post. YouTube also supports closed captioning, and the text can even be translated into several different languages automatically.

What Are Some Effective Techniques For Building Links?

Organic link building is a difficult task, but it is undoubtedly a worthwhile endeavor. Those who do it successfully are likely to earn the highest rankings. Matt Cutts provides a few tips on how to accomplish this:

Generate controversy. Writing about a controversial topic is bound to stir interest in an industry or a niche. The hotter the subject, the more likely it is that people will react in droves. Often, they will not only leave a comment on a blog post but create their own counter-argument on their sites and link to the original. They are also likely to share the controversial post through their social networks and this could lead to a viral effect.

Answer questions. Although generating controversy can be effective, doing it over and over again leads to diminishing returns. It can make people question the purpose and integrity of the author. Sometimes use gentler tactics, like answering questions in a community. Helping other people enhances your reputation and piques curiosity; readers will start looking at your site and linking to the most useful posts. Watch the video to get more of Matt's tips.

An Introduction To ReCAPTCHA

In this video, Google research scientist Luis von Ahn presents an innovative service called reCAPTCHA. It is essentially a reimagining of the CAPTCHA security protocol to make it more useful.

The original process should be familiar to most people: a garbled image is presented, and users must type the alphanumeric code correctly in order to pass through. This typically takes about 10 seconds. Although that may not seem like much, it adds up: a million solves at 10 seconds apiece is nearly 2,800 hours, and people complete CAPTCHAs millions of times a day around the world.

Google wanted to make those hours count, and so reCAPTCHA was born. Instead of serving random garbled images, reCAPTCHA uses words scanned from books that various libraries are digitizing. Optical character recognition software sometimes cannot make out the letters, but humans can identify them quickly. This ideal arrangement helps the archives finish more books in less time.

This service is provided for free by Google so anyone can install it on their site quite easily. The images come directly from the company’s servers to ensure that everything is centralized and secure. Measures have been put in place to prevent spammers from gaming the system.
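
To give a flavor of the integration, here is a server-side verification sketch against the current reCAPTCHA "siteverify" endpoint. Note that the API has evolved since this video; the secret key and token are placeholders, and the third-party requests package is assumed.

```python
# Verify a reCAPTCHA response token on the server before trusting the user.
# Requires: pip install requests
import requests

def verify_recaptcha(token, secret="YOUR_SECRET_KEY"):
    resp = requests.post(
        "https://www.google.com/recaptcha/api/siteverify",
        data={"secret": secret, "response": token},
        timeout=10,
    )
    return resp.json().get("success", False)
```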