Matt Cutts encourages you to Link to Your Sources

In a recent Google Webmaster Help video, Matt Cutts encouraged people to “Link to Your Sources.”  The video also included an interesting question about where links should be placed in a post or article:

I have a blog and I post original articles but I also like to link to the original website. So I link the website in a word in the first paragraph. Is this the right way or I should give a link separately at bottom.
nayanseth, India

It was a good and interesting question, as it is something I have wondered about myself: whether I should link to a source in the text of the article or at the bottom.  Matt Cutts said that either way Google will give credit to the original source and flow PageRank, so as long as you have the link somewhere in the post or article, you are doing the right thing in Google's eyes.

I will sometimes link to sources for articles, news, graphics, and so on at the bottom of a post if I pulled them from many different places.  In my opinion, including too many hyperlinks in the body of an article can make it look cluttered and disorganized, and make it harder for someone to read.

Matt Cutts does point out in the video that it is more convenient for users when the source link is in the text.  His personal preference is to be able to find the article's source easily, but that is not something Google's ranking actually cares about.  Again, he also encourages publications and bloggers to “Link to Your Sources!” as he constantly notices when they don't.  I don't think you want to make Matt Cutts unhappy with your website; who knows what secret Google power he wields.

How do you link to article sources?  In the post text or at the bottom?

Google revisits Country Code Top Level Domain usage

A few days ago Matt Cutts released another video on the Google Webmaster Help channel discussing country code top level domain (ccTLD) usage.  The title of the video was “Should I use ccTLDs for sites not targeted to those countries?” and this was the question that was asked:

As memorable .COM domains become more expensive, more developers are choosing alternate new domains like .IO and .IM – which Google geotargets to small areas. Do you discourage this activity?
Andy, NY

Matt Cutts says at the beginning of the video, “I want you to go in with eyes open,” which basically means you should be careful about buying a random ccTLD for your website if your intention is global use.  Later, when talking about certain ccTLDs, he said, “Most domains do pertain to that specific country.”

Specifically, he mentioned the extension .LI, which is the country code top level domain for Liechtenstein.  Some people in Long Island, New York have started to use the .LI extension for “Long Island.”  Cutts confirmed in this video, though, that Google doesn't treat .LI as a Long Island or global extension and still considers it a ccTLD for the country of Liechtenstein.

Back in February 2013 there was a Google Webmaster Help video discussing ccTLD domain hacks.  In that video Cutts specifically mentioned the extension .IO, the ccTLD for the British Indian Ocean Territory, which at the time was still considered geotargeted to that area.  What is interesting is that in this newer video he said Google had looked at who was using the .IO extension and it mostly wasn't people from the Indian Ocean.  With a $99 renewal fee, I doubt many people in the Indian Ocean will spring for .IO anyway.  This should make any startups or websites using .IO happy, since it is the first official confirmation of this from Google.

Here is a list of ccTLDs that Google has confirmed are for global use.  If you are interested, you can find a lot of short domains for these various extensions using Short Domain Search, which I wrote about.
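As an aside, if you manage several domains, a quick script can flag which extensions fall on that generic list and which are still geotargeted.  The sketch below is Python, and the set of extensions in it is only an illustrative subset I'm assuming for the example; check Google's list itself for the authoritative, current entries.

```python
# Minimal sketch: check whether a domain's extension is one Google treats
# as generic (global) rather than geotargeted to a specific country.
# NOTE: this set is an illustrative subset only -- consult Google's own
# documentation for the authoritative, up-to-date list.
GENERIC_CCTLDS = {".io", ".co", ".me", ".tv"}

def is_treated_as_generic(domain: str) -> bool:
    """Return True if the domain's extension is on the (illustrative) generic list."""
    extension = "." + domain.lower().rstrip(".").rsplit(".", 1)[-1]
    return extension in GENERIC_CCTLDS

if __name__ == "__main__":
    # example.li is a hypothetical Long Island-style domain; per the video,
    # .LI is still geotargeted to Liechtenstein, so the check treats it as
    # country-targeted.
    for d in ("example.io", "example.li"):
        print(d, "->", "generic" if is_treated_as_generic(d) else "country-targeted")
```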

What Google should do is allow people who buy a country code top level domain that isn't on that list to go into Google Webmaster Tools and select whether it targets that country or is for global use.  So many new companies and start-ups seem to be using ccTLD extensions due to the lack of good available .COM, .NET, and even .ORG domains these days.

Since this issue is not going away, and I suspect more and more people will pester Google about it, I wouldn't be surprised if they changed their minds in the future.  If Google allowed webmasters to choose their geotargeting, it would actually bring down the cost of ccTLDs for people in those specific countries, since the more registrations a registry has, the lower the annual domain renewal cost tends to be.

There will be a lot of new global domain extensions available for registration soon, but I doubt the ccTLD craze will go away even with these new extensions on the horizon.  People really seem to like domain hacks and .IO for some reason.

Still, most webmasters, myself included, would prefer the widest possible range of traffic sources, so in my opinion it is preferable to go with a generic top level domain (gTLD) if you can.  I have Singing Dogs and that is a .NET.  A lot of good gTLDs are still out there, and it seems Google and Matt Cutts still recommend you go with one anyway instead of choosing a country code top level domain, which might confuse Google and users.

Does site Downtime hurt Search Engine Rankings?

“Does site downtime hurt search engine rankings?”

This is a question that is hotly debated by webmasters, search engine optimization specialists, bloggers, hosting companies, and others.  A lot of people say that a little bit of downtime, say 20 minutes in a day, can hurt your website's position in the SERPs (search engine results pages).  Others say that you can have a little bit of downtime here and there and it will not matter much.

Well, this question was touched on in a recent Google Webmaster Help video with Matt Cutts.  Check it out, along with the question that was asked:

I got a “Googlebot can't access your site” message in Webmaster Tools from my host being down for a day. Does it affect my rankings when this happens?
Sally

Matt Cutts' initial response to this question was:

“Well if it is just for a day you should be in pretty good shape. If your host is down for 2 weeks then there is a better indicator that the website is down and we don't want to send users to a website that is actually down, but we do try to compensate for websites that are transiently or sporadically down.  We make a few allowances and we try to come back 24 hours later… So if it is just a short period of downtime I wouldn't really worry about that.”

I mostly agree with what he said in the video.  Later, after explaining that Googlebot had trouble crawling some sites a few weeks ago, Matt Cutts commented, “If it is just 24 hours I really wouldn't stress about that very much.”

Well… a friend of mine recently had his websites on a JustHost dedicated server and it went down for 1 day.  He told me he hasn't been able to get his SERP rankings back since the downtime.  Despite what was said in the video, you should realize downtime can hurt your search engine rankings in Google.  I've heard this from a number of experienced webmasters.

However, I want people to think about how the Googlebot spider works when indexing pages.  I will not go into everything, as it would take too long to explain, but here is a quick overview.

When you do a Google search, you are not actually searching the live web instantly, as a lot of people assume; you are searching Google's stored index of the web.  For instance, when this article was first posted it was NOT immediately indexed by Google and searchable.  Why?  While this blog gets okay traffic, my current PageRank is 3, which is decent but not that high.  Sites that post content more frequently and have a higher PageRank are going to get crawled before mine.  Websites like Fox News and the NY Times will get crawled first since they have higher PageRank, more content, and are in Google News.

So if my website were down for, say, 1 hour, it is quite possible Google would never even see that it was down, since Googlebot may not crawl it during that window.  While Google is really good about crawling new webpages very quickly these days, it can't get to every new piece of content simultaneously.  If you were running Fox News and had 24 hours of downtime, that would be a much bigger deal, since they get millions of visitors a day and Googlebot expects frequent new content there.
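A related point that isn't in the video but is a widely recommended practice: if your downtime is planned (server maintenance, migrations, etc.), serve an HTTP 503 (Service Unavailable) response with a Retry-After header so crawlers know the outage is temporary rather than a dead site.  Here is a minimal sketch of that response using Python's built-in http.server; in reality you would usually configure this in Apache, nginx, or whatever is fronting your site.

```python
# Minimal sketch: answer every request with 503 + Retry-After during planned
# maintenance, so crawlers like Googlebot treat the outage as temporary.
from http.server import BaseHTTPRequestHandler, HTTPServer

class MaintenanceHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(503)                    # Service Unavailable
        self.send_header("Retry-After", "3600")    # hint: try again in an hour
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(b"<h1>Down for maintenance, back shortly.</h1>")

if __name__ == "__main__":
    HTTPServer(("", 8080), MaintenanceHandler).serve_forever()
```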

So my answer to the question “Does site downtime hurt search engine rankings?” would generally be the same as Cutts'.  I caution anyone asking this question to consider the type of website you are running, how much traffic you get, and your users' expectations, which will influence Google's.  Choosing a reliable web hosting company is very important if you want good uptime and don't want to worry about your websites going down.  I prefer Site5, and you can read my Site5 review to get a better idea of their web hosting services.

So that is my professional opinion on this topic of site downtime and search engine rankings.  By the way, if you want to monitor website downtime and uptime, I highly recommend a service called Uptime Robot.  It will ping your website every 5 minutes to see if it is up, and if it isn't, you can get a text message, email, or RSS feed notification.  The best part about Uptime Robot is that it is a completely free website monitoring service.
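If you are curious how such a monitor works (or want to roll a quick one yourself), the core idea fits in a few lines of Python.  This is just a minimal sketch under my own assumptions, not a replacement for Uptime Robot: the URL is a placeholder, and a real monitor would send an email or text instead of printing to the console.

```python
# Minimal DIY sketch of what a service like Uptime Robot does: request the
# site every 5 minutes and flag anything other than a healthy response.
import time
import urllib.request
import urllib.error

URL = "https://www.example.com/"      # placeholder -- put your own site here
CHECK_INTERVAL_SECONDS = 5 * 60       # check every 5 minutes

def site_is_up(url: str) -> bool:
    """Return True if the URL answers with a 2xx/3xx status within 10 seconds."""
    try:
        with urllib.request.urlopen(url, timeout=10) as response:
            return 200 <= response.status < 400
    except (urllib.error.URLError, OSError):
        return False

if __name__ == "__main__":
    while True:
        status = "UP" if site_is_up(URL) else "DOWN"
        print(f"{time.strftime('%Y-%m-%d %H:%M:%S')}  {URL}  {status}")
        time.sleep(CHECK_INTERVAL_SECONDS)
```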

Has website downtime ever hurt or affected your search engine rankings?  Have you ever had your website hosted with a hosting company that had frequent downtime?  Share your experiences below as I am sure a lot of people have something to say about this.
