6 Questions to Ask a Search Engine Optimizer BEFORE You Hire Them

Hiring a Search Engine Optimizer is no easy task.  For one thing, there are a lot of people out there who claim they know how to do Search Engine Optimization (SEO) and build quality links, when in fact they don't.  There are also Search Engine Optimizers who will quote a great price that undercuts what everyone else told you.  However, what they don't tell you is they will use all BlackHat (meaning Google doesn't like it) link building techniques to rank your site.  You could very well rank 1st for some keywords for a couple of days or weeks, but your site will most likely get deindexed from Google or face penalties.  You could still be listed in other search engines… but who really cares about those?

As someone who's seen and heard of way too many people getting ripped off by SEO scammers out there, here are 6 Questions to Ask a Search Engine Optimizer BEFORE you make the decision to hire them.

1. What SEO Software and Tools Do you Use?

Search Engine Optimizers need quite a few SEO tools and software packages to pull data about competitors' websites and to figure out how to rank your site for certain keywords and terms.  Typically they will have a subscription to some sort of rank tracking or monitoring service, link analysis software, and one or several VPSes (Virtual Private Servers) for running some of these programs.  In addition they might hire workers via oDesk or another freelancing service for small tasks and jobs they don't want to do.  This could be writing articles, contacting blogs or websites where they might be able to post an article with your URL above the fold (meaning not as a comment or forum signature link), link analysis, filling out forms, etc.

Any legit Search Engine Optimizer should be honest with you about what SEO tools, software, and subscriptions they are paying for and what tasks they sub-contract other people to do.  Many feel sharing with clients is giving away trade secrets, but that just isn't true.  If you bought SEO tools, guess what, other people bought and use those SEO tools too.

Even though I would let clients of mine know what SEO software I use, they obviously don't have the expertise or knowledge to use it effectively.  Of course, if later they want to learn, that's fine with me.  The client is paying me for my knowledge, and there is always plenty of SEO business out there. 🙂

2. What link building Techniques & Strategies Do you Use? Blackhat or Grayhat?

This is an important question you really need to be on your toes about.  If it's clear they rely completely on BlackHat techniques, then run away very fast!  (This is hard to know if you don't have any idea about search engine optimization.)

I assume most people who read my website are looking for long-term benefits from SEO.  You need someone who understands that and isn't going to give you a bunch of crappy low-quality links, even if they drive traffic.

Consider how your potential search optimizer answers question #1 and what tools they told you they use.  Guess what, you should Google them.  Is it SEO software that is mainly used for BlackHat link building?  Then ask how they use the tools and what strategies they use to build links.

To be fair and honest, here is what you do need to keep in mind… no Search Engine Optimizer, and I mean NOBODY, does completely whitehat SEO.  Everyone works in “Grayhat SEO” when it comes to link building.

Larger sites and companies will buy links, which is technically against Google's guidelines, for certain keywords they want to rank for.  Smaller sites and blogs don't care since they need the money.  (I know since I've been there.)  It's not like Google isn't aware of it, but I've never seen Google do much about it since a lot of these companies have large advertising deals with Google.  Also, there is no way the bots are smart enough to figure this out on every single website out there.

A lot of SEO software is not necessarily considered “whitehat” by Google.  For instance, I use a program called ScrapeBox, which everyone uses.  You can't use ScrapeBox with one IP address or Google would ban it.  ScrapeBox sends too many automated queries at one time, so if you are going to use it you need to buy proxies.  ScrapeBox is not a bad tool in my opinion, but Google doesn't like how it pulls data.
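
ScrapeBox's internals aren't public, but the general idea behind using proxies is simple to sketch.  Here is a minimal Python illustration of spreading queries across a proxy pool round-robin so no single IP sends every request; the proxy addresses and keywords are made-up placeholders, not a real configuration:

```python
import itertools
import time

# Hypothetical proxy pool -- these addresses are placeholders, not real proxies.
PROXIES = [
    "203.0.113.10:8080",
    "203.0.113.11:8080",
    "203.0.113.12:8080",
]

def rotate_queries(queries, proxies, delay=0.0):
    """Assign each automated query a proxy round-robin, so requests are
    spread across many IPs instead of one.  Returns (query, proxy) pairs."""
    pool = itertools.cycle(proxies)  # endlessly loop over the proxy list
    assignments = []
    for q in queries:
        assignments.append((q, next(pool)))
        time.sleep(delay)  # optional throttle between queries
    return assignments

pairs = rotate_queries(
    ["keyword one", "keyword two", "keyword three", "keyword four"], PROXIES
)
for query, proxy in pairs:
    print(f"{query} -> {proxy}")
```

With four queries and three proxies, the fourth query wraps back around to the first proxy, which is exactly why a bigger pool lets you send more queries before any single IP looks suspicious.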

Bottom line, a Search Engine Optimizer should be honest about the SEO strategies and link building tactics they will use for your website before you give them any money.  Like I said, keep in mind Google expects everyone to play fair, but nobody does.

3. What Keywords can you Rank me for within my Budget?

Before you talk with a Search Engine Optimizer you should figure out how much you can spend monthly or as a one-time upfront cost.  You need to understand that the harder the keywords you want to rank for, the more time and therefore money it will require.

If a company tells you they can rank you for some crazy competitive keyword such as “make money online,” they might be able to… but it will cost you a lot of money.  That's why nobody posts pricing directly on their websites.  It doesn't make sense since there are billions of searches across different categories and regions.  Cost and difficulty vary depending on what keywords and terms you want to rank in Google for.

There should be a discussion of what is doable within your SEO budget.  I'm not saying you should not expect something for your money, but understand some companies spend $500-$2,000 a month with SEO firms while others are spending upwards of $10,000+ for a whole range of SEO, social media, and website development services.

4. Can you Guarantee a #1 Ranking?

This is a bit of a trick question since the answer should be “No!”  If you work with a huge SEO company that's been in the game for a while and you are paying them tons of money each month, then it might be possible if they have the staff and resources.  Still, there is no way any Search Engine Optimizer or company can “100% Guarantee” the #1 spot in Google for extraordinarily competitive keywords.

Even these large SEO companies can't always figure out how to get a #1 SERP (search engine results page) ranking.  People who claim otherwise are blowing smoke.

Google rankings are NOT determined by the search optimizers you hire.  They are determined by Google and their massive computing power and stupid algorithms, which sometimes nobody can figure out.  There are over 200 different ranking factors Google uses, and they don't tell the public what all of them are.

5. Do you hate Matt Cutts?

The answer to this question should be “Yes!”  (In case you don't know, Matt Cutts heads the Webspam team at Google and works on Google's search algorithm.  He wrote the family filter engine for Google as well.)

I personally don't trust a lot of things Matt Cutts tells people in the Google Webmaster Help videos on YouTube.  He tells you what Google doesn't like, not what doesn't work.  These are 2 very different things, and SEO pros know the difference and will exploit it.

Watching the Google Webmaster Help videos is good for many reasons though.  For instance I wouldn't have been aware Google changed their stance on .IO domains for global use.  Also he gives straight-up advice like Don't Buy a Spam Domain and that you should always Link to Your Sources.

Oftentimes it is quite difficult to figure out what he means in these YouTube videos though.  The issue is he is trying to appeal both to beginners watching these videos who don't know much about SEO and to people in the SEO industry at the same time, which doesn't work.  This was clear when Penguin 2.0 hit, and his answer regarding “Does site downtime hurt Search Engine rankings?” should have been more clear-cut.  (That's why there was a website created called The Short Cutts.)  Additionally, Matt Cutts' SEO talk at WordCamp 2009 doesn't tell the whole story about Google and SEO.  There are a lot more factors that go into ranking than diverse keywords in articles.

6. Do you hate Google?

The answer should always be “Yes!”  🙂

Other Questions to Ask a Search Engine Optimizer

Obviously I can't account for your specific situation or website needs.  There are too many factors and specialty areas.  So you need to feel out the SEO company or Search Engine Optimizer you are going to hire.  Do they seem like an “SEO Diva,” or are they pretty chill?  Try to think of other questions, and definitely get on Skype or Google+ to talk with them “face to face” if you can.  If they don't want to take a little time to answer your questions, I'd find someone else.

Personally if I get hired by a small business to do SEO work I am always open about what I am doing and I keep them updated.  They are paying me to do work for them and it's my job to make it clear what I am doing and how I am doing it, so they feel they are getting value for their money.  I even try to provide clients with tools and resources other people wouldn't.  Transparency is not something you find often in the SEO world, but you should expect it from whoever you hire in my opinion.  (If you want to Hire Me, I'd be happy to talk with you by the way.)

Ranking in search engines is something you should want to do in the long term, not short term.  Be careful who you hire and just make sure you feel comfortable working with them and what they are going to do with your website.

Think I missed something?  Have anything to add regarding SEO or search optimizers?  Let me know below!

Catch me on Twitter @AdamYamada … if you can!

Matt Cutts encourages you to Link to Your Sources


In a recent Google Webmaster Help video Matt Cutts encouraged people to “Link to Your Sources.”  The video also had an interesting question about where linking should be in a post or article.

I have a blog and I post original articles but I also like to link to the original website. So I link the website in a word in the first paragraph. Is this the right way or I should give a link separately at bottom.
nayanseth, India

It was a good and interesting question, as it is something I have wondered about: whether I should link to a source in the text of the article or at the bottom.  Matt Cutts said either way Google will give credit to the original source and flow PageRank.  So as long as you have the link somewhere in the post or article, you are doing the right thing in Google's eyes.

I will sometimes link to article, news, graphic, etc. sources at the bottom of a post if I took them from many different places.  Including too many hyperlinks in an article can make it look cluttered and disorganized, in my opinion, and make it harder for someone to read the article.

Matt Cutts does point out in the video that it is more convenient for a user when the source link is in the text.  His personal preference is to find the article source easily, but that is not something Google cares about.  Again, he also encourages publications and bloggers to “Link to Your Sources!” as he constantly notices when they don't.  I don't think you want to make Matt Cutts unhappy with your website.  Who knows what secret Google power he wields.

How do you link to article sources?  In a post text or at the bottom?

Google revisits Country Code Top Level Domain usage

Matt Cutts via the Google Webmaster Help channel released another video discussing country code top level domain (ccTLD) usage  a few days ago.  The title of the video was “Should I use ccTLDs for sites not targeted to those countries?” and this was the question that was asked.

As memorable .COM domains become more expensive, more developers are choosing alternate new domains like .IO and .IM – which Google geotargets to small areas. Do you discourage this activity?
Andy, NY

Matt Cutts says in the beginning of the video, “I want you to go in with eyes open,” which basically means you should be careful buying a random ccTLD for your website if your intention is global use.  Later, when talking about certain ccTLDs, he said, “Most domains do pertain to that specific country.”

Specifically he mentioned the extension .LI, which is the country code top level domain for Liechtenstein.  Some people in Long Island, New York have started to use the .LI extension for “Long Island.”  Cutts confirmed in this video, though, that Google doesn't view .LI as being for use on Long Island or for global use, and still considers it a ccTLD for the country of Liechtenstein.

Back in February 2013 there was a Google Webmaster Help video discussing ccTLD hacks.  In that video Cutts specifically mentioned the domain extension .IO, for the Indian Ocean, which was still considered targeted to that area in February.  What is interesting is that in this new video he said Google had looked at who was using the .IO extension, and it mostly wasn't people from the Indian Ocean.  For a $99 renewal fee, I doubt many people in the Indian Ocean will spring for .IO anyway.  This should make any startups or websites that are using .IO happy, since this is the first official confirmation of this from Google.


Here is a list of ccTLDs that Google has confirmed are for global use.  If you are interested you can find a lot of short domains for these various extensions using Short Domain Search, which I wrote about.

What Google should do is allow people who buy a country code top level domain that isn't on that list to go into Google Webmaster Tools and geoselect whether it is for that country or for global use.  So many new companies and start-ups seem to be using ccTLD extensions due to the lack of good available .COM, .NET, and even .ORG domains these days.

Since this issue is not going away, and I suspect more and more people will pester Google about it, I wouldn't be surprised if they changed their minds in the future.  If Google allowed webmasters to geoselect, it would actually bring down the cost of ccTLDs for people in those specific countries, since the more registrations there are, the lower the annual domain renewal cost is.

There will be a lot of new global domain extensions that will be available for registration soon but I doubt the ccTLD craze will go away even with these new extensions on the horizon.  People really seem to like domain hacks and .IO for some reason.

Still, most webmasters, including myself, would prefer to have the widest range of possible traffic sources.  So in my opinion it is preferable to go with a global top level domain (gTLD) if you can.  I have Singing Dogs, and that is a .NET.  A lot of good gTLDs are still out there, and it seems Google and Matt Cutts recommend you go with one anyway instead of choosing a country code top level domain, which might confuse Google and users.

Does site Downtime hurt Search Engine Rankings?


“Does site downtime hurt Search Engine rankings?”

This is a question that is hotly debated by webmasters, search engine optimization specialists, bloggers, hosting companies, etc.  A lot of people say that a little bit of downtime, say 20 minutes in a day, can hurt your SERPs (search engine results pages).  Others say that you can have a little bit of downtime here and there and it will not matter much.

Well, this question was touched on in one of the Google Webmaster Help videos with Matt Cutts recently.  Check it out, along with the question that was asked.

I got a “Googlebot can't access your site” message in Webmaster Tools from my host being down for a day. Does it affect my rankings when this happens?
Sally

Matt Cutts' initial response to this question was,

“Well, if it is just for 1 day, you should be in pretty good shape.  If your host is down for 2 weeks, then that is a better indicator that the website is down, and we don't want to send users to a website that is actually down, but we do try to compensate for websites that are transiently or sporadically down.  We make a few allowances and we try to come back 24 hours later… So if it is just a short period of downtime, I wouldn't really worry about that.”

I mostly agree with what he said in the video.  After explaining that the Googlebot was having trouble crawling sites a few weeks ago, Matt Cutts commented, “If it is just 24 hours, I really wouldn't stress about that very much.”

Well… a friend of mine recently had his websites on a JustHost dedicated server, and it went down for 1 day.  He told me he hasn't been able to get back his SERP rankings since the downtime.  Despite what was said in the video, you should realize downtime can hurt your search engine rankings in Google.  I've heard this from a number of experienced webmasters.

However, I want people to think about how the Googlebot spider works when indexing pages.  I will not go into everything, as it would take too long to explain, but here is a quick overview.

When you do a Google search you are not actually searching the web instantly, like a lot of people assume; you are actually searching Google's stored version of the web.  For instance, when this article was first posted it DID NOT immediately get indexed by Google and become searchable.  Why?  While this blog gets OK traffic, my current PageRank is 3, which is decent but not too high.  Sites that post content more frequently and that have a higher PageRank are going to get crawled before mine.  Websites like Fox News and the NY Times will get crawled first since they have a higher PageRank, more content, and are in Google News.

So if my website was down for, say, 1 hour, it is actually pretty possible that Google will not even see my website is down, since the Googlebot may not crawl it in that window.  While Google is really good about crawling new webpages very fast these days, they can't get to every new piece of content simultaneously.  If you were running Fox News and had 24 hours of downtime, that would be a much bigger deal, since they get millions of visitors a day and the Googlebot expects fresh content frequently.
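
Nobody outside Google knows how their crawl scheduler actually works, but the priority idea above can be sketched as a toy in Python.  The site names, PageRank values, and posting frequencies here are made up for illustration; the point is just that a scoring heap sends the crawler to busy, high-PageRank sites first:

```python
import heapq

# Toy crawl queue -- these pagerank and posts_per_day numbers are invented,
# and Google's real scheduler is vastly more complex than this sketch.
sites = [
    {"url": "foxnews.com",    "pagerank": 8, "posts_per_day": 300},
    {"url": "nytimes.com",    "pagerank": 9, "posts_per_day": 250},
    {"url": "myblog.example", "pagerank": 3, "posts_per_day": 1},
]

def crawl_order(sites):
    """Order sites so higher-PageRank, more frequently updated sites are
    crawled first.  heapq is a min-heap, so scores are pushed negated."""
    heap = []
    for s in sites:
        score = s["pagerank"] * 10 + s["posts_per_day"] / 100
        heapq.heappush(heap, (-score, s["url"]))
    return [heapq.heappop(heap)[1] for _ in range(len(heap))]

order = crawl_order(sites)
print(order)  # the big news sites come before the low-PageRank blog
```

Under this toy scoring, the blog sits at the back of the queue, which is why an hour of downtime there can pass completely unnoticed by the crawler.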

So my answer to the question “Does site downtime hurt Search Engine rankings?” would generally be the same as Cutts'.  I caution anyone asking this question to consider the type of website you are running, how much traffic you get, and your users' expectations, which will influence Google's.  Choosing a reliable web hosting company is very important if you want good uptime and don't want to worry about your website going down.  I prefer Site5, and you can read my Site5 review to get a better idea about their web hosting services.

So that is my professional opinion on the topic of site downtime and search engine rankings.  By the way, if you want to monitor website downtime and uptime, I highly recommend a service called Uptime Robot.  It will ping your website every 5 minutes to see if it is up, and if it isn't you can get a text message, email, or RSS feed notification.  The best part about Uptime Robot is that it is a completely free website monitoring service.
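
If you are curious what a monitor like Uptime Robot does under the hood, the core of it is just an HTTP probe on a timer.  This is a minimal Python sketch of that idea (not Uptime Robot's actual code); it treats any 2xx or 3xx response as "up":

```python
from urllib.request import urlopen
from urllib.error import URLError

def check_status(code):
    """Classify an HTTP status code: 2xx and 3xx responses count as up,
    4xx/5xx count as down."""
    return 200 <= code < 400

def is_up(url, timeout=10):
    """Minimal uptime probe: fetch the URL and report whether it
    responded successfully within the timeout."""
    try:
        with urlopen(url, timeout=timeout) as resp:
            return check_status(resp.status)
    except URLError:
        return False  # DNS failure, refused connection, timeout, etc.

# A real monitor would loop, calling is_up(url) every few minutes and
# sending an alert (email, SMS) whenever it returns False.
print(check_status(200), check_status(301), check_status(503))
```

The services you pay for (or get free, in Uptime Robot's case) mostly add the scheduling, history, and notification plumbing around a probe like this.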

Has website downtime ever hurt or affected your search engine rankings?  Have you ever had your website hosted with a hosting company that had frequent downtime?  Share your experiences below as I am sure a lot of people have something to say about this.

Matt Cutts SEO talk at WordCamp 2009

I get a lot of emails from people asking me what SEO (search engine optimization) strategies and tactics I use.  While I am happy to share my experiences, thoughts, and frustrations about SEO, I encourage people to always get additional information straight from the largest search engine, Google.

One video I always encourage people to watch is this Matt Cutts SEO talk, which he presented during WordCamp 2009.  (For those that don't know, Matt Cutts is the head of Google's webspam team, and he wrote Google's family filter engine.)  While Google has made a lot of updates to their search engine algorithms since 2009, including Penguin 2.0 recently, what he says in this talk at WordCamp is still relevant today if you are a blogger or publication.

There is a lot of great advice in the talk, and Cutts takes people through the basics of what you need to know.  This includes why WordPress is great for SEO, doing keyword research using the Google AdWords Keyword Tool, what you need to do to stand out from the crowd, and more.

Matt Cutts' SEO advice is also very valuable.  There are a lot of SEO myths out there which forums and blogs buy into.  Honestly, it's best to get your SEO advice straight from Cutts and Google, since they know what is going on.

While I do think Cutts provides great SEO and blogging advice in this talk, keep in mind there are a lot of factors which might make or break your website or blog.  Still, it's worth watching the entire 46-minute talk so you can better understand SEO.

Matt Cutts SEO Talk at WordCamp

Matt Cutts from the Web Spam team at Google showcases the good and the bad of WordPress as seen through the eyes of Google, including basics on how Google search works and how you can boost your blog’s results in Google searches.

I hope you enjoy the video and if you learned something from watching it or got something out of it, please leave a comment below and let me know.

If you want to go through the presentation slides more slowly, you can view and download them here on Matt Cutts' personal blog.


Penguin 2.0 part of Google Algorithm Updates in 2013

Google has released some details via its GoogleWebmasterHelp YouTube channel about some major Google Algorithm Updates that will be taking place in 2013.  The big news Matt Cutts revealed is that Google is releasing Penguin 2.0 in an effort to continue to cut out BlackHat and spammy link building tactics from search engine results.

What should we expect in the next few months in terms of SEO for Google?
Matt Cutts, Mountain View

From what was said in the video, Penguin 2.0 will most likely go deeper than the original Penguin update Google made to its search algorithm last April.  That means if you are using duplicate content without linking to the original source, using automated link building software, or basically any BlackHat link building tactics, your websites will likely get hit harder this time around.  “This one is a little more comprehensive than Penguin 1.0,” commented Cutts in the video.  “We expect it to go a little deeper and have a little bit more of an impact than the original version of Penguin.”


The other change that webmasters and site owners should be aware of is that Google doesn't want sponsored posts or paid advertisements passing PageRank to sites that have paid for links.  This violates Google's quality guidelines, and I have a feeling that Penguin 2.0 and the next set of Google Algorithm updates might make sponsored posts harder to rank.  It is possible fewer companies will want to pay for links if there is no PageRank or traffic benefit for doing so.  You could always keep it secret though.

Google will also be providing more tools to webmasters with hacked sites.  “We hope in the next few months to roll out a next generation of hacked sites detection that is even more comprehensive.”  Cutts also said that Google will be working on communicating with webmasters so they know sooner when their sites have been hacked.  It seems they will accomplish this by having more comprehensive info in Google's Webmaster Tools and a “one stop shop,” as Cutts puts it, so when someone realizes they have been hacked they can go get the resources they need to clean up their site.

It also seems authority sites with great content will be getting a boost from Penguin 2.0 and the next Google Algorithm updates happening in 2013.  Google wants to serve up sites with content that, “according to the algorithms, might be a little more appropriate for users.”  So authority sites in certain areas, like medical or travel, could be seeing a spike in traffic for certain keywords in the coming months.

Some websites that might have gotten unfairly hit by the Google Panda update back in February and April 2011 might get a reconsideration.  Since Google Panda affected roughly 12% of search engine rankings, there are probably a lot of sites that were unfairly hit.  Most were not hit unfairly, however.

A possible change will be that Google will have less site clustering on search engine result pages.  “Once you've seen a cluster of results from one site, you'd be less likely to see more results from that site as you go deeper into the next pages of Google search results,” said Cutts.  It sounds like this change is not definite but will likely be something that Google will roll out.

Matt Cutts explicitly states in the video that people should take what is said “with a grain of salt.”  Google could move resources around and decide to make different updates to its search engine if the need arises.  It sounds like for the most part a lot of these plans will be rolling out over the next few months in 2013.

What was most surprising in the video was actually nothing that was said by Matt Cutts; it's that he is wearing a Firefox T-shirt! 🙂  I noticed it immediately, and I assume the Google execs and Chrome team probably were not too happy about it.  Most likely they didn't know until the video went up.  Google is a pretty cool place to work if they allow employees to wear T-shirts that represent competitors' products.  (I am writing this blog post using Firefox, by the way.)

Anyway, bottom line is if you were affected by Penguin or Panda and still do BlackHat link building techniques, Google is coming for you.  If you just have a simple blog or website and are not engaging in shady practices you should be fine as long as you have a good SEO gameplan with great content for users.

Matt Cutts did announce on his personal blog that Penguin 2.0 officially rolled out on May 22, 2013.  About 2.3% of English-language and US-based queries have been affected at this point.

What are your thoughts about Penguin 2.0 and the Google Algorithm Updates for 2013?  Please leave a comment below and let me know how you interpreted the video, whether you are a webmaster, a Search Engine Optimizer, or just think I am plain wrong.  Hopefully that isn't the case though.
