Link Removal is Serious Business!

Well, it’s official – the Google Penguin updates are costing me money. Thanks a lot, Matt Cutts! Just kidding. It’s actually kinda funny.

Zoidberg gets angry when you don't remove your links to his site.

The Backstory: I have a few Web 2.0 properties that I don’t really care about. I never post new content on them, and I never update them. Ever. Really. I never ever even look at them. But a long time ago in a land far, far away (circa 2005), I signed up for Text-Link-Ads because I thought maybe I could make some extra money by selling links on these properties that I don’t really do much with. And wouldn’t you know – I’ve been making about $25-$50 a month for the past several years from those sites. It’s been a lot of fun. I mean, it’s nothing to brag about, but it is some extra money. I’ll take it! [BTW just to be clear, I would never ever sell links on yourseosucks.com because I actually like this site, and I don’t want any ads/paid links on it simply because I don’t want anything screwing up this really crappy design I’ve got going here.]

Every few months, I’ll get a ‘link requested’ or ‘link cancelled’ email from TLA, and I LOL because it’s still funny to me that anyone would want to buy a link on any of my crappy sites that altogether get maybe 20 visitors each month. I shouldn’t laugh when they cancel their links, but I do because it’s funny to me that they bought them in the first place.

Most of the time when a link is cancelled, I leave it live on the site. I rarely take links down. I simply don’t care enough to actually go update those sites. So whoever bought links from me and then cancelled – you are probably still getting links on my sites for free. Not a bad deal, right? Wrong! It is a bad deal. Because now we live in the age of Penguin, and Penguin is rabid for bad links.

For the past 15 years, the SEO world has profited on the fact that Google likes links. Now it’s time to profit on the fact that Google Penguin hates bad links. It’s like everything has been reversed, or maybe it’s like everything has come full circle. I don’t know. I’m not good with analogies and metaphors. But I do know this: smart, opportunistic SEOs will use Google’s hatred of bad links to make even more money.

That’s right, folks: LINK REMOVAL IS SERIOUS BUSINESS. I predict that the next few years will be marked by a massive rise in the number of link removal services offered by independent SEMs, SEOs, and agencies alike. I don’t typically make predictions, so I’m kinda nervous about that one. But seriously, take a look at this email I got from TLA:

Link Removal Request Email from Text-Link-Ads.com

After the 5th day and 3rd notification we will remove your site from the marketplace!

Wow. That is serious! I’ve never received an email like that from TLA. Like I mentioned earlier, I always leave the links up – even after the person stops paying for them. Maybe I’ve missed these emails before, but I don’t ever remember seeing one from them with an ultimatum like “Take them down or else!” I feel like they are bullying me. Yeah. This is link removal bullying. Well, not really. But they’ve definitely taken a new approach to link removal requests. On the bright side, the email included a complete list of all the links I’ve never taken down. So I guess I’ll set aside some time to remove all of them. And there goes my $25-$50 per month that I was making. Thanks again, Google Penguin. Now how am I supposed to buy all those +1’s each month?!?

Now let’s get back to link removal and link removal services.

My theory: Text-Link-Ads.com didn’t change their link removal policy just because they had nothing better to do. Well, maybe this has been their policy all along, and I’ve never really noticed. Or maybe this is just the first time they’ve ever decided to enforce it. It doesn’t really matter. Regardless of the cause, this is the first time I have ever received an email about it. And this email was obviously catalyzed by the Google Penguin updates.

There are a ton of webmasters, link builders, and SEOs who are really scared of the Penguin. It is only logical that the same people who have been interested in building links are now interested in removing them. So, not only does a company like TLA have to provide a high level of service and support in the acquisition of links – they also have to add a new level of service and support dedicated to the removal of links (when advertisers stop paying publishers for them).

As noted earlier, when advertisers cancelled links on my sites, I just left the links up. I didn’t take them off my sites. Did that cost me money? Sure. Was I giving something away for free? Yeah. But I didn’t really care enough to spend time updating my sites every time a link was cancelled. It just wasn’t worth it to me. But now, with this new policy from TLA, I’ve got to remove links when they are cancelled…OR ELSE!

In the end, I think this is a good move on the part of Text-Link-Ads.com. Over the years, I have enjoyed using their services. Any time I’ve needed additional support, their customer support team has been very quick to respond and very helpful. So this new addition to their service makes me happy as a publisher. Ultimately, if you’re buying links from anyone, you should be able to have the links removed whenever you want.

I have worked with enough link publishers to know that they don’t really specialize in link removal. In fact, in my experience, the removal of links is the one thing that most link publishers couldn’t care less about, especially the link publishers who specialize in building private network links in bulk. I mean, seriously: how do you expect someone to remove 100,000 links? Even if they are on private networks, it’s pretty much impossible. It’s even more impossible when the links were built in comment threads, profile pages, and articles posted on sites not owned by the link publisher. So really, good luck with all of that. But that is exactly why I think it is a good thing that link publishers are taking the time to create processes that make link removal an easier thing to manage.

Site owners who were negatively affected by Google Penguin must come clean. Google Penguin will continue to be updated, and at some point, ALL site owners will have to come clean about their links and linkbuilding history. How many of the bad links do you need to remove? Well, according to this post and this thread, the answer is that 85% of the inorganic links need to be removed before you submit a reinclusion request. I’m not sure if that’s a solid number or if it will work for every situation. At any rate, Google wants to see you at least trying. In fact, at SMX Seattle last week, Matt Cutts said that he’s actually had webmasters sending in screenshots of their please-take-down-my-links emails to link publishers. Matt’s point: he wants to see some effort.

So good luck and Godspeed with your link building…err…removing!


Live Blogging SMX Advanced 2012: iSEO – Doing Mobile SEO Right

SMX Advanced 2012 (Seattle)

Cindy Krum, CEO MobileMoxie (@suzzicks)

Mobile Marketing: Finding Your Customers Wherever They Are

1. Separate Mobile Pages

– Mobile-specific design templates and content
– user agent detection, redirect from desktop to mobile

2. Responsive Design

– dual purpose pages for desktop and mobile
– same URLs for both versions

3. Mixed Solution (RESS)

Case Study: Info-tainment

DUST – duplicate URLs, same text

Mobilization Engine was creating lots of DUST

Kills the efficiency of the crawl

Since not all pages were mobilized, not all pages were redirected to a mobile version
– desktop pages still ranked in mobile search

Mobilization engine not caching and compressing pages correctly

– update feeds sent to the mobile engine to include SEO tags
– better server rules to reduce DUST
– Canonical tags up to the desktop URLs

Case Study: Doctor Directory

Mobile CMS driven templates in limited beta; wanted responsive design

Had launched hand-templated ‘m’ site but not happy with it

Beta mobile site does not contain directory pages
– concerned that deep pages would not get crawled or indexed very well

Current desktop pages too big to use responsive design effectively
– how many roundtrip DNS requests have to be made?

Move forward with current m. beta launch
– rely on UA detection and redirection from desktop to mobile
– test success and track performance
– New smartphone bot will get around the need for directory pages (eventually)

Smartphone bot relies on desktop rankings

Start a responsive design with server-side components

Case Study: e-commerce site

Improve mobile rankings to drive more mobile shopping and sales

Desktop content ranking really well in mobile search

WAP & smartphone sites

Clean, light mobile pages with mostly good redirects

Inconsistencies between robots.txt, canonical tags, and XML sitemaps

JavaScript error message was the first thing indexed and cached – became a description tag in SERPs

Misdirection – desktop pages ranking well & redirecting to blank error pages

Only 6 pieces of unique anchor text on the entire site

Almost all links on the site (internal links) have no anchor text

Fix parsing error in redirection rules

Stop JS error indexing by adding bot-specific instructions

Change linking on DIVs to linking on images and anchor text

Hard to get lots of ecommerce rankings on one search
– pushing Universal Search Results & App Results in Google to own more of the mobile SERP

Improve social interactivity on the mobile content
– better profile page rankings for the brand
– good for future of mobile search

MobileMoxie tools: SMXA2012

Bryson Meunier (@BrysonMeunier)

Mobile search will surpass desktop search in a matter of years…not decades

Many paths to mobile SEO

Desktop vs Mobile

Responsive Design

Smartphone, Tablet optimized

Site that works well on all devices

Disagreements within Google

Matt Cutts said mobile sites don’t cause canonicalization issues

Google Webmaster Team selected Responsive design for maintainability

Different search behavior requires different content to achieve user goals

Case study: Arby’s

Some categories need dedicated mobile sites

What content appears in smartphone search results?

Tested code validation, mobile usability, and page speed – 37 total factors

Validation is key? Only 1 site validated, so it’s not true that you need to be validated.

Mobile usability/page speed is helpful? — 65% of sites in the sample actually failed W3C’s Mobile OK test

72% got a score of bad (ready.mobi)

Linkbuilding is unnecessary? — Linkbuilding is necessary even on mobile

dotmobi helps? — not true. The distribution of TLDs resembles the desktop distribution of TLDs.

Mobile sites help ranking? — 64% of sites in the sample had mobile content…a mobile site or responsive design or both.

How do top sites approach mobile SEO? Top 100 SEMrush sites’ response to a smartphone Googlebot crawl:

– 83% redirect or reformat
– 10% not crawled
– 7% no response

What do they do?

– 60% redirect to a mobile URL

71% of top 100 SEMrush sites have a mobile URL, 10% have no mobile site

Data-driven mobile SEO best practices

1. Understand the differences between what mobile users want vs. what desktop users want

2. Build a mobile homepage at m.domain.com OR build a mobile-first, responsive-design-driven site if the goals are the same

3. Don’t block mobile URLs with robots.txt. Use canonical tags for duplicate URLs and serve smartphone URLs to the smartphone Googlebot, but make the mobile homepage unique so it can appear for mobile navigational searches
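Point 3 above, sketched concretely (everything here is illustrative – the domain is a placeholder, not from the session): a permissive robots.txt on the mobile host so bots can actually reach the mobile URLs.

```
# robots.txt on m.example.com – don't block the mobile URLs
User-agent: *
Disallow:
```

The duplicate-URL half of the tip would then be a canonical tag on each mobile page, e.g. `<link rel="canonical" href="http://www.example.com/page">`, pointing up to the desktop version.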

Next Up: Pierre Far from Google (@pierrefar), webmaster trends analyst

Smartphone sites and Google search

Recommendations for building smartphone-optimized sites

First choice: mobile site on same URLs or on different URLs

Are you going to serve same HTML or different HTML?

Responsive web design

Dynamic serving

Responsive web design

Same HTML
+ Same URL
++ CSS Media Queries

With responsive web design, there is an efficiency win because we don’t have to crawl your site with all of our different crawlers.

Please let all Googlebots access all your HTML

Responsive web design tips

– Max width 640px

– Allow all bots
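A minimal sketch of the CSS media queries these tips assume (the 640px breakpoint comes from the slide; the class names are made up):

```css
/* One HTML document for everyone; restyle it below 640px wide */
@media screen and (max-width: 640px) {
  .sidebar { display: none; } /* hide secondary content on small screens */
  .content { width: 100%; }   /* let the main column fill the viewport */
}
```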

Dynamic serving

Different HTML
+ Same URL

Separate mobile site

Different HTML
+ Different URL

We need you to annotate these pages (relationship annotation)

rel-canonical from m.domain to www.domain

on www.domain.com, rel-alternate pointing to the m. version

It’s a URL-level annotation

1. rel=”alternate” in sitemaps

2. Vary HTTP header if you automatically redirect (this is another signal to Google)

3. If not, understand trade-offs and pitfalls, and implement correctly (if you can’t get it right, use responsive website design instead)
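Roughly, the URL-level annotation Pierre describes looks like this (example.com is a placeholder; the media value mirrors the 640px guidance above):

```html
<!-- On the desktop page, http://www.example.com/page -->
<link rel="alternate" media="only screen and (max-width: 640px)"
      href="http://m.example.com/page">

<!-- On the corresponding mobile page, http://m.example.com/page -->
<link rel="canonical" href="http://www.example.com/page">
```

Per point 1 above, the rel=”alternate” entry can live in the desktop XML sitemap instead of the page markup.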

Q&A

Pierre says ranking factors are the same on mobile as on desktop. “I’m not going to say anything definitive because I know you guys will break it.”

iAcquire, Paid Links & SEO Noobs

It’s been a little over a year since the last big SEO scandal hit the news, so I guess we were due for a big story. And what do you know – IT’S ABOUT PAID LINKS! Good lord. Are we still outraged about people buying links? WTF, man. Get over it.

More paid links drama?!? SERENITY NOW!

The SEO industry is full of a bunch of amateurs right now. They think buying links is a black hat strategy. LOLz. If they knew anything about blackhat tactics, they would know that blackhatters are not paying for links. Not in the traditional sense anyway. And not the good blackhatters. Blackhat SEOs are too smart to spend all of their time building links by hand – they write software for that. Also, while blackhat SEOs operate in the gray area of SEO and Google’s guidelines, many of them also tend to operate in the gray area of the law. Furthermore, they are more interested in making money than giving it to someone else for a crappy link.

It’s pretty clear that the SEO industry has an abundance of ignorant SEO noobs. They actually have no idea WTF blackhat SEOs are doing these days. Paying for links is blackhat? Shut up and get back to your title tags and robots.txt files.

As far as calling out iAcquire – no shit iAcquire has a linkbuilding component to their business! They bought Conductor’s paid link business, and everyone knows about that. Big deal. Do a little research (i.e. go to LinkedIn), and you’ll find that iAcquire’s co-founders are Joe Griffin and Jay Swansson. These guys have been in the game a long time. Joe founded SubmitAWebsite and the search agency at Web.com. He was also a VP at iCrossing. And Jay Swansson used to work at Text-Link-Ads. He knows all about effective link building. These are two smart dudes who have a track record of producing results. Now everyone is under the impression that iAcquire is some evil blackhat company. That is a gross reduction of their company.

And Mike King – that guy is a fully registered SEO badass. He is the kind of person that most SEOs wish they could be. He’s got mad skills as a coder, developer, SEO, speaker, and blogger, and it’s like someone fine-tuned his brain to be an SEO. If you’ve ever heard him speak at conferences, you’d see what I mean. Also, Mike is someone who shares a lot of information about modern SEO strategy. Take a look at his post on SEOmoz.org called ‘The Noob Guide to Link Building’. Yeah. That’s some good stuff right there. I’m really not sure why Mike was even mentioned in Josh’s post.

Link building is not a bad thing. If I were approached by an SEO who didn’t have a linkbuilding strategy, I’d be like “GTFO! NOW!” Go ahead and call it ‘inbound marketing’ – it’s a service that everyone needs. It’s integral to SEO. It’s basically digital PR. Nowadays, ‘inbound marketing’ is the safest word in SEO. Some people use it as an umbrella term that covers paid link building, but it covers so much more than that. And here’s a shocker: nearly every aspect of inbound marketing involves getting links, and getting links is going to cost money. Press releases don’t write and distribute themselves. Someone has to connect with influencers and bloggers. Guest posts don’t write themselves. Building links is an art, and it can take on hundreds of forms. But at the end of the day, you’ve gotta do it if you want to succeed in organic search today.

Here’s a little secret: Every good SEO has a good link builder. Never forget that. It’s the reality today. But now that Penguin and Panda are out, a lot of SEOs are scrambling to find new linkbuilders and new linkbuilding tactics. A lot of the HPBL networks got nailed, and now those network owners are going even more private. And believe me: shit is about to get really expensive because the cost of intelligently building links on private networks is going to skyrocket. Content has to get better. Hosting has to get more sophisticated. And they’ve gotta erase any and all footprints. There can be no evidence.

Ultimately, with PageRank becoming less and less of a ranking factor, gaming it with links will increasingly prove less valuable over time. There will always be paid link networks, but eventually paid links will become too expensive to produce, manage, and maintain (for most link buyers). All the money will move over to content generation and social media, where buying Likes and Tweets and +1’s is much safer and affordable – and effective! But that’s an entirely different post. I’ll save that for some other time.

On another note: I’ve seen the term ‘myopic SEO’ flying around Twitter today. Myopic should not have a negative connotation. That isn’t fair. Some SEOs are myopic by nature because that’s how they survive in their niche. They are nearsighted because all they care about is making money RIGHT NOW! Today even! Those SEOs are probably some of the most industrious, entrepreneurial people you will ever meet. But guess what – they are not out there sharing their secrets, and you’ll probably never meet them.

In the past, SEOs would share their secrets at conferences, in forums, on blogs, etc… But around 2009/2010, SEO went back underground, and the people who know how to make quick money sealed their lips (and/or their keyboards). Oh sure. You may still catch people blabbing about their victories on forums here and there, but DM any of those forums’ senior members, and they’ll tell you that the forum is full of noobs and that it’s nowhere near what it used to be in terms of content. And if someone on a forum goes public with a strategy that works, 99% of the time all the readers will go out and beat it to death.

So if you’re not the myopic SEO, you probably fall into the group of SEOs who are interested in sustainable SEO. In reality, this bucket is where most SEOs reside. They promote the idea that SEO is a longterm strategy, where good guys finish first in the SERPs and Alt tags make a difference for your rankings. Sustainable SEO means that you are most likely working for a client where you are fitting into a larger marketing machine. You can’t act alone, and if you do something dumb, you could end up costing yourself and/or other people their jobs. SEO for this group is a longterm strategy, and you’d better get buckled in because it’s going to be a long flight.

Sustainable SEO is the place where you have to be really smart about link building. You might even call it white hat link building…which is kind of an oxymoron. 2011 was an amazing year for building links via private blog networks. You can’t rely on that any longer. You also can’t go after only unbranded anchors. You’ve gotta diversify. When it comes to paid link building, there’s a lot of stuff to consider these days. Do your research and be smart. You don’t want the Penguin catching you.

BTW here’s a great post Danny Sullivan wrote about all of this. I’m really looking forward to Aaron Wall’s take on it, too.

Well, that’s my rant. Smell ya later.

PageRank Update: November 8-9, 2011

Last night, around 11:45pm PST, I started noticing that the PageRank had increased for most of my sites. Or it stayed the same. I haven’t seen the PageRank for any of my sites drop. That’s for sure.

Probably the best part about this PageRank update is that no one is really talking about it. I’m at PubCon in Las Vegas with some of the best SEOs around, and no one has even mentioned it. I’ve been wondering when the day would come where people would lose interest/fascination with PageRank. Are we there yet? Maybe.

Some Big Websites Suck at Non-WWW to WWW Redirects

Alright, folks. Today we’re going to talk about PageRank. Oh, I know. It’s dying and/or dead. Ok. Awesome. But it’s still a metric that we can use… at least a little. Like in this post, which happens to be about people not correctly redirecting their homepage URL from the non-www to www version.


I’m going to assume you know the advantages of redirecting the non-www version of your site to the www version of the site. Across all pages. Or maybe you prefer to go the other way – swim upstream like I do with this site’s URLs – and redirect the www version to the non-www version. It’s all about eliminating duplicate content and making sure every link is most effectively attributed to the ‘official’ URL for your product pages, category pages, etc… Blah blah blah.
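For the record, here’s what the non-www to www version typically looks like in an Apache .htaccess file – a minimal sketch, assuming mod_rewrite is enabled (example.com is obviously a placeholder; flip the condition if you swim upstream like me):

```apache
# Send every non-www request to the www hostname with a single 301
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```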

Here are some sites that do this very efficiently and effectively (and their respective PageRank values):

So fresh and so clean clean: Awesome examples of non-www to www via 301 redirects

It’s so nice to see this being done correctly. And these are some bigtime brands. Well done, big-brand sites. Well done.

In March 2010, Matt Cutts dropped some knowledge on us:

Note: in a follow-up email, Matt confirmed that this is in fact the case. There is some loss of PR through a 301.

So really, we shouldn’t think of this non-www to www redirect method as something that is going to pass all of our linkjuice and PageRank through to the final URL. It simply doesn’t work that way. However, it still passes some PageRank AND it helps reduce duplicate content. And that’s good enough for me.

So…the awesome examples were just the beginning of this post. Now we’re going to see some websites that are losing a lot of link juice from doing it all wrong.

First, let’s take a look at sites that do not use any type of non-www to www redirect. And again, these are some big brand names right here:

These sites need to join the party: Go ahead and 301 those non-www's already!

Wow! Sprint has a PR8. That’s impressive. But what would it be if you 301’d the 4,280 links that are currently pointing to the non-www homepage URL? If you wanted to go buy 4,280 links, that would be pretty expensive. But you could pass all that linkjuice to the www version of your site – FOR FREE! How about that? Those 4,280 links are 4% of your site’s total links! And you could at least get some of that linkjuice. [BTW I got the external link numbers from SEOmoz’s Open Site Explorer tool.] The point is: returning a 200 OK for both your non-www and www homepage URLs isn’t terrible, but it’s not up-to-date with fundamental SEO principles.

Now here are some sites that are doing the non-www to www redirect, but these sites all share the honor of having used the dreaded 302 temporary redirect. In general, the 302 redirect is the ‘Voldemort’ of SEO. You really don’t want to be caught mentioning it – ever. The 302 has its place, but it certainly is not needed in this conversation. Mainly, we don’t want to use a 302 for this because it does not pass any linkjuice. So here are some big sites that are using a 302 redirect from the non-www to the www version of their sites:

I won't tell anyone. Just please change that to a 301 redirect. Already even!

Really, Apple? Really, Costco? Really, Walgreens? I’m just going to assume that you all have awesome SEOs who know what they are doing. Obviously there is a perfectly good reason for your sites to be using the 302 redirect for non-www to www. It’s probably classified. I’m going to give you the benefit of the doubt. Regardless, I still recommend that you update that redirect to a 301. It would make me happy. Because I’m an SEO nerd, and even more specifically – I want to see Apple at a PR10. I mean, maybe the linkjuice from another 114,524 links would help get it to PR10. I dunno. But it certainly wouldn’t hurt.

Let’s dive into some other sites that are really screwing the pooch. As you will see in the following screenshot, some websites prefer to use *double* redirects and, yes, even *triple* redirects to get from the non-www to the www version of the homepage URL:

Chain chain chain: Redirect chain of fools.

To me, the double redirects and triple redirects look like a chain. Just imagine all the PageRank that is lost, siphoned off with each redirect. It’s terrible. Simply terrible. Verizon, I am a customer of yours. In general, I think your service is great. But it pains me to see a double 302 redirect on your homepage. Please fix that. Please. For the love… This is just depressing.
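The fix for a chain is to issue one 301 straight to the final URL instead of letting rules stack. A hedged sketch (hostnames and hops are placeholders, not Verizon’s actual setup):

```apache
# Bad:  example.com --302--> example.com/home --302--> www.example.com/
# Good: one hop, one 301, straight to the canonical homepage
RewriteEngine On
RewriteCond %{HTTP_HOST} !^www\. [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```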

Finally, I found a website that is guilty of something that I can barely bring myself to discuss. From a usability standpoint, it makes my blood boil with the rage of 1,000 Super Bowl-week ice storms. This site does not load when you visit their non-www homepage URL:

The unforgivable sin of usability: homepage URL unreachable

Okay. Hahaha. Get your mind out of the gutter. BJ’s is a Fortune 500 company and a wholesale club. Now, redirecting the non-www to www is one thing. But in this case, basic web design and usability principles override SEO practices. It’s imperative that they at least get the non-www version of their homepage to return the actual homepage of their site, even if it is a 200 OK. Even that would be better than the current scenario. I urge everyone to make sure this is not happening with your site. Please go check. You’ll feel better if you do. I promise.

Well, that’s pretty much it. In conclusion:

  • 301’s are good, and 302’s are generally bad
  • non-www to www redirects can help increase your homepage URL’s PageRank
  • double redirects and triple redirects are not good
  • bjs.com is a respectable Fortune 500 wholesale company

Now get out there and take a look at your .htaccess file(s)!

Cheers!

Monitoring Historical PageRank Trends & Changes

With today’s Google PageRank update, I had to dust off a few old Excel docs. Actually, I pretty much check the PageRank for all of my sites every couple of months, so there was no ‘dusting off’ involved.

I know, I know. PageRank is dying and/or dead. And it’s Toolbar PageRank, so it’s not even current data. In fact, looking at Toolbar PageRank is a lot like looking into a telescope that is pointed at the center of our universe. Essentially, you are looking back in time. Don’t be surprised if Emmett Brown jumps into view with a flux capacitor, offering to have Mr. Fusion eat all your garbage.

But seriously, I thought I would share the method that I use to visually monitor historical PageRank changes and trends for a set of URLs:

Monitoring Historical PageRank Changes & Trends

Yeah. I know. It looks like a weird game of Minesweeper (SEO version). But it’s easy to set up, using conditional formatting, and it really helps to quickly identify Toolbar PageRank lottery winners and losers.

Now, do I freak out if some of my sites drop in PR? Not at all. What about the URLs that move up in PageRank? Well, I do take a little joy in that. But overall, I really don’t use this data to make any major SEO decisions. And I wouldn’t recommend using PageRank as a KPI for your SEO campaigns. However, this data can be used for diagnostic purposes if something totally random occurs with your PageRank. For instance, if you see a massive PageRank drop (e.g., PR4 → PR1), then I recommend you find out why that might have happened, as it could be affecting your overall results.

Let me know if you have any other ways of monitoring PageRank trends. I’d love to hear about them.

Cheers!

PS. I know that ‘-1’ is not a valid PR value. I use ‘-1’ instead of ‘PageRank Unavailable’. It makes it easier to sort. Boom. Roasted.

5 Common SEO Questions Answered by Google’s Matt Cutts

In my experience as an SEO, there are several questions that pop up quite often. Matt Cutts has answered these questions, so I thought I would post his answers on this awesome SEO blog of mine.

  1. Should I use pipes or dashes in my title tags?
    “I think they’re both viewed as separators, so I think either one should be fine. Dashes are a lot more common…We definitely handle dashes well. I would expect that we handle pipes well as well.” – Matt Cutts
  2. Should I use underscores or hyphens in URLs?
    “It does make a difference. I would go with dashes or hyphens if you can. If you have underscores and things are working fine for you, I wouldn’t worry about changing your architecture.” – Matt Cutts
  3. Can the geographic location of a web server affect SEO?
    “Yes it does because we look at the IP address of your web server. So if your web server is based in Germany, we’re more likely to think that it’s useful for German users…We also look at TLD…If you want to experiment, you can certainly try switching the geographic location of your web server [in Google Webmaster Central], which is essentially changing your IP address…” – Matt Cutts
  4. Is excessive whitespace in the HTML source bad?
    “Um. We really don’t care that much…Any time we see white space, we’ll separate stuff. And we can ignore white space, so it doesn’t really cause us a lot of harm either way…As long as you’re doing normal, reasonable stuff, I wouldn’t worry about it that much.” – Matt Cutts
  5. Does the position of keywords in the URL affect ranking?
    “It does help a little bit to have keywords in the URL. It doesn’t help so much that you should go stuffing a ton of keywords into your URL.” – Matt Cutts

There. I hope you enjoyed that. Thanks to Matt Cutts for answering these questions for us all. I’m just going to send any and all clients to this blog post from now on.

Matt Cutts Debuts Shaved Head at SES San Jose 2009

Shaved Head Duo: Greg Boser & Matt Cutts look tough for the Live Site Clinic session at SES San Jose 2009

Well, here we are. It’s Thursday, August 13, 2009. SES San Jose 2009 is coming to an end. What did I think of the conference? Actually, it was fairly quiet. I wasn’t able to catch many sessions, but the sessions I was able to attend were relatively mild. It was kinda quiet, and the Expo Hall was not very crowded. In years past, SES San Jose has been a place where breaking news from the search engines created a buzz or some other story had the SEM world in a tizzy. Sure, the Google Webmaster Blog announced the Caffeine update on Monday, but that did not seem to really have an impact on the SES crowd. I surfed around the sandbox for a while, noticing some changes in the top results. However, it got kind of boring after a while. I even found this cool tool that allows you to see side-by-side results from the sandbox and the current Google. But it, too, was nothing to write home about.

Furthermore, I did not hear anyone talking about the recent Bing/Yahoo merger deal and how it might affect SEO. I heard nothing about the recent Facebook acquisition of FriendFeed. The site clinic with Matt Cutts was very funny, and I’m not used to that. I mean, Matt’s a great guy, and his banter with seasoned SEOs is always hilarious. But in the end, those sessions with Matt Cutts typically devolve into in-your-face questions about nofollow tags and disclosing paid links (thanks to Michael Gray, who is also awesome, btw).

Danny Sullivan wasn’t there, and I didn’t see Rand Fishkin or Dave Naylor there. Could it be that allegiances have been sworn and sides have been picked? (Stay tuned for UFC 103: SES vs. SMX! ) I had a few drinks with Greg Boser and Todd Friesen, and Todd Malicoat’s charity party was awesome. But overall, the conference was nothing to write home about. For me, it was really a great chance to enjoy the weather in California. My love affair with this State continues…

Anyways, there were still a few highlights.

  • Social is killing SEO
    This is such bullshit. Stupid claims and rumors driven to gain readers or followers – that’s all this really is. If you consider that SEO is an entire industry based on driving free traffic to a website, then there is nothing to fear from social media and social search. Social is not a replacement for SEO; it’s a supplement and a complement to it. In fact, whenever ‘social’ comes up in sessions, it is usually followed by a conversation about how to optimize it (i.e., SEO). Social and SEO are all part of the same game. Can’t we all just get along? [Note: Personally, I believe claims like this are happening because people are bored. We’re in the middle of an industry news lull. I think big things are on the horizon, but I’m sick of hearing about social media.]
  • Matt Cutts shaved his head
    From Blogoscoped.com: Matt says “I bet my team that they couldn’t meet a certain turnaround speed for an entire quarter. They were able to maintain that turnaround time for the whole quarter, so they got to do whatever they wanted to my hair. :)” I asked Matt about his hair before the live site clinic session, and he was all smiles about it. Matt’s a good sport. During the session, he addressed a sex-toy website, and his back-and-forth with Greg Boser carried the rest of the session. I’m glad that guy from MyPleasure was there. It made the session hilarious.
  • Matt Cutts says not to use nofollow attributes on internal links
    Matt said this during the live site clinic session. Okay. Uh…should we follow this advice (no pun intended)? This news was big at SMX Seattle back in June. I feel like the dust is still clearing. I might wait a little bit longer before I follow Matt on this one.
  • Tim Ash handed out cash in his session on landing page optimization
    Tim Ash wrote the book on Landing Page Optimization. Literally. He has a book called Landing Page Optimization. I had a chance to meet him. Super nice guy. During his session, he asked questions and handed out $10 and $20 bills to people just for attempting to answer them. I have never seen anything like that before. His session was awesome, and it’s really the next big frontier for SEM. From my experience, conversion optimization and landing page optimization are falling under the SEO umbrella, but they really deserve their own category. He said that for every $80 spent on driving visits to websites via PPC, only $1 is spent on landing page optimization. This is tragic. After spending years optimizing our PPC and SEO campaigns, it’s definitely time to focus on what our visitors see when they get to our sites. It’s the natural, organic next step (no pun intended).
  • Clay Shirky’s Keynote: Here Comes Everybody
    Clay Shirky is a great speaker. If you’re even remotely interested in human behavior and interaction with technology, I definitely recommend checking out his book (Here Comes Everybody), and I certainly recommend catching one of his sessions.

That’s it for now. Sorry it’s kind of a half-assed post today. I only have 30 minutes before I have to get on a plane. Cheers!

The SEO Impact of the Microsoft Bing Yahoo Search Merger

As you have all heard, Yahoo gave up today. Epic give up. Danny Sullivan wrote a great eulogy over at Search Engine Land, and Jason Calacanis said, “Yahoo committed seppuku today.” And over at TechCrunch, “Today, Yahoo died as a search engine.” To make it more depressing – I have already seen name mashups like YooBing, BingYoo, YaBing, Bingoo, BingYah, MicroHoo, BingYa, etc…

I’ve got to admit: Today, I actually teared up a little. I remember surfing Yahoo in 1995, looking for Cliffs Notes for Grendel. Yahoo was the only place to go. I mean, search is a space that Yahoo created! WTF are they doing by throwing in the towel? Yahoo had 20% market share in search. What were they thinking? Obviously, this deal benefits Microsoft more than it benefits Yahoo. So sad…

How about the SEO impacts of the merger of Yahoo and Microsoft Bing? There are a few that come to mind:

  • Will the link: and linkdomain: search operators continue to work on the new Yahoo?
    A few years ago, MSN disabled the link: and linkdomain: search operators at msn.com. This was an important day because you could no longer check MSN’s index for a site’s inbound links. The operators came back about a year later, and then went away again. If you haven’t noticed, Google’s link: operator sucks. Google doesn’t want you to know all the links for a site, so the link: command on Google always returns an extremely low, inaccurate number of inbound links. Those bastards! (If you have a Google Webmaster Central account for your site(s), you can see somewhat more honest inbound link data.) With MSN and Google not providing any worthwhile backlink data, we have been forced to use Yahoo’s linkdomain: operator, whose backlink data is much more honest and accurate. For the most part, checking backlinks is great for two purposes: 1) checking your own site(s)’ backlink quantity, and 2) checking your competitors’ backlinks. If Yahoo switches to MSN’s search algorithm and the linkdomain: operator is disabled, it’s going to be really tough to check your competitors’ backlink growth. Furthermore, it will be tough to tell if they are buying links. I’m not into reporting people for buying links, but if you are, you may want to invest in a new backlink tool.
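    While the linkdomain: operator is still alive, pulling competitor backlink queries is easy to script. A minimal sketch: the query syntax (`linkdomain:` plus `-site:` to exclude a site’s internal links) is Yahoo’s classic operator format, but the endpoint path and helper names here are my own assumptions, not an official API:

    ```python
    from urllib.parse import quote

    def linkdomain_query(domain):
        # linkdomain: finds pages linking to the domain;
        # -site: excludes the domain's own internal links
        return f"linkdomain:{domain} -site:{domain}"

    def yahoo_search_url(domain):
        # Hypothetical endpoint path; adjust if Yahoo moves it post-merger
        return "http://search.yahoo.com/search?p=" + quote(linkdomain_query(domain))

    for competitor in ["example.com", "competitor.net"]:
        print(yahoo_search_url(competitor))
    ```

    Point this at your competitor list every month or so and you at least have a paper trail of their backlink growth before the merger settles.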
  • What happens to the Yahoo Search Directory?
    This one is interesting. As the Yahoo Directory (dir.yahoo.com) is a money-maker, I can’t imagine Yahoo or Microsoft getting rid of it. However, you may recall that MSN once had a Small Business Directory (archive view) at sbd.bcentral.com. There were thousands of sites in that directory. Now that site redirects to the MS OfficeLive website. If MSN got rid of their own directory, what might they do with the Yahoo Directory? It’s a good question. In terms of link authority and trust, the Yahoo Directory is the #2 directory behind DMOZ. But unlike DMOZ, you can pay $299 per year to be in the Yahoo Directory. It’s a highly respected directory, and unlike with DMOZ, you won’t have to wait months on end with no answer. As long as you have a good site, you can get into the Yahoo Directory for $299 per year. But if the Yahoo Directory is discontinued…holy crap. That is a lot of link juice that will just evaporate. A lot of sites will lose quality historical links. Maybe it will shake things up a bit. Maybe not. Either way, you may want to make sure your sites find their way into other trusted directories, like business.com and botw.org.
  • What happens to Yahoo’s feeds programs, such as Paid Inclusion and SSP?
    Yahoo’s Paid Inclusion and Search Submit Pro (SSP) programs are crucial traffic and revenue sources for many search agencies and online retailers. It’s a huge business, and without it, many online retailers would see massive drops in revenue. Essentially, these programs allow you to pay for organic rankings in Yahoo on a pay-per-click basis. You may not know this, but for several years, MSN used Google and Yahoo to power its search. It wasn’t until Live Search launched that MSN actually broke away from Yahoo and Google. During the time MSN was using Yahoo for its search platform, Yahoo feeds were showing up in MSN results. But when Live Search launched on 9/11/2006, MSN no longer had a feeds program. They didn’t use one for Live Search, and there is currently no feeds program for Bing. As Yahoo’s Paid Inclusion and SSP programs are critical components for agencies and retailers alike, it should be a no-brainer to keep the programs active as part of the Microsoft search platform. But I guess we’ll have to see what happens.

    Update: Yahoo Search Submit Pro (SSP) Discontinued Effective Dec. 31, 2009

  • What happens to Yahoo channels such as Yahoo Shopping and Yahoo Travel?
    This is really just an extension of the previous question, but these channels are huge sources of traffic. Be sure to keep an eye on what happens with them. Yahoo Shopping is HUGE. I mean HUGE!!!! I can’t imagine anything happening to it. I can’t even imagine them merging it with Bing Shopping or Bing Cashback. That would be stupid.
  • One less searchbot crawling the internet
    We’ll miss you, Yahoo Slurpbot. You traveled long. You traveled far. You did your job without complaining once. You were a true soldier. RIP, Slurpbot.
  • Rank checkers will have one less engine to check
    Whereas you were probably checking Google, Yahoo and Bing for rankings, now you’ll just have to run your keyword lists across Google and Bing. And honestly, some website owners might be happy with the results, as many sites rank much better in Bing than in Yahoo.
  • Ask.com quietly moves into position as the #3 search engine
    Ha! I still miss Jeeves. He was a trooper. But also be aware that LBi Netrank has some data showing that Ask.com is (sometimes) scraping Google for search results. Are we losing Yahoo and Ask?!?!?!
  • Optimizing for Google and Bing at the same time
    It kinda sucks, but SEOs get accused of only optimizing for Google. It happens all the time, and all we have to do is point to the fact that Google has a 70% market share (and depending on the vertical, it’s sometimes higher). And then clients remind us that Yahoo has 20% and Microsoft Bing has 8%. At that point, we continue to point at Google’s 70%. But now, according to comScore, Yahoo’s 20% and Bing’s 8% will combine for 28% of the market. That’s nothing to sneeze at, so we have to focus on both Google and Bing. While both respond to strategic SEO methods, it is worth noting that Google gives more weight to links and Bing gives more weight to a site’s domain name (i.e., you’d better have keywords in your domain name and URL!). That said, in my experience, Google and Bing are a lot closer than people think in how they value traditional SEO methods. Keywords in the title tag, keywords in the domain, organized site structure, updated content with decent keyword density, optimized internal links, inbound link growth – both engines reward these methods, as they are the basic building blocks of an SEO campaign. All three engines share these signals, but my experience leads me to believe that Google and Bing reward them more quickly and predictably than Yahoo did. Furthermore, seeing how Bing is pulling more content into its search results pages, you may want to pay extra attention to how your content is optimized and arranged on your site’s pages.

Well, that’s all for today. We’ll miss you, Yahoo. I’m still upset. I hope you find happiness. I know we can be friends again some day in the future, but please don’t call me now. I need some time to get over you.

Update:

  • John Battelle: Questions on the Yahoo Bing Deal (link)
  • SEOmoz: Top 10 Things the Microsoft/Yahoo! Deal Changes for SEO (link)

SEO Fail: WolframAlpha.com Not Loading without WWW

To be clear, I’m excited about Wolfram|Alpha and its potential as a game changer for the internet and the search industry. WolframAlpha.com is a fun concept, and I think it will catch on over time. I laughed when I typed in hello, and it responded with Hello, human.

Wolfram|Alpha: Hello, human.

Despite the fact that I really like this website, I still have to call them out for one thing: a network timeout when users leave off the www for wolframalpha.com. If you navigate directly by typing www.wolframalpha.com, the site loads correctly. However, if you simply type in wolframalpha.com, you get a network timeout error:

Wolfram|Alpha: Network Timeout Error

The SamSpade header status checker reports a Socket Error when trying to retrieve http://wolframalpha.com:

SamSpade returns a Socket Error for http://wolframalpha.com

I see this error a lot on retail websites. Marketers are typically unaware of it because they are more concerned with search and display links malfunctioning. But what if people navigate directly to your site and see a network timeout screen? It’s terrible for user experience. It confuses customers. From an SEO standpoint, it’s awful because the non-www version of your homepage usually has a ton of inbound links and a relatively high Google PageRank. The worst part is that this error is typically caused by some box that didn’t get checked or unchecked in the server settings. It’s a damn shame is what it is. You are throwing away all of those inbound links and some direct navigation traffic. This type of thing makes me sad, so go forth and make sure it’s not happening on your site!
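The usual fix is a server-level 301 redirect from the bare domain to the www host (or vice versa – just pick one canonical host). As a minimal sketch of that logic, assuming a Python-served site (the host name and function names are mine, not Wolfram’s actual setup), here it is as a tiny WSGI middleware:

```python
CANONICAL_HOST = "www.example.com"  # hypothetical; substitute your canonical host

def canonical_redirect(app):
    """WSGI middleware: 301 any request on a non-canonical host to CANONICAL_HOST."""
    def wrapper(environ, start_response):
        host = environ.get("HTTP_HOST", "")
        if host and host != CANONICAL_HOST:
            # Preserve the requested path so deep links keep working
            location = "http://{}{}".format(CANONICAL_HOST, environ.get("PATH_INFO", "/"))
            start_response("301 Moved Permanently", [("Location", location)])
            return [b""]
        return app(environ, start_response)
    return wrapper
```

The 301 (rather than a 302) matters here: it tells Google to consolidate the non-www link juice onto the www version instead of splitting PageRank between two URLs. On Apache or IIS the same rule is a one-liner in the server config, which is exactly the checkbox that usually gets missed.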

UPDATE: A few minutes after this post was published, the error was resolved. Coincidence? Definitely. I mean, seriously, who reads this blog anyway?