iAcquire, Paid Links & SEO Noobs

More paid links drama?!? SERENITY NOW!

It’s been a little over a year since the last big SEO scandal hit the news, so I guess we were due for a big story. And what do you know – IT’S ABOUT PAID LINKS! Good lord. Are we still outraged about people buying links? WTF, man. Get over it.

The SEO industry is full of a bunch of amateurs right now. They think buying links is a blackhat strategy. LOLz. If they knew anything about blackhat tactics, they would know that blackhatters are not paying for links. Not in the traditional sense anyway. And not the good blackhatters. Blackhat SEOs are far too smart to spend all of their time building links by hand; they write software for that. Also, while blackhat SEOs operate in the gray area of SEO and Google’s guidelines, many of them also tend to operate in the gray area of the law. Furthermore, they are more interested in making money than giving it to someone else for a crappy link.

It’s pretty clear that the SEO industry has an abundance of ignorant SEO noobs. They actually have no idea WTF blackhat SEOs are doing these days. Paying for links is blackhat? Shut up and get back to your title tags and robots.txt files.

As far as calling out iAcquire – no shit iAcquire has a linkbuilding component to their business! They bought Conductor’s paid link business, and everyone knows about that. Big deal. Do a little research (i.e. go to LinkedIn), and you’ll find that iAcquire’s co-founders are Joe Griffin and Jay Swansson. These guys have been in the game a long time. Joe founded SubmitAWebsite and the search agency at Web.com. He was also a VP at iCrossing. And Jay Swansson used to work at Text-Link-Ads. He knows all about effective link building. These are two smart dudes who have a track record of producing results. Now everyone is under the impression that iAcquire is some evil blackhat company. That is a gross oversimplification of their company.

And Mike King – that guy is a fully registered SEO badass. He is the kind of person that most SEOs wish they could be. He’s got mad skills as a coder, developer, SEO, speaker, and blogger, and it’s like someone fine-tuned his brain to be an SEO. If you’ve ever heard him speak at conferences, you’d see what I mean. Also, Mike is someone who shares a lot of information about modern SEO strategy. Take a look at his post on SEOmoz.org called ‘The Noob Guide to Link Building’. Yeah. That’s some good stuff right there. I’m really not sure why Mike was even mentioned in Josh’s post.

Link building is not a bad thing. If I were approached by an SEO who didn’t have a linkbuilding strategy, I’d be like “GTFO! NOW!” Go ahead and call it ‘inbound marketing’ – it’s a service that everyone needs. It’s integral to SEO. It’s basically digital PR. Nowadays, ‘inbound marketing’ is the safest word in SEO. Some people use it as an umbrella term that covers paid link building, but it covers so much more than that. And here’s a shocker: nearly every aspect of inbound marketing involves getting links, and getting links is going to cost money. Press releases don’t write and distribute themselves. Someone has to connect with influencers and bloggers. Guest posts don’t write themselves. Building links is an art, and it can take hundreds of forms. But at the end of the day, you’ve gotta do it if you want to succeed in organic search today.

Here’s a little secret: Every good SEO has a good link builder. Never forget that. It’s the reality today. But now that Penguin and Panda are out, a lot of SEOs are scrambling to find new linkbuilders and new linkbuilding tactics. A lot of the HPBL networks got nailed, and now those network owners are going even more private. And believe me: shit is about to get really expensive because the cost of intelligently building links on private networks is going to skyrocket. Content has to get better. Hosting has to get more sophisticated. And they’ve gotta erase any and all footprints. There can be no evidence.

Ultimately, with PageRank becoming less and less of a ranking factor, gaming it with links will keep losing value. There will always be paid link networks, but eventually paid links will become too expensive to produce, manage, and maintain (for most link buyers). All the money will move over to content generation and social media, where buying Likes and Tweets and +1’s is much safer, more affordable – and effective! But that’s an entirely different post. I’ll save that for some other time.

On another note: I’ve seen the term ‘myopic SEO’ flying around Twitter today. Myopic should not have a negative connotation. That isn’t fair. Some SEOs are myopic by nature because that’s how they survive in their niche. They are nearsighted because all they care about is making money RIGHT NOW! Today even! Those SEOs are probably some of the most industrious, entrepreneurial people you will ever meet. But guess what – they are not out there sharing their secrets, and you’ll probably never meet them.

In the past, SEOs would share their secrets at conferences, in forums, on blogs, etc… But around 2009/2010, SEO went back underground, and the people who know how to make quick money sealed their lips (and/or their keyboards). Oh sure. You may still catch people blabbing about their victories on forums here and there, but DM any of those forums’ senior members, and they’ll tell you that the forum is full of noobs and that it’s nowhere near what it used to be in terms of content. And if someone on a forum goes public with a strategy that works, 99% of the time all the readers will go out and beat it to death.

So if you’re not the myopic SEO, you probably fall into the group of SEOs who are interested in sustainable SEO. In reality, this bucket is where most SEOs reside. They promote the idea that SEO is a longterm strategy, where good guys finish first in the SERPs and Alt tags make a difference for your rankings. Sustainable SEO means that you are most likely working for a client where you are fitting into a larger marketing machine. You can’t act alone, and if you do something dumb, you could end up costing yourself and/or other people their jobs. SEO for this group is a long game, so you’d better get buckled in because it’s going to be a long flight.

Sustainable SEO is the place where you have to be really smart about link building. You might even call it white hat link building…which is kind of an oxymoron. 2011 was an amazing year for building links via private blog networks, but you can’t rely on that any longer. You also can’t only go after unbranded anchors. You’ve gotta diversify. When it comes to paid link building, there’s a lot to consider these days. Do your research and be smart. You don’t want the Penguin catching you.

BTW here’s a great post Danny Sullivan wrote about all of this. I’m really looking forward to Aaron Wall’s take, too.

Well, that’s my rant. Smell ya later.

Greg Boser Keeps It Real at PubCon 2011

SEO in 2011

1. Site Quality vs. Page Quality
Google is no longer “page” focused
The days of Google determining what will or won’t rank primarily based on page-level analysis are gone. Aka Panda.

2. Human Engagement Signals
Stole some slides from Rand Fishkin, showing Facebook shares are highly correlated with Google rankings
Google acquires PostRank, analytics service for the social web (with lots of historical data)
Google knows when no one is reading a page, so it doesn’t matter much if your link is on that page.

3. Brute Force is Dead
Unnatural anchor text can hurt you.
Large volumes of exact match anchor text are not how people link in the real world (quick sketch of checking this after this list).
– automated filtering
– automated correction
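
Since that exact-match point is really about ratios, here is a minimal sketch (the anchors and keyword are made up, not real backlink data) of estimating what share of a backlink profile’s anchor text exactly matches a target term:

```python
# Minimal sketch: estimate what share of a backlink profile's anchors are
# exact-match for a target keyword. The anchors below are made-up examples;
# in practice you'd pull them from your link report.
from collections import Counter

def exact_match_share(anchors, keyword):
    counts = Counter(a.strip().lower() for a in anchors)
    return counts[keyword.lower()] / sum(counts.values())

anchors = ["cheap widgets", "cheap widgets", "this great post",
           "www.example.com", "cheap widgets", "click here"]
print(f"{exact_match_share(anchors, 'cheap widgets'):.0%} exact-match anchors")
```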

4. Extreme Localization
Google goes insane!
Oct-2010, everyone has a location.
Points out that he has clients with double-digit clickthrough rates at positions 9-15…because of the maps in the middle.

Change is Good.
– embrace it.
– find new areas of opportunities.

Our job is to understand how humans behave while they are looking for information.

Read the leaked Google doc about the search quality raters’ guidelines.

Focus for the Agency
– learn to say no
– autonomous solutions
– new products/services to solve common problems

You can’t do it like you used to. You can’t just hand over small wins every so often. You need to have stellar results. Otherwise, you won’t grow as a business. And you’ll be stuck with those clients who pay you a lot of money and don’t listen to you.

Build project lists that outline problems along with solutions.

Focus for the Client

– lose the tie!
– don’t be afraid to go rogue.
– proof-of-concept is the key
– embrace open graph and individual identities for your employees

Q&A:

– If you’re using Open Graph and your site gets hit with Panda, does Google then also apply that penalty to your profile?
– How are you talking to your clients about how to use Facebook right now?

PageRank Update: November 8-9, 2011

Last night, around 11:45pm PST, I started noticing that the PageRank had increased for most of my sites. Or it stayed the same. I haven’t seen the PageRank for any of my sites drop. That’s for sure.

Probably the best part about this PageRank update is that no one is really talking about it. I’m at PubCon in Las Vegas with some of the best SEOs around, and no one has even mentioned it. I’ve been wondering when the day would come where people would lose interest/fascination with PageRank. Are we there yet? Maybe.

SMX Advanced 2011: Mega Session: SEO Vets Take All Comers

Moderator: Danny Sullivan, Editor-in-Chief, Search Engine Land

Q&A Moderator: Michelle Robbins, Director of Technology, Third Door Media, Inc.

Speakers:

  • Alex Bennert, In House SEO, Wall Street Journal
  • Greg Boser, SVP of Search Services, BlueGlass Interactive, Inc.
  • Bruce Clay, President, Bruce Clay, Inc.
  • Vanessa Fox, Contributing Editor, Search Engine Land
  • Todd Friesen, Director of SEO, Performics
  • Stephan Spencer, Founder of Netconcepts, Co-author of The Art of SEO, StephanSpencer.com

This year we have one hundred and eight trillion years of SEO experience. Danny gives nice introductions.

Boser says that he’s not technically an SEO anymore. Yeah, right.

What are the most useful social share buttons?
— Bruce Clay says that the Google +1 button adds about 2 seconds to each page’s load time.
— Boser: There’s nothing worse than a site with social sharing buttons that all say zero.
— Alex says that all her blogs are on the WP platform, and they do a lot of testing on which buttons are best for users.
— Vanessa: Be sure to monitor what the buttons add to your page speed. And there are also several things to look for when adding third party widgets. Check out the “perceived load time” for your pages after adding buttons.
—  Stephan recommends “High Performance Web Sites” and “Even Faster Web Sites” by Steve Souders, the author of the YSlow plugin

How’s social sharing affecting traffic or SEO practices?
— Alex: not a ton of impact. Twitter has more impact, especially for breaking news stories (with use of hashtag clusters).

Canonical tag, index/nofollow on pagination? Discuss.
—  Todd: There’s the Google version and the version that works.
—  Vanessa: There are actually two issues that get combined, and that clouds the discussion. Article pagination is one topic, and the SEs didn’t intend the canonical tag to be used in that case. You could use a noindex on pages 2 and 3 if you don’t want users to land on them, but Maile says that unique content on pages 2 and 3 could get you more traffic for keywords that aren’t on page 1. Onsite search result pages are the other issue, and Vanessa says to use noindex on those.
— Boser: We always use it on WP category pages. The cool thing about the canonical is that it can act like a 301 but Google gets to see the content.
— Bruce: Google may ignore the canonical or they may ignore the page itself. (Vanessa and Stephan dispute this.)
—  Boser proposes a canonical tag tag. Brilliant.
— Stephan: noindex implies follow and still passes link authority, so it’s a good way to keep pages out of the index while still passing link juice. Also, Google doesn’t like internal search results pages showing up in their search results. Be sure to take your search results pages and make them look more valuable to users (while somewhat disguising them to the SEs).
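
Since the noindex vs. canonical distinction comes up in every one of these sessions, here is a minimal sketch (the URL is hypothetical, and it assumes the requests and beautifulsoup4 packages are installed) of fetching a page and reporting its rel=canonical target and meta robots value, so you can verify whether a paginated or onsite-search page is actually set to noindex,follow or canonicalized elsewhere:

```python
# Minimal sketch: report a page's rel=canonical target and meta robots value.
# Assumes `requests` and `beautifulsoup4` are installed; the URL is made up.
import requests
from bs4 import BeautifulSoup

def inspect_indexing_signals(url):
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")

    canonical = None
    for link in soup.find_all("link"):
        if "canonical" in (link.get("rel") or []):
            canonical = link.get("href")
            break

    robots_meta = soup.find("meta", attrs={"name": "robots"})
    robots = robots_meta.get("content") if robots_meta else None  # e.g. "noindex,follow"

    return {"url": url, "canonical": canonical, "robots": robots}

if __name__ == "__main__":
    print(inspect_indexing_signals("https://www.example.com/search?q=widgets&page=2"))
```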

Schema.org stuff. Can we manipulate Schema.org tags? And how?
—  Boser: SEs have gone full circle. SEs gave us meta data. Then they took it away. Now they’re asking us for meta data again. Predatory aggregation: these tags make it so easy to scrape your data and then use it in the search results. There are some negative things about these tags.
— Alex: We need to see how it really works, test it and come to conclusions about it.
— Bruce: The only reason to use it right now is if you are losing rankings on ambiguous search terms. I’m worried about the amount of code bloat from schema.org tags. Six months down the road, I’m sure we’ll be sitting here and no one will even be using it yet.
— Stephan: I’ve been a fan of microformats for a while now, especially on real estate sites. It’s good for user experience.
— Todd: I thought Stephan would like microdata because it makes it so much easier to crawl names and phone numbers.

Panda? Help?…
— Boser: If you got hit, c’mon. There’s been a lot less anger about this algo update. If you got caught by Panda, you know why and you just gotta start over from scratch.

Danny gives each panelist a site:
—  Boser: Look at the backlinks and see if that content ever developed any (eHow). Is there stickiness? Does your content generate backlinks? If not, you need to address that.
— Stephan: I would acquire some sites with high link authority, semi-abandoned, not well monetized, etc…
— Alex: Identify pages that have content, are relevant but don’t get clicks. Focus on those pages where there is potential.
— Todd: It’s a cleanup project. You gotta go through your site and clean it up.
— Bruce: Restructuring templates and pruning off weak pages are the best things you can do for the short term.
— Vanessa: You have to re-think your entire site. It’s not going to be just one thing.

What scares you about Google lately? From an SEO perspective? But also, what do you like about them?
—  Boser: Other than the fact that they’ve gone from organizing the world’s information to wanting to own the world’s information… But seriously, he likes how Google tells you when you have been punished.
— Bruce: I have a particularly paranoid vision of Google. I see Google moving into Local in a big way. And then I see Google moving into the news. Like producing the news. Scary.

Links and tweets as ranking factors?
—  Boser: it’s a corroborated signal that goes along with the other signals. If you have a wave of activity from trusted people on twitter, it will spill over to other sites.
— Todd: It goes back to following:follower ratios and identifying the authorities on Twitter. It’s the new version of getting to the front page of Digg.
— Stephan: Going viral doesn’t happen in a vacuum. There are a lot of signals that make it look legit.
— Vanessa: None of us actually care about ranking. We just care about getting people to our site.

Do you think social signals will become more important than links?
— Stephan: I don’t think so. We’re going to end up in a world with highly sophisticated AI, and the link graph will be the underlying reason for a page being deemed important.
— Vanessa: Links have been so big because for a long time, they were the only signal. Now there are a lot more signals.
— Boser: Connectivity will never go away. The Web is all about the hyperlink, and that connection gets stronger with social.
—  Bruce: Much like a link graph, a trust graph will become even more important.
— Alex: Social media gives the SEs a much larger pool to pull from. Links only come from people with websites. Likes and tweets come from a much larger group of people than just people with websites.
— Todd: I was up for the keynote…only the last 10 minutes of it. Ranking is going to start to matter less because I can see someone’s face in the FB widget on a website, and that gives me confidence as a consumer.
— Alex: One thing that I hate about Google is that they consider Wikipedia as a news source.
— Danny:  You know what – I want full link reporting on any site. Blekko has great tools, too, but I want it from Google.
— Stephan: I am not happy with Google playing ‘Hide the Banana’. I think they are going to get more and more into health information. What could be really interesting is the merging of SEO and genetic/health data. Cuz you could send in your DNA to get sequenced and organized. I think they’ve made great improvements in their local algo and Google Places.
— Todd: Something I like about Google —> Google’s given me a hell of a career for the last 10 years.

Lightning round:

  • Anchor text over-optimization? YES! Don’t go crazy with it.
  • Digg, Reddit? Does anyone even care anymore? No.

Final thoughts:

Danny: I want to do a panel where you guys run a search engine. I’m trying to get Matt Cutts to moderate a panel of SEOs.

SMX Advanced 2011: The Really Complicated Technical SEO Infrastructure Issues

Moderator: Vanessa Fox, Contributing Editor, Search Engine Land

Q&A Moderator: Alex Bennert, In House SEO, Wall Street Journal

Speakers:

  • Jonathon Colman, Internet Marketing Manager, REI
  • Kavi Goel, Product Manager, Google
  • Steven Macbeth, Group Program Manager, Bing Search Quality, Microsoft
  • Todd Nemet, Director, Technical Projects, Nine By Blue
  • Maile Ohye, Senior Developer Programs Engineer, Google Inc.

This is one of my favorite sessions each year, primarily because Maile Ohye typically gives us some good easter eggs. And I like when Vanessa Fox laughs at dumb people, which she does in most other sessions as well. Anyways, let’s get started already!

Vanessa claims there is going to be a lot of “bonus things” in this session. #awesome

This panel is packed with awesome people. Now they are getting into schema.org talk. It’s fun to see Google and Bing people talking about schema.org. These two dudes seem to be friends. Dogs and Cats! Living together! 40 years of darkness! Earthquakes! Volcanoes! …and that’s the end of my Ghostbusters bit. Anyways, schema.org looks like it might be big. Like MySpace big!

Now I’m watching a schema.org video. WTF is going on! This is awesome. Content in Context. So it appears that schema.org is a structured meta data foundation that will serve us potato salad.

Holy crap! Schema.org talk has overtaken this session. The session is no longer structured at all. It’s just a bunch of questions. Getting tough to tweet and blog. And now they are talking about the new author tag from Google that was announced today.

At any rate, here are some of the questions:

  • If they are moving to microdata, will schema.org effectively make microformats obsolete?
  • What do we do with the schema.org tags and standards? Do we go back and mark up our entire site?

Oh wait. The schema.org talk was only part of the session. So I guess now we’re back to the original session topic. Up first is Maile Ohye.

Maile Ohye is going to talk about international SEO issues:

  • Considerations for international expansion of your site
  • Before you start, ask yourself if you can really support that site.
  • Key factors:
  • Flow chart: Language or country/region
  • Language focused expansion: en.domain.com, put site on gTLD or ccTLD if you can afford it
  • Country focused expansion: language still remains a factor. Multiple languages in one country. Currency, shipping, local laws.
  • Can you get ccTLD for your site?
  • example.ch/de
  • example.ch/fr
  • Check out United Airlines’ site. Maile mentioned that they “are doing it right” (I’d take her word and use their site as a model)
  • No cloaking! Lots of sites do this. Don’t.
  • Shareable URLs.
  • Indicate language/country in the URL (example.com/fr/welcome.html)
  • Matching URL structure (i.e. www.example.com/country/ for all countries; quick sketch after this list)
  • Helpful language/country crosslinking
  • Give users the ability to navigate to their language of choice
  • Site load times factor in
  • Geo metatags not used by Google with web search? (Well, what are they used for? #confusing)
  • Duplicate content ice cream: on different ccTLDs, duplicate content shouldn’t be a problem
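
To make the matching-URL-structure point concrete, here is a minimal sketch (the domain, locales, and path are all hypothetical) of generating one consistent /locale/ URL per language so every version of a page can crosslink to the others:

```python
# Minimal sketch: build matching localized URLs with one consistent
# /<locale>/ structure so each language/country version of a page can
# crosslink to all the others. Domain, locales, and path are made up.
LOCALES = ["en", "fr", "de", "es"]

def localized_urls(path, domain="www.example.com"):
    return {loc: f"https://{domain}/{loc}/{path.lstrip('/')}" for loc in LOCALES}

for locale, url in localized_urls("welcome.html").items():
    print(locale, url)
```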

Next up: Jonathon Colman

  • “I’m like you. I’m also a non-developer.”
  • Jonathon hit a home run with his rel=canonical tag idea, and he increased his pages indexed while eliminating a lot of duplicate content.
  • He saw massive increases in pages indexed, while also seeing a 96% decrease in duplicate content.
  • Now it’s Vanessa and Maile addressing what Jonathon talked about. Maile does not recommend what Jonathon does.
  • I’m really confused now. I feel bad for Jonathon. He’s a badass. And I hope his site doesn’t get blown up this week by Maile.

Next up: Todd Nemet

  • Really long redirect chains, unintentional referrer cloaking, IIS issues, robots.txt issues, IDS blocking Googlebot
  • Redirect chains:
  • Crawlers give up after too many redirects (quick sketch after this list)
  • AdWords adds even more redirects
  • Cloaking:
  • By referring URL
  • By user agent
  • By IP address
  • IIS browscap cloaking – look it up
  • IIS error page handling:
  • creates a 302 chain out of the box for 404 pages
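
Here is a minimal sketch of checking the redirect-chain point from Todd’s list (the URL is made up, and it assumes the requests package is installed): it follows a URL hop by hop with auto-redirects turned off and prints the chain.

```python
# Minimal sketch: walk a URL's redirect chain hop by hop and print each
# status code, since crawlers give up after too many redirects. The URL
# below is made up; assumes the `requests` package is installed.
from urllib.parse import urljoin
import requests

def redirect_chain(url, max_hops=10):
    hops, current = [], url
    for _ in range(max_hops):
        resp = requests.get(current, allow_redirects=False, timeout=10)
        hops.append((resp.status_code, current))
        location = resp.headers.get("Location")
        if resp.status_code in (301, 302, 303, 307, 308) and location:
            current = urljoin(current, location)
        else:
            break
    return hops

if __name__ == "__main__":
    for status, hop_url in redirect_chain("http://example.com/old-page"):
        print(status, hop_url)
```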

Then the Q&A was all about rel=canonical stuff. I’ve heard it a hundred times before. Google created a monster when it created the rel=canonical tag.

SMX Advanced 2011: Yes, Virginia, Tweeting is SEO

Moderator: Danny Sullivan, Editor-in-Chief, Search Engine Land

Q&A Moderator: Jeff Ferguson, CEO, Fang Digital Marketing

Speakers:

  • Michael Hayward, CEO, ROI Labs
  • Jennifer Lopez, Community Manager, SEOmoz
  • Sean Percival, Vice President, Online Marketing, Myspace
  • Elle Shelley, VP Social Media, Zog Media

Up first is Michael Hayward:

  • Challenge 1: We need a Twitter strategy.
  • Commerce and lead generation
  • Customer service and complaint monitoring
  • Reputation management
  • Promotional messaging
  • Trying to monitor everything becomes a nightmare for scalability.
  • Challenge 2: Online Publishing Reality: niche publications don’t develop an online audience; advertising can’t be sold
  • Future of Four Seasons Magazine
  • Advertising support
  • Online presence
  • Can’t just PDF your magazine and be successful
  • However, the online value-add is a competitive necessity
  • Challenge 3: Trying to reach customers earlier in the buying cycle
  • Reaching into the top of the longtail of travel search
  • Homepage overload: new spa, new restaurant, new chef, latest promotion, destination guides
  • Search to the Rescue:
  • Tweets contain the brand and top-of-the-tail terms; they built a custom link shortener (fshr.com); tweets deep-link into the magazine with tracking variables built in; the editorial calendar is set up to a year in advance based on when articles go into the magazine and then get tweeted
  • Good point about scheduling tweets months and months in advance, based on seasonal peaks. Optimize frequency with search in mind.
  • Traffic from search has grown over 50% in the past 4 years; all other traffic sources are up 5%
  • Now ranking for non-travel terms

Takeaways:

  1. Use twitter as key driver of SEO strategy
  2. Magazine became part of the SEO strategy
    — advertising now subsidizes search
  3. Use the Magazine to carry general/inspiration content

Next up is Elle Shelley from ZOGMedia:

  • “Just as search was in 1998, social is today.” – Jeff Herzog
  • Tip 1: Integrate into Strategy
  • Be organized, work backwards with content, sprinkle in short AND longtail keywords
  • Customer Continuum: Awareness, Consideration, Decision, Transaction, Advocacy

Oops. Gotta run. Sorry.

SMX Advanced 2011: The New Periodic Table of SEO

New Periodic Table of SEO (2011)

Moderator: Danny Sullivan (@dannysullivan)

Speakers:

  • Matthew Brown, AudienceWise (@MatthewJBrown)
  • Duane Forrester, Microsoft (@DuaneForrester)
  • Jeff MacGurn, Covario (@yerrbo)
  • Rand Fishkin, SEOmoz (@randfish)

The session starts with Danny reviewing the old periodic tables of SEO…from 1998, 2003, 2004, 2006, 2007, 2009, 2010, 2011. Nice little SEO history lesson. I think I even heard him mention the Florida update. #vintage

Jeff MacGurn is up first. Leads off with a philosoraptor meme image. A redditor? We can smell our own. He works at Covario. Cites some big study/analysis they did. And then spends a minute or two explaining correlation versus causation.

Technical SEO factors:

  • Page size
  • URL character length – not much correlation
  • Flash navigation – not much correlation
  • Session IDs – strong negative correlation on rankings when present
  • Dynamic parameters in URLs – not much correlation
  • Proximity of page to root directory – not much correlation
  • Page load time – surprisingly strong correlation

Content Factors:

  • Keyword Emphasis – not so much
  • Keyword in title tag – strong on Yahoo/Bing
  • Keyword in Meta Desc tag – not so much
  • Keyword in H1/H2 tag – strongly correlated overall
  • Keyword in H3 tag – not so much
  • Keyword in image Alt tag – not so much
  • Keywords in URL – strong positive correlation

Links:

  • Internal link count – very little correlation
  • External link count – marginally better than internal link count
  • Keywords in the anchor text – waning, but still slight correlation
  • Hub Links – Very strong; one of the strongest factors examined in the study. i.e., having links from hub sites that link out to a relevant group of sites

Takeaways:

  • Look for differentiating factors in your landscape
  • Links from Hubs
  • Page load time
  • Keyword in URL
  • H1

And next up is Rand Fishkin. You’ve probably heard of him. He’s about to throw Jeff under the bus. Oops! Jeff kinda got unlucky because Rand wasn’t scheduled to be on this panel. Sorry, Jeff. You were a worthy competitor though.

It’s pretty clear that in the realm of SEO data analysis, @randfish is king.

  • 2011 data has changed a lot. Link factors fell from 65% to 45%. Whoa! Again, this is a survey. But still.
  • Google’s use of ranking features in the future: usage data, social signals at the domain level, social signals, analysis of perceived value to users
  • Big one: # of linking C-blocks (i.e. unique /24 IP ranges; quick sketch after this list)
  • # of linking IP addresses, linking root domains, subdomains, followed links
  • linking root domains with partial anchor text
  • Exact match anchor texts are on the way down in terms of value
  • # of linking C-blocks to a page is more important than # of linking domains
  • Exact match domains: Matt Cutts said at PubCon 2010 that they will not value exact match domains as much. Rand’s data shows that Matt was telling the truth. Nice work, Matt. Exact match domains are not as valuable any more.
  • Is Google evil? — Lots of Google AdSense slots? Strong negative correlation with rankings.
  • Don’t link to Google.com! It has negative correlation. Microsoft on the other hand is slightly positive.
  • Do you use Google Analytics? If so, slight negative correlation.
  • Number of external links on the page shows positive correlation.
  • Social media factors – # of Facebook shares; sum of FB shares, likes and comments
  • # of Facebook shares – highest single positively correlated metric for rankings (present for 61% of pages ranked in the top 30)
  • Are Facebook shares just predictive of links? Nope. The two are not correlated.
  • Maybe FB shares are correlated with some other factors Rand’s not looking at; he acknowledges that. Could be things like site speed.
  • Don’t try this at home, kids — Don’t misuse or misattribute correlation data.
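
For anyone unfamiliar with the metric above: a C-block is just the first three octets of an IPv4 address (a /24 range). Here is a minimal sketch (the IPs are made-up documentation addresses) of counting unique C-blocks among your linking IPs; lots of links from one block usually means one network:

```python
# Minimal sketch: count unique "C-blocks" (first three octets / a /24 range)
# among a set of linking IP addresses. The IPs below are documentation
# examples, not real backlink data.
from collections import Counter

def c_block(ip):
    return ".".join(ip.split(".")[:3])

linking_ips = ["203.0.113.7", "203.0.113.9", "198.51.100.4", "192.0.2.88"]

blocks = Counter(c_block(ip) for ip in linking_ips)
print("unique C-blocks:", len(blocks))
for block, count in blocks.most_common():
    print(f"{block}.0/24 -> {count} linking IP(s)")
```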

And now it’s @MatthewJBrown from AudienceWise. And he also uses a reddit meme in his slides. I know what he does at work.

  • He says to put FB Like buttons on every single page of your site. Not a button for your brand’s FB page, but a Like button for each of your products, categories and URLs
  • User averages of FB likes, shares, comments, tweets, LinkedIn shares, etc…
  • Make sure to check out schema.org
  • Localization: the online brand killer?
  • Check out Google Suggest for keyword suggestions rather than the AdWords keyword tool
  • Use Advanced Web Ranking with Proxify
  • Get on board with Twitter, FB, and Google +1

Q&A:

  • Duane Forrester: I agree with some of what I’ve seen. Some of what I’ve seen has me scratching my head. And I’m a little bit humbled about how close everyone is.
  • Rand asks Matt about localized results. Give me data for 150 keywords from 50 different data centers/cities. Gets complicated.
  • When we look at all these factors, are we not really looking at how humans use the results? – Rand says the two are merging more and more. Jeff says they are one and the same. Duane says Bing is looking at what the user is experiencing.

Google Rankings Show Overstock is Back to Kicking Ass

Good news, everyone!

Did you hear? Google lifted the -50 penalty on Overstock! Yep. It’s true. Surprisingly, no one really cares. Oh sure, it’s made the SEO news cycle over the past couple of days, but it certainly isn’t causing the same hysteria as when Google was bitch-slapping Overstock, JCPenney and Forbes back in February for their involvement in the filthy evil world of paid link building.

I’m actually surprised I haven’t had anyone ask me about the Overstock penalty being lifted. It hasn’t even come up at lunch or at the water cooler. Nope. People are just too cynical. We all want to slow down to look at the gruesome car wreck, but no one slows down to get a good look at the vehicles being towed away. “Oh hey – let’s slow down to look at a happy ending!” Mark that under ‘Things that you’ll never hear’. People don’t care about happy endings. People want to watch the tigers (Google) devour the gladiators (websites) in the Colosseum (the Internet).

Back on February 24th, I saw the Overstock story, and I decided to do a little research. I wanted to know what kind of impact the Google penalty would have on Overstock’s rankings. And I also wanted to know how long such a publicized penalty would last. So I did what any SEO would do — I found roughly 500 non-branded keywords that were driving a ton of organic traffic to Overstock.com. On Feb. 25th, I ran a ranking report. Sure enough, nearly every keyword had gone from ranking in the top 10 to ranking in the 50’s and 60’s.

Google's -50 Penalty on Overstock.com has obviously been lifted.

Yesterday, when I saw the news about the penalty being lifted, I ran a ranking report on the same keywords. As you can see, the penalty has been lifted. It’s undeniable:

Before & After: Google Ranking Analysis & Comparison for Overstock.com

Just look at that. Damn! I know it’s the NBA playoffs, but that is the biggest rebound I’ve seen in a long time! Going from 27 position 1 rankings to 265? Wow. That is amazing. Google really does have the power to ruin your day if they want to. But Google also forgives. So that is good.
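
If you want to run this kind of before/after check yourself, here is a minimal sketch (the CSV filenames and columns are hypothetical) that loads two ranking exports and counts position-1 and top-10 rankings in each run:

```python
# Minimal sketch: compare two keyword ranking exports (columns: keyword,rank)
# and count position-1 and top-10 rankings in each. Filenames are made up.
import csv

def load_ranks(path):
    with open(path, newline="") as f:
        return {row["keyword"]: int(row["rank"]) for row in csv.DictReader(f)}

def summarize(label, ranks):
    top1 = sum(1 for r in ranks.values() if r == 1)
    top10 = sum(1 for r in ranks.values() if r <= 10)
    print(f"{label}: {top1} at position 1, {top10} in the top 10, {len(ranks)} keywords total")

summarize("during penalty", load_ranks("overstock_2011-02-25.csv"))
summarize("penalty lifted", load_ranks("overstock_penalty_lifted.csv"))
```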

And now, how about I upload an overly long JPG file that shows you all the keywords and their respective rankings? Well, okay. I suppose I can do that.

Overstock Keyword Rankings: Before and After the Google Penalty

So there’s that. I guess that post really wrapped up nicely. Or not. Not.

BTW I think it’s awesome that Overstock.com sent out a press release to announce the Google penalty had been removed. Online press releases are a great tool for reputation management….and link building! Aside from the obvious irony, I think it’s pretty ballsy for Overstock to use this announcement to build more links to their awesome o.co domain. Well done, indeed.

Interesting note from the SEWatch article:

The toll of the Google penalty on Overstock.com: a 5 percent drop in sales and 32 percent loss of organic traffic.

Wow. I’ll have more on that later.

Overstock SEO Campaign: Punished By Google For .edu Links Tactic

Welcome to the party, Overstock! Incentivizing teachers and students to post links on their .edu sites? That is awesome. Well, I guess it’s awesome until someone at WebmasterWorld alerts people to your recent success. And it’s awesome until Google catches you and drops all your rankings. Just ask JC Penney.

Additional coverage:

From the WSJ article:

Overstock’s pages had recently ranked near the top of results for dozens of common searches, including “vacuum cleaners” and “laptop computers.” But links to Overstock on Tuesday dropped to the fifth or sixth pages of Google results for many of those categories, greatly reducing the chances that a user would click on its links.

Aw, c’mon. That doesn’t even make sense. Does the media know anything about SEO? And really why does the media continue to put SEO in the same bucket as aggressive, super awesome link building tactics? Because really, most SEOs wouldn’t know the first thing about coming up with awesome link building campaigns like the one that Overstock has been using. Granted, Google caught them. But that is always the risk of a great link building campaign, especially one that drives a disproportionate number of .edu links. That’s where they went wrong.

Building a ton of .edu links is dangerous, as it’s so easy to see those links in a link report. In fact, if most of your links are coming from .edu sites, you might as well have a parade to announce them to Google! Too many of those links too fast – and you’re history! Well, I’m sure that Overstock.com enjoyed several months (maybe years) of top results from this type of link strategy. Good for them. And thank Jebus they didn’t blame it on their SEO agency. Because that would be really lame.

The point is: In today’s Google, you’ve got to be *REALLY* smart about your link building. Not too fast. Not too slow. Diversify your links. Don’t get thousands of crappy links (aka the SearchDex method), but don’t get too many too-good-to-be-true golden .edu links (aka the Overstock method). In fact, I think we all remember one of the recurring themes from Goldilocks. The third try was always “just right.” For her, that worked out great. But for us SEOs and link builders, we don’t get three tries. We need to make it right the first time.

Laters haters.

PS. Google, I think we all would really like to know: how much is too much? The 2 most recent stories involved link building methods that are downright flagrant. Are you not de-listing these sites because they are ‘too big to fail’? Sure, you are probably scaring other companies into thinking that link campaigns are a bad idea. But methinks the blackhats out there are loving the fact that you are not de-listing these major brands. Just sayin’…

 

 

Forbes Caught Selling Links & Google’s ManBearPig of 2011

Now it seems that Google’s scare tactics have targeted paid link publishers. These are the people who are selling the paid links that companies like JCPenney were buying. Today, there’s a great post about Google punishing Forbes for selling links. Yeah, *that* Forbes! You know Forbes.com, right? They are a fairly popular website. Their Alexa rank is 502. Even though it’s Alexa, 502 is not too shabby. So hopefully you’ve heard of them. But why are they important in the JCPenney link saga? I’m glad I asked.

"Paid links are the ManBearPig of 2011." - Al Gore
"Paid links are the ManBearPig of 2011." - Al Gore

First, rewind to this past Saturday when that New York Times article came out. The NYTimes ratted out JCPenney to Google for buying links. Yep. It was a massive news story involving black hattery and Google. It went viral in the digital marketing world. Everyone has read it or at least heard about it by now. And it really made JCPenney look like the bad guy. And their SEO agency, SearchDex, got as much negative PR as I’ve ever seen an SEO agency get. They were fired immediately by JCP. Sucks for them. While JCPenney will bounce back after the Google penalty is lifted, I’m thinking SearchDex might be better off if they changed their name and rebranded completely.

Basically, with that NYTimes article, Google equated paid links to ManBearPig and scared the hell out of a lot of paid link buyers and would-be paid link buyers. I’m sure that most companies called their search agencies to ask about the article, and I’m sure that most search agencies were not looking forward to those calls. That article made us all look like we are evil blackhats, especially if we buy paid links. Regardless, I’m certain that a lot of link campaigns were ended this week. It’s just a theory. I don’t actually know. But I’ve heard of the bell curve. And I know that most people tend to have a kneejerk reaction about pretty much anything that scares them. To that end, Google scared a lot of people into discontinuing their link campaigns and/or not starting new ones. Again, I can’t prove that. But whatever. It makes sense to me.

Google really needs to affect both sides of the link-buying equation. Now that the people buying the links are scared, the next step is to go after the websites that are selling the links. But what does Google do if a site is selling links? In the past, we would notice the PageRank of the publisher site drop from a PR5 to a PR4 or something like that. In some cases, I saw PR4 sites drop down to PR1. That was pretty harsh. But a lot of websites that were selling links didn’t have the first clue about PageRank. They were just mommy blogs, news sites and magazine-style blogs. But that was like 3 years ago. Today, it’s a different ballgame.

Nowadays – well, really in the past few months – I have caught a couple of news stories about Google contacting websites suspected of selling links. In one case, the site owner flat out denied ever selling links. I’m not sure what ever happened with that one. But in today’s Forbes case, the guy asking about it was none other than Denis Pinsky, the Digital Marketing Manager at Forbes.com. He was asking which links were paid, as if he couldn’t find them. Maybe this guy really didn’t know. He seemed concerned enough to post about it in a public setting. What kind of SEO would do that?! Maybe that’s his alibi! Sheesh. I just wish someone would admit it already. As Barry Schwartz pointed out in his post on seroundtable.com (with pictures to prove it), it’s easy to see where Forbes was selling paid links.

In both cases I linked to, the site owners received this message in their Google Webmaster Central accounts:

Dear site owner or webmaster of http://www.forbes.com/,

We’ve detected that some or all of your pages are using techniques that are outside our quality guidelines, which are available here: http://www.google.com/support/webmasters/bin/answer.py?answer=35769&hl=en.

Specifically, look for possibly artificial or unnatural links on your site pointing to other sites that could be intended to manipulate PageRank. For more information about our linking guidelines, visit
http://www.google.com/support/webmasters/bin/answer.py?answer=66356&hl=en.

We encourage you to make changes to your site so that it meets our quality guidelines. Once you’ve made these changes, please visit https://www.google.com/webmasters/tools/reconsideration?hl=en to submit your site for reconsideration in Google’s search results.

If you have any questions about how to resolve this issue, please see our Webmaster Help Forum for support: http://www.google.com/support/forum/p/Webmasters?hl=en.

Sincerely,
Google Search Quality Team
1600 Amphitheatre Parkway
Mountain View, CA 94043

The lesson here: you should be checking your Webmaster Central accounts every day. If you see that message, you probably need to take down the links you are selling. Well, I can’t say that. What if you are selling links to…oh I dunno, a top insurance agency, a massive computer manufacturer, and a major fashion brand for thousands of dollars per month? Would it be worth it to have your PageRank dropped and/or lose some of your rankings if you kept the links and made thousands of dollars per month? That’s your call. Do the math. Ask an economist. I think it’s something like “the marginal benefit must outweigh the marginal cost.” I was never good with graphs about guns and butter. But in most cases, if you get that message from Google, you should probably remove the paid links from your website and submit a reinclusion request.

Lastly, I want you to know how I really feel about Google and paid links:

  1. Don’t be afraid of buying links. Be cautious. Just make sure you’re not dumb about it. For more information on dumb link campaigns, read that NYTimes article again. SearchDex pretty much lays out a path to dumb that anyone can follow.
  2. In general, it’s okay to sell links on your personal site(s) because, hey, your site isn’t home to a major brand. Plus, your personal site is probably just some side income anyways, right? However, for your clients’ sites: NEVER sell links on your clients’ sites. No no no no never ever ever! If they are looking for additional revenue, take the link-selling option off the table before the discussion ever starts. The last thing you need is to be doing a great job at SEO and then your client’s site gets hit with a penalty for selling links. If that happens, you are just spinning your wheels with any SEO you do for them.

Well, how much longer will the paid links saga drag on this week? I don’t know. But if something else happens, maybe I’ll write about it!

Happy selling….err, I mean buying! Or do I?!?! Mwuaaahahahaha!