

Matt Cutts Says Volume 2: Seattle SMX Advanced You&A 2013 Edition

Posted by Cory Collins on Jul 3, 2013 8:52:20 AM

[Image: Matt Cutts at SMX Advanced 2013]

Matt Cutts recently participated in SMX Advanced in Seattle on June 11. Cutts was featured in the classic “You&A” open-forum discussion, during which Danny Sullivan quizzed him on a number of issues and then opened the floor to the audience. The whole event lasted roughly an hour.

This post will distill the answers Cutts gave and, more importantly, what we can learn about Google’s philosophy and what might be in the pipeline, based on what Matt Cutts (head of Google's Webspam team) says.

Welcome to Matt Cutts Says Volume 2: Seattle SMX Advanced 2013 Edition.

The topics ranged from Panda to Penguin to penalties, mobile, (not provided), bounce rates, and everything else under the SEO sun.

To make sense of his answers, and the diverse topics he covered, I’ll be tackling his session by category rather than chronologically.

First, the video of the session (warning – bad quality; also it’s long – over an hour total):

http://youtu.be/B_jyMt5CPmE?t=7m46s

Second, a few live blogs of the event, if you’d like to skim through the conversation as it happened:

Matt McGee’s live blog of You&A with Matt Cutts at SMX 2013
Mark Traphagen’s live blog of You&A with Matt Cutts at SMX 2013

Now, let’s dive into the analysis.

Panda

Cutts had a few choice statements concerning Panda, which shed some light on Panda’s current status and why Google is no longer announcing each update as it happens.

Important Panda takeaways:

      • Google is no longer announcing Panda updates due to the frequency with which they’re occurring, and the lack of interest that constant announcements seem to generate
      • Panda updates are much smaller now, according to Cutts, reaching an almost ‘steady state’ in which Google is only baking in new data
      • Cutts called Panda updates the “Panda Dance” – happening frequently enough, with small changes, that it’s not worth naming each update
      • Cutts once again referenced a Panda update on the horizon that’s specifically designed to ‘pull people out of the gray zone’
      • Panda isn’t focused on large brands

 

Penguin

Penguin hasn’t reached the refined level of Panda; its ‘2.0’ iteration rolled out roughly one month ago. Cutts purposely ‘telegraphed’ the release of Penguin 2.0 in order to prepare the SEO community for its release. Much of the community claims Penguin 2.0 didn’t live up to the pre-release hype, leading Cutts to state that Penguin isn’t done.

Important Penguin takeaways:

      • Cutts purposely ‘telegraphed’ Penguin 2.0 because Google wants the community aware of large updates coming out
      • Penguin 2.0 is designed to go deeper than 1.0 – now Penguin can target individual pages throughout a site, whereas before it mainly focused on the homepage
      • Penguin addresses what Cutts calls a ‘large chunk of webspam’, but can’t address it all – there will be future updates that target different types of webspam, one of which will be for hacked sites, soon to release
      • There will doubtless be more Penguin updates, and large ones. Cutts reiterated that they can change the strength of Penguin’s signals, if needed.

The general tone of Cutts’s answers seemed to be that Penguin 2.0 was a good update, but there will be more to come.

Future Updates

Danny Sullivan set the stage for the You&A with three stuffed animals – a polar bear, a pug, and a pig. The direct implication is that Sullivan believes there’s likely to be another large update, named similarly to Panda and Penguin.

Cutts downplayed the names of these Google updates, specifically mentioning the worry of “update name inflations” and remarking “we don’t want to have a menagerie.”

Despite these statements, there seemed to be little doubt that Google will release another update on the scale of Panda and Penguin, perhaps in the not-too-distant future.

The Validity of Update Tracking Tools

Recently, there’s been more visibility for tools designed to track the volatility of Google’s SERPs (search engine results pages).

The idea is that the higher the volatility (or “weather,” as Moz dubbed it), the higher the likelihood that Google is pushing out a notable update.

MozCast is the best known of these SERP tracking tools.

However, despite these tools tracking a wide range of SERPs for diverse data, it’s hard to be sure when Google has released a sizable update, much less a small one. Because of this, a few have questioned the validity of such tracking.

Notably, MozCast measured a 113-degree all-time high on June 25th, despite no word from Google on a specifically large and targeted update. Furthermore, in his breakdown “Early Look at Google’s June 25 Algo Update,” Dr. Pete of Moz could only find a dip in PMD (partial-match domain) rankings. So, despite seeing a lot of fluctuation across a diverse set of data groups (over 1,000 different SERPs being tracked), Dr. Pete himself says it’s hard to guess what the algorithm targeted or in fact accomplished.

Cutts mentioned that Google rolls out 500+ algorithm updates a year, and that it’s hard to know when people will notice one update versus another. Often, updates affect different verticals or niches differently.

Despite the complications, these tools do provide a useful function in tracking multiple SERPs. And, although they can’t pin down too many specifics, it is good to know what’s going on in a broader sense.
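To make the “weather” metaphor a bit more concrete, here’s a minimal Python sketch of the general idea behind these tools: measure how much rankings shuffle between two crawls of the same keyword set. MozCast’s actual formula isn’t fully public, so the scoring, scaling, and sample data below are purely hypothetical illustrations, not anyone’s real methodology.

```python
# A minimal sketch of the idea behind SERP "weather" tools.
# The keyword set, rank data, and scaling here are hypothetical;
# this is NOT MozCast's actual methodology.

def volatility(yesterday: dict, today: dict, max_rank: int = 10) -> float:
    """Average rank movement across tracked keywords, as a 0-1 score.

    Each argument maps a keyword to the ordered list of URLs ranking
    for it (position 1 first).
    """
    deltas = []
    for keyword, old_serp in yesterday.items():
        new_serp = today.get(keyword, [])
        for old_pos, url in enumerate(old_serp, start=1):
            # A URL that drops out of the SERP entirely counts as the worst move.
            new_pos = new_serp.index(url) + 1 if url in new_serp else max_rank + 1
            deltas.append(abs(new_pos - old_pos) / max_rank)
    return sum(deltas) / len(deltas) if deltas else 0.0

yesterday = {"link building": ["a.com", "b.com", "c.com"]}
today = {"link building": ["b.com", "a.com", "d.com"]}
print(f"volatility: {volatility(yesterday, today):.2f}")  # higher = stormier
```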

Links

Google has famously built the core of their algorithm around the idea that links are an editorial vote from one website to another. And despite the algorithm now using over 200 signals, links have continued to prove a potent ranking factor.

In recent history, especially with Penguin, Google has reinforced their stance against spammy link building tactics, leading some to doubt the viability of continued link building.

Danny Sullivan referred to links as the ‘fossil fuel’ of the SEO world, asking Cutts when we’ll see ‘cold fusion’.

Cutts responded that Google’s stance on links has been steady, that he believes most webmasters and SEOs are finally getting the message, and that it’s a “healthier world”.

“Our guidelines have always been kind of steady, we've always said look, make a great site, such that people want to link to your site, that makes it a lot easier to rank, the links come easier, all that sort of stuff. It's just now we're bringing these tools to bear where that's backed up with the algorithm,” - Matt Cutts

He continued on with:

“Yes, we are going through a little bit of a transition, but I think we're moving to a healthier world. It gets harder and harder to spam every year." - Matt Cutts

Cutts also referred to the rant Sullivan himself gave at SMX Advanced concerning links.

You can listen to the rant here, as well as read the post Sullivan wrote to be the ‘coherent version’ of the rant.

Important link takeaways:

      • Google is happy with the direction they’re moving with the algorithm updates of recent years, and will more than likely continue in the direction of harsher enforcement
      • Links are still very important, but they have to be real, quality links – low-quality spam belongs to a bygone era, and will only hurt the health of your site
      • Cutts specifically said to go after ‘hard links’ – they’re not likely to value sponsored posts, 200 word posts with keyword rich anchor text, or any other easy/spammy tactics
      • Google and Facebook (read: social media) don’t get along very well; Google still can’t crawl most of Facebook, and doesn’t have access to a ‘like’ stream
      • Links aren’t close to being replaced with social

 

Disavow

Tied closely with links, disavow was another hot topic during the You&A with Matt Cutts.

Specifically, Sullivan ranted about how even his site, Search Engine Land, an extremely high quality site with a well-known reputation, has received link removal requests.

This led to Sullivan saying:

“Why don’t you just disavow the bad links yourself? Why do we have to do it? If you know they’re bad…” - Danny Sullivan

Link removal, disavow, and reconsideration requests have become all too common in recent years.

Cutts responded by calling this a “one time market correction”, further elaborating:

“Because, for a long time everybody thought I have to get as many links as possible, links links links, I have to get all the links. If I can go pay five dollars and get a bunch of links, that's a great thing to get.” - Matt Cutts

This “one time market correction” seems to imply that there’s going to be a bit of transition time as sites realign themselves with Google.

Furthermore, Cutts warned not to disavow immediately following an unnatural link warning. Instead, he said time needs to be spent on link removal before moving on to disavow. He remarked that Google might be working on providing webmasters more examples of which links are bad, but that such a feature is a long way out.
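For reference, the disavow file Google accepts is plain text: ‘#’ comment lines, individual URLs, or ‘domain:’ entries that disavow a whole domain. Here’s a minimal Python sketch that assembles one; the domains and URLs are hypothetical placeholders, and per Cutts’s advice, this step comes only after documented removal attempts.

```python
# A minimal sketch of assembling a disavow file in the plain-text
# format Google's disavow tool accepts. All domains/URLs below are
# hypothetical placeholders.

bad_domains = ["spammy-directory.example", "paid-links.example"]
bad_urls = ["http://blog.example/keyword-rich-guest-post.html"]

lines = ["# Requested removal on 2013-07-01; no response from webmasters"]
lines += [f"domain:{d}" for d in bad_domains]  # disavow entire domains
lines += bad_urls                              # disavow individual URLs

with open("disavow.txt", "w") as f:
    f.write("\n".join(lines) + "\n")
```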

Important disavow takeaways:

      • There’s going to be a ‘transition time’ as websites realign themselves with Google’s Webmaster Guidelines. Cutts called this a “one time market correction”.
      • Building spammy links is only going to hurt your site, rather than help
      • Do not jump straight to disavow upon receiving an unnatural link warning from Google. Instead, spend time on link removal first
      • Google might be working on providing examples of URLs they consider spammy alongside unnatural link warnings, but that feature is a long way out

 

Penalties

With so much conversation around Panda, Penguin, links, and disavow, Cutts naturally delved into the topic of penalties as well.

He made a point of differentiating between algorithmic drops and manual actions, stating that only manual actions are penalties, and

“If we’re taking direct action on your site we’ll almost always alert you with a message.” - Matt Cutts

Algorithmic drops are the result of an algorithm tweak, and typically won’t receive any kind of message.

Sullivan also quizzed Cutts on the duration of penalties, since all manual actions have an expiration date.

Cutts responded:

“The length of the penalty depends on how severe we think the penalty is.” - Matt Cutts

He then went on to say that while no penalty lasts forever, they can last a long time.

“There’s not life in prison. If a domain is completely awful, black hat, pure evil, we don’t think there’s any redeeming characteristics then we might set penalties or manual actions along the lines of waiting until the domain expires.” - Matt Cutts

And finally, Cutts said that they typically don’t review a site when a penalty expires. He said they’re confident in their spam catching processes, and if the site hasn’t cleaned up its act it will simply work its way back into a penalty.

Important penalty takeaways:

      • A penalty is a manual action taken by a Google team member, and will almost always be followed up with a message
      • An algorithmic ‘penalty’ – really a drop – is the result of an algorithmic change, and can’t be manually removed
      • All penalties eventually expire
      • Penalty length depends on the infraction; if a site is truly irredeemable, the penalty can be set to last until the domain expires
      • Sites typically aren’t reviewed right before/after a penalty expires – Google is confident in its ability to catch any continuing offenders.

 

Mobile

The one subject Cutts specifically went out of his way to bring up during the You&A was mobile, stating:

“You really need to be thinking about mobile. Mobile is happening much faster than almost anyone expected. If you run a savvy site, you should probably look at the graph of when mobile will exceed your desktop usage.” - Matt Cutts

Clearly, Cutts believes there’s a lack of appreciation for mobile, and that it’s an important topic that needs to come to the forefront of SEO.

Google also posted an article on their blog dealing with mobile search result rankings.

Cutts mentioned some common issues Google is seeing with websites:

      • Every URL on a site redirecting mobile traffic to exactly one mobile URL (typically the mobile homepage)
      • Poor redirect setups that create infinite or otherwise annoying redirect loops
      • Page speed issues
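As a rough illustration of spotting the first two issues above, here’s a hedged Python sketch using the third-party requests library: it fetches pages with a mobile user-agent and flags deep URLs that get dumped on a single mobile homepage, or that loop. The user-agent string and site URLs are placeholders, not a recommendation of any particular setup.

```python
# A rough sketch of auditing mobile redirects with the `requests` library.
# The user-agent and URLs below are hypothetical placeholders.
import requests

IPHONE_UA = ("Mozilla/5.0 (iPhone; CPU iPhone OS 6_0 like Mac OS X) "
             "AppleWebKit/536.26 Mobile/10A5376e")
MOBILE_HOME = "http://m.example.com"  # where faulty sites dump every visitor

def check_mobile_redirect(url):
    try:
        resp = requests.get(url, headers={"User-Agent": IPHONE_UA},
                            allow_redirects=True, timeout=10)
    except requests.TooManyRedirects:
        return f"{url}: redirect loop"  # issue #2 above
    if resp.url.rstrip("/") == MOBILE_HOME:
        return f"{url}: deep page dumped on the mobile homepage"  # issue #1
    return f"{url}: OK -> {resp.url}"

for page in ["http://www.example.com/products/widget-42",
             "http://www.example.com/about"]:
    print(check_mobile_redirect(page))
```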

Cutts went on to mention that if your site is ‘smart phone antagonistic’, it might not rank as well for users searching on a mobile device.

“Don’t panic, but put a little bit of thought into mobile. Because you probably have a lot more users coming from mobile than you might expect.” - Matt Cutts

Important mobile takeaways:

      • Google is putting more emphasis and thought into mobile, and site owners should as well
      • Google believes that mobile traffic will surpass desktop traffic in the not too distant future
      • Google is gearing up to algorithmically ding websites that are ‘antagonistic’ to mobile traffic

 

Miscellaneous

These are the subjects Matt Cutts touched on briefly – not quite enough information on each to warrant its own section.

  • (not provided)

Sullivan jumped into this subject with a vengeance, as Cutts had apparently told him two years back that (not provided) would only ever account for a single-digit percentage of web traffic. As we all now know, that figure is patently understated.

Cutts defended himself, stating that at the time he said it, it was true. However, with privacy protection becoming ever more important, along with other factors, things have since changed – and he’s happy with the changes, considering recent news regarding the NSA.

  • Affiliates

Cutts has mentioned affiliates in the same sentence as black hat several times, implying a connection. When confronted about this, Cutts apologized, stating that Google likes any site that adds value to the web.

  • Bounce rate

Cutts stated last year that, as far as he knew, bounce rate wasn’t a ranking signal. This year, he reaffirmed that statement with the caveat that, once again, he hasn’t researched the issue.

  • Google Plus as a ranking factor

Cutts acknowledged that Google has access to the social data behind Google+, and that they’re definitely examining it deeply. However, he denied that it serves as any kind of ranking signal in its current iteration.

  • Google algorithmically determining authority

Google has recently made statements about an update in the works that would allow them to recognize authorities within a given niche, and to use that authority as a ranking signal – on a case-by-case basis – for those authors’ contributions within that niche.

This seems to be tied to rel=author, although this time Cutts didn’t mention rel=author at all when talking about authority. Instead, Cutts merely confirmed that authority is determined algorithmically – otherwise, he was fairly tight-lipped.

Bonus Questions

Sullivan ended the You&A by asking a few last questions, which mostly pertained to Cutts himself. Regardless, they relayed some interesting information. Here they are:

Sullivan: “What is the most overrated thing out there right now with SEO?”
Cutts: “Short term social, but not long term social.”

Sullivan: “Most underrated?”
Cutts: “Design, user experience… So put some work into the polish.”

Sullivan: “What’s your biggest surprise over the last year?”
Cutts: “It is impossible to gauge what people are or aren't going to notice. That's one of the other reasons we don't announce algorithms.”

Questions, comments, thoughts? Let me know, either by leaving a comment or shooting me an email at ccollins at pageonepower.com. I’d love to hear your theories and discuss them further.