Pig, Pug Or Polar Bear? Matt Cutts At #SMX Advanced!
Between deflecting jabs about the PRISM program and "not provided" keyword data, Matt Cutts closed out day 1 of SMX Advanced in a way only Google's Distinguished Engineer could. Search Engine Land Founding Editor Danny Sullivan joined Matt on stage, steering the conversation from the NSA to algorithm updates to SEO best practices. Matt made several big announcements that drew spontaneous applause, and also offered advice on what to focus on in terms of SEO. Read on for the top takeaways from last night's "You&A With Matt Cutts."
Keyword Not Provided/NSA’s PRISM Program
When asked if the NSA's PRISM program would allow marketers to see "not provided" search terms, Matt asked, "Was that a serious question?" He later added that the top search queries report within Google Webmaster Tools gives you a pretty good idea of what those terms are.
Panda & Penguin Updates
Matt reported that the last Panda update was about a month and a half ago, and that updates now occur roughly once a month. The algorithm has reached the point where Google bakes new Panda data directly into the index, rolling it in over about 10 days each month. As to why Google no longer announces these changes, Matt said that Panda updates are now incremental. If an update is large, SEOs need to know what it is, and Google will share when it happens and what's going on. Google used to announce the smaller updates, too: at the end of each month, it would list everything that had launched, but by the end of the year, people grew tired of it. With about 500 algorithm changes every year, Google launches 1 or 2 things per day, and sharing every one of them would bury people in noise. So now Google only announces big changes such as Penguin 2.0.
Alluding to his video about what to expect in the coming months regarding SEO, Matt mentioned that Google has found a new signal that it thinks will help pull people out of the Panda “grey zone.” Many of the sites affected by Panda were considered borderline, so Google is exploring means to reduce the impact of the algorithm on those sites.
Matt sees a lot of complaints about sites that seem to have survived Penguin despite their use of spammy tactics. He reminded the audience that an algorithm is not going to improve every search under the sun: an algorithm targets specific queries, and each update helps reduce a certain type of webspam. Hacked sites, for example, may still show in the SERPs, even post-Penguin. Matt cautioned SEOs not to mimic the spam tactics that seem to be getting past Google. Don't look at a query, see sites ranking through blackhat or illegal tactics, and conclude "that's what works, so I'm going to do that," he said. Google is always looking at different types of queries and how to improve them, and it will keep working to make the SERPs cleaner.
Animal Algorithm Updates
What will the next algorithm be named, Danny asked, referring to the three stuffed animals on the stage. Pig? Pug? Polar Bear? We don’t want to have a menagerie of updates, Matt replied, so Google probably won’t be naming further algorithms.
Link Disavow Tool/Unnatural Link Warnings
Google has a lot of data about which sites are bad, so why should we have to use the disavow tool, Danny asked. Why don't you just disavow all the bad links yourself? Matt's response was that Google's algorithms try to detect those sites so SEOs don't have to worry about them. For a long time, people were getting a lot of low-quality links. We're going through a transition now, moving to a healthier world where people focus on the quality of their links. If someone generated spam links for your site in the past and you need to clean them up, the disavow tool is there to help you.
That led to the question of what to do first upon receiving an unnatural links warning: use the disavow tool, or contact webmasters to remove links? Matt advised manual cleanup first; the disavow tool should be used when link removal requests are unsuccessful. Might this affect your rankings while you undertake this tedious task? Yes, but that's only fair to your competitors, he said, since you've been benefiting from spammy links.
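For context, the disavow file Google accepts is a plain-text list with one URL or domain per line. A minimal sketch, using hypothetical domains:

```text
# Asked these sites to remove links on 6/1; no response
domain:spammy-directory.example
http://spam-blog.example/post-with-paid-link.html
```

Lines beginning with # are comments, and the domain: prefix disavows every link from that domain rather than a single URL.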
Matt also reminded the audience that messages such as unnatural link warnings are for manual webspam actions. Algorithmic updates will not generate a message. While on the topic, Matt dropped his first big announcement: Within the next few days, Google will start including example URLs with manual webspam messages. The warnings will now include a handful of URLs (1 to 3, Matt said) that demonstrate the problem.
Google's guidelines have always been consistent: make a great site that people want to link to. If you aim for a fantastic user experience, you will find it easier to gain traction, word of mouth and links.
Mobile is a hugely important area that everyone needs to focus on, because mobile usage will exceed desktop usage faster than anyone expected, so it's worth thinking hard about your mobile experience. When every desktop URL a user tries to access from a smartphone redirects to the mobile homepage, that's a bad experience, and such errors could affect rankings. If your site is smartphone-antagonistic, it might not rank as well, Matt said. Given Google's focus on mobile, Danny announced that Search Engine Land will update its Periodic Table of SEO Success Factors to include a mobile element.
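Matt's point about faulty redirects can be sketched in a few lines. This is a hypothetical illustration, not anything Google published: the idea is that each desktop URL should map to its own mobile equivalent, and paths with no equivalent should serve the desktop page rather than bounce every smartphone visitor to the mobile homepage.

```python
# Hypothetical desktop-to-mobile URL mapping. The paths and the "m."
# subdomain scheme are illustrative assumptions, not from the session.
DESKTOP_TO_MOBILE = {
    "/products/widget": "http://m.example.com/products/widget",
    "/about": "http://m.example.com/about",
}

def mobile_redirect_target(desktop_path):
    """Return the equivalent mobile URL for a smartphone visitor.

    Returning None for unmapped paths lets the server fall back to the
    desktop page. The faulty pattern Matt warned about is redirecting
    every unmapped path to the mobile homepage instead.
    """
    return DESKTOP_TO_MOBILE.get(desktop_path)
```

The design choice worth noting is the None fallback: a wrong-but-working desktop page beats a redirect that strands users on the homepage.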
Earlier in the day, Eric Enge of Stone Temple Consulting discussed Facebook social signals as a possible means of driving indexing and ranking, so naturally, this was put before Matt to confirm or deny. His take? We're not saying it's not possible, but Google generally doesn't get access to much Facebook data. If you make fantastic content, people will like it more on Facebook, and it will also attract links. Correlation does not equal causation.
Another question related to ranking factors by industry. Do they differ? No, Matt said, but Google does look at who is an authority for that space.
Asked if bounce rate affects ranking, Matt said that, to the best of his knowledge, it does not. What about site load speed? You probably won’t get a boost for being fast, but really slow sites won’t do as well, he reported.
- Matt's second big bombshell related to markup. He mentioned that Google is developing a structured data dashboard that will return errors if you don't use markup correctly, and he invited attendees to sign up as early testers of Webmaster Tools' structured data error reporting.
- Asked for his opinion of weather/temperature-style tools that predict when something is going on in the SERPs that could indicate an algorithm update, Matt said that the tools aren’t crazy, but there’s a lot of sampling or subsampling skew.
- A press release link, by definition, is a link you are paying for, so its anchor text shouldn't count.
- Does Google view affiliates as the equivalent of black hats? No, Matt said, as he apologized for lumping the two together: I feel bad about that. I shouldn’t have grouped affiliates in with black hats, although we do tend to see more bad affiliates than good affiliates.
As the session was coming to a close, Danny asked Matt for his opinion on the most overrated and underrated things when it comes to SEO. As for the overrated, Matt said short-term social. Longer-term social will be an important trend, he added. On the flip side, the best thing to put time and effort into is user experience, both desktop and mobile.
With that, day 1 of SMX Advanced drew to a close. We’ll be back with more from day 2!
Image © Monkey Business – Fotolia.com