aimClear’s coverage of #SMX East 2011 kicked off with… PANDA WATCH! How else? Google’s furry little algorithm update, aimed at cracking down on duplicate content, among other things, rolled out in waves during the late winter and early spring of 2011. Initially, only a portion of U.S. sites were affected, then international sites, then more U.S. sites, etc.
Largely due to this sweeping smackdown, Panda has been a hot topic at online marketing conferences this year. Google estimates the algo update has affected a mere 12% of all websites, but how many of them truly deserved the ranking-spanking?
On the morning of #SMX East Day 1, moderator Matt McGee, Q&A moderator Chris Silver Smith, and speakers Alan Bleiweiss (Director of Search Services, Click2Rank Consulting), Micah Fisher-Kirshner (Senior SEO Manager, Become, Inc.), and Mark Munroe (Senior Director, SEO, Everyday Health) came together to share insightful two cents and survival tips for overcoming Panda’s wrath and staking your claim in the search engine result pages (SERPs) once more. aimClear live-tweeted this session via @beebow. Read on for the full guide.
Matt McGee took the mic, welcomed attendees, sussed the level of sleepiness and sobriety, introduced the speakers, and kicked off the session. On the menu, Google’s Panda, the infamous algorithm update that hit Feb 24, 2011, and recovery tips for those spanked, clobbered, annihilated, i.e.: affected.
Mark Munroe was up first. He began with various site traffic charts that showcased the decrease in traffic some sites suffered during the first wave of Panda (compared with the rollout of the MayDay algo update). For some sites, the decrease attributed to Panda was a fiery crash and burn. Stock-market style.
Conclusion according to Mark: The Panda update is the most significant change to the organic search algorithm since the intro of PageRank and link reputation.
Follow Up Conclusion: Whether or not you were hit by Panda, it is vital to learn from what’s going on, and prepare yourself for the next wave. Some folks weren’t hit by Panda, and got cocky. You better believe they got spanked during the second wave.
General Sites That Got Hit:
- Q&A sites
- Sites about everything (eHow.com, for example)
- Content farms
In pre-Panda Google, search relevancy was defined by content, links, and anchor text. Site reputation and page importance were determined by the linking structure of the Internet. Pre-Panda SEO strategies and SEO objectives focused on ways to create new content and new pages, e.g. search result pages, tag pages, data-driven pages, or variations of the same page optimized for different keywords. Even if users were less targeted, site owners still made money based on some conversions and ad clicks.
In Google’s eyes, pre-Panda days made it too easy to manipulate the SERPs without respect to quality.
SERP positions are now modified based on a new reputation factor, a quality assessment of the site as a whole. Mark hypothesized that reputation is based primarily on user metrics, i.e. on how the user interacts with your site in the SERP.
“Google knows more than you do!” –Mark Munroe
Truth – when it comes to the SERP user experience (XP), Google knows more about the user experience of a search visit to your site than you do.
That said, what do you know?
- Bounce rates
- Two-page visits
- Three-page visits
What do those user actions mean about user satisfaction?
It’s not easy to glean a whole lot. If a user searches for android app reviews, clicks on “topandroid.com,” clicks through to the site, finds no reviews, and clicks back to the SERP to find a better site, this tells Google that topandroid.com was not useful.
What Interests Google?
Think like a Google product manager. What do they care about? They care about Google SERPs – their effectiveness and organization. They care about the way the user interacts with the SERP.
It’s likely that Google does not get its data from Google Analytics or the Google Toolbar. They likely focus more on metrics like:
- G-Bounce, a.k.a. if someone clicks on a link from the SERP and bounces immediately back to the SERP
- Query behavior after the G-Bounce
- Average time before the G-Bounce
- Repeat visits
…and similar metrics that are not reported by your analytics package.
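Google doesn’t publish how it computes any of this, and Mark was hypothesizing. Still, the “G-Bounce” idea is easy to make concrete. Here’s a toy Python sketch, assuming a hypothetical SERP click log where each entry records the query, the site clicked, and the seconds until the user pogo-sticked back to the SERP (or `None` if they never came back):

```python
# Hypothetical SERP click log: (query, clicked_site, seconds_until_return_to_SERP).
# A None return time means the user never came back to the SERP -- a satisfied visit.
clicks = [
    ("android app reviews", "topandroid.com", 8),     # quick pogo-stick back
    ("android app reviews", "topandroid.com", None),  # stayed on the site
    ("android app reviews", "goodreviews.com", None),
    ("android app reviews", "topandroid.com", 12),
]

def g_bounce_rate(clicks, site, threshold_seconds=30):
    """Share of SERP clicks on `site` that bounced back within the threshold."""
    visits = [c for c in clicks if c[1] == site]
    if not visits:
        return 0.0
    bounces = [c for c in visits if c[2] is not None and c[2] <= threshold_seconds]
    return len(bounces) / len(visits)

print(g_bounce_rate(clicks, "topandroid.com"))  # 2 of 3 visits pogo-sticked
```

The log format, the 30-second threshold, and the site names are all made up for illustration; the point is that Google can see the return-to-SERP behavior that your own analytics package cannot.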
Ranking Spanking Recovery Plan, Tips & Tactics:
- Fix the pages that drive you traffic.
- Addressing the issue on pages that account for only a small share of your traffic will not have a large impact on overall metrics.
- Integrate SEOs and User XP engineers – make sure your SEOs are thinking about user XP, and make sure your designers are thinking about search.
- Define, implement, and report on good metrics that correlate to the search XP.
- Analyze the impact of each new release on the search XP.
- Beware of, and give close scrutiny to, the types of content and techniques that have specifically led to Panda issues.
- Remember the user XP starts on the SERP, and starts with a KW.
- Leverage survey polls to get a deeper understanding of who is ending up on your landing page.
- Survey people who are representatives of your overall traffic. Only survey the ones coming from Google (because that’s the traffic source you’re testing!). Your homepage users and your incoming-from-Google users are very different users.
- Do actual user testing to get a sense of usability – but make sure to start on the SERP, not on your website. (Check out usertesting.com, Mark recommends; it’s cheap, effective, and spits back answers within hours.)
- Use KWs from your analytics package that are representative of your traffic.
- Create scenarios that test the key Qs people have when they come to your site (based on your surveys). Do people find what they’re looking for? Are ads in the way? Do people know what they’re supposed to do?
- Look for bad KWs. Do you see KWs that don’t make sense? Fix them, or change the content that causes them to show up.
- Make sure the content relevant to the search query that brought a user to a site is easy to find.
- Beware of content hidden behind read more buttons, tabs, etc.
- If you allow comments & UGC, get sophisticated spam filtering implemented. Spammers are good at bringing in unwanted traffic.
- Make sure you have good titles! SEO 101. That’s just plain smart.
- Beware of having content for content’s sake – content should be very tightly focused on the title of a page.
- De-index content that does not deliver, e.g. Q&A pages without answers, dynamically generated pages with little to no content, etc.
- Use no-index / no-follow on these pages, but be careful to test to make sure it doesn’t go on the wrong pages.
- Link freely to relevant content – if you can’t give users what they want, show them where they can find it.
- Don’t annoy your users with too many ads, slow load time, or anything that will make people bounce quickly.
- Don’t ever have downtime. If you do, create a lovely-looking maintenance page so a user knows what’s up.
- Give a good mobile experience. Some sites are seeing 15-20% of their Google traffic come from mobile devices. Make sure your mobile site makes for a good UX.
- Be mindful about the validity of your metrics. The standard bounce rate as reported by most analytics packages is extremely flawed, says Mark; likewise with time on site, since it doesn’t factor in bounces. Consider a bounce rate based on time rather than a single page view: 15 seconds, 30 seconds, 1 minute, true time on site, etc.
- Industry-specific concerns – Certain types of sites were royally spanked by Panda (shopping sites, for example). If you’re in one of these verticals, you’ve just got to work extra hard to create a solid UX, to help users get to their ultimate destinations.
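Mark’s time-based bounce rate tip is simple to prototype yourself if your analytics export gives you per-visit durations. A minimal Python sketch, assuming a made-up list of visit durations in seconds:

```python
# Hypothetical per-visit time-on-site log, in seconds. A visit counts as a
# "bounce" if its total time falls under the threshold -- even if the visitor
# technically viewed more than one page.
visit_durations = [5, 12, 45, 90, 8, 300, 22, 610]

def time_based_bounce_rate(durations, threshold_seconds):
    """Bounce rate defined by time on site rather than by single-page visits."""
    if not durations:
        return 0.0
    bounces = sum(1 for d in durations if d < threshold_seconds)
    return bounces / len(durations)

# Compare the thresholds Mark mentioned: 15s, 30s, 1 minute.
for threshold in (15, 30, 60):
    rate = time_based_bounce_rate(visit_durations, threshold)
    print(f"bounce rate under {threshold}s: {rate:.1%}")
```

The data and threshold choices are illustrative only; the takeaway is that a single-page-view bounce metric and a time-threshold bounce metric can tell very different stories about the same traffic.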
Mark summed up by noting Panda is a quality assessment of your site, most likely based on data about how users interact with the SERPs.
There you have it. Next up was Alan Bleiweiss. In case you didn’t know, Alan is a sorcerer. He must be – how else could he have brought back sites from the dead after mondo-Panda-spankings?
Ah, yes. Perhaps he’s just a great SEO.
Alan performed 17 site audits and compiled the data into his presentation. The 17 sites amounted to approximately 43 million pages indexed by Google. Of those 43 mil, only 10 million were indexed in Bing. Discuss amongst yourselves.
Right off the bat, Alan emphasized the importance of attracting activity and traffic from a lot of sources other than Google.
Myopic SEO vs. Sustainable SEO
Alan toured us through two approaches to SEO, one he calls “myopic” and the other, “sustainable.”
- Myopic SEO = The magic bullet theory. “If I focus on this single type of work, I will succeed!” Not exactly realistic, or responsible. Myopic SEO can be caused by or lead to topical confusion, not enough text, internal link poisoning, unnatural off-site patterns, and similar factors.
- Sustainable SEO = Focuses on user XP as seen through the eyes of search bots & algorithms. Sustainable SEO can be characterized by consistent signals on topical focus, confirming focus, not overwhelming senses, and off-site diversity. Sustainable SEO usability factors include section specific sub-nav, microdata breadcrumbs, high quality topic focused unique content, main content area.
As if we had any doubts, Alan showed some graphs from his case studies, comparing the results of Panda as felt by sites that took a myopic approach to SEO vs. a sustainable one. The sites with sustainable SEO were on a noticeable road to recovery. The myopic SEO sites, not so much.
Spread Your SEO Wings!
Spread out your focus and your efforts. SEO needs to be a broad range of techniques, and tactics. Really take time to put yourself in your users’ shoes. Examine your page. Is it overwhelming? Is it enough? If users have a problem with your site, you can pretty much guarantee that under certain circumstances, Google will, too.
A Word About AdSense…
If you are a real person or a real company that offers real services, don’t have AdSense ads on your site. Just don’t do it. It’s gross. And unnecessary!
If you want powerful results from the search engines and a sustainable user experience, spread out how you communicate what you’re offering. One way to do this is through diverse and optimized anchor text, and beefing up your inbound link profile. The more unique domains sending inbound links your way, the lower your link-to-root-domain ratio. A low ratio means more sites are each sending fewer links to the same site. That is natural (and good), as it helps tell the search engine you have a diverse set of inbound link sources.
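Alan didn’t give a formula, but the link-to-root-domain ratio he describes is just total inbound links divided by unique linking root domains. A quick Python sketch with a hypothetical inbound link profile:

```python
# Hypothetical inbound link profile: linking root domain -> links it sends you.
inbound_links = {
    "blog-a.example": 3,
    "news-b.example": 1,
    "forum-c.example": 2,
    "partner-d.example": 1,
}

def link_to_root_ratio(profile):
    """Total inbound links divided by the number of unique linking root domains.

    A ratio near 1.0 means many different sites each send a few links (diverse,
    natural-looking); a high ratio means a few domains send most of your links.
    """
    if not profile:
        return 0.0
    return sum(profile.values()) / len(profile)

print(link_to_root_ratio(inbound_links))  # 7 links / 4 domains = 1.75
```

The domains and counts are invented for illustration; in practice you would feed this the export from whatever backlink tool you use.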
Be Kind to Bing
If you’re taking the time to submit your sitemap to Google, go the extra step and submit to Bing. Lest we forget, Bing does drive traffic. That said, Alan notes it has a hard time finding content on its own, so submitting a sitemap can be extra helpful. A lot of site owners neglect this small but super smart tactic. Don’t be one of them.
Things Bing Loves
- Diversity in inbound links
- Really tight correlation between anchors and relevant pages
- Social (but that doesn’t mean you should settle for having a social presence. Become a social authority!)
Alan next hit some insightful points on future-thinking:
- Sustainable SEO cares about the future. Always consider, what will users be doing six months from now? A year from now?
- Which myopic SEO techniques are Matt Cutts’ next target?
- What are the biggest emerging tech trends?
- Next big thing: Social! (For Google and Bing… from +1 for websites to authority tweeters and everything in between.)
Let’s talk about Schema.org.
What does it mean? Schema.org means more diversity of deep information – of events, products, locations, and people profiles.
“Schema.org is definitely going to be a ranking factor in 2012,” Alan stressed. “Take the next 6 months to learn about it, to get on board, and to use it.”
Powerful and smart! Last but not least, Micah Fisher-Kirshner. He laid out 11 questions to ask in the event of a massive drop in traffic.
Sample Event: OMG! Your monitoring system goes haywire! Traffic drops 20%+! Why?!?!
Q1 – Is the data fully available? This is where maintaining a good relationship with the ops team is crucial. Massive events require flexibility within the organization. If you don’t have an ops team to work with, you can utilize Google Analytics to double-check. Drill into hourly reports within advanced segments.
Q2 – Who else is affected? Find the limits of the event. SEO affects everything and everything affects SEO. Communication is essential. This isn’t the time to send emails and then wait for responses! Keep your departments nearby.
Q3 – Are there rumors of an algorithm update? The best way to know: read, read, read. Focus on trusted online forums.
Q4 – What was recently launched? Keep an event log. Sometimes, product launches a month back can be the cause of a sudden drop in traffic. At the same time, go bug engineers who may have launched something. Not every detail is written down. Watch for rollbacks that undo critical changes.
If nothing was recently launched, go back to Q3.
Keep on reading! Remember: just because you find one issue doesn’t mean it’s the only issue. Go back to the same forums and search news sites.
Proceed to Q5.
Q5 – What areas are affected? Segment in any way you can. KW grouping, KW length, traffic level, motive, page grouping, home, category, products, domain / site groupings, etc.
Return to Q3.
Any mentions around the web? No? Get back to work. And proceed to Q6.
Q6 – Did something break? Let’s assume everything has been white hat. Sometimes, a broken or forgotten process can lead to a broken website that looks like a black hat SEO site. Ouch! Know what is fundamental to your site’s SEO: backend functions are the easiest things to miss, and worker transitions always drop certain processes. Go back to the ops team to run through the SEO checklist.
Did that? Good. Return to Q3.
Keep reading! Can you find a confirmation from around the web of an algorithm update? Yes? Finally!
So… now what? Now’s time for data collection. Pull the sources mentioning the algo update together as you move to Q7.
Q7 – Who is talking? Recognize the regulars. Skip the broken track records. Always read the important people, even if it’s just one sentence from them. Conversely, scrutinize the strangers: read the long commentators; short comments are typically not worth it. Repeat the mantra, “Jerks will be jerks.” Push past annoyances and listen to what people are saying.
Q8 – What sites are dropping? Your ranking data shows the severity of the impact. Take a look at competitors’ ranking data; this is essential! Seeing who survives can help provide answers about algo updates.
Q9 – What are the theories? Think like a black hat (did we just say that?). Think about what is going on and where the theories fit.
“Hate content farms? Wait for the new click farms!” -Micah
Q10 – What theories fit? Find out everything you can: work your business connections, read blogs for in-depth analysis. Jot down likely possibilities, and make sure you have enough data. Two sites do not = enough data. Throw in a third, and suddenly the data doesn’t quite look right…
Q11 – What can we do to recover? Build for the user, think like a search engine, and learn to love statistics: understand how changes were made. A/B test. Everything. Multiple times. Test across subdomains, page types, and categories.
With that, Micah wrapped up and turned it over for a little Q&A. That’s all aimClear captured, and we hope it whetted your appetite for more #SMX East 2011 coverage to come 🙂 Stay tuned.