4 Steps to Panda-Proof Your Website (Before It’s Too Late!)

It may be a new year, but that hasn’t stopped Google from rolling out yet another Panda refresh.

Last year Google unleashed the most aggressive campaign of major algo updates ever in its crusade to battle rank spam. This year looks to be more of the same.

Since Panda first hit the scene two years ago, thousands of sites have been mauled. SEO forums are littered with site owners who have seen six-figure-revenue websites and their entire livelihoods evaporate overnight, largely because they didn't take Panda seriously.

If your site is guilty of transgressions that might provoke the Panda and you haven’t been hit yet, consider yourself lucky. But understand that it’s only a matter of time before you do get mauled. No doubt about it: Panda is coming for you.

Over the past year, we’ve helped a number of site owners recover from Panda. We’ve also worked with existing clients to Panda-proof their websites and (knock on wood) haven’t had a single site fall victim to Panda.

Based on what we've learned saving and securing sites, I've pulled together a list of steps and actions to help site owners Panda-proof websites that may be at risk.

Step 1: Purge Duplicate Content

Duplicate content issues have always plagued websites and SEOs. But with Panda, Google has taken a dramatically different approach to how they view and treat sites with high degrees of duplicate content. Where dupe content issues pre-Panda might hurt a particular piece of content, now duplicate content will sink an entire website.

So with that shift in attitude, site owners need to take duplicate content seriously. You must be hawkish about cleaning up duplicate content issues to Panda-proof your site.

Screaming Frog is a good choice when you want to identify duplicate pages. This article by Ben Goodsell offers a great tutorial on locating duplicate content issues.

Some suggestions for fixing dupe content issues include:

  • 301 redirecting duplicate URLs to the canonical version.
  • Adding rel="canonical" tags to pages that must live at more than one URL.
  • Applying noindex meta tags to duplicates you can't redirect or canonicalize.
  • Configuring URL parameter handling in Google Webmaster Tools.

Now, cleaning up existing duplicate content issues is critical. But it's just as important to take preventative measures as well. This means addressing the root causes of your duplicate content issues before they end up in the index. Yoast offers some great suggestions on how to avoid duplicate content issues altogether.
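
To prevent duplicates at the source, one common fix is the rel="canonical" tag. Here's a minimal sketch, assuming a hypothetical category page at /widgets that is also reachable through parameterized URLs:

  <!-- Placed in the <head> of every URL variant of the page -->
  <!-- (e.g., /widgets?sort=price or /widgets?sessionid=123): -->
  <link rel="canonical" href="http://www.example.com/widgets" />

With this in place, Google consolidates the duplicate URLs' signals onto the one version you actually want indexed.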

Step 2: Eradicate Low Quality, Low Value Content

Google’s objective with Panda is to help users find “high-quality” sites by diminishing the visibility (ranking power) of low-quality content, all of which is accomplished at scale, algorithmically. So weeding out low value content should be mission critical for site owners.

But the million dollar question we hear all the time is “what constitutes ‘low quality’ content?”

Google offered guidance on how to assess page-level quality, which is useful to help guide your editorial roadmap. But what about sites that host hundreds or thousands of pages, where evaluating every page by hand isn't even remotely practical or cost-effective?

A much more realistic approach for larger sites is to look at user engagement signals that Google is potentially using to identify low-quality content. These would include key behavioral metrics such as:

  • Low to no visits.
  • Anemic unique page views.
  • Short time on page.
  • High bounce rates.

Of course, these metrics can be somewhat noisy and susceptible to external factors, but they're the most efficient way to sniff out low-value content at scale.

Some ways you can deal with these low value and poor performing pages include:

  • Deleting any content with low to no user engagement signals.
  • Consolidating the content of thin or shallow pages into thicker, more useful documents (i.e., "purge and merge").
  • Adding additional internal links to improve visitor engagement (and deeper indexation). Tip: make sure these internal links point to high-quality content on your site.

One additional type of low quality content that often gets overlooked is pagination. Proper pagination is highly effective at distributing link equity throughout your site. But high ratios of paginated archives, comment pages, and tag pages can also dilute your site's crawl budget, cause indexation cap issues, and negatively tip the scales of high- to low-value content ratios on your site.

Tips for Panda-proofing pagination include:

  • Implementing rel="next" and rel="prev" tags so Google treats the paginated series as a unit (see the sketch below).
  • Applying noindex,follow meta tags to low-value paginated archives, such as comment and tag pagination.
  • Offering a "view all" page and pointing component pages at it with rel="canonical" where practical.
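
As a quick illustration of the first tip, here's a minimal sketch of rel="next"/rel="prev" markup, assuming a hypothetical archive paginated at /articles?page=N:

  <!-- In the <head> of /articles?page=2: -->
  <link rel="prev" href="http://www.example.com/articles?page=1" />
  <link rel="next" href="http://www.example.com/articles?page=3" />
  <!-- Page 1 declares only rel="next"; the last page declares only rel="prev". -->

This tells Google to treat the series as one logical unit instead of a pile of near-duplicate, thin pages.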

Step 3: Thicken-Up Thin Content

Google hates thin content. And this disdain isn’t reserved for spammy scraper sites or thin affiliates only. It’s also directed at sites with little or no original content (i.e., another form of “low value” content).

One of the riskiest content types we frequently see on client sites is thin directory-style pages. These are the aggregate feed pages you'd find on ecommerce product listings (both page level and category level); sites with city, state, and ZIP code directory-type pages (think hotel and travel sites); and event location listings (think ticket brokers). Many sites host thousands of these page types, which, other than a big list of hyperlinks, have zero-to-no content.

Unlike other low-value content traps, these directory pages are often instrumental in site usability and helping users navigate to deeper content. So deleting them or merging them isn't an option.

Instead, the best strategy here is to thicken up these thin directory pages with original content. Some recommendations include:

  • Drop a thousand words of original, value-add content on the page in an effort to treat each page as a comprehensive guide on a specific topic.
  • Pipe in API data and content mash-ups (excellent when you need to thicken hundreds or thousands of pages at scale).
  • Encourage user reviews.
  • Add images and videos.
  • Move thin pages off to subdomains, which Google hints at. Though we use this as more of a "stop gap" approach for sites that have been mauled by Panda and are trying to rebound quickly, rather than a long-term, sustainable strategy.

It’s worth noting that these recommendations can be applied to most types of thin content pages. I’m just using directory style pages as an example because we see them so often.

When it comes to discovering thin content issues at scale, take a look at word count. If you're running WordPress, there are a couple of plugins you can use to assess word count for every document on your site, as well as some all-purpose plugins that can help in the war against Panda.

All in all, we’re seeing documents that have been thickened up get a nice boost in rankings and SERP visibility. And this isn’t boost isn’t a temporal QDF bump. In the majority of cases, when thickening up thin pages, we’re seeing permanent ranking improvements over competitor pages.

Step 4: Develop High-Quality Content

On the flip side of fixing low or no-value content issues, you must adopt an approach of only publishing the highest quality content on your site. For many sites, this is a total shift in mindset, but nonetheless raising your content publishing standards is essential to Panda-proofing your site.

Google describes "quality content" as "content that you can send to your child to learn something." That's a little vague, but to me it says two distinct things:

  • Your content should be highly informative.
  • Your content should be easy to understand (easy enough that a child can comprehend it).

For a really in-depth look at "What Google Considers Quality Content," check out Brian Ussery's excellent analysis.

When publishing content on our own sites, we ask ourselves a few simple quality control questions:

  • Does this content offer value?
  • Is this content you would share with others?
  • Would you link to this content as an informative resource?

If a piece of content doesn’t meet these basic criteria, we work to improve it until it does.

Now, when it comes to publishing quality content, many site owners don’t have the good fortune of having industry experts in house and internal writing resources at their disposal. In those cases, you should consider outsourcing your content generation to the pros.

Some of the most effective ways we use to find professional, authoritative authors include:

  • Placing an ad on Craigslist and conducting a "competition." Despite what the critics say, this method works really well, and you can find some excellent, cost-effective talent. "How to Find Quality Freelance Authors on Craigslist" will walk you through the process.
  • Reaching out to influential writers in your niche with columns on high-profile pubs. Most of these folks do freelance work and are eager to take on new projects. You can find these folks with search operators like [intitle:"your product niche" intext:"meet our bloggers"] or [intitle:"your product niche" intext:"meet our authors"], since many blogs publish an author's profile page.
  • Targeting published authors on Amazon.com is a fantastic way to find influential authors who have experience writing on topics in your niche.

Apart from addressing writing resource deficiencies, the advantages of hiring topic experts or published authors include built-in credibility, genuine subject-matter expertise, and, often, an existing audience that follows the author's work.

Finally, I wanted to address the issue of frequency in publishing quality content. Ask yourself this: are you publishing content every day on your blog, sometimes twice a day? If so, ask yourself "why?"

Is it because you read on a popular marketing blog that cranking out blog posts each and every day is a good way to target trending topics and popular terms, and flood the index with content that will rank in hundreds of relevant mid-tail verticals?

If this is your approach, you might want to rethink it. In fact, I’d argue that 90 percent of sites that use this strategy should slow down and publish better, longer, meatier content less frequently.

In a race to "publish every day!!!" you're potentially polluting the SERPs with quick, thin, low-value posts and dragging down the overall quality score of your entire site. So if you fall into this camp, definitely stop and think about your approach. Test the efficacy of fewer, thicker posts vs. short-form "keyword chasing" articles.

Panda-Proofing Wrap Up

Bottom line: get your site in shape before it's too late. Why risk being susceptible to every Panda update when Armageddon is entirely avoidable?

The SEO and affiliate forums are littered with site owners who continue to practice the same low-value tactics in spite of the clear dangers, because those tactics were cheap and they worked. But look at those sites now. Don't make the same mistake.


4 Ways to Create a Better Website Experience for Users & Search Engines

As SEO professionals or webmasters, we have built or maintained websites molded to our perception of what a site should be. But while piecemealing SEO projects, applying band-aids, or bolting on features and content, has the role of your site gone off course?

Taking a step back or “getting out of our own head” allows us to revisit why we even keep this site going in the first place. It exists to convert, to inform, to portray a brand or message, to facilitate a user.

Bringing it Back in Scope

Your site may “look good” but be a failure when it comes to SEO, usability, or conversions. Looks don’t matter to those of us who pay attention to analytics and know the benefits of organic visitors.

While this post has overtones of usability and conversion optimization, it's also important to think about the search engines. After all, search engines use the site too, don't they? Like users, search engines crawling the website must understand it.

What follows are a few common situations where sites have become so consumed with growing, adding to, adapting, and trying to be "it" that they lost focus on what the site is truly supposed to do.

Think Like Your Customers

Search engines have the ability to form semantic relationships. Thus, they can quickly take a 30,000-foot view of your site: understand its theme, know the link relationships and the content of the sites that help form your topical relevance, and so on.

A user doesn’t necessarily know those acronyms you use internally as a widely known reference for what you do or sell. You need to get out of your head and into the heads of your customers.

Just because you refer to products by acronym or slang in-house doesn't mean that's how potential site visitors search for or understand them. The bots may understand you, but you just lost a sale from the human visitor.

First Time Visitors Navigate Differently

If you want users to see your most important information, you have to place it in common areas.

If you sell products at all your locations, then "Locations" should be in the main navigation of your website and featured as a call to action on the page as well. The user sees this, and with the heightened internal linking, the search engine bots see its importance too.

The Navigation Nightmare

There’s nothing more annoying than being lost digitally. Visitors should know in the first few seconds of looking at your main navigation what you offer.

Visitors who land on an internal site page have no idea where they are on the site if the website doesn't use breadcrumb navigation, the URL has no folder structure, pages are located directly off the root or use dynamic naming conventions, or the page offers no sub-navigation (or sub-navigation that differs between internal pages).
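
A simple breadcrumb trail goes a long way here. A minimal sketch, with hypothetical URLs and labels:

  <!-- A breadcrumb trail that mirrors the site's folder structure -->
  <div class="breadcrumbs">
    <a href="/">Home</a> &gt;
    <a href="/products/">Products</a> &gt;
    <span>Blue Widgets</span>
  </div>

Both the visitor and the bot can now tell at a glance where the page sits in the hierarchy.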

Let's take our human hats off for a moment and think about the search engines. Might it annoy them too if they enter a site and can't understand where they are or how the site is tied together?

Key point: people don’t always enter your site on the homepage and neither do search engines.

The Apology Page

Even a well-managed site that is quickly changing or growing is going to see 404 pages appear. We’re human, we sometimes forget redirects.

Would you show up at someone’s front door for a first date with your zipper down, or go to a big job interview with a huge stain on your shirt? 404 pages reflect badly on you – and it’s a bad experience for users and search engines.

You messed up. You have to apologize and hope the user or crawler forgives you.

Counter this by at least serving a custom 404 page that apologizes for the error, shows the standard page template with main navigation, and includes links and messaging pointing visitors to the top areas of the site. The worst thing you can do is show a blank page.
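
Here's a bare-bones sketch of what such a custom 404 page might contain (the page names and URLs are hypothetical):

  <!-- Serve this with a real HTTP 404 status code, not a 200, and wrap it -->
  <!-- in your standard site template with the main navigation. -->
  <h1>Sorry, we couldn't find that page.</h1>
  <p>It may have moved or no longer exists. Try one of these instead:</p>
  <ul>
    <li><a href="/">Home</a></li>
    <li><a href="/products/">Our Products</a></li>
    <li><a href="/contact/">Contact Us</a></li>
  </ul>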

404s are a lost opportunity. Don't believe me? Go to the Content section of Google Analytics and search for landing pages matching your 404 URL (or filter landing pages on the title element of your 404 page), and take a look at how many people encountered a dead page on your site in the last month.

Summary

While this article has only brushed the surface of how you should cater to the visitor and crawling bot experience, these are some of my biggest pet peeves, and they are oftentimes easy fixes.


Google Warns SEO & Businesses to Avoid Fake Reviews

An update to Google's spam detection algorithms will grow the number of reviews appearing on some Google+ Local pages. And Google has shared some advice with reviewers, business owners, and SEO professionals on how to keep reviews from being deleted.

Google warns business owners that “fake glowing testimonies” written by SEO or reputation management companies will be taken down.

Also, “if a business accepts paper comment cards it might be tempting to collect them and ‘digitize’ them by posting the reviews on Google+ Local,” Google says in its advice for SEOs. “We ask that all reviews come from first hand experience and do not allow posting reviews on behalf of others.”

On a related note, Google advises against companies asking customers to write a review on a computer or tablet located on the business’s premises. Google said businesses should send reminder emails to customers encouraging them to review the business on their own time – just don’t go so far as to give free gifts or discounts in exchange for encouraging them to leave positive reviews, Google warned.

Positive reviews on Google+ Local can help attract new customers. But when it comes to negative reviews, Google emphasized that it doesn’t take down negative reviews (unless they violate Google’s guidelines) or work with third-party reputation management companies. Google urged business owners to respond to such reviews and address the reviewer’s concerns.

As for reviewers, they should:

  • Write reviews about one specific location if a business has multiple locations.
  • Not write reviews for a company they currently work for.
  • Not include links in the text of reviews.

How to Verify Google Authorship for Your Blog

Posted by Shivam Rana

Google had been working on integrating authorship into its search algorithms for a long time, and it finally introduced the feature in 2011. It is basically a way to verify the relationship between a person's content on the internet and their Google Plus account.

Google developed the rel="author" HTML attribute to make it easy for authors to link their content to their Google Plus profiles. It also affects the way your site is presented in Google's search results.

After you successfully implement authorship on your blog, your profile picture will show up in search results alongside your content.

Experts are saying that the concept of authorship is going to play an important role in the future of SEO, and I agree with them, as Google has many reasons to push it. It has also been noticed that blogs with authorship implemented get higher CTRs, as their search results contain the authors' smiling pictures, which attract readers. 🙂

How to Implement Authorship on Your Blog

Follow these steps:

  • You must have a Google Plus profile; if you don't have one, create it now.
  • Confirm your email address.
  • Now go to your Google Plus profile and click Edit Profile.
  • Select "About," then click the "Contributor to" option. Add the URL of your blog or site there and save it.

Note: Make sure the name you use in your Google Plus profile also appears in the post's byline (for example, "By Shivam Rana"). Always add the root domain of your blog to the "Contributor to" section of your Google Plus profile.

Now, if you are also guest posting for the purpose of link building, Google must know that it is your content. To do so, add a rel="author" attribute to the backlink to your blog in the author's bio. Don't forget to add the URL of the blog you are guest posting on to the "Contributor to" section of your profile.
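
Here's a minimal sketch of what those byline links might look like (the profile ID and domains are hypothetical):

  <!-- On your own blog: link the byline to your Google Plus profile -->
  <a href="https://plus.google.com/112345678901234567890" rel="author">By Shivam Rana</a>

  <!-- In a guest post bio: the rel="author" link can point back to your blog, -->
  <!-- which in turn links to your Google Plus profile -->
  <a href="http://www.yourblog.com/" rel="author">Shivam Rana</a>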

Responding to the Link Devaluation “Google Update” on January 17th, 2013

It’s been almost half a month and Google still denies that an algorithm update occurred on January 17th, 2013. Even so, SEOmoz’s Dr. Pete noticed that several sites were no longer ranking for branded terms. Within a day of the suspected update, several webmasters contacted us with concerns about big drops in their rankings. We noticed a common narrative in their stories.

Very few of them had seen a site-wide drop in rankings.

Whether the drop in rankings was large or small, the sites were only seeing a loss in rankings for some of their keywords. We discovered that many of them were still enjoying top positions for some of their other keywords.

After analyzing the affected sites in depth, we reached the conclusion that some sort of targeted link devaluation was underway. Some pages had dropped a few pages, others had plummeted out of the search results, and still others were completely unaffected.

We’ve been tracking the rankings of a wide variety of sites over the past several months, and we find ourselves in agreement with what Branded3 has to say on the matter. We’re seeing Google moving in the direction of devaluing links on a continuous basis, as they are crawled, rather than eliminating them in large chunks with one-off updates like Penguin.

At this point, we’re fairly certain that the January 17 event was the result of continuous link devaluation, rather than a single update.

There was already some talk of an update on January 15, and certainly no shortage of grumblings before then. It’s our belief that January 17 was merely the point where this process reached critical mass. If Google is crawling bad links and using them as seeds to detect other bad links, then at some point this process will explode exponentially, which we feel is exactly what happened.

So in that respect, what Google is saying is true. There was no update on January 17. The changes started several months earlier.

Rather than delve into every nook and cranny of these link devaluation algorithms, we thought it would be more useful to offer you a guide to recovery, in a similar vein to our Ultimate Guide to Advanced Guest Blogging for SEOmoz. So let's take a look at how to recover from link devaluation, and how to prevent it in the first place.

What is Link Devaluation?

Link devaluation itself is nothing new. Google has never released an “official” update on the matter, but it has been happening for quite some time. Any webmaster who has witnessed a drop in rankings that had nothing to do with a penalty, increased competition, or a Google update has experienced link devaluation. It is simply the process whereby Google disregards a link, treating it as though it does not exist.

What is new is the continuous devaluation of links. In the past, Google employees manually devalued links, or used a combination of manual selection and algorithmic extrapolation to find and devalue links. Now, it appears that Google is devaluing links as they are crawled and indexed, rather than removing them in large chunks.

Google has many incentives to move in this direction. Penalties are post-hoc and selective. They are usually based on sites surpassing a certain threshold of suspicious links or on-site activity. Penalties, in general, target sites or individual pages rather than the link graph itself. In short, penalties only put a band-aid on the larger problem of webspam.

In contrast, link devaluation cleans up the entire link graph, rather than targeting individual sites.

Were Your Lost Rankings the Result of Link Devaluation?

If you are seeing a slow decline in your rankings rather than a sudden drop, this is almost certainly the result of link devaluation (if it’s not due to higher competition or fewer searches in the first place). But a relatively swift drop can also be the result of link devaluation if it only seems to be affecting specific pages, or if you are still seeing traffic even after the drop.

This is also true for some other updates and penalties, however, so you’ll want to consider the following:

    1. Has there been a Google update? Wait a few days and check with the top sites to see if any updates have been announced around the time you saw a drop in traffic. Check with MozCast to see if there were any major fluctuations in rankings around that time. If an update has occurred around that time, you will want to check into it before pursuing anything else.

    2. Take a look at the total "Links to your site" in Google Webmaster Tools to see if this number is dropping. If so, link devaluation is almost certainly the issue, since Google doesn't typically omit links from Webmaster Tools for no reason. It is a good idea to create a spreadsheet and record your links over time (or use a tool to do this for you), so that these changes are more obvious.

    3. Identify the landing pages that have seen the largest drop in search traffic. If the quality of links is lower than usual and the quality of the content is average for your site, it's unlikely to be Panda. If there hasn't been a Penguin update, this means it is probably link devaluation.

Misconceptions About Link Devaluation

It’s easy to conflate all the various aspects of Google’s algorithm, so it’s important to clarify the following:

  • Link devaluation is not Panda – Panda is designed to target low quality content. It is not based on link metrics. However, links from these affected pages are devalued, and this can indirectly affect your rankings.
  • Link devaluation is not Penguin – Penguin targets entire sites that either use manipulative links to rank their own site or other sites. However, the links from these affected sites are devalued, and this is an effect that you may notice even if your site is not directly hit by Penguin.
  • Link devaluation is not the unnatural links penalty – The unnatural links penalty was a precursor to Penguin that completely deindexed several sites that people were using to manipulate their rankings. Once again, links from these penalized sites are devalued, which can indirectly impact your rankings.

The important thing to understand about link devaluation is that it is not a penalty in the true sense. Devalued links simply don’t count, or don’t count as much as they used to. Most people who are impacted by Google updates aren’t actually directly affected. Instead, they are affected by the devalued links from the sites that are directly affected.

Now that links are being devalued on a continuous basis, you can be impacted even in the absence of a Google update. Do not confuse devalued links with penalties.

Responding to Link Devaluation: Do’s and Don’ts

It’s easy to do more harm than good by overreacting to a devaluation of your links (or any update or penalty, for that matter). Here are a few things to keep in mind to keep you on the right track.

Do’s:

    1. Revise your link building strategy by putting a focus on high quality links. We've written extensively about how to do that at Search Engine Journal with three posts on the subject.

    2. Use an anchor text strategy built for the modern world.

    3. Focus on content marketing as a link earning strategy, rather than merely building links. Take a look at our guide on the subject to get a handle on how to approach this.

    4. Approach SEO from a brand-building perspective, not a ranking perspective.

Don’ts:

    1. Do not waste time removing low quality or spam links from your profile. The bad ones have already been devalued and aren't being counted anymore. They don't count against you if this is genuinely a link devaluation.

    2. Do not use the Google link disavow tool. In general, you shouldn't use this tool unless you have received a warning or a notice of action directly from Google, as we've previously discussed. At best, you'll only disavow the links that Google has already devalued. More likely, you'll disavow links that Google hasn't devalued and end up shooting yourself in the foot.

    3. Do not use any link building technique that allows you to personally build a large number of links quickly.

    4. Do not build links using easily produced or automated content. Build links using content that attracts attention.

    5. Avoid links that don't make sense in the modern era, like the ones we talked about in this SEJ post.

5 Reasons the Push Toward Link Devaluation is Actually a Good Thing

If Google’s new emphasis on continuous link devaluation sounds scary to some SEOs, here are a few reasons to see the change as a positive one:

    1. Devalued links don't count against you, so there is no reason to spend time removing all the suspected links yourself.

    2. Devalued links don't cause your site's traffic to plummet overnight in the majority of cases, which gives you time to adjust your strategy.

    3. You will generally still see some of your pages unaffected after link devaluation occurs, unless a large number of devaluations causes your entire domain authority to start sinking.

    4. You can focus all of your efforts on building high quality links, rather than being concerned about weeding out bad ones.

    5. Spammers will be less likely to see results even in the short term, as opposed to the repeated process of success then penalty over and over again. Similarly, you will get more consistent feedback from your rankings about whether what you are doing is working or not.

Conclusion

Link devaluation is not easy to identify unless you analyze your profile deeply or get help from professionals. Identifying the right cause of a ranking drop is more important than taking action and moving forward blindly. I highly recommend getting professional help if you are unable to identify the cause, as moving in the wrong direction will put you in deeper trouble and may keep your website's rankings from ever making a lasting recovery.


How to Hire & Retain SEO Talent in 2013

One of the most overlooked necessities required to execute performance-driving SEO campaigns is talent.

Whether you’re doing SEO in an agency environment, in-house, or as a freelancer, your success depends heavily on the talent of your employees. A lack of talent hurts the service your agency provides, the results your in-house department drives, and the amount of money you make as a freelancer.

Over the last three to four years, the availability of talent in the marketplace has failed to keep up with demand. There are many reasons for this, some of which are chronicled in this article. A few of the author's points we agree with, others not so much.

Either way, this has created fierce competition for middle and upper management SEO jobs. So how can you hire and retain SEO staff in 2013?

Train & Develop Junior Staff

The most eligible candidates for any mid-level SEO opening should already work for you. Investment in entry level employees leads to a steady pipeline of qualified candidates.

Most companies face three common barriers when hiring mid-level staff:

  • Lack of talent in the marketplace.
  • Opportunity loss of the vacant position (recruiting time).
  • Cost.

Promoting from within your organization guarantees quality candidates, significantly reduces recruitment and transition time, and minimizes cost. As a bonus, promoting from within is a great way to increase employee morale and retention.

Hiring & the Value Proposition

The fact that I have the word “SEO” in my title guarantees me the pleasure of interacting with tons of recruiters, most via LinkedIn. The most powerful tool any of these recruiters and hiring managers have is their value proposition.

Creating a compelling and unique value proposition for potential employees will make recruiting mid- and upper-level talent much easier.

The majority of companies use this infamous combo to pique interest: money and titles. While the current marketplace enables recruits to demand plenty of both, it’s not unique.

Every recruiter in the industry is going to offer a $5,000, $10,000, even $15,000 salary bump (depending on level) to virtually any qualified candidate. In addition, 95 percent of hiring managers target recruits where a title increase can be leveraged.

Developing a creative and unique value proposition will help you stand out from the pack. This includes aligning your value with what SEO candidates are looking for. Some examples include:

  • Transparency from executives: Some employees need to “be in the loop”. Does your company align corporate goals with personal employee goals? If so, leverage this information in the recruitment process.
  • Entrepreneurial environment: Some SEO candidates may desire to work in a setting that fosters and rewards initiative. Having freedom to be creative and test new strategies is critical to the success of SEO campaigns. While some companies are extremely process oriented, those who aren’t may have an edge. Recruits may also be looking to expand their job responsibilities organically. If employees recognize areas for growth and new opportunities, can they pursue them in your company?
  • Unique corporate culture: If you work at an agency, you most likely already have an appealing culture. Most SEOs are young, so mentioning fun company events, happy hours, and other cultural benefits can be a huge draw. What new employee doesn’t want to play some ping pong in between doing some blogger outreach?
  • Being part of a talented, passionate staff: Most talented employees want to be surrounded by other smart individuals. Do you have any standout or well-known employees on your staff? Leverage your existing talent to help recruit new employees. Ask these employees to message potential candidates and give real testimonials. Most of the time, candidates respond better to peers or potential managers than they do to recruiters.
  • Working for large brands: We’ve all seen and heard brand name dropping on resumes or in meetings. Some candidates will sacrifice other benefits for the chance to put a large brand on their resume. Specific to SEO, larger brands will see incremental lift from on-site optimization quicker than smaller companies. For on-site SEO specialists, this opportunity can be appealing. The PageRank of the clients’ site can also be appealing if it’s high enough. Who doesn’t want to work with a PR 8 or 9 site?

These are bare-bones examples that may not apply to your individual company. Taking the time to define and communicate a compelling value proposition will enable your company to stand out and successfully recruit a higher level of talent.

Talent Retention

Employee turnover is bad for business for a variety of reasons. The cost of recruiting talent is high. Client relationships and/or campaign performance can suffer when team members leave.

The perception that your company is a “chop shop” and doesn’t treat employees well can hurt future recruiting efforts. So what can you do to retain your talent?

  • Close new business: The majority of retention tactics rely on closing new business to create opportunities. If you aren’t closing new business, it will be extremely tough to retain talent. A strong sales funnel can also be used in your value proposition for filling open roles.
  • Establish career paths: Take the time to craft custom career paths for stellar employees. Does an employee aspire to manage people or be a tactical expert? Identifying opportunities for growth that align with personal interest will increase retention.
  • Aligning goals: Employees need to understand how their work contributes to the overall goals of their department and/or company. A large percentage of candidates mention this specifically when responding to the question "What are you looking for in a job that isn't being fulfilled currently?" Aligning goals from the executive level down and communicating the progress toward these goals will increase employee satisfaction and retention.

Stay Competitive: Invest in SEO Talent in 2013

Agility and creativity are required to drive incremental organic performance. Processes and procedures aren’t agile or creative, but people are.

Talented employees within your organization will be responsible for creating the new strategies and tactics required for success. Investing in hiring and retaining talent will ensure your organization remains competitive.

Setting the standard is always better than following it. Leverage your talent to set new standards and continue driving SEO performance in 2013 and beyond.


Why Thin Content is Hurting Your Ecommerce Site & How to Fix It

By now, everyone in the ecommerce search engine world – including digital marketers and content writers – knows about the Google Panda update or, worse, has been directly affected by it. Many businesses were taken off guard when it first hit and have paid search engine optimization (SEO) consultants a large sum to recover prior search performance levels.

However, what if an ecommerce site was never “hit”? What if Panda is still hurting businesses, but they don’t even realize it?

This undiagnosed pain is a missed opportunity and one that hurts branded manufacturers the most.

When a brand is successful, it doesn’t need to worry about SEO, or so they say. But what if a brand is competing with large retail partners online (not to mention Amazon) for its own brand terms, or for generic searches on its key product lines?

What Does Thin Content Have to do With It?


As ecommerce continues to grow, a larger percentage of shopping takes place exclusively online. Search engines are the virtual malls where searchers window shop for their next purchase, from whichever device is most convenient.

While many brands with loyal customers can trust in continued business for their core offering – say, premium down parkas – how does a business lure in new customers online for this successful product line? Search is an obvious, but not always easy choice. Now it’s time to either pay for or make a play for premium organic real estate for “down parkas” in the search engine results.

With Panda, it's not enough to sprinkle a couple of keywords on a page and call it a day. Panda changed the rules – prioritizing text content over other attributes, effectively punishing brands whose digital worlds exist only in images. Ecommerce businesses are now forced to think like traditional websites and focus on content.

2012 was the year of content strategy; in 2013, it’s paramount that ecommerce businesses include SEO as part of their content strategy, as well as their overall digital strategy. It all begins with an understanding that yes, thin content is a problem and it needs a solution.

How to Identify ‘Thin Content’ Category Pages

What you see…

[Screenshot: a Barnes & Noble vampire books category page as visitors see it]

Is not what Google gets…

[Screenshot: Google's text-only cached version of the same category page]

As opposed to a typical web page – say, an article on the popularity of vampire books – an ecommerce category page has to squeeze more out of every attribute to signal to Google why the page is really, truly, about vampire books.

To understand what Google sees, you can type cache:http://www.example.com/thisisyourcategoryurl into Google and then select "Text-Only Version." This will resemble the Barnes & Noble vampire category above.

Here's what is most likely to be found missing in action on a typical ecommerce site:

  • A paragraph of text copy describing the category. Three to five sentences is sufficient; best practice is to include a couple keyword variations. For example:

[Screenshot: example paragraph copy on a vampire books category page]

  • Links within the paragraph copy, rich with anchor text linking out to subcategories. Internal links help Google understand content relationships. Building these out between associated categories is ideal. For example:

Browse our entire vampire book collection – vampire novels, historical reference books, and teen vampire fiction.

  • Alt attributes for the category image banner to describe what it is, as Google can’t interpret images. The image alt attribute helps to indicate the image “content” and is one of Google’s 200+ ranking factors. For example:

<img src="vampire-books-category.jpg" alt="vampire books">

  • Breadcrumb navigation, which helps search engines understand site architecture and crawl the site with ease.
  • A text heading for the category name, tagged with an H1 heading to indicate it is the theme of the page. Where content doesn’t exist, an H1 heading helps Google understand the meaning of the page. A text heading is much preferred over an image banner; where necessary, it’s best to implement both in tandem. For example:

[Screenshot: a category page heading marked up as an H1]

  • A title tag which includes the category name (or related primary keyword “vampire books”), listed first as an exact match before the brand, as Google reads title tags from left to right, placing more importance on the first term.
  • A short, friendly URL that includes the category name (or related primary keyword “vampire twilight books”). The ideal would be to have the keyword right after the domain, though a category parent or folder structure should not impact the URL strength. For example:

[Screenshot: a search result showing the title tag and friendly URL]
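
Pulling these elements together, here's a minimal sketch of a category page template built around the vampire books example (all URLs, file names, and class names are hypothetical):

  <head>
    <title>Vampire Books | Example Bookstore</title>
  </head>
  <body>
    <!-- Breadcrumb navigation exposes the site architecture -->
    <div class="breadcrumbs">
      <a href="/">Home</a> &gt; <a href="/fiction/">Fiction</a> &gt; Vampire Books
    </div>

    <!-- A text H1 carries the theme; the banner image gets a descriptive alt -->
    <h1>Vampire Books</h1>
    <img src="vampire-books-category.jpg" alt="vampire books">

    <!-- Three to five sentences of category copy, rich with internal links -->
    <p>Browse our entire vampire book collection –
      <a href="/fiction/vampire-novels/">vampire novels</a>,
      <a href="/reference/vampire-history/">historical reference books</a>, and
      <a href="/teen/vampire-fiction/">teen vampire fiction</a>.</p>

    <!-- Product listings follow... -->
  </body>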

Summary

It might seem surprising that something so simple doesn't exist on all ecommerce category pages. But that would be ignoring the reality of the many, diverse hands – merchants, marketers, copywriters, front-end developers, agencies – who ultimately shape the user experience (and architecture) of a category page.

Ecommerce sites are designed to be shopped, not read; text is a useless distraction in the path to purchase. The challenge is that Google’s algorithm doesn’t always note the difference.

Balancing the need to drive traffic to and revenue from a category page is unfortunately not simple science. However, a solid mix of the above elements, leveraging the right keywords, will push a site further along the path to success.

When in doubt, emulate the dominant SEO players in your business niche. An eye to the competition is far more insightful than a checklist could ever dream to be.


Link Audit Diagnosis Tips

Most link audits come about after the realization that a decrease in traffic relates directly to shrinking conversions.

Like most things in life, action comes out of necessity. The truth of the matter is easy to swallow when your conversion rate drops while you watch competitors rake in the dough.

Auditing a link profile is much more than backlinks and referring domains. If you only scratch the surface, the result will be tragic. There are numerous metrics to account for when researching the reason for declining traffic and unnatural link warnings.

Diagnosing Your Sick and Twisted Link Profile

The first consideration is method of diagnosis.

When you go to a doctor because you aren’t feeling well, does the doctor ask questions and then send you on your way with a prescription? If so, get a second opinion.

A good doctor will utilize the tools at his or her disposal to diagnose and treat the illness. Think of your link profile as a sick human and consult a professional to care for the website foundation.

Best Link Profile Audit Tools

Use the best tools available when diagnosing the problem. Some of the useful tools are Majestic SEO and Link Research Tools.

I can’t say enough about Link Research Tools and Chris Cemper. The dedication put forth to build and maintain such a wicked toolset is amazing; kudos to Chris and his team as well as many thanks for delivering data by the truckloads.

Recognize the Good Links

There are many different ways to skin this cat but the end result is the same.

You may start by recognizing the quality links with the Link Research Tools QBL, if you so choose. Check the:

  • Anchor text (matching this metric to keywords for which you are probably losing traffic is always useful)
  • Link status (follow or nofollow)
  • Link type (image, text)
  • Deep link ratio (index page, interior page, or deep link)
  • TLD (com, net, org, info or some obscure and strange one you may have never heard of like ws)
  • Country
  • IP
  • Class C
  • etc.

Comparing Link Profiles

Take a look at how your website compares to your competitors for a specific keyword (such as one for which you are seeing a decline in traffic). Top metrics to notice are IP, Class C, and referring domains.

Word to the wise: the number of referring domains is often overlooked and should be taken into consideration on many levels. It’s better to have thousands of referring domains yielding a few links each as opposed to having hundreds of referring domains yielding many links each.

Knee Deep in Dirty Links


Take a gander at the total number of historical links (Majestic SEO) for the same keyword you used to compare and contrast competitors. Take notice of the backlink discovery and referring domains for the previous two years.

Now that you have your surface data, getting into the details is where it’s at. Pull a historical backlink report for your domain in Majestic. Also valuable: taking notice of the backlinks you have lost.

Anchor Text

Go back to your list of keywords declining in traffic and locate them on the anchor text list from Majestic. You’ll notice a pattern starting to form in regard to the percentage of certain anchor text and where the anchor text is found on the web linking to your website.

Link Rehab

[Screenshot: the Link Detox tool in Link Research Tools]

Now it’s time to get dirty and process the toxic link list. Yes, you guessed it, Link Research Tools time again.

Click that detox icon and wait it out. Depending on the number of links pointing to your site, this may take many hours.

Detox a link profile with nearly 2 million links and you’ll be waiting a few hours for that report. If you only have a few thousand or tens of thousands, you’ll be good to go in 20-30 minutes.

When the report is complete you will probably notice that the links on the list are from domains that have been deindexed by Google. There is a high probability that all of those links are garbage and you don’t want backlinks from those sites.

The Link Detox Report is a bit tricky if you’re a first time user as the list won’t show all bad links from the same domain. You will have to cross reference these links with the historic backlink list to get the total number of bad links per domain. For example, one of my experiences using the tool was seeing five toxic links in the report and finding a total of 39,000 bad links from one domain in particular.

You may also see suspicious links flagged in Link Research Tools. The links designated SUSP 7 and 9 are the ones you may want to investigate further, as these are often found to be link farms (and Google doesn't look kindly on link farms or networks).

Summary

An article such as this could go on and on, but if you would like to know more about the dirty details regarding what to do next and the disavow tool, leave a comment about which link audit topic you want to learn more about. After successfully detoxing your link profile, try some new link building.

 


Victorious Over Google & AOL, Vringo Targets Microsoft in Search Patent Suit

Microsoft has become the latest company to face a suit from Vringo subsidiary Innovate/Protect (I/P Engine) over search patents.

The suit is based on two online search patents owned by the Vringo subsidiary. Patent numbers 6775664 and 6314420 are based on a technology that allows search engines to work concurrently with advertisement systems.

I/P Engine alleges that Microsoft has infringed, and continues to infringe, on the two patents. The firm is seeking damages for past and future revenues potentially gained from the two search patents.

Vringo made headlines last November when it successfully sued Google and AOL for damages based on online search-related patents. The jury in the case awarded Vringo $30 million in damages. Vringo was originally looking to score some $696 million in damages from the consortium of infringing companies.

Vringo was founded in 2006 as a mobile software firm. Last year the company merged with I/P Engine, an intellectual property firm that makes money by licensing patents.

It was following the merger that Vringo received the patents involved in the current lawsuits. Both patents were originally created by employees of the internet search firm Lycos.


Google Submits Response to EC Antitrust Regulators

Google has submitted a response to the European Commission (EC) over its anti-competitive business concerns, Reuters reported.

Competition Commissioner Joaquín Almunia confirmed that Google submitted the documents Thursday evening, just in time for the deadline set by the EC. No details of the response have been made public.

Theoretically Google could face a fine as high as $4 billion – 10 percent of its global turnover – if found to be in breach of European laws.

The Initiative for a Competitive Online Marketplace (ICOMP), which is backed by the likes of Microsoft, welcomed Google’s response but said the most important thing was for the EC to act to curb any excesses in market dominance.

“To be seen as a success, any settlement must include specific measures to restore competition and allow other parties to compete effectively on a level playing field,” said David Wood, ICOMP legal counsel. “Any settlement must include explicit acceptance by Google of its dominance and that it has damaged European businesses through its anti-competitive practices.”

Google faced similar concerns in the U.S. that came to an end at the start of the year. After the conclusion of the FTC investigation, Google voluntarily made changes to some of its practices, including how it handles search and patent issues, and agreed not to sue willing licensees over standard essential patents.
