Fantastic Interview With 2 SEOs: Aaron Wall and Rand Fishkin


It is always interesting to hear debates between high-level search engine optimization practitioners, and this is a great exchange between Aaron Wall and Rand Fishkin.

Rand from seomoz.com has been a leader in opening up the tricks and tactics of SEO and was one of the first to publish specific techniques openly. Aaron at seobook.com is a great SEO advocate and routinely grills Google over double standards that occasionally run against its “Don’t Be Evil” motto.

Below are some great excerpts from the SEO interview; some of the content has been edited for brevity.

What are the biggest counter-intuitive things you have learned in SEO, things that theoretically shouldn’t work but do?

That the best content rarely wins. Content that best leverages (intentionally or not) the system’s pulleys and levers will rise up much faster than the material the search engines “intended” to rank first.

Another big one includes the success of very aggressive sales tactics and very negative, hateful content and personalities.

A very specific, technical tactic that I’m always surprised to see work is the placement of very obvious paid text links.

As a business selling something unique, I have found word-of-mouth marketing to be a more effective sales channel than SEO. Do you think the search results are overblown as a concern within the SEO industry?

In our analyses, it’s always a combination of things that leads to a sale. People search and find us, they hear of us, they find us through social media or a referring site, and maybe then they’ll sign up.

This is what makes last touch attribution so dangerous, but it also speaks to the importance of having a marketing/brand presence across multiple channels. I think you could certainly make the case that many of us in the SEO field see every problem as a nail and our profession as the hammer.

What business models do you feel SEO fits well with?

I think SEO is terrific for a business that has content or products they can monetize over the web that also relate to things people are already searching for. It’s less ideal for a product/service/business that’s “inventing” something new that’s yet to be in demand by a searching population.

If you’re solving a problem that people already have an identified pain point around, whether that’s informational, transactional or entertainment-driven, search is fantastic. If that pain point isn’t sharp enough or old enough to have an existing search audience, then traditional advertising may actually do a better job of moving the needle.

When large companies violate Google’s guidelines repeatedly, usually nothing happens, yet smaller companies, when outed, often get crushed due to Google’s huge market share. Opine.

I would agree it’s not cool that Google applies its standards unfairly, but it’s hard to imagine a world where it didn’t. If mypaydayloan.biz isn’t in Google’s index, no one thinks worse of Google. If Progressive.com isn’t in Google (even if they bought every link in the industry), searchers are going to lose faith and switch engines. The sensible response from any player in such an environment is to only violate search guidelines if you’re big enough to get away with it or diversified enough not to care.

I’m unhappy with how Google treats these issues, but I’m equally unhappy with how spam distorts the perception of the search engine optimization field. Thought leaders in the technology field often malign our industry, usually because of the “small” spammers. I don’t think most web spam should be classified as “SEO” and I don’t think any SEO professionals who want our field to be taken seriously by marketing and engineering departments should consider anything but white hat internet marketing tactics.

How do you see SEO business models evolving over the next 3 to 5 years?

As for the evolution of the SEO business model, we’re likely to see more sophistication, more automation and more scalability (and, hopefully, more software to help with those) over the next few years from both in-house SEOs and external agencies/consultants. It’s surprising to me how little SEO consulting has progressed from the early days. Internet marketing disciplines like email marketing and analytics have become more structured, with sophisticated software and a standardized feature set. Many great companies compete in the analytics space even though Google has made competition there more challenging; creative companies like KissMetrics, Clicktale and Unbounce are still doing interesting and innovative things despite competing with the free Google Analytics tool.

Small businesses are an under-served market, but also the hardest to serve since they have limited time and small budgets. Do you think the rise of maps & local properties such as Google Places gives them a new opportunity, or is it just more layers of complexity they need to learn?

New positive opportunities for online marketing! Small business owners are beginning to get savvy to the web as a major driver of revenue. I think it might take another 10 years or more before we see true SEO maturity from local businesses. This gives a huge competitive advantage to those who are willing to invest the time and resources into doing local SEO right.

SEO gets only about 10% of companies’ online marketing budgets. When does the delta between paid search & SEO investment begin to shrink?

I think it’s shrinking right now. Paid search is so heavily invested in that it’s a mature market. SEO is growing at a higher compound annual growth rate according to Forrester, so the difference between natural and paid search spending should keep shrinking.

[Figure: SEO growth chart]

Often a Google policy sounds like something coming out of a conflicted government economist’s mouth. How much further do you think Google can grow before they collapse under complexity or draw enough regulatory attention to be forced to change?

I think if they are careful and invest heavily in political donations and PR, they will maintain a very positive outlook for the next decade. The unpredictable nature of online activity and its wild shifts probably help them avoid most regulation. The rise of Facebook.com has actually reduced their risk exposure to government intervention.

You get lots of traffic from Facebook & Twitter, but almost 0 sales from it. Does there become a point where search is not the center of the web in terms of monetization?

As direct traffic portals, it’s hard to imagine a Facebook/Twitter user being as engaged in the buying/researching process as a Google searcher. Google traffic is just more valuable.

Those companies may launch products that compete with Google’s search revenue model, but I don’t foresee them being a direct sales channel. They’re great for driving traffic, branding, recognition and ad-revenue model sites, but marketers should not worry about the relevance or value of search disappearing.

What are the major differences between LDA & LSI search algorithms?

They are methodologies for building a vector space model of terms/phrases and measuring the distance between them as a way to find more “relevant” content. LSI was first developed in 1988 and has lots of scaling issues. Its offshoot, PLSI (probabilistic LSI), attempted to address some of those issues when it came out in 1999, but it still has scaling problems.

LDA (Latent Dirichlet Allocation) arrived in 2002 and is a more scalable (though still imperfect) system with the same intuition and goals: it attempts to mathematically model the distances between concepts and words. All of the major search engines have lots of employees who studied this at university, and many folks at Google have written papers and publications on LDA.

Google’s ontology changes over time. We see drastic ranking increases and drops for outlier pages that target related keywords, and search results for two similar keywords keep bouncing between result sets. How do you account for these sorts of changes?

We haven’t been changing the model. However, one nice thing we get to do consistently is run our models against Google’s search results. If Google changes, our scores and SEO recommendations will change as well. We can quickly determine where Google is today and adjust our tools accordingly.

Some firms use predictive analytics to automatically change page titles & other attributes on the fly with proprietary systems that lock in clients.

Personally, I don’t like it, and I’d be surprised if it worked.

Editors/writers should be responsible for content, not machine-generated systems built to optimize for search engines. Machine systems can and should make recommendations, but I fear for the future of your content and usability should “perfect SEO” be the driving force behind every word and phrase on your site.

With links being such a powerful ranking signal, it’s much better to have a slightly less well-targeted page that people actually want to link to than a “perfect” page that reads like machine-generated content.

I think content creators who take pride in their work are the ones who’ll be better rewarded by the engines, and those are the same type of creators who won’t permit a system like this to automatically change their content based on algorithmic evaluation. Writers don’t want HAL as their editor.

When I got into SEO, it seemed like you could analyze a person’s top backlinks and then literally just go out and duplicate most of them fairly easily. Now my approach to SEO has moved away from analysis and more toward just trying to do creative marketing & hoping some part of it sticks.

The data about links, on-page factors, social stats, topic models, etc. is great for the analysis process, but it’s much harder to simply say, “OK, I’ll just do what they did and then get one more link.”

That analysis and ongoing metrics tracking is still valuable, because it helps define the distance between you and the leaders and gives critical insight for making the right strategic/tactical decisions. It’s also great for determining whether you’re making progress, but data can only take you so far. Becoming an influencer and having a community around your website is what will drive the SEO of the future.

How big a problem are web spam and page churn?

The decay of pages on the web is a big problem, with only 20% of pages remaining active after 1 year.

[Figure: web page churn]

To reach the deep corners of the web, we’ve found that filtering out spam and “thin” content is the big problem. Just as email traffic is estimated to be 90%+ spam, it’s possible that the web has similar proportions.

Read the full interview on seobook.
