The “new” SEO That Only 7% Of Marketers Truly Understand
SEO has evolved greatly in the past few years. It’s a fast-moving industry: if you don’t evolve as quickly as the search engines do, you’ll fall behind and lose your rankings.
It’s a harsh reality, and the same is true of every other marketing strategy. SEO is not the ultimate traffic source; it’s just as good as any other. The leads are generally higher quality, but SEO involves a lot of hard work and emotion.
It can really screw with you when you lose half your rankings overnight. In my early days of marketing online, I felt severely depressed after losing 100+ rankings overnight; who wouldn’t? The point is that all traffic-generation strategies are volatile.
You never really know what you’re going to get, which is why you need to look into the future and make a highly educated guess about what’s going to happen.
Right now it’s fairly obvious what’s going to happen to SEO. It’s going to become more challenging for little sites to rank. Google wants to see: fresh content, lots of content, contextual backlinks, social backlinks, a community and real authority.
That’s why little sites aren’t going to cut it anymore. As Google pushes out update after update, it’s the little sites that are affected the most. Why? Because small sites are generally less credible and authoritative than big sites.
If a site has been running and consistently publishing content for two years, it’s fairly obvious that the people running it are serious about what they’re doing. It also shows that they probably know a thing or two about what they’re writing about.
And because bigger sites are more credible, and therefore more trustworthy, all the backlinks and traffic they receive are justified. When a weeny 10-page site pops up in the gardening niche and suddenly starts getting 500 backlinks per day, don’t you think it’s a little fishy?
It is fishy, and it stinks of blatant manipulation. That’s why these small “stagnant” sites rarely survive the waves of Google attacks. However, if these sites consistently pushed out content, kept improving their interfaces and strove to have the best content, they wouldn’t be penalized.
Over the course of this article, I’m going to cover the three core components of the “new” SEO. The first is freshness.
Component One: Freshness
Freshness means consistently publishing new content. Most sites never publish new pages or blog posts; they remain stagnant. They are born, live and die with X number of pages. Yet during their early lifetimes they get random waves of incoming backlinks, which results in you-know-what: penalties.
Everyone in the SEO community knows that SEO should be natural, or at least appear that way. Do you think it would be natural for a stagnant website of 10 pages to consistently receive hundreds of new backlinks per week? Probably not. The most popular sites online are those which constantly get updated. Those are the sites that receive the most links naturally.
Component one means constantly publishing new content. If you’re not publishing new content, your site is not growing.
It’s become evident that dormant websites are not welcomed by Google. It’s only a theory for now, but SEOs believe there’s a part of Google’s algorithm that looks for “freshness”. As long as you publish new content at least weekly, you remain under the radar, build authority and rank higher.
Component Two: Popularity
This component is huge, and not in the most obvious way. For years SEOs have been asking, “How can I get more backlinks?”
We need a complete mindset shift. Instead of asking how we can gain more links, we should be asking, “How can I make this more popular?”
Popularity is what rules the web. The more popular your content, the more it’ll get shared, linked to and the higher up it will rank in the search engines. I experienced a breakthrough when I realized this.
When you realize it’s about popularity, not backlinks or any other tech crap, you’ll open up way more traffic sources.
SEO shouldn’t just get you search engine rankings. A good SEO knows that SEO helps in many other ways: good SEO should generate REFERRAL TRAFFIC and SOCIAL MEDIA TRAFFIC as well as SEARCH TRAFFIC.
For as long as I can remember, the SEO community have been 100% focused on gaining backlinks. Backlinks achieve short-term rankings. Any idiot can build a few links and get a few rankings.
But it takes a special idiot to keep those rankings. Most go about link building in a completely unnatural fashion. It may work for 3 months, maybe even 6 months. But inevitably the rankings will come crashing down unless you follow these principles.
Backlinks shouldn’t be artificially created for the sake of building backlinks. SEOs are way too analytical: they analyse their competitors’ link profiles, look for backlinks they don’t have and find the quickest way to build them.
Links shouldn’t be built for the sole purpose of being “links”. They should be bridges to your site. The links you build and receive should drive traffic to your site.
For them to drive traffic to your site, the content used for link building needs to be of the highest quality and posted on high-traffic websites. That’s just a little education on contextual link building. We still need contextual backlinks to rank well in the search engines, even with “new” SEO.
As I was saying, if you look at what Google have been doing, it’s pretty easy to see what they’re “probably” going to do in the future. The current algorithm is too easy to manipulate, because Google have no control over which sites link to which.
That makes it painfully easy to game. Anyone can build a site and link it to their own; it’s really not difficult. However, what if Google could control which sites they pay attention to? What if they could figure out which sites are most helpful just by looking at how many links they’re getting from certain sites?
They’re already doing it.
Facebook, Twitter, Google Plus, LinkedIn. Social media has revolutionized the way people connect with each other and has changed the face of SEO forever. Think about it, there are only a handful of websites that all Internet users regularly use right?
It’s fair to say that 97% of Internet users use at least one of the top 10 social networks on a regular basis. Since so many Internet users, including webmasters, use such a small cluster of sites, it’s easy for Google to gauge how popular a site is by checking out its presence on those top sites.
The problem with the old system was that it was impossible to know if a link profile was artificial or real. Now Google are incorporating social ranking factors into their algorithm. It’s a lot more difficult to manipulate social backlinks than normal backlinks.
Google look at incoming links because they’re a clear indicator of popularity: the more people linking to your stuff, the better your stuff probably is. The logic is clear, but it has many flaws.