If you’re in any SEO groups around the web, you’ll have seen chatter recently about ranking websites with traffic alone. For years we’ve all gone on about quality content and links, and that has always made sense. If you build a website with quality content, and then carefully build relevant, high quality backlinks, all things being equal you should do well in the search engines. As the process has evolved it has become harder and harder to pull off, although due diligence and planning help enormously.
It also makes sense that if you do those two things correctly you will rank, and the organic traffic should then start to flow, so traffic sits third in the sequence. Recently though, it seems Google has switched things around, or at least merged the presence of traffic into its ranking filters. Sites that receive more quality, relevant traffic tend to do better in the search engines from a ranking point of view. So why is this the case?
Social Media and Traffic
As social media has taken off, it’s now normal for a site to gain traffic the moment it launches, either from viral social traffic or paid social traffic. Any other paid traffic will also play a part. Google seems to have realised this. Webmasters are no longer patient enough to wait for organic traffic to roll in; why would they be, when there are masses of traffic at their fingertips on Facebook, Twitter or Instagram?
In that case, it would also make sense that if a site is gaining highly targeted, real (human) traffic, from whatever source, it’s probably worthy of some Google love too.
When I have time, I want to run a case study on simply ranking a brand new site with purely social traffic.
Crowdsearch.me recognised a few years ago that traffic was starting to become important, and Dan Anton and his team released their popular software platform for sending targeted traffic to a website or other properties. It’s very cool, and I’ll probably do a full review of the software some time in the future.
Certainly in the last couple of months I’ve noticed a lot more chatter in the SEO groups about whether this is a real factor. But I haven’t seen any hardcore case studies yet that legitimately prove the theory. For many, organic traffic is hard to get, especially highly targeted real people wanting your product or service, and it takes time. The quickest, easiest and simplest approach is to just buy it.
My SEO Traffic Case Study… Of Sorts
OK, so the site I’m going to talk about has a bit of a blurry history. It was my first agency site (“my city SEO”) here in Australia, and I really didn’t know what I was doing. I was a member of one of the largest and most successful SEO courses, which certainly helped things along. At one point I was ranked positions 1, 2 and 3 for the term “SEO + city”: my site was number 1, a citation at number 2, and my Facebook page at number 3. They stuck for around a year and things were going along pretty well agency wise; it was growing quickly. I’d been pretty careful about how I ranked it, although I could have, and probably did, make mistakes along the way.
In 2015 there was still a thing called “negative SEO”. Google seems to have got on top of the issue for now, or at least the majority of the problem. If you haven’t come across the term before, it refers to an attack designed to negatively influence a site’s rankings by sending large quantities of low quality links, over optimized anchor text, or both.
Ahrefs.com Referring Pages
So on Christmas Eve 2015, someone spammed the crap out of my site.
You can see from the graph above that in January 2015, referring pages (links) increased from around 500 to over 12,000 in just a few days. The majority of these links were from ultra low quality domains, all anchored to the term “SEO + my city”. At the time this represented around 60% of the total anchor profile. Over-optimized? Just a little.
I immediately set about recovering the site, disavowing links and using some 301 redirect canonical strategies. It worked well and I recovered the site within about 3 weeks. It was immediately attacked again and I recovered it a second time, but it gradually slid down the rankings over the next month.
I don’t know which competitor I had pissed off, as I don’t really have anything to do with them in my city, although I have my suspicions, if only because I’ve heard they’ve done this type of thing before. Regardless, I learnt some big lessons:
The SEO industry is no different to any other: there are shit people, and there are good people.
Negative SEO does indeed work
I learnt how to recover a site… fast
What I found most devious was that they executed the attack on Christmas Eve, hoping I wouldn’t notice for a few days or maybe a few weeks.
What Do I Do Next?
So at this point, after 3 attacks and with the site languishing between pages 10 and 100 for all keywords, I had to decide what my next move would be. Luckily I had partnered up with another agency, was busy ranking that site, and was way too busy with clients. So I made the decision to abandon the site. From early 2016 until late August 2017 I did absolutely nothing to it. Actually, I tell a lie: I changed the theme, removed some pages of content, and left just the home, contact and about pages.
Fast Forward To August 2017
Around July 2017, I was playing around with some new software on the market called Serp.tech, a new mass page building software by Herc Magnus and Todd Spears. I have been playing around with mass page builders for a while, including Network Empire’s V-Krakken, a mass page builder using video. It’s a beast, and a tool I’ll talk about in a dedicated post as it’s so powerful (I built a 1,000 subscriber email list in less than two weeks for free using this tool). You can hear more about it here:
One of the first sites I built with Serp.tech was a national SEO agency site that built out every city and town in the country for the term “SEO + city/town”. Once the site was complete, indexed and had started to rank, I simply redirected that traffic to my old SEO site. (I won’t explain here how I redirected the traffic; all I can say is it was not via a 301 or any other redirect protocol.)
So over the last couple of months traffic has dribbled in from real users looking for SEO services, albeit from all over the country. Keep in mind that my local area was also built out on the mass page site, so it has received real local traffic too.
A couple of weeks ago (September 2017), I was checking the rankings for our other agency site and noticed my old site had started to show up for many related keywords. I checked “SEO + city” and sure enough, it was on page 4. A week later it was on page 1, and over the last few days it’s been slowly clawing its way up page 1.
Considering I hadn’t touched the site in nearly 2 years, and had only sent the mass page traffic to it, did the traffic trigger the recovery and rank the site back to page 1? The only other possibility is that the penalty was removed or expired (which does happen), either due to a Google filter shift or simply over time. The majority of the spammy links have dropped off according to ahrefs.com, but that happened over a year ago.
To me, it’s a little too coincidental that after sending the traffic, the site was back within about 3 to 4 weeks, and it was completely by accident. My main aim was simply to give the mass page traffic a landing page that was somewhat functional and user friendly!
I’m inclined to think it presents a strong case for traffic ranking a site. If nothing else, it’s given me a lot of data and ideas to put together a proper case study.
When I first started building websites and trying to improve rankings and generate traffic to them, I remember my mentor, and now business partner, telling me,
“remember content and structure is always the most important part of any website”
I’ve come a long way since first building a crappy little site promoting sports videos, hoping to generate some pocket money with AdSense. I’m now seasoned (and hardened) at ranking and doing it well, but along the way I’ve tried almost every strategy out there. I could have saved myself a lot of pain if I had just remembered those words. Of course, it’s now a core part of how I do things, but for a while I thought I could “trick” Google into ranking sites and “pull the wool over its eyes”. As it turns out, there are ways to spam and “trick” the search engines into ranking and generating traffic, but I’ll leave those to future posts.
These days I’m very strict about the on page aspects of a website, both for my own and clients’ sites, before I even consider linking. Too many times I’ve achieved big changes in a short period with simple but highly effective on page adjustments. I’ve also had moments where I didn’t achieve what I wanted because I was lazy, or thought I could get away with it.
The following is a case study of a client who came to me recently asking for help generating more traffic and rankings in Google.com.au.
The Importance Of On Page SEO – Case Study
Firstly I’ll explain a little of the history of the site (without divulging too much about the site itself, to protect the client’s privacy), and then work through the steps I implemented to achieve some big gains over the course of one weekend. Below is a screenshot of some of the ranking improvements over 2 days, just from implementing on page SEO. I took control of the site on a Friday morning; this screenshot was taken on the Monday after the weekend.
So the cool thing about this is that when I showed the client, they were stunned at the results in such a short period of time. From a client relationship perspective, there is nothing sweeter than starting a campaign like this. It provides an immediate boost to their confidence in you, and it also gives you a little room to breathe while you’re getting off page strategies underway. Keep in mind that this was done with nothing but on page adjustments, nothing black hat or spammy, just simply showing Google exactly what the site is about. To the client, it can seem a bit like magic.
The thing is, these are typical results we see on many new client sites. Typically if a client has had a site built elsewhere, it looks great and probably functions well, but how does it perform?
To be fair, this site is about 6 years old. It was originally on an old HTML platform, a simple blog with a few HTML product elements added. It had never had any SEO implemented, and the backlink profile is rather sparse, so it has age and a clean profile in its favour. It’s in the equestrian niche, which I’d consider a medium competition market, and some keyword research proved that to be the case. Believe it or not, there is a rather dynamic online market for equestrian gear, and it has become quite lucrative with highly priced brands now entering it, so high ticket products are not uncommon.
About 2 months ago, the owner had the site migrated to (in this case rebuilt on) Shopify. Not by a professional agency, but by a friend. So another tick: the site was now on a quality platform with pedigree.
The Exact Steps In 2 Day Ranking Gains
Step 1: SEMRush Audit
I use SEMRush daily, and it has become a tool I couldn’t do without. That, ahrefs.com and majestic.com make up the core of my daily toolset.
The first thing I did was run a quick domain search to get an idea of the current keywords the site ranks for. I also used ahrefs.com to cross reference; they index sites in different ways, so at times you’ll notice differences and find data one has that the other doesn’t. So I ended up with a list like this (note this is just a sample).
With this very quick SEMRush analysis, I can get an idea of the market, where the volume is, and the competition. Now, if this were a new site, or I was doing in depth research for the market, I would delve deeper into competitor and content analysis, but for this task all I wanted was a quick overview of where to focus efforts on the site as it currently exists. We can add more keywords and markets into the strategy later on; right now I just want the core market at hand and to use that data to adjust the site. Once the site has settled after the adjustments, I can move deeper into each product category and do further research on related phrases, long tail keywords and so on. The site owner will have some say in this, as he knows which products have the best ROI, so it makes sense to focus on those first.
Step 2: Adding Keywords To Rank Tracker
This is always one of the steps I do early in a campaign. It’s easy to get lazy and leave it until a week or two in, but in this case study, if I had done that I would have missed tracking the immediate gains. There are ways to go back and grab data historically, but it’s a pain, so getting it done early is easier, and it only takes a few minutes anyway. I use two keyword tracking tools: Rank Tracker from Link-Assistant and ProRankTracker.com. You can download Rank Tracker for free here. I love Rank Tracker as you can track unlimited keywords for the one off cost of purchasing the software. Check out the free trial first.
Step 3: Added Structured Data Markup
I’m a big believer in adding schema to all sites, but it needs to be implemented correctly. My post on using schema for local business websites is a good place to start. In this case, I just generated the schema markup using the Serpspace Markup Generator. For Shopify, I simply added it to the theme.liquid file. Even though the business is national, I added LocalBusiness schema as the business also provides services locally.
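As a rough sketch of what that generated markup looks like (every name, URL and phone number below is a made-up placeholder, not the client’s details), the JSON-LD gets pasted into theme.liquid just before the closing `</head>` tag:

```html
<!-- Hypothetical LocalBusiness markup; replace every value with your own details -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "@id": "https://www.example-store.com.au/",
  "name": "Example Equestrian Supplies",
  "url": "https://www.example-store.com.au/",
  "telephone": "+61 2 5550 0000",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "1 Example Street",
    "addressLocality": "Sydney",
    "addressRegion": "NSW",
    "postalCode": "2000",
    "addressCountry": "AU"
  }
}
</script>
```

Because it sits in theme.liquid, it renders on every page of the store, which is fine for business-level schema like this.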
Step 4: Edited Title, Descriptions & Meta Tags
Firstly I ensured all titles and descriptions were appropriate for their pages, and then added appropriate SEO titles and descriptions.
The next step has had the biggest impact on site rankings, IMO. Shopify uses “collections” to categorize products, and each collection should be treated as a hub page, so attention was given to ensuring titles and descriptions were relevant and topical. Using the keyword research I did previously, each collection has the main keyword in the page title, and the product variations in the description.
Then, to complement the page titles and descriptions, the SEO titles and descriptions need to be added correctly. Shopify has improved this part of the process greatly. The anatomy of this process again used the keywords and product variations; this time, though, I used partial match anchor text for the title, and added the product variations in the SEO title as well. The description was simply used as a way to pique interest.
Essentially what I’m trying to do is include every keyword variation possible from the keyword research I’ve done, using a combination of the page title and description and the SEO title and description.
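To make that concrete, here’s what the rendered head of a hypothetical collection page in the same equestrian niche might look like (the keyword “horse riding boots” and all copy are invented for illustration): the main keyword leads the title tag, and the product variations are worked into the meta description:

```html
<!-- Hypothetical example: main keyword leads the title, variations fill the description -->
<title>Horse Riding Boots | Long Leather, Jodhpur &amp; Rubber Riding Boots</title>
<meta name="description"
      content="Shop our range of horse riding boots, including long leather boots,
               short jodhpur boots and waterproof rubber riding boots.
               Fast shipping Australia-wide.">
```

Between the collection title, collection description, SEO title and SEO description, each variation only needs to appear once; there’s no need to stuff the same phrase into all four fields.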
“Magic”? Hardly. It’s just simple SEO 101, but time and again I see the foundations overlooked.
Step 5: Added To Google Webmaster Tools
So after completing each collection page as above, I added the site to GWT, submitted the sitemap.xml, and waited. The results after 2 days were great.
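One nice thing here is that Shopify generates the sitemap for you, so there’s nothing to build; you just submit the path in GWT. The index Shopify serves at /sitemap.xml looks roughly like this (store URL hypothetical), with child sitemaps for products, collections, pages and blogs:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap><loc>https://example-store.com.au/sitemap_products_1.xml</loc></sitemap>
  <sitemap><loc>https://example-store.com.au/sitemap_collections_1.xml</loc></sitemap>
  <sitemap><loc>https://example-store.com.au/sitemap_pages_1.xml</loc></sitemap>
  <sitemap><loc>https://example-store.com.au/sitemap_blogs_1.xml</loc></sitemap>
</sitemapindex>
```

Submitting the index is enough; Google follows it through to the child sitemaps, including the collection pages we just optimized.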
SEO KISS Principle
So by now you may be thinking:
“is that it? I thought I was going to learn some super ninja optimization trick?”
Keep It Simple Stupid
That’s the point. On page foundations are super critical for any site. The first thing that I’ll do if a site is stuck or not moving is audit the on site elements. I may look at links at the same time, but the on page has to be right to get any traction in the first place.
There is a standard line when talking about ranking sites: it takes time. In many cases it does, and there have been some recent studies on how long it takes to rank a site. But this proves, and I see it almost daily, that adjusting on page elements can bring rapid gains for very little effort.
Now, though, this site is ready for the same treatment on each product page (sigh), and then an off page campaign to the collection pages and products.
As an SEO who works with a lot of local clients, one of the strategies that has become really important is the ability to add reviews to their websites (and by extension star ratings) that show up in the search result listings. The way we achieve this is with schema. Using schema markup has a number of benefits, and having star ratings show as rich snippets in the search results can greatly improve CTR and trust.
It’s been around a while, and there are loads of tutorials online on how to do it. Below I show two simple ways: one for the non techies (no code), and one for those who are happy to insert a bit of code, which is really easy too.
Below is an example:
Why Don’t All Websites Use Star Ratings?
This is something I get asked all the time. “If it’s so easy, why doesn’t everyone use them?” There are a number of reasons. Firstly, even if you add the code, Google will still decide whether or not it shows the rich snippet.
As a general rule, star ratings will not show for the home or root page of a site. I did see them show in the past, but it seems Google has decided it won’t render them there. You can still add local schema to the home page, but I normally leave the aggregate rating sections out. At times the ratings will also not show on inner pages; generally we see this when a page doesn’t have enough supporting content to warrant the rating.
Even though there is a lot of information out there showing how to do this, it’s another thing for webmasters to actually act. Even web design companies struggle with the concept, and we sometimes get told it’s spammy. It’s certainly not spammy if done correctly; in fact, it’s encouraged by Google. This link gives you Google’s take on structured data: https://developers.google.com/search/docs/guides/intro-structured-data
How To Add Local Schema Structured Data To A Web Page Or Post
The two ways I’m going to show you are using a plugin, and adding the code yourself. Both are simple. One costs a little (to purchase the plugin, which has a heap of other great features), and one is just adding code.
Using A Plugin
Using a plugin to add your structured data is very simple. There are a few out there, but you need to be careful which one you use. The problem with plugins is that if they are not updated regularly, they can become out of date quickly. This is often the case with free plugins.
The plugin I use and recommend is Project Supremacy. I’ve used pretty much every schema plugin out there, and this, IMO, is by far the best. Again, the reason is not so much what it does (although it does a lot); it’s that the developers behind it time and again come up with the goods, and are legit. It’s updated all the time, and it simply works. Not only does it allow you to add schema quickly and easily, it also incorporates a heap of other features. It’s actually a standalone SEO plugin, similar to Yoast or All In One SEO, and I use it a lot for that purpose on sites where I don’t use schema, so it’s a bit of a Swiss army knife. The other main feature, which I haven’t used much, is the project settings for doing in depth keyword research. Further to this, you can also add affiliate products really easily.
This video from the sales page gives a bit of information, but it does explain well what the plugin actually does:
I’m not going to go into detail here on how to add schema with this plugin. If you purchase it you’ll get a heap of training on how to use it to its full ability. This is probably among my 10 most used plugins right now.
I’ll use Serpspace here as an example. These guys also provide a range of services for local agencies and affiliate marketers, well worth checking out. To use the schema generator you can simply sign up for a free account.
Step 1: Generating the code
Step 2: Copying the code
Note: The two items that a few people get confused by are the “@id” type and the “sameAs” type.
@id: This is simply the URL of the actual page where the schema is to be added, e.g. a location page of a local website. The URL field is the site home page.
sameAs: These are other important third party network sites. They could include a Facebook page, local directory, Twitter, Instagram etc. It’s important that these are associated with the site where you are adding the schema.
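In the JSON-LD itself, those two fields end up looking something like this fragment (all URLs are hypothetical placeholders):

```json
"@id": "https://www.example-site.com.au/locations/sydney/",
"url": "https://www.example-site.com.au/",
"sameAs": [
  "https://www.facebook.com/examplebusiness",
  "https://twitter.com/examplebusiness",
  "https://www.instagram.com/examplebusiness/"
]
```

Note how @id points at the specific location page carrying the schema, while url stays as the home page.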
Step 3: Editing the code
So you might notice that the code you’ve been provided has no “aggregateRating” type. This is the part of the code that needs to be added for the stars to show up. Project Supremacy has this as part of the build, but it doesn’t have the @id type. So either way, I normally edit the code manually in a notepad before adding it to a site.
So this is similar to the code you should now have:
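As a sketch (the business name, URLs and rating figures below are placeholders, not real data), the edited markup with the aggregate rating added ends up along these lines:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "@id": "https://www.example-site.com.au/locations/sydney/",
  "name": "Example Business",
  "url": "https://www.example-site.com.au/",
  "sameAs": [
    "https://www.facebook.com/examplebusiness",
    "https://twitter.com/examplebusiness"
  ],
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.8",
    "reviewCount": "27"
  }
}
</script>
```

The aggregateRating block sits directly after the sameAs array, inside the same LocalBusiness object.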
You can change the rating value and rating count to whatever you want, but be rational: make sure you are realistic with the ratings. I also recommend that you physically add real reviews to your site, or at least a link to your Google My Business page with reviews (and match them up), or a link to your testimonials page. There have been instances where sites received a structured data manual penalty for showing aggregate ratings without any reviews shown or associated with the site.
You should add the above bit of code just after the last “sameAs” type, identical to the full code above.
Once you’ve added the plugin, at the bottom of each page in the dashboard editor you can add the code as per below. Don’t forget to hit save. Obviously, you only want to add the code on the page where you want the schema and aggregate ratings to show up.
Step 4 – Structured Data Testing Tool Verification
Once you’ve completed adding the code, you want to make sure the data is verified by Google. Head over to https://search.google.com/structured-data/testing-tool, add your page URL in the field and hit “run test”. If you get zero errors, you’re good to go. Google has just crawled the page, so it knows you have the schema added. If you get errors, there is a problem with the code (normally a simple typo); adjust and test again until it verifies.
If you open up the Local Business tab you can see the verified code inside. Any errors will also show up here, which will help you fix them.
How Long Does It Take For The Star Ratings To Show Up In The Search Results?
Generally it will take 2 to 3 days for the star ratings to show up. Sometimes they won’t show at all, and that is just the fickleness of how it works. But 9 times out of 10 they do, and the increased CTR you’ll achieve will be worth the effort, especially if you are already ranked on page 1 and no competitors have star ratings. They simply draw the user’s attention to your listing, making them more likely to click. The psychology of reviews is well tried and tested.