This is a TRUE & SUCCESSFUL story from Jim Miller – founder of Salesbloom and one of our long-time customers.
Jim Miller is an e-commerce consultant and founder of Salesbloom who collaborates with world-renowned marketing gurus. Enjoy his story, written and documented by himself, and see how he hit Google's top SERP spots with no backlinks, just good content optimization.
Over a six to nine month period of using the Content Assistant optimization tool, I have seen numerous pages swiftly develop and climb to the first pages of Google. So getting traffic without building backlinks can be achieved with the right content strategy.
This case study was composed to help share the success I had for my client.
Working behind the scenes with some of the fastest-growing marketing agencies, you encounter conflicting strategies, both internal and external, that promise to improve your client's rankings.
I'm a firm believer in testing everything before trusting information from anyone, whether an influencer, an agency or otherwise, so I work behind the scenes to find out what actually works.
Years ago, I cracked an algorithm on eBay’s search engine with one of my first career positions, taking a small (and I do mean small) local footwear retailer from £10k online sales per month to £250k (£450,000 Christmas Peak).
All in just under six months!
Yes, it's true – and this included outselling retail giants like 'Barretts', 'Office' and 'Schuh' – all without a penny of marketing budget!
I’ve never had the luxury of limitless budgets (and I prefer not to either as the constraints bring out the best in me). Having a typically micro-budget (maybe being a tad frugal as well), I’m always hesitant to pick something like ‘Marketmuse’ and put my house on the market for a taste!
Being overworked (and, if I'm honest, a little disorganized at the time) necessitated a search for a timesaving tool for obtaining the broader metrics – the overarching topics and keywords that help others rank more highly. It was then that I came across Cognitive's Content Assistant tool, though at that time it took a little digging, as the software was so new.
Cognitive's Content Assistant tool gave me the chance to see correlated topics and missing keywords that let the content shine through. I'm not a natural writer; like many e-commerce consultants, I rely on others to help me edit my thoughts. I might add that my thoughts are not always easy to convey, so this tool was an immense help.
I found cognitiveSEO intuitive for the most part, simple to use and ideal for communicating results to others.
Now at Salesbloom, we constantly develop existing content using Cognitive and produce new articles within the same content writing process.
Here’s how we did it (as I’m honest enough to say I have great colleagues too!) …
How We Hit Google's Top SERP Spots
If you've never used Cognitive, or unless you're comfortable dropping into Google Search Console, it's worth tracking your existing content's keyword positions with Rank Tracker before you start.
The first things to do are:
Find existing content you determine needs to be improved upon.
Search for the main keyword with the ‘Cognitive Keyword Tool’.
Optimize your content with ‘Content Assistant’.
Fetch the URL within Google Search Console.
Step 1: Find Existing Content to Improve
One of the best places to start is finding pages with keywords or queries that currently rank outside the top five spots on Google.
The 'sweet spot' is choosing keywords that sit between positions 4 and 30. As you can see, the blue line above did brilliantly!
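As a rough sketch of this filtering step, here is how you might pull 'sweet spot' queries out of a Search Console export. The sample rows, field names and thresholds below are invented for illustration, not output from any particular tool:

```python
# Hypothetical Search Console export: query, landing page, average position.
rows = [
    {"query": "seo agency", "page": "/services/", "position": 12.4},
    {"query": "seo agency london", "page": "/services/", "position": 3.1},
    {"query": "content audit", "page": "/blog/content-audit/", "position": 27.8},
    {"query": "link building", "page": "/blog/links/", "position": 45.0},
]

def sweet_spot(rows, low=4, high=30):
    """Keep queries ranking in the 'sweet spot' between positions low and high."""
    return [r for r in rows if low <= r["position"] <= high]

candidates = sweet_spot(rows)
# Candidates here: "seo agency" (12.4) and "content audit" (27.8).
```

Queries already in the top three are excluded on purpose: they have less headroom than the ones sitting just outside the first page.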
Step 2: Search for the Main Keyword with Cognitive’s Keyword Tool
Pick your head term, or the keyword you want to rank for, and search with the keyword tool. In the example below, I chose "SEO agency" as my keyword phrase.
Feel free to pick any keyword and page you like as I’ve had success with category pages on e-commerce websites, homepages and blog articles.
Make a note of the average content performance score; the goal is to beat it in your optimization round. (I suggest you open a new Google Doc and add it at the top to remind you of your target.)
It's good to pull out the relevancy scores and consider any term that scores higher than 3 stars. Write your new content so it reads well, without worrying about keyphrases initially. Once you've done that, copy the list of keywords that Cognitive suggests and work out a structure for your article based upon the 3-5 star terms.
Take a peek at the ranking analysis tab next to see how you are performing before you continue.
It's a good idea to review competitor pages' look, feel and content flow. Google 'likes' these sites for a reason, so studying them is a 'no-brainer'. (Ultimately, it's because the visitors like them as well!)
The action to take for this example is: write a 700-4,000 word content piece that beats a score of 69.
Step 3: Optimise with Content Assistant
Once you're in the Content Assistant, click 'Import link' in the top right if it's an existing page (and yes, it can even score raw code). Otherwise, copy and paste your content into the relevant sections and hit 'Check Score'.
This site is new, so it's typically not going to score well yet, but if you scroll down on the right-hand side you now have a list of keywords to export and use. You do have the option to amend the copy on the left-hand side in real time, but each check of the score reduces your overall monthly allowance.
I prefer to optimize in a separate Google Doc, so as not to squeeze keywords in where they aren't needed and to ensure the article or post still reads well. Repeat the process, and once you feel you've done a good job, hopefully you've beaten your competition's score. If not, take a step back and see where your content can be improved.
Step 4: Fetch URL within Google Search Console
Log in to Google Search Console and fetch the revised URL.
(I'm not convinced this speeds things up, but it's a habit I keep every time.)
Results
Results can be manipulated, but these graphs illustrate that this type of content does not rely on seasonal wins or easy pickings. That alone should encourage you to test this for yourself, based on my findings.
The first piece of content below was devised for a SaaS company and has grown from zero to almost 7,000 clicks in a year.
All this with no backlinks or promotion: pure organic traffic gain.
Some keywords can take longer to develop (as below) but site traffic appears when you produce good content via the ‘Content Assistant’.
Here's another blog article, created recently and developing nicely below. Perfect for a revisit within a year.
Here’s a comparison (below) for much older content.
To reiterate, I'm not going for easy keyword choices. To my mind, there aren't many anyway, but the example above was recently optimized and shows positive progress in a short period.
Disclaimer
This is not a paid post; cognitiveSEO made no agreement with the author. This is Salesbloom's and Jim's success story, written and documented by Jim himself.
Please feel free to share your thoughts on this story with us.
About the author
Jim Miller is an e-commerce consultant, and founder of Salesbloom, who collaborates with world-renowned marketing gurus such as Neil Patel and Eric Siu on internal projects.
One could say that we’ve become addicted to constant and never-ending self-improvement. But there’s a more important and more meaningful reason behind a continuous quest for self-improvement. It’s the very reason we get into the tech business in the first place: we see something in the environment that can be made better and we think we can provide a tool for that.
And, as the environment changes, so do we. The key is in that very word: “better” – not “good”, but “better”, which comes with the promise of continuous improvement.
This is why, after creating the cognitiveSEO tool, we have never really stopped working on new ideas. Sure, there was plenty of tinkering with the existing tool, trying to give you the best analysis of off-page SEO issues, covering the quality of referring domains, your backlink profile, potentially broken links and so on. But a comprehensive SEO audit isn't complete without looking at the on-site elements related to site architecture: the website's pages, their loading time, the XML sitemap, meta tags and meta descriptions, content quality, etc. And this is how the cognitiveSEO Site Audit was born.
We've worked a lot (read: tremendously hard) to build this new Site Audit tool and give it all sorts of unique features.
What’s it all about?
In short, our brand new SEO website audit tool flags ALL possible on-page SEO issues a site might have and provides recommendations on how to fix them. Put simply, the new tool allows you to improve your website's on-page SEO performance at the highest level of detail.
So allow me to present just some of the things our built-with-sweat Site Audit can do for you:
The addition of the Site Audit feature makes cognitiveSEO the only SEO software you’ll ever need.
What does this mean in terms of wins for you? At cognitiveSEO you now have everything SEO related all together: backlink analysis, technical SEO Audits, content audits, keywords research & rank tracking, content optimization, Google Penalty prevention and recovery, and much much more.
1. Find All Possible SEO Issues of Any Website and Get Recommendations on How to Fix Them
The title might be self-explanatory, but you should know that it is also very accurate.
Our website audit gives you a much wider array of SEO items to look at and can analyse issues of all types that might prevent you from reaching your best possible ranking.
Of course, off-page issues are highly important to work on at times, but they don't cover all the important search engine ranking factors. That's where the on-site factors step in. To make sure you're not missing anything, you should be aware 24/7 of both the off-page and on-page factors that influence your website's performance.
The truth is that with ever-evolving search engine algorithms, you need an efficient solution to keep your rankings safe. And cognitiveSEO does exactly that: it lets you know about all the issues that might prevent your online business from getting the organic traffic and high rankings it deserves.
Even more, cognitiveSEO's technical SEO audit tool helps you detect all the weak points of your website before your users do, giving you an advantage in the crowded market we're all swimming in. Our SEO Audit Tool crawls all the pages it finds on your website, regardless of the site's size, and provides a fully customized set of data that is easy to comprehend and visualize.
Knowing the exact problems your site has is great, but that's only half the job. We take care of the other half as well, offering precise recommendations on how to fix each error on your site so you can outrank your competitors and quickly increase your overall website performance. So, all you have to do is crawl your website with our SEO auditing tool. Pretty neat, right?
cognitiveSEO's on-page analysis offers a set of unique features and numerous parameters that allow you to see what's under the hood of your entire website and your SEO campaigns. Below are just some of the features that our comprehensive audit checklist contains.
✅ Duplicate Content & Duplicate Meta Tags
✅ Offpage & Onpage Ranking Factors
✅ Broken Internal & External links and Landing Pages
✅ Website Speed & Loading Time Issues
✅ AMP & Mobile Friendly Analysis
✅ HTTP Status Code Implementation Issues
✅ Incorrect Canonical Tags
✅ Website Architecture Issues
✅ Indexability Audit Reports
✅ Hreflang & International SEO Reports
✅ Linking Structure Problems
✅ Meta Descriptions, Title Tags & Content Issues
✅ Anchor text issues
✅ Social Media Issues
✅ Image Attributes Problems
✅ Malware Threats
✅ Unsecured Content Issues
✅ Google PageSpeed Integration
✅ XML Sitemap Issues
✅ Structured Data Problems
2. Use cognitiveSEO’s Site Audit to Get Improved SEO Results and Conversions
At the end of the day (or of the year) there are a lot of metrics we look at to measure our website’s success. But there is a particular one that is on everybody’s lips and which actually pays the bills: the conversion rate.
We are not going to talk about incredible tips and tricks that deliver overnight magic for boosting your conversions, but about the tangible, actionable things that you can do to improve your sales and conversions.
Performing a complete website audit will give you a deeper understanding of why your site is not generating the organic traffic you think it should or why your sales and conversions are not improving.
Let's take, for example, the situation in the screenshot below. The analyzed website has pages that are really difficult to reach; there are even pages you can only reach after more than nine or ten clicks. In an era of instant data and solutions, this type of page can be a buzz killer... not to mention a conversion killer. One of those hard-to-reach pages could be an important one for you: one you would expect to generate conversions or other transactions.
Knowing what's holding your conversion rate down is the first step toward increasing it. Look for the website elements that could yield your biggest wins.
Once you run a complete website audit, you'll figure out how little effort you need to invest for potentially huge results.
3. Improve User Experience by Correcting Site Architecture Problems
An excellent site architecture is mandatory for both search engines and user experience. Why, you might ask?
If we think about the site architecture issues many websites encounter, we can understand how they hinder both search engine crawlers and the user's experience.
Here are just a few examples:
3+ clicks to reach any page – It's known that pages 3 to 7 clicks away from the homepage are harder to reach, while pages more than seven clicks away may never be reached by a visitor who lands on the homepage. You should check that no important pages sit too far from the homepage in terms of click path. This way, both your visitors and the search engines can easily get to those pages.
Internal anchor text bombing – Internal links are highly important; yet, if your landing page is not Wikipedia, having tons of internal links could do you more harm than good. Your best bet is to write internal anchor text for both visitors and engines.
The URLs are not search- or user-friendly – We conducted a study on this matter a while ago and found that the more concise and self-explanatory the URL, the greater its chance of ranking higher. Google loves shorter URLs, and so do your users. As long as the length falls between 50 and 60 characters, you're probably in a good spot. But if your URL looks something like http://www.example.com/product.aspx?ID=/c/YoZn1Z2v/46411526post=20908&action&IT=5f7d3d, you might want to reconsider your choices. Luckily, our Site Audit will let you know about any issues with your URLs.
Dynamic URLs – If it's even remotely possible, avoid them completely for your users' and the search engines' sake. Knowing which of your URLs are dynamic can save you a lot of trouble.
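The click-depth point above lends itself to a quick sketch: treat the site as a graph of internal links and run a breadth-first search from the homepage. The link graph below is entirely hypothetical; a real audit tool builds it by crawling.

```python
from collections import deque

# Hypothetical internal link graph: page -> pages it links to.
links = {
    "/": ["/about", "/products"],
    "/about": ["/team"],
    "/products": ["/products/shoes"],
    "/products/shoes": ["/products/shoes/red"],
    "/products/shoes/red": ["/products/shoes/red/size-9"],
}

def click_depth(links, root="/"):
    """Minimum number of clicks from the homepage to each reachable page."""
    depth = {root: 0}
    queue = deque([root])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depth:          # first visit = shortest path
                depth[target] = depth[page] + 1
                queue.append(target)
    return depth

depths = click_depth(links)
too_deep = [page for page, d in depths.items() if d > 3]
```

Pages flagged in `too_deep` are candidates for better internal linking, for instance from the homepage or a category hub.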
With gigabytes upon gigabytes of new information being created every second, search engines have taken on tremendous importance. Some have become so ever-present in our lives that we take their search results for granted, never wondering how they are produced or how silent, irregular algorithm updates might influence them. So understanding why something is relevant, while something else is not, has become a separate mode of innovation and improvement in itself. It's rank high in the search results or perish.
Our on-page SEO audit helps you diagnose all the SEO errors and problems with your site structure or website architecture so that you can deliver the best UX for your users. Our SEO checker also performs an in-depth competitor analysis, so you can check what your competitors are doing and how you can beat them at their own game.
4. Cutting-Edge Technical Yet User Friendly Site Audits for both Marketers and Professionals
How do you recognize a good site audit tool? There are many step-by-step guides on the subject, but one test is whether it fits the needs of both marketers looking to improve a website's overall performance and skilled technical SEOs who want to dig deeper into the analysis.
cognitiveSEO’s powerful Site Audit offers you the exact customized data that fits your needs.
SEO audits are for savvy, super-technical SEO gurus and for digital marketers and content marketing coordinators alike. While speaking everyone's language can be hard, even tricky, when it comes to SEO we need to understand that search engine optimization is a means, not an end. So regardless of your job title, as long as you are part of the "online business team" or on the "improving your site" side, you should take a sneak peek at a comprehensive website audit.
5. Why cognitiveSEO Is the Only SEO Tool You’ll Ever Need
By adding the new Site Audit on-page module, cognitiveSEO becomes the only tool you'll ever need: a complete toolset that serves all your SEO needs, both on- and off-page. In the ever-growing landscape of SEO tools, it can be very comforting to rely on a single provider for everything.
It’s not just comfort – relying on a single ecosystem means better integration, more efficiency and increased simplicity.
And if you don’t believe me yet, here is a short list of the features cognitiveSEO provides:
Technical SEO Website Audit
In-depth Backlink Analysis
Content Audit
SEO Visibility
Site Health Audit
Desktop, Mobile and Local Rank Tracking Tool
Keywords Research
Content Optimization
Social Visibility
Automated Digital Marketing Reports
Quick Backlink Checker
Shareable SEO Dashboard
Google Algorithm Changes
You should know that the idea of an all-in-one SEO analysis toolset started seven years ago with one man's passion for SEO and digital marketing, and his desire to find a cost-efficient solution that could meet all his professional requirements. Years have passed and cognitiveSEO has grown stronger each year, with lots of awesome users confirming the same need: an all-inclusive SEO software that would be reliable, accurate, affordable and would integrate everything an SEO pro, webmaster or digital marketer would need.
We are proud to be standing here today, knowing that we’ve created something that both ourselves and our customers can use to improve their business.
With no hidden tricks and no shortcuts, but by using a powerful SEO software and the greatest tool of all: our brain.
There are many attributes that an SEO must have in order to be successful, but one of the most important ones is being willing to improve all the time. Improvement doesn’t always come from making things right. In fact, the only way you’ll really improve is by failing, over and over again.
However, when it comes to SEO, a mistake might pass by unnoticed. You might not have any idea that you’re doing something wrong. Now, there are thousands of horrible mistakes that you could be making, like not adding keywords in titles or engaging in low quality link building.
But here are some more subtle, modern mistakes that SEOs might make these days.
In order to find out people’s most common modern SEO mistakes, I decided to ask a number of renowned SEO experts the following question:
Can you think of one major SEO mistake that is holding people's websites back today?
Some of them were kind enough to take some of their time and share the wisdom with us, so keep reading because there’s top quality information lying ahead in this list of 8 SEO mistakes to avoid in 2019 and later on!
Broken pages are a major issue for websites, especially when those pages have backlinks pointing to them. Ignoring broken pages can be a big mistake.
John Doherty, Founder & CEO at Credo, a portal for connecting digital marketing experts with businesses, knows this and marks it as one of the biggest mistakes people make, as well as one of his team’s top priorities when optimizing websites:
One major SEO mistake that I see holding back websites these days is not fixing their site’s broken pages that have backlinks pointing to them. I work with a lot of very large (100,000+ page) websites, and the first thing I do when we begin our engagement is look at their 404s/410s and which ones of those have inbound external links. We then map out the 1:1 redirects and redirect those. This always shows a good gain in organic search traffic, and then we build on that momentum from there.
Broken pages: First, we have the broken pages. Broken web pages are bad for the internet and, thus, bad for your website.
Why, you ask?
Well, to understand why, we first have to understand what a broken page actually is. A broken page is simply a page that doesn't exist. You see, it's not actually the broken page that matters, but the link that's pointing to it.
A page doesn’t really exist until another page links to it.
When Google crawls a website, it always starts from the root domain. It crawls https://www.yoursite.com and then looks for links.
Let’s assume that the first link the crawler finds is under the About Us anchor text and it links to https://www.yoursite.com/about-us/ but the page returns a 404 response code, because there’s no resource on the server at that address.
When Google's crawler finds 404 pages, it wastes time and resources, and it doesn't like that.
In theory, there are infinitely many potential 404 pages, since you could type anything after the root domain, but a 404 page doesn't really take form until some existing page links to it.
Now broken pages can occur due to two factors:
You delete a page that has been linked to (broken link due to broken page)
Someone misspells a URL (broken page due to broken link)
Both the linking website and the linked website containing the 404 suffer. If you have too many broken links on your site, Google will be unhappy because you're constantly wasting its resources.
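As a toy illustration of the broken-link check, assume we have crawled the site and collected every internal link: any target that isn't among the pages that actually exist is broken. (A real checker would issue HTTP requests and look for 404/410 responses; the URLs here are made up.)

```python
# Pages that actually resolve on the (hypothetical) site.
existing_pages = {"/", "/about-us/", "/blog/"}

# (source page, link target) pairs collected while crawling.
internal_links = [
    ("/", "/about-us/"),
    ("/", "/abuot-us/"),       # misspelled URL: broken page due to broken link
    ("/blog/", "/old-post/"),  # deleted page: broken link due to broken page
]

broken = [(src, dst) for src, dst in internal_links
          if dst not in existing_pages]
# Both failure modes from the list above show up in `broken`.
```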
Backlinks: The second point John makes is regarding the backlinks pointing to the broken pages. As previously mentioned, some may occur due to people misspelling a URL, which isn’t your fault.
However, if you have 10 websites that link to one of your pages and you delete that page because you think it’s no longer relevant, then you’re losing the equity that those 10 backlinks were providing. Bad for SEO!
The cognitiveSEO Site Explorer is great for finding out backlinks that point to broken pages on your website:
You can also have internal broken links, as well as external broken links pointing from your website to 404 pages on other sites, and you should fix those too! Soon, on December 12, 2018, cognitiveSEO will launch its OnPage module, which you'll be able to use to determine whether you have any broken internal links, so make sure you check it out! Here's a first quick preview of it.
Big websites: After that, we see that John mentions something about big websites. Why? Pretty simple. It’s easier to mess things up on a big website. On a small website, you might have one or two 404s but they will be easy to spot and very easy to fix.
They might have some backlinks each, but not much is lost. However, when you have hundreds of thousands of pages, that link equity scales up pretty quickly.
Redirects: Lastly, there are the redirects. John tells us that to fix the issue, he always sets up the proper redirects. By using a 301 redirect from the broken link/page to another relevant page, we can recover the link equity from the wasted backlinks.
For maximum effect, don't just redirect to the homepage or to some page you want to rank if it's not relevant. Instead, redirect to the most relevant page, and then use internal links with relevant anchor texts, surrounded by relevant content, to pass the equity on to more important pages.
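The 1:1 redirect mapping John describes can be sketched as a simple lookup table: each broken URL points at its most relevant live page, never blindly at the homepage. The paths below are hypothetical; in production this mapping would live in your server or CMS redirect configuration.

```python
# Hypothetical 1:1 redirect map: broken URL -> most relevant live page.
redirects = {
    "/old-winter-boots/": "/boots/winter/",
    "/sale-2017/": "/sale/",
}

def resolve(path):
    """Return (status, location): a 301 if a mapping exists, else pass through."""
    if path in redirects:
        return 301, redirects[path]
    return None, path
```

The point of the 1:1 structure is that every broken URL gets its own closest match, which is what preserves topical relevance alongside link equity.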
This is the type of SEO fix that might bring invisible results ‘overnight’. Thanks, John, for this wonderful input!
2. You Publish Too Many Poor Quality Pages
Another issue that is generally related to bigger websites is the ‘thin content’ issue.
One very common mistake sites make is that they publish way too many poor quality pages on their site. As a result, Google sees the site has a lot of “thin content” and lowers the site’s rankings across the board.
Although it's not always the case, when you have a small website it's pretty easy to come up with decent pages. When you have a site with thousands of pages, however, the effort required to maintain quality content on all of them is much bigger.
Thin content pages are pages that add no value to what's already on the web. Google doesn't really have a reason to index them, so you'll either end up in the omitted results or get this message in your Search Console:
The message above is a manual action, which means that you’ll have to submit a review request and someone hired by Google will actually take a look at your website to determine if you’ve fixed the issue. This might take a long time, so be careful! However, it’s possible that the algorithm ‘penalizes’ your website without any warnings, by simply not ranking it.
Matt Cutts, the former Head of Spam at Google puts it like this:
The thing is, fixing thin content doesn't always mean adding more text. Why am I saying this? Well, Matt Cutts mentions doorway pages as an example but doesn't really give an alternative to them. What if you do have a website that offers the same service in 1,000 cities? Should you write 'unique' content for each page?
The truth is there's no middle ground with doorway pages: they either have thin content or unique, qualitative content. However, there's much more to ranking than content. So if you run a car rental service, it's not necessary to add filler content to every page, but you do have to do other things well: structure the site carefully, make sure your design is user-oriented, and maybe consider a blog to add relevant content to your website.
However, although it takes a lot of time to add original content to every page, it might be worth a shot if you really want to stand out. Google has absolutely no reason to include you in the search results if the information you're providing is already out there in exactly the same form.
Again, the cognitiveSEO OnPage Tool, launching on December 12, 2018, can help you identify pages with thin content.
Thin content is a very big issue these days, especially for bigger eCommerce websites, and should be treated as such. Thanks, Eric, for this great addition to our list of mistakes!
3. You Have Duplicate Content Issues
Very closely related to thin content pages are duplicate pages, which are even worse. Andy Drinkwater (I know, that’s his real name! Pretty cool, right?) from iQ SEO knows this very well:
Probably the one that is the most prevalent through the audits that I conduct, is the duplication of pages. This tends to often be done because site owners might have read (or believe) that multiple pages targeting the same or similar phrases, is a good thing. Page duplication/keyword cannibalisation leads to Google seeing a site that is trying too hard with its SEO and is often heavily over-optimized (not something that they want site owners doing).
Duplicate pages can occur due to many factors. For example, one client of mine had a badly implemented translation plugin, which created duplicates of the main language for the pages that did not have any translations. Bad for SEO.
I have to admit, however, that I'm kind of surprised anyone would do this to themselves voluntarily. Although... I did have one client who asked me why his competitor ranks with two pages for the same keyword. Who knows, maybe he duplicated his own pages?
Either way, duplicate content is bad. Not only are you providing nothing of value, which lands you in Google's omitted results, but you're also wasting Google's resources, and that might eventually affect your entire website.
I swear I did not cherry-pick these answers, but again, Andy's input fits perfectly with our soon-to-be-released OnPage module. You'll be able to use the tool to easily identify which pages are 100% exact copies of others, and even which pages merely have similar content.
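Exact-duplicate detection of the kind described above can be sketched by hashing each page's normalized body text: pages sharing a fingerprint are 100% copies. (Near-duplicates need fuzzier techniques such as shingling; the page contents here are invented.)

```python
import hashlib

# Hypothetical page bodies.
pages = {
    "/red-shoes/": "Red shoes.  Free delivery.",
    "/shoes-red/": "Red shoes. Free delivery.",
    "/blue-shoes/": "Blue shoes. Free delivery.",
}

def fingerprint(text):
    """Hash whitespace-normalized, lowercased text."""
    normalized = " ".join(text.lower().split())
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()

groups = {}
for url, body in pages.items():
    groups.setdefault(fingerprint(body), []).append(url)

duplicates = [urls for urls in groups.values() if len(urls) > 1]
```

Normalizing whitespace and case first means trivial formatting differences don't hide a duplicate.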
Duplicate content is definitely an issue affecting many websites. Thanks, Andy, for sharing this with us!
4. You Target out of Reach Keywords
It's good to have big dreams, but sometimes dreams that are too big can overwhelm and demoralize you. If you want to be able to lift 250 pounds, you first have to be able to lift 50.
Andy Crestodina from Orbit Media tells us that people should set realistic keyword targets:
By far, the most common SEO mistake is to target phrases that are beyond reach. Even now, in 2019, a lot of marketers don't understand competition. They target key phrases even when they have no chance of ranking. There are a lot of mistakes you can make in SEO and a million reasons why a page doesn't rank. But this is the big one.
Oftentimes, competition is hard to explain to clients because authority and PageRank are frequently misunderstood or far too complex subjects. Thus, Andy has come up with a little system to explain more easily how people should target their keywords.
Because it’s difficult to explain links and authority, I’ve started using a short-hand way to validate possible target phrases:
If you have a newer, smaller or non-famous website, target five-word phrases.
If you’re relevant in your niche, but not a well-known brand, target four-word phrases.
If you’re a serious player with a popular site, go ahead and target those three-word phrases.
Here’s a chart that helps make that recommendation…
I feel the need to point out that although there's a correlation between lower search volume and higher word count, that's not always the case. Many of you probably consider long tail keywords to be keywords with more words in the phrase, but the term 'long tail' actually comes from the search demand graph:
So, theoretically, you can find high-volume keywords with a high word count, and low-volume keywords with a low word count. Some of these two- or three-word phrases might even have very low competition.
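Andy's rule of thumb is easy to apply programmatically: bucket candidate keywords by word count and start where your site's authority says you should. The keywords and search volumes below are invented for illustration:

```python
# Hypothetical keyword candidates: (phrase, monthly search volume).
keywords = [
    ("seo", 90500),
    ("seo agency london", 1300),
    ("affordable seo agency london prices", 40),
]

def phrases_with_at_least(keywords, n_words):
    """Keep keywords whose phrase contains at least n_words words."""
    return [(kw, vol) for kw, vol in keywords if len(kw.split()) >= n_words]

# A newer, smaller site might start with five-word phrases:
targets = phrases_with_at_least(keywords, 5)
```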
More on the true meaning of long tail keywords can be read here.
However, as Andy stated above, it’s pretty difficult to pin down what ‘low competition’ really means in SEO. Since there is a correlation between low search volume and high word count, chances are you won’t go wrong following his rule of thumb.
When you’re first starting out, it’s always better to start targeting lower competition keywords and build your way up. Thanks, Andy, for the input and also for the very useful chart!
5. You Ignore the Organic Search Traffic You Already Have
Since we’ve just talked about what keywords you want to rank for, why not talk a little about keywords you’re already ranking for? Cyrus Shepard, ex-Mozzer and founder of Zyppy, knows the value of ranking keywords data very well:
I regularly see websites make the mistake of only optimizing for the traffic they want, and ignoring the traffic they have. Lots of sites perform an “SEO optimization” when they create content: choosing keywords, writing titles, structuring headlines, etc. Sadly, a lot of folks stop there. After you publish and receive a few months of traffic, Google freely gives you a ton of data on how your content fits into the larger search ecosystem. This includes the exact queries people use to find you, which is also an indication of what Google thinks you deserve to rank for. By performing a “secondary optimization” around this real-world data, you take guessing out of the equation and take advantage of more targeted opportunities, hopefully leading to more traffic.
You know… you probably have no idea what keywords you’re actually ranking for. You’re probably thinking “I already rank for them, why should I care?”
Well… the truth is that you might get some traffic from keywords you already rank for, but not all of it. If you’re getting 5 clicks a month from a keyword that has 100 monthly searches, you’re probably not in 1st position, because the average CTR for the 1st position is around 30%.
Either you’re in position 5 or below, or you have a really bad CTR and won’t last long in the top spots. If you’re in position 5 or lower, there’s still room for improvement.
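To make the arithmetic concrete, here’s a minimal Python sketch that estimates a keyword’s likely position bracket from its click-through rate. The average-CTR-by-position table is an assumption for illustration only; real figures vary by study, industry and query type.

```python
# Illustrative average CTR per organic position (assumed values,
# not taken from any specific study).
AVG_CTR_BY_POSITION = {1: 0.30, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.05}

def estimated_position(monthly_clicks, monthly_searches):
    """Return the best position whose average CTR the keyword's CTR meets."""
    ctr = monthly_clicks / monthly_searches
    for position in sorted(AVG_CTR_BY_POSITION):
        if ctr >= AVG_CTR_BY_POSITION[position]:
            return position
    return 6  # below the top 5: plenty of room for improvement

# 5 clicks out of 100 monthly searches is a 5% CTR -- around position 5
print(estimated_position(5, 100))  # -> 5
```

In practice you would feed this with the clicks and impressions that Search Console reports for each query.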
You can monitor the keywords you’re already ranking for in your website analytics (for instance, by adding Google Analytics to your site through Google Tag Manager). There’s a good amount of info in Search Console as well.
You can also use the CognitiveSEO Content Optimization Tool to easily identify terms which you should add to your content to make it more relevant for specific keywords:
Cyrus talks more about this here. Make sure you give it a good read! Cyrus, thanks so much for sharing your wisdom with us!
6. You’re Using Unconfirmed SEO Theories
If your SEO moves rely on unconfirmed theories, then it’s quite likely you won’t be ranking high any time soon.
Josh Bachynski, science freak and renowned Google stalker, is very fond of this. He recommends that people should take a more scientific approach when it comes to search engine ranking factors because, in the end, it’s an algorithm:
The biggest mistake people are making in SEO these days is not using scientific methods to determine their theories about ranking factors and instead relying on wild guesses, which of course are not as good, will eventually be completely off, and will leave them unable to rank with no idea why.
The point Josh is trying to make is that you should always test your SEO methods before you actually implement them and you should test them the right way.
One good example we can give here is the content pruning technique. We tried it and it apparently worked. Our organic search rankings started to rise. However, the test was not isolated, so it wasn’t really scientific. Because the change happened over a longer period of time, we don’t really know whether the organic search traffic increase was due to the content pruning or to other factors.
This can be said about other tests, as well. Most of the time people approach search engine optimization from multiple angles at the same time and it really is difficult to attribute growth to only one factor. You can read more about the content pruning technique here.
He also tells you not to trust Google too much. I tend to agree. Here’s one example: “social media isn’t a ranking factor”. Take that at face value and you will completely ignore building a social media marketing strategy. However, if you also take into account that social media can bring your website some backlinks, then you might reconsider. Sure, blindly posting on Facebook daily while getting 0 likes is a waste of time, but get some engagement going and you’ll definitely see how it can boost you up.
You can check Josh’s YouTube channel for more on his scientific approach to things and how he determines his ranking factors. It’s also kind of funny to watch how he keeps poking Google officials with a stick.
7. You Over-Complicate Things
On a completely different note, many people over-complicate things when it comes to search engine optimization. This is especially true for beginners or startups that want high organic search rankings in record time.
Although what Josh says is true, that you should test your theories and choose your techniques wisely, most of the time it’s just better to stick to the basics.
Kevin Gibbons, CEO at Re:Signal, knows this very well and points it out in his answer:
The biggest mistake I find people making in SEO is that they over-complicate things. SEO can be very simple. Start with what’s the goal and find the easiest way to hit your target. If you start with identifying the real opportunity, you won’t go far wrong – but quite often people are doing what they think is best, without understanding what question they are answering. e.g. we just need more links… that might be correct, but first identify why – as it might not be the real problem to solve.
Kevin makes a really good point. I’ve recently had a client who told me about her previous local SEO expert and his work. I was shocked! Over 100 backlinks built in the past 6 months (that’s a lot for the Romanian market), while the title of the homepage was still only the brand name, written in capital letters. Literally, no page on the website was targeting any keywords with the title! Unforgivable!
Thanks, Kevin, for the answer and for the awesome presentation you gave at the 2018 WeContent Content Marketers’ Conference in Bucharest, Romania.
On the same note we have Aleyda Solis, international SEO speaker and consultant with astonishing results:
From SEOs: Overlooking the SEO pillars while trying to chase specific algorithm updates. I see many putting so much effort trying to identify the factors behind some of the latest updates that might have affected them while overlooking that at the end is about addressing the principles and fundamentals that will help them grow in the long-run: From relevance, organization and format of information to better addressing the targeted queries to user satisfaction, … It’s normal to have the need to keep updated and trying to identify potential causes that could potentially affect our SEO processes, but at the same time we should avoid getting obsessed with them and falling into the “can’t see the forest for the trees” situation.
SEO is an ever-changing domain and you have to constantly keep an eye out for major updates such as the mobile-first index update. However, such updates respond to a real issue: most websites aren’t mobile friendly, while most users now search from mobile devices.
But constantly running after the next update, shifting your strategy 180 degrees or abandoning the essentials to pursue rather uncertain tactics won’t do any good.
Although it’s good to stay up to date with things (and you should), it would be a better idea if the foundation of your strategy were based on the basics of SEO, the long-lasting ones.
Aleyda was kind enough to give us two answers so keep reading:
8. You’re Not Starting with SEO in Mind Early On
Big words of wisdom here from Aleyda. I kept this one for the end because it’s so true and many SEO specialists will resonate with it. Most of the time, it’s not even the SEOs that make the biggest SEO mistakes, but the clients themselves:
From other specialists (developers, designers, copywriters) as well as business owners: Implementing certain actions thinking that SEO “can wait” to be included later on in the process, without taking into consideration that doing so might mean having to “re-do” the whole project, sometimes just because there wasn’t a timely validation in the first place that could have saved so much in the long run.
SEO can often be postponed because it seems like other things are more important. Many people build their website with “I’ll start SEO later” in mind, just to wake up to the reality that their website is built completely wrong and major changes are required in order to make it SEO friendly.
Business owners would rather hire PPC experts than content marketers. They only ask an SEO consultant for their opinion after most of the development for the website has been done.
For example, this study found that ‘load more’ buttons convert better than pagination when it comes to eCommerce. Read it and you might immediately want to switch to ‘load more’ buttons or build your website that way from the beginning. However, you might completely ignore the fact that you’ll basically remove hundreds of pages from your site and hide their content behind JavaScript that there’s no guarantee Google will ever render. Good for conversion, but bad for SEO.
If you want to see good results with SEO and also minimize costs, you should start with SEO as soon as you start developing your website. Otherwise, things will only be harder and they will take more time.
Thanks, Aleyda, for sharing your experience with us!
Conclusion
If you’re an SEO or a digital marketing professional and you’re making any of these mistakes, take action immediately to fix them, as they affect your websites dramatically! Hopefully, this list of common SEO mistakes to avoid in 2019 and onward will be useful to you and your team.
What SEO mistakes have you made so far? Share them with us in the comments section so that we can all learn from them!
Whenever we talk about SEO and the idea of optimization, some of the first things that come to mind are content, backlinks or keywords. This is not wrong, but it doesn’t exhaust the possibilities of SEO either, especially since plenty of the factors that weigh in a page’s ranking are not purely backlink- or content-driven. So it should not have come as too much of a surprise when Google announced that page load speed is a ranking factor for both desktop and mobile searches. Of course, one factor out of more than 200, but one that is still worth considering.
Site speed, according to Google, “reflects how quickly a website responds to web requests” and for all intents and purposes serves both the user and the site owner. Users get an enhanced experience, site owners are more likely to get reduced operation costs and… well, more users, who stay on the website longer.
Google thinks the same, as indicated by the very fact that it provides us with tools to check up on our site speed and suggests solutions for sluggish pages. There is the PageSpeed module in the Google Developers section and PageSpeed Insights for auditing the speed of any website.
Even Yahoo has a developer module for page speed improvement, called YSlow. The slew of freely available tools is certainly meant to allay the concern that only big sites can benefit from this change. In fact, chances are smaller sites have the upper hand here in terms of flexibility and speed of adjustment. Some questions and concerns remain though, the most important being:
What is “site speed,” and how exactly does Google measure it?
One distinction worth making is the difference between “document complete” and “fully rendered” time when loading a page. The former means that while not all the elements on a page are showing, you can start clicking and navigating. The latter means that everything is clearly visible on the page, even background stuff like ads and images. Even though it is difficult to pinpoint specifically and with certainty which aspect of speed directly helps your rankings, working on your site loading speed, in general, is bound, at the very least, to enhance your users’ experience. So, for instance, having a “document complete” time which is faster than the “fully rendered” time could help.
2. Reducing the Image File Size Using the Lossy and Lossless Compression Techniques
Images can play a major role in the difference between “document complete” and “fully rendered site” loading times, especially for mobile sites.
One solution is to simply delay loading images. Let everything else load first and then only at the end start loading images. This ensures that your site is functional long before you get to the bandwidth-chugging portions of the page. As with everything else nowadays, there are lots of plugins that can help you. On WordPress, you can try Advanced Lazy Load or Smush. But if you need to do that, maybe the problem is simply that your images are too big and they slow down your site speed anyway. So why not compress them to begin with?
We used two techniques for image compression for our case study:
lossless compression, which means the image quality remains largely the same.
lossy compression, which means some loss of fidelity may occur, though in general, it may be imperceptible.
Brian Jackson, Director of Inbound Marketing of Kinsta, explained each compression method:
Lossy image compression refers to compression in which some of the data from the original file (JPEG) is lost. The process is irreversible, once you convert to lossy, you can’t go back. And the more you compress it, the more degradation occurs.
Lossless image compression refers to compression in which the image is reduced without any quality loss. Usually, this is done by removing unnecessary metadata from JPEG and PNG files. RAW, BMP, GIF, and PNG are all lossless image formats.
You can see below a representation of files reduced using these two techniques.
To be more exact, the correct term is perceptually lossless: a human can’t tell the difference between a JPEG-optimized version and the original photo. JPEG is a lossy format; PNG is lossless.
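The difference can be demonstrated in a few lines of Python with the standard library’s zlib module — a format-agnostic sketch of the two ideas, not an actual JPEG or PNG codec:

```python
import zlib

# Lossless: compressing and then decompressing returns the original
# bytes exactly -- no information is lost.
original = b"pixel data " * 500
packed = zlib.compress(original)
assert zlib.decompress(packed) == original

# "Lossy", simulated: drop the low 4 bits of every byte before
# compressing. The quantized data still resembles the original, but the
# discarded detail is gone for good -- the process is irreversible,
# which is exactly what happens (far more cleverly) in a JPEG encoder.
quantized = bytes(b & 0xF0 for b in original)
assert quantized != original
```

The quantized data usually compresses better precisely because detail has been thrown away; the trade-off is that no decompressor can bring it back.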
3. Case Study – Image Compression Tools Tested to the Limit
We went out and researched some of the best image compression tools out there which use the lossless and lossy techniques for image compression. After we identified them, we made a list of the most relevant tools that would be up to the job:
Needless to say, this analysis is unbiased: none of the mentioned tools offered to take us out for a beer or promised any goods in exchange. So we are going to test those 4+1 tools on the same artifact and see if we can notice any difference in loading speed results.
To make the analysis more encompassing we added a 5th element to the comparison (a 5th dimension, if you will), which is slightly different from the other 4: Google’s PageSpeed. Unlike the other image compression tools we have included in our research, Pagespeed isn’t really a compression tool as much as an optimization tool that works holistically through a series of custom filters which are executed when the HTTP server serves the website assets.
What gets it a spot in our test is the fact that it “dynamically optimizes images by removing unused meta-data from each file, resizing the images to specified dimensions, and re-encoding images to be served in the most efficient format available to the user.” Sounds pretty nifty, right? We’ll see if the boast stands at the end of the day.
3.1 How the Image Optimization Test Was Performed
The test had to be the same for all and give these tools a run for their money, so we decided on optimizing our blog’s entire image gallery, which came to 3,824 images. This was done on a test version of the site, built for testing purposes only, which never went live. Most of the services we used to compress the images have their own API and some example scripts, but for our test we built our own scripts which send a request for each image.
Some services have a “batch upload” feature, but for consistency, we’ve sent a single image per request, since we cannot do a “batch upload” on all servers. We don’t want to load you down with too many technical details. Yet, a few comments on the process are in order. Where an API was not available, the images were processed with the service provided by that particular software.
The first comment is that JPEGMini was the only downloadable piece of software, the rest being hosted online. The second is that all of the tools have APIs, except for JPEGMini. Last but not least, for Pagespeed we used the Nginx module.
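For the curious, the per-image loop looked roughly like the sketch below. The `send_one` upload function is a hypothetical placeholder — each real service (Kraken, TinyPNG, etc.) has its own API and authentication.

```python
from pathlib import Path

IMAGE_EXTENSIONS = {".jpg", ".jpeg", ".png", ".gif"}

def collect_images(root):
    """Return every image file under `root`, sorted for reproducibility."""
    return sorted(p for p in Path(root).rglob("*")
                  if p.suffix.lower() in IMAGE_EXTENSIONS)

def compress_all(root, send_one):
    """Submit one image per request, for consistency across services.

    `send_one` is the service-specific upload function (hypothetical);
    it receives a single image path and returns that service's response.
    """
    return [send_one(image) for image in collect_images(root)]
```

Sending one image per request is slower than batch uploads, but it keeps the timing comparison fair across services that don’t support batching.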
3.2 Top 6 Metrics Tested for Finding the Best Image Compression Tool
We measured several metrics, which we could split into two categories:
Image compression related metrics.
Page load speed related metrics.
For the first category, we only had results for the 4 image compression tools, since Pagespeed is a bit of a different type of tool, as discussed earlier. It is worth noting that not all tools had the same workload:
TinyPNG compressed both PNG and JPEG format files.
JPEGMini only compressed the JPEG format files.
Kraken compresses both PNG and JPEG format files.
PunyPNG compresses PNG, JPEG and GIF format files.
The metrics in both categories, however, were calculated relative to the actual work and not to absolute standards. And here they are:
Metric 1 – Conversion Time
We ran tests for each tool to see which one of them manages to convert images faster, and our winner was JPEGMini.
Let’s start with the conversion time, measured in minutes. Since the number of conversions wasn’t the same for all tools (in fact, it was slightly different in each case), it makes more sense to look at the number of conversions per minute. The clear leader is JPEGMini, with roughly 113 conversions per minute, or close to 2 conversions per second. The other 3 tools trail significantly behind, with little difference between them by comparison. TinyPNG and Kraken manage 24 and 19 conversions per minute, respectively, while PunyPNG is utterly unimpressive at merely 11 conversions per minute.
Metric 2 – Compression Power
Next, let’s look at compression power. This, again, is a metric resulting from the relation between two different measurements: the compressed size of the output relative to the original size of the input. The winner, by a wide margin, is TinyPNG, with a 2.78 compression ratio; Kraken (1.19) and JPEGMini (1.3) sit at less than half that rate, and PunyPNG (1.53) at just over half. So far the format-specific tools seem to fare much better than the general-purpose ones.
Metric 3 – Pages with Mainly PNG Images
And now for the real test: does any of this impact the time to load the page? For PNG-heavy pages, the time after compression is best improved with TinyPNG (5.29 seconds) or Pagespeed (5.44 seconds). The difference between the two is by and large negligible. Kraken and PunyPNG, however, are significantly slower (with basically no difference between the two: 7.47 and 7.45 seconds, respectively).
The latter loading times are more than double what they would be if the web pages used no images, and closer to the load speed with no compression at all (8.56 seconds). The sizes of images loaded on the web pages are, as expected, in line with the loading times. The average size is 8 MB for Pagespeed and 8.2 MB for TinyPNG, practically half the size of images obtained from Kraken (15.9 MB) or PunyPNG (15.7 MB). For comparison, the average size of the PNG images on a loaded page is 19.5 MB, not much bigger than the compressed results for Kraken or PunyPNG.
Metric 4 – Pages with Mainly JPEG Images
What about JPEG image format? It’s another win for Pagespeed and the format-specific tool. This time, however, the result variation is nowhere near as dramatic as in the case of the PNG files. In fact, the times to load the page are really close:
3.24 seconds for Pagespeed
3.45 seconds for JPEGMini
3.53 seconds for Kraken
3.55 seconds for PunyPNG
3.77 seconds for the Original Images
3.03 seconds with No Images
Hardly noticeable, isn’t it? Add to that the fact that all of these values are still not that much different from the page load time when there are no images on the page (3.03 seconds) or even when all the images on the page are at their original size (3.77 seconds). As you would expect, the difference in the average size of images loaded on page is similarly unremarkable.
Pagespeed gets the best compression, at 3.4 MB, with JPEGMini following closely at 3.9 MB. Kraken and PunyPNG are both above the 4 MB mark, with 4.3 MB and 4.4 MB respectively. While the first two values are significantly different than the average size of the page with uncompressed JPEG images on it (4.9 MB), they are nowhere near the size of the page with no JPEGs on it (1.2 MB).
Let’s do a quick recap on the JPEG and PNG compression.
Best JPEG Compression (Best is First)
PageSpeed
JPEGMini
Kraken
PunyPNG
Best PNG compression (Best is First)
PageSpeed
TinyPNG
PunyPNG
Kraken
The best JPEG compression is done by PageSpeed, then JPEGMini, followed by Kraken and PunyPNG. When it comes to PNG files, Pagespeed leads the way, tied with TinyPNG, followed by PunyPNG and Kraken. Barring Pagespeed, the better tools seemed to be the ones that are geared towards specific image formats (JPEGMini and TinyPNG), with the all-around tools doing more poorly in terms of results.
Metric 5 – Bulk Conversion Speed
Total Conversion Times by Tool
JPEGmini – 20 minutes / 2275 images converted
TinyPNG – 60 minutes / 1460 images converted
Kraken – 197 minutes / 3824 images converted
PunyPNG – 328 minutes / 3722 images converted
Pagespeed – On the Fly Conversion / no initial conversion time
Images Converted per Minute by Tool
JPEGmini – 113 images/minute
TinyPNG – 24 images/minute
Kraken – 19 images/minute
PunyPNG – 11 images/minute
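The per-minute figures above follow directly from the totals in the previous list — images converted divided by total conversion time:

```python
# Derive images-per-minute from the totals reported above
# (integer division, matching the rounded figures in the list).
totals = {  # tool: (images converted, total minutes)
    "JPEGmini": (2275, 20),
    "TinyPNG": (1460, 60),
    "Kraken": (3824, 197),
    "PunyPNG": (3722, 328),
}
per_minute = {tool: images // minutes
              for tool, (images, minutes) in totals.items()}
print(per_minute)
# {'JPEGmini': 113, 'TinyPNG': 24, 'Kraken': 19, 'PunyPNG': 11}
```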
Pagespeed – On the Fly Conversion
In terms of bulk conversion speed there are some aspects that need to be mentioned. First of all, PageSpeed executes the conversion instantly, on the fly, and JPEGMini is downloadable software. So, there are some technical differences between these two tools and the others, which might make the comparison between them a bit harder. At the same time, the way JPEGMini and PageSpeed function in this regard is their competitive advantage, so we need to give credit to PageSpeed and JPEGMini for being the fastest.
PageSpeed seems to be the best, as it makes the conversion instantaneously, and JPEGMini is the next best thing, but it is limited to JPEG files. TinyPNG and Kraken are close in terms of speed (neither very impressive), with the former geared towards PNG files. PunyPNG, while good for all image file formats, has the least impressive bulk compression speed of all five tools.
The conversion speed indicator is important to take into account when deciding on a tool if you want to use it on web pages that rely extensively on images. If your website’s total count of images is comparable to or higher than ours, it might be worth going for PageSpeed or JPEGMini, as they have the best results in terms of total conversion time and images converted per minute.
Metric 6 – Ease of the Conversion Process
Figures and numbers are highly important, but we all know it’s not all about them. Software must provide great results, but it must also be easy to use, intuitive and simple to understand. After all, a product’s success is decided by the end user. For instance, PageSpeed is a great product and the clear leader in many categories, but it is definitely not the most user-friendly one.
If you are not computer literate and don’t know how to install and configure the required Apache or Nginx modules, then no matter how good the results are, PageSpeed might not be for you. In addition, you will need the rights to alter web server configurations in order to install the PageSpeed module (it won’t work on shared hosting). Unless you turn to a tech expert, using this tool will make you feel inadequate and will, paradoxically, make things harder.
JPEGMini, PunyPNG, Kraken and TinyPNG, on the other hand, have easy-to-use interfaces and don’t require advanced technical skills. In terms of ease of use, these tools are simple, helpful and really meet the user’s needs.
Deciding on the Winner
Deciding on a winner is hard, yet necessary in order to conclude our in-depth inquiry. Before doing so, let’s take a look at the tools’ overall presentation and take a sneak peek at what they have to offer.
Taking all the results into account, the distinction of best all-around image compression tool goes to PageSpeed: for both JPEG and PNG files, it is either tied for the lead or leading outright in terms of both file size and page load time, and none of the other four tools manages to do so consistently (it shares the distinction with two separate tools: TinyPNG for PNG files and JPEGMini for JPEG files).
So how much does PageSpeed actually help you? Well, the best way to measure this is by looking at the time compression saves on each load of the tested page. For PNG, the load time drops from 8.56 seconds (with uncompressed files) to 5.44 seconds, which means the page loads 36.45% faster. For JPEG files, the loading time drops from 3.77 seconds (for uncompressed images) to 3.24 seconds, meaning pages load 14.06% faster. Less impressive than for PNGs, but still noteworthy.
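The percentages quoted above are simply the relative difference between the two load times:

```python
def speedup_percent(before, after):
    """Relative improvement of `after` over `before`, as a percentage."""
    return round((before - after) / before * 100, 2)

print(speedup_percent(8.56, 5.44))  # PNG pages  -> 36.45
print(speedup_percent(3.77, 3.24))  # JPEG pages -> 14.06
```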
Think of it this way: every time a user lands on one of your pages, their experience is up to 36.45% faster. After all, the web surfer’s life is improved with every second spent looking at relevant information rather than at a blinking hourglass. The one thing to keep in mind with Pagespeed is that there is some loss of image quality with its compression, but it is so small that it shouldn’t be a serious drawback for most websites (i.e., unless high image quality is what you’re actually going for, in which case maybe image compression is not really the best way to optimize speed on your website).
Below is the full comparison table with all the metrics.
And the Winner is …
Looking back at all the graphs and figures, it is not hard to say that PageSpeed stands out from the rest of the tools. In a world where the user doesn’t have the patience to wait more than 10 seconds for a page to load, PageSpeed can be seen as one of the webmaster’s best friends. PageSpeed Modules are open-source server modules that optimize your site automatically. It is definitely not the average webmaster’s tool, though, as it requires solid technical skills to use.
All in all, our research suggests that PageSpeed adds the most value to a website optimization process overall, and it may well be the best option when it comes to a page’s loading time.
Conclusion
A site’s speed clearly plays an important role in user experience and in the Google algorithm. People in the SEO industry are making a big investment in content, and that is great. Yet, what’s the point of all the struggle if the page takes too long to load? Knowing the tools that can help you optimize your site’s speed and make your life as a webmaster a bit better might really come in handy. To summarize the results of our research, we can say that PageSpeed seems like a very helpful tool.
PageSpeed is a free-to-use tool that works on Apache and Nginx servers and allows you to make all of your changes on the fly. It also has a cache, so it won’t put too much of a burden on the server. If you do not have access to the server configuration, we would recommend using JPEGMini for JPEG compression and TinyPNG for PNGs. You could also settle on a single image format, which would simplify things and let you use the most efficient tool for it.
And don’t forget the Lazy Loading technique ;). All these together will make your site “lighter” and “faster”!
Note: This blog post is an improved version of an older one we had.