Competitor outranking us despite all SEO metrics in our favour
-
Hi,
We are trying to outrank a competitor of ours on Google for around 100 terms that we are both clearly targeting. Our competitor currently 'wins' in the SERPs around 70% of the time, often ranking in the top 3 positions.
The problem is, we are absolutely convinced that all onsite and offsite SEO metrics we are monitoring would suggest that our content should almost always place higher than theirs.
Here is what I know:
- All pages on both websites are rated 'A' by the Moz on-page grader tool for the phrases we are targeting, with virtually no technical issues to be addressed on either site
- Our domain authority is 10 points higher than theirs
- Neither of us has external links to the relevant pages on our sites (although we are working on some for ours)
- Our Page Authority is considerably higher on average - often by 10 points or more
- We have considerably more linking root domains, from better sites, and many more total external links overall
- The HTML title and H1 headings of our pages contain the target phrases at the beginning, as do our competitor's. However, they often include ONLY the target phrases in their titles, whereas in our titles the phrases might take up only a third of the total characters
- We have images optimised for the target phrases on our pages. As do they.
- We use the target phrases roughly the same number of times in our copy text, as do they
I have now totally run out of ideas for further optimising our site/pages to consistently rank better for these target phrases, although here are a couple of factors that could be having an impact on the rankings:
- The structure of their website is perhaps optimised more for these target phrases. Our site is much bigger, so perhaps our 100 pages are given less relevance within our site than their 100 pages within theirs? But surely our stronger page authority would suggest otherwise?
- Perhaps Google is using page engagement statistics to determine that their site is 'better' than ours in terms of user appeal and engagement?
Can anyone think of something that I might have missed? Is there another major ranking factor I have perhaps neglected in my research?
I know link building strategies are a good way to approach this in the long run, but right now we are just concerned about why we are not already ranking better when they are clearly not undertaking any link building strategies of their own.
Any help or pointers here would be enormously appreciated.
Thanks
Lou
-
Thanks again to both of you for the pointers.
I just need to get stuck into this now, I think. Even delving into just a couple of the points you both raised has thrown up several potentially very important areas for me to look to improve.
As a starting point, I am going to stop obsessing over those headline SEO metrics that seem to have taken over my life.
Lou
-
EGOL,
Thank you for emphasizing the quality (helpfulness/human value). I only briefly mentioned it in my response, yet it really does need to be a top priority.
-
Lou,
"I just wanted to throw a few factors out there in order to encourage a response like yours - packed full of useful next steps for me to evaluate this further."
THAT is priceless
Pagination:
Loading all content on one page and using a "more" button to "reveal" it is not a best practice. Individual pages need to exist for individual sub-topic content. This is especially true since it now appears that Google, while indexing content initially hidden from users, likely gives less value to that hidden content than to content that is immediately visible.
Pagination is important IF it is executed properly. If you have tens of thousands of results in paginated lists, is that one paginated group, or are they split out into separate groups based on similarity of content? If it's all just one massive group, that's likely another problem to look into, since pagination is meant to be used to say "these pages all contain links to other content where the entire group comprises very similar content around one primary topic".
Internal linking should always point more to main category page destinations than individual pieces of content. It would be unnatural from a usability perspective to link more to individual pieces of content, and thus it would be bad for SEO.
5,000 or so average crawl errors - what is causing those? Are they 404s? Were they previously valid pages? If so, those typically should not return a 404 but should instead 301 directly to a highly relevant live page (with internal links within the site updated accordingly).
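To triage crawl errors like these at scale, one rough approach is to pull the 404s a crawler is actually hitting out of your server logs and rank them by frequency, so the highest-impact redirects get set up first. A minimal sketch (assuming Apache/nginx combined-format access logs and a "Googlebot" user-agent substring — both are assumptions to adjust for your own setup):

```python
import re
from collections import Counter

# Matches the request path and status code in a combined-format log line
LOG_RE = re.compile(r'"(?:GET|HEAD) (\S+) HTTP/[\d.]+" (\d{3})')

def count_crawl_errors(log_lines, bot="Googlebot"):
    """Count 404 responses served to a given crawler, keyed by URL path."""
    errors = Counter()
    for line in log_lines:
        if bot not in line:
            continue  # only interested in search-engine crawler hits
        m = LOG_RE.search(line)
        if m and m.group(2) == "404":
            errors[m.group(1)] += 1
    return errors
```

The most-hit paths in the resulting counter are the first candidates for 301s to relevant live pages.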
So many more issues to consider...
-
Any help or pointers here would be enormously appreciated.
You are really focused on metrics. Those metrics are good for two things (my opinion, certainly minority opinion here): 1) entertainment value; 2) diverting productive time away from the real work of running a website.
Perhaps Google is using page engagement statistics to determine that their site is 'better' than ours in terms of user appeal and engagement?
That is right. Announced here ten years ago. Take all of the time that you spend on metrics and links and start putting it into improving the website. Have the courage to divorce yourself from these metrics for a year. Get engaged in different battles... beating their articles, beating their images, beating their deals, beating their service. Then look at your improved website and how many visitors are engaging it at a higher level.
You still have to pay attention to the technical and usability details explained by Alan, but if you are making genuine improvements to your website that make it more competitive on the basis of content and benefits to visitors, then your metrics will advance on their own.
-
Hi Alan,
Thanks very much for offering your thoughts on this. Really useful comments.
You're right that I had very much oversimplified my analysis of the issue. I just wanted to throw a few factors out there in order to encourage a response like yours - packed full of useful next steps for me to evaluate this further.
One of the points you raised was 'crawl efficiency', and on further investigation I noticed that we have an awful lot of pagination on our website, which users use to browse through our articles (we have tens of thousands). However, the competitor site tends to have all results on one page along with a 'show more' button. Might this be a good place to start? Looking at the cache dates of some of the results in Google, it certainly looks like the competitor pages are being crawled more often.
I also noticed from Google Webmaster Tools that our internal links report shows we are giving great prominence to many of our category landing pages, rather than to the article pages themselves. Does this sound like an area worth investigating?
5,000 or so average crawl errors probably also isn't helping, particularly given that we seem to get only around 6,000 pages crawled per day. Again, I'm guessing this is worth a lot of attention.
Duplicate page titles also seem to be an issue, so there is certainly lots for me to look at here.
Thanks again
Lou
-
You are asking some very challenging questions, and using some very limited metric comparisons to try to figure it all out. SEO is not so easy. If it was, many sites would be in a continual state of leap-frog as they out-do each other in similar ways.
Here are just a few questions / considerations to add to your process:
1. Regardless of the number of instances of one or more keywords on a page, what is the total volume of highly relevant content on a given page? How helpful is that information in answering questions your specific target visitors are needing to have answered? How helpful is it in being able to allow visitors to achieve a goal they came to your site to achieve?
2. How well organized is your content in regard to very similar pages being grouped together in both navigation and URL structure? Since my reading of your question implies the competitor site is much more "tight" in its singular focus, this is a critical factor for your site to evaluate.
3. If their site is much more 'tight' in its singular focus, how much is dilution a factor on the other pages of your site regarding topical focus and goal intent? If there is any serious dilution happening, you would likely need even more content within the section you are comparing to overcome that site's strength in refined singular focus.
4. What technical issues may exist on your site that you may not have considered? Crawl efficiency, page processing speed, canonical or duplicate content confusion? There are many other questions I could list with just this one consideration. Even if the competitor site has some worse signals among these, if any of yours are problematic enough, that alone can be a contributing factor.
5. How much higher is the quality of the inbound link footprint for your competitor in comparison to yours? Just having more links isn't a valid consideration on its own if you don't dig deep into the quality issue. If they have 10% of your inbound link volume, yet half or most of their inbound links are from very highly authoritative sites and you have fewer of those, that is another massive consideration.
Those are just starting point considerations.