Why Would My Page Have Higher PA, DA, Links & On-Page Grade & Still Not Rank?
-
The search term is "Alcohol Ink", and our client has better page authority, domain authority, links to the page, and on-page grade than the results in positions 5-10 of the SERP - yet we're not even in the top 50 according to Moz's rank tracker.
The only difference I can see is that our URL doesn't contain the exact search term, as some of the URLs in positions 5-10 do. Regardless, our on-page grade is significantly higher than the rest of them.
The one thing I did find was two links to the page (that we never asked for): one with a spam score in the low 20s and another in the low 30s.
Does anyone have any recommendations on how to get around this?
Certainly, a content and linking campaign around this term could also help, but I'm kind of scratching my head. The client is reputable, with a solid domain age, and well recognized in the space, so it's not like a noob trying to get in out of nowhere.
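For reference, if a manual review confirms those two unsolicited links really are junk, Google's disavow tool (in Search Console) accepts a plain-text file. A minimal sketch that writes one - the domains are made-up placeholders, not the actual linking sites:

```python
# Sketch: generate a file for Google's disavow-links tool. The domains
# are made-up placeholders standing in for the two unsolicited links;
# the format (one "domain:" rule or URL per line, "#" for comments) is
# Google's documented disavow-file format. Only disavow after a manual
# review - most low-quality links are better left ignored.
spammy_domains = [
    "link-farm-example.biz",       # hypothetical, spam score low 20s
    "spam-directory-example.net",  # hypothetical, spam score low 30s
]

lines = ["# Unsolicited links flagged by spam score, reviewed manually"]
lines += [f"domain:{d}" for d in spammy_domains]

with open("disavow.txt", "w", encoding="utf-8") as f:
    f.write("\n".join(lines) + "\n")
```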
-
You are more than welcome.
I know I really enjoy answering questions on here and I suspect that EffectDigital does as well.
Please do let us know how you get on, either directly or by replying to this post. That's one thing that's often lacking when we respond to questions on any forum: people don't always let us know the results of our answers.
I wish you all the best working with what sounds like a good client, and I hope to see more of you on the forums.
Steve
-
Thank you both for your additional information.
I was aware of most of the information you've shared here. However, you've taken my understanding and painted a very helpful "big picture" that steps back to consider all the factors at play.
It's been very helpful, as it reminds me that there likely isn't a "magic fix": through continual work on the body of the business/website, and through continual differentiation and creation of great content, we should be well positioned over the long term to compete even further.
Part of this question comes from a deep dive into understanding what has taken place for the client overall. Very little SEO work was done over the history of the business, and they've still managed to be very successful.
This company was a pioneer in their space, and they're such wonderful people and such a wonderful company that they were able to grow significantly. Now that competitors have crept in on the back of that success and deployed strong SEO strategies, they're starting to see their first dip from that major growth curve.
We do have the closest thing to a "magic fix" that you can have: moving their blog, with its millions of links, from the separate URL it presently sits on to their main eCommerce site. There is some apprehension on their part about making this move too quickly (which is understandable), so I'm trying to paint an overall picture, as you've just helped do, so that they can understand what we can do from here (without moving the blog) and what that looks like.
Your answers have been tremendously helpful in guiding the millions of thoughts in my head toward something actionable, and they give me a great way to explain it. I just wanted to thank you both very much, because I know you're busy people with a lot to do.
Thanks so much!
-
Not a problem! I always like these kinds of Qs and responses, as they cover a bit of the history of SEO.
-
Thanks, EffectDigital.
Again, your answer takes it to a new level and provides great insights.
-
Steve's answer is really great. Basically, in SEO we have to cater to Google's PageRank algorithm. We used to be able to see a very watered-down, simplified version of PageRank using the Google Toolbar for Firefox (before Chrome became big), and via various Chrome extensions thereafter.
Google figured out that people were misusing this data and shut off the API which supplied that (very, very simplified) version of PageRank - a number from 0-10 for each URL on the web. PageRank still exists and Google still use it in their ranking algorithms, but no one except Googlers (and even then, only certain ones) can see it. Arguably no one could ever really see it: TBPR (Toolbar PageRank) was so simplified and watered down that it was never a full view of a page's 'actual' PageRank.
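For intuition on what that 0-10 number loosely summarised, here's a minimal power-iteration sketch of the classic PageRank formulation from Brin & Page's original paper. The three-page link graph is made up, and Google's production signal is far more complex than this:

```python
# Minimal power-iteration sketch of the classic PageRank formulation
# (Brin & Page, 1998). The link graph is a made-up toy example.

DAMPING = 0.85      # probability the "random surfer" follows a link
ITERATIONS = 50     # plenty for this tiny graph to converge

# Hypothetical link graph: page -> pages it links out to
links = {
    "home": ["blog", "shop"],
    "blog": ["home", "shop"],
    "shop": ["home"],
}

pages = list(links)
rank = {page: 1.0 / len(pages) for page in pages}

for _ in range(ITERATIONS):
    new_rank = {}
    for page in pages:
        # Each page passes an equal share of its rank down every outlink
        inbound = sum(
            rank[src] / len(outs)
            for src, outs in links.items()
            if page in outs
        )
        new_rank[page] = (1 - DAMPING) / len(pages) + DAMPING * inbound
    rank = new_rank

for page, score in sorted(rank.items(), key=lambda kv: -kv[1]):
    print(f"{page}: {score:.4f}")
```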
Suddenly, marketers had no way to evaluate the SEO authority of the web pages they were looking at. Many stepped in to fill this hole (Ahrefs supply URL and domain rating metrics, Majestic SEO supply Citation Flow and Trust Flow metrics, and Moz of course were first with PA and DA).
These metrics are our industry's attempt to fill the hole left when Google removed that bad data from the public eye. Moz attempt to use various signals and metrics (link counts, search-traffic estimates for URLs) to rebuild TBPR as PA and DA.
... but Google don't use PA and DA. Google use PR (PageRank). PA and DA are 'shadow metrics': they indicate and mimic, but they are indicators only and cannot (read: absolutely must not) be taken at face value.
For example, although link counts affected Google's old TBPR metric, other things did too: whether a site was blocked from Google, or whether it had a penalty or algorithmic devaluations. Those things could lower or nullify the TBPR rating of a website. Since Google and Moz are not 'connected' in data terms, Moz's metrics miss many of the 'true' authority-nullifying circumstances which can occur - thus you can end up with high PA / DA and still no traffic.
Things that can affect you:
- Algorithmic devaluations, where the sites linking to your site are penalised and thus no longer pass SEO authority to you - making your results go down as well. That's not a penalty on you, just Darwinism in action, I'm afraid
- An actual penalty on your site
- Poor keyword targeting, where your keywords aren't properly used in your content and/or Meta data, stuff like that. It sounds like this one is a real concern for you, as you may have SEO authority but NO relevance!
- Technical issues, like an architecture which Google can't (or doesn't want to spend the time to) index, e.g. over-reliance on content generated through JavaScript (which Google can crawl, but it takes them much longer - so if you're a nobody, don't expect them to care much or take that time)
- Technical indexation issues, like blocking your own site with Meta noindex directives or robots.txt crawl blocks (see the self-check sketch just after this list)
- Legal challenges to your business or content in the form of DMCA requests, people filing reports directly with Google to have content removed from your site - there are many other types of legal challenge that can affect SEO
- Content duplication, internal or external
- Spam reports and disavow logs against your website
... there are many other factors. A big one is that your site may lack a value proposition for end users. If other sites doing what you do existed before you - and they're cheaper, have better reviews, or tout unique features like free shipping or click-and-collect services (for clothing, etc.) - then your offering itself may just not be competitive, and no matter how good your SEO is, the site was doomed from the business end. Google expects sites to 'add value' to the web.
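Several of the technical blockers above (robots.txt crawl blocks, Meta noindex, X-Robots-Tag headers, stray canonicals) can be roughly self-checked in a few lines. A sketch, assuming a placeholder URL and the third-party requests library:

```python
# Rough self-check for common self-inflicted indexation blocks. The URL
# is a hypothetical placeholder; requests is third-party, the rest is
# the Python standard library.
import re
import requests
from urllib import robotparser
from urllib.parse import urlsplit

URL = "https://www.example.com/alcohol-ink"  # hypothetical page
UA = "Googlebot"

# 1. robots.txt: is the crawler allowed to fetch this URL at all?
parts = urlsplit(URL)
rp = robotparser.RobotFileParser()
rp.set_url(f"{parts.scheme}://{parts.netloc}/robots.txt")
rp.read()
print("robots.txt allows crawl:", rp.can_fetch(UA, URL))

# 2. Status and headers: a noindex can also arrive via X-Robots-Tag
resp = requests.get(URL, headers={"User-Agent": UA}, timeout=10)
print("HTTP status:", resp.status_code)
print("X-Robots-Tag:", resp.headers.get("X-Robots-Tag", "none"))

# 3. On-page directives: meta robots and canonical tags
meta = re.search(r'<meta[^>]+name=["\']robots["\'][^>]*>', resp.text, re.I)
print("meta robots tag:", meta.group(0) if meta else "none")
canon = re.search(r'<link[^>]+rel=["\']canonical["\'][^>]*>', resp.text, re.I)
print("canonical tag:", canon.group(0) if canon else "none")
```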
The best thing to do is concentrate on your value proposition and on making your site genuinely popular online. It's not easy: building a successful site is as hard as building a successful business, because the site is just the digital reflection of what you are and what you do.
-
Hi,
The first thing to remember is that Google does not use Moz's DA or PA to decide where a website should rank in the results.
DA and PA are best used as relative measurements for comparing against other sites, so you can see how you are doing against your competitors.
Now, without knowing the URLs involved, I cannot check the websites to give any real insight. However, there are any number of reasons why the other sites may be ranking higher than your client's.
They may have better content for the keyword, they may have more backlinks pointing to the domain, or they may be answering questions around that keyword in more detail. Without actually being able to compare the sites, it is hard to say.
I would analyse the sites that rank higher than yours, check their backlink profiles, and compare the number of pages and the quality of the content; then you should have a plan to move forward and improve your rankings.
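To make that analysis systematic, here's a rough sketch that lines your page up against the ranking pages from a CSV export of your tool of choice. The file name, column names, and URL are illustrative assumptions, not a real export format:

```python
# Hypothetical sketch: compare your page against the SERP competitors
# using metrics exported to CSV from your tool of choice (Moz, Ahrefs,
# etc.). serp_export.csv and its columns are assumptions.
import csv

OUR_URL = "https://www.example.com/alcohol-ink"  # placeholder

with open("serp_export.csv", newline="") as f:
    rows = list(csv.DictReader(f))

# Assumes columns: url, position, page_authority, linking_root_domains,
# and that your own page appears somewhere in the export.
ours = next(r for r in rows if r["url"] == OUR_URL)

for row in sorted(rows, key=lambda r: int(r["position"])):
    gap = int(row["linking_root_domains"]) - int(ours["linking_root_domains"])
    print(f"#{row['position']:>2} {row['url']}: "
          f"PA {row['page_authority']}, "
          f"linking root domains {row['linking_root_domains']} ({gap:+d} vs ours)")
```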
I hope this helps,
Steve