Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies. More details here.
How to outrank a directory listing with high DA but low PA?
-
My site is in 4th place; 3 places above it is a Gumtree (similar to Yell or Yelp) listing. How can you figure out how difficult it would be to outrank those pages? I mean, obviously the pages have low PA and sit at the top based on the high DA of the site.
This also goes back to keyword research and difficulty: when I'm doing keyword research, I'll see a Wikipedia page in the top 5, or a yell.com listing, or perhaps a forbes.com article outranking my site. Typically the problem seems to be Google giving these pages a lot of ranking credit based on the high DA rather than the PA of the pages. How would you gauge the difficulty of that keyword, then, if the competition is pages with very high DA (which is impossible to compete with) but low PA?
Thanks
-
Most of my work is writing articles that take between three days and a week to author. I also have employees who assist with these articles by taking photos, making graphics, doing research, collecting data and posting them to websites. Some of these articles attack very difficult keywords.
After doing this for about 12 years on the same website, I still don't know how these articles are going to rank. A year or two after posting, some are on the first page of Google, defeating popular websites in ways that surprise me. Others perplex me because I am being beaten by pissants in SERPs that I would judge to be much easier. I suspect that semantics, keyword diversity and titles that elicit clicks help the pissants beat me, but I don't know for sure.
I can't predict how my own rankings will turn out on a website that I know well, in an industry where I have worked for 40 years and against competitors who are often people who I know by name or are even my own customers. The SERPs can be very difficult to understand. One thing that I will say with confidence is that DA and PA explain nothing and give zero guidance in winning a fight. They count as zero importance in my decisions. I can't even tell you those numbers for my own websites unless I go look. That's how little attention I give to them.
-
Sorry I couldn't help.
From the answer above: "DA is a factor and one that is very difficult for you to catch up to or surpass - BUT certainly not a massive % of the ranking algorithm."
-
Actually I don't, which is why I came here to ask the question. I think EGOL answered it above: they are beatable, as small businesses beat them every day. That's what I needed to know - whether they are beatable and how easy or hard it is to beat them, as that is a deciding factor in whether to invest more time and money into SEO or not (the money could be better spent on ads, for example).
You gave me a philosophical answer that basically said, "you can't change what they do or have, so work on what you can change in yourself", which is all fine and dandy, but it's a loose, vague, cookie-cutter spiritual-science answer. I mean, could I theoretically outrank the British Cancer Research website for the keyword "cancer research"? Obviously the answer is yes: using your advice, I can just keep working on my cancer research site, maybe throw a million £ and a couple of years into its SEO, and I'll outrank them. We know that; everyone knows that with enough hard work, time, money and effort you can achieve anything. That is not the question. The question was "how hard/easy is it?", as that is obviously a big factor when considering whether to continue with that strategy or not.
I mean no disrespect; I think you just misunderstood my question from the start as a "complaining type of question". Perhaps you interpreted it as me whinging about the high-DA competition and were trying to encourage me not to focus on that DA. I wasn't complaining. It was a straight-up question of how much that high DA is a factor in their site outranking mine, as I have built a lot of backlinks to my page, they have none to theirs, and they therefore must rely on their site's DA and traffic.
-
If you did not want constructive suggestions, why even ask? You obviously already knew the answer you wanted.
Best
-
You came here asking how to beat a directory and you got good answers and an action plan.
Unfortunately you look at metrics that Google does not use - metrics that are based upon a domain, metrics that have nothing to do with the methods of winning a SERP. Don't allow rubbish metrics to frighten you away.
These are directory sites that you are trying to defeat. Directories.
They are not the Library of Congress or the Pope. Pages on these sites are defeated by small businesses every day. Pages on Amazon are defeated by small businesses every day. These small businesses didn't run because they faced competition. They got to work.
If you are willing to work hard you should not fear competition. Because where there is competition there is usually a lot of search volume on a lot of diverse keywords. And, where there is competition there is usually a lot of money changing hands. Attack there with long content with diverse keywords and excellent quality. There is a good chance that you will earn traffic. Attack that keyword with multiple pages, each of excellent quality and targeting the long tail. One or more of those pages might eventually gain rankings for the short tail keyword.
Maybe you will not win if you fight. But you will never win if you run.
-
PA is built with inbound links to that page, and that page has 0 backlinks. It has a PA of 29 (which I had already checked), but that is probably boosted by traffic and by internal links to the page - all a direct result of Gumtree having massive traffic and DA.
I am not taking it personally; I'm just looking for a reasonable answer as to how much DA weighs in as a factor in rankings. I have a pragmatic approach to things, so for example:
Site A has DA 30 / PA 29; my site has DA 25 / PA 28. This gives me a good idea of what I need to do: study site A's backlink profile and on-site SEO, and try to raise my DA to match or beat theirs. It's a clear, pragmatic approach.
Now example B:
Site A has DA 90 / PA 29; my site has DA 30 / PA 45. Taking a logical approach, this is much harder to tackle pragmatically, mainly because we know we can never achieve a DA of 90, and my higher PA isn't overcoming that DA gap. So the question is: how much of a factor is that huge DA? This matters from a business perspective, because if it means six months of high-quality backlinks could overcome it, then maybe it's a go; but if it means I'd need to reach a DA of 60 and a PA of 60 to outrank that site, then there would be no point. And yes, I know DA/PA doesn't equal ranking, but it's the closest measuring stick we have for how successfully a site will rank.
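To make the decision rule I'm describing concrete, here is a rough sketch. It is purely illustrative: DA and PA are Moz's third-party estimates, not confirmed Google ranking factors, and every threshold below is an arbitrary assumption of mine, not something anyone has measured.

```python
# Purely illustrative heuristic: DA/PA are Moz estimates, not Google ranking
# factors, and the thresholds below are arbitrary assumptions.
def gap_assessment(my_da: int, my_pa: int, their_da: int, their_pa: int) -> str:
    """Rough read on whether a DA/PA gap looks worth attacking."""
    da_gap = their_da - my_da
    pa_gap = their_pa - my_pa
    if da_gap <= 10 and pa_gap <= 10:
        return "comparable: study their backlinks and on-site SEO, close the gap"
    if da_gap > 40 and pa_gap <= 0:
        return "big domain, weak page: compete on page-level quality and links"
    return "large gap: weigh further SEO spend against alternatives such as ads"

print(gap_assessment(25, 28, 30, 29))  # example A above
print(gap_assessment(30, 45, 90, 29))  # example B above
```

The point of the sketch is only that the two examples call for different decisions, not that these numbers mean anything precise.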
And I don't mean to be rude, but the idea of just "overcoming" something by doing better in the things you can change is a very vague one. If I want to be the best boxer in the world and I'm 50 years old, you could follow some rah-rah talk and claim that since I can't change my age I might as well just train and work on what I can change. But the fact of the matter is that it's nearly impossible to become the best boxer in the world at 50, so although there's a 0.01% chance it could be done, it's not a worthwhile investment. And this is why I asked the simple question of how much of a factor that DA is: to figure out whether it's worth investing in more SEO.
And you're right in the sense that keyword difficulty doesn't change based on who's above you. However, if you are aiming for the no.1 position, you are no.2, and no.1 is almost impossible to take over, then yes, that keyword is hard. When building niche sites, part of keyword research is researching your competition for a keyword before embarking on the project.
But thank you for taking your time to answer.
-
Magusara,
I think you are taking this a bit personally. Yes the pages above you are ultimately owned by someone even if that someone is a stockholder in the company that owns the page. The link you provide (https://www.gumtree.com/removal-services/bournemouth) has a page authority of 29.
When you complain about the authority reference with the Facebook example, you are missing the point. The issue is not 100% DA, yet you are fixated on that. Sorry, but it is simply my opinion that you cannot fixate on the thing you say is insurmountable (raising your DA above theirs) and then say that any other way of dealing with the roadblock is somehow not relevant, or that whoever supplied the suggestion is just wrong.
You are wrong about keyword difficulty. You say: "And I disagree, who ranks above you is a reflection on keyword difficulty. Obviously if you are trying to rank #1 for the keyword 'dog training', you are currently 2nd but no.1 is occupied by Facebook deciding to have a specific dog training page targeted for your area, it would be next to impossible to overtake them. Hypothetical situation but you get what I mean."
Who ranks above you is NOT a reflection of keyword difficulty. If today I put up a page about "purple dog jellybeans" and add content to it weekly, and in three months you put up a "purple dog jellybeans" page and get it indexed, it will most likely rank below mine at first. That doesn't mean "purple dog jellybeans" is a competitive keyword.
Difficulty is determined by the keyword within the context of all the competition in a given vertical for that term. It is not determined by the site above you having a lot of DA.
Everyone involved in SEO on these Q&A pages has faced the same hurdles you are experiencing, and all we can give you is our experience. Yes, DA is a factor and one that is very difficult for you to catch up to or surpass - BUT certainly not a massive % of the ranking algorithm.
The points we were making were to suggest you quit looking at that one roadblock (DA) and go about developing everything else that your competitor cannot. You KNOW the area the directory participates in. Use what your advantage is against them and quit being argumentative with those who only want to assist you. Focus is where we place our gaze and apply our energy. You are focusing on the roadblock of DA. We are suggesting you focus on everything else and make the roadblock cease to exist.
I will apply a golf analogy and then say goodbye:
If your shot must cross water to reach the green, you can focus on the water or on the green. Ask any golfer where the ball goes if you are focused on (caught up with avoiding) the water. Just a fact of life for me.
We wish you the very best,
Robert
-
I don't hate directories or envy their DA. The pages above me are NOT owned by a person. They are a directory listing.
I say "obviously they have a low PA" because these pages are not owned by anyone, so no one is doing SEO for them or building traffic or backlinks to them. They are simply the result of the directory's listing structure. Example: https://www.gumtree.com/removal-services/bournemouth
That URL has 0-1 backlinks according to Ahrefs and Moz's Site Explorer. The page ranks highly literally because the DA of the site is 70. As for researching what they are doing - that's kind of like saying research what Facebook or yell.com is doing to see if you can achieve the same DA as their site.
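For what it's worth, here's roughly how I sanity-check that "big domain, link-less page" pattern in bulk rather than looking pages up one by one. This is a sketch only: `metrics` stands in for one record of a response from a link-index tool such as Ahrefs or Moz, and the field names are my assumptions for illustration, not any documented schema.

```python
# Sketch only: 'metrics' represents one record from a link-index tool's
# response (Ahrefs, Moz, etc.). The field names are illustrative assumptions,
# not a documented API schema.
def summarize_page(metrics: dict) -> str:
    """One-line summary of page-level vs domain-level strength."""
    pa = metrics.get("page_authority", 0)
    da = metrics.get("domain_authority", 0)
    links = metrics.get("external_links_to_page", 0)
    if links == 0 and da > pa:
        note = "ranking on domain strength"
    else:
        note = "page has its own links"
    return f"PA {pa} / DA {da}, external links to page: {links} ({note})"

# The Gumtree page discussed above, using the numbers from this thread:
sample = {"page_authority": 29, "domain_authority": 70, "external_links_to_page": 0}
print(summarize_page(sample))
```

Any page that comes back flagged "ranking on domain strength" is the kind of result I'm asking about: its position is inherited from the domain, not earned by the page.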
And I disagree: who ranks above you is a reflection of keyword difficulty. Obviously, if you are trying to rank #1 for the keyword "dog training" and you are currently 2nd, but no.1 is occupied by Facebook, which has decided to have a specific dog-training page targeted at your area, it would be next to impossible to overtake them. A hypothetical situation, but you get what I mean.
The purpose of the question is not to hate on the sites above me; it's to estimate how difficult directory results are to outrank in search engines, because with the huge DA they hold they can't be measured by the same ranking factors as other personally owned sites. A Forbes article is similar: you can build more links to your page than the Forbes article has, but the Forbes DA will also weigh in. I guess the question is how much of a factor the DA is versus the PA.
-
I really enjoyed Robert's answer because we all see so many of these DA and PA envy questions.
To build on Robert's theme of "leverage your advantages" - these directories are usually built by people who know very little about your local area or industry. They are also generally cookie-cutter sites built by factory workers who live 1000 miles away. For those reasons, it is often very easy for a local or industry expert to build content that is vastly superior and to provide a landing-page experience that impresses visitors enough that they will share it with others. You probably also know the visitor better than the factories that build these sites do.
Put some time into your content and presentation. Winning there can significantly improve the success of your page in the SERPs. It can also significantly improve your chances of attracting the searcher to your business instead of the business of your competitors. Show your expertise!
-
Magusara
In my opinion, if you wish to outrank anyone, you should first take a step back and not draw conclusions without researching. You state, "...obviously the pages would have low PA and they are top based on the high DA of the site." If it is that obvious, why not research to see what exactly they are doing?
If you have spent any time on Moz's Q&A, you will have seen tons of "How is this page outranking me?" questions. The key to combating those pages is to learn more about them than they know about themselves. You cannot do that if you assume or believe anything to be obvious. Yes, I hate directories (when it serves me not to work with them) as much as any other SEO professional, but they do get many things right. One thing you can do is grow your DA, and yes, it will take a bit of time. But look closer: is everything really as it appears?
My point is this: knowing they have an advantage is not going to help you combat them; knowing what advantages and disadvantages they have will help you fight them, if you choose to exploit them. So, is it worth the time to fight? If it is, then know your opponent and leverage your own advantages.
Hope that helps,
Robert
PS the keyword difficulty does not change based on who is above you.