How to outrank a directory listing with high DA but low PA?
-
My site is in 4th place; three places above it is a Gumtree (similar to Yell or Yelp) listing. How can you figure out how difficult it would be to outrank those pages? Obviously the pages have low PA, and they rank at the top based on the high DA of the site.
This also goes back to keyword research and difficulty. When I'm doing keyword research and I see a Wikipedia page in the top 5, or a yell.com page, or perhaps a forbes.com article outranking my site, the problem typically seems to be Google giving these pages a lot of credit based on the high DA of the domain rather than the PA of the pages. How would you gauge the difficulty of that keyword when the competition is pages with very high DA (which is impossible to compete with) but low PA?
Thanks
-
Most of my work is writing articles that take between three days and a week to author. I also have employees who assist with these articles by taking photos, making graphics, doing research, collecting data and posting them to websites. Some of these articles attack very difficult keywords.
After doing this for about 12 years on the same website, I still don't know how these articles are going to rank. A year or two after posting, some are on the first page of Google, defeating popular websites in ways that surprise me. Others perplex me because I am being beaten by pissants, in SERPs that I would judge to be much easier. I suspect that semantics, keyword diversity and titles that elicit clicks help the pissants beat me, but I don't know for sure.
I can't predict how my own rankings will turn out on a website that I know well, in an industry where I have worked for 40 years and against competitors who are often people who I know by name or are even my own customers. The SERPs can be very difficult to understand. One thing that I will say with confidence is that DA and PA explain nothing and give zero guidance in winning a fight. They count as zero importance in my decisions. I can't even tell you those numbers for my own websites unless I go look. That's how little attention I give to them.
-
Sorry I couldn't help.
From above: "DA is a factor and one that is very difficult for you to catch up to or surpass - BUT certainly not a massive % of the ranking algorithm."
-
Actually I don't, which is why I came here to ask the question. I think EGOL answered it above: they are beatable, as small businesses beat them every day. That's what I needed to know, whether they are beatable and how easy or hard it is to beat them, as that is a deciding factor in whether to invest more time and money into SEO or not (the money could be better spent on ads, for example).
You gave me a philosophical answer that basically said, "you can't change what they do or have, so work on what you can change in yourself", which is all fine and dandy, but it's a loose, vague, cookie-cutter spiritual answer. I mean, could I theoretically outrank the "British Cancer Research" website for the keyword "cancer research"? Obviously the answer is yes: using your advice, I could just keep working on my cancer research site, throw £1 million into its SEO and a couple of years, and I'll outrank them. We know that; everyone knows that with enough time, money and effort you can achieve anything. That was not the question. The question was "how hard/easy is it?", as that is obviously a big factor when considering whether to continue with that strategy or not.
I mean no disrespect; I think you just misunderstood my question from the start as a "complaining type of question". Perhaps you interpreted it as me whinging about the high-DA competition, and you were trying to encourage me not to focus on that DA. I wasn't complaining. It was a straight-up question about how much that high DA is a factor in their site outranking mine, given that I have built a lot of backlinks to my page, they have none to theirs, and they therefore must rely on their site's DA and traffic.
-
If you did not want constructive suggestions, why even ask? You obviously already knew the answer you wanted.
Best
-
You came here asking how to beat a directory and you got good answers and an action plan.
Unfortunately, you are looking at metrics that Google does not use, metrics that are based upon a domain, metrics that have nothing to do with the methods of winning a SERP. Don't allow rubbish metrics to frighten you away.
These are directory sites that you are trying to defeat. Directories.
They are not the Library of Congress or the Pope. Pages on these sites are defeated by small businesses every day. Pages on Amazon are defeated by small businesses every day. These small businesses didn't run because they faced competition. They got to work.
If you are willing to work hard you should not fear competition. Because where there is competition there is usually a lot of search volume on a lot of diverse keywords. And, where there is competition there is usually a lot of money changing hands. Attack there with long content with diverse keywords and excellent quality. There is a good chance that you will earn traffic. Attack that keyword with multiple pages, each of excellent quality and targeting the long tail. One or more of those pages might eventually gain rankings for the short-tail keyword.
Maybe you will not win if you fight. But you will never win if you run.
-
PA is built with inbound links to that page. That page has 0 backlinks. It has a PA of 29 (which I had already checked), probably boosted by traffic and by internal links to that page, all of which is a direct result of Gumtree's massive traffic and DA.
I am not taking it personally; I'm just looking for a reasonable answer as to how much DA weighs in as a ranking factor. I have a pragmatic approach to things, so for example:
Site A has DA 30 and PA 29; my site has DA 25 and PA 28. This gives me a good idea of what I need to do: study site A's backlink profile and on-site SEO, and try to raise my DA to match or beat theirs. It's a clear, pragmatic approach.
Now example B:
Site A has DA 90 and PA 29; my site has DA 30 and PA 45. This is much harder to approach pragmatically, mainly because we know we can never achieve DA 90, and my higher PA isn't overcoming that DA gap. So the question is: how much of a factor is that huge DA? This matters from a business-decision perspective, because if six months of high-quality backlinks could overcome it, then maybe it's a go; but if I would need to reach DA 60 and PA 60 to outrank that site, then there would be no point. And yes, I know DA/PA doesn't equal ranking, but it's the closest measuring stick we have for how likely a site is to rank.
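To make the comparison concrete, the kind of gap analysis I'm describing can be sketched in a few lines. This is purely illustrative: the `authority_gap` helper and the `da_weight` blend are made-up placeholders (nobody outside Google knows how, or whether, these third-party metrics map to rankings), so treat it as a way to frame the go/no-go decision, not a ranking model:

```python
# Illustrative only: DA/PA are third-party estimates, and the linear
# weighting below is invented to frame the "is it worth it?" question.

def authority_gap(my_da, my_pa, their_da, their_pa, da_weight=0.5):
    """Blend DA and PA into one rough 'strength' score per page and
    return the competitor's score minus mine.
    A positive gap means the competitor looks stronger."""
    mine = da_weight * my_da + (1 - da_weight) * my_pa
    theirs = da_weight * their_da + (1 - da_weight) * their_pa
    return theirs - mine

# Example A: close race -- a clear, pragmatic target.
print(authority_gap(25, 28, 30, 29))

# Example B: the DA 90 directory page vs. my DA 30 / PA 45 page.
print(authority_gap(30, 45, 90, 29))                 # huge gap if DA dominates
print(authority_gap(30, 45, 90, 29, da_weight=0.2))  # negative if PA dominates
```

Flipping `da_weight` captures the crux of my question: if domain-level signals dominate, the DA 90 directory is out of reach; if page-level signals dominate, my stronger PA already makes it a fair fight.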
And I don't mean to be rude, but the idea of just "overcoming" something by doing better in the things you can change is very vague. If I want to be the best boxer in the world and I'm 50 years old, you could follow some rah-rah advice that since you can't change your age you might as well just train and work on what you can change. But the fact of the matter is that it's nearly impossible to be the best boxer in the world at 50, so although there's a 0.01% chance it could be done, it's not a worthwhile investment. And this is why I asked a simple question about how much of a factor that DA is: to figure out whether it's worth investing more into SEO.
And you're right that keyword difficulty doesn't change based on who's above you. However, if you are aiming for the no. 1 position and you are no. 2, but no. 1 is almost impossible to take over, then yes, that keyword is hard. When building niche sites, part of keyword research is researching your competition for that keyword before embarking on the project.
But thank you for taking your time to answer.
-
Magusara,
I think you are taking this a bit personally. Yes, the pages above you are ultimately owned by someone, even if that someone is a stockholder in the company that owns the page. The link you provided (https://www.gumtree.com/removal-services/bournemouth) has a page authority of 29.
When you complain about the authority reference with the Facebook example, you are missing the point. The issue is not 100% DA, and you are fixated on that. Sorry, but it is simply my opinion that you cannot fixate on the thing you say is insurmountable (increasing your DA above theirs) and then say that any other way to deal with the roadblock is irrelevant, or that whoever supplied the suggestion is just wrong.
You are wrong about keyword difficulty. You say: "And I disagree, who ranks above you is a reflection on keyword difficulty. Obviously if you are trying to rank #1 for the keyword 'dog training', you are currently 2nd but no.1 is occupied by Facebook deciding to have a specific dog training page target for your area, it would be next to impossible to overtake them. Hypothetical situation but you get what I mean."
Who ranks above you is NOT a reflection of keyword difficulty. If today I erect a page about "purple dog jellybeans" and add content to it weekly, and in three months you erect a page about "purple dog jellybeans" and index it, most likely it will initially rank below my page. That doesn't mean that the term "purple dog jellybeans" is a competitive keyword.
That is determined by the keyword within the context of all the competition in a given vertical for that term. It is not determined by the site above you with a lot of DA.
Everyone involved in SEO on these Q&A pages has faced the same hurdles you are experiencing, and all we can give you is our experience. Yes, DA is a factor and one that is very difficult for you to catch up to or surpass - BUT certainly not a massive % of the ranking algorithm.
The points we were making were to suggest that you quit looking at that one roadblock (DA) and go about developing everything your competitor cannot. You KNOW the area the directory operates in. Use your advantage against them, and quit being argumentative with those who only want to assist you. Focus is where we place our gaze and apply our energy. You are focusing on the roadblock of DA. We are suggesting you focus on everything else and make the roadblock cease to exist.
I will apply a golf analogy and then say goodbye:
If you are faced with water you must cross to get to the green with your shot, you can focus on the water or on the green. Ask any golfer where the ball goes if you are focused on (caught up with avoiding) the water. Just a fact of life for me.
We wish you the very best,
Robert
-
I don't hate directories or envy their DA. The pages above me are NOT owned by a person. They are a directory listing.
I say "obviously they have a low PA" because these pages are not owned by anyone, and therefore they don't have any SEO, or an owner building traffic or backlinks to them. They are simply the result of the directory listing. Example: https://www.gumtree.com/removal-services/bournemouth
That URL has 0-1 backlinks according to Ahrefs and Moz's Site Explorer. The page ranks highly literally because the DA of the site is 70. As for researching what they are doing, that's a bit like saying research what Facebook or yell.com is doing to see if you can achieve the same DA as their site.
And I disagree, who ranks above you is a reflection on keyword difficulty. Obviously if you are trying to rank #1 for the keyword "dog training", you are currently 2nd but no.1 is occupied by Facebook deciding to have a specific dog training page target for your area, it would be next to impossible to overtake them. Hypothetical situation but you get what I mean.
The purpose of the question is not to hate on the sites above me; it's to estimate how difficult directory results are to outrank in search engines, because the huge DA they hold means they can't be held to the same ranking factors as other personally owned sites. It's the same with, say, a Forbes article: you can build more links to your page than there are to the Forbes article, but the Forbes DA will also weigh in. I guess the question is how much of a factor the DA is over the PA.
-
I really enjoyed Robert's answer because we all see so many of these DA and PA envy questions.
To build on Robert's theme of "leverage your advantages": these directories are usually built by people who know very little about your local area or industry. They are also generally cookie-cutter sites built by factory workers who live 1,000 miles away. For those reasons, it is often very easy for a local expert or an industry expert to build content that is vastly superior, and to provide a landing-page experience that impresses visitors enough that they will share it with others. You probably also know the visitor better than the factories that build these sites.
Put some time into your content and presentation. Winning there can significantly improve the success of your page in the SERPs. It can also significantly improve your chances of attracting the searcher to your business instead of the business of your competitors. Show your expertise!
-
Magusara
In my opinion, if you wish to outrank anyone, you should first take a step back and not draw conclusions without researching. You state, "...obviously the pages would have low PA and they are top based on the high DA of the site." If it is so obvious, why not research to see what exactly they are doing?
If you have spent any time on Moz's Q&A, you will have seen tons of "How is this page outranking me?" questions. The key to combatting those pages is to learn more about them than they know about themselves. You cannot do that if you assume or believe anything to be obvious. Yes, I hate directories (when it serves me not to work with them) as much as any other SEO professional, but they do get many things correct. One thing you can do is grow your DA, and yes, it will take a bit of time. But look closer: is everything really as it appears?
My point is this: knowing they have an advantage is not going to assist you in combatting them; knowing what advantages and disadvantages they have will help you fight them, if you choose to exploit them. So, is it worth the time to fight? If it is, then know your opponent and leverage your own advantages.
Hope that helps,
Robert
PS: Keyword difficulty does not change based on who is above you.