Site not passing page authority....
-
Hi,
This site powertoolworld.co.uk is not passing page authority. In fact every page shows no links unless it has a link from an external source.
Originally this site blocked Roger from crawling it but that block was lifted over 6 months ago. I also ran a crawl test last night and it shows the same thing. PA of 1 and no links.
I would like to point out that the problem seems to be the same for all sites on the same platform, which points me in the direction of code. For example, there is a display: none rule in the CSS that is used to style the area where the sidebar links are. It's a Blue Park platform.
What could be causing the problem?
Thanks in advance.
EDIT
Turns out that blocking the ezooms crawler stopped it from being included.
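For anyone hitting the same thing: a block like the one described in the EDIT would normally live in robots.txt as a user-agent disallow. A minimal sketch of checking whether a given crawler is blocked, using Python's standard-library robot parser (the robots.txt content below is hypothetical, reconstructed from the symptom described):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt of the kind that would block ezooms
# (the crawler feeding the Mozscape/OSE link index) while
# leaving other bots unaffected.
robots_txt = """
User-agent: ezooms
Disallow: /

User-agent: *
Disallow:
""".strip().splitlines()

rp = RobotFileParser()
rp.parse(robots_txt)

# ezooms is disallowed everywhere; other crawlers are allowed.
print(rp.can_fetch("ezooms", "https://www.powertoolworld.co.uk/"))    # False
print(rp.can_fetch("Googlebot", "https://www.powertoolworld.co.uk/"))  # True
```

Removing the `User-agent: ezooms` block (as was done here) re-opens the site to the link-index crawler, but the index itself only reflects that after a later data update.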
-
Hi there Kyle,
Thanks for writing in! Sorry for the delay. I was able to take a look at the data from the tests I ran and at your crawl. From what I can tell, here is what I noticed:
- Our crawler is pulling up your internal links. If you export your crawl diagnostics to CSV from the "Crawl Diagnostics" tab, you will find your internal links listed under "link count."
- The internal link count displays 0 because that data is based on our Mozscape index, so pages we have not indexed yet will show "0" links. This can be misleading: if we have no data for a page, then technically we didn't crawl its links.
- Our Mozscape index did find some of your top pages; you can check them out here: www.opensiteexplorer.org/pages?site=www.powertoolworld.co.uk%2F
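The CSV check in the first bullet can be scripted. A minimal sketch, assuming the export has "URL" and "link count" columns (column names here are illustrative; match them to your actual export header):

```python
import csv

def pages_with_links(path: str):
    """Return (url, link count) pairs for pages where the crawler
    found at least one internal link in the diagnostics export."""
    with open(path, newline="") as f:
        rows = list(csv.DictReader(f))
    return [(r["URL"], int(r["link count"]))
            for r in rows
            if int(r["link count"]) > 0]
```

If this returns most of your pages, the crawler is seeing your internal links and the "0" in the UI is just the index lag described above.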
In terms of why your pages aren't passing authority, I don't have a straight answer. Since Google can index more of your pages than we can, it is looking at a broader picture than OSE, so the metrics in OSE should be used as a secondary stat rather than as your primary source.
I hope that helps! Let me know if you have any questions here or in our help ticket.
Best,
Peter
SEOmoz Help Team.
-
I've got it in the support queue but don't have a firm answer yet. I suspect it would require a data update - those are getting faster (about 2-3 weeks).
-
Great.
I'm hoping it's a Moz issue rather than a site issue.
If it is a Moz issue, I'm assuming this won't update until the next Linkscape update?
Thanks
-
Ah, got it - you've got PA on a few pages, but that's it. Yeah, that definitely seems wrong. Let me ping support and see if we can get any answers.
-
Hey Peter,
Thanks for looking into this.
I'm checking the www version.
There is page authority on around 6 pages in total. All have external links.
I've checked all of that too and everything looks normal. The CSS display: none thing is maybe just clutching at straws.
-
I'm not seeing any issues in the source code, and Xenu (desktop crawler) is seeing the internal links. You've got 22K+ pages indexed in Google, and the cached version looks normal (no cloaking or other oddities).
Are you checking the "www" or non-www version? I notice you redirect to "www", so some of our tools may give you odd stats on the non-canonical version. I'm seeing a PA of 39 in Open Site Explorer, though.
Let me know where you're looking, and maybe I can get Support to take a peek. It is possible something happened with blocking our bots in the past (I'm not sure how often we re-check that).