Unreachable Pages
-
Hi All
Is there a tool to check whether a website has stand-alone, unreachable pages?
Thanks for helping
-
The only possible way I can think of is if the other person's site has an XML sitemap that is accurate, complete, and was generated by the website's system itself (as is often created by plugins on WordPress sites, for example).
You could then pull the URLs from the XML into the spreadsheet as indicated above, add the URLs from the "follow link" crawl, and continue from there. If a site has an XML sitemap, it's usually located at www.website.com/sitemap.xml. Alternatively, its location may be specified in the site's robots.txt file.
The only way this can be done accurately is if you can get a list of all URLs natively created by the website itself. Any third-party tool/search engine is only going to be able to find pages by following links. And the very definition of the pages you're looking for is that they've never been linked. Hence the challenge.
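If you'd rather script the sitemap step than copy-paste into a spreadsheet, here's a minimal sketch. The sample sitemap and the example.com URLs are made up for illustration; a real one would be fetched from www.website.com/sitemap.xml.

```python
# Hypothetical sketch: extract the <loc> URLs from a sitemap's XML.
# The sample sitemap below is invented for illustration.
import xml.etree.ElementTree as ET

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def parse_sitemap(xml_text):
    """Return the URLs listed in the sitemap's <loc> elements."""
    root = ET.fromstring(xml_text)
    return [loc.text.strip() for loc in root.iter(SITEMAP_NS + "loc")]

sample = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://www.example.com/</loc></url>
  <url><loc>https://www.example.com/contact</loc></url>
</urlset>"""

print(parse_sitemap(sample))
# ['https://www.example.com/', 'https://www.example.com/contact']
```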
Paul
-
Thanks Paul! Is there any way to do that for another person's site? Any tool?
-
The only way I can see accomplishing this is if you have a fully complete sitemap generated by your own website's system (i.e. not created by a third-party tool, which simply follows links to map your site).
Once you have the full sitemap, you'll also need to do a crawl using something like Screaming Frog to capture all the pages it can find using the "follow link" method.
Now you should have a list of ALL the pages on the site (the first sitemap) and a second list of all the pages that can be found through internal linking. Load both into a spreadsheet and eliminate all the duplicate URLs. What you'll be left with "should" be the pages that aren't connected by any links, i.e. the orphaned pages.
You'll definitely have to do some manual cleanup in this process to deal with things like page URLs that include dynamic variables, etc., but it should give you a strong starting point. I'm not aware of any tool capable of doing this for you automatically.
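The comparison step can be sketched as a set difference. The URLs below are invented for illustration; in practice you'd load the sitemap export and the Screaming Frog crawl export (e.g. from CSV) instead of hard-coding them. Stripping query strings is a simplified stand-in for the manual cleanup of dynamic-variable URLs mentioned above.

```python
# Hypothetical sketch: sitemap URLs minus crawled URLs = orphan candidates.
def normalize(url):
    # Strip query strings so dynamic-variable variants compare equal.
    return url.split("?", 1)[0]

sitemap_urls = {
    "https://www.example.com/",
    "https://www.example.com/about",
    "https://www.example.com/old-promo",   # never linked internally
}
crawled_urls = {
    "https://www.example.com/",
    "https://www.example.com/about?ref=nav",
}

orphans = {normalize(u) for u in sitemap_urls} - {normalize(u) for u in crawled_urls}
print(sorted(orphans))
# ['https://www.example.com/old-promo']
```

The /about variants dedupe after normalization, so only the page no crawl could reach remains.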
Does this approach make sense?
Paul
-
pages without any internal links to them
-
Do you mean orphaned pages without any internal links to them? Or pages that are giving a bad server header code?
-
But I want to find the stand-alone pages only. I don't want to see the reachable pages. Can anyone help?
-
If the page is indexed, you can search Google for the site's URL in quotes ("www.site.com") and it will return all the pages that have this URL on them.