Unreachable Pages
-
Hi All
Is there a tool that can check whether a website has standalone, unreachable pages?
Thanks for helping
-
The only possible way I can think of is if the other person's site has an XML sitemap that is accurate, complete, and was generated by the website's own system (as is often created by plugins on WordPress sites, for example).
You could then pull the URLs from the XML sitemap into the spreadsheet as indicated above, add the URLs from the "follow link" crawl, and continue from there. If a site has an XML sitemap, it's usually located at www.website.com/sitemap.xml. Alternatively, its location may be specified in the site's robots.txt file.
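To illustrate that step, here's a minimal Python sketch (not a Moz or WordPress feature, just an example that assumes the requests library is installed and the sitemap sits at the default /sitemap.xml location) for pulling every <loc> URL out of an XML sitemap so it can be pasted into a spreadsheet:

```python
# Minimal sketch: list the URLs in an XML sitemap.
# SITEMAP_URL is a hypothetical example, not a real site.
import requests
import xml.etree.ElementTree as ET

SITEMAP_URL = "https://www.website.com/sitemap.xml"

def sitemap_urls(sitemap_url: str) -> list[str]:
    """Return the <loc> entries from a (non-index) XML sitemap."""
    response = requests.get(sitemap_url, timeout=30)
    response.raise_for_status()
    root = ET.fromstring(response.content)
    # Sitemap entries live in the sitemaps.org namespace.
    ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
    return [loc.text.strip() for loc in root.findall(".//sm:loc", ns)]

if __name__ == "__main__":
    for url in sitemap_urls(SITEMAP_URL):
        print(url)
```

If the file turns out to be a sitemap index (a list of child sitemaps rather than pages), you'd fetch each child sitemap the same way and combine the results.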
The only way this can be done accurately is if you can get a list of all URLs natively created by the website itself. Any third-party tool or search engine is only going to be able to find pages by following links, and the very definition of the pages you're looking for is that they've never been linked to. Hence the challenge.
Paul
-
Thanks Paul! Is there any way to do that for another person's site? Is there any tool?
-
The only way I can see to accomplish this is if you have a fully complete sitemap generated by your own website's system (i.e. not created by a third-party tool, which simply follows links to map your site).
Once you have the full sitemap, you'll also need to do a crawl using something like Screaming Frog to capture all the pages it can find using the "follow link" method.
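For anyone curious what that "follow link" crawl amounts to, here's a bare-bones Python stand-in (a rough sketch only, not Screaming Frog and not a replacement for it; the start URL is a hypothetical example and only a basic breadth-first link walk is shown):

```python
# Rough sketch of a "follow link" crawl: start at the homepage, follow
# internal links breadth-first, and record every page reached.
from html.parser import HTMLParser
from urllib.parse import urljoin, urlsplit
import requests

START_URL = "https://www.website.com/"  # hypothetical example site

class LinkExtractor(HTMLParser):
    """Collect href values from <a> tags."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(start_url: str, limit: int = 500) -> set[str]:
    domain = urlsplit(start_url).netloc
    seen, queue = set(), [start_url]
    while queue and len(seen) < limit:
        url = queue.pop(0)
        if url in seen:
            continue
        seen.add(url)
        try:
            response = requests.get(url, timeout=30)
        except requests.RequestException:
            continue
        if "text/html" not in response.headers.get("Content-Type", ""):
            continue
        parser = LinkExtractor()
        parser.feed(response.text)
        for href in parser.links:
            absolute = urljoin(url, href).split("#")[0]
            if urlsplit(absolute).netloc == domain and absolute not in seen:
                queue.append(absolute)
    return seen

if __name__ == "__main__":
    for page in sorted(crawl(START_URL)):
        print(page)
```

A real crawler also respects robots.txt, handles canonicals, rate-limits itself, and so on; this only shows the principle.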
Now you should have a list of ALL the pages on the site (the first sitemap) and a second list of all the pages that can be found through internal linking. Load both into a spreadsheet and eliminate all the duplicate URLs. What you'll be left with "should" be the pages that aren't connected by any links - ie the orphaned pages.
You'll definitely have to do some manual cleanup in this process to deal with things like page URLs that include dynamic variables etc, but it should give a strong starting point. I'm not aware of any tool capable of doing this for you automatically.
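If the spreadsheet gets unwieldy, the same comparison can be scripted. Here's a small sketch under the assumption that both lists have been exported to plain-text files with one URL per line (the file names are hypothetical):

```python
# Diff the sitemap URL list against the crawl URL list to surface pages
# that appear in the sitemap but were never reached by following links.
from urllib.parse import urlsplit

def normalize(url: str) -> str:
    """Light cleanup so trivial variations don't hide real matches."""
    parts = urlsplit(url.strip().lower())
    # Drop query strings (dynamic variables) and trailing slashes.
    return f"{parts.scheme}://{parts.netloc}{parts.path}".rstrip("/")

def load_urls(path: str) -> set[str]:
    with open(path, encoding="utf-8") as handle:
        return {normalize(line) for line in handle if line.strip()}

sitemap = load_urls("sitemap_urls.txt")  # hypothetical export of the XML sitemap
crawled = load_urls("crawl_urls.txt")    # hypothetical export from the link crawl

# Candidate orphans: in the sitemap, but never found by following links.
for url in sorted(sitemap - crawled):
    print(url)
```

The normalize() helper is where the manual cleanup Paul mentions would go, e.g. stripping tracking parameters or session IDs before comparing.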
Does this approach make sense?
Paul
-
pages without any internal links to them
-
Do you mean orphaned pages without any internal links to them? Or pages that are giving a bad server header code?
-
But I want to find the standalone pages only. I don't want to see the reachable pages. Can anyone help?
-
If the page is indexed, you can place the site URL in quotes ("www.site.com") in Google, and it will return all the pages that have this URL on them.