How to identify orphan pages?
-
I've read that you can use Screaming Frog to identify orphan pages on your site, but I can't figure out how to do it. Can anyone help?
I know that Xenu Link Sleuth works, but I'm on a Mac, so that's not an option for me.
Or are there other ways to identify orphan pages?
-
DeepCrawl.co.uk is another great resource here. The tool gives you a full list of URLs, including the number of internal links to each page. Filter that list by "No. links in" = 0 and you'll have a good list of orphaned pages.
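If the export is too big to filter comfortably in Excel, the same filter is easy to script. Below is a minimal sketch in Python; the file name crawl_report.csv and the "Address" / "No. links in" column names are assumptions, so match them to whatever your crawler actually exports.

```python
# Minimal sketch: pull candidate orphans out of a crawler export.
# Assumed input: "crawl_report.csv" with "Address" and "No. links in" columns;
# rename these to match the columns your crawler actually exports.
import csv

with open("crawl_report.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        links_in = int(row["No. links in"] or 0)
        if links_in == 0:
            print(row["Address"])  # no internal links point here
```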
Cheers,
Mike | Fresh Egg Australia
-
Hi Marie!
Sadly, I don't use Xenu anymore either. Most of the solutions for finding orphaned pages are either hit-and-miss manual methods (searching OSE, searching your server files) or something like the method Agents of Value describes here.
A couple of posts that may help:
1. Find Orphaned Pages From Your Sitemap.xml File with Excel and IIS Toolkit
Requires the IIS toolkit, which, unless you're installing it on an external machine, isn't Mac-friendly.
2. Ian has some great tips here, including:
- Search the server log files for every unique URL loaded over a six-month period, then compare that list to all the unique URLs found in a site crawl (see the sketch after this list). People have a funny way of stumbling into pages you've accidentally blocked or orphaned, so chances are those pages will still show up in your log file.
- Do a database export. If you're using WordPress or another content management system, you can export a full list of every page/post on the site, along with the URLs generated. Then compare that list to a site crawl.
- Run two crawls of your site using your favorite crawler. Do the first one with the default settings. Then do a second with the crawler set to ignore robots.txt and nofollow. If the second crawl has more URLs than the first, and you want 100% of your site indexed, then check your robots.txt and look for meta ROBOTS issues.
3. Supposedly, Webseo has an automated option for finding orphaned files, but I haven't used it and can't vouch for it: http://www.webseo.com/
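For the log-file tip above, here is a minimal sketch of the log-versus-crawl comparison in Python. It assumes a combined-format access log saved as access.log, a Screaming Frog internal export saved as crawl.csv with an "Address" column, and the placeholder host www.example.com; all three are assumptions you'd adapt to your own site.

```python
# Minimal sketch: URLs that appear in the server logs but not in a crawl export.
# Assumptions: a combined-format access log ("access.log"), a crawler export
# ("crawl.csv") with an "Address" column, and the placeholder host below.
import csv
import re

HOST = "https://www.example.com"

# Pull the request path out of lines like: "GET /about/ HTTP/1.1"
request_re = re.compile(r'"(?:GET|POST) (\S+) HTTP')

logged_urls = set()
with open("access.log", encoding="utf-8", errors="ignore") as log:
    for line in log:
        match = request_re.search(line)
        if match:
            path = match.group(1).split("?")[0]  # ignore query strings
            logged_urls.add(HOST + path)

crawled_urls = set()
with open("crawl.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        crawled_urls.add(row["Address"].split("?")[0])

# URLs visitors actually hit that the crawler never found by following links.
for url in sorted(logged_urls - crawled_urls):
    print(url)
```

Anything it prints was requested by real visitors (or bots) but never reached by following internal links, which makes it a good orphan candidate to review.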
Hope this helps! Let us know what works.
-
Well, because they are 'orphans', you probably can't find them using a spider tool! I'd recommend the following process to find your orphan pages:
1. Get a list of all the pages created by your CMS.
2. Get the list of all the pages found by Screaming Frog.
3. Add the two URL lists to Excel and find the URLs from your CMS that are not in the Screaming Frog list.
You could probably use an Excel trick like this one:
http://superuser.com/questions/289650/how-to-compare-two-columns-and-find-differences-in-excel
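If you'd rather script step 3 than do the comparison in Excel, here is a minimal sketch in Python. It assumes two plain-text files with one URL per line, cms_urls.txt (the CMS export) and crawl_urls.txt (the Screaming Frog export); the file names and the light URL normalisation are assumptions, so adjust them to your data.

```python
# Minimal sketch: find URLs the CMS knows about that the crawler never reached.
# Assumed inputs: "cms_urls.txt" and "crawl_urls.txt", one URL per line each.

def normalise(url):
    """Lower-case the scheme and host and drop trailing slashes so trivial
    differences between the two exports don't show up as false positives."""
    url = url.strip().rstrip("/")
    if "//" in url:
        scheme, _, rest = url.partition("//")
        host, slash, path = rest.partition("/")
        url = scheme.lower() + "//" + host.lower() + slash + path
    return url

def load_urls(path):
    with open(path, encoding="utf-8") as f:
        return {normalise(line) for line in f if line.strip()}

cms_urls = load_urls("cms_urls.txt")
crawled_urls = load_urls("crawl_urls.txt")

# Pages in the CMS that no crawled internal link points to: likely orphans.
for url in sorted(cms_urls - crawled_urls):
    print(url)
```

This gives you the same list the Excel comparison would, just without worrying about formula ranges.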
Related Questions
-
One-page templates
Hi, I'm planning to implement a one-page template (http://shapebootstrap.net/wp-content/plugins/shapebootstrap/demo.php?id=380). Does anyone have experience with whether this type of page causes SEO problems? Best regards
Technical SEO | komir2004
-
Redirecting Several Hundred Pages
As of May 21st, 2013 (the Penguin 2.0 update) we hit a triple-header, and I think we can now officially be dubbed the "KING OF GOOGLE PENALTIES"! 😞
- July 2012 - received 2 "Unnatural Links" emails
- April 2012 - 20% traffic hit
- May 21st, 2013 - 35% traffic hit
We have/had lots of very low-quality links using the same anchor text, as well as about 150 very low-quality articles and almost 100 categories with several hundred products that received little to no traffic. We have spent the last several weeks cleaning up our link profile and were highly successful in getting most of the links removed, and we have kept detailed reports for our Reconsideration Request for the manual "Unnatural Links" penalty. We have also gone a step further and completely redesigned the site, which is now much faster and has better on-page SEO with new, high-quality articles, and we are removing all the low-quality articles, categories and products, but we are unclear what to do with them. Which brings me to my question: should we redirect these pages back to the home page or just let them go to a 404 error? I have been doing lots of reading on this subject, but there don't seem to be any good answers. From what I read, neither seems like a good choice and I cannot decide between the lesser of the two evils, so any help with this would be greatly appreciated! Notes:
- These category and product pages have absolutely no inbound links (link benefit) and, in my opinion, are only sucking off link juice while generating little to no revenue. There are also no similar categories or products that they could be redirected to. For example, redirecting dog toys to the dog bed category just sounds like it would increase our bounce rate.
- Again, the articles also have no link benefit, and only a small handful of them actually generate any traffic to speak of (several thousand visitors per year); the rest generate fewer than 1,000 visitors per year. All have high bounce rates and low conversions. It would be nice to keep them live, as I think some are okay and could be rewritten/repurposed over time, but maybe in light of our Panda penalty it would be better to save them offline, let them go to 404 errors and rewrite/repurpose them another time?
- We did create a very nice 404 page with category navigation and a huge search bar, so I am leaning more toward this option.
Thanks in advance!
Technical SEO | k9byron
-
How do I fix an issue with near-duplicate city/local pages on my website?
I am working on an e-commerce website where we have added 300+ pages to target different local cities in the USA. We have added quite different paragraphs to 100+ pages to remove the internal duplicate-content issue and save the website from a Panda penalty (you can visit the following pages to see what I mean), and we have added unique paragraphs to a few more pages. But I have big concerns about the other elements on each page, like the Banner Gallery, Front Banner, Tool and a few other attributes that are common to every page apart from the 4-5 sentence paragraph. I compiled an XML sitemap with all the local pages and submitted it to Google Webmaster Tools on 1st June 2013, but I can see only 1 page indexed by Google in Webmaster Tools.
http://www.bannerbuzz.com/local
http://www.bannerbuzz.com/local/US/Alabama/Vinyl-Banners
http://www.bannerbuzz.com/local/MO/Kansas-City/Vinyl-Banners
and so on... Can anyone suggest the best solution for this?
Technical SEO | CommercePundit
-
Why is the Page Authority of my product pages so low?
My domain authority is 35 (homepage Page Authority = 45) and my website has been up for years: www.rainchainsdirect.com. Most random pages on my site (like this one) have a Page Authority of around 20. However, as a whole, the individual pages for my products rank exceptionally low. Like these: http://www.rainchainsdirect.com/products/copper-channel-link-rain-chain (Page Authority = 1) and http://www.rainchainsdirect.com/collections/todays-deals/products/contempo-chain (Page Authority = 1). I was thinking that, whatever the reason they have such low authority, it may explain why these pages rank lower in Google for specific searches using my exact product name (in other words, other sites that are piggybacking off my unique products are ranking higher for my product in a specific name search than the original product itself on my site). In any event, I'm trying to get some perspective on why these pages remain stuck with the same non-existent Page Authority. Can anyone help shed some light on why, and on what can be done about it? Thanks!
Technical SEO | csblev
-
Find where the "not selected" pages are coming from
Hi all, can anyone suggest how I can find where Google is getting the approx. 1,000 pages it is choosing not to select? In round numbers, I have 110 pages on the site; a site: search shows all the pages, and Index Status shows 110 selected and 1,000 not selected. For the life of me I cannot figure out where these pages are coming from. I have set my preferred domain to www. and set up 301s to www. as per below:
RewriteCond %{HTTP_HOST} ^growingyourownveg.com$
RewriteRule ^(.*)$ "http://www.growingyourownveg.com/$1" [R=301,L]
The site is www.growingyourownveg.com. Any suggestions much appreciated. Simon
Technical SEO | spes123
-
Wrong Page Ranking
A higher-level page with more power is getting pushed out of the SERPs by a weaker page for an important keyword. I don't care about losing the weaker page. Should I: (1) 404 the weaker page and wait for Google to (hopefully) replace it with the stronger page, or (2) 301 the weaker page to the stronger page? NOTE: Due to poor communication between the content team and myself, the weak and strong pages have similar title tags (i.e., "lawsuits" and "litigation").
Technical SEO | LCNetwork
-
3 URLs Being Created for the Same Page
I use WordPress for my blog and for some reason it is creating three URLs for my pages. I am not sure whether it has always been like this; I just noticed it in the errors section of SEOmoz. http://www.kisswedding.com/blog/?gid=7&r=20 http://www.kisswedding.com/blog/ashley-and-daniels-rainy-day-diy-farm-wedding/?gid=7&r=20 http://www.kisswedding.com/blog/ashley-and-daniels-rainy-day-diy-farm-wedding/ It's all the exact same page. Is there something I can do in my settings to make this stop? I don't imagine this is good. Ya think... ha! I saw this in the SEOmoz error area for Missing Title Tags. Apparently the number has gone from 200 to 400, which is weird because I never gave my blog posts meta information and I haven't written 200 pages since SEOmoz's last crawl. Maybe I changed something in my blog settings without even knowing; I can't think for the life of me what that would be, though. Thanks so much, and I appreciate any help received. Edited to add: I added some plugins over the past week. Maybe it's one of these? Category Text, Category SEO Meta Tags (just deactivated this one), PhotoSmash (also deactivated this one), Clicky for WordPress
Technical SEO | annasusmiles
-
301 for a deleted page?
In your opinion, what is the best "301 practice" for notifying Google that a web page no longer exists? For example:
...
---CATEGORY PAGE
-------SUBCATEGORY PAGE
------------ PRODUCT PAGE 1
------------ PRODUCT PAGE 2
------------ PRODUCT PAGE 3
...
If you delete "PRODUCT PAGE 2", does it make sense to create a 301 redirect in the .htaccess towards the "SUBCATEGORY PAGE"? Do you have other tested methods for dealing with this issue? Thank you in advance for sharing your opinions and ideas. YESdesign
Technical SEO | YESdesign