How to find orphan pages
-
Hi all,
I've been checking these forums for an answer on how to find orphaned pages on my site, and I can see a lot of people saying that I should cross-check my XML sitemap against a Screaming Frog crawl of my site.
However, the sitemap is created using Screaming Frog in the first place... (I'm sure this is the case for a lot of people too).
Are there any other ways to get a full list of orphaned pages? I assume it would be a developer request, but where should I ask them to look, and what should they extract?
Thanks!
-
Yes, as I mentioned, in my case I use Semrush, which has a dedicated section for exactly this. The easiest way to get your log files is to log into your cPanel and find the option called Raw Log Files. If you still can't find it, you may need to contact your hosting provider and ask them to provide the log files for your site.
Raw Access Logs let you see the visits to your website without graphs, charts, or other graphics. You can use the Raw Access Logs menu to download a zipped version of the server's access log for your site. This can be very useful when you want a quick look at who has visited your site.
Raw logs may only contain a few hours' worth of data because they are discarded after the system processes them. However, if archiving is enabled, the system archives the raw log data before discarding it. So go ahead and make sure archiving is turned on!
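If you want a quick look at what is in the log before importing it into a tool, a short script can pull out the URLs that Googlebot actually requested. This is a minimal sketch, assuming the usual Apache-style combined log format that cPanel produces; the log filename is a placeholder, and archived logs are typically gzipped:

```python
import gzip
import re
from urllib.parse import urlsplit

# Combined log format: IP - - [date] "METHOD /path HTTP/1.1" status size "referer" "user-agent"
LINE_RE = re.compile(r'"(?:GET|HEAD) (?P<path>\S+) HTTP/[^"]+" (?P<status>\d{3}) \S+ "[^"]*" "(?P<agent>[^"]*)"')

def googlebot_urls(log_path):
    """Return the set of paths Googlebot successfully requested (status 200)."""
    opener = gzip.open if log_path.endswith(".gz") else open
    urls = set()
    with opener(log_path, "rt", errors="replace") as handle:
        for line in handle:
            match = LINE_RE.search(line)
            if not match:
                continue
            if "Googlebot" not in match.group("agent"):
                continue
            if match.group("status") != "200":
                continue
            # Strip query strings so /page?utm=x and /page count as one URL
            urls.add(urlsplit(match.group("path")).path)
    return urls

if __name__ == "__main__":
    # Placeholder filename -- point this at your downloaded raw access log.
    for url in sorted(googlebot_urls("example.com-access_log.gz")):
        print(url)
```

Save the output as a .csv (one URL per row) and it becomes one of the two data sets described below.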
Once you have your log file ready to go, you now need to gather the other data set of pages that can be crawled by Google, using Screaming Frog.
Crawl Your Pages with Screaming Frog SEO Spider
Using the Screaming Frog SEO Spider, you can crawl your website as Googlebot would, and export a list of all the URLs that were found.
Once you have Screaming Frog ready, first ensure that your crawl Mode is set to the default ‘Spider’.
Then make sure that under Configuration > Spider, ‘Check External Links’ is unchecked, to avoid unnecessary external site crawling.
Now you can type in your website URL, and click Start.
Once the crawl is complete, simply
a. Navigate to the Internal tab.
b. Filter by HTML.
c. Click Export.
d. Save in .csv format.
Now you should have two sets of URL data, both in .csv format: one from your server logs (or analytics) and one from the Screaming Frog crawl.
All you need to do now is compare the two .csv files and find the URLs that appear in your logs but were never reached by the crawl; those are your orphan-page candidates. If you decided to analyze a log file instead, you can use the Screaming Frog SEO Log File Analyser to uncover your orphan pages. (Keep in mind that the Log File Analyser is not the same tool as the SEO Spider.)
The tool is very easy to use (download here); from the dashboard you can import the two data sets that you need to analyze.
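If you'd rather do the comparison yourself instead of (or as a check on) the Log File Analyser, a few lines of Python will do it. A rough sketch: the filenames and the second file's column name are placeholders, and "Address" is how recent Screaming Frog exports label the URL column, so adjust if yours differs. URLs are reduced to paths so full URLs and bare paths compare cleanly:

```python
import csv
from urllib.parse import urlsplit

def normalize(url):
    """Reduce a URL to its path so full URLs and bare paths compare cleanly."""
    path = urlsplit(url.strip()).path or "/"
    return path.rstrip("/") or "/"

def urls_from_csv(path, column):
    """Read one column of a CSV export into a set of normalized paths."""
    with open(path, newline="", encoding="utf-8-sig") as handle:
        return {normalize(row[column]) for row in csv.DictReader(handle) if row.get(column)}

# Placeholder filenames -- use whatever you exported.
crawled = urls_from_csv("screaming_frog_internal_html.csv", "Address")  # found by following links
known = urls_from_csv("urls_from_logs_or_analytics.csv", "URL")         # seen in logs/analytics

orphans = known - crawled  # pages that get requests but are unreachable from your internal links
print(f"{len(orphans)} orphan candidates")
for url in sorted(orphans):
    print(url)
```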
If the answer was useful, don't forget to mark it as a good answer. Good luck!
-
Hi Roman,
Out of interest, is there an option to export an orphan page report like there is in Screaming Frog (Reports > Orphan Pages)?
I guess the most realistic option is to get the list from the dev team, since relying on the sitemap isn't workable here, and those pages should still get indexed anyway. The new Google Search Console also lets you test individual pages, and as long as they're in the sitemap they should (hopefully) be indexed.
Still, getting a list of ALL pages on a site without dev support seems to be the challenge I'm trying to solve.
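For anyone else trying to piece that list together without dev help, one more data source is the Search Console API: you can pull every URL that has had at least one impression in a date range. It only covers pages Google has actually shown, so it's a supplement rather than a complete inventory. A rough sketch, assuming a service account that has been added to the Search Console property (the key filename, dates, and property URL are placeholders):

```python
from google.oauth2 import service_account
from googleapiclient.discovery import build

# Placeholder credentials file and property URL -- substitute your own.
creds = service_account.Credentials.from_service_account_file(
    "service-account.json",
    scopes=["https://www.googleapis.com/auth/webmasters.readonly"],
)
service = build("searchconsole", "v1", credentials=creds)

body = {
    "startDate": "2024-01-01",   # widen the window to catch rarely shown pages
    "endDate": "2024-03-31",
    "dimensions": ["page"],
    "rowLimit": 25000,
}
response = service.searchanalytics().query(
    siteUrl="https://www.example.com/", body=body
).execute()

pages = sorted(row["keys"][0] for row in response.get("rows", []))
print(f"{len(pages)} URLs with impressions")
```

If the property has more than 25,000 pages with impressions, repeat the query with an increasing startRow to page through the rest; combining this list with the log and crawl data above gives a fuller picture.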
-
Even Screaming Frog has problems finding all the orphan pages. I use Screaming Frog, Moz, Semrush, Ahrefs, and Raven Tools in my day-to-day work, and honestly Semrush is the one that gives me the best results for that specific task. From experience: a few months ago I took on a website that was a complete disaster, with no sitemap, no canonical tags, no meta tags, and so on.
I ran Screaming Frog and it showed me just 200 pages, but I knew there were far more; in the end I found 5k pages with Semrush. The Screaming Frog crawler probably had problems with that particular site, so I'm just sharing that as an experience.
-
Related Questions
-
Is it good to redirect millions of pages to a single page?
My site has approximately 10 lakh (1 million) genuine URLs, but due to some unidentified bugs the site has created roughly 10 million irrelevant URLs. Since we don't know the origin of these non-relevant links, we want to redirect or remove all of these URLs. Please suggest: is it better to redirect such a high number of URLs to the home page, or to return a 404 for these pages? Any other suggestions to solve this issue are welcome.
Technical SEO | vivekrathore0
-
Why are my page titles not being honored?
If you view the source of http://www.sqlsentry.com/products/performance-advisor/sql-server-performance you will see my actual title tag. The Google results are showing a completely different title that we do not want used. This was working properly, and I don't understand how it got changed. Thanks!
Technical SEO | Sika220
-
URL Structure for "Find A Professional" Page
I've read all the URL structure posts out there, but I'm really undecided and would love a second opinion. Currently, this is how the developer has our professionals directory working: you search by entering your ZIP code and selecting a category (such as Pool Companies), and we return all professionals within an X-mile radius of that ZIP. This is how the URLs are structured:
1. Main page: /our-professionals
2. After a search for "Deck Builders" in ZIP 19033: /our-professionals?zipcode=19033&HidSuppliers=&HiddenSpaces=&HidServices=&HidServices_all=[16]%2C&HidMetroareas=&srchbox=
3. When I click one of the businesses: viewprofile.php?id=409
I know how to go about doing this, but I'm undecided on the best structure for the URLs. Maybe for results pages: find-professionals/deck-builders/philadelphia-pa-19033. And for individual pros' profiles: /deck-builders/philadelphia-pa-19033/Billys-Deck-Service. Any input on how best to structure this so that we have a good chance of showing in SERPs for "Deck Builders near New Jersey" and the like would be much appreciated.
Technical SEO | zDucketz0
-
Uservoice and Duplicate Page Content
Hello all, I'm having an issue where my UserVoice account is creating duplicate page content (image attached). Any ideas on how to resolve the problem? A couple of solutions we're looking into: moving the UserVoice content inside the app so it won't get crawled, but that's all we have for now. Thank you very much for your time; any insight would be helpful. Sincerely, Jon Birdsong, SalesLoft
Technical SEO | JonnyBird1
-
NoIndex user generated pages?
Hi, I have a site, downorisitjustme (dot) com. It has over 30,000 pages in Google that have been generated by people searching to check whether a specific site is working or not, and then possibly posting a link to the deep link of the results page on a message board or similar, which is why the pages have been picked up. Am I best to noindex the res.php page, where all the auto-generated content shows up, and have the main static pages as the only ones available to be indexed?
Technical SEO | Wardy0
-
Where to put content on the page? - technical
The new algorithm update says any images at the top of the page negatively affect user experience if they are adverts. How does Google know whether it's an advert or a relevant banner? When trying to put text as far up as possible on the page, is it OK to make it appear higher in the code but further down using CSS? Or does Google not read the code from top to bottom when working this out, but rather go by how the page renders? Any advice much appreciated.
Technical SEO | pauledwards0
-
How do I eliminate duplicate page titles?
Almost... I repeat, almost all of my duplicate page titles show up as such because the page is being seen twice in the crawl. How do I prevent this?
| www.ensoplastics.com/ContactUs/ContactUs.html | Contact ENSO Plastics |
| ensoplastics.com/ContactUs/ContactUs.html | Contact ENSO Plastics |
This is what's in the CSV; there are many more just like this. How do I cut out all of these duplicate URLs?
Technical SEO | ENSO
-
How do I know which page a link is from?
I've got an interesting situation. I hope you can help. I have a list of links but I'm not sure which pages of my site they are from. How do I know which page a specific link is from? Thanks in advance.
Technical SEO | VinceWicks0