I need help compiling solid documentation and data (if possible) that having tons of orphaned pages is bad for SEO - Can you help?
-
I spent an hour this afternoon trying to convince my CEO that having thousands of orphaned pages is bad for SEO. His argument was "If they aren't indexed, then I don't see how it can be a problem."
Despite my best efforts to convince him that thousands of them ARE indexed, he simply said "Unless you can prove it's bad and prove what benefit the site would get out of cleaning them up, I don't see it as a priority."
So, I am turning to all you brilliant folks here in Q & A and asking for help...and some words of encouragement would be nice today too
Dana
-
Agreed on all counts Jason, not to mention the improved customer experience because we won't have people landing on those God-awful ugly and useless pages!
From a server perspective, could deleting 8,000 files (pages, images, PDFs) result in our site speed improving too? Or would it likely have no impact?
-
So you have roughly 8,500 pages (from Screaming Frog) that are part of your customer experience, that you want customers to be able to navigate to from your site, and that presumably you would like customers to find on Google.
But only 7,500 pages are in Google's index. So best case, roughly 1,000 of your good pages (almost 12% of all the pages on your site) don't exist in organic search. Worst case, some of those 7,500 pages in Google are deprecated pages that aren't part of your active site, making the percentage of live pages in Google even worse.
It's very possible that a portion of your Google crawl budget is being consumed by pages that don't help you. If you get those pages out of the index, you stand a better chance of getting your 1,000 good pages into the index.
-
Hi Jason,
Ok, here is what I saw in Screaming Frog:
27,616 total spidered URLs, of which:
- 8,494 are HTML pages
- 45 are CSS files
- 14,687 are images
- 4,287 are PDFs
Google says we have only 7,540 URLs indexed (of all types) - I know for a fact that at least 500 orphaned pages are indexed in Google. It seems to me, then, that Google is indexing content that isn't important to us, and perhaps not indexing other content that is important to us because it's having trouble telling what's important and what's not.
Any insights on that Jason? What do you make of it?
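For anyone running the same sanity check, the gap in those numbers can be laid out as a rough lower bound. This is a back-of-the-envelope sketch using the figures quoted above, under the (generous) assumption that every indexed URL is one of the crawlable HTML pages:

```python
# Back-of-the-envelope math on the crawl vs. index gap, using the
# figures reported above (swap in your own Screaming Frog / index counts).
crawled_html = 8494          # HTML pages Screaming Frog found by crawling
indexed_total = 7540         # URLs Google reports indexed (all file types)
orphans_known_indexed = 500  # orphaned pages confirmed in Google's index

# Even if every single indexed URL were a crawlable HTML page, the 500
# known orphans displace that many good pages, so at a minimum:
good_pages_missing = crawled_html - indexed_total + orphans_known_indexed
print(good_pages_missing)  # 1454 crawlable pages absent from the index
```

So even in the best case, well over a thousand navigable pages are missing from the index while orphans occupy slots in it.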
-
Hi Jason,
I'm just following up as I get my ducks in a row on this one. Above in your comment you said "Google Count of Pages - Screaming Frog count of Pages = # of Orphaned Pages" - to be perfectly accurate, this would only give me the number of orphaned pages that are indexed. There could be many additional orphaned pages that are not in Google's index.
My follow up question is, should I be concerned about those too? Or are orphaned pages that aren't indexed not worth cleaning up? I think I already know the answer (Yes! Clean those up too because they can interfere with crawl rate and site speed...)....but I want to know your take on it please. Thanks so much!
Dana
-
Tempting! Very tempting. :-)
-
I would not do this if I was an employee... but.... I would ask him to bet me an amount that would be equivalent to about "one month's pay" on the results.
He is a chicken so he wouldn't accept that bet. And if he did accept I would want it in writing.
-
Thanks EGOL. You made me chuckle, because all of these things crossed my mind. I did go home mad yesterday, and I don't get mad very easily or very often. I usually welcome the idea of explaining SEO strategies and tactics to newbies and laypeople (as is evidenced by my many posts here in Q & A).
Let's just say - my feelers are out looking at other possibilities.
-
In my opinion, the links are still evaporating PageRank.
If some of these pages are still in the index, they could be counting as thin/duplicate content.
-
What would your response be to that?
*thinks for a while*
I would be mad about this. This is why I prefer to be self-employed.
I don't know the temperament or personality of this person.
I might not be working there much longer.
It seems to me that the effort required to cut links into these pages is tiny and the potential for gain is pretty high.
Downside risk is zero. Upside opportunity is good. He is a chicken and a fool.
-
EGOL, I thought I would just follow up on these thin content "Reviews/Ratings" pages. Google is blocked from crawling them via the robots.txt file. Is this enough? Or are they still diluting the product pages' authority just by being there?
Thanks!
Dana
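(For what it's worth, the distinction here matters: a robots.txt Disallow only stops Google from crawling those pages; URLs that are already indexed can linger as URL-only listings. A minimal sketch of the blocking rule, with a hypothetical `/reviews/` path standing in for the real one:)

```
# robots.txt — blocks crawling only; already-indexed URLs can remain
# in the index as URL-only results. The /reviews/ path is hypothetical.
User-agent: *
Disallow: /reviews/
```

To actually get pages dropped from the index, they generally need to be crawlable and serve a noindex directive instead, e.g. `<meta name="robots" content="noindex, follow">` in the page head.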
-
Thanks EGOL,
And yes, they are.
The comment I received when trying to explain that those links were draining authority off the product pages was "No they aren't. Whatever PageRank the product page has, it has, regardless of whether the links are there or not."
What would your response be to that? I tried to explain it several different ways, but he just looked at me like I was full of malarkey...He is a visual person. Perhaps I should try a diagram?
It's difficult going into a situation like this when the opening premise in the other person's mind is that he knows more about SEO than I do, because in his mind all SEO is is a bunch of guesswork.
Sorry, morale's a bit low in my heart at the moment. I work too hard and study too hard at what I do to have someone who maybe reads a blog about SEO occasionally come in and treat me like I have no idea what I'm talking about.
Thanks very much for responding. I appreciate it mucho!
Dana
-
Thanks Jason,
These are great suggestions and are exactly the kinds of things that will give me the proof I need to convince him that removing these is a worthwhile endeavor. I'm off to do them now and will come back here and post my discoveries.
Dana
-
Are these those thin content, duplicate content, review and email pages?
There are links into those pages that are evaporating PageRank.
Two links on each of your product pages are being wasted.
If they are getting indexed, then they are dead weight on your site and make your site look like a skimpy, spammy publisher.
-
By "orphaned" do you mean pages that are no longer linked to from your site navigation taxonomy?
If you know the subject matter and/or URLs, you can easily show your boss that they are indexed: Google "site:oursite.com orphaned topic" and show him all the pages in the Google index.
If you can't find the pages, then do a complete crawl of your site with Screaming Frog and see how many pages it finds. Now compare that number with how many of your pages Google has in its index in Google Webmaster Tools (under Health -> Index Status). Google Count of Pages - Screaming Frog Count of Pages = # of Orphaned Pages.
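One caveat on that subtraction: it only nets out the two counts, so indexed orphans and un-indexed good pages can offset each other (and the result can even go negative). Diffing the actual URL lists catches both problems at once. A minimal sketch, where the URL sets are hypothetical stand-ins for a Screaming Frog crawl export and a list of indexed URLs gathered from site: searches:

```python
# Comparing the actual URL lists catches both problems at once,
# which raw count subtraction can mask. All URLs here are made up.
crawled = {                        # everything reachable by crawling
    "http://oursite.com/",
    "http://oursite.com/products",
    "http://oursite.com/about",
}
indexed = {                        # gathered from site: searches, logs, etc.
    "http://oursite.com/",
    "http://oursite.com/products",
    "http://oursite.com/old-promo",  # an orphan still sitting in the index
}

orphans_in_index = indexed - crawled    # indexed but unreachable on-site
missing_from_index = crawled - indexed  # live pages Google hasn't indexed
print(sorted(orphans_in_index))    # ['http://oursite.com/old-promo']
print(sorted(missing_from_index))  # ['http://oursite.com/about']
```

The first set is your cleanup list; the second is the opportunity cost you can show the boss.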
Now, to see if those pages are hurting you, run them through Open Site Explorer to see if any of them have backlinks. If so, they are diluting your SEO efforts. Even if not, look at your crawl stats in Google Webmaster Tools under Health and see how many pages are getting crawled per day. If it's a fraction of your total pages, then getting rid of the orphaned pages could get your important pages crawled more regularly.
I hope that helps.
Jason "Retailgeek" Goldberg