Adding Orphaned Pages to the Google Index
-
Hey folks,
How do you think Google will treat adding 300k orphaned pages to a 4.5 million page site? The URLs would resolve, but there would be no on-site navigation to those pages; Google would only know about them through sitemap.xml files.
These pages are super low-competition.
The plot thickens: what we are really after is getting 150k real pages back on the site. Those pages do have crawlable paths on the site, but in order to do that (for technical reasons) we need to push these other 300k orphaned pages live (it's an all-or-nothing deal).
a) Do you think Google will have a problem with this, or will it just decide not to index some or most of these pages since they are orphaned?
b) If these pages will just fall out of the index or not get included, and have no chance of ever accumulating PageRank anyway since they are not linked to, would it make sense to just noindex them?
c) Should we not submit sitemap.xml files at all, take our 150k, and just ignore these 300k, hoping Google ignores them as well since they are orphaned?
d) If Google is OK with this, maybe we should submit the sitemap.xml files and keep an eye on the pages; maybe they will rank and bring us a bit of traffic. But we don't want to do that if it could be an issue with Google.
Thanks for your opinions, and if you have any hard evidence either way, special thanks for that info.
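For reference, by "noindex" in (b) I mean the standard robots directive; this is just a generic sketch, not our actual templates:

```html
<!-- In the <head> of each orphaned page: allow crawling but keep it out of the index -->
<meta name="robots" content="noindex">

<!-- Equivalent HTTP response header, if editing templates isn't possible: -->
<!-- X-Robots-Tag: noindex -->
```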
-
It's not a strategy; it's due to technical limitations on the dev side. I agree, though. Thanks.
So, I asked this question of a very advanced SEO guru, and he said the pages could be seen as doorways and present some risk, so he advised against it. That, combined with the probability that they will most likely get dropped from Google's index anyway, and the fact that Google says it wants pages to be part of the site's architecture, has me leaning towards noindexing all of them and maybe experimenting with allowing 1,000 to get indexed to see what happens with them.
Thanks for your input, folks.
-
I'd go back to the drawing board and rework your strategy.
Do you need these additional pages? 150k orphaned pages that you want indexed sounds like spam or poor site architecture to me.
-
Yikes, I didn't know the site was that big. Still, if you're afraid of how Google would "react" to those orphaned pages, I'd still test small, regardless of how large your overall site is.
-
Yeah, 1,000 is probably a big enough sample.
10,000 seems like a lot, I guess, but not when you've got a site with 4.5 million pages.
-
Yeah, submitting sitemap.xml files for 300k pages that are not part of the site seems a bit obnoxious.
-
We definitely want the 150k in the index, since they are legitimate pages and linked to on the site. It's the 300k orphaned ones we have to take along as a package deal that I am worried about. Too many orphaned pages for Google.
-
That's a good idea. 10,000 is still a lot. You could even test fewer than 10,000 pages. Why not try 1,000?
-
Hmmm. I am leaning towards the following solution, since I would rather be on the cautious side. Maybe this makes sense?
a) We noindex these 300k orphaned pages and do not submit sitemap.xml files for them.
b) We experiment with, say, 10,000 pages: we allow only those to get indexed and submit sitemap.xml files for them.
c) We closely monitor their indexing and ranking performance so we can determine if these pages are even worth opening up to Google and taking any risk on.
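For step (b), generating the experiment's sitemap file is the easy part. A minimal sketch (the URL list, domain, and function name here are all hypothetical, just to illustrate the format):

```python
# Sketch: build a minimal sitemap.xml document for a sample of experiment URLs.
# The function name, URL pattern, and example.com domain are hypothetical.
from xml.etree import ElementTree as ET

def sitemap_for(urls):
    """Return a minimal <urlset> sitemap document for the given URLs."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for u in urls:
        url_el = ET.SubElement(urlset, "url")
        ET.SubElement(url_el, "loc").text = u
    return ET.tostring(urlset, encoding="unicode")

# e.g. a small sample of the orphaned set
sample = [f"https://example.com/page/{i}" for i in range(3)]
print(sitemap_for(sample))
```

Keep in mind the sitemap protocol caps each file at 50,000 URLs, so the full 300k set would need to be split across multiple files referenced from a sitemap index.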
-
In my opinion, add the 150k pages to the sitemap along with the 300k pages, let Google index all of them, and once they are all indexed you can take a call on de-indexing the 300k orphaned pages based on their traction.
-
I have no hard evidence, but if it were my site, I would do option c) but keep an eye on what happens, and if I noticed anything strange I would implement option b). But if option c) makes you nervous, I see no reason you couldn't or shouldn't noindex them right off the bat.
That's merely one person's opinion, however.