Adding Orphaned Pages to the Google Index
-
Hey folks,
How do you think Google will treat adding 300K orphaned pages to a 4.5-million-page site? The URLs would resolve, but there would be no on-site navigation to those pages; Google would only know about them through sitemap.xml files.
These pages are super low competition.
The plot thickens: what we are really after is to get 150k real pages back on the site. These pages do have crawlable paths on the site, but in order to do that (for technical reasons) we need to push these other 300k orphaned pages live as well (it's an all-or-nothing deal).
a) Do you think Google will have a problem with this, or will it just decide not to index some or most of these pages since they are orphaned?
b) If these pages will just fall out of the index or never get included, and have no chance of ever accumulating PageRank anyway since they are not linked to, would it make sense to just noindex them?
c) Should we not submit sitemap.xml files at all, take our 150k, and just ignore these 300k and hope Google ignores them as well since they are orphaned?
d) If Google is OK with this, maybe we should submit the sitemap.xml files and keep an eye on the pages; maybe they will rank and bring us a bit of traffic. But we don't want to do that if it could be an issue with Google.
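For clarity, when I say "noindex" in option (b), this is what I have in mind; a minimal sketch (whether we'd use the meta tag or the HTTP header depends on our stack, so treat both as illustrative):

```html
<!-- In the <head> of each orphaned page: Google can still crawl the URL
     (e.g. via a sitemap) but will keep it out of the index. -->
<meta name="robots" content="noindex">

<!-- Equivalent HTTP response header, useful when editing page templates
     is hard for technical reasons:
     X-Robots-Tag: noindex -->
```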
Thanks for your opinions, and if you have any hard evidence either way, thanks especially for that info.
-
It's not a strategy; it's due to technical limitations on the dev side. I agree, though, thanks.
So, I put this question to a very advanced SEO guru, and he said the pages could be seen as doorways, present some risk, and advised against it. That, combined with the probability that they will most likely get dropped from Google's index anyway, and the fact that Google says it wants pages to be part of the site's architecture, has me leaning towards noindexing all of them and maybe experimenting with allowing 1,000 to get indexed to see what happens with them.
Thanks for your input folks
-
I'd go back to the drawing board and rework your strategy.
Do you need additional sites? 150K orphaned pages that you want indexed sounds spammy, or like poor site architecture, to me.
-
Yikes, I didn't know the site was that big. Still, if you're afraid of how Google would "react" to those orphaned pages, I'd still test small, regardless of how large your overall site is.
-
Yeah, 1,000 is probably a big enough sample.
10,000 seems like a lot, I guess, but not when you've got a site with 4.5 million pages.
-
Yeah, submitting sitemap.xml files for 300k pages that are not part of the site seems a bit obnoxious.
-
We definitely want the 150k in the index, since they are legitimate pages and linked to on the site. It's the 300k orphaned ones we have to take along as a package deal that I am worried about; that's too many orphaned pages for Google.
-
That's a good idea. 10,000 is still a lot, though. You could even test fewer than 10,000 pages; why not try 1,000?
-
Hmmm. I am leaning towards the following solution, since I would rather be on the cautious side. Maybe this makes sense:
a) We noindex these 300k orphaned pages and do not submit sitemap.xml files for them.
b) We experiment with, say, 10,000 pages: we allow only those to get indexed and submit sitemap.xml files for them.
c) We closely monitor their indexing and ranking performance so we can determine whether these pages are even worth opening up to Google and taking any risk on.
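Generating the sitemap for the 10,000-page experiment in (b) could be sketched like this (Python; the URL pattern is hypothetical, substitute the real orphaned-page URLs). Since the sitemap protocol allows up to 50,000 URLs per file, the whole sample fits in a single sitemap:

```python
# Sketch: build one sitemap.xml for a 10,000-page test sample.
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(urls):
    """Return a sitemap XML string listing the given URLs."""
    ET.register_namespace("", SITEMAP_NS)  # serialize without a prefix
    urlset = ET.Element(f"{{{SITEMAP_NS}}}urlset")
    for u in urls:
        url_el = ET.SubElement(urlset, f"{{{SITEMAP_NS}}}url")
        ET.SubElement(url_el, f"{{{SITEMAP_NS}}}loc").text = u
    return ('<?xml version="1.0" encoding="UTF-8"?>\n'
            + ET.tostring(urlset, encoding="unicode"))

# Hypothetical URL pattern -- replace with the actual page URLs.
sample = [f"https://www.example.com/orphan-page-{i}" for i in range(1, 10001)]
sitemap_xml = build_sitemap(sample)
```

The resulting file would then be the only sitemap submitted in Search Console, so the other ~290k orphans stay unannounced.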
-
In my opinion, add the 150k pages to the sitemap along with the 300k pages, let Google index all of them, and once they are all indexed you can take a call on de-indexing the 300k orphaned pages based on their traction.
-
I have no hard evidence, but if it were my site, I would do option C but keep an eye on what happens, and if I noticed anything strange happening, I would implement option B. But if option C makes you nervous, I see no reason you couldn't or shouldn't noindex them right off the bat.
That's merely one person's opinion, however.