Indexing a new website with several million pages
-
Hello everyone,
I am currently working on a huge classifieds website that will launch in France in September 2013.
The website will have up to 10 million pages. I know that indexing a site of this size should be done step by step rather than all at once, to reduce the risk of a long sandbox period and to keep more control over the process.
Do you have any recommendations or good practices for such a task? Maybe some personal experience you could share?
The website will cover about 300 job types:
- In all regions (= 300 × 22 pages)
- In all departments (= 300 × 101 pages)
- In all cities (= 300 × 37,000 pages)
Do you think it would be wiser to index a few jobs at a time (for instance, 10 jobs every week), or to index by page level (for example, first the job-by-region pages, then the job-by-department pages, and so on)?
More generally, how would you proceed in order to avoid Google penalties and get the whole site indexed as fast as possible?
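One practical way to stage a rollout like this (just a sketch, not something prescribed in the thread) is to split the URLs into one sitemap file per phase and register them in a sitemap index, adding each phase's file only when that batch of pages is ready; the filenames and domain below are hypothetical:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Sitemap index: one <sitemap> entry per rollout phase. -->
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Phase 1: job-by-region pages (~300 × 22 URLs) -->
  <sitemap>
    <loc>https://www.example.com/sitemaps/jobs-regions.xml</loc>
  </sitemap>
  <!-- Phase 2: job-by-department pages, added a few weeks later -->
  <sitemap>
    <loc>https://www.example.com/sitemaps/jobs-departments.xml</loc>
  </sitemap>
  <!-- Phase 3+: city-level pages, split across many files -->
</sitemapindex>
```

Note that the sitemaps.org protocol caps each sitemap file at 50,000 URLs, so the city-level pages alone would need to be split across hundreds of files.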
One more detail: we'll rely on a (big?) press campaign and on a link-building effort that has yet to be defined.
Thanks for your help !
Best Regards,
Raphael
-
Hello everyone,
Thanks for sharing your experience and your answers, it's greatly appreciated.
The website is built to avoid cookie-cutter pages: each page will have unique content drawn from the classifieds (unique because the classified pages themselves won't be indexed at first, to avoid having too many pages).
The internal linking is likewise designed so that each page has permanent internal links in a logical structure.
I understand from your answers that it is better to take time and index the site step by step, mostly according to the number and quality of classifieds (and thus the content) for each job/locality. It's not worth indexing pages without any classifieds (and thus without unique content), as Google will cut them from the index before long.
-
I really don't think Google likes it when you release a website that big all at once; it would much rather see you build it up slowly. I would urge you to index the main pages and noindex the subcategories.
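For reference, a subcategory page can be kept out of the index while still letting spiders follow its links with a standard robots meta tag (this is the generic snippet, not something specific to this site):

```html
<!-- In the <head> of each subcategory page: keep the page out of
     the index but let crawlers follow its links to deeper pages. -->
<meta name="robots" content="noindex, follow">
```

For non-HTML responses, the same directive can be sent as an HTTP header: `X-Robots-Tag: noindex, follow`. Unlike a robots.txt disallow, this lets Google crawl the page and see the directive, rather than blocking the crawl entirely.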
-
We worked in partnership with a similar large-scale site last year and found exactly the same thing. Google simply cut 60% of our pages from the index because they were cookie-cutter.
You have to ensure that pages have relevant, unique, worthwhile content. If all you're doing is swapping the odd word here and there for the locality and job name, it's not going to work.
Focus on running an ongoing SEO campaign for each target audience, e.g. by job type, locality, etc.
-
If you plan to get a website that big indexed you will need to have a few things in order...
First, you will need thousands of deep links pointing at hub pages deep within the site. These will force spiders down there and make them chew their way out through the unindexed pages. The links must be permanent: if you remove them, spiders will stop visiting and Google will forget your pages. For a 10-million-page site you will need thousands of links hitting thousands of hub pages.
Second, for a site this big.... are you going to have substantive amounts of unique content? If your pages are made from a cookie cutter and look like this....
"yada yada yada yada yada yada yada yada SEO job in Paris yada yada yada yada yada yada yada yada yada yada yada yada yada yada yada yada yada yada yada yada yada yada yada yada send application to Joseph Blowe, 11 Anystreet, Paris, France yada yada yada yada yada yada yada yadayada yada yada yada yada yada yada yada yada yada yada yada yada yada yada yada"
.... then Google will index these pages, and a few weeks to a few months later your entire site might receive a Panda penalty and drop out of Google.
Finally... all of those links needed to get the site in the index... they need to be Penguin proof.
It is not easy to get a big site in the index. Google is tired of big cookie cutter sites with no information or yada yada content. They are quickly toasted these days.