Indexing a new website with several million pages
-
Hello everyone,
I am currently working on a huge classifieds website that will launch in France in September 2013.
The website will have up to 10 million pages. I know that indexing a site of this size should be done step by step rather than all at once, both to limit the risk of a long stay in the sandbox and to keep more control over the process.
Do you guys have any recommendations or good practices for such a task? Maybe some personal experience you could share?
The website will cover about 300 job types:
- in all regions (= 300 × 22 pages)
- in all departments (= 300 × 101 pages)
- in all cities (= 300 × 37,000 pages)
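Quick arithmetic confirms the scale (a sketch only; it assumes one landing page per job/locality combination, which actually puts the total slightly above the 10 million figure mentioned):

```python
# Rough page-count tally for the site described above.
# Assumes one landing page per (job, locality) combination.
jobs = 300
regions = 22
departments = 101
cities = 37_000

region_pages = jobs * regions          # 6,600
department_pages = jobs * departments  # 30,300
city_pages = jobs * cities             # 11,100,000

total = region_pages + department_pages + city_pages
print(f"{total:,} pages")  # 11,136,900 pages
```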
Do you think it would be wiser to index a few jobs at a time (for instance, 10 jobs every week), or to index by page level (for example, first the job pages by region, then by department, etc.)?
More generally, how would you proceed to avoid penalties from Google while getting the whole site indexed as fast as possible?
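For what it's worth, the "few jobs per week" option can be sketched as a simple batching of the job list into weekly indexing waves, one sitemap per batch (hypothetical code; the batch size and job names are assumptions, not anything decided in the thread):

```python
# Hypothetical phased rollout: split the 300 job types into
# weekly batches, opening one batch of pages to indexing per week.
def weekly_batches(jobs, batch_size=10):
    """Split the job list into consecutive batches of batch_size."""
    return [jobs[i:i + batch_size] for i in range(0, len(jobs), batch_size)]

all_jobs = [f"job-{n}" for n in range(1, 301)]  # 300 assumed job types
batches = weekly_batches(all_jobs)

print(len(batches))    # 30 batches -> a 30-week rollout at 10 jobs/week
print(batches[0][:3])  # ['job-1', 'job-2', 'job-3']
```

The same batching works for the "by page level" option by feeding in region pages first, then department pages, and so on.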
One more detail: we'll rely on (substantial?) press coverage and on a link-building campaign that has yet to be defined.
Thanks for your help !
Best Regards,
Raphael
-
Hello everyone,
Thanks for sharing your experience and your answers, it's greatly appreciated.
The website is built to avoid cookie-cutter pages: each page will have unique content drawn from the classifieds (unique because the classifieds themselves won't be indexed at first, to avoid having too many pages).
The internal linking is likewise designed so that each page has permanent internal links in a logical structure.
I understand from your answers that it is better to take our time and index the site step by step, mostly according to the number and quality of classifieds (and thus the content) for each job/locality. It's not worth indexing pages without any classifieds (and thus without unique content), as Google will cut them from the index before long.
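That rule could even be enforced mechanically, by deciding each page's robots meta value from its classified count (a hypothetical sketch; the function name and threshold are assumptions, not anything from this thread):

```python
# Hypothetical helper: only allow indexing of job/locality pages
# that carry enough unique content (i.e. at least one live classified).
MIN_CLASSIFIEDS = 1  # assumed threshold; tune per job/locality

def robots_meta(classified_count):
    """Return the robots meta value for a job/locality page."""
    if classified_count >= MIN_CLASSIFIEDS:
        return "index,follow"
    # Empty pages stay crawlable but out of the index, so they
    # can be opened up later as classifieds arrive.
    return "noindex,follow"

print(robots_meta(12))  # index,follow
print(robots_meta(0))   # noindex,follow
```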
-
I really don't think Google likes it when you release a website that big all at once. It would much rather you build it up slowly. I would urge you to index the main pages and noindex the sub-categories.
-
We worked in partnership with a similarly large-scale site last year and found exactly the same thing: Google simply cut 60% of our pages from the index because they were cookie cutter.
You have to ensure that pages have relevant, unique, worthwhile content. Otherwise, if all you're doing is swapping the odd word here and there for the locality and job name, it's not going to work.
Focus on an ongoing SEO campaign for each target audience, e.g. by job type, locality, etc.
-
If you plan to get a website that big indexed you will need to have a few things in order...
First, you will need thousands of deep links pointing at hub pages deep within the site. These will force spiders down there and make them chew their way out through the unindexed pages. These links must be permanent: if you remove them, spiders will stop visiting and Google will forget your pages. For a 10-million-page site you will need thousands of links hitting thousands of hub pages.
Second, for a site this big.... are you going to have substantive amounts of unique content? If your pages are made from a cookie cutter and look like this....
"yada yada yada yada yada yada yada yada SEO job in Paris yada yada yada yada yada yada yada yada yada yada yada yada yada yada yada yada yada yada yada yada yada yada yada yada send application to Joseph Blowe, 11 Anystreet, Paris, France yada yada yada yada yada yada yada yadayada yada yada yada yada yada yada yada yada yada yada yada yada yada yada yada"
.... then Google will index these pages, and a few weeks to a few months later your entire site might receive a Panda penalty and drop out of Google.
Finally... all of those links needed to get the site in the index... they need to be Penguin proof.
It is not easy to get a big site in the index. Google is tired of big cookie cutter sites with no information or yada yada content. They are quickly toasted these days.