Tactic to get 2000+ pages indexed (fast)
-
Dear SEOmozzers,
Soon I'll be launching my new project: a website with about 2,000 pages, with roughly 150 words per page (simple instructions, so they can't be made longer).
It is vital that every page gets indexed and shows up in the SERPs.
Which tactic would you recommend:
- Just put every page online at once (with a good sitemap), or
- Feed the Google sitemap with, say, 30 pages a day, so the crawler comes by every day and the site hopefully gets better indexation and better rankings over time.
- Some other tactic? Or does it not matter?
Many thanks for your help.
Gr Menno
-
I echo what Ryan said 100%. Another suggestion - especially because it sounds like you're going to launch with a whole bunch of info at once - is to add a blog. When you're building a site like that, it's important to stay focused on fresh content.
With my businesses' sites, I've found that pushing all the content out at launch gets me indexed, but doesn't necessarily get me the SERP positions I want. I try to write at least two articles a week per website. It keeps the crawlers coming back, and it increases my site-wide keyword coverage and my potential for catching long-tail searches.
-
Thanks for the advice. I think I'll go with it and redesign the structure to get more info onto each page, so I can also put more effort into unique articles (only around 700 of them, then). That saves me time and makes my website better for SEO.
-
I'm with Ryan on this one. If you can use fewer pages with more information on them, then do so.
And also I'd recommend reading up on the Panda Update.
-
Without thoroughly understanding your niche and the products / services / companies involved, it is very difficult to offer meaningful advice.
In brief, you can drop the "generic product" pages and instead make a single, rich page for Company A which offers all the details readers need.
You are welcome to operate your site however you see fit, but Google and Bing will operate their search results how they see fit, and they have determined the tactic you are using is not in the best interest of users.
If you feel compelled to present the site in the manner you described, you can add a canonical tag to all of the Generic Product pages, indicating the Company A page as the primary page to be indexed.
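For reference, the canonical tag is a single line in the `<head>` of each Generic Product page. A minimal sketch, with placeholder URLs (example.com stands in for your real domain):

```html
<!-- In the <head> of every Generic Product page belonging to Company A.
     The href is a placeholder, not a URL from this thread. -->
<link rel="canonical" href="http://www.example.com/company-a/" />
```

Google then treats the Company A page as the preferred version of that group of pages and consolidates the near-duplicates into it.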
-
I'll try to explain what my problem is, because what you're saying is true - I found that out myself once, too.
The problem is that every page NEEDS to be there, because the small differences in info are vital.
It's a website with info about how to cancel subscriptions. Most of the services offered are the same across all companies; only the address differs.
It's built up like this:
Company A - info page
Generic product a - cancellation address for company A - info page
Generic product b - cancellation address for company A - info page
Generic product c - cancellation address for company A - info page
Company B - info page
Generic product a - cancellation address for company B - info page
Generic product b - cancellation address for company B - info page
Generic product c - cancellation address for company B - info page
The difference in content is not more than 15%, but that 15% makes the difference and is vital. Any idea for a solution to this problem?
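As a quick sanity check on how similar two of these pages really look to a machine, here is a rough sketch using only the Python standard library. The sample texts are invented placeholders, not pages from this site:

```python
# Rough estimate of how similar two near-duplicate pages are,
# using only the Python standard library.
from difflib import SequenceMatcher

def page_similarity(text_a: str, text_b: str) -> float:
    """Return a similarity ratio between 0.0 and 1.0 for two page texts."""
    return SequenceMatcher(None, text_a, text_b).ratio()

# Invented sample texts standing in for two cancellation pages:
page_a = ("To cancel Generic Product A with Company A, send a signed "
          "letter to 1 Example Street, Amsterdam.")
page_b = ("To cancel Generic Product A with Company B, send a signed "
          "letter to 2 Sample Road, Rotterdam.")

print(f"Similarity: {page_similarity(page_a, page_b):.0%}")
```

If pages score around the ~85% similarity that a 15% content difference implies, it's plausible search engines will treat them as duplicates.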
-
The second choice would be recommended.
It is common for site owners to publish more pages in an attempt to rank for more keywords. An example I can think of related to directions:
Article 1 - How to clear cache in Firefox 13
Article 2 - How to clear cache in Firefox 12
Article 3 - How to clear cache in Firefox 11
...and so forth. The directions are all the same but in an effort to target individual keywords the site owner generates numerous pages. Search engines view the pages as duplicate content.
Next, site owners attempt what you are suggesting...hire writers to change a few words around to make each article appear unique. This tactic does not help improve the quality of your pages and therefore does not help users. It is simply an attempt to manipulate search engines. It often does not work. If it does work, it may stop working after a time as search engines get better at filtering such techniques.
The suggestion I would make is to forget search engines exist and write the clearest, best directions ever written. Offer images, details about things that might go wrong, etc.
-
Thanks for the list - I think everything is fine except the content point you mentioned. I think I need a few good copywriters to write 2,000 unique articles of 200 words each.
To tackle the unique-content problem I see two options. Which do you think is best?
- Publish the site with 75% potentially duplicate content, then rewrite pages over time.
- Only publish unique articles, and accept that it will take more time?
Gr
-
Your site size really is not a factor in determining how quickly the site is indexed. A few steps you can take to achieve the goal of having all 2k pages indexed fast:
- Ensure your site's navigation is solid. All pages should be reachable within a maximum of 3 mouse clicks from the home page.
- For the most part, your site should be HTML-based. You can use JavaScript, Flash and so forth, but the HTML support needs to be there as well. Try turning off JavaScript and Flash, then navigating your site.
- For pages you do not wish to be indexed, add the "noindex" tag to them rather than blocking them in robots.txt when possible.
- Review your sitemap to ensure it is solid. Ensure all 2k pages you want indexed are included in the sitemap. Also ensure there are no pages blocked by robots.txt or marked "noindex" in your sitemap.
- Review your content to ensure each page is unique. With only 150 words per page, there is a high likelihood many pages will be viewed as duplicate content and therefore not indexed.
- Review your site code (validator.w3.org) to ensure it is fairly clean. Some errors can impact a search engine's ability to crawl your site.
My biggest concern is the duplicate-content point. If you simply change the title and a couple of keywords, the other pages will be viewed as duplicates and not indexed; even if they are indexed, they won't rank well.
I should also clarify the above applies mostly to Google.com. Bing is much pickier about the pages it will index.
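To make the "noindex" and sitemap points above concrete, two minimal sketches (all URLs and dates are placeholders):

```html
<!-- In the <head> of a page you do NOT want indexed
     (preferred over blocking it in robots.txt, so the
     crawler can still see the directive): -->
<meta name="robots" content="noindex, follow">
```

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/company-a/</loc>
    <lastmod>2012-06-15</lastmod>
  </url>
  <!-- ...one <url> entry per page you want indexed,
       up to 50,000 per sitemap file -->
</urlset>
```

Pages carrying the noindex tag should not appear in the sitemap at all.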
-