Tactic to get 2,000+ pages indexed (fast)
-
Dear SEOmozzers,
Soon I'll be launching my new project: a website with about 2,000+ pages, at roughly 150 words per page (simple instructions, so I can't make them longer).
It is vital that every page gets indexed and appears in the SERPs.
Which tactic would you recommend?
- Put every page online at once (with a good sitemap), or
- Feed the Google sitemap with, say, 30 pages a day, so the crawler comes by every day and the site hopefully earns better indexation and rankings over time.
- Another tactic? Or does it not matter?
Many thanks for your help.
Regards, Menno
-
I echo what Ryan said 100%. Another suggestion - especially because it sounds like you're going to start with a whole bunch of info - is to add a blog. When you're building a site, especially one that has a whole bunch of info go live at once, the key is to stay focused on fresh content.
With my businesses' sites, I've found that pushing all the content out at launch gets me indexed, but doesn't necessarily get me the SERP positions I want. I try to write at least two articles a week per website. It keeps the crawlers coming back, increases my site-wide keyword density, and improves my potential for catching long-tail searches.
-
Thanks for the advice. I think I'll go with it and redesign the structure to get more info onto each page, so I can also put more effort into unique articles (only around 700 of them then). That saves me time and makes my website better for SEO.
-
I'm with Ryan on this one. If you can use fewer pages with more information on them, then do so.
I'd also recommend reading up on the Panda update.
-
Without thoroughly understanding your niche, the products / services / companies involved, it is very difficult to offer meaningful advice.
In brief, you can drop the "generic product" pages and instead make a single, rich page for Company A which offers all the details readers need.
You are welcome to operate your site however you see fit, but Google and Bing will operate their search results how they see fit, and they have determined the tactic you are using is not in the best interest of users.
If you felt compelled to present the site in the manner you described, you can add the canonical tag to all the Generic Product pages indicating the Company A page as the primary page to be indexed.
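For reference, the canonical tag is a single line in the <head> of each Generic Product page. A minimal sketch, with a hypothetical domain and path:

```html
<!-- In the <head> of every Generic Product variant page, telling search
     engines the Company A page is the preferred version to index.
     The URL here is a hypothetical example. -->
<link rel="canonical" href="https://www.example.com/company-a/" />
```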
-
I'll try to explain what my problem is, because what you're telling me is true - I found that out myself once too.
The problem is that every page NEEDS to be there, because the small differences in info are vital.
It's a website with info about how to cancel subscriptions. Most of the services offered are the same across all companies; only the address differs.
It's built up like this:
Company A - info page
- Generic product a - cancellation address for Company A - info page
- Generic product b - cancellation address for Company A - info page
- Generic product c - cancellation address for Company A - info page
Company B - info page
- Generic product a - cancellation address for Company B - info page
- Generic product b - cancellation address for Company B - info page
- Generic product c - cancellation address for Company B - info page
The difference in content is not more than 15%, but that 15% makes the difference and is vital. Any idea for a solution to this problem?
-
The second choice would be recommended.
It is common for site owners to publish more pages in an attempt to rank for more keywords. An example I can think of related to directions:
Article 1 - How to clear cache in Firefox 13
Article 2 - How to clear cache in Firefox 12
Article 3 - How to clear cache in Firefox 11
...and so forth. The directions are all the same but in an effort to target individual keywords the site owner generates numerous pages. Search engines view the pages as duplicate content.
Next, site owners attempt what you are suggesting: hiring writers to change a few words around to make each article appear unique. This tactic does not improve the quality of your pages and therefore does not help users. It is simply an attempt to manipulate search engines. It often does not work, and if it does work, it may stop working after a time as search engines get better at filtering such techniques.
The suggestion I would make is to forget search engines exist and write the clearest, best directions ever written. Offer images, details about things that might go wrong, etc.
-
Thanks for the list - I think everything is fine except the content you mentioned. I think I need a few good copywriters to write 2,000 unique articles of 200 words each.
To tackle the unique-content problem I have two options. Which one do you think is best?
- Publish the site with 75% potentially duplicate content, and rewrite it over time.
- Publish only unique articles, and take the time needed to write them?
Regards
-
Your site size really is not a factor in determining how quickly the site is indexed. A few steps you can take to achieve the goal of having all 2k pages indexed fast:
- Ensure your site's navigation is solid. All pages should be reachable within a maximum of 3 mouse clicks from the home page.
- For the most part, your site should be HTML-based. You can use JavaScript, Flash and so forth, but the HTML support needs to be there as well. Try turning off JavaScript and Flash, then navigating your site.
- For pages you do not wish to be indexed, add the "noindex" tag to them rather than blocking them in robots.txt when possible (see the sketch at the end of this answer).
- Review your sitemap to ensure it is solid. Ensure all 2k pages you want indexed are included in the sitemap. Also ensure there are not any pages blocked by robots.txt or tagged "noindex" in your sitemap (a minimal sitemap entry is also sketched below).
- Review your content to ensure each page is unique. With only 150 words per page, there is a high likelihood many pages will be viewed as duplicate content and therefore not indexed.
- Review your site code (validator.w3.org) to ensure it is fairly clean. Some errors can impact a search engine's ability to crawl your site.
My biggest concern is the last point. If you simply change the title and a couple of keywords, the other pages will be viewed as duplicates and not indexed, or even if they are indexed they won't rank well.
I should also clarify the above applies mostly to Google.com. Bing is much pickier about the pages it will index.
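To make a few of the points above concrete, here are minimal sketches; every URL and path is a hypothetical example, not anything from your actual site.

```html
<!-- "noindex": placed in the <head> of a page you do not want indexed.
     Unlike a robots.txt block, the crawler can still fetch the page
     and see the directive. -->
<meta name="robots" content="noindex">

<!-- Crawlable navigation: a plain HTML link still works with JavaScript
     and Flash turned off; script-only navigation may never be followed. -->
<a href="/companies/company-a/cancel/">How to cancel Company A</a>
```

And a bare-bones sitemap entry, following the sitemaps.org protocol:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page you want indexed; leave out anything
       blocked by robots.txt or tagged noindex. -->
  <url>
    <loc>https://www.example.com/companies/company-a/cancel/</loc>
  </url>
</urlset>
```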