Tactic to get 2000+ pages indexed (fast)
-
Dear SEOmozzers,
Soon I'll be launching my new project: a website with about 2,000+ pages at roughly 150 words per page (simple instructions, so I can't make them longer).
It is vital that every page gets indexed and shows up in the SERPs.
Which tactic would you recommend:
- Put every page online at once (with a good sitemap), or
- Feed the Google sitemap with, say, 30 pages a day, so the crawler comes by every day and the site hopefully gets better indexation and better rankings over time.
- Some other tactic? Or doesn't it matter?
Many thanks for your help.
Gr Menno
-
I echo what Ryan said 100%. Another suggestion - especially because it sounds like you're going to start with a whole bunch of info - is to add a blog. The key when you're building a site, especially one that has a whole bunch of info go live at once, is to stay focused on fresh content.
With my businesses' sites, I've found that pushing content all at once during the launch gets me indexed, but doesn't necessarily get me the SERP position I want. I try to write two articles a week per website at a minimum. It keeps the crawlers coming back and increases my site-wide keyword density and my potential for catching long-tail searches.
-
Thanks for the advice. I think I'll go with it and redesign the structure to get more info on each page, so I can also put more effort into unique articles (only around 700 of them then). That saves me time and makes my website better for SEO.
-
I'm with Ryan on this one. If you can use fewer pages with more information on them, then do so.
I'd also recommend reading up on the Panda update.
-
Without thoroughly understanding your niche, the products / services / companies involved, it is very difficult to offer meaningful advice.
In brief, you can drop the "generic product" pages and instead make a single, rich page for Company A which offers all the details readers need.
You are welcome to operate your site however you see fit, but Google and Bing will operate their search results how they see fit, and they have determined the tactic you are using is not in the best interest of users.
If you feel compelled to present the site in the manner you described, you can add the canonical tag to all the generic product pages, indicating the Company A page as the primary page to be indexed (a sketch of this is shown below).
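Purely as an illustration (the domain and path below are made up, not from the actual site), the canonical tag would sit in the head of each generic product page and point at the Company A info page, something like:

```html
<!-- In the <head> of every "generic product ... for Company A" page.
     example.com and the path are placeholders for the real Company A info page URL. -->
<link rel="canonical" href="https://www.example.com/company-a/" />
```

With that in place, the generic product pages stay live for visitors while the search engines are asked to consolidate indexing signals onto the Company A page.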
-
I'll try to explain what my problem is, because what you're telling me is true - I found that out myself once too.
The problem is that every page NEEDS to be there, because the small differences in info are vital.
It's a website with info about how to cancel subscriptions. Most of the services offered are the same across all companies; only the cancellation address is different.
It's built up like this:
Company A - info page
Generic product a - cancellation address for company A - info page
Generic product b - cancellation address for company A - info page
Generic product c - cancellation address for company A - info page
Company B - info page
Generic product a - cancellation address for company B - info page
Generic product b - cancellation address for company B - info page
Generic product c - cancellation address for company B - info page
The difference in content is not more than 15%, but that 15% makes the difference and is vital. Any idea for a solution to this problem?
-
The second choice would be recommended.
It is common for site owners to publish more pages in an attempt to rank for more keywords. An example I can think of relates to how-to directions:
Article 1 - How to clear cache in Firefox 13
Article 2 - How to clear cache in Firefox 12
Article 3 - How to clear cache in Firefox 11
...and so forth. The directions are all the same but in an effort to target individual keywords the site owner generates numerous pages. Search engines view the pages as duplicate content.
Next, site owners attempt what you are suggesting...hire writers to change a few words around to make each article appear unique. This tactic does not help improve the quality of your pages and therefore does not help users. It is simply an attempt to manipulate search engines. It often does not work. If it does work, it may stop working after a time as search engines get better at filtering such techniques.
The suggestion I would make is to forget search engines exist and write the clearest, best directions ever written. Offer images, details about things that might go wrong, etc.
-
Thanks for the list, I think everything is fine except for the content point you mentioned. I think I need a few good copywriters to write 2,000 x 200 words of unique articles.
To tackle the unique content problem I have two options. Which one do you think is best?
- Publish the site with up to 75% possible duplicate content, and then rewrite it over time.
- Only publish unique articles, and take the time needed to write them?
Gr
-
Your site size really is not a factor in determining how quickly the site is indexed. A few steps you can take to achieve the goal of having all 2k pages indexed fast:
- Ensure your site's navigation is solid. All pages should be reachable within a maximum of 3 mouse clicks from the home page.
- For the most part, your site should be HTML based. You can use JavaScript, Flash and so forth, but the HTML support needs to be there as well. Try turning off JavaScript and Flash, then navigating your site.
- For pages you do not wish to be indexed, add the "noindex" tag to them rather than blocking them in robots.txt when possible (see the snippets after this list).
- Review your sitemap to ensure it is solid. Ensure all 2k pages you want indexed are included in the sitemap. Also ensure there are not any pages blocked by robots.txt or "noindex" in your sitemap.
- Review your content to ensure each page is unique. With only 150 words per page, there is a high likelihood many pages will be viewed as duplicate content and therefore not indexed.
- Review your site code (validator.w3.org) to ensure it is fairly clean. Some errors can impact a search engine's ability to crawl your site.
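To illustrate the noindex and sitemap points above (the URLs are placeholders, not from the actual site), the meta robots tag goes in the head of any page you want kept out of the index, and every page you do want indexed gets its own entry in the XML sitemap:

```html
<!-- Head of a page that should NOT be indexed; leave it crawlable
     (not blocked in robots.txt) so the directive can actually be read. -->
<meta name="robots" content="noindex" />
```

```xml
<!-- sitemap.xml: one <url> entry per page that should be indexed.
     The <loc> value is a made-up example. -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/company-a/cancel-product-a</loc>
  </url>
</urlset>
```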
My biggest concern is the duplicate content point. If you simply change the title and a couple of keywords, the other pages will be viewed as duplicates and not indexed, or even if they are indexed they won't rank well.
I should also clarify that the above applies mostly to Google.com. Bing is much pickier about the pages it will index.