Getting 260,000 pages re-indexed?
-
Hey there guys,
I was recently hired to do SEO for a big forum: move the site to a new domain and get it back up to its old rankings after the move. That all went quite well, except that we lost about a third of our traffic. I expected some traffic to drop, but this is a lot, and I'm wondering what's causing it. The big keywords are still pulling the same traffic, but it looks like a lot of the smaller threads on the forum have been de-indexed. Now, with a site of 260,000 threads, do I just take the loss and focus on new keywords? Or is there something I can do to get all these threads re-indexed?
Thanks!
-
Great, I'm going to try that, thanks a lot!
-
Link to your category pages. Or, better yet, prepare pages by topic that feature (and link to) some of the most informative and popular threads.
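For what it's worth, the hub-page idea can be sketched in a few lines of Python. The thread data, URL shapes, and view counts below are placeholders, not anything from your forum:

```python
# Sketch: build a simple topic hub page that links to the most popular
# threads on one topic. All data here is made up for illustration.
def build_hub_page(topic, threads, limit=20):
    """Return minimal HTML linking to the top threads for one topic."""
    top = sorted(threads, key=lambda t: t["views"], reverse=True)[:limit]
    links = "\n".join(
        f'<li><a href="{t["url"]}">{t["title"]}</a></li>' for t in top
    )
    return f"<h1>Best of {topic}</h1>\n<ul>\n{links}\n</ul>"

threads = [
    {"title": "Sticky FAQ", "url": "/threads/1", "views": 90000},
    {"title": "Obscure question", "url": "/threads/2", "views": 12},
]
print(build_hub_page("Hosting", threads))
```

The point is that each hub page gives spiders a crawl path into threads that would otherwise only be reachable through deep forum pagination.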
-
We didn't actually do a 404, we 301'd everything, and I do mean everything, to our new domain.
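A blanket, path-preserving 301 like that can be spot-checked with a small script: compute what the redirect target for each old URL should be, then compare against what the server actually returns. The host names below are placeholders, not the real domains:

```python
# Sketch: compute the expected 301 target for any old-domain URL,
# assuming a path-preserving redirect (same path and query string,
# new host). The domain names are placeholders.
from urllib.parse import urlsplit, urlunsplit

NEW_HOST = "forum.new-domain.example"  # placeholder

def expected_target(old_url):
    """Same scheme, path, and query, swapped onto the new host."""
    parts = urlsplit(old_url)
    return urlunsplit((parts.scheme, NEW_HOST, parts.path, parts.query, ""))

sample = ["http://forum.old-domain.example/threads/12345?page=2"]
for url in sample:
    print(url, "->", expected_target(url))
```

In practice you'd feed a sample of old thread URLs through this and check that each one answers with a 301 whose `Location` header matches the computed target.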
-
Yes
-
Aye, that's what I thought as well
-
Nothing changed except the ads, which we placed better. The site speed is the same because we didn't move hosts; it has actually improved lately because we hired someone to optimize it. The backlinks have transferred and we are building new ones. The thing is, the site itself is ranking really well for its new keywords; it's just these old ones that have apparently died.
-
260,000 threads indeed, though they go back to 2006, so we've had some time to accumulate posts.
Throwing those PR5 links in there would help, of course, but where do I point them? How deep do I link? I could link to all 260,000 threads, but I believe that would be a little crazy.
-
Checklist:

- 404s: done
- 301s: done
- It's been two months, so by now Google must have settled down after the move

How about on-page factors?

- Page title
- Layout
- Ads
- Site speed
- Outbound links

You need to check whether they are all the same. If it's not one of these, then I'm afraid I can't come up with any more points to help you with.
-
While this may be true in general, I would like to point out that the loss of traffic is caused by the domain move itself.
-
Almost two months now.
-
How long has it been since you moved your site?
-
260,000 threads?
How many inbound links do you have to hold all of that page mass in the index?
If you don't have lots of high-PR deep links into the site, the spiders will visit obscure pages infrequently and will forget about them.
You need to link deep into these pages at multiple points with heavy PR. That will force a continuous, recurring stream of spiders down into the mass and require them to chew their way out. I think you need at least a few dozen PR5 links for healthy indexing.
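To put the "how deep do I link?" question in perspective: you don't point links at all 260,000 threads directly, you layer index/hub pages so every thread sits a few clicks from a strongly linked page. A rough sketch of the arithmetic (the 100-links-per-page fanout is an assumption, not a recommendation from this thread):

```python
# Sketch: how many tiers of index pages does it take to cover
# 260,000 threads, assuming ~100 links per index page?
import math

THREADS = 260_000
LINKS_PER_PAGE = 100  # assumed fanout per index/hub page

def levels_needed(pages, fanout):
    """Tiers of index pages until a single page covers everything."""
    levels = 0
    while pages > 1:
        pages = math.ceil(pages / fanout)
        levels += 1
    return levels

hub_pages = math.ceil(THREADS / LINKS_PER_PAGE)   # bottom-tier hub pages
tiers = levels_needed(hub_pages, LINKS_PER_PAGE)  # tiers above the hubs
print(hub_pages, tiers)  # -> 2600 2
```

So with that fanout, 2,600 hub pages and two index tiers above them put every thread within three clicks of the root, which is where the deep PR5 links would do the most good.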
-
We've checked Google Webmaster Tools for 404s and crawl errors, which we fixed a day after moving. I can't check all the pages in the SEOmoz tools because of the limit. We did do a complete 301, redirecting every page to its new location.
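One way to double-check that nothing slipped through, independent of the tools' limits, is to scan the old domain's access log for any request that didn't answer with a 301. A minimal sketch, assuming the common Apache/nginx combined log format:

```python
# Sketch: find paths on the old domain that did NOT return a 301,
# from a combined-format access log. Log lines below are fabricated.
import re
from collections import Counter

LINE = re.compile(r'"[A-Z]+ (?P<path>\S+) HTTP/[\d.]+" (?P<status>\d{3})')

def non_301_paths(log_lines):
    """Count requested paths whose response status was not 301."""
    bad = Counter()
    for line in log_lines:
        m = LINE.search(line)
        if m and m.group("status") != "301":
            bad[m.group("path")] += 1
    return bad

sample = [
    '1.2.3.4 - - [10/Aug/2012:10:00:00 +0000] "GET /threads/1 HTTP/1.1" 301 0',
    '1.2.3.4 - - [10/Aug/2012:10:00:01 +0000] "GET /threads/2 HTTP/1.1" 404 169',
]
print(non_301_paths(sample))  # only /threads/2 shows up
```

Anything this surfaces is a URL that real visitors or spiders are still hitting on the old domain without being forwarded.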
-
I would check Google Webmaster Tools for 404s and crawl errors and fix them first.
I would then do the same using the SEOmoz tools.
After all that, I would do a complete 301 from the old domain to the new domain.
Hope this helps!