Getting 260,000 pages re-indexed?
-
Hey there guys,
I was recently hired to do SEO for a big forum: move the site to a new domain and get it back to its old rankings after the move. This all went quite well, except that we lost about a third of our traffic. Although I expected some traffic to drop, this is quite a lot and I'm wondering what's causing it. The big keywords are still pulling the same traffic, but I feel that a lot of the small threads on the forum have been de-indexed. Now, with a site of 260,000 threads, do I just take my loss and focus on new keywords? Or is there something I can do to get all these threads re-indexed?
Thanks!
-
Great, I'm going to try that, thanks a lot!
-
Link to your category pages... Or a good idea might be to prepare pages by topic that feature (and link to) some of the most informative and popular threads.
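If you go the topic-page route, the idea is just to surface the strongest threads and funnel internal links at them. Here's a minimal Python sketch of that idea; the thread data and the /threads/<id> URL pattern are made up for illustration:

```python
# Minimal sketch: build a topic hub page that links to the most popular
# threads. Thread data and the URL pattern are hypothetical.
threads = [
    {"id": 101, "title": "Getting started guide", "views": 50231},
    {"id": 205, "title": "Most common mistakes", "views": 48790},
    {"id": 330, "title": "Advanced tips and tricks", "views": 31002},
]

# Feature the most popular threads first so the strongest content
# receives the internal links.
threads.sort(key=lambda t: t["views"], reverse=True)

items = "\n".join(
    '  <li><a href="/threads/{id}">{title}</a></li>'.format(**t)
    for t in threads
)
print("<h1>Popular threads on this topic</h1>\n<ul>\n" + items + "\n</ul>")
```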
-
We didn't actually do a 404, we 301'd everything, and I do mean everything, to our new domain.
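For what it's worth, a blanket 301 like that is easy to spot-check from the outside. Here's a rough Python sketch (both domains and the sample paths are placeholders) that confirms each old URL answers with a single 301 and lands on a resolving page on the new domain:

```python
import requests

# Placeholders: swap in your real domains and a sample of old URLs.
OLD_ORIGIN = "http://www.old-domain.example"
NEW_DOMAIN = "new-domain.example"
SAMPLE_PATHS = ["/", "/forum/", "/threads/some-old-thread.12345/"]

for path in SAMPLE_PATHS:
    resp = requests.get(OLD_ORIGIN + path, allow_redirects=True, timeout=10)
    # resp.history holds every redirect hop on the way to the final URL.
    hops = [(r.status_code, r.headers.get("Location")) for r in resp.history]
    ok = (
        len(resp.history) == 1                   # exactly one hop, no chains
        and resp.history[0].status_code == 301   # permanent, not a 302
        and NEW_DOMAIN in resp.url               # ended up on the new domain
        and resp.status_code == 200              # and the target resolves
    )
    print(("OK   " if ok else "FAIL ") + f"{path} -> {resp.url} via {hops}")
```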
-
Yes
-
Aye, that's what I thought as well
-
Nothing changed except the ads, which we actually placed better. The site speed is the same because we didn't move hosts; it has actually improved lately because we hired someone to optimize it. The backlinks have transferred and we are building new ones. The thing is, the site itself is ranking really well for its new keywords; it's just these old ones that have apparently died.
-
260,000 threads indeed; they go back to 2006 though, so we've had some time to accumulate posts.
Throwing those PR5 links in there would help of course, but where do I point them? How deep do I link? I could link to all 260,000 threads, but I believe that would be a little crazy.
-
Checklist:

1) 404s: done
2) 301s: done
3) It's been two months, so by now Google must have settled down with the traffic.

How about on-page factors?
- Page title
- Layout
- Ads
- Site speed
- Outbound links

You need to check if they are all the same. If it's not this, then I'm afraid I can't come up with any more points to help you with.
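To make the "are they all the same" check concrete, here's a rough Python sketch (the URLs are placeholders) that pulls the title tag from a sample of thread pages, so you can diff the output against a crawl of the old site:

```python
import re
import requests

# Placeholder sample of thread URLs on the new domain.
URLS = [
    "https://www.new-domain.example/threads/12345/",
    "https://www.new-domain.example/threads/67890/",
]

TITLE_RE = re.compile(r"<title>(.*?)</title>", re.IGNORECASE | re.DOTALL)

for url in URLS:
    html = requests.get(url, timeout=10).text
    match = TITLE_RE.search(html)
    title = match.group(1).strip() if match else "(no title tag found)"
    # Print url -> title; compare this against what the old domain served.
    print(f"{url} -> {title}")
```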
-
While this may be true in the general sense, I would like to point out that the loss of traffic here is caused by the shifting of the domain.
-
Almost two months now.
-
How long has it been since you moved your site?
-
260,000 threads?
How many inbound links do you have to hold all of that page mass in the index?
If you don't have lots of high-PR deep links into the site, the spiders will visit obscure pages infrequently and will forget about them.
You need to link deep into these pages at multiple points with heavy PR. That will force a continuous and recurring stream of spiders down into the mass and require them to chew their way out. I think you need a few dozen PR5 links at least for healthy indexing.
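On the "where do I point them" question, one way to choose those deep-link targets is to spread them across the whole archive rather than just the newest threads. A sketch of that idea in Python, assuming you can export each thread's URL, start year, and view count (the data below is made up):

```python
from collections import defaultdict

# Hypothetical export: (url, year_started, views) for every thread.
threads = [
    ("/threads/classic-discussion-1001/", 2006, 90210),
    ("/threads/classic-discussion-1002/", 2007, 55000),
    ("/threads/mid-era-topic-2001/", 2008, 47000),
    ("/threads/recent-topic-3001/", 2010, 42000),
]

PER_YEAR = 3  # how many deep-link targets to pick per year

by_year = defaultdict(list)
for url, year, views in threads:
    by_year[year].append((views, url))

# Take the strongest threads from every era so PR (and the spiders)
# flows into the whole archive instead of pooling on the newest pages.
targets = []
for year in sorted(by_year):
    top = sorted(by_year[year], reverse=True)[:PER_YEAR]
    targets.extend(url for _, url in top)

print(targets)
```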
-
We've checked Google Webmaster Tools for 404s and crawl errors, all of which we fixed a day after moving. I can't check all the pages in the SEOmoz tools because of the limit. We did do a complete 301, actually, redirecting every page to its new location.
-
I would check Google Webmaster Tools for 404s and crawl errors and fix them first.
I would then do the same using the SEOmoz tools.
After all that, I would do a complete 301 from the old domain to the new domain.
Hope this helps