How long for a sitewide 301 to reindex?
-
Hey Gang,
Finally joined the big boys here, excited to see what we all can do together.
Here is my situation. I have been struggling since Panda 1.0 with a particular site, www.burnworld.com. Over 2011 we figured out what the content issues were and went on a major cleanup, which seemed to help toward the end of 2011. However, further Panda updates this year (mainly April) have struck again. This was after adding a WordPress blog to the site in late 2011, so it was a mix of a traditional HTML site and a WordPress blog. Thinking that mix could be the issue, in May this year we transferred all the content over to WordPress only. We kept the same linking structure, using a permalink plugin to set specific URLs.
Fast-forward to Panda 20. This wiped out all rankings, and then we could not even rank for our own content. One site that syndicates our content now ranks for it instead of us, and many 'feed' sites that scrape our feeds also rank instead of us.
Okay, now to my original question. Two weeks ago we pulled the plug and decided it might be best to start over on www.burnworld.net. The .net had been a WordPress blog in the past (shut down earlier in 2012) and sat with about five pages of content until we did the 301.
So today none of the pages are in the main index, and I am wondering if the 301 was a mistake, since it points to an existing site that never really ranked. Would it have been better to start on a brand-new domain?
How long have others seen it take before Google puts the pages back in the main index?
I'd like to figure out the best action to take to get back into Google's good graces. I'll keep this page updated so others with this issue can hopefully have a resource to turn to.
BTW - nothing has changed with Binghoo; rankings are all the same and they have updated the domain change properly.
-
That depends on why the page stopped ranking. If it was being filtered for duplicate content, then rewriting might bring it back, provided your new content is unique.
If it stopped ranking because you have an overabundance of exact match anchor text links pointing at the page then rewriting will probably not work.
-
I appreciate the feedback EGOL. Is this statement based on experience with other sites? I don't see how all the content can be worthless.
So, getting back to my original question: how long on average does Google take to add a site back into the index after a 301 redirect? Webmaster Tools shows that 165 of 170 pages from the sitemaps are indexed, but Index Status shows only 12, and that was the number before the switch, so nothing has changed there.
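One way to reconcile the "sitemap submitted" count against what's actually indexed is to pull the URL list straight out of the sitemap and spot-check each URL with a `site:` query by hand. A rough sketch (Python, standard library only; the sitemap content below is a made-up placeholder, not burnworld's real sitemap, and the function name is mine):

```python
# Sketch: list the <loc> URLs in a sitemap so each can be checked
# manually with a site: query. Sample data is hypothetical.
import xml.etree.ElementTree as ET

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_urls(sitemap_xml):
    """Return the list of <loc> URLs in a sitemap document."""
    root = ET.fromstring(sitemap_xml)
    return [loc.text.strip() for loc in root.iter(SITEMAP_NS + "loc")]

sample = """<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://www.burnworld.net/</loc></url>
  <url><loc>http://www.burnworld.net/dvd-software/</loc></url>
</urlset>"""

for url in sitemap_urls(sample):
    # Paste each of these into Google to see if the page is indexed.
    print("spot-check -> site:" + url.split("//", 1)[1])
```

Comparing that list against what `site:` actually returns gives a page-by-page view instead of the two aggregate numbers WMT reports.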
-
Appreciate the feedback Paul, regardless of what it is.
The main reason for the 301 was a complete site redesign: layout, internal linking structure, rewriting/updating/removing content, etc. I was hoping the domain change would be treated as a new site with all the updates we made, and that it would eliminate the hornet's nest of the .com, which was a hybrid mix of HTML and WordPress.
Because of all these changes, is Google just filtering out the content until it knows what to do with it, meaning until it has crawled it enough times to know what it is? It's just strange that Binghoo handled the domain change properly but Google has completely filtered it all out.
-
IMO this domain and its content is screwed. Worthless.
I would start fresh on a new domain.
Don't reuse content from the original site. Don't redirect anything from the original site.
I would start on a new domain and not syndicate anything or republish anything.
That's what I would do.... others might do different.
-
Not encouraging to hear, EGOL, but if that's the reality with Google now, then it really does look like they are pushing the little guy out regardless of the site's PR.
So do you think my content has a chance of ever showing back up in the Google SERPs after doing the sitewide 301 redirect? Should I undo the 301, or does that cause another potential ranking issue?
-
Looking at this purely from a technical perspective, Rob... (from EGOL's observations above you have content issues as well)
I think you've misunderstood the biggest challenge to recovering from a heavy Panda hammering with a new site.
When you create a new site for recovery purposes, you must completely dissociate it from the original damaged site. By 301-redirecting the old URLs to the new ones, you've simply told Google that the old site (with all its algorithmic penalties) now exists at the new URL, so the new site inherits all the de-ranking of the old one.
This is why Panda recovery is so tough - you need to literally start again from scratch with the new site. You can't even easily redirect the human traffic from the old site to the new without risking bringing the "penalties" with them.
And this makes sense from the Search Engine perspective. If they've de-ranked your existing site, it wouldn't make sense to allow you to just point to a new domain and sidestep the penalties. Otherwise, nefarious webmasters would just run manipulative sites as hard as they could until they got penalised, then point to a new domain and pick up where they left off until they got caught again.
Looks to me like you're going to need to completely disconnect the two sites (I'm even paranoid enough to suggest a new Google Analytics account and new hosting/domain registration if you want to go that far) and start a significant content rewrite and new content program.
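Once the two sites are disconnected, it's worth verifying that no old .com URL still chains through to the .net, since that's exactly the association you're trying to break. A rough sketch (nothing here is from Paul's post; the function name and example URLs are mine, and in practice the hop list would come from following the old URLs with something like `curl -IL`):

```python
# Sketch: given the ordered list of URLs observed while following a
# redirect chain, flag chains that jump between registered hostnames.
from urllib.parse import urlsplit

def crosses_domains(hops):
    """Return True if a redirect chain ever jumps between hostnames
    (www-stripped), i.e. one site is still feeding its history into
    another."""
    hosts = {urlsplit(u).hostname.lower().removeprefix("www.") for u in hops}
    return len(hosts) > 1

# The chain described in this thread: the .com 301s into the .net.
print(crosses_domains([
    "http://www.burnworld.com/some-article/",
    "http://www.burnworld.net/some-article/",
]))  # True: the old site's history travels with the redirect
```

After removing the 301, every old-site chain should come back `False` (or the old URLs should simply stop resolving to the new domain at all).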
And that's why Panda/Penguin recovery with a new site is so tough. You have to rebuild from the ground up with no help from your previous site. Google has clearly told you it considers your existing content of little value as a search result. Moving that content to a new domain isn't going to change that opinion. Only new, higher-quality content (as well as new additional authority/ranking factors) will.
Sorry to be such a downer on your first post (welcome to the SEOMoz Clan, by the way!) but you have a steep road ahead of you and it would be unfair to mislead you otherwise.
Paul
-
When other people start grabbing your content, or you start grabbing theirs, the sites that survive filters and Panda are usually the most powerful ones with the most domain authority.
I had a couple hundred pages on one of my sites that appeared verbatim on many other sites, and it got hit with a Panda problem... and this is a PR7 domain, and the problem pages were PR4 and PR5.
I escaped panda in a few months by deleting some of the problem content and noindexing the rest. I was lucky to have thousands of pages of original content left after getting rid of the dupes.
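For anyone wanting to replicate that kind of noindex cleanup, it helps to audit which pages actually ended up carrying the robots noindex tag. A small sketch using only the standard library (the class and function names are mine, not from EGOL's setup; in practice you'd feed it each page's fetched HTML):

```python
# Sketch: detect whether a page's HTML carries a robots noindex meta tag.
from html.parser import HTMLParser

class RobotsMetaFinder(HTMLParser):
    """Collect the content attribute of any <meta name="robots"> tag."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "robots":
            self.directives.append(a.get("content", "").lower())

def is_noindexed(html):
    finder = RobotsMetaFinder()
    finder.feed(html)
    return any("noindex" in d for d in finder.directives)

page = '<html><head><meta name="robots" content="noindex,follow"></head></html>'
print(is_noindexed(page))  # True
```

Running this over the duplicate URLs confirms the noindex actually shipped before you sit back and wait for a recrawl.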
If you have a juvenile site and the scrapers get you or you syndicate or republish... then you have gotten the kiss of death.
That's my opinion. Others might believe different.
-
Hey EGOL, now that you mention content: will changing the content of the affected pages help, or is it just useless? What do you think?
I am working on changing the content of a few pages that stopped ranking; what I mean is I am just rewriting the text in the articles.
Will they be re-ranked with the new content?
-
Yeah, that's what I'm thinking too. All these RSS scraper sites are now outranking the original content. It doesn't seem to make sense to me, though, as all these sites link back to the original article.
Would it make sense to put time into redoing the content to try to outrank my own original content that is now strewn across scrapers and feed sites?
I thought Google made an update that made finding and crediting the original source more reliable. Is that true only for certain 'brands'?
-
I searched Google for sentences from your content between quotes. Like this....
"It’s finally done! Our official review of DVD to iPod Converting Software for 2013."
I saw your site in those SERPs but I also saw lots of other sites with the same content. I saw this for six or seven sentences from various parts of your site.
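That quoted-sentence spot check can be semi-automated by pulling a few long, distinctive sentences out of your page copy to paste into Google in quotes. A sketch (the function name, thresholds, and sample copy are illustrative, not anything EGOL used):

```python
# Sketch: pick long-ish sentences from page copy for quoted Google
# searches, to see who else ranks for your exact wording.
import re

def spotcheck_sentences(text, min_words=8, limit=5):
    """Return up to `limit` sentences of at least `min_words` words,
    each wrapped in double quotes ready to paste into a search box."""
    sentences = re.split(r"(?<=[.!?])\s+", text)
    picks = [s.strip() for s in sentences if len(s.split()) >= min_words]
    return ['"%s"' % s for s in picks[:limit]]

copy = ("It's finally done! Our official review of DVD to iPod "
        "Converting Software for 2013 covers every major tool we tested. "
        "Short line.")
for q in spotcheck_sentences(copy):
    print(q)
```

If the scrapers outrank you for six or seven of your own sentences, that's the same signal EGOL describes above.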
IMO this site with this content is screwed.
I would start working on another project. This one might come back in a couple of years after other sites that have your content die off or your content rolls so deep into their site that it is not getting any more linkjuice.