How long for a sitewide 301 to reindex?
-
Hey Gang,
Finally joined the big boys here, excited to see what we all can do together.
Here is my situation. I have been struggling since Panda 1.0 on a particular site, www.burnworld.com. Over 2011 we figured out what the issues were with the content and went on a major cleanup, which seemed to help toward the end of 2011. However, further Panda updates this year, mainly April's, have struck again. This was after adding a WordPress blog to the site in late 2011, so it was a mix of a traditional HTML site and a WordPress blog. Thinking that this mix could be the issue, in May this year we transferred all the content over to WordPress only. We did keep the same linking structure, using a permalink plugin to set specific URLs.
Forward to Panda 20. This wiped out all rankings, and then we could not even rank for our own content. One site that syndicates our content is now ranking for it instead of us, and many 'feed' sites that scrape our feeds also rank instead of us.
Okay, now to my original question. Two weeks ago we pulled the plug and decided it might be best to start over on www.burnworld.net, since the .net had in the past been a WordPress blog (shut down earlier in 2012) and had sat with about five pages of content until we did the 301.
So today none of the pages are in the main index, and I am wondering if the 301 might have been a mistake, since it points to an existing site that never really ranked. Would it have been better to start on a new domain?
How long have others seen it take before Google puts the pages back in the main index?
I would like to figure out the best action to take to get back into Google's good graces. I'll keep this page updated so others with this issue can hopefully have a resource to turn to.
BTW, nothing has changed with Binghoo; rankings are all the same and they have handled the domain change properly.
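For reference, the redirect we set up is a straight path-preserving sitewide 301; it behaves like this sketch (the function itself is purely illustrative, not the actual server config):

```python
# Illustrative sketch of a path-preserving sitewide 301:
# every old .com URL maps one-to-one onto the new .net domain.
from urllib.parse import urlsplit, urlunsplit

OLD_HOST = "www.burnworld.com"   # the Panda-hit site
NEW_HOST = "www.burnworld.net"   # the redirect target

def redirect_target(url: str) -> str:
    """Return the URL a sitewide 301 would send this request to.

    Path and query string are kept intact, so every old URL has
    exactly one new counterpart (no blanket redirects to the homepage).
    """
    scheme, host, path, query, frag = urlsplit(url)
    if host != OLD_HOST:
        return url  # not our host; leave untouched
    return urlunsplit((scheme, NEW_HOST, path, query, frag))

print(redirect_target("http://www.burnworld.com/dvd-software/review"))
# -> http://www.burnworld.net/dvd-software/review
```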
-
That depends upon why the page stopped ranking. If it was being filtered for duplicate content then rewriting might bring it back if your new content is unique.
If it stopped ranking because you have an overabundance of exact match anchor text links pointing at the page then rewriting will probably not work.
-
I appreciate the feedback, EGOL. Is this statement based on experience with other sites? I don't see how all the content can be worthless.
So, getting back to my original question: how long on average does Google take to add a site back into the index after a 301 redirect? Webmaster Tools does show that 165 of the 170 pages in the sitemap are indexed, but Index Status only shows 12, and that was the number before the switch, so nothing has changed there.
-
Appreciate the feedback Paul, regardless of what it is.
The main reason for the 301 was a complete site redesign: layout, internal linking structure, rewriting/updating/removing content, etc. I was hoping the domain change would be treated as a new site with all the updates that we did, and would eliminate the hornet's nest of the .com, which was a hybrid mix of HTML and WordPress.
Because of all these changes, is Google just filtering out the content until it knows what to do with it, meaning it has to crawl it enough times to know what it is? It's just strange that Binghoo handled the domain change properly but Google has completely filtered it all out.
-
IMO this domain and its content is screwed. Worthless.
I would start fresh on a new domain.
Don't reuse content from the original site. Don't redirect anything from the original site.
I would start on a new domain and not syndicate anything or republish anything.
That's what I would do... others might do differently.
-
Not encouraging to hear, EGOL, but if that's the reality with Google now, then it really does look like they are pushing the little guy out regardless of the site's PR.
So do you think my content has a chance of ever showing back up in the Google SERPs after the sitewide 301 redirect? Should I undo the 301, or would that cause another potential ranking issue?
-
Looking at this purely from a technical perspective, Rob... (from EGOL's observations above you have content issues as well)
I think you've misunderstood the biggest challenge to recovering from a heavy Panda hammering with a new site.
When you create a new site for recovery purposes you must completely dissociate it from the original damaged site. By 301-redirecting the old URLs to the new, you've simply told Google that the old site (with all its algorithmic penalties) now lives at the new URL. So the new site inherits all the de-ranking of the old site.
This is why Panda recovery is so tough - you need to literally start again from scratch with the new site. You can't even easily redirect the human traffic from the old site to the new without risking bringing the "penalties" with them.
And this makes sense from the Search Engine perspective. If they've de-ranked your existing site, it wouldn't make sense to allow you to just point to a new domain and sidestep the penalties. Otherwise, nefarious webmasters would just run manipulative sites as hard as they could until they got penalised, then point to a new domain and pick up where they left off until they got caught again.
Looks to me like you're going to need to completely disconnect the two sites (I'm even paranoid enough to suggest a new Google Analytics account and new hosting/domain registration if you want to go that far) and start a significant content rewrite and new content program.
And that's why Panda/Penguin recovery with a new site is so tough. You have to rebuild from the ground up with no help from your previous site. Google has clearly told you it considers your existing content of little value as a search result. Moving that content to a new domain isn't going to change that opinion. Only new, higher-quality content (as well as new additional authority/ranking factors) will.
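To make the inheritance point concrete, here's a toy model (emphatically not how Google actually computes anything): whatever score is attached to a URL, positive or negative, simply follows its 301 to the destination.

```python
# Toy model of signal inheritance through 301s. Purely conceptual:
# it only illustrates that a redirect forwards the old URL's
# accumulated signals, negative ones included, to the target.
def resolve(url, redirects):
    """Follow a chain of 301s to its final destination."""
    seen = set()
    while url in redirects and url not in seen:
        seen.add(url)
        url = redirects[url]
    return url

def effective_score(url, scores, redirects):
    """A page's score includes whatever 301s into it."""
    total = scores.get(url, 0)
    for source in redirects:
        if resolve(source, redirects) == url:
            total += scores.get(source, 0)
    return total

redirects = {"burnworld.com/page": "burnworld.net/page"}
scores = {"burnworld.com/page": -50}  # algorithmic demotion on the old URL

print(effective_score("burnworld.net/page", scores, redirects))  # -50
```

In other words, the new domain starts not at zero but at the old domain's deficit.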
Sorry to be such a downer on your first post (welcome to the SEOMoz Clan, by the way!) but you have a steep road ahead of you and it would be unfair to mislead you otherwise.
Paul
-
When other people start grabbing your content, or you start grabbing theirs, the sites that survive filters and Panda are usually the most powerful sites with the most domain authority.
I had a couple hundred pages on one of my sites that appeared verbatim on many other sites, and it got hit with a Panda problem... and this is a PR7 domain, and the Panda-problem pages were PR4 and PR5.
I escaped panda in a few months by deleting some of the problem content and noindexing the rest. I was lucky to have thousands of pages of original content left after getting rid of the dupes.
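(For anyone following along: the "noindexing" step above is typically done with a robots meta tag on each duplicate page; in WordPress an SEO plugin usually sets this per post. A sketch:)

```html
<!-- In the <head> of each duplicate/thin page you want dropped
     from the index, but whose links should still be crawled: -->
<meta name="robots" content="noindex, follow">
```

The equivalent `X-Robots-Tag: noindex` HTTP header does the same job for non-HTML files.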
If you have a juvenile site and the scrapers get you or you syndicate or republish... then you have gotten the kiss of death.
That's my opinion. Others might believe differently.
-
Hey EGOL, now that you mention content: do you think changing the content of the affected pages will help, or is it just useless?
I am working on changing the content of a few pages that stopped ranking; what I mean is I am just rewriting the text in the articles.
Will they be re-ranked with the new content?
-
Yeah, that's what I'm thinking too. All these RSS scraper sites are now outranking the original content. That does not seem to make sense to me, though, as all these sites link back to the original article.
Would it make sense to put time into redoing the content to try to outrank my original content that is now strewn across scrapers and feed sites?
I thought that Google made an update that made finding and crediting the original source more reliable. Is that true only for certain 'brands'?
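(For context: the mechanism Google documents for crediting syndicated content is a cross-domain rel=canonical, which the syndication partner would need to place on their copy of the article. Scrapers obviously won't do this, and a mere link back is a much weaker signal. The article URL below is hypothetical:)

```html
<!-- Placed by the syndication partner in the <head> of THEIR copy: -->
<link rel="canonical" href="http://www.burnworld.com/example-original-article/">
```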
-
I searched Google for sentences from your content between quotes. Like this....
"It’s finally done! Our official review of DVD to iPod Converting Software for 2013."
I saw your site in those SERPs but I also saw lots of other sites with the same content. I saw this for six or seven sentences from various parts of your site.
IMO this site with this content is screwed.
I would start working on another project. This one might come back in a couple of years after other sites that have your content die off or your content rolls so deep into their site that it is not getting any more linkjuice.
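The check I ran can be scripted if you want to spot-check many sentences: this sketch just builds the exact-match (quoted) Google query URLs to paste into a browser.

```python
# Sketch of the manual duplicate check described above: take
# distinctive sentences from your pages and turn each one into
# an exact-match (quoted) Google search URL.
from urllib.parse import quote_plus

def exact_match_queries(sentences):
    """Build Google search URLs that quote each sentence verbatim."""
    return ["https://www.google.com/search?q=" + quote_plus(f'"{s}"')
            for s in sentences]

samples = [
    "It's finally done! Our official review of DVD to iPod Converting Software for 2013.",
]
for url in exact_match_queries(samples):
    print(url)
```

If a query returns many other domains ranking above (or instead of) yours, that sentence's page has a duplication problem.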