Does this make sense to recover from Panda?
-
Hello guys, our website was pandalized on 9/27/2012 and we haven't been able to recover since then.
I've fixed as much as possible when it comes to poor content, and we have been getting high quality links consistently for the past 3-4 months.
Our blog had some duplicate content issues due to categories, tags, feeds, etc. I solved those problems before the past 2 refreshes without success.
I'm considering moving the blog to a subdomain. More than PR, I'm interested in recovering from Panda and letting the blog grow on its own. What do you think about that?
-
I have in mind doing 301 redirects.
-
Subdomaining will take time.
I subdomained a lot of content, and almost 2 months later, Google is still looking for the content on the main domain, still reporting it as missing, still marking site health poorly because of those pages, and still showing them all in both places in the search index.
One possibility, after you see the new pages indexed, is to use the Webmaster Tools URL removal feature. You only have a few hundred blog pages, so that won't take long.
After two months, I wish I could tell you that it fixed my problem, but it hasn't.
Subdomaining the blog is not likely to hurt you. Remember that you will need to set it up with a separate robots.txt file and a separate sitemap.
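As a sketch of what that separate robots file might look like (example.com and the disallowed paths are placeholders — adjust to whatever category/tag/feed URLs caused your duplicate content):

```
# robots.txt served at http://blog.example.com/robots.txt
# Each hostname needs its own robots file; the main site's
# robots.txt does not apply to the subdomain.
User-agent: *
Disallow: /tag/
Disallow: /feed/

# The subdomain also gets its own sitemap:
Sitemap: http://blog.example.com/sitemap.xml
```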
Also,
1. be sure you've covered all the bases: main-site links that point to the blog must now use the fully qualified URL.
2. be sure that any links on the blog that point to the main site are also fully qualified URLs, not relative ones, which would become broken links after the move.
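For the 301 redirects mentioned earlier, here is a minimal sketch assuming the site runs on Apache with mod_rewrite (example.com is a placeholder for your domain):

```apache
# .htaccess on www.example.com (hypothetical names)
# Permanently redirect every old /blog/ URL to the same
# path on the new subdomain, preserving the rest of the URL.
RewriteEngine On
RewriteRule ^blog/(.*)$ http://blog.example.com/$1 [R=301,L]
```

Using a 301 (rather than 302) tells Google the move is permanent, which should eventually consolidate the old URLs onto the subdomain — though as noted above, that can take months.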
-
Thank you for your comment, I'm highly considering this move.
-
Yes, for sure Panda; that's what Webmaster Tools and Analytics say. The blog does link to our main site, but only from the /blog home page. I did this on purpose: I didn't want hundreds of internal links with the same anchor text.
I'll have a look at our privacy policy on the blog and might replace it with the one from TRUSTe.
Regarding the business resources page, I agree that the content doesn't offer much for Google to see, so I might consider "noindex" for it.
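If you do go the "noindex" route for that page, the tag belongs in the page's head section. A hypothetical snippet:

```html
<!-- In the <head> of the Business Resources page.
     "noindex" keeps the page out of the search index;
     "follow" still lets crawlers follow its outbound links. -->
<meta name="robots" content="noindex, follow">
```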
As for traffic loss: we lost most of our organic traffic from Google. We are doing well on Yahoo and Bing (improving every week, actually).
The poor links you see come from negative SEO done against us in May '12. Thankfully, most of those comments are nofollow.
Thank you. Please keep the ideas coming!
-
Dave, are you sure that was Panda?
What does your webmaster tools account tell you?
Have you fixed all duplicate titles, descriptions and pages?
What traffic have you lost?
Was it only Google, or Yahoo and Bing as well?
Here are some observations:
First, look at your link profile - it doesn't look good to me
I didn't see a link from your blog to your home page
You have no terms of use link on the blog pages
There are two privacy policies: one for the blog, and one for the main site, but the latter is hosted on the TRUSTe site
The privacy policy on the blog says nothing about cookies or collected email addresses.
The Business Resources page has no content the search engines can see.
The "Our Programs" menu item does nothing
Some pages don't have much content.
-
I agree; I think a fresh new blog on a new subdomain will help fix the issue. Aside from losing PR on the blog, I don't see any real issues with it. My advice is to go for it!
Hope that helps!
Related Questions
-
Internal Duplicate Content - Classifieds (Panda)
I've been wondering for a while now how Google treats internal duplicate content within classified sites. It's quite a big issue, with customers creating their ads twice, I'd guess to avoid the price of renewing, or perhaps to put themselves back at the top of the results. Out of 10,000 pages crawled and tested, 250 (2.5%) were duplicate adverts. Similarly, in terms of the search results pages, the site structure allows the same advert(s) to appear under several unique URLs. A prime example: on one page we have already filtered down to 1 result, but the left-hand side filters all return that same 1 advert. Tools like Siteliner and Moz Analytics just highlight these as urgent high-priority issues, but I've always been sceptical. On a large scale, would this count as Panda food in your opinion, or does Google understand that the nature of classifieds is different, and treat it as such? Appreciate thoughts. Thanks.
Intermediate & Advanced SEO | | Sayers1 -
How to recover google rank after changing the domain name?
I just started doing SEO for a new client. The case is a bit unique, as they built a new website and for some reason launched it under another domain name. The old name is foodstepsinasia.com and the new one is foodstepsinasiatravel.com. The OLD one is a respected website with 35 MOZ page authority and +15,000 incoming links (104 root domains). The NEW one is currently at 0. The programmer who built the new website set it up so that when people type or find the old domain name, it redirects to the front page of the new website under the new domain name. This caused my friends to lose a lot of their rankings, so I believe it was a very bad solution. But I also think I can get most of the old rankings back, and my question is what to do now to recover as many of the rankings as possible, as fast as possible? A) I believe I must change the domain name back to foodstepsinasia.com on the new website? Or B) Should I find the URLs with the most page authority on the old website and recreate those URLs on the new website, or should I redirect them to pages with related content? Looking forward to feedback from someone who has experience with similar cases. Thanks!
Intermediate & Advanced SEO | | nm19770 -
Recovering from index problem (Take two)
Hi all. This is my second pass at the problem. Thank you for your responses before, I think I'm narrowing it down! Below is my original message. Afterwards, I've added some update info. For a while, we've been working on http://thewilddeckcompany.co.uk/. Everything was going swimmingly, and we had a top 5 ranking for the term 'bird hides' for this page - http://thewilddeckcompany.co.uk/products/bird-hides. Then disaster struck! The client added a link with a faulty parameter in the Joomla back end that caused a bunch of duplicate content issues. Before this happened, all the site's 19 pages were indexed. Now it's just a handful, including the faulty URL (thewilddeckcompany.co.uk/index.php?id=13) This shows the issue pretty clearly. https://www.google.co.uk/search?q=site%3Athewilddeckcompany.co.uk&oq=site%3Athewilddeckcompany.co.uk&aqs=chrome..69i57j69i58.2178j0&sourceid=chrome&ie=UTF-8 I've removed the link, redirected the bad URL, updated the sitemap and got some new links pointing at the site to resolve the problem. Yet almost two months later, the bad URL is still showing in the SERPs and the indexing problem is still there. UPDATE OK, since then I've blocked the faulty parameter in the robots.txt file. Now that page has disappeared, but the right one - http://thewilddeckcompany.co.uk/products/bird-hides - has not been indexed. It's been like this for several weeks. Any ideas would be much appreciated!
Intermediate & Advanced SEO | | Blink-SEO0 -
Should I make multiple landing pages for different cities?
I am trying to market my company to North Carolina & West Virginia. This is a bit of a challenge since the name is "Decorative Concrete of Virginia." My idea was to create landing pages for the specific areas (Greensboro & Raleigh, NC for now)... A new landing page theme that I purchased came with a plugin that lets you generate a ton of landing pages with little effort by replacing some elements of the landing page, depending on the URL... For example, I have these two URLs set up right now: http://www.decorativeconcreteofvirginia.com/northcarolina/test/raleigh/nc http://www.decorativeconcreteofvirginia.com/northcarolina/test/greensboro/nc My question is... Is merely changing the city in each landing page enough, or should I change some of the other content too? I was going to create one landing page for NC, and then try to include all of the cities on that one page... but perhaps it would be easier to rank if I had one for each city. Any thoughts on this would be greatly appreciated. Thanks! Tim
Intermediate & Advanced SEO | | Timvroom0 -
How to make an AJAX site crawlable when PushState and #! can't be used?
Dear Mozzers, Does anyone know a solution to make an AJAX site crawlable if: 1. You can't make use of #! (with HTML snapshots) due to tracking in Analytics 2. PushState can't be implemented Could it be a solution to create two versions of each page (one without #!, so campaigns can be tracked in Analytics & one with #! which will be presented to Google)? Or is there another magical solution that works as well? Any input or advice is highly appreciated! Kind regards, Peter
Intermediate & Advanced SEO | | ConversionMob0 -
Panda Updates - robots.txt or noindex?
Hi, I have a site that I believe has been impacted by the recent Panda updates. Assuming that Google has crawled and indexed several thousand pages that are essentially the same, and the site has now passed the threshold to be picked out by the Panda update, what is the best way to proceed? Is it enough to block the pages from being crawled in the future using robots.txt, or would I need to remove the pages from the index using the meta noindex tag? Of course, if I block the URLs with robots.txt then Googlebot won't be able to access the pages in order to see the noindex tag. Does anyone have any previous experience of doing something similar? Thanks very much.
Intermediate & Advanced SEO | | ianmcintosh0 -
Best free way to make our NAPs consistent - online software maybe?
Hello, What's the best free tool or method for making our local SEO citations consistent? We have more than one name and phone number out there, and there are a lot of citations already.
Intermediate & Advanced SEO | | BobGW0 -
How to make Google forget my pages ?
Hello all! I've decided to delete many pages from my website which had poor content. I've made a PHP 301 redirect from all these old pages to a single page (not the home page, a deep page). My problem is that this modification was made a week ago and my positions in the SERPs have crashed... What can I do? I believe that I'll climb back up once Google sees that these pages don't exist anymore, but it could take a long time 😞 (these pages are in the Google cache with a date older than my modification's date). I've read somewhere that I should put a link to the destination page (where the old pages are 301 redirected) but I don't understand how that would help... Can someone help me? Tell me what I've done wrong... These pages were very poor and I deleted them in order to boost the overall quality of my site... It should help me in the SERPs, not penalize me...
Intermediate & Advanced SEO | | B-CITY0