Does this make sense to recover from Panda?
-
Hello guys, our website was "pandalized" on 9/27/2012 and we haven't been able to recover since.
I've fixed as much of the poor content as possible, and we have been earning high-quality links consistently for the past 3-4 months.
Our blog had some duplicate content issues due to categories, tags, feeds, etc. I solved those problems before the past two refreshes, without success.
I'm considering moving the blog to a subdomain. More than PageRank, I'm interested in recovering from Panda and letting the blog grow on its own. What do you think about that?
-
I have 301 redirects in mind for the old blog URLs.
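For reference, a minimal sketch of what those redirects could look like on Apache, assuming the blog currently lives under /blog/ on the main domain and blog.example.com is a placeholder for the new subdomain:

```apache
# .htaccess on the main domain (paths and hostname are hypothetical; adjust to your setup).
# Permanently (301) redirect every /blog/ URL to the same path on the subdomain.
RewriteEngine On
RewriteRule ^blog/(.*)$ https://blog.example.com/$1 [R=301,L]
```

A 301 tells Google the move is permanent, so whatever equity the old blog URLs have should be passed along to the subdomain over time.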
-
Subdomaining will take time.
I subdomained a lot of content, and almost two months later Google is still looking for the content on the main domain: it still reports those pages as missing, still marks site health poorly because of them, and still shows them in both places in the search index.
One possibility, after you see the new pages indexed, is to use the Webmaster Tools URL removal feature. You only have a few hundred blog pages, so that won't take long.
After two months, I wish I could tell you that it fixed my problem, but it hasn't.
Subdomaining the blog is not likely to hurt you. Remember that you will need to set it up with its own robots.txt file and its own sitemap.
Also,
1. Be sure you've covered all the bases: main-site links that point to the blog must now use the fully qualified URL.
2. Be sure that any links on the blog pointing back to the main site are also fully qualified URLs, not relative ones, which would become broken links after the move.
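As a sketch of the first point about a separate robots file: the robots.txt served by the new subdomain (blog.example.com is a placeholder) might be as simple as this, since each hostname gets its own robots.txt and sitemap:

```text
# https://blog.example.com/robots.txt -- hypothetical; served by the subdomain, not the main site.
User-agent: *
Disallow:

Sitemap: https://blog.example.com/sitemap.xml
```

Having its own robots.txt and sitemap also lets you verify the subdomain as a separate site in Webmaster Tools.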
-
Thank you for your comment; I'm seriously considering this move.
-
Yes, for sure Panda; that's what Webmaster Tools and analytics say. The blog does link to our main site, but only from the /blog home page. I did this on purpose: I didn't want hundreds of internal links with the same anchor text.
I'll have a look at our privacy policy on the blog and might replace it with the one from TRUSTe.
Regarding the Business Resources page, I agree that the content doesn't offer much for Google to see, so I might consider "noindex" for it.
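For what it's worth, a robots meta tag is the usual way to do that; a minimal sketch (which page it goes on is up to you):

```html
<!-- In the <head> of the Business Resources page. -->
<!-- noindex keeps the page out of the search index; follow still lets link equity flow through its links. -->
<meta name="robots" content="noindex, follow">
```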
Traffic loss: we lost most of our organic traffic from Google. We are doing well on Yahoo and Bing (improving every week, actually).
The poor links you see come from negative SEO done against us in May '12. Thankfully most of those comments are nofollow.
Thank you. Please keep the ideas coming!
-
Dave, are you sure that was Panda?
What does your Webmaster Tools account tell you?
Have you fixed all duplicate titles, descriptions and pages?
What traffic have you lost?
Was it only Google, or Yahoo and Bing as well?
Here are some observations:
First, look at your link profile; it doesn't look good to me.
I didn't see a link from your blog to your home page.
You have no terms-of-use link on the blog pages.
There are two privacy policies: one for the blog and one for the main site, but the latter is hosted on the TRUSTe site.
The privacy policy on the blog says nothing about cookies or collected email addresses.
The Business Resources page has no content the search engines can see.
The "Our Programs" menu item does nothing.
Some pages don't have much content.
-
I agree; I think a fresh new blog on a new subdomain will help fix the issue. Aside from losing PR on the blog, I don't see any real issues with it. My advice: go for it!
Hope that helps!
Related Questions
-
SEO agency makes "hard to believe" claims
Hi, I operate in the highly competitive "sell house fast" niche in the UK. Sites in the top 1-3 tend to have thousands of links, some of them spammy, and those sites have domain authority too. My site has good content (http://propertysaviour.co.uk) and is listed in around 12 well-known directories. I have been building backlinks manually over the last 3-4 months. The SEO agency we are looking to work with claims they can get my website to the first page for the above keyword. How would you go about this strategy? What questions would you ask the SEO agency? What elements can I do myself? By the way, I am good at producing content!
Intermediate & Advanced SEO
-
E-Commerce Panda Question
I'm torn. Many of our niche ecommerce products rank well, but I'm concerned that duplicate content is negatively affecting our overall rankings via the Panda algorithm. Here is an example that can be found across quite a few products on the site. This sub-category page (http://www.ledsupply.com/buckblock-constant-current-led-drivers) in our 'led drivers' --> 'luxdrive drivers' section has three products that are virtually identical, with much of the same content on each page, except for their 'output current', sort of like a shirt sold in different size attributes: S, M, L and XL. I could realistically condense 44 product pages (similar to the example above) down to 13 within this sub-category section alone (http://www.ledsupply.com/luxdrive-constant-current-led-drivers). Again, we sell many of these products and rank OK for them, but given how Panda works I believe this structure could be compromising our overall Panda 'quality score' and keeping our traffic from increasing. Has anyone had similar issues and found it worth the risk to condense product pages by adding attributes? If so, do I make the new pages and just 301 all the old URLs, or is there a better way?
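If the pages are consolidated, the usual approach is indeed to 301 each retired product URL to its consolidated page. A sketch on Apache (the source paths here are made up for illustration; the target is the sub-category page from the question):

```apache
# .htaccess: map each near-duplicate driver page to the one consolidated page.
# Source paths are hypothetical examples, not real URLs from the site.
Redirect 301 /buckblock-350ma /buckblock-constant-current-led-drivers
Redirect 301 /buckblock-500ma /buckblock-constant-current-led-drivers
Redirect 301 /buckblock-700ma /buckblock-constant-current-led-drivers
```

The 301s consolidate whatever rankings and links the old pages had onto the surviving page instead of leaving dead URLs behind.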
Intermediate & Advanced SEO
-
How to make AJAX content crawlable from a specific section of a webpage?
Content is located in a specific section of the webpage that is loaded via AJAX.
Intermediate & Advanced SEO
-
Loss of traffic due to domain move, not recovering
I have a new client who this year chose to abandon a "stronger", older domain (domain authority 50) in favor of a newer, weaker domain (domain authority 38). The redirects actually started at the end of 2013 and happened over time, page by page and section by section; all were completed by Jan 12, 2014. While 301 redirects are in place and the robots.txt is disallowing all (187 pages blocked), it looks as though Google is still indexing pages (149 indexed), although I'm not sure why. Perhaps they should be removed from the server?

In spite of the redirects, they are not getting the (combined) traffic expected. Should they have had that expectation? Could it be that, because they moved from a "stronger", long-established domain to a "weaker", newer one, it may simply take a long time to recover? Another agency recently reviewed the links on the weaker domain and submitted a file to Google disavowing the links they found to be "toxic", but it doesn't seem to have made any difference yet. Any idea how long it should take to make a difference, if it will indeed make one?

They do have a blog in a sub-directory that doesn't get much traffic (approx. 0.50% of the total). Every post used to end with a blatant self-promotion; because of Penguin, they have recently begun to mix up their link text and no longer include a link in every post. Lastly, their target audience is both B2B and B2C, with B2B the priority.

The big question: do you see changes take place with almost instant results in Google, or am I right in telling him this will take some time? He feels it's been almost 4 months now and their visibility/traffic should be more on par with what the two domains had combined. Something to note is that they were sort of competing with themselves by using both domains, though the number of searchers probably hasn't changed much. Thank you so much for giving me your 2 cents!
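One detail worth flagging in the setup described above: a blanket robots.txt disallow on the old domain prevents Google from crawling the old URLs, which means it never sees the 301s and so keeps the old pages in the index, which matches the "149 still indexed" symptom. For the redirects to be processed, the old domain's robots.txt needs to allow crawling during the migration, e.g.:

```text
# robots.txt on the OLD domain while the migration is in flight (sketch).
# Crawling must be allowed so Google can follow the 301 redirects;
# a blanket Disallow hides the redirects and keeps old URLs indexed.
User-agent: *
Disallow:
```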
Intermediate & Advanced SEO
-
Does making a copy of a website harm my SEO?
We have set up a demo server on a different domain than our main website domain to test new features before updating the code on the main domain. Does it hurt our SEO? Thanks, everybody.
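A duplicate copy can cause problems if search engines index it, so the usual safeguard is to keep the demo domain out of the index entirely. A minimal sketch is a robots.txt served only by the demo server (demo.example.com is a placeholder):

```text
# robots.txt on the demo domain only (demo.example.com is hypothetical).
# This blocks crawling; for stronger protection, also send an
# "X-Robots-Tag: noindex" header or put the demo behind HTTP authentication.
User-agent: *
Disallow: /
```

Just be careful never to deploy this file to the live domain when copying code between environments.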
Intermediate & Advanced SEO
-
How to compete with duplicate content in post panda world?
I want to fix the duplicate content issues on my eCommerce website. I have read a very valuable blog post on SEOmoz about duplicate content in the post-Panda world and applied every strategy to my website. I want to give one example to explain it: http://www.vistastores.com/outdoor-umbrellas

Non-WWW version: http://vistastores.com/outdoor-umbrellas redirects to the home page.

HTTPS pages: for https://www.vistastores.com/outdoor-umbrellas, I have created a robots.txt file covering all HTTPS pages (https://www.vistastores.com/robots.txt) and set rel=canonical to the HTTP page: http://www.vistastores.com/outdoor-umbrellas

Narrow-by-search: my website has narrow-by-search pages that share the same meta info, such as:
http://www.vistastores.com/outdoor-umbrellas?cat=7
http://www.vistastores.com/outdoor-umbrellas?manufacturer=Bond+MFG
http://www.vistastores.com/outdoor-umbrellas?finish_search=Aluminum
I have blocked all the dynamic pages generated by narrow-by-search with robots.txt (http://www.vistastores.com/robots.txt) and set rel=canonical to the base URL on each of them.

Order-by pages (http://www.vistastores.com/outdoor-umbrellas?dir=asc&order=name): blocked with robots.txt, with rel=canonical set to the base URL.

Pagination pages (http://www.vistastores.com/outdoor-umbrellas?dir=asc&order=name&p=2): blocked with robots.txt, with rel=next and rel=prev set on all paginated pages, plus rel=canonical to the base URL.

I have applied all of these SEO suggestions, but Google is still crawling and indexing 21K+ pages, while my website has only 9K product pages. Google search result: https://www.google.com/search?num=100&hl=en&safe=off&pws=0&gl=US&q=site:www.vistastores.com&biw=1366&bih=520

Over the last 7 days, my website's impressions and CTR have dropped 75%. I want to recover and perform as well as before. I have explained my question at length because I want to recover my traffic as soon as possible.
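For reference, the link elements described above would look something like this on a paginated category page (URLs taken from the examples in the question; the p=3 URL is an assumed next page). One caveat worth noting: pages blocked by robots.txt cannot be crawled, so Google never sees their canonical or prev/next tags; blocking and tagging the same URLs work against each other.

```html
<!-- On page 2 of the category (?dir=asc&order=name&p=2). -->
<!-- These tags only help if the page is crawlable, i.e. NOT blocked in robots.txt. -->
<link rel="canonical" href="http://www.vistastores.com/outdoor-umbrellas">
<link rel="prev" href="http://www.vistastores.com/outdoor-umbrellas?dir=asc&amp;order=name">
<link rel="next" href="http://www.vistastores.com/outdoor-umbrellas?dir=asc&amp;order=name&amp;p=3">
```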
Intermediate & Advanced SEO
-
Making a site backlink-checker proof
Is it possible to make your site unreadable by backlink checkers such as Open Site Explorer? Or would it have a negative effect on search engine rankings? Sorry, I have been in the business for 10 years and it has never crossed my mind. Figured I would say that before I get all the "I can't believe you don't know that" type comments 🙂
Intermediate & Advanced SEO
-
With Panda, which is more important, traffic or quantity?
If you were to prioritize how to fix a site, would you focus on traffic or on quantity of URLs? For example, if 10% of a site had thin content but accounted for 50% of the traffic, and 50% of the site had a different type of thin content but accounted for only 5% of organic traffic, which would you work on first? I realize both need to be fixed, but am unsure which to tackle first (this is an extremely large site). I am also wondering whether the mere presence of thin content on a domain can affect a site even if it isn't receiving any traffic.
Intermediate & Advanced SEO