Strange recovery from Panda
-
I have 2 business sites. www.affordable-uncontested-divorce.com is a Homestead template site which is old and clunky but has given me steady traffic despite little maintenance. It was unaffected by the various Panda updates, and it does load very fast. www.uncontesteddivorce-nyc I put up about 18 months ago; it is a Thesis Theme WordPress site with the usual bells and whistles. I put a lot of work into it, and around May its traffic finally surpassed my old site's.
In June traffic to the new site started tanking, ultimately about 30% off. A friendly SEO thought there was some duplication between the 2 sites and that Google might have seen the older site as the authority site and the newer one as the scraper. I tried the usual fixes and the decline finally bottomed out, but there was no recovery. I read someone who said that WordPress sites are problematic with Panda because of inherent duplicate content issues unless you use them only as a CMS rather than as blogs. So I got rid of all the blog posts save one.
About 3 months ago my traffic started to go up again, and it has now once again surpassed the older site. The strange thing is that since the recovery my Analytics numbers (bounce rate, number of page views, and time on site) have gotten worse, and are much worse on the new site than they are on the old site. Does anyone have any idea what's up?
Thx
Paul
-
Actually, my question (I should have made it clearer) is: why is my site ranking so much better over the past few months even though all the important analytics numbers have simultaneously changed for the worse? It would appear that the various pundits were wrong about how Panda really works.
Paul
-
Hi Paul
Have you looked deeper into your analytics? How do the traffic sources for the two sites compare; are they different? Are you getting traffic from keywords on your new site that you aren't on your old site? If so, what is the bounce rate on those terms? My thought is that you might be ranking well for some keywords that don't provide visitors with the information they are looking for. Alternatively, a page could be so relevant for its search keyword that visitors don't need to dig any deeper into the new site. Time on page should give you a clue as to which of these it is: is the time on page too short for the average person to read the content, and how long does it take you to read it yourself? It might not be this, but I always find that looking deeper into a website's analytics tends to hold the key, especially when you are comparing it to another site you own in the same niche, as you are in the fortunate position of having access to both sites' analytics. Hope this helps.
-
One thing I want to comment on with WordPress and Google is how WordPress builds blog post URLs: the default permalink structures include dates. Google is all about freshness, and this is one of the first things I would change. In the Permalinks section of the admin area, restructure the URLs so that dates are left out of any material you want to keep earning long-term traffic.
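For anyone following along in WordPress, the date-vs-no-date choice lives under Settings → Permalinks. A minimal sketch of the two custom structures being discussed (the example URLs are hypothetical):

```text
# "Day and name" style: puts the publish date in every post URL
/%year%/%monthnum%/%day%/%postname%/
# e.g. example.com/2011/06/15/my-post/

# "Post name" style: no date, so the URL doesn't advertise the content's age
/%postname%/
# e.g. example.com/my-post/
```

Note that changing permalinks on an established site means the old date-based URLs should be 301-redirected to the new ones, or existing rankings and inbound links will be lost.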
Related Questions
-
301 redirect recovery
Hello. Please understand that my English is poor. I used to run a site called A; now I am running a site called B. I needed to set up a temporary 302 redirect from A to B, but I accidentally set a 301 redirect instead. Site A has many spam links. For now I have removed the 301 redirect pointing to B. Will A's spam links affect site B? For your reference, Site B is putting a lot of effort into SEO. Help me.
Intermediate & Advanced SEO | jinseok
-
Site recovery after manual penalty, disavow, SSL, Mobile update = but dropped again in May
I have a site that has had a few problems over the last year. We had a manual penalty in late 2013 for bad links, some from guest blogs and some from spammy sites. Reconsideration requests had me disavow almost all of the incoming links. Later in 2014, the site was hit with link-injection malware and had another manual penalty. That was cleared up and the manual penalty removed in Jan 2015. During this time the site was moved to SSL, but there were some redirect problems. By Feb 2015 everything was cleared up and an updated disavow list was added. The site recovered in March and did great. A mobile version was added in April. About May 1st, rankings dropped again. Traffic is about 40% off its March levels. Recently I read that a new disavow file will supersede an old one, and if all of the original domains and URLs aren't included in the new disavow file they will no longer be disavowed. Is this true? If so, is it possible that a smaller disavow file uploaded in Feb would cause rankings to drop after the May 3 Quality update? Can I correct this by disavowing all the previously disavowed domains and URLs? Any advice for determining why the site is performing poorly again? We have well-written content and regular blogs, nothing that seems like it should violate the Google guidelines.
Intermediate & Advanced SEO | Robertjw
-
Penguin recovery, no manual action. Are our EMD sites killing our brand site?
Hi guys, Our brand site (http://urban3d.net) has been seeing a steady decline due to algorithm updates for the past two years. Our previous SEO company engaged in some black-hat link building which has hurt us very badly. We have recently re-launched the site with better design and better content, and completed a disavow of hundreds of bad links. The site is technically indexed, but is still nowhere in the SERPs after months of work to recover it by our internal marketing team. The last SEO company also told us to build EMD sites for our core services, which we did: http://3dvisualisation.co.uk/ http://propertybrochure.com/ http://kitchencgi.com/ My question is: could these EMD sites now be hurting us even further and stopping our main brand site from ranking? Our plan is to rescue our brand site, with a view to retiring these outlier sites. However, with no progress on the brand site, we can't afford to remove these sites (which are ranking). It seems a bit chicken-and-egg. Any advice would be very much appreciated. Aidan, Urban 3D
Intermediate & Advanced SEO | aidancass
-
How long for Panda 4.1 fixes to take effect?
Hi, if you have been hit by Panda 4.1 and are now putting fixes in place (for this example, let's say you remove a load of duplicate content, and that's what caused the problem), how long would it take for that fix to take effect? Do you have to wait for the next Panda update, or will it be noticed on the next crawl? Thanks.
Intermediate & Advanced SEO | followuk
-
Very Slow Recovery after Manual Penalty Removed - Are we missing something?
Our site was handed a manual penalty in November 2013, where exact-match anchor text and low-quality directory submissions seemed to be the problem. We began the process of link removal, reconfiguration and disavowing. We had already planned to change our domain in early 2014 to coincide with our SSL certificate renewal, and although we were hesitant to do this with the manual penalty still in place, we proceeded: we 301'd most of the site but left the pages that were the landing pages for most of the exact-match links as 302s to the new domain. We continued to work on removing the manual penalty from the old domain, as we didn't want it to pass over to the new one, and eventually it was removed in March 2014. Now that the penalty is gone, are we safe to change those 302 redirects to 301s so everything redirects? The problem we have is that, six months on, a lot of the pages for the old domain are still indexed, and even though the new domain is indexed, our rankings haven't recovered. Is it just a case of needing to build up a new quality link profile to replace the links that were disregarded or removed while recovering from the penalty, or are we missing something else?
Intermediate & Advanced SEO | Ham1979
-
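For readers wondering what the 302/301 split described in that question might look like in practice, here is a rough Apache .htaccess sketch. The paths and domain are hypothetical, and this assumes Apache with mod_alias available:

```apache
# Most of the site: permanent moves that pass signals to the new domain
Redirect 301 /about/ https://newdomain.example/about/

# Landing pages targeted by the penalized exact-match links:
# temporary redirects, so the old URLs keep their baggage for now
Redirect 302 /cheap-widgets-landing/ https://newdomain.example/widgets/

# After the manual action is revoked, the 302s can be switched to 301s
```

The design idea is that a 301 signals a permanent move (and consolidates signals, good and bad, onto the destination), while a 302 signals a temporary one, which is why the poster held the penalized pages back on 302s until the penalty was lifted.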
Magento Duplicate Content Recovery
Hi, we switched platforms to Magento last year. Since then our SERP rankings have declined considerably (no sudden drop on any Panda/Penguin date lines). After investigating, it appeared we had neglected to noindex,follow all our filter pages, and our total indexed pages rose sevenfold in a matter of weeks. We have since fixed the noindex issue, and the pages indexed are now below what we had pre-switch to Magento. We've seen some positive results in the last week. Any ideas when/if our rankings will return? Thanks!
Intermediate & Advanced SEO | Jonnygeeuk
-
How to compete with duplicate content in a post-Panda world?
I want to fix the duplicate content issues on my eCommerce website. I have read a very valuable blog post on SEOmoz regarding duplicate content in a post-Panda world and applied every strategy to my website. Let me give one example: http://www.vistastores.com/outdoor-umbrellas
Non-WWW version: http://vistastores.com/outdoor-umbrellas redirects to the home page.
HTTPS pages: for https://www.vistastores.com/outdoor-umbrellas I have created a robots.txt file covering all HTTPS pages (https://www.vistastores.com/robots.txt) and set rel=canonical to the HTTP page, http://www.vistastores.com/outdoor-umbrellas
Narrow-by-search: my website has narrow-by-search navigation and contains pages with the same meta info, such as http://www.vistastores.com/outdoor-umbrellas?cat=7 http://www.vistastores.com/outdoor-umbrellas?manufacturer=Bond+MFG http://www.vistastores.com/outdoor-umbrellas?finish_search=Aluminum I have blocked all the dynamic pages generated by narrow-by-search via robots.txt (http://www.vistastores.com/robots.txt) and set rel=canonical to the base URL on each dynamic page.
Order-by pages: http://www.vistastores.com/outdoor-umbrellas?dir=asc&order=name I have blocked all of these with robots.txt and set rel=canonical to the base URL.
Pagination pages: http://www.vistastores.com/outdoor-umbrellas?dir=asc&order=name&p=2 I have blocked these with robots.txt, set rel=next & rel=prev on all paginated pages, and also set rel=canonical to the base URL.
I have done and applied all these SEO suggestions to my website, but Google is still crawling and indexing 21K+ pages, while my website has only 9K product pages. Google search result: https://www.google.com/search?num=100&hl=en&safe=off&pws=0&gl=US&q=site:www.vistastores.com&biw=1366&bih=520
Over the last 7 days, my website's impressions & CTR have dropped 75%. I want to recover and perform as well as before. I have explained my question at length because I want to recover my traffic as soon as possible.
Intermediate & Advanced SEO | CommercePundit
-
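One caveat worth flagging on the setup in that question: a URL blocked in robots.txt cannot be crawled, so Google never sees the rel=canonical placed on it, which is one plausible reason blocked parameter URLs linger in the index. For sanity-checking which URLs a Google-style wildcard rule actually blocks, here is a small illustrative matcher in Python (the Disallow patterns are assumptions mirroring the question's filter parameters, not the site's real robots.txt; Python's built-in urllib.robotparser follows the older spec and ignores `*`, so it would misjudge these rules):

```python
import re

def rule_blocks(pattern: str, url_path: str) -> bool:
    """Google-style robots.txt path matching: '*' matches any run of
    characters, and a trailing '$' anchors the pattern to the end."""
    anchored = pattern.endswith("$")
    body = pattern[:-1] if anchored else pattern
    regex = "".join(".*" if ch == "*" else re.escape(ch) for ch in body)
    regex = "^" + regex + ("$" if anchored else "")
    return re.search(regex, url_path) is not None

# Hypothetical Disallow patterns for the faceted-navigation parameters
RULES = ["/*?cat=", "/*?manufacturer=", "/*?dir="]

def is_blocked(url_path: str) -> bool:
    # A path is blocked if any Disallow pattern matches it
    return any(rule_blocks(r, url_path) for r in RULES)

print(is_blocked("/outdoor-umbrellas"))                     # base category page
print(is_blocked("/outdoor-umbrellas?cat=7"))               # filter page
print(is_blocked("/outdoor-umbrellas?dir=asc&order=name"))  # sort page
```

If a check like this confirms the parameter URLs are blocked, the usual advice is to pick one mechanism: either let Google crawl them and honor the canonical, or block them, but not both at once.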
Why specify 'robots' instead of 'googlebot' for a Panda-affected site?
Daniweb is the poster child for sites that have recovered from Panda. I know one strategy she mentioned was de-indexing all of her tagged content, for example: http://www.daniweb.com/tags/database Why do you think more Panda-affected sites aren't specifying 'googlebot' rather than 'robots', to keep capturing traffic from Bing & Yahoo?
Intermediate & Advanced SEO | nicole.healthline
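For context on the 'robots' vs 'googlebot' distinction in that question: the meta tag's name attribute scopes which crawlers obey the directive. A minimal sketch:

```html
<!-- Honored by all major crawlers: Google, Bing, Yahoo, etc. -->
<meta name="robots" content="noindex, follow">

<!-- Honored only by Google's crawler; Bing and Yahoo may still index the page -->
<meta name="googlebot" content="noindex, follow">
```

So a Panda-hit site that only needs to hide thin pages from Google could use the googlebot variant and keep earning Bing and Yahoo traffic from those same pages, which is the trade-off the question is getting at.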