Mobile Site Panda 4.2 Penalty
-
We are an ecommerce company that outsources its mobile site to a third-party service; the mobile site lives at m.ourdomain.com. We pass Google's mobile-friendly test.
Our product page content on the mobile site is woefully thin (typically under 100 words), and it appears we were hit by Panda 4.2 on the mobile site. Starting at the end of July, our mobile rankings dropped, and our mobile traffic is now about half of what it was. We are working to correct the content issue, but that obviously takes time.
So here's my question: if our mobile site got hit by Panda 4.2, could that have a negative effect on our desktop site?
-
My gut says that this is not a Panda hit. Any sites that I saw get hit with Panda in July were hit on exactly July 17th or 18th.
The decision to noindex product pages (I'm assuming you meant noindex rather than nofollow?) is one that would likely need some in-depth investigation before you commit to it. In most cases, though, I do not recommend noindexing products.
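(For reference, the distinction matters: nofollow only tells crawlers not to follow a page's links, while noindex removes the page from the index. The page-level directive would be a robots meta tag, e.g.:)

```html
<!-- In the <head> of a product page: drop it from the index,
     but still let crawlers follow its links -->
<meta name="robots" content="noindex, follow">
```

The usual rationale for "noindex, follow" over a plain "noindex" is to keep crawlers passing through the page's links even while the page itself is excluded from results.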
A drop from #1 to #3 does not sound typical of a Panda hit. That said, I have seen some Panda-hit sites start with a slight decline and continue to fall from there. But again, my gut says this is not Panda. A drop from #1 to #3 could be caused by a whole bunch of things, and I'm guessing that having some thin product pages is not your main problem.
This sounds like a question that is probably a little too detailed to be answered in Q&A. If you are able to provide the URL, you may get better feedback; otherwise you'll mostly be getting gut instincts and opinions.
-
Yep. We got hit around 7/22. We're an ecommerce site, and most of our 2,000 products have little to no search value; we were focused on roughly 50 of them. We took about a 20% hit on desktop and 35% on mobile. We had one high-value product (100K+ monthly searches) ranked #1 that dropped to #3, and at first I attributed the traffic drop to that. Digging a little deeper, I realized that we got hit. Traffic initially dropped maybe 10%, but it's been a slow erosion ever since.
All but a handful of products have descriptions of 40-60 words. We are in the process of creating new content, but it's expensive.
Does it make sense to nofollow product pages that have no search value?
-
I can't say that I have heard of a case where just the mobile site was affected by Panda. As far as I know, Panda doesn't distinguish between mobile and desktop, but I could be wrong.
Did the drop happen at the end of July, or was it July 17/18? The latter is when Panda refreshed. Although it was a rolling refresh, we really didn't see much Panda action near the end of July.
My money would be on some type of technical issue surrounding the move to an m-dot site.
Related Questions
-
Can someone explain why this site ranks #1 and #2? I am confused.
Hi everyone, here is my problem. This site, https://247ride.com/town-cars/, ranks for a bunch of really good keywords, such as "lax car service," "car service to lax," etc. The keyword does not appear in the title tag and appears only partially in the meta description. The site's DA is 23 and its PA is 22, with fewer than 29 links overall and 8 linking domains. Why, in your opinion, is Google ranking this page #1 and #2 for so many competitive keywords? I know it's a hard question to ask, but any input would be greatly appreciated. I am working really hard to rank for the same keywords, and so far I am at position #11. Thanks in advance, Davit
Intermediate & Advanced SEO | Davit1985
-
Two sites with same content
Hi everyone, I am running two listing websites. Websites A and B are marketplaces. Website A has approximately 12k listing pages; website B has approximately 2k pages from one specific brand. All 2k listings on website B also exist on website A with the same URL structure, just under a different domain name. The header and footer change a little bit, but the body is the same code. The listings on website B are all partners of a specific insurance company, and this insurance company pays me to maintain their website. They also look at the organic traffic going into this website, so I cannot robots-block or noindex it. How can I be as transparent as possible with Google? My idea was to apply a canonical on website B (the insurance partner website) pointing to the corresponding listing on website A, which would signal that the best version of the product page is on website A. So, for example: www.websiteb.com/productxxx would have a canonical pointing to www.websitea.com/productxxx, and www.websiteb.com/productyyy would have a canonical pointing to www.websitea.com/productyyy. Any thoughts? Cheers
Intermediate & Advanced SEO | Evoe
-
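For illustration, the cross-domain canonical this question describes is a single link element in the head of each website B listing, pointing at the matching website A URL (the domains here are the question's own placeholders):

```html
<!-- In the <head> of www.websiteb.com/productxxx -->
<link rel="canonical" href="https://www.websitea.com/productxxx" />
```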
Disavow without penalty
Hi fellow Mozzers, I have come up with a doubt today which I would appreciate your thoughts on. I have always been convinced that the disavow tool can be used at any time as part of your backlink monitoring activities: if you see a dodgy backlink coming in, you should add it to your disavow file if you can't get it removed (which you probably can't). That is to say, the disavow tool can be used pre-emptively to make sure a dodgy link does not do your site any harm. However, this belief of mine took a bit of a beating this morning when another SEO suggested that the disavow tool only has an effect if accompanied by a reconsideration request, and that you can only file a reconsideration request if you have some kind of manual action. By this logic, you can only disavow when you have a penalty. This theory was backed up by this Moz article from May 2013:
https://moz.com/blog/google-disavow-tool
The comments didn't do much to settle my doubts. This Matt Cutts video from November 2013 seems to confirm my belief, however:
https://www.youtube.com/watch?time_continue=86&v=eFJZXpnsRsc
It seems perfectly reasonable that Google allows pre-emptive disavowing, not just because of the whole negative SEO issue, but because nasty links do happen naturally. Not all SEOs spend all their waking hours building links which they know they will have to disavow later should a penalty hit at some point, and it seems reasonable that an SEO should be able to say "Link XYZ is nothing to do with me!" before Google exercises retribution. If, for example, you get hired by a company that HAD a penalty due to spammy link building in the past which has since been lifted, but you see that Google periodically discovers the occasional spammy link, it seems fair that you should be able to tell Google you want to voluntarily remove any "credit" that link is giving you today, so as to avoid a penalty tomorrow. Your help would be much appreciated. Many thanks indeed.
Intermediate & Advanced SEO | unirmk
-
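For reference, the disavow file itself is just a plain-text list uploaded through Search Console: lines starting with # are comments, domain: disavows an entire domain, and bare URLs disavow single pages (the domains below are made-up examples):

```text
# Spammy links found in routine backlink monitoring
# Uploaded pre-emptively; no manual action on the site
domain:spammy-directory.example
http://link-farm.example/page-with-our-link.html
```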
Transferring Domain and redirecting old site to new site and Having Issues - Please help
I have just completed a site redesign under a different domain, on a new WordPress WooCommerce platform. The typical protocol is to submit all the redirects via the .htaccess file on the current site, thereby telling Google the new home of every page on the new site so you maintain your link juice. The problem is that my current site is hosted with Network Solutions, and they do not allow access to the .htaccess file; they say there is no way to redirect the pages other than a script they can employ to push all pages of the old site to the new site's home page. That is of course bad for SEO, so not a solution. They did mention they could also write a script for the home page to redirect just it to the new home page, then place a script on every individual page redirecting each of those. Does this sound plausible? No one at Network Solutions has been able to give me a straight answer. That said, I have discussed this with a few developers, who mentioned a workaround to avoid the above: "The only thing I can think of is: point both domains (www.islesurfboards.com and www.islesurfandsup.com) to the new store, and 301 there. If you kept WooCommerce, WordPress has plugins to 301 pages. So maybe use an A record or CNAME to point the old URL to the new URL/IP, then use .htaccess to redirect the old domain to the new domain, and when that comes through to the new store, set up 301s there for the pages. Example: http://www.islesurfboards.com points to http://www.islesurfandsup.com; then, when the site sees http://www.islesurfboards.com, .htaccess 301s to http://www.islesurfandsup.com, and WordPress uses a 301 plugin for the pages. Not 100% sure if this is the best way, but it might work." Can anyone confirm this process will work, or suggest anything else to redirect my current site on Network Solutions to my new site with the new domain while maintaining the redirects and SEO power?
My domain www.islesurfboards.com has been around for 10 years, so I don't want to flush the link juice down the toilet; I want to redirect everything correctly.
Intermediate & Advanced SEO | isle_surf
-
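For illustration, once the old domain points at a server where you do control .htaccess, a path-preserving 301 from the old domain to the new one (using the question's own domains) would look something like this sketch:

```apache
# In .htaccess on the server both domains point to:
# any request arriving on the old host is sent to the same
# path on the new domain with a permanent (301) redirect.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?islesurfboards\.com$ [NC]
RewriteRule ^(.*)$ https://www.islesurfandsup.com/$1 [R=301,L]
```

If the URL structures differ between the two sites, individual page mappings would still need to be set up on the new store (e.g. via a WordPress redirect plugin, as the quoted developer suggests).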
Bad site migration - what to do!
Hi Mozzers - I'm just looking at a site which has been damaged by a very poor site migration. Basically, the old URLs were 301'd to a single page on the new website (not a 404) telling everyone the page no longer existed; they did not 301 old pages to their equivalent new pages. I just checked Google WMT and saw 1,000 crawl errors - basically the old URLs. This migration was done back in February, since when traffic to the website has never recovered. Should I fix this now? Is it worth implementing the correct 301s after such a time lapse?
Intermediate & Advanced SEO | McTaggart
-
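As a sketch of what implementing the correct 301s could look like at this scale: if the old and new URL structures are parallel, the redirect rules can be generated from the crawl-error list rather than written 1,000 lines by hand. The prefixes below (/shop/, /products/) are made-up placeholders, not anything from the question:

```python
# Hypothetical sketch: turn Search Console's crawl-error URL list into
# one-to-one 301 rules, assuming the new site mirrors the old path structure.

def redirect_rule(old_path, old_prefix="/shop/", new_prefix="/products/"):
    """Map an old URL path to its equivalent on the new site, or None."""
    if not old_path.startswith(old_prefix):
        return None  # no equivalent page exists: let it 404/410 instead
    return new_prefix + old_path[len(old_prefix):]

crawl_errors = ["/shop/red-widget", "/shop/blue-widget", "/old-blog/post-1"]
for old in crawl_errors:
    new = redirect_rule(old)
    if new:
        # Emit an Apache-style rule for each URL with a real equivalent
        print(f"Redirect 301 {old} {new}")
```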
Need help or explanation on my site!
My site has suffered greatly since the recent Google update. I have done everything as suggested: I had all bad links removed over 2 months ago, I lowered keyword density (not easy, since the keyword is in our company name!), and I have rewritten various content and bolstered our existing content. What gives? What can I do? As an example, for the keyword "maysville plumber" I rank about 40th. The first three pages are filled with websites with literally NO content or no added value. Maysville is a town of about 1k residents - there is no competition. Before the update I was #1 for this keyword for years, and this is the case with 35 other cities (mostly small cities, but a few larger ones). Please help me understand, or suggest what I can possibly do at this point. We have hundreds of pages with unique content on each and every page. We have zero duplicate content (I have run tests and crawlers). We have no fishy links. I have not gotten any messages from Google in Webmaster Tools. PLEASE HELP! I asked a similar question a little while back and fixed all of the suggestions. My site is www.akinsplumbing.net.
Intermediate & Advanced SEO | chuckakins
-
Why specify robots instead of googlebot for a Panda-affected site?
Daniweb is the poster child for sites that have recovered from Panda. I know one strategy she mentioned was de-indexing all of her tagged content, for example: http://www.daniweb.com/tags/database. Why do you think more Panda-affected sites aren't specifying 'googlebot' rather than 'robots', to capture traffic from Bing & Yahoo?
Intermediate & Advanced SEO | nicole.healthline
-
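For context, the distinction the question turns on is the name attribute of the noindex meta tag, which controls which crawlers obey it:

```html
<!-- Applies to all crawlers: page drops out of Google, Bing, Yahoo, etc. -->
<meta name="robots" content="noindex">

<!-- Applies to Google's crawler only: Bing and Yahoo can still index the page -->
<meta name="googlebot" content="noindex">
```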
Does a mobile site count as duplicate content?
Are there any specific guidelines that should be followed for setting up a mobile site to ensure it isn't counted as duplicate content?
Intermediate & Advanced SEO | nicole.healthline
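On the duplicate-content question above: the commonly recommended pattern for separate mobile URLs is a bidirectional annotation - rel="alternate" on the desktop page pointing at the mobile URL, and rel="canonical" on the mobile page pointing back - so the two versions are treated as one document rather than duplicates. With placeholder domains:

```html
<!-- On the desktop page, www.example.com/page-1 -->
<link rel="alternate" media="only screen and (max-width: 640px)"
      href="http://m.example.com/page-1">

<!-- On the mobile page, m.example.com/page-1 -->
<link rel="canonical" href="http://www.example.com/page-1">
```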