Penguin Update Issues: What would you recommend?
-
Hi,
We've been pretty badly hit by this Penguin update. Site traffic is down 40-50%.
We suspect it's for a couple of reasons:
1) Google is saying we have duplicate content. For a given category we'll have 4-5 pages of content (products), so it's saying pagenum=2, pagenum=3, etc. are duplicate pages. We've implemented rel=canonical so that pagenum=2 points to the original category, e.g. http://mydomain/widgets.aspx
We've even specified pagenum as a URL parameter that paginates. Google still hasn't picked up these changes. How long does it take? It's been about a week.
2) They're saying we have soft 404 errors. E.g. when we remove a category or product, we point users to a category or a "page not found" page. Is it best to block Googlebot from crawling these pages by specifying them in robots.txt? We really don't care about these categories or product pages. How best to handle this?
3) There are some bad directories and crawlers that have crawled our website but have generated incorrect links, so we've got about 1,700 "product not found" errors. I'm sure that's taking up a lot of crawl time. How do we tell Google not to bother with these links coming from specific sources, e.g. ignore all links coming from xxx.com?
Any help will be much appreciated, as this is killing our business.
Jay
-
Hey Ben,
Thank you so much for your response.
I'm pretty sure it was the Penguin update that brought our rankings down.
We don't participate in any paid linking, no blog networks, etc.
The only thing we did was submit to article directories, which I understand are frowned upon now, so we'll move away from that.
We'll try to get all the non-existent pages to return 404 codes, clear up any duplicate page title and page content errors, and hope that we'll get back in Google's good graces.
-
Hi Jay,
Sorry to hear it's hurting your business so much.
Have you double-checked the dates of your decrease in traffic against the Penguin update? There were a lot of big changes going on around that time, so it's worth being sure it was Penguin.
In answer to question 3: if they're external sites, then I don't think those 1,700 404s are having a negative effect on your SEO. If those directories are hurting you at all through the Penguin update, it would be through over-optimised anchor text (although I haven't seen any definitive data on this).
In answer to question 2: would I be right in thinking that you're using a 301 or a 302 to send users to a generic error page? However you're generating soft 404s, the best fix is to make them real 404 errors so the server returns a 404 status code. The details of setting up a custom 404 page are pretty well documented around the web, so you shouldn't have much trouble with it.
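To make that concrete: a "soft 404" is a removed URL that answers with anything other than a real 404/410 status. Here's a rough sketch of how you might classify crawl results for URLs you know have been removed (a hypothetical helper for illustration, not part of any particular crawler):

```python
def classify_removed_url(status, requested_url, final_url):
    """Classify the server's answer for a URL known to be removed.

    A real 404/410 tells Google the page is gone. A 200 body that
    says "not found", or a redirect to a generic error page that
    then returns 200, is a soft 404.
    """
    if status in (404, 410):
        return "real 404"
    if status == 200 and final_url != requested_url:
        return "soft 404: redirected to an error page"
    if status == 200:
        return "soft 404: 200 on a missing URL"
    return "check manually"

# The pattern Jay describes: a removed product redirects to a
# generic page that returns 200
print(classify_removed_url(200,
                           "http://mydomain/old-widget.aspx",
                           "http://mydomain/not-found.aspx"))
```

The fix is to make the first branch the one that fires: serve the custom "not found" page content, but with the server sending a 404 status code rather than a redirect or a 200.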
In answer to question 1: have you tried checking whether Google has re-cached your pages since the change? It's also probably worth looking at the rel="prev"/rel="next" markup. Maile Ohye from Google has released a pretty comprehensive video on pagination and SEO, so I'd recommend checking that out.
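For reference, rel="prev"/rel="next" lets each page in the series carry its own self-referencing canonical instead of pointing pages 2+ at page 1 (which effectively tells Google those pages are duplicates rather than a sequence). A rough sketch of generating the tags, assuming a ?pagenum= URL scheme like Jay's:

```python
def pagination_link_tags(base_url, page, last_page):
    """Build the <head> link tags for page `page` of a paginated
    category, where page 1 lives at the bare category URL and
    later pages use ?pagenum=N (Jay's URL scheme)."""
    url = lambda n: base_url if n == 1 else f"{base_url}?pagenum={n}"
    # Each page canonicalizes to itself, not to page 1
    tags = [f'<link rel="canonical" href="{url(page)}" />']
    if page > 1:
        tags.append(f'<link rel="prev" href="{url(page - 1)}" />')
    if page < last_page:
        tags.append(f'<link rel="next" href="{url(page + 1)}" />')
    return tags

for tag in pagination_link_tags("http://mydomain/widgets.aspx", 2, 5):
    print(tag)
```

The exact markup Google expects is covered in the Maile Ohye video mentioned above; this just shows the shape of the output for one page in the middle of a series.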
Related Questions
-
Search Console Change of Address Tool Issue
We're currently migrating a few "event" mini-sites to a main site that will have a subfolder for each event. Example: newsite.com/event1. The issue is that Search Console is not able to verify this kind of redirect:
example.com --> 301 --> newsite.com/event
Do you know any workaround for this? I was thinking of using a subdomain instead, which would in turn redirect to the /event subfolder, but with each hop it will diminish the link's strength. I'd prefer not to leave it as a subdomain, as data gets mashed up in Google Analytics with subdomains and we have seen worse ranking results with subdomains. Any help is greatly appreciated.
Intermediate & Advanced SEO | RichardUK
-
Redirect Issue in .htaccess
Hi, I'm stumped on this, so I'm hoping someone can help. I have a WordPress site that I migrated to https about a year ago. Shortly after, I added some code to my .htaccess file. My intention was to force https and www on all pages. I did see a moderate decline in rankings around the same time, so I feel the code may be wrong. Also, when I run the domain through Open Site Explorer, all of the internal links show 301 redirects. The code I'm using is below. Thank you in advance for your help!

# Redirect HTTP to HTTPS
RewriteEngine On

# ensure www.
RewriteCond %{HTTP_HOST} !^www\. [NC]
RewriteRule ^ https://www.%{HTTP_HOST}%{REQUEST_URI} [L,R=301]

# ensure https
RewriteCond %{HTTP:X-Forwarded-Proto} !https
RewriteCond %{HTTPS} off
RewriteRule ^ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]

# BEGIN WordPress
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteBase /
RewriteRule ^index\.php$ - [L]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule . /index.php [L]
</IfModule>
# END WordPress

# USER IP BANNING
<Limit GET POST>
order allow,deny
deny from 213.238.175.29
deny from 66.249.69.54
allow from all
</Limit>

# Enable gzip compression
AddOutputFilterByType DEFLATE text/plain
AddOutputFilterByType DEFLATE text/html
AddOutputFilterByType DEFLATE text/xml
AddOutputFilterByType DEFLATE text/css
AddOutputFilterByType DEFLATE application/xml
AddOutputFilterByType DEFLATE application/xhtml+xml
AddOutputFilterByType DEFLATE application/rss+xml
AddOutputFilterByType DEFLATE application/javascript
AddOutputFilterByType DEFLATE application/x-javascript

# Setting header expires
<IfModule mod_expires.c>
ExpiresActive on
ExpiresDefault "access plus 1 month"
ExpiresByType application/javascript "access plus 1 year"
ExpiresByType image/x-ico "access plus 1 year"
ExpiresByType image/jpg "access plus 14 days"
ExpiresByType image/jpeg "access plus 14 days"
ExpiresByType image/gif "access plus 14 days"
ExpiresByType image/png "access plus 14 days"
ExpiresByType text/css "access plus 14 days"
</IfModule>

Intermediate & Advanced SEO | JohnWeb12
-
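One way to sanity-check redirect rules like these is to model what they should do: any single request should reach the final https://www. URL in at most one 301 hop. A small Python sketch of the intended behaviour (a simplified model for reasoning about the rules, not a parser of the actual .htaccess):

```python
from urllib.parse import urlsplit, urlunsplit

def intended_redirect(url):
    """Model the two rewrite blocks: non-www hosts go straight to
    https://www. in one hop (fixing scheme and host together);
    www hosts on plain http go to https. Returns the redirect
    target, or None if no redirect should fire."""
    parts = urlsplit(url)
    if not parts.netloc.startswith("www."):
        return urlunsplit(("https", "www." + parts.netloc,
                           parts.path, parts.query, ""))
    if parts.scheme != "https":
        return urlunsplit(("https", parts.netloc, parts.path,
                           parts.query, ""))
    return None

# Every variant resolves in a single hop:
print(intended_redirect("http://example.com/page"))       # scheme + host fixed at once
print(intended_redirect("http://www.example.com/page"))   # scheme fixed
print(intended_redirect("https://www.example.com/page"))  # None: no redirect needed
```

If the rules behave like this model, the 301s Open Site Explorer reports on internal links likely mean the links themselves are written as http:// or non-www URLs; updating them to the final https://www. form removes the hop entirely.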
Portfolio Image Landing Page Question/Issue
Hello, We have a client with a very image-heavy website. They have Portfolio pages with a large number of images. We are currently working on adding more copy to the site, but wanted to confirm we are taking the right approach for the images.
Under the current structure, each image has its own landing page (with no copy) and is fed into a Portfolio page. We know this is not ideal; it would be best to have the images on the Portfolio page directly, or to fill out the landing pages with copy, but given the number of images (and the fact that these are only images, not "targeted" pages) that isn't really feasible. Aside from the thin-content concern, these individual landing pages were being indexed, so the site shows hundreds of pages in its sitemap.xml and in GSC even though it only has a few actual pages.
In the meantime we went into each image page and placed a canonical tag back to the main Portfolio page (with the hope of adding content to that page and treating it as the overarching page). Would this be the right approach? We considered "noindex, follow" tags but want the images to be crawled. Since the images are not on the Portfolio page itself, are we canonicalizing these pages to nothing? Any insight would really be appreciated. Thank you in advance.
Intermediate & Advanced SEO | Ben-R
-
Favorite SEO firm you would recommend
Is there a favorite SEO firm that you would recommend? Is there any site that ranks the top firms in the country?
Intermediate & Advanced SEO | movieguide
-
[eCommerce Issues] Having a tough time writing content for product color variations. Any recommendations?
Wow, after being hit with Panda I'm having a really tough time with this issue. Maybe I'm going about it the wrong way. How can I possibly write unique content for all of these different colors of the same product?
http://www.suddora.com/green-sweatbands-wholesale-green-wristbands.html
http://www.suddora.com/pink-sweatbands-wholesale-pink-wristbands.html
http://www.suddora.com/black-sweatbands-wholesale-black-wristbands.html
http://www.suddora.com/green-headbands-wholesale-pricing-available.html
http://www.suddora.com/pink-headbands-wholesale-pricing-available.html
http://www.suddora.com/black-headbands-wholesale-pricing-available.html
Should I be going about this a different way? Thanks, Paul
Intermediate & Advanced SEO | Hyrule
-
REL canonicals not fixing duplicate issue
I have a ton of querystrings in one of the apps on my site, as well as pagination, both of which caused a lot of duplicate errors. I added rel=canonical tags via a PHP condition, so the tag is output every time a specific querystring (which only exists on these pages) occurs. The rel=canonical notification shows up in my campaign now, but all of the duplicate errors are still there. Did I do it right and just need to ignore the duplicate errors? Is there further action to be taken? Thanks!
Intermediate & Advanced SEO | Ocularis
-
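The PHP condition described here boils down to normalizing away the non-content querystrings so every variant of a page declares the same canonical target. Sketched in Python for illustration (the parameter names are hypothetical):

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Hypothetical non-content parameters that create duplicate URLs
IGNORED_PARAMS = {"sort", "sessionid", "ref"}

def canonical_url(url):
    """Strip ignored querystring parameters so every variant of a
    page points rel=canonical at the same clean URL."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query)
            if k not in IGNORED_PARAMS]
    return urlunsplit((parts.scheme, parts.netloc, parts.path,
                       urlencode(kept), ""))

print(canonical_url("http://example.com/app?page=2&sort=price&sessionid=abc"))
# -> http://example.com/app?page=2
```

Worth noting on the "errors are still there" part: canonical tags are a hint, and reporting tools only clear duplicate warnings after they re-crawl the affected URLs, so a lag between adding the tags and the errors disappearing is normal.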
Duplicate content issue for franchising business
Hi All, We are in the process of adding a franchise model to our existing stand-alone business, and part of the package given to each franchisee will be a website with content identical to our existing website apart from some minor details such as contact and address information. This creates a huge duplicate content issue, and even if we implement a canonical approach it would still be unfair to the franchisee in terms of their marketing and own SEO efforts. The URL for each franchise will be unique, but the content will be the same to a large extent. The nature of the service we offer (professional qualifications) is such that the "products" can only be described in a certain way, and it will be near impossible to have a unique set of "product" pages for each franchisee. I hope that some of you have come across a similar problem or have suggestions or ideas for how to get round this. Kind regards, Peter
Intermediate & Advanced SEO | masterpete
-
Two Brands One Site (Duplicate Content Issues)
Say your client has a national product that's known by different brand names in different parts of the country. Unilever owns a mayonnaise sold east of the Rockies as "Hellmanns" and west of the Rockies as "Best Foods". It's marketed the same way: same slogan, graphics, etc.; only the logo/brand is different. The websites are near-identical apart from the logos, especially the interior pages. The Hellmanns version of the site has earned slightly more domain authority. Here is an example recipe page for a "Waldorf Salad Wraps by Bobby Flay" recipe:
http://www.bestfoods.com/recipe_detail.aspx?RecipeID=12497&version=1
http://www.hellmanns.us/recipe_detail.aspx?RecipeID=12497&version=1
Both recipe pages are identical except for one logo. Neither page ranks very well, neither has earned any backlinks, etc. Oddly, the Best Foods version ranks better (even though everything is the same, with the same backlinks, and hellmanns.us having more authority). If you were advising the client, what would you do? You would ideally like the Hellmanns version to rank well for East Coast searches and the Best Foods version for West Coast searches. So do you:
1) Keep both versions with duplicate content, and focus on earning location-relevant links, i.e. earn Yelp reviews from East Coast users for Hellmanns and West Coast users for Best Foods?
2) Cross-domain canonical to give more of the link juice to only one brand, so that only one of the pages ranks well for non-branded keywords (but both sites would still rank for their branded keywords)?
3) Noindex one of the brands, so that only one version gets in the index and ranks at all? The other brand wouldn't even rank for its branded keywords.
Assume it's not practical to create unique content for each brand (the obvious answer).
Note: I don't work for Unilever, but I have a client in a similar position. I lean towards #2, but the social media firm on the account wants to do #1 (obviously some functionally-based bias in both our opinions, but we both just want to do what will work best for the client). Any thoughts?
Intermediate & Advanced SEO | crvw