Penguin Update Issues: What would you recommend?
-
Hi,
We've been pretty badly hit by this Penguin update. Site traffic is down 40-50%.
We suspect it's for a couple of reasons:
1) Google is saying we have duplicate content. E.g. for a given category we will have 4-5 pages of content (products), so it's saying pagenum=2, pagenum=3, etc. are duplicate pages. We've implemented rel=canonical so that pagenum=2 points to the original category, e.g. http://mydomain/widgets.aspx
We've even specified pagenum as a URL parameter that paginates. Google still hasn't picked up these changes. How long does it take? It's been about a week.
2) They're saying we have soft 404 errors. E.g. when we remove a category or product, we point users to a "category or page not found" page. Is it best to block Googlebot from crawling these pages by specifying them in robots.txt? We really don't care about these categories or product pages. What's the best way to handle this?
3) There are some bad directories and crawlers that have crawled our website but have published incorrect links, so we've got around 1,700 "product not found" errors. I'm sure that's taking up a lot of crawl time. How do we tell Google not to bother with links coming from specific sources, e.g. ignore all links coming from xxx.com?
Any help will be much appreciated, as this is killing our business.
Jay
-
Hey Ben,
Thank you so much for your response.
I'm pretty sure it was the Penguin update that brought our rankings down.
We don't participate in any paid linking, blog networks, etc.
The only thing we did was submit to article directories, which I understand are frowned upon now, so we'll move away from that.
We'll try to get all the non-existent pages to return 404 codes, clear any duplicate page title and page content errors, and hope that we get back into Google's good graces.
-
Hi Jay,
Sorry to hear it's hurting your business so much.
Have you double-checked the dates of your decrease in traffic against the Penguin update? There were a lot of big changes going on around that time, so it's worth being sure it was Penguin.
In answer to question 3 - If they're external sites then I don't think those 1700 404s are having a negative effect on your SEO. If those directories are hurting you at all through the Penguin update then it would be through over-optimised anchor text (although I haven't seen any definitive data on this).
In answer to question 2 - Would I be right in thinking that you're using a 301 or a 302 to send users to a generic error page? However you're generating soft 404s, the best fix is to make them real 404 errors so the server returns a 404 status code. The details of setting up a custom 404 page are pretty well documented around the web, so you shouldn't have much problem with it.
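To illustrate the idea, here's a minimal sketch in Python/Flask (purely illustrative - your site is ASP.NET, and the product data and URLs here are made up): removed products serve a friendly "not found" page but with a genuine 404 status code, instead of redirecting to a generic page.

```python
from flask import Flask, abort, render_template_string

app = Flask(__name__)

# Hypothetical catalogue; removed SKUs are simply absent from it.
PRODUCTS = {"widget-100": "Widget 100"}

@app.errorhandler(404)
def not_found(error):
    # Show a friendly page, but keep the real 404 status code so
    # crawlers don't treat the URL as a live page (a "soft 404").
    return render_template_string("<h1>Sorry, that product is gone.</h1>"), 404

@app.route("/products/<sku>")
def product(sku):
    if sku not in PRODUCTS:
        abort(404)  # real 404, not a 302 to a generic error page
    return render_template_string("<h1>{{ name }}</h1>", name=PRODUCTS[sku])

if __name__ == "__main__":
    app.run()
```

The same principle applies on an ASP.NET site: serve the custom "not found" page with a 404 status code rather than redirecting visitors to an error URL.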
In answer to question 1 - Have you tried checking to see if Google has re-cached your pages since the change? It's probably worth looking at rel=prev / rel=next markup as well. Maile Ohye from Google has released a pretty comprehensive video on the topic of pagination and SEO, so I'd recommend checking that out.
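As a rough sketch of that pagination markup (using the widgets.aspx URL and pagenum parameter from the question - the function name and exact pattern are just for illustration, not an official prescription), each paginated page can point a canonical at itself and link to its neighbours with rel=prev/next, rather than every page canonicalising to page 1:

```python
def pagination_head_tags(base_url: str, page: int, total_pages: int) -> list[str]:
    """Build the <head> link tags for one page of a paginated category."""
    def page_url(n: int) -> str:
        # Page 1 is the bare category URL; later pages carry the pagenum parameter.
        return base_url if n == 1 else f"{base_url}?pagenum={n}"

    tags = [f'<link rel="canonical" href="{page_url(page)}" />']
    if page > 1:
        tags.append(f'<link rel="prev" href="{page_url(page - 1)}" />')
    if page < total_pages:
        tags.append(f'<link rel="next" href="{page_url(page + 1)}" />')
    return tags


# Example: tags for page 2 of a 5-page category.
print("\n".join(pagination_head_tags("http://mydomain/widgets.aspx", 2, 5)))
```

Page 2 would then carry a canonical pointing to itself, rel="prev" to the main category URL, and rel="next" to page 3.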
Related Questions
-
Shopify Website Page Indexing issue
Hi, I am working on an eCommerce website on Shopify. When I try indexing my newly created service pages, they are not getting indexed on Google. I also tried manually indexing each page and submitted a sitemap, but the issue still doesn't seem to be resolved. Thanks!
Intermediate & Advanced SEO | Bhisshaun
-
Google Disavow File Format and Moz Spam Score Updates
Hi, is there a defined file format for the Google disavow file name? Does it have to be disavowlinks.txt, or can we name it something like domain-name-date.txt? Also, since Google does not share its data with Moz, how does Moz update its spam score after we disavow the bad links? Do we need to connect Google Search Console with Moz?
Intermediate & Advanced SEO | Sunil-Gupta
-
What's up with the last Google update?
I have numerous clients who were in the top 3 spots on page one. They all dropped to page 2, 3, or 4, and now they are number 1 in maps or in the top 3. Content is great on all these sites. Backlinks are high quality and we do not build in high quantity; we always focus on quality. The sites have authorship information and trust. We have excellent content written by professionals in the industry for each of the websites. The sites load super fast and are very mobile friendly. We have a CDN installed. Content is organized per topic. All of our citations are set up properly, with no duplicate or missing citations. The code is good on the websites. We do not have anchor text links pointing to the sites from guest posts or anything like that. We have plenty of content. Our DA/PA is great. Audits of the websites are great. I've been doing this a long time and I've never been so dumbfounded as to what Google has done this time. Or better yet, what exactly is wrong with our clients' websites today that was working perfectly for the last five years? I really am getting frustrated. I'm comparing my sites to competitors and everything is better. Please, someone guide me here and tell me what I'm missing, or tell me what you have done to recover from this nonsense.
Intermediate & Advanced SEO | waqid
-
Ecommerce catalog update: 301 redirects?
Hello Mozzers, We run an ecommerce store and are planning a massive catalog update this month. Essentially, 100% of our product listings will be deleted, and an all-new catalog will be uploaded. The new catalog contains mostly new products; however, there are some products that already existed in the old catalog as well. The new catalog has a bunch of improvements to the product pages, including optimized meta titles and descriptions, multiple languages, optimized URLs, and more. My question is the following: when we delete the existing catalog, all indexed URLs will return 404 errors. Setting up 301 redirects from old to new products (for products which existed previously) is not feasible given the number of products. Also, many products are simply being removed entirely. So should we go ahead and delete all products, upload the new catalog, update the sitemap, resubmit it for crawling, and live with a bunch of 404 errors until these URLs get dropped from Google? The alternative I see is setting 301 redirects to the home page, but I am not sure this would be correct use of 301 redirects. Thanks for your input.
Intermediate & Advanced SEO | yacpro13
-
Weird Indexation Issue
On this webpage, we have an interactive graphic that allows users to click a navigational element and learn more about an anatomical part of the knee or a knee malady. For example, a user could click "Articular Cartilage" and they will land on this page: http://www.neocartimplant.com/knee-anatomy-maladies/anatomy/articular-cartilage The weird thing is that whether you perform a Google search for the above URL or for a string of text on that URL (i.e. "Articular cartilage is hyaline cartilage (as opposed to menisci, which consists of fibrocartilage) on the articular surfaces, or the ends, of bones. This thin, smooth tissue lines both joint surfaces where the bones come together to form the knee."), the following page ranks: http://www.neocartimplant.com/anatmal/knee-anatomy-maladies/anatomy/articular-cartilage.php I have two questions: 1 - Any idea how Googlebot is getting to that page? 2 - How should I get Googlebot to index the correct page (http://www.neocartimplant.com/knee-anatomy-maladies/anatomy/articular-cartilage)? Thanks in advance for your help!
Intermediate & Advanced SEO | davidangotti
-
Google Fetch Issue
I'm having some problems with what Google is fetching and what it isn't, and I'd like to know why. For example, Google IS fetching a non-existent page but listing it as an error: http://www.gaport.com/carports but the actual URL is http://www.gaport.com/carports.htm. Google is NOT able to fetch http://www.gaport.com/aluminum/storage-buildings-10x12.htm. It says the page doesn't exist (even though it does), and when I click on the "not found" link in Google Fetch it adds %E@%80%8E to the URL, causing the problem. One theory we have is that this may be some sort of server/hosting problem, but that's only really because we can't figure out what we could have done to cause it. Any insights would be greatly appreciated. Thanks and Happy Holidays! Ruben
Intermediate & Advanced SEO | KempRugeLawGroup
-
Really, is there much difference between an unnatural links warning and Penguin?
We know that the unnatural links warnings are manual and that Penguin is algorithmic. (I'm not talking about the latest round of confusing unnatural links warnings, but the ones sent out months ago that eventually resulted in a loss of rankings for those who didn't clean their link profiles up.) Is there much difference in the recovery process for either? From what I can see, both are about unnatural/spammy linking to your site. The only difference I can see is that once you feel you've cleaned up after getting an unnatural links warning you can file a reconsideration request. But, if you've cleaned up after a Penguin hit you need to wait for the next Penguin refresh in order to see if you've recovered. Are there other differences that I am not getting?
Intermediate & Advanced SEO | MarieHaynes