Affected by Penguin 2.0 and lost rankings! Which of the following causes is responsible?
-
Hi Moz Community,
We lost our rankings for almost all of our keywords due to Penguin 2.0.
I'm trying to rectify the situation and get our rankings back, and better. Here are the potential causes of our rankings drop:
1. My company has the exact same site (in English) on different country-specific domains: companyname.com, companyname.ch, companyname.ca, companyname.nl, etc.
We are using link rel="alternate" hreflang="en" href="http://companyname.com"
**Would this solve the duplicate content issue?**
2. All the international domains have footer links on their homepages pointing to our main .com site. Google Webmaster Tools shows every page on these international domains as a backlink source.
Would that be a problem? Should I ask our web designers to remove those footer links from the international domains?
3. In the past, our company had a partnership with a forum that promoted our company's services. I don't know exactly how that forum promoted our site, but Google Webmaster Tools shows some 25K backlinks from it! Yet when I visit those pages, none of them have any links to our site. Could the links be hidden? Or is something wrong with GWT?
I'd appreciate your valuable suggestions to help us understand the situation better...
Thank you.
-
Hey Chris,
I had the same reaction when I first started with this company: why would you need all these sites with the exact same content? They have different plans and ideas, or were probably misguided by some SEO guy!
Regarding the vast number of links from the forum: none of the backlink tools (OSE, Ahrefs) show those links; only GWT does.
I'm thinking of disavowing that forum's domain. Hopefully that will work!
Thanks for the response.
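PS: for anyone reading later, the disavow file is just a plain-text file uploaded via GWT's Disavow Links tool; a domain-level entry looks like this (the forum domain below is a placeholder, not the real one):

```text
# disavow.txt — uploaded via Google Webmaster Tools' Disavow Links tool
# Lines starting with # are comments; "domain:" disavows every link from that domain
domain:partnerforum.example.com
```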
-
You need to understand that Penguin is an anti-webspam filter, so you have to keep control of your site's backlink profile. The profile must look natural: not only your targeted keywords as anchor text, and not only backlinks pointing at your homepage. To understand the structure of your backlink profile you can use Majestic SEO or Ahrefs (OSE, in my opinion, is not as good). Often, just from checking a backlink profile, I can clearly see what is wrong.
-
What's the purpose of having all those duplicate sites? Why not just 301 them and eliminate that as a possible cause of the problem--especially if some of them are showing up in the results? That would take care of #2, as well. As far as #3, have you tried any other tools to see if you can look at those links--OSE, MajesticSEO, or Ahrefs? Have you communicated with the webmaster there about them, documented it, and disavowed them?
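If you do go the 301 route, here is a minimal sketch of what that could look like on one of the country domains (assuming an Apache server with mod_rewrite; the domain is the placeholder from the question):

```apache
# Hypothetical .htaccess on companyname.ch: 301-redirect every URL
# to the same path on the main .com site
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?companyname\.ch$ [NC]
RewriteRule ^(.*)$ http://companyname.com/$1 [R=301,L]
```

The same rule, with the host swapped, would go on each of the other ccTLDs.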
-
Hi Tom,
First of all, thanks a lot for your reply.
The funny thing is, despite having rel="alternate" included on our international domains, some of them are ranking, and in fact outranking, our .com pages for some keywords.
How is that possible?
And your suggestion about adding nofollow to the footer links is something I'm thinking of doing.
-
Hi Eduard
First of all, if your site was affected by the Penguin 2.0 update, then your duplicate page content wouldn't have been the trigger. Duplicate content is handled by the Panda algorithm (which is now part of the main algorithm), so if the drop came during the Penguin update, you could have a number of different problems with your site as well.
My advice would be to run your site through the Panguin tool: http://www.barracuda-digital.co.uk/panguin-tool/
This will line up your organic traffic with announced Google updates - see if your drop matches up with the Penguin update. If it does, then you have a number of things to check about the inbound links to your website.
To answer the questions you posted:
-
That should do the trick - the only problem is that by setting a canonical version you are instructing Google not to rank or consider any of the alternate versions. A more ideal solution in my opinion would be to rewrite the content for the other sites, ensuring no chance of a duplicate content penalty and giving the other sites a chance to rank.
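For completeness, the usual hreflang setup for same-language regional sites is a full, self-referencing set of annotations on every variant, rather than a single tag pointing at the .com. Something like this (the region codes are assumptions based on the ccTLDs mentioned in the question):

```html
<!-- In the <head> of EVERY version of the page (.com, .ch, .ca, .nl),
     each page listing all variants, including itself -->
<link rel="alternate" hreflang="en" href="http://companyname.com/" />
<link rel="alternate" hreflang="en-ch" href="http://companyname.ch/" />
<link rel="alternate" hreflang="en-ca" href="http://companyname.ca/" />
<link rel="alternate" hreflang="en-nl" href="http://companyname.nl/" />
<link rel="alternate" hreflang="x-default" href="http://companyname.com/" />
```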
-
I've seen this be a problem with Penguin before, and every time I've seen it fixed, the links have either been removed or made nofollow. You may want to limit the number of pages the footer links appear on (just core landing pages) and also make them nofollow.
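For clarity, making the footer link nofollow is a one-attribute change; a sketch (the URL is the placeholder from the question):

```html
<!-- Footer link from an international domain back to the main site;
     rel="nofollow" tells Google not to pass link equity through it -->
<a href="http://companyname.com/" rel="nofollow">Company Name</a>
```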
Hope this helps