Penguin 2.0 drop due to poor anchor text?
-
Hi,
My website experienced a 30% drop in organic traffic following the Penguin 2.0 update. After years of designing the site with SEO in mind, generating unique content for users, and focusing only on relevant websites in my link-building strategy, I'm a bit disheartened by the drop.
Having rolled out a new design at the start of April, I suspect I've accidentally messed up the structure of the website, making it difficult to crawl or making Google think the site is spammy. Looking at Google Webmaster Tools, the number one anchor text on the site is "remove all filters" - which is clearly not what I want! The "remove all filters" link appears when my hotels page loads with filters, sorting, or availability dates in place - I included it to make it easy for users to view the complete hotel listing again. An example of this link is towards the top right-hand side of this page:
http://www.concerthotels.com/venue-hotels/agganis-arena-hotels/300382?star=2
With over 6,000 venues on my website, this link has the potential to appear thousands of times, and while the anchor text is always "remove all filters", the destination URL differs depending on the venue the user is looking at. I'm guessing that to Google, this looks VERY spammy indeed!?
I tried to make the filtering/sorting/availability pages less visible to Google's crawler when I designed the site, through the use of forms, jQuery, JavaScript, etc., but it does look like the crawler is managing to reach these pages and find the "remove all filters" link. What is the best approach when a standard "clear all..." type link is required on a listing page, without making the link appear spammy to Google? It's a link that exists purely to benefit the user - not to cause trouble!
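For what it's worth, one common belt-and-braces approach to a utility link like this (a sketch only - the URL and parameter name are taken from the example page above and may not cover all of the site's filters) is to mark the link `nofollow` so it passes no anchor-text signal:

```html
<!-- Utility link on a filtered listing page; nofollow tells Google
     not to treat "remove all filters" as meaningful anchor text -->
<a href="/venue-hotels/agganis-arena-hotels/300382" rel="nofollow">remove all filters</a>
```

Pairing that with robots.txt rules for the filter parameters (e.g. `Disallow: /*?star=`) and a `rel="canonical"` on each filtered page pointing at the unfiltered listing would also keep the filtered variants themselves out of the index, rather than relying on JavaScript to hide them.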
My final question to you guys is: do you think this one sloppy piece of work could be enough to cause my site to drop significantly following the Penguin 2.0 update, or is the problem likely to be bigger than this? And if it is probably due to this piece of work, is it likely that solving the problem would result in a prompt rise back up the rankings, or will there be a black mark against my website going forward that slows down recovery?
Any advice/suggestions will be greatly appreciated,
Thanks
Mike
-
Go to Majestic SEO and type in your URL. If the anchor text for the keywords you got penalized for makes up over 10% of your link profile, that is generally why you are being penalized - there are a few exceptions, but not many. I analyzed 440 sites and found that the highest ratio for a site that didn't have keywords in the URL was 2.47%.
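That kind of anchor-text concentration check is easy to run against a backlink export yourself. A minimal sketch (the anchor list here is invented for illustration; a real Majestic or OSE export would come as a CSV you'd parse first):

```python
from collections import Counter

def top_anchor_share(anchors):
    """Return the most common anchor text and its share (%) of all backlinks."""
    counts = Counter(anchors)
    anchor, count = counts.most_common(1)[0]
    return anchor, 100.0 * count / len(anchors)

# Hypothetical profile: one utility phrase dominating the anchor text
anchors = (["remove all filters"] * 60
           + ["ConcertHotels"] * 25
           + ["concerthotels.com"] * 15)
anchor, share = top_anchor_share(anchors)
print(anchor, share)  # → remove all filters 60.0
```

A share that high for a single non-brand phrase is exactly the kind of pattern the 10% rule of thumb is meant to flag.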
Also, I suggest you read this: http://dailyseotip.com/what-other-marketing-firms-want-you-to-believe-that-isnt-true/3356/ I see that you are really focused on on-page SEO; I think this will help you understand more.
The next thing you may want to do is start contacting admins and getting low-quality links deleted, if you have them. Use Open Site Explorer (OSE) to identify the low-quality links. There are only a handful of directories I recommend out there. I have a message from Google telling one of my clients to get rid of their directory links - the example link Google cited came from a directory site, to be exact. Never use a keyword as anchor text at a directory site; always use your brand name or your URL.
Make sure the disavow tool is your last resort, and I highly suggest you get someone with experience to handle it. Many have messed this up and really hurt their websites.
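If disavow does become the last resort, the file Google expects is plain text, one entry per line, with `#` comments allowed - `domain:` entries disavow every link from a site, while a bare URL disavows just that page (the domains below are placeholders):

```text
# Contacted site owner twice, no response
domain:spammy-directory.example

# A single low-quality page rather than the whole domain
http://low-quality-site.example/links.html
```

Documenting your removal attempts in the comments also gives you a record if you later file a reconsideration request.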
Have a great day.
-
Hi Mat,
thanks for your reply. I'll definitely change the link, but I agree that it would be harsh if it was the sole reason for the 30% drop in organic traffic.
There are definitely some directories linking to ConcertHotels.com - at one stage I used the SEOmoz list of directories and got my website listed on some of its recommendations. But my strategy for the last two years has been to approach venues' own websites and ask if they'd be interested in linking to our nearby-hotels page as a useful resource for their visitors. This strategy has worked quite well for me, and it sounds like a very natural, sensible link-building strategy. I'll certainly work through my list of backlinks, but I would hope that the majority are from very relevant websites (given the strategy I adopted). I guess there could be a percentage I've had no control over, however, and I suppose I should disavow those?
As for the directories, should I now be disavowing directory links? I didn't think the percentage of directory links to my site would be that high - I used the directory strategy in the past simply to increase the number of links to my homepage. The strategy I described above achieves links to specific pages throughout my website, not the homepage, so I felt the need to grow the number of homepage links.
Thanks again for your help and advice
Mike
-
That link is not ideal, but I really do not believe that it would cause the sort of drop you are talking about.
If you think you have been hit by Penguin 2.0, then I'd start looking at your backlinks with a critical eye. I just stuck your domain into Majestic SEO and hit a lot of questionable directories pretty quickly. That might be unfair - I certainly haven't analysed it in any depth. However, I took 10 domains at random and 9 were sites that, at best, are not helping you much.
If you're looking for a cause of a drop I'd say you could do worse than going through your backlink profile.