Business Site Hit Hard by the Penguin Update
-
Over the last week or so I have been swimming through all the forum posts regarding the latest Google Penguin update, and I am still unclear as to why my site was hit so hard. I have lost 90% of my traffic and my target keywords have disappeared. Maybe I am missing something here, but could you have a look at my site in question here: http://www.websitetemplatedesign.com/
Is there something that just stands out that I have missed as to why this site would have been hit so hard?
Any input would be appreciated.
-
You need to watch the Whiteboard Fridays.
4:30 Footer Links
8:50 Webpage Spam. In this case, you are building microsites selling the same thing: TEMPLATES.
-
I agree with the other responses, especially about the keyword stuffing in the footer area of the page, which also exists in the top menu. Overdoing the number of internal and external links, especially with the same anchor keyword text again and again and again, is potentially an issue.
Plus, one of the external links is to adultsextemplates. Not sure how much that has to do with it, but when you link out that many times from the same site and all the links are in the same C-block, that's a problem.
I would also say the keyword density for "templates" is too high.
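To make the density point concrete, here's a rough sketch in Python of the kind of check involved (the sample copy is illustrative, not taken from the site in question, and this is not how any Moz tool measures it):

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    """Percentage of words in `text` that exactly match `keyword`."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for word in words if word == keyword.lower())
    return 100.0 * hits / len(words)

# Hypothetical page copy, not taken from the site in question.
page_copy = (
    "Website templates, flash templates, and more templates. "
    "Buy templates from our templates store."
)
print(round(keyword_density(page_copy, "templates"), 1))  # 38.5
```

Anything remotely near a number like that for a single keyword is a red flag; natural copy rarely repeats one word that often.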
-
Link farming could be added to the list as well. Check your low PA & DA links!
Western Cowgirl Collection
"CountryRoad presents this wonderful classic western look with rhinestone buckle and turquoise flat stones on tooled leather."
-
Oh yeah! Even I would hit you with a penguin. You might not like my response, but you asked for it.
-
Keyword stuffing in the footer.
-
Sitewide links in your footer linking to external sites. Why aren't they on the same site? Why are you making microsites? SPAMMING?
-
What's that? OMG same C-Block?
http://www.creloadedpro.com/ IP 72.52.235.211
http://www.adultsextemplates.com/ IP 72.52.235.209
Sorry to be blunt, but you get my point. I found all that within 2 minutes.
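For anyone wanting to repeat that check: a C-block is the first three octets of an IPv4 address, i.e. a /24 network, so two IPs can be compared with Python's standard ipaddress module. A quick sketch:

```python
import ipaddress

def same_c_block(ip_a: str, ip_b: str) -> bool:
    """True if two IPv4 addresses fall in the same /24 ("C-block") network."""
    net_a = ipaddress.ip_network(f"{ip_a}/24", strict=False)
    net_b = ipaddress.ip_network(f"{ip_b}/24", strict=False)
    return net_a == net_b

# The two IPs called out above share 72.52.235.0/24.
print(same_c_block("72.52.235.211", "72.52.235.209"))  # True
print(same_c_block("72.52.235.1", "72.52.236.1"))      # False
```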
-
Related Questions
-
Query on Site Architecture
Hi All, When I check my ecommerce site in one of the architecture tools, my homepage interlinks with 765 pages, whereas when I check a few competitors and big brands, their homepages link to 28, 33, 47, 57 pages etc., not 765 like my site. Am I wrong anywhere? Can you please check the screenshot of mine and one of the competitor's site architecture? Because, as I understand it, site architecture also plays a good role in Google organic ranking.
Technical SEO | | pragnesh96390 -
Why are these blackhat sites so successful?
Here's an interesting conundrum. Here are three sites with their respective rankings for "dental implants [city]":
http://dentalimplantsvaughan.ca - 9 (on google.ca)
http://dentalimplantsinhonoluluhi.com - 2 (on google.com)
http://dentalimplantssurreybc.ca - 7 (on google.ca)
These markets are not particularly competitive; however, all of these sites suffer from duplicate content, both internally and across sites (all of this company's implant sites have the exact same content, minus the bio pages and the local modifier), an average speed score, no structured data, and no links. And these sites are ranking relatively quickly; the Vaughan site went live 3 months ago. What's boggling my mind is that they rank on the first page at all. It seems they're doing the exact opposite of what you're supposed to do, yet they rank relatively well.
Technical SEO | | nowmedia10 -
Micro-site homepage not being indexed
http://www.reebok.com/en-US/reebokonehome/ This is the homepage for an instructor-network micro-site on Reebok.com. The robots.txt file was excluding the /en-US/ directory; we've since removed that exclusion and resubmitted this URL for indexing via Google Webmaster Tools, but we are still not seeing it in the index. Any advice would be very helpful; we may be missing some blocking issue, or perhaps we just need to wait longer?
Technical SEO | | PatrickDugan0 -
Site-wide Links
Hey y'all, I know this question has been asked many times before but I wanted to see what your stance was on this particular case. The organisation I work for is a group of 12 companies - each with its own website. On some of the sites we have a link to the other sites within the group on every single page of that site. Our organic search traffic has dropped a bit but not significantly and we haven't received any manual penalties from Google. It's also worth mentioning that the referral traffic for these sites from the other sites I control is quite good and the bounce rate is extremely low. If you were in my shoes would you remove the links, put a nofollow tag on the links or leave the links as they are? Thanks guys 🙂
Technical SEO | | AAttias0 -
Google having trouble accessing my site
Hi, Google is having a problem accessing my site. Each day it is bringing up access denied errors, and when I checked what this means I found the following:
Access denied errors: In general, Google discovers content by following links from one page to another. To crawl a page, Googlebot must be able to access it. If you're seeing unexpected Access Denied errors, it may be for the following reasons:
Googlebot couldn't access a URL on your site because your site requires users to log in to view all or some of your content. (Tip: You can get around this by removing this requirement for user-agent Googlebot.)
Your robots.txt file is blocking Google from accessing your whole site or individual URLs or directories. Test that your robots.txt is working as expected. The Test robots.txt tool lets you see exactly how Googlebot will interpret the contents of your robots.txt file. The Google user-agent is Googlebot. (How to verify that a user-agent really is Googlebot.) The Fetch as Google tool helps you understand exactly how your site appears to Googlebot. This can be very useful when troubleshooting problems with your site's content or discoverability in search results.
Your server requires users to authenticate using a proxy, or your hosting provider may be blocking Google from accessing your site.
Now I have contacted my hosting company, who said there is not a problem but told me to read the following page: http://www.tmdhosting.com/kb/technical-questions/other/robots-txt-file-to-improve-the-way-search-bots-crawl/ I have read it, and as far as I can see my file is set up right, as listed below. They said if I still have problems then I need to contact Google. Can anyone please give me advice on what to do? The errors are response code 403.
User-agent: *
Disallow: /administrator/
Disallow: /cache/
Disallow: /components/
Disallow: /includes/
Disallow: /installation/
Disallow: /language/
Disallow: /libraries/
Disallow: /media/
Disallow: /modules/
Disallow: /plugins/
Disallow: /templates/
Disallow: /tmp/
Disallow: /xmlrpc/
Technical SEO | | ClaireH-184886 0 -
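As a sanity check on a file like this, Python's standard urllib.robotparser will show how a crawler interprets the rules. A short sketch (the rules here are a subset of the poster's file, and the test paths are hypothetical):

```python
from urllib.robotparser import RobotFileParser

# A subset of the Disallow rules from the file above.
rules = """\
User-agent: *
Disallow: /administrator/
Disallow: /cache/
Disallow: /templates/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())
parser.modified()  # mark the rules as loaded so can_fetch() evaluates them

# Googlebot falls under the "*" group here, so these Disallow rules apply to it.
print(parser.can_fetch("Googlebot", "/administrator/index.php"))  # False
print(parser.can_fetch("Googlebot", "/products/page.html"))       # True
```

Note that robots.txt only produces crawl exclusions, never a 403; a 403 response comes from the server itself, which points at host-side blocking (firewall, proxy authentication) rather than this file.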
One site per location or all under and umbrella site?
I am working on a project where we are re-branding lots (100+) of existing local businesses under one national brand. I am wondering what we should do with their existing websites; they are generally fairly poor and will need re-designing to match the new brand, but may have some residual links. Should I 301 redirect each URL to the national site, e.g. nationalsite.com/localbusinessA? If so, what should I look out for? Do I need to specifically redirect any pages that have links pointing at them to the same pages on the new site? Or should I give each business a new standalone website that links back to the national brand site? More than likely this would be hosted on the same server and CMS as the main site; just the URL would remain. Do I need to make sure that any old URLs that had links to them are 301'd to the new pages? Many thanks for your advice.
Technical SEO | | BadgerToo0 -
Does 301 redirecting a site multiple times keep the value of the original site?
Hi, All! If I 301 redirect site www.abc.com to www.def.com, it should pass (almost) all linkjuice, rank, trust, etc. What happens if I then redirect site www.def.com to www.ghi.com? Does the value of the original site pass indefinitely as long as you do the redirects correctly? Or does it start to be devalued at some point? If anyone's had experience redirecting a site more than once and they've seen reportable good/bad/neutral results, that would be very helpful. Thanks in advance! -Aviva B
Technical SEO | | debi_zyx0 -
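For intuition, a chain of 301s resolves hop by hop until a URL with no further redirect is reached. A toy sketch using the domains from the question (the mapping is illustrative, not a real crawl):

```python
def resolve_redirect_chain(start, redirects, max_hops=10):
    """Follow a mapping of 301 sources to targets until no further redirect exists."""
    url, hops = start, 0
    while url in redirects:
        url = redirects[url]
        hops += 1
        if hops > max_hops:
            raise RuntimeError("redirect loop or chain too long")
    return url, hops

# The two-step chain described in the question.
chain = {
    "http://www.abc.com": "http://www.def.com",
    "http://www.def.com": "http://www.ghi.com",
}
print(resolve_redirect_chain("http://www.abc.com", chain))  # ('http://www.ghi.com', 2)
```

In practice, it's generally considered safer to collapse the chain and point www.abc.com directly at www.ghi.com: each extra hop adds latency and a point of failure, and crawlers only follow a limited number of hops.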
Impact of 401s on Site Rankings
Will having 401s on a site negatively impact rankings? (e.g. 401s thrown from a social media sharing icon)
Technical SEO | | Christy-Correll0