Are we being penalized? Can someone assess, please!
-
We have two eCommerce sites. Both sites can broadly be divided into three page types:
1. Home page
2. Detail pages
3. Category pages
(Altogether, each site has approximately 3 million pages.)
These are the site URLs
http://bit.ly/9tRZIi - This is targeted for USA Audience
http://bit.ly/P8MxPR - This is targeted for UK audience
The .com domain, which was launched earlier in 2011, is doing okay with decent organic traffic.
Precautions taken: To avoid content being duplicated across the two sites, we are using:
a. Geo-targeting through Google Webmaster Tools
b. The rel=alternate tag on printsasia.co.uk
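For reference, the rel="alternate" hreflang annotations for a pair of geo-targeted pages typically look something like the following (a sketch only - the paths shown are hypothetical):

```html
<!-- On both the US and UK versions of a book detail page -->
<link rel="alternate" hreflang="en-us" href="http://www.printsasia.com/book/some-title" />
<link rel="alternate" hreflang="en-gb" href="http://www.printsasia.co.uk/book/some-title" />
```

Each page should carry the full set of annotations, including a self-referencing one.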
Problem
1. The .co.uk domain, which was launched in May 2012, started gaining organic traffic slowly but then suddenly dropped to almost zero after September 18.
2. When we use the operator site:printsasia.co.uk and apply a filter for the past week/month, we don't see any results, while the same operator set to "any time" returns some results.
3. According to webmaster tool, Google has indexed 95% of our URLs in the sitemap
Our concern: Is our UK site penalized for some reason? If yes, what could be the possible reason(s) for this penalty, and what are the possible steps to get out of it? We would appreciate it if the experts here could review our site and help us.
-
It doesn't necessarily matter whether the auto-generated content is unique or not - Panda was intended to penalize low-quality content (such as auto-generated content), not just duplicate content.
Even if you were able to figure out a way to auto-generate content that didn't get penalized, there's a good chance you'd get penalized in a future update.
-
Yes, what you see through Copyscape is correct, as that content comes along with the book and will be the same for all retailers and marketplace websites, be it Amazon, B&N, or AbeBooks.
Since we could not think of any other way to come up with unique content, we turned to auto-generated content. I slightly differ here, as this auto-generated content is unique to each page, at least partially. Though I am not 100% confident this is a great way to go about it.
Yes, reviews are something we are definitely introducing, and that may help.
-
Hi Cyril,
I doubt that the rel=alternate tag will help. Copyscape shows that at least some of the content is duplicated across other sites, not just your two sites.
I also doubt that auto-generated content will help avoid Panda. That's one of the things Panda was specifically created to penalize - auto-generated content.
If you're getting unique reviews from users and/or writing editor reviews, that very well may help.
I realize that it is impractical to write content for 3 million pages, but you may find that is what you need to do. You may need to start with your top pages and work from there, and in the meantime block indexing of all pages without unique content. I would not take that step hastily, but it may be what you end up having to do.
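If you do go down the route of blocking indexing of pages without unique content, the usual mechanism is a robots meta tag on each affected page (a sketch only, not a recommendation to apply wholesale):

```html
<!-- On book detail pages that have no unique content yet -->
<meta name="robots" content="noindex,follow">
```

noindex,follow keeps the page out of the index while still letting crawlers follow its links; the tag can be removed once a page gains unique content.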
~Adam
-
Thank you, Adam, for your time and valuable feedback. We also suspected we had been hit by Panda, and as a correction we used the rel=alternate tag on printsasia.co.uk.
It's been just a few days since we implemented this, so we are unable to say yet whether it is working.
Secondly, to increase the content, we are introducing a review program, and at the same time we have also added some auto-generated content, since it is impossible to develop content for 3 million (and growing) pages. If you look at the last "Book Information" section on this page http://bit.ly/QqMAFR you will understand what I mean.
This section will be on all book detail pages. Your comments after reviewing this would be appreciated.
-
According to this, there was a Panda update on Sept 18, so I suspect that's what hit your site. Panda mainly targets the content of your website - my guess would be that your site was penalized because it has a lot of "thin content" pages. In other words, all your book pages have very little (no?) unique textual content.
FYI, I would say your US site is also in danger of being penalized by Panda and/or Penguin. I see that over 1/3 of linking root domains link to you with the anchor text "buy books online". Over-use of keyword anchor text like that is strongly correlated with getting a Penguin penalty.
-
Good plan. I would wait at least 4 weeks after removing the link before deciding whether or not it's worked.
-
Mark, thanks for your time and valuable feedback. I think you have largely answered my doubt about why only one site is being penalized and not the other.
Mark, you are right when you say, "Looking at your link profile, you simply don't have sufficient volume or diversity of links, nor do you have enough links from high-authority sites within your space."
As I mentioned, the .co.uk site is just 5 months old, and it is taking us some time to build links. But we are definitely working on it.
I believe having fewer links can only explain poor PageRank and low positions in the SERPs; it should not be the reason for a penalty. I hope you will agree with this.
As immediate steps:
1. I will first remove the sitewide link and see if this was the reason. If things improve over time, we will restore the link with changed anchor text.
2. We will definitely take care of the blog comments, considering their importance for brand reputation.
-
see my answer below!
-
Thanks for your time. You mean blog.printsasia and not blog.bookshopasia, right?
Just a question: blog.printsasia.com is our own official blog, and we have placed links to both sites on the subdomain blog.printsasia.com. If that is the reason for a penalty, then why is our other site not being penalized? Why is only the .co.uk penalized while the .com is having no issues?
-
I agree in part with easyrider2. There may be a problem caused by the sitewide header links from your own blog (blog.printsasia.com). These currently use the keyword-rich anchor text "Online Bookshop UK", although I suspect you previously had this as "Bookshop UK", as this is what OSE has picked up. Either way, they look like the kind of links that might be targeted by Penguin, as they don't use your brand name as anchor text.
If the blog was a subdomain of your UK site (blog.printsasia.co.uk), I don't think this would be a problem. But because it's a subdomain of a US site (albeit the same company), this could look like a spammy type of link.
Note: it may be that Google has not penalised you, but has simply decided to discount a set of links, perhaps these ones.
The good news is that as this is your company blog you can quickly change the link.
You could try one of the following:
1. Remove the sitewide link from blog.printsasia.com altogether
2. Change the anchor text to your brand name (e.g. Printsasia UK)
3. Remove the sitewide link and add a few more "natural" links into blog posts (as easyrider2 suggests)
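For illustration, option 2 in the blog template might look like this (the URL and surrounding markup are assumptions):

```html
<!-- Before: sitewide keyword-rich anchor -->
<a href="http://www.printsasia.co.uk/">Online Bookshop UK</a>

<!-- After: branded anchor -->
<a href="http://www.printsasia.co.uk/">Printsasia UK</a>
```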
Personally, I would try 1, assuming it doesn't drive significant traffic to your site. If that helps then you know you've identified a problem.
However, I don't think this is your only problem, and I'm not even convinced it is a problem. Looking at your link profile, you simply don't have sufficient volume or diversity of links, nor do you have enough links from high-authority sites within your space. So even if you "fix" this immediate problem, you still need to focus on some serious linkbuilding (by which I mean relationship building) within your industry.
I agree with easyrider2 about the spammy blog comments. These may not cause a problem with Google but they look very poor to users (and webmasters who might potentially link to your sites).
-
Looking at Open Site Explorer for your UK bookshop, I would say with 99% confidence you are being penalised for over-optimisation. It sounds like you got hit in the Penguin refresh around Sept 18.
Your anchor text is nearly all "Bookshops UK" - 236 times, with the nearest alternative being printasia.co.uk at 4 times. Plus they are all coming from the same domain (blog.bookshopasia). You need to vary your anchor text. I'm guessing that link is in the page template; although I could only see a link for "online bookshop UK", it has to be in there somewhere, as OSE picks it up.
If the link is in the template, make it nofollow, and get links from different domains for different keywords.
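A nofollowed template link would look something like this (a sketch - the URL is assumed):

```html
<a href="http://www.printsasia.co.uk/" rel="nofollow">Online Bookshop UK</a>
```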
You also need to get on top of your blog commenting. People using names such as "how to build your own iphone app" are just spam, and those comments are worthless. Even if you disallow websites from being linked, crap content is worthless to your site.
Related Questions
-
Can subdomains avoid spam penalizations?
Hello everyone, I have a basic question for which I couldn't find a definitive answer. Let's say I have my main website at www.mywebsite.com and a related affiliates website at affiliates.mywebsite.com, which has completely different content from the main website. The two domains also have different IP addresses. Are those considered two completely separate domains by Google? Can bad links pointing to affiliates.mywebsite.com affect www.mywebsite.com in any way? Thanks in advance for any answers!
Intermediate & Advanced SEO | | fablau
-
Did Not Adding Fresh Content Daily Get Me Penalized?
One of my websites used to post 1000-word articles every 4-5 days (roughly 12 x 300-word articles each week). This went on for 3 months. Then we suddenly stopped adding content for a flat 15 days due to the unavailability of a content writer, and a major drop took place. We have since been adding the same amount of quality content, but rankings don't seem to be improving. Is it a penalty?
Intermediate & Advanced SEO | | welcomecure
-
Algorithmic Penalty
Hello friends, I need help 🙂
Intermediate & Advanced SEO | | gurpreet1234
Site age: 2 years, but the site was rebuilt 2-3 months ago.
Yesterday rankings for all keywords suddenly dropped (even for the brand name), from page 1 to pages 8-9.
Niche: mp3/video downloads
One of my competitors has the same problem (their site is 3 years old). What we have in common:
1. The same meta description on all pages
2. Nofollow links (for example, a user lands on page A; page A has nofollow links to page B, where the download links to the file exist)
This has happened to me for the first time - can anybody help with their knowledge? Someone suggested I do a 301 to a new domain.
-
Can't crawl website with Screaming Frog... what is wrong?
Hello all - I've just been trying to crawl a site with Screaming Frog and can't get beyond the homepage - I have done the usual stuff (turned off JS and so on) and there are no problems with the nav; the site's other pages have indexed in Google, btw. Now I'm wondering whether there's a problem with this robots.txt file, which I think may be auto-generated by Joomla (I'm not familiar with Joomla...) - are there any issues here? [just checked... and there isn't!]
Intermediate & Advanced SEO | | McTaggart
If the Joomla site is installed within a folder such as at e.g. www.example.com/joomla/ the robots.txt file MUST be moved to the site root at e.g. www.example.com/robots.txt AND the joomla folder name MUST be prefixed to the disallowed path, e.g. the Disallow rule for the /administrator/ folder MUST be changed to read Disallow: /joomla/administrator/ For more information about the robots.txt standard, see: http://www.robotstxt.org/orig.html For syntax checking, see: http://tool.motoricerca.info/robots-checker.phtml
User-agent: *
Disallow: /administrator/
Disallow: /bin/
Disallow: /cache/
Disallow: /cli/
Disallow: /components/
Disallow: /includes/
Disallow: /installation/
Disallow: /language/
Disallow: /layouts/
Disallow: /libraries/
Disallow: /logs/
Disallow: /modules/
Disallow: /plugins/
Disallow: /tmp/
-
Am I on the right track? Any suggestions, please?
Hi: It has now been 3 months since I started doing SEO for my website myself (my website is like prchecker.info, giving users one online service). My two primary keywords have 450,000 and 100,000 exact USA searches. When I started, my goal was to rank both keywords on the second page within the first year. Now, 3 months in, after creating a few quality backlinks (guest posting and comments on relevant forum topics), both keywords rank on the third and fifth pages. Any suggestions for building quality backlinks that might help? Should I continue with guest posting?
Intermediate & Advanced SEO | | Khaledmoalla
-
If I only Link to Page via Sitemap, can it still get indexed?
Hi there! I am creating a ton of content for specific geographies. Is it possible for these pages to get indexed if I only put them in my sitemap and don't link to them through my actual site (though the pages will be live). Thanks!
Intermediate & Advanced SEO | | Travis-W
Travis0 -
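For context, a minimal sitemap entry for one of those geography pages would look like this (the URL and date are hypothetical):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/locations/some-city/</loc>
    <lastmod>2012-10-01</lastmod>
  </url>
</urlset>
```

In practice, pages listed only in a sitemap can be crawled and indexed, but without any internal links they tend to be crawled less often and carry little authority.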
Robots.txt: Can you put a /* wildcard in the middle of a URL?
We have noticed that Google is indexing the language/country directory versions of directories we have disallowed in our robots.txt. For example: Disallow: /images/ is blocked just fine However, once you add our /en/uk/ directory in front of it, there are dozens of pages indexed. The question is: Can I put a wildcard in the middle of the string, ex. /en/*/images/, or do I need to list out every single country for every language in the robots file. Anyone know of any workarounds?
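For what it's worth, Google and Bing do honour the * wildcard anywhere in a robots.txt path (it's a de-facto extension, not part of the original robots.txt standard), so a single rule can cover every language/country combination:

```
User-agent: *
# Blocks /images/ at the site root
Disallow: /images/
# * matches any sequence of characters, so this covers
# /en/uk/images/, /en/fr/images/, /en/de/images/, etc.
Disallow: /en/*/images/
```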
Intermediate & Advanced SEO | | IHSwebsite
-
Okay can someone straighten out SEO for me
If my keyword is dog training and I wanted to make 5 posts on a blog, do I target all the posts with the keyword of dog training or what?
Intermediate & Advanced SEO | | 6786486312640