Are we being penalized? Can someone assess, please?
-
We have two eCommerce sites. Both sites can broadly be divided into three page types:
1. Home page
2. Detail pages
3. Category pages
(Altogether, each site has approximately 3 million pages.)
These are the site URLs
http://bit.ly/9tRZIi - This is targeted for USA Audience
http://bit.ly/P8MxPR - This is targeted for UK audience
The .com domain, which was launched earlier in 2011, is doing okay with decent organic traffic.
Precautions taken: to avoid duplicate content across the two sites we are using:
a. Geo-targeting through Google Webmaster Tools
b. The rel=alternate tag on printsasia.co.uk
Problem
1. The .co.uk domain, which was launched in May 2012, started gaining organic traffic slowly but then suddenly dropped to almost zero after September 18.
2. When we use the operator site:printsasia.co.uk and filter by past week/month, we see no results, while the same operator with "any time" returns some results.
3. According to Webmaster Tools, Google has indexed 95% of the URLs in our sitemap.
Our concern: has our UK site been penalized for some reason? If yes, what could be the possible reason(s) for this penalty, and what steps can we take to recover? We would appreciate it if the experts here could review our site and help us.
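For reference, a minimal sketch of what reciprocal rel="alternate" hreflang annotations could look like on the two homepages (the www URLs here are assumptions, and the same reciprocal pair would go on every equivalent page on both sites):

```html
<!-- In the <head> of the US homepage (printsasia.com) -->
<link rel="alternate" hreflang="en-us" href="http://www.printsasia.com/" />
<link rel="alternate" hreflang="en-gb" href="http://www.printsasia.co.uk/" />

<!-- In the <head> of the UK homepage (printsasia.co.uk) --
     the annotations must point both ways to be honoured -->
<link rel="alternate" hreflang="en-us" href="http://www.printsasia.com/" />
<link rel="alternate" hreflang="en-gb" href="http://www.printsasia.co.uk/" />
```

Note that hreflang only tells Google which regional version to show; it is not a duplicate-content or penalty fix on its own.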
-
It doesn't necessarily matter whether the auto-generated content is unique: Panda was intended to penalize low-quality content (such as auto-generated content), not just duplicate content.
Even if you were able to figure out a way to auto-generate content that didn't get penalized, there's a good chance you'd get penalized in a future update.
-
Yes, what you see through Copyscape is correct, as that content comes along with the book and will be the same for all retailers and marketplace websites, be it Amazon, B&N, or AbeBooks.
Since we could not think of any other way to come up with unique content, we thought of auto-generated content. I slightly differ here, as this auto-generated content is at least partially unique for each page. Though I am not 100% confident that this is a great way to go about it.
Yes, reviews are something we are definitely introducing, and this may help.
-
Hi Cyril,
I doubt that the rel=alternate tag will help. Copyscape shows that at least some of the content is duplicated across other sites, not just your two sites.
I also doubt that auto-generated content will help you avoid Panda. Auto-generated content is one of the things Panda was specifically created to penalize.
If you're getting unique reviews from users and/or writing editor reviews, that very well may help.
I realize that it is impractical to write content for 3 million pages, but you may find that is what you need to do. You may need to start with your top pages and work from there, and in the meantime block indexing of all pages without unique content. I would not take that step hastily, but it may be what you end up having to do.
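As a sketch of the blocking step described above, assuming a standard robots meta tag is added to each page that lacks unique content:

```html
<!-- In the <head> of each thin book-detail page:
     keep it out of the index but still let crawlers follow its links -->
<meta name="robots" content="noindex, follow" />
```

The tag can then be removed page by page as unique content (reviews, editorial descriptions) is added.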
~Adam
-
Thank you, Adam, for your time and valuable feedback. We were also thinking we had been hit by Panda, and as a correction we used the rel=alternate tag on printsasia.co.uk.
It's been just a few days since we implemented this, so we are unable to say yet whether it is working.
Secondly, to increase the content we are introducing a review program, and at the same time we have also added some auto-generated content, since it is impossible to develop content for 3 million (and growing) pages. If you look at the last "Book Information" section on this page, http://bit.ly/QqMAFR, you will understand what I mean.
This section will be on all book detail pages. Your comments after reviewing it would be appreciated.
-
According to this, there was a Panda update on Sept 18, so I suspect that's what hit your site. Panda mainly targets the content of your website - my guess would be that your site was penalized because it has a lot of "thin content" pages. In other words, all your book pages have very little (no?) unique textual content.
FYI, I would say your US site is also in danger of being penalized by Panda and/or Penguin. I see that over 1/3 of linking root domains link to you with the anchor text "buy books online". Over-use of keyword anchor text like that is strongly correlated with getting a Penguin penalty.
-
Good plan. I would wait at least 4 weeks after removing the link before you decide whether or not it's worked.
-
Mark, thanks for your time and valuable feedback. I think you've almost answered my doubt about why only one site is being penalized and not the other.
Mark you are right when you say "Looking at your link profile, you simply don't have sufficient volume or diversity of links, nor do you have enough links from high-authority sites within your space"
As I mentioned, the .co.uk site is just 5 months old, and it's taking us some time to build links. But we are definitely working on it.
I believe having fewer links can only be a reason for poor PageRank and a low position in the SERPs; it should not be a reason for being penalized. I hope you will agree with this.
As immediate steps:
1. I will first remove the sitewide link and see if this was the reason. If things improve over time, we will put the link back with changed anchor text.
2. We will definitely take care of the blog comments, considering their importance to brand reputation.
-
See my answer below!
-
Thanks for your time. You mean blog.printsasia and not blog.bookshopasia, right?
Just a question: blog.printsasia.com is our own official blog, and we have placed links for both sites on the subdomain blog.printsasia.com. If that is the reason for the penalty, then why is our other site not being penalized? Why is only the .co.uk site penalized while the .com is having no issues?
-
I agree in part with easyrider2. There may be a problem caused by the sitewide header links from your own blog (blog.printsasia.com). These currently use the keyword-rich anchor text "Online Bookshop UK", although I suspect you previously had this as "Bookshop UK", as this is what OSE has picked up. Either way, they look like the kind of links that might be targeted by Penguin, as they don't use your brand name as anchor text.
If the blog was a subdomain of your UK site (blog.printsasia.co.uk), I don't think this would be a problem. But because it's a subdomain of a US site (albeit the same company), this could look like a spammy type of link.
Note: it may be that Google has not penalised you, but has simply decided to discount a set of links, perhaps these ones.
The good news is that as this is your company blog you can quickly change the link.
You could try one of the following:
1. Remove the sitewide link from blog.printsasia.com altogether
2. Change the anchor text to your brand name (eg Printsasia UK)
3. Remove the sitewide link and add a few more "natural" links into blog posts (as easyrider2 suggests)
Personally, I would try 1, assuming it doesn't drive significant traffic to your site. If that helps then you know you've identified a problem.
However, I don't think this is your only problem, and I'm not even convinced it is a problem. Looking at your link profile, you simply don't have sufficient volume or diversity of links, nor do you have enough links from high-authority sites within your space. So even if you "fix" this immediate problem, you still need to focus on some serious linkbuilding (by which I mean relationship building) within your industry.
I agree with easyrider2 about the spammy blog comments. These may not cause a problem with Google but they look very poor to users (and webmasters who might potentially link to your sites).
-
Looking at Open Site Explorer for your UK bookshop, I would say with 99% confidence you are being penalised because of over-optimisation. It sounds like you got hit by the Penguin refresh around Sept 18.
Your anchor text nearly all reads "Bookshops UK": 236 times, with the nearest alternative being printasia.co.uk at 4 times. Plus they are all coming from the same domain (blog.bookshopasia). You need to vary your anchor text. However, I am guessing that link is in the page template; although I could only see a link for "online bookshop UK", it has to be in there somewhere, as OSE picks it up.
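One way to sanity-check an anchor-text skew like this yourself: a small Python sketch that summarises each anchor's share of links from an exported list (the sample rows and column layout below are hypothetical, not the site's real data, which would come from an OSE/Ahrefs CSV export):

```python
from collections import Counter

def anchor_profile(links):
    """Given (anchor_text, linking_domain) pairs, return each anchor's
    share of the total links, largest first."""
    counts = Counter(anchor.lower() for anchor, _ in links)
    total = sum(counts.values())
    return {anchor: round(n / total, 2) for anchor, n in counts.most_common()}

# Hypothetical sample rows for illustration only.
sample = [
    ("Bookshops UK", "blog.printsasia.com"),
    ("Bookshops UK", "blog.printsasia.com"),
    ("Bookshops UK", "blog.printsasia.com"),
    ("printsasia.co.uk", "somebooksite.example"),
]

# A single exact-match anchor dominating the profile is the kind of
# pattern that correlates with a Penguin hit.
print(anchor_profile(sample))
```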
If it is in the template, make that link nofollow, and get links from different domains for different keywords.
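A minimal sketch of that nofollow change, assuming the template link points at the UK homepage and the brand name is used as anchor text:

```html
<!-- Sitewide template link: nofollow plus brand anchor text (URL assumed) -->
<a href="http://www.printsasia.co.uk/" rel="nofollow">Printsasia UK</a>
```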
You also need to get on top of your blog comments. People using names such as "how to build your own iphone app" are just spam, and their comments are worthless. Even if you disallow website links, junk content adds nothing to your site.