UK rankings disappeared after US website launch
-
Hi all,
I had a client that recently released a US version of their UK website and put it live without informing me first! When I saw it (about 3-4 days later) I immediately asked them to add the rel="alternate" hreflang tags to both websites. However, in the meantime our UK rankings have all disappeared; it seems as if Google has simply dropped the UK website. How long will it take for our rankings to return to normal?
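For anyone following along, a minimal hreflang setup for a UK/US site pair generally looks like the sketch below. The teapigs homepage URLs are used purely as illustration; the real implementation would need the correct per-page mapping.

```html
<!-- On each page of the UK site (and mirrored on the US site), reference -->
<!-- BOTH regional versions of that page, including the page itself. -->
<head>
  <link rel="alternate" hreflang="en-gb" href="https://www.teapigs.co.uk/" />
  <link rel="alternate" hreflang="en-us" href="https://www.teapigs.com/" />
  <!-- Optional fallback for visitors who match neither region: -->
  <link rel="alternate" hreflang="x-default" href="https://www.teapigs.co.uk/" />
</head>
```

Note that the annotations must be reciprocal (each site points at the other and at itself) and must be done page-to-page, not just homepage-to-homepage, or Google will ignore them.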
Thanks for the advice,
Karl
-
I don't know.
A good decision would require me to understand the content history, SEO history, and technical history (CMS, .htaccess, canonicals) of both domains. I would need to study the site, talk with all of the people who have done these things, and compile that history as it occurred through time. I would then get technical advice from a consultant who really knows the technical details and be willing to pay them for research and study.
That history would be only as good as what the people who did the work were willing to share and could remember. I would not be comfortable with that information, because the people who did the work have methods and views that are very different from mine.
One thing I can say with certainty: when I was done with this site, the ecommerce pages would be reduced to product pages and a very small number of category pages. That is all.
Would that fix the problem? I don't know, because I would worry that penalties and technical problems in the history of the site still hang over it.
-
Hi Egol,
do you think it would help if we took down the US website?
-
I see that the link in my post above is not working...
If you grab almost any string of text from a product description and search for it in quotes you will get a page with about 10 search results... here is a string of text....
"Close the door to the wind and rain and brew up this all-natural, gingery, lemony brew to sweep you"
Then at the bottom of the search results page you will see....
In order to show you the most relevant results, we have omitted some entries very similar to the 8 already displayed.
If you like, you can repeat the search with the omitted results included.
If you click that link it will show you 51,000 pages that have been omitted... Click a few pages deep; most of them are from teapigs.com.
If you grab any other string of text from any other product description you will likely see another 50,000 pages... etc etc etc.
Simply eliminating these pages might not solve the problem. Somehow they are being crawled, so there must be links to them somewhere.
-
Thanks EGOL,
I've asked them to remove all those types of links and asked how they were being created. So, just to confirm in my mind: the rel="alternate" hreflang tag is implemented correctly?
-
Here is one of those pages...
They are being indexed by Google. They are not about a single product. They look like they are being produced by the software that runs the site. These pages are either an accident or they are spam, and Google will view them as spam.
If this was my site, they would be removed ASAP, and I would get away from blubolt for building a site with thousands of pages like this: duplicate content, duplicate title tags, pages allowed into the index, and a followed sitewide link in my footer.
-
Thanks EGOL,
Really appreciate the advice. As for those pages, should I 301 them to the proper product page?
I've mentioned the link at the bottom but, surprise surprise, they seem to ignore that one!
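On the 301 question: if the mash-up URLs follow a recognisable pattern, an Apache sketch of the redirect might look like the following. The `/all-products/` path is purely hypothetical here; the real URL pattern would need to be confirmed before using anything like this.

```apache
# Hypothetical pattern: permanently redirect auto-generated "mash-up"
# listing URLs to the main products page. Requires mod_rewrite.
RewriteEngine On
RewriteRule ^all-products/.+$ /products/ [R=301,L]
```

A 301 only helps if the CMS also stops generating new duplicate URLs; otherwise you are redirecting an ever-growing list.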
-
There is a canonical tag on each page that points to the most authoritative version of the page.
In my opinion, pages like that should be eliminated. It does not matter what type of tag is on them.
If this was my site, those pages would be gone and I would get someone else to work on the website. Nobody should have thousands of pages like that in the Google index. Those pages are dead weight; Google hates them with a vengeance. They should be blocked from the Google index. I also don't like the followed sitewide link to the developer in the footer.
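For completeness, "blocked from the Google index" usually means a noindex directive rather than a robots.txt block: a robots.txt disallow stops crawling, so Google may never see the noindex and can keep the URL indexed. A minimal sketch, assuming the pages can't be removed immediately:

```html
<!-- Placed in the <head> of each mash-up page until it can be removed. -->
<!-- "follow" lets link equity pass while keeping the page out of the index. -->
<meta name="robots" content="noindex, follow" />
```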
Just saying what I would honestly do if this was my site.
-
Thanks for the reply Egol,
The problem is, there are a few ways that a user can get to the same product. There is a canonical tag on each page that points to the most authoritative version of the page.
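For reference, a canonical tag of the kind described sits on every URL variant and points at the one authoritative product URL. The product path below is hypothetical, just to show the shape:

```html
<head>
  <!-- All variants of this product page declare the same canonical URL. -->
  <link rel="canonical" href="https://www.teapigs.co.uk/tea/everyday-brew" />
</head>
```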
The web company has put a geo-targeted redirect in place so that people from the US see the US site and people from the UK see the UK website. This isn't causing a problem, is it?
-
So, when I try to visit the .uk site I am redirected to the .com.
When I visit the .com and grab sentences from the content I see results like this.... 56,800 pages with that sentence and LOTS of them with the same "All products | teapigs USA" title tag.... and some old "All products | teapigs" pages from the .uk site still surviving.
So, just from a quick look (which ain't nearly what should be done) I am thinking that this site has many thousands of pages that are simply mash-ups of products causing a huge duplicate content problem.
And, since I don't know what was on .uk in the past, if it had a penalty, if it was cross-linked... I can't say much with confidence about other deadly problems that this site could have.
If this was my site I would kill those mash-up pages and get back to the old-fashioned one page per product format.
-
In addition, do you think that if we took down the US website, our UK rankings would come back quicker? Or is the damage already done and we just need to wait it out?
Thanks again!
-
Hi Jane and Egol,
Still no luck in the recovery of our UK rankings.
The websites in question are www.teapigs.co.uk and www.teapigs.com. I think the tags have been implemented properly, but I was wondering if you would be kind enough to have a look for me?
Thanks in advance,
Karl
-
Always good to get a blog post or case study out of things like this, at least! I also seriously hope (and suspect) that you will take less time to recover than my client did - the fact that their sites were less than six months old when they made this error will certainly not have helped their chances.
Cheers,
Jane
-
Hi Jane,
The tags were implemented about a week ago but nothing has really improved yet. The UK website has been up for about 7 years and had quite a lot of authority, so I'm surprised I haven't seen anything notable yet.
That doesn't fill me with confidence! I'm hoping in the next week or two we'll see things improve. We've had to ramp up our email, PPC and social over the last week, but it's obviously hampered organic, as we ranked on the first page for 30/40 keywords and they've all gone. I think the Panda refresh hasn't helped either, and it couldn't have happened at a worse time! I guess it's just a case of keeping an eye on things and praying that Google sees the tags sooner rather than later.
I'll let you know when (please God let it be when and not if!!) our rankings return, might be a good YouMoz post!
-
Hi Karl,
Have the tags been implemented, and have you seen any improvement in the situation?
Not to be the bearer of bad news, but in 2012 I had a client accidentally canonicalise two entire websites to those websites' home pages. That is, www.site.com/page/product/123.html contained a canonical tag pointing to www.site.com, as did every other page on the website. They put this live across two ecommerce domains.
We kicked up a huge fuss, but it still took their dev team a week to fix (the fix was put into a queue... grr). The site lost all of its rankings very quickly, which wasn't surprising given that its ranking all stemmed from internal pages ranking for products. It took about six weeks for Google to properly re-index and start ranking the internal pages again, and months to return to the previous rankings. Granted, these sites were quite new when this happened so they didn't have much authority to begin with.
The short version is that it took a minute to break, six days to fix and six weeks to re-index. Every single case like this will be different, and I do not think the brand-new status of the two domains helped in this case.
-
If the websites had identical content and they were linked together that could have been the problem. Google has been killing identical websites that are linked for about ten years. One site gets killed... almost completely.