UK rankings disappeared after US website launch
-
Hi all,
I had a client that recently released a US version of their UK website and put it live without informing me first! Once I saw it (about 3/4 days later) I immediately asked them to add rel=alternate hreflang tags to both websites. However, in the meantime our UK rankings have all gone and it seems as if Google has just kicked the UK website. How long will it take for our rankings to return to normal?
Thanks for the advice,
Karl
-
I don't know.
A good decision would require me to understand the content history, SEO history, and technical history (CMS, htaccess, canonicals) of both domains. I would need to study the site, talk with all of the people who have done these things, and compile that history as it occurred through time. I would then get technical advice from a consultant who really knows the technical details and be willing to pay them for research and study.
That history would be only as good as what the people who did the work were willing to share and could remember. I would not be comfortable with that information, because the people who did the work have methods and views that are very different from mine.
One thing that I can say with certainty. When I was done with this site the ecommerce pages would be reduced to product pages and a very small number of category pages. That is all.
Would that fix the problem? I don't know, because I would worry that penalties and technical problems in the history of the site still hang over it.
-
Hi Egol,
Do you think it would help if we took down the US website?
-
I see that the link in my post above is not working...
If you grab almost any string of text from a product description and search for it in quotes you will get a page with about 10 search results... here is a string of text....
"Close the door to the wind and rain and brew up this all-natural, gingery, lemony brew to sweep you"
Then at the bottom of the search results page you will see....
In order to show you the most relevant results, we have omitted some entries very similar to the 8 already displayed.
If you like, you can repeat the search with the omitted results included. If you click the link it will show you 51,000 pages that have been omitted... Click a few pages deep; most of them are from teapigs.com.
If you grab any other string of text from any other product description you will likely see another 50,000 pages... etc etc etc.
Simply eliminating these pages might not solve the problem. Somehow they are being crawled, so there must be links to them somewhere.
-
Thanks EGOL,
I've asked them to remove all those types of links and asked how they were being created. So, just to confirm in my mind, is the rel=alternate hreflang tag implemented correctly?
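For reference, the tags we added to the head of both homepages look roughly like this (simplified):

```html
<!-- On https://www.teapigs.co.uk/ and mirrored on https://www.teapigs.com/ -->
<link rel="alternate" hreflang="en-gb" href="https://www.teapigs.co.uk/" />
<link rel="alternate" hreflang="en-us" href="https://www.teapigs.com/" />
```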
-
Here is one of those pages...
They are being indexed by google. They are not about a single product. They look like they are being produced by the software that runs the site. These pages are either an accident or they are spam. Google will view them as spam.
If this was my site they would be removed ASAP, and I would get away from blubolt for making a site with thousands of pages like this, full of duplicate content and duplicate title tags, for allowing them into the index, and for putting a followed sitewide link in my footer.
-
Thanks EGOL,
Really appreciate the advice. Those pages, then: should I 301 them to the proper product page?
I've mentioned the link at the bottom but, surprise surprise, they seem to ignore that one!
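If 301s are the way to go, I'm assuming something like this in the .htaccess would do it (URLs made up just to illustrate):

```apache
# Hypothetical example - the real mash-up URLs would come from a crawl of the site
Redirect 301 /all-products/page/17 https://www.teapigs.co.uk/products/everyday-brew
```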
-
There is a canonical tag on each page that points to the most authoritative version of the page.
In my opinion, pages like these should be eliminated. It does not matter what type of tag is on them.
If this was my site, those pages would be gone and I would get someone else to work on the website. Nobody should have thousands of pages like that in the Google index. Those pages are dead weight. Google hates them with a vengeance. They should be blocked from the index. I don't like the followed site-wide link to the developer in the footer.
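Blocking them could be as simple as a robots meta tag in the template that generates those mash-up pages, something like:

```html
<!-- Keeps the page out of the index but still lets crawlers follow its links -->
<meta name="robots" content="noindex, follow" />
```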
Just saying what I would honestly do if this was my site.
-
Thanks for the reply Egol,
The problem is, there are a few ways that a user can get to the same product. There is a canonical tag on each page that points to the most authoritative version of the page.
The web company has put a geo-target redirect in place so that people from the US see the US site and people from the UK see the UK website. This isn't causing a problem, is it?
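The canonical tag on each of those duplicate paths looks like this (the product URL here is just an example):

```html
<link rel="canonical" href="https://www.teapigs.co.uk/products/everyday-brew" />
```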
-
So, when I try to visit the .uk site I am redirected to the .com.
When I visit the .com and grab sentences from the content, I see results like this... 56,800 pages with that sentence, LOTS of them with the same "All products | teapigs USA" title tag, and some old "All products | teapigs" pages from the .uk site still surviving.
So, just from a quick look (which ain't nearly what should be done) I am thinking that this site has many thousands of pages that are simply mash-ups of products causing a huge duplicate content problem.
And, since I don't know what was on .uk in the past, if it had a penalty, if it was cross-linked... I can't say much with confidence about other deadly problems that this site could have.
If this was my site I would kill those mash-up pages and get back to the old-fashioned one page per product format.
-
In addition, do you think if we took down the US website, our UK rankings would come back quicker? Or is the damage already done and we need to wait it out.
Thanks again!
-
Hi Jane and Egol,
Still no luck in the recovery of our UK rankings
The websites in question are www.teapigs.co.uk and www.teapigs.com. I think the tags have been implemented properly, but I was wondering if you would be kind enough to have a look for me?
Thanks in advance,
Karl
-
Always good to get a blog post or case study out of things like this, at least! I also seriously hope (and suspect) that you will take less time to recover than my client did - the fact that their sites were less than six months old when they made this error will certainly not have helped their chances.
Cheers,
Jane
-
Hi Jane,
The tags were implemented about a week ago but nothing has really improved yet. The UK website has been up for about 7 years and had quite a lot of authority, so I'm surprised I haven't seen anything notable yet.
That doesn't fill me with confidence! I'm hoping in the next week or two we'll see things improve. We've had to ramp up our email, PPC and social over the last week, but it's obviously hampered organic as we ranked on the first page for 30/40 keywords and they've all gone. I think the Panda refresh hasn't helped either, and it couldn't have happened at a worse time! I guess it's just a case of keeping an eye on things and praying that Google sees the tags sooner rather than later.
I'll let you know when (please God let it be when and not if!!) our rankings return, might be a good YouMoz post!
-
Hi Karl,
Have the tags been implemented, and have you seen any improvement in the situation?
Not to be the bearer of bad news, but I had a client accidentally canonicalise two entire websites to those websites' home pages in 2012. That is, www.site.com/page/product/123.html contained a canonical tag pointing to www.site.com, as did every other page on the website. They put this live across two ecommerce domains.
We kicked up a huge fuss, but it still took their dev team a week to fix (the fix was put into a queue... grr). The site lost all of its rankings very quickly, which wasn't surprising given that all of its ranking stemmed from internal pages ranking for products. It took about six weeks for Google to properly re-index and start ranking the internal pages again, and months to return to the previous rankings. Granted, these sites were quite new when this happened, so they didn't have much authority to begin with.
The short version is that it took a minute to break, six days to fix and six weeks to re-index. Every single case like this will be different, and I do not think the brand-new status of the two domains helped in this case.
-
If the websites had identical content and they were linked together that could have been the problem. Google has been killing identical websites that are linked for about ten years. One site gets killed... almost completely.