UK rankings disappeared after US website launch
-
Hi all,
I have a client who recently released a US version of their UK website and put it live without telling me first! Once I saw it (about three or four days later) I immediately asked them to add the rel="alternate" hreflang tags to both websites. In the meantime, though, our UK rankings have all gone, and it seems as if Google has just kicked out the UK website. How long will it take for our rankings to return to normal?
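For context, the annotation in question is a pair of link elements that each site serves about the other; a minimal sketch using this thread's two domains (the exact URLs and protocol are illustrative) might be:

```html
<!-- Served in the <head> of both https://www.teapigs.co.uk/ and https://www.teapigs.com/ -->
<link rel="alternate" hreflang="en-gb" href="https://www.teapigs.co.uk/" />
<link rel="alternate" hreflang="en-us" href="https://www.teapigs.com/" />
```

Each page pair needs to reference itself and its counterpart, and the annotations must be reciprocal for Google to honour them.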
Thanks for the advice,
Karl
-
I don't know.
A good decision would require me to understand the content history, SEO history, and technical history (CMS, .htaccess, canonicals) of both domains. I would need to study the site, talk with all of the people who have done this work, and compile that history as it occurred through time. I would then get advice from a consultant who really, really knows the technical details and be willing to pay them for research and study.
That history would only be as good as what the people who did the work were willing to share and could remember. I would not be comfortable with that information, because the people who did the work have methods and views that are very different from mine.
One thing I can say with certainty: when I was done with this site, the ecommerce pages would be reduced to product pages and a very small number of category pages. That is all.
Would that fix the problem? I don't know, because I would worry that penalties and technical problems in the site's history still hang over it.
-
Hi Egol,
Do you think it would help if we took down the US website?
-
I see that the link in my post above is not working...
If you grab almost any string of text from a product description and search for it in quotes, you will get a page with about 10 search results. Here is one such string:
"Close the door to the wind and rain and brew up this all-natural, gingery, lemony brew to sweep you"
Then at the bottom of the search results page you will see:
"In order to show you the most relevant results, we have omitted some entries very similar to the 8 already displayed. If you like, you can repeat the search with the omitted results included."
If you click that link, it will show you 51,000 pages that have been omitted. Click a few pages deep; most of them are from teapigs.com.
If you grab any other string of text from any other product description, you will likely see another 50,000 pages... and so on.
Simply eliminating these pages might not solve the problem. Somehow they are being crawled, so there must be links to them somewhere.
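One way to gauge this kind of duplication offline, rather than query by query in Google, is to look for long shared word runs between pages. This is a hypothetical helper, not something from the thread:

```python
def shared_shingles(a, b, n=8):
    """Return the word n-grams (shingles) that appear in both texts."""
    def shingles(text):
        words = text.lower().split()
        return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}
    return shingles(a) & shingles(b)

# Two product-style descriptions that overlap heavily -> likely duplicates.
page_a = "close the door to the wind and rain and brew up this gingery lemony brew"
page_b = "new in: close the door to the wind and rain and brew up this gingery lemony brew"
print(len(shared_shingles(page_a, page_b)))
```

Run across a crawl of the site, any pair of URLs sharing many 8-word shingles is a duplicate-content candidate worth investigating.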
-
Thanks EGOL,
I've asked them to remove all those types of links and asked how they were being created. So, just to confirm in my mind: is the rel="alternate" hreflang tag implemented correctly?
-
Here is one of those pages...
They are being indexed by Google. They are not about a single product. They look like they are being produced by the software that runs the site. These pages are either an accident or they are spam; Google will view them as spam.
If this was my site, those pages would be removed ASAP, and I would get away from blubolt for building a site with thousands of pages like this: duplicate content, duplicate title tags, all allowed into the index, plus a followed sitewide link in my footer.
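For reference, the two fixes implied above, keeping junk pages out of the index and de-weighting the footer credit, are usually a robots meta tag and a nofollow attribute. A hedged sketch (the href and anchor text are illustrative, not the developer's actual link):

```html
<!-- On each auto-generated "mash-up" page that must stay live for users: -->
<meta name="robots" content="noindex, follow">

<!-- On the sitewide developer credit in the footer: -->
<a href="https://www.example-developer.com/" rel="nofollow">Site by example-developer</a>
```

Removing the pages outright, as suggested above, is the stronger option; noindex is the fallback when the pages have to exist for users.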
-
Thanks EGOL,
Really appreciate the advice. As for those pages, should I 301 them to the proper product page?
I've mentioned the link at the bottom but, surprise surprise, they seem to ignore that one!
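If a 301 did turn out to be the right route for some of these URLs, an Apache .htaccess rule of roughly this shape would do it; both paths here are hypothetical, not real teapigs URLs:

```apache
# Hypothetical paths: send an auto-generated mash-up URL to its real product page
Redirect 301 /all-products/winter-warmer-mashup /products/winter-warmer-tea
```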
-
"There is a canonical tag on each page that points to the most authoritative version of the page."
In my opinion, pages like that should be eliminated. It does not matter what type of tag is on them.
If this was my site, those pages would be gone and I would get someone else to work on the website. Nobody should have thousands of pages like that in the Google index. Those pages are dead weight; Google hates them with a vengeance, and they should be blocked from the index. I also don't like the followed site-wide link to the developer in the footer.
Just saying what I would honestly do if this was my site.
-
Thanks for the reply Egol,
The problem is, there are a few ways that a user can get to the same product. There is a canonical tag on each page that points to the most authoritative version of the page.
The web company has put a geo-targeted redirect in place so that people from the US see the US site and people from the UK see the UK website. This isn't causing a problem, is it?
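For illustration, the redirect logic being described might look like this (a hypothetical function, not the web company's actual code). Note the failure mode that matters for SEO: Googlebot crawls mostly from US IP addresses, so an unconditional geo redirect can hide the .co.uk site from the crawler entirely:

```python
def choose_site(country_code):
    """Hypothetical geo-targeting: pick a site by the visitor's country."""
    if country_code == "US":
        return "https://www.teapigs.com/"
    return "https://www.teapigs.co.uk/"

# Googlebot usually arrives from a US IP, so it is redirected like any US
# visitor and may never crawl the .co.uk pages at all.
print(choose_site("US"))
```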
-
So, when I try to visit the .uk site I am redirected to the .com.
When I visit the .com and grab sentences from the content, I see results like this: 56,800 pages with that sentence, LOTS of them with the same "All products | teapigs USA" title tag, and some old "All products | teapigs" pages from the .uk site still surviving.
So, just from a quick look (which ain't nearly what should be done), I am thinking that this site has many thousands of pages that are simply mash-ups of products, causing a huge duplicate content problem.
And, since I don't know what was on the .uk site in the past, whether it had a penalty, or whether it was cross-linked, I can't say much with confidence about other deadly problems this site could have.
If this was my site I would kill those mash-up pages and get back to the old-fashioned one page per product format.
-
In addition, do you think that if we took down the US website, our UK rankings would come back quicker? Or is the damage already done and we just need to wait it out?
Thanks again!
-
Hi Jane and Egol,
Still no luck in the recovery of our UK rankings.
The websites in question are www.teapigs.co.uk and www.teapigs.com. I think the tags have been implemented properly, but I was wondering if you would be kind enough to have a look for me?
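One quick way to sanity-check the tags yourself is to parse each homepage's head and list its hreflang annotations. A small stdlib-only sketch; the sample HTML here is made up to match what the two sites should be serving, not fetched from them:

```python
from html.parser import HTMLParser

class HreflangCollector(HTMLParser):
    """Collect (hreflang, href) pairs from <link rel="alternate"> tags."""
    def __init__(self):
        super().__init__()
        self.pairs = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and a.get("rel") == "alternate" and "hreflang" in a:
            self.pairs.append((a["hreflang"], a["href"]))

sample = """<head>
<link rel="alternate" hreflang="en-gb" href="https://www.teapigs.co.uk/" />
<link rel="alternate" hreflang="en-us" href="https://www.teapigs.com/" />
</head>"""

collector = HreflangCollector()
collector.feed(sample)
print(collector.pairs)
```

Feed it the real HTML from each homepage and check that both domains list the same reciprocal pair.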
Thanks in advance,
Karl
-
Always good to get a blog post or case study out of things like this, at least! I also seriously hope (and suspect) that you will take less time to recover than my client did; the fact that their sites were less than six months old when they made this error certainly will not have helped their chances.
Cheers,
Jane
-
Hi Jane,
The tags were implemented about a week ago, but nothing has really improved yet. The UK website has been up for about 7 years and had quite a lot of authority, so I'm surprised I haven't seen anything notable yet.
That doesn't fill me with confidence! I'm hoping in the next week or two we'll see things improve. We've had to ramp up our email, PPC and social over the last week, but it's obviously hampered organic, as we ranked on the first page for 30-40 keywords and they've all gone. I think the Panda refresh hasn't helped either; it couldn't have happened at a worse time! I guess it's just a case of keeping an eye on things and praying that Google sees the tags sooner rather than later.
I'll let you know when (please God let it be when and not if!!) our rankings return, might be a good YouMoz post!
-
Hi Karl,
Have the tags been implemented, and have you seen any improvement in the situation?
Not to be the bearer of bad news, but in 2012 I had a client accidentally canonicalise two entire websites to those websites' home pages. That is, www.site.com/page/product/123.html contained a canonical tag pointing to www.site.com, as did every other page on the website. They put this live across two ecommerce domains.
We kicked up a huge fuss, but it still took their dev team a week to fix (the fix was put into a queue... grr). The site lost all of its rankings very quickly, which wasn't surprising given that its ranking all stemmed from internal pages ranking for products. It took about six weeks for Google to properly re-index and start ranking the internal pages again, and months to return to the previous rankings. Granted, these sites were quite new when this happened, so they didn't have much authority to begin with.
The short version is that it took a minute to break, six days to fix and six weeks to re-index. Every single case like this will be different, and I do not think the brand-new status of the two domains helped in this case.
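In markup terms, the mistake described above versus the correct form (site.com is the thread's placeholder domain):

```html
<!-- What every page carried, including /page/product/123.html: -->
<link rel="canonical" href="http://www.site.com/" />

<!-- What a product page should carry (a self-referencing canonical): -->
<link rel="canonical" href="http://www.site.com/page/product/123.html" />
```

With the first form sitewide, Google is told that every page is a duplicate of the homepage, which is why the internal pages dropped out so quickly.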
-
If the websites had identical content and they were linked together that could have been the problem. Google has been killing identical websites that are linked for about ten years. One site gets killed... almost completely.