What's happening with Google UK?
-
Within the last week a handful of our rankings have dropped dramatically down the SERPs. About 15%, but this is an estimate and has not been fully investigated yet.
Whilst looking into possible scenarios that could be causing this, I wanted to check what the SERPs looked like for the terms we are still holding position on.
Typing "extending dining tables" into Google UK today, I was amazed at what I found...
Ranking in positions 1 and 2 is a massive UK furniture store.
But isn't that the same landing page being returned for both positions? It appears to be a navigation problem within the site category tags causing duplicate content. However, they have been rewarded with the top two positions, subsequently pushing our website onto page two.
I find it so frustrating that we listen to Google's best practices when it comes to pagination issues, yet this is how our hard work is rewarded!
Anyone else have any thoughts about this?
-
Pleasure. Shout if I can help!
-
Fantastic, thank you very much. Interestingly, this website is hosted on a different platform to our others, so I wonder whether this has something to do with the config. We'll set up 301s for w. and ww. as a short-term fix and look at the config going forward.
Many thanks again.
-
Hey, I think I have spotted something:
Google this:
portland clic-clac sofa bed
and closely check the result:
http://ww.franceshunt.co.uk/live/sofa-beds/portland-clic-clac-sofa-bed.html
ww not www
Also, we have another version of that page indexed:
1. info:ww.franceshunt.co.uk/live/sofa-beds/portland-clic-clac-sofa-bed.html
2. info:www.franceshunt.co.uk/live/sofa-beds/portland-clic-clac-sofa-bed.html
So, you have something whack going on with your subdomains.
Digging a bit deeper:
site:franceshunt.co.uk/live/sofa-beds/portland-clic-clac-sofa-bed.html
This shows that we have not only ww. and www. results; we also have pages being returned on:
w.
ww.
www.
www.w.
These are all the clic-clac sofa bed pages, so that most likely explains that one away and could well be at the root of your other problems.
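Just to make the target behaviour concrete, here is a quick sketch (in Python, purely illustrative) of the host normalisation those 301s need to perform, assuming www.franceshunt.co.uk is the canonical host:

```python
BARE = "franceshunt.co.uk"
CANONICAL = "www." + BARE

def canonical_host(host: str) -> str:
    """Map any subdomain variant (w., ww., www.w., etc.) to the www. host."""
    host = host.lower().rstrip(".")
    if host == BARE or host.endswith("." + BARE):
        return CANONICAL
    return host  # not this domain; leave untouched

# The variants seen in the site: query above all collapse to one host.
for v in ["franceshunt.co.uk", "w.franceshunt.co.uk", "ww.franceshunt.co.uk",
          "www.w.franceshunt.co.uk", "www.franceshunt.co.uk"]:
    print(v, "->", canonical_host(v))
```

Every variant should resolve to the same www. URL before Google sees it, which is what the wildcard redirect described below achieves at the server level.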
I quickly checked the obvious and you do 301 from franceshunt.co.uk to www.franceshunt.co.uk, but if we do a general indexation query:
site:franceshunt.co.uk
We see all kinds of weirdness; even checking very quickly, the homepage alone is indexed and resolves on several of those subdomain variants.
So... it's not too hard to assume you may have lost a little bit of trust here through duplicate versions of the page.
It obviously needs a bit more digging around, but this should be easily fixed with a 301 from all these variations to www., plus a double-check across the board and on your internal linking to figure out just how this has happened and why the site resolves on those wacky subdomains.
I didn't find an:
if-we-create-duplicate-versions-of-the-site-do-we-get-more-serp-share.franceshunt.co.uk
but... it resolves, so it seems the site will resolve on any subdomain. That leaves us with two main issues:
1. The virtual host is wrongly configured, allowing the site to resolve (and rank) on anything.franceshunt.co.uk - a competitor could use this to harm you!
2. There are variations already indexed that you need to take care of. A wildcard (*.) rule that 301s anything other than www. to the www. version of the page should, given a bit of time for reindexation etc., do the job (or at least help; who's to say we don't have multiple issues here).
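If it helps, here's roughly what that wildcard rule can look like as an Apache catch-all vhost. The stack is an assumption on my part (an equivalent rule exists for Nginx and most other servers), so treat this as a sketch rather than a drop-in fix:

```apache
# Catch-all vhost: any host other than www.franceshunt.co.uk lands here
# and is permanently redirected to the www. host with the path intact.
<VirtualHost *:80>
    ServerName franceshunt.co.uk
    ServerAlias *.franceshunt.co.uk
    RewriteEngine On
    RewriteCond %{HTTP_HOST} !^www\.franceshunt\.co\.uk$ [NC]
    RewriteRule ^(.*)$ http://www.franceshunt.co.uk$1 [R=301,L]
</VirtualHost>
```

The key point is that the redirect is a 301 (permanent), so the indexed w./ww./www.w. variants pass their signals to the www. pages as Google recrawls them.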
Hope it helps and please let me know how it works out!
Marcus
-
First of all, thanks very much for taking the time to have a look for us and offer your opinions Marcus, much appreciated.
We are certainly going to be experimenting with the canonical tag in this way moving forward. We've never experienced problems with user interaction within the site since Google decided to start ranking the "show all" version of the pages instead, so we've never really worried too much about it until now.
The worst hit was another non-competitive term, "clic clac sofa bed" - we grew it steadily from 10th position back in Feb, it was 3rd last week (!), and it is no longer ranking at all. The page that was ranking is: http://www.franceshunt.co.uk/live/sofa-beds/
When this campaign began back in the old days of yore we were still using free directories for optimisation of deep pages. I've read a lot about these being slowly de-indexed by Google, so I was wondering if this was having an adverse impact on some of the "weaker" pages. As you can see, though, there has been no off-site optimisation towards this page; it's a pretty new term (only added to the campaign in Feb), so I'm discounting that theory - for now!
-
Hey
First up, you have rel=next & prev on the paginated pages, so that's good, but I would also use a rel=canonical to the view-all page as described here:
http://googlewebmastercentral.blogspot.co.uk/2011/09/view-all-in-search-results.html
The view-all page in this category is not huge and loads nice and quickly, so I can't see any reason not to 'help' Google and give them the indication that this is where you want all rankings for those pages to be concentrated.
As always, experimentation is needed, but I see things like this:
- You have a view-all page, that is the desired page to display, and Google prefers it all by itself.
- You have rel=next & rel=prev set up, which is really for when you want to display individual component pages rather than the main page.
- The search query you are referencing has no intent that makes it more specific to one of the paginated pages, so the ideal landing page is the view-all page.
So, remove the rel=next & rel=prev, canonical to the view-all page, and see how you get on. Allow it to reindex, record the results, and make a decision based on that information.
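To sketch what that change looks like in the head of each paginated page (the URLs are the ones from this thread; the markup is illustrative):

```html
<!-- On /dine/extending-dining-tables/?p=2 and every other component page:
     drop the rel="next"/rel="prev" pair and point the canonical at view-all. -->
<link rel="canonical"
      href="http://www.franceshunt.co.uk/dine/extending-dining-tables/?p=all" />
```

This tells Google explicitly to consolidate indexing and ranking signals from the component pages onto the ?p=all version it already prefers.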
As a disclaimer, this may not make any difference to the ranking, as it seems they are not indexing your paginated pages, and if we do an info query on the main category page it shows details for the show-all page. That said, this is the correct way to do it unless you would rather show the individual pages, so I would still make the change.
I think when it comes down to it, Harveys just have something like 5x as many linking domains as you, and you both have fairly natural-looking anchor text (at the most cursory of views), so they are simply outranking you here. I have not dug into the other results between you and them, and a drop from 3 to 11 is a bit more than the usual flutters - is there anything else that has had a similar drop?
-
Thanks Marcus!
Our site is http://www.franceshunt.co.uk/
We have asked a couple of questions before on Moz as to how best to solve the pagination issues within our site.
Google seems to prefer to rank the "show all" version of the targeted landing pages.
So whilst we are optimising http://www.franceshunt.co.uk/dine/extending-dining-tables/
Google prefers to rank http://www.franceshunt.co.uk/dine/extending-dining-tables/?p=all
Which hasn't caused us any problems before, yet now I'm wondering if this could be part of the issue too. Please let us know what you think!
-
We were ranking third before the update for this term.
Brand exposure and social signals surely relate to their number one position, but what's with the second result?
It is the same landing page, reached through a different navigational path, and that is what I'm questioning here.
-
Hmmm, yeah, that kind of sucks. That is the same page, and like you say, it just seems to be tagged as either living room or dining room. Looking at them closely, they are vaguely different, but there's not a lot in it; both are just weak category pages.
Whilst this is an obvious example of something amiss (they should not have the top two spots), I would not waste too much time worrying about it. I imagine this will be a short-lived deal for them.
Can you drop a link to your site? Maybe we can better advise you on what you can control so you can try to win back some footing here?
-
The update went in favour of companies with good brand exposure, so it is possible that Harveys' link profile is a mix of brand and keyword anchor text.
You'll also notice they have 9,000+ Facebook fans; to obtain that they must actively work on social media, so you're also looking at social signals being built, another thing Google is now focusing on.
But I don't really see that keyword being that competitive, so you should be able to push back through the SERPs.