What's happening with Google UK?
-
Within the last week we have had a handful of our rankings drop dramatically down the SERPs - about 15%, but this is an estimate and has not been fully investigated yet.
Whilst looking into possible scenarios that could be causing this, I wanted to check what the SERPs looked like for the terms on which we are still holding position.
Typing "extending dining tables" into Google UK today, I was amazed at what I found...
Ranking in positions 1 and 2 is a massive UK furniture store.
But isn't that the same landing page being returned for both positions? It appears to be a navigation problem within the site's category tags causing duplicate content. Yet they have been rewarded with the top two positions, subsequently pushing our website onto page two.
I find it so frustrating that we listen to Google's best practices when it comes to pagination issues, yet this is how our hard work is rewarded!
Anyone else have any thoughts about this?
-
Pleasure. Shout if I can help!
-
Fantastic, thank you very much. Interestingly, this website is hosted on a different platform to our others, so I wonder whether this has something to do with the config. We'll set up 301s for w. and ww. as a short-term fix and look at the config going forward.
Many thanks again.
-
Hey, I think I have spotted something:
Google this:
portland clic-clac sofa bed
& closely check the result:
http://ww.franceshunt.co.uk/live/sofa-beds/portland-clic-clac-sofa-bed.html
ww not www
Also, we have another version of that page indexed:
1. info:ww.franceshunt.co.uk/live/sofa-beds/portland-clic-clac-sofa-bed.html
2. info:www.franceshunt.co.uk/live/sofa-beds/portland-clic-clac-sofa-bed.html
So, you have something whack going on with your subdomains.
Digging a bit deeper:
site:franceshunt.co.uk/live/sofa-beds/portland-clic-clac-sofa-bed.html
This shows that not only do we have some ww. & www. results, we also have pages being returned on:
w.
ww.
www.
www.w.
These are all the clic-clac sofa bed pages, so that most likely explains that one away and could well be at the root of your other problems.
I quickly checked the obvious, and you do 301 from franceshunt.co.uk to www.franceshunt.co.uk, but if we do a general indexation query:
site:franceshunt.co.uk
We see all kinds of weirdness, and for the homepage alone (again, checking very quickly) we have the page indexed and resolving on several of these subdomain variations.
So... it's not too hard to assume you may have lost a little bit of trust here through duplicate versions of the page.
It obviously needs a bit more digging around, but this should be easily fixed with a 301 from all these variations to www., plus a double-check across the board and on your internal linking to figure out just how this has happened and why the site resolves on those wacky subdomains.
I didn't find a:
if-we-create-duplicate-versions-of-the-site-do-we-get-more-serp-share.franceshunt.co.uk
indexed, but... it resolves, so it seems the site will resolve on any subdomain. That leaves us with two main issues:
1. The virtual host is wrongly configured, allowing the site to resolve (and rank) on anything.franceshunt.co.uk - a competitor could use this to harm you!
2. There are variations indexed that you need to take care of. A wildcard (*.) rule that 301s anything other than www. to the www. version of the page should, given a bit of time for reindexation etc., do the job (or at least help - who's to say we don't have multiple issues here?). A minimal sketch of such a rule follows below.
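For illustration, here is a minimal sketch of that wildcard rule as an Apache .htaccess rewrite. It assumes the site runs on Apache with mod_rewrite enabled - the hosting setup hasn't been confirmed in this thread, so treat it as a starting point rather than a drop-in fix:

# Sketch only - assumes Apache with mod_rewrite enabled.
RewriteEngine On
# Match any host that is not exactly www.franceshunt.co.uk
# (catches w., ww., www.w. and any other made-up subdomain).
RewriteCond %{HTTP_HOST} !^www\.franceshunt\.co\.uk$ [NC]
# Permanently redirect to the same path on the www. host.
RewriteRule ^(.*)$ http://www.franceshunt.co.uk/$1 [R=301,L]

The equivalent on nginx would be a catch-all server block returning a 301. Whichever server is in use, the check is the same: request any page on a non-www. host and confirm you get a 301 to the www. version rather than a 200.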
Hope it helps and please let me know how it works out!
Marcus
-
First of all, thanks very much for taking the time to have a look for us and offer your opinions, Marcus - much appreciated.
We are certainly going to be experimenting with the canonical tag in this way moving forward. We've never experienced problems with user interaction on the site since Google decided to start ranking the "show all" version of the pages instead, so we've never really worried too much about it until now.
The worst hit was another non-competitive term, "clic clac sofa bed" - we grew it steadily from 10th position back in Feb, it was 3rd last week (!), and it is no longer ranking at all! The page that was ranking is: http://www.franceshunt.co.uk/live/sofa-beds/
When this campaign began back in the old days of yore, we were still using free directories for optimisation of deep pages. I've read a lot about these being slowly de-indexed by Google, so I was wondering if this was having an adverse impact on some of the "weaker" pages. As you can see, though, there has been no off-site optimisation towards this page, and it's a pretty new term (only added to the campaign in Feb), so I'm discounting that theory - for now!
-
Hey
First up, you have rel=next & rel=prev on the paginated pages, so that's good, but I would also use rel=canonical to the view-all page, as described here:
http://googlewebmastercentral.blogspot.co.uk/2011/09/view-all-in-search-results.html
The view-all page in this category is not huge and loads nice and quickly, so I can't see any reason not to 'help' Google and give them the indication that this is where you want all rankings for those pages to be concentrated.
As always, experimentation is needed, but I see things like this:
- You have a view-all page, that is the desired page to display, and Google prefers it all by itself.
- You have a rel=next & rel=prev setup that is really for when you want to display individual component pages rather than the main page.
- The search query you are referencing has no intent that makes it more specific to one of the paginated pages, so the ideal landing page is the view-all page.
So, remove the rel=next & rel=prev, canonical to the view-all page, and see how you get on. Allow it to reindex, record the results, and make a decision based on that information. (A quick sketch of the markup follows below.)
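As a quick illustration, this is roughly what the head of a paginated category page could look like after that change. The ?p=all view-all URL is taken from this thread; the ?p=3 component URL is assumed here purely for illustration:

<head>
  <!-- Before: pagination annotations on, e.g., page 2 of the category -->
  <!-- <link rel="prev" href="http://www.franceshunt.co.uk/dine/extending-dining-tables/"> -->
  <!-- <link rel="next" href="http://www.franceshunt.co.uk/dine/extending-dining-tables/?p=3"> -->

  <!-- After: drop rel=next/prev and point each paginated page at the view-all version -->
  <link rel="canonical" href="http://www.franceshunt.co.uk/dine/extending-dining-tables/?p=all">
</head>

The view-all page itself would carry either no canonical or a self-referencing one, so the consolidated signals land there.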
As a disclaimer, this may not make any difference to the ranking, as it seems they are not indexing your paginated pages AND, if we do an info: query on the main category page, it shows details for the show-all page. That said, this is the correct way to do it unless you would rather show the individual pages, so I would still make the change.
I think when it comes down to it, Harveys just have something like 5x as many linking domains as you, and you both have fairly natural-looking anchor text (at the most cursory of views), so they are simply outranking you here. I have not dug into the other results between you and them, and a drop from 3rd to 11th is a bit more than the usual flutter - is there anything else that has had a similar drop?
-
Thanks Marcus!
Our site is http://www.franceshunt.co.uk/
We have asked a couple of questions before on Moz as to how best to solve the pagination issues within our site.
Google seems to prefer to rank the "show all" version of the targeted landing pages.
So whilst we are optimising http://www.franceshunt.co.uk/dine/extending-dining-tables/
Google prefers to rank http://www.franceshunt.co.uk/dine/extending-dining-tables/?p=all
This hasn't caused us any problems before, yet now I'm wondering if this could be part of the issue too. Please let us know what you think!
-
We were ranking third for this term before the update.
Surely brand exposure and social signals are related to their number one positioning, but what's with the second result?
It's the same landing page, just reached through a different navigational path. That's what I'm questioning here.
-
Hmmm, yeah, that kind of sucks. That is the same page, and like you say, it just seems to be tagged as either living room or dining room. Looking at them closely, they are vaguely different - not a lot in it; both are just weak category pages.
Whilst this is an obvious example of something amiss - they should not have the top two spots - I would not waste too much time worrying about it. I imagine this will be a short-lived deal for them.
Can you drop a link to your site? Maybe we can better advise you on what you can control so you can try to win back some footing here?
-
The update went in favour of companies with good brand exposure, so it is possible that Harvey's link profile is a mix of brand and keyword anchor text.
You'll also notice they have 9,000+ Facebook fans; to obtain that, they must actively work on social media, so you're also looking at social signals being built - another thing Google is now focusing on.
But I don't really see that keyword as being all that competitive, so you should be able to push back through the SERPs.