UK SERPs
-
Hi there,
I have a website that is aimed primarily at the US market. However, I have the option to create some content for people in my niche in the UK. Is there anything I can do to help the site rank on google.co.uk for the UK content, bearing in mind that most of my content is aimed at the US?
Thanks,
Peter
-
Thanks Svet!
-
First, I will start with the most annoying and obvious answer: try to get as many backlinks as possible, most of them from sites firmly associated with the UK market.
You can also try serving the UK content from a subdomain or, even better, a /folder on your site. You will then be able to associate that section with the relevant market in Google Webmaster Tools. A subdomain may be the better choice if you plan to submit your site to local listings. I found this post by Rand Fishkin on ranking in Google Local useful for ideas about local listings.
Regards,
Svet
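To make the US/UK split machine-readable, the /folder approach Svet describes pairs naturally with hreflang annotations on each paired page. A minimal sketch follows; the domain, the /uk/ folder convention, and the URLs are illustrative assumptions, not details from this thread:

```python
# Sketch: generate the hreflang <link> tags that paired US/UK pages
# should both carry. Domain and /uk/ folder layout are assumptions.

def hreflang_tags(us_url: str) -> str:
    """Build hreflang link tags for a US page and its UK counterpart."""
    # Assumed convention: the UK copy lives under a /uk/ subfolder.
    uk_url = us_url.replace("https://www.example.com/",
                            "https://www.example.com/uk/")
    tags = [
        f'<link rel="alternate" hreflang="en-US" href="{us_url}" />',
        f'<link rel="alternate" hreflang="en-GB" href="{uk_url}" />',
        # x-default points searchers from other regions at the US original.
        f'<link rel="alternate" hreflang="x-default" href="{us_url}" />',
    ]
    return "\n".join(tags)

print(hreflang_tags("https://www.example.com/widgets/"))
```

The same annotations could equally go in the XML sitemap rather than the page head; either way, both versions must reference each other for the tags to count.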
Related Questions
-
How can I avoid a duplicate brand name in the SERP title?
Hello, how can I avoid a duplicated brand name in the SERP title? For example:
Technical SEO | jh0sz
On this page: https://www.latam.com/es_cl/ the title is set to: <title>LATAM Airlines en Chile | Sitio Oficial</title>
But the SERP shows: LATAM Airlines en Chile | Sitio Oficial - LATAM.com. Can I avoid LATAM.com being appended at the end of the title? Regards
-
Google UK and the slog of link building
Background:
Technical SEO | Brinley
I have a number of sites built using the open-source eCommerce software Zen Cart. One of these sites was penalised by the original Penguin algorithm back on April 24, 2012. The reason for the penalty was that two ecommerce sites in Hong Kong linked to the above site in the footer of their 2,000- and 4,000-product websites. I have no idea why those sites carried the links, and even though I did contact them a few months before the Penguin massacre asking them to remove the footer link, I was unaware of the ticking time bomb they presented. The result, as is now ingrained in SEO history, was that the site was moved to sit alongside Google's equivalent of the restaurant at the end of the universe, and it stayed there for two years until April 2014.
As I had never indulged in link building, for the simple reason that I found it laborious, I was obviously infuriated with the resulting loss of revenue, but that was balanced with an understanding that I had not kept pace with the changing landscape of SEO according to Google. The quest I am now on is to raise my three sites' profile on the web without getting another spanking from Google in the near future. The problem I have is that white hat today may well be black hat tomorrow. (I can recall the days when Google said links are good, everyone went out and asked other websites to link to them, and look where that led.) So do I ignore actively cultivating links, as some suggest, and look to produce good content (which is quite difficult when you make mugs and candles, by the way), or do I go out and intentionally build links by studying competitors' links, reviewing link opportunities, or getting bloggers to review products? For a small lifestyle entrepreneur like myself, the ever-changing SEO landscape and the amount of time and effort it requires is slowly and inevitably pushing us back out to that restaurant mentioned earlier. If only Google had a little brother designed purely for small businesses, like it was in the good old days before the dinosaur that is big business grunted and thought, hmmm, what's that?
And if there were such a thing, I would add a caveat making it illegal to generate pointless amounts of cyber content, because the web is becoming something akin to a landfill. Which leaves me nowhere, really, but I think I am okay with that. Waiter!!
-
Google dropping pages from SERPs even though indexed and cached. (Shift over to https suspected.)
Anybody know why pages that have previously been indexed, and that are still present in Google's cache, are now not appearing in Google SERPs? All the usual suspects (noindex, robots, duplication filter, 301s) have been ruled out. We shifted our site over from http to https last week and it appears to have started then, although we have also been playing around with our navigation structure a bit too. Here are a few examples:

Example 1:
Live URL: https://www.normanrecords.com/records/149002-memory-drawings-there-is-no-perfect-place
Cached copy: http://webcache.googleusercontent.com/search?q=cache:https://www.normanrecords.com/records/149002-memory-drawings-there-is-no-perfect-place
SERP (1): https://www.google.co.uk/search?q=memory+drawings+there+is+no+perfect+place
SERP (2): https://www.google.co.uk/search?q=memory+drawings+there+is+no+perfect+place+site%3Awww.normanrecords.com

Example 2:
SERP: https://www.google.co.uk/search?q=deaf+center+recount+site%3Awww.normanrecords.com
Live URL: https://www.normanrecords.com/records/149001-deaf-center-recount-
Cached copy: http://webcache.googleusercontent.com/search?q=cache:https://www.normanrecords.com/records/149001-deaf-center-recount-

These are pages that have been linked to prominently from our homepage (Moz PA of 68) for days, are present and correct in our sitemap (https://www.normanrecords.com/catalogue_sitemap.xml), have unique content, have decent on-page optimisation, etc. We moved over to https on 11 Aug. There were some initial wobbles (e.g. 301s from normanrecords.com to www.normanrecords.com got caught in a nasty loop due to the conflicting 301 from http to https), but these were quickly sorted (i.e. spotted and resolved within minutes). There have been some other changes to the structure of the site (e.g. a reduction in the navigation options), but nothing I know of that would cause pages to drop like this.
For the first example (Memory Drawings), we were ranking on the first page right up until this morning and had been receiving Google traffic for it ever since it was added to the site on 4 Aug. Any help very much appreciated! At the very end of my tether / understanding here... Cheers, Nathon
Technical SEO | nathonraine
-
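The http-to-https plus non-www-to-www conflict described in that question is easy to reproduce offline before touching the live server. A minimal sketch, where the redirect rules are hypothetical stand-ins mimicking the conflict described, walks a map of redirects and flags a loop:

```python
# Sketch: follow a map of URL -> redirect target and detect loops.
# The redirect rules below are hypothetical, mimicking conflicting
# http->https and bare-domain->www rules.

def follow_redirects(start, redirects, max_hops=10):
    """Return (chain of URLs visited, loop flag)."""
    chain = [start]
    url = start
    for _ in range(max_hops):
        nxt = redirects.get(url)
        if nxt is None:
            return chain, False  # chain resolves, no loop
        if nxt in chain:
            chain.append(nxt)
            return chain, True   # revisited a URL: loop detected
        chain.append(nxt)
        url = nxt
    return chain, True           # too many hops, treat as a loop

rules = {
    "http://normanrecords.com/": "https://normanrecords.com/",
    "https://normanrecords.com/": "http://www.normanrecords.com/",
    "http://www.normanrecords.com/": "https://normanrecords.com/",
}
chain, looped = follow_redirects("http://normanrecords.com/", rules)
print(looped)  # prints True: https://normanrecords.com/ is revisited
```

In practice the `redirects` map would be built by issuing HEAD requests and reading each `Location` header, but the loop-detection logic is the same.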
Sudden drop in google.co.uk rankings
Anyone else had any sudden drops in rankings this week? Is there an update going on? One of my primary keywords has dropped from 6th to 49th in the google.co.uk search results. Nothing in Webmaster Tools flags an issue. I have downloaded the links from Webmaster Tools, and it does look as if some content has been scraped and then linked back to us from a large number of sites that we have never sought links from. I have uploaded a file to the Google disavow links tool. Only one keyword appears to be affected, not all of them. Any ideas? Thanks
Technical SEO | highwayfive
-
How do you get a Google+ pic in your SERP snippet?
Hi from 20°C and 83% humidity in Wetherby, UK 🙂 A few weeks back I decided I needed to get my pretty face appearing in the SERPs for www.davidclick.com. But after having set up a Google+ account and linked my site to it, I think I may have done something wrong 😞 I linked to the Google+ page via a footer link on www.davidclick.com, but alas I'm not able to get my face into my SERP snippet, which this website has managed: http://i216.photobucket.com/albums/cc53/zymurgy_bucket/google-plus-picJPGcopy.jpg So my question is, please: how do you get your Google+ account image to appear in the SERPs? Ta muchly,
Technical SEO | Nightwing
David
-
Will same-language, different-region (US/UK) geotargeting via subdirectory (& GWT) cause duplicate content or other issues?
If a UK-hosted site on a .com now needs to target the US too, but for keywords that are spelt differently in the US, is it OK to create a duplicate version of the UK-hosted .com site, put it in a subdirectory (.com/us/), and geotarget it (to the USA) via Webmaster Tools? I take it there are no duplicate content issues (or other issues) in this scenario so long as it is geotargeted via GWT? Or are there? Comments from anyone with experience doing something similar (same-language, different-region geotargeting, with keyword spelling being the only difference, via a subdirectory or other route) much appreciated. Many thanks 🙂
Technical SEO | Dan-Lawrence
-
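A practical sub-problem in that scenario is applying the US spelling variants consistently when cloning page copy into the /us/ folder. A toy sketch follows; the word list is a small illustrative sample, not an exhaustive or authoritative mapping:

```python
# Sketch: localise UK spellings to US ones when cloning page copy
# for a /us/ subdirectory. The word list is an illustrative sample.
import re

UK_TO_US = {
    "colour": "color",
    "optimisation": "optimization",
    "jewellery": "jewelry",
}

def localise_us(text: str) -> str:
    """Replace whole-word UK spellings with their US equivalents."""
    pattern = re.compile(r"\b(" + "|".join(UK_TO_US) + r")\b",
                         re.IGNORECASE)
    # Look up each matched word (lower-cased) in the mapping.
    return pattern.sub(lambda m: UK_TO_US[m.group(0).lower()], text)

print(localise_us("Our jewellery colour guide covers on-page optimisation."))
# prints: Our jewelry color guide covers on-page optimization.
```

Word-boundary matching avoids mangling substrings, though a real pass would also need case preservation and a much larger dictionary.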
Why do some SERPs have a + map symbol?
Hi from sunny but freezing Wetherby, UK. I've noticed that when you enter "York solicitors", some listings have a "+ Show map" symbol. Here is a screenshot to illustrate: http://i216.photobucket.com/albums/cc53/zymurgy_bucket/plus-map-serps-langleyscopy.jpg I'd like to know, please, what I would have to do to emulate this. Thanks in advance, David
Technical SEO | Nightwing