www vs. non-www, .co.uk vs. .com
-
When I started SEO I didn't really know what I was doing (still don't!). Just wondering if anyone can help me with this small problem.
I now understand that I basically have four URLs:
www.ablemagazine.com (Page Authority: 38/100)
www.ablemagazine.co.uk (Page Authority: 47/100)
ablemagazine.com (Page Authority: 3/100)
ablemagazine.co.uk (Page Authority: 51/100)
What should my configuration be to ensure I'm not losing massive amounts of link juice? At the moment I have ablemagazine.co.uk set as my default domain in Webmaster Tools. www.ablemagazine.com, www.ablemagazine.co.uk and ablemagazine.com all 301 redirect there (I think).
-
Just checked with Rank Tracker and the .com domains don't appear to be ranking for any of my keywords (including the domain name). Will I notice a bump in the SERPs/traffic once I 301 the .com domain to .co.uk, or has Google already figured that out behind the scenes?
-
#1 - You'll probably incur a small hit, but the authority is so closely matched between the pages that I don't see it as a massive problem (are the .com pages ranking, by the way?)
#2 - Adding a 301 to a page will pass most of the authority/link juice, but not all of it. The time taken can depend on when the page is next crawled and indexed and/or how often the page is crawled. Hard to give a definite answer, I'm afraid!
On a side note: other members will say that if you're ranking, why change what you've currently got? That's fine, but in my opinion you should future-proof as much as you can, and having the www version is just best practice.
DD
-
If I do this, will it negatively affect my SERPs (temporarily)?
How long before all the page authority is passed on if I do this (roughly)?
Thanks
-
Hey,
The only 301 you have set up is from www.ablemagazine.co.uk to ablemagazine.co.uk.
The other URLs are returning a 200 status, which means you effectively have three live versions of your homepage.
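You can verify this yourself with a header-only curl per hostname (the hostnames below are the four from the question; `--max-time` just stops a dead host from hanging the loop):

```shell
# Print the HTTP status each hostname returns: 301 means it redirects,
# 200 means it serves its own copy of the homepage.
for host in www.ablemagazine.com ablemagazine.com www.ablemagazine.co.uk ablemagazine.co.uk; do
  code=$(curl -sI --max-time 5 -o /dev/null -w '%{http_code}' "http://$host/")
  echo "$host -> $code"
done
```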
In a perfect world I would make www.ablemagazine.co.uk your actual domain, then 301 all the other URLs, including the non-www version of the page, to that address. When people link to your site they are more likely to use the full address rather than the non-www version. (Do this if you can easily update all URLs across the site.)
DD
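For reference, a minimal sketch of how those 301s could look on an Apache server, assuming an .htaccess file with mod_rewrite enabled and all four hostnames pointed at the same server. The canonical host here is www.ablemagazine.co.uk as recommended above; swap in whichever host you settle on:

```apache
# Hypothetical .htaccess sketch: any request whose Host header is not the
# canonical www.ablemagazine.co.uk gets one 301 to the same path on it.
RewriteEngine On
RewriteCond %{HTTP_HOST} !^www\.ablemagazine\.co\.uk$ [NC]
RewriteRule ^(.*)$ http://www.ablemagazine.co.uk/$1 [R=301,L]
```

A single host-based rule like this avoids redirect chains (e.g. .com to non-www to www), which would each leak a little more authority than one hop.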