Second URL
-
Hi
We have a .com and a .co.uk
Main website is .co.uk, we also have a landing page for the .com
If we redirect the .com to the .co.uk, will it create duplicate content ...
May seem like a silly question, but I want to be sure that visitors can't access our website at both URLs, as that would be duplicate content.
Thanks in advance
John
-
Thanks Tom
-
Hi John
No, if you put the redirect in place it won't create duplicate content. In fact, redirections are often used to avoid any potential duplication problems.
The redirect tells Googlebot that the old URL is no longer required and that the URL it points to is the correct one. The bot will stop crawling and indexing the old URL, pass on the value of any links pointing to the old URL, and treat the new URL as the definitive one.
Out with the old and in with the new, so to speak!
You can read more on redirection with the Moz Guide.
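Just to illustrate the mechanics, a permanent redirect can be as small as this Python standard-library sketch. The domain is a placeholder, not your actual site, and in practice you'd set this up in your web server or hosting control panel rather than run your own process:

```python
# Minimal illustration of a 301 (permanent) redirect using only the
# Python standard library. The target domain is a placeholder.
from http.server import BaseHTTPRequestHandler, HTTPServer

TARGET = "https://www.example.co.uk"  # hypothetical main site

class RedirectHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # 301 tells crawlers the move is permanent: drop the old URL
        # from the index and credit its links to the new one.
        self.send_response(301)
        self.send_header("Location", TARGET + self.path)
        self.end_headers()

    def log_message(self, *args):  # keep the sketch quiet
        pass

# To run: HTTPServer(("", 8080), RedirectHandler).serve_forever()
```

The key detail is the status code: 301 (permanent), not 302 (temporary), is what signals search engines to consolidate everything onto the new URL.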
Hope this helps!
Related Questions
-
New website on a new URL?
We have a new website on a new URL (it's been up for around two years now) and our old website is slowly fading into the background. We are now at the point where the money is still OK, but we are having issues running both side by side. We have a calculator on each page and are thinking about removing it and adding a box saying "please order from our new site here" (with the URL of the similar page). The issue is we don't want to link purely for SEO purposes and have Google hammer us (we're thinking of no-following these links). We also had a penalty on the site in 2012, which we did recover from. Would this cause any issue for the new site?
Technical SEO | BobAnderson
-
Should I change the URL now?
Hi all, I have a client website that got hit in the latest algorithm update. It appears that it had over 100 suspect links pointing to it. I performed the disavow procedure a few weeks ago via my Google Webmaster account, but have not yet received a message to say it's been actioned. The majority of these suspect links go to one page. I am considering changing the base category (in WordPress) to a different keyphrase and then submitting a new sitemap for indexing. That way there will be no actual link from a suspect website to a page on my website. Do you see what I mean? Do you think this will help? Thanks in advance.
Technical SEO | BrandC
-
URL Structure for Deal Aggregator
I have a website that aggregates deals from various daily-deals sites. I originally had all the deals on one page, /deals; however, I thought it might be more useful to have several pages, e.g. /beautydeals or /hoteldeals. However, if I give every section its own page, that means I have either no current deals on the main /deals page or duplicate content. I'm wondering what the best approach might be? A few of the options that come to mind are:
1. Return to having all the deals on one page, /deals, and link internally to content within that page.
2. Have both a main /deals page with all of the deals plus other pages such as /beautydeals, but add rel="canonical" to point to the main /deals page.
3. Create new content for the /deals page... however, I think people will probably want to see at least some deals straight away, rather than having to click through to another page.
4. Display some sub-categories on the main /deals page, but have separate URLs for other, more popular sub-categories, e.g. /beautydeals (this is how it works at the moment).
I should probably point out that the site also has other content, such as events and a directory. Any suggestions on how best to approach this are much appreciated! Cheers, Andy
Technical SEO | andywozhere
-
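For option 2, the canonical tag is a one-liner in each sub-page's head section. A minimal sketch, with a placeholder domain:

```html
<!-- On /beautydeals: tell search engines /deals is the page to index -->
<link rel="canonical" href="https://www.example.com/deals" />
```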
Case-sensitive URLs
Hi, I'd really appreciate advice on this one! We had a problem with case-sensitive URLs (e.g. /web-jobs vs /Web-jobs). We added code to convert all URLs into lowercase and added a 301 redirect. We are now experiencing problems with duplicate page content: each time a URL contains a capital letter, it is converted and redirected to the lowercase URL. I can convert all URLs into lowercase (in all places), but the problem is that Google has already indexed the URLs, so they may cause a duplicate-content issue. The proposed solution: remove the 301 redirect that converts URLs to lowercase, and add a canonical URL pointing to the all-lowercase version, so Google indexes content only from the canonical URL. But I am a little confused about what will happen to the already-indexed pages with capitals in the URL. Appreciate any advice you can give. Simon
Technical SEO | simmo235
-
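The lowercase-and-301 approach described in that question can be sketched as a pure function. This is illustrative only; the function name is made up:

```python
def normalise_path(path: str) -> tuple[int, str]:
    """Return an (HTTP status, path) pair for an incoming request path.

    If the path contains any capital letters, answer with a single 301
    to the all-lowercase version so only one spelling gets indexed;
    otherwise serve the page as-is.
    """
    lower = path.lower()
    if path != lower:
        return 301, lower
    return 200, path
```

A rel="canonical" tag on each page pointing at the lowercase URL covers the capitalised copies that are already in the index while they drop out.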
How do I use only one URL
My site can be reached at both www.site.com and site.com. How do I make it use only www?
Technical SEO | Weblion
-
Should we introduce subfolders into the URLs on a new site?
A site we are working on currently gives no indication of subfolders in the URL. E.g. the site uses www.examplesite.com/brand-name rather than www.examplesite.com/popular-products/brand-name. There are breadcrumbs on the site to show users what part of the site they are in and how they navigated there. We are building a new site and have to decide which route to take: since the site is already performing relatively well in the SERPs and the URLs are nice and short this way, is it a good idea to keep them like this, or is it better for usability to include the subfolders? This post suggests that we would be best off keeping the URLs as they are, particularly since less would change: http://www.seomoz.org/blog/should-i-change-my-urls-for-seo Thanks in advance for your opinions! Liz @lizstraws
Technical SEO | oneresult
-
How to Block URLs with Specific Components from Googlebot
Hello, I have around 100,000 error pages showing in Google Webmaster Tools. I want to block specific components like com_fireboard, com_seyret, com_profiler, etc. A few examples:
http://www.toycollector.com/videos/generatersslinks/index.php?option=com_fireboard&Itemid=824&func=view&catid=123&id=16494
http://www.toycollector.com/index.php?option=com_content&view=article&id=6932:tomica-limited-nissan-skyline-r34--nissan-skyline-gt-r-r34-vspec&catid=231&Itemid=634
I tried blocking them using robots.txt with:
Disallow: /com_fireboard/
Disallow: /com_seyret/
But it's not working. Can anyone suggest how to solve this problem? Many thanks, Shradda
Technical SEO | TheMartingale
-
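One point worth noting about those examples: com_fireboard appears as a query-string value, not a path segment, so a path-only rule like Disallow: /com_fireboard/ never matches those URLs. A sketch of rules using the wildcard matching Googlebot supports; the exact patterns are an assumption based on the example URLs above:

```text
User-agent: Googlebot
Disallow: /*option=com_fireboard
Disallow: /*option=com_seyret
Disallow: /*option=com_profiler
```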
Blank Canonical URL
So my devs have the canonical URL loaded onto pages automatically, and in most cases this is done correctly. However, we ran across a bug that left some of these blank, like so: <link rel="canonical" href="" />. Does anyone know what effect that would have? I am trying to assign this a priority so I can say "fix it now" or "fix it after the other 'fix it now' items". Let me know if you have any ideas. I just want to be sure I am not telling Google that all of these pages are like the home page. Thanks!
Technical SEO | SL_SEM