URL Duplicate Content Issues (Website Transition)
-
Hey guys,
I just transitioned my website and I have a question. I have built up all the link juice around my old URL style. To give you some clarity:
My old CMS rendered links like this: www.example.com/sweatbands
My new CMS renders links like this: www.example.com/sweatbands/
My new CMS's auto-sitemap also generates them with the slash on the end. Throughout the website, the CMS links to pages with the trailing slash, while I link to them without it (because it's what I'm used to). The canonical tag currently points to the version without the slash.
Should I just 301 to the version with the slash before Google crawls again? I'm worried that I'll lose all the trust and ranking I built up for the version without the slash. I rank very high for certain keywords, and some pages house a large portion of our traffic. What a mess! Help!
-
Hi Paul, did you find a good way to automatically do the trailing slash redirect?
-
Paul--you may be able to automate this using Apache's rewrite module (mod_rewrite) or just a few lines of PHP. Check out the SEOmoz "how to" on 301 redirects and scroll down to the "SEO Best Practices" section.
The article advises that the fastest way to make the switch is to use the PHP header() function. If you'd rather not edit PHP directly (e.g., you're on WordPress or Joomla), look at the instructions for editing the .htaccess file instead.
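As a rough sketch of the header() approach (not the article's exact code), a redirect placed at the very top of a PHP page, before any output, might look like this. The example.com domain is a stand-in from the original question--swap in your own:

```php
<?php
// Hypothetical sketch: 301 any extensionless path without a trailing
// slash to the slashed version. Adjust the domain for your own setup.
$path = parse_url($_SERVER['REQUEST_URI'], PHP_URL_PATH);
if (substr($path, -1) !== '/' && pathinfo($path, PATHINFO_EXTENSION) === '') {
    header('HTTP/1.1 301 Moved Permanently');
    header('Location: https://www.example.com' . $path . '/');
    exit;
}
```

The extension check is there so real files (images, CSS, etc.) don't get a slash appended.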
It's a little dense, but hopefully this can save you hours of manually typing out the new URLs in a text doc.
-
Thank you--Apache.
-
Paul--there are some automated options to avoid rewriting the lines by hand. What kind of server are you running? Windows? Linux? Let me know and I'll try to throw a script your way.
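Since Apache came up in the thread, here's a minimal sketch of the mod_rewrite approach in .htaccess--a common pattern, not tailored to any particular CMS, so test it on a staging copy first:

```apache
# Hypothetical .htaccess sketch: 301-redirect any request that isn't a
# real file and lacks a trailing slash to the slashed version.
RewriteEngine On
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_URI} !(.*)/$
RewriteRule ^(.*)$ /$1/ [L,R=301]
```

The `!-f` condition keeps real files (images, scripts) from being rewritten, and `R=301` makes the redirect permanent so link juice passes to the slashed URLs.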
-
Josh, thank you. I was going in that direction but just wasn't sure. I'm going to have to add a lot of slashes here in a minute. Anyone else have any input?
-
Paul,
I understand--your situation seems messy! Good news, though: this should be a relatively simple fix.
A 301 (permanent) redirect is designed to pass your link juice to the target URL. IMO, you should 301 all of the "no slash" pages to the versions with slashes. This will keep your site consistent as you continue to produce content under the new CMS.
As for the canonical tag: technically it won't make a difference either way, but as a best practice, I would point the canonical at the version with the slash. This avoids calling attention to a URL that will ultimately redirect. Since the new "slash" page is going to inherit all of the "no slash" link juice (via the 301), it is appropriate to label it as canonical.
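Using the sweatbands URL from the original question as a stand-in, the tag in each page's head would simply point at the slashed version:

```html
<!-- Hypothetical example: canonical pointing at the trailing-slash URL -->
<link rel="canonical" href="https://www.example.com/sweatbands/" />
```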
Even if you were to see a slight fluctuation in your ranking, don't be alarmed--nothing will have changed in the eyes of the search engine.
In short: 301 to your heart's content and keep producing good content.