Redirecting duplicate pages
-
For whatever reason, X-cart creates duplicates of our categories and articles, so that we have URLs like this:
www.k9electronics.com/dog-training-collars
www.k9electronics.com/dog-training-collars/
or
http://www.k9electronics.com/articles/anti-bark-collar
http://www.k9electronics.com/articles/anti-bark-collar/
Now our SEO guy says that we don't have to redirect these because Google is "smart enough" to know they are the same, and that we should "leave it as-is". However, everything I have read online says that Google sees this as duplicate content and that we should redirect to one version or the other (with slash or without), depending on which format most of our internal links already point to, which in our case is with a slash. What should we do? Redirect or leave it as is? Thanks!
-
I agree 100% with Matt. I would also recommend checking your internal links to ensure that they are all following the same format with or without the /.
-
Hi there. URLs with a trailing slash versus without a trailing slash: this has been discussed and debated in SEO circles for a long time. The fact is you can go with either, but be consistent with whichever you choose throughout your website. Personally, I prefer URLs that end in a folder name to carry a trailing slash, and URLs that end in a file name to end with the proper file extension; to my mind, anything that ends without a trailing slash is a file, not a folder (don't you think I am biased towards static websites?). This issue is very common with dynamic websites, and to conclude: as your SEO guy rightly mentioned, there is nothing to worry about, and Google handles this issue (rather, this common practice, or should I call it a mistake?) pretty well. Good luck.
Regards,
Devanur.
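For what it's worth, the folder-vs-file convention described above can be sketched as a small normalization rule. This is purely illustrative (not X-cart behaviour); the function name and rules are my own:

```python
import posixpath

def canonicalize(path: str) -> str:
    """Normalize a URL path: folders get a trailing slash, files keep their extension."""
    if path.endswith("/"):
        return path               # already in canonical folder form
    _, ext = posixpath.splitext(path)
    if ext:
        return path               # has an extension, so treat it as a file
    return path + "/"             # extension-less, so treat it as a folder

# canonicalize("/dog-training-collars")  -> "/dog-training-collars/"
# canonicalize("/images/logo.png")       -> "/images/logo.png" (unchanged)
```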
-
Redirect the URLs to one version. For a start, it will make your site more crawler-friendly, so search engines don't waste time crawling each version of the page fully, and, as you mention, it will eliminate duplicate content.
Why make Google do extra work? Your site architecture needs to be as clear as possible, so that Google can crawl it efficiently and not have to deal with duplicates.
This link says it all - http://googlewebmastercentral.blogspot.co.uk/2010/04/to-slash-or-not-to-slash.html
Here is a recent Q&A from others in the community backing up the point that you should redirect one version of the URL to the other to eliminate duplicate content:
- http://www.seomoz.org/q/duplicate-content-issue-with-trailing
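If you do decide to redirect, the rule itself is tiny. Here is a minimal, framework-agnostic sketch in Python (WSGI) of 301-redirecting the no-slash version to the trailing-slash version; in practice you would do this in your server or cart configuration, and a real rule would also exempt file URLs such as images and preserve the query string:

```python
def redirect_to_trailing_slash(app):
    """WSGI middleware: 301-redirect /path to /path/ (the chosen canonical form).

    Illustrative sketch only; a production rule would exempt paths with
    file extensions and carry the query string across.
    """
    def middleware(environ, start_response):
        path = environ.get("PATH_INFO", "/")
        if not path.endswith("/"):
            # Permanent redirect to the canonical (trailing-slash) URL
            start_response("301 Moved Permanently",
                           [("Location", path + "/")])
            return [b""]
        return app(environ, start_response)
    return middleware
```

A 301 (permanent) redirect, rather than a 302, is what passes the duplicate's link signals to the canonical version.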
Related Questions
-
Will Google Judge Duplicate Content on Responsive Pages to be Keyword Spamming?
I have a website for my small business, and hope to improve the search results position for 5 landing pages. I recently modified my website to make it responsive (mobile friendly). I was not able to use Bootstrap; the layout of the pages is a bit unusual and doesn't lend itself to the options Bootstrap provides. Each landing page has 3 main divs: one for desktop, one for tablet, one for phone. The text content displayed in each div is the same, and only one of the 3 divs is visible; the user's screen width determines which div is visible.
When I wrote the HTML for the page, I didn't want each div to have identical text. I worried that when Google indexed the page it would see the same text 3 times, and would conclude that keyword spamming was occurring. So I put the text in just one div, and when the page loads, jQuery copies the text from the first div to the other two divs. But now I've learned that when Google indexes a page it looks at both the page that is served AND the page that is rendered. And in my case the page that is rendered, after it loads and the jQuery code is executed, contains duplicate text content in three divs. So perhaps my approach of having the served page contain just one div with text content fails to help, because Google examines the rendered page, which has duplicate text content in three divs. As served by the server, one landing page has the 1000 words of text in the desktop div, while the tablet and phone divs contain no text until jQuery copies the text from div id="desktop" into each of them.
My question is: will Google conclude that keyword spamming is occurring because of the duplicate content the rendered page contains, or will it realize that only one of the divs is visible at a time, and that the duplicate content is there only to achieve a responsive design? Thank you!
Web Design | CurtisB
-
How to check if the website has duplicate content?
I've been working with websites for a couple of months, and it has always been on my mind whether there is a legit way to find out if a website has duplicate content. I've tried a couple of websites through Google but nothing worked for me. It would be much appreciated if anyone can help. Thanks
Web Design | rajveer_singh
-
Looking to remove SSL because it is causing very slow website download speeds. Does WP have a plugin that redirects SSL URLs to non-SSL URLs?
After some extended debate with our web development team, we are considering dropping the SSL from our website because it is adding almost 2 additional seconds to our download speeds. We know there is an SEO boost from having SSL, but we believe the extended download speeds may be outweighing the benefit. However, we are concerned about the SEO implications of having no method of redirecting SSL to non-SSL webpages. Does anybody know of a WordPress plugin that can force-redirect SSL URLs to non-SSL URLs?
Web Design | RosemaryB
-
Redirects Not Working / Issue with Duplicate Page Titles
Hi all. We are being penalised in Webmaster Tools and Crawl Diagnostics for duplicate page titles, and I'm not sure how to fix it. We recently switched from HTTP to HTTPS, but when we first switched over, we accidentally set a permanent redirect from HTTPS to HTTP for a week or so(!). We now have a permanent redirect going the other way, HTTP to HTTPS, and we also have canonical tags in place pointing to HTTPS. Unfortunately, it seems that because of this short time with the permanent redirect the wrong way round, Google is confused and sees our HTTP and HTTPS sites as duplicate content. Is there any way to get Google to recognise this new (correct) permanent redirect and completely forget the old (incorrect) one? Any ideas welcome!
Web Design | HireSpace
-
Funnel tracking with one-page check-out?
Hi guys, I'm creating a new website with a one-page checkout that follows these steps:
1. Check availability
2. Select product
3. Select additional product & add features
4. Provide personal information
5. Order & pay
I'm researching whether it is possible to track all these steps (and even steps within the steps) with Google Analytics in order to analyse checkout abandonment. The problem is that my one-page checkout has only one URL (I want to keep it that way), and the steps therefore cannot be differentiated by URL in the Analytics funnel. To continue to the next step, the same button (in a floating cart) is used to advance; the buttons to select/choose something within one step are all different. Do you guys know how I can set this up, and how detailed I can make it? For example, is it also possible to test at which field visitors leave when, for example, filling in their personal information? Would be great if you can help me out!
Web Design | Jerune
-
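(For reference, the usual workaround for a single-URL funnel is to send Analytics a "virtual pageview" per step, so the funnel can be defined over invented paths. Below is a sketch of the hit parameters for the classic Universal Analytics Measurement Protocol; the step paths and IDs here are made up for illustration, and in practice each hit fires when the corresponding step becomes active.)

```python
def virtual_pageview(tracking_id: str, client_id: str, step_path: str) -> dict:
    """Parameters for one Measurement Protocol 'pageview' hit
    (sent to https://www.google-analytics.com/collect)."""
    return {
        "v": "1",            # protocol version
        "tid": tracking_id,  # UA property ID, e.g. "UA-XXXXX-Y"
        "cid": client_id,    # anonymous client ID
        "t": "pageview",     # hit type
        "dp": step_path,     # virtual page path for this checkout step
    }

# One invented virtual path per checkout step:
steps = ["/checkout/1-availability", "/checkout/2-product",
         "/checkout/3-extras", "/checkout/4-details", "/checkout/5-payment"]
hits = [virtual_pageview("UA-XXXXX-Y", "anon-123", s) for s in steps]
```

The Analytics goal funnel is then defined over these virtual paths exactly as if they were real pages.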
Subdomains, duplicate content and microsites
I work for a website that generates a high amount of unique, quality content. This website, though, has had development issues with our web builder, and they are going to separate the site into different subdomains upon launch. It's a scholarly site, so the subdomains will be topics like history and science. Don't ask why we aren't using subdirectories, because trust me, I wish we could. So we have to use subdomains, and I'm wondering a couple of things. Will the duplication of coding, since all subdomains will have the same design and look, heavily penalize us, and is there any way around that? Also, if we generate a good amount of high-quality content on each site, could we link all those sites to our other site as a possible benefit for link building? And finally, would footer links linking all the subdomains be a good thing to put in?
Web Design | mdorville
-
Is it necessary to redirect every error page (404 or 500) found?
If I have hundreds of pages with 404 and 500 errors, should I set up 301 redirects for all of them? Some of the pages have external links, some don't.
Web Design | jmansd
-
Do Pages That Rearrange Set Off Any Red Flags for Google?
We have a broad content site that includes crowdsourced lists of items. A lot of the pages allow voting, which causes the content on the pages (sometimes the content is up to 10 pages deep) to completely rearrange, and therefore spread out and switch pages often among the (up to 10) pages of content. Now, could this be causing any kind of duplicate content or any other kind of red flag for Google? I know that the more the page changes the better, but if it's all the same content being moved up and down constantly, could Google think we're pulling some kind of "making it look like we have new content" scheme and ding us for these pages? If so, what would anyone recommend we do? Let's take the example of a list of companies with bad customer service. We let the internet vote them up and down all the time, and the order changes depending on the votes in real time. Is that page doomed, or does Google see it and love it?
Web Design | BG1985