Multiple URLs and Dup Content
-
Hi there,
I know many people ask this kind of question, but nevertheless...
In our CMS, one single URL (http://www.careers4women.de/news/artikel/206/) has been reproduced nearly 9,000 times with strings appended, like this: http://www.careers4women.de/news/artikel/206/$12203/$12204/$12204/ and this: http://www.careers4women.de/news/artikel/206/$12203/$12204/$12205/ and so on and so on...
Today, I asked our IT department to either a) delete the pages with the "strange" URLs or b) 301-redirect them to the "original" page.
Do you think this was the best solution? What about implementing the rel=canonical on these pages?
Right now, only the "original" page is in the Google index, but who knows? And I don't want users on our site to see these URLs, so I thought deleting them (they have existed for only a few days!) would be the best answer...
Do you agree or have other ideas if something like this happens next time?
Thanx in advance...
-
One additional comment, and it's tricky. You need to find the crawl path creating these, BUT you don't necessarily want to block it yet. Add the canonical, and let Google keep crawling these pages. Otherwise, the canonical can't do its job properly. Then, once they've cleared out, fix the crawl path.
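While you're tracking down the crawl path, it may also help to generate the canonical targets programmatically. Here's a small sketch of how the CMS (or a template filter) could derive the canonical URL from one of these variants — the helper name and the regex are my assumptions based on the URL pattern you posted, so adjust them to whatever your CMS actually appends:

```python
import re

# Assumption: the junk segments always look like "$12345/" appended
# after the real article URL, as in the examples from the question.
_SESSION_SEGMENT = re.compile(r"\$\d+/")

def canonical_url(url: str) -> str:
    """Strip trailing $-prefixed numeric segments, leaving the original article URL."""
    return _SESSION_SEGMENT.sub("", url)

dup = "http://www.careers4women.de/news/artikel/206/$12203/$12204/$12205/"
print(canonical_url(dup))
# → http://www.careers4women.de/news/artikel/206/
```

A URL with no junk segments passes through unchanged, so the same helper can safely populate the canonical tag on every page.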
Are you seeing this in our (SEOmoz) tools or in Google? I'm not actually seeing these variants indexed, so it could potentially be a glitch. It looks a bit like some kind of session variable.
-
Thanks Nakul and Harald for helping.
So, we will implement the rel=canonical on these pages...
Thanx again!!!
-
Hi Stefan,
Since you have multiple URLs containing the same data, you have to redirect the extra links to the original URL. To do that, you can use either a 301 redirect or rel="canonical" on the repeated pages. Deleting might not be the best solution because it would take up a lot of time; instead, go for redirection of those pages. Since there are far too many pages to redirect one by one, I think rel="canonical" would be the right option. And you should do this quickly, since the original page has already been indexed by the search engine.
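If you do later go the 301 route, you don't need 9,000 individual redirects: a single pattern rule can cover all the variants at once. A sketch for Apache's mod_rewrite, assuming the `$12345/`-style segments from the question (adjust the pattern and syntax for whatever server you actually run):

```apache
# .htaccess sketch: send any $-segment variant back to the clean article URL
RewriteEngine On
RewriteRule ^(news/artikel/\d+/)(\$[^/]+/)+$ /$1 [R=301,L]
```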
-
I would strongly suggest adding the rel=canonical tag on all of these pages, pointing to the original/correct URL. Once the canonical tag is added in your CMS, all those variations of the pages will point to the same URL in case Googlebot finds them.
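To make that concrete, the tag the CMS would need to emit in the `<head>` of every variant is just (using the URLs from the question):

```html
<!-- On each auto-generated variant, e.g.
     http://www.careers4women.de/news/artikel/206/$12203/$12204/$12204/ -->
<link rel="canonical" href="http://www.careers4women.de/news/artikel/206/" />
```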
You are on the right track about doing a canonical.