Bad Duplicate content issue
-
Hi,
For grappa.com I have about 2,700 warnings of duplicate page content. My CMS generates long URLs like http://www.grappa.com/deu/news.php/categoria=latest_news/idsottocat=5 and http://www.grappa.com/deu/news.php/categoria%3Dlatest_news/idsottocat%3D5 (these count as duplicate content).
What's the best solution to fix this problem? Do I have to set up a 301 redirect for all the duplicated pages, or insert rel=canonical or rel=prev/next?
It's complicated because it's a multilingual site, and it's my first time dealing with this stuff.
Thanks in advance.
-
Your original question had two URLs, one of which had the "=" replaced with "%3D". If that was an actual crawled URL (and not a copy-and-paste error), then it's likely coming from bad links within your own site. That URL is malformed, so you should definitely check it out. A desktop crawler like Xenu or Screaming Frog could help track down the culprit:
http://www.seomoz.org/blog/crawler-faceoff-xenu-vs-screaming-frog
-
Thanks Peter for the reply!
What do you mean by "bad internal links"?
I'm ranking well, so based on your suggestions what I have to do is set up the rel=canonical and rel=alternate tags properly, right? I'm still a bit scared about the duplicate content report in the SEOmoz campaign. 2,700 warnings is kind of a big deal.
-
One of these URLs just seems to be the encoded version of the other, and the two should be treated as identical. I'm not seeing any evidence that Google is indexing both. I have a feeling that you may have some bad internal links that need to be fixed. I'm seeing the English and German versions of this page in the index, but that should be fine. As Khem said, you could use rel="alternate" hreflang annotations for that.
Be careful about converting to a "static" version. It's not that it's a bad idea, but the problem is that you could end up turning 2 duplicates into 3 duplicates. You'll still have to canonicalize the dynamic version to the static version. In other words, done badly, changing your URLs could actually make the problem worse.
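If you do end up with a dynamic and a static version of the same page live at once, a canonical tag on the dynamic version can point engines at the static one. A minimal sketch, assuming a hypothetical static target URL (the real structure is up to your CMS rewrite):

```html
<!-- In the <head> of the dynamic page, e.g. news.php/categoria=latest_news/idsottocat=5 -->
<!-- The static href below is a hypothetical example, not the site's actual URL scheme -->
<link rel="canonical" href="http://www.grappa.com/deu/news/latest-news/" />
```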
-
Rel=prev/next is for paginated series, such as internal search results. While I see you have a pagination parameter on these pages ("idpagina=13"), it doesn't seem like this is a series or that the two pages are even duplicates. I'm a bit confused on the intent, but my initial reaction is that rel=prev/next doesn't fit the bill here.
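For reference, on a page that genuinely is part of a paginated series, the annotations would look something like this, using the "idpagina" parameter from your URLs (page 13's neighbors are assumed to be 12 and 14):

```html
<!-- Hypothetical example: in the <head> of page 13 of a paginated series -->
<link rel="prev" href="http://www.grappa.com/eng/grappa.php/argomento=grappa_in_italy/idsezione=1/idpagina=12" />
<link rel="next" href="http://www.grappa.com/eng/grappa.php/argomento=grappa_in_italy/idsezione=1/idpagina=14" />
```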
-
Since you are managing a multilingual site, it is always recommended to use rel="alternate" hreflang annotations, even if you're redirecting your website.
As for rel=next/prev, don't use it unless you feel it is really required; I could not find the need for it here. Maybe I missed something, though. Could you please be a bit more specific?
-
Thanks Raj! I will for sure rewrite the dynamic URLs into static ones, and that's a starting point. Take for example this page:
http://www.grappa.com/eng/grappa.php/argomento=grappa_in_italy/idsezione=1/idpagina=13
Do you suggest using rel=next/prev in this case?
I thought about using rel="alternate" for the multilingual issue, but right now my site redirects automatically from www.grappa.com to www.grappa.com/eng/index.php. Is that bad for SEO? Should I point rel="canonical" at www.grappa.com?
Many thanks
-
Hey Nicola, ~2,700 is a huge number.
I would suggest you talk to your programmer/developer about rewriting the dynamic URLs as static ones, which I am sure they can easily do.
Second, make sure to delete all the duplicate pages or mark them with a noindex meta tag. Using a 301 for all the duplicate pages is not a bad option, but it's not a permanent solution. It is better to rewrite all the dynamic URLs as static ones, delete all the duplicate pages, and then 301 redirect the deleted pages to the originals.
For the multilingual issue you can use rel="alternate" hreflang annotations.
The tag enables you to say, "this version is for Spain, this version is for Germany."
The rel="alternate" hreflang="es" annotation, for example, helps Google serve the Spanish-language or regional URL to searchers.
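A minimal sketch of those annotations, assuming /eng/ and /deu/ are the English and German versions of the same page (the exact URLs are illustrative):

```html
<!-- Placed in the <head> of every language version of the page -->
<link rel="alternate" hreflang="en" href="http://www.grappa.com/eng/index.php" />
<link rel="alternate" hreflang="de" href="http://www.grappa.com/deu/index.php" />
<!-- Optional fallback for users whose language doesn't match any version -->
<link rel="alternate" hreflang="x-default" href="http://www.grappa.com/" />
```

Each language version should list all the alternates, including itself, so the annotations are reciprocal.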