Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we’re not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies. More details here.
Duplicate URLs in Sitemap? Is that a problem?
-
I submitted a sitemap in Search Console, but noticed that there are duplicate URLs. Is that a problem for Google?
-
Hi Luciana! If Logan and/or Matthew answered your question, mind marking one or both of their responses as a "Good Answer" (down in the lower-right of the responses)? It helps us keep track of things, and it gives them a few extra MozPoints.

-
Thank you everyone!
Basically, for some reason the system I used to generate the sitemap produced some (not a whole lot of) duplicate URLs; they are exact duplicates. I figured Google would just overlook that.
This was helpful!
Thanks again,
Luciana
-
Generally speaking, this isn't the worst problem you can have with your XML sitemap. In an ideal world, you'd remove duplicate URLs from the sitemap and submit a single URL for each page. In reality, most larger sites I've encountered have some amount of duplicate content in their XML sitemaps without any major problems.
Duplicate content is really only a major problem if it is "deceptive" in nature. So long as this is just a normal consequence of your CMS, or similar, rather than an attempt to game the rankings, you are probably fine. For more about that, check out this support article.
The other problem you may encounter is with your search results for those duplicate pages. That article mentions that Google will pick the URL they think is best (more about that here as well), and the URL they deem best is the one that surfaces in the search results. That may or may not be the same URL you or your visitors would choose. So you might find that Google picks a not-so-great URL (like one with extra parameters), and with that URL appearing in the SERPs, your search result isn't as compelling to click on as another version of the URL might be.
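If you can't easily regenerate the sitemap without duplicates, you can also strip exact duplicates from the file after the fact. Here's a minimal sketch using Python's standard library; the function name and file paths are just placeholders, not anything from a specific tool:

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def dedupe_sitemap(path_in, path_out):
    """Remove <url> entries whose <loc> exactly duplicates an earlier one."""
    ET.register_namespace("", SITEMAP_NS)  # keep the default namespace on output
    tree = ET.parse(path_in)
    root = tree.getroot()
    seen = set()
    for url in list(root.findall(f"{{{SITEMAP_NS}}}url")):
        loc = url.find(f"{{{SITEMAP_NS}}}loc").text.strip()
        if loc in seen:
            root.remove(url)  # exact duplicate: drop the whole <url> entry
        else:
            seen.add(loc)
    tree.write(path_out, encoding="utf-8", xml_declaration=True)
```

Note this only catches byte-for-byte identical `<loc>` values; URLs that differ by case, trailing slash, or parameters would survive and need separate handling.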
-
Hi,
This isn't necessarily a problem, but XML sitemaps should be as clean as possible before they're uploaded: no 301'd URLs, no 404s, no dupes, no parameter'd URLs, no canonicalized URLs, etc.
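One way to catch the 301s and 404s before submitting is to request each URL and flag anything that doesn't come back as a clean 200. A rough sketch (the function names are mine, and HEAD handling varies by server, so treat this as a starting point rather than a finished audit tool):

```python
import urllib.request
import urllib.error

def _head_status(url, timeout=10):
    """HEAD-request a URL, reporting redirect statuses instead of following them."""
    class NoRedirect(urllib.request.HTTPRedirectHandler):
        def redirect_request(self, req, fp, code, msg, headers, newurl):
            return None  # surface 301/302 as an HTTPError instead of following

    opener = urllib.request.build_opener(NoRedirect)
    req = urllib.request.Request(url, method="HEAD")
    try:
        return opener.open(req, timeout=timeout).status
    except urllib.error.HTTPError as e:
        return e.code

def audit_sitemap_urls(urls, status_fn=_head_status):
    """Return a {url: status} dict for every URL that isn't a clean 200."""
    return {u: s for u in urls if (s := status_fn(u)) != 200}
```

Anything in the returned dict (a 301, a 404, and so on) is a candidate for removal or replacement before the sitemap goes to Search Console.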
Are they duplicates in the sense that one has caps, and the other doesn't? As in /example.html and /Example.html. If so, you'll want to fix that.
If they're identically formatted URLs, there should be no problem, but you're at duplicate-content risk if they're different in any way and not canonicalized.
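If you want to check for the caps issue programmatically, grouping the sitemap's URLs case-insensitively will surface any case-only variants. A small sketch (the function name is just illustrative):

```python
from collections import defaultdict

def find_case_variants(urls):
    """Group URLs that differ only by letter case, e.g. /example.html vs /Example.html."""
    groups = defaultdict(list)
    for url in urls:
        groups[url.lower()].append(url)
    # Keep only groups where more than one distinct casing appears
    return {key: variants for key, variants in groups.items()
            if len(set(variants)) > 1}
```

Any group this returns points at URLs that most servers will treat as distinct pages with identical content, which is exactly the situation you'd want to resolve with a redirect or a canonical tag.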
Related Questions
-
Url shows up in "Inurl' but not when using time parameters
Hey everybody, I have been testing the Inurl: feature of Google to try and gauge how long ago Google indexed our page. SO, this brings my question. If we run inurl:https://mysite.com all of our domains show up. If we run inurl:https://mysite.com/specialpage the domain shows up as being indexed If I use the "&as_qdr=y15" string to the URL, https://mysite.com/specialpage does not show up. Does anybody have any experience with this? Also on the same note when I look at how many pages Google has indexed it is about half of the pages we see on our backend/sitemap. Any thoughts would be appreciated. TY!
On-Page Optimization | | HashtagHustler1 -
How do I fix duplicate page issue on Shopify with duplicate products because of collections.
I'm working with a new client with a site built on Shopify. Most of their products appear in four collections. This is creating a duplicate content challenge for us. Can anyone suggest specific code to add to resolve this problem. I'm also interested in other ideas solutions, such as "don't use collections" if that's the best approach. I appreciate your insights. Thank you!
On-Page Optimization | | quiltedkoala0 -
Will "internal 301s" have any effect on page rank or the way in which an SE see's our site interlinking?
We've been forced (for scalability) to completely restructure our website in terms of setting out a hierarchy. For example - the old structure : country / city / city area Where we had about 3500 nicely interlinked pages for relevant things like taxis, hotels, apartments etc in that city : We needed to change the structure to be : country / region / area / city / cityarea So as patr of the change we put in place lots of 301s for the permanent movement of pages to the new structure and then we tried to actually change the physical on-page links too. Unfortunately we have left a good 600 or 700 links that point to the old pages, but are picked up by the 301 redirect on page, so we're slowly going through them to ensure the links go to the new location directly (not via the 301). So my question is (sorry for long waffle) : Whilst it must surely be "best practice" for all on-page links to go directly to the 'right' page, are we harming our own interlinking and even 'page rank' by being tardy in working through them manually? Thanks for any help anyone can give.
On-Page Optimization | | TinkyWinky0 -
Using phrases like 'NO 1' or 'Best' int he title tag
Hi All, Quick question - is it illegal, against any rule etc to use phrases such as 'The No 1 rest of the title tag | Brand Name' on a site?
On-Page Optimization | | Webrevolve0 -
Duplicate eCommerce Product Descriptions
I know that creating original product descriptions is best practices. What I don't understand is how other sites are able to generate significant traffic while still using duplicate product descriptions on all product pages. How are they not being penalized by Google?
On-Page Optimization | | mj7750 -
No follow for html sitemap?
Hi, I have been working on an e-commerce site and have been wondering if i should add the meta robot tag with no follows, on pages like delivery , terms, returns and the html sitemap? From what I have read this seems a good idea, but i am a little confused as what to do with the html sitemap. I can understand that having this on the homepage will allow crawlers quick access to the deeper pages within the site. But would it be better to guide them down the natural route of the category navigation instead? Thanks
On-Page Optimization | | Bmeisterali0 -
Does a page's url have any weight in Google rankings?
I'm sure this question must have been asked before but I can't find it. I'm assuming that the title tag is far more important than the page's url. Is that correct? Does the url have any relevance to Google?
On-Page Optimization | | rdreich490 -
Avoiding "Duplicate Page Title" and "Duplicate Page Content" - Best Practices?
We have a website with a searchable database of recipes. You can search the database using an online form with dropdown options for: Course (starter, main, salad, etc)
On-Page Optimization | | smaavie
Cooking Method (fry, bake, boil, steam, etc)
Preparation Time (Under 30 min, 30min to 1 hour, Over 1 hour) Here are some examples of how URLs may look when searching for a recipe: find-a-recipe.php?course=starter
find-a-recipe.php?course=main&preperation-time=30min+to+1+hour
find-a-recipe.php?cooking-method=fry&preperation-time=over+1+hour There is also pagination of search results, so the URL could also have the variable "start", e.g. find-a-recipe.php?course=salad&start=30 There can be any combination of these variables, meaning there are hundreds of possible search results URL variations. This all works well on the site, however it gives multiple "Duplicate Page Title" and "Duplicate Page Content" errors when crawled by SEOmoz. I've seached online and found several possible solutions for this, such as: Setting canonical tag Adding these URL variables to Google Webmasters to tell Google to ignore them Change the Title tag in the head dynamically based on what URL variables are present However I am not sure which of these would be best. As far as I can tell the canonical tag should be used when you have the same page available at two seperate URLs, but this isn't the case here as the search results are always different. Adding these URL variables to Google webmasters won't fix the problem in other search engines, and will presumably continue to get these errors in our SEOmoz crawl reports. Changing the title tag each time can lead to very long title tags, and it doesn't address the problem of duplicate page content. I had hoped there would be a standard solution for problems like this, as I imagine others will have come across this before, but I cannot find the ideal solution. Any help would be much appreciated. Kind Regards5