Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies. More details here.
Duplicate URLs in Sitemap? Is that a problem?
-
I submitted a sitemap on Search Console, but noticed that there are duplicate URLs. Is that a problem for Google?
-
Hi Luciana! If Logan and/or Matthew answered your question, mind marking one or both of their responses as a "Good Answer" (down in the lower-right of the responses)? It helps us keep track of things, and it gives them a few extra MozPoints.

-
Thank you everyone!
Basically, for some reason the system I used to generate the sitemap produced some (not a whole lot of) duplicate URLs; they are exact duplicates. I figured Google would just overlook that.
This was helpful!
Thanks again,
Luciana
-
Generally speaking, this isn't the worst problem you can have with your XML sitemap. In an ideal world, you'll be able to remove duplicate URLs from the sitemap and only submit a single URL for each page. In reality, most larger sites I've encountered have some amount of duplicate content in their XML sitemap with no real major problems.
Duplicate content is really only a major problem if it is "deceptive" in nature. So long as this is just a normal consequence of your CMS, or similar, rather than an attempt to game the rankings, you are probably fine. For more about that, check out this support article.
The other problem you may encounter is with the search results for those duplicate pages. The article mentions that Google will pick the URL it thinks is best (more about that here as well), and that URL is the one that surfaces in the search results. It may or may not be the same URL you or your visitors would consider best. So you might find that Google picked a less-than-ideal URL (like one with extra parameters), and with that URL appearing in the SERPs, your search result isn't as compelling to click as another version of the URL might be.
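If you want to clean exact duplicates out of a sitemap before resubmitting, here's a minimal sketch using only Python's standard library. It assumes a standard sitemaps.org-format file (the `urlset`/`url`/`loc` structure and namespace are the protocol's convention; the function name is mine):

```python
import xml.etree.ElementTree as ET

# Standard sitemap protocol namespace (sitemaps.org)
NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def dedupe_sitemap(xml_text: str) -> str:
    """Drop <url> entries whose <loc> exactly repeats an earlier one."""
    ET.register_namespace("", NS)  # serialize without a namespace prefix
    root = ET.fromstring(xml_text)
    seen = set()
    for url in list(root):  # iterate over a copy so we can remove safely
        loc = url.find(f"{{{NS}}}loc")
        loc_text = loc.text.strip() if loc is not None and loc.text else ""
        if loc_text in seen:
            root.remove(url)  # exact duplicate: drop this entry
        else:
            seen.add(loc_text)
    return ET.tostring(root, encoding="unicode")
```

Keep in mind this only catches byte-for-byte identical `<loc>` values, which matches the situation described above (exact duplicates from the generator).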
-
Hi,
This isn't necessarily a problem, but XML sitemaps should be as clean as possible before they're uploaded: no 301-redirected URLs, no 404s, no duplicates, no parameterized URLs, no canonicalized-away URLs, etc.
Are they duplicates in the sense that one has caps, and the other doesn't? As in /example.html and /Example.html. If so, you'll want to fix that.
If they're identically formatted URLs, there should be no problem, but you're at duplicate-content risk if they differ in any way and aren't canonicalized.
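To spot the case-variant duplicates mentioned above (like /example.html vs. /Example.html), you can group URLs by a case-folded key and flag any group with more than one spelling. A minimal sketch with Python's standard library (the function name and sample URLs are mine for illustration):

```python
from collections import defaultdict
from urllib.parse import urlsplit

def find_case_variants(urls):
    """Group URLs whose paths differ only by letter case.

    Scheme and host are case-insensitive per the URL spec,
    but paths are not, so /a.html and /A.html are distinct URLs.
    """
    groups = defaultdict(list)
    for u in urls:
        parts = urlsplit(u)
        key = (parts.scheme.lower(), parts.netloc.lower(), parts.path.lower())
        groups[key].append(u)
    # Keep only groups where more than one distinct spelling appears
    return [g for g in groups.values() if len(set(g)) > 1]

urls = [
    "https://example.com/example.html",
    "https://example.com/Example.html",  # case variant: likely a dupe
    "https://example.com/other.html",
]
```

Any group this returns is worth canonicalizing (or fixing at the source) before the sitemap goes back to Search Console.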