Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we’re not completely removing the content - many posts will still be possible to view - we have locked both new posts and new replies. More details here.
Duplicate URLs in Sitemap? Is that a problem?
-
I submitted a sitemap in Search Console, but noticed that there are duplicate URLs. Is that a problem for Google?
-
Hi Luciana! If Logan and/or Matthew answered your question, mind marking one or both of their responses as a "Good Answer" (down in the lower-right of the responses)? It helps us keep track of things, and it gives them a few extra MozPoints.

-
Thank you everyone!
Basically, for some reason the system I used to generate the sitemap just produces some (not a whole lot of) duplicate URLs; they are exact duplicates. I figured Google would just overlook that.
This was helpful!
Thanks again,
Luciana
-
Generally speaking, this isn't the worst problem you can have with your XML sitemap. In an ideal world, you'd remove duplicate URLs from the sitemap and submit only a single URL for each page. In reality, most larger sites I've encountered have some duplicate content in their XML sitemap with no major problems.
Duplicate content is really only a major problem if it is "deceptive" in nature. So long as this is just a normal consequence of your CMS, or something similar, rather than an attempt to game the rankings, you are probably fine. For more about that, check out this support article.
The other problem you may encounter is with your search results for those duplicate pages. That article mentions that Google will pick the URL they think is best (more about that here as well), and that URL is the one that surfaces in the search results. It may or may not be the same URL you or your visitors would consider best. So you might find that Google picked a less-than-ideal URL (like one with extra parameters), and with that URL appearing in the SERPs, your result isn't as compelling to click on as another version of the URL would have been.
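If you want to see how many exact duplicates your generator is producing before you resubmit, a small script is enough to report and strip them. Below is a minimal sketch, not a tested tool - it assumes a standard single-file XML sitemap on disk, and the file names are just placeholders:

```python
# dedupe_sitemap.py - minimal sketch: report and remove exact duplicate <loc> entries.
# Assumes a standard single-file XML sitemap; sitemap index files are not handled.
import xml.etree.ElementTree as ET
from collections import Counter

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def dedupe_sitemap(in_path: str, out_path: str) -> None:
    tree = ET.parse(in_path)
    root = tree.getroot()

    # Report how many times each URL appears
    locs = [u.find("sm:loc", NS).text.strip() for u in root.findall("sm:url", NS)]
    for url, count in Counter(locs).items():
        if count > 1:
            print(f"duplicate x{count}: {url}")

    # Keep only the first <url> element for each distinct <loc>
    seen = set()
    for url_el in root.findall("sm:url", NS):
        loc = url_el.find("sm:loc", NS).text.strip()
        if loc in seen:
            root.remove(url_el)
        else:
            seen.add(loc)

    ET.register_namespace("", NS["sm"])
    tree.write(out_path, encoding="utf-8", xml_declaration=True)

if __name__ == "__main__":
    dedupe_sitemap("sitemap.xml", "sitemap.deduped.xml")
```

Run it against the generated file before you upload; if the duplicate count is small, this is more about tidiness than rankings, which matches the advice above.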
-
Hi,
This isn't necessarily a problem, but XML sitemaps should be as clean as possible before they're uploaded, i.e., no 301'd URLs, no 404s, no dupes, no parameter'd URLs, no canonicalized URLs, etc.
Are they duplicates in the sense that one has caps, and the other doesn't? As in /example.html and /Example.html. If so, you'll want to fix that.
If they're identically formatted URLs, there should be no problem, but you're at duplicate content risk if they're different in any way and not canonicalized.
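If you want to check the whole file for that kind of thing in one pass, a quick script can flag case-variant duplicates and anything that doesn't return a clean 200 (301s, 404s and so on). A minimal sketch, assuming Python with the requests library installed - the URLs in the example are purely illustrative:

```python
# audit_sitemap_urls.py - minimal sketch: flag case-variant duplicates and non-200 URLs.
from collections import defaultdict

import requests

def audit(urls):
    # Case-variant duplicates, e.g. /example.html vs /Example.html
    by_lower = defaultdict(list)
    for url in urls:
        by_lower[url.lower()].append(url)
    for variants in by_lower.values():
        if len(set(variants)) > 1:
            print("case-variant duplicates:", variants)

    # Anything that 301s, 404s, etc. should be cleaned up or dropped from the sitemap
    for url in sorted(set(urls)):
        resp = requests.head(url, allow_redirects=False, timeout=10)
        if resp.status_code != 200:
            print(f"{resp.status_code}: {url}")

if __name__ == "__main__":
    audit([
        "https://www.example.com/example.html",
        "https://www.example.com/Example.html",  # case-variant duplicate
        "https://www.example.com/old-page",      # may 301 to a new location
    ])
```

Some servers answer HEAD requests differently from GET, so swap in requests.get if the status codes look off.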
Related Questions
-
URL shows up in "inurl:" but not when using time parameters
Hey everybody, I have been testing the inurl: feature of Google to try and gauge how long ago Google indexed our page. So, this brings my question. If we run inurl:https://mysite.com, all of our domains show up. If we run inurl:https://mysite.com/specialpage, the domain shows up as being indexed. If I add the "&as_qdr=y15" string to the URL, https://mysite.com/specialpage does not show up. Does anybody have any experience with this? Also, on the same note, when I look at how many pages Google has indexed, it is about half of the pages we see on our backend/sitemap. Any thoughts would be appreciated. TY!
On-Page Optimization | | HashtagHustler1 -
Duplicate 'meta title' issue (AMP & NON-AMP Pages)
How to fix duplicate meta title issue in AMP and non-AMP pages? example.com and example.com/amp. We have set the 'meta title' in the desktop version and we don't want to change the title for the AMP page, as we have more than 10K pages on the website. ----As per SEMrush Tool---- ABOUT THIS ISSUE: It is a bad idea to duplicate your title tag content in your first-level header. If your page's <title> and <h1> tags match, the latter may appear over-optimized to search engines. Also, using the same content in titles and headers means a lost opportunity to incorporate other relevant keywords for your page. HOW TO FIX IT: Try to create different content for your <title> and <h1> tags. This is what they are recommending. For the above issue, we have asked our team to create unique meta and post titles for the desktop version, but what about the AMP page? Please help!
On-Page Optimization | | 21centuryweb0 -
Changes picked up in the SERPs: How long do I have to wait until I can rely on the (new) position?
I changed several things on a particular page (mainly reduced the exaggerated keyword density --> spammy). I had Google recrawl it (via Search Console). The new version has already been integrated in the SERPs. Question: Are my latest changes (the crawled page in the SERPs is now 2 days old) already reflected in the current position in the SERPs, or should I wait for some time (how long?) to evaluate the effect of my changes? Can I rely on the current position or not?
On-Page Optimization | | Cesare.Marchetti0 -
Hiding body copy with a 'read more' drop down option
Hi, I just want to confirm how potentially damaging it is to use JavaScript to hide lots of on-page body copy behind a 'read more' button. As per other Moz Q&A threads, I was told it's best not to use JavaScript for this and instead "if you accomplish this with CSS and collapsible/expandable <DIV> tags it's totally fine", so that's what I advised my client's dev. However, I recently noticed a big drop in rankings approx 1 week after the dev changed the body copy format (hiding a lot of it behind a 'read more' button), so I asked them to confirm how they implemented it and they said: "done in JavaScript but on page load the text is defaulting to show" (which is contrary to my instructions). So how likely is it that this is causing problems, since it coincides with the ranking drop? Or, if the text defaults to show, should it be OK and not cause problems? And should I request that they redo it as originally instructed (CSS & collapsible divs) asap? All Best, Dan
On-Page Optimization | | Dan-Lawrence0 -
Schema and Rich Snippets: What's the difference?
Sorry if this is a daft question but... what is the difference between rich snippets and Schema markup? Are they one and the same? They seem to be used interchangeably and I'm confused. If someone could give a brief sentence or two about the differences between them that would be great. Thanks
On-Page Optimization | | AL123al1 -
Duplicate content penalty
When Moz crawls my site, they say I have 2x the pages that I really have and that I am being penalized for duplicate content. I know years ago I had my old domain resolve over to my new domain. It's the only thing that makes sense as the source of the duplicate content, but would search engines really penalize me for that? It is technically only on 1 site. My business took a significant sales hit starting early July 2013; I know Google did an algorithm update then that had SEO aspects. I need to resolve the problem so I can stay in business.
On-Page Optimization | | cheaptubes0 -
Will "internal 301s" have any effect on page rank or the way in which an SE see's our site interlinking?
We've been forced (for scalability) to completely restructure our website in terms of setting out a hierarchy. For example, the old structure was: country / city / city area, where we had about 3500 nicely interlinked pages for relevant things like taxis, hotels, apartments etc. in that city. We needed to change the structure to be: country / region / area / city / city area. So as part of the change we put in place lots of 301s for the permanent movement of pages to the new structure, and then we tried to actually change the physical on-page links too. Unfortunately we have left a good 600 or 700 links that point to the old pages but are picked up by the 301 redirect, so we're slowly going through them to ensure the links go to the new location directly (not via the 301). So my question is (sorry for the long waffle): whilst it must surely be "best practice" for all on-page links to go directly to the 'right' page, are we harming our own interlinking and even 'page rank' by being tardy in working through them manually? Thanks for any help anyone can give.
On-Page Optimization | | TinkyWinky0 -
Nofollow for HTML sitemap?
Hi, I have been working on an e-commerce site and have been wondering if I should add the meta robots tag with nofollow on pages like delivery, terms, returns and the HTML sitemap. From what I have read this seems a good idea, but I am a little confused as to what to do with the HTML sitemap. I can understand that having this on the homepage will allow crawlers quick access to the deeper pages within the site. But would it be better to guide them down the natural route of the category navigation instead? Thanks
On-Page Optimization | | Bmeisterali0