Why are these two URLs showing in Moz as duplicate content?
-
Here is the first URL - http://www.flagandbanner.com/Products/FBPP0000012376.asp
Here is the 2nd URL - http://www.flagandbanner.com/Products/flag-spreader.asp
Granted, I'm new to this issue on this website, but what is Roger seeing that I'm not? A lot of our duplicate pages are just like this example.
-
A couple of things here that I think might help:
First off, holy cow, the Methodist flag page has an entire narrative for a meta description. That's not causing your duplicate problem, but you should definitely fix it: it's far too long and will just get truncated in the search results. Shorten it up.
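For instance, a trimmed version might look something like this (the wording below is a hypothetical placeholder, not your actual copy):

```html
<!-- Hypothetical example: keep the description to roughly 150-160 characters -->
<meta name="description" content="United Methodist Church flag in outdoor nylon, 3x5 ft and 4x6 ft sizes, with header and grommets. In stock and ships fast from FlagandBanner.com.">
```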
Second, what's up with all the JavaScript embedded inline in each page? Judging from a quick glance, the scripts are the same on every page, so why not move them into a single .js file on the server and reference it with a script src tag in the header? Google parses JavaScript more and more, so this could be affecting your crawls, but it's also just messy. Externalizing it will lighten the pages, make things more efficient and manageable, and it might even help with the duplication issue by cutting down the identical boilerplate repeated in every page's source.
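As a rough sketch of what I mean (the file name and path below are just placeholders; use whatever fits your setup):

```html
<!-- Instead of repeating the same inline script in every product page's source... -->
<script>
  /* hundreds of lines of shared code duplicated on each page */
</script>

<!-- ...move it into one shared file and reference it once in the <head>.
     The path and file name here are hypothetical. -->
<script src="/js/site-common.js"></script>
```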
As for why it's recognized as a duplicate: the majority of the content on the two pages is identical, and the small portion of the body that changes isn't enough to keep Roger from flagging it. Roger is a bit touchier than Google's bots, but it's still a good idea to think about how you can add content to make each page more distinct. Even changing the img alt text, or anything else repeated across every page, to something page-specific will help, and it improves your on-site optimization at the same time by balancing out your keyword density.
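For example, something along these lines on the flag spreader page (the product details in the alt text are made up; describe whatever is actually in the image):

```html
<!-- Generic alt text repeated on every product page (what to avoid) -->
<img src="FBPP0000012376.jpg" alt="product image">

<!-- Page-specific alt text (hypothetical wording for the flag spreader page) -->
<img src="flag-spreader.jpg" alt="Stainless steel flag spreader holding a 3x5 ft flag fully open on a house-mounted pole">
```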
That's a place to start. The HTML source of each page is miles long; I'd start there.