Sitemap and Privacy Policy marked as duplicate content?
-
On a recent crawl, Moz flagged two pages of our site as duplicate content. However, the pages listed are our sitemap and our privacy policy -- two very different pages:
http://elearning.smp.org/sitemap/
http://elearning.smp.org/privacy-policy/
What is our best option for addressing this issue? I had considered adding a noindex tag to the privacy policy page, but since we have enabled user insights in Google Analytics, we need to keep the privacy policy displayed, and I worry that putting a noindex on the page would cause problems later.
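For reference, the noindex approach mentioned above amounts to a single meta tag in the page's `<head>` (a generic sketch; where this goes in practice depends on the site's CMS or template):

```html
<!-- Placed in the <head> of the /privacy-policy/ page -->
<!-- Asks search engines not to index this page while still following its links -->
<meta name="robots" content="noindex, follow">
```

Worth noting: noindex only affects search engine indexing. The page itself stays live and fully visible to visitors, so a privacy policy that must remain displayed for Google Analytics disclosure purposes would still be accessible either way.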
-
Just ignore it; duplicate content is not a real issue, and definitely not in this case. What Moz is looking at is the overlap in page code: if more than a certain percentage of the code is the same, it marks the pages as duplicates, so the check isn't especially intelligent. Don't worry about duplicate content with Google itself either; you'll only get into trouble if you really mess things up.
-
Maybe you should try requesting re-indexing of the pages in Google Search Console.