Duplicate Content & Tags
-
I've recently added tags to my blog posts so that related blog posts are suggested to visitors.
My understanding was that my robots.txt was handling duplicate content, so I thought it wouldn't be an issue, but after Moz crawled my site this week it reported 56 issues of duplicate content in my blog.
I'm using Shopify, so I can edit the robots.txt file, but is my understanding correct that if there are two or more tags then they will be ignored? I've searched the Shopify documentation and forums and can't find a straight answer. My understanding of SEO is fairly limited. The rules in question are:
Disallow: /blogs/+
Disallow: /blogs/%2B
Disallow: /blogs/%2b
-
If the only option is to disallow via robots.txt, then I would agree with your setup: disallow the slugs specific to the tags you don't want indexed. I've heard Shopify can be a little rough to work with sometimes because of its limitations, so whatever you can do is better than nothing. Remember that robots exclusion is treated as a suggestion rather than a command, so if it's possible to assign a noindex meta tag to those URL types, that would be the best case.
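To illustrate the disallow route, here is a hedged robots.txt sketch. It assumes the single-tag pages live under /blogs/<blog-handle>/tagged/ (as in the URL quoted further down) and that your setup lets you add rules; the * wildcard is honoured by Google and Bing, but keep in mind a disallow only discourages crawling, it doesn't guarantee de-indexing:
# hypothetical extra rule covering single-tag pages such as /blogs/news/tagged/sock-knitting
Disallow: /blogs/*/tagged/*
# the rules already quoted above, which cover multi-tag combinations (tag1+tag2)
Disallow: /blogs/+
Disallow: /blogs/%2B
Disallow: /blogs/%2b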
Looks like you're on the right track with the post below:
{% if handle contains "tagged" %}
{% endif %}
The one suggestion I would make is to use noindex,follow, so the content will still be crawled but the duplicate tag pages won't get indexed. That keeps multiple paths to the content on your site without creating an index-bloat problem from the tag pages.
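A minimal sketch of what that could look like in theme.liquid's <head> (the placement is an assumption on my part, and it relies on the handle check from the post below actually matching your tag URLs):
{% if handle contains "tagged" %}
  {% comment %} hypothetical: emit a robots directive only on tag pages {% endcomment %}
  <meta name="robots" content="noindex, follow">
{% endif %}
Regular blog posts wouldn't match the condition, so only the tagged views would pick up the noindex,follow directive.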
-
Yoast is a WordPress plugin, not a Shopify one, so that option isn't available with your current CMS. Just wanted to chime in so others aren't hunting for Yoast SEO in the Shopify app store.
-
I'm using Meta Tagger as the SEO plugin; I've not heard of Yoast SEO but will certainly check it out.
I understand that I need to stop the tag pages from being indexed, and I think I might have worked it out, but I'm not 100% sure; as I mentioned, my understanding is fairly limited.
My URL which is being seen as duplicate content looks like this:
http://www.tangled-yarn.co.uk/blogs/news/tagged/sock-knitting
If I exclude the handle 'tagged' from being indexed, this should work. I think the code should be:
{% if handle contains "tagged" %}
{% endif %}
Do you think this will work?
-
Do you use Yoast SEO or another plugin? The key is to set tags to noindex so that the crawler only goes through your category links. The issue is that your tag URLs are being indexed, and you don't want that. In Yoast the option is under the XML sitemap settings.
Related Questions
-
Reusing content on different ccTLDs
We have a client with many international locations, each of which has its own ccTLD domain and website, e.g. company-name.com, company-name.com.au, company-name.co.uk, company-name.fr, etc. Each domain/website only targets its own country, and the SEO aim is for each site to rank well only within its own country. We work for an individual country's operations, and the international head office wants to re-use our content on other countries' websites. While there would likely be some optimisation of the content for each region, there may be cases where it is re-used identically. We are concerned that this will cause duplicate content issues. I've read that the separate ccTLDs should indicate to search engines that the content is aimed at the different locations - is this sufficient, or should we be doing anything extra to avoid duplicate content penalties? Or should we argue that they simply must not do this at all and develop unique content for each? Thanks, Julian
Content Development | Bc.agency0
-
Reposting My Original Content
I wanted to know how I should repost my original content from my blog to other sites - obviously quality ones, and legacy-style content. I don't want to get hit with any duplicate content or spam-type penalties, but I want to spread some of my content from my blog further to get it noticed and drive traffic and leads. Do I submit the posts as is and then put a link to the original source, and if so, how many sites can I safely repost the same piece of content to? I am not trying to do anything sneaky; I just want to spread my best content to a good number of high-authority sites. I already do social bookmarking, Facebook, Twitter, etc.
Content Development | photoseo10
-
Modifying Content to Avoid Duplicate Content Issues
We are planning to leverage specific posts from a US-based blog for our own Canadian blog (with permission, of course) but are aware that this can cause duplicate content issues. We're willing to re-write as much or as little as we must from the initial blog posts to avoid duplicate content issues but I have no idea just how much we will need to re-write. Is there some guideline for this (e.g., 25% of content must be re-written)? I've been unable to find anything. Thank you in advance!
Content Development | QueenSt0
-
Is paid content a good or bad thing?
Hi, over the past couple of years we have turned down thousands of requests from companies to have paid editorial on our sites. I have always turned this down, but I have seen some sites accept it and would like to know your stance on this. In newspapers and magazines, where I have worked in both, paid editorial happens all the time, so I am just wondering what Google thinks of paid editorial. Look forward to hearing your thoughts.
Content Development | ClaireH-1848860
-
Duplicate Content From Huffington Post Blog
A client who writes blog posts for Huffington Post also wants an identical version of the blog posted to his personal site. Do you think there could be a problem of being punished for duplicate content? Would a better SEO practice be to have the client do an on-site blog just linking to the Huffington Post blog and providing information about it?
Content Development | EmarketedTeam0
-
Content writers needed
Good morning from 14 degrees C, light showers and rain, Wetherby, UK 🙂 OK, I'm getting really frustrated with clients' inability to add content, so much so that it's time to go around the problem and find an outsourcing solution. So my questions are, please: 1. Can anyone recommend a content writing resource? 2. What are reasonable rates to expect? Thanks in advance 🙂
Content Development | Nightwing
-
New Magento store with lots of duplicate content
Hi, I am fairly new to SEOmoz and would really appreciate some pointers. I have just received the results of my first crawl and I have over 1800 pages of duplicate content. This is an example: Cream poplin slim fit men's business shirt from Jermyn Street Shirt Co. http://jsshirts.com.au/aylesbury-cream-poplin-slim-fit-shirt.html The "Other URLs" heading indicates 10. My questions are: what does this figure of 10 represent, and what do I need to do to remove this duplicate content? Thanks
Content Development | mullsey
-
How to deal with media press content?
We have a company that creates content and sends it out as media press material. We would like to use this content on our blog. We pull it in via RSS, so our blog ends up with the same content. So right now we have some concerns about duplicate content. How do you guys deal with this? Would we be penalized for duplicated content?
Content Development | kauelinden0