Duplicate Content & Tags
-
I've recently added tags to my blog posts so that related blog posts are suggested to visitors.
My understanding was that my robots.txt file was handling duplicate content, so I thought it wouldn't be an issue, but after Moz crawled my site this week it reported 56 duplicate content issues in my blog.
I'm using Shopify, so I can edit the robots.txt file, but is my understanding correct that URLs with 2 or more tags will be ignored? I've searched the Shopify documentation and forums and can't find a straight answer. My understanding of SEO is fairly limited.
Disallow: /blogs/+
Disallow: /blogs/%2B
Disallow: /blogs/%2b
-
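For context on why the rules above list three variants: Shopify joins multiple tags in a single URL with a "+", and a crawler may request that character either literally or percent-encoded as %2B/%2b. A minimal Python sketch of that encoding (the multi-tag path below is illustrative, not a real URL on the site):

```python
from urllib.parse import quote, unquote

# Illustrative multi-tag path in the style Shopify generates;
# two tags are joined with a literal "+".
path = "/blogs/news/tagged/sock-knitting+lace"

# When a client percent-encodes the path, "+" becomes "%2B",
# which is why robots.txt needs both literal and encoded forms.
encoded = quote(path)
print(encoded)                   # /blogs/news/tagged/sock-knitting%2Blace
print(unquote(encoded) == path)  # round-trips back to the "+" form
```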
If the only option is to disallow via robots.txt, then I would agree with your setup: disallow the slugs specific to the tags you don't want indexed. I've heard Shopify can be a little rough to work with because of its limitations, so whatever you can do is better than nothing. Remember that robots exclusion is treated as a suggestion, not a command, so if it's possible to assign a noindex meta tag to those URL types, that would be the best case.
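Before relying on robots.txt rules, they can be sanity-checked locally with Python's standard-library robotparser. One caveat: it only does simple prefix matching and does not understand Google-style "*" wildcards, so the rule in this sketch is a plain path prefix rather than the "+" variants above:

```python
from urllib import robotparser

# Hypothetical rule blocking the tag-page path prefix discussed
# in this thread. Note urllib.robotparser matches by prefix only;
# a wildcard rule like "Disallow: /blogs/*+" would NOT work here.
rules = """\
User-agent: *
Disallow: /blogs/news/tagged/
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Tag pages are blocked...
print(rp.can_fetch("*", "http://www.tangled-yarn.co.uk/blogs/news/tagged/sock-knitting"))  # False
# ...but regular blog posts remain crawlable.
print(rp.can_fetch("*", "http://www.tangled-yarn.co.uk/blogs/news/some-post"))  # True
```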
Looks like you're on the right track with the post below:
{% if handle contains "tagged" %}
<meta name="robots" content="noindex, follow" />
{% endif %}
The one suggestion I would make is to use noindex,follow so the content will still be crawled, but the duplicate tag pages won't be indexed. That keeps multiple paths to the content on your site without creating an index bloat issue from the multiple tag URLs.
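To verify that the noindex,follow directive actually reaches the rendered page, the HTML can be scanned for the robots meta tag. A small stdlib sketch (the sample HTML string is made up for illustration):

```python
from html.parser import HTMLParser

class RobotsMetaFinder(HTMLParser):
    """Collects the content of any <meta name="robots"> tag."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        # HTMLParser lowercases tag and attribute names for us.
        if tag == "meta":
            d = dict(attrs)
            if d.get("name", "").lower() == "robots":
                self.directives.append(d.get("content", ""))

# Hypothetical page output from the Liquid snippet above.
html = '<html><head><meta name="robots" content="noindex, follow"></head><body></body></html>'
finder = RobotsMetaFinder()
finder.feed(html)
print(finder.directives)  # ['noindex, follow']
```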
-
Yoast is a WordPress plugin, not a Shopify app, so that option isn't available with the current CMS. Just wanted to chime in to make sure others aren't looking for Yoast SEO in the Shopify app store.
-
I'm using Meta Tagger as the SEO plugin; I hadn't heard of Yoast SEO but will certainly check it out.
I understand that I need to exclude the tag pages from being indexed and think I might have worked it out, but I'm not 100% sure. As I mentioned, my understanding is fairly limited.
The URL that's being seen as duplicate content looks like this:
http://www.tangled-yarn.co.uk/blogs/news/tagged/sock-knitting
If I exclude the handle 'tagged' from being indexed, this should work. I think the code should be:
{% if handle contains "tagged" %}
<meta name="robots" content="noindex" />
{% endif %}
Do you think this will work?
-
Do you use Yoast SEO, or another plugin? The key is to set tags to noindex so that the crawler only goes through your category links. The issue is that your tag URLs are being indexed, and you don't want that. In Yoast, the option is under the XML sitemap settings.