Duplicate content and blog/Twitter feeds
-
Hi Mozzers, I have a question...
I'm planning to add a blog summary/Twitter feed throughout my website (on every main content page), but I've started worrying about duplicate content.
What is best practice here?
Let me know - thanks, Luke
PS. Regarding the blog feed, I thought it might help if I fed different blog posts through to different pages (which I could then edit, so I could add link text different from that in the blog). Not sure about Twitter.
-
That's really useful (and interesting), Erica - thanks so much
-
Most feed services, like FeedBurner, and blog platforms solve this issue with the canonical URL tag: your post is marked as the canonical version and your feed as a copy, which eliminates the duplicate content issue. You might also dig this post about advanced canonical tags.
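A minimal sketch of what that canonical relationship looks like in the markup (the URL is a placeholder, not one of Luke's actual pages): the copied or syndicated version of a post declares the original as canonical in its <head>, so only the original competes in the index.

```html
<!-- In the <head> of the duplicated/syndicated copy of the post (placeholder URL) -->
<link rel="canonical" href="https://www.example.com/blog/original-post/" />
```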
Twitter is a different issue. Google doesn't treat Twitter like a regular website, but like a social platform, and you can only post 140 characters there anyway. I'd encourage you not to just pipe a feed into Twitter, though; instead, craft each tweet to be interesting for your audience, or a call-out that gets them to actually read the post. I wouldn't worry about Twitter as duplicate content at all.
Related Questions
-
Is it still necessary to have a "home" page button/link in the top nav?
Or is it not necessary to have a "home" tab/link because everybody by this time knows you can get to the home page by clicking on the logo?
Web Design | FindLaw
-
Requirements for mobile menu design have created a duplicated menu in the text/cache view.
Hi, Upon checking the text-only cache view of our home page, I noticed the main menu has been duplicated. Please see: http://webcache.googleusercontent.com/search?q=cache:http://www.trinitypower.com&strip=1

Our coder tells me he created one version of the menu for the desktop site and one for the mobile version, because I did not like the look and feel of the responsive version based on the desktop design. Duplicating the menu cannot be good for on-page SEO, although I have had no warnings reported back from Moz - maybe the Moz bots are not tuned to look for such a duplication error. His solution to this problem is to convert the mobile menu into AJAX. What do you guys think?

Thanks, Jarrett
Web Design | TrinityPower
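One common alternative to the AJAX idea raised above is to serve a single menu and restyle it with a CSS media query, so the markup appears only once in the cached HTML. A minimal sketch with placeholder links and class names (not the actual trinitypower.com markup):

```html
<!-- One nav block serves both desktop and mobile; only the styling changes -->
<nav class="main-nav">
  <ul>
    <li><a href="/">Home</a></li>
    <li><a href="/services">Services</a></li>
    <li><a href="/contact">Contact</a></li>
  </ul>
</nav>

<style>
  /* Horizontal menu on wide screens */
  .main-nav ul { display: flex; list-style: none; margin: 0; padding: 0; }

  /* The same markup stacks vertically on small screens instead of being duplicated */
  @media (max-width: 768px) {
    .main-nav ul { display: block; }
  }
</style>
```
-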
Our on page blog is off page! What?
Hi Moz Community, I have a customer whose blog is FULL of great unique content; however, the blog resides at a URL that differs from the main site, e.g. blog.mywebsite.com instead of www.mywebsite.com/blog. With the latest Google updates I fear that this may be hurting our rankings. In addition, the blog is a carbon copy of the main site. My question: I am going to schedule a meeting with the web designers - how vociferously should I argue for having them move the blog onsite and write 301 redirects from the current blog subdomain?
Web Design | CKerr
-
How to handle International Duplicated Content?
Hi, We have multiple international e-commerce websites. Usually our content is translated and the sites don't interfere with each other, but how do search engines react to duplicate content on different TLDs? We have copied our Dutch (NL) store for Belgium (BE) and I'm wondering if we could be inflicting damage on ourselves... Should I use: for every page? Are there other options so we can be sure that our websites aren't conflicting? Are they conflicting at all? Alex
Web Design | WebmasterAlex
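One common option for same-language stores on different TLDs (like an NL store copied for BE) is hreflang annotations telling Google which country each version targets; this may or may not be the markup the question above was referring to. A minimal sketch with placeholder domains and paths:

```html
<!-- In the <head> of both the .nl and .be versions of the equivalent page (placeholder URLs) -->
<link rel="alternate" hreflang="nl-nl" href="https://www.example.nl/schoenen/" />
<link rel="alternate" hreflang="nl-be" href="https://www.example.be/schoenen/" />
```
-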
Crawl Diagnostics Summary - Duplicate Content
Hello SEO Experts, I am a developer at www.bowanddrape.com and we are working on improving the SEO of the website. The SEOmoz Crawl Diagnostics Summary shows that the following 2 URLs have duplicate content: http://www.bowanddrape.com/clothing/Tan+Accessories+Calfskin+Belt/50_5142 and http://www.bowanddrape.com/clothing/Black+Accessories+Calfskin+Belt/50_5143. Can you please suggest ways to fix this problem? Is the duplicate content error caused by the identical "The Details", "Size Chart", "The Silhouette" and "You may also like" sections? Thanks, Chirag
Web Design | ChiragNirmal
-
WordPress SEO Errors/Questions
Hi, my name is Tina. I am new here and I hope you guys can help me out. I thought building my new site with WordPress was going to simplify things; however, I have a ton of errors, and I am not sure what they are or how to fix them. I am hoping someone could share a solution for these errors.

I have 28 rel=canonical errors. I am not sure what this means; I understand it to mean my pages are similar, and that the tag sets a hierarchy between them. Please correct me if I am wrong. If I am correct, would this be necessary to add if my main keyword was "widgets", my home page was optimized for "widgets", my next page was "blue widgets", and so on? While my pages are similar, they are all optimized for different versions of my main keyword, some using long-tail keywords. Do you know of a plugin that can help solve this problem? Also, does anyone have a plugin they recommend for G+? My G+ authorship verification is causing an error as well.

I am using Head Space 2; I have used this SEO plugin numerous times with great success and it has been my favorite SEO plugin. However, we have a portfolio that shows our clients' websites, and on those pages Head Space will not let me enter a description tag. What plugin do you guys recommend with more control over each page?

Another interesting issue: I optimized one of our pages for our Canadian clients, and now every page is listed in Google.ca for the keywords it should have in Google.com. We are listed on Google Maps, verified in Google Places, and our address is on the site, so they know we're from the USA; however, the majority of our keywords are only listed in Google.ca. We're on page one for all of them, and in the top three for most, so that's not bad, but we want to be listed in Google.com as well. Any suggestions on this?
Web Design | TinaGammon
-
SEO tricks for a one-page site with commented HTML content
Hi, I am building a website that is very similar to madebysofa.com: a one-page site where the entire content is loaded up front (but commented out in the HTML), and clicking on sections modifies the DOM to make the relevant section visible. It is very interesting from a UX point of view, but as far as I know, since most of my content is commented out and hidden from crawlers this way, I will lose points regarding SEO. Is there any workaround you can recommend, or do you think sites like madebysofa.com are doomed to lose SEO points by nature? Best regards,
Web Design | Ashkan1
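Content that sits inside HTML comments is invisible to crawlers, so the usual workaround for a one-page design like the one described above is to keep every section in the DOM as real markup and toggle its visibility instead; hidden text may still be weighted less than visible text, but at least it remains crawlable. A minimal sketch with placeholder section names (nothing here is taken from madebysofa.com):

```html
<!-- Sections exist as real, crawlable markup instead of HTML comments -->
<nav>
  <a href="#about">About</a>
  <a href="#work">Work</a>
</nav>

<section id="about">
  <h2>About</h2>
  <p>This section stays in the DOM even when it is toggled off for visitors.</p>
</section>

<section id="work" hidden>
  <h2>Work</h2>
  <p>Hidden with the hidden attribute (or CSS), not commented out.</p>
</section>

<script>
  // Show the clicked section and hide the others; the markup never leaves the DOM
  document.querySelectorAll('nav a').forEach(function (link) {
    link.addEventListener('click', function (event) {
      event.preventDefault();
      document.querySelectorAll('section').forEach(function (s) { s.hidden = true; });
      document.querySelector(link.getAttribute('href')).hidden = false;
    });
  });
</script>
```
-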
Homepage and Category pages rank for article/post titles after HTML5 Redesign
My site's URL (web address) is: http://bit.ly/g2fhhC

Timeline: At the end of March we released a site redesign in HTML5. As part of the redesign we used multiple H1s, both for nested articles on the homepage and for content sections other than articles on a page. In summary, our pages have many, many H1s compared to notable sites that use HTML5 with only one H1 (some of these are the biggest sites on the web) - yet I don't want to say this is the culprit, because the HTML5 document outline (page sections) creates the equivalent of H1-H6 tags. We have also been having Google cache snapshot issues due to Modernizr, for which we are working to apply the patch (https://github.com/h5bp/html5-boilerplate/issues/1086) - not sure if this could be driving the indexing issues below.

Situation: Since the redesign, when we query an article title, Google lists the homepage, category page or tag page that the article resides on - most of the time the homepage ranks for the article query. Linking directly to the article pages from a relevant internal page does not help Google index the correct page, and neither does linking to an article from an external site.

Here are some images of example query results for our article titles:
Homepage ranks for article title aged 5 hours: http://imgur.com/yNVU2
Homepage ranks for article title aged 36 min.: http://imgur.com/5RZgB
Homepage and uncategorized page listed instead of article for exact-match article query: http://imgur.com/MddcE
Article aged over 10 days indexing correctly (so it is possible for Google to index our article pages): http://imgur.com/mZhmd

What we have done so far:
- Removed the H1 tag from the site-wide domain link
- Made the article title a link, replicating how it was on the old version
- Applying the Modernizr patch today to correct the blank caching issue

We are hoping you can assess the number of H1s we are using on our homepage (I think over 40) and on our article pages (I believe over 25) and let us know if this may be sending a confusing signal to Google, or if you see something else we're missing. All HTML5 and Google documentation makes clear that Google can parse multiple H1s, understands header and sub-header structure, and that multiple H1s are okay, etc... but it seems possible that algorithmic weighting may not have caught up with HTML5. Look forward to your thoughts. Thanks

Web Design | mcluna
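For what it's worth, a minimal sketch of the more conservative heading structure many HTML5 sites fall back to - a single H1 per page with H2s for nested article teasers, rather than relying on the document outline to scope dozens of H1s. The element names and URLs are generic placeholders, not mcluna's actual templates:

```html
<!-- Homepage sketch: one H1 for the page, H2s for each nested article teaser -->
<header>
  <h1>Site Name - Latest Articles</h1>
</header>

<section>
  <article>
    <h2><a href="/articles/first-story">First story title</a></h2>
    <p>Teaser text for the first story…</p>
  </article>
  <article>
    <h2><a href="/articles/second-story">Second story title</a></h2>
    <p>Teaser text for the second story…</p>
  </article>
</section>
```

Each article page would then carry its own single H1 holding the full post title.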