How to avoid duplicate content when blogging from a site
-
I have a WordPress plastic surgery website with a WordPress blog on the same site.
My concern is avoiding duplicate content penalties when I blog. I use the blog to add new information about procedures that already have pages on the same topics on the main site. Invariably, the same keywords and phrases appear in the blog posts. Will this be considered duplicate content?
Also, is it black hat to insert anchor text in a blog post linking back to site content (i.e. an internal link), or is one now and then helpful?
-
Jane,
Thank you very much. That is what I try to do: blog on a subtopic to add more depth to the page topic. I have just been paranoid, after Penguin, about avoiding the similar keywords and phrases that appear on the main page, but it is impossible for the blog to make sense without some repetition.
Dr. Seckel
-
Hi,
If you have a page on a certain procedure, a blog post about that procedure should aim to expand on the existing content rather than repeat it in an attempt to create fresh content. You're naturally going to mention the same topics, but this is normal; Google sees "similar" content all over the web all the time. To make a blog useful both for search engines and for people, take a topic where you have basic information on a "product" page and consider exploring the following for a blog post (please excuse a lack of surgery knowledge here!):
- Gather the latest medical research on the procedure and talk about how it has changed over time
- Go into more detail about how a procedure differs for men and women (if it does) and what needs to be considered when treating patients
- Tips for how to maximise results after the procedure is complete
- Common recovery issues, how to watch out for them, and how to resolve them
Any new or different information you can include will add to the status of the blog post as a new, relevant piece of content, rather than something that has been repurposed from a product page.
You can link internally in the way described above, but make sure you're always linking with the intention of explaining what the linked-to page is about, rather than with the idea that you are trying to manipulate how search engines view the page.
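To make that concrete, a descriptive internal link in a blog post might look like the following sketch (the URL and procedure name are hypothetical, not taken from the site in question):

```html
<!-- Hypothetical blog-post paragraph linking back to the main procedure page.
     The anchor text describes the destination rather than stuffing keywords. -->
<p>
  Recovery experiences vary from patient to patient; our
  <a href="/procedures/rhinoplasty/">rhinoplasty overview page</a>
  explains what to expect during the first two weeks.
</p>
```

The point is that the anchor text tells readers (and search engines) what the linked-to page is about, which is exactly the intention described above.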
-
Thank you for your helpful advice
-
Hello,
As long as the content is not identical, you should not have any issues with duplicate content. It is fine if the same keywords or phrases appear, but try not to overdo it; mix it up as best you can. As for internal linking, it is OK to link internally, but again, vary it. Don't use the same keywords in the anchor text over and over; use different phrases and always make it sound natural. Focus on the content and don't over-optimize your pages.
Related Questions
-
Avoiding duplication in TLDs
I have started an ecommerce site with the following configuration: a global version, geekwik.com, priced in USD, and an India version, geekwik.in, priced in INR. Mostly the content on both sites is the same (90% same); the major differences are currency (and payment gateway), helpline numbers, etc. How do I set up robots.txt and Google Webmaster Tools so that Indian users get results from the India TLD, global users get results from the global TLD, and there is no duplication of content?
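For what it's worth, same-language sites split across country TLDs are usually handled with hreflang annotations rather than robots.txt blocking, since blocking one version prevents Google from seeing that the two are alternates. A rough sketch (the paths are hypothetical; only the domains come from the question):

```html
<!-- Placed in the <head> of a page on geekwik.com; the matching page on
     geekwik.in should carry the same three tags so the annotations are reciprocal. -->
<link rel="alternate" hreflang="en-in" href="https://geekwik.in/some-product/" />
<link rel="alternate" hreflang="en" href="https://geekwik.com/some-product/" />
<link rel="alternate" hreflang="x-default" href="https://geekwik.com/some-product/" />
```

Country targeting for each TLD can then be confirmed in each site's Webmaster Tools profile.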
Technical SEO | geekwik
-
How unique does a page need to be to avoid "duplicate content" issues?
We sell products that can be very similar to one another. Product example: Power Drill A and Power Drill A1. With these two hypothetical products, the only real difference between the two pages would be a slight change in the URL and a slight modification to the H1/title tag. Are these two slight modifications significant enough to avoid a "duplicate content" flagging? Please advise, and thanks in advance!
Technical SEO | WhiteCap
-
Link Structure & Duplicate Content
I am struggling with how I should handle the link structure on my site. Right now most of my pages are like this: Home -> Department -> Service Group -> Content Page. For example:
Home -> IT Solutions -> IT Support & Managed Services -> IT Support
Home -> IT Solutions -> IT Support & Managed Services -> Managed Services
Home -> IT Solutions -> IT Support & Managed Services -> Help Desk Services
Home -> IT Solutions -> Virtualization & Data Center Solutions -> Virtualization
Home -> IT Solutions -> Virtualization & Data Center Solutions -> Data Center Solutions
This structure lines up with our business and makes logical sense, but I am not sure how to handle the department and service group pages. Right now, clicking them just brings you to a page with a small snippet for each of the links below; the real content is on the content pages. What I am worried about is that the snippets on those pages are just a paragraph or two of the content that's on the content page. Will this hurt me and get considered duplicate content? What is the best practice for dealing with this? Those department/service group pages have some good content on them, but it's just parts of other pages. Am I okay doing this because they are not direct duplicates of other pages, just parts of a few pages? Any help on this would be great. Thanks in advance.
Technical SEO | ZiaTG
-
Duplicate Content
Many of the pages on my site are similar in structure/content but not exactly the same. What amount of content should be unique for Google to not consider it duplicate? If a page is something like 50% unique, would it be preferable to choose one page as the canonical instead of keeping them both as separate pages?
Technical SEO | theLotter
-
What's the best way to eliminate duplicate page content caused by blog archives?
I (obviously) can't delete the archived pages regardless of how much traffic they do or don't receive. Would you recommend a meta robots tag or a robots.txt file? I'm not sure I'll have access to the root directory, so I could be stuck using a meta robots tag, correct? Any other suggestions to alleviate this pesky duplicate page content issue?
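For reference, the meta robots option mentioned above is a single tag per archive page and needs no root-directory access; a minimal sketch:

```html
<!-- In the <head> of each archive page: keep the page out of the index
     but let crawlers follow its links through to the underlying posts. -->
<meta name="robots" content="noindex, follow" />
```

Many WordPress SEO plugins can apply this to archive pages as a setting, which avoids editing templates by hand.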
Technical SEO | ICM
-
Duplicate Content from Google URL Builder
Hello to the SEOmoz community! I am new to SEOmoz, SEO implementation, and the community, and recently set up a campaign on one of the sites I manage. I was surprised at the amount of duplicate content that showed up as errors; when I looked deeper, the majority of the errors were caused by pages on the root domain that I had put through the Google Analytics URL Builder. After this, I went into Webmaster Tools and changed the parameter handling to ignore all of the tags the URL Builder adds to the end of the URL. SEOmoz recently recrawled my site, and the errors caused by the URL Builder are still being shown as duplicates. Any suggestions on what to do?
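As an illustration, the usual belt-and-braces fix for tagged URLs is a canonical tag pointing at the parameter-free version, so that crawlers fold all the UTM variants into one page regardless of how parameter handling is configured (the URL below is hypothetical):

```html
<!-- Served on both the clean URL and on tagged variants such as
     https://www.example.com/landing/?utm_source=newsletter&utm_medium=email -->
<link rel="canonical" href="https://www.example.com/landing/" />
```

Note that third-party crawlers may still report the variants as duplicates even after the search engines have consolidated them.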
Technical SEO | joshuaopinion
-
Complex duplicate content question
We run a network of three local websites covering three places in close proximity. Each site has a lot of unique content (mainly news), but there is a business directory that is shared across all three sites. My plan is that the search engines only index the businesses in the directory that are actually located in the place each site is focused on, i.e. listing pages for businesses in Alderley Edge are only indexed on alderleyedge.com and businesses in Prestbury only get indexed on prestbury.com, but all businesses have a listing page on each site. What would be the most effective way to do this? I have been using rel canonical, but Google does not always seem to honour this. Would using meta noindex tags where appropriate be the way to go? Or would changing the URL structure to include the place name and using robots.txt be a better option? As an aside, my current URL structure is along the lines of: http://dev.alderleyedge.com/directory/listing/138/the-grill-on-the-edge Would changing this have any SEO benefit? Thanks Martin
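For context, a cross-domain canonical for the shared listings described above would look roughly like this (using the listing URL quoted in the question; treat the exact placement as illustrative):

```html
<!-- On prestbury.com's copy of an Alderley Edge business listing,
     pointing search engines at the single copy that should be indexed. -->
<link rel="canonical" href="http://alderleyedge.com/directory/listing/138/the-grill-on-the-edge" />
```

Since rel canonical is a hint rather than a directive, a meta noindex tag on the out-of-area copies is the stricter alternative when the hint is not being honoured.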
Technical SEO | mreeves
-
CGI Parameters: should we worry about duplicate content?
Hi, My question is directed to CGI parameters. I was able to dig up a bit of content on this, but I want to make sure I understand the concept of CGI parameters and how they can affect the indexing of pages. Here are two pages: No CGI parameter appended to the end of the URL: http://www.nytimes.com/2011/04/13/world/asia/13japan.html CGI parameter appended to the end of the URL: http://www.nytimes.com/2011/04/13/world/asia/13japan.html?pagewanted=2&ref=homepage&src=mv Questions: Can we safely say that CGI parameters = URL parameters appended to the end of a URL? Or are they different? And given that you have rel canonical implemented correctly on your pages, will search engines go ahead and index only the URL that is specified in that tag? Thanks in advance for sharing your insights. I look forward to your response. Best regards, Jackson
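To illustrate the rel canonical behaviour being asked about, using the two NYT URLs from the question:

```html
<!-- In the <head> of both the clean URL and the
     ?pagewanted=2&ref=homepage&src=mv variant, a canonical tag
     tells search engines which version to treat as the page to index: -->
<link rel="canonical" href="http://www.nytimes.com/2011/04/13/world/asia/13japan.html" />
```

One caveat: parameters like ref and src are pure tracking and are safe to canonicalize away, but ?pagewanted=2 serves different content (page two of the article), so a paginated variant may warrant its own handling rather than a canonical to page one.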
Technical SEO | jackson_lo