Duplicate content: WordPress and website
-
Hi All,
Will Google penalize me for having duplicate blog posts on both my website's blog and on WordPress?
Thanks
-
I was hired at this company about four months ago. For years they have been running the blog through WordPress, and it had about 40k visitors last year. I decided that running the blog on the website itself would be a great boost for SEO and would lead to better conversions.
The site was built from a Shopify template, and the social media manager hates the layout and the stats he gets from it. He has decided he would rather go back to WordPress and wants to convert all the posts he created in Shopify back to WordPress.
I'm not sure whether duplicating roughly 20 posts would get us penalized by Google. Also, if we could post to both, I would still get the benefit of the blog being on the site and he would get the benefit of having it on WordPress.
Thanks for your help,
-
It's not a penalty so much as that Google doesn't want to show searchers lots of identical content in the results, so it simply won't show what it considers duplicate.
Do you mean that you have a blog on your site and another on WordPress.com? Is there a reason for having both? Perhaps we could help you figure out a better overall solution.
-
Yes, blog post content should be unique; otherwise your site can be penalised by Google.
-
Yes, however, implementing canonical tags in the posts on one of the properties will resolve this issue.
Here's a post to help you out with implementation: http://moz.com/learn/seo/canonicalization
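To make that concrete, a canonical tag is a single line in the page's head that tells search engines which version of the content is the preferred one to index. A minimal sketch only; the URL below is a hypothetical placeholder, not the poster's actual domain:

```html
<!-- Placed in the <head> of the duplicate copy of the post (e.g. the WordPress version). -->
<!-- It points search engines at the on-site version as the one to index and rank. -->
<link rel="canonical" href="https://www.example.com/blog/original-post-title/" />
```

In WordPress this is usually handled by an SEO plugin rather than by hand-editing templates, so it is worth checking whether one is already setting canonical URLs before adding the tag manually.
-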
Related Questions
-
Duplicated content & URLs for e-commerce website
Hi, I have an e-commerce site where I sell greeting cards. Products sit under different categories (birthday, Christmas, etc.) with subcategories (for Mother, for Sister, etc.), and the same product can appear under three to six subcategories, for example:
url: .../greeting-cards/Christmas/product1/for-mother
url: .../greeting-cards/Christmas/product1/for-sister
etc. On the CMS I have one description record per card (product1) with multiple subcategories attached, which naturally creates a URL for each subcategory. The Moz system (and surely Google) picks up these URLs (and their content) as duplicates. Any ideas how to solve this problem? Thank you very much!
Technical SEO | jurginga
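As with the main question above, the usual approach when one product is reachable at several category paths is to pick a single preferred URL and reference it from every variant. A hedged sketch only; the paths mirror the hypothetical examples in the question, not a real site:

```html
<!-- The same tag goes in the <head> of every URL variant of the product page -->
<!-- (for-mother, for-sister, etc.), all pointing at one preferred URL for that card. -->
<link rel="canonical" href="https://www.example.com/greeting-cards/christmas/product1/" />
```
-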
Duplicate Content due to CMS
The biggest offender for our website's duplicate content is an event calendar generated by our CMS. It creates a page for every day of every year, up to the year 2100. I am considering two solutions:
1. Include code that stops search engines from indexing any of the calendar pages.
2. Keep the calendar but re-route search engines to a more popular workshops page that contains better info. (The workshops page isn't duplicate content with the calendar pages.)
Are these solutions possible? If so, how do they affect SEO? Are there other solutions I should consider?
Technical SEO | ycheung
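For the first option in the calendar question, the usual mechanism is a robots meta tag on the calendar pages (or the equivalent X-Robots-Tag HTTP header). A sketch, assuming the CMS's calendar-page template can be edited:

```html
<!-- Added to the <head> of the calendar-page template only. -->
<!-- noindex keeps the thin day-by-day pages out of the index; follow still lets -->
<!-- crawlers follow any links those pages contain. -->
<meta name="robots" content="noindex, follow" />
```
-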
Duplicate content issue with trailing / ?
Hi, I did an SEOmoz Crawl Test and found most pages show up twice, for example:
A: www.website.com/index.php/dog/walk
B: www.website.com/index.php/dog/walk/
I've checked Google Analytics and 90% of organic search traffic arrives on the URLs with the trailing slash (B).
Question 1: Can I assume I have a duplicate content problem?
Question 2: Is it best to do 301 redirects from the 'non-trailing-slash' pages to the 'trailing-slash' pages?
Question 3: For some reason every web page has '/index.php' in it (see A & B above). No idea why. Should it be an SEO concern?
Kind regards and thank you in advance, Nigel
Technical SEO | Richard555
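A hedged sketch of the 301 approach raised in Question 2, assuming an Apache server where .htaccess rewrites are allowed (the server type isn't stated in the question):

```apache
# Hypothetical .htaccess rules, assuming mod_rewrite is available.
# 301-redirect any URL that is not a real file and does not end in a slash
# to the same URL with a trailing slash, i.e. the version already getting the traffic.
RewriteEngine On
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_URI} !/$
RewriteRule ^(.*)$ /$1/ [R=301,L]
```

As for Question 3, a /index.php segment in every URL usually means the CMS's URL rewriting ('pretty' URLs) isn't fully switched on; hiding it is a separate, platform-specific change.
-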
Duplicate Content on Product Pages
Hello, I'm currently working on two sites and I have some general questions about duplicate content. For the first one, each page is for a different location, but the wording is identical on each; i.e. it says "Instant Remote Support for Critical Issues", "Same Day Onsite Support with a 3-4 hour response time", etc. Would I get penalized for this? Another question I have: we offer antivirus support for providers such as Norton, AVG, Bit Defender, etc. Would we get penalized for having the same first paragraph on each page with only the name of the antivirus provider changed? My last question: we provide services for multiple cities and towns in various states. Will I get penalized for having the same content on each page, with only the towns and the products and services we provide changing? Thanks.
Technical SEO | ilyaelbert
-
How can something be duplicate content of itself?
Just got the new crawl report, and I have a recurring issue that comes back around every month or so: a bunch of pages are reported as duplicate content of themselves. Literally the same URL: http://awesomewidgetworld.com/promotions.shtml is reported as both a duplicate title and duplicate content of http://awesomewidgetworld.com/promotions.shtml. Well, I would hope so! It's the same URL! Is this a crawl error? Is it a site error? Has anyone seen this before? Do I need to give more information? P.S. awesomewidgetworld is not the actual site name.
Technical SEO | BetAmerica
-
Duplicate Page Title & Content Penalty On Website Tonight Platform
I built my primary website on Website Tonight (WT) five years ago when I was a net newbie, and I'm presently new to SEOmoz. The initial crawl indicated a duplicate page title and duplicate content problem with my website's home page in WT. It turns out that the WT platform makes you assign a file name to your homepage, i.e. www.business.com/homepage.html, that differs from the www.business.com that you want as your homepage URL. Apparently the search engines are recognizing these identical pages as separate and duplicate. I know that the standard answer would be to just do a 301 redirect from the long file name to the short file name - end of story. But WT does not allow you to do 301 redirects, and they also do not give you the ability to go into the .htaccess file to fix this yourself manually. I spoke to the folks at WT tonight and they claim that they automatically do 301 redirects on the platform. But if this is true, then why am I getting the error message in SEOmoz? Does anyone know if this is a problem? If so, does anyone here have a fix? Thanks in advance. Sincerely - Bill in Denver
Technical SEO | anxietycoach
-
How damaging is duplicate content in a forum?
Hey all; I hunted around for this in previous Q&A questions and didn't see anything. I'm just coming back to SEO after a few years out of the field and am preparing recommendations for our web dev team. We use custom-coded software for our forums, and it creates a giant swathe of duplicate content, as each post has its own link. For example:
domain.com/forum/post_topic
domain.com/forum/post_topic/post1
domain.com/forum/post_topic/post2
...and so on. Since every page of the forum defaults to showing 20 posts, every forum thread that's 20 posts long has 21 different pages with identical content. Now, our forum is all user-generated content and is not generally a source of much inbound traffic (with occasional exceptions), but I was curious whether having a mess of duplicate content in our forums could damage our ability to rank well in a different directory of the site. I've heard that Panda is really cracking down on duplicate content, and the last time I was current on SEO trends, rel="canonical" was the hot new thing everyone was talking about, so I've got a lot of catching up to do. Any guidance from the community would be much appreciated.
Technical SEO | TheEnigmaticT
-
Complex duplicate content question
We run a network of three local websites covering three places in close proximity. Each site has a lot of unique content (mainly news), but there is a business directory that is shared across all three sites. My plan is for the search engines to only index the directory listings for businesses that are actually located in the place each site focuses on, i.e. listing pages for businesses in Alderley Edge are only indexed on alderleyedge.com and businesses in Prestbury only get indexed on prestbury.com - but all businesses have a listing page on each site. What would be the most effective way to do this? I have been using rel canonical, but Google does not always seem to honour this. Would using meta noindex tags where appropriate be the way to go? Or would changing the URL structure to include the place name and using robots.txt be a better option? As an aside, my current URL structure is along the lines of: http://dev.alderleyedge.com/directory/listing/138/the-grill-on-the-edge Would changing this have any SEO benefit? Thanks Martin
Technical SEO | mreeves