WordPress - duplicate content
-
I'm using WordPress for my website. However, whenever I use the post section for news, I get a report back from SEOmoz saying that there's duplicate content, because WordPress also publishes each post on the Category and Archive pages. Does anyone know if Google sees this as duplicate content and, if so, how to stop it?
Thanks
-
Brian,
Frankly, it is quite odd that you're using a great SEO plugin and still have duplicate content issues. As far as I'm aware, the Yoast plugin adds a canonical URL tag automatically wherever duplicate content exists on your website.
Did you modify the Yoast plugin? If yes, set everything back to default.
Changed its settings? If yes, restore the default settings.
Are you running the latest version? If not, deactivate it, uninstall it, then download and reinstall it.
Best of luck!
Jungles
-
I'm already using the Yoast SEO plugin. The thing is, I'm useless with code and I don't really understand what anyone is saying, so you may have to break it down into easy-to-understand terminology.
-
Rather than disable or de-index archives, consider de-indexing the paginated versions (page 1, 2, 3, etc.), as this is likely causing the penalty. I also recommend adding category (and tag, if applicable) descriptions to the archive templates.
Yoast's WordPress SEO plugin can handle the de-indexing of the paginated archive/index pages.
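If you are curious what that setting does behind the scenes, or would rather not rely on a plugin for it, here is a minimal sketch of the idea (the function name is illustrative, not something Yoast provides). It adds a noindex,follow robots meta tag to page 2 and beyond of any archive or index listing; it would go in the active theme's functions.php.

```php
<?php
// Minimal sketch: tell search engines not to index paginated listings,
// but still follow the links on them to the individual posts.
function example_noindex_paginated_archives() {
    // is_paged() is true on page 2, 3, ... of any archive or blog index.
    if ( is_paged() ) {
        echo '<meta name="robots" content="noindex,follow" />' . "\n";
    }
}
add_action( 'wp_head', 'example_noindex_paginated_archives' );
```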
-
Have you added a rel=canonical tag to each post?
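For anyone reading along, this is all a canonical tag amounts to. WordPress core (since 2.9) and Yoast both output it automatically on single posts, so the snippet below is only an illustration of the idea, not something you normally need to add yourself.

```php
<?php
// Illustrative only: prints a rel=canonical link for single posts and pages.
function example_print_post_canonical() {
    if ( is_singular() ) {
        echo '<link rel="canonical" href="' . esc_url( get_permalink() ) . '" />' . "\n";
    }
}
add_action( 'wp_head', 'example_print_post_canonical' );
```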
-
Brian,
can't you disable the archive function? Then your posts would appear only in their category and no longer in any archive.
Maybe this is what you are looking for:
http://wordpress.org/support/topic/archives-how-to-disable-or-remove
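If the suggestions in that thread feel too code-heavy, the general idea can be as simple as the sketch below (illustrative only, and it assumes you are comfortable editing functions.php): it 301-redirects date-based archive pages to the home page so they drop out of the index over time. The same result can usually be achieved without code by setting date archives to noindex in the Yoast plugin.

```php
<?php
// Illustrative sketch: permanently redirect date archives (e.g. /2012/05/)
// to the home page so they stop competing with the posts themselves.
function example_disable_date_archives() {
    if ( is_date() ) {
        wp_redirect( home_url( '/' ), 301 );
        exit;
    }
}
add_action( 'template_redirect', 'example_disable_date_archives' );
```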
good luck
kind regards
Jarno
-
Which SEO plugin do you use on your site?
J
-
Related Questions
-
Correct robots.txt for WordPress
Hi. So I recently launched a website on WordPress (1 main page and 5 internal pages). The main page got indexed right off the bat, while the other pages seem to be blocked by robots.txt. Would you please look at my robots file and tell me what's wrong? I wanted to block the contact page, plugin elements, users' comments (I have a discussion space on every page of my website) and the website search section (to prevent duplicate pages from appearing in Google search results). It looks like one of the lines is blocking every page after "/" from indexing, even though everything seems right. Thank you so much. (Screenshot: FzSQkqB.jpg)
On-Page Optimization | AslanBarselinov1 -
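Without seeing the screenshot it is hard to say which line is at fault, but the classic culprit is a bare Disallow: / rule, which blocks the entire site. A hypothetical robots.txt for the goals described in the question above might look like the sketch below; every path is illustrative and has to match the site's real URLs. Note that comments rendered inline on a page share that page's URL, so robots.txt cannot exclude them separately.

```
User-agent: *
# A bare "Disallow: /" would block the whole site - avoid it.
Disallow: /contact/
Disallow: /wp-content/plugins/
Disallow: /?s=
Disallow: /search/

Sitemap: https://www.example.com/sitemap.xml
```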
Duplicate Content in Footers (Not as routine as it seems)
Hello there, I know that content in the footer of a site is generally safe from duplication penalisation; however, what if the footers were replicated across different subdomains? For instance, the footer was duplicated across: www.example.com blog.example.com blog2.example.com I don't see it as a big issue personally; however, outsourced "specialists" seem to think that this is causing duplication problems and therefore negatively affecting the ranking power of the "lesser" subdomains, i.e. not the www version, which is by far the strongest subdomain. Would be good to get some insight if anybody has any. Thanks.
On-Page Optimization | SEONOW1230 -
Unique Pages with Thin Content vs. One Page with Lots of Content
Is there anyone who can give me a definitive answer on which of the following situations is preferable from an SEO standpoint for the services section of a website? 1. Many unique and targeted service pages with the primary keyword in the URL, title tag and H1, but with the tradeoff of having thin content on the page (i.e. 100 words of content or less). 2. One large service page listing all services in the content. The primary keyword for the URL, title tag and H1 would be something like "(company name) services" and each service would be in an H2 heading. In this case, there is lots of content on the page. Yes, the ideal situation would be to beef up the content for each unique page, but we have found that this isn't always an option based on the amount of time a client has dedicated to a project.
On-Page Optimization | RCDesign741 -
Duplicate Home Page
Hi, I have a question around best practice for duplicate home pages. The /index.aspx page is showing up as a top referrer in my analytics. I have the rel=canonical tag pointing to www.mysite.com implemented on both pages. Do I need to 301 the /index.aspx to www.mysite.com? I have a lot of links pointing to /index.aspx (half of those come from mysite.com itself). www.mysite.com/index.aspx www.mysite.com Many thanks Jon
On-Page Optimization | JonRaubenheimer0 -
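A 301 from /index.aspx to the root is the usual way to consolidate the link equity those inbound links carry, on top of the rel=canonical already in place. Since the page is .aspx, the site presumably runs on IIS; assuming the URL Rewrite module is installed, a hypothetical rule could look like the sketch below (not the exact configuration for the site in question).

```xml
<!-- Hypothetical web.config fragment: 301 /index.aspx to the site root. -->
<configuration>
  <system.webServer>
    <rewrite>
      <rules>
        <rule name="CanonicalHomePage" stopProcessing="true">
          <match url="^index\.aspx$" />
          <action type="Redirect" url="/" redirectType="Permanent" />
        </rule>
      </rules>
    </rewrite>
  </system.webServer>
</configuration>
```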
Duplicate Content Issues with Forum
Hi Everyone, I just signed up last night and received the crawl stats for my site (ShapeFit.com). Since April of 2011, my site has been severely impacted by Google's Panda and Penguin algorithm updates and we have lost about 80% of our traffic during that time. I have been trying to follow the guidelines provided by Google to fix the issues and help recover but nothing seems to be working. The majority of my time has been invested in trying to add content to "thin" pages on the site and filing DMCA notices for copyright infringement issues. Since this work has not produced any noticeable recovery, I decided to focus my attention on removing bad backlinks and this is how I found SEOmoz. My question is about duplicate content. The crawl diagnostics showed 6,000 errors for duplicate page content and the same for duplicate page title. After reviewing the details, it looks like almost every page is from the forum (shapefit.com/forum). What's the best way to resolve these issues? Should I completely block the "forum" folder from being indexed by Google or is there something I can do within the forum software to fix this (I use phpBB)? I really appreciate any feedback that would help fix these issues so the site can hopefully start recovering from Panda/Penguin. Thank you, Kris
On-Page Optimization | shapefit0 -
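One low-effort option, sketched below, is to keep Google from crawling the forum at all via robots.txt (the /forum/ path is assumed from the question). Bear in mind that robots.txt only blocks crawling; pages that are already indexed are removed more reliably with a noindex robots meta tag (in phpBB that usually means editing the header template or using an SEO-oriented mod), or by pairing the robots.txt rule with a URL removal request in Google Webmaster Tools.

```
# Hypothetical robots.txt rule: keep crawlers out of the forum entirely.
User-agent: *
Disallow: /forum/
```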
Removing syndicated duplicate content from website - what steps do I need to take to make sure Google knows?
Hey all, So I've made the decision to cancel the service that provides my blog with regular content / posts, since it seems that having duplicate content on my site isn't doing me any favors. I'm on a WordPress system - I'll be exporting the posts so I have them for reference, and then deleting them. There are 150 or so. What steps should I take to ensure that Google learns of the changes I've made? Or do I not need to do anything at all in that department? Also, I've assumed that the best decision would be to 'remove' the content from my blog. Is that the best way to go? Or should I leave it in place and start adding unique content? (My guess is that I need to remove it...) Thanks for your help, Kurt
On-Page Optimization | KurtBullock0 -
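If the syndicated posts are deleted, their old URLs will return 404s and eventually drop out of Google on their own; returning a 410 (Gone) tends to speed that up. The sketch below is one hypothetical way to do that in WordPress; the slug list is a placeholder for the slugs of the posts that were exported and deleted.

```php
<?php
// Hypothetical sketch: answer requests for removed posts with 410 Gone
// instead of 404, so search engines drop them from the index faster.
function example_send_410_for_removed_posts() {
    $removed_slugs = array( 'example-syndicated-post', 'another-old-post' ); // placeholder slugs
    if ( is_404() ) {
        $path = trim( parse_url( $_SERVER['REQUEST_URI'], PHP_URL_PATH ), '/' );
        if ( in_array( basename( $path ), $removed_slugs, true ) ) {
            status_header( 410 );
            nocache_headers();
            exit;
        }
    }
}
add_action( 'template_redirect', 'example_send_410_for_removed_posts' );
```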
What is the best way to manage industry required duplicate Important Safety Information (ISI) content on every page of a site?
Hello SEOmozzer! I have recently joined a large pharmaceutical marketing company as their head SEO guru, and I've encountered a duplicate-content-related issue here that I'd like some help on. Because there is so much red tape in the pharmaceutical industry, there are A LOT of limitations on website content, medication and drug claims, etc. Because of this, it is required to have Important Safety Information (ISI) clearly stated on every page of the client's website (including the homepage). The information is generally pretty lengthy, and in some cases is longer than the non-ISI content on each page. Here is an example: http://www.xifaxan.com/ All content under the ISI header is required on each page. My questions are: How will this duplicated content on each page affect our on-page optimization scores in the eyes of search engines? Is Google seeing this simply as duplicated content on every page, or are they "smart" enough to understand that because it is a drug website, this is industry standard (and required)? Aside from creating more meaty, non-ISI content for the site, are there any other suggestions you have for handling this potentially harmful SEO situation? And in case you were going to suggest it, we cannot simply have an image of the content, as it may not be visible to all internet users. We've already looked into that 😉 Thanks in advance! Dylan
On-Page Optimization | MedThinkCommunications0 -
Wordpress Pages vs. Posts
When building a blog to promote a particular affiliate offer, I usually like to use a static "page" homepage and then have my posts displayed on another part of my site. I've noticed that my WordPress pages almost always rank higher than WordPress posts and I can't explain it... Here are some possibilities I've thought of: XML sitemap priority is set at 60% for pages and only 20% for posts; my main navbar lists the pages, which consequently means they get linked to on every page of my website; or some other phenomenon within the WordPress framework... If it helps, I use Thesis v1.8 on all my sites. I guess my ultimate question is: If pages do in fact rank higher than posts, is it worth it for me to go back and change the site structure on all my blogs which are using posts instead of pages? I know making major modifications like that can be disastrous, but will it ultimately pay off? Thanks
On-Page Optimization | drewhammond1