Duplicate page content on paginated blog pages?
-
Hello everyone,
I'm still relatively new to SEO and am trying my best to learn.
However, I have this persistent issue. My site is on WordPress, and all of my blog pages (e.g. page one, page two, etc.) are coming up as duplicate content.
Here are some URL examples of what I mean:
http://3mil.co.uk/insights-web-design-blog/page/3/
http://3mil.co.uk/insights-web-design-blog/page/4/
Does anyone have any ideas?
I have already noindexed categories and tags, so it is not them.
Any help would be appreciated.
Thanks.
-
Thanks for your help, Logan.
This is exactly what I was looking for.
-
I think the solution you're looking for is pagination markup. Use rel="prev" and rel="next" link elements to point search engines to the proper paginated URLs. This effectively resolves the duplicate content issue where http://3mil.co.uk/insights-web-design-blog/page/3/ is flagged as a duplicate of http://3mil.co.uk/insights-web-design-blog/page/2/.
You can read more on this topic in this article on the Search Console help site.
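For reference, that pagination markup is just a pair of link elements in the `<head>` of each paginated archive. A minimal sketch using the URLs from the question (a middle page links both ways; the first page would carry only rel="next" and the last page only rel="prev"):

```html
<!-- In the <head> of http://3mil.co.uk/insights-web-design-blog/page/3/ -->
<link rel="prev" href="http://3mil.co.uk/insights-web-design-blog/page/2/">
<link rel="next" href="http://3mil.co.uk/insights-web-design-blog/page/4/">
```

Many WordPress SEO plugins can output these automatically, so you may not need to edit your theme by hand.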
-
Hello Eric,
Thanks for your quick response. I have already implemented the "more" tag on all of my posts, and this doesn't seem to stop the issue.
I don't mean that the blog posts themselves have multiple pages; I mean that the blog index, where there are various snippets of blog posts, has more than one page. Like this: http://3mil.co.uk/insights-web-design-blog/
If that makes sense?
-
Those pages (e.g., /page/3/, /page/4/ etc.) are definitely duplicate content pages if you're not splitting up posts. In WordPress, we typically recommend using the "more" tag or other functionality to split up posts so that only a snippet is shown in places other than the full blog post URL itself.
There are a few ways to deal with this, but always make sure you're splitting up posts with the "more" tag or something similar. Your category pages and blog home page should show only a snippet of each post, so the user must click through to the blog post itself to see the full post.
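For anyone unfamiliar with it, the WordPress "more" tag is just an HTML comment placed in the post body; everything above it becomes the snippet shown on the blog index and archive pages:

```html
<p>Opening paragraph. This is all that appears on the blog index.</p>
<!--more-->
<p>The rest of the post renders only on the single post page.</p>
```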
If you're running into an issue where you have a ton of pages like /page/3/, /page/4/, /page/5/, look at how many posts you're showing on each page (5, 10, or 20?). Increasing that number would cut down on the number of paginated URLs (/page/3/) you end up with.
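The posts-per-page setting lives under Settings > Reading in the WordPress admin. If you'd rather raise it only for the blog index and leave other archives alone, a minimal sketch using the standard pre_get_posts hook (the function name here is illustrative, not from the original thread) would be:

```php
<?php
// functions.php: show 20 posts per page on the blog index only,
// so fewer /page/N/ archives are generated.
function mysite_blog_posts_per_page( $query ) {
    // Only touch the main query on the front-end blog index.
    if ( ! is_admin() && $query->is_main_query() && $query->is_home() ) {
        $query->set( 'posts_per_page', 20 );
    }
}
add_action( 'pre_get_posts', 'mysite_blog_posts_per_page' );
```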
Another way to deal with this would be to look at the posts and see if you can split them up into more categories or subcategories. I personally prefer subcategories if possible, but you won't want a category or subcategory with only one post in it.