I've got duplicate pages. For example, blog/page/2 is the same as author/admin/page/2. Is this something I should just ignore, or should I create author/admin/page/2 and then 301 redirect?
-
I'm going through the crawl report and it says I've got duplicate pages. For example, blog/page/2 is the same as author/admin/page/2/. Now, I can't even find author/admin/page/2 in WordPress, but it is the same thing as blog/page/2 nonetheless. Is this something I should just ignore, or should I create author/admin/page/2 and then 301 redirect it to blog/page/2?
-
I'd take a slightly different approach to solving your issue, though blocking the pages will work. My only concern with doing that is that any weight that page has will evaporate from your site rather than being passed back internally. You won't find the pages in WordPress because they're auto-generated, and I'm guessing there's only one author, which is why the author archive is the same as the general archive.
Assuming you're using Yoast, you can remedy the issue by going to the Titles & Metas area, selecting the "Archives" tab, and checking the box next to "Add noindex, follow to the author archives". This allows PageRank to pass, but the page won't be indexed as duplicate content. There are other types of pages in the same area you can do the same thing for.
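If you want to sanity-check that the setting actually took effect, you can look for the robots meta tag in the author-archive page's HTML. A minimal sketch using Python's standard-library HTML parser (the sample markup below is hypothetical; in practice you'd fetch the live page and feed in its source):

```python
# Sketch: extract the content of the <meta name="robots"> tag from a page.
# The sample_html string is a stand-in for a fetched author-archive page.
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.robots_content = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        # Capture the robots directive if this is a <meta name="robots"> tag
        if tag == "meta" and attrs.get("name") == "robots":
            self.robots_content = attrs.get("content", "")

sample_html = '<html><head><meta name="robots" content="noindex, follow"></head></html>'
parser = RobotsMetaParser()
parser.feed(sample_html)
print(parser.robots_content)  # → noindex, follow
```

Seeing "noindex, follow" here confirms the archive page will be crawled for links but kept out of the index.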
As an aside, you should change your username. From the example you've given, I'm assuming you've left the user as "admin". Since that's the default for WordPress, it makes brute-force attacks easier because attackers already know the username. You can change it directly via phpMyAdmin, but if you're not comfortable in there, simply create a different user with Admin privileges and delete the old "admin", making sure to attribute all posts and pages to the new user.
I shouldn't have to say this, but just in case something goes wrong: BE SURE TO BACK UP YOUR DATABASE!
-
Hey Michael,
Better safe than sorry. If you are picking up duplicate pages, you could get hit with duplicate content issues. This won't get you blacklisted by any means... but it can push your results away from the spotlight.
In a situation like this, I would advise blocking these types of pages from bots, but do not redirect them (that could cause some serious navigation issues)! Also, a rel="canonical" tag can be helpful for pointing out the original source of the content.
Example robots.txt
User-agent: *
Disallow: /author

Note this will block anything after /author... so author/admin/, author/admin/page, etc.
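Before deploying a rule like this, you can confirm it blocks exactly what you expect with Python's standard-library robots.txt parser. A minimal sketch (the domain is just a placeholder):

```python
# Sketch: check that "Disallow: /author" blocks the author archives
# while leaving the regular blog archives crawlable.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /author",
])

print(rp.can_fetch("*", "https://example.com/author/admin/page/2"))  # → False
print(rp.can_fetch("*", "https://example.com/blog/page/2"))          # → True
```

Because robots.txt rules are prefix matches, the single /author line covers /author/admin/, /author/admin/page/2, and so on.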
-
The same thing has happened to me on some websites. I usually ignore these messages, as long as the pages can't actually be found. I think the tool is not 100% accurate!