Rel author and duplicate content
-
I have a question: on a site where I am the only author, won't my site have duplicate content, since the blog posts page and the author archive page are the same? What is your suggestion in that case? Thanks
-
Hi Dario, no-indexing the author archive is probably the most commonly used method to prevent duplicate content issues from harming single-author blogs in search. However, it is not the only method, and it is not the best one in terms of usability. Other options include redirecting the author archive page to the blog home page, using canonical tags, or disabling the author archive altogether. In terms of usability, I prefer the last option: why create an author archive at all for a single-person blog?
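As a sketch of the canonical-tag option: a canonical link in the head of the author archive tells search engines which URL is the preferred version of the content. The URLs below are placeholders based on the structure mentioned in this thread, not taken from a real site:

```html
<!-- Placed in the <head> of domain.com/author/name -->
<!-- Points search engines at the preferred copy of the content -->
<link rel="canonical" href="https://domain.com/blog/">
```

Unlike a 301 redirect or removing the archive entirely, this keeps the author page usable for visitors while consolidating ranking signals on the blog home page.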
-
I know that it sounds weird to no-index the author page, and in some ways I agree with that.
But it's very normal to no-index archive pages, because they are obvious duplicate content. An author page is nothing more than an archive page filtered down to a single author.
I hope this makes you see my solution in a different way. I still think that no-indexing is the best thing you can do.
-
That's correct:
domain.com/blog content is the same as domain.com/author/name
domain.com/blog ranks #1 (more authority)
domain.com/author/name ranks #2 (less authority), with no internal links pointing to it
It's not the only option, though. I would like more options, such as creating a second author and alternating posts between the two, or other suggestions besides just not indexing the page.
-
What I gather from your question is that when you say you are the author of a piece of content, your CMS will create two pages: one for the category where your blog post resides and one for the author archive.
If this is what you mean, then you should make sure the search engines don't index your author page. You can do that by placing the following piece of code in the HTML head section of your website: <meta name="robots" content="noindex, follow">
In order to be associated as the author in the eyes of the search engines, you should add rel="author" to your hyperlink.
For example: <a rel="author" href="...">My domain</a>
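Putting the two pieces together, here is a minimal sketch of how the no-index meta tag and the rel="author" link could sit in the author archive's markup. The profile URL is a placeholder for illustration only:

```html
<!DOCTYPE html>
<html>
<head>
  <!-- Keep the author archive out of the index, but let crawlers follow its links -->
  <meta name="robots" content="noindex, follow">
</head>
<body>
  <!-- rel="author" associates the linked profile with this content -->
  <a rel="author" href="https://example.com/about-author">My domain</a>
</body>
</html>
```

Note that "noindex, follow" is deliberate: the page stays out of search results, but link equity still flows through it to the individual posts.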