Rel author and duplicate content
-
I have a question: on a site where I am the only author, won't my site have duplicate content between the blog posts page and the author page, since they show the same posts? What is your suggestion in that case? Thanks
-
Hi Dario, no-indexing the author archive is probably the most commonly used method to prevent duplicate content issues from harming single-author blogs in search. However, it is not the only method, and it is not the best one in terms of usability. Other options include redirecting the author archive page to the blog home page, using canonical tags, or disabling the author archive altogether. In terms of usability, I prefer the last option. Why create an author archive at all for a single-person blog?
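If you go the canonical route, a minimal sketch (hypothetical URLs) is a link element in the head of the author archive pointing at the blog home, which tells search engines which version to index:
<!-- placed in the <head> of domain.com/author/name (hypothetical URL) -->
<link rel="canonical" href="https://domain.com/blog/">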
-
I know that it sounds weird to no-index the author page. In some ways I agree with that.
But it's very normal to no-index archive pages, because they are obvious duplicate content. An author page is nothing more than an archive page filtered to just one author.
I hope this makes you see my solution in a different way. I still think that no-indexing is the best thing you can do.
-
That's correct:
domain.com/blog content is the same as domain.com/author/name
domain.com/blog ranks in position 1 (more authority)
domain.com/author/name ranks in position 2 (less authority, no links from the site)
No-indexing is not the only option, though. I would like more options, such as creating a second author and alternating posts between the two, or other suggestions besides just not indexing the page.
-
What I gather from your question is that when you say you are the author of a piece of content, your CMS will create two pages: one for the category where your blog post resides, and one on the author page.
If this is what you mean, then you should make sure the search engines don't index your author page. You can do that by placing the following piece of code in the HTML head section of the author page: <meta name="robots" content="noindex, follow">
In order to be associated as the author in the eyes of the search engines, you should use rel="author" on your hyperlink.
For example: <a rel="author" href="[your author page URL]">My domain</a>
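Putting the two pieces together, a minimal sketch (hypothetical URLs, assuming the single-author setup described above) would be: the author archive carries the noindex directive in its head, while each blog post links to the author page with rel="author" in its byline.
<!-- head of domain.com/author/name (the author archive, hypothetical URL) -->
<head>
  <meta name="robots" content="noindex, follow">
</head>
<!-- byline inside each blog post on domain.com/blog (hypothetical URL) -->
<p>Written by <a rel="author" href="https://domain.com/author/name">Dario</a></p>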
Related Questions
-
Duplicate Content issue in Magento
I am getting a duplicate content issue because of the following product URLs in my Magento store: http://www.sitename.com/index.php/sports-nutritions/carbohydrates and http://www.sitename.com/sports-nutritions/carbohydrates. Can someone please guide me on how to solve it? Thanks, guys.
White Hat / Black Hat SEO | webteamBlackburn
-
Removing duplicated content using only NOINDEX at large scale (80% of the website)
Hi everyone, I am taking care of a large "news" website (500k pages), which took a massive hit from Panda because of duplicated content (70% was syndicated content). I recommended that all syndicated content be removed and that the website focus on original, high-quality content. However, this was implemented only partially. All syndicated content is set to NOINDEX (they think it is good for the user to see standard news plus the original high-quality content). Of course, it didn't help at all - no change after months. If I were Google, I would definitely penalize a website that has 80% of its content set to NOINDEX because it is duplicated; I would consider such a site "cheating" and not worthwhile for the user. What do you think about this "theory"? What would you do? Thank you for your help!
White Hat / Black Hat SEO | Lukas_TheCurious
-
Looking for a Way to Standardize Content for Thousands of Pages w/o Getting Duplicate Content Penalties
Hi All, I'll premise this by saying that we like to engage in as much white hat SEO as possible. I'm certainly not asking for any shady advice, but we have a lot of local pages to optimize :).
We are an IT and management training course provider. We have 34 locations across the US, and each of our 34 locations offers the same courses. Each of our locations has its own page on our website. However, in order to really hone the local SEO game by course topic area and city, we are creating dynamic custom pages that list our course offerings/dates for each individual topic and city. Right now, our pages are dynamic and are being crawled and ranking well within Google. We conducted a very small-scale test of this in our Washington DC and New York areas with our SharePoint course offerings, and it was a great success: we are ranking well on "sharepoint training in new york/dc" etc. for two custom pages.
So, with 34 locations across the states and 21 course topic areas, that's well over 700 pages of content to maintain - a LOT more than just the two we tested. Our engineers have offered to create a standard title tag, meta description, H1, H2, etc., but with some varying components. This is from our engineer specifically:
"Regarding pages with the specific topic areas, do you have a specific format for the Meta Description and the Custom Paragraph? Since these are dynamic pages, it would work better and be a lot easier to maintain if we could standardize a format that all the pages would use for the Meta and Paragraph. For example, if we made the Paragraph: 'Our [Topic Area] training is easy to find in the [City, State] area.' As a note, other content such as directions and course dates will always vary from city to city, so content won't be the same everywhere, just slightly the same. It works better this way because HTFU is actually a single page, and we are just passing the venue code to the page to dynamically build the page based on that venue code. So they aren't technically individual pages, although they seem like that on the web. If we don't standardize the text, then someone will have to maintain custom text for all active venue codes for all cities for all topics. So you could be talking about over a thousand records to maintain, depending on what you want customized. Another option is to have several standardized paragraphs, such as 'Our [Topic Area] training is easy to find in the [City, State] area,' followed by other content specific to the location, or 'Find your [Topic Area] training course in [City, State] with ease,' followed by other content specific to the location. Then we could randomize what is displayed. The key is to have a standardized format so additional work doesn't have to be done to maintain custom formats/text for individual pages."
So, mozzers, my question to you all is: can we standardize with slight variations specific to that location and topic area without getting dinged for spam or duplicate content? Often I ask myself, "If Matt Cutts was standing here, would he approve?" For this, I am leaning towards "yes," but I always need a gut check. Sorry for the long message. Hopefully someone can help. Thank you! Pedram
White Hat / Black Hat SEO | CSawatzky
-
Creating a duplicate site for testing purposes: can it hurt the original site?
Hello, we are soon going to upgrade the CMS to the latest version along with new functionalities; the process may take anywhere from 4 to 6 weeks. Since we need to work on the live server, what we have planned is to take an exact replica of the site and move it to a test domain, but on the live server, and block Google, Bing and Yahoo in robots.txt:
User-agent: Google
Disallow: /
User-agent: Bing
Disallow: /
User-agent: Yahoo
Disallow: /
We will upgrade the CMS, add functionality, test the entire structure, check URLs using Screaming Frog or Xenu, and then move on to configuring the site on the original domain. The upgrade and new tools may take 1 to 1.5 months. The concern is that, despite blocking Google, Bing and Yahoo through the User-agent disallow, the URLs might still be crawled by the search engines. If so, it may hurt the original site, as the test site will read as an entire duplicate. Or is there any alternative way around this? Many thanks
White Hat / Black Hat SEO | Modi
-
Rel Canonical and Rel No Index, Follow
Hi, we can't implement rel=next and prev as we are having difficulty with the coding - we have tried a lot, but with no luck. We are now considering rel=canonical and a noindex, follow meta tag for two sections.
Deals and Discounts: we have consistently ranked in first position for over 1.5 years; however, we recently slipped to positions 4-5 on many keywords in this section. URL: http://www.mycarhelpline.com/index.php?option=com_offers&view=list&Itemid=9 Here, the page content for pages 1 and 2 pertains to the current month, and the pages from page 3 onwards pertain to previous months. Does it make sense to add rel=canonical from page 3 through the last page pointing to page 1, and also simultaneously add noindex, follow from page 3 to the last page?
News & Reviews section: here, all news and article items are posted; mainly the links to the news items are there. However, the pages are not duplicates - does adding noindex, follow make sense here? URL: http://www.mycarhelpline.com/index.php?option=com_latestnews&view=list&Itemid=10
I look forward to recommendations on how best to implement this - to gain SERP positions, avoid duplicate content, and stay with white-hat methods. Many thanks
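For reference, the rel=next/prev markup mentioned above is just a pair of link elements in the head of each paginated page; a minimal sketch with hypothetical URLs (not the site's actual pagination parameters) for page 2 of a listing might be:
<!-- head of page 2 of a paginated listing (hypothetical URLs) -->
<link rel="prev" href="https://example.com/deals?page=1">
<link rel="next" href="https://example.com/deals?page=3">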
White Hat / Black Hat SEO | Modi
-
Schema.org tricking and duplicate content across domains
I've found the following abuse, and I'm curious what I could do about it. Basically the scheme is:
- own some content only once (pictures, description, reviews etc.)
- use different domain names (no problem if you use the same IP or IP-C address)
- have a different layout (this is basically the key)
- use schema.org tricking, meaning show (the very same) reviews on a different scale, and show slightly fewer reviews on one site than on another
Quick example: http://bit.ly/18rKd2Q
#2: budapesthotelstart.com/budapest-hotels/hotel-erkel/szalloda-attekintes.hu.html (217.113.62.21), 328 reviews, 8.6 / 10
#6: szallasvadasz.hu/hotel-erkel/ (217.113.62.201), 323 reviews, 4.29 / 5
#7: xn--szlls-gyula-l7ac.hu/szallodak/erkel-hotel/ (217.113.62.201), no reviews shown
It turns out that this tactic, even without the 4th step, can be quite beneficial for ranking with several domains. Here is a little investigation I've done (not really extensive, it took around an hour and a half, but quite shocking nonetheless): https://docs.google.com/spreadsheet/ccc?key=0Aqbt1cVFlhXbdENGenFsME5vSldldTl3WWh4cVVHQXc#gid=0
Kaspar Szymanski from the Google webspam team said that they have looked into it and will do something, but honestly I don't know whether I can believe it or not. What do you suggest?
- Should I leave it, and try to copy this tactic to rank with the very same content multiple times?
- Should I deliberately cheat with markup?
- Should I play nice and hope that these guys will sooner or later be dealt with? (Honestly, I can't see this one working out.)
- Should I write a case study on this, so that maybe if the tactic gets more attention, Google will deal with it?
Could anybody push this towards Matt Cutts, or anybody else who is responsible for these things?
White Hat / Black Hat SEO | Sved
-
Competitor sites link to a considerable number of irrelevant/nonsense sites that seem to score high in domain authority
According to my recent SEOmoz link analysis, my competitor sites link to a considerable number of irrelevant/nonsense sites that seem to score high with regard to domain authority, e.g. a wedding site linking to a transportation attorney's website. Another competitor site has around 2 million links overall, most of which are seemingly questionable index sites or forums to which registration is unattainable. I recently created a 301 redirect, and my external links have yet to be updated to my new domain name in SEOmoz. Yet, comparing my previous domain authority with those of the said competitor sites, the "delta" is relatively marginal: my SEOmoz rank is 21, whereas the SEOmoz ranks of the two competitor sites are 30 and 33 respectively. The problem, however, is securing a good SERP for the most relevant terms on Google. My Google PageRank was 3 prior to the 301 redirect. I worked quite intensively to receive a PageRank, only to discover that it had no effect at all on the SERP. Therefore, I took a calculated risk in changing to a domain name that transliterates from non-Latin characters, as the site age is marginal, and my educated guess is that the PR should rebound within 4 weeks. However, I would like to know whether there is a way to transfer the PageRank to the new domain. Does anyone have any insight as to how to go about handling this issue?
White Hat / Black Hat SEO | eranariel
-
Are duplicate item titles harmful to my ecommerce site?
Hello everyone, I have an online shopping site selling, amongst other items, candles. We have lots of different categories within the LED candles category. One route a customer can take is homepage > LED candles > Tealights. Within the tealights category we have 7 different products which vary only in colour. It is necessary to create separate products for each colour since we have fantastic images for each colour. To target different keywords, at present we have different titles (hence different link texts, different URLs and different H1 tags) for each colour, for example "Battery operated LED candles, amber", "Flameless candles, red" and "LED tealights, blue". I was wondering whether different titles to target different keywords are a good idea, or whether this is just confusing to the customer and I should stick with a generic item title which varies only by colour (e.g. "LED battery candles, colour"). If I do the latter, am I at risk of getting downranked by Google, since I would be duplicating the product titles/link texts/URLs/H1 tags/img ALTs? (The description and photos for each colour are unique.) Sorry if this is a little complicated - please ask and I can clarify anything, because I really want to give the best customer experience but still preserve my Google ranking. I have attached screenshots of the homepage and categories to clarify; feel free to visit the live site too. Thank you so much, Pravin (Screenshots: BqFCp.jpg, KC2wB.jpg, BEcfX.jpg)
White Hat / Black Hat SEO | goforgreen