Personalized Content Vs. Cloaking
-
Hi Moz Community,
I have a question about content personalization: can we serve personalized content without being penalized for serving different content to robots vs. users? If content starts in the same initial state for all users, including crawlers, is it safe to assume there will be no SEO impact, since personalization won't happen for anyone until there is some interaction?
Thanks,
-
It sounds like you're on the right track. If users and bots start off with the same content, that's a good start.
From there, the question is "how much content is being customized, and how frequently?" For example, if you're swapping out 5 different headlines for 40% of users, and 60% of users see the original, that's not a big deal, particularly if the rest of the page is the same.
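To make the "safe" pattern concrete, here's a minimal sketch (names and the bucketing logic are illustrative, not any particular framework): the server ships the same original headline to everyone, and a variant is only chosen after a real interaction flips the `personalized` flag.

```javascript
// Hypothetical sketch: every visitor (and crawler) gets the same
// server-rendered headline; a variant is only chosen once a real
// interaction has happened, so the initial state is identical for all.
const HEADLINES = [
  "Original headline", // what the HTML ships with for everyone
  "Variant A",
  "Variant B",
];

// Deterministic bucket from a visitor id, so repeat visits are stable.
function pickHeadline(visitorId, personalized) {
  if (!personalized) return HEADLINES[0]; // pre-interaction: original for all
  let hash = 0;
  for (const ch of visitorId) hash = (hash * 31 + ch.charCodeAt(0)) >>> 0;
  return HEADLINES[hash % HEADLINES.length];
}
```

The key property is that `personalized` is false for every first render, so there is no divergence between what a crawler and a fresh user see.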
But if you're swapping out 80% of the page copy (e.g., removing a bunch of excess copy that is shown for SEO purposes), and 60-90% of users are seeing that "light" version of the page, you run the risk of two things:
- First, the chance that it wouldn't pass a manual review if one were performed.
- Second, the chance that Google may render the page as a user (without announcing itself as a crawler), see a different version of the page multiple times, and then effectively devalue the missing content, or worse, flag the page in its systems as cloaked content.
We could get lost in the details of whether or how they're doing this, but from a technology standpoint it's pretty simple for them to render content from non-official IPs and user-agents and run an 'honesty check' for situations where content shows up multiple ways. This is already how they compare a page on desktop vs. mobile to see which sections render and which are changed.
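A rough sketch of what that kind of honesty check could look like, assuming you fetch the same URL twice (once announced as a crawler, once as a plain browser) and compare the rendered text. The function names and the 0.8 threshold are my own illustration, not anything Google has published:

```javascript
// What fraction of the crawler-visible words survive in the
// user-visible render? 1.0 means nothing was hidden from users.
function sharedTextRatio(crawlerText, userText) {
  const crawlerWords = crawlerText.toLowerCase().split(/\s+/).filter(Boolean);
  const userWords = new Set(userText.toLowerCase().split(/\s+/).filter(Boolean));
  if (crawlerWords.length === 0) return 1;
  const kept = crawlerWords.filter((w) => userWords.has(w)).length;
  return kept / crawlerWords.length;
}

// Flag pages where most of the crawler-facing copy is missing for users.
// The threshold is purely illustrative.
function looksCloaked(crawlerText, userText, threshold = 0.8) {
  return sharedTextRatio(crawlerText, userText) < threshold;
}
```

Swapping a handful of headlines barely moves this ratio; stripping 80% of the copy for most users craters it, which is exactly the asymmetry described above.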
I think you are also right to rely on site interaction before personalizing, but since there are multiple ways to do that, you should know that it's possible for Google to simulate some of those interactions. So there's a chance that at some point they will render your content in a personalized state, particularly if personalization is triggered by visiting a URL or clicking a simple toggle or button.
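One way to think about those triggers is by how easily a rendering crawler could trip them. The risk labels below are my own rough ordering, not an official classification:

```javascript
// Illustrative only: rough risk ordering of personalization triggers,
// based on how easily a crawler could trip them while rendering a page.
const TRIGGER_RISK = {
  urlParameter: "high",  // crawlers follow URLs, so ?variant=b will be seen
  simpleClick: "medium", // headless rendering engines can fire click events
  loginSession: "low",   // requires credentials a crawler won't have
};

function crawlerMaySee(trigger) {
  return TRIGGER_RISK[trigger] !== "low";
}
```

In other words, the harder the trigger is to simulate, the less likely a crawler ever sees the personalized state.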