Does anyone know how dynamic/personalized website content affects SEO?
-
A client using Marketo has asked about personalizing their website content based on a persona. To be clear, I'm talking about key website pages, maybe even the home page, not PPC/campaign-specific landing pages. For example, areas of the site would change to display content differently to a CEO versus a salesperson.
I'm new to marketing automation and don't know exactly how this piece works. I'm hoping someone here has experience or can offer pros/cons guidance. How do search engines handle this type of page?
Here's Marketo's site explaining what it does: https://docs.marketo.com/display/public/DOCS/Web+Personalization+-+RTP
-
Thanks, Sergey. Have you had much experience implementing this type of feature on a website's main pages (not campaign-focused landing pages)? I understand the power in theory, but I wonder how it would affect the user experience if someone in one role shared the page with a colleague in a different role or chain of command.
Secondly, you mention that strict parameters need to be in place so the content isn't seen as duplicate. Wouldn't the search engine crawler "read" everything at once? I'm wondering if a crawler would scan the entire page and perhaps be confused about the page's focus if different blocks were set up to target different audience types. Or maybe it doesn't matter at all and I'm overthinking it?
-
Hello Wendy,
Great question. First of all, if done right, something like this can be extremely powerful.
Here are some pros associated with this kind of persona targeting:
- Increased conversion rates (more personalized content = more visitors inclined to convert)
- Improved user-experience metrics (bounce rate, time on site, etc.)
- If set up correctly, it could eventually mean less work in the long run for developers
With that being said, there are some things to watch out for. I wouldn't necessarily call these cons, but they're worth noting:
- Duplicate/thin content can become an issue if strict parameters aren't set up (see the sketch after this list).
- There is a possibility that conversion rates don't improve, in which case you wouldn't see any ROI from the effort.
- It can take a lot of time to fit this system into your developers' workflow.
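On the duplicate/thin content point, here's a minimal sketch of one SEO-safe pattern (this is not Marketo's actual implementation): ship a single default, crawlable version of each block in the HTML and only swap persona-specific copy client-side once a known persona is detected. The persona cookie, the data-persona-slot attributes, and the copy map below are hypothetical placeholders; the principle is that crawlers and unknown visitors always see the same canonical content at the same URL.

```typescript
// Minimal sketch, not Marketo's API: the HTML ships with default, crawlable copy,
// and persona-specific copy is swapped in client-side only for known visitors.
type Persona = "ceo" | "sales";

// Hypothetical copy map; keys correspond to data-persona-slot attributes in the markup.
const PERSONA_COPY: Record<Persona, Record<string, string>> = {
  ceo: { hero: "See company-wide pipeline health at a glance." },
  sales: { hero: "Spend less time on data entry and more time selling." },
};

function getPersona(): Persona | null {
  // Assumes a first-party cookie such as "persona=ceo" was set earlier (e.g. after a form fill).
  const match = document.cookie.match(/(?:^|;\s*)persona=(ceo|sales)\b/);
  return match ? (match[1] as Persona) : null;
}

function applyPersonaCopy(): void {
  const persona = getPersona();
  if (!persona) return; // Crawlers and unknown visitors keep the default HTML.

  document.querySelectorAll<HTMLElement>("[data-persona-slot]").forEach((el) => {
    const slot = el.dataset.personaSlot ?? "";
    const copy = PERSONA_COPY[persona][slot];
    if (copy) el.textContent = copy; // Text swap only; the URL and canonical tag never change.
  });
}

document.addEventListener("DOMContentLoaded", applyPersonaCopy);
```

However you implement it (client-side swap, server-side defaults, or prerendering), the key design choice is that personalization layers on top of one indexable version of the page rather than producing several crawlable variants of the same URL.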
It comes down to what the goal is for your website. If you are trying to test whether (using your example) changing an image for a certain user benefits user experience or conversion rates, I would recommend doing some A/B testing first. Tools like Optimizely offer great ways to test changes to the pages you control. If a test goes well, you can then work with an automation tool to roll that change out for specific users.
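As a rough illustration of the A/B-testing step (a generic sketch, not the Optimizely SDK), a test typically buckets each visitor deterministically so the same person always sees the same variant; the hash function and variant names here are just placeholders:

```typescript
// Generic sketch of deterministic A/B bucketing -- not tied to any specific testing tool.
function hashToBucket(visitorId: string, buckets: number = 2): number {
  let hash = 0;
  for (const ch of visitorId) {
    hash = (hash * 31 + ch.charCodeAt(0)) >>> 0; // simple 32-bit rolling hash
  }
  return hash % buckets;
}

// Hypothetical variant names for the example in the thread (default vs. persona-targeted hero).
function assignVariant(visitorId: string): "control" | "personalized-hero" {
  return hashToBucket(visitorId) === 0 ? "control" : "personalized-hero";
}

// Example: the same visitor ID always lands in the same bucket.
["visitor-101", "visitor-102", "visitor-103"].forEach((id) => {
  console.log(id, "->", assignVariant(id));
});
```

Once a variant wins, the personalization tool can serve it to the matching persona while the default stays in place as the version crawlers index.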
**TL;DR: Can this work? YES. Does it take a lot of resources and thought? YES. Does it vary from business to business? YES.**
Hope that helps.