What are the SEO recommendations for dynamic, personalised page content? (not e-commerce)
-
Hi,
We will have pages on the website that will display different page copy and images for different user personas. The main content (copy, headings, images) will be supplied dynamically, and I'm not sure how Google will index the B and C variations of these pages.
As far as I know, the page URL won't change and won't have parameters.
Google will crawl and index the page content that comes from JavaScript, but I don't know which version of the page copy the search robot will index. If we set user-agent filters and serve the default page copy to search robots, we might risk a cloaking penalty, because users would get different content from what the search robots see.
Is it better to have URL parameters for versions B and C of the content? For example:
- /page for the default content
- /page?id=2 for the B version
- /page?id=3 for the C version
The dynamic content comes from the server side, so not all page copy variations are present in the default HTML.
I hope my questions make sense. I couldn't find recommendations for this kind of SEO issue.
-
Hi everyone,
I have a related question about personalisation, a variation on the theme, which I would appreciate some help with.
There is a project afoot within my company to "personalise" the user experience by presenting pages to users which better respond to their interests.
That is to say, when a user visits our page about "tennis-shoes", the next time they visit the homepage they will be presented with a homepage that focuses on tennis shoes.
So far so good.
However, rather than personalising certain elements of the homepage, the idea is to intercept those users and 301 them to an entirely different URL, completely hidden from Google, which will contain entirely different content focusing only on shoes.
The top navigation will remain the same.
This sounds like a massive breach of the Quality Guidelines on at least two counts to me. It reeks of cloaking and "sneaky redirects", and I am very concerned this will do us far more harm than good.
I'm guessing that the correct way of going about this would be either to build a great "shoes" page and let users navigate to it, visit it, and do whatever they want with it, or to personalise the homepage with some dynamic elements on the same URL, without hiding anything from Google or frustrating users by blocking them from the page they are trying to access.
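To make that second option concrete, here is a rough client-side sketch of what same-URL personalisation might look like; the storage key, selector, and copy are purely hypothetical and just assume the visitor's last-viewed category is remembered in localStorage:

```typescript
// Hypothetical sketch: personalise one homepage module on the same URL,
// based on the category the visitor last browsed. No redirects and no
// user-agent sniffing, so crawlers simply see the default markup.

const LAST_CATEGORY_KEY = "lastViewedCategory"; // assumed storage key

// Called on category pages, e.g. /tennis-shoes
function rememberCategory(category: string): void {
  try {
    localStorage.setItem(LAST_CATEGORY_KEY, category);
  } catch {
    // Storage unavailable (private browsing, etc.) -> keep the default page
  }
}

// Called on the homepage after the default content has rendered
function personaliseHomepage(): void {
  const category = localStorage.getItem(LAST_CATEGORY_KEY);
  if (!category) return; // first-time visitors keep the default homepage

  const hero = document.querySelector<HTMLElement>("[data-personalised-hero]");
  if (hero) {
    hero.textContent = `Picked for you: our latest ${category}`;
  }
}

document.addEventListener("DOMContentLoaded", personaliseHomepage);
```

The point being that the default homepage is what gets served and indexed, and the personalisation only rearranges or highlights elements on top of it.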
Any feedback from the community would be a great help.
Thanks a lot!
-
Brilliant thread guys!
I'm sure this will be discussed far more in the not-so-distant future!
Dynamic Homepages are becoming more common and I have a client using one so this info has really helped me.
This topic should be a future Whiteboard Friday.
-
Yes, that sounds great! Please let me know how it all goes and if you run into any other hiccups.
Cheers,
B
-
Hi Britney
Thank you for your detailed feedback!
I checked the posts you linked and a few other sources and I think the solution will be the following:
- The default content will be loaded at the parameter-free URL, e.g. /product
- Personalised versions of the page will each use a short parameter, e.g. /product?version=8372762
- The default and the personalised versions will share the same canonical tag, pointing at the default, parameter-free page (see the sketch after this list)
- Let Google know in Search Console's URL Parameters settings that the version parameter changes page content ("Yes: changes page content" + "Let Googlebot decide")
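For illustration, here's a minimal server-side sketch of what I have in mind; Express, the route, the parameter name, and the copy variants are just assumptions for the example, not our actual stack:

```typescript
// Hypothetical sketch of the setup above (Express used only for illustration).
// The copy variant is chosen by the explicit ?version= parameter, never by
// user agent, so Googlebot requesting /product sees the same default copy as
// any visitor without the parameter -- no cloaking involved.
import express from "express";

const app = express();
const CANONICAL_BASE = "https://www.example.com"; // placeholder domain

// Made-up persona copy; any unknown version falls back to the default.
const COPY_VARIANTS: Record<string, { heading: string; body: string }> = {
  default: { heading: "Our product", body: "Default copy shown to everyone." },
  "8372762": { heading: "Our product for persona B", body: "Copy tailored to persona B." },
};

app.get("/product", (req, res) => {
  const version = typeof req.query.version === "string" ? req.query.version : "default";
  const copy = COPY_VARIANTS[version] ?? COPY_VARIANTS["default"];

  // Every variation declares the parameter-free URL as canonical, so
  // /product?version=8372762 consolidates to /product.
  const canonical = `${CANONICAL_BASE}/product`;

  res.send(`<!doctype html>
<html>
  <head>
    <link rel="canonical" href="${canonical}">
    <title>${copy.heading}</title>
  </head>
  <body>
    <h1>${copy.heading}</h1>
    <p>${copy.body}</p>
  </body>
</html>`);
});

app.listen(3000);
```

The key points are that the copy is selected by the explicit version parameter rather than by user agent, and that every variation points its canonical at the parameter-free URL.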
I hope it makes sense.
-
Did some digging and found a few resources stating:
Google had an official statement about this in its webmaster guidelines:
"If you decide to use dynamic pages (i.e., the URL contains a ? character), be aware that not every search engine spider crawls dynamic pages as well as static pages. It helps to keep the parameters short and the number of them few. Don't use &id= as a parameter in your URLs, as we don't include these pages in our index."That was many years ago but more recently Google changed its position on that subject. The entry has been removed from Google's guidelines but here's the official statement from Google's blog:
"Google now indexes URLs that contain the &id= parameter. So if your site uses a dynamic structure that generates it, don't worry about rewriting it -- we'll accept it just fine as is.Keep in mind, however, that dynamic URLs with a large number of parameters may still be problematic for search engine crawlers in general, so rewriting dynamic URLs into user-friendly versions is always a good practice when that option is available to you. If you can, keeping the number of URL parameters to one or two may make it more likely that search engines will crawl your dynamic urls."
Click here to read the full article.
Penalization for personalisation
Let me know if this helps
-
Fascinating question, Gyorgy!
I've always been a big fan of dynamic targeting.
It would be a great idea to have different URL parameters for each unique set of content. You might also want to push these pages to fetch & index within Google Search Console (and include them in your sitemap.xml) to show you're not attempting to cloak.
This would be a fantastic question for Google reps... I can try to reach out to someone today and let you know what they say.
Cheers,
B
PS. Just curious, how are you pulling in persona data?