What are the SEO recommendations for dynamic, personalised page content? (not e-commerce)
-
Hi,
We will have pages on the website that will display different page copy and images for different user personas. The main content (copy, headings, images) will be supplied dynamically and I'm not sure how Google will index the B and C variations of these pages.
As far as I know, the page URL won't change and won't have parameters.
Google will crawl and index page content that comes from JavaScript, but I don't know which version of the page copy the search robot will index. If we set user-agent filters and serve the default page copy to search robots, we might risk a cloaking penalty, because users would get different content than search robots.
Is it better to have URL parameters for version B and C of the content? For example:
- /page for the default content
- /page?id=2 for the B version
- /page?id=3 for the C version
The dynamic content comes from the server side, so not all page copy variations are present in the default HTML.
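A sketch of the parameter-based approach being considered (names are hypothetical, not from the actual site); choosing copy by URL parameter rather than by user agent is what avoids the cloaking risk mentioned above:

```javascript
// Choose page copy from the URL parameter alone -- never from the
// User-Agent header -- so a search robot requesting /page?id=2 sees
// exactly what a user at /page?id=2 sees. Persona copy is illustrative.
const personaCopy = {
  "2": "Persona B headline",
  "3": "Persona C headline",
};

function pickCopy(query) {
  // /page       -> default copy
  // /page?id=2  -> persona B copy
  // /page?id=3  -> persona C copy
  return personaCopy[query.id] || "Default headline";
}

console.log(pickCopy({}));          // Default headline
console.log(pickCopy({ id: "2" })); // Persona B headline
```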
I hope my questions make sense. I couldn't find recommendations for this kind of SEO issue.
-
Hi everyone,
I have a related question about personalisation, a variation on the theme, which I would appreciate some help with.
There is a project afoot within my company to "personalise" the user experience by presenting pages to users which better respond to their interests.
That is to say that, when a user visits our page about "tennis-shoes", the next time they visit the homepage they will be presented with a homepage which focuses on tennis shoes.
So far so good.
However, rather than personalising certain elements of the homepage, the idea is to intercept those users and 301 them to an entirely different URL, completely hidden from Google, which will contain entirely different content focusing only on shoes.
The top navigation will remain the same.
This sounds like a massive breach of the Quality Guidelines on at least two counts to me. It reeks of cloaking and "sneaky redirects", and I am very concerned this will do us far more harm than good.
I'm guessing that the correct way of going about this would be either to build a great "shoes" page and allow users to navigate to it, visit it, and do whatever they want with it, or to personalise some dynamic elements of the homepage on the same URL, without hiding things from Google or frustrating users by blocking the page they are trying to access.
Any feedback from the community would be a great help.
Thanks a lot!
-
Brilliant thread guys!
This will be discussed far more in the not-so-distant future, I'm sure!
Dynamic Homepages are becoming more common and I have a client using one so this info has really helped me.
This topic should be a future Whiteboard Friday.
-
Yes, that sounds great! Please let me know how it all goes and if you run into any other hiccups.
Cheers,
B
-
Hi Britney
Thank you for your detailed feedback!
I checked the posts you linked and a few other sources and I think the solution will be the following:
- The default content will be served at the parameter-free URL, e.g. /product
- Personalised versions of the page will have different (short) parameters, e.g. /product?version=8372762
- The default and the personalised pages will have the same canonical tag (default page)
- Let Google know in Search Console's URL Parameters settings that the version parameter changes the page content (effect: "Specifies", crawl: "Let Googlebot decide")
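The canonical setup in the list above can be sketched as a tiny helper that strips the personalisation parameter, so every variant emits the same canonical tag (the domain is a placeholder):

```javascript
// Derive the canonical tag for any personalised variant by removing the
// "version" parameter: /product?version=8372762 and /product both
// declare /product as canonical. example.com is a placeholder domain.
function canonicalTag(rawUrl) {
  const url = new URL(rawUrl, "https://example.com");
  url.searchParams.delete("version");
  return `<link rel="canonical" href="${url.origin}${url.pathname}">`;
}

console.log(canonicalTag("/product?version=8372762"));
// <link rel="canonical" href="https://example.com/product">
```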
I hope it makes sense.
-
Did some digging and found a few resources. Google had an official statement about this in its webmaster guidelines:
"If you decide to use dynamic pages (i.e., the URL contains a ? character), be aware that not every search engine spider crawls dynamic pages as well as static pages. It helps to keep the parameters short and the number of them few. Don't use &id= as a parameter in your URLs, as we don't include these pages in our index."
That was many years ago, but more recently Google changed its position on the subject. The entry has been removed from Google's guidelines, but here's the official statement from Google's blog:
"Google now indexes URLs that contain the &id= parameter. So if your site uses a dynamic structure that generates it, don't worry about rewriting it -- we'll accept it just fine as is. Keep in mind, however, that dynamic URLs with a large number of parameters may still be problematic for search engine crawlers in general, so rewriting dynamic URLs into user-friendly versions is always a good practice when that option is available to you. If you can, keeping the number of URL parameters to one or two may make it more likely that search engines will crawl your dynamic urls."
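As an illustration of the "rewriting dynamic URLs into user-friendly versions" advice in that quote, a mod_rewrite sketch (paths and parameter names are made up for the example):

```apache
# Map a friendly URL to the underlying dynamic one, so visitors and
# crawlers see /shoes/tennis while the application still receives the
# parameterised request. Paths here are hypothetical.
RewriteEngine On
RewriteRule ^shoes/([a-z-]+)/?$ /product.php?category=$1 [L]
```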
Click here to read the full article.
Penalization for personalisation
Let me know if this helps
-
Fascinating question Gyorgy!
I've always been a big fan of dynamic targeting.
It would be a great idea to have different URL parameters for each unique set of content. You might also want to push these pages through Fetch & Index in Google Search Console (and include them in your sitemap.xml) to show you're not attempting to cloak.
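If you do go the sitemap route suggested above, note that a sitemap normally lists only the canonical, parameter-free URLs; the personalised variants get picked up via their canonical tags. A minimal sketch (example.com is a placeholder):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- canonical, parameter-free page; /page?id=2 and /page?id=3
       point here via rel="canonical" -->
  <url>
    <loc>https://example.com/page</loc>
  </url>
</urlset>
```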
This would be a fantastic question for Google reps...I can try to reach out to someone today and let you know what they say.
Cheers,
B
PS. Just curious, how are you pulling in persona data?
Related Questions
-
150+ Pages of URL Parameters - Mass Duplicate Content Issue?
Hi, we run a large e-commerce site and while doing some checking through GWT we came across these URL parameters and are now wondering if we have a duplicate content issue. If so, we are wondering what the best way to fix them is: is this a task for GWT or a rel=canonical task? Many of the URLs are driven by the filters on our category pages and come up like this: page04%3Fpage04%3Fpage04%3Fpage04%3F (see the image for more). Does anyone know if these links are duplicate content and, if so, how we should handle them? Richard
Technical SEO | Richard-Kitmondo
-
Duplicate content by php id,page=... problem
Hi dear friends! How can I resolve this duplicate problem by editing the PHP code? My trouble is that Google finds http://vietnamfoodtour.com/?mod=booking&act=send_booking&ID=38 and http://vietnamfoodtour.com/.....booking.html as different pages, but they are one and the same, and Google has indexed both of them, so the duplicate content count has risen 😞. How can I tell Google that they are one page?
Technical SEO | magician
-
Is it bad to have your pages as .php pages?
Hello everyone, is it bad to have your website pages indexed as .php? For example, the contact page is site.com/contact.php and not /contact. Does this affect your SEO rankings in any way? Is it better to have your pages without the extension? Also, if I'm working with a news site and the URLs are dynamic for every article (e.g. site.com/articleid=2323), should I change all of those dynamic URLs to static? Thank you.
Technical SEO | BruLee
-
Does page size and relative content position affect SEO?
Good morning, Each product page of our e-commerce site consists of a fairly lengthy header and footer. The former of which contains links to ~60 product categories, the logo, etc, while the latter contains information such as the latest posts from our blog, links to support, etc. The main "content" of the page is of course product related information, which also happens to contain a bit of templated data such as links which when clicked open respective sliders containing information regarding our return and shipping policies. The question: We wonder whether the relative "size" of the page has anything to do with SEO results. As an example, suppose the page header consists of 20% of the total page size, the important page-specific content consumes 60%, and the footer consumes the final 20%. Is this relevant? Or to rephrase the question: Should we be concerned about keeping our headers and footers as small as possible? Thanks!
Technical SEO | FondriestEnv
-
Duplicate Page Content and Titles
A few weeks ago my error count went up for Duplicate Page Content and Titles. 4 errors in all. A week later the errors were gone... But now they are back. I made changes to the web.config over a month ago but nothing since. SEOmoz is telling me the duplicate content is http://www.antiquebanknotes.com/ and http://www.antiquebanknotes.com. Thanks for any advice! This is the relevant web.config:

    <rewrite>
      <rules>
        <rule name="CanonicalHostNameRule1">
          <match url="(.*)" />
          <conditions>
            <add input="{HTTP_HOST}" pattern="^www.antiquebanknotes.com$" negate="true" />
          </conditions>
          <action type="Redirect" url="http://www.antiquebanknotes.com/{R:1}" />
        </rule>
        <rule name="Default Page" enabled="true" stopProcessing="true">
          <match url="^default.aspx$" />
          <conditions logicalGrouping="MatchAll">
            <add input="{REQUEST_METHOD}" pattern="GET" />
          </conditions>
          <action type="Redirect" url="/" />
        </rule>
      </rules>
    </rewrite>

Technical SEO | Banknotes
-
What's the best way to eliminate duplicate page content caused by blog archives?
I (obviously) can't delete the archived pages regardless of how much traffic they do/don't receive. Would you recommend a meta robots tag or a robots.txt file? I'm not sure I'll have access to the root directory, so I could be stuck with a meta robots tag, correct? Any other suggestions to alleviate this pesky duplicate page content issue?
Technical SEO | ICM
-
Entry-based content and SEO
My e-commerce team is implementing functionality that allows us to display different content based on what channel, and even what keyword, the customers used to reach our page. This is of course a move that we believe will strengthen our conversion rates, but how will it affect our organic search listings? Do you have any examples of how this could affect us, and are there any technology pitfalls that we absolutely need to know about?
Technical SEO | GEMoney_No
-
On-page 301 redirect for HTML pages
For PHP pages you've got:

    <?php
    header( "HTTP/1.1 301 Moved Permanently" );
    header( "Location: http://www.example.com" );
    ?>

Is there anything for HTML pages? Or is placing this code in the .htaccess the only way to properly 301 redirect HTML pages?

    Redirect 301 /old/old.htm http://www.you.com/new.php

Thanks!
Technical SEO | shupester