Rotating Content Concern on Deep Pages
-
Hi there,
I apologize if I'm too vague, but this is a tough issue to describe without divulging too much of our project.
I'm working on a new project that will present information results in sets of 3. Let's say someone wants to find 3 books that match their criteria, either through an organic search that leads them to us or through the internal search on our site.
For instance, if they're looking for classic movies involving monsters, we might display Frankenstein, Dracula, and The Mummy. We'd list unique descriptions of the movies and include lots of other useful information.
However, there are obviously many more monster movies than those 3, so when a user refreshes the page or accesses it again, a different set of results shows up. For this example, assume we have 5 results to choose from. So it's likely Google will index different, shuffled sets of results.
I'm worried about this causing problems down the line with ranking. The meat and potatoes of the page content are the descriptions and information on the movies. If these are constantly changing, I'm afraid the page will look "unstable" to Google since we have no real static content beyond a header and title tag.
Can anyone offer any insight into this?
Thanks!
-
Thanks for the response. The issue of "hiding" the content with the randomization was a fear of mine. Believe me, I don't like the rotating content design, but it's where we're at right now.
The results are specific businesses, but for user-experience reasons, only 3 will be shown at once. This is not something that can be changed, unfortunately. If more than 3 businesses fall in a given category, we'll be rotating them out (which I don't like) on refresh.
The only solution I can think of is to have the top 3 remain static and let the user click a "Show more" button that loads additional results beneath (or replaces the original 3). Either way, Google shouldn't have an issue with that, correct?
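A rough sketch of that "Show more" approach in JavaScript (all names here are hypothetical, just to illustrate the idea): every result is rendered in the page so crawlers can see the full set, but only a stable top 3 is visible until the user expands.

```javascript
// Sketch of the "Show more" pattern: all results are in the page
// (crawlable), only the top 3 are visible until the user expands.
// Function and variable names are made up for illustration.

function renderResults(allResults, expanded) {
  // The top 3 are always the same on every page load, so Google
  // indexes a consistent set of descriptions.
  const visible = expanded ? allResults : allResults.slice(0, 3);
  // The rest stay in the markup (e.g. a hidden container) rather
  // than being fetched on click, so crawlers still see everything.
  const hidden = expanded ? [] : allResults.slice(3);
  return { visible, hidden };
}

const movies = ["Frankenstein", "Dracula", "The Mummy", "The Wolf Man", "The Invisible Man"];
const initial = renderResults(movies, false);
console.log(initial.visible); // stable top 3 on every load
const afterClick = renderResults(movies, true);
console.log(afterClick.visible.length); // full set once the user expands
```

The key design point is that expanding only toggles visibility; nothing new is loaded, so the content a crawler sees matches what a user can reach.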
I know there are "better" ways to accomplish what we're asking, but the site is custom built and nearly 95% complete. We're also taking a unique approach to the way we display results and serve them to our clients, so the most optimal way isn't achievable at this point. It's basically a matter of finding the best option among what we can do, if that makes sense. Thanks for understanding!
-
Sorry, I've been mega busy.
First of all, never hide content from Google if a user is unable to view that information. You will get slapped for it. Even if the algorithm does not pick it up, someone will report it at some point. That is a bad foundation to start from.
What you are trying to do is complicated, and it's hard to get the full picture in my head, hence the lack of response from others in this forum, I think.
You need to describe exactly what will be on the page and why, what will be on other pages, and why those pages need to be indexed. That way we can work out whether the strategy you are taking is even the right one. There is likely a better way to do it.
-
Hi Gary,
Were you notified of my follow-up posts? I'd love to hear additional information from you.
Thanks a lot!
-
Hi Gary,
You suggested: "One thing you could try is loading all the matches onto a page and only showing the top 3 matches with an option to reveal more, and marking all the code up with schema. This way the content will always be on the page and able to be crawled by Googlebot."
This is the idea I've been toying with. Do you have any idea if we could preload all matches/results and still use the refresh? It'd technically (I think) be different because the user can't load more on command, like with a button, but Google can see them.
I feel like it's a little iffy since Google seems to only approve of hidden text if the user controls when they see it or not. Any idea?
Thanks again!
-
Lesley,
Thanks for the response.
If we scripted the page so Google would ignore the content, I'm afraid we'd be in nearly the same boat we're in now. As in, we'd have no content on the page and wouldn't rank for anything.
While it would effectively "solve" the potential rotating-content issues and penalties, we wouldn't have anything to rank for.
Gary,
Thanks for the helpful response!
1. How would we run into internal duplicate content issues? These 3 results (in full) would only be found on this specific page; they'd just be rotating.
I will say that the way these results pages are structured includes snippets of content that can be found on each result's individual page, e.g., a snippet of Frankenstein's plot will show on the results page and, once clicked, will show the full entry. So there's going to be some duplicate content. That shouldn't be a huge deal, though, right?
2. That's exactly the reason I hate this. Even if Google didn't get pissed, we wouldn't have static content (keywords, longtails) to build authority and rank for.
Idea #1: I actually have this principle written down, but slightly differently. If we had a JavaScript link at the bottom of the results to "shuffle" or "refresh" the content, the user would get the benefit, but since it's not a new page, Google wouldn't crawl it as one. So the results would only randomize on command and would stick with the initial 3 on page load.
I was also toying with the idea of locking 2 of the results and only shuffling the 3rd, that way there's some semblance of continuity to the indexing and we'd always be working towards that content ranking. Thoughts?
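The shuffle-on-command idea might look something like this in client-side JavaScript (a sketch under my own assumptions, not a tested implementation): the initial 3 are deterministic on every load, and randomization happens only when the user clicks.

```javascript
// Sketch of "shuffle only on user command": the server renders the
// same initial 3 on every page load (stable for Googlebot), and a
// client-side link reshuffles in place without a new URL.
// All names are hypothetical.

function initialThree(allResults) {
  // Deterministic: always the same first 3, never randomized on load.
  return allResults.slice(0, 3);
}

function shuffledThree(allResults) {
  // Fisher-Yates shuffle, run only when the user clicks "shuffle".
  const copy = allResults.slice();
  for (let i = copy.length - 1; i > 0; i--) {
    const j = Math.floor(Math.random() * (i + 1));
    [copy[i], copy[j]] = [copy[j], copy[i]];
  }
  return copy.slice(0, 3);
}

const movies = ["Frankenstein", "Dracula", "The Mummy", "The Wolf Man", "The Invisible Man"];
console.log(initialThree(movies));  // identical on every request
console.log(shuffledThree(movies)); // varies, but only after a click
```

Because the shuffle happens client-side with no URL change, the crawled and indexed version of the page always shows the same initial 3.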
Are you saying that with schema markup we can "hide" the additional/rotated results from the user initially while Google sees them immediately? If so, please elaborate or send me a link, because this is interesting!
Idea #2: The snippets actually lead/link to their static pages on their own URLs (this is the only duplicate content, I believe), so that's fine. But yes, we aren't concerned with the static pages ranking, only the grouped-together results.
-
You run into a number of issues by having these pages indexed.
1. Lots of internal duplicate content. Google has said this is not a problem and that they will serve up the best result, but it can trigger Panda issues.
2. The content always changes, so you will confuse Googlebot and have trouble ranking for specific terms for any period of time (your SERPs would fluctuate like crazy, or you'd trigger a quality algorithm).
Some ideas:
One thing you could try is loading all the matches onto a page and only showing the top 3 matches, with an option to reveal more, and marking all the code up with schema. This way the content will always be on the page and able to be crawled by Googlebot.
Another option is to not index these pages at all and create static pages for each item instead. But this defeats the object, since the grouped results pages are what you are trying to rank.
Serving up random content is always going to be an issue for Googlebot, but more and more webmasters have responsive designs that hide and show content based on clickable actions on pages. Googlebot indexes all the content but is smart at working out what is also visible to the user and giving preference to it.
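The "load everything, show 3, mark it up with schema" idea could be sketched as a JSON-LD ItemList that exposes the full set to crawlers regardless of what's visible on screen. The property mapping and names below are my assumption, not a prescribed implementation:

```javascript
// Sketch: put every match in the page and mark it up with
// schema.org structured data (an ItemList as JSON-LD), so the full
// set is machine-readable even when only 3 items are shown.
// The choice of properties here is an assumption for illustration.

function buildItemList(items) {
  return {
    "@context": "https://schema.org",
    "@type": "ItemList",
    "numberOfItems": items.length,
    "itemListElement": items.map((name, i) => ({
      "@type": "ListItem",
      "position": i + 1,
      "name": name,
    })),
  };
}

const movies = ["Frankenstein", "Dracula", "The Mummy", "The Wolf Man", "The Invisible Man"];
const jsonLd = JSON.stringify(buildItemList(movies), null, 2);
// This string would go inside a <script type="application/ld+json">
// tag in the page head or body.
console.log(jsonLd);
```

Note that structured data describes the content; it does not by itself make hidden text acceptable, so the visible page should still let users reach everything the markup declares.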
-
In my opinion, the safest way to do it would be to have a discreet iframe that loads the contents. The reason is that Google would ignore it, which would put it on par with Twitter widgets and Facebook Like boxes.