Rotating Content Concern on Deep Pages
-
Hi there,
I apologize if I'm being too vague, but this is a tough issue to describe without divulging too much of our project.
I'm working on a new project that will provide informational results in sets of 3. Let's say someone wants to find 3 movies that match their criteria, either through an organic search that leads them to us, or through the internal search on our site.
For instance, if they're looking for classic movies involving monsters, we might display Frankenstein, Dracula, and The Mummy. We'd list unique descriptions about the movies and include lots of other useful information.
However, there are obviously many more monster movies than those 3, so when a user refreshes the page or accesses it again, a different set of results shows up. For this example, assume we have 5 results to choose from. So it's likely Google will index different, shuffled sets of results.
I'm worried about this causing problems down the line with ranking. The meat and potatoes of the page content are the descriptions and information on the movies. If these are constantly changing, I'm afraid the page will look "unstable" to Google since we have no real static content beyond a header and title tag.
Can anyone offer any insight into this?
Thanks!
-
Thanks for the response. The issue of "hiding" the content with the randomization was a fear of mine. Believe me, I don't like the rotating content design, but it's where we're at right now.
The results are 3 listings (think specific businesses), but for user-experience reasons, only 3 will be shown at once. This is not something that can be changed, unfortunately. If more than 3 exist in that specific business category, we'll be rotating them out (which I don't like) upon refresh.
The only solution I can think of is to have the top 3 remain static and allow the user to click a "Show more" button which loads them beneath (or replaces the original 3). Either way, Google shouldn't have an issue with that, correct?
I know there are "better" ways to accomplish what we're asking, but the site is custom built and nearly 95% complete. We are also taking a unique approach to the way we display results and serve them to our clients, so the most optimal way is not achievable at this point. It's basically about finding the most optimal approach within what we can do, if that makes sense. Thanks for understanding!
-
Sorry, I've been mega busy.
First of all, never hide content from Google if a user is unable to view that information. You will get slapped for it. Even if the algorithm does not pick it up, someone will report it at some point. That is a bad foundation to start from.
What you are trying to do is complicated, and it's hard to get the full picture in my head; I think that's why others in this forum haven't responded.
You need to describe exactly what will be on the page and why, what will be on the other pages, and why those pages need to be indexed. This way we can work out if the strategy you are taking is even the right one. There is likely a better way to do it.
-
Hi Gary,
Were you notified of my follow-up posts? I'd love to hear additional information from you.
Thanks a lot!
-
Hi Gary,
One thing you could try is loading all the matches on to a page and only show the top 3 matches with an option to reveal more and mark all the code up with a schema. This way the content will always be on the page and able to be crawled by Googlebot.
This is the idea I've been toying with. Do you have any idea if we could preload all matches/results and still use the refresh? It'd technically (I think) be different, because the user can't load more on command like with a button, but Google can still see them.
I feel like it's a little iffy since Google seems to only approve of hidden text if the user controls when they see it or not. Any idea?
Thanks again!
-
Lesley,
Thanks for the response.
If we scripted the page so Google would ignore the content, I'm afraid we'd be in nearly the same boat we're in now: we'd have no content on the page and wouldn't rank for anything.
While it would effectively "solve" the potential rotating-content issues and penalties, we wouldn't have anything to rank for.
Gary,
Thanks for the helpful response!
1. How would we run into internal duplicate content issues? These 3 results (in full) would only be found on this specific page, they'd just be rotating.
I will say that the way these results pages are structured includes snippets of content that can also be found on each result's individual page. E.g., a snippet of Frankenstein's plot will show on the results page, and once clicked, will show the full entry. So there's going to be some duplicate content. That shouldn't be a huge deal though?
2. That's exactly the reason I hate this. Even if Google didn't get pissed, we wouldn't have static content (keywords, longtails) to build authority and rank for.
Idea #1: I actually have this principle written down, but slightly different. If we had a JavaScript link at the bottom of the results to "shuffle" or "refresh" the content, the user would get the benefit, but since it's not a new page, Google couldn't crawl it. So the results would only randomize on command and stick with the initial 3 on page load.
I was also toying with the idea of locking 2 of the results and only shuffling the 3rd, that way there's some semblance of continuity to the indexing and we'd always be working towards that content ranking. Thoughts?
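A minimal sketch of that "shuffle on command" idea, assuming a client-side handler; all function names here are illustrative, not from the actual project:

```javascript
// The page is served with a fixed, crawlable order, and randomization only
// happens after a user click, so the initial 3 results Googlebot sees never
// change.
function shuffle(items) {
  const copy = items.slice();                  // never mutate the served order
  for (let i = copy.length - 1; i > 0; i--) {  // Fisher-Yates shuffle
    const j = Math.floor(Math.random() * (i + 1));
    [copy[i], copy[j]] = [copy[j], copy[i]];
  }
  return copy;
}

// Variant described above: lock the first `locked` results for indexing
// continuity and only rotate the tail.
function rotateTail(items, locked = 2) {
  return items.slice(0, locked).concat(shuffle(items.slice(locked)));
}

// A "Shuffle" link would then re-render on click, e.g.:
// document.querySelector('#shuffle-link').addEventListener('click', () => {
//   render(rotateTail(allResults).slice(0, 3));
// });
```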
Are you saying that with schema markup we can "hide" the additional/rotated results from the user initially, while Google sees them immediately? If so, please elaborate or send me a link, since this is interesting!
Idea #2: The snippets actually lead/link to their static pages on their own URL (this is the only duplicate content I believe) so that's fine, but yes, we aren't concerned with the static pages ranking, only the grouped together results.
-
You run into a number of issues by having these pages indexed.
1. Lots of internal duplicate content. Google has said this is not a problem and that they will serve up the best result, but it can trigger Panda issues.
2. The content always changes, so you will confuse Googlebot and have issues ranking for specific terms for any period of time. (Your SERPs would fluctuate like crazy, or you'd trigger a quality algorithm.)
Some ideas:
One thing you could try is loading all the matches on to a page and only show the top 3 matches with an option to reveal more and mark all the code up with a schema. This way the content will always be on the page and able to be crawled by Googlebot.
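A rough sketch of what that could look like, assuming server-rendered markup plus a small client-side handler; all names (renderResults, .result, #show-more) are illustrative:

```javascript
// Render ALL matches into the page markup so the full content is crawlable,
// but mark everything after the first three as hidden until the user clicks
// "Show more".
function renderResults(matches) {
  return matches
    .map((m, i) =>
      `<article class="result"${i >= 3 ? ' hidden' : ''}>` +
      `<h2>${m.title}</h2><p>${m.description}</p></article>`)
    .join('\n');
}

// Client side, a "Show more" button would simply un-hide the rest:
// document.querySelector('#show-more').addEventListener('click', () => {
//   document.querySelectorAll('.result[hidden]')
//     .forEach((el) => el.removeAttribute('hidden'));
// });
```

Because every result is in the initial HTML (just visually collapsed), Googlebot can crawl all of it while users still see only three at a time.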
Another option is to not index these pages at all and create static pages for each item. But this defeats the object of what you are trying to rank for.
Serving up random content is always going to be an issue for Googlebot, but more and more webmasters have responsive designs that hide and show content based on clickable actions on pages. Googlebot indexes all the content but is smart at working out what is also visible to the user and giving preference to it.
-
In my opinion, the safest way to do it would be to have a discreet iframe that loads the contents. The reason is that Google would ignore it, which would put it on par with Twitter widgets and Facebook Like boxes.
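A minimal sketch of that iframe approach, assuming the rotating results live at their own URL (the endpoint path below is hypothetical):

```javascript
// Serve the rotating results from a separate URL and embed them, so crawlers
// treat the block like an embedded widget rather than page content.
function rotatingResultsFrame(src = '/widgets/rotating-results') {
  return `<iframe src="${src}" title="More matching results" loading="lazy"></iframe>`;
}

// e.g. document.querySelector('#results-slot')
//   .insertAdjacentHTML('beforeend', rotatingResultsFrame());
```

The trade-off, as noted earlier in the thread, is that content Google ignores also can't help the page rank.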