Is it OK to dynamically serve different content to paid and non-paid traffic from the same URL?
-
Hi Moz!
We're trying to serve different content to paid and non-paid visitors from the same URL. Is this black hat?
Here's why we want to do this: we're testing a theory that paid ads boost organic rankings. We saw this happen with a client and want to test it further. But paid traffic needs a different, more sparse UX that converts better.
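For context, here's roughly how we'd detect paid visitors. This is a minimal sketch assuming Google Ads auto-tagging (the gclid parameter) or manual utm_medium=cpc tagging; the variant names are just placeholders:

```typescript
// Minimal sketch: classify a landing-page request as paid or organic.
// Assumes Google Ads auto-tagging (gclid) or manual utm_medium=cpc tagging;
// "sparse" and "full" are placeholder names for the two page variants.
function pickVariant(requestUrl: string): "sparse" | "full" {
  const params = new URL(requestUrl).searchParams;
  const isPaidClick =
    params.has("gclid") || // Google Ads auto-tagging appends gclid to ad clicks
    params.get("utm_medium")?.toLowerCase() === "cpc";
  return isPaidClick ? "sparse" : "full"; // sparse, conversion-focused UX for paid
}

// An ad click-through vs. an organic visit to the same URL:
console.log(pickVariant("https://example.com/landing?gclid=abc123")); // "sparse"
console.log(pickVariant("https://example.com/landing"));              // "full"
```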
Thanks for reading!
-
Hi David,
First of all, as far as I know, paid campaigns don't help organic rankings. Google has said repeatedly that paid campaigns don't affect organic rankings.
As far as I know, Google says that showing one version to users and another version to bots is called cloaking, and we must not do that; but it hasn't said anything specific about paid vs. non-paid visitors.
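To illustrate the distinction, here's a hypothetical sketch (not something to deploy): cloaking means branching on the crawler itself, whereas varying by campaign parameter treats Googlebot the same as any organic visitor, since Googlebot typically crawls the URL without a gclid attached:

```typescript
// Hypothetical illustration only. Branching on the crawler's user-agent,
// as below, is cloaking and violates Google's guidelines -- don't do this.
function isGooglebot(userAgent: string): boolean {
  return /Googlebot/i.test(userAgent);
}

// Cloaking: bots get one page, humans get another. Prohibited.
function serveCloaked(userAgent: string): string {
  return isGooglebot(userAgent) ? "bot-version.html" : "human-version.html";
}

// Varying by traffic source: Googlebot carries no gclid, so it sees exactly
// what an organic visitor sees. This is the asker's scenario.
function serveByTrafficSource(url: string): string {
  return new URL(url).searchParams.has("gclid") ? "sparse.html" : "full.html";
}

console.log(serveCloaked("Mozilla/5.0 (compatible; Googlebot/2.1)")); // cloaked page
console.log(serveByTrafficSource("https://example.com/landing"));     // "full.html"
```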
Even if I assume a paid campaign could help organic rankings, the only signal it could plausibly influence is CTR.
I've run AdWords campaigns for my website for over 8 years with a CTR of at least 10%, and I've never noticed the paid campaigns helping rankings.
**I wouldn't suggest you do that.**
Please also check this: https://support.google.com/adwordspolicy/answer/6020954?hl=en&rd=1#701
Hope this helps you.
Thanks
-
Hi Highland,
Thanks for the quick response!
I wasn't clear in my question. The paid visitors I'm referring to are visitors coming through search ads, not people subscribed to our service, so this isn't about paywalls. Rather, we're trying to send paid traffic to a page to see if it will increase the page's rankings, while at the same time serving a different user experience to paid and non-paid visitors to increase conversions.
Also, we'd like the two versions to have different content, not just one version with little or no content and another with all of it.
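Concretely, here's roughly what we have in mind on the server side. A toy Node sketch using the same gclid/utm check as above; in reality the two responses would be rendered templates, not inline strings:

```typescript
import * as http from "http";

// Toy sketch: one URL path, two page variants, chosen per request.
// The inline HTML strings are placeholders for real templates.
const server = http.createServer((req, res) => {
  // req.url is path + query; a base URL is needed to parse it.
  const url = new URL(req.url ?? "/", "https://example.com");
  const paid =
    url.searchParams.has("gclid") ||
    url.searchParams.get("utm_medium") === "cpc";
  res.writeHead(200, { "Content-Type": "text/html" });
  // Same URL either way; only the rendered page differs.
  res.end(paid
    ? "<h1>Sparse, conversion-focused page</h1>"
    : "<h1>Full content page</h1>");
});

server.listen(3000);
```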
-
Google's rule on paywalls is that you have to offer some content to searchers for free. So, for instance, if you search for a NY Times article and click through, they'll tell you how many free articles you have left before you have to pay. Google, of course, can see all of that content.
Search Engine Land had a good article on the paywall implications.
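For completeness: Google's documented way to keep a paywall from looking like cloaking is structured data. You mark the gated section of the page so Googlebot knows the difference between what it crawls and what a visitor sees is a paywall, not deception. Here's a sketch based on Google's paywalled-content guidelines, assuming the gated portion is wrapped in a hypothetical .paywall element:

```typescript
// Sketch of Google's paywalled-content structured data (JSON-LD), built as a
// TypeScript object and serialized for the page head. The ".paywall" CSS
// selector is a placeholder for whatever element wraps the gated content.
const paywallMarkup = {
  "@context": "https://schema.org",
  "@type": "NewsArticle",
  "headline": "Example article headline",
  "isAccessibleForFree": false, // the article as a whole is gated
  "hasPart": {
    "@type": "WebPageElement",
    "isAccessibleForFree": false,
    "cssSelector": ".paywall", // identifies the gated section for Google
  },
};

// Emit as a <script type="application/ld+json"> block server-side.
const jsonLdTag =
  `<script type="application/ld+json">${JSON.stringify(paywallMarkup)}</script>`;
console.log(jsonLdTag);
```

With this in place, Google can crawl the full text for indexing while human visitors still hit the meter.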