What are the potential SEO downsides of using a service like Unbounce for content pages?
-
I'm thinking of using unbounce.com to create some content-driven pages. Unbounce is simple, easy to use, and makes it very easy for non-devs at my company to create variations on pages.
I know it lets you add meta descriptions, title tags, and so on, and lets pages be indexed by Google, but I was wondering whether there are any potential downsides to using Unbounce as opposed to hosting the pages myself.
Any help would be appreciated!
-
Hi,
I'm the person behind SEO at Unbounce.
There is no technical SEO drawback. Unbounce gives you direct control over all of the elements of your on-page SEO. You can even use rel="canonical" to indicate which variation Google should treat as the primary one.
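For illustration, a canonical tag on a test variation looks like the snippet below (the URL is a hypothetical example, not a real Unbounce page):

```html
<!-- Placed in the <head> of each variation; points search engines
     at the version you want treated as the primary page -->
<link rel="canonical" href="https://www.example.com/landing-page/" />
```

Each variation carries the same canonical URL, so any link equity the test pages pick up consolidates on the primary version.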
If you have any questions feel free to contact me: Carlos@unbounce.com
-
Hi Carlos,
Thanks for the answer! Sorry if I wasn't clear in my original question, but we are actually using Unbounce for PPC testing already.
The pages we are planning to create are not necessarily landing pages. It's just much faster for us to create pages and content in Unbounce at the moment than to build actual pages on our site. (That way non-devs can create new pages as well.)
In your opinion, would there be any major downsides to creating some pages on Unbounce? Obviously it's not ideal, but if there are no major issues we might use the service, albeit temporarily.
Thanks!
Seiya
-
That's some solid advice right there.
-
This process may lend itself to PPC a bit more than SEO. When split testing, you will need to watch out for duplicate content, and since your ultimate goal is to figure out which landing pages are more effective, you will end up removing some of the pages anyway. At a large scale this isn't going to be as effective.
I would consider running a PPC account to test these pages and not have them indexed. Then, once you have a landing page that performs well, create it on the site and promote it with SEO.
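One standard way to keep the test pages out of Google's index while the PPC test runs (a generic snippet, assuming you can add markup to the page's head; Unbounce may also offer a per-page indexing setting, so check there first):

```html
<!-- Placed in the page's <head>; tells compliant crawlers
     not to add this URL to their index -->
<meta name="robots" content="noindex">
```

Once a winner emerges and you rebuild it on your own site, drop the tag (or simply retire the Unbounce version) so the permanent page can be indexed normally.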
-
Seiyav-
If you are looking for a simple and easy way to start taking advantage of A/B testing and get landing pages created quickly without waiting for a developer, it is a very cost-effective model. You can generate landing pages quickly and easily without developers, set up tests easily, and the system provides metrics to measure the results without a lot of in-depth analysis.
There really are no downsides other than the cost.
We have found with larger clients that, as they develop some expertise and clarity about which landing pages work better, they start to bring the work in-house to gain more control, save money, and become more knowledgeable about A/B testing and landing page development. But Unbounce is a great step in that process.
Good luck. Hope it helps.
Mark