Customer Experience vs Search Result Optimisation
-
Yes, I know customer experience is king, but I have a dilemma: my site has been live since June 2013 & we get good feedback on site design & easy-to-follow navigation; however, our rankings aren't as good as they could be.
For example, the following 2 pages share very similar URLs, but the pages do 2 different jobs, & when you get to the site that is easy to see. My largest keyword, "Over 50 Life Insurance", becomes difficult to target because Google sees both pages and splits the results, so I think I must be losing ranking positions.
http://www.over50choices.co.uk/Funeral-Planning/Over-50-Life-Insurance.aspx
The first page explains the product(s) and the 2nd is the Quote & Compare page, which generates the income.
I am currently playing with meta tags, but as yet haven't found the right combination!
Originally the 2nd page's meta tags focused on "compare over 50s life insurance", but Google still sees "over 50 life insurance" within that phrase, so the results get split. I also had internal anchor text supporting this.
What do you think is the best strategy for optimising both pages?
Thanks
Ash
-
Also, if I do combine the 2 pages, should I 301 the compare page to the other?
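If the pages are merged, the retired compare URL should return a permanent (301) redirect to the surviving page so its link equity is consolidated. As a rough sketch of how that redirect could be verified once in place, assuming Python with the `requests` library (the compare-page URL below is a hypothetical placeholder, since it isn't quoted in the thread):

```python
# Rough sketch: verify that the retired compare page 301-redirects to the
# surviving page after a merge. Assumes the third-party `requests` library.
import requests

# Hypothetical URL for the compare page -- the real one isn't quoted above.
old_url = ("http://www.over50choices.co.uk/Funeral-Planning/"
           "Compare-Over-50s-Life-Insurance.aspx")
target_url = ("http://www.over50choices.co.uk/Funeral-Planning/"
              "Over-50-Life-Insurance.aspx")

resp = requests.get(old_url, allow_redirects=False, timeout=10)

# A permanent redirect answers 301 with a Location header pointing at the
# page that survives the merge.
print(resp.status_code)              # expect 301
print(resp.headers.get("Location"))  # expect target_url
assert resp.status_code == 301
assert resp.headers.get("Location") == target_url
```

Checking with `allow_redirects=False` matters here: a 302 would look identical in a browser, but search engines treat it as temporary, so the old page's equity wouldn't be passed along.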
Thanks
Ash
-
Tommy thanks for your advice.
Will having specific internal & external anchor text help Google know which page I am targeting for each keyword, e.g. "Over 50 Life Insurance" & "Compare Over 50s Life Insurance", or will I always have the same issue because the first phrase is contained within the second?
Ash
-
Hi Ash,
I agree with Aaron about combining both pages, since both pages' targeting is very similar and that's why Google is seeing the same keywords for both. You have the same keywords in the URL, the title tag and within the content. If you can combine these 2 pages, I think it will have a better effect, since visitors can learn about your service and see examples on one page. Furthermore, it is easier to build backlinks for "Over 50 Life Insurance" to 1 page.
If you really want to have 2 separate pages, I would rename the 2nd page and target a different keyword. "Life Insurance Quote"? "Life Insurance Comparison"?
-
Again, my bigger concern is how thin the comparison page is.
This really has nothing to do with how similar the URLs are, although that might confuse users a bit. It's the lack of actionable, usable content on the comparison page that I think will make the difference here.
If you were really adamant about keeping the two pages intact and NOT merging them, which could be ok as well, then I would consider building up the comparison page so it was a bit richer in content, and provided some useful comparisons on the page rather than forcing the users to go somewhere else. Currently, there is just one big image in the center with a few lines above it.
Adding in a bit more content / context behind those comparisons could make all the difference.
-
Hi Aaron, we could merge them, but when we designed the site we set out to be different from the other comparison-type sites: rather than push the "Buy/Compare" option on the product landing page, we wanted to take visitors on a journey depending on their level of knowledge.
By laying out the nav options on the left-hand side, visitors can choose where to go depending on which page they arrived at, which is great once you are on site but not for optimising for search, hence my question.
Merging the pages is an option, but are there any others we could explore first?
Thanks
Ash
-
Hey Ash,
Is it an option to merge the pages? The reason I ask is that when I go to the second page (the revenue-generating one), the experience I get is actually pretty poor. It doesn't provide me much information, and on top of that it immediately redirects me to another page. So in many ways it's like a gateway to the place you really want visitors to go.
Why not remove the second page altogether and merge the content, so you can really provide a solid user experience? Then, once customers feel comfortable with the information they have, they can click to receive a quote and get what they are looking for instead of jumping around, which runs the risk of them dropping off.
Is that possible?
Aaron