How do I avoid an SEO penalty for having a Single Page Interface (SPI)?
-
Guys, I run a real estate website where my clients pay me to advertise their properties.
The thing is, from the beginning I had this idea of a user interface that stays entirely on one page. On my site, users filter the properties in the left panel, and the listings (four properties at a time) refresh on the right side, where there is pagination.
When the user clicks a property ad, the ad is loaded by AJAX below the search panel on the same page. There's a "back up" button the user clicks to return to the search panel and pick another property.
People love our implementation and the user experience, so I simply can't let go of this UI "innovation" just for SEO; it really is something that sets us apart from our competitors.
My question, then, is: how do I avoid an SEO penalty for having this Single Page Interface, given that in Google's eyes users might not appear to be browsing my site deeply enough?
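For readers curious what this pattern looks like in code, here is a minimal sketch (every element id, path, and function name here is hypothetical, not the site's actual code): the ad is fetched over AJAX, and the URL fragment is updated with the `#!` convention so each property view is at least addressable.

```javascript
// Minimal sketch of the single-page pattern described above.
// All ids, paths, and names are hypothetical.
function propertyFragment(propertyId) {
  // Give every property view its own addressable URL fragment,
  // using the #! convention so it can also be exposed to crawlers.
  return '#!/property/' + encodeURIComponent(propertyId);
}

function showProperty(propertyId) {
  window.location.hash = propertyFragment(propertyId); // update the address bar
  // Load the ad markup below the search panel without a full page load.
  fetch('/ajax/property/' + encodeURIComponent(propertyId))
    .then(function (res) { return res.text(); })
    .then(function (html) {
      document.getElementById('property-detail').innerHTML = html;
    });
}
```

The fragment helper is the important part for SEO: it means a property view has a URL that can be bookmarked, shared, and (with the escaped-fragment scheme discussed below in the thread) crawled.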
-
Hi,
Google and Bing can see how much time your users spend on the page, and since they can also see that a large amount of information is accessible through that page, I don't think you need to be especially worried about the "single page" factor.
That said, just because your main user interface lives within a single page, there is no reason that you cannot have other pages linked to it. In fact there are a number of other pages which should be included in your site. For example: Contact, About, Terms, Privacy Policy and (if relevant) Disclosure and/or Disclaimer. They do not have to be right up front or included in your main UI, but they should at least be available for users as text links at the bottom of the page, in a sidebar or somewhere. If you don’t include them you are reducing the appearance of transparency for the site. This works against trust and will make people less confident about doing business through your site. Given that you are in real estate, these things should be a major consideration.
Also, if you do not have an About page, you are reducing your opportunity to grow your customer base and add more clients.
Hope that helps,
Sha
-
If you have your listings available in an unordered list, that should be fine. If there aren't hundreds and hundreds of listings on your site, I don't think Google will have a problem with your implementation. If there are, you might consider building static pages for each category, and linking to the listings from there.
-
John, thanks for the quick reply.
I had already read the "make your AJAX pages indexable" documentation, but unfortunately it came too late in product development, and our programmers convinced us it would mean redoing the entire backend to make it work.
So we already have a workaround in place so crawlers can reach all these listings. Below the search panel (which has AJAX pagination and loads the ads on the same page with JavaScript), we have a standard HTML list of links.
That way the crawlers can reach the properties' individual pages. In other words, we comply with the rule "make each of your pages reachable by at least one internal link".
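As a rough illustration of that fallback (the paths and field names below are guesses at the general shape, not the site's actual markup), the crawlable links can be rendered as plain anchors alongside the AJAX panel:

```javascript
// Hypothetical sketch: plain <a href> links for each listing, so crawlers
// and non-JavaScript users can reach every property's individual page.
function renderStaticListingLinks(listings) {
  return listings
    .map(function (listing) {
      return '<a href="/property/' + encodeURIComponent(listing.id) + '">' +
        listing.title + '</a>';
    })
    .join('\n');
}
```

Because these are ordinary hrefs in the initial HTML, no JavaScript execution is required for a crawler to discover each property page.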
But my question was more about how Google "sees" my users' navigation patterns. I know the crawler is reaching those pages, but since most users use the search panel (which loads the properties via JavaScript/AJAX) rather than the static links below it, it might look like users only view one page on our site.
-
Is there some alternate navigation to reach all of these listings without using your AJAX search? Or are the listings included in a sitemap? Is there some way for Google to find them already?
I'd recommend reading http://code.google.com/web/ajaxcrawling/ to learn more about how to make your AJAXy pages indexable. You may also want to take a look at http://googlewebmastercentral.blogspot.com/2011/09/pagination-with-relnext-and-relprev.html if you have prev/next pagination. If you have a "view all" page and want to make that the canonical form, you'll want to look at http://googlewebmastercentral.blogspot.com/2011/09/view-all-in-search-results.html
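For reference, the AJAX crawling scheme in that first link works by mapping "pretty" `#!` URLs to "ugly" URLs in which the fragment is moved into an `_escaped_fragment_` query parameter; the crawler requests the ugly form, and the server responds with a static HTML snapshot of the AJAX state. A rough sketch of the URL mapping:

```javascript
// Sketch of the #! ("hashbang") to _escaped_fragment_ mapping used by the
// AJAX crawling scheme: crawlers request the "ugly" URL produced here, and
// the server is expected to answer it with a static HTML snapshot.
function toEscapedFragmentUrl(prettyUrl) {
  var i = prettyUrl.indexOf('#!');
  if (i === -1) return prettyUrl; // no hashbang: nothing to rewrite
  var base = prettyUrl.slice(0, i);
  var fragment = prettyUrl.slice(i + 2);
  var separator = base.indexOf('?') === -1 ? '?' : '&';
  return base + separator + '_escaped_fragment_=' + encodeURIComponent(fragment);
}
```

So `http://example.com/search#!/property/123` is fetched by the crawler as `http://example.com/search?_escaped_fragment_=%2Fproperty%2F123`, which a server can route to a prerendered page.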
Also, in Bing Webmaster Tools, you can go to the Crawl > Crawl Settings tab and enable the "Configure your site to have bingbot crawl escaped fragmented URLs containing #!." option if that's applicable to you.