How do I avoid being penalized in SEO for having a Single Page Interface (SPI)?
-
Guys, I run a real estate website where my clients pay me to advertise their properties.
The thing is, from the beginning I had this idea of a user interface that stays entirely on one page. On my site the user filters the properties in the left panel, and the listings (four properties at a time) refresh on the right side, where there is pagination.
So when the user clicks on a property ad, the ad is loaded by AJAX below the search panel on the same page. There's a "back up" button the user clicks to return to the search panel and open another property.
People love our implementation and the user experience, so I simply can't give up this UI "innovation" just for SEO, because it really is something that sets us apart from our competitors.
My question, then, is: how do I avoid being penalized in SEO for having this Single Page Interface, given that in Google's eyes users might not appear to be browsing deep into my site?
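For reference, the load-on-the-same-page pattern I'm describing looks roughly like this. This is just an illustrative sketch, not our actual code: the element id, the `/property/<id>` URL scheme, and the JSON shape are all hypothetical. One mitigation we've considered is pushing a real URL into the address bar when an ad loads, so each state is at least linkable:

```javascript
// Hypothetical URL scheme: each property lives at /property/<id>.
function propertyUrl(id) {
  return '/property/' + encodeURIComponent(String(id));
}

// In the browser, the ad is fetched and injected below the search panel.
// Calling history.pushState at the same time gives each loaded ad a real,
// shareable URL instead of leaving the user "stuck" on one page:
//
//   fetch(propertyUrl(42) + '.json')
//     .then(function (r) { return r.json(); })
//     .then(function (ad) {
//       document.getElementById('ad-panel').innerHTML = ad.html;
//       history.pushState({ id: 42 }, ad.title, propertyUrl(42));
//     });
```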
-
Hi,
Google and Bing can see how much time your users spend on the page, and since they can also see that a large amount of information is accessible through that page, I don't think you need to be as worried about the "single page" factor as you might expect.
That said, just because your main user interface lives within a single page, there is no reason you cannot have other pages linked from it. In fact, there are a number of other pages that should be included in your site, for example: Contact, About, Terms, Privacy Policy and (if relevant) Disclosure and/or Disclaimer. They do not have to be right up front or included in your main UI, but they should at least be available to users as text links at the bottom of the page, in a sidebar, or somewhere similar. If you don't include them, you reduce the site's appearance of transparency. That works against trust and will make people less confident about doing business through your site. Given that you are in real estate, these things should be a major consideration.
Also, without an About page you are missing an opportunity to build credibility, grow your customer base, and add more clients.
Hope that helps,
Sha
-
If your listings are available in a plain unordered list, that should be fine. If there aren't hundreds and hundreds of listings on your site, I don't think Google will have a problem with your implementation. If there are, you might consider building static pages for each category and linking to the listings from there.
-
John, thanks for the quick reply.
I had already read "make your AJAX pages indexable", but unfortunately it came too late in product development, and our programmers convinced us it would mean redoing the entire backend to make it work.
So we already have a workaround in place for crawlers to reach all these listings. Below the search panel (which has AJAX pagination and loads the ads on the same page with JavaScript) we have a standard HTML list of links.
That way the crawlers can reach each property's individual page. In other words, we comply with the rule "make each of your pages reachable by at least one internal link".
But my question was more about how Google "sees" the navigation pattern of my users. I know the crawler is reaching those pages, but since most users use the search panel (which loads the properties via JavaScript/AJAX) rather than the static links below it, it might appear that users only viewed one page on our site.
-
Is there some alternate navigation to reach all of these listings without using your AJAX search? Or are the listings included in a sitemap? Is there some way for Google to find them already?
I'd recommend reading http://code.google.com/web/ajaxcrawling/ to learn more about making your AJAX pages indexable. You may also want to take a look at http://googlewebmastercentral.blogspot.com/2011/09/pagination-with-relnext-and-relprev.html if you have prev/next pagination. If you have a view-all page and want to make that the canonical form, you'll want to look at http://googlewebmastercentral.blogspot.com/2011/09/view-all-in-search-results.html
Also, in Bing Webmaster Tools, you can go to Crawl > Crawl Settings and enable the option "Configure your site to have bingbot crawl escaped fragment URLs containing #!", if that's applicable to you.
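For context, the escaped-fragment scheme those docs describe maps a `#!` URL onto a plain query-string URL that the crawler can fetch as static HTML. A minimal sketch of that rewrite (illustrative only, not a library call; the percent-escaping of `%`, `#`, `&`, and `+` follows the scheme's mapping rules):

```javascript
// Map http://example.com/page#!key=value
//  to http://example.com/page?_escaped_fragment_=key=value
function escapedFragmentUrl(url) {
  var i = url.indexOf('#!');
  if (i === -1) return url; // no hash-bang: nothing to rewrite
  var base = url.slice(0, i);
  // Reserved characters in the fragment are percent-encoded before it
  // becomes a query parameter value.
  var fragment = url.slice(i + 2)
    .replace(/%/g, '%25')
    .replace(/#/g, '%23')
    .replace(/&/g, '%26')
    .replace(/\+/g, '%2B');
  var sep = base.indexOf('?') !== -1 ? '&' : '?';
  return base + sep + '_escaped_fragment_=' + fragment;
}
```

So a stateful URL like `yoursite.com/#!property=42` has a crawlable twin at `yoursite.com/?_escaped_fragment_=property=42`, which your server would answer with the rendered HTML for that property.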