Site Structure: How do I deal with a great user experience that's not the best for Google's spiders?
-
We have ~3,000 photos that have all been tagged. We have a wonderful AJAXy interface for users where they can toggle all of these tags to find the exact set of photos they're looking for very quickly.
We've also optimized a site structure for Google's benefit that gives each category a page. Each category page links to applicable album pages. Each album page links to individual photo pages. All pages have a good chunk of unique text.
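A structure like the one described above can be sketched in a few lines. This is purely illustrative (the category, album, and URL names are hypothetical, not from the site in question), but it shows the property that matters for crawling: every photo page is reachable from the index through plain links, with no JavaScript required.

```python
# Hypothetical sketch of the crawlable hierarchy described above:
# /photos -> category pages -> album pages -> photo pages.
# All names and URL patterns are illustrative.

def category_url(category: str) -> str:
    return f"/photos/{category}/"

def album_url(category: str, album: str) -> str:
    return f"/photos/{category}/{album}/"

def photo_url(category: str, album: str, photo_id: int) -> str:
    return f"/photos/{category}/{album}/{photo_id}/"

# A tiny example structure: each page links one level down,
# so a crawler can reach every photo through plain <a href> links.
site = {
    "/photos/": [category_url("landscapes")],
    category_url("landscapes"): [album_url("landscapes", "alps-2012")],
    album_url("landscapes", "alps-2012"): [
        photo_url("landscapes", "alps-2012", 1),
        photo_url("landscapes", "alps-2012", 2),
    ],
}

def reachable(start: str, links: dict) -> set:
    """Follow links breadth-first, the way a crawler would."""
    seen, queue = {start}, [start]
    while queue:
        page = queue.pop(0)
        for url in links.get(page, []):
            if url not in seen:
                seen.add(url)
                queue.append(url)
    return seen

print(len(reachable("/photos/", site)))  # 5 -- every page is reachable
```

If the AJAX interface sits on top of this link structure rather than replacing it, both users and crawlers are served.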
Now, for Google, the domain.com/photos index page should be a directory of sorts that links to each category page. However, users would probably prefer the AJAXy interface.
What is the best way to execute this?
-
I'm not sure that I totally understand your question: are you building a site and wondering if you should make it beautiful with AJAX or readable for search engines with a clear category structure? Or do you already have a site with AJAX and now you've taken the time to come up with a category page structure, but you're hesitating to implement it?
If it's the first, I don't think your choices are limited to 1) being usable/beautiful, or 2) being search-engine friendly. There are a lot of ways to make an HTML site beautiful. You can use new enhancements in HTML5, or build a standard HTML site and then use JavaScript to make elements more interactive.
If it's the second, I'd go with Thomas's suggestion and test your AJAX site to see how readable it is for Google. AJAX isn't readable by Google, but the underlying HTML is, so there may be enough HTML links that Google can still get around your site.
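One rough way to see what's in "the underlying HTML" is to parse a page's raw source and list the `<a href>` links that exist before any JavaScript runs. The snippet below is a minimal sketch (the sample HTML is made up for illustration); links your AJAX code adds client-side won't appear in this list, which is roughly what a crawler that doesn't execute JavaScript would see.

```python
# List the <a href> links present in raw HTML, before any JavaScript
# runs -- an approximation of what a non-JS crawler can follow.
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

# Illustrative raw HTML: one real link, plus a tag-toggle button
# that the AJAX interface would wire up later with JavaScript.
raw_html = """
<nav>
  <a href="/photos/landscapes/">Landscapes</a>
  <button data-tag="sunset">Sunset</button>
</nav>
"""

parser = LinkExtractor()
parser.feed(raw_html)
print(parser.links)  # ['/photos/landscapes/'] -- the button is invisible here
```

In practice you would fetch each page's source (or use a spider tool) rather than a hard-coded string, but the principle is the same: if a page's links only exist after JavaScript runs, they won't show up.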
Once you've tested how readable your site is, then you have to decide if this interface is worth pages being missed or ranked lower by Google. My guess is that you can probably keep your site the way it is primarily, but you'll have to make some tweaks. It's hard to give specifics without knowing specifics, though.
Good luck!
-
Prior to building or finishing the site, it would be a good idea to look at how Google is crawling the AJAX version of your planned directory and category pages. I would use the spider tool as well as the image (alt text) tool:
http://www.feedthebot.com/tools/spider/
http://www.feedthebot.com/tools/alt/
All tools: http://www.feedthebot.com/tools/
Once you're certain that the AJAX isn't causing any issues for Googlebot, I believe the best way to move forward is summed up very well in the two links below:
https://www.distilled.net/blog/seo/case-study-determining-site-architecture-from-keyword-research/
https://www.distilled.net/blog/seo/why-you-should-map-out-your-sites-information-architecture/
I have not seen your website, so I can't tell you whether users would prefer the AJAX interface. However, Googlebot does not handle AJAX nearly as well as plain HTML text and links.
Also check out "Problem 5: AJAX and URLs" in this post:
https://www.distilled.net/blog/seo/fixing-seo-problems-with-html5/
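One common fix for the AJAX-and-URLs problem is to give every filter state its own real, stable URL (via the History API on the client, with the server able to render the same state as plain HTML). Here's a minimal sketch of the idea, assuming a made-up URL scheme; the point is that each combination of toggled tags maps to exactly one crawlable address:

```python
# Map an AJAX filter state (a set of toggled tags) to one canonical,
# crawlable URL. The "/photos/tag+tag/" scheme is purely illustrative.

def filter_state_url(tags) -> str:
    """Return one stable URL per filter state.

    Sorting the tags makes the URL identical no matter what order
    the user toggled them in, so each state has exactly one address
    (avoiding duplicate-content URLs for the same photo set).
    """
    if not tags:
        return "/photos/"
    return "/photos/" + "+".join(sorted(tags)) + "/"

print(filter_state_url({"sunset", "beach"}))  # /photos/beach+sunset/
print(filter_state_url({"beach", "sunset"}))  # same URL, same state
```

With something like this in place, the AJAX interface can update the address bar as users toggle tags, while Googlebot gets a normal page at the same URL.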
Sincerely,