Site Structure: How do I deal with a great user experience that's not the best for Google's spiders?
-
We have ~3,000 photos that have all been tagged. We have a wonderful AJAXy interface for users where they can toggle all of these tags to find the exact set of photos they're looking for very quickly.
We've also optimized a site structure for Google's benefit that gives each category a page. Each category page links to applicable album pages. Each album page links to individual photo pages. All pages have a good chunk of unique text.
Now, for Google, the domain.com/photos index page should be a directory of sorts that links to each category page. Users, however, would probably prefer the AJAXy interface.
What is the best way to execute this?
-
I'm not sure that I totally understand your question: are you building a site and wondering if you should make it beautiful with AJAX or readable for search engines with a clear category structure? Or do you already have a site with AJAX and now you've taken the time to come up with a category page structure, but you're hesitating to implement it?
If it's the first, I don't think your only choices are 1) be usable/beautiful, or 2) be search-engine friendly. There are a lot of ways to make an HTML site beautiful. You can use new enhancements in HTML5, or build a standard HTML site and then use JavaScript to make elements more interactive.
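To illustrate that last option before moving on to the second scenario: here is a minimal sketch of progressive enhancement, assuming the tag list is rendered as plain HTML links that a script upgrades into the AJAX filter. The element IDs, URLs, and markup below are hypothetical, not taken from your site.

```typescript
// A sketch of progressive enhancement: the tag/category list is rendered as
// ordinary HTML links (which Googlebot can crawl), and this script upgrades
// them into the AJAXy filter for users with JavaScript enabled.
// Assumed markup (hypothetical IDs and URLs):
//   <ul id="tag-list">
//     <li><a href="/photos/landscapes/">Landscapes</a></li>
//     <li><a href="/photos/portraits/">Portraits</a></li>
//   </ul>
//   <div id="photo-grid">...</div>

document.querySelectorAll<HTMLAnchorElement>('#tag-list a').forEach((link) => {
  link.addEventListener('click', async (event) => {
    event.preventDefault();                    // JS users get the in-place filter
    const response = await fetch(link.href);   // spiders still follow the plain href
    const html = await response.text();
    const grid = document.querySelector('#photo-grid');
    // For a real site you would extract just the photo-grid fragment from the
    // fetched page; swapping the whole response in is enough for the sketch.
    if (grid) grid.innerHTML = html;
  });
});
```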
If it's the second, I'd go with Thomas's suggestion and test your AJAX site to see how readable it is for Google. AJAX isn't readable by Google, but the underlying HTML is, so there may be enough HTML links that Google can still get around your site.
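If you want a rough do-it-yourself version of that test before reaching for a tool, you can fetch a page the way a script-less spider would and count the plain HTML links left in the markup. This is only a sketch; the URL is a placeholder, not your real site.

```typescript
// Rough check of what a non-JavaScript crawler sees: fetch the raw HTML
// (no scripts executed) and count the plain <a href="..."> links in it.
async function countCrawlableLinks(url: string): Promise<number> {
  const response = await fetch(url);      // Node 18+ ships a global fetch
  const html = await response.text();     // raw markup only, nothing rendered
  const links = html.match(/<a\s[^>]*href=/gi) ?? [];
  return links.length;
}

countCrawlableLinks('https://example.com/photos/')
  .then((count) => console.log(`Plain HTML links visible without JavaScript: ${count}`));
```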
Once you've tested how readable your site is, you have to decide whether this interface is worth pages being missed or ranked lower by Google. My guess is that you can probably keep your site mostly the way it is, but you'll have to make some tweaks. It's hard to give specifics without knowing more about the site, though.
Good luck!
-
The best thing to do right now, before building or finishing the site, is to look at how Google is crawling the AJAX version of your planned directory/category structure. I would use the spider tool as well as the image (alt text) tool; a rough do-it-yourself stand-in for the alt check is sketched after the links below.
http://www.feedthebot.com/tools/spider/
http://www.feedthebot.com/tools/alt/
All tools
http://www.feedthebot.com/tools/
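As that rough stand-in for the image/alt check (just a sketch, with a placeholder URL and an approximate regex), you could scan the raw HTML yourself for images with missing or empty alt text:

```typescript
// List <img> tags in the raw HTML that have a missing or empty alt attribute.
async function findImagesMissingAlt(url: string): Promise<string[]> {
  const html = await (await fetch(url)).text();       // raw markup, no rendering
  const images = html.match(/<img\s[^>]*>/gi) ?? [];
  return images.filter((tag) => !/alt=["'][^"']+["']/i.test(tag));
}

findImagesMissingAlt('https://example.com/photos/')
  .then((missing) => console.log(`Images without alt text: ${missing.length}`));
```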
Once you're certain that the AJAX isn't going to cause any issues for Googlebot, I believe the best way to move forward is summed up very well in the two links below.
https://www.distilled.net/blog/seo/case-study-determining-site-architecture-from-keyword-research/
https://www.distilled.net/blog/seo/why-you-should-map-out-your-sites-information-architecture/
I have not seen your website, so I can't tell you whether users would prefer the AJAX interface. However, Googlebot does not do well with AJAX compared with plain HTML text.
"Problem 5: AJAX and URLs"
check out problem 5 in this link as well
https://www.distilled.net/blog/seo/fixing-seo-problems-with-html5/
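The usual shape of the fix for that problem is to give each AJAX state a real, crawlable URL through the History API rather than a hash fragment. A minimal sketch, assuming a hypothetical applyFilter() helper and a /photos/<slug>/ URL pattern (neither is taken from your actual setup):

```typescript
// Give every AJAX filter state a real URL via the History API, so the address
// bar (and any link to it) matches an indexable category page.

// Hypothetical helper that fetches and renders the filtered photo set.
declare function applyFilter(slug: string): void;

function showCategory(slug: string): void {
  applyFilter(slug);
  history.pushState({ slug }, '', `/photos/${slug}/`); // real, indexable URL
}

// Restore the right photo set on back/forward navigation.
window.addEventListener('popstate', (event: PopStateEvent) => {
  const slug = (event.state as { slug?: string } | null)?.slug ?? 'all';
  applyFilter(slug);
});
```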
Sincerely,
-
Related Questions
-
After a hack and remediation, thousands of URLs still appear as 'Valid' in Google Search Console. How to remedy?
I'm working on a site that was hacked in March 2019; in the process, nearly 900,000 spam links were generated and indexed. After remediation of the hack in April 2019, the spammy URLs began dropping out of the index until last week, when Search Console showed around 8,000 as "Indexed, not submitted in sitemap" but listed as "Valid" in the coverage report. Many of them are still hack-related URLs listed as indexed in March 2019, even though clicking on them leads to a 404. As of this Saturday, the number jumped up to 18,000, but I have no way of finding out from the Search Console reports why the jump happened or which new URLs were added; the only sort mechanism is last crawled, and they don't show up there. How long can I expect it to take for these remaining URLs to also be removed from the index? Is there any way to expedite the process? I've submitted a 'new' sitemap several times, which (so far) has not helped. Is there any way to see inside the new GSC view why/how the number of valid URLs in the index doubled over one weekend?
Intermediate & Advanced SEO | | rickyporco0 -
Is there a difference between 'Mø' and 'Mo'?
The brand name is Mø, but users are searching online for Mo. Should I change all instances of Mø to Mo on my client's website?
Intermediate & Advanced SEO | | ben_mozbot010 -
Need a layman's definition/analogy of the difference between schema and structured data
I'm currently writing a blog post about schema. However, I want to set the record straight that schema is not exactly the same thing as structured data, although the two terms are often used interchangeably. I understand that schema.org is a vocabulary of global identifiers for properties and things. Structured data is what Google officially describes as "a standard way to annotate your content so machines can understand it..." Does anybody know of a good analogy to compare the two? Thanks!
Intermediate & Advanced SEO | | RosemaryB0 -
Sitemaps during a migration - which is the best way of dealing with them?
Many SEOs I know simply upload the new sitemap once the new site is launched; some keep the old site's URLs in the new sitemap (for a while) to facilitate the migration; others upload the sitemaps for both the old and the new website together, to support the migration. Which is the best way to proceed? Thanks, Luke
Intermediate & Advanced SEO | | McTaggart0 -
URL Injection Hack - What to do with spammy URLs that keep appearing in Google's index?
A website was hacked (URL injection), but the malicious code has been cleaned up and removed from all pages. However, whenever we run a site:domain.com search in Google, we keep finding more spammy URLs from the hack. They all lead to a 404 error page since the hack was cleaned up in the code. We have been using the Google WMT Remove URLs tool to have these spammy URLs removed from Google's index, but new URLs keep appearing every day. We looked at the cache dates on these URLs and they vary, but none are recent and most are from a month ago when the initial hack occurred. My question is: should we continue to check the index every day and keep submitting these URLs to be removed manually? Or, since they all lead to a 404 page, will Google eventually remove these spammy URLs from the index automatically? Thanks in advance, Moz community, for your feedback.
Intermediate & Advanced SEO | | peteboyd0 -
Is my site penalized by Google?
Let's say my website is aaaaa.com and the company name is aaaaa Systems. When I search Google for aaaaa, my site does not come up at all. When I search for "aaaaa Systems", it comes up. But in WMT I see quite a few clicks from aaaaa as a keyword. Most of the traffic is from brand keywords only. I have never received any manual penalty in WMT. Is the site penalized, or are these regular algorithm issues?
Intermediate & Advanced SEO | | ajiabs0 -
Brand sections performing badly in SERPs, but all SEO tools think we are great
I have had this problem for some time now and I've asked many, many experts. Search for Falke in Google.co.uk and this is what you get:
http://www.sockshop.co.uk/by_brand/falke/ (3rd, our competitor)
http://www.mytights.com/gb/brand/falke.html (4th, our competitor)
http://www.uktights.com/section/73/falke (104th, and this is us????? We are 9th for "Falke tights" with the same section, not our Falke tights section.)
All sites seem to link to their brand sections in the same way, with links in the header and breadcrumbs; Open Site Explorer only shows 2 or 3 internal links for our competitors, 1,600+ for us. Many of our brand sections rank badly: the Pretty Polly and Charnos brands rank on page 2 or 3 with a brand subsection that has no links to it, while the main section doesn't rank. A great example is Kunert, a German brand with no UK competition; our section has been live for 8 years, and the best we can do is 71st in Google UK (1st on Bing, as we should be). I'm working on adding some quality links, but our competitors have a few low-quality or no external links and only slightly better domain authority, yet rank 100+ positions better than us on some brands. This suggests to me that there is something on-page / in internal linking that I'm doing wrong, but all the tools say "well done, grade A, take a holiday". Keyword density is similar to our competitors' and I've tried reducing the number of products on the page. All pages ranked really well pre-Penguin, and Bing still likes them. This is driving me nuts and costing us money. Cheers, Jonathan
www.uktights.com
Intermediate & Advanced SEO | jpbarber
-
Other domains hosted on same server showing up in SERP for 1st site's keywords
For the website in question, which is the first domain alphabetically on the shared hosting space, strange search results are appearing on the SERP for keywords associated with the site. Here is an example: a search for "unique company name" shows www.uniquecompanyname.com as the top result. But on pages 2 and 3, we are getting results for the same content but for domains hosted on the same server. Here are some examples with the domain names replaced:
UNIQUE DOMAIN NAME PAGE TITLE
ftp.DOMAIN2.com/?action=news&id=63
META DESCRIPTION TEXT
UNIQUE DOMAIN NAME PAGE TITLE 2
www.DOMAIN3.com/?action=news&id=120
META DESCRIPTION TEXT2
UNIQUE DOMAIN NAME PAGE TITLE 2
www.DOMAIN4.com/?action=news&id=120
META DESCRIPTION TEXT2
UNIQUE DOMAIN NAME PAGE TITLE 3
mail.DOMAIN5.com/?action=category&id=17
META DESCRIPTION TEXT3
ns5.DOMAIN6.com/?action=article&id=27
There are more, but those are just some examples. These other domain names being listed are other customers' domains on the same shared VPS server. When clicking a result, the browser URL still shows the other customer's domain name, but the content is usually the 404 page. The page title and meta description on that page are not displayed the same as on the SERP. As far as we can tell, this is the only domain this is occurring for. So far, no crawl errors have been detected in Webmaster Tools, and the Moz crawl has not completed yet.
Intermediate & Advanced SEO | Motava