How do you make AJAX-loaded content in a specific section of a webpage crawlable?
-
The content is located in a specific section of the webpage and is loaded via AJAX.
-
Thanks Paddy! We'll definitely try these solutions.
-
Hi there,
There are plenty of really good resources online that cover this area, so I'd like to point you towards them rather than copy and paste their guidelines here!
Google has a good guide here with lots of visuals on how they crawl AJAX -
https://developers.google.com/webmasters/ajax-crawling/docs/getting-started
They also have a short video here covering some of the basics of Google crawling AJAX and JavaScript:
https://www.youtube.com/watch?v=_6mtiwQ3nvw
You should also become familiar with pushState, which is covered in lots of detail, with an example implementation, in this blog post:
http://moz.com/blog/create-crawlable-link-friendly-ajax-websites-using-pushstate
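As a rough sketch of the pushState approach that post describes (the `/fragments` endpoint and the `content` element id here are hypothetical examples, not anything from the post), each AJAX view gets a real, crawlable URL instead of a fragment:

```javascript
// Map a clean page path to the hypothetical AJAX endpoint that returns
// its HTML fragment.
function fragmentUrlFor(path) {
  return '/fragments' + path;
}

// Load a section via AJAX and give it a real, indexable URL.
function loadSection(path) {
  fetch(fragmentUrlFor(path))
    .then(function (res) { return res.text(); })
    .then(function (html) {
      document.getElementById('content').innerHTML = html;
      // Update the address bar to a clean URL instead of a #! fragment.
      history.pushState({ path: path }, '', path);
    });
}

// Restore content on Back/Forward (browser-only, so guarded).
if (typeof window !== 'undefined') {
  window.addEventListener('popstate', function (event) {
    if (event.state && event.state.path) loadSection(event.state.path);
  });
}
```

Because every view has its own URL, Googlebot can request each one directly rather than trying to execute the AJAX itself.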
The guys at Builtvisible have also put together a few good blog posts on this topic which are worth a read:
http://builtvisible.com/javascript-framework-seo/
http://builtvisible.com/on-infinite-scroll-pushstate/
Essentially, you need to make sure that Googlebot is able to render your content as you intended and that it looks the same to them as it does to users. You can often test how well they can render your content by checking the cache of your page or by using the Fetch as Google feature in Google Webmaster Tools.
I hope that helps!
Paddy
-
Hi,
Making AJAX-loaded content crawlable by Google involves serving Google a static HTML snapshot of the content that is loaded via AJAX. You should make sure that the HTML snapshot is an exact copy of what visitors are served through AJAX.
Here is some more information:
https://support.google.com/webmasters/answer/174992?hl=en
Best regards,
Devanur Rafi
Related Questions
-
Teaser Content Help!!
I'm in the process of a redesign and upgrade to Drupal 8 and have used Drupal's taxonomy feature to add a fairly large database of Points of Interest, Services, etc. Initially this was just for a Map/Filter for site users. The developer also wants to use teasers from these content types (such as a scenic vista description) as a way to display the content on relevant pages (such as the scenic vistas page, as well as other relevant pages). Along with the content it shows GPS coordinates and icons related to the description. In short, it looks cool, can be used in multiple relevant locations, and creates a great UX.

However, many of these teasers would basically be pieces of content from pages with a lot of SEO value, like descriptive paragraphs about scenic viewpoints from the scenic viewpoints page. Below is an example of how the descriptions of the scenic viewpoints would be displayed on the scenic viewpoints pages, as well as other potentially relevant pages. HOW WILL THIS AFFECT THE SEO VALUE OF THE CONTENT?? Thanks in advance for any help; I can't find an answer anywhere.

About 250 words' worth of content about a scenic vista. There are about 8 scenic vista descriptions like this on the scenic vistas page, so a good chunk of valuable content. There are numerous long-form content pages like this that have descriptions and information about sites and points of interest that don't warrant having their own page. For more specific content with a dedicated page, I can just use the intro paragraph as a teaser and link to that specific page of content. Not sure what to do here.
Intermediate & Advanced SEO | talltrees0 -
Is writing good content the best SEO?
Hi,

After reading Mr. Shepard's amazing article on the 7 concepts of advanced on-page SEO (https://moz.com/blog/7-advanced-seo-concepts), I decided to share my own experience in hopes of helping others.

I started doing legal SEO back in 2013. At the time I really didn't know much about SEO. My first client (my brother) had recently left the D.A.'s office to become a criminal defense attorney. I told him to write content for the following areas: domestic violence, sex crimes, and homicide. He finished his first content piece on domestic violence and I was not impressed. It seemed too unique, individualized, and lacked the "generic" feel that many of the currently ranking pages had. Please note that I don't mean "generic" in a negative way. I just mean that his content regarding domestic violence felt too personalized. Granted, his "personalized" approach came from a Deputy D.A. with over 13 years handling domestic violence, sex crimes, and murder cases. I was inclined to re-write his content, but lacking any experience in criminal law I really had no choice but to use it.

IMPORTANT: Please note that I barely knew any SEO at the time (I hadn't even yet discovered MOZ), and my brother knew, and continues to know, absolutely nothing about SEO. He simply wrote the content from the perspective of an attorney who had spent the better part of 13 years handling these types of cases.

The result? Google: "Los Angeles domestic violence lawyer/attorney", "Los Angeles sex crimes lawyer/attorney", and "Los Angeles homicide attorney." They have held those spots consistently since being published. I know that MANY other factors contribute to the success of content, but at the time I published them we had few links and very little "technical SEO."

Unfortunately, I started learning "SEO" and applied standard SEO techniques to future content. The result? Never as good as the articles that were written with no SEO in mind.

My purpose in writing this is to help anyone about to tackle a new project or revamp an existing site. Before getting too caught up in the keywords, H tags, and all the other stuff I seem to worry too much about, simply ask yourself: "is this great content?" Thanks again to the MOZ team for the great advice they have shared over the years. Honestly, I think I sometimes become overly reliant on SEO b/c it seems easier than taking the time to write a great piece of content.

P.S. Any "SEO" stuff you see on the above-mentioned pages was done by me after the pages ranked well.

P.P.S. I don't mean to imply that the above-mentioned pages are perfect, because they are not. My point is that content can rank well even without any emphasis on SEO, as long as the person writing it knows about the subject and takes the time to write something that readers find useful.
Intermediate & Advanced SEO | mrodriguez14403 -
Should subdomains be used to organise content and directories?
I'm working on a site that has directories for service providers and content about those services. My idea is to organise the services into groups, e.g. Web, Graphic, and Software Development, since they are different topics. Each subdomain (hub) has its own sales pages, directory of service providers, and blog content. E.g. the web hub has:

web.servicecrowd.com.au (hub home)
web.servicecrowd.com.au/blog (hub blog)
http://web.servicecrowd.com.au/dir/p (hub directory)

Is this overkill, or will it help in the long run when there are hundreds of services like dog grooming and DJing? It seems better to have separate subdomains and unique blogs for groups of services and content topics.
Intermediate & Advanced SEO | ServiceCrowd_AU0 -
Duplicate content reported on WMT for 301 redirected content
We had to 301 redirect a large number of URLs. Now Google WMT is telling me that we have tons of duplicate page titles. When I looked into the specific URLs, I realized that Google is listing an old URL and the 301-redirected new URL as the sources of the duplicate content. I confirmed the 301 redirect by using a server header tool to check the correct implementation of the 301 redirect from the old to the new URL. Question: why is Google Webmaster Tools reporting duplicate content for these pages?
Intermediate & Advanced SEO | SEOAccount320 -
News sites & Duplicate content
Hi SEOMoz,

I would like to know, in your opinion and according to 'industry' best practice, how you get around duplicate content on a news site if all news sites buy their "news" from a central place in the world? Let me give you some more insight into what I am talking about.

My client has a website that focuses purely on news. Local news in one of the African countries, to be specific. What we noticed over the past few months is that the site is not ranking to its full potential. We investigated: checked our keyword research, our site structure, interlinking, site speed, code-to-HTML ratio, you name it, we checked it. What we did pick up when looking at duplicate content is that the site is flagged by Google as duplicated, BUT so are most news sites, because they all get their content from the same place.

News gets sold by big companies in the US (no, I'm not from the US, so I can't say specifically where it is from), and they usually have disclaimers with these content pieces saying that you can't change the headline and story significantly. So we do have quite a few journalists who rewrite the news stories; they try to keep them as close to the original as possible, but they still change them to fit our targeted audience, which is where my second point comes in.

Even though the content has been duplicated, our site is more relevant to what our users are searching for than the bigger news-related websites in the world, because we do hyper-local everything: news, jobs, property, etc. All we need to do is get past this duplicate content issue. In general we rewrite the content completely to be unique if a site has duplication problems, but on a media site I'm a little bit lost, because I haven't had something like this before. Would like to hear some thoughts on this.

Thanks,
Chris Captivate

Intermediate & Advanced SEO | 360eight-SEO
-
Will this pose duplicate content?
I have a domain, let's say abcshoesonlinestore.com, and the inside pages of abcshoesonlinestore.com are ranking very well, such as the affiliate page, the knowledgebase page, and others. However, I would like to move my home page and product page to a shorter URL, abcshoes.com, and keep those inside pages like www.abcshoesonlinestore.com/affiliate or www.abcshoesonlinestore.com/knowledgebase as they are. Will this pose duplicate content?

This is my plan: the home page and product page will be www.abcshoes.com, and when people click www.abcshoes.com/affiliate it will 301 redirect to abcshoesonlinestore.com/affiliate. However, if someone types abcshoesonlinestore.com or abcshoesonlinestore.com/product, it will redirect to abcshoes.com or its product page. Here I want to use a 302 instead of a 301, assuming that if the home page or product page has a manual penalty or anything bad, we want to leave it behind and start fresh (I say "assuming" because I read some posts saying a 301 will carry anything bad over to the new site too).

The reason I do not want to 301 everything from abcshoesonlinestore.com to abcshoes.com is that those inside pages are ranking in the top 3 in Google, and I worry we will lose those rankings, since they bring us traffic.

Is this a good idea or a bad idea, is there a better idea, or should I just try it and see the outcome? 🙂 My only concern is that abcshoesonlinestore.com and abcshoes.com will pose duplicate content if I do not use a 301. Or can I use Google Webmaster Tools to remove the home page and product page for abcshoesonlinestore.com? Can we tell Google that?

P.S. The home page and product page will have new, revised content and minor design changes, but the inside pages will keep the same design. Please give me some advice.
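For illustration only (not an endorsement of the 302 plan), the difference between the two redirect types in the plan above can be sketched with Node's `http` response API; the domains are the hypothetical ones from the question:

```javascript
// Issue a redirect: 301 signals a permanent move (ranking signals are
// consolidated to the target), 302 signals a temporary one (the original
// URL is expected to come back).
function redirect(res, location, permanent) {
  res.writeHead(permanent ? 301 : 302, { Location: location });
  res.end();
}

// Handler for the old domain (abcshoesonlinestore.com) per the plan above:
// home and product pages are 302'd to the new short domain, inside pages
// such as /affiliate are left alone so they keep their rankings.
function handleOldDomain(req, res) {
  if (req.url === '/' || req.url === '/product') {
    redirect(res, 'https://abcshoes.com' + (req.url === '/' ? '' : req.url), false);
  }
  // Other paths fall through and are served as-is.
}
```

Note that because a 302 does not consolidate the two URLs, this is exactly the configuration under which duplicate-content concerns remain, which is the crux of the question.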
Intermediate & Advanced SEO | owen20110 -
Image and Content Management
My boss has decided that on the new website we are building, he wants all content and images managed by not allowing the copying of content and/or the saving of images. Some of the information and images are proprietary, yet most are available for public viewing, but nevertheless he wants copying and/or saving prohibited. We would still want to keep the content indexable and use appropriate alt tags, etc. I wanted to find out if there is any SEO reason, backed by facts, why this would not be a good idea. Would implementing code to prohibit (or at least make it difficult) to save images and copy content penalize us?
Intermediate & Advanced SEO | KJ-Rodgers0 -
Subdomains - duplicate content - robots.txt
Our corporate site provides MLS data to users, with the end goal of generating leads. Each registered lead is assigned to an agent, essentially in a round-robin fashion. However, we also give each agent a domain of their choosing that points to our corporate website. The domain can be whatever they want, but upon loading it is immediately redirected to a subdomain. For example, www.agentsmith.com would be redirected to agentsmith.corporatedomain.com. Finally, any leads generated from agentsmith.easystreetrealty-indy.com are always assigned to Agent Smith instead of the agent pool (by parsing the current host name).

In order to avoid being penalized for duplicate content, any page that is viewed on one of the agent subdomains always has a canonical link pointing to the corporate host name (www.corporatedomain.com). The only content difference between our corporate site and an agent subdomain is the phone number and contact email address where applicable.

Two questions: Can/should we use robots.txt or robots meta tags to tell crawlers to ignore these subdomains, but obviously not the corporate domain? If question 1 is yes, would it be better for SEO to do that, or leave it how it is?
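The host-name handling described above can be sketched roughly as follows (the hostnames are the question's hypothetical ones, and the https protocol is an assumption):

```javascript
// All agent subdomains canonicalize to the corporate hostname, so every
// variant of a page points search engines at one URL.
var CORPORATE_HOST = 'www.corporatedomain.com';

function canonicalUrlFor(path) {
  return 'https://' + CORPORATE_HOST + path;
}

// Derive the agent from the subdomain so the lead is assigned to that
// agent instead of the round-robin pool.
function agentFromHost(host) {
  var parts = host.split('.');
  // e.g. "agentsmith.corporatedomain.com" -> "agentsmith";
  // the bare www/corporate host maps to no agent.
  if (parts.length <= 2 || parts[0] === 'www') return null;
  return parts[0];
}
```

Since every subdomain page already declares the corporate URL as canonical, blocking the subdomains in robots.txt would actually prevent crawlers from seeing those canonical tags at all, which is the trade-off the question is asking about.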
Intermediate & Advanced SEO | EasyStreet0