Blocking AJAX Content from being crawled
-
Our website has some pages with content shared from a third-party provider, which we load via AJAX. We don't want Google to crawl the third party's content, but we do want them to crawl and index the rest of the web page. However, in light of Google's recent announcement about more effectively crawling and indexing AJAX content, I have some concern that we are at risk of that content being indexed.
I have thought about the X-Robots-Tag, but I have concerns about implementing it on these pages because of the potential risk of Google not indexing the whole page. These pages get significant traffic for the website, and I can't risk that.
Thanks,
Phil
-
Hey Phil. I think I've understood your situation, but just to be clear, I'm presuming you have URLs exposing third-party JSON/XML content that you don't want indexed by Google. Probably the most foolproof method for this case is the "X-Robots-Tag" HTTP header (http://code.google.com/web/controlcrawlindex/docs/robots_meta_tag.html). I would recommend going with "X-Robots-Tag: none", which should do the trick (I really don't think "noarchive" or the other options are needed if the content isn't being indexed at all). You'll need to modify your server-side scripts to send the header; I'm assuming that isn't much pain for you (or the third party?) to do. Hope this helps! ~bryce
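As a minimal sketch of what that could look like if it's easier to set the header at the web server level than in the scripts themselves, assuming Apache with mod_headers enabled and that the third-party responses are served as .json/.xml files (adjust the extensions and scope to your setup):

# Hypothetical snippet, assuming Apache + mod_headers: send "X-Robots-Tag: none"
# only on the responses that carry the third-party JSON/XML content.
<IfModule mod_headers.c>
  <FilesMatch "\.(json|xml)$">
    Header set X-Robots-Tag "none"
  </FilesMatch>
</IfModule>

If the content comes from a script rather than static files, the equivalent is to emit the same header from the script before any output. Either way, you can confirm the header is being sent by checking the response with curl -I or your browser's developer tools.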
Related Questions
-
Migrate Old Archive Content?
Hi, Our team has recently acquired several newsletter titles from a competitor. We are currently deciding how to handle the archive content on their website, which now belongs to us. We are thinking of leaving the content on their site (so as not to suddenly remove a chunk of their website and harm them) but also replicating it on ours with a canonical link to say our website is the original source. The articles on their site go back as far as 2010. Do you think it would help or hinder our site to have a lot of old archive content added to it? I'm thinking of content freshness issues. Even though the content is old, some of it will still be interesting or relevant. Or do you think the authority and extra traffic this content could bring in make it worth migrating? Any help gratefully received on the old content issue or the idea of using canonical links in this way. Many Thanks
Technical SEO | | frantan0 -
Can Page Content & Description Have Same Content?
I'm studying my crawl report and there are several warnings regarding missing meta descriptions. My website is built in WordPress and part of the site is a blog. Several of these missing-description warnings relate to blog posts, and I was wondering if I am able to copy the first few lines of content of each post to put in the meta description, or would that be considered duplicate content? Also, there are a few warnings that relate to blog index pages, e.g. http://www.iainmoran.com/2013/02/ - I don't know if I can even add a description to these, as I think they are dynamically created. While on the subject of duplicate content, if I had a sidebar with the same information on several of the pages, with the content coming from a WP widget, would this still be considered duplicate content, and would Google penalise me for it? Would really appreciate some thoughts on this, please. Thanks, Iain.
Technical SEO | | iainmoran0 -
Index.php duplicate content
Hi, new here. I'm looking for some help with my .htaccess file. index.php is showing duplicate content errors between: mysite.com/index.php, mysite.com/ and mysite.com. I've managed to use the following code to remove the www part of the URL:
<IfModule mod_rewrite.c>
RewriteCond %{HTTPS} !=on
RewriteCond %{HTTP_HOST} ^www\.(.+)$ [NC]
RewriteRule ^ http://%1%{REQUEST_URI} [R=301,L]
</IfModule>
But how can I redirect mysite.com/index.php and mysite.com/ to mysite.com? Please help.
Technical SEO | | klsdnflksdnvl0 -
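A minimal sketch of the missing piece for the question above, assuming Apache mod_rewrite with the rule placed above Magento's own rewrite rules (untested against this particular setup):

# Hypothetical addition: 301 direct requests for /index.php back to the root.
# %{THE_REQUEST} holds the original request line, so Magento's internal
# rewrites to index.php will not trigger a redirect loop.
RewriteCond %{THE_REQUEST} ^[A-Z]+\s/+index\.php [NC]
RewriteRule ^index\.php$ / [R=301,L]

For the bare domain, mysite.com and mysite.com/ resolve to the same request at the HTTP level, so the index.php redirect is usually the piece that matters here.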
Duplicate Content
Hi, we need some help resolving this duplicate content issue. We have redirected both domains to this Magento website. I guess Google now considers this duplicate content. Our client wants both domain names to go to the same Magento store. What is the safe way of letting Google know these are the same company? Or is it not ideal to do this? Thanks
Technical SEO | | solution.advisor0 -
Block Baidu crawler?
Hello! One of our websites receives a large amount of traffic from the Baidu crawler. We do not have any Chinese content, nor do we do any business with China, since our market is the UK. Is it a good idea to block the Baidu crawler in robots.txt, or could it have any adverse effects on the SEO of our site? What do you suggest?
Technical SEO | | AJPro0 -
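For the robots.txt route in the question above, the directive would be a User-agent: Baiduspider group with Disallow: /; rules aimed at Baiduspider are ignored by Googlebot and Bingbot, so blocking Baidu this way should not affect how the other engines crawl the site. If the aim is to cut the crawl load outright rather than rely on the directive being honoured, a server-level alternative is possible; a minimal sketch, assuming Apache with mod_rewrite:

# Hypothetical alternative to robots.txt: return 403 Forbidden to Baidu's
# crawler, matched by its "Baiduspider" User-Agent string.
<IfModule mod_rewrite.c>
  RewriteEngine On
  RewriteCond %{HTTP_USER_AGENT} Baiduspider [NC]
  RewriteRule ^ - [F,L]
</IfModule>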
Block URLs with dynamic text in them
I've just run a report and I have a lot of duplicate page titles, most of which seem to be review pages. I use Magento, and my normal URL would be something like blah-blahtext.html, but the review URL is something like blah-blahtext/reviews/category/categoryname. So I want to block the /reviews URL part, as no one ever leaves reviews and it's not something I will be using in the future. Also, I have a dynamic navigation which creates URLs that look like product-name.html?size=2&colour=14, and these are also creating duplicate URLs. Any way to fix this? While I'm asking, does anyone have any tips for Magento?
Technical SEO | | Beermonster0 -
Google inconsistent in display of meta content vs page content?
Our e-comm site includes more than 250 brand pages: a large image, some fluffy text, maybe a video, links to categories for that brand, etc. In many cases, Google publishes our page title and description in their search results. However, in some cases, Google instead publishes our H1 and the aforementioned fluffy page content. We want our page content to read well, be descriptive of the brand and appropriate for the audience. We want our meta titles and descriptions brief and likely to attract CTR from qualified shoppers. I'm finding this difficult to manage when Google pulls from two different areas inconsistently. So my question: is there a way to ensure Google only uses our title/description for our listings?
Technical SEO | | websurfer0 -
Is this considered duplicate content?
One of my clients has a template page they have used repeatedly each time they have a new news item. The template includes a two-paragraph customer quote/testimonial for the company. So, they now have 100+ pages with the same customer quote. The rest of the page content / body copy is unique. Is there any likelihood of this being considered duplicate content?
Technical SEO | | bjalc20110