Can I delay an AJAX call in order to hide specific on-page content?
-
I am an SEO for a people-search site. To avoid potential duplicate-content issues for common people searches such as "John Smith", we display the main "John Smith" result above the fold and add the "other John Smith" search results inside an iframe. That way, search engines don't see the same "other John Smith" results on every other "John Smith" profile page on our site and conclude that we have lots of duplicate content.
We want to get away from using an iframe to solve this potential duplicate-content problem.
Question:
Can we display this duplicate "John Smith" content using a delayed AJAX call, and block the directory that contains the AJAX endpoint with robots.txt?
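For illustration, a hedged sketch of what that robots.txt block could look like, assuming the AJAX responses are served from a hypothetical /ajax/ directory:

```text
# Block crawlers from fetching the AJAX responses directly.
# /ajax/ is a hypothetical path; use whatever directory actually serves the call.
User-agent: *
Disallow: /ajax/
```

Note that Disallow prevents crawling, not indexing: a URL blocked this way can still appear in results if it is linked from elsewhere.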
-
It seems Google does now interpret POST requests:
"Googlebot may now perform POST requests when we believe it’s safe and appropriate."
http://googlewebmastercentral.blogspot.com/2011/11/get-post-and-safely-surfacing-more-of.html
-
Thanks for the input; I will check around to see whether Google really does not interpret POST requests.
-
If you are using AJAX, I assume you asynchronously post information to your server in order to retrieve the results, and that your form uses method="post".
If that's the case, you should be OK, because Google (as far as I know) will not follow the POST request; it will stop there and read the page as is.
Now, if you use a form with method="get" (which I doubt), or use some kind of query string to query your database or display default profiles before posting, then Google could be able to follow the results, and that could lead to duplicate content. In that case you would need to block those pages with robots.txt.
Can you post the URL you are working on? That would help.
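A hedged sketch (not the asker's actual code) of the kind of POST-based call being described: the "other John Smith" results are fetched after page load, so the duplicate block never appears in the initial HTML. The endpoint path /ajax/similar-profiles and the parameter name are hypothetical.

```javascript
// Build the options for a POST request to a hypothetical endpoint.
// A POST body, unlike a GET query string, gives crawlers no URL to follow.
function buildSimilarProfilesRequest(name) {
  return {
    url: "/ajax/similar-profiles", // directory you would then block in robots.txt
    method: "POST",
    headers: { "Content-Type": "application/x-www-form-urlencoded" },
    body: "name=" + encodeURIComponent(name),
  };
}

// In the browser this would be handed to fetch() (or an XMLHttpRequest):
//   const req = buildSimilarProfilesRequest("John Smith");
//   fetch(req.url, req).then(r => r.text())
//     .then(html => { document.getElementById("others").innerHTML = html; });
console.log(buildSimilarProfilesRequest("John Smith").body); // name=John%20Smith
```

As the answer above notes, a crawler that does not replay POST requests would simply read the page without this block.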
Related Questions
-
After server migration, crawling gets slow and content changes on dynamic pages are not getting updated
Hello, I performed a server migration two days ago. All is well with traffic moved to the new servers, but with the previous host, a newly submitted article was getting indexed in minutes. Now, even after submitting a page for indexing, it takes a while to reach the search engines, and on some pages where content is updated daily, the changes are not getting reflected despite being submitted for indexing. The site is http://www.mycarhelpline.com. I have checked robots, meta tags, and URL structure; all remain intact, and no errors are reported in Google Webmaster Tools. Could someone advise: is this normal due to the name server and IP address change, and should I expect it to correct itself automatically, or am I missing something? Kindly advise. Thanks
Intermediate & Advanced SEO | Modi -
Duplicate content issue with pages that have navigation
We have a large consumer website with several sections whose navigation spans many pages. How would I prevent those pages from getting duplicate-content errors, and how would I best handle SEO for them? For example, we have about 500 events with 20 events showing on each page. What is the best way to prevent all the subsequent navigation pages from getting duplicate-content and duplicate-title errors?
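One common approach at the time of this thread was Google's rel="next"/"prev" markup for paginated series; a hedged sketch for page 2 of the events listing (the URLs are made up for illustration):

```html
<!-- In the <head> of the second page of the paginated events list -->
<link rel="prev" href="https://example.com/events?page=1">
<link rel="next" href="https://example.com/events?page=3">
```

The tags tell search engines the pages form a sequence rather than duplicates; each page keeps its own self-referencing canonical and its own unique title.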
Intermediate & Advanced SEO | roundbrix -
Noindex pages with duplicate content?
Hello, I have an e-commerce website selling about 20,000 different products. For the most popular of those products, I created unique, high-quality content, written by a professional player, that describes how and why the products are useful, which is of huge interest to buyers. It would cost too much to write that high-quality content for 20,000 different products, but we still have to sell them. Our idea, therefore, was to noindex the products that have only the same copy-paste descriptions all the other websites have. Do you think it's better to do that, or to just let everything be indexed normally, since we might get search traffic from those pages? Thanks a lot for your help!
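A minimal sketch of the noindex approach being proposed, on the assumption it would go in the <head> of each copy-paste product page:

```html
<!-- Keep the page out of the index but still let crawlers follow its links -->
<meta name="robots" content="noindex, follow">
```

For the tag to be seen, the page must remain crawlable, i.e. not blocked in robots.txt.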
Intermediate & Advanced SEO | EndeR -
SEO structure question: Better to add similar (but distinct) content to multiple unique pages or make one unique page?
Not sure which approach would be more SEO-ranking friendly. As we are a music store, we do instrument repairs on all instruments. Currently, I don't have much content about our repairs on our website, so I'm considering a few different approaches to adding it. Let's take trumpet repair, for example: 1. I can auto-write to the HTML body (say, at the end of the body) of the 20 trumpets we have for sale on our site (each having its own page) the verbiage of all repairs, services, rates, and other repair-related detail. In my mind, the effect of this may be that the added information pertains uniquely to trumpets (it excludes all other instrument-repair info), which Google likes, but it would duplicate the trumpet-repair information across 20 pages, which Google may not like. 2. Or I could auto-write the repair details to the trumpet category page, either in the body, header, or footer. This definitely reduces the redundancy of repeating the trumpet-repair info on every trumpet page, but it also reduces each trumpet page's content depth, so I'm not sure which outweighs the other. 3. Write it to both the category page and the individual pages? Possibly valuable because the information anchors around and supports itself, or is that super-duplication? 4. Of course, I could create a category dedicated to repairs, add a subcategory for each instrument with repair info completely unique to that page, and then, in the body of each of the 20 trumpet pages, add an internal link to trumpet repair. Any suggestions greatly appreciated! Thanks, Kevin
Intermediate & Advanced SEO | Kevin_McLeish -
Google is displaying my pages' path instead of the URL (page name)
Does anyone know why Google is displaying my pages' path instead of the URL in the search results? I discovered this while searching for one of my keywords; I copied the link http://www.smarttouch.me/services-saudi/web-services/web-design and found all the related results are the same. Could anyone tell me why that is, and whether it really makes a difference? Or is the URL display more important than the path display for SEO?
Intermediate & Advanced SEO | ali881 -
Does Bing (and Yahoo) crawl AJAX-based content?
I found this article, and at the time Bing appeared to have a checkbox option for AJAX handling, although looking at Bing Webmaster Tools now, that option doesn't appear to be available. Has it simply been fully integrated, relieving webmasters of the need to check the option, or is it no longer supported? http://searchengineland.com/bing-now-supports-googles-crawlable-ajax-standard-84149
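For context, a hedged sketch of the "#!" (hash-bang) mapping from Google's crawlable-AJAX scheme that the linked article discusses: a supporting crawler requests an `_escaped_fragment_` URL in place of the hash-bang URL. (Google has since deprecated the scheme; this only illustrates the mechanics, and the example URL is made up.)

```javascript
// Map a hash-bang URL to the equivalent _escaped_fragment_ URL a crawler
// would request under the crawlable-AJAX scheme.
function escapedFragmentUrl(hashBangUrl) {
  const parts = hashBangUrl.split("#!");
  if (parts.length === 1) return hashBangUrl; // no AJAX fragment to map
  const sep = parts[0].includes("?") ? "&" : "?";
  return parts[0] + sep + "_escaped_fragment_=" + encodeURIComponent(parts[1]);
}

console.log(escapedFragmentUrl("http://example.com/page#!state=2"));
// -> http://example.com/page?_escaped_fragment_=state%3D2
```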
Intermediate & Advanced SEO | imiJoe -
Which is more effective: jQuery + CSS for tabbed content, or creating unique pages for each tab?
We are building a from-scratch directory site and trying to determine the best way to structure our pages. Each general listing page has four sections of specific information. Which is the better strategy for SEO: using tabs (e.g., jQuery + CSS) and putting all the content on one page (and will all of the content still be indexable with jQuery?), or creating unique pages for each section? jQuery: sitename.com/listing-name#section1. Unique pages: sitename.com/listing-name/section1. If I go with option one, I risk not being crawlable by Google if it can't read through the scripting. However, I feel the individual pages will not rank if there's only a small amount of content in each section. Is it better to keep all the content on one page and focus on building links to it? Or better to build out the section pages and work on adding quality content to them, so that long term there is more specificity for long-tail search and a better-quality search experience on Google? We are also set up to have "../listing-type/listing-name" but are considering removing the listing type and just having "../listing-name/". Do you think this is more advantageous for boosting rankings? I know that was like five questions; I've been doing a lot of research and these are the things I'm still scratching my head about. Some general direction would be really great. Thank you!
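A hedged sketch of why the two URL schemes in the question differ for crawlers: the "#section1" fragment is dropped when a URL is fetched, so all tabs share one indexable URL, while path-based sections are separate pages. (sitename.com is the question's own placeholder domain.)

```javascript
// Compare the two candidate URL schemes using standard URL parsing.
const tabbed   = new URL("https://sitename.com/listing-name#section1");
const separate = new URL("https://sitename.com/listing-name/section1");

console.log(tabbed.pathname);   // /listing-name  -> one URL for every tab
console.log(separate.pathname); // /listing-name/section1 -> its own page
```

So the fragment version concentrates all content and links on a single URL, while the path version splits them across distinct, individually rankable pages.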
Intermediate & Advanced SEO | knowyourbank -
How can someone not call B.S. on this site ranking 4th?
We manage a lot of sites around pharmaceuticals and lawsuits. I was checking a couple of the sites for the keyword "Actos lawsuit" using Keyword Difficulty with SERP analysis. Our sites have done very little AdWords except for the first month about a year ago, we have always ranked well, and the client is very happy with the results. Tonight I noticed a site, http://wikilawsuit.org/drug-recalls/actos-side-effects-bladder-cancer-actos-lawsuit/, ranked fourth on Google. Our URL, http://actos-lawsuit.org/, is ranked 9th?? Frankly, there are several sites ranked ahead of us, and when you look at the parameters all the way across, some we are beating handily; but the Wiki site, everyone is beating, and it is still fourth. I ran it in OSE and our metrics came back better, but it has at best 3 to 4 real links out of 30 domains. It is a commercial site with a contact form in the right sidebar, and my guess is they are selling leads to lawyers, so they are about as "wiki" as Hooters. That said, we see all the talk about quality links, yet I am seeing a lot of sites with few quality links and lots of junk links. Should we still believe it matters? Or does it matter only when the sites are huge (JC Penney, etc.) but not if a site is under some critical number of poor links? Looking forward to a Moz fest on this.
Intermediate & Advanced SEO | RobertFisher