Can I delay an AJAX call in order to hide specific on-page content?
-
I am an SEO for a people search site. To avoid potential duplicate content issues for common people searches such as "John Smith," we display the main "John Smith" result above the fold and load the "other John Smith" search results inside an iframe. This way search engines don't see the same "other John Smith" results repeated across all of our other "John Smith" profile pages and conclude that we have a lot of duplicate content.
We want to get away from using an iframe to solve this potential duplicate content problem.
Question:
Can we display this duplicate "John Smith" content using a delayed AJAX call, and block the directory that serves the AJAX response with robots.txt?
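For illustration, here is a minimal sketch of what we have in mind (the /ajax/related-profiles/ endpoint, the delay, and the element ID are hypothetical, and the directory would also be disallowed in robots.txt):

```js
// Hypothetical robots.txt rule for the AJAX directory:
//   User-agent: *
//   Disallow: /ajax/

// Wait a few seconds after page load, then fetch the "other John Smith"
// results via POST so crawlers are unlikely to request the URL themselves.
window.addEventListener('load', function () {
  setTimeout(function () {
    fetch('/ajax/related-profiles/', {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({ name: 'John Smith' })
    })
      .then(function (response) { return response.text(); })
      .then(function (html) {
        var container = document.getElementById('related-profiles');
        if (container) { container.innerHTML = html; }
      });
  }, 3000); // 3-second delay before the call fires
});
```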
-
It seems Google does now interpret POST requests:
"Googlebot may now perform POST requests when we believe it’s safe and appropriate."
http://googlewebmastercentral.blogspot.com/2011/11/get-post-and-safely-surfacing-more-of.html
-
Thanks for the input. I will check around to see whether Google really does not interpret POST requests.
-
If you are using AJAX, I assume you post information asynchronously to your server to retrieve results, and your form method is normally POST.
If that's the case, you are OK, because Google (as far as I know) will not interpret the POST method; it will stop there and read the page as is.
Now, if you use a form with method=GET (which I doubt), or use some kind of query string to query your database or display default profiles before posting, then Google could follow the results, and that could lead to duplicate content. In that case you'll need to block those pages with robots.txt.
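For illustration, a minimal sketch of the difference (the /search-results endpoint is hypothetical):

```js
// GET: each query produces a distinct, crawlable URL
// (e.g. /search-results?name=John+Smith), which a crawler could
// fetch and potentially index as duplicate content.
fetch('/search-results?name=' + encodeURIComponent('John Smith'))
  .then(function (response) { return response.text(); });

// POST: the parameters travel in the request body, so there is no
// distinct URL per query for a crawler to discover and index.
fetch('/search-results', {
  method: 'POST',
  headers: { 'Content-Type': 'application/x-www-form-urlencoded' },
  body: 'name=' + encodeURIComponent('John Smith')
});

// If the GET pattern is unavoidable, a robots.txt rule such as
//   User-agent: *
//   Disallow: /search-results
// keeps compliant crawlers away from those URLs.
```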
Can you post the URL you are working on? That would help.