Can I delay an AJAX call in order to hide specific on-page content?
-
I am an SEO for a people search site. To avoid potential duplicate content issues for common people searches such as "John Smith," we display the main "John Smith" result above the fold and add the "other John Smith" search results inside an iframe. This way, search engines don't see the same "other John Smith" results on all the other "John Smith" profile pages on our site and conclude that we have lots of duplicate content.
We want to get away from using an iframe to solve this potential duplicate content problem.
Question:
Can we display this duplicate "John Smith" content using a delayed AJAX call, and block the directory that contains the AJAX endpoint in robots.txt?
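As a minimal sketch of the delayed-load idea (the endpoint path, delay value, and element id here are all assumptions, not details from the question; note that the delay itself doesn't reliably hide anything from Googlebot, which renders JavaScript — the robots.txt block on the endpoint directory is what would actually do the work):

```javascript
// Sketch only: endpoint path, delay, and element id are hypothetical.
function delay(ms) {
  return new Promise((resolve) => setTimeout(resolve, ms));
}

// Wait, then load the secondary results through the supplied fetch function.
// Passing fetchFn in (rather than calling fetch directly) keeps the helper
// testable outside a browser.
async function loadOtherResults(fetchFn, ms = 2000) {
  await delay(ms);
  return fetchFn('/ajax-results/other-profiles'); // directory disallowed in robots.txt
}

// In a browser this would be wired up roughly like:
// loadOtherResults((url) => fetch(url).then((r) => r.text()))
//   .then((html) => { document.getElementById('other-results').innerHTML = html; });
```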
-
It seems Google does interpret POST:
"Googlebot may now perform POST requests when we believe it’s safe and appropriate."
http://googlewebmastercentral.blogspot.com/2011/11/get-post-and-safely-surfacing-more-of.html
-
Thanks for the input. I will check around to see whether Google really does not interpret POST.
-
If you are using AJAX, I assume you asynchronously post information to your server in order to retrieve results, and your form's method is normally POST.
If that's the case, you are OK, because Google (as far as I know) will not interpret the POST method; it will just stop there and read the page as is.
Now, if you use a form with method=GET (which I doubt), or some kind of query string to query your database or display default profiles before posting, then Google could be able to follow the results, and this could lead to duplicate content. In that case you'll need to block these pages with robots.txt.
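As a sketch, the robots.txt rule for that case might look like this (the /search-results/ directory is an assumption; substitute whatever path actually serves the crawlable duplicate results):

```
# Hypothetical directory serving the duplicate search results
User-agent: *
Disallow: /search-results/
```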
Can you post the URL you are working on? That would help.
Related Questions
-
Can someone help me understand why this page is ranking so well?
Hi everyone, EDIT: I'm going to link to the actual page; please remove if there are any issues with confidentiality. Here is the page: https://www.legalzoom.com/knowledge/llc/topic/advantages-and-disadvantages-overview. It's ranking #2 on Google for "LLC".
This page is a couple months old and is substantially heavy in content, but not much more so than all the dozens of other pages online that are competing with it. This is a highly competitive industry, and this particular domain is an extremely huge player in it. This new page is suddenly ranking #2 for an extremely competitive head term, arguably the most important/high-volume keyword being targeted by the entire site. The page is outranking the home page, as well as the service page that exactly targets the query, the one you would think would be the ranking page for this head term. However, this new page is somewhat of a spin-off with some additional related content about the subject, some videos, resources, a lot of internal links, etc.
The first word of the title tag exactly matches the head term. I did observe that almost no other pages on the site have the exact keyword as the first word of the title tag, but that couldn't be sufficient to bring it up so high in the ranks, could it?
Another bizarre thing that is happening is that Google is ignoring the title tag in the actual HTML (which is a specific question that is accurate to the content on the page) and reassigning a title tag that basically looks like this: "Head Term | Brand." Why would it do this on this page? Doesn't it usually prefer more descriptive title tags?
There are no external links coming up on Moz or Majestic pointing to this page. It has just a couple social shares. It's not being linked to from the home page or top nav bar on the main site.
Can anyone explain how this particular page would outrank the main service page targeting this keyword, as well as other highly authoritative, older pages online targeting the same keyword? Thanks for your help!
Intermediate & Advanced SEO | FPD_NYC1
If an image name and an HTML page name are the same, does that count as spammy content?
If an image name and an HTML page name are the same, does that count as spammy content? For example: watertreatment-plan.jpg and watertreatment-plan.html
Intermediate & Advanced SEO | Poojath0
Can SPA (single page architecture) websites be SEO friendly?
What is the latest consensus on SPA web design architecture and SEO friendliness?
Intermediate & Advanced SEO | Robo342
By SPA, I mean that rather than each page having its own unique URL, each page would have an anchor added to a single URL. For example: Before SPA: website.com/home/green.html After SPA: website.com/home.html#green (rendering a new page using AJAX) It would seem that Google may have trouble differentiating pages with unique anchors vs. unique URLs, but have they adapted to this style of architecture yet? Are there any best practices around this? Some developers are moving to SPA as the state of the art in architecture (e.g., see this thread: http://www.linkedin.com/groups/Google-crawling-websites-built-using-121615.S.219120193), and yet there may be a conflict between SPA and SEO. Any thoughts or black and white answers? Thanks.
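One common approach is to give each SPA view a real, crawlable path instead of a #fragment, using the History API. A minimal illustrative sketch (the URL mapping below is hypothetical, not from the question):

```javascript
// Illustrative only: map a hash-style SPA URL to a real path so each
// view gets a unique, crawlable URL.
function hashToPath(url) {
  // e.g. 'website.com/home.html#green' -> 'website.com/home/green'
  return url.replace(/\.html#([\w-]+)$/, '/$1');
}

// In the SPA, navigation would update the address bar to the real path
// while AJAX swaps the content in place:
// history.pushState({}, '', hashToPath(location.href));
```

The server would also need to respond to those real paths directly, so a crawler (or a visitor with a bookmark) landing on website.com/home/green gets the right content without running the client-side router first.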
I've seen and heard a lot about city-specific landing pages for businesses with multiple locations, but what about landing pages for nearby cities where you aren't actually located? Is it OK to create landing pages for nearby cities?
I asked here https://www.google.com/moderator/#7/e=adbf4 but figured I'd ask the Moz Community too! Is it actually best practice to create landing pages for nearby cities if you don't have an actual address there, even if your target customers are there? For example, if I am in Miami but have a lot of customers who come from nearby cities like Fort Lauderdale, is it okay to create those landing pages? I've heard this described as best practice, but I'm beginning to question whether Google sees it that way.
Intermediate & Advanced SEO | RickyShockley2
Any downsides of (permanently) redirecting 404 pages to more generic pages (category pages)?
Hi, We have a site somewhat like eBay: it has several categories and advertisements posted by customers/clients. These advertisements disappear over time and turn into 404 pages. We have the option to redirect the user to the corresponding category page, but we're afraid of any negative impact of this change. Are there any downsides, and is this really the best option we have? Thanks in advance!
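If the redirect route is taken, a server-side sketch might look like the following Apache mod_rewrite rule (the URL pattern is an assumption; adjust it to the site's actual ad-page structure):

```
# Hypothetical sketch: 301 an expired ad URL like /ads/cars/12345
# to its category page /ads/cars/
RewriteEngine On
RewriteRule ^ads/([^/]+)/[0-9]+$ /ads/$1/ [R=301,L]
```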
Intermediate & Advanced SEO | vhendriks0
Any idea why I can't add a Panoramio image link to my Google Places page?
Hey guys & gals! Last week, I watched one of the Pro Webinars on here related to Google Places. Since then, I have begun to help one of my friends with his Google Places page to get my feet wet. One of the tips from the webinar was to geotag images in Panoramio to use for your images on the Places page. However, when I try to do this, I just get an error saying they can't upload it at this time. I tried searching online for answers, but on the Google support pages I found where someone asks the same question, there is no resolution. Can anyone help? PS - I would prefer not to post the business name, URL, etc. publicly, so if that info is needed, I can PM. Thanks a lot!
Intermediate & Advanced SEO | strong11
Blocking Pages via Robots.txt: Can Images on Those Pages Be Included in Image Search?
Hi! I have pages within my forum where visitors can upload photos. When they upload photos they provide a simple statement about the photo but no real information about the image, definitely not enough for the page to be deemed worthy of being indexed. The industry, however, is one that really leans on images, and having the images in Google Image search is important to us. The URL structure is like this: domain.com/community/photos/~username~/picture111111.aspx I wish to block the whole folder from Googlebot to prevent these low-quality pages from being added to Google's main SERP results. This would be something like this: User-agent: googlebot Disallow: /community/photos/ Can I disallow Googlebot specifically, rather than just using User-agent: * which would then allow googlebot-image to pick up the photos? I plan on configuring a way to add meaningful alt attributes and image names to assist in visibility, but the actual act of blocking the pages and getting the images picked up... Is this possible? Thanks! Leona
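A sketch of the robots.txt described in the question: Google's crawlers follow the most specific user-agent group that matches them, so an explicit Googlebot-Image group with an empty Disallow can stay open while the Googlebot group is blocked:

```
# Block Googlebot from the photo pages...
User-agent: Googlebot
Disallow: /community/photos/

# ...while leaving Googlebot-Image unrestricted (an empty Disallow allows everything)
User-agent: Googlebot-Image
Disallow:
```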
Intermediate & Advanced SEO | HD_Leona0
We are changing ?page= dynamic URLs to /page/ static URLs. Will this hurt the progress we have made with the pages using dynamic addresses?
Question about changing URLs from dynamic to static to improve SEO, but concerned about hurting the progress made so far.
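The usual way to preserve that progress is to 301-redirect each old dynamic URL to its new static equivalent, so link equity and rankings consolidate on the new addresses. A hedged Apache mod_rewrite sketch (the script name and parameter format are assumptions, not from the question):

```
# Hypothetical sketch: 301 /index.php?page=about to /about/
RewriteEngine On
RewriteCond %{QUERY_STRING} ^page=([a-z0-9-]+)$ [NC]
RewriteRule ^index\.php$ /%1/? [R=301,L]
```

The trailing `?` in the target strips the old query string from the redirect destination.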
Intermediate & Advanced SEO | h3counsel0