Googlebot crawling an AJAX website does not always use _escaped_fragment_
-
Hi,
I started to investigate the Googlebot crawl log of our website, and it appears that there is no 1:1 correlation between URLs crawled with _escaped_fragment_ and without it.
My expectation was that each time Google crawls a URL, a minute or so later it is supposed to crawl the same URL again using _escaped_fragment_. For example:
Googlebot crawl log for https://my_web_site/some_slug. Results:
Googlebot crawled this URL 17 times in July: http://i.imgur.com/sA141O0.jpg
Googlebot crawled this URL an additional 3 times using _escaped_fragment_:
http://i.imgur.com/sOQjyPU.jpg
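For reference, this is roughly how I'm tallying these counts from the raw server access log (a minimal sketch; the log path, the combined log format, and the simple ?_escaped_fragment_= query mapping are assumptions about our setup):

```python
import re
from collections import Counter

LOG_PATH = "access.log"  # hypothetical path; point this at your raw server log

# Request path and user agent from a combined-log-format line.
LINE_RE = re.compile(r'"(?:GET|HEAD) (?P<path>\S+) HTTP/[\d.]+".*"(?P<ua>[^"]*)"\s*$')

plain, escaped = Counter(), Counter()

with open(LOG_PATH, encoding="utf-8", errors="replace") as log:
    for line in log:
        match = LINE_RE.search(line)
        if not match or "Googlebot" not in match.group("ua"):
            continue
        path = match.group("path")
        if "_escaped_fragment_=" in path:
            # Fold the "ugly" URL back to the pretty one so both counters share keys
            # (assumes the simple case where the parameter is the only query argument).
            pretty = path.split("?_escaped_fragment_=")[0]
            escaped[pretty] += 1
        else:
            plain[path] += 1

for url in sorted(set(plain) | set(escaped)):
    print(f"{url}\tplain={plain[url]}\tescaped={escaped[url]}")
```

Folding the ugly URL back to the pretty one is just so the two counts line up per slug.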
Do you have any idea if this behavior is normal?
Thanks,
Yohay
Related Questions
-
My brand-new website shows a 79% Spam Score. What is the reason, and how should I deal with this?
Hi, I launched my website just one month ago, and I have used all paid images and uniquely written content; everything is genuine, for a better SEO outcome in the future. The problem is that it is showing a 79% Spam Score in the MozBar. I don't have a single link pointing to my website, and my content and images are unique. Why is this brand-new website showing such a high Spam Score? Can you please help me? I am very stressed about this problem.
White Hat / Black Hat SEO | rahat640
-
How to increase US traffic for a website based in another GEO
Can you help me understand why my US traffic is not increasing for landing/product pages, whereas for our blog it has grown 2x to 3x over the past two quarters? I am afraid I haven't found a right answer for this. Could you or someone help me discover the answer? Also, what should I do to rank in the US for a particular keyword? We rank at position 2 for the keyword "Hackathon" in India, but in the US the position is 56. I don't understand what to do or the best possible way to rank in the US.
White Hat / Black Hat SEO | Rajnish_HE0
-
Help finding websites scraping my content
Hi, I need a tool to help me review sites that are plagiarising / directly copying content from my site. But the tools I'm aware of, such as Copyscape, appear to work with individual URLs and not a root domain. That's great if you have a particular post or page you want to check, but in this case some sites are scraping thousands of product pages, so I need to submit the root domain rather than an individual URL. In some cases, other sites are being listed in SERPs above, or even instead of, our site for product search terms. So far I have stumbled across this rather than proactively researched offending sites. I want to enter my root domain and have the tool review all my internal site pages before reporting the other domains where an individual page contains a certain amount of duplicated copy. Working in the same way that Moz crawls the site for internal duplicate pages, I need a list of duplicate content by domain and URL, externally, so that I can contact the offending sites to request they remove the content, and send it to Google as evidence if they don't. Any help would be gratefully appreciated. Terry
White Hat / Black Hat SEO | MFCommunications0
-
Is there a danger in linking to and from one website too many times?
Basically, my web developer has suggested that instead of using a subfolder to create English and Korean versions of the site, I should create two different websites and then link them together to provide each page in English or in Korean, whichever the case may be. My immediate reaction is that search engines may perceive this kind of linking as manipulative; as you can imagine, there will be a lot of links (one for every page). Do you think it is OK to create two websites and link them together page by page? Or do you think the site will get penalized by search engines for link farming or link exchanging? Regards, Tom
White Hat / Black Hat SEO | CoGri0
-
Are links from blog comments using keyword anchor text a Penguin 2.0 issue?
Hello, I am continuing a complete clean-up of a client's link profile and would like to know whether Penguin targets links from blog comments where the user includes keywords as anchor text. So far I have been attempting to get them removed before I go for a disavow. An example would be the "work clothing" comment at the bottom of: http://www.fashionstyleyou.co.uk/beat-the-caffeine-rush.html/comment-page-1 I am also questioning whether we should keep any link directories; so far I have been ruthless, but I worry I will be losing a hell of a lot of links. For example, I have kept the following: http://www.business-directory-uk.co.uk//clothing.htm Your comments are welcomed!
White Hat / Black Hat SEO | MarzVentures0
-
Are Meta Keywords Important For Websites?
Hi, I understand that meta titles and descriptions are very important for websites. I would like to know whether meta keywords are important too. I have seen people saying that meta keywords are useless and should be removed from the website to prevent competitors from learning your keywords. Does anyone have anything to share? 🙂
White Hat / Black Hat SEO | chanel270
-
How many times should one submit the same article to various websites? One time? Ten times? What is okay to do after the most recent Panda update?
For link-building purposes, it seemingly was okay in the past to post the same article to multiple sites for links. However, after the most recent Panda update, our thought is that this may not be a good practice. So the question is: how many times is it okay to submit an article for link-building purposes? Should you only ever submit to one site? Is it okay to do it more than once? What is the right way to submit for link building in Google's eyes? Thanks
White Hat / Black Hat SEO | Robertnweil10