Googlebot crawling an AJAX website does not always use _escaped_fragment_
-
Hi,
I started investigating the Googlebot crawl log of our website, and it appears there is no 1:1 correlation between URLs crawled with _escaped_fragment_ and without it.
My expectation is that each time Google crawls a URL, a minute or so later it should crawl the same URL again using _escaped_fragment_. For example:
Googlebot crawl log for https://my_web_site/some_slug
Results:
Googlebot crawled this URL 17 times in July: http://i.imgur.com/sA141O0.jpg
Googlebot crawled this URL only 3 additional times using _escaped_fragment_:
http://i.imgur.com/sOQjyPU.jpg
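For context, under Google's AJAX crawling scheme (since deprecated), a page that opts in with <meta name="fragment" content="!"> is re-fetched with ?_escaped_fragment_= appended, and #! URLs are mapped the same way. Below is a minimal sketch of the kind of log correlation described above, assuming a standard combined-format access log; the file name and path are placeholders:

```typescript
import { readFileSync } from "node:fs";

// Assumption: a standard combined-format access log, where the
// user agent is the last quoted field on each line.
const LOG_LINE = /"(?:GET|HEAD) (\S+) HTTP\/[\d.]+".*"([^"]*)"\s*$/;

function countGooglebotCrawls(logPath: string, targetPath: string) {
  const counts = { plain: 0, escapedFragment: 0 };
  for (const line of readFileSync(logPath, "utf8").split("\n")) {
    const match = LOG_LINE.exec(line);
    if (!match || !match[2].includes("Googlebot")) continue;
    // Base URL is a placeholder; it only serves to parse relative paths.
    const url = new URL(match[1], "https://placeholder.example");
    if (url.pathname !== targetPath) continue;
    if (url.searchParams.has("_escaped_fragment_")) counts.escapedFragment++;
    else counts.plain++;
  }
  return counts;
}

// Placeholder file and path:
// console.log(countGooglebotCrawls("access.log", "/some_slug"));
// e.g. { plain: 17, escapedFragment: 3 }
```

With the counts split out per URL like this, a 17-vs-3 mismatch is easy to spot across the whole site.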
Do you have any idea if this behavior is normal?
Thanks,
Yohay
Related Questions
-
Besides fixing technical errors, what is the best way to increase organic traffic to a movie review website?
I have a friend's website, ShowBizJunkies, that they work very hard at improving with great content. I moved the site to a more modern theme and increased its speed (WP Engine, maxed out with a CDN, caching, image optimization, etc.), but now I'm struggling with how to further improve the SEO structure or build backlinks. I know that trying to rank for terms like "movie reviews" and many similar ones is ridiculously difficult and requires tons of high-quality backlinks. What is my lowest-hanging fruit here? Any suggestions? My current plan is:
1. Fix technical errors
2. Create more evergreen content
3. Work on timing of article releases for better Google News coverage
4. More social sharing on Tumblr, Reddit, Facebook Groups, G+ Communities, etc.
5. Build backlinks via outreach to TV-show-specific sites, movie fan sites, and actor fan sites (interviews)
White Hat / Black Hat SEO | JustinMurray
-
Website not moving?
We run a printing website, www.fastprint.co.uk, and have built a few decent tools, such as http://www.fastprint.co.uk/adobe-shortcut-mapper/, and decent infographics, such as http://www.fastprint.co.uk/blog/the-art-of-mixing-typefaces.html. We have had a fair few decent links from websites over the last year and a half, but we do not seem to be moving very far. Look at our site on SEMrush (a decent percentage of our site traffic comes through the tools and blog posts above, so the e-commerce number would be lower): http://www.semrush.com/uk/info/fastprint.co.uk+(by+organic)?sort=volume_desc, in comparison to a few others: http://www.semrush.com/uk/info/banana-print.co.uk+(by+organic), http://www.semrush.com/uk/info/brunelone.com+(by+organic), and especially this site: http://www.semrush.com/uk/info/instantprint.co.uk+(by+organic). I just don't get what we are doing wrong.
White Hat / Black Hat SEO | BobAnderson
-
Competitor is interlinking between his websites
I have a competitor who ranks on the first page for all his keywords, and I found out in Open Site Explorer that he has been interlinking between his websites. It is obvious because he owns the same domain under different country TLDs, for example: www.example.id (Indonesia), www.example.my (Malaysia), www.example.sg (Singapore). My question: is this even considered "white hat"? I read one of the blog posts from Moz, and here is the quote:
"#7 - Uniqueness of Source + Target: The engines have a number of ways to judge and predict ownership and relationships between websites. These can include (but are certainly not limited to):
A large number of shared, reciprocated links
Domain registration data
Shared hosting IP address or IP address C-blocks
Public acquisition/relationship information
Publicized marketing agreements that can be machine-read and interpreted
If the engines determine that a pre-existing relationship of some kind could inhibit the "editorial" quality of a link passing between two sites, they may choose to discount or even ignore these. Anecdotal evidence that links shared between "networks" of websites pass little value (particularly the classic SEO strategy of "sitewide" links) is one point many in the organic search field point to on this topic."
Will interlinking between your own sites be ignored by Google in the future? Is this a time-bomb method, or is it fine to do? As far as I can tell, my competitor has been ranking on the first page for quite some time.
White Hat / Black Hat SEO | andzon
-
Can a hidden menu damage a website page?
Website (A) has a landing page offering courses. Website (B), a different organisation, has a link to Website (A); the link's goal landing page is Website (A)'s Courses page, which is already popular with visitors who search for it or come directly to Website (A). The owners of Website (A) want to add an extra menu item to the menu bar on their Courses page to offer some specific courses to visitors who come from Website (B). But the additional menu item is only to be displayed if you arrive by clicking the link on Website (B), which both parties intend to track. If you come to the Courses landing page on Website (A) directly from a search engine, or by typing in the URL of the landing page, you will not see this extra menu item; it only appears if you visit Website (A) from Website (B). This approach makes me twitch, because it looks like a form of cloaking. What I don't understand is that Website (A)'s landing page outwardly shows Google a normal menu bar, yet if I come to the same URL from Website (B), I see an additional menu item. How will Google look at this landing page? Surely it must see the coding instructions sitting behind the page that, in effect, serve up two versions of the page while the URL itself never changes. What should I advise the developer? I don't want the landing page of Website (A), which is doing fine right now, to end up with some sort of search engine penalty through this exercise. Many thanks in advance for answers from the community.
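For illustration, the behavior being described usually comes down to a conditional like the one below; this is a hypothetical sketch (the element id, URL, and referrer check are all invented), not the developer's actual code:

```typescript
// Hypothetical sketch of the proposed behavior: the extra menu item
// is in the page markup but hidden, and is revealed only for visitors
// who arrived from Website (B). Googlebot arrives with no such
// referrer and never sees it rendered: the "two versions, one URL"
// situation described above.
const cameFromSiteB = document.referrer.startsWith("https://website-b.example/");
if (cameFromSiteB) {
  document.getElementById("extra-courses-menu-item")?.removeAttribute("hidden");
}
```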
White Hat / Black Hat SEO | ICTADVIS
-
Using Yext - Opinions? Thoughts? Harmful effects on SEO?
Hi All, Does anyone have any experience using Yext? We have 36 locations across the US, and I think it would be great to get our local listings knocked out efficiently. Can anyone provide information on the directories they list you in (Google Places listings, for example)? Also, can anyone say whether there is any harm in blasting so many directories at once? I don't want to do anything that might hurt our SEO rankings or generate low-quality links. Thanks in advance!
White Hat / Black Hat SEO | CSawatzky
-
SEO best practice: use tags for SEO purposes? To add or not to add to the sitemap?
Hi Moz community, New to the Moz community, and hopefully this is the first post/comment of many to come. I am somewhat new to the industry and have a question that I would like your opinions on. It most likely has a very simple answer, but here goes: I have a website for a local moving company (so small amounts of traffic and very few pages) that was built on WordPress. I was told when I first started that I should create tags for some of the cities serviced in the area. I did so and tagged the first blog post with each tag, which turned out to be about 12-15 tags and in turn created 12-15 additional pages. These tags are listed in the footer area of each page, and there are fewer than 20 pages on the website excluding the tags. Now, I know that each of these pages is showing as duplicate content, and that just does not seem like best practice to me. For someone quite new to the industry, what would you suggest I do to best deal with this situation? Should I even keep the tags? Should I keep them but not index them (see the sketch below)? Should I add them to or remove them from the sitemap? Thanks in advance for any help; I look forward to being a long-time member of SEOmoz.
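If the answer ends up being "keep the tags but noindex them," an SEO plugin such as Yoast can do this from inside WordPress; below is a minimal server-side sketch for Apache, assuming tag archives live under the default /tag/ permalink base:

```apache
# Assumption: tag archives live under /tag/ (the WordPress default).
# "noindex, follow" drops the tag pages from the index while still
# letting crawlers follow the links on them; leave them out of the
# XML sitemap as well.
<IfModule mod_setenvif.c>
  SetEnvIf Request_URI "^/tag/" IS_TAG_ARCHIVE
</IfModule>
<IfModule mod_headers.c>
  Header set X-Robots-Tag "noindex, follow" env=IS_TAG_ARCHIVE
</IfModule>
```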
White Hat / Black Hat SEO | BWrightTLM
-
Domain Structure For A Network of Websites
To achieve this, we need to set up a new architecture of domains and sub-websites to effectively build this network, and we want to make sure we follow the right protocols for setting up the domain structures to achieve good SEO for both the primary domain and the local websites. Today our core website is at www.doctorsvisioncenter.com, which will ultimately become dvceyecarenetwork.com. That website will serve as the core web presence that can be custom-branded for hundreds of practices. For example, today you can go to www.doctorsvisioncenter.com/pinehurst, and when you start there and click around, the site stays branded for Pinehurst (Spectrum Eye Care). So the burning question(s): if I am an independent doc at www.newyorkeye.com, I could use domain forwarding, but Google does not index forwarded domains, so that is out. I could do a 301 permanent redirect to my page, www.doctorsvisioncenter.com/newyorkeye. I could then put a rule in the .htaccess file that says: if the request is for newyorkeye.com, redirect to www.doctorsvisioncenter.com/newyorkeye, and have the domain show up as www.newyorkeye.com. Another way to do this is to point the newyorkeye.com DNS at doctorsvisioncenter.com rather than using a 301 redirect, with the same basic rule in the .htaccess file (see the sketch below). That means that, theoretically, every sub-page would show up as, for example, www.newyorkeye.com/contact-lens-center, which is actually www.doctorsvisioncenter.com/contact-lens-center. It also means, theoretically, that it will be seen as an individual domain, but one pointing to the same content as potentially hundreds of others. The goal is to build once, manage once, and benefit many. If we do something like the above, each domain will essentially be a separate domain, but will Google see it that way, or as duplicate content? While it is easy to answer "yes, it would be duplicative," that is not necessarily the case if the content is on separate domains. Is this a good way to proceed, or does anyone have another recommendation for us?
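For what it's worth, the masked-rewrite variant described above (DNS for the practice domain points at the core server, and Apache internally rewrites requests into the practice's subdirectory without changing the visible URL) would look roughly like this in the .htaccess file. This is a sketch of the approach in the question, not a recommendation, and it assumes the DNS is already pointed:

```apache
RewriteEngine On

# Internal rewrite (no [R] flag): the browser keeps showing
# www.newyorkeye.com/..., while Apache actually serves the content
# from the /newyorkeye/ subdirectory of the core site.
RewriteCond %{HTTP_HOST} ^(www\.)?newyorkeye\.com$ [NC]
RewriteCond %{REQUEST_URI} !^/newyorkeye/
RewriteRule ^(.*)$ /newyorkeye/$1 [L]
```

Note that this serves identical content under multiple hostnames, which is exactly the duplicate-content question raised above; the 301 alternative would instead use [R=301,L] with the full www.doctorsvisioncenter.com URL as the target, consolidating everything onto one domain.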
White Hat / Black Hat SEO | JessTopps
-
What happens if a company only uses black hat techniques for an extended period of time?
Let's say I were to start a company. Of course, I want to be indexed, crawled, and pulled up in the search engines, so I start using black-hat SEO techniques: I comment spam, keyword stuff, spin articles, hide text, etc. I publish hundreds of articles per day on well-known sites with excellent PageRank. If I am doing all of these unethical things, what is going to happen to my website?
White Hat / Black Hat SEO | FrontlineMobility