Does Google read code as is or as rendered?
-
So for example, with a Facebook Like box, it has all the profile pictures of people...will Google see these as all separate links or ignore them?
-
It probably sees them, but remember what happens when Google tries to cache a Facebook Page or Profile. Looking at Google's cache of Moz's Facebook page:
- Google sees it rendered;
- but the text-only version sends you to the Facebook Page itself, not to a text-only copy of it. That's the Facebook walled garden made visible.
Hence, if Google creates an association between shared links and the profiles sharing them, it must be doing so through co-occurrences rather than the link graph, IMHO.
-
ThompsonPaul has a good suggestion: test your pages with "Fetch as Googlebot" to see what Google sees.
The short answer to your question is no. Google won't pay attention to the Facebook Like Box pictures because the code comes through as JavaScript (raw code). Whether Google sees the links really depends on how the source is written; for the Facebook Like/Group box, Google won't count them as individual links on the page, it sees the whole box as one JavaScript widget.
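To illustrate the point (a minimal sketch, not Facebook's actual embed markup): a crawler reading raw source sees only the widget's container and script tag. The profile-picture links only exist after the script runs in a browser, so extracting anchors from the raw HTML finds none of them:

```python
from html.parser import HTMLParser

# Simplified raw source of a page embedding a Like Box-style widget.
# The profile-picture links are injected by the script at render time,
# so they never appear in this source.
RAW_HTML = """
<html><body>
  <a href="/about">About us</a>
  <div class="fb-like-box" data-href="https://www.facebook.com/moz"></div>
  <script src="https://connect.facebook.net/en_US/sdk.js" async></script>
</body></html>
"""

class LinkCollector(HTMLParser):
    """Collects href values from <a> tags, as a raw-source crawler would."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.links.append(dict(attrs).get("href"))

parser = LinkCollector()
parser.feed(RAW_HTML)
print(parser.links)  # → ['/about'] — only the site's own link, no widget links
```

The site's own anchor is found; the Like Box contributes nothing but a div and a script reference.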
-
Hi, Stephanie-
Googlebot reads the raw code. But in the example you give, it would see those links as well. What remains in question is whether it recognizes the nature of those links and therefore gives them limited weight. Personally, I doubt Google ignores them, as they're a valuable source of data for both the link graph and the Knowledge Graph.
-
In pretty well all cases, the crawlers crawl code only, Stephanie. It would be too computationally expensive (both for your server and theirs) for them to render every page before crawling it.
The Fetch as Googlebot tool in your Google Webmaster Tools account will show you exactly what the Google crawler sees. When you use the Fetch tool, the result will show a line indicating the success of the fetch. The word Success (in green) is actually a link to a view of the page exactly as Google sees it.
Does that make sense?
Paul
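If you want to approximate a raw Googlebot-style fetch yourself, you can request a page with Googlebot's user-agent string and no JavaScript execution. A minimal sketch (the in-process echo server here is just a stand-in for your own site, so the example is self-contained):

```python
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import Request, urlopen

# Googlebot's published desktop user-agent string.
GOOGLEBOT_UA = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"

class EchoUA(BaseHTTPRequestHandler):
    """Stand-in for a real site: replies with the User-Agent it was sent."""
    def do_GET(self):
        body = self.headers.get("User-Agent", "").encode()
        self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # silence per-request logging
        pass

server = HTTPServer(("127.0.0.1", 0), EchoUA)
threading.Thread(target=server.serve_forever, daemon=True).start()

# Fetch exactly as a non-rendering crawler would: raw HTTP, crawler UA,
# no JavaScript engine involved.
req = Request(f"http://127.0.0.1:{server.server_port}/",
              headers={"User-Agent": GOOGLEBOT_UA})
seen_by_server = urlopen(req).read().decode()
print(seen_by_server)
server.shutdown()
```

Note this is only an approximation: it shows the raw response your server gives a crawler UA, while the Fetch as Googlebot tool shows what Google itself actually received.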