Does Google read code as is or as rendered?
-
Question - Does Google read code as is or as rendered?
For example, a Facebook Like box shows the profile pictures of people who have liked the page. Will Google see these as separate links, or ignore them?
-
It probably sees them, but remember what happens when Google tries to cache a Facebook Page or Profile. Take Google's cache of Moz's Facebook Page:
- Google sees it rendered;
- But the text-only version redirects to the Facebook Page itself, not to a text-only version of the cache. That's the Facebook walled garden made visible.
Hence, if Google can create associations between shared links and the profiles sharing them, it must be doing so not via the link graph but via co-occurrences, IMHO.
-
ThompsonPaul has a good suggestion: test your pages with "Fetch as Googlebot" to see what Google sees.
The short answer to your question is no, Google will not pay attention to the Facebook Like Box pictures, because the code comes through as JavaScript (raw code). Whether Google sees something really depends on how the source is written. For the Facebook Like/Group box, Google won't count the pictures as individual links on the page; it treats the whole thing as one JavaScript widget.
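To make that concrete, here is a minimal sketch. The page markup below is hypothetical, modeled on the classic Like Box embed: in the raw source the widget is just a placeholder div plus a script loader, so a code-only crawler finds no profile-picture links to count.

```python
from html.parser import HTMLParser

# Hypothetical raw source of a page using a classic Facebook Like Box embed.
# The widget is only a placeholder <div> plus a script loader; the
# profile-picture links exist only after a browser runs the JavaScript.
RAW_HTML = """
<html><body>
  <div id="fb-root"></div>
  <script async src="https://connect.facebook.net/en_US/sdk.js"></script>
  <div class="fb-like-box" data-href="https://www.facebook.com/moz"></div>
</body></html>
"""

class LinkCounter(HTMLParser):
    """Collects <a href> links, roughly what a code-only crawler would see."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

parser = LinkCounter()
parser.feed(RAW_HTML)
print(len(parser.links))  # 0 -- no profile links in the raw code
```

The rendered widget would show a grid of linked profile pictures, but none of those anchors exist in the markup a raw-code crawler parses.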
-
Hi, Stephanie-
Googlebot reads raw code. But in the example you give, it would still see those links. What remains in question is whether it recognizes the nature of those links and therefore pays them limited attention. Personally, I doubt Google ignores them, as they're a valuable source of data for both the link graph and the Knowledge Graph.
-
In pretty much all cases, the crawlers crawl code only, Stephanie. It would be too computationally expensive (for both your server and theirs) to render every page before crawling it.
The Fetch as Googlebot tool in your Google Webmaster Tools account will show you exactly what the Google crawler sees. When you use the Fetch tool, the result will include a line indicating the success of the fetch. The word Success (in green) is actually a link to a view of the page exactly as Google sees it.
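For a rough command-line analogue (a sketch only, not the Fetch tool itself, and the URL is a placeholder), you can request a page while identifying as Googlebot and inspect the raw HTML a crawler would receive:

```python
import urllib.request

# A rough analogue of "Fetch as Googlebot" (a sketch, not the real tool):
# request the page with Googlebot's user-agent and inspect the raw response.
GOOGLEBOT_UA = (
    "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
)

def build_googlebot_request(url):
    """Builds a request that identifies itself as Googlebot."""
    return urllib.request.Request(url, headers={"User-Agent": GOOGLEBOT_UA})

# Placeholder URL -- substitute your own page before fetching.
req = build_googlebot_request("https://www.example.com/")
print(req.get_header("User-agent"))
# To actually fetch the raw HTML:
#   html = urllib.request.urlopen(req).read().decode("utf-8", "replace")
```

Note that some sites serve different content to different user-agents, so the real Fetch tool remains the authoritative view of what Google receives.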
Does that make sense?
Paul