Does Google read code as is or as rendered?
-
For example, a Facebook Like box shows the profile pictures of people who have liked the page... will Google see these as separate links, or ignore them?
-
Probably it sees them, but remember what happens when Google tries to cache a Facebook Page or Profile. Take Google's cache of Moz's Facebook Page:
- Google shows it rendered;
- but the text-only version of the cache links back to the Facebook Page itself, not to a text-only copy of it. That's the Facebook walled garden made visible.
Hence, if Google can associate shared links with the profiles sharing them, it is probably doing so through co-occurrence rather than the link graph, IMHO.
-
ThompsonPaul has a good suggestion: test your pages with "Fetch as Googlebot" to see what Google sees.
The short answer to your question is no. Google will not pay attention to the Facebook Like Box pictures, because the code arrives as JavaScript (raw code). Whether Google sees something really depends on how the source is written. For a Facebook Like/Group box, Google won't count the pictures as individual links on the page; it sees the whole thing as one JavaScript widget.
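To illustrate the point, a Like Box embed typically appears in the raw source as nothing more than a container plus a script loader. This is a simplified, hypothetical sketch; the exact markup Facebook generates varies:

```html
<!-- Simplified sketch of a Facebook Like Box embed; the real snippet varies. -->
<div id="fb-root"></div>
<script async src="//connect.facebook.net/en_US/all.js#xfbml=1"></script>
<div class="fb-like-box" data-href="https://www.facebook.com/moz"></div>
```

The profile pictures and their links only exist after that script runs in a browser; a crawler reading code alone sees just the three lines above.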
-
Hi, Stephanie-
Googlebot reads the raw code. But in the example you give, it would see those links as well. The open question is whether it recognizes the nature of those links and therefore pays them limited attention. Personally, I doubt Google ignores them, since they're a valuable source of data for both the link graph and the Knowledge Graph.
-
In pretty well all cases, the crawlers crawl code only, Stephanie. It would be too computationally expensive (for both your server and theirs) to render every page before crawling it.
The Fetch as Googlebot tool in your Google Webmaster Tools account will show you exactly what the Google crawler sees. When you use the Fetch tool, the result shows a line indicating whether the fetch succeeded. The word "Success" (in green) is actually a link to a view of the page exactly as Google sees it.
Does that make sense?
Paul
Related Questions
-
What's up with the last Google update?
I have numerous clients who were at the top of page one, in the top 3 spots. They all dropped to page 2, 3, or 4, and now they are number 1 in Maps or in the top 3 there. Content is great on all these sites. Backlinks are high quality; we do not build for quantity, we always focus on quality. The sites have authorship information and trust. We have excellent content written by professionals in the industry for each of the websites. The sites load super fast and are very mobile friendly, and we have a CDN installed. Content is organized per topic. All of our citations are set up properly, with no duplicates or missing citations. The code on the websites is good. We do not have anchor-text links pointing to the sites from guest posts or the like. We have plenty of content, our DA/PA is great, and audits of the websites come back clean. I've been doing this a long time and I've never been so dumbfounded by what Google has done this time. Or better yet: what exactly is wrong with our clients' websites today that was working perfectly for the last five years? I'm really getting frustrated. I'm comparing my sites to competitors' and everything is better. Please, someone guide me here and tell me what I'm missing, or tell me what you have done to recover from this nonsense.
Intermediate & Advanced SEO | waqid
-
Google Indexing of Images
Our site is experiencing an issue with indexation of images. The site is real-estate oriented, with 238 listings and about 1,190 images. The site submits two versions (different sizes) of each image to Google, so about 2,400 images in total, yet only several hundred are indexed. Can adding microdata improve the indexation of the images? Our sitemap also submits images that reside on no-index listing pages, which is why more than 2,000 images have been submitted while only a few hundred are indexed. How should the sitemap deal with images that reside on no-index pages? Do images that are part of pages set to "no-index" need a special "no-index" label or other special treatment? My concern is that so many unindexed images could be a red flag signalling poor-quality content to Google. Is it worth investing in correcting this issue, or would correcting it result in little to no SEO improvement? Thanks, Alan
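For reference, Google's image sitemap extension attaches image entries to the page that hosts them, so entries for no-index pages would simply be left out of the file. A minimal sketch (the URLs are hypothetical placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
  <url>
    <!-- Only indexable listing pages should appear here. -->
    <loc>http://www.example.com/listings/123-main-st</loc>
    <image:image>
      <image:loc>http://www.example.com/photos/123-main-st-front.jpg</image:loc>
    </image:image>
  </url>
</urlset>
```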
Intermediate & Advanced SEO | Kingalan1
-
Google Search Results...
I'm trying to download every Google search result for my company (site:company.com). The limit I can get is 100; I tried using SEOquake but could only get 100 there as well. The reason? I would like to see which pages are indexed. The www pages and known subdomain pages should only make up 7,000, but the search results show 23,000. I would like to see what the other results in that 23,000 are. Any advice on how to go about this? I can check subdomains individually (site:www.company.com, site:static.company.com), but I don't know all the subdomains. Has anyone cracked this? I tried using a scraper tool but it was only able to retrieve 200.
Intermediate & Advanced SEO | Bio-RadAbs
-
Google still listing old domain
Hi. We moved to a new domain back in March 2014, redirected most pages with a 301, and submitted a change-of-address request through Google Webmaster Tools. A couple of pages were left as 302 redirects because they had rubbish links pointing to them and we had previously had a penalty. Google was still indexing the old domain, and our rankings hadn't recovered. Last month we removed the 302 redirects and took a blanket 301 approach from the old domain to the new one, on the thinking that, with the penalty lifted from the old domain, there was no harm in sending everything to the new domain. Again, we submitted the change of address in Webmaster Tools, as the option was available to us, but it's been a couple of weeks now and the old domain is still indexed. Am I missing something? I realise the rankings may not have recovered partly due to the disavowed/disregarded links, but I'm concerned this may be contributing.
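For what it's worth, a blanket 301 like the one described is usually a couple of lines of server config. A hypothetical Apache .htaccess sketch (the domain names are placeholders):

```apache
RewriteEngine On
# Send every request on the old domain to the same path on the new one.
RewriteCond %{HTTP_HOST} ^(www\.)?old-domain\.com$ [NC]
RewriteRule ^(.*)$ http://www.new-domain.com/$1 [R=301,L]
```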
Intermediate & Advanced SEO | Ham1979
-
Can Google read content/see links on subscription sites?
If an article is published on The Times (for example), can Google bypass the subscription sign-in to read the content and index the links in the article? Example: http://www.thetimes.co.uk/tto/life/property/overseas/article4245346.ece In the above article there is a link to the resort's website, but you can't see it unless you subscribe. I checked the source code of the page with the subscription prompt present, and the link isn't there. Is there a way these sites treat search engines differently from other user agents so that the content can be crawled and indexed?
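Many paywalled publishers have done exactly that, serving full content to recognized crawlers and to visitors arriving from a search results page ("first click free"). A hypothetical sketch of that server-side decision (the function name and token lists are illustrative, and a real implementation should verify crawlers by reverse DNS rather than trusting the User-Agent string):

```python
def serve_full_article(user_agent: str, referrer: str) -> bool:
    """Decide whether to show the full article or the subscription prompt."""
    crawler_tokens = ("Googlebot", "bingbot")  # claimed crawler user agents
    search_hosts = ("google.", "bing.")        # organic-search referrers
    if any(token in user_agent for token in crawler_tokens):
        return True  # crawler: serve full content so it can be indexed
    # Visitor clicking through from a search results page also gets one free view.
    return any(host in referrer for host in search_hosts)

print(serve_full_article("Mozilla/5.0 (compatible; Googlebot/2.1)", ""))       # True
print(serve_full_article("Mozilla/5.0", "https://www.google.com/search?q=x"))  # True
print(serve_full_article("Mozilla/5.0", ""))                                   # False
```

That last case is the one you saw: a direct visit with no search referrer gets the subscription prompt, and the resort link never appears in the served source.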
Intermediate & Advanced SEO | CustardOnlineMarketing
-
Does Google Read URLs If They Include a # Fragment? Re: SEO Value of Clean URLs
An ECWID rep, responding to an inquiry about why ECWID URLs are not customizable, stated that "an important thing is that it doesn't matter what these URLs look like, because search engines don't read anything after that # in URLs." Example: http://www.runningboards4less.com/general-motors#!/Classic-Pro-Series-Extruded-2/p/28043025/category=6593891 — basically all of this: #!/Classic-Pro-Series-Extruded-2/p/28043025/category=6593891. That is a snippet from a conversation where ECWID said that dirty URLs don't matter beyond a hash. Is that true? I haven't found any rule saying that Google or other search engines (Google being the most important) don't index, read, or place value on the part of the URL after a # fragment.
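It's easy to see why the fragment is usually invisible to servers: browsers strip everything after the # before building the HTTP request, so only the path and query string are sent. A quick check with Python's standard library, using the URL from the question:

```python
from urllib.parse import urlsplit

url = ("http://www.runningboards4less.com/general-motors"
       "#!/Classic-Pro-Series-Extruded-2/p/28043025/category=6593891")

parts = urlsplit(url)
print(parts.path)      # -> /general-motors
print(parts.fragment)  # -> !/Classic-Pro-Series-Extruded-2/p/28043025/category=6593891

# The request line a browser sends contains only the path (plus any query);
# the fragment stays on the client side.
request_target = parts.path or "/"
print(request_target)  # -> /general-motors
```

As an aside, the `#!` "hashbang" form was a special case under Google's (now deprecated) AJAX crawling scheme, where Google would request an `_escaped_fragment_=` version of the URL, so hashbang URLs could in fact be crawled as distinct pages.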
Intermediate & Advanced SEO | Atlanta-SMO
-
How does Google know if a backlink is good or not?
Hi. What does Google look at when assessing a backlink? How important is it to get a backlink from a website with relevant content? For example: 1. Domain/Page Authority 80, website not relevant: it does not use any of the words in your target term anywhere on the site. 2. Domain/Page Authority 40, website relevant: it uses the words in your target term multiple times across the site. Which example would benefit your SERPs more if you gained a backlink from it? (And, if you can say, how much more: low, medium, or high?)
Intermediate & Advanced SEO | activitysuper
-
Custom Attributes in Google Places
Hi guys. I'm looking for some clarity on what I can and can't add to the custom attribute fields in a Google Places listing. From my understanding, you can add additional information about your services, but not what those services are. The issue I'm trying to resolve is that a client of mine offers far more than the 5 service/category options Places allows. They are a home-services company, covering everything from plumbing and painting and decorating through to extensions; they have about 25 different services. At the moment I'm restricted to getting rankings for just 5 services (correlated to the categories in Places), when I'd like to rank locally for them all. As Google shows local results for most search queries related to their services, whether or not those searches are geographically modified, even if I rank top 5 organically for the terms I'm still at the bottom of page 1 or the top of page 2. Would it be wise to add these additional services to the custom attributes section of the Places listing, or could that get the listing suspended? Any ideas on how to combat this problem would be very welcome.
Intermediate & Advanced SEO | PerchDigital