Content Rendering by Googlebot vs. Visitor
-
Hi Moz!
Following another question on here, I tried Fetch as Google to compare what the bot sees with what a visitor sees, and to check whether Google finds the written content on my page.
The two versions are quite different: Googlebot isn't rendering the product listings or content at all, only what appears to be the info in the top navigation. I'm guessing this is a massive issue?
Help
Becky
-
Yeah, I have just seen a few ranking drops, so I'm now a little concerned.
Thanks for your advice!
-
That's great!
I regularly see category pages on ecommerce sites fail to render all their images in Fetch and Render, and I haven't been able to figure out why yet. The tool might just have a limit on the number of thumbnails it displays.
-
Thanks Logan,
I have done this and am seeing a much better result in Fetch and Render.
On one of my pages (http://www.key.co.uk/en/key/dollies-load-movers-door-skates), for example, it is still not rendering all the images, only the first two. Is there anything in particular I should look at?
I've attached a screenshot.
Thanks for your help
-
Yes, you should allow Googlebot to crawl all style-related files, and JS as well. Google wants to be able to render a page the same way a person would see it. Part of the reason is determining a site's mobile-friendliness. I would assume they also want to make general UX assessments of sites, since they're putting much more emphasis on the user journey and task completion.
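For illustration, the fix usually amounts to a couple of Allow rules in robots.txt so Googlebot can fetch the assets that drive layout. This is only a sketch with hypothetical paths; check it against your site's actual structure, and make sure no longer, more specific Disallow rule overrides these:

```
User-agent: Googlebot
Allow: /*.css$
Allow: /*.js$
Allow: /assets/
```

Google resolves conflicts between Allow and Disallow by the most specific (longest) matching rule, so a broad `Disallow: /assets/` elsewhere can still be overridden by the `.css$` and `.js$` rules above.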
-
In Fetch and Render in Search Console, there are usually notifications below the renderings that explain why there might be discrepancies. Your robots.txt file may be preventing Google from accessing important CSS (or other) files that drive layout. Check there before you dig much deeper; it might be a simple robots.txt update that you need.
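As a quick sanity check (the domain and paths below are hypothetical, not taken from any real robots.txt), you can test a robots.txt rule against Googlebot's user agent with Python's standard-library robotparser:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt that blocks an asset directory Googlebot needs
robots_txt = """\
User-agent: *
Disallow: /assets/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# Googlebot can't fetch the stylesheet, so Fetch and Render will differ
print(rp.can_fetch("Googlebot", "http://www.example.com/assets/style.css"))  # False
print(rp.can_fetch("Googlebot", "http://www.example.com/en/category-page"))  # True
```

If the stylesheet check comes back False for your real robots.txt, that blocked directory is the first thing to fix.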
-
Hi Becky,
You should fix the issue in any case; whether the page is ranking or not, it's a risk.
Try to fix all the issues that Google shows you.
Regards,
Vijay
-
Hi
The weird thing is that the page I checked does rank quite well, so I'm not sure what to make of it?
-
Hi Becky,
This can be a major issue; the Fetch as Google feature was introduced to show exactly what Google's crawler sees on your page.
Websites often use complex JavaScript (JSON, jQuery, AngularJS, etc.), and these scripts can render the page's content late, or in a different way than the crawler expects.
Work with your developer to get it fixed. I have seen many beautiful websites fail to rank because of this error.
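To illustrate the gap (a contrived sketch; the markup and product copy here are made up), compare the raw HTML a crawler downloads with the DOM that exists only after JavaScript runs:

```python
# Raw HTML as served: the product container is empty until a script fills it
raw_html = '<div id="products"></div><script src="/assets/products.js"></script>'

# DOM after the browser executes the script and injects the listings
rendered_html = '<div id="products"><p>Door skate, 500kg capacity</p></div>'

# A crawler that can't run (or can't fetch) the script indexes the raw version,
# so the written content effectively doesn't exist for it
print("Door skate" in raw_html)       # False
print("Door skate" in rendered_html)  # True
```

This is why the Fetch as Google rendering can show a near-empty page even though visitors see full product listings.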
I hope this helps, feel free to ask further questions.
Regards,
Vijay