No content using Fetch
-
Woah, this one makes me feel a bit nervous.
The cached version of the site homepage shows all the text, but I understand that is the HTML as constructed by the browser, so I get that.
If I Google some of the content, it is there in the index, and the cached version is from yesterday.
If I Fetch and Render in GWT, none of the content is available in the preview - neither the Googlebot view nor the visitor view. The whole preview is just the menu, a holding image for a video, and a tag line for it. There are no reports of blocked resources apart from a Wistia URL. How can I decipher what is blocking Google if it does not report any problems?
The markup is visible in the source for reference, for example:
<section class="text-within-lines big-text narrow">
class="data"> some content...
Ranking is a real issue, caused in part by a poorly functioning main menu. But I'm really concerned about what is happening with the render.
-
I got there in the end. They have a Wistia video loading on the homepage, but Wistia's robots blocks this resource. When the resource is blocked, the CSS loads a holding image instead. However, this image is configured to fill the whole page, so when Googlebot crawls it cannot render anything beyond this image and the area defined in the CSS. Dev is fixing it.
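For anyone who hits something similar, here is a minimal sketch of the kind of CSS that can cause it - the selector, image name, and dimensions are hypothetical, not the site's actual code. A placeholder pinned to the full viewport hides everything behind it, so when the video asset is blocked the rendered preview shows only that one element:

/* Hypothetical illustration - not the site's actual stylesheet. */
/* A full-viewport holding image like this masks the whole page
   when the video resource fails to load: */
.video-placeholder {
  position: fixed;                /* pinned over the page */
  top: 0;
  left: 0;
  width: 100vw;                   /* fills the entire viewport */
  height: 100vh;
  background: url("holding.jpg") center / cover no-repeat;
  z-index: 999;                   /* sits above the page content */
}

/* The fix is to constrain the placeholder to the video's own
   container, so the rest of the page still renders: */
.video-placeholder {
  position: relative;
  width: 100%;
  max-height: 480px;              /* hypothetical height */
}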
-
Hi Michael,
We have faced similar issues; the main reason in our case was dynamic content being generated by jQuery, Ajax, etc. You will need a good developer to help you through this if that is the case - a quick sketch of what I mean is below.
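This is just an illustration, and the endpoint and selector names are invented, not from any real site. Content injected client-side like this only exists after the JavaScript runs, so the raw HTML a crawler fetches can be almost empty:

// Hypothetical illustration - endpoint and selectors are invented.
// The initial HTML ships only an empty container:
//   <div id="page-copy"></div>
// The visible text appears only after this Ajax call completes,
// so a crawler that does not execute JavaScript sees an empty page.
$(function () {
  $.get('/api/page-content', function (data) {
    $('#page-copy').html(data.body);
  });
});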
Also, we consider Google's Fetch and Render the best reference for how Googlebot will see us.
I hope this helps.
Regards,
Vijay