Fetch & Render
-
Hi
I've done Google fetch & render of this page & I have images which Google/customers aren't seeing - how do I identify the problems with this page? http://www.key.co.uk/en/key/chairs
-
They are actually upgrading our platform in the next couple of months, so I will be pushing this.
Are there any areas (apart from TTFB) I should push? This is really not my area so I'm a bit lost with where to start.
I know you mentioned the caching issue - I'll look for your comments and try to review this.
Thank you
-
OK, I ran a test of your website and didn't notice anything wrong until I ran a third-party test.
That's when I saw the error: both Pingdom Tools and PageSpeed had problems rendering your site or even giving it a score, mainly because the site is not optimized, so those systems don't have enough time to capture it.
According to Pingdom, the page size is 2 MB, which is a little overweight but not a big deal. On the other hand, the page generates 232 requests to the server, which is too many, and about 50% of that weight comes from scripts.
These are the recommendations according to Google PageSpeed:
- Eliminate render-blocking JavaScript and CSS in above-the-fold content
- Leverage browser caching
- Reduce server response time
- Optimize images
- Minify HTML
- Minify JavaScript
- Minify CSS
So my advice is to carry out a performance optimization.
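The request-count and weight figures above can be reproduced locally by exporting a HAR file from the browser's Network tab and totalling entries by resource type. A minimal sketch of that tally (the resource list below is hypothetical, sized to roughly match the numbers quoted above):

```python
# Summarize page weight by resource type, the way a waterfall tool
# (Pingdom, WebPageTest) does. Input mimics entries from a HAR export.
def audit(resources):
    total_bytes = sum(size for _, size in resources)
    by_type = {}
    for rtype, size in resources:
        by_type[rtype] = by_type.get(rtype, 0) + size
    return {
        "requests": len(resources),
        "total_kb": round(total_bytes / 1024, 1),
        "share": {t: round(s / total_bytes * 100) for t, s in by_type.items()},
    }

# Hypothetical breakdown: ~2 MB across 232 requests,
# with scripts carrying about half the weight.
resources = (
    [("script", 6500)] * 160 +   # JS files
    [("image", 11000)] * 50 +    # images
    [("css", 9000)] * 12 +       # stylesheets
    [("html", 41000)] * 10       # documents/other
)

report = audit(resources)
print(report["requests"], report["total_kb"], report["share"]["script"])
```

A report like this makes it easy to show the platform team exactly where the weight is before and after the upgrade.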
-
There are images which customers and Google can't see, and which aren't loading in the fetch & render - I want to find out why.
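One common cause of images missing from a fetch & render is that the image paths (or the CDN serving them) are disallowed for Googlebot in robots.txt — the page renders, but the blocked assets come back blank. That can be checked programmatically; a sketch using only the standard library, with a hypothetical robots.txt and image path:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt that blocks an image directory from crawlers.
robots_txt = """\
User-agent: *
Disallow: /media/private/
Allow: /
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Any image under the blocked path renders as missing for Googlebot.
blocked = not parser.can_fetch("Googlebot",
                               "http://www.key.co.uk/media/private/chair.jpg")
visible = parser.can_fetch("Googlebot",
                           "http://www.key.co.uk/en/key/chairs")
print(blocked, visible)
```

Running every image URL from the page through a check like this quickly separates "blocked from crawling" from "too slow to load before the render snapshot".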
-
Can you explain the problem in more detail?
Related Questions
-
Menus, Ecommerce & SEO
Hi, our dev team have updated our website with a new menu structure, and they have given us 2 options to choose from. Option 1, which I think is better for SEO, shows the top 8 categories and then subcategories once you hover over a category. This is not much change from our current structure, just a slightly different layout (I have added an image example of what option 1 will look like). Option 2, preferred by management, shows all 24 categories and no subcategories. My question is: will removing the current subcategories from the main menu make them lose rankings and make them harder to rank in future? I'm guessing everything will move down a level in the structure and lose page authority... Does anyone have any articles/case studies to prove this point? Any help is much appreciated 🙂 Becky
Intermediate & Advanced SEO | | BeckyKey1 -
Paragraphs/Tables for Content & SEO
Hi Does anyone know if Google prefers paragraphs over content in a table, or doesn't it make much difference?
Intermediate & Advanced SEO | | BeckyKey0 -
Help with Schema & what's considered "Spammy structured markup"
Hello all! I was wondering if someone with a good understanding of schema markup could please answer my question about its correct use, so I can correct a penalty I just received. My website is using the following schema markup for our reviews, and today I received this message in my Search Console. UGH...
Intermediate & Advanced SEO | | reversedotmortgage
Manual Actions: "This site may not perform as well in Google results because it appears to be in violation of Google's Webmaster Guidelines. Site-wide matches - some manual actions apply to the entire site. Reason: Spammy structured markup. Markup on some pages on this site appears to use techniques such as marking up content that is invisible to users, marking up irrelevant or misleading content, and/or other manipulative behavior that violates Google's Rich Snippet Quality guidelines."
I have used the Webmaster rich snippets tool, but everything checks out. The only thing I could think of is my schema tag for "product", rather than using a company-like tag (https://schema.org/Corporation). We are a mortgage company, and what we sell is a mortgage, so I assumed "product" would be appropriate. Could that even be the issue? I checked another site that uses similar markup and they don't seem to have any problems in SERPs: http://www.fha.com/fha_reverse shows stars and they call their reviews "store". Or could it be that I added my reviews to my footer so that each of my pages would have a chance at displaying my stars? All our reviews are independently verified and we just want to showcase them. I greatly appreciate the feedback and had no intentions of abusing the markup. From my site: All Reverse Mortgage - 4.9 out of 5 - 301 Verified Customer Reviews from eKomi (https://www.ekomi-us.com/review-reverse.mortgage.html)1 -
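For reference, the usual fix for this class of penalty is to attach the AggregateRating to the reviewed entity itself rather than to a generic Product, and to emit it only on the page where the reviews actually appear (not site-wide in the footer). A sketch of such JSON-LD built in Python, using the figures quoted in the post; the `FinancialService` type is a suggestion, not something from the original markup:

```python
import json

# Sketch: review markup hung on the organization (not a Product),
# using the rating figures quoted in the question above.
markup = {
    "@context": "https://schema.org",
    "@type": "FinancialService",   # arguably a closer fit than Product
    "name": "All Reverse Mortgage",
    "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": "4.9",
        "bestRating": "5",
        "reviewCount": "301",
    },
}

# Embed once, on the reviews page only -- repeating the block in the
# footer of every page is what reads as manipulative to Google.
snippet = ('<script type="application/ld+json">'
           + json.dumps(markup) + "</script>")
print(snippet[:60])
```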
What is Google supposed to return when you submit an image URL into Fetch as Google? Is a few lines of readable text followed by lots of unreadable text normal?
I am seeing something like this (Is this normal?): HTTP/1.1 200 OK
Intermediate & Advanced SEO | | Autoboof
Server: nginx
Content-Type: image/jpeg
X-Content-Type-Options: nosniff
Last-Modified: Fri, 13 Nov 2015 15:23:04 GMT
Cache-Control: max-age=1209600
Expires: Fri, 27 Nov 2015 15:23:55 GMT
X-Request-ID: v-8dd8519e-8a1a-11e5-a595-12313d18b975
X-AH-Environment: prod
Content-Length: 25505
Accept-Ranges: bytes
Date: Fri, 13 Nov 2015 15:24:11 GMT
X-Varnish: 863978362 863966195
Age: 16
Via: 1.1 varnish
Connection: keep-alive
X-Cache: HIT
X-Cache-Hits: 1
[Unreadable binary JPEG data follows: a JFIF header containing "CREATOR: gd-jpeg v1.0 (using IJG JPEG v80), quality = 75", then the raw image bytes.]0 -
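What's shown above is expected: Fetch as Google returns the raw HTTP response, so for an image you get readable headers followed by the JPEG's own bytes. A sketch that splits such a response and verifies the body really is a JPEG (the sample bytes below are made up, not the actual response):

```python
# A fetched image response is headers + blank line + raw bytes.
# JPEG files start with the SOI marker FF D8 FF.
sample = (b"HTTP/1.1 200 OK\r\n"
          b"Content-Type: image/jpeg\r\n"
          b"Content-Length: 25505\r\n"
          b"\r\n"
          b"\xff\xd8\xff\xe0\x00\x10JFIF...")  # truncated body

head, _, body = sample.partition(b"\r\n\r\n")
headers = dict(
    line.split(b": ", 1) for line in head.split(b"\r\n")[1:]
)

is_jpeg = (headers[b"Content-Type"] == b"image/jpeg"
           and body.startswith(b"\xff\xd8\xff"))
print(is_jpeg)
```

If the magic bytes and Content-Type line up like this, the "unreadable text" is simply the image data and nothing is wrong.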
Ajax Module Crawlability vs. WMT Fetch & Render
Recently a module was built into the homepage to pull in content from an outside source via Ajax, and I'm curious about the overall crawlability of that content. In WMT, if I fetch & render the page it displays correctly, but if I view source all I am seeing is the empty container. Should I take additional steps so that the actual Ajax content appears in my source code, or am I "good" since the content does display correctly when I fetch & render?
Intermediate & Advanced SEO | | RosemarieReed0 -
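One way to quantify the gap between source and rendered content is to parse the raw HTML and check whether the module's container is empty, since that is exactly what a non-rendering crawler sees. A standard-library sketch, with a hypothetical container id:

```python
from html.parser import HTMLParser

# Detect containers that are empty in the raw HTML source -- content
# injected later by Ajax is invisible to anything that doesn't run JS.
class EmptyContainerFinder(HTMLParser):
    def __init__(self, target_id):
        super().__init__()
        self.target_id = target_id
        self.inside = False
        self.has_content = False

    def handle_starttag(self, tag, attrs):
        if dict(attrs).get("id") == self.target_id:
            self.inside = True
        elif self.inside:
            self.has_content = True   # child element found

    def handle_data(self, data):
        if self.inside and data.strip():
            self.has_content = True   # text content found

# Hypothetical view-source: the module div ships empty.
raw_html = '<body><div id="news-module"></div></body>'

finder = EmptyContainerFinder("news-module")
finder.feed(raw_html)
print(finder.has_content)   # False -> crawlers see nothing here
```

If the check comes back empty, server-side rendering or pre-rendering the module is the safer route than relying on Google's renderer alone.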
Optimizing Product Catalogs for Multiple Brick & Mortar Locations
We're working on a project for a retail client who has multiple (5+) brick and mortar store locations in a given geographical area. They're regional, so they have locations in multiple states. We're optimizing their content (coupons, events, products, etc) across their site, but we're running into the issue of ranking well for specific products in one location, but not as well (or not at all) in others. The keywords we would like to rank for generally aren't super competitive, we're dealing with commodity products in local retail markets, so in most cases, good on page optimization is enough to rank in the top couple results. Our current situation: (specific examples are fictitious but representative) Title: My Company | Dogwood Trees - Fredericksburg, VA, Rocky Mt, NC, Rock Hill, SC…
Intermediate & Advanced SEO | | cballinger
Url: http://mycompany.com/catalog/product/dogwood-trees The content on the page is generally well optimized. We've claimed all the locations in Google Places and we've deployed schema.org markup for each location that carries the item on the product page. We have specific location pages that rank well for Company Name or Company Name Location, but the actual goal is to have the product page come up in each location. In the example above, we would rank #1 for "Dogwood Trees Fredericksburg VA" or just "Dogwood Trees" if the searcher is in or around Fredericksburg, on the first page for "Dogwood Trees Rocky Mt, NC", but not at all for any other locations. As these aren't heavily linked-to pages, this indicates the title tag + on-page content is probably our primary ranking factor, so as Google cuts the keyword relevance at the tail of the title tag, the location keywords stop helping us. What is the proper way to do this? A proposed solution we're discussing is subfolder-ing all the locations for specific location-related content. For example: My Company | Dogwood Trees - Fredericksburg, VA, Rocky Mt, NC, Rock Hill, SC…http://mycompany.com/catalog/product/dogwood-trees Becomes: My Company | Dogwood Trees - Fredericksburg, VA
http://mycompany.com/fredericksburg-va/product/dogwood-trees My Company | Dogwood Trees - Rocky Mt, NC
http://mycompany.com/rocky-mt-nc/product/dogwood-trees My Company | Dogwood Trees - Rock Hill, SC
http://mycompany.com/rock-hill-sc/product/dogwood-trees Of course, this is the definition of duplicate content, which concerns me, is there a "Google approved" way to actually do this? It's the same exact tree being sold from the same company in multiple locations. Google is essentially allowing us to rank well for whichever location we put first in the title tag, but not the others. Logically, it makes complete sense that a consumer in Rock Hill, SC should have the same opportunity to find the product as one in Fredericksburg, VA. In these markets, the client is probably one of maybe three possible merchants for this product within 20 miles. As I said, it's not highly competitive, they just need to show up. Any thoughts or best practices on this would be much appreciated!2 -
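If the subfolder route is taken, each location page should either carry genuinely localized content or declare a canonical back to the main product URL so the near-duplicates don't compete with each other. A sketch of generating those URLs from the question's example (the helper name and structure are illustrative, not an established pattern):

```python
# Build per-location product URLs plus the canonical they can point to.
def location_urls(base, product_slug, locations):
    canonical = f"{base}/catalog/product/{product_slug}"
    pages = {
        loc: f"{base}/{loc}/product/{product_slug}" for loc in locations
    }
    return canonical, pages

canonical, pages = location_urls(
    "http://mycompany.com",
    "dogwood-trees",
    ["fredericksburg-va", "rocky-mt-nc", "rock-hill-sc"],
)
print(canonical)
print(pages["rock-hill-sc"])
```

With the canonical in place, the location pages can still target their local keywords in titles and copy without tripping duplicate-content filters.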
Penguin & Panda: Geographic Penalties?
Has anyone ever come across information about a website appearing strongly in SERPs in one region, but poorly in another (i.e. great in Europe, not so great in N. America)? If so, could it be a Panda or Penguin issue?
Intermediate & Advanced SEO | | Prospector-Plastics0 -
How to implement catch-all 301 redirects & keep separate 301 redirects for the key pages
Hi, we've done a site migration, mapping and 301 redirecting only the site's key pages. However, GWT (Google Webmaster Tools) is now picking up a massive number of 404 errors, and there has been some drop in rankings. I want to stop the site from declining further, and so thought about doing a catch-all 301 - that is, 301 redirecting the remaining pages found on the old site back to the home page, with the future aim of going through each URL one by one to redirect it to the most relevant page. Two questions: (1) Can I do a catch-all 301, and if so, what is the process and what requirements do I have to give to the developer? (2) How do you reduce the number of increasing 404 errors from a site, despite doing 301 redirects and updating links on external linking sites? Note: The server is Apache and the site is hosted on the WordPress platform. Regards, Vahe
Intermediate & Advanced SEO | | Vahe.Arabian0
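The "specific map plus catch-all" approach can be expressed as a lookup with a fallback, which is also a useful spec to hand the developer (the URL pairs below are hypothetical). One caveat worth passing along: Google tends to treat mass redirects to the homepage much like soft 404s, so the one-by-one remapping should follow quickly.

```python
# Redirect resolver: specific 301 mappings first, homepage catch-all last.
REDIRECT_MAP = {
    "/old-shop/chairs": "/en/key/chairs",   # hypothetical key-page mappings
    "/old-shop/desks": "/en/key/desks",
}
HOMEPAGE = "/"

def resolve_301(old_path):
    # A specific mapping wins; anything unmapped falls back to the homepage.
    return REDIRECT_MAP.get(old_path, HOMEPAGE)

print(resolve_301("/old-shop/chairs"))   # mapped -> /en/key/chairs
print(resolve_301("/old-shop/unknown"))  # unmapped -> catch-all to /
```

On Apache the same logic is typically a list of `Redirect 301` lines for the mapped pages followed by a catch-all rewrite rule, in that order, since the first matching rule wins.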