Fetch & Render
-
Hi
I've done a Google fetch & render of this page, and there are images which Google and customers aren't seeing. How do I identify the problems with this page? http://www.key.co.uk/en/key/chairs
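A common reason images appear in the browser but not in fetch & render is that robots.txt blocks the image paths (or the CDN serving them), so Googlebot never fetches them. This can be checked with Python's standard-library robots.txt parser. A minimal sketch, where the Disallow rule and image URL are hypothetical examples, not taken from key.co.uk's actual robots.txt:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content -- fetch the real file from
# http://www.key.co.uk/robots.txt and substitute its rules here.
robots_lines = [
    "User-agent: *",
    "Disallow: /media/",
]

rp = RobotFileParser()
rp.parse(robots_lines)

# A hypothetical product-image URL. Googlebot obeys these rules, so a
# disallowed path would explain images missing from fetch & render.
image_url = "http://www.key.co.uk/media/chairs/product1.jpg"
print(rp.can_fetch("Googlebot", image_url))  # False -> blocked for Googlebot
```

If this returns False for your real image URLs, the fix is to allow the image directories in robots.txt; the "blocked resources" list in the fetch & render report should show the same URLs.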
-
They are actually upgrading our platform in the next couple of months, so I will be pushing this.
Are there any areas (apart from TTFB) I should push for? This really isn't my area, so I'm a bit lost as to where to start.
I know you mentioned the caching issue - I'll look for your comments and try to review this.
Thank you
-
OK, I ran a test of your website and didn't notice anything wrong until I ran a third-party test.
That's when I saw the error: both Pingdom Tools and PageSpeed Insights had problems rendering your site, or even giving it a score, mainly because the site is not optimized, so those tools don't have enough time to capture it.
According to Pingdom, the page size is 2 MB, which is slightly overweight but not a big deal. On the other hand, the page generates 232 requests to the server, which is far too many, and scripts account for about 50% of that weight.
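You can reproduce this breakdown yourself by exporting a HAR file from Chrome DevTools (Network tab, "Save all as HAR") and tallying bytes per resource type. A rough sketch; the `_resourceType` field is a Chrome-specific extension to the HAR spec, and the filename is hypothetical:

```python
import json
from collections import Counter

def weight_by_type(har):
    """Tally response body bytes per resource type from a parsed HAR file."""
    sizes = Counter()
    for entry in har["log"]["entries"]:
        # `_resourceType` is Chrome-specific; fall back to "other" if absent.
        kind = entry.get("_resourceType", "other")
        sizes[kind] += max(entry["response"].get("bodySize", 0), 0)
    return sizes

# Usage, with a real export of the chairs page:
#   har = json.load(open("key-chairs.har"))
#   print(weight_by_type(har).most_common())
```

Sorting the result shows at a glance whether scripts really dominate the page weight, and which requests to target first.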
These are the recommendations from Google PageSpeed Insights:
- Eliminate render-blocking JavaScript and CSS in above-the-fold content
- Leverage browser caching
- Reduce server response time
- Optimize images
- Minify HTML
- Minify JavaScript
- Minify CSS
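For the "Leverage browser caching" item, a quick way to judge any response is to look at its Cache-Control header. A simplified heuristic sketch; real caching behaviour also involves Expires, ETag, and Vary headers, which this ignores:

```python
def cache_lifetime(headers):
    """Rough heuristic: seconds a browser may cache a response,
    judging only by its Cache-Control header."""
    directives = [d.strip().lower()
                  for d in headers.get("Cache-Control", "").split(",")]
    if "no-store" in directives or "no-cache" in directives:
        return 0
    for d in directives:
        if d.startswith("max-age="):
            return int(d.split("=", 1)[1])
    return 0  # no explicit lifetime -> treat as uncacheable

# Static assets (images, CSS, JS) should report a long lifetime:
print(cache_lifetime({"Cache-Control": "public, max-age=31536000"}))  # 31536000
print(cache_lifetime({"Cache-Control": "no-cache"}))                  # 0
```

If static assets on the site come back with a lifetime of 0, that matches the caching issue mentioned earlier in this thread and is usually a one-line fix in the server or CDN configuration.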
So my advice is to run a full performance optimization.
-
There are images which customers and Google can't see, and which aren't loading in the fetch & render. I want to find out why.
-
Can you explain in more detail what the problem is?