Fetch & Render
-
Hi
I've done a Google Fetch & Render of this page, and there are images that Google and customers aren't seeing. How do I identify the problems with this page? http://www.key.co.uk/en/key/chairs
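One common reason images are missing from Fetch & Render is that their paths (or a separate image/CDN host) are disallowed in robots.txt. A quick, scriptable sanity check is to replay the site's robots rules against the failing image URLs. A minimal Python sketch — the rules and image paths below are hypothetical placeholders, and note that the stdlib parser does not support Google's wildcard syntax:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt rules -- paste in the site's real file instead.
robots_txt = """User-agent: *
Disallow: /media/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# URLs of images that failed to render (hypothetical paths for illustration).
resources = [
    "http://www.key.co.uk/media/chair.jpg",
    "http://www.key.co.uk/img/chair.jpg",
]

for url in resources:
    ok = parser.can_fetch("Googlebot", url)
    print(url, "allowed" if ok else "BLOCKED by robots.txt")
```

Search Console's robots.txt Tester and blocked-resources report give the authoritative answer; this just makes the check repeatable across many URLs at once.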
-
They are actually upgrading our platform in the next couple of months, so I will be pushing this.
Are there any areas (apart from TTFB) I should push? This is really not my area, so I'm a bit lost as to where to start.
I know you mentioned the caching issue - I'll look for your comments and try to review this.
Thank you
-
OK, I ran a test of your website and didn't notice anything wrong until I ran a third-party test.
Then I saw the error: both Pingdom Tools and PageSpeed had problems rendering your site or even giving it a score, mainly because the site is not optimized, so those systems don't have enough time to capture it.
According to Pingdom, the page size is 2 MB, which is a little overweight but not a big deal. On the other hand, it generates 232 requests to the server, which is too many, and about 50% of that weight comes from scripts.
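If you want to reproduce that request-count/weight breakdown yourself, you can export a HAR file from Pingdom or the browser's DevTools and tally bytes per resource type. A rough sketch — the tiny inline HAR below is synthetic, just to show the shape; load a real export with `json.load` instead:

```python
from collections import Counter

def weight_by_type(har):
    """Sum response bytes and request counts per MIME-type group from a HAR dict."""
    sizes, counts = Counter(), Counter()
    for entry in har["log"]["entries"]:
        content = entry["response"]["content"]
        mime = content.get("mimeType", "")
        group = mime.split("/")[0] or "other"      # e.g. "image", "text"
        if "javascript" in mime:
            group = "script"                        # group all JS together
        sizes[group] += content.get("size", 0)
        counts[group] += 1
    return sizes, counts

# Synthetic three-entry HAR for illustration only.
har = {"log": {"entries": [
    {"response": {"content": {"mimeType": "application/javascript", "size": 500_000}}},
    {"response": {"content": {"mimeType": "image/jpeg", "size": 300_000}}},
    {"response": {"content": {"mimeType": "text/html", "size": 50_000}}},
]}}

sizes, counts = weight_by_type(har)
total = sum(sizes.values())
for group, size in sizes.most_common():
    print(f"{group:8s} {counts[group]:3d} requests  {size / total:5.1%} of weight")
```

Running this against the real 232-request page would show exactly which resource types dominate the 2 MB.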
These are the recommendations from Google PageSpeed Insights:
- Eliminate render-blocking JavaScript and CSS in above-the-fold content
- Leverage browser caching
- Reduce server response time
- Optimize images
- Minify HTML
- Minify JavaScript
- Minify CSS
So my advice is to do a performance optimization pass.
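For the "leverage browser caching" item specifically, the fix is usually serving static assets with a long `Cache-Control: max-age`. A small sketch of how you might audit response headers for that — the thresholds and header sets here are illustrative assumptions, not from the site itself:

```python
import re

def cache_audit(headers: dict) -> str:
    """Classify a response's caching policy from its headers (simplified)."""
    cc = headers.get("Cache-Control", "")
    if "no-store" in cc or "no-cache" in cc:
        return "not cached"
    m = re.search(r"max-age=(\d+)", cc)
    if m:
        # Treat anything >= 1 day as a reasonable static-asset policy.
        return "long cache" if int(m.group(1)) >= 86_400 else "short cache"
    return "no policy"

# Hypothetical header sets for three assets.
print(cache_audit({"Cache-Control": "max-age=31536000, immutable"}))  # → long cache
print(cache_audit({"Cache-Control": "no-cache"}))                     # → not cached
print(cache_audit({}))                                                # → no policy
```

Assets flagged "no policy" or "short cache" are the ones PageSpeed's browser-caching rule is complaining about.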
-
There are images that customers and Google can't see, and they aren't loading in the Fetch & Render. I want to find out why.
-
Can you explain the problem in more detail?