Fetch & Render
-
Hi
I've run a Google Fetch & Render on this page, and there are images that Google and customers aren't seeing. How do I identify the problems with this page? http://www.key.co.uk/en/key/chairs
-
They are actually upgrading our platform in the next couple of months, so I will be pushing this.
Are there any other areas (apart from TTFB) I should push for? This is really not my area, so I'm a bit lost as to where to start.
I know you mentioned the caching issue - I'll look for your comments and try to review this.
Thank you
-
OK, I ran a test of your website and didn't notice anything wrong until I ran a third-party test.
That's when I saw the problem: both Pingdom Tools and PageSpeed struggled to render your site or even give it a score, mainly because the site is not optimized, so the tools don't have enough time to capture it.
According to Pingdom, the page size is 2 MB, which is a little heavy but not a big deal. On the other hand, it generates 232 requests to the server, which is far too many, and about 50% of that weight comes from scripts.
These are the recommendations according to Google PageSpeed:
- Eliminate render-blocking JavaScript and CSS in above-the-fold content
- Leverage browser caching
- Reduce server response time
- Optimize images
- Minify HTML
- Minify JavaScript
- Minify CSS
So my advice is to do a performance-optimization pass.
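To make the "Leverage browser caching" item concrete, here is a minimal sketch in Python. It is only an illustration with made-up cache lifetimes, not Key's actual stack; on a real platform these headers would normally be set in the web-server or CDN configuration rather than in an application process.

```python
from http.server import HTTPServer, SimpleHTTPRequestHandler

# Hypothetical cache policies per asset type; the real values belong in the
# web-server/CDN config, this just shows the headers browsers need to cache.
CACHE_RULES = {
    ".css": "public, max-age=604800",    # 1 week
    ".js":  "public, max-age=604800",
    ".jpg": "public, max-age=2592000",   # 30 days
    ".png": "public, max-age=2592000",
}

class CachingHandler(SimpleHTTPRequestHandler):
    def end_headers(self):
        # Attach a Cache-Control header to matching static assets
        path = self.path.lower().split("?")[0]
        for ext, policy in CACHE_RULES.items():
            if path.endswith(ext):
                self.send_header("Cache-Control", policy)
                break
        super().end_headers()

if __name__ == "__main__":
    HTTPServer(("", 8000), CachingHandler).serve_forever()
```

The minify and render-blocking items are handled separately, e.g. minifying bundles at build time and loading non-critical scripts with defer/async so they stop blocking above-the-fold rendering.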
-
There are images which customers and Google can't see and which aren't loading in the fetch & render - I want to find out why.
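A common cause of images missing from Fetch & Render is that the image URLs (or the CSS/JS that loads them) are disallowed in robots.txt for Googlebot. As a rough first check, assuming Python and placeholder resource URLs copied from the chairs page source, something like this would flag blocked resources:

```python
from urllib.robotparser import RobotFileParser

# Placeholder resource URLs - swap in the real image/CSS/JS URLs
# taken from the page source of /en/key/chairs.
RESOURCE_URLS = [
    "http://www.key.co.uk/medias/example-chair-image.jpg",
    "http://www.key.co.uk/example/styles.css",
]

parser = RobotFileParser()
parser.set_url("http://www.key.co.uk/robots.txt")
parser.read()

for url in RESOURCE_URLS:
    status = "allowed" if parser.can_fetch("Googlebot", url) else "BLOCKED"
    print(f"{status:8} {url}")
```

If nothing is blocked, the next suspect is images injected by JavaScript that fails or times out during rendering, which ties back to the script weight and slow response times noted above.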
-
Can you explain in more detail what the problem is?
Related Questions
-
Heading Tags & Content Count
Hi everyone, I am looking into this page on our site: http://www.key.co.uk/en/key/sack-trucks
Comparing it against competitors in SEMrush, the tool shows a word count for this page of over 4,089 words, compared with http://www.wickes.co.uk/Wickes-Green-General-Purpose-Sack-Truck-200kg/p/500302, which only has 2,658 - yet that page has a lot more written content than ours. Where is this word count coming from?
Also, looking at the same page on our site, WooRank suggests we have the word 'sack truck' in the H1 and title too many times - it's only there once, so is this showing because it's an exact-match keyword? I'm just wondering if there is something wrong with the HTML or with how the page is being crawled.
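Tools like SEMrush typically count every word in the raw HTML - navigation, footer, filter labels and product-listing text included - not just the editorial copy. As a rough way to see what is being counted, here is a minimal sketch (assuming Python, and only approximating what a crawler sees) that strips the tags and counts whatever text remains:

```python
import re
import urllib.request

def rough_word_count(url: str) -> int:
    """Approximate the word count a crawler sees in the raw HTML."""
    html = urllib.request.urlopen(url, timeout=30).read().decode("utf-8", errors="ignore")
    # Drop script/style blocks, then every remaining tag
    html = re.sub(r"(?is)<(script|style)\b.*?</\1>", " ", html)
    text = re.sub(r"(?s)<[^>]+>", " ", html)
    return len(text.split())

for url in (
    "http://www.key.co.uk/en/key/sack-trucks",
    "http://www.wickes.co.uk/Wickes-Green-General-Purpose-Sack-Truck-200kg/p/500302",
):
    print(rough_word_count(url), url)
```

Comparing that total against a count of just the main content area usually shows where the extra thousands of words (category links, repeated product names, boilerplate) are coming from.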
Intermediate & Advanced SEO | BeckyKey
-
Canonical & rel=NOINDEX used on the same page?
Intermediate & Advanced SEO | EasyStreet
I have a real estate company, www.company.com, with approximately 400 agents. When an agent gets hired, we allow them to pick a URL, which we then register and manage - for example, www.AGENT1.com. We then take this agent domain and 301 redirect it to a subdomain of our main site; for example, Agent1.com 301s to agent1.company.com. We have each page on the agent subdomain canonicalized back to the corresponding page on www.company.com - for example, agent1.company.com canonicals to www.company.com.
What happened is that Google indexed many URLs on the subdomains, and it seemed like Google ignored the canonical in many cases. Although these URLs were being crawled and indexed by Google, I never noticed any of them rank in the results. My theory is that Google crawled the subdomain first, indexed the page, and then later crawled the main URL. At that point in time, the two pages actually looked quite different from one another, so Google did not recognize/honor the canonical. For example:
- Agent1.company.com/category1 gets crawled on day 1
- Company.com/category1 gets crawled 5 days later
The content (recently listed properties for sale) on these category pages changes every day. If Google crawled the pages (both the subdomain and the main domain) on the same day, the content on the subdomain and the main domain would look identical. If the URLs are crawled on different days, the content will not match.
We had some major issues (duplicate content and site speed) on our www.company.com site that needed immediate attention. We knew we had an issue with the agent subdomains and decided to block crawling of the subdomains in the robots.txt file until we got the main site “fixed”. We have seen a small decrease in organic traffic from Google to our main site since blocking the crawling of the subdomains, whereas with Bing our traffic has dropped almost 80%. After a couple of months, we have now got our main site mostly “fixed” and I want to figure out how to handle the subdomains in order to regain the lost organic traffic. My theory is that these subdomains have some link juice that is basically being wasted by the robots.txt block.
Here is my question: if we put a robots rel=NOINDEX on all pages of the subdomains and leave the canonical (to the corresponding page of the company site) in place on each of those pages, will link juice flow to the canonical version? Basically, I want the link juice from the subdomains to pass to our main site but do not want the pages to be competing for a spot in the search results with our main site. Another thought I had was to place the NOINDEX tag only on the category pages (the ones that seem to change every day) and leave it off the product pages (property detail pages that rarely ever change). Thank you in advance for any insight.
-
Checking Rankings Again & Again Can Drop Rankings
Is it possible that checking my Google rankings again and again can drop my rankings? For example, checking where my keywords rank every hour - could that drop the rankings? Because this indirectly affects the CTR, might that be the reason? No one seems to have faced such a weird thing before.
Intermediate & Advanced SEO | welcomecure
-
Disavowal & Reconsideration request - Can I do one without the other?
I submitted a link disavowal file for a client a few weeks ago and before doing that I read up on how to properly use the tool. My understanding is that if you received a manual penalty then you need to submit a reconsideration request after cleaning up links. We didn't receive a penalty so I didn't submit one. I'm wondering if anyone has used the tool (not stemming from a penalty) and if you did or didn't submit a recon. request, and what the results were. I've read that if a site is hit algorithmically, then filing a recon request won't help. Should I just do it anyway? Would be great to hear from anyone who has gone through a similar situation.
Intermediate & Advanced SEO | Vanessa12
-
Link Reclamation & Redirects
Hello, I'm in the middle of a link reclamation project wherein we're identifying broken links, links pointing to dupe content, etc. I found a forgotten co-brand which is effectively dupe content across 8 subdomains, some of which have a significant number of links (200+ linking domains | 2k+ inbound links). The question for the group is: what's the optimal redirect option?
- Option 1: set 301s and maintain a 1:1 URL mapping (a rough sketch of what this looks like is below). This will pass all equity to the applicable PLPs and theoretically improve rank for the related keyword(s); it requires a bit more configuration time and will likely have a small effect on rank given that links are widely distributed across URLs.
- Option 2: set 301s to redirect all requests to the associated subdomain, e.g. foo.mybrand.cobrand.com/page1.html and foo.mybrand.cobrand.com/page2 both redirect to foo.mybrand.com/. This will accumulate all equity at the subdomain level, which theoretically will be roughly distributed throughout the underlying pages, and will limit the risk of a penalty to that subdomain.
- Option 3: set 301s to redirect all requests to our homepage. Easiest to configure and maintain, and it will accumulate the maximum equity on a priority page, which should positively affect domain authority; but we run the risk of being penalized for accumulating links en masse, risk a penalty for spammy links on our primary www subdomain, and won't pass keyword-specific equity to the applicable pages.
To be clear, I've done an initial scrub of anchor text and there were no signs of spam. I'm leaning towards #3, but am interested in others' perspectives. Cheers,
Stefan
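For what Option 1 amounts to in practice, here is a minimal sketch of a 1:1 301 map, with made-up paths; in reality this would live in the web-server or CDN configuration of the co-brand hosts rather than in a Python process:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical 1:1 mapping from old co-brand paths to the equivalent
# pages on the main domain (one entry per URL that has inbound links).
REDIRECT_MAP = {
    "/page1.html": "https://foo.mybrand.com/page1.html",
    "/page2": "https://foo.mybrand.com/page2",
}

class RedirectHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        target = REDIRECT_MAP.get(self.path)
        if target:
            # 301 (permanent) so search engines transfer equity to the mapped page
            self.send_response(301)
            self.send_header("Location", target)
        else:
            # Anything unmapped returns 404 rather than a blanket homepage redirect
            self.send_response(404)
        self.end_headers()

if __name__ == "__main__":
    HTTPServer(("", 8080), RedirectHandler).serve_forever()
```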
Intermediate & Advanced SEO | PCampolo
-
Duplicate (Basically) H1 & H2
We're about to relaunch one of our ecommerce sites and have a question regarding H1 & H2 tags. We use our primary keyword for each category in that category page's H1. We also include a block of text at the bottom of the page explaining the benefits of the products, the various styles we offer, personalization options, gift packaging, etc. We were planning on having an H2 at the beginning of that text that reads 'About [keyword]', but the question of duplicate H1 & H2 tags has come up. Is penalization possible for having them almost the same? It's not like they're not relevant - the H1 refers to the category itself and the H2 references our explanation of the category. Just curious what the best way to approach this would be.
Intermediate & Advanced SEO | Kingof5
-
UK Company: Major drop in traffic & rankings on one primary keyword since March
I am helping out a small UK company that has had a sudden drop in organic search traffic since March 24th. Investigation highlights some issues with the site, e.g. potential canonicalization of the home page, a few HTML errors, and some inbound links to the /index.html version of the homepage rather than /. But nothing particularly major, and nothing that is different from pre-March 24th.
The indexed pages look OK in Google (although Bing is ranking the non-www version of the homepage, which does not appear in Google's index). Searches for the company name on Google.co.uk show it as the top result, and some keywords are ranking reasonably well (based on the homepage). Selecting blocks of text from the homepage, it ranks #1, but its Google rank for the primary keyword has gone from #2 pre-March 24th to not in the top 100 results since. SEOmoz is grading the page A for the keyword, which appears prominently on the page, and the keyword is the first characters of the title. It is not a particularly competitive keyword. Adding "UK" to the keyword, the page is ranked #3 on Google.co.uk. It's almost as if they are being penalised for a single keyword, which I've never seen or heard of before. Any ideas?
The company has never carried out any SEO - white hat or black hat. The site is perfectly normal; nothing dodgy or concerning about it at all. Thanks in advance for your advice.
Intermediate & Advanced SEO | bjalc2011
-
Multiple stores & domains vs. One unified store (SEO pros / cons for E-Commerce)
Our company runs a number of individual online shops, specialised in particular products but all in the same genre of goods overall, with a specific and relevant domain name for each shop. At the moment the sites are separate and not interlinked, i.e. completely separate brands. An analogy could be something like clothing accessories (we are not in the clothing business): scarves.com and silkties.com (our field is more niche than this).
We are about to launch a related site (e.g. handbags.com), in the same field again but without precisely overlapping products. We will produce this site on a newer, more flexible e-commerce platform, so now is a good time to consider whether we want to place all our sites together on one e-commerce system on the backend. Essentially, we need to know what the pros and cons would be of the various options facing us and how SEO ranking is affected by the three possibilities.
- Option 1: continue with separate sites, each with its own domain.
- Option 2: have multiple sites, each on their own domain, but on the same e-commerce system and visibly linked together for the customer (with unified checkout) - at the top of each site could be a menu bar linking to each site: [Scarves.com] - [SilkTies.com] - [Handbags.com]. The main question here is whether the multiple domains are mutually beneficial, particularly considering how close the individual domains are to target keywords. If mutually beneficial, how does it compare to option 3?
- Option 3: having recently acquired a domain name (e.g. accessories.com) which would cover the whole category together, we are presented with a third option: making one site selling all of these products in different categories. Our main concern here would be losing the ability to target marketing specifically, and losing the benefit of the domains containing the keywords people are more likely to be searching for (e.g. 'silk tie') rather than 'accessories'. Is it worth taking the hit of losing these specific targeted domain names for the advantage of increased combined inbound links?
Intermediate & Advanced SEO | Colage