Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies. More details here.
Asynchronous loading of product prices bad for SEO?
-
We are currently looking into improving our TTFB on our ecommerce site.
A huge improvement would be to load the product prices asynchronously on the product list pages. The product detail page, on which the product is ordered, will be left untouched.
The idea is that all content such as product data, images and other static content is sent to the browser first (first byte). The product prices depend on a set of user variables (delivery location, VAT inclusive/exclusive, etc.), so they would be requested via an AJAX call to reduce the TTFB.
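For illustration, the deferred-price pattern described in the question could look something like this minimal sketch. The element structure, the `/api/prices` endpoint, and its response shape are all assumptions, not the poster's actual implementation:

```javascript
// Sketch of deferring prices: the server renders product cards with a
// placeholder, and the browser fills in real prices after first paint.
// The "/api/prices" endpoint and its response shape are hypothetical.

// Server-rendered card: everything except the user-specific price.
function renderCard(product) {
  return `<div class="card" data-sku="${product.sku}">
    <h3>${product.name}</h3>
    <span class="price" data-sku="${product.sku}">…</span>
  </div>`;
}

// After load, fetch prices for the visible SKUs in one batched call
// (delivery location, VAT setting, etc. come from the user's session).
async function fillPrices(skus) {
  const res = await fetch(`/api/prices?skus=${skus.join(",")}`);
  const prices = await res.json(); // e.g. { "SKU-1": "€19.99" }
  for (const el of document.querySelectorAll(".price")) {
    el.textContent = prices[el.dataset.sku] ?? "n/a";
  }
}
```

The static markup (names, images, links) ships in the first response, so TTFB improves while the user-dependent price arrives a moment later.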
My question is whether Google considers this black hat SEO.
-
Thanks for your response. We'll definitely go for this improvement.
But can you please explain what you mean by "an unintuitive UX idea" ?
-
I don't see any reason why this would be seen as black hat. On the contrary, I see it as an unintuitive UX idea and you should definitely do it.
The only information you're withholding (and you're not even cloaking it) is a price that depends on a lot of factors. You're not hiding any content or links, so there's no worry there. Even if you were hiding content it wouldn't be a problem, unless it was completely irrelevant and there just to rank the page.
The only effect this could have is that if you defer elements on the page to improve Time To First Byte, Google may not read them as it crawls, so the content it sees on the page may be reduced, affecting your ability to rank the page. But for something like deferring a price tag, this isn't relevant at all.
I'd say go for it - I think it would be a great idea for user experience.
-
Definitely not black hat, but it could impact SEO and negate any schema markup you have.
I would go to GWT > Crawl > Fetch as Google and see what HTML is received by Googlebot.
If all the async elements are there, you should be gravy.
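One way to keep structured data intact while deferring the visible price is to emit schema.org Product JSON-LD server-side, since Google reads it independently of the on-page markup. A hedged sketch; the product fields and the choice of a server-rendered default price are assumptions:

```javascript
// Sketch: build schema.org Product JSON-LD on the server so the
// structured-data price survives even when the visible price is
// injected later via AJAX. Field values here are placeholders.
function productJsonLd(p) {
  return JSON.stringify({
    "@context": "https://schema.org",
    "@type": "Product",
    name: p.name,
    sku: p.sku,
    offers: {
      "@type": "Offer",
      price: p.price, // a default/base price rendered server-side
      priceCurrency: p.currency,
      availability: "https://schema.org/InStock",
    },
  });
}

// Embed the result in the page as:
// <script type="application/ld+json">…</script>
```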
Related Questions
-
What is the best strategy to SEO Discontinued Products on Ecommerce Sites?
RebelsMarket.com is a marketplace for alternative fashion. We have hundreds of sellers who have listed thousands of products. Over 90% of the items do not generate any sales, and about 40% of the products have been on the website for over 3 years. We want to clean up the catalog and remove all the old listings, older than 2 years, that do not generate any sales. What is the best practice for removing thousands of listings from an ecommerce site? Do we 404 these products and show similar items? Your help and thoughts are much appreciated.
White Hat / Black Hat SEO | | JimJ3 -
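For permanently removed listings, the usual patterns are returning 410 (Gone) or 301-redirecting to the closest surviving category. A hedged Apache sketch; the paths are made up for illustration:

```apache
# .htaccess sketch (hypothetical paths)
# Tell crawlers a dead product is gone for good (410 via the G flag):
RewriteEngine On
RewriteRule ^products/old-widget-123$ - [G,L]

# ...or send shoppers to the closest surviving category instead:
Redirect 301 /products/old-gadget-456 /categories/gadgets
```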
New Flurry of thousands of bad links from 3 Spammy websites. Disavow?
I also discovered that a website, www.prlog.ru, put 32 links to my website. It is a Russian site. It has a 32% spam score. Is that high? I think I need to disavow. Another spammy website linking to me has a spam score of 16%, with several thousand links. I added one link to the site medexplorer.com 6 years ago and it was fine. Now it has thousands of links. Should I disavow all three?
White Hat / Black Hat SEO | | Boodreaux0 -
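If you do decide to disavow, Google's disavow tool expects a plain-text file with one `domain:` or URL entry per line. A sketch using the domains mentioned in the question (the single-URL line is a hypothetical example):

```text
# disavow.txt — lines starting with # are comments
domain:prlog.ru
domain:medexplorer.com
# or disavow a single page rather than a whole site:
http://example.com/spammy-page.html
```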
Is domain redirection a good method for SEO?
I have a question and need suggestions from you guys. I've searched for an answer on Google but couldn't find exactly the information I need; maybe I can't search for it properly.
Let me explain my confusion:
I checked the backlink profile of a website. The owner does not use his main domain when leaving comment backlinks; he puts another domain in the comments, and that domain redirects to the main domain. Why does he use another domain for comment backlinks? Is it helpful for getting a better rank on Google?
For example: Main domain = solutionfall.com
Other domain = xyz.com (it redirects to solutionfall.com)
He just uses xyz.com when leaving comment backlinks. Thank you so much.
White Hat / Black Hat SEO | | kamishah1 -
White H1 Tag Hurting SEO?
Hi, we're having an issue with a client not wanting the H1 tag to display on their site, using an image of their logo instead. We made the H1 tag white (we did not deliberately hide it with CSS), and I just read an article where this is considered black hat SEO: https://www.websitemagazine.com/blog/16-faqs-of-seo The only reason we want to hide it is that it looks redundant appearing there along with the brand name logo. Does anyone have any suggestions? Would putting the brand logo image inside an H1 tag be OK? Thanks for the help
White Hat / Black Hat SEO | | AliMac261 -
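Two common alternatives to white-on-white text: put the logo image inside the H1 with the brand name as alt text, or keep a text H1 and hide it with an accessible "visually hidden" pattern rather than color matching. A sketch; the brand name, path, and class name are placeholders:

```html
<!-- Option 1: brand logo inside the H1, with the brand name as alt text -->
<h1><img src="/logo.png" alt="Acme Widgets"></h1>

<!-- Option 2: keep a text H1 but hide it accessibly (not display:none) -->
<h1 class="visually-hidden">Acme Widgets</h1>
<style>
  .visually-hidden {
    position: absolute;
    width: 1px;
    height: 1px;
    overflow: hidden;
    clip: rect(0 0 0 0);
  }
</style>
```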
Dodgy backlinks pointing to my website - someone trying to ruin my SEO rankings?
I just saw in 'Just discovered' section of MOZ that 2 new backlinks have appeared back to my website - www.isacleanse.com.au from spammy websites which look like they might be associated with inappropriate content. 1. http://laweba.net/opinion-y-tecnologia/css-naked-day/comment-page-53/ peepshow says: (peepshow links off to my site)07/17/2016 at 8:55 pm2. http://omfglol.org/archives/9/comment-page-196 voyeur says: (voyeur linking off to my site)
White Hat / Black Hat SEO | | IsaCleanse
July 17, 2016 at 7:58 pm Any ideas if this is someone trying to send me negative SEO and best way to deal with it?0 -
Why does expired domains still work for SEO?
Hi everyone, I've been running an experiment for more than a year to see whether it's possible to buy expired domains. I know it's considered black hat, but like I said, I wanted to experiment; that is what SEO is about. What I did was buy domains that had just expired, immediately add content on a WP setup, fill it with content relevant to the expired domain, and then start building links to other relevant sites from these domains. (Here is a pretty good post on how to do it, and I did it in a similar way: http://searchenginewatch.com/article/2297718/How-to-Build-Links-Using-Expired-Domains ) This is nothing new, and SEOs have been doing it for a long time. There are a lot of rumors around the SEO world that domains become worthless after they expire. But after trying it out for more than a year and with about 50 different expired domains, I can conclude that it DOES work, 100% of the time. Some of the domains are of course better than others, but I cannot see any signs that the expired domains or the sites I link to have been punished by Google. The sites I'm linking to rank great ONLY with those links 🙂 So to the question: WHY does Google allow this? They should be able to see that a domain has expired, right? And if it's expired, why don't they just "delete" all the links to that domain after the expiry date? Google is well aware of this problem, so what is stopping them? Is there anyone here who knows how this works technically?
White Hat / Black Hat SEO | | Sir0 -
Does the same description across different products count as duplicate content?
Is having the same description for different products a bad thing? The titles are all different, but they are the same product with different designs on them. Does this count as duplicate content?
White Hat / Black Hat SEO | | Casefun1 -
SEO and style="display: none;"?
I want to have a function which shortens text in the category view in my shop. Apple does this in their product configurator; see the "learn more" button at the right side: http://store.apple.com/us/configure/MC915LL/A Apple does this by adding dynamic content, but I want it done in a more SEO-friendly way, leaving the content indexable by Google. I know from a search that this was used in past years by black hat SEOs to cover keyword stuffing. I also read an article at Google. I believe that was years ago and keyword stuffing is completely not an option anymore, so I believe Google would just recognise it the way it's meant to be. But if I were sure, I would not ask here 🙂 What do you think?
White Hat / Black Hat SEO | | kynop0
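One crawlable version of the "learn more" pattern: ship the full text in the HTML and only toggle its visibility with CSS and JS, so the content stays in the DOM for Google to index. A sketch; the class names and markup are made up for illustration:

```html
<!-- Full description ships in the HTML, so it remains indexable -->
<div class="desc is-collapsed">
  <p>Full category description text goes here…</p>
</div>
<a href="#" class="toggle">Learn more</a>
<style>
  .is-collapsed { max-height: 3em; overflow: hidden; }
</style>
<script>
  // Toggling a class only changes presentation; the text is always present.
  document.querySelector(".toggle").addEventListener("click", (e) => {
    e.preventDefault();
    document.querySelector(".desc").classList.toggle("is-collapsed");
  });
</script>
```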