Thanks Thomas.
AggregateOffer is what I was looking for.
Job Title: Lead Developer
Company: 4RoadService.com, Inc
Favorite Thing about SEO: SEO is mysterious, but should be possible to master ;)
I'm implementing Schema.org markup (JSON-LD) on an eCommerce site. Each product has a few different variations, and these variations can change the price (think T-shirts, where blue and white cost $5, red is $5.50, and yellow is $6).
In each Product's Offer in my markup, I could either have a single Offer with a price range (minPrice: $5, maxPrice: $6), or I could add a separate Offer for each variation, each with its own correct price.
Is one of these better than the other? Why? I've been looking at the WooCommerce code, and it seems to use a single offer with a price range, but that could be because it's more flexible for a system used by millions of people.
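For what it's worth, here's a minimal sketch of the two options using the T-shirt prices above (the product name and currency are placeholders; note that for a range, schema.org's AggregateOffer type, which the accepted answer above points to, uses lowPrice/highPrice rather than minPrice/maxPrice). First, a single offer with a price range:

```json
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example T-Shirt",
  "offers": {
    "@type": "AggregateOffer",
    "priceCurrency": "USD",
    "lowPrice": "5.00",
    "highPrice": "6.00",
    "offerCount": 4
  }
}
```

And the alternative, one Offer per variation, each with its own price:

```json
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example T-Shirt",
  "offers": [
    { "@type": "Offer", "name": "Blue",   "priceCurrency": "USD", "price": "5.00" },
    { "@type": "Offer", "name": "White",  "priceCurrency": "USD", "price": "5.00" },
    { "@type": "Offer", "name": "Red",    "priceCurrency": "USD", "price": "5.50" },
    { "@type": "Offer", "name": "Yellow", "priceCurrency": "USD", "price": "6.00" }
  ]
}
```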
Interesting. I have 2 more thoughts:
Not that I'm aware of, unfortunately. Patience is an important skill when dealing with Google.
I thought of one other possibility: Your sitemap.xml is probably auto-generated, so this shouldn't be a problem, but check to make sure that the URLs in the sitemap.xml have the www.
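As a quick illustration, each entry should use the www host, like the first loc below and not the second (example.com is just a placeholder):

```xml
<!-- Illustration only; example.com stands in for the real domain -->
<url>
  <loc>https://www.example.com/some-page/</loc>
</url>
<!-- not: <loc>https://example.com/some-page/</loc> -->
```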
Other than that I'm out of ideas - I would wait a few days to see what happens, but maybe someone else with more experience watching Google will have seen this before. If it does resolve, I'd like to know what worked.
I'm not convinced that robots.txt is causing your problem, but it can't hurt to change it back. In fact, while looking for instructions on how to change it I came across this blog post by Joost de Valk (aka Yoast) that pretty much says you should remove everything that's currently in your robots.txt, and his arguments hold up for every line in it:
If you're using Yoast SEO, here are instructions on how to change the robots.txt file.
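If you do strip it down, my rough sketch of the end result would be something like this (the Sitemap line is optional, and example.com is a placeholder):

```
# Near-empty robots.txt: allow everything to be crawled
User-agent: *
Disallow:

# Optional: point crawlers at the sitemap
Sitemap: https://www.example.com/sitemap.xml
```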
I don't know why this is happening, but this is what I would check:
One theory is: when you moved to the non-www version of the site, Google started getting 301s redirecting it from www to non-www, and now that you've gone back to www it's getting 301s redirecting it from non-www to www, so from Google's side it looks like a circular redirect.
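Either way, the end state you want is a single 301 in one direction only. If the site is on Apache, a hypothetical .htaccess sketch of the www-only redirect would look something like this (example.com is a placeholder):

```
# Hypothetical sketch: one 301 from non-www to www, and nothing redirecting back
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]
```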
Here's what I would do to try to kick-start indexing, if you haven't already:
Good luck!
That's not so horrible - it just says not to crawl the plugins directory or the admin, and to delay a second between requests. You probably don't want your plugins or admin directories being indexed, and according to this old forum post Google ignores the crawl-delay directive, so the robots.txt isn't the problem.
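For context, based on that description I'd guess the file looks roughly like this (my reconstruction of a typical WordPress setup, not your actual file):

```
# Reconstruction of the described robots.txt (assumes standard WordPress paths)
User-agent: *
Disallow: /wp-content/plugins/
Disallow: /wp-admin/
Crawl-delay: 1
```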
I'm also going to recommend WordPress. It's big, battle-tested, and relatively easy to set up and use. They also get security updates out quickly, and your site will auto-patch itself if a security update is critical enough. Non-critical updates are also very simple to install (click a few things in a web interface).
For the eCommerce part, WooCommerce is the big guy in the room. I'm also happy with WP e-Commerce (disclosure: I contribute to its development sometimes) if Woo doesn't work for you. Shopify just launched WordPress integration as well, if that's more up your alley.
As for SEO: Yoast SEO will do a ton. Also, if you really like code, you can make WordPress output markup in pretty much whatever way you want without sacrificing the upgradability I started with, so if you're willing to go deep enough it's (to me, a WP fan) the perfect CMS.
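For example, and tying it back to the JSON-LD question, here's a hypothetical sketch of printing your own schema markup from a theme or small plugin (the function name, the 'product' post type check, and the fields are placeholders, not from any existing plugin):

```php
<?php
// Hypothetical sketch: output custom JSON-LD in <head> via the wp_head hook.
function my_custom_product_jsonld() {
    // Assumes a 'product' post type (e.g. WooCommerce); adjust for your setup.
    if ( ! is_singular( 'product' ) ) {
        return;
    }
    $data = array(
        '@context' => 'https://schema.org',
        '@type'    => 'Product',
        'name'     => get_the_title(),
    );
    echo '<script type="application/ld+json">' . wp_json_encode( $data ) . '</script>';
}
add_action( 'wp_head', 'my_custom_product_jsonld' );
```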
Like Mary, I'm not a guru, and I don't think it'll harm your SEO, but:
TL;DR: If you can get rid of them I would.
Multi-disciplinary web developer who wants to make the web pages I work on rise to the top of the web.