How do I get coupon information like RetailMeNot has on the SERPs?
-
Hello, can anyone tell me how I can implement the same tactic RetailMeNot is using to populate coupon information in the search results? Below their meta description they have four fields labeled: Coupon Codes: 38, Free Shipping Deals: 6, Best Discount: 20% off, and Total Offers: 49. Is there some schema markup at work here? Or is this only allowed for RMN? I have not seen it elsewhere, but I want my website's coupons page to compete with them in the SERPs. Appreciate your help!
-
If you want coupon codes, visit my site Coupon Code 99 and get 500+ coupon codes.
-
Thanks for sharing your problem. I also faced the same issue, but don't worry, I will share some solutions here soon. You can also check here: https://www.dealsshutter.com/
-
This is the search query for anyone who is interested in reproducing:
https://www.google.com/search?q=bouqs%20coupons
Screenshot:
Google links through to this page:
https://www.retailmenot.com/view/bouqs.com
This is the schema read-out for the page:
... so it's not a schema thing. It's just that Google has begun identifying patterns in how coupon sites tend to lay out their coupon codes visually and architecturally, and it is now recognising them. The way the CSS classes are marked up may be helping (there are plenty of references to "OfferItemFull"). Although it's not schema code, some schemas do use very similar language, e.g. OfferItemCondition.
They're serving the site through Nginx, and it's built with React (JS). They're also using this, whatever the heck it is: https://www.signal.co/ - the description seems wishy-washy to me. It doesn't seem schema-related, though.
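For comparison, here is a sketch of what actual schema.org Offer markup looks like, built with the Python standard library. This is purely illustrative: per the discussion above, RetailMeNot's SERP treatment does not appear to come from schema, and the URL and offer details below are hypothetical. It just shows the "similar language" (Offer, itemCondition/OfferItemCondition) the vocabulary uses.

```python
import json

# Hypothetical schema.org Offer for a coupon page -- illustrative only.
# Google is not known to build RetailMeNot-style coupon counts from this.
offer = {
    "@context": "https://schema.org",
    "@type": "Offer",
    "name": "20% off sitewide",
    "url": "https://example.com/store/coupons",  # hypothetical URL
    "priceCurrency": "USD",
    # itemCondition takes an OfferItemCondition enumeration value:
    "itemCondition": "https://schema.org/NewCondition",
    "availability": "https://schema.org/InStock",
}

# JSON-LD is normally embedded in the page head as a script tag:
script_tag = '<script type="application/ld+json">%s</script>' % json.dumps(offer, indent=2)
print(script_tag)
```

If nothing else, running the output through Google's Rich Results Test would confirm whether the markup validates.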
-
Hello,
Google is pulling this information from the table on their page. Honey and Slickdeals also have it.
Related Questions
-
What is the best tool for generating a sitemap for a website with over 4k pages?
I have just migrated my website from Hugo to WordPress and I want to submit the sitemap to Google Search Console (because I haven't done so in a couple of years). It looks like there are many tools for building a sitemap file, but I suspect they vary in quality, especially given the size of my site.
Technical SEO | DanKellyCockroach2 -
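As a point of reference for the question above, a minimal standalone sitemap generator needs nothing beyond the standard library; this is a sketch with hypothetical URLs, not a recommendation over a WordPress plugin (which will typically generate and update the sitemap for you).

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Build a minimal sitemap.xml string from a list of absolute URLs.

    A ~4k-page site fits comfortably in a single file: the sitemap
    protocol allows up to 50,000 URLs (and 50 MB uncompressed) per file
    before a sitemap index is required.
    """
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for loc in urls:
        url_el = ET.SubElement(urlset, "url")
        ET.SubElement(url_el, "loc").text = loc
    return ET.tostring(urlset, encoding="unicode")

# Hypothetical page list; in practice you would walk the site or the CMS database.
xml = build_sitemap([
    "https://example.com/",
    "https://example.com/about/",
])
print(xml)
```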
Do YouTube videos in iframes get crawled?
There seem to be quite a few articles out there saying that iframes cause problems with organic search and that the various bots can't/won't crawl them. Most of the articles are a few years old (including Moz's video sitemap article). I'm wondering if this is still the case with YouTube/Vimeo/etc. videos, all of which only offer iframes as an embed option. I have a hard time believing that a Google property (YouTube) would offer an embed option that its own bot couldn't crawl. However, let me know if that is in fact the case. Thanks! Jim
Technical SEO | DigitalAnarchy0 -
Unnecessary pages getting indexed in Google for my blog
I have a blog, dapazze.com, and I have been struggling with a problem for a long time. I found that Google has indexed hundreds of replytocom links and image attachment pages for my blog, and I had to remove these pages manually using the URL removal tool. I had used "Disallow: ?replytocom" in my robots.txt, but Google disobeyed it. After that, I removed the parameter from my blog completely using the Yoast SEO plugin. But now I see that Google has again started indexing these links even though they are no longer present on my blog (I use #comment). Google has also indexed many of my admin and plugin pages, even though they are disallowed in my robots.txt file. Have a look at my robots.txt file here: http://dapazze.com/robots.txt. Please help me solve this problem permanently.
Technical SEO | rahulchowdhury0 -
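One likely reason Google "disobeyed" the rule above: "Disallow: ?replytocom" is not a valid path rule, since Disallow values are matched against the URL path from the left, and no path starts with "?". This can be checked with Python's stdlib robots.txt parser (the example URL is hypothetical; note the stdlib parser only does prefix matching, whereas Googlebot additionally supports wildcards such as "Disallow: /*?replytocom"):

```python
import urllib.robotparser

def allowed(robots_txt: str, url: str) -> bool:
    """Return True if the given robots.txt rules permit crawling the URL."""
    rp = urllib.robotparser.RobotFileParser()
    rp.parse(robots_txt.splitlines())
    return rp.can_fetch("*", url)

# The malformed rule: Disallow values are path prefixes, so "?replytocom"
# never matches anything and the URL stays crawlable.
bad = "User-agent: *\nDisallow: ?replytocom"
print(allowed(bad, "https://dapazze.com/some-post/?replytocom=123"))   # True

# A real path-prefix rule does match this specific URL.
good = "User-agent: *\nDisallow: /some-post/?replytocom"
print(allowed(good, "https://dapazze.com/some-post/?replytocom=123"))  # False
```

Separately, robots.txt only blocks crawling; it does not remove pages that are already indexed. For that, a meta robots noindex tag (which Yoast can set) is the usual approach, and the URLs must stay crawlable long enough for Google to see the tag.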
Why can't I get the page if I type/paste the URL directly?
Hello, just click the following link: http://www.tuscany-cooking-class.com/es/alojamiento/villa-pandolfini/ It may show the 404 page. But follow this path instead: go to www.tuscany-cooking-class.com/es, then select the "alojamiento" link, then select the first property, "villa-pandolfini". Now you can view the page content. Why does it behave like this? We are using a customized Joomla install. Can anyone help me fix this issue? Thanks in advance, Alex
Technical SEO | massimobrogi0 -
Links to articles for news sites in Google SERPs
I'm trying to figure out why, when I search for "international news" or "world news", for example, some sites in the SERPs have links to news articles while others don't. For "international news", the Fox News and New York Times results have links to articles, while CNN (the top result) only has sitelinks. I would appreciate any theories on why this happens. Thanks.
Technical SEO | seoFan210 -
How do I get my keyword rankings to update?
My keyword rankings did not update. It says they update every Thursday, and the last update shows as Aug 23 (which was the original update). Any idea why this would happen and how I can get the updated info?
Technical SEO | pattersonla1 -
Getting rid of low-quality pages
If I wanted to get rid of a batch of low-quality pages from the index, is the best practice to let them 404 and remove them from the sitemap files? Thanks
Technical SEO | PeterM220 -
What to do with content that performs well in SERPs but is dynamically generated?
A new client developed an application that generates dynamic content. They were hit hard by Panda, and I believe it is in part due to this application. About 500 of the URLs from this application perform well in SERPs (they rank well, drive traffic to the site, and have a low bounce rate and high page views per visit). There are an additional 9,000 URLs (and growing) in the index that don't drive any organic traffic. We are thinking of making the 500 URLs that perform well into static pages and de-indexing the rest. What are your thoughts on this?
Technical SEO | nicole.healthline0