Product pages - should the meta description match our product description?
-
Hi,
I am currently adding new products to my website and was wondering, should I use our product description (which is keyword optimised) in the meta description for SEO purposes? Or would this be picked up by Google as duplicate content?
Thanks in advance.
-
Thanks for the advice, that's really helpful.
-
Avoid duplicate text. My understanding of best practice here is that the meta description should have unique content and a distinct intent: it should encourage engagement (the click-through) rather than simply describe the product.
What happens if you don't enter a meta description? If your product pages are really thin on content, Google may have difficulty generating its own snippet. However, you've mentioned that the product descriptions are already optimised, so hopefully that means they are rich enough for Google to create good descriptions from them.
How could you create meta descriptions without bespoke copy? Two programmatic approaches come to mind: 1) scrape your own product descriptions, as you suggest, or 2) create a template meta description with variables for product name, category, etc., perhaps with some randomised part-sentences if you're feeling adventurous (a rough sketch of this second approach is below). Both approaches are going to look a little redundant/duplicate to Google, however.
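A minimal sketch of the templated approach in Python, assuming a simple product record; the field names, opener wording, and character limit are illustrative, not from any particular platform:

```python
import random

# Hypothetical product record; the field names are assumptions, not a real schema.
product = {"name": "Steel Garden Trowel", "category": "Garden Tools", "brand": "Acme"}

# A few randomised openers so the copy looks less templated across thousands of pages.
OPENERS = [
    "Shop the {name} from {brand}.",
    "Discover the {name}, part of our {category} range.",
    "Looking for {category}? The {name} could be a great fit.",
]

def build_meta_description(p: dict) -> str:
    """Build a templated meta description, trimmed to roughly 155 characters."""
    text = random.choice(OPENERS).format(**p)
    text += " Free delivery and easy returns."  # generic call to action
    return text[:155]

print(build_meta_description(product))
```

Even with randomised openers, the output will still read as templated at scale, which is the redundancy risk mentioned above.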
In my opinion, if you can't create bespoke meta descriptions, you might be best off leaving them blank and letting Google generate its own snippets.
Related Questions
-
Need only tens of pages indexed out of hundreds: is robots.txt okay for Google?
Hi all, We have 2 subdomains with hundreds of pages, of which only 50 important pages need to be indexed. Unfortunately the CMS on these subdomains is very old and does not support deploying a "noindex" tag at page level. So we are planning to block the entire sites via robots.txt and allow only the 50 pages we need. But we are not sure if this is the right approach, as Google has been suggesting relying on "noindex" rather than robots.txt. Please suggest whether we can proceed with the robots.txt file. Thanks
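For reference, a minimal robots.txt sketch of the block-everything-then-allow pattern described above; the paths are placeholders:

```
# Hypothetical robots.txt for one subdomain; paths are placeholders.
User-agent: *
# For Google, the more specific (longer) matching rule wins, so these
# Allow rules override the blanket Disallow below for the listed paths.
Allow: /important-page-1/
Allow: /important-page-2/
Disallow: /
```

One caveat worth noting: robots.txt prevents crawling, not indexing, so blocked URLs can still appear in search results if they are linked from elsewhere; that is why Google recommends "noindex" for reliable de-indexing.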
Algorithm Updates | vtmoz0
-
Does anyone know what causes the long meta description snippet?
You know the ones I mean... Google has been infrequently displaying some meta descriptions at 3-4 lines long for some time now, but recently I've been noticing them more. I'm not sure whether it's just a coincidence that I've been seeing more of them for my searches, or whether Google is displaying more results in this format. Does anybody know what causes Google to prefer the longer or extended meta description for some results?
Algorithm Updates | Ria_0
-
Delay between being indexed and ranking for new pages.
I've noticed with the last few pages I've built that there's a delay between them being indexed and them actually ranking. Anyone else finding that? And why is it like that? It's not much of an issue, as they tend to pop up after a week or so, but I am curious. Isaac.
Algorithm Updates | isaac6630
-
Should plural keyword variations get their own targeted pages?
I am in the middle of changing a website from targeting a single keyword on all pages to having each page target its own keyword/phrase. However, I'm a little conflicted on whether plural forms and other suffix (-ing) variations are different enough to deserve their own pages. SERPs show different results for each keyword searched, and relevancy reports score some of the keywords differently and some the same. Is it better to use these as secondary and tertiary keywords on the same page as the main keyword? See the example below:
OPTION A (use each for a different page):
Page 1 - Construction Fence
Page 2 - Construction Fences
Page 3 - Construction Fencing
Page 4 - Construction Site Fence
Page 5 - Construction Site Fences
Page 6 - Construction Site Fencing
...
OPTION B (use as variations on the same page):
Page 1 - Construction Fence, Construction Fences, Construction Fencing
Page 2 - Construction Site Fence, Construction Site Fences, Construction Site Fencing
...
Any help is greatly appreciated. Thanks!
Algorithm Updates | pac-cooper0
-
Should my canonical tags point to the category page or the filter result page?
Hi Moz, I'm working on an ecommerce site with categories, filter options, and sort options – teacherexpress.scholastic.com. Should I have canonical tags on all filter and sort options point to the category page, like gap.com and llbean.com, or have all sort options point to the filtered page URL, like kohls.com? I was under the impression that to use a canonical tag the pages have to have the same content, which would mean Gap and L.L. Bean are using canonical tags incorrectly: using a filter changes the content, whereas using a sort option just changes the order. What would be the best way to deal with duplicate content for this site? Thanks for reading!
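For context, a canonical tag is just a link element in the page head; a hypothetical example of a filtered/sorted URL pointing back at its base category page (all URLs here are placeholders):

```html
<!-- Served on a variant URL such as /category/shirts?color=blue&sort=price -->
<head>
  <!-- Hypothetical URLs: the canonical points the variant at the base category page -->
  <link rel="canonical" href="https://www.example.com/category/shirts" />
</head>
```

The trade-off described above then comes down to whether the variant is similar enough to the canonical target for Google to honour the hint.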
Algorithm Updates | DA20130
-
Using a stop word when optimizing pages
I have a page (for a spa) I am trying to fully optimize and, using AdWords, have run every conceivable configuration (using Exact Match) to ascertain the optimal phrase to use. Unfortunately, the term which has come up as the 'best' phrase is "spas in XXX" [XXX represents a location]. When reviewing the data, phrases such as "spas XXX" or "spa XXX" don't give me enough search volume to warrant optimizing for them. So, with that said, do I optimize the page without the word "in" and 'hope' we get the search volume for searches using the word "in", or do I optimize using the stop word? Any thoughts? Thank you!
Algorithm Updates | MarketingAgencyFlorida0
-
Too Many On-Page Links
After running a site analysis on here, it flagged a lot of pages with too many on-page links and said this might be why the site is being penalized. Thing is, I am not sure how to remedy this. One page that it says has 116 links is this one: http://www.whosjack.org/10-films-with-some-crazy-bitches/ - although there is only one link in the body. Then again, our home page, http://www.whosjack.org, has 165, which it again says is too many. Surely it doesn't count links all over the page, as otherwise every news homepage would be penalised? For example, what would happen here on this home page: http://www.dazeddigital.com/? Can anyone help me see what I am missing? Are there possible hidden links anywhere I should be looking for, etc.? (A quick way to check is sketched below.) Thanks
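One quick way to see where the links actually are is to count every anchor tag yourself; a minimal Python sketch using the third-party requests and BeautifulSoup packages:

```python
import requests
from bs4 import BeautifulSoup
from urllib.parse import urlparse

def count_links(url: str) -> None:
    """Count <a href> anchors on a page, split into internal vs external."""
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    site = urlparse(url).netloc
    hrefs = [a["href"] for a in soup.find_all("a", href=True)]
    internal = [h for h in hrefs if urlparse(h).netloc in ("", site)]
    print(f"{url}: {len(hrefs)} links ({len(internal)} internal, "
          f"{len(hrefs) - len(internal)} external)")

count_links("http://www.whosjack.org/")
```

Link counters include every anchor on the rendered page - navigation menus, sidebars, tag clouds, and footers - which is usually where the gap between "one link in the body" and 116 comes from.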
Algorithm Updates | luwhosjack0
-
Google said that low-quality pages on your site may affect rankings of other parts of the site
One of my sites got hit pretty hard during the latest Google update. It lost about 30-40% of its US traffic, and the future does not look bright considering that Google plans a worldwide roll-out. The problem is, my site is a six-year-old, heavily linked, popular WordPress blog, and I do not know why Google would consider it low quality. The only explanation I have come up with is the statement that low-quality pages on a site may affect its other pages (I think it was in the Wired article). If that is so, would you recommend blocking and de-indexing WordPress tag, archive, and category pages from the Google index (e.g. with a robots meta tag, as sketched below)? Or would you suggest waiting a bit longer before doing something that drastic? Or do you have another idea what I could do? I invite you to take a look at the site: www.ghacks.net
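For reference, de-indexing those archive pages usually means serving a robots meta tag on them; a minimal sketch of the tag (most WordPress SEO plugins can emit this conditionally on tag, category, and date-archive pages):

```html
<!-- Emitted in the <head> of tag, category, and date-archive pages only -->
<meta name="robots" content="noindex, follow" />
```

The "follow" part keeps crawlers passing through the archive's links even though the page itself drops out of the index.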
Algorithm Updates | badabing0