Unique meta descriptions for 2/3 of it, but then identical ending?
-
I'm working on an eCommerce site and had a question about my meta descriptions. I'm writing a unique meta description for each category and subcategory, but I'm thinking of adding the same ending to each one. For example: "Unique description, blah blah blah. Free Overnight Shipping...". So the "Free Overnight Shipping..." ending would appear on all the categories. It's an ongoing promo, so I feel it's important to include it to attract buyers, but I don't want to run into duplicate content problems. Any suggestions? Thanks for your feedback!
-
You don't get penalized for duplicate meta descriptions. Every meta description on your site could be exactly the same and it wouldn't matter; they don't impact rankings. What they do impact is how likely a searcher is to click through to your website.
-
Hi,
When it comes to meta descriptions, here are the main things to consider:
- Keep them over 100 characters and under 150 characters.
- Incorporate the primary keyword you're targeting.
- Give the user a valuable, compelling reason to visit the page by working keywords in conversationally.
- Keep them unique.
Unique means different for every single page on your website. As long as each one is different, even if only by a word, search engines won't consider it duplicate ;). And yes, you're right: it's always good to mention services such as free shipping, discounts, etc.
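The checklist above can be sketched as a quick audit script. This is just an illustrative sketch, not something from the thread: the URLs, description copy, and promo suffix are made-up examples, and the 100-150 character range comes from the list above.

```python
# Minimal sketch of auditing meta descriptions that share a promo ending.
# The URLs, copy, and suffix below are hypothetical examples.

PROMO_SUFFIX = " Free Overnight Shipping on all orders."

def audit(descriptions, min_len=100, max_len=150):
    """Flag descriptions that are exact duplicates or outside the length range."""
    problems = {}
    first_seen = {}  # description text -> first URL that used it
    for url, text in descriptions.items():
        issues = []
        if not (min_len <= len(text) <= max_len):
            issues.append(f"length {len(text)} outside {min_len}-{max_len}")
        if text in first_seen:
            issues.append(f"exact duplicate of {first_seen[text]}")
        else:
            first_seen[text] = url
        if issues:
            problems[url] = issues
    return problems

descriptions = {
    "/mens-shoes": "Shop men's dress and casual shoes in every size and width, "
                   "with free returns on all styles." + PROMO_SUFFIX,
    "/womens-shoes": "Browse our full range of women's heels, flats, and boots, "
                     "with free returns on all styles." + PROMO_SUFFIX,
}

for url, issues in audit(descriptions).items():
    print(url, "->", "; ".join(issues))
```

Because only the ending is shared, each description is still unique overall, so a check like this should come back clean; only an exact, character-for-character duplicate (or an out-of-range length) gets flagged.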
Good luck! On-page optimization takes time, but it's worth it.
-
Hi-
I'm not sure how much meta descriptions really matter for duplicate content purposes; I believe they're mostly used for display in the SERPs. Either way, it sounds like yours will be somewhat unique, and if the page content is unique I don't think you'll have an issue.
Good Luck
Ken
Related Questions
-
Flurry of thousands of bad links from 3 spammy websites. Disavow?
I also discovered that the website www.prlog.ru put 32 links to my website. It's a Russian site, and it has a 32% spam score. Is that high? I think I need to disavow it. Another spammy website linking to me has a spam score of 16%, with several thousand links. I added one link to the site medexplorer.com 6 years ago and it was fine; now it has thousands of links. Should I disavow all three?
White Hat / Black Hat SEO | Boodreaux
-
Do Ghost Traffic/Spam Referrals factor into rankings, or do they just affect the CTR and Bounce Rate in Analytics?
So, by now I'm sure everyone who pays attention to their Analytics or GWT (or Search Console, now) has seen spam referral traffic and ghost traffic showing up (ilovevitaly.com, simple-share-buttons.com, semalt.com, etc.). Here is my question: does this factor into rankings in any way? We all know that click-through rate and bounce rate (might) send signals to the algorithm and flag a low-quality site, which could affect rankings. I guess what I'm asking is: is any of that data coming from Analytics? Since ghost referral traffic never actually visits my site, how could it affect the CTR or bounce rate that the algorithm is seeing? I'm hoping it only affects my bounce rate and CTR inside Analytics, so I can just filter that stuff out with Analytics filters and it will never affect my rankings. But since we don't know exactly where the algorithm pulls its CTR and bounce rate data from, I'm worried that the large amount of spam/ghost traffic I see in Analytics could be harming my rankings. Sorry, long-winded way of saying: should I pay attention to this traffic? Should I care about it? Will it harm my site or my rankings at all? And finally, when is Google going to shut these open back doors in Analytics so that Vitaly and his ilk are shut down forever?
White Hat / Black Hat SEO | seequs
-
On the use of the Disavow tool / Have I done it correctly, or what's wrong with my approach?
On a site I used GSA Search Engine Ranker. I got some good links out of it, but I also got 4,900 links from a single domain. My understanding, based on Ahrefs, is that 4,900 links from one domain are worth about the same as one link from that domain. So I downloaded those 4,900 links and added 4,899 of them to the disavow tool, to keep my site's rankings stable and safe from any future penalty. Is that a correct way to use the disavow tool? The site's rankings are unchanged.
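For reference, Google's disavow file is a plain UTF-8 text file with one entry per line: a line starting with `domain:` disavows every link from that domain, a bare URL disavows links from that single page, and lines starting with `#` are comments. The domain below is a placeholder, not one from this thread:

```text
# Placeholder disavow file (the domain is made up)
# Disavow every link from an entire domain:
domain:spammy-directory.example
# Disavow links from a single page:
http://spammy-directory.example/some-page.html
```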
White Hat / Black Hat SEO | AMTrends
-
One page with multiple sections - unique URL for each section
Hi All, This is my first time posting to the Moz community, so forgive me if I make any silly mistakes. A little background: I run a website for a company that makes custom parts out of specialty materials. One of my strategies is to create high-quality content about all areas of these specialty materials to attract potential customers - pretty straightforward stuff.
I have always struggled with how to structure my content. From a usability point of view, I like having one page for each material, with different subsections covering different topical areas. Example: for a specialty metal I would have one page with subsections about the mechanical properties, thermal properties, available types, common applications, etc. Basically how Wikipedia organizes its content. I don't have a large amount of content for each section, but as a whole it makes one nice cohesive page for each material. I do use H tags to mark the specific sections on the page, but I'm wondering if it would be better to have one page dedicated to the specific material properties, one page dedicated to specific applications, and one page dedicated to available types. What are the community's thoughts on this? As a user of the website, I would rather have all of the information on a single, well-organized page for each material. But what do SEO best practices have to say about this?
My last thought would be to create a hybrid website (I don't know the proper term). Have a look at these examples from Time and Quartz. When you are viewing an article, the URL is unique to that page. However, when you scroll to the bottom of the article, you can keep scrolling into the next article, with a new unique URL - all without clicking through to another page. I could see this technique being ideal for a good web experience while still allowing me to optimize my content for more specific topics/keywords.
If I used this technique with the canonical tag, would I then get the best of both worlds? Let me know your thoughts! Thank you for the help!
White Hat / Black Hat SEO | jaspercurry
-
Can you have too many NOINDEX meta tags?
Hi, Our Magento store has a lot of duplicate content issues. After trying various configurations with canonicals and robots, we decided it was best, and easier to manage, to implement meta NOINDEX tags on the pages we wish the search engines to ignore. There are about 10,000 URLs on our site that can be crawled: 6,000 are meta noindex, and 3,000-odd are index,follow. That is a high proportion of meta noindex tags - can that harm our SEO efforts? Thanks, Ben
White Hat / Black Hat SEO | bjs2010
-
Why would links that were deleted by me 3 months ago still show up in reports?
I inadvertently created a mini link farm some time back by linking all of my parked domains (2,000-plus) to some of my live websites (I was green and didn't think linking between same-owner sites/domains was an issue). These websites were doing well until Penguin, and although I did not get any 'bad link' notices from Google, I figure I was hit by Penguin. So about 3 or 4 months ago I painstakingly deleted ALL links from all of those domains that I still own (only 500 or so - the others were allowed to lapse). None of those domains have any outbound links at all now, but old links from those domains are still showing up in WMT, in SEOmoz, and in every other link-tracking report I have run. So why would these links still be reported? How long do old links stay in the internet archives? This may sound like a strange question, but do links 'remain with a domain for a given period of time regardless'? Are links archived before being 'thrown out' of the web? I know Google keeps archives of expired or deleted data, closed websites, etc., for about 3 years or so (?). In an effort to correct the situation I have spent countless hours manually deleting thousands of links, but they won't go away. Looking for some insight here please. cheers, Mike
White Hat / Black Hat SEO | shags38
-
Are paid reviews gray/black hat?
Are sites like ReviewMe or PayPerPost white hat? Are follow links allowed within the post? Should I use those aforementioned services, or cold contact high authority sites within my niche?
White Hat / Black Hat SEO | 10JQKAs
-
What on-page/site optimization techniques can I utilize to improve this site (http://www.paradisus.com/)?
I used a search engine spider simulator to analyze the homepage, and I think my client is using black hat tactics such as cloaking. Am I right? Also, any recommendations on how to improve the top navigation under the Resorts pull-down? Each of the 6 resorts listed is part of the Paradisus brand, but each resort has its own subdomain.
White Hat / Black Hat SEO | Melia