Google Penguin w/ Meta Keywords
-
It's getting really hard to filter through all the Penguin articles flying around right now, so excuse me if this has been addressed:
I know that Google no longer uses the meta keywords tag as a ranking indicator (VERY old news). But I'm wondering if they are starting to look at it as a bigger spam indicator now that Penguin is looking at over-optimization.
If yes, has anyone read a good article indicating so?
The reason I ask is because I have two websites, one is authoritative and the other… not so much. Recently my authoritative website has taken a dip in rankings, a significant dip. The non-authoritative one has increased in rankings… by a lot.
Now, the authoritative website's pages that use meta keywords seem to be the ones that are having issues… so it really has me wondering. Both websites compete with each other and are fairly similar in their offerings.
I should also mention that the meta keywords were implemented a long time ago… before I took over the account.
Also important to note: I never purchase links and never practice any spammy techniques. I am as white hat as it gets, which has me really puzzled as to why one site dropped drastically.
-
Thanks for the link.
I have them on lots of pages and am not worried about them... but they are not "overstuffed".
-
It is pretty much covered in this article by Danny Sullivan.
Excerpt: So use the tag? Sure, if you want to take a chance that by overstuffing it, you’ll cause Bing to think you’re spamming. Be safe, be smart, save your time. Don’t use it.
-
Can anyone suggest one reason why Google would consider meta keywords a sin?
If nobody has good reasons, then I doubt that Google would penalize for them.
-
While I haven't seen any Penguin update that mentioned the meta keywords specifically, there was a lot of talk of over-optimization, and the meta keywords tag can certainly fall into that arena. Old webmasters would stuff the meta keywords with tons of similar words. The meta keywords tag has no upside at all and can potentially be a red flag to both Bing and Google if the keywords in it are not consistent with the page content.
All that said, I would look for other reasons why the site may have been consumed by the Penguin. But if your meta keywords are excessive and repetitious, I would definitely remove them.
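To make that concrete, here is a hypothetical illustration of the kind of repetitious stuffing described above, next to the restrained form (the keyword values are made up, not from either of the sites in question):

```html
<!-- Hypothetical example: the kind of overstuffed, repetitious tag worth removing -->
<meta name="keywords" content="seo, seo services, seo service, best seo, cheap seo, seo company, seo companies, seo expert, seo experts, seo firm">

<!-- If the tag is kept at all, a short list consistent with the page content is the safer form -->
<meta name="keywords" content="seo services, search engine optimization">
```
-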
Related Questions
-
How to deal with spam-heavy industries that haven't gotten the hammer from Google?
One of our clients works in the video game category - specifically, helping people rank higher in games like League of Legends. In spite of our trying to do things the right way with white hat link building, we've suffered when trying to compete with others who are using comment and forum spam, private blog networks, and other black hat tactics. Our question is - what is the right approach here from a link building perspective? Is it an "if you can't beat them, join them" or do we wait it out and hope Google notices and punishes those who don't play nice? Some test terms to see what we're up against: "elo boost" and "lol coach." Would love to hear thoughts from anyone who's dealt with a similar situation.
-
Hreflang/Canonical Inquiry for Website with 29 different languages
Hello, So I have a website (www.example.com) that has 29 subdomains (es.example.com, vi.example.com, it.example.com, etc). Each subdomain has the exact same content for each page, completely translated into its respective language. I currently do not have any hreflang/canonical tags set up. I was recently told that this (below) is the correct way to set these tags up: for each subdomain (es.example.com/blah-blah for this example), I need to place hreflang tags pointing to the page the subdomain is on (es.example.com/blah-blah), in addition to each of the other 28 subdomains that have that page (it.example.com/blah-blah, etc). In addition, I need to place a canonical tag pointing to the main www. version of the website. So I would have 29 hreflang tags, plus a canonical tag. When I brought this to a friend's attention, he said that placing the canonical tag to the main www. version would cause the subdomains to drop out of the SERPs in their respective countries' search engines, which I obviously wouldn't want. I've tried to read articles about this, but I always end up hitting a wall and further confusing myself. Can anyone help? Thanks!
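For what it's worth, the setup most references (including Google's hreflang documentation) describe is a self-referencing canonical on each language version plus the full hreflang cluster, not a canonical back to www; pointing every subdomain's canonical at the www version asks Google to index only the www pages, which is exactly the drop-out-of-the-SERPs problem your friend describes. A sketch using the example.com subdomains from the question:

```html
<!-- In the <head> of es.example.com/blah-blah -->

<!-- The canonical points at this page itself, not at the www version -->
<link rel="canonical" href="https://es.example.com/blah-blah">

<!-- hreflang cluster: one entry per language version, including this page -->
<link rel="alternate" hreflang="en" href="https://www.example.com/blah-blah">
<link rel="alternate" hreflang="es" href="https://es.example.com/blah-blah">
<link rel="alternate" hreflang="vi" href="https://vi.example.com/blah-blah">
<link rel="alternate" hreflang="it" href="https://it.example.com/blah-blah">
<!-- …repeat for the remaining subdomains… -->

<!-- Optional fallback for languages you don't cover -->
<link rel="alternate" hreflang="x-default" href="https://www.example.com/blah-blah">
```

Every other language version would carry the same hreflang set, each with its own self-referencing canonical.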
-
Looking for a Way to Standardize Content for Thousands of Pages w/o Getting Duplicate Content Penalties
Hi All, I'll premise this by saying that we like to engage in as much white hat SEO as possible. I'm certainly not asking for any shady advice, but we have a lot of local pages to optimize :). So, we are an IT and management training course provider. We have 34 locations across the US, and each of our 34 locations offers the same courses. Each of our locations has its own page on our website. However, in order to really hone the local SEO game by course topic area and city, we are creating dynamic custom pages that list our course offerings/dates for each individual topic and city. Right now, our pages are dynamic and being crawled and ranking well within Google. We conducted a very small-scale test of this in our Washington DC and New York areas with our SharePoint course offerings, and it was a great success. We are ranking well on "sharepoint training in new york/dc" etc. for two custom pages. So, with 34 locations across the states and 21 course topic areas, that's well over 700 pages of content to maintain - A LOT more than just the two we tested. Our engineers have offered to create a standard title tag, meta description, h1, h2, etc., but with some varying components. This is from our engineer specifically: "Regarding pages with the specific topic areas, do you have a specific format for the Meta Description and the Custom Paragraph? Since these are dynamic pages, it would work better and be a lot easier to maintain if we could standardize a format that all the pages would use for the Meta and Paragraph. For example, if we made the Paragraph: 'Our [Topic Area] training is easy to find in the [City, State] area.' As a note, other content such as directions and course dates will always vary from city to city, so content won't be the same everywhere, just slightly the same. It works better this way because HTFU is actually a single page, and we are just passing the venue code to the page to dynamically build the page based on that venue code. So they aren't technically individual pages, although they seem like that on the web. If we don't standardize the text, then someone will have to maintain custom text for all active venue codes for all cities for all topics. So you could be talking about over a thousand records to maintain, depending on what you want customized. Another option is to have several standardized paragraphs, such as 'Our [Topic Area] training is easy to find in the [City, State] area.' or 'Find your [Topic Area] training course in [City, State] with ease.' - each followed by other content specific to the location. Then we could randomize what is displayed. The key is to have a standardized format so additional work doesn't have to be done to maintain custom formats/text for individual pages." So, mozzers, my question to you all is: can we standardize with slight variations specific to that location and topic area w/o getting dinged for spam or duplicate content? Oftentimes I ask myself, "If Matt Cutts was standing here, would he approve?" For this, I am leaning towards "yes," but I always need a gut check. Sorry for the long message. Hopefully someone can help. Thank you!
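As a rough sketch of the single-template approach the engineer describes (placeholders filled per venue code; all copy and structure here are hypothetical):

```html
<!-- One dynamic template; [Topic Area] and [City, State] come from the venue code -->
<title>[Topic Area] Training in [City, State]</title>
<meta name="description" content="Our [Topic Area] training is easy to find in the [City, State] area. View upcoming course dates and directions.">

<h1>[Topic Area] Training in [City, State]</h1>

<!-- One of several standardized intro paragraphs, rotated per venue/topic -->
<p>Our [Topic Area] training is easy to find in the [City, State] area.</p>
<!-- alternate: <p>Find your [Topic Area] training course in [City, State] with ease.</p> -->

<!-- Location-specific content (directions, course dates) varies per venue, which is
     what keeps these pages from reading as near-duplicates of one another -->
```
-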
Should I report this to Google, and will anything happen?
Hi, I am working with a client and have discovered that a direct competitor has hidden the client's business name in meta information and also hidden the name on the page, but off to the side. My intention is to ask the company to remove the content, but the client would like me to report it to Google. Is this a waste of time, and what request in Webmaster Tools should I use? The name is not a trademark, but the business name is not generic, and it is an obvious attempt to target my client's business. Any help would be appreciated. Thanks in advance
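For reference, "hidden off to the side" usually means an off-screen CSS pattern along these lines (a made-up illustration, not the competitor's actual markup):

```html
<!-- Text pushed off-screen so users never see it but crawlers still read it;
     this is the classic hidden-text case in Google's quality guidelines -->
<div style="position: absolute; left: -9999px;">Competitor-targeted business name</div>
```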
-
Google authorship and multiple sites with multiple authors
Hi guys :). I am asking for your help - basically, I would like to know the best way to set all of this up. I have two main (e-commerce) sites and a few other big web properties. What I would like to know is whether it is OK to link the main sites to my real G+ account and use alias G+ accounts for the other web properties, or is that a kind of spamming? The thing is that I use a G+ account for those e-commerce sites and would not necessarily want the other web properties to be linked to the same G+ account, as they are not really related. I do hope I was clear. Any insight would be appreciated. Thanks.
-
Article Re-posting / Duplication
Hi Mozzers! Quick question for you all. This is something I've been unsure of for a while. When a guest post you've written goes live on someone's blog, is it then okay to post the same article to your own blog, as well as Squidoo for example? Would the search engines still see it as duplication if I have a link back to the original?
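A plain link back helps with attribution, but where the platform lets you edit the page head, the stronger signal is a cross-domain rel=canonical on the republished copy pointing at the original; Google has supported rel=canonical across domains since 2009. A sketch with hypothetical URLs:

```html
<!-- In the <head> of the republished copy, pointing at the original guest post -->
<link rel="canonical" href="http://host-blog.example.com/original-guest-post">
```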
-
I need de-spam help/advice
For one of the sites I am working on, I outsourced SEO about 3 years ago. One of the "tricks" the SEO used at the time was to pay for several blog posts to be "sponsored" by this website, using exact-match keywords for the domain.
1. Where do I look to determine the spammy links pointing to this site?
2. Have you had success getting rid of these bad links?
-
We seem to have been hit by the Penguin update - can someone please help?
Hi, Our website www.wholesaleclearance.co.uk has been hit by the Penguin update. I'm not an SEO expert, and when I first started, my SEO got caught up buying blog links. That was about 2 years ago, and since then I have worked really hard to get good manual links. Does anyone know of a way to dig out any bad links so I can get them removed? Is there any software that will give me a list, or do any of you guys want to take a look for me? I'm willing to pay for the work. Kind Regards, Karl.