Should I change some pages with keyword stuffing?
-
Hello, I have a website that is one year old. When I started and created the pages, they had keyword stuffing (the same word repeated 20, 30, or 40 times in the meta descriptions and text; sometimes 15, sometimes 20, and sometimes 40). Since I noticed this (about 4 months ago), I have changed my approach and now build new pages that repeat a keyword only 5-10 times.
Some of the pages with many repeated keywords (20, 30, or 40) rank very well and I would not want to lose their positions in Google, but I also don't want to be penalized for the stuffing.
So my question is:
Should I change the old pages that have keyword stuffing, or leave them as they are?
Thanks so much.
-
I would update the meta descriptions, page titles, meta keywords, and the body content to eliminate the stuffing.
You shouldn't see a drop in ranking by removing extra keywords from the page titles.
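If it helps with the cleanup, below is a rough way to check which pages still have a heavily repeated word in the title or meta description before editing them by hand. It's only a sketch, assuming Python with the requests and beautifulsoup4 packages installed; the example.com URLs and the repetition threshold are placeholders, not values from this thread.

```python
from collections import Counter

import requests
from bs4 import BeautifulSoup

# Placeholder URLs -- swap in the pages you want to audit.
PAGES = [
    "https://www.example.com/page-1",
    "https://www.example.com/page-2",
]
MAX_REPEATS = 3  # arbitrary threshold for flagging a repeated word


def most_repeated_word(text):
    """Return the single most repeated word in a string and its count."""
    words = [w.strip(".,:;!?\"'()").lower() for w in text.split()]
    words = [w for w in words if w]
    if not words:
        return None, 0
    return Counter(words).most_common(1)[0]


for url in PAGES:
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    title = soup.title.get_text() if soup.title else ""
    meta = soup.find("meta", attrs={"name": "description"})
    description = meta.get("content", "") if meta else ""
    for label, text in (("title", title), ("meta description", description)):
        word, count = most_repeated_word(text)
        if count > MAX_REPEATS:
            print(f"{url}: '{word}' appears {count}x in the {label}")
```

Anything the script flags is a candidate for the rewrite described above.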
-
Thanks, Michael and SEO 5 Team. My site is about coupon codes. I have good positions with these pages and I'm afraid of losing them. I can see that the titles and meta descriptions are not okay: they contain many repeated keywords and are too long.
What do you think about this? Should I change only the meta descriptions, or the titles too? I have read that if I change a title I can lose more positions. Is that true?
Thanks so much again, and sorry for my poor English.
-
I would agree with Michael's assessment. Better to be safe than sorry. Reduce the keyword stuffing proactively, think more about the visitor than the search engine, and provide them with useful content.
-
I would update the pages to reduce the keyword stuffing. You run the risk of getting penalized (and not knowing it), basically because Panda's evaluation of the quality of the page is likely going to suffer. While we don't know for certain whether keyword stuffing is one of the signals Panda measures, I believe it's extremely likely because:
- it was a heavily used spam technique for so many years
- it's really easy for Panda to measure algorithmically (see the sketch below)
As well, conversion rates for real customers who land on the page will be better with un-stuffed pages.
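To show how simple that kind of measurement could be, here is a rough keyword-density sketch in Python. It is not Google's actual algorithm (nobody outside Google knows that); the sample text and the 5% threshold are made up purely for illustration.

```python
import re
from collections import Counter


def top_keyword_density(text, top_n=3):
    """Share of all words taken up by the most repeated words on a page."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    total = len(words) or 1
    return [(word, count, 100.0 * count / total)
            for word, count in Counter(words).most_common(top_n)]


# Hypothetical stuffed snippet, purely for demonstration.
sample = ("coupon codes, best coupon codes, free coupon codes, "
          "coupon codes online, top coupon codes, new coupon codes")

for word, count, pct in top_keyword_density(sample):
    flag = "  <-- likely stuffing" if pct > 5.0 else ""
    print(f"{word}: {count} times ({pct:.1f}% of all words){flag}")
```

A check that cheap to run on body copy, titles, and meta descriptions is exactly the kind of signal an algorithm can pick up at scale.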
-
Related Questions
-
Implications of Disallowing A LOT of Pages
Hey everyone, I just started working on a website and there are A LOT of pages that should not be crawled - probably in the thousands. Are there any SEO risks of disallowing them all at once, or should I go through systematically and take a few dozen down at a time?
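Not an answer on the risk itself, but one low-stress step is to test the proposed robots.txt rules against a sample of URLs before deploying them, so you know exactly what would get blocked. A minimal sketch using Python's standard-library robotparser; the rules and URLs below are made-up placeholders:

```python
from urllib import robotparser

# Hypothetical robots.txt rules being considered (placeholders).
proposed_rules = """\
User-agent: *
Disallow: /internal/
Disallow: /search/
"""

# Hypothetical sample of URLs taken from a crawl or sitemap (placeholders).
urls_to_check = [
    "https://www.example.com/internal/report-123",
    "https://www.example.com/products/blue-widget",
    "https://www.example.com/search/?q=widgets",
]

parser = robotparser.RobotFileParser()
parser.parse(proposed_rules.splitlines())

for url in urls_to_check:
    status = "allowed" if parser.can_fetch("*", url) else "BLOCKED"
    print(f"{status:7}  {url}")
```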
Technical SEO | rachelmeyer1
-
Similar pages on a site
Hi, I think it was at BrightonSEO where PI DataMetrics were talking about how similar pages on a website can cause rankings to drop for your main page. This has got me thinking. If we have a category about jumpers, so example.com/jumpers, but our blog also has a category about jumpers where we write all about jumpers, which creates a category page at example.com/blog/category/jumpers, should these blog category pages have noindex applied to them to stop them ranking in Google? Thanks in advance for any tips. Andy
Technical SEO | Andy-Halliday1
-
How to identify orphan pages?
I've read that you can use Screaming Frog to identify orphan pages on your site, but I can't figure out how to do it. Can anyone help? I know that Xenu Link Sleuth works but I'm on a Mac so that's not an option for me. Or are there other ways to identify orphan pages?
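One approach that doesn't need Screaming Frog or Xenu: compare the URLs listed in your XML sitemap against the URLs a simple crawl of your internal links can actually reach; sitemap URLs the crawl never finds are likely orphans. A rough sketch, assuming Python with the requests and beautifulsoup4 packages and placeholder example.com URLs:

```python
from urllib.parse import urljoin, urlparse
from xml.etree import ElementTree

import requests
from bs4 import BeautifulSoup

SITEMAP_URL = "https://www.example.com/sitemap.xml"  # placeholder
START_URL = "https://www.example.com/"               # placeholder


def sitemap_urls(sitemap_url):
    """Collect every <loc> entry from a standard XML sitemap."""
    ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
    root = ElementTree.fromstring(requests.get(sitemap_url, timeout=10).content)
    return {loc.text.strip() for loc in root.findall(".//sm:loc", ns)}


def crawl_internal_links(start_url, limit=500):
    """Follow internal links breadth-first and return every URL reached."""
    domain = urlparse(start_url).netloc
    seen, queue = set(), [start_url]
    while queue and len(seen) < limit:
        url = queue.pop(0)
        if url in seen:
            continue
        seen.add(url)
        soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
        for a in soup.find_all("a", href=True):
            link = urljoin(url, a["href"]).split("#")[0]
            if urlparse(link).netloc == domain and link not in seen:
                queue.append(link)
    return seen


orphans = sitemap_urls(SITEMAP_URL) - crawl_internal_links(START_URL)
for url in sorted(orphans):
    print("Possible orphan:", url)
```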
Technical SEO | MarieHaynes0
-
Wrong Page Ranking
Higher-level page with more power getting pushed out by a weaker page in the SERPs for an important keyword. I don't care about losing the weaker page. Should I: (1) 404 the weaker page and wait for Google to (hopefully) replace it with the stronger page, or (2) 301 the weaker page to the stronger page? NOTE: Due to poor communication between the content team and myself, the weak and strong pages have similar title tags (i.e., "lawsuits" and "litigation").
Technical SEO | LCNetwork0
-
Google Page speed
I get the following advice from Google Page Speed: "Suggestions for this page: The following resources have identical contents, but are served from different URLs. Serve these resources from a consistent URL to save 1 request(s) and 77.1KiB. http://www.irishnews.com/ http://www.irishnews.com/index.aspx" I'm not sure how to fix this; the default page is http://www.irishnews.com/index.aspx. Does anybody know what needs to be done? Please advise. Thanks
Technical SEO | Liammcmullen0
-
Duplicate Page Content
I've got several pages of similar products that Google has listed as duplicate content. I have them all set up with rel="prev" and rel="next" tags telling Google that they are part of a group, but it still has them listed as duplicates. Is there something else I should do for these pages, or is that just a shortcoming of Google's webmaster tools? One of the pages: http://www.jaaronwoodcountertops.com/wood-countertop-gallery/walnut-countertop-9.html
Technical SEO | JAARON0
-
Consolidate page strength
Hi, Our site has a fair amount of related/similar content that has historically been placed on separate pages. Unfortunately this spreads our page strength across multiple pages. We are looking to combine this content onto one page so that our page strength will be focused in one location (optimized for search). The content is extensive, so placing it all on one page isn't ideal from a user-experience standpoint (it's better to separate it out). We are looking into different approaches: (1) One main "tabbed" page with query string params to separate the individual pages. We'll use an AJAX-driven design, but for non-JS browsers we'll gracefully degrade to separate pages with query string params: www.xxx.com/content/?pg=1, www.xxx.com/content/?pg=2, www.xxx.com/content/?pg=3. We'd then rel canonical all three pages to just www.xxx.com/content/. (2) The same concept, but using the AJAX crawlable hashbang design (#!). Load everything onto one page, but the page could get quite large, so latency will increase. I don't think there is much difference between options 1 and 2 from an SEO perspective; we'll mostly be relying on Google honoring the rel canonical tag. Have others dealt with this issue where you have lots of similar content? From a UX perspective you want to separate/classify it, but from an SEO perspective you want to consolidate. It really is very similar content, so using a rel canonical makes sense. What have others done? Thoughts?
Technical SEO | NicB10