Is This Worth Fixing?
-
Hi,
I'm working on a site that was last optimized some years ago. It has a fair number of pages where the URL, H1, title tag, and image alt text are all an exact match for the target keyword. Although this comes back as an A+ in Moz's on-page grader, it seems a bit much.
What do you think, is all this too heavy an SEO fingerprint for Google?
-
Hi Erica,
The keyword isn't on the page too many times; it's just relentlessly exact-matched between the H1, URL, title tag, and img alt. But I guess that's okay.
Thanks
-
No, if you put the keyword in too many times, the page grader will give you a bad grade.
-
Hi Erica,
Thanks for the message. Whaddya mean... the page grader gives it an "A" for the keyword it hammered?
Thanks...
-
Very true.
-
The page grader will give you a bad grade if you keyword stuff.
-
Sounds like it should be fine. Unless Google slaps you for keyword stuffing, you haven't done anything wrong... hell, it could even be argued you're following a best practice. As long as the URL, H1, title tag, and image alt are relevant and accurately reflect the keyword, you're golden.
If anything, you may want to strengthen your keyword strategy and work some synonyms in there... catch a few more eyes and rank for additional keywords. This can also help water down the super-optimized pages and make them feel a bit safer.
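To make that concrete, here's a rough sketch of what "watering down" the exact matching might look like. The keyword "blue widgets", the URL, and the file names are all hypothetical, just for illustration:

```html
<!-- Before: URL, title, H1, and alt all exact-match the keyword -->
<!-- URL: https://example.com/blue-widgets/ -->
<title>Blue Widgets</title>
<h1>Blue Widgets</h1>
<img src="hero.jpg" alt="blue widgets">

<!-- After: same topic and URL, but synonyms and modifiers vary the elements -->
<!-- URL: https://example.com/blue-widgets/ -->
<title>Blue Widgets | Durable Cobalt Gadgets for Every Budget</title>
<h1>Blue Widgets and Gadgets</h1>
<img src="hero.jpg" alt="a set of durable cobalt-colored widgets on a workbench">
```

Each element still describes the same thing, so relevance is intact, but the page can now pick up related queries instead of hammering one exact phrase four times.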
Hope that helps,
Christopher