Stuffing keywords into URLs
-
The following site ranks #1 in Google for almost every key phrase in their URL path, for almost every page on their site. Example: themarketinganalysts.com/en/pages/medical-translation-interpretation-pharmaceutical-equipment-specifications-medical-literature-hippa/ The last folder in this URL packs in 9 keywords, and I've seen as many as 18 on the same site. Curious: every page is a "default.html" under one of these kinds of folders (why so much architecture?).
Question: How much does stuffing keywords into URL paths affect ranking? If it has an effect, will Google eventually ferret it out and penalize it?
-
This was a good answer and deserves to be labeled as such. I decided not to pursue this since I have been lucky to take the top spot for important key phrases. Thank you for such a well crafted answer.
-
Hi Paul, no problem at all. As Ryan says, we all like a mystery.
As for the canonicals, they can have a big effect if all variations of the domain are live and serving the same pages, i.e. the http://, http://www, etc. versions.
Not only are these duplicate pages, they will most likely split up any inbound link juice, as you can see from the PA of the pages you mention. Go to the http:// version and the http://www version and you'll see the problem!
Using <link rel="canonical" href="http://www.mydomain.com/" /> would probably be sufficient, and should be included, but I think it's best to have the canonical versions redirected properly in the .htaccess.
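For what it's worth, here is a minimal sketch of the kind of .htaccess redirect I mean, assuming Apache with mod_rewrite enabled and with www.mydomain.com standing in as a placeholder for the preferred hostname:
# Send every non-www request to the www hostname with a permanent 301
RewriteEngine On
RewriteCond %{HTTP_HOST} ^mydomain\.com$ [NC]
RewriteRule ^(.*)$ http://www.mydomain.com/$1 [R=301,L]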
Very best wishes
Trevor
-
Thanks for the kind words Paul.
If you are looking for outstanding SEOs to follow, I would recommend EGOL and Alan Bleiweiss. I merely ride in the wake of their excellence.
Your response jumped around a bit, but here are a few replies I would offer:
-
You are right. The value of most directories has dropped significantly. There are very few that offer any real value nowadays.
-
MVC is the current best practice for web design, but friendly URLs are a separate item. You can achieve them with or without MVC.
-
Most people who complain about their site's ranking drop actually have issues on their site if you look closely. I can't begin to share how many people I have encountered who were insistent their site was outstanding when their site had numerous issues.
-
Likewise, I have worked with clients who were quite upset about other sites that ranked well, referring to them as "junk" sites when those rankings were earned. Yes, there are exceptions and Google still has work to do, but they are doing a reasonable job. The truly bad sites usually disappear in 4-8 weeks.
-
I know nothing about "The Marketing Analysts" but they could have an offline presence or have undergone a name change, which may explain the "Since 1989" claim. Let's remember Al Gore didn't invent the internet until about 1996, and there have been tremendous changes since then.
-
-
Hi Egol,
Thank you for your reply. The long folder names are probably from using WordPress, as you pointed out. I found a blog on their subdomain that uses WordPress.
I have to say that I've enjoyed reading your responses throughout the QA forum because your responses are short and to the point, pithy and no-BS. So, I'm curious about your response to my question. Above you responded "I doubt it" to the question about Google ferreting out keyword stuffed URL paths. Instead of trying to read between the lines, let me ask you, how good of a job is Google doing? How are they falling short?
Kindest regards,
Paul
-
Hi Trevor!
Thank you for your response! I'm VERY new to the concept of canonical issues. As I mentioned in my other response, I'm just getting back into the game. How big a role do you think the canonical issue really plays?
Kind regards,
Paul
-
Hi Ryan!!
Man, I'm thrilled to see you responded, and that you responded so thoroughly. I've been reading threads in this QA forum for a few days, and I've come to think of you as a bit of an SEO celebrity! I have to figure out how to filter for questions you've answered! : )
Okay...the site. I've been away from SEO for about eight years and a lot has changed. In the past, I've enjoyed top positions in the SERPs under highly competitive key phrases, even recently (probably because good legacy websites seem to carry weight). Back then, I placed my primary site in directories thinking that people who visited the directories would see my listing and click on it and visit me (as opposed to getting a link to get "juice"). This is probably what has been giving my site good rankings for a while, and the fact that I've never used web-chicanery to outrank others. Over the years, I've seen spammy and trickster sites appear and disappear. I used to rip those sites (the only way to get a global vision of what's going on), and I studied what they did. I've got a curious little archive of living black hat tricks, all of which failed as Google caught on to them.
Now I turn my reflectors back on to what's going on in SEO and what companies and individuals are doing to position themselves in SERPs. I'm saddened to report this, but for all the overhauls, tweaking and tinkering that Google has done since 2001 when I started, spammy sites and sites with poor content, usability, usefulness, and design are still outranking truly useful, high-quality, high-integrity sites.
Very recently, I read complaints by people who felt their sites had been unfairly affected by the Panda update (http://www.google.com/support/forum/p/Webmasters/thread?tid=76830633df82fd8e&hl=en). I followed the links to "offending" sites (sites people felt ranked higher than theirs for no good reason), and I went through the code in the complainants' sites as well. Holy cow...many of the complainants have good reason to complain. Shallow, spammy, zero-effort sites are blowing away robust sites with truly useful content. In 10 years I've NEVER had a sinking feeling in my gut that ranking well was a crapshoot - but I got that feeling after studying those complaints.
Years ago I worked in Macromedia Dreamweaver (remember how cool "Macromedia" was?) with regular HTML and nowadays I work in Visual Studio, just recently creating my first MVC3 site. MVC allows you to manipulate every tiny aspect of your site, including the URL path. There is absolutely no relation between the path you see in your browser and the actual path to the files on the server. And you can change the path and name of any page instantly and almost effortlessly. It's GREAT for SEO. So, I've been paying special attention to directory names and page names out there on the Internet. That's when I came across "themarketinganalysts" site and their unusually high rankings for so many important key phrases. After combing through that site, studying the source code, checking their rankings across many key phrases - I have to say, regardless of PA of 53 and keyword variances, the code reminds me of some of the code from spammy trickster sites from the early 90s.
If you hand-code HTML, you get a certain vision for what the page will look like as you type along, from the mind's eye of a visitor. When you go to a site and the code is packed with keywords and weird uses of elements (like The Marketing Analysts' textless use of the H1 tag to render the logo through CSS – an old trick for keeping keyword text in the heading while visitors only ever see the logo), you get the feeling that whoever wrote that code is telling search engines one thing and visitors something different. It's duplicitous. Oddly enough, I'm not fazed by a company that outranks me (there is enough work for ALL of us), but I want to see healthy optimization, not one story in the code and another on the rendered page.
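For anyone who hasn't seen that trick, here's a rough sketch of the kind of image-replacement markup I mean (hypothetical code to illustrate the pattern, not their actual source):
<h1 class="logo">Medical Translation Services</h1>
<style>
/* The keyword text stays in the H1 for crawlers, but is pushed off-screen;
   visitors only ever see the logo image. */
.logo {
  background: url(logo.png) no-repeat;
  width: 300px;
  height: 80px;
  text-indent: -9999px;
  overflow: hidden;
}
</style>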
I'm going to do a more in-depth review of the code, page by page, look for trends and track down the sources that provide PA coefficients (or try to!). I’ll use the Wayback Machine to study the evolution of the site. Off the bat:
Mar 21, 2009 "This website coming soon"
Mar 31, 2009 "PREDICTIVE WEB ANALYTICS" - nothing about translation
May 25, 2009 Starts taking its current form
Odd. This is claimed on the current site: "Since 1989, The MARKETING ANALYSTS has built its Language Translation Services business..." That claim is not supported by what the Wayback Machine shows. Geesh... Did I stumble across enterprise-wide shadiness? Hope not!
I'll come back to you and share my SEO findings.
-
Yep, those PAs are strong even without canonicalization. Let's hope for Paul's sake that the site doesn't get an SEO audit anytime soon!
-
Really great catch on the canonical issue Trevor! The entire time I just knew I was missing something, and that's it.
The www version of the URL has a PA of 53, which puts it as even stronger than the wiki page. The links mostly use "medical translation" as the anchor text, with some "medical translator" and "medical translation service" variations thrown in. The link profile is varied enough to satisfy me that the page has earned its ranking.
-
Hi Ryan, I noticed that the site has a canonical issue with both an http and a www version too. Nice and thorough analysis – really interesting regarding the flag. Now I'm back home I might just have to take a look... although I really should think about getting some shut-eye here in Blighty.
-
I love a great SEO mystery and, for me at least, you have found one. I think this is a case for the famous SEO forensic analyst Alan "Sherlock" Bleiweiss.
I can confirm your overall findings and cannot explain the results. Specifically, on Google.com I searched for "medical translation". The results are listed below.
Result #19: http://en.wikipedia.org/wiki/Medical_translation
PA: 52, DA 98
Title: Medical translation - Wikipedia, the free encyclopedia
H1: Medical translation
First words of content: Medical translation is the translation of technical, regulatory....
Internal links (2): Anchor text on both links is "medical translation". Lowest PA of a linked page is 61. About 1000 links per page.
Result #1: the themarketinganalysts.com medical translation page from your question
Title: Medical Translation Services: Pharmaceutical, Equipment, Specifications, Medical Literature, HIPPA, [99 chars in title so display is cut-off]
PA: 12, DA 60
H1: none. H2: Medical Translation: Medical Translation Services: Pharmaceutical, Equipment, Specifications, Medical Literature, HIPPA
First words of content: When it comes to the medical translation, you can trust THE MARKETING ANALYSTS.
Internal links (3): Anchor text on all three "Medical translation". The highest PA from a page is 15. One of the links is from the home page which has 220 links total.
As I try to reach for some other factor that would allow this site to rank so well compared to the wiki page I notice the following:
-
the site has "medical translation" in it's site's navigation bar
-
the site has a link in the left sidebar on the home page directly to the page. The sidebar is a tad spammy with 43 links.
The above two items are factors, but not enough to explain the ranking for me.
I still couldn't explain the ranking, so I searched the visible page for the term "medical". It only appeared twice, yet a browser "find" indicated the term was used many more times on the page without being visible. After searching the HTML and CSS I determined there was extra hidden content. I could not find anything suspicious in the CSS and was puzzled about how this content was being hidden – then I realized the "trick" involved.
Please notice the US/UK flag in the upper-right area of the page. Press it. Voila! The home page contains extra content directly related to Medical Translation that no one will ever see. The content includes "Medical Translation" as an H3 tag, a link to the target page, and a nice paragraph.
This technique is squarely black hat. The purpose of a language button is to offer a translation, but there is only one button – for the language the page is already being presented in – so no one will ever press it. The content is additional text and links which have nothing to do with a translation.
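To make the pattern concrete, hidden content behind a "language" toggle typically looks something like this (hypothetical markup for illustration only – not the site's actual code):
<!-- The flag link reveals a block that no visitor has any reason to open -->
<a href="#" onclick="document.getElementById('alt-lang').style.display='block'; return false;">
  <img src="flag-us-uk.png" alt="English">
</a>
<div id="alt-lang" style="display:none;">
  <h3>Medical Translation</h3>
  <p>A keyword-rich paragraph that only crawlers will ever read...</p>
  <a href="medical-translation-page.html">Medical Translation Services</a>
</div>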
Even so, I find it interesting that this content is enough to yield the #1 ranking in the SERP. Either there is another factor remaining that I could not locate (I really don't think that is the case, but would love to hear from others), or Google is putting more weight on home page content. I have always felt home page content was very strong, but this page just is not strong enough to blow the Wiki page away like this unless Google is weighing that home page content quite heavily.
I like the Yahoo results MUCH better for this search. Wiki is #2 and this page is #13. Bing shows Wiki as #5 with this page as #13. I am OK with those rankings as well.
-
-
WordPress produces similar long URLs that match the post title.
-
How much will it help? Very little, except where competition is very low.
Will Google ferret it out? I doubt it.
-
Hi Paul
Wow! To me that just looks so spammy and over-optimised. I would think the search engines would see it the same way, but as you say, the URLs rank #1.
What are the other metrics like for the site, perhaps they may show the reasons for high rankings?
Update: Just taken a quick look and it does seem the domain is quite strong, with a DA of 60. Having said that, they have a canonical issue which, if they sorted it, may make them even stronger... so keep that quiet!