Duplicate keywords in URL?
-
Is there such a thing as keyword stuffing URLs? For example, the domain turtlesforsale.com has a directory called turtles-for-sale that houses all the pages on the site, so every page would start with turtlesforsale.com/turtles-for-sale/.
Good or bad idea? The owner is hoping to capitalize on the keywords "turtles for sale" appearing in the URL twice and to rank better for that reason.
-
Sounds like they want to make turtlesforsale.com/turtles-for-sale/ their homepage, which is quite strange. "Keyword stuffing" URLs like this will not help SEO in any way. I would advise against it and leave the homepage as turtlesforsale.com.
E.g., for other pages on the site, turtlesforsale.com/state/city/ is better than turtlesforsale.com/turtles-for-sale-in-state/turtles-for-sale-in-city/.
Like most things in SEO, think of what is better for humans. The first example is much cleaner and easier to read, and for SEO purposes, there would be no difference between the two if all other factors were the same.
A similar question was asked (and answered very well) here: http://moz.com/community/q/url-seo-better-directory-structure-vs-exact-keyword-phrase
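To make the comparison concrete, here is a small hypothetical sketch (in Python, not from the original answer) that builds the cleaner /state/city/ paths next to the keyword-stuffed variant. The slugify helper and the example locations are made up purely for illustration.

```python
# Hypothetical illustration only: build the cleaner /state/city/ path
# alongside the keyword-stuffed variant for comparison.
BASE = "https://turtlesforsale.com/"

def slugify(text: str) -> str:
    """Lowercase a location name and join words with hyphens for a URL path."""
    return "-".join(text.lower().split())

def clean_url(state: str, city: str) -> str:
    """Recommended structure: no repeated keywords, just the hierarchy."""
    return f"{BASE}{slugify(state)}/{slugify(city)}/"

def stuffed_url(state: str, city: str) -> str:
    """Keyword-stuffed structure, shown only to highlight the difference."""
    return (
        f"{BASE}turtles-for-sale-in-{slugify(state)}/"
        f"turtles-for-sale-in-{slugify(city)}/"
    )

if __name__ == "__main__":
    print(clean_url("North Carolina", "Raleigh"))
    # -> https://turtlesforsale.com/north-carolina/raleigh/
    print(stuffed_url("North Carolina", "Raleigh"))
    # -> https://turtlesforsale.com/turtles-for-sale-in-north-carolina/turtles-for-sale-in-raleigh/
```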
-
We have done something similar in the past, but rather than repeating the keyword you could use /turtle-to-buy, which is close enough, and since Google now recognizes related terms that may be more positive in the long term. We have seen great success by mixing in associated terms.
Related Questions
-
What if I don't use an H1, but rather H2s with multiple keywords?
The reason I don't want to use an H1 is that I can have only one H1, whereas I can use several H2s. Is that going to help me rank? Since Google favors H1 over H2, will Google give the H2s the same priority or less? And if that priority is less, by how much? For example, if an H1 gets a score of 90, how much would my H2 score out of a hundred if the H1 is missing? (I know there is no such metric, but I am just wondering anyway.)
White Hat / Black Hat SEO | Sam09schulz0
-
Why add a product ID to the URL?
Hello! shop.com/en/2628-buy-key-origin-the-sims-4-seasons/ - why do people include a product ID in the link? Is there any advantage, like better ranking, or something else?
White Hat / Black Hat SEO | kh-priyam0
-
Removing / redirecting bad URLs from the main domain
Our users create content which we host at a separate URL for a web version. Originally this was hosted on our main domain. This was causing problems because Google was seeing all these different types of content on our main domain. The page content was all over the place and (we think) may have harmed our main domain's reputation. About a month ago, we added a robots.txt rule to block the URLs in that particular folder, so that Google doesn't crawl those pages and ignores them in the SERPs. We have now gone a step further and are redirecting (301 redirect) all those user-created URLs to a totally brand new domain (not affiliated with our brand or main domain). This should have been done from the beginning, but it wasn't. Any suggestions on how we can remove all those original URLs and make Google see them as not affiliated with the main domain? Or should we just give it the good ol' time recipe and let it fix itself?
White Hat / Black Hat SEO | redcappi0
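A minimal verification sketch for the redirect question above, assuming Python with the third-party requests library; the old_urls.txt file name and the new domain are hypothetical placeholders. It only confirms that each old user-content URL now answers with a 301 pointing at the new domain, and says nothing about how Google processes those redirects.

```python
# Hedged sketch: verify that each old user-content URL 301s to the new domain.
# "old_urls.txt" and NEW_DOMAIN are hypothetical placeholders.
import requests

NEW_DOMAIN = "https://new-user-content.example.com"

def verify_redirects(path: str = "old_urls.txt") -> None:
    with open(path) as handle:
        old_urls = [line.strip() for line in handle if line.strip()]
    for url in old_urls:
        try:
            response = requests.head(url, allow_redirects=False, timeout=10)
        except requests.RequestException as exc:
            print(f"ERROR {url}: {exc}")
            continue
        target = response.headers.get("Location", "")
        if response.status_code == 301 and target.startswith(NEW_DOMAIN):
            print(f"OK   {url} -> {target}")
        else:
            print(f"FIX  {url} (status {response.status_code}, Location: {target!r})")

if __name__ == "__main__":
    verify_redirects()
```
-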
Solved PayDay hack - but SERPs show URLs - what should I do?
We had the PayDay hack and solved it completely. The problem is that the SERPs show over 3,000 URLs pointing to 404s on our website, all of which look like this: www.onssi.com/2012/2/post1639/payday-loan-companies-us. What should I do? Should I disavow every one of the 3,000? Nofollow them?
White Hat / Black Hat SEO | Ocularis0
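Along the same lines, a small hedged sketch for the PayDay-hack question above, again assuming Python with requests and a hypothetical spam_urls.txt listing the leftover URLs. It only tallies the HTTP status each URL returns now (404 or 410 lets the pages drop out of the index over time); it does not build a disavow file or touch Search Console.

```python
# Hedged sketch: tally the HTTP status of each leftover spam URL.
# "spam_urls.txt" is a hypothetical file with one URL per line.
from collections import Counter

import requests

def tally_statuses(path: str = "spam_urls.txt") -> Counter:
    statuses: Counter = Counter()
    with open(path) as handle:
        for line in handle:
            url = line.strip()
            if not url:
                continue
            try:
                code = requests.head(url, allow_redirects=False, timeout=10).status_code
            except requests.RequestException:
                code = "request error"
            statuses[code] += 1
    return statuses

if __name__ == "__main__":
    # 404 or 410 responses let the hacked pages fall out of the index over time.
    for code, count in tally_statuses().most_common():
        print(f"{code}: {count} URLs")
```
-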
Creating a duplicate site for testing purposes. Can it hurt the original site?
Hello, we are soon going to upgrade the CMS to the latest version along with new functionality; the process may take anywhere from 4 to 6 weeks. Since we need to work on a live server, our plan is to take an exact replica of the site and move it to a test domain (still on the live server), and block Google, Bing, and Yahoo in its robots.txt (User-agent: Google Disallow: /, User-agent: Bing Disallow: /, User-agent: Yahoo Disallow: /). We will upgrade the CMS, add the functionality, test the entire structure, check URLs using Screaming Frog or Xenu, and then move on to configuring the site on the original domain. The upgrade and new tools may take 1 to 1.5 months. Our concern is that, despite blocking Google, Bing, and Yahoo through the user-agent disallow, the URLs could still be crawled by the search engines. If so, it may hurt the original site, since the test site will read as an entire duplicate. Or is there any alternative way around this? Many thanks.
White Hat / Black Hat SEO | Modi1
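For the staging-site question above, a minimal sketch using only the Python standard library to check whether the test domain's robots.txt actually disallows a given crawler token; the test domain is a placeholder and the tokens are taken from the question's own rules, so treat both as assumptions.

```python
# Hedged sketch: ask the live robots.txt of the (hypothetical) test domain
# whether each crawler token from the question's rules may fetch a sample URL.
from urllib.robotparser import RobotFileParser

TEST_SITE = "https://test.example.com"      # placeholder for the test domain
USER_AGENTS = ["Google", "Bing", "Yahoo"]   # tokens used in the question's rules

def check_blocking(sample_path: str = "/") -> None:
    parser = RobotFileParser()
    parser.set_url(f"{TEST_SITE}/robots.txt")
    parser.read()  # fetch and parse the live robots.txt
    sample_url = f"{TEST_SITE}{sample_path}"
    for agent in USER_AGENTS:
        allowed = parser.can_fetch(agent, sample_url)
        print(f"{agent}: {'ALLOWED - check your rules' if allowed else 'disallowed'}")

if __name__ == "__main__":
    check_blocking()
```
-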
Are Meta Keywords Important for Websites?
Hi, I understand that meta titles and descriptions are very important for websites. I would like to know if meta keywords are important too. I have seen people saying that meta keywords are useless and should be removed from the website to prevent competitors from learning your keywords. Does anyone have anything to share? 🙂
White Hat / Black Hat SEO | chanel270
-
I'm worried my client is asking me to post duplicate content. Am I just being paranoid?
Hi SEOMozzers, I'm building a website for a client that provides photo galleries for travel destinations. As of right now, the website is basically a collection of photo galleries. My client believes Google might like us a bit more if we had more "text" content, so my client has been sending me content that is provided free by tourism organizations (tourism organizations will often provide free "one-pagers" about their destination for media). My concern is that if this content is free, it seems likely that other people have already posted it somewhere on the web, and I'm worried Google could penalize us for posting content that already exists. I know that conventionally there are ways around this (you can tell crawlers that the content shouldn't be crawled), but in my case we are specifically trying to produce crawlable content. Do you think I should advise my client to hire some bloggers to produce the content, or am I just being paranoid? Thanks everyone. This is my first post to the Moz community 🙂
White Hat / Black Hat SEO | steve_benjamins0
-
Penguin Update or URL Error - Rankings Tank
I just redid my site, moving from the GoDaddy Quick Shopping Cart to Drupal. The site is much cleaner now, and I transferred all the content. Now my site has dropped from the top ten on almost every keyword we were targeting to 35+. I "aliased" the URLs so that they were the same as on the GoDaddy site. However, when I look at our search results, I notice that our URLs have extra wording at the end, like this: ?categoryid=1 or some other number. Could this be the reason our rankings tanked? Previously, on the GoDaddy site, the results didn't show this.
White Hat / Black Hat SEO | chronicle0