Duplicate keywords in URL?
-
Is there such a thing as keyword stuffing URLs? For example, a domain name of turtlesforsale.com with a directory called turtles-for-sale that houses all the pages on the site, so every page would start with turtlesforsale.com/turtles-for-sale/.
Good or bad idea? The owner is hoping to capitalize on having the keywords "turtles for sale" in the URL twice, and to rank better for that reason.
-
Sounds like they want to make turtlesforsale.com/turtles-for-sale/ their homepage, which is quite strange. "Keyword stuffing" URLs like this will not help SEO in any way. I would advise against it and leave the homepage as turtlesforsale.com.
E.g., for other pages on the site: turtlesforsale.com/state/city/ is better than turtlesforsale.com/turtles-for-sale-in-state/turtles-for-sale-in-city/
Like most things in SEO, think of what is better for humans. The first example is much cleaner and easier to read, and for SEO purposes, there would be no difference between the two if all other factors were the same.
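To make the cleaner pattern concrete, here is a minimal sketch of building hierarchical URLs like the first example. The domain matches the one in the question; the `slugify` rules and function names are illustrative assumptions, not anything Moz prescribes.

```python
import re

DOMAIN = "https://turtlesforsale.com"  # matches the example domain above

def slugify(text: str) -> str:
    """Lowercase the text and replace runs of non-alphanumerics with hyphens."""
    slug = re.sub(r"[^a-z0-9]+", "-", text.lower())
    return slug.strip("-")

def page_url(*segments: str) -> str:
    """Join slugified path segments under the domain root."""
    path = "/".join(slugify(s) for s in segments)
    return f"{DOMAIN}/{path}/"

# Clean hierarchy, no repeated keywords:
print(page_url("Texas", "Austin"))
# rather than .../turtles-for-sale-in-texas/turtles-for-sale-in-austin/
```

The point is that the path segments carry only the information unique to each page (state, city); the keyword already lives in the domain name.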
A similar question was asked (and answered very well) here: http://moz.com/community/q/url-seo-better-directory-structure-vs-exact-keyword-phrase
-
We have done something similar in the past, but rather than repeating the keyword in the URL you could use a related phrase such as /turtle-to-buy. Since Google now recognizes semantically similar terms, that may be more positive in the long term. We have seen great success by mixing in associated terms.