Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not removing the content entirely - many posts will still be viewable - we have locked both new posts and new replies.
Should I avoid duplicate URL keywords?
-
I'm curious to know: can having a keyword repeat in the URL cause any penalties?
For example:
xyzroofing.com/commercial-roofing
xyzroofing.com/roofing-repairs
My competitors with the highest rankings seem to be doing it without any trouble but I'm wondering if there is a better way.
Also
One of the problems I've noticed is that my /commercial-roofing page outranks my homepage for both residential and commercial search queries. How can this be straightened out?
-
Thank you Boyd
-
There is no penalty for using the same keyword twice in a URL, especially if it's part of your domain name.
There are many examples of sites with a subfolder that contains the same keyword as their domain name that have no problem ranking, including your competition:
- runningwarehouse.com/mens-running-shoes.html ranks #2 for 'running shoes'
- seo.com/seo ranks #5 for 'professional seo'
- overthetopseo.com/professional-seo-services-what-to-expect/ ranks #2 for 'professional seo' (in fact, only 3 URLs that rank for that phrase don't repeat the term 'seo' in their URL.)
- contentmarketinginstitute.com/what-is-content-marketing ranks #1 for 'content marketing'
- etc.
**Ranking the correct page:**
Whenever you have an issue with the wrong page ranking better than the one you want, you just need to work on tweaking your onsite optimization for those pages. (And you may have to continue building more links to the page you want to rank.)
Here is a list of things I'd make some test changes to (keep in mind that you can always revert a change if a test makes rankings go down):
- Test different title tags on the two pages, making one less optimized for the keyword and the other more optimized.
- Add more copy to the page you want to rank.
- Do an internal link audit. Make sure that any time you link from one page to another with a specific keyword as the anchor text, the link points to the page you want to rank for that phrase.
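The internal link audit above can be sketched with Python's standard-library `HTMLParser`. This is a minimal, illustrative example: the page snippet and the xyzroofing.com URLs are hypothetical, and a real audit would fetch and parse every page on the site, then flag cases where the same anchor text points at different pages.

```python
from html.parser import HTMLParser

class AnchorCollector(HTMLParser):
    """Collects (anchor text, href) pairs for internal links on one page."""
    def __init__(self, domain):
        super().__init__()
        self.domain = domain
        self.links = []      # finished (anchor text, href) pairs
        self._href = None    # href of the <a> currently open, if any
        self._text = []      # text chunks seen inside the open <a>

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href", "")
            # Treat relative links and links on our own domain as internal.
            if href.startswith("/") or self.domain in href:
                self._href = href

    def handle_data(self, data):
        if self._href is not None:
            self._text.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._href is not None:
            self.links.append(("".join(self._text).strip(), self._href))
            self._href, self._text = None, []

# Hypothetical page fragment: two internal links share the same anchor
# text but point at different pages -- exactly what an audit should flag.
page = """
<a href="/commercial-roofing">commercial roofing</a>
<a href="/">commercial roofing</a>
<a href="https://elsewhere.com/">other site</a>
"""
parser = AnchorCollector("xyzroofing.com")
parser.feed(page)
for text, href in parser.links:
    print(f"{text!r} -> {href}")
```

The external link is skipped; the remaining output shows 'commercial roofing' used as anchor text for two different targets, which is the kind of inconsistency you'd want to resolve in favor of the page you want to rank.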
After you make a change, you need to wait until Google recrawls that page and sees the update (which can sometimes take a few days or more), and then check your rankings to see whether there was any movement.
Boyd