Creating pages as exact-match URLs - good practice or an over-optimization indicator?
-
We all know that exact-match domains are no longer getting the same results in the SERPs since the algorithm changes Google has been pushing through. Does anyone have experience with, or know whether, the same applies to an exact-match URL for a page (not a domain)?
Example:
keyword: cars that start with A
Which way is better when creating your pages on a site whose domain is not an exact match?
www.sample.com/cars-that-start-with-a/ that has "cars that start with A" as the
or
www.sample.com/starts-with-a/ again has "cars that start with A" as the
Keep in mind that you'll add more pages that start the exact same way, since you want to cover every letter of the alphabet. So:
www.sample.com/cars-that-start-with-a/
www.sample.com/cars-that-start-with-b/
www.sample.com/cars-that-start-with-c/
or
www.sample.com/starts-with-a/
www.sample.com/starts-with-b/
www.sample.com/starts-with-c/
Hope someone in the Moz community can help out. Thanks so much!
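For reference, the two candidate URL schemes across the whole alphabet can be generated programmatically. This is a hypothetical sketch using the sample.com domain and slug patterns from the example above:

```python
import string

BASE = "https://www.sample.com"

# Option 1: exact-match slugs, e.g. /cars-that-start-with-a/
exact_match_urls = [f"{BASE}/cars-that-start-with-{c}/" for c in string.ascii_lowercase]

# Option 2: shorter partial-match slugs, e.g. /starts-with-a/
partial_match_urls = [f"{BASE}/starts-with-{c}/" for c in string.ascii_lowercase]

print(exact_match_urls[0])    # https://www.sample.com/cars-that-start-with-a/
print(partial_match_urls[0])  # https://www.sample.com/starts-with-a/
```

Either way, that's 26 pages per scheme, which is why the repeated keyword prefix in Option 1 raises the over-optimization question.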
-
Hi Curtis,
Thanks for your reply. To be more specific, the domain would be:
freecarfinder.com/cars-that-start-with-a/
The domain is new, so it has no authority whatsoever. The domain is not that long, but it's not really short either. The content on the page is pretty thin: the exact keyword from the URL appears in the heading 1 and twice in a small piece of text that explains how to use the page to search for results.
I totally agree that the best practice is to test. I do see that our competition is using /starts-with/a and ranking really well with it. Maybe the best option is to create half of the pages with the exact keyword in the URL and half with /starts-with-a/, and see which performs better?
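One way to set up that split test is to alternate the two slug patterns across the alphabet, so each variant covers a comparable half of the letters. A minimal sketch, assuming the two patterns from the example above (the function name is hypothetical):

```python
import string

def assign_variant(letter: str) -> str:
    """Alternate letters between the two slug patterns so each
    variant covers half the alphabet (a, c, e, ... vs b, d, f, ...)."""
    index = string.ascii_lowercase.index(letter)
    if index % 2 == 0:
        return f"/cars-that-start-with-{letter}/"  # exact-match slug
    return f"/starts-with-{letter}/"               # shorter slug

print(assign_variant("a"))  # /cars-that-start-with-a/
print(assign_variant("b"))  # /starts-with-b/
```

Interleaving the assignment (rather than giving a-m to one pattern and n-z to the other) reduces the chance that differences in per-letter search volume skew the comparison.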
-
Unless your domain is already strong for car keywords, I would include "car" in the URL, assuming the URL doesn't get too long. Although Google is moving away from exact match toward semantic search, it seems to be happening slowly, and we have certainly seen ranking improvements from having some exact matches. So as long as you don't repeat the exact same phrase everywhere on the page, there isn't much danger. However, the best practice is to test and learn: make the change and see whether it improves the ranking.
Hope that helps; let me know if you need anything more.