Creating pages as exact-match URLs - good practice or an over-optimization indicator?
-
We all know that exact-match domains are no longer getting the same results in the SERPs after the algorithm changes Google has been pushing through. Does anyone have experience with, or know whether, the same applies to an exact-match page URL (not the domain)?
Example:
keyword: cars that start with A
Which way to go is better when creating your pages on a non-exact-match domain site:
www.sample.com/cars-that-start-with-a/, which has "cars that start with A" as the title
or
www.sample.com/starts-with-a/, which again has "cars that start with A" as the title
Keep in mind that you'll add more pages that start exactly the same way, since you want to cover all the letters of the alphabet (a quick sketch of generating both sets is included below). So:
www.sample.com/cars-that-start-with-a/
www.sample.com/cars-that-start-with-b/
www.sample.com/cars-that-start-with-c/
or
www.sample.com/starts-with-a/
www.sample.com/starts-with-b/
www.sample.com/starts-with-c/
Hope someone here at the Moz community can help out. Thanks so much!
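For anyone following along, here is a minimal Python sketch of generating both candidate URL sets across the alphabet. The www.sample.com domain and the two slug patterns come straight from the example above; everything else is just illustration, not from any particular tool:

```python
import string

BASE = "https://www.sample.com"  # placeholder domain from the example above

def exact_match_urls():
    """Pattern 1: the full keyword phrase in the slug."""
    return [f"{BASE}/cars-that-start-with-{letter}/"
            for letter in string.ascii_lowercase]

def short_urls():
    """Pattern 2: a shorter slug that drops the head term ("cars")."""
    return [f"{BASE}/starts-with-{letter}/"
            for letter in string.ascii_lowercase]

# Print the two candidate URLs side by side for each letter.
for pattern1, pattern2 in zip(exact_match_urls(), short_urls()):
    print(pattern1, "vs", pattern2)
```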
-
Hi Curtis,
Thanks for your reply. Well, to be more specific, the domain would be:
freecarfinder.com/cars-that-start-with-a/
The domain is new, so it has no authority whatsoever. The domain name is not that long, but it's not really short either. The content on each page is pretty thin: the exact keyword that's in the URL appears in the heading 1 and twice in a small piece of text that explains how to use the page to search for results.
Totally agree that the best practice is to test it out. I do see that our competition is using /starts-with/a and is ranking really well with it. Maybe the best option is to create half of the pages with the exact keyword in the URL and half with the /starts-with-a/ pattern, to see which one performs better?
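A minimal sketch of that 50/50 split, assuming you simply alternate the two patterns across the alphabet (the alternating rule is just one illustrative way to divide the pages; it isn't from the thread):

```python
import string

def assign_url_patterns():
    """Alternate letters between the two slug patterns so each pattern gets half the alphabet."""
    assignments = {}
    for i, letter in enumerate(string.ascii_lowercase):
        if i % 2 == 0:
            assignments[letter] = f"/cars-that-start-with-{letter}/"  # exact-keyword slug
        else:
            assignments[letter] = f"/starts-with-{letter}/"           # shorter slug
    return assignments

for letter, slug in assign_url_patterns().items():
    print(letter, "->", slug)
```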
-
Unless your domain is already really strong on car keywords, I would include "car" in the URL, assuming the URL doesn't get too long. Although Google is moving away from exact match toward semantic search, it seems to be happening slowly, and we have certainly seen ranking improvements from having some exact matches. So I think as long as you don't have the exact same phrase in every place on the page, there isn't much danger. However, the best practice is to test and learn: make the change and see if it improves the ranking.
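If you go the test-and-learn route, here is one minimal sketch of comparing the two URL groups. It assumes a hypothetical rank-tracker CSV export with "url" and "position" columns; the file name and column names are illustrative, not from any particular tool:

```python
import csv
from statistics import mean

def average_position(csv_path, slug_pattern):
    """Average ranking position of URLs whose path contains the given slug pattern."""
    with open(csv_path, newline="") as f:
        positions = [float(row["position"])
                     for row in csv.DictReader(f)
                     if slug_pattern in row["url"]]
    return mean(positions) if positions else None

# Example: compare the exact-keyword pages against the short-slug pages.
exact = average_position("rank_export.csv", "cars-that-start-with-")
short = average_position("rank_export.csv", "starts-with-")
print("exact-match slugs:", exact, "| short slugs:", short)
```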
Hope that helps. Let me know if you need anything more!