Is this spamming keywords into a URL?
-
My company has previously added extensions onto a URL, like the example below:
http://www.test.com/product-name/extra-keywords
My question is: since there is no difference between the pages http://www.test.com/product-name and http://www.test.com/product-name/extra-keywords, and you never leave the product page to reach the extra-keywords URL, is this really necessary? I feel like this is probably not a best practice. Thanks for any suggestions.
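(To check rather than assume that the two URL variants really serve identical content, here is a minimal sketch in Python, using the hypothetical URLs from the question and the third-party requests library; identical hashes would back up the duplicate-content concern.)

```python
import hashlib

import requests

# Hypothetical URL pair from the question above.
URLS = [
    "http://www.test.com/product-name",
    "http://www.test.com/product-name/extra-keywords",
]

def body_hash(url: str) -> str:
    """Fetch a URL and return a hash of the HTML it serves."""
    html = requests.get(url, timeout=10).text
    return hashlib.sha256(html.encode("utf-8")).hexdigest()

hashes = {url: body_hash(url) for url in URLS}
for url, digest in hashes.items():
    print(f"{digest[:12]}  {url}")

if len(set(hashes.values())) == 1:
    print("Both URLs return identical HTML (classic duplicate content).")
else:
    print("The responses differ; compare them by hand before drawing conclusions.")
```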
-
Hey guys, thanks again for the help, but I am still looking for some data to prove this point. I have gone ahead and determined that the keywords we are currently using are placing us on page 4 and below, while our title tags are placing us higher. If there are any other articles or hard facts on how this doesn't help in this scenario, it would be greatly appreciated.
-
Do you know where I can find data showing this? Will it actually hurt rankings? I'm thinking it's going to create duplicate content.
-
Sika22
Typically, your URL structure should closely resemble your site's breadcrumb trail. So, if you have something like www.domain.com/category/product1, you should be able to go to www.domain.com/category/ and get a list of products.
It looks like the URL structure you're talking about is adding extra keywords, and that's not necessary.
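(A quick illustration of that breadcrumb/URL relationship, as a rough Python sketch using the hypothetical domain from the answer above: every parent segment of a product URL should itself resolve to a real listing page.)

```python
from urllib.parse import urlsplit

def parent_paths(url: str) -> list[str]:
    """Return every ancestor URL implied by the path segments.

    e.g. /category/product1 implies that /category/ should also
    exist and list its products.
    """
    parts = urlsplit(url)
    segments = [s for s in parts.path.split("/") if s]
    parents = []
    for i in range(1, len(segments)):
        path = "/" + "/".join(segments[:i]) + "/"
        parents.append(f"{parts.scheme}://{parts.netloc}{path}")
    return parents

# Hypothetical URL from the answer above.
print(parent_paths("http://www.domain.com/category/product1"))
# ['http://www.domain.com/category/']
```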
-
That's the plan. Our site has not been updated since 2011, so I'm trying to point out these issues and tackle them one at a time. We are also double indexed with www and non-www, and we're about to launch a new site that we will be redirecting to. Lots of work ahead.
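(On the www vs. non-www double indexing: the standard fix is a site-wide 301 from the non-preferred host to the preferred one, plus consistent canonical tags. Below is a minimal sketch assuming, purely for illustration, a Python/Flask app and www.test.com as the preferred host; on Apache or Nginx the same rule is a one-line redirect in the server config.)

```python
from flask import Flask, redirect, request

app = Flask(__name__)
CANONICAL_HOST = "www.test.com"  # hypothetical preferred hostname

@app.before_request
def force_canonical_host():
    """301 any request on the bare domain over to the www host."""
    if request.host != CANONICAL_HOST:
        fixed = request.url.replace(request.host, CANONICAL_HOST, 1)
        return redirect(fixed, code=301)

@app.route("/<path:path>")
@app.route("/")
def page(path=""):
    return f"Serving /{path} on the canonical host"
```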
-
Google is getting pretty good at sniffing out webmasters trying to game the system. Just naturally put your keywords in places that make sense and try not to overdo it, and you should be fine.
-
Thank you. This is what I was thinking but wanted to confirm my thoughts.
-
It is going to be spamming your URLs with keywords that are not serving a subfolder purpose. I would focus on naturally working your keywords into your product name in the title tag, URL, and H1 tag.
Related Questions
-
ATTN SEO MINDS: Is there a way/tool to categorize keywords from an Omniture/GA report?
So ideally I would like to take the list of keywords I am currently ranking for and group them based on the user intent behind each query. For example, if I am a Thai delivery chain and I am currently receiving traffic from the queries "vegan dish" and "tofu thai food", I would want a column in a keyword report that says these queries fall into the VEGETARIAN category. What I want to know is: how can I filter a massive list by a range of keywords? I want to ask, does this cell contain "keyword A" or "keyword B" or "keyword Z"? If so, list the corresponding category. This way I can look at keyword performance by category or user intent/motivation. Is there a tool out there that will help me accomplish this, or is there a good solution in Excel I can use?
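(A rough sketch of how this can be done outside Excel, in plain Python with made-up category terms: each query is tagged with the first category whose trigger words it contains, and anything that matches nothing is left UNCATEGORIZED. The file names and categories here are assumptions, not part of the original question.)

```python
import csv

# Hypothetical intent categories and the trigger terms that map to them.
CATEGORIES = {
    "VEGETARIAN": ["vegan", "tofu", "vegetarian"],
    "DELIVERY":   ["delivery", "takeout", "order online"],
    "BRAND":      ["thai palace"],  # made-up brand terms
}

def categorize(query: str) -> str:
    """Return the first category whose trigger terms appear in the query."""
    q = query.lower()
    for category, terms in CATEGORIES.items():
        if any(term in q for term in terms):
            return category
    return "UNCATEGORIZED"

# keywords.csv is assumed to be an analytics export with one query per row, first column.
with open("keywords.csv", newline="", encoding="utf-8") as src, \
        open("keywords_categorized.csv", "w", newline="", encoding="utf-8") as dst:
    writer = csv.writer(dst)
    for row in csv.reader(src):
        if row:
            writer.writerow([row[0], categorize(row[0])])
```

In Excel the rough equivalent is a lookup table plus nested IF(ISNUMBER(SEARCH(...))) formulas, but that tends to get unwieldy beyond a handful of categories.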
Algorithm Updates | Jonathan.Smith
-
Do we take an SEO hit for having an infinite-scroll page instead of a site with many pages/URLs? If we do take a hit, how big would it be?
We are redesigning a preschool website which has over 100 pages. We are looking at two options and want to make sure we deliver the best user experience and SEO. Option 1 is to condense the site into perhaps 10 pages and window-shade the content; for instance, on the curriculum page there would be an overview and each age-group program would open via window shade. Option 2 is to have an overview and then each age program links to its own page. Do we lose out on SEO if there are not unique URLs? Or is there a way, using meta tags or other programming, to have the same effect?
Algorithm Updates | jgodwin
-
We are using keywords relevant to our page but are not listed on the first page of Google search
We are using keywords relevant to our page, but we are not listed on the first page of Google search, while our competitors using the same keywords are listed on the first page. How can we sort out this problem and get onto the first page of the search results?
Algorithm Updates | krisanantha
-
Should We Switch from Several Exact Match URLs to Subdomains Instead?
We are a company with one product customized for different vertical markets. Our sites are each set up on their own unique domains:
contactatonce.com (Brand)
autodealerchat.com (Auto Vertical)
apartmentchat.com (Apartment Vertical)
chatforrealestate.com (Real Estate Vertical)
We currently rank well in the respective keyword niches, including:
- auto dealer chat (exact match), automotive chat, dealer chat
- apartment chat (exact match), property chat, multifamily chat
- chat for real estate (exact match), real estate chat
To simplify the user experience we are considering moving to a single domain and subdomain structure:
contactatonce.com
auto.contactatonce.com
apartment.contactatonce.com
realestate.contactatonce.com
QUESTIONS:
1. Considering current Google ranking strategies, do we stand to lose keyword-related traffic by making this switch?
2. Are there specific examples you can point to where an individual domain and its subdomains each ranked high on Google across a variety of different niches? (I'm not talking about Wikipedia, Blogger, Blogspot, Wordpress, Yahoo Answers, etc., which are in their own class, but a small to mid-size brand.)
Thank you,
Aaron
Algorithm Updates | contactatonce
-
Meta description & Meta keywords
Good morning, One of our HTML experts just told me that Google is not reading meta keywords or meta descriptions, and that they (or one of them) are no longer part of my website's SEO ranking. Do you know where I can read about this? Do other search engines still look at these parameters? Thank you, SEOWiseUs
Algorithm Updates | iivgi
-
Sub-domains and keyword rich domains
Hello All, I'm hoping for some opinions as I am confused as to the best action for me to take.
The problem: Although I say the below, we have never been penalised by Google, never taken part in any bad link building, and don't do too badly in the SERPs, but I worry Google may not like what I do these days.
We have one main site that is broken down into areas/cities (i.e. London, Manchester, etc.), so the domain looks like www.domain.co.uk/London. But in addition to this we also use sub-domains to target popular areas (i.e. http://London.domain.co.uk). These sub-domains take the content from the main site but of course only display results relevant to London and are optimised for "London + keyword".
Any page that gets duplicated (i.e. London.domain.co.uk/profile123 and www.domain.co.uk/profile123 are ALMOST the same content) gets a rel="canonical" link that points to the main domain + page on www.
All these sites have a large number of links back to www.domain.co.uk/?Page so the user can also search in areas other than London, etc. This method has worked well for us and is popular with both users and Google search results. All sites/sub-domains are added to GWT under the same account and all sites have unique sitemaps. I do however worry that Google may class this as link manipulation owing to the number of links pointing back to the main domain and its pages (this is not the reason we use the sub-domains, though).
In addition to the above sub-domains we have a few domain names (5/6) that are keyword rich and that we also place the same content on (i.e. www.manchester-keyword.co.uk would show only content relevant to Manchester), and again these sites have links back to the main domain so users can navigate other areas of the UK. I worry that these additional domains may also not be liked by Google.
What do people think? I have started to reduce/replace some of the additional keyword-rich domains with sub-domains from the main site and then 301 the keyword-rich domain (i.e. www.manchester-Keyword.co.uk now goes to http://Manchester.domain.co.uk), as I feel sub-domains may not be penalised as much as unique domains are. There are domains that I don't really want to 301 as they bring in good amounts of traffic and users have bookmarked them, etc.
Any opinions on what you think I should do would be great, as I really worry that if Google stops giving us good results, I'm in real trouble. Although I'm not sure if what we do is wrong with Google or not.
Algorithm Updates | jonny512379
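(To make the canonical/301 mapping in the question above concrete, a hypothetical Python sketch using the placeholder domains the poster mentions: one helper computes the www URL a sub-domain page's rel="canonical" should point at, the other computes the 301 target when a keyword-rich domain is folded into a city sub-domain.)

```python
from urllib.parse import urlsplit, urlunsplit

MAIN_HOST = "www.domain.co.uk"

# Keyword-rich domains being folded into city sub-domains (placeholder names).
DOMAIN_TO_SUBDOMAIN = {
    "www.manchester-keyword.co.uk": "manchester.domain.co.uk",
}

def canonical_url(url: str) -> str:
    """Map a city-subdomain page to its canonical www equivalent."""
    parts = urlsplit(url)
    return urlunsplit((parts.scheme, MAIN_HOST, parts.path, parts.query, parts.fragment))

def redirect_target(url: str) -> str:
    """301 target for a page on one of the old keyword-rich domains."""
    parts = urlsplit(url)
    new_host = DOMAIN_TO_SUBDOMAIN.get(parts.netloc, parts.netloc)
    return urlunsplit((parts.scheme, new_host, parts.path, parts.query, parts.fragment))

print(canonical_url("http://london.domain.co.uk/profile123"))
# http://www.domain.co.uk/profile123
print(redirect_target("http://www.manchester-keyword.co.uk/profile123"))
# http://manchester.domain.co.uk/profile123
```
-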
Are Keywords Dying?
I'm freelancing in SEO work, looking to make it a full-time career, and as a result I'm juggling the prospect of having to pick and choose what area I spend most of my time on when working on client sites. My background is in writing, so I always lean towards creating content and engaging people via social media. But the standard is also to optimize page titles and, at a deeper level, descriptions for each page. For larger sites, especially e-commerce sites with many product pages, this is a daunting task. Is it worth it, or is the better strategy to focus the limited time available on content creation? Will page titles, etc. eventually become obsolete anyway?
Algorithm Updates | Nobody1533077082756
-
Is This Keyword Stuffing/Spamming?
We are a custom patch company--we make patches for many different types of clients. I have a gallery of patches for almost every kind of client, and they all have their own pages. If I put navigation on the home page such as what I show below, will Google consider that to be too much? Boy Scout Patches | Motorcycle Patches | Fire Patches | Police Patches | Military Patches | Sports Patches | Business and Organization Patches | Paintball Patches | Scooter Patches | In Memory Patches They would all be links to different pages, and there would be literally 50-60 more! Would it be better to remove the word patches from all of the links? And then another question comes up: too many on-page links?
Algorithm Updates | UnderRugSwept