Thanks Simon. I totally agree with you about visitor experience - it's ultimately the most important factor and I failed to mention it.
-
That's a superb answer from Alan.
The only thing I'd add is to also consider what the most logical and easily navigable structure is for your website's visitors, which will likely be one of Alan's suggestions anyway, as they are logical. Just make sure that your chosen option caters well for your visitors as well as for search.
Regards
Simon
-
The URL structure you choose should be based on your desired outcome: what you want your site to be associated with.
So if becoming recognized as an authority for the brands you carry is more important to you than being recognized as an authority for the types of products you offer (categories), or even for the individual products, then you would include the brands in that structure.
widget.com/brand/ (the main index of products within that brand)
widget.com/brand/product-name/
widget.com/brand/product-name2/
etc.
These URLs communicate that all of these products are part of the larger group "brand".
Alternatively:
widget.com/category/ (the main index of products within that category)
widget.com/category/product-name/
widget.com/category/product-name2/
These URLs communicate that you've got x pages "within" the "category" section, giving more strength to that category.
Part of the decision also depends on how much you can apply (time, resources) to getting off-site confirmation that each top-level brand or category page is an important page, because a key to SEO in 2011 and beyond is building citations and links around, and pointing to, those individual pages.
Alternatively, if you just want to go with widget.com/product-name/, then you'd need that external effort to point to and reference each of those product pages, so the more products you have, the more difficult it is to get the same amount of external confirmation.
And if you have a lot of products, it takes less effort to group the URLs by brand or category and build authority for those brand and category pages than it does for all the individual product pages.
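To make the two grouping schemes above concrete, here is a minimal sketch. The `build_url` helper is hypothetical, just mirroring the widget.com example URLs from the answer:

```python
from urllib.parse import quote

def build_url(site, group, product=None):
    """Build a grouped product URL: site/group/ or site/group/product-name/."""
    url = f"https://{site}/{quote(group)}/"
    if product:
        url += f"{quote(product)}/"
    return url

# Brand-first scheme: link equity consolidates on the brand index page.
print(build_url("widget.com", "brand"))                  # https://widget.com/brand/
print(build_url("widget.com", "brand", "product-name"))  # https://widget.com/brand/product-name/

# Category-first scheme: the same product grouped under the category instead.
print(build_url("widget.com", "category", "product-name2"))  # https://widget.com/category/product-name2/
```

Either way, every product URL sits one level below exactly one index page, which is the page you then build external links and citations toward.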
Related Questions
-
Major landing page removed from Google SERP and replaced by homepage URL. How do I fix it?
Hi, a major landing page has been removed from the Google SERP and replaced by the homepage URL. How do I fix it? Why does this happen on an SPA website (AngularJS)?
Intermediate & Advanced SEO | cafegardesh
-
Any success stories after removing excessive cross domain linking?
Hi, I found some excessive cross-domain linking from a separate blog to the main company website. It sounds like best practice is to cut back on this, but I don't have any proof. I'm cautious about cutting off existing links; we removed two redundant domains that had a huge number of links pointing to the main site almost a year ago, but didn't see any correlated improvement in rankings or traffic. Hoping some people can share a success story after pruning excessive cross-linking, either for their own website or for a client's. Thanks 🙂
Intermediate & Advanced SEO | ntcma
-
URLs: Removing duplicate pages using an anchor?
I've been working on removing duplicate content on our website. There are tons of pages created based on size, but the content is the same. The solution was to create a page with 90% static content and 10% dynamic content that changes depending on the size. Users can select the size from a dropdown box, so instead of 10 URLs I now have one URL. Users can access a specific size by adding an anchor to the end of the URL (?f=size1, ?f=size2). For example:
Old URLs:
www.example.com/product-alpha-size1
www.example.com/product-alpha-size2
www.example.com/product-alpha-size3
www.example.com/product-alpha-size4
www.example.com/product-alpha-size5
New URLs:
www.example.com/product-alpha-size1
www.example.com/product-alpha-size1?f=size2
www.example.com/product-alpha-size1?f=size3
www.example.com/product-alpha-size1?f=size4
www.example.com/product-alpha-size1?f=size5
Do search engines read the anchor or drop it? Will the rank juice be transferred to just www.example.com/product-alpha-size1?
Intermediate & Advanced SEO | Bio-RadAbs
-
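One point worth noting on the question above: the `?f=size2` suffixes are query strings, not anchors; a true anchor (fragment) starts with `#`. Crawlers can see and index query-string variants as distinct URLs, but drop fragments entirely. Python's standard URL parser makes the distinction visible (the example.com URLs are just the ones from the question):

```python
from urllib.parse import urlparse

# The question's "anchor" is really a query string...
q = urlparse("http://www.example.com/product-alpha-size1?f=size2")

# ...whereas a true anchor/fragment would look like this:
a = urlparse("http://www.example.com/product-alpha-size1#size2")

print(q.query)     # 'f=size2' -> sent to the server; crawlers can see it
print(a.fragment)  # 'size2'   -> client-side only; crawlers drop it
print(a.query)     # ''        -> nothing after '#' reaches the server
```

So the query-string variants can still be treated as separate pages; consolidating them typically takes a rel=canonical tag or parameter handling settings rather than relying on the `?f=` suffix being ignored.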
Remove unwanted web pages
Hi all, I have a number of web pages that yield little or no traffic. I have analysed the traffic data in both normal SERPs and Google AdWords over a year. All low-traffic pages rank on the first page.
Intermediate & Advanced SEO | Mark_Ch
Redirecting these poor-performing pages to the main content page would provide the user with a richer experience. Why do I need to remove these pages? Cost, time and duplicate content issues are causing untold problems. Removing the low/no-traffic pages will allow me to provide fresh content on the main content pages.
Question: each main content page has about 20 low/no-traffic pages associated with it, and I have about 30 instances of the main-page scenario. Would carrying out .htaccess page redirects hurt my ranking, or worse?
Regards, Mark
-
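For reference, the .htaccess side of the question above is just one 301 rule per retired page. A minimal mod_alias sketch, with hypothetical placeholder paths rather than the asker's real URLs:

```apache
# Hypothetical paths: permanently redirect each low/no-traffic page
# to its associated main content page.
Redirect 301 /low-traffic-page-1 /main-content-page
Redirect 301 /low-traffic-page-2 /main-content-page
```

A 301 done this way passes most link equity to the target page, so the redirects themselves should not hurt rankings when the target content genuinely replaces the removed pages.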
How long does it take for a page to show up in Google results after removing noindex?
Hi folks, a client of mine created a new page and used meta robots noindex so the page wouldn't show while it wasn't ready to launch. The problem is that somehow Google crawled the page, and now, after removing the meta robots noindex, the page does not show up in the results. We've tried crawling it using Fetch as Googlebot and then submitting it using the button that appears. We've included the page in sitemap.xml and also used the old Google submit-new-page URL https://www.google.com/webmasters/tools/submit-url. Does anyone know how long it will take for Google to show the page AFTER removing meta robots noindex from the page? Any reliable references for this? I know it will appear within some days, but I'd like to have a good reference for the future. Thanks.
Intermediate & Advanced SEO | fabioricotta-84038
-
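For reference, the tag involved in the question above looks like this; removing it (or switching it to index) is what allows Google to show the page again once it recrawls. A generic sketch, not the client's actual markup:

```html
<!-- Pre-launch: keeps the page out of Google's index -->
<meta name="robots" content="noindex, follow">

<!-- At launch: delete the tag entirely, or replace it with -->
<meta name="robots" content="index, follow">
```

There is no fixed timeline: the page reappears only after Googlebot recrawls it and sees the noindex is gone, which is why resubmitting via Fetch as Googlebot and the sitemap helps speed things up.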
How long should you wait between submitting link removal requests?
I'm in the process of trying to clean up a spammy link profile for a site I'm working on. I'm using the excellent data from Moz and the list of links from Google Webmaster Tools to come up with a list of sites, and Remove'em to manage the process. Before I go to Google, I want to make sure the file I submit for the disavow process is as strong as possible. I am aware that I need to contact webmasters about three times to do the removal request properly. How long should there be between requests, and how long should I wait between submitting a final removal request and submitting the file to the disavow tool? Any advice welcome. Thanks.
Intermediate & Advanced SEO | johanisk
-
Does a 301 redirect to a new domain remove a Penguin penalty?
Hi, one of my clients has a shady link profile and has been hit by the Penguin update. I have confirmed the penalty using a Google hack. Looking at his link profile, most of his links come from blog comments on unmoderated blogs, and there is no way we can remove those comments. But without removing them, we can't get rid of Google's Penguin penalty. So I am planning on 301 redirecting to a new domain. My question is: will the penalty transfer if I 301 to a new domain? And what if someone buys an old domain hit by a Penguin update? Please clarify, or if there are any alternatives for getting rid of the Penguin penalty, please help me.
Intermediate & Advanced SEO | Indexxess
-
Removing Duplicate Page Content
Since joining SEOmoz four weeks ago I've been busy tweaking our site, a Magento eCommerce store, and have successfully removed a significant portion of the errors. Now I need to remove/hide duplicate pages from the search engines, and I'm wondering what is the best way to attack this. Can I solve this in one central location, or do I need to do something in the Google and Bing webmaster tools? Here is a list of duplicate content:
http://www.unitedbmwonline.com/?dir=asc&mode=grid&order=name
http://www.unitedbmwonline.com/?dir=asc&mode=list&order=name
Intermediate & Advanced SEO | SteveMaguire
http://www.unitedbmwonline.com/?dir=asc&order=name
http://www.unitedbmwonline.com/?dir=desc&mode=grid&order=name
http://www.unitedbmwonline.com/?dir=desc&mode=list&order=name
http://www.unitedbmwonline.com/?dir=desc&order=name
http://www.unitedbmwonline.com/?mode=grid
http://www.unitedbmwonline.com/?mode=list
Thanks in advance, Steve