Thanks Simon. I totally agree with you about visitor experience - it's ultimately the most important factor and I failed to mention it.
-
That's a superb answer from Alan.
The only thing I'd add is to also consider the most logical and easily navigable structure for your website's visitors. That will most likely be one of Alan's suggestions anyway, since they are logical options; just make sure the option you choose caters well for your visitors as well as for search.
Regards
Simon
-
The URL structure you choose should be based on the outcome you want, i.e. what you want your site to become associated with.
So if you eventually want to be recognized as an authority source for the brands you carry, and that matters more than being recognized as an authority for the types of products you offer (categories), or even for the individual products, then you would include the brands in that structure:
widget.com/brand/ (the main index of products within that brand)
widget.com/brand/product-name/
widget.com/brand/product-name2/
etc.
These URLs communicate that all of these products are part of the larger "brand" group.
Alternatively:
widget.com/category/ (the main index of products within that category)
widget.com/category/product-name/
widget.com/category/product-name2/
These URLs communicate that you have x pages "within" the "category" section, giving more strength to that category.
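To make the two options concrete, here is a minimal, hypothetical Python sketch of how a store might generate either structure; the Product class, slugify helper, and widget.com domain are illustrative assumptions, not code from Moz or from the poster's site.

```python
import re
from dataclasses import dataclass

def slugify(text: str) -> str:
    """Lowercase the text and collapse non-alphanumeric runs into hyphens."""
    return re.sub(r"[^a-z0-9]+", "-", text.lower()).strip("-")

@dataclass
class Product:
    name: str
    brand: str
    category: str

def build_url(product: Product, structure: str = "brand") -> str:
    """Build a brand-first or category-first product URL."""
    group = product.brand if structure == "brand" else product.category
    return f"https://widget.com/{slugify(group)}/{slugify(product.name)}/"

p = Product(name="Super Widget 2000", brand="Acme", category="Widgets")
print(build_url(p, "brand"))     # https://widget.com/acme/super-widget-2000/
print(build_url(p, "category"))  # https://widget.com/widgets/super-widget-2000/
```

Whichever grouping you pick, the point is that the first path segment is the "parent" page you want the products to lend strength to.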
Part of the decision also has to do with how much you can apply (time, resources) to getting off-site confirmation that each top-level brand or category page is an important page, because a key to SEO in 2011 and beyond is building citations and links around, and pointing to, those individual pages.
Alternatively, if you just want to go with widget.com/product-name/, then you'd need that external effort to point to and reference each individual product page, so the more products you have, the harder it becomes to earn the same amount of external confirmation for each one.
And if you have a lot of products, it takes less effort to group the URLs by brand or category and focus on building authority for those brand and category pages than it does to do the same for every individual product page.
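As a rough, hypothetical illustration of that effort calculation (the catalog sizes below are made up), you can compare how many pages would need external citations under each approach:

```python
# Hypothetical catalog: 500 Acme widgets and 300 Globex gadgets.
products = [("Acme", "Widgets", f"Widget {i}") for i in range(500)] + \
           [("Globex", "Gadgets", f"Gadget {i}") for i in range(300)]

brand_pages = {brand for brand, _, _ in products}
category_pages = {category for _, category, _ in products}

# Pages that would need external citations and links under each structure:
print(len(products))        # 800 individual product pages
print(len(brand_pages))     # 2 brand index pages
print(len(category_pages))  # 2 category index pages
```

Earning links to a couple of brand or category pages is a far smaller outreach job than earning them for hundreds of product pages.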