Remove Directory In Webmaster Tools
-
Hey Moz'erz,
I'm removing some URLs from the index and want to confirm how the "remove directory" request works.
If my structure is this: /blogs/customer-success-stories/tagged/ --- all pages such as /tagged/abc, /tagged/dce, etc. will be removed, correct?
This is my first time trying a directory removal, as there are 100-plus of these tagged pages.
Comments, suggestions and past experiences welcome!
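Side note: before filing the request, I'm planning to sanity-check which of the /tagged/ URLs still resolve, since (as I understand it) the directory removal matches on the URL prefix and should cover everything beneath it. A rough sketch of that check; the domain and slugs below are just placeholders:

from urllib.request import urlopen
from urllib.error import HTTPError, URLError

# Rough sketch only: list which tagged URLs still return 200 before
# requesting a directory removal. The slugs are examples; in practice
# the full list would come from a sitemap or analytics export.
PREFIX = "http://www.example-shop.com/blogs/customer-success-stories/tagged/"
slugs = ["abc", "dce"]

for slug in slugs:
    url = PREFIX + slug
    try:
        status = urlopen(url, timeout=10).status
    except HTTPError as err:
        status = err.code
    except URLError:
        status = "unreachable"
    print(status, url)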
-
This might do what you want: http://apps.shopify.com/power-tools-bulk-edit-tags
-
Hey Keri,
Thanks for the response!
The CMS is actually Shopify. The funny thing I've come to learn (and I keep looking for a workaround) is that you don't have any control over your robots.txt the way you do with BigCommerce. These used to be on-page links in a tag cloud.
After checking the analytics on them for conversions (first-interaction attribution as a landing page from search) and traffic, as well as any inbound links, I deemed them fit for removal.
The tag cloud has since been deleted and removed; however, with the Shopify CMS these pages still technically "exist" because of the way content and pages get served dynamically.
-
Are the URLs gone? Or do you just not want them indexed? It's better to set things up in WordPress (if that's your CMS) not to index tags and categories than to go and tell GWT to remove them.
Related Questions
-
Why does Google not remove my page?
Hi everyone, last week I added a "noindex" tag to my page, but the site still appears in organic search. What else can I do to get it removed from Google?
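A first step worth checking is whether the noindex is actually being served (in the HTML or as an X-Robots-Tag header) and that robots.txt is not blocking the page, since a blocked page never gets recrawled to see the tag. A rough check, with a placeholder URL:

from urllib.request import urlopen

# Rough check that a page really serves noindex. The URL is a placeholder.
url = "http://www.example.com/page-to-remove/"
resp = urlopen(url, timeout=10)
html = resp.read().decode("utf-8", errors="ignore").lower()

header_value = (resp.headers.get("X-Robots-Tag") or "").lower()
print("X-Robots-Tag contains noindex:", "noindex" in header_value)
# Very rough string match; a real check would parse the HTML properly.
print("HTML mentions noindex + robots:", "noindex" in html and "robots" in html)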
Technical SEO | Jorge_HDI
-
Moving old site to a new domain subdirectory
Hi, we've moved our old site to a new domain, but into a subdirectory (the shopping site has been consolidated into the overarching company website's shopping section, hence the move to a subdirectory). Are 301 redirects from the old URLs to the new domain's subdirectory (e.g. newsite.com/shopping/page-1/) sufficient for the site migration? I wasn't able to use Google's site address change tool since we're moving to a subdirectory on the new domain. Thanks
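Generally yes, per-URL 301s into the new subdirectory are the right mechanism when the change-of-address tool can't be used; the main thing to verify is that each old URL returns a single 301 (not a 302 or a redirect chain) pointing at its exact new equivalent. A rough spot-check, assuming the third-party requests library and made-up URLs:

# Rough spot-check of the old-URL -> new-subdirectory 301 mapping.
# Uses the third-party requests library; URLs below are placeholders.
import requests

mapping = {
    "http://www.old-shop.co.uk/page-1/": "http://www.newsite.com/shopping/page-1/",
}

for old_url, expected in mapping.items():
    resp = requests.get(old_url, allow_redirects=False, timeout=10)
    location = resp.headers.get("Location")
    ok = resp.status_code == 301 and location == expected
    print(old_url, "->", resp.status_code, location, "OK" if ok else "CHECK")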
Technical SEO | SoulSurfer8
-
Google Webmaster Tools is saying "Sitemap contains URLs which are blocked by robots.txt" after HTTPS move...
Hi Everyone, I really don't see anything wrong with our robots.txt file after our HTTPS move that just happened, but Google says all URLs are blocked. The only change I know we need to make is changing the sitemap URL to https. Anything you all see wrong with this robots.txt file?

# robots.txt
#
# This file is to prevent the crawling and indexing of certain parts of your
# site by web crawlers and spiders run by sites like Yahoo! and Google. By
# telling these "robots" where not to go on your site, you save bandwidth
# and server resources.
#
# This file will be ignored unless it is at the root of your host:
# Used:    http://example.com/robots.txt
# Ignored: http://example.com/site/robots.txt
#
# For more information about the robots.txt standard, see:
# http://www.robotstxt.org/wc/robots.html
#
# For syntax checking, see:
# http://www.sxw.org.uk/computing/robots/check.html

# Website Sitemap
Sitemap: http://www.bestpricenutrition.com/sitemap.xml

# Crawlers Setup
User-agent: *

# Allowable Index
Allow: /*?p=
Allow: /index.php/blog/
Allow: /catalog/seo_sitemap/category/

# Directories
Disallow: /404/
Disallow: /app/
Disallow: /cgi-bin/
Disallow: /downloader/
Disallow: /includes/
Disallow: /lib/
Disallow: /magento/
Disallow: /pkginfo/
Disallow: /report/
Disallow: /stats/
Disallow: /var/

# Paths (clean URLs)
Disallow: /index.php/
Disallow: /catalog/product_compare/
Disallow: /catalog/category/view/
Disallow: /catalog/product/view/
Disallow: /catalogsearch/
Disallow: /checkout/
Disallow: /control/
Disallow: /contacts/
Disallow: /customer/
Disallow: /customize/
Disallow: /newsletter/
Disallow: /poll/
Disallow: /review/
Disallow: /sendfriend/
Disallow: /tag/
Disallow: /wishlist/
Disallow: /aitmanufacturers/index/view/
Disallow: /blog/tag/
Disallow: /advancedreviews/abuse/reportajax/
Disallow: /advancedreviews/ajaxproduct/
Disallow: /advancedreviews/proscons/checkbyproscons/
Disallow: /catalog/product/gallery/
Disallow: /productquestions/index/ajaxform/

# Files
Disallow: /cron.php
Disallow: /cron.sh
Disallow: /error_log
Disallow: /install.php
Disallow: /LICENSE.html
Disallow: /LICENSE.txt
Disallow: /LICENSE_AFL.txt
Disallow: /STATUS.txt

# Paths (no clean URLs)
Disallow: /*.php$
Disallow: /*?SID=
disallow: /*?cat=
disallow: /*?price=
disallow: /*?flavor=
disallow: /*?dir=
disallow: /*?mode=
disallow: /*?list=
disallow: /*?limit=5
disallow: /*?limit=10
disallow: /*?limit=15
disallow: /*?limit=20
disallow: /*?limit=25

Technical SEO | vetofunk
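One way to check this beyond eyeballing the file is to run a handful of the sitemap's URLs through a robots.txt parser and see whether they come back as blocked. A rough sketch with Python's standard urllib.robotparser; the sample URLs are placeholders, and note that the standard-library parser doesn't understand Google's * and $ wildcard extensions, so Search Console's robots.txt tester remains the authoritative check:

# Rough sketch: test a few sitemap URLs against the live robots.txt.
# The sample URLs are placeholders; pull the real ones from sitemap.xml.
from urllib import robotparser

rp = robotparser.RobotFileParser()
rp.set_url("http://www.bestpricenutrition.com/robots.txt")
rp.read()

sample_urls = [
    "http://www.bestpricenutrition.com/",
    "http://www.bestpricenutrition.com/some-product.html",
]

for url in sample_urls:
    print("allowed" if rp.can_fetch("Googlebot", url) else "blocked", url)
-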
How to tell when a directory backlink or other backlink is worthy of the disavow tool? Especially when a keyword is not ranking past where it should.
Hello, I came aboard as SEO for a client who seems to have been hit by Panda and Penguin back in April 2012. The Panda part I feel I've fixed by creating better content, combining same-topic pages into one, and basically building a content experience that relates better to the terms users are searching for. Once the site was redesigned and relaunched, all keywords improved except one: the main keyword they want to rank for. I created a landing page for it that is nicely optimized for that keyword and its close variants, but that page isn't used by Google yet since it's brand new with a PA of 1.
Doing a backlink audit, I found 102 links out of 400 using the exact keyword they want to rank for as anchor text; they also have synonym anchor text on other links, though not as much. Most of those 102 domains using the main keyword as anchor text are directories. In my opinion I'd call all of them spam, but a few have DAs higher than 50, which makes me a little more nervous to disavow, since I want to make sure we get out of the penalty if we were hit by Penguin but also don't want to hurt rankings for the other keywords we're doing better with (long-tails and short terms that are very relevant to users).
What is the best way to determine whether a site/directory is spammy enough that it's penalizing you, and how should I approach the anchor-text issue with these backlinks? 99% of these links I can't have changed; since they're directories, I doubt many have had a human touch them in a while.
Sidenote: if you're going to post a link as a response, please summarize what the link is about, as links are often given as an answer but end up not providing the meat we were looking for. Thank you!
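One way to make the anchor-text part of this less of a gut call is to tally the distribution from a backlink export (Moz, GWT, etc.) and see how dominant the money keyword really is across referring domains. A rough sketch; the file name and the anchor_text column header are assumptions, so adjust them to whatever your export uses:

# Rough sketch: tally anchor text across a backlink export.
# The file name and column header are assumptions; match them to your export.
import csv
from collections import Counter

anchors = Counter()
with open("backlink_export.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        anchors[row["anchor_text"].strip().lower()] += 1

total = sum(anchors.values())
for anchor, count in anchors.most_common(15):
    print(f"{count:4d}  {count / total:5.1%}  {anchor}")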
Technical SEO | Deacyde
-
Tool to Generate All the URLs on a Domain
Hi all, I've been using xml-sitemaps.com for a while to generate a list of all the URLs that exist on a domain. However, this tool only works for websites with under 500 URLs on a domain. The paid tool doesn't offer what we are looking for either. I'm hoping someone can help with a recommendation. We're looking for a tool that can:
- Crawl, and list, all the indexed URLs on a domain, including .pdf and .doc files (ideally in a .xls or .txt file)
- Crawl multiple domains with unlimited URLs (we have 5 websites with 500+ URLs on them)
Seems pretty simple, but we haven't been able to find something that isn't tailored toward management of a single domain or that can crawl a huge volume of content.
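If nothing off-the-shelf fits, a scripted crawl is another option. A rough, standard-library-only sketch is below; it handles one domain per run, uses a placeholder start URL, and has no robots.txt handling or rate limiting, so treat it as a starting point rather than a finished tool:

# Rough sketch: breadth-first crawl of one domain, printing every URL it can
# reach, including links to .pdf and .doc files.
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen

START = "http://www.example.com/"   # placeholder start URL
DOMAIN = urlparse(START).netloc

class LinkParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

seen, queue = {START}, deque([START])
while queue:
    url = queue.popleft()
    try:
        resp = urlopen(url, timeout=10)
    except Exception:
        continue
    print(url)  # reachable URL (could be written to a .txt/.csv instead)
    if resp.headers.get_content_type() != "text/html":
        continue  # only parse HTML pages for further links
    parser = LinkParser()
    parser.feed(resp.read().decode("utf-8", errors="ignore"))
    for href in parser.links:
        link = urljoin(url, href).split("#")[0]
        if urlparse(link).netloc == DOMAIN and link not in seen:
            seen.add(link)
            queue.append(link)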
Technical SEO | timfrick
-
Webmaster Tools stopped updating my sites
On Feb 6th WMT stopped updating the stats for my two main sites. I have seen other people complain about this but it seems like it's not known what causes it or if it's by design. Does anyone here have any insight on this issue? Thanks, Greg Davis (AntiqueBanknotes)
Technical SEO | Banknotes
-
Webmaster Tools finding phantom 404s?
We recently (three months now!) switched over a site from .co.uk to .com and all old URLs are redirecting to the new site. However, Google Webmaster Tools is flagging up hundreds of 404s from the old site and yet doesn't report where the links were found, i.e. in the 'Linked From' tab there is no data, and the old links are not in the sitemap. SEOmoz crawls do not report any 404s. Any ideas?
Technical SEO | Switch_Digital
-
What tool can I use to get the true speed of my site?
Hi, I am trying to get the true speed of my site. I want to know how fast www.in2town.co.uk is, but the tools I am using are giving me different readings. http://tools.pingdom.com/fpt/#!/DkHoNWmZh/www.in2town.co.uk says the speed is 1.03s, http://gtmetrix.com/reports/www.in2town.co.uk/i4EMDk34 says my speed is 2.25s, and http://www.vertain.com/m.q?req=cstr&reqid=dAv79lt8 says it is 4.36s, so as you can see I am confused. I am trying to get the site as fast as possible, but I need to know what the correct speed is so I can work on the things that need changing to make it faster. Can anyone also let me know what speed I should be aiming for? Many thanks
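Part of the confusion is that these tools measure different things from different locations: some time only the server response, while others wait for the full page (images, CSS, JS) to render, so a single "true" number doesn't really exist. If it helps to have a baseline you control, here is a rough sketch for timing just the raw HTML response, which is why it will read lower than GTmetrix-style full-page figures:

# Rough sketch: time only the raw HTML response for the page, not the full
# render, so expect a lower figure than browser-based tools report.
import time
from urllib.request import urlopen

url = "http://www.in2town.co.uk/"
start = time.perf_counter()
resp = urlopen(url, timeout=30)
time_to_first_bytes = time.perf_counter() - start
body = resp.read()
total = time.perf_counter() - start

print(f"headers received after: {time_to_first_bytes:.2f}s")
print(f"html downloaded ({len(body) / 1024:.0f} KB) after: {total:.2f}s")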
Technical SEO | ClaireH-184886