Should I block .ashx files from being indexed?
-
I got a crawl issue reporting that 82% of the site's pages are missing title tags.
All of these pages are .ashx files (4,400 pages).
Would it be better to remove all of these files from Google?
Thanks!
As simple as that.
-
Are the pages useful to the user? Do you expect users to actively use these pages on your site? Do you want users to be able to find these pages when they search for their issues through Google?
If you've answered 'yes' to any of these questions, I wouldn't suggest removing them from Google. Instead, take your time and set a schedule to optimize each of these pages.
If these pages are not valuable to the user, don't need to be indexed by Google, are locked behind a membership gate, are duplicates, or are thin content, then those are all good reasons to noindex them from all search engines.
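If you do decide to noindex them: .ashx handlers don't render an HTML <head>, so the usual meta robots tag isn't available, and one common alternative is to send the directive as an X-Robots-Tag HTTP header. Below is a minimal sketch for IIS via web.config - the "handlers" path is a hypothetical example, so scope it to wherever your .ashx files actually live:

    <!-- web.config sketch: add a noindex header to everything served from a hypothetical /handlers folder -->
    <configuration>
      <location path="handlers">
        <system.webServer>
          <httpProtocol>
            <customHeaders>
              <!-- Tells compliant search engines not to index or follow these responses -->
              <add name="X-Robots-Tag" value="noindex, nofollow" />
            </customHeaders>
          </httpProtocol>
        </system.webServer>
      </location>
    </configuration>

Keep in mind that a robots.txt Disallow only stops crawling, not indexing, so if the goal is to get already-indexed .ashx URLs out of Google, the header (or removing the URLs) is the more reliable route.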
Related Questions
-
Large site with content silos - best practice for deep indexing of silo content
Thanks in advance for any advice/links/discussion. This honestly might be a scenario where we need to do some A/B testing. We have a massive (5 million page) content silo that is the basis for our long-tail search strategy. Organic search traffic hits our individual "product" pages, and we've divided our silo with a parent category and then secondarily with a field (so we can cross-link to other content silos using the same parent/field categorizations). We don't anticipate or expect top-level category pages to receive organic traffic - most people are searching for the individual/specific product (long tail). We're not trying to rank or get traffic for searches of all products in "category X"; others are competing and spending a lot in that area (head). The intent of the site structure/taxonomy is to make it easier for bots/crawlers to get deeper into our content silos. We've built the pages for humans but included the link structure/taxonomy to assist crawlers.
So here's my question on best practices: how do you handle categories with 1,000+ pages of pagination? In our most popular product categories there might be 100,000s of products in a single category. My top-level hub page for a category looks like www.mysite/categoryA, and the page shows 50 products with pagination from 1 to 1,000+. Currently we're using rel=next for pagination, and for pages like www.mysite/categoryA?page=6 we make each page reference itself as canonical (not the first/top page www.mysite/categoryA). Our goal is deep crawl/indexation of our silo.
I use Screaming Frog and the SEOmoz campaign crawl to sample (the site takes a week+ to fully crawl), and with each of these tools it "looks" like crawlers have gotten a bit bogged down in large categories with tons of pagination. For example, rather than crawl multiple categories or fields to reach multiple product pages, some bots will hit all 1,000 (rel=next) pages of a single category. I don't want to waste crawl budget going through 1,000 pages of one category versus discovering/crawling more categories. I can't seem to find a consensus on how to approach the issue. I can't have a page that lists "all" - there's just too much, so we're going to need pagination. I'm not worried about category pagination pages cannibalizing traffic, as I don't expect any (should I make pages 2-1,000 noindex and have them canonically reference the main/first page in the category?). Should I worry about crawlers going deep into pagination within one category versus getting to more top-level categories? Thanks!
Moz Pro | DrewProZ1 -
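To make the setup described in the question above concrete, here is roughly what that pagination markup looks like in the <head> of a paginated category page. The www.mysite URLs are the question's own placeholders, and this sketch shows the self-referencing-canonical variant the poster currently uses, with the noindex alternative they ask about left as comments:

    <!-- Hypothetical <head> of www.mysite/categoryA?page=6, as described in the question -->
    <link rel="canonical" href="http://www.mysite/categoryA?page=6" />  <!-- self-referencing canonical -->
    <link rel="prev" href="http://www.mysite/categoryA?page=5" />
    <link rel="next" href="http://www.mysite/categoryA?page=7" />

    <!-- The alternative the poster asks about: noindex pages 2+ and point the canonical at page 1 -->
    <!-- <meta name="robots" content="noindex, follow" /> -->
    <!-- <link rel="canonical" href="http://www.mysite/categoryA" /> -->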
Removing Domains From Disavow File
We may have accidentally included the wrong domains in our disavow file and have since removed most domains, leaving only the very highly rated spammy links (using Moz's new Spam Score) in the file. How long can it take for Google to recognise this change? Thanks, Mike
Moz Pro | mlb70 -
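For anyone comparing notes, the disavow file itself is just a plain-text list: one linking URL or one domain: rule per line, with # for comments. A minimal sketch (the domains below are placeholders, not real recommendations):

    # Lines starting with # are comments and are ignored
    # Disavow every link from an entire domain (placeholder domain)
    domain:spammy-example.com
    # Disavow a single linking URL (placeholder URL)
    http://www.example.com/directory/bad-links.html

Re-uploading a trimmed file replaces the old one straight away, but the effect generally only shows up as Google recrawls the individual linking pages, so changes can take weeks or longer to be reflected.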
Special Characters in URL & Google Search Engine (Index & Crawl)
G'day everyone, I need help understanding how special characters impact SEO, e.g. é, ë, ô in words. Does anyone have good insights or reference material on how Google treats special characters? Specifically: how page titles / meta descriptions with special characters are indexed and crawled; best practices for URLs - use of Unicode vs. HTML entity references, and when and where to use each; any disadvantages of using special characters; and whether special characters in a URL have any impact on SEO performance and the user's search experience. Thanks heaps, Amy
Moz Pro | LabeliumUSA0 -
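On the URL part of the question above: non-ASCII characters in a URL are normally transmitted as UTF-8 percent-encoded bytes, so é becomes %C3%A9 and ë becomes %C3%AB. A small illustrative sketch in Python (purely to show the encoding, not anything Moz-specific):

    from urllib.parse import quote, unquote

    # Percent-encode path segments containing accented characters (UTF-8 by default)
    print(quote("café"))         # caf%C3%A9
    print(quote("ëxample"))      # %C3%ABxample

    # Decode an encoded segment back for display
    print(unquote("caf%C3%A9"))  # café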
Duplicate Page Content, Indexing and Rel Canonical Just DOUBLED! Need Advice to Fix
Last Friday (Penguin 5/2.1) my website shot way off the grid, and I noticed in my Moz Pro campaign dashboard that all of the following just doubled in number on my website: duplicate page content, Google indexing, and rel canonicals. I also noticed that some of my pages, images, tags and categories now have a /page/2/ or a -2 appended. I just set tags to noindex, but left media, pages, posts, and categories indexed. I'm currently using All In One SEO as a plugin. Any advice would be much appreciated, as I'm stuck on the issue. (Screenshots attached: relconical.png, Duplicate-Page-Content.png, Duplicate Content II, index1.png)
Moz Pro | CelebrityPersonalTrainer0 -
My homepage is not getting indexed by Google for some reason
My homepage, http://www.truebluelifeinsurance.com, is not indexed by Google. The rest of my site is indexed, and the homepage is indexed by Bing. I looked in Webmaster Tools and there is no indication why. I believe the issue started when I did a site redesign in August. Any ideas?
Moz Pro | Brian_Greenberg0 -
CSV file messed up
I cannot convert my exported CSV file to a proper Excel sheet. The data is mixed up, so converting doesn't work: some rows have all their data in the first cell (column), and some rows have data in the first AND second cells. Does anyone have a solution?
Moz Pro | nans2 -
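A common cause of the symptom described above (some rows collapsing into the first column) is a delimiter mismatch - for example, Excel in some locales expects semicolons while the export uses commas, or quoted fields confuse the import. A hedged Python sketch of one way to detect the delimiter and rewrite the file consistently (the file names are placeholders):

    import csv

    # Detect the delimiter actually used in the exported file, then rewrite it
    # with one consistent delimiter. "export.csv" / "export_fixed.csv" are placeholder names.
    with open("export.csv", newline="", encoding="utf-8") as src:
        sample = src.read(4096)
        dialect = csv.Sniffer().sniff(sample, delimiters=",;\t")
        src.seek(0)
        rows = list(csv.reader(src, dialect))

    with open("export_fixed.csv", "w", newline="", encoding="utf-8") as dst:
        csv.writer(dst, delimiter=",").writerows(rows)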
Last Linkscape index update: 05/30/2012
When will the next Linkscape index update occur? I've been waiting to run our backlink profile numbers, but the last Linkscape index update was on 05/30/2012. Thanks!
Moz Pro | larahill0 -
Best Practices for having Social Profiles indexed
There has been a lot of talk lately about social profiles potentially improving your brand as well as search. What I'd like to know is the best practice for getting those social profiles crawled and indexed so they actually provide a good link to my site. I'm also wondering what the difference is between what Linkscape sees and what Google sees - when I'm looking at Open Site Explorer's metrics for one of those social profiles, how can I be sure that Google sees it the same way? I ask this because a lot of these profiles are not well linked to internally. An example is about.me: it's a potentially great link, but it's essentially an island, and even after dropping a couple of Twitter links to my profile, Open Site Explorer shows a Page Authority of 1, and it's not even indexed by Google. What I did last night was put a link to my about.me, Flickr and WeddingWire profiles in the Connect drop-down menu on my site, to hopefully get them crawled soon. Are there other methods of getting these profiles crawled and indexed so they start passing some juice?
What do you guys do?
Moz Pro | WilliamBay