Don't do that. A disallow in robots.txt will NOT resolve the indexing issue! What you need to use is the meta robots tag with noindex, nofollow. Watch this WBF on the subject:
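To make the distinction concrete: robots.txt only blocks crawling, so a disallowed URL can still be indexed from external links. A meta robots tag, by contrast, is read when the page is crawled and tells search engines to drop it from the index. A minimal sketch (placement and wording are the standard convention, not from the post itself):

```html
<!-- Place inside the <head> of each page you want removed from the index.
     The page must stay crawlable (NOT disallowed in robots.txt),
     otherwise search engines will never see this directive. -->
<meta name="robots" content="noindex, nofollow">
```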
Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be possible to view - we have locked both new posts and new replies.

Best posts made by DmitriiK
-
RE: Backlinks from subdomain, can it hurt ranking?
-
RE: Intro to programming/coding for seo
Hi there.
I don't think there are any specific courses for technical SEO, since it covers HTML, CSS, JS, jQuery, PHP and more.
I think you would be better off trying to find lessons/guides for specific programming languages your website is built on, rather than trying to learn everything at the same time.
As for APIs, it's the same thing: there are lots of good tutorials for specific APIs, but to use them you need to know some programming, since every API is based on one programming language or another.
So, my advice: find out which languages your website is built on, what APIs it uses and what language those APIs operate with (most likely it will be the same as your website's platform language), then find a crash course online for that language and spend a couple of weeks learning it. After that you should be able to make simple technical SEO changes.
Hope this helps.
-
RE: URL Rewriting Best Practices
Hi there.
Well, as for best practices, you've got it covered: remove/substitute underscores, remove redundant directories, make URLs readable and understandable by users, and implement redirects for pages that are being renamed.
As for removing extensions from files, I'm not sure it has any effect on SEO or user experience at all. But no, you don't have to create new-format pages. Basically, what mod_rewrite does is this: when somebody requests a page, the server says "I'm going to serve you this file with this name, because you sent me this specific request." Just be aware that you should not leave both the original URL and the rewritten URL accessible at the same time, since that would create duplicate content issues.
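A hedged sketch of what such an .htaccess setup could look like, assuming an Apache server with mod_rewrite enabled and pages stored as .html files (the rules are illustrative, not the poster's actual code):

```apache
RewriteEngine On

# Externally 301 old /page.html requests to the extensionless /page,
# so only one URL stays accessible and duplicates are avoided.
RewriteCond %{THE_REQUEST} \s/([^?\s]+)\.html[?\s]
RewriteRule ^ /%1 [R=301,L]

# Internally serve /page from page.html when that file exists.
RewriteCond %{REQUEST_FILENAME}.html -f
RewriteRule ^(.+?)/?$ $1.html [L]
```

The external 301 in the first pair of lines is what prevents the duplicate content problem mentioned above: the .html version is never served directly.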
As for the effect on rankings: as long as all redirects are done properly and the URLs target the keywords on the page, you should be fine.
-
RE: URL Rewriting Best Practices
Yes, I believe so; that's the only rewrite you'd need in order not to mess up rankings.
I don't know if one of the snippets is better than the other. All I know is that my piece of code works, and I haven't used the one you wrote. It seems OK to me, but just test it. If it works, I don't think there is any difference.
-
RE: URL Rewriting Best Practices
I'm saying rename the files first, then do a rewrite to remove the extensions.
You will have to do a rewrite to replace underscores with hyphens anyway, just for redirect purposes.
So: rename the files from underscores to hyphens; add a rewrite rule redirecting underscored URLs to hyphenated ones to ensure the old pages are redirected; then add another rewrite to remove the file extensions. In some time (2-4 months), when the old file names (with underscores) are out of Google's index, delete the first rewrite.
-
RE: Anyone heard of Deftsoft?
Hi there.
I've never worked with or heard of this company. However, a very easy way to check whether a company is any good at SEO is to see how they rank for related keyphrases. So that's what I did: using Moz's rank tracker, I checked their domain for three SEO-related keyphrases - seo denver, ppc denver and social media marketing denver - and they are not in the top 50 for any of them. I understand this industry is competitive, but if they can't make themselves rank in the first 5 pages, something is wrong. Therefore I personally wouldn't go with them.
Hope this helps.
-
RE: After HTTPS upgrade, should I change all internal links, or a general 301 redirect is better?
Hi again. I've seen it. Quite honestly, I disagree with absolute URLs being a priority. The arguments presented in that WBF don't really outweigh, for me, the pain they cause in development (I believe she mentioned even more drawbacks). Also, from my experience I have not seen any benefits at all (SEO or loading speed) from absolute URLs over relative ones.
-
RE: Do Blog Tags affect SEO at all anymore?
Hi there.
Well, one thing is for sure: do not block tags in robots.txt. That definitely won't help.
As for organic rankings: at the last company I worked for, I saw instances where a tag page ranked for a longer-tail keyphrase. So I would keep tags somewhat SEO-friendly. But, as you said, the main reason for tags nowadays is UX. Therefore I approach it this way: UX/navigational help first, but, if possible, also make the tag SEO-friendly. Also, 10 tags is too many in my opinion. I believe the recommendation is 2-3 tags, maybe 4; more than that is overkill.
Hope this helps.