Please help :) Trouble getting 3 types of content de-indexed
-
Hi there,
I know it takes time, and I already submitted a URL removal request 3-4 months ago.
But I would really appreciate some kind advice on this topic. Thank you in advance to everyone who contributes!
1) De-indexing archives
Google had indexed all my:
/tag/
/authorname/
archives. I set them to noindex a few months ago, but they still appear in search results.
Is there anything I can do to speed up this de-indexing?
2) De-index the /plugins/ folder on a WordPress site
Google has also indexed my entire /plugins/ folder, so I added a Disallow rule for /plugins/ to my robots.txt 3-4 months ago, but /plugins/ URLs still appear in search results.
What can I do to get the /plugins/ folder de-indexed?
Is my Disallow: /plugins/ in robots.txt making it worse, because Google has already indexed the folder and now can't access it? How do you solve this?
3) De-index a subdomain
I had created a subdomain containing adult content and deleted it completely from my cPanel 3 months ago, but it still appears in search engines.
Anything else I can do to get it de-indexed?
Thank you in advance for your help!
-
Hi Fabio
If the content is gone, do you get a 404 code when you visit your old URLs? You can plug the old URLs into urivalet.com to see what code is returned. If you do, then you're all set. If you don't, see if you can just upload a robots.txt file to that subdomain and block all search engines. Here's info on how to do that: http://www.robotstxt.org/robotstxt.html
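For reference, the robots.txt described on that page for blocking every crawler from the whole subdomain is just two lines, uploaded to the root of the subdomain:
User-agent: *
Disallow: /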
-Dan
-
Hey Dan, there is no content.
The whole website has been deleted, but it still appears in search results. What should I do?
Should I put back some content and then de-index it? Thanks!
Fabio
-
Hi There
You should ensure the content either:
- has meta noindex tags
- or is blocked with robots.txt
- or returns a 404 or 410 (is missing)
And then use the URL removal tool again and see if that works.
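For reference, the standard robots meta tag (your SEO plugin or theme may output a slightly different variant) goes in the <head> of each page you want dropped:
<meta name="robots" content="noindex">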
-
Hey Dan, thanks a lot for all your help!
There is still a problem though. A while ago I created an adult subdomain: adult.mywebsite.com. Then I completely deleted everything inside it (even though I noticed the subfolder is still in my account).
A few days ago, when I started this thread, I also created a GWMT account for adult.mywebsite.com and submitted a removal request for all those URLs (about 15). Now, today, when I check:
site:mywebsite.com
or
site:adult.mywebsite.com
the URLs still appear in search results.
When I check
cache:adult.mywebsite.com
it sends me to a Google 404 page:
http://webcache.googleusercontent.com/search?/complete/search?client=hp&hl=en&gs_rn=31&gs_ri=hp&cp=26&gs_id=s xxxxxxxxxxxxxxxxxxxxxxxx
So I don't know what this means...
Does it mean google hasn't deindexed them?
How do I get them deindexed?
Is it possible Google is having trouble de-indexing them because they have no content in them, or something like that? What should I do to get rid of them?
Thanks a lot!!!!!!!!!!
Fabio
-
Hey Fabio
Regarding #2, I'd give it a little more time. 301s take a little longer to drop out, so maybe check back in a week or two. Technically, the URL removal will mainly work if the content now 404s, is noindexed, or is blocked in robots.txt, but with a redirect you can do none of those, so you just have to wait for Google to pick up on the redirects.
-Dan
-
Hi Dan,
1. Ok! I will.
2. When I click on a /go/ link in the search results, it redirects me to the affiliate website. I asked for the removal of the /go/ URLs a few days ago, but they (about 30 results) still appear in Google when I search with the site:mywebsite.com trick.
What should I do about it? How can I get rid of them? They were created with the SimpleUrl plugin, which I deleted about 3 months ago.
3. Got it!
Thanks!
Fabio
-
Hi There
1. For the Flash file NoReflectLight.swf - I would do a removal request in WMT and maintain the robots.txt blocking of /plugins/.
2. When you do a URL removal in WMT, the files need to either be blocked in robots.txt, have a noindex tag on them, or 404. Doesn't that sort of link redirect to your affiliate product? In other words, if I were to visit /go/affiliate-product/, would it redirect to www.affiliateproductwebsite.com? Or does /go/affiliate-product/ load its own page on your site?
3. I would maintain the robots.txt blocking on /plugins/ - if no other files from there are indexed, they will not be in the future.
-Dan
-
Hey Dan,
Thanks for the quick reply. I have gone through site:mywebsite.com and found that the tags and categories have disappeared, but there is still some content that shouldn't be indexed, like this:
mywebsite.com/wp-content/plugins/wp-flash-countdown/counter_cs3_v2_NoReflectLight.swf
and this:
mywebsite.com/go/affiliate-product/
and I found this:
Disallow: /wp-content/plugins/
in my robots.txt. The thing is that:
- I deleted that wp-flash-countdown plugin at least 9 months ago
- I have manually removed all the URLs with /go/ from GWMT, and when I search for a cached version of them they are not there
- If I remove Disallow: /wp-content/plugins/ from my robots.txt, won't that get all my plugins' pages indexed? So how do I make sure they are not indexed?
Thank you so much for your help! So far you have been the most helpful answerer in this forum.
-
Hey There
You want to look for the robots meta tag in the page source.
You can just do a Ctrl-F (to search the text in the source) and type in "noindex", and it should be present on the tag archives.
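If you'd rather check from the command line, here's a rough sketch in Python (just an illustration - it assumes the requests library is installed, and the tag URL is a placeholder for one of your own archive URLs):
import requests

# Placeholder URL - swap in one of your tag archive URLs
url = "http://www.mywebsite.com/tag/example-tag/"
response = requests.get(url)
html = response.text.lower()

# A count above 1 can hint at duplicate robots tags from a theme/plugin conflict
print("HTTP status:", response.status_code)
print("'noindex' occurrences:", html.count("noindex"))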
-Dan
-
Hey Dan, thanks a lot for your help.
I have tried the cache trick on my home page and the cached version was about 4-5 days old.
I then tried cache:mywebsite.com/tag/ and it gives me a Google 404 not found, which I suppose is a good sign.
But if they have been de-indexed, why do they still appear in search results?
I am not sure how to check for duplicate SEO noindex tags in the source code though. How do I do that exactly? What should I look for after right-clicking -> View Source?
Thanks for your help!
My Moz account ends in two days, so I may not be able to reply next time.
-
Hi There
I should have explained better.
If you type cache: in front of any web URL, for example cache:apple.com, you get Google's cached copy of the page with a header showing the date it was cached.
That "cache" date is not the same as the crawl date, but it can give you a rough indication of how often Google might be looking at your pages.
So try that on some of your tag archives, and if the cache date is, say, 4+ weeks ago, maybe Google isn't looking at the site very often.
But it's odd they haven't been removed yet, especially with the URL removal tool - that tool usually only takes a day. Noindex tags usually only take a week or two.
Have you examined the source code to make sure it does in fact say "noindex" in the robots meta tag - and that there isn't a conflicting duplicate robots tag? Sometimes WordPress themes and plugins both try adding SEO tags, and you can end up with duplicates.
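As a purely hypothetical illustration of what a conflict can look like, you might find two robots tags in the <head>, for example:
<meta name="robots" content="noindex,follow"> <!-- output by an SEO plugin -->
<meta name="robots" content="index,follow"> <!-- output by the theme -->
If you see something like that, disable one of the two sources so only the noindex tag remains.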
-Dan
-
Hey Dan, thanks.
Well, Google had indexed all my tags, categories and so on. The only things I had blocked in my robots.txt were
/go/ for affiliate links
and
/plugins/ for plugins
so I did let Google see that the category and archive pages were noindexed.
I also submitted the removal request many months ago, but I haven't quite understood what you said about the cache dates. What should I check?
Thanks for your help!
-
Hi There
For all the cases above, this may be a situation where you've BOTH blocked these in robots.txt and added noindex tags. You cannot block the directories in robots.txt and also get them de-indexed, because Google then cannot crawl the URLs to see the noindex tag.
If this is the case, I would remove any disallows for /tag/ etc. in robots.txt, allow Google to crawl the URLs and see the noindex tags, then wait a few weeks and see what happens.
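Just to illustrate (this is a hypothetical robots.txt - adjust to whatever your actual file contains), if yours currently looks something like:
User-agent: *
Disallow: /tag/
Disallow: /authorname/
Disallow: /wp-content/plugins/
Disallow: /go/
you would delete only the /tag/ and /authorname/ lines, leaving:
User-agent: *
Disallow: /wp-content/plugins/
Disallow: /go/
so the archive pages stay crawlable and Google can see their noindex tags.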
As for the URL removal not working, make sure you have the correct subdomain registered - www or non-www, etc. - for the URLs you want removed.
If neither of those is the issue, please write back so I can try to help you further. Google should drop noindexed pages within a week or two under normal circumstances. The other thing is to check the cache date of the pages: if the cache dates are prior to the date you added the noindex, Google might not have seen the noindex directives yet.
-Dan