Please help :) Trouble getting 3 types of content de-indexed
-
Hi there,
I know that it takes time and I have already submitted a URL removal request 3-4 months ago.
But I would really appreciate some kind advice on this topic. Thank you in advance to everyone who contributes!
1) De-indexing archives
Google had indexed all my:
/tag/
/authorname/
archives. I set them to noindex a few months ago, but they still appear in search results.
Is there anything I can do to speed up this de-indexing?
2) De-index the /plugins/ folder on a WordPress site
They have also indexed my entire /plugins/ folder, so I added a Disallow for /plugins/ to my robots.txt 3-4 months ago, but /plugins/ URLs still appear in search results.
What can I do to get the /plugins/ folder de-indexed?
Is my disallow of /plugins/ in robots.txt making it worse, because Google has already indexed the folder and now can't access it? How do you solve this?
3) De-index a subdomain
I had created a subdomain containing adult content and completely deleted it from my cPanel 3 months ago, but it still appears in search engines.
Anything else I can do to get it de-indexed?
Thank you in advance for your help!
-
Hi Fabio
If the content is gone, do you get a 404 code when you visit the old URLs? You can plug the old URLs into urivalet.com to see what code is returned. If you do get a 404, then you're all set. If you don't, see if you can upload a robots.txt file to that subdomain and block all search engines. Here's info on how to do that: http://www.robotstxt.org/robotstxt.html
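If you want to check the response codes yourself, here is a minimal sketch using only Python's standard library (urivalet.com reports the same information in the browser; the URL you pass in would be one of your old subdomain URLs):

```python
import urllib.request
import urllib.error

def get_status(url):
    """Return the HTTP status code a crawler would see for this URL."""
    try:
        with urllib.request.urlopen(url) as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        # 4xx/5xx responses raise HTTPError; the code is what we want
        return err.code

def is_gone(status):
    """404 (Not Found) and 410 (Gone) both signal removal to search engines."""
    return status in (404, 410)
```

A 404 or 410 on the old URLs is exactly what Google needs to see before it will drop them from the index.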
-Dan
-
Hey Dan, there is no content.
The whole website has been deleted, but it still appears in search results. What should I do?
Should I put back some content and then de-index it? Thanks!
Fabio
-
Hi There
You should ensure the content either:
- has meta noindex tags
- or is blocked with robots.txt
- or returns a 404 or 410 (i.e. is missing)
And then use the URL removal tool again and see if that works.
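For reference, a noindex directive is a meta tag in the page's <head>; most WordPress SEO plugins (Yoast, for example) can add it for you, and it looks like this:

```html
<meta name="robots" content="noindex, follow">
```

The "follow" part lets Google keep following links on the page even though the page itself is kept out of the index.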
-
Hey Dan thanks a lot for all your help!
There still is a problem, though. A while ago I created an adult subdomain: adult.mywebsite.com
Then I completely deleted everything inside it (though I noticed the subfolder is still in my account).
A few days ago, when I started this thread, I also created a GWMT account for adult.mywebsite.com and submitted a removal request for all those URLs (about 15). Now, when I check:
site:mywebsite.com
or
site:adult.mywebsite.com
the URLs still appear in search results.
When I check
cache:adult.mywebsite.com
it sends me to a Google 404 page:
http://webcache.googleusercontent.com/search?/complete/search?client=hp&hl=en&gs_rn=31&gs_ri=hp&cp=26&gs_id=s xxxxxxxxxxxxxxxxxxxxxxxx
So I don't know what this means...
Does it mean google hasn't deindexed them?
How do I get them deindexed?
Is it possible Google is having trouble de-indexing them because they have no content in them or something like that? What should I do to get rid of them?
Thanks a lot!!!!!!!!!!
Fabio
-
Hey Fabio
Regarding #2, I'd give it a little more time. 301s take a little longer to drop out, so maybe check back in a week or two. Technically, URL removal mainly works if the content now 404s, is noindexed, or is blocked in robots.txt; with a redirect you can do none of those, so you just have to wait for Google to pick up on the redirects.
-Dan
-
Hi Dan,
1. Ok! I will.
2. When I click on a /go/ link in the search results, it redirects me to the affiliate website. I asked for the removal of the /go/ URLs a few days ago, but they (about 30 results) still appear in Google when I search with the site:mywebsite.com trick.
What should I do about it? How can I get rid of them? They were created with the SimpleUrl plugin, which I deleted about 3 months ago.
3. Got it!
Thanks!
Fabio
-
Hi There
1. For the flash file NoReflectLight.swf - I would do a removal request in WMT and maintain the blocking in robots.txt of /plugins/
2. When you do a URL removal in WMT, the files need to either be blocked in robots.txt, have a noindex on them, or 404. Doesn't that sort of link redirect to your affiliate product? In other words, if I were to visit /go/affiliate-product/, would it redirect to www.affiliateproductwebsite.com? Or does /go/affiliate-product/ load its own page on your site?
3. I would maintain the robots.txt blocking on /plugins/ - if no other files from there are indexed, they will not be in the future.
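For reference, the robots.txt blocking described here looks like this (the /wp-content/plugins/ path is WordPress's default plugin directory; adjust it to match your install):

```
User-agent: *
Disallow: /wp-content/plugins/
```

Note that a Disallow only stops crawling; it does not by itself remove URLs that are already indexed, which is why the URL removal tool is needed on top of it.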
-Dan
-
Hey Dan,
Thanks for the quick reply. I have gone through site:mywebsite.com and found that the tags and categories disappeared, but there is still some content indexed that shouldn't be, like this:
mywebsite.com/wp-content/plugins/wp-flash-countdown/counter_cs3_v2_NoReflectLight.swf
and this:
mywebsite.com/go/affiliate-product/
and I found this in my robots.txt:
Disallow: /wp-content/plugins/
The thing is:
- I deleted that wp-flash-countdown plugin at least 9 months ago
- I manually removed all the URLs with /go/ from GWMT, and when I search for a cached version of them, they are not there
- If I remove Disallow: /wp-content/plugins/ from my robots.txt, won't that get all my plugins' pages indexed? How do I make sure they are not indexed?
Thank you so much for your help! So far you have been the most helpful person in this forum.
-
Hey There
You want to look for the noindex tag in the page source.
You can just do a Ctrl-F (to search the text in the source) and type in "noindex" - it should be present on the tag archives.
-Dan
-
Hey Dan, thanks a lot for your help.
I have tried the cache trick on my home page and the cached version was about 4-5 days old.
I then tried cache:mywebsite/tag/ and it gives me a Google 404 Not Found, which I suppose is a good sign.
But if they have been de-indexed why do they appear in search results then?
I am not sure how to check for a duplicate SEO noindex tag in the source code, though. How do I do that exactly? What should I look for after right-clicking -> View Source?
Thanks for your help!
My MOZ account ends in two days so I may not be able to reply back next time.
-
Hi There
I should have explained better.
If you type cache: in front of any web URL, for example cache:apple.com, you get Google's cached copy of the page along with a "cache" date.
That cache date is not the same as the crawl date, but it can give you a rough indication of how often Google might be looking at your pages.
So try that on some of your tag archives, and if the cache date is, say, 4+ weeks ago, maybe Google isn't looking at the site very often.
But it's odd they haven't been removed yet, especially with the URL removal tool - that tool usually only takes a day. Noindex tags usually only take a week or two.
Have you examined the source code to make sure it does in fact say "noindex" in the robots tag - and that there is not a conflicting duplicate robots noindex tag? Sometimes WordPress themes and plugins both try adding SEO tags, and you can end up with duplicates.
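If you'd rather check this programmatically than eyeball the source, here is a small sketch using only Python's standard library. It scans a page's HTML and counts the robots meta tags that say noindex; more than one suggests the theme/plugin duplication described above:

```python
from html.parser import HTMLParser

class RobotsTagCounter(HTMLParser):
    """Collect the content attribute of every <meta name="robots"> tag."""
    def __init__(self):
        super().__init__()
        self.robots_tags = []

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            attrs = dict(attrs)
            if attrs.get("name", "").lower() == "robots":
                self.robots_tags.append(attrs.get("content", ""))

def count_noindex(html):
    """Return how many robots meta tags in the HTML mention noindex."""
    parser = RobotsTagCounter()
    parser.feed(html)
    return sum("noindex" in content.lower() for content in parser.robots_tags)
```

A result of 0 means the noindex never made it into the page; 1 is what you want; 2 or more means two tools are both emitting SEO tags.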
-Dan
-
Hey Dan thanks,
Well, Google had indexed all my tags, categories, and so on. The only things I had blocked in my robots.txt were
/go/ for affiliate links
and
/plugins/ for plugins
so I did let Google see that the category and archive pages were noindexed.
I also submitted the removal request many months ago, but I haven't quite understood what you said about the cache dates. What should I check?
Thanks for your help!
-
Hi There
For all the cases above, this may be a situation where you've BOTH blocked these in robots.txt AND added noindex tags. You cannot block the directories in robots.txt and also get them deindexed via noindex, because Google then cannot crawl the URLs to see the noindex tag.
If this is the case, I would remove any disallows for /tag/ etc. in robots.txt, allow Google to crawl the URLs and see the noindex tags, wait a few weeks, and see what happens.
As far as the URL removal not working, make sure you have the correct subdomain registered - www or non-www etc for the URLs you want removed.
If neither one of those is the issue, please write back so I can try to help you more. Google should deindex the pages in a week or two under normal circumstances. The other thing is: check the cache date of the pages. If the cache dates are prior to the date you added the noindex, Google might not have seen the noindex directives yet.
-Dan