How can I exclude display ads from robots.txt?
-
Google has stated that you can do this to direct spiders to your content only, and to speed up crawling. Our IT guy is saying it's impossible.
Do you know how to exclude display ads from robots.txt? Any help would be much appreciated.
-
You'd want to disallow crawling of the URL paths where the display ads live in your robots.txt, just like any other section of your site. Here are some basics on robots.txt.
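As a minimal sketch, assuming (hypothetically) that the ads are served from an /ads/ path on your site, the rule would look like the one below, and you can confirm what it blocks with Python's built-in parser:

```python
from urllib import robotparser

# Hypothetical robots.txt blocking an /ads/ directory;
# substitute the actual paths your display ads live under.
rules = """\
User-agent: *
Disallow: /ads/
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Compliant crawlers skip the ad paths but still fetch content pages.
print(rp.can_fetch("*", "http://example.com/ads/banner728.html"))   # False
print(rp.can_fetch("*", "http://example.com/articles/seo-basics"))  # True
```

Bear in mind that robots.txt only stops compliant crawlers from fetching those paths; it does not remove URLs that are already indexed.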
Hope this helps!
Related Questions
-
Can we re-rank our penalized website in Google?
Hello, this is Maqbul from India. I have a jobs portal blog [bharatrecruit.com]. It was getting around 50K to 100K views a day and made me $100 a day. But after a few months, a competitor ran negative SEO against it with 12,000 spammy backlinks. The site was suddenly hit by Google and now gets 200 to 300 pageviews a day. The problem is that I did not disavow the bad links for a long time, around 3 to 4 months. I have now disavowed all the bad links, but the website is still not ranking. Can we re-rank this site, or should we create another website? Please reply. None of the bloggers I've asked can answer this. Thanks and regards, Maqbul
Technical SEO | | vinaso960 -
Can hreflang replace canonicalisation?
Hi, I'm working with a site that has a lot of duplicate content, and I have recommended the developer fix it via correct use of canonicalisation, i.e. the canonical tag. However, a US version (of this UK site) is about to be developed in a subfolder (domain.com/uk/ & domain.com/US/ etc.), so I'm also looking into adopting the hreflang attribute on these. Upon reading up about the hreflang attribute, I see that it performs a degree of canonicalisation too. Does that mean that developing the international versions with hreflang removes the need to apply canonical tags to deal with the dupe content, since hreflang will deal with the original dupe content problems as well as the new country-related dupe content? I also understand that hreflang and canonicalisation can conflict/clash on different language versions of international subfolders etc., as per: http://www.youtube.com/watch?v=Igbrm1z_7Hk In this instance we are only looking at US/UK versions, but we will very likely want to expand into non-English countries too in the future, France for example. So given both the above points, if you are using hreflang, is it advisable (or even best) to totally avoid the canonical tag? I would be surprised if the answer is yes, since, while that makes logical sense given the above (if the above statements are correct), it seems strange given how important and standard best-practice canonical usage is these days. What's best? Use hreflang alone, the canonical tag alone, or both? What does everyone else do in a similar situation? All the best, Dan
Technical SEO | | Dan-Lawrence0 -
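For illustration, the combination commonly recommended for a setup like the one described above is reciprocal hreflang annotations plus a self-referencing canonical on each country version; a sketch with hypothetical URLs:

```html
<!-- On https://domain.com/uk/ (hypothetical URLs for illustration) -->
<link rel="canonical" href="https://domain.com/uk/" />
<link rel="alternate" hreflang="en-gb" href="https://domain.com/uk/" />
<link rel="alternate" hreflang="en-us" href="https://domain.com/us/" />

<!-- On https://domain.com/us/ the canonical points to itself instead,
     while the hreflang set is repeated unchanged so the annotations stay reciprocal -->
<link rel="canonical" href="https://domain.com/us/" />
```

In this sketch, hreflang handles the cross-country duplication while the canonical tag still handles ordinary duplicate-content consolidation within each version; the clash warned about in the linked video arises when a canonical points across country versions rather than to the page itself.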
Google is indexing content blocked by robots.txt
Hi, Google is indexing some URLs that I don't want to be indexed, and it is also indexing the same URLs with https. These URLs are blocked in the robots.txt file. I've tried to block these URLs through Google Webmaster Tools, but Google doesn't let me do it because these URLs are https. The robots.txt file is correct, so what can I do to stop this content from being indexed?
Technical SEO | | elisainteractive0 -
What can I do to stop my site from dropping in the rankings?
Hi, we were number one in Google for the keyword "lifestyle magazine", but now our magazine website www.in2town.co.uk is doing very badly in the rankings. One week ago we were around 8, then we went to 12, and now we are on the third page, and I am not sure what is happening. We wanted and needed our home page to rank for the keywords "lifestyle magazine" and "lifestyle news", but none of these keywords are doing very well in Google. Can anyone please point me in the right direction so I can stop my site falling any further? I am not sure if the home page is properly optimized, but I have never had trouble with it before. Many thanks
Technical SEO | | ClaireH-1848860 -
Best use of robots.txt for "garbage" links from Joomla!
I recently started out on SEOmoz and am trying to do some cleanup according to the campaign report I received. One of my biggest gripes is the "Duplicate Page Content" item. Right now I have over 200 pages with duplicate page content. This is triggered because SEOmoz has picked up auto-generated links from my site. My site has a "send to friend" feature, and every time someone wants to send an article or a product to a friend via email, a pop-up appears. It seems the pop-up pages have been picked up by the SEOmoz spider; however, these are pages I would never want indexed in Google, so I just want to get rid of them. Now to my question: I guess the best solution is to make a general rule via robots.txt so that these pages are not indexed or considered by Google at all. But how do I do this? What should my syntax be? A lot of the links look like this, but with different ID numbers according to the product being sent: http://mywebshop.dk/index.php?option=com_redshop&view=send_friend&pid=39&tmpl=component&Itemid=167 I guess I need a rule that grabs the following and makes Google ignore links that contain this: view=send_friend
Technical SEO | | teleman0 -
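As a sketch of how a Googlebot-style wildcard rule such as Disallow: /*view=send_friend would match the URLs above: Python's built-in robotparser does not implement Google's wildcard extensions, so the matching is emulated here with a regex (the rule itself is the asker's own suggestion, not a confirmed answer):

```python
import re

def googlebot_rule_matches(rule_path: str, url_path: str) -> bool:
    """Emulate Googlebot's robots.txt wildcard matching:
    '*' matches any run of characters; a trailing '$' anchors the end."""
    pattern = re.escape(rule_path).replace(r"\*", ".*")
    if pattern.endswith(r"\$"):
        pattern = pattern[:-2] + "$"
    return re.match(pattern, url_path) is not None

# The rule suggested in the question, against one of the "send to friend" URLs:
rule = "/*view=send_friend"
url = "/index.php?option=com_redshop&view=send_friend&pid=39&tmpl=component&Itemid=167"
print(googlebot_rule_matches(rule, url))                              # True: blocked
print(googlebot_rule_matches(rule, "/index.php?option=com_content"))  # False: crawlable
```

Since the id numbers vary but "view=send_friend" is constant, a single wildcard rule like this covers every pop-up URL; wildcard rules can be verified against live URLs with Google's robots.txt tester.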
How many links can I build in a day?
Hi, I built a niche site yesterday; the domain was 1 week old. What is the safest way to do link building, i.e. how many links should I build in order to be safe? So far, I've concentrated on building my social profile: getting as many retweets as possible, slowly getting Facebook likes, and adding a few YouTube videos with my link in them. Now I am on to link building. I will only use legitimate link building methods. Currently I am going to get 2 press releases. But can you tell me the number of links I should get in a day? PS: The keyword I am competing for is highly competitive.
Technical SEO | | Vegit0 -
Has anyone used Micrositez, or can anyone recommend them?
Hi all, After using SEOmoz for a while (and I have learnt a lot; it's a fantastic resource), I think it's about time our website was given a nudge by a professional who, unlike me, isn't learning the ropes while trying to compete with the big boys! So, easy question: has anyone used, or can anyone vouch for, http://www.micrositezdigital.co.uk/ ? I've been in contact with them a lot recently and spent hours on the phone. One side of me is impressed; the other is scared to death of the monthly price. Any help would be appreciated with this. It would be nice to know how good they are, especially in regards to their bespoke campaigns. Thanks, Allan
Technical SEO | | allan-chris0 -
Site not being indexed as fast anymore. Is something wrong with this robots.txt?
My WordPress site's robots.txt used to be this:

User-agent: *
Disallow:
Sitemap: http://www.domainame.com/sitemap.xml.gz

I also have All in One SEO installed, and other than posts, tags are also index,follow on my site. My new posts used to appear on Google within seconds of publishing. I changed the robots.txt to the following, and now post indexing takes hours. Is there something wrong with this robots.txt?

User-agent: *
Disallow: /cgi-bin
Disallow: /wp-admin
Disallow: /wp-includes
Disallow: /wp-content/plugins
Disallow: /wp-content/cache
Disallow: /wp-content/themes
Disallow: /wp-login.php
Disallow: /trackback
Disallow: /feed
Disallow: /comments
Disallow: /author
Disallow: /category
Disallow: */trackback
Disallow: */feed
Disallow: */comments
Disallow: /login/
Disallow: /wget/
Disallow: /httpd/
Disallow: /*.php$
Disallow: /?
Disallow: /*.js$
Disallow: /*.inc$
Disallow: /*.css$
Disallow: /*.gz$
Disallow: /*.wmv$
Disallow: /*.cgi$
Disallow: /*.xhtml$
Disallow: /*?
Allow: /wp-content/uploads

User-agent: TechnoratiBot/8.1
Disallow: /

# ia_archiver
User-agent: ia_archiver
Disallow: /

# disable duggmirror
User-agent: duggmirror
Disallow: /

# allow google image bot to search all images
User-agent: Googlebot-Image
Disallow: /wp-includes/
Allow: /*

# allow adsense bot on entire site
User-agent: Mediapartners-Google*
Disallow:
Allow: /*

Sitemap: http://www.domainname.com/sitemap.xml.gz
Technical SEO | | ideas1230
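A quick way to sanity-check a robots.txt like the one in the question above is to feed its prefix rules to Python's built-in parser and probe the URLs you care about (a sketch with hypothetical URLs; note the stdlib parser ignores Googlebot-style * and $ wildcards, so test those separately in Google's tools):

```python
from urllib import robotparser

# A subset of the prefix rules from the question (wildcard rules such as
# "Disallow: /*.php$" are Googlebot extensions the stdlib parser ignores).
rules = """\
User-agent: *
Disallow: /category
Disallow: /feed
Disallow: /trackback
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Posts stay crawlable, but category archives and feeds are blocked;
# those are paths crawlers may have been using to discover new posts quickly.
print(rp.can_fetch("*", "http://example.com/my-new-post/"))    # True
print(rp.can_fetch("*", "http://example.com/category/news/"))  # False
print(rp.can_fetch("*", "http://example.com/feed"))            # False
```

Blocking feeds, archives, and everything with a query string removes many of the discovery paths that made new posts appear in seconds, which is one plausible reason indexing slowed after the change.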