Right way to block Google robots from PPC landing pages
-
What is the right way to completely block SEO robots from my AdWords landing pages? Robots.txt does not work very well for that, as far as I know.
On the other hand, adding noindex, nofollow meta tags will block the AdWords robot as well, right?
Thank you very much,
Serge
-
Thank you very much
-
Gotcha. I did some searching around, and you will not block the AdWords bot unless you explicitly block AdsBot-Google. A wildcard user-agent disallow will not block AdsBot-Google. Hope that helps!
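Sketched as a robots.txt fragment (the /ppc path is just an illustration), that means AdsBot-Google needs its own named group in addition to the wildcard one:

```text
# Blocks standard crawlers (Googlebot etc.) but NOT AdsBot-Google
User-agent: *
Disallow: /ppc

# AdsBot-Google ignores the wildcard group and must be named explicitly
User-agent: AdsBot-Google
Disallow: /ppc
```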
-
Thanks, guys. The AdWords robot scans the page to determine its relevancy to your ad group.
When you block it in robots.txt ONLY, the link to the page can still be indexed and will show up in SERP results for site:example.com.
I was wondering whether adding noindex, nofollow meta tags also blocks all robots from scanning the page, AdWords included.
-
If you have a specific directory for PPC pages, then you can follow these steps, which have worked wonderfully for us:
Block the /ppc directory in the robots.txt file for all user agents. This can be live before the /ppc directory even exists.
User-agent: *
Disallow: /ppc
Add noindex, nofollow meta tags to all pages in /ppc.
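One quick way to sanity-check robots.txt rules like the ones above is Python's standard `urllib.robotparser`. One caveat: robotparser follows standard matching rules, under which the wildcard group alone would already appear to block AdsBot-Google, whereas Google's real AdsBot ignores the wildcard group. So treat this as a check of the rules as written, not of AdsBot's actual behavior; the /ppc path mirrors the steps above:

```python
from urllib import robotparser

# Rules from the steps above, with AdsBot-Google named explicitly,
# since Google's ads crawler ignores the wildcard (*) group.
robots_txt = """\
User-agent: *
Disallow: /ppc

User-agent: AdsBot-Google
Disallow: /ppc
"""

rp = robotparser.RobotFileParser()
rp.parse(robots_txt.splitlines())

# Under standard matching, both bots are barred from /ppc,
# and AdsBot-Google can still fetch the rest of the site.
print(rp.can_fetch("AdsBot-Google", "/ppc/landing-page"))  # False
print(rp.can_fetch("Googlebot", "/ppc/landing-page"))      # False
print(rp.can_fetch("AdsBot-Google", "/"))                  # True
```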
I'm not sure what you are referring to when you mention an AdWords robot, though.
-
Dear Serge,
Your question is difficult to answer because there are several possibilities. If you use noindex, it will stop all robots.
There is a post on the SEOmoz blog written by Lindsay that I think will answer your question. You will find it here: http://www.seomoz.org/blog/robot-access-indexation-restriction-techniques-avoiding-conflicts
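For reference, a minimal sketch of the page-level tag under discussion; the robots meta name applies to all compliant crawlers, while a crawler-specific name (such as Google's documented googlebot meta name) targets only that crawler:

```html
<!-- Applies to all compliant crawlers -->
<meta name="robots" content="noindex, nofollow">

<!-- Targets only Google's web-search crawler -->
<meta name="googlebot" content="noindex, nofollow">
```

Note that a crawler can only see this tag if it is allowed to fetch the page, which is the conflict Lindsay's post walks through.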