Will a disclaimer affect Crawling?
-
Hello everyone!
Under German law, my German users will have to be shown a disclaimer, so my question is the following:
Will a disclaimer affect crawling? What's the best practice regarding this, and is there anything I should take special care with? What's the best disclaimer technique: a plain HTML page, or something overlapping the site?
Thank you all!
-
Hi friend, you can display the disclaimer using a JavaScript overlay, and this is absolutely fine. Bots won't have any trouble crawling the site behind the JS overlay, since the actual content is still served in the page's HTML and the notice simply sits on top of it. This is a very common practice among sites that show an age-gate verification page, such as adult sites and sites that discuss or sell liquor.
This technique is not considered cloaking, because the intent is not malicious or deceptive, and Google handles it normally. Hope it helps, and good luck.
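For what it's worth, a bare-bones sketch of that kind of overlay could look something like the snippet below. The element IDs, storage key, and wording are just placeholders rather than any particular plugin; the key point is that the page's real content stays in the normal HTML while the notice is only layered on top in the visitor's browser.

```javascript
// Minimal sketch of a client-side disclaimer overlay (placeholder IDs and key).
// The page content is already in the server-rendered HTML, so crawlers still
// receive it; this script only layers a notice on top for human visitors.
document.addEventListener('DOMContentLoaded', function () {
  // Don't nag visitors who have already accepted the disclaimer in this browser.
  if (localStorage.getItem('disclaimerAccepted') === 'yes') return;

  var overlay = document.createElement('div');
  overlay.id = 'disclaimer-overlay';
  overlay.style.cssText =
    'position:fixed;top:0;left:0;width:100%;height:100%;' +
    'background:rgba(0,0,0,0.85);color:#fff;z-index:9999;' +
    'display:flex;align-items:center;justify-content:center;text-align:center;';
  overlay.innerHTML =
    '<div><p>Disclaimer text required by German law goes here.</p>' +
    '<button id="disclaimer-accept">I accept</button></div>';
  document.body.appendChild(overlay);

  document.getElementById('disclaimer-accept').addEventListener('click', function () {
    localStorage.setItem('disclaimerAccepted', 'yes');
    overlay.remove();
  });
});
```

Storing the acceptance in localStorage (or a cookie, if the server needs to know) just keeps the overlay from reappearing on every page view.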
I addressed a similar question here on Moz:
http://moz.com/community/q/different-user-experience-with-javascript-on-off
Best regards,
Devanur Rafi
-
Maybe I will try what you said; I'll just wait and see if someone else responds so I can gather more ideas. Thanks though!
About cookies: yes, that's a Europe-wide thing, but in Germany, if you have an adult site, sell certain types of products, etc., you also have to display a disclaimer.
-
Hmm, I honestly don't know in this situation. One thing you might try is a modal that blocks the page with a semi-transparent layer, but check whether it is Googlebot accessing the site and skip the modal in that case.
Honestly, though, I thought this was an EU cookies thing, so I am not an expert in this area.
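If you did go that route, a rough sketch of the check might look like this; the user-agent patterns and the CSS class hook are only illustrative, and keep in mind that showing bots something different from users can drift toward cloaking, so treat it cautiously:

```javascript
// Rough sketch: only trigger the blocking modal for visitors whose user agent
// does not look like a known crawler. The pattern list is illustrative only.
document.addEventListener('DOMContentLoaded', function () {
  var botPattern = /googlebot|bingbot|duckduckbot|yandex/i;
  if (!botPattern.test(navigator.userAgent)) {
    // Placeholder hook: a CSS rule such as
    //   body.show-disclaimer #disclaimer-modal { display: flex; }
    // would reveal modal markup that is already present in the page.
    document.body.classList.add('show-disclaimer');
  }
});
```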
-
Thanks for the input!
While the site will not be pornographic, it will include artistic nudity, and I want a disclaimer that covers at least a portion of the page.
-
Don't block the site entirely and it won't really matter. A lot of people in the e-commerce world handle it like this demo does, with just a small bar at the bottom of the page: http://warehouse.iqit-commerce.com/selector/?theme=warehouse2. If you wanted to get even more clever, you could geographically target the user, show the notice based on that, and exclude bots from seeing it. But if it's only for cookies, I would not suggest blocking the whole page the way an adult site does. If it is an adult site that needs a fully blocking disclaimer, I have no experience in that area.
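As a sketch of that combined idea, something like the snippet below would show a small, dismissible bar only to visitors who appear to be in Germany. The /api/geo endpoint is purely a placeholder for whatever IP-geolocation service or server-side country value you actually have available.

```javascript
// Sketch of the "small bar" approach: a non-blocking notice pinned to the
// bottom of the page, shown only when a geolocation lookup says the visitor
// is in Germany. '/api/geo' is a made-up endpoint; substitute whatever
// IP-geolocation service or server-provided country value you really use.
fetch('/api/geo')
  .then(function (response) { return response.json(); })
  .then(function (geo) {
    if (geo.countryCode !== 'DE') return; // only German visitors need the notice

    var bar = document.createElement('div');
    bar.style.cssText =
      'position:fixed;bottom:0;left:0;width:100%;padding:12px;' +
      'background:#222;color:#fff;text-align:center;z-index:9999;';
    bar.innerHTML =
      'Legally required disclaimer text goes here. ' +
      '<button id="disclaimer-dismiss">OK</button>';
    document.body.appendChild(bar);

    document.getElementById('disclaimer-dismiss')
      .addEventListener('click', function () { bar.remove(); });
  })
  .catch(function () { /* if the lookup fails, fail quietly and show nothing */ });
```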