What's the best practice for implementing a "content disclaimer" that doesn't block search robots?
-
Our client needs a content disclaimer on their site: a simple "If you agree to these rules, click YES; if not, click NO" prompt, where clicking NO pushes you back to the home page.
I have a gut feeling that this may cause problems with the search robots.
Any advice?
R/
John
-
Hi John. I've seen some websites use a simple box that is "lightboxed" on top of the content. The full page loads underneath, and when you click Yes the lightbox disappears and the content is shown as normal. To a search engine, this looks like a perfectly normal website.
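A minimal sketch of that overlay pattern (element IDs, styling, and the home-page redirect target are illustrative, not from the thread): the full content is served in the HTML as usual, and the box simply sits on top, so there is nothing for a crawler to get stuck behind.

```html
<!-- The real page content renders underneath this overlay, so crawlers
     index it normally; only human visitors interact with the box. -->
<div id="disclaimer-overlay"
     style="position: fixed; inset: 0; background: rgba(0, 0, 0, 0.7);">
  <div style="margin: 20% auto; max-width: 30em; background: #fff; padding: 1em;">
    <p>If you agree to these rules, click YES. If not, click NO.</p>
    <button id="agree">YES</button>
    <button id="decline">NO</button>
  </div>
</div>
<script>
  document.getElementById("agree").addEventListener("click", function () {
    // Remove the overlay to reveal the content that was there all along.
    document.getElementById("disclaimer-overlay").remove();
  });
  document.getElementById("decline").addEventListener("click", function () {
    // Push the visitor back to the home page.
    window.location.href = "/";
  });
</script>
```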
However, if your "click Yes or click No" gate only reveals the content on another page AFTER the user clicks Yes, then this would be a huge issue with search engines, because crawlers never click Yes and so never reach the content.
I'd recommend using the "User Agent Switcher" add-on in Firefox to view your site as Googlebot. This should tell you whether Googlebot is seeing the entire site or just a portion of it:
https://addons.mozilla.org/en-US/firefox/addon/user-agent-switcher/
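If you'd rather check from the command line than in Firefox, a rough equivalent (assuming Node 18+ for the built-in fetch; the URL is a placeholder) is to request the page with Googlebot's user-agent string and see whether the real copy comes back:

```javascript
// Fetch a page while identifying as Googlebot, then inspect the HTML.
const GOOGLEBOT_UA =
  "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)";

async function checkAsGooglebot(url) {
  const res = await fetch(url, { headers: { "User-Agent": GOOGLEBOT_UA } });
  const html = await res.text();
  // If the full article copy is in this HTML, crawlers can index it even
  // though human visitors see the disclaimer first.
  console.log(`${url} -> HTTP ${res.status}, ${html.length} bytes`);
  return html;
}

checkAsGooglebot("https://www.example.com/").catch(console.error);
```

Note this only catches user-agent-based differences, which is the same thing the Firefox add-on tests; it won't tell you how Google handles any JavaScript on the page.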
Related Questions
-
For an e-commerce category page with several funnels to specific products, does it matter for SEO whether the category page's overview content sits above or below those funnels?
We manage an e-commerce site. On a category page, there are several funnels to specific products. We moved the category overview content below those funnels to make it easier for users to get to products quickly. It seems more user-friendly to me, but could moving the main content to the lower part of the page be a negative ranking factor?
On-Page Optimization | PKI_Niles
-
Blocking internal search results
Hello everyone, does anyone know how I can block Google from indexing internal search result pages? Thanks. Ryan
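For what it's worth, the usual approach (the paths here are assumptions about how a site might build its internal search URLs, not something from the thread) is to disallow the search path in robots.txt; a sketch:

```
# Hypothetical rules; adjust to your actual internal search URL pattern.
User-agent: *
Disallow: /search/
Disallow: /*?s=
```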
On-Page Optimization | RyanUK
-
Thousands of 404s showing up from WordPress blog!?
Hey guys, I've recently seen thousands of 404 errors thrown up from my WordPress blog in Google Search Console. These are URLs trying to link (I'm not sure where from) to other parts of my site, but they are not relative to the site root; in fact they are a mix of random folders/subfolders and pages on my site. E.g.: http://www.MYSITE.co.uk/blog/how-to/driving-to-the-alps/attachment/2013-land-rover-range-rover-evoque-front-snow-1/st-martin-de-belleville/chalet-st-martin-de-belleville/ski-holidays/ski-holidays/summer/st-martin-de-belleville/summer/your-stay-st-martin-de-belleville.html This is a link to a picture on the blog: http://www.MYSITE.co.uk/blog/how-to/driving-to-the-alps/attachment/2013-land-rover-range-rover-evoque-front-snow-1/ And the rest of it is finding its own way there! Any ideas? This is WordPress, by the way. Cheers, Paul. P.S. I got no help from the WordPress community, so am posting here! P.P.S. I forgot to mention that Moz is reporting these issues too, but running Screaming Frog does NOT show any 404s at all on my site...
On-Page Optimization | SnowTrippin
-
Description tag not showing in the SERPs because the page is blocked by robots.txt, but the page isn't blocked. Any help?
While checking some SERP results for a few pages of a site this morning, I noticed that some pages were returning this message instead of a description tag: "A description for this result is not available because of this site's robots.txt". The odd thing is that the page isn't blocked in the robots.txt. The page is using the Yoast SEO plugin to populate meta data, though. Has anyone else had this happen, and do you have a fix?
On-Page Optimization | mac2233
-
When you add a robots.txt file to a website to block certain URLs, do they disappear from Google's index?
I have seen several websites recently that have far too many webpages indexed by Google, because for each blog post they publish, Google might index the following:
www.mywebsite.com/blog/title-of-post
www.mywebsite.com/blog/tag/tag1
www.mywebsite.com/blog/tag/tag2
www.mywebsite.com/blog/category/categoryA
etc.
My question is: if you add a robots.txt file that tells Google NOT to index pages in the "tag" and "category" folders, does that mean that the previously indexed pages will eventually disappear from Google's index? Or does it just mean that newly created pages won't get added to the index? Or does it mean nothing at all? Thanks for any insight!
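For concreteness, a robots.txt along the lines the question describes (paths taken from the example URLs above) would look something like this; worth noting that robots.txt blocks crawling rather than indexing, so URLs already in the index tend to linger for a while:

```
User-agent: *
Disallow: /blog/tag/
Disallow: /blog/category/
```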
On-Page Optimization | williammarlow
-
Duplicate content: best-practice usage of the canonical URL
Canonical URLs stop self-competition from duplicate content, so instead of two pages each with a rank of 5 out of 10, you get one page with a rank of 7 out of 10.
However, what disadvantages come from using canonical URLs? For example, am I excluding some products like green widget, blue widget? I have a customer with two e-commerce websites (selling different manufacturers of a type of jewellery). Both websites have massive duplicate content issues. It is a hosted CMS system with very little SEO functionality, no plugins etc. The crawling report comes back with thousands of pages that are duplicates. It seems that almost every page on the website has a duplicate partner or more. The problem starts with them having two categories for each product type instead of one: a wholesale category and a small-pack category. So I have considered using a canonical URL, or de-optimizing the small-pack category, as I believe it receives less traffic than the wholesale category. On the original website I tried de-optimizing one of the pages that gets less traffic. I did this by changing the order of the meta title (keyword at the back, not the front, by starting with "small"). I also removed content from the page. This helped a bit. Or I was thinking about just using a canonical URL on the page that gets less traffic. However, what are the implications of this? What happens if someone searches for "small packs" of the product? Will this page no longer be indexed?
The next problem I have is the other thousands of pages that are showing as duplicates. These are all the different products within the categories. The CMS does not have a front office that allows canonical URLs to be inserted; instead it would have to be done by going into the HTML of the pages, which would take ages. Another issue is that these product pages are not actually duplicates, but I think they have so little content that Roger (the SEOmoz crawler, and probably Google's too) can't tell the difference. Also, even if I did use the canonical URL, what happens if people search for the product by its attributes (the variations of each product type), like blue widget, black widget, brown widget? Would these all be excluded from Google's index?
On the one hand I want to get rid of the duplicate content, but I also want to have these pages included in the search. Perhaps I am taking too idealistic an approach, trying to optimize a website for too many keywords. Should I just focus on the category keywords and forget about product variations? Perhaps I should look into Google Analytics to determine the top landing pages and which ones should have a canonical applied. Also, this website (hosted CMS) seems to have more duplicate content issues than other e-commerce sites I have applied SEOmoz tools to.
One final related question: the first website has two landing pages; I think this is a technical issue. For example, www.test.com and www.test.com/index. I realise I should use a canonical URL on the page that gets less traffic. How do I determine which one that is? (Or should I just use the SEOmoz page rank tool?)
On-Page Optimization | WMA
-
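For reference, the canonical mechanism being weighed up in that question is a single link element in the head of the duplicate page; a sketch with made-up URLs:

```html
<!-- In the <head> of the "small pack" page, pointing at the preferred
     wholesale version. Both pages and URLs are illustrative. -->
<link rel="canonical" href="https://www.example.com/widgets-wholesale/" />
```

A page carrying a canonical to another URL is generally dropped from results in favour of the target, so a search for "small packs" would be expected to land on the wholesale page rather than the variant.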
How could I avoid the "Duplicate Page Content" issue on the search result pages of a webshop site?
My webshop site was just crawled by Roger, and it found 683 "Duplicate Page Content" issues. Most of them are result pages of different product searches that are not really identical, but very similar to each other. Do I have to worry about this? If yes, how could I make the search result pages different? Is there any solution for this? Thanks: Zoltan
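One common pattern for near-duplicate internal search result pages (an assumption, not something stated in the thread) is to leave them crawlable but keep them out of the index with a robots meta tag in each result page's head:

```html
<!-- Hypothetical: emitted on every internal search result page. -->
<meta name="robots" content="noindex, follow">
```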
On-Page Optimization | csajbokz
-
Another SEO's point of view
Hiya fellow SEOs, I have been working on a site, www.hplmotors.co.uk, and I must say it has become difficult due to flaws with the content management system. We are speaking with the website makers about adding a unique title and description to all pages. I know what is wrong, but I would also like some second opinions on this and welcome any suggestions for the site. A burnt-out SEO 🙂 Thanks
On-Page Optimization | onlinemediadirect