Can cookies harm your website?
-
Hi mozzers,
I am doing an SEO audit, and one of the crawlability components in the audit template I have is: "Disable Cookies/Make Googlebot user agent". I am not quite sure why cookies could harm your SEO.
Can someone explain to me what problems can arise because of cookies? Do they prevent bots from crawling your website, like .js in your nav?
Thanks!
-
The instruction in the audit template is probably there to give you a 100% true view of what Google sees when it crawls the website, unobstructed by things meant for humans like cookies, JavaScript, etc. As Stephen says, Google traditionally does not accept cookies or execute JavaScript, or do a lot of other things that are meant for usability (though it may take some of this into account if it's being used for malicious purposes).
This is not about whether cookies are "harmful," and you are not being instructed to turn off cookies on the website itself. You are being instructed to stop your browser from accepting cookies so you get an idea of what Google's experience of the website is like.
Edited to add - posts like this are being shared, literally today, so keep an eye on the Google-accepting-cookies issue, but for the purpose of the site audit, the point is just to let you see what Google traditionally sees.
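To make that concrete, here is a minimal sketch of how you could approximate that cookie-less, Googlebot's-eye view of a page yourself. This is an illustration, not part of the audit template: it assumes Python 3 with the third-party requests library, and the URL is a placeholder you would swap for the site you're auditing.

```python
# Minimal sketch of the audit step "Disable Cookies/Make Googlebot
# user agent": request a page with Googlebot's user-agent string and
# no cookies. Assumes Python 3 + the `requests` library; the URL is
# a placeholder.
import requests

GOOGLEBOT_UA = (
    "Mozilla/5.0 (compatible; Googlebot/2.1; "
    "+http://www.google.com/bot.html)"
)

def fetch_as_googlebot(url):
    # A one-off requests.get() sends no cookies, which is exactly the
    # point: we want the page as a cookie-less crawler receives it.
    return requests.get(url, headers={"User-Agent": GOOGLEBOT_UA})

resp = fetch_as_googlebot("http://www.example.com/")
print(resp.status_code)
print(resp.text[:500])  # compare against what the page shows in your browser
```

If the HTML that comes back is missing content you can see in a browser, that content is probably gated behind cookies or JavaScript and may be invisible to the crawler.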
-
Bots don't accept cookies - so if there is a page/section on your site (usually a user area requiring login) that requires a cookie in order to access it, the bot won't make it there. If this were a page you needed indexed, that would be a problem.
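To illustrate the scenario described above, here is a hypothetical sketch of a cookie-gated page, assuming Python 3 with Flask; the route and cookie names are invented for illustration, not from this thread. A crawler that never stores cookies fails the check every time, gets redirected to the login page, and never sees the protected content.

```python
# Hypothetical example of a cookie-gated page. Assumes Python 3 with
# Flask installed; the route and cookie names are made up for
# illustration.
from flask import Flask, redirect, request

app = Flask(__name__)

@app.route("/members/size-guide")
def members_size_guide():
    # Googlebot sends no cookies, so this check always fails for it:
    # the bot is bounced to /login and the content below is never
    # crawled or indexed.
    if request.cookies.get("session_id") is None:
        return redirect("/login")
    return "<h1>Members-only size guide</h1>"

if __name__ == "__main__":
    app.run()
```

If a page like this needs to be indexed, the fix is to serve the indexable content without requiring the cookie.
-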
Related Questions
-
Can a linking Root Domain figure be in the millions?
Hi there! In SERP analysis for my chosen keywords, the DA for a Facebook page is 97, and the linking root domains (RDs) figure for the root domain is 42,117,874. So does this site have 42 million domains linking back to this page, making it impossible to beat with SEO? Any help is appreciated as I'm struggling to understand what this means! Thanks 🙂
Intermediate & Advanced SEO | Kati_nav
-
Can subdomains avoid spam penalties?
Hello everyone, I have a basic question I couldn't find a definitive answer for. Let's say I have my main website with URL: www.mywebsite.com, and I have a related affiliates website with URL: affiliates.mywebsite.com, which includes completely different content from the main website. Also, both domains have two different IP addresses. Are those considered two completely separate domains by Google? Can bad links pointing to affiliates.mywebsite.com affect www.mywebsite.com in any way? Thanks in advance for any answer to my inquiry!
Intermediate & Advanced SEO | fablau
-
Knowledge Graph Quick Answer Box: Is there anything we can do to get our content to appear there?
Hi everyone, The quick answers box can be really helpful for searchers by pulling through content which answers their question or provides a clear description of an item or entity. Our client appeared in the quick answer box for a period of time with their description of a product, but has since been replaced by one of their competitors. Previously, the answer was provided by Wikipedia. Is there anything we can do to help get our client's content back in there? We've been looking at possible structured data we can use but haven't found anything. We're also suggesting our client ensure they have a paragraph within their copy that is a clear, concise description of the product that Google can pull. Can anyone give any suggestions? Thanks Laura
Intermediate & Advanced SEO | tomcraig86
-
Can pop-ups cause duplicate content issues on product pages?
Normally, for ecommerce clients that have hundreds of products, we advise that size guides, installation guides, etc. be placed as downloadable PDF resources to avoid huge blocks of content on multiple product pages. If content was placed in a popup, e.g. fancybox, across multiple product pages, would this be read by Google as duplicate content? Examples of this could be: an affiliate site with multiple prices for a product and pop-up store reviews; a clothing site with care and size guides. What would be the best practice or setup?
Intermediate & Advanced SEO | shloy23-294584
-
New Google AdWords Keyword Tool - What Can We Do?
What options do we have for keyword research now that Google is switching from the Google AdWords Keyword Tool to the Keyword Planner?
Intermediate & Advanced SEO | alhallinan
-
Okay, can someone straighten out SEO for me?
If my keyword is dog training and I wanted to make 5 posts on a blog, do I target all the posts with the keyword of dog training or what?
Intermediate & Advanced SEO | 678648631264
-
How can we optimize content that is specific to particular tabs but loaded on one page?
Hi, Our website generates stock reports. Within those reports, we organize information into particular tabs. The entire report is loaded on one page, and JavaScript is used to hide and show the different tabs. This makes it difficult for us to optimize the information on each particular tab. We're thinking about creating separate pages for each tab, but we're worried about affecting the user experience. We'd like to create separate pages for each tab, put links to them at the bottom of the reports, and still have the reports operate as they do today. Can we do this without getting in trouble with Google for having duplicate content? If not, is there another solution to this problem that we're not seeing? Here's a sample report: http://www.vuru.co/analysis/aapl Thanks in advance for your help!
Intermediate & Advanced SEO | yosephwest
-
Bad neighborhood linking - can anyone share experience with how significantly it can impact rankings?
SEOMoz community, If you have followed our latest Q&A posts you know by now that we have been suffering for the last 8 months from a severe Google penalty we are still trying to resolve. Our international portfolio of sports properties has suffered significant ranking losses across the board. While we have been tediously trying to troubleshoot the problem for a while now, we might be onto a hot lead. We realized that one of the properties outside of our key properties, but a site that our key properties are heavily linking to (100+ outgoing links per property), seems to have received a significant Google penalty, in the sense that it has been completely delisted from the Google index and lost all its PageRank (PR4). While we are baffled to see this sort of delisting, we are hopeful that this might be the core of the issues we have experienced, i.e. that our key properties have been devalued due to heavy linking to a bad neighborhood site. My questions to the community are two-fold: 1) Can anyone share any experience on whether a high number of external links to one bad neighborhood domain can indeed cause significant ranking drops, from being ranked top 3 to around 140 for a competitive keyword? The busted site has a large set of high-quality external links. 2) If we swap domains, is there any way to port over any link juice, or will the penalty be passed along? If that is the case, I assume the best approach would be to reach out to all the link authorities and have them link to the new domain instead of the busted site? Thanks /Thomas
Intermediate & Advanced SEO | tomypro