Does disabling the "View Source" functionality prevent Google from crawling a website?
-
I know Google uses a lot of variables when crawling a website. I wasn't sure if disabling the "View Source" option hindered anything.
-
No, Google will spider just fine.
You can just look at the source code from the browser's developer tools anyway if you know anything about computers. It probably does cut down a bit on casual visitors saving images or content from your site, but it's annoying IMO.
-
Hi,
I would think not, for this reason:
The browser can't display a page without receiving its source, because all of that code is the basis for the graphical representation. "View Source" is only a window onto the code; the code itself is always delivered, or the page simply doesn't exist.
So the same principle applies to bots: I'd expect they don't care one way or the other, as long as they can reach the page through HTTP requests and responses.
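To illustrate the point above, here is a minimal, self-contained Python sketch: it serves one HTML page locally and then fetches it the way a crawler would. No browser is involved, so client-side tricks like disabling View Source or right-click never come into play. (The page content and port are just placeholders for the demo.)

```python
# Serve a tiny HTML page locally, then fetch it over plain HTTP,
# which is all a crawler does. The raw markup always comes back.
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

PAGE = b"<html><body><p>Hello, crawler</p></body></html>"

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(PAGE)

    def log_message(self, *args):
        pass  # keep the demo quiet

# Port 0 asks the OS for any free port.
server = HTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()

url = f"http://127.0.0.1:{server.server_address[1]}/"
html = urlopen(url).read()  # exactly the bytes a bot receives
server.shutdown()
print(html == PAGE)  # True: the source is always available over HTTP
```

Whatever JavaScript runs in a visitor's browser happens after this point; the HTTP response itself is untouched, which is why crawling is unaffected.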
Is there a way to disable just View Source, though, without disabling right-click?
Just from a UX perspective (IMHO), disabling right-click is annoying...