How much impact does bad HTML coding really have on SEO?
-
My client has a site that we are trying to optimise. However the code is really pretty bad.
There are 205 errors showing when running it through the W3C validator. The <title>, <meta description>, and <meta keywords> tags are each appearing twice. There is truly excessive JavaScript, and everything has been laid out in tables.
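For illustration only, the duplication might look something like this in the page source (the markup below is a hypothetical sketch, not copied from the client's site):

    <head>
      <title>Widgets | Example Co</title>
      <meta name="description" content="Buy widgets online.">
      <meta name="keywords" content="widgets, buy widgets">
      <!-- a second template include repeats the same tags: -->
      <title>Widgets | Example Co</title>
      <meta name="description" content="Buy widgets online.">
      <meta name="keywords" content="widgets, buy widgets">
    </head>

The fix would be to keep exactly one <title> and one meta description per page.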
How much do you think this is really impacting the opportunity to rank? There has been quite a bit of discussion recently along the lines of whether on-page SEO still has an impact.
I just want to be sure before I recommend a whole heap of code changes that could cost her a lot - especially if the impact/return could be minuscule.
Should it all be cleaned up?
Many thanks
-
Hi Chammy,
I inherited a site that reported 3,184 crawl errors in Moz, and a significant number of them (nearly 600) were duplicate titles and content. I have that down to under 1,000 total errors and only 86 critical errors. I have seen my rankings grow pretty substantially, and in one week six pages each climbed more than 20 positions. I can share the Moz Rank Report if you would like to see it.
So yes, it does have an impact.
-
I'm sorry, I don't have any evidence from the user experience point of view, although I would also be interested to see the results of any studies.
I will say that from a site management/maintenance point of view it makes sense to keep the code as clean as possible. I've been involved in projects where a considerable chunk of the cost was incurred purely because of the time and effort required to unravel the mess before any new changes could even be made!
-
Thanks very much everyone - very helpful.
Good point re page speed - the pages are certainly slow to load, and this could well be down to the huge amount of JavaScript and bad code.
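As a rough sketch of one common fix, assuming the scripts aren't needed before the page first renders (the file names are made up for the example): render-blocking scripts in the head can be given the defer attribute, or moved to the end of the body, so the browser can parse the HTML without waiting for them.

    <!-- before: render-blocking scripts in the head (hypothetical files) -->
    <script src="/js/menu.js"></script>
    <script src="/js/slider.js"></script>

    <!-- after: deferred, so HTML parsing isn't blocked -->
    <script src="/js/menu.js" defer></script>
    <script src="/js/slider.js" defer></script>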
And yes, I think the duplicate tags should be sorted - that shouldn't be difficult.
Has anyone seen tangible results from cleaning up JS and messy code?
-
If you've got things like duplicate title and meta description tags going on, then I'd certainly take a look at fixing those. Being able to manage these two tags is vital to controlling how your pages appear in the search results. (And your title tag is an important ranking factor.)
Normally, if your page doesn't validate it's not a major problem, and search engines won't penalise you for it. If, however, your page is so badly crafted that the HTML errors and general page structure make it difficult for search engines (and humans) to read your page, then you're going to suffer.
The key is to make sure that your site/page content is accessible. How accessible is your page to someone with disabilities using a screen reader, for example?
You've got to make sure that the search engines can understand what your page is about, or your page won't be seen as a relevant result for any search terms...
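As a minimal sketch of the difference (illustrative structure only, not the client's actual markup), compare a table-based layout with the semantic equivalent, which tells crawlers and screen readers what each part of the page is:

    <!-- table layout: the structure says nothing about meaning -->
    <table>
      <tr><td>Acme Widgets</td></tr>
      <tr><td>Our widget range...</td></tr>
    </table>

    <!-- semantic markup: roles are explicit -->
    <header><h1>Acme Widgets</h1></header>
    <main><p>Our widget range...</p></main>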
How bad is it? How does Google render the page in its Instant Previews? (You can check this in Google Webmaster Tools.)
-
I personally don't worry about bad code unless it slows down my page or can possibly make things confusing for search engines or readers.
If the title and meta tags are appearing twice, this could be confusing for search engines, so I would change that. But if you've just got the odd unclosed tag here and there, I personally don't think that's going to be much of a factor.
-
Invalid code by itself has a small effect on ranking. However, if the invalid code causes usability issues, such as slow load times or a high bounce rate, then it can lower your rankings and of course cut into conversions.
Some fixes are a higher priority than others. I would say definitely remove the meta keywords tag.
Combine your JS files. The tables, while out of date, are not a big issue.
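For instance, a minimal before/after sketch of those two fixes (file names and content here are hypothetical):

    <!-- before: keywords tag plus several small script files -->
    <meta name="keywords" content="widgets, buy widgets">
    <script src="/js/a.js"></script>
    <script src="/js/b.js"></script>
    <script src="/js/c.js"></script>

    <!-- after: keywords tag deleted, scripts concatenated into one file -->
    <script src="/js/combined.js"></script>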
If you have the time and resources, then yes, it should all be cleaned up. If not, then clean up the major problems first.