How much impact does bad html coding really have on SEO?
-
My client has a site that we are trying to optimise. However, the code is really pretty bad.
There are 205 errors showing when W3C validating. The title, meta description, and meta keywords tags are all appearing twice. There is truly excessive JavaScript, and everything has been laid out in tables.
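To give a sense of the problem (this is a made-up fragment, not the client's actual markup), the head currently looks something like this:

    <head>
      <title>Example Widgets | Home</title>
      <meta name="description" content="Buy example widgets online.">
      <meta name="keywords" content="widgets, buy widgets">
      <!-- ...further down the same head, the template repeats the tags... -->
      <title>Example Widgets | Home</title>
      <meta name="description" content="Buy example widgets online.">
      <meta name="keywords" content="widgets, buy widgets">
    </head>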
How much do you think this is really impacting the opportunity to rank? There has been quite a bit of discussion recently along the lines of whether on-page SEO still has much impact.
I just want to be sure before I recommend a whole heap of code changes that could cost her a lot - especially if the impact/return could be minuscule.
Should it all be cleaned up?
Many thanks
-
Hi Chammy,
I inherited a site that reported 3,184 crawl errors in Moz, and a significant number of them (nearly 600) were duplicate titles and content. I have that down to under 1,000 total errors and only 86 critical errors. I have seen my rankings grow pretty substantially, and in one week six pages increased by over 20 positions. I can share the Moz Rank Report if you would like to see it.
So yes, it does have an impact.
-
I'm sorry, I don't have any evidence from the user experience point of view, although I would also be interested to see the results of any studies.
I will say that from a site management/maintenance point of view it makes sense to keep the code as clean as possible. I've been involved in projects where a considerable chunk of the cost was incurred just unravelling the mess before any new changes could even be made!
-
Thanks very much everyone - very helpful.
Good point re page speed - the pages are certainly slow to load, so this could well be down to the huge amount of JS and bad code.
And yes, I think the duplicate tags should be sorted - this shouldn't be difficult.
Has anyone seen any tangible results from cleaning up JS and code?
-
If you've got things like duplicate title and meta description tags going on then I'd certainly take a look at fixing those. Being able to manage these two tags is vital to controlling the way your pages appear in the search results (and your title tag is an important ranking factor).
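As a rough sketch (the page name and copy are invented for illustration), the aim is for every page to end up with exactly one of each, unique to that page:

    <head>
      <!-- One title and one meta description per page -->
      <title>Handmade Oak Dining Tables | Example Furniture Co</title>
      <meta name="description" content="Browse our range of handmade oak dining tables, built to order and delivered UK-wide.">
    </head>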
Normally, if your page doesn't validate it's not a major problem, and search engines won't penalise you for it. If, however, your page is so badly crafted that the HTML errors and general page structure make it difficult for search engines (and humans) to read your page, then you're going to suffer.
The key is to make sure that your site/page content is accessible. How accessible is your page to someone with disabilities, for example someone using a screen reader?
You've got to make sure that the search engines can understand what your page is about or your page won't be seen as a relevant page for any search terms...
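As a loose illustration (not taken from the site in question), content buried in layout tables is much harder to make sense of than a simple semantic structure:

    <!-- Hard for crawlers and screen readers: headings faked with nested layout tables -->
    <table><tr><td><table><tr><td><font size="4"><b>Our Services</b></font></td></tr></table></td></tr></table>

    <!-- Much easier: semantic markup that says what the content actually is -->
    <main>
      <h1>Our Services</h1>
      <p>We offer...</p>
    </main>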
How bad is it? How does Google render the page in its Instant Previews? (You can check this in Google Webmaster Tools.)
-
I personally don't worry about bad code unless it slows down my page or can possibly make things confusing for search engines or readers.
If the title and meta tags are appearing twice this could be confusing for search engines, so I would change that. But if you've got things like an unclosed tag here and there, I personally don't think that's going to be much of a factor.
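For example (an invented fragment), browsers and crawlers recover happily from a stray unclosed paragraph like this:

    <p>Opening hours are listed below.   <!-- never explicitly closed; the parser closes it when the next <p> starts -->
    <p>We are closed on public holidays.</p>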
-
Invalid code on its own has a small effect on ranking. However, if the invalid code causes usability issues such as slow load times or a high bounce rate, then it can lower your rankings and of course cut into conversions.
Some fixes are a higher priority than others. I would definitely remove the meta keywords.
Combine your JS files. The tables, while out of date, are not a big issue.
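Something along these lines usually helps with the excessive-JavaScript side of it (the file names here are just placeholders):

    <!-- Before: several separate, render-blocking scripts -->
    <script src="/js/menu.js"></script>
    <script src="/js/slider.js"></script>
    <script src="/js/tracking.js"></script>

    <!-- After: one combined, minified file, deferred so it doesn't block rendering -->
    <script src="/js/site.min.js" defer></script>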
If you have the time and resources then yes, it should all be cleaned up. If not, clean up the major problems first.