Is There SEO Value in Improving the HTML Code of a Website That Validates According to W3C?
-
Greetings MOZ Community:
My real estate website www.nyc-officespace-leader.com, originally designed in Drupal, was relaunched using WordPress in 2013. The code for all URLs validates. The relaunch was performed by developers in Argentina.
As part of an SEO campaign, a very reputable domestic design/coding company has provided new wireframes to correct usability issues holding back conversions. In the course of the design adjustments they inspected the code and told me that it is inefficient, that a number of shortcuts were taken, and that the code does not conform to WordPress best practices. What concerns me most is their claim that the quality of the coding makes it more difficult for Google to index the site, which may be detrimental to ranking.
Is it possible for the original developers to clean up this code if the deficiencies are pointed out to them? Or once coding shortcuts are taken are they impossible to fix?
Would it make sense for me to request that the new design team put together a list of HTML deficiencies, provide it to the original developers, and ask them to correct it?
I am spending tens of thousands of dollars on content optimization and content marketing. It would be absurd if these coding issues ultimately prevented any improvement in ranking and traffic.
At the same time, I hate to be a cynic, but the domestic design/coding firm, while very professional, does have an incentive to get me to ditch the original design so I commit to a costly rebranding. If these issues are really minor, maybe it is not worth the effort to clean up the code (assuming that is even possible), and I should focus the budget on content marketing instead.
Any thoughts?
-
Perhaps even pay someone a small amount for an audit that does not come with the chance of further work for that person.
This is a really good idea. First ask the guys complaining to point out real stuff that needs to be fixed or give you an example of a page with problems. Then you have specific stuff to get opinions on.
My bet is that these complainers are simply picky code monkeys who can't stand work that does not meet their compulsively tidy standards. It bugs them that they have to "think" about somebody else's code that is not formatted or written the way they like it.
I am very confident that they would not like my code.
-
Hi there,
I would also seek a second opinion - there are many HTML issues that aren't troublesome enough to bother Google, and the claim that "the quality of coding makes it more difficult for Google to index the site and this may be detrimental to ranking" depends heavily on what those coding "issues" actually are. I'd rather see a list of what they say is detrimental than do a full code audit here - I'd be super curious as to what those issues supposedly are.
That list can be used as a starting point to get the second opinion of other development and SEO companies - I would certainly run the list by SEOs as well as coders, as each group knows more about their chosen field than the other. You'll get a good idea about a) what has been done sloppily and b) how Google might treat that sloppiness.
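If it helps to pull that list together yourself, here is a minimal sketch of one way to do it (assuming Python with the requests package, and the public W3C Nu HTML Checker endpoint; the page list is just a placeholder) that prints the validator's errors for each template you care about, so you have something concrete to put in front of other developers and SEOs:

```python
# Minimal sketch: pull W3C Nu HTML Checker results for a few key pages
# so the "sloppy code" claim becomes a concrete list of messages.
# Assumes the requests package is installed; the page list is a placeholder.
import requests

PAGES = [
    "http://www.nyc-officespace-leader.com/",
    # add one URL per template you care about: listing page, property page, etc.
]

VALIDATOR = "https://validator.w3.org/nu/"

for page in PAGES:
    # Ask the checker to fetch the page itself and return its findings as JSON.
    resp = requests.get(
        VALIDATOR,
        params={"doc": page, "out": "json"},
        headers={"User-Agent": "html-audit-sketch/0.1"},
        timeout=30,
    )
    resp.raise_for_status()
    messages = resp.json().get("messages", [])
    errors = [m for m in messages if m.get("type") == "error"]
    print(f"{page}: {len(errors)} errors, {len(messages) - len(errors)} other messages")
    for m in errors[:5]:  # print the first few so the output stays readable
        print("  -", m.get("message"))
```

Something like this at least turns "the code is sloppy" into a page-by-page list you can get second opinions on.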
Agreed that going back to the original coders, especially for diagnosis, isn't the best idea, but if you have proof that something they did is sub-par, you might mention it to them. As DJ123 says above, don't expect a very receptive response if you do, of course!
It could be a legitimate point or it could be a way to charge for services - I'd get some fully independent quotes. Perhaps even pay someone a small amount for an audit that does not come with the chance of further work for that person. That is, the auditor has no financial incentive to find something wrong with the site.
-
Yeah, it is hard to say without knowing exactly what is going on.
It is possible that at a small scale you won't see the potential issues surface, but once you get over a certain traffic level the imperfections in the code may start to cause problems. You definitely don't want slow server response times as you increase traffic, which, generally speaking, could cause you an issue.
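For what it's worth, a quick way to get a baseline on server response time is just to time a batch of requests against a page and repeat the check as traffic grows. A rough sketch (assuming Python with the requests package; the URL is a placeholder, and this is a sanity check, not a load test):

```python
# Rough sketch: time a handful of requests to one page to get a baseline
# for server response time. A sanity check, not a load test.
import statistics
import requests

URL = "http://www.nyc-officespace-leader.com/"  # placeholder page
SAMPLES = 10

timings = []
for _ in range(SAMPLES):
    resp = requests.get(URL, timeout=30)
    resp.raise_for_status()
    # elapsed measures the time from sending the request until the
    # response headers arrived
    timings.append(resp.elapsed.total_seconds())

print(f"median: {statistics.median(timings):.2f}s, "
      f"slowest: {max(timings):.2f}s over {SAMPLES} requests")
```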
I hate to give the typical lawyer-esque response, but it all depends on your plan, traffic levels, etc.
That's why I would identify what the issues are, then ask yourself and other pros what problems they are actually going to cause, quantify the potential cost impact (tangible and intangible - I think you need a "pain in the rear end" threshold), and estimate how much time and money it will take to correct them. Then I would make the decision based on ROI once you consider all those factors.
If you have an issue in the coding that won't truly be an issue until you reach like 100,000 unique visitors per day, and you are only projecting having like 1,000 visitors a day, I would probably make a different decision than if the issue were going to show itself at 500 visitors per day and I am expecting 1,000.
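To make that comparison concrete, here is a tiny back-of-the-envelope sketch of the framing above (every number is an invented placeholder, not an estimate for your site):

```python
# Back-of-the-envelope sketch of the ROI framing above.
# Every number here is an invented placeholder; plug in your own estimates.

fix_cost = 5_000             # quoted cost to clean up the code
monthly_revenue = 20_000     # revenue you attribute to the site today
issue_threshold = 100_000    # daily visitors at which the code problem bites
projected_visitors = 1_000   # daily visitors you realistically expect

if projected_visitors < issue_threshold:
    print("Issue unlikely to surface at projected traffic; "
          "the budget probably earns more in content marketing.")
else:
    # Estimate the monthly revenue at risk if the issue degrades the site,
    # then see how quickly the one-off fix pays for itself.
    revenue_at_risk = monthly_revenue * 0.10   # assume a 10% hit, for example
    months_to_pay_back = fix_cost / revenue_at_risk
    print(f"Fix pays for itself in about {months_to_pay_back:.1f} months "
          "of avoided losses.")
```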
Hope that helps!
-
Thanks!! Your suggestion to get a detailed list of the HTML imperfections is excellent.
So as long as the site loads relatively quickly, the code validates, and the sitemap is correctly set up, you don't think some sloppiness in the HTML (and I will determine exactly what that is) will fatally hamper Google ranking?
It would be a shame to spend tens of thousands of dollars on SEO only to have the code kill the effort.
Thanks,
Alan -
Wow - that is tough. I have been there when it comes to hiring coders, etc.
I am not 100% sure what they are saying is correct, so here is what I would do:
- Get them to explicitly list what the issues are, one by one, line of code by line of code, along with what they claim each issue is causing.
- Shop the list around to other developers, not the original ones (in my experience you will not get anywhere by doing that), and see whether what they are claiming holds any validity.
Just saying that it isn't coded correctly isn't sufficient grounds to make sweeping changes. The claims need to be quantified, with a potential impact on the site identified for each line item.
Your site loads pretty fast, and as long as you have an XML sitemap submitted to Google, and a few other things in place, I cannot imagine why it would be an issue. It is already on the WordPress platform, which accounts for a large share of the sites Google crawls, but then again, you may have a number of plugins running, etc.
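On the sitemap point, it is easy to double-check what the site is actually exposing. A small sketch (assuming Python with the requests package; the sitemap location shown is the common default and may differ depending on which WordPress SEO plugin is installed) that fetches the live sitemap and lists what it contains, so you can confirm the pages you care about are in there:

```python
# Small sketch: fetch the live sitemap and list what it actually contains,
# so you can confirm the pages you care about are in there.
# Assumes the requests package; the sitemap location is the common default
# and may differ depending on the WordPress SEO plugin in use.
import requests
import xml.etree.ElementTree as ET

SITEMAP_URL = "http://www.nyc-officespace-leader.com/sitemap.xml"
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

resp = requests.get(SITEMAP_URL, timeout=30)
resp.raise_for_status()
root = ET.fromstring(resp.content)

if root.tag.endswith("sitemapindex"):
    # Some plugins generate a sitemap index that points at child sitemaps.
    children = [loc.text for loc in root.findall("sm:sitemap/sm:loc", NS)]
    print(f"sitemap index with {len(children)} child sitemaps:")
    for child in children:
        print("  -", child)
else:
    urls = [loc.text for loc in root.findall("sm:url/sm:loc", NS)]
    print(f"{len(urls)} URLs listed; first few:")
    for u in urls[:5]:
        print("  -", u)
```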
-