Can div tags within links affect Google's perception of them?
-
Hi, All!
This might be really obvious, but I have little coding experience, so when in doubt - ask...
One of our client sites has navigation that looks (in part) like this:
<a href="http://www.mysite.com/section1"><div><img src="images/arrow6.gif" width="13" height="7" alt="Section 1">Section 1</div></a>
The W3C validator told us the div tags invalidate. I ignored most of its comments because I didn't think they would affect what search engines saw, but because these tags are right in the links, it raised a question.
Anyone know if this is for sure a problem/not a problem?
Thanks in advance!
Aviva B
-
Thanks, Ryan. Good ideas, and we'll see what "the authorities" choose to do.
-
"If they would have to pay a significant amount of money to have it redone, though, would it be worth it in this kind of case? What would the odds be?"
Without having any information about the site, it's not possible to offer any credible details, odds or measurements of worth. If you are asking for a guess, I would say it is very unlikely for the div tags to cause any SEO problems. But that's the problem with invalid code: you don't know how it will be handled.
The bigger concern I have is if that line of code was coded so poorly, there are likely other coding issues with the site.
May I suggest asking a couple of developers for an estimate of how much it would cost to adjust the site's code so it validates?
-
Thanks, Ryan. Point well taken. I think I may copy and paste this for the client in question. If they would have to pay a significant amount of money to have it redone, though, would it be worth it in this kind of case? What would the odds be?
Aviva
-
Thanks, Kyle. We're not the design/webmaster team, so while it might not have been a good idea to do that in the first place, our job here is just to tell our client what MUST change for SEO and what doesn't need to change, even though it might not have been ideal. The challenges of not having unlimited budget...
Thanks,
Aviva
-
Simply from a front-end development perspective, why would you place a <div> inside of an <a>? If you are trying to force a block-element style, why not simply apply it through the CSS stylesheet to the <a> tag?
If you supply a URL I can give more specific coding advice.
Thanks - Kyle
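To illustrate the CSS approach suggested above: a minimal sketch, assuming a hypothetical nav-item class, that applies the block styling directly to the link instead of wrapping it in a <div>:

```html
<style>
  /* Hypothetical class name: style the link itself as a block,
     so no <div> wrapper is needed inside the <a> */
  a.nav-item { display: block; }
</style>
<a class="nav-item" href="http://www.mysite.com/section1">
  <img src="images/arrow6.gif" width="13" height="7" alt="Section 1"> Section 1
</a>
```

This keeps the markup valid while giving the link the same block behavior the <div> was presumably there to provide.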
-
The problem with using invalid code is that every browser may handle it differently. Even if your current browser handles it fine today, the next time it updates the results may change.
Code validation exists because representatives from all the major browsers get together and agree on coding rules. The biggest problem with invalid code is people thinking their site is fine and then later finding out (or worse, not finding out) that their site does not appear correctly in various browsers.
You have IE6, IE7, IE8, IE9, IE10, Chrome, Firefox, Opera, Safari and other browsers on the market, plus a variety of phones, iPads and other devices. It is more important than ever to use valid code. If your page doesn't fully validate, it should still be almost valid, and the few errors that remain should be ones you have thoroughly researched and consciously chosen not to fix. An example would be if you are using HTML5 and the validation tool has not yet been updated for all the latest changes.
With the above noted, I am not aware of any problem with your code. The challenge is since it is not valid, you cannot predict how it will be handled by Google. Even if it is handled correctly today, a change can be made at any time which can impact you.
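For reference, a version of the navigation link that would validate, assuming the <div> wrapper served no purpose beyond layout, simply drops the wrapper so the link contains only inline content (URL and image attributes taken from the original snippet):

```html
<!-- Only inline content (img, text) inside the link: valid in HTML 4 / XHTML -->
<a href="http://www.mysite.com/section1"><img src="images/arrow6.gif" width="13" height="7" alt="Section 1"> Section 1</a>
```

Whether this exactly matches what the designer intended would depend on how the <div> was being styled.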
-
Thanks, Andy. You've seen sites that have used div tags inside links the same way?
-
To be honest, I can't see, from an SEO perspective, how Google would view these in a negative way. I can only tell you that from all of the sites that I have seen, I have never seen this as a problem.
Someone else might come up with a definitive answer, but I would say that there is nothing wrong with div tags inside links as far as SEO is concerned.
Cheers,
Andy