Is there such a thing as a good text/code ratio? Can it affect SERPs?
-
As it says on the tin:
Is there such a thing as a good text/code ratio? And can it affect SERPs?
I'm currently looking at a 20% ratio, whereas some competitors are closer to 40%+.
Best regards,
Sam. -
Thank you, James and Alan, for the quick responses.
-
There is no set ratio, but clean code is important. Large amounts of script, CSS, JSON, and viewstate can affect your SEO. Messy code usually has errors, and many of today's CMS packages generate messy code with errors. Search engines have to work out what is visible to users, which is no easy feat when the markup is full of errors.
Here are a few of the errors that Bing picks up; no doubt Google does too:
http://perthseocompany.com.au/seo/reports/violation/the-page-contains-a-large-amount-of-script-code
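To make the ratio being discussed concrete, here is a rough sketch of how a text-to-code ratio could be measured: visible text length divided by total HTML length, with script and style contents counted as code. This is an illustrative calculation using only the Python standard library, not the exact formula any particular SEO tool uses.

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collects visible text, skipping <script> and <style> contents."""

    def __init__(self):
        super().__init__()
        self.parts = []
        self._skip_depth = 0  # > 0 while inside <script>/<style>

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip_depth += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip_depth:
            self._skip_depth -= 1

    def handle_data(self, data):
        if not self._skip_depth:
            self.parts.append(data)

def text_code_ratio(html: str) -> float:
    """Visible-text characters divided by total page characters."""
    parser = TextExtractor()
    parser.feed(html)
    visible = "".join(parser.parts).strip()
    return len(visible) / len(html) if html else 0.0

page = ("<html><head><script>var x=1;</script></head>"
        "<body><p>Hello world</p></body></html>")
print(round(text_code_ratio(page), 2))  # only "Hello world" counts as text
```

On this tiny example the ratio is about 0.13, i.e. roughly the 20% ballpark the question mentions: most of the page's bytes are markup and script rather than readable text.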
-
I would not worry too much about the text-to-code ratio as an exact number.
Things I would worry about more are the following:
1. Do you have more than 200 words of text per page?
2. Do you have a low number of code errors on the page?
3. Do you have a lot of wasted code space on the page? (I have seen this numerous times.)
4. Make sure your key text and on-page elements are near the top of the page, since Google crawls that content first.
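Check #1 above can be automated. The sketch below counts the visible words on a page and flags thin content; the tag stripping is a deliberately rough regex approach (fine for a quick audit script, not a full HTML parser), and the 200-word threshold is the answer's rule of thumb, not an official Google number.

```python
import re

def visible_word_count(html: str) -> int:
    """Rough word count of a page's visible text."""
    # Drop <script>/<style> blocks entirely, then strip remaining tags.
    text = re.sub(r"(?is)<(script|style)[^>]*>.*?</\1>", " ", html)
    text = re.sub(r"(?s)<[^>]+>", " ", text)
    return len(text.split())

page = "<body><script>var a=1;</script><p>Only five words here.</p></body>"
words = visible_word_count(page)
print("OK" if words >= 200 else f"Thin content: {words} words")
```

Running this over a sitemap's worth of pages would quickly surface the pages that fail the 200-word guideline.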