Oops - just noticed that I had an error in the first reply and didn't check the correct code. Mea culpa.
Posts made by DirkC
-
RE: International Site Migration
-
RE: I have an authority site with 90K visits per month. Now I have to change from non www to www. Will incur in any SEO issues while doing that? Could you please advice me on the best steps to follow to do this? Thank you very much!
Basically you have to make sure that you properly set up the redirects & that you update your sitemap and internal links (Screaming Frog can help with this). Make sure that you test properly before making the change.
It's a simple case of "site migration" - so you might check this article which covers all topics related to a migration/change of URLs (just skip the parts that are not applicable to your case).
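For the non-www to www switch itself, a minimal .htaccess sketch (assuming Apache with mod_rewrite; example.com stands for your real domain):
RewriteEngine On
# 301-redirect every non-www request to the www version, keeping path and query string
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
Test it on a staging copy first - a wrong rule here can take the whole site down.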
Dirk
-
RE: Can you use the canonical tag and rel=next and rel=prev on category pages.
With rel=next/prev you ask Google to treat all the pages carrying these tags as one logical series.
With a canonical you indicate that the page is a duplicate (or variation) of the canonical URL. Your developers are right - if you put the first page as canonical for the subsequent pages you are basically asking Google to ignore the 2nd, 3rd,... pages, which is in conflict with rel=next/prev.
It is possible to combine both canonical & rel=next/prev - but not in the way your SEO company is suggesting. Example from Google:
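A hedged reconstruction of that pattern (example.com, the article path and the parameter names are placeholders) - on page 2 of a paginated series you would have:
<link rel="canonical" href="http://www.example.com/article?story=abc&page=2" />
<link rel="prev" href="http://www.example.com/article?story=abc&page=1&sessionid=123" />
<link rel="next" href="http://www.example.com/article?story=abc&page=3&sessionid=123" />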
The canonical in this case is used to strip the sessionid which could be a cause for duplicate content.
Please note that if your category pages are split over a lot of pages (like 100) - rel=next/prev stops making sense. In that case it's probably better to focus on optimising the first page & put a "noindex,follow" on the subsequent pages.
Dirk
-
RE: International Site Migration
One sitemap is OK from a technical perspective (as long as it doesn't become too long - the limit is 50K URLs per sitemap file). For reporting it is useful to split per country - this way it's easier in WMT to check if you have indexing problems on a specific country version.
Dirk
PS There is a typo in my initial answer for the UK pages - the value has to be en-gb (GB is the ISO country code for the United Kingdom), not en-us.
-
RE: Strange rankings on new website
If you check your rankings in Search Console - do you see major fluctuations after the relaunch?
How do you track traffic? On the forum you seem to use Google Analytics; the other pages, however, do not contain an Analytics tag.
Dirk
-
RE: International Site Migration
Hreflang will certainly help - however it's a bit confusing how you put it in your question. Hreflang is not set at domain level but at page level: the Australian, United States and UK pages each carry the full set of annotations for that page (see the sketch below).
=> hreflang needs to be put on every page (you can test some sample pages here: http://flang.dejanseo.com.au/).
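A minimal sketch, assuming folders like domain.com/au/, domain.com/us/ and domain.com/uk/ (the /some-page/ path is a placeholder) - every country version of that page repeats the same three lines:
<link rel="alternate" hreflang="en-au" href="http://domain.com/au/some-page/" />
<link rel="alternate" hreflang="en-us" href="http://domain.com/us/some-page/" />
<link rel="alternate" hreflang="en-gb" href="http://domain.com/uk/some-page/" />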
Apart from hreflang - register each folder in WMT & target it to the specific country.
So set domain.com/uk/ to specifically target the UK. Apart from that - make sure you adapt the text to the "local" English (so UK English for /uk/, etc.) & use the proper currencies & provide local contact details. Build local links for each subfolder.
If you do this you should be fine
Dirk
-
RE: I have two Facebook Pages connected to the same website. Is there a way to tell in Google Analytics which Facebook Page is responsible for what referral traffic?
In both cases you will just see Facebook as the referrer - if you want to track in more detail you will have to add a tracking parameter to your links.
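A sketch of what that could look like with UTM parameters (example.com and the campaign values are placeholders you would choose yourself):
http://www.example.com/?utm_source=facebook&utm_medium=social&utm_campaign=page-a
http://www.example.com/?utm_source=facebook&utm_medium=social&utm_campaign=page-b
Use one campaign (or source) value per Facebook page and you can split the traffic in the Campaigns report.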
Dirk
-
RE: Delete or not delete outdated content
For old content which has expired - just let it return a 404 or redirect it to a newer version of the page (if available).
For new content that is going to expire you could use the unavailable_after tag (a hedged example is below) - see also this advice from Matt Cutts on content that expires (it's more about products for e-commerce, but the general principle is the same).
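A minimal sketch of the tag (the date is a placeholder; Google accepts this kind of format):
<meta name="googlebot" content="unavailable_after: 25-Dec-2015 15:00:00 GMT">
After that date Google will stop showing the page in its results, so only use it on content you really won't need anymore.
Dirk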
-
RE: International Sitemaps
It doesn't really matter where you put the sitemaps - as long as the search engines are able to find them.
You can indicate the location in Webmaster Tools - or in the robots.txt file (you need to validate each domain & subdomain in WMT).
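For the robots.txt route it's just one line per sitemap, for example (the URLs are placeholders):
Sitemap: http://www.example.com/sitemap-us.xml
Sitemap: http://www.example.com/sitemap-uk.xml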
Dirk
-
RE: SSL for SEO?
No difference for SEO; the main difference is the green bar which is displayed for Extended Validation SSL certificates - these are the ones which tend to be more expensive than the "standard" ones - on top of that they don't allow wildcards, so you'll need a certificate for each subdomain. It could increase your visitors' confidence in your site - but as stated before - no direct SEO impact.
Dirk
-
RE: Is there a limit to how many URLs you can put in a robots.txt file?
You could add them to the robots.txt but you have to remember that Google will only read the first 500KB (source) - as far as I understand, with the number of URLs you want to block you'll exceed this limit.
As Googlebot is able to understand basic wildcard patterns it's probably better to use those (you will probably be able to block all these URLs with a few lines; a hedged sketch is below).
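For example (the private-label fragment is a made-up placeholder - replace it with whatever the URLs you want to block have in common):
User-agent: Googlebot
# block every URL whose path contains private-label
Disallow: /*private-label
# block every URL ending in .pdf
Disallow: /*.pdf$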
More info here & on Moz: https://moz.com/blog/interactive-guide-to-robots-txt
Dirk
-
RE: External links that don't exist
Hi,
I did a quick crawl of your site with Screaming Frog - your site seems to generate a lot of timeouts & 5xx errors - it could be that these are seen as external links by Moz. Response times are very bad for most of your pages.
Dirk
-
RE: External links that don't exist
I checked your site with a plugin - and it seems to have more than 1800 links - don't forget that the links in the navigation, even if not visible, still count as links.
-
RE: Help for 404 error
Hi Ravi,
Not sure which link checker you are using - I did a quick check with both Screaming Frog & an extension in Chrome. Both indicate that the anchors do not generate 404's. They seem to be functional as well so it shouldn't be a problem.
Anyway, anchors have existed for ages and do not cause SEO issues - so it's safe to use them. I also noticed that both the www & non-www versions are accessible - you might want to choose a preferred version & redirect the other one.
Dirk
-
RE: Canonical Question: Root Domain Geo Redirects to SubFolder.
To be very honest I don't think it will make a difference if it's going to the /us/ version rather than the root.
If you prefer - you could keep the us version on the root & only redirect the non-us visitors to a country version.
Dirk
-
RE: Canonical Question: Root Domain Geo Redirects to SubFolder.
As far as I understand there is no content on domain.com so your last line makes no sense.
If you want the default version to be the US version you should add an x-default annotation pointing to it, along the lines of the sketch below.
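A sketch (domain.com stands for your real domain; strictly the href should point to the /us/ equivalent of whatever page the tag sits on):
<link rel="alternate" hreflang="x-default" href="http://domain.com/us/" />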
Don't forget that hreflang needs to be placed on every page of your site - you can check if the implementation is correct here: http://flang.dejanseo.com.au/
Dirk
-
RE: Canonical Question: Root Domain Geo Redirects to SubFolder.
Be careful when redirecting based on ip - you have to make sure that Googlebot (accessing your site with a Californian ip) can access the non-US versions. If you have a link on each page to change the version to another country (and these pages are accessible without being redirected) you should be ok.
An alternative to IP-based redirection is to use your main domain for a country-select page and to store the selection in a cookie - so you can redirect to the chosen version on subsequent visits. Check volvocars.com as an example. The advantage of this method is that you give control to the user (I personally find it quite annoying when I'm redirected to the local version when I'm abroad and want to visit my "home" version).
rgds,
Dirk
-
RE: Google Analytics View Filters
A bit too long to answer this here - you can find a detailed explanation here:
http://www.verticalrail.com/kb/filter-in-google-analytics-to-track-subdomains/
The filters you will need:
- Subdomain only: shop\.domain\.com
- Only www: www\.domain\.com
- Both domains: no filter necessary - unless you have a 3rd subdomain you don't want in the reporting, in which case use (www|shop)\.domain\.com
Dirk
-
RE: Partial Match or RegEx in Search Console's URL Parameters Tool?
Don't forget that . & ? have a specific meaning within regex - if you want to match them literally you will have to escape them. Also be aware that not all bots are capable of interpreting wildcard patterns in robots.txt - you might want to be more explicit on the user agent and only use the patterns for Googlebot.
User-agent: Googlebot
# disallow page.php and any parameters after it
Disallow: /page.php
# but allow anything that starts with par1=ABC
Allow: /page.php?par1=ABC
Dirk
-
RE: Mozbot Can Not Crawl Entire Domain
It's caused by the way you have built your site. If you click on redken.com you get the choice of language. If you select "USA" you're redirected with a 302 to redken.com/USA - then with a 302 to redken.com/?country=USA - then with a 302 to redken.com. I guess for browsers you store this somewhere (a cookie?) - however a simple bot (like Moz's - I have the same with Screaming Frog) just goes back to where it started = redken.com, which again starts the same loop.
So - only 4 URLs can be crawled. The other countries are on different URLs so they will not be included in the crawl.
Googlebot is smarter and acts more like a real browser, so it will crawl the site - but Mozbot can't do that.
rgds
Dirk
Update - I actually forgot one redirect - redken.com first is redirected with 302 to redken.com/international
PS The site is horribly slow as well - and the redirect chain is certainly not helping.
-
RE: Canonical Tags & GWT Parameters
If Google for some reason chooses another url as the preferred version rather than the canonical I think you can assume that links to the duplicates are counted as links to the preferred version - no hard evidence to confirm this however.
If you check the best practices - "Be consistent: Try to keep your internal linking consistent. For example, don't link to http://www.example.com/page/ and http://www.example.com/page and http://www.example.com/page/index.htm."
So if possible - rather link to the canonical than the parameter version.
On duplicate content in general - there is an interesting article on Kissmetrics - https://blog.kissmetrics.com/myths-about-duplicate-content/
Dirk
-
RE: Why is Moz.com saying that none are linking to www.oneworldcetner.eu
Even when the index is updated - it's still no guarantee that your links are going to show up. The Moz index is huge - but still only 25% (or less) of the Google index.
Check https://moz.com/help/guides/research-tools/open-site-explorer -
"Just so you know, here's how we compile our index:
- We grab the most recent index.
- We take the top 10 billion URLs with the highest MozRank (with a fixed limit on some of the larger domains).
- We start crawling from the top down until we've crawled 65,000,000,000 pages (which is about 25% the amount in Google's index).
- Therefore, if the site is not linked to by one of these seed URLs (or one of the URLs linked to by them in the next update) then it won't show up in our index. Sorry! :("
Other tools may have different approaches - this is why it's a good idea to combine different sources to get a better idea of which links you gained (ahrefs, semrush, moz,...and so on)
Dirk
-
RE: How to increase Page Authority
Of course internal links would help. It would also help if you could enrich your content with data/content that Zillow is not providing. Videos could indeed be a good idea; floor plans, more info on the neighbourhood, schools, etc. would be great.
As most people are not searching for individual homes, but rather for homes in a certain neighbourhood, you could try to focus on enriching the content on these pages. If you check http://www.zillow.com/oxford-oh/ - it's not really rich in content - you could use your knowledge of this specific region to enrich your homepage. You already do this on your homepage - but it's very long, with few images, and it tries to tackle all kinds of questions; consider splitting this page into smaller chunks. Your regional pages are better - but they could still be enriched with more images, links to other useful sources (schools,...), statistics about the people who live there (average age, income,...). The videos you put up are very static - and the one I checked had no sound. On your homepage you mention that Esplanade Ridge is the preferred area for alumni and parents of students - however on the detail page you mention nothing about this, but rather give a dry, almost technical description of the area.
Dirk
-
RE: Canonical Tags & GWT Parameters
The problem with a canonical URL is that it's just a request to Google to index the canonical rather than the real URL - Google however is not obliged to do this (to quote Google:
"This (=canonical) indicates the preferred URL to use so that the search results will be more likely to show users that URL structure. (Note: We attempt to respect this, but cannot guarantee this in all cases.)")
Example: if all your internal links go to mysite.com/page?param=xyz with canonical mysite.com/page, Google will probably still rather index the real URL mysite.com/page?param=xyz than the canonical version.
If you want to be absolutely sure that the parameter version is not indexed you should redirect the parameter version to the non-parameter version with a 301, which is a (binding) directive that Google has to follow.
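If you wanted to go that route, a hedged .htaccess sketch (assuming Apache with mod_rewrite; "page" and "param" are the placeholder names from the example above):
RewriteEngine On
# 301 any request for /page that carries a param=... query string to the clean /page
RewriteCond %{QUERY_STRING} ^param=
RewriteRule ^page$ /page? [R=301,L]
The trailing ? on the substitution drops the query string, so the redirect target is the clean URL and the rule doesn't loop.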
You could use the parameter tool in Webmaster tools - but you run a risk that if you do it the wrong way Google will not index these pages at all. In any case - it will not solve your reporting issue in Analytics (as people coming from other sources with parameters will still be measured on the parameter url)
Dirk
-
RE: Canonical Tags & GWT Parameters
If you added the canonicals there is no need to configure parameters in search console.
The issue you have in Analytics is not the same - even if Google is respecting the canonicals, people are still visiting the pages with the parameters and these are tracked in Analytics. You can however tell Analytics to ignore the parameters and only measure the traffic on the "main" version of the page. A detailed how-to can be found here: http://blog.crazyegg.com/2013/03/29/remove-url-parameters-from-google-analytics-reports/
Dirk
-
RE: How to increase Page Authority
From the definition of Page Authority:
"Unlike other SEO metrics, Page Authority is difficult to influence directly. It is made up of an aggregate of metrics (MozRank,MozTrust, link profile, and more) that each have an impact on this score. This was done intentionally; this metric is meant to approximate how competitive a given site is in Google.com. Since Google takes a lot of factors into account, a metric that tries to calculate it must incorporate a lot of factors, as well."
In the case of Zillow - given the fact that this is an extremely strong domain - it's quite easy to guess where the strong Page Authority is coming from: lots of (internal) links from a very strong domain.
In your case, probably the best way to increase Page Authority is to increase the strength of your domain: getting useful links, working on user engagement, having great content, ... and so on - with the knowledge that it will be almost impossible to beat sites like Zillow (much like your local bookstore faces the almost impossible task of beating Amazon).
Dirk
-
RE: New non-www. web address but the domain is the same
No need to set up a new analytics - the old one will work just fine.
If both the www & non-www would be active at the same time and if you would still use the previous version of analytics (i.e. not Universal analytics) - you would need to make a modification to your tracking code - but as far as I understand this won't be the case.
Dirk
-
RE: Why is my site ranking so poorly compared to my competition?
Hi,
We all had to start somewhere. There is a lot of useful content on Moz (check the beginners guide for starters) & there is of course the Q&A. For the budget you mentioned you should be able to get a decent SEO company to help you out.
On the robots.txt - it is in the root of your website: www.carshippingcarriers.com/robots.txt. Your javascript is located in the /wp-includes/ folder - however in the 2nd line of your robots file you put
Disallow: /wp-includes/
I would take out this line.
The new design looks clean & the content is easier to read and closer to the top. You moved your form to the bottom; personally I would keep it at the top (there is sufficient space on the main image).
Page speed hasn't really improved mainly because of the images: the image http://www.carshippingcarriers.com/wp-content/uploads/2015/06/IMG_0528.jpg is high-res & much too big. Same for http://www.carshippingcarriers.com/wp-content/uploads/2015/10/IMG_0538.jpg & your main image. Resize & use a tool like https://compressor.io/ (free) to compress them even further.
Also check PageSpeed Insights: https://developers.google.com/speed/pagespeed/insights/?url=http%3A%2F%2Fwww.carshippingcarriers.com%2F&tab=desktop - the score is not too bad - but you could enable caching for static resources & minify your css/js/html files (the tool does this for you - you can download the optimised js/css at the bottom of the page).
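Enabling caching for static resources can be as simple as a few lines in .htaccess (a sketch, assuming Apache with mod_expires enabled; adjust the lifetimes to how often you change these files):
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType image/jpeg "access plus 1 month"
  ExpiresByType image/png "access plus 1 month"
  ExpiresByType text/css "access plus 1 week"
  ExpiresByType application/javascript "access plus 1 week"
</IfModule>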
From a usability perspective - the links on the green background are blue, which makes them difficult to read.
Hope this helps,
Dirk
-
RE: Google Page Speed
It surprises me that it would cost a lot of money. It can be costly if you want to get 100% score - but most of the time things like optimising your images, gzip & minify your content, caching ... shouldn't cost a fortune.
Don't forget to also check tools like Webpagetest.org - which are checking the actual load time.
These are complementary to PageSpeed Insights. As an example: if you serve 5 images of 1000KB each that are compressed and optimised, Google PageSpeed Insights will be quite happy - however, on Webpagetest.org you will see the impact of these heavy images on the load time.
As Matt is saying - speed is important - and will probably become more important in the future (increasing number of visits on mobile devices with slower network connections)
Dirk
-
RE: Tidied up site by getting rid of bad pages and now rankings tanked. - Please help
In addition to Matt's reply - you state that you redirected the doorway pages like site.co.uk/cleaning-equipment-Manchester - however if I try this with the URL you provided it returns a 404 - rather than redirecting to branches/manchester-tool-hire-shop or /cleaning-equipment
Dirk
-
RE: 301's - Do we keep the old sitemap to assist google with this ?
You shouldn't keep the old sitemap. If the pages are in the index, Google will figure it out the next time the bot visits the site. Make sure that you update all the internal links (avoid internal redirects) - Screaming Frog will work miracles here.
If you would keep the old one you will get warnings like this:
"When we tested a sample of URLs from your Sitemap, we found that some URLs redirect to other locations. We recommend that your Sitemap contain URLs that point to the final destination (the redirect target) instead of redirecting to another URL."
rgds,
Dirk
-
RE: Why is my site ranking so poorly compared to my competition?
4. You block javascript with your robots.txt - you shouldn't do that (http://googlewebmastercentral.blogspot.be/2014/10/updating-our-technical-webmaster.html)
-
RE: Why is my site ranking so poorly compared to my competition?
Hi,
Did a quick check - some remarks.
1. Homepage - very heavy to load (http://www.webpagetest.org/result/151007_D6_180W/ ) - important text at the bottom and difficult to read due to the background image - part of the text/links is hidden behind images which isn't exactly what Google likes.
2. A lot of the content on your site is about new cars - not sure if this is the best strategy to follow. You will never be best in class for this. The links inside this part to your "main" content look a bit artificial. I would rather build content around shipping cars (what is the most expensive car you ever shipped, the strangest car, remarkable stories, ...etc) - provide content about the shipping process (the different steps, illustrate how you take care of the cars during shipment, ...etc) which is much more related to your core business. Check what the main concerns of your customers are and build content around this. Use tools like Semrush to check the keywords that are generating the traffic for your competitors and build content around them as well.
3. Your competitor's site might be ugly and quite light in content - but it loads much faster and has all the content that counts visible upfront. He has about 1200 follow links to his site - you have about 100 - so you might want to work on some link building (you will find plenty of resources on this topic here on Moz). His links seem to be quite (over) optimised - so it's possible he's buying them.
Hope this helps,
Dirk
-
RE: Why is my site ranking so poorly compared to my competition?
For questions like this it's always useful to publish the url - without the url you can only get very generic advice.
Dirk
-
RE: Download all GSC crawl errors: Possible today?
The script worked for the previous version of the API - it won't work on the current version.
You could try to search whether somebody else has created the same thing for the new API - or build something yourself - the API is quite well documented so it shouldn't be too difficult to do. I built a Python script for the Search Analytics part in less than a day (without previous knowledge of Python) so it's certainly feasible.
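Just as an illustration of how little code is involved, a rough Python sketch of a Search Analytics call (not the crawl-errors endpoint; it assumes the google-api-python-client library and an already-authorised credentials object called creds):
from googleapiclient.discovery import build

# creds: an authorised OAuth credentials object, set up beforehand
service = build('webmasters', 'v3', credentials=creds)

# clicks per query for one month
response = service.searchanalytics().query(
    siteUrl='https://www.example.com/',
    body={'startDate': '2015-09-01',
          'endDate': '2015-09-30',
          'dimensions': ['query']}).execute()

for row in response.get('rows', []):
    print(row['keys'][0], row['clicks'])
rgds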
Dirk
-
RE: Excellent performance in BING, terrible performance in GOOGLE
I checked your website and nowhere does it state that you are associated with taobao.com - so I assume you are just "hijacking" the Chinese brand name for your own benefit. On top of that - you "steal" the content from the original site and auto-translate it to Dutch.
To be very honest - I don't see a reason why Google would start promoting your site. When I search for Taobao on Google I get the .com version in Chinese rather than the Dutch version (I am searching with Google.be / the Dutch version).
Suppose that a German would consider it a good idea to copy Bol.com to Bol.de and copy the complete offer and auto translate it to German - I don't think Bol would be very happy with that and I have strong doubts that Google would index and promote this site. What you are doing is not that different (even if you do it with the best intentions)
Dirk
-
RE: Redirect /label/ to /tags/
Hi,
For questions about redirect Google & Stackoverflow are your best friends:
http://stackoverflow.com/questions/18998608/redirect-folder-to-another-with-htaccess
Put this code in your .htaccess (if you already have rewrite rules you just have to add the RewriteRule line before or after the existing ones and not the rest of the code - difficult to say if it needs to be before or after - it depends on the rules that you already have):
Options +FollowSymLinks -MultiViews
# Turn mod_rewrite on
RewriteEngine On
RewriteBase /
RewriteRule ^label/(.*)$ /tags/$1 [L,NC,R=301]
rgds,
Dirk
-
RE: Does Universal Analytics auto generate events?
Universal Analytics does not track events by default (not even in Drupal) - it must be something that has been set up in the Analytics plugin by your developer.
I made a mistake in my first answer - you should look for "analytics" in the source (and not "events") - you will notice a bit of code stating:
"googleanalytics":{"trackOutbound":1,"trackMailto":1,"trackDownload":1,"trackDownloadExtensions":"7z|aac|arc|arj|asf|asx|avi|bin|csv|doc(x|m)?|dot(x|m)?|exe|flv|gif|gz|gzip|hqx|jar|jpe?g|js|mp(2|3|4|e?g)|mov(ie)?|msi|msp|pdf|phps|png|ppt(x|m)?|pot(x|m)?|pps(x|m)?|ppam|sld(x|m)?|thmx|qtm?|ra(m|r)?|sea|sit|tar|tgz|torrent|txt|wav|wma|wmv|wpd|xls(x|m|b)?|xlt(x|m)|xlam|xml|z|zip","trackDomainMode":"1","trackUrlFragments":1},"field_group":{"fieldset":"full"},"scheduler_settings":{"scheduler_local_storage":1,"ttl":
=> this bit of code is coming from the module and is tracking all downloads of the extensions listed (7z,aac, arc....etc)
Hope this clarifies,
Dirk
-
RE: Would it be a valid "link building' strategy to pay youtube video owners, to link to our company website in the decription of a certain video. ( For popular video's that are relevant )
If you see these links as generators of traffic or as a way to increase awareness, it could be a valid strategy (check this article: https://moz.com/blog/the-hidden-power-of-nofollow-links) - however if you need these links from a "pure" SEO perspective to increase your PageRank you can forget about it - the links on YouTube are nofollow.
(apart from that - Google doesn't really approve paid links...)
rgds,
Dirk
-
RE: How do I find out where my "direct traffic" came from?
Maybe your site was mentioned in the Apple News app? Check this article: http://www.i4u.com/2015/09/95161/apple-news-app-referral-traffic-spikes - apparently no referral info is passed when traffic is coming from Apple News.
rgds
Dirk
-
RE: Keyword Difficulty Search Results: Can't tell which search engine
You can't see it directly - however if you click on the detail for each keyword you can see the top 10 URLs it is based on. Normally, based on the page titles, you can see which language has been used (the French version will return French page titles, the German version German titles, ...etc). I agree that it would be useful to see this in the overview report as well.
I guess that's one of the disadvantages of tools being developed in countries with one dominant language.
rgds,
Dirk
-
RE: Does Universal Analytics auto generate events?
Hi,
I think that you are using the Google Analytics plugin for Drupal - which seems to track events by default (search "event" in the source code).
You might want to correct the pdf download - it's currently generating a 404
Dirk
-
RE: Multilingual SEO subdirectories structure
Why don't you use domain.com to serve a "choose language" page - store the choice in a cookie and, for subsequent visits, redirect to the chosen language. Example: http://www.volvocars.com. This is a pretty standard approach in Belgium to serve both Dutch and French content on the same domain.
rgds,
Dirk
-
RE: Canonical issues using Screaming Frog and other tools?
Hi,
The difference between them:
- canonical: the URL has a canonical URL - which can be self-referencing (canonical URL = URL) or not
- canonicalised: the URL has a canonical URL which is not self-referencing (canonical URL <> URL)
- no canonical: quite obvious - the URL has no canonical.
Potential issues could be: URLs that you would like to have a canonical don't have one, or URLs that are canonicalised don't have the right canonical URL. You can use the lists (both canonicalised & no canonical) from Screaming Frog to check them - but it's up to you to judge whether the canonical is OK or not (no automated tool can guess what your intentions are).
Typical mistakes with canonicals: all URLs have the same canonical URL (like the homepage), or have canonical URLs that do not exist. You could also check this with Screaming Frog using the setting "respect canonicals" - this way only the canonical URLs will be shown in the listing. Also keep in mind that canonical URLs are merely a friendly request to Google to index the canonical rather than the normal URL - it's not an obligation for Google to do this (check https://support.google.com/webmasters/answer/139066?hl=en quote: "the search results will be more likely to show users that URL structure. (Note: We attempt to respect this, but cannot guarantee this in all cases.)").
Dirk
-
RE: Why google removed my landing pages from index?
At first sight there seems to be nothing obviously wrong with your site (I crawled it with Screaming Frog and it looks OK). Site:meko.lv gives 85 results which looks OK (even a bit too much if you look at page 5 & onward).
The only strange thing is your robots.txt - User-agent: googlebot / Disallow: /*? - I imagine you added this line to block pages like https://meko.lv/index.php?option=com_ajax&format=json which are still in the index. Did you try a Fetch as Google on your sitemap URL to be sure that this rule isn't blocking the bot from reading its contents? (Especially because your sitemap has a particular URL, http://meko.lv/index.php?option=com_osmap&view=xml&tmpl=component&id=1, with a question mark inside - you might want to rename this to a more standard meko.lv/sitemap.xml.)
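If you want to keep the Disallow: /*? rule, a hedged tweak like this should still let Googlebot fetch the sitemap URL (the longer, more specific Allow rule wins for Googlebot):
User-agent: Googlebot
Allow: /index.php?option=com_osmap
Disallow: /*?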
Hope this helps,
Dirk
-
RE: Thin Content due to Photo Galleries
Hi Matthias,
I agree that the content is pretty thin and that it would probably be better to present the images in a slider (check the example from Autobild: http://www.autobild.de/bilder/mazda-mx-5-gegen-bw-z4-6937517.html#bild23). While the presentation is quite similar to yours, the source contains all the captions & all the images, making the content much richer.
From a usability perspective: each image requires the page to reload completely which is not really great.
I imagine that changing the images from separate URLs to a slider can be an enormous amount of work. Having thin content / semi-duplicate content on your site is not necessarily a cause for punishment (unless there is clear malicious intent) - the issue is mainly that these thin pages will not show up in search results. If you are not optimising for image search (which I assume, based on the captions you put under the pictures) you could just as well leave them as is (your normal articles look OK at first sight, so you have more than just thin content pages).
If you would optimise for images, you should make your captions a little bit more descriptive & longer and you definitely need to change your alt titles (they look too much like keyword stuffing) - you might check this WBF - it's old, but not much has changed in Image Search since then (well - at least in Germany, as you are still using the "old" type of image search).
rgds,
Dirk
-
RE: Only two pages crawled
Without the actual site it's difficult to say. Did you check your robots.txt? Do you block robots in your headers? Is your site entirely made in Flash or Ajax?
If you send the site in a PM I could have a look.
Dirk
-
RE: Optimize code
You should minify / uglify your code - because it speeds up your site (https://developers.google.com/speed/docs/insights/MinifyResources).
Don't worry about readability/organisation of the code - that's for humans. Browsers & bots interpret the code - and as long as it is (more or less) valid HTML it will be rendered/crawled without problems.
Dirk
-
RE: New website server code errors
Hi
As this is already a post which is marked as answered, it's better to ask a new question and to reference this one (else nobody will notice it). I would however strongly advise asking this question as well on a more technical forum like stackoverflow.com.
Dirk
-
RE: New Software Requires us to redirect a sub domain to another IP Address.
More technical info on how to serve a folder from a different IP can be found here: http://stackoverflow.com/questions/2405845/redirect-folder-to-different-server