The importance of correcting technical errors
-
Hello everyone!
I have a question that I know has been asked many times, but I'm looking for advice on my specific situation.
I own a website about commercial steel. My main focus has been earning inbound links from important companies and sites while maintaining a good-quality site.
I've been struggling with rankings and Page Authority. I've never paid attention to technical errors such as duplicate content and 4XX errors, or to critical warnings such as redirects. I have around 70 errors and around 400 warnings. Someone told me that as long as the website is "user friendly," I shouldn't worry about them.
I have scarce resources for my SEO efforts. Which aspect should I put more effort into: link building and quality content, or technical SEO? Is there a recommended balance between the two for better PA, DA, and overall quality?
I know it's a difficult question, but it would be extremely helpful to hear from you!
Regards.
-
Thanks for the excellent response!
Exactly what I wanted to hear!
Regards!
-
Hi Jesus,
There's an interesting article about how fixing duplicate content increased a website's indexation, in turn increasing its traffic by 150%.
I would definitely attack the crawl errors ASAP. Duplicate page content, 4XX errors, and missing or duplicate title tags can definitely hurt your rankings.
Once you have your errors under control, work on the warnings whenever you have time or are bored.
Long story short: the errors can have a big impact on how you rank, while the warnings are just a "heads-up."
Hope this helps.
Good luck.
Mike
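If you want a quick way to act on that triage yourself, here is a minimal sketch of sorting a crawl export into errors versus warnings. The `(url, status, title)` tuple format is an assumption; adapt it to whatever your crawler actually exports.

```python
from collections import Counter

def triage_crawl(pages):
    """Split crawl results into errors (fix first) and warnings.

    `pages` is a list of (url, status_code, title) tuples. 4XX/5XX
    responses and missing or duplicate title tags count as errors;
    3XX redirects count as warnings.
    """
    title_counts = Counter(t for _, _, t in pages if t)
    errors, warnings = [], []
    for url, status, title in pages:
        if status >= 400:
            errors.append((url, f"HTTP {status}"))
        elif not title:
            errors.append((url, "missing title"))
        elif title_counts[title] > 1:
            errors.append((url, "duplicate title"))
        elif 300 <= status < 400:
            warnings.append((url, f"redirect {status}"))
    return errors, warnings

errors, warnings = triage_crawl([
    ("/about", 200, "About us"),
    ("/old-page", 404, ""),
])
print(errors)  # [('/old-page', 'HTTP 404')]
```

Work through everything in the errors list first; the warnings list is the "heads-up" pile.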
Related Questions
-
Have I constructed my robots.txt file correctly for sitemap autodiscovery?
Hi, here is my robots.txt:

User-agent: *
Sitemap: http://www.bedsite.co.uk/sitemaps/sitemap.xml

# Directories
Disallow: /sendfriend/
Disallow: /catalog/product_compare/
Disallow: /media/catalog/product/cache/
Disallow: /checkout/
Disallow: /categories/
Disallow: /blog/index.php/
Disallow: /catalogsearch/result/index/
Disallow: /links.html

I'm using Magento and want to make sure I have constructed my robots.txt file correctly for sitemap autodiscovery. Thanks.
Technical SEO | Bedsite
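One way to sanity-check a file like this before deploying it: Python's built-in robots.txt parser will report which Sitemap URLs it discovers and which paths the rules block. This is a rough check only; it does not model every crawler's matching quirks.

```python
from urllib.robotparser import RobotFileParser

# The robots.txt content from the question, abbreviated.
robots_txt = """\
User-agent: *
Sitemap: http://www.bedsite.co.uk/sitemaps/sitemap.xml
Disallow: /sendfriend/
Disallow: /checkout/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# Sitemap URLs a crawler would autodiscover (Python 3.8+).
print(rp.site_maps())
# Whether a given URL is crawlable for all user agents.
print(rp.can_fetch("*", "http://www.bedsite.co.uk/checkout/"))     # False
print(rp.can_fetch("*", "http://www.bedsite.co.uk/some-product"))  # True
```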
Am I using 301 correctly?
Hello, I have a 'free download' type site offering free graphics for designers. To prevent hotlinking, we authenticate the downloads and use a 301 redirect. So, for example, the download URL looks like this when someone clicks the download button: http://www.website.com/resources/243-name-of-the-file/download/dc37 and then we 301 that URL back to: http://www.website.com/category-name/243-name-of-the-file Is a 301 the correct way to do that?
Technical SEO | shawn81
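For what it's worth, you can see exactly what such a redirect looks like on the wire with a tiny local server. The paths below are the ones from the question; the server is a stand-in for illustration, not the real site.

```python
import http.client
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

# Download URL -> canonical file page, as described in the question.
REDIRECTS = {
    "/resources/243-name-of-the-file/download/dc37":
        "/category-name/243-name-of-the-file",
}

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        target = REDIRECTS.get(self.path)
        if target:
            self.send_response(301)  # permanent redirect
            self.send_header("Location", target)
        else:
            self.send_response(404)
        self.end_headers()

    def log_message(self, *args):  # keep the demo quiet
        pass

server = HTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# http.client does not follow redirects, so we see the raw 301.
conn = http.client.HTTPConnection("127.0.0.1", server.server_address[1])
conn.request("GET", "/resources/243-name-of-the-file/download/dc37")
resp = conn.getresponse()
print(resp.status, resp.getheader("Location"))
server.shutdown()
```

Whether 301 is "correct" depends on intent: a 301 tells search engines the download URL is permanently superseded by the file page, which is usually what you want here, whereas a 302 would keep the download URL itself in the index.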
Client error 404 pages!
I have a number of 404 pages coming up, left over in Google from the client's previous site. How do I get them out of Google, please?
Technical SEO | PeterC-B
Which are the best website reporting tools for speed and errors?
Hi, I have just made some changes to my site by adding some redirects, and I want to know what effect this has had on my site, www.in2town.co.uk. I have been using tools such as Pingdom Tools and http://gtmetrix.com, but these always give me different time reports, so I do not know the true speed of my site, and they give me different advice. So I want to know how to check the true speed of my site in the UK, and how to check for errors to make it better. Any advice would be great on which tools to use.
Technical SEO | ClaireH-184886
How to avoid 404 errors when taking a page down?
So... we are running a blog that was supposed to have great content. After working on SEO for a while, I discovered there is too much keyword stuffing, along with some junk WordPress SEO tactics that were supposed to make it rank better. That worked, but I don't want to risk getting slapped by the Google panda. So we decided to restart our blog from zero and make a better attempt. Every page was already ranking in Google. SEOmoz hasn't crawled it yet, but I'm sure the crawler would report a lot of 404 errors. My question is: can I avoid these errors with some tool in Google Webmaster Tools (via sitemaps), or should I set up rel=canonical tags or 301 redirects? Does Google penalize me for this? It seems obvious to me that the answer is yes. Please help 😉
Technical SEO | ivan.precisodisso
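The usual pattern for retiring posts without leaving a pile of 404s: 301-redirect any URL that has a close replacement, and return 410 Gone for the rest, which explicitly signals that the removal is intentional. A minimal sketch of that decision, with hypothetical URLs (in practice the same mapping would live in your .htaccess or a WordPress redirect plugin):

```python
# Hypothetical map of retired blog URLs to their closest replacements.
REDIRECT_MAP = {
    "/old-keyword-stuffed-post/": "/new-quality-post/",
    "/old-category-roundup/": "/blog/",
}

def response_for(path):
    """Decide what a retired URL should return.

    Pages with a close replacement get a 301, which passes most of the
    old page's equity to the new URL; pages with no replacement get a
    410 Gone, an explicit signal that the removal is deliberate.
    """
    if path in REDIRECT_MAP:
        return 301, REDIRECT_MAP[path]
    return 410, None

print(response_for("/old-keyword-stuffed-post/"))  # (301, '/new-quality-post/')
print(response_for("/some-deleted-post/"))         # (410, None)
```

For what it's worth, Google has said that plain 404s are normal and not a penalty in themselves; the redirects are about preserving whatever equity the old pages had earned.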
Lots of overdynamic URL and crawl errors..
Just wanted some advice. The SEOmoz crawl found about 18,000 errors. The error URLs are mainly ones like the example below, which appear to be the registration URL with a redirect parameter that sends the user back to the product after registration:
http://www.DOMAIN.com/index.php?_g=co&_a=reg&redir=/index.php?_a=viewProd%26productId=3465
We have the following line in the robots file to stop the login page from being crawled:
Disallow: /index.php?act=login
If I add the following, will it stop the error?
Disallow: /index.php?act=reg
Thanks in advance.
Technical SEO | filarinskis
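One thing worth checking before relying on that rule: the example error URL uses `_a=reg`, while the proposed rule disallows `act=reg`, and robots.txt Disallow rules are prefix matches. You can test a candidate rule against a sample URL with Python's built-in robotparser (a rough check; Googlebot's matcher additionally supports wildcards like `*`):

```python
from urllib.robotparser import RobotFileParser

def blocked(rule, url):
    """Return True if the given Disallow rule would block the URL for all agents."""
    rp = RobotFileParser()
    rp.parse(["User-agent: *", f"Disallow: {rule}"])
    return not rp.can_fetch("*", url)

error_url = ("http://www.DOMAIN.com/index.php?_g=co&_a=reg"
             "&redir=/index.php?_a=viewProd%26productId=3465")

# The proposed rule does not match the example URL's query string...
print(blocked("/index.php?act=reg", error_url))
# ...while a prefix taken from the actual error URLs does.
print(blocked("/index.php?_g=co&_a=reg", error_url))
```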
Fixing Missing MetaTag Errors
Hey all, I just had a crawl test done on my site (built with WordPress), and I received a ton of missing meta description errors to fix. The odd thing is that I use the "All in One" SEO tool, and the actual pages and posts on the site do have meta descriptions. However, I noticed that an RSS feed is automatically generated for every post, and it's these feed URLs that are missing the meta descriptions. Most of the errors show "Comments on" with /feed at the end of the URL. I am totally clueless about how to resolve these errors, as I haven't installed any WP plugins that generate feeds automatically. Has anyone encountered this problem before, or does anyone know how to fix it? The site URL is http:// GovernmentGrantsAustralia . org (I have left spaces above to avoid being a link dropper 🙂). Would really appreciate it if anyone can help! FYI: I just found this link after digging through the Q&A history; I tried it, but I'm not sure if it has worked, as I still see the errors on my SEOmoz report:
http://www.seomoz.org/qa/view/41413/wordpress-missing-meta-description-tag-comments
Hope someone can help me figure this one out! Thanks, Justin
Technical SEO | justin99
Issue with 'Crawl Errors' in Webmaster Tools
Have an issue with a large number of 'Not Found' webpages being listed in Webmaster Tools. In the 'Detected' column, the dates are recent (May 1st-15th). However, clicking into the 'Linked From' column, all of the link sources are old, many from 2009-10. Furthermore, I have checked a large number of the source pages to double-check that the links don't still exist, and, as I expected, they don't. Firstly, I am concerned that Google thinks there is a vast number of broken links on this site when in fact there is not. Secondly, if the errors do not actually exist (and never actually have), why do they remain listed in Webmaster Tools, which claims they were found again this month?! Thirdly, what's the best and quickest way of getting rid of these errors? Google advises that using the 'URL Removal Tool' will only remove the pages from the Google index, NOT from the crawl errors. The word is that if a URL keeps returning a 404, it will eventually be removed automatically. Well, I don't know how many times Google needs to see that 404 in order to drop a URL and link that haven't existed for 18-24 months! Thanks.
Technical SEO | RiceMedia