The importance of fixing technical errors
-
Hello everyone!
I have a question that I know has been asked many times. However, I am looking for ideas for my specific situation.
I own a website about commercial steel. My main focus has been getting incoming links from important companies and sites, while maintaining a good quality site.
I've been struggling with rankings and Page Authority. I've never paid attention to technical errors such as duplicate content, 4XX errors, and critical warnings such as redirects. I have around 70 errors and around 400 warnings. Someone told me that as long as the website is "user friendly" I shouldn't worry about them.
I have scarce resources for my SEO efforts. Which aspect should I put more effort into: link building and quality content, or technical SEO? Is there a recommended balance between them for better PA, DA, and overall quality?
I know it's difficult, but it would be extremely helpful to hear from you!
Regards.
-
Thanks for the excellent response!
Exactly what I wanted to hear!
Regards!
-
Hi Jesus,
This is an interesting article about how fixing duplicate content increased a website's indexation, in turn increasing its traffic by 150%.
I would definitely attack the Crawl Errors found ASAP. Duplicate page content, 4XX errors, and missing or duplicate title tags can definitely mess with rankings.
Once you have your errors under control, I would work on the warnings whenever you have time or are bored.
Long story short... the errors can definitely have a big impact on how you are ranking, while the warnings are just a "heads-up".
Hope this helps.
Good luck.
Mike
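Mike's errors-first triage can be sketched in a few lines of Python. This is only an illustration of the prioritization, not Moz's actual crawl-report format; the issue labels and report rows below are made up:

```python
# Hypothetical triage of a site-crawl report: fix errors first, park warnings.
# Issue names are illustrative, not Moz's exact export labels.

ERROR_ISSUES = {"duplicate_content", "4xx_client_error", "5xx_server_error",
                "missing_title", "duplicate_title"}
WARNING_ISSUES = {"temporary_redirect", "long_title", "missing_meta_description"}

def triage(issues):
    """Split (url, issue) pairs into errors to fix ASAP and heads-up warnings."""
    errors, warnings = [], []
    for url, issue in issues:
        if issue in ERROR_ISSUES:
            errors.append((url, issue))
        elif issue in WARNING_ISSUES:
            warnings.append((url, issue))
    return errors, warnings

report = [
    ("/steel-beams", "duplicate_content"),
    ("/old-page", "4xx_client_error"),
    ("/promo", "temporary_redirect"),
]
errors, warnings = triage(report)
print(len(errors), len(warnings))  # 2 errors to tackle first, 1 warning for later
```

With a list like `errors` in hand, you can work through the 70 errors in batches before ever touching the 400 warnings.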
Related Questions
-
How do I fix a fatal error message?
I am trying to remove a robots.txt file I put in my root domain a while back, because I didn't know what I was doing. Every time I enter my domain (domain.com/robots.txt) I get a fatal error message. How do I fix this fatal error message?
Technical SEO | icebergsal0
-
Error after scanning with browseo.net
Good day! I ran a scan on my site with browseo.net (and a few other similar scanners) and got the mess seen in the screenshot. I've tried deleting all the files in the website folder and replacing them with a single image file, but it still shows the same error. What could this mean, and should I be worried?

P.S. I found my answer after contacting the helpful support of browseo.net:

It took me some time to figure out what was going on, but it seems as if you are mixing content types. Browsers are quite smart when it comes to interpreting content, so they are much more forgiving than we are. Browseo crawls your website and detects that you are setting utf-8 as part of the meta information. By doing so, it converts the content into a different character encoding than what it is supposed to be.

In a quick test, I tried to fetch the content type based on the response object, but without any success. So I suspect that in reality your content is not utf-8 encoded when you parse it into Joomla. The wrong character encoding is then carried over for the body (which explains why we can still read the header information). All of this explains the error.

In order for it to work in browseo, you'd have to set the content type correctly, or convert your own content to utf-8 before parsing. It may be that you are storing it incorrectly in the database (check your db settings for a content type other than utf-8) or that other settings are a bit messed up. The good news is that Google is probably interpreting your website correctly, so you won't be punished for this, but it is perhaps something to look into...

From Paul Piper
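The mismatch Paul describes, bytes stored in one encoding while the page declares utf-8, is easy to reproduce in Python. The sample string below is just an illustration:

```python
# A page declares charset=utf-8, but the body bytes were actually stored
# as Latin-1 (ISO-8859-1), e.g. in a misconfigured database column.
body = "qualité".encode("latin-1")  # what is really on disk: b"qualit\xe9"

declared = body.decode("utf-8", errors="replace")  # what a utf-8 parser sees
correct = body.decode("latin-1")                   # what was intended

print(declared)  # garbled: a replacement character where the é should be
print(correct)   # "qualité"
```

This is why the fix is either to declare the encoding the content is actually stored in, or (better) to convert everything to utf-8 at the source.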
Technical SEO | AlexElks0
-
Expired domain 404 crawl error
I recently purchased an expired domain at auction, and after I started my new site on it, I am noticing 500+ "not found" errors in Google Webmaster Tools, generated from the previous owner's content. Should I use a redirection plugin to redirect those nonexistent posts to new posts on my site, should I use a 301 redirect, or should I leave them as they are without taking further action? Please advise.
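One common way to handle an expired domain's leftover URLs is to 301-redirect the ones that map naturally onto new content and return 410 Gone for the rest, so crawlers drop them faster than with a plain 404. A minimal Python sketch (the paths and map are hypothetical):

```python
# Hypothetical handling of URLs inherited from a domain's previous owner:
# 301 where a sensible new target exists, 410 Gone otherwise.

REDIRECT_MAP = {
    "/old-blog/steel-prices": "/articles/steel-prices",
    "/old-about": "/about",
}

def handle_legacy_url(path):
    """Return (status_code, location) for a URL left over from the old site."""
    if path in REDIRECT_MAP:
        return 301, REDIRECT_MAP[path]  # permanent redirect to the equivalent page
    return 410, None                    # gone for good; no redirect target exists

print(handle_legacy_url("/old-about"))         # (301, '/about')
print(handle_legacy_url("/old-blog/contest"))  # (410, None)
```

Blanket-redirecting all 500+ old URLs to the homepage is generally discouraged; redirect only where there is a genuinely equivalent page.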
Technical SEO | Taswirh1
-
.htaccess and error 404
Hi, please allow me to contact the community again, because you give good, quick answers! Yesterday I lost the .htaccess file on my server. Right now only the home page is working, and the other pages give me this message:

Not Found
The requested URL /freshadmin/user/login/ was not found on this server

Could you help me, please? Thanks
Technical SEO | Probikeshop0
-
4XX (Client Error)
How much will 5 of these errors hurt the search engine ranking of the site itself (i.e., the domain) if 5 pages have this error?
Technical SEO | bobbabuoy0
-
Technical SEO question re: Java
Hi, I have an SEO question that came my way, but it's a bit too technical for me to handle. Our entire ecommerce site is in Java, which apparently writes to the page after it has loaded and is not SEO-friendly. I was presented with a workaround that would basically consist of pre-rendering an HTML page for search engines and leaving the Java page for the customer. It sounds like Google's definition of "cloaking" to me, but I wanted to know if anyone has any other ideas or workarounds (if there are any) for how we can make the Java-based site more SEO-friendly. Any thoughts/comments you have would be much appreciated. Thanks!
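The workaround described above is often called "dynamic rendering": serving a pre-rendered HTML snapshot to known crawlers and the script-driven page to everyone else. The crawler-detection step might look like the Python sketch below; the user-agent substrings are illustrative, and the served content must be equivalent for bots and users, otherwise this does cross into cloaking:

```python
# Sketch of the crawler-detection side of dynamic rendering.
# Serving DIFFERENT content to bots vs. users would be cloaking;
# the pre-rendered snapshot must match what users see.

CRAWLER_SIGNATURES = ("googlebot", "bingbot", "duckduckbot")  # illustrative list

def wants_prerendered(user_agent):
    """True when the request looks like it comes from a search-engine crawler."""
    ua = user_agent.lower()
    return any(sig in ua for sig in CRAWLER_SIGNATURES)

print(wants_prerendered("Mozilla/5.0 (compatible; Googlebot/2.1)"))   # True
print(wants_prerendered("Mozilla/5.0 (Windows NT 10.0) Chrome/120"))  # False
```

In production, user-agent matching is usually paired with reverse-DNS verification of the crawler's IP, since user-agent strings are trivially spoofed.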
Technical SEO | Improvements0
-
404 errors and what to do
Hi, I am fairly new to the whole SEO thing and still get confused about what to do to sort things out. I've checked the help pages, but I cannot seem to find the issue. I've just signed up, so my site was crawled for the first time, and it came up with more than 1,000 404 errors. I checked a couple of the links in the report I downloaded, and they do indeed show 404 errors, but when I check the pages everything seems to work fine. I did find one issue where an image, if clicked on twice, was pointing to a URL with 'title= at the end. I have tried to get rid of that but couldn't find anything wrong. I'm a bit lost as to where to start!
Technical SEO | junglefrog0
-
Should I worry about errors MozBot finds that aren't on my sitemap?
MozBot crawled and found a couple of errors that aren't included in my sitemap plugin's output, such as duplicate page content on author pages. Should I worry about things not on my sitemap?
Technical SEO | 10JQKAs0