Importance of correcting technical errors
-
Hello everyone!
I have a question that I know has been asked many times, but I'm looking for advice on my specific situation.
I own a website about commercial steel. My main focus has been earning inbound links from important companies and sites while maintaining a good-quality site.
I've been struggling with rankings and Page Authority. I've never paid attention to technical errors such as duplicate content, 4XX errors, and critical warnings such as redirects; I have around 70 errors and around 400 warnings. Someone told me that as long as the website is "user friendly" I shouldn't worry about them.
I have scarce resources for my SEO efforts. Which aspect should I put more effort into: link building and quality content, or technical SEO? Is there a recommended balance between the two for better PA, DA, and overall quality?
I know it's a difficult call, but it would be extremely helpful to hear from you!
Regards.
-
Thanks for the excellent response!
Exactly what I wanted to hear!
Regards!
-
Hi Jesus,
This is an interesting article about how fixing duplicate content increased a website's indexation, in turn increasing its traffic by 150%.
I would definitely attack the crawl errors ASAP. Duplicate page content, 4XX errors, and missing or duplicate title tags can definitely mess with rankings.
Once you have your errors under control, work on the warnings whenever you have time or are bored.
Long story short: the errors can have a big impact on how you rank, while the warnings are just a "heads-up".
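For the 4XX errors in particular, the usual fix is a 301 redirect from each dead URL to its closest live replacement. A minimal .htaccess sketch — the paths here are made up, substitute your own:

```apache
# Hypothetical paths - map each dead URL to its closest live equivalent
Redirect 301 /old-steel-catalog.html /products/steel-catalog
Redirect 301 /2012-price-list /pricing
```

If a dead URL has no sensible replacement, just let it 404 rather than redirecting everything to the homepage.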
Hope this helps.
Good luck.
Mike
Related Questions
-
My translated pages are categorized as subpages of the originals / Importance of hreflang tags
Hi there, we have a website that is originally in German but has an English translation of all pages.
Technical SEO | Jess_Smunch
I recently created a crawl map for it, which showed that all our translated pages are indexed as subpages of the German originals. I wonder if this is normal, or if it will have a negative impact on our SEO. If they are subpages, will Google still index and rank them with the same importance as the originals?
If not, what can I do to make them standalone pages rather than subpages? We also have a few issues with hreflang tags that we cannot fix easily, as our CMS does not give us a flexible option for editing our code. I wonder how much impact hreflang tags have on our ranking and whether we can just disregard these issues. We use HubSpot as our CMS, if that matters. Thanks for your feedback!
-
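For reference, a minimal hreflang pair for a German original and its English translation looks like the sketch below — the URLs are placeholders, not the actual site, and both language versions need the same set of tags in their <head>:

```html
<!-- Place in the <head> of BOTH language versions; URLs are hypothetical -->
<link rel="alternate" hreflang="de" href="https://www.example.com/de/seite/" />
<link rel="alternate" hreflang="en" href="https://www.example.com/en/page/" />
<link rel="alternate" hreflang="x-default" href="https://www.example.com/" />
```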
Critical crawler errors: 4XX
Hey fam, I ran the Critical Crawler Issues report and it found 9 pages with critical crawler issues. I'm running a WordPress site and looked in the dashboard under Pages and Posts, but the links aren't in the dashboard. Can you help me fix them? Thank you!
Technical SEO | Myflgreen
-
What should I do with URLs that cause sitemap errors?
Hi Mozzers, I have a client who uses an important customer database and offers gift cards via https://clients.mindbodyonline.com, which is linked from the navigation. This causes sitemap errors whenever the sitemap is submitted, since the domain is different. Should I ask to remove those links from the navigation? If so, where can I relocate them? If not, what should I do to get a sitemap without any errors? Thanks!
Technical SEO | Ideas-Money-Art
-
During my last crawl, suddenly no errors or warnings were found except one: a 403 error on my homepage.
No changes were made, and all my old errors disappeared; I think something went wrong. Is it possible to start another crawl earlier than scheduled?
Technical SEO | KnowHowww
-
How do I correct a Google canonical issue?
So when I initially launched my website, I didn't set my canonical tags properly and all my pages got crawled. Now, looking at the search engine results, I see a number of the pages that were meant to be canonicalized to the correct page showing up in the results. What is the best way to correct this issue with Google? I also noticed that while I was initially ranking well for the main pages, those results have disappeared entirely, and deeper in the rankings I am finding the pages that were meant to be canonicalized. Please help.
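As a general sketch (not specific to this site): the fix is usually to get a correct canonical tag into the <head> of each duplicate or variant page and then let Google recrawl; the stray pages drop out of the results over time. The URL below is a hypothetical placeholder:

```html
<!-- On every duplicate/variant page; href is a hypothetical placeholder -->
<link rel="canonical" href="https://www.example.com/preferred-page/" />
```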
Technical SEO | jackaveli
-
500 Server Error on RSS Feed
Hi there, I am getting multiple 500 errors on my RSS feed. Here is the error:
Title: 500 : Error
Meta Description: Traceback (most recent call last): File "build/bdist.linux-x86_64/egg/downpour/init.py", line 391, in _error failure.raiseException() File "/usr/local/lib/python2.7/site-packages/twisted/python/failure.py", line 370, in raiseException raise self.type, self.value, self.tb Error: 500 Internal Server Error
Meta Robots: Not present/empty
Meta Refresh: Not present/empty
Any ideas as to why this is happening? They are valid feeds.
Technical SEO | mistat2000
-
404 error from site - is this normal?
I have been trying to clean up 404 errors, and we keep getting the following: URL /include/vdimgck.php, referring domain http://www.856d.c@m/plus/feedback.php (I rendered the domain unclickable by adding the "@", since I do not know if it is safe). I just turned off trackbacks and pings in the blog, since I saw they were producing duplicate content, and from what I've read it is not worth keeping them with WordPress. Is vdimgck.php anything someone here instantly recognizes? It tops all our 404 errors and seems to generate a lot of requests. Thanks!
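One generic option, if that URL has never existed on the site: serve it a 410 Gone so crawlers drop it faster than with a plain 404. A minimal .htaccess sketch, assuming Apache with mod_alias enabled:

```apache
# Tell crawlers this junk URL is permanently gone (410 instead of 404)
Redirect gone /include/vdimgck.php
```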
Technical SEO | Force7
-
403 Forbidden error on website
Hi Mozzers, I have a question about a new website for a new customer, http://www.eindexamensite.nl/. It returns a 403 Forbidden error, and I can't find the cause. I checked it with http://gsitecrawler.com/tools/Server-Status.aspx:
Technical SEO | MaartenvandenBos
Result: URL=http://www.eindexamensite.nl/ - Result code: 403 (Forbidden / Forbidden)
When I delete the .htaccess from the server I get a 200 OK :-), so the problem is in the .htaccess. This is the .htaccess code:
ErrorDocument 404 /error.html
RewriteEngine On
RewriteRule ^home$ / [L]
RewriteRule ^typo3$ - [L]
RewriteRule ^typo3/.*$ - [L]
RewriteRule ^uploads/.*$ - [L]
RewriteRule ^fileadmin/.*$ - [L]
RewriteRule ^typo3conf/.*$ - [L]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteCond %{REQUEST_FILENAME} !-l
RewriteRule .* index.php
# Start rewrites for static file caching
RewriteRule ^(typo3|typo3temp|typo3conf|t3lib|tslib|fileadmin|uploads|screens|showpic.php)/ - [L]
RewriteRule ^home$ / [L]
# Don't pull *.xml, *.css etc. from the cache
RewriteCond %{REQUEST_FILENAME} !^.*\.xml$
RewriteCond %{REQUEST_FILENAME} !^.*\.css$
RewriteCond %{REQUEST_FILENAME} !^.*\.php$
# Check for Ctrl-Shift reload
RewriteCond %{HTTP:Pragma} !no-cache
RewriteCond %{HTTP:Cache-Control} !no-cache
# NO backend user is logged in
RewriteCond %{HTTP_COOKIE} !be_typo_user [NC]
# NO frontend user is logged in
RewriteCond %{HTTP_COOKIE} !nc_staticfilecache [NC]
# We only redirect GET requests
RewriteCond %{REQUEST_METHOD} GET
# We only redirect URIs without query strings
RewriteCond %{QUERY_STRING} ^$
# We only redirect if a cache file actually exists
RewriteCond %{DOCUMENT_ROOT}/typo3temp/tx_ncstaticfilecache/%{HTTP_HOST}/%{REQUEST_URI}/index.html -f
RewriteRule .* typo3temp/tx_ncstaticfilecache/%{HTTP_HOST}/%{REQUEST_URI}/index.html [L]
# End static file caching
DirectoryIndex index.html
The CMS is TYPO3. Any ideas? Thanks!
Maarten