Hiring someone to assist us in fixing SEOmoz errors
-
Greetings. We have been using SEOmoz for about nine months, and we need to hire someone to assist us in fixing the errors flagged by our weekly SEOmoz crawl.
Does anyone know of any person or firm that can assist us with this?
-
Good answer. I like that.
-
It may be worth sharing the web address in question and the type of errors SEOmoz is finding.
The Moz community is a thriving place full of SEO experts who are very willing to offer advice to help fix your problems. It might be worth a shot, as it could save you some money and also expand your own knowledge.
-
SEOmoz has some great recommended firms.
Depending on the size, scope, and budget, I'd be interested in helping you. I've been helping companies do this for over a year. Send me a private message via my SEOmoz profile and I'd love to see if I could be of some help to you.
I don't want to self-promote, but if my profile fits what you are looking for, then I'd be interested.
Related Questions
-
Why does our competitor with lower DR and PA outrank us in Google?
Hi everyone, I really don't understand why our competitor with a lower DR and PA outranks us in Google.lv (Google Latvia). Our company ranks #2 for the keyword "gāzes baloni" in Google. Our DR is 24 and our PA is 26, whereas our competitor's DR is 23 and their PA is 19. The content on our page is much better too: we have a clear title, description, Q&A section, etc., whereas our competitor has very limited content, just photos of the product and titles. Any suggestions would be highly appreciated. Thank you very much in advance.
Technical SEO | Intergaz -
Search Console - Mobile Usability Errors
A site I'm looking at for a client had hundreds of pages flagged as having Mobile Usability errors in Search Console. I found that the theme uses parameters in the URLs of some of its resources (.js/.css) to identify the version strings. These were being blocked by a rule in the robots.txt: "Disallow: /*?"
I've removed this rule, and now when I inspect URLs and test the live versions of the pages, they are reported as mobile friendly. I then submitted validation requests in Search Console for both of the errors ("Text too small" and "Clickable elements too close").
My problem now is that the validation has completed and the pages are still being reported as having the errors. I've double-checked, and they're fine if I inspect them individually. Does anyone else have experience clearing these issues in Search Console? Any ideas what's going on here?
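For context on why that rule was the culprit: in Googlebot's robots.txt matching, `*` is a wildcard and a trailing `$` anchors the end of the URL, so `Disallow: /*?` blocks every URL containing a query string, including versioned theme assets. A rough sketch of that matching behaviour (a simplified illustration, not Google's actual implementation):

```python
import re

def robots_rule_matches(pattern: str, path: str) -> bool:
    """Approximate Googlebot-style rule matching: '*' matches any
    sequence of characters, and a trailing '$' anchors the end."""
    anchored = pattern.endswith("$")
    if anchored:
        pattern = pattern[:-1]
    # Escape everything except '*', which becomes '.*'
    regex = ".*".join(re.escape(part) for part in pattern.split("*"))
    if anchored:
        regex += "$"
    return re.match(regex, path) is not None

# 'Disallow: /*?' matches any path with a query string, so the
# versioned theme resources were blocked...
print(robots_rule_matches("/*?", "/theme/style.css?ver=5.2"))  # → True
# ...while the same file without a version parameter is unaffected:
print(robots_rule_matches("/*?", "/theme/style.css"))          # → False
```

This is why removing the rule let the mobile-friendly test render the page properly again: the CSS/JS needed for rendering stopped matching any Disallow line.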
Technical SEO | DougRoberts -
Errors In Search Console
Hi All, I am hoping someone might be able to help with this. Last week one of my sites dropped from mid page 1 to the bottom of page 1. We had not been link building as such, and it only seems to have affected a single search term and the ranking page (which happens to be the home page). When I was going through everything, I went to Search Console, and in crawl errors there are two errors that showed up as detected three days before the drop: wp-admin/admin-ajax.php showing as response code 400, and xmlrpc.php showing as response code 405. robots.txt is as follows:
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
Any help with what is wrong here and how to fix it would be greatly appreciated. Many thanks.
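One point worth noting about that robots.txt: under Google's documented precedence rules, the most specific (longest) matching rule wins, so the longer `Allow: /wp-admin/admin-ajax.php` line overrides `Disallow: /wp-admin/`, and admin-ajax.php remains crawlable. A minimal sketch of that precedence logic (an illustration of the rule only; it ignores wildcards and is not Google's implementation):

```python
def is_allowed(rules, path):
    """Resolve allow/disallow for a path: the longest matching prefix
    wins, and on a length tie Allow beats Disallow (Google's rule)."""
    matches = [(len(prefix), kind == "allow")
               for kind, prefix in rules
               if path.startswith(prefix)]
    if not matches:
        return True  # no rule matches: crawling is allowed by default
    _, allow = max(matches)  # longest prefix; on ties, True (allow) wins
    return allow

rules = [("disallow", "/wp-admin/"),
         ("allow", "/wp-admin/admin-ajax.php")]
print(is_allowed(rules, "/wp-admin/admin-ajax.php"))  # → True
print(is_allowed(rules, "/wp-admin/options.php"))     # → False
```

So the 400/405 responses on those two endpoints are unlikely to be a robots.txt problem, and the investigation can focus elsewhere.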
Technical SEO | DaleZon -
How to solve this merchant error?
Hello All, In my Google Merchant Center account, lots of warnings suddenly appeared, i.e. 1) Automatic item updates: Missing schema.org microdata price information; 2) Missing microdata for condition. Can you please tell me how to solve these errors? Thanks!
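Those warnings generally mean Google can't find structured price and condition data on the product landing pages. A minimal sketch of the schema.org Product/Offer microdata that automatic item updates look for (the product name, price, and values below are placeholders, not your actual data):

```html
<div itemscope itemtype="https://schema.org/Product">
  <span itemprop="name">Example Product</span>
  <link itemprop="itemCondition" href="https://schema.org/NewCondition" />
  <div itemprop="offers" itemscope itemtype="https://schema.org/Offer">
    <meta itemprop="priceCurrency" content="USD" />
    <span itemprop="price" content="19.99">$19.99</span>
    <link itemprop="availability" href="https://schema.org/InStock" />
  </div>
</div>
```

The markup needs to sit on the same landing page the feed points at, with the price matching the feed value.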
Technical SEO | varo
John -
How to fix google index filled with redundant parameters
Hi All, this follows on from a previous question (http://moz.com/community/q/how-to-fix-google-index-after-fixing-site-infected-with-malware) that, on further investigation, has become a much broader problem. I think this is an issue that may plague many sites following upgrades from CMS systems.
First, a little history. A new customer wanted to improve their site ranking and SEO. We discovered the site was running an old version of Joomla and had been hacked. URLs such as http://domain.com/index.php?vc=427&Buy_Pinnacle_Studio_14_Ultimate redirected users to other sites, and the site was ranking for "buy adobe" and "buy microsoft". There was no notification in Webmaster Tools that the site had been hacked. So an upgrade to a later version of Joomla was required, and we implemented SEF URLs at the same time. This fixed the hacking problem; we now had SEF URLs, fixed a lot of duplicate content, and added new titles and descriptions.
The problem is that after a couple of months things aren't really improving. The site is still ranking for adobe and microsoft and a lot of other rubbish, and old URLs like http://domain.com/index.php?vc=427&Buy_Pinnacle_Studio_14_Ultimate are still sending visitors, but to the home page, as are a lot of the other redundant URLs with parameters in them. I think it is default behavior for a lot of CMS systems to ignore parameters they don't recognise, so http://domain.com/index.php?vc=427&Buy_Pinnacle_Studio_14_Ultimate displays the home page and gives a 200 response code.
My theory is that Google isn't removing these pages from the index because it's getting a 200 response code from the old URLs, and is possibly penalizing the site for duplicate content (which doesn't show up in Moz because there aren't any links on the site to these URLs). The index in Webmaster Tools is showing over 1,000 URLs indexed when there are only around 300 actual URLs. It also shows thousands of URLs for each parameter type, most of which aren't used.
So my question is how to fix this. I don't think 404s or similar are the answer, because there are so many, and trying to find each combination of parameters would be impossible. Webmaster Tools advises not to make changes to parameters, but even so I don't think resetting or editing them individually is going to remove them; it will only change how Google indexes them (if anyone knows different, please let me know). I'd appreciate any assistance, and also any comments or discussion on this matter. Regards, Ian
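One hedged approach to the "unknown parameters return 200" problem, sketched below (this is an outline of the idea, not a drop-in Joomla fix, and the parameter whitelist is hypothetical): instead of silently serving the home page, have the site strip unrecognised parameters and return 410 Gone when *nothing* in the query string is recognised, so Google drops those URLs from the index.

```python
from urllib.parse import urlparse, parse_qsl, urlencode

KNOWN_PARAMS = {"option", "view", "id", "Itemid"}  # hypothetical whitelist

def resolve_request(url):
    """Return (status, canonical_path). Unknown parameters are dropped;
    if only unknown parameters remain, signal 410 Gone so Google
    deindexes the URL instead of seeing a 200 for the home page."""
    parts = urlparse(url)
    pairs = parse_qsl(parts.query, keep_blank_values=True)
    kept = [(k, v) for k, v in pairs if k in KNOWN_PARAMS]
    if pairs and not kept:
        return 410, None
    query = urlencode(kept)
    return 200, parts.path + ("?" + query if query else "")

# One of the old hacked URLs now reports itself as gone:
print(resolve_request(
    "http://domain.com/index.php?vc=427&Buy_Pinnacle_Studio_14_Ultimate"))
# → (410, None)
```

The 410 handles the impossible-to-enumerate combinations automatically, since the decision is made per request rather than per known bad URL; a rel="canonical" on surviving pages covers the partially recognised cases.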
Technical SEO | iragless -
Why would SEOmoz and GWT report 404 errors for pages that are not 404ing?
Recently, I've noticed that nearly all of the 404 errors (not soft 404s) reported in GWT actually resolve to a legitimate page. This was weird, but I thought it might just be old info, so I would go through the process of checking and "mark as fixed" as necessary. However, I noticed that SEOmoz is picking up on these 404 errors in the site diagnostics as well, and now I'm concerned about what the problem could be. Anyone have any insight into this? Rich
Technical SEO | secretstache -
Lots of overdynamic URL and crawl errors..
Just wanted some advice. The SEOmoz crawl found about 18,000 errors. The error URLs are mainly URLs like the one below, which seem to be the registration URL with a redirect on, going back to the product after registration: http://www.DOMAIN.com/index.php?_g=co&_a=reg&redir=/index.php?_a=viewProd%26productId=3465 We have the following line in the robots file to stop the login page from being crawled:
Disallow: /index.php?act=login
If I add the following, will it stop the error?
Disallow: /index.php?act=reg
Thanks in advance.
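One thing worth checking before relying on that rule: a plain Disallow line is a prefix match, so `Disallow: /index.php?act=reg` only blocks URLs whose query string literally starts with `act=reg`, and the error URLs above use `_g`/`_a` parameters, not `act`, so the new rule would not match them. Python's stdlib robotparser, which does plain prefix matching like the original robots.txt spec, can illustrate this (the domain below is a stand-in):

```python
from urllib import robotparser

ROBOTS_TXT = """\
User-agent: *
Disallow: /index.php?act=login
Disallow: /index.php?act=reg
"""

rp = robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# A URL that literally starts with ?act=reg is blocked by the new rule:
print(rp.can_fetch("*", "http://www.example.com/index.php?act=reg"))
# But the actual error URLs use _g/_a parameters, so they still match
# nothing and remain crawlable:
print(rp.can_fetch("*",
    "http://www.example.com/index.php?_g=co&_a=reg&redir=/index.php"))
```

A rule keyed on the parameters the error URLs actually use (e.g. one starting with `/index.php?_g=co`) would be the thing to test instead; Google also supports `*` wildcards, which plain prefix rules don't.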
Technical SEO | filarinskis