Domain MozRank / MozTrust too LOW
-
I launched this site about a year ago: www.bioespaco.com
I can't understand why the site has such low domain MozRank and MozTrust values.
The site gets about 1,500 visitors and 3,500 page views per month.
Most of the visitors who come from Google find the site with the keyword "bioespaco", the name of the company.
I can't figure out what is wrong... It looks like something is penalizing the page (or the domain!?)
Thanks!
-
Google Webmaster Tools says that the domain has 20+ domains (followed links) pointing to www.bioespaco.com :S
-
Forgot to mention: the domain was registered in 2004.
-
You may want to look at increasing the number of unique domains linking in; at present you only have 4.
Try finding some authoritative resources, invest in building links from these, and analyse the results.
It will take time.
-
Related Questions
-
Can We Publish Duplicate Content on Multi-Regional Websites / Blogs?
Today, I was reading Google's official article on multi-regional websites and the use of duplicate content. Right now, we are working on 4 different blogs for the following regions, and we're writing unique content for each blog. But I am thinking of using one piece of content / subject for all 4 regional blogs.

USA: http://www.bannerbuzz.com/blog/
UK: http://www.bannerbuzz.co.uk/blog/
AUS: http://www.bannerbuzz.com.au/blog/
CA: http://www.bannerbuzz.ca/blog/

Let me give you a very clear idea of it. Recently, we published one article on the USA website: http://www.bannerbuzz.com/blog/choosing-the-right-banner-for-your-advertisement/ And we want to publish this article / blog on the UK, AUS & CA blogs without making any changes. I have read the following paragraph in Google's official guidelines, and it inspires me to make it happen. Which is the best solution for it?

Websites that provide content for different regions and in different languages sometimes create content that is the same or similar but available on different URLs. This is generally not a problem as long as the content is for different users in different countries. While we strongly recommend that you provide unique content for each different group of users, we understand that this may not always be possible. There is generally no need to "hide" the duplicates by disallowing crawling in a robots.txt file or by using a "noindex" robots meta tag. However, if you're providing the same content to the same users on different URLs (for instance, if both example.de/ and example.com/de/ show German language content for users in Germany), you should pick a preferred version and redirect (or use the rel=canonical link element) appropriately. In addition, you should follow the guidelines on rel-alternate-hreflang to make sure that the correct language or regional URL is served to searchers.
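For what it's worth, the rel-alternate-hreflang approach the quoted guidance mentions boils down to every regional copy of the article carrying the same set of alternate links, one per regional URL. Below is a minimal sketch of that mapping for the four blogs; the identical slug on each regional blog and the use of the US version as x-default are assumptions for the example, not details from the thread.

```python
# A minimal sketch of reciprocal hreflang annotations for the four regional blogs.
# Assumptions: the article keeps the same slug on every blog, and the US version
# is used as the x-default; neither detail comes from the original post.

REGIONAL_BLOGS = {
    "en-us": "http://www.bannerbuzz.com/blog/",
    "en-gb": "http://www.bannerbuzz.co.uk/blog/",
    "en-au": "http://www.bannerbuzz.com.au/blog/",
    "en-ca": "http://www.bannerbuzz.ca/blog/",
}


def hreflang_tags(slug, x_default="en-us"):
    """Build the <link rel="alternate"> elements every regional copy should carry.

    Each regional version lists all versions, including itself, so the set of
    tags is identical on every copy of the article.
    """
    tags = [
        '<link rel="alternate" hreflang="{}" href="{}{}" />'.format(lang, base, slug)
        for lang, base in REGIONAL_BLOGS.items()
    ]
    # x-default tells search engines which version to serve when no region matches.
    tags.append(
        '<link rel="alternate" hreflang="x-default" href="{}{}" />'.format(
            REGIONAL_BLOGS[x_default], slug
        )
    )
    return tags


if __name__ == "__main__":
    for tag in hreflang_tags("choosing-the-right-banner-for-your-advertisement/"):
        print(tag)
```

Run against the article slug mentioned above, it prints the identical block of link elements that each of the four regional copies would include in its head.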
Content Development | CommercePundit
-
My boss tells me personal narrative content isn't read online and is bad for SEO. Anyone else disagree? Because I do!
I am in a constant debate over whether writing content in the 1st person or the 3rd person makes any difference in terms of SEO and what people on the web want to read. What do you all think? Does it make a difference?
Content Development | GoAbroadKP
-
With the structure of WordPress, when multiple tags are selected, SEOmoz reports show each URL/tag as duplicate content. What to do?
wordpress.com/blogpost/tag/word1, wordpress.com/blogpost/tag/word2, etc. It's the same page, but WP generates multiple URLs for each tag. In the reports, this shows as duplicate content. Is it something to worry about? If yes, what is the best fix?
Content Development | VividImage
-
Possible to recover from Thin/Duplicate content penalties?
Hi all, first post here, so sorry if it's in the wrong section. I'm after a little advice, if I may, from more experienced SEOers than myself with regards to writing off domains or keeping them.

To cut a long story short, I do a lot of affiliate marketing. Back in the day (until the past 6 months or so) you could just take a merchant's datafeed and, with some SEO, outrank them for individual products. However, since the Google Panda update this hasn't worked as well and it's now much harder to do, which is better for the end user.

The issue I have is that I got lazy and tried to see if I could still get some datafeeds to rank with only duplicate content. The sites ranked very well at first but died massively after a couple of weeks. They went from 0 to 300 hits a day in a matter of 24 hours and back to 2 hits a day. The sites now don't rank for anything, which is obviously because they are duplicate content.

The question I have is: are these domains dead, or can they be saved? I'm not talking about the duplicate content but about the domains themselves. I used about 10 domains to test things, ranging from DA 35 to DA 45; one of the tests was whether a domain with reasonable DA can rank for duplicate content. Seeing as the test didn't work, I want to use the domains for proper sites with proper unique content. However, so far, although the new unique content is getting indexed, it is suffering from the same ranking penalties the duplicate (and now deleted) pages had.

Is it worth trying to use these domains? Will Google finally remove the penalty when they notice that the bad content is no longer on the site, or are the domains very much dead? Many thanks
Content Development | ccgale
-
Spell Check Bot / Crawler
Hi guys, does anyone know of a script / app / bot / browser plugin that will crawl an entire website, examine the content, and report on any spelling (or grammar) mistakes? Many thanks. C.
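No specific tool is named here, but as a rough illustration of what such a bot could look like, here is a minimal Python sketch that follows same-domain links and flags words an English dictionary doesn't recognise. It assumes the third-party requests, beautifulsoup4 and pyspellchecker packages, and the start URL is a placeholder; treat it as a starting point rather than a finished checker (it ignores robots.txt, grammar, and non-English text).

```python
# A rough sketch of a same-domain crawler that spell-checks visible page text.
# Assumes: pip install requests beautifulsoup4 pyspellchecker
import re
from collections import deque
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup
from spellchecker import SpellChecker


def crawl_and_spellcheck(start_url, max_pages=50):
    spell = SpellChecker()
    domain = urlparse(start_url).netloc
    queue, seen = deque([start_url]), {start_url}
    report = {}

    while queue and len(report) < max_pages:
        url = queue.popleft()
        try:
            resp = requests.get(url, timeout=10)
        except requests.RequestException:
            continue
        soup = BeautifulSoup(resp.text, "html.parser")

        # Pull the visible text and extract plain alphabetic words.
        words = re.findall(r"[a-zA-Z']+", soup.get_text(" "))
        report[url] = spell.unknown([w.lower() for w in words])

        # Follow only links that stay on the same domain.
        for a in soup.find_all("a", href=True):
            link = urljoin(url, a["href"]).split("#")[0]
            if urlparse(link).netloc == domain and link not in seen:
                seen.add(link)
                queue.append(link)

    return report


if __name__ == "__main__":
    # Placeholder start URL; swap in the site you actually want to check.
    for page, suspects in crawl_and_spellcheck("https://www.example.com/").items():
        print(page, sorted(suspects))
```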
Content Development | cmoylan
-
Block Low Quality Pages?
What are your thoughts on blocking (in robots.txt) and/or noindexing low-quality pages to defend against Panda, assuming you can't remove, redirect, or add quality content to them? Also, assume there are no external links pointing to these low-quality pages, no social shares, and zero incoming organic traffic. Has anyone had experience with this as a solution to Panda?
Content Development | poolguy
-
Subdirectory vs. subdomain for company blog
My company's blog is currently in a subdirectory (www.site.com/blog), but for technical ease we are considering changing it to a subdomain (blog.site.com). What are the SEO ramifications of each? Thank you! Best, Sara
Content Development | theLotter
-
Blog on sub-domain: effect on SEO
Hi, I'm setting up a blog for a new website. What effect will hosting the blog on a sub-domain (blog.mysite.com vs. mysite.com/blog) have on SEO? Also, would having a separate blog domain (myblog.com) be beneficial or detrimental? Thanks, Shane
Content Development | notarynow