Best blog practices for website
-
For my insurance website's blog, I use Moz to help me find high-DA authoritative sites, then either generate ideas from them or rewrite the copy.
If I rewrite the copy, I tend to pull from 2-3 top authoritative sites, just so I don't get in trouble but still offer the most concise information.
My question is: is this OK to do?
Secondly, I just read that on some .gov sites the information is public, and that you can use it as long as you give credit.
My second question is: how do I tell which information is public?
Thank you in advance
-
Miss Thumann,
My rule of thumb is to assume that none of the content is yours: read up on your subject from more than one site and then write your own informed content.
I work for a large industrial products and services company and often carry out spot checks on content that doesn't rank highly on Google. I then email anyone stealing our technical content and politely ask them to take it down, as we own the intellectual property rights to the articles. If nothing happens, I email our intellectual property rights firm, who then proceed to get the domain withdrawn until the changes have been made.
In the eyes of Google, new content is good content, and its clever algorithms can soon tell a clone page. Google doesn't care whether content is public or not; it cares whether it's duplicate.
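A minimal sketch of that kind of spot check, assuming Python 3 with the requests and beautifulsoup4 packages installed (the URLs below are placeholders, not real pages):

```python
# Rough duplicate-content spot check: compare the visible text of two pages.
import requests
from bs4 import BeautifulSoup
from difflib import SequenceMatcher

def page_text(url: str) -> str:
    """Fetch a page and return its visible text, stripped of scripts and styles."""
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    for tag in soup(["script", "style"]):
        tag.decompose()
    return " ".join(soup.get_text(separator=" ").split())

# Placeholder URLs; swap in your article and the suspected copy.
original = page_text("https://www.example.com/our-technical-article")
suspect = page_text("https://other-site.example/suspected-copy")

# SequenceMatcher ratio: 1.0 means identical text; anything around 0.9+ is usually a near clone.
similarity = SequenceMatcher(None, original, suspect).ratio()
print(f"Text similarity: {similarity:.2f}")
```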
I hope this helps in some way.
Related Questions
-
I've been using Moz for just a minute now. I used it to check my website and found quite a number of errors. Unfortunately I use a WordPress website, and even with the tips, I still don't know how to fix the issues.
I've seen quite a number of errors on my website hipmack.co, a WordPress website, and I don't know how to begin clearing the index errors, or any others for that matter. Can you help me please?
Moz Pro | Dogara
Tools necessary for a technical audit of a website with penalties that needs remediation?
As a test for a job interview, I have to prove and/or disprove that a website has issues. I am familiar with the Moz tools, but I'm not sure of the procedure for this request, and I am not finding anything online. The client will give me a website and I will be doing this audit. What tools would you use? What exactly should I be looking for? What are some obvious fixes? Where can I learn more?
Moz Pro | Joseph.Lusso
Page Authority and Google updates favouring websites with black hat practices?
Can someone explain how it is that most of the competitors I have online that rank on the first page of the search results get almost all of their links (in the thousands) and still have higher or equal domain/page authority than mine? I checked all their links one by one, and they mostly come from sex pages and unrelated sites. I say stop creating angry Pandas and Penguins and start taking out of the game the people who just play dirty. Thanks.
Moz Pro | AbellSEO
Can increasing website pages decrease domain authority?
Hello Mozzers! Say there is a website with 100 pages and a domain authority of 25. If the number of pages on this website increases to 10,000, can that decrease its domain authority or affect it in any way?
Moz Pro | MozAddict
What is the best approach to handling 404 errors?
Hello All - I'm new here and working on the SEO of my site www.shoottokyo.com. When I find 4xx (client) errors, what is the best way to deal with them? For example, I am finding an error like this: http://shoottokyo.com/2010/11/28/technology-and-karma/ This may have been caused when I updated my permalinks from shoottokyo.com/2011/09/postname to shoottokyo.com/postname. I was using the plugin Permalinks Moved Permanently to fix them. Sometimes I find something like http://shoottokyo.com/a-very-long-week/www.newscafe.jp and can tell that I simply have a bad link to News Cafe, so I can go to the post and correct it, but in the case of the first one I can't find out where the crawler even found the problem. I'm using WordPress. Is it best to just use a plugin like Redirection for the rest of the errors where I cannot find the source of the issue? (See the sketch after this entry.) Thanks Dave
Moz Pro | ShootTokyo
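For the 4xx URLs described in the question above, one low-tech way to see which old URLs still dead-end (rather than 301 to the new postname URLs) is a small script like this. A rough sketch, assuming Python 3 with requests installed; the URL list is hypothetical and would normally come from the 4xx export of your crawl report:

```python
# Check which old URLs return errors and where any existing redirects point,
# so you know exactly which ones still need a 301 added.
import requests

# Hypothetical list; in practice, paste in the 4xx URLs from your crawl export.
old_urls = [
    "http://shoottokyo.com/2010/11/28/technology-and-karma/",
    "http://shoottokyo.com/a-very-long-week/www.newscafe.jp",
]

for url in old_urls:
    # Don't follow redirects, so we can see the raw status code for each URL.
    resp = requests.get(url, allow_redirects=False, timeout=10)
    if resp.status_code in (301, 302):
        print(f"{url} -> {resp.status_code} -> {resp.headers.get('Location')}")
    else:
        print(f"{url} -> {resp.status_code}  (needs a redirect or a fixed link)")
```

Anything that still comes back as a 404 can then be pointed at its new postname URL in the Redirection plugin (or in the server config); anything already answering with a 301 to the right place can be left alone.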
Why is there no Page Authority on any of my websites?
I just saw that SEOmoz updated their Page Authority, MozRank, and Domain Authority. However, on most of my websites the Page Authority got reset. Am I doing something wrong?
Moz Pro | GroupM
Crawl went from a few errors to thousands when I added a blog
I am new here. I recently got the errors from the SEOmoz crawl of my site down to just a handful from a couple hundred. So I took the leap and moved my blog to www.mysitename.com/blog (which I see recommended here), and now my errors are in the thousands. My blog, which was on a separate URL, has pages going back to 2007. I am not sure if it is appropriate to post my site URL in a question here. One error that really stands out has this description: "Using rel=canonical suggests to search engines which URL should be seen as canonical." On my root page I have: <link rel="canonical" href="http://www.mysitename.com"/> (see the sketch after this entry). Thanks for any help...
Moz Pro | CMCD
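One way to sanity-check the canonical issue above is to confirm that each page's rel=canonical points at itself rather than at the homepage. A rough sketch, assuming Python 3 with requests and beautifulsoup4 installed; the URLs are placeholders and would normally come from the sitemap or crawl export:

```python
# Report each page's rel=canonical target so self-referencing vs. homepage
# canonicals are easy to spot.
import requests
from bs4 import BeautifulSoup

# Placeholder URLs; in practice, feed in URLs from your sitemap or crawl export.
pages = [
    "http://www.mysitename.com/",
    "http://www.mysitename.com/blog/",
    "http://www.mysitename.com/blog/some-post/",
]

for url in pages:
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    link = soup.find("link", rel="canonical")
    canonical = link.get("href") if link else "(no canonical tag)"
    # Exact string comparison: trailing-slash mismatches are also flagged for review.
    flag = "" if canonical == url else "  <-- check this one"
    print(f"{url} -> {canonical}{flag}")
```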
301ing an entire e-commerce website - using the Top Pages report as a guide
Hi Folks, My company is in discussions with an ERP provider to migrate our existing website to a new back-end system, new URL structures... the whole 9 yards. We have over 15,000 products and a lot of "odd" URL structures created by an outdated system that we were unable to adjust. All in all, if you're talking about a raw URL count, it's in the hundreds of thousands. During discussions, we were told that bulk 301 redirects would be a problem; they could be performed manually, though (greaaattt). Because of this, I ran an Open Site Explorer report and isolated our top pages. After exporting the CSV, I was able to break out "all" URLs that have links from other domains. My question is, has anyone used Open Site Explorer on a website of similar size to form the basis of a migration? Do you have enough confidence in the tool to use it this way, or should we renegotiate our agreement until a way can be found to mass-301 ALL URLs? I'm at least a little concerned that Open Site Explorer isn't indexing all of the links out there. Gasp... or would there be a better tool to accomplish what I'm trying to do? (See the sketch after this entry.) Thanks!
Moz Pro | Blenny
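On the bulk-redirect question above: if the vendor can only apply redirects from a flat list, one pragmatic approach is to generate that list yourself from an old-URL/new-URL mapping, seeded with the Top Pages export. A rough sketch, assuming Python 3 and a hypothetical redirect_map.csv with old_url and new_url columns:

```python
# Turn a CSV of old_url,new_url pairs into one-per-line 301 rules
# (Apache "Redirect 301" syntax here; adapt to whatever the new platform accepts).
import csv
from urllib.parse import urlparse

with open("redirect_map.csv", newline="") as src, open("redirects.conf", "w") as out:
    for row in csv.DictReader(src):
        # Apache's Redirect directive takes the old URL path and the full new URL.
        old_path = urlparse(row["old_url"]).path or "/"
        out.write(f'Redirect 301 {old_path} {row["new_url"]}\n')

print("Wrote redirects.conf")
```

The Top Pages export is a sensible way to prioritize which URLs must be mapped first (the ones with external links), but no link index is complete; merging in server logs or analytics landing-page data before treating the list as final is worth considering.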