Are there any tools to give a value STRICTLY for Quantity of Content on your website?
-
I am trying to put a value on all the work I do, and I want to assign a very specific value to the number of pages of unique content I have.
I know everyone says it's about quality, and sure it is, but quantity is still a factor that gets looked at.
(You can't argue with preferring 100 semi-optimized pages over 1 optimized page, and it's unfair for a tool to rate the website with the 1 optimized page higher.)
I use a ton of tools but have yet to find something that puts a value on quantity of CONTENT ONLY. (Please don't respond with PA or DA, because those encompass all the inherent value.)
-
There's no good reason to measure your content purely by quantity, simply because there's no inherent reason to have 10 pages on a site versus 10 million. The "right" amount of content for any particular website will be entirely driven by your business model, your marketplace, and the actions of your audience.
If you're looking for a way to measure the value of each page of content against your actual revenue and business goals, then to do what you're asking, you need a website set up to perfectly track multi-touch attribution across the entire customer lifecycle, and that's a hard task for most business models.
For us mortals, GA is the simplest way to get closer to this. By assigning e-commerce revenue or goal values, GA will assign a value to each page, allowing you to see the total value that page contributes. The Content Grouping feature of GA will also let you analyze content in bucketed groups, rather than page by page. Using GA is limited by the goals you are tracking: if you don't attach a dollar value to a newsletter signup, then GA will never assign more value to a page that generates tons of newsletter signups.
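To make that concrete, here is a minimal Python sketch of the idea behind GA's Page Value metric. All session data and dollar values are invented for illustration, and GA's real attribution is more involved than this:

```python
# A minimal sketch of the idea behind GA's Page Value metric.
# All session data and dollar values below are made up for illustration;
# GA's real attribution is more involved than this.

from collections import defaultdict

# Each session lists the pages viewed and the value it produced
# (e-commerce revenue plus any goal values you assigned).
sessions = [
    {"pages": ["/blog/widgets", "/pricing", "/checkout"], "value": 50.0},
    {"pages": ["/blog/widgets", "/about"], "value": 0.0},
    {"pages": ["/pricing", "/checkout"], "value": 120.0},
]

value_per_page = defaultdict(float)
unique_views = defaultdict(int)

for session in sessions:
    for page in set(session["pages"]):
        # Every page viewed in a session shares credit for that
        # session's revenue and goal value.
        value_per_page[page] += session["value"]
        unique_views[page] += 1

for page in sorted(value_per_page):
    print(f"{page}: ${value_per_page[page] / unique_views[page]:.2f} per unique pageview")
```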
Marketing automation/analytics tools like HubSpot, Mixpanel, and Kissmetrics have similar features that allow you to track the value of a piece of content.
So, setting up the right analytics environment is the best way to justify your efforts on a per-page basis.
-
I assume you are talking about quantity as in the number of articles, versus the length of each article. If so, one metric that may be beneficial to track is the internal linking between articles: the more articles you have, the more interlinking can be accomplished, and this, if done correctly, helps.
The same is true for article length (assuming stellar content here): the longer the article, the more room for quality internal and external linking, and the more room for strong headlines and images that grab the attention of readers who skim the article.
With this in mind, you can see why longer articles, and more of them, can make sense!
Now, for tools: if you want to check how many articles you have versus a competitor, you can use Screaming Frog or an equivalent to scan their site, count the number of articles, and compare the total to yours. Of course, much more can be done to scrape and count internal links, flag thin content, and so on.
Here is more of what can be done with Screaming Frog.
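If you'd rather script the counting yourself, here is a rough Python sketch of the same idea, assuming the requests and beautifulsoup4 packages are available; the start URL is a placeholder:

```python
# A rough stand-in for what Screaming Frog automates: crawl a site,
# count its pages, and count the internal links between them.
# Requires: pip install requests beautifulsoup4
# The start URL is a placeholder; a production crawler would also need
# politeness delays, robots.txt handling, and error logging.

from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

START_URL = "https://example.com"
DOMAIN = urlparse(START_URL).netloc

to_visit = [START_URL]
seen = {START_URL}
internal_links = 0

while to_visit:
    url = to_visit.pop()
    try:
        html = requests.get(url, timeout=10).text
    except requests.RequestException:
        continue  # skip pages that fail to load
    for anchor in BeautifulSoup(html, "html.parser").find_all("a", href=True):
        link = urljoin(url, anchor["href"]).split("#")[0]
        if urlparse(link).netloc == DOMAIN:
            internal_links += 1  # every internal link counts toward the total
            if link not in seen:
                seen.add(link)
                to_visit.append(link)

print(f"Pages found: {len(seen)}")
print(f"Internal links found: {internal_links}")
```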
Hope this helps!
-
I don't think you can get a tool that accurate, as the user will determine how good the content really is. I think the Moz scores are a good indication.
How would you measure it? You could have criteria to measure against, e.g.:
- Time spent on page
- Number of high-quality references
- Number of social shares
- Number of new vs. returning visitors
- Bounce rate, etc.
I think, if I am not mistaken, this is what the PA is based on. It would be interesting to hear what a Moz staff member has to say on this.
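If you wanted to roll criteria like these into a single per-page score yourself, a hypothetical Python sketch might look like the following; every weight, maximum, and metric value is invented for illustration and would need tuning against your own site's data:

```python
# A hypothetical per-page content score built from criteria like those
# above. Every weight, maximum, and metric value here is invented for
# illustration and would need tuning against your own site's data.

page_metrics = {
    "time_on_page": 95,        # average seconds on page
    "references": 4,           # links to high-quality sources
    "social_shares": 37,
    "new_visitor_ratio": 0.6,  # new visitors / total visitors
    "bounce_rate": 0.45,
}

weights = {
    "time_on_page": 0.3,
    "references": 0.2,
    "social_shares": 0.2,
    "new_visitor_ratio": 0.1,
    "bounce_rate": 0.2,        # lower is better, so it is inverted below
}

# Assumed site-wide maximums used to scale each metric onto 0-1.
maximums = {
    "time_on_page": 300,
    "references": 10,
    "social_shares": 100,
    "new_visitor_ratio": 1.0,
    "bounce_rate": 1.0,
}

scaled = {k: min(v / maximums[k], 1.0) for k, v in page_metrics.items()}
scaled["bounce_rate"] = 1.0 - scaled["bounce_rate"]  # invert: low bounce is good

score = sum(scaled[k] * weights[k] for k in weights)
print(f"Content score: {score:.2f} / 1.00")
```

Normalizing each metric onto the same 0-1 scale before weighting keeps raw counts like social shares from swamping ratios like bounce rate.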
-
Related Questions
-
Best blog practices for website
For my insurance website blog, I use Moz to help me find high-DA authoritative sites, then either generate ideas from them or rewrite the copy. If I rewrite the copy, I tend to pull from 2-3 top authoritative sites, just so I don't get in trouble but still offer the most concise information. _My question is: is this OK to do?_ Secondly, I just read that on some .gov sites the information is public, and that you can use it as long as you give credit. _My question is: how do I tell which information is public?_ Thank you in advance 🙂
Moz Pro | MissThumann
-
301 Redirects - But still duplicate content?
Our website domain website.com redirects to website.com/en (since it's in English). Therefore, all pages on website.com redirect to website.com/en. In my Moz analytics, it says I have duplicate content, and it lists all of these pages. Didn't the 301 redirects take care of the duplicate content? Or do I still have to add canonical tags?
Moz Pro | Taulia
-
Duplicate Page content
I found these URLs in the issue "Duplicate Page Content":
- http://www.decoparty.fr/Products.asp?SubCatID=4612&CatID=139 (1 | 0 | 10 | 1)
- http://www.decoparty.fr/Products.asp?SubCatID=4195&CatID=280 (1 | 0 | 10 | 1)
- http://www.decoparty.fr/Catproducts.asp?CatID=124 (28 | 0 | 12 | 1)
Moz Pro | partyrama
-
Duplicate content error?
I am getting a duplicate content error for the following pages: http://www.bluelinkerp.com/products/accounting/index.asp and http://www.bluelinkerp.com/products/accounting/. But, of course, the second link is just an automatic redirect to the index file, is it not? Why is it treated as a different URL? See image: NJfxA.png
Moz Pro | BlueLinkERP
-
Can I rely on Keyword Difficulty tool?
I just ran into a problem that I hadn't expected. Testing Keyword Difficulty, I saw the results contained a result for a page that has Domain Authority = 1 and Page Authority = 1. As a result, Keyword Difficulty was reduced (compared to last month), which may actually be reversed once the site is crawled. Sadly, I didn't run the report on the figures, as it was a small project. Questions: Can I rely on results shown by Keyword Difficulty? Are results where Domain Authority = 1 used to calculate Keyword Difficulty? If so, why is that? Is there any difference between a page that has received no links and a page that OSE/Mozscape has no link data for? The problem: Using the Keyword Difficulty tool, I found swings of up to 14% in Keyword Difficulty (between Oct and Nov). Dr. Pete may suggest that this is because of changes in Google's index (http://www.seomoz.org/blog/a-week-in-the-life-of-3-keywords). However, it would be helpful to have a figure for Keyword Difficulty that isn't affected by gaps in the Mozscape data. The (bad) solution: You can mirror something close to Keyword Difficulty using: =(Sum of Page Authorities + Sum of Domain Authorities)/20. Right now, I have resorted to manually calculating keyword difficulty. I use the SEOmoz Page Authority and Domain Authority figures and a quick splash of Excel SUMIF and COUNTIF. I find the results don't look as 'easy' when I can ignore results where the data is unknown (Page Authority = 1 and Domain Authority = 1).
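To illustrate, here is a small Python sketch of that manual calculation, using invented PA/DA figures for a hypothetical top-10 SERP; the /20 divisor in the Excel formula corresponds to averaging 2 scores across 10 results:

```python
# A sketch of the manual Keyword Difficulty proxy described above.
# The PA/DA pairs are invented stand-ins for a top-10 SERP; the /20 in
# the Excel formula corresponds to averaging 2 scores across 10 results.

serp_results = [  # (Page Authority, Domain Authority) of each ranking page
    (45, 60), (38, 52), (1, 1), (30, 41), (25, 33),
    (50, 70), (1, 1), (22, 28), (35, 47), (40, 55),
]

def difficulty(results, ignore_unknown=False):
    if ignore_unknown:
        # Drop results Mozscape has no data for (PA = 1 and DA = 1).
        results = [(pa, da) for pa, da in results if (pa, da) != (1, 1)]
    total = sum(pa for pa, _ in results) + sum(da for _, da in results)
    return total / (2 * len(results))

print(f"Difficulty with unknowns:    {difficulty(serp_results):.1f}")
print(f"Difficulty without unknowns: {difficulty(serp_results, ignore_unknown=True):.1f}")
```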
Background info: One result I still have a report on is for the phrase [fixing your business puzzle] using US results on Google. For that specific result, I found the following additional information about the site:
- DNS lookup shows the domain was registered in 2010
- Archive.org shows no records
- OSE shows no data for the site
- Site uses https
- Google showing no links
- Robots.txt file seems fine
- No Sitemap.xml
Moz Pro | Darroch
-
Why does Crawl Diagnostics report this as duplicate content?
Hi guys, we've been addressing a duplicate content problem on our site over the past few weeks. Lately, we've implemented rel canonical tags in various parts of our ecommerce store and have observed the effects by tracking changes in both SEOmoz and Webmaster Tools. Although our duplicate content errors are definitely decreasing, I can't help but wonder why some URLs are still being flagged with duplicate content by our SEOmoz crawler. Here's an example, taken directly from our Crawl Diagnostics report. URL with 4 duplicate content errors: /safety-lights.html. Duplicate content URLs:
- /safety-lights.html?cat=78&price=-100
- /safety-lights.html?cat=78&dir=desc&order=position
- /safety-lights.html?cat=78
- /safety-lights.html?manufacturer=514
What I don't understand is that all of the URLs with URL parameters have a rel canonical tag pointing to the 'real' URL, /safety-lights.html. So why is the SEOmoz crawler still flagging this as duplicate content?
Moz Pro | yacpro13
-
Keyword Difficulty Tool Ranking
I'm using the Keyword Difficulty tool to help me create a list of 5 keywords (out of approx 50-60) to optimise pages for on a site. However, I don't want to just choose the top 5 if 3 of them are too competitive and not worth targeting. From anyone's experience, for a small, new web company that has no pages optimised at this point, do you think there is a keyword difficulty score on which I should set a hard limit? For instance, within a group of keywords, only target those with a difficulty score of 60 or below, because anything higher would be too difficult to optimise pages for at this stage of the site's development. Thanks in advance for your help. Michelle 🙂
Moz Pro | artlivemedia
-
Is there a tool to show me exactly how a search engine sees my pages?
Our site is very graphics-heavy, and I know the search engines aren't seeing it as humans do. Is there a tool (almost like Wirify) that can show me exactly how Google sees my page(s)? Thanks in advance. 🙂
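(For illustration: a very crude text-only approximation of a page in Python, with a placeholder URL; note this ignores the JavaScript rendering that modern search engines do perform.)

```python
# A very rough text-only view of a page, in the spirit of what the
# question asks for. The URL is a placeholder, and this ignores
# JavaScript rendering entirely, which modern search engines do perform.
# Requires: pip install requests beautifulsoup4

import requests
from bs4 import BeautifulSoup

soup = BeautifulSoup(requests.get("https://example.com", timeout=10).text, "html.parser")

print("Title:", soup.title.string if soup.title else "(none)")
# Graphics contribute only their alt text to a text-based view of the page.
print("Image alt text:", [img.get("alt", "(missing)") for img in soup.find_all("img")])
print("Visible text:", soup.get_text(" ", strip=True)[:500])
```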
Moz Pro | askotzko