How do you measure content on a website?
I had never thought about this question before, perhaps because I never focused on creating content, only on optimizing clients' existing content. So how do you measure the content on a specific page?
There are several ways to measure the success of content, and they can be divided into two categories: quantity metrics and quality metrics.

Quantity metrics

- Visitor engagement metrics (time spent on page, bounce rate, page views per visit)
- Conversion metrics (orders/conversions driven by the content)
- Social metrics (Facebook likes and shares, tweets, and other share counts)
- Number of subscribers to the RSS feed (if the content is part of a blog)
- Number of comments
- Rank of the keywords driving organic traffic to the page
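The engagement metrics above are usually read straight out of an analytics tool, but it helps to know exactly what they mean. Here is a minimal sketch, assuming a hypothetical pageview-log schema (the field names are illustrative, not from any real analytics API):

```python
from collections import defaultdict

def engagement_metrics(pageviews):
    """Compute basic engagement metrics from raw pageview records.

    Each record is a dict with 'visit_id', 'page', and 'seconds_on_page'
    (a hypothetical schema for illustration; tools like Google Analytics
    report these numbers for you).
    """
    # Group pageviews by visit so per-visit metrics can be derived.
    visits = defaultdict(list)
    for pv in pageviews:
        visits[pv["visit_id"]].append(pv)

    total_visits = len(visits)
    # A "bounce" is a visit that viewed exactly one page.
    bounces = sum(1 for views in visits.values() if len(views) == 1)
    total_pages = sum(len(views) for views in visits.values())
    total_seconds = sum(pv["seconds_on_page"] for pv in pageviews)

    return {
        "avg_time_on_page": total_seconds / total_pages,
        "bounce_rate": bounces / total_visits,
        "pages_per_visit": total_pages / total_visits,
    }
```

In practice you would pull these numbers from your analytics package rather than compute them yourself; the sketch just makes the definitions concrete.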
Quality metrics

- Quality of comments
- Quality of social shares
- Feedback received from users
- LDA score (although the LDA score is itself a number, it fits the quality category because the LDA algorithm is essentially a mathematical indication of how relevant the content is to a given keyword)
Best,
Sameer