Your Opinion: Thin Content? Should we Retire this section?
-
Only way to explain this was to make a video.
Would love everyone's input on this:
https://youtu.be/TcdaOvz24Aw
Thank you. -
Sounds great! Let me know what impact you see from the changes, curious to hear!
-
Wow! Thank you so much for such a comprehensive and thoughtful reply.
You're right, it's on a very old CMS, and not only does it lack an XML sitemap, but I'm certain there are duplicate content issues, improper pages being indexed (contact, member list, etc.), and a hundred other problems. Cleaning up and upgrading the site has surfaced mass quantities of these issues, since these very old CMS systems were built before "duplicate content" and the like were even a concern.
I will take all your recommendations to heart and get started right away.
-
A few thoughts - pretty good explainer video; I'd imagine that took a ton of time to put together!
First impressions on user profiles:
-
From looking at a few of the profiles, it seems like you'd be missing out on a lot of good user data if you simply retired these profiles permanently. From a search engine perspective, it sounds like they're not bringing in much on their own. But there's also qualitative and quantitative data you can glean from all of those profiles to inform and create content, charts, data, reports, etc. for your site. You can map out all the user profiles to create personas for your site. You wouldn't want to trash these and lose the data.
-
I do see the point about the content being somewhat thin, but thin content pages don't always harm your site if they're serving a purpose. If you look at Moz's profile pages, some of them aren't very meaty if the user hasn't contributed many posts or comments. So it should be evaluated on a case-by-case basis.
-
What you can do - and I believe Moz does - is deactivate/archive a profile page if the user hasn't been active in the last year, until the user signs in again. That way you only have active user pages. Alternatively, you can leave the profiles up but automatically noindex the pages that aren't being used so that crawl budget isn't wasted. Another idea is to automatically noindex pages under a certain word count threshold.
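A minimal sketch of that archive/noindex decision logic - the field names, dict schema, and thresholds here are assumptions for illustration, not your CMS's actual data model:

```python
from datetime import datetime, timedelta

# Hypothetical profile record check; swap in your CMS's real fields.
def should_noindex(profile, min_words=100, max_inactive_days=365, now=None):
    """Flag a profile page for a noindex meta tag if it's stale or too thin."""
    now = now or datetime.utcnow()
    inactive = (now - profile["last_active"]) > timedelta(days=max_inactive_days)
    too_thin = profile["word_count"] < min_words
    return inactive or too_thin

# A profile untouched for years with only 40 words of content would be flagged.
stale_profile = {"last_active": datetime(2020, 1, 1), "word_count": 40}
```

Run something like this on a schedule and have the template emit `<meta name="robots" content="noindex,follow">` when it returns True, so reactivated users get re-indexed automatically.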
-
Yes, the user profiles may be eating up your crawl budget when Google crawls the site, so you wouldn't necessarily want the profiles crawled ahead of other important pages if the crawler is hitting its limit. Check out Search Console to see the crawl activity and stats (under Crawl Stats).
-
It looks like you have 8,000+ profiles currently indexed - which is both good and bad. Good because it's proof that Google is able to crawl all of those profiles and deems them worth indexing. Bad if you don't want all of them indexed or to waste your crawl budget.
-
You're right that all of the title tags are the same. I would automatically rewrite those and run a 90-day experiment to see if you can actually get some long-tail organic traffic to those profiles. Perhaps in aggregate they can bring in some traffic. Or maybe you'll learn after 90 days that they just don't rank for anything and aren't worth spending time on.
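The auto-rewrite could be as simple as templating a few profile fields into each title. A sketch - the field names and title pattern are made up for illustration, not an existing template on your site:

```python
# Generate a unique, descriptive title tag per profile instead of one
# shared boilerplate title. Field names here are hypothetical.
def profile_title(username, post_count, site_name, topic="Hair Loss"):
    """Build a descriptive title tag from profile data."""
    title = f"{username}'s Profile - {post_count} {topic} Posts | {site_name}"
    # Fall back to a shorter pattern past a typical ~60-character display limit.
    return title if len(title) <= 60 else f"{username}'s Profile | {site_name}"

print(profile_title("jsmith", 42, "HairLossTalk"))
```

Even a pattern this simple differentiates 8,000+ pages that currently share one title, which is what the 90-day test needs.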
-
One thing I noticed is that you don't have an XML sitemap designated in your robots.txt - https://www.hairlosstalk.com/robots.txt I would add multiple XML sitemaps to your robots.txt by creating a sitemap index. The WordPress SEO plugin by Yoast does this well for your WordPress pages, but you should make sure your shop and profile pages get their own XML sitemaps as well.
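For reference, the setup might look something like this - the sitemap file names are illustrative, since Yoast and your shop/profile systems will each generate their own:

```text
# robots.txt — point crawlers at a single sitemap index
Sitemap: https://www.hairlosstalk.com/sitemap_index.xml
```

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- sitemap_index.xml: one child sitemap per section of the site -->
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap><loc>https://www.hairlosstalk.com/post-sitemap.xml</loc></sitemap>
  <sitemap><loc>https://www.hairlosstalk.com/shop-sitemap.xml</loc></sitemap>
  <sitemap><loc>https://www.hairlosstalk.com/profile-sitemap.xml</loc></sitemap>
</sitemapindex>
```

Splitting by section also lets Search Console report indexation per section, which makes the profile experiment easy to measure.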
-
You should submit all sitemaps in Search Console and track the page indexation % for each XML sitemap.
-
You should also run a cohort report in Google Analytics to see if those with profile pages on your site have higher engagement rates than non-users and let that inform your decision as well. Do they purchase more than others? Visit more often? You don't want to chop off a community that contributes to the health of your site and revenue.
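The comparison itself is straightforward once you export session data - a rough sketch, with made-up record fields, done on an exported log rather than the GA API:

```python
# Compare engagement for profile owners vs. anonymous visitors from a
# session export. The "pages"/"purchased" fields are hypothetical.
def engagement_summary(sessions):
    """Average pages per session and purchase rate for a list of session records."""
    n = len(sessions)
    return {
        "pages_per_session": sum(s["pages"] for s in sessions) / n,
        "purchase_rate": sum(1 for s in sessions if s["purchased"]) / n,
    }

members = [{"pages": 8, "purchased": True}, {"pages": 5, "purchased": False}]
anonymous = [{"pages": 2, "purchased": False}, {"pages": 3, "purchased": False}]
# Comparing the two summaries shows whether profile owners engage (and buy) more.
```

If the member cohort clearly out-engages and out-purchases anonymous visitors, that's a strong argument against retiring the profiles regardless of their organic search value.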
-
From a UX perspective, it's jarring to have your headers and design differ among the WordPress, Magento, and blog sections. I'm sure you're aware of that, but it's definitely something that stands out.
-
Action items:
-
Rewrite all title tags on user profiles (auto-generated is OK) and test for 90 days to see if traffic changes
-
Optimize the profile pages with on-page best practices as well
-
Generate XML sitemaps for each section of your site, and submit them in Search Console
-
Create a programmatic way to deindex old or extra-thin profile pages
-
Run cohort reports on profile page owners vs anonymous visitors
There's your free mini-site audit! Let me know if you have other questions!
-
Related Questions
-
How does Google treat significant content changes to web pages and how should I flag them as such?
I have several pages (~30) that I plan to overhaul. The URLs will be identical and the theme of the content will be the same (still talking about the same widgets, using the same language), but I will be adding a lot more useful information for users, specifically things that I think will help with my fairly high bounce rate on these pages.

I believe the changes will be significant enough for Google to notice. I was wondering whether it goes "this is basically a new page now, I will treat it as such and rank accordingly" or "well, this content was rubbish last time I checked, so it is probably still not great."

My second question: is there a way I can get Google to specifically crawl a page it already knows about with fresh eyes? I know in Search Console I can ask Google to index new pages, and I've experimented with asking it to crawl a page Google already knows (it allows me to), but I couldn't see any evidence of it doing anything with that request.

Some background: the reason I'm doing this is that when these pages first ranked, they did very well (almost all first/second page for the terms I wanted). After about two weeks I noticed them sliding down. It doesn't look like the competition is getting any better, so my running theory is that they ranked well to begin with because they are well linked internally and the content is good/relevant, and one of the main things negatively impacting me (that Google couldn't know at the time) is bounce rate.
Search Behavior | tosbourn0
Correct approach to a business website with separate content for personal and business customers
I'm laying the groundwork for a fairly involved website for a telco that caters to both residential and B2B customers. I was browsing the websites of the likes of Verizon, AT&T, Sprint, and T-Mobile. What I saw is that they compartmentalize almost everything: all their business pages are in a business subdomain, all their investor info is in an investor subdomain, and so on. So I'm going to implement this strategy in this website update; I just want to make sure my idea makes sense and isn't a complete cluster****. I've attached a link to the mind map. Everything with "(sub)" attached to it is a subdomain; everything else is a page at the root level of the top domain. Most of the visitors we get are residential, so instead of first loading a portal that asks whether they're there for personal or business reasons, I'm considering forwarding all visitors to the top-level domain to the personal.example.com site. Is this okay, or would it be better to keep the content in the top-level domain rather than forwarding all traffic to a subdomain? Thank you! 1JY7DWw
Search Behavior | CucumberGroup0
Dupe Content: Canonicalize the Wordpress Tag or NoIndex?
Mozzers, here we go. I've read multiple posts over the years on taxonomy dupe content; in fact, I've read 10 articles tonight on taxonomies and categories. A little background: I am using WordPress with the Yoast SEO plugin.

Here is the scenario: we have 560 tags - some make sense, some do not. What do I do?

Do I not worry about it? Matt Cutts said twice that I should not stress about it, because in the worst non-spammy case, Google may just ignore the duplicate content. Matt said in the video, "I wouldn't stress about this unless the content that you have duplicated is spammy or keyword stuffing." (Found via Search Engine Land - http://searchengineland.com/googles-matt-cutts-duplicate-content-wont-hurt-you-unless-it-is-spammy-167459)

Do I noindex,follow the tags? Yoast and a Moz post both say I should noindex and follow the tags. From the post: "Tag, author, and date archives will all look too similar to other content. So it does not make sense to have them indexed." BUT the tags have been indexed for YEARS! And both articles go on to say, "if your blog has already existed for some time, and you've been indexing tags all along for example, you shouldn't just go deindexing them" (http://moz.com/blog/setup-wordpress-for-seo-success). So do I deindex tags that have been indexed for years? I checked the analytics, and in the past month tags have brought in less than 1% of traffic, but they are bringing in traffic.

Do I canonicalize the tags? Canonicalize the URL from "http://domain.com/blog/tag/addiction/" to "http://domain.com/blog/"? And if I canonicalize, would you canonicalize to /blog or to the base /tag?

Thanks for any and all help. I just want to clarify this issue. One of the reasons I ask is that I received a Moz report with a TON of dupe content warnings from the tags and categories.
Search Behavior | Thriveworks-Counseling2
Content marketing where articles aren't high traffic
Hello, if no one is writing articles in your niche and articles are very scarce in the top 100 landing pages, what does that tell you about content and content marketing in your niche?
Search Behavior | BobGW0
Best way to remove worthless/thin content?
I have a WordPress site with about 3,000 pages, and 1,000 of those are no-value/duplicate content and drive no traffic. They are blog posts, each with a single image and permalinks like example.com/post1, example.com/post2, etc. I've started by deleting pages and 301 redirecting to relevant pages that actually have content. Is deleting and 301 redirecting the best route? Is 1,000 too many 301 redirects? Should I just delete the pages that aren't really relevant to anything else? Anything else I should know about deleting all of these pages? Any help would be great!
Search Behavior | gfreeman230
Decline in engagement metrics, due to nav changes vs. content changes
With improvements in our rankings, we are seeing adverse changes in our measures of engagement. My gut reaction is that we are attracting more unqualified traffic, hence higher bounce rates and declines in pages/visit and time on site (approx. 15%, 15%, and 25%, respectively). While recent improvements in navigation might have contributed to these engagement declines, do you have any suggestions on how best to determine whether these declines are due to nav changes vs. copy/content issues? There's been no change in copy content during this period. Thanks.
Search Behavior | ahw0
Books about Content Marketing & Persona creation?
Hello SEOmoz, first question here on the forum, but a silent follower for years 🙂 I'm looking for a good book - you define what is "good" - about content marketing and/or persona creation that you have read and that proved usable in real-life situations. I've read "Accelerate" but found it too lightweight, so your recommendations would come in very handy. Looking forward to your replies! Best regards, Nikolaas
Search Behavior | TheReference1
Geo-targeting / Presenting Unique Content
A client is debating housing two websites under one URL. The sites would offer similar services at different price points. For example, a user coming from a San Francisco IP would be presented with the "high-end" packages, while a user coming from Dallas would get the "low-budget" content. What are the SEO implications? I know that automatic geo-targeting can be risky; IP locators aren't always accurate (especially from a mobile device). Any advice? Basically, the client wants to make sure that a Dallas user will be presented with the "right" keywords in the SERPs. What would you recommend? Thanks!
Search Behavior | lhc670