How to deal with auto-generated pages on our site that are considered thin content
-
Hi there,
Wondering how to deal with about 300+ pages on our site that are auto-generated and considered thin content. Here is an example of those pages: https://app.cobalt.io/ninp0
The pages are auto-generated when a new security researcher joins our team and then filled in by each researcher with specifics about their personal experience. Additionally, there is a fair amount of dynamic content on these pages that updates with certain activities.
These pages are also getting flagged as not having a canonical tag on them; however, they are technically different pages, just with very similar elements. I'm not sure I would want to put a canonical tag on them, as some of them have decent page authority and I think could be contributing to our overall SEO health.
Any ideas on how I should deal with this group of similar but not identical pages?
-
All content marketing, whether it's main pages or blog posts, needs to be written in a white-hat way.
Do read about Google E-E-A-T. The pages need to be very high quality, or they won't rank well in the Google SERPs.
We got a Bristol garden office company onto page one of Google, mainly by writing evergreen content marketing.
-
Ahh, this helped answer my question: https://moz.com/community/q/rel-canonical-tag-back-to-the-same-page-the-tag-is-on
Thanks, all. Going to mark this question as answered.
-
I guess I just don't understand the purpose of doing that. If it's the original, why put the tag on it? What additional information is that tag providing? (I'm a novice here and appreciate your help explaining this.)
-
Why are you confused? It's totally normal for a page to point to itself as the canonical. In the end, most pages are the canonical version of themselves.
Quoting this from the help center here at Moz: "It’s ok if a canonical tag points to the current URL. In other words, if URLs X, Y, and Z are duplicates, and X is the canonical version, it’s ok to put the tag pointing to X on URL X. This may sound obvious, but it’s a common point of confusion." - What is canonicalization?
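To make that concrete: for the example profile page from the question, the self-referencing tag in that page's head would look roughly like this (the URL is simply the page's own address):

    <!-- inside the <head> of the profile page -->
    <link rel="canonical" href="https://app.cobalt.io/ninp0" />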
-
Thanks so much for this. I'm a little confused about pointing a canonical tag to itself. Is there any information on this you can send over that would help me understand? Or an example of what that tag would look like?
Thanks, Martijn
-
Hey!
My two cents: just have them indexed by the search engines and put a canonical tag on each of them pointing to itself. I would definitely make the case that these pages are unique enough not to be seen as duplicates. Your problem is one that many sites with dynamic pages have. In the end, an e-commerce site is barely any different: you have product data, but the template remains the same, and that's totally something you can optimize for.
So: they're all unique, let them point to themselves, and try to improve them even more. You potentially have a very rich data set here that you can leverage further for SEO.
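For example, each researcher profile could output something along these lines in its head; the title and description wording below is just placeholder text, and the canonical URL is simply the page's own address (using the example profile from the question):

    <head>
      <!-- unique, per-researcher details pulled from the profile data (placeholder wording) -->
      <title>ninp0 - Security Researcher Profile</title>
      <meta name="description" content="Experience, skills, and recent activity for security researcher ninp0." />
      <!-- self-referencing canonical: the page's own URL -->
      <link rel="canonical" href="https://app.cobalt.io/ninp0" />
    </head>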
Martijn.
-
If you don't want to remove these from the index and want them to provide more value, try adding more fields and raising the minimum amount of text required in the description.
Moz's user profile page is a good example: https://i.imgur.com/C4YR7AE.png
There's not exactly a lot of content, but every field is unique to every user.