Are online tools considered thin content?
-
My website has a number of simple converters.
For example, this one converts spaces to commas
https://convert.town/replace-spaces-with-commas
Now, obviously there are loads of different variations I could create of this:
Replace spaces with semicolons
Replace semicolons with tabs
Replace full stops with commas
Similarly with files:
JSON to XML
XML to PDF
JPG to PNG
JPG to TIF
JPG to PDF
(and thousands more)
If someone types one of those into Google, they will be happy because they can immediately use the tool they were hunting for.
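Under the hood, each of these text tools is the same trivial transform; here is a minimal sketch in Python (illustrative only, not the actual convert.town code):

    def replace_chars(text: str, find: str, replace_with: str) -> str:
        # The same generic find/replace powers every "different" tool.
        return text.replace(find, replace_with)

    replace_chars("a b c", " ", ",")   # spaces -> commas:     "a,b,c"
    replace_chars("a b c", " ", ";")   # spaces -> semicolons: "a;b;c"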
It is obvious what these pages do, so I do not want to clutter the pages up with unnecessary content.
However, would these be considered doorway pages or thin content, or would it be acceptable (from an SEO perspective) to generate thousands of pages based on all the permutations?
-
Ah - sorry for my misunderstanding. So you are leaning towards combining the pages.
So unit-conversion.info has a combined page: http://www.unit-conversion.info/metric.html
When I search for "convert from micro to deci", they appear as number 8. If I click on their page, it defaults to base and mega, so I then have to change the dropdowns.
The number 1 result for that search is this page https://www.unitconverters.net/prefixes/micro-to-deci.htm - it has Micro and Deci preselected.
unit-conversion.info only has 460 pages indexed by Google, but Unitconverters.net has 50,000. Despite the "thin content", they still appear at number 1 (admittedly, this may be due to other factors).
As far as user experience goes, I would prefer to land on unitconverters.net because I have fewer things to click.
I guess the art is in finding the sweet spot - giving searchers a result with context, without spinning out too much thin content.
Thanks again for your detailed response!
-
Hi again,
sorry if I have not expressed myself very well.
In my opinion, you should have just one page for each of those tools (with all the conversion options). Throughout the text of that page (plus the title and meta description), you would optimize the generic keywords like "replace character tool" and "replace characters online", as well as the conversion-specific ones like "replace spaces with commas", without overdoing it, to avoid keyword stuffing / spam.
The same goes for the Convert Image Tool: just one page, like these people did at unit-conversion.info with their text conversion tool and all the others.
More pages than that would surely create thin content and would divide the authority between all those pages, instead of concentrating it in one quality page that optimizes, in its text and meta tags, the most searched conversion options for each tool.
In any case, if you create additional pages for the most commonly searched-for variants (just a few), that could be acceptable as you said.
Greetings!
-
Yes, I was thinking along the same lines - if I create a page for a commonly searched-for variant, then that will be an acceptable "thin page".
OK, so if I understand correctly, you would suggest having one generic "replace text" page. The phrase variants - "replace character tool", "replace characters online", "replace text tool" - should appear throughout that same page (not on separate pages).
The following SEPARATE pages would have the find / replace textboxes of the generic converter prefilled (because they are commonly searched for):
- Replace spaces with commas
- Replace spaces with semicolons
- Replace semicolons with spaces
- Replace and with &
...and all other common but relevant search phrases
But you would NOT create a separate page for:
- Replace question mark with space
- Replace the letter t with the letter b
Does that sound right to you?
Then for the Convert Image tool, wouldn't it be best (in a similar fashion) to have one generic tool, but with the common searches prefilled on separate pages:
- Convert image to image
- Convert Image to GIF
- Convert PNG to JPG
- Convert PNG to GIF
(and perhaps 100 others)
Each of these tools is different in functionality, and surely each will be more helpful to the user if it is prefilled with what they are looking for?
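To make that concrete, here is a rough sketch of how I imagine wiring up the prefilled pages - one generic tool plus a whitelist of popular variants (the slugs and values below are just illustrative):

    # One generic tool; only whitelisted popular variants get their
    # own URL with the form prefilled.
    POPULAR_VARIANTS = {
        "replace-spaces-with-commas":     (" ", ","),
        "replace-spaces-with-semicolons": (" ", ";"),
        "replace-semicolons-with-spaces": (";", " "),
    }

    def form_defaults(slug):
        # Popular variants get prefilled, indexable pages; any other
        # permutation (e.g. "replace-t-with-b") falls back to the
        # generic, empty tool instead of getting its own thin page.
        if slug in POPULAR_VARIANTS:
            find, replace = POPULAR_VARIANTS[slug]
            return {"find": find, "replace": replace, "indexable": True}
        return {"find": "", "replace": "", "indexable": False}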
-
So I guess that is actually my argument - that each tool deserves its own page (if it is something commonly searched for). The user experience is not as good if they search for "convert spaces to semicolons", then land on a page where they have to also enter a space and a semicolon before they get what they want. If these are prefilled, surely the user would prefer that. Will Google realise that users prefer that though? That is the big question.
OK - if I don't fill the page with spam, then it won't be considered a gateway page.
Thank you for your response.
-
Hi
It's a difficult question.
On the one hand, it would be useful for the searcher to have direct access to the tool with the exact function they are looking for.
On the other hand, many functions are very similar, so their pages would surely have very similar content that doesn't provide any new or interesting information (thin content).
I think you should aim for the point between these two extremes. I mean, you can create many different tools, but each tool should group all of its similar functions.
For example:
Replace Character Tool (with it you can replace any character or text with any other). Here is an example of this kind of tool: http://www.unit-conversion.info/texttools/replace-text/. On this one page you can moderately optimize all the keywords related to the different functions by mentioning them in the text, the h1-h2-h3 headings, or the Title / Meta Description. Don't try to optimize every variant, because there are too many; go for the most searched ones (use Google Keyword Planner or a similar tool to identify them). You should also optimize the variants of "replace character tool", like "replace characters online" or "replace text tool" (it is also important to use "free" if the tools are free).
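As a rough illustration (the keyword strings below are made-up examples, not researched terms), the metadata for that single combined page might look something like this:

    # Hypothetical metadata for ONE combined page, working the most
    # searched variants into the title, description and headings
    # instead of spinning out a separate page per permutation.
    page_meta = {
        "title": "Free Replace Text Tool - Replace Characters Online",
        "meta_description": (
            "Free online tool to replace any character or text: "
            "replace spaces with commas, semicolons with tabs, and more."
        ),
        "h1": "Replace Character Tool",
        "h2": ["Replace spaces with commas", "Replace semicolons with tabs"],
    }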
The same goes for image conversion with a Convert Image Tool ("online picture conversion" + "free convert img tool"... + the most popular image format conversions like "png to jpg conversion tool"), all on the same page.
Hope that helps!
-
Hi there,
My personal recommendation here, if possible, would be to compile all of the tools onto one easy-to-use page per category: all of the file-converting permutations on one page, and all of the "replace" tools on another.
Not only would this be a better user experience, but you also wouldn't clog up your site with thin pages from the many permutations.
You could of course argue that each tool deserves its own page because technically they each do different things.
What would make any one of these pages into a gateway page is if you bulked them out with a large amount of content that was specifically designed for search engines.
I hope this helps to answer your question.