Are online tools considered thin content?
-
My website has a number of simple converters.
For example, this one converts spaces to commas
https://convert.town/replace-spaces-with-commas
Now, obviously there are loads of different variations I could create of this:
Replace spaces with semicolons
Replace semicolons with tabs
Replace full stops with commas
Similarly with files:
JSON to XML
XML to PDF
JPG to PNG
JPG to TIF
JPG to PDF
(and thousands more)
If someone types one of those into Google, they will be happy because they can immediately use the tool they were hunting for.
It is obvious what these pages do so I do not want to clutter the page up with unnecessary content.
However, would these be considered doorway pages or thin content or would it be acceptable (from an SEO perspective) to generate 1000s of pages based on all the permutations?
-
Ah - sorry for my misunderstanding. So you are leaning towards combining the pages.
So unit-conversion.info has a combined page: http://www.unit-conversion.info/metric.html
When I search for "convert from micro to deci", they appear as number 8. If I click on their page, it defaults to base and mega, so I then have to change the dropdowns.
The number 1 result for that search is this page https://www.unitconverters.net/prefixes/micro-to-deci.htm - it has Micro and Deci preselected.
unit-conversion.info has only 460 pages indexed by Google, while unitconverters.net has 50,000. Despite the "thin content", they still appear at number 1 (admittedly, this may be due to other factors).
As far as user experience goes, I would prefer to land on unitconverters.net because I have fewer things to click.
I guess the art is in finding the sweet spot in being able to give a search result with context without spinning out too much thin content.
Thanks again for your detailed response!
-
Hi again,
sorry if I have not expressed myself very well.
In my opinion, you would have only one page for each of those tools (with all the conversion options). Within the text of that page (plus the title and meta description), you would optimize the generic keywords like "replace character tool" and "replace characters online"... and the conversion-specific ones like "replace spaces with commas", without overdoing it, to avoid keyword stuffing / spam.
The same for the Convert Image Tool: just one page, like these people did at unit-conversion.info with the text conversion tool and all the others.
More pages than that would surely create thin content and would divide your authority between all those pages, instead of concentrating it all in one quality page that optimizes the most-searched conversion options of each tool in its text and metas.
In any case, if you create additional pages for the most commonly searched-for variants (just a few), that could be acceptable as you said.
Greetings!
-
Yes I was thinking along the same lines - if I create a page for commonly searched-for variants, then that will be an acceptable "thin page".
OK, so if I understand correctly, you would suggest having one generic "replace text" page. The phrase variants - "replace character tool", "replace characters online", "replace text tool", should appear throughout that same page (not on separate pages).
The following SEPARATE pages would have the find / replace textboxes of the generic converter prefilled (because they are commonly searched for):
- Replace spaces with commas
- Replace spaces with semicolons
- Replace semicolons with spaces
- Replace and with &
...and all other common but relevant search phrases
But you would NOT create a separate page for:
- Replace question mark with space
- Replace the letter t with the letter b
Does that sound right to you?
Then for the Convert Image tool, wouldn't it be best (in a similar fashion) to have one generic tool but then the common searches prefilled on separate pages:
- Convert image to image
- Convert Image to GIF
- Convert PNG to JPG
- Convert PNG to GIF
(and perhaps 100 others)
Each of these tools is different in functionality and will be more helpful to the user if it is prefilled with what they are looking for, right?
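To make that concrete, the prefilled variant pages described above could all share a single implementation, with only the URL differing per variant. Here is a minimal sketch of that idea; the slug format and token mapping are my own illustration, not taken from convert.town or any real site:

```python
import re

# Illustrative mapping from slug tokens to the characters they stand for.
# These names are hypothetical, chosen to match the examples in this thread.
TOKENS = {
    "spaces": " ",
    "commas": ",",
    "semicolons": ";",
    "tabs": "\t",
    "fullstops": ".",
}

def prefill_from_slug(slug: str) -> tuple[str, str]:
    """Return (find, replace) values to preload into the tool's textboxes.

    A variant landing page like /replace-spaces-with-semicolons prefills
    both boxes; any other slug falls back to the empty generic tool.
    """
    match = re.fullmatch(r"replace-(\w+)-with-(\w+)", slug)
    if not match:
        return "", ""  # generic page: leave both boxes empty
    find_token, replace_token = match.groups()
    return TOKENS.get(find_token, ""), TOKENS.get(replace_token, "")
```

With something like this, "convert spaces to semicolons" searchers land with the work already done, while the site maintains one tool rather than thousands of near-duplicate pages.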
-
So I guess that is actually my argument - that each tool deserves its own page (if it is something commonly searched for). The user experience is not as good if they search for "convert spaces to semicolons", then land on a page where they have to also enter a space and a semicolon before they get what they want. If these are prefilled, surely the user would prefer that. Will Google realise that users prefer that though? That is the big question.
OK - if I don't fill the page with spam, then it won't be considered a gateway page.
Thank you for your response.
-
Hi
It's a difficult question.
On one side, it would be useful for the searcher to land directly on the tool with the exact function they are looking for.
On the other, many functions are very similar, and they would surely have very similar content that doesn't provide new, interesting information (thin content).
I think you should aim for the point between these two sides. I mean, you can create many different tools, but tools that group all similar functions.
For example:
Replace Character Tool (with this you can replace any character or text with any other). Here you have an example of this tool: http://www.unit-conversion.info/texttools/replace-text/. In this tool you can moderately optimize all the keywords related to the different functions by mentioning them in the text, the h1-h2-h3 headings, or in the title / meta description. Don't try to optimize every variant, because there are too many; go for the most searched ones (use Google Keyword Planner or a similar tool to identify them). You should also optimize the variants of "replace character tool", like "replace characters online" or "replace text tool" (it's also important to use "free" if the tools are free).
The same for image conversion with a Convert Image Tool ("online picture conversion" + "free convert img tool"... + the most popular image format conversions like "png to jpg conversion tool"), all on the same page.
Hope that helps!
-
Hi there,
My personal recommendation here, if possible, would be to compile all of the tools into one easy-to-use page. So all of the file-converting permutations would be on one page, and all of the 'replace' tools would be on another page.
Not only would this be a better user experience, but you also wouldn't clog up your site with thin pages from the multiple permutations.
You could of course argue that each tool deserves its own page because technically they each do different things.
What would turn any one of these pages into a gateway page is if you bulked them out with a large amount of content that was specifically designed for search engines.
I hope this helps to answer your question.