Are online tools considered thin content?
-
My website has a number of simple converters.
For example, this one converts spaces to commas:
https://convert.town/replace-spaces-with-commas
Now, obviously, there are loads of different variations I could create of this:
Replace spaces with semicolons
Replace semicolons with tabs
Replace full stops with commas
Similarly with files:
JSON to XML
XML to PDF
JPG to PNG
JPG to TIF
JPG to PDF
(and thousands more)
If someone types one of those into Google, they will be happy because they can immediately use the tool they were hunting for.
It is obvious what these pages do so I do not want to clutter the page up with unnecessary content.
However, would these be considered doorway pages or thin content, or would it be acceptable (from an SEO perspective) to generate thousands of pages based on all the permutations?
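For context, the core of every one of these text converters is the same one-line substitution; only the parameters differ. A minimal sketch (the function name is illustrative, not the site's actual code):

```python
def replace_chars(text, find, replace):
    """Core of every converter variant: a plain text substitution."""
    return text.replace(find, replace)

# "Replace spaces with commas" is just one parameterization:
print(replace_chars("a b c", " ", ","))  # a,b,c
# "Replace semicolons with tabs" is another:
print(replace_chars("a;b;c", ";", "\t"))
```

This is exactly why the thin-content question arises: thousands of pages could be generated from one function and a list of parameter pairs.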
-
Ah - sorry for my misunderstanding. So you are leaning towards combining the pages.
So unit-conversion.info has a combined page: http://www.unit-conversion.info/metric.html
When I search for "convert from micro to deci", they appear as number 8. If I click on their page, it defaults to base and mega, so I then have to change the dropdowns.
The number 1 result for that search is this page https://www.unitconverters.net/prefixes/micro-to-deci.htm - it has Micro and Deci preselected.
unit-conversion.info has only 460 pages indexed by Google, while unitconverters.net has 50,000. Despite the "thin content", they still appear at number 1 (admittedly, this may be due to other factors).
As far as user experience goes, I would prefer to land on unitconverters.net because I have fewer things to click.
I guess the art is in finding the sweet spot in being able to give a search result with context without spinning out too much thin content.
Thanks again for your detailed response!
-
Hi again,
sorry if I have not expressed myself very well.
In my opinion, you should have only one page for each of those tools (with all the conversion options). The text of that page (plus its title and meta description) would optimize the generic keywords like "replace character tool" and "replace characters online", as well as the conversion-specific ones like "replace spaces with commas", without overdoing it, to avoid keyword stuffing / spam.
The same goes for the Convert Image Tool: just one page, like these people did at unit-conversion.info with the text conversion tool and all the others.
More pages than that would surely create thin content and would divide the authority between all those pages, instead of concentrating all that authority in one quality page that optimizes, in its text and metas, the most-searched conversion options for each tool.
In any case, if you create additional pages for the most commonly searched-for variants (just a few), that could be acceptable as you said.
Greetings!
-
Yes I was thinking along the same lines - if I create a page for commonly searched-for variants, then that will be an acceptable "thin page".
OK, so if I understand correctly, you would suggest having one generic "replace text" page. The phrase variants - "replace character tool", "replace characters online", "replace text tool", should appear throughout that same page (not on separate pages).
The following SEPARATE pages would have the find / replace textboxes of the generic converter prefilled (because they are commonly searched for):
- Replace spaces with commas
- Replace spaces with semicolons
- Replace semicolons with spaces
- Replace and with &
...and all other common but relevant search phrases
But you would NOT create a separate page for:
- Replace question mark with space
- Replace the letter t with the letter b
Does that sound right to you?
Then for the Convert Image tool, wouldn't it be best (in a similar fashion) to have one generic tool but then the common searches prefilled on separate pages:
- Convert image to image
- Convert Image to GIF
- Convert PNG to JPG
- Convert PNG to GIF
(and perhaps 100 others)
Each of these tools is different in functionality and will be more helpful to the user if it is prefilled with what they are looking for, right?
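One way those prefilled variant pages could be wired up (the slugs and helper here are hypothetical, a sketch rather than a recommendation): a small whitelist maps each commonly searched URL slug to prefilled defaults, and every other combination falls back to the single generic tool page with empty textboxes.

```python
# Hypothetical whitelist: only commonly searched-for variants get their own page.
COMMON_VARIANTS = {
    "replace-spaces-with-commas": (" ", ","),
    "replace-spaces-with-semicolons": (" ", ";"),
    "replace-semicolons-with-spaces": (";", " "),
}

def resolve_page(slug):
    """Return (find, replace) defaults for a known variant slug,
    or None, meaning: serve the generic tool with empty textboxes."""
    return COMMON_VARIANTS.get(slug)

print(resolve_page("replace-spaces-with-commas"))        # (' ', ',')
print(resolve_page("replace-question-mark-with-space"))  # None
```

The whitelist keeps the page count bounded by actual search demand instead of exploding across every permutation.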
-
So I guess that is actually my argument - that each tool deserves its own page (if it is something commonly searched for). The user experience is not as good if they search for "convert spaces to semicolons", then land on a page where they have to also enter a space and a semicolon before they get what they want. If these are prefilled, surely the user would prefer that. Will Google realise that users prefer that though? That is the big question.
OK - if I don't fill the page with spam, then it won't be considered a gateway page.
Thank you for your response.
-
Hi
It's a difficult question.
On the one hand, it would be convenient for the searcher to land directly on the tool with the exact function they are looking for.
On the other hand, many functions are very similar, so their pages will surely have very similar content that doesn't provide new interesting information (thin content).
I think you should aim for the point between these two sides. I mean, you can create many different tools, but tools that group all similar functions.
For example:
Replace Character Tool (with this you can replace any character or text with any other). Here is an example of this kind of tool: http://www.unit-conversion.info/texttools/replace-text/. On this one page you can moderately optimize all the keywords related to the different functions by mentioning them in the text, in the h1/h2/h3 headings, or in the title / meta description. Don't try to optimize every different variant, because there are too many; go for the most-searched ones (use Google Keyword Planner or a similar tool to identify them). You should also optimize the variants of "replace character tool", like "replace characters online" or "replace text tool" (it's also important to use "free" if the tools are free).
The same goes for image conversion with a Convert Image Tool ("online picture conversion" + "free convert img tool"... + the most popular image format conversions like "png to jpg conversion tool"), all on the same page.
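As a concrete illustration of that moderate on-page optimization (the wording and function name below are my own, not a proven SEO template), the generic page's title and meta description could weave in the generic keywords plus a few of the most-searched variants:

```python
# Illustrative: a handful of top-searched variants, found via a keyword tool.
TOP_VARIANTS = ["replace spaces with commas", "replace spaces with semicolons"]

def build_meta(tool_name="Replace Text Tool"):
    """Assemble one page's title and meta description covering the generic
    keywords plus the most-searched variants (hypothetical wording)."""
    title = f"Free {tool_name} - Replace Characters Online"
    description = ("Free online tool to replace any character or text with another. "
                   "Popular conversions: " + ", ".join(TOP_VARIANTS) + ".")
    return title, description

title, description = build_meta()
print(title)
print(description)
```

The point is that one page carries all the variant keywords in its text and metas, rather than each variant spawning its own thin page.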
Hope that helps!
-
Hi there,
My personal recommendation here, if possible, would be to compile all of the tools into one easy-to-use page each: all of the file-converting permutations on one page, and all of the 'replace' tools on another.
Not only would this be a better user experience, but you also wouldn't clog up your site with thin pages from the multiple permutations.
You could of course argue that each tool deserves its own page because technically they each do different things.
What would make any one of these pages into a gateway page is if you bulked them out with a large amount of content that was specifically designed for search engines.
I hope this helps to answer your question.