Are online tools considered thin content?
-
My website has a number of simple converters.
For example, this one converts spaces to commas
https://convert.town/replace-spaces-with-commas
Now, obviously there are loads of different variations I could create of this:
Replace spaces with semicolons
Replace semicolons with tabs
Replace full stops with commas
Similarly with files:
JSON to XML
XML to PDF
JPG to PNG
JPG to TIF
JPG to PDF
(and thousands more)
If someone types one of those into Google, they will be happy because they can immediately use the tool they were hunting for.
It is obvious what these pages do, so I do not want to clutter the page up with unnecessary content.
However, would these be considered doorway pages or thin content or would it be acceptable (from an SEO perspective) to generate 1000s of pages based on all the permutations?
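To see how quickly "all the permutations" explodes, the page count can be sketched with a quick calculation (the format list here is hypothetical, not the site's actual catalogue):

```python
from itertools import permutations

# Hypothetical list of supported image formats.
formats = ["JPG", "PNG", "GIF", "TIF", "PDF", "BMP", "WEBP"]

# Each ordered (source, target) pair with source != target would be
# its own landing page under the one-page-per-conversion approach.
pairs = list(permutations(formats, 2))

print(len(pairs))  # just 7 formats already yield 42 distinct converter pages
```

With 20 formats that becomes 380 pages, which is why the thin-content question matters here.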
-
Ah - sorry for my misunderstanding. So you are leaning towards combining the pages.
So unit-conversion.info has a combined page: http://www.unit-conversion.info/metric.html
When I search for "convert from micro to deci", they appear as number 8. If I click on their page, it defaults to base and mega, so I then have to change the dropdowns.
The number 1 result for that search is this page https://www.unitconverters.net/prefixes/micro-to-deci.htm - it has Micro and Deci preselected.
unit-conversion.info has only 460 pages indexed by Google, while unitconverters.net has 50,000. Despite the "thin content", they still rank number 1 (admittedly, this may be due to other factors).
As far as user experience goes, I would prefer to land on unitconverters.net because I have fewer things to click.
I guess the art is in finding the sweet spot in being able to give a search result with context without spinning out too much thin content.
Thanks again for your detailed response!
-
Hi again,
sorry if I have not expressed myself very well.
In my opinion, you should have only one page for each of those tools (with all the conversion options). Within the text of that page (plus the title and meta description), you would optimize the generic keywords like "replace character tool" and "replace characters online", as well as the conversion-specific ones like "replace spaces with commas", without overdoing it, to avoid keyword stuffing / spam.
The same for the Convert Image Tool: just one page, like these people did at unit-conversion.info with their text conversion tool and all the others.
More pages than that would surely create thin content and would divide the authority between all those pages, instead of concentrating all of that authority in one quality page that optimizes the most-searched conversion options of each tool in its text and metas.
In any case, if you create additional pages for the most commonly searched-for variants (just a few), that could be acceptable as you said.
Greetings!
-
Yes I was thinking along the same lines - if I create a page for commonly searched-for variants, then that will be an acceptable "thin page".
OK, so if I understand correctly, you would suggest having one generic "replace text" page. The phrase variants - "replace character tool", "replace characters online", "replace text tool", should appear throughout that same page (not on separate pages).
The following SEPARATE pages would have the find / replace textboxes of the generic converter prefilled (because they are commonly searched for):
- Replace spaces with commas
- Replace spaces with semicolons
- Replace semicolons with spaces
- Replace and with &
...and all other common but relevant search phrases
But you would NOT create a separate page for:
- Replace question mark with space
- Replace the letter t with the letter b
Does that sound right to you?
Then for the Convert Image tool, wouldn't it be best (in a similar fashion) to have one generic tool but then the common searches prefilled on separate pages:
- Convert image to image
- Convert Image to GIF
- Convert PNG to JPG
- Convert PNG to GIF
(and perhaps 100 others)
Each of these tools is different in functionality and will be more helpful to the user if it is prefilled with what they are looking for?
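The prefill idea above can be sketched as a single generic converter shared by every variant page, where each variant URL slug only supplies different defaults. This is a minimal illustration (the slugs and stack are hypothetical, not the poster's actual site):

```python
# One generic replace tool; each "variant" landing page merely prefills it.
# Slug-to-defaults mapping for the commonly searched variants.
VARIANT_DEFAULTS = {
    "replace-spaces-with-commas": (" ", ","),
    "replace-spaces-with-semicolons": (" ", ";"),
    "replace-semicolons-with-spaces": (";", " "),
}

def replace_tool(text, find, replace):
    """The single generic converter behind every variant page."""
    return text.replace(find, replace)

def render_variant(slug, text):
    """Resolve a variant URL slug to its prefilled defaults, then convert."""
    find, replace = VARIANT_DEFAULTS.get(slug, ("", ""))
    return replace_tool(text, find, replace)

print(render_variant("replace-spaces-with-semicolons", "a b c"))  # a;b;c
```

Because all variants share one implementation, only the thin landing pages multiply, not the tool code itself.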
-
So I guess that is actually my argument - that each tool deserves its own page (if it is something commonly searched for). The user experience is not as good if they search for "convert spaces to semicolons", then land on a page where they have to also enter a space and a semicolon before they get what they want. If these are prefilled, surely the user would prefer that. Will Google realise that users prefer that though? That is the big question.
OK - if I don't fill the page with spam, then it won't be considered a gateway page.
Thank you for your response.
-
Hi
It's a difficult question.
On one hand, it would be useful for the searcher to have direct access to the tool with the exact function they are looking for.
On the other, many functions are very similar, and their pages will surely have very similar content that doesn't provide new, interesting information (thin content).
I think you should aim for a middle ground between these two sides: create many different tools, but make each tool group all of its similar functions.
For example:
Replace Character Tool (with it you can replace any character or text with any other). Here is an example of such a tool: http://www.unit-conversion.info/texttools/replace-text/. On this one page you can moderately optimize all the keywords related to the different functions by mentioning them in the text, the h1/h2/h3 headings, and the title / meta description. Don't try to optimize every variant, because there are too many; go for the most-searched ones (use Google Keyword Planner or a similar tool to identify them). You should also optimize the variants of "replace character tool", like "replace characters online" or "replace text tool" (it's also important to use "free" if the tools are free).
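Titles and meta descriptions for the handful of most-searched variants could be generated from one template rather than hand-written per page. A hedged sketch, with made-up variant data standing in for real Keyword Planner output:

```python
# Hypothetical top variants; in practice these would come from
# Google Keyword Planner or a similar keyword research tool.
top_variants = [
    ("spaces", "commas"),
    ("spaces", "semicolons"),
    ("semicolons", "spaces"),
]

def page_meta(find_name, replace_name):
    """Build a title and meta description for one prefilled variant page."""
    title = f"Replace {find_name.title()} with {replace_name.title()} - Free Online Tool"
    description = (
        f"Free online tool to replace {find_name} with {replace_name}. "
        "Paste your text and convert instantly."
    )
    return title, description

for find_name, replace_name in top_variants:
    title, description = page_meta(find_name, replace_name)
    print(title)  # e.g. Replace Spaces with Commas - Free Online Tool
```

Note how "free" and the tool keywords land naturally in both fields, which is the moderate optimization described above, without stuffing every permutation onto the page.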
The same goes for image conversion with a Convert Image Tool ("online picture conversion" + "free convert img tool"... plus the most popular image format conversions like "png to jpg conversion tool"), all on the same page.
Hope that helps!
-
Hi there,
My personal recommendation here, if possible, would be to compile all of the tools into one easy-to-use page: all of the file-converting permutations under one page, and all of the 'replace' tools under another.
Not only would this be a better user experience, but you also wouldn't clog up your site with thin pages from the multiple permutations.
You could of course argue that each tool deserves its own page because technically they each do different things.
What would turn any one of these pages into a gateway page is bulking them out with a large amount of content designed specifically for search engines.
I hope this helps to answer your question.