Publishing pages with thin content, update later?
-
So I have about 285 pages I created with very, very thin content on each. Each is unique, and each serves its own purpose.
My question is, do you guys think it is wise to publish all of these at once to just get them out there and update each as we go along? Each page is very laser targeted and I anticipate that a large handful will actually rank soon after publishing.
Thanks!
Tom
-
Each location has its own page, and each location page lists its departments, each with a page of its own. Each department page then has some content such as the NAP, an employee directory, and links to other useful pages on the website.
If this means many pages for each location, then I would worry about them. However, if all of this information is on a single page, you might be fine. If I owned a company like this, I would require each location to give me substantive content.
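To see how quickly a location-times-department structure multiplies into thin pages, here is a minimal sketch; the counts and names are made up for illustration, not taken from Tom's site:

```python
# Hypothetical example: one page per location plus one page per department
# at each location. Small inputs still multiply into a lot of pages.
locations = [f"location-{i}" for i in range(1, 20)]   # 19 locations (assumed)
departments = ["sales", "service", "parts"]           # 3 departments (assumed)

pages = []
for loc in locations:
    pages.append(f"/{loc}/")                          # one page per location
    for dept in departments:
        pages.append(f"/{loc}/{dept}/")               # one page per department

print(len(pages))  # 19 + 19*3 = 76 pages, each needing substantive content
```

If each of those pages carries only a NAP block and a directory, the thin-page count grows linearly with every location added.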
Also, if I "noindex" the pages to start, add some good content, then "index" them, how long in your experience has it taken until you saw a considerable increase in traffic / saw those pages indexed?
I republished two of my thin-content pages last week. These had been noindexed for about two years. They were upgraded from two or three sentences and one photo to nearly 1,000 words and four or five photos. One appeared in the index about five days later and went straight to #4 for a moderately difficult single-word query. That single word is the name of a software product and of a type of "gold" in the Minecraft video game, and it has a lot of competition from .gov and .edu sites.
The second one was published about eight days ago and we have not seen it in the SERPs yet. This is an unusually long wait for a republished page on this site, which has a DA of about 80.
The way I would approach it would be to crawl those pages manually in Search Console (RIP Webmaster Tools) once I updated the "index" tag.
I have never done this. I just republish the page.
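Besides requesting a crawl in Search Console, another common way to nudge a recrawl of republished pages is to list them in a sitemap with a fresh lastmod date and resubmit it. This is a hedged sketch, not the poster's method; the URL is a placeholder:

```python
# Minimal sitemap generator for republished pages. Listing updated URLs
# with today's <lastmod> and resubmitting the sitemap in Search Console
# is one way to signal that the pages have changed.
from datetime import date
from xml.sax.saxutils import escape

def build_sitemap(urls):
    """Return a minimal sitemap.xml string with today's lastmod for each URL."""
    today = date.today().isoformat()
    entries = "\n".join(
        f"  <url><loc>{escape(u)}</loc><lastmod>{today}</lastmod></url>"
        for u in urls
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n"
        "</urlset>\n"
    )

print(build_sitemap(["https://example.com/updated-page/"]))
```

There is no guarantee of a faster crawl, but it gives Google an explicit list of the pages whose robots tag just changed.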
-
Thanks Andy, I appreciate the response. This was a semi-large project with the main goal of capturing hyper-local leads. I guess once you throw locations into the mix it runs an even bigger chance of being hit, due to the popular practice of creating a page for every damn city in the country in hopes of ranking locally.
Fortunately we have real locations across the US but I don't want Google to think we're trying to dupe anyone.
Thanks again
Tom
-
That's the answer I was expecting. The website I'm referencing has about 4,000 indexed pages, and those 285 may be enough to do some damage.
To give you an example (this mimics exactly what I'm doing), take a business with multiple locations. Each location has its own page, and each location page lists its departments, each with a page of its own. Each department page then has some content such as the NAP, an employee directory, and links to other useful pages on the website. Yea or nay to that?
Also, if I "noindex" the pages to start, add some good content, then "index" them, how long in your experience has it taken until you saw a considerable increase in traffic / saw those pages indexed? I know that's a site-by-site, page-by-page kind of question, but I'm curious to know.
The way I would approach it would be to crawl those pages manually in Search Console (RIP Webmaster Tools) once I updated the "index" tag.
Thoughts?
Thanks!
Tom
-
Hi
I agree with the above; you run the risk of getting hit by Panda. If these pages are important to have live to help customers, then surely your priority should be to get good content on there to help your customers and potential customers. If they land on a low-quality page with very little content, are they likely to stick around?
I wouldn't put any of them live until you have the content sorted. I would work out the priorities, start there, and put each page live once its content is good.
There is probably a Panda update around the corner, and you don't want to get hit with it and then be waiting for Google to release the next version to get out of it.
I wouldn't even run the risk of putting them live with noindex.
Unless, of course, as said above, you have 100,000+ pages of amazing-quality content; then it probably won't affect you.
Thanks
Andy
-
In my opinion, publishing a lot of thin content pages will get you into trouble with the Panda algorithm. One of my sites had a lot of these types of pages and it was hit with a Panda problem. Most pages on the site were demoted in search. I noindexed those thin content pages and the site recovered in a few weeks.
Here is the code that I used: <meta name="robots" content="noindex, follow" />
Although those pages had thin content, they were still a valuable reference for my visitors. That is why I noindexed them instead of deleting them.
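The noindex-but-keep approach above can be automated in a page template. This is a hedged sketch, assuming a word-count cutoff that is my own illustration, not a Google threshold or the poster's actual rule:

```python
# Emit the robots meta tag based on how much body copy a page has: thin
# pages stay live for visitors but carry noindex until they are fleshed out.
THIN_WORD_THRESHOLD = 300  # arbitrary cutoff for illustration

def robots_meta(body_text: str) -> str:
    """Return the robots meta tag appropriate for this page's body copy."""
    if len(body_text.split()) < THIN_WORD_THRESHOLD:
        return '<meta name="robots" content="noindex, follow" />'
    return '<meta name="robots" content="index, follow" />'

print(robots_meta("just a few words"))  # thin page, so the noindex variant
```

Keeping "follow" in both variants lets link equity continue to flow through the thin pages while they are out of the index.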
Those pages have been noindexed for about two years with no problems. Slowly, I am adding a good article to each of them to reduce their number. I worry that some day Google might change its mind and hit sites that have lots of thin-content pages, even noindexed ones.
I don't know how big your website is, but I am betting that 285 very, very thin pages added to a website of a couple thousand pages will be a problem (that's about the ratio I had when my site was hit). However, if that many thin pages are added to a website with 100,000 pages, you might get away with it.
Good luck