Publishing pages with thin content, update later?
-
So I have about 285 pages I created with very, very thin content on each. Each is unique, and each serves its own purpose.
My question is: do you guys think it is wise to publish all of these at once just to get them out there, and then update each as we go along? Each page is very laser-targeted, and I anticipate that a large handful will actually rank soon after publishing.
Thanks!
Tom
-
"Each location has its own page, and each location page has its own departments listed, each with their own pages as well. Each department then has some content such as the NAP, an employee directory, and links to other useful resource pages on the website."
If this means making many pages for each location, then I would worry about them. However, if all of this information is on a single page per location, then you might be fine. If I owned a company like this, I would require each location to give me substantive content.
Also, if I "noindex" the pages to start, add some good content then "index" them, how long in your experience has it taken until you saw a considerable increase in traffic/see those pages indexed?
I republished two of my thin-content pages last week. They had been noindexed for about two years. They were upgraded from two or three sentences and one photo to nearly 1,000 words and four or five photos. One appeared in the index about five days later and went straight to #4 for a moderately difficult single-word query. That single word is the name of a software product and of a type of "gold" in the Minecraft video game, and it has a lot of competition from .gov and .edu sites.
The second one was republished about eight days ago, and we have not seen it in the SERPs yet. That is an unusually long time for us to wait on a republished page for this site, which has a DA of about 80.
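A quick way to check whether a republished page is back in the index is a site: query for the exact URL (hypothetical URL here, just to show the pattern):

site:example.com/republished-page/

Google returns the page in that result once it has been reindexed.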
"The way I would approach it would be to request a manual crawl of those pages in Search Console (RIP Webmaster Tools) once I updated the 'index' tag."
I have never done this. I just republish the page.
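For anyone who does want to flip the tag rather than simply republish, it is a one-line change in each page's head. A minimal sketch (removing the tag entirely also works, since indexing is the default):

<!-- While the page is thin: keep it out of the index but let crawlers follow its links -->
<meta name="robots" content="noindex, follow" />

<!-- After the content is upgraded: delete the tag, or flip it to -->
<meta name="robots" content="index, follow" />

Once flipped, requesting a recrawl of the URL in Search Console usually speeds things up; otherwise you are waiting on the normal crawl cycle.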
-
Thanks Andy, I appreciate the response. This was a semi-large project with the main goal of capturing hyper-local leads. I guess once you throw locations into the mix, it runs an even bigger risk of being hit, given the popular practice of creating a page for every damn city in the country in hopes of ranking locally.
Fortunately, we have real locations across the US, but I don't want Google to think we're trying to dupe anyone.
Thanks again
Tom
-
That's the answer I was expecting. The website I'm referencing has about 4,000 indexed pages, and those 285 may be enough to do some damage.
To give you an example (this mimics exactly what I'm doing), take a business with multiple locations. Each location has its own page, and each location page has its own departments listed, each with their own pages as well. Each department then has some content such as the NAP, an employee directory, and links to other useful resource pages on the website. Yea or nay to that?
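To picture it, here is a bare-bones sketch of one such department page (the URL, business name, and people are all hypothetical, for illustration only):

<!-- Hypothetical URL: example.com/locations/chicago/service-department/ -->
<main>
  <h1>Service Department, Chicago</h1>
  <!-- NAP block: name, address, phone -->
  <address>Acme Co., 123 W Example St, Chicago, IL 60601, (312) 555-0100</address>
  <!-- Employee directory -->
  <ul>
    <li>Jane Doe, Service Manager</li>
    <li>John Roe, Technician</li>
  </ul>
  <!-- Links to other resource pages on the site -->
  <p><a href="/resources/service-scheduling/">Schedule a service visit</a></p>
</main>

Each page like this is unique and genuinely useful to someone in that city, but the copy itself is obviously light.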
Also, if I "noindex" the pages to start, add some good content then "index" them, how long in your experience has it taken until you saw a considerable increase in traffic/see those pages indexed? I know that's a site-by-site, page-by-page kind of question but I'm curious to know.
The way I would approach it would be to request a manual crawl of those pages in Search Console (RIP Webmaster Tools) once I updated the "index" tag.
Thoughts?
Thanks!
Tom
-
Hi
I agree with the above; you run the risk of getting hit by Panda. If these pages are important to have live to help customers, then surely your priority should be to get good content on them for your customers and potential customers. If people land on a low-quality page with very little content, are they likely to stick around?
I wouldn't put any of them live until you have the content sorted. I would work out the priority order, start there, and put each page live once its content is good.
There is probably a Panda update around the corner, and you don't want to get hit by it and then be stuck waiting for Google to release the next version to recover.
I wouldn't even run the risk of putting them live with noindex.
Unless, of course, as said above, you have 100,000+ pages of amazing-quality content; then it probably won't affect you.
Thanks
Andy
-
In my opinion, publishing a lot of thin-content pages will get you into trouble with the Panda algorithm. One of my sites had a lot of these types of pages, and it was hit with a Panda problem. Most pages on the site were demoted in search. I noindexed those thin-content pages, and the site recovered in a few weeks.
Here is the code that I used in the head of those pages: <meta name="robots" content="noindex, follow" />
Although those pages had thin content, they were still a valuable reference for my visitors. That is why I noindexed them instead of deleting them.
Those pages have been noindexed for about two years with no problems. Slowly, I am upgrading those pages with good articles to reduce their number. I worry that some day Google might change its mind and hit sites that have lots of noindexed thin-content pages.
I don't know how big your website is, but I am betting that 285 very, very thin pages added to a website of a couple thousand pages will be a problem (that's about the ratio I had when my site was hit). However, if that many very thin pages are added to a website with 100,000 pages, you might get away with it.
Good luck