Publishing pages with thin content, update later?
-
So I have about 285 pages I created with very, very thin content on each. Each is unique, and each serves its own purpose.
My question is, do you guys think it is wise to publish all of these at once to just get them out there and update each as we go along? Each page is very laser targeted and I anticipate that a large handful will actually rank soon after publishing.
Thanks!
Tom
-
> Each location has its own page, and each location page lists its own departments, each with a page of its own. Each department page then has some content, such as the NAP, an employee directory, and links to other useful resource pages on the website.
If this means making many pages for each location, then I would worry about them. However, if all of this information is on a single page, then you might be fine. If I owned a company like this, I would require each location to give me substantive content.
> Also, if I "noindex" the pages to start, add some good content, then "index" them, how long in your experience has it taken until you saw a considerable increase in traffic / saw those pages indexed?
I republished two of my thin content pages last week. These had been noindexed for about two years. They were upgraded from two or three sentences and one photo to nearly 1,000 words and four or five photos. One appeared in the index about five days later and went straight to #4 for a moderately difficult single-word query. That word is the name of a software product and of a type of "gold" in the Minecraft video game, and it has a lot of competition from .gov and .edu sites.
The second one was republished about eight days ago, and we have not seen it in the SERPs yet. This is an unusually long time for us to wait on a republished page for this site, which has a DA of about 80.
> The way I would approach it would be to crawl those pages manually in Search Console (RIP Webmaster Tools) once I updated the "index" tag.
I have never done this. I just republish the page.
-
Thanks Andy, I appreciate the response. This was a semi-large project with the main goal of capturing hyper-local leads. I guess once you throw locations into the mix, it runs an even bigger chance of being hit, due to the popular practice of creating a page for every damn city in the country in hopes of ranking locally.
Fortunately we have real locations across the US but I don't want Google to think we're trying to dupe anyone.
Thanks again
Tom
-
That's the answer I was expecting. The website I'm referencing has about 4,000 indexed pages, and those 285 may be enough to do some damage.
To give you an example (this mimics exactly what I'm doing), take a business with multiple locations. Each location has its own page, and each location page lists its own departments, each with a page of its own. Each department page then has some content, such as the NAP, an employee directory, and links to other useful resource pages on the website. Yea or nay to that?
Also, if I "noindex" the pages to start, add some good content, then "index" them, how long in your experience has it taken until you saw a considerable increase in traffic / saw those pages indexed? I know that's a site-by-site, page-by-page kind of question, but I'm curious to know.
The way I would approach it would be to crawl those pages manually in Search Console (RIP Webmaster Tools) once I updated the "index" tag.
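That noindex-now, index-later toggle could be sketched roughly like this. This is a hypothetical Python helper for sites that generate pages from HTML templates, not anything Search Console provides, and the regex only handles simple, well-formed meta tags:

```python
# Hypothetical sketch of the "noindex now, index later" workflow:
# swap the robots directive in a page's HTML once its content is ready.
import re

ROBOTS_TAG = re.compile(
    r'(<meta[^>]+name=["\']robots["\'][^>]*content=["\'])[^"\']*(["\'])',
    re.IGNORECASE,
)

def set_robots_directive(html: str, directive: str = "index, follow") -> str:
    """Replace the content of an existing robots meta tag, or insert one."""
    if ROBOTS_TAG.search(html):
        return ROBOTS_TAG.sub(r"\g<1>" + directive + r"\g<2>", html)
    # No robots tag yet: add one just inside <head>.
    tag = f'<meta name="robots" content="{directive}" />'
    return html.replace("<head>", "<head>" + tag, 1)
```

After flipping the tag, requesting a recrawl through Search Console speeds up the switch.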
Thoughts?
Thanks!
Tom
-
Hi
I agree with the above; you run the risk of getting hit by Panda. If these pages are important to have live to help customers, then surely your priority should be to get good content on there to help your customers and potential customers. If they land on a low-quality page with very little content, are they likely to stick around?
I wouldn't put any of them live until you have the content sorted. I would work out the priorities, start there, and put each page live once its content is good.
There is probably a Panda update around the corner, and you don't want to get hit by it and then be waiting for Google to release the next version to recover.
I wouldn't even run the risk of putting them live with noindex.
Unless, of course, as said above, you have 100,000+ pages of amazing-quality content, in which case it probably won't affect you.
Thanks
Andy
-
In my opinion, publishing a lot of thin content pages will get you into trouble with the Panda algorithm. One of my sites had a lot of these types of pages, and it was hit with a Panda problem. Most pages on the site were demoted in search. I noindexed those thin content pages, and the site recovered in a few weeks.
Here is the tag that I used, placed in the <head> of each thin page: <meta name="robots" content="noindex, follow" />
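To audit which pages still carry that tag before re-enabling indexing, a quick check could look like this. This is a hypothetical Python sketch; the regex assumes simple, well-formed meta tags rather than a full HTML parser:

```python
# Hypothetical helper: report whether a page's HTML carries a noindex
# directive, useful when auditing a batch of thin pages.
import re

def has_noindex(html: str) -> bool:
    """Return True if the HTML contains a robots meta tag with 'noindex'."""
    # Collect every robots meta tag, then look for 'noindex' in its content.
    for tag in re.findall(r'<meta[^>]+name=["\']robots["\'][^>]*>', html,
                          re.IGNORECASE):
        content = re.search(r'content=["\']([^"\']*)["\']', tag, re.IGNORECASE)
        if content and "noindex" in content.group(1).lower():
            return True
    return False
```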
Although those pages had thin content, they were still a valuable reference for my visitors. That is why I noindexed them instead of deleting them.
Those pages have been noindexed for about two years with no problems. Slowly, I am adding a good article to each of those pages to reduce their number. I worry that some day Google might change its mind and hit sites that have lots of noindexed thin content pages.
I don't know how big your website is, but I am betting that 285 very, very thin pages added to a website of a couple thousand pages will be a problem (that's about what I had when my site was hit). However, if that many very thin pages are added to a website with 100,000 pages, you might get away with it.
Good luck