Publishing pages with thin content, update later?
-
So I have about 285 pages I created with very, very thin content on each. Each is unique, and each serves its own purpose.
My question is, do you guys think it is wise to publish all of these at once to just get them out there and update each as we go along? Each page is very laser targeted and I anticipate that a large handful will actually rank soon after publishing.
Thanks!
Tom
-
Each location has its own page, and each location page lists its departments, each with a page of its own. Each department page then has some content such as the NAP, an employee directory, and links to other useful pages on the website.
If this is making many pages for each location, then I would worry about them. However, if all of this information is on a single page then you might be fine. If I owned a company like this I would require each location to give me substantive content.
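To see how fast that structure multiplies, here's a rough back-of-the-envelope sketch (the counts are hypothetical, not the actual numbers from this site):

```python
# Hypothetical example: 20 locations, each with 4 department pages.
locations = [f"location-{i}" for i in range(1, 21)]
departments = ["sales", "service", "parts", "hr"]

urls = [f"/locations/{loc}/" for loc in locations]
urls += [f"/locations/{loc}/{dept}/" for loc in locations
         for dept in departments]

# 20 location pages + 20 * 4 department pages = 100 templated pages
print(len(urls))  # 100
```

Every extra department adds another page per location, so a modest structure quickly turns into hundreds of near-identical templated pages, each of which needs enough unique content to justify being indexed.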
Also, if I "noindex" the pages to start, add some good content, and then "index" them, how long, in your experience, has it taken to see a considerable increase in traffic / to see those pages indexed?
I republished two of my thin content pages last week. These were noindexed for about two years. They were upgraded from two or three sentences and one photo to nearly 1,000 words and four or five photos. One appeared in the index about five days later and went straight to #4 for a moderately difficult single-word query. That single word is the name of a software product and the name of a type of "gold" in the Minecraft video game, and it has a lot of competition from .gov and .edu sites.
The second one was published about eight days ago and we have not seen it in the SERPs yet. This is an unusually long time for us to wait on a republished page for this site which has a DA of about 80.
The way I would approach it would be to submit those pages manually for crawling in Search Console (RIP Webmaster Tools) once I updated the "index" tag.
I have never done this. I just republish the page.
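For what it's worth, if the pages are static HTML files, flipping the tag back is a one-line substitution per page. A minimal sketch, assuming the noindex tag appears exactly in the form used on the site; on a CMS you would make this change in the template or an SEO plugin instead:

```python
# Swap the robots meta from "noindex, follow" back to "index, follow".
# Assumes the tag appears exactly in this form in each page's <head>.
NOINDEX_TAG = '<meta name="robots" content="noindex, follow" />'
INDEX_TAG = '<meta name="robots" content="index, follow" />'

def reindex(html: str) -> str:
    """Return the page HTML with the noindex directive flipped to index."""
    return html.replace(NOINDEX_TAG, INDEX_TAG)

page = "<head>" + NOINDEX_TAG + "</head><body>Upgraded content</body>"
print("noindex" in reindex(page))  # False
```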
-
Thanks Andy, I appreciate the response. This was a semi-large project with the main goal of capturing hyper-local leads. I guess once you throw locations into the mix it runs an even bigger chance of being hit, given the popular practice of creating a page for every damn city in the country in hopes of ranking locally.
Fortunately we have real locations across the US but I don't want Google to think we're trying to dupe anyone.
Thanks again
Tom
-
That's the answer I was expecting. The website I'm referencing has about 4,000 indexed pages, and those 285 may be enough to do some damage.
To give you an example (this mimics exactly what I'm doing), take a business with multiple locations. Each location has its own page, and each location page lists its departments, each with a page of its own. Each department page then has some content such as the NAP, an employee directory, and links to other useful pages on the website. Yea or nay to that?
Also, if I "noindex" the pages to start, add some good content, and then "index" them, how long, in your experience, has it taken to see a considerable increase in traffic / to see those pages indexed? I know that's a site-by-site, page-by-page kind of question, but I'm curious to know.
The way I would approach it would be to submit those pages manually for crawling in Search Console (RIP Webmaster Tools) once I updated the "index" tag.
Thoughts?
Thanks!
Tom
-
Hi
I agree with the above: you run the risk of getting hit by Panda. If these pages are important to have live to help customers, then surely your priority should be to get good content on there to help your customers / potential customers. If they land on a low-quality page with very little content, are they likely to stick around?
I wouldn't put any of them live until you have the content sorted. I would work out the priority, start there, and put each page live once its content is good.
There is probably a Panda update around the corner, and you don't want to get hit by it and then be stuck waiting for Google to release the next version to get out of it.
I wouldn't even run the risk of putting them live with noindex.
Unless, of course, as said above, you have 100,000+ pages of amazing-quality content, in which case it probably won't affect you.
Thanks
Andy
-
In my opinion, publishing a lot of thin content pages will get you into trouble with the Panda algorithm. One of my sites had a lot of these types of pages and it was hit with a Panda problem. Most pages on the site were demoted in search. I noindexed those thin content pages and the site recovered in a few weeks.
Here is the code that I used: <meta name="robots" content="noindex, follow" />
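If you need to decide which of a few hundred pages deserve that tag, a rough word-count check works as a first pass. Here's a sketch using only the Python standard library; the 300-word threshold is an arbitrary illustration, not any documented Google cutoff:

```python
# Flag HTML pages whose visible text falls under a word-count threshold,
# as candidates for the noindex tag above. Script/style text is ignored.
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.skip = 0   # depth inside <script>/<style>
        self.words = 0  # visible word count

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self.skip += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self.skip:
            self.skip -= 1

    def handle_data(self, data):
        if not self.skip:
            self.words += len(data.split())

def is_thin(html: str, threshold: int = 300) -> bool:
    """True if the page's visible text has fewer than `threshold` words."""
    parser = TextExtractor()
    parser.feed(html)
    return parser.words < threshold

print(is_thin("<html><body><p>Just two or three sentences.</p></body></html>"))  # True
```

Running this over the site's HTML files would give a quick list of pages to noindex now and upgrade later.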
Although those pages had thin content, they were still a valuable reference for my visitors. That is why I noindexed them instead of deleting them.
Those pages have been noindexed for about two years with no problems. Slowly, I am adding good articles to those pages to reduce their number. I worry that some day Google might change its mind and hit sites that have lots of noindexed thin content pages.
I don't know how big your website is. But I am betting that 285 very, very thin pages added to a website of a couple thousand pages will be a problem (that's about what I had when my site had its problem). However, if that many very, very thin pages are added to a website with 100,000 pages, you might get away with it.
Good luck