Creating 100,000s of pages: good or bad idea?
-
Hi Folks,
Over the last 10 months we have focused on quality pages, but we have been frustrated by competitor websites outranking us because they have bigger sites. Should we focus on the long tail again?
One option for us is to take every town across the UK and create pages using our activities, e.g.:
Stirling
Stirling Paintball
Stirling Go Karting
Stirling Clay Shooting
We are not going to link to these pages directly from our main menus, but from the sitemap.
These pages would then show activities within a 50-mile radius of each town. Up to now we have focused our efforts on regions, e.g. Paintball Scotland, Paintball Yorkshire, funnelling all the internal link juice to these regional pages, but we don't rank highly for the towns that the activity sites are close to.
With 45,000 towns and 250 activities we could create over a million pages, which seems very excessive! Would creating 500,000 of these types of pages damage our site? That is my main worry. Or would it make our site rank even higher for the tougher keywords and also bring back lots of long-tail traffic like we used to get?
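(For scale, a quick back-of-the-envelope check of those numbers, assuming one page per town-activity pair:)

```python
towns = 45_000
activities = 250

# One page for every town-activity combination:
pages = towns * activities
print(f"{pages:,}")  # 11,250,000
```

So even the 500,000 figure would be under 5% of the full combination space.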
Is there a limit to how big a site should be?
-
Hi Mark!
Thanks for asking this good question. While there is no limit to how big a website can be, I think you can see from the general response here that most members would encourage you to stick to manually developing quality pages rather than automating hundreds of thousands of pages solely for ranking purposes. I second this advice.
Now, I would like to clarify your business model. Are you a physical, actual business that customers come to, either to buy paintball equipment or to play paintball in a gallery? Or is your business virtual, with no in-person transactions? I'm not quite understanding this from your description.
If the former, I would certainly encourage you to develop a very strong, unique page for each of your physical locations. If you have 10 locations (with unique street addresses and phone numbers), then that would be 10 pages. If you've got 20 locations, that would be 20 pages, etc. But don't approach these with a 'just switch out the city name in the title tags' mindset. Make these pages as exceptional as possible. Tell stories, show off testimonials, share pictures and videos, entertain, educate, inspire. These city landing pages will be intimately linked into your whole Local SEM campaign, provided they each represent a business location with a unique dedicated street address and unique local area code phone number.
But if you are considering simply building a page for every city in the UK, I just can't see the justification for doing so. Ask yourself: what is the value?
There are business models (such as carpet cleaners, chimney sweeps, general contractors, etc.) that go to their clients' locations to serve them, and for these I would advise creating city landing pages for each of their service cities, but this would be extremely regional, not statewide, national, or international. A carpet cleaner might serve 15 different towns and cities in his region, and I would encourage him to start gathering project notes, testimonials, videos, and photos to begin developing a body of content substantial enough to create strong, interesting, and unique pages for each of those cities. But I've also had local business owners tell me they want to cover every city in California, for instance, because they think it will help them to do so, and I discourage this.
Even if the business is virtual and doesn't have any in-person transactions with clients or physical locations, I would still discourage this blanketing-the-whole-nation-with-pages approach. A national retailer needs to build up its brand so that it becomes known and visible organically for its products, rather than targeting every city in the nation as you propose. In short, the mindset behind that approach just doesn't make good horse sense.
And, as others have stated, adding thousands of thin, potentially duplicate pages to any site could definitely have a very negative effect on rankings.
My advice is to make the time to start developing a content strategy for the cities in which you have a legitimate presence. If budget means you can't hire a copywriter to help you with this and speed up the work, accept that this project deserves all the time you can give it, and that slow development of exceptional pages is better than fast automation of poor-quality pages.
Hope this helps!
-
Hi Mark,
If the page for sites A, C, and E is similar to the page for B, D, and F, it is still considered duplicate content. Based on Google's Webmaster definition:
"Duplicate content generally refers to substantive blocks of content within or across domains that either completely match other content or are appreciably similar"
Each of your pages should be unique and different from other pages.
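As a rough sanity check (this is not how Google measures it, just an illustrative sketch with made-up snippets), you could estimate how "appreciably similar" two pages are by comparing overlapping word shingles:

```python
def shingles(text, n=3):
    """Break text into overlapping n-word 'shingles'."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def similarity(page_a, page_b, n=3):
    """Jaccard similarity of the two pages' shingle sets (0.0-1.0)."""
    a, b = shingles(page_a, n), shingles(page_b, n)
    return len(a & b) / len(a | b) if a or b else 0.0

# Two tiny "pages" that only swap the town name:
stirling = "Paintball near Stirling with activity sites within a 50 mile radius"
falkirk = "Paintball near Falkirk with activity sites within a 50 mile radius"
print(f"{similarity(stirling, falkirk):.2f}")  # 0.50 even for one short sentence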
I suggest you continue providing quality content and targeting long-tail keywords. That alone will help you generate more traffic. Furthermore, being outranked is not the real problem. You should focus on getting to the first page (by providing quality content for long-tail or regular keywords), and once you are on the first page, try to get searchers to click on your link using your title tag and meta description.
Being outranked just means they are ranked 4th and you are ranked 5th or 6th; as long as you have a better title tag and meta description, I believe searchers will click on the more attractive result.
-
Cookie-cutter pages like these stopped working in Google about ten years ago.
If you toss them up, I think your entire site will tank.
I would go back to focusing on quality pages.
-
If the user experience is awesome and people are staying on your site and looking around, great. But if you think 100,000 pages will make search engines love you, remember that machines can never provide the love that users can.
-
Can you mix up content from your website, e.g. paintball sites A, C, and E on one page and B, D, and F on another, if the towns are close together? What I'm not sure about is how different, in percentage terms, the content actually has to be.
If we have less written content, then the number of words we would actually have to change would be much smaller.
The challenge is that we built the site this time with filtering in mind: rather than making customers navigate, we let them search, which is much better for getting them to the activities they want. The downside is that our site no longer shows for the long tail, because we reduced the page count massively.
-
We don't have the resources to do it manually, but the content would be different on each page, as we would only show activity sites within a 50-mile radius, and we would make certain text, the H1, etc. different and related to the town.
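To be concrete, the radius filter itself is a simple distance calculation; here is a minimal Python sketch (the site names and coordinates below are made up for illustration):

```python
from math import asin, cos, radians, sin, sqrt

def miles_between(lat1, lon1, lat2, lon2):
    """Great-circle distance in miles via the haversine formula."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 3959 * asin(sqrt(a))  # mean Earth radius is roughly 3,959 miles

# Hypothetical activity sites: (name, latitude, longitude)
sites = [
    ("Stirling Paintball Arena", 56.12, -3.94),
    ("Edinburgh Go Karting", 55.95, -3.19),
    ("Aberdeen Clay Shooting", 57.15, -2.09),
]

stirling = (56.1165, -3.9369)  # town centre
nearby = [name for name, lat, lon in sites
          if miles_between(stirling[0], stirling[1], lat, lon) <= 50]
print(nearby)  # Aberdeen falls outside the 50-mile radius
```

The risk isn't the filter; it's that thousands of pages built from the same filtered list, differing only in the town name, will look like near-duplicates (see the shingle check earlier in the thread).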
Below are some examples of sites I see doing well (i.e. ranking number 1) using this method.
Our content would be much better than say http://www.justpaintballDOTcoDOTuk/site_guide/Aberfeldy.htm or http://www.goballisticDOTcoDOTuk/paintball_in_/ABERFELDY.asp
But as you say, getting this wrong is my worry.
-
Hi Mark,
Creating 100,000 pages can look good for search engines, because there is much more content for them to crawl and more chances for your pages to show up for related keywords. However, the question is whether you have enough unique content to fill all those 100,000 pages. If you use similar content, I am afraid it will count as duplicate content. You may think changing the town names will be enough, but that is risky.
If you can create 100,000 pieces of unique content, sure, go ahead. If not, don't take the risk of duplicate content.
-
Do you have the resources to create unique content for all those pages? Because adding 500,000 pages of duplicate content will absolutely damage your site.