Launching Hundreds of Local Pages At Once or Tiered? If Tiered, At What Intervals Would You Recommend?
-
Greetings Mozzers,
This is a long question, so please bear with me.
We are an IT and management training company that offers over 180 courses on a wide array of topics. Our students can attend these courses in several ways: in person, or remotely via a technology called AnyWare. We've also opened AnyWare centers, where you can physically go to a location near you and log into a LIVE course that might be hosted in, say, New York, even if you're in LA. You get all the in-class benefits and interaction with the other students and the instructor as if you were in the classroom. Recently, we've opened 43 AnyWare centers, which opens up excellent local search opportunities for our website (think "SharePoint training in New York," or whatever city we're located in). Each location has a physical address, phone number, and an employee working there, so we meet the requirements for a legitimate listing on Google Places (which I've already set up).
So, why all this background? Well, we'd like to start getting as much visibility as possible for queries that follow the format of "course topic area we offer" followed by "city we offer it in." We offer 22 course topic areas and, as I mentioned, have 43 locations across the US. Our IS team has created custom pages for each city and course topic area using a UI. I won't get into detailed specifics, but doing some simple math (22 topic areas multiplied by 43 locations), we get over 800 new pages that will eventually need to be crawled and added to our site. As a test, we launched the pages for DC and New York 3 months ago and have experienced great increases in visibility (a total of 44 local pages are live right now). For example, here are the two pages for SharePoint training in DC and NY:
http://www2.learningtree.com/htfu/usdc01/washington/sharepoint-training
http://www2.learningtree.com/htfu/usny27/new-york/sharepoint-training
So, now that we've seen the desired results, my next question is: how do we launch the rest of these hundreds of pages in a "white hat" manner? I'm a big fan of white hat techniques and of not pissing off Google. Given the scale of the project, we also did our best to make the content as unique as possible. Yes, there are many similarities, but the courses differ, as do the addresses, from location to location.
After watching Matt Cutts's video here (http://searchengineland.com/google-adding-too-many-pages-too-quickly-may-flag-a-site-to-be-reviewed-manually-156058) about adding too many pages at once, I'd prefer to proceed cautiously, even if the example he uses in the video involves tens of thousands to hundreds of thousands of pages. We truly aim to deliver the right content to those searching in their area, so there's nothing black hat about it. But I still don't want to be reviewed manually, lol.
So, at what interval should we launch the remaining pages quickly enough to keep things moving, but without raising any red flags? For example, should we launch 2 cities a week? 4 cities a month? I'm assuming the slower the better, of course, but I have some antsy managers I'm accountable to, and even with this type of warning and research in hand, I need to find the right way to proceed.
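For what it's worth, here is a minimal sketch (in Python) of how the remaining rollout could be planned as fixed-size batches of cities, just to put numbers on the "2 cities a week vs. 4 cities a month" question. The topic/city names and the two already-live cities below are placeholder assumptions, not our actual data.

```python
from itertools import product

# Illustrative assumptions: 22 topic areas x 43 cities, with two cities already live.
topics = [f"topic-{i}" for i in range(1, 23)]    # stand-ins for e.g. "sharepoint-training"
cities = [f"city-{i}" for i in range(1, 44)]     # stand-ins for e.g. "washington", "new-york"
live_cities = {"city-1", "city-2"}               # the two cities launched as a test (DC, NY)

# Every topic/city page combination that still needs to launch.
remaining_pages = [(t, c) for t, c in product(topics, cities) if c not in live_cities]
print(len(remaining_pages), "pages left to launch")   # 22 topics x 41 cities = 902

def rollout(cities_per_batch, batches_per_month):
    """How many batches a city-by-city rollout takes, and roughly how many months."""
    pending = [c for c in cities if c not in live_cities]
    batches = [pending[i:i + cities_per_batch]
               for i in range(0, len(pending), cities_per_batch)]
    return len(batches), len(batches) / batches_per_month

print(rollout(cities_per_batch=2, batches_per_month=4))   # "2 cities a week"  -> (21, 5.25)
print(rollout(cities_per_batch=4, batches_per_month=1))   # "4 cities a month" -> (11, 11.0)
```

Nothing here is a Google rule; it's just a way to see how long each cadence would actually take before committing to one.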
Thanks again and sorry for the detailed message!
-
THANK YOU, EGOL!
-
Those pages look just about identical to me. The top paragraph to the left of the map is almost identical... then the huge block of "Directions and Lodging Information" is identical, and it's a lot of words.
If this were my site, I would do this:
1. Rewrite unique content for the top paragraph beside the map. It would take a bit of work, but I would do it. It's not hard writing.
2. For the "Directions and Lodging Information," I would place that content on a separate page and link to it. That eliminates a LOT of duplicate content from the NYC pages.
If this were my site, I would not publish the pages as I see them today... but I would feel good about publishing all 800 if I did 1 and 2 above.
-
EGOL,
Thanks for your reply! The content is not entirely unique, but it was all created internally with the user in mind. For example, the main segments on all of the New York pages say similar things, with the exception of the course topic area.
For example, this New York page on SharePoint outlines our SharePoint courses in New York (http://www2.learningtree.com/htfu/usny27/new-york/sharepoint-training), and this New York page on Project Management Training (http://www2.learningtree.com/htfu/usny27/new-york/project-management-training) shows our Project Management courses in New York. You'll notice the similarities between the pages, but the content is different for each course area. The UI that creates the pages simply changes a few elements of the URL to dynamically adjust the location, which supplies the unique address, meta description, and the other vital SEO elements. Otherwise, we would have had to commit significant resources to creating truly unique content for each and every page, something that management did not want to do. So, this is as white hat as I can be given the resources that I have :)... make sense?
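To make the templating point concrete, here is a rough sketch of how a single shared template can emit the location-specific URL, title, meta description, and address for each topic/city pair. The class names, address, and phone number below are illustrative placeholders, not our actual UI or data.

```python
from dataclasses import dataclass

@dataclass
class Location:
    code: str      # e.g. "usny27" (the location code used in the URL)
    slug: str      # e.g. "new-york"
    name: str      # e.g. "New York"
    address: str   # the AnyWare center's street address (placeholder below)
    phone: str

@dataclass
class Topic:
    slug: str      # e.g. "sharepoint-training"
    name: str      # e.g. "SharePoint"

def build_page(loc: Location, topic: Topic) -> dict:
    """Fill the shared template with the few elements that legitimately vary per page."""
    return {
        "url": f"http://www2.learningtree.com/htfu/{loc.code}/{loc.slug}/{topic.slug}",
        "title": f"{topic.name} Training in {loc.name}",
        "meta_description": (
            f"Attend live {topic.name} courses at our {loc.name} AnyWare center, "
            f"{loc.address}. Call {loc.phone} to enroll."
        ),
        "address": loc.address,
        "phone": loc.phone,
    }

# Example: the New York SharePoint page (address/phone are made up for illustration)
ny = Location("usny27", "new-york", "New York", "123 Example Ave, New York, NY", "555-0100")
print(build_page(ny, Topic("sharepoint-training", "SharePoint"))["url"])
# -> http://www2.learningtree.com/htfu/usny27/new-york/sharepoint-training
```

The real page bodies would still need the per-topic course listings and, per the point above, as much genuinely unique copy as resources allow; the template only handles the elements that legitimately vary by location.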
-
Honestly... if these are all pages with great, original, unique, substantive, non-duplicating content... I would blast them up right now. 800 ain't that many... and if you are a white hat, then Google should be OK with it.