Launching Hundreds of Local Pages At Once or Tiered? If Tiered, In What Intervals Would You Recommend?
-
Greeting Mozzers,
This is a long question, so please bear with me.
We are an IT and management training company that offers over 180 courses on a wide array of topics. Our students can attend these courses in multiple ways, either in person or remotely via a technology called AnyWare. We've also opened AnyWare centers, where you can physically go to a location near you and log into a LIVE course that might be hosted in, say, New York, even if you're in, say, LA. You get all the in-class benefits and interaction with the students and the instructor as if you were in the classroom. Recently, we've opened 43 AnyWare centers, giving our website excellent local search opportunities (e.g., think "SharePoint training in New York" or whatever city we're located in). Each location has a physical address, phone number, and an employee working there, so we meet Google's standards for existence on Google Places (which I've set up).
So, why all this background? Well, we'd like to start getting as much visibility as possible for queries that follow the format of "course topic area we offer" followed by "city we offer it in." We offer 22 course topic areas and, as I mentioned, 43 locations across the US. Our IS team has created custom pages for each city and course topic area using a UI. I won't get into detailed specifics, but doing some simple math (22 topic areas multiplied by 43 locations = 946), we get over 800 new pages that need to eventually be crawled and added to our site. As a test, we launched the pages for DC and New York 3 months ago and have experienced great increases in visibility. For example, here are the two pages for SharePoint training in DC and NY (44 local pages are live in total right now):
http://www2.learningtree.com/htfu/usdc01/washington/sharepoint-training
http://www2.learningtree.com/htfu/usny27/new-york/sharepoint-training

So, now that we've seen the desired results, my next question is: how do we launch the rest of the hundreds of pages in a "white hat" manner? I'm a big fan of white hat techniques and of not pissing off Google. Given the scale of the project, we also did our best to make the content as unique as possible. Yes, there are many similarities, but courses differ from location to location, as do addresses.
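The simple math behind the page count can be sketched in a few lines (the topic and city names here are placeholders, not the real course catalog or locations):

```python
from itertools import product

# Hypothetical stand-ins for the real 22 course topic areas and 43 cities.
topics = [f"topic-{i}" for i in range(1, 23)]   # 22 course topic areas
cities = [f"city-{i}" for i in range(1, 44)]    # 43 AnyWare locations

# One local landing page per (city, topic) combination.
pages = [f"/{city}/{topic}-training" for topic, city in product(topics, cities)]

print(len(pages))  # 22 * 43 = 946, i.e. "over 800" pages
```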
After watching Matt Cutts's video here: http://searchengineland.com/google-adding-too-many-pages-too-quickly-may-flag-a-site-to-be-reviewed-manually-156058 about adding too many pages at once, I'd prefer to proceed cautiously, even if the example he uses in the video involves tens of thousands to hundreds of thousands of pages. We truly aim to deliver the right content to those searching in their area, so there's nothing black hat about it. But I still don't want to be reviewed manually, lol.
So, at what interval should we launch the remaining pages without raising any red flags? For example, should we launch 2 cities a week? 4 cities a month? I'm assuming the slower the better, of course, but I have some antsy managers I'm accountable to, and even with this type of warning and research, I need to proceed somehow, the right way.
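For illustration only, here is what each of those example paces works out to for the remaining 41 cities (43 minus the 2 already live), at 22 pages per city. These batch sizes come from the question itself, not from any recommendation:

```python
import math

remaining_cities = 43 - 2   # DC and New York are already live
pages_per_city = 22         # one page per course topic area

for cities_per_batch, period in [(2, "week"), (4, "month")]:
    batches = math.ceil(remaining_cities / cities_per_batch)
    pages_per_batch = cities_per_batch * pages_per_city
    print(f"{cities_per_batch} cities per {period}: "
          f"{batches} {period}s to finish, {pages_per_batch} pages per batch")
```

So 2 cities a week finishes in roughly 21 weeks at 44 pages a batch, while 4 cities a month takes about 11 months at 88 pages a batch.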
Thanks again and sorry for the detailed message!
-
THANK YOU, EGOL!
-
Those pages look just about identical to me. The top paragraph to the left of the map is almost identical... then the huge block of "directions and lodging information" is identical, and it's a lot of words.
If this were my site, I would do this...
-
1. Rewrite unique content for the top paragraph beside the map. It would take a bit of work, but I would do it. It's not hard writing.
-
2. For the "Directions and Lodging Information"... I would place that on a separate page and link to it. That eliminates a LOT of duplicate content from the NYC pages.
If this were my site, I would not publish the pages as I see them today... but I would feel good publishing all 800 if I did 1 and 2 above.
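One way to sanity-check the near-duplicate concern before publishing is to compare the rendered body text of two generated pages and look at their similarity ratio. A minimal sketch using Python's difflib; the sample strings below are invented, and in practice you would feed in the actual page copy:

```python
from difflib import SequenceMatcher

# Invented stand-ins for two templated intro paragraphs where only the city changes.
page_dc = "Attend SharePoint training at our Washington DC AnyWare center."
page_ny = "Attend SharePoint training at our New York AnyWare center."

# ratio() returns 0.0 (nothing in common) to 1.0 (identical strings).
ratio = SequenceMatcher(None, page_dc, page_ny).ratio()
print(f"similarity: {ratio:.0%}")

# A high ratio across full page bodies suggests the pages will look
# near-identical to a crawler and could use more rewritten copy.
if ratio > 0.8:
    print("warning: these pages read as near-duplicates")
```

Even on this tiny example, where only the city name changes, the ratio lands well above 0.8.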
-
-
EGOL,
Thanks for your reply! The content is not entirely unique, but it was all created internally with the user in mind. For example, the main segments on all of the New York pages say similar things, with the exception of the course topic area.
For example, this New York page on SharePoint outlines our SharePoint courses in New York (http://www2.learningtree.com/htfu/usny27/new-york/sharepoint-training) and this New York page on Project Management Training (http://www2.learningtree.com/htfu/usny27/new-york/project-management-training) shows our Project Management courses in New York. You'll notice the similarities between the pages, but the content differs per course area. The UI we use to create the pages simply changes a few elements of the URL to dynamically adjust the location, which provides the unique address, meta description, and all the other vital SEO elements. Otherwise, we would have had to spend significant resources creating truly unique content for each and every page, something that management did not want to do. So, this is as white hat as I can be given the resources that I have :)... make sense?
-
Honestly... if these are all pages with great, original, unique, substantive, non-duplicated content... I would blast them up right now. 800 ain't that many... and if you are a white hat, then Google should be OK with it.
Related Questions
-
Service Location links in footer and on the service page - spamming or good practice?
We are a managed IT services business, so we try to target people searching for IT support in a number of key areas. We have created individual location pages (11) to localise our service in these specific areas. We put these location links in the footer, pointing to the respective IT support pages. Now we have created a general 'managed IT services' page and are thinking of linking to these specific pages from there as well, as it makes sense to do it. Would having these 11 links in the footer as well as on the 'managed IT services' page be spamming? Or would it be good practice? If this is spamming, which linking location should hold preference? Would appreciate the feedback.
Local Website Optimization | AndyL93
Thanks
Andy
My pages are absolutely plummeting. HELP!
Hi all, Several of my pages have absolutely tanked in the past fortnight, and I've no idea why. One of them, according to Moz, has a Page Optimisation Score of 96, and it's dropped from 10th to 20th. Our DA is lower than our competitors, but still, that's a substantial drop. Sadly, this has been replicated across the site. Any suggestions? Cheers, Rhys
Local Website Optimization | SwanseaMedicine
One of my main pages is not ranking and does not seem to exist.
One of my main pages is not ranking and does not seem to exist. I have gone through every tool in Webmaster and Yoast. I cannot find an error, but every metric I know of says my page should be on the first page for my target search term. Moz graded it as an A, but it is not ranked on any page. Can someone please help?? My target search is "Jekyll Island Wedding Photographer". My home page shows up on page 2, but this page, http://saintsimonsphotography.com/jekyll-island-wedding-photographer/, is the one that does not seem to exist. I have never had this problem with any of my other businesses. Jekyll Island is the next island over, and I need this term to rank. Thank you for any help.
Local Website Optimization | krivec8
URL and title strategy for multiple location pages in the same city
Hi, I have a customer who is opening additional branches in cities where, until now, he had only one branch. My question is: once we open the new store pages, what is the best strategy for the local store pages in terms of URL and title?
Local Website Optimization | OrendaLtd
So far I've seen some different strategies for URL structure:
Some use [URL]/locations/cityname-1/2/3 etc.
while others use [URL]/locations/cityname-zip code/
I've even seen [URL]/locations/street address-cityname (that's what Starbucks does). There are also different strategies for the title of the branch page.
Some use [city name] [state] [zip code] | [Company name]
Others use [Full address] | [Company name]
Or [City name] [US state] [1/2/3] | [Company name]
Or [City name] [District / Neighborhood] [Zip Code] | [Company name]. What is the preferred strategy for getting the best results? On the one hand, I wish to differentiate the store pages from one another and gain as much local coverage as possible; on the other hand, I wish to create consistency and establish a long-term strategy, taking into consideration that many more branches will be opened in the near future.
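Whichever of the patterns above is chosen, consistency is easier to keep if slug generation is centralized so every future branch follows the same rule automatically. A rough sketch of the city-plus-zip pattern mentioned in the question (the business name and address are made up):

```python
import re

def slugify(text: str) -> str:
    # Lowercase, strip punctuation, collapse spaces/hyphens to single hyphens.
    text = re.sub(r"[^a-z0-9\s-]", "", text.lower())
    return re.sub(r"[\s-]+", "-", text).strip("-")

def branch_url(base: str, city: str, zip_code: str) -> str:
    # One consistent pattern: [URL]/locations/cityname-zipcode/
    return f"{base}/locations/{slugify(city)}-{zip_code}/"

# Hypothetical branch:
print(branch_url("https://example.com", "St. Louis", "63101"))
# -> https://example.com/locations/st-louis-63101/
```

The zip code keeps two branches in the same city distinct without resorting to arbitrary -1/-2/-3 suffixes.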
Structuring URLs of profile pages
First of all, I want to thank everyone for the feedback that I received on the first question. My next question has to do with the URL structure of personal trainer profiles pages on www.rightfitpersonaltraining.com. Currently, the structure of each trainer profile page is "www.rightfitpersonaltraining.com/personal-trainers/trainer/" and at the end I manually add the trainer's "city-firstname-lastinitial". Would it be to my benefit to have the developers change the structure so that the trainer profile URLs are "www.rightfitpersonaltraining.com/city-personal-trainers/trainername"? That way, each trainer profile would link directly to the trainer's city page as opposed to the general "personal-trainers" page. I don't mind paying a little extra to go back into the site to make these changes, as I think they would benefit the search ranking for each city page.
Local Website Optimization | mkornbl2
Local SEO HELP for Franchise SAB Business
This all began when I was asked to develop experiment parameters for our content protocol & strategy. It should be simple, right? I've reviewed A/B testing tips for days now, from Moz and other sources. I'm totally amped and ready to begin testing in Google Analytics. Say we have a restoration service franchise with over 40 franchises we perform SEO for. They are all over the US. Every franchise has their own local website. Example: restorationcompanylosangeles.com. Every franchise purchases territories in which they want to rank. Some service over 100 cities. Most franchises also have PPC campaigns. As a part of our strategy, we incorporate the location reach data from AdWords to focus on their high-reach locations first. We have 'power pages' which include 5 high-reach branch preferences (areas the owners prefer to target) and 5 non-branch-preference high-reach locations. We are working heavily on our national brand presence and working with PR and local news companies to build relationships for natural backlinks. We are developing a social media strategy for national brand outlets and local outlets. We are using major aggregators to distribute our local citations for our branch offices. We make sure all NAP is consistent across all citations. We are partners with Google, so we work with them on new branches that are developing to create their Google listings (My Business & G+). We use local business schema markup for all pages. Our content protocol encompasses all the needed on-site optimization tactics: meta, titles, schema, placement of keywords, semantic Q&A, internal linking strategies, etc. Our leads are calls and form submissions. We use several call tracking services to monitor calls, the caller's location, etc. We are testing CallRail to start monitoring landing pages and keywords that generate our leads.
Parts that I want to change: Some of the local sites have over 100 pages targeted for 'water damage + city', aka what Moz would call "doorway pages." These pages have 600-1000 words, all talking about the services we provide. Our writers (4 of them) manipulate them so that they aren't duplicate pages; they add about 100 words about the city location. This is the only unique variable. We pump out about 10 new local pages a month per site, so yes, over 300 local pages a month. Traffic to the local sites is very scarce. The content protocol / strategy is only tested based on ranking! We have a tool that monitors ranking on all domains. This does not account for mobile, local, or user-based preference searching like Google Now. My team is deeply attached to basing our metrics solely on ranking. The logic behind this is that if no local city page exists for a targeted location, there is less likelihood of ranking for that location. If you are not seen, then you will not get traffic or leads. Ranking for power locations is poor, while less competitive, low-reach locations rank OK. We are updating the content protocol by tweaking small things (multiple variants at a time). They check ranking every day for about a week to determine whether an experiment was a success or not. What I need: An internal duplicate content analyzer, to prove that writing over 400 pages a month about water damage + city IS duplicate content. Unique content for 'power pages': I know, based on dozens of chats here in the community and in Moz blogs, that we can only truly create quality content for 5-10 pages, meaning we need to narrow down which locations are most important to us and beef them up. Creating blog content for non-'power' locations. Developing a new experiment protocol based on metrics like traffic, impressions, bounce rate, landing page analysis, domain authority, etc. Digging deeper into call metrics and their sources.
Now I am at a roadblock because I cannot develop valid content-experiment parameters based on ranking. I know that A/B testing requires testing two pages that are the same except for one variable. We'd either noindex these or canonicalize them; both work against testing ranking for the same term. Questions: Are all these local pages duplicate content? Is there such a thing as content experiments based solely on ranking? Any other suggestions for this scenario?
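On the "internal duplicate content analyzer" point: one rough way to demonstrate the overlap internally is pairwise shingle comparison. A minimal sketch (the sample page texts are invented) that computes Jaccard similarity over word 3-grams; near-duplicate pages share most of their shingles, while genuinely distinct pages share few:

```python
def shingles(text: str, n: int = 3) -> set:
    # Break text into overlapping word n-grams ("shingles").
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def jaccard(a: str, b: str) -> float:
    # Shared shingles divided by total distinct shingles.
    sa, sb = shingles(a), shingles(b)
    return len(sa & sb) / len(sa | sb) if sa | sb else 0.0

# Invented near-duplicate city pages where only the city changes.
page_a = "We repair water damage fast in Los Angeles with certified local technicians"
page_b = "We repair water damage fast in San Diego with certified local technicians"

print(f"{jaccard(page_a, page_b):.0%} shingle overlap")
```

Run across all city pages pairwise, the resulting overlap matrix gives a number to put in front of the team instead of an opinion.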
Local Website Optimization | MilestoneSEO_LA
Multi Location business - Should I 301 redirect duplicate location pages or alternatively No Follow tag them ?
Hello All, I have an eCommerce site and we operate out of multiple locations. We currently have individual location pages for these locations against each of our many categories. However, on the flip side, this creates a lot of duplicate content. All of our location pages, whether unique or duplicated, have a unique title tag, H1, H2 tag, and NAP, and they all bring in the city name. The content on the duplicated pages also brings in the city name. We have been going through our categories and writing unique content for our most popular locations to help us rank in local search. Currently, I've been setting up 301 redirects for the locations in the categories with duplicated content, pointing back to the category page. I am wondering whether the increase in the number of 301s will do more harm than having many duplicate location pages? I am sure my site is affected by the Panda algorithm penalty (on the duplicated content issues), as a couple of years ago this didn't matter and we ranked top 3 for pretty much every location, but now we are ranking between 8th and 20th depending on the keyword. An alternative, I thought, may be to put nofollow tags on those location pages with duplicate content instead of 301ing them... What do you think? It's not economically viable to write unique content for every location in every category; it would not only take years but would cost us far too much money. Our site is currently approx. 10,000 pages. Any thoughts on this greatly appreciated. Thanks, Pete
Local Website Optimization | PeteC12
Does Schema Replace Conventional NAP in local SEO?
Hello Everyone, My question is in regards to schema and whether it replaces the need for the conventional structured NAP configuration. Because you have the ability to specifically call out variables (such as name, URL, address, phone number, etc.), is it still necessary to keep the NAP form factor that has historically been required for local SEO? Logically, it makes sense that schema would allow someone to reverse this order and still achieve the same result; however, I have yet to find any conclusive evidence of this being the case. Thanks, and I look forward to what the community has to say on this matter.
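For reference, the visible NAP text and the markup usually work together rather than one replacing the other: the markup describes what the page already displays. A sketch of the LocalBusiness JSON-LD shape, built here as a Python dict for readability; every business detail below is a placeholder:

```python
import json

# Placeholder details; in practice these mirror the visible NAP on the page.
local_business = {
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "name": "Example Training Center",
    "url": "https://example.com/",
    "telephone": "+1-555-555-0100",
    "address": {
        "@type": "PostalAddress",
        "streetAddress": "123 Example St",
        "addressLocality": "Washington",
        "addressRegion": "DC",
        "postalCode": "20001",
    },
}

# The dumped JSON goes inside a <script type="application/ld+json"> tag.
print(json.dumps(local_business, indent=2))
```

Keeping the markup values byte-for-byte identical to the on-page NAP (and to citations elsewhere) is what preserves the consistency local SEO depends on.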
Local Website Optimization | toddmumford