Which is better for local & national coupons: thousands of indexed pages per city, or only a few?
-
Not sure where this belongs...
I am developing a coupon site for listing local coupons and national coupons (think Valpak plus RetailMeNot), eventually in all major cities, and I am VERY concerned about how many internal pages to let Google 'follow' for indexing, as the count can exceed 10,000 per city.
Is there a way to determine the optimal approach to internal paging/indexing BEFORE I actually launch the site? (It is about ready except for this darned URL question, which seems critical.) I.e., can I put in search terms for Google to determine which pages are most worthy of having their own indexed page? I'm a newbie, sort of, so please put the answer in simple terms. I'm one person with limited funds, and I need to find the cheapest way to get the best organic results for each city that I cover.
Is there a generic answer? One SEO firm told me the more variety the better. Another told me that simple is better, and to use content on the simple pages to get variety. I'm so confused that I decided to consult the experts here!
Here's the site concept:
**FOR EACH CITY:**
User inputs a location: the main city only (e.g., Houston), or 1 of 40 city regions (suburb, etc.), or a zip code, or a zip-street combo, or a GPS lookup. A mileage range is defaulted or chosen by the user.
After the search area is determined, the user chooses 1 of 6 types of coupon searches:
1. Online shopping with national coupon codes, with a choice of 16 categories (electronics, health, clothes, etc.) and 100 subcategories (computers, skin care products, men's shirts). These are national offers for chains like Kohls, which do not use the user's location at all.
2. Local in-store shopping coupons, with the same 16 categories and 100 subcategories used for online shopping in #1 (a mom & pop shoe store or a local chain offer). The results fall within the user's chosen location and range.
3. Local restaurant coupons, with about 60 subcategories (pizza, fast food, sandwiches). The results are again within the user's chosen location and range.
4. Local services coupons, with 8 categories (auto repair, activities, etc.) and around 200 subcategories (brakes, miniature golf, etc.). Results are within the user's chosen location and range.
5. Local groceries. This is one page for the main city with coupons.com grocery coupons, listing the main grocery stores in the city. This page does not break down by subregion, zip code, etc.
6. Local weekly ad circulars. This is one page for the main city that displays about 50 major national stores located in that main city.
So, what is the best way to handle the URLs indexed for the dynamic searches by location, coupon type, category/subcategory, and business page?
The combinations of potential URLs to index are nearly unlimited:
Does the user's location matter when he searches for one thing (restaurants) but not for another (Kohls)? If so, how do I know this? Should I tailor indexed URLs to that knowledge? Is there an advantage to having a URL for national companies that ties to each main city: shopping/Kohls vs. shopping/Kohls/Houston, or even shopping/Kohls/Houston-suburb?
Again, I'm talking about 'follow' links for indexing. I realize I can have Google index just a few main categories and subcategories and not the others, or a few city regions but not all of them, etc., while actually having internal pages for all of them.
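The mechanism I have in mind is a robots meta tag on the pages I don't want indexed. A minimal sketch, using a hypothetical subcategory page:

```html
<!-- Sketch only: placed on a page that stays live for users but out of the index, -->
<!-- e.g. a hypothetical /houston-heights/services/brakes page. -->
<!-- "noindex, follow" asks engines not to index this page while still -->
<!-- following its links through to the pages I do want indexed. -->
<meta name="robots" content="noindex, follow">
```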
Is it better to have 10,000 URLs, say coupon-type/city-region/subcategory, or just one for the main city (main-city/all-coupons), or something in between? You get the gist. I don't know how to begin to figure out the answers to these kinds of questions, and yet they seem critical to the design of the site.
The competition: sites like Valpak, MoneyMailer, and localsaver seem to favor the 'more is better' approach, with coupons/zipcode/category or coupons/bizname/zipcode. But a site like 8coupons.com appears to have no indexing for categories or subcategories at all! They have city-subregion/coupons, and they have individual businesses at bizname/city-subregion, but as far as I can see, no city/category or city-subregion/category. And a very popular coupon site in my city has only maincity/coupons, maincity/a-few-categories, and maincity/bizname/coupons.
Sorry this is so long, but it seems very complicated to me and I wanted to make the issue as clear as possible. Thanks, couponguy
-
Great! I just sent you an email.
-
Hi,
Sure, I can do some analysis for you - I work solely as a freelance consultant right now. If you're keen, just send me an email (jane.copland@gmail.com). I can do a competitive analysis audit for the main competitors, which could be of use!
Cheers,
Jane
-
Thanks. Not knowing this well, are you for hire to check out some competitors if I give you some names? I can't afford to mess this up (over 5,000 hours of programming have gone into this). I know I should learn more, but I'm spread thin...
-
Hi,
If consolidation is an option, I'd certainly consider it. What I'd be curious about is the indexation and page count of your most successful competitors. I have not worked with a coupon site personally, and I must admit that the 8,000-pages-per-town number does concern me... however, what I'd do is run a ScreamingFrog crawl (http://www.screamingfrog.co.uk/seo-spider/ - you will need to pay for the premium account at $99 to remove the 500 URL limit) for a look at competitors' websites. This will show you the response codes, canonical tags, directives, etc. that others are using. I am not a fan of the idea that if your competitors are doing it, you should do it too, but this will give you a good idea of what is working for sites that manage to rank well for both smaller terms ([jiffy lube coupon post falls]) and big terms ([kohls coupons]).
I would say that 1,000 is preferable to 8,000 if structured properly, but I'd be really keen to know what the rest of your field in vouchers / coupons looks like from an indexed / live URLs perspective.
-
Thank you. 8,000 pages per city won't hurt me? That's perhaps my biggest concern... My structure right now has all those pages, but I want to make sure that's the best way to go... Alternatively, I could probably reduce the number to 1,000 or so by combining subcategories into 'grouped' subcategories (e.g., plumbers, carpenters, and contractors all go under 'home-repairs'). Is 1,000 better than 8,000?
-
Hi,
It really is complicated. I would definitely say that you do not need to think about building links to 8,000+ pages - the well-ranked competitors won't have good links to the majority of their internal pages, but they'll engage in good marketing that brings authority to the home page and similar high-level pages on the site. Then they'll link well, with good site structure, down through the categories. They'll also (for the most part) avoid duplication issues with canonical tags, although, as you point out, some duplication within sites like this is to be expected. Given that these sites' pages are indexed and often rank well, we have to assume that Google understands the nature of coupon sites, although you still need to be careful about site hygiene and keep a close eye on your Webmaster Tools account for crawl errors, etc.
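On the site-hygiene point, a complementary tool worth knowing about is robots.txt, which can stop parameterized duplicates from being crawled at all. This is a sketch only, borrowing parameter names from the competitor URLs discussed in this thread; note that Google cannot read a canonical tag on a URL it is blocked from crawling, so for any given URL pattern you would use one approach or the other:

```
# Sketch only: hypothetical patterns, adjust to your own URL scheme.
# Google supports the * wildcard here; these rules block crawling of
# parameterized duplicates while leaving the clean URLs crawlable.
User-agent: *
Disallow: /*?addressId=
Disallow: /*?pageNum=
```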
-
Thanks Jane,
This is, it seems, complicated, so I appreciate your taking the time to check into it.
Very good advice regarding avoiding duplication. Yet in the Olive Garden example, location IS important, so if I decide to go that route, I need to be sure there is content unique to the location (nearby offers, for example).
If there are 40 regions in a city and 200 subcategories, that's potentially 8,000 indexed pages without even listing businesses, so even a simple structure like you mention could yield a huge number of internal pages. I question the value of trying to build backlinks to 8,000 pages, and I worry about losing 'juice' from the home page if I do so. (I've read that PageRank is a very minor ranking factor these days, so maybe I need not worry about the juice issue at all. Your thoughts?)
-
Hi there,
The danger you face in creating tens of thousands of URLs/pages for everything on the site and allowing those pages to be indexed is that these pages will almost certainly duplicate each other. A coupon page for deals at Olive Garden in Phoenix will differ by only a word or two from a page about Olive Garden in Seattle.
This isn't stopping most of the competitors mentioned: Valpak is allowing these pages to be indexed, although I am not sure of their reach with these pages in terms of search engine performance. Users access a page like this one: http://www.valpak.com/coupons/printable/Jiffy-Lube/92048?addressId=1689472&offerId=1581320 from the main Spokane, WA page, but this URL contains a canonical tag that cuts off the ?addressId= section of the URL, leaving http://www.valpak.com/coupons/printable/Jiffy-Lube/92048. This URL is indexed: https://www.google.co.uk/search?q=http%3A%2F%2Fwww.valpak.com%2Fcoupons%2Fprintable%2FJiffy-Lube%2F92048&oq=http%3A%2F%2Fwww.valpak.com%2Fcoupons%2Fprintable%2FJiffy-Lube%2F92048&aqs=chrome..69i58j69i60j69i57.895j0j4&sourceid=chrome&espv=210&es_sm=119&ie=UTF-8 and I get it ranking sixth in google.com for [jiffy lube coupons post falls] (not the web's most competitive phrase, but an indicator that the site is indexed and able to rank well for "deep" pages).
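For clarity, the canonical tag in question is just a link element in the page's head. Based on the behaviour described above, Valpak's parameterized URL would carry something like this (my reconstruction of the pattern, not copied from their source):

```html
<!-- On the parameterized URL .../Jiffy-Lube/92048?addressId=1689472&offerId=1581320, -->
<!-- the canonical points at the clean URL, so the parameterized variants -->
<!-- consolidate into one indexable page: -->
<link rel="canonical" href="http://www.valpak.com/coupons/printable/Jiffy-Lube/92048">
```

The localsaver example below uses the same element, just mapping a zip-based URL onto a city-based one.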
MoneyMailer's pages are badly optimised - there isn't even a descriptive title tag here: http://www.moneymailer.com/coupons/online/seattle/wa/dining/855894?pageNum=1 - but the page is still indexed. That page doesn't rank for related terms, as far as I can see.
Regarding location, several sites are allowing URLs that do not denote location to load, with canonical tags pointing to location-based URLs. E.g., http://www.localsaver.com/98102/Real_Estate_Agents/Windermere_Eastlake/BDSP-12576652/931434.html is accessed from the Seattle, WA page, but its canonical tag points to http://www.localsaver.com/WA/Seattle/Windermere_Eastlake/BDSP-12576652/931434.html
I would imagine that location is pretty key, especially given the nature of the search queries you want to target, e.g. people who want a coupon for a restaurant near them. If people want to walk into a specific store or restaurant with a coupon, they will note the area. Where you will see people leave the area out is when they expect to buy online, or when the product is more generic than a specific store, e.g. shoes. Many sites seem to employ a combination, but those focusing on location are keeping it simple and mentioning coupons available at specific stores.
I would look at placing content in a structure that avoids duplication but keeps the site relatively simple, like coupons/region/category. You are seeing a lot of variation because there are multiple ways to go about this.
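To make that concrete, here is a sketch (hypothetical paths only) of what a coupons/region/category structure might look like:

```
coupons/houston/                        main-city hub page
coupons/houston/restaurants/            city-wide category page
coupons/houston-heights/                region page, only where there is real inventory
coupons/houston-heights/restaurants/    region + category, only if it carries unique content
```

The deeper levels (a subcategory page per region, for instance) are where the duplication risk we discussed tends to creep in.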