Which is better for local & national coupons: 1000s of indexed pages per city, or only a few?
-
Not sure where this belongs...
I am developing a coupon site for listing local coupons and national coupons (think Valpak + RetailMeNot), eventually in all major cities, and am VERY concerned about how many internal pages to let Google 'follow' for indexing, as it can exceed 10,000 per city.
Is there a way to determine the optimal approach for internal paging/indexing BEFORE I actually launch the site (it is about ready except for this darned URL question, which seems critical)? I.e., can I put in search terms for Google to determine which pages are most worthy of being indexed on their own? I'm a newbie of sorts, so please put the answer in simple terms. I'm one person with limited funds, and I need to find the cheapest way to get the best organic results for each city I cover.
Is there a generic answer? One SEO firm told me the more variety, the better. Another told me that simple is better, and to use content on the simple pages to get variety. So confused, I decided to consult the experts here!
Here's the site concept:
**FOR EACH CITY:**
User inputs a location: the main city only (i.e., Houston), or 1 of 40 city regions (suburb, etc.), or a ZIP code, or a ZIP-street combo, or a GPS lookup. A mileage range is defaulted or chosen by the user.
After the search area is determined, the user chooses one of 6 types of coupon searches:
1. Online shopping with national coupon codes: a choice of 16 categories (electronics, health, clothes, etc.) and 100 subcategories (computers, skin care products, men's shirts). These are national offers for chains like Kohl's, which do not use the user's location at all.
2. Local shopping in-store coupons: the same 16 categories and 100 subcategories used for online shopping in #1 (a mom & pop shoe store or a local chain offer). Results are within the user's chosen location and range.
3. Local restaurant coupons: about 60 subcategories (pizza, fast food, sandwiches). Results are again within the user's chosen location and range.
4. Local services coupons: 8 categories (auto repair, activities, etc.) and around 200 subcategories (brakes, miniature golf, etc.). Results are within the user's chosen location and range.
5. Local groceries. This is one page for the main city with Coupons.com grocery coupons, listing the main grocery stores in the city. This page does not break down by sub-region, ZIP code, etc.
6. Local weekly ad circulars. This is one page for the main city that displays about 50 main national stores located in that main city.
So, what's the best way to handle the URLs indexed for the dynamic searches by location, type of coupon, categories/subcategories, and business pages?
The combinations of potential URLs to index are nearly unlimited.
Does the user's location matter when he searches for one thing (restaurants) but not for another (Kohl's)? If so, how do I know this? Should I tailor indexed URLs to that knowledge? Is there an advantage to having a URL for NATIONAL companies that ties to each main city: shopping/Kohls vs. shopping/Kohls/Houston, or even shopping/Kohls/Houston-suburb?
Again, I'm talking about 'follow' links for indexing. I realize I can have Google index just a few main categories and subcategories and not the others, or a few city regions but not all of them, etc., while actually having internal pages for all of them.
Is it better to have 10,000 URLs like coupon-type/city-region/subcategory, or just one for the main city (main-city/all coupons), or something in between? You get the gist. I don't know how to begin to figure out the answers to these kinds of questions, and yet they seem critical to the design of the site.
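For what it's worth, here's roughly how I'm picturing it in code (all the page-type names here are just illustrative, not my real structure): every page would still exist for users, and a meta robots tag would decide which ones Google is allowed to index, with 'follow' kept on either way so link equity still flows.

```python
# Illustrative sketch only: page types I might allow into the index
# vs. types I'd keep crawlable but unindexed. Names are hypothetical.
INDEXABLE = {"main-city", "category", "business"}
NOINDEX = {"subcategory", "city-region-subcategory", "zip-street"}

def robots_meta(page_type: str) -> str:
    """Return the meta robots tag for a page type. 'noindex,follow'
    keeps the page out of search results while still letting Google
    follow its links; 'index,follow' allows it into the index."""
    if page_type in INDEXABLE:
        content = "index,follow"
    elif page_type in NOINDEX:
        content = "noindex,follow"
    else:
        raise ValueError(f"unknown page type: {page_type}")
    return f'<meta name="robots" content="{content}">'
```

So, e.g., `robots_meta("subcategory")` would emit a noindex,follow tag while `robots_meta("main-city")` would emit index,follow. The question is which page types belong in which set.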
The competition: sites like Valpak, MoneyMailer, and LocalSaver seem to favor the 'more is better' approach, with coupons/zipcode/category or coupons/bizname/zipcode. But a site like 8coupons.com appears to have no indexing for categories or subcategories at all! They have city-subregion/coupons, and they have individual businesses at bizname/city-subregion, but as far as I can see, no city/category or city-subregion/category. And a very popular coupon site in my city only has maincity/coupons, maincity/a few categories, and maincity/bizname/coupons.
Sorry this is so long, but it seems very complicated to me and I wanted to make the issue as clear as possible. Thanks, couponguy
-
Great! I just sent you an email.
-
Hi,
Sure, I can do some analysis for you - I work solely as a freelance consultant right now. If you're keen, just send me an email (jane.copland@gmail.com). I can do a competitive analysis audit for the main competitors, which could be of use!
Cheers,
Jane
-
Thanks. Not knowing this field well, are you for hire to check out some competitors if I give you some names? I can't afford to mess this up (I have over 5,000 hours of programming in this). I know I should learn more, but I'm spread thin...
-
Hi,
If consolidation is an option, I'd certainly consider it. What I'd be curious about is the indexation and page count of your most successful competitors. I have not worked with a coupon site personally, and I must admit that the 8,000-page number per town does concern me. However, what I'd do is run a Screaming Frog crawl (http://www.screamingfrog.co.uk/seo-spider/ - you will need to pay for the premium account at $99 to remove the 500-URL limit) to look at competitors' websites. This will show you the response codes, canonical tags, directives, etc. that others are using. I am not a fan of the idea that if your competitors are doing it, you should do it too, but this will give you a good idea of what is working for sites that manage to rank well for both smaller terms ([jiffy lube coupon post falls]) and big terms ([kohls coupons]).
I would say that 1,000 pages is preferable to 8,000 if structured properly, but I'd be really keen to know what the rest of your field in vouchers/coupons looks like from an indexed/live URLs perspective.
-
Thank you. 8,000 pages per city won't hurt me? That's perhaps my biggest concern... My structure right now has all those pages, but I want to make sure that's the best way to go. Alternatively, I could probably reduce the number to 1,000 or so by combining subcategories into 'grouped' subcategories (i.e., all plumbers, carpenters, and contractors go under 'home repairs'). Is 1,000 better than 8,000?
-
Hi,
It really is complicated - I would definitely say that you do not need to think about building links to 8,000+ pages. The well-ranked competitors won't have good links to the majority of their internal pages; instead, they'll engage in good marketing that brings authority to the home page and similar high-level pages on the site. Then they'll link well, with good site structure, down through the categories. They'll also (for the most part) avoid duplication issues with canonical tags, although as you point out, some duplication within sites like this is to be expected. Because these sites' pages are indexed and often rank well, we have to assume that Google understands the nature of coupon sites, although you still need to be careful with site hygiene and keep a close eye on your Webmaster Tools account for crawl errors, etc.
-
Thanks Jane,
This is, it seems, complicated, so I appreciate your taking the time to check into it.
Very good advice regarding avoiding duplication. Yet in the Olive Garden example, location IS important, so if I decide to go that route I need to be sure there is content unique to the location (maybe nearby offers, for example).
If there are 40 regions in a city and 200 subcategories, that's potentially 8,000 indexed pages without even listing businesses, so even a simple structure like you mention could yield a huge number of internal pages. I question the value of trying to build backlinks to 8,000 pages, and I worry about losing 'juice' from the home page if I do so. (I've read that PageRank is no longer much of a ranking factor these days, so maybe I need not worry about the juice issue at all - your thoughts?)
-
Hi there,
The danger you face in creating tens of thousands of URLs/pages for everything on the site and allowing those pages to be indexed is that these pages will almost certainly duplicate each other. A coupon page for deals at Olive Garden in Phoenix will not be different, besides one or two words, from a page about Olive Garden in Seattle.
This isn't stopping most of the competitors mentioned: Valpak is allowing these pages to be indexed, although I am not sure of their reach with these pages in terms of search engine performance. Users access a page like this one: http://www.valpak.com/coupons/printable/Jiffy-Lube/92048?addressId=1689472&offerId=1581320 from the main Spokane, WA page, but this URL contains a canonical tag that cuts off the ?addressId= section of the URL, leaving http://www.valpak.com/coupons/printable/Jiffy-Lube/92048. That URL is indexed, and I see it ranking sixth in google.com for [jiffy lube coupons post falls] (not the web's most competitive phrase, but an indicator that the site is indexed and able to rank well for "deep" pages).
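To illustrate what that canonical tag is doing (a sketch of the pattern, not Valpak's actual code): the page drops the query string and declares the bare URL as canonical, so every parameterized variant consolidates into one indexed page.

```python
from urllib.parse import urlsplit, urlunsplit

def canonical_url(url: str) -> str:
    """Drop the query string and fragment so all parameterized
    variants of a page resolve to one canonical URL."""
    parts = urlsplit(url)
    return urlunsplit((parts.scheme, parts.netloc, parts.path, "", ""))

def canonical_tag(url: str) -> str:
    """The link element a page would emit in its <head>."""
    return f'<link rel="canonical" href="{canonical_url(url)}">'
```

Run against the Jiffy Lube URL above, `canonical_url` returns http://www.valpak.com/coupons/printable/Jiffy-Lube/92048 - exactly the URL that ends up indexed.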
MoneyMailer's pages are badly optimised - there isn't even a descriptive title tag here: http://www.moneymailer.com/coupons/online/seattle/wa/dining/855894?pageNum=1 - but the page is still indexed. That page doesn't rank for related terms, as far as I can see.
Regarding location, several sites allow URLs that do not denote location to load, with canonical tags pointing to location-based URLs. E.g., http://www.localsaver.com/98102/Real_Estate_Agents/Windermere_Eastlake/BDSP-12576652/931434.html is accessed from the Seattle, WA page, but its canonical tag points to http://www.localsaver.com/WA/Seattle/Windermere_Eastlake/BDSP-12576652/931434.html
I would imagine that location is pretty key, especially given the nature of the search queries you want to target, e.g. people who want a coupon for a restaurant local to them. If people want to walk into a specific store or restaurant with a coupon, they will note the area. Where you will see people leave the area out is when they expect to buy online, or when the product is more generic than a specific store, e.g. shoes. Many sites seem to employ a combination, but those focusing on location are keeping it simple and mentioning coupons available at specific stores.
I would look at placing content in a structure that avoids duplication but keeps the site relatively simple, like coupons/region/category. You are seeing a lot of variation because there are multiple ways to go about this.
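As a rough sketch of that coupons/region/category pattern (the slugging rules here are an assumption for illustration, not a prescription), each region/category pair maps to exactly one clean, indexable path:

```python
import re

def slugify(text: str) -> str:
    """Lowercase a name, replace runs of punctuation/whitespace with
    hyphens, and trim stray hyphens, for use as a URL segment."""
    return re.sub(r"[^a-z0-9]+", "-", text.lower()).strip("-")

def coupon_path(region: str, category: str) -> str:
    """Build the one indexable URL for a region/category pair."""
    return f"/coupons/{slugify(region)}/{slugify(category)}"
```

So "Post Falls" and "Auto Repair" would always produce /coupons/post-falls/auto-repair, with no parameterized duplicates competing for the same query.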