Which is Better for Local & National Coupons: 1000s of Indexed Pages per City, or Only a Few?
-
Not sure where this belongs...
I am developing a coupons site listing local and national coupons (think Valpak + RetailMeNot), eventually in all major cities, and am VERY concerned about how many internal pages to let Google 'follow' for indexing, as the count can exceed 10,000 per city.
Is there a way to determine the optimal approach to internal paging/indexing BEFORE I actually launch the site? (It is about ready except for this darned URL question, which seems critical.) I.e., can I put in search terms for Google to determine which pages are most worthy of being indexed on their own? I'm a newbie of sorts, so please put the answer in simple terms. I'm one person with limited funds, and I need to find the cheapest way to get the best organic results for each city I cover.
Is there a generic answer? One SEO firm told me the more variety the better. Another told me that simpler is better, and to use content on the simple pages to get variety. I'm so confused I decided to consult the experts here!
Here's the site concept:
**FOR EACH CITY:**
The user inputs a location: the main city only (e.g. Houston), one of 40 city regions (suburbs, etc.), a zip code, a zip-street combo, or a GPS lookup. A mileage range is defaulted or chosen by the user.
After the search area is determined, the user chooses one of six types of coupon search:
1. Online shopping with national coupon codes: a choice of 16 categories (electronics, health, clothes, etc.) and 100 subcategories (computers, skin care products, men's shirts). These are national offers for chains like Kohl's, which do not use the user's location at all.
2. Local in-store shopping coupons: the same 16 categories and 100 subcategories used for online shopping in #1 (a mom & pop shoe store or a local chain offer). Results fall within the user's chosen location and range.
3. Local restaurant coupons: about 60 subcategories (pizza, fast food, sandwiches). Results are again within the user's chosen location and range.
4. Local services coupons: 8 categories (auto repair, activities, etc.) and around 200 subcategories (brakes, miniature golf, etc.). Results within the user's chosen location and range.
5. Local groceries. This is one page for the main city, with coupons.com grocery coupons and a listing of the main grocery stores in the city. This page does not break down by subregion, zip code, etc.
6. Local weekly ad circulars. This is one page for the main city that displays about 50 major national stores located in that main city.
So, what is the best way to handle the URLs indexed for the dynamic searches by location, coupon type, category/subcategory, and business page?
The combinations of potential URLs to index are nearly unlimited.
Does the user's location matter when he searches for one thing (restaurants) but not for another (Kohl's)? If so, how do I know this? Should I tailor indexed URLs to that knowledge? Is there an advantage to having a URL for national companies that ties to each main city: shopping/Kohls vs. shopping/Kohls/Houston, or even shopping/Kohls/Houston-suburb?
Again, I'm talking about 'follow' links for indexing. I realize I can have Google index just a few main categories and subcategories and not the others, or a few city regions but not all of them, etc., while actually having internal pages for all of them.
Is it better to have 10,000 URLs like coupon-type/city-region/subcategory, or just one for the main city (main-city/all-coupons), or something in between? You get the gist. I don't know how to begin to figure out the answers to these kinds of questions, and yet they seem critical to the design of the site.
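To illustrate what I mean by having internal pages for everything while only letting some of them be indexed, here's a simplified sketch of the kind of toggle I have in mind (made-up names, not my actual code):

```python
# Simplified illustration (subcategory names are invented): every page
# exists and is crawlable, but only whitelisted subcategories get an
# indexable robots meta tag.
INDEXABLE_SUBCATS = {"pizza", "auto-repair", "skin-care"}

def robots_meta(subcat):
    # noindex,follow keeps a page out of Google's index while still
    # letting its links pass through to the pages that ARE indexed.
    if subcat in INDEXABLE_SUBCATS:
        return '<meta name="robots" content="index,follow">'
    return '<meta name="robots" content="noindex,follow">'

print(robots_meta("pizza"))      # -> index,follow
print(robots_meta("mini-golf"))  # -> noindex,follow
```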
The competition: sites like Valpak, MoneyMailer, and LocalSaver seem to favor the 'more is better' approach, with coupons/zipcode/category or coupons/bizname/zipcode. But a site like 8coupons.com appears to have no indexing for categories or subcategories at all! They have city-subregion/coupons, and they have individual businesses at bizname/city-subregion, but as far as I can see, no city/category or city-subregion/category. And a very popular coupon site in my city has only maincity/coupons, maincity/a-few-categories, and maincity/bizname/coupons.
Sorry this is so long, but it seems very complicated to me and I wanted to make the issue as clear as possible. Thanks, couponguy
-
Great! I just sent you an email.
-
Hi,
Sure, I can do some analysis for you - I work solely as a freelance consultant right now. If you're keen, just send me an email (jane.copland@gmail.com). I can do a competitive analysis audit for the main competitors, which could be of use!
Cheers,
Jane
-
Thanks. Not knowing this well, are you for hire to check out some competitors if I give you some names? I can't afford to mess this up (over 5,000 hours of programming into this). I know I should learn more, but I'm spread thin...
-
Hi,
If consolidation is an option, I'd certainly consider it. What I'd be curious about is the indexation and page count of your most successful competitors. I have not worked with a coupon site personally, and I must admit that the 8,000-pages-per-town figure does concern me... however, what I'd do is run a Screaming Frog crawl (http://www.screamingfrog.co.uk/seo-spider/ - you will need to pay for the premium account at $99 to remove the 500-URL limit) for a look at competitors' websites. This will show you the response codes, canonical tags, directives, etc. that others are using. I am not a fan of the idea that if your competitors are doing it, you should do it too, but this will give you a good idea of what is working for sites that manage to rank well for both smaller terms ([jiffy lube coupon post falls]) and big terms ([kohls coupons]).
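If you want a quick spot-check on a handful of competitor URLs before paying for the tool, a rough sketch along these lines pulls the same basics. (This is just an illustration, assuming Python with the requests and beautifulsoup4 packages installed; it is no substitute for a full crawl.)

```python
# Rough sketch: fetch one URL and report what a crawler would show -
# the response status code, canonical tag, and robots meta directive.
import requests
from bs4 import BeautifulSoup

def inspect(url):
    resp = requests.get(url, timeout=10)
    soup = BeautifulSoup(resp.text, "html.parser")
    canonical = soup.find("link", rel="canonical")
    robots = soup.find("meta", attrs={"name": "robots"})
    print(url)
    print("  status:   ", resp.status_code)
    print("  canonical:", canonical["href"] if canonical else "none")
    print("  robots:   ", robots["content"] if robots else "none (defaults to index,follow)")

inspect("http://www.valpak.com/coupons/printable/Jiffy-Lube/92048")
```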
I would say that 1,000 pages is preferable to 8,000 if structured properly, but I'd be really keen to know what the rest of your field in vouchers/coupons looks like from an indexed/live URLs perspective.
-
Thank you. 8,000 pages per city won't hurt me? That's perhaps my biggest concern... My structure right now has all those pages, but I want to make sure that's the best way to go. Alternatively, I could probably reduce the number to 1,000 or so by combining subcategories into 'grouped' subcategories (e.g. all plumbers, carpenters, and contractors go under 'home-repairs'). Is 1,000 better than 8,000?
-
Hi,
It really is complicated. I would definitely say that you do not need to think about building links to 8,000+ pages: the well-ranked competitors won't have good links to the majority of their internal pages, but they'll engage in good marketing that brings authority to the home page and similar high-level pages on the site. Then they'll link well, with good site structure, down through the categories. They'll also (for the most part) avoid duplication issues with canonical tags, although, as you point out, some duplication within sites like this is to be expected. Because these sites' pages are indexed and often rank well, we have to assume that Google understands the nature of coupon sites, although you still need to be careful with site hygiene and keep a close eye on your Webmaster Tools account for crawl errors, etc.
-
Thanks Jane,
This is, it seems, complicated, so I appreciate your taking the time to check into it.
Very good advice regarding avoiding duplication. Yet in the Olive Garden example, location IS important, so if I decide to go that route I need to be sure there is content unique to each location (nearby offers, for example).
If there are 40 regions in a city and 200 subcategories, that's potentially 8,000 indexed pages without even listing businesses, so even a simple structure like you mention could yield a huge number of internal pages. I question the value of trying to build backlinks to 8,000 pages and worry about losing 'juice' from the home page if I do so. (I've read that PageRank is a very minor ranking factor these days, so maybe I need not worry about the juice issue at all; your thoughts?)
-
Hi there,
The danger you face in creating tens of thousands of URLs/pages for everything on the site and allowing those pages to be indexed is that it is almost certain these pages will essentially duplicate each other. A coupon page for deals at Olive Garden in Phoenix will not differ, beyond one or two words, from a page about Olive Garden in Seattle.
This isn't stopping most of the competitors mentioned: Valpak allows these pages to be indexed, although I am not sure of their reach with these pages in terms of search engine performance. Users access a page like this one: http://www.valpak.com/coupons/printable/Jiffy-Lube/92048?addressId=1689472&offerId=1581320 from the main Spokane, WA page, but the URL contains a canonical tag that cuts off the ?addressId= section, leaving http://www.valpak.com/coupons/printable/Jiffy-Lube/92048. That URL is indexed: https://www.google.co.uk/search?q=http%3A%2F%2Fwww.valpak.com%2Fcoupons%2Fprintable%2FJiffy-Lube%2F92048&oq=http%3A%2F%2Fwww.valpak.com%2Fcoupons%2Fprintable%2FJiffy-Lube%2F92048&aqs=chrome..69i58j69i60j69i57.895j0j4&sourceid=chrome&espv=210&es_sm=119&ie=UTF-8 and I see it ranking sixth in google.com for [jiffy lube coupons post falls] (not the web's most competitive phrase, but an indicator that the site is indexed and able to rank well for 'deep' pages).
MoneyMailer's pages are badly optimised - there isn't even a descriptive title tag here: http://www.moneymailer.com/coupons/online/seattle/wa/dining/855894?pageNum=1 - but the page is still indexed. That page doesn't rank for related terms, as far as I can see.
Regarding location, several sites allow URLs that do not denote location to load, with canonical tags pointing to location-based URLs. For example, http://www.localsaver.com/98102/Real_Estate_Agents/Windermere_Eastlake/BDSP-12576652/931434.html is accessed from the Seattle, WA page, but its canonical tag points to http://www.localsaver.com/WA/Seattle/Windermere_Eastlake/BDSP-12576652/931434.html
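To make that pattern explicit, here is a minimal sketch (an invented Python example, not Valpak's or LocalSaver's actual code) of how a canonical tag folds query-string variants of a page back into one clean URL:

```python
# Minimal sketch (invented example): strip the query string from a
# requested URL to produce the canonical URL the page should declare.
from urllib.parse import urlsplit, urlunsplit

def canonical_for(requested_url):
    scheme, host, path, _query, _fragment = urlsplit(requested_url)
    return urlunsplit((scheme, host, path, "", ""))

url = ("http://www.valpak.com/coupons/printable/Jiffy-Lube/92048"
       "?addressId=1689472&offerId=1581320")
print('<link rel="canonical" href="%s">' % canonical_for(url))
# -> <link rel="canonical" href="http://www.valpak.com/coupons/printable/Jiffy-Lube/92048">
```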
I would imagine that location is pretty key, especially given the nature of the search queries you want to target, e.g. people who want a coupon for a restaurant local to them. If people want to walk into a specific store or restaurant with a coupon, they will note the area. Where you will see people leave the area out is when they expect to buy online, or the product is more generic than a specific store, e.g. shoes. Many sites seem to employ a combination, but those focusing on location keep it simple and mention the coupons available at specific stores.
I would look at placing content in a structure that avoids duplication but keeps the site hierarchy relatively simple, like coupons/region/category. You are seeing a lot of variation because there are multiple ways to go about this.
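As a purely illustrative sketch of that kind of structure (Python/Flask, with invented region and category slugs, not a definitive implementation): one clean, indexable URL per region/category pair, and a 404 for any combination that shouldn't exist as a page.

```python
# Illustrative sketch (slugs are invented): a single route produces one
# clean URL per region/category, e.g. /coupons/montrose/restaurants,
# and unknown combinations 404 instead of becoming crawlable thin pages.
from flask import Flask, abort

app = Flask(__name__)

REGIONS = {"montrose", "katy", "sugar-land"}          # assumed region slugs
CATEGORIES = {"restaurants", "services", "shopping"}  # assumed categories

@app.route("/coupons/<region>/<category>")
def coupons(region, category):
    if region not in REGIONS or category not in CATEGORIES:
        abort(404)
    return "Local coupons: %s in %s" % (category, region)

if __name__ == "__main__":
    app.run()
```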