Which is Better for Local & National Coupons: 1000s of Indexed Pages per City, or Only a Few?
-
Not sure where this belongs...
I am developing a coupon site listing local coupons and national coupons (think Valpak plus RetailMeNot), eventually in all major cities, and I am VERY concerned about how many internal pages to let Google 'follow' for indexing, as the count can exceed 10,000 per city.
Is there a way to determine the optimal approach for internal paging/indexing BEFORE I actually launch the site? (It is about ready except for this darned URL question, which seems critical.) I.e., can I research search terms to determine which pages are most worthy of having their own indexed page? I'm a newbie, sort of, so please put the answer in simple terms. I'm one person with limited funds, and I need to find the cheapest way to get the best organic results for each city that I cover.
Is there a generic answer? One SEO firm told me the more variety the better. Another told me that simpler is better, and to use content on the simple pages to get variety. I was so confused I decided to consult the experts here!
Here's the site concept:
**FOR EACH CITY:**
The user inputs a location: the main city only (e.g. Houston), one of 40 city regions (suburb, etc.), a zip code, a zip-street combo, or a GPS lookup. A mile range is defaulted or chosen by the user.
After the search area is determined, the user chooses one of six types of coupon searches:
1. Online shopping with national coupon codes: a choice of 16 categories (electronics, health, clothes, etc.) and 100 subcategories (computers, skin care products, men's shirts). These are national offers for chains like Kohls and do not use the user's location at all.
2. Local in-store shopping coupons: the same 16 categories and 100 subcategories used for online shopping in #1 (a mom & pop shoe store or a local chain offer). The results fall within the user's chosen location and range.
3. Local restaurant coupons: about 60 subcategories (pizza, fast food, sandwiches). The results are again within the user's chosen location and range.
4. Local services coupons: 8 categories (auto repair, activities, etc.) and around 200 subcategories (brakes, miniature golf, etc.). Results fall within the user's chosen location and range.
5. Local groceries: one page for the main city with coupons.com grocery coupons, listing the main grocery stores in the city. This page does not break down by sub-region, zip code, etc.
6. Local weekly ad circulars: one page for the main city that displays about 50 major national stores located in that city.
So, what is the best way to handle the URLs indexed for the dynamic searches by location, type of coupon, category/subcategory, and business pages?
The combinations of potential URLs to index are nearly unlimited:
Does the user's location matter when he searches for one thing (restaurants) but not for another (Kohls)? If so, how do I know this? Should I tailor indexed URLs to that knowledge? Is there an advantage to having a URL for national companies that ties to each main city: shopping/Kohls vs. shopping/Kohls/Houston, or even shopping/Kohls/Houston-suburb?
Again, I'm talking about 'follow' links for indexing. I realize I can have Google index just a few main categories and subcategories and not the others, or a few city regions but not all of them, etc., while actually having internal pages for all of them.
Is it better to have 10,000 URLs, say coupon-type/city-region/subcategory, or just one for the main city (main-city/all-coupons), or something in between? You get the gist. I don't know how to begin figuring out the answers to these kinds of questions, and yet they seem critical to the design of the site.
The competition: sites like Valpak, MoneyMailer, and localsaver seem to favor the 'more is better' approach, with coupons/zipcode/category or coupons/bizname/zipcode. But a site like 8coupons.com appears to have no indexing for categories or subcategories at all! They have city-subregion/coupons and individual business pages at bizname/city-subregion, but as far as I can see no city/category or city-subregion/category. And a very popular coupon site in my city has only maincity/coupons, maincity/a-few-categories, and maincity/bizname/coupons.
Sorry this is so long, but it seems very complicated to me and I wanted to make the issue as clear as possible. Thanks, couponguy
-
Great! I just sent you an email.
-
Hi,
Sure, I can do some analysis for you - I work solely as a freelance consultant right now. If you're keen, just send me an email (jane.copland@gmail.com). I can do a competitive analysis audit for the main competitors, which could be of use!
Cheers,
Jane
-
Thanks. Not knowing this area well: are you for hire to check out some competitors if I give you some names? I can't afford to mess this up (I have over 5,000 hours of programming in this). I know I should learn more, but I'm spread thin...
-
Hi,
If consolidation is an option, I'd certainly consider it. What I'd be curious about is the indexation and page count of your most successful competitors. I have not worked with a coupon site personally, and I must admit that the 8,000-pages-per-town figure does concern me. However, what I'd do is run a ScreamingFrog crawl (http://www.screamingfrog.co.uk/seo-spider/ - you will need to pay for the premium account at $99 to remove the 500 URL limit) for a look at competitors' websites. This will show you the response codes, canonical tags, directives, etc. that others are using. I am not a fan of the idea that if your competitors are doing it, you should do it too, but this will give you a good idea of what is working for sites that manage to rank well for both smaller terms ([jiffy lube coupon post falls]) and big terms ([kohls coupons]).
I would say that 1,000 is preferable to 8,000 if structured properly, but I'd be really keen to know what the rest of your field in vouchers / coupons looks like from an indexed / live URLs perspective.
-
Thank you. 8,000 pages per city won't hurt me? That's perhaps my biggest concern. My structure right now has all those pages, but I want to make sure that's the best way to go. Alternatively, I could probably reduce the number to 1,000 or so by combining subcategories into 'grouped' subcategories (e.g. plumbers, carpenters, and contractors all go under 'home-repairs'). Is 1,000 better than 8,000?
-
Hi,
It really is complicated. I would definitely say that you do not need to think about building links to 8,000+ pages: the well-ranked competitors won't have good links to the majority of their internal pages, but they'll engage in good marketing that brings authority to the home page and similar high-level pages on the site. Then they'll link well, with good site structure, down through the categories. They'll also (for the most part) avoid duplication issues with canonical tags, although as you point out, some duplication within sites like this is to be expected. Since these sites' pages are indexed and often rank well, we have to assume that Google understands the nature of coupon sites, although you still need to be careful about site hygiene and keep a close eye on your Webmaster Tools account for crawl errors, etc.
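If you do decide to keep most of those deep pages out of the index while still letting Google crawl through them to the pages you do want indexed, the usual mechanism is a robots meta tag. A minimal sketch (the URL and page are hypothetical, just to illustrate the pattern):

```html
<!-- Hypothetical thin subcategory page, e.g. /houston/northside/miniature-golf -->
<head>
  <title>Miniature Golf Coupons - Northside, Houston</title>
  <!-- "noindex,follow": keeps this page out of Google's index, but lets the
       crawler follow its links through to pages you do want indexed -->
  <meta name="robots" content="noindex,follow">
</head>
```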
-
Thanks Jane,
This is, it seems, complicated, so I appreciate your taking the time to check into it.
Very good advice regarding avoiding duplication. Yet in the Olive Garden example, location IS important, so if I decide to go that route I need to be sure there is content unique to each location (nearby offers, for example).
If there are 40 regions in a city and 200 subcategories, that's potentially 8,000 indexed pages without even listing businesses, so even a simple structure like you mention could yield a huge number of internal pages. I question the value of trying to build backlinks to 8,000 pages, and I worry about losing 'juice' from the home page if I do so. (I've read that PageRank is a very minor ranking factor these days, so maybe I need not worry about the juice issue at all. Your thoughts?)
-
Hi there,
The danger you face in creating tens of thousands of URLs/pages for everything on the site and allowing those pages to be indexed is that the pages will almost certainly duplicate each other. A coupon page for deals at Olive Garden in Phoenix will not be different, beyond one or two words, from a page about Olive Garden in Seattle.
This isn't stopping most of the competitors mentioned: Valpak is allowing these pages to be indexed, although I am not sure of their reach with these pages in terms of search engine performance. Users access a page like this one: http://www.valpak.com/coupons/printable/Jiffy-Lube/92048?addressId=1689472&offerId=1581320 from the main Spokane, WA page, but this URL contains a canonical tag that cuts off the ?addressId= section of the URL, leaving http://www.valpak.com/coupons/printable/Jiffy-Lube/92048. This URL is indexed: https://www.google.co.uk/search?q=http%3A%2F%2Fwww.valpak.com%2Fcoupons%2Fprintable%2FJiffy-Lube%2F92048&oq=http%3A%2F%2Fwww.valpak.com%2Fcoupons%2Fprintable%2FJiffy-Lube%2F92048&aqs=chrome..69i58j69i60j69i57.895j0j4&sourceid=chrome&espv=210&es_sm=119&ie=UTF-8 and I get it ranking sixth in google.com for [jiffy lube coupons post falls] (not the web's most competitive phrase, but an indicator that the site is indexed and able to rank well for "deep" pages).
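In markup terms, that canonical looks something like this (a reconstruction based on the URLs above, not a copy of Valpak's actual source):

```html
<!-- Served on the parameterised URL:
     http://www.valpak.com/coupons/printable/Jiffy-Lube/92048?addressId=1689472&offerId=1581320 -->
<head>
  <!-- Points search engines at the clean URL, consolidating the
       ?addressId= / ?offerId= variants into one indexable page -->
  <link rel="canonical"
        href="http://www.valpak.com/coupons/printable/Jiffy-Lube/92048">
</head>
```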
MoneyMailer's pages are badly optimised - not even a descriptive title tag here: http://www.moneymailer.com/coupons/online/seattle/wa/dining/855894?pageNum=1 but the page is still indexed. That page doesn't rank for related terms, as far as I can see.
Regarding location, several sites allow URLs that do not denote location to load, with canonical tags pointing to location-based URLs. E.g. http://www.localsaver.com/98102/Real_Estate_Agents/Windermere_Eastlake/BDSP-12576652/931434.html is accessed from the Seattle, WA page, but its canonical tag points to http://www.localsaver.com/WA/Seattle/Windermere_Eastlake/BDSP-12576652/931434.html
I would imagine that location is pretty key, especially given the nature of the search queries you want to target, e.g. people who want a coupon for a restaurant local to them. If people want to walk into a specific store or restaurant with a coupon, they will note the area. Where you will see people leave the area out is when they expect to buy online, or when the product is more generic than a specific store, e.g. shoes. Many sites seem to employ a combination, but those focusing on location are keeping it simple and mentioning coupons available at specific stores.
I would look at placing content in a structure that avoids duplication but keeps the site relatively simple, like coupons/region/category. You are seeing a lot of variation because there are multiple valid ways to go about this.
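As a rough sketch of that kind of structure (all URLs hypothetical), each region page would link down through its categories, so users and crawlers alike reach deep pages through a short, logical path:

```html
<!-- Hypothetical region page at /coupons/houston-northside/ -->
<nav>
  <a href="/coupons/">All cities</a> &gt;
  <a href="/coupons/houston/">Houston</a> &gt;
  <a href="/coupons/houston-northside/">Northside</a>
</nav>
<ul>
  <li><a href="/coupons/houston-northside/restaurants/">Restaurant coupons</a></li>
  <li><a href="/coupons/houston-northside/services/">Service coupons</a></li>
  <li><a href="/coupons/houston-northside/shopping/">In-store shopping coupons</a></li>
</ul>
```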