Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Intermediate & Advanced SEO

Looking to level up your SEO techniques? Chat through more advanced approaches.


  • I understand the concept of silo architecture. What I don't understand is how to build the site navigation. I see experts talking about silos, but their sites have pervasive top-level navigation. In theory, your top-level nav breaks your silos. If I have 20 pages of supporting content all linked to my silo page, and the top nav is on the supporting content pages, then those pages all link to the pages in the top nav - silo broken, and link juice diluted. It would seem to me that the only way to build a true silo is to strip out all of the navigation on a supporting page and only have it link to: 1. The silo landing page 2. Other supporting pages in the silo. Is this what Bruce Clay does? I've seen Rand's lectures on silos as well. Is this what he is doing? I recently saw a video by the Network Empire team, and they also have a pervasive nav. Can someone please explain this?

    | CsmBill
    0

  • So, I was thinking to myself today: couldn't Google say everything is exact-match anchor text, in reality? Such as "Hyundai in Boston", or "cars in Boston"? I'm just concerned, that's all. Thanks for your help.

    | PeterRota
    0

  • I have a question. I was wondering if it is possible to set up a rel canonical when I can't access the non-canonical pages. For example, my site is at www.site.com, but the non-canonical version is at site.com. Is there any way to set that up without actually editing it at site.com? Thanks for your help.

    | PeterRota
    0
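A server-level 301 is the usual way to handle this when the non-canonical host can't be edited page by page. A minimal sketch, assuming Apache with mod_rewrite and access to the site's .htaccess file (the hostnames are the placeholders from the question):

```apache
# Hypothetical .htaccess rule: 301 every bare-domain request to the
# www host, so site.com/... never competes with www.site.com/...
RewriteEngine On
RewriteCond %{HTTP_HOST} ^site\.com$ [NC]
RewriteRule ^(.*)$ http://www.site.com/$1 [R=301,L]
```

If even server config is out of reach, rel="canonical" can also be sent as an HTTP Link header, but that still requires control of whatever serves the non-canonical URLs.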

  • Why doesn't Google recommend that links are nofollow as standard, via HTML5, etc., with follow being added if the link is on a quality site (defined by PR, or whatever) and adds value? Wouldn't this save a lot of time? Then they could whack all the sites with coding that doesn't comply, couldn't they? Also, instead of enabling negative SEO, why doesn't Google simply focus on wiping out the sites developed simply to pass on PR? I'm sure we could all send them a few thousand suggestions!

    | McTaggart
    0

  • I've got lots of news posts on my blog. There is nothing wrong with the news posts themselves, but older posts do get lower CTR and higher bounce rates. I was considering moving the older news to a subdomain (e.g. archive.mywebsite.com) and doing a 302 redirect for each post. What do you think?

    | sbrault74
    0

  • Hi, I'm noticing increasing numbers of scraped directory links pointing back to the websites I manage. Much of this info appears to be scraped from a well known (and respected) directory. I don't build links to any of the websites I manage - and none have more than 200 linking root domains currently - not that many. The problem is I focus on quality links, and the scraped links are incredibly weak on the whole, diluting the quality links. I've noticed a certain paranoia in the SEO community about removing / disavowing links, and yet I'm tempted to ignore the rubbish (unless it's part of a major negative SEO push) and just get on with the job, focusing on quality content that drives natural links, and social media work.

    | McTaggart
    0

  • From a purely SEO / link juice perspective, is there any benefit to linking from body text to a page that is in a pervasive primary navigation? The primary nav puts a link at the top of the HTML. According to the tests done by members of this site, the "first link counts" rule negates the link juice value of a link in the body text if there is already a link in the nav. Now, I've also seen the data on using hash fragments to get a second or third link counted, but ignoring that, it would seem that links in the body text to pages in the nav have zero effect. This brings me to another question - block-level navigation. If anchor text links in content pass more juice than links in the top navigation, why would you put your most coveted target pages in the top nav? You would be better off building links in the content, which would create a poor user experience. To me, the theory that anchor text links in the body pass more juice than links in the primary nav doesn't make any sense. Can someone please explain this to me?

    | CsmBill
    0

  • Can someone give me a sample of an effective meta description tag? All of the best practice stuff I read doesn't talk about how to raise CTR. It seems to me that this is a neglected area of SEO, and we want to do this right. Obviously, we will need to test. For example, my main home page keyword is "IT Support". Things I might want to put in the tag: Free Network Assessments; 100% Risk Free Trials; "Relentless IT Support" (major theme); 30 Years of Experience (since 1984); Eliminate Your IT Support Headaches Forever (too long); a call to action? BTW, thanks to everyone for your help. This is a great community - solid advice from experts. Here's an example of what I would create: "Relentless IT Support Since 1984. Trust and Accountability. 100% Risk Free Trials. Contact us today for a Free IT Assessment."

    | CsmBill
    0

  • Hi Mozzers, How long does it normally take for your Yahoo directory link to show up in Open Site Explorer? We signed up and got our link on dir.yahoo in early January and still see nothing in OSE. Should I call them up and ask them to fetch the page for Google?
    Get a refund? Help!
    Thanks!

    | Travis-W
    0

  • If you perform this search, you'll see all m. results are blocked by robots.txt: http://goo.gl/PRrlI, but when I reviewed the robots.txt file: http://goo.gl/Hly28, I didn't see anything specifying to block crawlers from these pages. Any ideas why these are showing as blocked?

    | nicole.healthline
    0
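One thing worth checking in cases like this: each hostname serves its own robots.txt, so the file at www.example.com/robots.txt says nothing about what m.example.com allows. A hypothetical m. robots.txt that would produce exactly this "blocked" symptom (placeholder host):

```text
# Served at http://m.example.com/robots.txt
# This blocks every crawler from the whole m. subdomain even though
# the www robots.txt contains no such rule.
User-agent: *
Disallow: /
```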

  • I have a Wordpress website installed at http://domain.com/home/ instead of http://domain.com - does it matter whether I leave it that way with a canonical link from domain.com to domain.com/home/, or should I move the Wordpress files and database to the root domain?

    | JosephFrost
    0

  • I have a site that is ranking well for competitive keywords in the US, but I would like to have it rank in Australia as well. Although there's no direct correlation, I'm running large Adwords campaigns in both countries. I've read to write localized content for each region, but I'm not sure if this is as effective as it used to be. I've also read to use location markup and microformats. Any feedback would be greatly appreciated. Thank you in advance.

    | NickMacario
    0
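One common approach for this kind of multi-country targeting is rel="alternate" hreflang annotations in each page's head. A minimal sketch with placeholder URLs:

```html
<!-- Hypothetical hreflang markup: each regional page lists itself and
     its alternates, so Google can match searchers to the right country. -->
<link rel="alternate" hreflang="en-us" href="http://example.com/us/" />
<link rel="alternate" hreflang="en-au" href="http://example.com/au/" />
<link rel="alternate" hreflang="x-default" href="http://example.com/" />
```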

  • I did the following two chained 301 redirects (A->B->C). Scenario 1, plural to singular to a new domain: A. http://domain1.com/filenames B. http://domain1.com/filename C. http://domain2.com/filename Scenario 2, to a new domain without www and then back to the originating domain: A. http://www.domain1.com/filename B. http://domain2.com/filename C. http://domain1.com/filename How much link juice will be redirected to URL C in the above two scenarios?

    | Bull135
    0
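The two scenarios above can be reasoned about mechanically: follow each hop until a URL no longer redirects. A small sketch (the redirect map mirrors scenario 1; the usual caution is that every extra hop risks shedding some link equity, so chains are best collapsed to a single hop):

```python
# Resolve a chain of 301s to its final target and count the hops.
# The URLs are the hypothetical domain1/domain2 examples from above.
def resolve_chain(url, redirects, max_hops=10):
    hops = 0
    while url in redirects:
        url = redirects[url]
        hops += 1
        if hops > max_hops:
            raise RuntimeError("probable redirect loop")
    return url, hops

redirects = {
    "http://domain1.com/filenames": "http://domain1.com/filename",
    "http://domain1.com/filename": "http://domain2.com/filename",
}
final, hops = resolve_chain("http://domain1.com/filenames", redirects)
print(final, hops)  # http://domain2.com/filename 2
```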

  • Hi Mozzers, In my website's sitemap.xml, pages are listed, such as /blog/ and /blog/textile-fact-or-fiction-egyptian-cotton-explained/. These pages are visible when you visit them in a browser and when you use the Google Webmaster Tools "Fetch as Google" feature to view them (see attachment), however they aren't being indexed in Google. Not even the root directory for the blog (/blog/) is being indexed, and when we query site:www.hilden.co.uk/blog/ it returns 0 results in Google. Also note that the Wordpress installation is located at /blog/, which is a subdirectory of the main root directory, which is managed by Magento. I'm wondering if this is causing the problem. Any help on this would be greatly appreciated! Anthony

    | Tone_Agency
    0

  • Hi all, A client of mine has a website similar to Pinterest, all in Ajax. So imagine an Ajax grid-based animal-lover site called domain.com. The domain has three different categories: Cats, Dogs, Mice. When you click on a category, the site doesn't handle the URL and doesn't change the domain, so instead of the domain going from domain.com to domain.com/cats, it uses the Ajax script and just shows all the cat pins. And when you click on each pin/post, it opens a page such as domain.com/Pin/123/PostTitle; it doesn't reference the category. However, a page domain.com/cats does exist, and you can go there directly. Is this an SEO issue for not grouping all pins under a category? How does Google handle Ajax these days? It used to be really bad, but if Pinterest is doing so well, I'm assuming times have changed? Any other things to be wary of for a grid-based/Ajax site? I am happy to pay for an hour or two for a more in-depth audit/tips if you can feed back on the above. Fairly urgent. Thanks

    | Profero
    1

  • I have a question regarding parking a good-value domain. A client has a great website 'A' with a PageRank of 5 and a lot of traffic. They want to change the URL and redesign the site, so they have parked the domain 'A' and will later redirect it to the new domain; this will be in a month's time. My question is: by parking the old domain 'A', will they have lost its SEO value, or will it be given to the new URL once they place a 301 redirect on it? Also, would it not have been better not to park domain 'A', but to keep it live and just redirect it once the new domain goes live, notifying Google in Webmaster Tools?

    | OrangeGuys
    0

  • I have a duplicate content warning in our PRO account (well several really) but I can't figure out WHY these pages are considered duplicate content. They have different H1 headers, different sidebar links, and while a couple are relatively scant as far as content (so I might believe those could be seen as duplicate), the others seem to have a substantial amount of content that is different.  It is a little perplexing. Can anyone help me figure this out? Here are some of the pages that are showing as duplicate: http://www.downpour.com/catalogsearch/advanced/byNarrator/narrator/Seth+Green/?bioid=5554 http://www.downpour.com/catalogsearch/advanced/byAuthor/author/Solomon+Northup/?bioid=11758 http://www.downpour.com/catalogsearch/advanced/byNarrator/?mediatype=audio+books&bioid=3665 http://www.downpour.com/catalogsearch/advanced/byAuthor/author/Marcus+Rediker/?bioid=10145 http://www.downpour.com/catalogsearch/advanced/byNarrator/narrator/Robin+Miles/?bioid=2075

    | DownPour
    0

  • Is it very important to make sure most of the links pointing at your site are "follow" links? Is it problematic to post legitimate comments on blogs that include a link back to relevant content or posts on your site?

    | BlueLinkERP
    0

  • Would it be worthwhile to list a business with both UBL and Localeze?

    | DougHoltOnline
    0

  • Hi all We have a link penalty at the moment it seems. I went through 40k links in various phases and have disavowed over a thousand domains that date back to old SEO work. I was barely able to have any links removed as the majority are on directories etc that no one looks after any more etc and / or which are spammy and scraped anyway. According to link research tools link detox tool, we now have a very low risk profile (I loaded the disavowed links into the tool for it to take into consideration when assessing our profile). I then submitted a reconsideration request on the same day as loading the new disavowed file (on the 26th of April). However today (7th May) we got a message in webmaster central that says our link profile is still unnatural. Aaargh. My question: is the disavow file taken into consideration when the reconsideration request is reviewed (ie is that information immediately available to the reviewer)? Or do we have to wait for the disavow file to flow through in the crawl stats? If so, how long do we have to wait? I've checked a link that I disavowed last time and it's still showing up in the links that I pull down from Webmaster Central, and indeed links that I disavowed at the start of April are still showing up in the list of links that can be downloaded. Any help gratefully received. I'm pulling my hair out here, trying to undo the dodgy work of a few random people many months ago! Cheers, Will

    | ArenaFlowers.com
    0
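For anyone assembling a file like the one described, Google's disavow format is one entry per line, with a `domain:` prefix for whole domains and `#` for comments. A sketch with placeholder domains:

```text
# Hypothetical disavow.txt - example entries only.
# Disavow two whole domains and one individual URL.
domain:spammy-directory.example
domain:scraped-links.example
http://old-seo-work.example/links/page.html
```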

  • I'm reorganizing our categories (we had never used them) and some of them are the same as old tags. So one of them is going to have to have -1 at the end of the permalink (dueling-pianos vs dueling-pianos-1). Does this matter SEO-wise? Is the permalink less powerful to the search engines if it has a number at the end of it? Should I change our tags from dueling-pianos to dueling-pianos-1? Or should I make the category dueling-pianos-1?

    | howlusa
    0

  • I've debated on whether I should ask this on the Q&A forum. Many times I have started to write it, then changed my mind. So today, after sending another round of emails out to potential SEM and SEO consultants and getting zero response, I said what the heck and decided to post it. I will point the finger at myself and say I must be doing something wrong in my approach and the way I am seeking out consultants, or maybe in what I am not saying. However, I cannot seem to get SEO companies to return my phone calls or emails. Is my company too small? Are most of the companies recommended here too busy with other work to worry about following up? Is my company and brand not sexy enough to deal with? Am I coming across wrong in my emails and phone calls? With that being said, I decided to write a post here and clearly state some basic and general requirements in the hope someone interested will respond. I would be more than happy to answer any specific questions about my company, budget, etc. in a PM. I am looking for an SEO firm or consultants who: Will deal with small businesses with revenue under $2M (I have budgeted for this and am ready to start a project immediately). Have experience and a successful track record working with eCommerce sites that sell a large number of products. Are interested in a long-term relationship that will produce long-lasting, durable results. Try to understand my business and what it will take to grow my ONLINE profitability. I look forward to speaking to anyone interested. Jake

    | jake372
    0

  • I have an ecommerce site with many different URLs for the same product. Let's say the product is a hat. It's in: a) mysite.com/products/hat b) mysite.com/collections/head-ware/hat c) mysite.com/collections/stuff-to-wear-on-your-head/hat Right now, A is the canonical page for B and C. I want to clean up my site so that every product has only ONE unique URL, which is linked to from all the collections. So the B and C URLs will be broken. Is it necessary that I 301 them if they were already canonical'd? Based on the number of products I have, I would have to 301 1,000+ URLs. I'm just trying to figure out what I need to do to avoid getting penalized. Thanks

    | birchlore
    0
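If the collection URLs follow the pattern shown in (a), (b), and (c), the 1,000+ redirects usually don't need to be written out individually; one rewrite pattern can cover them all. A sketch assuming Apache with mod_rewrite and the mysite.com path structure from the question:

```apache
# Hypothetical rule: 301 any /collections/<collection>/<product> URL
# to its canonical /products/<product> URL with a single pattern.
RewriteEngine On
RewriteRule ^collections/[^/]+/([^/]+)/?$ /products/$1 [R=301,L]
```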

  • Suppose the following scenarios after a 301 redirect from a source URL to a target URL is removed: 1. If the source URL returns a 404 error, will the target URL retain the link juice previously passed from the source URL? 2. If the source URL starts to show different content than what is showing on the target URL, will the previously passed link juice be credited back to the source URL?

    | Bull135
    0

  • I've seen Rand's videos on subdomains and best practices at
    http://www.seomoz.org/blog/whiteboard-friday-the-microsite-mistake
    http://www.seomoz.org/blog/understanding-root-domains-subdomains-vs-subfolders-microsites I have a question/theory, though, and it is related to an issue I am having. We have built our website, and now we are looking at adding 3rd-party forums and blogs etc. (all part of one CMS). The problem is these need to be on a separate subdomain to work correctly (I won't go into the specific IT details, but this is what I have been advised by my IT gurus). So I can have something like:
    http://cms.mysite.com/forum/ Obviously, after reading Rand's posts and other stuff, this is far from ideal. However, I have another idea: run the CMS from the root and the main website from the www subdomain. E.g.
    www.mysite.com
    mysite.com/blog Now my theory is that because so many websites (possibly the majority, especially smaller sites) don't use 301 redirects between root and www, search engines may make an exception in this case and treat them both as the same domain, so it could possibly be a way of getting round the issue. This is just a theory of mine, based solely on my thought that there are so many websites out there that don't 301 root to www or vice versa that it would possibly be in the search engines' self-interest to make an exception and count these as one domain, not two. What are your thoughts on this, and have there been any tests done to see if this is the case or not? Thanks

    | James77
    0

  • I have a domain with two pages in question--one is an article with 2,000 words and the other is a FAQ with 300 words. The 300 word FAQ is copied, word-for-word and pasted inside of the 2,000 word article. Would it be a proper use of the canonical tag to point the smaller, 300 word FAQ at the 2,000 word article? Since the 300 word article is identical to a portion of the 2,000 word article, will Google see this as duplicate content? Thanks in advance for any helpful insight.

    | andrewv
    0
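For reference, the mechanics being asked about look like this: a canonical tag in the head of the shorter page pointing at the longer one (URLs are placeholders):

```html
<!-- Hypothetical tag placed on the 300-word FAQ page, telling search
     engines the 2,000-word article is the preferred version. -->
<link rel="canonical" href="http://example.com/full-article/" />
```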

  • Hello guys! I will appreciate it if you will share your thoughts on the situation I have. The homepage for one of my sites is on the last page of Google's SERPs, although internal pages are displayed in the top 10. 1. Why?
    2. What should I do to correct the situation with the homepage? Regards

    | Webdeal
    0

  • Hello Mozzers, I'm trying to improve and establish rankings for my website, which has never really been optimised. I've inherited what seems to be a mess and have a challenge for you! The website currently has 3 different www domains all pointing to the one website; two are .com domains and one is a .com.au - the business is located in Australia and the website is primarily targeting Australian traffic. In addition to this, there are a number of other non-www domains for the same addresses pointing to the website in the CMS, which is Adobe Business Catalyst. When I check Google, each of the www domains for the website has the following number of pages indexed: www.Domain1.com 5,190 pages
    www.Domain2.com 1,520 pages
    www.Domain3.com.au 149 pages What is the best practice approach from an SEO perspective to reorganising this current domain structure? 1. Do I need to use the .com.au as the primary domain, given that we are in this market and targeting traffic here? That's what I have been advised, and it seems to be backed up by what I have read here. 2. Do we redirect all domains to the primary .com.au domain? This is easily done in the Adobe Business Catalyst CMS, however is this the same as a 301 redirect, which is the best approach from an SEO perspective? 3. How do we consolidate the current separate rankings for the 3 different domains into one domain's rankings within Google to ensure improved rankings and a best practice approach? The website is currently receiving very little organic search traffic, so if it's simpler and faster to start again fresh rather than go through a complicated migration or restructure, and you have a suggestion here, please feel free to let me know your ideas! Thank you!

    | JimmyFlorida
    0

  • I bought *out.com more than 1 year ago, but Googlebot never visits, and then I put the domain into domain parking. What can I do? I want Google to index it.

    | Yue
    0

  • If I 301 redirect all URLs from http://domain.com/folder/keyword to http://domain.com/folder/keyword.htm, are the new URLs likely to keep most of the link juice from the source URLs and maintain their rankings in the SERPs?

    | Bull135
    0

  • Hi everyone, I've searched the Q&A section, and also Google, for about the past hour and couldn't find a clear answer on this. When inbound links point to a page that no longer exists, thus producing a 404 Error Page, is link equity/domain authority lost? We are migrating a large eCommerce website and have hundreds of pages with little to no traffic that have legacy 301 redirects pointing to their URLs. I'm trying to decide how necessary it is to keep these redirects. I'm not concerned about the page authority of the pages with little traffic...I'm concerned about overall domain authority of the site since that certainly plays a role in how the site ranks overall in Google (especially pages with no links pointing to them...perfect example is Amazon...thousands of pages with no external links that rank #1 in Google for their product name). Anyone have a clear answer? Thanks!

    | M_D_Golden_Peak
    0

  • It's widely known that links from theme sponsorship do not work as well as they did a few years ago, and footer links are kind of useless or even risky. But take a look at the link profile and traffic chart of mafiashare.net, and you will find something different that is worth more research. 1. Most backlinks to mafiashare.net are from crappy Wordpress themes, with various anchor text and even embedded in a mis-used "scroll to top" button. 2. OSE shows linking root domains at 1,000+; ahrefs shows the ratio of sitewide links is not too high. Does mafiashare.net survive by using the above two tactics combined? It has been in the Alexa top 10,000 for nearly half a year. Maybe keeping black/gray hat at a small scale is good to stay under the radar while benefiting the rankings? Any thoughts are welcome. If you happen to know other websites being successful by sponsoring theme links, please share here.

    | Bull135
    0

  • Hi! I'm working on http://www.hgsplumbingandheating.co.uk/ at the moment, and it's going pretty well for local terms such as 'Norwich plumber' etc. Has anyone got any tips though on how it can be improved, especially with regard to getting the homepage ranking above the Places listings? Thanks!

    | neooptic
    0

  • Is there any way to use hreflang (or a similar tool) for geo-targeting states in the US?

    | theLotter
    0

  • I am trying to find broken internal links within my site. I found a page that was non-existent but had a bunch of internal links pointing to that page, so I ran an Open Site Explorer report for that URL, but it's limited to 25 URLs. Is there a way to get a report of all of my internal pages that link to this invalid URL?  I tried using the link: search modifier in Google, but that shows no responses.

    | sbaylor
    0
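When no tool will report all of the linking pages, a small crawl of your own pages can. A sketch using only the standard library (the page set here is a hard-coded stand-in for pages a real script would fetch):

```python
# Find which pages contain an <a href> pointing at a given (broken) URL.
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collects every href value from <a> tags in a document."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def pages_linking_to(target, pages):
    """pages is {url: html}; returns the URLs whose HTML links to target."""
    hits = []
    for url, html in pages.items():
        parser = LinkCollector()
        parser.feed(html)
        if target in parser.links:
            hits.append(url)
    return hits

pages = {
    "/about": '<p><a href="/gone">old page</a></p>',
    "/contact": '<p><a href="/team">team</a></p>',
}
print(pages_linking_to("/gone", pages))  # ['/about']
```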

  • I am showing a lot of errors in my SEOmoz reports for duplicate content and duplicate titles, many of which appear to be related to capitalization vs non-capitalization in the URL. Case in point, a URL containing a lower-case character, such as: http://www.gallerydirect.com/art/product/allyson-krowitz/distinct-microstructure-i as opposed to the same URL having an upper-case character in the structure: http://www.gallerydirect.com/art/product/allyson-krowitz/distinct-microstructure-I I am finding that some of the internal links on the site use the former structure and other links use the latter structure. These show as duplicate titles/content in the SEOmoz reports, but they don't appear as duplicate titles in Webmaster Tools. My question is, should I work with our developers to create a script to change all of the content with capital letters in the destination links internally on the site, or is this a non-issue since it doesn't appear in Webmaster Tools?

    | sbaylor
    0
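A script like the one being considered mostly comes down to lower-casing the host and path of each internal link. A sketch using the standard library (the sample URL is shaped like the ones above, with a placeholder host):

```python
# Normalize a URL's host and path to lower case, leaving any query
# string untouched, so /...-I and /...-i collapse to one address.
from urllib.parse import urlsplit, urlunsplit

def lowercase_url(url):
    parts = urlsplit(url)
    return urlunsplit((parts.scheme, parts.netloc.lower(),
                       parts.path.lower(), parts.query, parts.fragment))

print(lowercase_url("http://www.Example.com/Art/Product/Distinct-Microstructure-I"))
# http://www.example.com/art/product/distinct-microstructure-i
```

Server-side, the same normalization would be enforced with a 301 so both casings can't stay in the index.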

  • Hi Mozzers, My developers recently changed a bunch of the pages I am working on to all lower case (something I know ideally should have been done in the first place). The URLs have sat for about a week as lower case without 301 redirecting the old upper-case URLs to these pages. In Google Webmaster Tools, I'm seeing Google flag them as duplicate meta tags, title tags, etc. See image: http://screencast.com/t/KloiZMKOYfa We're 301 redirecting the old URLs to the new ones ASAP, but is there anything else I should do? Any chance Google is going to noindex these pages because it sees them as dupes until I fix them? Sometimes I can see both pages in the SERPs if I use personalized results, and it scares me: http://screencast.com/t/4BL6iOhz4py3 Thanks!

    | Travis-W
    0

  • Hi, SEOmoz gives me warnings for overly dynamic URLs. This is mostly caused by a breadcrumb system. I have a canonical link in the header for all the URLs I receive warnings on; should I still worry about this? Thanks!

    | mooij
    0

  • Hi, When working on an unnatural links penalty, is removing and disavowing the links shown in GWMT enough, or should the list be broadened to include OSE, Majestic, etc.? Thanks

    | BeytzNet
    0

  • When I search for a very competitive keyword on a smartphone, my site ranks #6, but on a laptop or desktop it ranks #24. Just to clarify, I do not have a mobile version of my site. I have cleared the cache on my mobile and laptop; still the same. What is the reason for this? I'd appreciate any insight on this. Thanks, Nick

    | orion68
    0

  • Hello SEOmozers, I've been working on improving all components of my SEO skills for the past 6 months. I have definitely had some great victories and some gray defeats. My newest challenge is local ranking for a home improvement company. My target is to rank them locally within Google's top 7 results. I have managed to do so, but only for one keyword, "windows and doors CITY". My campaign, in terms of anchor text, has a wide variety of longtail and shorttail keywords; I have not concentrated on the above keyword. My question is, how do I go about ranking this website in the local results for all the other keywords: "windows CITY", "window replacement CITY", etc.? What I don't understand is how Google picks which keywords to rank the website locally for, and which ones to ignore. Any information will be well received. Cheers, Nikster

    | thenikster
    0

  • Let's say I have 10 different models of hats, and each hat has 5 colors. There are two routes I could take: a) Make 50 separate product pages. Pros: better usability for customers because they can shop for just hats of a specific color (we can sort our collections to only show our red hats); helps SEO with keyword-specific title pages ("red boston bruins hat" vs "boston bruins hat"). Cons: duplicate content - a hat model in one color will have an almost identical description to the same hat in a different color (from a usability and consistency standpoint, we'd want to leave descriptions the same for identical products, switching out only the color). b) Have 10 products listed, each with 5 color variants. Pros: more elegant and organized; NO duplicate content. Cons: losing out on color-specific search terms; a customer might look at our 'red hats' collection, but Shopify will only show the 'default' image of the hat, which could be another color - not ideal for usability/conversions. Not sure which route to take. I'm sure other vendors must have faced this issue before. What are your thoughts?

    | birchlore
    0

  • For a real estate website, when a house is sold or taken off the market, what should happen to the listing? 301 redirect it to the grouping (such as zip code or city) in which that listing resides? 404 it?

    | wattssw
    0

  • Is a recovery from Penguin immediate once Google recognizes that you've fixed the problem, or is it a slow and steady recovery? I think we may have fixed our issue, because we're seeing an immediate spike in traffic from Google organic search results. Our daily traffic was up more than 100% in a single day. Is this a recovery? At what speed have other sites you manage recovered?

    | voicesdotcom
    0

  • Hi, We have majorly redesigned our site. It is not a big site; it is a SaaS site, so it has the typical structure: Landing, Features, Pricing, Sign Up, Contact Us, etc. The main part of the site is behind a login, so out of Google's reach. Since the new release a month ago, Google has indexed some pages, mainly the blog, which is brand new. It has reindexed a few of the original pages; I am guessing this because if I click Cached on a site: search, it shows the new site. All new pages (of which there are 2) are totally missed. One is HTTP and one HTTPS; does HTTPS make a difference? I have submitted the site via Webmaster Tools and it says "URL and linked pages submitted to index", but a site: search doesn't bring up all the pages. What is going on here, please? What are we missing? We just want Google to recognise the old site has gone and ALL the new site is here, ready and waiting for it. Thanks Andrew

    | Studio33
    0

  • Hello everyone, I have a page in my website that has a terrible link profile (95% exact match keyword links.) What is the best thing to do with this page? It provides no value, and if anything it is hurting me. Should I just delete the page, 301 it to an obscure page or something else? Thanks!

    | Mjstout
    0

  • Most pages on our site are crawled by Google about once per week. We plan to implement a new navigation structure with many more interlinks among our pages. I would like to test it first on just a few pages to measure the impact. Approximately how long does it take for an on-page change to the link structure to be reflected in Google rankings? Any experience?

    | lcourse
    0

  • We frequently get random, super-sketchy-looking blogs linking to us with no author or contact information. I believe we are being targeted by a competitor setting up garbage links to us. I am hoping to use the Google disavow links tool to deal with this, but is it: safe to use, or does using it flag us as link spammers? Possible to use on an ongoing basis for single links (as they come in, as opposed to a bunch of backlogged links)? Thanks!

    | BlueLinkERP
    0

  • Hi Guys, I've got a question which I can't figure out. I'm working on an e-commerce site which recently got a CMS update, including URL updates.
    We did a lot of 301s on the old URLs (around 3,000/4,000, I guess) and submitted a new sitemap (around 12,000 URLs, of which 10,500 are indexed). The strange thing is, when I check the indexing status in Webmaster Tools, Google tells me there are over 98,000 URLs indexed.
    Doing a site:domainx.com search, Google tells me there are 111,000 URLs indexed. Another strange thing, which another forum member describes here: the cache date has been reverted. And next to that, old URLs (which have had a 301 for about a month now) keep showing up in the index. Does anyone know what I could do to solve the problem?

    | ssiebn7
    0

  • We are working with a large retailer that has specific pages for each store they run. We are interested in leveraging the best practices that are out there specifically for local search. Our current issue is around URL design for the store pages themselves. Currently, we have store URLs such as: /store/12584 The number is a GUID-like identifier that means nothing to search engines or, frankly, humans. Is there a better way we could model this URL for increased relevancy for local retail search? For example, adding the store name:
    www.domain.com/store/1st-and-denny-new-york-city/23421
    (example http://www.apple.com/retail/universityvillage/) or a fully explicit URI: www.domain.com/store/us/new-york/new-york-city/10027/bronx/23421
    (example http://www.patagonia.com/us/patagonia-san-diego-2185-san-elijo-avenue-cardiff-by-the-sea-california-92007?assetid=5172) The idea with this second version is that we'd make the URL structure richer and more detailed, which might help for local search. Would there be a best practice or recommendation as to how we should model this URL? We are also working on on-page optimization, but we're specifically interested in local SEO strategy and URL design.

    | mongillo
    0
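The "store name in the URL" option above can be generated from data the retailer already has. A sketch (the field names are hypothetical; it keeps the existing numeric id as the stable suffix so old ids still resolve):

```python
# Build a readable store URL such as /store/1st-denny-new-york-city/23421
# from a store record, slugifying the name and appending the legacy id.
import re

def slugify(text):
    """Lower-case and collapse non-alphanumerics into single hyphens."""
    return re.sub(r"[^a-z0-9]+", "-", text.lower()).strip("-")

def store_url(store):
    return "/store/{}/{}".format(slugify(store["name"]), store["id"])

print(store_url({"name": "1st & Denny, New York City", "id": 23421}))
# /store/1st-denny-new-york-city/23421
```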
