Questions created by eCommerceSEO
How to handle temporary campaign URLs
Hi,

We have just run a yearly recurring commercial campaign for which we created optimized URLs (e.g. www.domain.tld/campaign, with category and brand names appended after the campaign name, such as www.domain.tld/campaign/womens). This has resulted in 4,500+ URLs containing the campaign name being indexed by Google. Now that the campaign is over, these URLs no longer exist. How should we handle them?

1) 301 them to the correct category URL without the campaign name (sketched below)
2) Create a static page at www.domain.tld/campaign and 301 all URLs containing the campaign name to it

Do you have any other suggestions on what the best approach would be? This is a yearly commercial campaign, so in a year's time we will have the same URLs again.

Thanks, Chris
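For illustration, a minimal sketch of option 1, assuming a Flask front end; the framework, routes, and slugs are assumptions for the example, not the actual stack:

```python
from flask import Flask, redirect

app = Flask(__name__)

@app.route("/campaign")
@app.route("/campaign/<path:rest>")
def campaign_redirect(rest=""):
    # /campaign/womens -> /womens (the category without the campaign name);
    # a bare /campaign falls back to the homepage.
    target = f"/{rest}" if rest else "/"
    return redirect(target, code=301)  # permanent redirect
```

The same mapping could equally be expressed as web-server rewrite rules; the point is that each campaign URL 301s to its campaign-free equivalent, and the campaign routes can simply be re-enabled when the campaign returns next year.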
Link to overall brand pages
On our website there are two ways to reach a brand environment: general brand pages and brand pages divided by category. At the moment the category brand pages get the most SEO value, because the homepage links to them (via the mega dropdown). The problem is that we would like to assign that SEO value to the general brand pages (which list all articles) instead of the category brand pages (which list only the articles within one category). For now we would prefer to optimize the general brand pages without linking to them from the homepage.

For example, these two pages currently have the most SEO value:

www.debijenkorf.nl/herenmode/diesel
www.debijenkorf.nl/damesmode/diesel

but we would like to assign the value to:

www.debijenkorf.nl/diesel

Do you have a solution for this problem? Thank you in advance!

Kind regards,
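One hedged illustration of the underlying idea: to pass value to the general brand pages without a homepage mega-menu link, they still need crawlable internal links from somewhere, for example a dedicated brand index page. A minimal sketch; the brand slugs and the index page itself are hypothetical:

```python
# Hypothetical brand slugs; the real list would come from the catalog.
BRANDS = ["diesel", "nike", "levis"]

def brand_index_html(brands):
    # Render plain, crawlable links to the general brand pages,
    # e.g. www.debijenkorf.nl/diesel.
    items = "\n".join(
        f'  <li><a href="/{slug}">{slug.title()}</a></li>' for slug in brands
    )
    return f"<ul>\n{items}\n</ul>"

print(brand_index_html(BRANDS))
```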
Should we block a URL param in Webmaster Tools after URL migration?
Hi,

We have just released a new version of our website that now has human-readable, nice URLs. Our old ugly URLs are still accessible and cannot be blocked or redirected. These old URLs use a URL param containing an XPath-like expression language to define the location in our catalog. About 2 million pages with this old URL param are indexed, while we have approximately 70k nice URLs after the migration. The high number of old URLs is due to faceting done through this URL param.

Should we now completely block this URL param in Google Webmaster Tools so that the ugly URLs are removed from the Google index, or would this harm our position in Google?

Thanks, Chris
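For what it's worth, since the old URLs must stay reachable, an alternative to blocking the param outright is to declare the new nice URL as canonical on each old page, so ranking signals are consolidated rather than discarded. A minimal sketch, assuming a Flask front end; the /catalog path and the lookup function are hypothetical:

```python
from flask import Flask, request

app = Flask(__name__)

def lookup_nice_url(location_expr):
    # Hypothetical resolver from the XPath-like catalog expression
    # to the corresponding migrated nice URL.
    return "https://www.domain.tld/womens/dresses"

@app.route("/catalog")  # hypothetical legacy endpoint
def legacy_catalog():
    # The old ugly URL keeps serving its content, but points Google
    # at the nice URL via rel="canonical".
    canonical = lookup_nice_url(request.args.get("location", ""))
    head = f'<link rel="canonical" href="{canonical}">'
    return f"<html><head>{head}</head><body>...</body></html>"
```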
URL restructure and phasing out HTML sitemap
Hi SEOMozzies,

Love the Q&A resource and have already found lots of useful stuff! I just started as an in-house SEO at a retailer, and my first main challenge is to tidy up the complex URL structures and remove the ugly sub-sitemap approach currently used. I have already found a number of suggestions, but it looks like I am dealing with several challenges that I need to resolve in a single release.

Here is the current setup. The website is an ecommerce site (department store) with around 30k products, using multi-select navigation (non-Ajax). The main website uses a third-party search engine to power the multi-select navigation, and that search engine has a very ugly URL structure. For example: www.domain.tld/browse?location=1001/brand=100/color=575&size=1 plus various other URL params, or for multi-select URLs www.domain.tld/browse?location=1001/brand=100,104,506/color=575&size=1 plus various other unused URL params. These URLs easily run to 200 characters and are not descriptive to our users at all. Many of them are indexed by search engines: we currently have 1.2 million of these URLs indexed, including session IDs and all the other nasty URL params.

Next to this, the site uses a "sub site" that is sort of optimized for SEO; I am not 100% sure this is cloaking, but it smells like it. It has a simplified navigation structure and a better URL structure for products. The layout is similar to our main site, but all complex HTML elements such as multi-select and the large top navigation menus are removed. Many of these links are indexed by search engines and rank higher than links from our main website. The URL structure is www.domain.tld/1/optimized-url; currently 64,000 of these URLs are indexed. We link to this sub site in the footer of every page, but a normal customer would never reach it unless they come from organic search. Once a user lands on one of these pages we try to push them back to the main site as quickly as possible.

My planned approach to improve this (a rough sketch of both steps follows below):

1) Tidy up the URL structure in the main website (e.g. www.domain.tld/women/dresses and www.domain.tld/diesel-red-skirt-4563749). I plan to use Solution 2 as described in http://www.seomoz.org/blog/building-faceted-navigation-that-doesnt-suck to block multi-select URLs from being indexed, and would like to use the URL param "location" as an indicator for search engines to ignore the link. A risk here is that all of my currently indexed URLs (1.2 million) will be blocked immediately after this goes live. I cannot redirect those URLs to the optimized URLs, as the old URLs should remain accessible.

2) Remove the links to the sub site (www.domain.tld/1/optimized-url) from the footer and redirect (301) all of those URLs to the newly created SEO-friendly product URLs. URLs that cannot be matched, because there is no similar catalog location in the main website, will be redirected (301) to our homepage.

I wonder whether this is the correct approach, and whether it would be better to do this in a phased way rather than the currently planned big bang? Any feedback would be highly appreciated; also let me know if anything is unclear.

Thanks! Chris
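A rough, hedged sketch of both steps, assuming a Flask front end; the routes, the slug mapping, and the use of meta robots noindex,follow for step 1 are assumptions for the example (the referenced article discusses the blocking mechanics in more detail):

```python
from flask import Flask, redirect, request

app = Flask(__name__)

# Hypothetical mapping from old sub-site slugs to the new friendly URLs.
SUBSITE_MAP = {"diesel-red-skirt": "/diesel-red-skirt-4563749"}

@app.route("/1/<path:slug>")
def subsite_redirect(slug):
    # Step 2: 301 the old sub-site URLs to the new product URLs,
    # falling back to the homepage when no catalog match exists.
    return redirect(SUBSITE_MAP.get(slug, "/"), code=301)

def robots_meta():
    # Step 1: keep multi-select facet pages out of the index while still
    # letting crawlers follow their links; facet values are comma-separated
    # inside the "location" expression (e.g. brand=100,104,506).
    location = request.args.get("location", "")
    if "," in location:
        return '<meta name="robots" content="noindex, follow">'
    return ""
```

Whether to flip everything in one release or phase it is a separate question; the sketch only shows that the two changes are mechanically independent of each other.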