Need to migrate multiple URLs and trying to save link juice
-
I have an interesting problem, SEOmozers, and wanted to see if I could get some good ideas as to what I should do for the greatest benefit.
I have an ecommerce website that sells tire sensors. We just converted the old site to a new platform and payment processor, so the site has changed completely from the original, just offering virtually the same products as before. You can find it at www.tire-sensors.com
We're ranked #1 for the keyword "tire sensors" in Google.
We sell sensors for Ford, Honda, Toyota, etc. -- and tire-sensors.com has all of those listed.
Before I came along, the company I'm working for also had individual "mini ecommerce" sites created with only 1 brand of sensors and the URL to match that maker.
Example : www.fordtiresensors.com is our site, only sells the Ford parts from our main site, and ranks #1 in Google for "ford tire sensors"
I don't have analytics on these old sites but Google Keyword Tool is saying "ford tire sensors" gets 880 local searches a month, and other brand-specific tire sensors are receiving traffic as well.
We have many other sites that are doing the same thing.
-
www.suzukitiresensors.com (ranked #2 for "suzuki tire sensors") Only sells our Suzuki collection from the main site's inventory
-
etc
We need to get rid of the old sites because we want to shut down the payment gateway and various other things those sites are using, and move to one consolidated system (aka www.tire-sensors.com)
Would simply making each maker-specific URL (i.e. fordtiresensors.com) 301 redirect to our main site (www.tire-sensors.com) give us the most benefit in rankings, traffic, etc.? Or would that be detrimental to what we're trying to do -- capturing the tire sensors market for all car manufacturers?
Suggestions?
Thanks a lot in advance!
Jordan
-
Permanently 301 redirect all sub-brand sites to www.tire-sensors.com. Keep the old domains, as you will still get referrals from them for some time to come. Or you could 301 redirect each branded URL to the new location of that particular brand of sensor on the new site.
Then, to target those individual keyword terms when you categorize the products by brand, simply set the URL up as
www.tire-sensors.com/suzuki-tire-sensors/product-name
The title tag could be in the form of "product name | suzuki tire sensors | your brand".
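If you go the per-brand redirect route, a minimal .htaccess sketch for one of the old brand domains could look like the following (assuming the old sites run on Apache with mod_rewrite enabled; the category path on the new site is illustrative and should match however you actually name the brand categories):

```apache
# .htaccess on the old brand domain (e.g. fordtiresensors.com)
RewriteEngine On

# Match the old domain, with or without www
RewriteCond %{HTTP_HOST} ^(www\.)?fordtiresensors\.com$ [NC]

# Permanently (301) redirect every path to the matching brand
# category on the consolidated site, preserving the request path
RewriteRule ^(.*)$ http://www.tire-sensors.com/ford-tire-sensors/$1 [R=301,L]
```

A blanket redirect to the homepage is simpler to set up, but mapping each old domain to its matching category page keeps the destination relevant to the original query, which is generally the better bet for holding on to those brand-specific rankings.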