Merging sites, ensuring traffic doesn't die
-
Wondering if I could get a second opinion on this, please. I have just taken on a new client who owns about six different niche car experience websites (hire an Aston Martin for the day, that type of thing). All six sites seem to perform reasonably well for the brand of car they deal with, and the average DA across the sites is about 24.
The client wishes to move all of these different manufacturers into one site, with a section for each; they can then also target more generic experience-day keywords.
The obvious way of dealing with this move would be to 301 the old sites to the relevant places on the new site and wait for that to rank. However, looking at the backlink profiles of the niche sites, they seem to have very few backlinks, and I feel the reason they rank so well for the individual manufacturers is that they all feature the manufacturer's name in the domain. Not an exact match, but the name is there.
If I am thinking about this correctly, a 301 tells Google that page X is now page Y: index the new one instead. Because the new site has a more generic name, I don't think it will enjoy any of the domain keyword benefits that are helping the sub-sites, and as a result I expect the rankings and traffic to drop (at least in the short term).
Am I reading this correctly? Would people use a 301 in this case? The easiest thing to do would be to leave the six sub-sites up and running on their own domains and launch the new site alongside them, but the client doesn't want this.
Thanks,
Carl
-
I have had a lot of success using internal landing pages that are each targeted to a single topic. For your client, the homepage might be rentaluxurycar.com with more general content, but with individual pages focused on each make and model, e.g. rentaluxurycar.com/astonmartindb7, with keywords focused on renting a DB7.
I would also push the client on letting the six sub-sites run alongside the main site. If they have different domains and unique content, they can crowd the search results.
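If the client insists on the merge, it's worth doing the 301s properly: map each old URL onto its equivalent page in the new manufacturer section rather than pointing everything at the new homepage. A minimal sketch of the kind of rule involved, assuming Apache and a hypothetical section path (the domain is taken from the example above):

    # .htaccess on the old niche domain (e.g. the Aston Martin site)
    RewriteEngine On
    # Map every old URL onto the matching path in the new manufacturer section
    RewriteRule ^(.*)$ https://www.rentaluxurycar.com/aston-martin/$1 [R=301,L]

Page-to-page redirects like this carry over as much of the existing link equity and relevance as a 301 can; a blanket redirect of everything to the new homepage can be treated more like a soft 404.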
Related Questions
-
Why Can't Googlebot Fetch Its Own Map on Our Site?
I created a custom map using Google Maps creator and embedded it on our site. However, when I ran Fetch and Render through Search Console, it said the map was blocked by our robots.txt file. I read in the Search Console Help section that: "For resources blocked by robots.txt files that you don't own, reach out to the resource site owners and ask them to unblock those resources to Googlebot." I did not set up our robots.txt file. However, I can't imagine it would be set up to block Google from crawling a map. I will look into that, but before I go messing with it (since I'm not familiar with it): does Google automatically block their maps from their own Googlebot? Has anyone encountered this before? Here is what the robots.txt file says in Search Console:

    User-agent: *
    Allow: /maps/api/js?
    Allow: /maps/api/js/DirectionsService.Route
    Allow: /maps/api/js/DistanceMatrixService.GetDistanceMatrix
    Allow: /maps/api/js/ElevationService.GetElevationForLine
    Allow: /maps/api/js/GeocodeService.Search
    Allow: /maps/api/js/KmlOverlayService.GetFeature
    Allow: /maps/api/js/KmlOverlayService.GetOverlays
    Allow: /maps/api/js/LayersService.GetFeature
    Disallow: /

Any assistance would be greatly appreciated. Thanks, Ruben
Technical SEO | KempRugeLawGroup1
-
HTTP to HTTPS - is a '302 Object Moved' redirect losing me link juice?
Hi guys, I'm looking at a new site that's completely under HTTPS. When I look at the HTTP variant, it redirects to the HTTPS site with "302 Object Moved" in the response. I found this by loading the HTTP and HTTPS variants into Webmaster Tools as separate sites and then doing a Fetch as Google on both. There is some traffic coming through the HTTP variant, and as people start linking to the new site I'm worried they'll link to the HTTP version, with the 302 redirect to the HTTPS site losing me ranking juice from those links. Is this a correct reading, and if so, should I prioritise changing the 302 to a 301? Cheers, Jez
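For reference, the "302 Object Moved" wording usually comes from IIS, where the fix would live in web.config; the equivalent permanent redirect under Apache is a one-liner. A minimal sketch, assuming Apache:

    # .htaccess - force HTTPS with a permanent (301) redirect
    RewriteEngine On
    RewriteCond %{HTTPS} off
    RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]

Since a 301 is the signal that consolidates link equity onto the HTTPS URLs, swapping the 302 for a 301 is generally worth prioritising.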
Technical SEO | jez0000
-
Duplicate page errors from pages that don't even exist
Hi, I am having this issue within SEOmoz's Crawl Diagnostics report. There are a lot of crawl errors for pages that don't even exist. My website has around 40-50 pages, but the report shows that 375 pages have been crawled. My guess is that the errors have something to do with my recent .htaccess configuration: I recently configured .htaccess to add a trailing slash at the end of URLs. There is no internal linking issue such as an infinite loop when navigating the website, but looping is reported in SEOmoz's report. Here is an example of a reported link: http://www.mywebsite.com/Door/Doors/GlassNow-Services/GlassNow-Services/Glass-Compliance-Audit/GlassNow-Services/GlassNow-Services/Glass-Compliance-Audit/ By the way, there is no such crawl error in my Google Webmaster Tools. Any help appreciated
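Those repeating path segments are the classic signature of relative internal links (e.g. href="GlassNow-Services/page") resolving against the new trailing-slash URLs, so each crawled link nests one level deeper; switching those links to absolute paths usually stops it. It's also worth checking that the trailing-slash rule itself only fires once and skips real files. A minimal sketch of such a rule, assuming Apache:

    # .htaccess - append a trailing slash, but only for URLs
    # that don't already have one and aren't real files
    RewriteEngine On
    RewriteCond %{REQUEST_FILENAME} !-f
    RewriteCond %{REQUEST_URI} !/$
    RewriteRule ^(.*)$ /$1/ [R=301,L]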
Technical SEO | mmoezzi0
-
'External nofollow' in a robots meta tag? (advertorial links)
I believe this has never worked? It'd be an easy way of preventing any penalties from Google's recent crackdown on paid links via advertorials. When it's not possible to nofollow each external link individually, what are people doing? Nofollowing and/or noindexing the whole page?
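For reference, there is no "external nofollow" value — the page-level fallback people use is the standard robots meta tag, which nofollows every link on the page, internal ones included:

    <!-- nofollows all links on the page, internal and external alike -->
    <meta name="robots" content="nofollow">

Because that also cuts PageRank flow to your own internal links, per-link rel="nofollow" on just the paid/advertorial links is still the cleaner fix wherever the template allows it.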
Technical SEO | Alex-Harford0
-
Google couldn't access your site because of a DNS error
Hello, We've been doing SEO work for a company for about 8 months and it's been working really well; we've got lots of top-three and first-page rankings. Or rather we did. Unfortunately the web host the client uses (whom we recommended against) has had severe DNS problems. For the last three weeks Google has been unable to access and index the website. I was hoping this would be quickly resolved and everything would return to normal. However, this week their listings have totally dropped: 25 page-one rankings have become none, and Google Webmaster Tools says 'Google couldn't access your site because of a DNS error'. Even searching for their own domain no longer works! Does anyone know how this will affect the site in the long term? Once the host sorts it out, will the rankings bounce back? Is there any sort of strategy for handling this problem? Ideally we'd move host, but I'm not sure that is possible, so any other options, or advice on how it will affect long-term rankings that I can report to my client, would be appreciated. Many thanks Ric
Technical SEO | BWIRic0
-
Panda or Penguin - Website Fell - Shouldn't This Recover?
On March 23rd our site, www.TranslationSoftware4u.com, fell 47% in one day. We still held quite a few #1 to #7 rankings on Google and thought it would just recover. Our top keyword "translation software" was #4; now we are #19. Over the next week I waited to see if it recovered. We have been online 10+ years and always stayed white hat. I admit to learning as I go over the years, but I always felt content was king, so I focused on information. I really do not see my site as using spam techniques, but maybe I am missing something in the way I have it set up.

March 23rd: major drop, -47%.

On April 2nd I started with SEOmoz, and the research tools showed we had a duplicate content warning. This was from a blog we were trying to start that only had 7 posts, but with about 20 tags per post. I did not realize that tags actually created a page for that post under each tag. I went in and deleted the tags, again stupidly not realizing that this would then make those URLs come up 404. The blog is so small we do not get hits on it anyway, so I am hoping it just clears itself up. (We still get duplicate warnings on our directory due to using "php Link Directory", but that is due to how it reuses the title tag and description, 2 instances per category page.) Still trying to fix the php directory issue; many others seem to be running it and did not have a drop.

April 24th: we dropped another -10%. It keeps falling, -70% now.

I have gone through the site and tried to clean up any warnings like duplicate title tags and meta descriptions. With regard to links, I put up a small web directory with some reciprocal linking. Our product translates languages, but software is not the same as a human, so we often set clients up with human translators; the directory is a nice place to help our customers find a translator or see online tools that can help. The links were not excessive; there were maybe 100. After the fall I went in and found some translators had gone out of business, so I deleted those. I am down to 65 links now, about 45 of which are exchanges. I have submitted to some online directories manually, but looking back through the links there is not really anything that concerns me. The links back to my site were really the most neglected SEO thing I did, again concentrating on content. I did find a few links that I was not happy about, but I did not place those links so have no control over them.

I have been working on cleaning up my title tags and making sure the content just reads better. I have been hoping that my site would just start recovering, but it keeps sliding. Has anyone seen recovery from the updates? Should I see anything yet? I cannot seem to get Google to return to the site and reindex. Am I doing something spammy on my site without realizing it? Thanks for any advice in advance!
Technical SEO | Force70
-
For Google+ purposes, should the author's name appear in the meta description or title tag of my web site, just as you would your key search phrase?
Relative to Cyrus Shepard's article on January 4th regarding Google+ as a superior SEO strategy: if I'm the primary author of all blog articles and web site content, and I have a link showing authorship going back to Google+, is a site-wide link from the home page enough, or should that show up on all blog posts, editorial comment pages, etc.? Conversely, should the author's name appear in the meta description or title tag of my web site, just as you would your key search phrase, since Google appears to be trying to make a solid connection between my name and all content?
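For reference, authorship at the time was established with per-page rel="author" markup pointing at the Google+ profile, not with anything in the title tag or meta description — so each post should carry the link, not just the home page. A minimal sketch, with a hypothetical profile URL:

    <!-- in the <head> of each article/post -->
    <link rel="author" href="https://plus.google.com/110000000000000000000/">

A visible byline link with rel="author" on the anchor works as well; the author's name only needs to appear in titles or descriptions if it's genuinely something people search for.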
Technical SEO | lwnickens0
-
SEOmoz is indicating I have 40 pages with duplicate content, yet it doesn't list the URLs of the pages?
When I look at the errors and warnings on my Campaign Overview, I have a lot of "duplicate content" errors. When I view the errors/warnings, SEOmoz indicates the number of pages with duplicate content, yet when I go to view them the subsequent page says no pages were found... Any ideas are greatly welcome! Thanks Marty K.
Technical SEO | MartinKlausmeier0