Technical: Duplicate content and domain name change
-
Hi guys,
So, this is a tricky one. My server team just made quite a big mistake.
We are a large Magento ecommerce site, selling well, with about 6,000 products, and we are about to change our domain name for administrative reasons.
Let's call the current site current.com and the future one future.com.
Right, here is the issue.
Connecting to Search Console, I saw future.com sending 11,000 links to current.com. At the same time, our DA dropped by 7 points.
I realized future.com was incorrectly redirected and was serving a duplicate of current.com. We have corrected this, and future.com now shows a landing page until we make the domain name change.
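When the move does happen, the standard approach is a one-to-one 301 redirect from every current.com URL to its future.com counterpart. A minimal sketch, assuming Apache with mod_rewrite (which Magento commonly runs on); the domains are the placeholders from above:

# .htaccess on current.com once the switch goes live.
# Redirects every path, one-to-one, to the same path on future.com,
# preserving the URL structure so page-level signals follow.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?current\.com$ [NC]
RewriteRule ^(.*)$ https://future.com/$1 [R=301,L]

A blanket redirect of everything to the new homepage would not pass page-level signals the same way; the one-to-one mapping is what tells Google this is a move rather than a takedown.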
I was wondering what the best way to avoid a penalty is now, and what the consequences could be when we change the domain name. Should I set an alias in Search Console or something?
Thanks
-
Thanks Patrick,
I read that last year, and it is the best resource on the web so far. But I am not sure it applies to my case.
Thanks.
-
Hi there
Moz actually created a fantastic migration guide that will give you a step-by-step process to properly move your website to its new domain. I highly suggest reading it, as it goes into great detail and also includes a resources/tools list to help you and your team properly track performance. You can read all of that here.
Just wanted to let you know that this guide exists! I'm sure great answers are to follow! Hope this helps in the meantime, good luck!
Patrick
-
Related Questions
-
Cross-domain canonical and link juice
I know that a few years back, rel=canonical used cross-domain passed link juice. From what I've read from many experts (case studies), the cross-domain canonical worked like implementing a 301. Is that still the case? Has anyone tried implementing it recently, and did it work?
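For reference, the cross-domain version is the same markup as an ordinary canonical, just pointing at a URL on another domain. A minimal illustration (both domains are placeholders):

<!-- in the <head> of the duplicate page on site-a.example -->
<link rel="canonical" href="https://site-b.example/original-article/" />

Whether it passes equity like a 301 is exactly what the question asks; Google has consistently described the canonical as a hint rather than a directive, so a 301 remains the stronger signal where one is available.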
White Hat / Black Hat SEO | | manoman880 -
Dynamic Content Boxes: how to use them without getting a Duplicate Content Penalty?
Hi everybody, I am starting a project with a travel website which has some standard category pages like Last Minute, Offers, Destinations, Vacations, Fly + Hotel. Every category contains a lot of destinations, each with its own landing page: Last Minute New York, Last Minute Paris, Offers New York, Offers Paris, etc.

My question is: to simplify my job, I am thinking about writing some dynamic content boxes for Last Minute, Offers and the other categories, changing only the destination city (Rome, Paris, New York, etc.), repeated X times in X different combinations inside the content box. This would greatly simplify the content writing for the main generic landing pages of each category, but I'm worried about getting penalized for duplicate content.

Do you think my solution could work? If not, what is your suggestion? Is there a rule for classifying content as duplicate (for example, the number of identical words in a row)? Thanks in advance for your help! A.
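One way to reduce the risk is to make the boxes differ in substance, not just in the swapped city name. A rough sketch, with hypothetical destination data:

# Rough sketch: each destination contributes its own facts, so the
# rendered boxes share a frame but not their sentences. All names
# and data here are hypothetical.
DESTINATION_FACTS = {
    "paris": "direct RER trains run from both airports",
    "new-york": "the subway runs 24/7, so late arrivals are easy",
}

def last_minute_box(slug: str, city: str) -> str:
    return (f"Last Minute {city}: {DESTINATION_FACTS[slug]}. "
            f"Browse today's {city} departures below.")

print(last_minute_box("paris", "Paris"))

The same frame then produces pages that overlap far less, because each rendered box carries facts the others do not.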
White Hat / Black Hat SEO | | OptimizedGroup0 -
Why are low-quality exact-match domains still ranking well for our biggest term?
There are a number of low-quality “exact-match” domains that are ranking well for the term “locum tenens”. I don’t want to specifically mention any sites, but there are some with poor content and very few quality backlinks that are on page one. The only reason I can see for them ranking so well is the fact that “locum” and/or “tenens” is in the URL. It’s very frustrating because we have worked hard to do all the right things (regular blogging, high-quality content, quality backlinks, etc.) to build our domain authority and page authority so they are better than these sites, yet they still outrank us. Our site is www.bartonassociates.com. Could it have something to do with the term “locum tenens” being a Latin phrase? Is it possible that, because it is a Latin term, it somehow slipped through the cracks and avoided the update that was supposed to eliminate this? If so, what can we do to get some justice?
White Hat / Black Hat SEO | | ba_seomoz0 -
Guest post linking only to good content
Hello, We're thinking of doing guest posting of the following type:

1. The only link is in the body of the guest post, pointing to our most valuable article.
2. It is not a guest-posting site: we approached them to help with content, and they don't advertise guest posting. They sometimes accept guest posts if the content is good.
3. It is a clean site: clean design, clean anchor-text profile, etc.

We have 70 linking root domains. We want to use the above tactics to add 30 more links. Is this going to help us going into the future with Google (we're only interested in the long term)? Is 30 too many? Thanks.
White Hat / Black Hat SEO | | BobGW0 -
How does Google decide what content is "similar" or "duplicate"?
Hello all, I have a massive duplicate content issue at the moment with a load of old employer detail pages on my site. We have 18,000 pages that look like this:

http://www.eteach.com/Employer.aspx?EmpNo=26626
http://www.eteach.com/Employer.aspx?EmpNo=36986

Google is classing all of these pages as similar content, which may result in a bunch of them being de-indexed. Now, although they all look rubbish, some of them are ranking on search engines, and looking at the traffic on a couple of them, it's clear that people who find these pages want more information on the school (because everyone seems to click on the local information tab on the page). So I don't want to just get rid of all these pages; I want to add content to them.

But my question is: if I were to make up, say, 5 templates of generic content with different fields being replaced with the school's name, location and headteacher's name so that they vary from other pages, would that be enough for Google to realise that they are not similar pages and no longer class them as duplicates? e.g. "[School name] is a busy and dynamic school led by [headteacher's name], who achieves excellence every year from Ofsted. Located in [location], [school name] offers a wide range of experiences both in the classroom and through extra-curricular activities; we encourage all of our pupils to “Aim Higher”. We value all our teachers and support staff and work hard to keep [school name]'s reputation to the highest standards."

Something like that... Does anyone know if Google would slap me if I did that across 18,000 pages (with 4 other templates to choose from)?
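For intuition on how "similar" gets measured: the near-duplicate detection literature usually describes shingling, i.e. comparing sets of overlapping word n-grams between pages. A toy sketch (illustrative only; Google's actual systems and thresholds are not public):

def shingles(text, n=5):
    # Set of overlapping word n-grams ("shingles") in the text.
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def jaccard(a, b):
    # Jaccard similarity of the two shingle sets, from 0.0 to 1.0.
    sa, sb = shingles(a), shingles(b)
    return len(sa & sb) / len(sa | sb) if sa | sb else 0.0

template = ("{name} is a busy and dynamic school led by {head}, "
            "who achieves excellence every year from Ofsted.")
page_a = template.format(name="St Mary's", head="Mrs Smith")
page_b = template.format(name="Oak Hill", head="Mr Jones")

# Only the swapped fields differ, so the shared template text keeps
# the score far above what two unrelated pages would produce.
print(f"similarity: {jaccard(page_a, page_b):.2f}")

By that logic, five rotating templates still leave every pair of pages within a template group sharing most of their shingles, which is why adding genuinely page-specific detail (such as the local information people already click on) tends to be the safer route.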
White Hat / Black Hat SEO | | Eteach_Marketing0 -
Does the site have duplicate, overlapping, or redundant articles on the same or similar topics with slightly different keyword variations?
Hi All, In relation to this thread http://www.seomoz.org/q/what-happend-to-my-ranks-began-dec-22-detailed-info-inside I'm still getting whipped hard by Google; this week, for some reason, all rankings have been gone for the past few days.

What I was wondering is this: when Google asks, "Does the site have duplicate, overlapping, or redundant articles on the same or similar topics with slightly different keyword variations?", I assume my site hits the nail on the head. [removed links at request of author] As you can see, I target LG Optimus 3D Sim Free, LG Optimus 3D Contract and LG Optimus 3D Deals. Based on what Google has said, I now think there needs to be 1 page that covers it all instead of 3.

What I'm wondering is the best way to deal with the situation. I think it should be something like this, but please correct me along the way 🙂

1. Pick the strongest page of the 3.
2. Merge the content from the 2 weaker pages into the strongest.
3. Update the title/meta info of the strongest page to include the KW variations of all 3, e.g. LG Optimus 3D Contract Deals And Sim Free Pricing.
4. Then scatter contract, deals and sim free throughout the text naturally.
5. Then delete the weaker 2 pages and 301 redirect them to the strongest page (a sketch of the redirects follows below).
6. Submit URL removal via Webmaster Tools for the 2 weaker pages.

What would you do to correct this situation? Am I on the right track?
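For step 5, the redirects themselves are one line per retired page. A minimal sketch, assuming Apache's mod_alias and hypothetical paths for the three pages:

# Permanently redirect the two weaker pages to the strongest one.
# All three paths are hypothetical placeholders.
Redirect 301 /lg-optimus-3d-sim-free /lg-optimus-3d-contract-deals
Redirect 301 /lg-optimus-3d-deals /lg-optimus-3d-contract-deals

With the 301s in place, step 6 is usually unnecessary: the removal tool is meant for content that must disappear, while a 301 consolidates the signals you want to keep.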
White Hat / Black Hat SEO | | mwoody0 -
How much pain can I expect if I change the URL structure of the site again?
About 3 months ago I implemented a massive URL structure change by 'upgrading' some of the features of our CMS. Prior to this, URLs for categories and products looked something like this: http://www.thefurnituremarket.co.uk/proddetail.asp?prod=OX09

I made a few changes but didn't implement it fully, as I felt it would be better to do it in stages while the site was getting indexed more thoroughly. HOWEVER... we have just hit the first page for some key SERPs, and I am wary of rocking the boat again by changing the URL structures and all the sitemaps once more.

How much pain do you think we could feel if I went ahead and optimised the URLs fully? And what would you do? 🙂
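If the next stage does go ahead, the legacy query-string URLs can be mapped to the new structure in one rewrite rule. A minimal sketch, assuming Apache mod_rewrite and a hypothetical /product/SKU scheme for the new URLs:

# Permanently redirect proddetail.asp?prod=OX09 style URLs to a
# hypothetical new path-based scheme, e.g. /product/OX09.
# The trailing "?" drops the old query string from the target.
RewriteEngine On
RewriteCond %{QUERY_STRING} ^prod=([A-Za-z0-9]+)$ [NC]
RewriteRule ^proddetail\.asp$ /product/%1? [R=301,L]

Done as clean one-to-one 301s, a change like this typically causes a short spell of ranking turbulence while the new URLs are re-crawled, rather than a lasting loss.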
White Hat / Black Hat SEO | | robertrRSwalters0