Concerned about duplicate content between old and new website
-
I have an 8-year-old plastic surgery website with my name in the URL. I have just launched a new website with a generic local plastic surgery URL that does not include my name.
However, my Google authorship photo is appearing in listings from both sites, with different URLs.
So far, Google is listing pages from both sites on the same results page for similar search terms.
I am concerned that I may eventually be penalized for duplicate content, since I am the same author for both sets of pages. Is that a real risk?
-
Thank you
-
If the actual content on the two sites is the same, 301 redirect the old site to the new one.
If only the author name matches, it will not harm you.
-
Hi Wianno,
I strongly suggest forwarding the old domain to the new one at the registrar/DNS level (no masking) with a 301 redirect. If you can't do that, redirect visitors from your old site to your new site via server-side redirects. The goal here is to send all visitors to your new site and concentrate all of that link juice on your new domain.
You won't have to worry about duplicate content. Matt Cutts has addressed this issue in numerous videos, so you might want to check some of those out.
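If the old site runs on Apache, the server-side redirect described above can be a few lines of mod_rewrite config; a minimal sketch, assuming the new site keeps the same URL paths (both domain names here are placeholders, not the actual sites):

```apache
# .htaccess on the OLD domain: permanently (301) redirect every URL
# to the same path on the new domain.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?old-domain\.com$ [NC]
RewriteRule ^(.*)$ https://www.new-domain.com/$1 [R=301,L]
```

If the paths differ between the two sites, each old URL would instead need its own rule mapping it to the closest equivalent page.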
-
What about the actual content on these pages: is there a real difference in the content, is it so similar that a human can't spot the difference in a couple of seconds, or is the URL the only difference between them?
Related Questions
-
Duplicate Content
I am trying to get a handle on how to fix and control a large amount of duplicate content that keeps appearing in my Moz reports. The main area where this comes up is duplicate page content and duplicate title tags ... thousands of them. I partially understand the source of the problem: my site mixes free content with content that requires a login. I think if I were to change my crawl settings to get past the login and index the paid content, it would lower the quantity of duplicate pages and help me identify the true duplicates, because a large number of them occur at the site login.
Unfortunately, it's not simple in my case, because last year I encountered a problem when migrating my archives into a new CMS. The app in the CMS that migrated the data caused a large amount of data truncation, which means that I am piecing together my archives of approximately 5,000 articles. It also means that much of the piecing-together process requires me to keep the former app that manages the articles, so I can find where certain articles were truncated, copy the text that followed the truncation, and complete the articles. So far, I have restored about half of the archives, which is time-consuming, tedious work.
My question is: does anyone know a more efficient way of identifying and editing duplicate pages and title tags?
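As a stopgap while the crawl settings are sorted out, duplicate title tags can be surfaced from a crawl export by grouping pages on their title and ignoring the login page's shared title; a sketch with hypothetical data in the shape of a Moz crawl CSV:

```python
from collections import defaultdict

# Hypothetical crawl rows (URL, title); the login wall gives many
# URLs the same title, which inflates the duplicate counts.
pages = [
    ("https://example.com/login?next=a1", "Members Login"),
    ("https://example.com/login?next=a2", "Members Login"),
    ("https://example.com/articles/a1", "Widget Repair Guide"),
    ("https://example.com/articles/a2", "Widget Repair Guide"),
    ("https://example.com/articles/a3", "Another Article"),
]

def duplicate_titles(pages, ignore=frozenset({"Members Login"})):
    """Group pages by title and return only titles shared by 2+ URLs,
    skipping titles known to come from the login wall."""
    groups = defaultdict(list)
    for url, title in pages:
        if title not in ignore:
            groups[title].append(url)
    return {t: urls for t, urls in groups.items() if len(urls) > 1}

print(duplicate_titles(pages))
```

Filtering out the login title first leaves only the true duplicates, which is the short list worth editing by hand.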
Technical SEO | Prop650
-
Purchasing duplicate content
Morning all, I have a client who is planning to expand their product range (online dictionary sites) into new markets and is considering the acquisition of data sets from low-ranked competitors to supplement their own original data. These are quite large content sets, and a very high percentage of the site (hosted on a new subdomain) would be made up of duplicate content. Just to clarify, the competitor's content would stay online as well. I need to lay out the pros and cons of taking this approach so that they can move forward knowing the full facts. As I see it, this approach would mean forgoing rankings for most of the site and would require a heavy dose of original content built around the acquired data. My main concern is that launching with this level of duplicate data would end up damaging the authority of the site and, subsequently, the overall domain. I'd love to hear your thoughts!
Technical SEO | BackPack851
-
SEO trending down after adding content to website
Hi,
Looking for some guidance. I added about 14 pages of unique content and did all of the on-page SEO work using Yoast; all pages have 'good' status. Some of the website architecture was also changed, mainly on one page. That being said, we got a significant bump the day I implemented the changes; however, every day thereafter we have had very bad results, worse than we had before, for about 3 days now. I did resubmit the updated sitemap to GWT, and I'm showing no crawl errors. I'm also curious whether my robots.txt file could be the issue. All it contains is:
User-agent: *
Disallow: /wp-admin/
Any insight or advice is greatly appreciated! Thanks for your time.
Technical SEO | swat1827
-
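A robots.txt that only disallows /wp-admin/ should not be hiding the new content pages; Python's standard-library robotparser can confirm that (the URLs below are hypothetical examples):

```python
from urllib import robotparser

# The robots.txt quoted in the question above, reproduced verbatim.
ROBOTS_TXT = """\
User-agent: *
Disallow: /wp-admin/
"""

parser = robotparser.RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# /wp-admin/ is blocked, but ordinary content pages stay crawlable,
# so this file is not what is suppressing the new pages.
print(parser.can_fetch("*", "https://example.com/wp-admin/settings.php"))  # False
print(parser.can_fetch("*", "https://example.com/new-content-page/"))      # True
```

If both checks pass for the new pages, the ranking dip is more likely the normal settling period after an architecture change than a crawling problem.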
How do I optimize a website for SEO for a client that is using a subdirectory as a separate website?
We launched a subdirectory site about two months ago for our client. What's happening is that searches for the topic covered by the subdirectory are yielding results for the old site and not the new site. We'd like to change this. Are there best practices for the subdirectory site? Specifically, we're looking for things we can do using sitemaps and Webmaster Tools. Are there other technical things we can do? Thank you.
Technical SEO | IVSeoTeam120
-
Duplicate Content
We have a ton of duplicate content/title errors in our reports, many of them in the form: http://www.mysite.com/(page title) and http://mysite.com/(page title). Our site has been set up so that mysite.com 301 redirects to www.mysite.com (we did this a couple of years ago). Is it possible that I set up my campaign the wrong way in SEOMoz? I'm thinking it must be a user error when I set up the campaign, since we already have the 301 redirect. Any advice is appreciated!
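One way to sanity-check what the crawler is reporting is to group the flagged URLs by a key that ignores the www prefix; a sketch with hypothetical URLs (if the 301 really is in place, the likely culprit is a campaign set up on the root domain rather than the www subdomain):

```python
from collections import defaultdict
from urllib.parse import urlparse

def canonical_key(url):
    """Key that treats www and bare-domain versions of a URL as one page."""
    parts = urlparse(url)
    host = parts.netloc.lower()
    if host.startswith("www."):
        host = host[4:]
    return host + parts.path

# Hypothetical URLs in the shape of the crawl report's duplicates.
urls = [
    "http://www.mysite.com/page-title",
    "http://mysite.com/page-title",
    "http://www.mysite.com/other-page",
]

groups = defaultdict(list)
for u in urls:
    groups[canonical_key(u)].append(u)

# Only keys reached under more than one hostname are real www/non-www pairs.
duplicates = {k: v for k, v in groups.items() if len(v) > 1}
print(duplicates)
```

Any pair that shows up here should also answer with a 301 when the bare-domain version is requested; if it doesn't, the redirect isn't covering that path.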
Technical SEO | Ditigal_Taylor0
-
Mapping and tracking old and new information architecture
Howdy. So I'm working on "example.com", which has thousands of URLs. The site is going to be redesigned, with some changes to the information architecture. I'm trying to think of a good way to organize and account for similarities and differences between the original information architecture and the new one. This should help with building 301s. I've downloaded a list of URLs from example.com from Open Site Explorer. What I would love to do is generate a visual "tree" of the site based on the output from Open Site Explorer. It would basically look like a pyramid with all of the subfolders branching out. Does anybody know of a tool out there that will do this for me? Or am I going to have a long day in Excel? 🙂 Any other thoughts on working through this process are welcome. Thank you!
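Before committing to a long day in Excel, a short script can fold a flat URL export into an indented tree; a sketch using hypothetical URLs in place of the Open Site Explorer export:

```python
from urllib.parse import urlparse

def url_tree(urls):
    """Fold a flat list of URLs into an indented folder tree string."""
    tree = {}
    for url in urls:
        path = urlparse(url).path.strip("/")
        node = tree
        for part in (path.split("/") if path else []):
            node = node.setdefault(part, {})
    lines = []
    def walk(node, depth):
        for name in sorted(node):
            lines.append("  " * depth + name)
            walk(node[name], depth + 1)
    walk(tree, 0)
    return "\n".join(lines)

# Hypothetical URLs standing in for the Open Site Explorer export.
urls = [
    "http://example.com/services/seo",
    "http://example.com/services/ppc",
    "http://example.com/about",
]
print(url_tree(urls))
```

Running the same script against the old export and the new architecture gives two trees that can be diffed side by side when building the 301 map.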
Technical SEO | SamTurri0
-
Duplicate Content Issue
Hi everyone, I ran into a problem I didn't know I had (thanks to the SEOmoz tool) regarding duplicate content. My site is oxford ms homes.net, and when I built the site, the web developer used PHP. After he was done, I saw that the URLs looked like this: "/blake_listings.php?page=0", and I wanted them like this: "/blakes-listings". He changed them with no problem, and he did the same with all 300 or so pages that I have on the site. I just found, using the Crawl Diagnostics tool, that I have around 3,000 duplicate content issues. Is there an easy fix for this at all, or does he have to go in and 301 redirect EVERY SINGLE URL? Thanks for any help you can give.
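Each old URL does need its own 301, but nobody has to hand-write them: the old-to-new mapping can live in one table and the mod_rewrite rules can be generated from it. A sketch in Python, where the map entries are hypothetical and modeled on the single example in the question:

```python
# Map of (old PHP script, query string) -> new clean path.
# Entries are hypothetical, based on the one example in the question.
url_map = {
    ("blake_listings.php", "page=0"): "/blakes-listings",
    ("oxford_homes.php", "page=0"): "/oxford-homes",
}

def rewrite_rules(mapping):
    """Emit RewriteCond/RewriteRule pairs that 301 each old script+query
    URL to its clean path; the trailing '?' drops the query string."""
    rules = []
    for (script, query), new_path in mapping.items():
        rules.append(f"RewriteCond %{{QUERY_STRING}} ^{query}$")
        rules.append(f"RewriteRule ^{script}$ {new_path}? [R=301,L]")
    return rules

print("\n".join(rewrite_rules(url_map)))
```

The generated lines can be pasted into the site's Apache config or .htaccess, so adding or fixing a redirect later means editing one row of the map rather than hunting through hundreds of rules.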
Technical SEO | blake-766240
-
Duplicate Page Content
Hi, within my campaigns I get a "crawl errors found" notice that says duplicate page content was found; it finds the same content on the two home page URLs below. Are these seen as two different pages? And how can I correct these errors, since they are just one page?
http://poolstar.net/
http://poolstar.net/Home_Page.php
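They are two different URLs, so crawlers treat them as two pages even though visitors see one. A hedged fix, assuming the site runs on Apache with mod_rewrite available: 301 the duplicate home page URL to the root (alternatively, a rel=canonical on Home_Page.php pointing at the root would consolidate them without a redirect):

```apache
# 301 the duplicate home page URL to the canonical root URL.
RewriteEngine On
RewriteRule ^Home_Page\.php$ http://poolstar.net/ [R=301,L]
```

Any internal links that still point at Home_Page.php should also be updated to "/" so the crawler stops discovering the duplicate.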
Technical SEO | RouteAccounts0