Please provide a solution for my website's duplicate content problem
-
I have two domains with the same name and the same content. How do I solve that problem? Do I need to change the content on my main website?
-
My hosting site has different plans but with the same features, so many pages have the same content. It is not possible to change the content, so what is the solution for that?
Please let me know how to solve this issue.
-
Sorry to hear you are having issues with your site. It sounds like you have the two domains for country-targeting purposes, is that right?
If so, I would choose one domain and 301 redirect the other domain to the one you want to keep. If the two have different authority, check the domain metrics to see which is higher. What you don't want to do is redirect the more powerful domain to the weaker one, as that can slow down ranking progress.
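If you're on Apache, for instance, the site-wide 301 can be a few lines of .htaccess on the domain you're retiring. This is just a minimal sketch with placeholder domain names, assuming mod_rewrite is enabled:

```apache
RewriteEngine On
# Match requests for the old domain, with or without the www prefix
RewriteCond %{HTTP_HOST} ^(www\.)?example-old-site\.com$ [NC]
# Send every path to the same path on the kept domain as a permanent (301) redirect
RewriteRule ^(.*)$ https://www.example-new-site.com/$1 [R=301,L]
```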
If you are worried about country targeting, you can add hreflang (language) tags to your site and also set a geographic target in Webmaster Tools. Google will also pick up the language tags if they are properly installed.
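For reference, hreflang tags go in the <head> of each page and look like the sketch below; the locale codes and URLs here are placeholders you would swap for your own:

```html
<!-- List every language/country variant of the page, including the page itself -->
<link rel="alternate" hreflang="en-us" href="https://www.example.com/" />
<link rel="alternate" hreflang="en-gb" href="https://www.example.co.uk/" />
<!-- x-default covers searchers who match none of the listed locales -->
<link rel="alternate" hreflang="x-default" href="https://www.example.com/" />
```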
Once this process is complete, make sure to fetch the chosen domain in Webmaster Tools to help speed up the indexing of the new domain.
Hope this helps!
-
Hi Alexa,
This is about the worst situation from an SEO point of view: you have two sites with the same content, URL structure, etc.
You should immediately implement complete site-wide 301 redirects. Choose one domain to keep alive (https://www.vpsnine.com/ has a domain authority of 30 and is the ideal candidate to keep) and redirect all URLs to their respective counterparts.
-
Map each URL of the existing website with a 301 redirect to its equivalent URL on the new domain (for example, the About-us page maps to the About-us page on the new domain). This will transfer most of the SEO value and authority to the new domain and to the right pages; see the sketch below.
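Assuming an Apache host, the 1:1 mapping can be one Redirect directive per page in the old domain's .htaccess; the paths and domain below are placeholders:

```apache
# Hypothetical 1:1 mappings from old URLs to their new-domain equivalents
Redirect 301 /about-us https://www.example-new-site.com/about-us
Redirect 301 /contact-us https://www.example-new-site.com/contact-us
Redirect 301 /products/widget https://www.example-new-site.com/products/widget
```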
-
Register and verify both of your domains with Google Webmaster Tools.
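One common way to verify is the HTML meta tag method: Webmaster Tools generates a token for you, and you paste the tag into the home page's <head>. The token below is a placeholder:

```html
<meta name="google-site-verification" content="YOUR-TOKEN-FROM-WEBMASTER-TOOLS" />
```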
-
Create a custom 404 page for the old domain that suggests visiting the new domain.
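On Apache, wiring up the custom 404 page is a single directive (the filename is a placeholder); the page itself should contain an ordinary link over to the new domain:

```apache
# Serve a custom error page for any URL on the old domain that no longer resolves
ErrorDocument 404 /not-found.html
```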
-
In a development environment, test the redirects from the old domain to the new domain. Ideally, this will be a 1:1 redirect (http://www.example-old-site.com/... to http://www.example-new-site.com/...).
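A quick way to spot-check each redirect from the command line is curl with the -I flag, which fetches only the response headers. You want to see a 301 status and a Location header pointing at the matching new URL:

```sh
curl -I http://www.example-old-site.com/about-us
# Look for something like:
#   HTTP/1.1 301 Moved Permanently
#   Location: https://www.example-new-site.com/about-us
```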
-
301 redirect your old domain to your new domain.
-
Submit your old sitemap to Google and Bing. The submission pages are within Google Webmaster Tools and Bing Webmaster Center. (This step will make the engines crawl your old URLs, see that they are 301 redirects, and update their index accordingly.)
-
Fill out the Change of Address form in Google Webmaster Tools.
-
Create a new sitemap and submit it to the engines. (This will tell them about any new URLs that were not present on the old domain.)
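A sitemap is just an XML file listing the URLs you want crawled; a minimal one for the new domain might look like this, with placeholder URLs:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://www.example-new-site.com/</loc></url>
  <url><loc>https://www.example-new-site.com/about-us</loc></url>
  <url><loc>https://www.example-new-site.com/products/widget</loc></url>
</urlset>
```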
-
Wait until Google Webmaster Tools updates and fix any errors it indicates in the Diagnostics section.
-
Monitor search engine results to make sure the new domain is being properly indexed.
I hope this helps; let me know if you have further questions.
Regards,
Vijay
-
Related Questions
-
Removing duplicate content
Due to URL changes and parameters on our ecommerce sites, we have a massive amount of duplicate pages indexed by Google, sometimes up to 5 duplicate pages with different URLs.
1. We've instituted canonical tags site-wide.
2. We are using the parameters function in Webmaster Tools.
3. We are using 301 redirects on all of the obsolete URLs.
4. I have had many of the pages fetched so that Google can see and index the 301s and canonicals.
5. I created HTML sitemaps with the duplicate URLs, and had Google fetch and index the sitemap so that the dupes would get crawled and deindexed.
None of these seems to be terribly effective. Google is indexing pages with parameters in spite of the parameter (clicksource) being called out in GWT. Pages with obsolete URLs are indexed in spite of having 301 redirects, and Google appears to be ignoring many of our canonical tags, despite the pages being identical. Any ideas on how to clean up the mess?
Intermediate & Advanced SEO | AMHC
-
Just found a WordPress blog duplicating the main website blog - what to do?
Hello Mozzers, I am working on a website and found that the social media agency employed by the website owner was running a parallel WordPress blog which duplicates the content on the main website's blog (200-odd pages of this duplicate WordPress blog are indexed; the duplication is exact apart from around 60 non-blog pages: category pages, date pages, the homepage, etc.). I am planning to 301 redirect the WordPress blog pages to the equivalent pages on the website blog, and then 301 redirect the homepage, category and date pages, etc. to the website blog homepage, so all the blog pages redirect somewhere on the main website. Does this make sense, or should I only worry about redirecting the blog content pages? Also, the main website is new and there are already redirects coming in to its pages from the old website. Is there anything to be cautious about when redirecting to a main website from multiple old websites? Thanks in advance, Luke
Intermediate & Advanced SEO | McTaggart
-
Duplicate content for hotel websites - the usual nightmare? Is there any solution other than producing unique content?
Hiya Mozzers, I often work for hotels. A common scenario is that the hotel/resort has worked with their Property Management System to distribute their booking availability around the web to third-party booking sites, and with the inventory go duplicate page descriptions sent to these "partner" websites. I was just checking duplication on a room description: 20 loads of duplicate descriptions for that page alone. There are 200 rooms, so I'm probably looking at 4,000 loads of duplicate content that need rewriting to prevent duplicate content penalties, which will cost a huge amount of money. Is there any other solution? Perhaps ask the booking sites to block the relevant pages from search engines?
Intermediate & Advanced SEO | McTaggart
-
How do I geo-target continents & avoid duplicate content?
Hi everyone, We have a website which will have content tailored for a few locations:
USA: www.site.com
Europe EN: www.site.com/eu
Canada FR: www.site.com/fr-ca
Link hreflang and the GWT option are designed for countries. I expect a fair amount of duplicate content; the only differences will be in product selection and prices. What are my options to tell Google that it should serve www.site.com/eu in Europe instead of www.site.com? We are not targeting a particular country on that continent. Thanks!
Intermediate & Advanced SEO | AxialDev
-
Last Panda: removed a lot of duplicated content but still no luck!
Hello here, my website virtualsheetmusic.com has been hit several times by Panda since its inception back in February 2011, so 5 weeks ago we decided to get rid of about 60,000 thin, almost-duplicate pages via noindex meta tags and canonicals (we have not physically removed those pages from our site to return a 404, because our users may search for those items on our own website), and we expected this last Panda update (#25) to give us some traffic back... instead we lost an additional 10-12% of traffic from Google, and the hit now looks even more badly targeted. Let me say how disappointing this is after so much work! I must admit that we still have many pages that may look like thin or duplicate content, and we are considering removing those too (but those are actually giving us sales from Google!), but I expected to recover a little with this last Panda and improve our positions in the index. Instead, nothing: we have been hit again, and badly. I am pretty desperate, and I am afraid I have lost the compass here. I am particularly afraid that removing over 60,000 pages from the index via noindex meta tags has, for some unknown reason, been more damaging than beneficial. What do you think? Is it just a matter of time? Am I on the right path? Do we need to wait just a little bit longer and keep removing (via noindex meta tags) duplicate content while improving all the rest as usual? Thank you in advance for any thoughts.
Intermediate & Advanced SEO | fablau
-
Reinforcing Rel Canonical? (Fixing Duplicate Content)
Hi Mozzers, We're having trouble with duplicate content between two sites, so we're looking to add some oomph to the rel canonical link elements we put on one of our sites pointing towards the other, to help speed up the process and give Google a bigger hint. Would adding a hyperlink on the "copying" website pointing towards the "original" website speed this process up? Would we get in trouble if we added about 80,000 links (1 on each product page) with a link to the matching product on the other site? For example, we could use text like "Buy XY product on Other Brand Name and receive 10% off!"
Intermediate & Advanced SEO | Travis-W
-
Need help with duplicate content. Same content; different locations.
We have 2 sites that will have duplicate content (e.g., one company that sells the same products under two different brand names for legal reasons). The two companies are in different geographical areas, but the client will put the same content on each page because they're the same product. What is the best way to handle this? Thanks a lot.
Intermediate & Advanced SEO | Rocket.Fuel
-
Getting duplicate page content for the same page with different extensions?
I have added a campaign called "Bannerbuzz" in my SEOmoz Pro account, and 2 or 3 days ago I got errors related to duplicate page content: it is showing the same page with two different extensions, as mentioned below:
http://www.bannerbuzz.com/outdoor-vinyl-banners.html
&
http://www.bannerbuzz.com/outdoor_vinyl_banner.php
We checked our whole source files, but we didn't define the PHP-related URLs in our source code; we want to keep only our .html URLs. Can you please guide us on how to solve this issue? Thanks.
Intermediate & Advanced SEO | CommercePundit