Using Product Page Content from an Offline Website
-
Hi all,
We have two websites. One of the websites no longer sells product range A.
However, on the second website, we would like to sell range A.
We paid a copywriter to write some really good content for these ranges and we were wondering if we would get stung for duplicate content if we took these descriptions from website 1 and placed them on website 2.
The products / descriptions aren't live anymore and haven't been for about six weeks.
We're ranking for some great keywords at the moment and we don't want to spoil that.
Thanks in advance!
D
-
Thanks for all your responses Linda and Dirk!
The pages are not live on the first website so there will be no possibility of any redirects.
I'm reassured now that we can transfer the descriptions over without being penalised.
Thank you again!
-
Not at all, Dirk. I was just clarifying my answer.
If the pages still exist (just not listed on the website) and were doing well = Redirect or canonicalize to take advantage of their residual authority.
If the pages no longer exist = Not much you can do...
In either case, no problem with duplicate content as Google will soon figure out where the content now (exclusively) lives.
-
Linda,
I hope there is no misunderstanding; I fully agreed with your first answer. I also like the canonical solution, however it's not possible to implement if the content has already been taken offline.
rgds,
Dirk
-
If you don't want to do the redirects, you can do the cross-domain canonical. Lots of unrelated sites do this, for instance when syndicating content.
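For anyone unfamiliar with the tag, a cross-domain canonical is just an ordinary canonical link element whose target sits on another domain. A minimal sketch, with hypothetical placeholder URLs, placed in the head of the site 1 product page:

```html
<!-- In the <head> of site 1's duplicate product page. -->
<!-- The href points at the matching page on site 2; both the
     domain and path here are hypothetical placeholders. -->
<link rel="canonical" href="https://www.site2.example/products/range-a-widget" />
```

Google treats this as a strong hint rather than a directive, and it only has an effect while the site 1 page is still live and crawlable.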
-
I saw that the writer had said that site 1 no longer sold product range A, but I wasn't sure whether that meant that the range pages had been removed from Google's index or whether they were just no longer available on the site.
I also wasn't sure which site was the one ranking for great keywords. If it is the product range A products on site 1 (with the really good content) then it might be best to leave them indexed, with the redirect, till Google picks up on the change and passes the goodness to site 2. (If not, no harm done.)
-
Thanks for your responses!
We don't want to do any redirects between the websites as we would like to keep them as two separate entities.
I believe Google still has the content cached, which is why I was panicking about duplicate content.
Thanks,
Dale
-
Hi,
You can only have duplicate content if the same content is published on different sites. As far as I understand from your question, site one doesn't sell product range A anymore, so these products are no longer published on site one. So there can't be a duplicate content issue if you publish the same content on site two.
I like Linda's suggestion to put a 301 from the old pages on site 1 to the new pages on site 2, as it will reinforce the position of the new pages.
rgds,
Dirk
-
You can use a cross-domain canonical from site 1 pointing to site 2, or 301 redirect the pages from site 1 to site 2.
Duplicate content isn't a penalty, it just makes Google choose which version to show. If you use one of those signals (probably the 301, if you are sure this is a permanent change), the correct site will get the benefit of the content.
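If site 1 happens to run on Apache, the 301 half of this advice could be sketched in .htaccess roughly as follows (the paths and the site2.example domain are hypothetical placeholders, not the asker's real URLs):

```apache
# mod_alias rules in site 1's .htaccess: permanently move the old
# range A product URLs to their counterparts on site 2.
# All paths and the domain below are hypothetical examples.
Redirect 301 /products/range-a-widget https://www.site2.example/products/range-a-widget

# Or move the whole range A section with one pattern rule:
RedirectMatch 301 ^/products/range-a/(.*)$ https://www.site2.example/products/range-a/$1
```

Equivalent rules exist for other servers (e.g. `return 301` in an nginx location block).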
Related Questions
-
Unsolved Using NoIndex Tag instead of 410 Gone Code on Discontinued products?
Hello everyone, I am very new to SEO and I wanted to get some input & second opinions on a workaround I am planning to implement on our Shopify store. Any suggestions, thoughts, or insight you have are welcome & appreciated!
For those who aren't aware, Shopify as a platform doesn't allow us to send a 410 Gone code/error under any circumstance. When you delete or archive a product/page, it becomes unavailable on the storefront. Unfortunately, the only thing Shopify natively allows me to do is set up a 301 redirect. So when we are forced to discontinue a product, customers currently get a 404 error when trying to go to that old URL.
My planned workaround is to automatically detect when a product has been discontinued and add the NoIndex meta tag to the product page. The product page will stay up but be unavailable for purchase. I am also adjusting the LD+JSON to list the product's availability as Discontinued instead of InStock/OutOfStock.
Then I let the page sit for a few months so that crawlers have a chance to recrawl and remove the page from their indexes. I think that is how that works?
Once 3 or 6 months have passed, I plan on archiving the product, followed by setting up a 301 redirect pointing to our internal search results page. The redirect will send the user to search with a query aimed towards similar products. That should prevent people with open tabs, bookmarks and direct links to that page from receiving a 404 error. I do have Google Search Console set up and integrated with our site, but manually telling Google to remove a page obviously only impacts their index.
Will this work the way I think it will? Will search engines remove the page from their indexes if I add the NoIndex meta tag after they have already been indexed? Is there a better way I should implement this?
P.S. For those wondering why I am not disallowing the page URL in robots.txt: Shopify won't allow me to call collection or product data from within the template that assembles the robots.txt, so I can't automatically add product URLs to the list.
Technical SEO | BakeryTech
-
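The tag changes the Shopify question above describes (a robots noindex plus a Discontinued availability in the structured data) would look roughly like the sketch below; the product name, URL, and price are hypothetical placeholders:

```html
<!-- In the <head> of the discontinued product page: asks crawlers
     to drop the page from their index on the next recrawl. -->
<meta name="robots" content="noindex" />

<!-- Product structured data with availability set to Discontinued.
     Name, URL, and price are hypothetical placeholders. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Discontinued Widget",
  "url": "https://shop.example/products/example-discontinued-widget",
  "offers": {
    "@type": "Offer",
    "availability": "https://schema.org/Discontinued",
    "price": "0.00",
    "priceCurrency": "USD"
  }
}
</script>
```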
What are the SEO recommendations for dynamic, personalised page content? (not e-commerce)
Hi, we will have pages on the website that will display different page copy and images for different user personas. The main content (copy, headings, images) will be supplied dynamically, and I'm not sure how Google will index the B and C variations of these pages. As far as I know, the page URL won't change and won't have parameters.
Google will crawl and index page content that comes from JavaScript, but I don't know which version of the page copy the search robot will index. If we set user-agent filters and serve the default page copy to search robots, we might risk a cloaking penalty because users get different content than search robots.
Is it better to have URL parameters for versions B and C of the content? For example: /page for the default content, /page?id=2 for the B version, /page?id=3 for the C version.
The dynamic content comes from the server side, so not all page copy variations are in the default HTML. I hope my questions make sense. I couldn't find recommendations for this kind of SEO issue.
Technical SEO | Gyorgy.B
-
Should I use my competitor's name in my content to help my rankings?
If I have a competitor that ranks higher than me, would it be helpful to use their name in my content, or in my meta information?
Technical SEO | greaterstudio
-
Duplicate homepage content across multiple websites
Hi, I work for a company that runs 30+ membership based websites on separate domains and across multiple markets. The homepage for each site contains a section of content that highlights the site benefits and features. While each website serves a different market/niche, this section of content is essentially the same as each website offers the same benefits and features. What is the best way to avoid duplicate content issues while still being able to show the same section of content across 30+ sites? This particular section of content isn't valuable from an SEO perspective, but the rest of the content on that page is. Any ideas or suggestions would be much appreciated. Thanks
Technical SEO | CupidTeam
-
Hotel affiliate website - noindex pages with little unique content?
We are well into development of a hotel affiliate website (using Expedia Affiliate Network), and I know there are many challenges to SEO when using an affiliate system - one of the biggest being how to handle duplicate content. Outside of blog posts and static marketing pages, the majority of the textual content is contained in hotel descriptions. We will be creating unique descriptions over time, but we are a small team and this will be a lengthy process. My question for you mozzers, is whether or not it's advisable for ranking purposes to noindex any page with mostly 'stock' content, and only allow Google to index hotel pages with unique descriptions? Thanks for any input!
Technical SEO | CassisGroup
-
Duplicate Page Content Lists the same page twice?
When checking my crawl diagnostics this morning I see that I have the error "Duplicate page content". It lists the exact same URL twice, though, and I don't understand how to fix this. It's also listed under duplicate page title.
Personal Assistant | Virtual Assistant | Charlotte, NC http://charlottepersonalassistant.com/110
Personal Assistant | Virtual Assistant | Charlotte, NC http://charlottepersonalassistant.com/110
Does this have anything to do with a 301 redirect here? Why does it have http:// twice? Thanks all!
| http://www.charlottepersonalassistant.com/ | http://http://charlottepersonalassistant.com/ |
Technical SEO | eidna22
-
Using robots.txt to deal with duplicate content
I have 2 sites with duplicate content issues. One is a wordpress blog. The other is a store (Pinnacle Cart). I cannot edit the canonical tag on either site. In this case, should I use robots.txt to eliminate the duplicate content?
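For context, the robots.txt approach the question describes would be a simple disallow rule like the sketch below (the path is a hypothetical placeholder). Note that blocking crawling only hides the duplicate; it does not consolidate link signals the way a canonical or 301 would.

```text
# In the duplicated site's robots.txt: keep crawlers out of the
# section that duplicates the other site. Path is hypothetical.
User-agent: *
Disallow: /duplicated-product-descriptions/
```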
Technical SEO | bhsiao
-
Using a table with tabs to display information on website, work for seo?
When displaying data using a table, and a tab format to separate different options, for example http://www.mousetraining.co.uk/ms-training/microsoft-excel-training-courses.html - under Standard Excel Training Course Levels / Training Details / Locations / Schedule - at the bottom of the page. Would search engines pick up the keywords from each of the tabs, or are they hidden? Thanks
Technical SEO | jpc1004