Duplicating relevant category content in subcategories. Good or bad for Google ranking?
-
On a travel-related site, I have city categories with city-related information.
Would you recommend for or against duplicating some relevant city-related content on subcategory pages? For visitors it would be useful, and Google would have more context about the topic of our page.
But my main concern is how this may be perceived by Google, and especially whether it may make it more likely that we get penalized for thin content. We were already hit at the end of June by Panda/Phantom, and we are working on adding more unique content, but this is something we could do additionally and basically instantaneously. We just do not want to make things worse.
-
As with many SEO questions, the answer here is "It depends." What type of content is duplicated on the city pages? How much unique content is there in addition to the duplicated content?
Having the same content on unique city pages is not necessarily a problem. Google will serve up the page that is most relevant to the search query and will filter out the rest. However, if all of the content is the same across all of these pages, the overall site would be considered lower quality. The key is to balance any duplicated content with additional unique content that provides value for the site visitor.
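As a rough way to put a number on "how much unique content" two pages share, you can compare their visible text directly. Below is a minimal sketch in Python; the page texts are hypothetical, and difflib's ratio is only a crude proxy for duplication, not how Google actually measures it.

```python
# A rough sketch: estimate how much of two pages' text overlaps.
# Hypothetical page copy below; difflib is a crude proxy, not Google's method.
from difflib import SequenceMatcher

def overlap_ratio(text_a: str, text_b: str) -> float:
    """Return a 0..1 similarity score between two blocks of page text."""
    return SequenceMatcher(None, text_a, text_b).ratio()

# Two city subcategory pages sharing an intro but with page-specific copy.
shared_intro = "Barcelona is a coastal city in Catalonia known for its architecture. "
hotels_page = shared_intro + "Our hotel guide covers 40 hand-picked stays near the old town."
food_page = shared_intro + "Our food guide lists the tapas bars and markets locals actually use."

print(f"Overlap: {overlap_ratio(hotels_page, food_page):.0%}")
# A high overlap with little page-specific text is the thin-content risk above.
```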
-
Thanks a lot, Laura. So I guess we would not need to be too concerned.
Related Questions
-
Same site serving multiple countries and duplicated content
Hello! Though I browse Moz resources every day, I've decided to ask you a question directly, despite the numerous questions (and answers!) already posted on this topic, as each case has its own specific variants. I have a site serving content (and products) to different countries, built using subfolders (1 subfolder per country). Basically, it looks like this:
site.com/us/
site.com/gb/
site.com/fr/
site.com/it/
etc. The first problem was fairly easy to solve:
avoiding duplicate content issues across the board, considering that both the ecommerce part of the site and the blog are replicated in each subfolder in its own language. Correct me if I'm wrong, but using our copywriters to translate the content and adding the right hreflang tags should do it. But then comes the second problem: how do you deal with duplicated content when it's written in the same language, e.g. /us/, /gb/, /au/ and so on?
Given the following requirements/constraints, I can't see any positive resolution to this issue:
1. The structure needs to be maintained (it's not possible to consolidate the same language within one single subfolder, for example),
2. Articles can't be canonicalized from one subfolder to another, as that would mess up our internal tracking tools,
3. The amount of content being published prevents us from getting bespoke content for each region of the world that shares the same spoken language.
Given those constraints, I can't see a way around this, and it seems that I'm cursed to live with those duplicate content red flags right up my nose.
Am I right, or can you think of anything to sort this out? Many thanks,
Ghill
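For reference, below is a minimal sketch of the hreflang markup those same-language subfolders would carry, written as a small Python generator. The domain and subfolders come from the example above; the region-to-language mapping and the choice of /us/ as x-default are illustrative assumptions.

```python
# Sketch: generate hreflang <link> tags for same-language regional subfolders.
# Every regional copy of a page carries the same set of tags (including a
# self-reference), marking the pages as deliberate regional alternates.
BASE = "https://site.com"  # hypothetical domain from the example above
REGIONS = {"us": "en-us", "gb": "en-gb", "au": "en-au"}  # illustrative mapping

def hreflang_tags(path: str) -> list[str]:
    """Build the <link> tags for the <head> of each regional copy of `path`."""
    tags = [
        f'<link rel="alternate" hreflang="{lang}" href="{BASE}/{region}{path}" />'
        for region, lang in REGIONS.items()
    ]
    # x-default names the version served to users outside the listed regions
    # (picking /us/ here is an assumption).
    tags.append(f'<link rel="alternate" hreflang="x-default" href="{BASE}/us{path}" />')
    return tags

print("\n".join(hreflang_tags("/blog/some-article/")))
```
-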
Copyright Theft and Google Rankings
Here is another tough one we've been dealing with. We publish a niche book. For a decade we kept the information offline (no e-books). However, it was widely scanned and reproduced online. We've filed dozens of DMCA complaints over the years, but have found trying to rid the internet entirely of these infringing pages to be futile. We get one closed and find three more.
Two years ago we decided to put the information online ourselves, to build an official community for our work instead of "fighting it". We built a full site with hundreds of pages from the book for readers to use, free. Google indexed us, and we followed basic SEO... But in spite of a prime aged domain and a lot of links, we are literally BURIED in Google. There are dozens of complete garbage spam sites that rank way higher than us.
I understand ranking takes time, and the niche is competitive. But the low-quality landing pages that are ranking above us are just too confusing. We fear our work has been indexed by Google on other sites so often over the years that it will never be connected to us. We'll always be buried on page 14 as just another scrape. What would you do to correct this for a client? Could you?
-
Magento products and eBay - duplicate content risk?
Hi, We are selling about 1000 sticker products in our online store and would like to expand a large part of our product lineup to eBay as well. From what I've heard, there are pretty good modules for this. I'm just wondering whether there will be duplicate content problems if I sync the products between Magento and eBay and they get uploaded to eBay with identical titles, descriptions and images? What's the workaround in this case? Thanks!
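One workaround that often comes up in threads like this is to vary the listing copy per channel, so the eBay version is not byte-identical to the store page. A minimal, hypothetical sketch of the idea; the field names are illustrative and not Magento's or eBay's actual APIs.

```python
# Hypothetical sketch: differentiate per-channel copy before exporting to eBay.
# Field names are illustrative; real Magento/eBay sync modules have their own APIs.
def ebay_listing(product: dict) -> dict:
    """Derive an eBay listing with channel-specific title and description."""
    listing = dict(product)
    listing["title"] = f'{product["title"]} - Fast Dispatch'
    listing["description"] = (
        product["description"]
        + "\n\nBuying on eBay: tracked shipping and easy returns on this sticker."
    )
    return listing

product = {
    "title": "Matte Vinyl Sticker 10cm",
    "description": "Weatherproof matte vinyl sticker, printed to order.",
}
print(ebay_listing(product)["title"])
```
-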
Not ranking in Google - why???
This will be a bit long, so please bear with me. I have a client in the auto parts industry who wants to rank their homepage for 13 different keywords. We are ranked on the first page for all keywords in Yahoo! Mexico and Bing Mexico, but not ranking on the first page at all in Google Mexico. My client's competitor, however, is clearly outranking my client in Google.
When comparing both pages, my client's, while not 100% optimized, looks better optimized than their competitor's. Looking at all metrics using Moz, SEMrush, ahrefs, etc., my client's site looks MUCH better on all fronts. I know ranking a single homepage for more than 10 keywords is a difficult task. Our competitor is, however, ranking for them, so it's not impossible. The keywords are not even that competitive according to Moz's analysis.
I decided to create an optimized page for each keyword to try to rank these pages, but my client still wants the homepage to rank (again, if the competitor is ranking, then it's possible to do this), and I am afraid the pages I created could result in keyword cannibalization, ultimately affecting the homepage's ability to rank.
My client had a previous SEO agency working for them, and basically all they did was create fake blogs and point lots of keyword-rich links at the site's homepage. I got the complete link profile from several tools and submitted a disavow request for as many fishy links as I could find, but that hasn't shown any results so far. Note: when looking at the competitor's link profile, they have basically just a few links and no external links of real value whatsoever.
My client is obviously very frustrated, and so am I. In my SEO experience, this shouldn't be such a difficult task to accomplish, yet nothing seems to work even though everything points to my client ranking higher. So now I'm running out of ideas regarding what to do with this site. Any insight you could provide would be SO helpful to me and my client. If needed, I can provide my client's homepage URL and their competitor's homepage for you to review. I can also give you any extra information you need. Thanks a lot!
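As context for the disavow step mentioned above, here is a minimal sketch of assembling a disavow file; the domains and URLs are hypothetical placeholders, and the real list would come from the link audit. The format accepts full URLs or whole domains prefixed with "domain:", allows "#" comments, and the file is uploaded through Google Search Console's disavow tool.

```python
# Sketch: build a Google disavow file from an audited list of bad links.
# Domains/URLs below are hypothetical placeholders.
fishy_domains = ["spammy-blog-network.example", "fake-autoparts-links.example"]
fishy_urls = ["http://low-quality.example/links/page1.html"]

lines = ["# Links identified via Moz / SEMrush / ahrefs link audits"]
lines += [f"domain:{d}" for d in fishy_domains]  # disavow entire domains
lines += fishy_urls  # disavow individual URLs

with open("disavow.txt", "w") as f:
    f.write("\n".join(lines) + "\n")
```
-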
If a website trades internationally and simply translates its online content from English to French, German, etc., how can we ensure no duplicate content penalties and still maintain SEO performance in each territory?
Most of the international sites are as below:
example.com
example.de
example.fr
But some countries are on unique domains, such as example123.rsa
-
Best practice with duplicate content: CDN origin pages
Our website has recently been updated, and now it seems that all of our product pages look like this: cdnorigin.companyname.com/catagory/product. Google is showing these pages in the search results rather than companyname.com/catagory/product. Each product page does have a canonical tag, but it points to the cdnorigin page. Is this best practice? I don't think cdnorigin.companyname.com looks very good in the search results. Is there any reason why my designer would set the canonical tags up this way?
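A quick way to see what is actually being declared is to fetch a page from each host and check where its rel="canonical" points. Below is a minimal diagnostic sketch with hypothetical URLs; if pages canonicalize to the cdnorigin host, Google is being told the CDN copy is the preferred version, which matches the indexing behaviour described above.

```python
# Diagnostic sketch: report which host a page's rel="canonical" points at.
# URLs below are hypothetical placeholders for the site in question.
from html.parser import HTMLParser
from urllib.request import urlopen

class CanonicalFinder(HTMLParser):
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        attr = dict(attrs)
        if tag == "link" and attr.get("rel") == "canonical":
            self.canonical = attr.get("href")

def canonical_of(url: str) -> str | None:
    html = urlopen(url).read().decode("utf-8", errors="replace")
    finder = CanonicalFinder()
    finder.feed(html)
    return finder.canonical

for url in ("https://companyname.com/catagory/product",
            "https://cdnorigin.companyname.com/catagory/product"):
    print(url, "->", canonical_of(url))
```
-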
Penalised for duplicate content, time to fix?
OK, I accept this one is my fault, but I'm wondering about timescales to fix it... I have a website, and I put an affiliate store on it, using merchant datafeeds in a bid to get revenue from the site. This was all good; however, I forgot to put noindex on the datafeed/duplicate content pages, and over a period of a couple of weeks the traffic to the site died. I have since nofollowed or removed the products, but some 3 months later my site still will not rank for the keywords it was ranking for previously.
It will not even rank if I type in the site's name (Bright Tights). I have searched for the name using bright tights, "bright tights" and brighttights, but none of them return the site anywhere. I am guessing that I have been hit with a "drop X places" penalty by Google for the duplicate content.
What is the easiest way around this? I have had no warning about bad links or the like. Is it worth battling on trying to get the domain back, or should I write off the domain, buy a new one and start again, minus the duplicate content?
The goal of having the duplicate content store on the site was to be able to rank the store's category pages, which had unique content on them, so I didn't foresee any problems there. Like Amazon et al., the categories would have lists of products (amongst other content) and you would click through to the individual product description - the duplicate page. Thanks for reading
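For anyone in the same situation, the missed step described above looks roughly like this: datafeed pages get a robots noindex tag while the unique category pages stay indexable. A minimal sketch; the page-type check and render function are hypothetical.

```python
# Sketch: noindex the datafeed/affiliate product pages, keep categories indexable.
# The page_type values and render function are hypothetical placeholders.
ROBOTS_NOINDEX = '<meta name="robots" content="noindex, follow">'

def render_head(page_type: str, title: str) -> str:
    tags = [f"<title>{title}</title>"]
    if page_type == "datafeed_product":  # duplicate merchant-feed content
        tags.append(ROBOTS_NOINDEX)      # unique category pages skip this tag
    return "<head>\n  " + "\n  ".join(tags) + "\n</head>"

print(render_head("datafeed_product", "Bright Tights - Product"))
```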