Hello! We have a website with various city tours and activities listed on a single page (http://vaiduokliai.lt/). The list changes depending on the filtering applied (birthday in Vilnius, bachelor party in Kaunas, etc.), but the URL doesn't change; the content changes dynamically.

We need to make a visible URL for each category and then optimize it for different keywords (for example, "city tours in Vilnius" for the list of tours and activities in Vilnius, with an appropriate URL like /tours-in-Vilnius). The problem is that activities very often overlap across categories, so there will be a lot of duplicate content on different pages. In that case, how severe could a penalty for duplicate content be?
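For illustration only, here is a minimal client-side sketch of one way to give each filter state its own crawlable URL using the History API; the element ID, slugs and applyFilter helper are hypothetical and not taken from the actual site:

```typescript
// Hypothetical sketch: give each filter combination its own URL so that
// e.g. /tours-in-Vilnius can be linked, crawled, and optimized separately.
// The slugs and applyFilter() helper are assumptions, not the real site code.

function applyFilter(slug: string): void {
  // Re-render the activity list for this category (site-specific logic goes here).
  console.log(`Rendering activity list for ${slug}`);
}

function navigateToCategory(slug: string): void {
  applyFilter(slug);
  // Update the address bar so the filtered view has a shareable, indexable URL.
  history.pushState({ slug }, "", `/${slug}`);
}

// Handle back/forward navigation between filtered views.
window.addEventListener("popstate", (event: PopStateEvent) => {
  const slug = (event.state as { slug?: string } | null)?.slug ?? "all-activities";
  applyFilter(slug);
});

// Example usage: wire a (hypothetical) filter button to its category URL.
document.querySelector("#vilnius-tours-filter")
  ?.addEventListener("click", () => navigateToCategory("tours-in-Vilnius"));
```

For URLs exposed this way to actually be indexable, the server generally also needs to return the filtered list when a URL such as /tours-in-Vilnius is requested directly (via server-side routing or prerendering), rather than relying on the script alone.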
Latest posts made by jpuzakov
- Penalties for duplicate content
- RE: Duplicate content issue
Got it. We actually get plenty of organic entrances to these pages, so rel=canonical is not an option here.
And one more thing: does it make sense to mark internal links to the main dictionary page (http://anglu24.lt/zodynas) as nofollow? What are the downsides of that? Or might the negative effect be similar to rel=canonical in our case?
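For reference, the nofollow idea above is just a rel attribute on the internal links. A minimal sketch, assuming the links are rendered in the browser and point at the relative path /zodynas, might look like this:

```typescript
// Hypothetical sketch: mark internal links to the dictionary index as nofollow.
// The selector assumes the links use the relative path /zodynas; adjust as needed.
document.querySelectorAll<HTMLAnchorElement>('a[href="/zodynas"]').forEach((link) => {
  link.rel = "nofollow"; // hints to crawlers not to follow this internal link
});
```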
- RE: Duplicate content issue
Thanks for the suggestion. Adding more content would be the perfect way to deal with this. The downside for us is that we unfortunately don't have the resources at the moment to make such upgrades to 1000+ pages.
What about using rel=canonical? Is it possible to choose one dictionary page to be the original and tell Google that all the others are similar, thus avoiding possible penalties? How would this work?
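For reference, rel=canonical is a single tag in the head of each near-duplicate page that points at the page chosen as the original. A minimal sketch, assuming a hypothetical chosen original and client-side injection (in practice the tag is normally emitted server-side in the page template), could look like this:

```typescript
// Hypothetical sketch: point a near-duplicate dictionary page at a chosen
// "original" page via <link rel="canonical">. The canonical URL is an example only.
const canonical: HTMLLinkElement = document.createElement("link");
canonical.rel = "canonical";
canonical.href = "http://anglu24.lt/zodynas/a-suitcase-lagaminas";
document.head.appendChild(canonical);
```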
- Duplicate content issue
Hello! We have a lot of duplicate content issues on our website. Most of the pages with these issues are dictionary pages (about 1200 of them). They're not exact duplicates: each contains a different word with a translation, a picture and an audio pronunciation (example: http://anglu24.lt/zodynas/a-suitcase-lagaminas). What's the best way of solving this? We probably shouldn't disallow the dictionary pages in robots.txt, right?
Thanks!