Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies.
Latest posts made by Guybrush_Threepw00d
-
RE: Lot of duplicate content and still traffic is increasing... how does it work?
Thanks all for taking the time to answer my question, have a nice day!
-
RE: Lot of duplicate content and still traffic is increasing... how does it work?
Hi Ryan,
First of all, thanks for finding the time to answer my question. You may be right, as:
-
the domain is 14 years old ("If I had to guess they're probably a pretty old site")
-
brand traffic increased after a Facebook page was created and became popular ("increasing in traffic due to Brand strength triggers")
So I guess what you say is probably right: Google is figuring out the site structure and the parameter URLs by itself. Still, duplicate content represents well over 50% of the overall site content, and I am surprised that this apparently isn't a big problem for them (I guess because it is duplicated internally rather than taken from external sources).
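As a side note, for anyone wondering how to put a number on that: a rough estimate of the internal duplication share can come from hashing the visible text of each URL and counting collisions. A minimal sketch in Python - the URLs and the normalisation rules are placeholder assumptions, not the client's actual setup:
```python
# Minimal sketch: estimate the share of internally duplicated pages by
# hashing the visible text of each URL. The URLs below are hypothetical.
import hashlib
from collections import Counter

import requests
from bs4 import BeautifulSoup  # pip install beautifulsoup4

def page_fingerprint(url: str) -> str:
    """Fetch a URL and hash its visible text, ignoring markup and whitespace."""
    html = requests.get(url, timeout=10).text
    text = BeautifulSoup(html, "html.parser").get_text(separator=" ")
    normalised = " ".join(text.split()).lower()
    return hashlib.sha256(normalised.encode("utf-8")).hexdigest()

# Hypothetical crawl output: old- and new-scheme URLs for the same pages.
urls = [
    "https://example.com/old/widget-1",
    "https://example.com/new/widget-1",
    "https://example.com/new/widget-2",
]

counts = Counter(page_fingerprint(u) for u in urls)
duplicates = sum(n - 1 for n in counts.values() if n > 1)
print(f"Duplicate pages: {duplicates}/{len(urls)} ({duplicates / len(urls):.0%})")
```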
Anyway, I won't touch this part for now and, as suggested, will focus on what has helped them so far and push those elements a little bit more.
Thanks again for your help!
-
Lot of duplicate content and still traffic is increasing... how does it work?
Hello Mozzers,
I have a dilemma with a client's site I am working on that is making me question my SEO knowledge, or the way Google treats duplicate content. Let me explain.
The situation is the following: organic traffic has been increasing constantly since last September, in every section of the site (home page, category and product pages), even though:
-
they have tons of duplicate content, with the same content on both old and new URLs (the two URL versions are in different languages, even though the actual content on the page is in the same language in both)
-
indexation is left entirely up to Google (no robots.txt file, no sitemap, no meta robots in the code, no canonical tags, no redirects applied to any of the old URLs, etc.) - see the audit sketch after this list
-
a lot (really, a lot) of URLs with query parameters (which leads to even more duplicate content) are linked from the site's inner pages (and in some cases indexed)
-
they have Analytics but don't use Webmaster Tools
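To make the indexation points above concrete, here is a minimal sketch of how one could audit those signals page by page (canonical tag, meta robots, query parameters). The URLs are hypothetical placeholders, not the client's actual site:
```python
# Minimal sketch: check which indexation signals a page declares.
# The URLs below are hypothetical placeholders.
from urllib.parse import parse_qs, urlparse

import requests
from bs4 import BeautifulSoup  # pip install beautifulsoup4

def audit(url: str) -> dict:
    """Report the canonical tag, meta robots and query parameters of a URL."""
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    canonical = soup.find("link", rel="canonical")
    meta_robots = soup.find("meta", attrs={"name": "robots"})
    return {
        "url": url,
        "canonical": canonical.get("href") if canonical else None,
        "meta_robots": meta_robots.get("content") if meta_robots else None,
        # Parameter URLs multiply duplicate versions of the same page.
        "query_params": sorted(parse_qs(urlparse(url).query)),
    }

for url in [
    "https://example.com/products/widget-1",
    "https://example.com/products/widget-1?sort=price&color=red",
]:
    print(audit(url))
```
Pages that come back with no canonical and a non-empty parameter list are the ones most likely to be creating the duplication.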
Now... they expect me to help them increase the traffic they're getting even further. I'll start with "regular" on-page optimization, as their titles, meta descriptions and headers are not optimized at all for the page content. After that, I was thinking of fixing the indexation and content duplication issues, but I am worried I could "break the toy", as things are going well for them.
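For that first on-page pass, a quick script can flag missing, overlong or duplicate titles and meta descriptions before editing them by hand. Again a minimal sketch, with placeholder URLs and commonly cited length limits (roughly 60 and 160 characters) as assumptions:
```python
# Minimal sketch: flag missing, overlong or duplicate titles and meta
# descriptions. URLs and length limits are illustrative assumptions.
from collections import Counter

import requests
from bs4 import BeautifulSoup  # pip install beautifulsoup4

urls = [
    "https://example.com/",
    "https://example.com/products/widget-1",
    "https://example.com/products/widget-2",
]

titles = {}
for url in urls:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    title = soup.title.get_text(strip=True) if soup.title else ""
    desc_tag = soup.find("meta", attrs={"name": "description"})
    desc = desc_tag.get("content", "") if desc_tag else ""
    titles[url] = title
    if not title or len(title) > 60:
        print(f"{url}: title missing or too long ({len(title)} chars)")
    if not desc or len(desc) > 160:
        print(f"{url}: meta description missing or too long ({len(desc)} chars)")

# Identical titles usually mean pages competing for the same query.
for title, count in Counter(titles.values()).items():
    if title and count > 1:
        print(f"Title used on {count} pages: {title!r}")
```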
Should I be confident that fixing these issues will lead to even better results, or do you think it is better for me to focus on other kinds of improvements?
Thanks for your help!