Duplicate categories: how do I make sure I don't get penalized for this?
-
Hi there
How would I go about fixing duplicate categories?
My products sell in multiple categories, and some overlap with others. How can I make sure that I don't get penalized for this? Each category and its content is unique, but my advisors offer different tools and insights.
-
Thanks for your response, Patrick! This has helped a lot! So clear! Thanks for going into more detail about the canonical tags - that part was confusing! lol
Thanks, I have set www as the preferred domain.
With regards to the canonical tags - should I use them on ALL pages, just the category pages, or just pages I feel have similar content? Would it be safe to just add one to every page?
-
Hi Justin
When it comes to canonical tags, applying them is a best practice. I like to apply them so that each page "owns" the content on that page in the eyes of the search engine, especially when some content appears on more than one page, like portions of your staff's profiles. I would definitely put a canonical tag on the psychics' profile pages, since some of their bios appear on other pages.
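As a sketch of what that looks like: a self-referencing canonical tag goes in the page's `<head>`, pointing at the page's own preferred URL (the profile URL below is just the example from this thread):

```html
<!-- In the <head> of the psychic's profile page -->
<link rel="canonical" href="http://www.zenory.com/profile/psychic-ginny" />
```

If a bio excerpt also appears on a category page, the canonical tag on the full profile page tells search engines that the profile URL is the one that should rank for that content.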
The reason I brought up the non-www URL is that if someone links to the non-www version of your site, or a page on your site, it's going to return a 404 error like the one you're seeing. You'll notice that if you go to http://www.zenory.com/profile/psychic-ginny it automatically redirects to where it's supposed to go. This is just to cover your bases in case there's a linking error somewhere.
Since your site uses "www." - I would set that as the preferred version. You'll see under Set Your Preferred Domain in Google Webmaster Tools that they recommend staying consistent between your preferred domain and how your site is set up.
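Assuming the site runs on Apache with mod_rewrite enabled (an assumption - your host's setup may differ), a minimal sketch of a non-www to www 301 redirect in `.htaccess` would look like this:

```apache
# Sketch: send non-www requests to the www version with a permanent (301) redirect.
# Assumes Apache with mod_rewrite; adjust the domain to match your site.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^zenory\.com$ [NC]
RewriteRule ^(.*)$ http://www.zenory.com/$1 [R=301,L]
```

The 301 preserves link equity from any stray non-www links and stops them from returning 404s.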
Hope this helps! Let me know if you have any questions or comments! Good luck!
-
Hi Patrick
Thanks a lot for your response, and sorry for my vague explanation.
Yes, you are correct - I was concerned that my staff being listed in the categories section would count as duplicate content. Every page is going to be unique in content. Thanks for the heads up! Should I apply the canonical tag to every page? And could you please explain the non-www URL issue? I see there is a setting in WMT that I need to apply - under "add your preferred version", should I set it to display URLs as http://bit.ly/1yhz96v
-
Hi Justin
I am not entirely sure I am following, but are you worried about your staff appearing in multiple categories? From what I can tell, that's not an issue - you have unique content for each page and service. The list of psychics shouldn't be a problem.
I would check your canonical tag situation, though. I noticed the site doesn't have any - just a heads up.
Also, your non-www URLs don't work - here is an example.
Let me know if I'm understanding correctly!