Duplicate categories: how do I make sure I don't get penalised for this?
-
Hi there
How would I go about fixing duplicate categories?
My products sell in multiple category areas, and some categories overlap with others. How can I make sure I don't get penalised for this? Each category and its content is unique, but my advisors offer different tools and insights.
-
Thanks for your response, Patrick! This has helped a lot! So clear! Thanks for going into more detail about the canonical tags, as that part was confusing! lol
Thanks, I have set www as the preferred domain.
With regard to canonical tags: should I use them on ALL pages, or just the category pages, or just pages I feel have similar content? Would it be safest to add one to every page?
-
Hi Justin
When it comes to canonical tags, they're a best practice. I like to apply them so that each page "owns" the content on that page in the eyes of the search engine, especially when some content appears on more than one page, like portions of your staff's profiles. I would definitely put a canonical tag on the psychics' profile pages, since some of their bios appear on other pages.
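For reference, a canonical tag is a single link element in the page's head section. A minimal sketch, using one of your profile URLs purely for illustration:

```html
<head>
  <!-- Tells search engines which URL is the authoritative version of this page's content -->
  <link rel="canonical" href="http://www.zenory.com/profile/psychic-ginny" />
</head>
```

Each page's canonical tag should point at that page's own preferred URL (or, for true duplicates, at the one version you want indexed).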
The reason I brought up the non-www URL is that if for some reason someone links to the non-www version of your site, or of a page on your site, it's going to return a 404 error like the one you're seeing. You'll notice that if you go to http://www.zenory.com/profile/psychic-ginny it automatically redirects to where it's supposed to go. This is just to cover your bases in case there is a linking error somewhere.
Since your site is using "www.", I would use that version. You'll see under Set Your Preferred Domain in Google Webmaster Tools that Google recommends staying consistent between your preferred domain and how your site is set up.
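If your host runs Apache with mod_rewrite enabled, a site-wide non-www to www redirect is usually just a few lines in .htaccess. A hedged sketch (your server setup may differ, so confirm with your host before using it):

```apache
# Send every non-www request to the www version with a permanent (301) redirect
RewriteEngine On
RewriteCond %{HTTP_HOST} ^zenory\.com$ [NC]
RewriteRule ^(.*)$ http://www.zenory.com/$1 [R=301,L]
```

A 301 here also tells search engines to consolidate any link equity onto the www version, which backs up the preferred-domain setting in Webmaster Tools.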
Hope this helps! Let me know if you have any questions or comments! Good luck!
-
Hi Patrick
Thanks a lot for your response, and sorry for my vague explanation.
Yes, you are correct: I was concerned that my staff being listed in the categories section counts as duplicate content. Every page is going to be unique in content. Thanks for the heads up! Should I apply the canonical tag to every page? And could you please explain the non-www URL? I see there is a setting in WMT that I need to apply. Under "add your preferred version", should I set it to display URLs as http://bit.ly/1yhz96v?
-
Hi Justin
I am not entirely sure I am following, but are you worried about your staff appearing in multiple categories? From what I can see, that's not an issue: you have content unique to each page and service. The list of psychics shouldn't be a problem.
I would check your canonical tag situation, though. I noticed the site doesn't have any canonical tags; just a heads up.
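If you want to spot-check pages for a canonical tag yourself, a short script can do it. A minimal sketch using only Python's standard library; the sample HTML and URL below are illustrative stand-ins for a page's fetched source:

```python
from html.parser import HTMLParser


class CanonicalFinder(HTMLParser):
    """Collects the href of a <link rel="canonical"> tag, if present."""

    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "link" and attrs.get("rel") == "canonical":
            self.canonical = attrs.get("href")


def find_canonical(html_source):
    """Return the canonical URL declared in html_source, or None."""
    parser = CanonicalFinder()
    parser.feed(html_source)
    return parser.canonical


# Illustrative sample; in practice, feed in each page's real source.
sample = """<html><head>
<link rel="canonical" href="http://www.zenory.com/profile/psychic-ginny" />
</head><body></body></html>"""

print(find_canonical(sample))
# -> http://www.zenory.com/profile/psychic-ginny
```

Running this across a list of your URLs would quickly show which pages are missing the tag.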
Also, your non-www URLs don't work - here is an example.
Let me know if I'm understanding correctly!