Duplicate content: canonicalization may not work in our scenario.
-
I'm new to SEO (so please excuse the lack of terminology) and will be taking over our company's inbound marketing completely. I previously just did data analysis and managed our PPC campaigns in Google and Bing/Yahoo; now I get all three. Yippee! But I digress.
Before I get started here, I did read http://moz.com/community/q/new-client-wants-to-keep-duplicate-content-targeting-different-cities?sort=most_helpful, and I found both answers there helpful, but not directly applicable to my scenario.
I'm conducting our company's first real SEO audit (thanks, Moz, for the guide there), and duplicate content is going to be our number one problem to tackle. Our company's website was designed back in 2009 with the file structure /city-name/product-name. The problem is that we are open in over 50 cities now (and headed to 100 fast), and we are starting to amass duplicate content. Five products (and expanding), times the locations... you get it.
My Question(s):
How should I deal with this? The pages are almost identical, except that each lists different information for the product depending on its location. However, for one of our products, Moz's own tools (Pro) did not find all the duplicate content, only some of it (I'm assuming that's because those pages have different course options in the right sidebar and a different course address at the very bottom of the body). The duplicate content for the other four products was found and flagged extensively.
If I choose to use canonicalization to point all the pages at one main page, I believe that would pass all the link juice to that one page, but we would no longer show up in a Google search for the other cities, e.g. "Washington DC example product name". Correct me if I'm wrong here.
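For reference, this is what that consolidating tag would look like on one of the city pages (domain and paths here are hypothetical, not our actual URLs):

```html
<!-- On /washington-dc/example-product, declaring one "main" page canonical. -->
<!-- Google would consolidate signals to that main URL and typically drop   -->
<!-- the city pages from its index, which is exactly the tradeoff above.    -->
<head>
  <link rel="canonical" href="https://www.example.com/example-product/" />
</head>
```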
**Should I worry about the product whose duplicate content was only flagged four times out of fifty cities?** I feel as if this question answers itself, but I would still like someone who knows more than me to shed some light on the issue.
The other four products are not going to be an issue, as they are only offered online, but they still follow the same file structure with /online in place of /city-name. These will be canonicalized together under the /online location.
One last thing I will mention: having the city name in the URL gives us a nice advantage (I think) when people search for products in cities where we offer them. (Correct me again if I'm wrong.) If this is not the case, I believe I could talk our team into restructuring the URLs (if you think that's our best option).
Some things you need to know about our site:
We use a cookie for the location. Once you land on a page that has a location tied to it, the cookie is updated and saved. If the location does not exist, you are redirected to a page to choose a location. I'm pretty sure this can cause some SEO issues too, but again I'm not sure.
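One way to keep the cookie from interfering with crawling is to let the city in the URL always win, and only fall back to the cookie (and then the chooser page) when the URL carries no valid location. A minimal sketch of that decision logic, with hypothetical function and parameter names:

```python
def resolve_location(path_city, cookie_city, known_cities):
    """Decide which city to serve for a request.

    path_city:   city segment parsed from the URL (or None)
    cookie_city: city stored in the visitor's location cookie (or None)
    Returns the city to render, or None to show the location-chooser page.
    """
    if path_city in known_cities:
        return path_city      # the URL wins; crawlers always get a 200 here
    if cookie_city in known_cities:
        return cookie_city    # returning visitor on a city-less URL
    return None               # no valid location: show chooser, don't redirect
```

Since crawlers don't send cookies, this ordering means Googlebot always gets the page matching the URL on valid city paths rather than a cookie-driven redirect.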
I know this is a wall of text, but I cannot tell you how much I appreciate your informative answers in advance.
Thanks a million,
Trenton
-
Well, I'm super impatient, so I spent the last two days trying to figure this one out:
Best Solution:
Canonicalize the four products that aren't contingent on a city, and leave the rest alone. There's no way to canonicalize a page forward (is that a term? It should be.) and still have that page show up in search engines. So I cannot consolidate all the cities into one page without losing the ability to rank those individual pages (which is worth more to us than having our link juice in one spot).
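In markup, the resulting setup looks like this: the online-only products consolidate to a single URL, while each city page keeps a self-referencing canonical so it stays eligible to rank (domain and paths hypothetical):

```html
<!-- Online-only product: every duplicate points at the single /online URL. -->
<link rel="canonical" href="https://www.example.com/online/example-course/" />

<!-- City product: self-referencing canonical; the page stays in the index. -->
<link rel="canonical" href="https://www.example.com/washington-dc/example-course/" />
```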
Thanks to everyone who took a look and tried to help.