How do I get rid of duplicate content, duplicate titles, etc. on a PHP Cartweaver site?
-
My website http://www.bartramgallery.com was created with PHP and Cartweaver 2.0 about five years ago by a web developer. I was really happy with the design, was inspired to get into web development myself, and have been studying ever since. My biggest problem at this point is that I am not knowledgeable about PHP or the Cartweaver product, though I am learning as I read more. The issue is that the SEOmoz tools are reporting tons of duplicate content, duplicate page titles, etc. This is likely caused by the dynamic URLs and by the same pages appearing with secondary results. I just made a new sitemap with AuditMyPC (I think it was called) in an attempt to get rid of all the duplicate page titles, but is that going to solve anything, or do I need to find another way to configure the site? There are many pages with the same content competing for rankings, and it is a bit frustrating to say the least. Any advice, even just pointing me in the right direction, would be greatly appreciated.
Thank you,
Jesse
-
I am still researching a bunch of sites, trying to figure out a way to get the product name (rather than just the ID) at the end of the URL, which would be great since that is the page title. I just thought I would mention that I am working on it, and wanted to ask whether you think it is simply not possible due to Cartweaver's limitations, as you mentioned. It's funny that I have spent so much time trying to get my URLs to show up the way they should... it seems this could have been configured into the original product. Beggars can't be choosers.
-
Yes, I am going to take a look at that when I get home. Perhaps I have to change how a few things are referenced as well as set up the change of address, right? Because if you type in the normal nasty dynamic URL it still goes to the nasty URL, but if I select the new URL and paste it in, it brings up the page as I mentioned above, basically stripped of images and styling.
I am wondering if it is possible to include that number at the end, since it identifies the actual image, and it could potentially be used to populate the URL with the title of the image, which would be sweet. Of course, then I would have a new problem of too long a URL, as I have made the titles pretty keyword-rich on a lot of them to make a proper title for the page.
If this all works out, I will have to link to your site from Cartweaver and from a couple of my sites, as you have been a great help and, from what I can tell, have been able to properly diagnose a fairly complex issue with PHP and Cartweaver, even without having seen anything quite like it before. Thank you.
-
I'm guessing the paths used to reference the images & CSS files are relative to the results.php file. Now that the rewritten URLs contain "/"s, the best thing to do is to change the template to either hard-code an absolute path or use a forward slash at the start so paths always resolve from the root.
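The snippets below are illustrative only; the file names are placeholders, not the actual Cartweaver template code, so adjust them to whatever the templates really reference.
Old code (relative paths, which break once the URL contains extra "/" segments):
<link href="css/style.css" rel="stylesheet" type="text/css" />
<img src="images/logo.jpg" alt="Bartram Gallery" />
New code (root-relative paths):
<link href="/css/style.css" rel="stylesheet" type="text/css" />
<img src="/images/logo.jpg" alt="Bartram Gallery" />
or absolute paths:
<link href="http://www.bartramgallery.com/css/style.css" rel="stylesheet" type="text/css" />
<img src="http://www.bartramgallery.com/images/logo.jpg" alt="Bartram Gallery" />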
-
I tried the test example you did above and it was pretty cool. With the web address http://www.bartramgallery.com/photographer/charles-cramer/10.php it rendered a page with, I believe, everything except the design and styles, as well as any imagery. Not sure what causes that to occur; perhaps it is missing something, but that was a pretty quick stab at fixing my URL issue. I am too tired now and need to go to bed, haha. Thanks
-
No worries
Look forward to seeing the site with the new URLs in place - a lot of great photos on that site that need to be shared with everyone
-
Yes, it appears that cleaning up this URL issue is a fairly big task, but well worth it. I was surprised by the moderators of the Cartweaver forums discounting the URL as if it were not important, because they are very good developers; however, I think the URL is much more important than some realize, as good URLs are both keyword-rich and more interesting to the customer. I am far less likely to click on some random URL that has no meaning than on one that clearly spells out what the page is about. Thanks, Woj. I am humbled and realize I have some studying to do.
-
There are two issues here:
1. Fix the URLs for a better user experience & for search engines. You can do this with rewrite rules in the .htaccess file. The rule suggested by the support forum (I've modified it to better match your site, but it's untested):
RewriteEngine on
RewriteRule ^photographer/([a-zA-Z0-9_-]+)/([0-9]+)\.php$ results.php?category=$2
The URLs would then be:
http://www.bartramgallery.com/photographer/charles-cramer/10.php (not ideal with "/10.php" at the end, but it may be the best option given the limitations of the cart)
which rewrites to: http://www.bartramgallery.com/results.php?category=10
2. Clean up the Google index (remove the old URLs & add the new ones). Since both URLs will render the same content, you can fix this by adding a rel="canonical" link tag, attributing one source to the duplicate content. Check whether you can do this dynamically in the templates, but be very careful not to canonical everything to the homepage, or every page except the home page will be wiped out of the index!
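As a rough illustration of a dynamic canonical tag (the $categoryName and $categoryID variables are assumptions; use whatever variables the Cartweaver results template actually exposes), something like this could go in the template's head section:
<?php
// Build the canonical URL for the current category page.
// $categoryName and $categoryID are placeholders for Cartweaver's real variables.
$canonicalUrl = 'http://www.bartramgallery.com/photographer/'
              . $categoryName . '/' . $categoryID . '.php';
?>
<link rel="canonical" href="<?php echo htmlspecialchars($canonicalUrl); ?>" />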
-
When I read it, it seems that .htaccess is the way to go, in that the links can appear to Google as the old links, but for presentation to the customer (and for keywords) the new URL would be used. The only thing I was confused about is that it seemed it would not be good to do redirects, but rather rewrites... or is it saying to do both?
-
Thanks
-
Great answer Woj!
-
My pleasure
If you set up redirects, you shouldn't lose any traffic.
This can also be controlled via .htaccess.
In Google, search for "site:bartramgallery.com" (without the double quotes) & you will see all the pages you need to redirect.
I see the Charles Cramer page as the first photographer page that comes up, & the redirect would be something as simple as:
Redirect 301 /results.php?category=10 http://www.bartramgallery.com/charles-cramer
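One caveat: Apache's Redirect directive (mod_alias) matches only the URL path, so the ?category=10 part of the old URL won't be taken into account by a plain Redirect line. An untested mod_rewrite sketch that checks the query string instead would look something like this (the /charles-cramer target is just the example destination from above):
RewriteEngine on
# Send the old dynamic URL for category 10 to the new friendly URL;
# the trailing "?" in the target drops the old query string.
RewriteCond %{QUERY_STRING} ^category=10$
RewriteRule ^results\.php$ http://www.bartramgallery.com/charles-cramer? [R=301,L]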
-
Thank you, Woj, for taking the time to look at my site; I like that organization method. I was not aware it was possible to reorganize my site like that. I will definitely have to research and study a bit to be able to approach this, and for a while I will probably lose traffic, but in the end, after the changes, the site should be on a much better footing going forward.
-
I'm not familiar with Cartweaver, so treat these as guides only...
First, define an organised URL structure. On bartramgallery.com, at a quick glance, a good one could be:
-
bartramgallery.com/photographer (e.g. bartramgallery.com/gordon-michael)
-
bartramgallery.com/photographer/photo (e.g. bartramgallery.com/gordon-michael/juniper-study-joshua-tree)
OR
bartramgallery.com/landscape-photography/photo (e.g. bartramgallery.com/landscape-photography/juniper-study-joshua-tree)
Keep in mind that the shorter the URLs, the better (you could even have bartramgallery.com/photography/juniper-study-joshua-tree).
Second, rewrite the URLs using Rewrite Rules in the .htaccess file (see this post: http://www.seomoz.org/blog/rewriterule-split-personality-explained); a rough sketch of what those rules could look like is below.
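This sketch assumes the photographer/photo structure above; the results.php and details.php targets and the query-string parameter names are assumptions, so they need to be matched to whatever the Cartweaver pages actually expect (and Cartweaver would still need a way to resolve these name slugs, since it normally works off numeric IDs):
RewriteEngine on
# Don't rewrite requests for files that actually exist (images, CSS, etc.)
RewriteCond %{REQUEST_FILENAME} !-f
# /gordon-michael -> the photographer's listing page
RewriteRule ^([a-z0-9-]+)/?$ results.php?photographer=$1 [L]
RewriteCond %{REQUEST_FILENAME} !-f
# /gordon-michael/juniper-study-joshua-tree -> the individual photo page
RewriteRule ^([a-z0-9-]+)/([a-z0-9-]+)/?$ details.php?photographer=$1&photo=$2 [L]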
I did a search on the Cartweaver support forums and found this:
http://forums.cartweaver.com/topic/google-analytics-identifying-products-and-categories
Oli, from the Cartweaver Support Team, seems to suggest the same "untested" approach as above.
Let me know if you need any further help