How to get rid of duplicate content, titles, etc. on a PHP Cartweaver site?
-
My website http://www.bartramgallery.com was created using PHP and Cartweaver 2.0 about five years ago by a web developer. I was really happy with the results of the design, was inspired to get into web development, and have been studying ever since. My biggest problem at this time is that I am not knowledgeable about PHP and the Cartweaver product, but I am learning as I read more. The issue is that the SEOmoz tools are reporting tons of duplicate content, duplicate page titles, etc. This is likely from the dynamic URLs and the same pages appearing again with secondary results. I just made a new sitemap with AuditMyPC (I think that's what it was called) in an attempt to get rid of all the duplicate page titles, but is that going to solve anything, or do I need to find another way to configure the site? There are many pages with the same content competing for page rank, and it is a bit frustrating to say the least. If anyone has any advice it would be greatly appreciated, even just pointing me in the right direction.
Thank you,
Jesse
-
I am still researching a bunch of sites, trying to figure out a way to get the product ID name at the end of the URL, which would be great as that is the page title. I just thought I would mention that I am working on it, and ask whether you think it is not possible, as you mentioned, due to Cartweaver's limitations. It's funny that I have spent so much time trying to get my URLs to show up how they should... it seems this could have been configured into the original product. Beggars can't be choosers.
-
Yes, I am going to take a look at that when I get home. Perhaps I have to change how a few things are referenced as well as create the change of address, right? Because if you type in the normal, nasty dynamic URL it still goes to the nasty URL, but if I select the URL and paste it, it brings up the page as I mentioned above - basically stripped of images and styling.
I am wondering if it is possible to include that number at the end, as it refers to the actual image and could potentially be used to populate the title of the image at the end, which would be sweet. Of course, then I would have a new problem of the URL being too long, as I have made the titles pretty keyword rich on a lot of them to make a proper title for the page.
If this all works out, I will have to create a link to your site at Cartweaver and from a couple of my sites, as you have been a great help. From what I can tell, you have been able to properly diagnose a fairly complex issue with PHP and Cartweaver, even without having seen anything quite like it before. Thank you
-
I'm guessing the paths used to reference the images & CSS files are relative to the results.php file. Now that there are "/"s in the URL, the best thing to do is to change the template to either hard-code an absolute path or use a forward slash at the start so it always starts at the root - i.e. change the old relative references to new root-relative (or absolute) ones, along the lines of the sketch below.
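For illustration only - the image and stylesheet file names below are made up, just to show the pattern:
Old code (relative paths, which break once the rewritten URL contains extra "/" segments):
<img src="images/photo.jpg">
<link rel="stylesheet" href="css/style.css">
New code (root-relative paths):
<img src="/images/photo.jpg">
<link rel="stylesheet" href="/css/style.css">
or, with an absolute path:
<img src="http://www.bartramgallery.com/images/photo.jpg">
<link rel="stylesheet" href="http://www.bartramgallery.com/css/style.css">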
-
I tried the test example you did above and it was pretty cool. With the web address http://www.bartramgallery.com/photographer/charles-cramer/10.php it rendered a page with, I believe, everything except for the design and styles as well as any imagery. Not sure what causes that to occur - perhaps it is missing something - but that was a pretty quick stab at fixing my URL issue. I am too tired now and need to go to bed, haha. Thanks
-
No worries
Looking forward to seeing the site with the new URLs in place - there are a lot of great photos on that site that need to be shared with everyone
-
Yes, it appears that cleaning up this URL issue is a pretty big task, but well worth it. I was surprised by the moderators of the Cartweaver forum discounting the URL as if it were not important, because they are very good developers; however, I think the URL is much more important than some realize, as clean URLs are both keyword rich and more interesting to the customer. I am far less likely to click on some random URL that has no meaning than on one that clearly spells out what the page is about. Thanks Woj, I am humbled and realize I have some studying to do.
-
There are 2 issues here:
-
Need to fix the URLs for better user experience & search engines - this can be done using rewrite rules in .htaccess
The one suggested by the support forum (I've modified it to better match your site, but it's untested):
RewriteEngine on
RewriteRule ^photographer/([a-zA-Z0-9_-]+)/([0-9]+)\.php$ results.php?category=$2
The URLs would then be:
http://www.bartramgallery.com/photographer/charles-cramer/10.php (not ideal with "/10.php" at the end but may be best given the limitations of the cart)
rewrites to: http://www.bartramgallery.com/results.php?category=10
-
Clean up the Google index (remove old URLs & add new ones)
Since both URLs will render the same content, we can fix this by adding a rel="canonical" tag - attributing one source to the duplicate content. Check if you can do this dynamically in the templates, but be very careful not to canonical everything to the homepage, or all your pages will be wiped out of the index except the home page!
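For illustration, the tag could look something like the following - the href is just the example URL from above, and where exactly Cartweaver lets you output it in the template would need checking:
<link rel="canonical" href="http://www.bartramgallery.com/photographer/charles-cramer/10.php" />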
-
When I read it, it seems that .htaccess is the way to go, in that you can have the links appear to Google as the old links, but for presentation to the customer and for keywords the new URL would be used. The only thing I was confused about was that it seemed it would not be good to do redirects, but rather rewrites... or is it saying to do both?
-
Thanks
-
Great answer Woj!
-
My pleasure
If you set up redirects, you shouldn't lose any traffic
This can also be controlled via .htaccess
In Google, search for "site:bartramgallery.com" (without the double quotes) & you will see all the pages you need to redirect
I see the Charles Cramer page as the first photographer's page that comes up, & the redirect would be something as simple as:
Redirect 301 /results.php?category=10 http://www.bartramgallery.com/charles-cramer
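One caveat: Apache's Redirect directive matches only the URL path, not the query string, so for a URL like /results.php?category=10 an (again untested) mod_rewrite version along these lines may be needed instead - the category ID and target page are just the example above:
RewriteEngine on
# send the old dynamic URL to the new clean one, dropping the query string
RewriteCond %{QUERY_STRING} ^category=10$
RewriteRule ^results\.php$ http://www.bartramgallery.com/charles-cramer? [R=301,L]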
-
Thank you, Woj, for taking the time to look at my site; I like that organization method. I was not aware of the possibility of reorganizing my site like that. I will definitely have to research and study a bit to be able to approach this, and for a while I will probably lose traffic, but in the end, after the changes, the site should be on a much better footing going forward.
-
I'm not familiar with Cartweaver, so these are just general guidelines.
First, define an organised URL structure - on bartramgallery.com, at a quick glance, a good one could be:
-
bartramgallery.com/photographer (e.g. bartramgallery.com/gordon-michael)
-
bartramgallery.com/photographer/photo (e.g. bartramgallery.com/gordon-michael/juniper-study-joshua-tree)
OR
bartramgallery.com/landscape-photography/photo (e.g. bartramgallery.com/landscape-photography/juniper-study-joshua-tree)
Keep in mind that the shorter the URLs, the better (you could even have bartramgallery.com/photography/juniper-study-joshua-tree)
Second, rewrite the URLs using Rewrite Rules in the .htaccess file (see this post: http://www.seomoz.org/blog/rewriterule-split-personality-explained) - see the sketch just below.
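As a very rough, untested illustration for the structure above (results.php?category= comes from the Cartweaver forum suggestion mentioned further down, while detail.php and its product parameter are placeholders for whatever script and query string Cartweaver actually uses for single photos; the numeric ID stays at the end of the URL because a slug alone can't be mapped back to Cartweaver's IDs without changing the PHP):
RewriteEngine on
# photographer listing, e.g. /gordon-michael/10 -> results.php?category=10
RewriteRule ^([a-zA-Z0-9_-]+)/([0-9]+)/?$ results.php?category=$2 [L]
# single photo, e.g. /gordon-michael/juniper-study-joshua-tree/25 -> detail.php?product=25
RewriteRule ^([a-zA-Z0-9_-]+)/([a-zA-Z0-9_-]+)/([0-9]+)/?$ detail.php?product=$3 [L]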
I did a search on the Cartweaver support forums and found this:
http://forums.cartweaver.com/topic/google-analytics-identifying-products-and-categories
Oli, from the Cartweaver Support Team, seems to suggest the same "untested" approach as above
Let me know if you need any further help