Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies.
How do I get rid of duplicate content, titles, etc. on a PHP Cartweaver site?
-
My website http://www.bartramgallery.com was created about five years ago by a web developer using PHP and Cartweaver 2.0. I was really happy with the results of the design, was inspired to get into web development, and have been studying ever since. My biggest problem at this time is that I am not knowledgeable about PHP or the Cartweaver product, though I am learning as I read more. The issue is that the SEOmoz tools are reporting tons of duplicate content, duplicate page titles, etc. This is likely from the dynamic URLs and the same pages appearing with secondary results. I just made a new sitemap with AuditMyPC (I think it was called) in an attempt to get rid of all the duplicate page titles, but is that going to solve anything, or do I need to find another way to configure the site? There are many pages with the same content competing for page rank, and it is a bit frustrating to say the least. If anyone has any advice it would be greatly appreciated, even just pointing me in the right direction.
Thank you,
Jesse
-
I am still researching a bunch of sites, trying to figure out a way to get the product name at the end of the URL, which would be great, as that is the page title. I just thought I would mention that I am working on it, and see whether you thought it was not possible due to Cartweaver's limitations, as you mentioned. It's funny that I have spent so much time trying to get my URLs to show up how they should; it seems this could have been configured into the original product. Beggars can't be choosers.
-
Yes, I am going to take a look at that when I get home. Perhaps I have to change how a few things are referenced, as well as create the change of address, right? Because if you type in the normal nasty dynamic URL it still goes to the nasty URL, but if I select the new URL and paste it, it brings up the page as I mentioned above, basically stripped of images and styling.
I am wondering if it is possible to include that number at the end, as it identifies the actual image and could potentially populate the title of the image at the end of the URL, which would be sweet. Of course, then I would have a new problem of the URL being too long, as I have made the titles pretty keyword-rich on a lot of them to make a proper page title.
If this all works out, I will have to create a link to your site from Cartweaver and from a couple of my sites, as you have been a great help. From what I can tell, you have been able to properly diagnose a fairly complex issue with PHP and Cartweaver, even without having seen a site quite like this one before. Thank you.
-
I'm guessing the paths used to reference the images & CSS files are relative to the results.php file. Now that the rewritten URLs contain "/"s, the best thing to do is to change the template to either hard-code an absolute path or use a forward slash at the start so everything always resolves from the root, e.g.
Old code:
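(Hypothetical relative references; I can't see the actual template, so the file names here are illustrative)
<link rel="stylesheet" href="css/style.css" />
<img src="images/photo.jpg" />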
New code:
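(The same references made root-relative, so they resolve from the site root no matter how deep the rewritten URL is)
<link rel="stylesheet" href="/css/style.css" />
<img src="/images/photo.jpg" />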
or
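(Fully absolute, if you'd rather hard-code the domain)
<link rel="stylesheet" href="http://www.bartramgallery.com/css/style.css" />
<img src="http://www.bartramgallery.com/images/photo.jpg" />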
-
I tried the test example you did above and it was pretty cool. With the web address http://www.bartramgallery.com/photographer/charles-cramer/10.php it rendered a page with, I believe, everything except the design, styling, and imagery. Not sure what causes that to occur; perhaps it is missing something, but that was a pretty quick stab at fixing my URL issue. I am too tired now and need to go to bed, haha. Thanks.
-
No worries
Look forward to seeing the site with the new URLs in place - a lot of great photos on that site that need to be shared with everyone
-
Yes, it appears that cleaning up this URL issue is a pretty big task, but well worth it. I was surprised by the moderators of the Cartweaver forums discounting the URL as if it were not important, because they are very good developers. I think the URL is much more important than some realize, as it is both keyword-rich and more interesting to the customer: I am far less likely to click on some random URL that has no meaning than one that clearly spells out what the page is about. Thanks Woj, I am humbled and realize I have some studying to do.
-
There are 2 issues here:
-
Need to fix the URLs for a better user experience & for search engines, which can be done with rewrite rules in .htaccess.
The one suggested by the support forum (I've modified to better match your site but it's untested):
RewriteEngine on
RewriteRule ^photographer/([a-zA-Z0-9_-]+)/([0-9]+)\.php$ results.php?category=$2
The URLs would then be:
http://www.bartramgallery.com/photographer/charles-cramer/10.php (not ideal with "/10.php" at the end but may be best given the limitations of the cart)
which rewrites to: http://www.bartramgallery.com/results.php?category=10
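(An untested variant following the same pattern: if the template can ever print the photo title into the link, a keyword-rich slug with the ID kept at the end would still map back to the same category parameter)
RewriteRule ^photographer/([a-zA-Z0-9_-]+)/([a-zA-Z0-9_-]+)-([0-9]+)\.php$ results.php?category=$3
-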
Clean up the Google index (remove old URLs & add new ones)
Since both URLs will render the same content, we can fix this by adding a rel="canonical" tag, attributing one source for the duplicate content. Check if you can do this dynamically in the templates, but be very careful not to canonical everything to the homepage, or all your pages except the home page will be wiped out of the index!
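(A sketch of what the tag would look like in the <head> of the results template; the href here is just the Charles Cramer example from above and would need to be generated per page)
<link rel="canonical" href="http://www.bartramgallery.com/photographer/charles-cramer/10.php" />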
-
When I read it, it seems that the .htaccess approach is the way to go, in that the old links can still work for Google while the new URLs are used in presentation to the customer and for keywords. The only thing I was confused about is that it seemed it would not be good to do redirects, but rather rewrites... or is it saying to do both?
-
Thanks
-
Great answer Woj!
-
My pleasure
If you set up redirects, you shouldn't lose any traffic
This can also be controlled via htaccess
In Google, search for this "site:bartramgallery.com" (without the double quotes) & you will see all the pages you need to redirect
I see the Charles Cramer page as the first photographer page that comes up. One catch: the old URL carries a query string, which Apache's plain Redirect directive can't match, so the 301 needs a mod_rewrite rule instead:
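(An untested sketch for the root .htaccess; /charles-cramer stands in for whatever clean URL you settle on, and the trailing ? strips the old query string from the target)
RewriteEngine on
RewriteCond %{QUERY_STRING} ^category=10$
RewriteRule ^results\.php$ http://www.bartramgallery.com/charles-cramer? [R=301,L]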
-
Thank you, Woj, for taking the time to look at my site; I like that organization method. I was not aware it was possible to reorganize my site like that. I will definitely have to research and study a bit to be able to approach this, and for a while I will probably lose traffic, but in the end, after the changes, the site should be on a much better footing going forward.
-
I'm not familiar with Cartweaver, but these are just guides.
First, define an organised URL structure. On bartramgallery.com, at a quick glance, a good one could be:
-
bartramgallery.com/photographer (e.g. bartramgallery.com/gordon-michael)
-
bartramgallery.com/photographer/photo (e.g. bartramgallery.com/gordon-michael/juniper-study-joshua-tree)
OR
bartramgallery.com/landscape-photography/photo (e.g. bartramgallery.com/landscape-photography/juniper-study-joshua-tree)
Keep in mind that the shorter the URLs, the better (you could even have bartramgallery.com/photography/juniper-study-joshua-tree)
Second, rewrite the URLs using Rewrite Rules in the htaccess file (see this post: http://www.seomoz.org/blog/rewriterule-split-personality-explained)
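(A purely illustrative, untested sketch of such a rule for the structure above; the "photographer" and "photo" parameter names are hypothetical, and as noted elsewhere in this thread Cartweaver 2 may only accept numeric IDs, in which case the URL would need to keep an ID in it)
RewriteEngine on
RewriteRule ^([a-z0-9-]+)/([a-z0-9-]+)$ results.php?photographer=$1&photo=$2 [L]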
I did a search on the Cartweaver support forums and found this:
http://forums.cartweaver.com/topic/google-analytics-identifying-products-and-categories
Oli, from the Cartweaver Support Team, seems to suggest the same "untested" approach as above
Let me know if you need any further help
-