Gallery system creates duplicates
-
Hi,
Does anybody know what I can do about these “duplicate content pages”?
1/ The home page shows up under 4 different URLs with different parameters. Should I use a meta robots tag to eliminate them? Or block them in robots.txt?
http://screencast.com/t/xqNiowCYBwgh
2/ Also, there are dozens of duplicates created by the “gallery system”, like this:
http://screencast.com/t/qTq4YERG
All showing under the same URL. There are multiple pages for each location. Some people have told me it's irrelevant for rankings anyway.
I suggested getting rid of the Flash website altogether and switching to a clean WordPress installation, but that's not an option.
Can you please help me with it?
Best Regards,
JJ
-
Sorry, I somehow missed your response.
Thanks for your advice; I'm aware of the problem. The whole site should be replaced with something more up to date, and that's the direction I'm trying to push them in.
Regards,
JJ
-
As to the homepage question, make sure you have your preferred domain set in Webmaster Tools (i.e. WWW vs non-WWW). Pick whether your homepage should have a trailing slash or not, and have all basic iterations redirect to your chosen variation (e.g. homepage.com, homepage.com/index.php, and www.homepage.com/index.php all redirect to www.homepage.com). If the parameters shown in your image don't significantly change the look or information on the page but are necessary, then consider adding canonical tags pointing from all parameterized versions of the homepage to the regular version of the page.
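A minimal sketch of those homepage redirects, assuming an Apache server with mod_rewrite enabled (the homepage.com domain is just the placeholder from the example above, not a real site):

```apache
# Hypothetical .htaccess sketch: force the www host with a 301 redirect
RewriteEngine On
RewriteCond %{HTTP_HOST} ^homepage\.com$ [NC]
RewriteRule ^(.*)$ http://www.homepage.com/$1 [R=301,L]

# Collapse /index.php onto the root URL
RewriteRule ^index\.php$ http://www.homepage.com/ [R=301,L]
```

For the parameterized homepage versions, a `<link rel="canonical" href="http://www.homepage.com/">` tag in the `<head>` of each variant tells search engines which version to treat as the original.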
Related Questions
-
Duplicate links from a forum: what to do?
After a crawl it found over 5k errors and over 5k warnings: duplicate page content, duplicate page titles, overly-dynamic URLs, missing meta descriptions, and title elements that are too long. All of those come from domain.com/forum/. I don't need SEO on the forum, so what should I do? What would be an easy solution: noindex? nofollow? Please help.
On-Page Optimization | OVJ0 -
Events in WordPress Creating Duplicate Content and Canonical Issues
Hi, I have a site which uses Event Manager Pro within WordPress to create events (as custom post types) on my blog. I use it to advertise cookery classes. In a given month I might run one type of class 4 times. The event page I have made for each class is the same; I duplicate it 4 times and just change the dates to promote it. The problem is that with over 10 different classes, each duplicated up to 4 times per month, I get loads of duplicate content errors. How can I fix this without redirecting people away from the correct page for the date they are interested in? Is it best just to use nofollow for ALL events and rely on the other parts of my site for SEO? Thanks, T23
On-Page Optimization | tekton230 -
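For duplicated event pages like these, one common approach (not stated in the question itself) is to point every dated copy at a single canonical URL rather than nofollowing them. A hypothetical sketch, with a placeholder path:

```html
<!-- Hypothetical sketch: each dated duplicate of a class page carries a
     canonical tag pointing at one "main" page for that class.
     The domain and path below are placeholders, not from the question. -->
<link rel="canonical" href="https://example.com/classes/bread-making/" />
```

This way visitors still land on the page for their chosen date, while search engines consolidate the duplicates onto one URL.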
Duplicate Content Issues with Forum
Hi Everyone, I just signed up last night and received the crawl stats for my site (ShapeFit.com). Since April of 2011, my site has been severely impacted by Google's Panda and Penguin algorithm updates and we have lost about 80% of our traffic during that time. I have been trying to follow the guidelines provided by Google to fix the issues and help recover but nothing seems to be working. The majority of my time has been invested in trying to add content to "thin" pages on the site and filing DMCA notices for copyright infringement issues. Since this work has not produced any noticeable recovery, I decided to focus my attention on removing bad backlinks and this is how I found SEOmoz. My question is about duplicate content. The crawl diagnostics showed 6,000 errors for duplicate page content and the same for duplicate page title. After reviewing the details, it looks like almost every page is from the forum (shapefit.com/forum). What's the best way to resolve these issues? Should I completely block the "forum" folder from being indexed by Google or is there something I can do within the forum software to fix this (I use phpBB)? I really appreciate any feedback that would help fix these issues so the site can hopefully start recovering from Panda/Penguin. Thank you, Kris
On-Page Optimization | shapefit0 -
Using a lightbox - possible duplicate content issues
Redesigning a website in WordPress and going to use the following lightbox plug-in: http://www.pedrolamas.pt/projectos/jquery-lightbox/ Naming the original images that appear on screen as, say, 'sweets.jpg' and the bigger versions of the images as 'sweets-large.jpg'. Alt-text-wise, I would give both versions of the images slightly different descriptions. Do you think there would be any duplicate content issues with this? Anything I should do differently? I'm very wary of doing anything that Google is likely to think is naughty, so I want to stay on their good side! Cheers, T
On-Page Optimization | Jon-C0 -
Is duplicate content harmful? An example from my site
I'm not talking about content copied from another site, but content unique to a site being used on several pages. I have a delivery tab that has precisely the same content as another product page. This content appears on four product pages and the dedicated delivery page. Thanks
On-Page Optimization | Brocberry0 -
Duplicate meta descriptions
Hi all, I'm using Yoast's SEO plugin, and when I run an On-Page Report Card here on SEOMOZ it says there are 2 description tags. I've been trying to fix this but can't (I'm new!). Anyone have any ideas on this? Thanks, Elaine
On-Page Optimization | elaineryan0 -
I have a question about on-page links and duplicate content
OK, help me out here, friends. I'm working through the warnings and errors for my site. I have two problems that relate to each other, and if you had to choose one to fix, I want to know which you would choose. I'm running into duplicate content and title errors because some product categories have so many products that they span more than one page, and each additional page has the same title or the same content. I reduced this in some cases by showing more products per page (like 100 items), so most categories now show only one page. But sometimes there's still more than one page, and showing everything on one page also creates too many links on that category page. So I think I can get rid of all the too-many-on-page-links warnings, but doing so will create more pages with duplicate content and title tags. What would you guys do?
On-Page Optimization | Dataken0 -
New CMS system - 100,000 old URLs - use robots.txt to block?
Hello. My website has recently switched to a new CMS. Over the last 10 years or so, we've used 3 different CMS systems on our current domain. As expected, this has resulted in lots of URLs. Up until this most recent iteration, we were unable to 301 redirect or use any page-level indexation techniques like rel="canonical". Using SEOmoz's tools and GWMT, I've been able to locate and redirect all pertinent, PageRank-bearing "older" URLs to their new counterparts. However, according to the Google Webmaster Tools 'Not Found' report, there are literally over 100,000 additional URLs out there it's trying to find. My question is, is there an advantage to using robots.txt to stop search engines from looking for some of these older directories? Currently we allow everything, only using page-level robots tags to disallow where necessary. Thanks!
On-Page Optimization | Blenny0
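For reference, blocking retired directories in robots.txt would look like the sketch below. The directory names are placeholders, since the actual old CMS paths aren't given in the question:

```
# Hypothetical robots.txt sketch: keep crawlers out of retired CMS directories
# (the paths below are made-up examples, not from the question)
User-agent: *
Disallow: /old-cms/
Disallow: /legacy-site/
```

One caveat worth keeping in mind: robots.txt stops crawling, not indexing, so already-indexed old URLs can linger in results; it mainly saves crawl budget that would otherwise be spent on 404s.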