How to remove hundreds of duplicate pages
-
Hi - while checking for duplicate links, I am finding hundreds of duplicate pages:
-
pages with "undefined" after the domain name and before the sub-page URL
-
pages with /%5C%22/ after the domain name and before the sub-page URL
-
pages created by pagination limits
It's a Joomla site - http://www.mycarhelpline.com
Any suggestions - shall we:
-
301 redirect them
-
leave them as they stand
-
And what should we do with the pagination pages - shall we create a separate, unique title tag and meta description for every pagination page?
Thanks
-
Okay, I took a look at the plugin Ben recommended, and took another look at the SH404SEF one. The free one Ben recommended (http://extensions.joomla.org/extensions/site-management/sef/1063) looks like it can help with some duplicate content - but what I recommend is getting SH404SEF (http://anything-digital.com/sh404sef/features.html), because it allows you to set up canonical tags and also gives you the option to add rel="next" to your paginated pages, which is one of your problem areas.
One thing I noticed, though, is that it specifically states it "automatically adds canonical tags to non-html pages" - meaning it will apply them automatically to Joomla's default PDF view, etc. While this is helpful, it may not solve the full issue of your duplicate pages with the "undefined" and "/%5C%22/" strings.
It does, however, state that it "removes duplicate URLs" - how it identifies and removes these, I am not sure. You may want to try it out because it is useful for other optimization tasks - or contact the owner for more information.
If the tool doesn't recognize and remove the duplicate pages caused by "undefined" and "/%5C%22/", then you should disallow crawling of them in your robots.txt file. While you are in your robots.txt file, you should also remove the default "Disallow: /images/" line that Joomla adds, because you do want those images to be crawled.
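As a rough sketch, the robots.txt could end up looking something like this (the exact Disallow paths are assumptions based on how the strings were reported - verify them against the URLs in your crawl report):

User-agent: *
# Block the two duplicate-URL patterns (paths assumed from the report)
Disallow: /undefined
Disallow: /%5C%22
# Joomla's default "Disallow: /images/" line has been removed
# so that image files can be crawled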
Because a lot of these pages have already been crawled, you should also 301 redirect each duplicate page to its matching original. This sounds like it will be a long process - it may be aided by the sh404sef plugin, but I'm not sure.
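If the junk string always sits immediately after the domain, a pair of mod_rewrite rules in the site's .htaccess might handle the bulk of the redirects in one pass - a hedged sketch, not tested against this site:

RewriteEngine On
# Redirect /undefined/anything to /anything (pattern assumed)
RewriteRule ^undefined/(.*)$ /$1 [R=301,L]
# /%5C%22/ arrives decoded as \" before the pattern is matched, so the
# escaping is fiddly - test this one on a staging copy first
RewriteRule "^\\\"/(.*)$" /$1 [R=301,L]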
I just want to also add that I am in no way affiliated with any of these plugins.
-
The only way to solve the duplication error you are getting is to make the URLs distinct. Googlebot comes to your site and looks at the URLs, and if they are not distinct it may not index them very well. I understand your site is showing up fine in the SERPs, so this may be one of those items you place at a lower priority until later.
I think R.May knows Joomla, so I'll defer to him on how to accomplish this, but it may be worth making the adjustment. You may find that making your page URLs more distinct actually improves your current SERPs. Just a thought.
Other than that, if your site isn't hurting and the only thing you are concerned about is the report in SEOmoz, then I would move on and just make a mental note of it for later.
-
Hi Ben - changing the URLs is not really required, as the site is getting good SERPs. However, we do want to address the duplication issue to safeguard against any future problems.
-
Hi - thanks for replying
-
For the dynamic URLs - yes, this was missed at the initial launch, and it isn't really needed now, as the pages are being indexed and ranking well in the SERPs.
-
For pagination - we need it in places like our used car section, discount section and news section, where multiple pages are created. Shall we create a separate title and meta description for every pagination page? Is that really required?
http://www.mycarhelpline.com/index.php?option=com_usedcar&view=category&Itemid=3
-
The "undefined" and /%5C%22/ strings are reported by SEOmoz on almost every page of the site (except the home page) - according to the Moz report, the dynamic URL after the domain name is preceded by these two strings.
How do we get this corrected? We want to prevent this duplication and avoid taking a hit in the future, even though things are going well now.
-
I'm not a Joomla expert, but to make your URLs search engine friendly you are going to need to add an extension like this. That will allow you to create more distinct URLs that will no longer be considered "duplicate".
-
Joomla has so many duplicate content issues that you have to know Joomla really well to avoid most of them. The biggest issue is that you didn't enable SEF URLs from the start and left the default index.php?option=com URLs on most pages, which stuffs your URLs full of ugly parameters.
You can still enable this in your global configuration and with a quick edit to .htaccess - but it will change all of your current URLs and you will need to 301 all of them, so that isn't a great option unless you are really suffering. And depending on whether you are on Joomla 1.6 or earlier, it is a time-consuming, nasty process. Also, this is unlikely to get rid of any existing duplicate pages - but it may make finding and dealing with them easier.
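For reference, on a typical Apache host the steps are roughly as follows (menu labels vary slightly between Joomla versions):

# Joomla Global Configuration -> SEO Settings:
#   Search Engine Friendly URLs: Yes
#   Use URL Rewriting (a.k.a. "Use Apache mod_rewrite"): Yes
# Then activate the stock rewrite rules Joomla ships with:
mv htaccess.txt .htaccess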
I don't see the specific examples you posted, though - where are you seeing "undefined" and "/%5C%22/"?
You should implement rel=canonical on the correct version of each page. I recommend SH404SEF, a Joomla plugin that makes this process easier - but it isn't free. I don't know of a good free plugin that does this, and Joomla's templates make doing it manually difficult.
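For anyone attempting it manually, the tag itself is a single line in the <head> of every duplicate variant, pointing at the preferred URL - here using the used-car category URL posted earlier as the example:

<!-- in the <head> of each duplicate variant of this page -->
<link rel="canonical" href="http://www.mycarhelpline.com/index.php?option=com_usedcar&amp;view=category&amp;Itemid=3" />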
Looking at it quickly, I also didn't notice any articles that were paginated, but you should follow the rel="next" and rel="prev" pattern for paginated pages. This is likely something you will have to edit your Joomla core files to do.
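The markup itself is straightforward - on page 2 of a paginated series it would look like this (URLs are placeholders):

<!-- in the <head> of page 2 of the series -->
<link rel="prev" href="http://www.example.com/used-cars?page=1" />
<link rel="next" href="http://www.example.com/used-cars?page=3" />

This also speaks to the earlier question about pagination: with rel="next"/rel="prev" in place, search engines treat the series as a single sequence, so hand-writing a unique title and meta description for every paginated page is generally considered unnecessary.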