Joomla Duplicate Page content fix for mailto component?
-
Hi,
I am currently working on my site and have the following duplicate page content issues:
My Uni Essays
http://www.myuniessays.co.uk/component/mailto/?tmpl=component&template=it_university&link=2631849e33
My Uni Essays
http://www.myuniessays.co.uk/component/mailto/?tmpl=component&template=it_university&link=2edd30f8c6
This happens 15 times.
Any ideas on how to fix this, please?
Thank you
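Before fixing, it can help to list exactly which URLs share the duplicated title. A minimal Python sketch (the function name is made up, and the sample data just mirrors the two example URLs above):

```python
from collections import defaultdict

def find_duplicate_titles(pages):
    """Group URLs by page title; return only titles shared by more than one URL."""
    groups = defaultdict(list)
    for url, title in pages:
        groups[title.strip()].append(url)
    return {t: urls for t, urls in groups.items() if len(urls) > 1}

# Made-up sample based on the crawl report above: every mailto URL shares one title.
pages = [
    ("http://www.myuniessays.co.uk/component/mailto/?tmpl=component&link=2631849e33", "My Uni Essays"),
    ("http://www.myuniessays.co.uk/component/mailto/?tmpl=component&link=2edd30f8c6", "My Uni Essays"),
    ("http://www.myuniessays.co.uk/essay-writing", "Essay Writing"),
]
print(find_duplicate_titles(pages))  # only the shared title survives the filter
```

Feeding it a real crawl export (URL, title pairs) shows at a glance which template is generating the duplicates.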
-
Hi Yiannis
Completely agree, but need to keep using Joomla for this one unfortunately.
I have hidden the icons within the admin panel, so I will see if this works during the next Moz crawl.
Thanks!
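If hiding the icons doesn't stop the crawl, another commonly suggested approach is to block the mailto component in robots.txt. A sketch, assuming the default Joomla URL structure shown in the report above (not verified against this particular site):

```text
# robots.txt at the site root - stop crawlers fetching Joomla's mailto pop-ups
User-agent: *
Disallow: /component/mailto/
```

Note that robots.txt only blocks crawling, not indexing: URLs that are already indexed may also need a noindex in the component template, or removal via Webmaster Tools.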
-
Hello,
I hate to sound pessimistic, but I want to save you time: buy a WordPress template or a bespoke design and move away from Joomla.
I had the exact same problem, 10-15 links a day exactly like yours. I did everything in my power to fix it. I bought the plugin the community recommended and did redirects, but I just ended up with hundreds of links every month.
If someone else has a solution, go for it, but again, your best bet is to move away from Joomla.
Related Questions
-
How to fix Category Duplicate Titles Issue?
How do I fix category duplicate titles and descriptions issues? It's a very common problem in WordPress. Example: http://www.abc.com.au/news
Intermediate & Advanced SEO | varunrupal
http://www.abc.com.au/news/page/3
http://www.abc.com.au/news/page/4
http://www.abc.com.au/news/page/5
http://www.abc.com.au/news/page/10
http://www.abc.com.au/news/page/6
http://www.abc.com.au/news/page/7
http://www.abc.com.au/news/page/9
http://www.abc.com.au/news/page/8
-
Loading Content Asynchronously for Page Speed Purposes?
Pages for my company's play process load slowly because the process is heavy. Below the play process there is a block of text, put there mostly for SEO purposes. R&D are proposing to load the SEO area only after the play process has loaded.
Intermediate & Advanced SEO | theLotter
This seems like a very bad solution, because loading the SEO area asynchronously will make the content unreadable to Google. Am I missing something?
-
Duplicate content for hotel websites - the usual nightmare? Is there any solution other than producing unique content?
Hiya Mozzers, I often work for hotels. A common scenario is that the hotel/resort has worked with their Property Management System to distribute their booking availability around the web to third-party booking sites, and with the inventory go duplicate page descriptions sent to these "partner" websites. I was just checking duplication on a room description: 20 loads of duplicate descriptions for that page alone. There are 200 rooms, so I'm probably looking at 4,000 loads of duplicate content that need rewriting to prevent duplicate content penalties, which will cost a huge amount of money. Is there any other solution? Perhaps ask the booking sites to block the relevant pages from search engines?
Intermediate & Advanced SEO | McTaggart
-
Duplicate content mess
One website I'm working with keeps an HTML archive of content from various magazines they publish. Some articles were repeated across different magazines, sometimes up to 5 times. These articles were also used as content elsewhere on the same website, resulting in up to 10 duplicates of the same article on one website.
With regards to the 5 that are duplicates but not contained in the magazine, I can delete (resulting in 404) all but the highest-value copy of each (most don't have any external links). There are hundreds of occurrences of this and it seems unfeasible to 301 or noindex them. After seeing how their system works, I can canonical the remaining duplicate that isn't contained in the magazine to the corresponding original magazine version - but I can't canonical any of the other versions in the magazines to the original. I can't delete the other duplicates as they're part of the content of a particular issue of a magazine.
The best thing I can think of doing is adding a link in the magazine duplicates to the original article, something along the lines of "This article originally appeared in...", though I get the impression the client wouldn't want to reveal that they used to share so much content across different magazines. The duplicate pages across the different magazines do differ slightly as a result of the different Contents menu for each magazine. Do you think what I'm doing will be better than how it was, or is there something further I can do? Is adding the links enough? Thanks. 🙂
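For the copies that can be canonicalised, the tag itself is just one element in the `<head>` of each duplicate page. A sketch with a hypothetical URL (example.com is a placeholder, not the actual site):

```html
<!-- On the duplicate copy, pointing at the chosen original magazine version -->
<link rel="canonical" href="http://example.com/magazine-1/original-article" />
```

Search engines treat this as a strong hint (not a directive) to consolidate ranking signals onto the original.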
Intermediate & Advanced SEO | Alex-Harford
-
Duplicate content clarity required
Hi, I have access to a massive resource of journals, and we have been given the all-clear to use the abstracts on our site and link back to the journal. These will be really useful links for our visitors. E.g. http://www.springerlink.com/content/59210832213382K2 Simply put: if we copy the abstract and then link back to the journal source, will this be treated as duplicate content and damage the site, or is the link to the source enough for search engines to realise that we aren't trying anything untoward? Would it help if we added an introduction, so in effect we are sort of following the curated content model? We are thinking of linking back internally to a relevant page using a keyword too. Will this approach give any benefit to our site at all, or will the content be ignored due to it being duplicate and thus render the internal links useless? Thanks Jason
Intermediate & Advanced SEO | jayderby
-
How best to handle (legitimate) duplicate content?
Hi everyone, appreciate any thoughts on this (bit long, sorry). I am working on 3 sites selling the same thing; the main difference between each site is physical location/target market area (think North, South, West as an example). Now, say these 3 sites all sell Blue Widgets, and thus all on-page optimisation has been done for this keyword. These 3 sites are now effectively duplicates of each other - well, the Blue Widgets page is at least - and whilst there are no 'errors' in Webmaster Tools, I am pretty sure they ought to be ranking better than they are (good PA, DA, mR etc). Sites share the same template/look and feel too AND are accessed via the same IP - just for good measure 🙂 So, to questions/thoughts:
1 - Is it enough to try and get creative with on-page changes to try and 'de-dupe' them? Kinda tricky with the Blue Widgets example - how many ways can you say that? I could focus on the geographical element a bit more, but would like to rank well for Blue Widgets generally.
2 - I could, I guess, noindex/nofollow the Blue Widgets page on 2 of the sites - seems a bit drastic though (or robots.txt them).
3 - I could even link (via internal navigation) sites 2 and 3 to site 1's Blue Widgets page and thus make 2 Blue Widgets pages redundant.
4 - Is there anything HTML-coding-wise I could do to pull in site 1's content to sites 2 and 3, without cloaking or anything nasty like that?
I think 1 is the first thing to do. Anything else? Many thanks.
Intermediate & Advanced SEO | Capote
-
What is the best way to allow content to be used on other sites for syndication without taking the chance of duplicate content filters
Cookstr appears to be syndicating content to shape.com and mensfitness.com. a) They integrate their data into partner sites with an attribution back to their site, skinned with the partner's look. b) They link the image back to their image hosted on Cookstr. c) The page does not have microformats or as much data as their own page does, so their own page is better for SEO. Is this the best strategy, or is there something better they could be doing to safely allow others to use our content? We don't want to share the content if we're going to get hit by a duplicate content filter or have another site outrank us with our own data. Thanks for your help in advance!
Intermediate & Advanced SEO | irvingw
Their original content page: http://www.cookstr.com/recipes/sauteacuteed-escarole-with-pancetta
Their syndicated content pages:
http://www.shape.com/healthy-eating/healthy-recipes/recipe/sauteacuteed-escarole-with-pancetta
http://www.mensfitness.com/nutrition/healthy-recipes/recipe/sauteacuteed-escarole-with-pancetta
-
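One further option for the syndication question above: Google supports cross-domain rel=canonical, so the partner copies could point back at the Cookstr original. A sketch of what the partner pages would carry (assuming the partners agree to add it):

```html
<!-- In the <head> of the shape.com / mensfitness.com copies -->
<link rel="canonical" href="http://www.cookstr.com/recipes/sauteacuteed-escarole-with-pancetta" />
```

This consolidates ranking signals onto the original while still letting partners display the full content.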
Home page deindexed by Google: how to determine why, and how to fix it
On Wednesday I noticed our domain was no longer ranking for our keyword and our product, Isolator Fitness (http://isolatorfitness.com). I have been researching and not finding answers as to why it happened and what to do to fix it. We have about 800 other pages still listed. I am new to all this SEO stuff; can anyone guide me in the right direction?
Intermediate & Advanced SEO | David75
History: about 10 days ago I went to Google Webmaster Tools and noticed a large number of errors due to the fact the robots could not crawl our site. I looked at the site and found that the Privacy setting in WP was set to block robots. I changed the setting and had Google re-crawl the site; it looked like Google had not been able to crawl the site for about 3 months. On Monday I did a 301 redirect from one of our other sites, for another product we sell, to http://isolatorfitness.com/6-pack-bags. That site had a good bit of backlinks. Would doing all this at one time cause this? How do I determine if we did anything wrong? Thanks