Duplicate URL problem causing me problems
-
Hi, I am working on a Joomla site and I am using the sh404SEF plugin. I have contacted the developer of the plugin, who has not been very helpful, so I am hoping to get help here.
The problem I am having is that the description of the page showing in Google's listings is not the same as what I have put into the meta description tag.
For example, for this page:
http://www.clairehegarty.co.uk/virtual-gastric-band-with-hypnotherapy
the meta description should be:
Gastric Band Hypnotherapy to lose weight guaranteed. Free Gastric Band Hypnosis Consultations with Well Known Gastric Hypno Band expert as seen on TV. Hypno Gastric Band Works. We offer full support after your Gastric Band Hypnotherapy
but in Google it is showing:
Gastric Band Hypnotherapy Works. If you would like a slimmer and healthier body with all the benefits of weight loss surgery without any of the risks that can be ...
One thing I have noticed in the sh404SEF control panel is that I have the following URL:
index.php?option=com_content&Itemid=190&id=153&lang=en&view=article
The above is the original URL from day one, but I also have the one below, which is not the original:
index.php?option=com_content&Itemid=190&catid=150&id=153&lang=en&view=article
I keep deleting this second, non-original URL, but it keeps coming back, and I have been told this could be the cause of the problem.
Can anyone please help me stop it from coming back so that Google shows the correct description?
-
I'm not an expert with Joomla by any means, but based on your original post you seem confident about what the description tag should be, yet it's not showing up in the actual code, so I feel like something in your Joomla setup isn't working. Or possibly you have two plugins that are conflicting with one another?
-
Hi, I really do not understand. Can you explain more, please?
-
Hey Diane, just following up on Richard's post: I'm guessing there's something bigger going on with your Joomla plugins, because your description tag is blank. I checked both of the "non-pretty" URLs and they both direct to the same page with the same blank description. So I would be looking for a missing checkbox somewhere in the setup for this particular page.
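For reference (purely an illustration, not Joomla-specific code), when the page is set up correctly you should see your chosen text inside the head of the page when you view the source, something like:
<head>
  <!-- illustrative example only: the description text is taken from your post, shortened here -->
  <meta name="description" content="Gastric Band Hypnotherapy to lose weight guaranteed. Free Gastric Band Hypnosis Consultations with Well Known Gastric Hypno Band expert as seen on TV.">
</head>
At the moment that tag appears to be empty or missing on the page, which is why Google is writing its own snippet.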
-
Hi Diane, I've just seen that you have 'journeyman' status, so apologies if my previous comment was simply telling you what you already know!
Best of luck in finding a fix,
Richard
-
Hi Diane
Looking at the page you mention, I cannot see a meta description - it's blank - so Google has no choice but to try to pick the most appropriate words from the page.
Bear in mind there is no guarantee Google will display the meta description - it will either use the description or copy text from the page, depending on which it thinks is most appropriate for the search.
On its own, the meta description does not influence your position; however, as it is a powerful marketing message opportunity, better-written descriptions may get more clicks... and being more popular may improve the position.
Sorry, I don't know my way around Joomla, so others are better placed to help you with the exact fix, but I hope the above helps point you in the right direction.
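One general point, though (not Joomla-specific, so treat it as an illustration only): the usual way to deal with several URLs that point at the same content is a canonical tag in the page head, which tells Google which version of the URL is the preferred one, for example:
<!-- example only: point the canonical at the preferred SEF URL for the page -->
<link rel="canonical" href="http://www.clairehegarty.co.uk/virtual-gastric-band-with-hypnotherapy" />
sh404SEF or another Joomla SEO extension may be able to add this for you - someone who knows Joomla better can confirm the exact setting.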