Best use of robots.txt for "garbage" links from Joomla!
-
I recently started out on SEOmoz and am trying to do some cleanup based on the campaign report I received.
One of my biggest gripes is the "Duplicate Page Content" issue.
Right now I have over 200 pages flagged with duplicate page content.
This is triggered because SEOmoz has picked up auto-generated links from my site.
My site has a "send to friend" feature, and every time someone wants to send an article or a product to a friend via email, a pop-up appears.
It seems these pop-up pages have been picked up by the SEOmoz spider; however, they are pages I would never want indexed by Google.
So I just want to get rid of them.
Now to my question:
I guess the best solution is to make a general rule via robots.txt, so that these pages are not crawled or considered by Google at all.
But how do I do this? What should my syntax be?
A lot of the links look like this, but with different ID numbers depending on the product being sent:
http://mywebshop.dk/index.php?option=com_redshop&view=send_friend&pid=39&tmpl=component&Itemid=167
I guess I need a rule that catches the following and makes Google ignore any link containing it:
view=send_friend
-
Hi Henrik,
It can take up to a week for SEOmoz crawlers to process your site, which may be an issue if you recently added the rule. Did you remember to include all user agents in your first line?
User-agent: *
Be sure to test your robots.txt file in Google Webmaster Tools to ensure everything is correct.
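For example, a minimal robots.txt covering all crawlers might look like this (just a sketch; the exact Disallow pattern for your URLs is discussed further down the thread):
User-agent: *
Disallow: /*view=send_friend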
A couple of other things you can do (sketched below):
1. Add rel="nofollow" to your send-to-friend links.
2. Add a meta robots "noindex" to the head of the popup HTML.
3. And/or add a canonical tag to the popup. Since I don't have a working example, I don't know what to canonical it to (whatever content it is duplicating), but this is also an option.
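A rough sketch of what those three options might look like in the markup (the link URL and canonical target below are placeholders, since the real ones depend on your templates):
<!-- 1. nofollow on the send-to-friend link -->
<a href="index.php?option=com_redshop&view=send_friend&pid=39&tmpl=component&Itemid=167" rel="nofollow">Send to a friend</a>
<!-- 2. meta robots noindex in the popup's head -->
<meta name="robots" content="noindex">
<!-- 3. canonical on the popup, pointing at the page it duplicates -->
<link rel="canonical" href="http://mywebshop.dk/the-original-product-page">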
-
I just tried adding:
Disallow: /view=send_friend
(I removed the trailing slash.)
However, a new crawl gave me the duplicate content problem again.
Is my syntax wrong?
-
The second one, Disallow: /*view=send_friend, will prevent Googlebot from crawling any URL with that string in it. So that should take care of your problem.
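In a complete file, that would be:
User-agent: *
Disallow: /*view=send_friend
Note that the /* wildcard is what makes the rule match the parameter anywhere in the URL. robots.txt rules are prefix matches, so your earlier attempt, Disallow: /view=send_friend, only blocks URLs whose path literally begins with /view=send_friend, which none of yours do.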
-
So would my example link look like this in robots.txt?
Disallow: /index.php?option=com_redshop&view=send_friend&pid=&tmpl=component&Itemid=
Or
Disallow: /view=send_friend/
-
You're right. I would disallow via robots.txt and use a wildcard (*) wherever a unique item ID number could be generated.
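Applied to your example URL, either of these should work (sketches for Googlebot's wildcard syntax, so verify them in the Webmaster Tools robots.txt tester). The short catch-all:
Disallow: /*view=send_friend
Or the long form, substituting wildcards for the generated IDs:
Disallow: /index.php?option=com_redshop&view=send_friend&pid=*&tmpl=component&Itemid=*
The long form only matches if the parameters always appear in that order, which is why the catch-all is the safer choice.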