Question about robots.txt
-
Solved!
-
Just a friendly reminder: please don't delete your question after it's been answered. It's very likely that someone in the future will have the same question, and they'll only be able to find the answer if the question is still here.
-
Consider deleting all of this:
Disallow: /&limit
Disallow: /?limit
Disallow: /&sort
Disallow: /?sort
Disallow: /?route=checkout/
Disallow: /?route=account/
Disallow: /?route=product/search
Disallow: /?route=affiliate/
Disallow: /?marca
Disallow: /&manufacturer
Disallow: /?manufacturer
Disallow: /?filter
Disallow: /&filter
Disallow: /?order
Disallow: /&order
Disallow: /?price
Disallow: /&price
Disallow: /?filter_tag
Disallow: /&filter_tag
Disallow: /?mode
Disallow: /&mode
Disallow: /?cat
Disallow: /&cat
Disallow: /?product_id
Disallow: /&product_id
Disallow: /*?keyword

Those rules are telling Google not to crawl domain.com/EVERYTHING (then the URL parameter). This could be where the issue stems from. If you're worried about URLs with these parameters ranking, consider implementing canonical tags instead, pointing to the proper pages.
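To see what those wildcard rules actually match, here's a minimal sketch of Google-style prefix-and-wildcard matching (a simplified illustration, not a full robots.txt parser; the function name and test URLs are made up for this example):

```python
import re

def rule_matches(rule: str, path: str) -> bool:
    """Return True if a robots.txt Disallow rule matches a URL path.

    Simplified Google-style matching: the rule is a prefix match
    anchored at the start of the path, with '*' matching any
    characters (the '$' end anchor is not handled here).
    """
    pattern = re.escape(rule).replace(r"\*", ".*")
    return re.match(pattern, path) is not None

# "Disallow: /*?keyword" blocks any URL containing "?keyword" anywhere:
assert rule_matches("/*?keyword", "/category/shoes?keyword=red")
# "Disallow: /?filter" only matches when the query follows the root slash:
assert rule_matches("/?filter", "/?filter=price")
assert not rule_matches("/?filter", "/category?filter=price")
```

Note how broad the wildcard form is: any URL carrying that parameter anywhere in its path or query is blocked from crawling, which is why canonical tags are often the safer tool for parameter handling.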
Related Questions
-
Can I robots.txt an entire site to get rid of duplicate content?
I am in the process of implementing Zendesk and will have two separate Zendesk sites with the same content, serving two separate user groups for the same product (B2B and B2C). Zendesk does not give me the option to change canonicals (or meta tags). If I robots.txt one of the Zendesk sites, will that cover me for duplicate content with Google? Is that a good option? Is there a better one? I will also have to change some of the canonicals on my site (mysite.com) to use the Zendesk canonicals (zendesk.mysite.com) to avoid duplicate content. Will I lose ranking by changing the established page canonicals on my site to point to the new subdomain (the only option offered through Zendesk)? Thank you.
On-Page Optimization | RoxBrock
-
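Regarding the question above: "robots.txt-ing" an entire host means serving, on that host only, a blanket disallow; a minimal sketch (keep in mind that disallowed URLs can still end up indexed, without a snippet, if other sites link to them):

```text
User-agent: *
Disallow: /
```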
Magento Duplicate Content Question - HELP!
In Magento, when entering product information, does the short description have to be different from the meta description? If they are both the same, is this considered duplicate content? Thanks for the help!
On-Page Optimization | LeapOfBelief
-
I have a question about too much content on a single page. Please help :)
I am working on a music-related site. We are building a feature in our system to allow people to write information about the songs on their playlist, so when a song is playing a user can read some cool facts or information about it. http://imgur.com/5jFumPW (screenshot). Some playlists have over 100 songs and can be completely random in genre and artist. If some of these playlists have over 5,000 words of content, is that going to hurt us? We will be very strict about making sure it's non-spammy, good content. Also, for the titles of the content, is it bad to have over 100 h3 tags on one page? Just want to make sure we are on the right track. Any advice is greatly appreciated.
On-Page Optimization | mikecrib1
-
Question Regarding Site Structure
I have a quick question regarding site structure that I hope some of you could share your opinion on. I watched a Whiteboard Friday from Rand a little while back where he explains that you should make the site structure as flat as possible, with no more than 3 links from the home page to reach the desired location. My question is this: I am looking at a site with a pretty complex structure that I am trying to clean up as much as possible without making any of their rankings suffer. They have www.domain.com/general-category/district/town/ and sometimes www.domain.com/general-category/district/town/item-specifics. I know it is not good as it is, but they are wary of changing too much because they have some serious traffic coming to the site. All the pages can be reached from the home page through the menus/sub-menus, but do these count as direct links from the home page? Another problem is that, because of this, Mozbot has detected that there are too many links from the home page and suggested that it should be below 200. Should I make these menu links noindex or nofollow? Obviously, if a menu link does count as direct from the home page, it won't after doing this. Thanks, Jenson
On-Page Optimization | jensonseo
-
Copyscape Duplicate Content Ownership Question
We have a site whose content has been copied verbatim to numerous other sites and articles. We were advised to change our content, but the content is originally ours. Does Google take that into account before applying duplicate-content penalties? And shouldn't Copyscape be able to show this information in its reports? It just doesn't seem right that the original author would have to change content because everyone else is stealing it. Any clarification on this?
On-Page Optimization | anthonytjm
-
Link juice question
In theory: if I have a page with only two outgoing, do-follow links, and both have the same target, only the first one will be counted, right? Will that link pass 100% or 50% of the link juice? (I don't think it makes any difference whether the anchor text is the same or different.)
On-Page Optimization | elgoog
-
How do we handle sitemaps in robots.txt when multiple domains point to the same physical location?
We have www.mysite.net, www.mysite.se, www.mysite.fi, and so on. All of these domains point to the same physical location on our web server, and we replace the text returned to the client depending on which domain was requested. My problem is this: how do I configure sitemaps in robots.txt when the same robots.txt file is used by multiple domains? If I, for instance, put the rows
Sitemap: http://www.mysite.net/sitemapNet.xml
Sitemap: http://www.mysite.net/sitemapSe.xml
in robots.txt, would that result in some cross-submission error?
On-Page Optimization | nordicnetproducts
-
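On the sitemap question above: per the sitemaps.org convention, each Sitemap line takes an absolute URL and a robots.txt file may contain several of them, so a shared file could in principle point each sitemap at its own host. A sketch using the hostnames from the question (whether search engines accept the cross-host references depends on their cross-submission rules):

```text
Sitemap: http://www.mysite.net/sitemapNet.xml
Sitemap: http://www.mysite.se/sitemapSe.xml
```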
The SEOmoz crawler is being blocked by robots.txt, need help
SEOmoz is showing me that robots.txt is blocking content on my site.
On-Page Optimization | CGR-Creative