Question about robots.txt
-
Solved!
-
Just a friendly reminder. Please don't delete your question after it's been answered. It's very likely that someone in the future could have the same question and they would have been able to find the answer if you hadn't deleted the question.
-
Consider deleting all of this:
Disallow: /&limit
Disallow: /?limit
Disallow: /&sort
Disallow: /?sort
Disallow: /?route=checkout/
Disallow: /?route=account/
Disallow: /?route=product/search
Disallow: /?route=affiliate/
Disallow: /?marca
Disallow: /&manufacturer
Disallow: /?manufacturer
Disallow: /?filter
Disallow: /&filter
Disallow: /?order
Disallow: /&order
Disallow: /?price
Disallow: /&price
Disallow: /?filter_tag
Disallow: /&filter_tag
Disallow: /?mode
Disallow: /&mode
Disallow: /?cat
Disallow: /&cat
Disallow: /?product_id
Disallow: /&product_id
Disallow: /?route=affiliate/
Disallow: /*?keyword

Those rules are telling Google not to crawl domain.com/EVERYTHING (followed by the URL parameter). This could be where the issue stems from. If you're worried about URLs with these parameters ranking, consider implementing canonical tags instead to point to the proper pages.
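If you want to sanity-check which URLs a set of Disallow rules actually blocks before removing them, Python's built-in urllib.robotparser is a quick way to do it. One caveat: robotparser only does prefix matching and doesn't support Google's * and $ wildcards, so this sketch covers the prefix-style rules from the list above (example.com stands in for the real domain):

```python
from urllib import robotparser

# A subset of the Disallow rules quoted in the answer above.
rules = """\
User-agent: *
Disallow: /?limit
Disallow: /?sort
Disallow: /?route=checkout/
""".splitlines()

rp = robotparser.RobotFileParser()
rp.parse(rules)

# Any URL whose path + query starts with a disallowed prefix is blocked...
print(rp.can_fetch("Googlebot", "https://example.com/?sort=price"))       # False
print(rp.can_fetch("Googlebot", "https://example.com/?route=checkout/"))  # False
# ...while ordinary category/product URLs remain crawlable.
print(rp.can_fetch("Googlebot", "https://example.com/blue-dress"))        # True
```

This is only a local simulation of the matching logic; for rules that rely on wildcards (like /*?keyword), test against Google's own robots.txt tester instead.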
Related Questions
-
Author Byline Question
What's the best practice for displaying author information at the beginning of an article? We're presently displaying it as: By <Author> • Jan 16, 2015 • <City>. We're considering making it even more concise by removing the term 'By'. Would we be shooting ourselves in the foot if we did this? Any other ways we should optimize?
On-Page Optimization | TheaterMania
Does anyone use Genesis Framework? If so, can a newbie use it? And a few other questions
Hi, So as I search the wonderful land of the internet, I see this Genesis framework brought up quite a bit. I have researched it for a few weeks, but it seems like it uses hooks instead of shortcodes. So I am curious if anyone has used it? And if so, what your thoughts are about it? I am a COMPLETE newbie here, so hooks look scary. I am sure with time they will seem like second nature. They claim it has airtight security. So if you have used this framework, how is this any different from an updated stock WordPress site? I understand that vulnerabilities may be in plugins and such, but if it is really airtight, that seems great. Any thoughts are appreciated as I just want the best user experience. So many people use this framework, yet my site gets, if I'm lucky, 1,000 views each month. It is a basic site to let people know we exist. So it's not like I have a popular blog with 50,000 pageviews each month. But... going into the future, I want a pleasant and consistent user experience. Maybe a WordPress theme is all you need. Maybe a framework is more for developers. Any thoughts are greatly appreciated. Chris
On-Page Optimization | asbchris
I have a question about having too much content on a single page. Please help :)
I am working on a music related site. We are building a feature in our system to allow people to write information about songs on their playlist. So when a song is currently being played, a user can read some cool facts or information about the song. http://imgur.com/5jFumPW (screenshot). Some playlists have over 100 songs and could be completely random in genre and artist. I am wondering, if some of these playlists have over 5,000 words of content, is that going to hurt us? We will be very strict about making sure it's non-spammy, good content. Also, for the titles of the content, is it bad to have over 100 h3 tags on one page? Just want to make sure we are on the right track. Any advice is greatly appreciated.
On-Page Optimization | mikecrib1
Google Index/Cache questions
I have 15k+ pages. I have 4.5k pages indexed. What relation does the Google cache have to indexing pages? My site gets cached every two days. The competition in my SERP takes 2-3 weeks to get cached. What does this indicate? Is your cache date your last Google crawl? How can I get Google to crawl my site? Is there a way I can get Google to crawl my site starting from an internal page? This way I could set up a better linking structure that would benefit from doing activities that get that page indexed, to help get my site indexed more thoroughly...
On-Page Optimization | JML1179
An ecommerce SEO question
Looking for a few opinions on this please... Trying to reduce the number of pages I have to SEO to rank on my websites and at the same time avoid the Google over-optimisation issues. Previously on our ecommerce websites we would have a category page for, say, 12 items; we would then SEO that page for generic terms related to the page, i.e. blue dress, cheap blue dress, blue party dress, etc. The individual product pages would then be SEOed with the title and h1 tags containing the exact product name, and the URL containing the product name too. This worked fine, but we are suffering from some duplicate content issues of late (the products are a mixture of a few unique items and probably 95% imported affiliate datafeeds). As we have an average of 80,000 products per store, we have neither the time nor the staff to rewrite everything (the products update daily directly from the merchants, so it would need to be done daily). What we are planning on moving toward is blocking the individual product pages from Google and instead putting all efforts into the category pages. The category page will contain plenty of quality unique content related to the category, so the only duplicate content would be a line of the product name and price. Whilst we would still rank the category page for broad keywords, we would also now like to rank the category page for 16 individual product names, as there is a good profit to be made by the sheer volume of product names we plan on ranking for. Obviously we could not get all the products into the URL and the page title as that would be silly, but would it be acceptable to have multiple h2 tags on the page, each with a different entry, the product names? (h1 will be saved for the category name.) We can easily bold these keywords to help in the optimisation as per the SEOmoz on-site analysis tool, and we can add image text to ensure the product name is featured at least twice on the page.
As so few sites actually SEO for the long-tail product names (most retailers rank by virtue of their domain quality alone), our on-site SEO doesn't have to be 100%, but getting the best we can out of the page will help the efforts. Many thanks, Carl
On-Page Optimization | Grumpy_Carl
Robots.txt file
Does it serve any purpose if we omit the robots.txt file? I wonder, if a spider has to read all the pages anyway, why do we insert a robots.txt file?
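For context on that question: well-behaved crawlers treat a missing robots.txt the same as an explicit allow-everything file, so the file only matters when you want to opt some paths out of crawling. The allow-all equivalent looks like this:

```text
User-agent: *
Disallow:
```

An empty Disallow value means "nothing is disallowed", so for compliant crawlers, omitting the file entirely has the same effect as the two lines above.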
On-Page Optimization | seoug_2005
Does Google respect User-agent rules in robots.txt?
We want to use an inline linking tool (LinkSmart) to cross-link between a few key content types on our online news site. LinkSmart uses a bot to establish the linking. The issue: there are millions of pages on our site that we don't want LinkSmart to spider and process for cross-linking. LinkSmart suggested setting a noindex tag on the pages we don't want them to process, and that we target the rule to their specific user agent. I have concerns. We don't want to inadvertently block search engine access to those millions of pages. I've seen Googlebot ignore nofollow rules set at the page level. Does it ever arbitrarily obey rules that it's been directed to ignore? Can you quantify the level of risk in setting user-agent-specific nofollow tags on pages we want search engines to crawl, but that we want LinkSmart to ignore?
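On the robots.txt side of that question, crawlers that honor the protocol pick the user-agent group that matches their own token and ignore the other groups, so a rule aimed at one bot does not restrict the rest. A sketch of that selection logic using Python's urllib.robotparser (the LinkSmart token, /archive/ path, and example.com domain here are hypothetical):

```python
from urllib import robotparser

# Hypothetical rules: block only the LinkSmart bot from /archive/,
# leave everything open for all other crawlers.
rules = """\
User-agent: LinkSmart
Disallow: /archive/

User-agent: *
Disallow:
""".splitlines()

rp = robotparser.RobotFileParser()
rp.parse(rules)

# The LinkSmart group applies only to that user agent...
print(rp.can_fetch("LinkSmart", "https://example.com/archive/story"))  # False
# ...while Googlebot falls through to the catch-all group and is allowed.
print(rp.can_fetch("Googlebot", "https://example.com/archive/story"))  # True
```

This only demonstrates how group matching works; it can't tell you whether a given third-party bot actually obeys the rules, which is the asker's real risk question.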
On-Page Optimization | lzhao