400 errors and URL parameters in Google Webmaster Tools
-
On our website we do a lot of dynamic image resizing, using a script that automatically resizes an image depending on parameters in the URL, like:
www.mysite.com/images/1234.jpg?width=100&height=200&cut=false
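For illustration only, here is a minimal sketch of what such a resize endpoint might look like (hypothetical: the actual script isn't shown in the question, and Flask, Pillow, and the storage path used here are assumptions):
```python
# Hypothetical sketch of a parameter-driven resize endpoint; not the
# poster's actual script.
from io import BytesIO

from flask import Flask, request, send_file
from PIL import Image

app = Flask(__name__)

@app.route("/images/<image_id>.jpg")
def resized_image(image_id):
    # Read the resize parameters from the query string.
    width = int(request.args.get("width", 100))
    height = int(request.args.get("height", 200))
    cut = request.args.get("cut", "false").lower() == "true"

    img = Image.open(f"/var/www/images/{image_id}.jpg")  # assumed storage path
    if cut:
        # One possible reading of "cut": crop the top-left region to the
        # target box before resizing.
        img = img.crop((0, 0, min(img.width, width), min(img.height, height)))
    img = img.resize((width, height))

    buf = BytesIO()
    img.save(buf, format="JPEG")
    buf.seek(0)
    return send_file(buf, mimetype="image/jpeg")
```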
In Webmaster Tools I have noticed there are a lot of 400 errors on these image URLs.
Also, when I click the URLs listed as causing the errors, they are URL-encoded and go to pages like this (which gives a bad request):
www.mysite.com/images/1234.jpg?%3Fwidth%3D100%26height%3D200%26cut%3Dfalse
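Decoding that URL makes the problem clearer: the query string itself has been percent-encoded, so the script never receives width, height, or cut as separate parameters, which is presumably why the server answers 400. A quick check in Python, using the example URL above:
```python
from urllib.parse import unquote

bad = "www.mysite.com/images/1234.jpg?%3Fwidth%3D100%26height%3D200%26cut%3Dfalse"
print(unquote(bad))
# -> www.mysite.com/images/1234.jpg??width=100&height=200&cut=false
# The whole query string (including a second "?") was encoded as one opaque
# value, so no width/height/cut parameters reach the resize script.
```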
What are your thoughts on what I should do to stop this?
I notice in my Webmaster Tools "URL Parameters" section there are parameters for:
height
width
cut
which must be from the image URLs.
These are currently set to "Let Google Decide", but should I change them manually to "Doesn't affect page content"?
Thanks in advance
-
The easiest way would be to add a disallow line to your robots.txt file.
From Google:
- To block access to all URLs that include a question mark (?) (more specifically, any URL that begins with your domain name, followed by any string, followed by a question mark, followed by any string):
```
User-agent: Googlebot
Disallow: /*?
```
More info: http://www.google.com/support/webmasters/bin/answer.py?answer=156449
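To get a feel for what that rule matches before relying on it, here is a rough sketch of the wildcard behaviour (an approximation of Google's documented semantics, where * matches any sequence of characters; this is not Google's actual matcher):
```python
import re
from urllib.parse import urlsplit

def blocked(url: str, disallow: str = "/*?") -> bool:
    """Approximate a Google-style robots.txt rule: '*' matches any run of
    characters, and the rule is a prefix match on path plus query string."""
    parts = urlsplit(url)
    target = parts.path + ("?" + parts.query if parts.query else "")
    pattern = "^" + re.escape(disallow).replace(r"\*", ".*")
    return re.match(pattern, target) is not None

# The resized-image URLs from the question would be disallowed for Googlebot:
print(blocked("http://www.mysite.com/images/1234.jpg?width=100&height=200&cut=false"))  # True
# ...while the plain image URL stays crawlable:
print(blocked("http://www.mysite.com/images/1234.jpg"))  # False
```
One thing to weigh up: Disallow: /*? blocks every URL on the site that carries a query string, not just the resized images, so if other pages rely on parameters a narrower pattern such as Disallow: /images/*? may be the safer choice.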
Related Questions
-
Fetch as Google
I have an odd scenario and I don't know if anyone can help. I've done some serious speed optimisation on a website, amongst other things a CDN and caching. However, when I do a Search Console Fetch as Google it is still showing a 1.7-second download time, even though the cached content seems to be delivered in less than 200 ms. The site is using SSL, which obviously creams off a bit of speed, but I still don't understand the huge discrepancy. Could it be that Google is somehow forcing the server to deliver fresh content despite the settings to deliver from cache? Thanks in advance
Intermediate & Advanced SEO | seoman100
-
New-york-city vs. broadway as a URL parameter
We're a content publisher that writes news and reviews about the theater community, both in New York City (broadway mainly) and beyond. Presently, we display the term 'new-york-city' in news articles about Broadway / New York City theater (see http://screencast.com/t/XlifMdT9QP). Would it be better for us to replace that term with simply 'Broadway' to improve its searchability? I was doing some google trends keyword research and it looks like the search term "Broadway" in various permutations is substantially more popular than "New York City Theater."
Intermediate & Advanced SEO | TheaterMania0
-
Expired urls
For a large jobs site, what would be the best way to handle job adverts that are no longer available? Ideas that I have include: keep the URL live with the original content and display current similar job vacancies below (this has the advantage of continually growing the number of indexed pages), or 301 redirect old pages to parent categories (this has the advantage of concentrating any acquired link juice where it is most needed). Your thoughts much appreciated.
Intermediate & Advanced SEO | cottamg0
-
Received "Googlebot found an extremely high number of URLs on your site:" but most of the example URLs are noindexed.
An example URL can be found here: http://symptom.healthline.com/symptomsearch?addterm=Neck%20pain&addterm=Face&addterm=Fatigue&addterm=Shortness%20Of%20Breath A couple of questions: Why is Google reporting an issue with these URLs if they are marked as noindex? What is the best way to fix the issue? Thanks in advance.
Intermediate & Advanced SEO | nicole.healthline0
-
Showing Duplicate Content in Webmaster Tools.
About 6 weeks ago we completely redid our entire site. The developer put in 302 redirects. We were showing thousands of duplicate meta descriptions and titles. I had the redirects changed to 301. For a few weeks the duplicates slowly went down and now they are right back to where they started. Isn't the point of 301 redirects to show Google that content has permanently been moved? Why is it not picking this up? I knew it would take some time but I am right where I started after a month.
Intermediate & Advanced SEO | EcommerceSite0
-
Squarespace Errors
We have a website hosted by Squarespace. We are happy with SS, but have done some crawl diagnostics and noticed several errors. These are primarily: Duplicate Page Title, Duplicate Page Content, and Client Error (4xx). We don't really understand why these errors are taking place, and wonder if someone in the SEOmoz forum has a firm understanding of SS who is able to assist us with this? rainforestcruises.com Thanks.
Intermediate & Advanced SEO | RainforestCruises0
-
Nofollow links in Google Webmaster
I've noticed nofollow links showing up in my Google Webmaster tools "links to your site" list. If they are nofollow why are they showing up here? Do nofollow links still count as a backlink and transfer PR and authority?
Intermediate & Advanced SEO | NoCoGuru1
-
Does Google penalize for having a bunch of Error 404s?
If a site removes thousands of pages in one day, without any redirects, is there reason to think Google will penalize the site for this? I have thousands of subcategory index pages. I've figured out a way to reduce the number, but it won't be easy to put in redirects for the ones I'm deleting. They will just disappear. There's no link juice issue. These pages are only linked internally, and indexed in Google. Nobody else links to them. Does anyone think it would be better to remove the pages gradually over time instead of all at once? Thanks!
Intermediate & Advanced SEO | Interesting.com0