URL Search removal tool.
-
Hi All!
I have tried to request a URL removal from Google Search for an old testing site they still have listed.
I've tried requesting removal of the domain and of individual pages, but all requests are getting rejected.
Any help would be greatly appreciated.
Many thanks, Anthony
-
Is the URL returning a 404, or is it excluded via robots.txt? If so, you might want to check with an HTTP header tool to verify that the correct 404 status is actually being served. Maybe post the details here. If nothing works, post a message in the Google support forum and that should do it.
Removal Requirements (https://support.google.com/webmasters/bin/answer.py?hl=en&answer=59819)
To remove a page or image, you must do one of the following:
- Make sure the content is no longer live on the web. Requests for the page must return an HTTP 404 (not found) or 410 status code.
- Block the content using a robots.txt file.
- Block the content using a meta noindex tag.
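As a minimal sketch of checking the requirements above before filing a removal request (the example.com paths are hypothetical; Python standard library only):

```python
# Check the two requirements that are easy to verify programmatically:
# a 404/410 response, and a robots.txt exclusion.
from urllib import robotparser

def meets_removal_requirements(status_code: int, noindex: bool = False) -> bool:
    """True if a page satisfies at least one of Google's removal conditions:
    a 404/410 response or a meta noindex tag (robots.txt is checked separately)."""
    return status_code in (404, 410) or noindex

def blocked_by_robots(robots_txt: str, url: str, agent: str = "Googlebot") -> bool:
    """True if the given robots.txt text disallows `agent` from fetching `url`."""
    parser = robotparser.RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return not parser.can_fetch(agent, url)

# A dead test page returning 404 qualifies:
print(meets_removal_requirements(404))  # True
# A robots.txt exclusion also qualifies:
rules = "User-agent: *\nDisallow: /old-test-site/"
print(blocked_by_robots(rules, "https://example.com/old-test-site/index.html"))  # True
```

An HTTP header tool (or a HEAD request) gives you the real status code to feed into the first check; a soft 404 that returns 200 will keep getting rejected.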
Related Questions
-
Does the registration length of a URL have an SEO impact?
I asked GoDaddy whether there is an SEO difference between registering a domain for 1 year or for 10 years. They told me there is no impact; however, I've read that the domain's history does have one. So the question is: is registering for 1 year with auto-renew the right choice? Warm regards,
Industry News | | johncurlee0 -
SEO For Local Searches
I run a driving school with over 100 instructors in the UK, covering around 60 different areas. My homepage www.driveJohnsons.co.uk is optimised mainly for the 'driving lessons' and 'driving school' search terms. My area pages are optimised for the same terms with the area included, e.g. Driving Lessons Birmingham or Driving Lessons Leeds.
I've taken a drop in many areas. I've cleaned up my incoming links using the disavow tool and built up more relevant links from my own industry. The question I have is: should I change the URLs for my area pages from www.driveJohnsons.co.uk/driving-lessons-leeds to www.driveJohnsons.co.uk/leeds? I've been told that stuffing the URL with keywords for an area actually dilutes the strength of my homepage and all the other area pages.
At the moment I have 60 area pages using the pattern www.drivejohnsons.co.uk/driving-lessons-area. It used to work a treat, but I've started seeing some companies change their URLs to /area, excluding the 'driving-lessons' part. If I make this change, I'll either have to bite the bullet on building up links for those areas again or set up a redirect for each area. I've added most areas to Google Places and embedded a Google Map on many of the area pages too. If anyone knows a bit more, please let me know...
Industry News | | Anthony19820 -
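The redirect option discussed in the question above can be sketched as a one-to-one mapping from the old keyword URLs to the short area URLs. The area names and the Apache-style `Redirect` directive here are illustrative assumptions, not the poster's actual configuration:

```python
# Generate one 301 redirect rule per area page so existing links
# keep passing to the renamed /<area> URLs.
def redirect_rules(areas):
    """Emit an Apache-style 'Redirect 301 <old> <new>' directive per area."""
    return [f"Redirect 301 /driving-lessons-{area} /{area}" for area in areas]

for rule in redirect_rules(["leeds", "birmingham"]):
    print(rule)
# Redirect 301 /driving-lessons-leeds /leeds
# Redirect 301 /driving-lessons-birmingham /birmingham
```

With 301s in place for all 60 areas, existing inbound links continue to count toward the new URLs rather than needing to be rebuilt.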
How do I predict the quality of an inbound link before using the Disavow links tool?
I am working on an ecommerce website and having problems with bad inbound links. I am keen to use the Disavow links tool to clean up bad inbound links and sustain my performance on Google. We have 170,000+ inbound links from 1,000+ unique root domains, but I have found that most of those root domains have low-quality content and structure. Honestly, I don't want to keep inbound links from websites that are not actively maintained or publishing. So how do I predict the quality of inbound links or root domains before disavowing them? Are there any specific criteria for grading inbound link quality?
Industry News | | CommercePundit0 -
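However the grading is done, the end product is a disavow file. A minimal sketch of triaging root domains into one, assuming you have already assigned each domain a quality score somehow (the scores and the 0.3 threshold are illustrative assumptions; only the `domain:` line syntax matches Google's actual disavow file format):

```python
# Build disavow-file text from a {root_domain: quality_score} audit,
# keeping only domains scored below the cutoff.
def build_disavow(domain_scores, threshold=0.3):
    """Return disavow-file text listing domains scored below `threshold`."""
    lines = ["# Root domains flagged as low quality in the link audit"]
    for domain in sorted(d for d, s in domain_scores.items() if s < threshold):
        lines.append(f"domain:{domain}")
    return "\n".join(lines)

audit = {"spammy-directory.example": 0.1, "reputable-news.example": 0.9}
print(build_disavow(audit))
```

Disavowing at the `domain:` level, as here, covers every link from that root domain, which matches the question's goal of judging whole root domains rather than individual URLs.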
Did Google Search Just Get Crazy Local?
Hey all, I think it's a known fact at this point that when you're signed into a personal Google account while searching, the results are heavily oriented around keywords and phrases you have already searched for, as well as your account's perceived location. For instance, when I wanted to check one of my own web properties in the search listings, I would sign out, or it would likely appear first and give a false reading.
Today I noticed something very interesting: even when not signed in, Google's listings were giving precedence to locality, and to a very extreme degree. When searching for "web design," a firm a mile away ranked higher than one 1.5 miles away, and so on. It would seem that giving the algorithms this high a level of location sensitivity would actually be a boon for the little guys, which I assume is why it was implemented. However, it brings up a couple of interesting questions for me.
1. How is this going to affect Moz reports (or those of any rank-tracking platform, for that matter)? I assume that Google infers location from IP addresses, so wouldn't it simply pull the local results most relevant to the Moz servers' IPs?
2. What can one do to rise above this aggressive level of location-based search? I mean, my site (which has a DA of 37 and a PA of 48) appears above sites like webdesign.org (DA of 82, PA of 85). Not that I'm complaining at the moment, but I could see this being a fairly big deal for larger firms looking to rank on a national level. What gives?
I'd love to get some opinions from the community here if anyone else has noticed this...
Industry News | | G2W1 -
Not schema, but a new kind of search result?
I came across this search result in Google and I've been racking my brain trying to figure out how they did it. Do a search for Novus CD4 and you'll see a result that lists additional products from the landing page. I used Google's Rich Snippet tool to analyse the page and found no microdata at play. Any ideas how this was achieved? Have you come across anything like this? I was also thinking of integrating this with schema to display rating stars and prices on an ecommerce site. S4aL5.png
Industry News | | Bio-RadAbs0 -
Duplicate Content Tool - sketchy or good?
BloomReach announced a new software product named Dynamic Duplication Reduction (DDR) that aims to eliminate duplicate content issues on websites. Does anyone have any feedback on this yet? It seems too good to be true if you ask me. Your pal, Chenzo
Industry News | | Chenzo0 -
How to search AdWords properly
If I search for the word "gold," Google tells me it is searched 20 million times per month. I realize that figure includes people searching for "gold" in combination with other terms. I would like to know how many people search for the word "gold" just by itself. How do I do that?
Industry News | | StreetwiseReports0 -
Keyword Ranking Tool
Hi guys, can anyone recommend software (online or offline, paid or free) that can report the keywords our websites rank for, and the rankings themselves?
NOTE: We already use software that checks our rankings for a list of keywords we submit. We are interested in software that can pull our keywords and rankings WITHOUT our having to submit a keyword list. We want to know where and for what we rank (especially long tails).
Also, we know there is a workaround (exporting the keywords we get traffic from from Analytics and using Market Samurai to get the rankings), but we would prefer an independent tool that does all this automatically for every page on our domain. I remember I accidentally came across such software a couple of months ago but lost track of it :). Please let me know if you know of such software. I hope I am not chasing after shadows 🙂
Industry News | | tolik1