How to fix these unwanted URLs?
-
Right now I have a WordPress one-page website, but Google is also showing wp-content URLs in its results. Kindly check the query below in Google.
site:http://baltimoreelite.com/
How can I fix this issue?
-
Great job, Mark! I can see from this end that nearly all of those unwanted URLs have already dropped out of the results. That's far quicker than even I expected! And the ones that aren't gone are leading to a 403 Forbidden page, which is great.
One last thing you can do if you want. Because you are on HostGator, they are displaying their custom 403 error page, which has their branding all over it (nasty, kinda ugly). You could create your own simple 403 error page, add your own basic branding to it, and, for instance, add a line that says something like "You don't have permission to view this page, or it is blocked for security reasons. Drop by the home page [link to home page] to find what you're looking for, or to conduct a search."
This basic page can be used to replace the one that HostGator provides by default, so any visitors who hit it by accident will still feel like they are on your site and will have a suggestion for what to do next. Your hosting control panel will have instructions for how & where to provide your own custom error pages.
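If you'd rather set it up by hand, here's a rough sketch of what I mean - the file name, wording and branding below are just placeholders, so adjust them to suit. Create a plain HTML file, say 403.html, in your site root:

<!DOCTYPE html>
<html>
<head>
  <title>Page not available - Baltimore Elite</title>
</head>
<body>
  <!-- Minimal custom 403 page - swap in your own wording, styling and branding -->
  <h1>Sorry - you don't have permission to view this page.</h1>
  <p>It may be blocked for security reasons. Drop by the
     <a href="http://baltimoreelite.com/">home page</a> to find what you're looking for.</p>
</body>
</html>

Then, assuming your host honours ErrorDocument overrides in .htaccess (most Apache/cPanel setups like HostGator do), point the server at it with one extra line in the same .htaccess file:

ErrorDocument 403 /403.html

That way the page is still served with the proper 403 status, but with your own look and a link back home.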
Hope that last little tweak's useful.
Paul
-
Thank You Paul. All is well now.
-
If I were you, Mark, I'd add it right at the top of your htaccess file. I'd also add in a descriptive comment to make the reason for the directive clear. So:
# BEGIN Remove ability to read directory indexes
Options -Indexes
# END Remove ability to read directory indexes
These lines would be inserted right at the top of the htaccess file. I would also warn, though, that I've had situations where caching plugins have overwritten such directives when they update the htaccess themselves. If that happens, you may need to try inserting it after #END WPSuperCache and before # BEGIN WordPress.
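To make the placement clearer, the relevant part of the file might end up looking something like this - the WP Super Cache and WordPress marker comments below are the ones those plugins normally write for themselves, so treat the exact wording as an assumption and leave whatever is already inside those blocks untouched:

# BEGIN WPSuperCache
# ... rules maintained by the caching plugin - leave as-is ...
# END WPSuperCache

# BEGIN Remove ability to read directory indexes
Options -Indexes
# END Remove ability to read directory indexes

# BEGIN WordPress
# ... standard WordPress rewrite rules - leave as-is ...
# END WordPress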
Hope that works for you?
Paul
-
Andy and Nishada - don't forget... Adding robots.txt disallows will do nothing to get already indexed URLs out of the search index after the fact.
Paul
-
You have a much bigger problem than what can be solved just with a robots.txt file, Mark.
All of those URLs are showing up because a misconfiguration of your theme installation (likely caused by the theme developer) is allowing full display of the contents of each of those directories. In addition to polluting your search results, as you've noticed, it's also a pretty major security risk. You can see this in action by going to http://baltimoreelite.com/wp-content/themes/sintia/wpv_theme/assets/css/ - what should happen when you go to that URL is that you see a blank page or receive a 403 Forbidden warning. Instead, you're seeing a full listing of the directory contents - bad news.
Since I don't know your hosting configuration, the easiest way to fix this is to add a directive to the .htaccess file at the root of your site, which should correct all such instances. You need to add this line:
<code>Options -Indexes</code>
If you're not familiar, the .htaccess file is a text file which you can edit with any text editor. You'll need to use an FTP program or the file manager in your hosting control panel to access it at the root of your site. (You may also need to enable "show hidden files" in your program.) I always recommend backing up the existing file before editing, just in case. You'll add this new directive on its own line, with a blank line between it and any other lines in the file. It can go at or near the top of your htaccess file.
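To make that concrete, after the edit the top of a typical WordPress .htaccess might look roughly like this - the # BEGIN/END WordPress block below is the standard one WordPress writes for itself, so leave whatever is already in yours exactly as it is:

# Stop Apache from listing directory contents
Options -Indexes

# BEGIN WordPress
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteBase /
RewriteRule ^index\.php$ - [L]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule . /index.php [L]
</IfModule>
# END WordPress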
Now getting those URLs out of the search index is going to take a bit more work. You'll want to implement the robots.txt exclusions like Andy suggested, and then you'll need to go into Google Webmaster Tools and use the Remove URLs tool to specifically request removal of the directories you blocked with the robots.txt and that you also want removed from the search results. (The robots.txt is a critical part of this process, as Google requires it be in place in order to process the removal requests.)
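For the directories that are polluting your results, the robots.txt rules would look something like this - the same idea as the file Andy has posted below, so adjust the paths to whichever directories you actually want kept out:

User-agent: *
Disallow: /wp-content/
Disallow: /wp-includes/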
This, combined with the htaccess edit mentioned above, should keep those URLs from showing up again in the future.
Hope that all makes sense. If not, be sure to ask!
Paul
-
Hi Mark,
I block the same directories on my site (which is also a single-page site). Here is the content of my robots.txt file:
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /wp-content/
Sitemap: http://www.inetseo.co.uk/sitemap.xml.gz
-Andy
-
Hi Mark,
You need to tell crawlers not to index that content by modifying the robots.txt file. Below is a good link with some examples and instructions:
http://stackoverflow.com/questions/17029811/how-to-set-up-robots-txt-file-for-wordpress