How to fix these unwanted URLs?
-
Right now I have a WordPress one-page website, but Google is also showing wp-content URLs in the results. Kindly check the query below in Google.
site:http://baltimoreelite.com/
How can I fix this issue?
-
Great job, Mark! I can see from this end that nearly all of those unwanted URLs have already dropped out of the results. That's far quicker than even I expected! And the ones that aren't gone are leading to a 403 Forbidden page, which is great.
One last thing you can do if you want. Because you are on HostGator, they are displaying their custom 403 error page, which has their branding all over it (nasty, kinda ugly). You could create your own simple 403 error page, add your own basic branding to it, and for instance add a line that says something like "You don't have permission to view this page or it is blocked for security reasons. Drop by the home page [link to home page] to find what you're looking for, or to conduct a search."
This basic page can be used to replace the one that HostGator provides by default so any visitors that hit it by accident will still feel like they are on your site, and will have a suggestion for what to do next. Your hosting control panel will have instructions for how & where to provide your own custom error pages.
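If it helps, here's a minimal sketch of what that custom page and the Apache directive to serve it might look like. The filename 403.html and its placement at the site root are just assumptions for the example; your control panel may expect a different name or location.

<!DOCTYPE html>
<html>
<head><title>Access Denied</title></head>
<body>
<h1>Access Denied</h1>
<p>You don't have permission to view this page, or it is blocked for security reasons.</p>
<p><a href="/">Drop by the home page</a> to find what you're looking for, or to conduct a search.</p>
</body>
</html>

And in your .htaccess file:

# Serve our own 403 page instead of the host's default
# (the /403.html path is an assumption - adjust to wherever you save the file)
ErrorDocument 403 /403.html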
Hope that last little tweak's useful.
Paul
-
Thank you, Paul. All is well now.
-
If I were you, Mark, I'd add it right at the top of your htaccess file. I'd also add in a descriptive comment to make the reason for the directive clear. So:
# BEGIN Remove ability to read directory indexes
Options -Indexes
# END Remove ability to read directory indexes
These lines would be inserted right at the top of the htaccess file. I would also warn, though, that I've had situations where caching plugins have overwritten such directives when they update the htaccess themselves. If that happens, you may need to try inserting it after # END WPSuperCache and before # BEGIN WordPress.
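To make the placement concrete, the relevant stretch of the file might end up looking roughly like this (the plugin markers are illustrative; yours will depend on which plugins actually write to the file):

# END WPSuperCache

# BEGIN Remove ability to read directory indexes
Options -Indexes
# END Remove ability to read directory indexes

# BEGIN WordPress
# (your existing WordPress rewrite rules here)
# END WordPress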
Hope that works for you?
Paul
-
Andy and Nishada - don't forget... Adding robots.txt disallows will do nothing to get already indexed URLs out of the search index after the fact.
Paul
-
You have a much bigger problem than what can be solved just with a robots.txt file, Mark.
All of those URLs are showing up because a misconfiguration of your theme installation (likely caused by the theme developer) is allowing full display of the contents of each of those directories. In addition to polluting your search results, as you've noticed, it's also a pretty major security risk. You can see this in action by going to http://baltimoreelite.com/wp-content/themes/sintia/wpv_theme/assets/css/ - what should happen when you go to that URL is that you see a blank page or receive a 403 Forbidden warning. Instead, you're seeing a full listing of the directory contents - bad news.
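If you prefer to check from the command line, a quick test (assuming you have curl installed) looks like this:

# Ask for just the response headers of the bare directory URL
curl -I http://baltimoreelite.com/wp-content/themes/sintia/wpv_theme/assets/css/
# "HTTP/1.1 200 OK" on a directory with no index file usually means the
# server is generating a listing; once fixed, you should see "403 Forbidden".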
Since I don't know your hosting configuration, the easiest way to fix this issue is to add a line to your .htaccess file at the root of your site. This should correct all such instances. You need to add this line:
Options -Indexes
If you're not familiar, the .htaccess file is a text file which you can edit with any text editor. You'll need to use an FTP program or the file manager in your hosting control panel to access it at the root of your site. (You may also need to enable "show hidden files" in your program). I always recommend backing up the existing file before editing just in case. You'll add this new line on its own line with a blank line between it and any other lines in the file. It can go near or at the top of your htaccess file.
Now getting those URLs out of the search index is going to take a bit more work. You'll want to implement the robots.txt exclusions like Andy suggested, and then you'll need to go into Google Webmaster Tools and use the Remove URLs tool to specifically request removal of the directories you blocked with the robots.txt and that you also want removed from the search results. (The robots.txt is a critical part of this process, as Google requires it be in place in order to process the removal requests.)
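For reference, the robots.txt rules for that removal step might look something like the sketch below; treat the exact directory list as an example rather than a prescription, since it depends on which directories are showing up in your results:

# Example only - block crawling of the directories you want removed
User-agent: *
Disallow: /wp-content/
Disallow: /wp-includes/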
This, combined with the htaccess edit mentioned above, should keep those URLs from showing up again in the future.
Hope that all makes sense. If not, be sure to ask!
Paul
-
Hi Mark,
I block the same on my site (which is also a single page). Here is the content of my robots.txt file:
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /wp-content/
Sitemap: http://www.inetseo.co.uk/sitemap.xml.gz
-Andy
-
Hi Mark,
You need to tell crawlers not to index that content by modifying the robots.txt file. Below is a good link with some examples and instructions:
http://stackoverflow.com/questions/17029811/how-to-set-up-robots-txt-file-for-wordpress