WP File Permissions
-
After suffering a malware episode, I wonder if there is an optimum setting for file permissions on a typical WordPress site?
Colin
-
Thanks very much, George.
I had spotted that but wondered if anyone had any other permutation that had worked for them.
I think I'll follow the settings suggested on that page.
Cheers,
Colin
-
Hi Colin,
Take a look at this page on the WordPress.org site. The link should take you right to the file permissions section.
Hope this helps!
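For reference, a hedged sketch of the baseline that page describes, assuming a typical install on a Unix-like host where your own user owns the WordPress files (shell commands; adjust for your host's setup):

# From the WordPress root: directories 755, files 644
find . -type d -exec chmod 755 {} \;
find . -type f -exec chmod 644 {} \;
# wp-config.php holds database credentials, so lock it down further
# (600 if PHP runs as your user; some hosts prefer 440)
chmod 600 wp-config.php

No single permutation fits every server: on shared hosting that runs PHP as your own user (suEXEC/FPM), these settings work as-is, while mod_php setups sometimes need looser group permissions instead.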
Related Questions
-
If my website does not have a robots.txt file, does it hurt my website's rankings?
After a site audit, I found out that my website doesn't have a robots.txt file. Does that hurt my website's rankings? One more thing: when I type mywebsite.com/robots.txt, it automatically redirects to the homepage. Please help!
Intermediate & Advanced SEO | binhlai
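A missing robots.txt does not hurt rankings by itself; crawlers treat a 404 at /robots.txt as permission to crawl everything. The redirect to the homepage is the bigger problem, since Google then never sees a valid file. A minimal permissive robots.txt served at mywebsite.com/robots.txt might look like this (the sitemap URL is a placeholder):

User-agent: *
Disallow:

Sitemap: https://www.mywebsite.com/sitemap.xml
-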
Recovering an old disavow file?
Hi guys, we had an SEO agency submit a disavow request on one of our sites a while back. They have no trace of the disavow .txt file or the links they disavowed. Does anyone know if there is a way to recover this file in Google Webmaster Tools, or any way to find which links were disavowed? Cheers.
Intermediate & Advanced SEO | jayoliverwright
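If the agency used Google's disavow links tool, the file currently in effect can usually be downloaded again from the disavow tool page for the same verified property. For anyone rebuilding one from scratch, the format is plain text with one entry per line, e.g. (the domains and URLs below are made-up examples):

# Comment lines start with #
domain:spammy-directory.example
http://bad-neighborhood.example/paid-links.html
-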
Handling Multiple Domain 301 Redirects in a Single .htaccess File
Hello, I have a client that currently has 9 different markets with different sub-domains on one server (i.e., one .htaccess file). All the sites have very similar navigation and some of them contain the same products, i.e., the same URLs. The site uses the Magento CMS, and I'm trying to figure out how to redirect some products that have been removed from one of the stores. The problem I'm running into is that when I try to redirect one store's URL, it redirects all the sites' URLs. Example: http://store.domain1.com/ http://store.domain2.com/ I'd like to redirect http://store.domain1.com/old-url.html to http://store.domain1.com/new-url.html without making http://store.domain2.com/old-url.html redirect. I've literally been pulling out my hair trying to figure this one out but have had no luck. Does anybody have any ideas on how I could do this without making the other sites redirect or creating any loops? Any wisdom from you Apache experts would be greatly appreciated. Thanks, Erik
Intermediate & Advanced SEO | Erik-M
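One way to scope a redirect to a single storefront, sketched under the assumption of Apache mod_rewrite and the URLs from the question: make the rule conditional on the requesting hostname with RewriteCond, so it never fires for the other domains.

RewriteEngine On

# Redirect old-url.html to new-url.html, but only when the request
# arrives on store.domain1.com; store.domain2.com is left untouched.
RewriteCond %{HTTP_HOST} ^store\.domain1\.com$ [NC]
RewriteRule ^old-url\.html$ http://store.domain1.com/new-url.html [R=301,L]

Since a RewriteCond applies only to the RewriteRule that immediately follows it, each host-specific redirect needs its own condition line, which also avoids loops across the shared file.
-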
Meta robots or robots.txt file?
Hi Mozzers! For parametric URLs, would you recommend a meta robots tag or the robots.txt file? For example: http://www.exmaple.com//category/product/cat no./quickView (I want to stop indexing /quickView URLs). And what's the real difference between the two? Thanks again! Kay
Intermediate & Advanced SEO | eLab_London
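The practical difference: a robots.txt Disallow blocks crawling, but a blocked URL can still appear in the index (without a snippet) if other pages link to it, whereas a meta robots noindex removes the page from the index but requires the page to stay crawlable so the tag can be seen. A sketch of both for the /quickView pattern, assuming Google/Bing-style wildcard support:

In robots.txt:
User-agent: *
Disallow: /*/quickView

Or in the head of each quick-view page (leave it crawlable):
<meta name="robots" content="noindex, follow">
-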
Disavow files on m.site
Hi, I have a site www.example.com and have finally got the developers to add Google Webmaster verification codes for: example.com and m.example.com. I was advised this is best practice; however, I was wondering: does this mean I now need to add the disavow file for these as well? Thanks, Andy
Intermediate & Advanced SEO | Andy-Halliday
-
Duplicate Content From Indexing of Non-File-Extension Page
Google somehow has indexed a page of mine without the .html extension, so they indexed www.samplepage.com/page, and I am showing duplicate content because Google also sees www.samplepage.com/page.html. How can I force Google, or Bing, or whoever, to only index and see the page including the .html extension? I know people are saying not to use the file extension on pages, but I want to, so please, anybody... HELP!!!
Intermediate & Advanced SEO | WebbyNabler
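The extensionless URL usually resolves because of Apache content negotiation (MultiViews), which quietly serves page.html for /page. A hedged .htaccess sketch, assuming Apache with mod_rewrite available: turn MultiViews off and 301-redirect any extensionless request whose .html counterpart exists.

Options -MultiViews
RewriteEngine On

# If the path has no extension but a matching .html file exists,
# redirect to the .html URL so only one version gets indexed.
RewriteCond %{REQUEST_FILENAME} !-d
RewriteCond %{REQUEST_FILENAME}.html -f
RewriteRule ^([^.]+)$ /$1.html [R=301,L]

A rel=canonical tag pointing at the .html version is a softer alternative if the redirect is not an option.
-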
Using 2 wildcards in the robots.txt file
I have a URL string which I don't want to be indexed; it includes the characters _Q1 in the middle of the string. So in robots.txt, can I use two wildcards in the string to take out all of the URLs with that in them? Something like /*_Q1*. Will that pick up and block every URL with those characters in the string? Also, this is not directly off the root, but in a secondary directory, so .com/.../_Q1. Do I have to format the robots.txt rule as /*/*_Q1* since it will be in the second folder, or will just using /*_Q1* pick up everything no matter what folder it is in? Thanks.
Intermediate & Advanced SEO | seo123456
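Robots.txt patterns are matched from the start of the URL path, so one leading wildcard covers any directory depth, and no trailing wildcard is needed because matching does not stop at the end of the pattern. A sketch, assuming Google/Bing-style wildcard support:

User-agent: *
# Blocks any URL whose path contains _Q1, whatever folder it sits in
Disallow: /*_Q1
-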
Negative impact on crawling after uploading robots.txt file on HTTPS pages
I experienced a negative impact on crawling after uploading a robots.txt file for our HTTPS pages. You can find both URLs as follows.
Robots.txt file for HTTP: http://www.vistastores.com/robots.txt
Robots.txt file for HTTPS: https://www.vistastores.com/robots.txt
I have disallowed all crawlers for the HTTPS pages with the following syntax:
User-agent: *
Disallow: /
Does that matter here? If I have done anything wrong, please give me more ideas on how to fix this issue.
Intermediate & Advanced SEO | CommercePundit
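For what it is worth, robots.txt is fetched and applied per protocol and host, so the HTTPS file above blocks every URL on https://www.vistastores.com while leaving the HTTP site's crawling rules alone. A drop in crawling is the expected result if important pages resolve over HTTPS. If the intent is to let crawlers reach the HTTPS pages, that file would need to be permissive instead:

User-agent: *
Disallow: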