WP File Permissions
-
After suffering a malware episode, I wonder: is there an optimum setting for the file permissions on a typical WordPress site?
Colin
-
Thanks very much George.
I had spotted that but wondered if anyone had any other permutation that had worked for them.
I think I'll follow the settings suggested on that page.
Cheers,
Colin
-
Hi Colin,
Take a look at this page on the WordPress.org site. The link should take you right to the file permissions section.
Hope this helps!
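For what it's worth, the baseline that page recommends is 755 for directories and 644 for files, with something stricter for wp-config.php since it holds your database credentials. Below is a minimal sketch of applying that baseline, assuming a Unix host; the install path is a placeholder, and the right mode bits depend on which user your host runs PHP as.

# Minimal sketch: apply the commonly recommended 755/644 baseline to a
# WordPress install. The path is a placeholder for your own site root.
import os

WP_ROOT = "/var/www/html"  # hypothetical install location

for dirpath, dirnames, filenames in os.walk(WP_ROOT):
    os.chmod(dirpath, 0o755)  # directories: rwxr-xr-x
    for name in filenames:
        os.chmod(os.path.join(dirpath, name), 0o644)  # files: rw-r--r--

# wp-config.php contains credentials, so lock it down further: 0600
# when PHP runs as the file owner, 0640 if it runs as the file's group.
os.chmod(os.path.join(WP_ROOT, "wp-config.php"), 0o600)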
Related Questions
-
How important is the file extension in the URL for images?
I know that descriptive image file names are important for SEO. But how important is it to include .png, .jpg, .gif (or whatever file extension) in the URL path? i.e. https://example.com/images/golden-retriever vs. https://example.com/images/golden-retriever.jpg Furthermore, since you can set the filename in the Content-Disposition response header, is there any need to include the descriptive filename in the URL path? Since I'm pulling most of our images from a database, it'd be much simpler to skip simulating a filename and just reference an image id in my templates. Example:
1. Browser requests GET /images/123456
2. Server responds with the image, setting both the Content-Disposition and Link (canonical) headers:
Content-Disposition: inline; filename="golden-retriever"
Link: <https://example.com/images/123456>; rel="canonical"
Intermediate & Advanced SEO | dsbud
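That header pair is straightforward to produce. Here's a hedged sketch as a Flask route, with the route, file path, and filename all illustrative rather than taken from the question's actual stack:

# Illustrative sketch: serve an image by database id and advertise a
# descriptive filename and a canonical URL via response headers.
from flask import Flask, send_file

app = Flask(__name__)

@app.route("/images/<int:image_id>")
def image(image_id):
    # Hypothetical lookup; in practice the path would come from the database.
    path = f"/data/images/{image_id}.jpg"
    resp = send_file(path, mimetype="image/jpeg")
    resp.headers["Content-Disposition"] = 'inline; filename="golden-retriever.jpg"'
    resp.headers["Link"] = f'<https://example.com/images/{image_id}>; rel="canonical"'
    return resp
-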
SEO Friendly Files Redirected From Images
I have product images (.jpg files) that, when clicked, redirect to .pdf files containing all of each product's specs, patterns, colors, etc. These are 302 redirects that open in a different window. Is there a way to keep these redirects and maintain SEO optimization? Any advice is appreciated.
Intermediate & Advanced SEO | SuperiorPavers
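To make the setup concrete, here is the redirect in question sketched as a Flask route; the paths are placeholders, and the same logic applies if the redirect lives in .htaccess instead. A 302 marks the move as temporary; switching to a 301 would tell search engines the PDF is the permanent target.

# Illustrative only: an image URL that 302-redirects to the product's
# PDF spec sheet, mirroring the setup described above.
from flask import Flask, redirect

app = Flask(__name__)

@app.route("/products/tile-pattern.jpg")
def product_image():
    # code=302 keeps the redirect temporary; use code=301 if the PDF
    # should permanently replace the image URL in search results.
    return redirect("/specs/tile-pattern.pdf", code=302)
-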
Hacked website - Dealing with 301 redirects and a large .htaccess file
One of my client's websites was recently hacked and I've been dealing with the after-effects. The website is now clean of malware and I have already appealed to Google about the malware issue. The current problem is dealing with the 20,000+ crawl errors from garbage links that the hack created. How does one go about creating all the 301 redirects needed for those 404 crawl errors? I'm already noticing an increased load time on the website due to a rather large .htaccess file with a couple thousand 301 redirects done so far, which I fear will hurt both the site's performance and its SEO.
Intermediate & Advanced SEO | FPK
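Since Apache re-reads .htaccess rules on every request, thousands of one-off RewriteRules get expensive. One alternative, sketched below with placeholder URLs, is to keep the old-to-new pairs in a single lookup table and answer 410 Gone for the hack-generated garbage URLs, which signals crawlers to drop them rather than retry:

# Sketch of a single-lookup alternative to thousands of RewriteRules.
# The mapping is hypothetical, e.g. loaded from a crawl-error export.
from flask import Flask, redirect, request

app = Flask(__name__)

REDIRECTS = {
    "/old-page.html": "/new-page/",
    "/old-category/widget.html": "/shop/widget/",
}

@app.errorhandler(404)
def handle_missing(error):
    target = REDIRECTS.get(request.path)
    if target:
        return redirect(target, code=301)
    # Garbage URLs created by the hack have no real destination; 410
    # marks them as intentionally gone instead of temporarily missing.
    return "Gone", 410
-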
Handling Multiple Domain 301 Redirects on Single htaccess file
Hello, I have a client that currently has 9 different markets with different sub-domains on one server (i.e., one .htaccess file). All the sites have very similar navigation and some of them contain the same products, i.e., the same URL paths. The site uses the Magento CMS and I'm trying to figure out how to redirect some products that have been removed from one of the stores. The problem I'm running into is that when I try to redirect one store's URL, it redirects that URL on all the sites. Example: http://store.domain1.com/ http://store.domain2.com/ I'd like to redirect http://store.domain1.com/old-url.html to http://store.domain1.com/new-url.html without making http://store.domain2.com/old-url.html redirect. I've literally been pulling out my hair trying to figure this one out but have had no luck. Does anybody have any ideas on how I could do this without having the other sites redirect or creating any loops? Any wisdom from you Apache experts would be greatly appreciated. Thanks, Erik
Intermediate & Advanced SEO | Erik-M
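In mod_rewrite terms the usual fix is to gate the rule with a RewriteCond on %{HTTP_HOST} so it only fires for store.domain1.com. The same host-conditional logic, sketched in Python with placeholder URLs:

# The host check that keeps a redirect scoped to one storefront when
# several sub-domains share a server. URLs here are placeholders.
from flask import Flask, redirect, request

app = Flask(__name__)

@app.route("/old-url.html")
def old_url():
    # Redirect only requests that arrived on domain1's storefront;
    # store.domain2.com keeps serving its own page untouched.
    if request.host == "store.domain1.com":
        return redirect("http://store.domain1.com/new-url.html", code=301)
    return "Product page for the other storefronts", 200
-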
Moving from a static HTML CSS site with .html files to a Wordpress Site while keeping link structure
Mozzers, Hope this finds you well. I need some advice. We have a site built on a Dreamweaver template, and it is lacking in responsiveness and ease of updates, and a lot of the coding is behind current web standards (which I know will start to hurt our rank, if not the user experience). For SEO purposes, we would like to move the existing static site to WordPress so we can update it easily and keep content fresh. Our current site, thriveboston.com, has a lot of page URLs ending in .html. For the transition, it is extremely important for us to keep the link structure; we rank well in the SERPs for Boston Counseling, etc. I found and tested a plugin (offline) that can add a .html extension to WordPress pages, which allows us to keep our current structure, but has anyone had any luck with this live? Has anyone had any luck moving from a static site to a WordPress site while keeping the current link structure, without hurting any rank? We hope to move soon because if the site continues to grow, it will become even harder to migrate. Also, does anyone have any hesitations? Is this a bad move? Should we just stay on the current DWT template (the HTML and CSS) and not migrate? Any suggestions and advice will be heeded. Thanks Mozzers!
Intermediate & Advanced SEO | _Thriveworks
-
Files blocked in robots.txt and SEO
I use Joomla and I have blocked the following in my robots.txt. Is there anything here that is bad for SEO?
User-agent: *
Disallow: /administrator/
Disallow: /cache/
Disallow: /components/
Disallow: /images/
Disallow: /includes/
Disallow: /installation/
Disallow: /language/
Disallow: /libraries/
Disallow: /media/
Disallow: /modules/
Disallow: /plugins/
Disallow: /templates/
Disallow: /tmp/
Disallow: /xmlrpc/
Disallow: /mailto:myemail@myemail.com/
Disallow: /javascript:void(0)
Disallow: /.pdf
Intermediate & Advanced SEO | seoanalytics
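Two of those lines are worth a second look: Disallow: /images/ keeps your images out of image search, and Disallow: /.pdf only blocks paths that literally start with /.pdf, since plain robots.txt rules are prefix matches (wildcards like /*.pdf$ are a search-engine extension). A quick way to verify what a rule set actually blocks is Python's standard-library parser; the rules and URLs below are illustrative:

# Sanity-check robots.txt rules with the standard-library parser.
from urllib.robotparser import RobotFileParser

rules = [
    "User-agent: *",
    "Disallow: /images/",
    "Disallow: /.pdf",
]

parser = RobotFileParser()
parser.parse(rules)

for url in (
    "http://example.com/images/product.jpg",  # blocked by /images/
    "http://example.com/downloads/spec.pdf",  # NOT blocked by /.pdf
):
    verdict = "allowed" if parser.can_fetch("*", url) else "blocked"
    print(url, "->", verdict)
-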
Could you use a robots.txt file to disallow a duplicate content page from being crawled?
A website has duplicate content pages to make it easier for users to find the information from a couple spots in the site navigation. Site owner would like to keep it this way without hurting SEO. I've thought of using the robots.txt file to disallow search engines from crawling one of the pages. Would you think this is a workable/acceptable solution?
Intermediate & Advanced SEO | gregelwell
-
Page load increases with Video File - SEO Effects
We're trying to use a Flash video as a product image, so the size increase will be significant: somewhere around 1.5-2 MB on a page that is about 400 KB before the video. There is an SEO concern with page speed, and we're thinking that putting the Flash video inside an iframe might overcome the speed issues. We're trying to provide a better experience with the video, but the increase in page size, and therefore load time, will be significant. The rest of the page will load, including a fallback static image, so we're really trying to understand how to mitigate the page-load-speed impact of the video. Any thoughts?
Intermediate & Advanced SEO | SEO-Team