What problems could arise from updating the PHP version?
-
I haven't really gotten a straight answer to this question yet. My client says:
"The developers are skeptical about the possibility to update PHP on our server as this could seriously damage the entire RV site functionality."
Since I know nothing about PHP or its potential hazards, I have to ask the community whether there is any validity to these concerns.
We can't update our version of WordPress unless the server's PHP is first upgraded from 5.1.6 to 5.2.4.
The client won't do this because the developers say it's a potential nightmare.
As the SEO, I want a current, updated version of WordPress for many obvious reasons.
Can anyone please tell me what problems, if any, could arise from upgrading the site's PHP? Or is it just a lot of work, and the developers are making excuses because they don't want to do it?
Thanks very much to whoever answers.
-
Alan,
Alan,
I cannot think of a recent answer with this clarity of thought and this ability to refute a very bad practice (especially on a security level). One thing we do with clients and upgrades (we do not handle clients on other people's platforms) is take a new update, give it a few weeks for bugs to be discovered, and then perform the upgrade on our end. We have clients sign off on us handling the upgrades from the beginning of the relationship.
For Erik, I would suggest showing the client what has been said here by someone with a lot of savvy experience. If the devs are worth their salt, they will change.
This was a good question and Alan delivered a great answer.
Robert
-
I've got very limited bandwidth for training (90% of my work comes from audits), and it's typically limited to in-person, on-site sessions for clients in the LA area, because I find the in-person experience to be much more effective. Pricing depends on level and extent, starting at $250 an hour, so it's ideal for groups (one fee regardless of participant count). Audits range from $3,500 upwards of $7,500 or more, depending on scale.
-
What do you charge for individual personalized training? And site audits?
-
You can thank my combined 11 years of SEO experience following 7 years of web dev project management, with a background in information security and business ownership.
-
I want to have your baby.
Brilliant answer!
I just copy/pasted the entire thing to my client, and even got your pic and bio in there for added cred.
I have been asking this question in one way or another since early February, and you just nailed it.
Thank you very much.
-
Any time you upgrade a server solution, the potential exists for things that are currently working to suddenly break. That is just the nature of technology. In an ideal world, this wouldn't happen; unfortunately, it's quite possible, for many reasons.
Just one reason: developers cannot possibly test every unique server configuration on earth when working on an upgrade. They have time, resource, and fiscal constraints.
In one example where an upgrade from PHP 5.1.6 to 5.2.4 caused a WordPress site to collapse, the problem was with neither PHP nor WP. It was with a separate, firewall-related server solution that then had to be dealt with.
That example validates the concern expressed by the developers you're dealing with.
HOWEVER
Regardless of potential problems of this nature, it is irresponsible and deplorable for developers to refuse to upgrade servers out of fear that something might break. Could you imagine 90% of the world still operating on IBM mainframe computers because of a fear of upgrading? The security implications alone are appalling, let alone the business-case reasons.
Developers and systems administrators "should" be required to implement upgrades on a regular, consistent basis, with the understanding that it is their responsibility to deal with any problems that arise, and during upgrades "should" also use intelligent best-practice precautions and methods to ensure the least likely chance of a critical failure. THAT is the only proper path for a business to remain successful long-term.
Hiding under the guise of "it's too dangerous" is a pitiful excuse for laziness.
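As a concrete illustration of the kind of pre-flight precaution described above: before touching WordPress, verify that the installed PHP meets the new release's minimum version. The sketch below is generic (it is not WordPress's own check), the version strings are simply the ones discussed in this thread, and the installed version is hard-coded where a real script would read it from `php -v`:

```shell
#!/bin/sh
# Sketch of a pre-flight version gate before a WordPress upgrade.
# Hypothetical values: a real script would detect $installed from `php -v`.
installed="5.1.6"
required="5.2.4"

# Version sort (-V) puts the lower version string first, so the first
# line of the sorted pair tells us which side is older.
lowest=$(printf '%s\n%s\n' "$installed" "$required" | sort -V | head -n 1)

if [ "$lowest" = "$installed" ] && [ "$installed" != "$required" ]; then
  echo "Upgrade PHP first: $installed < $required"
else
  echo "PHP $installed meets the $required minimum"
fi
```

Gating the WordPress upgrade on a check like this, plus a full file and database backup and a dry run on a staging copy, is what turns "something might break" from a reason to refuse into a manageable step.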