What problems could arise from updating the PHP version?
-
I haven't really gotten a straight answer yet for this question. My client says:
"The developers are skeptical about the possibility to update PHP on our server as this could seriously damage the entire RV site functionality."
Since I know nothing about PHP or its potential hazards, I have to ask the community whether there is any validity to these concerns.
We can't update our version of WP unless PHP is first upgraded from 5.1.6 to 5.2.4.
The client won't do this because the developers say it's a potential nightmare.
As the SEO, I want a current, updated version of WP for many obvious reasons.
Can anyone please tell me what problems, if any, could arise from upgrading the site's PHP? Or is it just a lot of work, and the developers are making excuses because they don't want to do it?
Thanks very much to whoever answers.
-
Alan,
I cannot think of an answer I have seen recently with this level of clarity of thought and ability to refute a very bad practice (especially on a security level). One thing we do with clients and upgrades (we do not handle clients on other people's platforms) is to take a new update, give it a few weeks for bugs to be discovered, and then do the upgrade on our end. We have clients sign off on our handling the upgrades from the beginning of the relationship.
For Erik, I would suggest showing the client what has been said here by someone with a lot of savvy experience. If the devs are worth their salt, they will change.
This was a good question and Alan delivered a great answer.
Robert
-
I've got very limited bandwidth for training (90% of my work comes from audits), and it's typically limited to in-person, on-site sessions for clients in the LA area, because I find the in-person experience to be much more effective. Pricing depends on level and extent, and starts at $250 an hour, so it's ideal for groups (one fee regardless of participant count). Audits range from $3,500 up to $7,500 or more, depending on scale.
-
What do you charge for individual personalized training? And site audits?
-
You can thank my combined 11 years of SEO after 7 years of web dev project management, with a background in information security and business ownership.
-
I want to have your baby. Brilliant answer!
I just copy/pasted the entire thing to my client, and even got your pic and bio in there for added cred.
I have been asking this question in one way or another since early February, and you just nailed it.
Thank you very much.
-
Any time you upgrade a server solution, there is potential for things that currently work to suddenly break. That is just the nature of technology. In an ideal world this wouldn't happen, but unfortunately it's quite possible, for many reasons.
Just one reason: technology developers cannot possibly test every unique server configuration on earth when working on an upgrade. They have time, resource, and fiscal constraints.
In one example of how an upgrade from PHP 5.1.6 to 5.2.4 caused a WP site to collapse, the problem was with neither PHP nor WP. It was with a separate firewall-related server solution that then had to be dealt with.
That example validates the concern expressed by the developers you're dealing with.
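As an aside, the WordPress requirement Erik mentions is just a minimum-version gate, and a responsible upgrade plan can check it up front. Here is a minimal pre-flight sketch; the version numbers come from this thread, but everything else is hypothetical, and in practice you would parse the current version out of `php -v` rather than hard-coding it:

```shell
#!/bin/sh
# Hypothetical pre-flight check before a WordPress upgrade: refuse to
# proceed unless the server's PHP meets the required minimum (5.2.4 in
# this thread's case). The current version is hard-coded so the sketch
# runs anywhere; in reality you'd read it from `php -v`.
required="5.2.4"
current="5.1.6"

# version_ge A B -> true if A >= B, comparing each dotted field
# numerically (A >= B exactly when the numeric minimum of the two is B)
version_ge() {
  [ "$(printf '%s\n%s\n' "$1" "$2" | sort -t. -k1,1n -k2,2n -k3,3n | head -n 1)" = "$2" ]
}

if version_ge "$current" "$required"; then
  echo "OK: PHP $current meets the $required minimum"
else
  echo "BLOCKED: PHP $current is below the $required minimum -- upgrade PHP first"
fi
```

Run as-is, this prints the BLOCKED line, mirroring the situation in the question; the same gate would pass once the server reports 5.2.4 or later.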
HOWEVER
Regardless of potential problems of this nature, it is irresponsible and deplorable for developers to refuse to upgrade servers out of fear that something might break. Could you imagine 90% of the world still operating on IBM mainframes because of a fear of upgrading? The security implications alone are appalling, to say nothing of the business-case reasons.
Developers and systems administrators "should" be required to implement upgrades on a regular, consistent basis, with the understanding that it is their responsibility to deal with any problems that arise, and during upgrades they "should" also use intelligent best-practice precautions and methods to ensure the least chance of a critical failure. THAT is the only proper path for a business to remain successful long-term.
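In practice, those best-practice precautions boil down to: back everything up, upgrade a staging copy first, smoke-test it, and only then touch production. Below is a hedged shell sketch of that routine; every path, database name, package command, and URL is hypothetical, and `DRY_RUN=1` (the default) prints each step instead of executing it, so the plan can be reviewed safely first:

```shell
#!/bin/sh
# Hedged sketch of a staged PHP upgrade. All paths, names, and the
# smoke-test URL are hypothetical; DRY_RUN=1 (default) only echoes
# each step rather than running it.
set -eu

DRY_RUN=${DRY_RUN:-1}
SITE_ROOT="/var/www/rv-site"        # hypothetical production docroot
STAGING_ROOT="/var/www/rv-staging"  # hypothetical staging copy

run() {
  if [ "$DRY_RUN" -eq 1 ]; then
    echo "DRY RUN: $*"
  else
    "$@"
  fi
}

# 1. Full backup of files and database before touching anything.
run tar -czf "/backups/rv-site-$(date +%F).tar.gz" "$SITE_ROOT"
run mysqldump rv_db -r "/backups/rv_db-$(date +%F).sql"

# 2. Clone the site to staging and upgrade PHP there first.
run rsync -a "$SITE_ROOT/" "$STAGING_ROOT/"
run yum update php   # or your distro's equivalent package command

# 3. Smoke-test staging before scheduling the production upgrade.
run curl -fsS http://staging.example.com/ -o /dev/null

echo "All steps issued; review the output, then rerun with DRY_RUN=0"
```

The dry-run wrapper is the point of the sketch: the same script documents the procedure, lets the team review it step by step, and only performs the real work once someone deliberately flips the flag.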
Hiding under the guise of "it's too dangerous" is a pitiful excuse for laziness.