Get Your Head Out of the Clouds!

SaaS is great, but is cloud delivery great?

Sure, it’s convenient not to have to worry about where the servers are, where the backups are, or whether more CPUs have to be spun up, more memory added, or more bandwidth provisioned because it’s time to lay more pipe.

However, sometimes this lack of worrying leads to an unexpectedly high invoice: your user base decides to adopt the solution as part of their daily job, spins up a large number of optimization and predictive-analytics scenarios, and spikes CPU usage from 2 server-days to 30 server-days, resulting in a 15-fold bill increase overnight. (Whereas hosting on your own rack has a fixed, predictable cost.)
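The billing arithmetic here is worth making concrete. A minimal sketch, assuming a flat (hypothetical) per-server-day rate — the rate and the `monthly_bill` helper are illustrative, not any real vendor’s pricing:

```python
# Hypothetical illustration of the usage-based cost spike described above.
# The rate is an assumption, not real cloud pricing.
RATE_PER_SERVER_DAY = 50.0  # assumed flat rate, in dollars


def monthly_bill(server_days: float, rate: float = RATE_PER_SERVER_DAY) -> float:
    """Usage-based cloud billing: you pay for exactly what you consume."""
    return server_days * rate


baseline = monthly_bill(2)   # light use: 2 server-days
spike = monthly_bill(30)     # adoption spikes usage to 30 server-days

print(spike / baseline)  # 15.0 -- the 15-fold increase in the scenario above
```

The point of the sketch: with usage-based billing the invoice scales linearly with consumption, so a 15x usage spike is a 15x bill, while a fixed rack is a constant regardless of `server_days`.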

But this isn’t the real problem. (You could always have set up alerts or limits and prevented this from happening had you thought ahead.) The real problem is regulatory compliance and the massive fines that could be headed your way if you don’t know where your data is and cannot confirm you are 100% in compliance with every regulation that impacts you.

For example, EU and Canadian privacy regulations limit where data on their citizens can live and what security protocols must be in place. And even if this is an S2P system, which is focused on corporations and not people, you still have contact data, which is data on people. Now, by virtue of their employment, these people agree to make their employment (contact) information available, so you’re okay … until they are no longer employed. Then, if any of that data is personal (such as a cell phone number or local delivery address), it may have to be removed.

But more importantly, with GDPR coming into effect May 25, you need to be able to provide any EU citizen, regardless of where they are in the world and where you are in the world, with any and all information you have on them, and do so in a reasonable timeframe. Failure to do so can result in a fine of up to €20 Million or 4% of global turnover, whichever is higher. For ONE violation. And, if you no longer have a legal right to keep that data, you have to be able to delete all of it, including every instance across all systems and all (backup) copies. If you don’t even know where the data is, how can you ensure this happens? The answer is: you can’t.
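To see how that fine structure bites, note that the upper tier (GDPR Article 83(5)) is the *greater* of the flat cap and the turnover percentage — a tiny sketch, with the `max_fine` helper being an illustration rather than any official calculator:

```python
# Upper-tier GDPR fine: the greater of a flat €20M cap and 4% of global
# annual turnover (Article 83(5)). All figures in euros.
FLAT_CAP = 20_000_000     # €20 million
TURNOVER_SHARE = 0.04     # 4% of global annual turnover


def max_fine(global_annual_turnover: float) -> float:
    """Maximum exposure for a single upper-tier violation."""
    return max(FLAT_CAP, TURNOVER_SHARE * global_annual_turnover)


print(max_fine(1_000_000_000))  # €1B turnover -> 40000000.0: the 4% tier dominates
print(max_fine(100_000_000))    # €100M turnover -> 20000000.0: the flat cap dominates
```

In other words, once turnover passes €500M, the 4% tier takes over and the exposure keeps growing with the size of the business.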

Plus, not every country will permit sensitive or secure data to be stored just anywhere. So if the client you want is a defense contractor, even if your software passes the highest security-standards tests, that doesn’t mean that client can host in the cloud.

With all of this uncertainty and chaos, the SaaS of the future is going to be a blend of an (in-house) ASP and a provider-managed software offering. The application, and its databases, will be housed in racks in a location selected by the provider, in a dedicated hardware environment; the software, managed by the vendor, will run in virtual machines and update via vendor “pushes”, with the vendor able to shut down and restart the entire virtual machine if a reboot is necessary. This method will also permit the organization to have on-site QA of new release functionality if it likes, as that’s just another VM.

Just like your OS can auto-update on a schedule and reboot, your S2P application will auto-update in similar fashion. It will register a new update and schedule it for the next defined update cycle. Prevent users from logging in 15 minutes prior. Force users to start logging off 5 minutes before. Shut down. Install the updates. Reboot if necessary. Restart. And the new version will be ready to go. If there are any issues, an alert will be sent to the provider, who will be able to log in to the instance, and even the VM, and fix it as appropriate.
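That maintenance-window logic can be sketched in a few lines. The 15-minute login lockout and 5-minute forced log-off come from the description above; the `UpdateWindow` class and its method names are purely illustrative, not a real vendor API:

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta


# Sketch of the scheduled-update window described above. Timings are the
# ones from the text; the class itself is a hypothetical illustration.
@dataclass
class UpdateWindow:
    update_at: datetime  # the next defined update cycle
    lockout: timedelta = field(default=timedelta(minutes=15))  # block new logins
    logoff: timedelta = field(default=timedelta(minutes=5))    # start forced log-off

    def logins_allowed(self, now: datetime) -> bool:
        """New logins are refused in the 15 minutes before the update."""
        return now < self.update_at - self.lockout

    def should_force_logoff(self, now: datetime) -> bool:
        """Forced log-off begins 5 minutes before the update."""
        return self.update_at - self.logoff <= now < self.update_at

    def phases(self) -> list[str]:
        """The ordered steps once the window opens."""
        return ["shut down", "install updates", "reboot if necessary", "restart"]


window = UpdateWindow(update_at=datetime(2018, 6, 1, 2, 0))
print(window.logins_allowed(datetime(2018, 6, 1, 1, 50)))       # False: inside the lockout
print(window.should_force_logoff(datetime(2018, 6, 1, 1, 56)))  # True: log-off has begun
```

The design point is that the whole cycle is driven by one scheduled timestamp, so the vendor push only needs to set `update_at` and the lockout, log-off, and restart phases follow mechanically.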

While it’s not the one-instance (with segregated databases) SaaS utopia, it’s the real-world solution for a changing regulatory and compliance landscape, and one that will also comfort the security freaks and the control freaks. So, head-in-the-cloud vendors, get ready. It’s coming.