Category Archives: Technology

RPA: Are We There Yet?

Nope. Not even close. And a recent Hackett study proves it.

Earlier this month, The Hackett Group released a point of view, Robotic Process Automation: A Reality Check and a Route Forward, in which it noted that while early initiatives have produced some tangible successes, many organizations have yet to scale their use of RPA to a level that makes a major impact on performance, likely because RPA has come with a greater-than-expected learning curve.

Right now, mainstream adoption of RPA is 3% in Finance, 3% in HR, 7% in Procurement, and 10% in GBS (Global Business Services). Experimentation (referred to as limited adoption) is higher: 6% in HR, 18% in Finance, 18% in Procurement, and 29% in GBS. But that's not that high, especially considering that the steep learning curve means a number of these organizations will not continue the experiment.

Due to the high level of interest, Hackett is predicting that, within 3 years, RPA will be mainstream in 11% of HR organizations (a 4X increase), 30% of Procurement (a 4X increase), 38% of Finance (a 12X increase), and 52% of GBS (a 5X increase), along with increases in experimentation. Experimentation will definitely increase due to the hotness of the topic, but mainstream adoption will require success, and as Hackett deftly notes, successful deployment has certain key prerequisites:

  • digital inputs
  • structured data
  • clear logical rules that can be applied
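As a quick sanity check, Hackett's projected multipliers do follow from the current adoption figures (the study's multipliers are rounded):

```python
# Current vs. projected mainstream RPA adoption (figures from the Hackett
# study), with the implied growth multiple for each function.
current = {"HR": 3, "Finance": 3, "Procurement": 7, "GBS": 10}       # % today
projected = {"HR": 11, "Finance": 38, "Procurement": 30, "GBS": 52}  # % in 3 years

for fn in current:
    multiple = projected[fn] / current[fn]
    print(f"{fn}: {current[fn]}% -> {projected[fn]}% (~{multiple:.1f}X)")
# HR ~3.7X, Finance ~12.7X, Procurement ~4.3X, GBS 5.2X, which the study
# reports as 4X, 12X, 4X, and 5X respectively.
```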

And when the conditions are right, organizations:

  • realize operational cost benefits
  • have fewer errors and more consistent rule application
  • benefit from increased productivity
  • are able to refocus talent on higher-value work
  • strengthen auditability for key tasks
  • have enhanced task execution data to analyze and improve processes

But this is not enough for success. Hackett prescribes three criteria for success, which they define as:

  • selecting the right RPA opportunities
  • planning the journey
  • building an RPA team or COE

and you’ll have to check out Robotic Process Automation: A Reality Check and a Route Forward for more details, but is this enough?

Maybe, maybe not. It depends on how good the RPA team is, how good it is at identifying appropriate use cases for RPA, and how good it is at implementation. Success breeds success, but failure eliminates the option of continued use of RPA, at least until a management changeover.

The Days of Black Box Marketing May Soon Be Over!

In what marketing will refer to as the good old days of the Source-to-Pay marketplace, when the space was just emerging and most analysts couldn't see past the shiny UI to what features were, or more importantly, were NOT, lurking underneath, it was a wild-west, anything-goes marketplace.

Marketers could make grandiose claims as to what the platform did and did not do, and if they could give a good (PowerPoint) presentation to the analysts, the analysts would buy it and spread the word, and the story would grow bigger and bigger until it should have been seen as crazy and unrealistic but was instead taken as the new gospel from the power on high.

Big names would get bigger, pockets would get fatter, but customers would lose out when they needed advanced functionality or configurability that just was not there. On the road-map, maybe, but would it get implemented before the company got acquired by a bigger company, which would halt innovative development dead in its tracks?

But those days, which still exist for some vendors with long-standing relationships with the big-name analyst firms, may soon be numbered. Why? Now that SpendMatters is doing SolutionMaps, which are deep dives into well-defined functionality, a customer can know for sure whether or not a certain provider has a real solution in the area, how deep it goes, and how it compares to other providers. As a result, the depth of insight a customer will soon expect has been taken up a couple of notches, and any analyst firm or consultancy that doesn't raise the bar is going to be avoided and left behind.

Once (potential) customers realize the degree of information that is available, and should be available, they’ll never settle for less. And that’s a good thing. Because it means the days of black box marketing will soon be over. While North America may never be a Germany where accurate technical specs lead the way, at least accurate claims will. And every vendor will be pushed to do better.

Get Your Head Out of the Clouds!

SaaS is great, but is cloud delivery great?

Sure it’s convenient to not have to worry about where the servers are, where the backups are, and whether or not more CPUs have to be spun up, more memory needs to be added, or more bandwidth is needed and it’s time to lay more pipe.

However, sometimes this lack of worrying leads to an unexpectedly high invoice when your user base decides to adopt the solution as part of their daily job, spins up a large number of optimization and predictive analytics scenarios, and spikes CPU usage from 2 server days to 30 server days, resulting in a 15-fold bill increase overnight. (Whereas hosting on your own rack has a fixed, predictable cost.)

But this isn’t the real problem. (You could always have set up alerts or limits and prevented this from happening had you thought ahead.) The real problem is regulatory compliance and the massive fines that could be headed your way if you don’t know where your data is and cannot confirm you are 100% in compliance with every regulation that impacts you.
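A minimal sketch of the kind of usage alert that would have caught that spike, assuming a daily feed of metered server days (the baseline, thresholds, and function name are hypothetical; most cloud providers offer billing alarms that do this for you):

```python
# Hypothetical daily check: compare metered usage against a budgeted
# baseline and flag (or cap) anything beyond an agreed multiple.
BASELINE_SERVER_DAYS = 2.0   # expected daily compute (assumption)
ALERT_MULTIPLE = 3.0         # warn the budget owner at 3x the baseline
HARD_CAP_MULTIPLE = 10.0     # refuse new scenario runs at 10x

def check_usage(server_days_used: float) -> str:
    ratio = server_days_used / BASELINE_SERVER_DAYS
    if ratio >= HARD_CAP_MULTIPLE:
        return "cap"    # block new optimization scenarios, page the admin
    if ratio >= ALERT_MULTIPLE:
        return "alert"  # notify before the bill balloons
    return "ok"

print(check_usage(2))   # ok
print(check_usage(8))   # alert (4x baseline)
print(check_usage(30))  # cap (the 15x spike from the example)
```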

For example, EU and Canada privacy regulations limit where data on their citizens can live and what security protocols must be in place. And even if this is a S2P system, which is focussed on corporations and not people, you still have contact data — which is data on people. Now, by virtue of their employment, these people agree to make their employment (contact) information available, so you’re okay … until they are not employed. Then, if any of that data was personal (such as cell phone or local delivery address), it may have to be removed.

But more importantly, with GDPR coming into effect May 25, you need to be able to provide any EU citizen, regardless of where they are in the world and where you are in the world, with any and all information you have on them, and to do so in a reasonable timeframe. Failure to do so can result in a fine of up to €20 Million or 4% of global turnover, whichever is greater. For ONE violation. And, if you no longer have a legal right to keep that data, you have to be able to delete all of it, including all instances across all systems and all (backup) copies. If you don't even know where the data is, how can you ensure this happens? The answer is: you can't.
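To put that fine in perspective: under GDPR Article 83(5) the maximum is the greater of the two figures, so for any organization with more than €500 Million in global turnover the 4% clause dominates. A quick illustration:

```python
# Maximum GDPR fine (Article 83(5)): the greater of EUR 20M
# or 4% of total worldwide annual turnover.
def max_gdpr_fine(global_turnover_eur: float) -> float:
    return max(20_000_000.0, 0.04 * global_turnover_eur)

print(max_gdpr_fine(100_000_000))    # 20000000.0  (the EUR 20M floor applies)
print(max_gdpr_fine(2_000_000_000))  # 80000000.0  (4% of EUR 2B)
```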

Plus, not every country will permit sensitive or secure data to be stored just anywhere. So, if you want a client that is a defense contractor, even if your software passes the highest security standards tests, that doesn't mean that client can host in the cloud.

With all of the uncertainty and chaos, the SaaS of the future is going to be a blend of an (in-house) ASP and a provider-managed software offering. The application and databases will be housed in racks, in a location selected by the provider, in a dedicated hardware environment; the software, managed by the vendor, will run in virtual machines and update via vendor “pushes”, with the vendor able to shut down and restart the entire virtual machine if a reboot is necessary. This method will also permit the organization to have on-site QA of new release functionality if it likes, as that's just another VM.

Just like your OS can auto-update on a schedule or reboot, your S2P application will auto-update in a similar fashion. It will register a new update and schedule it for the next defined update cycle; prevent users from logging in 15 minutes prior; force users to log off 5 minutes before; shut down; install the updates; reboot if necessary; and restart. Then the new version will be ready to go. If there are any issues, an alert will be sent to the provider, who will be able to log in to the instance, and even the VM, and fix it as appropriate.
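That update window can be sketched as a simple ordered sequence; the timings come from the text, while the function name and actions are illustrative, not any vendor's actual API:

```python
# Illustrative maintenance-window plan for a vendor-pushed update:
# lock logins at T-15 minutes, force log-off at T-5, then update at T.
from datetime import datetime, timedelta

def plan_update_window(update_time: datetime) -> list[tuple[datetime, str]]:
    """Return the ordered steps of the update cycle as (when, action) pairs."""
    return [
        (update_time - timedelta(minutes=15), "block new logins"),
        (update_time - timedelta(minutes=5),  "force remaining users to log off"),
        (update_time,                         "shut down application"),
        (update_time,                         "install updates"),
        (update_time,                         "reboot VM if required, restart app"),
    ]

# A 2:00 AM update slot (arbitrary example time):
for when, action in plan_update_window(datetime(2018, 6, 1, 2, 0)):
    print(when.strftime("%H:%M"), action)
```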

While it’s not the one-instance (with segregated databases) SaaS utopia, it’s the real-world solution for a changing regulatory and compliance landscape, which will also comfort security freaks and control freaks. So, head in the cloud vendors, get ready. It’s coming.

Why’s it all about the platform when it should be all about the power?

As we all know, the last year has been all about the M&A frenzy as the big try to get bigger by gobbling up any player with any modules they don’t have or any player with customer bases in a region they aren’t in, and doing so in a manner that doesn’t always make sense to analysts. As the doctor indicated in his post last month on Surviving a M&A: The Customer Perspective, acquisitions should lead to synergies and do so from a customer, solution, and/or operations perspective.

Preferably, an M&A should culminate in synergies of all kinds. Why? An M&A that doesn't sync from an operations perspective doesn't reduce overhead costs, and that means you don't get any economies of scale, which is something all the traditional textbooks say is the first thing you should look for. If there are no customer synergies, then there are no cross-sell or up-sell opportunities, and that's typically the next thing the textbooks say you should look for.

And, especially in our space, if there are no solution synergies, then a lot of money is wasted, as the point of the acquisition should be to build a better, or at least, a more complete platform. Otherwise, one company is paying a lot of money for something that will just get tossed in the bit bucket because supporting non-synergistic platforms gets too expensive too fast and the non-synergistic pieces will get sunsetted faster than the sun in Alert, Nunavut in late February.

So why doesn’t the recent M&A Frenzy make a lot of sense to the doctor? Not only has a fair amount of it been lacking in obvious synergies, but a lot of it has been to simply expand platform offerings, without focussing on the power of the solutions being bought or how the acquisitions will help the platform.

The past year has seen the acquisitions of traditional catalog providers and leading spend analytics and optimization providers. In some cases, the power is limited … and in other cases the power is limitless. But in the majority of cases, to date, the integration has been pretty limited. It's been more or less just plugging a module into a hole, without an analysis of not only the power of the solution but of how the solution could enhance the rest of the platform in new and innovative ways.

For example, let's take optimization. Just plugging it into a S2P platform is pretty good, especially given the dearth of optimization solutions on the market today, but is it great? How do you take an offering to market that the market will understand is better than those of the other leading vendors which have optimization? After all, if it's just the same process (construct RFI, send it out, get data, pump it into a model, get a result, make the award, push it into contract management), what's better from the perspective of an average Joe? But if you have an advanced Procurement solution that can plug into the catalog and analyze not only the cost but the total cost, where the order can be piggy-backed on other orders from on-contract suppliers who can add it to forthcoming shipments, give you contract-level discounts, etc., that's value. And if, when you are looking to assemble a standard kit for a new hire, it can run all the various combinations and determine which variant is best over a given time frame, that's value too.

And a catalog solution can enhance sourcing if it supports punch-out and integrated search: anytime a buyer is considering sending out an RFI, it can identify current market pricing and potential suppliers from the data within the catalog and in punch-out sites. Comparing this pricing to current pricing lets the buyer know whether going to market will likely be good (if market pricing is significantly less than current organizational pricing) or bad (if market pricing is significantly higher, in which case the best option is to simply extend the contract with the incumbent if pricing will stay about the same).
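That go-to-market logic reduces to a simple comparison; a minimal sketch, with the "significant" threshold as an assumed parameter and an illustrative function name:

```python
# Hypothetical decision aid: compare catalog/punch-out market pricing
# against the current contracted price before launching an RFI.
def go_to_market(current_price: float, market_price: float,
                 threshold: float = 0.05) -> str:
    """Recommend an action; `threshold` (5% here) defines 'significant'."""
    delta = (market_price - current_price) / current_price
    if delta <= -threshold:
        return "go to market"           # market is significantly cheaper
    if delta >= threshold:
        return "extend with incumbent"  # market is significantly higher
    return "either"                     # pricing is roughly flat

print(go_to_market(100.0, 90.0))   # go to market
print(go_to_market(100.0, 112.0))  # extend with incumbent
print(go_to_market(100.0, 101.0))  # either
```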

At the end of the day, Procurement is about generating value — and if the platform addition doesn’t generate additional value, what’s the point?