Monthly Archives: June 2018

GDPR: The Dreaded DPIA (Part XV)

Today’s guest post is from Tony Bridger, an experienced provider of Procurement Consulting and Spend Analysis services across the Commonwealth (as well as a Lean Six Sigma Black Belt) who has been delivering value across continents for two decades. He is currently President of UK-based TrainingWorx Ltd, a provider of a wide range of Procurement and Analytic business training programs (inc. GDPR, spend analysis, project management, process improvement, etc.) and focussed short-term consulting solutions. Tony can be contacted at tony.bridger@data-trainingworx.co.uk.

One of the key changes in the GDPR legislation involves the creation of DPIAs or Data Protection Impact Assessments.

At first glance, this appears to be what can only be termed a “mindless piece of bureaucracy”.

However, perhaps not.

Historically, it may be hypothesised that many personal data breaches have been the result of “mindless planning” neatly followed by badly managed execution.    It has been incredibly easy to obtain data, endlessly spam individuals — and share that data around.   Often, little or no thought, planning or impact assessment has been conducted in the process of managing this type of data.

Conceptually, the DPIA is a very good idea. However, like many EU regulations, the “how” is more abstruse and intricate.

The United Kingdom’s ICO (Information Commissioner’s Office) site states that:

“You must do a DPIA for processing that is likely to result in a high risk to individuals”.

High risk is hard to define in the procurement world.   Many hosted procurement technologies contain considerable volumes of personal data as we are all aware – both controllers and processors need to stop and carefully assess any new data management proposals.   A DPIA creates a structured approach and framework that can be used to help define if the targeted processing could breach the regulation.

A DPIA is effectively a combined project brief and risk assessment of any new data processing activity that an organisation intends to conduct. The DPIA contains a variety of what appear to be simple requirements. The DPIA must:

  • describe the nature, scope, context and purposes of the processing;
  • assess necessity, proportionality and compliance measures;
  • identify and assess risks to individuals; and
  • identify any additional measures to mitigate those risks.

If you think about it carefully, it is eminently sensible in its approach.

However, deductively, there are several core organisational processes that need to be in place to achieve the outcome. In many respects, this is the point at which the DPIA becomes a little more complex to implement and manage. If those organisational processes do not currently exist, creating them will add to the complexity.

However, there may be some good news. In response to this, supervisory authorities have attempted to provide guidance and checklists that can help organisations manage the process and reduce risk. We have left the discussion of DPIAs until this stage as there are options to use the process to overcome some of the risks with personal data in this domain.

In our next post we will start to evaluate how procurement data could be managed through the DPIA process.

Thanks, Tony!

Zycus – Expending their Horizons in the EU

Zycus recently held their inaugural event in Europe — the last three days in Prague, to be precise. the doctor was there, and he has to say he was impressed with:

  • the conference organization
    (fewer snafus and less disorganization than a few conferences he’s been to recently organized by larger peers),
  • the content
    (they did a great job blending content from them, their partners, their customers, and leading analysts),
  • the progress
    both on the customer front and the product front

Recently we’ve seen a number of companies break out of Europe and into North America — like Ivalua and Synertrade — but we rarely see companies, even those from North America (and definitely those from India), break in, especially in a short time-frame. In the last two years Zycus has gone from almost no presence in Europe to a known provider of S2P services with dozens of local customers among its 300+ worldwide deployments, supported by local partners. That’s quite impressive.

This last fact is key — Zycus understands fully that Europe is not India or America. It is dozens of countries with dozens of languages and dozens of local cultures that need to be supported by a provider that wants to effectively support its customers and the continent in, and on, which they do business. And Zycus understands that there are local implementation partners and providers in Europe that understand these needs. So while some providers try to sell locally with their own staff hired in Europe (who can’t know everything, as they are few), and others try to sell exclusively through partners (who are better equipped for local support but, if not well trained, can’t accurately represent the provider), Zycus sells as a partnership between the local implementation partner, the provider of software, and the provider of service (but takes all the responsibility for ensuring the customer receives a successful deployment).

And a successful deployment is something they are quite capable of achieving. Not only do they have 300+ people to support implementations, but they have a history of working with partners to ensure that any localizations that need to happen, happen. We expect that as long as all parties go in with a solid understanding of what needs to happen, and what the true effort is, deployments will be appropriately planned and be successfully realized. And customer progress will continue.

Then we have the product front. Zycus continues to develop and has made good progress on a couple of modules, their iRequest module in particular. While this may seem the least sophisticated from a sourcing perspective, it is the most important from a success perspective.

When one thinks about why most mavericks try to bypass the Procurement department, it’s typically because they see the Procurement department as a bottleneck. Too long to get approvals. No visibility into the sourcing event. Etc. Etc. With iRequest, anyone in the business can make any sort of request or requisition to Procurement and follow it through to the conclusion, with visibility not just into the status, but into the sourcing event, contracting process, or anything else that is relevant. It links into almost all of their other modules and allows a buyer to kick off events, approval chains, and information request processes with relative ease. It makes Procurement look like an enabler and that is key to organizational acceptance and success. It’s definitely worth checking out.

More coverage on Zycus, here and in depth on Spend Matters Pro (membership required), is coming, so stay tuned.

GDPR – Consents (Part XIV)

Today’s guest post is from Tony Bridger, an experienced provider of Procurement Consulting and Spend Analysis services across the Commonwealth (as well as a Lean Six Sigma Black Belt) who has been delivering value across continents for two decades. He is currently President of UK-based TrainingWorx Ltd, a provider of a wide range of Procurement and Analytic business training programs (inc. GDPR, spend analysis, project management, process improvement, etc.) and focussed short-term consulting solutions. Tony can be contacted at tony.bridger@data-trainingworx.co.uk.

You will have to forgive us for this post – this is not an easy topic. The topic is quite broad and, as with most elements of the GDPR, takes a little thought and consideration.

Consents need to be considered as a key privacy factor across many elements of procurement business.

There are several ways we can discuss consents – but we thought that to demonstrate the complexity of the legislation – and some of the care that needs to be taken, we would use a fictional human resource or temporary labour company in Europe.

If you have any doubts whatsoever as to the complexity of the legislation for this category of supplier, drop onto the website of one of the larger UK-based recruitment companies and enjoy a leisurely afternoon coming to terms with their Privacy Notice. All of them have had to:

  • Map out where personal data is held – files, paper, spreadsheets, databases;
  • Understand who they share it with;
  • Centralise and control access to personal data;
  • Define the who, what, why, when and where of holding candidate data – and make that clear to candidates;
  • Ensure candidates are informed of how their data is managed – stored and used;
  • Obtain consent to send candidates’ resumes to clients as needed – though, for differing clients, it is likely that individual consents will be required;
  • If the recruiter provides psychological testing, be clear about how long the results are retained, and how they are used.

For example, in 2016-17, the New South Wales government allowed psychological testing of candidates for key roles. However, the results of these tests were made available across all government agencies on demand – some 30+ of them. If this were Europe – and a breach occurred – it could be a costly exercise. Is the Government the agency – or each individual state agency or body? The differences in how data is used (and the associated consents) vary considerably across the globe. Ironically, the NSW Department of Industry has just issued a warning that candidates who applied for roles may have had their personal details exposed in a potential breach – a breach that may have occurred on a much wider basis.

For procurers, if temporary labour agencies are used (and consultants are in the same domain, whether they like it or not), many will insert contractor or employee names into invoices. As the initial consent to disclose, and the offer of work, would have been consent based, it does rely on all parts of the consent process working to specification. Perhaps that, as the old saying goes, is a verloren hoop – a “forlorn hope”.

With spend analysis data, recruitment agencies would no doubt use the legitimate processing clause – in combination with contractual processing requirements.  No harm there we suspect.   The customer would have the data – and for analysis purposes would need to review that data for contractual reasons.   All seems sensible enough.

However, if you think about the number of if-then-else processes and sub-processes that need to comply, then statistically it will be hard to ensure that all consents are in place in a fast-moving business. At a later date, if a contractor submits a Data Subject Access Request, this could involve recovering information that an agency has supplied to former contractor employers – again, it is unclear. It could be made worse if relationships between agency and customer have broken down.

We don’t have the answers, sadly. However, it is almost inevitable that someone will fall foul of the legislation in a supply chain as complex and high volume as temporary labour. We shall see.

Thanks, Tony.

Agile Procurement? Or just go faster?

Today’s guest post is from Tony Bridger, an experienced provider of Procurement Consulting and Spend Analysis services across the Commonwealth (as well as a Lean Six Sigma Black Belt) who has been delivering value across continents for two decades. He is currently President of UK-based TrainingWorx Ltd, a provider of a wide range of Procurement and Analytic business training programs (inc. GDPR, spend analysis, project management, process improvement, etc.) and focussed short-term consulting solutions. Tony can be contacted at tony.bridger@data-trainingworx.co.uk.

Yves Saint Laurent was an outstanding fashion designer in very many respects. However, he had very clear views on how fashion works. He summarised it in five words:

“fashions fade, style is eternal”.

There is little or no doubt that the procurement world has (once again) jumped on a fashion trend.     In the fashionista world, everyone is busy being a transformer, a value-adder, a people empoweree – and now agile.   This must leave so little room in the day for saving money – it is costly to keep up with fashion trends as we all know.

Agile is an interesting word.   Agile applied to procurement is a very interesting word.

Agile springs from an alternative approach to software development. However, it seems to have neatly morphed into a word that expresses some form of new, vague approach to sourcing. Mark C. Layton, in the Dummies Guide to Agile Management and Procurement Practices (2012), focuses on software acquisition and development as the basis for an agile approach – and how vendors can be managed in agile, technology-driven development projects.

CIPS published a paper in their Knowledge Summary series (undated) where some four pages of (unfocused) discussion result in the conclusion that:

“As this paper makes clear, ‘lean’ and ‘agile’ concepts have been, and continue to be, the subject of academic research… (and) that ‘lean’ and ‘agile’ are not simply theoretical concepts.”

Well, no help there then.   After a little rummaging through much word-smithing (I hope I don’t start a new fashion with that phrase), I found an article on Rev-International (Source) – so, quite recent.  The article states:

“To be agile means to be able to think, understand, and move quickly and easily. To be agile, according to Cornell, procurement organizations need to have the knowledge and ability to move quickly.”

Sadly, this deductively implies that unless they adopt the new fashion, procurement teams will remain inherently slow and unfashionably nerdy.   It gets worse:

 “It’s about using market knowledge and business intelligence to exploit profitable opportunities,”

From experience, both as a member of, and supplier to, a wide range of procurement organisations, this is pretty much what most seem to do for a living. However, admittedly, there is still a major capability gap in the use of business data intelligence in many procurement teams. Many writers still focus on Agile as a procurement-technology-driven function – with not much to do with “the rest” of the sourcing portfolio. So where does this leave us? I am now really not sure what to wear.

Don’t you just hate it when a piece of music gets into your thinking… and you can’t turn it off? The Kinks, in 1966, wrote a song called “Dedicated Follower of Fashion”. One line describes how its subject is:

“……. Eagerly pursuing all the latest fads and trends”

It is too easy to become distracted by the fashionable and the pursuit of a silver bullet – by all means learn new techniques – and adapt if they fit. However, it would be much better to see good procurement teams (continuing) to deliver quickly, using business intelligence and supplier collaboration – but with style – and perhaps a little panache. It’s really business as usual: save money, avoid chasing fashions. Who knows, perhaps I am just plain old-fashioned and too focused on style.


Thanks, Tony.

Why You Need a Master Data Strategy for Proper Supplier Management (Repost)

This post originally ran on June 24, 2013, but seeing as it’s still a relevant message five years later, it is being re-posted to educate newcomers on the importance of Master Data Management strategies in this data-centric era.

Supplier Information Management is more than just buying a Supplier Information Management (SIM) solution and plopping it into your data centre. Much more. Yet it seems that some people — anxious to deal with the visibility, risk management, and supplier performance issues facing them — believe that merely obtaining a SIM solution will solve their problems. A proper solution, properly acquired, properly implemented, and properly used, will go a long way to increasing supply chain visibility, enabling risk management and mitigation, and providing a solid foundation for supplier performance management, but the mere presence of such a solution in your supply management application suite is about as useful as a drill in the hands of a carpenter holding a nail.

You see, Supplier Information will never be restricted to the SIM system. Supplier information will always be present in the ERP system used for resource planning and manufacturing, the accounts payable system, the transactional procurement / procure-to-pay system, the sourcing suite, the contract management system, the risk management solution, the performance tracking and scorecard system, the sustainability / CSR solution, and other systems employed in your organizational back-office to manage the different supply management AND business functions. Supplier data is everywhere, and without a strategy, just shoving it into the SIM system won’t help.

In order to get a proper grip on supplier information, the organization needs a master data strategy that dictates the sub-records that define a supplier record and which system holds the master data for each sub-record. What do we mean by this? For example, the ERP may hold the core supplier identifier sub-record that defines the unique supplier number in your system, the supplier name, the supplier’s tax number, and your customer number in the eyes of the supplier, and be the system of record for this information. The accounts payable system, referencing the supplier by its supplier number, may be the system of record for the headquarters address and payment address. The contract management system may be the system of record for the list of employees authorized to sign contracts on behalf of the supplier. The CSR system may be the system of record for the supplier’s carbon rating, third-party CSR rating, and your internal sustainability rating. And so on.
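The division of ownership described above can be sketched as a simple system-of-record registry. This is a minimal illustration only — the system names and sub-record labels below are assumptions for the example, not a prescribed schema:

```python
# Illustrative system-of-record registry: each supplier sub-record
# is owned by exactly one back-office system.
SYSTEM_OF_RECORD = {
    "core_identity":      "ERP",        # supplier number, name, tax number
    "payment_addresses":  "AP",         # headquarters and payment addresses
    "authorized_signers": "CONTRACTS",  # who may sign on the supplier's behalf
    "sustainability":     "CSR",        # carbon, CSR, and sustainability ratings
}

def owner_of(sub_record: str) -> str:
    """Return the system of record that owns a given supplier sub-record."""
    try:
        return SYSTEM_OF_RECORD[sub_record]
    except KeyError:
        raise ValueError(f"No system of record defined for {sub_record!r}")
```

The point of writing the mapping down explicitly is that every data conflict then has a deterministic winner: the owning system.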

If this is the case, the SIM system, to truly be a SIM solution for your organization, needs to integrate with all of these systems and encode the proper rules to resolve data conflicts as required. Specifically, three things need to happen. First of all, whenever a system of record updates data, that data must be pulled into the SIM system, overwriting the existing data. Secondly, anytime data is updated in the SIM system for which it is the system of record, that data must be pushed out to all systems that use it. Thirdly, and this part is sometimes overlooked, whenever data is updated in a system of record, the data not only needs to be pulled into the SIM system, but it then needs to be pushed out to any other system that also uses that data. The SIM solution is the centre of a hub-and-spoke data architecture — all updates flow in, and all updates flow out.
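The hub-and-spoke rules above can be sketched in a few lines. This is a minimal sketch under assumed names (the `SimHub` class, the system labels) and not the behaviour of any particular SIM product:

```python
# Minimal hub-and-spoke sketch: the SIM hub accepts an update from the
# system of record for a sub-record, stores it as the master copy, and
# fans it out to every other system that subscribes to that sub-record.

class SimHub:
    def __init__(self, system_of_record, subscribers):
        self.system_of_record = system_of_record  # sub_record -> owning system
        self.subscribers = subscribers            # sub_record -> systems using it
        self.master = {}                          # sub_record -> current master data

    def update(self, source, sub_record, data):
        """Apply an update from `source` and return the systems to push it to."""
        owner = self.system_of_record[sub_record]
        if source != owner:
            # Only the system of record may win a conflict on its sub-record.
            raise PermissionError(
                f"{source} is not the system of record for {sub_record}")
        # Rule 1: pull the update into the hub, overwriting the existing data.
        self.master[sub_record] = data
        # Rules 2 and 3: push it out to every other system that uses the data.
        return [s for s in self.subscribers.get(sub_record, []) if s != source]
```

For example, an address change made in the AP system would flow into the hub and then out to the ERP and contract systems in one pass — which is exactly the "all updates flow in, and all updates flow out" behaviour the post describes.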

This can only be properly accomplished with an appropriate Master Data Strategy. Don’t overlook it. Otherwise your SIM solution will turn out to be a Stuck In Muck solution. An SI is not kidding about this.