Driving Home an Advantage
Finding time to make an operational difference is always a challenge amidst the continuous noise of daily activities, but modern technologies now enable smaller, more manageable iterative changes that really can make a significant impact. Think of them initially as your wider existing processes, including those that touch spreadsheets, being either fully or partially automated as new digital ones. These then effectively become the new digital glue between your existing systems, all executed with appropriate security levels and segregation of duties to drive compliance.
Most organisations today are at a similar stage of operational inefficiency, with excessive amounts of time spent on transactional processing to reach acceptable end results, so any time spent making those processes more efficient will free time to drive further value initiatives. The root cause of this friction is twofold: firstly, the inability of users to create cost-effective, repeatable, and auditable ultra-granular transformational processes across different vendor applications; and secondly, those same users lacking access to the practical compute power to drive such processes in a timely manner.
Both root causes are now solved, enabling friction to be fully or partially removed from any process, either iteratively or in one go. Think in the first instance of any resultant activity being driven automatically or semi-automatically end to end: from collection of all required data sets, even x-subsystems, through all flows to reporting @anywhere. Then add a secondary ability to drive actionable contextual alerts and workflows, plus simulations, which together drive deeper value creation. This capability is sometimes referred to as processtech.
Updated September 2019: Open Banking APIs provide new emerging opportunities for cash process enhancement https://flexsystemhk.wordpress.com/2019/05/07/general-and-open-banking-apis-in-the-digital-era/
Processtech improvements are driven by domain end-users, leveraging their deep knowledge, whether that is focused on complex processes involving many people or complex tasks performed by just a few. Any improvement should be seen as a journey, not a slam-dunk to a single end point. These improvements let you leverage your existing applications to drive value, and contribute to a better work-life balance, all through the reduction of repetitive manual transactional procedures.
Deciding which areas to tackle first is a challenge at the beginning, simply because you need to understand not only the how but also the limitations of usage. The key point on the "how" is recognising that you can leverage what works today and solve the key remaining issues iteratively, meaning you can focus, individually or collectively, on the multiple problem stages within a single process at the same time: frictional areas in data collection, awkward transformational flows, and those cumbersome segmental reporting activities that require you to make information available @anywhere to key stakeholders.
Limitations will be minimal if processes can be defined at an ultra-granular level. Recognise, however, that depth of functionality in vendor solutions of this type tends to be diametrically opposed to usability, meaning your selection reviews will need to focus on understanding both the granular capabilities of a solution and its usability.
Full financial consolidations and operational segmental analysis are examples of what corporates can achieve. They show that granular processes can be driven to any required level of deep transformational ability. In the consolidation example, processing can run fully or partially automatically, even if some intermediate stages require a simple eyeball review by a domain specialist before onward processing. Such processes can take into account percentage ownerships, currencies, and segmental analysis derived across different vendor systems that might even use different coding structures; they can generate rolling balances and various types of balance sheet reconciliation, such as currency movements. Variances can be ranked within an entity, at a consolidated level, or both, with notifications plus contextual actionable workflows generated, all to drive better data quality.
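The consolidation mechanics described above can be sketched in a few lines. This is a minimal illustration only: the subsidiary names, ownership percentages, and exchange rates are invented for the example, and a real consolidation engine would of course handle far more (eliminations, coding-structure mapping, reconciliations).

```python
# Hypothetical sketch: consolidating subsidiary revenue with
# percentage ownership and currency translation. All names,
# rates, and figures below are illustrative assumptions.

FX_TO_GROUP = {"USD": 1.0, "EUR": 1.10, "HKD": 0.128}  # assumed rates

subsidiaries = [
    {"name": "Sub A", "currency": "EUR", "ownership": 1.00, "revenue": 500_000},
    {"name": "Sub B", "currency": "HKD", "ownership": 0.60, "revenue": 2_000_000},
]

def consolidate(subs, fx):
    """Translate each subsidiary into group currency, apply % ownership,
    and keep a per-entity audit line so any figure can be traced back."""
    audit, total = [], 0.0
    for s in subs:
        translated = s["revenue"] * fx[s["currency"]]
        share = translated * s["ownership"]
        audit.append((s["name"], s["currency"], translated, s["ownership"], share))
        total += share
    return total, audit

total, audit = consolidate(subsidiaries, FX_TO_GROUP)
print(f"Consolidated revenue (group currency): {total:,.0f}")
for row in audit:
    print(row)
```

The audit list is the point: every consolidated number carries its entity, source currency, translated value, and ownership share, which is what allows the "simple eyeball review by a domain specialist" at an intermediate stage.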
Additionally, as processes become focused and specific they are very usable, because complexity involving human interaction has been simplified for users by the domain owner. In other words, the system does the heavy lifting to drive consistent, auditable, and repeatable behaviour, which in itself reduces unintentional manual errors. Management of collaborative, incremental improvements to any process is further facilitated, at a step-by-step level, for change management purposes.
Time savings can be very significant and, importantly, free up more time for validation of the underlying numbers, increasing confidence in reporting pack submissions, an area CFOs often highlight as a material concern. As we all know, data flows are pervasive, with many similar parallel processes undertaken in all operating units as information flows from business units to regional offices to HQ, and often back again in the form of detailed questions.
The fact that these processes are similar but not the same is worth highlighting: the same activities are often performed differently, influenced by local requirements and differences in underlying systems at each location. Despite efforts to make systems consistent, this is rarely achieved in reality, simply due to operational drivers like acquisitions. Now these differences can be handled at a granular level, simplifying the "how" to drive better standardisation while still handling localised issues.
In a perfect world every enterprise would operate in real time, but few aspire to this vision today as it seems too far from reality, and so totally impractical given current resources and perceived technical capabilities, although as more automations are executed we will all drift slowly in that direction. Today's position is not surprising, of course, as all current systems and processes have been developed and iterated on over long periods of time, and no one will replace everything in one go, for both cost and practical reasons. The real underlying issue is that many core processes require information x-applications, not just within them, and static, cumbersome spreadsheets have often been developed to fill that gap.
Saving time to make time for other activities can be achieved by using processtech for full or partial automation, enabling granular processing x-application and x-vendor that transforms data into the required results. This in turn leads to other refinement opportunities being identified and driven more quickly by domain end users for even further and deeper value creation, especially where these are executed x-department. However, change management issues will emerge for these inter-departmental ones, as highlighted below.
Thinking of processes as primary or secondary is a good way of looking at the various types of dependent data flows, even across departments: primary processes get data into the system, and secondary processes use it. Point of sale (POS) feeds would be primary; automating commissions to drive operational performance, enriching POS data with actual purchase data for profit release, or fraud detection would be secondary. Sometimes these cross several departments, but all too often they become data silos, preventing deeper innovation and causing work to be duplicated or, at worst, not leveraged at all by other teams who might not even be aware of their existence.
Inter-departmental versus intra-departmental processes highlight a significant change management issue. FinTech and retail are often cited as segments that have seen major progressive change through the leverage of digital, but they also illustrate some important and significant differences in end-to-end developments that need to be proactively managed (see Why Organisational Structures in the Digital Era need to be more Agile).
Processtech is your enabler for deep productive change, allowing domain end users to fully or partially automate complex tasks or complex processes, for primary or secondary activities, at a granular level. This can be undertaken with full compliance capability by leveraging existing applications, without starting from base zero. Whether these processes are x-departmental or not is not a technical concern but one of change management, not to mention internal politics. Tangentially, this highlights another differentiator: the ability to manage both inter- and intra-departmental workflows.
Let’s look at some practical examples of processtech for both complex tasks and complex processes, to give additional context and ideas on capability. Complex tasks might include bond management, lease management, retail management to drive customer segmentation, multi-tiered allocations, or last-mile project reconciliations undertaken by a few staff. Complex processes might include financial or operational budget preparation, including FP&A, and multi-level financial and segmental consolidations. In all cases the differentiation that comes with processtech is ultra-granular flow handling, all done with full compliance. It also means processes can be extended to subsidiaries to leverage domain expertise that is in short supply, through 1,2,3 or 3,2,1 processes, or indeed used to gain deeper insight into smaller operations with less deep resource sets; for example, giving HR visibility and transparency into approved but not yet hired positions that might slow overall growth or impact service levels.
Diving deeper below provides further insight into some of the areas corporates have been exploring with processtech.
Enrichment of data in processes. Enrichment drives value, for example combining POS data with purchase data as above, or it may simply provide a more complete and comprehensive audit trail as to source and actual calculation methodology. This facilitates later audit and management reviews: when applied to simple or multi-tiered allocations or multi-company consolidations, it reduces the time needed to reassess how a number was actually derived, because we all know how much time is spent further down the line figuring out where a specific number originated.
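The POS enrichment case can be sketched as a small secondary process. The SKU codes, prices, and costs below are invented for illustration; the point is the join between the primary POS feed and the purchasing subsystem, with each derived figure carrying its sources so the audit trail survives the enrichment.

```python
# Hypothetical sketch: a secondary process enriching primary POS sales
# with purchase-cost data to release gross profit per line. All SKUs,
# prices, and costs are illustrative assumptions.

pos_feed = [  # primary process output: sales captured at the till
    {"sku": "A1", "qty": 3, "price": 25.0},
    {"sku": "B2", "qty": 1, "price": 120.0},
]
purchase_costs = {"A1": 15.0, "B2": 80.0}  # from the purchasing subsystem

def enrich(pos, costs):
    """Attach unit cost and derived margin to each sale, recording the
    source of each figure for later audit and management review."""
    out = []
    for line in pos:
        cost = costs[line["sku"]]
        out.append({
            **line,
            "unit_cost": cost,
            "margin": (line["price"] - cost) * line["qty"],
            "sources": {"price": "pos_feed", "unit_cost": "purchase_costs"},
        })
    return out

enriched = enrich(pos_feed, purchase_costs)
for row in enriched:
    print(row["sku"], row["margin"], row["sources"])
```

Keeping the `sources` map alongside each derived number is what makes the later "where did this figure come from?" question cheap to answer.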
User experience (UX) within a process is another consideration, depending on whether the process is inward or outward looking, and for whom. Processes requiring a rich UX might be delivered through native applications, which have more power in this area, whilst others might work well as progressive web applications. The choice is often based on a basket of criteria defined by corporate policy, security considerations, or the specifics of the task in hand.
Driving forwards with a gross margin versus net margin focus. An important difference worth highlighting relates to projects that focus on gross margin as opposed to net margin, as outcomes can potentially be very different. In a 2016 APQC survey, the typical cost of the finance function ranged from 0.57% to 2.13% of revenue. Broken out by vertical, the range varied significantly, both within and across verticals, with the main difference between the lows and highs attributed to the "drive" of the leadership team to achieve results; the inference being that real ongoing improvements need management focus to make them happen.
Driving margin improvements on a general basis, as we all know from budget preparation, is not easy; we continually strive for productivity improvements, and any gain within the 0.57% to 2.13% range above is always welcome. Yet the opportunity for real change lies not here but in the management of gross margins, releasing locked profits for value creation. Domain users know where to look, but the information needed to drive changes often sits in sub-systems, simply not accessible enough on a repeatable and auditable basis for ongoing agile management. It is worth emphasising again that inter-departmental process structures can also hinder progress in this area.
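The scale argument behind this point can be made with simple arithmetic. Using the survey range cited above and an assumed revenue figure (purely illustrative), even the theoretical maximum saving from the finance function is capped, whereas gross margin releases are not tied to any such ceiling.

```python
# Illustrative arithmetic only; the revenue figure is an assumption.
revenue = 100_000_000  # assumed annual revenue

# Ceiling on finance-function savings: moving from the survey's high
# (2.13% of revenue) all the way to its low (0.57%).
finance_cost_high = revenue * 0.0213
finance_cost_low = revenue * 0.0057
max_finance_saving = finance_cost_high - finance_cost_low  # ~1.56% of revenue

# A three-point gross margin release, by comparison, already exceeds
# that best-case ceiling, and is not bounded by any benchmark.
gross_margin_gain = revenue * 0.03

print(f"Theoretical cap on finance-function saving: {max_finance_saving:,.0f}")
print(f"Three-point gross margin release:           {gross_margin_gain:,.0f}")
```

The comparison is deliberately rough, but it shows why the article argues the larger prize sits in gross margin management rather than in squeezing the finance function further.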
Web enrichment of a process continues to gather momentum. Easy examples might be in recruitment or "on the fly" logistics pricing for deliveries, but the use cases are broad, with significant potential. Because processtech processes are defined in an ultra-granular manner, subsequently incorporating web services to enhance them is easy to achieve, further knocking out friction in inwardly or outwardly focused activity.
New tech will also continue to emerge on a constant basis. Current buzzwords describing the latest technologies seeing early deployment by a limited few include artificial intelligence (AI), machine learning (ML), blockchain, and the internet of things (IoT). As with web enrichment, and if they make sense for you, they can be added to a process, noting that they may themselves need detailed management through a specific user interface (UI). For example, when adding AI the algorithm will need management in the form of fine tuning: selecting the probabilistic variables that actually influence outcomes, using the right regression type, or even adding some limited rules to finesse outcomes for the circumstance in question, noting that rules are not as easily scalable as probabilistic formulae.
Technology improvements come fast and frequently, and sometimes, for certain functional areas, a step change is reached, as it has been now, enabling an organisation to be more productive for value creation. Finding that initial starting point is not easy, but at least today any x-application process undertaken for decision support that impacts net or gross margins, plus segmental reporting, can be tackled iteratively to add immediate value. It will require your management focus to drive deeper results, but it is nevertheless a game changer!