A Reference Architecture for The Open Banking Standard..

This is the second in a series of four posts on the Open Banking Standard (OBS) in the UK. It briefly looks at the strategic drivers for banks and proposes an architectural approach incumbents can use to evolve their platforms towards OBS compliance. The next post will examine the overall data layer implications, and the final post will look at the key strategic levers and possible business models the standard could help banks drive innovation towards.

Introduction…

The Open Banking Standard will steward the development of layers of guidelines (API interoperability standards, data security & privacy, and governance) which primarily deal with data sharing in banking. The belief is that this regulation will ultimately spur open competition and unlock innovation. For years, the industry has grappled with fundamental platform issues that are native to every domain of banking. These include systems that are siloed by function, platforms that are inflexible in responding to rapidly changing market conditions & consumer tastes, and Bank IT that is perceived by the business to be glacially slow in responding to its needs.

The Open Banking Standard (OBS) represents a vast opportunity for banking organizations in multiple ways. First, Bank IT can use the regulatory mandate to steadily re-architect hitherto inflexible and siloed business systems. Second, doing so will enable Banks to significantly monetize their vast data resources in several key business areas.

This status quo will need to change with the introduction of the Open Banking Standard. Banks that do not change will not be able to derive and sustain a competitive advantage. PSD2 compliance (the second Payment Services Directive), which will be mandated by the EU, is one of the first layers in the OBS. Further layers will include API standards definitions for business processes (e.g. View Account, Transfer Funds, Chargebacks, Dispute Handling etc).

The OBWG (Open Banking Working Group) standards include the following key constituencies & their requirements [1] – 

1. Customers: defined as account holders & businesses who agree to share their data, and any publishers who share open datasets

2. Data attribute providers: defined as banks & other financial services providers whose customers produce data as part of daily banking activities 

3. Third parties: interested developers, financial services startups (aka FinTechs), and any organisations (e.g. Retail Merchants) who can leverage the data to provide new views & products

It naturally follows from the above, that the key technical requirements of the framework will include:

1. A set of Data elements, API definitions and Security Standards to provide both data security and a set of access restrictions 

2. A Governance model, a body which will develop & oversee the standards 

3. Developer resources, which will enable third parties to discover the APIs, educate themselves and experiment.

The Four Strategic Drivers in the Open Bank Standard …

Clearly, how intelligently a firm harnesses technology (in pursuit of OBS compliance goals) will determine its overall competitive advantage. This is important to note since a range of players across the value chain (the Third Parties designated by the standard) can now obtain seamless access to a variety of data. Once obtained, that data can be reimagined in manifold ways. For example, Third Parties can help consumers make better personal financial decisions, at the expense of the Banks that own the data. FinTechs have generally been able to make more productive use of client data: they provide clients with intuitive access to cross asset data, tailor algorithms based on behavioral characteristics, and deliver a more engaging and unified experience.

So, the four strategic business goals that OBS compliant architectures need to address in the long run are –

  1. Digitize The Customer Journey –  Bank clients who use services like Uber, Zillow, Amazon etc in their daily lives are now very vocal in demanding a seamless experience across all of their banking services using digital channels. The vast majority of Bank applications still lag the innovation cycle, are archaic & are separately managed. The net issue is that the client is faced with distinct user experiences ranging from client onboarding to servicing to transaction management. Such applications need to provide anticipatory or predictive capabilities at scale while understanding each customer's lifestyle, financial needs & behavioral preferences.
  2. Provide Improved Access to Personal Financial Management & Improved Lending Processes  –  Provide consumers with a single aggregated picture of all their accounts. Also improve lending systems by providing more efficient access to loans by incorporating a large amount of contextual data in the process.
  3. Automate Back & Mid Office Processes Across Lending, Risk, Compliance & Fraud – The need to forge a closer banker/client experience is not just driving demand around data silos & streams themselves but also forcing players to move away from paper based models to a seamless, digital & highly automated model, reworking a large number of existing back & front office processes. These processes range from risk data aggregation, supranational compliance (AML, KYC, CRS & FATCA), financial reporting across a range of global regions & Cyber Security. Can the data architectures & the IT systems that leverage them be created in such a way that they permit agility while constantly learning & optimizing their behaviors across national regulations, InfoSec & compliance requirements? Can every piece of actionable data be aggregated, secured, transformed and reported on in such a way that its quality across the entire lifecycle is guaranteed?
  4. Tune Existing Business Models Based on Client Tastes and Feedback – While the initial build out of the core architecture may seem to focus on digitizing interactions and exposing data via APIs, what follows fast is strong predictive modeling capability working at large scale, where systems constantly learn and optimize their interactions, responsiveness & services based on client needs & preferences.

The Key System Architecture Tenets…

The design and architecture of a solution as large and complex as a reference architecture for Open Banking is a multidimensional challenge and it will vary at every institution based on their existing investments, vendor products & overall culture. 

The OBS calls out the following areas of data as being in scope – customer transaction data, customer reference data, aggregated data and sensitive commercial data. A thorough review of the OBWG standard leads one to suggest a logical reference architecture as shown below.

Based on all the above, the Open Bank Architecture shall – 

  • Support an API based model to invoke any business process or data element, subject to appropriate security, by a third party, e.g. a client, an advisor or a business partner
  • Support the development and deployment of an application that encourages a DevOps based approach
  • Support the easy creation of scalable business processes (e.g. client onboarding, KYC, payment dispute checks etc) that natively emit business metrics from the time they are instantiated and throughout their lifecycle (see the sketch after this list)
  • Support automated application delivery, configuration management & deployment
  • Support a high degree of data agility and data intelligence. The end goal being that every customer click, discussion & preference shall drive an analytics infused interaction between the Bank and the client
  • Support algorithmic capabilities that enable the creation of new services like automated (or Robo) advisors
  • Support a very high degree of scale across large numbers of users, interactions & omni-channel transactions while working across global infrastructure
  • Shall support deployment across cost efficient platforms like a public or private cloud. In short, the design of the application shall not constrain the available deployment options – which may vary because of cost considerations. The infrastructure options supported shall range from virtual machines to docker based containers – whether running on a public cloud, private cloud or in a hybrid cloud
  • Support small, incremental changes to business services & data elements based on changing business requirements 
  • Support standardization across application stacks, toolsets for development & data technology to a high degree
  • Shall support the creation of a user interface that is highly visual and feature rich from a content standpoint when accessed across any device
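
As a brief illustration of the tenet above on business processes that natively emit metrics, here is a minimal Python sketch. The process name, step names and metric fields are all hypothetical, chosen purely to show the pattern of instrumenting a process from instantiation through every lifecycle step; in practice the metric events would flow to a telemetry pipeline rather than stdout.

```python
import json
import time
import uuid


class InstrumentedProcess:
    """A hypothetical business process (e.g. client onboarding) that emits
    a metric event for every step it executes, from instantiation onward."""

    def __init__(self, name, metrics_sink=print):
        self.name = name
        self.instance_id = str(uuid.uuid4())
        self.metrics_sink = metrics_sink          # could be a message bus or log shipper
        self._emit("instantiated")

    def _emit(self, step, **extra):
        event = {"process": self.name,
                 "instance": self.instance_id,
                 "step": step,
                 "timestamp": time.time(),
                 **extra}
        self.metrics_sink(json.dumps(event))      # every lifecycle event becomes a metric

    def run_step(self, step_name, work):
        start = time.time()
        result = work()
        self._emit(step_name, duration_secs=round(time.time() - start, 3))
        return result


# Usage: a toy onboarding flow whose steps are all hypothetical.
proc = InstrumentedProcess("client_onboarding")
proc.run_step("kyc_check", lambda: "kyc-passed")
proc.run_step("account_creation", lambda: "account-created")
```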

 

Reference Architecture…

Now that we have covered the business bases, what foundational technology choices can satisfy the above? Let's examine that first at a higher level and then in more detail.

Given the above list of requirements – the application architecture that is a “best fit” is shown below.

OpenBankRefArch_v1

                   Illustration – Candidate Reference Architecture for the Open Bank Standard

Let's examine each of the tiers starting from the lowest –

Infrastructure Layer…

Cloud Computing across its three main delivery models (IaaS, PaaS & SaaS) is largely a mainstream endeavor in financial services and no longer an esoteric adventure only for brave innovators. A range of institutions are either deploying or testing cloud-based solutions that span the full range of cloud delivery models. These capabilities include –

IaaS (infrastructure-as-a-service) to provision compute, network & storage, PaaS (platform-as-a-service) to develop applications, and SaaS (software-as-a-service) to expose the resulting business services via APIs.

Choosing Cloud based infrastructure – whether that is a secure public cloud (Amazon AWS or Microsoft Azure), an internal private cloud (OpenStack etc) or even a hybrid approach – is a safe and sound bet for these applications. Business innovation and transformation are best enabled by a cloud based infrastructure – whether public or private.

 

Data Layer…

Banking data tiers are usually composed of different technologies like RDBMS, EDW (Enterprise Data Warehouses), CMS (Content Management Systems) & Big Data. My recommendation for the OBWG target state, however, is largely dominated by a Big Data platform powered by Hadoop technology. The vast majority of initial applications recommended by the OBWG call for predictive analytics to create tailored Customer Journeys, and Big Data is a natural fit as it is fast emerging as the platform of choice for analytic applications.

Financial services firms specifically deal with manifold data types ranging from Customer Account data, Transaction Data, Wire Data, Trade Data, Customer Relationship Management (CRM), General Ledger and other systems supporting core banking functions. When one factors in social media feeds, mobile clients & other non traditional data types, the challenge is not just one of data volumes but also variety and the need to draw conclusions from fast moving data streams by commingling them with years of historical data.

The reasons for choosing Big Data as the dominant technology in the data tier are listed below (a brief ingestion sketch follows the list) –

  1. Hadoop’s ability to ingest and work with all the above kinds of data & more (using the schema on read method) has been proven at massive scale. Operational data stores are being built on Hadoop at a fraction of the cost & effort involved with older types of data technology (RDBMS & EDW)
  2. The ability to perform multiple types of processing on a given data set. This processing varies across batch, streaming, in memory and realtime which greatly opens up the ability to create, test & deploy closed loop analytics quicker than ever before
  3. The DAS (Direct Attached Storage) model that Hadoop provides fits neatly with the horizontal scale out model that the services, UX and business process tiers leverage. This cuts Capital Expenditure to a bare minimum.
  4. The ability to retain data for long periods of time, thus providing banking applications with predictive models that can reason on historical data
  5. Hadoop provides the ability to run massive volumes of models in a very short amount of time, which helps with modeling automation
  6. Due to its parallel processing nature, Hadoop can run calculations (pricing, risk, portfolio, reporting etc) in minutes versus the hours they took using older technology
  7. Hadoop can work with existing data investments and augment them by handling data ingestion & transformation, leaving EDWs to perform the complex analytics that they excel at – a huge bonus.
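
To make the schema-on-read point above concrete, below is a minimal PySpark sketch. The file paths, field names and the join between a transaction feed and a social feed are hypothetical placeholders; the intent is only to show how a Hadoop based platform ingests varied data without an upfront schema and commingles it into an enriched view.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("obs-data-layer-sketch").getOrCreate()

# Schema-on-read: the JSON structure is inferred at read time, not modeled upfront.
# The paths below are hypothetical placeholders for feeds landed in HDFS/object storage.
transactions = spark.read.json("/data/raw/card_transactions/")
social_feed = spark.read.json("/data/raw/social_mentions/")

# Derive a simple aggregate view from years of transaction history.
spend_by_customer = (transactions
                     .groupBy("customer_id")
                     .sum("amount")
                     .withColumnRenamed("sum(amount)", "total_spend"))

# Commingle it with another data type to enrich the customer view.
enriched = spend_by_customer.join(social_feed, on="customer_id", how="left")
enriched.show(10)
```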

Services Layer…

The overall goals of the services tier are to help design, develop, modify and deploy business components in such a way that overall application delivery follows a continuous delivery/deployment (CI/CD) paradigm. Given that banking platforms are some of the most complex financial applications out there, this also has the ancillary benefit of leaving different teams – digital channels, client onboarding, bill pay, transaction management & mid/back office teams – to develop and update their components largely independently of other teams. Thus a large monolithic enterprise platform is decomposed into its constituent services, which are loosely coupled and each focused on one independent & autonomous business task only. The word 'task' here refers to a business capability that has tangible business value.

A highly scalable, open source & industry leading platform as a service (PaaS) is recommended as the way of building out and hosting banking business applications at this layer. Microservices have moved from the webscale world to fast becoming the standard for building mission critical applications in many industries. Leveraging a PaaS such as OpenShift provides a way to help cut the “technical debt” that has plagued both developers and IT Ops. OpenShift provides the right level of abstraction to encapsulate microservices via its native support for Docker Containers. This also has the concomitant advantage of standardizing application stacks and streamlining deployment pipelines, thus leading the charge to a DevOps style of building applications.

Further, I recommend that service designers take the approach that their microservices can be deployed in a SaaS application format going forward – which usually implies taking an API based approach.

Now, the services tier has the following global responsibilities (a minimal endpoint sketch follows the list) –

  1. Promote a Microservices/SOA style of application development
  2. Support component endpoint invocation via standards based REST APIs
  3. Promote a cloud, OS & development language agnostic style of application development
  4. Promote Horizontal scaling and resilience
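
The sketch below illustrates the microservices style named above: one small, independently deployable service exposing a single business capability over a standards based REST endpoint. The service route, data and response fields are hypothetical and are not drawn from any OBWG-published API definition; a real implementation would follow whatever data formats and security standards the working group finalizes.

```python
from flask import Flask, jsonify

app = Flask(__name__)

# Hypothetical in-memory store standing in for a core banking system of record.
ACCOUNTS = {
    "1001": {"account_id": "1001", "currency": "GBP", "balance": 2450.75},
}


@app.route("/accounts/<account_id>", methods=["GET"])
def view_account(account_id):
    """A single, autonomous business capability: 'View Account'."""
    account = ACCOUNTS.get(account_id)
    if account is None:
        return jsonify({"error": "account not found"}), 404
    return jsonify(account)


if __name__ == "__main__":
    # Each microservice runs in its own container/process and scales horizontally.
    app.run(host="0.0.0.0", port=8080)
```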

Predictive Analytics & Business Process Layer…

Though segments of the banking industry have historically been early adopters of analytics, the areas being targeted by the OBWG – Retail lines of business & Payments – have generally been laggards. However, the large datasets that are prevalent in the Open Bank Standard world, as well as the need to drive customer interactions & journeys, risk & compliance reporting, fraud detection etc., call for a strategic relook at this space.

Techniques like Machine Learning, Data Science & AI feed into core business processes, thus improving them. For instance, Machine Learning techniques support the creation of self improving algorithms which get better with more data, thus making increasingly accurate business predictions. The overarching goal of the analytics tier should therefore be to support a higher degree of automation by working with the business process and services tiers. Predictive Analytics can be leveraged across the value chain of the Open Bank Standard – ranging from new customer acquisition to the customer journey to the back office. More recently these techniques have found increased rates of adoption in enterprise concerns from cyber security to telemetry data processing.
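
As a small, purely illustrative example of such self improving algorithms, the sketch below fits a logistic regression that scores the likelihood of a customer responding to an offer. The features and toy data are invented; a production model would be trained on the large historical datasets described earlier and retrained as new data arrives.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical features per customer: [monthly_logins, avg_balance_k, products_held]
X_train = np.array([[2, 1.5, 1],
                    [25, 12.0, 3],
                    [5, 3.2, 1],
                    [30, 20.0, 4],
                    [1, 0.8, 1],
                    [18, 9.5, 2]])
# 1 = customer responded to a past offer, 0 = did not
y_train = np.array([0, 1, 0, 1, 0, 1])

model = LogisticRegression()
model.fit(X_train, y_train)

# Score a new customer: probability of responding to the next offer.
new_customer = np.array([[12, 6.0, 2]])
print("propensity:", model.predict_proba(new_customer)[0][1])
```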

Another area is improved automation via light weight business process management (BPM). Though most large banks do have pockets of BPM implementations that are adding or beginning to add significant business value, an enterprise-wide re-look at the core revenue-producing activities is called for, as is a deeper examination of driving competitive advantage. BPM now has evolved into more than just pure process management. Meanwhile, other disciplines have been added to BPM — which has now become an umbrella term. These include business rules management, event processing, and business resource planning.

Financial Services firms in general are fertile ground for business process automation, since most of their lines of business are simply a collection of core and differentiated processes. Examples are private banking (with processes including onboarding customers, collecting deposits, conducting business via multiple channels, and compliance with regulatory mandates such as KYC and AML); investment banking (including straight-through-processing, trading platforms, prime brokerage, and compliance with regulation); payment services; and portfolio management (including modeling portfolio positions and providing complete transparency across the end-to-end life cycle). The key takeaway is that driving automation can result not just in better business visibility and accountability on behalf of various actors. It can also drive revenue and contribute significantly to the bottom line.

A business process system should allow an IT analyst, customer or advisor to convey their business process by describing the steps that need to be executed in order to achieve the goal (and the order of those steps, typically using a flow chart). This greatly improves the visibility of business logic, resulting in higher-level and domain-specific representations (tailored to finance) that can be understood by business users and are easier for management to monitor. Again, leveraging a PaaS such as OpenShift in conjunction with an industry leading open source BPMS (Business Process Management System) such as JBoss BPMS provides an integrated BPM capability that can create cloud ready and horizontally scalable business processes.

API & UX Layer…

The API & UX (User Experience) tier fronts humans – clients, business partners, regulators, internal management and other business users – across omnichannel touchpoints. A standards based API tier is provided for partner applications and other non-human actors to interact with the business services tier. Once the OBWG defines the exact protocols, data standards & formats, this should be straightforward to implement.

The API/UX tier has the following global responsibilities (a brief partner-side client sketch follows the list) –

  1. Provide a seamless experience across all channels (mobile, eBanking, tablet etc) in a way that is continuous and non-siloed. The implication is that clients should be able to begin a business transaction in channel A and continue it in channel B where that makes business sense.
  2. Understand client personas and integrate with the business & predictive analytic tier in such a way that the API is loosely yet logically integrated with the overall information architecture
  3. Provide advanced visualization (wireframes, process control, social media collaboration) and cross partner authentication & single sign on
  4. Both the API & UX shall also be designed in such a manner that their design, development & ongoing enhancement lend themselves to an Agile & DevOps methodology.
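
Seen from the consuming side, a third party's interaction with the API tier could look like the sketch below. The base URL, token handling and response shape are entirely hypothetical; the real endpoints, scopes and formats will be whatever the OBWG standards define.

```python
import requests

# Hypothetical base URL and OAuth bearer token obtained out of band.
BASE_URL = "https://api.examplebank.co.uk/open-banking/v1"
TOKEN = "replace-with-access-token"


def fetch_accounts():
    """Call a (hypothetical) accounts endpoint on behalf of an authorised customer."""
    response = requests.get(
        f"{BASE_URL}/accounts",
        headers={"Authorization": f"Bearer {TOKEN}",
                 "Accept": "application/json"},
        timeout=10,
    )
    response.raise_for_status()
    return response.json()


if __name__ == "__main__":
    for account in fetch_accounts().get("accounts", []):
        print(account.get("account_id"), account.get("balance"))
```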

It can all come together…

In most existing Banking systems, siloed functions have led to brittle data architectures operating on custom built legacy applications. This problem is compounded by inflexible core banking systems and exacerbated by a gross lack of standardization in the application stacks underlying capabilities like customer journey, improved lending & fraud detection. These factors inhibit deployment flexibility across a range of platforms, leading to extremely high IT costs and technical debt, and they prevent client facing applications from using data in a manner that constantly & positively impacts the client experience. There is clearly a need to provide an integrated digital experience across a global customer base, and then to offer more intelligent functions based on existing data assets. Current players do possess a huge first mover advantage: they offer highly established financial products across their large (and largely loyal & sticky) customer bases, wide networks of physical locations, and rich troves of data pertaining to customer accounts & demographics. However, it is not enough to just possess the data. They must be able to drive change through legacy thinking and infrastructures as the entire industry struggles to adapt to a major new segment – the millennials – who increasingly use mobile devices and demand more contextual services as well as a seamless, highly analytics driven & unified banking experience, akin to what they commonly experience via the internet at web properties like Facebook, Amazon, Google or Yahoo.

Summary

Technology platforms designed around the four key business needs will create immense operational efficiency, better business models, increased relevance and ultimately drive revenues. These will separate the visionaries and leaders from the laggards in the years to come. The Open Bank Standard will be a catalyst in this immense disruption.

REFERENCES…

[1] The Open Banking Standard –
https://theodi.org/open-banking-standard

The Open Banking Standard – The Five Major Implications for UK Banks..

“Banking as a service has long sat at the heart of our economy. In our digitally enabled world, the need to seamlessly and efficiently connect different economic agents who are buying and selling goods and services, is critical. The Open Banking Standard is a framework for making banking data work better: for customers; for businesses and; for the economy as a whole.” – OBWG (Open Bank Working Group) co-chair and Barclays executive Matt Hammerstein

Introducing Open Banking Standards…

On a global basis, both the Financial Services and the Insurance industry are facing an unprecedented amount of change driven by factors like changing client preferences and the emergence of new technology—the Internet, mobility, social media, etc. These changes are immensely profound, especially with the arrival of the “FinTechs”—technology-driven players whose applications are upending long-standing business models across all sectors from retail banking to wealth management & capital markets. Complement this with a major new segment, the Millennials, who increasingly use mobile devices, demand more contextual services and expect a seamless, unified banking experience—something akin to what they experience on web properties like Facebook, Amazon, Uber, Google or Yahoo, etc. These web scale startups are expanding their wallet share of client revenues by offering contextual products tailored to individual client profiles. Their savvy use of segmentation data and predictive analytics enables the delivery of bundles of tailored products across multiple delivery channels (web, mobile, call center banking, point of sale, ATM/kiosk etc.).

Supranational authorities and national governments in Europe have taken note of the need for erstwhile protected industries like Banking to stay competitive in this brave new world.

With the passage of the second revision of the ground breaking Payment Services Directive (PSD-2), the European Parliament has adopted the legal foundation for the creation of an EU-wide single payments area (SEPA) [1]. While the goal of the PSD is to establish a set of modern, digital industry rules for all payment services in the European Union, it has significant ramifications for the financial services industry as it will surely reshape current business models & foster new areas of competition. While PSD-2 has gotten the lion's share of press interest, the UK government has quietly been working on an initiative to create a standard allowing Banking organizations to share their customer & transactional data with certified third parties via an open API. The outgoing PM David Cameron's government had in fact outlined these plans in the 2015 national budget.

open-bank-project

The EU and the UK governments have recognized that in order for Europe to realize the vision of a single Digital Market, the current system of banking has to change, and they foresee this change being driven by digital technology. The shakeup will happen via the increased competition that results as various financial services are unbundled by innovative developers. To that end, by 2019 all banks should make customer data – their true crown jewels – openly accessible via an open standards based API.

The Open Banking Working Group Standard & API…

The U.K. has been working on an open standard for its financial system for nearly a year. The Open Banking Working Group (OBWG) was created to set standards for how banking data should be created and accessed openly. This initiative covers the following broad areas – Data Standards, API Standards & Security Standards – to protect consumers while spurring innovation via open competition.

Open_Banking_Scope

Illustration: Components of the Open Banking Standard (ref – OBWG Working Group)

Under the Open Banking Standard – expected to be legal reality over the next 2-3 years, any banking customer or authorized 3rd party provider can leverage APIs to gain access to their data and transactions across a whole range of areas ranging from Retail Banking to Business Banking to Commercial Banking.

Open Standards can actually help banks by helping them source data from external providers. For instance, the Customer Journey problem has been an age old issue in banking which has gotten exponentially more complicated over the last five years as the staggering rise of mobile technology and the Internet of Things (IoT) have vastly increased the number of enterprise touch points that customers are exposed to in terms of being able to discover & purchase new products/services. In an OmniChannel world, an increasing number of transactions are being conducted online. In verticals like Retail and Banking, the number of online transactions approaches an average of 40%. Adding to the problem, more and more consumers are posting product reviews and feedback online. Banks thus need to react in realtime to piece together the source of consumer dissatisfaction.  Open Standards will help increase the ability of banks to pull in data from external sources to enrich their limited view of customers.
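
As a toy illustration of that enrichment point, the sketch below merges a bank's internal customer record with attributes sourced from a hypothetical external open dataset. Every field name and value is invented; the point is simply that open standards make the external half of this picture far easier to obtain.

```python
# Internal view held by the bank (hypothetical fields).
internal_view = {
    "customer_id": "C-784",
    "products": ["current_account", "credit_card"],
    "avg_monthly_balance": 1850.00,
}

# Attributes obtained from an external provider via an open, standards-based API.
external_attributes = {
    "customer_id": "C-784",
    "utility_payment_history": "on_time",
    "declared_business_type": "sole_trader",
}

# A single enriched view of the customer, combining both sources.
enriched_view = {**internal_view, **external_attributes}
print(enriched_view)
```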

The Implications of Open Bank Standard…

The five implications of the Open Banking Standard –

  1. Banks will be focused on building platforms that can drive ecosystems of applications around them. Banks have thus far been largely focused on delivering commodity financial services using well understood distribution strategies. Most global banks have armies of software developers but their productivity around delivering innovation has been close to zero. Open APIs will primarily force more thinking around how banking products are delivered to the end consumer. The standards for this initiative are primarily open source in origin and widely accepted across the globe – REST, OAuth etc.
  2. However, it is not a zero sum game; Banks can themselves benefit by building business models around monetizing their data assets, as their distribution channels go global & cost structures change around Open Banking. To that end, existing Digital efforts should be brought in line with the Open Banking Standard. The best retail banks will not only seek to learn from, but sometimes partner with, emerging fintech players to integrate new digital solutions and deliver exceptional customer experience. To cooperate with and take advantage of fintechs, banks will require new partnering capabilities. To heighten their understanding of customers' needs and to deliver products and services that customers truly value, banks will need new capabilities in data management and analytics. Using Open Banking APIs, developers across the world can create applications that offer new services (in conjunction with retailers, for example), aggregate financial information or even help in financial planning. Banks will have interesting choices to make between acting as a Data Producer, Consumer, Aggregator or even a Distributor based on specific business situations.
  3. Regulators will also benefit substantially by using Partner APIs to both access real time reports  & share data across a range of areas. The lack of realtime data access across a range of risk, compliance and cyber areas has been a long standing problem that can be solved by an open standards based API framework [2].  E.g.  Market/Credit/Basel Risk Based Reporting, AML watch list data and Trade Surveillance etc.
  4. Data Architectures are key to the Open Banking Standard – Currently most industry players are woeful at putting together a comprehensive Single View of their Customers (SVC). Due to operational data silos, each department possesses a siloed & limited view of the customer across multiple channels. These views are typically inconsistent, lack synchronization with other departments & miss a high amount of potential cross-sell and up-sell opportunities. Data lakes and realtime data processing techniques will be critical to meeting this immense regulatory requirement.
  5. Despite the promise, large gaps still remain in the Open Banking Standard. Critical areas like project governance and Service Level Agreements (SLAs) for API consumers – in terms of uptime and quality of service – are still left unaddressed.

 Open Banking Standard will spur immense changes..

Even prior to the Open Banking Standard, Banks recognized the need to move to a predominantly online model by providing consumers with highly interactive, engaging and contextual experiences that span multiple channels—branch banking, eBanking, POS, ATM, etc. The business goals are engagement & increasing profitability per customer for both micro and macro customer populations, with the ultimate goal of increasing customer lifetime value (CLV). The Open Banking Standard brings technology approaches to the fore by calling them out as a strategic differentiator. Banks need to move to a fresh business, data and process approach as a way of staying competitive and relevant. Done right, the Open Banking Standard will help the leaders cement their market position.

REFERENCES…

[1] The Open Banking Standard –
https://theodi.org/open-banking-standard

[2]Big Data – Banking’s New Weapon Against Financial Crime – http://www.vamsitalkstech.com/?p=806

The Five Deadly Sins of Financial Services IT..

THE STATE OF GLOBAL FINANCIAL SERVICES IT ARCHITECTURE…

This blog has time & again discussed how Global, Domestic and Regional banks need to be innovative with their IT platform to constantly evolve their product offerings & services. This is imperative due to various business realities –  the increased competition by way of the FinTechs, web scale players delivering exciting services & sharply increasing regulatory compliance pressures. However, systems and software architecture has been a huge issue at nearly every large bank across the globe.

Regulation is also afoot in parts of the globe which will give non traditional banks access to hitherto locked customer data, e.g. PSD-2 in the European Union. Further, banking licenses have been more easily granted to non-banks which are primarily technology pioneers, e.g. PayPal.

It's 2016 and Banks are waking up to the fact that IT Architecture is a critical strategic differentiator. Players that have agile & efficient architecture platforms and practices can not only add new service offerings but are also able to experiment across a range of analytics led offerings that create & support multi-channel experiences. These digital services can now be found abundantly in areas ranging from Retail Banking, Capital Markets, Payments & Wealth Management, especially at the FinTechs.

So, how did we get here…

The Financial Services IT landscape – no matter which segment one picks across the spectrum (Capital Markets, Retail & Consumer Banking, Payment Networks & Cards, Asset Management etc.) – is largely predicated on a few legacy anti-patterns. These anti-patterns have evolved over the years from a systems architecture, data architecture & middleware standpoint.

These anti-patterns have resulted in a mishmash of organically developed & shrink wrapped systems that do everything from running critical Core Banking Applications to Trade Lifecycle to Securities Settlement to Financial Reporting etc. Each of these systems operates in an application, workflow and data silo with its own view of the enterprise. These are all kept in sync largely via data replication & stove piped process integration.

If this sounds too abstract, let us take an example & a rather topical one at that. One of the most critical back office functions every financial services organization needs to perform is Risk Data Aggregation & Regulatory Reporting (RDARR). This spans areas from Credit Risk, Market Risk, Operational Risk, Basel III, Solvency II etc. – the list goes on.

The basic idea in any risk calculation is to gather a whole range of quality data in one place and to run computations to generate risk measures for reporting.
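
That basic idea can be sketched in a few lines: pull transaction-level records together and compute an aggregate measure over them. The data, column names and the notional "exposure by counterparty" measure below are hypothetical, shown only to ground the description that follows.

```python
import pandas as pd

# Hypothetical transaction-level feed gathered from multiple source systems.
trades = pd.DataFrame({
    "counterparty": ["ACME", "ACME", "GLOBEX", "INITECH", "GLOBEX"],
    "asset_class":  ["FX", "Equity", "FX", "Rates", "Equity"],
    "exposure":     [1.2e6, 0.4e6, 2.5e6, 0.9e6, 1.1e6],
})

# A simple risk measure: total exposure per counterparty, largest first.
exposure_by_cpty = (trades.groupby("counterparty")["exposure"]
                          .sum()
                          .sort_values(ascending=False))
print(exposure_by_cpty)
```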

So, how are various risk measures calculated currently? 

Current Risk Architectures are based on traditional relational database (RDBMS) architectures with tens of feeds from Core Banking Systems, Loan Data, Book Of Record Transaction Systems (BORTS) like Trade & Position Data (e.g. Equities, Fixed Income, Forex, Commodities, Options etc), Wire Data, Payment Data, Transaction Data etc.

These data feeds are then tactically placed in memory caches or in enterprise data warehouses (EDW). Once the data has been extracted, it is transformed using a series of batch jobs which prepare it for the Calculator Frameworks that run the risk models on it.

All of the above need access to large amounts of data at the individual transaction level. The Corporate Finance function within the Bank then makes end of day adjustments to reconcile all of this data, and these adjustments need to be cascaded back to the source systems, down to individual transactions or classes of transactions.

These applications are then typically deployed on clusters of bare metal servers that are not particularly suited to portability, automated provisioning, patching & management. In short, nothing that can automatically be moved over at a moment's notice. These applications also run on legacy proprietary technology platforms that do not lend themselves to a flexible, DevOps style of development.

Finally, there is always a need for statistical frameworks to make adjustments to customer transactions that somehow need to get reflected back in the source systems. All of these frameworks need to have access to, and an ability to work with, terabytes (TBs) of data.

Each of the above mentioned risk work streams has corresponding data sets, schemas & event flows that it needs to work with, with different temporal needs for reporting: some need to be run a few times a day (e.g. Traded Credit Risk), some daily (e.g. Market Risk) and some at the end of the week (e.g. Enterprise Credit Risk).

Five_Deadly_Sins_Banking_Arch

                          Illustration – The Five Deadly Sins of Financial IT Architectures

Let us examine why this is so, in the context of the anti-patterns proposed below –

THE FIVE DEADLY SINS…

The key challenges with current architectures –

  1. Utter, total and complete lack of centralized Data leading to repeated data duplication  – In the typical Risk Data Aggregation application – a massive degree of Data is duplicated from system to system leading to multiple inconsistencies at the summary as well as transaction levels. Because different groups perform different risk reporting functions (e.g Credit and Market Risk) – the feeds, the ingestion, the calculators end up being duplicated as well. A huge mess, any way one looks at it. 
  2. Analytic applications which are not designed for throughput – Traditional Risk algorithms cannot scale with this explosion of data as well as the heterogeneity inherent in reporting across multiple kinds of risks. E.g. certain kinds of Credit Risk need access to around 200 days of historical data to assess the probability of the counterparty defaulting & to obtain a statistical measure of the same. Such calculations are highly computationally intensive and can run for days.
  3. Lack of Application Blueprint, Analytic Model & Data Standardization – There is nothing that is either SOA or microservices-like and that precludes best practice development & deployment. This only leads to maintenance headaches. Cloud Computing enforces standards across the stack. Areas like Risk Model and Analytic development needs to be standardized to reflect realities post BCBS 239. The Volcker Rule aims to ban prop trading activity on part of the Banks. Banks must now report on seven key metrics across 10s of different data feeds across PB’s of data. Most cannot do that without undertaking a large development and change management headache.
  4. Lack of Scalability – It must be possible to operate the platform as a central system that can scale to carry the full load of the organization and operate with hundreds of applications built by disparate teams, all plugged into the same central nervous system. One other factor to consider is the role of cloud computing in customer retention efforts. The analytical computational power required to understand insights from gigantic data sets is costly to maintain on an individual basis. The traditional owned data center will probably not disappear, but banks need to be able to leverage the power of the cloud to perform big data analysis in a cost-effective manner.
  5. A Lack of Deployment Flexibility –  The application & data requirements dictate the deployment platforms. This massive anti pattern leads to silos and legacy OS’s that can not easily be moved to Containers like Docker & instantiated by a modular Cloud OS like OpenStack.

THE BUSINESS VALUE DRIVERS OF EFFICIENT ARCHITECTURES …

Doing IT Architecture right, and in a manner responsive to the business, results in the following critical value drivers being met & exceeded –

  1. Effective compliance with increased regulatory risk mandates ranging from Basel III, FRTB and Liquidity Risk – which demand flexibility of all the different traditional IT tiers.
  2. An ability to detect and deter fraud – Anti Money Laundering (AML) and Retail/Payment Card Fraud etc
  3. Fend off competition from the FinTechs
  4. Exist & evolve in a multichannel world dominated by the millennial generation
  5. Reduced costs to satisfy pressure on the Cost to Income Ratio (CIR)
  6. The ability to open up data & services that operate on the customer data to other institutions

A uniform architecture that works across all of these various workloads would seem a commonsense requirement. However, this is a major problem for most banks. Forward looking approaches that draw heavily from microservices based application development, Big Data enabled data & processing layers, the adoption of Message Oriented Middleware (MOM) & a cloud native approach to developing applications (PaaS) & deployment (IaaS) are the solution to the vexing problem of inflexible IT.

The question is whether banks can change before they see a perceptible drop in revenues over the years.

Deter Financial Crime by Creating an Effective Anti Money Laundering (AML) Program…(1/2)

THE AML CHALLENGE CONTINUES UNABATED…

As this blog has repeatedly catalogued over the last year here [1], here [2] and here [3], Money Laundering is a massive global headache and one of the biggest crimes against humanity. Not a month goes by when we do not hear of billions of dollars in ill gotten funds being stolen from developing economies via corruption or the proceeds of other nefarious activities – whether it is the Panama Papers or banks unwittingly helping drug cartels launder money.

I have seen annual estimates of global money laundering flows ranging anywhere from $1 trillion to $2 trillion – almost 5% of global GDP. Almost all of this is laundered via Retail & Merchant Banks, Payment Networks, Securities & Futures firms, Casino Services & Clubs etc – which explains why annual AML related fines on Banking organizations run into the billions and increase every year. However, the number of SARs (Suspicious Activity Reports) filed by banking institutions is much higher as a category compared to the numbers filed by these other businesses.

The definition of Financial Crime is fairly broad & encompasses traditional money laundering activity, financial fraud like identity theft/check fraud/wire fraud, terrorist activity, tax evasion, securities market manipulation, insider trading and other kinds of securities fraud. Financial institutions across the spectrum of the market now need to comply with the regulatory mandate at both the global as well as the local market level.

What makes AML such a hard subject for Global Banks which should be innovating quite easily?

The issues which bedevil smooth AML programs include –

  • the complex nature of banking across retail, commercial, wealth management & capital markets; global banks now derive around 40% of revenue from markets outside their traditional strongholds of North America & Western Europe
  • the scale of customer activity ranging from 5 to 50 million at the large global banks
  • a patchwork of local regulations and risk and compliance reporting requirements, e.g. stringent compliance requirements in the US & UK but softer requirements elsewhere
  • tens of distribution channels
  • growing volumes of transactions causing requirements for complex analytics
  • the need to constantly integrate 3rd party information such as lists of politically exposed persons (PEPs) using manual means
  • technology, while ensuring the availability of banking services to millions in underserved populations, also makes it easy for launderers to conduct & mask their activities

The challenges are hard but the costs of non-compliance are severe. Banks have been fined billions of dollars, compliance officers face potential liability & institutional reputation takes a massive hit. Supra national authorities like the United Nations (UN) and the European Union (EU) can also impose sanctions when they perceive that AML violations threaten human rights & the rule of law.

TECHNOLOGY IS THE ANSWER…

Many Banks have already put in place rules, policies & procedures to detect AML violations and have also invested in substantial teams staffed by money laundering risk officers (MLRO) & headed by compliance officers. These rules detect money laundering by flagging transactions whose amounts or patterns breach set thresholds and criteria. The issue with this is that the money launderers themselves operate like statisticians and constantly devise new schemes to hide their tracks.
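
A minimal sketch of this threshold-and-pattern style of rule is shown below, with invented thresholds and transactions. Real AML engines layer hundreds of such rules, watch list checks and behavioral models on top of this basic idea.

```python
# Hypothetical thresholds for two classic rules: large cash deposits and
# "structuring" (repeated deposits just under the reporting threshold).
LARGE_CASH_THRESHOLD = 10_000   # single deposits at or above this are flagged
NEAR_THRESHOLD_FLOOR = 9_000    # deposits in [floor, threshold) count toward structuring
STRUCTURING_COUNT = 3           # this many near-threshold deposits triggers an alert

transactions = [
    {"account": "A1", "type": "cash_deposit", "amount": 12_000},
    {"account": "A2", "type": "cash_deposit", "amount": 9_400},
    {"account": "A2", "type": "cash_deposit", "amount": 9_300},
    {"account": "A2", "type": "cash_deposit", "amount": 9_450},
]


def flag_suspicious(txns):
    alerts = []
    near_threshold_counts = {}
    for t in txns:
        if t["type"] != "cash_deposit":
            continue
        if t["amount"] >= LARGE_CASH_THRESHOLD:
            alerts.append(("large_cash", t["account"], t["amount"]))
        elif t["amount"] >= NEAR_THRESHOLD_FLOOR:
            near_threshold_counts[t["account"]] = near_threshold_counts.get(t["account"], 0) + 1
    for account, count in near_threshold_counts.items():
        if count >= STRUCTURING_COUNT:
            alerts.append(("possible_structuring", account, count))
    return alerts


print(flag_suspicious(transactions))
```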

The various elements that make up the risk to banks and financial institutions and the technology they use to detect these can be broken down into five main areas & work streams as shown below.

AML_Workstreams

                                Illustration: The Five Workstreams of AML programs

  1. Customer Due Diligence  – this involves gathering information from the client as well as on-boarding data from external sources to verify these details and to establish a proper KYC (Know Your Customer) program.
  2. Entity Analysis – identifying relationships between institutional clients as well as retail clients to understand the true social graph (see the sketch after this list). Bank compliance officers have now gone beyond KYC (Know Your Customer) to know their customer's customer, or KYCC [4]
  3. Downstream Analytics – detecting advanced patterns of behavior among clients & the inter-web of transactions with a view to detecting hidden patterns of money laundering. This also involves assessing client risk during specific points in the banking lifecycle, such as account opening, transactions above a certain monetary value. These data points could signal potentially illegitimate activity based on any number of features associated with such transactions. Any transaction could also lead to the filing of a suspicious activity report (SAR)
  4. Ongoing Monitoring  – Help aggregate such customer transactions across multiple geographies for pattern detection and reporting purposes. This involves creating a corporate taxonomy of rules that capture a natural language description of the conditions, patterns denoting various types of financial crimes – terrorist financing, mafia laundering, drug trafficking, identity theft etc.
  5. SAR Investigation Lifecycle – These rules trigger downstream workflows to allow human investigation on such transactions
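
A toy illustration of the Entity Analysis workstream (item 2 above) is sketched below using a small relationship graph built with networkx; the entities and relationships are invented. Real implementations resolve entities across millions of accounts, beneficial owners and transactions.

```python
import networkx as nx

# Hypothetical relationships: account ownership, directorships, wire transfers.
G = nx.Graph()
G.add_edge("Customer: J. Smith", "Account: 889-23", relation="owns")
G.add_edge("Account: 889-23", "Shell Co: Oceanic Ltd", relation="wires_to")
G.add_edge("Shell Co: Oceanic Ltd", "Customer: R. Jones", relation="director_of")
G.add_edge("Customer: R. Jones", "Account: 114-07", relation="owns")

# The "true social graph": every entity reachable from a client under review.
cluster = nx.node_connected_component(G, "Customer: J. Smith")
print("Entities connected to J. Smith:", sorted(cluster))

# Which entities sit between parties and may warrant KYCC-style review?
print("Degree centrality:", nx.degree_centrality(G))
```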

QUANTIFIABLE BENEFITS FROM DOING IT WELL…

Financial institutions that leverage new Age technology (Big Data, Predictive Analytics, Workflow) in these five areas will be able to effectively analyze financial data and deter potential money launderers before they are able to proceed, providing the institution with protection in the form of full compliance with the regulations.

The business benefits include –

  • Detect AML violations on a proactive basis thus reducing the probability of massive fines 
  • Save on staffing expenses for Customer Due Diligence (CDD)
  • Increase accurate production of suspicious activity reports (SAR)
  • Decrease the percent of corporate customers with AML-related account closures in the past year by customer risk level and reason – thus reducing loss of revenue
  • Decrease the overall KYC profile backlog across geographies
  • Help create Customer 360 views that can help accelerate CLV (Customer Lifetime Value) as well as Customer Segmentation from a cross-sell/up-sell perspective

CONCLUSION…

Virtually every leading banking institution, securities firm, payment provider understands that they need to enhance their AML capabilities by a few notches and also need to constantly evolve them as fraud itself morphs.

The question is whether they can form a true picture of their clients (both retail and institutional) on a realtime basis, monitor every banking interaction while understanding its true context when merged with historical data, and detect unusual behavior. Further, creating systems that learn from these patterns truly helps minimize money laundering.

The next and final post in this two part series will examine how Big Data & Analytics help with each of the work streams discussed above.

REFERENCES…

[1] Building AML Regulatory Platforms for the Big Data Era – http://www.vamsitalkstech.com/?p=5

[2]Big Data – Banking’s New Weapon Against Financial Crime – http://www.vamsitalkstech.com/?p=806

[3] Reference Architecture for AML – http://www.vamsitalkstech.com/?p=833

[4] WSJ – Know Your Customer’s Customer is the New Norm – http://blogs.wsj.com/riskandcompliance/2014/10/02/the-morning-risk-report-know-your-customers-customer-is-new-norm/

Demystifying Digital – Why Customer 360 is the Foundational Digital Capability – ..(1/3)

The first post in this three part series on Digital Foundations introduces the concept of Customer 360 or Single View of Customer (SVC). We will discuss the need for & the definition of the SVC as part of the first step in any Digital Transformation endeavor. We will also discuss specific benefits from both a business & operational state that are enabled by SVC. The second post in the series introduces the concept of a Customer Journey. The third & final post will focus on a technical design & architecture needed to achieve both these capabilities.
 
In an era of exploding organizational touch points, how many companies can truly claim that they know & understand their customers, their needs & evolving preferences deeply and from a realtime perspective?  
How many companies can claim to keep up as a customer's product & service usage matures, and keep them engaged by cross selling new offerings? How many can accurately predict future revenue from a customer based on their current understanding of their profile?
The answer is not at all encouraging.
Across industries like Banking, Insurance, Telecom & Manufacturing, the ability to get a unified view of the customer & their journey is at the heart of the enterprise's ability to promote relevant offerings & detect customer dissatisfaction.
  • Currently most industry players are woeful at putting together this comprehensive Single View of their Customers (SVC). Due to operational silos, each department possesses a siloed & limited view of the customer across multiple channels. These views are typically inconsistent, lack synchronization with other departments & miss a high amount of potential cross-sell and up-sell opportunities.
  • The Customer Journey problem has been an age old issue which has gotten exponentially more complicated over the last five years as the staggering rise of mobile technology and the Internet of Things (IoT) have vastly increased the number of enterprise touch points that customers are exposed to in terms of being able to discover & purchase new products/services. In an OmniChannel world, an increasing number of transactions are being conducted online. In verticals like Retail and Banking, the number of online transactions approaches an average of 40%. Adding to the problem, more and more consumers are posting product reviews and feedback online. Companies thus need to react in realtime to piece together the source of consumer dissatisfaction.
Another large component of customer outreach is Marketing analytics & the ability to run effective campaigns to recruit customers.

The most common questions that a lot of enterprises fail to answer accurately are –

  1. Is the Customer happy with their overall relationship experience?
  2. What mode of contact do they prefer? And at what time? Can Customers be better targeted at these channels at those preferred times?
  3. What is the overall Customer Lifetime Value (CLV) or how much profit we are able to generate from this customer over their total lifetime?
  4. By understanding CLV across populations, can we leverage that to increase spend on marketing & sales for products that are resulting in higher customer value?
  5. How do we increase cross sell and up-sell of products & services?
  6. Does this customer fall into a certain natural segment and if so, how can we acquire most customers like them?
  7. Can different channels (Online, Mobile, IVR & POS) be synchronized ? Can Customers begin a transaction in one channel and complete it in any of the others without having to resubmit their data?

The first element in Digital is Customer Centricity & it naturally follows that a 360 degree view is a huge aspect of that.

Customer360

                                       Illustration – Customer 360 view & its benefits

So what information is specifically contained in a Customer 360?

The 360 degree view is a snapshot of the below types of data (a minimal sketch follows the list) –

  • Customer’s Demographic information – Name, Address, Age etc
  • Length of the Customer-Enterprise relationship
  • Products and Services purchased overall
  • Preferred Channel & time of Contact
  • Marketing Campaigns the customer has responded to
  • Major Milestones in the Customer's relationship
  • Ongoing activity – Open Orders, Deposits, Shipments, Customer Cases etc
  • Ongoing Customer Lifetime Value (CLV) Metrics and the Category of customer (Gold, Silver, Bronze etc)
  • Any Risk factors – Likelihood of Churn, Customer Mood Alert, Ongoing issues etc
  • Next Best Action for Customer
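
Those data points can be sketched as a single record type; a minimal Python dataclass version is shown below. The fields simply mirror the list above and every value in the example is invented.

```python
from dataclasses import dataclass, field
from typing import List, Optional


@dataclass
class Customer360:
    """A single, consolidated view of one customer (illustrative fields only)."""
    customer_id: str
    name: str
    age: int
    relationship_length_years: float
    products: List[str] = field(default_factory=list)
    preferred_channel: Optional[str] = None
    preferred_contact_time: Optional[str] = None
    campaigns_responded_to: List[str] = field(default_factory=list)
    open_activity: List[str] = field(default_factory=list)
    lifetime_value: float = 0.0
    tier: str = "Bronze"                 # Gold / Silver / Bronze
    churn_risk: float = 0.0              # 0.0 (low) .. 1.0 (high)
    next_best_action: Optional[str] = None


# A hypothetical, fully invented example record.
customer = Customer360(
    customer_id="C-784",
    name="Jane Doe",
    age=34,
    relationship_length_years=6.5,
    products=["current_account", "mortgage"],
    preferred_channel="mobile",
    lifetime_value=18_400.0,
    tier="Silver",
    churn_risk=0.12,
    next_best_action="offer_travel_insurance",
)
print(customer)
```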

How can Big Data technology help?

Leveraging the ingestion and predictive capabilities of a Big Data based platform, banks can provide a user experience that rivals Facebook, Twitter or Google and provide a full picture of the customer across all touch points.

Big Data enhances the Customer 360 capability in the following ways (a brief segmentation sketch follows the list) –

  1. Obtaining a realtime Single View of the Customer (typically a customer across multiple channels, product silos & geographies) across years of account history 
  2. Customer Segmentation by helping businesses understand customer segments down to the individual level as well as at a segment level
  3. Performing Customer sentiment analysis by combining internal organizational data, clickstream data, sentiment analysis with structured sales history to provide a clear view into consumer behavior.
  4. Product Recommendation engines which provide compelling personal product recommendations by mining realtime consumer sentiment, product affinity information with historical data.
  5. Market Basket Analysis, observing consumer purchase history and enriching this data with social media, web activity, and community sentiment regarding past purchase and future buying trends.
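
To illustrate the segmentation capability above, the short sketch below clusters customers into segments with k-means. The two features and the toy values are hypothetical; real segmentation would run over far richer behavioral and transactional features held in the data lake.

```python
import numpy as np
from sklearn.cluster import KMeans

# Hypothetical features per customer: [annual_spend_k, digital_logins_per_month]
customers = np.array([
    [2.1, 1], [2.5, 2], [3.0, 1],        # low-spend, branch-leaning
    [18.0, 25], [20.5, 30], [17.2, 28],  # high-spend, digital-first
    [9.0, 10], [8.5, 12],                # middle segment
])

kmeans = KMeans(n_clusters=3, n_init=10, random_state=42)
segments = kmeans.fit_predict(customers)

for features, segment in zip(customers, segments):
    print(f"customer {features} -> segment {segment}")
```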

Customer 360 can help improve the following operational metrics of a Retailer or a Bank or a Telecom immensely.

  1. Cost to Income ratio; Customers Acquired per FTE; Sales and service FTE’s (as percentage of total FTE’s), New Accounts Per Sales FTE etc
  2.  Sales conversion rates across channels, Decreased customer attrition rates etc.
  3. Improved Net Promoter Scores (NPS), referral based sales etc

Customer 360 is thus a basic digital capability every organization needs to offer their customers, partners & internal stakeholders. This implies a re-architecture of both data management and business process automation.

The next post will discuss the second critical component of Digital Transformation – the Customer Journey.

Embedding A Culture of Business Analytics into the Enterprise DNA..

“IT driven business transformation is always bound to fail” – Amber Storey, Sr Manager, Ernst & Young

The value of Big Data driven Analytics is no longer in question, from both a customer as well as an enterprise standpoint. Lack of investment in an analytics strategy has the potential to impact shareholder value negatively. Business Boards and CXOs are now concerned about the overall level and maturity of their investments in terms of business value – i.e. increasing sales, driving down business & IT costs & helping create new business models. It is thus an increasingly accurate argument that smart applications, & the ecosystems built around them, will dictate enterprise success.

Such examples among forward looking organizations abound across industries. These range from realtime analytics in manufacturing using IoT data streams across the supply chain, to the use of natural language processing to drive patient care decisions in healthcare, to more accurate insurance fraud detection & driving Digital interactions in Retail Banking, to name a few.

However, most global organizations currently adopt a fairly tactical approach to ensuring the delivery of traditional business intelligence (BI) and predictive analytics to their application platforms. This departmental approach is quite suboptimal in many ways, as scalable data driven decisions & culture not only empower decision-makers with up to date and realtime information but also help them develop long term insights into how globally diversified business operations are performing. Scale is the key word here due to rapidly changing customer trends, partner & supply chain realities & regulatory mandates.

Scale implies speed of learning,  business agility across the organization in terms of having globally diversified operations turn on a dime thus ensuring that the business feels empowered.

A quick introduction to Business (Descriptive & Predictive) Analytics –

Business intelligence (BI) is a traditional & well established analytical domain that essentially takes a retrospective look at business data in systems of record. The goal of BI is primarily to look for macro or aggregate business trends across different aspects or dimensions such as time, product lines, business units & operating geographies.

BI is primarily concerned with “What happened and what trends exist in the business based on historical data?“. The typical use cases for BI include budgeting, business forecasts, reporting & key performance indicators (KPI).

On the other hand, Predictive Analytics (a subset of Data Science) augments & builds on the BI paradigm by adding a “What could happen” dimension to the data in terms of –

  • being able to probabilistically predict different business scenarios across thousands of variables
  • suggesting specific business actions based on the above outcomes

Predictive Analytics does not intend to replace the BI domain, nor will it; rather, it adds significant business capabilities that lead to overall business success. It is not uncommon to find real world business projects leveraging both analytical approaches.
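A minimal, hypothetical sketch of the two approaches side by side: the first half answers a BI-style "what happened" question with a simple aggregate, while the second fits a basic churn model to suggest "what could happen". The dataset, feature names and churn label are illustrative assumptions only.

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression

# Hypothetical per-customer history
df = pd.DataFrame({
    "region":        ["EMEA", "EMEA", "APAC", "APAC", "AMER", "AMER"],
    "monthly_spend": [120.0, 80.0, 200.0, 60.0, 150.0, 90.0],
    "logins":        [12, 3, 20, 2, 15, 4],
    "churned":       [0, 1, 0, 1, 0, 1],
})

# BI / descriptive: what happened, aggregated along a business dimension
print(df.groupby("region")["monthly_spend"].sum())

# Predictive: what could happen - probability that a customer churns
model = LogisticRegression().fit(df[["monthly_spend", "logins"]], df["churned"])
new_customer = pd.DataFrame({"monthly_spend": [100.0], "logins": [5]})
print("churn probability:", model.predict_proba(new_customer)[0][1])
```

In practice the descriptive aggregate would run against a warehouse or data lake, while the predictive model would be trained on far larger feature sets, but the division of labor between the two paradigms is the same.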

Creating an industrial approach to analytics – 

Strategic business projects typically treat a BI/Predictive Analytics based approach as an afterthought to the other aspects of system architecture and buildout. This dated approach ensures that analytics remains external to the business system and eventually operates in a purely reactive mode.

Having said that, one does need to recognize that an industrial approach to analytics is a complex endeavor that depends on how an organization tackles the convergence of the below approaches –

  1. Organizational Structure
  2. New Age Technology 
  3. A Platforms Mindset
  4. Culture


        Illustration – Embedding A Culture of Business Analytics into the Enterprise DNA..

Let's discuss them briefly –

Organizational Structure – The historical approach has been to staff analytics teams as a standalone division, often reporting to a CIO. This team has responsibility for both business intelligence and some silo of a data strategy. Such a piecemeal approach to predictive analytics ensures that business & application teams adopt a "throw it over the wall" mentality over time.

So what needs to be done? 

In the Digital Age, enterprises should look to centralize both data management and the governance of analytics as core business capabilities. I suggest a hybrid organizational structure in which a Center of Excellence (COE) is created that reports both to the office of the Chief Data Officer (CDO) and to individual business analytics leaders within the lines of business themselves.

 This should be done to ensure that three specific areas are adequately tackled using a centralized approach- 

  • Investing in creating a data & analytics roadmap by creating a center of excellence (COE)
  • Setting appropriate business milestones with “lines of business” value drivers built into a robust ROI model
  • Managing Risk across the enterprise with detailed scenario planning

New Age Technology –

The onset of Digital Architectures in enterprise businesses implies the ability to drive continuous online interactions with global consumers/customers/clients or patients. The goal is not just to provide engaging visualization but also to personalize services clients care about across multiple modes of interaction. Mobile applications first began forcing enterprises to support multiple channels of interaction with their consumers; Banking, for example, now requires an ability to engage consumers in a seamless experience across an average of four to five channels – Mobile, eBanking, Call Center, Kiosk etc. We have seen how exploding data generation across the global economy has become a clear & present business & IT phenomenon. Data volumes are rapidly expanding across industries; it is not just the production of data that has increased, but also the need for organizations to derive business value from it. This calls for the collection & curation of data from dynamic and highly distributed sources such as consumer transactions, B2B interactions, machines such as ATMs & geo location devices, click streams, social media feeds, server & application log files and multimedia content such as videos – using Big Data.
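As an illustration of what such collection & curation might look like on a Big Data platform, here is a small, hypothetical PySpark sketch that lands two distributed event feeds (clickstream and ATM events) into a single interactions view and aggregates it by channel. The paths, column names and schema are assumptions made purely for the example.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Hypothetical landing paths for two raw feeds in a data lake
CLICKSTREAM_PATH = "/datalake/raw/clickstream/*.json"
ATM_EVENTS_PATH = "/datalake/raw/atm_events/*.json"

spark = SparkSession.builder.appName("curate-interaction-data").getOrCreate()

# Assumed record shape: {"customer_id": ..., "channel": "mobile", "ts": ...}
clicks = spark.read.json(CLICKSTREAM_PATH)
# Assumed record shape: {"customer_id": ..., "channel": "atm", "ts": ...}
atm = spark.read.json(ATM_EVENTS_PATH)

# Curate the two distributed sources into one unified interactions view
interactions = clicks.select("customer_id", "channel", "ts").unionByName(
    atm.select("customer_id", "channel", "ts")
)

# Simple business view: interaction volume per channel
interactions.groupBy("channel").agg(F.count("*").alias("events")).show()
```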

Cloud Computing is the ideal platform to provide the business with self service as well as rapid provisioning of business analytics. Every new application designed needs to be cloud native from the get go.

A Platforms Mindset – 

As opposed to building standalone or one-off business applications, a Platforms Mindset is a more holistic approach capable of producing higher revenues. Platforms abound in the webscale world at shops like Apple, Facebook & Google. Applications are constructed like Lego blocks and they reuse customer & interaction data to drive cross-sell and up-sell among different product lines. The key is to start off with products that have high customer attachment & retention. While increasing brand value, it is also key to ensure that customers & partners can collaborate on improving the various applications hosted on top of the platform.

Culture – Business value fueled by analytics is only possible if the entire organization operates on an agile basis in order to collaborate across the value chain. Cross functional teams across new product development, customer acquisition & retention, IT Ops, legal & compliance must collaborate in short work cycles to close the traditional business & IT innovation gap. Methodologies like DevOps – whose chief goal is to close the long-standing gap between the engineers who develop and test IT capability and the organizations that are responsible for deploying and maintaining IT operations – must be adopted. Using traditional app dev methodologies, it can take months to design, test and deploy software. No business today has that much time—especially in the age of IT consumerization and end users accustomed to smartphone apps that are updated daily. The focus now is on rapidly developing business applications to stay ahead of competitors that can better harness Big Data's amazing business capabilities.

Summary- 

Enterprise wide business analytics approaches designed around the four key prongs (Structure, Culture, Technology & Platforms) will create immense operational efficiency, better business models, increased relevance and ultimately drive revenues. These will separate the visionaries and leaders from the laggards in the years to come.

What Lines Of Business Want From IT..


                    Illustration: Business- IT Relationship (Image src – Pat.it)

Previous posts in this blog have discussed the fact that technological capabilities now make or break business models. It is critical for IT to operate in a manner that maximizes their efficiency while managing costs & ultimately delivering the right outcomes for the organization.

It is clear and apparent to me that the relationship lines of business (LOBs) have with their IT teams – typically central & shared – is completely broken at a majority of large organizations. Each side cannot seem to appreciate either the perspective or the passions of the other. This dangerous dysfunction usually leads to multiple complaints from the business. Examples include –

  • IT is perceived to be glacially slow in providing infrastructure needed to launch new business initiatives or to amend existing ones. This leads to the phenomenon of ‘Shadow IT’ where business applications are  run on public clouds bypassing internal IT
  • Something seems to be lost in translation while conveying requirements to different teams within IT
  • IT is too focused on technological capabilities – Virtualization, Middleware, Cloud, Containers, Hadoop et al without much emphasis on business value drivers

So what are the top asks that Business has for their IT groups? I wager that there are five important focus areas –

  1. Transact in the language of the business – Most would agree that there has been too much of a focus on the technology itself – how it works, what the infrastructure requirements are to host applications (cloud or on-prem), which data engines to ingest and process the data, etc. The focus needs to be on customer needs that drive business value for an organization's customers, partners, regulators & employees. Technology at its core is just an engine and does not exist in a vacuum. The most vibrant enterprises understand this ground reality and always ensure that business needs drive IT and not the other way around. It is thus highly important for IT leadership to understand the nuances of the business to ensure that their roadmaps (long and medium term) are being driven with business & competitive outcomes in mind. Examples of such goals are a common organization wide taxonomy across products, customers, logistics, supply chains & business domains. The shared emphasis for both business & IT should be on goals like increased profitability per customer and enhanced segmentation of both micro and macro customer populations, with the ultimate goal of increasing customer lifetime value (CLV).
  2. Bi-Modal or "2 Speed" IT et al need to be business centric – Digital business models that are driving agile web-scale companies offer enhanced customer experiences built on product innovation and data driven business models. They are also encroaching into the domain of established industry players in verticals like financial services, retail, entertainment, telecommunications, transportation and insurance by offering contextual & trendy products tailored to individual client profiles. Their savvy use of segmentation data and realtime predictive analytics enables the delivery of bundles of tailored products across multiple delivery channels (web, mobile, point of sale, Internet, etc.). The enterprise response has been to adopt a model known as Bi-Modal IT, championed by Gartner. This model envisages two different IT camps – one focused on traditional applications and the other focused on innovation. Whatever the moniker for this approach, LOBs need to be involved as stakeholders from the get-go & throughout the process of selecting technology choices that have downstream business ramifications. One of the approaches that is working well is increased cross pollination across both teams, collapsing artificial organizational barriers by adopting DevOps & ensuring that the business has a slim IT component that can rapidly fill gaps in IT's business knowledge or capability.
  3. Self Service Across the board of IT Capabilities – Shadow IT (where business goes around the IT team) is not just an issue with infrastructure software but is slowly creeping up to business intelligence and advanced analytics apps. The delays associated with provisioning legacy data silos combined with using tools that are neither intuitive nor able to scale to deal with the increasing data deluge are making timely business analysis almost impossible to perform.  Insights delivered too late are not very valuable. Thus, LOBs are beginning  to move to a predominantly online SaaS (Software As A Service) model across a range of business intelligence applications. Reports, visual views of internal & external datasets are directly served to internal consumers based on data uploaded into a cloud based BI provider. These reports and views are then directly delivered to end users. IT needs to enable this capability and make it part of their range of offerings to the business.
  4. Help the Business think Analytically – Business Process Automation (BPM) and data driven decision making are proven approaches at data-driven organizations. When combined with Data and Business Analytics, this tends to be a killer combination. Organizations that are data & metric driven are able to define key business processes that provide native support for the key performance indicators (KPIs) that are critical and basic to their functioning. Applications developed by IT need to be designed in such a way that these KPIs can be communicated and broadcast across the organization constantly. Indeed, a high percentage of organizations now have a senior executive in place as the champion for BPM, Business Rules and Big Data driven analytics. These applications are also mobile native, so that field based employees can access them from a variety of mobile platforms outside the corporate firewall.
  5. No "Us vs Them" mentality – it is all "Us" – None of the above is possible unless the entire organization operates on an agile basis in order to collaborate across the value chain. Cross functional teams across new product development, customer acquisition & retention, IT Ops, legal & compliance must collaborate in short work cycles to close the traditional business & IT innovation gap. One of the chief goals of agile methodologies is to close the long-standing gap between the engineers who develop and test IT capability and the business requirements for such capabilities. Using traditional app dev methodologies, it can take months to design, test and deploy software – which is simply unsustainable.

Business & IT need to collaborate. Period. –

The most vibrant enterprises that have implemented web-scale practices not only offer “IT/Business As A Service” but also have instituted strong cultures of symbiotic relationships between customers (both current & prospective), employees , partners and developers etc.

No business today has much time to innovate—especially in the age of IT consumerization, where end users are accustomed to smartphone apps that are often updated daily. The focus now is on rapidly developing business applications to stay ahead of competitors that can better harness technology's amazing business capabilities.

How Robo-Advisors work..(2/3)

"Millennials want 'finance at their fingertips'… they want to be able to email and text their financial advisors and talk to them on a real-time basis." – Greg Fleming, Ex-Morgan Stanley

The first post in this series touched on the fact that Wealth Management has largely been untouched by automation as far as the front office is concerned, and explored the automated "Robo-advisor" movement. Automated investment vehicles have begun changing that trend and are helping create a variety of new business models in the industry. This second post will focus on the overall business model & main functions of a Robo-advisor.

Introduction

FinTechs led by Wealthfront and Betterment have pioneered the somewhat revolutionary concept of Robo-advisors. To define the term – a Robo-advisor is an algorithm based, automated investment advisor that can provide a range of Wealth Management services tailored to a variety of life situations.

Robo-advisors offer completely automated financial planning services. We have seen how the engine of the Wealth Management business is new customer acquisition. The industry is focused on acquiring the millennial or post-millennial HNWI (High Net Worth Investor) generation, and the technology friendliness of this group ensures that they are the primary target market for automated investment advice. That said, anyone who is comfortable using technology and wants lower cost services can benefit from automated investment planning, not just millennials. Leaders in the space such as Wealthfront & Betterment have disclosed that their average investor age is around 35 years. [1]

Robo-advisors provide algorithm-based portfolio management around investment creation, automatic portfolio rebalancing and value added services like tax-loss harvesting, as we will see. The chief investment vehicle of choice seems to be low-cost, passive exchange-traded funds (ETFs).

What are the main WM business models

Currently there are a few different business models that are being adopted by WM firms.

  1. Full service online Robo-advisor that is a 100% automated without any human element
  2. Hybrid Robo-advisor model being pioneered by firms like Vanguard & Charles Schwab
  3. Pure online advisor that is primarily human in nature

What do Robo-advisors typically do?

The Robo-advisor can optionally be augmented & supervised by a human adviser. At the moment, owing to the popularity of Robo-advisors among younger high net worth investors (HNWI), a range of established players like Vanguard and Charles Schwab, as well as a number of FinTech start-ups, have developed these automated online investment tools or have acquired FinTechs in this space (e.g. BlackRock). The Robo-advisor is typically offered as a free service (below a certain account minimum), typically invests in low-cost ETFs, and is built using digital techniques such as data science & Big Data.


                                  Illustration: Essential functions of a Robo-advisor

The major business areas & client offerings in the Wealth & Asset Management space have been covered in the first post in this series at http://www.vamsitalkstech.com/?p=2329

Automated advisors only cover a subset of all of the above at the moment. The major use cases are as below –

  1. Determine individual Client profiles & preferences – e.g. for a given client profile, determine financial goals, expectations of investment return, desired diversification etc.
  2. Identify appropriate financial products that can be offered either as pre-packaged portfolios or custom investments based on the client profile identified in the first step
  3. Establish the correct Investment Mix for the client's profile – these can include but are not limited to equities, bonds, ETFs & other securities in the firm's portfolios. For instance, placing tax-inefficient assets in retirement accounts like IRAs and tax efficient municipal bonds in taxable accounts.
  4. Using an algorithmic approach, choose the appropriate securities for each client account
  5. Continuously monitor the portfolio & the transactions within it to tune performance, lower transaction costs, manage tax impacts etc. based on how the markets are doing. Also ensure that a client's preferences are being incorporated so that appropriate diversification and risk mitigation is being performed (a minimal rebalancing sketch follows this list)
  6. Provide value added services like Tax loss harvesting to ensure that the client is taking tax benefits into account as they rebalance portfolios or accrue dividends.
  7. Finally, ensure the best user experience by handling a whole range of financial services – trading, account administration, loans, bill pay, cash transfers, tax reporting, statements – in one intuitive user interface.
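To make steps 3 through 6 a little more concrete, below is a minimal, hypothetical Python sketch of threshold-based rebalancing against a target ETF mix, with a naive tax-loss harvesting check. The tickers, weights, thresholds and account values are illustrative assumptions; real Robo-advisors apply considerably more sophisticated optimization, wash-sale checks and tax logic.

```python
# Hypothetical target mix and holdings for one client account
target_weights = {"VTI": 0.50, "VXUS": 0.25, "BND": 0.25}   # low-cost ETFs
holdings = {"VTI": 62_000.0, "VXUS": 20_000.0, "BND": 18_000.0}
cost_basis = {"VTI": 55_000.0, "VXUS": 24_000.0, "BND": 18_500.0}

DRIFT_THRESHOLD = 0.05   # rebalance if a position drifts > 5% from target
LOSS_THRESHOLD = -0.03   # harvest if an unrealized loss exceeds 3%

total = sum(holdings.values())

for ticker, value in holdings.items():
    weight = value / total
    drift = weight - target_weights[ticker]
    if abs(drift) > DRIFT_THRESHOLD:
        # positive drift -> sell down to target; negative -> buy up to target
        trade = -drift * total
        action = "BUY" if trade > 0 else "SELL"
        print(f"{ticker}: {action} {abs(trade):,.0f} to return to target weight")

    unrealized = (value - cost_basis[ticker]) / cost_basis[ticker]
    if unrealized < LOSS_THRESHOLD:
        # naive tax-loss harvest: realize the loss, swap into a comparable fund
        print(f"{ticker}: harvest loss of {value - cost_basis[ticker]:,.0f} "
              f"and replace with a comparable ETF")
```

The drift-threshold rule keeps the account close to the client's chosen risk profile, while the loss check illustrates where tax-loss harvesting would plug into the same monitoring loop.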


                             Illustration: Betterment user interface. Source – Joe Jansen

To illustrate these concepts in action, leaders like Wealthfront & Betterment are increasingly adding features where highly relevant, data-driven advice is provided based on existing data as well as data aggregated from other providers. Wealthfront now provides recommendations on diversification, taxes and fees that are personalized not only to the specific investments in a client's account, but also tailored to their specific financial profile and risk tolerance. For instance, is enough cash being set aside in the emergency fund? Is a customer holding too much stock in their employer? [1]

The final post will look at a technology & architectural approach to building out a Robo-advisor. We will also discuss best practices from a WM & industry standpoint in the context of Robo-advisors.

References:

  1. Wealthfront Blog – “Introducing the new Dashboard”

Global Retail Banking Needs a Digital Makeover

"If you don't like change, you will like irrelevance even less." – General Eric Shinseki, Former US Secretary of Veterans Affairs

This blog has spent time documenting the ongoing digital disruption across industries, especially financial services. Is there proof that creative destruction is taking hold in Banking? The answer is a clear & unequivocal "Yes". Retail Banking is undergoing a massive makeover, driven by many factors – changing consumer preferences, the advent of new technology, automation of business processes & finally competition from not just the traditional players but also the FinTechs. The first casualty of this change is the good old Bank Branch. This post looks at the business background of Retail Banking across the world & will try to explain my view on what is causing this shift in how Banks and consumers perceive financial services.

This blog post is the first in a series of five standalone posts on Retail Bank transformation. The intention of this first post is to discuss industry dynamics and the current state of competition, and to briefly introduce the forces causing a change in the status quo. The second post will categorize FinTechs across the banking landscape with key examples of how they disintermediate established players. The remaining posts will examine each of the other forces (Customer and others) in more detail, along with specific and granular advice to retail banks on how to incorporate innovation into their existing technology, processes and organizational culture.

Introduction – 

Retail Banking is perhaps one of the most familiar and regular services that everyday citizens use in the course of their lives. Money is a commodity we touch every day in our lives when we bank, shop, pay bills, borrow etc. Retail lines of banking typically include personal accounts, credit cards, mortgages and auto loans. 

For large financial conglomerates that have operations spanning Commercial Banking, Capital Markets, Wealth & Asset Management etc., retail operations have always represented an invaluable source of both stability as well as balance sheet strength. The sheer size & economic exposure of retail operations ensures that it is not only staid but stable, and also somewhat insulated from economic shocks. This is borne out by the policies of the respective national central banks & treasury departments. Indeed, one of the main reasons regulators have bailed out banks in the past is the perception that Main Street & the common citizen's banking assets could become a casualty of increased risk taking by traders in the capital markets divisions. This scenario famously played out in the Great Depression, when the stock market crash of 1929 quickly cascaded into a nationwide economic depression and widespread contagion.

Thus, retail banking is crucial not just to the owning corporation but also to diverse stakeholders in the world economy – deposit holders, the regulators (led, in the US, by the Federal Reserve) & a host of other actors.

The State of Global Retail Banking – 

In the financial crisis of 2008, retail banks not only held their own but also assumed a bigger share of revenues as the recovery got underway in the following years. According to a survey by Boston Consulting Group (BCG), retail banking activities accounted for 55 percent of the revenues generated across a global cohort of 140 banks, up from 45 percent in 2006.[1] 

However, the report also contends that retail revenues have been slowly falling since 2008, as investors have begun shifting their savings to deposits as a reaction to high profile financial scandals, thus putting pressure on margins. Higher savings rates have helped offset this somewhat & retail banks ended up maintaining better cost to income (CIR) ratios than other areas of banking. Retail banks also performed better on a key metric, return on assets (ROA). The below graphic from BCG captures this metric. In the Americas region, the average ROA was 162 percent higher than the average group ROA in 2008. From 2001 through 2006, it was 51 percent higher. Global banking revenues stood at $1.59 trillion in 2015 – a figure that is expected to hold relatively steady across the globe [2]

It is also important to note that the global performance of retail banks across the five major regions – the Americas, Europe, the Middle East, Asia, and Australia – has generally varied based on a multitude of factors. And even within regions, banking performance has varied widely. [2]


                                      Illustration 1 – Retail Banking is profitable and stable 

As stable as this sector seems, it is also being roiled by four main forces that are causing every major player to rethink its business strategy. Left unaddressed, these changes will have huge, negative impacts on competitive viability, profitability & all-important growth over the next five years.

What is the proof that retail banking is beginning to change? The below graphic from CNN [1] says it all –

                                      Illustration – Bank of America branch and employee decline since 2009 (Source – CNN Money)

Bank of America has 23% fewer branches and 37% fewer employees than in 2009. That downward trend across both metrics is expected to continue, as online transactions (from deposits to checks to online loans) have grown by a staggering 94%. The bank is expected to cut more positions as its headcount and branch footprint continue to shrink. [1]

Pressure from the FinTechs:

The Financial Services and Insurance industries are facing an unprecedented amount of change driven by factors like changing client preferences and the emergence of new technology—the Internet, mobility, social media, etc. These changes are immensely profound, especially with the arrival of "FinTech"—technology-driven applications that are upending long-standing business models across all sectors from retail banking to wealth management & capital markets. Further, members of a major new segment, Millennials, increasingly use mobile devices, demand more contextual services and expect a seamless, unified banking experience—something akin to what they experience on web properties like Facebook, Amazon, Uber, Google or Yahoo. FinTechs capitalize on this by expanding their wallet share of client revenues through contextual products tailored to individual client profiles. Their savvy use of segmentation data and predictive analytics enables the delivery of bundles of tailored products across multiple delivery channels (web, mobile, call center banking, point of sale, ATM/kiosk etc.).

Retail Banking must trend Digital to respond – 

The definition of Digital is somewhat nebulous, so I would like to define the key areas where its impact and capabilities will need to be felt for this gradual transformation to occur.

A true Digital Bank needs to –

  • Offer a seamless customer experience much like the one provided by the likes of Facebook & Amazon, i.e. highly interactive & intelligent applications that can detect a single customer's journey across multiple channels
  • Offer data driven interactive services and products that can detect customer preferences on the fly, match them with existing history and provide value added services – services that not only provide a better experience but also foster a longer term customer relationship
  • Help the business prototype, test, refine and rapidly develop new business capabilities
  • Above all, treat Digital as a constant capability and not as an 'off the shelf' product or a one off way of doing things

The five areas that established banks need to change across are depicted below..

                                      Illustration – Retail Banking value drivers

  1. Convert branches to be advisory & relationship focused instead of centers for transactions – As the number of millennials keeps growing, the actual traffic to branches will only continue to decline.  Branches still have an area of strength in being intimate customer touch points. The branch of the future can be redesigned to have more self service features along with relationship focused advisory personnel instead of purely being staffed by tellers and managers. They need to be reimagined as Digital Centers, not unlike an Apple store, with highly interactive touch screens and personnel focused on building business through high margin products.
  2. Adopt a FinTech like mindset – FinTechs (or New Age financial industry startups) offer enhanced customer experiences built on product innovation and agile business models. They do so by expanding their wallet share of client revenues by offering contextual products tailored to individual client profiles. Their savvy use of segmentation data and predictive analytics enables the delivery of bundles of tailored products across multiple delivery channels (web, mobile, Point Of Sale, Internet, etc.). Like banks, these technologies support multiple modes of payments at scale, but FinTechs aren't bound by the same regulatory and compliance requirements as banks, who operate under a mandate to demonstrate that they understand their risk profiles. The best retail banks will not only seek to learn from, but sometimes partner with, emerging FinTech players to integrate new digital solutions and deliver exceptional customer experiences. To cooperate with and take advantage of FinTechs, banks will require new partnering capabilities. To heighten their understanding of customers' needs and to deliver products and services that customers truly value, banks will need new capabilities in data management and analytics.
  3. Understand your customer – Banks need to move to a predominantly online model, providing consumers with highly interactive, engaging and contextual experiences that span multiple channels—branch banking, eBanking, POS, ATM, etc. Further goals are increased profitability per customer for both micro and macro customer populations with the ultimate goal of increasing customer lifetime value (CLV).
  4. Business Process improvement – Drive Automation across lines of business – Financial services are fertile ground for business process automation, since most banks across their various lines of business are simply a collection of core and differentiated processes. Examples of these processes are consumer banking (with processes including onboarding customers, collecting deposits, conducting business via multiple channels, and compliance with regulatory mandates such as KYC and AML); investment banking (including straight-through processing, trading platforms, prime brokerage, and compliance with regulation); payment services; and wealth management (including modeling portfolio positions and providing complete transparency across the end-to-end life cycle). The key takeaway is that driving automation results not just in better business visibility and accountability on behalf of various actors; it can also drive revenue and contribute significantly to the bottom line. Automation enables enterprise business and IT users to document, simulate, manage, automate and monitor business processes and policies. It is designed to empower business and IT users to collaborate more effectively, so business applications can be changed more easily and quickly.
  5. Agile Culture – All of the above are only possible if the entire organization operates on an agile basis in order to collaborate across the value chain. Cross functional teams across new product development, customer acquisition & retention, IT Ops, legal & compliance must collaborate in short work cycles to close the traditional business & IT innovation gap.  One of DevOps’s chief goals is to close the long-standing gap between the engineers who develop and test IT capability and the organizations that are responsible for deploying and maintaining IT operations. Using traditional app dev methodologies, it can take months to design, test and deploy software. No business today has that much time—especially in the age of IT consumerization and end users accustomed to smart phone apps that are updated daily. The focus now is on rapidly developing business applications to stay ahead of competitors that can better harness Big Data’s amazing business capabilities.

How can all of this be quantified? –

The results of BCG's sixth annual Global Retail-Banking Excellence benchmarking illustrate the value drivers. Forward looking banks working on some of the above aspects are able to reduce cycle times for core processes, thus improving productivity. The leaders in the survey are also reallocating resources from the middle and back office to customer facing roles. [3]

Again, according to the BCG, digital reinvention comes with huge benefits to both the top and bottom-lines. Their annual survey across the global retail banking sector estimates an average reduction in operating expenses from 15% to 25%, increases in pretax profit by 20% to 30% and an average increase in margins before tax from 5% to 10%. [3] These numbers are highly impressive at the scale that large banks operate.

The question thus is, can the vast majority of Banks change before it’s too late? Can they find the right model of execution in the Digital Age before their roles are either diminished or dis-intermediated by competition?

We will dive deep into the FinTechs in the next post in the series.

References

[1] CNN Money – Bank of America has 23% fewer branches than 2009

[2] BCG Research – Winning Strategies Revisited for Retail Banking

[3] BCG Research – Global Capital Markets 2016: The Value Migration

Capital Markets Pivots to Big Data in 2016

Previous posts in this blog have discussed how Capital markets firms must create new business models and offer superior client relationships based on their vast data assets. Firms that can infuse a data driven culture in both existing & new areas of operation will enjoy superior returns and raise the bar for the rest of the industry in 2016 & beyond. 

Capital Markets are the face of the financial industry to the general public and generate a large percentage of the GDP of the world economy. Despite all the negative press they have garnered since the financial crisis of 2008, capital markets perform an important social function in that they contribute heavily to economic growth and are the primary vehicle for household savings. Firms in this space allow corporations to raise capital using the underwriting process. However, it is not just corporations that benefit from such money raising activity – municipal, local and national governments do the same as well; only the overall mechanism differs, in that business enterprises issue both equity and bonds while governments typically issue bonds. According to the Boston Consulting Group (BCG), the industry will grow to annual revenues of $661 billion in 2016 from $593 billion in 2015 – a healthy 12% increase. On the buy side, the asset base (AuM – Assets under Management) is expected to reach around $100 trillion by 2020, up from $74 trillion in 2014. [1]

Within large banks, the Capital Markets group and the Investment Banking group perform very different functions. Capital Markets (CM) is the face of the bank to the street from a trading perspective. The CM group engineers custom derivative trades that hedge exposure for its clients (typically Hedge Funds, Mutual Funds, Corporations, Governments, high net worth individuals and Trusts) as well as for its own treasury group. It may also do proprietary trading on the bank's behalf for a profit – although it is this type of trading that the Volcker Rule seeks to eliminate.

If a Bank uses dark liquidity pools (DLPs), it funnels its Brokerage trades through the CM group to avoid the fees associated with executing an exchange trade on the street. Such activities can also be used to hide exchange based trading activity from the Street. In the past, Banks made substantial revenues by profiting from their proprietary trading or by collecting fees for executing trades on behalf of their treasury group or other clients.

Banking, and within it capital markets, continues to generate enormous amounts of data. The producers range from news providers to electronic trading participants to stock exchanges, which are increasingly looking to monetize data. And it is not just the banks – regulatory authorities like FINRA in the US are processing peak volumes of 40-75 billion market events a day http://www.vamsitalkstech.com/?p=1157 [2]. In addition to data volume, Capital Markets has always had a data variety challenge as well. Firms have tons of structured data around traditional banking data, market data, reference data & other economic data. You can then factor in semi-structured data around corporate filings, news, retailer data & other gauges of economic activity. Add in the creation of data from social media, multimedia etc., and firms are presented with significant technology challenges as well as business opportunities.

Within larger financial supermarkets, the capital markets group typically leads the way in being forward looking in terms of adopting cutting edge technology and high tech spend. Most of the compute intensive problems are generated out of either this group or the enterprise risk group. These groups own the exchange facing order management systems, the trade booking systems, the pricing libraries for the products the bank trades, as well as the tactical systems that are used to manage their market and credit risks, customer profitability, compliance and collateral. They typically hold about one quarter of a Bank's total IT budget. Capital Markets thus has the largest number of use cases for risk and compliance.

Players across the value chain – the buy side, the sell side, the intermediaries (stock exchanges & custodians) & technology firms such as market data providers – are all increasingly looking at leveraging these new data sets to unlock the value of data for business purposes beyond operational efficiency.

So what are the different categories of applications that are clearly leveraging Big Data in production deployments?


                      Illustration – How are Capital Markets leveraging Big Data In 2016

I have catalogued the major ones below based on my work with the majors in the spectrum over the last year.

  1. Client Profitability Analysis or Customer 360 view:  With the passing of the Volcker Rule, the large firms are now moving over to a model based on flow based trading rather than relying on prop trading. Thus it is critical for capital market firms to better understand their clients (be they institutional or otherwise) from a 360-degree perspective so they can be marketed to as a single entity across different channels—a key to optimizing profits with cross selling in an increasingly competitive landscape. The 360 view encompasses defensive areas like Risk & Compliance but also the ability to get a single view of profitability by customer across all of their trading desks, the Investment Bank and Commercial Lending.
  2. Regulatory Reporting – Dodd Frank/Volcker Rule Reporting: Banks have begun to leverage data lakes to capture every trade intraday and end of day across its lifecycle. They then validate that no proprietary trading is occurring on the bank's behalf.
  3. CCAR & DFast Reporting: Big Data can substantially improve the quality of  raw data collected across multiple silos. This improves the understanding of a Bank’s stress test numbers.
  4. Timely and accurate risk management: Running historical VaR (Value at Risk), statistical VaR, or both to run the business and to compare with the enterprise risk VaR numbers (a minimal historical VaR sketch follows this list).
  5. Timely and accurate liquidity management:  Look at the tiered collateral and their liquidity profiles on an intraday basis to manage the unit’s liquidity.  They also need to look at credit and market stress scenarios and be able to look at the liquidity impact of those scenarios.
  6. Timely and accurate intraday Credit Risk Management: Understanding when & if a deal breaches a tenor bucketed limit before it is booked. For FX trading this means that you have about 9 milliseconds to determine if you can do the trade. This is a great place to use in-memory technology like Spark/Storm on a Hadoop based platform. These use cases are key to increasing the capital that can be invested in the business; to do this, the desks need to convince upper management that they are managing their risks very tightly.
  7. Timely and accurate intraday Market Risk Management: Leveraging Big Data for market risk computations ensures that Banks have a real time view of any breaches of their tenor bucketed market limits.
  8. Reducing Market Data costs: Market Data providers like Bloomberg, Thomson Reuters and other smaller agencies typically charge a fee each time data is accessed.  With a large firm, both the front office and Risk access this data on an ad-hoc fairly uncontrolled basis. A popular way to save on cost is to  negotiate the rights to access the data once and read it many times.  The key is that you need a place to put it & that is the Data Lake.
  9. Trade Strategy Development & Backtesting: Big Data is being leveraged to constantly backtest trading strategies and algorithms on large volumes of historical and real time data. The ability to scale up computations as well as to incorporate real time streams is key to this effort.
  10. Sentiment Based Trading: Today, large scale trading groups and desks within them have begun monitoring economic, political news and social media data to identify arbitrage opportunities. For instance, looking for correlations between news in the middle east and using that to gauge the price of crude oil in the futures space.  Another example is using weather patterns to gauge demand for electricity in specific regional & local markets with a view to commodities trading. The realtime nature of these sources is information gold. Big Data provides the ability to bring all these sources into one central location and use the gleaned intelligence to drive various downstream activities in trading & private banking.
  11. Market & Trade Surveillance: Surveillance is an umbrella term that usually refers to detecting a wide array of trading practices that serve to distort securities prices, thus enabling market manipulators to illicitly profit at the expense of other participants by creating information asymmetry. Market surveillance is generally carried out by Exchanges and Self Regulating Organizations (SROs) in the US – all of which have dedicated surveillance departments set up for this purpose. However, capital markets players on the buy and sell side also need to conduct extensive trade surveillance and report it up internally. Pursuant to this goal, the exchanges & the SROs monitor transaction data, including orders and executed trades, & perform deep analysis to look for any kind of abuse and fraud.
  12. Buy Side (e.g. Wealth Management) – A huge list of usecases I have catalogued here – https://dzone.com/articles/the-state-of-global-wealth-management-part-2-big-d 
  13. AML Compliance –  Covered in various blogs and webinars.
    http://www.vamsitalkstech.com/?s=AML
    https://www.boozallen.com/insights/2016/04/webinar-anti-money-laudering – 
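Picking up on the risk management item above (use case 4), the sketch below computes a one-day historical VaR for a trading book from a series of daily returns. The simulated return series, confidence level and book value are hypothetical; a production system would run this across thousands of positions, risk factors and stress scenarios on the Big Data platform.

```python
import numpy as np

# Hypothetical daily returns for a trading book (as fractions of book value)
np.random.seed(7)
daily_returns = np.random.normal(loc=0.0002, scale=0.012, size=500)

CONFIDENCE = 0.99
BOOK_VALUE = 250_000_000.0  # hypothetical book size in USD

# Historical VaR: the loss at the (1 - confidence) percentile of the return history
var_return = np.percentile(daily_returns, (1 - CONFIDENCE) * 100)
one_day_var = -var_return * BOOK_VALUE

print(f"1-day {CONFIDENCE:.0%} historical VaR: ${one_day_var:,.0f}")
```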

The Final Word

A few tactical recommendations to industry CIOs:

  • Firstly, capital markets players should look to create centralized trade repositories for Operations, Traders and Risk Management. This would allow consolidation of systems and a reduction in costs by providing a single platform to replace operations systems, compliance systems and desk centric risk systems. This would eliminate numerous redundant data & application silos, simplify operations, reduce redundant quant work and improve the understanding of risk.
  • Secondly, it is important to put in place a model to create sources of funding for discretionary projects that can leverage Big Data.
  • Third, Capital Markets groups typically have to fund their portion of AML, Dodd Frank, Volcker Rule, Trade Compliance, Enterprise Market Risk and Traded Credit Risk projects. These are all mandatory spends. Only after this do they get to tackle discretionary business projects, e.g. funding their liquidity risk, trade booking and tactical risk initiatives. These defensive efforts always get the short end of the stick and are not to be neglected while planning out new initiatives.
  • Finally, an area in which a lot of current players are lacking is the ability to associate clients with a Legal Entity Identifier (LEI). Using a Big Data platform to assign logical and physical entity IDs to every human and business the bank interacts with can have salubrious benefits, and Big Data can ensure that firms can do this without having to redo all of their customer onboarding systems. This is key to achieving Customer 360 views, AML and FATCA compliance as well as accurate credit risk reporting (a simple entity resolution sketch follows this list).
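As a simple illustration of the entity resolution idea in the last point, here is a naive Python sketch that derives a stable logical entity ID by normalizing and hashing customer attributes drawn from two siloed source systems. The records, attributes and matching rule are hypothetical; real implementations typically layer in fuzzy matching, reference data such as LEIs and survivorship rules.

```python
import hashlib
import re

def normalize(name: str, dob: str) -> str:
    """Normalize attributes so trivially different records resolve to one entity."""
    clean = re.sub(r"[^a-z0-9]", "", name.lower())
    return f"{clean}|{dob}"

def logical_entity_id(name: str, dob: str) -> str:
    """Deterministic logical ID derived from normalized customer attributes."""
    return hashlib.sha256(normalize(name, dob).encode()).hexdigest()[:16]

# Hypothetical records from two siloed systems (onboarding vs. card platform)
records = [
    {"system": "onboarding", "name": "Jane A. Doe", "dob": "1984-02-11"},
    {"system": "cards",      "name": "JANE A DOE",  "dob": "1984-02-11"},
    {"system": "onboarding", "name": "Acme Corp.",  "dob": "1999-01-01"},
]

for rec in records:
    rec["entity_id"] = logical_entity_id(rec["name"], rec["dob"])
    print(rec["system"], rec["name"], "->", rec["entity_id"])

# The two Jane Doe rows resolve to the same entity_id, enabling a 360 view
```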

It is no longer enough for CIOs in this space to think in terms of tactical Big Data projects; they must think about creating platforms, and ecosystems around those platforms, that enable a variety of pathbreaking activities that generate a much higher rate of return.

References

[1] “The State of Capital Markets in 2016” – BCG Perspectives