A Reference Architecture for The Open Banking Standard..

This is the second in a series of four posts on the Open Banking Standard (OBS) in the UK. It briefly examines the strategic drivers for banks and proposes an architectural approach that incumbents can use to evolve their platforms towards OBS compliance. The next post will examine the data layer implications in detail, and the final post will look at the key strategic levers and possible business models the standard could help banks drive innovation towards.

Introduction…

The Open Banking Standard will steward the development of layers of guidelines (API interoperability standards, data security & privacy and governance) which primarily deal with data sharing in banking. The belief is that this regulation will ultimately spur open competition and unlock innovation. For years, the industry has grappled with fundamental platform issues that are native to every domain of banking. Some of these include systems siloed by function and platforms that are inflexible in responding to rapidly changing market conditions & consumer tastes. Bank IT is perceived by the business as glacially slow in responding to its needs.

The Open Banking Standard (OBS) represents a vast opportunity for banking organizations in multiple ways. First, Bank IT can use the regulatory mandate to steadily re-architect hitherto inflexible and siloed business systems. Secondly, doing so will enable banks to significantly monetize their vast data resources in several key business areas.

This state of affairs will need to change with the introduction of the Open Banking Standard. Banks that do not change will not be able to derive and sustain a competitive advantage. PSD2 compliance (the second Payment Services Directive, mandated by the EU) is one of the first layers in the OBS. Further layers will include API standard definitions for business processes (e.g. View Account, Transfer Funds, Chargebacks, Dispute Handling etc).

The OBWG (Open Banking Working Group) standards include the following key constituencies & their requirements [1] – 

1. Customers: defined as account holders & businesses who agree to share their data, and any publishers who share open datasets

2. Data attribute providers: defined as banks & other financial services providers whose customers produce data as part of daily banking activities

3. Third parties: interested developers, financial services startups (aka FinTechs) and any organisations (e.g. retail merchants) who can leverage the data to provide new views & products

It naturally follows from the above that the key technical requirements of the framework will include:

1. A set of data elements, API definitions and security standards to provide both data security and a set of access restrictions (a hypothetical sketch of such an API call follows this list)

2. A governance model and a body which will develop & oversee the standards

3. Developer resources, which will enable third parties to discover, learn and experiment.
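To make this concrete, below is a minimal sketch of how a third party might invoke such a standards-based account API once the OBWG finalizes the specification. Every URL, scope name and response field here is a hypothetical placeholder rather than the actual standard.

```python
# Hypothetical third-party call against an Open Banking style API.
# All endpoints, scopes and field names below are assumptions for
# illustration - the OBWG standard will define the real ones.
import requests

TOKEN_URL = "https://bank.example.com/oauth2/token"                  # hypothetical
ACCOUNTS_URL = "https://bank.example.com/open-banking/v1/accounts"   # hypothetical

def fetch_accounts(client_id: str, client_secret: str) -> list:
    """Obtain an OAuth2 access token, then read the customer's accounts."""
    token_resp = requests.post(TOKEN_URL, data={
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        "scope": "accounts.read",          # assumed scope name
    })
    token_resp.raise_for_status()
    access_token = token_resp.json()["access_token"]

    accounts_resp = requests.get(
        ACCOUNTS_URL,
        headers={"Authorization": f"Bearer {access_token}"},
    )
    accounts_resp.raise_for_status()
    return accounts_resp.json()["accounts"]   # assumed response shape
```

The point is simply that access is token-mediated and permissioned per scope; the governance body would define the exact consent and security flows.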

The Four Strategic Drivers in the Open Bank Standard …

Clearly, how intelligently a firm harnesses technology (in pursuit of OBS compliance goals) will determine its overall competitive advantage. This is important to note since a range of players across the value chain (the Third Parties designated by the standard) can now obtain seamless access to a variety of data. Once obtained, that data can be reimagined in manifold ways – for example, to help consumers make better personal financial decisions, at the expense of the banks that own the data. FinTechs have generally been able to make more productive use of client data: they provide clients with intuitive access to cross-asset data, tailor algorithms based on behavioral characteristics, and deliver a more engaging and unified experience.

So, here are the four strategic business goals that OBS-compliant architectures need to address in the long run –

  1. Digitize The Customer Journey – Bank clients who use services like Uber, Zillow or Amazon in their daily lives are now very vocal in demanding a seamless experience across all of their banking services via digital channels. The vast majority of bank applications still lag the innovation cycle, are archaic & are separately managed, so the client is faced with distinct user experiences ranging from onboarding to servicing to transaction management. Such applications need to provide anticipatory or predictive capabilities at scale while understanding each customer's lifestyle, financial needs & behavioral preferences.
  2. Provide Improved Access to Personal Financial Management & Improved Lending Processes – Provide consumers with a single aggregated picture of all their accounts. Also improve lending by incorporating a large amount of contextual data into the loan process, providing more efficient access to credit.
  3. Automate Back & Mid Office Processes Across Lending, Risk, Compliance & Fraud – The need to forge a closer banker/client relationship is not just driving demand around data silos & streams themselves but also forcing players to move away from paper-based models to a seamless, digital & highly automated model, reworking a host of existing back & front office processes. These processes range from risk data aggregation and supranational compliance (AML, KYC, CRS & FATCA) to financial reporting across a range of global regions & cybersecurity. Can the data architectures & the IT systems that leverage them be created in such a way that they permit agility while constantly learning & optimizing their behaviors across national regulations, InfoSec & compliance requirements? Can every piece of actionable data be aggregated, secured, transformed and reported on in such a way that its quality across the entire lifecycle is guaranteed?
  4. Tune Existing Business Models Based on Client Tastes and Feedback – While the initial build-out of the core architecture may focus on digitizing interactions and exposing data via APIs, what follows fast is strong predictive modeling capability working at large scale, where systems constantly learn and optimize their interactions, responsiveness & services based on client needs & preferences.

The Key System Architecture Tenets…

The design and architecture of a solution as large and complex as a reference architecture for Open Banking is a multidimensional challenge and it will vary at every institution based on their existing investments, vendor products & overall culture. 

The OBS calls out the following areas of data as being in scope – customer transaction data, customer reference data, aggregated data and sensitive commercial data. A thorough review of the OBWG standard suggests the logical reference architecture described below.

Based on all the above, the Open Bank Architecture shall – 

  • Support an API-based model that lets a third party (e.g. a client, an advisor or a business partner) invoke any business process or data element, subject to appropriate security
  • Support the development and deployment of applications using a DevOps-based approach
  • Support the easy creation of scalable business processes (e.g. client onboarding, KYC, payment dispute checks) that natively emit business metrics from the time they're instantiated and throughout their lifecycle
  • Support automated application delivery, configuration management & deployment
  • Support a high degree of data agility and data intelligence, the end goal being that every customer click, discussion & preference drives an analytics-infused interaction between the bank and the client
  • Support algorithmic capabilities that enable the creation of new services like automated (or robo) advisors
  • Support a very high degree of scale across large numbers of users, interactions & omnichannel transactions while working across global infrastructure
  • Support deployment across cost-efficient platforms like a public or private cloud. In short, the design of the application shall not constrain the available deployment options, which may vary because of cost considerations; the supported infrastructure shall range from virtual machines to Docker-based containers, whether running on a public cloud, private cloud or a hybrid cloud
  • Support small, incremental changes to business services & data elements based on changing business requirements
  • Support a high degree of standardization across application stacks, development toolsets & data technology
  • Support the creation of a user interface that is highly visual and feature-rich from a content standpoint when accessed across any device

 

Reference Architecture…

Now that we have covered the business bases, what foundational technology choices satisfy the above requirements? Let's examine that first at a high level and then in more detail.

Given the above list of requirements – the application architecture that is a “best fit” is shown below.


                   Illustration – Candidate Reference Architecture for the Open Bank Standard

Let's examine each of the tiers, starting from the lowest –

Infrastructure Layer…

Cloud Computing across its three main delivery models (IaaS, PaaS & SaaS) is largely a mainstream endeavor in financial services and no longer an esoteric adventure only for brave innovators. A range of institutions are either deploying or testing cloud-based solutions that span the full range of cloud delivery models. These capabilities include –

IaaS (infrastructure-as-a-service) to provision compute, network & storage; PaaS (platform-as-a-service) to develop applications; and the exposure of business services as SaaS (software-as-a-service) via APIs.

Choosing cloud-based infrastructure – whether that is a secure public cloud (Amazon AWS or Microsoft Azure), an internal private cloud (OpenStack etc) or even a hybrid approach – is a safe and sound bet for these applications. Business innovation and transformation are best enabled by a cloud-based infrastructure, whether public or private.

 

Data Layer…

Banking data tiers are usually composed of different technologies like RDBMS, EDW (Enterprise Data Warehouses), CMS (Content Management Systems) & Big Data. My recommendation for the OBWG target state, however, is largely dominated by a Big Data platform powered by Hadoop technology. The vast majority of initial applications recommended by the OBWG call for predictive analytics to create tailored customer journeys, and Big Data is a natural fit as it is fast emerging as the platform of choice for analytic applications.

Financial services firms deal with manifold data types ranging from customer account data, transaction data, wire data, trade data, Customer Relationship Management (CRM), General Ledger and other systems supporting core banking functions. When one factors in social media feeds, mobile clients & other non-traditional data types, the challenge is not just one of data volume but also of variety, and of the need to draw conclusions from fast-moving data streams by commingling them with years of historical data.

The reasons for choosing Big Data as the dominant technology in the data tier are as follows –

  1. Hadoop's ability to ingest and work with all of the above kinds of data & more (using the schema-on-read method – see the sketch after this list) has been proven at massive scale. Operational data stores are being built on Hadoop at a fraction of the cost & effort involved with older data technologies (RDBMS & EDW)
  2. The ability to perform multiple types of processing on a given data set – batch, streaming, in-memory and realtime – which greatly opens up the ability to create, test & deploy closed-loop analytics quicker than ever before
  3. The DAS (Direct Attached Storage) model that Hadoop provides fits neatly with the horizontal scale-out model that the services, UX and business process tiers leverage. This cuts capital expenditure to a bare minimum.
  4. The ability to retain data for long periods of time, providing banking applications with predictive models that can reason over historical data
  5. Hadoop provides the ability to run massive volumes of models in a very short amount of time, which helps with modeling automation
  6. Due to its parallel processing nature, Hadoop can run calculations (pricing, risk, portfolio, reporting etc.) in minutes versus the hours they took using older technology
  7. Hadoop can work with existing data investments and augment them by taking over data ingestion & transformation, leaving EDWs to perform the complex analytics they excel at – a huge bonus.
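As referenced in item 1 above, here is a brief sketch of what schema-on-read ingestion can look like with PySpark on a Hadoop cluster. The HDFS paths and column names are illustrative assumptions.

```python
# Schema-on-read sketch: raw transaction files are landed on HDFS as-is,
# and structure is applied only at query time. Paths/columns are assumed.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("obs-data-layer").getOrCreate()

# Ingest raw JSON transaction records without any upfront modelling.
txns = spark.read.json("hdfs:///landing/core_banking/transactions/")

# Apply structure at read time: daily spend per customer, computed over
# years of history retained cheaply on the cluster.
daily_spend = (
    txns.filter(F.col("type") == "DEBIT")
        .groupBy("customer_id", F.to_date("timestamp").alias("day"))
        .agg(F.sum("amount").alias("total_spend"))
)
daily_spend.write.mode("overwrite").parquet("hdfs:///curated/daily_spend/")
```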

Services Layer…

The overall goals of the OBWG services tier are to help design, develop, modify and deploy business components in such a way that overall application delivery follows a continuous integration/continuous delivery (CI/CD) paradigm. Given that banking platforms are some of the most complex financial applications out there, this also has the ancillary benefit of leaving different teams – digital channels, client onboarding, bill pay, transaction management & mid/back office – free to develop and update their components largely independently of the other teams. Thus a large monolithic enterprise platform is decomposed into its constituent services, which are loosely coupled and each focused on one independent & autonomous business task. The word 'task' here refers to a business capability that has tangible business value.

A highly scalable, open source & industry-leading platform-as-a-service (PaaS) is recommended as the way of building out and hosting banking business applications at this layer. Microservices have moved from the webscale world to fast becoming the standard for building mission-critical applications in many industries. Leveraging a PaaS such as OpenShift provides a way to help cut the "technical debt" that has plagued both developers and IT Ops. OpenShift provides the right level of abstraction to encapsulate microservices via its native support for Docker containers. This also has the concomitant advantage of standardizing application stacks and streamlining deployment pipelines, thus leading the charge to a DevOps style of building applications.

Further, I recommend that service designers take the approach that their microservices can be deployed in a SaaS application format going forward – which usually implies taking an API-based approach.

Now, the services tier has the following global responsibilities (a minimal service sketch follows the list) –

  1. Promote a microservices/SOA style of application development
  2. Support component endpoint invocation via standards-based REST APIs
  3. Promote a cloud, OS & development-language agnostic style of application development
  4. Promote horizontal scaling and resilience
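As a minimal illustration of this style, the sketch below shows a single-purpose balance-lookup microservice exposing a REST endpoint plus a health probe for the container platform. The route, port and data shapes are assumptions; a real service would sit behind the PaaS router with its own durable datastore.

```python
# A toy single-responsibility microservice: balance lookup over REST.
# Runs identically in a Docker container on OpenShift or elsewhere.
from flask import Flask, abort, jsonify

app = Flask(__name__)

# Stand-in for the service's own datastore (each microservice owns its data).
_BALANCES = {"ACC-1001": {"currency": "GBP", "available": 2450.75}}

@app.route("/v1/accounts/<account_id>/balance", methods=["GET"])
def get_balance(account_id: str):
    """Return the balance for a single account."""
    balance = _BALANCES.get(account_id)
    if balance is None:
        abort(404)
    return jsonify({"account_id": account_id, **balance})

@app.route("/health", methods=["GET"])
def health():
    """Liveness probe for the container platform."""
    return jsonify({"status": "UP"})

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080)
```

Because the service owns exactly one business task and speaks plain REST, it can be scaled horizontally and redeployed independently of every other tier.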

Predictive Analytics & Business Process Layer…

Though segments of the banking industry have historically been early adopters of analytics, the areas being targeted by the OBWG – the Retail lines of business & Payments – have generally been laggards. However, the large datasets that are prevalent in the Open Banking Standard world, as well as the need to drive customer interactions & journeys, risk & compliance reporting, fraud detection etc., call for a strategic relook at this space.

Techniques like Machine Learning, Data Science & AI feed into core business processes, improving them. For instance, Machine Learning supports the creation of self-improving algorithms that get better with more data, making ever more accurate business predictions. Thus, the overarching goal of the analytics tier should be to support a higher degree of automation by working with the business process and services tiers. Predictive analytics can be leveraged across the value chain of the Open Banking Standard – ranging from new customer acquisition to the customer journey to the back office. More recently, these techniques have found increased adoption in enterprise concerns from cyber security to telemetry data processing.
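As a toy illustration of such a self-improving model, the sketch below trains a gradient-boosted classifier on synthetic interaction features; in production the features would come from the data tier and the model would be retrained as new labelled data arrives. The features and label here are invented for the example.

```python
# Sketch of a predictive model in the analytics tier. The synthetic
# features stand in for real interaction data (logins, balances etc.).
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
X = rng.normal(size=(5000, 3))   # e.g. logins/month, avg balance, product count
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=5000) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2)
model = GradientBoostingClassifier().fit(X_train, y_train)

# The model improves as it is retrained on more (and fresher) data.
print("AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))
```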

Another area is improved automation via lightweight business process management (BPM). Though most large banks do have pockets of BPM implementations that are adding, or beginning to add, significant business value, an enterprise-wide re-look at the core revenue-producing activities is called for, as is a deeper examination of how process automation can drive competitive advantage. BPM has now evolved into more than pure process management; other disciplines have been added and BPM has become an umbrella term. These include business rules management, event processing and business resource planning.

Financial services firms are generally fertile ground for business process automation, since their various lines of business are essentially collections of core and differentiated processes. Examples are private banking (with processes including onboarding customers, collecting deposits, conducting business via multiple channels, and compliance with regulatory mandates such as KYC and AML); investment banking (including straight-through processing, trading platforms, prime brokerage, and compliance with regulation); payment services; and portfolio management (including modeling portfolio positions and providing complete transparency across the end-to-end life cycle). The key takeaway is that driving automation can result not just in better business visibility and accountability on behalf of the various actors; it can also drive revenue and contribute significantly to the bottom line.

A business process system should allow an IT analyst, customer or advisor to convey a business process by describing the steps that need to be executed in order to achieve the goal (and the order of those steps, typically using a flow chart). This greatly improves the visibility of business logic, resulting in higher-level and domain-specific representations (tailored to finance) that can be understood by business users and are easier for management to monitor. Again, leveraging a PaaS such as OpenShift in conjunction with an industry-leading open source BPMS (Business Process Management System) such as JBoss BPMS provides an integrated BPM capability that can create cloud-ready and horizontally scalable business processes.
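To illustrate the core idea – that a process is an explicit, inspectable sequence of steps rather than logic buried in code – here is a plain-Python sketch of a payment dispute lifecycle. This is not JBoss BPMS syntax; the states and transitions are illustrative assumptions.

```python
# A business process as explicit states and allowed transitions, so the
# lifecycle is visible, auditable and monitorable by management.
from enum import Enum, auto

class DisputeState(Enum):
    RECEIVED = auto()
    UNDER_REVIEW = auto()
    AWAITING_CUSTOMER_DOCS = auto()
    RESOLVED = auto()

TRANSITIONS = {
    DisputeState.RECEIVED: {DisputeState.UNDER_REVIEW},
    DisputeState.UNDER_REVIEW: {DisputeState.AWAITING_CUSTOMER_DOCS,
                                DisputeState.RESOLVED},
    DisputeState.AWAITING_CUSTOMER_DOCS: {DisputeState.UNDER_REVIEW},
    DisputeState.RESOLVED: set(),
}

class DisputeCase:
    def __init__(self, case_id: str):
        self.case_id = case_id
        self.state = DisputeState.RECEIVED
        self.history = [self.state]   # doubles as an emitted business metric

    def advance(self, new_state: DisputeState) -> None:
        """Move the case forward, rejecting any transition the process forbids."""
        if new_state not in TRANSITIONS[self.state]:
            raise ValueError(f"{self.state} -> {new_state} is not allowed")
        self.state = new_state
        self.history.append(new_state)
```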

API & UX Layer…

The API & UX (User Experience) tier fronts humans – clients, business partners, regulators, internal management and other business users – across omnichannel touchpoints. A standards-based API tier is provided for partner applications and other non-human actors to interact with the business service tier. Once the OBWG defines the exact protocols, data standards & formats, this should be straightforward to implement.

The API/UX tier has the following global responsibilities (a small sketch of responsibility 1 follows the list) –

  1. Provide a seamless experience across all channels (mobile, eBanking, tablet etc.) in a way that is continuous and non-siloed. The implication is that clients should be able to begin a business transaction in channel A and continue it in channel B where that makes business sense.
  2. Understand client personas and integrate with the business & predictive analytics tiers in such a way that the API is loosely yet logically integrated with the overall information architecture
  3. Provide advanced visualization (wireframes, process control, social media collaboration) and cross-partner authentication & single sign-on
  4. Both the API & UX shall also be designed in such a manner that their design, development & ongoing enhancement lend themselves to an Agile & DevOps methodology.
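As a small sketch of responsibility 1 above, the snippet below keeps journey state server-side, keyed by customer rather than by channel, so a transaction begun on mobile can resume in eBanking. The in-memory store and field names are stand-ins; a real implementation would use a shared, durable session store.

```python
# Channel-agnostic journey state: any channel can resume where another left off.
import time
from typing import Optional

_JOURNEYS = {}   # stand-in; in production a shared, durable store

def save_journey(customer_id: str, channel: str, step: str, data: dict) -> None:
    """Persist where the customer currently is in a business transaction."""
    _JOURNEYS[customer_id] = {
        "last_channel": channel,
        "step": step,
        "data": data,
        "updated_at": time.time(),
    }

def resume_journey(customer_id: str, channel: str) -> Optional[dict]:
    """Pick up the journey from whichever channel touched it last."""
    journey = _JOURNEYS.get(customer_id)
    if journey:
        journey["last_channel"] = channel
    return journey

# Begin a funds transfer on mobile, finish it on the web channel:
save_journey("CUST-42", "mobile", "transfer_confirm", {"amount": 250})
print(resume_journey("CUST-42", "web"))
```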

It can all come together…

In most existing banking systems, siloed functions have led to brittle data architectures operating on custom-built legacy applications. The problem is compounded by inflexible core banking systems and exacerbated by a gross lack of standardization in the application stacks underlying capabilities like customer journey, improved lending & fraud detection. These factors inhibit deployment flexibility across a range of platforms, leading to extremely high IT costs and technical debt, and they prevent client-facing applications from using data in a manner that constantly & positively impacts the client experience. There is clearly a need to provide an integrated digital experience across a global customer base, and then to offer more intelligent functions based on existing data assets.

Current players do possess a huge first-mover advantage: highly established financial products across large (and largely loyal & sticky) customer bases, wide networks of physical locations, and rich troves of customer account & demographic data. However, it is not enough to just possess the data. Incumbents must be able to drive change through legacy thinking and infrastructure as the entire industry struggles to adapt to a major new segment – the millennials – who increasingly use mobile devices and demand contextual services as well as a seamless, highly analytics-driven & unified banking experience, akin to what they commonly experience at web properties like Facebook, Amazon, Google or Yahoo.

Summary

Technology platforms designed around the four key business needs will create immense operational efficiency, better business models, increased relevance and, ultimately, higher revenues. They will separate the visionaries & leaders from the laggards in the years to come. The Open Banking Standard will be a catalyst in this immense disruption.

REFERENCES…

[1] The Open Banking Standard – https://theodi.org/open-banking-standard

The Open Banking Standard – The Five Major Implications for UK Banks..

"Banking as a service has long sat at the heart of our economy. In our digitally enabled world, the need to seamlessly and efficiently connect different economic agents who are buying and selling goods and services is critical. The Open Banking Standard is a framework for making banking data work better: for customers; for businesses; and for the economy as a whole." – OBWG (Open Banking Working Group) co-chair and Barclays executive Matt Hammerstein

Introducing Open Banking Standards…

On a global basis, both the financial services and insurance industries are facing an unprecedented amount of change, driven by factors like changing client preferences and the emergence of new technology—the Internet, mobility, social media, etc. These changes are immensely profound, especially with the arrival of the "FinTechs"—technology-driven applications that are upending long-standing business models across all sectors, from retail banking to wealth management & capital markets. Complement this with a major new segment, the Millennials, who increasingly use mobile devices, demand more contextual services and expect a seamless unified banking experience—something akin to what they experience on web properties like Facebook, Amazon, Uber, Google or Yahoo. Web-scale startups have expanded their wallet share of client revenues by offering contextual products tailored to individual client profiles; their savvy use of segmentation data and predictive analytics enables the delivery of bundles of tailored products across multiple delivery channels (web, mobile, call center banking, point of sale, ATM/kiosk etc.).

Supranational authorities and national governments in Europe have taken note of the need for erstwhile protected industries like banking to stay competitive in this brave new world.

With the passage of the second revision of the groundbreaking Payment Services Directive (PSD-2), the European Parliament has adopted the legal foundation for the creation of an EU-wide single payments area (SEPA) [1]. While the goal of the PSD is to establish a set of modern, digital industry rules for all payment services in the European Union, it has significant ramifications for the financial services industry, as it will surely disrupt current business models & foster new areas of competition. While PSD-2 has gotten the lion's share of press interest, the UK government has quietly been working on an initiative to create a standard that allows banking organizations to share their customer & transactional data with certified third parties via an open API. The outgoing PM David Cameron's government had in fact outlined these plans in the 2015 national budget.


The EU and the UK governments have recognized that, for Europe to realize the vision of a single Digital Market, the current system of banking must change, and they foresee this change being driven by digital technology. The shakeup will happen via the increased competition that results as various financial services are unbundled by innovative developers. To that end, by 2019 all banks should make customer data – their true crown jewels – openly accessible via an open standards-based API.

The Open Banking Working Group Standard…

The U.K. has been working on an open standard for its financial system for nearly a year. The Open Banking Working Group (OBWG) was created to set standards for how banking data should be created and accessed openly. The initiative covers the following broad areas – Data Standards, API Standards & Security Standards – to protect consumers while spurring innovation via open competition.


Illustration: Components of the Open Banking Standard (ref – OBWG Working Group)

Under the Open Banking Standard – expected to become legal reality over the next 2-3 years – any banking customer or authorized third-party provider can leverage APIs to gain access to their data and transactions across a whole range of areas, from Retail Banking to Business Banking to Commercial Banking.

Open standards can actually help banks by letting them source data from external providers. For instance, the Customer Journey problem has been an age-old issue in banking, one that has gotten exponentially more complicated over the last five years as the staggering rise of mobile technology and the Internet of Things (IoT) has vastly increased the number of enterprise touchpoints through which customers can discover & purchase new products/services. In an omnichannel world, an increasing number of transactions are conducted online; in verticals like Retail and Banking, online transactions approach an average of 40%. Adding to the problem, more and more consumers post product reviews and feedback online, so banks need to react in real time to piece together the sources of consumer dissatisfaction. Open standards will increase the ability of banks to pull in data from external sources to enrich their otherwise limited view of customers.

The Implications of Open Bank Standard…

The five implications of the Open Banking Standard –

  1. Banks will be focused on building platforms that can drive ecosystems of applications around them. Banks have thus far been largely focused on delivering commodity financial services through well-understood distribution strategies. Most global banks have armies of software developers, but their productivity around delivering innovation has been close to zero. Open APIs will primarily force more thinking around how banking products are delivered to the end consumer. The standards for this initiative are primarily open source in origin, though widely accepted across the globe – REST, OAuth etc.
  2. However, it is not a zero-sum game: banks can themselves benefit by building business models around monetizing their data assets, as their distribution channels go global & cost structures change around Open Banking. To that end, existing digital efforts should be brought in line with the Open Banking Standard. The best retail banks will not only seek to learn from, but sometimes partner with, emerging FinTech players to integrate new digital solutions and deliver an exceptional customer experience. To cooperate with and take advantage of FinTechs, banks will require new partnering capabilities. To heighten their understanding of customers' needs and to deliver products and services that customers truly value, banks will also need new capabilities in data management and analytics. Using Open Banking APIs, developers across the world can create applications that offer new services (in conjunction with retailers, for example), aggregate financial information or even help in financial planning. Banks will have interesting choices to make between acting as Data Producer, Consumer, Aggregator or even Distributor based on specific business situations.
  3. Regulators will also benefit substantially, using partner APIs both to access real-time reports & to share data across a range of areas. The lack of real-time data access across a range of risk, compliance and cyber areas has been a long-standing problem that an open standards-based API framework can solve [2] – e.g. Market/Credit/Basel risk-based reporting, AML watch-list data and trade surveillance.
  4. Data architectures are key to the Open Banking Standard – currently most industry players are woeful at putting together a comprehensive Single View of the Customer (SVC). Due to operational data silos, each department possesses a siloed & limited view of the customer across multiple channels. These views are typically inconsistent, lack synchronization with other departments & miss a high number of potential cross-sell and up-sell opportunities. Data lakes and real-time data processing techniques will be critical to meeting this immense requirement (a toy sketch of an SVC join follows this list).
  5. Despite the promise, large gaps still remain in the Open Banking initiative. Critical areas like project governance and Service Level Agreements (SLAs) for API users – uptime, quality of service – are still unaddressed.
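As referenced in implication 4, here is a toy sketch of assembling a Single View of the Customer by joining siloed departmental extracts. The column names are invented, and a production pipeline would run on a data lake (Hadoop/Spark) rather than in-memory pandas.

```python
# Toy Single View of Customer (SVC): outer-join departmental silos so one
# consolidated row per customer emerges. Schemas are assumptions.
import pandas as pd

retail = pd.DataFrame({"customer_id": [1, 2], "checking_balance": [1200, 300]})
cards = pd.DataFrame({"customer_id": [1, 2], "card_spend_30d": [450, 80]})
mortgage = pd.DataFrame({"customer_id": [1], "outstanding": [185000]})

svc = (retail.merge(cards, on="customer_id", how="outer")
             .merge(mortgage, on="customer_id", how="outer"))
print(svc)   # gaps (NaN) show exactly where a department has no view
```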

The Open Banking Standard will spur immense changes…

Even prior to the Open Banking Standard, banks recognized the need to move to a predominantly online model by providing consumers with highly interactive, engaging and contextual experiences that span multiple channels—branch banking, eBanking, POS, ATM, etc. The business goals are engagement & increased profitability per customer, for both micro and macro customer populations, with the ultimate goal of increasing customer lifetime value (CLV). The Open Banking Standard brings technology approaches to the fore by calling them out as strategic differentiators. Banks need to adopt a fresh business, data and process approach to stay competitive and relevant. Done right, the Open Banking Standard will help the leaders cement their market position.

REFERENCES…

[1] The Open Banking Standard – https://theodi.org/open-banking-standard

[2] Big Data – Banking's New Weapon Against Financial Crime – http://www.vamsitalkstech.com/?p=806

The Five Deadly Sins of Financial Services IT..

THE STATE OF GLOBAL FINANCIAL SERVICES IT ARCHITECTURE…

This blog has time & again discussed how global, domestic and regional banks need to innovate with their IT platforms to constantly evolve their product offerings & services. This is imperative due to various business realities – increased competition from the FinTechs, web-scale players delivering exciting services & sharply increasing regulatory compliance pressures. However, systems and software architecture has been a huge issue at nearly every large bank across the globe.

Regulation is also afoot in parts of the globe that will give non-traditional banks access to hitherto locked customer data – e.g. PSD-2 in the European Union. Further, banking licenses are being granted more easily to non-banks that are primarily technology pioneers, e.g. PayPal.

It's 2016 and banks are waking up to the fact that IT architecture is a critical strategic differentiator. Players that have agile & efficient architecture platforms and practices can not only add new service offerings but are also able to experiment across a range of analytics-led, multi-channel offerings. These digital services can now be found in abundance in areas ranging from Retail Banking to Capital Markets, Payments & Wealth Management, especially at the FinTechs.

So, how did we get here…

The financial services IT landscape – no matter which segment one picks across the spectrum (Capital Markets, Retail & Consumer Banking, Payment Networks & Cards, Asset Management etc.) – is largely predicated on a few legacy anti-patterns. These anti-patterns have evolved over the years from a systems architecture, data architecture & middleware standpoint.

These anti-patterns have resulted in a mishmash of organically developed & shrink-wrapped systems that do everything from running critical core banking applications to trade lifecycle, securities settlement, financial reporting etc. Each of these systems operates in an application, workflow and data silo, with its own view of the enterprise. They are all kept in sync largely via data replication & stovepiped process integration.

If this sounds too abstract, let us take an example – a rather topical one at that. One of the most critical back office functions every financial services organization needs to perform is Risk Data Aggregation & Regulatory Reporting (RDARR). This spans areas from Credit Risk, Market Risk & Operational Risk to Basel III, Solvency II etc. – the list goes on.

The basic idea in any risk calculation is to gather a whole range of quality data in one place and to run computations to generate risk measures for reporting.

So, how are various risk measures calculated currently? 

Current risk architectures are based on traditional relational database (RDBMS) architectures with tens of feeds from Core Banking Systems, Loan Data, Book Of Record Transaction Systems (BORTS) holding Trade & Position Data (e.g. Equities, Fixed Income, Forex, Commodities, Options etc.), Wire Data, Payment Data, Transaction Data etc.

These data feeds are tactically placed in in-memory caches or in enterprise data warehouses (EDWs). Once the data has been extracted, it is transformed using a series of batch jobs which prepare it for the calculator frameworks that run the risk models.

All of the above need access to large amounts of data at the individual transaction level. The Corporate Finance function within the bank then makes end-of-day adjustments to reconcile all of this data, and these adjustments need to be cascaded back to the source systems, down to individual transactions or classes of transactions.

These applications are typically deployed on clusters of bare-metal servers that are not particularly suited to portability, automated provisioning, patching & management – in short, nothing that can be moved over automatically at a moment's notice. They also run on legacy proprietary technology platforms that do not lend themselves to a flexible, DevOps style of development.

Finally, there is always a need for statistical frameworks to make adjustments to customer transactions that somehow need to be reflected back in the source systems. All of these frameworks need access to, and the ability to work with, terabytes (TBs) of data.

Each of the above-mentioned risk workstreams has its own data sets, schemas & event flows to work with, each with different temporal reporting needs: some must run a few times a day (e.g. Traded Credit Risk), some daily (e.g. Market Risk) and some at the end of the week (e.g. Enterprise Credit Risk). A sketch of what the aggregation step can look like on a modern data platform follows.
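For contrast with the batch/EDW flow just described, here is a hedged sketch of the aggregation step on a modern cluster: position feeds from several book-of-record systems are unioned once and a simple historical-simulation VaR is computed per desk. All paths, schemas and the VaR method are illustrative assumptions.

```python
# Sketch: aggregate position feeds once, then compute a 99% one-day VaR
# per desk as the 1st percentile of simulated desk-level P&L.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("risk-aggregation").getOrCreate()

positions = (
    spark.read.parquet("hdfs:///feeds/equities/positions/")
         .unionByName(spark.read.parquet("hdfs:///feeds/fixed_income/positions/"))
)
pnl = spark.read.parquet("hdfs:///feeds/historical_pnl/")  # per-unit P&L scenarios

var_99 = (
    positions.join(pnl, "instrument_id")
             .withColumn("weighted_pnl", F.col("quantity") * F.col("pnl_per_unit"))
             .groupBy("desk", "scenario_date")
             .agg(F.sum("weighted_pnl").alias("desk_pnl"))
             .groupBy("desk")
             .agg(F.expr("percentile_approx(desk_pnl, 0.01)").alias("var_99"))
)
var_99.show()
```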


                          Illustration – The Five Deadly Sins of Financial IT Architectures

Let us examine why this is so, in the context of the anti-patterns proposed below –

THE FIVE DEADLY SINS…

The key challenges with current architectures –

  1. Utter, total and complete lack of centralized data, leading to repeated duplication – In the typical risk data aggregation application, a massive amount of data is duplicated from system to system, leading to multiple inconsistencies at both the summary and transaction levels. Because different groups perform different risk reporting functions (e.g. Credit and Market Risk), the feeds, the ingestion and the calculators end up duplicated as well. A huge mess, any way one looks at it.
  2. Analytic applications that are not designed for throughput – Traditional risk algorithms cannot scale with this explosion of data, nor with the heterogeneity inherent in reporting across multiple kinds of risk. E.g. certain kinds of Credit Risk need access to around 200 days of historical data to estimate the probability of a counterparty defaulting & to obtain a statistical measure of the same. Such calculations are highly computationally intensive and can run for days.
  3. Lack of application blueprint, analytic model & data standardization – There is nothing SOA- or microservices-like, which precludes best-practice development & deployment and leads to maintenance headaches. Cloud computing enforces standards across the stack. Areas like risk model and analytic development need to be standardized to reflect realities post BCBS 239. The Volcker Rule, which aims to ban proprietary trading by banks, requires reporting on seven key metrics across tens of different data feeds spanning petabytes of data; most banks cannot do that without undertaking a large development and change management effort.
  4. Lack of scalability – It must be possible to operate the data platform as a central system that can scale to carry the full load of the organization, with hundreds of applications built by disparate teams all plugged into the same central nervous system. One other factor to consider is the role of cloud computing: the computational power required to extract insights from gigantic data sets is costly to maintain on an individual basis. The traditional owned data center will probably not disappear, but banks need to be able to leverage the power of the cloud to perform big data analysis in a cost-effective manner.
  5. A lack of deployment flexibility – The application & data requirements dictate the deployment platforms. This massive anti-pattern leads to silos and legacy OSs that cannot easily be moved to containers like Docker & instantiated by a modular cloud OS like OpenStack.

THE BUSINESS VALUE DRIVERS OF EFFICIENT ARCHITECTURES …

Doing IT architecture right, and in a manner responsive to the business, produces critical value drivers. The drivers that are met & exceeded by this transformation are –

  1. Effective compliance with increased regulatory risk mandates ranging from Basel III to FRTB and Liquidity Risk – which demand flexibility across all the traditional IT tiers
  2. An ability to detect and deter fraud – Anti Money Laundering (AML), retail/payment card fraud etc.
  3. The ability to fend off competition from the FinTechs
  4. The ability to exist & evolve in a multichannel world dominated by the millennial generation
  5. Reduced costs, to ease pressure on the Cost to Income Ratio (CIR)
  6. The ability to open up data & the services that operate on customer data to other institutions

A uniform architecture that works across all of these various workloads would seem a commonsense requirement. However, this is a major problem for most banks. Forward-looking approaches that draw heavily on microservices-based application development, Big Data enabled data & processing layers, the adoption of Message Oriented Middleware (MOM) & a cloud-native approach to developing (PaaS) and deploying (IaaS) applications are the solution to the vexing problem of inflexible IT.

The question is whether banks can change before they see a perceptible drop in revenues over the years.

Deter Financial Crime by Creating an Effective Anti Money Laundering (AML) Program…(1/2)

THE AML CHALLENGE CONTINUES UNABATED…

As this blog has repeatedly catalogued over the last year – here [1], here [2] and here [3] – money laundering is a massive global headache and one of the biggest crimes against humanity. Not a month goes by when we do not hear of billions of dollars in ill-gotten funds being stolen from developing economies via corruption, or of the proceeds of other nefarious activities – whether it is the Panama Papers or banks unwittingly helping drug cartels launder money.

I have seen annual estimates of global money laundering flows ranging anywhere from $1 trillion to $2 trillion – almost 5% of global GDP. Almost all of this is laundered via retail & merchant banks, payment networks, securities & futures firms, casino services & clubs etc. – which explains why annual AML-related fines on banking organizations run into the billions and increase every year. The number of SARs (Suspicious Activity Reports) filed by banking institutions is also much higher as a category than the numbers filed by these other businesses.

The definition of financial crime is fairly broad, encompassing traditional money laundering, financial fraud like identity theft/check fraud/wire fraud, terrorist financing, tax evasion, securities market manipulation, insider trading and other kinds of securities fraud. Financial institutions across the spectrum of the market now need to comply with the regulatory mandate at both the global and the local market level.

What makes AML such a hard subject for global banks, which otherwise should be able to innovate quite easily?

The issues which bedevil smooth AML programs include –

  • the complex nature of banking across retail, commercial, wealth management & capital markets; global banks now derive around 40% of their revenue from outside their traditional markets of North America & Western Europe
  • the scale of customer bases, ranging from 5 to 50 million at the large global banks
  • a patchwork of local regulations and risk & compliance reporting requirements, e.g. stringent compliance requirements in the US & UK but softer requirements elsewhere
  • tens of distribution channels
  • growing transaction volumes, which drive requirements for complex analytics
  • the need to constantly integrate third-party information, such as lists of politically exposed persons (PEPs), using manual means
  • technology, which, while ensuring the availability of banking services to millions of underserved people, also makes it easy for launderers to conduct & mask their activities

The challenges are hard but the costs of non-compliance are severe. Banks have been fined billions of dollars, compliance officers face potential liability & institutional reputation takes a massive hit. Supra national authorities like the United Nations (UN) and the European Union (EU) can also impose sanctions when they perceive that AML violations threaten human rights & the rule of law.

TECHNOLOGY IS THE ANSWER…

Many banks have already put in place rules, policies & procedures to detect AML violations, and have also invested in substantial teams staffed by money laundering reporting officers (MLROs) & headed by compliance officers. These rules detect money laundering based on thresholds and on the patterns that breach such criteria. The issue is that the money launderers themselves are in the class of statisticians: they constantly devise new schemes to hide their tracks.

The various elements that make up the risk to banks and financial institutions, and the technology used to detect them, can be broken down into the five main areas & workstreams shown below.


                                Illustration: The Five Workstreams of AML programs

  1. Customer Due Diligence – gathering information from the client as well as onboarding data from external sources to verify those details and establish a proper KYC (Know Your Customer) program.
  2. Entity Analysis – identifying relationships between institutional clients as well as retail clients to understand the true social graph. Bank compliance officers have now gone beyond KYC to knowing their customer's customer, or KYCC [4].
  3. Downstream Analytics – detecting advanced patterns of behavior among clients & across the web of transactions, with a view to uncovering hidden money laundering. This also involves assessing client risk at specific points in the banking lifecycle, such as account opening or transactions above a certain monetary value – data points that could signal potentially illegitimate activity based on any number of features associated with such transactions. Any transaction could also lead to the filing of a suspicious activity report (SAR).
  4. Ongoing Monitoring – aggregating customer transactions across multiple geographies for pattern detection and reporting purposes. This involves creating a corporate taxonomy of rules that capture natural-language descriptions of the conditions & patterns denoting various types of financial crime – terrorist financing, mafia laundering, drug trafficking, identity theft etc. (a simplified rules sketch follows this list).
  5. SAR Investigation Lifecycle – these rules trigger downstream workflows that allow human investigation of the flagged transactions.
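As referenced in workstream 4, below is a deliberately simplified sketch of threshold- and pattern-based monitoring rules over a transaction stream, with each hit feeding the SAR investigation workflow of workstream 5. The thresholds and rule names are illustrative; real rules engines are far richer and, as noted earlier, must evolve constantly as launderers adapt.

```python
# Simplified transaction-screening rules. Thresholds are illustrative.
from dataclasses import dataclass
from typing import List

@dataclass
class Txn:
    customer_id: str
    amount: float
    country: str

CTR_THRESHOLD = 10_000   # e.g. a currency-transaction-report level

def screen(txn: Txn, recent: List[Txn]) -> List[str]:
    """Return the names of any rules this transaction trips."""
    alerts = []
    if txn.amount >= CTR_THRESHOLD:
        alerts.append("LARGE_CASH")
    # Structuring: several just-below-threshold amounts in a short window.
    near_misses = [t for t in recent
                   if t.customer_id == txn.customer_id
                   and 0.9 * CTR_THRESHOLD <= t.amount < CTR_THRESHOLD]
    if len(near_misses) >= 3:
        alerts.append("POSSIBLE_STRUCTURING")
    return alerts   # a non-empty list opens a case for human SAR review
```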

QUANTIFIABLE BENEFITS FROM DOING IT WELL…

Financial institutions that leverage new-age technology (Big Data, predictive analytics, workflow) in these five areas will be able to effectively analyze financial data and deter potential money launderers before they can proceed, giving the institution protection in the form of full compliance with the regulations.

The business benefits include –

  • Detect AML violations proactively, reducing the probability of massive fines
  • Save on staffing expenses for Customer Due Diligence (CDD)
  • Increase the accurate production of suspicious activity reports (SARs)
  • Decrease the percentage of corporate customers with AML-related account closures in the past year, by customer risk level and reason – thus reducing loss of revenue
  • Decrease the overall KYC profile backlog across geographies
  • Help create Customer 360 views that can accelerate CLV (Customer Lifetime Value) growth as well as customer segmentation from a cross-sell/up-sell perspective

CONCLUSION…

Virtually every leading banking institution, securities firm & payment provider understands that it needs to enhance its AML capabilities by a few notches, and also needs to constantly evolve them as fraud itself morphs.

The question is whether they can form a true picture of their clients (both retail and institutional) on a real-time basis, monitor every banking interaction while understanding its true context when merged with historical data, and detect unusual behavior. Further, creating systems that learn from these patterns truly helps minimize money laundering.

The next and final post in this two-part series will examine how Big Data & Analytics help with each of the workstreams discussed above.

REFERENCES…

[1] Building AML Regulatory Platforms for the Big Data Era – http://www.vamsitalkstech.com/?p=5

[2] Big Data – Banking's New Weapon Against Financial Crime – http://www.vamsitalkstech.com/?p=806

[3] Reference Architecture for AML – http://www.vamsitalkstech.com/?p=833

[4] WSJ – Know Your Customer's Customer is the New Norm – http://blogs.wsj.com/riskandcompliance/2014/10/02/the-morning-risk-report-know-your-customers-customer-is-new-norm/