“Banking as a service has long sat at the heart of our economy. In our digitally enabled world, the need to seamlessly and efficiently connect different economic agents who are buying and selling goods and services, is critical. The Open Banking Standard is a framework for making banking data work better: for customers; for businesses; and for the economy as a whole.” – OBWG (Open Banking Working Group) co-chair and Barclays executive Matt Hammerstein
Introducing Open Banking Standards…
On a global basis, both the Financial Services and the Insurance industries are facing an unprecedented amount of change, driven by factors like changing client preferences and the emergence of new technology—the Internet, mobility, social media, etc. These changes are immensely profound, especially with the arrival of the “FinTechs”—technology-driven applications that are upending long-standing business models across all sectors, from retail banking to wealth management & capital markets. Compounding this is the emergence of a major new segment: Millennials. They increasingly use mobile devices, demand more contextual services and expect a seamless, unified banking experience—something akin to what they experience on web properties like Facebook, Amazon, Uber, Google or Yahoo. These web scale players have set that expectation by expanding their wallet share of client revenues through contextual products tailored to individual client profiles. Their savvy use of segmentation data and predictive analytics enables the delivery of bundles of tailored products across multiple delivery channels (web, mobile, call center banking, point of sale, ATM/kiosk etc.).
Supra national authorities and national governments in Europe have taken note of the need for erstwhile protected industries like Banking to stay competitive in this brave new world.
With the passage of the second revision of the ground breaking Payment Services Directive (PSD-2), the European Parliament has adopted the legal foundation for the creation of an EU-wide single payments area (SEPA). While the goal of the PSD is to establish a set of modern, digital industry rules for all payment services in the European Union, it has significant ramifications for the financial services industry, as it will surely disrupt current business models & foster new areas of competition. While PSD-2 has gotten the lion’s share of press interest, the UK government has quietly been working on an initiative to create a standard that allows Banking organizations to share their customer & transactional data with certified third parties via an open API. The outgoing PM David Cameron’s government had in fact outlined these plans in the 2015 national budget.
The EU and the UK governments have recognized that for Europe to realize the vision of a single Digital Market, the current system of banking must change, and they foresee this change being driven by digital technology. The shakeup will happen via the increased competition that results as various financial services are unbundled by innovative developers. To that end, by 2019 all banks should make customer data – their true crown jewels – openly accessible via an open standards based API.
The Open Banking Standard API…
Under the Open Banking Standard – expected to become legal reality over the next 2-3 years – any banking customer or authorized third-party provider can leverage APIs to gain access to their data and transactions across a whole range of areas, from Retail Banking to Business Banking to Commercial Banking.
Open Standards can actually help banks by letting them source data from external providers. For instance, the Customer Journey problem has been an age-old issue in banking, and it has gotten exponentially more complicated over the last five years as the staggering rise of mobile technology and the Internet of Things (IoT) has vastly increased the number of enterprise touch points through which customers can discover & purchase new products/services. In an OmniChannel world, an increasing number of transactions are conducted online; in verticals like Retail and Banking, online transactions approach an average of 40%. Adding to the problem, more and more consumers are posting product reviews and feedback online, so banks need to react in realtime to piece together the sources of consumer dissatisfaction. Open Standards will increase the ability of banks to pull in data from external sources to enrich their otherwise limited view of customers.
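The enrichment idea above can be sketched in a few lines: merge a bank's internal customer record with attributes sourced externally via open APIs. This is a minimal illustration, not a reference implementation; all field names here are hypothetical.

```python
# Sketch: enriching a bank's internal customer record with data sourced
# from external providers. Field names are illustrative assumptions.

def enrich_customer_view(internal: dict, external_sources: list) -> dict:
    """Merge external attributes into the bank's view of a customer.
    Internal fields win on conflict; external data only fills gaps."""
    enriched = dict(internal)
    for source in external_sources:
        for key, value in source.items():
            enriched.setdefault(key, value)
    return enriched

internal = {"customer_id": "C-1001", "name": "A. Smith"}
external = [
    {"preferred_channel": "mobile", "sentiment": "negative"},  # e.g. a social feed
    {"recent_complaint": "card declined at POS"},              # e.g. a review site
]
view = enrich_customer_view(internal, external)
```

The key design choice is that the bank's own data always takes precedence; external feeds only fill gaps, which keeps the enriched view auditable.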
The Implications of Open Bank Standard…
The five implications of Open Bank Project –
- Banks will be focused on building platforms that can drive ecosystems of applications around them. Banks have thus far been largely focused on delivering commodity financial services using well understood distribution strategies. Most global banks have armies of software developers, but their productivity around delivering innovation has been close to zero. Open APIs will primarily force more thinking around how banking products are delivered to the end consumer. The standards for this initiative are primarily open source in origin, though they’re widely accepted across the globe – REST, OAuth etc.
- However, it is not a zero-sum game. Banks can themselves benefit by building business models around monetizing their data assets, as their distribution channels go global & cost structures change under Open Banking. To that end, existing Digital efforts should be brought in line with the Open Banking Standard. The best retail banks will not only seek to learn from, but sometimes partner with, emerging fintech players to integrate new digital solutions and deliver an exceptional customer experience. To cooperate with and take advantage of fintechs, banks will require new partnering capabilities. To heighten their understanding of customers’ needs and to deliver products and services that customers truly value, banks will need new capabilities in data management and analytics. Using Open Banking APIs, developers across the world can create applications that offer new services (in conjunction with retailers, for example), aggregate financial information or even help in financial planning. Banks will have interesting choices to make between acting as Data Producer, Consumer, Aggregator or even Distributor, based on the specific business situation.
- Regulators will also benefit substantially by using partner APIs both to access real time reports & to share data across a range of areas. The lack of realtime data access across risk, compliance and cyber areas has been a long standing problem that can be solved by an open standards based API framework – e.g. Market/Credit/Basel risk based reporting, AML watch list data and Trade Surveillance.
- Data Architectures are key to the Open Banking Standard – currently most industry players are woeful at putting together a comprehensive Single View of the Customer (SVC). Due to operational data silos, each department possesses a siloed & limited view of the customer across multiple channels. These views are typically inconsistent, lack synchronization with other departments & miss a high number of potential cross-sell and up-sell opportunities. Data lakes and realtime data processing techniques will be critical to meeting this immense regulatory requirement.
- Despite the promise, large gaps still remain in the Open Bank Project. Critical areas like project governance and Service Level Agreements (SLAs) for API users – in terms of uptime and quality of service – are still left unaddressed.
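To make the REST & OAuth point above concrete, here is a sketch of the kind of server-side check an open banking API might perform before releasing customer data to a certified third party. The scope names and token structure are my own illustrative assumptions, not taken from any published standard.

```python
# Sketch: scope & consent check for an open banking API endpoint.
# Scope names and the token dict shape are hypothetical.

def authorize(token: dict, required_scope: str) -> bool:
    """Grant access only if the (already-verified) OAuth token carries
    explicit customer consent and the scope this endpoint requires."""
    return (
        token.get("consent_granted", False)
        and required_scope in token.get("scopes", set())
    )

# A certified third-party provider holding read access to accounts only:
tpp_token = {"client": "budgeting-app", "consent_granted": True,
             "scopes": {"accounts:read"}}

can_read_accounts = authorize(tpp_token, "accounts:read")    # granted
can_read_txns = authorize(tpp_token, "transactions:read")    # denied
```

The point of the sketch is that customer consent and per-scope authorization are enforced on every call, which is what makes third-party data sharing safe to open up.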
The Open Banking Standard will spur immense changes…
Even prior to the Open Banking Standard, banks recognized the need to move to a predominantly online model by providing consumers with highly interactive, engaging and contextual experiences that span multiple channels—branch banking, eBanking, POS, ATM, etc. The business goals are engagement & increasing profitability per customer for both micro and macro customer populations, with the ultimate goal of increasing customer lifetime value (CLV). The Open Banking Standard brings technology approaches to the fore by calling them out as strategic differentiators. Banks need to move to a fresh business, data and process approach as a way of staying competitive and relevant. Done right, the Open Banking Standard will help the leaders cement their market position.
 The Open Banking Standard –
Big Data – Banking’s New Weapon Against Financial Crime – http://www.vamsitalkstech.com/?p=806
THE STATE OF GLOBAL FINANCIAL SERVICES IT ARCHITECTURE…
This blog has time & again discussed how global, domestic and regional banks need to be innovative with their IT platforms to constantly evolve their product offerings & services. This is imperative due to various business realities – increased competition from the FinTechs, web scale players delivering exciting services, & sharply increasing regulatory compliance pressures. However, systems and software architecture has been a huge issue at nearly every large bank across the globe.
Regulation is also afoot in parts of the globe which will give non traditional banks access to hitherto locked customer data, e.g. PSD-2 in the European Union. Further, banking licenses have been granted more easily to non-banks that are primarily technology pioneers, e.g. PayPal.
It’s 2016 and Banks are waking up to the fact that IT Architecture is a critical strategic differentiator. Players that have agile & efficient architecture platforms and practices can not only add new service offerings but are also able to experiment across a range of analytics led offerings that create & support multi-channel offerings. These digital services can now be found abundantly in areas ranging from Retail Banking, Capital Markets and Payments to Wealth Management, especially at the FinTechs.
So, How did we get here…
The Financial Services IT landscape – no matter which segment one picks across the spectrum (Capital Markets, Retail & Consumer Banking, Payment Networks & Cards, Asset Management etc.) – is largely predicated on a few legacy anti-patterns. These anti-patterns have evolved over the years from a systems architecture, data architecture & middleware standpoint.
These anti-patterns have resulted in a mishmash of organically developed & shrink wrapped systems that do everything from running critical Core Banking Applications to Trade Lifecycle to Securities Settlement to Financial Reporting etc. Each of these systems operates in its own application, workflow and data silo, with its own view of the enterprise. These are all kept in sync largely via data replication & stovepiped process integration.
If this sounds too abstract, let us take an example – a rather topical one at that. One of the most critical back office functions every financial services organization needs to perform is Risk Data Aggregation & Regulatory Reporting (RDARR). This spans areas from Credit Risk, Market Risk and Operational Risk to Basel III, Solvency II etc. – the list goes on.
The basic idea in any risk calculation is to gather a whole range of quality data in one place and to run computations to generate risk measures for reporting.
So, how are various risk measures calculated currently?
Current Risk Architectures are based on traditional relational database (RDBMS) architectures with tens of feeds from Core Banking Systems, Loan Data, Book Of Record Transaction Systems (BORTS) like Trade & Position Data (e.g. Equities, Fixed Income, Forex, Commodities, Options etc.), Wire Data, Payment Data, Transaction Data etc.
These data feeds are then tactically placed in memory caches or in enterprise data warehouses (EDWs). Once the data has been extracted, it is transformed using a series of batch jobs which prepare it for the Calculator Frameworks that then run the risk models on it.
All of the above need access to large amounts of data at the individual transaction level. The Corporate Finance function within the Bank then makes end of day adjustments to reconcile all of this data, and these adjustments need to be cascaded back to the source systems, down to the individual transaction or classes of transactions.
These applications are then typically deployed on clusters of bare metal servers that are not particularly suited to portability, automated provisioning, patching & management – in short, nothing that can be moved over automatically at a moment’s notice. These applications also run on legacy proprietary technology platforms that do not lend themselves to a flexible, DevOps style of development.
Finally, there is always a need for statistical frameworks to make adjustments to customer transactions that somehow need to get reflected back in the source systems. All of these frameworks need to have access to, and an ability to work with, terabytes (TBs) of data.
Each of the above mentioned risk work streams has corresponding data sets, schemas & event flows that it needs to work with, with different temporal needs for reporting: some need to be run a few times a day (e.g. Traded Credit Risk), some daily (e.g. Market Risk) and some at the end of the week (e.g. Enterprise Credit Risk).
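The core of the risk calculation described above – gather historical data in one place, then compute a risk measure over it – can be sketched with a historical-simulation Value at Risk (VaR). This is a toy illustration with made-up P&L numbers, not any bank's actual calculator framework.

```python
# Sketch: historical-simulation VaR, a typical output of the calculator
# frameworks described above. The daily P&L figures are illustrative.

def historical_var(pnl_history: list, confidence: float = 0.95) -> float:
    """VaR = the loss not exceeded with the given confidence, read off the
    empirical distribution of historical daily P&L."""
    losses = sorted(pnl_history)               # worst days sort first
    index = int((1.0 - confidence) * len(losses))
    return -losses[index]                      # report the loss as a positive number

daily_pnl = [-120.0, 35.0, -60.0, 80.0, -200.0, 15.0, 50.0, -90.0, 10.0, 40.0]
var_95 = historical_var(daily_pnl, 0.95)       # 200.0: the worst 5% day
```

A real implementation would run this over years of transaction-level history per desk and per risk class, which is exactly why data aggregation quality dominates the problem.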
Illustration – The Five Deadly Sins of Financial IT Architectures
Let us examine why this is, in the context of the anti-patterns proposed below –
THE FIVE DEADLY SINS…
The key challenges with current architectures –
- Utter, total and complete lack of centralized data, leading to repeated data duplication – In the typical Risk Data Aggregation application, a massive amount of data is duplicated from system to system, leading to multiple inconsistencies at the summary as well as transaction levels. Because different groups perform different risk reporting functions (e.g. Credit and Market Risk), the feeds, the ingestion and the calculators end up being duplicated as well. A huge mess, any way one looks at it.
- Analytic applications which are not designed for throughput – Traditional risk algorithms cannot scale with this explosion of data, nor with the heterogeneity inherent in reporting across multiple kinds of risks. E.g. certain kinds of Credit Risk need access to around 200 days of historical data to estimate the probability of the counterparty defaulting & to obtain a statistical measure of the same. Such calculations are highly computationally intensive and can run for days.
- Lack of Application Blueprint, Analytic Model & Data Standardization – There is nothing that is either SOA or microservices-like, and that precludes best practice development & deployment, leading to maintenance headaches. Cloud Computing enforces standards across the stack. Areas like Risk Model and Analytic development need to be standardized to reflect realities post BCBS 239. The Volcker Rule aims to ban prop trading activity on the part of banks; banks must now report on seven key metrics across tens of different data feeds spanning PBs of data. Most cannot do that without undertaking a large development and change management effort.
- Lack of Scalability – It must be possible to operate it as a central system that can scale to carry the full load of the organization and operate with hundreds of applications built by disparate teams, all plugged into the same central nervous system. One other factor to consider is the role of cloud computing in customer retention efforts. The analytical computational power required to derive insights from gigantic data sets is costly to maintain on an individual basis. The traditional owned data center will probably not disappear, but banks need to be able to leverage the power of the cloud to perform big data analysis in a cost-effective manner.
- A Lack of Deployment Flexibility – The application & data requirements dictate the deployment platforms. This massive anti-pattern leads to silos and legacy OSes that cannot easily be moved to containers like Docker & instantiated by a modular Cloud OS like OpenStack.
THE BUSINESS VALUE DRIVERS OF EFFICIENT ARCHITECTURES …
Doing IT Architecture right, and in a manner responsive to the business, delivers critical value drivers. The drivers that are met & exceeded by this transformation are –
- Effective compliance with increased regulatory risk mandates ranging from Basel III, FRTB and Liquidity Risk – which demand flexibility across all the different traditional IT tiers
- An ability to detect and deter fraud – Anti Money Laundering (AML) and Retail/Payment Card Fraud etc
- Fend off competition from the FinTechs
- Exist & evolve in a multichannel world dominated by the millennial generation
- Reduced costs to satisfy pressure on the Cost to Income Ratio (CIR)
- The ability to open up data & services that operate on the customer data to other institutions
A uniform architecture that works across all these various types would seem a commonsense requirement. However, this is a major problem for most banks. Forward looking approaches that draw heavily from microservices based application development, Big Data enabled data & processing layers, the adoption of Message Oriented Middleware (MOM) & a cloud native approach to developing applications (PaaS) & deployment (IaaS) are the solution to the vexing problem of inflexible IT.
The question is whether banks can change before they see a perceptible drop in revenues over the years.
THE AML CHALLENGE CONTINUES UNABATED…
As this blog has repeatedly catalogued over the last year here, here and here, Money Laundering is a massive global headache and one of the biggest crimes against humanity. Not a month goes by without news of billions of dollars in ill gotten funds being stolen from developing economies via corruption, as well as from the proceeds of nefarious activity – whether it is the Panama Papers or banks unwittingly helping drug cartels launder money.
I have seen annual estimates of global money laundering flows ranging anywhere from $1 trillion to $2 trillion – almost 5% of global GDP. Almost all of this is laundered via Retail & Merchant Banks, Payment Networks, Securities & Futures firms, Casino Services & Clubs etc. – which explains why annual AML related fines on banking organizations run into the billions and are increasing every year. However, the number of SARs (Suspicious Activity Reports) filed by banking institutions is much higher as a category than the numbers filed by these other businesses.
The definition of Financial Crime is fairly broad & encompasses a large area – traditional money laundering activity, financial fraud like identity theft/check fraud/wire fraud, terrorist financing, tax evasion, securities market manipulation, insider trading and other kinds of securities fraud. Financial institutions across the spectrum of the market now need to comply with the regulatory mandate at both the global as well as the local market level.
What makes AML such a hard subject for global banks, which should otherwise be able to innovate quite easily?
The issues which bedevil smooth AML programs include –
- the complex nature of banking across retail, commercial, wealth management & capital markets; global banks now derive around 40% of revenue from markets outside their traditional strongholds of North America & Western Europe
- the scale of customer activity ranging from 5 to 50 million at the large global banks
- a patchwork of local regulations, risk and compliance reporting requirements, e.g. stringent compliance requirements in the US & UK but softer requirements elsewhere
- tens of distribution channels
- growing volumes of transactions causing requirements for complex analytics
- the need to constantly integrate 3rd party information, such as lists of politically exposed persons (PEPs), using manual means
- technology, while ensuring the availability of banking services to millions of underserved populations, also makes it easy for launderers to conduct & mask their activities
The challenges are hard but the costs of non-compliance are severe. Banks have been fined billions of dollars, compliance officers face potential liability & institutional reputation takes a massive hit. Supra national authorities like the United Nations (UN) and the European Union (EU) can also impose sanctions when they perceive that AML violations threaten human rights & the rule of law.
TECHNOLOGY IS THE ANSWER…
Many Banks have already put in place rules, policies & procedures to detect AML violations and have also invested in substantial teams staffed by money laundering risk officers (MLROs) & headed by compliance officers. These rules detect money laundering based on thresholds and the patterns that breach such criteria. The issue with this approach is that the money launderers are themselves sophisticated statisticians, and they constantly devise new schemes to hide their tracks.
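The threshold-style rules described above can be sketched in a few lines, together with a simple "structuring" check (many sub-threshold transactions that sum past the limit) that launderers use to defeat naive single-transaction rules. The $10,000 figure and the record shape are illustrative assumptions.

```python
# Sketch: two threshold-based AML rules. The threshold value and the
# transaction record fields are illustrative.
from collections import defaultdict

THRESHOLD = 10_000.0  # hypothetical reporting threshold

def flag_transactions(txns: list) -> set:
    """Return customer ids flagged by either rule."""
    flagged = set()
    daily_totals = defaultdict(float)
    for t in txns:
        if t["amount"] >= THRESHOLD:                 # rule 1: single large txn
            flagged.add(t["customer"])
        daily_totals[(t["customer"], t["date"])] += t["amount"]
    for (customer, _date), total in daily_totals.items():
        if total >= THRESHOLD:                       # rule 2: structuring in a day
            flagged.add(customer)
    return flagged

txns = [
    {"customer": "C1", "date": "2016-05-01", "amount": 12_000.0},
    {"customer": "C2", "date": "2016-05-01", "amount": 4_000.0},
    {"customer": "C2", "date": "2016-05-01", "amount": 7_000.0},
    {"customer": "C3", "date": "2016-05-01", "amount": 500.0},
]
suspects = flag_transactions(txns)   # C1 (large txn) and C2 (structuring)
```

The weakness the post goes on to describe is visible even here: anyone who knows the threshold and the one-day window can structure around both rules, which is why static criteria have to be supplemented with learning systems.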
The various elements that make up the risk to banks and financial institutions and the technology they use to detect these can be broken down into five main areas & work streams as shown below.
Illustration: The Five Workstreams of AML programs
- Customer Due Diligence – this involves gathering information from the client as well as on-boarding data from external sources to verify these details and to establish a proper KYC (Know Your Customer) program.
- Entity Analysis – identifying relationships between institutional clients as well as retail clients to understand the true social graph. Bank compliance officers now have gone beyond KYC (Know Your Customer) to know their customer’s customer, or KYCC.
- Downstream Analytics – detecting advanced patterns of behavior among clients & the web of transactions between them, with a view to uncovering hidden patterns of money laundering. This also involves assessing client risk at specific points in the banking lifecycle, such as account opening or transactions above a certain monetary value, where data points could signal potentially illegitimate activity based on any number of features associated with such transactions. Any transaction could also lead to the filing of a suspicious activity report (SAR).
- Ongoing Monitoring – aggregating customer transactions across multiple geographies for pattern detection and reporting purposes. This involves creating a corporate taxonomy of rules that capture a natural language description of the conditions & patterns denoting various types of financial crime – terrorist financing, mafia laundering, drug trafficking, identity theft etc.
- SAR Investigation Lifecycle – these rules trigger downstream workflows that allow human investigation of flagged transactions
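The entity-analysis workstream above is, at its core, a graph problem: edges are known relationships (shared addresses, beneficial ownership, counterparty links), and walking the graph surfaces the "customer's customer" (KYCC) network. A minimal sketch, with made-up entity names:

```python
# Sketch: KYCC as graph traversal. Entity names and relationships are invented.
from collections import deque

def related_entities(edges: list, start: str) -> set:
    """All entities reachable from `start` over relationship edges (BFS)."""
    graph = {}
    for a, b in edges:
        graph.setdefault(a, set()).add(b)
        graph.setdefault(b, set()).add(a)
    seen, queue = {start}, deque([start])
    while queue:
        node = queue.popleft()
        for neighbor in graph.get(node, ()):
            if neighbor not in seen:
                seen.add(neighbor)
                queue.append(neighbor)
    return seen - {start}

edges = [("AcmeCorp", "ShellCo"), ("ShellCo", "J. Doe"), ("OtherCo", "K. Lee")]
network = related_entities(edges, "AcmeCorp")   # ShellCo and J. Doe
```

In production this runs over millions of nodes in a graph store, but the principle – risk propagates along relationship edges, not just account records – is the same.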
QUANTIFIABLE BENEFITS FROM DOING IT WELL…
Financial institutions that leverage new age technology (Big Data, Predictive Analytics, Workflow) in these five areas will be able to effectively analyze financial data and deter potential money launderers before they can proceed, providing the institution with the protection of full compliance with the regulations.
The business benefits include –
- Detect AML violations on a proactive basis thus reducing the probability of massive fines
- Save on staffing expenses for Customer Due Diligence (CDD)
- Increase accurate production of suspicious activity reports (SAR)
- Decrease the percent of corporate customers with AML-related account closures in the past year by customer risk level and reason – thus reducing loss of revenue
- Decrease the overall KYC profile backlog across geographies
- Help create Customer 360 views that can help accelerate CLV (Customer Lifetime Value) as well as Customer Segmentation from a cross-sell/up-sell perspective
Virtually every leading banking institution, securities firm and payment provider understands that they need to enhance their AML capabilities by a few notches and constantly evolve them as fraud itself morphs.
The question is whether they can form a true picture of their clients (both retail and institutional) on a real time basis, monitor every banking interaction while understanding its true context when merged with historical data, and detect unusual behavior. Further, creating systems that learn from these patterns truly helps minimize money laundering.
The next and final post in this two part series will examine how Big Data & Analytics help with each of the work streams discussed above.
 Building AML Regulatory Platforms for the Big Data Era – http://www.vamsitalkstech.com/?p=5
Big Data – Banking’s New Weapon Against Financial Crime – http://www.vamsitalkstech.com/?p=806
 Reference Architecture for AML
 WSJ – Know Your Customer’s Customer is the New Norm – http://blogs.wsj.com/riskandcompliance/2014/10/02/the-morning-risk-report-know-your-customers-customer-is-new-norm/
The most common questions that a lot of enterprises fail to answer accurately are –
- Is the Customer happy with their overall relationship experience?
- What mode of contact do they prefer? And at what time? Can Customers be better targeted at these channels at those preferred times?
- What is the overall Customer Lifetime Value (CLV), i.e. how much profit are we able to generate from this customer over their total lifetime?
- By understanding CLV across populations, can we leverage that to increase spend on marketing & sales for products that are resulting in higher customer value?
- How do we increase cross sell and up-sell of products & services?
- Does this customer fall into a certain natural segment and if so, how can we acquire most customers like them?
- Can different channels (Online, Mobile, IVR & POS) be synchronized? Can Customers begin a transaction in one channel and complete it in any of the others without having to resubmit their data?
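The CLV question above has a standard, simple formulation: per-period margin, decayed by the retention probability and discounted back to today. A minimal sketch with illustrative inputs:

```python
# Sketch: a common discounted CLV formula. All input figures are illustrative.

def customer_lifetime_value(margin: float, retention: float,
                            discount: float, periods: int) -> float:
    """CLV = sum over t of margin * retention**t / (1 + discount)**t."""
    return sum(
        margin * (retention ** t) / ((1.0 + discount) ** t)
        for t in range(1, periods + 1)
    )

# A customer yielding $100/year, 90% retention, 10% discount rate, over 5 years:
clv = customer_lifetime_value(100.0, 0.90, 0.10, 5)
```

Comparing this figure across segments is what enables the "spend more marketing on high-CLV populations" decision raised in the questions above.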
The first element of Digital is Customer Centricity, and it naturally follows that a 360 degree view is a huge aspect of that.
Illustration – Customer 360 view & its benefits
So what information is specifically contained in a Customer 360? The 360 degree view is a snapshot of the below types of data –
- Customer’s Demographic information – Name, Address, Age etc
- Length of the Customer-Enterprise relationship
- Products and Services purchased overall
- Preferred Channel & time of Contact
- Marketing Campaigns the customer has responded to
- Major Milestones in the Customer’s relationship
- Ongoing activity – Open Orders, Deposits, Shipments, Customer Cases etc
- Ongoing Customer Lifetime Value (CLV) Metrics and the Category of customer (Gold, Silver, Bronze etc)
- Any Risk factors – Likelihood of Churn, Customer Mood Alert, Ongoing issues etc
- Next Best Action for Customer
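The snapshot above can be expressed as a single record type. This is a sketch, not a reference schema: the field names simply mirror the list, and every default value is an assumption for illustration.

```python
# Sketch: the Customer 360 snapshot as one record. Field names and defaults
# are illustrative only.
from dataclasses import dataclass, field

@dataclass
class Customer360:
    name: str
    address: str
    age: int
    relationship_years: float
    products: list = field(default_factory=list)        # products & services held
    preferred_channel: str = "branch"
    campaigns_responded: list = field(default_factory=list)
    open_activity: list = field(default_factory=list)   # open orders, cases etc.
    clv: float = 0.0
    tier: str = "Bronze"                                # Gold / Silver / Bronze
    churn_risk: float = 0.0
    next_best_action: str = ""

profile = Customer360(name="A. Smith", address="1 High St", age=34,
                      relationship_years=6.5, clv=4200.0, tier="Silver",
                      next_best_action="offer mortgage pre-approval")
```

The real engineering work is not the record itself but keeping every field consistent across channel and product silos, which is exactly the data architecture problem the post describes.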
How can Big Data technology help?
Leveraging the ingestion and predictive capabilities of a Big Data based platform, banks can provide a user experience that rivals Facebook, Twitter or Google and provide a full picture of the customer across all touch points.
Big Data enhances the Customer 360 capability in the following ways –
- Obtaining a realtime Single View of the Customer (typically a customer across multiple channels, product silos & geographies) across years of account history
- Customer Segmentation by helping businesses understand customer segments down to the individual level as well as at a segment level
- Performing Customer sentiment analysis by combining internal organizational data, clickstream data, sentiment analysis with structured sales history to provide a clear view into consumer behavior.
- Product Recommendation engines which provide compelling personal product recommendations by mining realtime consumer sentiment, product affinity information with historical data.
- Market Basket Analysis, observing consumer purchase history and enriching this data with social media, web activity, and community sentiment regarding past purchase and future buying trends.
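The market basket idea above can be shown in miniature: count which product pairs co-occur across customer baskets, and use the frequent pairs to drive cross-sell suggestions. The product names are invented for illustration.

```python
# Sketch: pair-frequency market basket analysis. Product names are invented.
from collections import Counter
from itertools import combinations

def pair_counts(baskets: list) -> Counter:
    """Count co-occurring product pairs across all baskets."""
    counts = Counter()
    for basket in baskets:
        for pair in combinations(sorted(set(basket)), 2):
            counts[pair] += 1
    return counts

baskets = [
    ["checking", "savings", "credit card"],
    ["checking", "credit card"],
    ["checking", "mortgage"],
]
top_pair, top_count = pair_counts(baskets).most_common(1)[0]
# ("checking", "credit card") co-occurs in 2 of the 3 baskets
```

Production systems use association-rule measures (support, confidence, lift) over millions of baskets, but the co-occurrence count is the raw material for all of them.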
Customer 360 can immensely improve the following operational metrics of a Retailer, a Bank or a Telecom.
- Cost to Income ratio; Customers Acquired per FTE; Sales and service FTEs (as a percentage of total FTEs); New Accounts per Sales FTE etc.
- Sales conversion rates across channels, Decreased customer attrition rates etc.
- Improved Net Promoter Scores (NPS), referral based sales etc.
Customer 360 is thus a basic digital capability every organization needs to offer their customers, partners & internal stakeholders. This implies a re-architecture of both data management and business process automation.
The next post will discuss the second critical component of Digital Transformation – the Customer Journey.
“IT driven business transformation is always bound to fail” – Amber Storey, Sr Manager, Ernst & Young
The value of Big Data driven Analytics is no longer in question, from both a customer and an enterprise standpoint. Lack of investment in an analytics strategy has the potential to impact shareholder value negatively. Business Boards and CXOs are now concerned about the overall level and maturity of their investments in terms of business value – i.e. increasing sales, driving down business & IT costs & helping create new business models. It is thus an increasingly accurate argument that smart applications & the ecosystems built around them will increasingly dictate enterprise success.
Such examples among forward looking organizations abound across industries. These range from realtime analytics in manufacturing using IoT data streams across the supply chain, the use of natural language processing to drive patient care decisions in healthcare, more accurate insurance fraud detection & driving Digital interactions in Retail Banking etc to quote a few.
However, most global organizations currently adopt a fairly tactical approach to delivering traditional business intelligence (BI) and predictive analytics on their application platforms. This departmental approach is quite suboptimal, as scalable data driven decisions & culture not only empower decision-makers with up to date, realtime information but also help them develop long term insights into how globally diversified business operations are performing. Scale is the key word here, due to rapidly changing customer trends, partner & supply chain realities & regulatory mandates.
Scale implies speed of learning and business agility across the organization – the ability of globally diversified operations to turn on a dime, thus ensuring that the business feels empowered.
A quick introduction to Business (Descriptive & Predictive) Analytics –
Business intelligence (BI) is a traditional & well established analytical domain that essentially takes a retrospective look at business data in systems of record. The goal of BI is primarily to look for macro or aggregate business trends across different aspects or dimensions such as time, product lines, business units & operating geographies.
BI is primarily concerned with “What happened, and what trends exist in the business, based on historical data?”. The typical use cases for BI include budgeting, business forecasts, reporting & key performance indicators (KPIs).
On the other hand, Predictive Analytics (a subset of Data Science) augments & builds on the BI paradigm by adding a “What could happen” dimension to the data in terms of –
- being able to probabilistically predict different business scenarios across thousands of variables
- suggesting specific business actions based on the above outcomes
Predictive Analytics does not intend to nor will it replace the BI domain but only adds significant business capabilities that lead to overall business success. It is not uncommon to find real world business projects leveraging both these analytical approaches.
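The BI vs. predictive distinction above can be shown in code: one function answers "what happened" with an aggregate over history, while the other fits a least-squares trend to suggest "what could happen" next period. The sales figures are invented for illustration.

```python
# Sketch: descriptive (BI) vs. predictive analytics on the same series.
# The quarterly sales data is illustrative.

def describe(sales: list) -> float:
    """BI-style retrospective measure: average sales over the period."""
    return sum(sales) / len(sales)

def forecast_next(sales: list) -> float:
    """Predictive: ordinary least-squares trend, extrapolated one period."""
    n = len(sales)
    xs = range(n)
    x_mean = sum(xs) / n
    y_mean = sum(sales) / n
    slope = (sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, sales))
             / sum((x - x_mean) ** 2 for x in xs))
    intercept = y_mean - slope * x_mean
    return intercept + slope * n         # fitted value at the next time step

quarterly_sales = [10.0, 12.0, 14.0, 16.0]
avg = describe(quarterly_sales)          # what happened
nxt = forecast_next(quarterly_sales)     # what could happen next quarter
```

Real predictive work adds probability distributions over thousands of variables rather than a single trend line, but the shift in question – from "what was" to "what next" – is the same.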
Creating an industrial approach to analytics –
Strategic business projects typically imbibe a BI/Predictive Analytics based approach as an afterthought to the other aspects of system architecture and buildout. This dated approach ensures that analytics remains external to, and eventually operates in a reactive mode within, the business system.
Having said that, one does need to recognize that an industrial approach to analytics is a complex endeavor that depends on how an organization tackles the convergence of the below approaches –
- Organizational Structure
- New Age Technology
- A Platforms Mindset
Illustration – Embedding A Culture of Business Analytics into the Enterprise DNA..
Let's discuss them briefly –
Organizational Structure – The historical approach has been to staff analytics teams as a standalone division, often reporting to a CIO. This team has responsibility for both business intelligence and a siloed data strategy. Such a piecemeal approach to predictive analytics ensures that business & application teams adopt a “throw it over the wall” mentality over time.
So what needs to be done?
In the Digital Age, enterprises should look to centralize both data management and the governance of analytics as core business capabilities. I suggest a hybrid organizational structure where a Center of Excellence (COE) is created, reporting both to the office of the Chief Data Officer (CDO) and to individual business analytics leaders within the lines of business themselves.
This should be done to ensure that three specific areas are adequately tackled using a centralized approach –
- Investing in creating a data & analytics roadmap by creating a center of excellence (COE)
- Setting appropriate business milestones with “lines of business” value drivers built into a robust ROI model
- Managing Risk across the enterprise with detailed scenario planning
New Age Technology –
The onset of Digital Architectures in enterprise businesses implies the ability to drive continuous online interactions with global consumers/customers/clients or patients. The goal is not just to provide engaging visualizations but also to personalize services clients care about across multiple modes of interaction. Mobile applications first began forcing enterprises to support multiple channels of interaction with their consumers. Exploding data generation across the global economy has become a clear & present business & IT phenomenon, with data volumes rapidly expanding across industries. It is not just that the production of data itself has increased; it is also driving the need for organizations to derive business value from it. This calls for the collection & curation of data from dynamic and highly distributed sources such as consumer transactions, B2B interactions, machines such as ATMs & geolocation devices, click streams, social media feeds, server & application log files and multimedia content such as videos – using Big Data.
Cloud Computing is the ideal platform to provide the business with self-service as well as rapid provisioning of business analytics. Every new application designed needs to be cloud native from the get-go.
For example, Banking now requires an ability to engage consumers in a seamless experience across an average of four to five channels – Mobile, eBanking, Call Center, Kiosk etc.
A Platforms Mindset –
As opposed to building standalone or one-off business applications, a Platform Mindset is a more holistic approach capable of producing higher revenues. Platforms abound in the webscale world at shops like Apple, Facebook & Google. Applications are constructed like Lego blocks, reusing customer & interaction data to drive cross-sell and up-sell among different product lines. The key is to start off with products that have high customer attachment & retention. While increasing brand value, it is also key to ensure that customers & partners can collaborate on improvements to the various applications hosted on top of the platform.
Culture – Business value fueled by analytics is only possible if the entire organization operates on an agile basis in order to collaborate across the value chain. Cross-functional teams across new product development, customer acquisition & retention, IT Ops, legal & compliance must collaborate in short work cycles to close the traditional business & IT innovation gap. Methodologies like DevOps, whose chief goal is to close the long-standing gap between the engineers who develop and test IT capability and the organizations that are responsible for deploying and maintaining IT operations, must be adopted. Using traditional app dev methodologies, it can take months to design, test and deploy software. No business today has that much time—especially in the age of IT consumerization and end users accustomed to smart phone apps that are updated daily. The focus now is on rapidly developing business applications to stay ahead of competitors that can better harness Big Data's amazing business capabilities.
Enterprise-wide business analytics approaches designed around the four key prongs (Structure, Culture, Technology & Platforms) will create immense operational efficiency, better business models and increased relevance, and will ultimately drive revenues. These will separate the visionaries and leaders from the laggards in the years to come.
Illustration: Business- IT Relationship (Image src – Pat.it)
Previous posts in this blog have discussed the fact that technological capabilities now make or break business models. It is critical for IT to operate in a manner that maximizes their efficiency while managing costs & ultimately delivering the right outcomes for the organization.
It is clear and apparent to me that the relationship lines of business (LOBs) have with their IT teams – typically central & shared – is completely broken at a majority of large organizations. Each side cannot seem to view either the perspective or the passions of the other. This dangerous dysfunction usually leads to multiple complaints from the business. Examples of which include –
- IT is perceived to be glacially slow in providing infrastructure needed to launch new business initiatives or to amend existing ones. This leads to the phenomenon of ‘Shadow IT’ where business applications are run on public clouds bypassing internal IT
- Something seems to be lost in translation while conveying requirements to different teams within IT
- IT is too focused on technological capabilities – Virtualization, Middleware, Cloud, Containers, Hadoop et al without much emphasis on business value drivers
So what are the top asks that Business has for their IT groups? I wager that there are five important focus areas –
- Transact in the language of the business – Most would agree that there has been too much of a focus on the technology itself – how it works, what the infrastructure requirements are to host applications (cloud or on-prem), which data engines to ingest and process it, etc. The focus needs to be on customer needs that drive business value for an organization's customers, partners, regulators & employees. Technology at its core is just an engine and does not exist in a vacuum. The most vibrant enterprises understand this ground reality and always ensure that business needs drive IT and not the other way around. It is thus highly important for IT leadership to understand the nuances of the business to ensure that their roadmaps (long and medium term) are being driven with business & competitive outcomes in mind. Examples of such goals are a common organization-wide taxonomy across products, customers, logistics, supply chains & business domains. The shared emphasis of both business & IT should be on goals like increased profitability per customer and enhanced segmentation of both micro and macro customer populations, with the ultimate goal of increasing customer lifetime value (CLV).
- Bi-Modal or “2 Speed” IT approaches need to be business centric – Digital business models that are driving agile web-scale companies offer enhanced customer experiences built on product innovation and data-driven business models. They are also encroaching into the domain of established industry players in verticals like financial services, retail, entertainment, telecommunications, transportation and insurance by offering contextual & trendy products tailored to individual client profiles. Their savvy use of segmentation data and realtime predictive analytics enables the delivery of bundles of tailored products across multiple delivery channels (web, mobile, point of sale, Internet, etc.). The enterprise response has been to adopt a model known as Bi-Modal IT, championed by Gartner. This model envisages two different IT camps – one focused on traditional applications and the other focused on innovation. Whatever the moniker for this approach, LOBs need to be involved as stakeholders from the get-go & throughout the process of selecting technology choices that have downstream business ramifications. One approach that is working well is increased cross-pollination across both teams, collapsing artificial organizational barriers by adopting DevOps & ensuring that business has a slim IT component to rapidly fill gaps in IT's business knowledge or capability.
- Self Service Across the board of IT Capabilities – Shadow IT (where business goes around the IT team) is not just an issue with infrastructure software but is slowly creeping up to business intelligence and advanced analytics apps. The delays associated with provisioning legacy data silos combined with using tools that are neither intuitive nor able to scale to deal with the increasing data deluge are making timely business analysis almost impossible to perform. Insights delivered too late are not very valuable. Thus, LOBs are beginning to move to a predominantly online SaaS (Software As A Service) model across a range of business intelligence applications. Reports, visual views of internal & external datasets are directly served to internal consumers based on data uploaded into a cloud based BI provider. These reports and views are then directly delivered to end users. IT needs to enable this capability and make it part of their range of offerings to the business.
- Help the Business think analytically – Business Process Automation (BPM) and data-driven decision making are proven approaches at data-driven organizations. When combined with data and business analytics, this tends to be a killer combination. Organizations that are data & metric driven are able to define key business processes that provide native support for the key performance indicators (KPIs) critical and basic to their functioning. Applications developed by IT need to be designed in such a way that these KPIs can be communicated and broadcast across the organization constantly. Indeed, a high percentage of organizations now have a senior executive in place as the champion for BPM, business rules and Big Data driven analytics. These applications are also mobile native so that they can be accessed through a variety of mobile platforms by field-based employees & from back inside the corporate firewall.
- No “Us vs Them” mentality – it is all “Us” – All of the above are only possible if the entire organization operates on an agile basis in order to collaborate across the value chain. Cross-functional teams across new product development, customer acquisition & retention, IT Ops, legal & compliance must collaborate in short work cycles to close the traditional business & IT innovation gap. One of the chief goals of agile methodologies is to close the long-standing gap between the engineers who develop and test IT capability and the business requirements for such capabilities. Using traditional app dev methodologies, it can take months to design, test and deploy software – which is simply unsustainable.
Business & IT need to collaborate. Period. –
The most vibrant enterprises that have implemented web-scale practices not only offer “IT/Business As A Service” but have also instituted strong cultures of symbiotic relationships between customers (both current & prospective), employees, partners and developers.
No business today has much time for innovation—especially in the age of IT consumerization, where end users are accustomed to smart phone apps that are often updated daily. The focus now is on rapidly developing business applications to stay ahead of competitors that can better harness technology's amazing business capabilities.
- Full service online Robo-advisor that is a 100% automated without any human element
- Hybrid Robo-advisor model being pioneered by firms like Vanguard & Charles Schwab
- Pure online advisor that is primarily human in nature
The major business areas & client offerings in the Wealth & Asset Management space have been covered in the first post in this series at http://www.vamsitalkstech.com/?p=2329
Automated advisors only cover a subset of all of the above at the moment. The major use cases are as below –
- Determine individual client profiles & preferences – e.g. for a given client profile, determine financial goals, expectations of investment return, diversification etc
- Identify appropriate financial products that can be offered either as pre-packaged portfolios or custom investments based on the client profile identified in the first step
- Establish the correct investment mix for the client's profile – these can include, but are not limited to, equities, bonds, ETFs & other securities in the firm's portfolios. For instance, placing tax-inefficient assets in retirement accounts like IRAs as well as tax-efficient municipal bonds in taxable accounts etc.
- Using an algorithmic approach, choose the appropriate securities for each client account
- Continuously monitor the portfolio & the transactions within it to tune performance, lower transaction costs, manage tax impacts etc based on how the markets are doing. Also ensure that the client's preferences are being incorporated so that appropriate diversification and risk mitigation is being performed
- Provide value added services like Tax loss harvesting to ensure that the client is taking tax benefits into account as they rebalance portfolios or accrue dividends.
- Finally, ensure the best user experience by handling a whole range of financial services – trading, account administration, loans, bill pay, cash transfers, tax reporting, statements – in one intuitive user interface.
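The continuous monitoring and rebalancing step above can be sketched as a simple drift-threshold rule. The asset classes, target weights and 5% band below are illustrative assumptions, not any particular firm's methodology:

```python
# Hypothetical sketch of a robo-advisor's rebalancing rule.
# Target mix, tickers and the 5% drift threshold are invented for illustration.

target_mix = {"US_EQUITY": 0.50, "INTL_EQUITY": 0.20, "BONDS": 0.25, "CASH": 0.05}

def rebalance_orders(holdings, threshold=0.05):
    """Return buy/sell amounts (in $) needed to restore the target mix
    for any asset class that has drifted past the threshold."""
    total = sum(holdings.values())
    orders = {}
    for asset, target_weight in target_mix.items():
        current_weight = holdings.get(asset, 0.0) / total
        drift = current_weight - target_weight
        if abs(drift) > threshold:
            # Negative amount = sell, positive = buy.
            orders[asset] = round(-drift * total, 2)
    return orders

# A $100,000 portfolio where equities have run up past the drift band:
portfolio = {"US_EQUITY": 62_000, "INTL_EQUITY": 18_000,
             "BONDS": 16_000, "CASH": 4_000}
print(rebalance_orders(portfolio))
```

A production robo-advisor would layer tax-lot selection, trading costs, tax-loss harvesting and client-specific constraints on top of a rule like this; the sketch only shows the core drift-based trigger.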
Illustration: Betterment user interface. Source – Joe Jansen
To illustrate these concepts in action, leaders like Wealthfront & Betterment are increasingly adding features where highly relevant, data-driven advice is provided based on existing data as well as data aggregated from other providers. Wealthfront now provides recommendations on diversification, taxes and fees that are personalized not only to the specific investments in a client's account, but also tailored to their specific financial profile and risk tolerance. For instance, is enough cash being set aside in the emergency fund? Is a customer holding too much stock in their employer?
The final post will look at a technology & architectural approach to building out a Robo-advisor. We will also discuss best practices from a WM & industry standpoint in the context of Robo-advisors.
- Wealthfront Blog – “Introducing the new Dashboard”
“If you don’t like change, you will like irrelevance even less.” -General Eric Shinseki, Former US Secretary of Veterans Affairs
This blog has spent time documenting the ongoing digital disruption across the industry especially financial services. Is there proof that creative destruction is taking a hold in Banking? The answer is a clear & unequivocal “Yes”. Clearly, Retail Banking is undergoing a massive makeover. This is being driven by many factors – changing consumer preferences, the advent of technology, automation of business processes & finally competition from not just the traditional players but also the Fintechs. The first casualty of this change is the good old Bank Branch. This post looks at the business background of Retail Banking across the world & will try to explain my view on what is causing this shift in how Banks and consumers perceive financial services.
This blog post is the first in a series of five standalone posts on Retail Bank transformation. The intention of this first post is to discuss industry dynamics and the current state of competition, and to briefly introduce the forces causing a change in the status quo. The second post will categorize FinTechs across the banking landscape with key examples of how they disintermediate established players. The remaining posts will examine each of the other forces in more detail, along with specific and granular advice to retail banks on how to incorporate innovation into their existing technology, processes and organizational culture.
Retail Banking is perhaps one of the most familiar and regular services that everyday citizens use in the course of their lives. Money is a commodity we touch every day in our lives when we bank, shop, pay bills, borrow etc. Retail lines of banking typically include personal accounts, credit cards, mortgages and auto loans.
For large financial conglomerates that have operations spanning Commercial Banking, Capital Markets, Wealth & Asset Management etc, retail operations have always represented an invaluable source of both stability and balance sheet strength. The sheer size & economic exposure of retail operations ensures that it is not only staid yet stable but also somewhat insulated from economic shocks. This is borne out by the policies of the respective national central banks & treasury departments. Indeed, one of the main reasons regulators have bailed out banks in the past is the perception that Main Street & the common citizen's banking assets would become a casualty of increased risk taking by traders in the capital markets divisions. This scenario famously played out in the late 1920s, when a stock market crash quickly cascaded into a nationwide economic depression, a major cause of widespread economic contagion.
Thus, retail banking is crucial not just to the owning corporation but also to diverse stakeholders in the world economy – deposit holders, regulators such as the US Federal Reserve (in the US) & a host of other actors.
The State of Global Retail Banking –
In the financial crisis of 2008, retail banks not only held their own but also assumed a bigger share of revenues as the recovery got underway in the following years. According to a survey by Boston Consulting Group (BCG), retail banking activities accounted for 55 percent of the revenues generated across a global cohort of 140 banks, up from 45 percent in 2006.
However, the report also contends that retail revenues have been slowly falling since 2008 as investors have begun shifting their savings to deposits in reaction to high-profile financial scandals, thus putting pressure on margins. Higher savings rates have helped offset this somewhat, & retail banks ended up maintaining better cost-to-income ratios (CIR) than other areas of banking. Retail banks also performed better on a key metric, return on assets (ROA). The below graphic from the BCG captures this metric. In the Americas region, the average ROA was 162 percent higher than the average group ROA in 2008. From 2001 through 2006, it was 51 percent higher. Global banking revenues stood at $1.59 trillion in 2015 – a figure that is expected to hold relatively steady across the globe.
It is also important to note that global performance of retail banks across the five major regions: the Americas, Europe, the Middle East, Asia, and Australia has generally varied based on a multitude of factors. And even within regions, banking performance has varied widely.
Illustration 1 – Retail Banking is profitable and stable
As stable as this sector seems, it is also being roiled by four main forces that are causing every major player to rethink its business strategy. Left unaddressed, these changes will have huge negative impacts on competitive viability & profitability, and will also impact all-important growth over the next five years.
What is the proof that retail banking is beginning to change? The below graphic from CNN says it all –
Bank of America has 23% fewer branches and 37% fewer employees than in 2009. That downward trend across both metrics is expected to continue as online transactions (from deposits to checks to online loans) have grown by a staggering 94%. The bank is expected to cut more positions, reflecting a shrinking headcount and branch footprint.
Pressure from the FinTechs:
The Financial Services and the Insurance industry are facing an unprecedented amount of change driven by factors like changing client preferences and the emergence of new technology—the Internet, mobility, social media, etc. These changes are immensely profound, especially with the arrival of “FinTech”—technology-driven applications that are upending long-standing business models across all sectors from retail banking to wealth management & capital markets. Further, members of a major new segment, Millennials, increasingly use mobile devices, demand more contextual services and expect a seamless unified banking experience—something akin to what they experience on web properties like Facebook, Amazon, Uber, Google or Yahoo. These webscale players expand their wallet share of client revenues by offering contextual products tailored to individual client profiles. Their savvy use of segmentation data and predictive analytics enables the delivery of bundles of tailored products across multiple delivery channels (web, mobile, call center banking, point of sale, ATM/kiosk etc.).
Retail Banking must trend Digital to respond –
While the definition of Digital is somewhat nebulous, I would like to define the key areas where its impact and capabilities will need to be felt for this gradual transformation to occur.
A true Digital Bank needs to –
- Offer a seamless customer experience much like the one provided by the likes of Facebook & Amazon, i.e. highly interactive & intelligent applications that can detect a single customer's journey across multiple channels
- Offer data-driven interactive services and products that can detect customer preferences on the fly, match them with existing history and provide value-added services – ones that not only provide a better experience but also foster a longer-term customer relationship
- Be able to help the business prototype, test, refine and rapidly develop new business capabilities
- Above all, treat Digital as a Constant Capability and not as an ‘off the shelf’ product or a one off way of doing things
The five areas that established banks need to change across are depicted below.
- Convert branches to be advisory & relationship focused instead of centers for transactions – As the number of millennials keeps growing, the actual traffic to branches will only continue to decline. Branches still have an area of strength in being intimate customer touch points. The branch of the future can be redesigned to have more self service features along with relationship focused advisory personnel instead of purely being staffed by tellers and managers. They need to be reimagined as Digital Centers, not unlike an Apple store, with highly interactive touch screens and personnel focused on building business through high margin products.
- Adopt a FinTech-like mindset – FinTechs (new-age financial industry startups) offer enhanced customer experiences built on product innovation and agile business models. They expand their wallet share of client revenues by offering contextual products tailored to individual client profiles. Their savvy use of segmentation data and predictive analytics enables the delivery of bundles of tailored products across multiple delivery channels (web, mobile, Point of Sale, Internet, etc.). Like banks, these technologies support multiple modes of payments at scale, but they aren't bound by the same regulatory and compliance requirements as banks, which operate under a mandate to demonstrate that they understand their risk profiles. The best retail banks will not only seek to learn from, but sometimes partner with, emerging fintech players to integrate new digital solutions and deliver exceptional customer experience. To cooperate with and take advantage of fintechs, banks will require new partnering capabilities. To heighten their understanding of customers' needs and to deliver products and services that customers truly value, banks will need new capabilities in data management and analytics.
- Understand your customer – Banks need to move to a predominantly online model, providing consumers with highly interactive, engaging and contextual experiences that span multiple channels—branch banking, eBanking, POS, ATM, etc. Further goals are increased profitability per customer for both micro and macro customer populations with the ultimate goal of increasing customer lifetime value (CLV).
- Business Process improvement – Drive Automation across lines of business – Financial services are fertile ground for business process automation, since most banks across their various lines of business are simply a collection of core and differentiated processes. Examples of these processes are consumer banking (with processes including on boarding customers, collecting deposits, conducting business via multiple channels, and compliance with regulatory mandates such as KYC and AML); investment banking (including straight-through-processing, trading platforms, prime brokerage, and compliance with regulation); payment services; and wealth management (including modeling model portfolio positions and providing complete transparency across the end-to-end life cycle). The key takeaway is that driving automation can result not just in better business visibility and accountability on behalf of various actors. It can also drive revenue and contribute significantly to the bottom line. Automation enables enterprise business and IT users to document, simulate, manage, automate and monitor business processes and policies. It is designed to empower business and IT users to collaborate more effectively, so business applications can be changed more easily and quickly.
- Agile Culture – All of the above are only possible if the entire organization operates on an agile basis in order to collaborate across the value chain. Cross functional teams across new product development, customer acquisition & retention, IT Ops, legal & compliance must collaborate in short work cycles to close the traditional business & IT innovation gap. One of DevOps’s chief goals is to close the long-standing gap between the engineers who develop and test IT capability and the organizations that are responsible for deploying and maintaining IT operations. Using traditional app dev methodologies, it can take months to design, test and deploy software. No business today has that much time—especially in the age of IT consumerization and end users accustomed to smart phone apps that are updated daily. The focus now is on rapidly developing business applications to stay ahead of competitors that can better harness Big Data’s amazing business capabilities.
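To make the process automation bullet above concrete, here is a minimal, hypothetical sketch of one onboarding check expressed as automated rules. The field names, watchlist and $10,000 threshold are invented for illustration; real KYC/AML screening involves fuzzy matching, external data feeds and regulatory workflows far beyond this:

```python
# Illustrative sketch of automating one onboarding (KYC-style) check as a
# rule-driven process step. Rules, field names and thresholds are hypothetical.

SANCTIONED = {"ACME SHELL CO"}  # stand-in for a real watchlist feed

def kyc_screen(application):
    """Run a new-account application through simple automated rules and
    return a routing decision plus the reasons behind it."""
    reasons = []
    if not application.get("government_id"):
        reasons.append("missing government ID")
    if application.get("name", "").upper() in SANCTIONED:
        reasons.append("watchlist match")
    if (application.get("initial_deposit", 0) > 10_000
            and not application.get("source_of_funds")):
        reasons.append("large deposit without source-of-funds")
    decision = "manual_review" if reasons else "auto_approve"
    return {"decision": decision, "reasons": reasons}

print(kyc_screen({"name": "Jane Doe", "government_id": "P1234567",
                  "initial_deposit": 2_500}))
```

The value of encoding a process this way is exactly what the bullet describes: the rules are documented, monitorable and changeable by business and IT together, and every routing decision carries an auditable explanation.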
How can all of this be quantified? –
The results of BCG's sixth annual Global Retail-Banking Excellence benchmarking illustrate the value drivers. Forward-looking banks working on some of the above aspects are able to reduce cycle times for core processes, thus improving productivity. The leaders in the survey are also reallocating resources from the middle and back office to customer-facing roles.
Again, according to the BCG, digital reinvention comes with huge benefits to both the top and bottom lines. Their annual survey across the global retail banking sector estimates an average reduction in operating expenses of 15% to 25%, an increase in pretax profit of 20% to 30% and an average increase in pre-tax margins of 5% to 10%. These numbers are highly impressive at the scale that large banks operate.
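As a back-of-the-envelope illustration of what those ranges mean, consider a hypothetical retail bank with $10B in revenue and $6B in operating expenses. The bank's figures are invented; only the percentage ranges come from the BCG survey cited above.

```python
# Applying BCG's reported ranges to a purely hypothetical bank P&L.
revenue = 10_000  # $M, invented
opex = 6_000      # $M, invented
pretax_profit = revenue - opex  # $4,000M

# Operating-expense reduction of 15% to 25%:
opex_savings_low, opex_savings_high = opex * 0.15, opex * 0.25

# Pretax-profit increase of 20% to 30%:
profit_low, profit_high = pretax_profit * 1.20, pretax_profit * 1.30

print(f"OpEx savings: ${opex_savings_low:,.0f}M to ${opex_savings_high:,.0f}M")
print(f"Pretax profit after reinvention: ${profit_low:,.0f}M to ${profit_high:,.0f}M")
```

Even at the low end of the ranges, the arithmetic implies close to a billion dollars of annual expense savings for a bank of this (hypothetical) size, which is why the survey's numbers matter at large-bank scale.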
The question thus is, can the vast majority of Banks change before it’s too late? Can they find the right model of execution in the Digital Age before their roles are either diminished or dis-intermediated by competition?
We will dive deep into the FinTechs in the next post in the series.
- CNN Money – Bank of America has 23% fewer branches than 2009
- BCG Research – Winning Strategies Revisited for Retail Banking
- BCG Research – Global Capital Markets 2016: The Value Migration
Previous posts in this blog have discussed how Capital markets firms must create new business models and offer superior client relationships based on their vast data assets. Firms that can infuse a data driven culture in both existing & new areas of operation will enjoy superior returns and raise the bar for the rest of the industry in 2016 & beyond.
Capital Markets are the face of the financial industry to the general public and generate a large percentage of the GDP of the world economy. Despite all the negative press they have garnered since the financial crisis of 2008, capital markets perform an important social function in that they contribute heavily to economic growth and are the primary vehicle for household savings. Firms in this space allow corporations to raise capital using the underwriting process. However, it is not just corporations that benefit from such money-raising activity – municipal, local and national governments do the same as well; the overall mechanism simply differs, in that business enterprises issue both equity and bonds while governments typically issue bonds. According to the Boston Consulting Group (BCG), the industry will grow to annual revenues of $661 billion in 2016 from $593 billion in 2015 – a healthy 12% increase. On the buy side, the asset base (AuM – Assets under Management) is expected to reach around $100 trillion by 2020, up from $74 trillion in 2014.
Within large banks, the Capital Markets group and the Investment Banking group perform very different functions. Capital Markets (CM) is the face of the bank to the street from a trading perspective. The CM group engineers custom derivative trades that hedge exposure for its clients (typically hedge funds, mutual funds, corporations, governments, high net worth individuals and trusts) as well as for its own treasury group. It may also do proprietary trading on the bank's behalf for a profit – although it is this type of trading that the Volcker Rule seeks to eliminate.
If a Bank uses dark liquidity pools (DLP), it funnels its brokerage trades through the CM group to avoid the fees associated with executing an exchange trade on the street. Such activities can also be used to hide exchange-based trading activity from the street. In the past, banks made substantial revenues by profiting from their proprietary trading or by collecting fees for executing trades on behalf of their treasury group or other clients.
Banking, and within it capital markets, continues to generate enormous amounts of data. Producers range from news providers to electronic trading participants to stock exchanges, which are increasingly looking to monetize data. And it is not just the banks: regulatory authorities like FINRA in the US are processing peak volumes of 40-75 billion market events a day (http://www.vamsitalkstech.com/?p=1157). In addition to data volumes, Capital Markets has always faced a data variety challenge as well. Firms have tons of structured data around traditional banking data, market data, reference data & other economic data. You can then factor in semi-structured data around corporate filings, news, retailer data & other gauges of economic activity, plus the newer data created from social media, multimedia etc. As a result, firms are presented with significant technology challenges and business opportunities.
Within larger financial supermarkets, the capital markets group typically leads the way in adopting cutting-edge technology and accounts for a high technology spend. Most of the compute-intensive problems originate in either this group or the enterprise risk group. These groups own the exchange-facing order management systems, the trade booking systems, the pricing libraries for the products the bank trades, as well as the tactical systems used to manage market and credit risk, customer profitability, compliance and collateral. They typically account for about one quarter of a bank's total IT budget. Capital markets thus has the largest number of use cases for risk and compliance.
Players across the value chain – the buy side, the sell side, the intermediaries (stock exchanges and custodians) and technology firms such as market data providers – are all increasingly looking to leverage these new data sets for business purposes beyond operational efficiency.
So what are the different categories of applications that are clearly leveraging Big Data in production deployments?
Illustration – How Capital Markets are leveraging Big Data in 2016
I have catalogued the major ones below, based on my work with the major firms across this spectrum over the last year.
- Client Profitability Analysis or Customer 360 View: With the passing of the Volcker Rule, large firms are moving to a model based on flow trading rather than relying on prop trading. It is thus critical for capital markets firms to understand their clients (institutional or otherwise) from a 360-degree perspective so they can be marketed to as a single entity across different channels – a key to optimizing profits through cross-selling in an increasingly competitive landscape. The 360 view encompasses defensive areas like risk and compliance, but also the ability to get a single view of profitability by customer across all trading desks, the investment bank and commercial lending.
- Regulatory Reporting – Dodd-Frank/Volcker Rule Reporting: Banks have begun to leverage data lakes to capture every trade, intraday and end of day, across its lifecycle. They then validate that no proprietary trading is occurring on the bank's behalf.
- CCAR & DFAST Reporting: Big Data can substantially improve the quality of raw data collected across multiple silos, which in turn improves the understanding of a bank's stress test numbers.
- Timely and accurate risk management: Running historical VaR (Value at Risk), statistical VaR, or both, to run the business and to compare against the enterprise risk VaR numbers.
- Timely and accurate liquidity management: Looking at tiered collateral and its liquidity profile on an intraday basis to manage the unit's liquidity. Desks also need to examine credit and market stress scenarios and assess the liquidity impact of those scenarios.
- Timely and accurate intraday credit risk management: Understanding when and if a deal breaches a tenor-bucketed limit before it is booked. For FX trading this means having about 9 milliseconds to determine whether the trade can be done. This is a natural fit for in-memory technology like Spark or Storm on a Hadoop-based platform. These use cases are key to increasing the capital that can be invested in the business; to secure it, desks need to convince upper management that they are managing their risks very tightly.
- Timely and accurate intraday market risk management: Leveraging Big Data for market risk computations ensures that banks have a real-time view of any breaches of their tenor-bucketed market limits.
- Reducing market data costs: Market data providers like Bloomberg, Thomson Reuters and other smaller agencies typically charge a fee each time data is accessed. Within a large firm, both the front office and risk access this data on an ad hoc, largely uncontrolled basis. A popular way to save on cost is to negotiate the right to access the data once and read it many times – the key being a place to put it, and that place is the data lake.
- Trade Strategy Development & Backtesting: Big Data is being leveraged to constantly backtest trading strategies and algorithms against large volumes of historical and real-time data. The ability to scale up computations as well as to incorporate real-time streams is key to this capability.
- Sentiment-Based Trading: Today, large trading groups and the desks within them monitor economic and political news as well as social media data to identify arbitrage opportunities – for instance, looking for correlations between news from the Middle East and the price of crude oil in the futures space, or using weather patterns to gauge demand for electricity in specific regional and local markets with a view to commodities trading. The real-time nature of these sources makes them information gold. Big Data provides the ability to bring all of these sources into one central location and use the gleaned intelligence to drive downstream activities in trading and private banking.
- Market & Trade Surveillance: Surveillance is an umbrella term for monitoring the wide array of trading practices that distort securities prices and enable market manipulators to profit illicitly at the expense of other participants by creating information asymmetry. Market surveillance is generally carried out by exchanges and Self-Regulatory Organizations (SROs) in the US, all of which have dedicated surveillance departments set up for this purpose. However, capital markets players on the buy and sell side also need to conduct extensive trade surveillance for internal reporting. Pursuant to this goal, the exchanges and SROs monitor transaction data, including orders and executed trades, and perform deep analysis to look for any kind of abuse or fraud.
- Buy Side (e.g. Wealth Management): A long list of use cases I have catalogued here – https://dzone.com/articles/the-state-of-global-wealth-management-part-2-big-d
- AML Compliance – Covered in various blogs and webinars.
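To make the historical VaR use case above concrete, here is a minimal sketch in Python. The P&L numbers are purely illustrative; a production calculation would run across full position-level histories on a distributed platform.

```python
# Historical VaR: the loss threshold that daily P&L exceeds only
# (1 - confidence) of the time, estimated from past outcomes.
def historical_var(pnl_history, confidence=0.99):
    """Return VaR as a positive number from a list of daily P&L values."""
    losses = sorted(pnl_history)           # worst outcomes first
    index = int((1 - confidence) * len(losses))
    return -losses[index]                  # flip sign: VaR is quoted as a loss

# Illustrative daily P&L for a small book (in USD thousands)
pnl = [-120, 45, 30, -80, 10, 95, -200, 60, -15, 25,
       -60, 70, -30, 40, -150, 55, 20, -90, 35, 5]
print(historical_var(pnl, confidence=0.95))   # 95% one-day VaR
```

The same sorted-tail logic scales from this toy series to millions of scenario P&L vectors, which is exactly where the distributed compute discussed in this post earns its keep.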
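The intraday credit check described above can be sketched as a simple pre-trade function. The tenor buckets, limits and exposure figures below are illustrative assumptions on my part; in a real deployment the exposure map would sit in an in-memory store fed by a streaming platform such as Spark.

```python
# Sketch of a pre-trade credit check against tenor-bucketed limits.
# Bucket names, limits and exposures are illustrative only.
TENOR_BUCKETS = [("0-1Y", 365), ("1-5Y", 365 * 5), ("5Y+", float("inf"))]

def tenor_bucket(days_to_maturity):
    """Map a deal's days-to-maturity onto its tenor bucket name."""
    for name, cutoff in TENOR_BUCKETS:
        if days_to_maturity <= cutoff:
            return name

def check_trade(counterparty, days_to_maturity, notional, exposures, limits):
    """Return True if booking the trade keeps exposure within the limit."""
    bucket = tenor_bucket(days_to_maturity)
    current = exposures.get((counterparty, bucket), 0.0)
    return current + notional <= limits[(counterparty, bucket)]

limits = {("HF-ALPHA", "0-1Y"): 10_000_000, ("HF-ALPHA", "1-5Y"): 5_000_000}
exposures = {("HF-ALPHA", "0-1Y"): 9_500_000}
print(check_trade("HF-ALPHA", 90, 400_000, exposures, limits))  # fits under limit
print(check_trade("HF-ALPHA", 90, 600_000, exposures, limits))  # would breach
```

The 9-millisecond budget mentioned above is precisely why this lookup has to run against an in-memory view of exposures rather than a database round trip.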
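Strategy backtesting, at its simplest, replays a rule over historical prices. The moving-average crossover below is a toy sketch with made-up prices, not a recommended strategy; real backtests run over years of tick data on distributed infrastructure.

```python
# Toy backtest: go long when price closes above its moving average,
# stay flat otherwise; accumulate per-bar P&L while long.
def backtest(prices, window=3):
    """Return total P&L of the crossover rule over the price series."""
    pnl, position = 0.0, 0
    for i in range(window, len(prices)):
        avg = sum(prices[i - window:i]) / window
        if position:                      # mark P&L on each bar held
            pnl += prices[i] - prices[i - 1]
        position = 1 if prices[i] > avg else 0
    return pnl

prices = [100, 101, 102, 104, 103, 105, 107, 106, 108, 110]
print(backtest(prices))
```

Swapping the Python list for a distributed dataset of bars is what lets the same logic be re-run constantly against both historical archives and live streams.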
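A trade surveillance check can start as simply as flagging executions priced far from the running market. The z-score detector below is a deliberately crude sketch of the idea (real surveillance platforms use rich pattern libraries); note that with a small sample the achievable z-score is bounded, so the threshold here is set low.

```python
# Crude off-market trade detector: flag executions whose price deviates
# sharply from the sample mean. Trade data is illustrative.
import statistics

def flag_off_market(trades, threshold=2.0):
    """Return ids of trades whose price z-score exceeds the threshold."""
    prices = [t["price"] for t in trades]
    mean, stdev = statistics.mean(prices), statistics.stdev(prices)
    return [t["id"] for t in trades
            if stdev and abs(t["price"] - mean) / stdev > threshold]

trades = [{"id": i, "price": p} for i, p in
          enumerate([100.1, 100.2, 99.9, 100.0, 100.3, 100.1, 112.0])]
print(flag_off_market(trades))   # flags the 112.0 execution
```

In practice the same pattern runs continuously over streamed executions, with the mean and deviation maintained as rolling statistics per instrument.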
The Final Word –
A few tactical recommendations to industry CIOs:
- Firstly, capital markets players should look to create centralized trade repositories for operations, traders and risk management. This would allow consolidation of systems and a reduction in costs by providing a single platform to replace operations systems, compliance systems and desk-centric risk systems – eliminating numerous redundant data and application silos, simplifying operations, reducing duplicated quant work and improving the understanding of risk.
- Secondly, it is important to put in place a model to create sources of funding for discretionary projects that can leverage Big Data.
- Third, capital markets groups typically have to fund their portion of AML, Dodd-Frank, Volcker Rule, trade compliance, enterprise market risk and traded credit risk projects. These are all mandatory spends; only after them do groups get to tackle discretionary business projects, e.g. funding their liquidity risk, trade booking and tactical risk initiatives. These defensive efforts always get the short end of the stick, yet they should not be neglected when planning new initiatives.
- Finally, an area in which many current players are lacking is the ability to associate clients with a Legal Entity Identifier (LEI). Using a Big Data platform to assign logical and physical entity IDs to every individual and business the bank interacts with can have salubrious benefits, and Big Data can ensure firms do this without having to redo all of their customer on-boarding systems. This is key to achieving customer 360 views, AML and FATCA compliance, and accurate credit risk reporting.
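To make the entity-identifier idea concrete, here is a minimal sketch of deriving a stable internal id from normalized client attributes, layered on top of existing systems rather than replacing them. The normalization rules and names are illustrative assumptions, not a production entity-resolution algorithm.

```python
# Sketch: derive one stable internal entity id from normalized attributes,
# so the same client resolves to a single id across source systems.
# Normalization rules below are illustrative only.
import hashlib

def normalize(name, country):
    """Uppercase, strip punctuation and common legal suffixes."""
    cleaned = " ".join(name.upper().replace(",", " ").replace(".", " ").split())
    for suffix in (" INC", " LLC", " LTD", " PLC"):
        if cleaned.endswith(suffix):
            cleaned = cleaned[: -len(suffix)]
    return f"{cleaned}|{country.upper()}"

def entity_id(name, country):
    """Deterministic 16-hex-char id from the normalized key."""
    return hashlib.sha256(normalize(name, country).encode()).hexdigest()[:16]

# The same legal entity, spelled differently in two source systems,
# resolves to one id:
print(entity_id("Acme Holdings, Inc.", "us") == entity_id("ACME HOLDINGS INC", "US"))
```

Because the id is a pure function of normalized attributes, each on-boarding system can compute it independently, which is what avoids a rip-and-replace of existing customer systems.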
It is no longer enough for CIOs in this space to think in terms of tactical Big Data projects; they must think about creating platforms, and ecosystems around those platforms, that enable a variety of path-breaking activities generating a much higher rate of return.
 “The State of Capital Markets in 2016” – BCG Perspectives