Global Retail Banking Needs a Digital Makeover

“If you don’t like change, you will like irrelevance even less.” – General Eric Shinseki, former US Secretary of Veterans Affairs

This blog has spent time documenting the ongoing digital disruption across industries, especially financial services. Is there proof that creative destruction is taking hold in Banking? The answer is a clear & unequivocal “Yes”. Retail Banking is undergoing a massive makeover, driven by many factors – changing consumer preferences, the advent of new technology, automation of business processes & finally competition from not just the traditional players but also the FinTechs. The first casualty of this change is the good old Bank Branch. This post looks at the business background of Retail Banking across the world & explains my view on what is causing this shift in how Banks and consumers perceive financial services.

This blog post is the first in a series of five standalone posts on Retail Bank transformation. The intention of this first post is to discuss industry dynamics and the current state of competition, and to briefly introduce the forces causing a change in the status quo. The second post will categorize FinTechs across the banking landscape with key examples of how they disintermediate established players. The remaining posts will examine each of the other forces in more detail, along with specific and granular advice to retail banks on how to incorporate innovation into their existing technology, processes and organizational culture.

Introduction – 

Retail Banking is perhaps one of the most familiar and regular services that everyday citizens use in the course of their lives. Money is a commodity we touch every day in our lives when we bank, shop, pay bills, borrow etc. Retail lines of banking typically include personal accounts, credit cards, mortgages and auto loans. 

For large financial conglomerates with operations spanning Commercial Banking, Capital Markets, Wealth & Asset Management etc., retail operations have always represented an invaluable source of both stability and balance sheet strength. The sheer size & economic exposure of retail operations ensures that the business is not only staid yet stable but also somewhat insulated from economic shocks. This is borne out by the policies of the respective national central banks & treasury departments. Indeed, one of the main reasons regulators have bailed out banks in the past is the perception that Main Street & the common citizen’s banking assets would become a casualty of increased risk taking by traders in the capital markets divisions. This scenario famously played out in the Great Depression, which began with the stock market crash of 1929: the crash quickly cascaded into a nation-wide economic depression and was a major factor in causing widespread economic contagion.

Thus, retail banking is crucial not just to the owning corporation but also to diverse stakeholders in the world economy – deposit holders, the regulators led by the US Federal Reserve (in the US) & a host of other actors.

The State of Global Retail Banking – 

In the financial crisis of 2008, retail banks not only held their own but also assumed a bigger share of revenues as the recovery got underway in the following years. According to a survey by Boston Consulting Group (BCG), retail banking activities accounted for 55 percent of the revenues generated across a global cohort of 140 banks, up from 45 percent in 2006.[1] 

However, the report also contends that retail revenues have been slowly falling since 2008, as investors have begun shifting their savings to deposits in reaction to high profile financial scandals, thus putting pressure on margins. Higher savings rates have helped offset this somewhat, & retail banks ended up maintaining better cost-to-income ratios (CIR) than other areas of banking. Retail banks also performed better on a key metric, return on assets (ROA). The below graphic from BCG captures this metric. In the Americas region, the average ROA was 162 percent higher than the average group ROA in 2008. From 2001 through 2006, it was 51 percent higher. Global retail banking revenues stood at $1.59 trillion in 2015 – a figure that is expected to hold relatively steady across the globe. [2]
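To make the two metrics above concrete, here is a minimal, hedged sketch of how CIR and ROA are computed; all dollar figures below are illustrative placeholders, not BCG survey data.

```python
# Hedged sketch of the two profitability metrics discussed above.
# All dollar figures are illustrative placeholders, not BCG survey data.

def cost_to_income_ratio(operating_costs: float, operating_income: float) -> float:
    """CIR: operating costs as a fraction of operating income (lower is better)."""
    return operating_costs / operating_income

def return_on_assets(net_income: float, total_assets: float) -> float:
    """ROA: net income generated per unit of assets on the balance sheet."""
    return net_income / total_assets

# A hypothetical retail unit: $6bn costs on $10bn income; $3bn net income on $200bn assets.
cir = cost_to_income_ratio(6e9, 10e9)   # 0.6 -> a 60% cost-to-income ratio
roa = return_on_assets(3e9, 200e9)      # 0.015 -> a 1.5% return on assets
```

The appeal of retail banking in the BCG data is exactly this pairing: comparatively low CIR alongside a higher ROA than other banking divisions.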

It is also important to note that global performance of retail banks across the five major regions: the Americas, Europe, the Middle East, Asia, and Australia has generally varied based on a multitude of factors. And even within regions, banking performance has varied widely.[2]

Retail Banking - BCG

                                      Illustration 1 – Retail Banking is profitable and stable 

As stable as this sector seems, it is also being roiled by four main forces that are causing every major player to rethink its business strategy. Left unaddressed, these changes will have a huge negative impact on competitive viability, profitability & all-important growth over the next five years.

What is the proof that retail banking is beginning to change? The below graphic from CNN [1] says it all –

BofA_Branches_CNN

Bank of America has 23% fewer branches and 37% fewer employees than in 2009. That downward trend across both metrics is expected to continue, as online transactions (from deposits to checks to online loans) have grown by a staggering 94%. The bank is expected to cut more positions as its headcount and branch footprint continue to shrink. [1]

Pressure from the FinTechs:

The Financial Services and Insurance industries are facing an unprecedented amount of change driven by factors like changing client preferences and the emergence of new technology—the Internet, mobility, social media, etc. These changes are immensely profound, especially with the arrival of “FinTech”—technology-driven applications that are upending long-standing business models across all sectors from retail banking to wealth management & capital markets. Further, members of a major new segment, Millennials, increasingly use mobile devices, demand more contextual services and expect a seamless unified banking experience—something akin to what they experience on web properties like Facebook, Amazon, Uber, Google or Yahoo. FinTechs capitalize on this by expanding their wallet share of client revenues, offering contextual products tailored to individual client profiles. Their savvy use of segmentation data and predictive analytics enables the delivery of bundles of tailored products across multiple delivery channels (web, mobile, call center banking, point of sale, ATM/kiosk etc.).

Retail Banking must trend Digital to respond – 

The definition of Digital is somewhat nebulous, so I would like to define the key areas where its impact and capabilities will need to be felt for this gradual transformation to occur.

A true Digital Bank needs to –

  • Offer a seamless customer experience much like the one provided by the likes of Facebook & Amazon, i.e., highly interactive & intelligent applications that can detect a single customer’s journey across multiple channels
  • Offer data driven interactive services and products that can detect customer preferences on the fly, match them with existing history and provide value added services – services that not only provide a better experience but also foster a longer term customer relationship
  • Help the business prototype, test, refine and rapidly develop new business capabilities
  • Above all, treat Digital as a constant capability and not as an ‘off the shelf’ product or a one off way of doing things

The five areas that established banks need to change across are depicted below:

RetailBank_Value_Drivers

  1. Convert branches to be advisory & relationship focused instead of centers for transactions – As the number of millennials keeps growing, the actual traffic to branches will only continue to decline.  Branches still have an area of strength in being intimate customer touch points. The branch of the future can be redesigned to have more self service features along with relationship focused advisory personnel instead of purely being staffed by tellers and managers. They need to be reimagined as Digital Centers, not unlike an Apple store, with highly interactive touch screens and personnel focused on building business through high margin products.
  2. Adopt a FinTech like mindset – FinTechs (or new-age financial industry startups) offer enhanced customer experiences built on product innovation and agile business models. They do so by expanding their wallet share of client revenues, offering contextual products tailored to individual client profiles. Their savvy use of segmentation data and predictive analytics enables the delivery of bundles of tailored products across multiple delivery channels (web, mobile, point of sale, Internet, etc.). Like banks, these technologies support multiple modes of payments at scale, but FinTechs aren’t bound by the same regulatory and compliance requirements as banks, which operate under a mandate to demonstrate that they understand their risk profiles. The best retail banks will not only seek to learn from, but sometimes partner with, emerging FinTech players to integrate new digital solutions and deliver an exceptional customer experience. To cooperate with and take advantage of FinTechs, banks will require new partnering capabilities. To heighten their understanding of customers’ needs and to deliver products and services that customers truly value, banks will also need new capabilities in data management and analytics.
  3. Understand your customer – Banks need to move to a predominantly online model, providing consumers with highly interactive, engaging and contextual experiences that span multiple channels—branch banking, eBanking, POS, ATM, etc. Further goals are increased profitability per customer for both micro and macro customer populations with the ultimate goal of increasing customer lifetime value (CLV).
  4. Business Process improvement – Drive Automation across lines of business – Financial services are fertile ground for business process automation, since most banks across their various lines of business are simply a collection of core and differentiated processes. Examples of these processes are consumer banking (with processes including onboarding customers, collecting deposits, conducting business via multiple channels, and compliance with regulatory mandates such as KYC and AML); investment banking (including straight-through processing, trading platforms, prime brokerage, and compliance with regulation); payment services; and wealth management (including modeling portfolio positions and providing complete transparency across the end-to-end life cycle). The key takeaway is that driving automation can result not just in better business visibility and accountability on behalf of various actors; it can also drive revenue and contribute significantly to the bottom line. Automation enables enterprise business and IT users to document, simulate, manage, automate and monitor business processes and policies. It is designed to empower business and IT users to collaborate more effectively, so business applications can be changed more easily and quickly.
  5. Agile Culture – All of the above are only possible if the entire organization operates on an agile basis in order to collaborate across the value chain. Cross functional teams across new product development, customer acquisition & retention, IT Ops, legal & compliance must collaborate in short work cycles to close the traditional business & IT innovation gap.  One of DevOps’s chief goals is to close the long-standing gap between the engineers who develop and test IT capability and the organizations that are responsible for deploying and maintaining IT operations. Using traditional app dev methodologies, it can take months to design, test and deploy software. No business today has that much time—especially in the age of IT consumerization and end users accustomed to smart phone apps that are updated daily. The focus now is on rapidly developing business applications to stay ahead of competitors that can better harness Big Data’s amazing business capabilities.
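To make the customer-lifetime-value goal in point 3 above concrete, here is a minimal sketch of a standard discounted-margin CLV formula; the margin, retention and discount figures are illustrative assumptions, not bank data.

```python
# Hedged sketch of discounted customer lifetime value (CLV).
# annual_margin, retention_rate and discount_rate are illustrative assumptions.

def customer_lifetime_value(annual_margin: float,
                            retention_rate: float,
                            discount_rate: float,
                            years: int = 20) -> float:
    """Expected discounted margin: each year's margin is weighted by the
    probability the customer is still with the bank, then discounted."""
    return sum(
        annual_margin * (retention_rate ** t) / ((1 + discount_rate) ** t)
        for t in range(1, years + 1)
    )

# A customer yielding $400/year in margin, 90% annual retention, 8% discount rate.
clv = customer_lifetime_value(annual_margin=400, retention_rate=0.90, discount_rate=0.08)
```

Even this toy model shows why retention matters so much: a few points of retention compound over the horizon and move CLV far more than a one-off cross-sell.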

How can all of this be quantified? –

The results of BCG’s sixth annual Global Retail-Banking Excellence benchmarking illustrate these value drivers. Forward looking banks working on some of the above aspects are able to reduce cycle times for core processes, thus improving productivity. The leaders in the survey are also reallocating resources from the middle and back office to customer facing roles. [3]

Again, according to BCG, digital reinvention comes with huge benefits to both the top and bottom lines. Their annual survey across the global retail banking sector estimates an average reduction in operating expenses of 15% to 25%, an increase in pretax profit of 20% to 30% and an average increase in pretax margins of 5% to 10%. [3] These numbers are highly impressive at the scale at which large banks operate.
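Applied to a hypothetical bank, those ranges translate into dollar terms as follows; the $10bn expense base and $3bn pretax profit below are made-up figures for illustration only.

```python
# Hedged illustration: applying BCG's reported ranges to a hypothetical bank.
# The $10bn expense base and $3bn pretax profit are made-up figures.
opex, pretax = 10e9, 3e9

opex_savings_low, opex_savings_high = opex * 0.15, opex * 0.25
profit_gain_low, profit_gain_high = pretax * 0.20, pretax * 0.30

print(f"OpEx savings:  ${opex_savings_low / 1e9:.1f}bn to ${opex_savings_high / 1e9:.1f}bn")
print(f"Pretax uplift: ${profit_gain_low / 1e9:.1f}bn to ${profit_gain_high / 1e9:.1f}bn")
```

At this scale, even the low end of each range is a multi-billion-dollar swing, which is why digital reinvention is a board-level topic rather than an IT line item.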

The question thus is, can the vast majority of Banks change before it’s too late? Can they find the right model of execution in the Digital Age before their roles are either diminished or dis-intermediated by competition?

We will dive deep into the FinTechs in the next post in the series.

References

[1] CNN Money – Bank of America has 23% fewer branches than 2009

[2]BCG Research- Winning Strategies Revisited for Retail Banking

[3] BCG Research- Global Capital Markets 2016: The Value Migration

Capital Markets Pivots to Big Data in 2016

Previous posts in this blog have discussed how Capital markets firms must create new business models and offer superior client relationships based on their vast data assets. Firms that can infuse a data driven culture in both existing & new areas of operation will enjoy superior returns and raise the bar for the rest of the industry in 2016 & beyond. 

Capital Markets are the face of the financial industry to the general public and generate a large percentage of the GDP of the world economy. Despite all the negative press they have garnered since the financial crisis of 2008, capital markets perform an important social function in that they contribute heavily to economic growth and are a primary vehicle for household savings. Firms in this space allow corporations to raise capital using the underwriting process. It is not just corporations that benefit from such money raising activity – municipal, local and national governments do the same as well. The overall mechanism differs, however – while business enterprises issue both equity and bonds, governments typically issue bonds. According to the Boston Consulting Group (BCG), the industry will grow to annual revenues of $661 billion in 2016 from $593 billion in 2015 – a healthy increase of roughly 12%. On the buy side, the asset base (AuM – Assets under Management) is expected to reach around $100 trillion by 2020, up from $74 trillion in 2014. [1]

Within large banks, the Capital Markets group and the Investment Banking group perform very different functions. Capital Markets (CM) is the face of the bank to the street from a trading perspective. The CM group engineers custom derivative trades that hedge exposure for their clients (typically Hedge Funds, Mutual Funds, Corporations, Governments, high net worth individuals and Trusts) as well as for their own treasury group. They may also do proprietary trading on the bank’s behalf for a profit – although it is this type of trading that the Volcker Rule seeks to eliminate.

If a Bank uses dark liquidity pools (DLPs), it funnels its Brokerage trades through the CM group to avoid the fees associated with executing an exchange trade on the street. Such activities can also be used to hide exchange based trading activity from the Street. In the past, Banks made substantial revenues by profiting from their proprietary trading or by collecting fees for executing trades on behalf of their treasury group or other clients.

Banking, and within it capital markets, continues to generate enormous amounts of data. Producers range from news providers to electronic trading participants to stock exchanges, which are increasingly looking to monetize data. And it is not just the banks – regulatory authorities like FINRA in the US are processing peak volumes of 40-75 billion market events a day (http://www.vamsitalkstech.com/?p=1157) [2]. In addition to data volumes, Capital Markets has always possessed a data variety challenge as well. Firms hold large amounts of structured data – traditional banking data, market data, reference data & other economic data. Factor in semi-structured data around corporate filings, news, retailer data & other gauges of economic activity, plus the more recent creation of data from social media, multimedia etc., and firms are presented with significant technology challenges and business opportunities.

Within larger financial supermarkets, the capital markets group typically leads the way in being forward looking in terms of adopting cutting edge technology and high tech spend. Most of the compute intensive problems are generated out of either this group or the enterprise risk group. These groups own the exchange facing order management systems, the trade booking systems, the pricing libraries for the products the bank trades, as well as the tactical systems that are used to manage their market and credit risks, customer profitability, compliance and collateral. They typically hold about one quarter of a Bank’s total IT budget. Capital Markets thus has the largest number of use cases for risk and compliance.

Players across value chain on the buy side, the sell side, the intermediaries (stock exchanges & the custodians) & technology firms such as market data providers are all increasingly looking at leveraging these new data sets that can help unlock the value of data for business purposes beyond operational efficiency.

So what are the different categories of applications that are clearly leveraging Big Data in production deployments?

CapMkts_UseCases

                      Illustration – How are Capital Markets leveraging Big Data In 2016

I have catalogued the major ones below, based on my work with the majors in the spectrum over the last year.

  1. Client Profitability Analysis or Customer 360 view:  With the passing of the Volcker Rule, the large firms are now moving over to a model based on flow based trading rather than relying on prop trading. Thus it is critical for capital market firms to better understand their clients (be they institutional or otherwise) from a 360-degree perspective so they can be marketed to as a single entity across different channels—a key to optimizing profits with cross selling in an increasingly competitive landscape. The 360 view encompasses defensive areas like Risk & Compliance but also the ability to get a single view of profitability by customer across all of their trading desks, the Investment Bank and Commercial Lending.
  2. Regulatory Reporting – Dodd Frank/Volcker Rule Reporting: Banks have begun to leverage data lakes to capture every trade intraday and end of day across its lifecycle. They then validate that no proprietary trading is occurring on the bank’s behalf.
  3. CCAR & DFast Reporting: Big Data can substantially improve the quality of  raw data collected across multiple silos. This improves the understanding of a Bank’s stress test numbers.
  4. Timely and accurate risk management: Running historical VaR (Value at Risk), statistical VaR or both to run the business and to compare with the enterprise risk VaR numbers.
  5. Timely and accurate liquidity management:  Look at the tiered collateral and their liquidity profiles on an intraday basis to manage the unit’s liquidity.  They also need to look at credit and market stress scenarios and be able to look at the liquidity impact of those scenarios.
  6. Timely and accurate intraday Credit Risk Management: Understanding when & if a deal breaches a tenor bucketed limit before they book it. For FX trading this means that you have about 9 milliseconds to determine if you can do the trade. This is a great place to use in-memory technology like Spark/Storm on a Hadoop based platform. These use cases are key to increasing the capital that can be invested in the business. To do this, desks need to convince upper management that they are managing their risks very tightly.
  7. Timely and accurate intraday Market Risk Management:  Leveraging Big Data to market risk computations ensures that Banks have a real time idea of any market limit breaches for any of the tenor bucketed market limits.
  8. Reducing Market Data costs: Market Data providers like Bloomberg, Thomson Reuters and other smaller agencies typically charge a fee each time data is accessed.  With a large firm, both the front office and Risk access this data on an ad-hoc fairly uncontrolled basis. A popular way to save on cost is to  negotiate the rights to access the data once and read it many times.  The key is that you need a place to put it & that is the Data Lake.
  9. Trade Strategy Development & Backtesting: Big Data is being leveraged to constantly backtest trading strategies and algorithms on large volumes of historical and real time data. The ability to scale up computations as well as to incorporate real time streams is key to this effort.
  10. Sentiment Based Trading: Today, large scale trading groups and desks within them have begun monitoring economic, political news and social media data to identify arbitrage opportunities. For instance, looking for correlations between news in the middle east and using that to gauge the price of crude oil in the futures space.  Another example is using weather patterns to gauge demand for electricity in specific regional & local markets with a view to commodities trading. The realtime nature of these sources is information gold. Big Data provides the ability to bring all these sources into one central location and use the gleaned intelligence to drive various downstream activities in trading & private banking.
  11. Market & Trade Surveillance: Surveillance is an umbrella term that usually refers to the policing of a wide array of trading practices that serve to distort securities prices, enabling market manipulators to illicitly profit at the expense of other participants by creating information asymmetry. Market surveillance is generally carried out by Exchanges and Self-Regulatory Organizations (SROs) like FINRA in the US – all of which have dedicated surveillance departments set up for this purpose. However, capital markets players on the buy and sell side also need to conduct extensive trade surveillance to report up internally. Pursuant to this goal, the exchanges & the SROs monitor transaction data, including orders and executed trades, & perform deep analysis to look for any kind of abuse and fraud. Big Data shines at this use case, as discussed here – http://www.vamsitalkstech.com/?p=1212
  12. Buy Side (e.g. Wealth Management) – A huge list of use cases I have catalogued here – https://dzone.com/articles/the-state-of-global-wealth-management-part-2-big-d
  13. AML Compliance – Covered in various blogs and webinars:
    http://www.vamsitalkstech.com/?s=AML
    https://www.boozallen.com/insights/2016/04/webinar-anti-money-laudering – 
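To make use case 4 above concrete, here is a minimal sketch of historical VaR as an empirical tail quantile of daily P&L. Production systems run this at far larger scale on platforms like Spark/Hadoop; the P&L series below is randomly simulated placeholder data, not real trading history.

```python
import random

def historical_var(pnl, confidence=0.99):
    """Historical VaR: the loss level exceeded on only (1 - confidence)
    of the observed days, read off the empirical P&L distribution."""
    ordered = sorted(pnl)                        # losses (most negative) first
    idx = int((1 - confidence) * len(ordered))   # index of the tail quantile
    return -ordered[idx]                         # report the loss as a positive number

random.seed(42)
# Simulated daily desk P&L in USD -- placeholder data, not a real book.
daily_pnl = [random.gauss(0, 1_000_000) for _ in range(5_000)]
var_99 = historical_var(daily_pnl, confidence=0.99)
```

The Big Data angle is simply that `daily_pnl` becomes billions of revalued positions across thousands of risk factors, which is why these quantile computations get distributed across a cluster.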

The Final Word

A few tactical recommendations to industry CIOs:

  • Firstly, capital markets players should look to create centralized trade repositories for Operations, Traders and Risk Management. This would allow consolidation of systems and a reduction in costs by providing a single platform to replace operations systems, compliance systems and desk centric risk systems. This would eliminate numerous redundant data & application silos, simplify operations, reduce redundant quant work, and improve the understanding of risk.
  • Secondly, it is important to put in place a model to create sources of funding for discretionary projects that can leverage Big Data.
  • Third, Capital Markets groups typically have to fund their portion of AML, Dodd Frank, Volcker Rule, Trade Compliance, Enterprise Market Risk and Traded Credit Risk projects. These are all mandatory spends. Only after this do they typically get to tackle discretionary business projects, e.g., funding their liquidity risk, trade booking and tactical risk initiatives. These defensive efforts always get the short end of the stick and should not be neglected while planning out new initiatives.
  • Finally, an area in which a lot of current players are lacking is the ability to associate clients using a Legal Entity Identifier (LEI). Using a Big Data platform to assign logical and physical entity IDs to every human and business the bank interacts with can have significant benefits. Big Data can ensure that firms can do this without having to redo all of their customer on-boarding systems. This is key to achieving customer 360 views, AML and FATCA compliance, as well as accurate credit risk reporting.
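As a hedged sketch of the entity-resolution idea above: deterministic internal IDs can be derived from normalized client attributes, so the same party seen in several systems resolves to one record. Note this illustrates a hypothetical internal logical-ID scheme, not the actual LEI standard, which is issued by accredited registrars.

```python
import hashlib

def entity_id(name: str, date_of_birth: str, country: str) -> str:
    """Derive a deterministic 16-hex-character logical entity ID from
    normalized client attributes (a hypothetical internal scheme)."""
    key = "|".join(s.strip().lower() for s in (name, date_of_birth, country))
    return hashlib.sha256(key.encode("utf-8")).hexdigest()[:16]

# The same person, seen in two systems with formatting differences,
# resolves to a single logical ID usable for customer 360 / AML linkage.
a = entity_id("Jane Q. Public", "1980-02-01", "US")
b = entity_id("  jane q. public ", "1980-02-01", "us")
assert a == b
```

In practice, real entity resolution also needs fuzzy matching and human review of conflicts; the deterministic hash is only the happy-path backbone.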

It is no longer enough for CIOs in this space to think in terms of tactical Big Data projects; they must think about creating platforms, and ecosystems around those platforms, to enable a variety of pathbreaking activities that generate a much higher rate of return.

References

[1] “The State of Capital Markets in 2016” – BCG Perspectives

[2] FINRA Technology –
http://technology.finra.org/

Big Data Driven Disruption – The Robo-Advisor..(1/3)

Wealth Management is one of the highest growth businesses for any medium to large financial institution. It is also the highest customer touch segment of banking, and is fostered on long term (read: extremely lucrative advisory) relationships. This first post in a three part series explores the automated “Robo-advisor” movement, covering the business background and some definitions. The second post will focus on technology & architectural approaches to building out a Robo-advisor. The final post will discuss best practices from a WM & industry standpoint in the context of Robo-advisors.

roboadvisor

(Image Credit – Forbes)

The term ‘Wealth Management‘ broadly refers to an aggregation of financial services that are typically bespoke and offered to highly affluent clients. These include financial advisory, personal investment management, and planning disciplines delivered directly for the benefit of high-net-worth (HNWI) clients. The term can refer to a wide range of possible functions and business models.

A wealth manager is a specialized financial advisor who helps a client construct an entire investment portfolio and advises on how to prepare for present and future financial needs. The investment portion of wealth management normally entails both asset allocation of a whole portfolio as well as the selection of individual investments. The planning function of wealth management often incorporates tax planning around the investment portfolio as well as estate planning for individuals as well as family estates.

The ability to sign up wealthy individuals & families, then retain them over the years by offering engaging, bespoke & contextual services, will largely drive growth in the Wealth Management industry in 2016 and beyond.

However, WM as an industry sector has lagged other areas within banking from a technology & digitization standpoint. Multiple business forces – increased regulatory & compliance demands, digital expectations from younger, technology savvy customers, and new-age FinTechs – have led firms to slowly begin a makeover process. Let us examine these trends in more detail.

Business Trends Driving the need for Robo/Automated Investment Advisors –

These trends are a combination of industry reality as well as changing preferences of the HNWI clientele –

  1. Growth in the Wealth Management business largely depends on the ability to sign up new clients. Previously, WM shops would not be interested in signing up clients with less than a certain value of investable assets (a typical threshold being $1 million). However, the need to on-ramp these clients into a long term relationship means being able to offer lower cost, automated business models that better fit their mindsets
  2. The mentality of younger clientele has also evolved over the years. These clients are technologically savvy, largely have a DIY (Do It Yourself) mindset, and their digital needs are largely being missed by the wealth management community. This rising segment demands digital services that are highly automated & 24/7 in nature, without the premium charged by a human advisor
  3. Regulatory and cost pressures are rising, leading to the commodification of services
  4. Innovative automation and usage of data assets among new entrants, aka the FinTechs, are leading to the rise of automated advisory services, thus challenging incumbent firms. Traditional brokerage firms like Morgan Stanley, Bank of America Corp. and Wells Fargo & Co. employed about 46,000 human advisers as of 2016. The challenge for these incumbent firms will be to develop such automated investing tools as well as offer more self-service channels for customers [2]
  5. A need to offer aggregated & holistic financial services tailored to the behavioral needs of the HNWI investors on an individual basis

So where is the biggest trend in this disruption? It is undoubtedly, the Robo-advisor.

Introducing the Automated Advisor (affectionately called the Robo-advisor) –

FinTechs led by Wealthfront and Betterment have pioneered the somewhat revolutionary concept of Robo-advisors. To define the term – a Robo-advisor is an algorithm based automated investment advisor that can provide a range of the Wealth Management services described below. The Robo-advisor can optionally be augmented & supervised by a human adviser. At the moment, owing to the popularity of Robo-advisors among younger high-net-worth investors (HNWI), a range of established players like Vanguard and Charles Schwab, as well as a number of FinTech start-ups, have developed these automated online investment tools, or have acquired FinTechs in this space (e.g., BlackRock). The Robo-advisor is built using digital techniques – such as data science & Big Data – as we will explore in the next post.

What service models can Robo-advisors satisfy –

Full service Wealth Management firms broadly provide services in the following core areas which Robo-advisors can slowly begin supplementing –

  1. Investment Advisory – Helping a client construct an investment portfolio that helps her/him prepare for life changes based on their respective risk appetites & time horizons. The financial instruments invested in range from the mundane (equities, bonds etc.) to the arcane (hedging derivatives etc.)
  2. Retirement Planning – Retirement planning is an obvious function of a client’s personal financial journey & one that lends itself to automation. From a HNWI standpoint, there is a need to provide complex retirement services while balancing taxes, income needs & estate preservation etc. Robo-advisors are able to bring in market trends and movements of securities to ensure that a client’s retirement holdings are not skewed toward particular sectors of the market.
  3. Estate Planning Services – A key function of wealth management is to help clients pass on their assets via inheritance. The Robo-advisor can assist human wealth managers in constructing wills that leverage trusts, and suggest suitable forms of insurance etc. to help facilitate a smooth process of estate planning
  4. Tax Planning – Robo-advisors can help clients manage their wealth in such a manner that tax impacts are reduced from a taxation (e.g., IRS in the US) perspective. As the pools of wealth increase, even small rates of taxation can have a magnified impact either way. The ability to achieve the right mix of investments from a tax perspective is a key capability and one that can be automated to a high degree
  5. Insurance Management – A Robo-advisor can help suggest and manage the kinds of insurance purchased by their HNWI clients so that the appropriate hedging services can be put in place based on the client’s specific investment mix & exposures
  6. Institutional Investments – Institutional Robo-advisors can provide investment services to investors like pension funds, hedge funds etc. while automating a variety of back-office functions for them

Currently most Robo-advisors limit themselves to providing the first function only i.e portfolio management (i.e. allocating investments among asset classes) without addressing issues such as estate and retirement planning and cash-flow management, which are also the domain of financial planning.[1]
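To make the portfolio-management function concrete, here is a minimal Python sketch of the core loop an automated advisor runs: derive a target allocation from a client’s risk tolerance and time horizon, then compute the rebalancing trades. The asset classes, weighting formula and thresholds below are illustrative assumptions, not any vendor’s actual model.

```python
# Toy model of a Robo-advisor's portfolio-management core.
# The blending formula and asset classes are hypothetical.

def target_allocation(risk_tolerance: float, years_to_goal: int) -> dict:
    """Map risk tolerance (0..1) and time horizon to target asset weights."""
    # More risk appetite and a longer horizon both raise equity exposure.
    equity = min(0.9, 0.4 * risk_tolerance + min(years_to_goal, 30) / 60)
    bonds = (1 - equity) * 0.8
    cash = 1 - equity - bonds
    return {"equities": round(equity, 2), "bonds": round(bonds, 2),
            "cash": round(cash, 2)}

def rebalance_orders(holdings: dict, target: dict,
                     portfolio_value: float) -> dict:
    """Dollar amount to buy (+) or sell (-) per asset class."""
    return {asset: round(target[asset] * portfolio_value
                         - holdings.get(asset, 0.0), 2)
            for asset in target}
```

Under this toy formula, a client with moderate risk tolerance and a 30-year horizon lands at roughly 70% equities; a real Robo-advisor would calibrate such curves with portfolio-optimization techniques rather than a hand-tuned blend.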

Expect this to change as the technology rapidly matures in the years to come with advances in cognitive computing. At Betterment, one of the earliest Robo-advisors, as of early 2016 more than half of the $3.3 billion of assets under management comes from people with more than $100,000 at the firm. Another early starter, Wealthfront, estimated more than a third of its almost $3 billion in assets is in accounts requiring at least $100,000. Schwab, one of the first established investment firms to produce an automated product, attracted $5.3 billion to its offering in its first nine months.[2]

Robo-advisory business models

Currently there are a few different business models that are being adopted by firms.

  1. Full service online Robo-advisor that is a 100% automated without any human element
  2. Hybrid Robo-advisor model being pioneered by firms like Vanguard & Charles Schwab
  3. Pure online advisory service that is primarily human driven

Conclusion –

As one can clearly see, automated investing methods are still in the early stages of maturity. However, they are unmistakably the next big trend in the WM industry and one that players should begin developing capabilities around. According to A.T. Kearney, by 2020, Robo-advisors will manage around $2.2 trillion in global HNWI assets.[2]

The next post in this three part series will focus on the pivotal role of Big Data in creating a Robo-advisor. We will discuss system requirements & propose a reference architecture. 

References

  1. Wikipedia – https://en.wikipedia.org/wiki/Robo-advisor
  2. Bloomberg – “The Rich are already using Roboadvisors and that scares the banks..”

Across Industries, Big Data Is Now the Engine of Digital Innovation..

“The data fabric is the next middleware.” – Todd Papaioannou, CTO at Splunk

Enterprises across the globe are confronting the need to create a Digital Strategy. While the term itself may seem intimidating to some, it essentially represents an agile culture built on customer centricity & responsiveness. The only way to attain Digital success is to understand your customers at a micro level while making strategic decisions on your offerings to the market. Big Data has become the catalyst in this massive disruption as it can help businesses in any vertical solve their need to understand their customers better. It aids this by providing a foundational platform for amazing products.

We have seen how exploding data generation across the globe has become a clear & present business & IT phenomenon. Data volumes are rapidly expanding across industries. However, it is not just the production of data itself that has increased; it is also driving the need for organizations to derive business value from it. This calls for the collection & curation of data from dynamic and highly distributed sources such as consumer transactions, B2B interactions, machines such as ATMs & geolocation devices, click streams, social media feeds, server & application log files and multimedia content such as videos. It needs to be noted that data volumes here consist of multi-varied formats, differing schemas, transport protocols and velocities.

The Internet of Things (IoT) has become a phenomenon unto itself. It is truly a horizontal vertical (no pun intended) as the proliferation of sensor applications is causing rapid change in system & application architectures. The system of IoT is burgeoning from the initial sensors, digital devices and mechanical automatons to cars, process monitoring systems, browsers, televisions, traffic cameras etc.

Big Data is thus crossing the innovation chasm. A vast majority of early adopter projects are finding business success with a strong gain in ROI (Return On Investment). The skills gap is slowly beginning to decrease, with the Hadoop ecosystem becoming a skill that every modern application developer needs to have. Increasingly, customers are leading the way by deploying Big Data in new and previously uncharted areas like cybersecurity, leading to massive cross-vertical interest.


The five elements in Digital Transformation, irrespective of the business vertical you operate in, are –

  1. Customer Centricity
  2. Realtime multichannel analytics
  3. Operational improvements – Risk, Fraud & Compliance
  4. Ability of the business to visualize data
  5. Marketing & Campaign optimization

The first element in Digital is Customer Centricity.

Big Data drives this in myriad ways –

  1. Obtaining a realtime Single View of an entity (typically a customer across multiple channels, product silos & geographies)
  2. Customer Segmentation by helping businesses understand their customers down to the individual level as well as at a segment level
  3. Customer sentiment analysis by combining internal organizational data, clickstream data, sentiment analysis with structured sales history to provide a clear view into consumer behavior.
  4. Product Recommendation engines which provide compelling personal product recommendations by mining realtime consumer sentiment, product affinity information with historical data.
  5. Market Basket Analysis, observing consumer purchase history and enriching this data with social media, web activity, and community sentiment regarding past purchase and future buying trends.
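As a concrete illustration of segmentation down to the individual level, the following pure-Python sketch scores a customer with the classic RFM (Recency, Frequency, Monetary) heuristic. The thresholds and segment names are made up for illustration; in practice this computation would run at scale over the data lake.

```python
# Minimal RFM-style customer scoring. All cutoffs are illustrative.

def rfm_segment(days_since_last_purchase: int, purchases_per_year: int,
                annual_spend: float) -> str:
    score = 0
    # Recency: recent buyers score higher.
    score += 2 if days_since_last_purchase <= 30 else (
        1 if days_since_last_purchase <= 90 else 0)
    # Frequency: regular buyers score higher.
    score += 2 if purchases_per_year >= 12 else (
        1 if purchases_per_year >= 4 else 0)
    # Monetary: big spenders score higher.
    score += 2 if annual_spend >= 5000 else (
        1 if annual_spend >= 1000 else 0)
    if score >= 5:
        return "champion"
    if score >= 3:
        return "loyal"
    return "at-risk"
```

The segment label can then feed recommendation engines or campaign selection downstream.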

Realtime Multichannel Analytics is the second piece of a Digital Strategy.

Mobile applications first began forcing enterprises to support multiple channels of interaction with their consumers. For example, Banking now requires an ability to engage consumers in a seamless experience across an average of four to five channels – Mobile, eBanking, Call Center, Kiosk etc. The healthcare industry stores patient data across multiple silos – ADT (Admit Discharge Transfer) systems, medication systems, CRM systems etc – but all of this must be exposed across different mediums of access. Data Lakes provide an ability to visualize all of a patient’s data in one place, thus improving outcomes. Every customer facing application needs to be both multi-channel and one that supports a unified 360 degree customer view across all these engagement points. Applications developed in 2016 and beyond must take a 360 degree approach to ensuring a continuous client experience across the spectrum of endpoints and the platforms that span them from a Data Visualization standpoint. Every serious business needs to provide a unified view of a customer across tens of product lines and geographies. Big Data not only provides the core foundational elements for a realtime view of the moving parts of the business but also enables businesses to listen to their customers.
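A toy sketch of the 360 degree view idea: partial records from each channel are folded into one unified profile, with fresher attributes winning. The channel names, fields and "last updated wins" rule are illustrative assumptions; real implementations also need identity resolution across keys.

```python
# Fold per-channel customer records into a single unified view.
# Records are processed oldest-first so newer values overwrite older ones.

def single_customer_view(records):
    view = {"channels": []}
    for rec in sorted(records, key=lambda r: r.get("updated", 0)):
        view["channels"].append(rec["channel"])
        for key, value in rec.items():
            if key not in ("channel", "updated") and value is not None:
                view[key] = value  # fresher record wins
    return view
```

A production pipeline would do the same merge continuously across product silos and geographies rather than on an in-memory list.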

A strategic approach to improving Risk, Fraud & Compliance analytics can add massive value and competitive differentiation in three distinct categories, as shown below.

  1. Exponentially improve existing business processes, e.g. Risk data aggregation and measurement, HIPAA/SOX/Manufacturing compliance, fraud detection
  2. Help create new business models and go to market strategies – by monetizing multiple data sources – both internal and external
  3. Vastly improve regulatory compliance by generating fresher and more accurate insights across silos of proprietary data
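As a flavor of the fraud detection use case, here is a deliberately simple sketch of one primitive signal: flagging a card transaction whose amount is a statistical outlier against the customer’s own history. Production systems combine many such signals (geolocation, velocity, merchant category) in realtime; the z-score threshold here is an arbitrary illustration.

```python
# Flag transactions whose amount deviates strongly from the
# customer's own spending history (simple z-score test).
import statistics

def is_amount_anomalous(history, amount, threshold=3.0):
    if len(history) < 5:
        return False  # too little history to judge
    mean = statistics.mean(history)
    stdev = statistics.stdev(history)
    if stdev == 0:
        return amount != mean
    return abs(amount - mean) / stdev > threshold
```

A Big Data platform makes this cheap to run per-customer, per-transaction, instead of against coarse global averages.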

The onset of Digital Architectures in enterprise businesses implies the ability to drive continuous online interactions with global consumers/customers/clients or patients. The goal is not just to provide engaging Visualization but also to personalize services clients care about across multiple modes of interaction. As noted above, Banking requires a seamless consumer experience across four to five channels, while Healthcare is a close second, where caregivers expect patient, medication & disease data at their fingertips with a few finger swipes on an iPad app.

The inability of outbound Marketing campaigns to reach engaged customers in a proactive manner via the right channel has been a big gap in their effectiveness. The old school strategy of blasting out direct mailers and emails does not work anymore, both from a cost as well as a customer engagement standpoint. Nowadays, campaigns for exciting new products & promotions need to be built on the rich customer intelligence assets that Big Data enables you to build. Examples of these capabilities are replete in sectors like Retail, where offering a positive purchase experience in terms of personalized offers, price comparisons, social network based sharing of experiences et al. drives higher customer engagement & loyalty.
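The shift from blast campaigns to intelligence-driven ones can be sketched as a simple selection over a customer-intelligence store: contact only customers with a high modeled propensity, on their preferred channel. All field names and the cutoff below are hypothetical.

```python
# Select campaign recipients from customer-intelligence records
# instead of mailing everyone. Fields and threshold are illustrative.

def campaign_targets(customers, min_propensity=0.6):
    """Return (customer_id, channel) pairs worth contacting."""
    return [(c["id"], c.get("preferred_channel", "email"))
            for c in customers
            if c.get("propensity", 0.0) >= min_propensity
            and not c.get("opted_out", False)]
```

The propensity score itself would come from the segmentation and recommendation models discussed earlier.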

The Final Word

My goal for this post was to communicate a business revelation that I have had in the past year. While the semantics of business processes, the use cases & the data sources, elements and formats may vary from industry to industry (e.g. Banking to Healthcare to Manufacturing to Telecom), the approaches as well as the benefits from leveraging a data & analytics driven business model essentially remain the same. These capabilities are beginning to separate the winners from the rest of the pack.

Open Enterprise Hadoop – as secure as Fort Knox

Previous posts in this blog have discussed customers leveraging Open Source, Big Data and Hadoop related technologies for a range of use cases across industry verticals. We have seen how a Hadoop-powered “Data Lake” can not only provide a solid foundation for a new generation of applications that provide analytics and insight, but can also increase the number of access points to an organization’s data. As diverse types of both external and internal enterprise data are ingested into a central repository, the inherent security risks must be understood and addressed by a host of actors in the architecture. Security is thus highly essential for organizations that store and process sensitive data in the Hadoop ecosystem. Many organizations must adhere to strict corporate security policies as well as rigorous industry guidelines. So how does open source Hadoop stack up to demanding standards such as PCI-DSS?

We have, from time to time, noted the ongoing digital transformation across industry verticals. For instance, banking organizations are building digital platforms that aim to engage customers, partners and employees. Retailers & Banks now recognize that the key to winning today’s customer is to offer a seamless experience across multiple channels of engagement. Healthcare providers want to offer their stakeholders – patients, doctors, nurses, suppliers etc – multiple avenues to access contextual data and services; the IoT (Internet of Things) domain is abuzz with the possibilities of Connected Car technology.

The aim of this blogpost is to dispel those notions which float around from time to time wherein a Hadoop-led, 100% open source ecosystem is cast as being somehow insecure or unable to fit well into a corporate security model. On this very subject, the Open Source Alliance has noted that – “Open source enables anyone to examine software for security flaws. The continuous and broad peer-review enabled by publicly available source code improves security through the identification and elimination of defects that might otherwise be missed. Gartner for example, recommends the open source Apache Web server as a more secure alternative to closed source Internet Information servers. The availability of source code also facilitates in-depth security reviews and audits by government customers.” [2]

It is a well understood fact that data is the most important asset a business possesses and the one nefarious actors are usually after. Let us consider the retail industry – cardholder data such as card numbers or PANs (Primary Account Numbers) & other authentication data is much sought after by the criminal population.

The consequences of a data breach are myriad & severe and can include –

  • Revenue losses
  • Reputational losses
  • Regulatory sanction and fines etc

Previous blogposts have chronicled cybersecurity in some depth. Please refer to this post as a starting point for a somewhat exhaustive view of cybersecurity. This awareness has led to increased adoption of risk based security frameworks, e.g. ISO 27001, the US National Institute of Standards and Technology (NIST) Cybersecurity Framework and the SANS Critical Controls. These frameworks offer a common vocabulary and a set of guidelines that enable enterprises to identify and prioritize threats, quickly detect and mitigate risks and understand security gaps.

In the realm of payment card data – regulators, payment networks & issuer banks themselves recognize this and have enacted a compliance standard – the PCI DSS (Payment Card Industry – Data Security Standard). PCI DSS is currently in its third generation, or v3.0, which was introduced over the course of 2014. It is the most important standard for a host of actors – merchants, processors, payment service providers or really any entity that stores or uses payment card data. It is also important to note that the process of compliance covers all applications and systems at a merchant or a payment service provider.

The PCI Standards Council recommends the following 12 components for PCI-DSS, as depicted in the table below.


Illustration: PCI Data Security Standard – high level overview (source: shopify.com)

While PCI covers a whole range of areas that touch payment data such as POS terminals, payment card readers, in store networks etc – data security is front & center.

It is to be noted, though, that according to the Security Standards Council which oversees the creation of & guidance around PCI, a technology vendor or product cannot by itself be declared “PCI Compliant.”

Thus, the standard has wide implications on two different dimensions –

1. The technology itself as it is incorporated at a merchant as well as

2. The organizational culture around information security policies.

My experience working at both Hortonworks & Red Hat has shown me that open source is certified at hundreds of enterprise customers running demanding workloads in verticals such as financial services, retail, insurance, telecommunications & healthcare. The other important point to note is that these customers are all PCI, HIPAA and SOX compliant across the board.

It is a total misconception that off-the-shelf, proprietary point solutions are needed to provide broad coverage across the security pillars discussed below. Open enterprise Hadoop offers comprehensive and well rounded implementations across all five areas and, what’s more, it is 100% open source.

Let us examine how security in Hadoop works.

The Security Model for Open Enterprise Hadoop – 

The Hadoop community has adopted both a top-down as well as a bottom-up approach to security, examining all potential access patterns across all components of the platform.

Hadoop and Big Data security needs to be considered across the below two prongs – 

  1. What do the individual projects themselves need to support to guarantee that business architectures built using them are highly robust from a security standpoint? 
  2. What are the essential pillars of security that the platform which makes up every enterprise cluster needs to support? 

Let us consider the first. The Apache Hadoop project contains 25+ technologies in the realm of data ingestion, processing & consumption. While anything beyond a cursory look is out of scope here, an exhaustive list of the security hooks provided in each of the major projects is covered here [1].

For instance, Apache Ranger manages fine-grained access control through a rich user interface that ensures consistent policy administration across Hadoop data access components. Security administrators have the flexibility to define security policies for a database, table and column, or a file, and can administer permissions for specific LDAP-based groups or individual users. Rules based on dynamic conditions such as time or geolocation, can also be added to an existing policy rule. The Ranger authorization model is highly pluggable and can be easily extended to any data source using a service-based definition.[1]

Administrators can use Ranger to define a centralized security policy for the following Hadoop components and the list is constantly enhanced:

  • HDFS
  • YARN
  • Hive
  • HBase
  • Storm
  • Knox
  • Solr
  • Kafka

Ranger works with standard authorization APIs in each Hadoop component, and is able to enforce centrally administered policies for any method used to access the data lake.[1]
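Besides the admin UI, Ranger policies can also be managed programmatically through its REST API. The sketch below composes a Hive table policy payload; the payload shape shown is an approximation that varies across Ranger versions, so treat the field names as illustrative rather than authoritative.

```python
# Compose an (illustrative) Ranger policy payload granting a group
# access to one Hive table. The JSON shape approximates Ranger's
# policy model: a service, resources, and policy items per group.
import json

def hive_table_policy(service, db, table, group, accesses):
    policy = {
        "service": service,                 # e.g. the cluster's Hive repo name
        "name": f"{db}.{table}-{group}",
        "resources": {
            "database": {"values": [db]},
            "table": {"values": [table]},
            "column": {"values": ["*"]},
        },
        "policyItems": [{
            "groups": [group],
            "accesses": [{"type": a, "isAllowed": True} for a in accesses],
        }],
    }
    return json.dumps(policy)
```

The resulting JSON would be POSTed to the Ranger admin server, which then pushes the policy to the relevant component plugins.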

Now for the second & more important question, from an overall platform perspective.

There are five essential pillars from a security standpoint that address critical needs that security administrators place on data residing in a data lake. If any of these pillars is vulnerable from an implementation standpoint, it ends up creating risk built into the organization’s Big Data environment. Any Big Data security strategy must address all five pillars, with a consistent implementation approach to ensure their effectiveness.


                             Illustration: The Essential Components of Data Security

  1. Authentication – does the user possess appropriate credentials? This is implemented via the Kerberos authentication protocol & allied concepts such as Principals, Realms & KDCs (Key Distribution Centers).
  2. Authorization – what resources is the user allowed to access based on business need & credentials? Implemented in each Hadoop project & integrated with an organization’s LDAP/AD.
  3. Perimeter Security – prevents unauthorized outside access to the cluster. Implemented via the Apache Knox Gateway, which extends the reach of Hadoop services to users outside of a Hadoop cluster. Knox also simplifies Hadoop security for users who access the cluster data and execute jobs.
  4. Centralized Auditing – implemented via Apache Atlas and its integration with Apache Ranger.
  5. Security Administration – deals with the central setup & control of all security information using a central console. Implemented via Apache Ranger, which provides centralized security administration and management. The Ranger Administration Portal is the central interface for security administration. Users can create and update policies, which are then stored in a policy database.


                                           Illustration: Centralized Security Administration
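To make the perimeter-security pillar concrete: external clients reach REST services such as WebHDFS through a single Knox HTTPS endpoint instead of talking to cluster nodes directly. The sketch below builds such a gateway URL; the host, the port 8443 and the topology name "default" are deployment-specific assumptions.

```python
# Build a WebHDFS request URL routed through an assumed Knox gateway.
# Knox proxies the call over HTTPS so NameNode/DataNode hosts stay
# hidden behind the perimeter.
from urllib.parse import urlencode

def knox_webhdfs_url(gateway_host, topology, hdfs_path, op="LISTSTATUS"):
    query = urlencode({"op": op})
    return (f"https://{gateway_host}:8443/gateway/{topology}"
            f"/webhdfs/v1{hdfs_path}?{query}")
```

The client authenticates to Knox (e.g. with LDAP credentials), and Knox in turn handles Kerberos to the cluster, which is exactly the simplification for outside users described above.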

It is also to be noted that as Hadoop adoption grows, workloads that harness data for complex business analytics and decision-making may need more robust data-centric protection (namely data masking, encryption, tokenization). Thus, in addition to Hadoop projects such as Apache Ranger, enterprises can take an augmentative approach. Partner solutions that offer data-centric protection for Hadoop data, such as Dataguise DgSecure for Hadoop, clearly complement an enterprise-ready Hadoop distribution (such as those from the open source leader Hortonworks) and are definitely worth a close look.

Summary

While implementing Big Data architectures in support of business needs, security administrators should look to address coverage for components across each of the above areas as they design the infrastructure. A rigorous & bottom-up approach to data security makes it possible to enforce and manage security across the stack through a central point of administration, which will likely prevent potential security gaps and inconsistencies. This approach is especially important for newer technology like Hadoop, where exciting new projects & data processing engines are always being incubated at a rapid clip. After all, the data lake is all about building a robust & highly secure platform on which data engines – Storm, Spark etc – and processing frameworks like MapReduce function to create business magic.

 References – 

[1] Hortonworks Data Security Guide
http://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.4.2/bk_Security_Guide/

[2] Open Source Alliance of America
http://opensourceforamerica.org/learn-more/benefits-of-open-source-software/

Why Software Defined Infrastructure & why now..(1/2)

The ongoing digital transformation in key verticals like financial services, manufacturing, healthcare and telco has incumbent enterprises fending off a host of new market entrants. Enterprise IT’s best answer is to increase the pace of innovation as a way of driving increased differentiation in business processes. Though data analytics & automation remain the linchpin of this approach, software defined infrastructure (SDI), built on the notions of cloud computing, has emerged as the main infrastructure differentiator, for a host of reasons which we will discuss in this two part blog.

Software Defined Infrastructure (SDI) is essentially an idea that brings together advances in a host of complementary areas spanning infrastructure software, data, and development environments. It supports a new way of building business applications. The core idea in SDI is that massively scalable applications (in support of diverse customer needs) describe their behavior characteristics (via configuration & APIs) to the underlying datacenter infrastructure, which simply obeys those commands in an automated fashion while abstracting away the underlying complexities.

SDI as an architectural pattern was originally made popular by the web scale giants – the so-called FANG companies of tech: Facebook, Amazon, Netflix and Alphabet (the erstwhile Google) – but has gradually begun making its way into the enterprise world.

Common Business IT Challenges prior to SDI –

  1. The cost of hardware infrastructure typically grows at a high percentage every year compared to growth in the total IT budget. Cost pressures are driving an overall relook at the different tiers across the IT landscape.
  2. Infrastructure is not yet completely under the control of IT application development teams, even though business realities dictate rapid app development to meet changing business requirements.
  3. Even for small, departmental-level applications, enterprises still need to deploy expensive proprietary stacks which are not only cost and deployment-footprint prohibitive but also take weeks to spin up in terms of provisioning cycles.
  4. Big-box proprietary solutions are prompting a hard look at Open Source technologies, which are lean and easy to use with a lightweight deployment footprint. Apps need to dictate footprint, not vendor provided containers.
  5. There are concerns with acquiring developers who are tooled on cutting edge development frameworks & methodologies; Big Box technologies have zero developer mindshare.

Key characteristics of an SDI

  1. Applications built on a SDI can detect business events in realtime and respond dynamically by allocating additional resources in three key areas – compute, storage & network – based on the type of workloads being run.
  2. Using an SDI, application developers can seamlessly deploy apps while accessing higher level programming abstractions that allow for the rapid creation of business services (web, application, messaging, SOA/ Microservices tiers), user interfaces and a whole host of application elements.
  3. From a management standpoint, business application workloads are dynamically and automatically assigned to the available infrastructure (spanning public & private cloud resources) on the basis of the application requirements, required SLA in a way that provides continuous optimization across the life cycle of technology.
  4. The SDI itself optimizes the entire application deployment by both externally provisioned APIs & internal interfaces between the five essential pieces – Application, Compute, Storage, Network & Management.
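The "applications describe, infrastructure obeys" idea can be illustrated with a toy placement loop: a workload declares its resource needs, and the management layer assigns it to whichever host can satisfy them. The spec format and first-fit logic below are purely illustrative, not any product’s API.

```python
# Toy SDI placement: a workload spec declares resource needs and a
# naive first-fit scheduler assigns it to available capacity.

def place(workload, hosts):
    """Pick the first host with enough free CPU and memory; None if full."""
    for host in hosts:
        if (host["free_cpu"] >= workload["cpu"]
                and host["free_mem_gb"] >= workload["mem_gb"]):
            host["free_cpu"] -= workload["cpu"]
            host["free_mem_gb"] -= workload["mem_gb"]
            return host["name"]
    return None

# Illustrative inventory and a declarative workload spec.
hosts = [{"name": "node1", "free_cpu": 4, "free_mem_gb": 8},
         {"name": "node2", "free_cpu": 16, "free_mem_gb": 64}]
web_tier = {"name": "web-tier", "cpu": 8, "mem_gb": 32}
```

Real cloud management platforms layer SLAs, affinity rules and continuous re-optimization on top of this basic idea.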

The SDI automates the technology lifecycle –

Consider the typical tasks needed to create and deploy enterprise applications. This list includes but is not limited to –

  • onboarding hardware infrastructure,
  • setting up complicated network connectivity to firewalls, routers, switches etc,
  • making the hardware stack available for consumption by applications,
  • figure out storage requirements and provision those
  • guarantee multi-tenancy
  • application development
  • deployment,
  • monitoring
  • updates, failover & rollbacks
  • patching
  • security
  • compliance checking etc.

The promise of SDI is to automate all of this from a business, technology, developer & IT administrator standpoint.

SDI Reference Architecture –

The SDI encompasses SDC (Software Defined Compute), SDS (Software Defined Storage), SDN (Software Defined Networking), Software Defined Applications and Cloud Management Platforms (CMP) into one logical construct, as can be seen from the below picture.

                      Illustration: The different tiers of Software Defined Infrastructure

At the core of the software defined approach are APIs. APIs control the lifecycle of resources (request, approval, provisioning, orchestration & billing) as well as the applications deployed on them. The SDI implies commodity hardware (x86) & a cloud based approach to architecting the datacenter.

The ten fundamental technology tenets of the SDI –

1. Highly elastic – scale up or scale down the gamut of infrastructure (compute – VM/Baremetal/Containers, storage – SAN/NAS/DAS, network – switches/routers/Firewalls etc) in near real time

2. Highly Automated – Given the scale & multi-tenancy requirements, automation is needed at all levels of the stack (development, deployment, monitoring and maintenance)

3. Low Cost – Oddly enough, the SDI operates at a lower CapEx and OpEx compared to the traditional datacenter due to reliance on open source technology & high degree of automation. Further workload consolidation only helps increase hardware utilization.

4. Standardization –  The SDI enforces standardization and homogenization of deployment runtimes, application stacks and development methodologies based on lines of business requirements. This solves a significant IT challenge that has hobbled innovation at large financial institutions.

5. Microservice based applications –  Applications developed for an SDI enabled infrastructure are developed as small, nimble processes that communicate via APIs and over infrastructure like messaging & service mediation components (e.g. Apache Kafka & Camel). This offers huge operational and development advantages over legacy applications. While one does not expect Core Banking applications to move over to a microservice model anytime soon, customer facing applications that need responsive digital UIs will definitely need to consider such approaches.

6. ‘Kind-of-Cloud’ Agnostic –  The SDI does not enforce the concept of private cloud, or rather it encompasses a range of deployment options – public, private and hybrid.

7. DevOps friendly –  The SDI enforces not just standardization and homogenization of deployment runtimes, application stacks and development methodologies but also enables a culture of continuous collaboration among developers, operations teams and business stakeholders, i.e. cross-departmental innovation. The SDI is a natural container for workloads that are experimental in nature and can be updated/rolled back/rolled forward incrementally based on changing business requirements. The SDI enables rapid deployment capabilities across the stack, leading to faster time to market for business capabilities.

8. Data, Data & Data –  The heart of any successful technology implementation is Data. This includes customer data, transaction data, reference data, risk data, compliance data etc. The SDI provides a variety of tools that enable applications to process data in a batch, interactive or low latency manner depending on the business requirements.

9. Security –  The SDI shall provide robust perimeter defense as well as application level security with a strong focus on a Defense In Depth strategy.

10. Governance –  The SDI enforces strong governance requirements for capabilities ranging from ITSM requirements – workload orchestration, business policy enabled deployment, autosizing of workloads to change management, provisioning, billing, chargeback & application deployments.

The next & final blog in this series will look at current & specific technology choices – as of 2016 – in building out an SDI.

Data Lakes power the future of Industrial Analytics..(1/4)

The first post in this four part series on Data lakes will focus on the business reasons to create one. The second post will delve deeper into the technology considerations & choices around data ingest & processing in the lake to satisfy myriad business requirements. The third will tackle the critical topic of metadata management, data cleanliness & governance. The fourth & final post in the series will focus on the business justification to build out a Big Data Center of Excellence (COE).

“Business owners at the C level are saying, ‘Hey guys, look. It’s no longer inordinately expensive for us to store all of our data. I want all of you to make copies. OK, your systems are busy. Find the time, get an extract, and dump it in Hadoop.’” – Mike Lang, CEO of Revelytix

The onset of Digital Architectures in enterprise businesses implies the ability to drive continuous online interactions with global consumers/customers/clients or patients. The goal is not just to provide engaging visualization but also to personalize services clients care about across multiple modes of interaction. Mobile applications first began forcing enterprises to support multiple channels of interaction with their consumers. For example, Banking now requires an ability to engage consumers in a seamless experience across an average of four to five channels – Mobile, eBanking, Call Center, Kiosk etc. Healthcare is a close second, where caregivers expect patient, medication & disease data at their fingertips with a few finger swipes on an iPad app.

Big Data has been the chief catalyst in this disruption. The Data Lake architectural & deployment pattern makes it possible to first store all this data & then enables the panoply of Hadoop ecosystem projects & technologies to operate on it to produce business results.

Let us consider a few of the major industry verticals and the sheer data variety that players in these areas commonly possess – 

The Healthcare & Life Sciences industry possess some of the most diverse data across the spectrum ranging from – 

  • Structured Clinical data e.g. Patient ADT information
  • Free hand notes
  • Patient Insurance information
  • Device Telemetry 
  • Medication data
  • Patient Trial Data
  • Medical Images – e.g. CAT Scans, MRIs, CT images etc

The Manufacturing industry players are leveraging the below datasets and many others to derive new insights in a highly process oriented industry-

  • Supply chain data
  • Demand data
  • Pricing data
  • Operational data from the shop floor 
  • Sensor & telemetry data 
  • Sales campaign data

Data In Banking – Corporate IT organizations in the financial industry have for many years been tackling data challenges caused by strict silo-based approaches that inhibit data agility.
Consider some of the traditional sources of data in banking –

  • Customer Account data e.g. Names, Demographics, Linked Accounts etc
  • Core Banking Data
  • Transaction Data which captures the low level details of every transaction (e.g debit, credit, transfer, credit card usage etc)
  • Wire & Payment Data
  • Trade & Position Data
  • General Ledger Data e.g AP (accounts payable), AR (accounts receivable), cash management & purchasing information etc.
  • Data from other systems supporting banking reporting functions.

Industries have changed around us since the advent of relational databases & enterprise data warehouses. Relational Databases (RDBMS) & Enterprise Data Warehouses (EDW) were built with very different purposes in mind. RDBMS systems excel at online transaction processing (OLTP) use cases, where massive volumes of structured data need to be processed quickly. EDWs, on the other hand, perform online analytical processing (OLAP) functions, where data extracts are taken from OLTP systems, loaded & sliced in different ways to provide analytical insight to the business. Neither kind of system is suited to handle both immense volumes of data and highly variable structures of data.


Let us consider the main reasons why legacy data storage & processing techniques are unsuited to new business realities of today.

  • Legacy data technology enforces a vertical scaling method that is sorely unsuited to handling massive volumes of data in a scale up/scale down manner
  • The structure of the data needs to be modeled in a paradigm called ‘schema on write’, which sorely inhibits time to market for new business projects
  • Traditional data systems suffer bottlenecks when large amounts of high variety data are processed using them
  • Legacy systems limit the types of analytics that can be performed. In industries like Retail, Financial Services & Telecommunications, enterprises need to build detailed models of customer accounts to predict their overall service level satisfaction in realtime. These models are predictive in nature and use data science techniques as an integral component. The higher volumes of data, along with the attribute richness that can be provided to them (e.g. transaction data, social network data, transcribed customer call data), ensure that the models are highly accurate & can provide an enormous amount of value to the business. Legacy systems are not a great fit here.

Given all of the above data complexity and the need to adopt agile analytical methods – what is the first step that enterprises must take? 

The answer is the adoption of the Data Lake as an overarching data architecture pattern. Let’s define the term first. A data lake is two things – a data storage repository and a data processing engine. A data lake provides “massive storage for any kind of data, enormous processing power and the ability to handle virtually limitless concurrent tasks or jobs”.[1] Data lakes are created to ingest, transform, process, analyze & finally archive large amounts of any kind of data – structured, semistructured and unstructured.
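To make the ingest-and-archive idea concrete, here is a minimal, hypothetical sketch of landing raw data in a lake's "raw zone" while preserving it byte-for-byte. The zone layout and metadata fields are illustrative assumptions, not part of any Hadoop standard:

```python
import json, hashlib, datetime, pathlib, tempfile

def ingest_raw(lake_root, source, payload: bytes):
    """Land a raw payload in the lake untouched, with capture metadata.

    The raw/ zone convention and metadata schema here are invented for
    illustration; no schema is imposed on the payload at write time.
    """
    raw_zone = pathlib.Path(lake_root) / "raw" / source
    raw_zone.mkdir(parents=True, exist_ok=True)
    digest = hashlib.sha256(payload).hexdigest()
    data_file = raw_zone / f"{digest[:12]}.dat"
    data_file.write_bytes(payload)  # stored as-is: schema is applied later, on read
    meta = {"source": source, "sha256": digest,
            "ingested_at": datetime.datetime.utcnow().isoformat()}
    (raw_zone / f"{digest[:12]}.meta.json").write_text(json.dumps(meta))
    return data_file

lake = tempfile.mkdtemp()
f = ingest_raw(lake, "core_banking", b'{"acct": 42, "balance": 100.0}')
print(f.read_bytes())
```

Because nothing about the payload is interpreted at ingest time, the same function accepts clinical notes, sensor telemetry or wire records equally well.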


                                  Illustration – The Data Lake Architecture Pattern

What Big Data brings to the equation beyond its strength in data ingest & processing is a unified architecture. For instance, MapReduce is the original framework for writing applications that process large amounts of structured and unstructured data stored in the Hadoop Distributed File System (HDFS). Apache Hadoop YARN opened Hadoop to other data processing engines (e.g. Apache Spark/Storm) that can now run alongside existing MapReduce jobs to process data in many different ways at the same time. The result is that ANY kind of application processing can be run inside a Hadoop runtime – batch, realtime, interactive or streaming.
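The MapReduce model mentioned above can be illustrated with a tiny, self-contained word count in plain Python; the map, shuffle and reduce phases below mirror what Hadoop performs at scale across HDFS blocks:

```python
from itertools import groupby
from operator import itemgetter

def map_phase(records):
    # MAP: emit a (key, 1) pair for every word in every input record
    for line in records:
        for word in line.split():
            yield (word.lower(), 1)

def reduce_phase(pairs):
    # SHUFFLE: group intermediate pairs by key (Hadoop does this between map & reduce)
    ordered = sorted(pairs, key=itemgetter(0))
    # REDUCE: sum the counts for each key
    return {key: sum(count for _, count in group)
            for key, group in groupby(ordered, key=itemgetter(0))}

counts = reduce_phase(map_phase(["the data lake", "the lake"]))
print(counts)  # {'data': 1, 'lake': 2, 'the': 2}
```

In a real cluster the mappers and reducers run in parallel across many nodes, but the programming contract – stateless map, framework-managed shuffle, per-key reduce – is exactly this.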


The Final Word

Specifically, the Data Lake architectural pattern provides the following benefits – 

The ability to store enormous amounts of data with a high degree of agility & low cost: The Schema On Read architecture makes it trivial to ingest any kind of raw data into Hadoop in a manner that preserves its structure. Business analysts can then explore this data and define a schema to suit the needs of their particular application.
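As a small illustration of Schema On Read, the sketch below ingests raw JSON lines untouched and only imposes an analyst-defined schema at query time; the field names, converters and defaults are invented for the example:

```python
import json

# Raw records landed in the lake exactly as received -- no upfront schema.
raw_lines = [
    '{"cust": "A12", "amount": "250.75", "channel": "mobile"}',
    '{"cust": "B07", "amount": "13.20"}',   # older feed: no channel field
]

def read_with_schema(lines, schema):
    """Apply a consumer-defined schema at read time (schema-on-read).

    `schema` maps field name -> (converter, default); fields missing from a
    raw record take the default, so old and new feed formats coexist.
    """
    for line in lines:
        rec = json.loads(line)
        yield {field: conv(rec[field]) if field in rec else default
               for field, (conv, default) in schema.items()}

schema = {"cust": (str, None), "amount": (float, 0.0), "channel": (str, "unknown")}
rows = list(read_with_schema(raw_lines, schema))
print(rows[1])  # {'cust': 'B07', 'amount': 13.2, 'channel': 'unknown'}
```

A different analyst can point a different schema at the same raw files, which is precisely the agility that schema-on-write systems lack.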

The ability to run any kind of Analytics on the data: Hadoop supports multiple access methods (batch, real-time, streaming, in-memory, etc.) to a common data set.  You are only restricted by your use case.

The ability to analyze, process & archive data while dramatically cutting cost: Since Hadoop was designed to work on low-cost commodity servers with direct attached storage, it helps dramatically lower the overall cost of storage. Thus enterprises are able to retain source data for long periods, providing business applications with far greater historical context.

The ability to augment & optimize Data Warehouses: Data lakes & Hadoop technology are not a ‘rip & replace’ proposition. While they provide a much lower cost environment than data warehouses, they can also be used as the compute layer to augment these systems. Data can be stored, extracted and transformed in Hadoop. Then a subset of the data, i.e. the results, is loaded into the data warehouse. This enables the EDW to leverage compute cycles and storage to perform truly high value analytics.
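A rough sketch of this augmentation pattern follows, with an in-memory SQLite database standing in for the EDW and plain Python standing in for Hadoop's transform layer; the table and field names are illustrative:

```python
import sqlite3

# Raw, detail-level transactions transformed in the "lake" layer...
raw_txns = [
    {"acct": "A1", "amount": 120.0}, {"acct": "A1", "amount": 80.0},
    {"acct": "B2", "amount": 300.0},
]

# ...aggregated down to a compact summary...
summary = {}
for t in raw_txns:
    summary[t["acct"]] = summary.get(t["acct"], 0.0) + t["amount"]

# ...and only the summarized subset is loaded into the warehouse
# (an in-memory sqlite database stands in for the EDW here).
edw = sqlite3.connect(":memory:")
edw.execute("CREATE TABLE acct_totals (acct TEXT PRIMARY KEY, total REAL)")
edw.executemany("INSERT INTO acct_totals VALUES (?, ?)", summary.items())
total_a1 = edw.execute(
    "SELECT total FROM acct_totals WHERE acct = 'A1'").fetchone()[0]
print(total_a1)  # 200.0
```

The warehouse never sees the raw detail, only the high-value aggregate, which is the cost and capacity win the pattern is after.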

The next post of the series will dive deeper into the architectural choices one needs to make while creating a high fidelity & business centric enterprise data lake.

References – 

[1] https://en.wikipedia.org/wiki/Data_lake

The six megatrends helping enterprises derive massive value from Big Data..

“The world is one big data problem.” – Andrew McAfee, associate director of the Center for Digital Business at MIT Sloan

Though Data as a topic has been close to my heart, it was often a subject I would not deal much with, given my preoccupation with applications, middleware, cloud computing & DevOps. However, I grabbed the chance to teach a Hadoop course in 2012 and it changed the way I looked at data – not merely as an enabler but as the true oil of business. Fast forward to 2016: I have almost completed an amazing and enriching year at Hortonworks. It is a good time for retrospection about how Big Data is transforming businesses across the Fortune 500 landscape. Thus, I present what is not merely the ‘Art of the Possible’ but ‘Business Reality’ – distilled insights from a year of working with real world customers, companies pioneering Big Data in commercial applications to drive shareholder value & customer loyalty.

Six Megatrends

  Illustration – The Megatrends helping enterprises derive massive value from Big Data 

Here are the six megatrends that will continue to drive Big Data into enterprise business & IT architectures for the foreseeable future.

  1. The Internet of Anything (IoAT) – The rise of the machines has been well documented, but enterprises have just begun waking up to the possibilities in 2016. The paradigm of harnessing IoT data by leveraging Big Data techniques has begun to gain industry wide adoption & cachet. For example, in the manufacturing industry, data is being gathered from a wide variety of sensors distributed geographically across factory locations running 24×7. Predictive maintenance strategies that pull together sensor data & prognostics are critical to efficiency & to optimizing the business. In other verticals like healthcare & insurance, massive data volumes are now being reliably generated from diverse sources of telemetry such as patient monitoring devices as well as human manned endpoints at hospitals. In transportation, these devices include cars in the consumer space, trucks & other field vehicles, and geolocation devices. Others include field machinery in oil exploration & server logs across IT infrastructure. The personal consumer space adds fitness devices like FitBit and home & office energy management sensors. All of this constitutes the trend that Gartner terms the Digital Mesh. The Mesh really is built from coupling machine data with the ever growing social media feeds, web clicks, server logs etc. The Digital Mesh leads to an interconnected information deluge which encompasses classical IoT endpoints along with audio, video & social data streams. This leads to huge security challenges as well as business opportunities for forward looking enterprises (including Governments). Applications that leverage Big Data to ingest, connect & combine these disparate feeds into one holistic picture of an entity – whether individual or institution – are clearly beginning to differentiate themselves. IoAT is starting to be a huge part of digital transformation initiatives, with more use cases emerging in 2017 across industry verticals.
  2. The Emergence of Unified Architectures – The onset of Digital Architectures in enterprise businesses implies the ability to drive continuous online interactions with global consumers/customers/clients or patients. The goal is not just to provide engaging visualizations but also to personalize services clients care about across multiple modes of interaction. Mobile applications first began forcing enterprises to support multiple channels of interaction with their consumers. For example, Banking now requires the ability to engage consumers in a seamless experience across an average of four to five channels – Mobile, eBanking, Call Center, Kiosk etc. Healthcare is a close second, where caregivers expect patient, medication & disease data at their fingertips with a few finger swipes on an iPad app. What Big Data brings to the equation beyond its strength in data ingest & processing is a unified architecture. For instance, MapReduce is the original framework for writing applications that process large amounts of structured and unstructured data stored in the Hadoop Distributed File System (HDFS). Apache Hadoop YARN opened Hadoop to other data processing engines (e.g. Apache Spark/Storm) that can now run alongside existing MapReduce jobs to process data in many different ways at the same time. The result is that ANY kind of application processing can be run inside a Hadoop runtime – batch, realtime, interactive or streaming.
  3. Consumer 360 – Mobile applications first began forcing enterprises to support multiple channels of interaction with their consumers. For example, Banking now requires the ability to engage consumers in a seamless experience across an average of four to five channels – Mobile, eBanking, Call Center, Kiosk etc. The healthcare industry stores patient data across multiple silos – ADT (Admit Discharge Transfer) systems, medication systems, CRM systems etc. Data Lakes provide an ability to visualize all of a patient’s data in one place, thus improving outcomes. The Digital Mesh (covered above) only exacerbates this semantic gap in user experiences as information consumers navigate applications while consuming services across the mesh – a mesh that is both multi-channel and one that needs a 360 degree customer view across all these engagement points. Applications developed in 2016 and beyond must take a 360 degree approach to ensuring a continuous client experience across the spectrum of endpoints and the platforms that span them from a Data Visualization standpoint. Every serious business needs to provide a unified view of a customer across tens of product lines and geographies.
  4. Machine Learning, Data Science & Predictive Analytics – Most business problems are data challenges, and an approach centered around data analysis helps extract meaningful insights that benefit the business. It is now common for enterprises to possess the capability to acquire, store and process large volumes of data using a low cost approach leveraging Big Data and Cloud Computing. At the same time, the rapid maturation of scalable processing techniques allows us to extract richer insights from data. What we commonly refer to as Machine Learning – a combination of econometrics, statistics, visualization, and computer science – extracts valuable business insights hiding in data and builds operational systems to deliver that value. Data Science has also evolved a new branch called Deep Neural Nets (DNNs). DNNs are what make possible the ability of smart machines and agents to learn from data flows and to make products that use them even more automated & powerful. Deep Learning involves the art of discovering data insights in a human-like pattern. The web scale world (led by Google and Facebook) has been vocal about its use of advanced Data Science techniques and the move of Data Science into advanced Machine Learning. Data Science is an umbrella concept that refers to the process of extracting business patterns from large volumes of structured, semi structured and unstructured data. It is emerging as the key ingredient in enabling a predictive approach to the business. Data Science & its applications across a range of industries are covered in the blogpost http://www.vamsitalkstech.com/?p=1846
  5. Visualization – Mobile applications first began forcing enterprises to support multiple channels of interaction with their consumers. For example, Banking now requires the ability to engage consumers in a seamless experience across an average of four to five channels – Mobile, eBanking, Call Center, Kiosk etc. The average enterprise user is also familiar with BYOD in the age of self service. The Digital Mesh only exacerbates this gap in user experiences as information consumers navigate applications while consuming services across a mesh that is both multi-channel and provides Customer 360 across all these engagement points. While information management technology has grown at a blistering pace, the human ability to process and comprehend numerical data has not. Applications being developed in 2016 are beginning to adopt intelligent visualization approaches that are easy to use, highly interactive and enable the user to manipulate corporate & business data with their fingertips – much like an iPad app. Tools such as intelligent dashboards, scorecards, mashups etc are helping change visualization paradigms that were based on histograms, pie charts and tons of numbers. Big Data improvements in data lineage & quality are greatly helping the visualization space.
  6. DevOps – Big Data powered by Hadoop has now evolved into a true application architecture ecosystem, as mentioned above. The 30+ components included in an enterprise grade platform like the Hortonworks Data Platform (HDP) include APIs (Application Programming Interfaces) to satisfy every kind of data need that an application could have – streaming, realtime, interactive or batch. Couple that with improvements in predictive analytics. In 2016, enterprise developers leveraging Big Data have been building scalable applications with data as a first class citizen. Organizations using DevOps are already reaping the rewards, as they are able to streamline, improve and create business processes to reflect customer demand and positively affect customer satisfaction. Examples abound in the webscale world (Netflix, Pinterest, and Etsy), but we now have established Fortune 1000 companies in verticals like financial services, healthcare, retail and manufacturing who are benefiting from Big Data & DevOps. Thus, 2016 will be the year when Big Data techniques will no longer be the preserve of classical Information Management teams but will move to the umbrella application development area, which encompasses the DevOps and Continuous Integration & Delivery (CI-CD) spheres.
    One of DevOps’ chief goals is to close the long-standing gap between the engineers who develop and test IT capability and the organizations that are responsible for deploying and maintaining IT operations. Using traditional app dev methodologies, it can take months to design, test and deploy software. No business today has that much time – especially in the age of IT consumerization and end users accustomed to smart phone apps that are updated daily. The focus now is on rapidly developing business applications to stay ahead of competitors that can better harness Big Data business capabilities. The microservices architecture approach advocated by DevOps – autonomous, cooperative yet loosely coupled applications built as a conglomeration of business focused services – is a natural fit for the Digital Mesh. The most important additive consideration for microservices based architectures in 2016 is Analytics Everywhere.
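To ground megatrend #4 above, the toy sketch below fits a logistic regression churn model with plain gradient descent. The features, data points and thresholds are entirely invented, and a real deployment would use a scalable library such as Spark MLlib or scikit-learn rather than hand-rolled code:

```python
import math

# Toy training set: (channels_used, complaints) -> churned? All values invented.
X = [(1, 4), (1, 3), (4, 0), (5, 1), (2, 3), (5, 0)]
y = [1, 1, 0, 0, 1, 0]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Logistic regression fit with plain stochastic gradient descent.
w, b, lr = [0.0, 0.0], 0.0, 0.1
for _ in range(2000):
    for (x1, x2), label in zip(X, y):
        p = sigmoid(w[0] * x1 + w[1] * x2 + b)
        err = p - label           # gradient of log-loss w.r.t. the logit
        w[0] -= lr * err * x1
        w[1] -= lr * err * x2
        b -= lr * err

def churn_probability(x1, x2):
    return sigmoid(w[0] * x1 + w[1] * x2 + b)

# A single-channel customer with many complaints should score far riskier
# than a multi-channel customer with none.
print(churn_probability(1, 4), churn_probability(5, 0))
```

Feeding such a model richer attributes (transaction data, social data, transcribed calls) is what makes the at-scale versions so much more accurate than rules alone.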

The Final Word – 

We have all heard about the growth of data volumes & variety. 2016 is perhaps the first year where forward looking business & technology executives have begun capturing commercial value from the data deluge by balancing analytics with creative user experience. 

Thus, modern data applications are making Big Data ubiquitous. Rather than existing as back-shelf tools for the monthly ETL run or for reporting, these modern applications can help industry firms incorporate data into every decision they make. Applications in 2016 and beyond are beginning to recognize that analytics are pervasive, relentless, realtime and thus embedded into our daily lives.

An Enterprise Wide Framework for Digital Cybersecurity..(4/4)

The first two posts in this series on Cybersecurity focused on the strategic issues around information security and the IT response from the datacenter. The third post discussed exciting new innovations being ushered in by Big Data techniques and players in the open source space. This fourth & final post in the series will focus on the business steps that Corporate Boards, Executive & IT leadership need to adopt from a governance & strategy standpoint to protect & insulate their businesses from the constant firehose of cyber attacks.

Cybersecurity – A Board level concern – 

Enterprise business is built around data assets, and data is the critical prong of any digital initiative. For instance, Digital Banking platforms & Retail applications are evolving into collections of data based ecosystems. These need to natively support loose federations of partner applications and regulatory applications which are API based & Cloud native. These applications are largely microservice architecture based & need to support mobile clients from the get go. Because they support massive numbers of users, and given their business priority, they tend to take a higher priority in the overall security equation.

The world of business is now driven by complex software & information technology.  IT is now enterprise destiny. Given all of this complexity across global operating zones, perhaps no other business issue has the potential to result in massive customer drain, revenue losses, reputation risks & lawsuits from affected parties as do breaches in Cybersecurity. A major breach in security is a quick game-changer and has the potential to put an organization in defensive mode for years.

Thus, Corporate Boards, which have long been insulated from technology decisions, now want to understand from their officers how they are overseeing and mitigating cybersecurity risk. Putting into place an exemplary program that can govern across a vast & quickly evolving cybersecurity threat landscape is a vital board level responsibility. The other important point to note is that the interconnected nature of these business ecosystems implies the need for external collaboration as well as a dedicated executive to serve as a Cyber czar.

Enter the formal role of the CISO (Chief Information Security Officer)….

The CISO typically heads an independent technology and business function with a dedicated budget & resources. Her or his mandate extends from physical security (equipment lockdown, fob based access control etc.) to setting architectural security standards for business applications as well as reviewing business processes. One of the CISO’s main goals is to standardize the internal taxonomy of cyber risk and to provide a framework for quantifying these risks across a global organization.

A new approach to cybersecurity as a business issue is thus highly called for. Enterprises have put in place formal programs for cybersecurity with a designated CISO (Chief Information Security Officer). The CISO has a team reporting to her which ensures that detailed threat assessments are created, with dedicated resources embedded both in the lines of business as well as in central architecture & operations to maintain smooth business continuity in the event of security breach led disruptions.

Cybersecurity – An Enterprise Wide Process – 

With all of that in mind, let us take a look at the components of an enterprise wide cybersecurity program in critical industries like financial services and insurance. I will follow each of the steps with detailed examples from a real world business standpoint. A key theme across the below will be to ensure that the cybersecurity program in and of itself does not turn burdensome to business operation & innovation. Doing so would defeat the purpose of having such a program.

The program is depicted in the below illustration.


                                             Illustration – Enterprise Cybersecurity Process

The first step is almost always an Assessment process, which itself has two sub components – business threat assessment & information threat assessment. The goal here should be to comprehensively understand the organization’s business ecosystem by taking into account every actor – internal or external – that interfaces with the business. For an insurance company, this includes customers, prospects, partner organizations (banks, reinsurance firms), and internal actors (e.g. underwriters, actuaries etc).

For a Bank, this includes fraud & cyber risks around retail customer ACH accounts, customer wires, commercial customer accounts along with the linked entities they do business with, millions of endpoint devices like ATMs & POS terminals, a wide payments ecosystem etc. Understanding the likely business threats across each role & defining appropriate operational metrics across those areas is a key part of this stage. At the same time, the range of information used across the organization, starting with customer data, payment systems data and employee data, should be catalogued and classified based on threat level, from Critical to Restricted to Internal Use to Benign. These classifications must be communicated to the lines of business as well as IT & development organizations. It is critical for operations & development teams to understand this criticality from the perspective of incorporating secure & efficient development methodologies into their current IT Architecture & development practices.
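The cataloguing and classification step above can be sketched as a small data classification registry. The dataset names and the levels assigned to them are illustrative assumptions, not a real bank's catalog:

```python
from enum import IntEnum

class Sensitivity(IntEnum):
    # Ordered so that "at least this sensitive" comparisons work.
    BENIGN = 0
    INTERNAL_USE = 1
    RESTRICTED = 2
    CRITICAL = 3

# Hypothetical dataset catalog with its threat classification.
catalog = {
    "customer_pii": Sensitivity.CRITICAL,
    "payment_transactions": Sensitivity.CRITICAL,
    "employee_directory": Sensitivity.RESTRICTED,
    "branch_locations": Sensitivity.BENIGN,
}

def datasets_at_or_above(level):
    """Which datasets require the controls mandated for `level` or higher?"""
    return sorted(name for name, s in catalog.items() if s >= level)

print(datasets_at_or_above(Sensitivity.RESTRICTED))
# ['customer_pii', 'employee_directory', 'payment_transactions']
```

Publishing such a machine-readable catalog is one simple way to communicate classifications to both the lines of business and the development teams.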

The next step in the process is to Plan & Benchmark the current state of security against industry standard organizations to better understand where internal cyber gaps may lie across the entire range of business systems. This step also takes into account the Digital innovation roadmap in the organization and does not treat areas like Mobility, Cloud Computing, DevOps and Big Data as distinct from a security theme standpoint. This is key to ensuring that effective controls can be applied in a forward looking manner. For instance, understanding where gaps lie against Sarbanes Oxley, PCI DSS or HIPAA regulations ensures that appropriate steps are taken to bring these different systems up to par from an industry standpoint. Across this process, appropriate risk mitigations need to be understood for systems across the board. This ranges from desktop systems and mobile devices to systems which hold & process client data.
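A simple way to picture this benchmarking step is as a control gap report per system and regulation. The control names below are placeholders for illustration, not actual PCI DSS or SOX requirement identifiers:

```python
# Controls each regulation expects vs. controls a system actually implements
# (all names here are illustrative assumptions).
required = {
    "PCI-DSS": {"encryption_at_rest", "access_logging", "key_rotation"},
    "SOX":     {"access_logging", "change_approval"},
}
implemented = {
    "payments_gateway": {"encryption_at_rest", "access_logging"},
    "general_ledger":   {"access_logging", "change_approval"},
}

def gap_report(system, regulation):
    """Controls the regulation requires that the system is missing."""
    return sorted(required[regulation] - implemented[system])

print(gap_report("payments_gateway", "PCI-DSS"))  # ['key_rotation']
print(gap_report("general_ledger", "SOX"))        # []
```

Running such a report across every system in scope gives the planning stage a concrete, prioritizable list of remediations rather than a vague sense of exposure.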

The third step is the Execution step. This has two subcomponents – the Systems & IT Refresh and the Governance Process.

The Systems & IT Refresh step deals with instituting specific security technologies, IT Architectures, Big Data standards etc into line of business & central IT systems with the view of remediating or improving gaps observed across step 1. The list of systems is too exhaustive to cover here but at a minimum it includes all the security systems covered here in the first blog in this series @ http://www.vamsitalkstech.com/?p=1265

The Execution step will also vary based on the industry vertical you operate in. Let me explain this with an example.

For instance, in Banking, in addition to general network level security, I would categorize business level security into four specific buckets –   general fraud, credit card fraud, AML compliance and cyber security.

  • Firstly, the current best practice in the banking industry is to encourage a certain amount of convergence in the back end data infrastructure across all of the fraud types – literally in the tens. Forward looking institutions are building cybersecurity data lakes to aggregate & consolidate all digital banking information, wire data, payment data, credit card swipes and other telemetry data (ATM & POS) in one place to do security analytics. This approach can pay off in a big way.
  • Across all of these different fraud types, the common thread is that the fraud is increasingly digital (or internet based) and the fraudster rings are becoming more sophisticated every day. To detect these infinitesimally small patterns, an analytic approach beyond the existing rules based approach is key – for instance, to understand location based patterns in terms of where transactions took place, social graph based patterns, and patterns which can commingle realtime & historical data to derive insights.
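One concrete example of such a location based pattern is an "impossible travel" check, sketched below under the assumption that two card swipes implying faster-than-airliner travel are suspicious; the threshold and coordinates are illustrative:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    # Great-circle distance between two points on Earth, in kilometres.
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def impossible_travel(txn_a, txn_b, max_kmh=900):
    """Flag two swipes whose implied travel speed exceeds an airliner's."""
    hours = abs(txn_b["t"] - txn_a["t"]) / 3600.0
    if hours == 0:
        return True
    km = haversine_km(txn_a["lat"], txn_a["lon"], txn_b["lat"], txn_b["lon"])
    return km / hours > max_kmh

# Same card swiped in New York, then in London 30 minutes later.
ny = {"t": 0, "lat": 40.71, "lon": -74.00}
ldn = {"t": 1800, "lat": 51.51, "lon": -0.13}
print(impossible_travel(ny, ldn))  # True
```

A rules engine can express this single check, but commingling it with historical behavior and social graph signals is where the data lake based analytic approach earns its keep.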

               

Finally, the Governance process.

Over a certain period of time, it is a given that every organization will be breached. The executive team has to set in place a governance strategy that recognizes the overall limitations of a defensive posture and seeks to move the organization to an active defense approach. The goals of this process are to advise the board not only on how to manage cyber risk from a business mitigation perspective but also to set up a steering committee to manage customer, legal & media outreach. The executive team itself needs to be trained in cybersecurity issues, and this should be led by the CISO. Attention has to be paid to ensuring that the CISO’s team is not only staffed with risk, compliance & fraud detection personnel but also those with expertise and contacts in the specific lines of business that the organization operates across. To that end, the CISO’s team has to be funded at the highest levels of the organization. Investment in human activities like training classes, certifications & regular cybersecurity drills will also ensure a high level of preparedness across the organization. Explicit incident response plans need to be created across different business areas. Based on the specific vulnerability & concomitant business risk, the CISO will need to decide if each of the specific risks can be shared over multiple external actors – vendors, suppliers & other partners. If not, it would make a lot of sense to look for cyber risk insurance, an emerging business area, in those specific situations. More on cyber risk in a followup post. To reiterate one of the points I made above, a strong cybersecurity process does not inhibit business agility.

What are the questions business execs and boards should ask of their IT?

A few key questions that business management should ask of themselves from a cybersecurity standpoint.
  • How are we doing on Cybersecurity from a competitive & business level standpoint? Further, are we answering this question using a business metric driven approach that assigns scores to the program in various categories – for instance, number of breaches, malware incidents, and the pace & effectiveness of response? Are these goals S.M.A.R.T.?
  • Are all systems under regulation protected using appropriate controls?
  • Are we able to hire the best and brightest security personnel and engage them within lines of business?
  • Are we investing in the best IT solutions that leverage Big Data & Cloud Computing that have been proven to be more secure than older fragmented architectures? Can my IT leadership vocalize our roadmap goals across these areas?
  • Are my line of business leaders engaged in cybersecurity from the perspective of their business areas?
  • Is our business ecosystem protected? What are my partners doing to protect sensitive consumer & business data?
  • Are we all sharing appropriate information constantly with industry consortia around threat intelligence & the authorities, i.e. law enforcement and federal government agencies?

Conclusion:

My goal in this post is to bring forth the high level dimensions of a cybersecurity plan at the board level while not being over prescriptive in terms of specific industry & business actions. Based on my years of working in sensitive industries like Financial Services & Insurance, Healthcare and Telco, I can confidently say that if the broad contours of the above strategy are adopted, you are well on your way to becoming an organization with a strong foundation for Cybersecurity management. In this Digital Age, that can be a huge competitive differentiator.

Global Banking faces its Uber Moment..

The neighborhood bank branch is on the way out, being slowly phased out as the primary mode of customer interaction for Banks. Banks across the globe have increased their technology investments in strategic areas such as Analytics, Data & Mobile. The Bank of the future increasingly resembles a technology company.

“I have no doubt that the financial industry will face a series of Uber moments.” – Antony Jenkins, then CEO of Barclays Bank, 2015

The Washington Post proclaimed in an article [1] this week that the bank branch on the corner of Main Street may not be there much longer.

Technology is transforming Banking, leading to dramatic changes in the landscape of customer interactions. We live in the age of the Digital Consumer – Banking in the age of the hyper-connected consumer. As millennials join the labor force, they expect to be able to bank from anywhere, whether on a mobile device or via internet banking on their personal computer.

As former Barclays CEO Antony Jenkins described it in a speech given last fall, the global banking industry, which is under severe pressure from customer demands for increased automation and contextual services, will slash employment and branches by 20 percent to 50 percent over the next decade.[2]

“I have no doubt that the financial industry will face a series of Uber moments,” he said in the late-November speech in London, referring to the way that Uber and other ride-hailing companies have rapidly unsettled the taxi industry.[2]

Banking must go Digital to respond to changing client needs – 

The Financial Services and Insurance industries are facing an unprecedented amount of change driven by factors like changing client preferences and the emergence of new technology—the Internet, mobility, social media, etc. These changes are immensely profound, especially with the arrival of “FinTech”—technology-driven applications that are upending long-standing business models across all sectors from retail banking to wealth management & capital markets. Further, members of a major new segment, Millennials, increasingly use mobile devices, demand more contextual services and expect a seamless, unified banking experience—something akin to what they experience on web properties like Facebook, Amazon, Uber or Google.

Since the definition of Digital is somewhat nebulous, I would like to define the key areas where its impact and capabilities will need to be felt for this gradual transformation to occur.

A true Digital Bank needs to –

  • Offer a seamless customer experience much like the one provided by the likes of Facebook & Amazon, i.e. highly interactive & intelligent applications that can detect a single customer’s journey across multiple channels
  • Offer data-driven, interactive services and products that can detect customer preferences on the fly, match them with existing history and provide value-added services—services that not only provide a better experience but also foster a longer-term customer relationship
  • Help the business prototype, test, refine and rapidly develop new business capabilities
  • Above all, treat Digital as a constant capability and not as an ‘off the shelf’ product or a one-off way of doing things

Though some of the above expectations may seem ambitious, it is how individual banks put both data and technology to work across their internal value chain that will define their standing in the rapidly growing data economy.

Enter the FinTechs

FinTechs (new-age financial industry startups) offer enhanced customer experiences built on product innovation and agile business models. They expand their wallet share of client revenues by offering contextual products tailored to individual client profiles. Their savvy use of segmentation data and predictive analytics enables the delivery of bundles of tailored products across multiple delivery channels (web, mobile, point of sale, etc.). Like banks, these firms support multiple modes of payment at scale, but they aren’t bound by the same regulatory and compliance regulations as banks, which operate under a mandate to demonstrate that they understand their risk profiles. Compliance is an even stronger requirement for banks in areas like KYC (Know Your Customer) and AML (Anti Money Laundering), where there is a need to profile customers—both individual & corporate—to decipher whether any of their transaction patterns indicate money laundering.
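To make the AML point concrete, here is a toy sketch of the kind of transaction-pattern check that banks automate at scale. Every threshold and rule here is purely illustrative (the names, amounts and window are my own assumptions, not any regulator’s actual criteria):

```python
from collections import defaultdict

# Illustrative "structuring" rule: flag customers who repeatedly make cash
# deposits just under a reporting threshold. All numbers are hypothetical.
REPORT_THRESHOLD = 10_000
NEAR_MARGIN = 1_000       # "just under" = within $1,000 of the threshold
MIN_HITS = 3              # near-threshold deposits needed to raise a flag

def flag_structuring(transactions):
    """transactions: list of (customer_id, amount) cash deposits.
    Returns the set of customer ids whose deposit pattern trips the rule."""
    near_hits = defaultdict(int)
    for customer_id, amount in transactions:
        if REPORT_THRESHOLD - NEAR_MARGIN <= amount < REPORT_THRESHOLD:
            near_hits[customer_id] += 1
    return {cust for cust, hits in near_hits.items() if hits >= MIN_HITS}

deposits = [("c1", 9500), ("c1", 9800), ("c1", 9900), ("c2", 12000), ("c2", 500)]
print(flag_structuring(deposits))  # → {'c1'}
```

Real AML engines layer many such rules with customer risk scores and network analysis; the value of automating even this simple pattern is that it runs over every account, every day.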

Banking produces some of the richest data of any industry—troves of customer transactions, payments, wire transfers and demographic information. However, it is not enough for financial service IT departments to simply possess the data. They must be able to drive change through legacy thinking and infrastructures as the industry changes—both from a data product as well as from a risk & compliance standpoint.

The business areas shown in the below illustration are a mix of legacy capabilities (Risk, Fraud and Compliance) and new value-added areas (Mobile Banking, Payments, Omni-channel Wealth Management etc).


   Illustration – Predictive Analytics and Big Data are upending business models in Banking across multiple vectors of disruption 

Business Challenges facing banks today

Banks and other players across the financial spectrum face challenges across three distinct areas. First and foremost, they need to play defense against a myriad of regulatory and compliance legislation in areas such as risk data aggregation and measurement, financial compliance and fraud detection. Second, there is a distinct need to vastly improve customer satisfaction and stickiness by implementing predictive analytics capabilities and generating better insights across the customer journey, thus driving a truly immersive digital experience. Finally, banks need to leverage their mountains of data assets to create new business models and go-to-market strategies. They can do this by monetizing multiple data sources—both data-in-motion and data-at-rest—for actionable intelligence.

Data is the single most important driver of bank transformation, impacting financial product selection, promotion targeting, next best action and ultimately, the entire consumer experience. Today, the volume of this data is growing exponentially as consumers increasingly share opinions and interact with an array of smart phones, connected devices, sensors and beacons emitting signals during their customer journey.

Data Challenges – 

Business and technology leaders are struggling to keep pace with a massive glut of data from digitization, the internet of things, machine learning, and cybersecurity for starters. A data lake—which combines data assets, technology and analytics to create enterprise value at a massive scale—can help businesses gain control over their data.

Fortunately, Big Data driven predictive analytics is here to help. The Hadoop platform and its ecosystem of technologies have matured considerably and have evolved to support business-critical banking applications. The emergence of cloud platforms is helping in this regard.

Positively impacting the banking experience requires data

Whether at the retail bank or at corporate headquarters, there are a number of ways to leverage technology in order to enable a successful consumer experience across all banking sectors:

Retail & Consumer Banking

Banks need to move to a predominantly online model, providing consumers with highly interactive, engaging and contextual experiences that span multiple channels—branch banking, eBanking, POS, ATM, etc. Further goals are increased profitability per customer for both micro and macro customer populations with the ultimate goal of increasing customer lifetime value (CLV).
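Since customer lifetime value is the stated end goal, it is worth showing one commonly used simplified formula for it: CLV = margin × retention / (1 + discount − retention), which assumes a constant annual margin, retention rate and discount rate. A minimal sketch (the input figures are invented for illustration):

```python
def customer_lifetime_value(margin, retention, discount):
    """Simplified CLV assuming constant annual margin per customer,
    constant retention rate and a fixed discount rate, over an
    indefinite horizon: CLV = margin * retention / (1 + discount - retention)."""
    return margin * retention / (1 + discount - retention)

# A customer yielding $200/year in margin, 80% retention, 10% discount rate:
print(round(customer_lifetime_value(200, 0.80, 0.10)))  # → 533
```

The practical use of such a number is comparative: it lets a bank rank customer segments and decide where acquisition and retention spend earns its keep.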

Capital Markets

Capital markets firms must create new business models and offer superior client relationships based on their data assets. Those that leverage and monetize their data assets will enjoy superior returns and raise the bar for the rest of the industry. It is critical for capital market firms to better understand their clients (be they institutional or otherwise) from a 360-degree perspective so they can be marketed to as a single entity across different channels—a key to optimizing profits with cross selling in an increasingly competitive landscape.

Wealth Managers

The wealth management segment (e.g., private banking, tax planning, estate planning for high net worth individuals) is a potential high growth business for any financial institution. It is the highest touch segment of banking, fostered on long-term and extremely lucrative advisory relationships. It is also the segment most ripe for disruption due to a clear shift in client preferences and expectations for their financial future. Actionable intelligence gathered from real-time transactions and historical data becomes a critical component for product tailoring, personalization and satisfaction.

Corporate Banking

The ability to market complex financial products across a global corporate banking client base is critical to generating profits in this segment. It’s also important to engage in risk-based portfolio optimization to predict which clients are at risk for adverse events like defaults. In addition to being able to track revenue per client and better understand the entities they bank with, it is also critical that corporate banks track AML compliance.

The future of data for Financial Services

Understand the Customer Journey

Across retail banking, wealth management and capital markets, a unified view of the customer journey is at the heart of the bank’s ability to promote the right financial product, recommend a properly aligned portfolio of products, keep up with evolving preferences as the customer relationship matures and accurately predict future revenue from a customer. But currently most retail, investment and corporate banks lack a comprehensive single view of their customers. Due to operational silos, each department has a limited view of the customer across multiple channels. These views are typically inconsistent and result in limited internal collaboration when servicing customer needs. Leveraging the ingestion and predictive capabilities of a Big Data platform, banks can provide a user experience that rivals Facebook, Twitter or Google and provide a full picture of the customer across all touch points.
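At its simplest, building that single view means collapsing per-channel records onto a shared customer identifier. The sketch below shows the idea in miniature (the record fields, channel names and events are hypothetical; a real platform does this over billions of events with entity resolution, not an exact id match):

```python
from collections import defaultdict

# Hypothetical per-channel event feeds, each carrying a shared customer_id.
branch = [{"customer_id": "c1", "channel": "branch", "event": "mortgage inquiry"}]
mobile = [{"customer_id": "c1", "channel": "mobile", "event": "balance check"},
          {"customer_id": "c2", "channel": "mobile", "event": "card activation"}]

def unified_view(*channel_feeds):
    """Group events from every channel by customer_id into one journey."""
    journeys = defaultdict(list)
    for feed in channel_feeds:
        for record in feed:
            journeys[record["customer_id"]].append(
                (record["channel"], record["event"]))
    return dict(journeys)

view = unified_view(branch, mobile)
print(view["c1"])  # branch and mobile touch points for a single customer
```

The hard part in practice is not the grouping but agreeing on the identifier: silos that key customers differently are exactly why most banks lack this view today.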

Create Modern data applications

Banks, wealth managers, stock exchanges and investment banks are companies run on data—data on deposits, payments, balances, investments, interactions and third-party data quantifying risk of theft or fraud. Modern data applications for banking data scientists may be built internally or purchased “off the shelf” from third parties. These new applications are powerful and fast enough to detect previously invisible patterns in massive volumes of real-time data. They also enable banks to proactively identify risks with models based on petabytes of historical data. These data science apps comb through the “haystacks” of data to identify subtle “needles” of fraud or risk not easy to find with manual inspection.

These modern data applications make Big Data and data science ubiquitous. Rather than back-shelf tools for the occasional suspicious transaction or period of market volatility, these applications can help financial firms incorporate data into every decision they make. They can automate data mining and predictive modeling for daily use, weaving advanced statistical analysis, machine learning, and artificial intelligence into the bank’s day-to-day operations.
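As a toy stand-in for the statistical screening such applications automate, here is the simplest possible "needle in the haystack" check: flagging transaction amounts that sit far outside a customer’s normal range. The data and cutoff are invented; production fraud models use far richer features and machine learning rather than a single z-score:

```python
import statistics

def flag_outliers(amounts, z_cutoff=3.0):
    """Flag amounts more than z_cutoff standard deviations from the mean —
    a toy illustration of automated statistical screening for fraud."""
    mean = statistics.fmean(amounts)
    stdev = statistics.pstdev(amounts)
    if stdev == 0:
        return []  # no variation, nothing stands out
    return [a for a in amounts if abs(a - mean) / stdev > z_cutoff]

history = [42, 38, 45, 40, 39, 41, 43, 44, 40, 5000]
print(flag_outliers(history, z_cutoff=2.0))  # → [5000]
```

Crude as it is, this captures the operating principle: encode "normal" from historical data, then let the system surface deviations continuously instead of waiting for a manual review.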

Conclusion – Banks need to drive Product Creation using the Latest Technology –  

A strategic approach to industrializing analytics in a Banking organization can add massive value and competitive differentiation in five distinct categories –

  1. Exponentially improve existing business processes, e.g. risk data aggregation and measurement, financial compliance, fraud detection
  2. Help create new business models and go-to-market strategies by monetizing multiple data sources—both internal and external
  3. Vastly improve customer satisfaction by generating better insights across the customer journey
  4. Increase security while expanding access to relevant data for knowledge workers throughout the enterprise
  5. Help drive end-to-end digitization

If you really think about it, all that banks do is manipulate and deal in data. If that is not primed for an Uber type of revolution, I do not know what is.

References
[1] https://www.washingtonpost.com/news/wonk/wp/2016/04/19/say-goodbye-to-your-neighborhood-bank-branch/

[2] http://www.theguardian.com/business/2015/nov/25/banking-facing-uber-moment-says-former-barclays-boss