My take on Gartner’s Top 10 Strategic Technology Trends for 2017

“We’re only at the very, very beginning of this next generation of computing, and I think that the industry leaders will be the ones that transform first. I don’t care what industry you’re talking about.” – Kim Stevenson, CIO, Intel, Feb 2016

Gartner Research rolled out their “Top 10 Strategic Technology Trends for 2017” report a few weeks ago. My goal for this blogpost is to introduce these trends to the reader and to examine the potential impact of their recommendations from an enterprise standpoint.

Illustration: Gartner’s Strategic Trends for 2017

# 1: AI & Advanced Machine Learning

Gartner rightly forecasts that AI (Artificial Intelligence) and Advanced Machine Learning will continue their march into daily applications run by the Fortune 1000. CIOs are coming to realize that most business problems are primarily data challenges. The rapid maturation of scalable processing techniques allows us to extract richer insights from data. What we commonly refer to as Machine Learning – a combination of econometrics, statistics, visualization, and computer science – helps extract valuable business insights hiding in data and build operational systems that deliver that value.

Deep Learning involves discovering insights in data in a manner that mimics human pattern recognition. We are thus clearly witnessing the advent of modern data applications. These applications will leverage a range of advanced techniques spanning Artificial Intelligence and Machine Learning (ML), encompassing neural networks, natural language processing and deep learning.
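To make the idea concrete, here is a minimal, purely illustrative Python sketch (my own, not from the Gartner report) of the kind of model such modern data applications embed: a small feed-forward neural network trained with scikit-learn on synthetic customer data. All feature names and numbers are made up.

```python
# Minimal sketch: training a simple classifier on (synthetic) historical
# customer data to predict an outcome such as churn. Purely illustrative.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(42)
# Synthetic stand-in for historical customer features (e.g. tenure, spend, interactions)
X = rng.normal(size=(5000, 3))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=5000) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

# A small feed-forward neural network, the simplest relative of "deep learning"
model = MLPClassifier(hidden_layer_sizes=(16, 8), max_iter=500, random_state=42)
model.fit(X_train, y_train)

print("AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))
```

In a real Fortune 1000 deployment the same pattern would be trained on curated enterprise data and served behind an application API rather than on random numbers.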

Implications for industry CIOs – Modern data applications understand their environment (e.g. customer preferences and other detailed data insights), predict business trends in real time and take action on them to drive revenue and decrease business risk. These techniques will enable applications and devices to operate in an even smarter manner while saving companies enormous amounts of money in manual costs.

http://www.vamsitalkstech.com/?p=407

# 2: Intelligent Apps

Personal assistants such as Apple Siri and Microsoft Cortana fall into the category of virtual personal assistants (VPAs) and have begun making everyday business processes easier for their users. VPAs represent the intersection of AI, conversational interfaces and integration into business processes. In 2017, these will begin improving customer experiences at the largest Fortune 100 enterprises. On the more personal front, home VPAs will rapidly evolve and become even smarter as their algorithms become more capable and better understand their environments. We will see increased application of smart agents in diverse fields like financial services, healthcare, telecom and media.

Implications for industry CIOs – Get ready to invest in intelligent applications, starting with the corporate intranet.

# 3: Intelligent Things

The rise of the IoT has been well documented; couple AI with massive data processing capabilities and the result is Intelligent Things that can interact with humans in new ways. Add to this a whole category of things around transportation (self-driving cars, connected cars) and robots that perform key processes in industrial manufacturing, drones etc.

Implications for industry CIOs – These intelligent devices will increasingly communicate with their environments in a manner that encourages collaboration across a range of business scenarios. 2017 should begin the trend of these devices communicating with each other to form the so-called ‘Digital Mesh’.

# 4: Virtual & Augmented Reality

Virtual reality (VR) and augmented reality (AR) are beginning to completely change the way humans interact with one another and with the intelligent systems that make up the Digital Mesh. Pokemon GO (AR) and the Oculus Rift (VR) were the first hugely successful consumer-facing applications of these technologies, debuting in 2016. Uses of these technologies will include gamification (to improve customer engagement with products and services) and other customer- and employee-facing applications. While both technologies enable us to view the world in different ways, AR is remarkable in its ability to add to our current reality. BMW’s subsidiary Mini has actually developed a driving goggle with AR technology [1].

Implications for industry CIOs – This one is still on the drawing board for most verticals but it does make sense to invest in areas like gamification and in engaging with remote employees using AR.

# 5: Digital Twin

A Digital Twin is a software personification of an Intelligent Thing or system. In the manufacturing industry, digital twins can be set up to function as proxies of things like sensors and gauges, coordinate measuring machines, lasers, vision systems, and white light scanning [2]. The wealth of data being gathered on the shop floor will ensure that digital twins are used to reduce costs and increase innovation. Data science will soon make its way onto the shop floor to enable the collection of insights from these software proxies.

Implications for industry CIOs – Invest in Digital capabilities that serve as proxies for physical things.

# 6: Blockchain

The term Blockchain is derived from a design pattern that describes a chain of data blocks that map to individual transactions. Each transaction conducted in the real world (e.g. a Bitcoin transfer) results in the creation of a new block in the chain. Each new block embeds a cryptographic hash of the previous block, thus linking the blocks into a chain – hence the name.
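To illustrate the “chain of blocks” idea, here is a toy Python sketch of how each block embeds the hash of its predecessor, so tampering with any block invalidates everything after it. This is purely didactic and omits everything that makes a real blockchain work (consensus, proof-of-work, networking).

```python
# Toy illustration of a hash-linked chain of blocks. Not a real Bitcoin implementation.
import hashlib
import json
import time

def make_block(transactions, previous_hash):
    block = {
        "timestamp": time.time(),
        "transactions": transactions,
        "previous_hash": previous_hash,
    }
    payload = json.dumps(block, sort_keys=True).encode()
    block["hash"] = hashlib.sha256(payload).hexdigest()
    return block

genesis = make_block(["genesis"], previous_hash="0" * 64)
block_1 = make_block(["Alice pays Bob 1 BTC"], previous_hash=genesis["hash"])
block_2 = make_block(["Bob pays Carol 0.5 BTC"], previous_hash=block_1["hash"])

# The chain is valid only if each block points at the hash of the block before it
print(block_2["previous_hash"] == block_1["hash"])  # True
```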

Blockchain is a distributed ledger technology (DLT) that allows global participants to conduct secure transactions of any type – banking, music purchases, legal contracts, supply chain transactions etc. Blockchain will transform multiple industries in the years to come. Bitcoin is the first application of Blockchain.

How the Blockchain will lead disruption across industry..(5/5)

Implications for industry CIOs – Begin expanding internal knowledge of Blockchain and how it can potentially augment or disrupt your vertical industry.

# 7: Conversational Systems

Mobile applications first began forcing enterprises to support multiple channels of interaction with their consumers. For example, Banking now requires the ability to engage consumers in a seamless experience across an average of four to five channels – Mobile, eBanking, Call Center, Kiosk etc. Conversational Systems take these interactions to the next level and enable humans to communicate with a wide range of Intelligent Things using a range of modalities – speech, touch, vision etc.

Implications for industry CIOs – Every touch point matters, and those leading the smart agent transformation should constantly be asking how organizations are removing friction and enhancing the experience for every customer regardless of where they are in the journey.

# 8: Mesh App and Service Architecture

This one carries over from last year. The Digital Mesh leads to an interconnected information deluge that encompasses classical IoT endpoints along with audio, video and social data streams. The creation of these smart services will further depend on the vertical industries that these products serve as well as the requirements of the platforms that host them – e.g. industrial automation, remote healthcare, public transportation, connected cars, home automation etc. The microservices architecture approach, which combines the notion of autonomous, cooperative yet loosely coupled applications built as a conglomeration of business-focused services, is a natural fit for the Digital Mesh. The most important additive consideration for microservices-based architectures in the age of the Digital Mesh is what I’d like to term Analytics Everywhere.
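As a concrete, hypothetical illustration of one such business-focused service in a mesh-style architecture, the sketch below uses Flask to expose a tiny analytics endpoint over device data. The endpoint, payload shape and in-memory store are all assumptions made for illustration, not a reference design.

```python
# Minimal sketch of one microservice in a mesh-style architecture:
# a small Flask service exposing an analytics summary endpoint.
from flask import Flask, jsonify

app = Flask(__name__)

# In a real deployment this would query a shared data platform;
# a tiny in-memory store stands in for it here.
DEVICE_READINGS = {"thermostat-17": [21.5, 21.7, 22.1], "pump-04": [3.1, 3.4, 2.9]}

@app.route("/devices/<device_id>/summary", methods=["GET"])
def device_summary(device_id):
    readings = DEVICE_READINGS.get(device_id)
    if readings is None:
        return jsonify({"error": "unknown device"}), 404
    return jsonify({
        "device": device_id,
        "count": len(readings),
        "mean": sum(readings) / len(readings),
    })

if __name__ == "__main__":
    app.run(port=8080)
```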

Implications for industry CIOs – The mesh app will require a microservices-based architecture that supports multichannel and multi-device solutions.

# 9: Digital Technology Platforms

The onset of Digital Architectures in enterprise businesses implies the ability to drive continuous micro-level interactions with global consumers/customers/clients/stockholders or patients, depending on the vertical you operate in. More information on the core building blocks of Digital Technology Platforms can be found in the blog post below.

Implications for industry CIOs

http://www.vamsitalkstech.com/?m=201609

# 10: Adaptive Security Architecture

The evolution of the intelligent Digital Mesh and of digital technology platforms and application architectures means that security has to become fluid and adaptive. Traditional solutions cannot handle this challenge, which is exacerbated by the expectation that in an IoT and Digital Mesh world, data flows will be multidirectional across a grid of application endpoints.

Implications for industry CIOs – Expect to find applications in 2017 and beyond incorporating Deep Learning and Real Time Analytics into their core security design with a view to analyzing large-scale data at very low latency. Security in the IoT environment is particularly challenging. Security teams need to work with application, solution and enterprise architects to build security into the overall DevOps process, creating a DevSecOps model.

Conclusion..

In this year’s edition, Gartner is clearly forecasting the future ten years out from a mass-market standpoint. As we cross this chasm over the next ten years, we will see IoT begin to emerge and take center stage in every industry vertical. Digital transformation will happen via apps created for, and brought together by, Smart Agents on the Device Mesh.

These apps will gradually become autonomous, data-intensive, server-less, hopefully secure and location-independent (data center or cloud). The app can be a sensor, a connected car or a digital twin for a manufacturing technician. So it is not just about a single app sitting in a data center, in the cloud or on the machine itself. These smart agent apps will be data-driven components of a larger mesh, interconnected using open interfaces and resident wherever it is optimal for real-time analytics. This may seem like science fiction for the Fortune 1000 enterprise, but it is manifest reality at the web-scale innovators. The industry will have no choice but to follow.

References..

[1] Cramer – “A lesson in Augmented Realities” –  http://cramer.com/story/the-difference-between-ar-and-vr/

[2] Dr. Michael Grieves – “Digital Twin: Manufacturing Excellence through Virtual Factory Replication” – http://innovate.fit.edu/plm/documents/doc_mgr/912/1411.0_Digital_Twin_White_Paper_Dr_Grieves.pdf

How Big Data & Predictive Analytics transform AML Compliance in Banking & Payments..(2/2)

The first blog in this two-part series (Deter Financial Crime by Creating an Effective AML Program) described how the Money Laundering (ML) techniques employed by nefarious actors (e.g. drug cartels, corrupt public figures and terrorist organizations) have grown more sophisticated over the years. Global and regional banks are falling short of their compliance goals despite huge technology and process investments, and banks that fail to maintain effective compliance are typically fined hundreds of millions of dollars. In this second and final post, we will examine why Big Data Analytics, as a second-generation effort, can become critical to shutting down the flow of illicit funds across the globe and to keeping financial organizations compliant.

Where current enterprise-wide AML programs fall short..

As discussed in various posts and in the first blog in the series (below), the Money Laundering (ML) rings of today are highly sophisticated in their understanding of business specifics across the domains of Banking – Capital Markets, Retail and Commercial Banking. They are also very well versed in the complex rules that govern global trade finance.

Deter Financial Crime by Creating an Effective Anti Money Laundering (AML) Program…(1/2)

Further, the more complex and geographically diverse a financial institution is, the higher its risk of AML (Anti Money Laundering) compliance violations. Other factors, such as an enormous volume of transactions across multiple distribution channels and geographies between thousands of counterparties, further increase money laundering risk.

Thus, current AML programs fall short in five specific areas –

  1. Manual Data Collection & Risk Scoring – Banks’ response to AML statutes has been to bring in more staff, typically in the hundreds at large banks. These staff perform rote but key AML processes such as Customer Due Diligence (CDD) and Know Your Customer (KYC). They extensively scour external sources like LexisNexis, Thomson Reuters, D&B etc. to manually score risky client entities, often pairing these with internal bank data. They also use AML watch-lists to verify individuals and business customers so that AML Case Managers can review the results before filing Suspicious Activity Reports (SAR). On average, about 50% of the cost of AML programs is incurred in these large headcount requirements. At large global banks, where the number of accounts exceeds 100 million customers, data volumes grow very large very quickly, causing all kinds of headaches for AML programs from a data aggregation, storage, processing and accuracy standpoint. There is a crying need to automate AML programs end to end, not only to perform accurate risk scoring but also to keep costs down.
  2. Social Graph Analysis in areas such as trade finance helps model the complex transactions occurring between thousands of entities. Each of these entities may have a complex holding structure with accounts that have been created using forged documents, and most fraud happens in networks. An inability to dynamically understand the topology of the financial relationships among thousands of entities implies that AML programs need to develop graph-based analysis capabilities (a minimal sketch follows this list).
  3. AML programs extensively deploy rule-based Transaction Monitoring Systems (TMS), which take an expert-system approach to setting up new rules. These rules span monetary thresholds, specific patterns that connote money laundering and business scenarios that may violate those patterns. However, fraudster rings now learn (or already know) these rules quickly and change their methods constantly to avoid detection. There is thus a significant need to reduce the high degree of dependence on traditional TMS, which are slow to adapt to the dynamic nature of money laundering.
  4. The need to perform extensive Behavioral Modeling & Customer Segmentation to discover transaction behavior, with a view to identifying behavioral patterns of entities and outlier behaviors that connote potential laundering.
  5. Real-time transaction monitoring in areas like Payment Cards presents unique challenges, since money laundering is hidden within mountains of transaction data. Every piece of data produced as a result of bank operations needs to be commingled with historical data sets (for customers under suspicion) spanning years when making a judgment call about filing a SAR (Suspicious Activity Report).
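As referenced in point 2 above, here is a minimal, hypothetical sketch of graph-based analysis using the networkx library: entities and transfers are modeled as a directed graph, and circular fund flows and unusually central intermediaries are surfaced. The entities and amounts are invented for illustration.

```python
# Hypothetical sketch of graph-based AML analysis with networkx:
# model entities and transfers as a directed graph, then look for
# circular fund flows and unusually central intermediaries.
import networkx as nx

transfers = [
    ("ShellCo A", "ShellCo B", 250_000),
    ("ShellCo B", "ShellCo C", 240_000),
    ("ShellCo C", "ShellCo A", 235_000),   # circular flow, a classic red flag
    ("Retail Client", "ShellCo A", 5_000),
]

g = nx.DiGraph()
for src, dst, amount in transfers:
    g.add_edge(src, dst, amount=amount)

# Entities that sit on many payment paths are worth a closer look
centrality = nx.betweenness_centrality(g)
print(sorted(centrality.items(), key=lambda kv: -kv[1])[:3])

# Strongly connected components expose circular fund flows
for component in nx.strongly_connected_components(g):
    if len(component) > 1:
        print("Possible layering ring:", component)
```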

How Big Data & Predictive Analytics can help across all these areas..


  1. The first area where Big Data & Predictive Analytics have a massive impact is around Due Diligence and KYC (Know Your Customer) data. The data scraping from various sources discussed above can easily be automated using tools in a Big Data stack to ingest information automatically, by sending requests to data providers (the same ones that banking institutions currently use) via an API. Once this data is obtained, real-time processing tools (such as Apache Storm and Apache Spark) can apply sophisticated algorithms to the collected data to calculate a risk score or rating. In trade finance, text analytics can be used to process a range of documents like invoices, bills of lading, certificates of shipping etc. to enable banks to inspect a complex process spanning hundreds of entities operating across countries. This approach enables banks to process massive amounts of diverse data quickly (even in seconds) and synthesize it into accurate risk scores. Implementing Big Data in this very important workstream can help increase efficiency and reduce costs.
  2. The second area where Big Data shines is in helping create a Single View of a Customer. This is made possible by doing advanced entity matching through the establishment and adoption of a lightweight entity ID service, consisting of entity assignment and batch reconciliation. The goal is to get each business system to propagate the Entity ID back into its core banking, loan and payment systems; transaction data then flows into the lake with this ID attached, providing a way to do Customer 360.
  3. To be clear, we are advocating a mix of both business rules and data science. Machine Learning is recommended because it enables a range of business analytics across AML programs, overcoming the limitations of a TMS. The first use case is exploratory, hypothesis-driven data science: bring all transactions, all Case Management files, all customer data and all external data into one place, with the goal of uncovering areas of risk that were previously missed, identifying areas that were less risky than previously thought (so their risk scores can be lowered), and continually discovering the real risk profile that your institution bears – e.g. downgrading investment in trade financing if you find a lot of scrap-metal-based fraudulent transactions.
  4. The other important value driver in deploying data science is Advanced Transaction Monitoring Intelligence. The core idea is to get years’ worth of banking data into one location (the data lake) and then apply unsupervised learning to glean patterns in those transactions (a minimal sketch appears after this list). The goal is to identify profiles of actors and feed them into downstream surveillance and TM systems. This knowledge can then be used to –
  • Constantly learn transaction behavior for similar customers, which is very important in detecting laundering in areas like payment cards – it is very common for retail businesses to be set up with the sole purpose of laundering money
  • Discover transaction activity of trade finance customers with similar traits (types of businesses, nature of transfers, areas of operations etc.)
  • Segment customers by similar transaction behaviors
  • Understand common money laundering typologies and identify specific risks from a temporal and spatial/geographic standpoint
  • Improve and learn correlations between alert accuracy and Suspicious Activity Report (SAR) filings
  • Keep the noise level down by weeding out false positives
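As referenced in point 4 above, the following is a minimal, hypothetical PySpark sketch of the unsupervised-learning step: clustering customers by a few transaction features so that outlying behavioral profiles can be routed to downstream transaction-monitoring systems. Column names and sample values are assumptions, and a real pipeline would read years of data from the lake rather than an inline DataFrame.

```python
# Hypothetical sketch: cluster customers by transaction behavior with Spark MLlib.
from pyspark.sql import SparkSession
from pyspark.ml.feature import VectorAssembler
from pyspark.ml.clustering import KMeans

spark = SparkSession.builder.appName("aml-behavior-clustering").getOrCreate()

# In practice this would be years of transactions read from the data lake
txns = spark.createDataFrame(
    [("c1", 12, 450.0, 2), ("c2", 310, 9800.0, 41), ("c3", 15, 520.0, 3)],
    ["customer_id", "monthly_txn_count", "avg_txn_amount", "cross_border_txns"],
)

features = VectorAssembler(
    inputCols=["monthly_txn_count", "avg_txn_amount", "cross_border_txns"],
    outputCol="features",
).transform(txns)

model = KMeans(k=2, seed=1, featuresCol="features").fit(features)
clustered = model.transform(features)          # adds a 'prediction' (cluster) column
clustered.select("customer_id", "prediction").show()
```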

Benefits of a forward looking approach..  

We believe that we have a fresh approach that can help Banks with the following value drivers & metrics –

  • Detect AML violations on a proactive basis thus reducing the probability of massive fines
  • Save on staffing expenses for Customer Due Diligence (CDD)
  • Increase accurate production of suspicious activity reports (SAR)
  • Decrease the percent of corporate customers with AML-related account closures in the past year by customer risk level and reason – thus reducing loss of revenue
  • Decrease the overall KYC profile update backlog across geographies
  • Help create Customer 360 views that can help accelerate CLV (Customer Lifetime Value) as well as Customer Segmentation from a cross-sell/up-sell perspective

Big Data shines in all the above areas..

Conclusion…

The AML landscape will change rapidly over the next few years to accommodate the business requirements highlighted above. Regulatory authorities should also lead the way in adopting a Hadoop/ML/Predictive Analytics based approach over the next few years. There is no other way to tackle large and medium AML programs in a lower-cost and highly automated manner.

Design and Architecture of A Robo-Advisor Platform..(3/3)

This three-part series explores the automated investment management, or “Robo-Advisor” (RA), movement. The first post in this series – http://www.vamsitalkstech.com/?p=2329 – discussed how Wealth Management has been an area largely untouched by automation as far as the front office is concerned. Automated investment vehicles have begun changing that trend and are helping create a variety of business models in the industry, especially those catering to the Millennial mass-affluent segment. The second post – http://www.vamsitalkstech.com/?p=2418 – focused on the overall business model and main functions of a Robo-Advisor (RA). This third and final post covers a generic technology architecture for an RA platform.

Business Requirements for a Robo-Advisor (RA) Platform…

Some of the key business requirements of an RA platform that confer advantages over the manual/human-driven style of investing are:

  • Collect Individual Client Data – RA Platforms need to offer a high degree of customization from the standpoint of an individual investor. This means an ability to provide a preferably mobile and web interface to capture detailed customer financial background, existing investments as well as any historical data regarding customer segments etc.
  • Client Segmentation – Clients are to be segmented across granular segments as opposed to the traditional asset-based methodology (e.g. mass affluent, high net worth, ultra high net worth etc).
  • Algorithm Based Investment Allocation – Once the client data is collected,  normalized & segmented –  a variety of algorithms are applied to the data to classify the client’s overall risk profile and an investment portfolio is allocated based on those requirements. Appropriate securities are purchased as we will discuss in the below sections.
  • Portfolio Rebalancing  – The client’s portfolio is rebalanced appropriately depending on life event changes and market movements.
  • Tax Loss Harvesting – Tax-loss harvesting is the mechanism of selling securities that carry a loss. By taking the loss, the client can offset taxes on both gains and income. The sold securities are replaced by similar securities by the RA platform, thus maintaining the optimal investment mix.
  • A Single View of a Client’s Financial History- From the WM firm’s standpoint, it would be very useful to have a single view capability for a RA client that shows all of their accounts, interactions & preferences in one view.

User Interface Requirements for a Robo-Advisor (RA) Platform…

Once a customer logs in using any of the supported digital channels (e.g. Mobile, eBanking, Phone), they are presented with a single view of all their accounts. This view has a few critical areas – a Summary View (showing an aggregated view of their financial picture) and a Transfer View (allowing them to transfer funds across accounts held with other providers).

The Summary View lists the following:

  • Demographic info: Customer name, address, age
  • Relationships: customer rating influence, connections, associations across client groups
  • Current activity: financial products, account interactions, any burning customer issues, payments missed etc
  • Customer Journey Graph: which products or services they have been associated with since they first became a customer

Depending on the client’s risk tolerance and investment horizon, the weighted allocation of investments across these categories will vary. To illustrate this, a Model Portfolio and an example are shown below.

Algorithms for a Robo-Advisor (RA) Platform…

There are a variety of algorithmic approaches that could be taken to building out an RA platform. However, the common features of all of them are to –

  • Leverage data science & statistical modeling to automatically allocate client wealth across different asset classes (such as domestic/foreign stocks, bonds & real estate related securities) to automatically rebalance portfolio positions based on changing market conditions or client preferences. These investment decisions are also made based on detailed behavioral understanding of a client’s financial journey metrics – Age, Risk Appetite & other related information. 
  • A mixture of different algorithms can be used, such as Modern Portfolio Theory (MPT), the Capital Asset Pricing Model (CAPM), the Black-Litterman Model, the Fama-French model etc. These are used to allocate assets as well as to adjust positions based on market movements and conditions (a minimal sketch follows this list).
  • RA platforms also provide 24×7 tracking of market movements and use that to drive rebalancing decisions, not just from a portfolio standpoint but also from a taxation standpoint.
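As a simple illustration of the first of these algorithmic families, the sketch below performs a basic mean-variance (Modern Portfolio Theory) allocation with numpy and scipy: it minimizes portfolio variance subject to a target return and a long-only, fully invested constraint. The expected returns and covariance matrix are made-up inputs for illustration, not a recommendation or a production allocator.

```python
# Illustrative mean-variance (MPT) allocation: minimize variance for a target return.
import numpy as np
from scipy.optimize import minimize

assets = ["US Stocks", "Foreign Stocks", "Bonds", "Real Estate"]
expected_returns = np.array([0.07, 0.08, 0.03, 0.05])
cov = np.array([
    [0.040, 0.018, 0.002, 0.010],
    [0.018, 0.055, 0.001, 0.012],
    [0.002, 0.001, 0.004, 0.001],
    [0.010, 0.012, 0.001, 0.030],
])
target_return = 0.06

def portfolio_variance(w):
    return w @ cov @ w

constraints = [
    {"type": "eq", "fun": lambda w: w.sum() - 1.0},                         # fully invested
    {"type": "eq", "fun": lambda w: w @ expected_returns - target_return},  # hit target return
]
bounds = [(0.0, 1.0)] * len(assets)   # long-only

result = minimize(portfolio_variance, x0=np.full(len(assets), 0.25),
                  bounds=bounds, constraints=constraints)
for name, weight in zip(assets, result.x):
    print(f"{name}: {weight:.1%}")
```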

Model Portfolios…

  1. Equity
     A) US Domestic Stock – Large Cap, Medium Cap, Small Cap, Dividend Stocks
     B) Foreign Stock – Emerging Markets, Developed Markets
  2. Fixed Income
     A) Developed Market Bonds
     B) US Bonds
     C) International Bonds
     D) Emerging Markets Bonds
  3. Other
     A) Real Estate
     B) Currencies
     C) Gold and Precious Metals
     D) Commodities
  4. Cash

Sample Portfolios – for an aggressive investor…

  1. Equity – 85%
     A) US Domestic Stock (50%) – Large Cap 30%, Medium Cap 10%, Small Cap 10%, Dividend Stocks 0%
     B) Foreign Stock (35%) – Emerging Markets 18%, Developed Markets 17%
  2. Fixed Income – 5%
     A) Developed Market Bonds – 2%
     B) US Bonds – 1%
     C) International Bonds – 1%
     D) Emerging Markets Bonds – 1%
  3. Other – 5%
     A) Real Estate – 3%
     B) Currencies – 0%
     C) Gold and Precious Metals – 0%
     D) Commodities – 2%
  4. Cash – 5%

Technology Requirements for a Robo-Advisor (RA) Platform…

An intelligent RA platform has a few core technology requirements (based on the above business requirements).

  1. A Single Data Repository – A shared data repository called a Data Lake is created that can capture every bit of client data (explained in more detail below) as well as external data. The RA data lake provides visibility into all data for a variety of stakeholders. Wealth Advisors access processed data to view client accounts; clients can access their own detailed positions and account balances; the Risk group accesses the shared lake to process position, execution and balance data; and Data Scientists (or Quants) who develop models for the RA platform access it to perform analysis on fresh data (from the current workday) or on historical data. All historical data is available for at least five years – much longer than before. Moreover, the Hadoop platform enables ingestion of data from a range of systems despite their disparate data definitions and infrastructures. All data that pertains to trade decisions and their lifecycle needs to be made resident in a general enterprise storage pool that runs on HDFS (the Hadoop Distributed File System) or a similar cloud-based filesystem. This repository is augmented by incremental feeds of intra-day trading activity streamed in using technologies like Sqoop, Kafka and Storm.
  2. Customer Data Collection – Existing financial data across the below categories is collected and aggregated into the data lake. This data ranges across Customer Data, Reference Data, Market Data and other client communications. All of this data can be ingested using an API or pulled into the lake from a relational system using connectors supplied in the RA Data Platform. Examples of data collected include the customer’s existing brokerage accounts, savings accounts, behavioral finance surveys and questionnaires etc. The RA Data Lake stores all internal and external data.
  3. Algorithms – The core of the RA platform is its data science algorithms. Whatever algorithms are used, a few critical workflows are common to them. The first, Asset Allocation, takes the customer’s input in the “ADVICE” tab for each type of account and tailors the portfolio based on that input. The others include Portfolio Rebalancing and Tax Loss Harvesting.
  4. Risk Measures – The RA platform should be able to store market data across years, both from a macro and from an individual portfolio standpoint, so that several key risk measures – such as volatility (e.g. position risk, residual risk and market risk), Beta, and R-Squared – can be calculated at multiple levels: for individual securities, for a specified index, and for the client portfolio as a whole (a minimal sketch of these calculations follows this list).
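The following is a minimal sketch of the risk-measure calculations mentioned in point 4, using numpy on synthetic daily return series: annualized volatility, Beta against an index, and R-squared. In the platform these series would come from the stored market data rather than a random generator.

```python
# Hypothetical sketch: volatility, Beta and R-squared of a portfolio vs. an index.
import numpy as np

rng = np.random.default_rng(7)
index_returns = rng.normal(0.0004, 0.01, 252)                  # stand-in for market data
portfolio_returns = 0.9 * index_returns + rng.normal(0, 0.004, 252)

volatility = portfolio_returns.std(ddof=1) * np.sqrt(252)      # annualized volatility

beta = np.cov(portfolio_returns, index_returns, ddof=1)[0, 1] / np.var(index_returns, ddof=1)

correlation = np.corrcoef(portfolio_returns, index_returns)[0, 1]
r_squared = correlation ** 2

print(f"Volatility: {volatility:.2%}, Beta: {beta:.2f}, R-squared: {r_squared:.2f}")
```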

Illustration: Architecture of a Robo-Advisor (RA) Platform

The overall logical flow of data in the system –

  • Information sources are depicted at the left. These encompass a variety of institutional, system and human actors, potentially sending thousands of real-time messages per hour or sending batch feeds.
  • A highly scalable messaging system helps bring these feeds into the RA platform architecture, normalizes them and sends them on for further processing. Apache Kafka is a good choice for this tier. Real-time data is published by a range of systems over Kafka queues, and each transaction could include hundreds of attributes that can be analyzed in real time to detect business patterns. Kafka’s integration with Apache Storm lets us read one value at a time and persist the data into an HBase cluster. In a modern data architecture built on Apache Hadoop, Kafka (a fast, scalable and durable message broker) works in combination with Storm, HBase (and Spark) for real-time analysis and rendering of streaming data (a minimal producer sketch follows this list).
  • Trade data is streamed into the platform (on a T+1 basis), which ingests, collects, transforms and analyzes core information in real time. The analysis can involve both simple and complex event processing and can be based on pre-existing rules defined in a rules engine invoked from Apache Storm. A Complex Event Processing (CEP) tier can process these feeds at scale to understand relationships among them, where the relationships among events are defined by business owners in a non-technical language or by developers in a technical one. Apache Storm integrates with Kafka to process incoming data.
  • For real-time or batch analytics, Apache HBase provides near real-time, random read and write access to tables (or ‘maps’) storing billions of rows and millions of columns. Once this rapidly and continuously growing dataset from the information producers is stored, we are able to perform very fast lookups for analytics irrespective of the data size.
  • Data that has analytic relevance and needs to be kept for offline or batch processing can be stored using the Hadoop Distributed File System (HDFS) or an equivalent filesystem such as Amazon S3, EMC Isilon or Red Hat Gluster. The idea is to deploy Hadoop-oriented workloads (MapReduce or Machine Learning) directly on the data layer, in order to perform analytics on small, medium or massive data volumes over a period of time. Historical data can be fed into the Machine Learning models created above and commingled with streaming data as discussed earlier.
  • Horizontal scale-out (read: cloud-based IaaS) is the preferred deployment approach, as it helps the architecture scale linearly as the loads placed on the system increase over time. This approach enables the platform to distribute the load dynamically across a cluster of cloud-based servers based on trade data volumes.
  • It is recommended to take an incremental approach to building the RA platform: once all data resides in a general enterprise storage pool, it becomes accessible to many analytical workloads including Trade Surveillance, Risk, Compliance, etc. A shared data repository across multiple lines of business provides more visibility into all intra-day trading activities. Data can also be fed into downstream systems in a seamless manner using technologies like Sqoop, Kafka and Storm. The results of the processing and queries can be exported in various data formats – simple CSV/txt, more optimized binary formats, JSON – or a custom SerDe can be plugged in for custom formats. Additionally, with Hive or HBase, data within HDFS can be queried via standard SQL using JDBC or ODBC, with results returned as standard relational data types (e.g. String, Date, Numeric, Boolean). Finally, REST APIs in HDP natively support both JSON and XML output by default.
  • Operational data across a range of asset classes, risk types and geographies is thus available to investment analysts during the entire trading window while markets are still open, enabling them to reduce the risk of that day’s trading activities. The specific advantages of this approach are two-fold: existing architectures are typically only able to hold a limited set of asset classes within a given system, which means data is only assembled for risk processing at the end of the day, and historical data is often not available in sufficient detail. Hadoop accelerates a firm’s speed-to-analytics and also extends its data retention timeline.
  • Apache Atlas is used to provide data governance capabilities in the platform, using both prescriptive and forensic models enriched by a given business’s data taxonomy and metadata. This allows for tagging of trade data across the different business data views, which is a key requirement for good data governance and reporting. Atlas also provides audit trail management as data is processed in a pipeline in the lake.
  • Another important capability that Big Data/Hadoop can provide is the establishment and adoption of a lightweight Entity ID service, which aids dramatically in the holistic viewing and audit tracking of trades. The service consists of entity assignment for both institutional and individual traders. The goal is to get each target institution to propagate the Entity ID back into its trade booking and execution systems; transaction data then flows into the lake with this ID attached, providing a way to do Client 360.
  • Output data elements can be written out to HDFS, and managed by HBase. From here, reports and visualizations can easily be constructed. One can optionally layer in search and/or workflow engines to present the right data to the right business user at the right time.  
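To make the messaging tier concrete, here is a minimal, hypothetical kafka-python producer that publishes a trade event onto a topic for downstream Storm/CEP consumers. The broker address, topic name and message schema are assumptions, and the snippet presumes a Kafka broker is reachable at the given address.

```python
# Hypothetical sketch: publish a trade event onto a Kafka topic (assumes a broker at localhost:9092).
import json
from kafka import KafkaProducer

producer = KafkaProducer(
    bootstrap_servers=["localhost:9092"],
    value_serializer=lambda record: json.dumps(record).encode("utf-8"),
)

trade_event = {
    "client_id": "RA-000123",
    "symbol": "VTI",
    "side": "BUY",
    "quantity": 25,
    "price": 218.40,
}

# Downstream CEP/Storm consumers would subscribe to this topic
producer.send("ra-trade-events", value=trade_event)
producer.flush()
```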

Conclusion…

As one can see clearly, though automated investing methods are still in the early stages of maturity, they hold out a tremendous amount of promise. As they are unmistakably the next big trend in the WM industry, industry players should begin developing such capabilities.

The Three Core Competencies of Digital – Cloud, Big Data & Intelligent Middleware

“Ultimately, the cloud is the latest example of Schumpeterian creative destruction: creating wealth for those who exploit it; and leading to the demise of those that don’t.” – Joe Weinman, author of Cloudonomics: The Business Value of Cloud Computing


The Cloud As a Venue for Digital Workloads…

As 2016 draws to a close, it can safely be said that no industry leader questions the existence of the new Digital Economy and the fact that every firm out there needs a digital strategy. Myriad organizations are taking serious business steps toward making their platforms highly customer-centric via a renewed focus on operational metrics. They are also working on creating new business models using their analytics investments. Examples of these verticals include Banking, Insurance, Telecom, Healthcare, Energy etc.

As a general trend, the Digital Economy brings immense opportunities while also exposing firms to risks. Customers now demand highly contextual products, services and experiences – all accessible via an easy API (Application Programming Interface).

Big Data Analytics (BDA) software revenues will grow from nearly $122B in 2015 to more than $187B in 2019, according to Forbes [1]. At the same time, it is clear that exploding data generation across the global economy has become a clear and present business phenomenon, with data volumes rapidly expanding across industries. It is not just the production of data that has increased; so has the need for organizations to derive business value from it. As IT leaders know well, digital capabilities need low-cost yet massively scalable and agile information delivery platforms – which only Cloud Computing can provide.

For a more detailed technical overview, please visit the link below.

http://www.vamsitalkstech.com/?p=1833

Big Data & Big Data Analytics drive consumer interactions.. 

The onset of Digital Architectures in enterprise businesses implies the ability to drive continuous online interactions with global consumers/customers/clients or patients. The goal is not just to provide engaging visualization but also to personalize services clients care about across multiple channels of interaction. The only way to attain digital success is to understand your customers at a micro level while constantly making strategic decisions on your offerings to the market. Big Data has become the catalyst in this massive disruption, as it can help businesses in any vertical understand their customers better and perceive trends before the competition does. Big Data thus provides the foundational platform for successful business platforms.

The three key areas where Big Data & Cloud Computing intersect are – 

  • Data Science and Exploration
  • ETL, Data Backups and Data Preparation
  • Analytics and Reporting

Big Data drives business use cases in Digital in myriad ways – key examples include –

  1. Obtaining a realtime Single View of an entity (typically a customer across multiple channels, product silos & geographies)
  2. Customer Segmentation by helping businesses understand their customers down to the individual micro level as well as at a segment level
  3. Customer sentiment analysis by combining internal organizational data, clickstream data, sentiment analysis with structured sales history to provide a clear view into consumer behavior.
  4. Product Recommendation engines which provide compelling personal product recommendations by mining realtime consumer sentiment, product affinity information with historical data.
  5. Market Basket Analysis, observing consumer purchase history and enriching this data with social media, web activity, and community sentiment regarding past purchases and future buying trends (a minimal co-occurrence sketch follows this list).
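As a small illustration of the last item, the pandas sketch below counts which product pairs co-occur most often across orders – the simplest form of market basket analysis. Product names and orders are invented; a production system would use a proper association-rule library over far larger data.

```python
# Illustrative market-basket co-occurrence count with pandas (toy data).
from itertools import combinations
from collections import Counter
import pandas as pd

orders = pd.DataFrame({
    "order_id": [1, 1, 2, 2, 2, 3, 3],
    "product":  ["checking", "credit_card", "checking", "mortgage",
                 "credit_card", "checking", "credit_card"],
})

pair_counts = Counter()
for _, basket in orders.groupby("order_id")["product"]:
    for pair in combinations(sorted(basket.unique()), 2):
        pair_counts[pair] += 1

print(pair_counts.most_common(3))   # most frequently co-purchased product pairs
```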

Further, Digital implies the need for sophisticated, multifactor business analytics performed in near real time on gigantic data volumes. The only deployment paradigm capable of handling such needs is Cloud Computing – whether public or private. Cloud was initially touted as a platform to rapidly provision compute resources. Now, with the advent of Digital technologies, the Cloud and Big Data will combine to process and store all this information. According to IDC, by 2020 spending on cloud-based Big Data Analytics will outpace on-premise spending by a factor of 4.5 [2].

Intelligent Middleware provides Digital Agility.. 

Digital applications are modular, flexible and responsive to a variety of access methods – mobile and non-mobile. These applications are also highly process-driven and support a high degree of automation. The need of the hour is to provide enterprise architecture capabilities around designing flexible digital platforms that are built around efficient use of data, speed, agility and a service-oriented architecture. The choice of open source is key, as it allows for a modular and flexible architecture that can be modified and adopted in a phased manner – as you will shortly see.

The intention in adopting a SOA (or even a microservices) architecture for Digital capabilities is to allow lines of business an ability to incrementally plug in lightweight business services like customer on-boarding, electronic patient records, performance measurement, trade surveillance, risk analytics, claims management etc.

Intelligent Middleware adds significant value in six specific areas –

  1. Supports a high degree of Process Automation & Orchestration thus enabling the rapid conversion of paper based business processes to a true digital form in a manner that lends itself to continuous improvement & optimization
  2. Business Rules help by adding a high degree of business flexibility and responsiveness (a toy sketch of externalized rules follows this list)
  3. Native Mobile Applications enable platforms to support a range of devices and consumer behaviors across those front ends
  4. Platforms As a Service engines which enable rapid application & business capability development across a range of runtimes and container paradigms
  5. Business Process Integration engines which enable rapid application & business capability development
  6. Middleware brings the notion of DevOps into the equation. Digital projects bring several technology and culture challenges, which can be solved by a greater degree of collaboration, continuous development cycles and new toolchains without giving up proven integration with existing (or legacy) systems.
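To illustrate the business-rules point (item 2), here is a toy Python sketch in which rules live as data so that thresholds and actions can be changed without touching application code. The rule fields and sample record are hypothetical; real middleware would delegate this to a dedicated rules engine.

```python
# Toy sketch of externalized business rules: rules are data, not code.
RULES = [
    {"name": "high_value_order", "field": "order_total", "op": ">", "value": 10_000,
     "action": "route_to_manual_approval"},
    {"name": "new_customer", "field": "tenure_days", "op": "<", "value": 30,
     "action": "require_identity_check"},
]

OPS = {">": lambda a, b: a > b, "<": lambda a, b: a < b, "==": lambda a, b: a == b}

def evaluate(record, rules=RULES):
    """Return the actions triggered by a record against the rule set."""
    return [r["action"] for r in rules if OPS[r["op"]](record[r["field"]], r["value"])]

print(evaluate({"order_total": 12_500, "tenure_days": 400}))   # ['route_to_manual_approval']
```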

Intelligent Middleware not only enables Automation & Orchestration but also provides an assembly environment to string different (micro)services together. Finally, it also enables less technical analysts to drive application lifecycle as much as possible.

Further, Digital business projects call for mobile-native applications – which a forward-looking middleware stack will support. Middleware is a key component for driving innovation and improving operational efficiency.

Five Key Business Drivers for combining Big Data, Intelligent Middleware & the Cloud…

The key benefits of combining the above paradigms to create new Digital Applications are –

  • Enable Elastic Scalability Across the Digital Stack
    Cloud computing can handle the storage and processing of any amount and any kind of data. This calls for the collection and curation of data from dynamic and highly distributed sources such as consumer transactions, B2B interactions, machines such as ATMs and geo-location devices, click streams, social media feeds, server and application log files, and multimedia content such as videos. It needs to be noted that these data volumes consist of varied formats, differing schemas, transport protocols and velocities. Cloud computing provides the underlying elastic foundation to analyze these datasets.
  • Support Polyglot Development, Data Science & Visualization
    Cloud technologies are polyglot in nature. Developers can choose from a range of programming languages (Java, Python, R, Scala and C# etc) and development frameworks (such as Spark and Storm). Cloud offerings also enable data visualization using a range of tools from Excel to BI Platforms.
  • Reduce Time to Market for Digital Business Capabilities
    Enterprises can avoid time-consuming installation, setup and other upfront procedures, and can deploy Hadoop in the cloud without buying new hardware or incurring other up-front costs. In the same vein, big data analytics should support self-service across the lifecycle – from data acquisition and preparation to analysis and visualization.
  • Support a multitude of Deployment Options – Private/Public/Hybrid Cloud 
    A range of scenarios for product development, testing, deployment, backup or cloudbursting are efficiently supported in pursuit of cost & flexibility goals.
  • Fill the Talent Gap
    Open source technology is the common thread across Cloud, Big Data and Middleware. The hope is that the ubiquity of open source will serve as a critical lever in closing the IT-business skills gap.

As opposed to building standalone or one-off business applications, a ‘Digital Platform Mindset’ is a more holistic approach capable of producing higher rates of adoption and thus revenues. Platforms abound in the web-scale world at shops like Apple, Facebook and Google. Digital applications are constructed like Lego blocks, and they reuse customer and interaction data to drive cross-sell and up-sell among different product lines. The key is to start with products that have high customer attachment and retention. While increasing brand value, it is also key to ensure that customers and partners can collaborate on improvements to the various applications hosted on top of the platform.

References

[1] Forbes Roundup of Big Data Analytics (BDA) Report

http://www.forbes.com/sites/louiscolumbus/2016/08/20/roundup-of-analytics-big-data-bi-forecasts-and-market-estimates-2016/#b49033b49c5f

[2] IDC FutureScape: Worldwide Big Data and Analytics 2016 Predictions

Five Areas Where Big Data Drives Innovation in the Bill Pay Industry..

As the Bill Pay Industry Motors On…

The traditional model of service providers relying on call centers and face-to-face interactions with their customers to gauge satisfaction is long past. With the advent of PSD2, the regulatory authorities themselves may be more open to new business models in the Bill Pay space.

With the explosion of data being collected from mobile applications, location-based devices and social media, Bill Pay providers can monetize their years of historical data by opportunistically combining the above and providing analytics in the five strategic areas below:

  1. Ensuring the best possible & timely Customer Payment Experience – Younger customers are typically very happy to leverage online channels like mobile phones and web applications to make their payments instead of using paper-based mailing. Using online channels to process payments also results in higher degrees of both end-customer and service-provider satisfaction, as it is quicker in terms of funds transfer and availability and is also less error prone. Leveraging Big Data to understand which customers prefer mobile channels (based on lifestyle and behavioral preferences) and helping them download service provider mobile applications can accelerate mobile payment adoption. Another key use case is to understand which customers typically pay just before or after the deadline, thus incurring late fees – another source of customer dissatisfaction (a minimal sketch of this analysis follows this list). Again, understanding customer payment modes and trends can help increase customer satisfaction here. The ability to reach out to a customer via the mode they prefer (mobile app, text message, or phone call) can also help address customer dissatisfaction with services.
  2. Providing a Unified View of the Customer Across Multiple Service Accounts – Creating a single customer profile or view across multiple household services and interactions, together with payment history across those services, gives Service Providers the ability to understand the total Customer Lifetime Value (CLV) of a single customer. Creating this profile can also help drive business value in the following areas:
  • What mode of contact do they prefer? And at what time? Can Customers be better targeted at these channels at those preferred times?
  • What is the overall Customer Lifetime Value (CLV) or how much profit we are able to generate from this customer over their total lifetime?
  • By understanding CLV across populations, can Service Providers leverage that to increase spend on marketing & sales for products that are resulting in higher customer value?
  • Which of my customers are targets for promoting Green Services and Products?
  • What Features are customers currently missing?
  • How can Service Providers increase cross-sell and up-sell of products and services?
  • Does this customer fall into a certain natural segment and if so, how can we acquire most customers like them?
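As a small illustration of the payment-timing use case in point 1, the pandas sketch below computes how early or late each customer typically pays relative to the due date and flags those who are candidates for a proactive reminder. Customers, dates and the two-day threshold are invented for illustration.

```python
# Hypothetical sketch: flag customers who habitually pay close to (or after) the due date.
import pandas as pd

payments = pd.DataFrame({
    "customer_id": ["c1", "c1", "c2", "c2", "c3"],
    "due_date":  pd.to_datetime(["2017-01-15", "2017-02-15", "2017-01-20", "2017-02-20", "2017-01-25"]),
    "paid_date": pd.to_datetime(["2017-01-14", "2017-02-16", "2017-01-05", "2017-02-06", "2017-01-27"]),
})

payments["days_late"] = (payments["paid_date"] - payments["due_date"]).dt.days
summary = payments.groupby("customer_id")["days_late"].mean()

# Flag customers who, on average, pay less than two days early or actually late
at_risk = summary[summary > -2]
print(at_risk)
```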

 

Illustration: Five Ways for Bill Pay Providers to Monetize their Data Assets

  3. Improving Customer Satisfaction – Creating a single customer profile or view across multiple household services and interactions can provide an ability for Service Providers to understand the total Customer Lifetime Value (CLV) of a single customer. Creating this profile can also help drive business value in areas such as Customer Satisfaction, Customer NPS (Net Promoter Score), Customer Mood & Willingness to adopt new services, Customer Retention etc.
  4. Analytics As A Service to interested 3rd Parties

The ability of consumers to make their household services payments can serve as a reliable indicator of household economic health as well as a sign of their willingness to adopt new products and services. This data can be anonymized at an individual consumer level, analyzed using machine learning and be provided as a service to various stakeholders – Other businesses like Retailers, the Government & the Regulatory Authorities.

Concrete examples include –

  • Combining social data, demographic data and bill pay data with other credit data can help the Government gauge the direction of the economy. Obviously, the more data that can be merged into this model (e.g. mortgage payment data), the better its overall accuracy
  • Allowing Retailers to analyze consumer mobile usage data, bill pay data, credit records as well as use external data (social media etc) to predict what products they may like etc and to target promotions & card offers etc

A final note on the overall scope of Predictive Analytics in this use case –

  • Obtaining a real-time Single View of the Customer (typically a customer across multiple channels, product silos & geographies) across years of account history
  • Customer Segmentation by helping businesses understand customer segments down to the individual level as well as at a segment level
  • Performing Customer sentiment analysis by combining internal organizational data, clickstream data, sentiment analysis with structured sales history to provide a clear view into consumer behavior.
  • Product Recommendation engines which provide compelling personal product recommendations by mining realtime consumer sentiment, product affinity information with historical data etc.
  • Market Basket Analysis, observing consumer purchase history and enriching this data with social media, web activity, and community sentiment regarding past purchase and future buying trends.

5. Service Provider Analytics

Service Providers can themselves access this data to help with the various areas of their operations –

  • Improve new Consumer Acquisition by creating client profiles and helping develop targeted leads across a population of individuals
  • Instrument and understand Risk at multiple levels (customer churn, client risk etc) in real time
  • Financial risk modeling across multiple dimensions
  • For Providers with multiple products & services (e.g Cable, Voice and Internet), Basket Analysis based on criteria like behavioral preferences, asset allocation etc – i.e “what products & services are typically purchased in tandem”
  • Run in-place analytics on customer lifetime value (CLV) and yield per customer (a simple CLV illustration follows this list)
  • Suggest Next Best Action for a given client and across a pool of customers
  • Provide multiple levels of dashboards ranging from the Descriptive (Business Intelligence) to the Prescriptive (business simulation as well as optimization)
  • Help with Compliance and other reporting functions
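As a simple illustration of the CLV item above, here is a toy Python calculation of customer lifetime value as the discounted sum of expected monthly margins under geometric retention. The margin, churn and discount figures are invented; a production CLV model would be considerably richer.

```python
# Toy CLV calculation: discounted expected monthly margin under geometric retention.
def customer_lifetime_value(monthly_margin, monthly_churn_rate,
                            discount_rate_annual=0.08, horizon_months=120):
    """Sum of discounted expected monthly margins over a fixed horizon."""
    monthly_discount = (1 + discount_rate_annual) ** (1 / 12) - 1
    clv = 0.0
    survival = 1.0
    for month in range(1, horizon_months + 1):
        survival *= (1 - monthly_churn_rate)          # probability the customer is still active
        clv += monthly_margin * survival / (1 + monthly_discount) ** month
    return clv

# Example: $18/month margin, 2% monthly churn
print(round(customer_lifetime_value(18.0, 0.02), 2))
```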

Conclusion…

Bill Pay is a specialized area of the payments industry. However, the massive amounts of historical customer & service data that players possess can be advantageously leveraged to provide value added services and ultimately drive new business models.

Can Your CIO Do Digital?

“Business model innovation is the new contribution of IT” — Werner Boeing, CIO, Roche Diagnostics

Digital Is Changing the Role of the Industry CIO…

A motley crew of somewhat interrelated technologies – Cloud Computing, Big Data Platforms, Predictive Analytics and Mobile Applications – is changing the enterprise IT landscape. The common paradigm that captures all of them is Digital. The immense business value of Digital technology is no longer in question, from both a customer and an enterprise standpoint. However, the Digital space calls for strong and visionary leadership from both a business and an IT standpoint.

Business Boards and CXOs are now concerned about their organization’s overall level and maturity of digital investments – not only the tangible business value in existing operations (e.g. increasing sales and customer satisfaction, detecting fraud, driving down business and IT costs) but also the ability to fine-tune or create new business models by leveraging Digital paradigms. It is thus an increasingly accurate argument that smart applications and ecosystems built around digitization will dictate enterprise success.

The onset of Digital Architectures in enterprise businesses implies the ability to drive continuous micro level interactions with global consumers/customers/clients/stockholders or patients depending on the vertical you operate in. Initially enterprises viewed Digital as a bolt-on or a fresh color of paint on an existing IT operation.

How did that change over the last five years?

Mobile applications first began forcing enterprises to support multiple channels of interaction with their consumers. We have seen how exploding data generation across the global economy has become a clear and present business and IT phenomenon, with data volumes rapidly expanding across industries. It is not just the production of data by mobile applications that has increased; so has the need for organizations to derive business value from it using advanced techniques such as Data Science and Machine Learning. As a first step, this calls for the collection and curation of data from dynamic and highly distributed sources such as consumer transactions, B2B interactions, machines such as ATMs and geo-location devices, click streams, social media feeds, server and application log files, and multimedia content such as videos – using Big Data. Often these workloads are run on servers hosted on an agile infrastructure such as a public or private cloud.

As one can understand from the above paragraph, the Digital Age calls for a diverse set of fresh skills – both from IT leadership and the rank & file. The role of the Chief Information Officer (CIO) is thus metamorphosing from being an infrastructure service provider to being the overall organizational thought leader in the Digital Age.

The question is – Can Industry CIOs adapt?

The Classic CIO is a provider of IT Infrastructure services.. 

Illustration: The Concerns of a CIO..

So what do CIOs typically think about nowadays?

  1. Keep the core stable and running so IT delivers minimal services to the business and disarm external competition
  2. Are parts of my business really startups and should they be treated as such and should they be kept away from the shackles of inflexible legacy IT? Do I need a digital strategy?
  3. What does the emergence of the 3rd platform (Cloud, Mobility,Social and Big Data) imply?
  4. Where can I show the value of expertise and IT to the money making lines of business?
  5. How can one do all the above while keeping track of Corporate and IT security?

CIOs who do not adapt are on the road to irrelevance…

Where CIOs are perceived as merely managing complex legacy systems, the new role of Chief Digital Officer (CDO) has gained currency – the idea being that a parallel and more agile IT organization can be created and run to foster an ecosystem of innovation, and that the office of the CDO is the right place to drive these innovative applications.

Why is that?

  1. CIOs that cannot, or that seem disengaged with, creating innovation through IT are headed the way of the dodo. At the enterprise officer (CIO/CTO) level, it becomes very obvious that more than ever “IT is not just a complementary function or a supplementary service – IT is the Business”. If that was merely something we all paid lip service to in the past, it is hard reality now. So it is not a case of which company can make the best widgets or has the fastest trading platforms or the most efficient electronic health records; it is whose enterprise IT can provide the best possible results within a given cost that will win. It’s up to CIOs to deliver, and to deliver in such a way that large established organizations can compete with upstarts who do not have the same kind of enterprise constraints and shackles.
  2. Innovation & information now follow an “outside in” model, as opposed to data and value being generated solely by internal functions (sales, engineering, customer fulfillment, core business processes etc.). Enterprise customers are beginning to operate in what I like to think of as the new normal: entropy. It is these conditions that make it imperative for IT leadership to reconsider their core business applications at the enterprise level. Does internal IT infrastructure need to look more like that of the internet giants?
  3. As a result of the above trends, CIOs are clearly now business level stakeholders more than ever. This means that they need to engage & understand their business at a deep level from an ecosystem and competitive standpoint. Those that cannot do it are neither very effective nor in those positions for long.
  4. Also, it is not merely enough to be a passive stakeholder; CIOs have to deliver on two very broad fronts. The first is to deliver core services (aka standardized functions) on time and at a reasonable cost. These are things like core banking systems, email and data backups – ensuring the smooth operation of transactional systems such as ERP/business processing systems in manufacturing, decision support systems, classic IT infrastructure, claims management systems in Insurance and billing systems in Healthcare. These are the systems that need to run to keep business operations going. The focus here is to deliver them on time and within SLAs to increasingly demanding internal customers. It is like running the NYC subway – no one praises you for keeping things humming day in and day out, but all hell breaks loose when the trains are nonoperational for any reason. A thankless task, but one essential to winning credibility with the lines of business.
  5. The advent of the public cloud means that internal IT no longer has a monopoly or a captive internal customer base, even for core services. If one cannot compete with the likes of Amazon AWS or any of the SaaS clouds that are mushrooming on a quarterly basis, you will soon find that you have to co-exist with Not-So-Shadow IT. The industry has seen enough back-office CIOs who are perceived by their organizations as having a largely irrelevant role in the evolution of the larger enterprise.
  6. Despite the continued focus on running a strong core as the price of CIO admission to internal strategic conversations, transformation is starting to emerge as a key business driver and is making its way into the larger industry. It is no longer the province of Wall St trading shops or a Google or a Facebook. Innovation here means “adopt this strategy, reinvent your IT and change the business”. The operative word is incremental rather than disruptive innovation. More on this key point later.
  7. Most rank and file IT personnel cannot really keep up with all the nomenclature of technology. For instance, a majority do not really understand umbrella concepts like Cloud, Mobility and Big Data. They know what these mean at a high level, but the complex technology underpinnings, the various projects & the finer nuances are largely lost on them. Overworked IT personnel face two stark choices from a time perspective – a) do you increase your value to your corporation by learning to speak the lingua franca of your business and investing in those skills, moving away from a traditional IT employee mindset? or b) do you increase your IT depth in your area of expertise? The first makes one a valued collaborator and paves your way up the chain; the second may increase your marketability in the industry, but it is not easy to keep up. We find that an increasing number of employees choose the first path, which creates interesting openings and arbitrage opportunities for other groups in the organization. The CIO needs to step up and be the internal change agent.

CONCLUSION…

Enterprise wide business innovation will continue to be designed around the key technologies of the 3rd platform (Cloud, Mobility, Social & Big Data). Business Platforms created by leveraging these technologies will create immense operational efficiency, better business models, increased relevance to customers and ultimately drive revenues. Such platforms will separate the visionaries & leaders from the laggards in the years to come. As often noticed, the keyword accompanying transformation is digital. This means a renewed focus on making IT services appealing to millennials, the self service generation – be they customers, employees or partners. This touches all areas of enterprise IT and leaves a significant imprint on organizational culture.

This is the age of IT with no boundaries – the question is whether the role of the CIO will largely remain unscathed in the years to come.

Demystifying Digital – Why Customer 360 is the Foundational Digital Capability – ..(1/3)

The first post in this three part series on Digital Foundations introduces the concept of Customer 360 or Single View of Customer (SVC). We will discuss the need for & the definition of SVC as the first step in any Digital Transformation endeavor. We will also discuss specific benefits, from both a business & an operational standpoint, that are enabled by SVC. The second post in the series introduces the concept of a Customer Journey. The third & final post will focus on the technical design & architecture needed to achieve both these capabilities.
 
In an era of exploding organizational touch points, how many companies can truly claim that they know & understand their customers, their needs & evolving preferences deeply and from a realtime perspective?  
How many companies can claim to keep up as a customer's product & service usage matures, and to keep them engaged by cross-selling new offerings? How many can accurately predict future revenue from a customer based on their current understanding of their profile?
The answer is not at all encouraging.
Across industries like Banking, Insurance, Telecom & Manufacturing, the ability to get a unified view of the customer & their journey is at the heart of the enterprise's ability to promote relevant offerings & detect customer dissatisfaction.
  • Currently most industry players are woeful at putting together this comprehensive Single View of their Customers (SVC). Due to operational silos, each department possesses a siloed & limited view of the customer across multiple channels. These views are typically inconsistent, lack synchronization with other departments & miss a high number of potential cross-sell and up-sell opportunities.
  • The Customer Journey problem has been an age old issue which has gotten exponentially more complicated over the last five years as the staggering rise of mobile technology and the Internet of Things (IoT) have vastly increased the number of enterprise touch points that customers are exposed to in terms of being able to discover & purchase new products/services. In an OmniChannel world, an increasing number of transactions are being conducted online. In verticals like Retail and Banking, the number of online transactions approaches an average of 40%. Adding to the problem, more and more consumers are posting product reviews and feedback online. Companies thus need to react in realtime to piece together the source of consumer dissatisfaction.
Another large component of customer outreach is Marketing analytics & the ability to run effective campaigns to recruit customers.

The most common questions that a lot of enterprises fail to answer accurately are –

  1. Is the Customer happy with their overall relationship experience?
  2. What mode of contact do they prefer? And at what time? Can Customers be better targeted at these channels at those preferred times?
  3. What is the overall Customer Lifetime Value (CLV), i.e. how much profit are we able to generate from this customer over their total lifetime? (A worked sketch of one simple CLV model follows this list.)
  4. By understanding CLV across populations, can we leverage that to increase spend on marketing & sales for products that are resulting in higher customer value?
  5. How do we increase cross sell and up-sell of products & services?
  6. Does this customer fall into a certain natural segment and if so, how can we acquire most customers like them?
  7. Can different channels (Online, Mobile, IVR & POS) be synchronized? Can Customers begin a transaction in one channel and complete it in any of the others without having to resubmit their data?
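For question 3 above, one simple and purely illustrative way to estimate CLV is to discount the margin expected from a customer over future periods. The formula, parameter names and sample numbers below are assumptions for illustration, not a prescribed model.

```python
# Illustrative CLV sketch: sum of (margin * retention^t) / (1 + discount_rate)^t
# over a fixed horizon. All inputs below are made-up example values.
def customer_lifetime_value(annual_margin, retention_rate, discount_rate, years=10):
    """Discounted margin expected from a customer over `years` periods."""
    return sum(
        annual_margin * (retention_rate ** t) / ((1 + discount_rate) ** t)
        for t in range(1, years + 1)
    )

# Example: $400 annual margin, 80% retention, 10% discount rate, 10 year horizon
print(round(customer_lifetime_value(400, 0.80, 0.10), 2))   # ≈ 1022.51
```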

The first element of Digital is Customer Centricity, and it naturally follows that a 360 degree view of the customer is a huge part of that.

single-view-of-the-customer

                                       Illustration – Customer 360 view & its benefits

So what information is specifically contained in a Customer 360 –

The 360 degree view is a snapshot of the below types of data (a minimal illustrative representation follows the list) –

  • Customer’s Demographic information – Name, Address, Age etc
  • Length of the Customer-Enterprise relationship
  • Products and Services purchased overall
  • Preferred Channel & time of Contact
  • Marketing Campaigns the customer has responded to
  • Major Milestones in the Customer's relationship
  • Ongoing activity – Open Orders, Deposits, Shipments, Customer Cases etc
  • Ongoing Customer Lifetime Value (CLV) Metrics and the Category of customer (Gold, Silver, Bronze etc)
  • Any Risk factors – Likelihood of Churn, Customer Mood Alert, Ongoing issues etc
  • Next Best Action for Customer
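As a concrete, purely illustrative way to picture this snapshot, here is a minimal sketch of how such a record might be represented. The field names are assumptions mirroring the list above, not a mandated schema.

```python
# Hypothetical Customer 360 record; fields mirror the snapshot described above.
from dataclasses import dataclass, field
from datetime import date
from typing import List, Optional

@dataclass
class Customer360:
    # Demographics & relationship
    customer_id: str
    name: str
    address: str
    age: Optional[int] = None
    relationship_start: Optional[date] = None

    # Products, channels & campaigns
    products_owned: List[str] = field(default_factory=list)
    preferred_channel: Optional[str] = None        # e.g. "Mobile", "Call Center"
    campaigns_responded_to: List[str] = field(default_factory=list)

    # Ongoing activity & value
    open_orders: List[str] = field(default_factory=list)
    lifetime_value: Optional[float] = None         # ongoing CLV metric
    customer_tier: Optional[str] = None            # "Gold", "Silver", "Bronze"

    # Risk & recommended action
    churn_risk_score: Optional[float] = None
    next_best_action: Optional[str] = None
```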

How Big Data technology can help..

Leveraging the ingestion and predictive capabilities of a Big Data based platform, banks can provide a user experience that rivals Facebook, Twitter or Google and deliver a full picture of the customer across all touch points.

Big Data enhances the Customer 360 capability in the following ways  –  

  1. Obtaining a realtime Single View of the Customer (typically a customer across multiple channels, product silos & geographies) across years of account history 
  2. Customer Segmentation – helping businesses understand their customers down to the individual level as well as at an overall segment level (see the sketch after this list)
  3. Performing Customer sentiment analysis by combining internal organizational data & clickstream data with structured sales history to provide a clear view into consumer behavior.
  4. Product Recommendation engines which provide compelling, personalized product recommendations by mining realtime consumer sentiment & product affinity information together with historical data.
  5. Market Basket Analysis, observing consumer purchase history and enriching this data with social media, web activity, and community sentiment regarding past purchase and future buying trends.
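To make capability 2 above concrete, here is a minimal, hedged sketch of segmenting customers with k-means clustering. The feature names, sample values and the choice of algorithm are assumptions for illustration, not a prescribed method.

```python
# Illustrative customer segmentation sketch on made-up per-customer features.
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

# Hypothetical features derived from curated Customer 360 data
customers = pd.DataFrame({
    "annual_spend":  [1200, 300, 8500, 450, 6100, 220],
    "txn_frequency": [  24,   5,   60,   8,   41,   3],
    "tenure_years":  [   3,   1,   12,   2,    9,   1],
    "channels_used": [   2,   1,    4,   1,    3,   1],
})

# Standardize features so no single one dominates, then cluster into segments
features = StandardScaler().fit_transform(customers)
customers["segment"] = KMeans(n_clusters=3, n_init=10, random_state=42).fit_predict(features)
print(customers.sort_values("segment"))
```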

Customer 360 can help improve the following operational metrics of a Retailer or a Bank or a Telecom immensely.

  1. Cost to Income ratio; Customers Acquired per FTE; Sales and Service FTEs (as a percentage of total FTEs); New Accounts per Sales FTE etc
  2.  Sales conversion rates across channels, Decreased customer attrition rates etc.
  3. Improved Net Promoter Scores (NPS – see the sketch after this list), referral based sales etc
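For metric 3, Net Promoter Score is conventionally computed as the percentage of promoters (scores of 9-10 on a 0-10 survey) minus the percentage of detractors (scores of 0-6). A tiny sketch with made-up survey responses:

```python
# NPS = % promoters (9-10) minus % detractors (0-6); sample scores are fabricated.
def net_promoter_score(scores):
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100.0 * (promoters - detractors) / len(scores)

survey = [10, 9, 8, 7, 9, 6, 10, 4, 9, 8]
print(net_promoter_score(survey))   # 5 promoters, 2 detractors out of 10 -> 30.0
```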

Customer 360 is thus a basic digital capability every organization needs to offer their customers, partners & internal stakeholders. This implies a re-architecture of both data management and business process automation.

The next post will discuss the second critical component of Digital Transformation – the Customer Journey.

Embedding A Culture of Business Analytics into the Enterprise DNA..

“IT driven business transformation is always bound to fail” – Amber Storey, Sr Manager, Ernst & Young

The value of Big Data driven Analytics is no longer in question, from either a customer or an enterprise standpoint. Lack of investment in an analytics strategy has the potential to impact shareholder value negatively. Business Boards and CXOs are now concerned about the overall level and maturity of their investments in terms of business value – i.e. increasing sales, driving down business & IT costs & helping create new business models. It is thus an increasingly accurate argument that smart applications, & the ecosystems built around them, will dictate enterprise success.

Such examples among forward looking organizations abound across industries. These range from realtime analytics in manufacturing using IoT data streams across the supply chain, to the use of natural language processing to drive patient care decisions in healthcare, to more accurate insurance fraud detection & Digital interactions in Retail Banking, to name a few.

However, most global organizations currently adopt a fairly tactical approach to delivering traditional business intelligence (BI) and predictive analytics to their application platforms. This departmental approach is quite suboptimal, as scalable data driven decisions & culture not only empower decision-makers with up to date, realtime information but also help them develop long term insights into how globally diversified business operations are performing. Scale is the key word here, due to rapidly changing customer trends, partner & supply chain realities & regulatory mandates.

Scale implies speed of learning and business agility across the organization – the ability of globally diversified operations to turn on a dime – thus ensuring that the business feels empowered.

A quick introduction to Business (Descriptive & Predictive) Analytics –

Business intelligence (BI) is a traditional & well established analytical domain that essentially takes a retrospective look at business data in systems of record. The goal of BI is primarily to look for macro or aggregate business trends across different dimensions such as time, product lines, business units & operating geographies.

BI is primarily concerned with “What happened and what trends exist in the business based on historical data?“. The typical use cases for BI include budgeting, business forecasts, reporting & key performance indicators (KPI).

On the other hand, Predictive Analytics (a subset of Data Science) augments & builds on the BI paradigm by adding a “What could happen” dimension to the data in terms of –

  • being able to probabilistically predict different business scenarios across thousands of variables
  • suggesting specific business actions based on the above outcomes

Predictive Analytics is not intended to replace the BI domain, nor will it; it adds significant business capabilities that lead to overall business success. It is not uncommon to find real world business projects leveraging both analytical approaches.
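As a small, purely illustrative contrast between the two approaches, consider a BI-style aggregation ("what happened") alongside a simple churn model ("what could happen") on the same made-up customer table. The column names, sample values and choice of model are assumptions, not a recommended toolchain.

```python
# Descriptive (BI) vs predictive analytics on a fabricated customer table.
import pandas as pd
from sklearn.linear_model import LogisticRegression

data = pd.DataFrame({
    "region":        ["East", "West", "East", "West", "East", "West"],
    "annual_spend":  [1200,    300,   8500,   450,    6100,   220],
    "support_calls": [1,       6,     0,      5,      2,      7],
    "churned":       [0,       1,     0,      1,      0,      1],
})

# BI / descriptive: aggregate historical revenue by region ("what happened")
print(data.groupby("region")["annual_spend"].sum())

# Predictive: estimate the probability that a similar customer will churn
model = LogisticRegression().fit(data[["annual_spend", "support_calls"]], data["churned"])
new_customer = pd.DataFrame({"annual_spend": [900], "support_calls": [4]})
print(model.predict_proba(new_customer)[0, 1])   # estimated churn probability
```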

Creating an industrial approach to analytics – 

Strategic business projects typically imbibe a BI/Predictive Analytics based approach only as an afterthought to the other aspects of system architecture and buildout. This dated approach ensures that analytics remains external to the business system and eventually operates in a purely reactive mode.

Having said that, one does need to recognize that an industrial approach to analytics is a complex endeavor that depends on how an organization tackles the convergence of the below approaches –

  1. Organizational Structure
  2. New Age Technology 
  3. A Platforms Mindset
  4. Culture

Creating_An_Analytic_Culture

        Illustration – Embedding A Culture of Business Analytics into the Enterprise DNA..

Lets discuss them briefly – 

Organizational Structure – The historical approach has been to staff analytics teams as a standalone division, often reporting to the CIO. This team has responsibility for business intelligence as well as some silo of a data strategy. Such a piecemeal approach to predictive analytics ensures that business & application teams adopt a “throw it over the wall” mentality over time.

So what needs to be done? 

In the Digital Age, enterprises should look to centralize both data management and the governance of analytics as core business capabilities. I suggest a hybrid organizational structure in which a Center of Excellence (COE) is created that reports to the office of the Chief Data Officer (CDO) as well as to individual business analytics leaders within the lines of business themselves.

 This should be done to ensure that three specific areas are adequately tackled using a centralized approach- 

  • Investing in creating a data & analytics roadmap by creating a center of excellence (COE)
  • Setting appropriate business milestones with “lines of business” value drivers built into a robust ROI model
  • Managing Risk across the enterprise with detailed scenario planning

New Age Technology –

The onset of Digital Architectures in enterprise businesses implies the ability to drive continuous online interactions with global consumers/customers/clients or patients. The goal is not just to provide engaging visualization but also to personalize the services clients care about across multiple modes of interaction. Mobile applications first began forcing enterprises to support multiple channels of interaction with their consumers. We have seen how exploding data generation across the global economy has become a clear & present business & IT phenomenon. Data volumes are rapidly expanding across industries. Not only has the production of data itself increased; it is also driving the need for organizations to derive business value from it. This calls for the collection & curation of data from dynamic and highly distributed sources such as consumer transactions, B2B interactions, machines such as ATMs & geolocation devices, click streams, social media feeds, server & application log files and multimedia content such as videos – using Big Data.

Cloud Computing is the ideal platform to provide the business with self service as well as rapid provisioning of business analytics. Every new application designed needs to be cloud native from the get go.

As noted above, supporting multiple channels of interaction is now table stakes. For example, Banking now requires an ability to engage consumers in a seamless experience across an average of four to five channels – Mobile, eBanking, Call Center, Kiosk etc.

A Platforms Mindset – 

As opposed to building standalone or one-off business applications, a Platform Mindset is a more holistic approach capable of producing higher revenues. Platforms abound in the webscale world at shops like Apple, Facebook & Google. Applications are constructed like lego blocks, reusing customer & interaction data to drive cross-sell and up-sell across different product lines. The key is to start with products that have high customer attachment & retention and, while building brand value, to ensure that customers & partners can collaborate on improvements to the various applications hosted on top of the platform.

Culture – Business value fueled by analytics is only possible if the entire organization operates on an agile basis and collaborates across the value chain. Cross functional teams across new product development, customer acquisition & retention, IT Ops, legal & compliance must collaborate in short work cycles to close the traditional business & IT innovation gap. Methodologies like DevOps – whose chief goal is to close the long-standing gap between the engineers who develop and test IT capability and the organizations responsible for deploying and maintaining IT operations – must be adopted. Using traditional app dev methodologies, it can take months to design, test and deploy software. No business today has that much time, especially in the age of IT consumerization and end users accustomed to smartphone apps that are updated daily. The focus now is on rapidly developing business applications to stay ahead of competitors that can better harness Big Data's business capabilities.

Summary- 

Enterprise wide business analytics approaches designed around the four key prongs (Structure, Culture, Technology & Platforms) will create immense operational efficiency, better business models, increased relevance to customers and ultimately drive revenues. These will separate the visionaries & leaders from the laggards in the years to come.

What Lines Of Business Want From IT..

Relationship

                    Illustration: Business- IT Relationship (Image src – Pat.it)

Previous posts in this blog have discussed the fact that technological capabilities now make or break business models. It is critical for IT to operate in a manner that maximizes their efficiency while managing costs & ultimately delivering the right outcomes for the organization.

It is clear and apparent to me that the relationship lines of business (LOBs) have with their IT teams – typically central & shared – is completely broken at a majority of large organizations. Neither side seems able to appreciate the perspective or the passions of the other. This dangerous dysfunction usually leads to multiple complaints from the business, examples of which include –

  • IT is perceived to be glacially slow in providing infrastructure needed to launch new business initiatives or to amend existing ones. This leads to the phenomenon of ‘Shadow IT’, where business applications are run on public clouds, bypassing internal IT
  • Something seems to be lost in translation while conveying requirements to different teams within IT
  • IT is too focused on technological capabilities – Virtualization, Middleware, Cloud, Containers, Hadoop et al without much emphasis on business value drivers

So what are the top asks that Business has for their IT groups? I wager that there are five important focus areas –

  1. Transact in the language of the business – Most would agree that there has been too much focus on the technology itself – how it works, what the infrastructure requirements are to host applications (cloud or on-prem), the data engines needed to ingest and process data etc. The focus needs to be on the customer needs that drive business value for an organization's customers, partners, regulators & employees. Technology at its core is just an engine and does not exist in a vacuum. The most vibrant enterprises understand this ground reality and always ensure that business needs drive IT, not the other way around. It is thus highly important for IT leadership to understand the nuances of the business so that their roadmaps (long and medium term) are driven with business & competitive outcomes in mind. Examples of such goals are a common organization wide taxonomy across products, customers, logistics, supply chains & business domains. The shared emphasis of both business & IT should be on goals like increased profitability per customer and enhanced segmentation of both micro and macro customer populations, with the ultimate goal of increasing customer lifetime value (CLV).
  2. Bi-Modal or “2 Speed” IT needs to be business centric – Digital business models that drive agile web-scale companies offer enhanced customer experiences built on product innovation and data driven business models. They are also encroaching into the domain of established industry players in verticals like financial services, retail, entertainment, telecommunications, transportation and insurance by offering contextual & trendy products tailored to individual client profiles. Their savvy use of segmentation data and realtime predictive analytics enables the delivery of bundles of tailored products across multiple delivery channels (web, mobile, point of sale etc.). The enterprise response has been to adopt a model known as Bi-Modal IT, championed by Gartner. This model envisages two different IT camps – one focused on traditional applications and the other focused on innovation. Whatever the moniker for this approach, LOBs need to be involved as stakeholders from the get-go & throughout the process of selecting technology choices that have downstream business ramifications. One approach that is working well is increased cross pollination across both teams, collapsing artificial organizational barriers by adopting DevOps & ensuring that the business has a slim IT component that can rapidly fill gaps in IT's business knowledge or capability.
  3. Self Service Across the board of IT Capabilities – Shadow IT (where business goes around the IT team) is not just an issue with infrastructure software but is slowly creeping up to business intelligence and advanced analytics apps. The delays associated with provisioning legacy data silos combined with using tools that are neither intuitive nor able to scale to deal with the increasing data deluge are making timely business analysis almost impossible to perform.  Insights delivered too late are not very valuable. Thus, LOBs are beginning  to move to a predominantly online SaaS (Software As A Service) model across a range of business intelligence applications. Reports, visual views of internal & external datasets are directly served to internal consumers based on data uploaded into a cloud based BI provider. These reports and views are then directly delivered to end users. IT needs to enable this capability and make it part of their range of offerings to the business.
  4. Help the Business think Analytically – Business Process Management (BPM) and data driven decision making are proven approaches at data-driven organizations. When combined with Data and Business Analytics, this tends to be a killer combination. Organizations that are data & metric driven are able to define key business processes with native support for the key performance indicators (KPIs) that are critical to their functioning. Applications developed by IT need to be designed in such a way that these KPIs can be communicated and broadcast across the organization constantly (a minimal KPI sketch follows this list). Indeed, a high percentage of organizations now have a senior executive in place as the champion for BPM, Business Rules and Big Data driven analytics. These applications should also be mobile native, so that field based employees can access them through a variety of mobile platforms from outside the corporate firewall.
  5. No “Us vs Them” mentality – it is all “Us” – None of the above is possible unless the entire organization operates on an agile basis and collaborates across the value chain. Cross functional teams across new product development, customer acquisition & retention, IT Ops, legal & compliance must collaborate in short work cycles to close the traditional business & IT innovation gap. One of the chief goals of agile methodologies is to close the long-standing gap between the engineers who develop and test IT capability and the business requirements for such capabilities. Using traditional app dev methodologies, it can take months to design, test and deploy software – which is simply unsustainable.
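As a minimal sketch of point 4 above – treating KPIs as first-class definitions that an application can evaluate and broadcast constantly – the names, thresholds and sample metrics below are hypothetical.

```python
# Hypothetical KPI definition that an application could evaluate and broadcast.
from dataclasses import dataclass
from typing import Callable, Dict

@dataclass
class KPI:
    name: str
    target: float
    compute: Callable[[Dict[str, float]], float]   # derives the KPI from raw metrics

    def status(self, metrics: Dict[str, float]) -> str:
        value = self.compute(metrics)
        flag = "OK" if value >= self.target else "ALERT"
        return f"{self.name}: {value:.1f} (target {self.target}) {flag}"

conversion_rate = KPI(
    name="Online conversion rate (%)",
    target=3.0,
    compute=lambda m: 100.0 * m["orders"] / m["sessions"],
)

# Raw operational metrics would normally arrive from the BPM / analytics layer
print(conversion_rate.status({"orders": 240.0, "sessions": 9000.0}))   # ALERT at ~2.7%
```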

Business & IT need to collaborate. Period. –

The most vibrant enterprises that have implemented web-scale practices not only offer “IT/Business As A Service” but have also instituted strong cultures of symbiotic relationships among customers (both current & prospective), employees, partners and developers.

No business today has much time to innovate – especially in the age of IT consumerization, where end users are accustomed to smartphone apps that are often updated daily. The focus now is on rapidly developing business applications to stay ahead of competitors that can better harness technology's business capabilities.