Why Platform as a Service (PaaS) Adoption will take off in 2017..


Since the time Steve Ballmer went ballistic professing his love for developers, it has been a virtual mantra in the technology industry that developer adoption is key to the success of a given platform. On the face of it, Platform as a Service (PaaS) is a boon to enterprise developers who are tired of the inefficiencies of old school application development environments & stacks. Further, a couple of years ago, PaaS seemed to be the flavor of the future given the focus on Cloud Computing. This blog post examines the advantages of the generic PaaS approach while discussing its lagging rate of adoption in the cloud computing market as compared with its cloud cousins – IaaS (Infrastructure as a Service) and SaaS (Software as a Service).

Platform as a Service (PaaS) as the foundation for developing Digital, Cloud Native Applications…

Call them Digital or Cloud Native or Modern. The nature of applications in the industry is slowly changing, and so are the cultural underpinnings of the development process – from waterfall to agile to DevOps. At the same time, Cloud Computing and Big Data are enabling the creation of smart data applications. Leading business organizations are cognizant of the need to attract and retain the best possible talent – often competing with the FANGs (Facebook, Amazon, Netflix & Google).

Couple all this with the immense industry and venture capital interest around container oriented & cloud native technologies like Docker, and you have a vendor arms race in the making.

Infrastructure is enabling, but in the end it is the applications that are Queen or King. That is where PaaS comes in.


Enter Platform as a Service (PaaS)…

Platform as a Service (PaaS) is one of the three main cloud delivery models, the other two being IaaS (infrastructure such as compute, network & storage services) and SaaS (business applications delivered over the cloud). A collection of different cloud technologies, PaaS focuses exclusively on application development & delivery. PaaS advocates a new kind of development based on native support for concepts like agile development, unit testing, continuous integration and automatic scaling, while providing a range of middleware capabilities. Applications developed on a PaaS can be deployed as services & managed across thousands of application instances.

In short, PaaS is the ideal platform for creating & hosting digital applications. What can PaaS provide that older application development toolchains and paradigms cannot?

While the overall design approach and features vary across PaaS vendors, there are five generic advantages at a high level –

  1. PaaS enables a range of Application, Data & Middleware components to be delivered as API based services to developers on any given Infrastructure as a Service (IaaS). These capabilities include Messaging as a service, Database as a service, Mobile capabilities as a service, Integration as a service, Workflow as a service, Analytics as a service for data driven applications etc. Some PaaS vendors also provide the ability to automate & manage APIs for business applications deployed on them – API Management.
  2. PaaS provides easy & agile access to the entire suite of technologies used while creating complex business applications. These range from programming languages to application server (and lightweight) runtimes to CI/CD toolchains to source control repositories.
  3. PaaS provides services that enable a seamless & highly automated way of managing the complete life cycle of building and delivering web applications and services on the internet. Industry players are infusing software delivery processes with practices such as continuous integration (CI) and continuous delivery (CD). For large scale applications such as those built in web scale shops, financial services, manufacturing, telecom etc – PaaS abstracts away the complexities of building, deploying & orchestrating infrastructure, thus enabling instantaneous developer productivity. This is a key point – with its focus on automation, PaaS can save application and system administrators precious time and resources in managing the lifecycle of elastic applications.
  4. PaaS enables your application to be cloud agnostic & can enable applications to run on any cloud platform, whether public or private. This means that a PaaS application developed on Amazon AWS can easily be ported to Microsoft Azure, to VMWare vSphere, to Red Hat RHEV etc.
  5. PaaS can help smooth over organizational culture and barriers – The adoption of a PaaS forces an agile culture in your organization – one that pushes cross pollination among different business, dev and ops teams. Most organizations that are just now beginning to go bimodal can benefit from a PaaS.
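To make advantage #1 concrete: most PaaS platforms hand an application the connection details of its provisioned backing services (a database, a message broker) through injected environment variables rather than hard-coded configuration, which is a large part of what keeps the app portable across IaaS providers. A minimal Python sketch of that pattern – the `PAAS_SERVICE_*` variable naming here is a hypothetical convention for illustration, not any vendor's actual one:

```python
import os
import json

def load_service_binding(service_name, default_url):
    """Read a backing-service binding injected by the PaaS.

    PaaS platforms typically expose provisioned services (Database as a
    Service, Messaging as a Service etc.) to the app via environment
    variables rather than hard-coded config. The PAAS_SERVICE_* name is
    a made-up convention for this sketch.
    """
    raw = os.environ.get("PAAS_SERVICE_" + service_name.upper())
    if raw is None:
        # Local development fallback when no platform binding is present
        return {"url": default_url, "credentials": None}
    return json.loads(raw)

db = load_service_binding("postgres", "postgresql://localhost:5432/dev")
queue = load_service_binding("rabbitmq", "amqp://localhost:5672")
print(db["url"], queue["url"])
```

Because the fallback kicks in when no binding is present, the same code runs unchanged on a developer laptop and on the platform.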

The Barriers to PaaS Adoption Will Continue to Fall In 2017..

In general, PaaS market growth rates do not seem to line up well when compared with the other broad sections of the cloud computing space, namely IaaS (Infrastructure as a Service) and SaaS (Software as a Service). 451 Research’s Market Monitor forecasts that the total market for cloud computing (including PaaS, IaaS and infrastructure software as a service (ITSM, backup, archiving)) will hit $21.9B in 2016, more than doubling to $44.2B by 2020. Of that, some analyst estimates contend that PaaS will be a relatively small $8.1 billion.


  (Source – 451 Research)

Ironically, some of the very characteristics that make PaaS advantageous have also contributed to its relatively low rate of adoption as compared to IaaS and SaaS.

In my opinion, the reasons include –

  1. Poor Conception of the Business Value of PaaS – This is the biggest factor holding back explosive growth in this category. PaaS is a tremendously complicated technology & vendors have not helped by stressing the complex technology underpinnings (containers, supported programming languages, developer workflow, orchestration, scheduling etc) as opposed to helping clients understand the tangible business drivers & value that enterprise CIOs can derive from this technology. Common drivers include faster time to market for digital capabilities, man hours saved in maintaining complex applications, the ability to attract new talent etc. These factors will vary for every customer, but it is up to frontline Sales teams to help deliver this message in a manner that is appropriate to the client.
  2. Yes, you can do DevOps without PaaS, but PaaS goes a long way – Many Fortune 500 organizations are drawing up DevOps strategies that do not include a PaaS & are based on a simplified CI/CD pipeline. This is to the detriment of both the customer organization & the industry, as PaaS can vastly simplify a range of complex runtime & lifecycle services that would otherwise need to be cobbled together by the customer as the application moves from development to production. There is simply a lack of knowledge in the customer community about where a PaaS fits in a development & deployment toolchain.
  3. Smorgasbord of Complex Infrastructure Choices – The average leading PaaS includes a range of open source technologies, from containers to runtimes to datacenter orchestration to scheduling to cluster management tools. This makes it very complex from the perspective of Corporate IT – not just in terms of running POCs and initial deployments but also in managing a highly complex stack. It is incumbent on the open source projects to abstract away the complex inner workings to drive adoption – whether by design or by technology alliances.
  4. You don’t need Cloud for PaaS, but not enough Technology Leaders get that – This one is a matter of perception. The presence of an infrastructural cloud computing strategy is not a necessary condition for PaaS.
  5. The false notion that PaaS is only fit for massively scalable, greenfield applications – Industry leading PaaS’s (like Red Hat’s OpenShift) support a range of technology approaches that can help cut technical debt. They do not limit deployment to an application server platform such as JBoss EAP, WebSphere or WebLogic, or to a lightweight framework like Spring.
  6. PaaS will help increase automation, thus cutting costs – For developers of applications in Greenfield/New Age spheres such as IoT, PaaS can enable the creation of thousands of instances in a “Serverless” fashion. PaaS based applications can be composed of microservices which are essentially self maintaining – i.e. self healing and self scaling; these microservices are delivered (typically) by IT as Docker containers using automated toolchains. The biggest cost driver in large datacenters – human involvement – is drastically reduced if PaaS is used, while agility, business responsiveness and efficiencies increase.
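The "self healing" claim in point 6 boils down to a reconciliation loop: the platform continuously compares the instance count an operator declared with what is actually running, and closes the gap. A toy sketch of one reconciliation pass, assuming nothing about any particular orchestrator:

```python
def reconcile(desired, running):
    """One pass of a PaaS-style reconciliation loop.

    `desired` is the instance count the operator asked for; `running`
    is the list of instance ids currently alive. Returns the actions a
    scheduler would take: start replacements for crashed instances or
    stop surplus ones. The orchestrators underneath a PaaS run loops
    like this continuously.
    """
    actions = []
    if len(running) < desired:
        for i in range(desired - len(running)):
            actions.append(("start", f"replacement-{i}"))
    elif len(running) > desired:
        for instance in running[desired:]:
            actions.append(("stop", instance))
    return actions

# A crashed instance is simply absent from `running`; the loop heals it.
print(reconcile(3, ["app-0", "app-2"]))           # one replacement started
print(reconcile(2, ["app-0", "app-1", "app-2"]))  # one surplus stopped
```

Scaling up or down is the same mechanism with a different `desired` value, which is why automation and elasticity come as one package.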

Conclusion…

As this technology progresses, IoT and artificial intelligence/machine learning services in PaaS will provide democratized access for partners and customers to more easily build connected and intelligent solutions.

My goal for this post was to share a few of my thoughts on the benefits of adopting a game changing technology. Done right, PaaS can provide a tremendous boost to building digital applications, thus boosting the bottom line. Beginning in 2017, we will witness PaaS satisfying business critical use cases.

References…

[1] http://www.forbes.com/sites/louiscolumbus/2016/03/13/roundup-of-cloud-computing-forecasts-and-market-estimates-2016/#3d75915274b0

Payment Providers – How Big Data Analytics Provides New Opportunities in 2017

Image Credit – JDL Group

Payments Industry in 2017..

The last post in this blog (handy link below) discussed my predictions for the payments market in 2017. The payments industry is large and quite diverse from a capabilities standpoint, while being lucrative from a revenue standpoint.

My Last Post for the Year – Predictions for the Global Payments Industry in 2017

Why is that?

First, payments are an essential daily function for consumers and corporates alike, which means constant annual growth in transaction volumes. Volumes are the very lifeblood of the industry.

Second, thanks to the explosion of technology capabilities, especially around Smartphones & Smart Apps, the number of avenues that consumers can use to make payments has surged.

Third, an increasing number of developing economies such as China, India and Brazil are slowly moving massive consumer populations over to digital payments from previously all cash economies.

Finally, in developed economies, the increased regulatory push in the form of standards like PSD2 (Payment Services Directive 2) has begun blurring boundaries between traditional players and the new upstarts.

All of these factors have the Payments industry growing at a faster clip than most other areas of finance. No wonder, payments startups occupy pride of place in the FinTech boom.

The net net of all this is that payments will continue to offer a steady and attractive stream of investments for players in this area.

Big Data Driven Analytics in the Payments Industry..

Much like the other areas of finance, the payments industry can benefit tremendously from adopting the latest techniques in data storage and analysis. Let us consider the key ways in which payment providers can leverage the diverse and extensive data assets they possess to perform important business functions –

  1. Integrating all the complex & disparate functions of Payments Platforms
    Most payment providers offer a variety of services, e.g. credit cards, debit cards and corporate payments. Integrating different payment types – credit cards, debit cards, checks, wire transfers etc – into one centralized payment platform not only helps with internal efficiencies (e.g. collapsing redundant functions such as fraud, risk scoring, reconciliation and reporting into one platform) but also with external services offered to merchants (e.g. forecasting, analytics etc).
  2. Detect Payments Fraud
    Big Data is dramatically changing the traditional approach to fraud detection, with advanced analytic solutions that are powerful and fast enough to detect fraud in real time while also building models based on historical data (and deep learning) to proactively identify risks.

    http://www.vamsitalkstech.com/?p=1098

  3. Risk Scoring of Payments in Realtime & Batch 
    Payment Providers assess the risk score of transactions in realtime depending upon various attributes (e.g. the consumer’s country of origin, IP address etc). Big Data enables these attributes to become more granular by supporting advanced statistical techniques that incorporate behavioral (e.g. a transaction out of line with a consumer’s buying patterns), temporal and spatial signals.
  4. Detect Money Laundering (AML) in Payments
    A range of Big Data techniques are being deployed  to detect money laundering disguised as legitimate payments.

    http://www.vamsitalkstech.com/?p=2559

  5. Understand Your Customers Better
    Payment providers can create a single view of a Cardholder across multiple accounts & channels of usage. Doing this will enable cross sell/upsell and better customer segmentation.

    http://www.vamsitalkstech.com/?p=2517

  6. Merchant Analytics 
    Payment providers have been sitting on petabytes of customer data and have only now begun waking up to the possibilities of monetizing this data. An area of increasing interest is providing sophisticated analytics to merchants as a way of driving merchant rewards programs. Retailers, airlines and other online merchants need to understand what segments their customers fall into as well as the best avenues to market to each of them, e.g. web app, desktop or tablet. Using all of the payment data available to them, payment providers can help merchant retailers understand their customers better as well as improve their loyalty programs.
  7. Cross Sell & Up Sell New Payment & Banking Products & Services
    Most payment service providers are also morphing into online banks. Big Data based Data Lakes support the integration of regular banking capabilities such as bill payment, person-to-person payments and account-to-account transfers to streamline the payments experience beyond the point of sale. Consumers can then move and manage money at the time they choose: instantly, same-day, next-day or on a scheduled date in the future.
  8. Delivering the best possible highly personalized Payments Experience
    Mobile Wallets offer the consumer tremendous convenience. Combined with the integrated capabilities described above – bill payment, person-to-person payments and account-to-account transfers – they let providers deliver a highly personalized payments experience beyond the point of sale.
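A toy illustration of the realtime risk scoring described in point 3 above, combining spatial, behavioral and temporal attributes into a single score. The weights, thresholds and attribute names below are invented for illustration; production systems derive them statistically from historical data:

```python
def risk_score(txn, profile):
    """Score a payment transaction against a consumer's profile.

    Combines a spatial check (country of origin), a behavioral check
    (deviation from the consumer's usual spend) and a temporal check
    (unusual hour). All weights and thresholds here are illustrative
    only, not production values.
    """
    score = 0.0
    if txn["country"] not in profile["usual_countries"]:
        score += 0.4                      # spatial anomaly
    if txn["amount"] > 3 * profile["avg_amount"]:
        score += 0.4                      # behavioral anomaly
    if txn["hour"] < 6:
        score += 0.2                      # temporal anomaly
    return min(score, 1.0)

profile = {"usual_countries": {"US"}, "avg_amount": 80.0}
txn = {"country": "RO", "amount": 500.0, "hour": 3}
print(risk_score(txn, profile))  # flags all three anomalies
```

The same function can run in a streaming pipeline for realtime scoring or over a day's batch of transactions, which is the realtime & batch duality point 3 refers to.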

Conclusion..

As we have discussed in previous posts in this blog, the payments industry is on the cusp (if not already in the midst) of a massive disruption. Business strategies will continue to be driven by technology, especially Big Data Analytics. Whether on Defense (cutting costs, optimizing IT, defending against financial crimes or augmenting existing cyber security) or playing Offense (signing up new customers, better cross sell and data monetization) – Big Data will continue to be a key capability in the industry.

My Last Post for the Year – Predictions for the Global Payments Industry in 2017

The Global Payments Industry enters 2017..

It is not without reason that the payments industry lies at the heart of the FinTech boom. Across the globe, hundreds of startups are vying to provide digital services across the complex payments spectrum. Smartphone manufacturers, FinTechs, Banks and Retailers are all players in this ecosystem. Added to this, the payments landscape across the globe is experiencing massive change driven by technology and regulatory mandates. Please find my take on the top five trends in this dynamic industry as we begin 2017, which truly promises to be a watershed year for the industry.

Image Credit – MasterCard

Trend #1 Mobile Payments volume to surge in 2017

By 2019, global consumer mobile payment volumes are expected to surpass 1 trillion US dollars [1] – this is a massive increase from just 450 billion US dollars in 2017.

The growing popularity of alternative payment modes like Mobile Wallets (e.g. Apple Pay, Chase and Android Pay) is driving increased volumes across both open loop and closed loop payments. Couple this with in-app payments (e.g. Uber) and Banking providers stepping up their game with their own Digital Wallets, and adoption will only accelerate.

Retailers like Walmart, Nordstrom and Tesco have already started offering more convenient in store payments. At the same time, mobile commerce has matured as smartphone manufacturers have started to create devices with larger screens and payment companies have added one-click buy buttons in order to remove the friction of shopping on mobile phones.

This trend is clearly visible across all forms of consumer and merchant payments, owing to the convenience of making these payments, often at the click of a button. It will only accelerate in 2017 as smartphone manufacturers continue to make devices with more onscreen real estate, driving more mobile commerce. With IoT technology taking center stage, the day is not far off when connected devices (e.g. wearables) make their own payments.

A smoother, frictionless consumer payment experience is what is driving adoption across all of these modes. It is not just about convenience (with a range of stored user preferences) but also comfort (a range of analytics that provide integration with the user’s other banking products), thus providing a more timely and integrated experience.

Trend #2 Payment Providers will rethink their business models

Across the globe, national governments and regulatory authorities are beginning to take note of the fact that they need to unshackle banking data from the incumbents and provide access to other service providers. The intention is to change archaic business models. Access to customer information and transaction data will enable the creation of new business services, much as it has for the FinTechs.

On the institutional side, provisions permitting cross border acquiring as well as the capping of interchange fees have been passed. This will enable third party processors to access consumer account information, allowing them to create new products, e.g. banking services, contextual offers etc. This will lead to banks and payment providers building products that provide value added services on the data that they already own.

With the passage of the second revision of the pathbreaking Payment Services Directive (PSD2), the European Parliament has adopted the legal foundation for the creation of an EU-wide single payments area (SEPA). While the goal of the PSD is to establish a set of modern, digital industry rules for all payment services in the European Union, it has significant ramifications for the financial services industry, as it will surely disrupt current business models & foster new areas of competition. The key message from a regulatory standpoint is that consumer data can be opened up to other players in the payment value chain. This will lead to a clamor by players to own more of the customer’s data with a view to selling business services (e.g. accurate credit scoring, access to mortgage & other consumer loans, mutual funds etc) on that information.

Trend #3 Global Payment infrastructure moves to Real Time

The demand for fast payments from both consumers and corporates has led to more than 40 countries implementing immediate payment infrastructures that are highly secure yet speedy. Europe leads the way, with Denmark (Realtime 24/7), Norway and the UK (FPS) implementing such systems. The US and Canada have begun making moves on this front as well.

The implications of this are twofold. One, it will drive down the already decreasing percentage of cash payments in the system while, two, increasing the ability of providers and non banks to provide value added services on transaction data that is more readily available. At the same time, expect more regulatory focus on moving fraud and compliance programs into real time.

Trend #4 Payment Providers will begin resembling Bay area startups

This one is a little provocative, but the high tech trend towards digitization is clear. Payment technology is at the eye of the storm from a FinTech standpoint. This trend will accelerate in 2017 with the easy availability of open source technology in four critical areas – Big Data, Cloud Computing, Predictive Analytics & Blockchain.

Big Data will be heavily leveraged (on Private or Public Cloud based infrastructure) to perform real-time predictive analytics on payments data, both in motion and at rest. Critical capabilities such as a Single View of Customer/Payment/Fraud & Customer Journey Management will all depend on Big Data. Blockchain technology (attractive for removing the middleman while providing transparency & security) will continue to be prototyped across many different areas.

Trend #5 Cyber Security, Risk and Compliance will remain board level concerns

If there is one common thread across the entire payments value chain – Merchants, Acquirers, Gateways, Schemes, Banks, Corporates etc – it is the risk of cyber attacks. Though EMV based chip technology has reduced point of sale fraud, cyberattacks are only on the upsurge. Techniques like tokenization have been developed to help both the schemes and the providers of digital wallets reduce their compliance risk. As corporate payments and other B2C payments become more cross border, the focus on Anti Money Laundering and fraud detection will only increase. The need of the hour is to deploy realtime analytics and Big Data techniques to tackle these at the application level.
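Of the techniques named above, tokenization is the most mechanical: the card number (PAN) is swapped for a random surrogate so that merchants, gateways and logs never hold the real number. A toy vault sketch – real deployments add HSM-backed storage, format-preserving tokens and strict access controls on top of this core idea:

```python
import secrets

class TokenVault:
    """Toy tokenization vault.

    Swaps a card number (PAN) for a random surrogate token; only the
    vault can map the token back. This sketch shows the core idea only,
    not a production design.
    """
    def __init__(self):
        self._token_to_pan = {}

    def tokenize(self, pan):
        token = "tok_" + secrets.token_hex(8)
        self._token_to_pan[token] = pan
        return token

    def detokenize(self, token):
        return self._token_to_pan[token]

vault = TokenVault()
token = vault.tokenize("4111111111111111")
print(token)                    # only the surrogate circulates downstream
print(vault.detokenize(token))  # only the vault can recover the PAN
```

Because a stolen token is useless without the vault, the compliance scope shrinks to the vault itself rather than every system that handles a payment.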


Conclusion..

As payments services firms begin 2017, they will need to transition to a customer oriented mindset. They will be pushed to share data through open standards, become highly digitized in interacting with consumers, and will begin leveraging their vast internal data (about customers, transaction histories, financial preferences, operational insights etc) to create new products and services or to enhance the product experience.

References..

[1]  Mobile Payment Volumes Forecast – https://www.statista.com/statistics/226530/mobile-payment-transaction-volume-forecast/

The Three Habits of Highly Effective Real Time Enterprises…

“All I do is sit at home and watch Netflix.” – Kylie Irving

The Power of Real Time

Anyone who has used Netflix to watch a movie or used Uber to hail a ride knows how simple, time efficient, inexpensive and seamless it is to do either. Chances are that most users of Netflix and Uber would never again use a physical video store or a taxi service unless they did not have a choice. Thus it should not come as a surprise that within a short span of a few years, these companies have acquired millions of delighted customers using their products (which are just apps) while developing market capitalizations of tens of billions of dollars.

As of early 2016, Netflix had about 60 million subscribers[1] and is finding significant success in producing its own content thus continuing to grab market share from the established players like NBC, Fox and CBS. Most Netflix customers opt to ditch Cable and are choosing to stream content in real time across a variety of devices.

Uber is nothing short of a game changer in the ride sharing business. Not just in busy cities but also in underserved suburban areas, Uber services save riders plenty of time and money in hailing cabs. In congested metro areas, Uber also provides near instantaneous rides for a premium, which motivates more drivers to service riders. As someone who has used Uber on almost every continent in the world, it is no surprise to me that as of 2016, Uber dominates in terms of market coverage, operating in 400 cities in 70+ countries. [2]

What is the common theme in ordering a cab using Uber or viewing a movie on Netflix?

Answer – Both services are available at the click of a button, they’re lightning quick and constantly build on their understanding of your tastes, ratings and preferences. In short, they are Real Time products.

Why is Real Time such a powerful business capability?

In the Digital Economy, the ability to interact intelligently with consumers in real time is what makes possible the ability to create new business models and to drive growth in existing lines of business.

So, what do Real Time Enterprises do differently?

What underpins a real time enterprise are three critical factors or foundational capabilities. For any enterprise to be considered real time, the presence of these three components is what decides the pace of consumer adoption. Real time capabilities are part business innovation and part technology.

Let us examine these…

#1 Real Time Businesses possess a superior Technology Strategy

First and foremost, business groups must be able to define a vision for what they would like their products and services to do in order to acquire younger and more dynamic consumers.

As companies adopt new business models, the technologies that support them must also change along with the teams that deliver them. IT departments have to move to more of a service model while delivering agile platforms and technology architectures for business lines to develop products around.


It needs to be kept in mind that these new approaches should be incubated slowly and gradually. They must almost always be business or usecase driven at first.

#2 Real Time Enterprises are extremely smart about how they leverage data

The second capability is the ability to break down data silos in an organization. Most organizations have no idea what to do with all the data they generate. Sure, they use a fraction of it to perform business operations, but beyond that most of this data is simply let go. As a consequence they fail to view their customer as a dynamic yet unified entity, and have no idea how to market more products or estimate the risk being run on the customer’s behalf. There is also a growing emphasis on the role of the infrastructure within service orientation. As the common factor present throughout an organization, the networking infrastructure is potentially the ideal tool for breaking down the barriers that exist between the infrastructure, the applications and the business. Consequently, adding greater intelligence into the network is one way of achieving the levels of virtualization and automation that are necessary in a real-time operation.


#3 Real Time Enterprises use Predictive Analytics and they automate the hell out of every aspect of their business

Real time enterprises get that relying only on Business Intelligence (BI) dashboards is largely passé. BI implementations base their insights on data that is typically stale, even by days, and BI operates in a highly siloed manner based on long cycles of data extraction, transformation and indexing.

However, if products are to be delivered over mobile and other non traditional channels, BI is ineffective at providing the just in time analytics that can drive an understanding of a dynamic consumer’s wants and needs. The Real Time enterprise demands that workers at many levels, from line of business managers to executives, have fresh, high quality and actionable information on which to base complex business decisions. These insights are only enabled by Data Science and Business Automation. Deployed strategically, these techniques can scale to enormous volumes of data and help reason over them, reducing manual costs. They can take on business problems that can’t be managed manually because of the huge amount of data that must be processed.
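The contrast with batch BI can be made concrete. Instead of an overnight extract-transform-load cycle, a real time pipeline keeps only a window of recent events and answers questions as of this moment. A minimal sliding-window sketch (the 60-second window is an arbitrary choice for illustration):

```python
from collections import deque
import time

class SlidingWindowMetric:
    """Compute a metric over only the last `window_seconds` of events.

    This is the basic building block of just-in-time analytics: unlike
    a BI extract that is stale by hours or days, the answer here
    reflects events up to the current moment.
    """
    def __init__(self, window_seconds):
        self.window = window_seconds
        self.events = deque()  # (timestamp, value) pairs, oldest first

    def add(self, value, ts=None):
        self.events.append((ts if ts is not None else time.time(), value))

    def average(self, now=None):
        now = now if now is not None else time.time()
        # Evict events that have aged out of the window
        while self.events and self.events[0][0] < now - self.window:
            self.events.popleft()
        if not self.events:
            return None
        return sum(v for _, v in self.events) / len(self.events)

m = SlidingWindowMetric(window_seconds=60)
m.add(100.0, ts=0)
m.add(200.0, ts=50)
print(m.average(now=55))  # both events still in the window -> 150.0
print(m.average(now=70))  # the first event has aged out -> 200.0
```

The same eviction-on-read idea scales up to the stream processors that real time enterprises run, where windows feed predictive models instead of a simple average.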


Conclusion..

Real time Enterprises do a lot of things right. They constantly experiment with new and existing business capabilities with a view to making them appealing to a rapidly changing clientele. They refine these using constant feedback loops and create cutting edge technology stacks that dominate the competitive landscape. Enterprises need to make the move to becoming Real time.

Neither Netflix nor Uber are sitting on their laurels. Netflix (which discontinued mail in DVDs and moved to an online only model a few years ago) continues to expand globally, betting that the convenience of the internet will eventually turn it into a major content producer. Uber is prototyping self driving cars in Pittsburgh and intends to roll out its own fleet of self driving vehicles, eventually replacing its current 1.5 million drivers, while also beginning a food delivery business around urban centers [4].

Sure, the ordinary organization is no Netflix or Uber and when a journey such as the one to real time capabilities is embarked on, things can and will go wrong in this process. However, the cost of continuing with business as usual can be incalculable over the next few years.  There is always a startup or a competitor that wants to deliver what you do at much lower cost and at a lightning fast clip. Just ask Blockbuster and the local taxi cab company.

References

[1] Netflix Statistics 2016 – Statista.com

[2] Fool.com “Just how dominant is Uber” – http://www.fool.com/investing/general/2015/05/24/lyft-vs-uber-just-how-dominant-is-uber-ridesharing.aspx

[3] Expanded Ramblings – “Uber Statistics as of Oct 2016” http://expandedramblings.com/index.php/uber-statistics/

[4] Uber Self driving cars debut in Pittsburgh – http://www.wsj.com/articles/inside-ubers-new-self-driving-cars-in-pittsburgh-1473847202

What Blockchain can do for The Internet Of Things..

Blockchain and IoT are a marriage made in heaven. Blockchain can enable & augment a variety of application scenarios and usecases for the IoT. No longer are such possibilities too futuristic – as we will discuss in this post.

IoT meets Blockchain..

Blockchain and Internet Of Things (IoT) are easily the two biggest buzzwords in technology at the moment. The IoT encompasses the world of sensors, moving objects like vehicles & really any device that has embedded electronics to communicate with the outside world – typically over an IP protocol.

Combine that with Blockchain – a distributed ledger technology (DLT) pattern. Combining the two can facilitate the entire lifecycle of IoT devices & applications and prove to be the glue for business processes to act on these events. Consider the following scenario – a private blockchain for a driverless connected car that enables secure, realtime interactions: from car startup to driver authentication, to smart contracts that exchange insurance & maintenance service information, to realtime location info for safety tracking.

Blockchain based distributed ledger technology (DLT) fills in five critical gaps in IoT..

  1. In scenarios such as the above, a Blockchain-based distributed ledger provides the trust, record of ownership, transparency and the overall (decentralized) communication backbone for IoT.
  2. Over the years, specific IoT communities will develop their own private blockchains that store transactions in a highly secure manner for their applications. IoT architectures that relied on centralized servers to collect and store data will instead write to local ledgers that sync with other localized ledgers to maintain a single yet secure copy of the truth.
  3. All IoT transactions on the Blockchain are timestamped, ensuring that they remain available – essentially for posterity.
  4. Next up, the true innovation of Blockchain: digital agreements, or Smart Contracts. Smart contracts can be applied to the data in the blockchain to enforce business conditions on IoT interactions.
  5. Finally, one of the big knocks against IoT has been the relative lack of security standards. Blockchain, grounded as it is in strong cryptography, actually helps with IoT security. A future post will discuss such a reference architecture.
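To make points 1 through 3 concrete, here is a minimal sketch of a hash-chained, timestamped ledger in Python. This is illustrative only: a real blockchain adds distribution, consensus and digital signatures on top of this basic structure, and the connected-car events shown are hypothetical.

```python
import hashlib
import json
import time

def make_block(transaction, prev_hash, timestamp=None):
    """Create a block linking an IoT transaction to the previous block's hash."""
    block = {
        "timestamp": timestamp if timestamp is not None else time.time(),
        "transaction": transaction,
        "prev_hash": prev_hash,
    }
    payload = json.dumps(block, sort_keys=True).encode()
    block["hash"] = hashlib.sha256(payload).hexdigest()
    return block

def verify_chain(chain):
    """Check that every block still links to its predecessor's hash."""
    for prev, curr in zip(chain, chain[1:]):
        if curr["prev_hash"] != prev["hash"]:
            return False
    return True

# Record two (hypothetical) connected-car events on a local ledger.
genesis = make_block({"event": "car_startup", "vin": "TEST123"}, prev_hash="0" * 64)
chain = [genesis, make_block({"event": "driver_authenticated"}, genesis["hash"])]
```

Because each block embeds the hash of its predecessor, tampering with any link breaks verification for the rest of the chain – which is exactly the trust property the list above relies on.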

With that background, let us consider low-hanging use cases across key IoT applications in major verticals.

blockchain_iot

  1. Industrial Manufacturing – The manufacturing industry is moving to an entirely virtual world across its lifecycle, ranging from product development and customer demand monitoring to production and inventory management. As devices and systems become more interactive and intelligent, the blockchain can serve as a plant-level, regional and global supply chain ledger. This will dramatically cut costs and drive more efficient just-in-time (JIT) processes, enabling better usage of plant capacity and improved operational efficiencies.
  2. Connected and Driverless Vehicles – The Connected Vehicle lets the car or truck behave as a giant Smart App. With every passing year, vehicles gain more automatic features built in – navigation, roadside assistance and so on. Blockchain will enable these devices to be tracked on the digital mesh, allowing easy inter-vehicle communication as well as automatic tracking of fleet insurance policies, vehicle registration renewals etc.
  3. Transportation – IoT + Blockchain = Connected Transportation. A range of scenarios can be imagined around a connected mesh of vehicles that exchange everything from traffic information to routes that avoid gridlock and bottlenecks. Extending this to global trade, the mesh can incorporate shipping, air freight and ground transportation to track shipments.
  4. Public Infrastructure & Smart Cities – Smart devices are already being used to track the health of bridges, roads, power grids etc. Blockchains can interconnect these to share efficiencies, coordinate maintenance, and forecast usage trends for power consumption, pollution and the like. Another key use is helping remote areas such as forests monitor natural incidents and prevent catastrophic occurrences like large-scale forest fires or farm infestations by blight.
  5. Financial Services and Insurance – Banks could use a Blockchain backbone to track IoT-enabled devices like ATMs and remote tellers for maintenance purposes. Insurance companies, which have already started deploying drones to verify property claims in remote areas, can use the Blockchain to validate and verify claims.
  6. Home and Commercial Real Estate Management – Sensors deployed in both homes and commercial buildings enable automated home and office monitoring. The use cases diverge across the two areas, but many can be built on a foundational distributed ledger capability.
  7. Smart Contracts – Blockchain-based cryptocurrencies like Bitcoin enable two business capabilities at a high level: a) transfer of cryptocurrency and b) business rules that specify when a payout happens – typically once conditions satisfying the contractual terms have been met. These rules are termed "Smart Contracts". Smart contracts are applicable across all of these business areas and can keep track of business rules and take actions when thresholds are met or breached – e.g. a driverless vehicle that has failed an inspection can be grounded, or non-payment of homeowner's insurance can trigger an alert to the homeowner's housing society.
  8. Retail – Retailers are already using IoT devices and endpoints across the business lifecycle – from the shop floor, to tracking product delivery to the store, to understanding customer traffic patterns and wearables. The vision of the Connected Store – IoT-enabled shelves, smartphone apps that let customers perform more actions, self-checkout that reduces checkout times – is already taking shape. The Blockchain can augment all of these use cases by providing the critical link between retailer and consumer in a way that automates away the middleman, be it a credit card issuer or a central server. For instance, consumers can store their product preferences and sizes in a Blockchain, and the retailer can access these in a seamless and secure manner.
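The smart-contract idea from item 7 – rules evaluated against ledger data that trigger actions when conditions are met or breached – can be sketched in a few lines. The rule names and thresholds below are hypothetical; on a real platform these rules would live on-chain and execute automatically.

```python
# A toy illustration of smart-contract business rules for a (hypothetical)
# connected-vehicle fleet. Field names and thresholds are illustrative.

def evaluate_contract(vehicle_record):
    """Return the actions the contract would trigger for one vehicle."""
    actions = []
    if vehicle_record.get("inspection_passed") is False:
        actions.append("ground_vehicle")          # failed inspection -> grounded
    if vehicle_record.get("insurance_days_overdue", 0) > 30:
        actions.append("notify_insurer")          # lapsed policy -> alert
    return actions
```

The value of putting such rules on a distributed ledger is that every party sees the same conditions and the same trigger history, without a central intermediary.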

There still exist large technology shortcomings..

Finally, it needs to be mentioned that critical gaps remain in Blockchain technology – whether one considers the public Blockchain on which Bitcoin is built or platforms like Ethereum – in terms of interoperability, security standards, throughput and mature developer tooling. These will need to be worked on over the next few quarters before we see production-grade IoT deployments on Blockchains.

Conclusion..

The potential of Blockchain to ultimately enable secure, global and decentralized communication across billions of IoT endpoints is very promising. As Blockchain matures and makes its way into the enterprise, it is likely to impact many aspects of business operations and strategies in the coming years.

My take on Gartner’s Top 10 Strategic Technology Trends for 2017

“We’re only at the very, very beginning of this next generation of computing and I think that every industry leader will be the ones that transforms first. I don’t care what industry you’re talking about” – Kim Stevenson, CIO, Intel, Feb 2016

Gartner Research rolled out their “Top 10 Strategic Technology Trends for 2017” report a few weeks ago. My goal for this blogpost is to introduce these trends to the reader and to examine the potential impact of their recommendations from an enterprise standpoint.

gartner_trends_2017

                                                              Gartner’s Strategic Trends for 2017 

# 1: AI & Advanced Machine Learning

Gartner rightly forecasts that AI (Artificial Intelligence) and Advanced Machine Learning will continue their march into daily applications run by the Fortune 1000. CIOs are coming to realize that most business problems are primarily data challenges. The rapid maturation of scalable processing techniques allows us to extract richer insights from data. What we commonly refer to as Machine Learning – a combination of econometrics, statistics, visualization and computer science – helps extract valuable business insights hiding in data and build operational systems to deliver that value.

Deep Learning discovers insights in data through layered neural networks, in a loosely human-like fashion. We are thus clearly witnessing the advent of modern data applications – applications that leverage a range of advanced Artificial Intelligence and Machine Learning (ML) techniques, encompassing neural networks, natural language processing and deep learning.
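At its smallest scale, the neural-network idea can be shown with a single artificial neuron trained by gradient descent. This is a toy sketch on synthetic data – real deep learning stacks many such units into layers and runs on specialized frameworks – but the learning loop is the same in spirit.

```python
import math

# A single logistic neuron trained with gradient descent on synthetic,
# one-dimensional "churn" data (larger x means more likely to churn).

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train(samples, epochs=2000, lr=0.5):
    """samples: list of (feature, label) pairs with label 0 or 1."""
    w, b = 0.0, 0.0
    for _ in range(epochs):
        for x, y in samples:
            pred = sigmoid(w * x + b)
            err = pred - y          # gradient of log-loss w.r.t. the logit
            w -= lr * err * x
            b -= lr * err
    return w, b

data = [(0.1, 0), (0.3, 0), (0.7, 1), (0.9, 1)]
w, b = train(data)
```

After training, the neuron assigns low churn probability to small feature values and high probability to large ones – the "learning from patterns in data" that the paragraph above describes.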

Implications for industry CIOs – Modern data applications understand their environment (e.g. customer preferences and other detailed data insights) so they can predict business trends in real time and act on them to drive revenues and decrease business risk. These techniques will enable applications and devices to operate ever more intelligently while saving companies enormous amounts in manual costs.

http://www.vamsitalkstech.com/?p=407

# 2: Intelligent Apps

Personal assistants such as Apple Siri and Microsoft Cortana – in the category of virtual personal assistants (VPAs) – have begun making everyday business processes easier for their users. VPAs represent the intersection of AI, conversational interfaces and integration into business processes. In 2017, these will begin improving customer experiences at the largest Fortune 100 enterprises. On the more personal front, home VPAs will rapidly evolve and become even smarter as their algorithms become more capable and better understand their environments. We will see increased application of smart agents in diverse fields like financial services, healthcare, telecom and media.

Implications for industry CIOs – Get ready to invest in intelligent applications in the corporate intranet to start with.

# 3: Intelligent Things

The rise of the IoT has been well documented; couple AI with massive data processing capabilities and you get Intelligent Things, which can interact with humans in new ways. To these you can add a whole category of things around transportation (self-driving cars, connected cars), robots that perform key processes in industrial manufacturing, drones etc.

Implications for industry CIOs – These intelligent devices will increasingly begin communicating with their environments in a manner that will encourage collaboration in a range of business scenarios. 2017 should begin the trend of these devices communicating with each other to form the eponymous ‘Digital Mesh’.

# 4: Virtual & Augmented Reality

Virtual reality (VR) and augmented reality (AR) are beginning to completely change the way humans interact with one another and with the intelligent systems that make up the Digital Mesh. Pokemon GO, which debuted in 2016, was the first hugely successful consumer-facing AR application, while the Oculus Rift did the same for VR. Uses of these technologies will include gamification (to improve customer engagement with products and services) and other customer- and employee-facing applications. While both technologies enable us to view the world in different ways, AR is remarkable in its ability to add to our current reality. BMW’s subsidiary Mini has even developed driving goggles with AR technology [1].

Implications for industry CIOs – This one is still on the drawing board for most verticals but it does make sense to invest in areas like gamification and in engaging with remote employees using AR.

# 5: Digital Twin

A Digital Twin is a software personification of an Intelligent Thing or system. In the manufacturing industry, digital twins can be set up to function as proxies for things like sensors and gauges, coordinate measuring machines, lasers, vision systems and white light scanning [2]. The wealth of data being gathered on the shop floor will ensure that digital twins are used to reduce costs and increase innovation. Data science will soon make its way onto the shop floor to extract insights from these software proxies.
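A digital twin is, at minimum, a software object that mirrors the last known state of a physical device and can answer questions about it without polling the machine. The class below is a hedged sketch of that idea; the device name, field names and maintenance threshold are all illustrative, not from any product.

```python
# A minimal "digital twin": a software proxy mirroring a physical device.

class DigitalTwin:
    def __init__(self, device_id):
        self.device_id = device_id
        self.state = {}        # last known value per measurement
        self.history = []      # full stream of readings, for analytics

    def ingest(self, reading):
        """Update the twin from a sensor reading (a dict of measurements)."""
        self.history.append(reading)
        self.state.update(reading)

    def needs_maintenance(self, max_vibration=5.0):
        """A proxy-side check answered without touching the physical machine."""
        return self.state.get("vibration_mm_s", 0.0) > max_vibration

twin = DigitalTwin("cmm-01")   # hypothetical coordinate measuring machine
twin.ingest({"temperature_c": 21.5, "vibration_mm_s": 2.1})
twin.ingest({"vibration_mm_s": 6.3})
```

The `history` list is where the data science mentioned above would operate: trend analysis and predictive maintenance models run against the proxy's accumulated readings rather than the machine itself.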

Implications for industry CIOs – Invest in Digital capabilities that serve as proxies for physical things.

# 6: Blockchain

The term Blockchain derives from a design pattern describing a chain of data blocks that map to individual transactions. Each transaction conducted in the real world (e.g. a Bitcoin transfer) results in the creation of a new block. Each new block is created by calculating a cryptographic hash of the previous block, thus constructing a chain of blocks – hence the name.

Blockchain is a distributed ledger technology (DLT) that allows global participants to conduct secure transactions of any type – banking, music purchases, legal contracts, supply chain transactions etc. Blockchain will transform multiple industries in the years to come; Bitcoin is merely its first application.

How the Blockchain will lead disruption across industry..(5/5)

Implications for industry CIOs – Begin expanding internal knowledge on Blockchain and as to how it can potentially augment or disrupt your vertical industry.

# 7: Conversational Systems

Mobile applications first forced enterprises to support multiple channels of interaction with their consumers. For example, banking now requires the ability to engage consumers in a seamless experience across an average of four to five channels – mobile, eBanking, call center, kiosk etc. Conversational Systems take these interactions to the next level, enabling humans to communicate with a wide range of Intelligent Things across a range of modalities – speech, touch, vision etc.
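The core of a conversational system is mapping an utterance, whatever channel it arrives on, to a business intent. Production systems use NLP and machine-learned classifiers; the keyword router below is a deliberately simple sketch of that channel-independent mapping, with hypothetical banking intents.

```python
# A toy intent router for a (hypothetical) banking conversational front end.
# Real systems replace the keyword lists with trained NLP intent classifiers.

INTENTS = {
    "balance": ["balance", "how much money"],
    "transfer": ["transfer", "send money", "wire"],
    "agent": ["human", "representative", "agent"],
}

def route(utterance):
    """Map a user utterance (from any channel) to a business intent."""
    text = utterance.lower()
    for intent, keywords in INTENTS.items():
        if any(k in text for k in keywords):
            return intent
    return "fallback"
```

The same router serves mobile, kiosk or voice input alike – which is exactly the point of decoupling the conversation layer from the underlying business process.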

Implications for industry CIOs – Every touch point matters, and those leading the smart agent transformation should constantly be asking how organizations are removing friction and enhancing the experience for every customer regardless of where they are in the journey.

# 8: Mesh App and Service Architecture

This one carries over from last year. The Digital Mesh leads to an interconnected information deluge encompassing classical IoT endpoints along with audio, video and social data streams. The creation of smart services will further depend on the vertical industries these products serve as well as the requirements of the platforms that host them – e.g. industrial automation, remote healthcare, public transportation, connected cars, home automation etc. The microservices approach – autonomous, cooperative yet loosely coupled applications built as a conglomeration of business-focused services – is a natural fit for the Digital Mesh. The most important additive consideration for microservices-based architectures in the age of the Digital Mesh is what I’d like to term Analytics Everywhere.

Implications for industry CIOs – The mesh app will require a microservices-based architecture that supports multichannel and multi-device solutions.

# 9: Digital Technology Platforms

The onset of Digital Architectures in enterprise businesses implies the ability to drive continuous, micro-level interactions with global consumers/customers/clients/stockholders or patients, depending on the vertical you operate in. More information on the core building blocks of Digital Technology Platforms can be found at the below blogpost.

Implications for industry CIOs

http://www.vamsitalkstech.com/?m=201609

# 10: Adaptive Security Architecture

The evolution of the intelligent digital mesh and of digital technology platforms and application architectures means that security has to become fluid and adaptive. Traditional solutions cannot handle this challenge, which is exacerbated by the expectation that in an IoT and Digital Mesh world, data flows will be multidirectional across a grid of application endpoints.

Implications for industry CIOs – Expect to find applications in 2017 and beyond incorporating Deep Learning and real-time analytics into their core security design, with a view to analyzing large-scale data at very low latency. Security in the IoT environment is particularly challenging. Security teams need to work with application, solution and enterprise architects to build security into the overall DevOps process, creating a DevSecOps model.
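One building block of such adaptive security is flagging anomalous event rates in near real time against a rolling baseline. The sketch below uses a simple 3-sigma rule over a window of recent observations; the login-rate metric, window size and threshold are illustrative choices, not a complete solution (production systems would add learned models and richer features).

```python
import math

# Flag values deviating more than `sigma` standard deviations from the
# rolling mean of the previous `window` observations.

def detect_anomalies(rates, window=5, sigma=3.0):
    flagged = []
    for i in range(window, len(rates)):
        baseline = rates[i - window:i]
        mean = sum(baseline) / window
        var = sum((x - mean) ** 2 for x in baseline) / window
        std = math.sqrt(var) or 1.0   # avoid divide-by-zero on flat baselines
        if abs(rates[i] - mean) / std > sigma:
            flagged.append(i)
    return flagged

# Normal login rates per minute, then a sudden spike
# (e.g. a hypothetical credential-stuffing attack).
rates = [50, 52, 49, 51, 50, 300]
```

In a mesh of endpoints, each node can run such a check locally on its own traffic, which keeps detection latency low without shipping every event to a central system first.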

Conclusion..

In this year’s edition, Gartner is clearly forecasting ten years out from a mass-market standpoint. As we slowly cross this chasm over the next ten years, we will see IoT emerge and take center stage in every industry vertical. Digital transformation will happen on apps created for, and brought together by, Smart Agents on the Device Mesh.

These apps will gradually become autonomous, data intensive, serverless, hopefully secure and location independent (data center or cloud). The app can be a sensor, a connected car or a digital twin for a manufacturing technician. So it’s not just about a single app sitting in a data center, the cloud or on the machine itself. These smart agent apps will be data driven, components of a larger mesh, interconnected using open interfaces, and resident wherever is optimal for real-time analytics. This may seem like science fiction for the Fortune 1000 enterprise, but it is manifest reality at the web-scale innovators. The industry will have no choice but to follow.

References..

[1] Cramer – "A Lesson in Augmented Realities" – http://cramer.com/story/the-difference-between-ar-and-vr/

[2] Dr. Michael Grieves – "Digital Twin: Manufacturing Excellence through Virtual Factory Replication" – http://innovate.fit.edu/plm/documents/doc_mgr/912/1411.0_Digital_Twin_White_Paper_Dr_Grieves.pdf

Why Digital Disruption is the Cure for the Common Data Center..

“The foundation of digital business is the boundary-free enterprise, which is made possible by an array of time- and location-independent computing capabilities – cloud, mobile, social and data analytics plus sensors and APIs. There are no shortcuts to the digital enterprise.”

— Mike West, Analyst, Saugatuck Research, 2015

At its core, Digital is a fairly straightforward concept. It is essentially about offering customers more contextual and relevant experiences while creating internal teams that can turn on a dime to serve customers. It is clear that these kinds of consumer capabilities simply cannot be offered on the existing technology stack. This blogpost seeks to answer what the next-generation computing stack may look like.

What Digital has in Store for Enterprises…

Digital transformation is a daily fact of life at web-scale shops like Google, Amazon, Apple, Facebook and Netflix. These shops have not just built intuitive and appealing applications; they have gradually evolved them into platforms offering discrete marketplaces that serve global audiences. They also provide robust support for mobile applications that deliver services such as content, video, e-commerce and gaming. In doing so they have heralded the age of new media, transforming themselves both internally (business models, internal teams and their offerings) and externally.

CXOs at established Fortune 1000 enterprises have been unable to find resonance in these stories from the standpoint of their own enterprise’s reinvention. This makes sense: established companies have legacy investments and legacy stakeholders – both change inhibitors that the FANGs (Facebook, Amazon, Netflix and Google) never had. Enterprise practitioners need to understand how Digital technology impacts both existing technology investments and the future landscape.

Where are most Enterprises at the moment…

Much of what exists in datacenters across organizations is antiquated from a technology standpoint. This ranges from hardware platforms to network devices and switches to the monolithic applications running on them. Connecting these applications are often proprietary or manual integration architectures. There are inflexible, proprietary systems and data architectures, lots of manual processes, monolithic applications and tightly coupled integration. Rapid provisioning of IT resources is a huge bottleneck, which frequently leads lines of business to adopt the public cloud to run their workloads. According to Rakesh Kumar, managing vice president at Gartner: “For over 40 years, data centers have pretty much been a staple of the IT ecosystem. Despite changes in technology for power and cooling, and changes in the design and build of these structures, their basic function and core requirements have, by and large, remained constant. These are centered on high levels of availability and redundancy, strong, well-documented processes to manage change, traditional vendor management and segmented organizational structures. This approach, however, is no longer appropriate for the digital world.” [2]

On that note, the below blogpost had captured the three essential technology investments that make up Digital Transformation.

The Three Core Competencies of Digital – Cloud, Big Data & Intelligent Middleware

If Digital has to happen, IT is one of the largest stakeholders…

Digital applications present seamless experiences across channels and devices, are tailored to individual customers’ needs, understand their preferences, and need to be developed in an environment of constant product innovation.

So, which datacenter capabilities are required to deliver this?

Figuring out the best architectural foundation to support, leverage and monetize digital experiences is complex. The past few years have seen the rapid evolution of many transformational technologies – Big Data, Cognitive Computing, Cloud technology (public clouds, OpenStack, PaaS, containers, software-defined networking and storage), the Blockchain – the list goes on. These are leading enterprises to a smarter way of developing enterprise applications and to more modern, efficient, scalable, cloud-based architectures.

So, what capabilities do Datacenters need to innovate towards?

digital_datacenter

                                         The legacy Datacenter transitions to the Digital Datacenter

The illustration above is largely self-explanatory. Enterprise IT will need to embrace Cloud Computing in a major way – whatever form the core offering takes: public, private or hybrid. The compute infrastructure will range from a mix of open source virtualization to Linux containers. Containers essentially virtualize the operating system so that multiple workloads can run on a single host, instead of virtualizing a server to create multiple operating systems. Containers are easily ported across servers without reconfiguration and require less maintenance because there are fewer operating systems to manage. For instance, the OpenStack cloud project supports Docker – a de facto standard Linux container format designed to automate the deployment of applications as highly portable, self-sufficient containers.

Cloud computing will also enable rapid scale-up and scale-down across the gamut of infrastructure (compute – VMs/bare metal/containers; storage – SAN/NAS/DAS; network – switches/routers/firewalls etc.) in near real time (NRT). Investments in SDN (Software Defined Networking) will be de rigueur to improve software-based provisioning, reduce time to market and drive network equipment costs down. The other vector of datacenter innovation is automation, i.e. vastly reducing manual effort in network and application provisioning. These capabilities will be key as the vast majority of digital applications are deployed as Software as a Service (SaaS).

An in depth discussion of these Software Defined capabilities can be found at the below blogpost.

Financial Services IT begins to converge towards Software Defined Datacenters..

Applications developed for a Digital infrastructure will be built as small, nimble processes that communicate via APIs and over infrastructure like service mediation components (e.g. Apache Camel). These microservices-based applications offer huge operational and development advantages over legacy applications. While one does not expect critical legacy applications that still run on mainframes (e.g. core banking, customer order processing) to move to a microservices model anytime soon, customer-facing applications that need responsive digital UIs definitely will.
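A microservice in this sense is just a small, independently deployable process exposing a narrow JSON API. Below is a hedged sketch of one such service as a plain WSGI application in Python (the route and payload are hypothetical); in practice it would sit behind an API gateway or a mediation layer like Camel.

```python
import json

# A minimal "microservice": one WSGI app, one JSON endpoint.
# The /customers/42 route and its payload are illustrative only.

def customer_service(environ, start_response):
    """Return a (hypothetical) customer's digital profile as JSON."""
    path = environ.get("PATH_INFO", "/")
    if path == "/customers/42":
        body = json.dumps({"id": 42, "segment": "digital-first"}).encode()
        start_response("200 OK", [("Content-Type", "application/json")])
        return [body]
    start_response("404 Not Found", [("Content-Type", "application/json")])
    return [json.dumps({"error": "not found"}).encode()]
```

Because the service speaks only HTTP and JSON, it can be tested, versioned and scaled on its own – the operational advantage over a monolith that the paragraph above refers to.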

Which finally brings us to the most important capability of all – data. The heart of any successful Digital implementation is data, which here includes internal data (e.g. customer data, transaction data, customer preference data), external datasets and other relevant third-party data (e.g. from retailers). While each source of data on its own may not radically change an application’s view of its customers, the combination of all of them promises to do just that.

The proliferation of mobile devices and IoT (Internet of Things) capable endpoints will ensure exponential increases in data volumes. Digital applications will need to handle this data – not just process it but glean real-time insights from it. Some of the biggest technology investments in ensuring unified customer journeys are in Big Data and Predictive Analytics. Enterprises should be able to leverage a common source of data that transcends silos (a data lake), and to drive customer decisions in real time using advanced analytics – Machine Learning techniques, cognitive computing platforms etc. – which can provide accurate and personalized insights to move the customer journey forward.
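"Not just process it but glean real-time insights" often means micro-batch processing: grouping a stream of readings into small fixed-size batches and computing an aggregate per batch as data arrives. The generator below is a stdlib-only sketch of that pattern – real deployments would use a streaming engine, and the ATM-withdrawal data and batch size are illustrative.

```python
# Micro-batch processing sketch: aggregate a device stream in small batches.

def micro_batches(stream, batch_size=3):
    """Yield (batch_average, batch) for each full micro-batch of readings."""
    batch = []
    for reading in stream:
        batch.append(reading)
        if len(batch) == batch_size:
            yield sum(batch) / batch_size, batch
            batch = []          # start the next micro-batch

# Hypothetical ATM withdrawal amounts arriving as a stream.
averages = [avg for avg, _ in micro_batches([100, 200, 300, 50, 150, 250])]
```

Each batch average is available moments after its last reading arrives, which is the difference between this style and a nightly warehouse load.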

Can Datacenters incubate innovation ?

Finally, one of the key IT architectural foundations companies need to invest in is modern application development. Gartner calls one feasible approach “Bimodal IT”: according to Gartner, “infrastructure & operations leaders must ensure that their internal data centers are able to connect into a broader hybrid topology” [2]. Let us consider healthcare – a reasonably staid vertical – as an example. In a report titled “Order from Chaos – Where big data and analytics are heading, and how life sciences can prepare for the transformational tidal wave” [1], EY noted that an agile environment can help organizations create opportunities to turn data into innovative insights. Typical software development life cycles that require lengthy validation and quality control testing prior to deployment can stifle innovation. Agile software development – adaptive, and rooted in evolutionary development and continuous improvement – can be combined with DevOps, which focuses on the integration between developers and the teams who deploy and run IT operations. Together, these can help life sciences organizations amp up their application development and delivery cycles. EY notes that organizations can significantly accelerate project delivery, for example “from three projects in 12 months to 12 projects in three months.”

Big Data has also evolved to enable the processing of data in batch, interactive and low-latency modes depending on business requirements – a massive gain for Digital projects. Big Data and DevOps will go hand in hand to deliver new predictive capabilities.

Further, businesses can create digital models of client personas and integrate them with predictive analytics tiers in such a way that an API (Application Programming Interface) approach ties them into the overall information architecture.

Conclusion..

More and more organizations are adopting a Digital-first business strategy. The approach currently in vogue – treating these as one-off, tactical project investments – simply does not work or scale anymore. There are various organizational models one could employ to develop analytical maturity, ranging from a shared service to a line-of-business-led approach. An approach I have seen work very well is to build a Digital Center of Excellence (CoE) to create contextual capabilities, best practices and rollout strategies across the larger organization.

References –

[1] EY – "Order from Chaos" – http://www.ey.com/Publication/vwLUAssets/EY-OrderFromChaos/$FILE/EY-OrderFromChaos.pdf

[2] Gartner – "Five Reasons Why a Modern Data Center Strategy Is Needed for the Digital World" – http://www.gartner.com/newsroom/id/3029231

Why Big Data & Advanced Analytics are Strategic Corporate Assets..

The industry is all about Digital now. The explosion in data storage and processing techniques promises to create new digital business opportunities across industries. Business Analytics concerns itself with deriving insights from data produced as a byproduct of business operations, as well as from external data that reflects customer insights. Given its critical importance in decision making, Business Analytics is now a boardroom matter, not one confined to IT teams. My goal in this blogpost is to quickly introduce the analytics landscape before moving on to the significant value drivers that only Predictive Analytics can provide.

The Impact of Business Analytics…

The IDC “Worldwide Big Data and Analytics Spending Guide 2016” predicts that the big data and business analytics market will grow from $130 billion at the end of 2016 to $203 billion by 2020 [1], a compound annual growth rate (CAGR) of 11.7%. This growth is being experienced across industry verticals such as banking and insurance, manufacturing, telecom, retail and healthcare.
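As a quick sanity check on those figures: $130 billion compounding at 11.7% annually over the four years from 2016 to 2020 does land close to the quoted $203 billion.

```python
# Sanity-checking the IDC forecast quoted above.
base_2016 = 130.0          # market size in $ billions
cagr = 0.117               # 11.7% compound annual growth rate
years = 4                  # 2016 -> 2020

projected_2020 = base_2016 * (1 + cagr) ** years
# projected_2020 comes out around 202, consistent with the ~$203B forecast.
```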

Further, IDC finds that over the next four years, large enterprises with 500+ employees will be the main driver of big data and analytics investment, accounting for about $154 billion in revenue. The US will lead the market with around $95 billion in investments over that period, followed by Western Europe and the APAC region [1].

The two major kinds of Business Analytics…

When we discuss the broad topic of Business Analytics, it should be clarified that there are two major disciplines – Descriptive and Predictive. Industry analysts at Gartner, IDC and elsewhere will tell you the definition should be widened to include Diagnostic and Prescriptive; having worked in the industry for a few years, I can safely say these can be subsumed into the two major categories above.

Let’s define the major kinds of industrial analytics at a high level –

  • Descriptive Analytics is commonly described as retrospective in nature, i.e. “tell me what has already happened”. It covers a range of areas traditionally considered BI (Business Intelligence) – a traditional, well-established analytical domain that takes a retrospective look at business data held in systems of record. BI focuses on supporting operational business processes such as customer onboarding, claims processing and loan qualification via dashboards, process metrics and KPIs (Key Performance Indicators). It also supports basic mathematical techniques for data analysis (trending, aggregation etc.) to infer intelligence from the data. The goal of the descriptive disciplines is primarily to surface macro or aggregate business trends across dimensions such as time, product lines, business units and operating geographies.

  • Predictive Analytics is the forward-looking branch of analytics, which tries to predict the future based on information about the past. It describes what can happen based on patterns in the data. It covers areas like machine learning, data mining, statistics, data engineering and other advanced techniques such as text analytics, natural language processing, deep learning and neural networks. A more detailed primer on both, along with detailed use cases, can be found here –

The Data Science Continuum in Financial Services..(3/3)

The two main domains of Analytics are complementary yet different…

Predictive Analytics does not intend to, nor will it, replace the BI domain; it adds significantly more sophisticated analytical capabilities, enabling businesses to do more with all the data they collect. It is not uncommon to find real-world business projects leveraging both analytical approaches.

However from an implementation standpoint, the only common area of both approaches is knowledge of the business and the sources of data in an organization. Most other things about them vary.

For instance, predictive approaches both augment & build on the BI paradigm by adding a “What could happen” dimension to the data.
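The contrast can be shown on one small (synthetic) series: descriptive analytics summarizes what already happened, while predictive analytics fits a trend and extrapolates what could happen next. The quarterly revenue figures below are invented for illustration; the fit is ordinary least squares in plain Python.

```python
# Descriptive vs. predictive on the same synthetic series.

quarterly_revenue = [100, 110, 119, 131]   # hypothetical, in $ millions

# Descriptive: a retrospective aggregate, as a BI dashboard would show it.
average_revenue = sum(quarterly_revenue) / len(quarterly_revenue)

# Predictive: least-squares fit of revenue against quarter index,
# then a "what could happen" forecast for the next quarter.
n = len(quarterly_revenue)
xs = range(n)
x_mean = sum(xs) / n
y_mean = average_revenue
slope = (sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, quarterly_revenue))
         / sum((x - x_mean) ** 2 for x in xs))
intercept = y_mean - slope * x_mean
next_quarter_forecast = slope * n + intercept
```

The average answers “what happened”; the forecast answers “what could happen” – the added dimension described above, in its simplest possible form.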

The Descriptive Analytics/BI workflow…

BI projects tend to follow a largely structured process which has been well defined over the last 15-20 years. As the illustration below shows, data produced in operational systems is extracted, transformed and eventually loaded into a data warehouse for consumption by visualization tools.

descriptive_analytics

                                                                       The Descriptive Analysis Workflow 

Descriptive Analytics and BI add tremendous value to well defined use cases based on a retrospective look at data.
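As a concrete (if tiny) illustration, the extract-transform-load flow described above can be sketched in Python. The table names, fields and values here are hypothetical, and an in-memory sqlite3 database stands in for an enterprise data warehouse:

```python
import sqlite3
import pandas as pd

def extract() -> pd.DataFrame:
    # In a real pipeline this would pull from an operational system
    # (core banking, CRM etc.); here we fabricate a few records.
    return pd.DataFrame({
        "txn_id": [1, 2, 3],
        "amount": ["100.50", "25.00", "980.10"],  # arrives as strings
        "txn_date": ["2017-01-02", "2017-01-03", "2017-01-03"],
    })

def transform(raw: pd.DataFrame) -> pd.DataFrame:
    # Cast types and derive the daily aggregate a dashboard would consume.
    raw["amount"] = raw["amount"].astype(float)
    raw["txn_date"] = pd.to_datetime(raw["txn_date"])
    return raw.groupby("txn_date", as_index=False)["amount"].sum()

def load(df: pd.DataFrame, conn: sqlite3.Connection) -> None:
    # Load into a warehouse table consumed by visualization tools.
    df.to_sql("daily_txn_summary", conn, if_exists="replace", index=False)

conn = sqlite3.connect(":memory:")
load(transform(extract()), conn)
```

In practice each stage would be orchestrated by an ETL tool and scheduled in batches, which is precisely the rigidity the challenges below speak to.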

However, key challenges with this process are –

  1. The lack of a platform to standardize and centralize data feeds leads to data silos, which cause all manner of cost & data governance headaches across the landscape.
  2. Complying with regulatory initiatives (such as Anti Money Laundering or Solvency II) requires the warehouse to handle varying types of data, which is a huge challenge for most EDW technologies.
  3. Adding new & highly granular fields to the data feeds in an agile manner is hampered by the extensive upfront relational modeling needed to handle newer kinds of schemas.

Big Data platforms have overcome past shortfalls in security and governance and are being used in BI projects at most organizations. An example of the usage of Hadoop in classic BI areas like Risk Data Aggregation is discussed in depth at the below blog.

http://www.vamsitalkstech.com/?p=2697

That being said, this space serves a large existing base of customers, and the industry has been looking to Big Data as a way of constructing a central data processing platform which can help with the above issues.

BI projects are predicated on using an EDW (Enterprise Data Warehouse) and/or RDBMS (Relational Database Management System) approach to store & analyze the data. Both kinds of data storage and processing technology are legacy in terms of both the data formats they support (row-column based) and the types of data they can store (structured data).

Finally, these systems fall short of processing the data volumes generated by digital workloads, which tend to be loosely structured (e.g. mobile application front ends, IoT devices such as sensors, ATMs or point of sale terminals), need business decisions to be made in near real time or in micro batches (e.g. detecting credit card fraud or suggesting the next best action for a bank customer), and are increasingly cloud & API based to save on costs & to provide self-service.

That is where Predictive Approaches on Big Data platforms are beginning to shine and fill critical gaps.

The Predictive Analytics workflow…

Though the field of predictive analytics has been around for years, it is witnessing a rapid rebirth with the advent of Big Data. Hadoop ecosystem projects enable the easy ingestion of massive quantities of data, helping businesses gather far more attributes about their customers and their preferences.

data_science_process

                                                                    The Predictive Analysis Workflow

The Predictive Analytics workflow always starts with a business problem in mind. Examples of these would be “A marketing project to detect which customers are likely to buy new products or services in the next six months based on their historical & real time product usage patterns – which are denoted by x, y or z characteristics” or “Detect real-time fraud in credit card transactions.”

In use cases like these, the goal of the data science process is to segment & filter customers by corralling them into categories that enable easy ranking. Once this is done, the business is engaged to set up easy and intuitive visualizations to present the results.
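The segment-and-rank step can be sketched in a few lines of pandas. The customer IDs and propensity scores below are fabricated; in practice the scores would come out of a trained model:

```python
import pandas as pd

# Hypothetical model output: one propensity-to-buy score per customer.
customers = pd.DataFrame({
    "customer_id": ["C1", "C2", "C3", "C4", "C5", "C6"],
    "propensity_to_buy": [0.91, 0.12, 0.55, 0.78, 0.33, 0.67],
})

# Corral customers into three equal-sized tiers for easy ranking.
customers["segment"] = pd.qcut(
    customers["propensity_to_buy"], q=3, labels=["low", "medium", "high"]
)
customers["rank"] = (
    customers["propensity_to_buy"].rank(ascending=False).astype(int)
)

# The top-ranked customers are what a marketing team would target first.
top = customers.sort_values("rank").head(2)
```

The resulting tiers and ranks are exactly the kind of output that feeds the visualization layer mentioned above.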

A lot of times, business groups have a hard time explaining what they would like to see – both the data and the visualization. In such cases, a prototype makes things easier from a requirements gathering standpoint. Once the problem is defined, the data scientist/modeler identifies the raw data sources (both internal and external) that bear on the business challenge. They spend a lot of time collating the data (from Oracle/SQL Server, DB2, Mainframes, Greenplum, Excel sheets, external datasets, etc.). The cleanup process involves fixing missing values, correcting corrupted data elements, normalizing fields that indicate time and date, etc.
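The cleanup steps just mentioned can be illustrated with pandas. The column names and the sentinel value used to flag corrupt records here are assumptions for the sake of the sketch:

```python
import pandas as pd

# Fabricated raw feed: a missing balance, a corrupt sentinel value (-1),
# and a date field containing an unparseable entry.
raw = pd.DataFrame({
    "balance": [1500.0, None, 230.0, -1.0],
    "opened_on": ["2015-03-01", "2016-07-01", "not a date", "2014-12-25"],
})

# Fill missing balances with the median, then drop the corrupt sentinel rows.
raw["balance"] = raw["balance"].fillna(raw["balance"].median())
clean = raw[raw["balance"] >= 0].copy()

# Normalize the date field; unparseable values become NaT and are dropped.
clean["opened_on"] = pd.to_datetime(clean["opened_on"], errors="coerce")
clean = clean.dropna(subset=["opened_on"])
```

Real cleanup logic is of course far messier, but the pattern – impute, filter, coerce, drop – recurs across most data science projects.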

The data wrangling phase involves writing code to join the various data elements so that a single client’s complete dataset is gathered in the Data Lake from a raw features standpoint. If more data is obtained while the development cycle is underway, the Data Science team has no option but to go back & redo the whole process. The modeling phase is where algorithms come in – these can be supervised or unsupervised. Feature engineering takes in business concepts & raw data features and creates predictive features from them. The Data Scientist takes the raw & engineered features and creates a model using a mix of various algorithms. Once the model has been repeatedly tested for accuracy and performance, it is typically deployed as a service. Models as a Service (MaaS) is the Data Science counterpart to Software as a Service: a MaaS endpoint takes in business variables (often hundreds of inputs) and provides as output business decisions/intelligence, measurements & visualizations that augment decision support systems.
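A minimal sketch of the modeling phase and a MaaS-style entry point, using scikit-learn on fabricated features. The feature matrix, label rule and the `score_customer` function are all illustrative assumptions; in production the model would sit behind an API endpoint rather than a local function:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)

# Engineered features (e.g. product usage, tenure) and a synthetic
# propensity label derived from the first two features.
X = rng.normal(size=(500, 4))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)

# Hold out a test set so accuracy can be checked before deployment.
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression().fit(X_train, y_train)
accuracy = model.score(X_test, y_test)

def score_customer(features):
    """MaaS-style entry point: business variables in, decision out."""
    prob = model.predict_proba(np.asarray(features).reshape(1, -1))[0, 1]
    return {"propensity": float(prob), "target": bool(prob > 0.5)}
```

Wrapping the trained model behind a single scoring function is the key design move: callers pass business variables and receive a decision, without ever touching the model internals.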

 How Predictive Analytics changes the game…

Predictive analytics can bring about transformative benefits in the following six ways.

  1. Predictive approaches can be applied to a much wider & richer variety of business challenges thus enabling an organization to achieve outcomes that were not really possible with the Descriptive variety. For instance, these use cases range from Digital Transformation to fraud detection to marketing analytics to IoT (Internet of things) across industry verticals. Predictive approaches are real-time and not just batch oriented like the Descriptive approaches.
  2. When deployed strategically, they can scale to enormous volumes of data and help reason over them, reducing manual costs. They can take on problems that cannot be managed manually because of the huge amount of data that must be processed.
  3. They can predict the results of complex business scenarios by probabilistically predicting different outcomes across thousands of variables, perceiving minute dependencies between them. An example is social graph analysis to understand which individuals in a given geography are committing fraud and whether there is a ring operating.
  4. They are vastly superior at handling fine grained data of many more types than can be handled by the traditional approach or by manual processing. The predictive approach also encourages the integration of previously “dark” data as well as newer external sources of data.
  5. They can also suggest specific business actions (e.g. based on the above outcomes) by mining data for hitherto unknown patterns. The data science approach keeps learning constantly in order to increase the accuracy of its decisions.
  6. Data Monetization – they can be used to interpret the mined data to discover solutions to business challenges and new business opportunities/models.
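As a toy illustration of the social graph analysis mentioned in point 3, one can link accounts that share identifying attributes (device, address, phone) and flag suspiciously large connected components. The account IDs and threshold below are fabricated for the example:

```python
import networkx as nx

# Each edge means "these two accounts share an identifying attribute".
shared_attributes = [
    ("acct_1", "acct_2"), ("acct_2", "acct_3"), ("acct_3", "acct_4"),
    ("acct_1", "acct_4"),   # a tightly linked cluster of four accounts
    ("acct_8", "acct_9"),   # an ordinary, unremarkable pair
]

G = nx.Graph()
G.add_edges_from(shared_attributes)

# Components above a size threshold are flagged as potential fraud rings.
RING_THRESHOLD = 3
rings = [c for c in nx.connected_components(G) if len(c) >= RING_THRESHOLD]
```

Real fraud-ring detection adds edge weights, temporal decay and human review, but connected-component analysis of a shared-attribute graph is the common starting point.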

References

[1] IDC Worldwide Semiannual Big Data and Business Analytics Spending Guide, Oct 2016 – “Double-Digit Growth Forecast for the Worldwide Big Data and Business Analytics Market Through 2020 Led by Banking and Manufacturing Investments, According to IDC”

http://www.idc.com/getdoc.jsp?containerId=prUS41826116

 

Demystifying Digital – the importance of Customer Journey Mapping…(2/3)

The first post in this three part series on Digital Foundations @ http://www.vamsitalkstech.com/?p=2517 introduced the concept of Customer 360 or Single View of Customer (SVC).  We discussed specific benefits from both a business & operational standpoint that are enabled by SVC. This second post in the series introduces the concept of a Customer Journey. The third & final post will focus on a technical design & architecture needed to achieve both these capabilities.

Introduction to Customer Journey Mapping…

The core challenge many Banks face is offering a unified customer experience for banking services across different touch points. The lack of such a unified experience negatively impacts the quality of the overall banking experience.

Thus, Customer Journey Mapping refers to the process of creating a visual depiction of a customer’s adoption and usage of banking products across different channels or touch points (branch, mobile, phone, chat, email etc). The journey provides dynamic & real-time insight into total customer lifetime value (CLV) as the customer progresses through his or her life journey. The goal of customer journey mapping is to give bank personnel a way of servicing the customer better while increasing the bank’s net economic value from servicing that customer.

The result of the journey mapping process is an overall engagement model driven from the customer’s perspective and not solely by the Bank’s internal processes.

Banks may wonder why they need a new approach to customer centricity. Quite simply, consider the sheer complexity of signing up for new banking products such as checking or savings accounts, or of receiving credit for a simple checking deposit. At many banks these activities can take a couple of days. Products with higher complexity, like home mortgage applications, can take weeks to process even for consumers with outstanding credit. Consumers constantly compare these slow cycle times to the real-time service they commonly obtain from online services such as Amazon, Apple Pay, Google Wallet, Airbnb or even FinTechs. For internal innovation to flourish, a customer centric mindset rather than an internal process centric one is what is called for at most incumbent Banks.

The Boston Consulting Group (BCG) has proposed a six part program for Banks to improve their customer centricity as a way of driving increased responsiveness and customer satisfaction[1]. This is depicted in the below illustration.

customer_journey_mapping

Customer Journey Mapping in Banking involves six different areas. Redrawn & readapted from BCG Analysis [1]
  1. Invest in intuitive interfaces for both customer & internal stakeholder interactions – Millennials who use services like Uber, Facebook, Zillow & Amazon in their daily lives are now very vocal in demanding a seamless experience across all of their banking services using digital channels. The first component of client oriented thinking is to provide UI applications that smoothly deliver products reflecting individual customers’ lifestyles, financial needs & behavioral preferences. The user experience will offer different views to business users at various levels in the bank – client advisors, personal bankers, relationship managers, branch managers etc. The second component is to provide a seamless experience across all channels (mobile, eBanking, tablet, phone etc) so that the overall journey is continuous and non-siloed. The implication is that clients should be able to begin a business transaction in channel A and continue it in channel B where that makes business sense.
  2. Technology Investments – The biggest technology investments in ensuring a unified customer journey are in the areas of Big Data & Predictive Analytics. Banks should be able to leverage a common source of data that transcends silos, and use advanced analytics such as Machine Learning techniques & Cognitive computing platforms to drive system behavior in real time with accurate and personalized insights that move the customer journey forward. Such platforms need to be deployed in strategic areas such as the front office, call center & loan qualification. Further, the business can create models of client personas and integrate these with the predictive analytics tier, with an API (Application Programming Interface) approach tying them into the overall information architecture.
  3. Agile Business Practices–  Customer Journey Mapping calls for cross organizational design teams consisting of business experts, UX designers, Developers & IT leaders. The goal is to create intuitive & smart client facing applications using a rapid and agile business & development lifecycle. 
  4. Risk & Compliance –  Scalable enterprise customer journey management also provides a way to integrate risk and compliance functions such as customer risk, AML compliance into the process. This can be achieved using a combination of machine learning & business rules.
  5. Process Workflow – It all starts with the business thinking outside the box and bringing in learnings from other verticals like online retailing, telecom & FinTechs to create journeys that reimagine existing business processes using technology. An example would be to reimagine the mortgage application process by having the bank grant a mortgage through a purely online process, after detecting that this may be the best product for a given consumer. Once the initial offer is validated using a combination of publicly available MLS (Multiple Listing Service) data & the consumer’s financial picture, the bank can team up with realtors to provide the consumer with an online home shopping experience and take the process to a close using eSigning.
  6. Value Proposition – It is key for financial services organizations to identify appropriate use cases as well as target audiences as they begin creating critical customer journeys. Identifying & then profiling key areas – customer onboarding, mortgage/auto loan applications, fraud claims management workflows in the retail bank, digital investment advisory in wealth management etc. – comes first. Once these are identified, putting in place strong value drivers with demonstrable ROI metrics is critical to getting management buy-in. According to BCG, banks that have adopted an incremental approach to customer journey innovation have increased their revenues by 25% and their productivity by 20% to 40% [1].

Conclusion..

As financial services firms embark on digital transformation, they will need to transition to a customer oriented mindset. Along with a Single View of Customer, Customer Journey Mapping is a big step toward realizing digitization. Banks that can make this incremental transition will surely realize immense benefits in customer lifetime value & retention as compared to their peers. Furthermore, when a Bank embarks on Data Monetization – using its vast internal data (about customers, their transaction histories, financial preferences, operational insights etc) to create new products or services or to enhance the product experience – journey mapping is a foundational capability it needs to possess.

References..

[1] Boston Consulting Group 2016- “How digitized Customer Journeys can help Banks win hearts, minds and profits”