How Big Data & Advanced Analytics can help Real Estate Investment Trusts (REITs)

                                                         Image Credit – Kiplinger’s

Introduction…

Real Estate Investment Trusts (REITs) are financial companies that own various forms of commercial and residential real estate. These assets include office buildings, retail shopping centers, hospitals, warehouses, timberland and hotels. Real estate continues to grow as a component of the global financial business, and given their focus on real estate investments, REITs have always occupied a specialized position in global finance.

Fundamentally, there are three types of REITs –

  1. Equity REITs, which exclusively deal in acquiring, improving and selling properties with the aim of higher returns for their investors
  2. Mortgage REITs, which only buy and sell mortgages
  3. Hybrid REITs, which do both #1 and #2 above

REITs have a reasonably straightforward business model – you take the yields from the properties you own and distribute most of that income to your investors (US rules mandate that at least 90% of taxable income be paid out as dividends). Most traditional REIT business processes are handled well by conventional technology. However, more and more REITs are being challenged to develop a compelling Big Data strategy that leverages their tremendous data assets.

The Five Key Big Data Applications for REITs…

Let us consider the five key areas where advanced analytics built on a Big Data foundation can immensely help REITs.

#1 Property Acquisition Modeling 

REIT owners can leverage the rich datasets available around renter demographics, preferences, seasonality and economic conditions in specific markets to better guide capital decisions on acquiring property. This modeling needs to take into account land costs, development costs, fixture costs and any other sales and marketing costs required to appeal to tenants – I’d like to call this the macro business perspective. From a micro business perspective, being able to better study individual properties using a variety of widely available data – MLS listings for similar properties, foreclosures, proximity to retail establishments and work sites, building profiles, parking spaces, energy footprint etc – can help REITs match tenants to their property holdings. All of this is critical to getting the investment mix right to meet profitability targets.
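
To make the micro perspective concrete, here is a minimal sketch of a property acquisition scoring model in Python. The dataset, column names and weights are all illustrative assumptions rather than a production underwriting model.

```python
import pandas as pd

# Hypothetical candidate acquisitions: one row per property.
properties = pd.DataFrame({
    "property_id":      ["P1", "P2", "P3"],
    "land_cost":        [2_000_000, 3_500_000, 1_200_000],
    "development_cost": [5_000_000, 4_000_000, 2_500_000],
    "projected_rent":   [900_000, 1_100_000, 450_000],   # annual, from comparable MLS listings
    "vacancy_rate":     [0.06, 0.04, 0.11],              # local market estimate
    "retail_proximity": [0.8, 0.6, 0.3],                 # 0..1 desirability score from GIS data
})

total_cost = properties["land_cost"] + properties["development_cost"]
effective_rent = properties["projected_rent"] * (1 - properties["vacancy_rate"])

# Crude cap-rate style yield, nudged by an assumed micro-market desirability factor.
properties["acquisition_score"] = (
    effective_rent / total_cost * (1 + 0.2 * properties["retail_proximity"])
)

print(properties.sort_values("acquisition_score", ascending=False))
```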


#2 Portfolio Modeling 

REITs can leverage Big Data to perform more granular modeling of their MBS (mortgage-backed securities) portfolios. For example, they can feed far more data into their existing models, as discussed above – demographic data, macroeconomic factors et al.

A simple scenario: if interest rates go up by X basis points, what does that mean for my portfolio exposure, default rates, cost picture and the optimal times to buy certain MBS? REITs can then use that information to enter hedges to protect against any downside. Big Data can also help with a range of predictive modeling across all of the above areas, as discussed below. An example is building a 360 degree view of a given investment portfolio.
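
Here is a minimal sketch of such a rate-shock scenario using a first-order duration approximation. The holdings, durations and shock size are illustrative assumptions; real portfolio models would also capture convexity, prepayment behavior and credit effects.

```python
import pandas as pd

portfolio = pd.DataFrame({
    "security":           ["MBS_A", "MBS_B", "MBS_C"],
    "market_value":       [40_000_000, 25_000_000, 35_000_000],
    "effective_duration": [4.5, 6.2, 3.1],   # years
})

shock_bps = 100                  # "interest rates rise by X basis points"
dy = shock_bps / 10_000.0

# First-order price impact: dP/P ~= -duration * dy (ignores convexity and prepayments).
portfolio["pnl_estimate"] = (
    -portfolio["effective_duration"] * dy * portfolio["market_value"]
)

print(portfolio)
print(f"Impact of a +{shock_bps} bps shock: {portfolio['pnl_estimate'].sum():,.0f}")
```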


#3 Risk Data Aggregation & Calculations 

The instruments underlying these portfolios carry large amounts of credit and interest rate risk. Big Data is a fantastic platform for aggregating and calculating many kinds of risk exposures, as discussed in detail in an earlier post on risk data aggregation and measurement.
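
As a minimal illustration of aggregation, the sketch below rolls position-level exposures up to desk and firm level. The schema and figures are assumptions; a real implementation would run at much larger scale (e.g. on Spark) with full risk taxonomies.

```python
import pandas as pd

positions = pd.DataFrame({
    "desk":            ["rates", "rates", "credit", "credit", "mortgages"],
    "instrument":      ["UST10Y", "IRS5Y", "CDX_IG", "CorpBond", "MBS_A"],
    "credit_exposure": [0.0, 1_200_000, 4_500_000, 3_100_000, 2_200_000],
    "ir_dv01":         [85_000, 42_000, 5_000, 12_000, 30_000],
})

# Aggregate each risk measure per desk, then to a firm-wide total.
by_desk = positions.groupby("desk")[["credit_exposure", "ir_dv01"]].sum()
firm_total = by_desk.sum()

print(by_desk)
print("Firm-wide:", firm_total.to_dict())
```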

  


#4 Detect and Prevent Money Laundering (AML)

Due to the global nature of investment funds flowing into real estate, REITs are highly exposed to money laundering and sanctions risks. Whether REITs operate in high risk geographies (India, China, South America, Russia etc) or have complex holding structures, they need to file SARs (Suspicious Activity Reports) with FinCEN. There has always been a strong case to be made that shady foreign entities and individuals launder ill-gotten proceeds by buying US real estate. In early 2016, FinCEN began implementing Geographic Targeting Orders (GTOs): title companies based in the United States are now required to clearly identify the real owners behind limited liability companies (LLCs), partnerships and other legal entities being used to purchase high end residential real estate with cash.
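
A GTO-style screen reduces to a simple rule over purchase records. The sketch below is a toy illustration; the threshold, fields and entity types are assumptions (actual GTO thresholds vary by metro area and reporting period).

```python
from dataclasses import dataclass

@dataclass
class Purchase:
    buyer_type: str               # "individual" | "LLC" | "partnership"
    price: float
    all_cash: bool
    beneficial_owner_known: bool

GTO_PRICE_THRESHOLD = 1_000_000   # illustrative only

def needs_review(p: Purchase) -> bool:
    """Flag all-cash, high-end purchases by legal entities with unknown owners."""
    return (
        p.buyer_type in {"LLC", "partnership"}
        and p.all_cash
        and p.price >= GTO_PRICE_THRESHOLD
        and not p.beneficial_owner_known
    )

deals = [
    Purchase("LLC", 2_400_000, True, False),
    Purchase("individual", 3_000_000, True, True),
    Purchase("LLC", 800_000, True, False),
]
flagged = [d for d in deals if needs_review(d)]
print(f"{len(flagged)} of {len(deals)} purchases flagged for beneficial-owner review")
```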

AML as a topic is covered exhaustively in an earlier series of blogposts.


#5 Smart Cities, Net New Investments and Property Management

In the future, REITs will want to invest in Smart Cities, which are positioned to be leading urban centers offering mobility, green technology, personalized medicine, safe services, clean water, traffic management and other forward looking urban amenities. These Smart Cities target a new kind of client – upwardly mobile, technologically savvy, environmentally conscious millennials. According to RBC Capital Markets, Smart Cities present a massive investment opportunity for REITs; such investments could offer income yields of around 10-20%. (Source – Ben Forster @ Schroders)

Smart Cities will be created using a number of high end technologies such as IoT, AI, Virtual Reality and device meshes. By 2020, it is estimated that these buildings will be generating an enormous amount of data that needs to be stored and analyzed by landlords.

As a well known graphic from Cisco attests, the ability to work with IoT data to analyze a range of these micro investment opportunities is a Big Data challenge.

The ongoing maintenance and continuous refurbishment of rental properties is a large portion of a REIT’s business operations. The availability of smart sensors and IoT devices that can track air quality, home appliance malfunctions and the like can greatly help with preventive maintenance.
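
As a minimal illustration, the sketch below flags devices whose latest reading drifts well outside their recent baseline – the kind of simple check that underpins preventive maintenance. The readings and the threshold are assumed for illustration.

```python
import statistics

# Hypothetical air-quality readings per device (most recent value last).
readings = {
    "hvac_unit_12": [41, 43, 42, 40, 44, 71],   # sudden spike
    "hvac_unit_17": [38, 39, 40, 41, 40, 42],   # normal drift
}

for device, values in readings.items():
    baseline, latest = values[:-1], values[-1]
    mean = statistics.mean(baseline)
    stdev = statistics.stdev(baseline)
    # Flag readings more than three standard deviations from the baseline mean.
    if abs(latest - mean) > 3 * stdev:
        print(f"{device}: reading {latest} deviates from baseline "
              f"{mean:.1f}±{stdev:.1f} -> schedule a maintenance check")
```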

Conclusion..

As can be seen from the business areas above, most REITs’ data needs require a holistic approach across the value chain (capital sourcing, investment decisions, portfolio management & operations). This approach spans various horizontal functions like customer segmentation, property acquisition, risk, finance and business operations.
The need of the hour for larger REITs is to move to a common model for data storage, model building and testing. It is becoming increasingly obvious that Big Data can provide massive business opportunities for REITs.

Why the Internet of Things (IoT) is about Data Driven Ecosystems (& not really about the Devices)..

“The Internet of Things (IoT) will have a great impact on the economy by transforming many enterprises into digital businesses and facilitating new business models, improving efficiency, and generating new forms of revenue. However, the ways in which enterprises can actualize any benefits will be diverse and, in some cases, painful” – Jim Tully, Vice President and Distinguished Analyst at Gartner, 2015.

The IoT is one of the most hyped paradigms floating around at the moment. However, the hype is not all unjustified. Analyst projections have about 25 billion devices connected to the internet by 2020, delivering cumulative business value of $2 trillion [1] across many industry verticals. Enterprise IT needs to begin developing capabilities to harness this information to serve end customers. This blogpost discusses foundational IoT business elements that are common across industries.

                                                         Image Credit – ThinkStock

The Immense Market Opportunity around IoT 

The IoT has rapidly become one of the most familiar — and perhaps most hyped — expressions across business and technology. That hype, however, is largely justified and is backed up by the numbers, as one can glean from the below graphic. The estimated business value of this still nascent market is expected to be around $10 trillion plus by 2022.

                                                         Credit – Tamara Franklin (Oracle Research)

Thinking around IoT has long been dominated by passive devices such as industrial sensors, RFID tags and actuators. As pointed out in my “Gartner’s Trends for 2017” article, these devices are beginning to form a smart mesh. Field devices now have increased ‘smart’ capabilities to communicate with each other and with the internet – typically using an IP protocol – resulting in the combined intelligence of groups of such ‘things’. The IoT now enables not just machine to machine communication but also human to machine and human to IoT ecosystems. While the media plays up stories of IoT aware devices such as Google Nest or Amazon Echo, the IoT is also shaking up vertical industries.

Virtually every industry out there has deployed a significant number of connected devices – Retail, Energy & Utilities, Manufacturing, Healthcare, Transportation, Financial Services and more. Having said that, let us consider the six key industrial uses for the IoT space that will yield tremendous business value over the short to medium term – the next 2-5 years.

The Six Key Industry Applications of IoT 

Consider the above graphic (courtesy of BCG): the real business value in IoT lies in analytics and the applications built on those analytics. In fact, BCG expects that by 2020 these higher order layers will have captured 60% of the growth from the IoT [2]. In such a scenario, the rest of the technology elements – connected things, cloud platforms & data architectures – merely enable the upper two layers in delivering business value.

Let us then consider the key industrial use cases for IoT –
  1. Retailers implementing IoT are working to ensure that their customers gain a seamless experience while browsing products in the store. For example, the industry has begun adopting smart shelves that restock themselves, beacons installed in stores that communicate with shopping apps on consumers’ smartphones, and NFC (Near Field Communication) that enables customers to make contactless payments. Internal operations such as supply chains are also benefiting in a big way from the ability to gain realtime insight into the movement of inventory.
  2. In the area of commercial real estate, facilities management is an area where companies spend massive amounts of money on energy consumption. According to Deon Newman at IBM Watson [3], global conglomerates like Siemens own hundreds of thousands of buildings, which produce enormous volumes of emissions. Here, IoT analytics is being leveraged to reduce that huge carbon footprint.
  3. In the Utilities industry, as smart meters have proliferated, IoT is driving use-cases ranging from predictive maintenance of equipment to optimizing grid usage. For instance, in water utilities, smart sensors track everything from quality to pressure to usage patterns. Utilities are creating software platforms that provide analytics on usage patterns and forecast demand spikes in the grid (a small forecasting sketch follows this list).
  4. The Manufacturing industry is moving to an entirely virtual world across its lifecycle, ranging from product development and customer demand monitoring to production and inventory management. This trend is being termed Industry 4.0 or Connected Manufacturing. As devices & systems become more interactive and intelligent, the data they send out can be used to optimize the lifecycle across the value chain, thus driving higher utilization of plant capacity and improved operational efficiencies.
  5. The biggest trend in the Transportation industry is undoubtedly self driving connected cars & buses. The Connected Vehicle concept enables a car or a truck to behave as a giant smart app – sending out data and receiving inputs to optimize its functions. With the passing of every year, car makers are adding more smart features, so vehicles have more automatic capabilities built in – ranging from navigation to requesting roadside assistance to self parking. Applications are being built that will enable these vehicles to be tracked on the digital mesh, enabling easy inter vehicle communication for traffic management, pollution reduction and public safety.
  6. With Smart Cities, governments across the globe are increasingly focused on traffic management, pollution management, public services etc – all with a view to improving quality of life for their citizens. All of these ecosystems will be adopting IoT technology in the days and years to come.
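
As promised in the utilities item above, here is a minimal sketch of forecasting near-term grid demand from smart-meter history using a trivial seasonal-naive model. The load figures and the toy eight-hour “day” are illustrative assumptions.

```python
# Hourly load in MW across two consecutive (toy, 8-hour) days.
hourly_load_mw = [620, 580, 560, 590, 700, 860, 940, 910,   # day 1
                  630, 575, 565, 600, 710, 870, 950, 915]   # day 2

HOURS_PER_CYCLE = 8
cycles = len(hourly_load_mw) // HOURS_PER_CYCLE

# Seasonal-naive forecast: next cycle's hour h ~= mean of hour h across past cycles.
forecast = [
    sum(hourly_load_mw[h::HOURS_PER_CYCLE]) / cycles
    for h in range(HOURS_PER_CYCLE)
]
peak_hour = max(range(HOURS_PER_CYCLE), key=lambda h: forecast[h])

print("Forecast by hour:", [round(f) for f in forecast])
print(f"Expected demand spike at hour {peak_hour} (~{forecast[peak_hour]:.0f} MW)")
```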

Conclusion..

It can be seen from the above that the applications are myriad. Thus, while one cannot recommend a generic IT approach to IoT that’s applicable to every industry, familiar themes do emerge from a core IT capability standpoint.

The next post will consider the five key & common technology capabilities that enterprise CIOs need to ensure their organizations begin to develop to win in the IoT era.

References

[1] Gartner July 2015 – “The Internet of Things is a Revolution waiting to happen” – http://www.gartner.com/smarterwithgartner/the-internet-of-things-is-a-revolution-waiting-to-happen/

[2] BCG Analysis – “Winning in the IoT is about business processes” – https://www.bcgperspectives.com/content/articles/hardware-software-energy-environment-winning-in-iot-all-about-winning-processes/

[3] “Cognitive Computing and the future of smart buildings” – Deon Newman, IBM Watson IoT – https://www.ibm.com/blogs/internet-of-things/cognitive-computing-future-smart-buildings/

A POV on Bank Stress Testing – CCAR & DFAST..

“The recession of 2007 to 2009 was still the most painful since the Depression. At its depths, $15 trillion in household wealth had disappeared, ravaging the pensions and college funds of Americans who had thought their money was in good hands. Nearly 9 million workers lost jobs; 9 million people slipped below the poverty line; 5 million homeowners lost homes.”
― Timothy F. Geithner, Former Secretary of the US Treasury – “Reflections on Financial crises – 2014”

A Quick Introduction to Macroeconomic Stress Testing..

The concept of stress testing in banking is not entirely new. It has been practiced for years in global banks, across specific business functions that deal with risk. The goal of these internal tests has been to assess firm wide capital adequacy in periods of economic stress. However, the 2008 financial crisis clearly exposed how unprepared the Bank Holding Companies (BHCs) were for systemic risk brought on by severe macroeconomic distress. Thus the current raft of regulator driven stress tests is motivated by the taxpayer funded bailouts of 2008. Back then, banks were neither adequately capitalized to cope with stressed economic conditions, nor were their market and credit risk losses across portfolios sustainable.

In 2009, SCAP (the Supervisory Capital Assessment Program) was enacted as a stress testing framework in the US that only 19 leading financial institutions had to adhere to. The exercise focused not only on the quantity of capital available but also its quality – Tier 1 common capital – held by the institution. The emphasis on Tier 1 Common Capital is important as it provides an institution with higher loss absorption capacity while minimizing losses to higher capital tiers. Tier 1 Common Capital can also be managed better during economic stress by adjusting dividends, share buybacks and related activities.

Though it was a one-off, the SCAP was a stringent and rigorous test. The Fed audited the results of all 19 BHCs – some of whom failed the test.

Following this, in 2010 the Dodd-Frank Act was enacted by the Obama Administration. The Dodd-Frank Act introduced its own stress test – DFAST (Dodd-Frank Act Stress Testing). DFAST requires BHCs with assets of $10 billion and above to run annual stress tests and to make the results public. The goals of these stress tests are multifold, but they are conducted primarily to assure the public and the regulators that BHCs have adequately capitalized their portfolios. BHCs are required to present detailed capital plans to the Fed.

The SCAP’s successor, CCAR (Comprehensive Capital Analysis and Review), was also enacted around that time. Depending on the overall risk profile of the institution, CCAR mandates several qualitative & quantitative metrics that BHCs need to report on and make public for several stressed macroeconomic scenarios.


Comprehensive Capital Analysis and Review (CCAR) is a regulatory framework introduced by the Federal Reserve in order to assess, regulate, and supervise large banks and financial institutions – collectively referred to in the framework as Bank Holding Companies (BHCs).
– (Wikipedia)

Every year, an increasing number of Tier 2 banks come under the CCAR mandate. CCAR basically requires specific BHCs to develop a set of internal macroeconomic scenarios or use those developed by the regulators. Regulators then receive the individual results of these scenario runs from firms across a nine quarter time horizon. Regulators also develop their own systemic stress tests to verify whether a given BHC can withstand negative economic scenarios and continue to operate its lending operations. CCAR coverage primarily includes retail banking operations, auto & home lending, trading, counterparty credit risk, and AFS (Available For Sale)/HTM (Hold To Maturity) securities. CCAR covers all major kinds of risk – market, credit, liquidity and operational risk.

CCAR kicked off global moves by regulators to enforce similar discipline on banks in their respective jurisdictions – for example, the EBA stress tests in Europe and the Prudential Regulation Authority’s stress tests in the UK. Emerging markets such as India and China are also following this trend.

Similarities & Differences between CCAR and DFAST..

To restate – CCAR is an annual exercise by the Federal Reserve to assess whether the largest bank holding companies operating in the United States have sufficient capital to continue operations throughout times of economic and financial stress, and whether they have robust, forward-looking capital-planning processes that account for their unique risks. As part of this exercise, the Federal Reserve evaluates institutions’ capital adequacy, internal capital adequacy assessment processes, and their individual plans to make capital distributions, such as dividend payments or stock repurchases. Dodd-Frank Act stress testing (DFAST) – an exercise similar to CCAR – is a forward-looking stress test conducted by the Federal Reserve for smaller financial institutions, helping assess whether they have sufficient capital to absorb losses and support operations during adverse economic conditions.

As part of CCAR reporting guidelines, BHCs have to explicitly call out –

  1. their sources of capital given their risk profile & breadth of operations,
  2. the internal policies & controls for measuring capital adequacy &
  3. any upcoming business decisions (share buybacks, dividends etc) that may impact their capital adequacy plans.

While both CCAR and DFAST look very similar at a high level – both mandate that banks conduct stress tests – they differ in the details. DFAST is applicable to banks with assets between $10 billion and $50 billion. During the planning horizon phase, CCAR allows BHCs to use their own capital action assumptions, while DFAST enforces a standardized set of capital actions. The DFAST scenarios represent baseline, adverse and severely adverse conditions. DFAST is supervised by the Fed, the OCC (Office of the Comptroller of the Currency) and the FDIC.

                                                Summary of DFAST and CCAR (Source: E&Y) 

As can be seen from the above table, while DFAST is complementary to CCAR, both efforts are distinct testing exercises that rely on similar processes, data, supervisory exercises and requirements. The Federal Reserve coordinates these processes to reduce duplicative requirements and to minimize regulatory burden. CCAR results are reported twice a year, and BHCs are required to also incorporate Basel III capital ratios in their reports, with Tier 1 capital ratios calculated using existing rules. DFAST is reported annually and also includes Basel III reporting.

In a Nutshell…

In CCAR (and DFAST), the Fed is essentially asking BHCs the following questions –

(1) For your defined risk profile, please define a process for understanding and mapping the key stakeholders who carry out this exercise.

(2) Please ensure that you use clean internal data to compute your exposures in the event of economic stress. The entire process of data sourcing, cleaning, computation, analytics & reporting needs to be auditable.

(3) What macroeconomic stress scenarios did you develop in working with key lines of business? What are the key historical assumptions behind these? What are the key what-if scenarios that you have developed based on the stressed scenarios? The scenarios need to be auditable as well.

(4) We are then going to run our own macroeconomic numbers & our own scenarios, using our own exposure generators on your raw data.

(5) We want to see how close both sets of numbers are.

Both CCAR and DFAST scenarios are expressed as stressed macroeconomic factors and financial indicators. The regulators typically provide these figures on a quarterly basis, a few reporting periods in advance.

What are some examples of these scenarios?
  • Measures of index turbulence – e.g. in a certain quarter, regulators might posit that the S&P 500 goes down 30%, along with decreases in key indices like home prices, commercial property prices & other asset prices.
  • Measures of economic activity – e.g. a spike in the US unemployment rate, higher interest rates, increased inflation. What if unemployment ran to 14%? What does that do to my mortgage portfolio – the default rates increase, and this is what the book looks like.
  • Measures of interest rate turbulence – e.g. US treasury yields, interest rates on US mortgages etc.

Based on this information, banks then assess the impact of these economic scenarios, as reflected in market and credit losses to their portfolios. This helps them estimate how their capital base would behave in such a situation. These internal CCAR metrics are then sent to the regulators. Every bank has its own models based on its own understanding, which the Fed needs to review as well for completeness and quality.
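
To make this concrete, here is a toy sketch of translating a stressed unemployment scenario into portfolio losses and a post-stress capital ratio. Every number – the balances, baseline loss rates, the sensitivity multiplier and the capital figures – is an illustrative assumption, not a supervisory model.

```python
portfolios = {                       # exposure ($), baseline annual loss rate
    "mortgages":    (40_000_000_000, 0.002),
    "credit_cards": (15_000_000_000, 0.035),
    "auto_loans":   (10_000_000_000, 0.010),
}

baseline_unemployment = 0.05
stressed_unemployment = 0.14         # "what if unemployment ran to 14%?"
SENSITIVITY = 25                     # assumed loss-rate multiplier per unit change

tier1_capital = 9_000_000_000
risk_weighted_assets = 70_000_000_000

total_loss = 0.0
for name, (exposure, base_rate) in portfolios.items():
    stressed_rate = base_rate * (
        1 + SENSITIVITY * (stressed_unemployment - baseline_unemployment)
    )
    total_loss += exposure * stressed_rate

post_stress_ratio = (tier1_capital - total_loss) / risk_weighted_assets
print(f"Stressed loss estimate: ${total_loss / 1e9:.1f}B")
print(f"Post-stress Tier 1 ratio: {post_stress_ratio:.1%}")
```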

The Fed uses the CCAR and DFAST results to evaluate capital adequacy and the quality of the capital adequacy assessment process, and then evaluates the BHC’s plans to make capital distributions (dividends, share repurchases etc) in the context of the results. The BHC’s board of directors is required to approve and sign off on these plans.

What do CCAR & DFAST entail of Banks?

Well, six important things as the above illustration captures –

    1. CCAR is fundamentally very different from other umbrella risk types in that it has a strong external component: reporting on internal bank data to the regulatory authorities. CCAR reporting is done by sending internal Book of Record Transaction (BORT) data from lending systems (with hundreds of manual adjustments) to the regulators, who run their models on it to assess capital adequacy. Currently, most banks do some model reporting internally, based on canned CCAR algorithms in tools like SAS/Spark, computed for a few macroeconomic stress scenarios.
    2. Both CCAR and DFAST stress the same business processes, data resources and governance mechanisms. They are both a significant ask of BHCs from the standpoint of planning, execution and governance. BHCs have found them daunting, and the new D-SIBs that enter the mandate are faced with implementing programs that need significant organizational and IT spend.
    3. Both CCAR and DFAST challenge banks on data collection, quality, lineage and reporting. The Fed requires that data be accurate, comprehensive and clean; data quality is the single biggest challenge to stress test compliance. Banks need to work across a range of BORT (Book of Record Transaction) systems – core banking, lending portfolios, position data and any other data needed to accurately reflect the business. There is also a reconciliation process typically used to reconcile risk data with the GL (General Ledger). For instance, a BHC’s lending portfolio might be $4 billion based on the raw summary data but around $3 billion once reconciliation adjustments are applied; if the regulator runs the aforesaid macroeconomic scenarios at $4 billion, the exposures are naturally off (a reconciliation check is sketched after this list).
    4. Contrary to popular perception, the heavy lifting is typically not in creating and running the exposure calculations for stress testing; creating these is relatively straightforward. Banks have historically had their own analytics groups produce these macroeconomic models, and they already have tens of libraries in place that can be modified to create the supervisory scenarios for CCAR/DFAST – baseline, adverse & severely adverse. The critical difference with stress testing is that siloed models and scenarios need to be unified along with the data.
    5. Model development in banks usually follows a well defined lifecycle. Most liquidity assessment groups within banks currently have a good base of quants with a clean separation of job duties. For instance, while one group produces scenarios, others work on exposures that feed into liquidity engines. The teams running these liquidity assessments are good candidates to run the CCAR/DFAST models as well. The calculators themselves will need to be rewritten for Big Data using something like SAS/Spark.
    6. Transparency must be demonstrated down to the source data level, and banks need to be able to document all capital classification and computation rules to a degree sufficient to meet regulatory requirements during the auditing and review process.
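
As referenced in point 3 above, here is a minimal sketch of the risk-to-GL reconciliation check: compare the raw risk-system balance against the general ledger and surface the adjustment gap before any scenario is run. The figures and product buckets are assumptions mirroring the $4 billion vs. $3 billion example.

```python
risk_system = {                        # raw balances by product, from the BORT feed
    "residential_mortgage": 2_500_000_000,
    "commercial_loans":     1_100_000_000,
    "credit_lines":           400_000_000,
}
general_ledger_total = 3_000_000_000   # post-adjustment GL balance

raw_total = sum(risk_system.values())
gap = raw_total - general_ledger_total

print(f"Risk-system total: ${raw_total / 1e9:.1f}B, GL total: ${general_ledger_total / 1e9:.1f}B")
if abs(gap) > 0.01 * general_ledger_total:   # tolerance: 1% of GL
    print(f"Unreconciled gap of ${gap / 1e9:.1f}B -> investigate adjustments "
          "before running supervisory scenarios on this data")
```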

The Technology Implications of  CCAR/DFAST..

It can clearly be seen that regulatory stress testing derives inputs from virtually every banking function. It should come as no surprise, then, that from a technology point of view there are several implications:

    • CCAR and DFAST impact a range of systems, processes and controls. The challenges most banks have in integrating front office trading desk data (position data, pricing data and reporting) with back-office risk & finance systems make the job of accurately reporting stress numbers all the more difficult. These challenges cause most BHCs to resort to manual data operations, analytics and complicated reconciliation processes across the front, middle and back offices.
    • Beyond standardizing computation & reporting libraries, banks need common data storage for data from a range of BORT systems.
    • Banks also need to standardize on data taxonomies across all of these systems.
    • To that end, banks need to stop creating more data silos across risk and finance functions; as I have often advocated in this blog, a move to a Data Lake enabled architecture is appropriate as a way of eliminating silos and the problem of unclean data, which is sure to invite regulatory sanction.
    • Banks need to focus on Data Cleanliness by setting appropriate governance and audit-ability policies
    • Move to a paradigm of bringing compute to large datasets instead of the other way around
    • Move towards in memory analytics to transform, aggregate and analyze data in real time across many dimensions to obtain an understanding of the banks risk profile at any given point in time

A Reference Architecture for CCAR and DFAST..

I recommend readers review the below post on FRTB architecture, as it contains core architectural and IT themes that are broadly applicable to CCAR and DFAST as well.

A Reference Architecture for the FRTB (Fundamental Review of the Trading Book)

Conclusion..

As can be seen from the above, both CCAR & DFAST require a holistic approach across the value chain (model development, data sourcing, reporting) spanning Risk, Finance and Treasury functions. Further, regulators are increasingly demanding an automated process across risk & capital calculations under various scenarios, using accurate and consistent data. The need of the hour for BHCs is to move to a common model for data storage, stress modeling and testing. Only by doing this can the metrics and outputs of capital adequacy be produced accurately and in a timely manner, thus satisfying the regulatory mandate.


Why Platform as a Service (PaaS) Adoption will take off in 2017..


Since the time Steve Ballmer went ballistic professing his love for developers, it has been a virtual mantra in the technology industry that developer adoption is key to the success of a given platform. On the face of it, Platform as a Service (PaaS) is a boon to enterprise developers who are tired of the inefficiencies of old school application development environments & stacks. Further, a couple of years ago, PaaS seemed to be the flavor of the future given the focus on Cloud Computing. This blogpost discusses the advantages of the generic PaaS approach while examining its slow rate of adoption in the cloud computing market – as compared with its cloud cousins, IaaS (Infrastructure as a Service) and SaaS (Software as a Service).

Platform as a Service (PaaS) as the foundation for developing Digital, Cloud Native Applications…

Call them Digital or Cloud Native or Modern – the nature of applications in the industry is slowly changing. So are the cultural underpinnings of the development process – from waterfall to agile to DevOps. At the same time, Cloud Computing and Big Data are enabling the creation of smart data applications. Leading business organizations are cognizant of the need to attract and retain the best possible talent, often competing with the FANGs (Facebook, Amazon, Netflix & Google).

Couple all this with the immense industry and venture capital interest around container oriented & cloud native technologies like Docker, and you have a vendor arms race in the making. The prize is being chosen as the standard for building industry applications.

Thus, infrastructure is enabling but in the end- it is the applications that are Queen or King.

That is where PaaS comes in.

Why Digital Disruption is the Cure for the Common Data Center..

Enter Platform as a Service (PaaS)…

Platform as a Service (PaaS) is one of the three main cloud delivery models, the other two being IaaS (infrastructure such as compute, network & storage services) and SaaS (business applications delivered over the cloud). A collection of different cloud technologies, PaaS focuses exclusively on application development & delivery. PaaS advocates a new kind of development based on native support for concepts like agile development, unit testing, continuous integration and automatic scaling, while providing a range of middleware capabilities. Applications developed on a PaaS can be deployed as services & managed across thousands of application instances.

In short, PaaS is the ideal platform for creating & hosting digital applications. What can PaaS provide that older application development toolchains and paradigms cannot?

While the overall design approach and features vary across PaaS vendors, there are five generic advantages at a high level –

  1. PaaS enables a range of application, data & middleware components to be delivered as API based services to developers on any given Infrastructure as a Service (IaaS). These capabilities include messaging as a service, database as a service, mobile capabilities as a service, integration as a service, workflow as a service, and analytics as a service for data driven applications. Some PaaS vendors also provide the ability to automate & manage APIs for business applications deployed on them – API Management.
  2. PaaS provides easy & agile access to the entire suite of technologies used while creating complex business applications. These range from programming languages to application server (and lightweight) runtimes to CI/CD toolchains to source control repositories.
  3. PaaS provides the services which enable a seamless & highly automated way of managing the complete life cycle of building and delivering web applications and services on the internet. Industry players are infusing software delivery processes with practices such as continuous delivery (CD) and continuous integration (CI). For large scale applications such as those built in web scale shops, financial services, manufacturing, telecom etc, PaaS abstracts away the complexities of building, deploying & orchestrating infrastructure, thus enabling instantaneous developer productivity. This is a key point – with its focus on automation, PaaS can save application and system administrators precious time and resources in managing the lifecycle of elastic applications.
  4. PaaS makes your application ‘kind of’ cloud agnostic & can enable applications to be run on any cloud platform, whether public or private. This means that a PaaS application developed on Amazon AWS can easily be ported to Microsoft Azure, to VMWare vSphere, to Red Hat RHEV etc.
  5. PaaS can help smoothen organizational Culture and Barriers – the adoption of a PaaS forces an agile culture in your organization, one that pushes cross pollination among different business, dev and ops teams. Organizations that are just now beginning to go bimodal for greenfield applications can benefit immensely from choosing a PaaS as a platform standard.

The Barriers to PaaS Adoption Will Continue to Fall In 2017..

In general, PaaS market growth rates do not seem to line up well when compared with the other broad sections of the cloud computing space, namely IaaS (Infrastructure as a Service) and SaaS (Software as a Service). 451 Research’s Market Monitor forecasts that the total market for cloud computing (including PaaS, IaaS and infrastructure software as a service – ITSM, backup, archiving) will hit $21.9B in 2016, more than doubling to $44.2B by 2020. Of that, some analyst estimates contend that PaaS will be a relatively small $8.1 billion.


  (Source – 451 Research)

The very advantages that PaaS confers have sadly also contributed to its relatively low rate of adoption as compared to IaaS and SaaS.

The reasons for this anemic rate of adoption include, in my opinion –

  1. Poor Conception of the Business Value of PaaS – This is the biggest factor holding back explosive growth in this category. PaaS is a tremendously complicated technology, & vendors have not helped by stressing the complex technology underpinnings (containers, supported programming languages, developer workflow, orchestration, scheduling etc) as opposed to helping clients understand the tangible business drivers & value that enterprise CIOs can derive from this technology. Common drivers include decreased time to market for digital capabilities, man hours saved in maintaining complex applications, the ability to attract new talent etc. These factors will vary for every customer, but it is up to frontline sales teams to help deliver this message in a manner appropriate to the client.
  2. Yes, you can do DevOps without PaaS, but PaaS goes a long way – Many Fortune 500 organizations are drawing up DevOps strategies which do not include a PaaS & are based on a simplified CI/CD pipeline. This is to the detriment of both the customer organization & the industry, as PaaS can vastly simplify a range of complex runtime & lifecycle services that would otherwise need to be cobbled together by the customer as the application moves from development to production. There is simply a lack of knowledge in the customer community about where a PaaS fits in a development & deployment toolchain.
  3. Smorgasbord of Complex Infrastructure Choices – The average leading PaaS includes a range of open source technologies ranging from containers to runtimes to datacenter orchestration to scheduling to cluster management tools. This makes it very complex from the perspective of corporate IT – not just in terms of running POCs and initial deployments but also in managing a highly complex stack. It is incumbent on the open source projects to abstract away the complex inner workings to drive adoption – whether by design or by technology alliances.
  4. You don’t need Cloud for PaaS but not enough Technology Leaders get that – This one is perception. The presence of an infrastructural cloud computing strategy is not a necessary condition for PaaS. 
  5. The false notion that PaaS is only fit for massively scalable, greenfield applications – Industry leading PaaS offerings (like Red Hat’s OpenShift) support a range of technology approaches that can help cut technical debt. They do not limit deployment to an application server platform such as JBoss EAP, WebSphere or WebLogic, or to a lightweight framework like Spring.
  6. PaaS will help increase automation, thus cutting costs – For developers of applications in greenfield/new age spheres such as IoT, PaaS can enable the creation of thousands of instances in a “serverless” fashion. PaaS based applications can be composed of microservices which are essentially self maintaining – i.e. self healing and self scaling up or down; these microservices are delivered (typically) by IT as Docker containers using automated toolchains. The biggest requirement in large datacenters – human involvement – is drastically reduced if PaaS is used, while agility, business responsiveness and efficiency all increase.

Conclusion…

My goal for this post was to share a few of my thoughts on the benefits of adopting a game changing technology. Done right, PaaS can provide a tremendous boost to building digital applications, thus boosting the bottom line. Beginning in 2017, we will witness PaaS satisfying critical industry use cases as leading organizations build end-to-end business solutions that cover many architectural layers.


Payment Providers – How Big Data Analytics Provides New Opportunities in 2017

                                                         Image Credit – JDL Group

Payments Industry in 2017..

The last post in this blog (handy link below) discussed my predictions for the payments market in 2017. The payments industry is large and quite diverse from a capabilities standpoint, while being lucrative from a revenue standpoint.

My Last Post for the Year – Predictions for the Global Payments Industry in 2017

Why is that?

First, payments are an essential daily function for consumers and corporates alike, which means constant annual growth in transaction volumes. Volumes are the very lifeblood of the industry.

Second, thanks to the explosion of technology capabilities – especially around smartphones & smart apps – the number of avenues that consumers can use to make payments has surged.

Thirdly, an increasing number of developing economies such as China, India and Brazil are slowly moving massive consumer populations over to digital payments from previously all cash economies.

Finally, in developed economies, the increased regulatory push in the form of standards like PSD2 (Payment Services Directive 2) has begun blurring boundaries between traditional players and the new upstarts.

All of these factors have the Payments industry growing at a faster clip than most other areas of finance. No wonder, payments startups occupy pride of place in the FinTech boom.

The net net of all this is that payments will continue to offer a steady and attractive stream of investments for players in this area.

Big Data Driven Analytics in the Payments Industry..

Much like the other areas of finance, the payments industry can benefit tremendously from adopting the latest techniques in data storage and analysis. Let us consider the important ways in which payment providers can leverage the diverse and extensive data assets they possess to perform important business functions –

  1. Integrating all the complex & disparate functions of Payments Platforms
    Most payment providers offer a variety of services, e.g. credit cards, debit cards and corporate payments. Integrating different payment types – credit cards, debit cards, checks, wire transfers etc – into one centralized payment platform helps not only with internal efficiencies (e.g. collapsing redundant functions such as fraud, risk scoring, reconciliation and reporting into one platform) but also with external services offered to merchants (e.g. forecasting, analytics etc).
  2. Detect Payments Fraud
    Big Data is dramatically changing the traditional approach to fraud detection, with advanced analytic solutions that are powerful and fast enough to detect fraud in real time, and to build models based on historical data (and deep learning) that proactively identify risks.

    http://www.vamsitalkstech.com/?p=1098

  3. Risk Scoring of Payments in Realtime & Batch
    Payment providers assess the risk score of transactions in realtime depending on various attributes (e.g. a consumer’s country of origin, IP address etc). Big Data enables these attributes to become more granular by supporting advanced statistical techniques that incorporate behavioral (e.g. a transaction falls outside a consumer’s normal buying patterns), temporal and spatial signals (a simple scoring illustration follows this list).
  4. Detect Payments Money Laundering (AML)
    A range of Big Data techniques are being deployed  to detect money laundering disguised as legitimate payments.

    http://www.vamsitalkstech.com/?p=2559

  5. Understand Your Customers Better
    Payment providers can create a single view of a cardholder across multiple accounts & channels of usage. Doing this enables cross sell/upsell and better customer segmentation.

    http://www.vamsitalkstech.com/?p=2517

  6. Merchant Analytics
    Payment providers have been sitting on petabytes of customer data and have only now begun waking up to the possibilities of monetizing this data. An area of increasing interest is providing sophisticated analytics to merchants as a way of driving merchant rewards programs. Retailers, airlines and other online merchants need to understand what segments their customers fall into as well as what the best avenues are to market to each of them – e.g. web app, desktop or tablet. Using all of the payment data available to them, payment providers can help merchants understand their customers better as well as improve their loyalty programs.
  7. Cross Sell & Up Sell New Payment & Banking Products & Services
    Most payment service providers are also morphing into online banks. Big Data based Data Lakes support the integration of regular banking capabilities such as bill payment, person-to-person payments and account-to-account transfers to streamline the payments experience beyond the point of sale. Consumers can then move and manage money at the time they choose: instantly, same-day, next-day or on a scheduled date in the future.
  8. Delivering the best possible, highly personalized Payments Experience
    Mobile Wallets offer the consumer tremendous convenience, and the same Data Lake foundation lets providers personalize that experience – stored preferences, contextual offers and analytics that integrate with the consumer’s other banking products – well beyond the point of sale.
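
As referenced in point 3 above, here is a minimal sketch of attribute-based transaction risk scoring. The weights, threshold, country list and decision cutoff are illustrative assumptions; production systems would learn these from historical and behavioral data.

```python
HIGH_RISK_COUNTRIES = {"XX", "YY"}    # placeholder ISO codes

def risk_score(txn: dict, avg_ticket: float) -> float:
    """Combine simple rule-based attributes into a 0..1 risk score."""
    score = 0.0
    if txn["country"] in HIGH_RISK_COUNTRIES:
        score += 0.4                  # geographic risk
    if txn["amount"] > 3 * avg_ticket:
        score += 0.3                  # behavioral outlier vs. buying pattern
    if txn["ip_country"] != txn["country"]:
        score += 0.2                  # IP / card-country mismatch
    if txn["hour"] < 6:
        score += 0.1                  # unusual time of day
    return score

txn = {"country": "US", "ip_country": "XX", "amount": 950.0, "hour": 3}
score = risk_score(txn, avg_ticket=120.0)
action = "hold for review" if score >= 0.5 else "approve"
print(f"risk score {score:.2f} -> {action}")
```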

Conclusion..

As we have discussed in previous posts on this blog, the payments industry is at the cusp (if not already in the midst) of a massive disruption. Business strategies will continue to be driven by technology, especially Big Data analytics. Whether playing defense (cutting costs, optimizing IT, defending against financial crimes or augmenting existing cyber security) or offense (signing up new customers, better cross sell and data monetization), Big Data will continue to be a key capability in the industry.

My Last Post for the Year – Predictions for the Global Payments Industry in 2017

The Global Payments Industry enters 2017..

It is not without reason that the payments industry lies at the heart of the FinTech boom. Across the globe, hundreds of startups are vying to provide digital services across the complex payments spectrum. Players ranging from smartphone manufacturers to FinTechs, banks and retailers all participate in this ecosystem. Added to this, the payments landscape across the globe is experiencing massive change driven by technology and regulatory mandates. Please find my take on the top five trends in this dynamic industry as we begin 2017, a year which truly promises to be a watershed for the industry.

                                                         Image Credit – MasterCard

Trend #1 Mobile Payments volume to surge in 2017

By 2019, global consumer mobile payment volumes are expected to surpass 1 trillion US dollars [1] – a massive increase from just 450 billion US dollars in 2017.

The growing popularity of alternative payment modes like Mobile Wallets (e.g. Apple Pay, Chase Pay and Android Pay) is driving increased volumes across both open loop and closed loop payments. Couple this with in-app payments (e.g. Uber) and banking providers stepping up their game with their own digital wallets, and adoption will only grow further.

Retailers like Walmart, Nordstrom and Tesco have already started offering more convenient in store payments. At the same time, mobile commerce has matured as smartphone manufacturers have started to create devices with larger screens and payment companies have added one-click buy buttons in order to remove the friction of shopping on mobile phones.

This trend is clearly seen across all forms of consumer and merchant payments, owing to the convenience of making these payments, often at the click of a button. It will only continue to accelerate in 2017 as smartphone manufacturers continue to make devices with more onscreen real estate, driving more mobile commerce. With IoT technology taking center stage, the day is not far off when connected devices (e.g. wearables) make their own payments.

A smoother and frictionless consumer payment experience is what is driving adoption across all of these modes. It is not just about convenience (a range of stored user preferences) but also comfort (a range of analytics that integrate with the user’s other banking products), thus providing a more timely and integrated experience.

Trend #2 Payment Providers will rethink their business models

Across the globe, national governments and regulatory authorities are beginning to take note of the fact that they need to unshackle banking data from the incumbents and provide access to other service providers. The intention is to change archaic business models. Access to customer information and transaction data will enable the creation of new business services like with the FinTechs.

On the institutional side, provisions permitting cross border acquiring as well as the capping of interchange fees have been passed. This will enable third party processors to access consumer account information, thus enabling them to create new products – e.g. banking services, contextual offers etc. This will lead to banks and payment providers building products that provide value added services on the data that they already own.

With the passage of the second revision of the pathbreaking Payment Services Directive (PSD2), the European Parliament has adopted the legal foundation for an EU-wide single payments area (SEPA). While the goal of the PSD is to establish a set of modern, digital industry rules for all payment services in the European Union, it has significant ramifications for the financial services industry, as it will surely disrupt current business models & foster new areas of competition. The key message from a regulatory standpoint is that consumer data can be opened up to other players in the payment value chain. This will lead to a clamor by players to own more of the customer’s data with a view to selling business services (e.g. accurate credit scoring, access to mortgages & other consumer loans, mutual funds etc) on that information.

Trend #3 Global Payment infrastructure moves to Real Time

The demand for fast payments from both consumers and corporates has led to about 40+ countries implementing immediate payment infrastructures that are highly secure yet speedy. Europe leads the way, with Denmark (Realtime 24/7), Norway and the UK (FPS) implementing such systems. The US and Canada have begun making moves on this front as well.

The implications of this are twofold. One, it will drive down the already decreasing share of cash payments in the system, while, two, increasing the ability of providers and non banks to offer value added services on transaction data that is now more readily available. At the same time, expect more regulatory focus on moving fraud and compliance programs into a real time mode.

Trend #4 Payment Providers will begin resembling Bay area startups

This one is a little provocative, but the high tech trend towards digitization is clear. Payment technology is at the eye of the storm from a FinTech standpoint. This trend will accelerate in 2017 with the easy availability of open source technology in four critical areas – Big Data, Cloud Computing, Predictive Analytics & Blockchain.

Big Data will be heavily leveraged (on private or public cloud based infrastructure) to perform real-time predictive analytics on payments data in motion as well as at rest. Critical capabilities such as a Single View of Customer/Payment/Fraud & Customer Journey Management will all depend on Big Data. Blockchain technology (attractive for removing the middleman while providing transparency & security) will continue to be prototyped across many different areas.

Trend #5 Cyber Security, Risk and Compliance will remain board level concerns

If there is one common thread across the entire payments value chain – merchants, acquirers, gateways, schemes, banks, corporates etc – it is the risk of cyber attacks. Though EMV based chip technology has reduced point of sale fraud, the trend in cyberattacks is only on the upsurge. Techniques like tokenization have been developed to help both the schemes and the providers of digital wallets reduce their compliance risk. As corporate payments and other B2C payments become more cross border, the focus on Anti Money Laundering and fraud detection will only increase. The need of the hour is to deploy realtime analytics and Big Data techniques to tackle these at an application level.

Big Data Counters Payment Card Fraud (1/3)…

Conclusion..

As payments services firms begin 2017, they will need to transition to a customer oriented mindset. They will be pushed to share data through open standards, become highly digitized in interacting with consumers, and begin leveraging their vast internal data (about customers, their transaction histories, financial preferences, operational insights etc) to create new products or services or to enhance the product experience.

References..

[1]  Mobile Payment Volumes Forecast – https://www.statista.com/statistics/226530/mobile-payment-transaction-volume-forecast/

The Three Habits of Highly Effective Real Time Enterprises…

“All I do is sit at home and watch Netflix.” – Kylie Irving

The Power of Real Time

Anyone who has used Netflix to watch a movie or used Uber to hail a ride knows how simple, time efficient, inexpensive and seamless it is to do either. Chances are that most users of Netflix and Uber would never again use a physical video store or a traditional taxi service unless they had no choice. Thus it should not come as a surprise that within a short span of a few years, these companies have acquired millions of delighted customers for their products (which are just apps) while developing market capitalizations of tens of billions of dollars.

As of early 2016, Netflix had about 60 million subscribers [1] and is finding significant success in producing its own content, thus continuing to grab market share from established players like NBC, Fox and CBS. Most Netflix customers opt to ditch cable and are choosing to stream content in real time across a variety of devices.

Uber is nothing short of a game changer in the ride sharing business. Not just in busy cities but also in underserved suburban areas, Uber saves riders plenty of time and money in hailing cabs. In congested metro areas, Uber also provides near instantaneous rides for a premium, which motivates more drivers to serve riders. As someone who has used Uber on almost every continent, I find it no surprise that as of 2016, Uber dominates in terms of market coverage, operating in 400 cities in 70+ countries [2].

What is the common theme in ordering a cab using Uber or viewing a movie on Netflix?

Answer – Both services are available at the click of a button, they’re lightning quick and constantly build on their understanding of your tastes, ratings and preferences. In short, they are Real Time products.

Why is Real Time such a powerful business capability?

In the Digital Economy, the ability to interact intelligently with consumers in real time is what makes possible the ability to create new business models and to drive growth in existing lines of business.

So, what do Real Time Enterprises do differently?

What underpins a real time enterprise are three critical factors, or foundational capabilities, as shown in the below illustration. For any enterprise to be considered real time, the presence of these three components is what decides the pace of consumer adoption. Real time capabilities are part business innovation and part technology.

Let us examine these…

#1 Real Time Businesses possess a superior Technology Strategy

First and foremost, business groups must be able to define a vision for what they would like their products and services to do in order to acquire younger and more dynamic consumers.

As companies adopt new business models, the technologies that support them must also change along with the teams that deliver them. IT departments have to move to more of a service model while delivering agile platforms and technology architectures for business lines to develop products around.

Why Digital Disruption is the Cure for the Common Data Center..

It needs to be kept in mind that these new approaches should be incubated slowly and gradually. They must almost always be business or usecase driven at first.

#2 Real Time Enterprises are extremely smart about how they leverage data

The second capability is an ability to break down data silos in an organization. Most organizations have no idea what to do with all the data they generate. Sure, they use a fraction of it to perform business operations, but beyond that most of this data is simply let go. As a consequence, they fail to view their customer as a dynamic yet unified entity, and they have no idea how to market more products or to estimate the risk being run on their behalf. There is also a growing emphasis on the role of the infrastructure within service orientation. As the common factor present throughout an organization, the networking infrastructure is potentially the ideal tool for breaking down the barriers that exist between the infrastructure, the applications and the business. Consequently, adding greater intelligence into the network is one way of achieving the levels of virtualization and automation that are necessary in a real-time operation.

Across Industries, Big Data Is Now the Engine of Digital Innovation..

#3 Real Time Enterprises use Predictive Analytics and they automate the hell out of every aspect of their business

Real time enterprises get the fact that relying only on Business Intelligence (BI) dashboards is largely passé. BI implementations base their insights on data that is typically stale, sometimes by days, and BI operates in a highly siloed manner based on long cycles of data extraction, transformation and indexing.

However, if products are to be delivered over mobile and other non-traditional channels, BI is ineffective at providing the just-in-time analytics that can drive an understanding of a dynamic consumer's wants and needs. The Real Time enterprise demands that workers at many levels, from line-of-business managers to executives, have fresh, high quality and actionable information on which to base complex yet high quality business decisions. These insights are enabled by Data Science and Business Automation. When deployed strategically, these techniques scale to enormous volumes of data and help reason over them while reducing manual costs, taking on business problems that cannot be managed manually because of the sheer amount of data that must be processed.
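As a rough illustration of the contrast with batch BI, the sketch below scores each event the moment it arrives instead of waiting for a nightly load. The event fields, weights and alert threshold are invented stand-ins; in practice the model would come from an offline training pipeline and the loop would consume from a message bus.

```python
from datetime import datetime, timezone

def score_event(event):
    """Score a single customer event as it arrives, not in a nightly batch.

    The weights stand in for a pre-trained churn/propensity model.
    """
    weights = {"support_calls": 0.4, "days_since_login": 0.02, "failed_payments": 0.5}
    score = sum(weights[k] * event.get(k, 0) for k in weights)
    return {"customer_id": event["customer_id"],
            "risk_score": round(score, 3),
            "scored_at": datetime.now(timezone.utc).isoformat()}

# Simulated stream of incoming events.
stream = [
    {"customer_id": "C42", "support_calls": 3, "days_since_login": 40, "failed_payments": 1},
    {"customer_id": "C7", "support_calls": 0, "days_since_login": 2, "failed_payments": 0},
]
for event in stream:
    result = score_event(event)
    if result["risk_score"] > 1.0:  # threshold would be tuned on historical data
        print("alert:", result)
```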

Why Big Data & Advanced Analytics are Strategic Corporate Assets..

Conclusion..

Real time Enterprises do a lot of things right. They constantly experiment with new and existing business capabilities with a view to making them appealing to a rapidly changing clientele. They refine these capabilities using constant feedback loops and build cutting edge technology stacks that dominate the competitive landscape. Enterprises need to make the move to becoming Real time.

Neither Netflix nor Uber is resting on its laurels. Netflix (which discontinued mail-in DVDs and moved to an online-only model a few years ago) continues to expand globally, betting that the convenience of the internet will eventually turn it into a major content producer. Uber is prototyping self-driving cars in Pittsburgh and intends to eventually roll out its own fleet of self-driving vehicles, replacing its current 1.5 million drivers, while also building a food delivery business around urban centers[4].

Sure, the ordinary organization is no Netflix or Uber, and things can and will go wrong on the journey to real time capabilities. However, the cost of continuing with business as usual could be incalculable over the next few years. There is always a startup or competitor that wants to deliver what you do at much lower cost and at a lightning-fast clip. Just ask Blockbuster or the local taxi company.

References

[1] Statista.com – "Netflix Statistics 2016"

[2] Fool.com – "Lyft vs. Uber: Just How Dominant Is Uber?" – http://www.fool.com/investing/general/2015/05/24/lyft-vs-uber-just-how-dominant-is-uber-ridesharing.aspx

[3] Expanded Ramblings – "Uber Statistics as of Oct 2016" – http://expandedramblings.com/index.php/uber-statistics/

[4] WSJ.com – "Inside Uber's New Self-Driving Cars in Pittsburgh" – http://www.wsj.com/articles/inside-ubers-new-self-driving-cars-in-pittsburgh-1473847202

What Blockchain can do for The Internet Of Things..

Blockchain and IoT are a marriage made in heaven. Blockchain can enable and augment a variety of application scenarios and usecases for the IoT. Such possibilities are no longer futuristic, as we will discuss in this post.

IoT meets Blockchain..

Blockchain and the Internet of Things (IoT) are easily the two biggest buzzwords in technology at the moment. The IoT encompasses the world of sensors, moving objects like vehicles, and really any device with embedded electronics that communicates with the outside world, typically over an IP protocol.

Combine that with Blockchain, a distributed ledger technology (DLT) pattern. Combining the two can facilitate the entire lifecycle of IoT devices and applications and provide the glue for business processes to act on IoT events. Consider the following scenario: a private blockchain for a driverless connected car that enables secure, real-time interactions, from car startup and driver authentication, to smart contracts that exchange insurance and maintenance service information, to real-time location data that tracks safety.

Blockchain-based distributed ledger technology (DLT) fills five critical gaps in IoT..

  1. In typical scenarios such as the above, a Blockchain-based distributed ledger provides the trust, record of ownership, transparency and the overall (decentralized) communication backbone for IoT.
  2. It needs to be noted that, over the years, specific IoT communities will develop their own private blockchains that can store transactions in a highly secure manner for their specific applications. IoT architectures that relied on centralized servers to collect and store data will instead write to local ledgers that sync with other localized ledgers to maintain a single yet secure copy of the truth.
  3. All IoT transactions on the Blockchain will be timestamped, ensuring that they are available, essentially, for posterity (a minimal sketch of such a timestamped, hash-chained ledger follows this list).
  4. Next up is the true innovation of Blockchain: digital agreements, or Smart Contracts. Smart contracts can be applied to the data in the blockchain to enforce business conditions on IoT interactions.
  5. Finally, one of the big knocks against IoT has been the relative lack of security standards. Blockchain, given its foundation in strong cryptography, actually helps with IoT security. A future post will discuss such a reference architecture.
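To ground the ideas of timestamping and a tamper-evident record of ownership, here is a deliberately minimal Python sketch of a hash-chained ledger. It is a toy: there is no consensus protocol, peer-to-peer replication or cryptographic signing, and the connected-car event names are assumptions made for illustration.

```python
import hashlib
import json
import time

def make_block(payload, prev_hash):
    """Create a timestamped ledger entry chained to its predecessor by hash."""
    block = {"timestamp": time.time(), "payload": payload, "prev_hash": prev_hash}
    block["hash"] = hashlib.sha256(
        json.dumps(block, sort_keys=True).encode()).hexdigest()
    return block

# A toy private ledger for connected-car events.
ledger = [make_block({"event": "genesis"}, prev_hash="0" * 64)]
for event in [{"event": "driver_authenticated", "driver": "D-100"},
              {"event": "inspection", "result": "pass"}]:
    ledger.append(make_block(event, prev_hash=ledger[-1]["hash"]))

def verify_chain(chain):
    """Check every hash link; tampering with any block breaks the links after it."""
    return all(curr["prev_hash"] == prev["hash"]
               for prev, curr in zip(chain, chain[1:]))

print(verify_chain(ledger))  # True for an untampered ledger
```

The point of the sketch is the chaining: altering any historical block changes its hash and breaks every link after it, which is what makes the timestamped record trustworthy.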

With that background, let us consider low-hanging usecases across key IoT application verticals.


  1. Industrial Manufacturing – The manufacturing industry is moving to an entirely virtual world across its lifecycle, ranging from product development and customer demand monitoring to production and inventory management. As devices and systems become more interactive and intelligent, the blockchain can serve as a plant-level, regional-level and global supply chain ledger. This will dramatically cut costs and drive more efficient just-in-time (JIT) processes, enabling better usage of plant capacity and improved operational efficiencies.
  2. Connected and Driverless Vehicles – The Connected Vehicle enables the car or truck to behave as a giant Smart App. With the passing of every year, vehicles have more automatic features built in, ranging from navigation to roadside assistance. Blockchain will enable these vehicles to be tracked on the digital mesh, enabling easy inter-vehicle communication as well as automatic tracking of fleet insurance policies, vehicle registration renewals etc.
  3. Transportation – IoT + Blockchain = Connected Transportation. A range of scenarios can be imagined around a connected mesh of vehicles that exchange everything from traffic information to alerts that help avoid gridlock and bottlenecks. Extending this to global trade, the mesh can incorporate shipping, air freight and ground transportation to track shipments.
  4. Public Infrastructure & Smart Cities – Smart devices are already being used to track the health of bridges, roads, power grids etc. Blockchains can be used to interconnect these devices to share efficiencies, conduct maintenance and forecast usage trends for power consumption, pollution etc. Another key area of usage would be helping remote areas such as forests, where monitoring natural incidents can prevent catastrophic occurrences like large scale forest fires or farm infestations by blight.
  5. Financial Services and Insurance – Banks could use a Blockchain backbone to track IoT-enabled devices like ATMs and remote tellers in order to conduct maintenance. Insurance companies, which have already started deploying drones to verify property claims in remote areas, can use the Blockchain to validate and verify claims.
  6. Home and Commercial Real Estate Management – Sensors deployed in both homes and commercial buildings help with automated home and office monitoring. The usecases diverge across the two areas, but many can be built on a foundational distributed ledger capability.
  7. Smart Contracts – Blockchain-based cryptocurrencies like Bitcoin enable two business capabilities at a high level: a) transfer of cryptocurrency and b) business rules that specify when a payout has to happen, typically once conditions have been met that satisfy fulfillment of contractual terms.
    These rules are termed "Smart Contracts". Smart contracts are applicable across all of these business areas and can be used to keep track of business rules and take actions once thresholds have been met or breached. E.g. a driverless vehicle that has failed an inspection can be grounded, or non-payment of homeowners insurance can trigger an alert to the homeowner's housing society (a minimal sketch of such a rule follows this list).
  8. Retail – Retailers are already using IoT devices and endpoints across the business lifecycle, ranging from the shop floor, to tracking product delivery to stores, to understanding customer traffic patterns, to wearables. The vision of the Connected Store, with IoT-enabled shelves and the ability for customers to perform more actions using smartphone apps while reducing checkout times with self-checkout, is already taking shape. The Blockchain can augment all of these usecases by providing the critical link between retailer and consumer in a way that automates away the middleman, be it a credit card issuer or a central server. For instance, consumers can store their product preferences and sizes in a Blockchain, and the retailer can access these in a seamless and secure manner.
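As a minimal illustration of the smart-contract pattern described in item 7 above, the sketch below encodes one such rule, grounding a driverless vehicle after a failed inspection, as plain Python. Real smart contracts would be deployed on a blockchain platform (for example, in Solidity on Ethereum); this shows only the shape of the business rule, and all field names are assumptions.

```python
# A hypothetical smart-contract-style rule: the condition and the resulting
# action are encoded together and evaluated automatically, with no middleman.
def inspection_contract(vehicle):
    """Ground a driverless vehicle whose latest inspection failed."""
    if vehicle["last_inspection"] == "fail":
        return {"vehicle_id": vehicle["vehicle_id"], "action": "ground_vehicle"}
    return {"vehicle_id": vehicle["vehicle_id"], "action": "none"}

# Illustrative fleet data; on a blockchain these records would come
# from the shared ledger rather than a local list.
fleet = [
    {"vehicle_id": "V-001", "last_inspection": "pass"},
    {"vehicle_id": "V-002", "last_inspection": "fail"},
]
for vehicle in fleet:
    print(inspection_contract(vehicle))
```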

There still exist large technology shortcomings..

Finally, it needs to be mentioned that critical gaps still exist in Blockchain technology, whether one considers the public Blockchain on which Bitcoin is built or technologies like Ethereum: interoperability, security standards, throughput and mature developer tooling. These will need to be worked on over the next few quarters before we see production-grade IoT deployments on Blockchains.

Conclusion..

The potential of Blockchain to ultimately enable secure, global and decentralized communication across billions of IoT endpoints is very promising. As Blockchain matures and makes its way into the enterprise, it is likely to impact many aspects of business operations and strategies in the coming years.