Vamsi Chemitiganti's weekly musings on applying Big Data, Cloud, & Middleware technology to solving industry challenges. Published every Friday or Sunday (if I'm very busy). All opinions are entirely my own. I write this blog so my readers don't have to spend money on expensive consultants.
The natural tendency in the world of Corporate IT is to create applications in response to business challenges. For instance, take any large Bank or Insurer or Manufacturer – you will find thousands of packaged applications that aim to solve a range of challenges from departmental level issues to enterprise-wide business problems. Over years these have given rise to application and infrastructure sprawl.
The application mindset creates little business value over the long run while creating massive technology headaches. For instance, the rationalization of these applications over time becomes a massive challenge in and of itself. At times, IT does not even understand how relevant some of these applications are to business users, who is actually using them, or what benefits they derive. Over the last 15 years, Silicon Valley players such as Apple, Google, and Facebook have illustrated the power of building platforms that connect a range of users to the businesses that serve them. As the Network Effects of these platforms have grown exponentially, so have their user bases.
What Corporate IT & business need to learn to do is to move to a Platform mindset.
The Platform Strategy…
Amazon is the perfect example of how to conceive and execute a platform strategy over a couple of decades. It began life as a retailer in 1994 and over time added complementary offerings such as Marketplace, AWS, Prime Video, Payments etc. These platforms have led to an ever-increasing panoply of services, higher revenues, more directed consumer interactions and stronger network effects. Each platform generates its own revenue stream and would be a large standalone corporation in its own right. However, the whole is greater than the sum of the individual products, and this has helped make Amazon one of the most valuable companies in the world (as of late 2017).
So what are the key business benefits and drivers of a platform oriented model?
Driver #1 Platforms enable you to build business ecosystems
Platforms enable enterprise businesses to orient around their core capabilities and deliver on them well. Once that is done to a high degree of success, partners and other ecosystem players can plug in their own capabilities. The challenge, most times, is that large companies always seem to play catch-up with the business models of nimbler players. When they do, they often choose an application-based approach, which does not enable them to take a holistic view of their enterprise and the business ecosystems around it. In the Platform approach, IT departments move to more of a service model, delivering agile platforms and technology architectures for business lines to develop products around.
For example, following the PSD2 regulation, innovators in the European Banking system will become a prime example of platform-led business ecosystems.
Driver #2 Platforms enable you to rethink and better the customer experience thus driving new revenue streams
The primary appeal of a platform based architecture is the ability to drive cross-sell and upsell opportunities. This increases not only the number of products adopted by a given customer but also (and ultimately) the total revenue per customer.
The below blog post discusses how Payment Providers are increasingly using advanced analytics on their business platforms not only to generate increased topline/sales growth but also to defend against fraud and money laundering, supporting anti-money laundering (AML) compliance.
Driver #3 Platforms enable you to experiment with business models (e.g. Data Monetization)
The next progressive driver in leveraging both internal and external data is to use it to drive new revenue streams in existing lines of business. This is also termed Data Monetization. Data Monetization is the organizational ability to turn data into cost savings & revenues in existing lines of business and to create new revenue streams. This requires fusing both internal and external data to create new analytics and visualization.
Driver #4 Platforms destroy business process silos
One of the chief reasons that holds back an enterprise's ability to innovate is the presence of both business and data silos, a direct result of the Application based approach. When underlying business processes & data sources are both fragmented, communication between business teams moves to other internal & informal mechanisms such as email, chat and phone calls. This is a recipe for delayed business decisions which are ultimately ineffective, as they depend more on intuition than on data. The Platform approach drives the organization towards unification and rationalization of both the data and the business processes that create it, leading to a unified and consistent view of both across the business.
Driver #5 Platforms move you to become a Real-time Enterprise
Enterprises that are platform oriented do more strategic things right than wrong. They constantly experiment with creating new, and refining existing, business capabilities with a view to making them appealing to a rapidly changing clientele. They refine these using constant feedback loops and create platforms comprised of cutting-edge technology stacks that dominate the competitive landscape. The Real-Time Enterprise demands that workers at many levels, ranging from line of business managers to executives, have fresh, high quality and actionable information on which they can base complex yet high-quality business decisions.
A business and IT strategy built on platform approaches enables an organization to take on a much wider & richer variety of business challenges and to achieve outcomes that were simply not possible with the Application model.
A few weeks ago on the invitation of DZone Magazine, I jointly authored a Big Data Reference Architecture along with my friend & collaborator, Tim Spann (https://www.linkedin.com/in/timothyspann/). Tim & I distilled our experience working on IIoT projects to propose an industrial strength digital architecture. It brings together several technology themes – Big Data , Cyber Security, Cognitive Applications, Business Process Management and Data Science. Our goal is to discuss a best in class architecture that enables flexible deployment for new IIoT capabilities allowing enterprises to build digital applications. The abridged article was featured in the new DZone Guide to Big Data: Data Science & Advanced Analytics which can be downloaded at https://dzone.com/guides/big-data-data-science-and-advanced-analytics
How the Internet Of Things (IoT) leads to the Digital Mesh..
The Internet of Things (IoT) has become one of the four most hyped technology paradigms affecting the world of business, the other usual suspects being Big Data, AI/Machine Learning & Blockchain. Cisco predicts that by 2020 the IoT will encompass about 25 billion connected things and affect about $2 trillion of economic value globally across a diverse range of verticals. These devices are not just consumer oriented devices such as smartphones and home monitoring systems but also dedicated industrial objects such as sensors, actuators, engines etc.
The interesting angle to all this is that autonomous devices are already beginning to communicate with one another using IP based protocols. They largely exchange state & control information. With the growth of computational power on these devices, we are not far from their sending over more granular and interesting streaming data – about their environment, performance and business operations – all of which will enable a higher degree of insightful analytics to be performed. Gartner Research has termed this interconnected world, where decision making & manufacturing optimization can occur via the IoT, the “Digital Mesh“.
The evolution of technological innovation in areas such as Big Data, Predictive Analytics and Cloud Computing now enables the integration and analysis of massive amounts of device data at scale while performing a range of analytics and business process workflows on the data.
According to Gartner, the Digital Mesh will thus lead to an interconnected information deluge powered by the continuous data from these streams. These streams will encompass classical IoT endpoints (sensors, field devices, actuators etc) sending data in a variety of formats – text, audio, video & social data streams – along with new endpoints in areas as diverse as Industrial Automation, Remote Healthcare, Public Transportation, Connected Cars, Home Automation etc. These intelligent devices will increasingly begin communicating with their environments in a manner that will encourage collaboration in a range of business scenarios. The industrial cousin of the IoT is the Industrial Internet of Things (IIoT).
Defining the Industrial Internet Of Things (IIoT)
The Industrial Internet of Things (IIoT) can be defined as an ecosystem of capabilities that interconnects machines, personnel and processes to optimize the industrial lifecycle. The foundational technologies that IIoT leverages are Smart Assets, Big Data, Realtime Analytics, Enterprise Automation and Cloud based services.
The primary industries impacted the most by the IIoT will include Industrial Manufacturing, the Utility industry, Energy, Automotive, Transportation, Telecom & Insurance.
According to Markets and Markets, the annual worldwide Industrial IoT market is projected to exceed $319 billion in 2020, which represents an 8% compound annual growth rate (CAGR). The top four segments are projected to be manufacturing, energy and utilities, auto & transportation and healthcare.
Architectural Challenges for Industrial IoT versus Consumer IoT..
Consumer based IoT applications generally receive the lion’s share of media attention. However, the ability of industrial devices (such as sensors) to send ever richer data about their operating environment and performance characteristics is driving a move to Digitization and Automation across a range of industrial manufacturing.
Thus, there are four distinct challenges that we need to account for in an Industrial IoT scenario as compared to Consumer IoT.
The IIoT needs Robust Architectures that are able to handle millions of device telemetry messages per second. The architecture needs to take into account all kinds of devices operating in environments ranging from the highly constrained to the well-connected.
IIoT also calls for the highest degrees of Infrastructure and Application reliability across the stack. For instance, a lost or dropped message in a healthcare or a connected car scenario may mean life or death for a patient, or an accident.
An ability to integrate seamlessly with existing Information Systems. Let's be clear: these new age IIoT architectures need to augment existing systems such as Manufacturing Execution Systems (MES) or Traffic Management Systems. In Manufacturing, MES systems continually improve the product lifecycle and perform better resource scheduling and utilization. This integration helps these systems leverage the digital intelligence and insights across (potentially) millions of devices across complex areas of operation.
An ability to incorporate richer kinds of analytics than has been possible before that provide a great degree of context. This ability to reason around context is what provides an ability to design new business models which cannot be currently imagined due to lack of agility in the data and analytics space.
What will IIoT based Digital Applications look like..
Digital Applications are being designed for specific device endpoints across industries. While the underlying mechanisms and business models differ from industry to industry, all of these use predictive analytics based on a combination of real time data processing & data science algorithms. These techniques extract insights from streaming data to provide digital services on existing toolchains, provide value added customer service, predict device performance & failures, improve operational metrics etc.
Examples abound. For instance, a great example in manufacturing is the notion of a Digital Twin which Gartner called out last year. A Digital twin is a software personification of an Intelligent device or system. It forms a bridge between the real world and the digital world. In the manufacturing industry, digital twins can be setup to function as proxies of Things like sensors and gauges, coordinate measuring machines, vision systems, and white light scanning. This data is sent over a cloud based system where it is combined with historical data to better maintain the physical system.
The wealth of data being gathered on the shop floor will ensure that Digital twins will be used to reduce costs and increase innovation. Thus, in global manufacturing, Data Science will soon make its way onto the shop floor to enable the collection of insights from these software proxies. We covered the phenomenon of Servitization in manufacturing in a previous blogpost.
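To make the Digital Twin idea concrete, here is a minimal, illustrative sketch in Python. The class name, fields and drift rule are all assumptions for illustration – a production twin would mirror far richer state – but it shows the core pattern: a software proxy mirrors device readings and compares current behavior against historical behavior to flag maintenance needs.

```python
from collections import deque

class DigitalTwin:
    """A toy software proxy for a physical sensor (illustrative only)."""

    def __init__(self, device_id, window=5, drift_threshold=10.0):
        self.device_id = device_id
        self.history = deque(maxlen=window)   # rolling window of recent readings
        self.baseline = None                  # mean of the first full window
        self.drift_threshold = drift_threshold

    def update(self, reading):
        """Mirror a new reading sent by the physical device."""
        self.history.append(reading)
        if self.baseline is None and len(self.history) == self.history.maxlen:
            self.baseline = sum(self.history) / len(self.history)

    def needs_maintenance(self):
        """Flag the asset when the rolling mean drifts away from the baseline."""
        if self.baseline is None or not self.history:
            return False
        rolling_mean = sum(self.history) / len(self.history)
        return abs(rolling_mean - self.baseline) > self.drift_threshold

twin = DigitalTwin("vibration-sensor-01")
for r in [50, 51, 49, 50, 50]:        # normal operation establishes a baseline
    twin.update(r)
for r in [70, 72, 75, 71, 74]:        # sustained drift suggests wear
    twin.update(r)
print(twin.needs_maintenance())       # True
```

In a real deployment the historical comparison would run against cloud-side data, as described above, rather than an in-memory window.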
In the Retail industry, an ability to detect a customer’s location in realtime and combining that information with their historical buying patterns can drive real time promotions and an ability to dynamically price retail goods.
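A hedged sketch of what such a location-aware promotion rule might look like; the thresholds, field names and discount rate are invented purely for illustration:

```python
def personalize_offer(distance_m, past_purchases, category, base_price):
    """Toy rule: discount when a repeat buyer of this category is nearby.

    All parameters and thresholds are illustrative assumptions.
    """
    in_store_radius = distance_m <= 100               # customer detected near the store
    affinity = past_purchases.get(category, 0) >= 3   # repeat buyer of the category
    if in_store_radius and affinity:
        return round(base_price * 0.85, 2)            # push a 15% real-time promotion
    return base_price

history = {"running-shoes": 4, "coffee": 1}
print(personalize_offer(50, history, "running-shoes", 120.00))   # 102.0
print(personalize_offer(500, history, "running-shoes", 120.00))  # 120.0
```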
Solution Requirements for an IIoT Architecture..
At a high level, the IIoT reference architecture should support six broad solution areas-
Device Discovery – Discovering a range of devices (and their details) on the Digital Mesh for an organization within and outside the firewall perimeter
Performing Remote Lifecycle Configuration of these devices ranging from startup to modification to monitoring to shut down
Performing Deep Security level introspection to ensure the patch levels etc are adequate
Creating Business workflows on the Digital Mesh. We will do this by marrying these devices to enterprise information systems (EISs)
Performing Business oriented Predictive Analytics on these devices; this is critical to deriving ongoing business value from device data
On a futuristic basis, support optional integration with the Blockchain to support a distributed organizational ledger that can coordinate activity across all global areas that an enterprise operates in.
Building Blocks of the Architecture
Listed below are the foundational blocks of our reference architecture. Though the requirements will vary across industries, an organization can reasonably standardize on a number of foundational components as depicted below and then incrementally augment them as the interactions between different components increase based on business requirements.
Our reference architecture includes the following major building blocks –
Device Integration Layer
Data & Middleware Tier
Digital Application Layer
It also includes the following cross cutting concerns which span across the above layers –
Device and Data Security
Business Process Management
Data Governance – Provenance, Auditing, Logging
The next section provides a brief overview of the reference architecture’s components at a logical level.
Device Layer –
The first requirement of IIoT implementations is to support connectivity from the Things themselves or the Device layer depicted at the bottom. The Device layer includes a whole range of sensors, actuators, smartphones, gateways and industrial equipment. The ability to connect with devices and edge devices like routers and smart gateways using a variety of protocols is key. These network protocols include Ethernet, WiFi, and Cellular, which can all directly connect to the internet. Other protocols that need a gateway device to connect include Bluetooth, RFID, NFC, Zigbee et al. Devices can connect directly with the data ingest layer shown above, but it is preferred that they connect via a gateway which can perform a range of edge processing.
This is important from a business standpoint. For instance, in certain verticals like healthcare and financial services, there exist stringent regulations that govern when certain identifying data elements (e.g. video feeds) can leave the premises of a hospital or bank. A gateway can not only perform intelligent edge processing but also connect thousands of device endpoints and facilitate bidirectional communication with the core IIoT architecture.
The ideal tool for these constantly evolving devices, metadata, protocols, data formats and types is Apache NiFi. These agents will send the data to an Apache NiFi gateway or directly into an enterprise Apache NiFi cluster in the cloud or on-premise.
A subproject of Apache NiFi, MiNiFi provides a complementary data collection approach that supplements the core tenets of NiFi in dataflow management. Due to its small footprint and low resource consumption, it is well suited to handle dataflow from sensors and other IoT devices. It provides central management of agents while providing full chain-of-custody information on the flows themselves.
For remote locations with more powerful devices like the Arrow BeagleBone Black Industrial and MyPi Industrial, it is simple to run a tiny Java or C++ MiNiFi agent for secure connectivity needs.
The data sent by the device endpoints are then modeled into an appropriate domain representation based on the actual content of the messages. The data sent over also includes metadata around the message. A canonical model can optionally be developed (based on the actual business domain) which can support a variety of applications from a business intelligence standpoint.
Apache NiFi supports the flexibility of ingesting changing file formats, sizes, data types and schemas. The devices themselves can send a range of feeds in different formats – e.g. XML now and, based on upgraded capabilities, richer JSON tomorrow. NiFi supports ingesting any file type that the devices or the gateways may send. Once the messages are received by Apache NiFi, they are enveloped in security, with every touch to each flow file controlled, secured and audited. NiFi flows also provide full data provenance for each file, packet or chunk of data sent through the system. NiFi can work with specific schemas if there are special requirements for file types, but it can work with unstructured or semi structured data just as well. From a scalability standpoint, NiFi can ingest 50,000 streams concurrently on a zero-master, shared-nothing cluster that scales horizontally via easy administration with Apache Ambari.
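As a toy illustration of the canonical-model idea discussed above (this is plain Python using only the standard library, not a NiFi flow; all field names are assumptions), the same reading arriving as XML today or JSON tomorrow can be normalized into one representation:

```python
import json
import xml.etree.ElementTree as ET

def to_canonical(payload):
    """Normalize an XML or JSON device message into one canonical record.

    The canonical field names (device_id, metric, value, ts) are
    illustrative, not a standard.
    """
    if payload.lstrip().startswith("<"):
        root = ET.fromstring(payload)
        return {
            "device_id": root.findtext("id"),
            "metric": root.findtext("metric"),
            "value": float(root.findtext("value")),
            "ts": root.findtext("ts"),
        }
    doc = json.loads(payload)
    return {
        "device_id": doc["deviceId"],
        "metric": doc["metric"],
        "value": float(doc["value"]),
        "ts": doc["timestamp"],
    }

xml_msg = "<reading><id>pump-7</id><metric>temp</metric><value>88.5</value><ts>2017-06-01T10:00:00Z</ts></reading>"
json_msg = '{"deviceId": "pump-7", "metric": "temp", "value": 88.5, "timestamp": "2017-06-01T10:05:00Z"}'
print(to_canonical(xml_msg)["value"], to_canonical(json_msg)["value"])  # 88.5 88.5
```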
Data and Middleware Layer –
The IIoT Architecture recommends a Big Data platform with native message oriented middleware (MOM) capabilities to ingest device mesh data. This layer will also process device data in whatever fashion – batch or real-time – the business needs demand.
Application protocols such as AMQP, MQTT, CoAP, WebSockets etc. are all deployed by many device gateways to communicate application specific messages. The reason for recommending a Big Data/NoSQL dominated data architecture for IIoT is quite simple. These systems provide Schema on Read, an innovative data handling technique in which a format or schema is applied to data as it is accessed from a storage location, as opposed to being applied at ingest time. From an IIoT standpoint, one must deal not just with the data itself but also with metadata such as timestamps, device ids, and other firmware data such as software version, device manufacture date etc. The data sent from the device layer will consist of time series data and individual measurements.
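The Schema on Read idea can be sketched in a few lines of Python. Raw messages land in storage untouched; a schema (with defaults tolerating drift across firmware versions) is applied only when the data is read. Field names here are illustrative assumptions:

```python
import json

# Raw messages land in storage untouched; no schema is enforced at ingest time.
raw_store = [
    '{"deviceId": "t-1", "fw": "2.3", "readings": [21.5, 22.5]}',
    '{"deviceId": "t-2", "readings": [19.9]}',   # older firmware omits "fw"
]

def read_with_schema(raw_records):
    """Apply a schema on read: missing fields get defaults instead of failing ingest."""
    for raw in raw_records:
        doc = json.loads(raw)
        yield {
            "device_id": doc["deviceId"],
            "firmware": doc.get("fw", "unknown"),   # the schema tolerates drift
            "avg_reading": sum(doc["readings"]) / len(doc["readings"]),
        }

for rec in read_with_schema(raw_store):
    print(rec["device_id"], rec["firmware"], rec["avg_reading"])
```

Had the schema been enforced at write time, the second (older-firmware) message would have been rejected; here it is simply interpreted with a default.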
The IIoT data stream can thus be visualized as a constantly running data pump handled by a Big Data pipeline that takes the raw telemetry data from the gateways, decides which messages are of interest and discards the ones not deemed significant from a business standpoint. Apache NiFi is your gateway and gatekeeper. It ingests the raw data, manages the flow of thousands of producers and consumers, and performs basic data enrichment, in-stream sentiment analysis, aggregation, splitting, schema translation, format conversion and other initial steps to prepare the data – all with a user-friendly web UI and an easily extensible architecture. It then sends raw or processed data to Kafka for further processing by Apache Storm, Apache Spark or other consumers. Apache Storm is a distributed real-time computation engine that reliably processes unbounded streams of data. Storm excels at handling complex streams that require windowing and other complex event processing. While Storm processes stream data at scale, Apache Kafka distributes messages at scale. Kafka is a distributed pub-sub real-time messaging system that provides strong durability and fault-tolerance guarantees. NiFi, Storm and Kafka naturally complement each other, and their cooperation enables real-time streaming analytics for fast-moving big data.
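The filter-and-enrich gatekeeping described above can be mimicked in plain Python. This is a conceptual sketch only – not NiFi or Storm code – and the threshold, field names and severity rule are invented for illustration:

```python
def pipeline(events, threshold=50.0):
    """Mimic the gatekeeper steps: drop low-value telemetry, enrich the rest."""
    for e in events:
        if e["value"] < threshold:          # discard readings not deemed significant
            continue
        # Enrichment step: tag each surviving event with a derived severity field.
        yield dict(e, severity="high" if e["value"] >= 90 else "normal")

telemetry = [
    {"device": "motor-1", "value": 12.0},
    {"device": "motor-2", "value": 75.0},
    {"device": "motor-3", "value": 95.0},
]
for e in pipeline(telemetry):
    print(e["device"], e["severity"])
# motor-2 normal
# motor-3 high
```

In the architecture above, this filtering happens in NiFi flows at the edge or core, with the enriched events published to Kafka topics for downstream consumers.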
Appropriate logic is built into the higher layers to support device identification, ID lookup, secure authentication and transformation of the data. This layer will process data (cleanse, transform, apply a canonical representation) to support Business Automation (BPM), BI (business intelligence) and visualization for a variety of consumers. The data ingest layer will also provide notifications and alerts via Apache NiFi.
Here are some typical uses for this event processing pipeline:
a. Real-time data filtering and pattern matching
b. Enrichment based on business context
c. Real-time analytics such as KPIs, complex event processing etc
d. Predictive Analytics
e. Business workflow with decision nodes and human task nodes
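Item (c) above, complex event processing, often reduces to pattern rules evaluated over sliding windows of the stream. A minimal sketch, where the threshold and run length are illustrative assumptions:

```python
from collections import deque

def detect_pattern(stream, threshold=80.0, run_length=3):
    """CEP-style rule: fire an alert when `run_length` consecutive readings
    all exceed `threshold` (an illustrative KPI breach pattern)."""
    window = deque(maxlen=run_length)
    alerts = []
    for i, value in enumerate(stream):
        window.append(value)
        if len(window) == run_length and all(v > threshold for v in window):
            alerts.append(i)          # index at which the pattern completed
    return alerts

readings = [70, 85, 90, 88, 60, 95, 96]
print(detect_pattern(readings))  # [3]
```

In the reference architecture this logic would run inside Storm (or Spark Streaming) over Kafka topics rather than over an in-memory list.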
Digital Application Tier –
Once IIoT knowledge has become part of the Hadoop based Data Lake, all the rich analytics, machine learning and deep learning frameworks, tools and libraries become available to Data Scientists and Analysts. They can easily produce insights, dashboards, reports and real-time analytics with IIoT data joined with existing data in the lake, including social media data, EDW data and log data. All your data can be queried with familiar SQL through a variety of interfaces such as Apache Phoenix on HBase, Apache Hive LLAP and Apache Spark SQL. Using your existing BI tools or the open source Apache Zeppelin, you can produce and share live reports. You can run TensorFlow in containers on YARN for deep learning insights on your images, videos and text data, while running YARN-clustered Spark ML pipelines fed by Kafka and NiFi to run streaming machine learning algorithms on trained models.
A range of predictive applications are suitable for this tier. The models themselves should seek to answer business questions around things like -Asset failure, the key performance indicators in a manufacturing process and how they’re trending, insurance policy pricing etc.
Once the device data has been ingested into a modern data lake, key functions that need to be performed include data aggregation, transformation, enriching, filtering, sorting etc.
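A toy sketch of the per-device aggregation step (pure Python over an in-memory slice; in practice this would run as Hive, Phoenix or Spark SQL over the lake, and the field names are illustrative):

```python
from collections import defaultdict

def aggregate_by_device(records):
    """Summarize raw readings per device: count, min, max and mean."""
    buckets = defaultdict(list)
    for rec in records:
        buckets[rec["device"]].append(rec["value"])
    return {
        dev: {"count": len(vs), "min": min(vs), "max": max(vs),
              "mean": sum(vs) / len(vs)}
        for dev, vs in buckets.items()
    }

lake_slice = [
    {"device": "a", "value": 1.0}, {"device": "a", "value": 3.0},
    {"device": "b", "value": 10.0},
]
print(aggregate_by_device(lake_slice)["a"]["mean"])  # 2.0
```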
As one can see, this can get very complex very quickly – both from a data storage and a processing standpoint. A Cloud based infrastructure, with its ability to provide highly scalable compute, network and storage resources, is a natural fit to handle bursty IIoT applications. However, IIoT applications add their own diverse requirements for computing infrastructure, namely the ability to accommodate hundreds of kinds of devices and network gateways – which means that IT must be prepared to support a large diversity of operating systems and storage types.
The tier is also responsible for the integration of the IIoT environment into the business processes of an enterprise. The IIoT solution ties into existing line-of-business applications and standard software solutions through adapters or Enterprise Application Integration (EAI) and business-to-business (B2B) gateway capabilities. End users in business-to-business or business-to-consumer scenarios will interact with the IIOT solution and the special- purpose IIoT devices through this layer. They may use the IIoT solution or line-of-business system UIs, including apps on personal mobile devices, such as smartphones and tablets.
The topic of Security is perhaps the most important cross cutting concern across all layers of the IIoT architecture stack. Needless to say, each of the layers must support the strongest data encryption, authentication and authorization capabilities for devices, users and partner applications. Accordingly, capabilities must be provided to ingest and store security feeds, IDS logs, server logs and device telemetry for advanced behavioral analytics. These feeds must be constantly analyzed across three domains – the Device domain, the Business domain and the IT domain. The below blogpost delves into some of these themes and is a good read to get a deeper handle on this issue from a SOC (security operations center) standpoint.
It is evident from the above that IIoT will create enormous opportunity for businesses globally. It will also create layers of complexity and opportunity for Enterprise IT. The creation of smart digital services on the data served up will further depend on the vertical industries. Whatever the kind of business model – whether tracking behavior, location sensitive pricing, business process automation etc. – the end goal of the IT architecture should be to create enterprise business applications that are ultimately data native and analytics driven.
“We’re only at the very, very beginning of this next generation of computing and I think that every industry leader will be the ones that transforms first. I don’t care what industry you’re talking about” -Kim Stevenson, CIO, Intel, Feb 2016
Gartner Research rolled out their “Top 10 Strategic Technology Trends for 2017” report a few weeks ago. My goal for this blogpost is to introduce these trends to the reader and to examine the potential impact of their recommendations from an enterprise standpoint.
# 1: AI & Advanced Machine Learning
Gartner rightly forecasts that AI (Artificial Intelligence) and Advanced Machine Learning will continue their march into daily applications run by the Fortune 1000. CIOs are coming to realize that most business problems are primarily data challenges. The rapid maturation of scalable processing techniques allows us to extract richer insights from data. What we commonly refer to as Machine Learning – a combination of econometrics, statistics, visualization, and computer science – helps extract valuable business insights hiding in data and build operational systems to deliver that value.
Deep Machine Learning involves the art of discovering data insights in a human-like pattern. We are, thus, clearly witnessing the advent of modern data applications. These applications will leverage a range of advanced techniques such as Artificial Intelligence and Machine Learning (ML) encompassing techniques such as neural networks, natural language processing and deep learning.
Implications for industry CIOs – Modern data applications understand their environment (e.g. customer preferences and other detailed data insights) and can predict business trends in real time & take action on them to drive revenues and decrease business risk. These techniques will enable applications and devices to operate in an even smarter manner while saving companies enormous amounts of money in manual costs.
# 2: Intelligent Apps
Personal assistants such as Apple Siri and Microsoft Cortana, in the category of virtual personal assistants (VPAs), have begun making everyday business processes easier for their users. VPAs represent the intersection of AI, conversational interfaces and integration into business processes. In 2017, these will begin improving customer experiences for the largest Fortune 100 enterprises. On the more personal front, Home VPAs will rapidly evolve & become even smarter as their algorithms become more capable and aware of their environments. We will see increased application of smart agents in diverse fields like financial services, healthcare, telecom and media.
Implications for industry CIOs – Get ready to invest in intelligent applications in the corporate intranet to start with.
# 3: Intelligent Things
The rise of the IoT has been well documented. Couple AI with massive data processing capabilities, and you get Intelligent Things that can interact with humans in new ways. Add to that a whole category of things around transportation (self driving cars, connected cars), robots that perform key processes in industrial manufacturing, drones etc.
Implications for industry CIOs – These intelligent devices will increasingly begin communicating with their environments in a manner that will encourage collaboration in a range of business scenarios. 2017 should begin the trend of these devices communicating with each other to form the eponymous ‘Digital Mesh’.
# 4: Virtual & Augmented Reality
Virtual reality (VR) and augmented reality (AR) are technologies that are beginning to completely change the way humans interact with one another and with the intelligent systems that make up the Digital Mesh. Pokemon GO – debuting in 2016 – was the first hugely successful consumer facing AR application, while the Oculus Rift brought VR into the consumer mainstream. Uses of these technologies will include gamification (to improve customer engagement with products and services) and other customer & employee facing applications. While both technologies enable us to view the world in different ways, AR is remarkable in its ability to add to our current reality. BMW’s subsidiary Mini has actually developed driving goggles with AR technology.
Implications for industry CIOs – This one is still on the drawing board for most verticals but it does make sense to invest in areas like gamification and in engaging with remote employees using AR.
# 5: Digital Twin
A Digital twin is a software personification of an Intelligent Thing or system. In the manufacturing industry, digital twins can be set up to function as proxies of things like sensors and gauges, Coordinate Measuring Machines, lasers, vision systems, and white light scanning. The wealth of data being gathered on the shop floor will ensure that Digital twins will be used to reduce costs and increase innovation. Data science will soon make its way onto the shop floor to enable the collection of insights from these software proxies.
Implications for industry CIOs – Invest in Digital capabilities that serve as proxies for physical things.
# 6: Blockchain
The term Blockchain is derived from a design pattern that describes a chain of data blocks that map to individual transactions. Each transaction conducted in the real world (e.g. a Bitcoin wire transfer) results in the creation of new blocks in the chain. Each new block is created by calculating a cryptographic hash over the contents of its predecessor, thus constructing a chain of blocks – hence the name.
Blockchain is a distributed ledger technology (DLT) which allows global participants to conduct secure transactions of any type – banking, music purchases, legal contracts, supply chain transactions etc. Blockchain will transform multiple industries in the years to come. Bitcoin is the first application of Blockchain.
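The hash-chaining described above can be demonstrated in a few lines of Python using only the standard library. This is a toy illustration of the data structure only – not a real distributed ledger, since there is no consensus, mining or networking:

```python
import hashlib
import json

def make_block(transactions, previous_hash):
    """Create a block whose hash commits to its transactions AND its predecessor."""
    body = {"transactions": transactions, "previous_hash": previous_hash}
    block_hash = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    return {**body, "hash": block_hash}

def chain_is_valid(chain):
    """Each block must reference its predecessor's recorded hash;
    altering any hash breaks the link to the next block."""
    for prev, curr in zip(chain, chain[1:]):
        if curr["previous_hash"] != prev["hash"]:
            return False
    return True

genesis = make_block(["genesis"], previous_hash="0" * 64)
b1 = make_block(["alice pays bob 5"], previous_hash=genesis["hash"])
b2 = make_block(["bob pays carol 2"], previous_hash=b1["hash"])
print(chain_is_valid([genesis, b1, b2]))   # True
genesis["hash"] = "tampered"               # rewriting history breaks the chain
print(chain_is_valid([genesis, b1, b2]))   # False
```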
Implications for industry CIOs – Begin expanding internal knowledge on Blockchain and as to how it can potentially augment or disrupt your vertical industry.
# 7: Conversational Systems
Mobile applications first began forcing enterprises to support multiple channels of interaction with their consumers. For example, Banking now requires an ability to engage consumers in a seamless experience across an average of four to five channels – Mobile, eBanking, Call Center, Kiosk etc. Conversational Systems take these interactions to the next level and enable humans to communicate with a wide range of Intelligent Things using a range of channels – speech, touch, vision etc.
Implications for industry CIOs – Every touch point matters, and those leading the smart agent transformation should constantly be asking how organizations are removing friction and enhancing the experience for every customer regardless of where they are in the journey.
# 8: Mesh App and Service Architecture
This one is a holdover from last year. The Digital Mesh leads to an interconnected information deluge which encompasses classical IoT endpoints along with audio, video & social data streams. The creation of these smart services will further depend on the vertical industries that these products serve as well as requirements for the platforms that host them – e.g. industrial automation, remote healthcare, public transportation, connected cars, home automation etc. The microservices architecture approach – which combines the notion of autonomous, cooperative yet loosely coupled applications built as a conglomeration of business focused services – is a natural fit for the Digital Mesh. The most important additive and consideration to microservices based architectures in the age of the Digital Mesh is what I’d like to term Analytics Everywhere.
Implications for industry CIOs – The mesh app will require a microservices based architecture which supports multichannel & multi device solutions.
# 9: Digital Technology Platforms
The onset of Digital Architectures in enterprise businesses implies the ability to drive continuous micro level interactions with global consumers/customers/clients/stockholders or patients depending on the vertical you operate in. More information on the core building blocks of Digital Technology Platforms at the below blogpost.
The evolution of the intelligent digital mesh, digital technology platforms and application architectures means that security has to become fluid and adaptive. Traditional solutions cannot handle this challenge, which is exacerbated by the expectation that in an IoT & Digital Mesh world, data flows will be multidirectional across a grid of application endpoints.
Implications for industry CIOs – Expect to find applications in 2016 and beyond incorporating Deep Learning and Real Time Analytics into their core security design with a view to analyzing large scale data at a very low latency. Security in the IoT environment is particularly challenging. Security teams need to work with application, solution and enterprise architects to build security into the overall DevOps process to create a DevSecOps model.
In this year’s edition, Gartner are clearly forecasting the future ten years out from a mass market standpoint. As we cross this chasm slowly over the next ten years, we will see IoT begin to emerge and take center stage in every industry vertical. Digital transformation will happen on apps created for and brought together for Smart Agents on the Device Mesh.
These apps will gradually become autonomous, data intensive, server-less, hopefully secure and location independent (data center or cloud). The app can be a sensor or a connected car or a digital twin for a manufacturing technician. So, it’s not just about a single app sitting in a data center or the cloud or on the machine itself. These smart agent apps will be data driven, components of a larger mesh, interconnected using open interfaces, and resident at the places where it is optimal for realtime analytics. This may seem like science fiction for the Fortune 1000 enterprise but it is manifest reality at the web scale innovators. The industry will have no choice but to follow.
 Cramer – “A lesson in Augmented Realities” – http://cramer.com/story/the-difference-between-ar-and-vr/
“Ultimately, the cloud is the latest example of Schumpeterian creative destruction: creating wealth for those who exploit it; and leading to the demise of those that don’t.” – Joe Weiman author of Cloudonomics: The Business Value of Cloud Computing
The Cloud As a Venue for Digital Workloads…
As 2016 draws to a close, it can safely be said that no industry leader questions the existence of the new Digital Economy and the fact that every firm out there needs to create a digital strategy. Myriad organizations are taking serious business steps toward making their platforms highly customer-centric via a renewed operational metrics focus. They are also working on creating new business models using their Analytics investments. Examples of these verticals include Banking, Insurance, Telecom, Healthcare, Energy etc.
As a general trend, the Digital Economy brings immense opportunities while exposing firms to risks as well. Customers are now demanding highly contextual products, services and experiences – all accessible via an easy API (Application Programming Interface).
Big Data Analytics (BDA) software revenues will grow from nearly $122B in 2015 to more than $187B in 2019 – according to Forbes. At the same time, it is clear that exploding data generation across the global economy has become a clear & present business phenomenon. Data volumes are rapidly expanding across industries. However, it is not just the production of data that has increased; the growth is also driving the need for organizations to derive business value from it. As IT leaders know well, digital capabilities need low cost yet massively scalable & agile information delivery platforms – which only Cloud Computing can provide.
For a more detailed technical overview- please visit below link.
Big Data & Big Data Analytics drive consumer interactions..
The onset of Digital Architectures in enterprise businesses implies the ability to drive continuous online interactions with global consumers/customers/clients or patients. The goal is not just to provide engaging visualization but also to personalize services clients care about across multiple channels of interaction. The only way to attain digital success is to understand your customers at a micro level while constantly making strategic decisions on your offerings to the market. Big Data has become the catalyst in this massive disruption as it can help businesses in any vertical solve their need to understand their customers better & perceive trends before the competition does. Big Data thus provides the foundational platform for successful business platforms.
The three key areas where Big Data & Cloud Computing intersect are –
Data Science and Exploration
ETL, Data Backups and Data Preparation
Analytics and Reporting
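As a tiny sketch of the ETL & data preparation intersection listed above, here is a toy extract-transform step in Python. The feed layout and field names are hypothetical; a real pipeline would run at scale on cloud infrastructure rather than in-process.

```python
import csv
import io

# A raw transactions feed with inconsistent whitespace and casing,
# standing in for one of many heterogeneous source-system extracts.
raw_feed = io.StringIO(
    "txn_id,amount,currency\n"
    "T1, 100.50 ,usd\n"
    "T2,  75.00 ,USD\n"
)

cleaned = []
for row in csv.DictReader(raw_feed):
    cleaned.append({
        "txn_id": row["txn_id"].strip(),
        "amount": float(row["amount"].strip()),        # coerce to numeric
        "currency": row["currency"].strip().upper(),   # standardize casing
    })
# cleaned rows are now schema-consistent and ready for analytics
```

The transform logic is trivial here, but the pattern – normalize differing schemas and formats into one consistent shape before analysis – is exactly what the cloud-hosted data preparation stage does against terabytes of feeds.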
Big Data drives business usecases in Digital in myriad ways – key examples include –
Obtaining a realtime Single View of an entity (typically a customer across multiple channels, product silos & geographies)
Customer Segmentation by helping businesses understand their customers down to the individual micro level as well as at a segment level
Customer sentiment analysis by combining internal organizational data, clickstream data, sentiment analysis with structured sales history to provide a clear view into consumer behavior.
Product Recommendation engines which provide compelling personal product recommendations by mining realtime consumer sentiment, product affinity information with historical data.
Market Basket Analysis, observing consumer purchase history and enriching this data with social media, web activity, and community sentiment regarding past purchase and future buying trends.
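A minimal sketch of the market basket idea above, computing itemset support and rule confidence over a handful of hypothetical baskets. A real deployment would run this at scale on a Big Data platform over full purchase histories.

```python
from collections import Counter
from itertools import combinations

# Hypothetical purchase histories; in practice these would come from
# transaction logs enriched with clickstream & social data.
baskets = [
    {"bread", "butter", "milk"},
    {"bread", "butter"},
    {"bread", "milk"},
    {"butter", "milk"},
    {"bread", "butter", "jam"},
]

pair_counts = Counter()
for basket in baskets:
    for pair in combinations(sorted(basket), 2):
        pair_counts[pair] += 1

def support(pair):
    """Fraction of all baskets containing both items."""
    return pair_counts[tuple(sorted(pair))] / len(baskets)

def confidence(antecedent, consequent):
    """Of baskets containing the antecedent, the fraction also containing the consequent."""
    with_antecedent = sum(1 for b in baskets if antecedent in b)
    both = sum(1 for b in baskets if antecedent in b and consequent in b)
    return both / with_antecedent

# support(("bread", "butter")) = 0.6; confidence("bread" -> "butter") = 0.75
```

High-support, high-confidence pairs are the raw material for cross-sell rules ("customers who bought bread also bought butter"), which is what the recommendation engines described above industrialize.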
Further, Digital implies the need for sophisticated, multifactor business analytics that need to be performed in near real time on gigantic data volumes. The only deployment paradigm capable of handling such needs is Cloud Computing – whether public or private. Cloud was initially touted as a platform to rapidly provision compute resources. Now with the advent of Digital technologies, the Cloud & Big Data will combine to process & store all this information. According to IDC, by 2020 spending on Cloud based Big Data Analytics will outpace on-premise spending by a factor of 4.5.
Intelligent Middleware provides Digital Agility..
Digital applications are modular, flexible and responsive to a variety of access methods – mobile & non mobile. These applications are also highly process driven and support the highest degree of automation. The need of the hour is to provide enterprise architecture capabilities around designing flexible digital platforms that are built around efficient use of data, speed, agility and a service oriented architecture. The choice of open source is key as it allows for a modular and flexible architecture that can be modified and adopted in a phased manner – as you will shortly see.
The intention in adopting a SOA (or even a microservices) architecture for Digital capabilities is to allow lines of business an ability to incrementally plug in lightweight business services like customer on-boarding, electronic patient records, performance measurement, trade surveillance, risk analytics, claims management etc.
Intelligent Middleware adds significant value in six specific areas –
Supports a high degree of Process Automation & Orchestration thus enabling the rapid conversion of paper based business processes to a true digital form in a manner that lends itself to continuous improvement & optimization
Business Rules help by adding a high degree of business flexibility & responsiveness
Native Mobile Applications enables platforms to support a range of devices & consumer behavior across those front ends
Platforms As a Service engines which enable rapid application & business capability development across a range of runtimes and container paradigms
Business Process Integration engines which enable rapid application & business capability development
Middleware brings the notion of DevOps into the equation. Digital projects bring several technology & culture challenges which can be solved by a greater degree of collaboration, continuous development cycles & new toolchains without giving up proven integration with existing (or legacy) systems.
Intelligent Middleware not only enables Automation & Orchestration but also provides an assembly environment to string different (micro)services together. Finally, it also enables less technical analysts to drive application lifecycle as much as possible.
Further, Digital business projects call out for mobile native applications – which a forward looking middleware stack will support. Middleware is a key component for driving innovation and improving operational efficiency.
Five Key Business Drivers for combining Big Data, Intelligent Middleware & the Cloud…
The key benefits of combining the above paradigms to create new Digital Applications are –
Enable Elastic Scalability Across the Digital Stack
Cloud computing can handle the storage and processing of any amount of data & any kind of data. This calls for the collection & curation of data from dynamic and highly distributed sources such as consumer transactions, B2B interactions, machines such as ATM’s & geo location devices, click streams, social media feeds, server & application log files and multimedia content such as videos etc. It needs to be noted that data volumes here consist of multi-varied formats, differing schemas, transport protocols and velocities. Cloud computing provides the underlying elastic foundation to analyze these datasets.
Support Polyglot Development, Data Science & Visualization
Cloud technologies are polyglot in nature. Developers can choose from a range of programming languages (Java, Python, R, Scala and C# etc) and development frameworks (such as Spark and Storm). Cloud offerings also enable data visualization using a range of tools from Excel to BI Platforms.
Reduce Time to Market for Digital Business Capabilities
Enterprises can avoid time consuming installation, setup & other upfront procedures. They can deploy Hadoop in the cloud without buying new hardware or incurring other up-front costs. In the same vein, big data analytics should be able to support self service across the lifecycle – from data acquisition, preparation, analysis & visualization.
Support a multitude of Deployment Options – Private/Public/Hybrid Cloud
A range of scenarios for product development, testing, deployment, backup or cloudbursting are efficiently supported in pursuit of cost & flexibility goals.
Fill the Talent Gap
Open Source technology is the common thread across Cloud, Big Data and Middleware. The hope is that the ubiquity of open source will serve as a critical lever in filling the IT-Business skills scarcity gap.
As opposed to building standalone or one-off business applications, a ‘Digital Platform Mindset’ is a more holistic approach capable of producing higher rates of adoption & thus revenues. Platforms abound in the web-scale world at shops like Apple, Facebook & Google etc. Digital Applications are constructed like lego blocks and they reuse customer & interaction data to drive cross sell and up sell among different product lines. The key components here are to ensure that one starts off with products with high customer attachment & retention. While increasing brand value, it is key to ensure that customers & partners can also collaborate in the improvements in the various applications hosted on top of the platform.
 Forbes Roundup of Big Data Analytics (BDA) Report
THE STATE OF GLOBAL FINANCIAL SERVICES IT ARCHITECTURE…
This blog has time & again discussed how Global, Domestic and Regional banks need to be innovative with their IT platform to constantly evolve their product offerings & services. This is imperative due to various business realities – the increased competition by way of the FinTechs, web scale players delivering exciting services & sharply increasing regulatory compliance pressures. However, systems and software architecture has been a huge issue at nearly every large bank across the globe.
Regulation is also afoot in parts of the globe which will give non traditional banks access to hitherto locked customer data – e.g. PSD-2 in the European Union. Further, banking licenses have been more easily granted to non-banks which are primarily technology pioneers, such as PayPal, Square etc.
In 2016, Banks are waking up to the fact that IT Architecture is a critical strategic differentiator. Players that have agile & efficient architecture platforms and practices can not only add new service offerings but are also able to experiment across a range of analytic led offerings that create & support multi-channel products. These digital services and usecases can now be found abundantly in areas ranging from Retail Banking to Capital Markets. FinTechs have innovated in areas such as Payments & Wealth Management.
So, How did we get here…
The Financial Services IT landscape – no matter which segment one picks across the spectrum – Capital Markets, Retail & Consumer Banking, Payment Networks & Cards, Asset Management etc – is largely predicated on a few legacy technology anti-patterns. These anti-patterns have evolved over the years from a systems architecture, data architecture & middleware standpoint.
These have resulted in a mishmash of organically developed & shrink wrapped systems that do everything from running critical Core Banking Applications to Trade Lifecycle to Securities Settlement to Financial Reporting etc. Each of these systems operates in an application, workflow and data silo with its own view of the enterprise. These are all kept in sync largely via data replication & stovepiped process integration.
If this sounds too abstract, let us take an example & a rather topical one at that. One of the most critical back office functions every financial services organization needs to perform is Risk Data Aggregation & Regulatory Reporting (RDARR). This spans areas from Credit Risk, Market Risk, Operational Risk , Basel III, Solvency II etc..the list goes on.
The basic idea in any risk calculation is to gather a whole range of quality data in one place and to run computations to generate risk measures for reporting.
So, how are various risk measures calculated currently?
Current Risk Architectures are based on traditional relational databases (RDBMS) architectures with 10’s of feeds from Core Banking Systems, Loan Data, Book Of Record Transaction Systems (BORTS) like Trade & Position Data (e.g. Equities, Fixed Income, Forex, Commodities, Options etc), Wire Data, Payment Data, Transaction Data etc.
These data feeds are then tactically placed in memory caches or in enterprise data warehouses (EDW). Once relevant data has been extracted, it is transformed using a series of batch jobs. These jobs then prepare the data for calculator frameworks which run their risk models across hundreds of scenarios.
All of the above need access to large amounts of data at the individual transaction level. The Corporate Finance function within the Bank then makes end of day adjustments to reconcile all of this data, and these adjustments need to be cascaded back to the source systems down to the individual transaction or classes of transaction levels.
These applications are then typically deployed on clusters of bare metal servers that are not particularly suited to portability, automated provisioning, patching & management. In short, nothing that can automatically be moved over at a moment’s notice. These applications also run on legacy proprietary technology platforms that do not lend themselves to a flexible, DevOps style of development.
Finally, there is always a need for statistical frameworks to make adjustments to customer transactions that somehow need to get reflected back in the source systems. All of these frameworks need to have access to, and an ability to work with, terabytes (TBs) of data.
Each of the above mentioned risk work streams has corresponding data sets, schemas & event flows that it needs to work across. They also have different temporal needs for reporting. Some need to be run a few times a day (e.g. Traded Credit Risk), some daily (e.g. Market Risk) and some at the end of the week (e.g. Enterprise Credit Risk).
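To make "run computations to generate risk measures" concrete, here is a deliberately simplified historical-simulation Value-at-Risk sketch. The synthetic P&L series stands in for the aggregated trade & position feeds described above; no real bank's model is this simple.

```python
import random

# Synthetic daily profit-and-loss figures for one trading book,
# standing in for 250 trading days of aggregated position data.
random.seed(42)
daily_pnl = [random.gauss(0, 1_000_000) for _ in range(250)]

def historical_var(pnl, confidence=0.99):
    """Loss threshold exceeded on only (1 - confidence) of historical days."""
    losses = sorted(-p for p in pnl)           # positive numbers = losses
    index = int(confidence * len(losses))      # 99th percentile of losses
    return losses[min(index, len(losses) - 1)]

var_99 = historical_var(daily_pnl)
# var_99 is the one-day 99% VaR implied by this historical window
```

Even this toy version shows why the data architecture matters: the calculation is just a sort and a percentile lookup, but it must see every transaction-level P&L figure, which is exactly the aggregation burden the feeds and batch jobs above exist to carry.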
Illustration – The Five Deadly Sins of Financial IT Architectures
Let us examine why this is in the context of these anti-patterns as proposed below –
THE FIVE DEADLY SINS…
The key challenges with current architectures –
Utter, total and complete lack of centralized Data leading to repeated data duplication – In the typical Risk Data Aggregation application, a massive degree of data is duplicated from system to system leading to multiple inconsistencies at the summary as well as transaction levels. Because different groups perform different risk reporting functions (e.g. Credit and Market Risk) – the feeds, the ingestion and the calculators end up being duplicated as well. A huge mess, any way one looks at it.
Analytic applications which are not designed for throughput – Traditional Risk algorithms cannot scale with this explosion of data as well as the heterogeneity inherent in reporting across multiple kinds of risks. E.g. certain kinds of Credit Risk need access to around 200 days of historical data, where one is looking at the probability of the counter-party defaulting in order to obtain a statistical measure of the same. These models are highly computationally intensive and can run for days if the data architecture cannot scale in providing efficient compute on massive volumes of data.
Lack of Application Blueprint, Analytic Model & Data Standardization – There is nothing that is either SOA or microservices-like in most RDA applications, and that precludes best practice development & deployment. All of this only leads to maintenance headaches. The reason that Cloud Computing based frameworks such as a PaaS (Platform as a Service) are highly elegant is that they enforce standardization of systems software components across the stack. Areas like Risk Model and Analytic development need to be standardized to reflect realities post BCBS 239 (and the upcoming FRTB). With the Volcker Rule reporting that bans prop trading activity on the part of Banks, they must now report on seven key metrics across 10s of different data feeds across PBs of data. Most existing Risk applications cannot do that without undertaking a large development and change management headache.
Lack of Scalability – It must be possible to operate a central system that can scale to carry the full load of the organization and operate with hundreds of applications built by disparate teams all plugged into the same central nervous system. One other factor to consider is the role of cloud computing in customer retention efforts. The analytical computational power required to understand insights from gigantic data sets is costly to maintain on an individual basis. The traditional owned data center will probably not disappear, but banks need to be able to leverage the power of the cloud to perform big data analysis in a cost-effective manner.
A Lack of Deployment Flexibility – The application & data requirements dictate the deployment platforms. This massive anti-pattern leads to silos and legacy OS’s that cannot easily be moved to containers like Docker & instantiated by a modular Cloud OS like OpenStack.
THE BUSINESS VALUE DRIVERS OF EFFICIENT ARCHITECTURES …
Doing IT Architecture right, and in a manner responsive to the business, results in critical value drivers. The drivers that are met & exceeded by this transformation are –
Effective Compliance with increased Regulatory Risk mandates ranging from Basel – III, FTRB, Liquidity Risk – which demand flexibility of all the different traditional IT tiers.
An ability to detect and deter fraud – Anti Money Laundering (AML) and Retail/Payment Card Fraud etc
Fend off competition from the FinTechs
Exist & evolve in a multichannel world dominated by the millennial generation
Reduced costs to satisfy pressure on the Cost to Income Ratio (CIR)
The ability to open up data & services that operate on the customer data to other institutions
A uniform architecture that works across all of these various workloads would seem a commonsense requirement. However, this is a major problem for most banks. Forward looking approaches that draw heavily from microservices based application development, Big Data enabled data & processing layers, the adoption of Message Oriented Middleware (MOM) & a cloud native approach to developing applications (PaaS) & deployment (IaaS) are the solution to the vexing problem of inflexible IT.
The question is whether banks can change before they see a perceptible drop in revenues over the years.
“IT driven business transformation is always bound to fail” – Amber Storey, Sr Manager, Ernst & Young
The value of Big Data driven Analytics is no longer in question both from a customer as well as an enterprise standpoint. Lack of investment in an analytic strategy has the potential to impact shareholder value negatively. Business Boards and CXOs are now concerned about their overall levels and maturity of investments in terms of business value – i.e. increasing sales, driving down business & IT costs & helping create new business models. It is thus increasingly clear that smart applications & the ecosystems built around them will dictate enterprise success.
Such examples among forward looking organizations abound across industries. These range from realtime analytics in manufacturing using IoT data streams across the supply chain, the use of natural language processing to drive patient care decisions in healthcare, more accurate insurance fraud detection & driving Digital interactions in Retail Banking, to name a few.
However, most global organizations currently adopt a fairly tactical approach to ensuring the delivery of traditional business intelligence (BI) and predictive analytics to their application platforms. This departmental approach is quite suboptimal, as scaleable data driven decisions & culture not only empower decision-makers with up to date and realtime information but also help them develop long term insights into how globally diversified business operations are performing. Scale is the key word here due to rapidly changing customer trends, partner & supply chain realities & regulatory mandates.
Scale implies speed of learning and business agility across the organization – the ability for globally diversified operations to turn on a dime – thus ensuring that the business feels empowered.
A quick introduction to Business (Descriptive & Predictive) Analytics –
Business intelligence (BI) is a traditional & well established analytical domain that essentially takes a retrospective look at business data in systems of record. The goal for BI is to primarily look for macro or aggregate business trends across different aspects or dimensions such as time, product lines, business units & operating geographies.
BI is primarily concerned with “What happened and what trends exist in the business based on historical data?“. The typical use cases for BI include budgeting, business forecasts, reporting & key performance indicators (KPI).
On the other hand, Predictive Analytics (a subset of Data Science) augments & builds on the BI paradigm by adding a “What could happen” dimension to the data in terms of –
being able to probabilistically predict different business scenarios across thousands of variables
suggesting specific business actions based on the above outcomes
Predictive Analytics does not intend to, nor will it, replace the BI domain; it only adds significant business capabilities that lead to overall business success. It is not uncommon to find real world business projects leveraging both these analytical approaches.
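A toy illustration of the two paradigms side by side. The sales figures and the one-variable trend model below are purely hypothetical stand-ins for real BI reports and predictive models.

```python
# Hypothetical monthly sales figures for one product line.
sales = [100, 104, 110, 113, 119, 125]

# BI / descriptive: "what happened" - aggregates over historical data.
year_to_date = sum(sales)
average_month = year_to_date / len(sales)

# Predictive: "what could happen" - a least-squares trend line
# extrapolated one month forward (a deliberately simple stand-in
# for a real statistical or machine-learning model).
n = len(sales)
xs = range(n)
x_mean = sum(xs) / n
y_mean = sum(sales) / n
slope = sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, sales)) / \
        sum((x - x_mean) ** 2 for x in xs)
intercept = y_mean - slope * x_mean
forecast_next_month = intercept + slope * n   # ≈ 129.1 for this series
```

The descriptive numbers answer "what were sales?"; the fitted trend answers "what could sales be next month?" – the two questions that separate BI from predictive analytics in the discussion above.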
Creating an industrial approach to analytics –
Strategic business projects typically treat a BI/Predictive Analytics based approach as an afterthought to the other aspects of system architecture and buildout. This dated approach ensures that analytics becomes external to, and eventually operates in a reactive mode within, the business system.
Having said that, one does need to recognize that an industrial approach to analytics is a complex endeavor that depends on how an organization tackles the convergence of the below approaches –
Organizational Structure
New Age Technology
A Platforms Mindset
Culture
Illustration – Embedding A Culture of Business Analytics into the Enterprise DNA..
Let’s discuss them briefly –
Organizational Structure – The historical approach has been to primarily staff analytics teams as a standalone division, often reporting to a CIO. This team has responsibility for both the business intelligence function as well as some silo of a data strategy. Such a piecemeal approach to predictive analytics ensures that business & application teams adopt a “throw it over the wall” mentality over time.
So what needs to be done?
In the Digital Age, enterprises should look to centralize both data management as well as the governance of analytics as core business capabilities. I suggest a hybrid organizational structure where a Center of Excellence (COE) is created which reports to the office of the Chief Data Officer (CDO) as well as individual business analytic leaders within the lines of business themselves.
This should be done to ensure that three specific areas are adequately tackled using a centralized approach-
Investing in creating a data & analytics roadmap by creating a center of excellence (COE)
Setting appropriate business milestones with “lines of business” value drivers built into a robust ROI model
Managing Risk across the enterprise with detailed scenario planning
New Age Technology –
The onset of Digital Architectures in enterprise businesses implies the ability to drive continuous online interactions with global consumers/customers/clients or patients. The goal is not just to provide engaging visualization but also to personalize services clients care about across multiple modes of interaction. Mobile applications first began forcing enterprises to support multiple channels of interaction with their consumers. We have seen how exploding data generation across the global economy has become a clear & present business & IT phenomenon. Data volumes are rapidly expanding across industries. However, it is not just the production of data that has increased; the growth is also driving the need for organizations to derive business value from it. This calls for the collection & curation of data from dynamic and highly distributed sources such as consumer transactions, B2B interactions, machines such as ATM’s & geo location devices, click streams, social media feeds, server & application log files and multimedia content such as videos etc – using Big Data.
Cloud Computing is the ideal platform to provide the business with self service as well as rapid provisioning of business analytics. Every new application designed needs to be cloud native from the get go.
A Platforms Mindset –
As opposed to building standalone or one-off business applications, a Platform Mindset is a more holistic approach capable of producing higher revenues. Platforms abound in the webscale world at shops like Apple, Facebook & Google etc. Applications are constructed like lego blocks and they reuse customer & interaction data to drive cross sell and up sell among different product lines. The key components here are to ensure that one starts off with products with high customer attachment & retention. While increasing brand value, it is key to ensure that customers & partners can also collaborate in the improvements in the various applications hosted on top of the platform.
Culture – Business value fueled by analytics is only possible if the entire organization operates on an agile basis in order to collaborate across the value chain. Cross functional teams across new product development, customer acquisition & retention, IT Ops, legal & compliance must collaborate in short work cycles to close the traditional business & IT innovation gap. Methodologies like DevOps – whose chief goal is to close the long-standing gap between the engineers who develop and test IT capability and the organizations that are responsible for deploying and maintaining IT operations – must be adopted. Using traditional app dev methodologies, it can take months to design, test and deploy software. No business today has that much time – especially in the age of IT consumerization and end users accustomed to smart phone apps that are updated daily. The focus now is on rapidly developing business applications to stay ahead of competitors that can better harness Big Data’s amazing business capabilities.
Enterprise wide business analytic approaches designed around the four key prongs (Structure, Culture, Technology & Platforms) will create immense operational efficiency, better business models, increased relevance and ultimately drive revenues. These will separate the visionaries, leaders from the laggards in the years to come.
Illustration: Business- IT Relationship (Image src – Pat.it)
Previous posts in this blog have discussed the fact that technological capabilities now make or break business models. It is critical for IT to operate in a manner that maximizes their efficiency while managing costs & ultimately delivering the right outcomes for the organization.
It is clear and apparent to me that the relationship lines of business (LOBs) have with their IT teams – typically central & shared – is completely broken at a majority of large organizations. Each side cannot seem to view either the perspective or the passions of the other. This dangerous dysfunction usually leads to multiple complaints from the business. Examples of which include –
IT is perceived to be glacially slow in providing the infrastructure needed to launch new business initiatives or to amend existing ones. This leads to the phenomenon of ‘Shadow IT’, where business applications are run on public clouds, bypassing internal IT
Something seems to be lost in translation while conveying requirements to different teams within IT
IT is too focused on technological capabilities – Virtualization, Middleware, Cloud, Containers, Hadoop et al – without much emphasis on business value drivers
So what are the top asks the business has for its IT groups? I wager that there are five important focus areas –
Transact in the language of the business – Most would agree that there has been too much focus on the technology itself: how it works, what infrastructure is required to host applications (cloud or on-prem), which data engines ingest and process the data, and so on. The focus needs to be on the customer needs that drive business value for an organization’s customers, partners, regulators & employees. Technology at its core is just an engine and does not exist in a vacuum. The most vibrant enterprises understand this ground reality and always ensure that business needs drive IT, not the other way around. It is thus highly important for IT leadership to understand the nuances of the business so that their roadmaps (long and medium term) are driven with business & competitive outcomes in mind. Examples of such goals include a common, organization-wide taxonomy across products, customers, logistics, supply chains & business domains. The shared emphasis of both business & IT should be on goals like increased profitability per customer and enhanced segmentation of both micro and macro customer populations, with the ultimate goal of increasing customer lifetime value (CLV).
Bi-Modal or “2 Speed” IT needs to be centered on the business – The digital business models driving agile web-scale companies offer enhanced customer experiences built on product innovation and data-driven business models. These companies are also encroaching into the domain of established industry players in verticals like financial services, retail, entertainment, telecommunications, transportation and insurance by offering contextual & trendy products tailored to individual client profiles. Their savvy use of segmentation data and realtime predictive analytics enables the delivery of bundles of tailored products across multiple delivery channels (web, mobile, point of sale, etc.). The enterprise response has been to adopt the Bi-Modal IT model championed by Gartner, which envisages two different IT camps – one focused on traditional applications and the other on innovation. Whatever the moniker for this approach, LOBs need to be involved as stakeholders from the get-go & throughout the process of selecting technology choices that have downstream business ramifications. One approach that is working well is increased cross-pollination across both teams: collapsing artificial organizational barriers by adopting DevOps & ensuring that the business has a slim IT component able to rapidly fill gaps in IT’s business knowledge or capability.
Self Service across the board of IT capabilities – Shadow IT (where the business goes around the IT team) is not just an infrastructure issue; it is slowly creeping up into business intelligence and advanced analytics apps. The delays associated with provisioning legacy data silos, combined with tools that are neither intuitive nor able to scale to the increasing data deluge, are making timely business analysis almost impossible to perform. Insights delivered too late are not very valuable. Thus, LOBs are beginning to move to a predominantly online SaaS (Software as a Service) model across a range of business intelligence applications: reports and visual views of internal & external datasets are served directly to internal consumers from data uploaded into a cloud-based BI provider. IT needs to enable this capability and make it part of its range of offerings to the business.
Help the business think analytically – Business Process Management (BPM) and data-driven decision making are proven approaches at data-driven organizations. Combined with data and business analytics, this tends to be a killer combination. Organizations that are data & metric driven are able to define key business processes with native support for the key performance indicators (KPIs) that are critical to their functioning. Applications developed by IT need to be designed in such a way that these KPIs can be communicated and broadcast across the organization constantly. Indeed, a high percentage of organizations now have a senior executive in place as the champion for BPM, business rules and Big Data driven analytics. These applications are also mobile native, so that field-based employees can access them through a variety of mobile platforms & back into the corporate firewall.
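As a toy illustration of building KPI computation directly into an application, a roll-up from raw transactions to broadcast-ready metrics might look like the sketch below. All field names and record shapes here are hypothetical, invented purely for the example:

```python
from statistics import mean

# Hypothetical daily order records; field names are illustrative only
orders = [
    {"day": "2016-01-04", "revenue": 1200.0, "new_customer": True},
    {"day": "2016-01-04", "revenue": 450.0,  "new_customer": False},
    {"day": "2016-01-05", "revenue": 980.0,  "new_customer": True},
]

def compute_kpis(orders):
    """Roll raw transactions up into the KPIs the business tracks."""
    total = sum(o["revenue"] for o in orders)
    new_share = mean(1.0 if o["new_customer"] else 0.0 for o in orders)
    return {
        "total_revenue": total,
        "avg_order_value": total / len(orders),
        "new_customer_share": new_share,
    }

kpis = compute_kpis(orders)
print(kpis["total_revenue"])  # 2630.0
```

The point is less the arithmetic than the design choice: KPIs are first-class outputs of the application, recomputed continuously, rather than numbers assembled in a spreadsheet after the fact.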
No “Us vs Them” mentality – it is all “Us” – None of the above is possible unless the entire organization operates on an agile basis and collaborates across the value chain. Cross-functional teams spanning new product development, customer acquisition & retention, IT Ops, and legal & compliance must work in short cycles to close the traditional business & IT innovation gap. One of the chief goals of agile methodologies is to close the long-standing gap between the engineers who develop and test IT capability and the business requirements for such capabilities. Using traditional app dev methodologies, it can take months to design, test and deploy software – which is simply unsustainable.
Business & IT need to collaborate. Period. –
The most vibrant enterprises that have implemented web-scale practices not only offer “IT/Business as a Service” but have also instituted strong cultures of symbiotic relationships among customers (both current & prospective), employees, partners and developers.
No business today has much time to innovate – especially in the age of IT consumerization, where end users are accustomed to smartphone apps that are updated daily. The focus now is on rapidly developing business applications to stay ahead of competitors who can better harness technology’s business capabilities.
“The data fabric is the next middleware.” –Todd Papaioannou, CTO at Splunk
Enterprises across the globe are confronting the need to create a Digital Strategy. While the term itself may seem intimidating to some, it essentially represents an agile culture built on customer centricity & responsiveness. The only way to attain Digital success is to understand your customers at a micro level while making strategic decisions about your offerings to the market. Big Data has become the catalyst in this massive disruption, as it can help businesses in any vertical understand their customers better. It does this by providing the foundational platform for compelling products.
We have seen how exploding data generation across the globe has become a clear & present business & IT phenomenon. Data volumes are rapidly expanding across industries. It is not just that the production of data has increased; that growth is also driving the need for organizations to derive business value from it. This calls for the collection & curation of data from dynamic, highly distributed sources such as consumer transactions, B2B interactions, machines such as ATMs & geo-location devices, clickstreams, social media feeds, server & application log files, and multimedia content such as videos. It should be noted that these data volumes come in multi-varied formats, differing schemas, transport protocols and velocities.
The Internet of Things (IoT) has become a phenomenon unto itself. It is truly a horizontal vertical (no pun intended), as the proliferation of sensor applications is causing rapid change in system & application architectures. The IoT system is burgeoning from the initial sensors, digital devices and mechanical automatons to cars, process monitoring systems, browsers, televisions, traffic cameras and more.
Big Data is thus crossing the innovation chasm. A vast majority of early-adopter projects are finding business success with a strong gain in ROI (Return on Investment). The skills gap is slowly beginning to decrease, with the Hadoop ecosystem becoming a skill that every modern application developer needs to have. Increasingly, customers are leading the way by deploying Big Data in new and previously uncharted areas like cybersecurity, leading to massive cross-vertical interest.
The five elements in Digital Transformation, irrespective of the business vertical you operate in, are –
The first element in Digital is Customer Centricity.
Big Data drives this in myriad ways –
Obtaining a realtime Single View of an entity (typically a customer across multiple channels, product silos & geographies)
Customer Segmentation by helping businesses understand their customers down to the individual level as well as at a segment level
Customer sentiment analysis by combining internal organizational data, clickstream data, sentiment analysis with structured sales history to provide a clear view into consumer behavior.
Product Recommendation engines, which provide compelling personal product recommendations by mining realtime consumer sentiment and product affinity information alongside historical data.
Market Basket Analysis, observing consumer purchase history and enriching this data with social media, web activity, and community sentiment regarding past purchases and future buying trends.
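The last two capabilities above rest, at their simplest, on co-occurrence statistics over purchase histories. A minimal pure-Python sketch of market basket analysis follows; the products and baskets are invented for illustration, and real systems would of course mine far larger datasets with distributed tooling:

```python
from collections import Counter
from itertools import combinations

# Illustrative purchase histories; product names are made up
baskets = [
    {"bread", "butter", "jam"},
    {"bread", "butter"},
    {"bread", "milk"},
    {"butter", "jam"},
]

def pair_counts(baskets):
    """Count how often each pair of products is bought together."""
    counts = Counter()
    for basket in baskets:
        for pair in combinations(sorted(basket), 2):
            counts[pair] += 1
    return counts

def confidence(counts, item_counts, a, b):
    """P(b in basket | a in basket) -- the classic association-rule score."""
    pair = tuple(sorted((a, b)))
    return counts[pair] / item_counts[a]

item_counts = Counter(item for b in baskets for item in b)
counts = pair_counts(baskets)
print(counts[("bread", "butter")])  # 2
```

A recommendation engine built on this idea would surface, for each item in a shopper’s cart, the partner items with the highest confidence scores.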
Realtime Multichannel Analytics is the second piece of a Digital Strategy.
Mobile applications first began forcing enterprises to support multiple channels of interaction with their consumers. For example, Banking now requires the ability to engage consumers in a seamless experience across an average of four to five channels – Mobile, eBanking, Call Center, Kiosk etc. The healthcare industry stores patient data across multiple silos – ADT (Admit Discharge Transfer) systems, medication systems, CRM systems etc. – but all of this must be exposed across different mediums of access. Data Lakes provide the ability to visualize all of a patient’s data in one place, thus improving outcomes. Every customer-facing application needs to be multi-channel and to support a unified 360-degree customer view across all these engagement points. Applications developed in 2016 and beyond must take a 360-degree approach to ensuring a continuous client experience across the spectrum of endpoints and the platforms that span them, from a Data Visualization standpoint. Every serious business needs to provide a unified view of a customer across tens of product lines and geographies. Big Data not only provides the core foundational elements for a realtime view of the moving parts of the business but also enables businesses to listen to their customers.
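At its core, a “single view” of the customer is a merge of per-channel records keyed on a common customer identifier. The toy sketch below illustrates the shape of the idea; the channel stores, field names and ids are all hypothetical, and real implementations must also solve identity resolution across systems that disagree on the key:

```python
# Hypothetical per-channel records keyed by a shared customer id
mobile   = {"c001": {"last_login": "2016-03-01", "preferred_channel": "mobile"}}
ebanking = {"c001": {"balance": 5400.0}, "c002": {"balance": 120.0}}
callctr  = {"c001": {"open_tickets": 1}}

def single_view(customer_id, *channel_stores):
    """Merge every channel's record for one customer into a 360-degree view."""
    view = {"customer_id": customer_id}
    for store in channel_stores:
        view.update(store.get(customer_id, {}))
    return view

profile = single_view("c001", mobile, ebanking, callctr)
print(profile["balance"])       # 5400.0
print(profile["open_tickets"])  # 1
```

In a Data Lake setting the same merge happens at scale, with the 360-degree view materialized for every customer rather than computed one id at a time.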
A strategic approach to improving Risk, Fraud & Compliance analytics can add massive value and competitive differentiation in three distinct categories as shown below.
Exponentially improve existing business processes, e.g. risk data aggregation and measurement, HIPAA/SOX/manufacturing compliance, fraud detection
Help create new business models and go to market strategies – by monetizing multiple data sources – both internal and external
Vastly improve regulatory compliance by generating fresher and more accurate insights across silos of proprietary data
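As a toy illustration of the fraud detection category above, one of the simplest statistical techniques is to flag transactions that fall far outside an account’s historical distribution. This is a sketch of the idea only, with invented amounts; production fraud models combine many more signals than amount alone:

```python
from statistics import mean, stdev

# Illustrative transaction amounts for one account
history = [42.0, 55.0, 38.0, 61.0, 47.0, 52.0, 44.0, 58.0]

def is_anomalous(amount, history, threshold=3.0):
    """Flag a transaction whose amount sits far outside the account's history."""
    mu, sigma = mean(history), stdev(history)
    return abs(amount - mu) > threshold * sigma

print(is_anomalous(50.0, history))    # False
print(is_anomalous(5000.0, history))  # True
```

The same z-score test, run over millions of accounts in parallel, is the kind of workload that moves naturally onto a Big Data platform.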
The onset of Digital Architectures in enterprise businesses implies the ability to drive continuous online interactions with global consumers/customers/clients or patients. The goal is not just to provide engaging visualization but also to personalize the services clients care about across multiple modes of interaction. Mobile applications first began forcing enterprises to support multiple channels of interaction with their consumers. For example, Banking now requires the ability to engage consumers in a seamless experience across an average of four to five channels – Mobile, eBanking, Call Center, Kiosk etc. Healthcare is a close second, where caregivers expect patient, medication & disease data at their fingertips with a few swipes on an iPad app.
The inability of outbound marketing campaigns to reach engaged customers proactively via the right channel has been a big gap in their effectiveness. The old-school strategy of blasting out direct mailers and emails no longer works, from either a cost or a customer engagement standpoint. Nowadays, campaigns for exciting new products & promotions need to be built on the rich customer intelligence assets that Big Data enables you to build. Examples of these capabilities are replete in sectors like Retail, where offering a positive purchase experience – personalized offers, price comparisons, social network based sharing of experiences et al – drives higher customer engagement & loyalty.
The Final Word –
My goal for this post was to communicate a business revelation that I have had in the past year. While the semantics of business processes, the use cases & the data sources, elements and formats may vary from industry to industry (e.g. Banking to Healthcare to Manufacturing to Telecom), the approaches as well as the benefits of a data & analytics driven business model remain essentially the same. These capabilities are beginning to separate the winners from the rest of the pack.
“Dream no small dreams for they have no power to move the hearts of men.” — Goethe
It is that time of the year again when the mavens at Gartner make their annual predictions regarding the top strategic trends for the upcoming year. ‘Strategic’ here means an emerging technology trend that will have long-term business impact, thus influencing plans & budgets. As before, I will be offering my own take on these while grounding the discussion in the Social, Mobile, Big Data Analytics & Cloud (SMAC) stack that is driving the ongoing industry revolution.
The Digital Mesh – The rise of the machines has been well documented, but enterprises are only recently waking up to the possibilities. Massive data volumes are now being reliably generated from diverse sources of telemetry as well as from endpoints at corporate offices (a consequence of BYOD). The former include sensors used in manufacturing, personal fitness devices like FitBit, home and office energy management sensors, smart cars, geo-location devices etc. Couple these with the ever-growing social media feeds, web clicks, server logs and more, and one sees a clear trend forming, which Gartner terms the Digital Mesh. The Digital Mesh leads to an interconnected information deluge encompassing classical IoT endpoints along with audio, video & social data streams. This poses huge security challenges as well as business opportunities for forward-looking enterprises (including governments). Applications will need to combine these into one holistic picture of an entity – whether individual or institution.
Information of Everything – The IoT era brings an explosion of data that flows across organizational, system and application boundaries. Look for advances in technology, especially in Big Data and Visualization, to help consumers harness this information in the right form, enriched with the right contextual information. In the Information of Everything era, massive amounts of effort will thus be expended on data ingestion, quality and governance challenges.
Ambient User Experiences –
Mobile applications first began forcing enterprises to support multiple channels of interaction with their consumers. For example, Banking now requires the ability to engage consumers in a seamless experience across an average of four to five channels – Mobile, eBanking, Call Center, Kiosk etc. The average enterprise user is familiar with BYOD in the age of self-service. The Digital Mesh only widens this gap in user experiences as information consumers navigate applications and consume services across a mesh that is both multi-channel and expected to provide a Customer 360 across all these engagement points. Applications developed in 2016 and beyond must ensure a smooth experience across the spectrum of endpoints and the platforms that span them, from a Data Visualization standpoint.
Autonomous Agents and Things –
Smart machines – robots, personal assistants like Apple Siri, automated home equipment – will rapidly evolve & become even smarter as their algorithms grow more capable and more aware of their environments. In addition, Big Data & Cloud computing will continue to mature and offer day-to-day capabilities around systems that employ machine learning to make predictions & decisions. We will see increased application of Smart Agents in diverse fields like financial services, healthcare, telecom and media.
Advanced Machine Learning – Most business problems are data challenges, and an approach centered on data analysis helps extract meaningful insights that aid the business. It is now common for enterprises to be able to acquire, store and process large volumes of data at low cost by leveraging Big Data and Cloud Computing. At the same time, the rapid maturation of scalable processing techniques allows us to extract richer insights from that data. What we commonly refer to as Data Science – a combination of econometrics, machine learning, statistics, visualization, and computer science – extracts valuable business insights hiding in data and builds operational systems to deliver that value. Data Science has evolved a new branch around Deep Neural Nets (DNNs). DNNs are what make it possible for smart machines and agents to learn from data flows and to make the products that use them even more automated & powerful. Deep Learning involves discovering data insights in a human-like pattern. The web-scale world (led by Google and Facebook) has been vocal about its use of Advanced Data Science techniques and the move of Data Science into Advanced Machine Learning.
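To make the machine learning discussion concrete, here is the simplest building block of the neural nets mentioned above: a single “neuron” (logistic regression) trained by gradient descent, in pure Python. The dataset is an invented, trivially separable toy; real deep learning stacks many such units and trains them on vastly larger data:

```python
import math

# Toy linearly separable data: label is 1 when x1 + x2 > 1
data = [((0.2, 0.1), 0), ((0.9, 0.8), 1), ((0.4, 0.3), 0),
        ((0.8, 0.9), 1), ((0.1, 0.4), 0), ((0.7, 0.7), 1)]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# One "neuron": two weights and a bias, learned by gradient descent on log loss
w1, w2, b = 0.0, 0.0, 0.0
lr = 1.0
for _ in range(2000):
    for (x1, x2), y in data:
        p = sigmoid(w1 * x1 + w2 * x2 + b)
        err = p - y               # gradient of log loss w.r.t. the logit
        w1 -= lr * err * x1
        w2 -= lr * err * x2
        b  -= lr * err

def predict(x1, x2):
    return 1 if sigmoid(w1 * x1 + w2 * x2 + b) > 0.5 else 0

print(predict(0.9, 0.9))  # 1
print(predict(0.1, 0.1))  # 0
```

A Deep Neural Net is, loosely, many layers of such neurons whose weights are learned jointly, which is what lets the smart agents above learn directly from raw data flows.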
3D Printing Materials –
3D printing continues to evolve and advance across a wide variety of industries. 2015 saw a wider range of materials – including carbon fiber, glass, nickel alloys and electronics – used in the 3D printing process. More and more industries continue to incorporate the printing and assembly of composite parts constructed from such materials, prominent examples being Tesla and SpaceX. We are at the beginning of a 20-year revolution that will lead to sea changes in industrial automation.
Adaptive Security – A cursory study of the top data breaches in 2015 reads like a “Who’s Who” of actors in society across governments, banks, retail establishments etc. The enterprise world now understands that a comprehensive & strategic approach to Cybersecurity has progressed from being an IT challenge a few years ago to a business imperative. As Digital and IoT ecosystems evolve into loose federations of API-accessible and cloud-native applications, more and more assets are in danger of being targeted by extremely well-funded and sophisticated adversaries. For instance, data from millions of IoT endpoints obviously requires data ingest & processing at scale. The security challenge is multilayered and arises not just from malicious actors but also from the lack of a holistic approach combining security with data governance, audit trails and quality attributes. Traditional solutions cannot handle this challenge, which is exacerbated by the expectation that in an IoT & Digital Mesh world, data flows will be multidirectional across a grid of application endpoints. Expect applications in 2016 and beyond to incorporate Deep Learning and Real Time Analytics into their core security design, with a view to analyzing large-scale data at very low latency.
Advanced System Architecture –
The advent of the Digital Mesh and ecosystem technologies like autonomous agents (powered by Deep Neural Nets) will make increasing demands on computing architectures from a power consumption, system intelligence and form factor perspective. The key is to provide increased performance while mimicking neurobiological architectures; the name given to this style of building electronic circuits is neuromorphic computing. System designers will have increased choice between field programmable gate arrays (FPGAs) and graphics processing units (GPUs). While both FPGAs and GPUs have their pros and cons, devices & computing architectures built on either are well suited to deep learning and the other pattern matching algorithms leveraged by advanced machine learning. Look for further reductions in form factor at lower power consumption, while allowing advanced intelligence in the IoT endpoint ecosystem.
Mesh App and Service Architecture – The microservices architecture approach, which combines the notions of autonomous, cooperative yet loosely coupled applications built as a conglomeration of business-focused services, is a natural fit for the Digital Mesh. The most important addition to microservices-based architectures in the age of the Digital Mesh is what I’d like to term Analytics Everywhere. Applications in 2016 and beyond will need to recognize that analytics are pervasive, relentless, realtime and thus embedded into our daily lives. Every interaction a user has with a microservices-based application will need a predictive capability built into the application architecture itself. Thus, 2016 will be the year when Big Data techniques are no longer the preserve of classical Information Management teams but move into the umbrella Application Development area, which encompasses the DevOps and Continuous Integration & Delivery (CI-CD) spheres.
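One way to picture “Analytics Everywhere” is a predictive score computed inline with every service response. The framework-free sketch below treats a microservice as a plain handler function; the handler, the hand-picked model weights and the field names are all invented for illustration, and a real service would call a trained model rather than hard-coded rules:

```python
def churn_score(profile):
    """Embedded analytics: a toy hand-weighted score, not a trained model."""
    score = 0.0
    score += 0.5 if profile.get("months_inactive", 0) > 3 else 0.0
    score += 0.3 if profile.get("open_tickets", 0) > 2 else 0.0
    score += 0.2 if not profile.get("has_mobile_app", False) else 0.0
    return score

def handle_get_customer(request):
    """The service response carries a prediction alongside the raw record."""
    profile = request["profile"]
    return {
        "customer_id": request["customer_id"],
        "profile": profile,
        "churn_risk": churn_score(profile),  # analytics embedded per request
    }

resp = handle_get_customer({
    "customer_id": "c001",
    "profile": {"months_inactive": 5, "open_tickets": 1, "has_mobile_app": True},
})
print(resp["churn_risk"])  # 0.5
```

The design point is that the prediction lives inside the request path of the service itself, not in a separate reporting system consulted after the fact.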
IoT Architecture and Platforms
There is no doubt in anyone’s mind that the IoT (Internet of Things) is a technology megatrend that will reshape enterprises, governments and citizens for years to come. IoT platforms will complement Mesh Apps and Service Architectures with a common set of platform capabilities built around open communication, security, scalability & performance requirements. These will form the basic components of IoT infrastructure, including but not limited to machine-to-machine interfaces, location-based technology, microcontrollers, sensors, actuators and communication protocols (based on an all-IP standard).
The Final Word –
One feels strongly that Open Source will drive the various layers that make up the Digital Mesh stack (Big Data, Operating Systems, Middleware, Advanced Machine Learning & BPM). IoT will be a key part of Digital Transformation initiatives.
However, the challenge of developing vertical capabilities on these IoT platforms is threefold, specifically in the areas of augmenting microservices-based Digital Mesh applications, which are largely lacking at the time of writing:
Data ingest in batch, near realtime (NRT) or realtime from dynamically changing, disparate and physically distributed sensors, machines, geo-location devices, clickstreams, files, and social feeds via highly secure lightweight agents
Provide secure data transfer using point-to-point and bidirectional data flows in real time
Curate these flows with Simple Event Processing (SEP) capabilities – tracing, parsing, filtering, joining, transforming, forking or cloning data flows while adding business context to them. As mobile clients, IoT applications, social media feeds etc. are brought onboard into existing applications from an analytics perspective, traditional IT operations face pressure from both business and development teams to provide new and innovative services.
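The curation steps above (parse, filter, enrich with business context) can be sketched as a chain of Python generators processing one event at a time. This is a toy stand-in for a real SEP engine; the pipe-delimited event format and the temperature threshold are invented for the example:

```python
# Simple Event Processing over a stream, modeled as chained generators:
# parse -> filter -> enrich (add business context), one event at a time.

raw_events = [
    "sensor-7|temp|41.5",
    "sensor-7|temp|98.2",
    "sensor-9|hum|55.0",
]

def parse(lines):
    for line in lines:
        device, kind, value = line.split("|")
        yield {"device": device, "kind": kind, "value": float(value)}

def only_temps(events):
    for e in events:
        if e["kind"] == "temp":
            yield e

def enrich(events, threshold=90.0):
    for e in events:
        e["alert"] = e["value"] > threshold  # business context added in-flight
        yield e

pipeline = enrich(only_temps(parse(raw_events)))
alerts = [e for e in pipeline if e["alert"]]
print(len(alerts))  # 1
```

Because generators are lazy, events flow through the whole chain one at a time, which is exactly the shape a streaming SEP engine gives you at scale.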
The creation of these smart services will further depend on the vertical industries these products serve, as well as on the requirements of the platforms that host them – e.g. industrial automation, remote healthcare, public transportation, connected cars, home automation etc.
Finally, 2016 also throws up some interesting questions around Cyber Security, namely –
a. Can efficient Cybersecurity be a lasting source of competitive advantage?
b. Given that most breaches are long-running in nature, with systems slowly compromised over months, how does one leverage Big Data and Predictive Modeling to rewire and re-architect creaky defenses?
c. Most importantly, how can applications implement security in a manner that constantly adapts and learns?
If a couple of sentences could sum up Gartner’s forecast for 2016 succinctly, they would be: “The emergence of the Digital Mesh & the rapid maturation of IoT will serve to accelerate business transformation across industry verticals. The winning enterprises will begin to make smart technology investments in Big Data, DevOps & Cloud practices to harness these changes.”