Cybersecurity and the Next Generation Datacenter (2/4)

The first blog of this four-part series introduced a key business issue in the Digital Age – Cybersecurity. We also briefly touched upon the responses being put in place by Corporate Boards. This second part focuses on technology strategies that enterprises can use to achieve resilience in the face of these attacks. The next post – part three – will focus on advances in Big Data Analytics that provide advanced security analytics capabilities. The final post of the series will focus on the business steps that Corporate Boards and Executive & IT leadership need to adopt from a governance & strategy standpoint to protect & insulate their businesses from the constant firehose of cyber attacks.

Growing reliance on IT breeds Cyber Insecurity – 

The increased reliance on information technology to run businesses, their supply chains and consumer-facing applications has led to a massive increase in cyber risk. Given that most organizations increasingly allow employees to remotely access critical systems, the need to provide highly secure computing has become more pronounced. This IT infrastructure – ranging from systems that store sensitive customer information to those holding financial data – has given rise to an entire industry segment for network and computer security. It has also created a burgeoning market of security experts across a range of cyber segments who tailor solutions to fit the operating requirements of their respective client organizations.

The core domains of Cyber Defense –

A fact of life facing the CIO, the CISO (Chief Information Security Officer) and the IT manager is that every enterprise corporate datacenter is currently a mishmash of existing legacy technology investments. These range from antiquated proprietary software and scattered open source investments to proprietary server, rack & networking architectures. The people & software process piece is then layered on top of all of these in the form of custom architecture tools and governance processes.

Layered across & within these are the typical security tools, frameworks and approaches that are employed commonly.

The important functional areas of Cybersecurity are listed below –

  • Intrusion Detection Systems (IDS)
  • Firewalls
  • Application Security leveraging Cryptography
  • Data Security
  • System Administration Controls
  • Server & Workstation Security
  • Server Management Procedures (patching, updating etc.)
  • Incident Response
  • Data Protection
  • Identity and Access Management (IAM)

These tools are also commonly extended to endpoint devices like laptops and mobile clients.

While these are all valid & necessary investments, security as a theme in IT is almost always an afterthought across the four primary technology domains in the datacenter – Infrastructure, Platforms, Data & Management.

Why is that?

From an IT philosophy & culture standpoint, security has historically been treated as a Non Functional Requirement (an “ility”) – a desirable or additive feature. As a result, most high level executives as well as IT personnel & end users have come to regard security as a process of running through checklists, installing cumbersome client tools and malware scanners, and complying with periodic audits.

However, recent hack attacks at major financial institutions, as discussed in the first post in this series, have brought to the fore the need to view Cybersecurity & defense as an integral factor in the IT lifecycle – not just an integral factor but a strategic component while building out applications & the datacenter architectures that host them.

Datacenter complexity breeds Cyber Insecurity – 

Information Technology as an industry is only around 30 years old, as compared to architecture or banking or manufacturing or healthcare, which have existed as codified bodies of knowledge for hundreds of years. Consequently the body of work on IT continues to evolve at a rapid clip. Over the last few decades, computing architectures have evolved from being purely mainframe based in the 1960s & 70s to, by the 1980s, a mishmash of a few hundred Unix servers running the entire application portfolio of a major enterprise like a financial institution.

Fast forward to 2016: a typical Fortune 500 enterprise now runs multiple data centers, each hosting hundreds of thousands of Linux & Windows based servers – bare metal or virtualized – alongside high end mainframes, legacy Unix systems etc. The system images alone can span tens of distinct operating systems. When one factors in the complex n-tier applications themselves along with packaged software (databases, application servers, message oriented middleware, business process management systems, ISV applications, systems utilities etc), the number of unique systems runs into 100,000 or more instances.

This complexity adds significantly to management tasks (maintaining, updating and patching servers) as well as to the degree of automation needed to derive business value at scale.

Security challenges thus rise manifold in the typical legacy technology dominated data center.

The top five day to day challenges from a security standpoint include –

  • Obtaining a ‘single pane of glass’ view from a security standpoint across the zones in the infrastructure
  • Understanding and gaining visibility in realtime across this complex infrastructure
  • Staying ahead of rapidly moving exploits like the Heartbleed OpenSSL vulnerability, the Shellshock bash vulnerability etc. The key is ensuring that all vulnerable systems are patched immediately
  • Understanding which platforms and systems are hosting applications that have been designated as “Non Compliant” for various reasons – legacy applications that are no longer maintained, out of support or unsupported software stacks that are way behind on patch levels etc
  • Proactively enforcing policies around security compliance and governance. These run the gamut from server patch policies to hardware configurations tailored to security zones (e.g. a server with too many NICs in a DMZ) to applications running versions of software that were never certified
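
Policy enforcement of this kind lends itself to automation. The sketch below is purely illustrative – the inventory records, field names and thresholds are all hypothetical – but it shows the shape of an automated compliance audit over server metadata:

```python
from datetime import date, timedelta

# Hypothetical inventory records; field names are illustrative only.
INVENTORY = [
    {"host": "web-01", "zone": "dmz", "nics": 2, "last_patched": date(2016, 1, 10)},
    {"host": "web-02", "zone": "dmz", "nics": 4, "last_patched": date(2015, 6, 1)},
    {"host": "db-01", "zone": "internal", "nics": 2, "last_patched": date(2016, 2, 1)},
]

# Example policy: DMZ hosts may have at most 2 NICs; every host must
# have been patched within 90 days of the audit date.
MAX_DMZ_NICS = 2
PATCH_WINDOW = timedelta(days=90)

def audit(inventory, today):
    """Return a list of (host, violation) pairs for the given audit date."""
    findings = []
    for rec in inventory:
        if rec["zone"] == "dmz" and rec["nics"] > MAX_DMZ_NICS:
            findings.append((rec["host"], "too many NICs for DMZ"))
        if today - rec["last_patched"] > PATCH_WINDOW:
            findings.append((rec["host"], "patch level out of date"))
    return findings

for host, violation in audit(INVENTORY, today=date(2016, 3, 1)):
    print(host, "->", violation)
```

A real implementation would pull the inventory from a CMDB and feed the findings into ticketing, but the policy-as-code principle is the same.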

Datacenter Architecture built for Cybersecurity – 

Can there be a data center architecture that is optimized for Cybersecurity from the ground up?

I contend that at a high level, four critical tiers and planes underlie every typical corporate information technology architecture.

These are  –

  1. Infrastructure tier – where Storage, Compute & Network provisioning reside
  2. Data tier – the sum total of all data assets, including OLTP systems & Big Data
  3. Application/Services tier – applications composed of services or microservices
  4. Management plane – which maintains the operator's and admin's view

Open source and Cybersecurity – Like two peas in a pod – 

Open source technology choices across the above layers provide strong security benefits. Open source platforms are maintained by a large and varied pool of contributors, which removes dependence on any one organization as a source of security support. The open development model ensures that hordes of developers – both corporate & hobbyist – agree on standards while constantly testing and improving the platforms. Platforms ranging from Red Hat Linux to open middleware to Hadoop to OpenStack have earned strong security reputations in their respective categories, and all of them have a high release velocity and rate of product updates & security fixes.

There has been a perception across the industry that while open source frameworks and platforms are invaluable for developers, they are probably not a good fit for IT operations teams who need a mix of highly usable management consoles as well as scripting facilities & monitoring capabilities. However, open source projects have largely closed this feature gap and then some over the last five years. Robust and mature open source management platforms now span the gamut across all the above disciplines as enumerated below.

  • OS Management – Systems Management Consoles
  • Application Middleware – end to end application deployment, provisioning & monitoring toolsets
  • Big Data & Open Source RDBMS – Mature consoles for provisioning, managing, and monitoring clusters
  • Cloud Computing – Cloud Management Platforms

The illustration below captures these tiers along with specifics on security; let's examine each of the tiers starting from the lowest –


         Illustration: Next generation Data Center with different technology layers 

Infrastructure Tier

The next generation way of architecting infrastructure is largely centered around Cloud Computing. A range of forward looking institutions are either deploying or testing cloud-based solutions that span the full range of cloud delivery models – private, public or hybrid.

Security and transparency are best enabled by a cloud based infrastructure for the below reasons.

  • Highly standardized application & OS stacks, which enable seamless patching across tiers
  • Workload isolation by leveraging virtualization or containers
  • The highest levels of deployment automation
  • The ability of cloud based stacks to scale up in an instant to handle massive amounts of streaming data

Cloud computing provides three main delivery models –

  • IaaS (infrastructure-as-a-service) to provision compute, network & storage
  • PaaS (platform-as-a-service) to develop applications
  • SaaS (software-as-a-service) to expose business services via APIs

There are three broad options when choosing a cloud based infrastructure –

  1. Leveraging a secure public cloud (Amazon AWS or Microsoft Azure)
  2. An internal private cloud (built on OpenStack etc.)
  3. A combination of the two, i.e. a hybrid approach

A hybrid approach is a safe and sound bet for any new or greenfield applications.

In fact many vendors now offer cloud based security products which offer a range of services from malware detection to monitoring cloud based applications like Google’s suite of office applications, Salesforce etc.

Data Tier – 

Enterprise data tiers are usually composed of different technologies – RDBMS, EDW (Enterprise Data Warehouses), CMS (Content Management Systems), Big Data etc. My recommended target state matches the appropriate technology to each usecase. For example, a Big Data platform powered by Hadoop is a great fit for data ingest, processing & long term storage; EDWs shine at reporting usecases; RDBMSs at online transaction processing; and Document Management Systems are fantastic at business document storage and retrieval. All of these technologies can be secured for both data in motion and data at rest.

Given the focus of the digital wave on leveraging algorithmic & predictive analytics capabilities to create tailored & managed consumer products, Hadoop is a natural fit as it is fast emerging as the platform of choice for analytic applications.

Big Data and Hadoop make security comparatively easy to bake in, as compared to a siloed approach, for the below reasons –

  1. Hadoop’s ability to ingest and work with all the above kinds of data & more (using the schema on read method) has been proven at massive scale. Operational data stores are being built on Hadoop at a fraction of the cost & effort of older data technologies (RDBMS & EDW). Since the data is all available in one place, it is much easier to perform data governance & auditing
  2. The ability to perform multiple types of security processing on a given data set. This processing spans batch, streaming, in memory and realtime modes, which greatly opens up the ability to create, test & deploy closed loop analytics quicker than ever before. In areas like security telemetry, where streams of data are constantly being generated, ingesting high volume data at high speeds and sending it to various applications for computation and analytics is key
  3. The DAS (Direct Attached Storage) model that Hadoop provides fits neatly with the horizontal scale out model that the services, UX and business process tiers leverage in a cloud based architecture
  4. The ability to retain data for long periods of time, thus providing security oriented applications with predictive models that can reason over historical data
  5. The ability to run massive volumes of models in a very short amount of time, which helps with modeling automation
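
The schema on read method from the first point can be illustrated in miniature. In the hypothetical sketch below, raw security events are landed untouched, and a schema is only applied when a query runs, so malformed records never block ingest:

```python
import json

# Raw events are landed as-is (schema on read): nothing is rejected at
# ingest time, so new fields or malformed records don't break the pipeline.
RAW_EVENTS = [
    '{"ts": "2016-03-01T10:00:00", "src": "10.0.0.5", "action": "login", "ok": true}',
    '{"ts": "2016-03-01T10:00:02", "src": "10.0.0.9", "action": "login", "ok": false}',
    'not-json garbage from a chatty appliance',
]

def read_with_schema(raw_events, fields):
    """Apply a schema at read time; skip records that don't parse."""
    rows = []
    for line in raw_events:
        try:
            rec = json.loads(line)
        except ValueError:
            continue  # the raw line stays in storage; it is only skipped for this query
        rows.append(tuple(rec.get(f) for f in fields))
    return rows

rows = read_with_schema(RAW_EVENTS, ["src", "action", "ok"])
```

On a real cluster the raw events would live in HDFS and the projection would run in a query engine, but the contract is the same: store first, interpret later.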

Techniques like Machine Learning, Data Science & AI feed into core business processes, thus improving them. For instance, Machine Learning supports the creation of self improving algorithms that get better with more data, making increasingly accurate cybersecurity & other business predictions. The overarching goal of the analytics tier should thus be to support a higher degree of automation by working with the business process and services tiers. Predictive Analytics can be leveraged across the Cybersecurity value chain and has begun to find increased adoption, with usecases ranging from behavior detection to telemetry data processing.
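
As a toy illustration of behavior detection, even a simple z-score test over historical telemetry captures the core idea of a model that learns normal behavior and flags deviations. The failed-login counts below are invented for the sketch:

```python
from statistics import mean, stdev

# Hypothetical telemetry: failed-login counts per hour for one account.
history = [3, 2, 4, 3, 2, 3, 4, 2, 3, 3]

def is_anomalous(history, observed, threshold=3.0):
    """Flag the observation if it lies more than `threshold` standard
    deviations above the historical mean (a simple z-score test)."""
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return observed != mu
    return (observed - mu) / sigma > threshold

print(is_anomalous(history, 41))  # a sudden spike in failures gets flagged
print(is_anomalous(history, 3))   # an ordinary hour does not
```

Production systems use far richer models, but they follow the same loop: learn a baseline from retained historical data, then score new observations against it.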

Services Tier

A highly scalable, open source & industry leading platform as a service (PaaS) is recommended as the way of building out and hosting this tier. A leading PaaS technology (e.g. Red Hat’s OpenShift) is hardened constantly for process, network, and storage separation for each of the tenants running on a private or public cloud. In addition, there is a focus on providing intrusion detection capabilities across files, ports, potential back doors etc.

An enterprise PaaS provides the right level of abstraction for both developers and deployers to encapsulate business functionality as microservices. This capability is provided via its native support for a Linux container standard like Docker, which can be hosted on either bare metal or any virtualization platform. This also has the concomitant advantage of standardizing application stacks and streamlining deployment pipelines, thus leading the charge to a DevOps style of building applications that can constantly protect against new security exploits. Microservices have moved from the webscale world to fast becoming the standard for building mission critical applications in many industries. Leveraging a PaaS such as OpenShift provides a way to help cut the “technical debt” [1] that has plagued both developers and IT Ops.

Further, I recommend that service designers build their microservices so that they can be deployed in a SaaS paradigm, which usually implies taking an API based approach. APIs promote security from the get-go through their ability to expose business functionality according to the end user's permission levels.

Further, APIs enable one to natively build or integrate security features into the applications themselves via simple REST/SOAP calls. These include APIs for data encryption, throttling traffic from suspect consumers, systems behavior monitoring & integration with Identity & Access Management systems.
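
Throttling traffic from suspect consumers is often implemented with a token bucket. The sketch below is a minimal, illustrative version; the rate, capacity and the fake clock used for determinism are all arbitrary choices:

```python
import time

class TokenBucket:
    """Minimal token-bucket rate limiter for throttling an API consumer."""
    def __init__(self, rate, capacity, clock=time.monotonic):
        self.rate = rate            # tokens added per second
        self.capacity = capacity    # maximum burst size
        self.tokens = capacity
        self.clock = clock
        self.last = clock()

    def allow(self):
        now = self.clock()
        # Refill proportionally to elapsed time, capped at capacity.
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False

# A fake clock makes the behavior deterministic for this sketch.
t = [0.0]
bucket = TokenBucket(rate=1, capacity=2, clock=lambda: t[0])
results = [bucket.allow(), bucket.allow(), bucket.allow()]  # burst of 3: third is denied
t[0] += 1.0                                                  # one second later, one token refilled
results.append(bucket.allow())
```

An API gateway would keep one such bucket per consumer key and map a denial to an HTTP 429 response.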

A DevOps oriented methodology is recommended in building applications in the following ways –

  • Ensuring that security tooling is incorporated into development environments
  • Leveraging resiliency & recoverability tools like Chaos Monkey (part of the Netflix Simian Army project) to constantly test systems for different kinds of vulnerabilities (e.g. abnormal conditions, random errors, massive amounts of traffic etc) from Day 1
  • Promoting horizontal scaling and resilience by testing live application updates, rollbacks etc
  • Leveraging a Cloud, OS & development language agnostic style of application development
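
Chaos-Monkey-style testing can be mimicked at unit-test scale by injecting random faults into a service call and verifying that callers recover. Everything here – the failure rate, the retry policy, the seed – is illustrative:

```python
import random

def flaky(service, failure_rate, rng):
    """Wrap a service call so that it randomly fails, Chaos-Monkey style,
    letting us verify that callers retry and degrade gracefully."""
    def call(*args):
        if rng.random() < failure_rate:
            raise ConnectionError("injected fault")
        return service(*args)
    return call

def with_retry(call, attempts=5):
    """A caller that tolerates injected faults by retrying."""
    for _ in range(attempts):
        try:
            return call()
        except ConnectionError:
            continue
    raise RuntimeError("service unavailable after retries")

rng = random.Random(42)  # seeded so the sketch is repeatable
lookup = flaky(lambda: "ok", failure_rate=0.5, rng=rng)
print(with_retry(lookup))
```

The real Simian Army kills whole instances in production; the point of starting this way on Day 1 is that resilience becomes a tested property rather than an assumption.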


User Experience Tier – 

The UX (User Experience) tier fronts humans – clients, partners, regulators, management and other business users – across all touch points. It also interacts closely with the APIs provided for partner applications and other non-human actors to reach the business services tier. Data and information security are key priorities at this layer, which offers secure connectivity to backend systems: data is transmitted over a secure pipe from device to backend across business applications.

The UX tier has the following global security responsibilities  – 

  1. Provide consistent security across all channels (mobile, eBanking, tablet etc) in a seamless, non-siloed way. The implication is that clients should be able to begin a business transaction in channel A and continue it in channel B where that makes business sense, with security carried across both channels
  2. Understand client personas and integrate with the business & predictive analytics tier in such a way that the UX is deeply integrated with the overall security architecture
  3. Provide advanced visualization (wireframes, process control, social media collaboration) that integrates with single sign on (SSO) & cross partner authentication
  4. Ensure that the UX's design, development & ongoing enhancement follow an Agile & DevOps methodology

The other recommendation for remote clients is to leverage desktop virtualization. In this model the user's device (a laptop, a terminal or a smartphone) performs essentially zero processing: it just displays a user interface or application (ranging from the simple to the complex – a financial application, office tools, a document management user interface etc) delivered from a secure server over a secure connection. These devices, known as zero clients, are highly secure because they run a golden image served from a highly protected central server, and they present a much smaller attack surface.

How to embed Cybersecurity into the infrastructure –  

How do all of the above foundational technologies (Big Data, UX, Cloud, BPM & Predictive Analytics) help encourage a virtuous cycle?

This cycle needs to be accelerated, helping create a learning organization that can outlast the competition through a culture of unafraid experimentation and innovation.

  1.  The Architecture shall support small, incremental changes to business services & data elements based on changing business requirements which include Cybersecurity
  2. The Architecture shall support standardization across application stacks, toolsets for development & data technology to a high degree
  3. The Architecture shall support the creation of a user interface that is highly visual and feature rich from a content standpoint when accessed across any device
  4. The Architecture shall support an API based model to invoke any interaction – by a client or an advisor or a business partner
  5. The Architecture shall support the development and deployment of an application that encourages a DevOps based approach
  6. The Architecture shall support the easy creation of scalable business processes that natively emit security metrics from the time they’re instantiated to throughout their lifecycle
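
The last requirement – business processes that natively emit security metrics throughout their lifecycle – can be sketched as a process object that counts security-relevant events from the moment it is instantiated. The process name and metric keys below are hypothetical:

```python
from collections import Counter

class ProcessMetrics:
    """Sketch of a business process step that natively emits security
    metrics (simple counters here) from instantiation onward."""
    def __init__(self, name):
        self.name = name
        self.metrics = Counter()
        self.metrics["instantiated"] += 1  # emitted the moment the process starts

    def authorize(self, role, allowed_roles):
        """Record every authorization decision as a security metric."""
        if role in allowed_roles:
            self.metrics["auth.granted"] += 1
            return True
        self.metrics["auth.denied"] += 1
        return False

process = ProcessMetrics("payment-approval")
process.authorize("approver", allowed_roles={"approver"})
process.authorize("intern", allowed_roles={"approver"})
print(dict(process.metrics))
```

In a real deployment these counters would be shipped to the analytics tier, so a spike in `auth.denied` becomes a behavior-detection signal rather than a buried log line.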


My goal in this post was to convince enterprise practitioners to shed their conservatism in adopting new approaches to building out applications & the datacenter architectures that host them. The inherent advantage of using Cloud, Big Data & Realtime analytics is that security can be intrinsically built into the infrastructure.

This post makes no apologies about being forward looking. Fresh challenges call for fresh approaches and a new mindset.



