How a Bank Chief Data Officer (CDO) describes Big Data to his Board Members & Lines of Business..


(Image Credit – SAS)

“Without big data analytics, companies are blind and deaf, wandering out onto the web like deer on a freeway.” – Geoffrey Moore

Much has already been written about the last five years heralding the dawn of the Data Age. The production, processing and consumption of data already dominate the Internet. Whether it is social media data, relationship data or search data, we can all agree that much of human interaction is beginning to center on data and the intricate patterns in those mountains of data.

The term Twitter Revolution (referring to the various revolutions and protests coordinated using the social networking site Twitter to plan protests, mobilize demonstrators and broadcast news around the world) is now part of the lexicon. Remember the Arab Spring?

https://en.wikipedia.org/wiki/Twitter_Revolution

Just as Data has radically changed the social landscape, the aftershocks are beginning to be felt in how businesses operate, innovate and evolve. Just as Amazon, Google, Yahoo and Twitter et al have built data driven platforms, players in the traditional Fortune 500 are now endeavoring to do so. Whatever moniker one uses to refer to this phenomenon – Digital Platform, Business Transformation or Next Generation Analytics – these are all essentially underpinned by Data.

In Financial Services & Banking, most people would agree that there has been too much focus on the technology itself – how it works, what the infrastructure requirements are to ingest and process it – rather than on its practical uses.

So it was very interesting that at a recent meeting with a Chief Data Officer (CDO) of a large global financial institution, the discussion centered around how one would describe Big Data to non-technical business leaders.

This gentleman (who shall sadly remain unnamed for confidentiality purposes) is highly regarded in the industry & someone who came up from the ground floor, working across areas as diverse as Treasury, Retail Banking and Risk Management. He has stewarded and led a large Enterprise Data program (with effective data governance, quality & security) over the last three years. An expert in high finance yet possessing a strong curiosity for and understanding of technology (& its benefits from a competitive standpoint) – a rare mix.

While reminiscing on his journey over the last few years (in transforming the Bank into an information focused enterprise), he described his succinct approach to educating the business on Big Data and thus bridging this critical knowledge gap.

His signature statement is so simple yet profound (and delivered with his customary clarity) that I had to post it here for everyone’s benefit.

“If done right, Big Data is like one of those things in life that makes everything better.”

And what are the key benefits as he explains them?

  1. Enormous Business Value in areas as diverse as the Defensive (Risk, Fraud and Compliance – RFC), Competitive Parity (e.g. Single View of Customer) and the Offensive (Digital Transformation across the Retail Banking business)
  2. Drastically Reduced Time to Market for new business projects
  3. Hugely Improved Quality & Access to information & realtime analytics for customers, analysts and other stakeholders
  4. Huge Reduction in CapEx & OpEx spend on data management projects  (Big Data augments and even helps supplant legacy investments in MPP systems, Data Warehouses, RDBMS’s etc)
  5. Becoming the Employer of Choice for talent due to their vocal thought leadership in this field – in areas as diverse as Hadoop, Data Science and Machine Learning

I am truly struck by how refreshing & jargon free this approach is – to such an intensely technical subject.

The key here is his describing Big Data as a holistic business strategy (and not yet another tactical IT project) to the Board & Line of Business leaders. Never mind the technology & its nitty-gritty; it is the ability of the leadership to articulate and orchestrate a response that will differentiate the winners from the also-rans in the long run. As I’ve pointed out before – your Data assets and what you are able to do with them will dictate your organization’s enterprise destiny.

The Business Case for Big Data in Financial Exchanges..(2/2)


The first post in this two part series focused on the competitive dynamics in the financial exchange landscape. For established institutions that have a huge early mover advantage, the ability to compete with innovative players by using fresh technology approaches is critical to engaging customers. This post will focus on a fresh strategy approach from a business and IT perspective.

Traditional players in the financial exchange market have been taken by surprise by the raft of deregulation that has occurred in the business. New entrants, unencumbered by legacy IT and legacy thinking, have focused on innovation. Such business models – predicated on agile systems, rapid & iterative development and, more importantly, a Data First strategy – have helped the upstarts capture a good chunk of market share.

Traditional leaders now conduct only around 50-60% of total traded instrument volumes worldwide.

(Ref – http://www.statista.com/statistics/270127/largest-stock-exchanges-worldwide-by-trading-volume/)  

Before we delve further into the business strategy implications, let’s first recap the business landscape from the perspective of the incumbent exchange operators –

  • Increased competitive dynamics leading to loss of liquidity and trading volumes
  • The use of electronic trading now means that systems match buy and sell orders without manual intervention, taking trading volume away from the incumbents
  • New entrants run low cost operations, as opposed to the personnel intensive business of a few years ago, relying heavily on technology & automation and thus minimal headcount
  • The new model of “Maker/Taker Pricing” – i.e. paying members rather than having them pay the usual exchange fees to trade on the platform, as long as their trading adds liquidity rather than takes it – upends the traditional membership route of the traditional exchange (ref – Wikipedia)
  • Volume based trading incentives initially offered by the upstarts generate higher volumes from select customers

In the midst of all this, the technology landscape is undergoing seismic shifts both from a paradigm and from a culture perspective with five major trends being witnessed in the industry –

  1. Disruptive technology (namely Big Data, Cloud Computing & Mobile Clients) offers a great way to architect agile and flexible computing platforms
  2. Exploding data volumes both from a size as well as a variety perspective – given that social media sources like Twitter are now being commingled with existing data assets to create curated ‘data pools’ that can aid in realtime decision making
  3. Moore’s law continues to push processing power upward and costs downward. Commodity hardware (x86) based Compute, Storage and Networking approaches dominate from a feature perspective while cutting costs for enterprise IT
  4. A new problem-solving mindset buoyed by Data Science and Predictive Analytics is changing the manner in which organizations are harnessing data and making decisions
  5. The common thread tying all the above approaches together – Enterprise Open Source. Providers beginning with open source pioneer Red Hat have gradually moved from upstart to incumbent in large swathes of the financial industry over the last 10 years. Open Source is mission critical and highly proven in stringent environments (like exchanges) that need near 100% uptime

Perhaps more than in any other area of financial services, data is the currency of an exchange. Buy orders for different financial instruments like equities, bonds & options are sent in and matched with sell orders. Tick data for thousands of symbols is provided to market participants as a service & disseminated to tens of thousands of terminals across multiple markets. The data management landscape here has long been dominated by relational database technology (RDBMS), which mandates storing information in a structured format on large & expensive servers.

On the other hand, Hadoop & NoSQL systems can ingest any kind of data and replicate it across clusters of hundreds to thousands of commodity servers, making data access much cheaper, more agile and able to support a multitude of processing paradigms. What is more, compute applications (e.g. trade lifecycle analytics, market surveillance, regulatory reporting etc) can be run directly on the datasets themselves, resulting in architectures that are simple yet highly scalable.
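To make the “run compute directly on the data” point concrete, here is a minimal PySpark sketch that computes a per-symbol VWAP over tick data sitting in a Hadoop cluster. The HDFS path and column names (symbol, price, volume) are hypothetical stand-ins for whatever schema a real deployment would use.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("TickAnalytics").getOrCreate()

# Read tick data directly off the cluster - no extraction into a separate warehouse
ticks = spark.read.parquet("hdfs:///marketdata/ticks/")  # hypothetical path & schema

# Volume-weighted average price per symbol, computed where the data lives
vwap = (ticks
        .groupBy("symbol")
        .agg((F.sum(F.col("price") * F.col("volume")) / F.sum("volume")).alias("vwap")))

vwap.show()
```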

So how can firms begin to incorporate the above ideas into their product lifecycle and create a roadmap for innovation? To analyze this, we will use the Value Discipline framework created by Michael Treacy and Fred Wiersema.

This was first proposed by them in the hugely influential article “Customer intimacy and other value disciplines” published in 1993 in the Harvard Business Review (Ref – https://hbr.org/1993/01/customer-intimacy-and-other-value-disciplines) & elucidated further in their groundbreaking book “The Discipline of Market Leaders“.

They postulate that firms seeking to succeed in a tremendously competitive marketplace need to create strategies in three broad areas (or value disciplines): operational excellence, customer intimacy and product leadership.


Ref – http://www.toolshero.com/value-disciplines-treacy-wiersema/

Operational excellence focuses on providing customers with highly reliable products and services. This makes a lot of sense in the exchange markets, where past flash crashes and interruptions in trading garner not just bad press but also carry severe reputational risk. Investing in technologies proven at web scale (read open source) which guarantee high availability and a superior degree of automation is key. Operational excellence as a discipline is also concerned with creating a culture of unafraid experimentation that constantly helps a product, process or service improve the customer experience.

Product leadership deals with creating leading edge products and services that constantly disarm slower competitors. Techniques that guarantee that the right data is in the hands of the right employee at the right time ensure that contextual services can be offered in real time to customers. This has the effect of optimizing existing workflows while also enabling the iterative creation of new products and business models.

The third prong of value discipline is Customer Intimacy. This is the most important yet, sadly, the most discounted virtue in this area of financial services, owing to a perception that exchanges are mere facilitators of electronic transactions. This perception is often strengthened by onerous regulatory and compliance mandates.

How can customer intimacy be increased both from a macro and a micro perspective? For instance –

• Using the valuable trade and position data they possess, firms can better segment customers and also create models that learn from historical & current data to predict behavior (see the sketch after these bullets)

• Obtaining a single view of customer (360 degree view) to hyper-target individual institutional customers with better data products & promotions. These products could ultimately be offered as a SaaS (Software As a Service) over APIs
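As a minimal sketch of the segmentation idea in the first bullet, customers can be clustered on a few activity features using scikit-learn. The features and numbers here are entirely hypothetical – a real model would draw on the firm’s actual trade and position history.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Hypothetical per-customer features: [avg daily notional, trade count, asset classes traded]
X = np.array([
    [1_200_000, 450, 3],
    [   15_000,  12, 1],
    [  980_000, 390, 4],
    [   22_000,  25, 2],
    [2_500_000, 800, 5],
    [   18_000,  15, 1],
])

# Standardize features so notional does not dominate, then cluster into two segments
segments = KMeans(n_clusters=2, n_init=10, random_state=42).fit_predict(
    StandardScaler().fit_transform(X))
print(segments)  # e.g. separates high-touch institutional flow from the long tail
```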

The core argument proposed by Treacy and Wiersema was that firms should pick one of the above areas to excel at while staying competitive in the other two.

Expanding on the themes above, here are six key business areas where Big Data can help exchanges create intelligent platforms & applications –

  1. Risk & Compliance Analytics – Provide tools to customers that help analyze regulatory compliance, risk & trading activity. The idea is to combine historical data and tick data and offer them as a service to algorithmic traders & bots, moving exchanges toward becoming information brokers. Players may already be doing this in silos, but Big Data techniques can augment and enhance existing approaches.
  2. Trade Surveillance – This is to ensure that abusive trading practices are effectively monitored and detected – thus avoiding reputational risk.
  3. Post Trade Analysis – Post trade analytics (contrasting tick data with execution data as a way of showing regulators that you did what you were supposed to do). This is also important from an audit trail perspective & for following appropriate regulatory reporting standards
  4. Trading Analytics – Analytics for High Frequency Trading by combining data from new information sources like Social media with historical information
  5. 360 degree view of trades on a realtime basis – Financial exchanges make revenue from trading commissions, so they need to understand their customers across all financial instruments and know which customers are doing the most business on the exchange. Customer fees go down as their volumes go up.
  6. Risk Management – a variety of use-cases here, ranging from trade risk management to internal & external reporting

The bottom line – in an industry dominated by the speeds and feeds of data, Big Data enables a relook at business and IT strategy. Improving operational excellence, product leadership & deeper customer relationships are all key as they drive increased volume, customer loyalty, revenue and ultimately profitability.

Players can only ignore this megatrend at their peril.

The Business Case for Big Data in Financial Exchanges..(1/2)


A vital cog in the world’s financial industry, financial exchanges have always been an interesting segment within the broader capital markets ecosystem. They primarily serve as a vehicle for household savings – nearly 50% of households directly or indirectly hold investments in the stock market – while governments & corporations rely on them to raise capital. This two part series explores the structure of financial exchanges, the competitive pressures they are under and how Big Data can help create new business models, ultimately driving profitability in a tremendously competitive marketplace.

The Capital Markets value chain encompasses firms on the buy side (e.g. wealth managers), the sell side (e.g. broker dealers) & firms that provide custodial services, as well as technology providers whose platforms support post trade analytics. The crucial link to all of these is the execution venues themselves as well as the clearing houses. With increased globalization driving the capital markets and an increasing number of issuers, one finds an ever increasing amount of complexity across a range of financial instruments and assets (stocks, bonds, derivatives, commodities etc.).

The primary exchanges for equity (stock) trading have been majors like NYSE, NASDAQ and LSE (the London Stock Exchange). Futures and Options are dominated by CME and EUREX. However, deregulation has also resulted in increased fragmentation – i.e. the traditional leaders above now face competition from non traditional exchange operators like Electronic Communication Networks, Crossing Networks (e.g. investment banks developing their own internal crossing systems to match buyers & sellers) & Dark Liquidity Pools.

Various other moves, like the decimalization of stock prices, have led to smaller commissions, while after hours trading has driven increased adoption of cutting edge technology. Buy side players now have multi-varied access to venues that can give them the best price for their trade. A case in point are Dark Pools, which are forums for trading securities in a private manner – typically not available to the retail investor & thus named for their lack of transparency. Dark pools facilitate block trading by highly sophisticated investors who do not wish to signal their trade strategy to the wider market by placing large orders on an open exchange (which then needs to report the trade to regulators like FINRA, for instance).

Dark pools directly execute orders without routing them through traditional venues, and some even use their own clearing operations. These Dark Pools may be owned by investment banks (e.g. Credit Suisse’s CrossFinder, Goldman’s Sigma X), by exchanges or ECNs (e.g. Instinet), or by specific trading firms themselves.

The impact of these new trading avenues has been disruptive to the traditional exchanges due to the below reasons –

  • Increased competition leading to loss of liquidity and trading volumes
  • The use of electronic trading now means that systems match buy and sell orders without manual intervention, taking trading volume away from the incumbents
  • New entrants run low cost operations, as opposed to the personnel intensive business of a few years ago, relying heavily on technology & automation and thus minimal headcount
  • The new model of “Maker/Taker Pricing” – i.e. paying members rather than having them pay the usual exchange fees to trade on the platform, as long as their trading adds liquidity rather than takes it – upends the traditional membership route of the traditional exchange (ref – Wikipedia)
  • Volume based trading incentives initially offered by the upstarts – discounts for institutions & traders that generate higher volumes

Thus, shrinking margins, new upstarts & regulatory pressures are further fragmenting trading venues, leading traditional players to examine how existing business processes & systems integration could be made more transparent, efficient and agile. Players across the board are also facing significant profit pressures as volumes (and hence commissions) have decreased in the years following the market drops of the 2008 financial crisis.

The response from the exchanges can be summed up in four broad categories  –

  1. Consolidate via mergers & acquisitions (e.g. NYSE/Euronext-ICE, CME-CBOT, BATS-DirectEdge) to achieve better economies of scale
  2. Vertically integrate Clearing functions into their business operations (e.g. NASDAQ’s launch of Genium)
  3. Launching new products based on data and applications
  4. Creating more efficient business processes

Even with #1, #2 and #4 above, the problem (#3) of launching new products – i.e. creating new revenue streams – still exists and continues to bedevil the business. While creating highly efficient business processes (#4) can definitely drive operational excellence, it certainly does not create a sustainable competitive edge by itself.

Given this backdrop, the World Federation of Exchanges (WFE) just published their report for the first half of 2015 and things are looking up for the industry as a result of sustained global economic expansion.

The main trends were: (Source – WFE)

– The value of share trading rose 36% worldwide to 59 trillion USD in H1 2015 from the second-half of 2014 (+58% year on year).

– The number of trades also rose 36% from the second half of 2014 (+67% year on year).

– Global market capitalization rose 8% to 73 trillion USD from the second half of 2014 (+6% year on year).

– Strong increase in IPOs and in investment flows. The number of IPOs increased by 19% year on year. Total investment flows increased by 42% year on year.

– Exchange Traded Derivatives (ETD) volumes decreased by 3.5% compared with the second half of 2014, mainly driven by commodity and currency derivatives.

(Chart – Volumes of share trading; Source – WFE)

It is interesting that a huge percentage of this surge is being led by the Asian markets – which grew 219% YoY in Order Book trading value.

(Chart – Trading volumes by geography; Source – WFE)

Given all of the above & the increasingly regional flavor the global market is acquiring, how are firms to expand into more business focused, value generating areas? What are they to do from a strategic perspective?

And more importantly how do they avoid becoming a casualty in an increasingly commodifying industry? 

A succinct technology response is that firms need to tap into the vast reserves of data they possess to create new value added services that leverage their equally vast computing infrastructures. At a high level, there are a few ways to make high profit margins on existing or new product lines – creating very sophisticated trading infrastructures for certain kinds of financial instruments, and increasing adoption of execution automation while guaranteeing clients the best price, speed and the lowest transaction costs. On the other side of the equation, providing a human interface to cater to the needs of high value clients is key – thus enter the 360 degree view of the customer.

The next blogpost will expand on the above themes from a strategic innovation framework standpoint. We will throw up some interesting ideas for players in light of the substantial data assets the incumbents possess. Finally, we will examine how all of this can be leveraged to create a sustainable competitive advantage via continuous innovation.

Financial Services IT begins to converge towards Software Defined Datacenters..

Previous posts in this blog have commented on the financial services industry as increasingly undergoing a gradual makeover if not outright transformation – both from a business and IT perspective.  This is being witnessed across the spectrum that makes up this crucial vertical –  Retail & Consumer Banking, Stock Exchanges, Wealth Management/ Private Banking & Cards etc.

The regulatory deluge (Basel III, Dodd Frank, CAT Reporting, AML & KYC etc) and the increasing sophistication of cybersecurity threats have completely changed the landscape that IT finds itself in – compared to even five years ago.

Brett King writes in his inimitable style about the age of the hyper-connected consumer, i.e. younger segments of the population who expect to be able to bank from anywhere – from a mobile device or via the Internet on their personal computers – instead of just walking into a physical branch.

Further, multiple Fintechs (like WealthFront, Kabbage, Square, LendingClub, Mint.com, Cryptocurrency based startups etc) are leading the way in pioneering a better customer experience. For an established institution that has a huge early mover advantage, the ability to compete with innovative players by using fresh technology approaches is critical to engaging customers.

All of these imperatives place a lot of pressure on Enterprise FS IT to move from an antiquated command and control model to being able to deliver on demand services with the speed of an Amazon Web Services.

These new services are composed of Applications that encompass paradigms ranging from Smart Middleware, Big Data, Realtime Analytics, Data Science, DevOps and Mobility. The common business thread to deploying all of these applications is to be able to react quickly and expeditiously to customer expectations and requirements.

Enter the Software Defined Datacenter (SDDC). Various definitions exist for the term, but I wager it means – “a highly automated & self-healing datacenter infrastructure that can quickly deliver on demand services to millions of end users and internal developers without imposing significant headcount requirements on the enterprise“.

Let’s parse this below.

The SDDC encompasses SDC (Software Defined Compute) , SDS (Software Defined Storage), SDN (Software Defined Networking), Software Defined Applications and Cloud Management Platforms (CMP) into one logical construct as can be seen from the below picture.

FS_SDDC

At the core of the software defined approach are APIs. APIs control the lifecycle of resources (request, approval, provisioning, orchestration & billing) as well as the applications deployed on them. The SDDC implies commodity hardware (x86) & a cloud based approach to architecting the datacenter.
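To illustrate the API-driven lifecycle, here is a hedged sketch of what requesting and tracking a compute resource might look like against a hypothetical cloud management API. The endpoint, payload fields and token are illustrative, not any specific product’s interface:

```python
import requests

BASE = "https://cloud.example-bank.com/api/v1"  # hypothetical CMP endpoint
HEADERS = {"Authorization": "Bearer <token>"}   # placeholder credential

# Request: one API call encapsulates request, approval and provisioning policy
resp = requests.post(f"{BASE}/instances", headers=HEADERS, json={
    "flavor": "m1.large",
    "image": "rhel-7",
    "network": "lob-retail-banking",
    "labels": {"cost_center": "cm-risk"},  # drives billing & chargeback downstream
})
resp.raise_for_status()
instance_id = resp.json()["id"]

# Lifecycle: poll the same API for orchestration status until the resource is ready
status = requests.get(f"{BASE}/instances/{instance_id}", headers=HEADERS).json()["status"]
print(instance_id, status)
```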

The ten fundamental technology differentiators of the SDDC –

1. Highly elastic – scale up or scale down the gamut of infrastructure (compute – VM/Baremetal/Containers, storage – SAN/NAS/DAS, network – switches/routers/Firewalls etc) in near real time

2. Highly Automated – Given the scale & multi-tenancy requirements, automation at all levels of the stack (development, deployment, monitoring and maintenance) is a must

3. Low Cost – Oddly enough, the SDDC operates at a lower CapEx and OpEx compared to the traditional datacenter due to reliance on open source technology & high degree of automation. Further workload consolidation only helps increase hardware utilization.

4. Standardization –  The SDDC enforces standardization and homogenization of deployment runtimes, application stacks and development methodologies based on lines of business requirements. This solves a significant IT challenge that has hobbled innovation at large financial institutions.

5. Microservice based applications –  Applications developed for SDDC enabled infrastructure are built as small, nimble processes that communicate via APIs and over infrastructure like service mediation components (e.g. Apache Camel). This offers huge operational and development advantages over legacy applications. While one does not expect Core Banking applications to move to a microservice model anytime soon, customer facing applications that need responsive digital UIs will definitely need to consider such approaches (see the sketch after this list).

6. ‘Kind-of-Cloud’ Agnostic –  The SDDC does not enforce the concept of private cloud, or rather it encompasses a range of deployment options – public, private and hybrid.

7. DevOps friendly –  The SDDC enforces not just standardization and homogenization of deployment runtimes, application stacks and development methodologies but also enables a culture of continuous collaboration among developers, operations teams and business stakeholders i.e cross departmental innovation. The SDDC is a natural container for workloads that are experimental in nature and can be updated/rolled-back/rolled forward incrementally based on changing business requirements. The SDDC enables rapid deployment capabilities across the stack leading to faster time to market of business capabilities.

8. Data, Data & Data –  The heart of any successful technology implementation is Data: customer data, transaction data, reference data, risk data, compliance data etc. The SDDC provides a variety of tools that enable applications to process data in a batch, interactive or low latency manner depending on the business requirements.

9. Security –  The SDDC shall provide robust perimeter defense as well as application level security, with a strong focus on a Defense In Depth strategy. Further, data at rest and in motion shall be encrypted.

10. Governance –  The SDDC enforces strong governance requirements for capabilities ranging from ITSM requirements – workload orchestration, business policy enabled deployment, autosizing of workloads to change management, provisioning, billing, chargeback & application deployments.
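As a sketch of the microservice idea from differentiator 5 above: a single-purpose quote service exposing one API endpoint, deployable independently in a VM or container. Flask is used purely for brevity here, and the endpoint and payload are hypothetical:

```python
from flask import Flask, jsonify

app = Flask(__name__)

# One small, single-purpose service; peer services would call it over HTTP/JSON
@app.route("/quotes/<symbol>")
def quote(symbol):
    # Hypothetical static lookup; a real service would hit a market data cache
    book = {"IBM": (152.20, 152.25), "MSFT": (46.10, 46.12)}
    bid, ask = book.get(symbol.upper(), (None, None))
    return jsonify({"symbol": symbol.upper(), "bid": bid, "ask": ask})

if __name__ == "__main__":
    app.run(port=8080)  # each service owns its own process & port
```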

So who is doing SDDC at the moment? Most major banks have initiatives in place to gradually evolve their infrastructures to an SDI paradigm. Bank of America (for one) has been vocal about its approach of using two stacks, one Open Source & OpenStack based and the other proprietary [1].

To sum up the core benefit of the SDDC approach, it brings a large enterprise closer to web scale architectures and practices.

The business dividends of the latter include –

1. Digital Transformation – Every large Bank is under growing pressure to transform lines of business or its entire enterprise into a digital operation. I define digital in this context as being able to – “adapt high levels of automation while enabling the business to support multiple channels by which products and services can be delivered to customers. Further, the culture of digital encourages constant innovation and agility, resulting in high levels of customer & employee satisfaction.”

2. Smart Data & Analytics –  Techniques that ensure that the right data is in the hands of the right employee at the right time so that contextual services can be offered in real time to customers. This has the effect of optimizing existing workflows while also enabling the creation of new business models.

3. Cost Savings – Oddly enough, the move to web-scale only reduces business and IT costs. You not only end up doing more with fewer employees due to higher levels of automation, but are also able to constantly cut costs by adopting technologies like Cloud Computing, which reduce both CapEx and OpEx. Almost all webscale IT is dominated by open source technologies & APIs, which are much more cost effective than proprietary platforms.

4. A Culture of Collaboration – The most vibrant enterprises that have implemented web-scale practices not only offer “IT/Business As A Service” but have also instituted strong cultures of symbiotic relationships between customers (both current & prospective), employees, partners and developers.

5. Building for the Future – The core idea behind implementing web-scale architecture and data management practices is “Be disruptive in your business or be disrupted by competition”. Web-scale practices enable the building of business platforms around which ecosystems can be created and then sustained based on increasing revenue.

To quote Wikipedia, a widespread transition to the SDDC will take years:

Enterprise IT will have to become truly business focused, automatically placing application workloads where they can be best processed. We anticipate that it will take about a decade until the SDDC becomes a reality. However, each step of the journey will lead to efficiency gains and make the IT organization more and more service oriented.

The virtuous loop encouraged by constant customer data & feedback enables business applications (and platforms) to behave like agile & growing organisms –  SDDC based architectures offer them the agility to get there.

References

1. http://blogs.wsj.com/cio/2015/06/26/bank-of-america-adding-workloads-to-software-defined-infrastructure/

The Business and Technology Outlook for Capital Markets in 2015 & beyond..(3/3)


(Photo Credit – EconomyDecoded)

This article is the final installment in a three part series that talks about the business issues being faced by firms in the Capital Markets space. This post discusses the state of the market circa mid 2015, the evolving landscape and the strategic imperatives that firms face. We also cover the general business direction in the context of disruptive technology innovation.

Capital Markets are the face of the financial industry to the general public and account for a huge portion of activity in the world economy. Despite all the negative press they have garnered since the financial crisis of 2008, Capital Markets perform an important social function in that they contribute heavily to economic growth and are the primary vehicle for household savings. Capital markets allow corporations to raise capital via the underwriting process, and it is not just corporations that benefit from such money raising activity – municipal, local and national governments do the same as well; only the overall mechanism differs. While business enterprises issue both equity and bonds, governments typically issue bonds.

These securities are then purchased by investors (typically hedge funds, mutual funds, corporations, governments, retail investors, high net worth individuals and trusts) as part of their long or short-term planning, based on their risk appetites. To give one a sense of their scale and importance, IMF statistics put the combined size of worldwide capital markets as of 2013 at 282,000 billion US$ – almost three times the size of the world’s gross domestic product (GDP)! (Source – Wikipedia)

As of mid 2015, the world’s capital market participants (be they on the buy side, the sell side, custodians etc) find themselves at a critical juncture. On one side you have a rapid rise in the market indices that shows no signs of abatement, with more and more households participating in the market either directly or indirectly; at the same time there is ever increasing demand for new products in this age of the digital (and thus empowered) consumer, who has other choices like social lending and virtual currencies.

Regulation and risk management are even more front and center with an incipient sovereign debt crisis in Greece (and likely other countries – Spain, anyone?) and the Chinese stock market crash. Such economic upheavals only increase pressure on Banks to adhere to stringent Basel III risk ratios in areas like credit, market and liquidity risk. The result is reduced reliance on leverage and a focus on higher quality collateral, both of which tend to crimp profits (as compared to the go-go years of the dot-com and mortgage bubbles) and, some would allege, turn financial services into a boring utility-like business.

IT Background:

Within large bulge bracket firms, Capital Markets groups engineer custom derivative trades that hedge exposure for their clients as well as for their own internal treasury groups. They may also do proprietary trading (on the bank’s behalf) for a profit (though this is the type of trading that the Volcker Rule seeks to eliminate). These groups typically lead the way in being forward looking from a high tech perspective.

Most of the compute intensive problems are generated out of either this group or the enterprise risk group. They typically own the systems that interface with the exchange facing order management systems, the trade booking systems, the pricing libraries for the products the bank trades, as well as the tactical systems used to manage their market and credit risks, customer profitability, compliance and collateral. As a result, they usually get a large chunk of a Bank’s total IT budget and see technology as a key competitive advantage. The above business drivers are already being tackled in many areas within the Capital Markets spectrum.

Analysis:

Having set the context in terms of the top strategic issues players in this space find themselves confronting, let’s examine the outlook for the remaining part of 2015 and beyond – 

  1. Firms that successfully navigate the business waters will be the ones that can create new business models and offer a superior client relationship based on their data assets. Firms that can better understand and monetize their data assets will be placed in a position of superior returns. It is critical to understand your clients (be they institutional or otherwise) from a 360 degree perspective so you can market to them as one entity across different channels. This is key to optimize profits via cross selling in an increasingly competitive landscape.
  2. Firms need to diversify into newer areas (and offer newer services) as profitability from existing lines slowly dries up. To state an example – can a firm that is a broker dealer offer other value added services like collateral management, risk management etc as a service?
  3. Like it or not, we are still not past the era of systemic shock – with geopolitical crises (Greece) and the risk of a major economic failure cascading down as a contagion, ultimately hurting the general public via unemployment and other kinds of economic retrenchment. Firms need to do the due diligence to comply with regulations like CCAR and maintain transparent communication with the regulatory agencies and the general public. Reputational risk remains too great a business challenge to overcome easily.
  4. Capital markets is an information intensive business, so it is all about Data and its Governance. With BCBS 239 (as we saw in an earlier post), firms need to put in place data aggregation capabilities that help with firm wide risk reporting. We will see more of a focus on changing the Enterprise Risk, Compliance and Finance applications to promote systems and data consolidation. All key risk measurements need to be consistent & accurate across internal and external domains, multiple geographies & regulatory jurisdictions. A 360 degree view of every risk type is needed, and it must be consistent and free of discrepancies. An enhanced ability to process and glean insights from both data at rest and data in motion is not an innovative capability anymore but a must-have.
  5. Thus, firms that move from experimentation to deployment of elastic computing capabilities (Big Data, Mobility and Cloud) in support of business needs will increasingly dominate the landscape. Capital Markets are an IT & Information intensive business after all. Expect to find the leaders adopt a web-scale approach to building out their information technology architectures in support of business initiatives. Web-scale practices enable the building of business platforms around which ecosystems (and platforms as opposed to standalone applications) can be created and then sustained based on increasing revenue.
  6. Compliance with Anti Money Laundering(AML) legislation will continue to be a huge focus with an emphasis on preventing drug proceeds, terrorist financing and other illegal financing from being laundered into the legal system.
  7. At the risk of stating the obvious – Cyber-security will continue to be front and center in the next few years. The media is replete with stories of both internal and external security compromises at major financial firms. Investments in security and increased monitoring of data (both internal and external) to detect fraud and non-compliance will only increase. Broadly identifying every potential attack vector, and enforcing realtime intelligence & deep learning around these while keeping the overall business context in mind, will be one of the key approaches to keeping data & systems secure. We are not far off from the days of mass cyber attacks on the nerve centers of finance as a weapon of warfare by nefarious actors. CIOs & IT Security need mitigating strategies for any such eventuality. One significant breach can create lasting reputational and business damage among clients.
  8. Business and IT will thus need to work closer together to realize positive business outcomes and drive profitability. Management incentives need to be aligned with closer collaboration among peers at different levels to drive better governance and thus lower risk.
  9. Digital will be an increasing focus in terms of driving business transformation. To reiterate from an earlier blogpost – Digital in this context is defined as being able to – “adapt high levels of automation while enabling the business to support multiple channels by which products and services can be delivered to customers. Further, the culture of digital encourages constant innovation and agility, resulting in high levels of customer & employee satisfaction.”
  10. Data Science and deep learning techniques will find increasing adoption in Capital Markets, with Risk Management, Compliance (AML & Fraud), Trade Analytics (pre trade, trade surveillance, transaction cost analysis, bitemporal analytics etc) and client segmentation being the four key areas driving adoption. Firms should consider delivering these incremental capabilities to their business users via analytic sandboxes (a minimal surveillance-style sketch follows this list). An interesting business opportunity is for stock exchanges to provide analytic tools as a service to their client base.
  11. As Data gets centralized into Lakes or Oceans, enterprise Data Governance and IT Portfolio management will be key in ensuring smooth implementations that can provide incremental business benefits.
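As a minimal sketch of the surveillance/fraud use case in item 10 – with synthetic data and illustrative features throughout – an unsupervised anomaly detector can flag trades that deviate from a desk’s normal profile for analyst review:

```python
import numpy as np
from sklearn.ensemble import IsolationForest

# Hypothetical per-trade features: [size vs. 30-day avg, price deviation from mid, off-hours flag]
rng = np.random.default_rng(7)
normal_trades = rng.normal(0, 1, size=(1000, 3))               # synthetic "business as usual"
suspect_trades = np.array([[8.0, 6.5, 2.2], [9.1, 7.0, 2.4]])  # outsized, off-market prints

model = IsolationForest(contamination=0.01, random_state=7).fit(normal_trades)
print(model.predict(suspect_trades))  # -1 marks candidates for analyst review
```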

The combination of regulatory forces, competition & compliance will compel Capital Markets firms to rethink their IT strategy. Leaders will accelerate their technology & innovation roadmap in a way that promotes long term sustainable growth.

Design and Architecture of a Real World Trading Platform.. (2/3)

This article is the second installment in a three part series that talks about the business issues being faced by large trading operations & infrastructures in the Capital Markets space. This post discusses a real world reference architecture using Big Data techniques and is more technical in nature. The final part of this series will focus on business recommendations for disruptive innovation in this area.

For part 1, please visit – http://www.vamsitalkstech.com/?p=303

Introduction

With globalization driving the capital markets and an increasing number of issuers, one finds an increasing amount of complexity across a range of financial instruments and assets (stocks, bonds, derivatives, commodities etc.) and venues (NYSE, NASDAQ, CME, Dark Pools etc.). Added to that, shrinking margins & regulatory pressures are further driving buy side players to look at how existing business processes & systems integration currently work, with a view to making these more transparent, efficient and agile.

The business drivers (as noted in the first post in this three part series) from a Capital Markets perspective –

1. Re-tool existing trading infrastructures so that they are more integrated yet loosely coupled and efficient

2. Automating complex trading strategies that are quantitative in nature across a range of asset classes like equities, forex, ETFs and commodities

3. Incorporating newer & faster sources of data (social media, sensor data, clickstream data) and not just the conventional sources (market data, position data, M&A data, transaction data etc). Pure speed can only get a firm so far

4. Retrofitting existing trade systems to accommodate a range of mobile clients who have a vested interest in deriving analytics, e.g. marrying tick data with market structure information to understand why certain securities dip or spike at certain points and the reasons for the same (e.g. institutional selling or equity linked trades with derivatives)

5. Helping traders create algorithms as well as customize them to generate constant competitive advantage

The need of the hour is to provide enterprise architecture capabilities around designing flexible trading platforms that are built around efficient use of data, speed, agility and a service oriented architecture. The choice of open source is key as it allows for a modular and flexible architecture that can be modified and adopted in a phased manner – as you will shortly see.

Business Requirements

Trading platforms are concerned with executing orders coming in from portfolio managers on the Buy side, order management & monitoring through the execution process and providing electronic access to a wide variety of venues. From the Sell side one needs to provide support for handling customer orders and managing trading positions.

The following business requirements must be met of systems that offer Buy/Sell trading capabilities –

  1. The architecture must offer support for front, mid & back office trading capabilities, with support for both simple and complex rule based and algorithmic trade strategies
  2. Support the development lifecycle & seamless cutover in terms of backtesting and live implementation of the above strategies – in short, support an iterative and DevOps based methodology. The goal is to ensure that folks developing strategies can test their models across the widest spectrum of asset classes in the most productive manner possible
  3. Display well designed & highly intuitive trade and blotter UIs for trade management with support for mobile technologies. This is critical in ensuring a smooth user experience
  4. Support a “trading as a service” (TaaS) business model that can potentially be sold as a utility over open APIs
  5. Support a hybrid & scale-out deployment model. Services that provide the core business functionality should be deployable all the way from bare metal to VMs to Docker containers, on a private or a public cloud as required. A core requirement is to use Open Source Software and commodity hardware (x86)
  6. Support a rule based trading model (declarative) that will evolve to supporting predictive analytics with ingrained support for both complex event processing (CEP) as well as business workflow (ideally support for the BPMN standard notation)
  7. Support integration with a wide variety of external participants across the globe. The platform must truly be global in terms of supporting exchanges & products (FOREX, for instance, is open across different trading hours)
  8. Support a wide variety of financial products and formats, with FIX being the primary format
  9. Provide support for order capture, trading &  crossing
  10. Provide the ability to cross buy and sell side market orders (when both side orders are detected in the system)
  11. Auto route and execute orders based on accounts, quantity and real time market data
  12. Support other complex order routing requirements as applicable
  13. Finally, support a high degree of scalability – as volumes grow, the system should be able to autoscale to accommodate a high volume of trades/sec with desirable latency in milliseconds & well defined SLAs for Order Entry & Disaster Recovery at a minimum

Design Tenets –

  1. At the application tier – a SOA based approach is key – all core business functions are modeled as SOA services or even microservices
  2. The choice of an ESB/message tier to interconnect all market participants
  3. Open messaging standard – AMQP (Advanced Message Queuing Protocol) is chosen as the transport protocol for performance and industry reasons. Legacy architectures at financial firms and stock exchanges have long been hobbled by proprietary vendor and legacy protocols that drive up the cost of their IT infrastructures. AMQP was developed by a consortium of banks and vendors (JP Morgan, Red Hat and VMWare among others) and functions as the lingua franca for financial services backbone messaging. It is now deployed across a range of industries, from healthcare to manufacturing to IoT. Using AMQP avoids lock-in and costly bridging technology. Further, organizations like NYSE have been leading development of technologies like OpenMAMA, which intends to provide a vendor agnostic middleware API that supports event driven messaging. One example use is to allow market data vendors to publish messages like quotes & trades over an industry standard platform while building value added services on top of it. Our intention is to future proof our architecture by basing it on open standards
  4. FIX (Financial Information Exchange) run over AMQP will be the primary business interchange protocol (a minimal sketch follows this list)
  5. Apache Kafka or Fuse ESB chosen as the messaging tier or service bus
  6.  A BRMS (Business Rules Mgmt System) provides Rules, CEP and BPM under a single umbrella. This tier contains the definition and the runtime for rules for order management, routing, crossing, matching.
  7. In memory analytics provided by an in memory data grid or even using a Spark in memory layer
  8. The data layer is based on an Apache Hadoop platform and is architected around a lambda architecture (developed by Nathan Marz). More on this in the sections below
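A minimal sketch of design tenets 3 and 4 – publishing a FIX NewOrderSingle over AMQP – is below, using the pika client (the AMQP 0-9-1 dialect) against a local broker. The queue name and order fields are illustrative, and the FIX string omits the session-level header/trailer (BeginString, BodyLength, CheckSum) that a real FIX engine would compute:

```python
import pika  # AMQP client (0-9-1 dialect)

SOH = "\x01"  # FIX field delimiter

def fix_new_order_single(cl_ord_id, symbol, side, qty, price):
    """Builds the body tags of a FIX NewOrderSingle (35=D); session layer omitted."""
    fields = [
        ("35", "D"),                            # MsgType = NewOrderSingle
        ("11", cl_ord_id),                      # ClOrdID
        ("55", symbol),                         # Symbol
        ("54", "1" if side == "BUY" else "2"),  # Side: 1=Buy, 2=Sell
        ("38", str(qty)),                       # OrderQty
        ("44", str(price)),                     # Price
        ("40", "2"),                            # OrdType = Limit
    ]
    return SOH.join(f"{tag}={val}" for tag, val in fields)

conn = pika.BlockingConnection(pika.ConnectionParameters("localhost"))
channel = conn.channel()
channel.queue_declare(queue="orders.inbound", durable=True)  # hypothetical queue name
channel.basic_publish(exchange="", routing_key="orders.inbound",
                      body=fix_new_order_single("ORD-1", "IBM", "BUY", 100, 152.25))
conn.close()
```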

Architecture

Figure 1 – Reference Architecture for Trading Platform

The key components of the Trading Platform Architecture as depicted above are –

  1. Order Management System (OMS) – displays a rich interactive portal with a user interface; clients call brokers via telephone or place orders electronically, and these orders are routed to the OMS. The OMS receives the orders, performs proper matching, decides the best avenue and price based on business rules/complex events, and then routes them to the appropriate market venue to get them filled (a toy matching sketch follows this list)
  2. A Market Data distribution service connects to market data providers (e.g. Bloomberg, Thomson Reuters etc.) and sends regular updates to the OMS; rules determine which market data becomes the reference point for the OMS, i.e. which source takes priority if the same market data is available from multiple sources
  3. Connectivity is also established via FIX gateways to the distribution service.
  4. The business rules approach adds another dimension to BPM by enabling one to leverage declarative logic with business rules to build compact, fast and easy to understand trading logic. An example of this is in a sector (e.g. Trading platforms, Mortgage underwriting applications) where market conditions result in changing business rules as well as business processes comprised in the satisfaction of buy/sell requests.
  5. Complex Event Processing (CEP) – The term Event by itself is frequently overloaded and can refer to several different things depending on the context in which it is used. In our trading platform, when a sell operation is executed, it causes a change of state in the domain that can be observed on several actors: the price of the securities changes to match the value of the operation, ownership of the individual traded assets passes from the seller to the buyer, the balances of the accounts of both seller and buyer are credited and debited, etc. Intercepting a cloud of these events and having a business process adapt and react to them is key to an agile trading platform.
  6. The data management layer spans information stores like the Security Master, Customer Master, Holdings and Account/Product Master etc. This layer also needs to deal with Data Governance.
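To make the matching and crossing logic described in item 1 concrete, here is a toy price-time priority matcher in Python. It is a sketch only – real OMS matching involves order types, venue rules and risk checks far beyond this:

```python
import heapq
import itertools

class ToyOrderBook:
    """Price-time priority matching for limit orders; illustration only."""
    def __init__(self):
        self._seq = itertools.count()  # arrival order breaks price ties
        self.bids = []  # max-heap via negated price: (-price, seq, qty, order_id)
        self.asks = []  # min-heap: (price, seq, qty, order_id)

    def submit(self, side, price, qty, order_id):
        fills = []
        if side == "BUY":
            # Cross against resting asks priced at or below our limit
            while qty and self.asks and self.asks[0][0] <= price:
                ask_px, s, ask_qty, ask_id = heapq.heappop(self.asks)
                traded = min(qty, ask_qty)
                fills.append((order_id, ask_id, ask_px, traded))
                qty -= traded
                if ask_qty > traded:
                    heapq.heappush(self.asks, (ask_px, s, ask_qty - traded, ask_id))
            if qty:  # residual quantity rests on the book
                heapq.heappush(self.bids, (-price, next(self._seq), qty, order_id))
        else:
            # Cross against resting bids priced at or above our limit
            while qty and self.bids and -self.bids[0][0] >= price:
                neg_px, s, bid_qty, bid_id = heapq.heappop(self.bids)
                traded = min(qty, bid_qty)
                fills.append((bid_id, order_id, -neg_px, traded))
                qty -= traded
                if bid_qty > traded:
                    heapq.heappush(self.bids, (neg_px, s, bid_qty - traded, bid_id))
            if qty:
                heapq.heappush(self.asks, (price, next(self._seq), qty, order_id))
        return fills

book = ToyOrderBook()
book.submit("BUY", 152.25, 100, "B1")
print(book.submit("SELL", 152.20, 60, "S1"))  # crosses: [('B1', 'S1', 152.25, 60)]
```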

Modeling

Figure 2 – Trade Rules Modeling

The flow of data in the system can be depicted as shown below –

Figure 3 – Overall Trading Process Flow

The intention in adopting a SOA (or even a microservices) architecture is to be able to incrementally plug in lightweight business services like performance measurement, trade surveillance, risk analytics, option pricing etc.

The data architecture is based on the lambda system developed by Nathan Marz. The lambda architecture solves the problem of computing arbitrary functions on arbitrary data in real time by decomposing the problem into three layers: the batch layer, the serving layer, and the speed layer.

Data Architecture


Figure 4 – Data process flow (source VoltDB)

At a very high level, the data architecture has 3 components.

  1. the Batch Layer – constantly ingests, stores and processes market data, social media data, reference data, position data etc and constantly precomputes views
  2. the Speed Layer – processes real-time feeds & produces tactical views of the same
  3. the Serving Layer – holds the batch views relevant for the queries needed by predictive analytics

The Lambda Architecture is aimed at applications built around complex asynchronous transformations that need to run with low latency (say, a few seconds to a few hours) which is perfectly suited to our business case.
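A stripped-down sketch of the three layers in plain Python may help fix the idea; the trade records and the “notional per symbol” view are purely illustrative:

```python
from collections import defaultdict

# Batch layer: periodically recompute views over the immutable master dataset
def batch_view(master_trades):
    view = defaultdict(float)
    for t in master_trades:
        view[t["symbol"]] += t["qty"] * t["price"]  # precomputed notional per symbol
    return dict(view)

# Speed layer: incrementally maintain a tactical view of data the batch has not seen yet
realtime_view = defaultdict(float)
def on_new_trade(t):
    realtime_view[t["symbol"]] += t["qty"] * t["price"]

# Serving layer: answer queries by merging the precomputed batch view with the realtime view
def query_notional(symbol, batch):
    return batch.get(symbol, 0.0) + realtime_view.get(symbol, 0.0)

batch = batch_view([{"symbol": "IBM", "qty": 100, "price": 152.0}])
on_new_trade({"symbol": "IBM", "qty": 50, "price": 153.0})
print(query_notional("IBM", batch))  # 15200.0 + 7650.0 = 22850.0
```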

Advantages of an open architecture

  • Cost-Effective – An open source stack reduces cost by almost 50% when compared to a legacy system built on mainframe or proprietary technology.
  • Data Governance – Effectively provided by the Hadoop stack
  • Scalable – Provides a high degree of scalability in the architecture
  • Innovative – Built on the most robust architecture and state-of-the-art technology
  • Deployment – Supports a variety of deployment architectures, on premises or on the cloud
  • Load balancing support is built in to handle increasing volumes
  • Visibility into business rules as well as support for monitoring workflows

New Age trading platforms built on open source can not only be deployed across physical, virtual, mobile, and cloud environments but also include complementary paradigms – integration, business rules and complex event processing (CEP) capabilities – that add to operational efficiency, new business models, better risk management & ultimately a higher degree of profitability.

References

Big Data Lambda Architecture

http://voltdb.com/blog/simplifying-complex-lambda-architecture

Why Big Data & Intelligent Middleware will revolutionize Financial trading…(part 1 of 3)


This article is the first in a 3 part series that talks about the business issues faced by large trading desks in the Capital Markets space and the agile architectures that can be put in place using Big Data techniques to help clients gain competitive advantage.

There are very few industries that are as data-centric as the financial services industry. Every interaction that a client or partner system has with a banking institution produces actionable data that has potential business value associated with it.

Even within the large spectrum of client facing domains that make up financial services, capital markets occupy pride of place in terms of being forward looking as far as adoption of new age technologies goes.

While traditional domains like retail banking, wealth management, consumer banking  have historically had multiple sources and silos of data across the front-, back- and mid-offices – capital markets always have had a data challenge especially around areas like the trade lifecycle.

In 2015, it is no secret that the capital markets have seen better times from a business standpoint, namely because –

1. Falling volumes across a range of asset classes, e.g. equities, from the days of 2008-2009. The below graphic from the Tabb Group perfectly captures this dynamic: average daily trading volume, tallied by month, was just 5.8 billion shares in May 2015, less than half the peak of 12.3 billion shares during the financial crisis. (Ref – www.marketwatch.com)

(Chart – Falling average daily US equity trading volumes; Source – Tabb Group/MarketWatch)

2. Decline in profits across previously lucrative areas like High Frequency Trading as volumes fall. An important point to note here is that the profitability of a trading desk is driven by the trading volumes the desk attracts from its various institutional clients.

3. Overall decreased enterprise profitability owing to both #1 and #2

4. The need to provide the head of capital markets & risk managers with a 360 degree view of the customers their entire range of desks does business with – not just to optimize enterprise profits but also to manage risk

A definition of high frequency trading is in order for  those new to the space –

High-frequency trading (HFT) is essentially algorithmic trading in finance using sophisticated technological tools and computer algorithms to rapidly trade securities. HFT uses proprietary trading strategies carried out by computers to move in and out of positions in seconds or fractions of a second.

About two years ago, I had the good fortune of being part of a panel discussion on the use of Big Data in a range of trading use-cases (high and low latency trading) covering both hardware based platforms as well as hybrid platforms. An example discussed at length was how, by using Big Data technologies coupled with flash memory to improve performance, a firm could develop a strategy that includes weather data, social data or geolocation in real time.

In fact, at that time some financial services firms were already starting to experiment with advanced analytics, coupled with low latency technology, to develop smarter or intelligent trading decisions.

(Ref – http://www.wallstreetandtech.com/latency/big-data-for-intelligent-trading/d/d-id/1268561? )

Now, in the last two years, the above initial trickle has transformed into a gushing waterfall with firms rushing to re-architect their shrink wrapped legacy applications with a view to –

1. Re-tooling their trading infrastructures so that they are more integrated yet loosely coupled and efficient

2. Automating complex trading strategies that are quantitative in nature across a range of asset classes like equities, forex, ETFs and commodities

3. Incorporating newer & faster sources of data (social media, sensor data, clickstream data) and not just the conventional sources (market data, position data, M&A data, transaction data etc). Pure speed can only get a firm so far

4. Retrofitting existing trade systems to accommodate a range of mobile clients who have a vested interest in deriving analytics, e.g. marrying tick data with market structure information to understand why certain securities dip or spike at certain points and the reasons for the same (e.g. institutional selling or equity linked trades with derivatives)

5. Helping traders create algorithms as well as customize them to generate constant competitive advantage

In fact, the moniker given to this new architectural paradigm is “Smart Data” – the need to generate a holistic view of market state based on hitherto unharnessable data streams. Legacy and shrink wrapped systems are simply unable to meet the above five business requirements, as we have seen across a range of clients.

The key here is the word “integrated”, as that signifies a platform that captures any and all kinds of structured and unstructured data while allowing traders (and more importantly quants) to develop strategies that can be tested, simulated and validated – first by leveraging a classical rules/CEP engine based approach, and secondly by augmenting this approach with the scores of analytical packages found in a programming language like R. These packages provide sophisticated time series analysis, financial network analysis, risk modeling etc. The lifecycle of developing such smart strategies – which take into account not just streaming/real time data but also combine it with existing silos – will be covered in a future post. A toy sketch of the idea follows.
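As a toy illustration of the strategy development loop (a rule first, then statistical validation), here is a sketch of backtesting a simple moving-average crossover rule on synthetic prices using pandas. Everything here – the prices, the windows, the rule itself – is illustrative, not a recommended strategy:

```python
import numpy as np
import pandas as pd

# Synthetic price series standing in for a real tick/bar history
rng = np.random.default_rng(0)
prices = pd.Series(100 + rng.normal(0, 1, 500).cumsum())

# Rule: be long when the fast moving average is above the slow one
fast = prices.rolling(10).mean()
slow = prices.rolling(50).mean()
position = (fast > slow).astype(int).shift(1).fillna(0)  # act on the next bar, no lookahead

# Validate: accumulate the P&L of the rule over the simulated history
pnl = (position * prices.diff().fillna(0)).cumsum()
print(f"Final P&L: {pnl.iloc[-1]:.2f}")
```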

Having thus set the business stage, the next post in this three part series will focus on the architecture of a real world integrated trading platform built around Apache Hadoop as well as other enterprise open source components – enterprise messaging, distributed caching, rules/workflow and CEP engines.

We will then round out the discussion in the third post by focusing on which key areas within trading require a hardware based platform and which areas benefit from software based platforms where Smart Data based techniques have a significant role to play.