Wednesday, March 30, 2011

Compliance Planning - The Volcker Rule

As rule-making gathers steam, banks have started putting together exploratory teams (some are further ahead) to define their “Dodd-Frank Compliance Strategy”.

If we have to comply with the DFA, we will need to do much better than waiting for rules to be finalized and rolled out. Old hands know that any compliance effort starts with an outline (even a hazy one to begin with) of the expectations and potential responses, and evolves as you get a better handle on it (this applies to the act of complying, not to the strategic options). We will have to lay the foundation for compliance based on what we know about the Act today, create a unified theme for our compliance effort that makes some sense, start compliance ‘runs’ now, and then tweak the engine as every rule takes shape. If we are not compliant by the time the final rule is published, it is already too late. Remember how we complied with HIPAA in 2002, then with SOX in 2003, then GLBA in 2005 and then PCI DSS in 2006, and how we did everything six times over? “It broke the bank” was a metaphor then; it will not be a metaphor this time around.

I will take the Volcker rule as the example for this post; let us see what business-process or technology-level measures will need to be planned in order to achieve compliance.

One quick qualifier. The focus is on business process compliance and not strategic options or business model issues.

The Volcker rule’s compliance burden is going to be essentially in one area: how do you demonstrate that whatever you have done falls under the rule’s “Permitted Activities”?

Let us take one example. One permitted activity is “transactions in connection with underwriting or market-making activities, to the extent designed not to exceed the reasonably expected near term demands of clients, customers or counterparties”.

Let us analyze the steps needed to comply with this:
1.       All underwriting or market-making transactions need to be brought together in a central place (this will have to be done no matter what the shape or content of the final rule)
2.       All client, customer or counterparty orders and instructions will have to be captured, tagged and massaged so each can be related to one or more underwriting or market-making activities performed by the bank (anywhere in the world?). A huge business process issue here will be re-designing the customer interaction process and documentation so that you have enough information and commitment from the client, customer or counterparty to justify your actions. Can this be done? Or does this start another “transaction code rationalization” à la HIPAA?
3.       For all underwriting or market-making activities performed by the bank that do not have a direct link to a client, customer or counterparty order, demonstrate that they still fall under “reasonably expected near term demands”: that the transaction was necessary to “support” a client, customer or counterparty order and was simply a mitigation of a risk that already existed because of that order, not a new risk created by the bank’s own decisions. “Reasonable expectations” and “near term demands” are minefields, but there is still enough here to kick off compliance planning.
There will be more things to do but let us say this is the gist of it.
From a pure compliance planning and technology support perspective, these three things translate to a) building ‘data warehouses’ of certain types of transactions to be able to analyze their cause-and-effect relationships with certain third-party (client, customer, counterparty) actions or orders, and b) building strong analytical and heuristic engines that will establish and report connections, dependencies, pairing and risk mitigation across different transactions.
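To make (a) and (b) concrete, here is a minimal, hypothetical sketch in Python of the kind of pairing engine the rule will push banks towards. Every field, name and the ten-day look-back window is an illustrative assumption of mine, not anything the rule prescribes: market-making transactions are matched to recent client orders in the same instrument, and anything that cannot be paired is flagged for a documented “reasonably expected near term demand” rationale.

from dataclasses import dataclass
from datetime import date
from typing import List, Optional

# Illustrative records only; real trade-capture feeds carry far more detail.
@dataclass
class ClientOrder:
    order_id: str
    client_id: str
    instrument: str
    quantity: int
    order_date: date

@dataclass
class BankTransaction:
    txn_id: str
    instrument: str
    quantity: int
    trade_date: date
    desk: str                          # e.g. "market-making" or "underwriting"
    linked_order: Optional[str] = None # populated by the pairing pass

def pair_transactions(txns: List[BankTransaction],
                      orders: List[ClientOrder],
                      lookback_days: int = 10) -> List[BankTransaction]:
    """Naive pairing pass: link each bank transaction to a client order in the
    same instrument within a look-back window; whatever stays unpaired needs a
    demonstrable near-term-demand justification (or escalation)."""
    flagged = []
    for txn in txns:
        candidates = [
            o for o in orders
            if o.instrument == txn.instrument
            and 0 <= (txn.trade_date - o.order_date).days <= lookback_days
        ]
        if candidates:
            txn.linked_order = candidates[0].order_id
        else:
            flagged.append(txn)        # no client-demand link found
    return flagged

A production engine would obviously need far richer matching (quantities, netting, hedging relationships, desks, time zones, global books), but even this toy version shows why all the transactions and orders have to sit in one analyzable place first.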
It will be expensive, difficult and will take a few years (even five or more) of testing and double-checking to make sure that it actually works and generates dependable information.
The Pros:
Regardless of the Volcker rule or the DFA, this information will be a great risk management and decision-making tool for a bank and those who do it well will have a huge competitive advantage. As of today, the power to correlate transactions and risks across clients and proprietary actions does not exist at most banks and where it exists, it exists in pockets.
The Cons:
If the rule-making process loses sight of these real challenges, we may have rules that will take thousands of person-days to file returns that the regulator never gets to and suddenly it is too late all over again.


Thursday, March 10, 2011

Challenges in making rules under the Dodd-Frank Act, 2010

The Dodd-Frank Act is very ambitious in its scope and expects all agencies charged with banking regulation to come up with specific rules in their respective domains within 18 months (or so), so as to actually implement the DFA across the financial services spectrum. This is (far) easier said than done, and the regulatory agencies will need significant resources to do a good job in this area. Let us look at some of the things that precede rule-making at the individual agency level:
1.       Subject matter expertise: All the agencies have accumulated vast expertise in their respective domains, but the DFA covers new territories not charted by any agency thus far, including hedge funds, capital adequacy, consumer protection, segregation (Volcker) and so forth. The industry has spent billions of dollars and employed an army of PhDs to build very sophisticated IP, and understanding these issues well enough to write a regulation around them will take a lot of learning for the agencies. In general, I believe a good regulation does not need to correspond one-to-one with industry practices, but writing a new regulation in a hitherto unregulated (or lightly regulated) area is a different cup of tea.
2.       Defining the terms: Each industry has its jargon, and I guess financial services is the leader in this arena. From “give me some balance sheet” to “dark pools” and from “family offices” to “hedge funds”, there are terms galore and everyone understands what they mean. But when you base a law on these terms, you need precise definitions to ensure that the Regulated Entities are clearly identified and the regulations do not end up offering regulatory arbitrage. I will not be surprised if the terms to be defined run into the thousands; this amounts to virtually building an industry lexicon and will be a humongous task.
3.       Resources: Obviously, these are not the times to fund major government initiatives. The DFA already sets in motion activities that need huge funding support (consumer protection, education, the orderly liquidation fund, the insurance council and more), and the DFA rule-making process (we are still not talking about enforcement, just rule-making) is causing agency heads to raise their hands in despair. The Chairman of the SEC, in the context of a recent study by Boston Consulting Group, said the following: “We are currently understaffed by about 400 and need a total of 800 staff to cope with the DFA.” Union rules and poor communication with self-regulatory (read FINRA) institutions were cited as some of the challenges. Read this in conjunction with 1 above and you realize that even if budgets were available, you are not going to find people of the required caliber in the required numbers quickly.
4.       Internal Training: Consumer education et al are great things to do, but regulators across the board will require significant training and collaboration with other centers of expertise within the government. It is fair to say that not every examiner at every agency understands the nuances of complex swaps, derivatives, hedge funds and dark pools well enough to actually examine them and assure regulatory compliance. We expect legal issues to come up as entities fight fiercely to maintain the confidentiality of the intellectual property in their trading strategy models and formulae; even if those battles were overcome, examiners would still need the domain and technology expertise to reverse-engineer those models far enough to verify whether they break any laws. The federal agencies will need to collaborate in innovative ways with universities and specialized agencies, not only in the United States but globally, to really create the capability needed to implement the DFA in a meaningful way.
5.       Plugging the holes: In the meanwhile, the new Ponemon Institute report shows the goings-on in the data theft and breach world. They surveyed 51 organizations from 15 industries (including financial services) and report that the cost of data breach repairs continues to climb; the average cost across surveyed entities in 2010 was $7.2 million. They do not, of course, include the costs that are not immediately visible in monetary terms, such as reputational damage or clients walking away to competitors, and we all know that those costs eventually extract a far heavier toll.
These are huge challenges, and they will test the mettle of the government, from the President to the Fed and all the agencies, to take the well-meaning DFA to the state of a comprehensive and enforceable law. All this is not to say that it is not good or anything of that sort. In spite of all the issues and challenges, this step is pioneering and, if successful, will establish a benchmark for regulating the financial services industry of the future. America leads the world in financial services and it must lead in financial services regulation too, but it will take a lot more than good intentions to make it work.



Friday, March 04, 2011

Leveraging NIST 800 body of work for regulatory compliance

Regulators should leverage NIST’s body of work
People are tired of hearing it from me, but I am not tired of saying that I’m a big fan of the work done by NIST over the years. You hear many ‘experts’ saying that NIST is behind the curve: we are in the Web 2.0, cloud, social networking, iPad era and NIST guidance is for the 90s. There is an element of truth in that, but I would like to state the following:
1.       There are thousands of people and companies innovating every day, and for every major product or service that gains acceptance, there are 19 that fade away. As a rule-making body, you cannot be analyzing and making rules for all 20, only for the one that gains widespread usage. The remaining 19 (or those that survive) will be used by some folks because they are particularly relevant to them or because they are geeks who love non-standard stuff, but you cannot invest time and money in making rules around each of them, especially if you are a public body and give away most of your work free in the public domain.
2.       It is important to understand that rule-making, while important and useful at a granular level, is not about technology. It is about creating a systematic approach to securing and monitoring the technology platforms that you use. Many people see these rules as if they need a one-to-one correspondence with each technology product, and that is quite unnecessary. Even if you look at something as granular as PCI DSS, there will be a new security device that does things differently, and one may say PCI DSS does not give guidance on how to handle it and hence is behind the curve. Security, audit, monitoring and reporting principles are universal and, if adopted properly, equip a good security analyst or auditor to easily adapt them to new or emerging technologies.
3.       NIST is accelerating its research and publication process in 2011 and hopefully will receive the funding and priority to keep it up going forward.
Let us take NIST’s new release on “Managing Information Security Risk” (SP 800-39), published this week (March 2011). It introduces two important ideas for any organization to consider:
1.       Multi-tiered strategic view of risk management:
a.       Tier-1 view of enterprise wide risk (tolerance policy, investment in ERM, appetite)
b.      Tier-2 view of business process level risk (point of failure, architecture, controls)
c.       Tier-3 view of Information system level risk (SDLC, vendor security, audits)
(Only after performing this analysis should you move to tactical risk management actions)
2.       Lifecycle view of risk management
a.       Closely linked to the above: see the total picture of what you are doing and do not spend money and effort on stand-alone, sporadic actions
b.      E.g. you run a “point-in-time penetration test” on your network devices to comply with, say, PCI DSS, and the next day you have a major breach that shakes your company and its reputation. Why does it happen? There are many reasons, and they are mostly managerial rather than technical: how did you decide which part of your network needed to be tested, where is your data stored, what is your access control policy and practice, are your configurations and security patches up to date, and so forth. Most of the vendors and tools you use for a penetration test will do a reasonable job (occasionally they do a bad job as well), but the risk management process fails to tell them what to look for, and that is why the money and effort spent on that test does not get you results. A minimal sketch connecting these two ideas appears after this list.
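Here is that sketch (Python; every system name, risk entry and field below is a hypothetical illustration of mine, not anything taken from NIST SP 800-39): a risk register organized along the three tiers, with a tactical action, the penetration-test scope, derived from the register rather than chosen in isolation.

# Hypothetical, illustrative entries only; nothing here comes from NIST itself.
risk_register = {
    "tier_1_organization": [
        {"risk": "Payment-card exposure exceeds the board's stated risk appetite",
         "response": "revisit tolerance policy and ERM investment"},
    ],
    "tier_2_business_process": [
        {"risk": "Card data flows through an undocumented settlement process",
         "response": "map the process, add controls at the points of failure"},
    ],
    "tier_3_information_system": [
        {"risk": "Systems holding card data have unverified configurations and patch levels",
         "systems": ["pay-db-02", "web-dmz-01"],   # hypothetical hosts
         "response": "verify configuration and patches, then test"},
    ],
}

# Tactical actions fall out of the tiered analysis, not the other way around:
# the penetration-test scope is whatever tier 3 says actually holds the data.
pentest_scope = [
    system
    for item in risk_register["tier_3_information_system"]
    for system in item.get("systems", [])
]
print("Penetration-test scope derived from the register:", pentest_scope)

The point is not the data structure but the direction of travel: the test scope (and every other tactical spend) is an output of the tiered, lifecycle view, which is exactly what a stand-alone point-in-time test lacks.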
I do not know if bodies such as the FFIEC and the SEC make conscious and extensive use of NIST resources, as NIST’s mandate is federal government security (FISMA). The Dodd-Frank Act is sure to bring many more regulatory rules (the process has already begun and we can clearly see that the documentation aspects of compliance are going to expand significantly), and NIST provides a great foundation for an organization to take a very systematic, regulation-agnostic approach to technology risk management: “in-principle” and “proactive” compliance rather than “topical” and “reactive” compliance. No business manager needs to be told which is better.
If a regulator can give specific guidance on how compliance is measured, it enables the regulated entities to work on it. Because PCI DSS provides detailed instructions, for example, compliance has become an organized process. SOX 404, on the other hand, says little about how, what or where; compliance efforts have been sporadic and expensive, and many extensions and dilutions were granted in its implementation.
A hacker hacks into a government website using exactly the same tools and techniques that he will use to hack into a bank’s website. Hence, if the NIST guidelines are good enough to protect the government website, they are good enough to protect the bank website too.
If you use NIST as your information security framework, you don’t need to wait for all the rules of the DFA to emerge; you can proactively accelerate your process not only to comply with the laws but to actually make your information more secure. Don’t hesitate to tell your regulatory examiners and external auditors that you use NIST as the foundation for your information security governance. If they continue to maintain a poker face, you know you have made progress.
Buck Kulkarni
March 4, 2011 


Thursday, March 03, 2011

Dodd-Frank Act Op & Tech Implications Tracker

The regulations, the regulators and the regulated
The US financial services sector regulatory scene has been very confusing, to say the least, for the past two decades. On one hand there has been over-regulation with too many regulators; on the other, big gaps in regulation. On top of all that already exists, we now have the Dodd-Frank Act taking shape, and it is reasonable to say that both the regulators and the regulated are struggling to deal with it.
Fundamentally, any compliance effort has two elements:
1.       “In-principle” compliance strategies & decisions – an organization may decide to become a holding company, get out of the credit card business, or terminate or initiate associations and partnerships to position itself in a certain manner vis-à-vis the regulation(s). These are highly organization-specific decisions and are not the focus of this blog series.
2.       “In-practice” compliance strategies, decisions and implementation – once you decide you have to comply with a regulation, you have to start taking steps that will enable you to comply. This includes understanding the Act, the individual rules, their applicability to your circumstances, your current situation and the compliance gaps, the remediation steps needed, the prioritization of remediation actions, the desired state and roadmap of compliance and, having achieved compliance, how you stay there. I call this the “GET COMPLIANT, STAY COMPLIANT” process: it has a definite beginning and a definite non-end, but if done properly it can save you a lot of heartburn, effort, reputational and business risk and, of course, a ton of money. People call me naïve (among other things), but at the time of this writing I still believe good regulatory compliance can be a very significant competitive advantage. A minimal sketch of this cycle follows this list.
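As a sketch only (the stage names below are mine, not anything drawn from the Act or from any regulator), the “in-practice” cycle can be thought of as a small state machine in which any change to the underlying rule sends an item back to the start:

from enum import Enum

class Stage(Enum):
    UNDERSTAND_RULE = 1   # read the rule, determine applicability
    ASSESS_GAP      = 2   # compare current state against the rule
    REMEDIATE       = 3   # prioritized remediation actions
    COMPLIANT       = 4   # desired state reached
    MONITOR         = 5   # periodic re-checks; the "definite non-end"

def next_stage(stage: Stage, rule_changed: bool = False) -> Stage:
    """Advance one compliance item through the get-compliant, stay-compliant
    cycle; a change in the underlying rule restarts the analysis."""
    if rule_changed:
        return Stage.UNDERSTAND_RULE
    if stage is Stage.MONITOR:
        return Stage.MONITOR
    return Stage(stage.value + 1)

The point is the loop, not the code: compliance has a definite beginning and no end, and every new or amended rule re-enters the cycle at the top.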
The US financial services industry has had the SEC, the OCC, the FDIC, the OTS, the NCUA and the Fed as its major regulators. The creation of the FFIEC as the central examination standards setter has served both the regulators and the regulated well. It is somewhat fashionable to talk of the FFIEC and NIST as ‘behind the curve’; that is, in my opinion, unfair and, more importantly, rather immature. Just because the FFIEC does not have an examination procedure for cloud computing does not mean it is behind the curve. Experienced auditors and examiners know the principles and can easily apply them to emerging technologies. This is not to say the FFIEC or NIST should not accelerate, but it does not make them any less useful or relevant. No regulatory or standards-setting body can keep up with the thousands of innovators and entrepreneurs pushing the boundaries of technology every day, technology that any RE is free to implement. Leaving aside the bureaucracy and so forth, these bodies have done tremendous work to create a common body of intellectual property, and as Dodd-Frank and other regulations mature, I hope the rule-makers leverage these two assets intelligently. While NIST is charged with laying down the information security standards for the federal government, are the information security requirements of a bank or any private sector organization any different? I will come back to these themes later.
Today we start with a primer on the Dodd-Frank Act, or the Wall Street Reform and Consumer Protection Act (P.L. 111-203). Clearly the most ambitious piece of legislation in recent memory, it aims to regulate a very wide swath (if not the whole) of the US financial markets. The legislation, named after the two lawmakers who drove it with strong zeal, was signed into law in July 2010.
Some data points to gauge the size and the scope of the law:
1.       16 Titles, each of which will evolve into a major, complex law covering different aspects of the banking, financial services and insurance industries in the United States
2.       Creates three new entities – the Financial Stability Oversight Council, the Bureau of Consumer Financial Protection and the Federal Insurance Office
3.       More than 25 major studies (I mean major) will need to be conducted to scope, understand, define and get the government’s arms around the foundational issues
4.       More than 250 rules will need to be drafted over the next 18 months, with permissible time extensions where needed
5.       Will try to streamline the federal and state regulatory infrastructure onto a more cohesive and unified platform to ensure full coverage while avoiding the duplication, dilution and regulatory arbitrage enjoyed by some REs presently
6.       Thousands of market terms will now need a precise definition if they are used in a law, e.g. what is a ‘family office’, an ‘accredited investor’ or an ‘off-balance-sheet item’? This work is crucial, as any open items here will simply open up new regulatory arbitrage (regarb) opportunities.
Enough for now; we will keep coming back to it. As rules are formulated and issued (by different agencies), they can significantly impact how the American financial services industry works, complies, competes and succeeds in the highly integrated global marketplace of the future. Equally important, if not more so, is how the regulators maintain their perspective, write the rules, create enforcement capabilities and do not end up stifling the industry’s ability to innovate and compete in the integrated global markets of tomorrow, especially as rival markets and currencies emerge in different parts of the world.
I intend to write regularly on the operational and technology aspects of compliance from March 1, 2011 to about the end of 2012, covering the very complex but compelling US financial services regulation scene as some old dominos fall and some new ones enter our lexicon, with the potential to change the US financial services industry in ways unimaginable at present. Despite all the politics, posturing, rhetoric and the ‘while-we-are-at-it, let-us-fix-that-too’ approach, and of course the possibility that it may be repealed, I feel the Dodd-Frank Act has the potential to do more good than harm and possibly make the American financial services industry more competitive in the years to come. I look forward to your comments and experiences as all of us work our way through this challenging period.
