Wednesday, May 04, 2011

Derivatives Trading & Clearing Rule-making

Derivatives acquired an especially dark reputation during and after the financial crisis, primarily for two reasons. First, most people (including many on Wall Street, some of them sitting in corner offices) did not understand the derivatives business. Second, the market process for trading and clearing derivatives was completely private, so the public and the regulators had no visibility into how that process worked or what exactly was going wrong in that corner of the markets.
Dodd-Frank’s huge push for consumer education and consumer protection may or may not ease the first part of the problem. The second part – open platforms for trading and settlement of swaps – is what the CFTC is addressing, with intense work on building the foundation and structure for making rules in this gigantic, hitherto unregulated sector of the market. The CFTC has identified 31 areas and will release concept papers and invite public debate and comments as it moves forward with its mandate of forming the rules that will take effect under the aegis of the Dodd-Frank Act.
The major areas and topics the CFTC has identified are:
1.       Foundation and House-keeping:
a.       The biggest challenge in this area will obviously be definitions. There are so many terms (starting from the word swap itself) that are used in the market in a very general and loose sense. For a rule to hold, very precise definitions will be needed. Terms like swap dealer, swap participant, non-financial participant, settlement and uncleared swaps will take a lot of work to hammer into an industry-wide (world-wide, really) acceptable meaning. This is going to be easier said than done.
b.      Registration – how to register the various players in the derivatives game, and their legal classes, roles, authority, responsibilities and the penal actions for each type of participant
c.       Code of conduct within the firm as well as without (i.e. with counterparties)
d.      Capital and margin requirements for non-bank participants
e.      Bankruptcy & liquidation treatment of derivatives (one of the major causes of disruption in 2008)
2.       Clearing:
a.       A key goal of the legislation is to define the core principles for clearing
b.      End-user exceptions permissible
c.       Mandatory clearing process
d.      ‘Systemically important’ treatment of DCO (derivatives clearing organization)
3.       Trading:
a.       Core principles, rules and workflow
b.      Documentation maintenance across lifecycle and transaction legs
c.       Identifying, defining & registering parties to trading
d.      Interface with global (external to US) trading systems and processes
4.       Data:
a.       Core principles regarding data elements to be provided, captured and processed (a record-level sketch follows after this outline)
b.      Data storage and processing entities and responsibilities
c.       Real-time reporting requirements, if any
d.      Data ownership, stewardship and security
5.       Particular Products:
a.       Derivative trades can be created literally on the fly, and it is difficult to enforce the product discipline of an equities or bond market. If a regulator does not know what product is being traded, it has no control over the systemic risks being created. On the other hand, excessive regulation may hurt the US’s ability to compete in this extremely competitive and location-agnostic business
b.      Portfolio level margining, reporting and systemic risk oversight
c.       Joint decisions with SEC on product categories and actual oversight classification
6.       Anti-manipulation:
a.       Restrictive or otherwise manipulative trading practices
b.      Disruptive practices and processes
c.       Treatment of whistle-blowers
7.       Position Limits:
a.       Large traders, bona-fide hedging, aggregate limits
b.      Systemic risk aspect of position limits
8.       Other linkages:
a.       Volcker Rule
b.      Credit Rating agency and process
c.       Investment Adviser reporting
d.      Fair credit reporting
I may have missed a few but I think the point is clear.
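To make the data topics under item 4 concrete, here is a minimal sketch of the record-level discipline such rules would demand. This is my own illustration under stated assumptions; the field names and validation checks are hypothetical, not the CFTC’s actual reporting schema (which, at this writing, does not yet exist).

```python
# A hypothetical swap data record -- fields and checks are illustrative,
# not the CFTC's actual schema.
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Optional

@dataclass
class SwapRecord:
    trade_id: str              # unique identifier assigned at execution
    product_type: str          # e.g. "interest_rate_swap"; definitions TBD by rule
    counterparty_a: str        # legal-entity identifier of one side
    counterparty_b: str        # legal-entity identifier of the other side
    notional: float            # notional amount in the reporting currency
    currency: str              # ISO 4217 currency code
    cleared: bool              # True once accepted by a DCO
    dco_id: Optional[str] = None   # clearing organization, if cleared
    executed_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

    def validate(self) -> list:
        """Return a list of data-quality problems; empty means the record is clean."""
        problems = []
        if self.notional <= 0:
            problems.append("notional must be positive")
        if self.cleared and not self.dco_id:
            problems.append("a cleared swap must name its DCO")
        if len(self.currency) != 3:
            problems.append("currency must be a 3-letter ISO code")
        return problems

# Example: one clean record and one that a reporting engine should reject.
good = SwapRecord("T-0001", "interest_rate_swap", "LEI-AAA", "LEI-BBB",
                  10_000_000.0, "USD", cleared=True, dco_id="DCO-X")
bad = SwapRecord("T-0002", "credit_default_swap", "LEI-AAA", "LEI-CCC",
                 -5.0, "USD", cleared=True)
print(good.validate())   # []
print(bad.validate())    # two problems
```

The particular fields matter less than the point that every one of them leans on the precise, industry-wide definitions discussed under item 1.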
House Agriculture Committee Chairman Frank Lucas has already proposed an 18-month delay to implementing these rules and as expected, the industry lobbies are supporting the delay whole-heartedly.
My problem? This is humongous, and both the CFTC’s desire to release the rules asap and the industry’s desire to delay them appear moot to me. Even if the CFTC could get all the rules out quickly and the industry were eagerly looking forward to implementing them, the task is unrealistic: permeating all these rules through the entire ecosystem, getting people to understand the definitions, creating the documentation, workflows and clarity in contracts, the actual trading and settlement platforms, the arbitration and legal framework to resolve issues, the front, mid and back offices, and the cross-border legs involving possibly every other country in the world. Imagine the technology platforms and the new legal entities that need to come into existence to develop, test, operate and maintain them at the scale and assurance level required to process trillions of dollars of trades. And all this at a time when not all participants are going to open their purse-strings easily for upgrading their skills, technology and people.
My solution? A big-bang implementation of this rule (or group of rules) is not going to work, whether today or 18 months from today. The idea behind the rule is sound, and a good, old incremental approach is what I can see taking us there. Maybe it will take five years, but there are two benefits: one, we can prioritize and channel limited resources in a properly directed manner, and two, it will get us there!





Wednesday, March 30, 2011

Compliance Planning - The Volcker Rule

As rule-making gathers steam, banks have started putting together exploratory teams (some are further ahead) to define their “Dodd-Frank Compliance Strategy”.

If we have to comply with the DFA, we will need to do much better than waiting for rules to be finalized and rolled out. Old hands know that any compliance effort starts with an outline (even a hazy one to begin with) of the expectations and potential responses, and evolves as you get a better handle on it (this applies to the act of complying, not the strategic options). We will have to lay the foundation for compliance based on what we know about the Act today, create a unified theme for our compliance effort that makes sense, start compliance ‘runs’ now, and then tweak the engine as each rule takes shape. If we are not compliant by the time the final rule is published, it is already too late. Remember how we complied with HIPAA in 2002, then with SOX in 2003, then GLBA in 2005 and then PCI DSS in 2006, and how we did everything six times over? It “broke the bank” metaphorically then; it will not be a metaphor this time around.

I take the Volcker rule as an example for this blog; let us see what business-process or technology-level measures will need to be planned in order to achieve compliance.

One quick qualifier. The focus is on business process compliance and not strategic options or business model issues.

The Volcker rule’s compliance burden is going to be essentially in one area: how do you demonstrate that whatever you have done falls under the “Permitted Activities” of the rule?

Let us take one example. One permitted activity is “transactions in connection with underwriting or market-making activities”, to the extent designed not to “exceed the reasonably expected near term demands of clients, customers or counterparties”.

Let us analyze the steps needed to comply with this:
1.       All transactions of underwriting or market-making activities need to be brought together in a central place (this will have to be done no matter the shape or content of the final rule)
2.       All client, customer or counterparty orders and instructions will have to be captured, tagged and massaged so each can be related to one or more underwriting or market-making activities performed by the bank (anywhere in the world?). A huge business process issue here will be re-designing the customer interaction process and documentation to obtain enough information and commitment from the client, customer or counterparty to justify your actions. Can this be done? Or does this start another “transaction code rationalization” a la HIPAA?
3.       For all underwriting or market-making activities performed by the bank that do not have a direct link to a client, customer or counterparty order, demonstrate that they fall under “reasonably expected near term demands”: that the transaction was necessary to “support” a client, customer or counterparty order, and that it was simply a mitigation of a risk that already existed due to a client, customer or counterparty order, not a new risk created by the bank’s own decisions. “Reasonable expectations”, “near term demands” – all minefields, but still enough to kick off compliance planning.
There will be more things to do but let us say this is the gist of it.
From a pure compliance planning and technology support perspective, these three things translate to a) building ‘data warehouses’ of certain types of transactions, to be able to analyze their cause-effect relationships with certain third-party (client, customer, counterparty) actions or orders, and b) building strong analytical and heuristic engines that will establish and report connections, dependencies, pairing and risk mitigation across different transactions.
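As a rough illustration of what such a heuristic engine might do, here is a minimal sketch that pairs the bank’s market-making transactions with client demand observed over a look-back window and flags the unmatched excess for compliance review. Everything specific here is an assumption of mine for illustration: the five-day window, the field names and the demand proxy are not from the rule text.

```python
# A minimal "pairing" heuristic: flag market-making inventory that exceeds
# client demand observed in a recent window. Window, fields and threshold
# are illustrative assumptions, not the Volcker rule's actual test.
from datetime import date, timedelta

def flag_unsupported_positions(bank_trades, client_orders, window_days=5):
    """Each trade/order is a dict with 'instrument', 'quantity' and 'date' keys.
    Returns (trade, observed_demand) pairs needing a compliance justification."""
    flagged = []
    for trade in bank_trades:
        start = trade["date"] - timedelta(days=window_days)
        # Sum client demand in the same instrument over the look-back window --
        # a crude proxy for "reasonably expected near term demands".
        demand = sum(
            order["quantity"] for order in client_orders
            if order["instrument"] == trade["instrument"]
            and start <= order["date"] <= trade["date"]
        )
        if trade["quantity"] > demand:
            flagged.append((trade, demand))
    return flagged

# Example: 100 units of inventory against only 40 units of observed demand.
trades = [{"instrument": "XYZ-5Y-IRS", "quantity": 100, "date": date(2011, 5, 2)}]
orders = [{"instrument": "XYZ-5Y-IRS", "quantity": 40, "date": date(2011, 4, 29)}]
print(flag_unsupported_positions(trades, orders))
```

A real engine would need far richer matching (hedging legs, multi-leg trades, global books), which is exactly why the transaction warehouse in (a) has to come first.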
It will be expensive, difficult and will take a few years (even five or more) of testing and double-checking to make sure that it actually works and generates dependable information.
The Pros:
Regardless of the Volcker rule or the DFA, this information will be a great risk management and decision-making tool for a bank and those who do it well will have a huge competitive advantage. As of today, the power to correlate transactions and risks across clients and proprietary actions does not exist at most banks and where it exists, it exists in pockets.
The Cons:
If the rule-making process loses sight of these real challenges, we may have rules that will take thousands of person-days to file returns that the regulator never gets to and suddenly it is too late all over again.


Thursday, March 10, 2011

Challenges in making rules under the Dodd-Frank Act, 2010

The Dodd-Frank Act is very ambitious in its scope and expects all agencies charged with banking regulation to come up with specific rules within 18 months (or so) in their respective domains so as to actually implement the DFA across the financial services spectrum. This is (far) easier said than done and regulatory agencies need significant resources to be able to do a good job in this area. Let us see some of the things that precede rule-making at individual agency level:
1.       Subject matter expertise: All the agencies have accumulated vast expertise in their respective domains, but the DFA covers territories not charted by any agency thus far, including hedge funds, capital adequacy, consumer protection, segregation (Volcker) and so forth. The industry has spent billions of dollars and employed an army of PhDs to come up with very sophisticated IP, and understanding these issues to a level where you can write a regulation around them will take a lot of learning for the agencies. In general, I believe a good regulation does not need to correspond one-to-one with industry practices, but writing a new regulation in a hitherto unregulated (or lightly regulated) area is a different matter altogether.
2.       Defining the terms: Each industry has its jargon, and I guess financial services is the leader in this arena. From “give me some balance sheet” to “dark pools” and from “family offices” to “hedge funds”, there are terms galore, and everyone understands what they mean; but when you talk of basing a law on these terms, you need precise definitions to ensure that the Regulated Entities are clearly identified and the regulations do not end up offering regulatory arbitrage. I will not be surprised if the terms to be defined run into the thousands; this amounts to virtually building an industry lexicon and will be a humongous task (a sketch of what one lexicon entry might look like follows after this list).
3.       Resources: Obviously, these are not the times to fund major government initiatives. Already, the DFA sets in motion activities that need huge funding support (consumer protection, education, the orderly liquidation fund, the insurance council and more). And the DFA rule-making process (we are still not talking about enforcement, just rule-making) is causing agency heads to raise their hands in despair. The Chairman of the SEC, commenting in the context of a recent study by the Boston Consulting Group, said words to this effect: “We are currently understaffed by about 400 and need a total of 800 staff to cope with the DFA.” Union rules and poor communication with self-regulatory institutions (read FINRA) were cited as some of the challenges. Read this in conjunction with 1 above and you realize that even if budgets were available, you are not going to find people of the required caliber in the required numbers quickly.
4.       Internal training: Consumer education et al are great things to do, but the regulators across the board will require significant training and collaboration with other centers of expertise within the government. It is fair to say that not every examiner of every agency understands the nuances of all the complex swaps, derivatives, hedge funds and dark pools to a level that he or she can actually go ahead and examine them to assure regulatory compliance. Legal issues will come up as entities fight fiercely to maintain the confidentiality of their intellectual property in trading strategy models and formulae; and even if those were overcome, the agencies would still need the domain and technology expertise to reverse-engineer those models far enough to verify whether they break any laws. The federal agencies will need to collaborate in innovative ways with universities and specialized agencies, not only in the United States but globally, to really create the capability needed to implement the DFA in a meaningful way.
5.       Plugging the holes: In the meanwhile, the new Ponemon Institute report shows the goings-on in the data theft and breach world. They surveyed 51 organizations from 15 industries (including financial services) and report that the cost of data breach remediation continues to climb; the average cost across surveyed entities in 2010 was $7.2 million. They, of course, do not include the costs that are not immediately visible in monetary terms, such as reputation damage or clients walking away to a competitor, and we all know that those costs eventually extract a far higher price.
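On the definitions challenge in item 2 above, here is a minimal sketch of what an entry in such an industry lexicon might look like once terms carry legal weight. The fields and the sample entry are my own assumptions, purely illustrative, not an actual regulatory artifact.

```python
# Illustrative lexicon entry for legally binding term definitions --
# fields and the sample entry are hypothetical.
from dataclasses import dataclass, field
from typing import List

@dataclass
class TermDefinition:
    term: str
    definition: str          # the precise, rule-ready wording
    aliases: List[str] = field(default_factory=list)  # loose market synonyms
    status: str = "draft"    # draft -> public comment -> final
    defined_by: str = ""     # which agency or rule owns the definition

lexicon = {
    "swap dealer": TermDefinition(
        term="swap dealer",
        definition="an entity that holds itself out as a dealer in swaps ...",
        aliases=["dealer", "market maker (swaps)"],
        status="public comment",
        defined_by="CFTC",
    )
}
print(lexicon["swap dealer"].status)
```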
These are huge challenges, and it will test the mettle of the government, from the President to the Fed and all the agencies, to take the well-meaning DFA to the state of a comprehensive and enforceable law. All this is not to say that it is not good or anything of that sort. In spite of all the issues and challenges, this step is pioneering and, if successful, will establish a benchmark for regulating the financial services industry of the future. America leads the world in financial services and it must lead in financial services regulation too, but it will take a lot more than good intentions to make it work.



Friday, March 04, 2011

Leveraging NIST 800 body of work for regulatory compliance

Regulators should leverage NIST’s body of work
People are tired of hearing it from me, but I am not tired of saying it: I am a big fan of the work done by NIST over the years. You hear many ‘experts’ saying that NIST is behind the curve – we are in the Web 2.0, cloud, social networking, iPad era, and NIST guidance is for the ’90s. There is an element of truth in that, but I would like to state the following:
1.       There are thousands of people and companies innovating every day, and for every major product or service that gains acceptance, there are 19 that fade away. As a rule-making body, you cannot be analyzing and making rules for all 20, only for the one that gains widespread usage. The remaining 19 (or those that survive) will be used by some folks because they are particularly relevant to them, or because they are geeks who love non-standard stuff, but you cannot invest time and money in making rules around each of them, especially if you are a public body that gives away most of its work free in the public domain.
2.       It is important to understand that rule-making, while important and useful at a granular level, is not about technology. It is about creating a systematic approach to securing and monitoring the technology platforms that you use. Many people see these rules as if they need a one-to-one correspondence with each technology product, and that is quite unnecessary. Even with something as granular as PCI DSS, there will be a new security device that does things differently, and one may say PCI DSS does not give guidance on how to handle it and hence is behind the curve. Security, audit, monitoring and reporting principles are universal and, if adopted properly, equip a good security analyst or auditor to easily adapt them to new or emerging technologies.
3.       NIST is accelerating its research and publication process in 2011 and hopefully will receive the funding and priority to keep it up going forward.
Let us take a new release from NIST, “Managing Information Security Risk” (SP 800-39), released this week (March 2011). It introduces two important ideas for any organization to consider:
1.       Multi-tiered strategic view of risk management:
a.       Tier-1 view of enterprise wide risk (tolerance policy, investment in ERM, appetite)
b.      Tier-2 view of business process level risk (point of failure, architecture, controls)
c.       Tier-3 view of Information system level risk (SDLC, vendor security, audits)
(Only after performing this analysis should you move to tactical risk management actions.)
2.       Lifecycle view of risk management
a.       Closely linked to the above, see the total picture of what you are doing and do not spend money and efforts on stand-alone sporadic actions
b.      E.g. you run a “point-in-time penetration test” on your network devices to comply with, say, PCI DSS, and the next day you have a major breach that shakes your company and its reputation. Why does it happen? There are many reasons, and they are mostly managerial rather than technical: how did you decide which part of your network needs to be tested, where is your data stored, what is your access control policy and practice, are your configurations and security patches up to date, and so forth. Most of the vendors and tools you use for a penetration test will do a reasonable job (occasionally they do a bad job as well), but the risk management process fails to tell them what to look for, and that is why the money and effort spent on that test does not get you results. A sketch of how the tiered view might be made concrete follows below.
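As a rough sketch of how the tiered view could be made operational, the following encodes one discipline from 800-39: a Tier-3 (system-level) finding must trace to a Tier-2 business process and ultimately to Tier-1 risk appetite, so no tactical action exists in isolation. The structure and names are my own illustration, not prescribed by SP 800-39.

```python
# Illustrative tiered risk register in the spirit of NIST SP 800-39 --
# the class and its fields are my own sketch, not NIST's data model.
TIERS = {1: "organization", 2: "business process", 3: "information system"}

class RiskItem:
    def __init__(self, tier, description, parent=None):
        if tier not in TIERS:
            raise ValueError("tier must be 1, 2 or 3")
        # Enforce the chain: Tier 3 hangs off Tier 2, Tier 2 off Tier 1,
        # so a point-in-time action always traces to enterprise strategy.
        if tier > 1 and (parent is None or parent.tier != tier - 1):
            raise ValueError(f"a tier-{tier} item needs a tier-{tier - 1} parent")
        self.tier, self.description, self.parent = tier, description, parent

    def lineage(self):
        """Render the chain from this item up to the enterprise-level view."""
        chain, node = [], self
        while node is not None:
            chain.append(f"[T{node.tier}] {node.description}")
            node = node.parent
        return " <- ".join(chain)

# Example: a penetration-test finding that is anchored in strategy, not ad hoc.
appetite = RiskItem(1, "low tolerance for cardholder-data exposure")
payments = RiskItem(2, "card payment processing", parent=appetite)
finding = RiskItem(3, "unpatched edge device found in pen test", parent=payments)
print(finding.lineage())
```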
I do not know if bodies such as the FFIEC and the SEC make conscious and extensive use of NIST resources, as NIST’s mandate is federal government security (FISMA). The Dodd-Frank Act is sure to bring many more regulatory rules (the process has already begun, and we can clearly see that the documentation aspects of compliance are going to expand significantly), and NIST provides a great foundation for an organization to take a very systematic, regulation-agnostic approach to technology risk management – “in-principle” and “proactive” compliance rather than “topical” and “reactive” compliance. No business manager needs to be told which is better.
If a regulator can give specific guidance on how compliance is measured, it enables the regulated entities to work on it. E.g. because PCI DSS provides detailed instructions, compliance has become an organized process. On the other hand, because SOX 404 does not state anything about how, what or where, compliance efforts have been sporadic and expensive, and many extensions and dilutions were granted in its implementation.
A hacker hacks into a government website using exactly the same tools and techniques that he will use to hack into a bank’s website. Hence, if the NIST guidelines are good enough to protect the government website, they are good enough to protect the bank website too.
If you use NIST as your information security framework, you don’t need to wait for all the rules of the DFA to emerge; you can proactively accelerate your process, not only to comply with the laws but to actually make your information more secure. Don’t hesitate to tell your regulatory examiners and external auditors that you use NIST as the foundation for your information security governance. If they continue to maintain a poker face, you know you have made progress.
Buck Kulkarni
March 4, 2011 


Thursday, March 03, 2011

Dodd-Frank Act Op & Tech Implications Tracker

The regulations, the regulators and the regulated
The US financial services sector regulatory scene has been very confusing, to say the least, for the past two decades: over-regulation with too many regulators on one hand, and big gaps in regulation on the other. On top of all that already exists, we have the Dodd-Frank Act taking shape, and it is reasonable to say that both the regulators and the regulated are struggling to deal with it.
Fundamentally, any compliance effort has two elements:
1.       “In-principle” compliance strategies & decisions – an organization may decide to become a holding company, get out of the credit card business, or terminate or initiate associations and partnerships to position itself in a certain manner vis-à-vis the regulation(s). These are highly organization-specific decisions and are not the focus of this blog series.
2.       “In-practice” compliance strategies, decisions and implementation – once you decide you have to comply with a regulation, you have to start taking steps that will enable you to comply. This includes understanding the Act, the individual rules, their applicability to your circumstances, your current situation and the compliance gaps, the remediation steps needed, the prioritization of remediation actions, the desired state and roadmap of compliance and, having achieved compliance, how you stay there. I call this the “GET COMPLIANT, STAY COMPLIANT” process: it has a definite beginning and a definite non-end but, if done properly, can save you a lot of heartburn, effort, reputational and business risk and, of course, a ton of money (a minimal sketch of the idea follows after this list). People call me naïve (among other things) but at the time of this writing, I still believe good regulatory compliance can be a very significant competitive advantage.
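To make the “get compliant, stay compliant” idea concrete, here is a minimal sketch of a recurring compliance run. It is my own illustration under broad assumptions, not a prescribed DFA process; the rule IDs and the placeholder tests are hypothetical.

```python
# A minimal "get compliant, stay compliant" loop: every rule maps to checks
# that are re-run each cycle, so compliance is a standing state, not an event.
# Rule IDs and the sample tests are hypothetical.
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class ComplianceCheck:
    rule_id: str                 # e.g. a hypothetical "DFA-723" clearing rule
    description: str
    test: Callable[[], bool]     # returns True when the control currently holds

def run_compliance_cycle(checks: List[ComplianceCheck]) -> List[str]:
    """One compliance 'run': execute every check and return the gaps found."""
    return [f"{c.rule_id}: {c.description}" for c in checks if not c.test()]

# Example cycle; real tests would query systems of record, not return constants.
checks = [
    ComplianceCheck("DFA-723", "all eligible swaps routed to a DCO", lambda: True),
    ComplianceCheck("DFA-841", "swap records retained per lifecycle", lambda: False),
]
print(run_compliance_cycle(checks))   # ['DFA-841: swap records retained per lifecycle']
```

Remediate the gaps, tweak the checks as each rule is finalized, and run the cycle again: that is the definite non-end.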
The US financial services industry has had the SEC, the OCC, the FDIC, the OTS, the NCUA and the Fed as its major regulators. The creation of the FFIEC as the central examination standards setter has served both the regulators and the regulated well. It is somewhat fashionable to talk of the FFIEC and NIST as ‘behind the curve’, which is, in my opinion, unfair and, more importantly, rather immature. Just because the FFIEC does not have an examination for cloud computing does not mean it is behind the curve: experienced auditors and examiners know the principles and can easily apply them to emerging technologies. This is not to say the FFIEC or NIST should not accelerate, but it does not make them any less useful or relevant. No regulatory or standards-setting body can keep up with the thousands of innovators and entrepreneurs pushing the boundaries of technology every day – technology that any RE is free to implement. Bureaucracy and so forth aside, these bodies have done tremendous work to create a common intellectual property, and as Dodd-Frank and other regulations mature, I hope the agencies leverage these two assets intelligently. While NIST is charged with laying down the information security standards for the federal government, are the information security requirements of a bank or a private sector organization any different? I will come back to these themes later.
Today we start with a primer on the Dodd-Frank Act, formally the Dodd-Frank Wall Street Reform and Consumer Protection Act (P.L. 111-203). Clearly the most ambitious piece of legislation in recent memory, it aims to regulate a very wide swath (if not the whole) of the US financial markets. The legislation, named after Senator Christopher Dodd and Representative Barney Frank, who drove it with strong zeal, was signed into law in July 2010.
Some data points to gauge the size and the scope of the law:
1.       16 Titles, each of which will evolve into a major, complex law covering different aspects of the banking, financial services and insurance industries in the United States
2.       Creates three new entities – the Financial Stability Oversight Council, Bureau of Consumer Financial Protection and the Federal Insurance Office
3.       More than 25 major studies (I mean major) will need to be conducted to scope, understand, define and get the government’s arms around the foundational issues
4.       More than 250 rules will need to be drafted over next 18 months with permissible time extensions where needed
5.       Will try to streamline the federal and state regulatory infrastructure onto a more cohesive and unified platform, to ensure full coverage while avoiding the duplication, dilution and regulatory arbitrage enjoyed by some REs presently
6.       Thousands of market terms will now need a precise definition if they are to be used in a law – e.g. what is a ‘family office’, an ‘accredited investor’ or an ‘off-balance-sheet item’? This work is crucial, as any open items here will simply open up new regulatory arbitrage (regarb) opportunities.
Enough for now; we will keep coming back to it. As rules are formulated and issued (by different agencies), they can significantly impact how the American financial services industry works, complies, competes and succeeds in the highly integrated global marketplace of the future. Equally important, if not more so, is how the regulators maintain their perspective, write the rules and create enforcement capabilities without ending up stifling the industry’s ability to innovate and compete in the integrated global markets of tomorrow, especially as rival markets and currencies emerge in different parts of the world.
I intend to write regularly on the operational and technology aspects of compliance, from March 1, 2011 to about the end of 2012, on the very complex but compelling US financial services regulation scene, as some old dominoes fall and some new terms enter our lexicon with the potential to change the US financial services industry in ways unimaginable at present. Despite all the politics, posturing, rhetoric and the ‘while-we-are-at-it, let-us-fix-that-too’ approach, and of course the possibility that it may be repealed, I feel the Dodd-Frank Act has the potential to do more good than harm and possibly make the American financial services industry more competitive in the years to come. I look forward to your comments and experiences as all of us work our way through this challenging journey.


Saturday, August 05, 2006

Cornell Project Management Methodology - a good way to get going

I have been a great fan of quality and process platforms all along, and over the past 25 years I have been involved in engineering / project management methodologies such as SSADM, SEI CMM, ISO & PMI. These disciplines grew in response to two distinct challenges faced by the IT community: managing the technology better, and managing the technology project better. Pioneers like Ed Yourdon empowered people to look at software development as a scientific process and took it from the realm of pure art to (at least some) science, and the software process methodologies empowered people to see that projects can actually be planned and are not totally an ‘act of God’. Even by the mid-80s, too much software was already circling the globe; folks on the inside were feeling the pain of bloated software and looking for help, and the situation now is of course significantly more acute.

COBIT, ITIL and ISO 17799 (or NIST for federal organizations) took the tool-set to a new level by establishing best-practice frameworks to guide projects as well as programs, development as well as operations. With regulatory compliance reverberating through the corporate world, other frameworks such as COSO have now entered the mix and have performed a very valuable service by taking (mostly) IT projects mainstream within organizations.

While each of these has played a valuable role in providing the community with tools to optimize their project planning and execution process, the community’s acceptance of these tools has been painfully slow. I think most organizations and people have outgrown the ‘we don’t need all these fancy tools’ stage and want to implement methodologies that will enable them to manage their IT better. However, putting them to actual use has been very difficult. These methodologies, in some ways, overlook the natural and basic process by which people absorb and utilize things. People would typically like to start with a small, low-impact project to get a feel and, once sure of the implications and results, make the investments to apply them organization-wide. Most of the tools are not suited to this approach. You have to invest a lot of time and effort to create the minimum platform before you can get going, and the effort feels disproportionate to the experiment you want to conduct. Obviously, this impression is misleading – if people do implement these tools with rigor and commitment, they will see the benefits – but a lot of organizations do not have the resources or patience to go through this, and end up dropping the initiative and sliding back to the ‘wild west’ ways of doing things that some people did not want to give up to begin with.

I recently came across Cornell University’s Project Management Methodology (CPMM). Cornell IT created this custom version of project management to meet their specific internal needs and goals, and they acknowledge it to be based on the PMI BoK. I see several merits in this version from a practical adoption perspective. First, it is a simplified implementation (though certainly not simplistic); anyone who has done projects for a few years would be able to implement it easily. Second, the WBS model is at a level where you don’t have to cross-reference things at a granular level, which gives tremendous flexibility to choose the level of detail you wish to have in your implementation without getting bogged down. Third, it uses very little tech-heavy language or notation, so you can involve all the stakeholders in the project – a real productivity boost, as everyone talks from the same document. Most templates are in Word and Excel rather than .mpps and .vsds that only specialists know how to load and read. Then there are some powerful guiding principles like SMART, a visual map of the five-phase process, document templates, and a constant connection to the business case and all stakeholders to ensure that the project is not ‘hijacked’ by a dominant stakeholder (U-know-who). And finally, it seems ready to use irrespective of the size of your project – a big plus for organizations starting out on an exploratory journey.

If you have not seen it already, take a look (projectmanagement.cornell.edu). If you are seeking ways to introduce project discipline in your organization without being too expensive or disruptive, you might get some good ideas. If simplifying the project management process and unifying all your stakeholders are your goals, you might be in for some pleasant surprises at how much CPMM delivers.

Buck Kulkarni

Tuesday, August 01, 2006

GRC and the business manager

All corporations are doing something about compliance and some are doing more than others. While some are putting in the minimal work that they think they can get away with, others see this to be far more fundamental to the long-term success and profitability of their business and are investing aggressively.

Most companies started their compliance effort with specific Sarbanes-Oxley (SOX) compliance requirements or with specific weaknesses identified in their operations by auditors. This resulted in many of them ‘passing’ their first yearly audits, and they felt they had their arms around compliance.

But it was not to be. Compliance proved to be an equally, or even more, elusive goal in the next year and the realization dawned on many companies that they had created compliance silos that were very rigid, expensive and difficult to maintain.

Over the last 2-3 years, different players within the corporate entity have come to understand what is at stake and how they need to engage in the rather difficult and sometimes nebulous process of achieving compliance. Board members understood the exposure. The CEO and CFO understood the serious repercussions for their lives and careers. Audit and risk management folks knew a lot about it already and were happy to see their agenda finally getting the attention they always knew it deserved. IT and infrastructure started with an ambivalent ‘tell us what you want and we will fix it’ attitude rather than getting proactively engaged. Some of them learned and got on board; others did not, and got run over by the risk, security and compliance folks.

But one key stakeholder has still not shown up in strength: the business manager. An executive running a region, a line of business, a product or a combination thereof is still rather removed from the nuts and bolts of compliance. It is a bit unnerving to watch this unfold in company after company you work with.

The reasons are many, some obvious and some not so obvious. The simpler ones are that business managers are too busy with critical operations (or making money for the company), they are not legal and accounting savvy, they are not IT savvy etc. But the real reason you learn after speaking with many business managers is that they don’t think it is their job. Business managers are the ‘line managers’ so to speak and compliance, just like accounting, HR, security, facilities, is a ‘staff function’. It is part of the eco-system that the company is supposed to provide to the business manager to run the business.

While not totally wrong, it is increasingly anachronistic in the modern business model. Just as a business manager has to involve herself in HR to ensure her people are happy and productive, has to involve herself in accounting to understand the profit, loss, commissions, incentives, market shares of her business operation, she has to now understand the compliance situation to conduct her business in a safe, uninterrupted and credible manner.

But business managers need some help. If all they hear is firewall and IDS protection, COSO framework and material weaknesses, it is difficult for them to get focused on compliance. However, if we tell them that 45% of their customers may have difficulty conducting business on their website if we do not do this (or that), they immediately get engaged and, in fact, push compliance far harder than many other stakeholders.

I recall that some aggressive banks had lists of competitors and their important customers that they would target should a bank fail due to Y2K problems. While we never heard of that actually happening, it is a good pointer to where compliance is headed. Very soon, it is going to be a competitive differentiator, and nobody needs to worry about it more than the business manager.
Buck Kulkarni