Anti Money Laundering, Enhanced Regulatory Control, Improved Data Integrity, Timely Management Information

There is a Magic Bullet

www.stbsystems.com

Among the many costs that financial institutions have to bear, some of the hardest to justify are those of being regulated: being seen to be on top of it all by submitting endless home country and head office reports, and of course acting as an unpaid adjunct to various local law enforcement services in seeking out financial crime. James Phillips, Business Development Director of STB Systems, explains how financial institutions can benefit from (rather than merely tolerate!) regulatory, anti-money laundering and head office reporting, while at the same time reducing the costs, capital charges and risks associated with regulatory and other integrity-related issues.

Regulatory control and reporting is a cost
No-one needs to be told that there is a cost of keeping a financial institution's nose clean. Chief Financial Officers and others in management will have noticed that this cost just keeps on rising year after year. Not only are the costs of standing still increasing, there are now the additional costs associated with new emerging risks, such as the forthcoming regulatory capital charge for operational risk for banks, or the risk of actual regulatory censure and reputational damage in the event of getting caught up in some aspect of money laundering or bank secrecy violation.

The Bank for International Settlements has long provided the benchmark for banking regulatory supervision. However, with Basel II, the revised capital adequacy framework, the BIS has introduced what will probably be the most expensive upgrade to the regulatory regime in banking history. The cost of complying with Basel II will be huge, and it adds unceremoniously to what has already been a fairly expensive decade or so of heavier financial regulation. The increased capital charges associated with Basel II, and in particular those intended to cover operational risk, are the most newsworthy.

There is already a high cost to compliance. Over the next few years, however, in preparation for Basel II, increased information technology spend and the human resource costs of compliance management and training will add further pressure. These costs are such that many banks may find they are unable to operate effectively, and may even consider closing rather than continuing to push water uphill. Indeed, it has been suggested that rather than spend even more money on regulatory compliance, and thus erode capital, it might be better to put the funds under the mattress as a capital buffer instead. The only problem with this is that regulators, looking after the public interest, would be unable to measure and quantify the risks involved and so satisfy themselves that the capital under the mattress is sufficient.

To balance the budget, CFOs will have to find ways of complying with regulations effectively while accepting that additional resources may have to be deployed on regulatory compliance, leaving them little option but to reduce costs elsewhere. How is this to be done?

Reconciliation is a cost
At the same time, organisations clock up considerable internal costs in the unceasing quest for "the right answer". Many hours are regularly spent on reconciliation tasks, working out why one number on one report does not tie back to the subtotals on another. This weakens confidence that the organisation really knows its own position. As a consequence, regulatory risks increase in parallel with the pure business risks of being outstripped by competitors, or of not correctly understanding the nature of one's own business. The risk of failing to exploit profitable business while time is spent on less worthwhile activities is a constant niggle.

It therefore follows that being continuously comfortable that the information you receive is accurate, and that your compliance and regulatory activities are not flawed (the concept of undertaking "continuous business audit"), carries great value. By ensuring the integrity of data across all their operations, organisations can eliminate swathes of wasted time spent adding up, cross-casting and performing other virtually manual tick-backs. There are ways of achieving this.

Lack of quality data about your business is a cost
The absence of confidence in the integrity of data, or in the reliability or appropriateness of installed computer systems, also frequently leads to wholesale system changes, and more huge costs, in pursuit of a Holy Grail. Organisations embark on vastly expensive enterprise resource planning projects, replace their general ledgers, install data warehouses, or attempt to force through standardisation projects that aim to put all their branches onto the same transaction processing sub-system. This last approach can itself compromise operating efficiency, as different branches may have different specialisations: wholesale in one place, retail in another.

In many cases it is simply not feasible for a bank, securities firm or similar organisation to replace all of its existing internal systems, either on grounds of cost or of straightforward impracticality. Equally, it is a home truth that whatever improvements are made to sub-systems, it is just about impossible to stop some departments booking data into their favourite desktop office packages, which may perform no data validation at all, so the data integrity issue will keep rearing its ugly head whatever you do.

For smaller or medium-sized organisations, the systems creating the data problems may be core systems, and there will be no viable cost-benefit case for changing these source systems purely to improve data quality. Larger firms may have been able to afford wholesale systems upgrades, but they are still dogged by the greater complexity of their business, which introduces yet further scope for error, whether unintentional or the result of insider activity that may be fraudulent and needs to be detected, something the complexity of a large organisation's infrastructure makes very difficult. So far, then, an unhappy story.

Anti Money Laundering control is a cost
And here's yet another cost. There is no need to repeat here all the issues relating to the requirement for vigilance in respect of money laundering and other forms of financial crime. It is, however, worth reviewing the different approaches to using automation as a solution, and noting that certain technologies might not be right for certain types of organisation, or for certain specific high-risk areas.

At one end of the scale, there are neural networks and data mining products. These very powerful software engines will burrow through years of historical data, as well as current data, to uncover patterns and activities that you did not even know you were looking for.

These systems operate on an enterprise-wide basis, pooling millions of transactions from any or all parts of the organisation in order to seek out such patterns. This type of technology does not, however, lend itself to smaller firms, on grounds of cost and overkill, and it does little to prevent an inappropriate customer from opening an account with you in the first place. Some systems are near real-time, running 24 hours a day, and give financial institutions facilities to track and profile transactional activity where customers, or groups of customers, deal with the bank in many different locations. These technologies aim to pool data so that multi-branch financial institutions can see the whole picture of a customer's activity, which taken as a whole may reveal a suspicious pattern that would never be spotted locally. Most of these systems are fiendishly expensive, far too costly for the regular branch operation.
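The underlying idea of profiling, flagging activity that deviates from a customer's established pattern, can be illustrated with a deliberately minimal sketch. The function name, the fixed deviation threshold and the amounts below are invented for illustration; commercial profiling engines use far richer statistical models than a single z-score test.

```python
from statistics import mean, stdev

def flag_unusual(history, amount, threshold=3.0):
    """Flag a transaction whose amount deviates from a customer's
    historical average by more than `threshold` standard deviations."""
    if len(history) < 2:
        return False  # too little history to build a profile
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return amount != mu  # uniform history: any change is unusual
    return abs(amount - mu) / sigma > threshold

# A customer who normally moves around 100 suddenly moves 5,000:
print(flag_unusual([95, 110, 102, 98, 105], 5000))   # True
print(flag_unusual([95, 110, 102, 98, 105], 101))    # False
```

Even this toy version shows why pooled, enterprise-wide data matters: the "history" must cover every branch the customer uses, or the baseline itself is misleading.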

Alternatively, there are systems that establish normal patterns of banking activity for individuals, for accounts or for groups of accounts. There are filters that focus on payment streams and automatically intercept payments or movements involving parties designated, for one reason or another, as undesirable to do business with. Finally, other systems and services focus exclusively on recording customer verification data, enabling account managers and compliance officers truly to say that they have procedures in place demonstrating that best efforts are being made to "know your customer".
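The payment-filter idea can be sketched in a few lines. The watch list and field names here are hypothetical; real screening systems match against official sanctions lists and apply fuzzy name matching rather than exact string comparison.

```python
# Hypothetical watch list: real systems screen against official
# sanctions lists, not a hard-coded set like this.
BLOCKED_PARTIES = {"acme shell holdings", "j. doe trading"}

def screen_payment(payment):
    """Hold a payment if either counterparty appears on the
    blocked-party list; otherwise release it for settlement."""
    for field in ("payer", "beneficiary"):
        if payment[field].strip().lower() in BLOCKED_PARTIES:
            return "HELD", f"{field} '{payment[field]}' is a blocked party"
    return "RELEASED", None

status, reason = screen_payment(
    {"payer": "Ordinary Importers Ltd", "beneficiary": "Acme Shell Holdings"}
)
print(status)   # HELD
```

The design point is that the filter sits in the payment stream itself, so an undesirable movement is intercepted before settlement rather than discovered in a report afterwards.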

So: organisations can choose not to automate anti-money laundering control, or to automate it only partially, and risk regulatory censure or reputational damage. Alternatively, adequate automation can be provided, and another cost incurred!

What to do? What's the Holy Grail?
The best solution must surely be to tackle many of these highly related compliance, data integrity, regulatory and management information requirements in one go. If project overheads make this impractical, as they may well, organisations facing these issues should nonetheless strive to use a single set of technology, eliminating the duplicated work that otherwise goes into preparing data feeds for project one, then project two, and so on. This is achievable, and is a real step towards that Holy Grail: quite simply, organisations will accrue major benefits by concentrating on systems that capture faults, automatically detecting issues, exceptions, data failures, breaches of regulatory tolerances, abnormalities and so on.

Thus, any time a mandatory field is not completed, an exception to a norm occurs, or an unacceptable value lands in a database, it is far better to discover this quickly than to find out at the month end when reports are due, or at the year end when clock-watching auditors sit about letting their fees accumulate while someone hunts down the source of a discrepancy.

A solution that collects all the meaningful data together, aggregates it, and tests it against business rules and other criteria that classify the data as good, bad or indifferent can transform the value of business information and reduce a financial institution's risk of financial or regulatory non-compliance in one fell swoop.

What is needed to achieve this is easily defined:

• Take data from core or sub-source systems, desktop office packages, or wherever else the data resides

• Pass the data to a single database that will normalise it, so that comparisons can be made

• Strip the data into subsets suitable for regulatory reporting, management reporting and anti-money laundering purposes

• Define validation criteria, such as tolerances for abnormal movements, essential data fields, and blocked entities, accounts or activities

• Apply business rules to test these validation criteria across all parts of the data, and fire off reports if anything fails your criteria, or those of the applicable regulatory or anti-money laundering authorities
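Taken together, these steps amount to an extract, normalise and validate pipeline. The sketch below shows the shape of such a pipeline; the field names, rule set and tolerance threshold are invented for the example and do not represent any real regulatory schema.

```python
def normalise(raw_records):
    """Map feeds from different source systems onto one common shape."""
    common = []
    for rec in raw_records:
        common.append({
            "account": str(rec.get("account") or rec.get("acct") or ""),
            "amount": float(rec.get("amount", 0)),
            "country": (rec.get("country") or "").upper(),
        })
    return common

# Validation criteria: each rule names a failure and a predicate for it.
RULES = [
    ("missing account", lambda r: not r["account"]),
    ("abnormal movement", lambda r: abs(r["amount"]) > 1_000_000),
    ("blocked jurisdiction", lambda r: r["country"] in {"XX"}),
]

def run_checks(records):
    """Apply every rule to every record and report the exceptions."""
    return [(label, r) for r in records for label, broken in RULES if broken(r)]

# Feeds pulled from core systems and desktop packages alike:
feeds = [
    {"acct": "1001", "amount": 2_500_000, "country": "gb"},
    {"amount": 50.0, "country": "XX"},
]
for label, rec in run_checks(normalise(feeds)):
    print(label, rec["account"] or "<no account>")
```

Because every feed passes through the same normalisation and the same rule set, an exception is raised the moment bad data arrives, rather than surfacing at month end, which is the "continuous business audit" idea in miniature.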

Thus, organisations can carry out one project, obtain one version of the truth, and do this work once only.

As a consequence of this approach, CFOs can expect a cleaner corporate nose. One implementation, aimed at achieving a continuous audit of the state of the business as it is represented in data, can deliver improvements that pay for that implementation many times over: first by reducing (or at least containing!) the costs of regulation through automating regulatory report computation; again by eliminating the unproductive and unsatisfying man-hours spent manually checking and cross-checking numbers, so releasing valuable resources for other work. Next, the CFO will, as a direct result of this approach, eliminate data faults and simultaneously learn more about the business: its trends, its patterns, its highs and lows. Add to this that the operational and reputational risks of not playing ball with Basel II or anti-money laundering protection requirements are much reduced, and it is a win-win situation all round.

About STB
STB don't do anything particularly glamorous, or worthy of front-page news. Our chosen space is to reduce the cost, to our clients, of keeping their noses clean with regulators, lawmakers and head office. Put simply, we improve the efficiency of collecting data, re-aggregating it and reporting it, and of identifying and reporting on any anomalies in it. These anomalies can be genuine data errors, or subterfuge perpetrated through fraud scams or money laundering.

We do so in such a way that the cost of this service is minimised and acceptable to a vast array of clients. It is all that we do, and we've been doing it for a decade. We therefore represent one of the external suppliers that can, through specific expertise, share some of the burden. In particular, it is spectacularly inefficient for each financial institution to reinvent the same wheel for the computation of Basel-dictated ratios, or to reinvent the same wheel in order to identify suspicious patterns of activity in an account. These areas can all be outsourced to specialised vendors, as one set of mandatory rules is the same for everyone. STB are ideally positioned to help.

Of course, not everything can be dealt with by using computer systems. Most obviously, it is not possible to automate a way out of a skills shortage or an endemic procedural control problem, or automate the eradication of rampant poor quality. STB do not, therefore, provide training or recruitment or other similar fixes for improving the quality of regulatory, anti money laundering and head office reporting. However, the company does provide cost-effective, efficient systems that satisfy these requirements, and provide added value as well in the whole area of continuous business audit.

 

James Phillips
Business Development Director