Financial sector software failures fill our news pages – from banks failing to make payments to cash machines giving away free money, these high-profile crashes are just the tip of the iceberg.
The challenge of ensuring quality of software and IT in the banking sector is nothing new. Banking IT is highly complex, with different systems for both retail and investment banks. A typical banking IT scenario may encompass several different systems from multiple vendors, developed independently and integrated over a period of twenty years. The business supported by the systems is so wide that no single team understands the entire IT infrastructure.
Adding to this complexity are emerging and relatively disruptive technologies. Gartner has recently coined the phrase “Nexus of Forces: Social, Mobile, Cloud and Information.” According to the industry analyst, “the Nexus of Forces is the convergence and mutual reinforcement of social, mobility, cloud and information patterns that drive new business scenarios.”
Unfortunately, these forces are driving change at a pace that many banks are unable to sustain. Poorly documented legacy systems are unable to integrate with the new forces/technologies and this status quo is leading to the banking sector’s failure to capitalize on new opportunities.
Chaos of Forces
There is an inherent and undeniable link between change and quality that many IT departments across the banking sector already recognize. However, as banks attempt to introduce the new forces and become more agile, complexity increases and the impact of change is tougher to analyze. Banks, alongside investment firms and insurance companies, are failing to view the forces in a connected fashion. For example, if a bank is involved in mobile development, there may be limited recognition that this development may relate to social or cloud activity. Rather, the development is treated as an isolated activity with the bank failing to understand the interdependency of these domains.
Mobile – A Force to Be Reckoned With
The highest-profile force in banking today is mobile. To banks, a mobile phone is becoming a new wallet that could make credit cards obsolete and one day manage all financial transactions. Although cloud computing is regarded by Gartner as the cement between the forces, in banking it has taken a back seat due to concerns over security, regulation and compliance. Mobile banking in the US is relatively advanced; its counterparts in the UK, for example, are lagging behind.
According to the Federal Reserve, in its Consumers and Mobile Financial Services 2013 report: “As of November 2012, 28% of all mobile phone users and 48% of smartphone users had used mobile banking in the past 12 months. This is a significant increase from 21% in December 2011 for mobile phone users and 42% for smartphone users.
“While relatively less common, the use of mobile phones to make payments at the point-of-sale increased threefold over the same period, with 6% of smartphone owners having used their phone to make a purchase.”
While mobile may become the dominant force in banking in the short term, it also presents new quality challenges. The discipline of quality and testing is evolving to ensure that mobile devices, software and apps behave as expected and this evolution affirms that the mobile quality landscape differs from the rest of the IT world. For instance, for each mobile platform such as Android, Apple iOS or BlackBerry, there is a specific testing toolset. Ensuring cloud quality also presents similar issues, whether banks are faced with testing the integration of applications from a cloud service provider, or creating a test environment in the cloud.
Data Protection and the Cloud
International laws around data protection present a number of challenges to financial services providers globally when considering cloud computing adoption. Customers increasingly demand to know where their data is held and pose questions on data sovereignty and liberation.
If US banking customers’ data are stored in Europe, is their personal information safe, and who is liable if it isn’t? As more banking organizations adopt the cloud, the answers to these questions remain unclear.
There are initiatives underway to bring international law in line with the realities of cloud computing. For instance, Microsoft’s General Counsel, Brad Smith, has been lobbying the EU to harmonize data retention requirements and further extend the flexibility that allows for international processing of EU data. He’s also lobbying Congress to bring privacy and trade laws into line with today’s technology.
In terms of quality, banks are either reliant on a cloud service provider’s quality checks, or more sensibly will perform their own quality assurance to ensure that the cloud service integrates well with existing IT infrastructure.
Testing as a Service (TaaS) is a growing phenomenon; however, it raises the same data security conundrum, and in the short term virtual testing will cover only non-sensitive data.
Building a Quality Strategy for the Nexus of Forces
Speed, accuracy and confidence in the ability to implement change are major success factors that hinge on quality. For an effective quality program, banks need to understand how the new forces align within an overall strategic vision, as opposed to having separate strategic visions for mobile, social, cloud and information.
An effective quality strategy takes into consideration legacy IT infrastructure, back- and front-office requirements and the interdependency of the new forces, and it should address the following major areas: scope, resources, environment, data volume, timeframes and risk evaluation. Depending on the institution’s focus, understanding one force and its dependencies is a key first step toward quality that embraces old and new technologies.
As banks adopt the forces, quality management is also increasingly embracing an integrated approach to solve what is, at heart, an integration problem. By centralizing testing services for applications and application development, banks are creating Test Centers of Excellence (TCoE). Typically outsourced, the CoE offers a core competency of testing coupled with independence, and is geared to support complex systems that are in constant flux and renewal.
Indeed, creating a CoE ties into the holistic view of the Nexus of Forces. A TCoE avoids the problems of a few dedicated testers across the organization who operate in silos and are in danger of large scale duplication of testing activity. Bringing quality into a TCoE means that wider integration issues and associated risks are more readily identified, as the team involved in performing functional, regression, automated or performance testing holds a good understanding of the environment that all products or software operate in.
With the new forces changing the face of banking, now is the time to stop operating in silos and start gaining one view of quality across the entire organization. Today, test CoEs provide that coherent view and are helping banks to identify and prioritize the areas that hold the most risk, while introducing new technologies.
Rob McConnell is the market director for Northern Ireland at SQS Software Quality Systems, a consultancy focused on management solutions and services aimed at improving transparency and efficiency in IT and business processes.