First thing I see is that the article is three-quarters of a decade old. Technology across the board has changed. For example, 4G was barely starting then, let alone the current push to 5G. I remember having conversations about WAN pipes of 500 Mbps for fairly significant organizations; now I have 1 gig down and 500 up at my house 😉. While I realize that Canadian banks are often ahead of all but the largest US banks in terms of technology (they should be, with the obscene profits they make from us!), I suspect that many, perhaps most, US banks have automated at least some of these processes, leading to less downtime.
Additionally, the uptake curve for cloud computing has obviously accelerated over that span as well. For many organizations that includes cloud-based redundancy, on-demand scale-up/out, containerization that makes spinning up new instances for various workloads almost blindingly fast, network optimizations that make a lot of what existed in 2013 look pretty archaic, and on and on.
On top of that, an increase in the adoption and use of methodologies like ITIL, Six Sigma, CMMI, etc. has led to better and better processes for reducing downtime/outages and increasing the speed for return to service.
Given all of that, it seems to me that any bank (or other organization for that matter) that is suffering from these kinds of issues is likely doing so due to one, or a combination of:
- Management/execs that don’t listen, or don’t listen well, to the IT experts they pay to understand how to make things better.
- Organizations that have to labor under truly arcane legislation, where the choice comes down to one kind of pain or another.
- Geographic limitations – it’s fair to say that none of the US carriers go out of their way to provide gig+ connections in the middle of Omaha or in the hill country of South Carolina (as examples). Yes, Comcast, this plucky Canadian is talking about YOU!
- Actual budget limitations that necessitate making compromises (I really doubt this applies to most of the industries that article was talking about).
I’d be interested in feedback on the following article. Are you seeing this in your industry as well?
According to ITIC, one hour of downtime costs over $100K in SLA penalties for 95% of large enterprises. Even a small delay can result in failure to meet the service-level agreements (SLAs) between you and your clients, leading to SLA penalties and a tarnished reputation.
In addition, Network Computing, the Meta Group and Contingency Planning Research found that businesses lose on average $84-108K for every hour of IT system downtime, with companies in financial services, telecommunications, and manufacturing topping the list of industries with the highest rate of revenue loss. Although IT system downtime isn’t always in your control, there are actions you can take to mitigate risks and prevent penalties.
Mortgage Loan Processing
Processing mortgage loans and new account documents is often slow and tedious, and often confusing for customers. To uphold a strong service strategy and retain clientele, banks need to ensure that clients are not kept waiting for their new accounts or mortgage loans to be processed.
Business Impact of Missing SLAs
Slow file-open times when accessing financial documents for validation and approval can clog the whole system. With millions of pages of documents involved in mortgage processing, even few-second delays compound, accumulating across the team and over the course of a year. Lost productivity = lost revenue.
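To make the compounding concrete, here is a rough back-of-the-envelope calculation. The per-document delay, documents per day, team size, and working days are all illustrative assumptions, not figures from the article:

```python
# Back-of-the-envelope estimate of annual time lost to per-document delays.
# All inputs are illustrative assumptions, not data from the article.
delay_seconds = 5              # extra wait each time a document is opened
docs_per_agent_per_day = 200   # documents handled per agent per day
agents = 50                    # size of the processing team
workdays_per_year = 250

lost_seconds = delay_seconds * docs_per_agent_per_day * agents * workdays_per_year
lost_hours = lost_seconds / 3600
print(f"Hours lost per year: {lost_hours:,.0f}")  # roughly 3,472 hours
```

Even with these modest assumptions, a five-second wait per document adds up to thousands of staff-hours a year, which is the "lost productivity = lost revenue" point in miniature.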
When agents have to wait several seconds to access each document, fewer loans and accounts get processed each day. Poor processing performance can cause failure to meet processing SLAs, and even lost business due to slow turnaround times. This drop in productivity can have lasting implications for business, such as low margins due to high mortgage processing labor costs, damaged reputation from poor customer reviews, and SLA penalties.
Preventing SLA Penalties
So, how can SLA penalties be prevented? It starts with improving new account and mortgage processing performance. An easy first step that reaps huge rewards is leveraging advanced document compression to reduce file sizes by 50% or more. By investing in a strong IT infrastructure, banks can drastically decrease delays and lower the costs of downtime.
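As a rough illustration of why smaller files open faster over a network, here is a toy transfer-time comparison. The link speed and file sizes are assumptions for the sake of the example; only the "50% or more" reduction comes from the claim above:

```python
# Toy transfer-time comparison: halving a file's size halves its download
# time over a fixed link. Link speed and file sizes are illustrative
# assumptions; the 50% reduction figure is the claim from the article.
link_mbps = 100                    # shared office link, in megabits/s
original_mb = 20                   # a large scanned mortgage file, megabytes
compressed_mb = original_mb * 0.5  # after a 50% size reduction

def transfer_seconds(size_mb: float, mbps: float) -> float:
    """Seconds to move size_mb megabytes over an mbps megabit/s link."""
    return size_mb * 8 / mbps      # megabytes -> megabits, then divide

print(transfer_seconds(original_mb, link_mbps))    # 1.6 seconds
print(transfer_seconds(compressed_mb, link_mbps))  # 0.8 seconds
```

Multiply that per-file saving across every open, and it feeds directly into the compounding-delay math above.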
Advanced document compression software can provide other productivity-boosting features as well. Document linearization makes it possible to display the current page of a document immediately, without waiting for all the pages to download, which speeds up the file-open experience immensely. Converting scanned image documents to searchable PDFs also makes it much quicker to find content within files, cutting down on manual search time.
Director of Strategic Alliances
Foxit Software Inc.