
Management Strategies

Mortgage Information Provider Turns to Utility Computing

Brian Davenport, senior VP and CIO of Stewart Mortgage Information, a provider of financial services to banks and mortgage companies, last year decided to outsource most of the company's data-center operations rather than undertake a major upgrade of its infrastructure.

It turned to VeriCenter Inc., which aims to become a true utility-computing vendor by building a network of regional data-center capacity that is currently sold under flexible monthly contracts but eventually will migrate to a usage-based model. "We wanted to find a way we could provide a high level of service without incurring the extremely high infrastructure costs," Davenport says. Stewart Mortgage went live with VeriCenter in October and to date has seen a 15% to 20% reduction in IT expenses, a figure Davenport expects to improve further.

Twenty-one percent of respondents to a survey by InterUnity Group and AFCOM say they plan to implement utility computing next year, and 10.6% of those already using the model expect to increase their use of it next year. Vendors with utility-style offerings, such as VeriCenter, Hewlett-Packard, IBM, and Sun Microsystems, are starting to see returns on their investments in the model.

Over the past year, Sun has introduced programs that provide access to its grid-computing network at $1 per CPU per hour for computing and $1 per gigabyte for storage. Jonathan Schwartz, president and chief operating officer, says Sun is working with more than 10 companies with computation-intensive workloads on proof-of-concept programs that will lead to multithousand-CPU, multiyear contracts. "But to me, what will be more interesting is the long tail, the marketplace for demands of very small increment CPU loads, which eventually will be a bigger market than the large-scale implementations," Schwartz says.
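
To put those rates in perspective, the arithmetic is simple. The Python sketch below estimates a monthly bill at Sun's quoted prices; it is illustrative only, and it assumes storage is billed per gigabyte per month, which the quoted rate does not specify. The function name and job parameters are hypothetical, not any vendor's actual interface.

    # Back-of-the-envelope cost estimate at the rates quoted above:
    # $1 per CPU per hour for computing and $1 per GB for storage
    # (assumed here to be billed monthly).

    CPU_HOUR_RATE = 1.00    # USD per CPU per hour (Sun's quoted rate)
    STORAGE_GB_RATE = 1.00  # USD per GB of storage, assumed per month

    def monthly_cost(cpus: int, hours: float, storage_gb: float) -> float:
        """Estimate one month's charge for a compute run plus stored data."""
        compute = cpus * hours * CPU_HOUR_RATE
        storage = storage_gb * STORAGE_GB_RATE
        return compute + storage

    # Example: a 200-CPU job running 8 hours, retaining 50 GB of results,
    # comes to 200 * 8 * $1 + 50 * $1 = $1,650 for the month.
    print(f"${monthly_cost(cpus=200, hours=8, storage_gb=50):,.2f}")

At increments that small, even tiny workloads can be priced individually, which is the "long tail" market Schwartz is describing.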

The price point for getting into utility computing today is around 50 cents per CPU per hour, says David Gelardi, VP of deep-computing capacity on demand at IBM, but each engagement must be negotiated around specific computing and software requirements. "You really can't look at capacity on demand in the same way as a utility like water or electricity because it's more sophisticated than that," he says. "We're not there as an industry yet, and I know most clients aren't there yet."

Utility computing will be a constantly evolving technology over the next decade, says Steve Prentice, an analyst with research firm Gartner. "What we are going to increasingly see is an infrastructure that's a mix of corporate-owned data and externally purchased services that are blended together in, hopefully, an almost seamless patchwork at the point of delivery."

Illustration by Steve Lyons

Return to the story:
Step Into The Future

Continue to the sidebar:
CPU Cool: Getting Faster But Not More Power-Hungry
