
Questions abound in the enterprise server industry. For instance, if you had to stake your business on PC-based technology or stick with larger legacy systems, what would you do?

Some vendors in the enterprise server arena plan to sway users into moving to PC-based enterprise servers. The mainframe era is dead, they say.

Opponents argue that the mainframe is the only machine that can do the job. Anything else presents too many management headaches. You can probably guess which side Intel Corp. is on.

The king of desktop PCs is hoping it can maneuver into the server business in a big way. Key to Intel’s strategy is its recently introduced Pentium Pro microprocessor.

Optimized to take advantage of true 32-bit software, the chip is the vehicle Intel plans to use to catapult itself into the server market, with Microsoft Corp.’s Windows NT by its side. Vendors of Intel-based products speculate that the chances of Intel succeeding in its enterprise strategy are quite good.

“Intel obviously has its act together and it sees in the long term that if it wants to continue to have a presence on the desktop, it also needs to have a presence at the enterprise (level) as well,” says Lary Evans, senior vice president and general manager of the server group at Dell Computer Corp., based in Austin, Texas.

“Over the last couple of years Intel has made substantial investments in being a serious player in the enterprise market,” he says.

Robert Lorentz, a server specialist with NEC Technologies Inc., based in Mississauga, Ont., agrees with Evans in the sense that companies want to move away from large, expensive systems. He says these companies want to migrate to smaller, but more powerful, PC-based servers. It’s on the question of where Intel comes into play that he offers a differing opinion.

“You have to ask yourself why Microsoft wanted a dual strategy,” Lorentz says. “When you look at the Microsoft CDs, you (use that software on) Intel or RISC.”

Not coincidentally, NEC Technologies is the world’s largest manufacturer of MIPS microprocessors, which are based on a RISC architecture from MIPS Technologies Inc., a subsidiary of Silicon Graphics Inc.

Intel may make inroads into the enterprise server segment, but only to a point, one consultant says.

“I think Intel works quite fine in the server environment up to a certain level,” says Don Thompson, a senior manager with the Deloitte & Touche Consulting Group in Toronto. “But past that, if you’re really looking for a lot of performance … you are probably looking to a more RISC-like architecture.”

Despite the performance RISC offers, the PowerPC family of processors — based on IBM’s POWER architecture, an acronym for Performance Optimization With Enhanced RISC — may not play a major role in the enterprise server domain, Thompson says.

“I think as fast as they (the PowerPC alliance) move, Intel will still be with them. Because Intel has so much market share the PowerPC is certainly not going to make a huge penetration.”

He points out, however, that competition from the PowerPC makes Intel’s production schedule that much more aggressive.

As for using PCs as servers, an IBM Canada Ltd. manager says users should be careful.

Norbert Dawalibi, general manager of large scale computing at IBM Canada, says trying to manage a large collection of PC-based servers would be strenuous for even the most enthusiastic MIS person.

“What people don’t realize is that the real cost in IT is in managing and people,” Dawalibi says. “If you have more than five or 10 PC servers, it becomes a lot more expensive to manage.”

According to the consulting firm Gartner Group, based in Stamford, Conn., the server end of the client-server model is taking over. A Gartner Group research note explains: “As enterprises deploy larger client-server applications, they are finding that management issues … are becoming more difficult.

“Such issues significantly impede first-generation client-server architectures from successfully handling enterprise-scale applications.”

Although this may sound more like a software management issue, the effect it will have on server hardware is considerable.

Gartner Group points out that these management issues are attributable to the “fat client,” which puts much of the processing logic, including presentation work, business rules and even data input and output logic, on the client PC.

Second-generation client-server applications are going to use what Gartner calls the “thin client” or the “fat server” model, which implies that more of a load is placed on the server and the PC’s workload is reduced.

“What people don’t realize is that if I have 5,000 PCs out there and I start putting all of the logic in those PCs, it’s very difficult if not impossible to manage,” Dawalibi says.

“Simple things like installing a new version of software or putting a correction on the software that’s already there becomes a nightmare. Whereas if you have a fat server model, you just do it.”
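To make the contrast concrete, here is a minimal sketch of the two models. Every name in it is invented for illustration and is not drawn from any vendor’s product: the fat client evaluates the business rule and touches the data itself, so a rule change means touching every PC, while the thin client hands the whole operation to one server-side function.

```python
# Minimal sketch of the "fat client" vs. "thin client"/"fat server"
# models described above. All names here are hypothetical.

DISCOUNT_RATE = 0.10  # a business rule baked into every fat client


def fat_client_checkout(db_conn, customer_id, amount):
    """Fat client: presentation, rules and data logic run on the PC.

    Changing DISCOUNT_RATE means reinstalling software on every client.
    """
    total = amount * (1 - DISCOUNT_RATE)       # rule evaluated locally
    db_conn.execute(                           # raw data access from the client
        "INSERT INTO orders (customer, total) VALUES (?, ?)",
        (customer_id, total),
    )
    return total


def thin_client_checkout(server, customer_id, amount):
    """Thin client: one remote call; the rule lives only on the server.

    Fixing the rule is Dawalibi's "you just do it": one server update.
    """
    return server.checkout(customer_id, amount)  # e.g. an RPC or stored procedure
```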

That model is expected to have a profound impact on enterprise servers and hardware design, Dawalibi says.

“Lots of people started client-server applications using PCs. What you realize when you do that is that a very robust server is needed in the back to handle those applications.”

Data is the most important asset any company has, Dawalibi says, and “that’s not going to go away. You need to house it in some sort of data repository. Typically, that’s where the mainframes and the large minis are going to sit. Really, you are looking at putting another server down from that which is really a small engine itself or a replication server, database engine, or communications server. The mainframe is simply going to be the vault.”
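Dawalibi’s tiering can be read as a simple pipeline: clients talk to a mid-tier engine, which serves them from a replica and goes back to the mainframe only when it must. A minimal sketch, with every class and record name invented for illustration:

```python
# Hypothetical sketch of the tiering Dawalibi describes: the mainframe
# as the "vault" (system of record), with a smaller replication or
# database server sitting between it and the clients.

class MainframeVault:
    """System of record: authoritative, rarely touched directly by clients."""

    def __init__(self):
        self._records = {"acct-001": 5000.00}

    def read(self, key):
        return self._records[key]


class ReplicationServer:
    """Mid-tier engine: serves clients from a local replica of the vault."""

    def __init__(self, vault):
        self._vault = vault
        self._replica = {}

    def get(self, key):
        if key not in self._replica:             # pull from the vault on demand
            self._replica[key] = self._vault.read(key)
        return self._replica[key]                # later reads stay mid-tier


vault = MainframeVault()
mid_tier = ReplicationServer(vault)
print(mid_tier.get("acct-001"))   # first read goes back to the vault
print(mid_tier.get("acct-001"))   # second read is served from the replica
```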

Enterprise servers are moving to a more streamlined design, coming in smaller boxes and using more powerful processors. But that doesn’t mean that the larger mainframe should be discounted altogether, Dawalibi says.

“It’s coming back as the enterprise server for larger enterprises. Clearly it’s not the answer for small companies but if you’re a government, insurance company, bank, or large manufacturer, you will find that there’s nothing else that can do the job.”

Dell’s Evans disagrees. “Over time the mainframe is dying, it just doesn’t know it yet,” he says. “We are a couple of years away, but over time I do (see that happening).”

One method for achieving higher performance with PCs is to cluster them together. This technology promises to bring enterprise servers to an even higher level of performance. Evans says this will be the year when clustering finally matures.

“The reason I say that is because that technology is going to be driven by the software, not by the hardware. That’s the time frame in which I think Microsoft will have its first release of clustering software,” he says. “And that’s really what is going to make it happen.” Relying on Microsoft to “make it happen” may well prove correct, if operating system trends are any indication.

Low- to mid-range enterprise servers frequently use Windows NT as the operating environment of choice, says Thompson. “You could look at it as Unix being pretty dominant at the enterprise level; in fact, the mainframe is still there to some extent,” he says.
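Clustering, as Evans describes it, amounts to presenting several PC servers as a single service: requests are spread across the members and a failed node is skipped. The sketch below shows that idea with round-robin dispatch and simple failover; all names are hypothetical, and it is not a description of Microsoft’s forthcoming clustering software.

```python
# Hypothetical sketch of clustering: several PC servers presented as
# one, with round-robin dispatch and simple failover.

import itertools


class Node:
    def __init__(self, name):
        self.name = name
        self.alive = True

    def handle(self, request):
        if not self.alive:
            raise ConnectionError(f"{self.name} is down")
        return f"{self.name} served {request!r}"


class Cluster:
    """Dispatch requests round-robin, skipping failed nodes."""

    def __init__(self, nodes):
        self._nodes = nodes
        self._cycle = itertools.cycle(nodes)

    def handle(self, request):
        for _ in range(len(self._nodes)):        # try each node at most once
            node = next(self._cycle)
            try:
                return node.handle(request)      # failover: skip a dead node
            except ConnectionError:
                continue
        raise RuntimeError("no live nodes in the cluster")


cluster = Cluster([Node("pc-1"), Node("pc-2"), Node("pc-3")])
cluster._nodes[0].alive = False                  # simulate a failed server
print(cluster.handle("GET /orders"))             # transparently served by pc-2
```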

