As the wheel turns
Is server-based computing, or thin-client as it's still more commonly known, a case of IT history repeating itself, or is it a completely new approach to computing?
The business argument for server-based computing is the strongest it has been in 20 years.
Before the invention of the PC, a company's interactions with its computing systems were based on the principles of today's server-based computing.
With the advent of the PC, things started changing rapidly. Information storage began moving steadily from the back-end to the front-end, offering the organisation better performance and efficiency. In many instances, however, it also exposed the business world to threats that would only be realised years later, with the proliferation of hacking tools, viruses, malware and other malicious code designed to compromise the information companies had relocated to the front-end of the organisation.
To solve this problem, the IT industry proposed a move back to the server-based model, locating all information at the back-end, but affording users the same rich graphical interaction with the front-end they had become used to in the PC world.
This practice was commonly called thin-client computing, a term the industry has done away with in the past two years in favour of server-based computing. Thin-client places the focus on the front-end device, but given the proliferation in the kinds of devices an organisation can use in the server-based computing model, the old moniker was no longer suitable.
Over the years, server-based computing has faced numerous challenges, such as coping with the ever-growing need for mobility and an ever-growing hunger for increased network performance, both on the local network and across the organisation's WAN infrastructure.
The server-based model is still a "horses for courses" scenario, says Citrix Systems MD, Chris Norton. "This model is best suited to companies that have a task-based workforce. Companies with a heavy usage of intensive applications such as computer-aided design, for example, wouldn't find this proposition viable.
"With the significant adoption of broadband on a local basis, it's a given that most companies and their high-level employees will have some form of broadband connection, whether that's wireless or wired. With that trend, new device types have come to the fore to further extend the server-based computing value proposition into areas like mobility and remote offices. These devices include mobile VGA-type screens with wireless 3G capabilities, smart phones and high-end PDAs," Norton explains.
"I believe notebooks are now rather being sold as desktop PC replacements." - Manoj Bhoola, business group manager, Microsoft
Despite the proliferation of new devices, Microsoft feels the PC will remain the primary device for server-based computing.
"Through 2007 more than 80% of server-based computing implementations will be fulfilled by using the PC as the access device," says Manoj Bhoola, Microsoft's server and tools business group manager, quoting a Gartner article.
"It's an interesting perspective," he continues, "since there's a debate on the go as to whether the notebook computer is a mobile device or a conventional PC. Most employees take their notebook home at night, since it ensures better security of the asset. However, since notebook prices have come down substantially, I believe notebooks are not being sold for the purpose of mobility, but rather as desktop PC replacements."
Mobility has a direct trade-off with the level of security a company can implement. "The rule of thumb, generally, is the more mobile you are, the less secure your information and interactions will become," says Norton.
"In the past, organisations were typically only able to authorise the user onto their network based on the correct credentials and, more importantly, without any regard for the environment they were connecting from.
"From a regulatory perspective though, companies cannot simply allow devices that aren't in their control to synchronise their information with the back-end - it's too risky. Today, companies need to be able to authorise the user's credentials and environment (which includes the location they're connecting from, the security level of their device and other security metrics), and then deliver back to the client only the functionality that the complete picture allows for.
"It's something we call smart access," says Norton.
Upload and work
"You authorise the user, their device and their physical location (among other things) and then apply certain administrator-defined policies in terms of the permissions and information they have access to. This way, you can grant mobility while at the same time reducing the threat of security breaches. You're also in control of the user's access experience," Norton explains.
"It needs to be a 'yes, but' access strategy. Yes, you can have access to the system, but since either your environment or device failed to comply with certain criteria, you cannot have access to certain kinds of functionality."
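Norton's "yes, but" idea can be sketched as a simple policy check: authorise the user first, then score the device and location, and return only the functionality the combined picture allows. The check names, policy rules and function below are illustrative assumptions, not Citrix's actual implementation.

```python
# Illustrative sketch of a "yes, but" smart-access policy check.
# Check names and policy rules are hypothetical, not Citrix's real product logic.

FULL = {"view", "edit", "download", "sync"}

def grant_functionality(credentials_ok, device_managed, av_up_to_date, on_corporate_lan):
    """Return the set of functions this session may use."""
    if not credentials_ok:
        return set()                       # failed authorisation: no access at all
    allowed = set(FULL)
    if not device_managed:
        allowed -= {"sync"}                # unmanaged devices may not synchronise data
    if not av_up_to_date:
        allowed -= {"download", "sync"}    # stale anti-virus: no data leaves the back-end
    if not on_corporate_lan:
        allowed -= {"download"}            # remote sessions work server-side only
    return allowed

# A trusted desktop on the LAN gets everything; a clean but unmanaged
# home PC connecting remotely is limited to viewing and editing.
print(sorted(grant_functionality(True, True, True, True)))    # ['download', 'edit', 'sync', 'view']
print(sorted(grant_functionality(True, False, True, False)))  # ['edit', 'view']
```

The point of the sketch is that access is no longer a yes/no gate: the same valid user gets a different slice of functionality depending on where and on what they connect.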
Looking at technology, Bhoola says there has been great enhancement in the offerings on the server portion of the server-based computing model. "Previously, you had to load a thin-client stack on the server and that server just acted as the host. Now the traditional thin-client is becoming even thinner and it simply needs a protocol in order to connect to applications. And while these devices have minimal hardware inside them, they can be uploaded with relevant information, and applications run locally.
"It's really an upload and work concept," he says.
"There's also a stronger networking and middleware component coming into the equation," he continues. "Today, some of the stack can be leveraged from the middleware and the intelligence being built into networking devices - that's why the terminal is becoming even thinner," Bhoola opines.
Norton says there's confusion in the market as to the actual costs involved with moving to a server-based computing model.
"The more mobile you are, the less secure your information and interactions will become." - Chris Norton, MD, Citrix Systems
"It's alarming how few companies actually look back on their total cost of ownership (TCO) calculations over a six-month to 12-month period once they've implemented such a solution. Citrix has developed a tool to compare a customer's current scenario with the server-based computing model.
"While it almost always shows a cost reduction over a business's current scenario, it's important that customers re-do this exercise at six-month to 12-month intervals thereafter, so that they can see exactly what their cost savings have been," Norton adds.
When it comes to cost though, Norton says customers are generally too focused on initial cost and don't look far enough into long-term benefits and functionality. "The challenge is customers generally go for the cheapest option. In this scenario, they shouldn't expect the quality of a more expensive solution.
"One of the biggest stumbling blocks locally is the 'fish bowl' management mentality prevalent in the country. Most managers are still convinced that if they can't see their employees at work, they're not working.
"Adding to this, organisations also believe that mobile broadband connectivity is too expensive a service to roll out to a mobile workforce. If they did the maths on what the company pays to keep a person in the office, it's at least R750 for rent alone. Now add all of the other things into the equation, such as furniture rental, stationery, electricity, coffee, tea, cleaning staff and more. In contrast, the average wireless broadband connection costs between R600 and R700 in SA.
"They need to see that they'll actually be saving money by making their workforce more mobile," he maintains.
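Norton's back-of-the-envelope comparison can be made concrete. The rent figure (R750) and the broadband range (R600 to R700) are the ones he quotes; the other per-employee overhead amounts below are illustrative placeholders, not figures from the article.

```python
# Worked version of Norton's office-versus-mobile cost comparison.
# Rent and broadband figures come from the article; the remaining
# overhead amounts are illustrative assumptions.

rent = 750                       # per employee, rent alone (article figure)
other_overheads = {              # hypothetical extras Norton alludes to
    "furniture rental": 120,
    "stationery": 40,
    "electricity": 80,
    "coffee and tea": 50,
    "cleaning staff": 60,
}
office_cost = rent + sum(other_overheads.values())

broadband = (600 + 700) / 2      # average SA wireless broadband (article range)

print(f"office: R{office_cost}, mobile broadband: R{broadband:.0f}")   # office: R1100, mobile broadband: R650
print(f"saving per mobile employee: R{office_cost - broadband:.0f}")   # saving per mobile employee: R450
```

Even before rent's extras are counted, the broadband figure sits below the rent figure alone, which is the core of Norton's argument.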
"Customers must begin focusing on return on investment (ROI). The problem is that ROI is also quite subjective - the tools used to calculate this assume companies will act in a best practice manner. There are just too many ways the customer can fall down in the best practice stakes. There needs to be more accountability from the vendors that propose the models," he adds.
"Ultimately, the TCO figure of a Citrix environment is in the region of 30% better than that of a client-server environment. We've seen figures better than that in the past too," he says.
"In terms of ROI, it's usually six to eight months, depending on how the organisation is managing its environment. In a nutshell, the better your ship is being managed, the longer your ROI term will be. In a well-run IT shop, the ROI can be anywhere between 18 and 24 months," adds Norton.
Pros and cons
"Server-based computing affords the organisation increased management and control over its environment. With the proliferation of malicious hack attacks and viruses in the enterprise, it makes sense to have the capability to run a fat-client for some instances and a thin-client for others, but at the same time manage everything remotely in a thin-client environment," says Bhoola.
"Anti-virus solutions on the server and PC act very differently," he adds. "In the server-based computing model, the organisation takes less risk, since there are fewer areas of entry to the organisation that need protection.
"Managers are still convinced that if they can't see their employees at work, they're not working." - Chris Norton, MD, Citrix Systems
"If a PC is running as a thin-client, the organisation will usually disable the hard disk, removable disk drives and USB ports, which to a great degree stops malicious code coming into the network."
Although smart phones and mobile devices can do numerous interesting things in the server-based computing model, clients need to ask what threats those devices bring into the environment. "With Windows Mobile 5, the next version of Windows Mobile, Microsoft will have a built-in anti-virus methodology, which can scan smart phones and PDAs constantly, and do another scan on connecting to a server.
"Another thing coming down the line will be to further alleviate this threat at a stack level with the next version of MS Terminal Server. When a user connects remotely to the network, their session will automatically be placed in quarantine, which means that every piece of functionality will still be available, but that all data transfers are scrutinised. As the client is approved (through a series of checks and scans), that session is taken out of quarantine. If it fails any of the checks, it stays in quarantine," he explains.
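The quarantine behaviour Bhoola describes can be sketched as a small session state machine: a remote session starts in quarantine with full functionality available but all transfers scrutinised, and is released only once every check passes. The class, check names and messages below are illustrative assumptions, not the actual Terminal Server mechanism.

```python
# Sketch of the quarantine-on-connect flow Bhoola describes.
# The Session class, check names and messages are illustrative,
# not the real MS Terminal Server implementation.

class Session:
    def __init__(self, client_checks):
        # client_checks: mapping of check name -> bool result
        self.checks = client_checks
        self.quarantined = True            # every remote session starts quarantined

    def run_checks(self):
        """Release from quarantine only if every check passes."""
        self.quarantined = not all(self.checks.values())
        return not self.quarantined

    def transfer(self, data):
        # Quarantined sessions keep working, but every transfer is inspected.
        if self.quarantined:
            return f"scrutinised transfer of {len(data)} bytes"
        return f"direct transfer of {len(data)} bytes"

healthy = Session({"av_signatures_current": True, "patch_level_ok": True})
healthy.run_checks()
print(healthy.transfer(b"report"))   # direct transfer of 6 bytes

stale = Session({"av_signatures_current": False, "patch_level_ok": True})
stale.run_checks()
print(stale.transfer(b"report"))     # scrutinised transfer of 6 bytes
```

The design point is the one Bhoola makes: failing a check does not block the session, it only keeps the session's traffic under scrutiny.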
Server-based computing isn't without its drawbacks though. "It all relies on connectivity," Norton continues. "If a router goes down, the system doesn't work. Similarly, if Active Directory goes down, users cannot be authorised. But then this is no different from a conventional client-server model," he concludes.
"Server-based computing is best suited to mixed environments, where the customer has Unix, Windows, AS/400 and mainframes in a single environment," says StorTech MD Tim Knowles.
"In that environment, it's usually quite difficult to enable a user to sit at one desktop and access disparate applications and systems seamlessly. Server-based computing solves this integration nightmare to a great degree," he says.
"It also helps to ask what the conventional problems with the desktop are and how thin-client solves these," Knowles continues.
"It's difficult for companies to keep up with the conventional three-year upgrade cycle. With thin-client, the client machines will plausibly never need an upgrade, although never is a long time," he quips.
"Networking has been one prohibitive factor in the local market," he says. "With the advent of broadband, we are most certainly making progress.
"A bigger factor is user pushback though," he states. "Users are generally comfortable having their own applications, storage and computers on their desk. Taking that away can cause a negative sentiment in the user community, so this needs to be treated carefully.
"So the bottom line is, don't throw out the desktop and notebook just yet. Certainly look at where thin-client makes sense. Task-based employees are generally the most appropriate audience, but that's not to say that it can't work for knowledge workers," he concludes.
The next step
While server-based computing has numerous benefits, IBM's Werner Lindeman, executive with the systems and technology group, South and Central Africa, says the next step, namely the virtualised environment, will bring even more compelling benefits.
"While centralisation is a major focus for many companies, with true virtualisation, companies don't necessarily have to centralise things. It quite simply doesn't matter where servers and information are located. When the user needs a high level of processing resource, the environment will be able to automatically shift workloads to where they can be more appropriately addressed," he explains.
"In the same way, customers need not worry about growing data-centres," he adds.
An important question to ask, he says, is whether or not server-based computing will drive down the costs associated with licensing.
"The main drive with an internal project called 'IBM Workplace' is to create a model where customers can access a cheaper payment model based on deploying software such as OpenOffice in a thin-client environment."
"The TCO models will become more and more visible going forward," says Lindeman, "as software companies begin coming up with more innovative approaches to solving enterprise pains."