Johannesburg, 30 Jul 2009
With the arrival of new hardware platforms such as Intel's Core i7 and AMD's Phenom II, the computing power of modern hardware has overtaken what software vendors need in order to provide their end-user customers with a comfortable experience.
“This is something that's strongly evidenced by the fact that there is no noticeable difference between the performance of a standard quad-core processor-equipped computer and one that's equipped with one of the new platforms, when it comes to general office productivity tasks,” says Othelo Vieira, Acer notebook product manager at Tarsus Technologies.
“Where the difference does come in,” he admits, “is in the really demanding tasks that power-users are likely to throw at their computers, such as 3D imaging, CAD/CAM and video editing.
“And in this space,” he continues, “there will probably never be enough computing power, since every ounce of processing capability equates to reduced wait times for users.”
Vieira says the reason there is little or no noticeable change in performance on everyday computing tasks is that processor architectures are growing wider in terms of the number of cores they incorporate, while the vast majority of software packages available today are still designed for the single-core world.
“In essence, we land up with a scenario where one of the two, four or eight cores in a processor is running at maximum utilisation, while the others are being grossly under-utilised,” he says.
“If, however, software packages were designed to make use of multiple processor cores, the performance difference would be extremely noticeable.”
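The scenario Vieira describes can be sketched in a few lines of Python. This is purely illustrative and not from the article: the same CPU-bound job is run first in a single process (one core saturated, the rest idle) and then split across all available cores with the standard-library `multiprocessing` module.

```python
from multiprocessing import Pool, cpu_count

def busy_sum(bounds):
    """CPU-bound work: sum of squares over a half-open range."""
    lo, hi = bounds
    return sum(i * i for i in range(lo, hi))

def single_core(n):
    # One process: one core runs flat out while the others sit idle.
    return busy_sum((0, n))

def multi_core(n):
    # Split the range into one chunk per core and sum the partial results.
    cores = cpu_count()
    step = max(n // cores, 1)
    chunks = [(i * step, (i + 1) * step) for i in range(cores)]
    chunks[-1] = (chunks[-1][0], n)  # absorb any remainder
    with Pool(cores) as pool:
        return sum(pool.map(busy_sum, chunks))

if __name__ == "__main__":
    N = 2_000_000
    # Both paths compute the same answer; on a multi-core machine the
    # second finishes sooner because the work is spread across cores.
    assert single_core(N) == multi_core(N)
```

Both functions return the same result; the difference is only in how many cores do the work, which is exactly the gap between single-threaded and multi-threaded software that Vieira describes.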
Vieira notes, however, that solutions to this challenge are coming to the fore, but might still take some time to materialise.
“Windows 7, the upcoming operating system offering from Microsoft, is far more au fait with the concept of multiple processor cores and what is termed 'multi-threaded' code,” he says. “The vast majority of software vendors have realised this shift to multi-processor, or 'multi-threaded', code and see it as both necessary and urgent,” he adds.
“As such,” he predicts, “the problem is likely to see a solution in the next two years.”
Vieira warns, however, that the technology might not be out of the woods in the long term.
“The whole reason this problem exists today is due to poor communication between hardware and software vendors,” he says.
“The processor vendors had exhausted the full gamut of power they were likely to glean from a single-core processor and decided to look at doubling up on processor cores as a solution. While technically, this did deliver performance at a physical level, the software wasn't geared up to deal with the change in architecture.
“And the fact that the software vendors weren't given sufficient warning about this change is the problem,” he says.
“We can only hope the industry learns from this in time; after all, this is not likely to be the last time we face similar performance issues,” he concludes.