All the large technology companies are developing their own programming languages these days, and although this can provide many advantages for a company and its platforms, it can be a real headache for developers.
For instance, in June, Apple announced a new programming language for iOS, called Swift, which will replace Objective-C. Unlike the legacy language, which has been in use since the early 1980s, Swift drops Objective-C's square-bracket message syntax. Swift claims to be concise and clean, avoiding unnecessary coding symbols such as parentheses around control-statement conditionals and line-ending semicolons. Samsung, meanwhile, has its own platform, Tizen, with its own native development framework. Facebook developers released Hack in March, a language for back-end development. Essentially, Hack combines the rapid development potential of PHP with the control of static typing, while adding many features associated with modern programming languages.
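To make the syntax difference concrete, here is a minimal, illustrative Swift sketch; the isLoggedIn flag and the Objective-C comparison in the comments are hypothetical examples rather than anything from Apple's documentation. The condition needs no surrounding parentheses, the call needs no square brackets, and no line ends in a semicolon.

```swift
// Hypothetical example for illustration only.
//
// An Objective-C-style version of the same check might read:
//   if (user.isLoggedIn) {
//       [self showDashboard];
//   }
//
// Swift drops the parentheses around the condition, the square-bracket
// message syntax, and the trailing semicolons:
let isLoggedIn = true

if isLoggedIn {
    print("Show dashboard")
}
```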
Google, finally, has not one, but two of its own developer languages. Go, announced in 2009, draws on C but promises to be more concise, simpler, and safer. Dart, meanwhile, released in 2011, is positioned as a JavaScript replacement. "Even though these new languages are compatible with the old ones (i.e. Objective-C with Swift, PHP with Hack, JavaScript with Dart), in order to take full advantage of the new languages' features, developers need to learn them. Besides, they are likely to eventually replace the languages they are based on, as the platforms evolve," says Richard Firth, Chairman and CEO of MIP Holdings.
"So yes, these new languages promise to do so much more from a development point of view, but who has the time to learn the ins and outs of new languages that are contextually specific to a single platform?"
These new languages boast many advantages, from cleaner code to faster app development to faster-executing apps, he adds. "However, the plight of the developer should not be ignored. They are being forced to continually learn and adapt to new languages, and although the new languages may claim to lead to faster app development, a developer will be significantly slowed down by the initial learning process."
He adds that most of the languages mentioned are front-end languages, aimed chiefly at making a device's user-facing experience look good; the broader front-end versus back-end question is a separate debate.
Gone, too, are the days when a developer could become an expert in a specific language. In trying to keep up with the plethora of new platform-specific languages, developers are forced into a "jack of all trades, master of none" role, struggling to keep up with the details of languages they're unlikely to use anywhere else.
"What it comes down to is that an initiative like Apple's Swift risks overwhelming programmers and dividing the developer community. Although it might be important for programmers to know a few languages, imposing multiple new languages on them on a regular basis could very well be counter-productive for the developer community and the app ecosystems they support," Firth says.
He adds that this has resulted in a duel between the big corporates owning the standard for programming and the programmers themselves. "The drive for all developers is a unified and single interface between front-end and device. This is why there is a push for HTML5 on the development side. However, with device manufacturers protecting their walled gardens, developers are being forced into multiple language development."
Firth points out that this new "fight", with programmers pushing for a ubiquitous interface while device and operating-system owners chase greater numbers of consumers, is unsustainable. "Companies need to remember that although it is their prerogative to make their platform languages as specific as they choose, if developers don't want to use them, nobody (outside of the company) is going to."
And although there are many reasons for companies to want to replace a legacy language with a new one, they need to do so sparingly, and always consider their developer community before making such a considerable change, he adds.
"Forcing consumers to accept only the apps and products offered within a closed environment may seem like the easiest way to grow a community, but consumers want functionality, and don't really care about where they get it. This struggle is forcing consumers to choose between the walled gardens at the same time developers are fighting for a single interface. Time will tell if the device manufacturers will succeed, or whether usability, functionality and simplicity will win."