Software giant Microsoft has announced the general availability of Azure OpenAI Service, a move it says is aimed at “democratising artificial intelligence (AI)” through its ongoing partnership with start-up OpenAI, the company behind viral bot ChatGPT.
The company is now adding ChatGPT to its Azure cloud service, as it looks to dominate the world of AI.
In 2019, Microsoft invested $1 billion in OpenAI, a San Francisco-based start-up company that designed ChatGPT.
This deal allowed OpenAI to use Microsoft’s Azure cloud platform for its research and development; in return, Microsoft was given the first opportunity to commercially leverage early results from OpenAI’s research.
Early investors in the start-up included Elon Musk and Peter Thiel.
Citing people familiar with the matter, the New York Times says Microsoft has quietly invested another $2 billion in OpenAI since 2019.
ChatGPT, which stands for Chat Generative Pre-Trained Transformer, is a chatbot launched by OpenAI in November 2022.
It is built on top of OpenAI’s GPT-3 family of large language models, and is fine-tuned with both supervised and reinforcement learning techniques.
It has the ability to interact in conversational dialogue form and provide responses that can appear human.
The text-based chatbot can also draft prose, poetry or even computer code on command.
Since its release in November, the bot has gone viral, attracting interest in Silicon Valley. Social media has also been abuzz with discussions about the possibilities and dangers of the new technology, ranging from its ability to debug code to its potential to write essays for college students.
As the interest in ChatGPT gathers pace, Microsoft is reportedly in talks to invest $10 billion in OpenAI as part of funding that will value the firm at $29 billion.
In a blog post yesterday, Microsoft says large language models are quickly becoming an essential platform for people to innovate, apply AI to solve big problems, and imagine what’s possible.
“With Azure OpenAI Service now generally available, more businesses can apply for access to the most advanced AI models in the world – including GPT-3.5, Codex, and DALL·E 2 – backed by the trusted enterprise-grade capabilities and AI-optimised infrastructure of Microsoft Azure, to create cutting-edge applications,” says Eric Boyd, Microsoft corporate vice president for AI Platform, in the blog post.
“Customers will also be able to access ChatGPT – a fine-tuned version of GPT-3.5 that has been trained and runs inference on Azure AI infrastructure – through Azure OpenAI Service soon,” he adds.
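For developers, access to these models works much like other Azure services: an application sends a request to an endpoint tied to its own Azure OpenAI resource and model deployment. The sketch below, in Python, shows the general shape of such a completions request; the resource name, deployment name and API version are illustrative placeholders, not values from the announcement.

```python
import json

# Hypothetical values for illustration only.
RESOURCE = "my-resource"            # an Azure OpenAI resource name
DEPLOYMENT = "my-gpt35-deployment"  # a model deployment created by the customer
API_VERSION = "2022-12-01"          # example api-version string

def build_completion_request(prompt: str, max_tokens: int = 100):
    """Build the URL and JSON body for an Azure OpenAI completions call.

    The actual call would be an HTTPS POST with the resource's key in an
    'api-key' header; this sketch only constructs the request.
    """
    url = (
        f"https://{RESOURCE}.openai.azure.com/openai/deployments/"
        f"{DEPLOYMENT}/completions?api-version={API_VERSION}"
    )
    body = json.dumps({"prompt": prompt, "max_tokens": max_tokens})
    return url, body

url, body = build_completion_request("Write a haiku about the cloud.")
```

The key point for businesses is that the model runs behind a customer-controlled Azure endpoint, so existing Azure authentication and governance apply to it.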
On the pitfalls of platforms such as ChatGPT, market research firm IDC points out that generative AI, while providing lower-cost, higher-value solutions, has significant ethical and perhaps legal implications.
It notes that there are significant questions over issues like copyright, trust and safety.
“Organisations must consider issues such as privacy and consent around data, reproduction of biases and toxicity, generation of harmful content, sufficient security against third-party manipulation, and accountability and transparency of processes,” says IDC.
Meanwhile, Israel-based cyber security firm Check Point Research says it is seeing attempts by Russian cyber criminals to bypass OpenAI’s restrictions, in order to use ChatGPT for malicious purposes.
According to Check Point, hackers on underground hacking forums are discussing how to circumvent the IP address, payment card and phone number controls that block access to ChatGPT from Russia.
Says Check Point: “It is not extremely difficult to bypass OpenAI’s restricting measures for specific countries to access ChatGPT.
“Right now, we are seeing Russian hackers already discussing and checking how to get past the geofencing to use ChatGPT for their malicious purposes. We believe these hackers are most likely trying to implement and test ChatGPT into their day-to-day criminal operations.
“Cyber criminals are growing more and more interested in ChatGPT, because the AI technology behind it can make a hacker more cost-efficient.”