ChatGPT, OpenAI’s NLP-based chatbot, has taken the internet by storm since it was made available for public testing in November of last year. With the launch of GPT-4, which will replace the GPT-3 autoregressive language model currently in use, the generative AI model is expected to take a significant leap forward. The upgrade is reportedly being incorporated into Microsoft’s applications, with Bing, the tech giant’s search engine, among the platforms expected to benefit.
Microsoft recently announced a multi-billion-dollar investment in ChatGPT’s parent firm, OpenAI, in a move that could help it overtake Alphabet’s Google as the most preferred search engine in the world. The tech giant has already announced the integration of the ChatGPT chatbot into its workplace instant messaging platform, Microsoft Teams, through a premium subscription model priced at $7 (roughly Rs. 600) per month.
According to a Semafor report, Microsoft’s Bing search engine may be integrated with the upcoming GPT-4 system. GPT stands for generative pre-trained transformer, referring to an AI model’s capacity to read through massive datasets and provide detailed replies to questions in a manner similar to human language.
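At its core, a generative pre-trained transformer produces replies one token at a time, feeding each choice back in as context for the next prediction. The toy sketch below illustrates only that autoregressive loop; the vocabulary, the `next_token_probs` function, and its hard-coded probabilities are all hypothetical stand-ins for what a real trained model would compute.

```python
import random

# Tiny illustrative vocabulary; a real model works over tens of
# thousands of tokens learned from massive text datasets.
VOCAB = ["the", "model", "reads", "data", "<end>"]

def next_token_probs(tokens):
    # Hypothetical stand-in for a trained transformer's forward pass:
    # returns a probability distribution over VOCAB given the context.
    if tokens and tokens[-1] == "data":
        return [0.0, 0.0, 0.0, 0.0, 1.0]  # after "data", always stop
    return [0.3, 0.3, 0.2, 0.15, 0.05]

def generate(prompt, max_tokens=10, seed=0):
    """Autoregressive decoding: sample one token at a time,
    appending each sampled token to the context before predicting
    the next, until the stop token or the length limit is reached."""
    rng = random.Random(seed)
    tokens = list(prompt)
    for _ in range(max_tokens):
        probs = next_token_probs(tokens)
        token = rng.choices(VOCAB, weights=probs, k=1)[0]
        if token == "<end>":
            break
        tokens.append(token)
    return " ".join(tokens)

print(generate(["the"]))
```

The key point is that each prediction depends on everything generated so far, which is what makes responses coherent — and also what makes serving such models computationally expensive, since every output token requires a full pass through the network.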
According to the report, the most significant improvement GPT-4 will offer over GPT-3 is the speed with which it can respond to queries. This is accomplished through improved input datasets, algorithms, parameterization, and alignment, rather than by simply expanding and broadening the language model. Reportedly, most of the work is being done on the server side, through optimization modifications that make the system more efficient, accurate, and faster.
ChatGPT is now available for free testing to all users as part of an ongoing public beta. OpenAI also announced a test subscription plan, ChatGPT Plus, on Wednesday for $20 per month (roughly Rs. 1,650). This is mostly because running the system is expensive, with each search costing OpenAI a few cents, according to CEO Sam Altman.
According to the report, the process is currently powered by supercomputers co-developed by Microsoft and OpenAI that pack 285,000 CPU cores and 10,000 GPUs with 400Gbps network connectivity.
Now that the source code on which ChatGPT is based is freely available to everybody, developers have been rushing to get their hands on Nvidia’s latest graphics processor, the H100 Tensor Core GPU, which is ideally suited to running systems like those used in GPT. Nvidia’s H100 Tensor Core GPU costs $30,000 (roughly Rs. 25,00,000), making this a fairly pricey affair to say the least.
As a result, much of the battle to dominate the search engine business may be led by AI-powered generative models like GPT, and particularly by innovation on the server side of these models.
Google has held the top spot since it defeated Yahoo some years ago to become the world’s most popular search engine. It remains to be seen whether Microsoft’s Bing will challenge and eventually dethrone Google as the world’s most popular and commonly used search engine.