OpenAI unveiled several new features for ChatGPT and its other artificial intelligence tools at its recent developer conference. Two of the most significant announcements were the upcoming launch of GPTs, a tool for building custom chatbots (the name is short for generative pretrained transformers), and a new model for ChatGPT called GPT-4 Turbo.
OpenAI has a history of introducing new models for ChatGPT. Earlier this year, it upgraded the chatbot's underlying model from GPT-3.5 to GPT-4. Curious what will be different when the GPT-4 Turbo version of the chatbot arrives later this year? Based on past releases, the model will likely be made available to ChatGPT Plus subscribers first, before a wider rollout to the general public.
OpenAI declined WIRED's request for early access to the new ChatGPT model, but here's what we expect to be distinctive about GPT-4 Turbo.
Sam Altman, CEO of OpenAI, said at the conference, "We are as frustrated as all of you, perhaps even more so, that GPT-4's knowledge about the world ceased in 2021."
The new model incorporates information up to April 2023, so it can provide more current context in response to your prompts. Altman said he is committed to keeping ChatGPT's information from becoming that outdated again.
It’s worth noting that the methods used to obtain this information remain a significant point of dispute for authors and publishers who are dissatisfied with OpenAI’s use of their content without consent.
Input Longer Prompts
Feel free to provide extensive and detailed prompts. According to Altman, "GPT-4 Turbo can handle up to 128,000 tokens of context."
A token isn't equivalent to a word, but Altman compared the new limit to roughly 300 pages of text from a book. That means you can feed the chatbot much more information at once to analyze, which is useful for tasks like summarizing long documents.
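If you're wondering how those tokens map onto ordinary text, here's a minimal sketch using OpenAI's open source tiktoken tokenizer; treating its cl100k_base encoding (the one GPT-4-era models have used) as the right choice for GPT-4 Turbo is an assumption on our part.

```python
# Rough token count for a prompt, using OpenAI's tiktoken library
# (pip install tiktoken). Tokens, not words, are what count against
# the 128,000-token context window.
import tiktoken

# Assumption: GPT-4 Turbo uses the same cl100k_base encoding as GPT-4.
encoding = tiktoken.get_encoding("cl100k_base")

prompt = "Summarize the key arguments of the attached 300-page report."
token_count = len(encoding.encode(prompt))

print(f"This prompt uses {token_count} of the 128,000 available tokens.")
```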
Improved Instruction Following
Wouldn't it be great if ChatGPT were more adept at paying attention to the fine details of your requests in a prompt? OpenAI claims that the new model is a more attentive listener.
The company’s blog post states, “GPT-4 Turbo performs better than our previous models on tasks that require careful adherence to instructions, such as generating specific formats (e.g., ‘always respond in XML’).” This improvement could be especially valuable for individuals who use the chatbot to assist with coding.
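For illustration only (this sketch isn't from OpenAI's post), a developer might pin the output format down with a system message via the openai Python library; the gpt-4-1106-preview model name is the GPT-4 Turbo preview identifier announced at the event and could change.

```python
# Sketch: asking GPT-4 Turbo to stick to a strict output format.
# Requires the openai Python package and an OPENAI_API_KEY in the environment.
from openai import OpenAI

client = OpenAI()

response = client.chat.completions.create(
    model="gpt-4-1106-preview",  # GPT-4 Turbo preview name announced at the conference
    messages=[
        {"role": "system", "content": "Always respond in XML. No prose outside the tags."},
        {"role": "user", "content": "List three benefits of a 128,000-token context window."},
    ],
)

print(response.choices[0].message.content)
```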
More Affordable Developer Pricing
While this may not be the primary concern for most ChatGPT users, using OpenAI's application programming interface (API) can be costly for developers.
As Altman put it, "So, the new pricing is one cent for a thousand prompt tokens and three cents for a thousand completion tokens." In plainer terms, GPT-4 Turbo may make it cheaper for developers to feed information into the model and get responses back.
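To make that arithmetic concrete, here's a back-of-the-envelope calculation at the rates Altman quoted; the token counts below are hypothetical.

```python
# Estimated API cost at the quoted GPT-4 Turbo rates:
# $0.01 per 1,000 prompt tokens and $0.03 per 1,000 completion tokens.
PROMPT_RATE = 0.01 / 1_000       # dollars per input token
COMPLETION_RATE = 0.03 / 1_000   # dollars per output token

# Hypothetical job: summarize a very long document.
prompt_tokens = 100_000     # the document plus instructions
completion_tokens = 1_000   # the summary that comes back

cost = prompt_tokens * PROMPT_RATE + completion_tokens * COMPLETION_RATE
print(f"Estimated cost: ${cost:.2f}")  # prints: Estimated cost: $1.03
```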
Multiple Tools in One Chat
ChatGPT Plus subscribers who are familiar with the GPT-4 dropdown menu, which allows you to choose different chatbot tools, will be pleased to know that this dropdown menu is being retired. Altman explained, "We listened to your feedback. That model picker was extremely frustrating." The updated chatbot with GPT-4 Turbo will automatically select the appropriate tools for your request, making it more user-friendly. For instance, if you request an image, it is expected to automatically use Dall-E 3 to fulfill your prompt.