At Google I/O 2023, the search giant finally unveiled PaLM 2, its latest general-purpose large language model. PaLM 2 is the bedrock on which multiple Google products are now being built, including Google Generative AI Search, Duet AI in Google Docs and Gmail, Google Bard, and more. But what exactly is the Google PaLM 2 AI model? Is it better than GPT-4? Does it support plugins? To answer all your questions, go through our detailed explainer on the PaLM 2 AI model released by Google.
What is Google’s PaLM 2 AI Model?
PaLM 2 is the latest Large Language Model (LLM) released by Google that is highly capable in advanced reasoning, coding, and mathematics. It’s also multilingual and supports more than 100 languages. PaLM 2 is a successor to the earlier Pathways Language Model (PaLM) launched in 2022.
The first version of PaLM was trained on 540 billion parameters, making it one of the largest LLMs around. In 2023, however, Google came up with PaLM 2, which is much smaller in size yet faster and more efficient than the competition.

In PaLM 2’s 92-page technical report, Google has not mentioned the parameter size, but according to a TechCrunch report, one of the PaLM 2 models is trained on only 14.7 billion parameters, far fewer than the original PaLM and other competing models. Some researchers on Twitter say the largest PaLM 2 model is likely trained on around 100 billion parameters, which is still much lower than the competition.
To give you an idea, OpenAI’s GPT-4 model is said to be trained on around 1 trillion parameters, which is just mind-blowing. By those estimates, GPT-4 is several times larger than even the biggest PaLM 2 model.
How Did Google Make PaLM 2 Smaller?
In the official blog, Google says that bigger is not always better and research creativity is the key to making great models. Here, by “research creativity,” Google is likely referring to Reinforcement Learning from Human Feedback (RLHF), compute-optimal scaling, and other novel techniques.
Google has not disclosed exactly what research creativity it’s employing in PaLM 2, but it looks like the company might be using LoRA (Low-Rank Adaptation), instruction tuning, and high-quality datasets to get better results despite using a relatively smaller model.
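To be clear, Google has not confirmed that LoRA is part of PaLM 2’s recipe; it is only a likely candidate. Still, a minimal sketch helps show why low-rank adaptation is attractive for smaller models: instead of updating every weight during fine-tuning, you train two small matrices whose product approximates the update. The dimensions, rank, and scaling below are illustrative assumptions, not PaLM 2’s actual configuration.

```python
import numpy as np

# Conceptual sketch of LoRA (Low-Rank Adaptation). The pretrained weight
# matrix W stays frozen; only the small factors A and B are trained, and
# their product is added to W as a low-rank correction.
d, k, r = 1024, 1024, 8               # layer dimensions and LoRA rank (illustrative)
W = np.random.randn(d, k) * 0.02      # frozen pretrained weights
A = np.random.randn(r, k) * 0.01      # trainable low-rank factor
B = np.zeros((d, r))                  # trainable factor, initialized to zero
alpha = 16                            # LoRA scaling hyperparameter

def lora_forward(x):
    """Forward pass with the low-rank update folded into the frozen weights."""
    return x @ (W + (alpha / r) * (B @ A)).T

x = np.random.randn(1, k)
print(lora_forward(x).shape)          # (1, 1024)

# Full fine-tuning would update d * k (~1.05M) values for this one matrix;
# LoRA trains only r * (d + k) = 16,384, roughly a 64x reduction at rank 8.
```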

Overall, PaLM 2 is an LLM that’s faster, relatively smaller, and cheaper to serve because it uses fewer parameters. At the same time, it brings capabilities such as common sense reasoning, better logic interpretation, advanced mathematics, multilingual conversation, coding mastery, and more. Those are the basics of the PaLM 2 model; now let’s go ahead and learn about its features in detail.
What are the Highlight Features of PaLM 2?
As mentioned above, PaLM 2 is faster, highly efficient, and has a lower serving cost. Apart from that, it brings several advanced capabilities. To begin with, PaLM 2 is very good at common sense reasoning. Google, in fact, says that PaLM 2’s reasoning capabilities are competitive with GPT-4. On the WinoGrande commonsense reasoning benchmark, PaLM 2 scored 90.2 whereas GPT-4 achieved 87.5. On the ARC-C test, GPT-4 scores a notch higher at 96.3 whereas PaLM 2 scores 95.1. On other reasoning tests, including DROP, StrategyQA, CSQA, and a few others, PaLM 2 outperforms GPT-4.

Not just that, thanks to its multilingual ability, PaLM 2 can understand idioms, poems, nuanced texts, and even riddles in other languages. It goes beyond the literal meaning of words and understands ambiguous and figurative language. This is because PaLM 2 has been pre-trained on parallel multilingual text across a wide range of languages. In addition, the corpus of high-quality multilingual data makes PaLM 2 even more powerful. As a result, translation and other such applications work far better on PaLM 2.

Next, we come to its coding capabilities. Google says that PaLM 2 is also trained on a large corpus of quality source code datasets available in the public domain. As a result, it supports more than 20 programming languages, including Python, JavaScript, C, C++, and even older languages like Prolog, Fortran, and Verilog. It can also generate code, offer context-aware suggestions, translate code from one language to another, add functions from just a comment, and more.
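As an illustration of the "function from just a comment" idea, a prompt like the one below could be fed to a code-capable model such as PaLM 2. The completion shown is a hand-written example of the kind of output these models typically produce, not actual PaLM 2 output.

```python
# Prompt given to the model, written as a plain code comment:
# "Write a function that returns the n-th Fibonacci number iteratively."

# The kind of completion a code-capable model typically produces:
def fibonacci(n: int) -> int:
    """Return the n-th Fibonacci number (0-indexed), computed iteratively."""
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

print(fibonacci(10))  # 55
```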
What Can the PaLM 2 Model Do?
First of all, PaLM 2 has been built to be adaptable to different use cases. Google announced that PaLM 2 will come in four different models: Gecko, Otter, Bison, and Unicorn, with Gecko being the smallest and Unicorn the largest.

Gecko is so lightweight that it can run even on smartphones while being completely offline. It can process 20 tokens per second on a flagship phone, which is around 16 words per second. That’s awesome, right? Imagine the kind of AI-powered on-device applications you can run on your smartphone without requiring an active internet connection or beefy specs.
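For the curious, the 16-words-per-second figure follows from a rough rule of thumb that an English word averages a bit more than one token; the exact ratio depends on the tokenizer, which Google has not detailed for Gecko. A quick back-of-the-envelope check:

```python
# Back-of-the-envelope conversion from token throughput to word throughput.
# The 0.8 words-per-token ratio is an assumed average for English text.
tokens_per_second = 20                 # quoted on-device throughput for Gecko
words_per_token = 0.8                  # assumption; varies by tokenizer and language
words_per_second = tokens_per_second * words_per_token
print(f"~{words_per_second:.0f} words per second")  # ~16
```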

Apart from that, PaLM 2 can be fine-tuned to create a domain-specific model. Google has already created Med-PaLM 2, a medical LLM fine-tuned from PaLM 2 that achieved “Expert”-level competency on U.S. Medical Licensing Exam-style questions. It scored an accuracy of 85.4% on the USMLE-style test, even higher than GPT-4 (84%). That said, do bear in mind that GPT-4 is a general-purpose LLM and not fine-tuned for medical knowledge.

Moving ahead, Google has added multimodal capability to Med-PaLM 2. It can analyze images like X-rays and mammograms and draw conclusions in line with expert clinicians. That’s pretty remarkable, as it can bring much-needed medical access to remote areas around the world. Besides that, Google has developed Sec-PaLM, a specialized version of PaLM 2 for cybersecurity analysis that can quickly detect malicious threats.
PaLM 2-Powered Google Products
Those are some of PaLM 2’s use cases across different spheres and industries. As for individual consumers, you can experience PaLM 2 in action through Google Bard, Google Generative AI Search, and Duet AI in Gmail, Google Docs, and Google Sheets. Google recently moved Bard, its interactive AI chatbot, to PaLM 2 and opened up access in more than 180 countries. You can follow our article and learn how to use Google Bard right now.

As for using PaLM 2 in Gmail, Google Docs, and Sheets (Google is calling it Duet AI for Google Workspace), you need to join the waitlist to take advantage of the AI-powered features. Finally, for developers, Google has released the PaLM API which is based on the PaLM 2 model. You can sign up right now to use the PaLM API in your products. It can generate more than 75 tokens per second and has a context window of 8,000 tokens.
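Here is a minimal sketch of what calling the PaLM API from Python looks like, assuming the google-generativeai package and the text-bison-001 model name from Google’s public documentation; names and defaults may change as the API evolves.

```python
import google.generativeai as palm

# Configure the client with your PaLM API key (issued via Google's MakerSuite).
palm.configure(api_key="YOUR_API_KEY")

# Generate a text completion with the Bison-sized text model.
completion = palm.generate_text(
    model="models/text-bison-001",   # model name per Google's docs at launch
    prompt="Explain what a large language model is in two sentences.",
    temperature=0.2,                 # lower values give more deterministic output
    max_output_tokens=256,
)

print(completion.result)
```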
PaLM 2 vs GPT-4: How Do the AI Models Compare?
Before comparing their capabilities, one thing is clear: PaLM 2 is fast. It’s quick at responding to queries, even complex reasoning questions. Not just that, it offers three drafts at once, in case you are not satisfied with the default response. Thus, from an efficiency and computing standpoint, Google is a step or two ahead of OpenAI. Read about all the new features of Google Bard AI here.
As far as capabilities are concerned, we tested the reasoning skills of both models, and PaLM 2-powered Google Bard truly shines in such tests. Out of 3 reasoning questions, Bard correctly answered all 3, whereas ChatGPT-4 answered only 1 correctly. In one instance, Bard’s assessment was wrong (it seemed to hallucinate), but it somehow still gave the right answer.
Apart from that, for coding tasks, I asked Bard to find a bug in the code I provided, but it gave a lengthy response on how to fix the issues that turned out to be entirely wrong. ChatGPT-4, however, instantly identified the language, spotted the error, and fixed the code without further prompting.
I also assigned both models the task of implementing Dijkstra’s algorithm in Python, and both generated error-free code. I ran both implementations and neither function threw any errors. That said, ChatGPT-4 generates clean code with some examples, whereas Bard only implements the bare-bones function.
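For reference, this is roughly the bare-bones flavor of implementation the task called for; the sketch below is mine, not the output of either model.

```python
import heapq

def dijkstra(graph, source):
    """Return shortest distances from source to every node in a weighted graph.

    graph: dict mapping each node to a list of (neighbor, weight) pairs.
    """
    distances = {node: float("inf") for node in graph}
    distances[source] = 0
    heap = [(0, source)]
    while heap:
        dist, node = heapq.heappop(heap)
        if dist > distances[node]:
            continue  # stale heap entry; a shorter path was already found
        for neighbor, weight in graph[node]:
            new_dist = dist + weight
            if new_dist < distances[neighbor]:
                distances[neighbor] = new_dist
                heapq.heappush(heap, (new_dist, neighbor))
    return distances

graph = {"A": [("B", 1), ("C", 4)], "B": [("C", 2), ("D", 5)], "C": [("D", 1)], "D": []}
print(dijkstra(graph, "A"))  # {'A': 0, 'B': 1, 'C': 3, 'D': 4}
```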
Limitations of Google PaLM 2
Now coming to limitations, we already know that ChatGPT plugins are powerful and can quickly enhance GPT-4’s capabilities by miles. With just the Code Interpreter plugin, users are able to do so much more with ChatGPT. Google has also announced “Tools”, similar to plugins, but they are not live yet and third-party support seems lackluster at present. On top of that, developer support for OpenAI is huge.

Next, GPT-4 is a multimodal model, meaning it can analyze both text and images. Multimodality has a number of interesting use cases. You can ask ChatGPT to study a graph, table, medical report, medical imaging, and more. Yes, the feature has not been added to ChatGPT yet, but we have seen an early demo and it seemed very impressive. On the other hand, PaLM 2 is not a multimodal model, as it only deals with text.
The search giant has fine-tuned PaLM 2 to create Med-PaLM 2 which is indeed multimodal, but it’s not open for public use and is limited to the medical domain only. Google says that the next-generation model called Gemini will be multimodal with groundbreaking features, but it’s still being trained and is months away from release. Google has promised to bring Lens support to Bard, but it’s not the same as an AI-powered visual model.

Finally, in comparison to GPT-4, Google Bard hallucinates a lot (see an example here, where Bard thinks the PaLM AI model was created by OpenAI). It makes up information on the fly and confidently responds with false information. GPT-3 and GPT-3.5 had a similar problem, but OpenAI says GPT-4 is about 40% more likely to produce factual responses than GPT-3.5. Google needs to address the same hallucination problem “boldly and responsibly.”
Conclusion: PaLM 2 or GPT-4?
In summary, Google’s PaLM 2 AI model has improved in areas such as advanced reasoning, translation, multilingual capabilities, maths, and coding. Moreover, it has the added benefit of being a smaller model with fast performance and low serving costs. However, to reach feature parity with GPT-4, Google needs to add multimodality and third-party tools (plugins), address the hallucination issue, and make its AI models as developer-friendly as possible.
FAQs
How many parameters does Google PaLM have?
According to internal documents reported in the press, PaLM 2 is trained on 340 billion parameters, an indication of the complexity of the model. The original PaLM was trained on 540 billion parameters.
What is the Google PaLM API?
The Vertex AI PaLM API is a fully managed Google Cloud service that gives developers a simple interface for accessing and tuning Google’s generative models, including PaLM 2.
What is the difference between LaMDA and PaLM?
LaMDA is designed for open-ended dialogue and can generate text in a variety of contexts and styles. PaLM (Pathways Language Model) is a language model developed by Google for large-scale, general-purpose language generation tasks.
Is the Google PaLM model open source?
No. PaLM 2 is not open source; Google provides access through the PaLM API and Vertex AI, but the model weights have not been publicly released.
How many parameters does GPT-4 have?
OpenAI has not officially disclosed GPT-4’s parameter count. Reports estimate it at roughly 1 trillion parameters, compared to 175 billion for GPT-3 and 1.5 billion for GPT-2.
Is PaLM better than GPT-3?
PaLM has outperformed GPT-3 on a number of NLP tasks, especially in English, and PaLM 2 further improves on reasoning, multilingual understanding, and coding.
What is Google’s equivalent to OpenAI?
PaLM is a large language model, or LLM, similar to the GPT series created by OpenAI or Meta’s LLaMA family of models. Google first announced PaLM in April 2022. Like other LLMs, PaLM is a flexible system that can potentially carry out all sorts of text generation and editing tasks.
What does GPT stand for?
GPT stands for Generative Pre-trained Transformer. ChatGPT, developed by the AI research company OpenAI, is a chatbot built on GPT models that can process natural human language and generate responses.
What is Google PaLM 2?
PaLM 2 is the language model Google is deploying to bring AI capabilities to its products, including Gmail, Google Docs, and Bard. Similar to other language models like GPT-4, PaLM 2 can power AI-based chatbots, and it is also adept at writing code and translating between languages.
Is PaLM 2 better than GPT-4?
PaLM 2’s range of model sizes gives it an edge in accessibility and deployment. Google claims that PaLM 2 demonstrates reasoning capabilities competitive with GPT-4, leading in tasks like WinoGrande and DROP, although GPT-4 retains a slight advantage on ARC-C.
What is the use of Google PaLM?
Google Research’s state-of-the-art language model, PaLM, can solve complex math word problems, answer questions in new languages, and explain jokes.
What is the largest neural network?
GPT-3, a transformer-based model with 175 billion parameters, was among the largest neural networks when it was released. Even larger models have since appeared, including Google’s sparsely activated Switch Transformer (around 1.6 trillion parameters) and, reportedly, GPT-4.
Does GPT-4 really have 100 trillion parameters?
No. GPT-4 is reportedly about six times larger than GPT-3, at roughly one trillion parameters, according to a report by Semafor (which had earlier reported that Bing was running on GPT-4). Beyond parameter count, the quality of the data and the amount of training are critical to the quality of an AI system.
What is the maximum input size for GPT-2?
GPT-2 uses input text to set the initial context for further text generation. The length of an input string can range from a few words up to a maximum sequence length of 1,024 tokens.
Does PaLM have an API?
Yes. Google has introduced the PaLM API, which it describes as an easy and safe way for developers to build on top of its best language models. An efficient model (in terms of size and capabilities) is available now, with other sizes to follow.
Will there be a GPT-4?
GPT-4, the newest version of OpenAI’s language model, officially launched on March 13, 2023, with access available through the paid ChatGPT Plus subscription. Full access to the model’s capabilities remains limited, and the free version of ChatGPT still uses GPT-3.5.
How many parameters does Google’s largest language model have?
Google Brain has developed the Switch Transformer, a sparsely activated language model with some 1.6 trillion parameters.
How many parameters does GPT-3 have?
OpenAI researchers released a paper describing the development of GPT-3, a state-of-the-art language model made up of 175 billion parameters. For comparison, the previous version, GPT-2, was made up of 1.5 billion parameters.
What is the capability of PaLM AI?
- Text generation. PaLM 2 generates text on any topic a user requests using a text prompt.
- Summarization. Another core capability that summarizes large volumes of content into a more compact form.
- Content analysis. ...
- Reasoning. ...
- Code generation. ...
- Code analysis. ...
- Text translation.