Pega GenAI brings more LLMs to low-code automation workflows

Pegasystems has announced plans to expand the capabilities of its Pega GenAI enterprise platform by connecting to both Amazon Web Services (AWS) and Google Cloud large language models (LLMs).

By moving beyond Microsoft Azure OpenAI, Pegasystems is broadening the options available to its Pega platform customers for developing AI-based workflow automations. The announcement also underscores the rising importance of generative AI as a must-have functionality in the low-code market.

Arnal Dayaratna, research vice president for software development at IDC, said the move to connect to models hosted by AWS and Google marks a notable step forward in deepening the integration of generative AI capabilities into the company’s platform.

“Pega customers can now seamlessly integrate generative AI capabilities from an expansive universe of foundation models into Pega-related development and use cases,” he said.

The streamlined access to generative AI models, said Dayaratna, is set to “accelerate the development of net-new generative AI applications via the Pega platform. Additionally, the partnerships with AWS and Google Cloud are likely to catalyze the deepened integration of generative AI functionality into existing enterprise applications by means of Pega.”

A release issued by Pega yesterday stated that AWS and Google Cloud generative AI models will be available in Pega Connect GenAI, a plug-and-play architecture that allows low-code developers to author prompts and get immediate value from generative AI in any workflow or decision.

According to the workflow automation provider, this will enable Pega low-code developers to build custom generative AI-powered capabilities into their workflows to help boost the productivity of employees and agents interacting with them.

Pega GenAI, it said, can be used to build a component to summarize documents on the fly and give end-users an at-a-glance overview of critical information when they open their assignments.

AWS generative AI offerings that will be supported include Amazon Bedrock, a managed service that offers a choice of foundation models (FMs) from assorted AI companies, and Amazon Titan, a family of models designed to support a variety of use cases.

On the Google Cloud front, supported offerings include Vertex AI, a machine learning platform; Google Gemini, the generative AI chatbot formerly called Bard; and Claude from Anthropic, a generative AI model backed by both Google and Amazon and positioned as an alternative to OpenAI's models.
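For readers curious what "connecting" a workflow platform to one of these model services involves at the API level, the sketch below builds the JSON request body a client would send to Amazon Bedrock's InvokeModel endpoint for a Titan text model. This is a minimal illustration based on Bedrock's documented Titan request schema, not a Pega API; the helper name and prompt are hypothetical.

```python
import json

def build_titan_request(prompt, max_tokens=256, temperature=0.2):
    """Build the JSON body for an Amazon Titan text-generation call.

    Field names follow Bedrock's documented Titan schema
    ("inputText" plus a "textGenerationConfig" block).
    """
    return json.dumps({
        "inputText": prompt,
        "textGenerationConfig": {
            "maxTokenCount": max_tokens,   # cap on generated tokens
            "temperature": temperature,    # lower = more deterministic
            "topP": 0.9,                   # nucleus-sampling cutoff
        },
    })

# Example: the kind of summarization prompt a workflow component might issue.
body = build_titan_request("Summarize this case history in two sentences.")
```

In practice, a platform like Pega would send this body (via an SDK such as boto3) to the chosen model endpoint and route the generated text back into the workflow step that requested it.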

The new services are currently on display at the PegaWorld INspire annual conference taking place this week in Las Vegas.

The low-code market evolves

Asked how Pega’s approach matches up with ServiceNow’s Washington DC platform, launched in March, Bern Elliot, Gartner distinguished VP analyst, said both are good examples of how established application platform vendors are enabling generative AI functionality for their customers.

“Embedding AI tools into applications platforms is a trend that came quite naturally out of generative AI’s strengths and also its weaknesses,” he said, adding that an example of the former is that LLMs are “particularly useful when working with a specific context. The application platform knows the context of the user, the task, and the broader workflow.”

As a result, he said, the “augmentation and automation capabilities can be applied to that very targeted context. One common use case is a targeted, integrated, conversational virtual assistant, sometimes called a copilot.”

As for model choices, Elliot said “different tasks are performed better and worse with different language models. Also, companies may have existing preferences for which language models they prefer. This may be based on performance, pretraining, or cost.”

In addition, he said, integrated trust, risk, and security management (TRISM) is a “critical element of these offers. By including it at the system level, the broader security and governance policies can be applied across the entire work stack and portfolio.”

Meanwhile, following the release of ServiceNow’s Washington DC platform, Stephen Elliot, IDC group vice president, said one of its most important aspects is the meshing of more robust generative AI support with ServiceNow’s standardized data governance and management systems, which are popular among the company’s larger customers.

“I think this is a key release, because now as customers continue to understand not just the technology behind this, but also the security, the data governance and regional regulations … these are all layers of maturity that are continuing to progress,” he said.

© Foundry