Integrate ServiceDesk Plus with your Azure OpenAI resource to enable AI-powered features that simplify help desk operations. Once Azure OpenAI is enabled, the default ChatGPT integration will be automatically disabled.
Before configuring the integration, ensure you have the following ready from the Azure portal:
An active Azure subscription with an Azure OpenAI resource provisioned.
The Resource URL of your Azure OpenAI service (typically in the format https://<resource-name>.openai.azure.com).
A model deployed in your Azure OpenAI resource. Note the Base Model category (e.g., GPT-4.1) and the exact Model Name as defined in the deployment.
An API Key with permission to access the configured Azure OpenAI deployment.
ServiceDesk Plus administrator (SDAdmin) access.
To access the Azure OpenAI configuration page:
Navigate to Admin > Apps & Add-ons > Integrations > Third Party.
Click Azure OpenAI.
Select the Enable Azure OpenAI checkbox to activate the integration.
Important: Enabling Azure OpenAI will automatically disable the default ChatGPT integration.
Configuring the Connection
Once Azure OpenAI is enabled, fill in the connection details as described below.
Resource URL
Enter the endpoint URL of your Azure OpenAI service. This can be obtained from the Azure portal under your OpenAI resource's Keys and Endpoint section. The URL follows the format:
https://<resource-name>.openai.azure.com
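As a quick sanity check, the endpoint you paste can be validated against the format above. The pattern below is an assumption derived from that format, not an official Azure rule:

```python
import re

# Sanity check: does a pasted endpoint match the expected Azure OpenAI format?
# The pattern is an assumption based on the format shown above.
AZURE_ENDPOINT = re.compile(r"^https://[a-z0-9][a-z0-9-]*\.openai\.azure\.com/?$")

print(bool(AZURE_ENDPOINT.match("https://my-resource.openai.azure.com")))  # valid Azure endpoint
print(bool(AZURE_ENDPOINT.match("https://api.openai.com/v1")))             # not an Azure endpoint
```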
Base Model
Select the base model category configured in your Azure OpenAI deployment from the dropdown. If your model is not listed, select Other to manually specify the model name in the next field.
Model Name
Enter the Model Name exactly as defined in your Azure OpenAI deployment. This value is used to route API requests to the correct model.
API Key
Provide the API Key generated from the Azure portal. The key must have permission to access the configured Azure OpenAI deployment.
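To see how these three values work together, the sketch below shows how the Resource URL, Model Name (deployment name), and API Key typically map onto an Azure OpenAI REST request. The resource name, deployment name, and API version are placeholders, not values from ServiceDesk Plus; substitute your own:

```python
# Sketch: how the configured connection values map onto an Azure OpenAI request.
# All values below are placeholders for illustration only.
resource_url = "https://my-resource.openai.azure.com"
deployment = "my-gpt41-deployment"  # the Model Name as defined in the deployment
api_version = "2024-02-01"          # use an API version your resource supports

endpoint = (
    f"{resource_url}/openai/deployments/{deployment}"
    f"/chat/completions?api-version={api_version}"
)
headers = {
    "api-key": "<your-api-key>",    # from Keys and Endpoint in the Azure portal
    "Content-Type": "application/json",
}
print(endpoint)
```

If a request to this endpoint with your key succeeds, the same values should work in the integration form.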
After configuring the connection details, select the AI features you want to activate. Each feature can be individually enabled or disabled. Use the Permissions control next to each feature to define which users or roles can access it.
Click Save to apply your configuration, or Cancel to discard changes.
The following table describes all available AI features, their functionality, and default permission levels.
| Feature | Description | Permission |
| --- | --- | --- |
| Approval Prediction | Automatically predicts approval outcomes for requests based on historical data. | System |
| Ask Zia (Conversational Chatbot) | Enables the Zia conversational chatbot powered by Azure OpenAI. Can only be enabled when a supported model (GPT-4.1 or above) is configured. | System |
| Generate Post Incident Review | Automatically generates post-incident review content using request and incident data. Best results are achieved with higher-context models. | System |
| Reopen Prediction | Predicts the likelihood of a request being reopened after closure. | System |
| Zia Agent – Incident Review Summariser | Automatically generates a concise summary of incident reviews to speed up post-incident analysis. | System |
| Zia Agent – L1 Support Agent | AI-powered first-line support agent that assists in handling and triaging incoming requests. | System |
| Zia Agent – Request Resolution Agent | Assists in identifying and suggesting resolution steps for requests based on context and history. | System |
| Classic Bot – Ask LLM | Allows users to directly ask questions to the configured language model through the Classic Bot. | All (configurable) |
| Classic Bot – Context-aware Solution Suggestion | Suggests relevant solutions based on request context when users interact with the Classic Bot. Works best with models that support a higher context length. | All (configurable) |
| Classic Bot – Explore Solutions | Enables AI-powered solution discovery through the Classic Bot interface. | All (configurable) |
| Classic Bot – Explore Solutions while Creating Request | Suggests relevant solutions in real time while users are creating a request through the Classic Bot. | All (configurable) |
| Classic Bot – Template Prediction | Predicts and suggests appropriate request templates based on user input. Advanced models provide more accurate predictions. | All (configurable) |
| Reply Assistant | Generates AI-assisted response suggestions for requests and conversations. | All (configurable) |
| Request Summarization | Generates concise summaries of requests and their conversations for quicker review. | All (configurable) |
Each AI feature has a Permissions control that determines who can access it:
System-level features (such as Approval Prediction and Reopen Prediction) run automatically in the background and are not configurable by role.
Role-based features (such as Reply Assistant and Request Summarization) can be restricted to technicians or made available to all users (technicians and requesters) using the dropdown next to each feature.
Two controls are available at the top right of the Azure OpenAI Configurations page:
Usage Stats – Monitor Azure OpenAI token consumption and API call metrics to keep track of costs and usage trends.
View History – Review a log of all configuration changes made to the Azure OpenAI integration, including timestamps and the users who made them.
The Ask Zia and other Zia agent features are supported only with GPT-4.1 and above. Supported models include GPT-4.1, GPT-4.1 mini, GPT-5, and GPT-5.1 mini. If a lower model is configured, these features cannot be enabled.
Request Summarization, Template Prediction, Context-aware Solution Suggestion, and Post Incident Review work best with GPT-4o-mini and higher models.
gpt-3.5-turbo is supported for basic features; however, responses for long conversations or complex requests may be truncated or less detailed due to token constraints.
Ensure the selected Base Model and Model Name meet the minimum requirements for each feature before enabling them.
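The truncation note above can be made concrete with a rough pre-flight check before sending a long conversation to a small-context model. The ~4 characters per token heuristic below is an approximation, not the model's real tokenizer:

```python
# Rough pre-flight token estimate. ~4 characters per token is an approximation
# for English text, not the model's actual tokenizer.
def estimate_tokens(text: str) -> int:
    return max(1, len(text) // 4)

# Hypothetical long help desk conversation for illustration.
conversation = "Hello, my VPN disconnects every few minutes. " * 300
if estimate_tokens(conversation) > 3000:  # leave headroom in a ~4k context
    print("conversation may be truncated on gpt-3.5-turbo")
```

A check like this can help decide when a higher-context model is needed rather than discovering truncation after the fact.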