Azure OpenAI endpoint configuration is essential for anyone who wants to use their own AI infrastructure with AI SQL Tuner Studio. This guide walks you through configuring your Azure OpenAI endpoint via Microsoft Foundry (Azure AI Foundry), including resource provisioning, model deployment, and connecting your endpoint to the Studio.
1) Prerequisites
- A Microsoft Foundry resource (Azure AI Foundry) in your Azure subscription
- A Foundry project and a deployed model (deployment name)
- Your Azure OpenAI-compatible Endpoint (URL) and API key (or a key/secret associated with your project’s model endpoint)
1.1) Create a Microsoft Foundry resource (new Foundry portal)
Microsoft Foundry (formerly Azure AI Foundry) is the portal experience you use to create Foundry resources/projects and work with models and agents. Learn more about Azure AI Foundry.
- Open Microsoft Foundry: https://ai.azure.com/
- Ensure you’re using the new Foundry portal (a banner toggle in the portal lets you switch between the new and classic experiences).
- Create a Foundry resource and a Foundry project (or select an existing project).
- In your project, locate the model endpoint details you’ll use with AI SQL Tuner Studio: the Endpoint URL and your API key.
Notes:
- AI SQL Tuner Studio expects the endpoint to end with /.
- If you don’t see the resource/project you expect in the new portal, use View all resources to open the classic experience.
1.1.1) Alternative: Provision an Azure OpenAI resource (Azure Portal)
- In the Azure Portal, search for Azure OpenAI and choose Create.
- Select your Subscription / Resource group, Region (must be supported for Azure OpenAI in your tenant), and Name (this becomes part of the endpoint host).
- Create the resource.
- After creation, open the Azure OpenAI resource and copy the Endpoint (it looks like https://<resource>.cognitiveservices.azure.com/), then go to Keys and Endpoint and copy Key 1 or Key 2.
Notes:
- The endpoint must be the resource endpoint (not a Portal URL).
- AI SQL Tuner Studio expects the endpoint to end with /.
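Before pasting the value into the Studio, the expectations above (HTTPS resource endpoint, trailing slash, not a portal URL) can be sanity-checked with a small script. This is an illustrative helper, not part of AI SQL Tuner Studio:

```python
from urllib.parse import urlparse

def check_endpoint(url: str) -> list[str]:
    """Return a list of problems found with an Azure OpenAI endpoint URL."""
    problems = []
    parsed = urlparse(url)
    if parsed.scheme != "https":
        problems.append("endpoint must use https")
    if not url.endswith("/"):
        problems.append("endpoint must end with a trailing slash")
    # Portal URLs are not resource endpoints and will not accept API calls
    if parsed.hostname in ("portal.azure.com", "ai.azure.com"):
        problems.append("this is a portal URL, not the resource endpoint")
    return problems

print(check_endpoint("https://my-resource.cognitiveservices.azure.com/"))  # []
```

An empty list means the URL passes these basic checks; it does not guarantee the resource exists or the key is valid.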
1.2) Deploy a model (create a deployment)
AI SQL Tuner Studio uses a deployment name (not the raw model ID) when calling Azure OpenAI.
- Open Microsoft Foundry: https://ai.azure.com/
- Select your Foundry project.
- Go to your project’s models/deployments area.
- Select Create deployment.
- Choose the model you want to use (depends on what’s available in your region/tenant).
- Give it a Deployment name (examples: gpt-5.2, o1-preview, model-router).
- Create/save the deployment.
You will use this deployment name in the Primary model field in the Studio UI.
2) Configure in AI SQL Tuner Studio
- Launch AI SQL Tuner Studio.
- In the left panel, expand the AI Configuration section.
- Fill in the following fields:
- AZURE_OPENAI_ENDPOINT — paste your Azure OpenAI endpoint URL (e.g., https://your-resource.cognitiveservices.azure.com/)
- AZURE_OPENAI_KEY — paste your API key
- Primary model — enter your deployment name (e.g., gpt-5.2 or model-router)
- Secondary model (optional) — enter a fallback deployment name for rate-limiting or large-request scenarios
- Check Save key if you want the API key persisted locally for future sessions.
- Click Save AI settings to store your configuration.
Your settings are saved to %LOCALAPPDATA%\AI SQL Tuner Studio\settings.json. The API key is stored only if you explicitly opt in via the Save key checkbox.
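For orientation, a settings.json persisted this way might look roughly like the fragment below. The exact schema is not documented here, so the field names are hypothetical illustrations only:

```json
{
  "azureOpenAiEndpoint": "https://your-resource.cognitiveservices.azure.com/",
  "azureOpenAiKey": "<present only if Save key was checked>",
  "primaryModel": "gpt-5.2",
  "secondaryModel": "model-router"
}
```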
How the Studio selects models
The Studio automatically selects deployments based on request size:
- Requests under 100,000 characters use the primary model.
- Requests between 100,000 and 200,000 characters use the secondary model (if configured).
- Requests above 200,000 characters are blocked with a prompt to reduce the request size.
Tip: The app automatically detects the actual model used when deployments route to different model versions. The model in use is shown in the console output during execution.
Running an analysis with your endpoint
- Configure a SQL Server connection in the Connections section.
- Select a Tuning Goal (Server Health Check, Fix Deadlocks, Code Review, or Index Tuning).
- Click Run.
- Monitor progress in the Console panel; the HTML report appears in the Report panel when complete.
3) How AI SQL Tuner Studio resolves configuration
Configuration is resolved in the following priority order (highest to lowest):
- Studio UI settings — values entered in the AI Configuration section and saved via Save AI settings
- Built-in defaults — default deployment names and endpoint provided by AI SQL Tuner LLC
If no custom endpoint or key is provided, the app uses the default Azure OpenAI endpoint hosted in AI SQL Tuner LLC’s Azure subscription (no business data is transmitted — only SQL Server system metadata).
4) Common configuration problems
“Azure OpenAI endpoint or API key is not configured or is empty”
- In the Studio UI, expand AI Configuration and verify the endpoint and key fields are filled in.
- Ensure the endpoint ends with /.
- If using the Save key option, click Save AI settings to persist.
Endpoint looks correct but streaming fails
- Verify the endpoint is the Azure OpenAI resource endpoint (not a portal URL).
- Ensure the deployment exists under that resource.
- Try a different deployment name in the Primary model field.
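To isolate whether the problem is the endpoint or the deployment, it can help to call the resource directly (e.g., with curl). The sketch below assembles the standard Azure OpenAI chat-completions URL; the api-version value is just an example, so substitute one supported by your resource:

```python
def build_chat_url(endpoint: str, deployment: str, api_version: str = "2024-06-01") -> str:
    """Assemble the Azure OpenAI chat-completions URL for a given deployment."""
    return (
        f"{endpoint.rstrip('/')}/openai/deployments/{deployment}"
        f"/chat/completions?api-version={api_version}"
    )

url = build_chat_url("https://my-resource.cognitiveservices.azure.com/", "gpt-5.2")
print(url)
# POST to this URL with an "api-key: <your key>" header; a 404 response
# typically means the deployment name does not exist under that resource.
```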
“Using model/deployment: …” is not what you expected
- Set the Primary model field explicitly in the AI Configuration section.
- Confirm the deployment name in Azure OpenAI matches exactly (case-sensitive in Azure).
Request size exceeds limit
- The Studio will display a dialog if the request is too large.
- Configure a Secondary model with a larger context window, or reduce the scope of the analysis (e.g., fewer objects for Code Review).
5) Quick checklist
- Endpoint set to https://...cognitiveservices.azure.com/ in the AI Configuration section
- API key entered
- Primary model / deployment name set
- Deployment name exists in Azure OpenAI
- Clicked Save AI settings
- Run an analysis and verify the console shows your deployment name
6) Example: Use a different model
- Open the AI Configuration section in the left panel.
- Set Primary model to gpt-5.4 (or any deployment name you created).
- Click Save AI settings.
- Select a connection and tuning goal, then click Run.
To use a model-router deployment, enter model-router in the Primary model field and click Save AI settings.
Frequently Asked Questions
What is an Azure OpenAI endpoint and why do I need one?
An Azure OpenAI endpoint is a URL that connects AI SQL Tuner Studio to your own Azure-hosted AI model. By configuring your own endpoint, you keep full control over data residency, billing, and which model version the Studio uses for query analysis. This is ideal for organizations with strict compliance or network requirements.
Can I use any OpenAI model with AI SQL Tuner Studio?
AI SQL Tuner Studio works with any model deployed through Azure OpenAI or Microsoft Foundry that exposes an OpenAI-compatible responses API. Popular choices include GPT-5.4 (supported since version 1.0.26) and GPT-5.3 Codex. Check your deployment in the Azure portal to confirm the model name.
How do I find my Azure OpenAI endpoint URL?
In the Azure portal, navigate to your Azure OpenAI or Microsoft Foundry resource and look under Keys and Endpoint. The endpoint URL typically follows the format https://<your-resource-name>.cognitiveservices.azure.com/. Copy this value into the AI Configuration section of AI SQL Tuner Studio.
What should I do if my Azure OpenAI endpoint connection fails?
Start by checking the common problems listed in section 4 above. The most frequent causes are an incorrect endpoint URL, an expired or missing API key, firewall rules blocking outbound traffic on port 443, or a deployment name that does not match what is configured in the Studio. The console output will show the exact error to help you diagnose the issue.
Is my data secure when using a custom Azure OpenAI endpoint?
Yes. When you configure your own Azure OpenAI endpoint, all API calls go directly from AI SQL Tuner Studio to your Azure resource. Your query data never passes through third-party servers. You can further restrict access using Azure Private Endpoints and network security groups. Learn more about AI SQL Tuner Studio features on our homepage.
If you need support, please include the deployment name you used (shown in the console output), the exact error message, and your authentication method (Windows auth, Azure AD Interactive, or SQL auth).