The Right Way to Integrate ChatGPT with ASP.NET Core | Prompt Engineering

OpenAI has widened the scope of profitable one-person app ideas by making its API available for many different purposes.

Dipendra Neupane
codenp
5 min read · Feb 8, 2024


Photo by Emiliano Vittoriosi on Unsplash

ChatGPT, an AI-powered language model developed by OpenAI, is capable of generating human-like text responses based on a user's prompt. It can also retain context from earlier messages in a conversation and refuse inappropriate requests.

OpenAI offers a range of models with different capabilities. For example, DALL·E can generate images from text input, and Whisper is an ASR (automatic speech recognition) model that converts audio into text.

You can explore all of OpenAI's models in the official documentation.

In this article, I am going to use the gpt-3.5-turbo model. Alternatively, you can use gpt-4 or gpt-4 Turbo if you are a paying customer.

Without further ado, let's dive into the implementation.

Sign Up for an OpenAI API Key

First of all, we need an API key to integrate ChatGPT into our ASP.NET Core application.

Head over to the OpenAI website and create an API key from the API keys page of your account. You can grant read/write permissions for individual endpoints while creating the key; I am granting all permissions for this demo implementation.

Once you have your API Key, you’re ready to move on to the next step in the integration process!

Create an ASP.NET Core Web API Project

Create an ASP.NET Core Web API project and add OpenAI's API key to appsettings.json. BaseUrl is the API endpoint we will call later when sending a prompt to ChatGPT.

The value of BaseUrl may differ depending on the model you are using. Since we are using gpt-3.5-turbo, the chat completions endpoint shown below is the correct one.

{
  "OpenAISetting": {
    "APIKey": "<put_your_api_key_here>",
    "BaseUrl": "https://api.openai.com/v1/chat/completions"
  }
}

Go to the NuGet Package Manager (Tools > NuGet Package Manager > Manage NuGet Packages for Solution), search for OpenAI, and install it. Alternatively, you can install it from the Package Manager Console by running the command below. (The service in this walkthrough calls the REST endpoint directly with HttpClient, so the package is optional here.)

Install-Package OpenAI

Let's create a service that will interact with the OpenAI ChatGPT API. Later, we will use that service in our PromptController and expose it through Swagger.

using System.Net.Http.Headers;
using System.Text;
using System.Text.Json;

public interface IPromptService
{
    Task<string> TriggerOpenAI(string prompt);
}

public class PromptService : IPromptService
{
    private readonly IConfiguration _configuration;

    public PromptService(IConfiguration configuration)
    {
        _configuration = configuration;
    }

    public async Task<string> TriggerOpenAI(string prompt)
    {
        var apiKey = _configuration.GetValue<string>("OpenAISetting:APIKey");
        var baseUrl = _configuration.GetValue<string>("OpenAISetting:BaseUrl");

        // Authenticate against the OpenAI API with a Bearer token.
        using var client = new HttpClient();
        client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Bearer", apiKey);

        // Build the chat completion request for the gpt-3.5-turbo model.
        var request = new OpenAIRequestDto
        {
            Model = "gpt-3.5-turbo",
            Messages = new List<OpenAIMessageRequestDto>
            {
                new OpenAIMessageRequestDto
                {
                    Role = "user",
                    Content = prompt
                }
            },
            MaxTokens = 100
        };

        var json = JsonSerializer.Serialize(request);
        var content = new StringContent(json, Encoding.UTF8, "application/json");
        var response = await client.PostAsync(baseUrl, content);
        var resJson = await response.Content.ReadAsStringAsync();

        // Surface the API's own error message if the call failed.
        if (!response.IsSuccessStatusCode)
        {
            var errorResponse = JsonSerializer.Deserialize<OpenAIErrorResponseDto>(resJson);
            throw new Exception(errorResponse.Error.Message);
        }

        // Return the text of the first generated choice.
        var data = JsonSerializer.Deserialize<OpenAIResponseDto>(resJson);
        return data.choices[0].message.content;
    }
}

Look at OpenAIRequestDto, which takes the model I mentioned earlier. Along with that, the Messages parameter carries the Content, which represents the input from the application user.

The Role parameter tells the model how each message should be treated throughout the conversation (for example, system, user, or assistant).

The MaxTokens parameter limits the number of tokens the model can generate for the completion. You can estimate token counts with OpenAI's tokenizer tool.

There are many other request parameters that I am not using here for the sake of simplicity.
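For reference, the request DTO above serializes into a body roughly like the one below. This is only a sketch: the system message and the temperature value are illustrative additions, not something the service code sends as written.

{
  "model": "gpt-3.5-turbo",
  "messages": [
    { "role": "system", "content": "You are a helpful assistant." },
    { "role": "user", "content": "Summarize dependency injection in one sentence." }
  ],
  "temperature": 0.7,
  "max_tokens": 100
}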

The custom DTOs are attached below, and the remaining part of the code is a regular Web API call using HttpClient.

public class OpenAIErrorResponseDto
{
    [JsonPropertyName("error")]
    public OpenAIError Error { get; set; }
}

public class OpenAIError
{
    [JsonPropertyName("message")]
    public string Message { get; set; }

    [JsonPropertyName("type")]
    public string Type { get; set; }

    [JsonPropertyName("param")]
    public string Param { get; set; }

    [JsonPropertyName("code")]
    public string Code { get; set; }
}

public class OpenAIRequestDto
{
    [JsonPropertyName("model")]
    public string Model { get; set; }

    [JsonPropertyName("messages")]
    public List<OpenAIMessageRequestDto> Messages { get; set; }

    [JsonPropertyName("temperature")]
    public float Temperature { get; set; }

    [JsonPropertyName("max_tokens")]
    public int MaxTokens { get; set; }
}

public class OpenAIMessageRequestDto
{
    [JsonPropertyName("role")]
    public string Role { get; set; }

    [JsonPropertyName("content")]
    public string Content { get; set; }
}

// Property names below match the JSON returned by the chat completions
// endpoint, so no [JsonPropertyName] attributes are needed.
public class OpenAIResponseDto
{
    public string id { get; set; }
    public string @object { get; set; }
    public int created { get; set; }
    public string model { get; set; }
    public List<Choice> choices { get; set; }
    public Usage usage { get; set; }
}

public class Choice
{
    public int index { get; set; }
    public Message message { get; set; }
    public object logprobs { get; set; }
    public string finish_reason { get; set; }
}

public class Usage
{
    public int prompt_tokens { get; set; }
    public int completion_tokens { get; set; }
    public int total_tokens { get; set; }
}

// Unused in this example.
public class OpenAIChoice
{
    public string text { get; set; }
    public float probability { get; set; }
    public float[] logprobs { get; set; }
    public int[] finish_reason { get; set; }
}

public class Message
{
    public string role { get; set; }
    public string content { get; set; }
}

Now, create a PromptController and use the PromptService inside it.

[ApiController]
[Route("[controller]")]
public class PromptController : ControllerBase
{
    private readonly IPromptService _promptService;

    public PromptController(IPromptService promptService)
    {
        _promptService = promptService;
    }

    [HttpGet(Name = "TriggerOpenAI")]
    public async Task<IActionResult> TriggerOpenAI([FromQuery] string input)
    {
        var response = await _promptService.TriggerOpenAI(input);
        return Ok(response);
    }
}

We have integrated almost everything, and it's ready to test.
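Once the app is running, you can exercise the endpoint from the Swagger UI, or call it directly. A quick smoke test from the command line might look like this (the port is a placeholder; use the one from your launchSettings.json, and add curl's -k flag if the local dev certificate isn't trusted):

curl "https://localhost:5001/Prompt?input=Explain%20dependency%20injection%20in%20one%20sentence"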

Oops! I almost forgot to mention that you should register IPromptService at startup so it can be injected into the controller.

builder.Services.AddTransient<IPromptService, PromptService>();
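For context, here is a minimal sketch of where that registration sits in Program.cs, assuming the default minimal-hosting Web API template with Swashbuckle (adjust to match your own startup code):

var builder = WebApplication.CreateBuilder(args);

builder.Services.AddControllers();
builder.Services.AddEndpointsApiExplorer();
builder.Services.AddSwaggerGen();

// Register the prompt service so it can be injected into PromptController.
builder.Services.AddTransient<IPromptService, PromptService>();

var app = builder.Build();

app.UseSwagger();
app.UseSwaggerUI();

app.UseHttpsRedirection();
app.MapControllers();

app.Run();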

Looks interesting, right?

I have shared the code in my GitHub repository, in case you want to play around with it.
