New endpoints Chat, Embedding and OpenAI


Hi,
is anyone already using the newly exposed API endpoints for ChatCompletions, Embeddings and OpenAI compatibility? They are documented in the $discovery JSON, but I'm only getting HTTP 400 Bad Request responses.

Any hints are appreciated.

Hi @jkirstaetter , a 400 Bad Request can be caused by a typo, a missing required field in your request, or by making the request in a region where the free tier is not supported.

I tried executing the newly exposed endpoints. Please refer to the Colab gist.

Ohhh, I see.
You're using the OpenAI client to call those endpoints.
Whew, silly me, I was looking at the sources of the Gemini SDKs in order to implement them in the client itself. OK, now I understand…

Also, the request structure is different from the regular generateContent methods. OK, I got the picture and will be able to use them now.
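For anyone else comparing the two shapes, here is a rough sketch of the difference (paths and field names per the public docs; the exact payloads may carry more options):

```
POST /v1beta/models/gemini-1.5-flash:generateContent
{
  "contents": [
    { "role": "user", "parts": [{ "text": "Explain to me how AI works." }] }
  ]
}

POST /v1beta/openai/chat/completions
{
  "model": "gemini-1.5-flash",
  "messages": [
    { "role": "user", "content": "Explain to me how AI works." }
  ]
}
```

The native API nests text under parts and takes the model from the URL, while the OpenAI-compatible endpoint uses the flat messages array and names the model in the body.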

Thanks!

Hi @GUNAND_MAYANGLAMBAM

I gave it a shot with the latest regular as well as prerelease NuGet packages of OpenAI in C#:

using OpenAI;
using OpenAI.Chat;
using System.ClientModel;

var apiKey = Environment.GetEnvironmentVariable("GOOGLE_API_KEY");
var model = "gemini-1.5-flash";
var prompt = "Explain to me how AI works.";

// Point the OpenAI client at Gemini's OpenAI-compatible base endpoint.
OpenAIClientOptions options = new() { Endpoint = new Uri("https://generativelanguage.googleapis.com/v1beta/openai/") };
ChatClient client = new(model, new ApiKeyCredential(apiKey), options);
ChatCompletion completion = client.CompleteChat(prompt);
Console.WriteLine($"[ASSISTANT]: {completion.Content[0].Text}");

Although the request was answered with an HTTP 200 and I can see the JSON string with the expected information, there is a problem with the deserialization. It throws an exception due to the role property.

Unhandled exception. System.ArgumentOutOfRangeException: Unknown ChatMessageRole value. (Parameter 'value')
Actual value was model.
   at OpenAI.Chat.ChatMessageRoleExtensions.ToChatMessageRole(String value)
   at OpenAI.Chat.InternalChatCompletionResponseMessage.DeserializeInternalChatCompletionResponseMessage(JsonElement element, ModelReaderWriterOptions options)
...

The current value provided by the Gemini API is model, but it should be assistant according to the extension method used to map the role in the response.

        public static ChatMessageRole ToChatMessageRole(this string value)
        {
            if (StringComparer.OrdinalIgnoreCase.Equals(value, "system")) return ChatMessageRole.System;
            if (StringComparer.OrdinalIgnoreCase.Equals(value, "user")) return ChatMessageRole.User;
            if (StringComparer.OrdinalIgnoreCase.Equals(value, "assistant")) return ChatMessageRole.Assistant;
            if (StringComparer.OrdinalIgnoreCase.Equals(value, "tool")) return ChatMessageRole.Tool;
            if (StringComparer.OrdinalIgnoreCase.Equals(value, "function")) return ChatMessageRole.Function;
            throw new ArgumentOutOfRangeException(nameof(value), value, "Unknown ChatMessageRole value.");
        }

It looks like the generated content from the endpoint https://generativelanguage.googleapis.com/v1beta/openai/chat/completions/ does not provide the correct value.
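To make the failure easy to reproduce outside .NET, here is a minimal Python sketch of the same strict mapping (the function and names are illustrative, not part of any SDK):

```python
# Illustrative re-creation of the SDK's strict role mapping
# (mirrors OpenAI.Chat.ChatMessageRoleExtensions.ToChatMessageRole).
KNOWN_ROLES = {"system", "user", "assistant", "tool", "function"}

def to_chat_message_role(value: str) -> str:
    """Accepts only the roles the OpenAI wire format defines; raises otherwise."""
    normalized = value.lower()
    if normalized in KNOWN_ROLES:
        return normalized
    raise ValueError(f"Unknown ChatMessageRole value: {value!r}")

# "assistant" maps fine, but the "model" role Gemini returns raises:
to_chat_message_role("assistant")
try:
    to_chat_message_role("model")
except ValueError as exc:
    print(exc)  # Unknown ChatMessageRole value: 'model'
```

Any strict OpenAI-style client will hit the same wall as soon as the response message carries role: "model".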

Maybe you can escalate this?

Regards, JoKi

Hi @GUNAND_MAYANGLAMBAM

Yes, and I can confirm this based on my own source code integrating the Gemini API with Microsoft.Extensions.AI and Semantic Kernel. The correct mapping of role values should look like this:

        /// <summary>
        /// Maps a <see cref="Mscc.GenerativeAI.Role"/> to a <see cref="ChatRole"/>.
        /// </summary>
        /// <param name="role">The role to map.</param>
        private static ChatRole ToAbstractionRole(string? role)
        {
            if (string.IsNullOrEmpty(role)) return new ChatRole("unknown");

            return role switch
            {
                Role.User => ChatRole.User,
                Role.Model => ChatRole.Assistant,
                Role.System => ChatRole.System,
                Role.Function => ChatRole.Tool,
                _ => new ChatRole(role)
            };
        }

The question now is whether this would be a change required in the OpenAI package or in the API response.
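Until that is settled, a client-side workaround is to normalize the role field before handing the JSON to a strict deserializer. A quick Python sketch of the idea (function and alias table are mine, not from any SDK):

```python
import json

# Gemini's native role name vs. what the OpenAI wire format expects.
ROLE_ALIASES = {"model": "assistant"}

def normalize_roles(raw_json: str) -> str:
    """Rewrite non-OpenAI role values in a chat/completions response
    so a strict OpenAI-style deserializer accepts them."""
    payload = json.loads(raw_json)
    for choice in payload.get("choices", []):
        message = choice.get("message", {})
        if message.get("role") in ROLE_ALIASES:
            message["role"] = ROLE_ALIASES[message["role"]]
    return json.dumps(payload)

response = '{"choices": [{"message": {"role": "model", "content": "Hi"}}]}'
print(normalize_roles(response))
```

In .NET the same idea would mean intercepting the raw response (e.g. via a custom pipeline policy) before the SDK deserializes it, which is clearly more effort than fixing the value at the source.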

Sweet,

It’s already reported as an issue: OpenAI library doesn't work with Gemini's OpenAI compat endpoint · Issue #289 · openai/openai-dotnet · GitHub

Let’s see what the outcome is going to be.

Hello @GUNAND_MAYANGLAMBAM

However, on second thought: the Gemini API claims to be compatible with the OpenAI library. Given the current situation, I would rather say that the returned value of role is wrong and should be assistant instead of model.

Kind regards, JoKi

Hey @jkirstaetter , thanks for reporting the issue. I will follow up with the team on this.

Hi @jkirstaetter , the role mismatch issue has been resolved.

Hi @GUNAND_MAYANGLAMBAM

Yes, can confirm. Great improvement, thanks.
