In this article, we will create the openai-client gem to make it easier to use the OpenAI API. We will use Bundler to create the new gem, RSpec for unit testing, Rubocop for linting, and Faraday as the HTTP client. My Bundler version is 2.4.3. You can use a different version, but you may not get exactly the same result. To check which Bundler version you currently have:
bundle -v
Create a Ruby gem
Run the following command to generate a new Ruby gem:
bundle gem openai-client
During installation, you will be asked questions such as which test framework to use, which linter to use, CI configuration, etc. For this gem, we will use RSpec as a test framework and Rubocop as the linter.
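If you'd rather answer these questions up front, newer Bundler versions also accept them as command-line flags. The following is just a sketch; run bundle gem --help to see which options your version supports:
bundle gem openai-client --test=rspec --linter=rubocop --ci=github --mit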
Gem files
After generating the gem, you will see the following files in the ./openai-client folder:
Gemfile: Used to manage gem dependencies for developing our library. This file contains a gemspec line, which means that Bundler will also include the dependencies specified in openai-client.gemspec. It's best to list all the gems that our library depends on in the gemspec.
Rakefile: Requires Bundler and adds the Rake build, install, and release tasks by calling Bundler::GemHelper.install_tasks. The build task builds the current version of the gem and stores it in the pkg folder, the install task builds and installs the gem on our system (just as if we had installed it from RubyGems), and the release task pushes the gem to RubyGems for public consumption.
LICENSE.txt: Includes the MIT license. It is only added if you chose to include it.
.gitignore: Ignores everything we don't want to push to GitHub.
openai-client.gemspec: The gem specification file. Here we provide information for RubyGems, such as our gem's name, description, and home page. This is also where we specify the dependencies our gem needs to run.
lib/openai/client.rb: The main file for defining our gem's code. This is the file Bundler loads when our gem is required. It also defines the module we use as a namespace for all of our gem's code.
lib/openai/client/version.rb: Defines the VERSION constant. This file is loaded by openai-client.gemspec to specify the version of the gem. When we release a new version, we increment part of this version number to indicate to RubyGems that we are releasing a new version.
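Putting it together, the generated project looks roughly like this (the exact set of files depends on the options you picked and your Bundler version):
openai-client/
├── Gemfile
├── Rakefile
├── LICENSE.txt
├── README.md
├── .gitignore
├── openai-client.gemspec
├── bin/
│   ├── console
│   └── setup
├── lib/
│   └── openai/
│       ├── client.rb
│       └── client/
│           └── version.rb
└── spec/
    ├── spec_helper.rb
    └── openai/
        └── client_spec.rb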
Add configuration class
First, we need a way to provide an access token to the gem so that we can make calls to the OpenAI API.
The configuration should work in the following way:
Openai::Client.configure do |c|
  c.logger = Rails.logger # or whatever logger you want to use
  c.access_token = 'access_token_goes_here'
  c.organization_id = 'organization_id_goes_here'
end
Create the file lib/openai/client/configuration.rb
module Openai
  module Client
    class Configuration
      attr_accessor :logger, :access_token, :organization_id, :openai_url

      def initialize
        @openai_url = 'https://api.openai.com/v1'
        @access_token = nil
        @organization_id = nil
        @logger = Logger.new($stdout)
      end
    end
  end
end
Create the file lib/openai/client/configurable.rb
module Openai
  module Client
    module Configurable
      # @api public
      # Public: Returns the instance of the Configuration class
      #
      # @return [Openai::Client::Configuration] instance of the Configuration class
      def configuration
        @configuration ||= Configuration.new
      end

      # @api public
      # Public: Allows to provide configuration values
      #
      # @return [Openai::Client::Configuration] instance of the Configuration class
      #
      # Example:
      #   Openai::Client.configure do |c|
      #     c.logger = CustomLogger
      #     c.access_token = 'access_token_goes_here'
      #     c.organization_id = 'organization_id_goes_here'
      #     c.openai_url = 'https://api.openai.com/v1'
      #   end
      def configure
        yield(configuration)
      end
    end
  end
end
Open the file lib/openai/client.rb
require 'logger'
require 'openai/client/configuration'
require 'openai/client/configurable'
module Openai
  module Client
    extend Configurable
  end
end
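Before writing any specs, we can do a quick sanity check. Run bin/console and confirm the configuration is picked up; the values below are just placeholders:
Openai::Client.configure do |c|
  c.access_token = 'access_token_goes_here'
end

Openai::Client.configuration.access_token # => "access_token_goes_here"
Openai::Client.configuration.openai_url   # => "https://api.openai.com/v1"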
The best way to ensure everything works correctly is to add some unit tests. So let’s move on.
Add RSpec for configuration
Before writing unit tests, let's add a few gems to the Gemfile that will be useful in this and the following sections.
Open the Gemfile
group :development, :test do
  gem 'faker', '~> 2.0'
  gem 'rake', '~> 13.0'
  gem 'rspec', '~> 3.0'
  gem 'rubocop', '~> 1.21'
  gem 'rubocop-performance', '~> 1.12'
  gem 'rubocop-rspec', '~> 2.6'
  gem 'webmock', '~> 3.12'
  gem 'yard', '~> 0.9'
end
Make sure you also have the gems listed above in your Gemfile.
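After updating the Gemfile, install the new dependencies:
bundle install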
Then open the spec/spec_helper.rb
file and add these two require statements at the top.
require 'faker'
require 'webmock/rspec'
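For reference, the resulting spec/spec_helper.rb should look roughly like this; everything below the require statements is whatever bundle gem generated for you, so your file may differ slightly:
require 'faker'
require 'webmock/rspec'

require 'openai/client'

RSpec.configure do |config|
  # Enable flags like --only-failures and --next-failure
  config.example_status_persistence_file_path = '.rspec_status'

  # Disable RSpec exposing methods globally on Module and main
  config.disable_monkey_patching!

  config.expect_with :rspec do |c|
    c.syntax = :expect
  end
end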
Open the file spec/openai/client_spec.rb
RSpec.describe Openai::Client do
  describe '.configuration' do
    subject(:configuration) { described_class.configuration }

    let(:logger) { instance_double(Logger) }
    let(:access_token) { Faker::Internet.password }
    let(:organization_id) { Faker::Number.digit }
    let(:openai_url) { Faker::Internet.url }

    before do
      described_class.configure do |c|
        c.access_token = access_token
        c.organization_id = organization_id
        c.logger = logger
        c.openai_url = openai_url
      end
    end

    it 'returns the provided access_token' do
      expect(configuration.access_token).to eq(access_token)
    end

    it 'returns the provided logger' do
      expect(configuration.logger).to eq(logger)
    end

    it 'returns the provided organization_id' do
      expect(configuration.organization_id).to eq(organization_id)
    end

    it 'returns the provided openai_url' do
      expect(configuration.openai_url).to eq(openai_url)
    end
  end
end
Finally, we need to run tests to ensure they are green and our configuration works as intended.
rspec spec
Adding an HTTP client
We will use faraday as the HTTP client for this gem. For now, we only need two HTTP methods, GET and POST. It is also good to log a message and raise an exception in case of a failed API call.
Open the openai-client.gemspec
file and add faraday
as a dependency. Faraday 2 ships with the JSON request and response middleware built in, so we don't need the separate faraday_middleware gem (it only supports Faraday 1.x). Note that we add runtime dependencies to the openai-client.gemspec
file, not to the Gemfile
. Otherwise, they will not be available when we install the gem in a project.
spec.add_dependency('faraday', '~> 2.7')
Now it's time to create the actual HTTP client, which is just a thin faraday
wrapper that makes it easier for us to make HTTP requests.
Here we will set up the default headers, pass in the access_token
, and set the API base URL for all requests. We also want the response body to be parsed and returned as a hash or an array rather than a raw string, so we plug in Faraday's built-in :json request and response middleware.
Visit the official website for more information on the faraday
configuration and available middleware. Faraday docs.
Create the file lib/openai/client/http.rb
and paste
module Openai
  module Client
    class Http
      def initialize
        @connection = Faraday.new(url: Openai::Client.configuration.openai_url, headers: headers) do |conn|
          conn.response :json
          conn.response :raise_error
          conn.request :json
        end
        @logger = Openai::Client.configuration.logger
      end

      # @api public
      # Public: Makes a GET request using the Faraday HTTP Client.
      #
      # @param [String] path API path
      # @param [Hash] query_params query params
      #
      # @raise [Faraday::Error] on a failed API call
      #
      # @return [Faraday::Response] instance of Faraday::Response class
      def get(path, query_params = {})
        connection.get(path, query_params)
      rescue Faraday::Error => e
        log_error(e) && raise
      end

      # @api public
      # Public: Makes a POST request using the Faraday HTTP Client.
      #
      # @param [String] path API path
      # @param [Hash] body request body
      #
      # @raise [Faraday::Error] on a failed API call
      #
      # @return [Faraday::Response] instance of Faraday::Response class
      def post(path, body = {})
        connection.post(path, body)
      rescue Faraday::Error => e
        log_error(e) && raise
      end

      private

      attr_reader :connection, :logger

      # @api private
      # Internal: Logs failed API calls.
      #
      # @param [Faraday::Error] e the error to log
      def log_error(e)
        logger.error("Error response, status: #{e.response[:status]}, body: #{e.response[:body]}")
      end

      def headers
        {
          'Authorization' => "Bearer #{Openai::Client.configuration.access_token}",
          'OpenAI-Organization' => Openai::Client.configuration.organization_id
        }
      end
    end
  end
end
Then we need to require the newly created HTTP client. Open lib/openai/client.rb
and require the file along with the faraday
gem.
require 'faraday'
# other require statements go here ...
require 'openai/client/http'
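Purely as an illustration, here is how the wrapper can already be exercised from bin/console once an access token is configured (the token value is a placeholder):
Openai::Client.configure { |c| c.access_token = 'your access token' }

response = Openai::Client::Http.new.get('models')
response.status # => 200
response.body   # => the parsed JSON as a Hash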
Testing the HTTP client
We can write some simple unit tests to check that our client works: GET and POST should be available, and if an API call fails, we should log it and raise an exception.
Create the file spec/openai/client/http_spec.rb
and paste
RSpec.describe Openai::Client::Http do
  let(:client) { described_class.new }
  let(:url) { Faker::Internet.url }
  let(:logger) { Openai::Client.configuration.logger }
  let(:body) { {} }
  let(:error_message) { "Error response, status: #{status}, body: #{body}" }

  describe '#get' do
    subject(:get) { client.get(url) }

    before do
      # Return a JSON content type so the :json response middleware parses the body
      stub_request(:get, url)
        .to_return(body: body.to_s, status: status, headers: { 'Content-Type' => 'application/json' })
    end

    context 'when the API call is successful' do
      let(:status) { 200 }

      it 'handles a successful API call' do
        expect(get).to be_success
      end

      it 'parses response body into a hash' do
        expect(get.body).to be_a(Hash)
      end
    end

    context 'when the API call fails' do
      let(:status) { 500 }

      it 'raises an exception' do
        expect { get }.to raise_error(Faraday::Error)
      end

      it 'logs an error' do
        allow(logger).to receive(:error)
        get
        expect(logger).to have_received(:error).with(error_message)
      end
    end
  end

  describe '#post' do
    subject(:post) { client.post(url, body) }

    before do
      stub_request(:post, url)
        .to_return(body: body.to_s, status: status, headers: { 'Content-Type' => 'application/json' })
    end

    context 'when the API call is successful' do
      let(:status) { 201 }

      it 'handles a successful API call' do
        expect(post).to be_success
      end

      it 'parses response body into a hash' do
        expect(post.body).to be_a(Hash)
      end
    end

    context 'when the API call fails' do
      let(:status) { 500 }

      it 'raises an exception' do
        expect { post }.to raise_error(Faraday::Error)
      end

      it 'logs an error' do
        allow(logger).to receive(:error)
        post
        expect(logger).to have_received(:error).with(error_message)
      end
    end
  end
end
So far, so good! Don’t forget to run the specs
rspec spec
Make sure all specs are green!
OpenAI Models Endpoint
We've moved on to the fun part. Our first two endpoints are:
models - returns a list of the currently available models and basic information about each one.
models/:model_id - retrieves a model instance, providing basic information about the model.
If you don’t know what OpenAI models are, read the Model Documentation.
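To get a feel for what we are wrapping, the raw endpoints can be called directly with curl. Here I assume the API key is exported as the OPENAI_API_KEY environment variable, and text-davinci-003 is just an example model id:
curl https://api.openai.com/v1/models \
  -H "Authorization: Bearer $OPENAI_API_KEY"

curl https://api.openai.com/v1/models/text-davinci-003 \
  -H "Authorization: Bearer $OPENAI_API_KEY"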
We want to call models
by running Openai::Client.models.list
and models/:model_id
by running Openai::Client.models.find(model_id)
.
Let's start by adding the Models class. Create the file lib/openai/client/models.rb
and paste:
module Openai
  module Client
    class Models
      PATH = 'models'

      # @api public
      # Public: Makes an API call to return all models.
      #
      # @return [Hash] a hash with models
      def list
        Http.new.get(PATH).body
      rescue StandardError
        nil
      end

      # @api public
      # Public: Makes an API call to find the model by the ID.
      #
      # @param [String] id model id
      #
      # @return [Hash] found model or nil
      def find(id)
        Http.new.get("#{PATH}/#{id}").body
      rescue StandardError
        nil
      end
    end
  end
end
Then, open the file lib/openai/client.rb
and update it as follows:
require 'openai/client/models'

module Openai
  module Client
    extend Configurable

    ATTRS = ['models'].freeze

    class << self
      ATTRS.each do |attr|
        define_method(attr) do
          instance_variable_get("@#{attr}") ||
            instance_variable_set("@#{attr}", const_get(attr.capitalize, self).new)
        end
      end
    end
  end
end
Just a bit of metaprogramming, and we're done with the functionality. If the code above looks cryptic, here is a more accessible version of the same code.
def self.models
  @models ||= Models.new
end
This is what the snippet we added earlier generates. The explicit version is simpler, but also less flexible: every time we add a new endpoint, we would need to write a new method for it. With our approach, we only need to add the new endpoint name to the ATTRS
array.
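For reference, with everything from the previous sections in place, the whole lib/openai/client.rb should now look roughly like this (I've kept the version require that bundle gem generated; your file may differ slightly):
require 'logger'
require 'faraday'

require 'openai/client/version'
require 'openai/client/configuration'
require 'openai/client/configurable'
require 'openai/client/http'
require 'openai/client/models'

module Openai
  module Client
    extend Configurable

    ATTRS = ['models'].freeze

    class << self
      ATTRS.each do |attr|
        define_method(attr) do
          instance_variable_get("@#{attr}") ||
            instance_variable_set("@#{attr}", const_get(attr.capitalize, self).new)
        end
      end
    end
  end
end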
RSpec for the OpenAI models
We currently have only the models
endpoint, but we will likely add more in the future. With that in mind, let's create reusable specs that we can use to test the models
endpoint and any future endpoints we add later.
To do this, we will use RSpec shared examples. If you have no experience with them, I suggest you check out the documentation here.
Create the new file spec/support/shared_examples.rb
and paste
RSpec.shared_examples 'API wrapper' do
  context 'when the API call is successful' do
    let(:status) { 200 }

    it 'returns a value' do
      expect(subject).not_to be_nil
    end
  end

  context 'when the API call fails' do
    let(:status) { 500 }

    it 'returns nil' do
      expect(subject).to be_nil
    end
  end
end
We called the shared examples API wrapper
, and here is how we can use them.
Create the file spec/openai/client/models_spec.rb
and paste
require 'support/shared_examples'

RSpec.describe Openai::Client::Models do
  before do
    stub_request(:get, url).to_return(body: '{}', status: status)
  end

  describe '#find' do
    subject { described_class.new.find(model_id) }

    let(:model_id) { Faker::Number.digit }
    let(:url) { "#{Openai::Client.configuration.openai_url}/#{described_class::PATH}/#{model_id}" }

    it_behaves_like 'API wrapper' # shared examples
  end

  describe '#list' do
    subject { described_class.new.list }

    let(:url) { "#{Openai::Client.configuration.openai_url}/#{described_class::PATH}" }

    it_behaves_like 'API wrapper' # shared examples
  end
end
Our shared example requires only a subject
and a url
, and of course, we do not want to send real requests, so we stub them. Most likely, these are the only things that will change from endpoint to endpoint.
The last test I want to add checks that we can access the newly added class in the following way: Openai::Client.models
.
Open the file spec/openai/client_spec.rb
and paste the following at the end of the file, inside the top-level describe block:
# when we want to add a new endpoint, the only thing we have to do is expand the array.
['models'].each do |attr|
  describe ".#{attr}" do
    subject(:method) { described_class.public_send(attr) }

    it "returns an instance of the #{attr.capitalize} class" do
      expect(method).to be_instance_of(described_class.const_get(attr.capitalize))
    end
  end
end
Finally, run the specs
rspec spec
I hope you see the green specs in the console!
Testing openai-client gem from the console
Before we start, we need to generate an access token to make calls to the OpenAI API.
- Get your API key from https://beta.openai.com/account/api-keys
Bear in mind that this token provides limited free requests.
Open the console
bin/console
And paste
Openai::Client.configure do |c|
  c.access_token = 'your access token'
end
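If you don't want to paste the token into the console every time, one option (just a suggestion, the gem doesn't require it) is to export it as an environment variable and read it from there; OPENAI_ACCESS_TOKEN is an arbitrary name here:
# in your shell
export OPENAI_ACCESS_TOKEN='your access token'

# then in bin/console
Openai::Client.configure do |c|
  c.access_token = ENV['OPENAI_ACCESS_TOKEN']
end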
We can then test the only endpoint we have at the moment.
models = Openai::Client.models.list # list of models
model = Openai::Client.models.find(models['data'].first['id']) # first model from the list
Publish the openai-client gem
Before publishing, make sure you have completed the following steps:
Commit and push all changes to GitHub/GitLab or wherever you host your code.
Create an account at rubygems.org.
Also, you must put your gem credentials in ~/.gem/credentials. Run the following command; it will ask you for your rubygems.org password:
curl -u {your_rubygems_username} https://rubygems.org/api/v1/api_key.yaml > ~/.gem/credentials
Add the required permissions to the credentials file:
chmod 0600 ~/.gem/credentials
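If everything went well, ~/.gem/credentials should contain your API key in YAML form, roughly like this (the key itself will of course differ):
---
:rubygems_api_key: rubygems_xxxxxxxxxxxxxxxxxxxx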
Check that the gem can be installed locally. To do this, run
rake build
which builds a local copy of our gem and stores it in the pkg folder. Then install it with
gem install pkg/openai-client-0.1.0.gem
Finally, to release the first version of our gem, run:
rake release
Every time you want to release a new version, don’t forget to update the actual version of the gem (lib/openai/client/version.rb
). We can do this manually or use the gem-release
gem.
gem install gem-release
# then
gem bump --version minor # bumps to the next minor version
gem bump --version major # bumps to the next major version
gem bump --version 1.1.1 # bumps to the specified version
Summary
We did it. Our first gem has been successfully developed and released. I spent about two evenings on this gem besides my main work, so there may be minor typos. If you find any, write to me in the chat. Also, feel free to share your suggestions if you would approach the development of this kind of gem differently. I'm always happy to talk about such things.