
Conversation

@graysonchen (Contributor) commented Sep 1, 2025


What this does

Type of change

  • Bug fix
  • New feature
  • Breaking change
  • Documentation
  • Performance improvement

Scope check

  • I read the Contributing Guide
  • This aligns with RubyLLM's focus on LLM communication
  • This isn't application-specific logic that belongs in user code
  • This benefits most users, not just my specific use case

Quality check

  • I ran overcommit --install and all hooks pass
  • I tested my changes thoroughly
  • I updated documentation if needed
  • I didn't modify auto-generated files manually (models.json, aliases.json)

API changes

  • Breaking change
  • New public methods/classes
  • Changed method signatures
  • No API changes

Related issues

@crmne (Owner) commented Sep 1, 2025

Why?

@graysonchen (Contributor, Author) commented

Hey @crmne, yep, it's to integrate Cloudflare AI Gateway.

With this, we can quickly pinpoint issues and access logs.
[screenshot: Cloudflare AI Gateway logs]

@crmne (Owner) commented Sep 3, 2025

I see. Since it's OpenAI-compatible, instead of adding another configuration option to the library, can't you do the following and use the OpenAI provider instead?

RubyLLM.configure do |config|
  config.openai_api_base = '<cloudflare api gateway url>'
  config.openai_use_system_role = true
end
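Concretely, that suggestion could look like the sketch below. The gateway URL pattern, account/gateway placeholders, and model name are illustrative assumptions, not values from this thread; the actual endpoint comes from your Cloudflare dashboard.

```ruby
require 'ruby_llm'

RubyLLM.configure do |config|
  # The API key still belongs to OpenAI; the gateway only proxies requests.
  config.openai_api_key = ENV['OPENAI_API_KEY']
  # Hypothetical Cloudflare AI Gateway endpoint -- ACCOUNT_ID and
  # GATEWAY_NAME are placeholders for your own values.
  config.openai_api_base = 'https://gateway.ai.cloudflare.com/v1/ACCOUNT_ID/GATEWAY_NAME/openai'
  config.openai_use_system_role = true
end

# Requests now flow through the gateway, so they show up in its logs.
chat = RubyLLM.chat(model: 'gpt-4o-mini', provider: :openai)
chat.ask('Hello through the gateway!')
```

Because the gateway is a transparent OpenAI-compatible proxy, no Cloudflare-specific option is needed in the library itself; pointing `openai_api_base` at it is enough to get the log and debugging benefits described above.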
