
Conversation

Contributor

@mwqgithub mwqgithub commented Dec 18, 2025

Purpose of the change

Allow users to configure an OpenAI-compatible model provider such as Qwen or DeepSeek.

Description

Add an OpenAI-compatible model provider option to memmachine-compose.sh. This allows users to configure an OpenAI-compatible model provider such as Qwen or DeepSeek.

Type of change

  • New feature (non-breaking change which adds functionality)
  • Project Maintenance (updates to build scripts, CI, etc., that do not affect the main project)

How Has This Been Tested?

  • Manual verification (list step-by-step instructions)

Checklist

  • I have signed the commit(s) within this pull request
  • My code follows the style guidelines of this project (See STYLE_GUIDE.md)
  • I have performed a self-review of my own code
  • I have commented my code
  • I have made corresponding changes to the documentation
  • My changes generate no new warnings
  • I have added unit tests that prove my fix is effective or that my feature works
  • New and existing unit tests pass locally with my changes
  • Any dependent changes have been merged and published in downstream modules
  • I have checked my code and corrected any misspellings

Maintainer Checklist

  • Confirmed all checks passed
  • Contributor has signed the commit(s)
  • Reviewed the code
  • Run, Tested, and Verified the change(s) work as expected

Copilot AI left a comment

Pull request overview

This PR adds support for "Others" as a model provider option in the memmachine-compose.sh script, enabling users to configure OpenAI-compatible API endpoints such as Qwen and DeepSeek.

Key changes:

  • Added "OTHERS" provider option alongside existing OpenAI, Bedrock, and Ollama options
  • Implemented configuration handling for custom base URLs and API keys for OpenAI-compatible providers
  • Updated sample configuration files to include other_model and other_embedder entries

Reviewed changes

Copilot reviewed 3 out of 3 changed files in this pull request and generated no comments.

Files changed:

  • memmachine-compose.sh: Added OTHERS provider support with model selection, configuration generation, API key/base URL setup, and validation logic
  • sample_configs/episodic_memory_config.gpu.sample: Added other_embedder and other_model configuration entries as templates for OpenAI-compatible providers
  • sample_configs/episodic_memory_config.cpu.sample: Added other_embedder and other_model configuration entries as templates for OpenAI-compatible providers
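For readers skimming the summary above, the prompt-with-default pattern the script's provider branches rely on can be sketched roughly as follows. This is a minimal sketch: the `apply_default` function name is illustrative, not code from memmachine-compose.sh.

```shell
#!/bin/sh
# Sketch of the "empty answer falls back to the bracketed default" pattern
# used after each `read -p "... [default]: " var` prompt in the script.
# The function name is hypothetical.
apply_default() {
  # Print the first argument if non-empty, otherwise the fallback default.
  printf '%s\n' "${1:-$2}"
}

# Simulating an empty answer at each prompt:
llm_model=$(apply_default "" "gpt-4o-mini")
embedding_model=$(apply_default "" "text-embedding-3-small")
```

In the real script the first argument would be the variable filled in by `read`, so pressing Enter keeps the suggested default while any typed model name overrides it.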


;;
"OTHERS")
print_prompt
read -p "Which OpenAI-compatible LLM model would you like to use? [gpt-4o-mini]: " llm_model
@Kevin-K-W Kevin-K-W Dec 18, 2025

I'd like to change the default value to "qwen-flash".

;;
"OTHERS")
print_prompt
read -p "Which OpenAI-compatible embedding model would you like to use? [text-embedding-3-small]: " embedding_model
Contributor

Change to text-embedding-v4.

 # Ask user for provider path (OpenAI, Others, Bedrock, or Ollama)
 print_prompt
-read -p "Which provider would you like to use? (OpenAI/Bedrock/Ollama) [OpenAI]: " provider_input
+read -p "Which provider would you like to use? (OpenAI/Others/Bedrock/Ollama) [OpenAI]: " provider_input
Contributor

OpenAI/Bedrock/Ollama/Others

echo " Default LLM: gpt-4o-mini"
echo " Default embedding: text-embedding-3-small"
echo " Requires: OpenAI API key"
echo " Others - Uses an OpenAI-compatible endpoint (custom base URL)"
Contributor

Put this at the end of the list.

@sscargal sscargal left a comment

Thank you for opening this PR and for working to improve model provider options! I wanted to share some thoughts and get your input on the naming and clarity:

  • The addition of an "Others" provider might be confusing or redundant, as the existing "openai" and "ollama" options already support any OpenAI-compatible endpoint, not just the official OpenAI API.
  • Users may not always realize that "openai" and "ollama" cover both the OpenAI service and any compatible endpoints from other providers. The current naming could imply it's limited to OpenAI only.
  • To help with clarity and reduce potential confusion, it could be helpful to update the documentation or add tooltips specifying that the "openai" option supports all OpenAI-compatible APIs—including third-party endpoints.
  • Alternatively, renaming the proposed "Others" option to something like "OpenAI/Compatible" or "Custom (OpenAI-compatible)" might make the scope more obvious to users.

I am open to suggestions and feedback, and I'm okay with the last bullet: renaming this proposal from "Others" to "Custom (OpenAI-compatible)". However, we will need to change the input method from asking the user to type the option to selecting it from a numbered list. This approach would improve the user experience.

Furthermore, we would need to add this new option to the memmachine-configure command so that they remain in sync.
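The numbered-list idea above could look roughly like this. A sketch under assumptions: the option order, labels, and the `provider_from_choice` function name are hypothetical, not the merged implementation.

```shell
#!/bin/sh
# Hypothetical sketch of numbered-menu provider selection.
# Maps the user's numeric choice to a provider name; an empty
# answer keeps the first option as the default.
provider_from_choice() {
  case "$1" in
    ""|1) echo "OpenAI" ;;
    2)    echo "Bedrock" ;;
    3)    echo "Ollama" ;;
    4)    echo "Custom (OpenAI-compatible)" ;;
    *)    echo "invalid" ;;
  esac
}

# The interactive part (not exercised here) would print the menu and read a number:
#   echo "1) OpenAI  2) Bedrock  3) Ollama  4) Custom (OpenAI-compatible)"
#   read -p "Select a provider [1]: " choice
#   provider=$(provider_from_choice "$choice")
```

Numeric selection also sidesteps case-sensitivity and spelling issues that come with typed provider names.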

Contributor

jealous commented Dec 19, 2025

Just noticed Steve's comment. I think it makes sense; you may need to update the change.

Contributor Author

mwqgithub commented Dec 19, 2025

Hi @sscargal , thanks for your comment.
The reason for adding the "others" option is that the "openai" option uses the Responses API, which most OpenAI-compatible providers don't support. If we can make the "openai" option use the Chat Completions API instead, I think it's simpler and less confusing to let users use "openai" for both OpenAI and third-party services.
I'm not sure about the benefits of the Responses API over the Chat Completions API. If there's no significant difference between them, I think we can just change the API type for the "openai" option.
@jealous @Kevin-K-W
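To make the API-type point above concrete: most OpenAI-compatible providers expose the standard /chat/completions route under their own base URL, so switching providers is mostly a base-URL and API-key swap. A rough sketch, with caveats: the DashScope URL is one example of a compatible endpoint, and the key and model names are placeholders, not values from the script.

```shell
#!/bin/sh
# Sketch only: an OpenAI-compatible provider differs from OpenAI itself
# mainly in the base URL it serves the Chat Completions route from.
# The URL below is an example (Qwen's compatible-mode endpoint); the
# API key and model are placeholders.
BASE_URL="https://dashscope.aliyuncs.com/compatible-mode/v1"
endpoint="${BASE_URL}/chat/completions"

# The actual request (not executed here) would look like:
#   curl "$endpoint" \
#     -H "Authorization: Bearer $OPENAI_COMPAT_API_KEY" \
#     -H "Content-Type: application/json" \
#     -d '{"model": "qwen-flash", "messages": [{"role": "user", "content": "hi"}]}'
printf '%s\n' "$endpoint"
```

The Responses API, by contrast, lives at a different route (/responses) that third-party providers generally do not implement, which is the incompatibility described above.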

@sscargal
Contributor

@mwqgithub Thanks for the update. I agree. Perhaps we should consider renaming the proposed "Others" option to something like "OpenAI/Compatible" or "Custom (OpenAI-compatible)", which might make the scope more obvious to users. "Others" is too general IMHO and could mean anything.

@mwqgithub mwqgithub force-pushed the fix-setup-script branch 2 times, most recently from afa5acc to 2028173 on December 22, 2025
Contributor Author

Hi, @Kevin-K-W @sscargal . I've updated the code. Could you take a look again?

@mwqgithub mwqgithub changed the title Add a 'Others' model provider in memmachine-compose.sh Add openai-compatible model provider in memmachine-compose.sh Dec 22, 2025
@sscargal sscargal left a comment

LGTM, thanks!

@sscargal sscargal merged commit c9e023b into MemMachine:main Dec 22, 2025
49 of 50 checks passed
4 participants