PHP Luminova: Ollama AI Client Integration
Use Ollama with Luminova for local AI chat, embeddings, and model management. Includes practical PHP examples for prompts, streaming, and embeddings.
The Luminova Ollama client (Luminova\AI\Client\Ollama) provides a clean interface to the Ollama API, enabling local or self-hosted open-source model inference including chat, embeddings, vision, model management, custom model creation, and fine-tuning via Modelfiles.
Ollama works by building and running models locally (or via a remote server). There is no background training system; models are created instantly from a Modelfile.
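Since everything hinges on the Modelfile, here is a minimal illustrative sketch (directive names follow Ollama's Modelfile format; the base model and values are placeholders):

```
FROM llama3
PARAMETER temperature 0.7
SYSTEM You are a concise, helpful assistant.
```

Ollama combines these directives with the base model's existing weights at create time, which is why model creation is near-instant.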
This guide covers Ollama-specific methods and usage within Luminova.
For shared AI features and client implementation, see: Base AI Class (Luminova\Base\AI)
To get started, install and run Ollama: https://ollama.com
Usages
Local Instance (Default)
use Luminova\AI\Client\Ollama;
$client = new Ollama(); // Connects to http://localhost:11434/api/
Custom Host
$client = new Ollama('http://192.168.1.50:11434/api/');
Authenticated Remote Deployment
$client = new Ollama(
baseurl: 'https://ollama.example.com/api/',
apiKey: env('OLLAMA_KEY')
);
Via Application Config
Set handler = 'Ollama' and baseUrl = 'http://...' in App\Config\AI, then use the AI manager:
use Luminova\AI\AI;
$reply = AI::message('Hello from Ollama!');
Text Generation
Raw Completion (generate)
Sends a plain prompt to the /generate endpoint without conversation context.
$response = $client->setModel('llama3')->generate('Explain recursion in plain English.');
print_r($response);
Single Message
$reply = $client->message('What are the main features of PHP 8.3?');
echo $reply['content'] ?? '';
Multi-Turn Chat
$reply = $client->chat([
['role' => 'system', 'content' => 'You are a helpful Linux assistant.'],
['role' => 'user', 'content' => 'How do I list all running processes?'],
['role' => 'assistant', 'content' => 'You can use the `ps aux` command.'],
['role' => 'user', 'content' => 'How do I filter by process name?'],
]);
echo $reply['content'] ?? '';
With Options
$reply = $client->chat('Write a short poem about databases.', [
'model' => 'mistral',
'temperature' => 0.9,
]);
Web Search
Note: Ollama's web search support requires a compatible server configuration.
$results = $client->webSearch('Current PHP release notes', ['limit' => 5]);
print_r($results);
Embeddings
Generate vector embeddings for semantic search, clustering, or RAG pipelines.
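The batch example below compares vectors with AI::compareCosineVector(). For intuition, cosine similarity is dot(a, b) / (|a| * |b|); here is a minimal plain-PHP sketch of the metric (an illustration, not the library's implementation):

```php
<?php
// Cosine similarity between two equal-length vectors:
// 1.0 means same direction, ~0.0 means unrelated.
function cosineSimilarity(array $a, array $b): float
{
    $dot = 0.0;
    $normA = 0.0;
    $normB = 0.0;

    foreach ($a as $i => $value) {
        $dot   += $value * $b[$i];
        $normA += $value * $value;
        $normB += $b[$i] * $b[$i];
    }

    return $dot / (sqrt($normA) * sqrt($normB));
}

// Identical vectors score 1.0; orthogonal vectors score 0.0.
echo cosineSimilarity([1.0, 0.0], [1.0, 0.0]), PHP_EOL; // 1
echo cosineSimilarity([1.0, 0.0], [0.0, 1.0]), PHP_EOL; // 0
```

Embedding vectors from real models are high-dimensional floats, but the comparison works the same way.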
Single Embedding
$vector = $client->embed('The quick brown fox');
// Returns a flat float array: [0.012, -0.034, ...]
Batch Embeddings
$vectors = $client->embed(['cat', 'feline', 'automobile']);
use Luminova\AI\AI;
$score = AI::compareCosineVector($vectors[0], $vectors[1]);
echo $score; // ~0.92 (cat vs feline are similar)
With a Specific Embedding Model
$vector = $client->embed('Some text to embed', [
'model' => 'nomic-embed-text',
]);
Vision (Image Understanding)
Ollama supports multimodal models like llava that can process images alongside text.
Analyze an Image by URL
$content = $client->vision(
'What is shown in this image?',
'https://example.com/photo.jpg',
['model' => 'llava']
);
echo $content['content'] ?? '';
Analyze a Local File
$content = $client->vision(
'Is there any text visible in this screenshot?',
'/tmp/screenshot.png'
);
echo $content['content'] ?? '';
Compare Multiple Images
$content = $client->vision(
'Describe the differences between these two images.',
['/tmp/before.png', '/tmp/after.png'],
['model' => 'llava']
]);
Model Management
List All Local Models
$models = $client->models();
foreach ($models as $model) {
echo $model['name'] . PHP_EOL;
}
Get Model Details
$info = $client->model('llama3');
echo $info['modified_at'];
Pull a Model from the Registry
$client->pull('mistral');
$client->pull('llama3:8b');
Push a Model to the Registry
$client->push('my-org/my-assistant');
Delete a Local Model
$client->delete('llama3');
Creating Custom Models
create() builds a custom Ollama model from a Modelfile, an array, or a JSON file.
From a Modelfile Path
$result = $client->create('my-assistant', '/path/to/dataset.modelfile', [
'from' => 'llama3',
'system' => 'You are a helpful PHP assistant.',
]);
From an Array (Auto-converted to Modelfile)
$model = $client->create(
'php-assistant',
modelfile: [
'parameters' => [
'temperature' => 0.2,
'top_p' => 0.9,
],
],
options: [
'from' => 'llama3',
'system' => 'You are a helpful assistant specialized in PHP.',
]
);
From a Raw Modelfile String
$modelfile = "PARAMETER temperature 0.5\nPARAMETER top_p 0.95";
$result = $client->create('custom-model', $modelfile, ['from' => 'mistral']);
Fine-Tuning
fineTune() is a direct alias of create().
Note:
This is not traditional fine-tuning; the fineTune method wraps create for consistency. Ollama builds a new model by combining a base model with instructions and examples.
From a Modelfile
$result = $client->fineTune('my-llama', '/path/to/dataset.modelfile', [
'system' => 'You are a helpful PHP assistant.',
);
From a Structured Dataset Array
fineTuneDataset() converts a PHP array to Modelfile format, writes a temp file, and calls create() automatically. The temp file is deleted after processing.
$result = $client->fineTuneDataset([
'parameters' => ['temperature' => 0.7, 'max_tokens' => 256],
'messages' => [
['prompt' => 'What is PHP?', 'completion' => 'A server-side scripting language.'],
['prompt' => 'What is MySQL?', 'completion' => 'An open-source relational database.'],
],
], [
'suffix' => 'php-assistant',
'system' => 'You are a PHP expert.',
]);
Checking Fine-Tune Status
$status = $client->fineTuneStatus('php-assistant');
if ($status['status'] === 'succeeded') {
echo 'Model ready: ' . $status['fine_tuned_model'];
}
Error Handling
use Luminova\Exceptions\AIException;
try {
$reply = $client->message('Hello!');
} catch (AIException $e) {
echo 'Ollama Error: ' . $e->getMessage();
}
Class Definition
- Full namespace: \Luminova\AI\Client\Ollama
- Parent class: Luminova\Base\AI
- Implements: \Luminova\Interface\AIClientInterface
Properties
endpoints
Ollama API endpoints.
protected array $endpoints = [
'generate' => 'generate',
'responses' => 'chat',
'embeddings' => 'embed',
'models' => 'tags',
'show' => 'show',
'pull' => 'pull',
'push' => 'push',
'delete' => 'delete',
'create' => 'create',
'search' => 'web_search',
];
Methods
constructor
Create a new Ollama client instance.
public __construct(?string $baseurl = null, ?string $apiKey = null)
Parameters:
| Parameter | Type | Description |
|---|---|---|
$baseurl | string|null | Base URL of the Ollama server (default: http://localhost:11434/api/). |
$apiKey | string|null | Optional Bearer token for authenticated Ollama deployments. |
Example:
Local default:
$client = new Ollama('http://localhost:11434/api/');
$reply = $client->message('Explain recursion in plain English.');
Remote / authenticated deployment:
$client = new Ollama('https://ollama.example.com/api/', env('OLLAMA_KEY'));
- App\Config\AI - For default application AI configuration.
create
Creates a custom model. The $modelfile argument accepts:
- An absolute path to a .modelfile text file
- An absolute path to a JSON file (auto-converted via toModelFile())
- A raw Modelfile string
public create(string $model, array|string $modelfile, array $options = []): array
Parameters:
| Parameter | Type | Description |
|---|---|---|
$model | string | Suffix or display name for the resulting fine-tuned model (e.g. my-model). |
$modelfile | array|string | Path to a Modelfile or JSON file, a raw Modelfile string, or an array of Modelfile configuration. |
$options | array<string,mixed> | Optional training parameters. Supported keys: from (string) - base model to build on (e.g. llama3); system (string) - system prompt to embed in the Modelfile (Ollama only). |
Return Value:
array - The model creation result.
Throws:
- \Luminova\Exceptions\AIException - On network, client errors or malformed JSON in the response.
Note:
- Model creation is synchronous—it either succeeds or fails.
- Always ensure the base model exists (pull('llama3')) before creating derived models.
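That existence check can be scripted; needsPull() below is a hypothetical helper (not part of Luminova), and the ['name' => ...] entries mirror the shape of the models() listing shown earlier:

```php
<?php
// Decide whether a base model must be pulled before create(),
// given the array returned by $client->models().
function needsPull(array $models, string $base): bool
{
    foreach ($models as $model) {
        // Ollama reports names like "llama3:latest"; accept an exact
        // match or a tagged variant of the bare base name.
        if ($model['name'] === $base || str_starts_with($model['name'], $base . ':')) {
            return false;
        }
    }
    return true;
}

// With only mistral installed, llama3 still needs pulling:
$local = [['name' => 'mistral:latest']];
var_dump(needsPull($local, 'llama3'));  // bool(true)
var_dump(needsPull($local, 'mistral')); // bool(false)
```

If it returns true, call $client->pull($base) before creating the derived model.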
Example:
Create Model:
$model = $client->create('php-assistant',
modelfile: [
'parameters' => [
'temperature' => 0.2,
'top_p' => 0.9,
]
],
options: [
'from' => 'llama3',
'system' => 'You are a helpful assistant specialized in PHP.',
]
);
delete
Delete a locally-stored model.
public delete(string $model): array
Parameters:
| Parameter | Type | Description |
|---|---|---|
$model | string | Model name to delete (e.g. llama3). |
Return Value:
array - Ollama API response.
Example:
$client->delete('llama3');
pull
Pull (download) a model from the Ollama model registry.
public pull(string $model): array
Parameters:
| Parameter | Type | Description |
|---|---|---|
$model | string | Model name to pull (e.g. mistral, llama3:8b). |
Return Value:
array - Ollama API response containing pull progress details.
Example:
$client->pull('mistral');
push
Push a locally-created model to the Ollama model registry.
public push(string $model): array
Parameters:
| Parameter | Type | Description |
|---|---|---|
$model | string | Model name to push (must exist locally). |
Return Value:
array - Ollama API response.
Example:
$client->push('my-org/my-assistant');