Taskallama is a Laravel package that provides seamless integration with Ollama's LLM API. It simplifies generating AI-powered content, from professional task writing to conversational agents, with minimal effort. Whether you're building a task management system, an HR assistant for job posts, or a blog content generator, Taskallama has you covered.
Why did I build it? For simple reasons: I wanted an AI helper in our project and task management system at Taskavel.com to help me quickly scaffold tasks. We're also going to use it in another of our SaaS projects, the advanced ATS system at Bagel.blue, to make it easy to create job postings.
- Simple API for generating AI responses via the Ollama LLM.
- Supports task creation, conversational AI, embeddings, and more.
- Customizable agent personalities for tailored responses.
- Integration with Laravel Livewire for real-time interactions.
- Configurable options like streaming, model selection, and temperature.
Ollama Installation
- Taskallama requires Ollama to be installed and running locally on your machine. You can download and install Ollama from the official website: https://ollama.com.
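Once Ollama is installed, you can pre-pull the default model used by this package and confirm the server is reachable (the model name assumes the `llama3.2` default from the config below):

ollama pull llama3.2
curl http://127.0.0.1:11434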
Ollama Configuration
- By default, Taskallama connects to Ollama at `http://127.0.0.1:11434`. Ensure that Ollama is running and accessible at this address. You can update the `OLLAMA_URL` in the config file if it's hosted elsewhere.
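For example, to point Taskallama at an Ollama instance on another host, override the URL in your `.env` (the hostname below is a placeholder):

OLLAMA_URL=http://ollama.example.internal:11434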
System Requirements
- PHP `^8.3` or higher.
- Laravel `^11.0` or higher.
You can install the package via Composer:
composer require codingwisely/taskallama
Next, you should publish the package's configuration file:
php artisan vendor:publish --tag="taskallama-config"
This will publish a `taskallama.php` file in your `config` directory where you can configure your Ollama URL, default model, and other settings.
return [
    // Default model used for every request (override per call with ->model()).
    'model' => env('OLLAMA_MODEL', 'llama3.2'),
    // Response format requested from Ollama.
    'default_format' => 'json',
    // Base URL of your Ollama instance.
    'url' => env('OLLAMA_URL', 'http://127.0.0.1:11434'),
    // Prompt used when none is provided.
    'default_prompt' => env('OLLAMA_DEFAULT_PROMPT', 'Hello Taskavelian, how can I assist you today?'),
    'connection' => [
        // HTTP timeout in seconds; generation can be slow on large models.
        'timeout' => env('OLLAMA_CONNECTION_TIMEOUT', 300),
    ],
];
Generate a response using a prompt:
use CodingWisely\Taskallama\Facades\Taskallama;
$response = Taskallama::agent('You are a professional task creator...')
->prompt('Write a task for implementing a new feature in a SaaS app.')
->model('llama3.2')
->options(['temperature' => 0.5])
->stream(false)
->ask();
return $response['response'];
Generate a streamed response using a prompt:
use CodingWisely\Taskallama\Facades\Taskallama;
return response()->stream(function () {
Taskallama::agent('You are a professional task creator...')
->prompt('Write a task for implementing a new feature in a SaaS app.')
->model('llama3.2')
->options(['temperature' => 0.5])
->stream(true)
->ask();
}, 200, [
'Cache-Control' => 'no-cache',
'X-Accel-Buffering' => 'no',
'Content-Type' => 'text/event-stream',
]);
Create a conversational agent:
use CodingWisely\Taskallama\Facades\Taskallama;
$messages = [
['role' => 'user', 'content' => 'Tell me about Laravel'],
['role' => 'assistant', 'content' => 'Laravel is a PHP framework for web development.'],
['role' => 'user', 'content' => 'Why is it so popular?'],
];
$response = Taskallama::agent('You are a Laravel expert.')
->model('llama3.2')
->options(['temperature' => 0.7])
->chat($messages);
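The reply follows Ollama's /api/chat payload, so the assistant's text typically sits under `message.content`. A hedged sketch, assuming the package passes Ollama's response array straight through:

// Assumption: chat() returns Ollama's raw /api/chat response array.
return $response['message']['content'] ?? '';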
Integrate Taskallama into a Livewire component for real-time task generation:
namespace App\Livewire;
use CodingWisely\Taskallama\Taskallama;
use Livewire\Component;
class AskTaskallama extends Component
{
public $question = '';
public $response = '';
public function ask()
{
if (empty(trim($this->question))) {
$this->response = "Please provide a valid question.";
return;
}
try {
$this->response = Taskallama::agent('You are a task-writing assistant.')
->prompt($this->question)
->model('llama3.2')
->options(['temperature' => 0])
->stream(false)
->ask()['response'] ?? "No response received.";
} catch (\Exception $e) {
$this->response = "Error: " . $e->getMessage();
}
}
public function render()
{
return view('livewire.ask-taskallama');
}
}
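The component needs a corresponding Blade view. A minimal sketch (hypothetical markup; `wire:model` and `wire:click` are standard Livewire directives):

<!-- resources/views/livewire/ask-taskallama.blade.php -->
<div>
    <input type="text" wire:model="question" placeholder="Ask Taskallama...">
    <button wire:click="ask">Ask</button>

    <p>{{ $response }}</p>
</div>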
Generate embeddings for advanced search or semantic analysis:
// Hedged sketch: assumes an embeddings() terminal method that takes the input
// text, mirroring ollama-laravel's embeddings(string $prompt). A plain ask()
// call would return a chat completion rather than an embedding vector.
$embeddings = Taskallama::agent('Embedding Assistant')
    ->model('llama3.2')
    ->options(['temperature' => 0.5])
    ->embeddings('The text to embed for semantic search.');
print_r($embeddings);
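Embeddings are plain float vectors, so you can compare them with cosine similarity for semantic search. A generic sketch, independent of the package:

// Cosine similarity between two equal-length embedding vectors.
function cosineSimilarity(array $a, array $b): float
{
    $dot = $normA = $normB = 0.0;

    foreach ($a as $i => $value) {
        $dot += $value * $b[$i];
        $normA += $value ** 2;
        $normB += $b[$i] ** 2;
    }

    return $dot / (sqrt($normA) * sqrt($normB));
}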
List your local models:
$models = Taskallama::getInstance()->listLocalModels();
print_r($models);

Show information about a specific model:
$modelInfo = Taskallama::getInstance()->getModelInfo('llama3.2');
print_r($modelInfo);

Retrieve a model's settings:
$modelSettings = Taskallama::getInstance()->getModelSettings('llama3.2');
print_r($modelSettings);
Pull a model:
If you're pulling a model, make sure you run it in a background job, as it may take a while to download; see the sketch below.
$pullModel = Taskallama::getInstance()->pull('mistral');
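Because a pull can run for many minutes, dispatching it from a queued job keeps your request cycle fast. A minimal sketch (the job class name and timeout are hypothetical):

// app/Jobs/PullOllamaModel.php
namespace App\Jobs;

use CodingWisely\Taskallama\Facades\Taskallama;
use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;

class PullOllamaModel implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    // Give large downloads plenty of time before the worker times out.
    public int $timeout = 3600;

    public function __construct(public string $model)
    {
    }

    public function handle(): void
    {
        Taskallama::getInstance()->pull($this->model);
    }
}

// Dispatch it from anywhere in your app:
PullOllamaModel::dispatch('mistral');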
Delete a model:
$deleteModel = Taskallama::getInstance()->delete('mistral');
Run the tests with:
composer test
This package is open-source software licensed under the MIT License. Please see the LICENSE.md file for more information.