Comparing changes

Choose two branches to see what’s changed, or start a new pull request.

base repository: darrenburns/elia
base: 1.3.0
head repository: darrenburns/elia
compare: main
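The same base…head comparison can be reproduced locally with plain git: `git log base..head` lists exactly the commits that are on the head ref but not the base ref. A minimal sketch using a throwaway repository (the tag and commit messages here are illustrative, not from elia):

```shell
# Build a toy repo with a tagged "release" and one newer commit,
# then list what the compare page would show for 1.3.0..HEAD.
set -e
tmp=$(mktemp -d)
cd "$tmp"
git init -q repo && cd repo
git -c user.email=e@x -c user.name=e commit -q --allow-empty -m "first release"
git tag 1.3.0
git -c user.email=e@x -c user.name=e commit -q --allow-empty -m "new feature"
git log --oneline 1.3.0..HEAD   # lists only the "new feature" commit
```

For real repositories, `git diff --stat 1.3.0...main` (three dots: diff against the merge base) gives the per-file change summary shown below the commit list.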

Commits on May 13, 2024

  1. f546c11
  2. abee323
  3. b41b2fe  Merge pull request #41 from darrenburns/arrow-between-widgets: More intuitive cursor movement (darrenburns)
  4. a5d1329  1.4.0 (darrenburns)

Commits on May 14, 2024

  1. 6f914b3
  2. 6296132  Merge pull request #42 from darrenburns/gemini-builtin: Add builtin models for Google Gemini (darrenburns)
  3. afea24f  1.5.0 (darrenburns)
  4. 507781d

Commits on May 21, 2024

  1. 15bd1bb
  2. a748f83  Archiving chats (darrenburns)
  3. 9b475f9  Change default code theme (darrenburns)
  4. dd88042
  5. eaecc59  Default to Monokai (darrenburns)
  6. b7d6965
  7. aa47800
  8. d53fc93
  9. 268e8fb  Small changes (darrenburns)
  10. 1b51948  1.6.0 (darrenburns)
  11. bb6d837  Merge pull request #43 from darrenburns/updates: Update to latest Textual, fix dependencies, add inline mode (darrenburns)
  12. c1b7687  Update README.md (darrenburns)

Commits on May 22, 2024

  1. 86893f7  Fix datetime timezone reference and improve asynchronous handling (CharlesCNorton)
     - Corrected datetime.UTC to datetime.timezone.utc for accurate timezone reference
     - Ensured use of 'await' with 'push_screen' and 'pop_screen' methods for proper async behavior
     - Improved formatting and comments for clarity and readability
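This fix matters for portability: `datetime.UTC` is an alias that only exists on Python 3.11+, while `datetime.timezone.utc` is available on every supported Python 3 version. A minimal sketch of the corrected pattern:

```python
import datetime

# datetime.timezone.utc works on all Python 3 versions; the shorter
# datetime.UTC alias was only added in Python 3.11.
now = datetime.datetime.now(datetime.timezone.utc)

print(now.tzinfo)  # prints "UTC": the timestamp is timezone-aware
```

Using an aware datetime avoids the ambiguity of naive `datetime.now()` values when chats are stored and later compared.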
  2. 060d235  Fix async handling and improve CLI functionality (CharlesCNorton)
     - Ensured proper handling of asynchronous functions within Click commands using `asyncio.run()`
     - Added missing imports and components to maintain full functionality
     - Verified correct use of `asyncio.run()` to avoid potential blocking issues in the event loop
     - Enhanced code clarity and maintainability
  3. f39094e  Progress renaming chats (darrenburns)
  4. 46d9c43  Fix (darrenburns)
  5. aa1cad0  1.6.1 (darrenburns)
  6. 1b9bff5
  7. af7dac1
  8. 3cc3506  Update README.md (darrenburns)
  9. 163288a  Polish (darrenburns)
  10. feef486  Update Textual (darrenburns)
  11. d70ef71
  12. 8a69f27  1.7.0 (darrenburns)
  13. 5c5e431  Merge pull request #46 from darrenburns/rename-chats: Progress renaming chats (darrenburns)
  14. b3b6205  Update README.md (darrenburns)
  15. d5f3790  Update README.md (darrenburns)

Commits on Jun 3, 2024

  1. 004aa83  Create LICENSE (darrenburns)

Commits on Jun 20, 2024

  1. 6dc8d2f  Downgrade textual (darrenburns)
  2. d265aa3  Version bump (darrenburns)

Commits on Jun 21, 2024

  1. 173a220

Commits on Sep 8, 2024

  1. 2e055df  Switch to uv (darrenburns)
  2. c8ade45  Merge pull request #74 from seapagan/fix-no-clipboard-error: Cleanly handle exception if no clipboard mechanism is installed (darrenburns)
  3. 7d21f7d
  4. 70763d9
  5. 8385420
  6. f43e56e
  7. 7c3864f  Remove old CSS (darrenburns)
  8. 61ef601  Merge pull request #78 from darrenburns/8sep-improvements: 8sep improvements (darrenburns)
  9. 6bbd438  Merge pull request #44 from CharlesCNorton/main: Fix datetime timezone reference and improve asynchronous handling (darrenburns)
  10. 35d9068  Bump to 1.9.0 (darrenburns)

Commits on Sep 10, 2024

  1. f1cabd4
5 changes: 0 additions & 5 deletions .pre-commit-config.yaml
@@ -24,8 +24,3 @@ repos:
       - id: python-no-eval
       - id: python-no-log-warn
       - id: python-use-type-annotations
-  - repo: https://github.com/charliermarsh/ruff-pre-commit
-    rev: 'v0.0.269'
-    hooks:
-      - id: ruff
-        args: [ --fix, --exit-non-zero-on-fix ]
201 changes: 201 additions & 0 deletions LICENSE
@@ -0,0 +1,201 @@
Apache License
Version 2.0, January 2004
http://www.apache.org/licenses/

TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION

1. Definitions.

"License" shall mean the terms and conditions for use, reproduction,
and distribution as defined by Sections 1 through 9 of this document.

"Licensor" shall mean the copyright owner or entity authorized by
the copyright owner that is granting the License.

"Legal Entity" shall mean the union of the acting entity and all
other entities that control, are controlled by, or are under common
control with that entity. For the purposes of this definition,
"control" means (i) the power, direct or indirect, to cause the
direction or management of such entity, whether by contract or
otherwise, or (ii) ownership of fifty percent (50%) or more of the
outstanding shares, or (iii) beneficial ownership of such entity.

"You" (or "Your") shall mean an individual or Legal Entity
exercising permissions granted by this License.

"Source" form shall mean the preferred form for making modifications,
including but not limited to software source code, documentation
source, and configuration files.

"Object" form shall mean any form resulting from mechanical
transformation or translation of a Source form, including but
not limited to compiled object code, generated documentation,
and conversions to other media types.

"Work" shall mean the work of authorship, whether in Source or
Object form, made available under the License, as indicated by a
copyright notice that is included in or attached to the work
(an example is provided in the Appendix below).

"Derivative Works" shall mean any work, whether in Source or Object
form, that is based on (or derived from) the Work and for which the
editorial revisions, annotations, elaborations, or other modifications
represent, as a whole, an original work of authorship. For the purposes
of this License, Derivative Works shall not include works that remain
separable from, or merely link (or bind by name) to the interfaces of,
the Work and Derivative Works thereof.

"Contribution" shall mean any work of authorship, including
the original version of the Work and any modifications or additions
to that Work or Derivative Works thereof, that is intentionally
submitted to Licensor for inclusion in the Work by the copyright owner
or by an individual or Legal Entity authorized to submit on behalf of
the copyright owner. For the purposes of this definition, "submitted"
means any form of electronic, verbal, or written communication sent
to the Licensor or its representatives, including but not limited to
communication on electronic mailing lists, source code control systems,
and issue tracking systems that are managed by, or on behalf of, the
Licensor for the purpose of discussing and improving the Work, but
excluding communication that is conspicuously marked or otherwise
designated in writing by the copyright owner as "Not a Contribution."

"Contributor" shall mean Licensor and any individual or Legal Entity
on behalf of whom a Contribution has been received by Licensor and
subsequently incorporated within the Work.

2. Grant of Copyright License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
copyright license to reproduce, prepare Derivative Works of,
publicly display, publicly perform, sublicense, and distribute the
Work and such Derivative Works in Source or Object form.

3. Grant of Patent License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
(except as stated in this section) patent license to make, have made,
use, offer to sell, sell, import, and otherwise transfer the Work,
where such license applies only to those patent claims licensable
by such Contributor that are necessarily infringed by their
Contribution(s) alone or by combination of their Contribution(s)
with the Work to which such Contribution(s) was submitted. If You
institute patent litigation against any entity (including a
cross-claim or counterclaim in a lawsuit) alleging that the Work
or a Contribution incorporated within the Work constitutes direct
or contributory patent infringement, then any patent licenses
granted to You under this License for that Work shall terminate
as of the date such litigation is filed.

4. Redistribution. You may reproduce and distribute copies of the
Work or Derivative Works thereof in any medium, with or without
modifications, and in Source or Object form, provided that You
meet the following conditions:

(a) You must give any other recipients of the Work or
Derivative Works a copy of this License; and

(b) You must cause any modified files to carry prominent notices
stating that You changed the files; and

(c) You must retain, in the Source form of any Derivative Works
that You distribute, all copyright, patent, trademark, and
attribution notices from the Source form of the Work,
excluding those notices that do not pertain to any part of
the Derivative Works; and

(d) If the Work includes a "NOTICE" text file as part of its
distribution, then any Derivative Works that You distribute must
include a readable copy of the attribution notices contained
within such NOTICE file, excluding those notices that do not
pertain to any part of the Derivative Works, in at least one
of the following places: within a NOTICE text file distributed
as part of the Derivative Works; within the Source form or
documentation, if provided along with the Derivative Works; or,
within a display generated by the Derivative Works, if and
wherever such third-party notices normally appear. The contents
of the NOTICE file are for informational purposes only and
do not modify the License. You may add Your own attribution
notices within Derivative Works that You distribute, alongside
or as an addendum to the NOTICE text from the Work, provided
that such additional attribution notices cannot be construed
as modifying the License.

You may add Your own copyright statement to Your modifications and
may provide additional or different license terms and conditions
for use, reproduction, or distribution of Your modifications, or
for any such Derivative Works as a whole, provided Your use,
reproduction, and distribution of the Work otherwise complies with
the conditions stated in this License.

5. Submission of Contributions. Unless You explicitly state otherwise,
any Contribution intentionally submitted for inclusion in the Work
by You to the Licensor shall be under the terms and conditions of
this License, without any additional terms or conditions.
Notwithstanding the above, nothing herein shall supersede or modify
the terms of any separate license agreement you may have executed
with Licensor regarding such Contributions.

6. Trademarks. This License does not grant permission to use the trade
names, trademarks, service marks, or product names of the Licensor,
except as required for reasonable and customary use in describing the
origin of the Work and reproducing the content of the NOTICE file.

7. Disclaimer of Warranty. Unless required by applicable law or
agreed to in writing, Licensor provides the Work (and each
Contributor provides its Contributions) on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
implied, including, without limitation, any warranties or conditions
of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
PARTICULAR PURPOSE. You are solely responsible for determining the
appropriateness of using or redistributing the Work and assume any
risks associated with Your exercise of permissions under this License.

8. Limitation of Liability. In no event and under no legal theory,
whether in tort (including negligence), contract, or otherwise,
unless required by applicable law (such as deliberate and grossly
negligent acts) or agreed to in writing, shall any Contributor be
liable to You for damages, including any direct, indirect, special,
incidental, or consequential damages of any character arising as a
result of this License or out of the use or inability to use the
Work (including but not limited to damages for loss of goodwill,
work stoppage, computer failure or malfunction, or any and all
other commercial damages or losses), even if such Contributor
has been advised of the possibility of such damages.

9. Accepting Warranty or Additional Liability. While redistributing
the Work or Derivative Works thereof, You may choose to offer,
and charge a fee for, acceptance of support, warranty, indemnity,
or other liability obligations and/or rights consistent with this
License. However, in accepting such obligations, You may act only
on Your own behalf and on Your sole responsibility, not on behalf
of any other Contributor, and only if You agree to indemnify,
defend, and hold each Contributor harmless for any liability
incurred by, or claims asserted against, such Contributor by reason
of your accepting any such warranty or additional liability.

END OF TERMS AND CONDITIONS

APPENDIX: How to apply the Apache License to your work.

To apply the Apache License to your work, attach the following
boilerplate notice, with the fields enclosed by brackets "[]"
replaced with your own identifying information. (Don't include
the brackets!) The text should be enclosed in the appropriate
comment syntax for the file format. We also recommend that a
file or class name and description of purpose be included on the
same "printed page" as the copyright notice for easier
identification within third-party archives.

Copyright [yyyy] [name of copyright owner]

Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at

http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
71 changes: 63 additions & 8 deletions README.md
@@ -19,10 +19,10 @@ Speak with proprietary models such as ChatGPT and Claude, or with local models r
Install Elia with [pipx](https://github.com/pypa/pipx):

```bash
pipx install elia-chat
pipx install --python 3.11 elia-chat
```

Depending on the model you wish to use, you may need to set one or more environment variables (e.g. `OPENAI_API_KEY`, `ANTHROPIC_API_KEY`, etc).
Depending on the model you wish to use, you may need to set one or more environment variables (e.g. `OPENAI_API_KEY`, `ANTHROPIC_API_KEY`, `GEMINI_API_KEY` etc).

## Quickstart

@@ -32,10 +32,28 @@ Launch Elia from the command line:
elia
```

Launch directly a new chat from the command line:
Launch a new chat inline (under your prompt) with `-i`/`--inline`:

```bash
elia "What is the Zen of Python?"
elia -i "What is the Zen of Python?"
```

Launch a new chat in full-screen mode:

```bash
elia "Tell me a cool fact about lizards!"
```

Specify a model via the command line using `-m`/`--model`:

```bash
elia -m gpt-4o
```

Options can be combined - here's how you launch a chat with Gemini 1.5 Flash in inline mode (requires `GEMINI_API_KEY` environment variable).

```bash
elia -i -m gemini/gemini-1.5-flash-latest "How do I call Rust code from Python?"
```

## Running local models
@@ -53,13 +71,19 @@ the options window (`ctrl+o`).
The example file below shows the available options, as well as examples of how to add new models.

```toml
# the *ID* for the model that is selected by default on launch
# to use one of the default builtin OpenAI/anthropic models, prefix
# the model name with `elia-`.
default_model = "elia-gpt-3.5-turbo"
# the ID or name of the model that is selected by default on launch
default_model = "gpt-4o"
# the system prompt on launch
system_prompt = "You are a helpful assistant who talks like a pirate."

# choose from "nebula", "cobalt", "twilight", "hacker", "alpine", "galaxy", "nautilus", "monokai", "textual"
theme = "galaxy"

# change the syntax highlighting theme of code in messages
# choose from https://pygments.org/styles/
# defaults to "monokai"
message_code_theme = "dracula"

# example of adding local llama3 support
# only the `name` field is required here.
[[models]]
@@ -92,6 +116,37 @@ name = "gpt-3.5-turbo"
display_name = "GPT 3.5 Turbo (Personal)"
```

## Custom themes

Add a custom theme YAML file to the themes directory.
You can find the themes directory location by pressing `ctrl+o` on the home screen and looking for the `Themes directory` line.

Here's an example of a theme YAML file:

```yaml
name: example # use this name in your config file
primary: '#4e78c4'
secondary: '#f39c12'
accent: '#e74c3c'
background: '#0e1726'
surface: '#17202a'
error: '#e74c3c' # error messages
success: '#2ecc71' # success messages
warning: '#f1c40f' # warning messages
```
## Changing keybindings
Right now, keybinds cannot be changed. Terminals are also rather limited in what keybinds they support.
For example, pressing <kbd>Cmd</kbd>+<kbd>Enter</kbd> to send a message is not possible (although we may support a protocol to allow this in some terminals in the future).
For now, I recommend you map whatever key combo you want at the terminal emulator level to send `\n`.
Here's an example using iTerm:

<img width="848" alt="image" src="https://github.com/darrenburns/elia/assets/5740731/94b6e50c-429a-4d17-99c2-affaa828f35b">

With this mapping in place, pressing <kbd>Cmd</kbd>+<kbd>Enter</kbd> will send a message to the LLM, and pressing <kbd>Enter</kbd> alone will create a new line.

## Import from ChatGPT

Export your conversations to a JSON file using the ChatGPT UI, then import them using the `import` command.
57 changes: 22 additions & 35 deletions elia_chat/__main__.py
@@ -6,7 +6,7 @@
import pathlib
from textwrap import dedent
import tomllib
from typing import Any, Tuple
from typing import Any

import click
from click_default_group import DefaultGroup
@@ -17,18 +17,15 @@
from elia_chat.config import LaunchConfig
from elia_chat.database.import_chatgpt import import_chatgpt_data
from elia_chat.database.database import create_database, sqlite_file_name
from elia_chat.launch_args import QuickLaunchArgs
from elia_chat.locations import config_file

console = Console()


def create_db_if_not_exists() -> None:
if not sqlite_file_name.exists():
click.echo(f"Creating database at {sqlite_file_name!r}")
asyncio.run(create_database())


def load_or_create_config_file() -> dict[str, Any]:
config = config_file()

@@ -43,22 +40,38 @@ def load_or_create_config_file() -> dict[str, Any]:

return file_config


@click.group(cls=DefaultGroup, default="default", default_if_no_args=True)
def cli() -> None:
"""Interact with large language models using your terminal."""


@cli.command()
@click.argument("prompt", nargs=-1, type=str, required=False)
def default(prompt: tuple[str, ...]):
@click.option(
"-m",
"--model",
type=str,
default="",
help="The model to use for the chat",
)
@click.option(
"-i",
"--inline",
is_flag=True,
help="Run in inline mode, without launching full TUI.",
default=False,
)
def default(prompt: tuple[str, ...], model: str, inline: bool) -> None:
prompt = prompt or ("",)
joined_prompt = " ".join(prompt)
create_db_if_not_exists()
file_config = load_or_create_config_file()
app = Elia(LaunchConfig(**file_config), startup_prompt=joined_prompt)
app.run()
cli_config = {}
if model:
cli_config["default_model"] = model

launch_config: dict[str, Any] = {**file_config, **cli_config}
app = Elia(LaunchConfig(**launch_config), startup_prompt=joined_prompt)
app.run(inline=inline)

@cli.command()
def reset() -> None:
@@ -91,7 +104,6 @@ def reset() -> None:
asyncio.run(create_database())
console.print(f"♻️ Database reset @ {sqlite_file_name}")


@cli.command("import")
@click.argument(
"file",
@@ -109,30 +121,5 @@ def import_file_to_db(file: pathlib.Path) -> None:
asyncio.run(import_chatgpt_data(file=file))
console.print(f"[green]ChatGPT data imported from {str(file)!r}")


@cli.command()
@click.argument("message", nargs=-1, type=str, required=True)
@click.option(
"-m",
"--model",
type=str,
default="gpt-3.5-turbo",
help="The model to use for the chat",
)
def chat(message: Tuple[str, ...], model: str) -> None:
"""
Start Elia with a chat message
"""
quick_launch_args = QuickLaunchArgs(
launch_prompt=" ".join(message),
launch_prompt_model_name=model,
)
launch_config = LaunchConfig(
default_model=quick_launch_args.launch_prompt_model_name,
)
app = Elia(launch_config, quick_launch_args.launch_prompt)
app.run()


if __name__ == "__main__":
cli()
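The rewritten `default` command layers command-line options over the config file by merging the two dicts, with later keys winning, so a `-m` flag overrides `default_model` from `config.toml` while unrelated file settings survive. A sketch of that precedence pattern (the concrete values here are illustrative):

```python
from typing import Any

# Settings loaded from the config file (illustrative values).
file_config: dict[str, Any] = {"default_model": "gpt-3.5-turbo", "theme": "nebula"}

# Options from the command line; only populated when the user supplied them.
cli_config: dict[str, Any] = {}
model = "gpt-4o"  # e.g. the value of `elia -m gpt-4o`
if model:
    cli_config["default_model"] = model

# In dict merging, later mappings win, so CLI flags override the file.
launch_config: dict[str, Any] = {**file_config, **cli_config}

print(launch_config["default_model"])  # prints "gpt-4o"
print(launch_config["theme"])          # prints "nebula"
```

Guarding on `if model:` is what keeps an unset flag (the empty-string default) from clobbering the file's value.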
50 changes: 42 additions & 8 deletions elia_chat/app.py
@@ -6,15 +6,17 @@

from textual.app import App
from textual.binding import Binding
from textual.reactive import Reactive, reactive
from textual.signal import Signal

from elia_chat.chats_manager import ChatsManager
from elia_chat.models import ChatData, ChatMessage
from elia_chat.config import EliaChatModel, LaunchConfig, launch_config
from elia_chat.config import EliaChatModel, LaunchConfig
from elia_chat.runtime_config import RuntimeConfig
from elia_chat.screens.chat_screen import ChatScreen
from elia_chat.screens.help_screen import HelpScreen
from elia_chat.screens.home_screen import HomeScreen
from elia_chat.themes import BUILTIN_THEMES, Theme, load_user_themes

if TYPE_CHECKING:
from litellm.types.completion import (
@@ -28,13 +30,17 @@ class Elia(App[None]):
CSS_PATH = Path(__file__).parent / "elia.scss"
BINDINGS = [
Binding("q", "app.quit", "Quit", show=False),
Binding("?", "help", "Help"),
Binding("f1,?", "help", "Help"),
]

def __init__(self, config: LaunchConfig, startup_prompt: str = ""):
super().__init__()
self.launch_config = config
launch_config.set(config)

available_themes: dict[str, Theme] = BUILTIN_THEMES.copy()
available_themes |= load_user_themes()

self.themes: dict[str, Theme] = available_themes

self._runtime_config = RuntimeConfig(
selected_model=config.default_model_object,
system_prompt=config.system_prompt,
@@ -52,6 +58,10 @@ def __init__(self, config: LaunchConfig, startup_prompt: str = ""):
put users into the chat window, rather than going to the home screen.
"""

super().__init__()

theme: Reactive[str | None] = reactive(None, init=False)

@property
def runtime_config(self) -> RuntimeConfig:
return self._runtime_config
@@ -62,15 +72,16 @@ def runtime_config(self, new_runtime_config: RuntimeConfig) -> None:
self.runtime_config_signal.publish(self.runtime_config)

async def on_mount(self) -> None:
self.push_screen(HomeScreen(self.runtime_config_signal))
await self.push_screen(HomeScreen(self.runtime_config_signal))
self.theme = self.launch_config.theme
if self.startup_prompt:
await self.launch_chat(
prompt=self.startup_prompt,
model=self.runtime_config.selected_model,
)

async def launch_chat(self, prompt: str, model: EliaChatModel) -> None:
current_time = datetime.datetime.now(datetime.UTC)
current_time = datetime.datetime.now(datetime.timezone.utc)
system_message: ChatCompletionSystemMessageParam = {
"content": self.runtime_config.system_prompt,
"role": "system",
@@ -102,9 +113,32 @@ async def launch_chat(self, prompt: str, model: EliaChatModel) -> None:

async def action_help(self) -> None:
if isinstance(self.screen, HelpScreen):
self.app.pop_screen()
self.pop_screen()
else:
await self.app.push_screen(HelpScreen())
await self.push_screen(HelpScreen())

def get_css_variables(self) -> dict[str, str]:
if self.theme:
theme = self.themes.get(self.theme)
if theme:
color_system = theme.to_color_system().generate()
else:
color_system = {}
else:
color_system = {}

return {**super().get_css_variables(), **color_system}

def watch_theme(self, theme: str | None) -> None:
self.refresh_css(animate=False)
self.screen._update_styles()

@property
def theme_object(self) -> Theme | None:
try:
return self.themes[self.theme]
except KeyError:
return None


if __name__ == "__main__":
16 changes: 15 additions & 1 deletion elia_chat/chats_manager.py
@@ -3,6 +3,7 @@
from dataclasses import dataclass
import datetime

from sqlmodel import select
from textual import log

from elia_chat.database.converters import (
@@ -27,6 +28,10 @@ async def get_chat(chat_id: int) -> ChatData:
chat_dao = await ChatDao.from_id(chat_id)
return chat_dao_to_chat_data(chat_dao)

@staticmethod
async def rename_chat(chat_id: int, new_title: str) -> None:
await ChatDao.rename_chat(chat_id, new_title)

@staticmethod
async def get_messages(
chat_id: int,
@@ -65,7 +70,7 @@ async def create_chat(chat_data: ChatData) -> int:
chat = ChatDao(
model=lookup_key,
title="",
started_at=datetime.datetime.now(datetime.UTC),
started_at=datetime.datetime.now(datetime.timezone.utc),
)
session.add(chat)
await session.commit()
@@ -87,6 +92,15 @@ async def create_chat(chat_data: ChatData) -> int:

return chat.id

@staticmethod
async def archive_chat(chat_id: int) -> None:
async with get_session() as session:
statement = select(ChatDao).where(ChatDao.id == chat_id)
result = await session.exec(statement)
chat_dao = result.one()
chat_dao.archived = True
await session.commit()

@staticmethod
async def add_message_to_chat(chat_id: int, message: ChatMessage) -> None:
async with get_session() as session:
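`archive_chat` is a soft delete: it flips an `archived` flag on the row instead of removing it, and `ChatDao.all()` filters archived rows out of the default listing. The same pattern sketched with stdlib sqlite3 rather than the async SQLModel session used in the diff (table and column names here mirror the idea, not elia's actual schema):

```python
import sqlite3

# Soft-delete: mark rows archived rather than deleting them, then
# exclude archived rows from the default query.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE chat (id INTEGER PRIMARY KEY, title TEXT, archived INTEGER DEFAULT 0)"
)
conn.execute("INSERT INTO chat (title) VALUES ('keep me'), ('archive me')")


def archive_chat(chat_id: int) -> None:
    conn.execute("UPDATE chat SET archived = 1 WHERE id = ?", (chat_id,))
    conn.commit()


archive_chat(2)
titles = [row[0] for row in conn.execute("SELECT title FROM chat WHERE archived = 0")]
print(titles)  # prints "['keep me']"
```

Keeping the row around means an archived chat can later be restored by clearing the flag, which a hard `DELETE` would not allow.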
56 changes: 48 additions & 8 deletions elia_chat/config.py
@@ -1,4 +1,3 @@
from contextvars import ContextVar
import os
from pydantic import AnyHttpUrl, BaseModel, ConfigDict, Field, SecretStr

@@ -33,7 +32,7 @@ class EliaChatModel(BaseModel):
"""A description of the model which may appear inside the Elia UI."""
product: str | None = Field(default=None)
"""For example `ChatGPT`, `Claude`, `Gemini`, etc."""
temperature: int = Field(default=1.0)
temperature: float = Field(default=1.0)
"""The temperature to use. Low temperature means the same prompt is likely
to produce similar results. High temperature means a flatter distribution
when predicting the next token, and so the next token will be more random.
@@ -55,14 +54,16 @@ def get_builtin_openai_models() -> list[EliaChatModel]:
provider="OpenAI",
product="ChatGPT",
description="Fast & inexpensive model for simple tasks.",
temperature=0.7,
),
EliaChatModel(
id="elia-gpt-4o",
name="openai/gpt-4o",
name="gpt-4o",
display_name="GPT-4o",
provider="OpenAI",
product="ChatGPT",
description="Fastest and most affordable flagship model.",
temperature=0.7,
),
EliaChatModel(
id="elia-gpt-4-turbo",
@@ -71,12 +72,21 @@ def get_builtin_openai_models() -> list[EliaChatModel]:
provider="OpenAI",
product="ChatGPT",
description="Previous high-intelligence model.",
temperature=0.7,
),
]


def get_builtin_anthropic_models() -> list[EliaChatModel]:
return [
EliaChatModel(
id="elia-claude-3-5-sonnet-20240620",
name="claude-3-5-sonnet-20240620",
display_name="Claude 3.5 Sonnet",
provider="Anthropic",
product="Claude 3.5",
description=("Anthropic's most intelligent model"),
),
EliaChatModel(
id="elia-claude-3-haiku-20240307",
name="claude-3-haiku-20240307",
@@ -103,13 +113,39 @@ def get_builtin_anthropic_models() -> list[EliaChatModel]:
display_name="Claude 3 Opus",
provider="Anthropic",
product="Claude 3",
description="Most powerful model for highly complex tasks",
description="Excels at writing and complex tasks",
),
]


def get_builtin_google_models() -> list[EliaChatModel]:
return [
EliaChatModel(
id="elia-gemini/gemini-1.5-pro-latest",
name="gemini/gemini-1.5-pro-latest",
display_name="Gemini 1.5 Pro",
provider="Google",
product="Gemini",
description="Excels at reasoning tasks including code and text generation, "
"text editing, problem solving, data extraction and generation",
),
EliaChatModel(
id="elia-gemini/gemini-1.5-flash-latest",
name="gemini/gemini-1.5-flash-latest",
display_name="Gemini 1.5 Flash",
provider="Google",
product="Gemini",
description="Fast and versatile performance across a variety of tasks",
),
]


def get_builtin_models() -> list[EliaChatModel]:
return get_builtin_openai_models() + get_builtin_anthropic_models()
return (
get_builtin_openai_models()
+ get_builtin_anthropic_models()
+ get_builtin_google_models()
)


class LaunchConfig(BaseModel):
@@ -120,17 +156,20 @@ class LaunchConfig(BaseModel):

model_config = ConfigDict(frozen=True)

default_model: str = Field(default="elia-gpt-3.5-turbo")
default_model: str = Field(default="elia-gpt-4o")
"""The ID or name of the default model."""
system_prompt: str = Field(
default=os.getenv(
"ELIA_SYSTEM_PROMPT", "You are a helpful assistant named Elia."
)
)
message_code_theme: str = Field(default="monokai")
"""The default Pygments syntax highlighting theme to be used in chatboxes."""
models: list[EliaChatModel] = Field(default_factory=list)
builtin_models: list[EliaChatModel] = Field(
default_factory=get_builtin_models, init=False
)
theme: str = Field(default="nebula")

@property
def all_models(self) -> list[EliaChatModel]:
@@ -142,5 +181,6 @@ def default_model_object(self) -> EliaChatModel:

return get_model(self.default_model, self)


launch_config: ContextVar[LaunchConfig] = ContextVar("launch_config")
@classmethod
def get_current(cls) -> "LaunchConfig":
return cls()
9 changes: 9 additions & 0 deletions elia_chat/database/models.py
@@ -75,6 +75,7 @@ async def all() -> list["ChatDao"]:
statement = (
select(ChatDao)
.join(subquery, subquery.c.chat_id == ChatDao.id)
.where(ChatDao.archived == False) # noqa: E712
.order_by(desc(subquery.c.max_timestamp))
.options(selectinload(ChatDao.messages))
)
@@ -91,3 +92,11 @@ async def from_id(chat_id: int) -> "ChatDao":
)
result = await session.exec(statement)
return result.one()

@staticmethod
async def rename_chat(chat_id: int, new_title: str) -> None:
async with get_session() as session:
chat = await ChatDao.from_id(chat_id)
chat.title = new_title
session.add(chat)
await session.commit()
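The `ChatDao` changes above add an `archived` filter to the listing query and order chats by their most recent message. A plain-Python sketch of that filtering and ordering behaviour, using an in-memory list in place of the real SQLModel query (names here are illustrative, not the DAO's API):

```python
from dataclasses import dataclass, field


@dataclass
class Chat:
    id: int
    title: str
    archived: bool = False
    timestamps: list[int] = field(default_factory=list)


def visible_chats(chats: list[Chat]) -> list[Chat]:
    """Mirror the DAO query: drop archived chats, newest activity first."""
    active = [chat for chat in chats if not chat.archived]
    return sorted(active, key=lambda chat: max(chat.timestamps), reverse=True)


chats = [
    Chat(1, "old", timestamps=[10]),
    Chat(2, "hidden", archived=True, timestamps=[50]),
    Chat(3, "recent", timestamps=[40]),
]
assert [chat.id for chat in visible_chats(chats)] == [3, 1]
```

The archived chat is excluded even though it has the newest activity, matching the `.where(ChatDao.archived == False)` clause in the real query.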
252 changes: 157 additions & 95 deletions elia_chat/elia.scss

Large diffs are not rendered by default.

7 changes: 7 additions & 0 deletions elia_chat/locations.py
@@ -21,3 +21,10 @@ def config_directory() -> Path:

def config_file() -> Path:
return config_directory() / "config.toml"


def theme_directory() -> Path:
"""Return (possibly creating) the themes directory."""
theme_dir = data_directory() / "themes"
theme_dir.mkdir(exist_ok=True, parents=True)
return theme_dir
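`theme_directory()` creates the directory on first access and is safe to call repeatedly. A small sketch of that idempotent pattern against a temporary directory (the `data_dir` parameter stands in for Elia's real data directory):

```python
import tempfile
from pathlib import Path


def ensure_theme_directory(data_dir: Path) -> Path:
    """Mimic theme_directory(): create data_dir/themes if missing, then return it."""
    theme_dir = data_dir / "themes"
    theme_dir.mkdir(exist_ok=True, parents=True)  # no error if it already exists
    return theme_dir


with tempfile.TemporaryDirectory() as tmp:
    first = ensure_theme_directory(Path(tmp))
    existed = first.is_dir()
    second = ensure_theme_directory(Path(tmp))  # second call is a no-op
assert first == second and existed
```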
6 changes: 4 additions & 2 deletions elia_chat/models.py
@@ -5,7 +5,9 @@
from typing import TYPE_CHECKING


from elia_chat.config import LaunchConfig, EliaChatModel, launch_config
from elia_chat.config import LaunchConfig, EliaChatModel

from textual._context import active_app

if TYPE_CHECKING:
from litellm.types.completion import ChatCompletionMessageParam
@@ -23,7 +25,7 @@ def get_model(
Models are looked up by ID first.
"""
if config is None:
config = launch_config.get()
config = active_app.get().launch_config
try:
return {model.id: model for model in config.all_models}[model_id_or_name]
except KeyError:
9 changes: 6 additions & 3 deletions elia_chat/screens/chat_details.py
@@ -1,3 +1,4 @@
from datetime import timezone
from typing import TYPE_CHECKING, cast
import humanize
from textual.app import ComposeResult
@@ -65,7 +66,9 @@ def compose(self) -> ComposeResult:

yield Label("First message", classes="heading")
if chat.create_timestamp:
create_timestamp = chat.create_timestamp.replace(tzinfo=None)
create_timestamp = chat.create_timestamp.replace(
tzinfo=timezone.utc
)
yield Label(
f"{humanize.naturaltime(create_timestamp)}",
classes="datum",
@@ -75,11 +78,11 @@ def compose(self) -> ComposeResult:

yield Rule()

update_time = chat.update_time
update_time = chat.update_time.replace(tzinfo=timezone.utc)
yield Label("Updated at", classes="heading")
if update_time:
yield Label(
f"{humanize.naturaltime(chat.update_time.replace(tzinfo=None))}",
f"{humanize.naturaltime(chat.update_time)}",
classes="datum",
)
else:
24 changes: 16 additions & 8 deletions elia_chat/screens/chat_screen.py
@@ -5,7 +5,7 @@
from textual.widgets import Footer

from elia_chat.chats_manager import ChatsManager
from elia_chat.widgets.agent_is_typing import AgentIsTyping
from elia_chat.widgets.agent_is_typing import ResponseStatus
from elia_chat.widgets.chat import Chat
from elia_chat.models import ChatData

@@ -15,9 +15,10 @@ class ChatScreen(Screen[None]):
BINDINGS = [
Binding(
key="escape",
action="focus('prompt')",
action="app.focus('prompt')",
description="Focus prompt",
key_display="esc",
tooltip="Return focus to the prompt input.",
),
]

@@ -33,19 +34,26 @@ def compose(self) -> ComposeResult:
yield Chat(self.chat_data)
yield Footer()

@on(Chat.NewUserMessage)
def new_user_message(self, event: Chat.NewUserMessage) -> None:
"""Handle a new user message."""
self.query_one(Chat).allow_input_submit = False
response_status = self.query_one(ResponseStatus)
response_status.set_awaiting_response()
response_status.display = True

@on(Chat.AgentResponseStarted)
def start_awaiting_response(self) -> None:
"""Prevent sending messages because the agent is typing."""
self.query_one(AgentIsTyping).display = True
self.query_one(Chat).allow_input_submit = False
response_status = self.query_one(ResponseStatus)
response_status.set_agent_responding()
response_status.display = True

@on(Chat.AgentResponseComplete)
async def agent_response_complete(self, event: Chat.AgentResponseComplete) -> None:
"""Allow the user to send messages again."""
chat = self.query_one(Chat)
agent_is_typing = self.query_one(AgentIsTyping)
agent_is_typing.display = False
chat.allow_input_submit = True
self.query_one(ResponseStatus).display = False
self.query_one(Chat).allow_input_submit = True
log.debug(
f"Agent response complete. Adding message "
f"to chat_id {event.chat_id!r}: {event.message}"
12 changes: 7 additions & 5 deletions elia_chat/screens/help_screen.py
@@ -8,7 +8,7 @@
class HelpScreen(ModalScreen[None]):
BINDINGS = [
Binding("q", "app.quit", "Quit", show=False),
Binding("escape,?", "app.pop_screen()", "Close help", key_display="esc"),
Binding("escape,f1,?", "app.pop_screen()", "Close help", key_display="esc"),
]

HELP_MARKDOWN = """\
@@ -51,9 +51,10 @@ class HelpScreen(ModalScreen[None]):
On the chat screen, pressing `up` and `down` will navigate through messages,
but if you just wish to scroll a little, you can use `shift+up` and `shift+down`.
### The chat history
### The chat list
- `up,down,k,j`: Navigate through chats.
- `a`: Archive the highlighted chat.
- `pageup,pagedown`: Up/down a page.
- `home,end`: Go to first/last chat.
- `g,G`: Go to first/last chat.
@@ -81,6 +82,7 @@ class HelpScreen(ModalScreen[None]):
It's present on both the home screen and the chat page.
- `ctrl+j`: Submit the prompt
- `alt+enter`: Submit the prompt (only works in some terminals)
- `up`: Move the cursor up
- `down`: Move the cursor down
- `left`: Move the cursor left
@@ -120,10 +122,11 @@ class HelpScreen(ModalScreen[None]):
### The chat screen
Press `shift+tab` to focus the latest message (or move the cursor `up` from (0, 0)).
You can use the arrow keys to move up and down through messages.
- `ctrl+r`: Rename the chat (or click the chat title).
- `f2`: View more information about the chat.
_With a message focused_:
- `y,c`: Copy the raw Markdown of the message to the clipboard.
@@ -142,7 +145,6 @@ class HelpScreen(ModalScreen[None]):
- `G`: Focus the latest message.
- `m`: Move focus to the prompt box.
- `up,down,k,j`: Navigate through messages.
- `f2`: View more information about the chat.
"""

35 changes: 32 additions & 3 deletions elia_chat/screens/home_screen.py
@@ -14,13 +14,14 @@
from elia_chat.widgets.app_header import AppHeader
from elia_chat.screens.chat_screen import ChatScreen
from elia_chat.widgets.chat_options import OptionsModal
from elia_chat.widgets.welcome import Welcome

if TYPE_CHECKING:
from elia_chat.app import Elia


class HomePromptInput(PromptInput):
BINDINGS = [Binding("escape", "app.quit", "Exit Elia", key_display="esc")]
BINDINGS = [Binding("escape", "app.quit", "Quit", key_display="esc")]


class HomeScreen(Screen[None]):
@@ -34,9 +35,22 @@ class HomeScreen(Screen[None]):

BINDINGS = [
Binding(
"ctrl+j", "send_message", "Send message", priority=True, key_display="^j"
"ctrl+j,alt+enter",
"send_message",
"Send message",
priority=True,
key_display="^j",
tooltip="Send a message to the chosen LLM. On modern terminals, "
"[b u]alt+enter[/] can be used as an alternative.",
),
Binding(
"o,ctrl+o",
"options",
"Options",
key_display="^o",
tooltip="Change the model, system prompt, and check where Elia"
" is storing your data.",
),
Binding("o,ctrl+o", "options", "Options", key_display="^o"),
]

def __init__(
@@ -57,12 +71,14 @@ def compose(self) -> ComposeResult:
yield AppHeader(self.config_signal)
yield HomePromptInput(id="home-prompt")
yield ChatList()
yield Welcome()
yield Footer()

@on(ScreenResume)
async def reload_screen(self) -> None:
chat_list = self.query_one(ChatList)
await chat_list.reload_and_refresh()
self.show_welcome_if_required()

@on(ChatList.ChatOpened)
async def open_chat_screen(self, event: ChatList.ChatOpened):
@@ -71,6 +87,10 @@ async def open_chat_screen(self, event: ChatList.ChatOpened):
chat = await self.chats_manager.get_chat(chat_id)
await self.app.push_screen(ChatScreen(chat))

@on(ChatList.CursorEscapingTop)
def cursor_escaping_top(self):
self.query_one(HomePromptInput).focus()

@on(PromptInput.PromptSubmitted)
async def create_new_chat(self, event: PromptInput.PromptSubmitted) -> None:
text = event.text
@@ -96,3 +116,12 @@ async def action_options(self) -> None:
def update_config(self, runtime_config: RuntimeConfig) -> None:
app = cast("Elia", self.app)
app.runtime_config = runtime_config

def show_welcome_if_required(self) -> None:
chat_list = self.query_one(ChatList)
if chat_list.option_count == 0:
welcome = self.query_one(Welcome)
welcome.display = "block"
else:
welcome = self.query_one(Welcome)
welcome.display = "none"
25 changes: 25 additions & 0 deletions elia_chat/screens/rename_chat_screen.py
@@ -0,0 +1,25 @@
from textual import on
from textual.app import ComposeResult
from textual.binding import Binding
from textual.containers import Vertical
from textual.screen import ModalScreen
from textual.widgets import Input


class RenameChat(ModalScreen[str]):
BINDINGS = [
Binding("escape", "app.pop_screen", "Cancel", key_display="esc"),
Binding("enter", "app.pop_screen", "Save"),
]

def compose(self) -> ComposeResult:
with Vertical():
title_input = Input(placeholder="Enter a title...")
title_input.border_subtitle = (
"[[white]enter[/]] Save [[white]esc[/]] Cancel"
)
yield title_input

@on(Input.Submitted)
def close_screen(self, event: Input.Submitted) -> None:
self.dismiss(event.value)
172 changes: 172 additions & 0 deletions elia_chat/themes.py
@@ -0,0 +1,172 @@
from pydantic import BaseModel, Field
from textual.design import ColorSystem
import yaml

from elia_chat.locations import theme_directory


class Theme(BaseModel):
name: str = Field(exclude=True)
primary: str
secondary: str | None = None
background: str | None = None
surface: str | None = None
panel: str | None = None
warning: str | None = None
error: str | None = None
success: str | None = None
accent: str | None = None
dark: bool = True

def to_color_system(self) -> ColorSystem:
"""Convert this theme to a ColorSystem."""
return ColorSystem(
**self.model_dump(
exclude={
"text_area",
"syntax",
"variable",
"url",
"method",
}
)
)


def load_user_themes() -> dict[str, Theme]:
"""Load user themes from "~/.config/elia/themes".
Returns:
A dictionary mapping theme names to theme objects.
"""
themes: dict[str, Theme] = {}
for path in theme_directory().iterdir():
path_suffix = path.suffix
if path_suffix == ".yaml" or path_suffix == ".yml":
with path.open() as theme_file:
theme_content = yaml.load(theme_file, Loader=yaml.FullLoader) or {}
try:
themes[theme_content["name"]] = Theme(**theme_content)
except KeyError:
raise ValueError(
f"Invalid theme file {path}. A `name` is required."
)
return themes


BUILTIN_THEMES: dict[str, Theme] = {
"textual": Theme(
name="textual",
primary="#004578",
secondary="#0178D4",
warning="#ffa62b",
error="#ba3c5b",
success="#4EBF71",
accent="#ffa62b",
dark=True,
),
"monokai": Theme(
name="monokai",
primary="#F92672", # Pink
secondary="#66D9EF", # Light Blue
warning="#FD971F", # Orange
error="#F92672", # Pink (same as primary for consistency)
success="#A6E22E", # Green
accent="#AE81FF", # Purple
background="#272822", # Dark gray-green
surface="#3E3D32", # Slightly lighter gray-green
panel="#3E3D32", # Same as surface for consistency
dark=True,
),
"nautilus": Theme(
name="nautilus",
primary="#0077BE", # Ocean Blue
secondary="#20B2AA", # Light Sea Green
warning="#FFD700", # Gold (like sunlight on water)
error="#FF6347", # Tomato (like a warning buoy)
success="#32CD32", # Lime Green (like seaweed)
accent="#FF8C00", # Dark Orange (like a sunset over water)
dark=True,
background="#001F3F", # Dark Blue (deep ocean)
surface="#003366", # Navy Blue (shallower water)
panel="#005A8C", # Steel Blue (water surface)
),
"galaxy": Theme(
name="galaxy",
primary="#8A2BE2", # Improved Deep Magenta (Blueviolet)
secondary="#a684e8",
warning="#FFD700", # Gold, more visible than orange
error="#FF4500", # OrangeRed, vibrant but less harsh than pure red
success="#00FA9A", # Medium Spring Green, kept for vibrancy
accent="#FF69B4", # Hot Pink, for a pop of color
dark=True,
background="#0F0F1F", # Very Dark Blue, almost black
surface="#1E1E3F", # Dark Blue-Purple
panel="#2D2B55", # Slightly Lighter Blue-Purple
),
"nebula": Theme(
name="nebula",
primary="#4169E1", # Royal Blue, more vibrant than Midnight Blue
secondary="#9400D3", # Dark Violet, more vibrant than Indigo Dye
warning="#FFD700", # Kept Gold for warnings
error="#FF1493", # Deep Pink, more nebula-like than Crimson
success="#00FF7F", # Spring Green, slightly more vibrant
accent="#FF00FF", # Magenta, for a true neon accent
dark=True,
background="#0A0A23", # Dark Navy, closer to a night sky
surface="#1C1C3C", # Dark Blue-Purple
panel="#2E2E5E", # Slightly Lighter Blue-Purple
),
"alpine": Theme(
name="alpine",
primary="#4A90E2", # Clear Sky Blue
secondary="#81A1C1", # Misty Blue
warning="#EBCB8B", # Soft Sunlight
error="#BF616A", # Muted Red
success="#A3BE8C", # Alpine Meadow Green
accent="#5E81AC", # Mountain Lake Blue
dark=True,
background="#2E3440", # Dark Slate Grey
surface="#3B4252", # Darker Blue-Grey
panel="#434C5E", # Lighter Blue-Grey
),
"cobalt": Theme(
name="cobalt",
primary="#334D5C", # Deep Cobalt Blue
secondary="#4878A6", # Slate Blue
warning="#FFAA22", # Amber, suitable for warnings related to primary
error="#E63946", # Red, universally recognized for errors
success="#4CAF50", # Green, commonly used for success indication
accent="#D94E64", # Candy Apple Red
dark=True,
surface="#27343B", # Dark Lead
panel="#2D3E46", # Storm Gray
background="#1F262A", # Charcoal
),
"twilight": Theme(
name="twilight",
primary="#367588",
secondary="#5F9EA0",
warning="#FFD700",
error="#FF6347",
success="#00FA9A",
accent="#FF7F50",
dark=True,
background="#191970",
surface="#3B3B6D",
panel="#4C516D",
),
"hacker": Theme(
name="hacker",
primary="#00FF00", # Bright Green (Lime)
secondary="#32CD32", # Lime Green
warning="#ADFF2F", # Green Yellow
error="#FF4500", # Orange Red (for contrast)
success="#00FA9A", # Medium Spring Green
accent="#39FF14", # Neon Green
dark=True,
background="#0D0D0D", # Almost Black
surface="#1A1A1A", # Very Dark Gray
panel="#2A2A2A", # Dark Gray
),
}
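`load_user_themes` rejects any YAML theme file that lacks a `name` key, converting the `KeyError` into a `ValueError`. A stdlib-only sketch of that validation step, where a parsed dict stands in for the YAML file contents (no `yaml` or `pydantic` dependency):

```python
def validate_theme(theme_content: dict) -> str:
    """Return the theme name, mirroring load_user_themes' KeyError handling."""
    try:
        return theme_content["name"]
    except KeyError:
        raise ValueError("Invalid theme file. A `name` is required.") from None


assert validate_theme({"name": "galaxy", "primary": "#8A2BE2"}) == "galaxy"

missing_name_rejected = False
try:
    validate_theme({"primary": "#8A2BE2"})
except ValueError:
    missing_name_rejected = True
assert missing_name_rejected
```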
23 changes: 20 additions & 3 deletions elia_chat/widgets/agent_is_typing.py
@@ -1,9 +1,26 @@
from textual.app import ComposeResult
from textual.containers import Horizontal
from textual.containers import Vertical
from textual.reactive import Reactive, reactive
from textual.widgets import LoadingIndicator, Label


class AgentIsTyping(Horizontal):
class ResponseStatus(Vertical):
"""
A widget that displays the status of the response from the agent.
"""

message: Reactive[str] = reactive("Agent is responding", recompose=True)

def compose(self) -> ComposeResult:
yield Label(f" {self.message}")
yield LoadingIndicator()
yield Label(" Agent is responding ")

def set_awaiting_response(self) -> None:
self.message = "Awaiting response"
self.add_class("-awaiting-response")
self.remove_class("-agent-responding")

def set_agent_responding(self) -> None:
self.message = "Agent is responding"
self.add_class("-agent-responding")
self.remove_class("-awaiting-response")
11 changes: 2 additions & 9 deletions elia_chat/widgets/app_header.py
@@ -1,7 +1,6 @@
from typing import TYPE_CHECKING, cast
from importlib.metadata import version
from rich.markup import escape
from rich.style import Style
from textual.app import ComposeResult
from textual.containers import Horizontal, Vertical
from textual.signal import Signal
@@ -40,17 +39,11 @@ def on_config_change(config: RuntimeConfig) -> None:
self.config_signal.subscribe(self, on_config_change)

def compose(self) -> ComposeResult:
title_style = self.get_component_rich_style("app-title")
subtitle_style = self.get_component_rich_style("app-subtitle")

with Horizontal():
with Vertical(id="cl-header-container"):
yield Label(
Text.assemble(
("elia ", title_style + Style(bold=True)),
("///", subtitle_style),
(f" {version('elia_chat')}", title_style),
)
Text("Elia") + Text(" v" + version("elia-chat"), style="dim"),
id="elia-title",
)
model_name_or_id = (
self.elia.runtime_config.selected_model.id
84 changes: 61 additions & 23 deletions elia_chat/widgets/chat.py
@@ -4,6 +4,8 @@
from dataclasses import dataclass
from typing import TYPE_CHECKING, cast

from textual.widgets import Label

from elia_chat import constants
from textual import log, on, work, events
from textual.app import ComposeResult
@@ -17,8 +19,8 @@
from elia_chat.chats_manager import ChatsManager
from elia_chat.models import ChatData, ChatMessage
from elia_chat.screens.chat_details import ChatDetails
from elia_chat.widgets.agent_is_typing import AgentIsTyping
from elia_chat.widgets.chat_header import ChatHeader
from elia_chat.widgets.agent_is_typing import ResponseStatus
from elia_chat.widgets.chat_header import ChatHeader, TitleStatic
from elia_chat.widgets.prompt_input import PromptInput
from elia_chat.widgets.chatbox import Chatbox

@@ -37,6 +39,7 @@ class ChatPromptInput(PromptInput):

class Chat(Widget):
BINDINGS = [
Binding("ctrl+r", "rename", "Rename", key_display="^r"),
Binding("shift+down", "scroll_container_down", show=False),
Binding("shift+up", "scroll_container_up", show=False),
Binding(
@@ -82,31 +85,24 @@ class AgentResponseFailed(Message):
last_message: ChatMessage

@dataclass
class FirstMessageSent(Message):
chat_data: ChatData
class NewUserMessage(Message):
content: str

def compose(self) -> ComposeResult:
yield ResponseStatus()
yield ChatHeader(chat=self.chat_data, model=self.model)

with VerticalScroll(id="chat-container") as vertical_scroll:
vertical_scroll.can_focus = False

yield ChatPromptInput(id="prompt")
yield AgentIsTyping()

async def on_mount(self, _: events.Mount) -> None:
"""
When the component is mounted, we need to check if there is a new chat to start
"""
await self.load_chat(self.chat_data)

# TODO - The code below shouldn't be required.
# Seems like a Textual bug.
self.set_timer(
0.05,
callback=lambda: self.chat_container.scroll_end(animate=False, force=True),
)

@property
def chat_container(self) -> VerticalScroll:
return self.query_one("#chat-container", VerticalScroll)
@@ -130,7 +126,7 @@ def restore_state_on_agent_failure(self, event: Chat.AgentResponseFailed) -> Non
async def new_user_message(self, content: str) -> None:
log.debug(f"User message submitted in chat {self.chat_data.id!r}: {content!r}")

now_utc = datetime.datetime.now(datetime.UTC)
now_utc = datetime.datetime.now(datetime.timezone.utc)
user_message: ChatCompletionUserMessageParam = {
"content": content,
"role": "user",
@@ -145,15 +141,19 @@ async def new_user_message(self, content: str) -> None:
), "Textual has mounted container at this point in the lifecycle."

await self.chat_container.mount(user_message_chatbox)

self.scroll_to_latest_message()
self.post_message(self.NewUserMessage(content))

await ChatsManager.add_message_to_chat(
chat_id=self.chat_data.id, message=user_chat_message
)

prompt = self.query_one(ChatPromptInput)
prompt.submit_ready = False
self.stream_agent_response()

@work
@work(thread=True, group="agent_response")
async def stream_agent_response(self) -> None:
model = self.chat_data.model
log.debug(f"Creating streaming response with model {model.name!r}")
@@ -193,14 +193,17 @@ async def stream_agent_response(self) -> None:
"content": "",
"role": "assistant",
}
now = datetime.datetime.now(datetime.UTC)
message = ChatMessage(message=ai_message, model=model, timestamp=now)
now = datetime.datetime.now(datetime.timezone.utc)

message = ChatMessage(message=ai_message, model=model, timestamp=now)
response_chatbox = Chatbox(
message=message,
model=self.chat_data.model,
classes="response-in-progress",
)
self.post_message(self.AgentResponseStarted())
self.app.call_from_thread(self.chat_container.mount, response_chatbox)

assert (
self.chat_container is not None
), "Textual has mounted container at this point in the lifecycle."
@@ -211,20 +214,20 @@ async def stream_agent_response(self) -> None:
chunk = cast(ModelResponse, chunk)
response_chatbox.border_title = "Agent is responding..."

if chunk_count == 0:
self.post_message(self.AgentResponseStarted())
await self.chat_container.mount(response_chatbox)

chunk_content = chunk.choices[0].delta.content
if isinstance(chunk_content, str):
response_chatbox.append_chunk(chunk_content)
self.app.call_from_thread(
response_chatbox.append_chunk, chunk_content
)
else:
break

scroll_y = self.chat_container.scroll_y
max_scroll_y = self.chat_container.max_scroll_y
if scroll_y in range(max_scroll_y - 3, max_scroll_y + 1):
self.chat_container.scroll_end(animate=False)
self.app.call_from_thread(
self.chat_container.scroll_end, animate=False
)

chunk_count += 1
except Exception:
@@ -245,12 +248,27 @@ async def stream_agent_response(self) -> None:
)
)

@on(AgentResponseFailed)
@on(AgentResponseStarted)
async def agent_started_responding(
self, event: AgentResponseFailed | AgentResponseStarted
) -> None:
try:
awaiting_reply = self.chat_container.query_one("#awaiting-reply", Label)
except NoMatches:
pass
else:
if awaiting_reply:
await awaiting_reply.remove()

@on(AgentResponseComplete)
def agent_finished_responding(self, event: AgentResponseComplete) -> None:
# Ensure the thread is updated with the message from the agent
self.chat_data.messages.append(event.message)
event.chatbox.border_title = "Agent"
event.chatbox.remove_class("response-in-progress")
prompt = self.query_one(ChatPromptInput)
prompt.submit_ready = True

@on(PromptInput.PromptSubmitted)
async def user_chat_message_submitted(
@@ -261,9 +279,23 @@ async def user_chat_message_submitted(
await self.new_user_message(user_message)

@on(PromptInput.CursorEscapingTop)
async def on_cursor_up_from_prompt(self) -> None:
async def on_cursor_up_from_prompt(
self, event: PromptInput.CursorEscapingTop
) -> None:
self.focus_latest_message()

@on(Chatbox.CursorEscapingBottom)
def move_focus_to_prompt(self) -> None:
self.query_one(ChatPromptInput).focus()

@on(TitleStatic.ChatRenamed)
async def handle_chat_rename(self, event: TitleStatic.ChatRenamed) -> None:
if event.chat_id == self.chat_data.id and event.new_title:
self.chat_data.title = event.new_title
header = self.query_one(ChatHeader)
header.update_header(self.chat_data, self.model)
await ChatsManager.rename_chat(event.chat_id, event.new_title)

def get_latest_chatbox(self) -> Chatbox:
return self.query(Chatbox).last()

@@ -273,6 +305,10 @@ def focus_latest_message(self) -> None:
except NoMatches:
pass

def action_rename(self) -> None:
title_static = self.query_one(TitleStatic)
title_static.begin_rename()

def action_focus_latest_message(self) -> None:
self.focus_latest_message()

@@ -309,6 +345,8 @@ async def load_chat(self, chat_data: ChatData) -> None:
# If the last message didn't receive a response, try again.
messages = chat_data.messages
if messages and messages[-1].message["role"] == "user":
prompt = self.query_one(ChatPromptInput)
prompt.submit_ready = False
self.stream_agent_response()

def action_close(self) -> None:
50 changes: 48 additions & 2 deletions elia_chat/widgets/chat_header.py
@@ -1,13 +1,58 @@
from __future__ import annotations
from dataclasses import dataclass

from rich.console import ConsoleRenderable, RichCast
from rich.markup import escape

from textual.app import ComposeResult
from textual.message import Message
from textual.widget import Widget
from textual.widgets import Static

from elia_chat.config import EliaChatModel
from elia_chat.models import ChatData
from elia_chat.screens.rename_chat_screen import RenameChat


class TitleStatic(Static):
@dataclass
class ChatRenamed(Message):
chat_id: int
new_title: str

def __init__(
self,
chat_id: int,
renderable: ConsoleRenderable | RichCast | str = "",
*,
expand: bool = False,
shrink: bool = False,
markup: bool = True,
name: str | None = None,
id: str | None = None,
classes: str | None = None,
disabled: bool = False,
) -> None:
super().__init__(
renderable,
expand=expand,
shrink=shrink,
markup=markup,
name=name,
id=id,
classes=classes,
disabled=disabled,
)
self.chat_id = chat_id

def begin_rename(self) -> None:
self.app.push_screen(RenameChat(), callback=self.request_chat_rename)

def action_rename_chat(self) -> None:
self.begin_rename()

async def request_chat_rename(self, new_title: str) -> None:
self.post_message(self.ChatRenamed(self.chat_id, new_title))


class ChatHeader(Widget):
@@ -36,12 +81,13 @@ def update_header(self, chat: ChatData, model: EliaChatModel):

def title_static_content(self) -> str:
chat = self.chat
return escape(chat.short_preview) if chat else "Empty chat"
content = escape(chat.title or chat.short_preview) if chat else "Empty chat"
return f"[@click=rename_chat]{content}[/]"

def model_static_content(self) -> str:
model = self.model
return escape(model.display_name or model.name) if model else "Unknown model"

def compose(self) -> ComposeResult:
yield Static(self.title_static_content(), id="title-static")
yield TitleStatic(self.chat.id, self.title_static_content(), id="title-static")
yield Static(self.model_static_content(), id="model-static")
67 changes: 61 additions & 6 deletions elia_chat/widgets/chat_list.py
@@ -2,6 +2,7 @@

import datetime
from dataclasses import dataclass
from typing import Self, cast

import humanize
from rich.console import RenderResult, Console, ConsoleOptions
@@ -10,6 +11,7 @@
from rich.text import Text
from textual import events, log, on
from textual.binding import Binding
from textual.geometry import Region
from textual.message import Message
from textual.widgets import OptionList
from textual.widgets.option_list import Option
@@ -27,7 +29,7 @@ class ChatListItemRenderable:
def __rich_console__(
self, console: Console, options: ConsoleOptions
) -> RenderResult:
now = datetime.datetime.now(datetime.UTC)
now = datetime.datetime.now(datetime.timezone.utc)
delta = now - self.chat.update_time
time_ago = humanize.naturaltime(delta)
time_ago_text = Text(time_ago, style="dim i")
@@ -57,13 +59,25 @@ def __init__(self, chat: ChatData, config: LaunchConfig) -> None:
class ChatList(OptionList):
BINDINGS = [
Binding(
"escape", "screen.focus('home-prompt')", "Focus prompt", key_display="esc"
"escape",
"app.focus('home-prompt')",
"Focus prompt",
key_display="esc",
tooltip="Return focus to the prompt input.",
),
Binding(
"a",
"archive_chat",
"Archive chat",
key_display="a",
tooltip="Archive the highlighted chat"
" (without deleting it from Elia's database).",
),
Binding("j,down", "cursor_down", "Down", show=False),
Binding("k,up", "cursor_up", "Up", show=False),
Binding("G,end", "last", "Last", show=False),
Binding("l,enter", "select", "Select", show=False),
Binding("l,right,enter", "select", "Select", show=False),
Binding("g,home", "first", "First", show=False),
Binding("G,end", "last", "Last", show=False),
Binding("pagedown", "page_down", "Page Down", show=False),
Binding("pageup", "page_up", "Page Up", show=False),
]
@@ -72,6 +86,12 @@ class ChatList(OptionList):
class ChatOpened(Message):
chat: ChatData

class CursorEscapingTop(Message):
"""Cursor attempting to move out-of-bounds at top of list."""

class CursorEscapingBottom(Message):
"""Cursor attempting to move out-of-bounds at bottom of list."""

async def on_mount(self) -> None:
await self.reload_and_refresh()

@@ -86,7 +106,7 @@ async def post_chat_opened(self, event: OptionList.OptionSelected) -> None:
@on(events.Focus)
def show_border_subtitle(self) -> None:
if self.highlighted is not None:
self.border_subtitle = "[[white]Enter[/]] Open chat"
self.border_subtitle = self.get_border_subtitle()
elif self.option_count > 0:
self.highlighted = 0

@@ -104,12 +124,14 @@ async def reload_and_refresh(self, new_highlighted: int = -1) -> None:
old_highlighted = self.highlighted
self.clear_options()
self.add_options(self.options)
self.border_title = f"History ({len(self.options)})"
self.border_title = self.get_border_title()
if new_highlighted > -1:
self.highlighted = new_highlighted
else:
self.highlighted = old_highlighted

self.refresh()

async def load_chat_list_items(self) -> list[ChatListItem]:
chats = await self.load_chats()
return [ChatListItem(chat, self.app.launch_config) for chat in chats]
@@ -118,6 +140,33 @@ async def load_chats(self) -> list[ChatData]:
all_chats = await ChatsManager.all_chats()
return all_chats

async def action_archive_chat(self) -> None:
if self.highlighted is None:
return

item = cast(ChatListItem, self.get_option_at_index(self.highlighted))
self.options.pop(self.highlighted)
self.remove_option_at_index(self.highlighted)

chat_id = item.chat.id
await ChatsManager.archive_chat(chat_id)

self.border_title = self.get_border_title()
self.border_subtitle = self.get_border_subtitle()
self.app.notify(
item.chat.title or f"Chat [b]{chat_id!r}[/] archived.",
title="Chat archived",
)
self.refresh()

def get_border_title(self) -> str:
return f"History ({len(self.options)})"

def get_border_subtitle(self) -> str:
if self.highlighted is None:
return ""
return f"{self.highlighted + 1} / {self.option_count}"

def create_chat(self, chat_data: ChatData) -> None:
new_chat_list_item = ChatListItem(chat_data, self.app.launch_config)
log.debug(f"Creating new chat {new_chat_list_item!r}")
@@ -131,3 +180,9 @@ def create_chat(self, chat_data: ChatData) -> None:
option_list.add_options(self.options)
option_list.highlighted = 0
self.refresh()

def action_cursor_up(self) -> None:
if self.highlighted == 0:
self.post_message(self.CursorEscapingTop())
else:
return super().action_cursor_up()
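The overridden `action_cursor_up` above lets the cursor "escape" the top of the chat list, handing focus to the prompt input instead of stopping at index 0. A sketch of that branch as a pure function (return values are illustrative labels, not Textual messages):

```python
def cursor_up_outcome(highlighted: int) -> str:
    """At the top of the list the cursor escapes to the prompt;
    otherwise it moves up one option as usual."""
    if highlighted == 0:
        return "escape-to-prompt"
    return "move-up"


assert cursor_up_outcome(0) == "escape-to-prompt"
assert cursor_up_outcome(3) == "move-up"
```

In the real widget the first branch posts `CursorEscapingTop`, which `HomeScreen.cursor_escaping_top` handles by focusing `HomePromptInput`.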
9 changes: 5 additions & 4 deletions elia_chat/widgets/chat_options.py
@@ -12,7 +12,7 @@
from textual.widgets import Footer, RadioSet, RadioButton, Static, TextArea

from elia_chat.config import EliaChatModel
from elia_chat.locations import config_file
from elia_chat.locations import config_file, theme_directory
from elia_chat.runtime_config import RuntimeConfig
from elia_chat.database.database import sqlite_file_name

@@ -69,7 +69,7 @@ def compose(self) -> ComposeResult:
selected_model = self.runtime_config.selected_model
models_rs.border_title = "Available Models"
for model in self.elia.launch_config.all_models:
label = f"[dim]{escape(model.display_name or model.name)}"
label = f"{escape(model.display_name or model.name)}"
provider = model.provider
if provider:
label += f" [i]by[/] {provider}"
@@ -91,7 +91,8 @@ def compose(self) -> ComposeResult:
with Vertical(id="xdg-info") as xdg_info:
xdg_info.border_title = "More Information"
yield Static(f"{sqlite_file_name.absolute()}\n[dim]Database[/]\n")
yield Static(f"{config_file()}\n[dim]Config[/]")
yield Static(f"{config_file()}\n[dim]Config[/]\n")
yield Static(f"{theme_directory()}\n[dim]Themes directory[/]")
# TODO - yield and dock a label to the bottom explaining
# that the changes made here only apply to the current session
# We can probably do better when it comes to system prompts.
@@ -140,4 +141,4 @@ def apply_overridden_subtitles(
if system_prompt_ta.text != self.elia.launch_config.system_prompt:
system_prompt_ta.border_subtitle = "overrides config"
else:
system_prompt_ta.border_subtitle = ""
system_prompt_ta.border_subtitle = "editable"
140 changes: 119 additions & 21 deletions elia_chat/widgets/chatbox.py
@@ -2,6 +2,7 @@
import bisect
from dataclasses import dataclass

from rich.cells import cell_len
from rich.console import RenderableType
from rich.markdown import Markdown
from rich.syntax import Syntax
@@ -43,9 +44,47 @@ class VisualModeToggled(Message):
description="Toggle visual select",
key_display="v",
),
Binding("up,k", "cursor_up", "Cursor Up", show=False),
Binding("down,j", "cursor_down", "Cursor Down", show=False),
Binding("right,l", "cursor_right", "Cursor Right", show=False),
Binding("left,h", "cursor_left", "Cursor Left", show=False),
Binding("shift+up,K", "cursor_up(True)", "cursor up select", show=False),
Binding("shift+down,J", "cursor_down(True)", "cursor down select", show=False),
Binding("shift+left,H", "cursor_left(True)", "cursor left select", show=False),
Binding(
"y,c", "copy_to_clipboard", description="Copy selection", key_display="y"
"shift+right,L", "cursor_right(True)", "cursor right select", show=False
),
Binding("ctrl+left,b", "cursor_word_left", "cursor word left", show=False),
Binding("ctrl+right,w", "cursor_word_right", "cursor word right", show=False),
Binding(
"home,ctrl+a,0,^", "cursor_line_start", "cursor line start", show=False
),
Binding("end,ctrl+e,$", "cursor_line_end", "cursor line end", show=False),
Binding("pageup,ctrl+b", "cursor_page_up", "cursor page up", show=False),
Binding("pagedown,ctrl+f", "cursor_page_down", "cursor page down", show=False),
Binding("ctrl+d", "cursor_half_page_down", "cursor half page down", show=False),
Binding("ctrl+u", "cursor_half_page_up", "cursor half page up", show=False),
Binding(
"ctrl+shift+left,B",
"cursor_word_left(True)",
"cursor left word select",
show=False,
),
Binding(
"ctrl+shift+right,W",
"cursor_word_right(True)",
"cursor right word select",
show=False,
),
Binding("f6,V", "select_line", "select line", show=False),
Binding(
"y,c",
"copy_to_clipboard",
description="Copy selection",
show=False,
),
Binding("g", "cursor_top", "Go to top", show=False),
Binding("G", "cursor_bottom", "Go to bottom", show=False),
Binding("u", "next_code_block", description="Next code block", key_display="u"),
]

@@ -87,17 +126,37 @@ def action_cursor_word_left(self, select: bool = False) -> None:
def action_cursor_word_right(self, select: bool = False) -> None:
return super().action_cursor_word_right(self.visual_mode or select)

def action_cursor_top(self) -> None:
self.selection = Selection.cursor((0, 0))

def action_cursor_bottom(self) -> None:
self.selection = Selection.cursor((self.document.line_count - 1, 0))

def action_copy_to_clipboard(self) -> None:
text_to_copy = self.selected_text

if text_to_copy:
message = f"Copied {len(text_to_copy)} selected characters to clipboard."
self.notify(message, title="Selection copied")
message = f"Copied {len(text_to_copy)} characters to clipboard."
title = "Selection copied"
else:
text_to_copy = self.text
message = f"Copied message ({len(text_to_copy)} characters)."
self.notify(message, title="Message copied")
title = "Message copied"

try:
import pyperclip

pyperclip.copy(text_to_copy)
except pyperclip.PyperclipException as exc:
self.notify(
str(exc),
title="Clipboard error",
severity="error",
timeout=10,
)
else:
self.notify(message, title=title)

self.app.copy_to_clipboard(text_to_copy)
self.visual_mode = False
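The clipboard handling above degrades gracefully when pyperclip cannot reach a system clipboard (for example, a headless Linux machine without xclip or xsel installed). A minimal sketch of that pattern follows; the function name and tuple return are assumptions for illustration, not elia's API.

```python
# Sketch of the clipboard fallback pattern: pyperclip raises
# PyperclipException when no clipboard mechanism is available, so the
# copy attempt is wrapped and the outcome reported instead of crashing.
def copy_text(text: str) -> tuple[bool, str]:
    """Try to copy `text`; return (success, detail message)."""
    try:
        import pyperclip  # optional dependency; may be missing entirely
        pyperclip.copy(text)
    except Exception as exc:  # pyperclip.PyperclipException in practice
        return False, str(exc)
    return True, f"Copied {len(text)} characters to clipboard."
```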

def action_next_code_block(self) -> None:
@@ -112,12 +171,14 @@ def action_next_code_block(self) -> None:
)
else:
if query:
self.visual_mode = True
code_block_nodes = self.document.query_syntax_tree(query)
locations: list[tuple[tuple[int, int], tuple[int, int]]] = [
(node.start_point, node.end_point)
for (node, _name) in code_block_nodes
]
if not locations:
return
self.visual_mode = True
end_locations = [end for _start, end in locations]
cursor_row, _cursor_column = self.cursor_location
search_start_location = cursor_row + 1, 0
@@ -131,6 +192,28 @@ def action_next_code_block(self) -> None:
def action_leave_selection_mode(self) -> None:
self.post_message(self.LeaveSelectionMode())

def action_cursor_half_page_down(self) -> None:
"""Move the cursor and scroll down half of a page."""
half_height = self.content_size.height // 2
_, cursor_location = self.selection
target = self.navigator.get_location_at_y_offset(
cursor_location,
half_height,
)
self.scroll_relative(y=half_height, animate=False)
self.move_cursor(target)

def action_cursor_half_page_up(self) -> None:
"""Move the cursor and scroll down half of a page."""
half_height = self.content_size.height // 2
_, cursor_location = self.selection
target = self.navigator.get_location_at_y_offset(
cursor_location,
-half_height,
)
self.scroll_relative(y=-half_height, animate=False)
self.move_cursor(target)
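The two half-page actions above derive the cursor's target from the viewport height. A rows-only sketch of that arithmetic (it ignores soft-wrapped lines, which the real `navigator.get_location_at_y_offset` accounts for):

```python
# Rows-only sketch of half-page cursor movement: move the cursor by half
# the viewport height and clamp to the document bounds. Ignores line
# wrapping, which the real TextArea navigator handles.
def half_page_target(cursor_row: int, viewport_height: int,
                     line_count: int, down: bool = True) -> int:
    half = viewport_height // 2
    target = cursor_row + half if down else cursor_row - half
    return max(0, min(target, line_count - 1))
```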


class Chatbox(Widget, can_focus=True):
BINDINGS = [
@@ -151,6 +234,9 @@ class Chatbox(Widget, can_focus=True):
),
]

class CursorEscapingBottom(Message):
"""Sent when the cursor moves down from the bottom message."""

selection_mode = reactive(False, init=False)

def __init__(
@@ -185,7 +271,10 @@ def action_up(self) -> None:
self.screen.focus_previous(Chatbox)

def action_down(self) -> None:
self.screen.focus_next(Chatbox)
if self.parent and self is self.parent.children[-1]:
self.post_message(self.CursorEscapingBottom())
else:
self.screen.focus_next(Chatbox)

def action_select(self) -> None:
self.selection_mode = not self.selection_mode
@@ -195,9 +284,20 @@ def action_copy_to_clipboard(self) -> None:
if not self.selection_mode:
text_to_copy = self.message.message.get("content")
if isinstance(text_to_copy, str):
self.app.copy_to_clipboard(text_to_copy)
message = f"Copied message ({len(text_to_copy)} characters)."
self.notify(message, title="Message copied")
try:
import pyperclip

pyperclip.copy(text_to_copy)
except pyperclip.PyperclipException as exc:
self.notify(
str(exc),
title="Clipboard error",
severity="error",
timeout=10,
)
else:
message = f"Copied message ({len(text_to_copy)} characters)."
self.notify(message, title="Message copied")
else:
message = "Unable to copy message"
self.notify(message, title="Clipboard error", severity="error")
@@ -252,7 +352,8 @@ def markdown(self) -> Markdown:
content = self.message.message.get("content")
if not isinstance(content, str):
content = ""
return Markdown(content)

return Markdown(content, code_theme=self.app.launch_config.message_code_theme)

def render(self) -> RenderableType:
if self.selection_mode:
@@ -261,28 +362,25 @@ def render(self) -> RenderableType:
return ""

message = self.message.message
theme = self.app.theme_object
if theme:
background_color = theme.background
else:
background_color = "#121212"

if message["role"] == "user":
content = message["content"] or ""
if isinstance(content, str):
return Syntax(
content,
lexer="markdown",
word_wrap=True,
background_color="#121212",
background_color=background_color,
)
else:
return ""
return self.markdown

def get_content_width(self, container: Size, viewport: Size) -> int:
# Naive approach. Can sometimes look strange, but works well enough.
content = self.message.message.get("content")
if isinstance(content, str):
content_width = min(len(content), container.width)
else:
content_width = 10 # Arbitrary
return content_width

def append_chunk(self, chunk: str) -> None:
"""Append a chunk of text to the end of the message."""
content = self.message.message.get("content")
22 changes: 18 additions & 4 deletions elia_chat/widgets/prompt_input.py
@@ -1,6 +1,7 @@
from dataclasses import dataclass
from textual import events, on
from textual.binding import Binding
from textual.reactive import reactive
from textual.widgets import TextArea
from textual.message import Message

@@ -19,7 +20,11 @@ class CursorEscapingTop(Message):
class CursorEscapingBottom(Message):
pass

BINDINGS = [Binding("ctrl+j", "submit_prompt", "Send message", key_display="^j")]
BINDINGS = [
Binding("ctrl+j,alt+enter", "submit_prompt", "Send message", key_display="^j")
]

submit_ready = reactive(True)

def __init__(
self,
@@ -36,22 +41,24 @@ def on_key(self, event: events.Key) -> None:
if self.cursor_location == (0, 0) and event.key == "up":
event.prevent_default()
self.post_message(self.CursorEscapingTop())
event.stop()
elif self.cursor_at_end_of_text and event.key == "down":
event.prevent_default()
self.post_message(self.CursorEscapingBottom())
event.stop()
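The `cursor_at_end_of_text` check gating the downward escape above can be sketched as follows (assumed logic for illustration; the real property lives on Textual's `TextArea`):

```python
# Assumed sketch of TextArea.cursor_at_end_of_text: true only when the
# cursor sits after the last character of the last line.
def cursor_at_end(cursor: tuple[int, int], lines: list[str]) -> bool:
    row, col = cursor
    return row == len(lines) - 1 and col == len(lines[row])
```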

def watch_submit_ready(self, submit_ready: bool) -> None:
self.set_class(not submit_ready, "-submit-blocked")

def on_mount(self):
self.border_title = "Enter your [u]m[/]essage..."
self.submit_ready = False

@on(TextArea.Changed)
async def prompt_changed(self, event: TextArea.Changed) -> None:
text_area = event.text_area
if text_area.text.strip() != "":
self.submit_ready = True
text_area.border_subtitle = "[[white]^j[/]] Send message"
else:
self.submit_ready = False
text_area.border_subtitle = None

text_area.set_class(text_area.wrapped_document.height > 1, "multiline")
@@ -63,7 +70,14 @@ async def prompt_changed(self, event: TextArea.Changed) -> None:
self.parent.refresh()

def action_submit_prompt(self) -> None:
if self.text.strip() == "":
self.notify("Cannot send empty message!")
return

if self.submit_ready:
message = self.PromptSubmitted(self.text, prompt_input=self)
self.clear()
self.post_message(message)
else:
self.app.bell()
self.notify("Please wait for response to complete.")
38 changes: 38 additions & 0 deletions elia_chat/widgets/welcome.py
@@ -0,0 +1,38 @@
"""Show a welcome box on the home page when the user has
no chat history.
"""

from rich.console import RenderableType
from textual.widgets import Static


class Welcome(Static):
MESSAGE = """
To get started, type a message in the box at the top of the
screen and press [b u]ctrl+j[/] or [b u]alt+enter[/] to send it.
Change the model and system prompt by pressing [b u]ctrl+o[/].
Make sure you've set any required API keys first (e.g. [b]OPENAI_API_KEY[/])!
If you have any issues or feedback, please let me know [@click='open_issues'][b r]on GitHub[/][/]!
Finally, please consider starring the repo and sharing it with your friends and colleagues!
[@click='open_repo'][b r]https://github.com/darrenburns/elia[/][/]
"""

BORDER_TITLE = "Welcome to Elia!"

def render(self) -> RenderableType:
return self.MESSAGE

def _action_open_repo(self) -> None:
import webbrowser

webbrowser.open("https://github.com/darrenburns/elia")

def _action_open_issues(self) -> None:
import webbrowser

webbrowser.open("https://github.com/darrenburns/elia/issues")
15 changes: 10 additions & 5 deletions pyproject.toml
@@ -1,21 +1,23 @@
[project]
name = "elia_chat"
version = "1.3.0"
version = "1.10.0"
description = "A powerful terminal user interface for interacting with large language models."
authors = [
{ name = "Darren Burns", email = "darrenb900@gmail.com" }
]
dependencies = [
"textual[syntax]==0.58.1",
"textual[syntax]==0.79.1",
"sqlmodel>=0.0.9",
"humanize>=4.6.0",
"click>=8.1.6",
"xdg-base-dirs>=6.0.1",
"pydantic-settings>=2.2.1",
"aiosqlite>=0.20.0",
"click-default-group>=1.2.4",
"litellm>=1.35.38",
"greenlet>=3.0.3",
"google-generativeai>=0.5.3",
"pyperclip>=1.8.2",
"litellm>=1.37.19",
"pydantic>=2.9.0",
]
readme = "README.md"
requires-python = ">= 3.11"
@@ -27,7 +29,7 @@ elia = "elia_chat.__main__:cli"
requires = ["hatchling"]
build-backend = "hatchling.build"

[tool.rye]
[tool.uv]
managed = true
dev-dependencies = [
"black>=23.3.0",
@@ -38,6 +40,9 @@ dev-dependencies = [
"pyinstrument>=4.6.2",
]

[tool.uv.sources]
textual = { path = "../textual", editable = true }

[tool.mypy]
ignore_missing_imports = true

187 changes: 0 additions & 187 deletions requirements-dev.lock

This file was deleted.

150 changes: 0 additions & 150 deletions requirements.lock

This file was deleted.

1,745 changes: 1,745 additions & 0 deletions uv.lock

Large diffs are not rendered by default.