init commit
chamalis committed Oct 1, 2023
0 parents commit 8ca76df
Showing 46 changed files with 1,712 additions and 0 deletions.
57 changes: 57 additions & 0 deletions Dockerfile
@@ -0,0 +1,57 @@
FROM python:3.10-slim-buster

EXPOSE 9999

WORKDIR /app

CMD ["./scripts/run-prod.sh"]
VOLUME ["/app/docs/"]

RUN adduser --uid 1000 --disabled-password --gecos '' --home /home/deploy deploy

RUN apt-get update && apt-get install -y --no-install-recommends \
build-essential \
libpq-dev \
python3-pip \
netcat \
postgresql-client && \
rm -rf /var/lib/apt/lists/*

# Install third-party dependencies in an earlier layer, since they change less often
COPY requirements/ /app/requirements/
RUN pip install --no-cache-dir -r /app/requirements/dev.txt

COPY . /app
ENV PYTHONPATH=/app
RUN chown -R deploy:deploy /app

COPY docs/ /app/docs/
# RUN make -C /app/docs/ html

# Run everything as user deploy inside the container
USER deploy



#FROM tiangolo/uvicorn-gunicorn:python3.11-slim
#
#LABEL maintainer="Stelios Barberakis <[email protected]>"
#
#WORKDIR /app
#
## Install 3rd party in deeper layer cause it changes less often
#COPY requirements/ /app/requirements
#RUN pip install --no-cache-dir -r /app/requirements/docker.txt
#
## tiangolo's image expects /app/main.py or /app/app/main.py
#COPY ./app /app
#
## #COPY ./scripts /app/scripts
#CMD ["/app/scripts/run-prod.sh"]
## CMD ["uvicorn", "app.main:app", "--host", "0.0.0.0", "--port", "8000"]



# ## MANUAL DOCKERFILE using custom run script run-prod.sh ## #

#FROM tiangolo/uvicorn-gunicorn-fastapi:python3.11
146 changes: 146 additions & 0 deletions README.md
@@ -0,0 +1,146 @@
# SPA

SPA is a simple movie REST API demonstrating minimal
usage of FastAPI with PostgreSQL, SQLAlchemy,
Docker, and Nginx.

It may serve as a template for FastAPI
projects.

It uses the datasets found here:

* https://datasets.imdbws.com/title.basics.tsv.gz
* https://datasets.imdbws.com/title.ratings.tsv.gz

and has no interface so far.

## Configuration file

Copy `env.example.native` or `env.example.docker` to `.env`
and fill it in according to your environment. The default values
will probably work, but it's recommended to verify them yourself.

To switch from host to Docker and vice versa,
`.env` needs to be changed accordingly each time.
In this version, the only parameter that should
need to change is

* `DB_HOST`
* For docker: `DB_HOST=spa_db`
* For host: `DB_HOST=127.0.0.1`
or the IP where the PostgreSQL server is running
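
For illustration, a minimal `.env` for the Docker setup might look like the following. The variable names mirror the interpolation keys used in `alembic.ini`; check `env.example.docker` for the authoritative list, and treat the password as a placeholder:

```
DB_HOST=spa_db
DB_PORT=5432
POSTGRES_DB=spa
POSTGRES_USER=spa
POSTGRES_PASSWORD=changeme
```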

## Docker deployment

Prerequisites:
* docker
* docker-compose

```commandline
docker-compose build && docker-compose up -d
```

The first time it runs, it will create the database
and load the data, which takes several
minutes. Please do not interrupt it during that stage.

## Native deployment (non-docker)

### Prerequisites
* Python3 (tested with 3.10.12)
* python3-virtualenv


### Python Virtual env setup

You can replace `venv` with any writable path:

```commandline
virtualenv -p python3 venv
source venv/bin/activate
```

### Update PYTHONPATH

Assuming you have changed into the `spa` folder:

```commandline
export PYTHONPATH=$PYTHONPATH:$(pwd)/app
```

### Install dependencies

```commandline
pip install -r requirements/dev.txt
```


### Download data

The following script downloads the relevant `.tsv.gz` files
into the `raw_data` directory, if they are not already there:

```bash
./scripts/prepare_data.sh
```

### Create the DB

This step requires the PostgreSQL server and client to be installed.

Using the same credentials you use in `.env`,
e.g. create the DB `spa` using the PostgreSQL user `spa`:

```commandline
createdb -U spa spa
```

### Init the DB

Create the schema:

```commandline
alembic upgrade head
```

WARNING: The following script (which fills in the DB) will take 5-15 minutes:

```bash
python ./scripts/populate_db.py
```

If there is (real) data in the DB already, it will fail with an IntegrityError
(due to primary key uniqueness).
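
The internals of `populate_db.py` are not shown here, so the following is only an illustration of the underlying constraint, not the script's actual code. Primary-key uniqueness is what makes a second run fail; `INSERT ... ON CONFLICT DO NOTHING` is one way to tolerate re-runs (demonstrated with the stdlib `sqlite3` for self-containment; PostgreSQL behaves the same for primary keys and supports the same clause):

```python
import sqlite3

# Hypothetical minimal movies table; the project's real schema lives in alembic.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE movies (imdb_id TEXT PRIMARY KEY, title TEXT, rating REAL)")
row = ("tt1255953", "Incendies", 8.3)

conn.execute("INSERT INTO movies VALUES (?, ?, ?)", row)

try:
    # Running the populate step twice re-inserts the same primary keys...
    conn.execute("INSERT INTO movies VALUES (?, ?, ?)", row)
except sqlite3.IntegrityError as e:
    print("second insert failed:", e)

# ...whereas ON CONFLICT DO NOTHING turns the duplicate into a no-op.
conn.execute("INSERT INTO movies VALUES (?, ?, ?) ON CONFLICT(imdb_id) DO NOTHING", row)
print(conn.execute("SELECT COUNT(*) FROM movies").fetchone()[0])  # 1
```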

### Run
```
uvicorn app.main:app --host 0.0.0.0 --port 8000
```

## Example Usage

For the GET requests, rather than curl, I'd suggest using Firefox,
since it has a built-in JSON viewer.

Here we assume the API is served on `127.0.0.1`, port `8000`.


```bash
# create via imdb_url
curl -i --header "Content-Type: application/json" -X POST -d '{"title": "Incendies", "imdb_url":"https://www.imdb.com/title/tt1255953/"}' http://127.0.0.1:8000/api/v1/movies

# create via imdb_id
curl -i --header "Content-Type: application/json" -X POST -d '{"title": "Incendies", "imdb_id":"tt1255953"}' http://127.0.0.1:8000/api/v1/movies

# get Mystery movies sorted by best rating (quote the URL so the shell
# does not interpret the & characters)
curl -i -X GET "http://127.0.0.1:8000/api/v1/movies?order=rating,desc&genre=Mystery"

# next page based on the given ordering
curl -i -X GET "http://127.0.0.1:8000/api/v1/movies?order=rating,desc&genre=Mystery&page=2"

# get Dramas with rating >= 8.6, ordered by title
curl -i -X GET "http://127.0.0.1:8000/api/v1/movies?rating=8.6&genre=Drama&order=title"

# specify page size and page number
curl -i -X GET "http://127.0.0.1:8000/api/v1/movies?page=4&size=66"
```
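
When scripting against the API from Python instead of the shell, the stdlib `urlencode` builds the query string safely, so `&` and other special characters never need shell quoting (endpoint and parameters as in the curl examples above):

```python
from urllib.parse import urlencode

# Build the "mystery sorted by best rating, page 2" query programmatically.
base = "http://127.0.0.1:8000/api/v1/movies"
params = {"order": "rating,desc", "genre": "Mystery", "page": 2}
url = f"{base}?{urlencode(params)}"
print(url)  # http://127.0.0.1:8000/api/v1/movies?order=rating%2Cdesc&genre=Mystery&page=2
```

The comma is percent-encoded as `%2C`, which FastAPI decodes back to `rating,desc` on the server side.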
111 changes: 111 additions & 0 deletions alembic.ini
@@ -0,0 +1,111 @@
# A generic, single database configuration.

[alembic]
# path to migration scripts
script_location = alembic

# template used to generate migration file names; The default value is %%(rev)s_%%(slug)s
# Uncomment the line below if you want the files to be prepended with date and time
# see https://alembic.sqlalchemy.org/en/latest/tutorial.html#editing-the-ini-file
# for all available tokens
# file_template = %%(year)d_%%(month).2d_%%(day).2d_%%(hour).2d%%(minute).2d-%%(rev)s_%%(slug)s

# sys.path path, will be prepended to sys.path if present.
# defaults to the current working directory.
prepend_sys_path = .

# timezone to use when rendering the date within the migration file
# as well as the filename.
# If specified, requires the python-dateutil library that can be
# installed by adding `alembic[tz]` to the pip requirements
# string value is passed to dateutil.tz.gettz()
# leave blank for localtime
# timezone =

# max length of characters to apply to the
# "slug" field
# truncate_slug_length = 40

# set to 'true' to run the environment during
# the 'revision' command, regardless of autogenerate
# revision_environment = false

# set to 'true' to allow .pyc and .pyo files without
# a source .py file to be detected as revisions in the
# versions/ directory
# sourceless = false

# version location specification; This defaults
# to alembic/versions. When using multiple version
# directories, initial revisions must be specified with --version-path.
# The path separator used here should be the separator specified by "version_path_separator" below.
# version_locations = %(here)s/bar:%(here)s/bat:alembic/versions

# version path separator; As mentioned above, this is the character used to split
# version_locations. The default within new alembic.ini files is "os", which uses os.pathsep.
# If this key is omitted entirely, it falls back to the legacy behavior of splitting on spaces and/or commas.
# Valid values for version_path_separator are:
#
# version_path_separator = :
# version_path_separator = ;
# version_path_separator = space
version_path_separator = os # Use os.pathsep. Default configuration used for new projects.

# set to 'true' to search source files recursively
# in each "version_locations" directory
# new in Alembic version 1.10
# recursive_version_locations = false

# the output encoding used when revision files
# are written from script.py.mako
# output_encoding = utf-8

# sqlalchemy.url = postgresql://user:pass@localhost/ecg_stelios
sqlalchemy.url = postgresql://%(POSTGRES_USER)s:%(POSTGRES_PASSWORD)s@%(DB_HOST)s:%(DB_PORT)s/%(POSTGRES_DB)s


[post_write_hooks]
# post_write_hooks defines scripts or Python functions that are run
# on newly generated revision scripts. See the documentation for further
# detail and examples

# format using "black" - use the console_scripts runner, against the "black" entrypoint
# hooks = black
# black.type = console_scripts
# black.entrypoint = black
# black.options = -l 79 REVISION_SCRIPT_FILENAME

# Logging configuration
[loggers]
keys = root,sqlalchemy,alembic

[handlers]
keys = console

[formatters]
keys = generic

[logger_root]
level = WARN
handlers = console
qualname =

[logger_sqlalchemy]
level = WARN
handlers =
qualname = sqlalchemy.engine

[logger_alembic]
level = INFO
handlers =
qualname = alembic

[handler_console]
class = StreamHandler
args = (sys.stderr,)
level = NOTSET
formatter = generic

[formatter_generic]
format = %(levelname)-5.5s [%(name)s] %(message)s
datefmt = %H:%M:%S
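
The `%(VAR)s` tokens in the `sqlalchemy.url` line above are standard `configparser` interpolation against the `[alembic]` section. At runtime those values are expected to be injected from `.env` (typically inside `alembic/env.py`, whose contents are not shown above); here they are inlined for a self-contained sketch of how the interpolation resolves:

```python
import configparser

# Inline the [alembic] section with the interpolation keys filled in
# (sample values; the real ones come from .env at runtime).
ini = """\
[alembic]
POSTGRES_USER = spa
POSTGRES_PASSWORD = secret
DB_HOST = 127.0.0.1
DB_PORT = 5432
POSTGRES_DB = spa
sqlalchemy.url = postgresql://%(POSTGRES_USER)s:%(POSTGRES_PASSWORD)s@%(DB_HOST)s:%(DB_PORT)s/%(POSTGRES_DB)s
"""
cp = configparser.ConfigParser()
cp.read_string(ini)
print(cp.get("alembic", "sqlalchemy.url"))
# → postgresql://spa:secret@127.0.0.1:5432/spa
```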
1 change: 1 addition & 0 deletions alembic/README
@@ -0,0 +1 @@
Generic single-database configuration.
