Generic DB connection and management #2

Open · wants to merge 4 commits into base: master
5 changes: 5 additions & 0 deletions .gitignore
@@ -7,3 +7,8 @@ __pycache__
metabase-db/

.DS_Store

build/
dist/

*.egg-info/
25 changes: 20 additions & 5 deletions Dockerfile
@@ -1,6 +1,21 @@
# builder image
FROM python:3-alpine as builder

COPY . /code/
WORKDIR /code

RUN python setup.py bdist_wheel --dist-dir=/tmp/dist/

# Final image
FROM python:3-alpine

ENV KANBANDASH_DATABASE_URL=

COPY --from=builder /tmp/dist/kanbandash*.whl /tmp/

RUN \
apk add --no-cache postgresql-libs && \
apk add --no-cache --virtual .build-deps gcc musl-dev postgresql-dev && \
pip install /tmp/kanbandash*.whl --no-cache-dir && \
apk --purge del .build-deps && \
rm -fR /tmp/kanbandash*.whl
2 changes: 2 additions & 0 deletions MANIFEST.in
@@ -0,0 +1,2 @@
include kanbandash/alembic.ini
include kanbandash/kanban-dashboards.json
141 changes: 133 additions & 8 deletions README.md
@@ -1,23 +1,127 @@
# metabase-kanban-dashboard

The goal of this project is to provide an open-source Kanban metrics dashboard for Metabase.

This is still at an early stage, and this is how the dashboard currently looks:

![Screen Shot 2020-09-07 at 22 19 59-fullpage](https://user-images.githubusercontent.com/33388/92423867-fbac2380-f158-11ea-9e07-7b5c5d83a9db.png)

## Getting started

Before getting started you will need to have a PostgreSQL server and a Metabase instance configured and running.

The tools included in this repository will create the questions and dashboards in your Metabase instance, as well as the database schema you will need.

Connecting to a project management tool to extract your data is out of scope for this project (at least for now).

Long story short, here is what you will need to do to get your Kanban dashboard up and running:

1. Create the tables required to store your Kanban data
1. Import the Metabase questions and dashboards
1. Process the data from your project management tool and insert it into the database

The next sections will guide you through each of these steps.

If you want to play around with the test Docker environment before getting your hands dirty, take a look at the section "[Testing with the container and the test data](#testing-with-the-container-and-the-test-data)".


### Installing

You can install the library by cloning this repository and running `pip install .` inside the repository directory.

Another option is to pull our Docker image using `docker pull cravefood/kanban-dash`.

In either case you will always have to set the env var `KANBANDASH_DATABASE_URL`. This is how the tool knows which database to connect to. The variable should look like this: `KANBANDASH_DATABASE_URL=postgresql://<db_user>:<db_password>@<db_host>:<db_port>/<db_name>`.
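As an illustration of the expected URL shape, the standard library can split it into its components before you hand it to the tool. This is only a sketch for sanity-checking your value; the `check_database_url` helper is hypothetical and not part of kanban-dash:

```python
from urllib.parse import urlsplit

def check_database_url(url: str) -> dict:
    """Split a KANBANDASH_DATABASE_URL-style URL into its components."""
    parts = urlsplit(url)
    if parts.scheme != "postgresql":
        raise ValueError("expected a postgresql:// URL")
    return {
        "user": parts.username,
        "password": parts.password,
        "host": parts.hostname,
        "port": parts.port,  # parsed as an int
        "database": parts.path.lstrip("/"),
    }

# Placeholder credentials, for illustration only:
info = check_database_url("postgresql://kanban:secret@localhost:5432/kanban_metrics")
print(info["host"], info["database"])
```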


### Creating the models

To create the models you will need to run the command:

```
KANBANDASH_DATABASE_URL=postgresql://<db_user>:<db_password>@<db_host>:<db_port>/<db_name> kanban-dash models --create
```

Or using the docker container:

```
docker run -e KANBANDASH_DATABASE_URL=postgresql://<db_user>:<db_password>@<db_host>:<db_port>/<db_name> cravefood/kanban-dash kanban-dash models --create
```


### Creating the questions and dashboards in Metabase

Before creating the data in Metabase you will need to configure your Metabase instance to access your Kanban database. You can do that by accessing Settings -> Admin -> Databases (top menu) -> Add database.

With your database configured you will need to create a new collection: Browse all items -> New collection.
After creating the collection you will have to access it in order to get the collection id (available in the collection URL).

Now we are ready to run the collection import script:

```
kanban-dash metabase --import \
--username=<your-user-name> \
--collection-id=<your-collection-id> \
--url=<your-metabase-url>
```

or using our docker container:

```
docker run -it cravefood/kanban-dash kanban-dash metabase --import \
--username=<your-user-name> \
--collection-id=<your-collection-id> \
--url=<your-metabase-url>
```


After running the script you should be able to access the collection and see the imported reports. Select "Kanban" in the dashboards tab to see the dashboard without any data.
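Under the hood, tools like this authenticate against Metabase's session API (`POST /api/session`) before creating cards and dashboards. The import script handles this for you; the sketch below only illustrates how such a login request is shaped, using the standard library and without sending anything. The `build_login_request` helper and the credentials are hypothetical:

```python
import json
from urllib.request import Request

def build_login_request(metabase_url: str, username: str, password: str) -> Request:
    """Build (but do not send) the POST /api/session request Metabase uses for login."""
    payload = json.dumps({"username": username, "password": password}).encode()
    return Request(
        metabase_url.rstrip("/") + "/api/session",
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# Illustrative values only; sending this request would return a session token.
req = build_login_request("http://localhost:3000", "admin@example.com", "secret")
print(req.full_url, req.method)
```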

If you want to insert some test data you can do that by running:

```
KANBANDASH_DATABASE_URL=postgresql://<db_user>:<db_password>@<db_host>:<db_port>/<db_name> kanban-dash generate-test-data
```

Or using the docker container:

```
docker run -e KANBANDASH_DATABASE_URL=postgresql://<db_user>:<db_password>@<db_host>:<db_port>/<db_name> cravefood/kanban-dash kanban-dash generate-test-data
```


Once you are done playing with the test data you can clean it up using the command:

```
KANBANDASH_DATABASE_URL=postgresql://<db_user>:<db_password>@<db_host>:<db_port>/<db_name> kanban-dash models --reset
```

Or using the docker container:

```
docker run -e KANBANDASH_DATABASE_URL=postgresql://<db_user>:<db_password>@<db_host>:<db_port>/<db_name> cravefood/kanban-dash kanban-dash models --reset
```

### Inserting real data

TODO
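While this section is still a TODO, the general idea is to transform an export from your project management tool into rows matching the schema created by `kanban-dash models --create`. The sketch below shows one way to shape that transformation; the export format and the column names (`title`, `status`, `date`) are hypothetical — the real schema is defined in kanbandash's models and may differ:

```python
import json
from datetime import date

# Hypothetical export from a project management tool.
RAW_EXPORT = """
[
  {"title": "Fix login bug", "column": "Done", "moved_at": "2020-09-01"},
  {"title": "Add CSV export", "column": "Doing", "moved_at": "2020-09-03"}
]
"""

def to_rows(raw: str) -> list:
    """Map exported cards to dicts ready for insertion into the Kanban tables."""
    rows = []
    for card in json.loads(raw):
        rows.append({
            "title": card["title"],
            "status": card["column"],
            "date": date.fromisoformat(card["moved_at"]),
        })
    return rows

rows = to_rows(RAW_EXPORT)
print(len(rows), rows[0]["status"])
```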


## Testing with the container and the test data

* Start the containers
```
$ docker-compose up
```

* Run the script to create the database schema:
```
docker-compose run kanban-dash kanban-dash models --create
```

* Run the script to populate the database with test data:
```
docker-compose run kanban-dash kanban-dash generate-test-data
```

* Access the local Metabase instance on http://localhost:3000, create a user and password and connect to the testing database (db `kanban_metrics`, hostname `postgres`, username `postgres`, no password).
@@ -26,9 +130,30 @@

* Run the script to import the Kanban dashboard to the new collection. Make sure to use the proper username and collection-id:
```
metabase-import-export \
--username=<your-user-name> \
import \
--collection-id=<your-collection-id> \
--import-file=kanbandash/kanban-dashboards.json
```

or using our docker container:

```
docker run cravefood/kanban-dash metabase-import-export \
--username=<your-user-name> \
import \
--collection-id=<your-collection-id> \
--import-file=kanbandash/kanban-dashboards.json
```


## Developing / Contributing

### Creating a schema migration

The database migrations are managed with Alembic. If you make a change in the models.py file, you will need to create a new database migration to reflect those changes. Using the docker-compose environment, you can do that with the following command:

```
docker-compose run kanban-dash alembic revision --autogenerate -m "<Your message here>"
```
7 changes: 4 additions & 3 deletions docker-compose.yml
@@ -19,11 +19,12 @@ services:
- MGID=${GID}
- MB_DB_FILE=/metabase.db

kanban-dash:
build: .
# The repo is just a few scripts; this is a hack to keep the container running
command: bash -c "sleep infinity"
depends_on:
- postgres
volumes:
- .:/code
working_dir: /code/kanbandash
environment:
- KANBANDASH_DATABASE_URL=postgresql://postgres@postgres/kanban_metrics
Empty file added kanbandash/__init__.py
Empty file.
86 changes: 86 additions & 0 deletions kanbandash/alembic.ini
@@ -0,0 +1,86 @@
# A generic, single database configuration.

[alembic]
# path to migration scripts
script_location = alembic

# template used to generate migration files
# file_template = %%(rev)s_%%(slug)s

# timezone to use when rendering the date
# within the migration file as well as the filename.
# string value is passed to dateutil.tz.gettz()
# leave blank for localtime
# timezone =

# max length of characters to apply to the
# "slug" field
# truncate_slug_length = 40

# set to 'true' to run the environment during
# the 'revision' command, regardless of autogenerate
# revision_environment = false

# set to 'true' to allow .pyc and .pyo files without
# a source .py file to be detected as revisions in the
# versions/ directory
# sourceless = false

# version location specification; this defaults
# to alembic/versions. When using multiple version
# directories, initial revisions must be specified with --version-path
# version_locations = %(here)s/bar %(here)s/bat alembic/versions

# the output encoding used when revision files
# are written from script.py.mako
# output_encoding = utf-8

# This will be overwritten in env.py
sqlalchemy.url = driver://user:pass@localhost/dbname


[post_write_hooks]
# post_write_hooks defines scripts or Python functions that are run
# on newly generated revision scripts. See the documentation for further
# detail and examples

# format using "black" - use the console_scripts runner, against the "black" entrypoint
# hooks=black
# black.type=console_scripts
# black.entrypoint=black
# black.options=-l 79

# Logging configuration
[loggers]
keys = root,sqlalchemy,alembic

[handlers]
keys = console

[formatters]
keys = generic

[logger_root]
level = WARN
handlers = console
qualname =

[logger_sqlalchemy]
level = WARN
handlers =
qualname = sqlalchemy.engine

[logger_alembic]
level = INFO
handlers =
qualname = alembic

[handler_console]
class = StreamHandler
args = (sys.stderr,)
level = NOTSET
formatter = generic

[formatter_generic]
format = %(levelname)-5.5s [%(name)s] %(message)s
datefmt = %H:%M:%S
1 change: 1 addition & 0 deletions kanbandash/alembic/README
@@ -0,0 +1 @@
Generic single-database configuration.
Empty file added kanbandash/alembic/__init__.py
Empty file.
81 changes: 81 additions & 0 deletions kanbandash/alembic/env.py
@@ -0,0 +1,81 @@
from logging.config import fileConfig

from sqlalchemy import engine_from_config
from sqlalchemy import pool

from alembic import context

from kanbandash import models
from kanbandash import settings

# this is the Alembic Config object, which provides
# access to the values within the .ini file in use.
config = context.config

# Interpret the config file for Python logging.
# This line sets up loggers basically.
fileConfig(config.config_file_name)

# add your model's MetaData object here
# for 'autogenerate' support
# from myapp import mymodel
# target_metadata = mymodel.Base.metadata
target_metadata = models.Base.metadata

# other values from the config, defined by the needs of env.py,
# can be acquired:
# my_important_option = config.get_main_option("my_important_option")
# ... etc.

# Replacing some settings from alembic.ini
config.set_main_option("sqlalchemy.url", settings.POSTGRES_DATABASE_URL)


def run_migrations_offline():
"""Run migrations in 'offline' mode.

This configures the context with just a URL
and not an Engine, though an Engine is acceptable
here as well. By skipping the Engine creation
we don't even need a DBAPI to be available.

Calls to context.execute() here emit the given string to the
script output.

"""
url = config.get_main_option("sqlalchemy.url")
context.configure(
url=url,
target_metadata=target_metadata,
literal_binds=True,
dialect_opts={"paramstyle": "named"},
)

with context.begin_transaction():
context.run_migrations()


def run_migrations_online():
"""Run migrations in 'online' mode.

In this scenario we need to create an Engine
and associate a connection with the context.

"""
connectable = engine_from_config(
config.get_section(config.config_ini_section),
prefix="sqlalchemy.",
poolclass=pool.NullPool,
)

with connectable.connect() as connection:
context.configure(connection=connection, target_metadata=target_metadata)

with context.begin_transaction():
context.run_migrations()


if context.is_offline_mode():
run_migrations_offline()
else:
run_migrations_online()