Init tests #369

Merged: 1 commit merged on Aug 9, 2019.
16 changes: 13 additions & 3 deletions .circleci/config.yml
@@ -6,6 +6,13 @@ jobs:
working_directory: ~/app
docker:
- image: circleci/python:3.7.3-node-browsers
- image: circleci/redis:5.0.5-alpine
- image: circleci/postgres:11.4-alpine
environment:
POSTGRES_USER: main
POSTGRES_DB: main
POSTGRES_PASSWORD: main

steps:
- checkout
- restore_cache:
@@ -22,9 +29,12 @@ jobs:
key: 'venv-{{ checksum "requirements/base.txt" }}-{{ checksum "requirements/testing.txt" }}-{{ checksum "requirements/development.txt" }}-{{ checksum "requirements/production.txt" }}'
paths:
- .venv
- run: |
source .venv/bin/activate
tox
- run:
environment:
POSTGRESQL_DSN: postgresql://main:[email protected]:5432/main
command: |
source .venv/bin/activate
tox
- save_cache:
key: 'tox-{{ checksum "requirements/base.txt" }}-{{ checksum "requirements/testing.txt" }}-{{ checksum "requirements/development.txt" }}-{{ checksum "requirements/production.txt" }}'
paths:
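The new `POSTGRESQL_DSN` environment variable mirrors the `postgres` service the CI job now starts alongside redis. As a rough, hypothetical sketch (the variable name comes from this config; the reading code and default below are illustrative, not taken from this repo), test settings could pick it up like this:

```python
# Hypothetical sketch: read the DSN injected by CircleCI (or docker-compose),
# falling back to the localhost DSN used in the CI job above.
import os

POSTGRESQL_DSN = os.environ.get(
    "POSTGRESQL_DSN", "postgresql://main:[email protected]:5432/main"
)
```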
24 changes: 15 additions & 9 deletions CONTRIBUTING.md
@@ -8,14 +8,28 @@ $ docker-compose up --build

This builds the docker container for you using the correct [python version](.python-version), installs the dependencies, and binds the ports. It also "volume mounts" your local directory into the container, meaning that any changes you make on your host machine are available inside the docker container. The exception is when a dependency is added or updated; in that case you'll need to run the above command again (basically just ctrl-c, up arrow, enter, and wait for the rebuild).

For testing, you can run:
## Testing

While you can let CircleCI run the tests/checks, running them locally only requires `docker` and `tox`:

```bash
# if you need to rebuild first, `docker-compose build`

$ docker-compose run web tox
```

Tox forwards positional arguments to pytest, so you can use all standard pytest arguments. For example, you can run a single test like this:

```bash
$ docker-compose run web tox -e py37 tests/test_website.py::test_endpoint_index
```

To run the black auto-formatter on the code you can use:

```bash
$ docker-compose run web tox -e autoformat
```

# The Involved Path

If you'd prefer to set up the project on the host machine instead, you are free to do so. This is a non-exhaustive primer on the steps required; if you need help, please ask in [#community_projects](slack://open?team=T07EFKXHR&id=C2FMLUBEU).
@@ -86,14 +100,6 @@ Now you should be good to run the application:

Once that launches you can visit [localhost:8000](http://localhost:8000) in your browser and be in business.

### 7. Testing

While you can allow for CircleCI to run tests/checks, running locally simply uses `tox`:

```bash
(.venv) $ tox
```

## Windows Systems

TODO: see [#330](https://github.com/pyslackers/website/issues/330)
6 changes: 5 additions & 1 deletion docker-compose.yml
@@ -6,6 +6,7 @@ services:
build: .
depends_on:
- redis
- postgresql
environment:
REDIS_URL: "redis://redis:6379/0"
POSTGRESQL_DSN: "postgresql://main:main@postgresql:5432/main"
@@ -15,7 +16,7 @@ services:
- "8000:8000"
volumes:
- "${PWD}:/app"
- /app/.tox
- tox-data:/app/.tox
command: gunicorn pyslackersweb:app_factory --access-logfile - --bind=0.0.0.0:8000 --worker-class=aiohttp.GunicornWebWorker --reload

redis:
@@ -26,3 +27,6 @@ services:
environment:
POSTGRES_PASSWORD: main
POSTGRES_USER: main

volumes:
tox-data: {}
5 changes: 4 additions & 1 deletion pyslackersweb/contexts.py
@@ -9,8 +9,11 @@
async def apscheduler(app: web.Application) -> AsyncGenerator[None, None]:
app["scheduler"] = app["website_app"]["scheduler"] = AsyncIOScheduler()
app["scheduler"].start()

yield
app["scheduler"].shutdown()

if app["scheduler"].running:
app["scheduler"].shutdown()


async def client_session(app: web.Application) -> AsyncGenerator[None, None]:
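The `running` check matters because the test fixture added in `tests/conftest.py` (below) shuts the scheduler down itself; without the guard, the cleanup context would call `shutdown()` on an already-stopped scheduler, which APScheduler rejects. A minimal sketch of the pattern, assuming APScheduler's `AsyncIOScheduler` and aiohttp's `cleanup_ctx` (the `create_app` wiring is illustrative, not this repo's `app_factory`):

```python
# Minimal sketch: an aiohttp cleanup context that starts the scheduler before
# yield and only shuts it down afterwards if it is still running.
from typing import AsyncGenerator

from aiohttp import web
from apscheduler.schedulers.asyncio import AsyncIOScheduler


async def apscheduler(app: web.Application) -> AsyncGenerator[None, None]:
    app["scheduler"] = AsyncIOScheduler()
    app["scheduler"].start()

    yield  # the application serves requests here

    if app["scheduler"].running:  # a test may have stopped it already
        app["scheduler"].shutdown()


def create_app() -> web.Application:
    app = web.Application()
    app.cleanup_ctx.append(apscheduler)  # runs setup/teardown around the app
    return app
```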
2 changes: 1 addition & 1 deletion pyslackersweb/website/models.py
@@ -3,4 +3,4 @@

class InviteSchema(Schema):
email = fields.Email(required=True)
agree_tos = fields.Boolean(required=True)
agree_tos = fields.Boolean(required=True, validate=bool)
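Passing `validate=bool` makes a falsy value fail validation, so `agree_tos=False` is rejected rather than silently accepted. A minimal sketch of the behavior, assuming marshmallow 3.x (a validator that returns `False` raises a `ValidationError`):

```python
# Minimal sketch, assuming marshmallow 3.x: a validator returning False
# causes load() to raise ValidationError, so agree_tos must be truthy.
from marshmallow import Schema, fields, ValidationError


class InviteSchema(Schema):
    email = fields.Email(required=True)
    agree_tos = fields.Boolean(required=True, validate=bool)


schema = InviteSchema()
schema.load({"email": "[email protected]", "agree_tos": True})  # passes

try:
    schema.load({"email": "[email protected]", "agree_tos": False})
except ValidationError as exc:
    print(exc.messages)  # e.g. {'agree_tos': ['Invalid value.']}
```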
32 changes: 19 additions & 13 deletions pyslackersweb/website/tasks.py
@@ -44,16 +44,16 @@ class Channel:

async def sync_github_repositories(
session: ClientSession, redis: RedisConnection, *, cache_key: str = GITHUB_REPO_CACHE_KEY
) -> None:
) -> List[Repository]:
logger.debug("Refreshing GitHub cache")
repositories = []
try:
async with session.get(
"https://api.github.com/orgs/pyslackers/repos",
headers={"Accept": "application/vnd.github.mercy-preview+json"},
) as r:
repos = await r.json()

repositories = []
for repo in repos:
if repo["archived"]:
continue
@@ -72,13 +72,15 @@ async def sync_github_repositories(

repositories.sort(key=lambda r: r.stars, reverse=True)

await redis.set(cache_key, json.dumps([x.__dict__ for x in repositories[:6]]))

await redis.set(
cache_key, json.dumps([dataclasses.asdict(repo) for repo in repositories[:6]])
)
except asyncio.CancelledError:
logger.debug("Github cache refresh canceled")
except Exception:
except Exception: # pylint: disable=broad-except
logger.exception("Error refreshing GitHub cache")
raise

return repositories


async def sync_slack_users(
@@ -87,11 +89,11 @@ async def sync_slack_users(
*,
cache_key_tz: str = SLACK_TZ_CACHE_KEY,
cache_key_count: str = SLACK_COUNT_CACHE_KEY,
):
) -> Counter:
logger.debug("Refreshing slack users cache.")

counter: Counter = Counter()
try:
counter: Counter = Counter()
async for user in slack_client.iter(slack.methods.USERS_LIST, minimum_time=3):
if user["deleted"] or user["is_bot"] or not user["tz"]:
continue
@@ -112,16 +114,17 @@ async def sync_slack_users(
logger.debug("Slack users cache refresh canceled")
except Exception: # pylint: disable=broad-except
logger.exception("Error refreshing slack users cache")
return

return counter


async def sync_slack_channels(
slack_client: SlackAPI, redis: RedisConnection, *, cache_key: str = SLACK_CHANNEL_CACHE_KEY
) -> None:
) -> List[Channel]:
logger.debug("Refreshing slack channels cache.")

channels = []
try:
channels = []
async for channel in slack_client.iter(slack.methods.CHANNELS_LIST):
channels.append(
Channel(
@@ -137,10 +140,13 @@ async def sync_slack_channels(

logger.debug("Found %s slack channels", len(channels))

await redis.set(cache_key, json.dumps([x.__dict__ for x in channels]))
await redis.set(
cache_key, json.dumps([dataclasses.asdict(channel) for channel in channels])
)

except asyncio.CancelledError:
logger.debug("Slack channels cache refresh canceled")
except Exception: # pylint: disable=broad-except
logger.exception("Error refreshing slack channels cache")
return

return channels
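The switch from `x.__dict__` to `dataclasses.asdict(...)` matters once the serialized dataclasses contain nested dataclasses (or lists/dicts of them): `asdict` converts recursively, while `__dict__` leaves nested instances unserializable by `json.dumps`. A short illustrative sketch (the `Inner`/`Outer` classes are hypothetical, not the `Repository`/`Channel` classes from this module):

```python
# Illustrative sketch: dataclasses.asdict recurses into nested dataclasses,
# while __dict__ does not.
import dataclasses
import json


@dataclasses.dataclass
class Inner:
    value: int


@dataclasses.dataclass
class Outer:
    name: str
    inner: Inner


outer = Outer(name="example", inner=Inner(value=1))

print(json.dumps(dataclasses.asdict(outer)))  # {"name": "example", "inner": {"value": 1}}
# json.dumps(outer.__dict__) would raise TypeError: Inner is not JSON serializable.
```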
28 changes: 15 additions & 13 deletions pyslackersweb/website/views.py
@@ -1,6 +1,8 @@
import json
import logging

import slack.exceptions

from aiohttp import web
from aiohttp_jinja2 import template
from marshmallow.exceptions import ValidationError
@@ -24,7 +26,9 @@ async def get(self):

return {
"member_count": int((await redis.get(SLACK_COUNT_CACHE_KEY, encoding="utf-8")) or 0),
"projects": json.loads(await redis.get(GITHUB_REPO_CACHE_KEY, encoding="utf-8")),
"projects": json.loads(
await redis.get(GITHUB_REPO_CACHE_KEY, encoding="utf-8") or "{}"
),
"sponsors": [
{
"image": self.request.app.router["static"].url_for(
@@ -65,19 +69,17 @@ async def post(self):

try:
invite = self.schema.load(await self.request.post())
async with self.request.app["client_session"].post(
"https://slack.com/api/users.admin.invite",
headers={"Authorization": f"Bearer {self.request.app['slack_invite_token']}"},
data={"email": invite["email"], "resend": True},
) as r:
body = await r.json()

if body["ok"]:
context["success"] = True
else:
logger.warning("Error sending slack invite: %s", body["error"], extra=body)
context["errors"].update(non_field=[body["error"]])
await self.request.app["slack_client"].query(
url="users.admin.invite", data={"email": invite["email"], "resend": True}
)
context["success"] = True
except ValidationError as e:
context["errors"] = e.normalized_messages()
except slack.exceptions.SlackAPIError as e:
logger.warning("Error sending slack invite: %s", e.error, extra=e.data)
context["errors"].update(non_field=[e.error])
except slack.exceptions.HTTPException:
logger.exception("Error contacting slack API")
context["errors"].update(non_field=["Error contacting slack API"])

return context
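In the index view above, the new `or "{}"` fallback covers the case where the GitHub cache has not been populated yet: `redis.get` returns `None` then, and `json.loads(None)` raises a `TypeError`. A tiny sketch of the difference:

```python
# Sketch: why the view falls back to "{}" when the cache key is empty.
import json

cached = None  # what redis.get(...) returns before the sync task has run
print(json.loads(cached or "{}"))  # {} -> the template renders with no projects
# json.loads(cached) without the fallback would raise TypeError.
```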
1 change: 1 addition & 0 deletions requirements/testing.txt
@@ -7,3 +7,4 @@ pytest-cov==2.7.1
pytest-aiohttp==0.3.0
tox==3.13.2
mypy==0.720
asynctest==0.13.0
Empty file added tests/__init__.py
18 changes: 18 additions & 0 deletions tests/conftest.py
@@ -0,0 +1,18 @@
import pytest

import pyslackersweb
import pyslackersweb.website.tasks

pytest_plugins = ("slack.tests.plugin",)


@pytest.fixture
async def client(aiohttp_client, slack_client):

application = await pyslackersweb.app_factory()

app_client = await aiohttp_client(application)
app_client.app["scheduler"].shutdown()
app_client.app["website_app"]["slack_client"] = slack_client

return app_client
2 changes: 0 additions & 2 deletions tests/test_nothing.py

This file was deleted.

111 changes: 111 additions & 0 deletions tests/test_website.py
@@ -0,0 +1,111 @@
import logging

from collections import namedtuple

import pytest
import aiohttp.web

from pyslackersweb.website import tasks

SlackInviteTestParam = namedtuple("Param", "response data expected")


async def test_endpoint_index(client):
r = await client.get("/")

assert r.history[0].url.path == "/"
assert r.history[0].status == 302

assert r.status == 200
assert r.url.path == "/web"


async def test_endpoint_slack(client):
r = await client.get("/web/slack")
assert r.status == 200


@pytest.mark.parametrize(
"slack_client,data,expected",
(
SlackInviteTestParam(
response={},
data={"email": "[email protected]", "agree_tos": True},
expected="successAlert",
),
SlackInviteTestParam(
response={}, data={"agree_tos": True}, expected="Missing data for required field"
),
SlackInviteTestParam(
response={},
data={"email": "[email protected]", "agree_tos": False},
expected="There was an error processing your invite",
),
SlackInviteTestParam(
response={},
data={"email": "foobar", "agree_tos": True},
expected="Not a valid email address",
),
SlackInviteTestParam(
response={"body": {"ok": False, "error": "already_in_team"}},
data={"email": "[email protected]", "agree_tos": True},
expected="Reason: already_in_team",
),
SlackInviteTestParam(
response={"body": {"ok": False, "error": "not_authed"}},
data={"email": "[email protected]", "agree_tos": True},
expected="Reason: not_authed",
),
SlackInviteTestParam(
response={"status": 500},
data={"email": "[email protected]", "agree_tos": True},
expected="Reason: Error contacting slack API",
),
),
indirect=["slack_client"],
)
async def test_endpoint_slack_invite(client, data, expected):
r = await client.post(path="/web/slack", data=data)
html = await r.text()

assert r.status == 200
assert expected in html


async def test_task_sync_github_repositories(client, caplog):

async with aiohttp.ClientSession() as session:
result = await tasks.sync_github_repositories(session, client.app["redis"])

assert result

for record in caplog.records:
Contributor: I really do not want to test the logging implementation; that seems brittle and not something we need to ensure doesn't change.

Member Author: The goal is not to check the logging implementation but to ensure no exceptions were raised. It loops back to the return/raise issue in the tasks. IMO it's fine for now. Later on, we should split these tasks into two functions: one handling the logging/exception catching for apscheduler, and a second doing the actual work that we can test.

assert record.levelno <= logging.INFO


@pytest.mark.parametrize("slack_client", ({"body": ["users_iter", "users"]},), indirect=True)
async def test_task_sync_slack_users(client, caplog):

result = await tasks.sync_slack_users(
client.app["website_app"]["slack_client"], client.app["redis"]
)

assert result
assert len(result) == 1
assert result["America/Los_Angeles"] == 2

for record in caplog.records:
assert record.levelno <= logging.INFO


@pytest.mark.parametrize("slack_client", ({"body": ["channels_iter", "channels"]},), indirect=True)
async def test_task_sync_slack_channels(client, caplog):

result = await tasks.sync_slack_channels(
client.app["website_app"]["slack_client"], client.app["redis"]
)

assert result

for record in caplog.records:
assert record.levelno <= logging.INFO
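The invite tests above rely on pytest's `indirect` parametrization to route canned responses into the `slack_client` fixture provided by `slack.tests.plugin`. A generic sketch of the mechanism, using a stand-in fixture (the `canned_response` fixture below is purely illustrative, not the real plugin):

```python
# Generic sketch of indirect parametrization: the parametrize value is routed
# to the fixture via request.param instead of being passed to the test directly.
import pytest


@pytest.fixture
def canned_response(request):
    # request.param holds the dict supplied by @pytest.mark.parametrize
    return {"ok": True, **getattr(request, "param", {})}


@pytest.mark.parametrize(
    "canned_response", ({"error": "already_in_team"},), indirect=True
)
def test_canned_response(canned_response):
    assert canned_response["ok"]
    assert canned_response["error"] == "already_in_team"
```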