author | 2021-04-12 23:14:08 +0800
---|---
committer | 2021-04-12 23:14:08 +0800
commit | 48a7cca5c90841679a71eb8d0e902ad6dc943cc6 (patch)
tree | 5511c136833aa07efc89c3eac98a9eb013cb53ca
parent | update comment (diff)
parent | Merge pull request #1516 from ToxicKidz/sorted-available-channels (diff)
Merge branch 'main' into doc-imp
40 files changed, 1562 insertions, 1190 deletions
diff --git a/.github/CODEOWNERS b/.github/CODEOWNERS
index 634bb4bca..1df05e990 100644
--- a/.github/CODEOWNERS
+++ b/.github/CODEOWNERS
@@ -4,14 +4,14 @@
 **/bot/exts/moderation/*silence.py @MarkKoz
 bot/exts/info/codeblock/** @MarkKoz
 bot/exts/utils/extensions.py @MarkKoz
-bot/exts/utils/snekbox.py @MarkKoz @Akarys42
+bot/exts/utils/snekbox.py @MarkKoz @Akarys42 @jb3
 bot/exts/help_channels/** @MarkKoz @Akarys42
-bot/exts/moderation/** @Akarys42 @mbaruh @Den4200 @ks129
-bot/exts/info/** @Akarys42 @Den4200
-bot/exts/info/information.py @mbaruh
-bot/exts/filters/** @mbaruh
+bot/exts/moderation/** @Akarys42 @mbaruh @Den4200 @ks129 @jb3
+bot/exts/info/** @Akarys42 @Den4200 @jb3
+bot/exts/info/information.py @mbaruh @jb3
+bot/exts/filters/** @mbaruh @jb3
 bot/exts/fun/** @ks129
-bot/exts/utils/** @ks129
+bot/exts/utils/** @ks129 @jb3
 bot/exts/recruitment/** @wookie184
 # Rules
@@ -30,9 +30,9 @@ tests/bot/exts/test_cogs.py @MarkKoz
 tests/** @Akarys42
 # CI & Docker
-.github/workflows/** @MarkKoz @Akarys42 @SebastiaanZ @Den4200
-Dockerfile @MarkKoz @Akarys42 @Den4200
-docker-compose.yml @MarkKoz @Akarys42 @Den4200
+.github/workflows/** @MarkKoz @Akarys42 @SebastiaanZ @Den4200 @jb3
+Dockerfile @MarkKoz @Akarys42 @Den4200 @jb3
+docker-compose.yml @MarkKoz @Akarys42 @Den4200 @jb3
 # Tools
 Pipfile* @Akarys42
diff --git a/.github/workflows/build.yml b/.github/workflows/build.yml
index e6826e09b..84a671917 100644
--- a/.github/workflows/build.yml
+++ b/.github/workflows/build.yml
@@ -39,7 +39,7 @@ jobs:
         with:
           registry: ghcr.io
           username: ${{ github.repository_owner }}
-          password: ${{ secrets.GHCR_TOKEN }}
+          password: ${{ secrets.GITHUB_TOKEN }}
       # Build and push the container to the GitHub Container
       # Repository.
      # The container will be tagged as "latest"
diff --git a/CODE_OF_CONDUCT.md b/CODE_OF_CONDUCT.md
new file mode 100644
index 000000000..57ccd80e7
--- /dev/null
+++ b/CODE_OF_CONDUCT.md
@@ -0,0 +1,3 @@
+# Code of Conduct
+
+The Python Discord Code of Conduct can be found [on our website](https://pydis.com/coc).
diff --git a/CONTRIBUTING.md b/CONTRIBUTING.md
index addab32ff..f20b53162 100644
--- a/CONTRIBUTING.md
+++ b/CONTRIBUTING.md
@@ -1,123 +1,3 @@
-# Contributing to one of Our Projects
+# Contributing Guidelines
-
-Our projects are open-source and are automatically deployed whenever commits are pushed to the `main` branch on each repository, so we've created a set of guidelines in order to keep everything clean and in working order.
-
-Note that contributions may be rejected on the basis of a contributor failing to follow these guidelines.
-
-## Rules
-
-1. **No force-pushes** or modifying the Git history in any way.
-2. If you have direct access to the repository, **create a branch for your changes** and create a pull request for that branch. If not, create a branch on a fork of the repository and create a pull request from there.
-    * It's common practice for a repository to reject direct pushes to `main`, so make branching a habit!
-    * If PRing from your own fork, **ensure that "Allow edits from maintainers" is checked**. This gives permission for maintainers to commit changes directly to your fork, speeding up the review process.
-3. **Adhere to the prevailing code style**, which we enforce using [`flake8`](http://flake8.pycqa.org/en/latest/index.html) and [`pre-commit`](https://pre-commit.com/).
-    * Run `flake8` and `pre-commit` against your code [**before** you push it](https://soundcloud.com/lemonsaurusrex/lint-before-you-push). Your commit will be rejected by the build server if it fails to lint.
-    * [Git Hooks](https://git-scm.com/book/en/v2/Customizing-Git-Git-Hooks) are a powerful git feature for executing custom scripts when certain important git actions occur. The pre-commit hook is the first hook executed during the commit process and can be used to check the code being committed and abort the commit if issues, such as linting failures, are detected. While git hooks can seem daunting to configure, the `pre-commit` framework abstracts this process away from you and is provided as a dev dependency for this project. Run `pipenv run precommit` when setting up the project and you'll never have to worry about committing code that fails linting.
-4. **Make great commits**. A well-structured git log is key to a project's maintainability; it efficiently provides insight into when and *why* things were done for future maintainers of the project.
-    * Commits should be as narrow in scope as possible. Commits that span hundreds of lines across multiple unrelated functions and/or files are very hard for maintainers to follow. After about a week they'll probably be hard for you to follow too.
-    * Avoid making minor commits for fixing typos or linting errors. Since you've already set up a `pre-commit` hook to run the linting pipeline before a commit, you shouldn't be committing linting issues anyway.
-    * A more in-depth guide to writing great commit messages can be found in Chris Beams' [*How to Write a Git Commit Message*](https://chris.beams.io/posts/git-commit/).
-5. **Avoid frequent pushes to the main repository**. This goes for PRs opened against your fork as well. Our test build pipelines are triggered every time a push to the repository (or PR) is made. Try to batch your commits until you've finished working for that session, or you've reached a point where collaborators need your commits to continue their own work. This also provides you the opportunity to amend commits for minor changes rather than having to commit them on their own because you've already pushed.
-    * This includes merging main into your branch. Try to leave merging from main for after your PR passes review; a maintainer will bring your PR up to date before merging. Exceptions to this include: resolving merge conflicts, needing something that was pushed to main for your branch, or something was pushed to main that could potentially affect the functionality of what you're writing.
-6. **Don't fight the framework**. Every framework has its flaws, but the frameworks we've picked out have been carefully chosen for their particular merits. If you can avoid it, please resist reimplementing swathes of framework logic - the work has already been done for you!
-7. If someone is working on an issue or pull request, **do not open your own pull request for the same task**. Instead, collaborate with the author(s) of the existing pull request. Duplicate PRs opened without communicating with the other author(s) and/or PyDis staff will be closed. Communication is key, and there's no point in two separate implementations of the same thing.
-    * One option is to fork the other contributor's repository and submit your changes to their branch with your own pull request. We suggest following these guidelines when interacting with their repository as well.
-    * The author(s) of inactive PRs and claimed issues will be pinged after a week of inactivity for an update. Continued inactivity may result in the issue being released back to the community and/or PR closure.
-8. **Work as a team** and collaborate wherever possible. Keep things friendly and help each other out - these are shared projects and nobody likes to have their feet trodden on.
-9. All static content, such as images or audio, **must be licensed for open public use**.
-    * Static content must be hosted by a service designed to do so. Failing to do so is known as "leeching" and is frowned upon, as it generates extra bandwidth costs to the host without providing benefit. It would be best if appropriately licensed content is added to the repository itself so it can be served by PyDis' infrastructure.
-
-Above all, the needs of our community should come before the wants of an individual. Work together, build solutions to problems and try to do so in a way that people can learn from easily. Abuse of our trust may result in the loss of your Contributor role.
-
-## Changes to this Arrangement
-
-All projects evolve over time, and this contribution guide is no different. This document is open to pull requests or changes by contributors. If you believe you have something valuable to add or change, please don't hesitate to do so in a PR.
-
-## Supplemental Information
-### Developer Environment
-Instructions for setting up the bot developer environment can be found on the [PyDis wiki](https://pythondiscord.com/pages/contributing/bot/)
-
-To provide a standalone development environment for this project, Docker Compose is utilized to pull the current version of the [site backend](https://github.com/python-discord/site). While appropriate for bot-only contributions, any contributions that necessitate backend changes will require the site repository to be appropriately configured as well. Instructions for setting up the site environment can be found on the [PyDis site](https://pythondiscord.com/pages/contributing/site/).
-
-When pulling down changes from GitHub, remember to sync your environment using `pipenv sync --dev` to ensure you're using the most up-to-date versions of the project's dependencies.
-
-### Type Hinting
-[PEP 484](https://www.python.org/dev/peps/pep-0484/) formally specifies type hints for Python functions, added to the Python Standard Library in version 3.5. Type hints are recognized by most modern code editing tools and provide useful insight into both the input and output types of a function, preventing the user from having to go through the codebase to determine these types.
-
-For example:
-
-```py
-import typing as t
-
-
-def foo(input_1: int, input_2: t.Dict[str, str]) -> bool:
-    ...
-```
-
-Tells us that `foo` accepts an `int` and a `dict`, with `str` keys and values, and returns a `bool`.
-
-All function declarations should be type hinted in code contributed to the PyDis organization.
-
-For more information, see *[PEP 483](https://www.python.org/dev/peps/pep-0483/) - The Theory of Type Hints* and Python's documentation for the [`typing`](https://docs.python.org/3/library/typing.html) module.
-
-### AutoDoc Formatting Directives
-Many documentation packages provide support for automatic documentation generation from the codebase's docstrings. These tools utilize special formatting directives to enable richer formatting in the generated documentation.
-
-For example:
-
-```py
-import typing as t
-
-
-def foo(bar: int, baz: t.Optional[t.Dict[str, str]] = None) -> bool:
-    """
-    Does some things with some stuff.
-
-    :param bar: Some input
-    :param baz: Optional, some dictionary with string keys and values
-
-    :return: Some boolean
-    """
-    ...
-```
-
-Since PyDis does not utilize automatic documentation generation, this syntax should not be used in code contributed to the organization. Should the purpose and type of the input variables not be easily discernible from the variable name and type annotation, a prose explanation can be used. Explicit references to variables, functions, classes, etc. should be wrapped with backticks (`` ` ``).
-
-For example, the above docstring would become:
-
-```py
-import typing as t
-
-
-def foo(bar: int, baz: t.Optional[t.Dict[str, str]] = None) -> bool:
-    """
-    Does some things with some stuff.
-
-    This function takes an index, `bar`, and checks for its presence in the database `baz`, passed as a dictionary. Returns `False` if `baz` is not passed.
-    """
-    ...
-```
-
-### Logging Levels
-The project currently defines [`logging`](https://docs.python.org/3/library/logging.html) levels as follows, from lowest to highest severity:
-* **TRACE:** These events should be used to provide a *verbose* trace of every step of a complex process. This is essentially the `logging` equivalent of sprinkling `print` statements throughout the code.
-    * **Note:** This is a PyDis-implemented logging level.
-* **DEBUG:** These events should add context to what's happening in a development setup to make it easier to follow what's going on while working on a project. This is in the same vein as **TRACE** logging but at a much lower level of verbosity.
-* **INFO:** These events are normal and don't need direct attention but are worth keeping track of in production, like checking which cogs were loaded during a start-up.
-* **WARNING:** These events are out of the ordinary and should be fixed, but have not caused a failure.
-    * **NOTE:** Events at this logging level and higher should be reserved for events that require the attention of the DevOps team.
-* **ERROR:** These events have caused a failure in a specific part of the application and require urgent attention.
-* **CRITICAL:** These events have caused the whole application to fail and require immediate intervention.
-
-Ensure that log messages are succinct. Should you want to pass additional useful information that would otherwise make the log message overly verbose, the `logging` module accepts an `extra` kwarg, which can be used to pass a dictionary. This is used to populate the `__dict__` of the `LogRecord` created for the logging event with user-defined attributes that can be accessed by a log handler. Additional information and caveats may be found [in Python's `logging` documentation](https://docs.python.org/3/library/logging.html#logging.Logger.debug).
-
-### Work in Progress (WIP) PRs
-GitHub [provides a PR feature](https://github.blog/2019-02-14-introducing-draft-pull-requests/) that allows the PR author to mark it as a WIP. This provides both a visual and functional indicator that the contents of the PR are in a draft state and not yet ready for formal review.
-
-This feature should be utilized in place of the traditional method of prepending `[WIP]` to the PR title.
-
-As stated earlier, **ensure that "Allow edits from maintainers" is checked**. This gives permission for maintainers to commit changes directly to your fork, speeding up the review process.
-
-## Footnotes
-
-This document was inspired by the [Glowstone contribution guidelines](https://github.com/GlowstoneMC/Glowstone/blob/dev/docs/CONTRIBUTING.md).
+
+The Contributing Guidelines for Python Discord projects can be found [on our website](https://pydis.com/contributing.md).
diff --git a/Pipfile b/Pipfile
--- a/Pipfile
+++ b/Pipfile
@@ -9,26 +9,29 @@ aiodns = "~=2.0"
 aiohttp = "~=3.7"
 aioping = "~=0.3.1"
 aioredis = "~=1.3.1"
+arrow = "~=1.0.3"
 "async-rediscache[fakeredis]" = "~=0.1.2"
 beautifulsoup4 = "~=4.9"
 colorama = {version = "~=0.4.3",sys_platform = "== 'win32'"}
 coloredlogs = "~=14.0"
 deepdiff = "~=4.0"
 "discord.py" = "~=1.6.0"
+emoji = "~=0.6"
 feedparser = "~=5.2"
 fuzzywuzzy = "~=0.17"
 lxml = "~=4.4"
 markdownify = "==0.6.1"
 more_itertools = "~=8.2"
 python-dateutil = "~=2.8"
+python-frontmatter = "~=1.0.0"
 pyyaml = "~=5.1"
+regex = "==2021.4.4"
 sentry-sdk = "~=0.19"
 statsd = "~=3.3"
-arrow = "~=0.17"
-emoji = "~=0.6"
 [dev-packages]
 coverage = "~=5.0"
+coveralls = "~=2.1"
 flake8 = "~=3.8"
 flake8-annotations = "~=2.0"
 flake8-bugbear = "~=20.1"
@@ -39,7 +42,6 @@ flake8-tidy-imports = "~=4.0"
 flake8-todo = "~=0.7"
 pep8-naming = "~=0.9"
 pre-commit = "~=2.1"
-coveralls = "~=2.1"
 [requires]
 python_version = "3.8"
diff --git a/Pipfile.lock b/Pipfile.lock
index a5e57a3fb..1e1a8167b 100644
--- a/Pipfile.lock
+++ b/Pipfile.lock
@@ -1,7 +1,7 @@
 {
     "_meta": {
         "hash": {
-            "sha256": "d5106d76a47c287ef74fc610be4977a76f261be003739cabf639ace310eeac57"
+            "sha256": "e35c9bad81b01152ad3e10b85f1abf5866aa87b9d87e03bc30bdb9d37668ccae"
         },
         "pipfile-spec": 6,
         "requires": {
@@ -101,11 +101,11 @@
         },
         "arrow": {
             "hashes": [
-                "sha256:e098abbd9af3665aea81bdd6c869e93af4feb078e98468dd351c383af187aac5",
-                "sha256:ff08d10cda1d36c68657d6ad20d74fbea493d980f8b2d45344e00d6ed2bf6ed4"
+                "sha256:3515630f11a15c61dcb4cdd245883270dd334c83f3e639824e65a4b79cc48543",
+                "sha256:399c9c8ae732270e1aa58ead835a79a40d7be8aa109c579898eb41029b5a231d"
             ],
             "index": "pypi",
-            "version": "==0.17.0"
+            "version": "==1.0.3"
         },
         "async-rediscache": {
             "extras": [
@@ -206,7 +206,6 @@
                 "sha256:5941b2b48a20143d2267e95b1c2a7603ce057ee39fd88e7329b0c292aa16869b",
                 "sha256:9f47eda37229f68eee03b24b9748937c7dc3868f906e8ba69fbcbdd3bc5dc3e2"
             ],
-            "index": "pypi",
             "markers": "sys_platform == 'win32'",
             "version": "==0.4.4"
         },
@@ -267,55 +266,50 @@
         },
         "hiredis": {
"hashes": [ - "sha256:06a039208f83744a702279b894c8cf24c14fd63c59cd917dcde168b79eef0680", - "sha256:0a909bf501459062aa1552be1461456518f367379fdc9fdb1f2ca5e4a1fdd7c0", - "sha256:18402d9e54fb278cb9a8c638df6f1550aca36a009d47ecf5aa263a38600f35b0", - "sha256:1e4cbbc3858ec7e680006e5ca590d89a5e083235988f26a004acf7244389ac01", - "sha256:23344e3c2177baf6975fbfa361ed92eb7d36d08f454636e5054b3faa7c2aff8a", - "sha256:289b31885b4996ce04cadfd5fc03d034dce8e2a8234479f7c9e23b9e245db06b", - "sha256:2c1c570ae7bf1bab304f29427e2475fe1856814312c4a1cf1cd0ee133f07a3c6", - "sha256:2c227c0ed371771ffda256034427320870e8ea2e4fd0c0a618c766e7c49aad73", - "sha256:3bb9b63d319402cead8bbd9dd55dca3b667d2997e9a0d8a1f9b6cc274db4baee", - "sha256:3ef2183de67b59930d2db8b8e8d4d58e00a50fcc5e92f4f678f6eed7a1c72d55", - "sha256:43b8ed3dbfd9171e44c554cb4acf4ee4505caa84c5e341858b50ea27dd2b6e12", - "sha256:47bcf3c5e6c1e87ceb86cdda2ee983fa0fe56a999e6185099b3c93a223f2fa9b", - "sha256:5263db1e2e1e8ae30500cdd75a979ff99dcc184201e6b4b820d0de74834d2323", - "sha256:5b1451727f02e7acbdf6aae4e06d75f66ee82966ff9114550381c3271a90f56c", - "sha256:6996883a8a6ff9117cbb3d6f5b0dcbbae6fb9e31e1a3e4e2f95e0214d9a1c655", - "sha256:6c96f64a54f030366657a54bb90b3093afc9c16c8e0dfa29fc0d6dbe169103a5", - "sha256:7332d5c3e35154cd234fd79573736ddcf7a0ade7a986db35b6196b9171493e75", - "sha256:7885b6f32c4a898e825bb7f56f36a02781ac4a951c63e4169f0afcf9c8c30dfb", - "sha256:7b0f63f10a166583ab744a58baad04e0f52cfea1ac27bfa1b0c21a48d1003c23", - "sha256:819f95d4eba3f9e484dd115ab7ab72845cf766b84286a00d4ecf76d33f1edca1", - "sha256:8968eeaa4d37a38f8ca1f9dbe53526b69628edc9c42229a5b2f56d98bb828c1f", - "sha256:89ebf69cb19a33d625db72d2ac589d26e936b8f7628531269accf4a3196e7872", - "sha256:8daecd778c1da45b8bd54fd41ffcd471a86beed3d8e57a43acf7a8d63bba4058", - "sha256:955ba8ea73cf3ed8bd2f963b4cb9f8f0dcb27becd2f4b3dd536fd24c45533454", - "sha256:964f18a59f5a64c0170f684c417f4fe3e695a536612e13074c4dd5d1c6d7c882", - 
"sha256:969843fbdfbf56cdb71da6f0bdf50f9985b8b8aeb630102945306cf10a9c6af2", - "sha256:996021ef33e0f50b97ff2d6b5f422a0fe5577de21a8873b58a779a5ddd1c3132", - "sha256:9e9c9078a7ce07e6fce366bd818be89365a35d2e4b163268f0ca9ba7e13bb2f6", - "sha256:a04901757cb0fb0f5602ac11dda48f5510f94372144d06c2563ba56c480b467c", - "sha256:a7bf1492429f18d205f3a818da3ff1f242f60aa59006e53dee00b4ef592a3363", - "sha256:aa0af2deb166a5e26e0d554b824605e660039b161e37ed4f01b8d04beec184f3", - "sha256:abfb15a6a7822f0fae681785cb38860e7a2cb1616a708d53df557b3d76c5bfd4", - "sha256:b253fe4df2afea4dfa6b1fa8c5fef212aff8bcaaeb4207e81eed05cb5e4a7919", - "sha256:b27f082f47d23cffc4cf1388b84fdc45c4ef6015f906cd7e0d988d9e35d36349", - "sha256:b33aea449e7f46738811fbc6f0b3177c6777a572207412bbbf6f525ffed001ae", - "sha256:b44f9421c4505c548435244d74037618f452844c5d3c67719d8a55e2613549da", - "sha256:bcc371151d1512201d0214c36c0c150b1dc64f19c2b1a8c9cb1d7c7c15ebd93f", - "sha256:c2851deeabd96d3f6283e9c6b26e0bfed4de2dc6fb15edf913e78b79fc5909ed", - "sha256:cdfd501c7ac5b198c15df800a3a34c38345f5182e5f80770caf362bccca65628", - "sha256:d2c0caffa47606d6d7c8af94ba42547bd2a441f06c74fd90a1ffe328524a6c64", - "sha256:dcb2db95e629962db5a355047fb8aefb012df6c8ae608930d391619dbd96fd86", - "sha256:e0eeb9c112fec2031927a1745788a181d0eecbacbed941fc5c4f7bc3f7b273bf", - "sha256:e154891263306200260d7f3051982774d7b9ef35af3509d5adbbe539afd2610c", - "sha256:e2e023a42dcbab8ed31f97c2bcdb980b7fbe0ada34037d87ba9d799664b58ded", - "sha256:e64be68255234bb489a574c4f2f8df7029c98c81ec4d160d6cd836e7f0679390", - "sha256:e82d6b930e02e80e5109b678c663a9ed210680ded81c1abaf54635d88d1da298" + "sha256:04026461eae67fdefa1949b7332e488224eac9e8f2b5c58c98b54d29af22093e", + "sha256:04927a4c651a0e9ec11c68e4427d917e44ff101f761cd3b5bc76f86aaa431d27", + "sha256:07bbf9bdcb82239f319b1f09e8ef4bdfaec50ed7d7ea51a56438f39193271163", + "sha256:09004096e953d7ebd508cded79f6b21e05dff5d7361771f59269425108e703bc", + "sha256:0adea425b764a08270820531ec2218d0508f8ae15a448568109ffcae050fee26", 
+ "sha256:0b39ec237459922c6544d071cdcf92cbb5bc6685a30e7c6d985d8a3e3a75326e", + "sha256:0d5109337e1db373a892fdcf78eb145ffb6bbd66bb51989ec36117b9f7f9b579", + "sha256:0f41827028901814c709e744060843c77e78a3aca1e0d6875d2562372fcb405a", + "sha256:11d119507bb54e81f375e638225a2c057dda748f2b1deef05c2b1a5d42686048", + "sha256:1233e303645f468e399ec906b6b48ab7cd8391aae2d08daadbb5cad6ace4bd87", + "sha256:139705ce59d94eef2ceae9fd2ad58710b02aee91e7fa0ccb485665ca0ecbec63", + "sha256:1f03d4dadd595f7a69a75709bc81902673fa31964c75f93af74feac2f134cc54", + "sha256:240ce6dc19835971f38caf94b5738092cb1e641f8150a9ef9251b7825506cb05", + "sha256:294a6697dfa41a8cba4c365dd3715abc54d29a86a40ec6405d677ca853307cfb", + "sha256:3d55e36715ff06cdc0ab62f9591607c4324297b6b6ce5b58cb9928b3defe30ea", + "sha256:3dddf681284fe16d047d3ad37415b2e9ccdc6c8986c8062dbe51ab9a358b50a5", + "sha256:3f5f7e3a4ab824e3de1e1700f05ad76ee465f5f11f5db61c4b297ec29e692b2e", + "sha256:508999bec4422e646b05c95c598b64bdbef1edf0d2b715450a078ba21b385bcc", + "sha256:5d2a48c80cf5a338d58aae3c16872f4d452345e18350143b3bf7216d33ba7b99", + "sha256:5dc7a94bb11096bc4bffd41a3c4f2b958257085c01522aa81140c68b8bf1630a", + "sha256:65d653df249a2f95673976e4e9dd7ce10de61cfc6e64fa7eeaa6891a9559c581", + "sha256:7492af15f71f75ee93d2a618ca53fea8be85e7b625e323315169977fae752426", + "sha256:7f0055f1809b911ab347a25d786deff5e10e9cf083c3c3fd2dd04e8612e8d9db", + "sha256:807b3096205c7cec861c8803a6738e33ed86c9aae76cac0e19454245a6bbbc0a", + "sha256:81d6d8e39695f2c37954d1011c0480ef7cf444d4e3ae24bc5e89ee5de360139a", + "sha256:87c7c10d186f1743a8fd6a971ab6525d60abd5d5d200f31e073cd5e94d7e7a9d", + "sha256:8b42c0dc927b8d7c0eb59f97e6e34408e53bc489f9f90e66e568f329bff3e443", + "sha256:a00514362df15af041cc06e97aebabf2895e0a7c42c83c21894be12b84402d79", + "sha256:a39efc3ade8c1fb27c097fd112baf09d7fd70b8cb10ef1de4da6efbe066d381d", + "sha256:a4ee8000454ad4486fb9f28b0cab7fa1cd796fc36d639882d0b34109b5b3aec9", + 
"sha256:a7928283143a401e72a4fad43ecc85b35c27ae699cf5d54d39e1e72d97460e1d", + "sha256:adf4dd19d8875ac147bf926c727215a0faf21490b22c053db464e0bf0deb0485", + "sha256:ae8427a5e9062ba66fc2c62fb19a72276cf12c780e8db2b0956ea909c48acff5", + "sha256:b4c8b0bc5841e578d5fb32a16e0c305359b987b850a06964bd5a62739d688048", + "sha256:b84f29971f0ad4adaee391c6364e6f780d5aae7e9226d41964b26b49376071d0", + "sha256:c39c46d9e44447181cd502a35aad2bb178dbf1b1f86cf4db639d7b9614f837c6", + "sha256:cb2126603091902767d96bcb74093bd8b14982f41809f85c9b96e519c7e1dc41", + "sha256:dcef843f8de4e2ff5e35e96ec2a4abbdf403bd0f732ead127bd27e51f38ac298", + "sha256:e3447d9e074abf0e3cd85aef8131e01ab93f9f0e86654db7ac8a3f73c63706ce", + "sha256:f52010e0a44e3d8530437e7da38d11fb822acfb0d5b12e9cd5ba655509937ca0", + "sha256:f8196f739092a78e4f6b1b2172679ed3343c39c61a3e9d722ce6fcf1dac2824a" ], - "markers": "python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3'", - "version": "==1.1.0" + "markers": "python_version >= '3.6'", + "version": "==2.0.0" }, "humanfriendly": { "hashes": [ @@ -490,15 +484,6 @@ "markers": "python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3'", "version": "==2.20" }, - "pyreadline": { - "hashes": [ - "sha256:4530592fc2e85b25b1a9f79664433da09237c1a270e4d78ea5aa3a2c7229e2d1", - "sha256:65540c21bfe14405a3a77e4c085ecfce88724743a4ead47c66b84defcf82c32e", - "sha256:9ce5fa65b8992dfa373bddc5b6e0864ead8f291c94fbfec05fbd5c836162e67b" - ], - "markers": "sys_platform == 'win32'", - "version": "==2.1" - }, "python-dateutil": { "hashes": [ "sha256:73ebfe9dbf22e832286dafa60473e4cd239f8592f699aa5adaf10050e6e1823c", @@ -507,37 +492,45 @@ "index": "pypi", "version": "==2.8.1" }, + "python-frontmatter": { + "hashes": [ + "sha256:766ae75f1b301ffc5fe3494339147e0fd80bc3deff3d7590a93991978b579b08", + "sha256:e98152e977225ddafea6f01f40b4b0f1de175766322004c826ca99842d19a7cd" + ], + "index": "pypi", + "version": "==1.0.0" + }, "pyyaml": { "hashes": [ - 
"sha256:fd7f6999a8070df521b6384004ef42833b9bd62cfee11a09bda1079b4b704247", - "sha256:3bd0e463264cf257d1ffd2e40223b197271046d09dadf73a0fe82b9c1fc385a5", - "sha256:d483ad4e639292c90170eb6f7783ad19490e7a8defb3e46f97dfe4bacae89122", - "sha256:c20cfa2d49991c8b4147af39859b167664f2ad4561704ee74c1de03318e898db", - "sha256:e4fac90784481d221a8e4b1162afa7c47ed953be40d31ab4629ae917510051df", - "sha256:49d4cdd9065b9b6e206d0595fee27a96b5dd22618e7520c33204a4a3239d5b10", - "sha256:d2d9808ea7b4af864f35ea216be506ecec180628aced0704e34aca0b040ffe46", + "sha256:08682f6b72c722394747bddaf0aa62277e02557c0fd1c42cb853016a38f8dedf", + "sha256:0f5f5786c0e09baddcd8b4b45f20a7b5d61a7e7e99846e3c799b05c7c53fa696", + "sha256:129def1b7c1bf22faffd67b8f3724645203b79d8f4cc81f674654d9902cb4393", "sha256:294db365efa064d00b8d1ef65d8ea2c3426ac366c0c4368d930bf1c5fb497f77", - "sha256:fe69978f3f768926cfa37b867e3843918e012cf83f680806599ddce33c2c68b0", - "sha256:8c1be557ee92a20f184922c7b6424e8ab6691788e6d86137c5d93c1a6ec1b8fb", - "sha256:dd5de0646207f053eb0d6c74ae45ba98c3395a571a2891858e87df7c9b9bd51b", "sha256:3b2b1824fe7112845700f815ff6a489360226a5609b96ec2190a45e62a9fc922", - "sha256:607774cbba28732bfa802b54baa7484215f530991055bb562efbed5b2f20a45e", - "sha256:08682f6b72c722394747bddaf0aa62277e02557c0fd1c42cb853016a38f8dedf", + "sha256:3bd0e463264cf257d1ffd2e40223b197271046d09dadf73a0fe82b9c1fc385a5", + "sha256:4465124ef1b18d9ace298060f4eccc64b0850899ac4ac53294547536533800c8", + "sha256:49d4cdd9065b9b6e206d0595fee27a96b5dd22618e7520c33204a4a3239d5b10", + "sha256:4e0583d24c881e14342eaf4ec5fbc97f934b999a6828693a99157fde912540cc", "sha256:5accb17103e43963b80e6f837831f38d314a0495500067cb25afab2e8d7a4018", - "sha256:fa5ae20527d8e831e8230cbffd9f8fe952815b2b7dae6ffec25318803a7528fc", - "sha256:fdc842473cd33f45ff6bce46aea678a54e3d21f1b61a7750ce3c498eedfe25d6", + "sha256:607774cbba28732bfa802b54baa7484215f530991055bb562efbed5b2f20a45e", "sha256:6c78645d400265a062508ae399b60b8c167bf003db364ecb26dcab2bda048253", - 
"sha256:bb4191dfc9306777bc594117aee052446b3fa88737cd13b7188d0e7aa8162185", - "sha256:4e0583d24c881e14342eaf4ec5fbc97f934b999a6828693a99157fde912540cc", - "sha256:129def1b7c1bf22faffd67b8f3724645203b79d8f4cc81f674654d9902cb4393", - "sha256:cb333c16912324fd5f769fff6bc5de372e9e7a202247b48870bc251ed40239aa", - "sha256:895f61ef02e8fed38159bb70f7e100e00f471eae2bc838cd0f4ebb21e28f8541", + "sha256:72a01f726a9c7851ca9bfad6fd09ca4e090a023c00945ea05ba1638c09dc3347", "sha256:74c1485f7707cf707a7aef42ef6322b8f97921bd89be2ab6317fd782c2d53183", + "sha256:895f61ef02e8fed38159bb70f7e100e00f471eae2bc838cd0f4ebb21e28f8541", + "sha256:8c1be557ee92a20f184922c7b6424e8ab6691788e6d86137c5d93c1a6ec1b8fb", + "sha256:bb4191dfc9306777bc594117aee052446b3fa88737cd13b7188d0e7aa8162185", "sha256:bfb51918d4ff3d77c1c856a9699f8492c612cde32fd3bcd344af9be34999bfdc", - "sha256:4465124ef1b18d9ace298060f4eccc64b0850899ac4ac53294547536533800c8", + "sha256:c20cfa2d49991c8b4147af39859b167664f2ad4561704ee74c1de03318e898db", + "sha256:cb333c16912324fd5f769fff6bc5de372e9e7a202247b48870bc251ed40239aa", + "sha256:d2d9808ea7b4af864f35ea216be506ecec180628aced0704e34aca0b040ffe46", + "sha256:d483ad4e639292c90170eb6f7783ad19490e7a8defb3e46f97dfe4bacae89122", + "sha256:dd5de0646207f053eb0d6c74ae45ba98c3395a571a2891858e87df7c9b9bd51b", "sha256:e1d4970ea66be07ae37a3c2e48b5ec63f7ba6804bdddfdbd3cfd954d25a82e63", - "sha256:0f5f5786c0e09baddcd8b4b45f20a7b5d61a7e7e99846e3c799b05c7c53fa696", - "sha256:72a01f726a9c7851ca9bfad6fd09ca4e090a023c00945ea05ba1638c09dc3347" + "sha256:e4fac90784481d221a8e4b1162afa7c47ed953be40d31ab4629ae917510051df", + "sha256:fa5ae20527d8e831e8230cbffd9f8fe952815b2b7dae6ffec25318803a7528fc", + "sha256:fd7f6999a8070df521b6384004ef42833b9bd62cfee11a09bda1079b4b704247", + "sha256:fdc842473cd33f45ff6bce46aea678a54e3d21f1b61a7750ce3c498eedfe25d6", + "sha256:fe69978f3f768926cfa37b867e3843918e012cf83f680806599ddce33c2c68b0" ], "index": "pypi", "version": "==5.4.1" @@ -550,6 +543,53 @@ "markers": 
"python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3, 3.4'", "version": "==3.5.3" }, + "regex": { + "hashes": [ + "sha256:01afaf2ec48e196ba91b37451aa353cb7eda77efe518e481707e0515025f0cd5", + "sha256:11d773d75fa650cd36f68d7ca936e3c7afaae41b863b8c387a22aaa78d3c5c79", + "sha256:18c071c3eb09c30a264879f0d310d37fe5d3a3111662438889ae2eb6fc570c31", + "sha256:1e1c20e29358165242928c2de1482fb2cf4ea54a6a6dea2bd7a0e0d8ee321500", + "sha256:281d2fd05555079448537fe108d79eb031b403dac622621c78944c235f3fcf11", + "sha256:314d66636c494ed9c148a42731b3834496cc9a2c4251b1661e40936814542b14", + "sha256:32e65442138b7b76dd8173ffa2cf67356b7bc1768851dded39a7a13bf9223da3", + "sha256:339456e7d8c06dd36a22e451d58ef72cef293112b559010db3d054d5560ef439", + "sha256:3916d08be28a1149fb97f7728fca1f7c15d309a9f9682d89d79db75d5e52091c", + "sha256:3a9cd17e6e5c7eb328517969e0cb0c3d31fd329298dd0c04af99ebf42e904f82", + "sha256:47bf5bf60cf04d72bf6055ae5927a0bd9016096bf3d742fa50d9bf9f45aa0711", + "sha256:4c46e22a0933dd783467cf32b3516299fb98cfebd895817d685130cc50cd1093", + "sha256:4c557a7b470908b1712fe27fb1ef20772b78079808c87d20a90d051660b1d69a", + "sha256:52ba3d3f9b942c49d7e4bc105bb28551c44065f139a65062ab7912bef10c9afb", + "sha256:563085e55b0d4fb8f746f6a335893bda5c2cef43b2f0258fe1020ab1dd874df8", + "sha256:598585c9f0af8374c28edd609eb291b5726d7cbce16be6a8b95aa074d252ee17", + "sha256:619d71c59a78b84d7f18891fe914446d07edd48dc8328c8e149cbe0929b4e000", + "sha256:67bdb9702427ceddc6ef3dc382455e90f785af4c13d495f9626861763ee13f9d", + "sha256:6d1b01031dedf2503631d0903cb563743f397ccaf6607a5e3b19a3d76fc10480", + "sha256:741a9647fcf2e45f3a1cf0e24f5e17febf3efe8d4ba1281dcc3aa0459ef424dc", + "sha256:7c2a1af393fcc09e898beba5dd59196edaa3116191cc7257f9224beaed3e1aa0", + "sha256:7d9884d86dd4dd489e981d94a65cd30d6f07203d90e98f6f657f05170f6324c9", + "sha256:90f11ff637fe8798933fb29f5ae1148c978cccb0452005bf4c69e13db951e765", + "sha256:919859aa909429fb5aa9cf8807f6045592c85ef56fdd30a9a3747e513db2536e", + 
"sha256:96fcd1888ab4d03adfc9303a7b3c0bd78c5412b2bfbe76db5b56d9eae004907a", + "sha256:97f29f57d5b84e73fbaf99ab3e26134e6687348e95ef6b48cfd2c06807005a07", + "sha256:980d7be47c84979d9136328d882f67ec5e50008681d94ecc8afa8a65ed1f4a6f", + "sha256:a91aa8619b23b79bcbeb37abe286f2f408d2f2d6f29a17237afda55bb54e7aac", + "sha256:ade17eb5d643b7fead300a1641e9f45401c98eee23763e9ed66a43f92f20b4a7", + "sha256:b9c3db21af35e3b3c05764461b262d6f05bbca08a71a7849fd79d47ba7bc33ed", + "sha256:bd28bc2e3a772acbb07787c6308e00d9626ff89e3bfcdebe87fa5afbfdedf968", + "sha256:bf5824bfac591ddb2c1f0a5f4ab72da28994548c708d2191e3b87dd207eb3ad7", + "sha256:c0502c0fadef0d23b128605d69b58edb2c681c25d44574fc673b0e52dce71ee2", + "sha256:c38c71df845e2aabb7fb0b920d11a1b5ac8526005e533a8920aea97efb8ec6a4", + "sha256:ce15b6d103daff8e9fee13cf7f0add05245a05d866e73926c358e871221eae87", + "sha256:d3029c340cfbb3ac0a71798100ccc13b97dddf373a4ae56b6a72cf70dfd53bc8", + "sha256:e512d8ef5ad7b898cdb2d8ee1cb09a8339e4f8be706d27eaa180c2f177248a10", + "sha256:e8e5b509d5c2ff12f8418006d5a90e9436766133b564db0abaec92fd27fcee29", + "sha256:ee54ff27bf0afaf4c3b3a62bcd016c12c3fdb4ec4f413391a90bd38bc3624605", + "sha256:fa4537fb4a98fe8fde99626e4681cc644bdcf2a795038533f9f711513a862ae6", + "sha256:fd45ff9293d9274c5008a2054ecef86a9bfe819a67c7be1afb65e69b405b3042" + ], + "index": "pypi", + "version": "==2021.4.4" + }, "sentry-sdk": { "hashes": [ "sha256:4ae8d1ced6c67f1c8ea51d82a16721c166c489b76876c9f2c202b8a50334b237", @@ -563,7 +603,7 @@ "sha256:30639c035cdb23534cd4aa2dd52c3bf48f06e5f4a941509c8bafd8ce11080259", "sha256:8b74bedcbbbaca38ff6d7491d76f2b06b3592611af620f8426e82dddb04a5ced" ], - "markers": "python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3'", + "markers": "python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2'", "version": "==1.15.0" }, "sortedcontainers": { @@ -784,11 +824,11 @@ }, "flake8-annotations": { "hashes": [ - "sha256:40a4d504cdf64126ea0bdca39edab1608bc6d515e96569b7e7c3c59c84f66c36", - 
"sha256:eabbfb2dd59ae0e9835f509f930e79cd99fa4ff1026fe6ca073503a57407037c" + "sha256:0d6cd2e770b5095f09689c9d84cc054c51b929c41a68969ea1beb4b825cac515", + "sha256:d10c4638231f8a50c0a597c4efce42bd7b7d85df4f620a0ddaca526138936a4f" ], "index": "pypi", - "version": "==2.6.1" + "version": "==2.6.2" }, "flake8-bugbear": { "hashes": [ @@ -846,11 +886,11 @@ }, "identify": { "hashes": [ - "sha256:1cfb05b578de996677836d5a2dde14b3dffde313cf7d2b3e793a0787a36e26dd", - "sha256:9cc5f58996cd359b7b72f0a5917d8639de5323917e6952a3bfbf36301b576f40" + "sha256:398cb92a7599da0b433c65301a1b62b9b1f4bb8248719b84736af6c0b22289d6", + "sha256:4537474817e0bbb8cea3e5b7504b7de6d44e3f169a90846cbc6adb0fc8294502" ], "markers": "python_full_version >= '3.6.1'", - "version": "==2.2.1" + "version": "==2.2.3" }, "idna": { "hashes": [ @@ -869,10 +909,10 @@ }, "nodeenv": { "hashes": [ - "sha256:5304d424c529c997bc888453aeaa6362d242b6b4631e90f3d4bf1b290f1c84a9", - "sha256:ab45090ae383b716c4ef89e690c41ff8c2b257b85b309f01f3654df3d084bd7c" + "sha256:3ef13ff90291ba2a4a7a4ff9a979b63ffdd00a464dbe04acf0ea6471517a4c2b", + "sha256:621e6b7076565ddcacd2db0294c0381e01fd28945ab36bcf00f41c5daf63bef7" ], - "version": "==1.5.0" + "version": "==1.6.0" }, "pep8-naming": { "hashes": [ @@ -884,11 +924,11 @@ }, "pre-commit": { "hashes": [ - "sha256:94c82f1bf5899d56edb1d926732f4e75a7df29a0c8c092559c77420c9d62428b", - "sha256:de55c5c72ce80d79106e48beb1b54104d16495ce7f95b0c7b13d4784193a00af" + "sha256:029d53cb83c241fe7d66eeee1e24db426f42c858f15a38d20bcefd8d8e05c9da", + "sha256:46b6ffbab37986c47d0a35e40906ae029376deed89a0eb2e446fb6e67b220427" ], "index": "pypi", - "version": "==2.11.1" + "version": "==2.12.0" }, "pycodestyle": { "hashes": [ @@ -916,35 +956,35 @@ }, "pyyaml": { "hashes": [ - "sha256:fd7f6999a8070df521b6384004ef42833b9bd62cfee11a09bda1079b4b704247", - "sha256:3bd0e463264cf257d1ffd2e40223b197271046d09dadf73a0fe82b9c1fc385a5", - "sha256:d483ad4e639292c90170eb6f7783ad19490e7a8defb3e46f97dfe4bacae89122", - 
"sha256:c20cfa2d49991c8b4147af39859b167664f2ad4561704ee74c1de03318e898db", - "sha256:e4fac90784481d221a8e4b1162afa7c47ed953be40d31ab4629ae917510051df", - "sha256:49d4cdd9065b9b6e206d0595fee27a96b5dd22618e7520c33204a4a3239d5b10", - "sha256:d2d9808ea7b4af864f35ea216be506ecec180628aced0704e34aca0b040ffe46", + "sha256:08682f6b72c722394747bddaf0aa62277e02557c0fd1c42cb853016a38f8dedf", + "sha256:0f5f5786c0e09baddcd8b4b45f20a7b5d61a7e7e99846e3c799b05c7c53fa696", + "sha256:129def1b7c1bf22faffd67b8f3724645203b79d8f4cc81f674654d9902cb4393", "sha256:294db365efa064d00b8d1ef65d8ea2c3426ac366c0c4368d930bf1c5fb497f77", - "sha256:fe69978f3f768926cfa37b867e3843918e012cf83f680806599ddce33c2c68b0", - "sha256:8c1be557ee92a20f184922c7b6424e8ab6691788e6d86137c5d93c1a6ec1b8fb", - "sha256:dd5de0646207f053eb0d6c74ae45ba98c3395a571a2891858e87df7c9b9bd51b", "sha256:3b2b1824fe7112845700f815ff6a489360226a5609b96ec2190a45e62a9fc922", - "sha256:607774cbba28732bfa802b54baa7484215f530991055bb562efbed5b2f20a45e", - "sha256:08682f6b72c722394747bddaf0aa62277e02557c0fd1c42cb853016a38f8dedf", + "sha256:3bd0e463264cf257d1ffd2e40223b197271046d09dadf73a0fe82b9c1fc385a5", + "sha256:4465124ef1b18d9ace298060f4eccc64b0850899ac4ac53294547536533800c8", + "sha256:49d4cdd9065b9b6e206d0595fee27a96b5dd22618e7520c33204a4a3239d5b10", + "sha256:4e0583d24c881e14342eaf4ec5fbc97f934b999a6828693a99157fde912540cc", "sha256:5accb17103e43963b80e6f837831f38d314a0495500067cb25afab2e8d7a4018", - "sha256:fa5ae20527d8e831e8230cbffd9f8fe952815b2b7dae6ffec25318803a7528fc", - "sha256:fdc842473cd33f45ff6bce46aea678a54e3d21f1b61a7750ce3c498eedfe25d6", + "sha256:607774cbba28732bfa802b54baa7484215f530991055bb562efbed5b2f20a45e", "sha256:6c78645d400265a062508ae399b60b8c167bf003db364ecb26dcab2bda048253", - "sha256:bb4191dfc9306777bc594117aee052446b3fa88737cd13b7188d0e7aa8162185", - "sha256:4e0583d24c881e14342eaf4ec5fbc97f934b999a6828693a99157fde912540cc", - "sha256:129def1b7c1bf22faffd67b8f3724645203b79d8f4cc81f674654d9902cb4393", - 
"sha256:cb333c16912324fd5f769fff6bc5de372e9e7a202247b48870bc251ed40239aa", - "sha256:895f61ef02e8fed38159bb70f7e100e00f471eae2bc838cd0f4ebb21e28f8541", + "sha256:72a01f726a9c7851ca9bfad6fd09ca4e090a023c00945ea05ba1638c09dc3347", "sha256:74c1485f7707cf707a7aef42ef6322b8f97921bd89be2ab6317fd782c2d53183", + "sha256:895f61ef02e8fed38159bb70f7e100e00f471eae2bc838cd0f4ebb21e28f8541", + "sha256:8c1be557ee92a20f184922c7b6424e8ab6691788e6d86137c5d93c1a6ec1b8fb", + "sha256:bb4191dfc9306777bc594117aee052446b3fa88737cd13b7188d0e7aa8162185", "sha256:bfb51918d4ff3d77c1c856a9699f8492c612cde32fd3bcd344af9be34999bfdc", - "sha256:4465124ef1b18d9ace298060f4eccc64b0850899ac4ac53294547536533800c8", + "sha256:c20cfa2d49991c8b4147af39859b167664f2ad4561704ee74c1de03318e898db", + "sha256:cb333c16912324fd5f769fff6bc5de372e9e7a202247b48870bc251ed40239aa", + "sha256:d2d9808ea7b4af864f35ea216be506ecec180628aced0704e34aca0b040ffe46", + "sha256:d483ad4e639292c90170eb6f7783ad19490e7a8defb3e46f97dfe4bacae89122", + "sha256:dd5de0646207f053eb0d6c74ae45ba98c3395a571a2891858e87df7c9b9bd51b", "sha256:e1d4970ea66be07ae37a3c2e48b5ec63f7ba6804bdddfdbd3cfd954d25a82e63", - "sha256:0f5f5786c0e09baddcd8b4b45f20a7b5d61a7e7e99846e3c799b05c7c53fa696", - "sha256:72a01f726a9c7851ca9bfad6fd09ca4e090a023c00945ea05ba1638c09dc3347" + "sha256:e4fac90784481d221a8e4b1162afa7c47ed953be40d31ab4629ae917510051df", + "sha256:fa5ae20527d8e831e8230cbffd9f8fe952815b2b7dae6ffec25318803a7528fc", + "sha256:fd7f6999a8070df521b6384004ef42833b9bd62cfee11a09bda1079b4b704247", + "sha256:fdc842473cd33f45ff6bce46aea678a54e3d21f1b61a7750ce3c498eedfe25d6", + "sha256:fe69978f3f768926cfa37b867e3843918e012cf83f680806599ddce33c2c68b0" ], "index": "pypi", "version": "==5.4.1" @@ -962,7 +1002,7 @@ "sha256:30639c035cdb23534cd4aa2dd52c3bf48f06e5f4a941509c8bafd8ce11080259", "sha256:8b74bedcbbbaca38ff6d7491d76f2b06b3592611af620f8426e82dddb04a5ced" ], - "markers": "python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3'", + "markers": 
"python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2'", "version": "==1.15.0" }, "snowballstemmer": { @@ -977,7 +1017,7 @@ "sha256:806143ae5bfb6a3c6e736a764057db0e6a0e05e338b5630894a5f779cabb4f9b", "sha256:b3bda1d108d5dd99f4a20d24d9c348e91c4db7ab1b749200bded2f839ccbe68f" ], - "markers": "python_version >= '2.6' and python_version not in '3.0, 3.1, 3.2, 3.3'", + "markers": "python_version >= '2.6' and python_version not in '3.0, 3.1, 3.2'", "version": "==0.10.2" }, "urllib3": { diff --git a/SECURITY.md b/SECURITY.md new file mode 100644 index 000000000..fa5a88a39 --- /dev/null +++ b/SECURITY.md @@ -0,0 +1,3 @@ +# Security Notice + +The Security Notice for Python Discord projects can be found [on our website](https://pydis.com/security.md). diff --git a/bot/bot.py b/bot/bot.py index 3a2af472d..914da9c98 100644 --- a/bot/bot.py +++ b/bot/bot.py @@ -111,7 +111,7 @@ class Bot(commands.Bot): loop = asyncio.get_event_loop() allowed_roles = [discord.Object(id_) for id_ in constants.MODERATION_ROLES] - intents = discord.Intents().all() + intents = discord.Intents.all() intents.presences = False intents.dm_typing = False intents.dm_reactions = False diff --git a/bot/constants.py b/bot/constants.py index 467a4a2c4..6d14bbb3a 100644 --- a/bot/constants.py +++ b/bot/constants.py @@ -388,6 +388,7 @@ class Categories(metaclass=YAMLGetter): help_available: int help_dormant: int help_in_use: int + moderators: int modmail: int voice: int @@ -402,7 +403,6 @@ class Channels(metaclass=YAMLGetter): python_events: int python_news: int reddit: int - user_event_announcements: int dev_contrib: int dev_core: int @@ -412,9 +412,9 @@ class Channels(metaclass=YAMLGetter): python_general: int cooldown: int + how_to_get_help: int attachment_log: int - dm_log: int message_log: int mod_log: int user_log: int @@ -435,9 +435,8 @@ class Channels(metaclass=YAMLGetter): helpers: int incidents: int incidents_archive: int - mods: int mod_alerts: int - mod_spam: int + nominations: int 
nomination_voting: int organisation: int @@ -466,7 +465,6 @@ class Webhooks(metaclass=YAMLGetter): big_brother: int dev_log: int - dm_log: int duck_pond: int incidents_archive: int reddit: int @@ -485,13 +483,16 @@ class Roles(metaclass=YAMLGetter): python_community: int sprinters: int voice_verified: int + video: int admins: int core_developers: int devops: int + domain_leads: int helpers: int moderators: int owners: int + project_leads: int jammers: int team_leaders: int @@ -593,7 +594,8 @@ class HelpChannels(metaclass=YAMLGetter): enable: bool claim_minutes: int cmd_whitelist: List[int] - idle_minutes: int + idle_minutes_claimant: int + idle_minutes_others: int deleted_idle_minutes: int max_available: int max_total_channels: int @@ -661,6 +663,12 @@ class Event(Enum): voice_state_update = "voice_state_update" +class VideoPermission(metaclass=YAMLGetter): + section = "video_permission" + + default_permission_duration: int + + # Debug mode DEBUG_MODE = 'local' in os.environ.get("SITE_URL", "local") diff --git a/bot/decorators.py b/bot/decorators.py index 02735d0dc..1d30317ef 100644 --- a/bot/decorators.py +++ b/bot/decorators.py @@ -1,4 +1,5 @@ import asyncio +import functools import logging import types import typing as t @@ -8,7 +9,7 @@ from discord import Member, NotFound from discord.ext import commands from discord.ext.commands import Cog, Context -from bot.constants import Channels, RedirectOutput +from bot.constants import Channels, DEBUG_MODE, RedirectOutput from bot.utils import function from bot.utils.checks import in_whitelist_check from bot.utils.function import command_wraps @@ -153,3 +154,23 @@ def respect_role_hierarchy(member_arg: function.Argument) -> t.Callable: await func(*args, **kwargs) return wrapper return decorator + + +def mock_in_debug(return_value: t.Any) -> t.Callable: + """ + Short-circuit function execution if in debug mode and return `return_value`. 
+
+    The original function name and the incoming args and kwargs are logged at DEBUG level
+    upon each call. This is useful for expensive operations, e.g. media asset uploads
+    that are prone to rate-limits but need to be tested extensively.
+    """
+    def decorator(func: t.Callable) -> t.Callable:
+        @functools.wraps(func)
+        async def wrapped(*args, **kwargs) -> t.Any:
+            """Short-circuit and log if in debug mode."""
+            if DEBUG_MODE:
+                log.debug(f"Function {func.__name__} called with args: {args}, kwargs: {kwargs}")
+                return return_value
+            return await func(*args, **kwargs)
+        return wrapped
+    return decorator
diff --git a/bot/errors.py b/bot/errors.py
index ab0adcd42..3544c6320 100644
--- a/bot/errors.py
+++ b/bot/errors.py
@@ -35,3 +35,9 @@ class InvalidInfractedUser(Exception):
         self.reason = reason
 
         super().__init__(reason)
+
+
+class BrandingMisconfiguration(RuntimeError):
+    """Raised by the Branding cog when a misconfigured event is encountered."""
+
+    pass
diff --git a/bot/exts/backend/branding/__init__.py b/bot/exts/backend/branding/__init__.py
index 81ea3bf49..20a747b7f 100644
--- a/bot/exts/backend/branding/__init__.py
+++ b/bot/exts/backend/branding/__init__.py
@@ -1,7 +1,7 @@
 from bot.bot import Bot
-from bot.exts.backend.branding._cog import BrandingManager
+from bot.exts.backend.branding._cog import Branding
 
 
 def setup(bot: Bot) -> None:
-    """Loads BrandingManager cog."""
-    bot.add_cog(BrandingManager(bot))
+    """Load Branding cog."""
+    bot.add_cog(Branding(bot))
diff --git a/bot/exts/backend/branding/_cog.py b/bot/exts/backend/branding/_cog.py
index 20df83a89..0a4ddcc88 100644
--- a/bot/exts/backend/branding/_cog.py
+++ b/bot/exts/backend/branding/_cog.py
@@ -1,566 +1,647 @@
 import asyncio
-import itertools
+import contextlib
 import logging
 import random
 import typing as t
 from datetime import datetime, time, timedelta
+from enum import Enum
+from operator import attrgetter
 
-import arrow
 import async_timeout
 import discord
 from async_rediscache import RedisCache
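Editor's aside on the `mock_in_debug` decorator added in this hunk: the standalone sketch below shows how it short-circuits a decorated coroutine when debug mode is on. `DEBUG_MODE`, the logger, and the `apply_asset` coroutine are stand-ins for the bot's real constants and cog method.

```python
import asyncio
import functools
import logging
import typing as t

# Stand-ins for `bot.constants.DEBUG_MODE` and the cog's module-level logger.
DEBUG_MODE = True
log = logging.getLogger(__name__)


def mock_in_debug(return_value: t.Any) -> t.Callable:
    """Short-circuit function execution if in debug mode and return `return_value`."""
    def decorator(func: t.Callable) -> t.Callable:
        @functools.wraps(func)
        async def wrapped(*args, **kwargs) -> t.Any:
            if DEBUG_MODE:
                log.debug(f"Function {func.__name__} called with args: {args}, kwargs: {kwargs}")
                return return_value
            return await func(*args, **kwargs)
        return wrapped
    return decorator


@mock_in_debug(return_value=True)
async def apply_asset(download_url: str) -> bool:
    # Never reached while DEBUG_MODE is True - the wrapper returns early.
    raise RuntimeError("Would have hit the Discord API!")


print(asyncio.run(apply_asset("https://example.invalid/banner.png")))  # True
```

In production (`DEBUG_MODE = False`) the wrapper simply awaits the original coroutine, so the rate-limited upload path is exercised only where it matters.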
-from discord.ext import commands
+from discord.ext import commands, tasks
 
 from bot.bot import Bot
-from bot.constants import Branding, Colours, Emojis, Guild, MODERATION_ROLES
-from bot.exts.backend.branding import _constants, _decorators, _errors, _seasons
+from bot.constants import Branding as BrandingConfig, Channels, Colours, Guild, MODERATION_ROLES
+from bot.decorators import mock_in_debug
+from bot.exts.backend.branding._repository import BrandingRepository, Event, RemoteObject
 
 log = logging.getLogger(__name__)
 
 
-class GitHubFile(t.NamedTuple):
+class AssetType(Enum):
     """
-    Represents a remote file on GitHub.
+    Recognised Discord guild asset types.
 
-    The `sha` hash is kept so that we can determine that a file has changed,
-    despite its filename remaining unchanged.
+    The value of each member corresponds exactly to a kwarg that can be passed to `Guild.edit`.
     """
 
-    download_url: str
-    path: str
-    sha: str
+    BANNER = "banner"
+    ICON = "icon"
 
 
-def pretty_files(files: t.Iterable[GitHubFile]) -> str:
-    """Provide a human-friendly representation of `files`."""
-    return "\n".join(file.path for file in files)
+def compound_hash(objects: t.Iterable[RemoteObject]) -> str:
+    """
+    Join SHA attributes of `objects` into a single string.
+
+    Compound hashes are cached to check for change in any of the member `objects`.
+    """
+    return "-".join(item.sha for item in objects)
+
+
+def make_embed(title: str, description: str, *, success: bool) -> discord.Embed:
+    """
+    Construct simple response embed.
+
+    If `success` is True, use green colour, otherwise red.
+
+    For both `title` and `description`, empty strings are valid values ~ fields will be empty.
+    """
+    colour = Colours.soft_green if success else Colours.soft_red
+    return discord.Embed(title=title[:256], description=description[:2048], colour=colour)
 
 
-def time_until_midnight() -> timedelta:
+def extract_event_duration(event: Event) -> str:
     """
-    Determine amount of time until the next-up UTC midnight.
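Editor's aside: the `compound_hash` helper introduced in this hunk is easy to illustrate in isolation. The `RemoteObject` dataclass below is a minimal stand-in for the repository abstraction's file record, which carries a `sha` attribute.

```python
from dataclasses import dataclass
from typing import Iterable


@dataclass
class RemoteObject:
    """Minimal stand-in for the branding repository's file record."""
    sha: str


def compound_hash(objects: Iterable[RemoteObject]) -> str:
    """Join SHA attributes of `objects` into a single string."""
    return "-".join(item.sha for item in objects)


icons = [RemoteObject(sha="a1b2"), RemoteObject(sha="c3d4")]
print(compound_hash(icons))  # a1b2-c3d4

# A change to any member SHA changes the compound hash, which is how the cog
# detects that the currently applied assets have gone stale.
print(compound_hash([RemoteObject(sha="a1b2"), RemoteObject(sha="ffff")]))  # a1b2-ffff
```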
+ Extract a human-readable, year-agnostic duration string from `event`. - The exact `midnight` moment is actually delayed to 5 seconds after, in order - to avoid potential problems due to imprecise sleep. + In the case that `event` is a fallback event, resolves to 'Fallback'. """ - now = datetime.utcnow() - tomorrow = now + timedelta(days=1) - midnight = datetime.combine(tomorrow, time(second=5)) + if event.meta.is_fallback: + return "Fallback" - return midnight - now + fmt = "%B %d" # Ex: August 23 + start_date = event.meta.start_date.strftime(fmt) + end_date = event.meta.end_date.strftime(fmt) + return f"{start_date} - {end_date}" -class BrandingManager(commands.Cog): + +def extract_event_name(event: Event) -> str: """ - Manages the guild's branding. - - The purpose of this cog is to help automate the synchronization of the branding - repository with the guild. It is capable of discovering assets in the repository - via GitHub's API, resolving download urls for them, and delegating - to the `bot` instance to upload them to the guild. - - BrandingManager is designed to be entirely autonomous. Its `daemon` background task awakens - once a day (see `time_until_midnight`) to detect new seasons, or to cycle icons within a single - season. The daemon can be turned on and off via the `daemon` cmd group. The value set via - its `start` and `stop` commands is persisted across sessions. If turned on, the daemon will - automatically start on the next bot start-up. Otherwise, it will wait to be started manually. - - All supported operations, e.g. 
setting seasons, applying the branding, or cycling icons, can - also be invoked manually, via the following API: - - branding list - - Show all available seasons - - branding set <season_name> - - Set the cog's internal state to represent `season_name`, if it exists - - If no `season_name` is given, set chronologically current season - - This will not automatically apply the season's branding to the guild, - the cog's state can be detached from the guild - - Seasons can therefore be 'previewed' using this command - - branding info - - View detailed information about resolved assets for current season - - branding refresh - - Refresh internal state, i.e. synchronize with branding repository - - branding apply - - Apply the current internal state to the guild, i.e. upload the assets - - branding cycle - - If there are multiple available icons for current season, randomly pick - and apply the next one - - The daemon calls these methods autonomously as appropriate. The use of this cog - is locked to moderation roles. As it performs media asset uploads, it is prone to - rate-limits - the `apply` command should be used with caution. The `set` command can, - however, be used freely to 'preview' seasonal branding and check whether paths have been - resolved as appropriate. - - While the bot is in debug mode, it will 'mock' asset uploads by logging the passed - download urls and pretending that the upload was successful. Make use of this - to test this cog's behaviour. + Extract title-cased event name from the path of `event`. + + An event with a path of 'events/black_history_month' will resolve to 'Black History Month'. """ + name = event.path.split("/")[-1] # Inner-most directory name. + words = name.split("_") # Words from snake case. 
- current_season: t.Type[_seasons.SeasonBase] + return " ".join(word.title() for word in words) - banner: t.Optional[GitHubFile] - available_icons: t.List[GitHubFile] - remaining_icons: t.List[GitHubFile] +class Branding(commands.Cog): + """ + Guild branding management. - days_since_cycle: t.Iterator + Extension responsible for automatic synchronisation of the guild's branding with the branding repository. + Event definitions and assets are automatically discovered and applied as appropriate. - daemon: t.Optional[asyncio.Task] + All state is stored in Redis. The cog should therefore seamlessly transition across restarts and maintain + a consistent icon rotation schedule for events with multiple icon assets. - # Branding configuration - branding_configuration = RedisCache() + By caching hashes of banner & icon assets, we discover changes in currently applied assets and always keep + the latest version applied. + + The command interface allows moderators+ to control the daemon or request asset synchronisation, while + regular users can see information about the current event and the overall event schedule. + """ + + # RedisCache[ + # "daemon_active": bool | If True, daemon starts on start-up. Controlled via commands. + # "event_path": str | Current event's path in the branding repo. + # "event_description": str | Current event's Markdown description. + # "event_duration": str | Current event's human-readable date range. + # "banner_hash": str | SHA of the currently applied banner. + # "icons_hash": str | Compound SHA of all icons in current rotation. + # "last_rotation_timestamp": float | POSIX UTC timestamp. + # ] + cache_information = RedisCache() + + # Icons in current rotation. Keys (str) are download URLs, values (int) track the amount of times each + # icon has been used in the current rotation. + cache_icons = RedisCache() + + # All available event names & durations. Cached by the daemon nightly; read by the calendar command. 
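Editor's aside: the two event helpers added above (`extract_event_name` and `extract_event_duration`) can be sketched outside the cog. The signatures below are simplified stand-ins — the real functions take an `Event` and read `event.path` / `event.meta` — but the string handling is the same.

```python
from datetime import date


def extract_event_name(path: str) -> str:
    """Title-case the inner-most directory name of an event path."""
    name = path.split("/")[-1]  # Inner-most directory name.
    words = name.split("_")     # Words from snake case.
    return " ".join(word.title() for word in words)


def extract_event_duration(start_date: date, end_date: date, is_fallback: bool) -> str:
    """Render a year-agnostic date range, or 'Fallback' for the fallback event."""
    if is_fallback:
        return "Fallback"
    fmt = "%B %d"  # Ex: August 23
    return f"{start_date.strftime(fmt)} - {end_date.strftime(fmt)}"


print(extract_event_name("events/black_history_month"))  # Black History Month
print(extract_event_duration(date(2021, 10, 1), date(2021, 10, 31), False))  # October 01 - October 31
```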
+ cache_events = RedisCache() def __init__(self, bot: Bot) -> None: + """Instantiate repository abstraction & allow daemon to start.""" + self.bot = bot + self.repository = BrandingRepository(bot) + + self.bot.loop.create_task(self.maybe_start_daemon()) # Start depending on cache. + + # region: Internal logic & state management + + @mock_in_debug(return_value=True) # Mocked in development environment to prevent API spam. + async def apply_asset(self, asset_type: AssetType, download_url: str) -> bool: """ - Assign safe default values on init. + Download asset from `download_url` and apply it to PyDis as `asset_type`. - At this point, we don't have information about currently available branding. - Most of these attributes will be overwritten once the daemon connects, or once - the `refresh` command is used. + Return a boolean indicating whether the application was successful. """ - self.bot = bot - self.current_season = _seasons.get_current_season() + log.info(f"Applying '{asset_type.value}' asset to the guild.") - self.banner = None + try: + file = await self.repository.fetch_file(download_url) + except Exception: + log.exception(f"Failed to fetch '{asset_type.value}' asset.") + return False - self.available_icons = [] - self.remaining_icons = [] + await self.bot.wait_until_guild_available() + pydis: discord.Guild = self.bot.get_guild(Guild.id) - self.days_since_cycle = itertools.cycle([None]) + timeout = 10 # Seconds. + try: + with async_timeout.timeout(timeout): # Raise after `timeout` seconds. 
+ await pydis.edit(**{asset_type.value: file}) + except discord.HTTPException: + log.exception("Asset upload to Discord failed.") + return False + except asyncio.TimeoutError: + log.error(f"Asset upload to Discord timed out after {timeout} seconds.") + return False + else: + log.trace("Asset uploaded successfully.") + return True - self.daemon = None - self._startup_task = self.bot.loop.create_task(self._initial_start_daemon()) + async def apply_banner(self, banner: RemoteObject) -> bool: + """ + Apply `banner` to the guild and cache its hash if successful. + + Banners should always be applied via this method to ensure that the last hash is cached. + + Return a boolean indicating whether the application was successful. + """ + success = await self.apply_asset(AssetType.BANNER, banner.download_url) - async def _initial_start_daemon(self) -> None: - """Checks is daemon active and when is, start it at cog load.""" - if await self.branding_configuration.get("daemon_active"): - self.daemon = self.bot.loop.create_task(self._daemon_func()) + if success: + await self.cache_information.set("banner_hash", banner.sha) - @property - def _daemon_running(self) -> bool: - """True if the daemon is currently active, False otherwise.""" - return self.daemon is not None and not self.daemon.done() + return success - async def _daemon_func(self) -> None: + async def rotate_icons(self) -> bool: """ - Manage all automated behaviour of the BrandingManager cog. + Choose and apply the next-up icon in rotation. + + We keep track of the amount of times each icon has been used. The values in `cache_icons` can be understood + to be iteration IDs. When an icon is chosen & applied, we bump its count, pushing it into the next iteration. 
- Once a day, the daemon will perform the following tasks: - - Update `current_season` - - Poll GitHub API to see if the available branding for `current_season` has changed - - Update assets if changes are detected (banner, guild icon, bot avatar, bot nickname) - - Check whether it's time to cycle guild icons + Once the current iteration (lowest count in the cache) depletes, we move onto the next iteration. - The internal loop runs once when activated, then periodically at the time - given by `time_until_midnight`. + In the case that there is only 1 icon in the rotation and has already been applied, do nothing. - All method calls in the internal loop are considered safe, i.e. no errors propagate - to the daemon's loop. The daemon itself does not perform any error handling on its own. + Return a boolean indicating whether a new icon was applied successfully. """ - await self.bot.wait_until_guild_available() + log.debug("Rotating icons.") - while True: - self.current_season = _seasons.get_current_season() - branding_changed = await self.refresh() + state = await self.cache_icons.to_dict() + log.trace(f"Total icons in rotation: {len(state)}.") - if branding_changed: - await self.apply() + if not state: # This would only happen if rotation not initiated, but we can handle gracefully. + log.warning("Attempted icon rotation with an empty icon cache. This indicates wrong logic.") + return False - elif next(self.days_since_cycle) == Branding.cycle_frequency: - await self.cycle() + if len(state) == 1 and 1 in state.values(): + log.debug("Aborting icon rotation: only 1 icon is available and has already been applied.") + return False - until_midnight = time_until_midnight() - await asyncio.sleep(until_midnight.total_seconds()) + current_iteration = min(state.values()) # Choose iteration to draw from. 
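Editor's aside: the iteration-ID scheme in `rotate_icons` can be illustrated with a plain dict standing in for `cache_icons`. Icons with the lowest use count form the current iteration; applying one bumps its count into the next iteration. The icon URLs below are hypothetical.

```python
import random

# Mirrors `cache_icons`: download URL -> times used in the current rotation.
state = {
    "icons/festive_1.png": 1,
    "icons/festive_2.png": 0,
    "icons/festive_3.png": 0,
}

current_iteration = min(state.values())  # Lowest use count marks the current iteration.
options = [url for url, used in state.items() if used == current_iteration]

print(sorted(options))  # ['icons/festive_2.png', 'icons/festive_3.png']

# Applying an icon pushes it into the next iteration; once every icon has been
# used, the minimum count rises and the rotation starts over.
chosen = random.choice(options)
state[chosen] += 1
```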
+ options = [download_url for download_url, times_used in state.items() if times_used == current_iteration] - async def _info_embed(self) -> discord.Embed: - """Make an informative embed representing current season.""" - info_embed = discord.Embed(description=self.current_season.description, colour=self.current_season.colour) + log.trace(f"Choosing from {len(options)} icons in iteration {current_iteration}.") + next_icon = random.choice(options) - # If we're in a non-evergreen season, also show active months - if self.current_season is not _seasons.SeasonBase: - title = f"{self.current_season.season_name} ({', '.join(str(m) for m in self.current_season.months)})" - else: - title = self.current_season.season_name + success = await self.apply_asset(AssetType.ICON, next_icon) + + if success: + await self.cache_icons.increment(next_icon) # Push the icon into the next iteration. + + timestamp = datetime.utcnow().timestamp() + await self.cache_information.set("last_rotation_timestamp", timestamp) + + return success - # Use the author field to show the season's name and avatar if available - info_embed.set_author(name=title) + async def maybe_rotate_icons(self) -> None: + """ + Call `rotate_icons` if the configured amount of time has passed since last rotation. - banner = self.banner.path if self.banner is not None else "Unavailable" - info_embed.add_field(name="Banner", value=banner, inline=False) + We offset the calculated time difference into the future to avoid off-by-a-little-bit errors. Because there + is work to be done before the timestamp is read and written, the next read will likely commence slightly + under 24 hours after the last write. 
+ """ + log.debug("Checking whether it's time for icons to rotate.") - icons = pretty_files(self.available_icons) or "Unavailable" - info_embed.add_field(name="Available icons", value=icons, inline=False) + last_rotation_timestamp = await self.cache_information.get("last_rotation_timestamp") - # Only display cycle frequency if we're actually cycling - if len(self.available_icons) > 1 and Branding.cycle_frequency: - info_embed.set_footer(text=f"Icon cycle frequency: {Branding.cycle_frequency}") + if last_rotation_timestamp is None: # Maiden case ~ never rotated. + await self.rotate_icons() + return - return info_embed + last_rotation = datetime.fromtimestamp(last_rotation_timestamp) + difference = (datetime.utcnow() - last_rotation) + timedelta(minutes=5) - async def _reset_remaining_icons(self) -> None: - """Set `remaining_icons` to a shuffled copy of `available_icons`.""" - self.remaining_icons = random.sample(self.available_icons, k=len(self.available_icons)) + log.trace(f"Icons last rotated at {last_rotation} (difference: {difference}).") - async def _reset_days_since_cycle(self) -> None: + if difference.days >= BrandingConfig.cycle_frequency: + await self.rotate_icons() + + async def initiate_icon_rotation(self, available_icons: t.List[RemoteObject]) -> None: """ - Reset the `days_since_cycle` iterator based on configured frequency. + Set up a new icon rotation. - If the current season only has 1 icon, or if `Branding.cycle_frequency` is falsey, - the iterator will always yield None. This signals that the icon shouldn't be cycled. + This function should be called whenever available icons change. This is generally the case when we enter + a new event, but potentially also when the assets of an on-going event change. In such cases, a reset + of `cache_icons` is necessary, because it contains download URLs which may have gotten stale. - Otherwise, it will yield ints in range [1, `Branding.cycle_frequency`] indefinitely. 
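Editor's aside: the timing check in `maybe_rotate_icons` can be sketched as a pure function. `CYCLE_FREQUENCY` stands in for `BrandingConfig.cycle_frequency`; the 5-minute offset is the off-by-a-little-bit guard described in the docstring above.

```python
from datetime import datetime, timedelta

CYCLE_FREQUENCY = 1  # Days between icon rotations; stand-in for `BrandingConfig.cycle_frequency`.


def rotation_due(last_rotation: datetime, now: datetime) -> bool:
    # Offset the difference 5 minutes into the future, so a check made slightly
    # under a full day after the last write still counts as due.
    difference = (now - last_rotation) + timedelta(minutes=5)
    return difference.days >= CYCLE_FREQUENCY


last = datetime(2021, 4, 1, 0, 0)
print(rotation_due(last, datetime(2021, 4, 1, 23, 58)))  # True ~ inside the 5-minute grace window
print(rotation_due(last, datetime(2021, 4, 1, 12, 0)))   # False ~ only half a day has passed
```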
- When the iterator yields a value equal to `Branding.cycle_frequency`, it is time to cycle. + This function does not upload a new icon! """ - if len(self.available_icons) > 1 and Branding.cycle_frequency: - sequence = range(1, Branding.cycle_frequency + 1) - else: - sequence = [None] + log.debug("Initiating new icon rotation.") - self.days_since_cycle = itertools.cycle(sequence) + await self.cache_icons.clear() - async def _get_files(self, path: str, include_dirs: bool = False) -> t.Dict[str, GitHubFile]: + new_state = {icon.download_url: 0 for icon in available_icons} + await self.cache_icons.update(new_state) + + log.trace(f"Icon rotation initiated for {len(new_state)} icons.") + + await self.cache_information.set("icons_hash", compound_hash(available_icons)) + + async def send_info_embed(self, channel_id: int, *, is_notification: bool) -> None: """ - Get files at `path` in the branding repository. + Send the currently cached event description to `channel_id`. - If `include_dirs` is False (default), only returns files at `path`. - Otherwise, will return both files and directories. Never returns symlinks. + When `is_notification` holds, a short contextual message for the #changelog channel is added. - Return dict mapping from filename to corresponding `GitHubFile` instance. - This may return an empty dict if the response status is non-200, - or if the target directory is empty. + We read event information from `cache_information`. The caller is therefore responsible for making + sure that the cache is up-to-date before calling this function. 
""" - url = f"{_constants.BRANDING_URL}/{path}" - async with self.bot.http_session.get( - url, headers=_constants.HEADERS, params=_constants.PARAMS - ) as resp: - # Short-circuit if we get non-200 response - if resp.status != _constants.STATUS_OK: - log.error(f"GitHub API returned non-200 response: {resp}") - return {} - directory = await resp.json() # Directory at `path` + log.debug(f"Sending event information event to channel: {channel_id} ({is_notification=}).") - allowed_types = {"file", "dir"} if include_dirs else {"file"} - return { - file["name"]: GitHubFile(file["download_url"], file["path"], file["sha"]) - for file in directory - if file["type"] in allowed_types - } + await self.bot.wait_until_guild_available() + channel: t.Optional[discord.TextChannel] = self.bot.get_channel(channel_id) - async def refresh(self) -> bool: + if channel is None: + log.warning(f"Cannot send event information: channel {channel_id} not found!") + return + + log.trace(f"Destination channel: #{channel.name}.") + + description = await self.cache_information.get("event_description") + duration = await self.cache_information.get("event_duration") + + if None in (description, duration): + content = None + embed = make_embed("No event in cache", "Is the daemon enabled?", success=False) + + else: + content = "Python Discord is entering a new event!" if is_notification else None + embed = discord.Embed(description=description[:2048], colour=discord.Colour.blurple()) + embed.set_footer(text=duration[:2048]) + + await channel.send(content=content, embed=embed) + + async def enter_event(self, event: Event) -> t.Tuple[bool, bool]: """ - Synchronize available assets with branding repository. + Apply `event` assets and update information cache. - If the current season is not the evergreen, and lacks at least one asset, - we use the evergreen seasonal dir as fallback for missing assets. 
+ We cache `event` information to ensure that we: + * Remember which event we're currently in across restarts + * Provide an on-demand informational embed without re-querying the branding repository - Finally, if neither the seasonal nor fallback branding directories contain - an asset, it will simply be ignored. + An event change should always be handled via this function, as it ensures that the cache is populated. - Return True if the branding has changed. This will be the case when we enter - a new season, or when something changes in the current seasons's directory - in the branding repository. + The #changelog notification is omitted when `event` is fallback, or already applied. + + Return a 2-tuple indicating whether the banner, and the icon, were applied successfully. """ - old_branding = (self.banner, self.available_icons) - seasonal_dir = await self._get_files(self.current_season.branding_path, include_dirs=True) + log.info(f"Entering event: '{event.path}'.") - # Only make a call to the fallback directory if there is something to be gained - branding_incomplete = any( - asset not in seasonal_dir - for asset in (_constants.FILE_BANNER, _constants.FILE_AVATAR, _constants.SERVER_ICONS) - ) - if branding_incomplete and self.current_season is not _seasons.SeasonBase: - fallback_dir = await self._get_files( - _seasons.SeasonBase.branding_path, include_dirs=True - ) - else: - fallback_dir = {} + banner_success = await self.apply_banner(event.banner) # Only one asset ~ apply directly. - # Resolve assets in this directory, None is a safe value - self.banner = ( - seasonal_dir.get(_constants.FILE_BANNER) - or fallback_dir.get(_constants.FILE_BANNER) - ) + await self.initiate_icon_rotation(event.icons) # Prepare a new rotation. + icon_success = await self.rotate_icons() # Apply an icon from the new rotation. 
- # Now resolve server icons by making a call to the proper sub-directory - if _constants.SERVER_ICONS in seasonal_dir: - icons_dir = await self._get_files( - f"{self.current_season.branding_path}/{_constants.SERVER_ICONS}" - ) - self.available_icons = list(icons_dir.values()) + # This will only be False in the case of a manual same-event re-synchronisation. + event_changed = event.path != await self.cache_information.get("event_path") - elif _constants.SERVER_ICONS in fallback_dir: - icons_dir = await self._get_files( - f"{_seasons.SeasonBase.branding_path}/{_constants.SERVER_ICONS}" - ) - self.available_icons = list(icons_dir.values()) + # Cache event identity to avoid re-entry in case of restart. + await self.cache_information.set("event_path", event.path) + # Cache information shown in the 'about' embed. + await self.populate_cache_event_description(event) + + # Notify guild of new event ~ this reads the information that we cached above. + if event_changed and not event.meta.is_fallback: + await self.send_info_embed(Channels.change_log, is_notification=True) else: - self.available_icons = [] # This should never be the case, but an empty list is a safe value + log.trace("Omitting #changelog notification. Event has not changed, or new event is fallback.") - # GitHubFile instances carry a `sha` attr so this will pick up if a file changes - branding_changed = old_branding != (self.banner, self.available_icons) + return banner_success, icon_success - if branding_changed: - log.info(f"New branding detected (season: {self.current_season.season_name})") - await self._reset_remaining_icons() - await self._reset_days_since_cycle() + async def synchronise(self) -> t.Tuple[bool, bool]: + """ + Fetch the current event and delegate to `enter_event`. - return branding_changed + This is a convenience function to force synchronisation via a command. It should generally only be used + in a recovery scenario. 
In the usual case, the daemon already has an `Event` instance and can pass it + to `enter_event` directly. - async def cycle(self) -> bool: + Return a 2-tuple indicating whether the banner, and the icon, were applied successfully. """ - Apply the next-up server icon. + log.debug("Synchronise: fetching current event.") - Returns True if an icon is available and successfully gets applied, False otherwise. - """ - if not self.available_icons: - log.info("Cannot cycle: no icons for this season") - return False + current_event, available_events = await self.repository.get_current_event() - if not self.remaining_icons: - log.info("Reset & shuffle remaining icons") - await self._reset_remaining_icons() + await self.populate_cache_events(available_events) - next_up = self.remaining_icons.pop(0) - success = await self.set_icon(next_up.download_url) + if current_event is None: + log.error("Failed to fetch event. Cannot synchronise!") + return False, False - return success + return await self.enter_event(current_event) - async def apply(self) -> t.List[str]: + async def populate_cache_events(self, events: t.List[Event]) -> None: """ - Apply current branding to the guild and bot. - - This delegates to the bot instance to do all the work. We only provide download urls - for available assets. Assets unavailable in the branding repo will be ignored. + Clear `cache_events` and re-populate with names and durations of `events`. - Returns a list of names of all failed assets. An asset is considered failed - if it isn't found in the branding repo, or if something goes wrong while the - bot is trying to apply it. + For each event, we store its name and duration string. This is the information presented to users in the + calendar command. If a format change is needed, it has to be done here. - An empty list denotes that all assets have been applied successfully. + The cache does not store the fallback event, as it is not shown in the calendar. 
""" - report = {asset: False for asset in ("banner", "icon")} + log.debug("Populating events cache.") - if self.banner is not None: - report["banner"] = await self.set_banner(self.banner.download_url) + await self.cache_events.clear() - report["icon"] = await self.cycle() + no_fallback = [event for event in events if not event.meta.is_fallback] + chronological_events = sorted(no_fallback, key=attrgetter("meta.start_date")) - failed_assets = [asset for asset, succeeded in report.items() if not succeeded] - return failed_assets + log.trace(f"Writing {len(chronological_events)} events (fallback omitted).") - @commands.has_any_role(*MODERATION_ROLES) - @commands.group(name="branding") - async def branding_cmds(self, ctx: commands.Context) -> None: - """Manual branding control.""" - if not ctx.invoked_subcommand: - await ctx.send_help(ctx.command) + with contextlib.suppress(ValueError): # Cache raises when updated with an empty dict. + await self.cache_events.update({ + extract_event_name(event): extract_event_duration(event) + for event in chronological_events + }) - @branding_cmds.command(name="list", aliases=["ls"]) - async def branding_list(self, ctx: commands.Context) -> None: - """List all available seasons and branding sources.""" - embed = discord.Embed(title="Available seasons", colour=Colours.soft_green) + async def populate_cache_event_description(self, event: Event) -> None: + """ + Cache `event` description & duration. - for season in _seasons.get_all_seasons(): - if season is _seasons.SeasonBase: - active_when = "always" - else: - active_when = f"in {', '.join(str(m) for m in season.months)}" + This should be called when entering a new event, and can be called periodically to ensure that the cache + holds fresh information in the case that the event remains the same, but its description changes. 
- description = ( - f"Active {active_when}\n" - f"Branding: {season.branding_path}" - ) - embed.add_field(name=season.season_name, value=description, inline=False) + The duration is stored formatted for the frontend. It is not intended to be used programmatically. + """ + log.debug("Caching event description & duration.") - await ctx.send(embed=embed) + await self.cache_information.set("event_description", event.meta.description) + await self.cache_information.set("event_duration", extract_event_duration(event)) - @branding_cmds.command(name="set") - async def branding_set(self, ctx: commands.Context, *, season_name: t.Optional[str] = None) -> None: + # endregion + # region: Daemon + + async def maybe_start_daemon(self) -> None: """ - Manually set season, or reset to current if none given. + Start the daemon depending on cache state. - Season search is a case-less comparison against both seasonal class name, - and its `season_name` attr. + The daemon will only start if it has been explicitly enabled via a command. + """ + log.debug("Checking whether daemon should start.") - This only pre-loads the cog's internal state to the chosen season, but does not - automatically apply the branding. As that is an expensive operation, the `apply` - command must be called explicitly after this command finishes. + should_begin: t.Optional[bool] = await self.cache_information.get("daemon_active") # None if never set! - This means that this command can be used to 'preview' a season gathering info - about its available assets, without applying them to the guild. + if should_begin: + self.daemon_loop.start() - If the daemon is running, it will automatically reset the season to current when - it wakes up. The season set via this command can therefore remain 'detached' from - what it should be - the daemon will make sure that it's set back properly. 
+    def cog_unload(self) -> None:
         """
-        if season_name is None:
-            new_season = _seasons.get_current_season()
-        else:
-            new_season = _seasons.get_season(season_name)
-            if new_season is None:
-                raise _errors.BrandingError("No such season exists")
+        Cancel the daemon in case of cog unload.

-        if self.current_season is new_season:
-            raise _errors.BrandingError(f"Season {self.current_season.season_name} already active")
+        This is **not** done automatically! The daemon otherwise remains active in the background.
+        """
+        log.debug("Cog unload: cancelling daemon.")

-        self.current_season = new_season
-        await self.branding_refresh(ctx)
+        self.daemon_loop.cancel()

-    @branding_cmds.command(name="info", aliases=["status"])
-    async def branding_info(self, ctx: commands.Context) -> None:
+    async def daemon_main(self) -> None:
         """
-        Show available assets for current season.
+        Synchronise guild & caches with branding repository.

-        This can be used to confirm that assets have been resolved properly.
-        When `apply` is used, it attempts to upload exactly the assets listed here.
+        Pull the currently active event from the branding repository and check whether it matches the currently
+        active event in the cache. If not, apply the new event.
+
+        However, it is also possible that an event's assets change while it is active. To account for such cases,
+        we check the banner & icon hashes against the currently cached values. If there is a mismatch, each
+        specific asset is re-applied.
""" - await ctx.send(embed=await self._info_embed()) + log.info("Daemon main: checking current event.") - @branding_cmds.command(name="refresh") - async def branding_refresh(self, ctx: commands.Context) -> None: - """Sync currently available assets with branding repository.""" - async with ctx.typing(): - await self.refresh() - await self.branding_info(ctx) + new_event, available_events = await self.repository.get_current_event() + + await self.populate_cache_events(available_events) + + if new_event is None: + log.warning("Daemon main: failed to get current event from branding repository, will do nothing.") + return + + if new_event.path != await self.cache_information.get("event_path"): + log.debug("Daemon main: new event detected!") + await self.enter_event(new_event) + return + + await self.populate_cache_event_description(new_event) # Cache fresh frontend info in case of change. - @branding_cmds.command(name="apply") - async def branding_apply(self, ctx: commands.Context) -> None: + log.trace("Daemon main: event has not changed, checking for change in assets.") + + if new_event.banner.sha != await self.cache_information.get("banner_hash"): + log.debug("Daemon main: detected banner change.") + await self.apply_banner(new_event.banner) + + if compound_hash(new_event.icons) != await self.cache_information.get("icons_hash"): + log.debug("Daemon main: detected icon change.") + await self.initiate_icon_rotation(new_event.icons) + await self.rotate_icons() + else: + await self.maybe_rotate_icons() + + @tasks.loop(hours=24) + async def daemon_loop(self) -> None: """ - Apply current season's branding to the guild. + Call `daemon_main` every 24 hours. - Use `info` to check which assets will be applied. Shows which assets have - failed to be applied, if any. + The scheduler maintains an exact 24-hour frequency even if this coroutine takes time to complete. If the + coroutine is started at 00:01 and completes at 00:05, it will still be started at 00:01 the next day. 
""" - async with ctx.typing(): - failed_assets = await self.apply() - if failed_assets: - raise _errors.BrandingError( - f"Failed to apply following assets: {', '.join(failed_assets)}" - ) + log.trace("Daemon loop: calling daemon main.") - response = discord.Embed(description=f"All assets applied {Emojis.ok_hand}", colour=Colours.soft_green) - await ctx.send(embed=response) + try: + await self.daemon_main() + except Exception: + log.exception("Daemon loop: failed with an unhandled exception!") - @branding_cmds.command(name="cycle") - async def branding_cycle(self, ctx: commands.Context) -> None: + @daemon_loop.before_loop + async def daemon_before(self) -> None: """ - Apply the next-up guild icon, if multiple are available. + Call `daemon_loop` immediately, then block the loop until the next-up UTC midnight. - The order is random. + The first iteration is invoked directly such that synchronisation happens immediately after daemon start. + We then calculate the time until the next-up midnight and sleep before letting `daemon_loop` begin. """ - async with ctx.typing(): - success = await self.cycle() - if not success: - raise _errors.BrandingError("Failed to cycle icon") + log.trace("Daemon before: performing start-up iteration.") + + await self.daemon_loop() + + log.trace("Daemon before: calculating time to sleep before loop begins.") + now = datetime.utcnow() - response = discord.Embed(description=f"Success {Emojis.ok_hand}", colour=Colours.soft_green) - await ctx.send(embed=response) + # The actual midnight moment is offset into the future to prevent issues with imprecise sleep. 
+ tomorrow = now + timedelta(days=1) + midnight = datetime.combine(tomorrow, time(minute=1)) - @branding_cmds.group(name="daemon", aliases=["d", "task"]) - async def daemon_group(self, ctx: commands.Context) -> None: - """Control the background daemon.""" + sleep_secs = (midnight - now).total_seconds() + log.trace(f"Daemon before: sleeping {sleep_secs} seconds before next-up midnight: {midnight}.") + + await asyncio.sleep(sleep_secs) + + # endregion + # region: Command interface (branding) + + @commands.group(name="branding") + async def branding_group(self, ctx: commands.Context) -> None: + """Control the branding cog.""" if not ctx.invoked_subcommand: await ctx.send_help(ctx.command) - @daemon_group.command(name="status") - async def daemon_status(self, ctx: commands.Context) -> None: - """Check whether daemon is currently active.""" - if self._daemon_running: - remaining_time = (arrow.utcnow() + time_until_midnight()).humanize() - response = discord.Embed(description=f"Daemon running {Emojis.ok_hand}", colour=Colours.soft_green) - response.set_footer(text=f"Next refresh {remaining_time}") - else: - response = discord.Embed(description="Daemon not running", colour=Colours.soft_red) + @branding_group.command(name="about", aliases=("current", "event")) + async def branding_about_cmd(self, ctx: commands.Context) -> None: + """Show the current event's description and duration.""" + await self.send_info_embed(ctx.channel.id, is_notification=False) - await ctx.send(embed=response) + @commands.has_any_role(*MODERATION_ROLES) + @branding_group.command(name="sync") + async def branding_sync_cmd(self, ctx: commands.Context) -> None: + """ + Force branding synchronisation. - @daemon_group.command(name="start") - async def daemon_start(self, ctx: commands.Context) -> None: - """If the daemon isn't running, start it.""" - if self._daemon_running: - raise _errors.BrandingError("Daemon already running!") + Show which assets have failed to synchronise, if any. 
+ """ + async with ctx.typing(): + banner_success, icon_success = await self.synchronise() - self.daemon = self.bot.loop.create_task(self._daemon_func()) - await self.branding_configuration.set("daemon_active", True) + failed_assets = ", ".join( + name + for name, status in [("banner", banner_success), ("icon", icon_success)] + if status is False + ) - response = discord.Embed(description=f"Daemon started {Emojis.ok_hand}", colour=Colours.soft_green) - await ctx.send(embed=response) + if failed_assets: + resp = make_embed("Synchronisation unsuccessful", f"Failed to apply: {failed_assets}.", success=False) + resp.set_footer(text="Check log for details.") + else: + resp = make_embed("Synchronisation successful", "Assets have been applied.", success=True) - @daemon_group.command(name="stop") - async def daemon_stop(self, ctx: commands.Context) -> None: - """If the daemon is running, stop it.""" - if not self._daemon_running: - raise _errors.BrandingError("Daemon not running!") + await ctx.send(embed=resp) - self.daemon.cancel() - await self.branding_configuration.set("daemon_active", False) + # endregion + # region: Command interface (branding calendar) + + @branding_group.group(name="calendar", aliases=("schedule", "events")) + async def branding_calendar_group(self, ctx: commands.Context) -> None: + """ + Show the current event calendar. - response = discord.Embed(description=f"Daemon stopped {Emojis.ok_hand}", colour=Colours.soft_green) - await ctx.send(embed=response) + We draw event information from `cache_events` and use each key-value pair to create a field in the response + embed. As such, we do not need to query the API to get event information. The cache is automatically + re-populated by the daemon whenever it makes a request. A moderator+ can also explicitly request a cache + refresh using the 'refresh' subcommand. 
- async def _fetch_image(self, url: str) -> bytes: - """Retrieve and read image from `url`.""" - log.debug(f"Getting image from: {url}") - async with self.bot.http_session.get(url) as resp: - return await resp.read() + Due to Discord limitations, we only show up to 25 events. This is entirely sufficient at the time of writing. + In the case that we find ourselves with more than 25 events, a warning log will alert core devs. - async def _apply_asset(self, target: discord.Guild, asset: _constants.AssetType, url: str) -> bool: + In the future, we may be interested in a field-paginating solution. """ - Internal method for applying media assets to the guild. + if ctx.invoked_subcommand: + # If you're wondering why this works: when the 'refresh' subcommand eventually re-invokes + # this group, the attribute will be automatically set to None by the framework. + return + + available_events = await self.cache_events.to_dict() + log.trace(f"Found {len(available_events)} cached events available for calendar view.") + + if not available_events: + resp = make_embed("No events found!", "Cache may be empty, try `branding calendar refresh`.", success=False) + await ctx.send(embed=resp) + return + + embed = discord.Embed(title="Current event calendar", colour=discord.Colour.blurple()) + + # Because Discord embeds can only contain up to 25 fields, we only show the first 25. + first_25 = list(available_events.items())[:25] - This shouldn't be called directly. The purpose of this method is mainly generic - error handling to reduce needless code repetition. + if len(first_25) != len(available_events): # Alert core devs that a paginating solution is now necessary. + log.warning(f"There are {len(available_events)} events, but the calendar view can only display 25.") - Return True if upload was successful, False otherwise. 
+ for name, duration in first_25: + embed.add_field(name=name[:256], value=duration[:1024]) + + embed.set_footer(text="Otherwise, the fallback season is used.") + + await ctx.send(embed=embed) + + @commands.has_any_role(*MODERATION_ROLES) + @branding_calendar_group.command(name="refresh") + async def branding_calendar_refresh_cmd(self, ctx: commands.Context) -> None: """ - log.info(f"Attempting to set {asset.name}: {url}") + Refresh event cache and show current event calendar. - kwargs = {asset.value: await self._fetch_image(url)} - try: - async with async_timeout.timeout(5): - await target.edit(**kwargs) + Supplementary subcommand allowing force-refreshing the event cache. Implemented as a subcommand because + unlike the supergroup, it requires moderator privileges. + """ + log.info("Performing command-requested event cache refresh.") - except asyncio.TimeoutError: - log.info("Asset upload timed out") - return False + async with ctx.typing(): + available_events = await self.repository.get_events() + await self.populate_cache_events(available_events) - except discord.HTTPException as discord_error: - log.exception("Asset upload failed", exc_info=discord_error) - return False + await ctx.invoke(self.branding_calendar_group) + + # endregion + # region: Command interface (branding daemon) + + @commands.has_any_role(*MODERATION_ROLES) + @branding_group.group(name="daemon", aliases=("d",)) + async def branding_daemon_group(self, ctx: commands.Context) -> None: + """Control the branding cog's daemon.""" + if not ctx.invoked_subcommand: + await ctx.send_help(ctx.command) + + @branding_daemon_group.command(name="enable", aliases=("start", "on")) + async def branding_daemon_enable_cmd(self, ctx: commands.Context) -> None: + """Enable the branding daemon.""" + await self.cache_information.set("daemon_active", True) + if self.daemon_loop.is_running(): + resp = make_embed("Daemon is already enabled!", "", success=False) else: - log.info("Asset successfully applied") - return 
True + self.daemon_loop.start() + resp = make_embed("Daemon enabled!", "It will now automatically awaken on start-up.", success=True) - @_decorators.mock_in_debug(return_value=True) - async def set_banner(self, url: str) -> bool: - """Set the guild's banner to image at `url`.""" - guild = self.bot.get_guild(Guild.id) - if guild is None: - log.info("Failed to get guild instance, aborting asset upload") - return False + await ctx.send(embed=resp) - return await self._apply_asset(guild, _constants.AssetType.BANNER, url) + @branding_daemon_group.command(name="disable", aliases=("stop", "off")) + async def branding_daemon_disable_cmd(self, ctx: commands.Context) -> None: + """Disable the branding daemon.""" + await self.cache_information.set("daemon_active", False) - @_decorators.mock_in_debug(return_value=True) - async def set_icon(self, url: str) -> bool: - """Sets the guild's icon to image at `url`.""" - guild = self.bot.get_guild(Guild.id) - if guild is None: - log.info("Failed to get guild instance, aborting asset upload") - return False + if self.daemon_loop.is_running(): + self.daemon_loop.cancel() + resp = make_embed("Daemon disabled!", "It will not awaken on start-up.", success=True) + else: + resp = make_embed("Daemon is already disabled!", "", success=False) - return await self._apply_asset(guild, _constants.AssetType.SERVER_ICON, url) + await ctx.send(embed=resp) - def cog_unload(self) -> None: - """Cancels startup and daemon task.""" - self._startup_task.cancel() - if self.daemon is not None: - self.daemon.cancel() + @branding_daemon_group.command(name="status") + async def branding_daemon_status_cmd(self, ctx: commands.Context) -> None: + """Check whether the daemon is currently enabled.""" + if self.daemon_loop.is_running(): + resp = make_embed("Daemon is enabled", "Use `branding daemon disable` to stop.", success=True) + else: + resp = make_embed("Daemon is disabled", "Use `branding daemon enable` to start.", success=False) + + await ctx.send(embed=resp) 
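The `daemon_before` hook earlier in this diff sleeps until one minute past the next UTC midnight before letting `daemon_loop` take over. That arithmetic can be sketched standalone; variable names mirror the diff, and the bounds check is illustrative:

```python
from datetime import datetime, time, timedelta

now = datetime.utcnow()

# Target 00:01 rather than 00:00 of the next day, mirroring the diff's offset
# against imprecise sleep waking up marginally before midnight.
tomorrow = now + timedelta(days=1)
midnight = datetime.combine(tomorrow, time(minute=1))

sleep_secs = (midnight - now).total_seconds()

# The wait is always positive and never exceeds a day plus the one-minute offset.
assert 0 < sleep_secs <= 24 * 3600 + 60
```

After this sleep, `tasks.loop(hours=24)` keeps subsequent iterations anchored to the same wall-clock time, as the docstring in the diff notes.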
+ + # endregion diff --git a/bot/exts/backend/branding/_constants.py b/bot/exts/backend/branding/_constants.py deleted file mode 100644 index ca8e8c5f5..000000000 --- a/bot/exts/backend/branding/_constants.py +++ /dev/null @@ -1,51 +0,0 @@ -from enum import Enum, IntEnum - -from bot.constants import Keys - - -class Month(IntEnum): - """All month constants for seasons.""" - - JANUARY = 1 - FEBRUARY = 2 - MARCH = 3 - APRIL = 4 - MAY = 5 - JUNE = 6 - JULY = 7 - AUGUST = 8 - SEPTEMBER = 9 - OCTOBER = 10 - NOVEMBER = 11 - DECEMBER = 12 - - def __str__(self) -> str: - return self.name.title() - - -class AssetType(Enum): - """ - Discord media assets. - - The values match exactly the kwarg keys that can be passed to `Guild.edit`. - """ - - BANNER = "banner" - SERVER_ICON = "icon" - - -STATUS_OK = 200 # HTTP status code - -FILE_BANNER = "banner.png" -FILE_AVATAR = "avatar.png" -SERVER_ICONS = "server_icons" - -BRANDING_URL = "https://api.github.com/repos/python-discord/branding/contents" - -PARAMS = {"ref": "main"} # Target branch -HEADERS = {"Accept": "application/vnd.github.v3+json"} # Ensure we use API v3 - -# A GitHub token is not necessary for the cog to operate, -# unauthorized requests are however limited to 60 per hour -if Keys.github: - HEADERS["Authorization"] = f"token {Keys.github}" diff --git a/bot/exts/backend/branding/_decorators.py b/bot/exts/backend/branding/_decorators.py deleted file mode 100644 index 6a1e7e869..000000000 --- a/bot/exts/backend/branding/_decorators.py +++ /dev/null @@ -1,27 +0,0 @@ -import functools -import logging -import typing as t - -from bot.constants import DEBUG_MODE - -log = logging.getLogger(__name__) - - -def mock_in_debug(return_value: t.Any) -> t.Callable: - """ - Short-circuit function execution if in debug mode and return `return_value`. - - The original function name, and the incoming args and kwargs are DEBUG level logged - upon each call. This is useful for expensive operations, i.e. 
media asset uploads - that are prone to rate-limits but need to be tested extensively. - """ - def decorator(func: t.Callable) -> t.Callable: - @functools.wraps(func) - async def wrapped(*args, **kwargs) -> t.Any: - """Short-circuit and log if in debug mode.""" - if DEBUG_MODE: - log.debug(f"Function {func.__name__} called with args: {args}, kwargs: {kwargs}") - return return_value - return await func(*args, **kwargs) - return wrapped - return decorator diff --git a/bot/exts/backend/branding/_errors.py b/bot/exts/backend/branding/_errors.py deleted file mode 100644 index 7cd271af3..000000000 --- a/bot/exts/backend/branding/_errors.py +++ /dev/null @@ -1,2 +0,0 @@ -class BrandingError(Exception): - """Exception raised by the BrandingManager cog.""" diff --git a/bot/exts/backend/branding/_repository.py b/bot/exts/backend/branding/_repository.py new file mode 100644 index 000000000..7b09d4641 --- /dev/null +++ b/bot/exts/backend/branding/_repository.py @@ -0,0 +1,240 @@ +import logging +import typing as t +from datetime import date, datetime + +import frontmatter + +from bot.bot import Bot +from bot.constants import Keys +from bot.errors import BrandingMisconfiguration + +# Base URL for requests into the branding repository. +BRANDING_URL = "https://api.github.com/repos/python-discord/branding/contents" + +PARAMS = {"ref": "main"} # Target branch. +HEADERS = {"Accept": "application/vnd.github.v3+json"} # Ensure we use API v3. + +# A GitHub token is not necessary. However, unauthorized requests are limited to 60 per hour. +if Keys.github: + HEADERS["Authorization"] = f"token {Keys.github}" + +# Since event periods are year-agnostic, we parse them into `datetime` objects with a manually inserted year. +# Please note that this is intentionally a leap year to allow Feb 29 to be valid. +ARBITRARY_YEAR = 2020 + +# Format used to parse date strings after we inject `ARBITRARY_YEAR` at the end. 
+DATE_FMT = "%B %d %Y" # Ex: July 10 2020 + +log = logging.getLogger(__name__) + + +class RemoteObject: + """ + Remote file or directory on GitHub. + + The annotations match keys in the response JSON that we're interested in. + """ + + sha: str # Hash helps us detect asset change. + name: str # Filename. + path: str # Path from repo root. + type: str # Either 'file' or 'dir'. + download_url: t.Optional[str] # If type is 'dir', this is None! + + def __init__(self, dictionary: t.Dict[str, t.Any]) -> None: + """Initialize by grabbing annotated attributes from `dictionary`.""" + missing_keys = self.__annotations__.keys() - dictionary.keys() + if missing_keys: + raise KeyError(f"Fetched object lacks expected keys: {missing_keys}") + for annotation in self.__annotations__: + setattr(self, annotation, dictionary[annotation]) + + +class MetaFile(t.NamedTuple): + """Attributes defined in a 'meta.md' file.""" + + is_fallback: bool + start_date: t.Optional[date] + end_date: t.Optional[date] + description: str # Markdown event description. + + +class Event(t.NamedTuple): + """Event defined in the branding repository.""" + + path: str # Path from repo root where event lives. This is the event's identity. + meta: MetaFile + banner: RemoteObject + icons: t.List[RemoteObject] + + def __str__(self) -> str: + return f"<Event at '{self.path}'>" + + +class BrandingRepository: + """ + Branding repository abstraction. + + This class represents the branding repository's main branch and exposes available events and assets + as objects. It performs the necessary amount of validation to ensure that a misconfigured event + isn't returned. Such events are simply ignored, and will be substituted with the fallback event, + if available. Warning logs will inform core developers if a misconfigured event is encountered. + + Colliding events cause no special behaviour. In such cases, the first found active event is returned. 
+    We work with the assumption that the branding repository checks for such conflicts and prevents them
+    from reaching the main branch.
+
+    This class keeps no internal state. All `get_current_event` calls will result in GitHub API requests.
+    The caller is therefore responsible for caching information to prevent API abuse.
+
+    Requests are made using the HTTP session looked up on the bot instance.
+    """
+
+    def __init__(self, bot: Bot) -> None:
+        self.bot = bot
+
+    async def fetch_directory(self, path: str, types: t.Container[str] = ("file", "dir")) -> t.Dict[str, RemoteObject]:
+        """
+        Fetch directory found at `path` in the branding repository.
+
+        Raise an exception if the request fails, or if the response lacks the expected keys.
+
+        Passing custom `types` allows getting only files or directories. By default, both are included.
+        """
+        full_url = f"{BRANDING_URL}/{path}"
+        log.debug(f"Fetching directory from branding repository: '{full_url}'.")
+
+        async with self.bot.http_session.get(full_url, params=PARAMS, headers=HEADERS) as response:
+            if response.status != 200:
+                raise RuntimeError(f"Failed to fetch directory due to status: {response.status}")
+
+            log.debug("Fetch successful, reading JSON response.")
+            json_directory = await response.json()
+
+        return {file["name"]: RemoteObject(file) for file in json_directory if file["type"] in types}
+
+    async def fetch_file(self, download_url: str) -> bytes:
+        """
+        Fetch file as bytes from `download_url`.
+
+        Raise an exception if the request does not succeed.
+ """ + log.debug(f"Fetching file from branding repository: '{download_url}'.") + + async with self.bot.http_session.get(download_url, params=PARAMS, headers=HEADERS) as response: + if response.status != 200: + raise RuntimeError(f"Failed to fetch file due to status: {response.status}") + + log.debug("Fetch successful, reading payload.") + return await response.read() + + def parse_meta_file(self, raw_file: bytes) -> MetaFile: + """ + Parse a 'meta.md' file from raw bytes. + + The caller is responsible for handling errors caused by misconfiguration. + """ + attrs, description = frontmatter.parse(raw_file, encoding="UTF-8") + + if not description: + raise BrandingMisconfiguration("No description found in 'meta.md'!") + + if attrs.get("fallback", False): + return MetaFile(is_fallback=True, start_date=None, end_date=None, description=description) + + start_date_raw = attrs.get("start_date") + end_date_raw = attrs.get("end_date") + + if None in (start_date_raw, end_date_raw): + raise BrandingMisconfiguration("Non-fallback event doesn't have start and end dates defined!") + + # We extend the configured month & day with an arbitrary leap year, allowing a datetime object to exist. + # This may raise errors if misconfigured. We let the caller handle such cases. + start_date = datetime.strptime(f"{start_date_raw} {ARBITRARY_YEAR}", DATE_FMT).date() + end_date = datetime.strptime(f"{end_date_raw} {ARBITRARY_YEAR}", DATE_FMT).date() + + return MetaFile(is_fallback=False, start_date=start_date, end_date=end_date, description=description) + + async def construct_event(self, directory: RemoteObject) -> Event: + """ + Construct an `Event` instance from an event `directory`. + + The caller is responsible for handling errors caused by misconfiguration. 
+ """ + contents = await self.fetch_directory(directory.path) + + missing_assets = {"meta.md", "banner.png", "server_icons"} - contents.keys() + + if missing_assets: + raise BrandingMisconfiguration(f"Directory is missing following assets: {missing_assets}") + + server_icons = await self.fetch_directory(contents["server_icons"].path, types=("file",)) + + if len(server_icons) == 0: + raise BrandingMisconfiguration("Found no server icons!") + + meta_bytes = await self.fetch_file(contents["meta.md"].download_url) + + meta_file = self.parse_meta_file(meta_bytes) + + return Event(directory.path, meta_file, contents["banner.png"], list(server_icons.values())) + + async def get_events(self) -> t.List[Event]: + """ + Discover available events in the branding repository. + + Misconfigured events are skipped. May return an empty list in the catastrophic case. + """ + log.debug("Discovering events in branding repository.") + + try: + event_directories = await self.fetch_directory("events", types=("dir",)) # Skip files. + except Exception: + log.exception("Failed to fetch 'events' directory.") + return [] + + instances: t.List[Event] = [] + + for event_directory in event_directories.values(): + log.trace(f"Attempting to construct event from directory: '{event_directory.path}'.") + try: + instance = await self.construct_event(event_directory) + except Exception as exc: + log.warning(f"Could not construct event '{event_directory.path}'.", exc_info=exc) + else: + instances.append(instance) + + return instances + + async def get_current_event(self) -> t.Tuple[t.Optional[Event], t.List[Event]]: + """ + Get the currently active event, or the fallback event. + + The second return value is a list of all available events. The caller may discard it, if not needed. + Returning all events alongside the current one prevents having to query the API twice in some cases. + + The current event may be None in the case that no event is active, and no fallback event is found. 
+ """ + utc_now = datetime.utcnow() + log.debug(f"Finding active event for: {utc_now}.") + + # Construct an object in the arbitrary year for the purpose of comparison. + lookup_now = date(year=ARBITRARY_YEAR, month=utc_now.month, day=utc_now.day) + log.trace(f"Lookup object in arbitrary year: {lookup_now}.") + + available_events = await self.get_events() + log.trace(f"Found {len(available_events)} available events.") + + for event in available_events: + meta = event.meta + if not meta.is_fallback and (meta.start_date <= lookup_now <= meta.end_date): + return event, available_events + + log.trace("No active event found. Looking for fallback event.") + + for event in available_events: + if event.meta.is_fallback: + return event, available_events + + log.warning("No event is currently active and no fallback event was found!") + return None, available_events diff --git a/bot/exts/backend/branding/_seasons.py b/bot/exts/backend/branding/_seasons.py deleted file mode 100644 index 5f6256b30..000000000 --- a/bot/exts/backend/branding/_seasons.py +++ /dev/null @@ -1,175 +0,0 @@ -import logging -import typing as t -from datetime import datetime - -from bot.constants import Colours -from bot.exts.backend.branding._constants import Month -from bot.exts.backend.branding._errors import BrandingError - -log = logging.getLogger(__name__) - - -class SeasonBase: - """ - Base for Seasonal classes. - - This serves as the off-season fallback for when no specific - seasons are active. - - Seasons are 'registered' simply by inheriting from `SeasonBase`. - We discover them by calling `__subclasses__`. - """ - - season_name: str = "Evergreen" - - colour: str = Colours.soft_green - description: str = "The default season!" 
- - branding_path: str = "seasonal/evergreen" - - months: t.Set[Month] = set(Month) - - -class Christmas(SeasonBase): - """Branding for December.""" - - season_name = "Festive season" - - colour = Colours.soft_red - description = ( - "The time is here to get into the festive spirit! No matter who you are, where you are, " - "or what beliefs you may follow, we hope every one of you enjoy this festive season!" - ) - - branding_path = "seasonal/christmas" - - months = {Month.DECEMBER} - - -class Easter(SeasonBase): - """Branding for April.""" - - season_name = "Easter" - - colour = Colours.bright_green - description = ( - "Bunny here, bunny there, bunny everywhere! Here at Python Discord, we celebrate " - "our version of Easter during the entire month of April." - ) - - branding_path = "seasonal/easter" - - months = {Month.APRIL} - - -class Halloween(SeasonBase): - """Branding for October.""" - - season_name = "Halloween" - - colour = Colours.orange - description = "Trick or treat?!" - - branding_path = "seasonal/halloween" - - months = {Month.OCTOBER} - - -class Pride(SeasonBase): - """Branding for June.""" - - season_name = "Pride" - - colour = Colours.pink - description = ( - "The month of June is a special month for us at Python Discord. It is very important to us " - "that everyone feels welcome here, no matter their origin, identity or sexuality. During the " - "month of June, while some of you are participating in Pride festivals across the world, " - "we will be celebrating individuality and commemorating the history and challenges " - "of the LGBTQ+ community with a Pride event of our own!" - ) - - branding_path = "seasonal/pride" - - months = {Month.JUNE} - - -class Valentines(SeasonBase): - """Branding for February.""" - - season_name = "Valentines" - - colour = Colours.pink - description = "Love is in the air!" 
- - branding_path = "seasonal/valentines" - - months = {Month.FEBRUARY} - - -class Wildcard(SeasonBase): - """Branding for August.""" - - season_name = "Wildcard" - - colour = Colours.purple - description = "A season full of surprises!" - - months = {Month.AUGUST} - - -def get_all_seasons() -> t.List[t.Type[SeasonBase]]: - """Give all available season classes.""" - return [SeasonBase] + SeasonBase.__subclasses__() - - -def get_current_season() -> t.Type[SeasonBase]: - """Give active season, based on current UTC month.""" - current_month = Month(datetime.utcnow().month) - - active_seasons = tuple( - season - for season in SeasonBase.__subclasses__() - if current_month in season.months - ) - - if not active_seasons: - return SeasonBase - - return active_seasons[0] - - -def get_season(name: str) -> t.Optional[t.Type[SeasonBase]]: - """ - Give season such that its class name or its `season_name` attr match `name` (caseless). - - If no such season exists, return None. - """ - name = name.casefold() - - for season in get_all_seasons(): - matches = (season.__name__.casefold(), season.season_name.casefold()) - - if name in matches: - return season - - -def _validate_season_overlap() -> None: - """ - Raise BrandingError if there are any colliding seasons. - - This serves as a local test to ensure that seasons haven't been misconfigured. 
- """ - month_to_season = {} - - for season in SeasonBase.__subclasses__(): - for month in season.months: - colliding_season = month_to_season.get(month) - - if colliding_season: - raise BrandingError(f"Season {season} collides with {colliding_season} in {month.name}") - else: - month_to_season[month] = season - - -_validate_season_overlap() diff --git a/bot/exts/backend/error_handler.py b/bot/exts/backend/error_handler.py index 9cb54cdab..76ab7dfc2 100644 --- a/bot/exts/backend/error_handler.py +++ b/bot/exts/backend/error_handler.py @@ -1,7 +1,6 @@ import contextlib import difflib import logging -import random import typing as t from discord import Embed @@ -10,10 +9,9 @@ from sentry_sdk import push_scope from bot.api import ResponseCodeError from bot.bot import Bot -from bot.constants import Colours, ERROR_REPLIES, Icons, MODERATION_ROLES +from bot.constants import Colours, Icons, MODERATION_ROLES from bot.converters import TagNameConverter from bot.errors import InvalidInfractedUser, LockedResourceError -from bot.exts.backend.branding._errors import BrandingError from bot.utils.checks import InWhitelistCheckFailure log = logging.getLogger(__name__) @@ -79,9 +77,6 @@ class ErrorHandler(Cog): await self.handle_api_error(ctx, e.original) elif isinstance(e.original, LockedResourceError): await ctx.send(f"{e.original} Please wait for it to finish and try again later.") - elif isinstance(e.original, BrandingError): - await ctx.send(embed=self._get_error_embed(random.choice(ERROR_REPLIES), str(e.original))) - return elif isinstance(e.original, InvalidInfractedUser): await ctx.send(f"Cannot infract that user. 
{e.original.reason}") else: diff --git a/bot/exts/filters/filtering.py b/bot/exts/filters/filtering.py index c90b18dcb..464732453 100644 --- a/bot/exts/filters/filtering.py +++ b/bot/exts/filters/filtering.py @@ -6,6 +6,7 @@ from typing import Any, Dict, List, Mapping, NamedTuple, Optional, Tuple, Union import dateutil import discord.errors +import regex from async_rediscache import RedisCache from dateutil.relativedelta import relativedelta from discord import Colour, HTTPException, Member, Message, NotFound, TextChannel @@ -34,7 +35,11 @@ CODE_BLOCK_RE = re.compile( EVERYONE_PING_RE = re.compile(rf"@everyone|<@&{Guild.id}>|@here") SPOILER_RE = re.compile(r"(\|\|.+?\|\|)", re.DOTALL) URL_RE = re.compile(r"(https?://[^\s]+)", flags=re.IGNORECASE) -ZALGO_RE = re.compile(r"[\u0300-\u036F\u0489]") + +# Exclude variation selectors from zalgo because they're actually invisible. +VARIATION_SELECTORS = r"\uFE00-\uFE0F\U000E0100-\U000E01EF" +INVISIBLE_RE = regex.compile(rf"[{VARIATION_SELECTORS}\p{{UNASSIGNED}}\p{{FORMAT}}\p{{CONTROL}}--\s]", regex.V1) +ZALGO_RE = regex.compile(rf"[\p{{NONSPACING MARK}}\p{{ENCLOSING MARK}}--[{VARIATION_SELECTORS}]]", regex.V1) # Other constants. DAYS_BETWEEN_ALERTS = 3 @@ -178,6 +183,7 @@ class Filtering(Cog): def get_name_matches(self, name: str) -> List[re.Match]: """Check bad words from passed string (name). Return list of matches.""" + name = self.clean_input(name) matches = [] watchlist_patterns = self._get_filterlist_items('filter_token', allowed=False) for pattern in watchlist_patterns: @@ -444,6 +450,8 @@ class Filtering(Cog): if SPOILER_RE.search(text): text = self._expand_spoilers(text) + text = self.clean_input(text) + # Make sure it's not a URL if URL_RE.search(text): return False, None @@ -462,6 +470,7 @@ class Filtering(Cog): Second return value is a reason of URL blacklisting (can be None). 
""" + text = self.clean_input(text) if not URL_RE.search(text): return False, None @@ -492,6 +501,8 @@ class Filtering(Cog): Attempts to catch some of common ways to try to cheat the system. """ + text = self.clean_input(text) + # Remove backslashes to prevent escape character aroundfuckery like # discord\.gg/gdudes-pony-farm text = text.replace("\\", "") @@ -628,6 +639,15 @@ class Filtering(Cog): await self.bot.api_client.delete(f'bot/offensive-messages/{msg["id"]}') log.info(f"Deleted the offensive message with id {msg['id']}.") + @staticmethod + def clean_input(string: str) -> str: + """Remove zalgo and invisible characters from `string`.""" + # For future consideration: remove characters in the Mc, Sk, and Lm categories too. + # Can be normalised with form C to merge char + combining char into a single char to avoid + # removing legit diacritics, but this would open up a way to bypass filters. + no_zalgo = ZALGO_RE.sub("", string) + return INVISIBLE_RE.sub("", no_zalgo) + def setup(bot: Bot) -> None: """Load the Filtering cog.""" diff --git a/bot/exts/help_channels/_caches.py b/bot/exts/help_channels/_caches.py index 4cea385b7..c5e4ee917 100644 --- a/bot/exts/help_channels/_caches.py +++ b/bot/exts/help_channels/_caches.py @@ -8,12 +8,19 @@ claim_times = RedisCache(namespace="HelpChannels.claim_times") # RedisCache[discord.TextChannel.id, t.Union[discord.User.id, discord.Member.id]] claimants = RedisCache(namespace="HelpChannels.help_channel_claimants") +# Stores the timestamp of the last message from the claimant of a help channel +# RedisCache[discord.TextChannel.id, UtcPosixTimestamp] +claimant_last_message_times = RedisCache(namespace="HelpChannels.claimant_last_message_times") + +# This cache maps a help channel to the timestamp of the last non-claimant message. +# This cache being empty for a given help channel indicates the question is unanswered. 
+# RedisCache[discord.TextChannel.id, UtcPosixTimestamp] +non_claimant_last_message_times = RedisCache(namespace="HelpChannels.non_claimant_last_message_times") + # This cache maps a help channel to original question message in same channel. # RedisCache[discord.TextChannel.id, discord.Message.id] question_messages = RedisCache(namespace="HelpChannels.question_messages") -# This cache maps a help channel to whether it has had any -# activity other than the original claimant. True being no other -# activity and False being other activity. -# RedisCache[discord.TextChannel.id, bool] -unanswered = RedisCache(namespace="HelpChannels.unanswered") +# This cache keeps track of the dynamic message ID for +# the continuously updated message in the #How-to-get-help channel. +dynamic_message = RedisCache(namespace="HelpChannels.dynamic_message") diff --git a/bot/exts/help_channels/_channel.py b/bot/exts/help_channels/_channel.py index 224214b00..0846b28c8 100644 --- a/bot/exts/help_channels/_channel.py +++ b/bot/exts/help_channels/_channel.py @@ -1,8 +1,11 @@ import logging import typing as t -from datetime import datetime, timedelta +from datetime import timedelta +from enum import Enum +import arrow import discord +from arrow import Arrow import bot from bot import constants @@ -15,6 +18,17 @@ MAX_CHANNELS_PER_CATEGORY = 50 EXCLUDED_CHANNELS = (constants.Channels.cooldown,) +class ClosingReason(Enum): + """All possible closing reasons for help channels.""" + + COMMAND = "command" + LATEST_MESSSAGE = "auto.latest_message" + CLAIMANT_TIMEOUT = "auto.claimant_timeout" + OTHER_TIMEOUT = "auto.other_timeout" + DELETED = "auto.deleted" + CLEANUP = "auto.cleanup" + + def get_category_channels(category: discord.CategoryChannel) -> t.Iterable[discord.TextChannel]: """Yield the text channels of the `category` in an unsorted manner.""" log.trace(f"Getting text channels in the category '{category}' ({category.id}).") @@ -25,23 +39,69 @@ def get_category_channels(category: 
discord.CategoryChannel) -> t.Iterable[disco yield channel -async def get_idle_time(channel: discord.TextChannel) -> t.Optional[int]: +async def get_closing_time(channel: discord.TextChannel, init_done: bool) -> t.Tuple[Arrow, ClosingReason]: """ - Return the time elapsed, in seconds, since the last message sent in the `channel`. + Return the time at which the given help `channel` should be closed along with the reason. - Return None if the channel has no messages. - """ - log.trace(f"Getting the idle time for #{channel} ({channel.id}).") - - msg = await _message.get_last_message(channel) - if not msg: - log.debug(f"No idle time available; #{channel} ({channel.id}) has no messages.") - return None + `init_done` is True if the cog has finished loading and False otherwise. - idle_time = (datetime.utcnow() - msg.created_at).seconds + The time is calculated as follows: - log.trace(f"#{channel} ({channel.id}) has been idle for {idle_time} seconds.") - return idle_time + * If `init_done` is True or the cached time for the claimant's last message is unavailable, + add the configured `idle_minutes_claimant` to the time the most recent message was sent. + * If the help session is empty (see `is_empty`), do the above but with `deleted_idle_minutes`. + * If either of the above is attempted but the channel is completely empty, close the channel + immediately. + * Otherwise, retrieve the times of the claimant's and non-claimant's last messages from the + cache. Add the configured `idle_minutes_claimant` and idle_minutes_others`, respectively, and + choose the time which is furthest in the future. 
+ """ + log.trace(f"Getting the closing time for #{channel} ({channel.id}).") + + is_empty = await _message.is_empty(channel) + if is_empty: + idle_minutes_claimant = constants.HelpChannels.deleted_idle_minutes + else: + idle_minutes_claimant = constants.HelpChannels.idle_minutes_claimant + + claimant_time = await _caches.claimant_last_message_times.get(channel.id) + + # The current session lacks messages, the cog is still starting, or the cache is empty. + if is_empty or not init_done or claimant_time is None: + msg = await _message.get_last_message(channel) + if not msg: + log.debug(f"No idle time available; #{channel} ({channel.id}) has no messages, closing now.") + return Arrow.min, ClosingReason.DELETED + + # Use the greatest offset to avoid the possibility of prematurely closing the channel. + time = Arrow.fromdatetime(msg.created_at) + timedelta(minutes=idle_minutes_claimant) + reason = ClosingReason.DELETED if is_empty else ClosingReason.LATEST_MESSSAGE + return time, reason + + claimant_time = Arrow.utcfromtimestamp(claimant_time) + others_time = await _caches.non_claimant_last_message_times.get(channel.id) + + if others_time: + others_time = Arrow.utcfromtimestamp(others_time) + else: + # The help session hasn't received any answers (messages from non-claimants) yet. + # Set to min value so it isn't considered when calculating the closing time. + others_time = Arrow.min + + # Offset the cached times by the configured values. + others_time += timedelta(minutes=constants.HelpChannels.idle_minutes_others) + claimant_time += timedelta(minutes=idle_minutes_claimant) + + # Use the time which is the furthest into the future. 
+ if claimant_time >= others_time: + closing_time = claimant_time + reason = ClosingReason.CLAIMANT_TIMEOUT + else: + closing_time = others_time + reason = ClosingReason.OTHER_TIMEOUT + + log.trace(f"#{channel} ({channel.id}) should be closed at {closing_time} due to {reason}.") + return closing_time, reason async def get_in_use_time(channel_id: int) -> t.Optional[timedelta]: @@ -50,8 +110,8 @@ async def get_in_use_time(channel_id: int) -> t.Optional[timedelta]: claimed_timestamp = await _caches.claim_times.get(channel_id) if claimed_timestamp: - claimed = datetime.utcfromtimestamp(claimed_timestamp) - return datetime.utcnow() - claimed + claimed = Arrow.utcfromtimestamp(claimed_timestamp) + return arrow.utcnow() - claimed def is_excluded_channel(channel: discord.abc.GuildChannel) -> bool: diff --git a/bot/exts/help_channels/_cog.py b/bot/exts/help_channels/_cog.py index 1c730dce9..262b18e16 100644 --- a/bot/exts/help_channels/_cog.py +++ b/bot/exts/help_channels/_cog.py @@ -2,9 +2,10 @@ import asyncio import logging import random import typing as t -from datetime import datetime, timezone +from datetime import timedelta from operator import attrgetter +import arrow import discord import discord.abc from discord.ext import commands @@ -20,6 +21,7 @@ NAMESPACE = "help" HELP_CHANNEL_TOPIC = """ This is a Python help channel. You can claim your own help channel in the Python Help: Available category. """ +AVAILABLE_HELP_CHANNELS = "**Currently available help channel(s):** {available}" class HelpChannels(commands.Cog): @@ -43,7 +45,9 @@ class HelpChannels(commands.Cog): In Use Category * Contains all channels which are occupied by someone needing help - * Channel moves to dormant category after `constants.HelpChannels.idle_minutes` of being idle + * Channel moves to dormant category after + - `constants.HelpChannels.idle_minutes_other` minutes since the last user message, or + - `constants.HelpChannels.idle_minutes_claimant` minutes since the last claimant message. 
* Command can prematurely mark a channel as dormant * Channel claimant is allowed to use the command * Allowed roles for the command are configurable with `constants.HelpChannels.cmd_whitelist` @@ -70,7 +74,10 @@ class HelpChannels(commands.Cog): self.channel_queue: asyncio.Queue[discord.TextChannel] = None self.name_queue: t.Deque[str] = None - self.last_notification: t.Optional[datetime] = None + self.last_notification: t.Optional[arrow.Arrow] = None + + self.dynamic_message: t.Optional[int] = None + self.available_help_channels: t.Set[discord.TextChannel] = set() # Asyncio stuff self.queue_tasks: t.List[asyncio.Task] = [] @@ -112,11 +119,16 @@ class HelpChannels(commands.Cog): self.bot.stats.incr("help.claimed") - # Must use a timezone-aware datetime to ensure a correct POSIX timestamp. - timestamp = datetime.now(timezone.utc).timestamp() + # datetime.timestamp() would assume it's local, despite d.py giving a (naïve) UTC time. + timestamp = arrow.Arrow.fromdatetime(message.created_at).timestamp() + await _caches.claim_times.set(message.channel.id, timestamp) + await _caches.claimant_last_message_times.set(message.channel.id, timestamp) + # Delete to indicate that the help session has yet to receive an answer. + await _caches.non_claimant_last_message_times.delete(message.channel.id) - await _caches.unanswered.set(message.channel.id, True) + # Removing the help channel from the dynamic message, and editing/sending that message. + self.available_help_channels.remove(message.channel) # Not awaited because it may indefinitely hold the lock while waiting for a channel. scheduling.create_task(self.move_to_available(), name=f"help_claim_{message.id}") @@ -187,7 +199,7 @@ class HelpChannels(commands.Cog): # Don't use a discord.py check because the check needs to fail silently. 
if await self.close_check(ctx): log.info(f"Close command invoked by {ctx.author} in #{ctx.channel}.") - await self.unclaim_channel(ctx.channel, is_auto=False) + await self.unclaim_channel(ctx.channel, closed_on=_channel.ClosingReason.COMMAND) async def get_available_candidate(self) -> discord.TextChannel: """ @@ -233,7 +245,11 @@ class HelpChannels(commands.Cog): elif missing < 0: log.trace(f"Moving {abs(missing)} superfluous available channels over to the Dormant category.") for channel in channels[:abs(missing)]: - await self.unclaim_channel(channel) + await self.unclaim_channel(channel, closed_on=_channel.ClosingReason.CLEANUP) + + # Getting channels that need to be included in the dynamic message. + await self.update_available_help_channels() + log.trace("Dynamic available help message updated.") async def init_categories(self) -> None: """Get the help category objects. Remove the cog if retrieval fails.""" @@ -279,6 +295,10 @@ class HelpChannels(commands.Cog): # This may confuse users. So would potentially long delays for the cog to become ready. self.close_command.enabled = True + # Acquiring the dynamic message ID, if it exists within the cache. + log.trace("Attempting to fetch How-to-get-help dynamic message ID.") + self.dynamic_message = await _caches.dynamic_message.get("message_id") + await self.init_available() _stats.report_counts() @@ -293,26 +313,23 @@ class HelpChannels(commands.Cog): """ log.trace(f"Handling in-use channel #{channel} ({channel.id}).") - if not await _message.is_empty(channel): - idle_seconds = constants.HelpChannels.idle_minutes * 60 - else: - idle_seconds = constants.HelpChannels.deleted_idle_minutes * 60 - - time_elapsed = await _channel.get_idle_time(channel) + closing_time, closed_on = await _channel.get_closing_time(channel, self.init_task.done()) - if time_elapsed is None or time_elapsed >= idle_seconds: + # Closing time is in the past. + # Add 1 second due to POSIX timestamps being lower resolution than datetime objects. 
+ if closing_time < (arrow.utcnow() + timedelta(seconds=1)): log.info( - f"#{channel} ({channel.id}) is idle longer than {idle_seconds} seconds " - f"and will be made dormant." + f"#{channel} ({channel.id}) is idle past {closing_time} " + f"and will be made dormant. Reason: {closed_on.value}" ) - await self.unclaim_channel(channel) + await self.unclaim_channel(channel, closed_on=closed_on) else: # Cancel the existing task, if any. if has_task: self.scheduler.cancel(channel.id) - delay = idle_seconds - time_elapsed + delay = (closing_time - arrow.utcnow()).seconds log.info( f"#{channel} ({channel.id}) is still active; " f"scheduling it to be moved after {delay} seconds." @@ -336,6 +353,10 @@ class HelpChannels(commands.Cog): category_id=constants.Categories.help_available, ) + # Adding the help channel to the dynamic message, and editing/sending that message. + self.available_help_channels.add(channel) + await self.update_available_help_channels() + _stats.report_counts() async def move_to_dormant(self, channel: discord.TextChannel) -> None: @@ -356,7 +377,7 @@ class HelpChannels(commands.Cog): _stats.report_counts() @lock.lock_arg(f"{NAMESPACE}.unclaim", "channel") - async def unclaim_channel(self, channel: discord.TextChannel, *, is_auto: bool = True) -> None: + async def unclaim_channel(self, channel: discord.TextChannel, *, closed_on: _channel.ClosingReason) -> None: """ Unclaim an in-use help `channel` to make it dormant. @@ -364,7 +385,7 @@ class HelpChannels(commands.Cog): Remove the cooldown role from the channel claimant if they have no other channels claimed. Cancel the scheduled cooldown role removal task. - Set `is_auto` to True if the channel was automatically closed or False if manually closed. + `closed_on` is the reason that the channel was closed. See _channel.ClosingReason for possible values. 
""" claimant_id = await _caches.claimants.get(channel.id) _unclaim_channel = self._unclaim_channel @@ -375,9 +396,14 @@ class HelpChannels(commands.Cog): decorator = lock.lock_arg(f"{NAMESPACE}.unclaim", "claimant_id", wait=True) _unclaim_channel = decorator(_unclaim_channel) - return await _unclaim_channel(channel, claimant_id, is_auto) + return await _unclaim_channel(channel, claimant_id, closed_on) - async def _unclaim_channel(self, channel: discord.TextChannel, claimant_id: int, is_auto: bool) -> None: + async def _unclaim_channel( + self, + channel: discord.TextChannel, + claimant_id: int, + closed_on: _channel.ClosingReason + ) -> None: """Actual implementation of `unclaim_channel`. See that for full documentation.""" await _caches.claimants.delete(channel.id) @@ -393,12 +419,12 @@ class HelpChannels(commands.Cog): await _cooldown.remove_cooldown_role(claimant) await _message.unpin(channel) - await _stats.report_complete_session(channel.id, is_auto) + await _stats.report_complete_session(channel.id, closed_on) await self.move_to_dormant(channel) # Cancel the task that makes the channel dormant only if called by the close command. # In other cases, the task is either already done or not-existent. 
- if not is_auto: + if closed_on == _channel.ClosingReason.COMMAND: self.scheduler.cancel(channel.id) async def move_to_in_use(self, channel: discord.TextChannel) -> None: @@ -410,7 +436,7 @@ class HelpChannels(commands.Cog): category_id=constants.Categories.help_in_use, ) - timeout = constants.HelpChannels.idle_minutes * 60 + timeout = constants.HelpChannels.idle_minutes_claimant * 60 log.trace(f"Scheduling #{channel} ({channel.id}) to become dormant in {timeout} sec.") self.scheduler.schedule_later(timeout, channel.id, self.move_idle_channel(channel)) @@ -428,7 +454,7 @@ class HelpChannels(commands.Cog): if not _channel.is_excluded_channel(message.channel): await self.claim_channel(message) else: - await _message.check_for_answer(message) + await _message.update_message_caches(message) @commands.Cog.listener() async def on_message_delete(self, msg: discord.Message) -> None: @@ -465,3 +491,34 @@ class HelpChannels(commands.Cog): self.queue_tasks.remove(task) return channel + + async def update_available_help_channels(self) -> None: + """Updates the dynamic message within #how-to-get-help for available help channels.""" + if not self.available_help_channels: + self.available_help_channels = set( + c for c in self.available_category.channels if not _channel.is_excluded_channel(c) + ) + + available_channels = AVAILABLE_HELP_CHANNELS.format( + available=", ".join( + c.mention for c in sorted(self.available_help_channels, key=attrgetter("position")) + ) or None + ) + + if self.dynamic_message is not None: + try: + log.trace("Help channels have changed, dynamic message has been edited.") + await self.bot.http.edit_message( + constants.Channels.how_to_get_help, self.dynamic_message, content=available_channels + ) + except discord.NotFound: + pass + else: + return + + log.trace("Dynamic message could not be edited or found. 
Creating a new one.") + new_dynamic_message = await self.bot.http.send_message( + constants.Channels.how_to_get_help, available_channels + ) + self.dynamic_message = new_dynamic_message["id"] + await _caches.dynamic_message.set("message_id", self.dynamic_message) diff --git a/bot/exts/help_channels/_message.py b/bot/exts/help_channels/_message.py index 36388f9bd..afd698ffe 100644 --- a/bot/exts/help_channels/_message.py +++ b/bot/exts/help_channels/_message.py @@ -1,9 +1,10 @@ import logging import textwrap import typing as t -from datetime import datetime +import arrow import discord +from arrow import Arrow import bot from bot import constants @@ -28,7 +29,7 @@ For more tips, check out our guide on **[asking good questions]({ASKING_GUIDE_UR AVAILABLE_TITLE = "Available help channel" -AVAILABLE_FOOTER = f"Closes after {constants.HelpChannels.idle_minutes} minutes of inactivity or when you send !close." +AVAILABLE_FOOTER = "Closes after a period of inactivity, or when you send !close." DORMANT_MSG = f""" This help channel has been marked as **dormant**, and has been moved into the **Help: Dormant** \ @@ -42,25 +43,27 @@ through our guide for **[asking a good question]({ASKING_GUIDE_URL})**. 
""" -async def check_for_answer(message: discord.Message) -> None: - """Checks for whether new content in a help channel comes from non-claimants.""" +async def update_message_caches(message: discord.Message) -> None: + """Checks the source of new content in a help channel and updates the appropriate cache.""" channel = message.channel # Confirm the channel is an in use help channel if is_in_category(channel, constants.Categories.help_in_use): - log.trace(f"Checking if #{channel} ({channel.id}) has been answered.") + log.trace(f"Checking if #{channel} ({channel.id}) has had a reply.") - # Check if there is an entry in unanswered - if await _caches.unanswered.contains(channel.id): - claimant_id = await _caches.claimants.get(channel.id) - if not claimant_id: - # The mapping for this channel doesn't exist, we can't do anything. - return + claimant_id = await _caches.claimants.get(channel.id) + if not claimant_id: + # The mapping for this channel doesn't exist, we can't do anything. + return - # Check the message did not come from the claimant - if claimant_id != message.author.id: - # Mark the channel as answered - await _caches.unanswered.set(channel.id, False) + # datetime.timestamp() would assume it's local, despite d.py giving a (naïve) UTC time. 
+ timestamp = Arrow.fromdatetime(message.created_at).timestamp() + + # Overwrite the appropriate last message cache depending on the author of the message + if message.author.id == claimant_id: + await _caches.claimant_last_message_times.set(channel.id, timestamp) + else: + await _caches.non_claimant_last_message_times.set(channel.id, timestamp) async def get_last_message(channel: discord.TextChannel) -> t.Optional[discord.Message]: @@ -125,12 +128,12 @@ async def dm_on_open(message: discord.Message) -> None: ) -async def notify(channel: discord.TextChannel, last_notification: t.Optional[datetime]) -> t.Optional[datetime]: +async def notify(channel: discord.TextChannel, last_notification: t.Optional[Arrow]) -> t.Optional[Arrow]: """ Send a message in `channel` notifying about a lack of available help channels. - If a notification was sent, return the `datetime` at which the message was sent. Otherwise, - return None. + If a notification was sent, return the time at which the message was sent. + Otherwise, return None. Configuration: @@ -144,7 +147,7 @@ async def notify(channel: discord.TextChannel, last_notification: t.Optional[dat log.trace("Notifying about lack of channels.") if last_notification: - elapsed = (datetime.utcnow() - last_notification).seconds + elapsed = (arrow.utcnow() - last_notification).seconds minimum_interval = constants.HelpChannels.notify_minutes * 60 should_send = elapsed >= minimum_interval else: @@ -167,7 +170,7 @@ async def notify(channel: discord.TextChannel, last_notification: t.Optional[dat allowed_mentions=discord.AllowedMentions(everyone=False, roles=allowed_roles) ) - return message.created_at + return Arrow.fromdatetime(message.created_at) except Exception: # Handle it here cause this feature isn't critical for the functionality of the system. 
log.exception("Failed to send notification about lack of dormant channels!") diff --git a/bot/exts/help_channels/_stats.py b/bot/exts/help_channels/_stats.py index b8778e7d9..eb34e75e1 100644 --- a/bot/exts/help_channels/_stats.py +++ b/bot/exts/help_channels/_stats.py @@ -22,21 +22,20 @@ def report_counts() -> None: log.warning(f"Couldn't find category {name!r} to track channel count stats.") -async def report_complete_session(channel_id: int, is_auto: bool) -> None: +async def report_complete_session(channel_id: int, closed_on: _channel.ClosingReason) -> None: """ Report stats for a completed help session channel `channel_id`. - Set `is_auto` to True if the channel was automatically closed or False if manually closed. + `closed_on` is the reason why the channel was closed. See `_channel.ClosingReason` for possible reasons. """ - caller = "auto" if is_auto else "command" - bot.instance.stats.incr(f"help.dormant_calls.{caller}") + bot.instance.stats.incr(f"help.dormant_calls.{closed_on.value}") in_use_time = await _channel.get_in_use_time(channel_id) if in_use_time: bot.instance.stats.timing("help.in_use_time", in_use_time) - unanswered = await _caches.unanswered.get(channel_id) - if unanswered: + non_claimant_last_message_time = await _caches.non_claimant_last_message_times.get(channel_id) + if non_claimant_last_message_time is None: bot.instance.stats.incr("help.sessions.unanswered") - elif unanswered is not None: + else: bot.instance.stats.incr("help.sessions.answered") diff --git a/bot/exts/info/information.py b/bot/exts/info/information.py index c54ca96bf..5e2c4b417 100644 --- a/bot/exts/info/information.py +++ b/bot/exts/info/information.py @@ -6,7 +6,7 @@ from collections import defaultdict from typing import Any, DefaultDict, Dict, Mapping, Optional, Tuple, Union import fuzzywuzzy -from discord import Colour, Embed, Guild, Message, Role +from discord import AllowedMentions, Colour, Embed, Guild, Message, Role from discord.ext.commands import BucketType, 
Cog, Context, Paginator, command, group, has_any_role from bot import constants @@ -284,7 +284,7 @@ class Information(Cog): embed.add_field(name=field_name, value=field_content, inline=False) embed.set_thumbnail(url=user.avatar_url_as(static_format="png")) - embed.colour = user.top_role.colour if roles else Colour.blurple() + embed.colour = user.colour if user.colour != Colour.default() else Colour.blurple() return embed @@ -447,9 +447,9 @@ class Information(Cog): def add_content(title: str, content: str) -> None: paginator.add_line(f'== {title} ==\n') - # replace backticks as it breaks out of code blocks. Spaces seemed to be the most reasonable solution. - # we hope it's not close to 2000 - paginator.add_line(content.replace('```', '`` `')) + # Replace backticks as it breaks out of code blocks. + # An invisible character seemed to be the most reasonable solution. We hope it's not close to 2000. + paginator.add_line(content.replace('`', '`\u200b')) paginator.close_page() if message.content: @@ -468,7 +468,7 @@ class Information(Cog): add_content(title, transformer(item)) for page in paginator.pages: - await ctx.send(page) + await ctx.send(page, allowed_mentions=AllowedMentions.none()) @raw.command() async def json(self, ctx: Context, message: Message) -> None: diff --git a/bot/exts/moderation/defcon.py b/bot/exts/moderation/defcon.py index bab95405c..dfb1afd19 100644 --- a/bot/exts/moderation/defcon.py +++ b/bot/exts/moderation/defcon.py @@ -181,7 +181,7 @@ class Defcon(Cog): role = ctx.guild.default_role permissions = role.permissions - permissions.update(send_messages=False, add_reactions=False) + permissions.update(send_messages=False, add_reactions=False, connect=False) await role.edit(reason="DEFCON shutdown", permissions=permissions) await ctx.send(f"{Action.SERVER_SHUTDOWN.value.emoji} Server shut down.") @@ -192,7 +192,7 @@ class Defcon(Cog): role = ctx.guild.default_role permissions = role.permissions - permissions.update(send_messages=True, 
add_reactions=True) + permissions.update(send_messages=True, add_reactions=True, connect=True) await role.edit(reason="DEFCON unshutdown", permissions=permissions) await ctx.send(f"{Action.SERVER_OPEN.value.emoji} Server reopened.") diff --git a/bot/exts/moderation/dm_relay.py b/bot/exts/moderation/dm_relay.py index 6d081741c..1d2206e27 100644 --- a/bot/exts/moderation/dm_relay.py +++ b/bot/exts/moderation/dm_relay.py @@ -1,132 +1,72 @@ import logging -from typing import Optional import discord -from async_rediscache import RedisCache -from discord import Color -from discord.ext import commands -from discord.ext.commands import Cog +from discord.ext.commands import Cog, Context, command, has_any_role -from bot import constants from bot.bot import Bot -from bot.converters import UserMentionOrID -from bot.utils.checks import in_whitelist_check -from bot.utils.messages import send_attachments -from bot.utils.webhooks import send_webhook +from bot.constants import Emojis, MODERATION_ROLES +from bot.utils.services import send_to_paste_service log = logging.getLogger(__name__) class DMRelay(Cog): - """Relay direct messages to and from the bot.""" - - # RedisCache[str, t.Union[discord.User.id, discord.Member.id]] - dm_cache = RedisCache() + """Inspect messages sent to the bot.""" def __init__(self, bot: Bot): self.bot = bot - self.webhook_id = constants.Webhooks.dm_log - self.webhook = None - self.bot.loop.create_task(self.fetch_webhook()) - - @commands.command(aliases=("reply",)) - async def send_dm(self, ctx: commands.Context, member: Optional[UserMentionOrID], *, message: str) -> None: - """ - Allows you to send a DM to a user from the bot. - - If `member` is not provided, it will send to the last user who DM'd the bot. - - This feature should be used extremely sparingly. Use ModMail if you need to have a serious - conversation with a user. 
This is just for responding to extraordinary DMs, having a little - fun with users, and telling people they are DMing the wrong bot. - - NOTE: This feature will be removed if it is overused. - """ - if not member: - user_id = await self.dm_cache.get("last_user") - member = ctx.guild.get_member(user_id) if user_id else None - - # If we still don't have a Member at this point, give up - if not member: - log.debug("This bot has never gotten a DM, or the RedisCache has been cleared.") - await ctx.message.add_reaction("❌") + + @command(aliases=("relay", "dr")) + async def dmrelay(self, ctx: Context, user: discord.User, limit: int = 100) -> None: + """Relays the direct message history between the bot and given user.""" + log.trace(f"Relaying DMs with {user.name} ({user.id})") + + if user.bot: + await ctx.send(f"{Emojis.cross_mark} No direct message history with bots.") return - if member.id == self.bot.user.id: - log.debug("Not sending message to bot user") - return await ctx.send("🚫 I can't send messages to myself!") - - try: - await member.send(message) - except discord.errors.Forbidden: - log.debug("User has disabled DMs.") - await ctx.message.add_reaction("❌") - else: - await ctx.message.add_reaction("✅") - self.bot.stats.incr("dm_relay.dm_sent") - - async def fetch_webhook(self) -> None: - """Fetches the webhook object, so we can post to it.""" - await self.bot.wait_until_guild_available() - - try: - self.webhook = await self.bot.fetch_webhook(self.webhook_id) - except discord.HTTPException: - log.exception(f"Failed to fetch webhook with id `{self.webhook_id}`") - - @Cog.listener() - async def on_message(self, message: discord.Message) -> None: - """Relays the message's content and attachments to the dm_log channel.""" - # Only relay DMs from humans - if message.author.bot or message.guild or self.webhook is None: + output = "" + async for msg in user.history(limit=limit, oldest_first=True): + created_at = msg.created_at.strftime(r"%Y-%m-%d %H:%M") + + # Metadata 
(author, created_at, id) + output += f"{msg.author} [{created_at}] ({msg.id}): " + + # Content + if msg.content: + output += msg.content + "\n" + + # Embeds + if (embeds := len(msg.embeds)) > 0: + output += f"<{embeds} embed{'s' if embeds > 1 else ''}>\n" + + # Attachments + attachments = "\n".join(a.url for a in msg.attachments) + if attachments: + output += attachments + "\n" + + if not output: + await ctx.send(f"{Emojis.cross_mark} No direct message history with {user.mention}.") + return + + metadata = ( + f"User: {user} ({user.id})\n" + f"Channel ID: {user.dm_channel.id}\n\n" + ) + + paste_link = await send_to_paste_service(metadata + output, extension="txt") + + if paste_link is None: + await ctx.send(f"{Emojis.cross_mark} Failed to upload output to hastebin.") return - if message.clean_content: - await send_webhook( - webhook=self.webhook, - content=message.clean_content, - username=f"{message.author.display_name} ({message.author.id})", - avatar_url=message.author.avatar_url - ) - await self.dm_cache.set("last_user", message.author.id) - self.bot.stats.incr("dm_relay.dm_received") - - # Handle any attachments - if message.attachments: - try: - await send_attachments( - message, - self.webhook, - username=f"{message.author.display_name} ({message.author.id})" - ) - except (discord.errors.Forbidden, discord.errors.NotFound): - e = discord.Embed( - description=":x: **This message contained an attachment, but it could not be retrieved**", - color=Color.red() - ) - await send_webhook( - webhook=self.webhook, - embed=e, - username=f"{message.author.display_name} ({message.author.id})", - avatar_url=message.author.avatar_url - ) - except discord.HTTPException: - log.exception("Failed to send an attachment to the webhook") - - async def cog_check(self, ctx: commands.Context) -> bool: + await ctx.send(paste_link) + + async def cog_check(self, ctx: Context) -> bool: """Only allow moderators to invoke the commands in this cog.""" - checks = [ - await 
commands.has_any_role(*constants.MODERATION_ROLES).predicate(ctx), - in_whitelist_check( - ctx, - channels=[constants.Channels.dm_log], - redirect=None, - fail_silently=True, - ) - ] - return all(checks) + return await has_any_role(*MODERATION_ROLES).predicate(ctx) def setup(bot: Bot) -> None: - """Load the DMRelay cog.""" + """Load the DMRelay cog.""" bot.add_cog(DMRelay(bot)) diff --git a/bot/exts/moderation/stream.py b/bot/exts/moderation/stream.py new file mode 100644 index 000000000..12e195172 --- /dev/null +++ b/bot/exts/moderation/stream.py @@ -0,0 +1,179 @@ +import logging +from datetime import timedelta, timezone + +import arrow +import discord +from arrow import Arrow +from async_rediscache import RedisCache +from discord.ext import commands + +from bot.bot import Bot +from bot.constants import Colours, Emojis, Guild, Roles, STAFF_ROLES, VideoPermission +from bot.converters import Expiry +from bot.utils.scheduling import Scheduler +from bot.utils.time import format_infraction_with_duration + +log = logging.getLogger(__name__) + + +class Stream(commands.Cog): + """Grant and revoke streaming permissions from members.""" + + # Stores tasks to remove streaming permission + # RedisCache[discord.Member.id, UtcPosixTimestamp] + task_cache = RedisCache() + + def __init__(self, bot: Bot): + self.bot = bot + self.scheduler = Scheduler(self.__class__.__name__) + self.reload_task = self.bot.loop.create_task(self._reload_tasks_from_redis()) + + def cog_unload(self) -> None: + """Cancel all scheduled tasks.""" + self.reload_task.cancel() + self.reload_task.add_done_callback(lambda _: self.scheduler.cancel_all()) + + async def _revoke_streaming_permission(self, member: discord.Member) -> None: + """Remove the streaming permission from the given Member.""" + await self.task_cache.delete(member.id) + await member.remove_roles(discord.Object(Roles.video), reason="Streaming access revoked") + + async def _reload_tasks_from_redis(self) -> None: + """Reload outstanding tasks 
from redis on startup, delete the task if the member has since left the server.""" + await self.bot.wait_until_guild_available() + items = await self.task_cache.items() + for key, value in items: + member = self.bot.get_guild(Guild.id).get_member(key) + + if not member: + # Member isn't found in the cache + try: + member = await self.bot.get_guild(Guild.id).fetch_member(key) + except discord.errors.NotFound: + log.debug( + f"Member {key} left the guild before we could schedule " + "the revoking of their streaming permissions." + ) + await self.task_cache.delete(key) + continue + except discord.HTTPException: + log.exception(f"Exception while trying to retrieve member {key} from Discord.") + continue + + revoke_time = Arrow.utcfromtimestamp(value) + log.debug(f"Scheduling {member} ({member.id}) to have streaming permission revoked at {revoke_time}") + self.scheduler.schedule_at( + revoke_time, + key, + self._revoke_streaming_permission(member) + ) + + @commands.command(aliases=("streaming",)) + @commands.has_any_role(*STAFF_ROLES) + async def stream(self, ctx: commands.Context, member: discord.Member, duration: Expiry = None) -> None: + """ + Temporarily grant streaming permissions to a member for a given duration. + + A unit of time should be appended to the duration. + Units (∗case-sensitive): + \u2003`y` - years + \u2003`m` - months∗ + \u2003`w` - weeks + \u2003`d` - days + \u2003`h` - hours + \u2003`M` - minutes∗ + \u2003`s` - seconds + + Alternatively, an ISO 8601 timestamp can be provided for the duration. + """ + log.trace(f"Attempting to give temporary streaming permission to {member} ({member.id}).") + + if duration is None: + # Use default duration and convert back to datetime as Embed.timestamp doesn't support Arrow + duration = arrow.utcnow() + timedelta(minutes=VideoPermission.default_permission_duration) + duration = duration.datetime + elif duration.tzinfo is None: + # Make duration tz-aware. 
+ # ISODateTime could already include tzinfo, this check is so it isn't overwritten. + duration = duration.replace(tzinfo=timezone.utc) + + # Check if the member already has streaming permission + already_allowed = any(Roles.video == role.id for role in member.roles) + if already_allowed: + await ctx.send(f"{Emojis.cross_mark} {member.mention} can already stream.") + log.debug(f"{member} ({member.id}) already has permission to stream.") + return + + # Schedule task to remove streaming permission from Member and add it to task cache + self.scheduler.schedule_at(duration, member.id, self._revoke_streaming_permission(member)) + await self.task_cache.set(member.id, duration.timestamp()) + + await member.add_roles(discord.Object(Roles.video), reason="Temporary streaming access granted") + + # Use embed as embed timestamps do timezone conversions. + embed = discord.Embed( + description=f"{Emojis.check_mark} {member.mention} can now stream.", + colour=Colours.soft_green + ) + embed.set_footer(text=f"Streaming permission has been given to {member} until") + embed.timestamp = duration + + # Mention in content as mentions in embeds don't ping + await ctx.send(content=member.mention, embed=embed) + + # Convert here for nicer logging + revoke_time = format_infraction_with_duration(str(duration)) + log.debug(f"Successfully gave {member} ({member.id}) permission to stream until {revoke_time}.") + + @commands.command(aliases=("pstream",)) + @commands.has_any_role(*STAFF_ROLES) + async def permanentstream(self, ctx: commands.Context, member: discord.Member) -> None: + """Permanently grants the given member the permission to stream.""" + log.trace(f"Attempting to give permanent streaming permission to {member} ({member.id}).") + + # Check if the member already has streaming permission + if any(Roles.video == role.id for role in member.roles): + if member.id in self.scheduler: + # Member has temp permission, so cancel the task to revoke later and delete from cache + 
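One subtlety in the tz-handling branch above: `datetime.replace` returns a new object rather than mutating in place, so its result must be assigned or the duration stays naive. A quick self-contained demonstration:

```python
from datetime import datetime, timezone

dt = datetime(2021, 4, 12, 15, 0)

dt.replace(tzinfo=timezone.utc)  # result discarded: dt is unchanged
assert dt.tzinfo is None

dt = dt.replace(tzinfo=timezone.utc)  # assign the returned copy
assert dt.tzinfo is timezone.utc
```

A naive duration here would later be subtracted from an aware `now()`, which raises `TypeError`, so the assignment matters.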
self.scheduler.cancel(member.id) + await self.task_cache.delete(member.id) + + await ctx.send(f"{Emojis.check_mark} Permanently granted {member.mention} the permission to stream.") + log.debug( + f"Successfully upgraded temporary streaming permission for {member} ({member.id}) to permanent." + ) + return + + await ctx.send(f"{Emojis.cross_mark} This member can already stream.") + log.debug(f"{member} ({member.id}) already had permanent streaming permission.") + return + + await member.add_roles(discord.Object(Roles.video), reason="Permanent streaming access granted") + await ctx.send(f"{Emojis.check_mark} Permanently granted {member.mention} the permission to stream.") + log.debug(f"Successfully gave {member} ({member.id}) permanent streaming permission.") + + @commands.command(aliases=("unstream", "rstream")) + @commands.has_any_role(*STAFF_ROLES) + async def revokestream(self, ctx: commands.Context, member: discord.Member) -> None: + """Revoke the permission to stream from the given member.""" + log.trace(f"Attempting to remove streaming permission from {member} ({member.id}).") + + # Check if the member already has streaming permission + if any(Roles.video == role.id for role in member.roles): + if member.id in self.scheduler: + # Member has temp permission, so cancel the task to revoke later and delete from cache + self.scheduler.cancel(member.id) + await self.task_cache.delete(member.id) + await self._revoke_streaming_permission(member) + + await ctx.send(f"{Emojis.check_mark} Revoked the permission to stream from {member.mention}.") + log.debug(f"Successfully revoked streaming permission from {member} ({member.id}).") + return + + await ctx.send(f"{Emojis.cross_mark} This member doesn't have video permissions to remove!") + log.debug(f"{member} ({member.id}) didn't have the streaming permission to remove!") + + +def setup(bot: Bot) -> None: + """Loads the Stream cog.""" + bot.add_cog(Stream(bot)) diff --git a/bot/exts/recruitment/talentpool/_cog.py 
b/bot/exts/recruitment/talentpool/_cog.py index b809cea17..72604be51 100644 --- a/bot/exts/recruitment/talentpool/_cog.py +++ b/bot/exts/recruitment/talentpool/_cog.py @@ -1,14 +1,16 @@ import logging import textwrap from collections import ChainMap +from io import StringIO from typing import Union +import discord from discord import Color, Embed, Member, User from discord.ext.commands import Cog, Context, group, has_any_role from bot.api import ResponseCodeError from bot.bot import Bot -from bot.constants import Channels, Guild, MODERATION_ROLES, STAFF_ROLES, Webhooks +from bot.constants import Channels, Emojis, Guild, MODERATION_ROLES, STAFF_ROLES, Webhooks from bot.converters import FetchedMember from bot.exts.moderation.watchchannels._watchchannel import WatchChannel from bot.exts.recruitment.talentpool._review import Reviewer @@ -113,15 +115,39 @@ class TalentPool(WatchChannel, Cog, name="Talentpool"): """ await ctx.invoke(self.watched_command, oldest_first=True, update_cache=update_cache) + @nomination_group.command(name='forcewatch', aliases=('fw', 'forceadd', 'fa'), root_aliases=("forcenominate",)) + @has_any_role(*MODERATION_ROLES) + async def force_watch_command(self, ctx: Context, user: FetchedMember, *, reason: str = '') -> None: + """ + Adds the given `user` to the talent pool, from any channel. + + A `reason` for adding the user to the talent pool is optional. + """ + await self._watch_user(ctx, user, reason) + @nomination_group.command(name='watch', aliases=('w', 'add', 'a'), root_aliases=("nominate",)) @has_any_role(*STAFF_ROLES) async def watch_command(self, ctx: Context, user: FetchedMember, *, reason: str = '') -> None: """ - Relay messages sent by the given `user` to the `#talent-pool` channel. + Adds the given `user` to the talent pool. A `reason` for adding the user to the talent pool is optional. - If given, it will be displayed in the header when relaying messages of this user to the channel. 
+ This command can only be used in the `#nominations` channel. """ + if ctx.channel.id != Channels.nominations: + if any(role.id in MODERATION_ROLES for role in ctx.author.roles): + await ctx.send( + f":x: Nominations should be run in the <#{Channels.nominations}> channel. " + "Use `!tp forcewatch` to override this check." + ) + else: + await ctx.send(f":x: Nominations must be run in the <#{Channels.nominations}> channel") + return + + await self._watch_user(ctx, user, reason) + + async def _watch_user(self, ctx: Context, user: FetchedMember, reason: str) -> None: + """Adds the given user to the talent pool.""" if user.bot: await ctx.send(f":x: I'm sorry {ctx.author}, I'm afraid I can't do that. I only watch humans.") return @@ -306,7 +332,18 @@ class TalentPool(WatchChannel, Cog, name="Talentpool"): """Mark a user's nomination as reviewed and cancel the review task.""" if not await self.reviewer.mark_reviewed(ctx, user_id): return - await ctx.send(f"✅ The user with ID `{user_id}` was marked as reviewed.") + await ctx.send(f"{Emojis.check_mark} The user with ID `{user_id}` was marked as reviewed.") + + @nomination_group.command(aliases=('gr',)) + @has_any_role(*MODERATION_ROLES) + async def get_review(self, ctx: Context, user_id: int) -> None: + """Get the user's review as a markdown file.""" + review = (await self.reviewer.make_review(user_id))[0] + if review: + file = discord.File(StringIO(review), f"{user_id}_review.md") + await ctx.send(file=file) + else: + await ctx.send(f"There doesn't appear to be an active nomination for {user_id}") @nomination_group.command(aliases=('review',)) @has_any_role(*MODERATION_ROLES) @@ -316,7 +353,7 @@ class TalentPool(WatchChannel, Cog, name="Talentpool"): return await self.reviewer.post_review(user_id, update_database=False) - await ctx.message.add_reaction("✅") + await ctx.message.add_reaction(Emojis.check_mark) @Cog.listener() async def on_member_ban(self, guild: Guild, user: Union[User, Member]) -> None: diff --git 
a/bot/exts/recruitment/talentpool/_review.py b/bot/exts/recruitment/talentpool/_review.py index fb3461238..11aa3b62b 100644 --- a/bot/exts/recruitment/talentpool/_review.py +++ b/bot/exts/recruitment/talentpool/_review.py @@ -66,26 +66,40 @@ class Reviewer: self._review_scheduler.schedule_at(review_at, user_id, self.post_review(user_id, update_database=True)) async def post_review(self, user_id: int, update_database: bool) -> None: - """Format a generic review of a user and post it to the nomination voting channel.""" + """Format the review of a user and post it to the nomination voting channel.""" + review, seen_emoji = await self.make_review(user_id) + if not review: + return + + guild = self.bot.get_guild(Guild.id) + channel = guild.get_channel(Channels.nomination_voting) + log.trace(f"Posting the review of {user_id}") + message = (await self._bulk_send(channel, review))[-1] + if seen_emoji: + for reaction in (seen_emoji, "\N{THUMBS UP SIGN}", "\N{THUMBS DOWN SIGN}"): + await message.add_reaction(reaction) + + if update_database: + nomination = self._pool.watched_users[user_id] + await self.bot.api_client.patch(f"{self._pool.api_endpoint}/{nomination['id']}", json={"reviewed": True}) + + async def make_review(self, user_id: int) -> typing.Tuple[str, Optional[Emoji]]: + """Format a generic review of a user and return it with the seen emoji.""" + log.trace(f"Formatting the review of {user_id}") nomination = self._pool.watched_users[user_id] if not nomination: log.trace(f"There doesn't appear to be an active nomination for {user_id}") - return + return "", None guild = self.bot.get_guild(Guild.id) - channel = guild.get_channel(Channels.nomination_voting) member = guild.get_member(user_id) - if update_database: - await self.bot.api_client.patch(f"{self._pool.api_endpoint}/{nomination['id']}", json={"reviewed": True}) - if not member: - await channel.send( - f"I tried to review the user with ID `{user_id}`, but they don't appear to be on the server 😔" - ) - return + 
return ( + f"I tried to review the user with ID `{user_id}`, but they don't appear to be on the server :pensive:" + ), None opening = f"<@&{Roles.moderators}> <@&{Roles.admins}>\n{member.mention} ({member}) for Helper!" @@ -100,14 +114,11 @@ class Reviewer: vote_request = ( "*Refer to their nomination and infraction histories for further details*.\n" f"*Please react {seen_emoji} if you've seen this post." - " Then react 👍 for approval, or 👎 for disapproval*." + " Then react :+1: for approval, or :-1: for disapproval*." ) - review = "\n\n".join(part for part in (opening, current_nominations, review_body, vote_request)) - - message = (await self._bulk_send(channel, review))[-1] - for reaction in (seen_emoji, "👍", "👎"): - await message.add_reaction(reaction) + review = "\n\n".join((opening, current_nominations, review_body, vote_request)) + return review, seen_emoji async def _construct_review_body(self, member: Member) -> str: """Formats the body of the nomination, with details of activity, infractions, and previous nominations.""" @@ -256,10 +267,10 @@ class Reviewer: @staticmethod def _random_ducky(guild: Guild) -> Union[Emoji, str]: - """Picks a random ducky emoji to be used to mark the vote as seen. If no duckies found returns 👀.""" + """Picks a random ducky emoji to be used to mark the vote as seen. 
If no duckies found returns :eyes:.""" duckies = [emoji for emoji in guild.emojis if emoji.name.startswith("ducky")] if not duckies: - return "👀" + return ":eyes:" return random.choice(duckies) @staticmethod @@ -289,12 +300,12 @@ class Reviewer: await self._pool.fetch_user_cache() if user_id not in self._pool.watched_users: log.trace(f"Can't find a nominated user with id {user_id}") - await ctx.send(f"❌ Can't find a currently nominated user with id `{user_id}`") + await ctx.send(f":x: Can't find a currently nominated user with id `{user_id}`") return False nomination = self._pool.watched_users[user_id] if nomination["reviewed"]: - await ctx.send("❌ This nomination was already reviewed, but here's a cookie 🍪") + await ctx.send(":x: This nomination was already reviewed, but here's a cookie :cookie:") return False await self.bot.api_client.patch(f"{self._pool.api_endpoint}/{nomination['id']}", json={"reviewed": True}) diff --git a/bot/exts/utils/utils.py b/bot/exts/utils/utils.py index a5d6f69b9..cae7f2593 100644 --- a/bot/exts/utils/utils.py +++ b/bot/exts/utils/utils.py @@ -9,7 +9,7 @@ from discord.ext.commands import BadArgument, Cog, Context, clean_content, comma from discord.utils import snowflake_time from bot.bot import Bot -from bot.constants import Channels, MODERATION_ROLES, STAFF_ROLES +from bot.constants import Channels, MODERATION_ROLES, Roles, STAFF_ROLES from bot.converters import Snowflake from bot.decorators import in_whitelist from bot.pagination import LinePaginator @@ -175,7 +175,7 @@ class Utils(Cog): await ctx.send(embed=embed) @command(aliases=("poll",)) - @has_any_role(*MODERATION_ROLES) + @has_any_role(*MODERATION_ROLES, Roles.project_leads, Roles.domain_leads) async def vote(self, ctx: Context, title: clean_content(fix_channel_mentions=True), *options: str) -> None: """ Build a quick voting poll with matching reactions with the provided options. 
diff --git a/bot/resources/tags/customhelp.md b/bot/resources/tags/customhelp.md new file mode 100644 index 000000000..6f0b17642 --- /dev/null +++ b/bot/resources/tags/customhelp.md @@ -0,0 +1,3 @@ +**Custom help commands in discord.py** + +To learn more about how to create custom help commands in discord.py by subclassing the help command, please see [this tutorial](https://gist.github.com/InterStella0/b78488fb28cadf279dfd3164b9f0cf96#embed-minimalhelpcommand) by Stella#2000 diff --git a/bot/resources/tags/intents.md b/bot/resources/tags/intents.md new file mode 100644 index 000000000..464caf0ba --- /dev/null +++ b/bot/resources/tags/intents.md @@ -0,0 +1,19 @@ +**Using intents in discord.py** + +Intents are a feature of Discord that tells the gateway exactly which events to send your bot. By default, discord.py has all intents enabled, except for the `Members` and `Presences` intents, which are needed for events such as `on_member_join` and to get members' statuses. + +To enable one of these intents, you need to first go to the [Discord developer portal](https://discord.com/developers/applications), then to the bot page of your bot's application. Scroll down to the `Privileged Gateway Intents` section, then enable the intents that you need. + +Next, in your bot you need to set the intents you want to connect with in the bot's constructor using the `intents` keyword argument, like this: + +```py +from discord import Intents +from discord.ext import commands + +intents = Intents.default() +intents.members = True + +bot = commands.Bot(command_prefix="!", intents=intents) +``` + +For more info about using intents, see the [discord.py docs on intents](https://discordpy.readthedocs.io/en/latest/intents.html), and for general information about them, see the [Discord developer documentation on intents](https://discord.com/developers/docs/topics/gateway#gateway-intents). 
diff --git a/bot/resources/tags/ytdl.md b/bot/resources/tags/ytdl.md index e34ecff44..df28024a0 100644 --- a/bot/resources/tags/ytdl.md +++ b/bot/resources/tags/ytdl.md @@ -1,12 +1,12 @@ Per [PyDis' Rule 5](https://pythondiscord.com/pages/rules), we are unable to assist with questions related to youtube-dl, commonly used by Discord bots to stream audio, as its use violates YouTube's Terms of Service. -For reference, this usage is covered by the following clauses in [YouTube's TOS](https://www.youtube.com/static?template=terms), as of 2019-07-22: +For reference, this usage is covered by the following clauses in [YouTube's TOS](https://www.youtube.com/static?gl=GB&template=terms), as of 2021-03-17: ``` -The following restrictions apply to your use of the Service. You are not allowed to: +The following restrictions apply to your use of the Service. You are not allowed to: -1. access, reproduce, download, distribute, transmit, broadcast, display, sell, license, alter, modify or otherwise use any part of the Service or any Content except: (a) as specifically permitted by the Service; (b) with prior written permission from YouTube and, if applicable, the respective rights holders; or (c) as permitted by applicable law; +1. access, reproduce, download, distribute, transmit, broadcast, display, sell, license, alter, modify or otherwise use any part of the Service or any Content except: (a) as specifically permitted by the Service; (b) with prior written permission from YouTube and, if applicable, the respective rights holders; or (c) as permitted by applicable law; -3. access the Service using any automated means (such as robots, botnets or scrapers) except: (a) in the case of public search engines, in accordance with YouTube’s robots.txt file; (b) with YouTube’s prior written permission; or (c) as permitted by applicable law; +3. 
access the Service using any automated means (such as robots, botnets or scrapers) except: (a) in the case of public search engines, in accordance with YouTube’s robots.txt file; (b) with YouTube’s prior written permission; or (c) as permitted by applicable law; 9. use the Service to view or listen to Content other than for personal, non-commercial use (for example, you may not publicly screen videos or stream music from the Service) ``` diff --git a/bot/utils/scheduling.py b/bot/utils/scheduling.py index 4dd036e4f..6843bae88 100644 --- a/bot/utils/scheduling.py +++ b/bot/utils/scheduling.py @@ -59,14 +59,18 @@ class Scheduler: def schedule_at(self, time: datetime, task_id: t.Hashable, coroutine: t.Coroutine) -> None: """ - Schedule `coroutine` to be executed at the given naïve UTC `time`. + Schedule `coroutine` to be executed at the given `time`. + + If `time` is timezone aware, then use that timezone to calculate now() when subtracting. + If `time` is naïve, then use UTC. If `time` is in the past, schedule `coroutine` immediately. If a task with `task_id` already exists, close `coroutine` instead of scheduling it. This prevents unawaited coroutine warnings. Don't pass a coroutine that'll be re-used elsewhere. 
""" - delay = (time - datetime.utcnow()).total_seconds() + now_datetime = datetime.now(time.tzinfo) if time.tzinfo else datetime.utcnow() + delay = (time - now_datetime).total_seconds() if delay > 0: coroutine = self._await_later(delay, task_id, coroutine) diff --git a/bot/utils/services.py b/bot/utils/services.py index 5949c9e48..db9c93d0f 100644 --- a/bot/utils/services.py +++ b/bot/utils/services.py @@ -47,7 +47,14 @@ async def send_to_paste_service(contents: str, *, extension: str = "") -> Option continue elif "key" in response_json: log.info(f"Successfully uploaded contents to paste service behind key {response_json['key']}.") - return URLs.paste_service.format(key=response_json['key']) + extension + + paste_link = URLs.paste_service.format(key=response_json['key']) + extension + + if extension == '.py': + return paste_link + + return paste_link + "?noredirect" + log.warning( f"Got unexpected JSON response from paste service: {response_json}\n" f"trying again ({attempt}/{FAILED_REQUEST_ATTEMPTS})." 
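The `schedule_at` change above picks "now" in the same timezone as the target time, so aware and naive datetimes both produce a sensible delay. A self-contained sketch of just that delay computation (the function name is illustrative):

```python
from datetime import datetime, timedelta, timezone


def seconds_until(time: datetime) -> float:
    # Aware targets are compared against an aware now() in their own tz;
    # naive targets against naive UTC, mirroring the hunk above.
    now = datetime.now(time.tzinfo) if time.tzinfo else datetime.utcnow()
    return (time - now).total_seconds()


aware = datetime.now(timezone.utc) + timedelta(minutes=1)
naive = datetime.utcnow() + timedelta(minutes=1)
assert 0 < seconds_until(aware) <= 60
assert 0 < seconds_until(naive) <= 60
```

Mixing the two styles (aware target minus naive now, or vice versa) raises `TypeError`, which is exactly what the `time.tzinfo` branch avoids.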
diff --git a/config-default.yml b/config-default.yml index 502f0f861..8c6e18470 100644 --- a/config-default.yml +++ b/config-default.yml @@ -139,6 +139,7 @@ guild: help_dormant: 691405908919451718 help_in_use: 696958401460043776 logs: &LOGS 468520609152892958 + moderators: &MODS_CATEGORY 749736277464842262 modmail: &MODMAIL 714494672835444826 voice: 356013253765234688 @@ -150,7 +151,6 @@ guild: python_events: &PYEVENTS_CHANNEL 729674110270963822 python_news: &PYNEWS_CHANNEL 704372456592506880 reddit: &REDDIT_CHANNEL 458224812528238616 - user_event_announcements: &USER_EVENT_A 592000283102674944 # Development dev_contrib: &DEV_CONTRIB 635950537262759947 @@ -163,13 +163,13 @@ guild: # Python Help: Available cooldown: 720603994149486673 + how_to_get_help: 704250143020417084 # Topical discord_py: 343944376055103488 # Logs attachment_log: &ATTACH_LOG 649243850006855680 - dm_log: 653713721625018428 message_log: &MESSAGE_LOG 467752170159079424 mod_log: &MOD_LOG 282638479504965634 user_log: 528976905546760203 @@ -193,15 +193,12 @@ guild: helpers: &HELPERS 385474242440986624 incidents: 714214212200562749 incidents_archive: 720668923636351037 - mods: &MODS 305126844661760000 mod_alerts: 473092532147060736 - mod_appeals: &MOD_APPEALS 808790025688711198 - mod_meta: &MOD_META 775412552795947058 - mod_spam: &MOD_SPAM 620607373828030464 - mod_tools: &MOD_TOOLS 775413915391098921 + nominations: 822920136150745168 nomination_voting: 822853512709931008 organisation: &ORGANISATION 551789653284356126 staff_lounge: &STAFF_LOUNGE 464905259261755392 + staff_info: &STAFF_INFO 396684402404622347 # Staff announcement channels admin_announcements: &ADMIN_ANNOUNCEMENTS 749736155569848370 @@ -226,17 +223,13 @@ guild: talent_pool: &TALENT_POOL 534321732593647616 moderation_categories: + - *MODS_CATEGORY - *MODMAIL - *LOGS moderation_channels: - *ADMINS - *ADMIN_SPAM - - *MOD_APPEALS - - *MOD_META - - *MOD_TOOLS - - *MODS - - *MOD_SPAM # Modlog cog ignores events which occur in these channels 
     modlog_blacklist:
@@ -265,14 +258,19 @@ guild:
         admins: &ADMINS_ROLE 267628507062992896
         core_developers: 587606783669829632
         devops: 409416496733880320
+        domain_leads: 807415650778742785
         helpers: &HELPERS_ROLE 267630620367257601
         moderators: &MODS_ROLE 267629731250176001
         owners: &OWNERS_ROLE 267627879762755584
+        project_leads: 815701647526330398

         # Code Jam
         jammers: 737249140966162473
         team_leaders: 737250302834638889

+        # Streaming
+        video: 764245844798079016
+
     moderation_roles:
         - *ADMINS_ROLE
         - *MODS_ROLE
@@ -287,7 +285,6 @@ guild:
     webhooks:
         big_brother: 569133704568373283
         dev_log: 680501655111729222
-        dm_log: 654567640664244225
         duck_pond: 637821475327311927
         incidents_archive: 720671599790915702
         python_news: &PYNEWS_WEBHOOK 704381182279942324
@@ -324,7 +321,6 @@ filter:
         - *MOD_LOG
         - *STAFF_LOUNGE
         - *TALENT_POOL
-        - *USER_EVENT_A

     role_whitelist:
         - *ADMINS_ROLE
@@ -469,8 +465,12 @@ help_channels:
     cmd_whitelist:
         - *HELPERS_ROLE

-    # Allowed duration of inactivity before making a channel dormant
-    idle_minutes: 30
+    # Allowed duration of inactivity by claimant before making a channel dormant
+    idle_minutes_claimant: 30
+
+    # Allowed duration of inactivity by others before making a channel dormant
+    # `idle_minutes_claimant` must also be met, before a channel is closed
+    idle_minutes_others: 10

     # Allowed duration of inactivity when channel is empty (due to deleted messages)
     # before message making a channel dormant
@@ -481,7 +481,7 @@ help_channels:

     # Maximum number of channels across all 3 categories
     # Note Discord has a hard limit of 50 channels per category, so this shouldn't be > 50
-    max_total_channels: 32
+    max_total_channels: 42

     # Prefix for help channel names
     name_prefix: 'help-'
@@ -513,12 +513,12 @@ duck_pond:
         - *PYEVENTS_CHANNEL
         - *MAILING_LISTS
         - *REDDIT_CHANNEL
-        - *USER_EVENT_A
         - *DUCK_POND
         - *CHANGE_LOG
         - *STAFF_ANNOUNCEMENTS
         - *MOD_ANNOUNCEMENTS
         - *ADMIN_ANNOUNCEMENTS
+        - *STAFF_INFO


python_news:
@@ -546,3 +546,7 @@ branding:

config:
    required_keys:
        ['bot.token']
+
+
+video_permission:
+    default_permission_duration: 5  # Default duration for stream command in minutes
diff --git a/tests/bot/exts/info/test_information.py b/tests/bot/exts/info/test_information.py
index 80731c9f0..a996ce477 100644
--- a/tests/bot/exts/info/test_information.py
+++ b/tests/bot/exts/info/test_information.py
@@ -283,6 +283,7 @@ class UserEmbedTests(unittest.IsolatedAsyncioTestCase):
         user = helpers.MockMember()
         user.nick = None
         user.__str__ = unittest.mock.Mock(return_value="Mr. Hemlock")
+        user.colour = 0

         embed = await self.cog.create_user_embed(ctx, user)

@@ -298,6 +299,7 @@ class UserEmbedTests(unittest.IsolatedAsyncioTestCase):
         user = helpers.MockMember()
         user.nick = "Cat lover"
         user.__str__ = unittest.mock.Mock(return_value="Mr. Hemlock")
+        user.colour = 0

         embed = await self.cog.create_user_embed(ctx, user)

@@ -311,10 +313,9 @@ class UserEmbedTests(unittest.IsolatedAsyncioTestCase):
         """Created `!user` embeds should not contain mention of the @everyone-role."""
         ctx = helpers.MockContext(channel=helpers.MockTextChannel(id=1))
         admins_role = helpers.MockRole(name='Admins')
-        admins_role.colour = 100

         # A `MockMember` has the @Everyone role by default; we add the Admins to that.
-        user = helpers.MockMember(roles=[admins_role], top_role=admins_role)
+        user = helpers.MockMember(roles=[admins_role], colour=100)

         embed = await self.cog.create_user_embed(ctx, user)

@@ -332,12 +333,11 @@ class UserEmbedTests(unittest.IsolatedAsyncioTestCase):
         ctx = helpers.MockContext(channel=helpers.MockTextChannel(id=50))

         moderators_role = helpers.MockRole(name='Moderators')
-        moderators_role.colour = 100

         infraction_counts.return_value = ("Infractions", "expanded infractions info")
         nomination_counts.return_value = ("Nominations", "nomination info")

-        user = helpers.MockMember(id=314, roles=[moderators_role], top_role=moderators_role)
+        user = helpers.MockMember(id=314, roles=[moderators_role], colour=100)
         embed = await self.cog.create_user_embed(ctx, user)

         infraction_counts.assert_called_once_with(user)
@@ -367,11 +367,10 @@ class UserEmbedTests(unittest.IsolatedAsyncioTestCase):
         ctx = helpers.MockContext(channel=helpers.MockTextChannel(id=100))

         moderators_role = helpers.MockRole(name='Moderators')
-        moderators_role.colour = 100

         infraction_counts.return_value = ("Infractions", "basic infractions info")

-        user = helpers.MockMember(id=314, roles=[moderators_role], top_role=moderators_role)
+        user = helpers.MockMember(id=314, roles=[moderators_role], colour=100)
         embed = await self.cog.create_user_embed(ctx, user)

         infraction_counts.assert_called_once_with(user)
@@ -407,12 +406,11 @@ class UserEmbedTests(unittest.IsolatedAsyncioTestCase):
         ctx = helpers.MockContext()

         moderators_role = helpers.MockRole(name='Moderators')
-        moderators_role.colour = 100

-        user = helpers.MockMember(id=314, roles=[moderators_role], top_role=moderators_role)
+        user = helpers.MockMember(id=314, roles=[moderators_role], colour=100)
         embed = await self.cog.create_user_embed(ctx, user)

-        self.assertEqual(embed.colour, discord.Colour(moderators_role.colour))
+        self.assertEqual(embed.colour, discord.Colour(100))

     @unittest.mock.patch(
         f"{COG_PATH}.basic_user_infraction_counts",
@@ -422,7 +420,7 @@
 class UserEmbedTests(unittest.IsolatedAsyncioTestCase):
         """The embed should be created with a blurple colour if the user has no assigned roles."""
         ctx = helpers.MockContext()

-        user = helpers.MockMember(id=217)
+        user = helpers.MockMember(id=217, colour=discord.Colour.default())
         embed = await self.cog.create_user_embed(ctx, user)

         self.assertEqual(embed.colour, discord.Colour.blurple())

@@ -435,7 +433,7 @@ class UserEmbedTests(unittest.IsolatedAsyncioTestCase):
         """The embed thumbnail should be set to the user's avatar in `png` format."""
         ctx = helpers.MockContext()

-        user = helpers.MockMember(id=217)
+        user = helpers.MockMember(id=217, colour=0)
         user.avatar_url_as.return_value = "avatar url"

         embed = await self.cog.create_user_embed(ctx, user)
diff --git a/tests/bot/utils/test_services.py b/tests/bot/utils/test_services.py
index 1b48f6560..3b71022db 100644
--- a/tests/bot/utils/test_services.py
+++ b/tests/bot/utils/test_services.py
@@ -30,9 +30,9 @@ class PasteTests(unittest.IsolatedAsyncioTestCase):
         """Url with specified extension is returned on successful requests."""
         key = "paste_key"
         test_cases = (
-            (f"https://paste_service.com/{key}.txt", "txt"),
+            (f"https://paste_service.com/{key}.txt?noredirect", "txt"),
             (f"https://paste_service.com/{key}.py", "py"),
-            (f"https://paste_service.com/{key}", ""),
+            (f"https://paste_service.com/{key}?noredirect", ""),
         )
         response = MagicMock(
             json=AsyncMock(return_value={"key": key})
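The updated `test_services.py` cases follow the `?noredirect` rule introduced in `send_to_paste_service`: every paste link except a `.py` one gets the suffix, since Python pastes render with highlighting while other extensions would otherwise redirect. A standalone sketch of that rule (the `build_paste_link` helper and dotless-extension argument are illustrative, not the bot's actual API):

```python
def build_paste_link(base_url: str, key: str, extension: str = "") -> str:
    """Build a paste URL from its key; append `?noredirect` to suppress
    the paste site's redirect for everything except Python files."""
    link = f"{base_url}/{key}" + (f".{extension}" if extension else "")
    if extension == "py":
        return link
    return link + "?noredirect"


# Mirrors the three test cases in the patch:
assert build_paste_link("https://paste_service.com", "paste_key", "txt") == \
    "https://paste_service.com/paste_key.txt?noredirect"
assert build_paste_link("https://paste_service.com", "paste_key", "py") == \
    "https://paste_service.com/paste_key.py"
assert build_paste_link("https://paste_service.com", "paste_key") == \
    "https://paste_service.com/paste_key?noredirect"
```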