| author | 2020-04-05 14:13:40 +0100 |
|---|---|
| committer | 2020-04-05 14:13:40 +0100 |
| commit | 504ce02b71f3918cd4debbc40fcafa614c280ca4 (patch) |
| tree | 6d1d01138c4aab88b1c3aa416f8b40ec7820595d |
| parent | Add close alias for dormant command (diff) |
| parent | Merge pull request #813 from python-discord/feat/ci/b000/cache-pipenv (diff) |
Merge branch 'master' into feat/frontend/o200/help-channels
30 files changed, 906 insertions, 316 deletions
diff --git a/CONTRIBUTING.md b/CONTRIBUTING.md index 61d11f844..be591d17e 100644 --- a/CONTRIBUTING.md +++ b/CONTRIBUTING.md @@ -1,4 +1,4 @@ -# Contributing to one of our projects +# Contributing to one of Our Projects Our projects are open-source and are automatically deployed whenever commits are pushed to the `master` branch on each repository, so we've created a set of guidelines in order to keep everything clean and in working order. @@ -10,12 +10,12 @@ Note that contributions may be rejected on the basis of a contributor failing to 2. If you have direct access to the repository, **create a branch for your changes** and create a pull request for that branch. If not, create a branch on a fork of the repository and create a pull request from there. * It's common practice for a repository to reject direct pushes to `master`, so make branching a habit! * If PRing from your own fork, **ensure that "Allow edits from maintainers" is checked**. This gives permission for maintainers to commit changes directly to your fork, speeding up the review process. -3. **Adhere to the prevailing code style**, which we enforce using [flake8](http://flake8.pycqa.org/en/latest/index.html). - * Run `flake8` against your code **before** you push it. Your commit will be rejected by the build server if it fails to lint. - * [Git Hooks](https://git-scm.com/book/en/v2/Customizing-Git-Git-Hooks) are a powerful tool that can be a daunting to set up. Fortunately, [`pre-commit`](https://github.com/pre-commit/pre-commit) abstracts this process away from you and is provided as a dev dependency for this project. Run `pipenv run precommit` when setting up the project and you'll never have to worry about breaking the build for linting errors. +3. **Adhere to the prevailing code style**, which we enforce using [`flake8`](http://flake8.pycqa.org/en/latest/index.html) and [`pre-commit`](https://pre-commit.com/). + * Run `flake8` and `pre-commit` against your code [**before** you push it](https://soundcloud.com/lemonsaurusrex/lint-before-you-push). Your commit will be rejected by the build server if it fails to lint. + * [Git Hooks](https://git-scm.com/book/en/v2/Customizing-Git-Git-Hooks) are a powerful git feature for executing custom scripts when certain important git actions occur. The pre-commit hook is the first hook executed during the commit process and can be used to check the code being committed & abort the commit if issues, such as linting failures, are detected. While git hooks can seem daunting to configure, the `pre-commit` framework abstracts this process away from you and is provided as a dev dependency for this project. Run `pipenv run precommit` when setting up the project and you'll never have to worry about committing code that fails linting. 4. **Make great commits**. A well structured git log is key to a project's maintainability; it efficiently provides insight into when and *why* things were done for future maintainers of the project. * Commits should be as narrow in scope as possible. Commits that span hundreds of lines across multiple unrelated functions and/or files are very hard for maintainers to follow. After about a week they'll probably be hard for you to follow too. - * Try to avoid making minor commits for fixing typos or linting errors. Since you've already set up a pre-commit hook to run `flake8` before a commit, you shouldn't be committing linting issues anyway. + * Avoid making minor commits for fixing typos or linting errors. 
Since you've already set up a `pre-commit` hook to run the linting pipeline before a commit, you shouldn't be committing linting issues anyway. * A more in-depth guide to writing great commit messages can be found in Chris Beam's [*How to Write a Git Commit Message*](https://chris.beams.io/posts/git-commit/) 5. **Avoid frequent pushes to the main repository**. This goes for PRs opened against your fork as well. Our test build pipelines are triggered every time a push to the repository (or PR) is made. Try to batch your commits until you've finished working for that session, or you've reached a point where collaborators need your commits to continue their own work. This also provides you the opportunity to amend commits for minor changes rather than having to commit them on their own because you've already pushed. * This includes merging master into your branch. Try to leave merging from master for after your PR passes review; a maintainer will bring your PR up to date before merging. Exceptions to this include: resolving merge conflicts, needing something that was pushed to master for your branch, or something was pushed to master that could potentionally affect the functionality of what you're writing. @@ -24,13 +24,12 @@ Note that contributions may be rejected on the basis of a contributor failing to * One option is to fork the other contributor's repository and submit your changes to their branch with your own pull request. We suggest following these guidelines when interacting with their repository as well. * The author(s) of inactive PRs and claimed issues will be be pinged after a week of inactivity for an update. Continued inactivity may result in the issue being released back to the community and/or PR closure. 8. **Work as a team** and collaborate wherever possible. Keep things friendly and help each other out - these are shared projects and nobody likes to have their feet trodden on. -9. **Internal projects are internal**. As a contributor, you have access to information that the rest of the server does not. With this trust comes responsibility - do not release any information you have learned as a result of your contributor position. We are very strict about announcing things at specific times, and many staff members will not appreciate a disruption of the announcement schedule. -10. All static content, such as images or audio, **must be licensed for open public use**. +9. All static content, such as images or audio, **must be licensed for open public use**. * Static content must be hosted by a service designed to do so. Failing to do so is known as "leeching" and is frowned upon, as it generates extra bandwidth costs to the host without providing benefit. It would be best if appropriately licensed content is added to the repository itself so it can be served by PyDis' infrastructure. -Above all, the needs of our community should come before the wants of an individual. Work together, build solutions to problems and try to do so in a way that people can learn from easily. Abuse of our trust may result in the loss of your Contributor role, especially in relation to Rule 7. +Above all, the needs of our community should come before the wants of an individual. Work together, build solutions to problems and try to do so in a way that people can learn from easily. Abuse of our trust may result in the loss of your Contributor role. -## Changes to this arrangement +## Changes to this Arrangement All projects evolve over time, and this contribution guide is no different. 
This document is open to pull requests or changes by contributors. If you believe you have something valuable to add or change, please don't hesitate to do so in a PR. @@ -48,10 +47,14 @@ When pulling down changes from GitHub, remember to sync your environment using ` For example: ```py -def foo(input_1: int, input_2: dict) -> bool: +import typing as t + + +def foo(input_1: int, input_2: t.Dict[str, str]) -> bool: + ... ``` -Tells us that `foo` accepts an `int` and a `dict` and returns a `bool`. +Tells us that `foo` accepts an `int` and a `dict`, with `str` keys and values, and returns a `bool`. All function declarations should be type hinted in code contributed to the PyDis organization. @@ -63,15 +66,19 @@ Many documentation packages provide support for automatic documentation generati For example: ```py -def foo(bar: int, baz: dict=None) -> bool: +import typing as t + + +def foo(bar: int, baz: t.Optional[t.Dict[str, str]] = None) -> bool: """ Does some things with some stuff. :param bar: Some input - :param baz: Optional, some other input + :param baz: Optional, some dictionary with string keys and values :return: Some boolean """ + ... ``` Since PyDis does not utilize automatic documentation generation, use of this syntax should not be used in code contributed to the organization. Should the purpose and type of the input variables not be easily discernable from the variable name and type annotation, a prose explanation can be used. Explicit references to variables, functions, classes, etc. should be wrapped with backticks (`` ` ``). @@ -79,25 +86,33 @@ Since PyDis does not utilize automatic documentation generation, use of this syn For example, the above docstring would become: ```py -def foo(bar: int, baz: dict=None) -> bool: +import typing as t + + +def foo(bar: int, baz: t.Optional[t.Dict[str, str]] = None) -> bool: """ Does some things with some stuff. This function takes an index, `bar` and checks for its presence in the database `baz`, passed as a dictionary. Returns `False` if `baz` is not passed. """ + ... ``` ### Logging Levels -The project currently defines [`logging`](https://docs.python.org/3/library/logging.html) levels as follows: -* **TRACE:** Use this for tracing every step of a complex process. That way we can see which step of the process failed. Err on the side of verbose. **Note:** This is a PyDis-implemented logging level. -* **DEBUG:** Someone is interacting with the application, and the application is behaving as expected. -* **INFO:** Something completely ordinary happened. Like a cog loading during startup. -* **WARNING:** Someone is interacting with the application in an unexpected way or the application is responding in an unexpected way, but without causing an error. -* **ERROR:** An error that affects the specific part that is being interacted with -* **CRITICAL:** An error that affects the whole application. +The project currently defines [`logging`](https://docs.python.org/3/library/logging.html) levels as follows, from lowest to highest severity: +* **TRACE:** These events should be used to provide a *verbose* trace of every step of a complex process. This is essentially the `logging` equivalent of sprinkling `print` statements throughout the code. + * **Note:** This is a PyDis-implemented logging level. +* **DEBUG:** These events should add context to what's happening in a development setup to make it easier to follow what's going while working on a project. This is in the same vein as **TRACE** logging but at a much lower level of verbosity. 
+* **INFO:** These events are normal and don't need direct attention but are worth keeping track of in production, like checking which cogs were loaded during a start-up. +* **WARNING:** These events are out of the ordinary and should be fixed, but have not caused a failure. + * **NOTE:** Events at this logging level and higher should be reserved for events that require the attention of the DevOps team. +* **ERROR:** These events have caused a failure in a specific part of the application and require urgent attention. +* **CRITICAL:** These events have caused the whole application to fail and require immediate intervention. + +Ensure that log messages are succinct. Should you want to pass additional useful information that would otherwise make the log message overly verbose the `logging` module accepts an `extra` kwarg, which can be used to pass a dictionary. This is used to populate the `__dict__` of the `LogRecord` created for the logging event with user-defined attributes that can be accessed by a log handler. Additional information and caveats may be found [in Python's `logging` documentation](https://docs.python.org/3/library/logging.html#logging.Logger.debug). ### Work in Progress (WIP) PRs -Github [has introduced a new PR feature](https://github.blog/2019-02-14-introducing-draft-pull-requests/) that allows the PR author to mark it as a WIP. This provides both a visual and functional indicator that the contents of the PR are in a draft state and not yet ready for formal review. +Github [provides a PR feature](https://github.blog/2019-02-14-introducing-draft-pull-requests/) that allows the PR author to mark it as a WIP. This provides both a visual and functional indicator that the contents of the PR are in a draft state and not yet ready for formal review. This feature should be utilized in place of the traditional method of prepending `[WIP]` to the PR title. 
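For illustration only, here is a minimal sketch of the `extra` kwarg described in the logging section above; the format string, message, and the `cogs` attribute name are invented for the example and are not taken from the project's code:

```py
import logging

# A handler's format string may reference any attribute supplied via `extra`;
# "cogs" is an invented attribute name used purely for illustration.
logging.basicConfig(format="%(levelname)s | %(message)s | cogs=%(cogs)s")

log = logging.getLogger(__name__)

# Keep the message itself succinct; the structured detail travels in `extra`,
# which populates the LogRecord's __dict__ so handlers and formatters can use it.
log.warning("Some cogs failed to load.", extra={"cogs": ["bot.cogs.example"]})
```

Note that when a handler's format string references an `extra` attribute, every record passing through that handler must supply it, so shared formats usually stick to the standard `LogRecord` attributes.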
@@ -33,9 +33,7 @@ flake8-tidy-imports = "~=4.0" flake8-todo = "~=0.7" pep8-naming = "~=0.9" pre-commit = "~=2.1" -safety = "~=1.8" unittest-xml-reporting = "~=3.0" -dodgy = "~=0.1" [requires] python_version = "3.8" diff --git a/Pipfile.lock b/Pipfile.lock index 348456f2c..ad9a3173a 100644 --- a/Pipfile.lock +++ b/Pipfile.lock @@ -1,7 +1,7 @@ { "_meta": { "hash": { - "sha256": "b8b38e84230bdc37f8c8955e8dddc442183a2e23c4dfc6ed37c522644aecdeea" + "sha256": "2d3ba484e8467a115126b2ba39fa5f36f103ea455477813dd658797875c79cc9" }, "pipfile-spec": 6, "requires": { @@ -18,11 +18,11 @@ "default": { "aio-pika": { "hashes": [ - "sha256:0332bc13abbd8923dac657b331716778c55ea0a32ac0951306ce85edafcc916c", - "sha256:39770d8bc7e9059e28622d599e2ac9ebc16a7198b33d1743c1a496ca3b0f8170" + "sha256:9e4614636296e0040055bd6b304e97a38cc9796669ef391fc9b36649831d43ee", + "sha256:c9d242b3c7142d64b185feb6c5cce4154962610e89ec2e9b52bd69ef01f89b2f" ], "index": "pypi", - "version": "==6.5.3" + "version": "==6.6.0" }, "aiodns": { "hashes": [ @@ -159,11 +159,11 @@ }, "deepdiff": { "hashes": [ - "sha256:b3fa588d1eac7fa318ec1fb4f2004568e04cb120a1989feda8e5e7164bcbf07a", - "sha256:ed7342d3ed3c0c2058a3fb05b477c943c9959ef62223dca9baa3375718a25d87" + "sha256:59fc1e3e7a28dd0147b0f2b00e3e27181f0f0ef4286b251d5f214a5bcd9a9bc4", + "sha256:91360be1d9d93b1d9c13ae9c5048fa83d9cff17a88eb30afaa0d7ff2d0fee17d" ], "index": "pypi", - "version": "==4.2.0" + "version": "==4.3.2" }, "discord-py": { "hashes": [ @@ -189,10 +189,10 @@ }, "humanfriendly": { "hashes": [ - "sha256:2f79aaa2965c0fc3d79452e64ec2c7601d70d67e51ea2e99cb40afe3fe2824c5", - "sha256:6990c0af4b72f50ddf302900eb982edf199247e621e06d80d71b00b1a1574214" + "sha256:25c2108a45cfd1e8fbe9cdb30b825d34ef5d5675c8e11e4775c9aedbfb0bdee2", + "sha256:3a831920e40e55ad49adb64c9179ed50c604cabca72cd300e7bd5b51310e4ebb" ], - "version": "==8.0" + "version": "==8.1" }, "idna": { "hashes": [ @@ -331,10 +331,10 @@ }, "packaging": { "hashes": [ - "sha256:170748228214b70b672c581a3dd610ee51f733018650740e98c7df862a583f73", - "sha256:e665345f9eef0c621aa0bf2f8d78cf6d21904eef16a93f020240b704a57f1334" + "sha256:3c292b474fda1671ec57d46d739d072bfd495a4f51ad01a055121d81e952b7a3", + "sha256:82f77b9bee21c1bafbf35a84905d604d5d1223801d639cf3ed140bd651c08752" ], - "version": "==20.1" + "version": "==20.3" }, "pamqp": { "hashes": [ @@ -379,16 +379,17 @@ }, "pycparser": { "hashes": [ - "sha256:a988718abfad80b6b157acce7bf130a30876d27603738ac39f140993246b25b3" + "sha256:2d475327684562c3a96cc71adf7dc8c4f0565175cf86b6d7a404ff4c771f15f0", + "sha256:7582ad22678f0fcd81102833f60ef8d0e57288b6b5fb00323d101be910e35705" ], - "version": "==2.19" + "version": "==2.20" }, "pygments": { "hashes": [ - "sha256:2a3fe295e54a20164a9df49c75fa58526d3be48e14aceba6d6b1e8ac0bfd6f1b", - "sha256:98c8aa5a9f778fcd1026a17361ddaf7330d1b7c62ae97c3bb0ae73e0b9b6b0fe" + "sha256:647344a061c249a3b74e230c739f434d7ea4d8b1d5f3721bc0f3558049b38f44", + "sha256:ff7a40b4860b727ab48fad6360eb351cc1b33cbf9b15a0f689ca5353e9463324" ], - "version": "==2.5.2" + "version": "==2.6.1" }, "pyparsing": { "hashes": [ @@ -421,20 +422,20 @@ }, "pyyaml": { "hashes": [ - "sha256:059b2ee3194d718896c0ad077dd8c043e5e909d9180f387ce42012662a4946d6", - "sha256:1cf708e2ac57f3aabc87405f04b86354f66799c8e62c28c5fc5f88b5521b2dbf", - "sha256:24521fa2890642614558b492b473bee0ac1f8057a7263156b02e8b14c88ce6f5", - "sha256:4fee71aa5bc6ed9d5f116327c04273e25ae31a3020386916905767ec4fc5317e", - "sha256:70024e02197337533eef7b85b068212420f950319cc8c580261963aefc75f811", - 
"sha256:74782fbd4d4f87ff04159e986886931456a1894c61229be9eaf4de6f6e44b99e", - "sha256:940532b111b1952befd7db542c370887a8611660d2b9becff75d39355303d82d", - "sha256:cb1f2f5e426dc9f07a7681419fe39cee823bb74f723f36f70399123f439e9b20", - "sha256:dbbb2379c19ed6042e8f11f2a2c66d39cceb8aeace421bfc29d085d93eda3689", - "sha256:e3a057b7a64f1222b56e47bcff5e4b94c4f61faac04c7c4ecb1985e18caa3994", - "sha256:e9f45bd5b92c7974e59bcd2dcc8631a6b6cc380a904725fce7bc08872e691615" + "sha256:06a0d7ba600ce0b2d2fe2e78453a470b5a6e000a985dd4a4e54e436cc36b0e97", + "sha256:240097ff019d7c70a4922b6869d8a86407758333f02203e0fc6ff79c5dcede76", + "sha256:4f4b913ca1a7319b33cfb1369e91e50354d6f07a135f3b901aca02aa95940bd2", + "sha256:69f00dca373f240f842b2931fb2c7e14ddbacd1397d57157a9b005a6a9942648", + "sha256:73f099454b799e05e5ab51423c7bcf361c58d3206fa7b0d555426b1f4d9a3eaf", + "sha256:74809a57b329d6cc0fdccee6318f44b9b8649961fa73144a98735b0aaf029f1f", + "sha256:7739fc0fa8205b3ee8808aea45e968bc90082c10aef6ea95e855e10abf4a37b2", + "sha256:95f71d2af0ff4227885f7a6605c37fd53d3a106fcab511b8860ecca9fcf400ee", + "sha256:b8eac752c5e14d3eca0e6dd9199cd627518cb5ec06add0de9d32baeee6fe645d", + "sha256:cc8955cfbfc7a115fa81d85284ee61147059a753344bc51098f3ccd69b0d7e0c", + "sha256:d13155f591e6fcc1ec3b30685d50bf0711574e2c0dfffd7644babf8b5102ca1a" ], "index": "pypi", - "version": "==5.3" + "version": "==5.3.1" }, "requests": { "hashes": [ @@ -446,11 +447,11 @@ }, "sentry-sdk": { "hashes": [ - "sha256:480eee754e60bcae983787a9a13bc8f155a111aef199afaa4f289d6a76aa622a", - "sha256:a920387dc3ee252a66679d0afecd34479fb6fc52c2bc20763793ed69e5b0dcc0" + "sha256:23808d571d2461a4ce3784ec12bbee5bdb8c026c143fe79d36cef8a6d653e71f", + "sha256:bb90a4e19c7233a580715fc986cc44be2c48fc10b31e71580a2037e1c94b6950" ], "index": "pypi", - "version": "==0.14.2" + "version": "==0.14.3" }, "six": { "hashes": [ @@ -475,11 +476,11 @@ }, "sphinx": { "hashes": [ - "sha256:776ff8333181138fae52df65be733127539623bb46cc692e7fa0fcfc80d7aa88", - "sha256:ca762da97c3b5107cbf0ab9e11d3ec7ab8d3c31377266fd613b962ed971df709" + "sha256:b4c750d546ab6d7e05bdff6ac24db8ae3e8b8253a3569b754e445110a0a12b66", + "sha256:fc312670b56cb54920d6cc2ced455a22a547910de10b3142276495ced49231cb" ], "index": "pypi", - "version": "==2.4.3" + "version": "==2.4.4" }, "sphinxcontrib-applehelp": { "hashes": [ @@ -595,13 +596,6 @@ ], "version": "==19.3.0" }, - "certifi": { - "hashes": [ - "sha256:017c25db2a153ce562900032d5bc68e9f191e44e9a0f762f373977de9df1fbb3", - "sha256:25b64c7da4cd7479594d035c08c2d809eb4aab3a26e5a990ea98cc450c320f1f" - ], - "version": "==2019.11.28" - }, "cfgv": { "hashes": [ "sha256:1ccf53320421aeeb915275a196e23b3b8ae87dea8ac6698b1638001d4a486d53", @@ -609,56 +603,42 @@ ], "version": "==3.1.0" }, - "chardet": { - "hashes": [ - "sha256:84ab92ed1c4d4f16916e05906b6b75a6c0fb5db821cc65e70cbd64a3e2a5eaae", - "sha256:fc323ffcaeaed0e0a02bf4d117757b98aed530d9ed4531e3e15460124c106691" - ], - "version": "==3.0.4" - }, - "click": { - "hashes": [ - "sha256:2335065e6395b9e67ca716de5f7526736bfa6ceead690adf616d925bdc622b13", - "sha256:5b94b49521f6456670fdb30cd82a4eca9412788a93fa6dd6df72c94d5a8ff2d7" - ], - "version": "==7.0" - }, "coverage": { "hashes": [ - "sha256:15cf13a6896048d6d947bf7d222f36e4809ab926894beb748fc9caa14605d9c3", - "sha256:1daa3eceed220f9fdb80d5ff950dd95112cd27f70d004c7918ca6dfc6c47054c", - "sha256:1e44a022500d944d42f94df76727ba3fc0a5c0b672c358b61067abb88caee7a0", - "sha256:25dbf1110d70bab68a74b4b9d74f30e99b177cde3388e07cc7272f2168bd1477", - 
"sha256:3230d1003eec018ad4a472d254991e34241e0bbd513e97a29727c7c2f637bd2a", - "sha256:3dbb72eaeea5763676a1a1efd9b427a048c97c39ed92e13336e726117d0b72bf", - "sha256:5012d3b8d5a500834783689a5d2292fe06ec75dc86ee1ccdad04b6f5bf231691", - "sha256:51bc7710b13a2ae0c726f69756cf7ffd4362f4ac36546e243136187cfcc8aa73", - "sha256:527b4f316e6bf7755082a783726da20671a0cc388b786a64417780b90565b987", - "sha256:722e4557c8039aad9592c6a4213db75da08c2cd9945320220634f637251c3894", - "sha256:76e2057e8ffba5472fd28a3a010431fd9e928885ff480cb278877c6e9943cc2e", - "sha256:77afca04240c40450c331fa796b3eab6f1e15c5ecf8bf2b8bee9706cd5452fef", - "sha256:7afad9835e7a651d3551eab18cbc0fdb888f0a6136169fbef0662d9cdc9987cf", - "sha256:9bea19ac2f08672636350f203db89382121c9c2ade85d945953ef3c8cf9d2a68", - "sha256:a8b8ac7876bc3598e43e2603f772d2353d9931709345ad6c1149009fd1bc81b8", - "sha256:b0840b45187699affd4c6588286d429cd79a99d509fe3de0f209594669bb0954", - "sha256:b26aaf69713e5674efbde4d728fb7124e429c9466aeaf5f4a7e9e699b12c9fe2", - "sha256:b63dd43f455ba878e5e9f80ba4f748c0a2156dde6e0e6e690310e24d6e8caf40", - "sha256:be18f4ae5a9e46edae3f329de2191747966a34a3d93046dbdf897319923923bc", - "sha256:c312e57847db2526bc92b9bfa78266bfbaabac3fdcd751df4d062cd4c23e46dc", - "sha256:c60097190fe9dc2b329a0eb03393e2e0829156a589bd732e70794c0dd804258e", - "sha256:c62a2143e1313944bf4a5ab34fd3b4be15367a02e9478b0ce800cb510e3bbb9d", - "sha256:cc1109f54a14d940b8512ee9f1c3975c181bbb200306c6d8b87d93376538782f", - "sha256:cd60f507c125ac0ad83f05803063bed27e50fa903b9c2cfee3f8a6867ca600fc", - "sha256:d513cc3db248e566e07a0da99c230aca3556d9b09ed02f420664e2da97eac301", - "sha256:d649dc0bcace6fcdb446ae02b98798a856593b19b637c1b9af8edadf2b150bea", - "sha256:d7008a6796095a79544f4da1ee49418901961c97ca9e9d44904205ff7d6aa8cb", - "sha256:da93027835164b8223e8e5af2cf902a4c80ed93cb0909417234f4a9df3bcd9af", - "sha256:e69215621707119c6baf99bda014a45b999d37602cb7043d943c76a59b05bf52", - "sha256:ea9525e0fef2de9208250d6c5aeeee0138921057cd67fcef90fbed49c4d62d37", - "sha256:fca1669d464f0c9831fd10be2eef6b86f5ebd76c724d1e0706ebdff86bb4adf0" - ], - "index": "pypi", - "version": "==5.0.3" + "sha256:03f630aba2b9b0d69871c2e8d23a69b7fe94a1e2f5f10df5049c0df99db639a0", + "sha256:046a1a742e66d065d16fb564a26c2a15867f17695e7f3d358d7b1ad8a61bca30", + "sha256:0a907199566269e1cfa304325cc3b45c72ae341fbb3253ddde19fa820ded7a8b", + "sha256:165a48268bfb5a77e2d9dbb80de7ea917332a79c7adb747bd005b3a07ff8caf0", + "sha256:1b60a95fc995649464e0cd48cecc8288bac5f4198f21d04b8229dc4097d76823", + "sha256:1f66cf263ec77af5b8fe14ef14c5e46e2eb4a795ac495ad7c03adc72ae43fafe", + "sha256:2e08c32cbede4a29e2a701822291ae2bc9b5220a971bba9d1e7615312efd3037", + "sha256:3844c3dab800ca8536f75ae89f3cf566848a3eb2af4d9f7b1103b4f4f7a5dad6", + "sha256:408ce64078398b2ee2ec08199ea3fcf382828d2f8a19c5a5ba2946fe5ddc6c31", + "sha256:443be7602c790960b9514567917af538cac7807a7c0c0727c4d2bbd4014920fd", + "sha256:4482f69e0701139d0f2c44f3c395d1d1d37abd81bfafbf9b6efbe2542679d892", + "sha256:4a8a259bf990044351baf69d3b23e575699dd60b18460c71e81dc565f5819ac1", + "sha256:513e6526e0082c59a984448f4104c9bf346c2da9961779ede1fc458e8e8a1f78", + "sha256:5f587dfd83cb669933186661a351ad6fc7166273bc3e3a1531ec5c783d997aac", + "sha256:62061e87071497951155cbccee487980524d7abea647a1b2a6eb6b9647df9006", + "sha256:641e329e7f2c01531c45c687efcec8aeca2a78a4ff26d49184dce3d53fc35014", + "sha256:65a7e00c00472cd0f59ae09d2fb8a8aaae7f4a0cf54b2b74f3138d9f9ceb9cb2", + "sha256:6ad6ca45e9e92c05295f638e78cd42bfaaf8ee07878c9ed73e93190b26c125f7", + 
"sha256:73aa6e86034dad9f00f4bbf5a666a889d17d79db73bc5af04abd6c20a014d9c8", + "sha256:7c9762f80a25d8d0e4ab3cb1af5d9dffbddb3ee5d21c43e3474c84bf5ff941f7", + "sha256:85596aa5d9aac1bf39fe39d9fa1051b0f00823982a1de5766e35d495b4a36ca9", + "sha256:86a0ea78fd851b313b2e712266f663e13b6bc78c2fb260b079e8b67d970474b1", + "sha256:8a620767b8209f3446197c0e29ba895d75a1e272a36af0786ec70fe7834e4307", + "sha256:922fb9ef2c67c3ab20e22948dcfd783397e4c043a5c5fa5ff5e9df5529074b0a", + "sha256:9fad78c13e71546a76c2f8789623eec8e499f8d2d799f4b4547162ce0a4df435", + "sha256:a37c6233b28e5bc340054cf6170e7090a4e85069513320275a4dc929144dccf0", + "sha256:c3fc325ce4cbf902d05a80daa47b645d07e796a80682c1c5800d6ac5045193e5", + "sha256:cda33311cb9fb9323958a69499a667bd728a39a7aa4718d7622597a44c4f1441", + "sha256:db1d4e38c9b15be1521722e946ee24f6db95b189d1447fa9ff18dd16ba89f732", + "sha256:eda55e6e9ea258f5e4add23bcf33dc53b2c319e70806e180aecbff8d90ea24de", + "sha256:f372cdbb240e09ee855735b9d85e7f50730dcfb6296b74b95a3e5dea0615c4c1" + ], + "index": "pypi", + "version": "==5.0.4" }, "distlib": { "hashes": [ @@ -666,21 +646,6 @@ ], "version": "==0.3.0" }, - "dodgy": { - "hashes": [ - "sha256:28323cbfc9352139fdd3d316fa17f325cc0e9ac74438cbba51d70f9b48f86c3a", - "sha256:51f54c0fd886fa3854387f354b19f429d38c04f984f38bc572558b703c0542a6" - ], - "index": "pypi", - "version": "==0.2.1" - }, - "dparse": { - "hashes": [ - "sha256:00a5fdfa900629e5159bf3600d44905b333f4059a3366f28e0dbd13eeab17b19", - "sha256:cef95156fa0adedaf042cd42f9990974bec76f25dfeca4dc01f381a243d5aa5b" - ], - "version": "==0.4.1" - }, "entrypoints": { "hashes": [ "sha256:589f874b313739ad35be6e0cd7efde2a4e9b6fea91edcc34e58ecbb8dbe56d19", @@ -752,11 +717,11 @@ }, "flake8-tidy-imports": { "hashes": [ - "sha256:8aa34384b45137d4cf33f5818b8e7897dc903b1d1e10a503fa7dd193a9a710ba", - "sha256:b26461561bcc80e8012e46846630ecf0aaa59314f362a94cb7800dfdb32fa413" + "sha256:5b6e75cec6d751e66534c522fbdce7dac1c2738b1216b0f6b10453995932e188", + "sha256:cf26fbb3ab31a398f265d53b6f711d80006450c19221e41b2b7b0e0b14ac39c5" ], "index": "pypi", - "version": "==4.0.0" + "version": "==4.0.1" }, "flake8-todo": { "hashes": [ @@ -767,17 +732,10 @@ }, "identify": { "hashes": [ - "sha256:1222b648251bdcb8deb240b294f450fbf704c7984e08baa92507e4ea10b436d5", - "sha256:d824ebe21f38325c771c41b08a95a761db1982f1fc0eee37c6c97df3f1636b96" + "sha256:a7577a1f55cee1d21953a5cf11a3c839ab87f5ef909a4cba6cf52ed72b4c6059", + "sha256:ab246293e6585a1c6361a505b68d5b501a0409310932b7de2c2ead667b564d89" ], - "version": "==1.4.11" - }, - "idna": { - "hashes": [ - "sha256:7588d1c14ae4c77d74036e8c22ff447b26d0fde8f007354fd48a7814db15b7cb", - "sha256:a068a21ceac8a4d63dbfd964670474107f541babbd2250d61922f029858365fa" - ], - "version": "==2.9" + "version": "==1.4.13" }, "mccabe": { "hashes": [ @@ -792,28 +750,21 @@ ], "version": "==1.3.5" }, - "packaging": { - "hashes": [ - "sha256:170748228214b70b672c581a3dd610ee51f733018650740e98c7df862a583f73", - "sha256:e665345f9eef0c621aa0bf2f8d78cf6d21904eef16a93f020240b704a57f1334" - ], - "version": "==20.1" - }, "pep8-naming": { "hashes": [ - "sha256:45f330db8fcfb0fba57458c77385e288e7a3be1d01e8ea4268263ef677ceea5f", - "sha256:a33d38177056321a167decd6ba70b890856ba5025f0a8eca6a3eda607da93caf" + "sha256:5d9f1056cb9427ce344e98d1a7f5665710e2f20f748438e308995852cfa24164", + "sha256:f3b4a5f9dd72b991bf7d8e2a341d2e1aa3a884a769b5aaac4f56825c1763bf3a" ], "index": "pypi", - "version": "==0.9.1" + "version": "==0.10.0" }, "pre-commit": { "hashes": [ - 
"sha256:09ebe467f43ce24377f8c2f200fe3cd2570d328eb2ce0568c8e96ce19da45fa6", - "sha256:f8d555e31e2051892c7f7b3ad9f620bd2c09271d87e9eedb2ad831737d6211eb" + "sha256:487c675916e6f99d355ec5595ad77b325689d423ef4839db1ed2f02f639c9522", + "sha256:c0aa11bce04a7b46c5544723aedf4e81a4d5f64ad1205a30a9ea12d5e81969e1" ], "index": "pypi", - "version": "==2.1.1" + "version": "==2.2.0" }, "pycodestyle": { "hashes": [ @@ -836,45 +787,22 @@ ], "version": "==2.1.1" }, - "pyparsing": { - "hashes": [ - "sha256:4c830582a84fb022400b85429791bc551f1f4871c33f23e44f353119e92f969f", - "sha256:c342dccb5250c08d45fd6f8b4a559613ca603b57498511740e65cd11a2e7dcec" - ], - "version": "==2.4.6" - }, "pyyaml": { "hashes": [ - "sha256:059b2ee3194d718896c0ad077dd8c043e5e909d9180f387ce42012662a4946d6", - "sha256:1cf708e2ac57f3aabc87405f04b86354f66799c8e62c28c5fc5f88b5521b2dbf", - "sha256:24521fa2890642614558b492b473bee0ac1f8057a7263156b02e8b14c88ce6f5", - "sha256:4fee71aa5bc6ed9d5f116327c04273e25ae31a3020386916905767ec4fc5317e", - "sha256:70024e02197337533eef7b85b068212420f950319cc8c580261963aefc75f811", - "sha256:74782fbd4d4f87ff04159e986886931456a1894c61229be9eaf4de6f6e44b99e", - "sha256:940532b111b1952befd7db542c370887a8611660d2b9becff75d39355303d82d", - "sha256:cb1f2f5e426dc9f07a7681419fe39cee823bb74f723f36f70399123f439e9b20", - "sha256:dbbb2379c19ed6042e8f11f2a2c66d39cceb8aeace421bfc29d085d93eda3689", - "sha256:e3a057b7a64f1222b56e47bcff5e4b94c4f61faac04c7c4ecb1985e18caa3994", - "sha256:e9f45bd5b92c7974e59bcd2dcc8631a6b6cc380a904725fce7bc08872e691615" + "sha256:06a0d7ba600ce0b2d2fe2e78453a470b5a6e000a985dd4a4e54e436cc36b0e97", + "sha256:240097ff019d7c70a4922b6869d8a86407758333f02203e0fc6ff79c5dcede76", + "sha256:4f4b913ca1a7319b33cfb1369e91e50354d6f07a135f3b901aca02aa95940bd2", + "sha256:69f00dca373f240f842b2931fb2c7e14ddbacd1397d57157a9b005a6a9942648", + "sha256:73f099454b799e05e5ab51423c7bcf361c58d3206fa7b0d555426b1f4d9a3eaf", + "sha256:74809a57b329d6cc0fdccee6318f44b9b8649961fa73144a98735b0aaf029f1f", + "sha256:7739fc0fa8205b3ee8808aea45e968bc90082c10aef6ea95e855e10abf4a37b2", + "sha256:95f71d2af0ff4227885f7a6605c37fd53d3a106fcab511b8860ecca9fcf400ee", + "sha256:b8eac752c5e14d3eca0e6dd9199cd627518cb5ec06add0de9d32baeee6fe645d", + "sha256:cc8955cfbfc7a115fa81d85284ee61147059a753344bc51098f3ccd69b0d7e0c", + "sha256:d13155f591e6fcc1ec3b30685d50bf0711574e2c0dfffd7644babf8b5102ca1a" ], "index": "pypi", - "version": "==5.3" - }, - "requests": { - "hashes": [ - "sha256:43999036bfa82904b6af1d99e4882b560e5e2c68e5c4b0aa03b655f3d7d73fee", - "sha256:b3f43d496c6daba4493e7c431722aeb7dbc6288f52a6e04e7b6023b0247817e6" - ], - "index": "pypi", - "version": "==2.23.0" - }, - "safety": { - "hashes": [ - "sha256:0a3a8a178a9c96242b224f033ee8d1d130c0448b0e6622d12deaf37f6c3b4e59", - "sha256:5059f3ffab3648330548ea9c7403405bbfaf085b11235770825d14c58f24cb78" - ], - "index": "pypi", - "version": "==1.8.5" + "version": "==5.3.1" }, "six": { "hashes": [ @@ -905,19 +833,12 @@ "index": "pypi", "version": "==3.0.2" }, - "urllib3": { - "hashes": [ - "sha256:2f3db8b19923a873b3e5256dc9c2dedfa883e33d87c690d9c7913e1f40673cdc", - "sha256:87716c2d2a7121198ebcb7ce7cccf6ce5e9ba539041cfbaeecfb641dc0bf6acc" - ], - "version": "==1.25.8" - }, "virtualenv": { "hashes": [ - "sha256:30ea90b21dabd11da5f509710ad3be2ae47d40ccbc717dfdd2efe4367c10f598", - "sha256:4a36a96d785428278edd389d9c36d763c5755844beb7509279194647b1ef47f1" + "sha256:87831f1070534b636fea2241dd66f3afe37ac9041bcca6d0af3215cdcfbf7d82", + "sha256:f3128d882383c503003130389bf892856341c1da12c881ae24d6358c82561b55" 
], - "version": "==20.0.7" + "version": "==20.0.13" } } } diff --git a/azure-pipelines.yml b/azure-pipelines.yml index 16d1b7a2a..d56675029 100644 --- a/azure-pipelines.yml +++ b/azure-pipelines.yml @@ -1,9 +1,13 @@ # https://aka.ms/yaml variables: + PIP_NO_CACHE_DIR: false + PIP_USER: 1 PIPENV_HIDE_EMOJIS: 1 PIPENV_IGNORE_VIRTUALENVS: 1 PIPENV_NOSPIN: 1 + PRE_COMMIT_HOME: $(Pipeline.Workspace)/pre-commit-cache + PYTHONUSERBASE: $(Pipeline.Workspace)/py-user-base jobs: - job: test @@ -12,22 +16,38 @@ jobs: vmImage: ubuntu-18.04 variables: - PIP_CACHE_DIR: ".cache/pip" - PRE_COMMIT_HOME: $(Pipeline.Workspace)/pre-commit-cache + BOT_API_KEY: foo + BOT_SENTRY_DSN: blah + BOT_TOKEN: bar + REDDIT_CLIENT_ID: spam + REDDIT_SECRET: ham + WOLFRAM_API_KEY: baz steps: - task: UsePythonVersion@0 displayName: 'Set Python version' - name: PythonVersion + name: python inputs: versionSpec: '3.8.x' addToPath: true + - task: Cache@2 + displayName: 'Restore Python environment' + inputs: + key: python | $(Agent.OS) | "$(python.pythonLocation)" | 0 | ./Pipfile | ./Pipfile.lock + cacheHitVar: PY_ENV_RESTORED + path: $(PYTHONUSERBASE) + + - script: echo '##vso[task.prependpath]$(PYTHONUSERBASE)/bin' + displayName: 'Prepend PATH' + - script: pip install pipenv displayName: 'Install pipenv' + condition: and(succeeded(), ne(variables.PY_ENV_RESTORED, 'true')) - script: pipenv install --dev --deploy --system displayName: 'Install project using pipenv' + condition: and(succeeded(), ne(variables.PY_ENV_RESTORED, 'true')) # Create an executable shell script which replaces the original pipenv binary. # The shell script ignores the first argument and executes the rest of the args as a command. @@ -35,22 +55,21 @@ jobs: # pipenv entirely, which is too dumb to know it should use the system interpreter rather than # creating a new venv. - script: | - printf '%s\n%s' '#!/bin/bash' '"${@:2}"' > $(PythonVersion.pythonLocation)/bin/pipenv \ - && chmod +x $(PythonVersion.pythonLocation)/bin/pipenv + printf '%s\n%s' '#!/bin/bash' '"${@:2}"' > $(python.pythonLocation)/bin/pipenv \ + && chmod +x $(python.pythonLocation)/bin/pipenv displayName: 'Mock pipenv binary' - task: Cache@2 displayName: 'Restore pre-commit environment' inputs: - key: pre-commit | "$(PythonVersion.pythonLocation)" | .pre-commit-config.yaml - restoreKeys: | - pre-commit | "$(PythonVersion.pythonLocation)" + key: pre-commit | "$(python.pythonLocation)" | 0 | .pre-commit-config.yaml path: $(PRE_COMMIT_HOME) - - script: pre-commit run --all-files + # pre-commit's venv doesn't allow user installs - not that they're really needed anyway. 
+ - script: export PIP_USER=0; pre-commit run --all-files displayName: 'Run pre-commit hooks' - - script: BOT_API_KEY=foo BOT_SENTRY_DSN=blah BOT_TOKEN=bar WOLFRAM_API_KEY=baz REDDIT_CLIENT_ID=spam REDDIT_SECRET=ham coverage run -m xmlrunner + - script: coverage run -m xmlrunner displayName: Run tests - script: coverage report -m && coverage xml -o coverage.xml diff --git a/bot/__main__.py b/bot/__main__.py index f6ca5a9c8..bf98f2cfd 100644 --- a/bot/__main__.py +++ b/bot/__main__.py @@ -61,6 +61,7 @@ bot.load_extension("bot.cogs.tags") bot.load_extension("bot.cogs.token_remover") bot.load_extension("bot.cogs.utils") bot.load_extension("bot.cogs.watchchannels") +bot.load_extension("bot.cogs.webhook_remover") bot.load_extension("bot.cogs.wolfram") if constants.HelpChannels.enable: diff --git a/bot/cogs/alias.py b/bot/cogs/alias.py index 0b800575f..55c7efe65 100644 --- a/bot/cogs/alias.py +++ b/bot/cogs/alias.py @@ -26,10 +26,10 @@ class Alias (Cog): log.debug(f"{cmd_name} was invoked through an alias") cmd = self.bot.get_command(cmd_name) if not cmd: - return log.warning(f'Did not find command "{cmd_name}" to invoke.') + return log.info(f'Did not find command "{cmd_name}" to invoke.') elif not await cmd.can_run(ctx): - return log.warning( - f'{str(ctx.author)} tried to run the command "{cmd_name}"' + return log.info( + f'{str(ctx.author)} tried to run the command "{cmd_name}" but lacks permission.' ) await ctx.invoke(cmd, *args, **kwargs) diff --git a/bot/cogs/bot.py b/bot/cogs/bot.py index f0ca2b175..a6929b431 100644 --- a/bot/cogs/bot.py +++ b/bot/cogs/bot.py @@ -59,7 +59,6 @@ class BotCog(Cog, name="Bot"): icon_url=URLs.bot_avatar ) - log.info(f"{ctx.author} called !about. Returning information about the bot.") await ctx.send(embed=embed) @command(name='echo', aliases=('print',)) @@ -236,7 +235,7 @@ class BotCog(Cog, name="Bot"): ) and not msg.author.bot and len(msg.content.splitlines()) > 3 - and not TokenRemover.is_token_in_message(msg) + and not TokenRemover.find_token_in_message(msg) ) if parse_codeblock: # no token in the msg diff --git a/bot/cogs/error_handler.py b/bot/cogs/error_handler.py index 261769efc..6a622d2ce 100644 --- a/bot/cogs/error_handler.py +++ b/bot/cogs/error_handler.py @@ -31,7 +31,9 @@ class ErrorHandler(Cog): Error handling emits a single error message in the invoking context `ctx` and a log message, prioritised as follows: - 1. If the name fails to match a command but matches a tag, the tag is invoked + 1. If the name fails to match a command: + * If it matches shh+ or unshh+, the channel is silenced or unsilenced respectively. + Otherwise if it matches a tag, the tag is invoked * If CommandNotFound is raised when invoking the tag (determined by the presence of the `invoked_from_error_handler` attribute), this error is treated as being unexpected and therefore sends an error message @@ -48,9 +50,11 @@ class ErrorHandler(Cog): log.trace(f"Command {command} had its error already handled locally; ignoring.") return - # Try to look for a tag with the command's name if the command isn't found. if isinstance(e, errors.CommandNotFound) and not hasattr(ctx, "invoked_from_error_handler"): + if await self.try_silence(ctx): + return if ctx.channel.id != Channels.verification: + # Try to look for a tag with the command's name await self.try_get_tag(ctx) return # Exit early to avoid logging. 
elif isinstance(e, errors.UserInputError): @@ -89,6 +93,33 @@ class ErrorHandler(Cog): else: return self.bot.get_command("help") + async def try_silence(self, ctx: Context) -> bool: + """ + Attempt to invoke the silence or unsilence command if invoke with matches a pattern. + + Respecting the checks if: + * invoked with `shh+` silence channel for amount of h's*2 with max of 15. + * invoked with `unshh+` unsilence channel + Return bool depending on success of command. + """ + command = ctx.invoked_with.lower() + silence_command = self.bot.get_command("silence") + ctx.invoked_from_error_handler = True + try: + if not await silence_command.can_run(ctx): + log.debug("Cancelling attempt to invoke silence/unsilence due to failed checks.") + return False + except errors.CommandError: + log.debug("Cancelling attempt to invoke silence/unsilence due to failed checks.") + return False + if command.startswith("shh"): + await ctx.invoke(silence_command, duration=min(command.count("h")*2, 15)) + return True + elif command.startswith("unshh"): + await ctx.invoke(self.bot.get_command("unsilence")) + return True + return False + async def try_get_tag(self, ctx: Context) -> None: """ Attempt to display a tag by interpreting the command name as a tag name. diff --git a/bot/cogs/filtering.py b/bot/cogs/filtering.py index 6651d38e4..3f3dbb853 100644 --- a/bot/cogs/filtering.py +++ b/bot/cogs/filtering.py @@ -38,6 +38,7 @@ WORD_WATCHLIST_PATTERNS = [ TOKEN_WATCHLIST_PATTERNS = [ re.compile(fr'{expression}', flags=re.IGNORECASE) for expression in Filter.token_watchlist ] +WATCHLIST_PATTERNS = WORD_WATCHLIST_PATTERNS + TOKEN_WATCHLIST_PATTERNS def expand_spoilers(text: str) -> str: @@ -88,24 +89,18 @@ class Filtering(Cog): f"Your URL has been removed because it matched a blacklisted domain. {staff_mistake_str}" ) }, + "watch_regex": { + "enabled": Filter.watch_regex, + "function": self._has_watch_regex_match, + "type": "watchlist", + "content_only": True, + }, "watch_rich_embeds": { "enabled": Filter.watch_rich_embeds, "function": self._has_rich_embed, "type": "watchlist", "content_only": False, }, - "watch_words": { - "enabled": Filter.watch_words, - "function": self._has_watchlist_words, - "type": "watchlist", - "content_only": True, - }, - "watch_tokens": { - "enabled": Filter.watch_tokens, - "function": self._has_watchlist_tokens, - "type": "watchlist", - "content_only": True, - }, } @property @@ -191,8 +186,8 @@ class Filtering(Cog): else: channel_str = f"in {msg.channel.mention}" - # Word and match stats for watch_words and watch_tokens - if filter_name in ("watch_words", "watch_tokens"): + # Word and match stats for watch_regex + if filter_name == "watch_regex": surroundings = match.string[max(match.start() - 10, 0): match.end() + 10] message_content = ( f"**Match:** '{match[0]}'\n" @@ -248,37 +243,24 @@ class Filtering(Cog): break # We don't want multiple filters to trigger @staticmethod - async def _has_watchlist_words(text: str) -> Union[bool, re.Match]: + async def _has_watch_regex_match(text: str) -> Union[bool, re.Match]: """ - Returns True if the text contains one of the regular expressions from the word_watchlist in our filter config. + Return True if `text` matches any regex from `word_watchlist` or `token_watchlist` configs. - Only matches words with boundaries before and after the expression. + `word_watchlist`'s patterns are placed between word boundaries while `token_watchlist` is + matched as-is. Spoilers are expanded, if any, and URLs are ignored. 
""" if SPOILER_RE.search(text): text = expand_spoilers(text) - for regex_pattern in WORD_WATCHLIST_PATTERNS: - match = regex_pattern.search(text) - if match: - return match # match objects always have a boolean value of True - return False - - @staticmethod - async def _has_watchlist_tokens(text: str) -> Union[bool, re.Match]: - """ - Returns True if the text contains one of the regular expressions from the token_watchlist in our filter config. + # Make sure it's not a URL + if URL_RE.search(text): + return False - This will match the expression even if it does not have boundaries before and after. - """ - for regex_pattern in TOKEN_WATCHLIST_PATTERNS: - match = regex_pattern.search(text) + for pattern in WATCHLIST_PATTERNS: + match = pattern.search(text) if match: - - # Make sure it's not a URL - if not URL_RE.search(text): - return match # match objects always have a boolean value of True - - return False + return match @staticmethod async def _has_urls(text: str) -> bool: diff --git a/bot/cogs/moderation/__init__.py b/bot/cogs/moderation/__init__.py index 5243cb92d..6880ca1bd 100644 --- a/bot/cogs/moderation/__init__.py +++ b/bot/cogs/moderation/__init__.py @@ -2,12 +2,14 @@ from bot.bot import Bot from .infractions import Infractions from .management import ModManagement from .modlog import ModLog +from .silence import Silence from .superstarify import Superstarify def setup(bot: Bot) -> None: - """Load the Infractions, ModManagement, ModLog, and Superstarify cogs.""" + """Load the Infractions, ModManagement, ModLog, Silence, and Superstarify cogs.""" bot.add_cog(Infractions(bot)) bot.add_cog(ModLog(bot)) bot.add_cog(ModManagement(bot)) + bot.add_cog(Silence(bot)) bot.add_cog(Superstarify(bot)) diff --git a/bot/cogs/moderation/management.py b/bot/cogs/moderation/management.py index 35448f682..250a24247 100644 --- a/bot/cogs/moderation/management.py +++ b/bot/cogs/moderation/management.py @@ -100,7 +100,12 @@ class ModManagement(commands.Cog): confirm_messages = [] log_text = "" - if isinstance(duration, str): + if duration is not None and not old_infraction['active']: + if reason is None: + await ctx.send(":x: Cannot edit the expiration of an expired infraction.") + return + confirm_messages.append("expiry unchanged (infraction already expired)") + elif isinstance(duration, str): request_data['expires_at'] = None confirm_messages.append("marked as permanent") elif duration is not None: diff --git a/bot/cogs/moderation/scheduler.py b/bot/cogs/moderation/scheduler.py index f0b6b2c48..917697be9 100644 --- a/bot/cogs/moderation/scheduler.py +++ b/bot/cogs/moderation/scheduler.py @@ -222,7 +222,7 @@ class InfractionScheduler(Scheduler): # If multiple active infractions were found, mark them as inactive in the database # and cancel their expiration tasks. if len(response) > 1: - log.warning( + log.info( f"Found more than one active {infr_type} infraction for user {user.id}; " "deactivating the extra active infractions too." 
) diff --git a/bot/cogs/moderation/silence.py b/bot/cogs/moderation/silence.py new file mode 100644 index 000000000..1ef3967a9 --- /dev/null +++ b/bot/cogs/moderation/silence.py @@ -0,0 +1,159 @@ +import asyncio +import logging +from contextlib import suppress +from typing import Optional + +from discord import TextChannel +from discord.ext import commands, tasks +from discord.ext.commands import Context + +from bot.bot import Bot +from bot.constants import Channels, Emojis, Guild, MODERATION_ROLES, Roles +from bot.converters import HushDurationConverter +from bot.utils.checks import with_role_check + +log = logging.getLogger(__name__) + + +class SilenceNotifier(tasks.Loop): + """Loop notifier for posting notices to `alert_channel` containing added channels.""" + + def __init__(self, alert_channel: TextChannel): + super().__init__(self._notifier, seconds=1, minutes=0, hours=0, count=None, reconnect=True, loop=None) + self._silenced_channels = {} + self._alert_channel = alert_channel + + def add_channel(self, channel: TextChannel) -> None: + """Add channel to `_silenced_channels` and start loop if not launched.""" + if not self._silenced_channels: + self.start() + log.info("Starting notifier loop.") + self._silenced_channels[channel] = self._current_loop + + def remove_channel(self, channel: TextChannel) -> None: + """Remove channel from `_silenced_channels` and stop loop if no channels remain.""" + with suppress(KeyError): + del self._silenced_channels[channel] + if not self._silenced_channels: + self.stop() + log.info("Stopping notifier loop.") + + async def _notifier(self) -> None: + """Post notice of `_silenced_channels` with their silenced duration to `_alert_channel` periodically.""" + # Wait for 15 minutes between notices with pause at start of loop. + if self._current_loop and not self._current_loop/60 % 15: + log.debug( + f"Sending notice with channels: " + f"{', '.join(f'#{channel} ({channel.id})' for channel in self._silenced_channels)}." + ) + channels_text = ', '.join( + f"{channel.mention} for {(self._current_loop-start)//60} min" + for channel, start in self._silenced_channels.items() + ) + await self._alert_channel.send(f"<@&{Roles.moderators}> currently silenced channels: {channels_text}") + + +class Silence(commands.Cog): + """Commands for stopping channel messages for `verified` role in a channel.""" + + def __init__(self, bot: Bot): + self.bot = bot + self.muted_channels = set() + self._get_instance_vars_task = self.bot.loop.create_task(self._get_instance_vars()) + self._get_instance_vars_event = asyncio.Event() + + async def _get_instance_vars(self) -> None: + """Get instance variables after they're available to get from the guild.""" + await self.bot.wait_until_guild_available() + guild = self.bot.get_guild(Guild.id) + self._verified_role = guild.get_role(Roles.verified) + self._mod_alerts_channel = self.bot.get_channel(Channels.mod_alerts) + self._mod_log_channel = self.bot.get_channel(Channels.mod_log) + self.notifier = SilenceNotifier(self._mod_log_channel) + self._get_instance_vars_event.set() + + @commands.command(aliases=("hush",)) + async def silence(self, ctx: Context, duration: HushDurationConverter = 10) -> None: + """ + Silence the current channel for `duration` minutes or `forever`. + + Duration is capped at 15 minutes, passing forever makes the silence indefinite. + Indefinitely silenced channels get added to a notifier which posts notices every 15 minutes from the start. 
+ """ + await self._get_instance_vars_event.wait() + log.debug(f"{ctx.author} is silencing channel #{ctx.channel}.") + if not await self._silence(ctx.channel, persistent=(duration is None), duration=duration): + await ctx.send(f"{Emojis.cross_mark} current channel is already silenced.") + return + if duration is None: + await ctx.send(f"{Emojis.check_mark} silenced current channel indefinitely.") + return + + await ctx.send(f"{Emojis.check_mark} silenced current channel for {duration} minute(s).") + await asyncio.sleep(duration*60) + log.info(f"Unsilencing channel after set delay.") + await ctx.invoke(self.unsilence) + + @commands.command(aliases=("unhush",)) + async def unsilence(self, ctx: Context) -> None: + """ + Unsilence the current channel. + + If the channel was silenced indefinitely, notifications for the channel will stop. + """ + await self._get_instance_vars_event.wait() + log.debug(f"Unsilencing channel #{ctx.channel} from {ctx.author}'s command.") + if await self._unsilence(ctx.channel): + await ctx.send(f"{Emojis.check_mark} unsilenced current channel.") + + async def _silence(self, channel: TextChannel, persistent: bool, duration: Optional[int]) -> bool: + """ + Silence `channel` for `self._verified_role`. + + If `persistent` is `True` add `channel` to notifier. + `duration` is only used for logging; if None is passed `persistent` should be True to not log None. + Return `True` if channel permissions were changed, `False` otherwise. + """ + current_overwrite = channel.overwrites_for(self._verified_role) + if current_overwrite.send_messages is False: + log.info(f"Tried to silence channel #{channel} ({channel.id}) but the channel was already silenced.") + return False + await channel.set_permissions(self._verified_role, **dict(current_overwrite, send_messages=False)) + self.muted_channels.add(channel) + if persistent: + log.info(f"Silenced #{channel} ({channel.id}) indefinitely.") + self.notifier.add_channel(channel) + return True + + log.info(f"Silenced #{channel} ({channel.id}) for {duration} minute(s).") + return True + + async def _unsilence(self, channel: TextChannel) -> bool: + """ + Unsilence `channel`. + + Check if `channel` is silenced through a `PermissionOverwrite`, + if it is unsilence it and remove it from the notifier. + Return `True` if channel permissions were changed, `False` otherwise. + """ + current_overwrite = channel.overwrites_for(self._verified_role) + if current_overwrite.send_messages is False: + await channel.set_permissions(self._verified_role, **dict(current_overwrite, send_messages=None)) + log.info(f"Unsilenced channel #{channel} ({channel.id}).") + self.notifier.remove_channel(channel) + self.muted_channels.discard(channel) + return True + log.info(f"Tried to unsilence channel #{channel} ({channel.id}) but the channel was not silenced.") + return False + + def cog_unload(self) -> None: + """Send alert with silenced channels on unload.""" + if self.muted_channels: + channels_string = ''.join(channel.mention for channel in self.muted_channels) + message = f"<@&{Roles.moderators}> channels left silenced on cog unload: {channels_string}" + asyncio.create_task(self._mod_alerts_channel.send(message)) + + # This cannot be static (must have a __func__ attribute). 
+ def cog_check(self, ctx: Context) -> bool: + """Only allow moderators to invoke the commands in this cog.""" + return with_role_check(ctx, *MODERATION_ROLES) diff --git a/bot/cogs/moderation/superstarify.py b/bot/cogs/moderation/superstarify.py index 893cb7f13..ca3dc4202 100644 --- a/bot/cogs/moderation/superstarify.py +++ b/bot/cogs/moderation/superstarify.py @@ -59,7 +59,7 @@ class Superstarify(InfractionScheduler, Cog): return # Nick change was triggered by this event. Ignore. log.info( - f"{after.display_name} is currently in superstar-prison. " + f"{after.display_name} ({after.id}) tried to escape superstar prison. " f"Changing the nick back to {before.display_name}." ) await after.edit( @@ -80,7 +80,7 @@ class Superstarify(InfractionScheduler, Cog): ) if not notified: - log.warning("Failed to DM user about why they cannot change their nickname.") + log.info("Failed to DM user about why they cannot change their nickname.") @Cog.listener() async def on_member_join(self, member: Member) -> None: diff --git a/bot/cogs/moderation/utils.py b/bot/cogs/moderation/utils.py index 5052b9048..3598f3b1f 100644 --- a/bot/cogs/moderation/utils.py +++ b/bot/cogs/moderation/utils.py @@ -38,7 +38,7 @@ async def post_user(ctx: Context, user: UserSnowflake) -> t.Optional[dict]: log.trace(f"Attempting to add user {user.id} to the database.") if not isinstance(user, (discord.Member, discord.User)): - log.warning("The user being added to the DB is not a Member or User object.") + log.debug("The user being added to the DB is not a Member or User object.") payload = { 'avatar_hash': getattr(user, 'avatar', 0), diff --git a/bot/cogs/snekbox.py b/bot/cogs/snekbox.py index cff7c5786..315383b12 100644 --- a/bot/cogs/snekbox.py +++ b/bot/cogs/snekbox.py @@ -232,7 +232,7 @@ class Snekbox(Cog): timeout=10 ) - code = new_message.content.split(' ', maxsplit=1)[1] + code = await self.get_code(new_message) await ctx.message.clear_reactions() with contextlib.suppress(HTTPException): await response.delete() @@ -243,6 +243,26 @@ class Snekbox(Cog): return code + async def get_code(self, message: Message) -> Optional[str]: + """ + Return the code from `message` to be evaluated. + + If the message is an invocation of the eval command, return the first argument or None if it + doesn't exist. Otherwise, return the full content of the message. 
+ """ + log.trace(f"Getting context for message {message.id}.") + new_ctx = await self.bot.get_context(message) + + if new_ctx.command is self.eval_command: + log.trace(f"Message {message.id} invokes eval command.") + split = message.content.split(maxsplit=1) + code = split[1] if len(split) > 1 else None + else: + log.trace(f"Message {message.id} does not invoke eval command.") + code = message.content + + return code + @command(name="eval", aliases=("e",)) @guild_only() @in_channel(Channels.bot_commands, hidden_channels=(Channels.esoteric,), bypass_roles=EVAL_ROLES) @@ -281,7 +301,7 @@ class Snekbox(Cog): code = await self.continue_eval(ctx, response) if not code: break - log.info(f"Re-evaluating message {ctx.message.id}") + log.info(f"Re-evaluating code from message {ctx.message.id}:\n{code}") def predicate_eval_message_edit(ctx: Context, old_msg: Message, new_msg: Message) -> bool: diff --git a/bot/cogs/sync/syncers.py b/bot/cogs/sync/syncers.py index c7ce54d65..003bf3727 100644 --- a/bot/cogs/sync/syncers.py +++ b/bot/cogs/sync/syncers.py @@ -131,7 +131,7 @@ class Syncer(abc.ABC): await message.edit(content=f':ok_hand: {mention}{self.name} sync will proceed.') return True else: - log.warning(f"The {self.name} syncer was aborted or timed out!") + log.info(f"The {self.name} syncer was aborted or timed out!") await message.edit( content=f':warning: {mention}{self.name} sync aborted or timed out!' ) diff --git a/bot/cogs/tags.py b/bot/cogs/tags.py index 539105017..a6e5952ff 100644 --- a/bot/cogs/tags.py +++ b/bot/cogs/tags.py @@ -11,6 +11,7 @@ from bot import constants from bot.bot import Bot from bot.converters import TagNameConverter from bot.pagination import LinePaginator +from bot.utils.messages import wait_for_deletion log = logging.getLogger(__name__) @@ -167,6 +168,7 @@ class Tags(Cog): @tags_group.command(name='get', aliases=('show', 'g')) async def get_command(self, ctx: Context, *, tag_name: TagNameConverter = None) -> None: """Get a specified tag, or a list of all tags if no tag is specified.""" + def _command_on_cooldown(tag_name: str) -> bool: """ Check if the command is currently on cooldown, on a per-tag, per-channel basis. 
@@ -205,12 +207,22 @@ class Tags(Cog): "time": time.time(), "channel": ctx.channel.id } - await ctx.send(embed=Embed.from_dict(tag['embed'])) + await wait_for_deletion( + await ctx.send(embed=Embed.from_dict(tag['embed'])), + [ctx.author.id], + client=self.bot + ) elif founds and len(tag_name) >= 3: - await ctx.send(embed=Embed( - title='Did you mean ...', - description='\n'.join(tag['title'] for tag in founds[:10]) - )) + await wait_for_deletion( + await ctx.send( + embed=Embed( + title='Did you mean ...', + description='\n'.join(tag['title'] for tag in founds[:10]) + ) + ), + [ctx.author.id], + client=self.bot + ) else: tags = self._cache.values() diff --git a/bot/cogs/token_remover.py b/bot/cogs/token_remover.py index 547ba8da0..421ad23e2 100644 --- a/bot/cogs/token_remover.py +++ b/bot/cogs/token_remover.py @@ -3,6 +3,7 @@ import binascii import logging import re import struct +import typing as t from datetime import datetime from discord import Colour, Message @@ -53,8 +54,9 @@ class TokenRemover(Cog): See: https://discordapp.com/developers/docs/reference#snowflakes """ - if self.is_token_in_message(msg): - await self.take_action(msg) + found_token = self.find_token_in_message(msg) + if found_token: + await self.take_action(msg, found_token) @Cog.listener() async def on_message_edit(self, before: Message, after: Message) -> None: @@ -63,12 +65,13 @@ class TokenRemover(Cog): See: https://discordapp.com/developers/docs/reference#snowflakes """ - if self.is_token_in_message(after): - await self.take_action(after) + found_token = self.find_token_in_message(after) + if found_token: + await self.take_action(after, found_token) - async def take_action(self, msg: Message) -> None: + async def take_action(self, msg: Message, found_token: str) -> None: """Remove the `msg` containing a token an send a mod_log message.""" - user_id, creation_timestamp, hmac = TOKEN_RE.search(msg.content).group(0).split('.') + user_id, creation_timestamp, hmac = found_token.split('.') self.mod_log.ignore(Event.message_delete, msg.id) await msg.delete() await msg.channel.send(DELETION_MESSAGE_TEMPLATE.format(mention=msg.author.mention)) @@ -91,18 +94,21 @@ class TokenRemover(Cog): ) @classmethod - def is_token_in_message(cls, msg: Message) -> bool: - """Check if `msg` contains a seemly valid token.""" + def find_token_in_message(cls, msg: Message) -> t.Optional[str]: + """Return a seemingly valid token found in `msg` or `None` if no token is found.""" if msg.author.bot: - return False + return # Use findall rather than search to guard against method calls prematurely returning the # token check (e.g. `message.channel.send` also matches our token pattern) maybe_matches = TOKEN_RE.findall(msg.content) - if not maybe_matches: - return False + for substr in maybe_matches: + if cls.is_maybe_token(substr): + # Short-circuit on first match + return substr - return any(cls.is_maybe_token(substr) for substr in maybe_matches) + # No matching substring + return @classmethod def is_maybe_token(cls, test_str: str) -> bool: diff --git a/bot/cogs/utils.py b/bot/cogs/utils.py index 024141d62..3ed471bbf 100644 --- a/bot/cogs/utils.py +++ b/bot/cogs/utils.py @@ -40,6 +40,8 @@ If the implementation is easy to explain, it may be a good idea. Namespaces are one honking great idea -- let's do more of those! 
""" +ICON_URL = "https://www.python.org/static/opengraph-icon-200x200.png" + class Utils(Cog): """A selection of utilities which don't have a clear category.""" @@ -59,6 +61,10 @@ class Utils(Cog): await ctx.invoke(self.bot.get_command("help"), "pep") return + # Handle PEP 0 directly because it's not in .rst or .txt so it can't be accessed like other PEPs. + if pep_number == 0: + return await self.send_pep_zero(ctx) + possible_extensions = ['.txt', '.rst'] found_pep = False for extension in possible_extensions: @@ -82,7 +88,7 @@ class Utils(Cog): description=f"[Link]({self.base_pep_url}{pep_number:04})", ) - pep_embed.set_thumbnail(url="https://www.python.org/static/opengraph-icon-200x200.png") + pep_embed.set_thumbnail(url=ICON_URL) # Add the interesting information fields_to_check = ("Status", "Python-Version", "Created", "Type") @@ -230,7 +236,17 @@ class Utils(Cog): await ctx.send(embed=embed) return - # handle if it's a search string + # Try to handle first exact word due difflib.SequenceMatched may use some other similar word instead + # exact word. + for i, line in enumerate(zen_lines): + for word in line.split(): + if word.lower() == search_value.lower(): + embed.title += f" (line {i}):" + embed.description = line + await ctx.send(embed=embed) + return + + # handle if it's a search string and not exact word matcher = difflib.SequenceMatcher(None, search_value.lower()) best_match = "" @@ -278,6 +294,19 @@ class Utils(Cog): for reaction in options: await message.add_reaction(reaction) + async def send_pep_zero(self, ctx: Context) -> None: + """Send information about PEP 0.""" + pep_embed = Embed( + title=f"**PEP 0 - Index of Python Enhancement Proposals (PEPs)**", + description=f"[Link](https://www.python.org/dev/peps/)" + ) + pep_embed.set_thumbnail(url=ICON_URL) + pep_embed.add_field(name="Status", value="Active") + pep_embed.add_field(name="Created", value="13-Jul-2000") + pep_embed.add_field(name="Type", value="Informational") + + await ctx.send(embed=pep_embed) + def setup(bot: Bot) -> None: """Load the Utils cog.""" diff --git a/bot/cogs/webhook_remover.py b/bot/cogs/webhook_remover.py new file mode 100644 index 000000000..49692113d --- /dev/null +++ b/bot/cogs/webhook_remover.py @@ -0,0 +1,72 @@ +import logging +import re + +from discord import Colour, Message +from discord.ext.commands import Cog + +from bot.bot import Bot +from bot.cogs.moderation.modlog import ModLog +from bot.constants import Channels, Colours, Event, Icons + +WEBHOOK_URL_RE = re.compile(r"((?:https?://)?discordapp\.com/api/webhooks/\d+/)\S+/?", re.I) + +ALERT_MESSAGE_TEMPLATE = ( + "{user}, looks like you posted a Discord webhook URL. Therefore, your " + "message has been removed. Your webhook may have been **compromised** so " + "please re-create the webhook **immediately**. If you believe this was " + "mistake, please let us know." +) + +log = logging.getLogger(__name__) + + +class WebhookRemover(Cog): + """Scan messages to detect Discord webhooks links.""" + + def __init__(self, bot: Bot): + self.bot = bot + + @property + def mod_log(self) -> ModLog: + """Get current instance of `ModLog`.""" + return self.bot.get_cog("ModLog") + + async def delete_and_respond(self, msg: Message, redacted_url: str) -> None: + """Delete `msg` and send a warning that it contained the Discord webhook `redacted_url`.""" + # Don't log this, due internal delete, not by user. Will make different entry. 
+ self.mod_log.ignore(Event.message_delete, msg.id) + await msg.delete() + await msg.channel.send(ALERT_MESSAGE_TEMPLATE.format(user=msg.author.mention)) + + message = ( + f"{msg.author} (`{msg.author.id}`) posted a Discord webhook URL " + f"to #{msg.channel}. Webhook URL was `{redacted_url}`" + ) + log.debug(message) + + # Send entry to moderation alerts. + await self.mod_log.send_log_message( + icon_url=Icons.token_removed, + colour=Colour(Colours.soft_red), + title="Discord webhook URL removed!", + text=message, + thumbnail=msg.author.avatar_url_as(static_format="png"), + channel_id=Channels.mod_alerts + ) + + @Cog.listener() + async def on_message(self, msg: Message) -> None: + """Check if a Discord webhook URL is in `message`.""" + matches = WEBHOOK_URL_RE.search(msg.content) + if matches: + await self.delete_and_respond(msg, matches[1] + "xxx") + + @Cog.listener() + async def on_message_edit(self, before: Message, after: Message) -> None: + """Check if a Discord webhook URL is in the edited message `after`.""" + await self.on_message(after) + + +def setup(bot: Bot) -> None: + """Load `WebhookRemover` cog.""" + bot.add_cog(WebhookRemover(bot)) diff --git a/bot/constants.py b/bot/constants.py index da1a62780..60e3c4897 100644 --- a/bot/constants.py +++ b/bot/constants.py @@ -206,9 +206,8 @@ class Filter(metaclass=YAMLGetter): filter_zalgo: bool filter_invites: bool filter_domains: bool + watch_regex: bool watch_rich_embeds: bool - watch_words: bool - watch_tokens: bool # Notifications are not expected for "watchlist" type filters notify_user_zalgo: bool diff --git a/bot/converters.py b/bot/converters.py index 1945e1da3..72c46fdf0 100644 --- a/bot/converters.py +++ b/bot/converters.py @@ -262,6 +262,34 @@ class ISODateTime(Converter): return dt +class HushDurationConverter(Converter): + """Convert passed duration to `int` minutes or `None`.""" + + MINUTES_RE = re.compile(r"(\d+)(?:M|m|$)") + + async def convert(self, ctx: Context, argument: str) -> t.Optional[int]: + """ + Convert `argument` to a duration that's max 15 minutes or None. + + If `"forever"` is passed, None is returned; otherwise an int of the extracted time. + Accepted formats are: + * <duration>, + * <duration>m, + * <duration>M, + * forever. + """ + if argument == "forever": + return None + match = self.MINUTES_RE.match(argument) + if not match: + raise BadArgument(f"{argument} is not a valid minutes duration.") + + duration = int(match.group(1)) + if duration > 15: + raise BadArgument("Duration must be at most 15 minutes.") + return duration + + def proxy_user(user_id: str) -> discord.Object: """ Create a proxy user object from the given id. @@ -323,7 +351,7 @@ class FetchedUser(UserConverter): except discord.HTTPException as e: # If the Discord error isn't `Unknown user`, return a proxy instead if e.code != 10013: - log.warning(f"Failed to fetch user, returning a proxy instead: status {e.status}") + log.info(f"Failed to fetch user, returning a proxy instead: status {e.status}") return proxy_user(arg) log.debug(f"Failed to fetch user {arg}: user does not exist.") diff --git a/bot/resources/tags/zen.md b/bot/resources/tags/zen.md deleted file mode 100644 index 3e132eed8..000000000 --- a/bot/resources/tags/zen.md +++ /dev/null @@ -1,20 +0,0 @@ - -Beautiful is better than ugly. -Explicit is better than implicit. -Simple is better than complex. -Complex is better than complicated. -Flat is better than nested. -Sparse is better than dense. -Readability counts. -Special cases aren't special enough to break the rules. 
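The `on_message` listener in the new `webhook_remover.py` cog redacts the URL before logging it: group 1 of `WEBHOOK_URL_RE` captures everything up to and including the numeric webhook id, and the secret token segment is replaced with `xxx`. A small illustration of that capture and redaction, reusing the pattern exactly as defined in the cog (the URL below is made up purely for illustration):

```python
import re

WEBHOOK_URL_RE = re.compile(r"((?:https?://)?discordapp\.com/api/webhooks/\d+/)\S+/?", re.I)

# Made-up URL for illustration; real webhook tokens are much longer.
content = "oops: https://discordapp.com/api/webhooks/123456789/SuperSecretToken please ignore"

match = WEBHOOK_URL_RE.search(content)
if match:
    # Group 1 stops after ".../webhooks/<id>/", so the secret part is never logged.
    redacted_url = match[1] + "xxx"
    print(redacted_url)
    # -> https://discordapp.com/api/webhooks/123456789/xxx
```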
-Although practicality beats purity. -Errors should never pass silently. -Unless explicitly silenced. -In the face of ambiguity, refuse the temptation to guess. -There should be one-- and preferably only one --obvious way to do it. -Although that way may not be obvious at first unless you're Dutch. -Now is better than never. -Although never is often better than *right* now. -If the implementation is hard to explain, it's a bad idea. -If the implementation is easy to explain, it may be a good idea. -Namespaces are one honking great idea -- let's do more of those! diff --git a/bot/utils/messages.py b/bot/utils/messages.py index a36edc774..e969ee590 100644 --- a/bot/utils/messages.py +++ b/bot/utils/messages.py @@ -92,7 +92,7 @@ async def send_attachments( elif link_large: large.append(attachment) else: - log.warning(f"{failure_msg} because it's too large.") + log.info(f"{failure_msg} because it's too large.") except HTTPException as e: if link_large and e.status == 413: large.append(attachment) diff --git a/config-default.yml b/config-default.yml index 89f470e19..70c31ebb5 100644 --- a/config-default.yml +++ b/config-default.yml @@ -240,9 +240,8 @@ filter: filter_zalgo: false filter_invites: true filter_domains: true + watch_regex: true watch_rich_embeds: true - watch_words: true - watch_tokens: true # Notify user on filter? # Notifications are not expected for "watchlist" type filters @@ -272,6 +271,7 @@ filter: - 524691714909274162 # Panda3D - 336642139381301249 # discord.py - 405403391410438165 # Sentdex + - 172018499005317120 # The Coding Den domain_blacklist: - pornhub.com diff --git a/tests/bot/cogs/moderation/__init__.py b/tests/bot/cogs/moderation/__init__.py new file mode 100644 index 000000000..e69de29bb --- /dev/null +++ b/tests/bot/cogs/moderation/__init__.py diff --git a/tests/bot/cogs/moderation/test_silence.py b/tests/bot/cogs/moderation/test_silence.py new file mode 100644 index 000000000..3fd149f04 --- /dev/null +++ b/tests/bot/cogs/moderation/test_silence.py @@ -0,0 +1,251 @@ +import unittest +from unittest import mock +from unittest.mock import MagicMock, Mock + +from discord import PermissionOverwrite + +from bot.cogs.moderation.silence import Silence, SilenceNotifier +from bot.constants import Channels, Emojis, Guild, Roles +from tests.helpers import MockBot, MockContext, MockTextChannel + + +class SilenceNotifierTests(unittest.IsolatedAsyncioTestCase): + def setUp(self) -> None: + self.alert_channel = MockTextChannel() + self.notifier = SilenceNotifier(self.alert_channel) + self.notifier.stop = self.notifier_stop_mock = Mock() + self.notifier.start = self.notifier_start_mock = Mock() + + def test_add_channel_adds_channel(self): + """Channel in FirstHash with current loop is added to internal set.""" + channel = Mock() + with mock.patch.object(self.notifier, "_silenced_channels") as silenced_channels: + self.notifier.add_channel(channel) + silenced_channels.__setitem__.assert_called_with(channel, self.notifier._current_loop) + + def test_add_channel_starts_loop(self): + """Loop is started if `_silenced_channels` was empty.""" + self.notifier.add_channel(Mock()) + self.notifier_start_mock.assert_called_once() + + def test_add_channel_skips_start_with_channels(self): + """Loop start is not called when `_silenced_channels` is not empty.""" + with mock.patch.object(self.notifier, "_silenced_channels"): + self.notifier.add_channel(Mock()) + self.notifier_start_mock.assert_not_called() + + def test_remove_channel_removes_channel(self): + """Channel in FirstHash is removed from 
`_silenced_channels`.""" + channel = Mock() + with mock.patch.object(self.notifier, "_silenced_channels") as silenced_channels: + self.notifier.remove_channel(channel) + silenced_channels.__delitem__.assert_called_with(channel) + + def test_remove_channel_stops_loop(self): + """Notifier loop is stopped if `_silenced_channels` is empty after remove.""" + with mock.patch.object(self.notifier, "_silenced_channels", __bool__=lambda _: False): + self.notifier.remove_channel(Mock()) + self.notifier_stop_mock.assert_called_once() + + def test_remove_channel_skips_stop_with_channels(self): + """Notifier loop is not stopped if `_silenced_channels` is not empty after remove.""" + self.notifier.remove_channel(Mock()) + self.notifier_stop_mock.assert_not_called() + + async def test_notifier_private_sends_alert(self): + """Alert is sent on 15 min intervals.""" + test_cases = (900, 1800, 2700) + for current_loop in test_cases: + with self.subTest(current_loop=current_loop): + with mock.patch.object(self.notifier, "_current_loop", new=current_loop): + await self.notifier._notifier() + self.alert_channel.send.assert_called_once_with(f"<@&{Roles.moderators}> currently silenced channels: ") + self.alert_channel.send.reset_mock() + + async def test_notifier_skips_alert(self): + """Alert is skipped on first loop or not an increment of 900.""" + test_cases = (0, 15, 5000) + for current_loop in test_cases: + with self.subTest(current_loop=current_loop): + with mock.patch.object(self.notifier, "_current_loop", new=current_loop): + await self.notifier._notifier() + self.alert_channel.send.assert_not_called() + + +class SilenceTests(unittest.IsolatedAsyncioTestCase): + def setUp(self) -> None: + self.bot = MockBot() + self.cog = Silence(self.bot) + self.ctx = MockContext() + self.cog._verified_role = None + # Set event so command callbacks can continue. 
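The two `_notifier` tests above pin down the alert schedule: an alert goes out only when `_current_loop` is a non-zero multiple of 900 seconds (every 15 minutes), which is why 0, 15 and 5000 are expected to stay silent. A tiny sketch of that condition follows; it is an assumption about `SilenceNotifier`'s internals, which are not part of this diff.

```python
def should_alert(current_loop: int) -> bool:
    """Presumed alert condition: every full 15 minutes, but never on the first loop."""
    return current_loop != 0 and current_loop % 900 == 0


assert [t for t in (0, 15, 900, 1800, 2700, 5000) if should_alert(t)] == [900, 1800, 2700]
```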
+ self.cog._get_instance_vars_event.set() + + async def test_instance_vars_got_guild(self): + """Bot got guild after it became available.""" + await self.cog._get_instance_vars() + self.bot.wait_until_guild_available.assert_called_once() + self.bot.get_guild.assert_called_once_with(Guild.id) + + async def test_instance_vars_got_role(self): + """Got `Roles.verified` role from guild.""" + await self.cog._get_instance_vars() + guild = self.bot.get_guild() + guild.get_role.assert_called_once_with(Roles.verified) + + async def test_instance_vars_got_channels(self): + """Got channels from bot.""" + await self.cog._get_instance_vars() + self.bot.get_channel.assert_any_call(Channels.mod_alerts) + self.bot.get_channel.assert_any_call(Channels.mod_log) + + @mock.patch("bot.cogs.moderation.silence.SilenceNotifier") + async def test_instance_vars_got_notifier(self, notifier): + """Notifier was started with channel.""" + mod_log = MockTextChannel() + self.bot.get_channel.side_effect = (None, mod_log) + await self.cog._get_instance_vars() + notifier.assert_called_once_with(mod_log) + self.bot.get_channel.side_effect = None + + async def test_silence_sent_correct_discord_message(self): + """Correct reply is sent when silencing, covering different durations and an already-silenced channel.""" + test_cases = ( + (0.0001, f"{Emojis.check_mark} silenced current channel for 0.0001 minute(s).", True,), + (None, f"{Emojis.check_mark} silenced current channel indefinitely.", True,), + (5, f"{Emojis.cross_mark} current channel is already silenced.", False,), + ) + for duration, result_message, _silence_patch_return in test_cases: + with self.subTest( + silence_duration=duration, + result_message=result_message, + starting_unsilenced_state=_silence_patch_return + ): + with mock.patch.object(self.cog, "_silence", return_value=_silence_patch_return): + await self.cog.silence.callback(self.cog, self.ctx, duration) + self.ctx.send.assert_called_once_with(result_message) + self.ctx.reset_mock() + + async def test_unsilence_sent_correct_discord_message(self): + """Proper reply after a successful unsilence.""" + with mock.patch.object(self.cog, "_unsilence", return_value=True): + await self.cog.unsilence.callback(self.cog, self.ctx) + self.ctx.send.assert_called_once_with(f"{Emojis.check_mark} unsilenced current channel.") + + async def test_silence_private_for_false(self): + """Permissions are not set and `False` is returned in an already silenced channel.""" + perm_overwrite = Mock(send_messages=False) + channel = Mock(overwrites_for=Mock(return_value=perm_overwrite)) + + self.assertFalse(await self.cog._silence(channel, True, None)) + channel.set_permissions.assert_not_called() + + async def test_silence_private_silenced_channel(self): + """Channel had its `send_messages` permission revoked.""" + channel = MockTextChannel() + self.assertTrue(await self.cog._silence(channel, False, None)) + channel.set_permissions.assert_called_once() + self.assertFalse(channel.set_permissions.call_args.kwargs['send_messages']) + + async def test_silence_private_preserves_permissions(self): + """Previous permissions were preserved when channel was silenced.""" + channel = MockTextChannel() + # Set up mock channel permission state.
+ mock_permissions = PermissionOverwrite() + mock_permissions_dict = dict(mock_permissions) + channel.overwrites_for.return_value = mock_permissions + await self.cog._silence(channel, False, None) + new_permissions = channel.set_permissions.call_args.kwargs + # Remove 'send_messages' key because it got changed in the method. + del new_permissions['send_messages'] + del mock_permissions_dict['send_messages'] + self.assertDictEqual(mock_permissions_dict, new_permissions) + + async def test_silence_private_notifier(self): + """Channel should be added to notifier with `persistent` set to `True`, and the other way around.""" + channel = MockTextChannel() + with mock.patch.object(self.cog, "notifier", create=True): + with self.subTest(persistent=True): + await self.cog._silence(channel, True, None) + self.cog.notifier.add_channel.assert_called_once() + + with mock.patch.object(self.cog, "notifier", create=True): + with self.subTest(persistent=False): + await self.cog._silence(channel, False, None) + self.cog.notifier.add_channel.assert_not_called() + + async def test_silence_private_added_muted_channel(self): + """Channel was added to `muted_channels` on silence.""" + channel = MockTextChannel() + with mock.patch.object(self.cog, "muted_channels") as muted_channels: + await self.cog._silence(channel, False, None) + muted_channels.add.assert_called_once_with(channel) + + async def test_unsilence_private_for_false(self): + """Permissions are not set and `False` is returned in an unsilenced channel.""" + channel = Mock() + self.assertFalse(await self.cog._unsilence(channel)) + channel.set_permissions.assert_not_called() + + @mock.patch.object(Silence, "notifier", create=True) + async def test_unsilence_private_unsilenced_channel(self, _): + """Channel had `send_message` permissions restored""" + perm_overwrite = MagicMock(send_messages=False) + channel = MockTextChannel(overwrites_for=Mock(return_value=perm_overwrite)) + self.assertTrue(await self.cog._unsilence(channel)) + channel.set_permissions.assert_called_once() + self.assertIsNone(channel.set_permissions.call_args.kwargs['send_messages']) + + @mock.patch.object(Silence, "notifier", create=True) + async def test_unsilence_private_removed_notifier(self, notifier): + """Channel was removed from `notifier` on unsilence.""" + perm_overwrite = MagicMock(send_messages=False) + channel = MockTextChannel(overwrites_for=Mock(return_value=perm_overwrite)) + await self.cog._unsilence(channel) + notifier.remove_channel.assert_called_once_with(channel) + + @mock.patch.object(Silence, "notifier", create=True) + async def test_unsilence_private_removed_muted_channel(self, _): + """Channel was removed from `muted_channels` on unsilence.""" + perm_overwrite = MagicMock(send_messages=False) + channel = MockTextChannel(overwrites_for=Mock(return_value=perm_overwrite)) + with mock.patch.object(self.cog, "muted_channels") as muted_channels: + await self.cog._unsilence(channel) + muted_channels.discard.assert_called_once_with(channel) + + @mock.patch.object(Silence, "notifier", create=True) + async def test_unsilence_private_preserves_permissions(self, _): + """Previous permissions were preserved when channel was unsilenced.""" + channel = MockTextChannel() + # Set up mock channel permission state. 
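Both `*_preserves_permissions` tests compare the kwargs passed to `set_permissions` against the channel's existing overwrite, minus the `send_messages` key, which implies the cog re-applies every other overwrite value untouched. The sketch below shows that presumed approach for the silence case (the actual `_silence`/`_unsilence` bodies are not part of this diff), using discord.py's `PermissionOverwrite` the same way the tests do.

```python
from discord import PermissionOverwrite


def silenced_overwrite_kwargs(current: PermissionOverwrite) -> dict:
    """Build the kwargs for `channel.set_permissions`: keep every existing
    overwrite value and only flip `send_messages` off."""
    kwargs = dict(current)  # PermissionOverwrite iterates as (name, value) pairs
    kwargs["send_messages"] = False
    return kwargs


# Example: a channel that already denies embeds keeps that denial after silencing.
before = PermissionOverwrite(embed_links=False)
after = silenced_overwrite_kwargs(before)
assert after["embed_links"] is False and after["send_messages"] is False
```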
+ mock_permissions = PermissionOverwrite(send_messages=False) + mock_permissions_dict = dict(mock_permissions) + channel.overwrites_for.return_value = mock_permissions + await self.cog._unsilence(channel) + new_permissions = channel.set_permissions.call_args.kwargs + # Remove 'send_messages' key because it got changed in the method. + del new_permissions['send_messages'] + del mock_permissions_dict['send_messages'] + self.assertDictEqual(mock_permissions_dict, new_permissions) + + @mock.patch("bot.cogs.moderation.silence.asyncio") + @mock.patch.object(Silence, "_mod_alerts_channel", create=True) + def test_cog_unload_starts_task(self, alert_channel, asyncio_mock): + """Task for sending an alert was created with present `muted_channels`.""" + with mock.patch.object(self.cog, "muted_channels"): + self.cog.cog_unload() + alert_channel.send.assert_called_once_with(f"<@&{Roles.moderators}> channels left silenced on cog unload: ") + asyncio_mock.create_task.assert_called_once_with(alert_channel.send()) + + @mock.patch("bot.cogs.moderation.silence.asyncio") + def test_cog_unload_skips_task_start(self, asyncio_mock): + """No task created with no channels.""" + self.cog.cog_unload() + asyncio_mock.create_task.assert_not_called() + + @mock.patch("bot.cogs.moderation.silence.with_role_check") + @mock.patch("bot.cogs.moderation.silence.MODERATION_ROLES", new=(1, 2, 3)) + def test_cog_check(self, role_check): + """Role check is called with `MODERATION_ROLES`""" + self.cog.cog_check(self.ctx) + role_check.assert_called_once_with(self.ctx, *(1, 2, 3)) diff --git a/tests/bot/cogs/test_snekbox.py b/tests/bot/cogs/test_snekbox.py index fd9468829..1dec0ccaf 100644 --- a/tests/bot/cogs/test_snekbox.py +++ b/tests/bot/cogs/test_snekbox.py @@ -1,11 +1,13 @@ import asyncio import logging import unittest -from unittest.mock import AsyncMock, MagicMock, Mock, call, patch +from unittest.mock import AsyncMock, MagicMock, Mock, call, create_autospec, patch +from discord.ext import commands + +from bot import constants from bot.cogs import snekbox from bot.cogs.snekbox import Snekbox -from bot.constants import URLs from tests.helpers import MockBot, MockContext, MockMessage, MockReaction, MockUser @@ -23,7 +25,7 @@ class SnekboxTests(unittest.IsolatedAsyncioTestCase): self.assertEqual(await self.cog.post_eval("import random"), "return") self.bot.http_session.post.assert_called_with( - URLs.snekbox_eval_api, + constants.URLs.snekbox_eval_api, json={"input": "import random"}, raise_for_status=True ) @@ -43,10 +45,10 @@ class SnekboxTests(unittest.IsolatedAsyncioTestCase): self.assertEqual( await self.cog.upload_output("My awesome output"), - URLs.paste_service.format(key=key) + constants.URLs.paste_service.format(key=key) ) self.bot.http_session.post.assert_called_with( - URLs.paste_service.format(key="documents"), + constants.URLs.paste_service.format(key="documents"), data="My awesome output", raise_for_status=True ) @@ -279,11 +281,14 @@ class SnekboxTests(unittest.IsolatedAsyncioTestCase): """Test that the continue_eval function does continue if required conditions are met.""" ctx = MockContext(message=MockMessage(add_reaction=AsyncMock(), clear_reactions=AsyncMock())) response = MockMessage(delete=AsyncMock()) - new_msg = MockMessage(content='!e NewCode') + new_msg = MockMessage() self.bot.wait_for.side_effect = ((None, new_msg), None) + expected = "NewCode" + self.cog.get_code = create_autospec(self.cog.get_code, spec_set=True, return_value=expected) actual = await self.cog.continue_eval(ctx, response) - 
self.assertEqual(actual, 'NewCode') + self.cog.get_code.assert_awaited_once_with(new_msg) + self.assertEqual(actual, expected) self.bot.wait_for.assert_has_awaits( ( call('message_edit', check=partial_mock(snekbox.predicate_eval_message_edit, ctx), timeout=10), @@ -302,6 +307,32 @@ class SnekboxTests(unittest.IsolatedAsyncioTestCase): self.assertEqual(actual, None) ctx.message.clear_reactions.assert_called_once() + async def test_get_code(self): + """Should return 1st arg (or None) if eval cmd in message, otherwise return full content.""" + prefix = constants.Bot.prefix + subtests = ( + (self.cog.eval_command, f"{prefix}{self.cog.eval_command.name} print(1)", "print(1)"), + (self.cog.eval_command, f"{prefix}{self.cog.eval_command.name}", None), + (MagicMock(spec=commands.Command), f"{prefix}tags get foo"), + (None, "print(123)") + ) + + for command, content, *expected_code in subtests: + if not expected_code: + expected_code = content + else: + [expected_code] = expected_code + + with self.subTest(content=content, expected_code=expected_code): + self.bot.get_context.reset_mock() + self.bot.get_context.return_value = MockContext(command=command) + message = MockMessage(content=content) + + actual_code = await self.cog.get_code(message) + + self.bot.get_context.assert_awaited_once_with(message) + self.assertEqual(actual_code, expected_code) + def test_predicate_eval_message_edit(self): """Test the predicate_eval_message_edit function.""" msg0 = MockMessage(id=1, content='abc') diff --git a/tests/bot/test_converters.py b/tests/bot/test_converters.py index 1e5ca62ae..ca8cb6825 100644 --- a/tests/bot/test_converters.py +++ b/tests/bot/test_converters.py @@ -8,6 +8,7 @@ from discord.ext.commands import BadArgument from bot.converters import ( Duration, + HushDurationConverter, ISODateTime, TagContentConverter, TagNameConverter, @@ -271,3 +272,32 @@ class ConverterTests(unittest.TestCase): exception_message = f"`{datetime_string}` is not a valid ISO-8601 datetime string" with self.assertRaises(BadArgument, msg=exception_message): asyncio.run(converter.convert(self.context, datetime_string)) + + def test_hush_duration_converter_for_valid(self): + """HushDurationConverter returns correct value for minutes duration or `"forever"` strings.""" + test_values = ( + ("0", 0), + ("15", 15), + ("10", 10), + ("5m", 5), + ("5M", 5), + ("forever", None), + ) + converter = HushDurationConverter() + for minutes_string, expected_minutes in test_values: + with self.subTest(minutes_string=minutes_string, expected_minutes=expected_minutes): + converted = asyncio.run(converter.convert(self.context, minutes_string)) + self.assertEqual(expected_minutes, converted) + + def test_hush_duration_converter_for_invalid(self): + """HushDurationConverter raises correct exception for invalid minutes duration strings.""" + test_values = ( + ("16", "Duration must be at most 15 minutes."), + ("10d", "10d is not a valid minutes duration."), + ("-1", "-1 is not a valid minutes duration."), + ) + converter = HushDurationConverter() + for invalid_minutes_string, exception_message in test_values: + with self.subTest(invalid_minutes_string=invalid_minutes_string, exception_message=exception_message): + with self.assertRaisesRegex(BadArgument, exception_message): + asyncio.run(converter.convert(self.context, invalid_minutes_string)) |
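The converter tests above cover the accepted inputs; in a cog, `HushDurationConverter` would typically be used as a parameter annotation so discord.py applies the conversion automatically. A hypothetical usage sketch follows; the command name and the default value are illustrative only and are not taken from this changeset.

```python
from discord.ext import commands

from bot.converters import HushDurationConverter


class ExampleCog(commands.Cog):
    """Illustrative cog showing HushDurationConverter as a parameter annotation."""

    @commands.command(name="hush-example")  # hypothetical command, not part of this diff
    async def hush_example(self, ctx: commands.Context, duration: HushDurationConverter = 10) -> None:
        """`duration` arrives as an int (minutes) or None when "forever" was passed."""
        if duration is None:
            await ctx.send("Silencing indefinitely.")
        else:
            await ctx.send(f"Silencing for {duration} minute(s).")
```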