| Commit message | Author | Age | Lines |
Without the exception being set on the future, the bot would fail silently
from the user's perspective if an exception was handled here
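A minimal sketch of the idea, with hypothetical `parse_symbol` and `futures` names: if parsing fails, the exception is set on the waiting future so the requesting coroutine sees the error instead of waiting forever.

```python
import asyncio


def parse_symbol(symbol: str) -> str:
    """Stand-in for the real Markdown parsing; may raise."""
    return f"Docs for {symbol}"


async def parse_queue(queue: asyncio.Queue, futures: dict) -> None:
    """Hypothetical queue consumer that propagates parse errors to the waiting future."""
    while True:
        symbol = await queue.get()
        future = futures[symbol]
        try:
            future.set_result(parse_symbol(symbol))
        except Exception as error:
            # Without set_exception, the awaiting coroutine never learns about the failure.
            future.set_exception(error)
```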
|
Because the futures are cleaned up and the Markdown only stays in the
cache for a short time, items that were requested previously
and had their cache cleared would be missing from the CachedParser
|
This could be handled by using sets to hold the items in _page_symbols,
but ultimately the check has a much smaller cost than having
thousands of sets for the urls.
Because we create futures for every item that ends up in the queue, we
can also skip the `.get is None` check and instead fetch the
future directly from the dict
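As a rough illustration of that last point (the names here are hypothetical, not the cog's actual attributes): because the future is created at enqueue time, lookups can index the dict directly instead of guarding with `.get(...) is None`.

```python
import asyncio

# Hypothetical mapping; a future exists for every item that was ever queued.
item_futures: dict[str, asyncio.Future] = {}


def enqueue_symbol(symbol: str, queue: list) -> None:
    """Create the future at enqueue time so later lookups can't encounter a missing key."""
    item_futures[symbol] = asyncio.get_running_loop().create_future()
    queue.append(symbol)


async def wait_for_symbol(symbol: str) -> str:
    # Direct indexing; no `item_futures.get(symbol) is None` check needed.
    return await item_futures[symbol]
```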
|
The code has no way of reaching futures through new requests after
their result has been set, as setting the result also includes setting
the value in redis.
|
In some cases these are actual symbols that we can look up
|
The code handling this was moved to a function to achieve this cleanly.
Includes fixes for bugs where an incorrect package was added to the symbol
name in the second branch and an incorrect symbol was added in
the third branch
Co-authored-by: MarkKoz <[email protected]>
|
Co-authored-by: MarkKoz <[email protected]>
|
Co-authored-by: MarkKoz <[email protected]>
|
We're not using it as a decorator, so using wraps only complicates
the call syntax
|
Instead of fetching it again in the cog, the converter now returns
the inventory for later use. The set command no longer attempts
to reschedule the inventory, and a bug that caused the inventory
rescheduling to do nothing in `update_single` was fixed after moving
it to its own method
|
Previously the bot returned an error if a symbol was not found while
inventories were refreshing, but we can just wait for them to finish
refreshing, after which the symbol may be filled in.
A logging call to notify that the refresh is done was also added.
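A sketch of the waiting behaviour, using a hypothetical `refresh_done` asyncio.Event rather than whatever mechanism the cog actually uses:

```python
import asyncio
from typing import Optional

# Hypothetical state: cleared while inventories are refreshing, set when done.
refresh_done = asyncio.Event()
refresh_done.set()

inventories: dict[str, str] = {}


async def lookup_symbol(symbol: str) -> Optional[str]:
    """Instead of erroring out mid-refresh, wait for the refresh and retry the lookup."""
    if symbol not in inventories and not refresh_done.is_set():
        await refresh_done.wait()
    return inventories.get(symbol)
```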
|
Creating futures for everything and then awaiting at the end takes
care of all the potential race conditions that may pop up from items
that are parsed and sent to redis while the get_markdown method is
in the middle of fetching a page. In case that happens with this
implementation, we just need to move the item to the front of the
queue and the future will get its result set soon afterwards.
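A simplified sketch of that flow, with hypothetical module-level `queue` and `futures` structures standing in for the cog's real state:

```python
import asyncio
from collections import deque

queue: deque = deque()                   # symbols waiting to be parsed
futures: dict[str, asyncio.Future] = {}  # one future per queued symbol


async def get_markdown(symbol: str, page_symbols: list) -> str:
    """Queue every symbol found on the fetched page, then await only the requested one."""
    loop = asyncio.get_running_loop()
    for item in page_symbols:
        if item not in futures:
            futures[item] = loop.create_future()
            queue.append(item)

    requested = futures[symbol]
    if not requested.done() and symbol in queue:
        # The consumer hasn't reached our item yet; move it to the front of the queue.
        queue.remove(symbol)
        queue.appendleft(symbol)
    return await requested
```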
|
Co-authored-by: MarkKoz <[email protected]>
|
This reverts commit ad90978f
|
The parsing may take up to a few hundred ms depending on the amount
of work it has to do
|
The strainer now forces the text attribute to be None, simplifying
the check on strings, and falls back to the superclass' method on
non-string elements
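Something along these lines (a loose sketch against the older bs4 SoupStrainer API, where `search` is the matching hook; the real class may differ):

```python
from bs4 import SoupStrainer


class Strainer(SoupStrainer):
    """Strainer that can also match NavigableStrings."""

    def __init__(self, *, include_strings: bool, **kwargs):
        self.include_strings = include_strings
        kwargs.pop("text", None)  # keep the text filter off; it would complicate the string check
        super().__init__(**kwargs)

    def search(self, markup):
        """Let strings through when requested; defer to the superclass for tags."""
        if isinstance(markup, str):
            return markup if self.include_strings else None
        return super().search(markup)
```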
|
The HTML we parse frequently ends up with trailing, and sometimes leading,
newlines which get stripped out by Discord anyway; we have no reason
to keep those around when sending the Markdown over to redis
|
The symbol is also no longer sent back to the user, as it is not
necessary and we can skip the cleanup on it
|
This allows the caller to work with the message further
|
Co-authored-by: MarkKoz <[email protected]>
|
With only two strings, the addition is a bit clearer than
constructing and joining a tuple
Co-authored-by: MarkKoz <[email protected]>
|
Co-authored-by: MarkKoz <[email protected]>
|
Co-authored-by: MarkKoz <[email protected]>
|
Co-authored-by: MarkKoz <[email protected]>
|
The else is a bit clearer than the early return
|
All commands that refresh the inventories in some way are now locked to
prevent various race conditions that may have occurred in the unlikely
scenario that they got triggered together. The fetching part of the
get command now also has to wait for any running inventory refresh to
finish before proceeding to fetch and parse the HTML
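A minimal sketch of the locking, using a plain asyncio.Lock and stand-in bodies rather than the cog's real commands:

```python
import asyncio

# Hypothetical lock shared by every command that touches the inventories.
refresh_lock = asyncio.Lock()


async def refresh_inventories() -> None:
    """Only one refreshing command can run at a time."""
    async with refresh_lock:
        await asyncio.sleep(0)  # stand-in for re-fetching and rebuilding the inventories


async def get_symbol_markdown(symbol: str) -> str:
    """The get path waits for any in-flight refresh before fetching and parsing the HTML."""
    async with refresh_lock:
        pass  # simply waiting on the lock is enough; the fetch itself doesn't hold it
    return f"Markdown for {symbol}"  # stand-in for the actual fetch + parse
```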
|
discord.py uses the globals of functions to resolve forward refs
in commands. Previously, decorators applied before commands
broke the bot with forward refs to names that weren't in the namespace
of the module where they were defined; the new function takes care of
merging the globals into a new function to mitigate this issue.
closes: #1323
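A rough sketch of such a helper (the name and details here are illustrative, not necessarily the committed code): rebuild the wrapper with `types.FunctionType` so its `__globals__` contain the names from both modules and discord.py can resolve the forward refs.

```python
import types


def merge_function_globals(wrapper: types.FunctionType, wrapped: types.FunctionType) -> types.FunctionType:
    """Recreate `wrapper` with globals merged from both functions so forward refs resolve."""
    merged_globals = {**wrapper.__globals__, **wrapped.__globals__}
    new_function = types.FunctionType(
        wrapper.__code__,
        merged_globals,
        wrapper.__name__,
        wrapper.__defaults__,
        wrapper.__closure__,
    )
    new_function.__kwdefaults__ = wrapper.__kwdefaults__
    return new_function
```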
|
While technically correct, always sending a success message could be
misleading in case of a typo in the package name
|
The finally block will make sure we reset the task and log it no matter
what happens. Additionally, the clearing of the variable is now only
done in one place, as the finally block also executes when the coro is
cancelled
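As a small sketch of the pattern (class and attribute names are hypothetical):

```python
import asyncio
import logging
from typing import Optional

log = logging.getLogger(__name__)


class InventoryRefresher:
    """The finally block resets the task attribute even if the coroutine gets cancelled."""

    def __init__(self) -> None:
        self.refresh_task: Optional[asyncio.Task] = None

    async def _refresh(self) -> None:
        try:
            await asyncio.sleep(0)  # stand-in for the actual refresh work
        finally:
            self.refresh_task = None
            log.debug("Finished inventory refresh.")
```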
|
Previously, in case get_markdown ran twice for an item, the run that
came second would overwrite the future created by the first one,
potentially causing the first coro to wait on it forever, as _parse_queue
would only be able to set a result on the last future
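The fix boils down to reusing an existing future instead of unconditionally creating a new one; a one-liner with `dict.setdefault` (hypothetical names):

```python
import asyncio

futures: dict[str, asyncio.Future] = {}  # hypothetical symbol -> future mapping


def get_or_create_future(symbol: str) -> asyncio.Future:
    """Reuse the future from the first get_markdown call so a second call can't orphan it."""
    # Must be called from within a running event loop.
    return futures.setdefault(symbol, asyncio.get_running_loop().create_future())
```

Note that `setdefault` still eagerly creates a throwaway future when the key already exists; an explicit `if symbol not in futures` check avoids that at the cost of a couple more lines.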
|
We no longer need to keep the items around since everything is in redis
and the cost of always going through redis is fairly small
|
Previously we used packages as the top-level keys and the fields
contained the url and the symbol id. However, if we want to store
all symbols from fetched pages instead of only the ones that were
fetched by users, this comes off worse than using the page url
in the field and setting EXPIREs for the keys instead of doing the
expiry manually in Python.
The new implementation uses package:url as the redis key and only
the symbol id for field names, with the expire set to a week
on the key. This means we have to pattern match the keys when deleting
the cache for a package, but that is done far less often than the expire
checking done previously.
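A sketch of that key scheme using the synchronous redis-py client for brevity (the bot itself talks to redis asynchronously); key layout and function names are illustrative:

```python
import redis

WEEK_SECONDS = 7 * 24 * 60 * 60
client = redis.Redis()  # connection details omitted


def set_symbol_markdown(package: str, url: str, symbol_id: str, markdown: str) -> None:
    """Store the Markdown in a package:url hash and (re)set the weekly expiry on the key."""
    key = f"{package}:{url}"
    client.hset(key, symbol_id, markdown)
    client.expire(key, WEEK_SECONDS)


def delete_package_cache(package: str) -> None:
    """Deleting a package's cache needs a pattern match over its keys."""
    for key in client.scan_iter(match=f"{package}:*"):
        client.delete(key)
```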
|
The result of _split_parameters is only iterated over, so a list is not
needed. Making it lazy may also save some time in cases where we don't
use all parameters
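For illustration only (a toy splitter, not the real one): returning a generator means a caller that needs only the first couple of parameters never pays for the rest.

```python
from itertools import islice
from typing import Iterator


def split_parameters(parameters_string: str) -> Iterator[str]:
    """Toy, comma-only splitter; the real one is bracket- and quote-aware."""
    for parameter in parameters_string.split(","):
        yield parameter.strip()


# Only the first two parameters are actually produced here.
first_two = list(islice(split_parameters("a, b, c, d"), 2))
```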
|
The two variables were initialized and cleared together and contained
related information
|
Previously the code assumed ' and " could be used interchangeably,
and while strings inside of brackets were ignored for the depth,
their contents weren't, causing strings like "ab[cd" to increase the
depth
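A sketch of a splitter with that behaviour; it tracks the exact quote character and skips bracket counting while inside a string (an illustration, not necessarily the code as committed):

```python
from typing import Iterator

_BRACKETS = {"(": ")", "[": "]", "{": "}"}


def split_parameters(parameters_string: str) -> Iterator[str]:
    """Split on top-level commas, honouring quotes and brackets separately."""
    last_split = 0
    depth = 0
    current_quote = None

    for index, character in enumerate(parameters_string):
        if current_quote is not None:
            # Inside a string only the matching, unescaped quote matters.
            if character == current_quote and parameters_string[index - 1] != "\\":
                current_quote = None
        elif character in ("'", '"'):
            current_quote = character
        elif character in _BRACKETS:
            depth += 1
        elif character in _BRACKETS.values():
            depth -= 1
        elif character == "," and depth == 0:
            yield parameters_string[last_split:index]
            last_split = index + 1

    yield parameters_string[last_split:]
```

For example, `list(split_parameters("x, y='a, b', z[0]"))` yields three parameters, and a stray bracket inside a string no longer affects the depth.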
|
Co-authored-by: MarkKoz <[email protected]>
|
Co-authored-by: MarkKoz <[email protected]>
|
Co-authored-by: MarkKoz <[email protected]>
|
A newline was also added to set to keep it consistent with
set_if_exists
|
Some packages (currently only python) should be prioritised over others.
The previous cleanup didn't account for other packages loading before it,
which resulted in duplicate symbols getting the python prefix and the
original symbols linking to most probably undesired pages
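A simplified sketch of priority-aware duplicate handling (mapping symbols to package names only; the real structures differ):

```python
# Packages that should keep the plain symbol name on a clash.
PRIORITY_PACKAGES = ("python",)

doc_symbols: dict[str, str] = {}  # symbol name -> owning package


def add_symbol(symbol: str, package: str) -> None:
    """On a name clash the lower-priority package's symbol gets the package prefix."""
    existing = doc_symbols.get(symbol)
    if existing is None:
        doc_symbols[symbol] = package
    elif package in PRIORITY_PACKAGES and existing not in PRIORITY_PACKAGES:
        # Even if the priority package loads last, it still wins the plain name.
        doc_symbols[f"{existing}.{symbol}"] = existing
        doc_symbols[symbol] = package
    else:
        doc_symbols[f"{package}.{symbol}"] = package
```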
|
We also clear the cache when removing a package
|