Commit messages:
- If all titles were declared “bad”, the post processor previously tried to access
  the first title in the array, which had already been drained. We now simply clone
  the array to support this usage.
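  As an illustration only (the names `pick_title` and `is_bad` are made up, not the repo's actual code), the fallback can operate on a clone so the original list is never consumed:

  ```rust
  fn pick_title(titles: &[String], is_bad: impl Fn(&str) -> bool) -> Option<String> {
      // Filter a clone; the original `titles` slice stays untouched.
      let mut candidates: Vec<String> = titles.to_vec();
      candidates.retain(|t| !is_bad(t));

      // If every title was declared "bad", fall back to the first original one
      // instead of indexing into an already-drained collection.
      candidates.into_iter().next().or_else(|| titles.first().cloned())
  }
  ```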
- Otherwise, the sub-processes run with the default verbosity level.
- That makes it clear that these parts are only exposed to facilitate macro use and
  not as part of the public API.
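  A sketch of the usual Rust pattern for this (the module and item names below are assumptions, not the repo's actual identifiers):

  ```rust
  /// Helpers that must be `pub` so the exported macros can reference them
  /// via `$crate::...`, but that are deliberately hidden from the docs and
  /// not covered by any stability promise.
  #[doc(hidden)]
  pub mod __macro_support {
      pub fn format_entry(name: &str) -> String {
          format!("entry: {name}")
      }
  }
  ```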
- This seems to have no effect on anything, so I went ahead and removed this dead
  code.
- We need to tell yt_dlp about our post processors, as they would otherwise not take
  full effect. For example, changing the title would previously only have changed the
  title in the *in-memory* info JSON; the actual files on disk (the video and the
  .info.json) would still have the old title, as yt_dlp did not know about our post
  processor. Registering the post processors via yt_dlp's API also has the upside of
  letting us determine when they run.
- The reqwest crate will panic if its blocking client is run inside another executor.
  But we cannot make this function async, as the whole API is forced to be sync by the
  Python FFI.
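  One common workaround, sketched here under the assumption that `reqwest`'s blocking client (the `blocking` feature) is in use, is to run the call on a dedicated OS thread so it never executes on an async worker thread:

  ```rust
  use std::thread;

  /// Fetch a URL synchronously, even when the caller might itself be running
  /// inside a tokio runtime: the blocking client only ever runs on a plain OS
  /// thread that has no executor attached.
  fn fetch_blocking(url: String) -> Result<String, reqwest::Error> {
      thread::spawn(move || reqwest::blocking::get(url).and_then(|resp| resp.text()))
          .join()
          .expect("fetch thread panicked")
  }
  ```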
- This avoids the dependency on a real standard library (i.e., python3) at runtime.
- This avoids having to wrap all blocks into a `match` statement.
- shlex is better maintained, and _actually_ meant for this purpose.
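  For reference, a small sketch of the `shlex` crate's splitting behaviour (the command line used here is made up):

  ```rust
  fn main() {
      // shlex::split honours shell-style quoting and returns None on
      // malformed input such as an unclosed quote.
      let argv = shlex::split(r#"yt-dlp -o "%(title)s.%(ext)s" https://example.com/v"#)
          .expect("malformed command line");
      assert_eq!(argv[2], "%(title)s.%(ext)s");
  }
  ```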
- I also moved that to a separate subcommand, as we would otherwise have too many
  `requires`/`conflicts_with` statements.
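  Sketched with clap's derive API (the subcommand and field names below are placeholders, not the actual CLI):

  ```rust
  use clap::{Parser, Subcommand};

  #[derive(Parser)]
  struct Cli {
      #[command(subcommand)]
      command: Command,
  }

  #[derive(Subcommand)]
  enum Command {
      /// A dedicated subcommand avoids piling `requires`/`conflicts_with`
      /// constraints onto one shared set of flags.
      Import { file: std::path::PathBuf },
      /// The existing behaviour stays on its own subcommand.
      List,
  }
  ```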
- The matching behaviour was not predictable at all (probably due to a bad config),
  which then led to using `yt videos ls | grep -i <query>` instead.
- There is no point in doing this anymore, as we no longer need to deserialize it.
- Otherwise, we might encounter very weird behaviour if yt_dlp ever changes the types
  of these keys.
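  A sketch of what type-checked access to the untyped info JSON can look like (the key names and exact handling are assumptions):

  ```rust
  use serde_json::Value;

  /// Read the duration in seconds, accepting both integer and float encodings
  /// and returning None if the key is missing or has an unexpected type,
  /// instead of silently misbehaving.
  fn duration_secs(info: &Value) -> Option<f64> {
      info.get("duration")?.as_f64()
  }

  fn title(info: &Value) -> Option<&str> {
      info.get("title")?.as_str()
  }
  ```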
- This number obviously does not mean that we have finished updating (as it is
  incremented on starting), but it still provides some feedback on how long the update
  will probably take.
- RustPython currently does not use a garbage collector. Thus, every cyclic reference
  between Python objects results in a memory leak of these objects (as RustPython uses
  (A)RCs). The only real way to work around the memory leaks is to restart the whole
  process, and this `--grouped` flag seems to be the best solution for that.
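  For illustration only (this is not code from the repo), the same failure mode expressed with Rust's own reference counting:

  ```rust
  use std::cell::RefCell;
  use std::rc::Rc;

  struct Node {
      next: RefCell<Option<Rc<Node>>>,
  }

  fn main() {
      let a = Rc::new(Node { next: RefCell::new(None) });
      let b = Rc::new(Node { next: RefCell::new(Some(Rc::clone(&a))) });
      // Close the cycle a -> b -> a. When `a` and `b` go out of scope, each
      // count only drops to 1, so without a tracing GC neither node is freed.
      *a.next.borrow_mut() = Some(Rc::clone(&b));
  }
  ```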
- The previous code was written under the assumption that `yt_dlp` had an async API
  (meaning that calls into it should never block). Unfortunately, the API is sync and
  all calls into it block. Therefore, all the parallelism in the previous code amounted
  to nothing; the actual update ran completely sequentially. The new version uses a
  local thread pool to avoid blocking the tokio runtime and thus achieves higher speed.
  Unfortunately, because the rustpython implementation is much slower than cpython, the
  whole update takes longer. But that is a problem for another day.
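  A minimal sketch of the general pattern (using `tokio::task::spawn_blocking` for illustration; the commit itself mentions a local thread pool, which matters if the Python values involved are not `Send`; `fetch_info` is a placeholder for the real blocking call):

  ```rust
  use tokio::task;

  /// Run one blocking yt_dlp-style call per video without stalling the
  /// async runtime.
  async fn update_all(urls: Vec<String>) -> Vec<String> {
      let mut handles = Vec::new();
      for url in urls {
          // Each blocking call is moved off the async worker threads.
          handles.push(task::spawn_blocking(move || fetch_info(&url)));
      }

      let mut results = Vec::new();
      for handle in handles {
          results.push(handle.await.expect("worker panicked"));
      }
      results
  }

  fn fetch_info(url: &str) -> String {
      // Placeholder for the synchronous call into the Python interpreter.
      format!("info for {url}")
  }
  ```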
- The single-file approach becomes unwieldy once one has many open videos.
- Having the `-s/--subscription` flag is pointless, as there are no other flags that
  could make a positional arg ambiguous.
- This code was temporarily commented out, as I had not yet migrated it during the
  pyo3 -> rustpython migration.
- This is just too much noise.
- The previous parser was very brittle: it failed for (valid) outputs like `1d 10h 30m`,
  as it only expected two number-unit pairs. On top of that, extending it was
  failure-prone (as shown by the round-trip failure in combination with the `d` unit).
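  A sketch of a parser that accepts an arbitrary number of whitespace-separated number-unit pairs (illustrative only; the unit names and error handling are assumptions):

  ```rust
  use std::time::Duration;

  /// Parse strings like "1d 10h 30m" or "45s" into a Duration.
  /// Any number of pairs is accepted, in any order.
  fn parse_duration(input: &str) -> Result<Duration, String> {
      let mut total_secs: u64 = 0;

      for token in input.split_whitespace() {
          // Split each token into its numeric part and its unit suffix.
          let split = token
              .find(|c: char| !c.is_ascii_digit())
              .ok_or_else(|| format!("missing unit in {token:?}"))?;
          let (num, unit) = token.split_at(split);
          let value: u64 = num.parse().map_err(|_| format!("bad number in {token:?}"))?;

          let factor = match unit {
              "d" => 24 * 60 * 60,
              "h" => 60 * 60,
              "m" => 60,
              "s" => 1,
              other => return Err(format!("unknown unit {other:?}")),
          };
          total_secs += value * factor;
      }

      Ok(Duration::from_secs(total_secs))
  }
  ```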
- This is just cleaner compared to running `yt-dlp --version` as a command.
- Having one crate outside the `crates` directory is just weird.
- That allows us to avoid cpython's GIL and lets us fully leverage async/concurrent
  code to speed up Python operations. I have also taken the opportunity to change the
  `InfoJson` struct to an untyped JSON value, as that is what it actually is.