| | |
|---|---|
| author | 2023-01-30 04:40:57 -0500 |
| committer | 2023-01-30 04:40:57 -0500 |
| commit | 919f8ba16a7b82ba1099bd25b2c61c7881a05aa2 (patch) |
| tree | 50eb34c3286538164a2f2b7048d110dc89b2a971 /data |
| parent | f1051085013c0d702ef974b9b27ea43b3fc73259 (diff) |
New upstream version 1.24.5 (upstream/1.24.5)
Diffstat (limited to 'data')
| mode | file | lines |
|---|---|---|
| -rw-r--r-- | data/completion/_gallery-dl | 3 |
| -rw-r--r-- | data/completion/gallery-dl | 2 |
| -rw-r--r-- | data/completion/gallery-dl.fish | 1 |
| -rw-r--r-- | data/man/gallery-dl.1 | 5 |
| -rw-r--r-- | data/man/gallery-dl.conf.5 | 150 |
5 files changed, 139 insertions, 22 deletions
```diff
diff --git a/data/completion/_gallery-dl b/data/completion/_gallery-dl
index 1125b36..06e8556 100644
--- a/data/completion/_gallery-dl
+++ b/data/completion/_gallery-dl
@@ -70,6 +70,7 @@ _arguments -C -S \
 --mtime-from-date'[Set file modification times according to "date" metadata]' \
 --exec'[Execute CMD for each downloaded file. Example: --exec "convert {} {}.png && rm {}"]':'<cmd>' \
 --exec-after'[Execute CMD after all files were downloaded successfully. Example: --exec-after "cd {} && convert * ../doc.pdf"]':'<cmd>' \
-{-P,--postprocessor}'[Activate the specified post processor]':'<name>' && rc=0
+{-P,--postprocessor}'[Activate the specified post processor]':'<name>' \
+{-O,--postprocessor-option}'[Additional "<key>=<value>" post processor options]':'<opt>' && rc=0
 
 return rc
diff --git a/data/completion/gallery-dl b/data/completion/gallery-dl
index f57306e..203c87d 100644
--- a/data/completion/gallery-dl
+++ b/data/completion/gallery-dl
@@ -10,7 +10,7 @@ _gallery_dl()
 elif [[ "${prev}" =~ ^()$ ]]; then
 COMPREPLY=( $(compgen -d -- "${cur}") )
 else
-COMPREPLY=( $(compgen -W "--help --version --input-file --destination --directory --filename --proxy --source-address --user-agent --clear-cache --cookies --cookies-from-browser --quiet --verbose --get-urls --resolve-urls --dump-json --simulate --extractor-info --list-keywords --list-modules --list-extractors --write-log --write-unsupported --write-pages --limit-rate --retries --http-timeout --sleep --sleep-request --sleep-extractor --filesize-min --filesize-max --chunk-size --no-part --no-skip --no-mtime --no-download --no-postprocessors --no-check-certificate --config --config-yaml --option --ignore-config --username --password --netrc --download-archive --abort --terminate --range --chapter-range --filter --chapter-filter --zip --ugoira-conv --ugoira-conv-lossless --ugoira-conv-copy --write-metadata --write-info-json --write-infojson --write-tags --mtime-from-date --exec --exec-after --postprocessor" -- "${cur}") )
+COMPREPLY=( $(compgen -W "--help --version --input-file --destination --directory --filename --proxy --source-address --user-agent --clear-cache --cookies --cookies-from-browser --quiet --verbose --get-urls --resolve-urls --dump-json --simulate --extractor-info --list-keywords --list-modules --list-extractors --write-log --write-unsupported --write-pages --limit-rate --retries --http-timeout --sleep --sleep-request --sleep-extractor --filesize-min --filesize-max --chunk-size --no-part --no-skip --no-mtime --no-download --no-postprocessors --no-check-certificate --config --config-yaml --option --ignore-config --username --password --netrc --download-archive --abort --terminate --range --chapter-range --filter --chapter-filter --zip --ugoira-conv --ugoira-conv-lossless --ugoira-conv-copy --write-metadata --write-info-json --write-infojson --write-tags --mtime-from-date --exec --exec-after --postprocessor --postprocessor-option" -- "${cur}") )
 fi
 }
diff --git a/data/completion/gallery-dl.fish b/data/completion/gallery-dl.fish
index 986d9df..e2a7e6d 100644
--- a/data/completion/gallery-dl.fish
+++ b/data/completion/gallery-dl.fish
@@ -65,3 +65,4 @@ complete -c gallery-dl -l 'mtime-from-date' -d 'Set file modification times acco
 complete -c gallery-dl -x -l 'exec' -d 'Execute CMD for each downloaded file. Example: --exec "convert {} {}.png && rm {}"'
 complete -c gallery-dl -x -l 'exec-after' -d 'Execute CMD after all files were downloaded successfully. Example: --exec-after "cd {} && convert * ../doc.pdf"'
 complete -c gallery-dl -x -s 'P' -l 'postprocessor' -d 'Activate the specified post processor'
+complete -c gallery-dl -x -s 'O' -l 'postprocessor-option' -d 'Additional "<key>=<value>" post processor options'
diff --git a/data/man/gallery-dl.1 b/data/man/gallery-dl.1
index 00723f3..024ddb3 100644
--- a/data/man/gallery-dl.1
+++ b/data/man/gallery-dl.1
@@ -1,4 +1,4 @@
-.TH "GALLERY-DL" "1" "2023-01-11" "1.24.4" "gallery-dl Manual"
+.TH "GALLERY-DL" "1" "2023-01-28" "1.24.5" "gallery-dl Manual"
 .\" disable hyphenation
 .nh
@@ -208,6 +208,9 @@ Execute CMD after all files were downloaded successfully. Example: --exec-after
 .TP
 .B "\-P, \-\-postprocessor" \f[I]NAME\f[]
 Activate the specified post processor
+.TP
+.B "\-O, \-\-postprocessor\-option" \f[I]OPT\f[]
+Additional '<key>=<value>' post processor options
 .SH EXAMPLES
 .TP
diff --git a/data/man/gallery-dl.conf.5 b/data/man/gallery-dl.conf.5
index e5742b7..6b11835 100644
--- a/data/man/gallery-dl.conf.5
+++ b/data/man/gallery-dl.conf.5
@@ -1,4 +1,4 @@
-.TH "GALLERY-DL.CONF" "5" "2023-01-11" "1.24.4" "gallery-dl Manual"
+.TH "GALLERY-DL.CONF" "5" "2023-01-28" "1.24.5" "gallery-dl Manual"
 .\" disable hyphenation
 .nh
 .\" disable justification (adjust text to left margin only)
@@ -574,8 +574,19 @@ update its contents with cookies received during data extraction.
 .br
 * \f[I]object\f[] (scheme -> proxy)
 
-.IP "Default:" 9
-\f[I]null\f[]
+.IP "Example:" 4
+.. code:: json
+
+"http://10.10.1.10:3128"
+
+.. code:: json
+
+{
+"http" : "http://10.10.1.10:3128",
+"https": "http://10.10.1.10:1080",
+"http://10.20.1.128": "http://10.10.1.10:5323"
+}
+
 .IP "Description:" 4
 Proxy (or proxies) to be used for remote connections.
@@ -590,16 +601,6 @@ It is also possible to set a proxy for a specific host by using
 \f[I]scheme://host\f[] as key.
 See \f[I]Requests' proxy documentation\f[] for more details.
 
-Example:
-
-.. code:: json
-
-{
-"http" : "http://10.10.1.10:3128",
-"https": "http://10.10.1.10:1080",
-"http://10.20.1.128": "http://10.10.1.10:5323"
-}
-
 Note: If a proxy URLs does not include a scheme,
 \f[I]http://\f[] is assumed.
@@ -669,6 +670,48 @@ Note: \f[I]requests\f[] and \f[I]urllib3\f[] only support HTTP/1.1,
 while a real browser would use HTTP/2.
 
+.SS extractor.*.headers
+.IP "Type:" 6
+\f[I]object\f[] (name -> value)
+
+.IP "Default:" 9
+.. code:: json
+
+{
+"User-Agent" : "<extractor.*.user-agent>",
+"Accept" : "*/*",
+"Accept-Language": "en-US,en;q=0.5",
+"Accept-Encoding": "gzip, deflate"
+}
+
+
+.IP "Description:" 4
+Additional \f[I]HTTP headers\f[]
+to be sent with each HTTP request,
+
+To disable sending a header, set its value to \f[I]null\f[].
+
+
+.SS extractor.*.ciphers
+.IP "Type:" 6
+\f[I]list\f[] of \f[I]strings\f[]
+
+.IP "Example:" 4
+.. code:: json
+
+["ECDHE-ECDSA-AES128-GCM-SHA256",
+"ECDHE-RSA-AES128-GCM-SHA256",
+"ECDHE-ECDSA-CHACHA20-POLY1305",
+"ECDHE-RSA-CHACHA20-POLY1305"]
+
+
+.IP "Description:" 4
+List of TLS/SSL cipher suites in
+\f[I]OpenSSL cipher list format\f[]
+to be passed to
+\f[I]ssl.SSLContext.set_ciphers()\f[]
+
 .SS extractor.*.keywords
 .IP "Type:" 6
 \f[I]object\f[] (name -> value)
@@ -897,6 +940,25 @@ will run all three post processors - \f[I]mtime\f[], \f[I]zip\f[], \f[I]exec\f[] -
 for each downloaded \f[I]pixiv\f[] file.
 
+.SS extractor.*.postprocessor-options
+.IP "Type:" 6
+\f[I]object\f[] (name -> value)
+
+.IP "Example:" 4
+.. code:: json
+
+{
+"archive": null,
+"keep-files": true
+}
+
+
+.IP "Description:" 4
+Additional \f[I]Postprocessor Options\f[] that get added to each individual
+\f[I]post processor object\f[]
+before initializing it and evaluating filters.
+
 .SS extractor.*.retries
 .IP "Type:" 6
 \f[I]integer\f[]
@@ -909,6 +971,26 @@ Maximum number of times a failed HTTP request is retried
 before giving up, or \f[I]-1\f[] for infinite retries.
 
+.SS extractor.*.retry-codes
+.IP "Type:" 6
+\f[I]list\f[] of \f[I]integers\f[]
+
+.IP "Example:" 4
+[404, 429, 430]
+
+.IP "Description:" 4
+Additional \f[I]HTTP response status codes\f[]
+to retry an HTTP request on.
+
+\f[I]2xx\f[] codes (success responses) and
+\f[I]3xx\f[] codes (redirection messages)
+will never be retried and always count as success,
+regardless of this option.
+
+\f[I]5xx\f[] codes (server error responses) will always be retried,
+regardless of this option.
+
+
 .SS extractor.*.timeout
 .IP "Type:" 6
 \f[I]float\f[]
@@ -1208,15 +1290,30 @@ follow the \f[I]source\f[] and download from there if possible.
 .SS extractor.danbooru.metadata
 .IP "Type:" 6
-\f[I]bool\f[]
+.br
+* \f[I]bool\f[]
+.br
+* \f[I]string\f[]
+.br
+* \f[I]list\f[] of \f[I]strings\f[]
 .IP "Default:" 9
 \f[I]false\f[]
+.IP "Example:" 4
+.br
+* replacements,comments,ai_tags
+.br
+* ["replacements", "comments", "ai_tags"]
+
 .IP "Description:" 4
 Extract additional metadata (notes, artist commentary, parent, children, uploader)
+It is possible to specify a custom list of metadata includes.
+See \f[I]available_includes\f[]
+for possible field names.
+\f[I]aibooru\f[] also supports \f[I]ai_metadata\f[].
+
 Note: This requires 1 additional HTTP request per post.
@@ -1405,7 +1502,7 @@ A (comma-separated) list of subcategories to include
 when processing a user profile.
 Possible values are
-\f[I]"gallery"\f[], \f[I]"scraps"\f[], \f[I]"journal"\f[], \f[I]"favorite"\f[].
+\f[I]"gallery"\f[], \f[I]"scraps"\f[], \f[I]"journal"\f[], \f[I]"favorite"\f[], \f[I]"status"\f[].
 It is possible to use \f[I]"all"\f[] instead of listing all values separately.
@@ -1418,14 +1515,15 @@ It is possible to use \f[I]"all"\f[] instead of listing all values separately.
 \f[I]"html"\f[]
 .IP "Description:" 4
-Selects the output format of journal entries.
+Selects the output format for textual content. This includes journals,
+literature and status updates.
 .br
 * \f[I]"html"\f[]: HTML with (roughly) the same layout as on DeviantArt.
 .br
 * \f[I]"text"\f[]: Plain text with image references and HTML tags removed.
 .br
-* \f[I]"none"\f[]: Don't download journals.
+* \f[I]"none"\f[]: Don't download textual content.
 .SS extractor.deviantart.mature
@@ -3735,6 +3833,20 @@ Extract overlay notes (position and text).
 Note: This requires 1 additional HTTP request per post.
 
+.SS extractor.[booru].url
+.IP "Type:" 6
+\f[I]string\f[]
+
+.IP "Default:" 9
+\f[I]"file_url"\f[]
+
+.IP "Example:" 4
+"preview_url"
+
+.IP "Description:" 4
+Alternate field name to retrieve download URLs from.
+
+
 .SS extractor.[manga-extractor].chapter-reverse
 .IP "Type:" 6
 \f[I]bool\f[]
@@ -3978,7 +4090,7 @@ Additional HTTP headers to send when downloading files,
 \f[I]list\f[] of \f[I]integers\f[]
 .IP "Default:" 9
-\f[I][429]\f[]
+\f[I]extractor.*.retry-codes\f[]
 .IP "Description:" 4
 Additional \f[I]HTTP response status codes\f[]
@@ -3988,7 +4100,7 @@ Codes \f[I]200\f[], \f[I]206\f[], and \f[I]416\f[] (when resuming a \f[I]partial\f[] download)
 will never be retried and always count as success,
 regardless of this option.
-Codes \f[I]500\f[] - \f[I]599\f[] (server error responses) will always be retried,
+\f[I]5xx\f[] codes (server error responses) will always be retried,
 regardless of this option.
```
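Taken together, the configuration keys introduced by this release can be set from one config file. The following is an illustrative sketch, not a recommendation: the key names come from the man-page hunks above, but the values are made up for demonstration.

```json
{
    "extractor": {
        "headers": {
            "Accept-Language": "en-US,en;q=0.5",
            "Referer": null
        },
        "retry-codes": [404, 429, 430],
        "postprocessor-options": {
            "keep-files": true
        }
    }
}
```

Per the CLI additions above, individual postprocessor options can also be supplied per invocation with the new `-O <key>=<value>` flag.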
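The retry policy documented for `extractor.*.retry-codes` reduces to a small decision rule. This is a sketch of that rule as stated in the man page, not gallery-dl's actual implementation:

```python
def should_retry(status: int, retry_codes=()) -> bool:
    """Decide whether an HTTP status warrants a retry, per the
    documented extractor.*.retry-codes semantics."""
    if 200 <= status < 400:
        # 2xx (success) and 3xx (redirection): never retried,
        # always counted as success, regardless of retry_codes
        return False
    if 500 <= status < 600:
        # 5xx (server errors): always retried
        return True
    # everything else only if explicitly listed, e.g. (404, 429, 430)
    return status in retry_codes
```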
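The new `extractor.*.ciphers` option is documented as being passed to `ssl.SSLContext.set_ciphers()`. A minimal standalone illustration of that call, using the cipher names from the man-page example (whether each suite is available depends on the local OpenSSL build):

```python
import ssl

# Cipher list in OpenSSL cipher list format, joined with ":" as
# set_ciphers() expects; names taken from the man-page example.
CIPHERS = ":".join([
    "ECDHE-ECDSA-AES128-GCM-SHA256",
    "ECDHE-RSA-AES128-GCM-SHA256",
    "ECDHE-ECDSA-CHACHA20-POLY1305",
    "ECDHE-RSA-CHACHA20-POLY1305",
])

ctx = ssl.create_default_context()
ctx.set_ciphers(CIPHERS)  # restricts the TLS <= 1.2 suites offered
```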
