openSUSE Commits
Script 'mail_helper' called by obssrc
Hello community,
here is the log from the commit of package python-dask for openSUSE:Factory checked in at 2022-10-25 11:20:11
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/python-dask (Old)
and /work/SRC/openSUSE:Factory/.python-dask.new.2275 (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Package is "python-dask"
Tue Oct 25 11:20:11 2022 rev:56 rq:1030989 version:2022.10.0
Changes:
--------
--- /work/SRC/openSUSE:Factory/python-dask/python-dask.changes 2022-09-12 19:08:31.714583611 +0200
+++ /work/SRC/openSUSE:Factory/.python-dask.new.2275/python-dask.changes 2022-10-25 11:20:38.310220573 +0200
@@ -1,0 +2,100 @@
+Fri Oct 21 13:19:48 UTC 2022 - Ben Greiner <code(a)bnavigator.de>
+
+- Update to version 2022.10.0
+ * Backend library dispatching for IO in Dask-Array and
+ Dask-DataFrame (GH#9475) Richard (Rick) Zamora
+ * Add new CLI that is extensible (GH#9283) Doug Davis
+ * Groupby median (GH#9516) Ian Rose
+ * Fix array copy not being a no-op (GH#9555) David Hoese
+ * Add support for string timedelta in map_overlap (GH#9559)
+ Nicolas Grandemange
+ * Shuffle-based groupby for single functions (GH#9504) Ian Rose
+  * Make datetime.datetime tokenize idempotently (GH#9532) Martin
+ Durant
+ * Support tokenizing datetime.time (GH#9528) Tim Paine
+ * Avoid race condition in lazy dispatch registration (GH#9545)
+ James Bourbeau
+ * Do not allow setitem to np.nan for int dtype (GH#9531) Doug
+ Davis
+ * Stable demo column projection (GH#9538) Ian Rose
+ * Ensure pickle-able binops in delayed (GH#9540) Ian Rose
+ * Fix project CSV columns when selecting (GH#9534) Martin Durant
+ * Update Parquet best practice (GH#9537) Matthew Rocklin
+- move -all metapackage to -complete, mirroring upstream's
+ [complete] extra.
+
+-------------------------------------------------------------------
+Fri Sep 30 23:19:11 UTC 2022 - Arun Persaud <arun(a)gmx.de>
+
+- update to version 2022.9.2:
+ * Enhancements
+ + Remove factorization logic from array auto chunking (:pr:`9507`)
+ `James Bourbeau`_
+ * Documentation
+ + Add docs on running Dask in a standalone Python script
+ (:pr:`9513`) `James Bourbeau`_
+ + Clarify custom-graph multiprocessing example (:pr:`9511`)
+ `nouman`_
+ * Maintenance
+ + Groupby sort upstream compatibility (:pr:`9486`) `Ian Rose`_
+
+-------------------------------------------------------------------
+Fri Sep 16 19:54:12 UTC 2022 - Arun Persaud <arun(a)gmx.de>
+
+- update to version 2022.9.1:
+ * New Features
+ + Add "DataFrame" and "Series" "median" methods (:pr:`9483`)
+ `James Bourbeau`_
+ * Enhancements
+ + Shuffle "groupby" default (:pr:`9453`) `Ian Rose`_
+ + Filter by list (:pr:`9419`) `Greg Hayes`_
+ + Added "distributed.utils.key_split" functionality to
+ "dask.utils.key_split" (:pr:`9464`) `Luke Conibear`_
+ * Bug Fixes
+ + Fix overlap so that "set_index" doesn't drop rows (:pr:`9423`)
+ `Julia Signell`_
+ + Fix assigning pandas "Series" to column when "ddf.columns.min()"
+ raises (:pr:`9485`) `Erik Welch`_
+ + Fix metadata comparison "stack_partitions" (:pr:`9481`) `James
+ Bourbeau`_
+ + Provide default for "split_out" (:pr:`9493`) `Lawrence
+ Mitchell`_
+ * Deprecations
+ + Allow "split_out" to be "None", which then defaults to "1" in
+ "groupby().aggregate()" (:pr:`9491`) `Ian Rose`_
+ * Documentation
+ + Fixing "enforce_metadata" documentation, not checking for dtypes
+ (:pr:`9474`) `Nicolas Grandemange`_
+ + Fix "it's" --> "its" typo (:pr:`9484`) `Nat Tabris`_
+ * Maintenance
+ + Workaround for parquet writing failure using some datetime
+ series but not others (:pr:`9500`) `Ian Rose`_
+ + Filter out "numeric_only" warnings from "pandas" (:pr:`9496`)
+ `James Bourbeau`_
+ + Avoid "set_index(..., inplace=True)" where not necessary
+ (:pr:`9472`) `James Bourbeau`_
+ + Avoid passing groupby key list of length one (:pr:`9495`) `James
+ Bourbeau`_
+ + Update "test_groupby_dropna_cudf" based on "cudf" support for
+ "group_keys" (:pr:`9482`) `James Bourbeau`_
+ + Remove "dd.from_bcolz" (:pr:`9479`) `James Bourbeau`_
+ + Added "flake8-bugbear" to "pre-commit" hooks (:pr:`9457`) `Luke
+ Conibear`_
+ + Bind loop variables in function definitions ("B023")
+ (:pr:`9461`) `Luke Conibear`_
+ + Added assert for comparisons ("B015") (:pr:`9459`) `Luke
+ Conibear`_
+ + Set top-level default shell in CI workflows (:pr:`9469`) `James
+ Bourbeau`_
+ + Removed unused loop control variables ("B007") (:pr:`9458`)
+ `Luke Conibear`_
+ + Replaced "getattr" calls for constant attributes ("B009")
+ (:pr:`9460`) `Luke Conibear`_
+ + Pin "libprotobuf" to allow nightly "pyarrow" in the upstream CI
+ build (:pr:`9465`) `Joris Van den Bossche`_
+ + Replaced mutable data structures for default arguments ("B006")
+ (:pr:`9462`) `Luke Conibear`_
+ + Changed "flake8" mirror and updated version (:pr:`9456`) `Luke
+ Conibear`_
+
+-------------------------------------------------------------------
Old:
----
dask-2022.9.0.tar.gz
New:
----
dask-2022.10.0.tar.gz
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Other differences:
------------------
++++++ python-dask.spec ++++++
--- /var/tmp/diff_new_pack.EyIYoT/_old 2022-10-25 11:20:39.066222248 +0200
+++ /var/tmp/diff_new_pack.EyIYoT/_new 2022-10-25 11:20:39.074222266 +0200
@@ -43,7 +43,7 @@
%define skip_python2 1
Name: python-dask%{psuffix}
# ===> Note: python-dask MUST be updated in sync with python-distributed! <===
-Version: 2022.9.0
+Version: 2022.10.0
Release: 0
Summary: Minimal task scheduling abstraction
License: BSD-3-Clause
@@ -61,6 +61,8 @@
Requires: python-packaging >= 20.0
Requires: python-partd >= 0.3.10
Requires: python-toolz >= 0.8.2
+Requires(post): update-alternatives
+Requires(postun):update-alternatives
Recommends: %{name}-array = %{version}
Recommends: %{name}-bag = %{version}
Recommends: %{name}-dataframe = %{version}
@@ -77,15 +79,15 @@
Recommends: python-pyarrow >= 0.14.0
Recommends: python-s3fs >= 0.4.0
Recommends: python-xxhash
-Suggests: %{name}-all = %{version}
+Suggests: %{name}-complete = %{version}
Suggests: %{name}-diagnostics = %{version}
Provides: %{name}-multiprocessing = %{version}-%{release}
Obsoletes: %{name}-multiprocessing < %{version}-%{release}
BuildArch: noarch
%if %{with test}
# test that we specified all requirements correctly in the core
-# and subpackages by only requiring dask-all and optional extras
-BuildRequires: %{python_module dask-all = %{version}}
+# and subpackages by only requiring dask-complete and optional extras
+BuildRequires: %{python_module dask-complete = %{version}}
BuildRequires: %{python_module pytest-rerunfailures}
BuildRequires: %{python_module pytest-xdist}
BuildRequires: %{python_module pytest}
@@ -127,7 +129,7 @@
larger-than-memory or distributed environments. These parallel collections
run on top of dynamic task schedulers.
-%package all
+%package complete
# This must have a Requires for dask and all the dask subpackages
Summary: All dask components
Requires: %{name} = %{version}
@@ -138,8 +140,10 @@
Requires: %{name}-diagnostics = %{version}
Requires: %{name}-distributed = %{version}
Requires: %{name}-dot = %{version}
+Provides: %{name}-all = %{version}-%{release}
+Obsoletes: %{name}-all < %{version}-%{release}
-%description all
+%description complete
A flexible library for parallel computing in Python.
Dask is composed of two parts:
@@ -320,6 +324,7 @@
%install
%if !%{with test}
%python_install
+%python_clone -a %{buildroot}%{_bindir}/dask
%{python_expand # give SUSE specific install instructions
sed -E -i '/Please either conda or pip install/,/python -m pip install/ {
s/either conda or pip//;
@@ -363,10 +368,17 @@
%pytest --pyargs dask -n auto -r fE -m "not network" -k "not ($donttest)" --reruns 3 --reruns-delay 3
%endif
+%post
+%python_install_alternative dask
+
+%postun
+%python_uninstall_alternative dask
+
%if !%{with test}
%files %{python_files}
%doc README.rst
%license LICENSE.txt
+%python_alternative %{_bindir}/dask
%{python_sitelib}/dask/
%{python_sitelib}/dask-%{version}*-info
%exclude %{python_sitelib}/dask/array/
@@ -378,7 +390,7 @@
%pycache_only %exclude %{python_sitelib}/dask/__pycache__/delayed*.pyc
%pycache_only %exclude %{python_sitelib}/dask/__pycache__/dot.*
-%files %{python_files all}
+%files %{python_files complete}
%license LICENSE.txt
%files %{python_files array}
++++++ dask-2022.9.0.tar.gz -> dask-2022.10.0.tar.gz ++++++
/work/SRC/openSUSE:Factory/python-dask/dask-2022.9.0.tar.gz /work/SRC/openSUSE:Factory/.python-dask.new.2275/dask-2022.10.0.tar.gz differ: char 5, line 1
Script 'mail_helper' called by obssrc
Hello community,
here is the log from the commit of package python-sparse for openSUSE:Factory checked in at 2022-10-25 11:20:12
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/python-sparse (Old)
and /work/SRC/openSUSE:Factory/.python-sparse.new.2275 (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Package is "python-sparse"
Tue Oct 25 11:20:12 2022 rev:11 rq:1030992 version:0.13.0
Changes:
--------
--- /work/SRC/openSUSE:Factory/python-sparse/python-sparse.changes 2022-07-15 13:53:00.835573810 +0200
+++ /work/SRC/openSUSE:Factory/.python-sparse.new.2275/python-sparse.changes 2022-10-25 11:20:39.274222708 +0200
@@ -1,0 +2,7 @@
+Fri Oct 21 08:39:20 UTC 2022 - Matej Cepl <mcepl(a)suse.com>
+
+- Add skip-32bit-archs.patch skipping a failing test on 32bit arch
+ (gh#pydata/sparse#490).
+- Also remove conditional python_module definition.
+
+-------------------------------------------------------------------
New:
----
skip-32bit-archs.patch
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Other differences:
------------------
++++++ python-sparse.spec ++++++
--- /var/tmp/diff_new_pack.c58UOe/_old 2022-10-25 11:20:39.798223869 +0200
+++ /var/tmp/diff_new_pack.c58UOe/_new 2022-10-25 11:20:39.802223879 +0200
@@ -16,7 +16,6 @@
#
-%{?!python_module:%define python_module() python3-%{**}}
%define skip_python2 1
Name: python-sparse
Version: 0.13.0
@@ -26,6 +25,9 @@
Group: Development/Languages/Python
URL: https://github.com/pydata/sparse
Source: https://files.pythonhosted.org/packages/source/s/sparse/sparse-%{version}.t…
+# PATCH-FIX-UPSTREAM skip-32bit-archs.patch gh#pydata/sparse#490 mcepl(a)suse.com
+# Skip some tests on 32bit architecture
+Patch0: skip-32bit-archs.patch
BuildRequires: %{python_module setuptools}
# SECTION test requirements
BuildRequires: %{python_module dask-array}
++++++ skip-32bit-archs.patch ++++++
---
sparse/tests/test_coo.py | 3 +++
1 file changed, 3 insertions(+)
--- a/sparse/tests/test_coo.py
+++ b/sparse/tests/test_coo.py
@@ -1,6 +1,7 @@
import contextlib
import operator
import pickle
+import platform
import sys
from functools import reduce
@@ -14,6 +15,8 @@ from sparse import COO
from sparse._settings import NEP18_ENABLED
from sparse._utils import assert_eq, random_value_array, html_table
+pytestmark = pytest.mark.skipif(platform.architecture()[0] == '32bit',
+ reason='Fails on 32bit arch (gh#pydata/sparse#490)')
@pytest.fixture(scope="module", params=["f8", "f4", "i8", "i4"])
def random_sparse(request):
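The module-level `pytestmark` skip in the patch above keys off `platform.architecture()`. A stdlib-only sketch of what that check evaluates to on a build host:

```python
import platform

# platform.architecture() returns a (bits, linkage) tuple such as
# ('64bit', 'ELF'); the patch compares the first element to '32bit'
# to skip the whole test module on 32-bit builds.
bits, _linkage = platform.architecture()
is_32bit = bits == "32bit"
print(bits, is_32bit)
```

On 64-bit OBS workers the comparison is False, so the test suite still runs there in full.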
Script 'mail_helper' called by obssrc
Hello community,
here is the log from the commit of package python-distributed for openSUSE:Factory checked in at 2022-10-25 11:20:10
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/python-distributed (Old)
and /work/SRC/openSUSE:Factory/.python-distributed.new.2275 (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Package is "python-distributed"
Tue Oct 25 11:20:10 2022 rev:60 rq:1030988 version:2022.10.0
Changes:
--------
--- /work/SRC/openSUSE:Factory/python-distributed/python-distributed.changes 2022-09-12 19:08:30.558580362 +0200
+++ /work/SRC/openSUSE:Factory/.python-distributed.new.2275/python-distributed.changes 2022-10-25 11:20:36.866217371 +0200
@@ -1,0 +2,104 @@
+Fri Oct 21 13:22:30 UTC 2022 - Ben Greiner <code(a)bnavigator.de>
+
+- Update to version 2022.10.0
+ * Use of new dask CLI (GH#6735) Doug Davis
+ * Refactor occupancy (GH#7075) Hendrik Makait
+ * Expose managed/unmanaged/spilled memory to Prometheus (GH#7112)
+ crusaderky
+ * Round up saturation-factor (GH#7116) Gabe Joseph
+ * Return default on KeyError at any level in get_metadata
+ (GH#7109) Hendrik Makait
+ * Count task states per task prefix and expose to Prometheus
+ (GH#7088) Nat Tabris
+ * Add scheduler-sni option for dask workers (GH#6290) Burt
+ Holzman
+ * Improve exception catching in UCX communication (GH#7132) Peter
+ Andreas Entschev
+ * Improve robustness of PipInstall plugin (GH#7111) Hendrik
+ Makait
+ * Fix dependencies that should point to dask/dask (GH#7138) James
+ Bourbeau
+ * Hold on to z.sum() until test completes (GH#7136) Lawrence
+ Mitchell
+ * Update typing for system_monitor after python/typeshed#8829
+ (GH#7131) Lawrence Mitchell
+ * Fix two potentially flaky queuing tests (GH#7124) Gabe Joseph
+ * Revamp SystemMonitor (GH#7097) crusaderky
+ * Adjust hardware benchmarks bokeh test (GH#7096) Florian Jetter
+ * Multi-platform mypy checks (GH#7094) crusaderky
+
+-------------------------------------------------------------------
+Fri Sep 30 23:19:52 UTC 2022 - Arun Persaud <arun(a)gmx.de>
+
+- update to version 2022.9.2:
+ * Enhancements
+ + Smarter stealing with dependencies (GH#7024) Hendrik Makait
+ + Enable Active Memory Manager by default (GH#7042) crusaderky
+ + Allow timeout strings in distributed.wait (GH#7081) James
+ Bourbeau
+ + Make AMM memory measure configurable (GH#7062) crusaderky
+ + AMM support for actors (GH#7072) crusaderky
+ + Expose message-bytes-limit in config (GH#7074) Hendrik Makait
+ + Detect mismatching Python version in scheduler (GH#7018) Hendrik
+ Makait
+ + Improve KilledWorker message users see (GH#7043) James Bourbeau
+ + Support for cgroups v2 and respect soft limits (GH#7051)
+ Samantha Hughes
+ * Bug Fixes
+ + Catch BaseException on UCX read error (GH#6996) Peter Andreas
+ Entschev
+ + Fix transfer limiting in _select_keys_for_gather (GH#7071)
+ Hendrik Makait
+ + Parse worker-saturation if a string (GH#7064) Gabe Joseph
+ + Nanny(config=...) parameter overlays global dask config
+ (GH#7069) crusaderky
+    + Ensure default clients don't propagate to subprocesses (GH#7028)
+ Florian Jetter
+ * Documentation
+ + Improve documentation of message-bytes-limit (GH#7077) Hendrik
+ Makait
+ + Minor tweaks to Sphinx documentation (GH#7041) crusaderky
+ + Improve upload_file API documentation (GH#7040) Florian Jetter
+ * Maintenance
+ + test_serialize_numba: Workaround issue with np.empty_like in NP
+ 1.23 (GH#7089) Graham Markall
+ + Type platform constants for mypy (GH#7091) jakirkham
+ + dask-worker-space (GH#7054) crusaderky
+ + Remove failing test case (GH#7087) Hendrik Makait
+ + test_default_client (GH#7058) crusaderky
+ + Fix pre-commit fails with recent versions of mypy and pandas
+ (GH#7068) crusaderky
+ + Add factorization utility (GH#7048) James Bourbeau
+
+-------------------------------------------------------------------
+Fri Sep 16 19:55:34 UTC 2022 - Arun Persaud <arun(a)gmx.de>
+
+- update to version 2022.9.1:
+ * Enhancements
+ + Add dashboard component for size of open data transfers
+ (GH#6982) Hendrik Makait
+ + Allow very fast keys and very expensive transfers as stealing
+ candidates (GH#7022) Florian Jetter
+ * Bug Fixes
+ + No longer double count transfer cost in stealing (GH#7036)
+ Hendrik Makait
+ * Maintenance
+ + Make test_wait_first_completed robust (GH#7039) Florian Jetter
+ + Partial annotations for SchedulerState (GH#7023) crusaderky
+ + Add more type annotations to stealing.py (GH#7009) Florian
+ Jetter
+ + Update codecov settings (GH#7015) Florian Jetter
+ + Speed up test_balance (GH#7008) Florian Jetter
+ + Fix test report after queuing job added (GH#7012) Gabe Joseph
+ + Clean up env variables in Gihub Actions (GH#7001) crusaderky
+ + Make test_steal_reschedule_reset_in_flight_occupancy non timing
+ dependent (GH#7010) Florian Jetter
+ + Replaced distributed.utils.key_split with dask.utils.key_split
+ (GH#7005) Luke Conibear
+    + Revert "Revert "Limit incoming data transfers by amount of data"
+      (GH#6994)" (:pr:`7007`) Florian Jetter
+ + CI job running tests with queuing on (GH#6989) Gabe Joseph
+ + Fix distributed/tests/test_client_executor.py::test_wait
+ (GH#6990) Florian Jetter
+
+-------------------------------------------------------------------
Old:
----
distributed-2022.9.0-gh.tar.gz
New:
----
distributed-2022.10.0-gh.tar.gz
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Other differences:
------------------
++++++ python-distributed.spec ++++++
--- /var/tmp/diff_new_pack.1WGbhQ/_old 2022-10-25 11:20:37.610219021 +0200
+++ /var/tmp/diff_new_pack.1WGbhQ/_new 2022-10-25 11:20:37.614219030 +0200
@@ -45,7 +45,7 @@
%bcond_with paralleltests
Name: python-distributed%{psuffix}
# ===> Note: python-dask MUST be updated in sync with python-distributed! <===
-Version: 2022.9.0
+Version: 2022.10.0
Release: 0
Summary: Library for distributed computing with Python
License: BSD-3-Clause
@@ -56,7 +56,7 @@
Patch1: distributed-ignore-offline.patch
# PATCH-FIX-OPENSUSE distributed-ignore-thread-leaks.patch -- ignore leaking threads on obs, code(a)bnavigator.de
Patch2: distributed-ignore-thread-leaks.patch
-# PATCh-FIX-OPENSUSE Ignore two deprecations introduced by Tornado 6.2
+# PATCH-FIX-OPENSUSE Ignore two deprecations introduced by Tornado 6.2
Patch3: support-tornado-6-2.patch
BuildRequires: %{python_module base >= 3.8}
BuildRequires: %{python_module setuptools}
@@ -82,12 +82,11 @@
BuildArch: noarch
%if %{with test}
BuildRequires: %{python_module bokeh}
-BuildRequires: %{python_module dask-all = %{version}}
+BuildRequires: %{python_module dask-complete = %{version}}
BuildRequires: %{python_module distributed = %{version}}
BuildRequires: %{python_module ipykernel}
BuildRequires: %{python_module ipython}
BuildRequires: %{python_module jupyter_client}
-BuildRequires: %{python_module pytest-asyncio >= 0.17.2}
BuildRequires: %{python_module pytest-rerunfailures}
BuildRequires: %{python_module pytest-timeout}
BuildRequires: %{python_module pytest}
@@ -155,19 +154,18 @@
donttest+=" or (test_worker and test_worker_reconnects_mid_compute)"
# server-side fail due to the non-network warning in a subprocess where the patched filter does not apply
donttest+=" or (test_client and test_quiet_close_process)"
-
-# Exception messages not caught -- https://github.com/dask/distributed/issues/5460#issuecomment-1079432890
-python310_donttest+=" or test_exception_text"
-python310_donttest+=" or test_worker_bad_args"
-python310_donttest+=" or test_run_spec_deserialize_fail"
+# creates OOM aborts on some obs workers
+donttest+=" or (test_steal and steal_communication_heavy_tasks)"
if [[ $(getconf LONG_BIT) -eq 32 ]]; then
# OverflowError -- https://github.com/dask/distributed/issues/5252
donttest+=" or test_ensure_spilled_immediately"
donttest+=" or test_value_raises_during_spilling"
donttest+=" or test_fail_to_pickle_execute_1"
- # https://github.com/dask/distributed/issues/6718
- python310_donttest+=" or (test_profile and test_basic)"
+ # https://github.com/dask/distributed/issues/7174
+ donttest+=" or (test_steal and steal_communication_heavy_tasks)"
+ # https://github.com/dask/distributed/issues/7175
+ donttest+=" or (test_sizeof_error and larger)"
fi
%if %{with paralleltests}
++++++ distributed-2022.9.0-gh.tar.gz -> distributed-2022.10.0-gh.tar.gz ++++++
++++ 11942 lines of diff (skipped)
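The `getconf LONG_BIT` guard in the spec's %check section above selects extra test deselections on 32-bit workers. A stdlib-only Python sketch of the same check (an illustrative analogue, not what the spec itself runs):

```python
import struct

# Width of a native pointer in bits: 32 on 32-bit workers, 64 otherwise.
# This mirrors what `getconf LONG_BIT` reports in the %check section.
long_bit = struct.calcsize("P") * 8
print(long_bit)
```

On the usual 64-bit OBS workers this prints 64, so the extra OverflowError-related deselections are skipped.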
Script 'mail_helper' called by obssrc
Hello community,
here is the log from the commit of package python-ecdsa for openSUSE:Factory checked in at 2022-10-25 11:20:08
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/python-ecdsa (Old)
and /work/SRC/openSUSE:Factory/.python-ecdsa.new.2275 (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Package is "python-ecdsa"
Tue Oct 25 11:20:08 2022 rev:15 rq:1030999 version:0.18.0
Changes:
--------
--- /work/SRC/openSUSE:Factory/python-ecdsa/python-ecdsa.changes 2021-10-20 20:24:02.573369134 +0200
+++ /work/SRC/openSUSE:Factory/.python-ecdsa.new.2275/python-ecdsa.changes 2022-10-25 11:20:35.222213728 +0200
@@ -1,0 +2,29 @@
+Mon Oct 24 17:14:10 UTC 2022 - Ben Greiner <code(a)bnavigator.de>
+
+- Update to 0.18.0
+ * New features:
+ + Support for EdDSA (Ed25519, Ed448) signature creation and
+ verification.
+ + Support for Ed25519 and Ed448 in PKCS#8 and public key files.
+ + Support for point precomputation for EdDSA.
+ * New API:
+ + CurveEdTw class to represent the Twisted Edwards curve
+ parameters.
+ + PointEdwards class to represent points on Twisted Edwards
+ curve and provide point arithmetic on it.
+ + curve_by_name in curves module to get a Curve object by
+ providing curve name.
+ * Bug fix:
+ + Accept private EdDSA keys that include public key in the
+ ASN.1 structure.
+ + Fix incompatibility with Python 3.3 in handling of
+ memoryviews of empty strings.
+ + Make the VerifyingKey encoded with explicit parameters use
+ the same kind of point encoding for public key and curve
+ generator.
+ + Better handling of malformed curve parameters (as in
+ CVE-2022-0778); make python-ecdsa raise MalformedPointError
+ instead of AssertionError.
+- Also remove the conditional definition of python_module.
+
+-------------------------------------------------------------------
Old:
----
ecdsa-0.17.0.tar.gz
New:
----
ecdsa-0.18.0.tar.gz
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Other differences:
------------------
++++++ python-ecdsa.spec ++++++
--- /var/tmp/diff_new_pack.JFp0m5/_old 2022-10-25 11:20:35.822215058 +0200
+++ /var/tmp/diff_new_pack.JFp0m5/_new 2022-10-25 11:20:35.826215066 +0200
@@ -1,7 +1,7 @@
#
# spec file for package python-ecdsa
#
-# Copyright (c) 2021 SUSE LLC
+# Copyright (c) 2022 SUSE LLC
#
# All modifications and additions to the file contributed by third parties
# remain the property of their copyright owners, unless otherwise agreed
@@ -16,13 +16,12 @@
#
-%{?!python_module:%define python_module() python-%{**} python3-%{**}}
Name: python-ecdsa
-Version: 0.17.0
+Version: 0.18.0
Release: 0
Summary: ECDSA cryptographic signature library (pure python)
License: MIT
-URL: https://github.com/warner/python-ecdsa
+URL: https://github.com/tlsfuzzer/python-ecdsa
Source: https://files.pythonhosted.org/packages/source/e/ecdsa/ecdsa-%{version}.tar…
BuildRequires: %{python_module hypothesis}
BuildRequires: %{python_module pytest}
@@ -31,7 +30,7 @@
BuildRequires: fdupes
BuildRequires: openssl
BuildRequires: python-rpm-macros
-Requires: python-six
+Requires: python-six >= 1.9.0
Suggests: python-gmpy
Suggests: python-gmpy2
BuildArch: noarch
@@ -59,11 +58,16 @@
%python_expand %fdupes %{buildroot}%{$python_sitelib}
%check
-%pytest
+# unfortunate hypothesis fuzzing (gh#warner/python-ecdsa#307):
+donttest="(test_ecdsa and test_sig_verify)"
+donttest="$donttest or (test_jacobi and test_add and scale_points)"
+donttest="$donttest or (test_ellipticcurve and test_p192_mult_tests)"
+%pytest -k "not ($donttest)"
%files %{python_files}
%license LICENSE
%doc NEWS README.md
-%{python_sitelib}/*
+%{python_sitelib}/ecdsa
+%{python_sitelib}/ecdsa-%{version}*-info
%changelog
++++++ ecdsa-0.17.0.tar.gz -> ecdsa-0.18.0.tar.gz ++++++
++++ 8404 lines of diff (skipped)
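The %check section above assembles a pytest `-k` deselection expression in shell. A small Python sketch of the expression pytest ultimately receives (test names taken from the spec; the assembly itself is only illustrative):

```python
# Rebuild the -k expression from the spec's %check section.
donttest = "(test_ecdsa and test_sig_verify)"
donttest += " or (test_jacobi and test_add and scale_points)"
donttest += " or (test_ellipticcurve and test_p192_mult_tests)"
k_expr = f"not ({donttest})"
print(k_expr)
```

pytest evaluates this boolean expression against each test identifier, so all three hypothesis-fuzzing-sensitive tests are deselected in one pass.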
Script 'mail_helper' called by obssrc
Hello community,
here is the log from the commit of package python-Flask-HTMLmin for openSUSE:Factory checked in at 2022-10-25 11:20:09
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/python-Flask-HTMLmin (Old)
and /work/SRC/openSUSE:Factory/.python-Flask-HTMLmin.new.2275 (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Package is "python-Flask-HTMLmin"
Tue Oct 25 11:20:09 2022 rev:7 rq:1030923 version:2.2.0
Changes:
--------
--- /work/SRC/openSUSE:Factory/python-Flask-HTMLmin/python-Flask-HTMLmin.changes 2021-12-20 21:06:46.418955320 +0100
+++ /work/SRC/openSUSE:Factory/.python-Flask-HTMLmin.new.2275/python-Flask-HTMLmin.changes 2022-10-25 11:20:36.122215722 +0200
@@ -1,0 +2,13 @@
+Sat Feb 19 19:21:30 UTC 2022 - Arun Persaud <arun(a)gmx.de>
+
+- specfile:
+ * update copyright year
+
+- update to version 2.2.0:
+ * Added CSS minification
+ * Create codeql-analysis.yml
+ * Add python 3.9 to setup classifiers
+ * Fix tests
+ * Bump urllib3 from 1.26.3 to 1.26.5
+
+-------------------------------------------------------------------
Old:
----
v2.1.0.tar.gz
New:
----
v2.2.0.tar.gz
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Other differences:
------------------
++++++ python-Flask-HTMLmin.spec ++++++
--- /var/tmp/diff_new_pack.sWs1da/_old 2022-10-25 11:20:36.514216591 +0200
+++ /var/tmp/diff_new_pack.sWs1da/_new 2022-10-25 11:20:36.526216618 +0200
@@ -1,7 +1,7 @@
#
# spec file for package python-Flask-HTMLmin
#
-# Copyright (c) 2021 SUSE LLC
+# Copyright (c) 2022 SUSE LLC
#
# All modifications and additions to the file contributed by third parties
# remain the property of their copyright owners, unless otherwise agreed
@@ -19,7 +19,7 @@
%{?!python_module:%define python_module() python-%{**} python3-%{**}}
%define skip_python2 1
Name: python-Flask-HTMLmin
-Version: 2.1.0
+Version: 2.2.0
Release: 0
Summary: Flask minifier for HTML responses
License: BSD-3-Clause
@@ -27,12 +27,14 @@
Source: https://github.com/hamidfzm/Flask-HTMLmin/archive/v%{version}.tar.gz
Patch0: remove-pytest-runner.patch
BuildRequires: %{python_module Flask}
+BuildRequires: %{python_module cssmin}
BuildRequires: %{python_module htmlmin}
BuildRequires: %{python_module pytest-cov}
BuildRequires: %{python_module setuptools}
BuildRequires: fdupes
BuildRequires: python-rpm-macros
Requires: python-Flask
+Requires: python-cssmin
Requires: python-htmlmin
BuildArch: noarch
%python_subpackages
++++++ v2.1.0.tar.gz -> v2.2.0.tar.gz ++++++
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/Flask-HTMLmin-2.1.0/.github/workflows/codeql-analysis.yml new/Flask-HTMLmin-2.2.0/.github/workflows/codeql-analysis.yml
--- old/Flask-HTMLmin-2.1.0/.github/workflows/codeql-analysis.yml 1970-01-01 01:00:00.000000000 +0100
+++ new/Flask-HTMLmin-2.2.0/.github/workflows/codeql-analysis.yml 2021-10-18 13:57:35.000000000 +0200
@@ -0,0 +1,71 @@
+# For most projects, this workflow file will not need changing; you simply need
+# to commit it to your repository.
+#
+# You may wish to alter this file to override the set of languages analyzed,
+# or to provide custom queries or build logic.
+#
+# ******** NOTE ********
+# We have attempted to detect the languages in your repository. Please check
+# the `language` matrix defined below to confirm you have the correct set of
+# supported CodeQL languages.
+#
+name: "CodeQL"
+
+on:
+ push:
+ branches: [ master ]
+ pull_request:
+ # The branches below must be a subset of the branches above
+ branches: [ master ]
+ schedule:
+ - cron: '41 23 * * 6'
+
+jobs:
+ analyze:
+ name: Analyze
+ runs-on: ubuntu-latest
+ permissions:
+ actions: read
+ contents: read
+ security-events: write
+
+ strategy:
+ fail-fast: false
+ matrix:
+ language: [ 'python' ]
+ # CodeQL supports [ 'cpp', 'csharp', 'go', 'java', 'javascript', 'python' ]
+ # Learn more:
+ # https://docs.github.com/en/free-pro-team@latest/github/finding-security-vul…
+
+ steps:
+ - name: Checkout repository
+ uses: actions/checkout@v2
+
+ # Initializes the CodeQL tools for scanning.
+ - name: Initialize CodeQL
+ uses: github/codeql-action/init@v1
+ with:
+ languages: ${{ matrix.language }}
+ # If you wish to specify custom queries, you can do so here or in a config file.
+ # By default, queries listed here will override any specified in a config file.
+ # Prefix the list here with "+" to use these queries and those in the config file.
+ # queries: ./path/to/local/query, your-org/your-repo/queries@main
+
+ # Autobuild attempts to build any compiled languages (C/C++, C#, or Java).
+ # If this step fails, then you should remove it and run the build manually (see below)
+ - name: Autobuild
+ uses: github/codeql-action/autobuild@v1
+
+          #   ℹ️ Command-line programs to run using the OS shell.
+          #   📚 https://git.io/JvXDl
+
+          #   ✏️ If the Autobuild fails above, remove it and uncomment the following three lines
+ # and modify them (or add more) to build your code if your project
+ # uses a compiled language
+
+ #- run: |
+ # make bootstrap
+ # make release
+
+ - name: Perform CodeQL Analysis
+ uses: github/codeql-action/analyze@v1
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/Flask-HTMLmin-2.1.0/.github/workflows/tests.yml new/Flask-HTMLmin-2.2.0/.github/workflows/tests.yml
--- old/Flask-HTMLmin-2.1.0/.github/workflows/tests.yml 2021-02-10 12:52:30.000000000 +0100
+++ new/Flask-HTMLmin-2.2.0/.github/workflows/tests.yml 2021-10-18 13:57:35.000000000 +0200
@@ -8,7 +8,7 @@
runs-on: ubuntu-latest
strategy:
matrix:
- python-version: [3.6, 3.7, 3.8]
+ python-version: [3.6, 3.7, 3.8, 3.9]
steps:
- uses: actions/checkout@v2
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/Flask-HTMLmin-2.1.0/Pipfile new/Flask-HTMLmin-2.2.0/Pipfile
--- old/Flask-HTMLmin-2.1.0/Pipfile 2021-02-10 12:52:30.000000000 +0100
+++ new/Flask-HTMLmin-2.2.0/Pipfile 2021-10-18 13:57:35.000000000 +0200
@@ -14,6 +14,7 @@
[packages]
flask = "*"
htmlmin = "*"
+cssmin = "*"
[requires]
python_version = "3.7"
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/Flask-HTMLmin-2.1.0/Pipfile.lock new/Flask-HTMLmin-2.2.0/Pipfile.lock
--- old/Flask-HTMLmin-2.1.0/Pipfile.lock 2021-02-10 12:52:30.000000000 +0100
+++ new/Flask-HTMLmin-2.2.0/Pipfile.lock 2021-10-18 13:57:35.000000000 +0200
@@ -18,10 +18,10 @@
"default": {
"click": {
"hashes": [
- "sha256:d2b5255c7c6349bc1bd1e59e08cd12acbbd63ce649f2588755783aa94dfb6b1a",
- "sha256:dacca89f4bfadd5de3d7489b7c8a566eee0d3676333fbb50030263894c38c0dc"
+ "sha256:8c04c11192119b1ef78ea049e0a6f0463e4c48ef00a30160c704337586f3ad7a",
+ "sha256:fba402a4a47334742d782209a7c79bc448911afe1149d07bdabdf480b3e2f4b6"
],
- "version": "==7.1.2"
+ "version": "==8.0.1"
},
"flask": {
"hashes": [
@@ -38,92 +38,98 @@
"index": "pypi",
"version": "==0.1.12"
},
+ "importlib-metadata": {
+ "hashes": [
+ "sha256:960d52ba7c21377c990412aca380bf3642d734c2eaab78a2c39319f67c6a5786",
+ "sha256:e592faad8de1bda9fe920cf41e15261e7131bcf266c30306eec00e8e225c1dd5"
+ ],
+ "markers": "python_version < '3.8'",
+ "version": "==4.4.0"
+ },
"itsdangerous": {
"hashes": [
- "sha256:321b033d07f2a4136d3ec762eac9f16a10ccd60f53c0c91af90217ace7ba1f19",
- "sha256:b12271b2047cb23eeb98c8b5622e2e5c5e9abd9784a153e9d8ef9cb4dd09d749"
+ "sha256:5174094b9637652bdb841a3029700391451bd092ba3db90600dea710ba28e97c",
+ "sha256:9e724d68fc22902a1435351f84c3fb8623f303fffcc566a4cb952df8c572cff0"
],
- "version": "==1.1.0"
+ "version": "==2.0.1"
},
"jinja2": {
"hashes": [
- "sha256:03e47ad063331dd6a3f04a43eddca8a966a26ba0c5b7207a9a9e4e08f1b29419",
- "sha256:a6d58433de0ae800347cab1fa3043cebbabe8baa9d29e668f1c768cb87a333c6"
+ "sha256:1f06f2da51e7b56b8f238affdd6b4e2c61e39598a378cc49345bc1bd42a978a4",
+ "sha256:703f484b47a6af502e743c9122595cc812b0271f661722403114f71a79d0f5a4"
],
- "version": "==2.11.3"
+ "version": "==3.0.1"
},
"markupsafe": {
"hashes": [
- "sha256:00bc623926325b26bb9605ae9eae8a215691f33cae5df11ca5424f06f2d1f473",
- "sha256:09027a7803a62ca78792ad89403b1b7a73a01c8cb65909cd876f7fcebd79b161",
- "sha256:09c4b7f37d6c648cb13f9230d847adf22f8171b1ccc4d5682398e77f40309235",
- "sha256:1027c282dad077d0bae18be6794e6b6b8c91d58ed8a8d89a89d59693b9131db5",
- "sha256:13d3144e1e340870b25e7b10b98d779608c02016d5184cfb9927a9f10c689f42",
- "sha256:195d7d2c4fbb0ee8139a6cf67194f3973a6b3042d742ebe0a9ed36d8b6f0c07f",
- "sha256:22c178a091fc6630d0d045bdb5992d2dfe14e3259760e713c490da5323866c39",
- "sha256:24982cc2533820871eba85ba648cd53d8623687ff11cbb805be4ff7b4c971aff",
- "sha256:29872e92839765e546828bb7754a68c418d927cd064fd4708fab9fe9c8bb116b",
- "sha256:2beec1e0de6924ea551859edb9e7679da6e4870d32cb766240ce17e0a0ba2014",
- "sha256:3b8a6499709d29c2e2399569d96719a1b21dcd94410a586a18526b143ec8470f",
- "sha256:43a55c2930bbc139570ac2452adf3d70cdbb3cfe5912c71cdce1c2c6bbd9c5d1",
- "sha256:46c99d2de99945ec5cb54f23c8cd5689f6d7177305ebff350a58ce5f8de1669e",
- "sha256:500d4957e52ddc3351cabf489e79c91c17f6e0899158447047588650b5e69183",
- "sha256:535f6fc4d397c1563d08b88e485c3496cf5784e927af890fb3c3aac7f933ec66",
- "sha256:596510de112c685489095da617b5bcbbac7dd6384aeebeda4df6025d0256a81b",
- "sha256:62fe6c95e3ec8a7fad637b7f3d372c15ec1caa01ab47926cfdf7a75b40e0eac1",
- "sha256:6788b695d50a51edb699cb55e35487e430fa21f1ed838122d722e0ff0ac5ba15",
- "sha256:6dd73240d2af64df90aa7c4e7481e23825ea70af4b4922f8ede5b9e35f78a3b1",
- "sha256:6f1e273a344928347c1290119b493a1f0303c52f5a5eae5f16d74f48c15d4a85",
- "sha256:6fffc775d90dcc9aed1b89219549b329a9250d918fd0b8fa8d93d154918422e1",
- "sha256:717ba8fe3ae9cc0006d7c451f0bb265ee07739daf76355d06366154ee68d221e",
- "sha256:79855e1c5b8da654cf486b830bd42c06e8780cea587384cf6545b7d9ac013a0b",
- "sha256:7c1699dfe0cf8ff607dbdcc1e9b9af1755371f92a68f706051cc8c37d447c905",
- "sha256:7fed13866cf14bba33e7176717346713881f56d9d2bcebab207f7a036f41b850",
- "sha256:84dee80c15f1b560d55bcfe6d47b27d070b4681c699c572af2e3c7cc90a3b8e0",
- "sha256:88e5fcfb52ee7b911e8bb6d6aa2fd21fbecc674eadd44118a9cc3863f938e735",
- "sha256:8defac2f2ccd6805ebf65f5eeb132adcf2ab57aa11fdf4c0dd5169a004710e7d",
- "sha256:98bae9582248d6cf62321dcb52aaf5d9adf0bad3b40582925ef7c7f0ed85fceb",
- "sha256:98c7086708b163d425c67c7a91bad6e466bb99d797aa64f965e9d25c12111a5e",
- "sha256:9add70b36c5666a2ed02b43b335fe19002ee5235efd4b8a89bfcf9005bebac0d",
- "sha256:9bf40443012702a1d2070043cb6291650a0841ece432556f784f004937f0f32c",
- "sha256:a6a744282b7718a2a62d2ed9d993cad6f5f585605ad352c11de459f4108df0a1",
- "sha256:acf08ac40292838b3cbbb06cfe9b2cb9ec78fce8baca31ddb87aaac2e2dc3bc2",
- "sha256:ade5e387d2ad0d7ebf59146cc00c8044acbd863725f887353a10df825fc8ae21",
- "sha256:b00c1de48212e4cc9603895652c5c410df699856a2853135b3967591e4beebc2",
- "sha256:b1282f8c00509d99fef04d8ba936b156d419be841854fe901d8ae224c59f0be5",
- "sha256:b1dba4527182c95a0db8b6060cc98ac49b9e2f5e64320e2b56e47cb2831978c7",
- "sha256:b2051432115498d3562c084a49bba65d97cf251f5a331c64a12ee7e04dacc51b",
- "sha256:b7d644ddb4dbd407d31ffb699f1d140bc35478da613b441c582aeb7c43838dd8",
- "sha256:ba59edeaa2fc6114428f1637ffff42da1e311e29382d81b339c1817d37ec93c6",
- "sha256:bf5aa3cbcfdf57fa2ee9cd1822c862ef23037f5c832ad09cfea57fa846dec193",
- "sha256:c8716a48d94b06bb3b2524c2b77e055fb313aeb4ea620c8dd03a105574ba704f",
- "sha256:caabedc8323f1e93231b52fc32bdcde6db817623d33e100708d9a68e1f53b26b",
- "sha256:cd5df75523866410809ca100dc9681e301e3c27567cf498077e8551b6d20e42f",
- "sha256:cdb132fc825c38e1aeec2c8aa9338310d29d337bebbd7baa06889d09a60a1fa2",
- "sha256:d53bc011414228441014aa71dbec320c66468c1030aae3a6e29778a3382d96e5",
- "sha256:d73a845f227b0bfe8a7455ee623525ee656a9e2e749e4742706d80a6065d5e2c",
- "sha256:d9be0ba6c527163cbed5e0857c451fcd092ce83947944d6c14bc95441203f032",
- "sha256:e249096428b3ae81b08327a63a485ad0878de3fb939049038579ac0ef61e17e7",
- "sha256:e8313f01ba26fbbe36c7be1966a7b7424942f670f38e666995b88d012765b9be",
- "sha256:feb7b34d6325451ef96bc0e36e1a6c0c1c64bc1fbec4b854f4529e51887b1621"
+ "sha256:01a9b8ea66f1658938f65b93a85ebe8bc016e6769611be228d797c9d998dd298",
+ "sha256:023cb26ec21ece8dc3907c0e8320058b2e0cb3c55cf9564da612bc325bed5e64",
+ "sha256:0446679737af14f45767963a1a9ef7620189912317d095f2d9ffa183a4d25d2b",
+ "sha256:0717a7390a68be14b8c793ba258e075c6f4ca819f15edfc2a3a027c823718567",
+ "sha256:0955295dd5eec6cb6cc2fe1698f4c6d84af2e92de33fbcac4111913cd100a6ff",
+ "sha256:10f82115e21dc0dfec9ab5c0223652f7197feb168c940f3ef61563fc2d6beb74",
+ "sha256:1d609f577dc6e1aa17d746f8bd3c31aa4d258f4070d61b2aa5c4166c1539de35",
+ "sha256:2ef54abee730b502252bcdf31b10dacb0a416229b72c18b19e24a4509f273d26",
+ "sha256:3c112550557578c26af18a1ccc9e090bfe03832ae994343cfdacd287db6a6ae7",
+ "sha256:47ab1e7b91c098ab893b828deafa1203de86d0bc6ab587b160f78fe6c4011f75",
+ "sha256:49e3ceeabbfb9d66c3aef5af3a60cc43b85c33df25ce03d0031a608b0a8b2e3f",
+ "sha256:4efca8f86c54b22348a5467704e3fec767b2db12fc39c6d963168ab1d3fc9135",
+ "sha256:53edb4da6925ad13c07b6d26c2a852bd81e364f95301c66e930ab2aef5b5ddd8",
+ "sha256:594c67807fb16238b30c44bdf74f36c02cdf22d1c8cda91ef8a0ed8dabf5620a",
+ "sha256:611d1ad9a4288cf3e3c16014564df047fe08410e628f89805e475368bd304914",
+ "sha256:6557b31b5e2c9ddf0de32a691f2312a32f77cd7681d8af66c2692efdbef84c18",
+ "sha256:693ce3f9e70a6cf7d2fb9e6c9d8b204b6b39897a2c4a1aa65728d5ac97dcc1d8",
+ "sha256:6a7fae0dd14cf60ad5ff42baa2e95727c3d81ded453457771d02b7d2b3f9c0c2",
+ "sha256:6c4ca60fa24e85fe25b912b01e62cb969d69a23a5d5867682dd3e80b5b02581d",
+ "sha256:7d91275b0245b1da4d4cfa07e0faedd5b0812efc15b702576d103293e252af1b",
+ "sha256:905fec760bd2fa1388bb5b489ee8ee5f7291d692638ea5f67982d968366bef9f",
+ "sha256:97383d78eb34da7e1fa37dd273c20ad4320929af65d156e35a5e2d89566d9dfb",
+ "sha256:984d76483eb32f1bcb536dc27e4ad56bba4baa70be32fa87152832cdd9db0833",
+ "sha256:a30e67a65b53ea0a5e62fe23682cfe22712e01f453b95233b25502f7c61cb415",
+ "sha256:ab3ef638ace319fa26553db0624c4699e31a28bb2a835c5faca8f8acf6a5a902",
+ "sha256:b2f4bf27480f5e5e8ce285a8c8fd176c0b03e93dcc6646477d4630e83440c6a9",
+ "sha256:b7f2d075102dc8c794cbde1947378051c4e5180d52d276987b8d28a3bd58c17d",
+ "sha256:be98f628055368795d818ebf93da628541e10b75b41c559fdf36d104c5787066",
+ "sha256:d7f9850398e85aba693bb640262d3611788b1f29a79f0c93c565694658f4071f",
+ "sha256:f5653a225f31e113b152e56f154ccbe59eeb1c7487b39b9d9f9cdb58e6c79dc5",
+ "sha256:f826e31d18b516f653fe296d967d700fddad5901ae07c622bb3705955e1faa94",
+ "sha256:f8ba0e8349a38d3001fae7eadded3f6606f0da5d748ee53cc1dab1d6527b9509",
+ "sha256:f9081981fe268bd86831e5c75f7de206ef275defcb82bc70740ae6dc507aee51",
+ "sha256:fa130dd50c57d53368c9d59395cb5526eda596d3ffe36666cd81a44d56e48872"
],
- "version": "==1.1.1"
+ "version": "==2.0.1"
+ },
+ "typing-extensions": {
+ "hashes": [
+ "sha256:0ac0f89795dd19de6b97debb0c6af1c70987fd80a2d62d1958f7e56fcc31b497",
+ "sha256:50b6f157849174217d0656f99dc82fe932884fb250826c18350e159ec6cdf342",
+ "sha256:779383f6086d90c99ae41cf0ff39aac8a7937a9283ce0a414e5dd782f4c94a84"
+ ],
+ "markers": "python_version < '3.8'",
+ "version": "==3.10.0.0"
},
"werkzeug": {
"hashes": [
- "sha256:2de2a5db0baeae7b2d2664949077c2ac63fbd16d98da0ff71837f7d1dea3fd43",
- "sha256:6c80b1e5ad3665290ea39320b91e1be1e0d5f60652b964a3070216de83d2e47c"
+ "sha256:1de1db30d010ff1af14a009224ec49ab2329ad2cde454c8a708130642d579c42",
+ "sha256:6c1ec500dcdba0baa27600f6a22f6333d8b662d22027ff9f6202e3367413caa8"
+ ],
+ "version": "==2.0.1"
+ },
+ "zipp": {
+ "hashes": [
+ "sha256:3607921face881ba3e026887d8150cca609d517579abe052ac81fc5aeffdbd76",
+ "sha256:51cb66cc54621609dd593d1787f286ee42a5c0adbb4b29abea5a63edc3e03098"
],
- "version": "==1.0.1"
+ "version": "==3.4.1"
}
},
"develop": {
"attrs": {
"hashes": [
- "sha256:31b2eced602aa8423c2aea9c76a724617ed67cf9513173fd3a4f03e3a929c7e6",
- "sha256:832aa3cde19744e49938b91fea06d69ecb9e649c93ba974535d08ad92164f700"
+ "sha256:149e90d6d8ac20db7a955ad60cf0e6881a3f20d37096140088356da6c716b0b1",
+ "sha256:ef6aaac3ca6cd92904cdd0d83f629a15f18053ec84e6432106f7a4d04ae4f5fb"
],
- "version": "==20.3.0"
+ "version": "==21.2.0"
},
"bleach": {
"hashes": [
@@ -134,52 +140,64 @@
},
"certifi": {
"hashes": [
- "sha256:1a4995114262bffbc2413b159f2a1a480c969de6e6eb13ee966d470af86af59c",
- "sha256:719a74fb9e33b9bd44cc7f3a8d94bc35e4049deebe19ba7d8e108280cfd59830"
+ "sha256:2bbf76fd432960138b3ef6dda3dde0544f27cbf8546c458e60baf371917ba9ee",
+ "sha256:50b1e4f8446b06f41be7dd6338db18e0990601dce795c2b1686458aa7e8fa7d8"
],
- "version": "==2020.12.5"
+ "version": "==2021.5.30"
},
"cffi": {
"hashes": [
- "sha256:00a1ba5e2e95684448de9b89888ccd02c98d512064b4cb987d48f4b40aa0421e",
- "sha256:00e28066507bfc3fe865a31f325c8391a1ac2916219340f87dfad602c3e48e5d",
- "sha256:045d792900a75e8b1e1b0ab6787dd733a8190ffcf80e8c8ceb2fb10a29ff238a",
- "sha256:0638c3ae1a0edfb77c6765d487fee624d2b1ee1bdfeffc1f0b58c64d149e7eec",
- "sha256:105abaf8a6075dc96c1fe5ae7aae073f4696f2905fde6aeada4c9d2926752362",
- "sha256:155136b51fd733fa94e1c2ea5211dcd4c8879869008fc811648f16541bf99668",
- "sha256:1a465cbe98a7fd391d47dce4b8f7e5b921e6cd805ef421d04f5f66ba8f06086c",
- "sha256:1d2c4994f515e5b485fd6d3a73d05526aa0fcf248eb135996b088d25dfa1865b",
- "sha256:2c24d61263f511551f740d1a065eb0212db1dbbbbd241db758f5244281590c06",
- "sha256:51a8b381b16ddd370178a65360ebe15fbc1c71cf6f584613a7ea08bfad946698",
- "sha256:594234691ac0e9b770aee9fcdb8fa02c22e43e5c619456efd0d6c2bf276f3eb2",
- "sha256:5cf4be6c304ad0b6602f5c4e90e2f59b47653ac1ed9c662ed379fe48a8f26b0c",
- "sha256:64081b3f8f6f3c3de6191ec89d7dc6c86a8a43911f7ecb422c60e90c70be41c7",
- "sha256:6bc25fc545a6b3d57b5f8618e59fc13d3a3a68431e8ca5fd4c13241cd70d0009",
- "sha256:798caa2a2384b1cbe8a2a139d80734c9db54f9cc155c99d7cc92441a23871c03",
- "sha256:7c6b1dece89874d9541fc974917b631406233ea0440d0bdfbb8e03bf39a49b3b",
- "sha256:7ef7d4ced6b325e92eb4d3502946c78c5367bc416398d387b39591532536734e",
- "sha256:840793c68105fe031f34d6a086eaea153a0cd5c491cde82a74b420edd0a2b909",
- "sha256:8d6603078baf4e11edc4168a514c5ce5b3ba6e3e9c374298cb88437957960a53",
- "sha256:9cc46bc107224ff5b6d04369e7c595acb700c3613ad7bcf2e2012f62ece80c35",
- "sha256:9f7a31251289b2ab6d4012f6e83e58bc3b96bd151f5b5262467f4bb6b34a7c26",
- "sha256:9ffb888f19d54a4d4dfd4b3f29bc2c16aa4972f1c2ab9c4ab09b8ab8685b9c2b",
- "sha256:a5ed8c05548b54b998b9498753fb9cadbfd92ee88e884641377d8a8b291bcc01",
- "sha256:a7711edca4dcef1a75257b50a2fbfe92a65187c47dab5a0f1b9b332c5919a3fb",
- "sha256:af5c59122a011049aad5dd87424b8e65a80e4a6477419c0c1015f73fb5ea0293",
- "sha256:b18e0a9ef57d2b41f5c68beefa32317d286c3d6ac0484efd10d6e07491bb95dd",
- "sha256:b4e248d1087abf9f4c10f3c398896c87ce82a9856494a7155823eb45a892395d",
- "sha256:ba4e9e0ae13fc41c6b23299545e5ef73055213e466bd107953e4a013a5ddd7e3",
- "sha256:c6332685306b6417a91b1ff9fae889b3ba65c2292d64bd9245c093b1b284809d",
- "sha256:d5ff0621c88ce83a28a10d2ce719b2ee85635e85c515f12bac99a95306da4b2e",
- "sha256:d9efd8b7a3ef378dd61a1e77367f1924375befc2eba06168b6ebfa903a5e59ca",
- "sha256:df5169c4396adc04f9b0a05f13c074df878b6052430e03f50e68adf3a57aa28d",
- "sha256:ebb253464a5d0482b191274f1c8bf00e33f7e0b9c66405fbffc61ed2c839c775",
- "sha256:ec80dc47f54e6e9a78181ce05feb71a0353854cc26999db963695f950b5fb375",
- "sha256:f032b34669220030f905152045dfa27741ce1a6db3324a5bc0b96b6c7420c87b",
- "sha256:f60567825f791c6f8a592f3c6e3bd93dd2934e3f9dac189308426bd76b00ef3b",
- "sha256:f803eaa94c2fcda012c047e62bc7a51b0bdabda1cad7a92a522694ea2d76e49f"
+ "sha256:005a36f41773e148deac64b08f233873a4d0c18b053d37da83f6af4d9087b813",
+ "sha256:04c468b622ed31d408fea2346bec5bbffba2cc44226302a0de1ade9f5ea3d373",
+ "sha256:06d7cd1abac2ffd92e65c0609661866709b4b2d82dd15f611e602b9b188b0b69",
+ "sha256:06db6321b7a68b2bd6df96d08a5adadc1fa0e8f419226e25b2a5fbf6ccc7350f",
+ "sha256:0857f0ae312d855239a55c81ef453ee8fd24136eaba8e87a2eceba644c0d4c06",
+ "sha256:0f861a89e0043afec2a51fd177a567005847973be86f709bbb044d7f42fc4e05",
+ "sha256:1071534bbbf8cbb31b498d5d9db0f274f2f7a865adca4ae429e147ba40f73dea",
+ "sha256:158d0d15119b4b7ff6b926536763dc0714313aa59e320ddf787502c70c4d4bee",
+ "sha256:1bf1ac1984eaa7675ca8d5745a8cb87ef7abecb5592178406e55858d411eadc0",
+ "sha256:1f436816fc868b098b0d63b8920de7d208c90a67212546d02f84fe78a9c26396",
+ "sha256:24a570cd11895b60829e941f2613a4f79df1a27344cbbb82164ef2e0116f09c7",
+ "sha256:24ec4ff2c5c0c8f9c6b87d5bb53555bf267e1e6f70e52e5a9740d32861d36b6f",
+ "sha256:2894f2df484ff56d717bead0a5c2abb6b9d2bf26d6960c4604d5c48bbc30ee73",
+ "sha256:29314480e958fd8aab22e4a58b355b629c59bf5f2ac2492b61e3dc06d8c7a315",
+ "sha256:293e7ea41280cb28c6fcaaa0b1aa1f533b8ce060b9e701d78511e1e6c4a1de76",
+ "sha256:34eff4b97f3d982fb93e2831e6750127d1355a923ebaeeb565407b3d2f8d41a1",
+ "sha256:35f27e6eb43380fa080dccf676dece30bef72e4a67617ffda586641cd4508d49",
+ "sha256:3c3f39fa737542161d8b0d680df2ec249334cd70a8f420f71c9304bd83c3cbed",
+ "sha256:3d3dd4c9e559eb172ecf00a2a7517e97d1e96de2a5e610bd9b68cea3925b4892",
+ "sha256:43e0b9d9e2c9e5d152946b9c5fe062c151614b262fda2e7b201204de0b99e482",
+ "sha256:48e1c69bbacfc3d932221851b39d49e81567a4d4aac3b21258d9c24578280058",
+ "sha256:51182f8927c5af975fece87b1b369f722c570fe169f9880764b1ee3bca8347b5",
+ "sha256:58e3f59d583d413809d60779492342801d6e82fefb89c86a38e040c16883be53",
+ "sha256:5de7970188bb46b7bf9858eb6890aad302577a5f6f75091fd7cdd3ef13ef3045",
+ "sha256:65fa59693c62cf06e45ddbb822165394a288edce9e276647f0046e1ec26920f3",
+ "sha256:681d07b0d1e3c462dd15585ef5e33cb021321588bebd910124ef4f4fb71aef55",
+ "sha256:69e395c24fc60aad6bb4fa7e583698ea6cc684648e1ffb7fe85e3c1ca131a7d5",
+ "sha256:6c97d7350133666fbb5cf4abdc1178c812cb205dc6f41d174a7b0f18fb93337e",
+ "sha256:6e4714cc64f474e4d6e37cfff31a814b509a35cb17de4fb1999907575684479c",
+ "sha256:72d8d3ef52c208ee1c7b2e341f7d71c6fd3157138abf1a95166e6165dd5d4369",
+ "sha256:8ae6299f6c68de06f136f1f9e69458eae58f1dacf10af5c17353eae03aa0d827",
+ "sha256:8b198cec6c72df5289c05b05b8b0969819783f9418e0409865dac47288d2a053",
+ "sha256:99cd03ae7988a93dd00bcd9d0b75e1f6c426063d6f03d2f90b89e29b25b82dfa",
+ "sha256:9cf8022fb8d07a97c178b02327b284521c7708d7c71a9c9c355c178ac4bbd3d4",
+ "sha256:9de2e279153a443c656f2defd67769e6d1e4163952b3c622dcea5b08a6405322",
+ "sha256:9e93e79c2551ff263400e1e4be085a1210e12073a31c2011dbbda14bda0c6132",
+ "sha256:9ff227395193126d82e60319a673a037d5de84633f11279e336f9c0f189ecc62",
+ "sha256:a465da611f6fa124963b91bf432d960a555563efe4ed1cc403ba5077b15370aa",
+ "sha256:ad17025d226ee5beec591b52800c11680fca3df50b8b29fe51d882576e039ee0",
+ "sha256:afb29c1ba2e5a3736f1c301d9d0abe3ec8b86957d04ddfa9d7a6a42b9367e396",
+ "sha256:b85eb46a81787c50650f2392b9b4ef23e1f126313b9e0e9013b35c15e4288e2e",
+ "sha256:bb89f306e5da99f4d922728ddcd6f7fcebb3241fc40edebcb7284d7514741991",
+ "sha256:cbde590d4faaa07c72bf979734738f328d239913ba3e043b1e98fe9a39f8b2b6",
+ "sha256:cc5a8e069b9ebfa22e26d0e6b97d6f9781302fe7f4f2b8776c3e1daea35f1adc",
+ "sha256:cd2868886d547469123fadc46eac7ea5253ea7fcb139f12e1dfc2bbd406427d1",
+ "sha256:d42b11d692e11b6634f7613ad8df5d6d5f8875f5d48939520d351007b3c13406",
+ "sha256:df5052c5d867c1ea0b311fb7c3cd28b19df469c056f7fdcfe88c7473aa63e333",
+ "sha256:f2d45f97ab6bb54753eab54fffe75aaf3de4ff2341c9daee1987ee1837636f1d",
+ "sha256:fd78e5fee591709f32ef6edb9a015b4aa1a5022598e36227500c8f4e02328d9c"
],
- "version": "==1.14.4"
+ "version": "==1.14.5"
},
"chardet": {
"hashes": [
@@ -227,30 +245,27 @@
},
"cryptography": {
"hashes": [
- "sha256:0d7b69674b738068fa6ffade5c962ecd14969690585aaca0a1b1fc9058938a72",
- "sha256:1bd0ccb0a1ed775cd7e2144fe46df9dc03eefd722bbcf587b3e0616ea4a81eff",
- "sha256:3c284fc1e504e88e51c428db9c9274f2da9f73fdf5d7e13a36b8ecb039af6e6c",
- "sha256:49570438e60f19243e7e0d504527dd5fe9b4b967b5a1ff21cc12b57602dd85d3",
- "sha256:541dd758ad49b45920dda3b5b48c968f8b2533d8981bcdb43002798d8f7a89ed",
- "sha256:5a60d3780149e13b7a6ff7ad6526b38846354d11a15e21068e57073e29e19bed",
- "sha256:7951a966613c4211b6612b0352f5bf29989955ee592c4a885d8c7d0f830d0433",
- "sha256:922f9602d67c15ade470c11d616f2b2364950602e370c76f0c94c94ae672742e",
- "sha256:a0f0b96c572fc9f25c3f4ddbf4688b9b38c69836713fb255f4a2715d93cbaf44",
- "sha256:a777c096a49d80f9d2979695b835b0f9c9edab73b59e4ceb51f19724dda887ed",
- "sha256:a9a4ac9648d39ce71c2f63fe7dc6db144b9fa567ddfc48b9fde1b54483d26042",
- "sha256:aa4969f24d536ae2268c902b2c3d62ab464b5a66bcb247630d208a79a8098e9b",
- "sha256:c7390f9b2119b2b43160abb34f63277a638504ef8df99f11cb52c1fda66a2e6f",
- "sha256:e18e6ab84dfb0ab997faf8cca25a86ff15dfea4027b986322026cc99e0a892da"
+ "sha256:0f1212a66329c80d68aeeb39b8a16d54ef57071bf22ff4e521657b27372e327d",
+ "sha256:1e056c28420c072c5e3cb36e2b23ee55e260cb04eee08f702e0edfec3fb51959",
+ "sha256:240f5c21aef0b73f40bb9f78d2caff73186700bf1bc6b94285699aff98cc16c6",
+ "sha256:26965837447f9c82f1855e0bc8bc4fb910240b6e0d16a664bb722df3b5b06873",
+ "sha256:37340614f8a5d2fb9aeea67fd159bfe4f5f4ed535b1090ce8ec428b2f15a11f2",
+ "sha256:3d10de8116d25649631977cb37da6cbdd2d6fa0e0281d014a5b7d337255ca713",
+ "sha256:3d8427734c781ea5f1b41d6589c293089704d4759e34597dce91014ac125aad1",
+ "sha256:7ec5d3b029f5fa2b179325908b9cd93db28ab7b85bb6c1db56b10e0b54235177",
+ "sha256:8e56e16617872b0957d1c9742a3f94b43533447fd78321514abbe7db216aa250",
+ "sha256:de4e5f7f68220d92b7637fc99847475b59154b7a1b3868fb7385337af54ac9ca",
+ "sha256:eb8cc2afe8b05acbd84a43905832ec78e7b3873fb124ca190f574dca7389a87d",
+ "sha256:ee77aa129f481be46f8d92a1a7db57269a2f23052d5f2433b4621bb457081cc9"
],
- "index": "pypi",
- "version": "==3.3.2"
+ "version": "==3.4.7"
},
"docutils": {
"hashes": [
- "sha256:0c5b78adfbf7762415433f5515cd5c9e762339e23369dbe8000d84a4bf4ab3af",
- "sha256:c2de3a60e9e7d07be26b7f2b00ca0309c207e06c100f9cc2a94931fc75a478fc"
+ "sha256:686577d2e4c32380bb50cbb22f575ed742d58168cee37e99117a854bcd88f125",
+ "sha256:cf316c8370a737a022b72b56874f6602acf974a37a9fba42ec2876387549fc61"
],
- "version": "==0.16"
+ "version": "==0.17.1"
},
"entrypoints": {
"hashes": [
@@ -276,11 +291,11 @@
},
"importlib-metadata": {
"hashes": [
- "sha256:ace61d5fc652dc280e7b6b4ff732a9c2d40db2c0f92bc6cb74e07b73d53a1771",
- "sha256:fa5daa4477a7414ae34e95942e4dd07f62adf589143c875c133c1e53c4eff38d"
+ "sha256:960d52ba7c21377c990412aca380bf3642d734c2eaab78a2c39319f67c6a5786",
+ "sha256:e592faad8de1bda9fe920cf41e15261e7131bcf266c30306eec00e8e225c1dd5"
],
"markers": "python_version < '3.8'",
- "version": "==3.4.0"
+ "version": "==4.4.0"
},
"jeepney": {
"hashes": [
@@ -292,10 +307,10 @@
},
"keyring": {
"hashes": [
- "sha256:9acb3e1452edbb7544822b12fd25459078769e560fa51f418b6d00afaa6178df",
- "sha256:9f44660a5d4931bdc14c08a1d01ef30b18a7a8147380710d8c9f9531e1f6c3c0"
+ "sha256:045703609dd3fccfcdb27da201684278823b72af515aedec1a8515719a038cb8",
+ "sha256:8f607d7d1cc502c43a932a275a56fe47db50271904513a379d39df1af277ac48"
],
- "version": "==22.0.1"
+ "version": "==23.0.1"
},
"mccabe": {
"hashes": [
@@ -306,10 +321,10 @@
},
"more-itertools": {
"hashes": [
- "sha256:5652a9ac72209ed7df8d9c15daf4e1aa0e3d2ccd3c87f8265a0673cd9cbc9ced",
- "sha256:c5d6da9ca3ff65220c3bfd2a8db06d698f05d4d2b9be57e1deb2be5a45019713"
+ "sha256:2cf89ec599962f2ddc4d568a05defc40e0a587fbc10d5989713638864c36be4d",
+ "sha256:83f0308e05477c68f56ea3a888172c78ed5d5b3c282addb67508e7ba6c8f813a"
],
- "version": "==8.7.0"
+ "version": "==8.8.0"
},
"packaging": {
"hashes": [
@@ -362,10 +377,10 @@
},
"pygments": {
"hashes": [
- "sha256:bc9591213a8f0e0ca1a5e68a479b4887fdc3e75d0774e5c71c31920c427de435",
- "sha256:df49d09b498e83c1a73128295860250b0b7edd4c723a32e9bc0d295c7c2ec337"
+ "sha256:a18f47b506a429f6f4b9df81bb02beab9ca21d0a5fee38ed15aef65f0545519f",
+ "sha256:d66e804411278594d764fc69ec36ec13d9ae9147193a1740cd34d272ca383b8e"
],
- "version": "==2.7.4"
+ "version": "==2.9.0"
},
"pyparsing": {
"hashes": [
@@ -392,10 +407,10 @@
},
"readme-renderer": {
"hashes": [
- "sha256:267854ac3b1530633c2394ead828afcd060fc273217c42ac36b6be9c42cd9a9d",
- "sha256:6b7e5aa59210a40de72eb79931491eaf46fefca2952b9181268bd7c7c65c260a"
+ "sha256:63b4075c6698fcfa78e584930f07f39e05d46f3ec97f65006e430b595ca6348c",
+ "sha256:92fd5ac2bf8677f310f3303aa4bce5b9d5f9f2094ab98c29f13791d7b805a3db"
],
- "version": "==28.0"
+ "version": "==29.0"
},
"requests": {
"hashes": [
@@ -421,17 +436,17 @@
},
"six": {
"hashes": [
- "sha256:30639c035cdb23534cd4aa2dd52c3bf48f06e5f4a941509c8bafd8ce11080259",
- "sha256:8b74bedcbbbaca38ff6d7491d76f2b06b3592611af620f8426e82dddb04a5ced"
+ "sha256:1e61c37477a1626458e36f7b1d82aa5c9b094fa4802892072e49de9c60c4c926",
+ "sha256:8abb2f1d86890a2dfb989f9a77cfcfd3e47c2a354b01111771326f8aa26e0254"
],
- "version": "==1.15.0"
+ "version": "==1.16.0"
},
"tqdm": {
"hashes": [
- "sha256:2874fa525c051177583ec59c0fb4583e91f28ccd3f217ffad2acdb32d2c789ac",
- "sha256:ab9b659241d82b8b51b2269ee243ec95286046bf06015c4e15a947cc15914211"
+ "sha256:736524215c690621b06fc89d0310a49822d75e599fcd0feb7cc742b98d692493",
+ "sha256:cd5791b5d7c3f2f1819efc81d36eb719a38e0906a7380365c556779f585ea042"
],
- "version": "==4.56.1"
+ "version": "==4.61.0"
},
"twine": {
"hashes": [
@@ -443,19 +458,20 @@
},
"typing-extensions": {
"hashes": [
- "sha256:7cb407020f00f7bfc3cb3e7881628838e69d8f3fcab2f64742a5e76b2f841918",
- "sha256:99d4073b617d30288f569d3f13d2bd7548c3a7e4c8de87db09a9d29bb3a4a60c",
- "sha256:dafc7639cde7f1b6e1acc0f457842a83e722ccca8eef5270af2d74792619a89f"
+ "sha256:0ac0f89795dd19de6b97debb0c6af1c70987fd80a2d62d1958f7e56fcc31b497",
+ "sha256:50b6f157849174217d0656f99dc82fe932884fb250826c18350e159ec6cdf342",
+ "sha256:779383f6086d90c99ae41cf0ff39aac8a7937a9283ce0a414e5dd782f4c94a84"
],
"markers": "python_version < '3.8'",
- "version": "==3.7.4.3"
+ "version": "==3.10.0.0"
},
"urllib3": {
"hashes": [
- "sha256:1b465e494e3e0d8939b50680403e3aedaa2bc434b7d5af64dfd3c958d7f5ae80",
- "sha256:de3eedaad74a2683334e282005cd8d7f22f4d55fa690a2a1020a416cb0a47e73"
+ "sha256:753a0374df26658f99d826cfe40394a686d05985786d946fbe4165b5148f5a7c",
+ "sha256:a7acd0977125325f516bda9735fa7142b909a8d01e8b2e4c8108d0984e6e0098"
],
- "version": "==1.26.3"
+ "index": "pypi",
+ "version": "==1.26.5"
},
"wcwidth": {
"hashes": [
@@ -481,10 +497,10 @@
},
"zipp": {
"hashes": [
- "sha256:102c24ef8f171fd729d46599845e95c7ab894a4cf45f5de11a44cc7444fb1108",
- "sha256:ed5eee1974372595f9e416cc7bbeeb12335201d8081ca8a0743c954d4446e5cb"
+ "sha256:3607921face881ba3e026887d8150cca609d517579abe052ac81fc5aeffdbd76",
+ "sha256:51cb66cc54621609dd593d1787f286ee42a5c0adbb4b29abea5a63edc3e03098"
],
- "version": "==3.4.0"
+ "version": "==3.4.1"
}
}
}
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/Flask-HTMLmin-2.1.0/README.md new/Flask-HTMLmin-2.2.0/README.md
--- old/Flask-HTMLmin-2.1.0/README.md 2021-02-10 12:52:30.000000000 +0100
+++ new/Flask-HTMLmin-2.2.0/README.md 2021-10-18 13:57:35.000000000 +0200
@@ -2,7 +2,7 @@
Flask-HTMLmin
=============
[![PyPI version](https://badge.fury.io/py/Flask-HTMLmin.svg)](https://badge.fury.io/py/Flask-HTMLmin)
-![Supported Python Versions](https://img.shields.io/badge/python-3.6%20%7C%203.7%20%7C%203.8-blue.svg)
+![Supported Python Versions](https://img.shields.io/badge/python-3.6%20%7C%203.7%20%7C%203.8%20%7C%203.9-blue.svg)
[![License](https://img.shields.io/badge/License-BSD%203--Clause-orange.svg)](LICENSE)
![tests](https://github.com/hamidfzm/Flask-HTMLmin/workflows/tests/badge.svg)
[![codecov](https://codecov.io/gh/hamidfzm/Flask-HTMLmin/branch/master/graph/badge.svg)](https://codecov.io/gh/hamidfzm/Flask-HTMLmin)
@@ -46,7 +46,7 @@
# pass additional parameters to htmlmin
# HTMLMIN(app, **kwargs)
# example:
-# htmlmin = HTMLMIN(app, remove_comments=False, remove_empty_space=True)
+# htmlmin = HTMLMIN(app, remove_comments=False, remove_empty_space=True, disable_css_min=True)
@app.route('/')
@@ -71,5 +71,6 @@
- [x] Test cases
- [x] Route (or URL rule) exemption
- [x] Caching (in progress)
-- [ ] Minify inline CSS
+- [x] Minify inline CSS
- [ ] Minify inline Javascript
+- [ ] Type hints
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/Flask-HTMLmin-2.1.0/flask_htmlmin/__init__.py new/Flask-HTMLmin-2.2.0/flask_htmlmin/__init__.py
--- old/Flask-HTMLmin-2.1.0/flask_htmlmin/__init__.py 2021-02-10 12:52:30.000000000 +0100
+++ new/Flask-HTMLmin-2.2.0/flask_htmlmin/__init__.py 2021-10-18 13:57:35.000000000 +0200
@@ -2,6 +2,8 @@
from htmlmin import Minifier
from flask import request, current_app
import warnings
+import cssmin
+import re
class HTMLMIN(object):
@@ -15,7 +17,10 @@
'reduce_empty_attributes': True,
'remove_optional_attribute_quotes': False
}
+
+ self.disable_css_min = kwargs.get('disable_css_min', False)
default_options.update(kwargs)
+ self.opts = default_options
self._exempt_routes = set()
self._html_minify = Minifier(
@@ -51,13 +56,68 @@
return response
response.direct_passthrough = False
- response.set_data(
- self._html_minify.minify(response.get_data(as_text=True))
- )
+ if self.disable_css_min:
+ response.set_data(
+ self._html_minify.minify(response.get_data(as_text=True))
+ )
+ else:
+ response.set_data(
+ self._css_minify(
+ self._html_minify.minify(
+ response.get_data(as_text=True)
+ )
+ )
+ )
return response
return response
+ def _css_minify(self, response):
+ """
+ Minify inline css
+ """
+
+ # Minify internal css
+ out = ''
+ text = response
+ opening_tags = re.findall(r"<style\s*[^>]*>", text, re.M | re.I)
+ for tag in opening_tags:
+ i = text.find(tag)+len(tag)-1
+ e = text.find("</style>")+9
+ css = text[i:e]
+ out += text[0:i] + self._min_css(css)
+ text = text[e:]
+ out = out+text
+
+ # Minify inline css
+ out2 = ''
+ inlines = re.findall(r"<[A-Za-z0-9-]+[^>]+?style=\"[\s\S]+?\"",
+ out, re.M | re.I)
+ for inline in inlines:
+ i = out.find(inline)
+ j = out[i:].find("style=")+7
+ k = out[i+j:].find('"')
+ css = out[i+j:i+j+k+1]
+ out2 += out[0:i+j] + re.sub(r";+\s*(?=(\"|\'))", "",
+ self._min_css(css), re.I | re.M)
+ out = out[i+j+k+1:]
+ out2 += out
+ return out2
+
+ def _min_css(self, css):
+ if self.opts.get("remove_comments"):
+ css = cssmin.remove_comments(css)
+ css = cssmin.condense_whitespace(css)
+ css = css.replace('"\\"}\\""', "___PSEUDOCLASSBMH___")
+ css = cssmin.remove_unnecessary_whitespace(css)
+ css = cssmin.remove_unnecessary_semicolons(css)
+ css = cssmin.condense_zero_units(css)
+ css = cssmin.condense_multidimensional_zeros(css)
+ css = cssmin.condense_floating_points(css)
+ css = cssmin.normalize_rgb_colors_to_hex(css)
+ css = cssmin.condense_hex_colors(css)
+ return css
+
def exempt(self, obj):
"""
decorator to mark a view as exempt from htmlmin.
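The `_css_minify` method in the hunk above makes two passes over the rendered page: one over `<style>` blocks and one over inline `style="…"` attributes. A stdlib-only sketch of the first pass, with a trivial whitespace condenser standing in for the `cssmin` pipeline (the helper name and the stand-in minifier are illustrative assumptions, not the library's API):

```python
import re

def minify_style_blocks(html, css_min=None):
    """Minify the body of every <style>...</style> block in an HTML string.

    css_min is a hypothetical stand-in for the cssmin pipeline used by
    Flask-HTMLmin; the default merely condenses runs of whitespace.
    """
    if css_min is None:
        css_min = lambda css: re.sub(r'\s+', ' ', css).strip()

    def repl(match):
        open_tag, css = match.group(1), match.group(2)
        return open_tag + css_min(css) + '</style>'

    # A non-greedy body match pairs each opening tag with its nearest
    # closing tag, much like the find()-based scan in the diff, which
    # likewise assumes <style> tags are not nested.
    return re.sub(r'(<style\b[^>]*>)([\s\S]*?)</style>', repl, html,
                  flags=re.IGNORECASE)

print(minify_style_blocks('<html><style>\n .h {\n  width: 3em\n }\n</style></html>'))
# -> <html><style>.h { width: 3em }</style></html>
```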
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/Flask-HTMLmin-2.1.0/setup.py new/Flask-HTMLmin-2.2.0/setup.py
--- old/Flask-HTMLmin-2.1.0/setup.py 2021-02-10 12:52:30.000000000 +0100
+++ new/Flask-HTMLmin-2.2.0/setup.py 2021-10-18 13:57:35.000000000 +0200
@@ -14,7 +14,7 @@
setup(
name='Flask-HTMLmin',
- version='2.1.0',
+ version='2.2.0',
url='https://github.com/hamidfzm/Flask-HTMLmin',
license='BSD-3-Clause',
author='Hamid FzM',
@@ -29,7 +29,8 @@
platforms='any',
install_requires=[
'Flask',
- 'htmlmin'
+ 'htmlmin',
+ 'cssmin'
],
python_requires='>=3.6',
classifiers=[
@@ -42,6 +43,7 @@
'Programming Language :: Python :: 3.6',
'Programming Language :: Python :: 3.7',
'Programming Language :: Python :: 3.8',
+ 'Programming Language :: Python :: 3.9',
'Topic :: Internet :: WWW/HTTP :: Dynamic Content',
'Topic :: Software Development :: Libraries :: Python Modules',
'Topic :: Text Processing :: Markup :: HTML',
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/Flask-HTMLmin-2.1.0/test.py new/Flask-HTMLmin-2.2.0/test.py
--- old/Flask-HTMLmin-2.1.0/test.py 2021-02-10 12:52:30.000000000 +0100
+++ new/Flask-HTMLmin-2.2.0/test.py 2021-10-18 13:57:35.000000000 +0200
@@ -7,7 +7,7 @@
app = Flask(__name__)
app.config['TESTING'] = True
-app.config['MINIFY_PAGE'] = True
+app.config['MINIFY_HTML'] = True
htmlmin = HTMLMIN(app=app)
json_resp = dict(
@@ -15,8 +15,13 @@
)
html_resp = '''<html>
+ <style>
+ .h {
+ width: 3em
+ }
+ </style>
<body>
- <h1>
+ <h1 style="width: 3em;;">
HTML
</h1>
</body>
@@ -48,7 +53,8 @@
def test_html_minify(client):
""" testing HTML minified response """
resp = client.get('/').data
- assert b'<html> <body> <h1> HTML </h1> </body> </html>' == resp
+ assert b'<html> <style>.h{width:3em}</style><body>\
+ <h1 style="width:3em"> HTML </h1> </body> </html>' == resp
def test_json_unminifed(client):
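The updated assertion expects `style="width: 3em;;"` to come back as `style="width:3em"`, i.e. the inline pass also strips trailing semicolons before the closing quote. A simplified stdlib sketch of that cleanup (a stand-in for the semicolon-stripping `re.sub` step in the diff, not the exact library code):

```python
import re

def minify_inline_style(value):
    """Condense an inline style value: drop whitespace around ':' and ';',
    then strip any run of trailing semicolons."""
    css = re.sub(r'\s*([:;])\s*', r'\1', value.strip())
    return re.sub(r';+$', '', css)

print(minify_inline_style('width: 3em;;'))            # -> width:3em
print(minify_inline_style('color: red; margin: 0;'))  # -> color:red;margin:0
```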
Script 'mail_helper' called by obssrc
Hello community,
here is the log from the commit of package python-softlayer for openSUSE:Factory checked in at 2022-10-25 11:20:07
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/python-softlayer (Old)
and /work/SRC/openSUSE:Factory/.python-softlayer.new.2275 (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Package is "python-softlayer"
Tue Oct 25 11:20:07 2022 rev:17 rq:1030941 version:6.1.2
Changes:
--------
--- /work/SRC/openSUSE:Factory/python-softlayer/python-softlayer.changes 2022-10-16 16:09:20.726772946 +0200
+++ /work/SRC/openSUSE:Factory/.python-softlayer.new.2275/python-softlayer.changes 2022-10-25 11:20:34.174211404 +0200
@@ -1,0 +2,6 @@
+Mon Oct 17 18:04:18 UTC 2022 - Matej Cepl <mcepl(a)suse.com>
+
+- Add fix-maint-issue.patch (fixes bsc#1203311).
+- No, we actually don't need python-six at all.
+
+-------------------------------------------------------------------
New:
----
fix-maint-issue.patch
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Other differences:
------------------
++++++ python-softlayer.spec ++++++
--- /var/tmp/diff_new_pack.gTciNh/_old 2022-10-25 11:20:34.982213195 +0200
+++ /var/tmp/diff_new_pack.gTciNh/_new 2022-10-25 11:20:34.986213205 +0200
@@ -16,7 +16,6 @@
#
-%{?!python_module:%define python_module() python-%{**} python3-%{**}}
%define skip_python2 1
Name: python-softlayer
Version: 6.1.2
@@ -25,6 +24,9 @@
License: MIT
URL: https://github.com/softlayer/softlayer-python
Source: https://github.com/softlayer/softlayer-python/archive/v%{version}.tar.gz
+# PATCH-FIX-UPSTREAM fix-maint-issue.patch bsc#1203311 mcepl(a)suse.com
+# xmlrpc y2038 problem
+Patch0: fix-maint-issue.patch
BuildRequires: %{python_module click >= 8.0.4}
BuildRequires: %{python_module prettytable >= 2.5.0}
BuildRequires: %{python_module prompt_toolkit >= 2}
@@ -33,7 +35,6 @@
BuildRequires: %{python_module requests >= 2.20.0}
BuildRequires: %{python_module rich >= 12.5.1}
BuildRequires: %{python_module setuptools}
-BuildRequires: %{python_module six >= 1.7.0}
BuildRequires: %{python_module softlayer-zeep >= 5.0.0}
BuildRequires: %{python_module testtools}
BuildRequires: %{python_module typing_extensions}
@@ -47,7 +48,6 @@
Requires: python-requests >= 2.20.0
Requires: python-rich >= 12.5.1
Requires: python-setuptools
-Requires: python-six >= 1.7.0
Requires: python-softlayer-zeep >= 5.0.0
Requires: python-typing_extensions
Requires: python-urllib3 >= 1.24
@@ -60,7 +60,7 @@
This library provides a simple Python client to interact with SoftLayer's XML-RPC API.
%prep
-%setup -q -n softlayer-python-%{version}
+%autosetup -p1 -n softlayer-python-%{version}
%build
%python_build
++++++ fix-maint-issue.patch ++++++
From 392c38718c172a8f76c1f53fbe23ba2659cf320c Mon Sep 17 00:00:00 2001
From: Christopher Gallo <chrisagallo(a)gmail.com>
Date: Fri, 23 Sep 2022 13:31:01 -0500
Subject: [PATCH 1/4] updated release workflow for the correct url
---
.github/workflows/release.yml | 2
SoftLayer/CLI/cdn/detail.py | 2
SoftLayer/fixtures/SoftLayer_Network_CdnMarketplace_Configuration_Mapping.py | 6 +-
SoftLayer/fixtures/SoftLayer_Network_CdnMarketplace_Metrics.py | 6 +-
tests/CLI/modules/cdn_tests.py | 29 ++++------
tests/managers/cdn_tests.py | 19 +++++-
6 files changed, 37 insertions(+), 27 deletions(-)
--- a/.github/workflows/release.yml
+++ b/.github/workflows/release.yml
@@ -48,5 +48,5 @@ jobs:
with:
user: __token__
password: ${{ secrets.CGALLO_PYPI }}
- repository_url: https://pypi.org/legacy/
+ repository_url: https://upload.pypi.org/legacy/
--- a/SoftLayer/CLI/cdn/detail.py
+++ b/SoftLayer/CLI/cdn/detail.py
@@ -41,6 +41,6 @@ def cli(env, unique_id, history):
table.add_row(['status', cdn_mapping['status']])
table.add_row(['total_bandwidth', total_bandwidth])
table.add_row(['total_hits', total_hits])
- table.add_row(['hit_radio', hit_ratio])
+ table.add_row(['hit_ratio', hit_ratio])
env.fout(table)
--- a/SoftLayer/fixtures/SoftLayer_Network_CdnMarketplace_Configuration_Mapping.py
+++ b/SoftLayer/fixtures/SoftLayer_Network_CdnMarketplace_Configuration_Mapping.py
@@ -11,7 +11,7 @@ listDomainMappings = [
"path": "/",
"protocol": "HTTP",
"status": "CNAME_CONFIGURATION",
- "uniqueId": "9934111111111",
+ "uniqueId": "11223344",
"vendorName": "akamai"
}
]
@@ -28,7 +28,7 @@ listDomainMappingByUniqueId = [
"path": "/",
"protocol": "HTTP",
"status": "CNAME_CONFIGURATION",
- "uniqueId": "9934111111111",
+ "uniqueId": "11223344",
"vendorName": "akamai"
}
]
@@ -41,7 +41,7 @@ updateDomainMapping = [
"performanceConfiguration": "Large file optimization",
"protocol": "HTTP",
"respectHeaders": True,
- "uniqueId": "424406419091111",
+ "uniqueId": "11223344",
"vendorName": "akamai",
"header": "www.test.com",
"httpPort": 83,
--- a/SoftLayer/fixtures/SoftLayer_Network_CdnMarketplace_Metrics.py
+++ b/SoftLayer/fixtures/SoftLayer_Network_CdnMarketplace_Metrics.py
@@ -6,9 +6,9 @@ getMappingUsageMetrics = [
"HitRatio"
],
"totals": [
- "0.0",
- "0",
- "0.0"
+ 1.0,
+ 3,
+ 2.0
],
"type": "TOTALS"
}
--- a/tests/CLI/modules/cdn_tests.py
+++ b/tests/CLI/modules/cdn_tests.py
@@ -4,7 +4,9 @@
:license: MIT, see LICENSE for more details.
"""
+import datetime
import json
+from unittest import mock as mock
from SoftLayer.CLI import exceptions
from SoftLayer import testing
@@ -21,27 +23,22 @@ class CdnTests(testing.TestCase):
'domain': 'test.example.com',
'origin': '1.1.1.1',
'status': 'CNAME_CONFIGURATION',
- 'unique_id': '9934111111111',
+ 'unique_id': '11223344',
'vendor': 'akamai'}]
)
- def test_detail_account(self):
+ @mock.patch('SoftLayer.utils.days_to_datetime')
+ def test_detail_account(self, mock_now):
+ mock_now.return_value = datetime.datetime(2020, 1, 1)
result = self.run_command(['cdn', 'detail', '--history=30', '1245'])
self.assert_no_fail(result)
- self.assertEqual(json.loads(result.output),
- {'hit_radio': '0.0 %',
- 'hostname': 'test.example.com',
- 'origin': '1.1.1.1',
- 'origin_type': 'HOST_SERVER',
- 'path': '/',
- 'protocol': 'HTTP',
- 'provider': 'akamai',
- 'status': 'CNAME_CONFIGURATION',
- 'total_bandwidth': '0.0 GB',
- 'total_hits': '0',
- 'unique_id': '9934111111111'}
- )
+ api_results = json.loads(result.output)
+ self.assertEqual(api_results['hit_ratio'], '2.0 %')
+ self.assertEqual(api_results['total_bandwidth'], '1.0 GB')
+ self.assertEqual(api_results['total_hits'], 3)
+ self.assertEqual(api_results['hostname'], 'test.example.com')
+ self.assertEqual(api_results['protocol'], 'HTTP')
def test_purge_content(self):
result = self.run_command(['cdn', 'purge', '1234',
@@ -122,7 +119,7 @@ class CdnTests(testing.TestCase):
self.assertEqual('include: test', header_result['Cache key optimization'])
def test_edit_cache_by_uniqueId(self):
- result = self.run_command(['cdn', 'edit', '9934111111111', '--cache', 'include-specified', '--cache', 'test'])
+ result = self.run_command(['cdn', 'edit', '11223344', '--cache', 'include-specified', '--cache', 'test'])
self.assert_no_fail(result)
header_result = json.loads(result.output)
self.assertEqual('include: test', header_result['Cache key optimization'])
--- a/tests/managers/cdn_tests.py
+++ b/tests/managers/cdn_tests.py
@@ -4,6 +4,8 @@
:license: MIT, see LICENSE for more details.
"""
+import datetime
+from unittest import mock as mock
from SoftLayer import fixtures
from SoftLayer.managers import cdn
@@ -28,7 +30,9 @@ class CDNTests(testing.TestCase):
'listDomainMappingByUniqueId',
args=args)
- def test_detail_usage_metric(self):
+ @mock.patch('SoftLayer.utils.days_to_datetime')
+ def test_detail_usage_metric(self, mock_now):
+ mock_now.return_value = datetime.datetime(2020, 1, 1)
self.cdn_client.get_usage_metrics(12345, history=30, frequency="aggregate")
args = (12345,
@@ -39,6 +43,15 @@ class CDNTests(testing.TestCase):
'getMappingUsageMetrics',
args=args)
+ # Does this still work in 2038 ? https://github.com/softlayer/softlayer-python/issues/1764 for context
+ @mock.patch('SoftLayer.utils.days_to_datetime')
+ def test_detail_usage_metric_future(self, mock_now):
+ mock_now.return_value = datetime.datetime(2040, 1, 1)
+ self.assertRaises(
+ OverflowError,
+ self.cdn_client.get_usage_metrics, 12345, history=30, frequency="aggregate"
+ )
+
def test_get_origins(self):
self.cdn_client.get_origins("12345")
self.assert_called_with('SoftLayer_Network_CdnMarketplace_Configuration_Mapping_Path',
@@ -105,7 +118,7 @@ class CDNTests(testing.TestCase):
args=args)
def test_cdn_edit(self):
- identifier = '9934111111111'
+ identifier = '11223344'
header = 'www.test.com'
result = self.cdn_client.edit(identifier, header=header)
@@ -116,7 +129,7 @@ class CDNTests(testing.TestCase):
'SoftLayer_Network_CdnMarketplace_Configuration_Mapping',
'updateDomainMapping',
args=({
- 'uniqueId': '9934111111111',
+ 'uniqueId': '11223344',
'originType': 'HOST_SERVER',
'protocol': 'HTTP',
'path': '/',
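The patched tests above pin "now" by mocking `SoftLayer.utils.days_to_datetime`, which is what makes date-window assertions deterministic (and lets the 2040 case provoke `OverflowError` on 32-bit `time_t`). A minimal sketch of the same clock-injection pattern, using a made-up `report_window` helper rather than the real SoftLayer API:

```python
import datetime


def report_window(days, _now=datetime.datetime.utcnow):
    """Hypothetical helper: return (start, end) for a metrics query.

    The clock is injectable (here via _now; in the SoftLayer tests via
    the patched days_to_datetime), so assertions never depend on the
    wall clock of the machine running the test suite.
    """
    end = _now()
    return end - datetime.timedelta(days=days), end


# Pin "now" the same way the tests above pin days_to_datetime:
start, end = report_window(30, _now=lambda: datetime.datetime(2020, 1, 31))
assert start == datetime.datetime(2020, 1, 1)
```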
Script 'mail_helper' called by obssrc
Hello community,
here is the log from the commit of package jefferson for openSUSE:Factory checked in at 2022-10-25 11:20:06
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/jefferson (Old)
and /work/SRC/openSUSE:Factory/.jefferson.new.2275 (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Package is "jefferson"
Tue Oct 25 11:20:06 2022 rev:2 rq:1030940 version:0.4.1+git.20220705
Changes:
--------
--- /work/SRC/openSUSE:Factory/jefferson/jefferson.changes 2020-08-19 18:52:12.479700002 +0200
+++ /work/SRC/openSUSE:Factory/.jefferson.new.2275/jefferson.changes 2022-10-25 11:20:32.838208442 +0200
@@ -1,0 +2,42 @@
+Fri Oct 14 13:16:31 UTC 2022 - mardnh(a)gmx.de
+
+- Update to version 0.4.1+git.20220705:
+ * Fix is_safe_path call to use absolute path rather than
+ relative path to execution directory.
+ * Fix extraction of files with size greater than one erase block.
+ * Remove unnecessary log call.
+ * Keep xattr, xref, and summary nodetypes in order to properly
+ identify unknown node types.
+ * Remove handling of xref nodes, xattr nodes, summary nodes.
+ * Use inode indexed dicts for inodes, dirent, and xref entries.
+ * Simplify filesystem structure.
+ * Fix duplicate inodes handling.
+ * Fix support for python 3.10 by pinning python-lzo to 1.14.
+ * Better handling of decompression error + simpler endianness
+ logging.
+ * Memory-mapped file support
+ * Add support for LZO compression.
+ * Revert the symlink path traversal check as it does not present
+ a direct risk to normal end users. Those checks can be
+ implemented by other tools where required.
+ * Fix path traversal security vulnerability by canonicalizing
+ path names of every inodes and discarding inodes with a path
+ pointing outside of the extraction directory.
+ * Autodetect endianness rather than scan the filesystem twice,
+ one for each possible endianness. We make the assumption that
+ a JFFS2 has always a fixed endianness and that nodes won't
+ switch between endianness in the middle of a filesystem.
+ * Add support for JFFS2 old magic signature (0x1984).
+ * pin cstruct version to 2.1 so we don't end up with breaking
+ API changes in the future. Moving to more recent versions
+ should be done manually once it's been tested that jefferson
+ still works with the newly released version of cstruct.
+ * Fix set_endianness to support cstruct version 2.1.
+ * Converted python2 encode("hex") to python3 hex()
+ * Switched to lzma in python stdlib
+ * Convert to python3.
+- Drop patches:
+ * 18.patch
+ * jefferson-use-pylzma.patch
+
+-------------------------------------------------------------------
Old:
----
18.patch
jefferson-0.3+git.20160616.tar.xz
jefferson-use-pylzma.patch
New:
----
jefferson-0.4.1+git.20220705.tar.xz
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Other differences:
------------------
++++++ jefferson.spec ++++++
--- /var/tmp/diff_new_pack.adn8Tc/_old 2022-10-25 11:20:33.794210562 +0200
+++ /var/tmp/diff_new_pack.adn8Tc/_new 2022-10-25 11:20:33.798210571 +0200
@@ -1,7 +1,8 @@
#
# spec file for package jefferson
#
-# Copyright (c) 2020, Martin Hauke <mardnh(a)gmx.de>
+# Copyright (c) 2022 SUSE LLC
+# Copyright (c) 2020-2022, Martin Hauke <mardnh(a)gmx.de>
#
# All modifications and additions to the file contributed by third parties
# remain the property of their copyright owners, unless otherwise agreed
@@ -17,23 +18,20 @@
Name: jefferson
-Version: 0.3+git.20160616
+Version: 0.4.1+git.20220705
Release: 0
Summary: JFFS2 filesystem extraction tool
License: MIT
Group: Development/Tools/Other
URL: https://github.com/sviehb/jefferson
Source: %{name}-%{version}.tar.xz
-# Add support for python3
-Patch0: https://github.com/sviehb/jefferson/pull/18.patch
-Patch1: jefferson-use-pylzma.patch
BuildRequires: fdupes
BuildRequires: help2man
BuildRequires: python-rpm-macros
-BuildRequires: python3-cstruct >= 1.5
-BuildRequires: python3-setuptools
+BuildRequires: python3-cstruct >= 2.1
BuildRequires: python3-pylzma
-Requires: python3-cstruct >= 1.5
+BuildRequires: python3-setuptools
+Requires: python3-cstruct >= 2.1
Requires: python3-pylzma
BuildArch: noarch
@@ -52,8 +50,6 @@
%prep
%setup -q
-%patch0 -p1
-%patch1 -p1
chmod -x README.md
%build
++++++ _service ++++++
--- /var/tmp/diff_new_pack.adn8Tc/_old 2022-10-25 11:20:33.858210704 +0200
+++ /var/tmp/diff_new_pack.adn8Tc/_new 2022-10-25 11:20:33.862210712 +0200
@@ -5,7 +5,7 @@
<param name="revision">master</param>
<param name="scm">git</param>
<param name="changesgenerate">enable</param>
- <param name="versionformat">0.3+git.%cd</param>
+ <param name="versionformat">0.4.1+git.%cd</param>
</service>
<service mode="disabled" name="recompress">
<param name="file">*.tar</param>
++++++ _servicedata ++++++
--- /var/tmp/diff_new_pack.adn8Tc/_old 2022-10-25 11:20:33.882210757 +0200
+++ /var/tmp/diff_new_pack.adn8Tc/_new 2022-10-25 11:20:33.886210766 +0200
@@ -1,6 +1,6 @@
<servicedata>
<service name="tar_scm">
<param name="url">https://github.com/sviehb/jefferson.git</param>
- <param name="changesrevision">6f9169bad3ceb4e212fae62ad710eeca3350226b</param></service></servicedata>
+ <param name="changesrevision">ecc51d7ab500e6286c80154f89388e3173784081</param></service></servicedata>
(No newline at EOF)
++++++ jefferson-0.3+git.20160616.tar.xz -> jefferson-0.4.1+git.20220705.tar.xz ++++++
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/jefferson-0.3+git.20160616/.git/FETCH_HEAD new/jefferson-0.4.1+git.20220705/.git/FETCH_HEAD
--- old/jefferson-0.3+git.20160616/.git/FETCH_HEAD 1970-01-01 01:00:00.000000000 +0100
+++ new/jefferson-0.4.1+git.20220705/.git/FETCH_HEAD 2022-07-05 10:56:07.000000000 +0200
@@ -0,0 +1,4 @@
+ecc51d7ab500e6286c80154f89388e3173784081 branch 'master' of https://github.com/sviehb/jefferson
+5d8e2f0b2f4e6d0999382614f37981d6ca6c3d9f not-for-merge branch 'fix-inode-versioning' of https://github.com/sviehb/jefferson
+684069ea678c388d7fe58a61e671e4fa5a2a3c3f not-for-merge branch 'fix-lzo-python310' of https://github.com/sviehb/jefferson
+da60d313db2be447b6f8c4ff2cdcca40292faf6d not-for-merge branch 'fix-path-traversal' of https://github.com/sviehb/jefferson
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/jefferson-0.3+git.20160616/.git/ORIG_HEAD new/jefferson-0.4.1+git.20220705/.git/ORIG_HEAD
--- old/jefferson-0.3+git.20160616/.git/ORIG_HEAD 2016-06-16 16:41:06.000000000 +0200
+++ new/jefferson-0.4.1+git.20220705/.git/ORIG_HEAD 2022-07-05 10:56:07.000000000 +0200
@@ -1 +1 @@
-6f9169bad3ceb4e212fae62ad710eeca3350226b
+ecc51d7ab500e6286c80154f89388e3173784081
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/jefferson-0.3+git.20160616/.git/config new/jefferson-0.4.1+git.20220705/.git/config
--- old/jefferson-0.3+git.20160616/.git/config 2016-06-16 16:41:06.000000000 +0200
+++ new/jefferson-0.4.1+git.20220705/.git/config 2022-07-05 10:56:07.000000000 +0200
@@ -1,11 +1,15 @@
[core]
- repositoryformatversion = 0
+ repositoryformatversion = 1
filemode = true
bare = false
logallrefupdates = true
[remote "origin"]
url = https://github.com/sviehb/jefferson.git
fetch = +refs/heads/*:refs/remotes/origin/*
+ promisor = true
+ partialclonefilter = tree:0
[branch "master"]
remote = origin
merge = refs/heads/master
+[extensions]
+ partialClone = origin
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/jefferson-0.3+git.20160616/.git/hooks/fsmonitor-watchman.sample new/jefferson-0.4.1+git.20220705/.git/hooks/fsmonitor-watchman.sample
--- old/jefferson-0.3+git.20160616/.git/hooks/fsmonitor-watchman.sample 2016-06-16 16:41:06.000000000 +0200
+++ new/jefferson-0.4.1+git.20220705/.git/hooks/fsmonitor-watchman.sample 2022-07-05 10:56:07.000000000 +0200
@@ -86,12 +86,13 @@
# recency index to select candidate nodes and "fields" to limit the
# output to file names only. Then we're using the "expression" term to
# further constrain the results.
+ my $last_update_line = "";
if (substr($last_update_token, 0, 1) eq "c") {
$last_update_token = "\"$last_update_token\"";
+ $last_update_line = qq[\n"since": $last_update_token,];
}
my $query = <<" END";
- ["query", "$git_work_tree", {
- "since": $last_update_token,
+ ["query", "$git_work_tree", {$last_update_line
"fields": ["name"],
"expression": ["not", ["dirname", ".git"]]
}]
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/jefferson-0.3+git.20160616/.git/hooks/pre-push.sample new/jefferson-0.4.1+git.20220705/.git/hooks/pre-push.sample
--- old/jefferson-0.3+git.20160616/.git/hooks/pre-push.sample 2016-06-16 16:41:06.000000000 +0200
+++ new/jefferson-0.4.1+git.20220705/.git/hooks/pre-push.sample 2022-07-05 10:56:07.000000000 +0200
@@ -14,7 +14,7 @@
# Information about the commits which are being pushed is supplied as lines to
# the standard input in the form:
#
-# <local ref> <local sha1> <remote ref> <remote sha1>
+# <local ref> <local oid> <remote ref> <remote oid>
#
# This sample shows how to prevent push of commits where the log message starts
# with "WIP" (work in progress).
@@ -22,27 +22,27 @@
remote="$1"
url="$2"
-z40=0000000000000000000000000000000000000000
+zero=$(git hash-object --stdin </dev/null | tr '[0-9a-f]' '0')
-while read local_ref local_sha remote_ref remote_sha
+while read local_ref local_oid remote_ref remote_oid
do
- if [ "$local_sha" = $z40 ]
+ if test "$local_oid" = "$zero"
then
# Handle delete
:
else
- if [ "$remote_sha" = $z40 ]
+ if test "$remote_oid" = "$zero"
then
# New branch, examine all commits
- range="$local_sha"
+ range="$local_oid"
else
# Update to existing branch, examine new commits
- range="$remote_sha..$local_sha"
+ range="$remote_oid..$local_oid"
fi
# Check for WIP commit
- commit=`git rev-list -n 1 --grep '^WIP' "$range"`
- if [ -n "$commit" ]
+ commit=$(git rev-list -n 1 --grep '^WIP' "$range")
+ if test -n "$commit"
then
echo >&2 "Found WIP commit in $local_ref, not pushing"
exit 1
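The hook update above replaces the hard-coded `z40` constant with a zero OID derived from the hash algorithm in use (`git hash-object --stdin </dev/null | tr '[0-9a-f]' '0'`), so the sample hooks keep working in SHA-256 repositories, where object ids are 64 hex digits. The same trick in plain shell, with `sha1sum` standing in for `git hash-object` so it runs without a repository:

```shell
# Hash the empty input, then map every hex digit to '0'. The result has
# exactly the width of the repository's object ids: 40 zeros for SHA-1,
# 64 for SHA-256 (when the real git hash-object is used).
zero=$(sha1sum </dev/null | cut -d' ' -f1 | tr '[0-9a-f]' '0')
echo "$zero"
echo "${#zero}"
```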
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/jefferson-0.3+git.20160616/.git/hooks/push-to-checkout.sample new/jefferson-0.4.1+git.20220705/.git/hooks/push-to-checkout.sample
--- old/jefferson-0.3+git.20160616/.git/hooks/push-to-checkout.sample 1970-01-01 01:00:00.000000000 +0100
+++ new/jefferson-0.4.1+git.20220705/.git/hooks/push-to-checkout.sample 2022-07-05 10:56:07.000000000 +0200
@@ -0,0 +1,78 @@
+#!/bin/sh
+
+# An example hook script to update a checked-out tree on a git push.
+#
+# This hook is invoked by git-receive-pack(1) when it reacts to git
+# push and updates reference(s) in its repository, and when the push
+# tries to update the branch that is currently checked out and the
+# receive.denyCurrentBranch configuration variable is set to
+# updateInstead.
+#
+# By default, such a push is refused if the working tree and the index
+# of the remote repository has any difference from the currently
+# checked out commit; when both the working tree and the index match
+# the current commit, they are updated to match the newly pushed tip
+# of the branch. This hook is to be used to override the default
+# behaviour; however the code below reimplements the default behaviour
+# as a starting point for convenient modification.
+#
+# The hook receives the commit with which the tip of the current
+# branch is going to be updated:
+commit=$1
+
+# It can exit with a non-zero status to refuse the push (when it does
+# so, it must not modify the index or the working tree).
+die () {
+ echo >&2 "$*"
+ exit 1
+}
+
+# Or it can make any necessary changes to the working tree and to the
+# index to bring them to the desired state when the tip of the current
+# branch is updated to the new commit, and exit with a zero status.
+#
+# For example, the hook can simply run git read-tree -u -m HEAD "$1"
+# in order to emulate git fetch that is run in the reverse direction
+# with git push, as the two-tree form of git read-tree -u -m is
+# essentially the same as git switch or git checkout that switches
+# branches while keeping the local changes in the working tree that do
+# not interfere with the difference between the branches.
+
+# The below is a more-or-less exact translation to shell of the C code
+# for the default behaviour for git's push-to-checkout hook defined in
+# the push_to_deploy() function in builtin/receive-pack.c.
+#
+# Note that the hook will be executed from the repository directory,
+# not from the working tree, so if you want to perform operations on
+# the working tree, you will have to adapt your code accordingly, e.g.
+# by adding "cd .." or using relative paths.
+
+if ! git update-index -q --ignore-submodules --refresh
+then
+ die "Up-to-date check failed"
+fi
+
+if ! git diff-files --quiet --ignore-submodules --
+then
+ die "Working directory has unstaged changes"
+fi
+
+# This is a rough translation of:
+#
+# head_has_history() ? "HEAD" : EMPTY_TREE_SHA1_HEX
+if git cat-file -e HEAD 2>/dev/null
+then
+ head=HEAD
+else
+ head=$(git hash-object -t tree --stdin </dev/null)
+fi
+
+if ! git diff-index --quiet --cached --ignore-submodules $head --
+then
+ die "Working directory has staged changes"
+fi
+
+if ! git read-tree -u -m "$commit"
+then
+ die "Could not update working tree to new HEAD"
+fi
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/jefferson-0.3+git.20160616/.git/hooks/update.sample new/jefferson-0.4.1+git.20220705/.git/hooks/update.sample
--- old/jefferson-0.3+git.20160616/.git/hooks/update.sample 2016-06-16 16:41:06.000000000 +0200
+++ new/jefferson-0.4.1+git.20220705/.git/hooks/update.sample 2022-07-05 10:56:07.000000000 +0200
@@ -60,7 +60,7 @@
# --- Check types
# if $newrev is 0000...0000, it's a commit to delete a ref.
-zero="0000000000000000000000000000000000000000"
+zero=$(git hash-object --stdin </dev/null | tr '[0-9a-f]' '0')
if [ "$newrev" = "$zero" ]; then
newrev_type=delete
else
Binary files old/jefferson-0.3+git.20160616/.git/index and new/jefferson-0.4.1+git.20220705/.git/index differ
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/jefferson-0.3+git.20160616/.git/logs/HEAD new/jefferson-0.4.1+git.20220705/.git/logs/HEAD
--- old/jefferson-0.3+git.20160616/.git/logs/HEAD 2016-06-16 16:41:06.000000000 +0200
+++ new/jefferson-0.4.1+git.20220705/.git/logs/HEAD 2022-07-05 10:56:07.000000000 +0200
@@ -1,2 +1,4 @@
-0000000000000000000000000000000000000000 6f9169bad3ceb4e212fae62ad710eeca3350226b Martin Hauke <mardnh(a)gmx.de> 1597349289 +0200 clone: from https://github.com/sviehb/jefferson.git
-6f9169bad3ceb4e212fae62ad710eeca3350226b 6f9169bad3ceb4e212fae62ad710eeca3350226b Martin Hauke <mardnh(a)gmx.de> 1597349289 +0200 reset: moving to origin/master
+0000000000000000000000000000000000000000 ecc51d7ab500e6286c80154f89388e3173784081 Martin Hauke <mardnh(a)gmx.de> 1665753391 +0200 clone: from https://github.com/sviehb/jefferson.git
+ecc51d7ab500e6286c80154f89388e3173784081 ecc51d7ab500e6286c80154f89388e3173784081 Martin Hauke <mardnh(a)gmx.de> 1665753391 +0200 checkout: moving from master to master
+ecc51d7ab500e6286c80154f89388e3173784081 ecc51d7ab500e6286c80154f89388e3173784081 Martin Hauke <mardnh(a)gmx.de> 1665753391 +0200 reset: moving to master
+ecc51d7ab500e6286c80154f89388e3173784081 ecc51d7ab500e6286c80154f89388e3173784081 Martin Hauke <mardnh(a)gmx.de> 1665753450 +0200 reset: moving to master
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/jefferson-0.3+git.20160616/.git/logs/refs/heads/master new/jefferson-0.4.1+git.20220705/.git/logs/refs/heads/master
--- old/jefferson-0.3+git.20160616/.git/logs/refs/heads/master 2016-06-16 16:41:06.000000000 +0200
+++ new/jefferson-0.4.1+git.20220705/.git/logs/refs/heads/master 2022-07-05 10:56:07.000000000 +0200
@@ -1 +1 @@
-0000000000000000000000000000000000000000 6f9169bad3ceb4e212fae62ad710eeca3350226b Martin Hauke <mardnh(a)gmx.de> 1597349289 +0200 clone: from https://github.com/sviehb/jefferson.git
+0000000000000000000000000000000000000000 ecc51d7ab500e6286c80154f89388e3173784081 Martin Hauke <mardnh(a)gmx.de> 1665753391 +0200 clone: from https://github.com/sviehb/jefferson.git
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/jefferson-0.3+git.20160616/.git/logs/refs/remotes/origin/HEAD new/jefferson-0.4.1+git.20220705/.git/logs/refs/remotes/origin/HEAD
--- old/jefferson-0.3+git.20160616/.git/logs/refs/remotes/origin/HEAD 2016-06-16 16:41:06.000000000 +0200
+++ new/jefferson-0.4.1+git.20220705/.git/logs/refs/remotes/origin/HEAD 2022-07-05 10:56:07.000000000 +0200
@@ -1 +1 @@
-0000000000000000000000000000000000000000 6f9169bad3ceb4e212fae62ad710eeca3350226b Martin Hauke <mardnh(a)gmx.de> 1597349289 +0200 clone: from https://github.com/sviehb/jefferson.git
+0000000000000000000000000000000000000000 ecc51d7ab500e6286c80154f89388e3173784081 Martin Hauke <mardnh(a)gmx.de> 1665753391 +0200 clone: from https://github.com/sviehb/jefferson.git
Binary files old/jefferson-0.3+git.20160616/.git/objects/pack/pack-3a0a0173eda603658094f7a412971bc5123e6e64.idx and new/jefferson-0.4.1+git.20220705/.git/objects/pack/pack-3a0a0173eda603658094f7a412971bc5123e6e64.idx differ
Binary files old/jefferson-0.3+git.20160616/.git/objects/pack/pack-3a0a0173eda603658094f7a412971bc5123e6e64.pack and new/jefferson-0.4.1+git.20220705/.git/objects/pack/pack-3a0a0173eda603658094f7a412971bc5123e6e64.pack differ
Binary files old/jefferson-0.3+git.20160616/.git/objects/pack/pack-725251cf98b7e590958c23630b690dc7a2d4860a.idx and new/jefferson-0.4.1+git.20220705/.git/objects/pack/pack-725251cf98b7e590958c23630b690dc7a2d4860a.idx differ
Binary files old/jefferson-0.3+git.20160616/.git/objects/pack/pack-725251cf98b7e590958c23630b690dc7a2d4860a.pack and new/jefferson-0.4.1+git.20220705/.git/objects/pack/pack-725251cf98b7e590958c23630b690dc7a2d4860a.pack differ
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/jefferson-0.3+git.20160616/.git/objects/pack/pack-725251cf98b7e590958c23630b690dc7a2d4860a.promisor new/jefferson-0.4.1+git.20220705/.git/objects/pack/pack-725251cf98b7e590958c23630b690dc7a2d4860a.promisor
--- old/jefferson-0.3+git.20160616/.git/objects/pack/pack-725251cf98b7e590958c23630b690dc7a2d4860a.promisor 1970-01-01 01:00:00.000000000 +0100
+++ new/jefferson-0.4.1+git.20220705/.git/objects/pack/pack-725251cf98b7e590958c23630b690dc7a2d4860a.promisor 2022-07-05 10:56:07.000000000 +0200
@@ -0,0 +1,8 @@
+ecc51d7ab500e6286c80154f89388e3173784081 HEAD
+5d8e2f0b2f4e6d0999382614f37981d6ca6c3d9f refs/heads/fix-inode-versioning
+684069ea678c388d7fe58a61e671e4fa5a2a3c3f refs/heads/fix-lzo-python310
+da60d313db2be447b6f8c4ff2cdcca40292faf6d refs/heads/fix-path-traversal
+ecc51d7ab500e6286c80154f89388e3173784081 refs/heads/master
+3fafcf2c922aab5d72e2c65e862c6594e41dd700 refs/tags/v0.3
+dc8564fb90b54c71bd4c62eb71931fd5040fa308 refs/tags/v0.4
+93f4f7dce124fb4c9ffd1d497dc873013e86e355 refs/tags/v0.4.1
Binary files old/jefferson-0.3+git.20160616/.git/objects/pack/pack-c1640479d2ac5d8b1cc2ce8e37d65d4dba26e302.idx and new/jefferson-0.4.1+git.20220705/.git/objects/pack/pack-c1640479d2ac5d8b1cc2ce8e37d65d4dba26e302.idx differ
Binary files old/jefferson-0.3+git.20160616/.git/objects/pack/pack-c1640479d2ac5d8b1cc2ce8e37d65d4dba26e302.pack and new/jefferson-0.4.1+git.20220705/.git/objects/pack/pack-c1640479d2ac5d8b1cc2ce8e37d65d4dba26e302.pack differ
Binary files old/jefferson-0.3+git.20160616/.git/objects/pack/pack-d21226feb514c8682694542253e376efbcd4fdc8.idx and new/jefferson-0.4.1+git.20220705/.git/objects/pack/pack-d21226feb514c8682694542253e376efbcd4fdc8.idx differ
Binary files old/jefferson-0.3+git.20160616/.git/objects/pack/pack-d21226feb514c8682694542253e376efbcd4fdc8.pack and new/jefferson-0.4.1+git.20220705/.git/objects/pack/pack-d21226feb514c8682694542253e376efbcd4fdc8.pack differ
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/jefferson-0.3+git.20160616/.git/packed-refs new/jefferson-0.4.1+git.20220705/.git/packed-refs
--- old/jefferson-0.3+git.20160616/.git/packed-refs 2016-06-16 16:41:06.000000000 +0200
+++ new/jefferson-0.4.1+git.20220705/.git/packed-refs 2022-07-05 10:56:07.000000000 +0200
@@ -1,2 +1,10 @@
# pack-refs with: peeled fully-peeled sorted
-6f9169bad3ceb4e212fae62ad710eeca3350226b refs/remotes/origin/master
+5d8e2f0b2f4e6d0999382614f37981d6ca6c3d9f refs/remotes/origin/fix-inode-versioning
+684069ea678c388d7fe58a61e671e4fa5a2a3c3f refs/remotes/origin/fix-lzo-python310
+da60d313db2be447b6f8c4ff2cdcca40292faf6d refs/remotes/origin/fix-path-traversal
+ecc51d7ab500e6286c80154f89388e3173784081 refs/remotes/origin/master
+3fafcf2c922aab5d72e2c65e862c6594e41dd700 refs/tags/v0.3
+dc8564fb90b54c71bd4c62eb71931fd5040fa308 refs/tags/v0.4
+^361c6e38684bfb24e612e52726cf38c02bd2c9cc
+93f4f7dce124fb4c9ffd1d497dc873013e86e355 refs/tags/v0.4.1
+^ecc51d7ab500e6286c80154f89388e3173784081
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/jefferson-0.3+git.20160616/.git/refs/heads/master new/jefferson-0.4.1+git.20220705/.git/refs/heads/master
--- old/jefferson-0.3+git.20160616/.git/refs/heads/master 2016-06-16 16:41:06.000000000 +0200
+++ new/jefferson-0.4.1+git.20220705/.git/refs/heads/master 2022-07-05 10:56:07.000000000 +0200
@@ -1 +1 @@
-6f9169bad3ceb4e212fae62ad710eeca3350226b
+ecc51d7ab500e6286c80154f89388e3173784081
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/jefferson-0.3+git.20160616/README.md new/jefferson-0.4.1+git.20220705/README.md
--- old/jefferson-0.3+git.20160616/README.md 2016-06-16 16:41:06.000000000 +0200
+++ new/jefferson-0.4.1+git.20220705/README.md 2022-07-05 10:56:07.000000000 +0200
@@ -1,33 +1,31 @@
-# jefferson
-JFFS2 filesystem extraction tool
+## Jefferson
-Installation
-============
-```bash
-$ sudo python setup.py install
-```
+JFFS2 filesystem extraction tool
+### Installation
-Dependencies
-============
-- `cstruct`
-- `pyliblzma`
+Follow these steps on Debian-based systems (Debian, Ubuntu, Kali, ...) to perform a system-wide installation of jefferson:
```bash
-$ sudo pip install cstruct
-$ sudo apt-get install python-lzma
+git clone https://github.com/sviehb/jefferson.git
+cd jefferson
+sudo apt update
+sudo apt install python3-pip liblzo2-dev
+sudo python3 -m pip install -r requirements.txt
+sudo python3 setup.py install
```
-Features
-============
-- Big/Little Endian support
-- `JFFS2_COMPR_ZLIB`, `JFFS2_COMPR_RTIME`, and `JFFS2_COMPR_LZMA` compression support
+
+### Features
+
+- big-endian and little-endian support with auto-detection
+- zlib, rtime, LZMA, and LZO compression support
- CRC checks - for now only enforced on `hdr_crc`
-- Extraction of symlinks, directories, files, and device nodes
-- Detection/handling of duplicate inode numbers. Occurs if multiple JFFS2 filesystems are found in one file and causes `jefferson` to treat segments as separate filesystems
+- extraction of symlinks, directories, files, and device nodes
+- detection/handling of duplicate inode numbers. Occurs if multiple JFFS2 filesystems are found in one file and causes `jefferson` to treat segments as separate filesystems
+
+### Usage
-Usage
-============
```bash
$ jefferson filesystem.img -d outdir
```
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/jefferson-0.3+git.20160616/requirements.txt new/jefferson-0.4.1+git.20220705/requirements.txt
--- old/jefferson-0.3+git.20160616/requirements.txt 1970-01-01 01:00:00.000000000 +0100
+++ new/jefferson-0.4.1+git.20220705/requirements.txt 2022-07-05 10:56:07.000000000 +0200
@@ -0,0 +1,2 @@
+cstruct==2.1
+python-lzo==1.14
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/jefferson-0.3+git.20160616/setup.py new/jefferson-0.4.1+git.20220705/setup.py
--- old/jefferson-0.3+git.20160616/setup.py 2016-06-16 16:41:06.000000000 +0200
+++ new/jefferson-0.4.1+git.20220705/setup.py 2022-07-05 10:56:07.000000000 +0200
@@ -1,20 +1,19 @@
-#!/usr/bin/env python
+#!/usr/bin/env python3
# coding=utf-8
from distutils.core import setup
-version = '0.2'
+version = "0.4.1"
setup(
- name='jefferson',
+ name="jefferson",
version=version,
- description='',
- author='Stefan Viehböck',
- url='https://github.com/sviehb/jefferson',
- license='MIT',
-
+ description="",
+ author="Stefan Viehböck",
+ url="https://github.com/sviehb/jefferson",
+ license="MIT",
requires=['cstruct'],
- packages=['jefferson'],
- package_dir={'jefferson': 'src/jefferson'},
- scripts=['src/scripts/jefferson'],
-)
\ No newline at end of file
+ packages=["jefferson"],
+ package_dir={"jefferson": "src/jefferson"},
+ scripts=["src/scripts/jefferson"],
+)
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/jefferson-0.3+git.20160616/src/jefferson/__init__.py new/jefferson-0.4.1+git.20220705/src/jefferson/__init__.py
--- old/jefferson-0.3+git.20160616/src/jefferson/__init__.py 2016-06-16 16:41:06.000000000 +0200
+++ new/jefferson-0.4.1+git.20220705/src/jefferson/__init__.py 2022-07-05 10:56:07.000000000 +0200
@@ -1 +1 @@
-__author__ = 'stefan'
+__author__ = "stefan"
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/jefferson-0.3+git.20160616/src/jefferson/jffs2_lzma.py new/jefferson-0.4.1+git.20220705/src/jefferson/jffs2_lzma.py
--- old/jefferson-0.3+git.20160616/src/jefferson/jffs2_lzma.py 2016-06-16 16:41:06.000000000 +0200
+++ new/jefferson-0.4.1+git.20220705/src/jefferson/jffs2_lzma.py 2022-07-05 10:56:07.000000000 +0200
@@ -15,7 +15,7 @@
def decompress(data, outlen):
- lzma_header = struct.pack('<BIQ', PROPERTIES, DICT_SIZE, outlen)
+ lzma_header = struct.pack("<BIQ", PROPERTIES, DICT_SIZE, outlen)
lzma_data = lzma_header + data
decompressed = lzma.decompress(lzma_data)
- return decompressed
\ No newline at end of file
+ return decompressed
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/jefferson-0.3+git.20160616/src/jefferson/rtime.py new/jefferson-0.4.1+git.20220705/src/jefferson/rtime.py
--- old/jefferson-0.3+git.20160616/src/jefferson/rtime.py 2016-06-16 16:41:06.000000000 +0200
+++ new/jefferson-0.4.1+git.20220705/src/jefferson/rtime.py 2022-07-05 10:56:07.000000000 +0200
@@ -4,11 +4,11 @@
outpos = 0
pos = 0
while outpos < destlen:
- value = ord(data_in[pos])
+ value = data_in[pos]
pos += 1
cpage_out[outpos] = value
outpos += 1
- repeat = ord(data_in[pos])
+ repeat = data_in[pos]
pos += 1
backoffs = positions[value]
@@ -21,6 +21,8 @@
backoffs += 1
repeat -= 1
else:
- cpage_out[outpos:outpos + repeat] = cpage_out[backoffs:backoffs + repeat]
+ cpage_out[outpos : outpos + repeat] = cpage_out[
+ backoffs : backoffs + repeat
+ ]
outpos += repeat
- return str(cpage_out)
\ No newline at end of file
+ return cpage_out
\ No newline at end of file
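The `rtime.py` hunk above mostly ports the decompressor to Python 3 byte semantics (indexing `bytes` already yields ints, and the result stays a `bytearray`). Assembled from the patched fragments, the algorithm is a back-reference run-length scheme keyed on the last output position of each byte value — a sketch, with the surrounding lines reconstructed from the kernel's rtime layout:

```python
def rtime_decompress(data_in, destlen):
    """JFFS2 rtime decompression, reconstructed from the patched rtime.py."""
    positions = [0] * 256          # last output position of each byte value
    cpage_out = bytearray(destlen)
    outpos = 0
    pos = 0
    while outpos < destlen:
        value = data_in[pos]       # literal byte
        pos += 1
        cpage_out[outpos] = value
        outpos += 1
        repeat = data_in[pos]      # length of the run that follows it
        pos += 1
        backoffs = positions[value]
        positions[value] = outpos
        if repeat:
            if backoffs + repeat >= outpos:
                # overlapping copy must go byte by byte
                while repeat:
                    cpage_out[outpos] = cpage_out[backoffs]
                    outpos += 1
                    backoffs += 1
                    repeat -= 1
            else:
                cpage_out[outpos : outpos + repeat] = cpage_out[
                    backoffs : backoffs + repeat
                ]
                outpos += repeat
    return cpage_out

# (value, repeat) pairs: 'A' with no run, then 'A' repeating 3 more times
assert rtime_decompress(bytes([65, 0, 65, 3]), 5) == b"AAAAA"
```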
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/jefferson-0.3+git.20160616/src/scripts/jefferson new/jefferson-0.4.1+git.20220705/src/scripts/jefferson
--- old/jefferson-0.3+git.20160616/src/scripts/jefferson 2016-06-16 16:41:06.000000000 +0200
+++ new/jefferson-0.4.1+git.20220705/src/scripts/jefferson 2022-07-05 10:56:07.000000000 +0200
@@ -1,20 +1,25 @@
-#! /usr/bin/python2
+#!/usr/bin/env python3
+import argparse
import struct
import stat
import os
+import sys
import zlib
import binascii
-
+import lzo
+import mmap
+import contextlib
import cstruct
from jefferson import jffs2_lzma, rtime
def PAD(x):
- return (((x) + 3) & ~3)
+ return ((x) + 3) & ~3
+JFFS2_OLD_MAGIC_BITMASK = 0x1984
JFFS2_MAGIC_BITMASK = 0x1985
JFFS2_COMPR_NONE = 0x00
JFFS2_COMPR_ZERO = 0x01
@@ -27,10 +32,10 @@
JFFS2_COMPR_LZMA = 0x08
# /* Compatibility flags. */
-JFFS2_COMPAT_MASK = 0xc000 # /* What do to if an unknown nodetype is found */
+JFFS2_COMPAT_MASK = 0xC000 # /* What do to if an unknown nodetype is found */
JFFS2_NODE_ACCURATE = 0x2000
# /* INCOMPAT: Fail to mount the filesystem */
-JFFS2_FEATURE_INCOMPAT = 0xc000
+JFFS2_FEATURE_INCOMPAT = 0xC000
# /* ROCOMPAT: Mount read-only */
JFFS2_FEATURE_ROCOMPAT = 0x8000
# /* RWCOMPAT_COPY: Mount read/write, and copy the node when it's GC'd */
@@ -46,309 +51,289 @@
JFFS2_NODETYPE_XATTR = JFFS2_FEATURE_INCOMPAT | JFFS2_NODE_ACCURATE | 8
JFFS2_NODETYPE_XREF = JFFS2_FEATURE_INCOMPAT | JFFS2_NODE_ACCURATE | 9
-
def mtd_crc(data):
- return (binascii.crc32(data, -1) ^ -1) & 0xffffffff
-
+ return (binascii.crc32(data, -1) ^ -1) & 0xFFFFFFFF
-cstruct.typedef('uint8', 'uint8_t')
-cstruct.typedef('uint16', 'jint16_t')
-cstruct.typedef('uint32', 'jint32_t')
-cstruct.typedef('uint32', 'jmode_t')
+def is_safe_path(basedir, real_path):
+ basedir = os.path.realpath(basedir)
+ return basedir == os.path.commonpath((basedir, real_path))
+
+cstruct.typedef("uint8", "uint8_t")
+cstruct.typedef("uint16", "jint16_t")
+cstruct.typedef("uint32", "jint32_t")
+cstruct.typedef("uint32", "jmode_t")
class Jffs2_unknown_node(cstruct.CStruct):
__byte_order__ = cstruct.LITTLE_ENDIAN
- __struct__ = """
- /* All start like this */
- jint16_t magic;
- jint16_t nodetype;
- jint32_t totlen; /* So we can skip over nodes we don't grok */
- jint32_t hdr_crc;
+ __def__ = """
+ struct {
+ /* All start like this */
+ jint16_t magic;
+ jint16_t nodetype;
+ jint32_t totlen; /* So we can skip over nodes we don't grok */
+ jint32_t hdr_crc;
+ }
"""
def unpack(self, data):
- cstruct.CStruct.unpack(self, data[:self.size])
- comp_hrd_crc = mtd_crc(data[:self.size - 4])
+ cstruct.CStruct.unpack(self, data[: self.size])
+ comp_hrd_crc = mtd_crc(data[: self.size - 4])
if comp_hrd_crc == self.hdr_crc:
self.hdr_crc_match = True
else:
- #print 'hdr_crc does not match!'
+ # print("hdr_crc does not match!")
self.hdr_crc_match = False
-class Jffs2_raw_xattr(cstruct.CStruct):
- __byte_order__ = cstruct.LITTLE_ENDIAN
- __struct__ = """
- jint16_t magic;
- jint16_t nodetype; /* = JFFS2_NODETYPE_XATTR */
- jint32_t totlen;
- jint32_t hdr_crc;
- jint32_t xid; /* XATTR identifier number */
- jint32_t version;
- uint8_t xprefix;
- uint8_t name_len;
- jint16_t value_len;
- jint32_t data_crc;
- jint32_t node_crc;
- uint8_t data[0];
- """
-
-
-class Jffs2_raw_summary(cstruct.CStruct):
- __byte_order__ = cstruct.LITTLE_ENDIAN
- __struct__ = """
- jint16_t magic;
- jint16_t nodetype; /* = JFFS2_NODETYPE_SUMMARY */
- jint32_t totlen;
- jint32_t hdr_crc;
- jint32_t sum_num; /* number of sum entries*/
- jint32_t cln_mkr; /* clean marker size, 0 = no cleanmarker */
- jint32_t padded; /* sum of the size of padding nodes */
- jint32_t sum_crc; /* summary information crc */
- jint32_t node_crc; /* node crc */
- jint32_t sum[0]; /* inode summary info */
- """
-
-
-class Jffs2_raw_xref(cstruct.CStruct):
- __byte_order__ = cstruct.LITTLE_ENDIAN
- __struct__ = """
- jint16_t magic;
- jint16_t nodetype; /* = JFFS2_NODETYPE_XREF */
- jint32_t totlen;
- jint32_t hdr_crc;
- jint32_t ino; /* inode number */
- jint32_t xid; /* XATTR identifier number */
- jint32_t xseqno; /* xref sequencial number */
- jint32_t node_crc;
- """
-
-
class Jffs2_raw_dirent(cstruct.CStruct):
__byte_order__ = cstruct.LITTLE_ENDIAN
- __struct__ = """
- jint16_t magic;
- jint16_t nodetype; /* == JFFS2_NODETYPE_DIRENT */
- jint32_t totlen;
- jint32_t hdr_crc;
- jint32_t pino;
- jint32_t version;
- jint32_t ino; /* == zero for unlink */
- jint32_t mctime;
- uint8_t nsize;
- uint8_t type;
- uint8_t unused[2];
- jint32_t node_crc;
- jint32_t name_crc;
- /* uint8_t data[0]; -> name */
+ __def__ = """
+ struct {
+ jint16_t magic;
+ jint16_t nodetype; /* == JFFS2_NODETYPE_DIRENT */
+ jint32_t totlen;
+ jint32_t hdr_crc;
+ jint32_t pino;
+ jint32_t version;
+ jint32_t ino; /* == zero for unlink */
+ jint32_t mctime;
+ uint8_t nsize;
+ uint8_t type;
+ uint8_t unused[2];
+ jint32_t node_crc;
+ jint32_t name_crc;
+ /* uint8_t data[0]; -> name */
+ }
"""
def unpack(self, data, node_offset):
- cstruct.CStruct.unpack(self, data[:self.size])
- self.name = data[self.size:self.size + self.nsize]
+ cstruct.CStruct.unpack(self, data[: self.size])
+ self.name = data[self.size : self.size + self.nsize].tobytes()
self.node_offset = node_offset
- if mtd_crc(data[:self.size - 8]) == self.node_crc:
+ if mtd_crc(data[: self.size - 8]) == self.node_crc:
self.node_crc_match = True
else:
- print 'node_crc does not match!'
+ print("node_crc does not match!")
self.node_crc_match = False
if mtd_crc(self.name) == self.name_crc:
self.name_crc_match = True
else:
- print 'data_crc does not match!'
+ print("data_crc does not match!")
self.name_crc_match = False
def __str__(self):
result = []
- for field in self.__fields__ + ['name', 'node_offset']:
+ for field in self.__fields__ + ["name", "node_offset"]:
result.append(field + "=" + str(getattr(self, field, None)))
return type(self).__name__ + "(" + ", ".join(result) + ")"
class Jffs2_raw_inode(cstruct.CStruct):
__byte_order__ = cstruct.LITTLE_ENDIAN
- __struct__ = """
- jint16_t magic; /* A constant magic number. */
- jint16_t nodetype; /* == JFFS2_NODETYPE_INODE */
- jint32_t totlen; /* Total length of this node (inc data, etc.) */
- jint32_t hdr_crc;
- jint32_t ino; /* Inode number. */
- jint32_t version; /* Version number. */
- jmode_t mode; /* The file's type or mode. */
- jint16_t uid; /* The file's owner. */
- jint16_t gid; /* The file's group. */
- jint32_t isize; /* Total resultant size of this inode (used for truncations) */
- jint32_t atime; /* Last access time. */
- jint32_t mtime; /* Last modification time. */
- jint32_t ctime; /* Change time. */
- jint32_t offset; /* Where to begin to write. */
- jint32_t csize; /* (Compressed) data size */
- jint32_t dsize; /* Size of the node's data. (after decompression) */
- uint8_t compr; /* Compression algorithm used */
- uint8_t usercompr; /* Compression algorithm requested by the user */
- jint16_t flags; /* See JFFS2_INO_FLAG_* */
- jint32_t data_crc; /* CRC for the (compressed) data. */
- jint32_t node_crc; /* CRC for the raw inode (excluding data) */
- /* uint8_t data[0]; */
+ __def__ = """
+ struct {
+ jint16_t magic; /* A constant magic number. */
+ jint16_t nodetype; /* == JFFS2_NODETYPE_INODE */
+ jint32_t totlen; /* Total length of this node (inc data, etc.) */
+ jint32_t hdr_crc;
+ jint32_t ino; /* Inode number. */
+ jint32_t version; /* Version number. */
+ jmode_t mode; /* The file's type or mode. */
+ jint16_t uid; /* The file's owner. */
+ jint16_t gid; /* The file's group. */
+ jint32_t isize; /* Total resultant size of this inode (used for truncations) */
+ jint32_t atime; /* Last access time. */
+ jint32_t mtime; /* Last modification time. */
+ jint32_t ctime; /* Change time. */
+ jint32_t offset; /* Where to begin to write. */
+ jint32_t csize; /* (Compressed) data size */
+ jint32_t dsize; /* Size of the node's data. (after decompression) */
+ uint8_t compr; /* Compression algorithm used */
+ uint8_t usercompr; /* Compression algorithm requested by the user */
+ jint16_t flags; /* See JFFS2_INO_FLAG_* */
+ jint32_t data_crc; /* CRC for the (compressed) data. */
+ jint32_t node_crc; /* CRC for the raw inode (excluding data) */
+ /* uint8_t data[0]; */
+ }
"""
def unpack(self, data):
- cstruct.CStruct.unpack(self, data[:self.size])
+ cstruct.CStruct.unpack(self, data[: self.size])
- node_data = data[self.size:self.size + self.csize]
- if self.compr == JFFS2_COMPR_NONE:
- self.data = node_data
- elif self.compr == JFFS2_COMPR_ZERO:
- self.data = '\x00' * self.dsize
- elif self.compr == JFFS2_COMPR_ZLIB:
- self.data = zlib.decompress(node_data)
- elif self.compr == JFFS2_COMPR_RTIME:
- self.data = rtime.decompress(node_data, self.dsize)
- elif self.compr == JFFS2_COMPR_LZMA:
- self.data = jffs2_lzma.decompress(node_data, self.dsize)
- else:
- print 'compression not implemented', self
- print node_data.encode('hex')[:20]
- self.data = node_data
+ node_data = data[self.size : self.size + self.csize].tobytes()
+ try:
+ if self.compr == JFFS2_COMPR_NONE:
+ self.data = node_data
+ elif self.compr == JFFS2_COMPR_ZERO:
+ self.data = b"\x00" * self.dsize
+ elif self.compr == JFFS2_COMPR_ZLIB:
+ self.data = zlib.decompress(node_data)
+ elif self.compr == JFFS2_COMPR_RTIME:
+ self.data = rtime.decompress(node_data, self.dsize)
+ elif self.compr == JFFS2_COMPR_LZMA:
+ self.data = jffs2_lzma.decompress(node_data, self.dsize)
+ elif self.compr == JFFS2_COMPR_LZO:
+ self.data = lzo.decompress(node_data, False, self.dsize)
+ else:
+ print("compression not implemented", self)
+ print(node_data.hex()[:20])
+ self.data = node_data
+ except Exception as e:
+ print("Decompression error on inode {}: {}".format(self.ino, e), file=sys.stderr)
+ self.data = b"\x00" * self.dsize
if len(self.data) != self.dsize:
- print 'data length mismatch!'
+ print("data length mismatch!")
- if mtd_crc(data[:self.size - 8]) == self.node_crc:
+ if mtd_crc(data[: self.size - 8]) == self.node_crc:
self.node_crc_match = True
else:
- print 'hdr_crc does not match!'
+ print("hdr_crc does not match!")
self.node_crc_match = False
if mtd_crc(node_data) == self.data_crc:
self.data_crc_match = True
else:
- print 'data_crc does not match!'
+ print("data_crc does not match!")
self.data_crc_match = False
+
class Jffs2_device_node_old(cstruct.CStruct):
__byte_order__ = cstruct.LITTLE_ENDIAN
- __struct__ = """
- jint16_t old_id;
+ __def__ = """
+ struct {
+ jint16_t old_id;
+ }
"""
+
class Jffs2_device_node_new(cstruct.CStruct):
__byte_order__ = cstruct.LITTLE_ENDIAN
- __struct__ = """
- jint32_t new_id;
+ __def__ = """
+ struct {
+ jint32_t new_id;
+ }
"""
+
NODETYPES = {
JFFS2_FEATURE_INCOMPAT: Jffs2_unknown_node,
JFFS2_NODETYPE_DIRENT: Jffs2_raw_dirent,
JFFS2_NODETYPE_INODE: Jffs2_raw_inode,
- JFFS2_NODETYPE_CLEANMARKER: 'JFFS2_NODETYPE_CLEANMARKER',
- JFFS2_NODETYPE_SUMMARY: Jffs2_raw_summary,
- JFFS2_NODETYPE_XATTR: Jffs2_raw_xattr,
- JFFS2_NODETYPE_XREF: Jffs2_raw_xref,
- JFFS2_NODETYPE_PADDING: 'JFFS2_NODETYPE_PADDING'
+ JFFS2_NODETYPE_CLEANMARKER: "JFFS2_NODETYPE_CLEANMARKER",
+ JFFS2_NODETYPE_PADDING: "JFFS2_NODETYPE_PADDING",
}
def set_endianness(endianness):
- Jffs2_device_node_new.__fmt__ = endianness + Jffs2_device_node_new.__fmt__[1:]
- Jffs2_device_node_old.__fmt__ = endianness + Jffs2_device_node_old.__fmt__[1:]
+ global Jffs2_device_node_new, Jffs2_device_node_old, Jffs2_unknown_node, Jffs2_raw_dirent, Jffs2_raw_inode, Jffs2_raw_summary, Jffs2_raw_xattr, Jffs2_raw_xref
- for node in NODETYPES.values():
- if isinstance(node, cstruct.CStructMeta):
- node.__fmt__ = endianness + node.__fmt__[1:]
+ Jffs2_device_node_new = Jffs2_device_node_new.parse(
+ Jffs2_device_node_new.__def__,
+ __name__=Jffs2_device_node_new.__name__,
+ __byte_order__=endianness,
+ )
+
+ Jffs2_device_node_old = Jffs2_device_node_old.parse(
+ Jffs2_device_node_old.__def__,
+ __name__=Jffs2_device_node_old.__name__,
+ __byte_order__=endianness,
+ )
+
+ Jffs2_unknown_node = Jffs2_unknown_node.parse(
+ Jffs2_unknown_node.__def__,
+ __name__=Jffs2_unknown_node.__name__,
+ __byte_order__=endianness,
+ )
+
+ Jffs2_raw_dirent = Jffs2_raw_dirent.parse(
+ Jffs2_raw_dirent.__def__,
+ __name__=Jffs2_raw_dirent.__name__,
+ __byte_order__=endianness,
+ )
+
+ Jffs2_raw_inode = Jffs2_raw_inode.parse(
+ Jffs2_raw_inode.__def__,
+ __name__=Jffs2_raw_inode.__name__,
+ __byte_order__=endianness,
+ )
def scan_fs(content, endianness, verbose=False):
- set_endianness(endianness)
- summaries = []
pos = 0
- jffs2_magic_bitmask_str = struct.pack(endianness + 'H', JFFS2_MAGIC_BITMASK)
- fs_index = 0
+ jffs2_old_magic_bitmask_str = struct.pack(endianness + "H", JFFS2_OLD_MAGIC_BITMASK)
+ jffs2_magic_bitmask_str = struct.pack(endianness + "H", JFFS2_MAGIC_BITMASK)
+ content_mv = memoryview(content)
fs = {}
- fs[fs_index] = {}
- fs[fs_index]["endianness"] = endianness
- fs[fs_index][JFFS2_NODETYPE_INODE] = []
- fs[fs_index][JFFS2_NODETYPE_DIRENT] = []
- fs[fs_index][JFFS2_NODETYPE_XATTR] = []
- fs[fs_index][JFFS2_NODETYPE_XREF] = []
- fs[fs_index][JFFS2_NODETYPE_SUMMARY] = []
+ fs[JFFS2_NODETYPE_INODE] = {}
+ fs[JFFS2_NODETYPE_DIRENT] = {}
- dirent_dict = {}
while True:
- find_result = content.find(jffs2_magic_bitmask_str, pos, len(content) - Jffs2_unknown_node.size)
- if find_result == -1:
+ find_result = content.find(
+ jffs2_magic_bitmask_str, pos, len(content) - Jffs2_unknown_node.size
+ )
+ find_result_old = content.find(
+ jffs2_old_magic_bitmask_str, pos, len(content) - Jffs2_unknown_node.size
+ )
+ if find_result == -1 and find_result_old == -1:
break
- else:
+ if find_result != -1:
pos = find_result
+ else:
+ pos = find_result_old
unknown_node = Jffs2_unknown_node()
- unknown_node.unpack(content[pos:pos + unknown_node.size])
+ unknown_node.unpack(content_mv[pos : pos + unknown_node.size])
if not unknown_node.hdr_crc_match:
pos += 1
continue
offset = pos
pos += PAD(unknown_node.totlen)
- if unknown_node.magic == JFFS2_MAGIC_BITMASK:
+ if unknown_node.magic in [
+ JFFS2_MAGIC_BITMASK,
+ JFFS2_OLD_MAGIC_BITMASK,
+ ]:
if unknown_node.nodetype in NODETYPES:
if unknown_node.nodetype == JFFS2_NODETYPE_DIRENT:
dirent = Jffs2_raw_dirent()
- dirent.unpack(content[0 + offset:], offset)
- if dirent.ino in dirent_dict:
- print 'duplicate inode use detected!!!'
- fs_index += 1
- fs[fs_index] = {}
- fs[fs_index]["endianness"] = endianness
- fs[fs_index][JFFS2_NODETYPE_INODE] = []
- fs[fs_index][JFFS2_NODETYPE_DIRENT] = []
- fs[fs_index][JFFS2_NODETYPE_XATTR] = []
- fs[fs_index][JFFS2_NODETYPE_XREF] = []
- fs[fs_index][JFFS2_NODETYPE_SUMMARY] = []
- dirent_dict = {}
-
- dirent_dict[dirent.ino] = dirent
-
- fs[fs_index][JFFS2_NODETYPE_DIRENT].append(dirent)
+ dirent.unpack(content_mv[0 + offset :], offset)
+ if dirent.ino in fs[JFFS2_NODETYPE_DIRENT]:
+ if dirent.version > fs[JFFS2_NODETYPE_DIRENT][dirent.ino].version:
+ fs[JFFS2_NODETYPE_DIRENT][dirent.ino] = dirent
+ else:
+ fs[JFFS2_NODETYPE_DIRENT][dirent.ino] = dirent
if verbose:
- print '0x%08X:' % (offset), dirent
+ print("0x%08X:" % (offset), dirent)
elif unknown_node.nodetype == JFFS2_NODETYPE_INODE:
inode = Jffs2_raw_inode()
- inode.unpack(content[0 + offset:])
- fs[fs_index][JFFS2_NODETYPE_INODE].append(inode)
- if verbose:
- print '0x%08X:' % (offset), inode
- elif unknown_node.nodetype == JFFS2_NODETYPE_XREF:
- xref = Jffs2_raw_xref()
- xref.unpack(content[offset:offset + xref.size])
- fs[fs_index][JFFS2_NODETYPE_XREF].append(xref)
- if verbose:
- print '0x%08X:' % (offset), xref
- elif unknown_node.nodetype == JFFS2_NODETYPE_XATTR:
- xattr = Jffs2_raw_xattr()
- xattr.unpack(content[offset:offset + xattr.size])
- fs[fs_index][JFFS2_NODETYPE_XREF].append(xattr)
- if verbose:
- print '0x%08X:' % (offset), xattr
- elif unknown_node.nodetype == JFFS2_NODETYPE_SUMMARY:
- summary = Jffs2_raw_summary()
- summary.unpack(content[offset:offset + summary.size])
- summaries.append(summary)
- fs[fs_index][JFFS2_NODETYPE_SUMMARY].append(summary)
+ inode.unpack(content_mv[0 + offset :])
+
+ if inode.ino in fs[JFFS2_NODETYPE_INODE]:
+ fs[JFFS2_NODETYPE_INODE][inode.ino].append(inode)
+ else:
+ fs[JFFS2_NODETYPE_INODE][inode.ino] = [inode]
if verbose:
- print '0x%08X:' % (offset), summary
+ print("0x%08X:" % (offset), inode)
elif unknown_node.nodetype == JFFS2_NODETYPE_CLEANMARKER:
pass
elif unknown_node.nodetype == JFFS2_NODETYPE_PADDING:
pass
+ elif unknown_node.nodetype == JFFS2_NODETYPE_SUMMARY:
+ pass
+ elif unknown_node.nodetype == JFFS2_NODETYPE_XATTR:
+ pass
+ elif unknown_node.nodetype == JFFS2_NODETYPE_XREF:
+ pass
else:
- print 'Unhandled node type', unknown_node.nodetype, unknown_node
- return fs.values()
+ print("Unknown node type", unknown_node.nodetype, unknown_node)
+ content_mv.release()
+ return fs
def get_device(inode):
@@ -358,32 +343,34 @@
if inode.dsize == len(Jffs2_device_node_new):
node = Jffs2_device_node_new()
node.unpack(inode.data)
- return os.makedev((node.new_id & 0xfff00) >> 8, (node.new_id & 0xff) | ((node.new_id >> 12) & 0xfff00))
- elif inode.dsize == len(Jffs2_device_node_old):
+ return os.makedev(
+ (node.new_id & 0xFFF00) >> 8,
+ (node.new_id & 0xFF) | ((node.new_id >> 12) & 0xFFF00),
+ )
+
+ if inode.dsize == len(Jffs2_device_node_old):
node = Jffs2_device_node_old()
node.unpack(inode.data)
- return os.makedev((node.old_id >> 8) & 0xff, node.old_id & 0xff)
+ return os.makedev((node.old_id >> 8) & 0xFF, node.old_id & 0xFF)
return None
+def sort_version(item):
+ return item.version
def dump_fs(fs, target):
node_dict = {}
- set_endianness(fs["endianness"])
-
- for dirent in fs[JFFS2_NODETYPE_DIRENT]:
+ for dirent in fs[JFFS2_NODETYPE_DIRENT].values():
dirent.inodes = []
- for inode in fs[JFFS2_NODETYPE_INODE]:
- if inode.ino == dirent.ino:
- dirent.inodes.append(inode)
- if dirent.ino in node_dict:
- print 'duplicate dirent.ino use detected!!!', dirent
+ for ino, inodes in fs[JFFS2_NODETYPE_INODE].items():
+ if ino == dirent.ino:
+ dirent.inodes = sorted(inodes, key=sort_version)
node_dict[dirent.ino] = dirent
- for dirent in fs[JFFS2_NODETYPE_DIRENT]:
+ for dirent in fs[JFFS2_NODETYPE_DIRENT].values():
pnode_pino = dirent.pino
pnodes = []
- for i in range(100):
+ for _ in range(100):
if pnode_pino not in node_dict:
break
pnode = node_dict[pnode_pino]
@@ -394,105 +381,110 @@
node_names = []
for pnode in pnodes:
- node_names.append(pnode.name)
- node_names.append(dirent.name)
- path = '/'.join(node_names)
+ node_names.append(pnode.name.decode())
+ node_names.append(dirent.name.decode())
+ path = "/".join(node_names)
+
+ target_path = os.path.realpath(os.path.join(target, path))
+
+ if not is_safe_path(target, target_path):
+ print(f"Path traversal attempt to {target_path}, discarding.")
+ continue
- target_path = os.path.join(os.getcwd(), target, path)
for inode in dirent.inodes:
try:
if stat.S_ISDIR(inode.mode):
- print 'writing S_ISDIR', path
+ print("writing S_ISDIR", path)
if not os.path.isdir(target_path):
os.makedirs(target_path)
elif stat.S_ISLNK(inode.mode):
- print 'writing S_ISLNK', path
+ print("writing S_ISLNK", path)
if not os.path.islink(target_path):
if os.path.exists(target_path):
- print 'file already exists as', inode.data
continue
os.symlink(inode.data, target_path)
elif stat.S_ISREG(inode.mode):
- print 'writing S_ISREG', path
+ print("writing S_ISREG", path)
if not os.path.isfile(target_path):
if not os.path.isdir(os.path.dirname(target_path)):
os.makedirs(os.path.dirname(target_path))
- with open(target_path, 'wb') as fd:
+ with open(target_path, "wb") as fd:
for inode in dirent.inodes:
fd.seek(inode.offset)
fd.write(inode.data)
os.chmod(target_path, stat.S_IMODE(inode.mode))
break
elif stat.S_ISCHR(inode.mode):
- print 'writing S_ISBLK', path
+ print("writing S_ISBLK", path)
os.mknod(target_path, inode.mode, get_device(inode))
elif stat.S_ISBLK(inode.mode):
- print 'writing S_ISBLK', path
+ print("writing S_ISBLK", path)
os.mknod(target_path, inode.mode, get_device(inode))
elif stat.S_ISFIFO(inode.mode):
- print 'skipping S_ISFIFO', path
+ print("skipping S_ISFIFO", path)
elif stat.S_ISSOCK(inode.mode):
- print 'skipping S_ISSOCK', path
+ print("skipping S_ISSOCK", path)
else:
- print 'unhandled inode.mode: %o' % inode.mode, inode, dirent
-
- except IOError as e:
- print "I/O error(%i): %s" % (e.errno, e.strerror), inode, dirent
+ print("unhandled inode.mode: %o" % inode.mode, inode, dirent)
- except OSError as e:
- print "OS error(%i): %s" % (e.errno, e.strerror), inode, dirent
+ except OSError as error:
+ print("OS error(%i): %s" % (error.errno, error.strerror), inode, dirent)
def main():
- import argparse
parser = argparse.ArgumentParser()
- parser.add_argument('-v', '--verbose', help='increase output verbosity',
- action="store_true")
- parser.add_argument('-f', '--force', help='overwrite destination directory',
- action="store_true")
- parser.add_argument('filesystem', type=str,
- help="path to filesystem")
- parser.add_argument('-d', '--dest', type=str, default='jffs2-root',
- help='destination directory (default: jffs-root)')
+ parser.add_argument(
+ "-v", "--verbose", help="increase output verbosity", action="store_true"
+ )
+ parser.add_argument(
+ "-f", "--force", help="overwrite destination directory", action="store_true"
+ )
+ parser.add_argument("filesystem", type=str, help="path to filesystem")
+ parser.add_argument(
+ "-d",
+ "--dest",
+ type=str,
+ default="jffs2-root",
+ help="destination directory (default: jffs-root)",
+ )
args = parser.parse_args()
dest_path = os.path.join(os.getcwd(), args.dest)
if os.path.exists(dest_path):
if not args.force:
- print 'Destination path already exists!'
+ print("Destination path already exists!")
return
else:
os.mkdir(dest_path)
- content = open(args.filesystem, 'rb').read()
- fs_list = scan_fs(content, cstruct.BIG_ENDIAN, verbose=args.verbose)
- fs_list += scan_fs(content, cstruct.LITTLE_ENDIAN, verbose=args.verbose)
-
- fs_index = 1
- for fs in fs_list:
- if not fs[JFFS2_NODETYPE_DIRENT]:
- continue
+ with contextlib.ExitStack() as context_stack:
+ filesystem = context_stack.enter_context(open(args.filesystem, "rb"))
+ filesystem_len = os.fstat(filesystem.fileno()).st_size
+ if 0 == filesystem_len:
+ return
+ content = context_stack.enter_context(
+ mmap.mmap(filesystem.fileno(), filesystem_len, access=mmap.ACCESS_READ)
+ )
+ magic = struct.unpack("<H", content[0:2])[0]
+ if magic in [JFFS2_OLD_MAGIC_BITMASK, JFFS2_MAGIC_BITMASK]:
+ endianness = cstruct.LITTLE_ENDIAN
+ else:
+ endianness = cstruct.BIG_ENDIAN
+
+ set_endianness(endianness)
+
+ fs = scan_fs(content, endianness, verbose=args.verbose)
+ print("dumping fs to %s (endianness: %s)" % (dest_path, endianness))
+ for key, value in fs.items():
+ print("%s count: %i" % (NODETYPES[key].__name__, len(value)))
+
+ if not os.path.exists(dest_path):
+ os.mkdir(dest_path)
- dest_path_fs = os.path.join(dest_path, 'fs_%i' % fs_index)
- print 'dumping fs #%i to %s' % (fs_index, dest_path_fs)
- for key, value in fs.iteritems():
- if key == "endianness":
- if value == cstruct.BIG_ENDIAN:
- print 'Endianness: Big'
- elif value == cstruct.LITTLE_ENDIAN:
- print 'Endianness: Little'
- continue
-
- print '%s count: %i' % (NODETYPES[key].__name__, len(value))
-
- if not os.path.exists(dest_path_fs):
- os.mkdir(dest_path_fs)
-
- dump_fs(fs, dest_path_fs)
- print '-' * 10
- fs_index += 1
+ dump_fs(fs, dest_path)
+ print("-" * 10)
-if __name__ == '__main__':
+if __name__ == "__main__":
main()
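One security-relevant addition in the script diff above is the `is_safe_path` guard, which jefferson 0.4.1 applies before writing each entry so that a hostile dirent name (e.g. one containing `../../`) cannot escape the extraction directory. The guard can be exercised in isolation; the paths below are illustrative:

```python
import os

def is_safe_path(basedir, real_path):
    # same check the updated jefferson script runs before writing a file
    basedir = os.path.realpath(basedir)
    return basedir == os.path.commonpath((basedir, real_path))

outdir = os.path.realpath("jffs2-root")  # illustrative extraction target
inside = os.path.realpath(os.path.join(outdir, "etc/passwd"))
escape = os.path.realpath(os.path.join(outdir, "../../../etc/passwd"))

assert is_safe_path(outdir, inside)       # stays under jffs2-root
assert not is_safe_path(outdir, escape)   # traversal attempt is rejected
```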
Script 'mail_helper' called by obssrc
Hello community,
here is the log from the commit of package python-fanficfare for openSUSE:Factory checked in at 2022-10-25 11:20:04
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/python-fanficfare (Old)
and /work/SRC/openSUSE:Factory/.python-fanficfare.new.2275 (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Package is "python-fanficfare"
Tue Oct 25 11:20:04 2022 rev:44 rq:1030937 version:4.17.0
Changes:
--------
--- /work/SRC/openSUSE:Factory/python-fanficfare/python-fanficfare.changes 2022-09-20 19:24:16.906594179 +0200
+++ /work/SRC/openSUSE:Factory/.python-fanficfare.new.2275/python-fanficfare.changes 2022-10-25 11:20:29.182200339 +0200
@@ -1,0 +2,15 @@
+Mon Oct 24 17:22:25 UTC 2022 - Matej Cepl <mcepl(a)suse.com>
+
+- Update to 4.17.0:
+ - Update Translations
+ - Fix site name fanfiction.tenhawkpresents.ink
+ - Flaresolverr v3 beta using 'expiry' cookie key, was
+ 'expires'.
+ - Flaresolverr v3 beta doesn't have 'headers'??
+ - adapter_adultfanfictionorg: Fixes for site changes.
+ - Disable Cancel during metadata update ProgBar.
+ - adapter_chosentwofanficcom: Site has several links to each
+ story in a series page.
+ - Fixes for add_category/genre_when_multi_category settings.
+
+-------------------------------------------------------------------
Old:
----
FanFicFare-4.16.0.tar.gz
_service
_servicedata
New:
----
FanFicFare-4.17.0.tar.gz
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Other differences:
------------------
++++++ python-fanficfare.spec ++++++
--- /var/tmp/diff_new_pack.2Tct66/_old 2022-10-25 11:20:32.154206927 +0200
+++ /var/tmp/diff_new_pack.2Tct66/_new 2022-10-25 11:20:32.162206944 +0200
@@ -21,14 +21,13 @@
%define skip_python2 1
%{?!python_module:%define python_module() python-%{**} python3-%{**}}
Name: python-fanficfare
-Version: 4.16.0
+Version: 4.17.0
Release: 0
Summary: Tool for making eBooks from stories on fanfiction and other web sites
License: GPL-3.0-only
Group: Development/Languages/Python
URL: https://github.com/JimmXinu/FanFicFare
Source: https://github.com/JimmXinu/%{modname}/archive/v%{version}/%{modname}-%{ver…
-# Source: %%{modname}-%%{version}.tar.gz
BuildRequires: %{python_module beautifulsoup4}
BuildRequires: %{python_module chardet}
BuildRequires: %{python_module cloudscraper}
@@ -93,6 +92,7 @@
%license LICENSE
%doc DESCRIPTION.rst README.md
%python_alternative %{_bindir}/%{modnamedown}
-%{python_sitelib}/*
+%{python_sitelib}/%{modname}-%{version}*-info
+%{python_sitelib}/%{modnamedown}
%changelog
++++++ FanFicFare-4.16.0.tar.gz -> FanFicFare-4.17.0.tar.gz ++++++
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/FanFicFare-4.16.0/calibre-plugin/__init__.py new/FanFicFare-4.17.0/calibre-plugin/__init__.py
--- old/FanFicFare-4.16.0/calibre-plugin/__init__.py 2022-09-19 19:20:42.000000000 +0200
+++ new/FanFicFare-4.17.0/calibre-plugin/__init__.py 2022-10-18 18:47:27.000000000 +0200
@@ -33,7 +33,7 @@
from calibre.customize import InterfaceActionBase
# pulled out from FanFicFareBase for saving in prefs.py
-__version__ = (4, 16, 0)
+__version__ = (4, 17, 0)
## Apparently the name for this class doesn't matter--it was still
## 'demo' for the first few versions.
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/FanFicFare-4.16.0/calibre-plugin/dialogs.py new/FanFicFare-4.17.0/calibre-plugin/dialogs.py
--- old/FanFicFare-4.16.0/calibre-plugin/dialogs.py 2022-09-19 19:20:42.000000000 +0200
+++ new/FanFicFare-4.17.0/calibre-plugin/dialogs.py 2022-10-18 18:47:27.000000000 +0200
@@ -634,13 +634,15 @@
finish_function,
init_label=_("Fetching metadata for stories..."),
win_title=_("Downloading metadata for stories"),
- status_prefix=_("Fetched metadata for")):
+ status_prefix=_("Fetched metadata for"),
+ disable_cancel=False):
ld = _LoopProgressDialog(gui,
book_list,
foreach_function,
init_label,
win_title,
- status_prefix)
+ status_prefix,
+ disable_cancel)
# Mac OS X gets upset if the finish_function is called from inside
# the real _LoopProgressDialog class.
@@ -658,7 +660,8 @@
foreach_function,
init_label=_("Fetching metadata for stories..."),
win_title=_("Downloading metadata for stories"),
- status_prefix=_("Fetched metadata for")):
+ status_prefix=_("Fetched metadata for"),
+ disable_cancel=False):
QProgressDialog.__init__(self,
init_label,
_('Cancel'), 0, len(book_list), gui)
@@ -677,6 +680,11 @@
self.setLabelText('%s %d / %d' % (self.status_prefix, self.i, len(self.book_list)))
self.setValue(self.i)
+ if disable_cancel:
+ self.setCancelButton(None)
+ self.reject = self.disabled_reject
+ self.closeEvent = self.disabled_closeEvent
+
## self.do_loop does QTimer.singleShot on self.do_loop also.
## A weird way to do a loop, but that was the example I had.
## 100 instead of 0 on the first go due to Win10(and later
@@ -684,6 +692,15 @@
QTimer.singleShot(100, self.do_loop)
self.exec_()
+ # used when disable_cancel = True
+ def disabled_reject(self):
+ pass
+
+ # used when disable_cancel = True
+ def disabled_closeEvent(self, event):
+ if event.spontaneous():
+ event.ignore()
+
def updateStatus(self):
remaining_time_string = ''
if self.show_est_time and self.i > -1:
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/FanFicFare-4.16.0/calibre-plugin/fff_plugin.py new/FanFicFare-4.17.0/calibre-plugin/fff_plugin.py
--- old/FanFicFare-4.16.0/calibre-plugin/fff_plugin.py 2022-09-19 19:20:42.000000000 +0200
+++ new/FanFicFare-4.17.0/calibre-plugin/fff_plugin.py 2022-10-18 18:47:27.000000000 +0200
@@ -2139,7 +2139,8 @@
partial(self.update_books_finish, options=options),
init_label=_("Updating calibre for FanFiction stories..."),
win_title=_("Update calibre for FanFiction stories"),
- status_prefix=_("Updated"))
+ status_prefix=_("Updated"),
+ disable_cancel=True)
def update_error_column(self,payload):
'''Update custom error column if configured.'''
@@ -2155,7 +2156,8 @@
partial(self.update_books_finish, options=options),
init_label=_("Updating calibre for BAD FanFiction stories..."),
win_title=_("Update calibre for BAD FanFiction stories"),
- status_prefix=_("Updated"))
+ status_prefix=_("Updated"),
+ disable_cancel=True)
def update_error_column_loop(self,book,db=None,errorcol_label=None,lastcheckedcol_label=None):
if book['calibre_id'] and errorcol_label:
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/FanFicFare-4.16.0/calibre-plugin/plugin-defaults.ini new/FanFicFare-4.17.0/calibre-plugin/plugin-defaults.ini
--- old/FanFicFare-4.16.0/calibre-plugin/plugin-defaults.ini 2022-09-19 19:20:42.000000000 +0200
+++ new/FanFicFare-4.17.0/calibre-plugin/plugin-defaults.ini 2022-10-18 18:47:27.000000000 +0200
@@ -1761,7 +1761,7 @@
website_encodings:Windows-1252,utf8
-[fanfic.tenhawkpresents.ink]
+[fanfiction.tenhawkpresents.ink]
use_basic_cache:true
## Some sites require login (or login for some rated stories) The
## program can prompt you, or you can save it in config. In
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/FanFicFare-4.16.0/calibre-plugin/translations/de.po new/FanFicFare-4.17.0/calibre-plugin/translations/de.po
--- old/FanFicFare-4.16.0/calibre-plugin/translations/de.po 2022-09-19 19:20:42.000000000 +0200
+++ new/FanFicFare-4.17.0/calibre-plugin/translations/de.po 2022-10-18 18:47:27.000000000 +0200
@@ -2,6 +2,7 @@
# Copyright (C) YEAR ORGANIZATION
#
# Translators:
+# Dustin Steiner, 2022
# Ettore Atalan <atalanttore(a)googlemail.com>, 2014-2016,2018,2020
# ILB, 2014-2017,2020-2022
# Johannes Schöpp <mail(a)jschpp.de>, 2020
@@ -18,7 +19,7 @@
"Project-Id-Version: calibre-plugins\n"
"POT-Creation-Date: 2022-07-06 11:14-0500\n"
"PO-Revision-Date: 2014-06-19 22:55+0000\n"
-"Last-Translator: ILB, 2014-2017,2020-2022\n"
+"Last-Translator: Dustin Steiner, 2022\n"
"Language-Team: German (http://www.transifex.com/calibre/calibre-plugins/language/de/)\n"
"MIME-Version: 1.0\n"
"Content-Type: text/plain; charset=UTF-8\n"
@@ -1572,7 +1573,7 @@
#: dialogs.py:1036
msgid "Are you sure you want to remove the selected %d books from the list?"
-msgstr "Sind sie sicher, dass sie die ausgewählten Bücher von der Liste löschen wollen?"
+msgstr "Sind Sie sicher, dass Sie die %d ausgewählten Bücher von der Liste löschen wollen?"
#: dialogs.py:1062
msgid "Note"
@@ -1588,7 +1589,7 @@
#: dialogs.py:1112
msgid "Are you sure you want to remove the %d selected URLs from the list?"
-msgstr "Sind sie sicher, dass sie die %d ausgewählten URLs von der Liste löschen möchten?"
+msgstr "Sind Sie sicher, dass Sie die %d ausgewählten URLs von der Liste löschen möchten?"
#: dialogs.py:1130
msgid "List of Books to Reject"
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/FanFicFare-4.16.0/calibre-plugin/translations/ko.po new/FanFicFare-4.17.0/calibre-plugin/translations/ko.po
--- old/FanFicFare-4.16.0/calibre-plugin/translations/ko.po 2022-09-19 19:20:42.000000000 +0200
+++ new/FanFicFare-4.17.0/calibre-plugin/translations/ko.po 2022-10-18 18:47:27.000000000 +0200
@@ -2,14 +2,14 @@
# Copyright (C) YEAR ORGANIZATION
#
# Translators:
-# Junghee Lee <daemul72(a)gmail.com>, 2022
-# Junghee Lee <daemul72(a)gmail.com>, 2021
+# JungHee Lee <daemul72(a)gmail.com>, 2022
+# JungHee Lee <daemul72(a)gmail.com>, 2021
msgid ""
msgstr ""
"Project-Id-Version: calibre-plugins\n"
"POT-Creation-Date: 2022-07-06 11:14-0500\n"
"PO-Revision-Date: 2014-06-19 22:55+0000\n"
-"Last-Translator: Junghee Lee <daemul72(a)gmail.com>, 2022\n"
+"Last-Translator: JungHee Lee <daemul72(a)gmail.com>, 2022\n"
"Language-Team: Korean (http://www.transifex.com/calibre/calibre-plugins/language/ko/)\n"
"MIME-Version: 1.0\n"
"Content-Type: text/plain; charset=UTF-8\n"
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/FanFicFare-4.16.0/calibre-plugin/translations/pl.po new/FanFicFare-4.17.0/calibre-plugin/translations/pl.po
--- old/FanFicFare-4.16.0/calibre-plugin/translations/pl.po 2022-09-19 19:20:42.000000000 +0200
+++ new/FanFicFare-4.17.0/calibre-plugin/translations/pl.po 2022-10-18 18:47:27.000000000 +0200
@@ -6,12 +6,13 @@
# Marcin Kozioł <koziol.martin(a)gmail.com>, 2019-2020
# Ola Kleniewska <anyzeklove(a)gmail.com>, 2016
# Piotr Strębski <strebski(a)gmail.com>, 2015
+# The Name <moje.konto+transifex(a)posteo.org>, 2022
msgid ""
msgstr ""
"Project-Id-Version: calibre-plugins\n"
"POT-Creation-Date: 2022-07-06 11:14-0500\n"
"PO-Revision-Date: 2014-06-19 22:55+0000\n"
-"Last-Translator: Marcin Kozioł <koziol.martin(a)gmail.com>, 2019-2020\n"
+"Last-Translator: The Name <moje.konto+transifex(a)posteo.org>, 2022\n"
"Language-Team: Polish (http://www.transifex.com/calibre/calibre-plugins/language/pl/)\n"
"MIME-Version: 1.0\n"
"Content-Type: text/plain; charset=UTF-8\n"
@@ -307,7 +308,7 @@
#: config.py:563
msgid "Success"
-msgstr ""
+msgstr "Sukces"
#: config.py:564
msgid "Mark successfully downloaded or updated books."
@@ -315,7 +316,7 @@
#: config.py:569
msgid "Failed"
-msgstr ""
+msgstr "Nie powiodło się"
#: config.py:570
msgid "Mark failed downloaded or updated books."
@@ -323,7 +324,7 @@
#: config.py:575
msgid "Chapter Error"
-msgstr ""
+msgstr "Błąd Rozdziału"
#: config.py:576
msgid ""
@@ -1824,7 +1825,7 @@
#: fff_plugin.py:614 fff_plugin.py:1960 fff_plugin.py:2584 fff_plugin.py:2596
#: fff_plugin.py:2607 fff_plugin.py:2613 fff_plugin.py:2626
msgid "Warning"
-msgstr ""
+msgstr "Ostrzeżenie"
#: fff_plugin.py:622
msgid "(%d Story URLs Skipped, on Rejected URL List)"
@@ -2231,7 +2232,7 @@
#: fff_plugin.py:1788
msgid "Info"
-msgstr ""
+msgstr "Informacja"
#: fff_plugin.py:1830
msgid "Story Details:"
@@ -2271,7 +2272,7 @@
#: fff_plugin.py:1964
msgid "FanFicFare: "
-msgstr ""
+msgstr "FanFicFare:"
#: fff_plugin.py:1964
msgid "No Good Stories for Anthology"
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/FanFicFare-4.16.0/calibre-plugin/translations/uk.po new/FanFicFare-4.17.0/calibre-plugin/translations/uk.po
--- old/FanFicFare-4.16.0/calibre-plugin/translations/uk.po 2022-09-19 19:20:42.000000000 +0200
+++ new/FanFicFare-4.17.0/calibre-plugin/translations/uk.po 2022-10-18 18:47:27.000000000 +0200
@@ -32,11 +32,11 @@
#: common_utils.py:250
msgid "Keyboard shortcuts"
-msgstr "���������������������� ������������ ��������������"
+msgstr "���������������������� ��������������������"
#: common_utils.py:279
msgid "Prefs Viewer dialog"
-msgstr "������������������������ ���������� ������������������"
+msgstr "���������� ������������������ ����������������������"
#: common_utils.py:280
msgid "Preferences for: "
@@ -48,7 +48,7 @@
#: common_utils.py:313
msgid "Clear all settings for this plugin"
-msgstr "���������������� ������ ������������������������ ������ ���������� ��������������"
+msgstr "���������������� ������ ������������������������ ������ ���������� ��������������"
#: common_utils.py:317
msgid "Edit"
@@ -84,11 +84,11 @@
msgid ""
"Any settings in other libraries or stored in a JSON file in your calibre "
"plugins folder will not be touched."
-msgstr "��������-������ ������������������������ �� ���������� ����������������������, ������ ������������������ �� ���������� JSON �� ���������� ���������� ���������������� ���� ������������ ��������������."
+msgstr "��������-������ ������������������������ �� ���������� ���������������������� ������ ������������������ �� ���������� JSON �� ���������� �������� ���������������� calibre ���� ������������ ��������������."
#: common_utils.py:363 common_utils.py:391
msgid "You must restart calibre afterwards."
-msgstr "���������� ���������� ���� �������������� ������������������������������ Calibre."
+msgstr "���������� ���������� ���� �������������� ������������������������������ calibre."
#: common_utils.py:371
msgid "All settings for this plugin in this library have been saved."
@@ -96,21 +96,21 @@
#: common_utils.py:372 common_utils.py:401
msgid "Please restart calibre now."
-msgstr "��������-���������� ���������������������������� Calibre."
+msgstr "�������� ���������� ���������������������������� Calibre."
#: common_utils.py:374 common_utils.py:403
msgid "Restart calibre now"
-msgstr "������������������������������ Calibre"
+msgstr "������������������������������ calibre"
#: common_utils.py:389
msgid ""
"Are you sure you want to clear your settings in this library for this "
"plugin?"
-msgstr "���� ����������������, ���� �������������� ���������������� �������� ������������������������ �� ������ ���������������������� ������ ������������ ��������������?"
+msgstr "���� ����������������, ���� �������������� ���������������� �������� ������������������������ �� ������ �������������������� ������ ���������� ��������������?"
#: common_utils.py:400
msgid "All settings for this plugin in this library have been cleared."
-msgstr "������ ������������������������ ������ ���������� �������������� �� ������ �������������������� �������� ��������������."
+msgstr "������ ������������������������ ������ ���������� �������������� �� ������ �������������������� �������� ��������������."
#: config.py:225
msgid "List of Supported Sites"
@@ -138,7 +138,7 @@
#: config.py:267
msgid "Email Settings"
-msgstr "������������������������ ����������"
+msgstr "������������������ ����������"
#: config.py:270
msgid "Other"
@@ -250,7 +250,7 @@
#: config.py:516 config.py:688
msgid "Reject Without Confirmation?"
-msgstr "������������������ ������ ��������������������������"
+msgstr "������������������ ������ ��������������������������?"
#: config.py:517
msgid ""
@@ -365,7 +365,7 @@
#: config.py:616
msgid "Automatically Convert new/update books?"
-msgstr "���������������������� ������������������������ ��������/������������������ ������������?"
+msgstr "���������������������� �������������������������� ��������/���������������� ����������?"
#: config.py:617
msgid ""
@@ -380,11 +380,11 @@
#: config.py:625
msgid "Take URLs from Clipboard?"
-msgstr "�������������������� URL-������������ �� ������������ ������������?"
+msgstr "�������������������� ������������ �� ������������ ������������?"
#: config.py:626
msgid "Prefill URLs from valid URLs in Clipboard when Adding New."
-msgstr "������������������ �������������������� URL-���������� �� �������������� URL-���������� �� ������������ ������������ ������ ������������������ ����������."
+msgstr "������������������ �������������������� ���������� �� ������������������ ���������� �� ������������ ������������ ������ ������������������ ����������."
#: config.py:630
msgid "FanFicFare button opens menu?"
@@ -398,7 +398,7 @@
#: config.py:635
msgid "Default to Update when books selected?"
-msgstr "���� �������������������������� ����������������, �������� ���������� ������������?"
+msgstr "������������ ��������������������, �������� ���������� ������������������?"
#: config.py:636
msgid ""
@@ -451,11 +451,11 @@
#: config.py:669
msgid "Reject List"
-msgstr "������������ ����������������������"
+msgstr "������������ ��������������������"
#: config.py:673
msgid "Edit Reject URL List"
-msgstr "�������������������������� ������������ �������������������� ����������������"
+msgstr "�������������� ������������ ����������, ������ �������� ������������������"
#: config.py:674
msgid "Edit list of URLs FanFicFare will automatically Reject."
@@ -463,15 +463,15 @@
#: config.py:678 config.py:757
msgid "Add Reject URLs"
-msgstr "������������ ������������������ ������������������"
+msgstr "������������ ������������ ��������������������"
#: config.py:679
msgid "Add additional URLs to Reject as text."
-msgstr "������������ ������������������ ������������������ ���� ����������, ������ ���������������� ������������������ "
+msgstr "������������ ������������������ ������������ ������ �������������������� ���� ����������."
#: config.py:683
msgid "Edit Reject Reasons List"
-msgstr "�������������������������� ������������ ������������ ������ ��������������������"
+msgstr "�������������� ������������ ������������ ������ ��������������������"
#: config.py:684 config.py:747
msgid "Customize the Reasons presented when Rejecting URLs"
@@ -491,7 +491,7 @@
#: config.py:732
msgid "Edit Reject URLs List"
-msgstr "�������������������������� ������������ �������������������� ����������������"
+msgstr "�������������� ������������ ���������� ������������������������������������"
#: config.py:745
msgid "Reject Reasons"
@@ -603,11 +603,11 @@
#: config.py:873
msgid "Plugin Defaults"
-msgstr "������������ ������������������������ ��������������"
+msgstr "������������ ������������������������ ��������������"
#: config.py:874
msgid "Plugin Defaults (%s) (Read-Only)"
-msgstr "������������ ������������������������ �������������� (%s) (������������ ��������������������)"
+msgstr "������������ ������������������������ �������������� (%s) (�������� ��������������)"
#: config.py:885
msgid "View 'Safe' personal.ini"
@@ -1113,13 +1113,13 @@
#: config.py:1565
msgid "Force Title into Title Sort?"
-msgstr "������������������ ������������������������ ���������������� ���������� �� �������������������������� ��������?"
+msgstr "������������������ ������������������������ ���������� ������ �������������������������� ���� ��������������?"
#: config.py:1566
msgid ""
"If checked, the title as given will be used for the Title Sort, too.\n"
"If not checked, calibre will apply it's built in algorithm which makes 'The Title' sort as 'Title, The', etc."
-msgstr "�������� ������������������ ������������������������, ���������� ���������� �������� ���������������������������������� ������ �������������������������� ��������.\n�������� ���� ����������������������, calibre �������� �������������������������� �������������������� �� ���������� ����������������, �������� ������������ ���������� 'The Title', ���� 'Title, The' �� ��.��."
+msgstr "�������� ������������������ ������������������������, ���������� ���������� �������� ���������������������������������� ������ �������������������������� ���� ��������������.\n�������� ���� ����������������������, calibre �������� �������������������������� �������������������� �� ���������� ����������������, �������� �������������������� ���������� ��The Title�� ���� ��Title, The�� ��������."
#: config.py:1569
msgid "Fix Title Case?"
@@ -2417,7 +2417,7 @@
#: jobs.py:85
msgid "Downloading FanFiction Stories"
-msgstr "������������������������ FanFiction ��������������"
+msgstr "������������������������ ������������������ FanFiction"
#: jobs.py:105
msgid "%(count)d of %(total)d stories finished downloading"
@@ -2462,7 +2462,7 @@
#: prefs.py:27
msgid "Add New Book"
-msgstr "������������ �������� ����������"
+msgstr "������������ �������� ����������"
#: prefs.py:28
msgid "Update EPUB if New Chapters"
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/FanFicFare-4.16.0/fanficfare/adapters/adapter_adultfanfictionorg.py new/FanFicFare-4.17.0/fanficfare/adapters/adapter_adultfanfictionorg.py
--- old/FanFicFare-4.16.0/fanficfare/adapters/adapter_adultfanfictionorg.py 2022-09-19 19:20:42.000000000 +0200
+++ new/FanFicFare-4.17.0/fanficfare/adapters/adapter_adultfanfictionorg.py 2022-10-18 18:47:27.000000000 +0200
@@ -258,18 +258,23 @@
asoup = self.make_soup(adata)
- ##Getting the number of pages
- pages=asoup.find('div',{'class' : 'pagination'}).findAll('li')[-1].find('a')
- if not pages == None:
- pages = pages['href'].split('=')[-1]
- else:
- pages = 0
+ ##Getting the number of author pages
+ pages = 0
+ pagination=asoup.find('ul',{'class' : 'pagination'})
+ if pagination:
+ pages = pagination.findAll('li')[-1].find('a')
+ if not pages == None:
+ pages = pages['href'].split('=')[-1]
+ else:
+ pages = 0
+ storya = None
##If there is only 1 page of stories, check it to get the Metadata,
if pages == 0:
a = asoup.findAll('li')
for lc2 in a:
if lc2.find('a', href=re.compile(r'story.php\?no='+self.story.getMetadata('storyId')+"$")):
+ storya = lc2
break
## otherwise go through the pages
else:
@@ -293,6 +298,7 @@
for lc2 in a:
if lc2.find('a', href=re.compile(r'story.php\?no='+self.story.getMetadata('storyId')+"$")):
i=1
+ storya = lc2
break
page = page + 1
if page > int(pages):
@@ -305,14 +311,14 @@
##There is also a double <br/>, so we have to fix that, then remove the leading and trailing '-:-'.
##They are always in the same order.
## EDIT 09/26/2016: Had some trouble with unicode errors... so I had to put in the decode/encode parts to fix it
- liMetadata = unicode(lc2).replace('\n','').replace('\r','').replace('\t',' ').replace(' ',' ').replace(' ',' ').replace(' ',' ')
+ liMetadata = unicode(storya).replace('\n','').replace('\r','').replace('\t',' ').replace(' ',' ').replace(' ',' ').replace(' ',' ')
liMetadata = stripHTML(liMetadata.replace(r'<br/>','-:-').replace('<!-- <br /-->','-:-'))
liMetadata = liMetadata.strip('-:-').strip('-:-').encode('utf-8')
for i, value in enumerate(liMetadata.decode('utf-8').split('-:-')):
if i == 0:
# The value for the title has been manipulated, so may not be the same as gotten at the start.
- # I'm going to use the href from the lc2 retrieved from the author's page to determine if it is correct.
- if lc2.find('a', href=re.compile(r'story.php\?no='+self.story.getMetadata('storyId')+"$"))['href'] != url:
+ # I'm going to use the href from the storya retrieved from the author's page to determine if it is correct.
+ if storya.find('a', href=re.compile(r'story.php\?no='+self.story.getMetadata('storyId')+"$"))['href'] != url:
raise exceptions.StoryDoesNotExist('Did not find story in author story list: {0}'.format(author_Url))
elif i == 1:
##Get the description
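The adapter change above guards against an `AttributeError` when the author page has no pagination block before parsing the page count out of the last link's `href`. A plain-Python sketch of that guard-then-parse pattern (function name, dict shape, and example href are assumptions, not the site's real markup):

```python
def page_count(pagination_links):
    """Return the page count parsed from the last link's href,
    or 0 when the site served no pagination block at all."""
    if not pagination_links:
        # single page of stories: no pagination element present
        return 0
    last = pagination_links[-1]
    if last is None:
        return 0
    # hrefs look like 'profile.php?page=7'; the count follows '='
    return int(last['href'].split('=')[-1])

print(page_count([{'href': 'profile.php?page=7'}]))  # 7
print(page_count([]))                                # 0
```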
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/FanFicFare-4.16.0/fanficfare/adapters/adapter_chosentwofanficcom.py new/FanFicFare-4.17.0/fanficfare/adapters/adapter_chosentwofanficcom.py
--- old/FanFicFare-4.16.0/fanficfare/adapters/adapter_chosentwofanficcom.py 2022-09-19 19:20:42.000000000 +0200
+++ new/FanFicFare-4.17.0/fanficfare/adapters/adapter_chosentwofanficcom.py 2022-10-18 18:47:27.000000000 +0200
@@ -199,9 +199,9 @@
storyas = seriessoup.findAll('a', href=re.compile(r'viewstory.php\?sid=\d+'))
i=1
for a in storyas:
- # skip 'report this' and 'TOC' links
- if 'contact.php' not in a['href'] and 'index' not in a['href']:
- if a['href'] == ('viewstory.php?sid='+self.story.getMetadata('storyId')):
+ # this site has several links to each story.
+ if a.text == 'Latest Chapter':
+ if ('viewstory.php?sid='+self.story.getMetadata('storyId')) in a['href']:
self.setSeries(series_name, i)
self.story.setMetadata('seriesUrl',series_url)
break
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/FanFicFare-4.16.0/fanficfare/adapters/adapter_tenhawkpresents.py new/FanFicFare-4.17.0/fanficfare/adapters/adapter_tenhawkpresents.py
--- old/FanFicFare-4.16.0/fanficfare/adapters/adapter_tenhawkpresents.py 2022-09-19 19:20:42.000000000 +0200
+++ new/FanFicFare-4.17.0/fanficfare/adapters/adapter_tenhawkpresents.py 2022-10-18 18:47:27.000000000 +0200
@@ -48,7 +48,7 @@
@staticmethod
def getSiteDomain():
- return 'fanfic.tenhawkpresents.ink'
+ return 'fanfiction.tenhawkpresents.ink'
@classmethod
def getSiteExampleURLs(cls):
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/FanFicFare-4.16.0/fanficfare/cli.py new/FanFicFare-4.17.0/fanficfare/cli.py
--- old/FanFicFare-4.16.0/fanficfare/cli.py 2022-09-19 19:20:42.000000000 +0200
+++ new/FanFicFare-4.17.0/fanficfare/cli.py 2022-10-18 18:47:27.000000000 +0200
@@ -28,7 +28,7 @@
import os, sys, platform
-version="4.16.0"
+version="4.17.0"
os.environ['CURRENT_VERSION_ID']=version
global_cache = 'global_cache'
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/FanFicFare-4.16.0/fanficfare/defaults.ini new/FanFicFare-4.17.0/fanficfare/defaults.ini
--- old/FanFicFare-4.16.0/fanficfare/defaults.ini 2022-09-19 19:20:42.000000000 +0200
+++ new/FanFicFare-4.17.0/fanficfare/defaults.ini 2022-10-18 18:47:27.000000000 +0200
@@ -1782,7 +1782,7 @@
website_encodings:Windows-1252,utf8
-[fanfic.tenhawkpresents.ink]
+[fanfiction.tenhawkpresents.ink]
use_basic_cache:true
## Some sites require login (or login for some rated stories) The
## program can prompt you, or you can save it in config. In
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/FanFicFare-4.16.0/fanficfare/flaresolverr_proxy.py new/FanFicFare-4.17.0/fanficfare/flaresolverr_proxy.py
--- old/FanFicFare-4.16.0/fanficfare/flaresolverr_proxy.py 2022-09-19 19:20:42.000000000 +0200
+++ new/FanFicFare-4.17.0/fanficfare/flaresolverr_proxy.py 2022-10-18 18:47:27.000000000 +0200
@@ -131,7 +131,9 @@
if data is None:
# Without download (or with FlareSolverr v2), don't
# need base64 decode, and image downloads won't work.
- if 'image' in resp.json['solution']['headers']['content-type']:
+ if 'headers' in resp.json['solution'] and \
+ 'content-type' in resp.json['solution']['headers'] and \
+ 'image' in resp.json['solution']['headers']['content-type']:
raise exceptions.HTTPErrorFFF(
url,
428, # 404 & 410 trip StoryDoesNotExist
@@ -183,12 +185,13 @@
## 30000000000 == 2920-08-30 05:20:00. If 900 years isn't
## enough, somebody can fix it then.
## (current global_cookie/
- # logger.debug(c['expires'])
- if c['expires'] > 30000000000:
- c['expires'] = 30000000000
+ expireKey = 'expires' if 'expires' in c else 'expiry'
+ logger.debug("expireKey:%s"%expireKey)
+ if c[expireKey] > 30000000000:
+ c[expireKey] = 30000000000
# logger.debug(c['name'])
# import datetime
- # logger.debug(datetime.datetime.utcfromtimestamp(c['expires']))
+ # logger.debug(datetime.datetime.utcfromtimestamp(c[expireKey]))
retval.append(Cookie(0, # version
c['name'],
@@ -201,8 +204,8 @@
c['path'],
c['path'] == None or c['path'] == '', # path_specified,
c['secure'],
- c['expires'],
- c['expires'] == -1, # discard
+ c[expireKey],
+ c[expireKey] == -1, # discard
None, # comment,
None, # comment_url,
{}, # rest
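The flaresolverr_proxy hunk above handles FlareSolverr responses that report cookie lifetimes under either `expires` or `expiry`, clamping absurd values before building `Cookie` objects. The fallback-and-clamp logic can be sketched on plain dicts (the cap value is taken from the diff; the function name is hypothetical):

```python
MAX_EXPIRES = 30000000000  # 2920-08-30 05:20:00 UTC, per the diff's comment

def normalize_expires(cookie):
    """Pick whichever lifetime key the server sent and clamp it."""
    key = 'expires' if 'expires' in cookie else 'expiry'
    if cookie[key] > MAX_EXPIRES:
        cookie[key] = MAX_EXPIRES
    return cookie[key]

print(normalize_expires({'expiry': 99999999999}))  # clamped to 30000000000
print(normalize_expires({'expires': 1700000000}))  # 1700000000, unchanged
```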
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/FanFicFare-4.16.0/fanficfare/story.py new/FanFicFare-4.17.0/fanficfare/story.py
--- old/FanFicFare-4.16.0/fanficfare/story.py 2022-09-19 19:20:42.000000000 +0200
+++ new/FanFicFare-4.17.0/fanficfare/story.py 2022-10-18 18:47:27.000000000 +0200
@@ -1079,14 +1079,16 @@
## there's more than one category value. Does not work
## consistently well if you try to include_in_ chain genre
## back into category--breaks with fandoms sites like AO3
- if listname == 'genre' and self.getConfig('add_genre_when_multi_category') and len(self.getList('category',
- removeallentities=False,
- # to avoid inf loops if genre/cat substs
- includelist=includelist+[listname],
- doreplacements=False,
- skip_cache=True,
- seen_list=seen_list
- )) > 1:
+ if( listname == 'genre' and self.getConfig('add_genre_when_multi_category')
+ and len(self.getList('category',
+ removeallentities=False,
+ # to avoid inf loops if genre/cat substs
+ includelist=includelist+[listname],
+ doreplacements=False,
+ skip_cache=True,
+ seen_list=seen_list
+ )) > 1
+ and self.getConfig('add_genre_when_multi_category') not in retlist ):
retlist.append(self.getConfig('add_genre_when_multi_category'))
if retlist:
@@ -1111,15 +1113,24 @@
# remove dups and sort.
retlist = sorted(list(set(retlist)))
- ## Add value of add_genre_when_multi_category to
- ## category if there's more than one category
- ## value (before this, obviously). Applied
- ## *after* doReplacements. For normalization
- ## crusaders who want Crossover as a category
- ## instead of genre. Moved after dedup'ing so
- ## consolidated category values don't count.
- if listname == 'category' and self.getConfig('add_category_when_multi_category') and len(retlist) > 1:
- retlist.append(self.getConfig('add_category_when_multi_category'))
+ ## Add value of add_genre_when_multi_category to
+ ## category if there's more than one category
+ ## value (before this, obviously). Applied
+ ## *after* doReplacements. For normalization
+ ## crusaders who want Crossover as a category
+ ## instead of genre. Moved after dedup'ing so
+ ## consolidated category values don't count.
+ if( listname == 'category'
+ and self.getConfig('add_category_when_multi_category')
+ and len(retlist) > 1
+ and self.getConfig('add_category_when_multi_category') not in retlist ):
+ retlist.append(self.getConfig('add_category_when_multi_category'))
+ ## same sort as above, but has to be after due to
+ ## changing list. unique filter not needed: 'not
+ ## in retlist' check
+ if not (listname in ('author','authorUrl','authorId') or self.getConfig('keep_in_order_'+listname)):
+ retlist = sorted(list(set(retlist)))
+
else:
retlist = []
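Both story.py hunks above add the same guard: the configured extra genre/category is appended only when the list already has more than one value and does not already contain it, and the list is then re-deduped and sorted. A minimal sketch of that guard (function name and sample values are hypothetical):

```python
def add_extra_value(retlist, extra):
    """Append `extra` only for multi-value lists that lack it,
    then dedupe and sort, mirroring the guard in the hunk."""
    if extra and len(retlist) > 1 and extra not in retlist:
        retlist.append(extra)
    return sorted(set(retlist))

print(add_extra_value(['Drama', 'Humor'], 'Crossover'))
# ['Crossover', 'Drama', 'Humor']
print(add_extra_value(['Drama'], 'Crossover'))
# ['Drama'] -- single category, nothing added
```

The `not in retlist` check is what the release adds; without it, repeated updates could stack duplicate `Crossover` entries.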
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/FanFicFare-4.16.0/setup.py new/FanFicFare-4.17.0/setup.py
--- old/FanFicFare-4.16.0/setup.py 2022-09-19 19:20:42.000000000 +0200
+++ new/FanFicFare-4.17.0/setup.py 2022-10-18 18:47:27.000000000 +0200
@@ -26,7 +26,7 @@
name=package_name,
# Versions should comply with PEP440.
- version="4.16.0",
+ version="4.17.0",
description='A tool for downloading fanfiction to eBook formats',
long_description=long_description,
Script 'mail_helper' called by obssrc
Hello community,
here is the log from the commit of package python-cstruct for openSUSE:Factory checked in at 2022-10-25 11:20:04
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/python-cstruct (Old)
and /work/SRC/openSUSE:Factory/.python-cstruct.new.2275 (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Package is "python-cstruct"
Tue Oct 25 11:20:04 2022 rev:3 rq:1030926 version:3.3
Changes:
--------
--- /work/SRC/openSUSE:Factory/python-cstruct/python-cstruct.changes 2022-10-14 15:42:19.315896621 +0200
+++ /work/SRC/openSUSE:Factory/.python-cstruct.new.2275/python-cstruct.changes 2022-10-25 11:20:28.434198683 +0200
@@ -1,0 +2,25 @@
+Mon Oct 24 15:29:46 UTC 2022 - Martin Hauke <mardnh(a)gmx.de>
+
+- Update to version 3.3
+ * Fix tests on 32bit architecture
+- Update to version 3.2
+ * Add more tests
+
+-------------------------------------------------------------------
+Fri Oct 14 13:45:17 UTC 2022 - Martin Hauke <mardnh(a)gmx.de>
+
+- Update to version 3.1
+ * Make CStruct/MemCStruct Pickle Friendly
+- Update to version 3.0
+ * Flexible array support
+- Update to version 2.3
+ * Fix compare with None
+- Update to version 2.2
+ Fixes
+ * Fix empty MemCStruct size
+ Improvements
+ * Python 3.10 support
+ * pytest
+ * black code style
+
+-------------------------------------------------------------------
Old:
----
python-cstruct-1.8.tar.gz
New:
----
python-cstruct-3.3.tar.gz
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Other differences:
------------------
++++++ python-cstruct.spec ++++++
--- /var/tmp/diff_new_pack.xKyvMS/_old 2022-10-25 11:20:28.882199674 +0200
+++ /var/tmp/diff_new_pack.xKyvMS/_new 2022-10-25 11:20:28.890199692 +0200
@@ -2,7 +2,7 @@
# spec file for package python-cstruct
#
# Copyright (c) 2022 SUSE LLC
-# Copyright (c) 2020, Martin Hauke <mardnh(a)gmx.de>
+# Copyright (c) 2020-2022, Martin Hauke <mardnh(a)gmx.de>
#
# All modifications and additions to the file contributed by third parties
# remain the property of their copyright owners, unless otherwise agreed
@@ -19,14 +19,13 @@
%{?!python_module:%define python_module() python-%{**} python3-%{**}}
Name: python-cstruct
-Version: 1.8
+Version: 3.3
Release: 0
Summary: C-style structs for Python
License: MIT
Group: Development/Languages/Python
URL: https://github.com/andreax79/python-cstruct
Source: https://github.com/andreax79/python-cstruct/archive/v%{version}.tar.gz#/%{n…
-BuildRequires: %{python_module devel}
BuildRequires: %{python_module pytest}
BuildRequires: %{python_module setuptools}
BuildRequires: fdupes
@@ -46,17 +45,14 @@
%prep
%setup -q
-sed -i -e '/^#!\//, 1d' \
- cstruct/__init__.py \
- cstruct/examples/fdisk.py \
- cstruct/examples/who.py \
- cstruct/tests/test_cstruct.py
%build
%python_build
%install
%python_install
+%python_expand find %{buildroot}%{$python_sitelib} -name "*.py" -exec sed -i -e '/^#!\//, 1d' {} \;
+%python_expand rm -R %{buildroot}%{$python_sitelib}/tests/
%python_expand %fdupes %{buildroot}%{$python_sitelib}
%check
++++++ python-cstruct-1.8.tar.gz -> python-cstruct-3.3.tar.gz ++++++
++++ 4677 lines of diff (skipped)
Script 'mail_helper' called by obssrc
Hello community,
here is the log from the commit of package lua-fennel for openSUSE:Factory checked in at 2022-10-25 11:20:02
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/lua-fennel (Old)
and /work/SRC/openSUSE:Factory/.lua-fennel.new.2275 (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Package is "lua-fennel"
Tue Oct 25 11:20:02 2022 rev:2 rq:1030955 version:1.2.1
Changes:
--------
--- /work/SRC/openSUSE:Factory/lua-fennel/lua-fennel.changes 2021-08-28 22:30:02.534033568 +0200
+++ /work/SRC/openSUSE:Factory/.lua-fennel.new.2275/lua-fennel.changes 2022-10-25 11:20:27.718197099 +0200
@@ -1,0 +2,89 @@
+Fri Oct 21 14:39:15 UTC 2022 - Mia Herkt <mia(a)0x0.st>
+
+- Update to 1.2.1
+New Features
+ * Add `fennel.install` function to the API for adding the
+ searcher
+ * Add missing `table?` predicate to fennel API to go with
+ `list?`, `sym?`, etc
+
+Bug Fixes
+ * Skip autogenerated locals in repl locals saving to avoid
+ exceeding local limit
+ * Ensure `(values)` consistently evaluates to zero values
+ * Fix bug preventing new macros from shadowing other macros
+ * Ensure macros use stable `pairs` table traversal for
+ reproducible builds
+
+- Changes in 1.2.0:
+New Forms
+ * Add `fcollect` macro for range "comprehension"
+
+New Features
+ * Make `include` splice modules in where they're used instead of
+ at the top
+ * Add `ast-source` function to API to get file/line info from
+ AST nodes
+ * Show errors using terminal control codes instead of arrow
+ indicator
+ * Parser now includes column information (byte-based) in AST
+ nodes
+ * For greater consistency, add `&into`/`&until` to certain
+ looping constructs
+
+Bug Fixes
+ * Duplicate table keys no longer crash the compiler
+ * Don't print stack trace for compiler errors in built-in macros
+ * Fix an issue with native modules in `--compile-binary`
+ * Improve argument handling so unused arguments get passed on to
+ script
+ * Fix a bug where macros modifying table literals would emit
+ incorrect output
+ * Fix a bug in the REPL where parser errors display the error
+ message as `nil`
+ * Fix a bug when `nil` were emitted by `unquote` in a macro,
+ and the macro was not compiled correctly because the resulting
+ list length was calculated incorrectly
+ * Fix a REPL bug where `,doc m.foo` did not resolve multisym to
+ macro for macro modules loaded as macro table via
+ `(import-macros m :my.macro.module)`
+
+Changes in 1.1.0:
+New Forms
+ * Add `match-try` macro for chained pattern matching for steps
+ which might fail
+
+New Features
+ * The `fennel.parser` function now accepts a string in addition
+ to an iterator
+ * The `accumulate` macro can now accumulate over multiple values
+ * The `fn` special now accepts a metadata table in place of a
+ docstring
+ * The `,reload mod` repl command can now reload macro modules
+
+Bug Fixes
+ * Fix an issue where built-in macros would modify their AST
+ arguments
+ * Fix a bug where `--skip-include` would mistakenly emit a
+ warning
+ * Remove hex string escapes to preserve PUC Lua 5.1 compatibility
+ * Prevent errors resolving the target of certain repl commands
+ from crashing
+ * Fix a bug where disabling the compiler sandbox broke module
+ require scope
+ * Fix a bug where certain specials wouldn't short-circuit in
+ `and`/`or`
+ * Fix a bug where symbols bound to `nil` did not show up in REPL
+ completion
+
+Changes and Removals
+ * Deprecate the `granulate` and `string-stream` functions in the
+ API
+ * Deprecate the `global` form in favor of using the `_G` table
+
+-------------------------------------------------------------------
+Mon Nov 15 08:14:24 UTC 2021 - Fabio Pesari <fpesari(a)tuxfamily.org>
+
+- Updated to 1.0.0
+
+-------------------------------------------------------------------
@@ -8 +97 @@
-Sat Jun 6 17:22:25 UTC 2020 - Fabio Pesari <fpesari(a)tuxfamily.org> - 0.4.1
+Sat Jun 6 17:22:25 UTC 2020 - Fabio Pesari <fpesari(a)tuxfamily.org>
Old:
----
0.10.0.tar.gz
New:
----
lua-fennel-1.2.1.tar.gz
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Other differences:
------------------
++++++ lua-fennel.spec ++++++
--- /var/tmp/diff_new_pack.P4Scvw/_old 2022-10-25 11:20:28.274198330 +0200
+++ /var/tmp/diff_new_pack.P4Scvw/_new 2022-10-25 11:20:28.282198347 +0200
@@ -1,5 +1,7 @@
+#
# spec file for package lua-fennel
#
+# Copyright (c) 2022 SUSE LLC
# Copyright (c) 2020 Fabio Pesari
#
# All modifications and additions to the file contributed by third parties
@@ -10,32 +12,35 @@
# case the license is the MIT License). An "Open Source License" is a
# license that conforms to the Open Source Definition (Version 1.9)
# published by the Open Source Initiative.
+
+# Please submit bugfixes or comments via https://bugs.opensuse.org/
#
-# Please submit bugfixes or comments via http://bugs.opensuse.org/
+
Name: lua-fennel
-Version: 0.10.0
+Version: 1.2.1
Release: 0
Summary: Lisp dialect that compiles to Lua
License: MIT
Group: Development/Languages/Lua
URL: https://fennel-lang.org/
-Source0: https://git.sr.ht/~technomancy/fennel/archive/%{version}.tar.gz
+Source0: https://git.sr.ht/~technomancy/fennel/archive/%{version}.tar.gz#/%{name}-%{…
BuildRequires: lua
BuildArch: noarch
%description
-Fennel is a lisp that compiles to Lua. It aims to be easy to use,
-expressive, and has almost zero overhead compared to handwritten Lua.
+Fennel is a lisp that compiles to Lua. Features include:
-• Full Lua compatibility - You can use any function or library from Lua.
-• Zero overhead - Compiled code should be just as or more efficient than
-hand-written Lua.
-• Compile-time macros - Ship compiled code with no runtime dependency
-on Fennel.
-• Embeddable - Fennel is a one-file library as well as an executable.
-Embed it in other programs to support runtime extensibility and
-interactive development.
+• Full Lua compatibility - You can use any function or library from
+  Lua.
+• Zero overhead - Compiled code should be just as or more efficient
+  than hand-written Lua.
+• Compile-time macros - Ship compiled code with no runtime
+  dependency on Fennel.
+• Embeddable - Fennel is a one-file library as well as an
+  executable.
+  Embed it in other programs to support runtime extensibility and
+  interactive development.
%prep
%autosetup -p1 -n fennel-%{version}
@@ -44,11 +49,11 @@
%make_build fennel
%check
-make test
+%make_build test
%install
mkdir -p %{buildroot}%{_bindir}
-sed -i s:/usr/bin/env\ lua:/usr/bin/lua: fennel
+sed -i s:%{_bindir}/env\ lua:%{_bindir}/lua: fennel
install -m 755 fennel %{buildroot}%{_bindir}
%files