openSUSE Commits
November 2020
[opensuse-commit] commit python-django-codemod for openSUSE:Factory
by User for buildservice source handling 29 Nov '20
Hello community,
here is the log from the commit of package python-django-codemod for openSUSE:Factory checked in at 2020-11-29 12:29:45
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/python-django-codemod (Old)
and /work/SRC/openSUSE:Factory/.python-django-codemod.new.5913 (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Package is "python-django-codemod"
Sun Nov 29 12:29:45 2020 rev:2 rq:851337 version:1.0.0
Changes:
--------
--- /work/SRC/openSUSE:Factory/python-django-codemod/python-django-codemod.changes 2020-08-28 21:21:28.692327609 +0200
+++ /work/SRC/openSUSE:Factory/.python-django-codemod.new.5913/python-django-codemod.changes 2020-11-29 12:29:58.750096295 +0100
@@ -1,0 +2,6 @@
+Thu Nov 26 08:48:41 UTC 2020 - John Vandenberg <jayvdb(a)gmail.com>
+
+- Update to v1.0.0
+ * See https://github.com/browniebroke/django-codemod/compare/v0.13.0...v1.0.0
+
+-------------------------------------------------------------------
Old:
----
django-codemod-0.13.0.tar.gz
New:
----
django-codemod-1.0.0.tar.gz
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Other differences:
------------------
++++++ python-django-codemod.spec ++++++
--- /var/tmp/diff_new_pack.S8JV6c/_old 2020-11-29 12:29:59.246096797 +0100
+++ /var/tmp/diff_new_pack.S8JV6c/_new 2020-11-29 12:29:59.250096802 +0100
@@ -18,19 +18,21 @@
%{?!python_module:%define python_module() python-%{**} python3-%{**}}
Name: python-django-codemod
-Version: 0.13.0
+Version: 1.0.0
Release: 0
Summary: Collections of libCST codemodders to upgrade Django
License: MIT
Group: Development/Languages/Python
URL: https://github.com/browniebroke/django-codemod
-Source: https://files.pythonhosted.org/packages/source/d/django-codemod/django-code…
-BuildRequires: %{python_module setuptools}
+Source: https://github.com/browniebroke/django-codemod/archive/v%{version}.tar.gz#/…
+BuildRequires: %{python_module pip}
+BuildRequires: %{python_module poetry}
BuildRequires: fdupes
BuildRequires: python-rpm-macros
Requires: python-click
Requires: python-libcst
Requires: python-rich
+Recommends: python-setuptools
Requires(post): update-alternatives
Requires(postun): update-alternatives
BuildArch: noarch
@@ -49,14 +51,14 @@
%prep
%setup -q -n django-codemod-%{version}
-sed -i 's/rich<5/rich/' setup.cfg
-sed -i '/addopts/d' setup.cfg
+sed -i 's/rich = ".*"/rich = "*"/' pyproject.toml
+sed -i '/addopts/d' pyproject.toml
%build
-%python_build
+%pyproject_wheel
%install
-%python_install
+%pyproject_install
%python_clone -a %{buildroot}%{_bindir}/djcodemod
%python_expand %fdupes %{buildroot}%{$python_sitelib}
++++++ django-codemod-0.13.0.tar.gz -> django-codemod-1.0.0.tar.gz ++++++
++++ 5175 lines of diff (skipped)
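For readers unfamiliar with the package above: django-codemod ships libCST-based codemodders that rewrite Django projects for newer releases. The sketch below is purely illustrative — the real tool uses libCST syntax trees, not regexes, and the rename table here is a hypothetical example of the class of rewrite it automates:

```python
import re

# Illustrative only: django-codemod performs rewrites with libCST syntax
# trees; this regex sketch merely shows the kind of mechanical rename it
# automates (e.g. Django's force_text() -> force_str() transition).
RENAMES = {
    r"\bforce_text\b": "force_str",
    r"\bsmart_text\b": "smart_str",
}

def upgrade_source(source: str) -> str:
    """Apply simple identifier renames to a source string."""
    for pattern, replacement in RENAMES.items():
        source = re.sub(pattern, replacement, source)
    return source

old = "from django.utils.encoding import force_text\nforce_text(b'x')"
print(upgrade_source(old))
```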
[opensuse-commit] commit python-django-storages for openSUSE:Factory
by User for buildservice source handling 29 Nov '20
Hello community,
here is the log from the commit of package python-django-storages for openSUSE:Factory checked in at 2020-11-29 12:29:36
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/python-django-storages (Old)
and /work/SRC/openSUSE:Factory/.python-django-storages.new.5913 (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Package is "python-django-storages"
Sun Nov 29 12:29:36 2020 rev:7 rq:851329 version:1.10.1
Changes:
--------
--- /work/SRC/openSUSE:Factory/python-django-storages/python-django-storages.changes 2020-04-21 13:11:11.964875963 +0200
+++ /work/SRC/openSUSE:Factory/.python-django-storages.new.5913/python-django-storages.changes 2020-11-29 12:29:55.342092848 +0100
@@ -1,0 +2,46 @@
+Fri Nov 27 13:23:14 UTC 2020 - John Vandenberg <jayvdb(a)gmail.com>
+
+- Update to v1.10.1
+ * Restore AWS_DEFAULT_ACL handling.
+ This setting is ignored if ACL is set in AWS_S3_OBJECT_PARAMETERS
+ * Fix using SFTP_STORAGE_HOST
+- from v1.10
+ * Removed support for end-of-life Python 2.7 and 3.4
+ * Removed support for end-of-life Django 1.11
+ * Add support for Django 3.1
+ * Introduce a new BaseStorage class with a get_default_settings
+ method and use it in S3Boto3Storage, AzureStorage,
+ GoogleCloudStorage, and SFTPStorage. These backends now
+ calculate their settings when instantiated, not imported.
+ * S3 Breaking: Automatic bucket creation has been removed.
+ Doing so encourages using overly broad credentials.
+ As a result, support for the corresponding AWS_BUCKET_ACL and
+ AWS_AUTO_CREATE_BUCKET settings have been removed.
+ * Support for the undocumented setting AWS_PRELOAD_METADATA removed
+ * The constructor kwarg acl is no longer accepted. Instead, use the
+ ACL key in setting AWS_S3_OBJECT_PARAMETERS
+ * The constructor kwarg ``bucket`` is no longer accepted.
+ Instead, use ``bucket_name`` or AWS_STORAGE_BUCKET_NAME setting
+ * Support for setting AWS_REDUCED_REDUNDANCY has been removed.
+ Replace with StorageClass=REDUCED_REDUNDANCY in
+ AWS_S3_OBJECT_PARAMETERS
+ * Support for setting AWS_S3_ENCRYPTION has been removed.
+ Replace with ServerSideEncryption=AES256 in
+ AWS_S3_OBJECT_PARAMETERS
+ * Support for setting AWS_DEFAULT_ACL has been removed.
+ Replace with ACL in AWS_S3_OBJECT_PARAMETERS
+ * Add ``http_method`` parameter to ``.url`` method
+ * Add support for signing Cloudfront URLs to the ``.url`` method.
+ You must set AWS_CLOUDFRONT_KEY, AWS_CLOUDFRONT_KEY_ID and
+ install either cryptography or rsa.
+ URLs will only be signed if AWS_QUERYSTRING_AUTH is set to True
+ * Automatic Google Cloud bucket creation has been removed.
+ Doing so encourages using overly broad credentials.
+ As a result, support for the corresponding GS_AUTO_CREATE_BUCKET
+ and GS_AUTO_CREATE_ACL settings have been removed.
+ * Add DROPBOX_WRITE_MODE setting to control e.g. overwriting behavior.
+ * Remove SFTP exception swallowing during ssh connection
+ * Add FTP_STORAGE_ENCODING setting to set the filesystem encoding
+ * Support multiple nested paths for files
+
+-------------------------------------------------------------------
Old:
----
django-storages-1.9.1.tar.gz
New:
----
django-storages-1.10.1.tar.gz
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Other differences:
------------------
++++++ python-django-storages.spec ++++++
--- /var/tmp/diff_new_pack.TY1hAQ/_old 2020-11-29 12:29:55.882093395 +0100
+++ /var/tmp/diff_new_pack.TY1hAQ/_new 2020-11-29 12:29:55.886093398 +0100
@@ -19,7 +19,7 @@
%{?!python_module:%define python_module() python-%{**} python3-%{**}}
%bcond_without python2
Name: python-django-storages
-Version: 1.9.1
+Version: 1.10.1
Release: 0
Summary: Support for many storage backends in Django
License: BSD-3-Clause
@@ -28,7 +28,7 @@
BuildRequires: %{python_module setuptools}
BuildRequires: fdupes
BuildRequires: python-rpm-macros
-Requires: python-Django >= 1.11
+Requires: python-Django >= 2.2
Suggests: python-apache-libcloud
Suggests: python-azure >= 3.0.0
Suggests: python-azure-storage-blob >= 1.3.1
@@ -38,7 +38,7 @@
Suggests: python-paramiko
BuildArch: noarch
# SECTION test requirements
-BuildRequires: %{python_module Django >= 1.11}
+BuildRequires: %{python_module Django >= 2.2}
BuildRequires: %{python_module azure-storage-blob >= 1.3.1}
BuildRequires: %{python_module boto3 >= 1.4.4}
BuildRequires: %{python_module dropbox >= 7.2.1}
++++++ django-storages-1.9.1.tar.gz -> django-storages-1.10.1.tar.gz ++++++
++++ 3013 lines of diff (skipped)
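The breaking changes in the 1.10 changelog above fold several per-feature settings into `AWS_S3_OBJECT_PARAMETERS`. A minimal Django settings fragment sketching that migration, with placeholder values (the setting names follow the changelog; the values are illustrative):

```python
# Before django-storages 1.10 (these settings were removed):
#   AWS_DEFAULT_ACL = "private"
#   AWS_REDUCED_REDUNDANCY = True
#   AWS_S3_ENCRYPTION = True

# From 1.10 onward, the equivalent configuration lives in one dict,
# passed through to S3 as object parameters:
AWS_S3_OBJECT_PARAMETERS = {
    "ACL": "private",
    "StorageClass": "REDUCED_REDUNDANCY",
    "ServerSideEncryption": "AES256",
}
```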
[opensuse-commit] commit python-screeninfo for openSUSE:Factory
by User for buildservice source handling 29 Nov '20
Hello community,
here is the log from the commit of package python-screeninfo for openSUSE:Factory checked in at 2020-11-29 12:29:35
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/python-screeninfo (Old)
and /work/SRC/openSUSE:Factory/.python-screeninfo.new.5913 (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Package is "python-screeninfo"
Sun Nov 29 12:29:35 2020 rev:3 rq:851326 version:0.6.6
Changes:
--------
--- /work/SRC/openSUSE:Factory/python-screeninfo/python-screeninfo.changes 2020-05-11 13:45:29.445646207 +0200
+++ /work/SRC/openSUSE:Factory/.python-screeninfo.new.5913/python-screeninfo.changes 2020-11-29 12:29:45.934083333 +0100
@@ -1,0 +2,7 @@
+Thu Nov 26 06:32:42 UTC 2020 - andy great <andythe_great(a)pm.me>
+
+- Update to version 0.6.6.
+ * Use dataclasses only when needed.
+- Remove use_dataclasses_when_needed.patch, fixed.
+
+-------------------------------------------------------------------
Old:
----
screeninfo-0.6.5.tar.gz
use_dataclasses_when_needed.patch
New:
----
screeninfo-0.6.6.tar.gz
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Other differences:
------------------
++++++ python-screeninfo.spec ++++++
--- /var/tmp/diff_new_pack.szxgEZ/_old 2020-11-29 12:29:50.534087985 +0100
+++ /var/tmp/diff_new_pack.szxgEZ/_new 2020-11-29 12:29:50.534087985 +0100
@@ -19,14 +19,12 @@
%define skip_python2 1
%{?!python_module:%define python_module() python-%{**} python3-%{**}}
Name: python-screeninfo
-Version: 0.6.5
+Version: 0.6.6
Release: 0
Summary: Fetch location and size of physical screens
License: MIT
URL: https://github.com/rr-/screeninfo
Source0: https://github.com/rr-/screeninfo/archive/%{version}.tar.gz#/screeninfo-%{v…
-# PATCH-FIX-UPSTREAM https://github.com/rr-/screeninfo/pull/36 -- Use dataclass when needed
-Patch0: use_dataclasses_when_needed.patch
BuildRequires: %{python_module setuptools}
%if 0%{?suse_version} <= 1500
BuildRequires: %{python_module dataclasses}
@@ -41,7 +39,6 @@
%prep
%setup -q -n screeninfo-%{version}
-%patch0 -p1
%build
%python_build
++++++ screeninfo-0.6.5.tar.gz -> screeninfo-0.6.6.tar.gz ++++++
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/screeninfo-0.6.5/setup.py new/screeninfo-0.6.6/setup.py
--- old/screeninfo-0.6.5/setup.py 2020-04-15 16:13:07.000000000 +0200
+++ new/screeninfo-0.6.6/setup.py 2020-10-23 09:55:39.000000000 +0200
@@ -3,7 +3,7 @@
setup(
name="screeninfo",
packages=["screeninfo", "screeninfo.enumerators"],
- version="0.6.5",
+ version="0.6.6",
description="Fetch location and size of physical screens.",
author="rr-",
author_email="rr-(a)sakuya.pl",
@@ -11,7 +11,7 @@
keywords=["screen", "monitor", "desktop"],
classifiers=[],
install_requires=[
- "dataclasses",
+ "dataclasses ; python_version<'3.7'",
'Cython ; sys_platform=="darwin"',
'pyobjus ; sys_platform=="darwin"',
],
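The setup.py change above replaces an unconditional `dataclasses` requirement with the environment marker `python_version<'3.7'`. A small sketch of the condition that marker encodes — `dataclasses` entered the standard library in 3.7, so the PyPI backport is only needed on older interpreters:

```python
import sys

def needs_dataclasses_backport(version_info=sys.version_info) -> bool:
    """True when the dataclasses PyPI backport is required, i.e. on
    interpreters older than 3.7 where the stdlib module is absent."""
    return tuple(version_info[:2]) < (3, 7)

print(needs_dataclasses_backport((3, 6, 9)))
print(needs_dataclasses_backport((3, 8, 0)))
```

This is also why the spec keeps `%{python_module dataclasses}` as a BuildRequires only when `suse_version <= 1500`.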
[opensuse-commit] commit uwsgi for openSUSE:Factory
by User for buildservice source handling 29 Nov '20
Hello community,
here is the log from the commit of package uwsgi for openSUSE:Factory checked in at 2020-11-29 12:29:30
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/uwsgi (Old)
and /work/SRC/openSUSE:Factory/.uwsgi.new.5913 (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Package is "uwsgi"
Sun Nov 29 12:29:30 2020 rev:39 rq:851302 version:2.0.19.1
Changes:
--------
--- /work/SRC/openSUSE:Factory/uwsgi/uwsgi.changes 2020-09-21 17:26:17.616077832 +0200
+++ /work/SRC/openSUSE:Factory/.uwsgi.new.5913/uwsgi.changes 2020-11-29 12:29:31.734068970 +0100
@@ -1,0 +2,38 @@
+Thu Nov 26 18:51:29 UTC 2020 - Dirk Mueller <dmueller(a)suse.com>
+
+- update 2.0.19.1:
+ * Reverted CGI chunked encoding support
+ * Fixed bug with WSGI responses returning
+ * Update travis to xenial (Terence D. Honles)
+ * Fix segfault in logsocket plugin (Riccardo Magliocchetti, #2010)
+ * Backport Coverity fixes from master (Riccardo Magliocchetti)
+ * Fix Python 3.7 warnings (Orivej Desh)
+ * Fix uwsgi.workers() leak in Python plugin (Arne Welzel, #2056)
+ * Backport redislog plugin 32-bit build fixes (Riccardo Magliocchetti, #1828)
+ * Fix stack overflow in core/rpc (Nicola Martino)
+ * Fix build with spaces in the path (Arne Welzel, #1939)
+ * Add missing initialization for zend_file_handle in php plugin (Arne Welzel)
+ * Build Python 3.7 and 3.8 plugins in CI (Arne Welzel)
+ * Add Trove classifiers for Python 3.7 and 3.8 (Hugo)
+ * Graceful shutdown for vassals (Sponsored by guppyltd.com)
+ * Improve yaml parsing with libyaml (Arne Welzel, #2097)
+ * Add smart-daemon2 option to notify daemon of master reloading (Eduardo Felipe Castegnaro)
+ * Do not chroot multiple times when root (Arne Welzel)
+ * Support io.BytesIO with wsgi.file_wrapper (Arne Welzel, #1126)
+ * Add websocket continuation frames support (Timi, #1350)
+ * Fix compilation with gevent 1.5.0 (Vytautas Liuolia)
+ * Fix PSGI plugin build with gcc 10 (Jorge Gallegos)
+ * Get rid of paste.script dependency in pypy/python plugins (Thomas De Schampheleire)
+ * Improve performance for santitizing file descriptors with cgi plugin (Natanael Copa, #2053)
+ * Fix offload-threads with honour-range (Liss Tarnell)
+ * Fix logging packet size length overflow (Pawel Marokwsi)
+ * Fix possible deadlock in install (Jacob Tolar)
+ * Fix parsing of http port for ipv6 (Cyril Baÿ)
+ * Fix impossibility of determining the end of the chunked stream with psgi plugin (ols)
+ * Fix parsing of http-socket port for ipv6 (Daniel Holth)
+ * Add chunked request decoding to the CGI plugin (Robert Schindler)
+ * Add add max-worker-lifetime-delta to reload workers with a delta (Marcin Lulek , #2020)
+
+- remove uwsgi-2.0.18-psgi-fix-duplicate-uperl.patch (upstream)
+
+-------------------------------------------------------------------
Old:
----
uwsgi-2.0.18-psgi-fix-duplicate-uperl.patch
uwsgi-2.0.18.tar.gz
New:
----
uwsgi-2.0.19.1.tar.gz
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Other differences:
------------------
++++++ uwsgi.spec ++++++
--- /var/tmp/diff_new_pack.B3JCiD/_old 2020-11-29 12:29:32.494069739 +0100
+++ /var/tmp/diff_new_pack.B3JCiD/_new 2020-11-29 12:29:32.498069744 +0100
@@ -17,7 +17,7 @@
Name: uwsgi
-Version: 2.0.18
+Version: 2.0.19.1
Release: 0
Summary: Application Container Server for Networked/Clustered Web Applications
License: Apache-2.0 AND GPL-2.0-only WITH GCC-exception-2.0
@@ -43,8 +43,6 @@
Patch3: uwsgi-1.9.11-systemd_logger-old_systemd.patch
# PATCH-FIX-OPENSUSE uwsgi-2.0.18-postgresql-config.patch - Use pkg-config instead of pg_config
Patch4: uwsgi-2.0.18-postgresql-config.patch
-# PATCH-FIX-OPENSUSE uwsgi-2.0.18-psgi-fix-duplicate-uperl.patch - Fix duplicate uperl with gcc 10
-Patch5: uwsgi-2.0.18-psgi-fix-duplicate-uperl.patch
%define apache_branch %(rpm -q --qf %%{version} apache2 | grep -E -o "2\\.[0-9]+")
%if "%{apache_branch}" == "2.4"
%define apxs %{_bindir}/apxs2
@@ -439,7 +437,6 @@
%patch2 -p1
%patch3 -p1
%patch4 -p1
-%patch5 -p1
# Generate a config that builds all plugins except for examples and stuff we
# can't satisfy the requirements for or are just broken
excluded_plugins=""
++++++ uwsgi-2.0.18.tar.gz -> uwsgi-2.0.19.1.tar.gz ++++++
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/uwsgi-2.0.18/.travis.yml new/uwsgi-2.0.19.1/.travis.yml
--- old/uwsgi-2.0.18/.travis.yml 2019-02-09 15:48:07.000000000 +0100
+++ new/uwsgi-2.0.19.1/.travis.yml 2020-06-17 11:03:34.000000000 +0200
@@ -1,3 +1,5 @@
+dist: xenial
+
language: c
compiler:
@@ -23,6 +25,12 @@
- echo -e "\n\n>>> Building python36 plugin"
- /usr/bin/python3.6 -V
- /usr/bin/python3.6 uwsgiconfig.py --plugin plugins/python base python36
+ - echo -e "\n\n>>> Building python37 plugin"
+ - /usr/bin/python3.7 -V
+ - /usr/bin/python3.7 uwsgiconfig.py --plugin plugins/python base python37
+ - echo -e "\n\n>>> Building python38 plugin"
+ - /usr/bin/python3.8 -V
+ - /usr/bin/python3.8 uwsgiconfig.py --plugin plugins/python base python38
- echo -e "\n\n>>> Building rack plugin"
- rvm use 2.4
- ruby -v
@@ -35,10 +43,13 @@
- ./tests/travis.sh
before_install:
+ - sudo apt-get update -qq
+ - sudo apt-get install -qqyf software-properties-common
- sudo add-apt-repository ppa:deadsnakes/ppa -y
- sudo add-apt-repository ppa:ondrej/php -y
- sudo apt-get update -qq
- - sudo apt-get install -qqyf python2.6-dev python3.4-dev python3.5-dev python3.6-dev
+ - sudo apt-get install -qqyf python{2.6,3.4,3.5,3.6,3.7,3.8}-dev
+ - sudo apt-get install -qqyf python3.8-distutils
- sudo apt-get install -qqyf libxml2-dev libpcre3-dev libcap2-dev
- sudo apt-get install -qqyf php7.2-dev libphp7.2-embed libargon2-0-dev libsodium-dev
- sudo apt-get install -qqyf liblua5.1-0-dev
@@ -49,6 +60,6 @@
- sudo apt-get install -qqyf libwrap0-dev libgeoip-dev libv8-dev libxslt1-dev
- sudo apt-get install -qqyf libboost-thread-dev libboost-filesystem-dev
- sudo apt-get install -qqyf libssl-dev libacl1-dev python-greenlet-dev
- - sudo apt-get install -qqyf openjdk-7-jdk libgloox-dev gccgo
+ - sudo apt-get install -qqyf openjdk-8-jdk libgloox-dev gccgo
- sudo apt-get install -qqyf cli-common-dev mono-devel mono-mcs uuid-dev
- sudo apt-get install -qqyf curl
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/uwsgi-2.0.18/CONTRIBUTORS new/uwsgi-2.0.19.1/CONTRIBUTORS
--- old/uwsgi-2.0.18/CONTRIBUTORS 2019-02-09 15:48:07.000000000 +0100
+++ new/uwsgi-2.0.19.1/CONTRIBUTORS 2020-06-17 11:03:34.000000000 +0200
@@ -32,3 +32,7 @@
Adriano Di Luzio (adriano(a)unbit.it)
Curtis Maloney
Vladimir Didenko
+Alexandre Bonnetain
+Darvame Hleran
+Sokolov Yura <funny.falcon(a)gmail.com>
+Marcin Lulek <info(a)webreactor.eu>
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/uwsgi-2.0.18/PKG-INFO new/uwsgi-2.0.19.1/PKG-INFO
--- old/uwsgi-2.0.18/PKG-INFO 2019-02-09 15:48:07.000000000 +0100
+++ new/uwsgi-2.0.19.1/PKG-INFO 2020-06-17 11:03:34.000000000 +0200
@@ -1,6 +1,6 @@
Metadata-Version: 1.0
Name: uWSGI
-Version: 2.0.18
+Version: 2.0.19.1
Summary: The uWSGI server
Home-page: https://uwsgi-docs.readthedocs.io/en/latest/
Author: Unbit
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/uwsgi-2.0.18/core/daemons.c new/uwsgi-2.0.19.1/core/daemons.c
--- old/uwsgi-2.0.18/core/daemons.c 2019-02-09 15:48:07.000000000 +0100
+++ new/uwsgi-2.0.19.1/core/daemons.c 2020-06-17 11:03:34.000000000 +0200
@@ -254,6 +254,16 @@
// unregister daemon to prevent it from being respawned
ud->registered = 0;
}
+
+ // smart daemons that have to be notified when master is reloading or stopping
+ if (ud->notifypid && ud->pid > 0 && ud->pidfile) {
+ if (uwsgi_instance_is_reloading) {
+ kill(-(ud->pid), ud->reload_signal > 0 ? ud->reload_signal : SIGHUP);
+ }
+ else {
+ kill(-(ud->pid), ud->stop_signal);
+ }
+ }
ud = ud->next;
}
}
@@ -520,6 +530,7 @@
char *d_ns_pid = NULL;
char *d_chdir = NULL;
char *d_max_throttle = NULL;
+ char *d_notifypid = NULL;
char *arg = uwsgi_str(value);
@@ -543,6 +554,7 @@
"ns_pid", &d_ns_pid,
"chdir", &d_chdir,
"max_throttle", &d_max_throttle,
+ "notifypid", &d_notifypid,
NULL)) {
uwsgi_log("invalid --%s keyval syntax\n", opt);
exit(1);
@@ -594,6 +606,8 @@
uwsgi_ud->max_throttle = d_max_throttle ? atoi(d_max_throttle) : 0;
+ uwsgi_ud->notifypid = d_notifypid ? 1 : 0;
+
if (d_touch) {
size_t i,rlen = 0;
char **argv = uwsgi_split_quoted(d_touch, strlen(d_touch), ";", &rlen);
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/uwsgi-2.0.18/core/emperor.c new/uwsgi-2.0.19.1/core/emperor.c
--- old/uwsgi-2.0.18/core/emperor.c 2019-02-09 15:48:07.000000000 +0100
+++ new/uwsgi-2.0.19.1/core/emperor.c 2020-06-17 11:03:34.000000000 +0200
@@ -722,7 +722,7 @@
// remove uWSGI instance
if (c_ui->pid != -1) {
- if (write(c_ui->pipe[0], "\0", 1) != 1) {
+ if (write(c_ui->pipe[0], uwsgi.emperor_graceful_shutdown ? "\2" : "\0", 1) != 1) {
uwsgi_error("emperor_stop()/write()");
}
}
@@ -739,7 +739,7 @@
// remove uWSGI instance
if (c_ui->pid != -1) {
- if (write(c_ui->pipe[0], "\0", 1) != 1) {
+ if (write(c_ui->pipe[0], uwsgi.emperor_graceful_shutdown ? "\2" : "\0", 1) != 1) {
uwsgi_error("emperor_stop()/write()");
}
}
@@ -2410,8 +2410,9 @@
if (byte == 0) {
uwsgi_hooks_run(uwsgi.hook_emperor_stop, "emperor-stop", 0);
close(uwsgi.emperor_fd);
- if (!uwsgi.status.brutally_reloading)
+ if (!uwsgi.status.brutally_reloading && !uwsgi.status.brutally_destroying) {
kill_them_all(0);
+ }
}
// reload me
else if (byte == 1) {
@@ -2422,6 +2423,14 @@
grace_them_all(0);
uwsgi_unblock_signal(SIGHUP);
}
+ // remove me gracefully
+ else if (byte == 2) {
+ uwsgi_hooks_run(uwsgi.hook_emperor_stop, "emperor-stop", 0);
+ close(uwsgi.emperor_fd);
+ if (!uwsgi.status.brutally_reloading && !uwsgi.status.brutally_destroying) {
+ gracefully_kill_them_all(0);
+ }
+ }
}
#ifdef UWSGI_EVENT_USE_PORT
// special cose for port event system
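The emperor.c hunks above extend a one-byte command protocol on the emperor-to-vassal pipe: byte 0 stops the vassal, byte 1 reloads it, and the new byte 2 (sent when `--emperor-graceful-shutdown` is enabled) triggers a graceful stop. A sketch of that dispatch, with the C handler names as labels:

```python
# Sketch of the one-byte emperor -> vassal pipe protocol visible in the
# diff above. The handler names mirror the C functions called for each
# byte; this table is illustrative, not part of the uWSGI API.
def describe_emperor_byte(byte: int) -> str:
    actions = {
        0: "stop: kill_them_all()",
        1: "reload: grace_them_all()",
        2: "graceful stop: gracefully_kill_them_all()",
    }
    return actions.get(byte, "unknown command byte")

print(describe_emperor_byte(2))
```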
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/uwsgi-2.0.18/core/fifo.c new/uwsgi-2.0.19.1/core/fifo.c
--- old/uwsgi-2.0.18/core/fifo.c 2019-02-09 15:48:07.000000000 +0100
+++ new/uwsgi-2.0.19.1/core/fifo.c 2020-06-17 11:03:34.000000000 +0200
@@ -104,7 +104,9 @@
char *path = uwsgi_fifo_by_slot();
- unlink(path);
+ if (unlink(path) != 0 && errno != ENOENT) {
+ uwsgi_error("uwsgi_master_fifo()/unlink()");
+ }
if (mkfifo(path, S_IRUSR|S_IWUSR)) {
uwsgi_error("uwsgi_master_fifo()/mkfifo()");
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/uwsgi-2.0.18/core/logging.c new/uwsgi-2.0.19.1/core/logging.c
--- old/uwsgi-2.0.18/core/logging.c 2019-02-09 15:48:07.000000000 +0100
+++ new/uwsgi-2.0.19.1/core/logging.c 2020-06-17 11:03:34.000000000 +0200
@@ -752,10 +752,11 @@
i = fscanf(procfile, "%*s %*s %*s %*s %*s %*s %*s %*s %*s %*s %*s %*s %*s %*s %*s %*s %*s %*s %*s %*s %*s %*s %llu %lld", (unsigned long long *) vsz, (unsigned long long *) rss);
if (i != 2) {
uwsgi_log("warning: invalid record in /proc/self/stat\n");
+ } else {
+ *rss = *rss * uwsgi.page_size;
}
fclose(procfile);
}
- *rss = *rss * uwsgi.page_size;
#elif defined(__CYGWIN__)
// same as Linux but rss is not in pages...
FILE *procfile;
@@ -1082,19 +1083,18 @@
return strlen(*buf);
}
-
static ssize_t uwsgi_lf_rsize(struct wsgi_request *wsgi_req, char **buf) {
- *buf = uwsgi_num2str(wsgi_req->response_size);
+ *buf = uwsgi_size2str(wsgi_req->response_size);
return strlen(*buf);
}
static ssize_t uwsgi_lf_hsize(struct wsgi_request *wsgi_req, char **buf) {
- *buf = uwsgi_num2str(wsgi_req->headers_size);
+ *buf = uwsgi_size2str(wsgi_req->headers_size);
return strlen(*buf);
}
static ssize_t uwsgi_lf_size(struct wsgi_request *wsgi_req, char **buf) {
- *buf = uwsgi_num2str(wsgi_req->headers_size+wsgi_req->response_size);
+ *buf = uwsgi_size2str(wsgi_req->headers_size+wsgi_req->response_size);
return strlen(*buf);
}
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/uwsgi-2.0.18/core/master_checks.c new/uwsgi-2.0.19.1/core/master_checks.c
--- old/uwsgi-2.0.18/core/master_checks.c 2019-02-09 15:48:07.000000000 +0100
+++ new/uwsgi-2.0.19.1/core/master_checks.c 2020-06-17 11:03:34.000000000 +0200
@@ -218,7 +218,7 @@
// check if worker was running longer than allowed lifetime
if (uwsgi.workers[i].pid > 0 && uwsgi.workers[i].cheaped == 0 && uwsgi.max_worker_lifetime > 0) {
uint64_t lifetime = uwsgi_now() - uwsgi.workers[i].last_spawn;
- if (lifetime > uwsgi.max_worker_lifetime && uwsgi.workers[i].manage_next_request == 1) {
+ if (lifetime > (uwsgi.max_worker_lifetime + (i-1) * uwsgi.max_worker_lifetime_delta) && uwsgi.workers[i].manage_next_request == 1) {
uwsgi_log("worker %d lifetime reached, it was running for %llu second(s)\n", i, (unsigned long long) lifetime);
uwsgi.workers[i].manage_next_request = 0;
kill(uwsgi.workers[i].pid, SIGWINCH);
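The master_checks.c change above implements the new `max-worker-lifetime-delta` option: worker *i* is only recycled once its lifetime exceeds `max_worker_lifetime + (i-1) * delta`, staggering reloads so all workers do not restart in the same instant. A sketch of the resulting per-worker limits:

```python
# Staggered reload limits per the hunk above: worker 1 keeps the base
# limit, and each subsequent worker adds one more delta.
def lifetime_limit(worker_id: int, max_lifetime: int, delta: int) -> int:
    return max_lifetime + (worker_id - 1) * delta

limits = [lifetime_limit(i, 3600, 60) for i in range(1, 5)]
print(limits)  # [3600, 3660, 3720, 3780]
```

With `delta = 0` (the default when the option is unset) every worker keeps the same limit, matching the old behaviour.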
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/uwsgi-2.0.18/core/offload.c new/uwsgi-2.0.19.1/core/offload.c
--- old/uwsgi-2.0.18/core/offload.c 2019-02-09 15:48:07.000000000 +0100
+++ new/uwsgi-2.0.19.1/core/offload.c 2020-06-17 11:03:34.000000000 +0200
@@ -671,11 +671,12 @@
uwsgi.offload_engine_pipe = uwsgi_offload_register_engine("pipe", u_offload_pipe_prepare, u_offload_pipe_do);
}
-int uwsgi_offload_request_sendfile_do(struct wsgi_request *wsgi_req, int fd, size_t len) {
+int uwsgi_offload_request_sendfile_do(struct wsgi_request *wsgi_req, int fd, size_t pos, size_t len) {
struct uwsgi_offload_request uor;
uwsgi_offload_setup(uwsgi.offload_engine_sendfile, &uor, wsgi_req, 1);
uor.fd = fd;
uor.len = len;
+ uor.pos = pos;
return uwsgi_offload_run(wsgi_req, &uor, NULL);
}
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/uwsgi-2.0.18/core/rpc.c new/uwsgi-2.0.19.1/core/rpc.c
--- old/uwsgi-2.0.18/core/rpc.c 2019-02-09 15:48:07.000000000 +0100
+++ new/uwsgi-2.0.19.1/core/rpc.c 2020-06-17 11:03:34.000000000 +0200
@@ -12,6 +12,11 @@
return -1;
}
+ if (strlen(name) >= UMAX8) {
+ uwsgi_log("the supplied RPC name string is too long, max size is %d\n", UMAX8-1);
+ return -1;
+ }
+
uwsgi_lock(uwsgi.rpc_table_lock);
// first check if a function is already registered
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/uwsgi-2.0.18/core/utils.c new/uwsgi-2.0.19.1/core/utils.c
--- old/uwsgi-2.0.18/core/utils.c 2019-02-09 15:48:07.000000000 +0100
+++ new/uwsgi-2.0.19.1/core/utils.c 2020-06-17 11:03:34.000000000 +0200
@@ -585,13 +585,15 @@
}
}
- if (uwsgi.chroot && !uwsgi.reloads) {
+ if (uwsgi.chroot && !uwsgi.is_chrooted && !uwsgi.reloads) {
if (!uwsgi.master_as_root)
uwsgi_log("chroot() to %s\n", uwsgi.chroot);
+
if (chroot(uwsgi.chroot)) {
uwsgi_error("chroot()");
exit(1);
}
+ uwsgi.is_chrooted = 1;
#ifdef __linux__
if (uwsgi.logging_options.memory_report) {
uwsgi_log("*** Warning, on linux system you have to bind-mount the /proc fs in your chroot to get memory debug/report.\n");
@@ -989,7 +991,7 @@
return;
nonroot:
- if (uwsgi.chroot && !uwsgi.is_a_reload) {
+ if (uwsgi.chroot && !uwsgi.is_chrooted && !uwsgi.is_a_reload) {
uwsgi_log("cannot chroot() as non-root user\n");
exit(1);
}
@@ -1920,6 +1922,12 @@
return str;
}
+char *uwsgi_size2str(size_t num) {
+ char *str = uwsgi_malloc(sizeof(UMAX64_STR) + 1);
+ snprintf(str, sizeof(UMAX64_STR) + 1, "%llu", (unsigned long long) num);
+ return str;
+}
+
int uwsgi_num2str2(int num, char *ptr) {
return snprintf(ptr, 11, "%d", num);
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/uwsgi-2.0.18/core/uwsgi.c new/uwsgi-2.0.19.1/core/uwsgi.c
--- old/uwsgi-2.0.18/core/uwsgi.c 2019-02-09 15:48:07.000000000 +0100
+++ new/uwsgi-2.0.19.1/core/uwsgi.c 2020-06-17 11:03:34.000000000 +0200
@@ -239,6 +239,7 @@
#if defined(__linux__) && !defined(OBSOLETE_LINUX_KERNEL)
{"emperor-use-clone", required_argument, 0, "use clone() instead of fork() passing the specified unshare() flags", uwsgi_opt_set_unshare, &uwsgi.emperor_clone, 0},
#endif
+ {"emperor-graceful-shutdown", no_argument, 0, "use vassals graceful shutdown during ragnarok", uwsgi_opt_true, &uwsgi.emperor_graceful_shutdown, 0},
#ifdef UWSGI_CAP
{"emperor-cap", required_argument, 0, "set vassals capability", uwsgi_opt_set_emperor_cap, NULL, 0},
{"vassals-cap", required_argument, 0, "set vassals capability", uwsgi_opt_set_emperor_cap, NULL, 0},
@@ -275,6 +276,7 @@
{"max-requests", required_argument, 'R', "reload workers after the specified amount of managed requests", uwsgi_opt_set_64bit, &uwsgi.max_requests, 0},
{"min-worker-lifetime", required_argument, 0, "number of seconds worker must run before being reloaded (default is 60)", uwsgi_opt_set_64bit, &uwsgi.min_worker_lifetime, 0},
{"max-worker-lifetime", required_argument, 0, "reload workers after the specified amount of seconds (default is disabled)", uwsgi_opt_set_64bit, &uwsgi.max_worker_lifetime, 0},
+ {"max-worker-lifetime-delta", required_argument, 0, "add (worker_id * delta) seconds to the max_worker_lifetime value of each worker", uwsgi_opt_set_int, &uwsgi.max_worker_lifetime_delta, 0},
{"socket-timeout", required_argument, 'z', "set internal sockets timeout", uwsgi_opt_set_int, &uwsgi.socket_timeout, 0},
{"no-fd-passing", no_argument, 0, "disable file descriptor passing", uwsgi_opt_true, &uwsgi.no_fd_passing, 0},
@@ -3301,7 +3303,7 @@
uwsgi_log_verbose("mem-collector thread started for worker %d\n", uwsgi.mywid);
for(;;) {
sleep(uwsgi.mem_collector_freq);
- uint64_t rss, vsz;
+ uint64_t rss = 0, vsz = 0;
get_memusage(&rss, &vsz);
uwsgi.workers[uwsgi.mywid].rss_size = rss;
uwsgi.workers[uwsgi.mywid].vsz_size = vsz;
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/uwsgi-2.0.18/core/websockets.c new/uwsgi-2.0.19.1/core/websockets.c
--- old/uwsgi-2.0.18/core/websockets.c 2019-02-09 15:48:07.000000000 +0100
+++ new/uwsgi-2.0.19.1/core/websockets.c 2020-06-17 11:03:34.000000000 +0200
@@ -144,6 +144,7 @@
static void uwsgi_websocket_parse_header(struct wsgi_request *wsgi_req) {
uint8_t byte1 = wsgi_req->websocket_buf->buf[0];
uint8_t byte2 = wsgi_req->websocket_buf->buf[1];
+ wsgi_req->websocket_is_fin = byte1 >> 7;
wsgi_req->websocket_opcode = byte1 & 0xf;
wsgi_req->websocket_has_mask = byte2 >> 7;
wsgi_req->websocket_size = byte2 & 0x7f;
@@ -161,14 +162,38 @@
}
}
- struct uwsgi_buffer *ub = uwsgi_buffer_new(wsgi_req->websocket_size);
+ struct uwsgi_buffer *ub = NULL;
+ if (wsgi_req->websocket_opcode == 0) {
+ if (uwsgi.websockets_continuation_buffer == NULL) {
+ uwsgi_log("Error continuation with empty previous buffer");
+ goto error;
+ }
+ ub = uwsgi.websockets_continuation_buffer;
+ }
+ else {
+ ub = uwsgi_buffer_new(wsgi_req->websocket_size);
+ }
if (uwsgi_buffer_append(ub, (char *) ptr, wsgi_req->websocket_size)) goto error;
if (uwsgi_buffer_decapitate(wsgi_req->websocket_buf, wsgi_req->websocket_pktsize)) goto error;
wsgi_req->websocket_phase = 0;
wsgi_req->websocket_need = 2;
+
+ if (wsgi_req->websocket_is_fin) {
+ uwsgi.websockets_continuation_buffer = NULL;
+ /// Freeing websockets_continuation_buffer is done by the caller
+ return ub;
+ }
+ uwsgi.websockets_continuation_buffer = ub;
+ /// Message is not complete, send empty dummy buffer to signal waiting for full message
+ ub = uwsgi_buffer_new(1);
+ uwsgi_buffer_append(ub, "\0", 1);
return ub;
error:
uwsgi_buffer_destroy(ub);
+ if (uwsgi.websockets_continuation_buffer != NULL && ub != uwsgi.websockets_continuation_buffer) {
+ uwsgi_buffer_destroy(uwsgi.websockets_continuation_buffer);
+ }
+ uwsgi.websockets_continuation_buffer = NULL;
return NULL;
}
@@ -338,12 +363,20 @@
return NULL;
}
+static void clear_continuation_buffer() {
+ if (uwsgi.websockets_continuation_buffer != NULL) {
+ uwsgi_buffer_destroy(uwsgi.websockets_continuation_buffer);
+ uwsgi.websockets_continuation_buffer = NULL;
+ }
+}
+
struct uwsgi_buffer *uwsgi_websocket_recv(struct wsgi_request *wsgi_req) {
if (wsgi_req->websocket_closed) {
return NULL;
}
struct uwsgi_buffer *ub = uwsgi_websocket_recv_do(wsgi_req, 0);
if (!ub) {
+ clear_continuation_buffer();
wsgi_req->websocket_closed = 1;
}
return ub;
@@ -355,6 +388,7 @@
}
struct uwsgi_buffer *ub = uwsgi_websocket_recv_do(wsgi_req, 1);
if (!ub) {
+ clear_continuation_buffer();
wsgi_req->websocket_closed = 1;
}
return ub;
@@ -432,4 +466,5 @@
uwsgi.websockets_ping_freq = 30;
uwsgi.websockets_pong_tolerance = 3;
uwsgi.websockets_max_size = 1024;
+ uwsgi.websockets_continuation_buffer = NULL;
}
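The websockets.c hunk above starts tracking the FIN bit (`byte1 >> 7`) so fragmented messages (opcode 0 continuation frames) can be buffered until the final fragment arrives. The fixed two-byte frame prefix it decodes comes from RFC 6455; a sketch of the same bit layout (helper name is illustrative):

```python
def parse_ws_header(first_two):
    """Decode the fixed 2-byte prefix of a WebSocket frame (RFC 6455),
    mirroring the fields uwsgi_websocket_parse_header extracts."""
    byte1, byte2 = first_two[0], first_two[1]
    return {
        "fin":      byte1 >> 7,    # 1 = final fragment of the message
        "opcode":   byte1 & 0x0F,  # 0 = continuation, 1 = text, 2 = binary
        "has_mask": byte2 >> 7,    # client-to-server frames must be masked
        "size":     byte2 & 0x7F,  # payload length (126/127 mean extended)
    }

# 0x81 0x85: a masked, final text frame carrying a 5-byte payload.
hdr = parse_ws_header(bytes([0x81, 0x85]))
```

A continuation frame (`fin=0, opcode=0`) is what the patch now accumulates in `websockets_continuation_buffer` instead of treating as a complete message.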
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/uwsgi-2.0.18/core/writer.c new/uwsgi-2.0.19.1/core/writer.c
--- old/uwsgi-2.0.18/core/writer.c 2019-02-09 15:48:07.000000000 +0100
+++ new/uwsgi-2.0.19.1/core/writer.c 2020-06-17 11:03:34.000000000 +0200
@@ -634,7 +634,7 @@
fd = tmp_fd;
can_close = 1;
}
- if (!uwsgi_offload_request_sendfile_do(wsgi_req, fd, len)) {
+ if (!uwsgi_offload_request_sendfile_do(wsgi_req, fd, pos, len)) {
wsgi_req->via = UWSGI_VIA_OFFLOAD;
wsgi_req->response_size += len;
return 0;
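The writer.c change passes the new `pos` argument through to `uwsgi_offload_request_sendfile_do`, so an offloaded response can start mid-file instead of always at offset 0. Python's `os.sendfile` exposes the same `(in_fd, offset, count)` semantics; a Linux-oriented sketch of serving a byte range:

```python
import os
import socket
import tempfile

# Serve bytes [3, 3+4) of a file over a socket, mirroring the
# (fd, pos, len) triple the patched offload helper now accepts.
with tempfile.NamedTemporaryFile() as f:
    f.write(b"0123456789")
    f.flush()
    rd, wr = socket.socketpair()
    fd = os.open(f.name, os.O_RDONLY)
    try:
        os.sendfile(wr.fileno(), fd, 3, 4)  # start at pos=3, send len=4 bytes
    finally:
        os.close(fd)
    wr.close()
    chunk = rd.recv(16)
    rd.close()
```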
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/uwsgi-2.0.18/core/yaml.c new/uwsgi-2.0.19.1/core/yaml.c
--- old/uwsgi-2.0.18/core/yaml.c 2019-02-09 15:48:07.000000000 +0100
+++ new/uwsgi-2.0.19.1/core/yaml.c 2020-06-17 11:03:34.000000000 +0200
@@ -156,44 +156,49 @@
break;
case YAML_FLOW_SEQUENCE_START_TOKEN:
case YAML_BLOCK_SEQUENCE_START_TOKEN:
- status = 3;
+ if (in_uwsgi_section)
+ in_uwsgi_section++;
+ // fallthrough
+ case YAML_FLOW_ENTRY_TOKEN:
+ case YAML_BLOCK_ENTRY_TOKEN:
+ status = 3; // inside a sequence
break;
case YAML_BLOCK_MAPPING_START_TOKEN:
- if (!in_uwsgi_section) {
- if (key) {
- if (!strcmp(section_asked, key)) {
- in_uwsgi_section = 1;
- }
+ if (in_uwsgi_section) {
+ in_uwsgi_section++;
+ break;
+ }
+ if (key) {
+ if (!strcmp(section_asked, key)) {
+ in_uwsgi_section = 1;
}
}
break;
case YAML_BLOCK_END_TOKEN:
- if (in_uwsgi_section) {
- parsing = 0;
- break;
- }
+ case YAML_FLOW_SEQUENCE_END_TOKEN:
+ if (in_uwsgi_section)
+ parsing = !!(--in_uwsgi_section);
+ key = NULL;
+ status = 0;
break;
case YAML_SCALAR_TOKEN:
- case YAML_FLOW_ENTRY_TOKEN:
- case YAML_BLOCK_ENTRY_TOKEN:
if (status == 1) {
key = (char *) token.data.scalar.value;
}
- else if (status == 2) {
+ else if (status == 2 || status == 3) {
val = (char *) token.data.scalar.value;
if (key && val && in_uwsgi_section) {
add_exported_option(key, val, 0);
}
- status = 0;
- }
- else if (status == 3) {
- val = (char *) token.data.scalar.value;
- if (key && val && in_uwsgi_section) {
- add_exported_option(key, val, 0);
+
+ // If this was the scalar of a value token, forget the state.
+ if (status == 2) {
+ key = NULL;
+ status = 0;
}
}
else {
- uwsgi_log("unsupported YAML token in %s block\n", section_asked);
+ uwsgi_log("unsupported YAML token %d in %s block\n", token.type, section_asked);
parsing = 0;
break;
}
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/uwsgi-2.0.18/plugins/cgi/cgi_plugin.c new/uwsgi-2.0.19.1/plugins/cgi/cgi_plugin.c
--- old/uwsgi-2.0.18/plugins/cgi/cgi_plugin.c 2019-02-09 15:48:07.000000000 +0100
+++ new/uwsgi-2.0.19.1/plugins/cgi/cgi_plugin.c 2020-06-17 11:03:34.000000000 +0200
@@ -780,8 +780,21 @@
close(cgi_pipe[1]);
// close all the fd > 2
- for(i=3;i<(int)uwsgi.max_fd;i++) {
- close(i);
+ DIR *dirp = opendir("/proc/self/fd");
+ if (dirp == NULL)
+ dirp = opendir("/dev/fd");
+ if (dirp != NULL) {
+ struct dirent *dent;
+ while ((dent = readdir(dirp)) != NULL) {
+ int fd = atoi(dent->d_name);
+ if ((fd > 2) && fd != dirfd(dirp))
+ close(fd);
+ }
+ closedir(dirp);
+ } else {
+ for(i=3;i<(int)uwsgi.max_fd;i++) {
+ close(i);
+ }
}
// fill cgi env
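The cgi_plugin.c hunk replaces the close-everything loop (O(max_fd) `close()` calls, slow with a high `RLIMIT_NOFILE`) with a scan of `/proc/self/fd`, falling back to `/dev/fd` and then the old loop. The same enumeration in Python (function name is illustrative; note the directory listing itself holds an fd, which the C code skips via `dirfd(dirp)`):

```python
import os

def open_fds():
    """List this process's open file descriptors via /proc/self/fd
    (Linux) or /dev/fd (BSD/macOS), the sources the CGI patch uses."""
    for d in ("/proc/self/fd", "/dev/fd"):
        try:
            names = os.listdir(d)
        except OSError:
            continue
        return sorted(int(n) for n in names)
    # Fallback mirrors the old behaviour: assume any fd may be open.
    return list(range(3, os.sysconf("SC_OPEN_MAX")))

fds = open_fds()
```

A closer built on this must skip the fd of the directory being scanned, exactly as the patch's `fd != dirfd(dirp)` check does.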
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/uwsgi-2.0.18/plugins/corerouter/cr_common.c new/uwsgi-2.0.19.1/plugins/corerouter/cr_common.c
--- old/uwsgi-2.0.18/plugins/corerouter/cr_common.c 2019-02-09 15:48:07.000000000 +0100
+++ new/uwsgi-2.0.19.1/plugins/corerouter/cr_common.c 2020-06-17 11:03:34.000000000 +0200
@@ -62,7 +62,7 @@
// fix SERVER_PORT
if (!ugs->port || !ugs->port_len) {
- ugs->port = strchr(ugs->name, ':');
+ ugs->port = strrchr(ugs->name, ':');
if (ugs->port) {
ugs->port++;
ugs->port_len = strlen(ugs->port);
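The `strchr` → `strrchr` switch here (and the identical fix in proto/http.c below) matters for IPv6 binds: `[::1]:8080` contains several colons, and only the last one separates the port. The equivalent last-colon split in Python (helper name is illustrative):

```python
def server_port(name):
    """Split on the LAST colon, matching the strchr -> strrchr fix:
    '[::1]:8080' has many colons; only the final one precedes the port."""
    head, sep, tail = name.rpartition(":")
    return tail if sep else None
```

With the old first-colon logic, `SERVER_PORT` for `[::1]:8080` would have come out as `:1]:8080`.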
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/uwsgi-2.0.18/plugins/gevent/gevent.c new/uwsgi-2.0.19.1/plugins/gevent/gevent.c
--- old/uwsgi-2.0.18/plugins/gevent/gevent.c 2019-02-09 15:48:07.000000000 +0100
+++ new/uwsgi-2.0.19.1/plugins/gevent/gevent.c 2020-06-17 11:03:34.000000000 +0200
@@ -433,8 +433,13 @@
ugevent.spawn = PyDict_GetItemString(gevent_dict, "spawn");
if (!ugevent.spawn) uwsgi_pyexit;
- ugevent.signal = PyDict_GetItemString(gevent_dict, "signal");
- if (!ugevent.signal) uwsgi_pyexit;
+ ugevent.signal = PyDict_GetItemString(gevent_dict, "signal_handler");
+ if (!ugevent.signal) {
+ // gevent.signal_handler appears in gevent 1.3.
+ // On older gevent, fall back to the deprecated gevent.signal.
+ ugevent.signal = PyDict_GetItemString(gevent_dict, "signal");
+ if (!ugevent.signal) uwsgi_pyexit;
+ }
ugevent.greenlet_switch = PyDict_GetItemString(gevent_dict, "sleep");
if (!ugevent.greenlet_switch) uwsgi_pyexit;
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/uwsgi-2.0.18/plugins/logsocket/logsocket_plugin.c new/uwsgi-2.0.19.1/plugins/logsocket/logsocket_plugin.c
--- old/uwsgi-2.0.18/plugins/logsocket/logsocket_plugin.c 2019-02-09 15:48:07.000000000 +0100
+++ new/uwsgi-2.0.19.1/plugins/logsocket/logsocket_plugin.c 2020-06-17 11:03:34.000000000 +0200
@@ -43,6 +43,7 @@
else {
ul->msg.msg_iov = uwsgi_malloc(sizeof(struct iovec));
ul->msg.msg_iovlen = 1;
+ ul->count = 0;
}
if (comma) {
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/uwsgi-2.0.18/plugins/php/php_plugin.c new/uwsgi-2.0.19.1/plugins/php/php_plugin.c
--- old/uwsgi-2.0.18/plugins/php/php_plugin.c 2019-02-09 15:48:07.000000000 +0100
+++ new/uwsgi-2.0.19.1/plugins/php/php_plugin.c 2020-06-17 11:03:34.000000000 +0200
@@ -1061,10 +1061,9 @@
SG(request_info).path_translated = wsgi_req->file;
+ memset(&file_handle, 0, sizeof(zend_file_handle));
file_handle.type = ZEND_HANDLE_FILENAME;
file_handle.filename = real_filename;
- file_handle.free_filename = 0;
- file_handle.opened_path = NULL;
if (php_request_startup(TSRMLS_C) == FAILURE) {
uwsgi_500(wsgi_req);
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/uwsgi-2.0.18/plugins/psgi/psgi.h new/uwsgi-2.0.19.1/plugins/psgi/psgi.h
--- old/uwsgi-2.0.18/plugins/psgi/psgi.h 2019-02-09 15:48:07.000000000 +0100
+++ new/uwsgi-2.0.19.1/plugins/psgi/psgi.h 2020-06-17 11:03:34.000000000 +0200
@@ -87,3 +87,5 @@
void uwsgi_perl_check_auto_reload(void);
void uwsgi_psgi_preinit_apps(void);
+
+extern struct uwsgi_perl uperl;
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/uwsgi-2.0.18/plugins/psgi/psgi_loader.c new/uwsgi-2.0.19.1/plugins/psgi/psgi_loader.c
--- old/uwsgi-2.0.18/plugins/psgi/psgi_loader.c 2019-02-09 15:48:07.000000000 +0100
+++ new/uwsgi-2.0.19.1/plugins/psgi/psgi_loader.c 2020-06-17 11:03:34.000000000 +0200
@@ -1,7 +1,6 @@
#include "psgi.h"
extern struct uwsgi_server uwsgi;
-struct uwsgi_perl uperl;
extern struct uwsgi_plugin psgi_plugin;
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/uwsgi-2.0.18/plugins/psgi/psgi_plugin.c new/uwsgi-2.0.19.1/plugins/psgi/psgi_plugin.c
--- old/uwsgi-2.0.18/plugins/psgi/psgi_plugin.c 2019-02-09 15:48:07.000000000 +0100
+++ new/uwsgi-2.0.19.1/plugins/psgi/psgi_plugin.c 2020-06-17 11:03:34.000000000 +0200
@@ -3,11 +3,7 @@
extern char **environ;
extern struct uwsgi_server uwsgi;
-#ifdef __APPLE__
-extern struct uwsgi_perl uperl;
-#else
struct uwsgi_perl uperl;
-#endif
struct uwsgi_plugin psgi_plugin;
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/uwsgi-2.0.18/plugins/psgi/uwsgi_plmodule.c new/uwsgi-2.0.19.1/plugins/psgi/uwsgi_plmodule.c
--- old/uwsgi-2.0.18/plugins/psgi/uwsgi_plmodule.c 2019-02-09 15:48:07.000000000 +0100
+++ new/uwsgi-2.0.19.1/plugins/psgi/uwsgi_plmodule.c 2020-06-17 11:03:34.000000000 +0200
@@ -883,7 +883,7 @@
XSRETURN_UNDEF;
}
- ST(0) = newSVpv(chunk, len);
+ ST(0) = newSVpvn(chunk, len);
sv_2mortal(ST(0));
XSRETURN(1);
}
@@ -902,7 +902,7 @@
XSRETURN_UNDEF;
}
- ST(0) = newSVpv(chunk, len);
+ ST(0) = newSVpvn(chunk, len);
sv_2mortal(ST(0));
XSRETURN(1);
}
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/uwsgi-2.0.18/plugins/pypy/pypy_setup.py new/uwsgi-2.0.19.1/plugins/pypy/pypy_setup.py
--- old/uwsgi-2.0.18/plugins/pypy/pypy_setup.py 2019-02-09 15:48:07.000000000 +0100
+++ new/uwsgi-2.0.19.1/plugins/pypy/pypy_setup.py 2020-06-17 11:03:34.000000000 +0200
@@ -326,10 +326,10 @@
if c[0] != '/':
c = os.getcwd() + '/' + c
try:
- from paste.script.util.logging_config import fileConfig
+ from logging.config import fileConfig
fileConfig(c)
except ImportError:
- print "PyPy WARNING: unable to load paste.script.util.logging_config"
+ print "PyPy WARNING: unable to load logging.config"
from paste.deploy import loadapp
wsgi_application = loadapp('config:%s' % c)
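Both the PyPy and CPython paste loaders now import `fileConfig` from the stdlib's `logging.config` instead of `paste.script.util.logging_config`, which was essentially a vendored copy of the same module. Where backward compatibility with very old paste.script installs is wanted, the import can be reversed into a fallback (a sketch; assumes the two expose a compatible `fileConfig`):

```python
# Prefer the stdlib module; fall back to the historical paste location
# only for environments still shipping paste.script's vendored copy.
try:
    from logging.config import fileConfig
except ImportError:
    from paste.script.util.logging_config import fileConfig
```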
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/uwsgi-2.0.18/plugins/python/pyloader.c new/uwsgi-2.0.19.1/plugins/python/pyloader.c
--- old/uwsgi-2.0.18/plugins/python/pyloader.c 2019-02-09 15:48:07.000000000 +0100
+++ new/uwsgi-2.0.19.1/plugins/python/pyloader.c 2020-06-17 11:03:34.000000000 +0200
@@ -678,7 +678,7 @@
uwsgi_log( "Loading paste environment: %s\n", paste);
if (up.paste_logger) {
- PyObject *paste_logger_dict = get_uwsgi_pydict("paste.script.util.logging_config");
+ PyObject *paste_logger_dict = get_uwsgi_pydict("logging.config");
if (paste_logger_dict) {
PyObject *paste_logger_fileConfig = PyDict_GetItemString(paste_logger_dict, "fileConfig");
if (paste_logger_fileConfig) {
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/uwsgi-2.0.18/plugins/python/uwsgi_pymodule.c new/uwsgi-2.0.19.1/plugins/python/uwsgi_pymodule.c
--- old/uwsgi-2.0.18/plugins/python/uwsgi_pymodule.c 2019-02-09 15:48:07.000000000 +0100
+++ new/uwsgi-2.0.19.1/plugins/python/uwsgi_pymodule.c 2020-06-17 11:03:34.000000000 +0200
@@ -2167,8 +2167,6 @@
goto clear;
}
- apps_tuple = PyDict_GetItemString(worker_dict, "apps");
-
PyDict_Clear(worker_dict);
zero = PyInt_FromLong(uwsgi.workers[i + 1].id);
@@ -2322,9 +2320,10 @@
PyTuple_SetItem(apps_tuple, j, apps_dict);
}
-
+
PyDict_SetItemString(worker_dict, "apps", apps_tuple);
+ Py_DECREF(apps_tuple);
}
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/uwsgi-2.0.18/plugins/python/wsgi_handlers.c new/uwsgi-2.0.19.1/plugins/python/wsgi_handlers.c
--- old/uwsgi-2.0.18/plugins/python/wsgi_handlers.c 2019-02-09 15:48:07.000000000 +0100
+++ new/uwsgi-2.0.19.1/plugins/python/wsgi_handlers.c 2020-06-17 11:03:34.000000000 +0200
@@ -461,31 +461,32 @@
}
PyObject *py_uwsgi_sendfile(PyObject * self, PyObject * args) {
-
+ int chunk_size;
+ PyObject *filelike;
struct wsgi_request *wsgi_req = py_current_wsgi_req();
- if (!PyArg_ParseTuple(args, "O|i:uwsgi_sendfile", &wsgi_req->async_sendfile, &wsgi_req->sendfile_fd_chunk)) {
+ if (!PyArg_ParseTuple(args, "O|i:uwsgi_sendfile", &filelike, &chunk_size)) {
return NULL;
}
-#ifdef PYTHREE
- wsgi_req->sendfile_fd = PyObject_AsFileDescriptor(wsgi_req->async_sendfile);
- if (wsgi_req->sendfile_fd >= 0) {
- Py_INCREF((PyObject *)wsgi_req->async_sendfile);
- }
-#else
- if (PyFile_Check((PyObject *)wsgi_req->async_sendfile)) {
- Py_INCREF((PyObject *)wsgi_req->async_sendfile);
- wsgi_req->sendfile_fd = PyObject_AsFileDescriptor(wsgi_req->async_sendfile);
+ if (!PyObject_HasAttrString(filelike, "read")) {
+ PyErr_SetString(PyExc_AttributeError, "object has no attribute 'read'");
+ return NULL;
}
-#endif
- // PEP 333 hack
- wsgi_req->sendfile_obj = wsgi_req->async_sendfile;
- //wsgi_req->sendfile_obj = (void *) PyTuple_New(0);
+ // wsgi.file_wrapper called a second time? Forget the old reference.
+ if (wsgi_req->async_sendfile) {
+ Py_DECREF(wsgi_req->async_sendfile);
+ }
- Py_INCREF((PyObject *) wsgi_req->sendfile_obj);
- return (PyObject *) wsgi_req->sendfile_obj;
+ // XXX: Not 100% sure why twice.
+ // Maybe: We keep one at async_sendfile and transfer
+ // one to the caller (even though he gave it to us).
+ Py_INCREF(filelike);
+ Py_INCREF(filelike);
+ wsgi_req->async_sendfile = filelike;
+ wsgi_req->sendfile_fd_chunk = chunk_size;
+ return filelike;
}
void threaded_swap_ts(struct wsgi_request *wsgi_req, struct uwsgi_app *wi) {
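The rewritten `py_uwsgi_sendfile` takes two references on the file-like object: one stored on `wsgi_req->async_sendfile` (released later in the `clear:` path of wsgi_subhandler.c) and one transferred to the caller with the return value. The same ownership rule is visible from Python, where every extra name holding an object bumps its refcount (CPython-specific; `sys.getrefcount` includes its own temporary argument reference):

```python
import sys

obj = object()
before = sys.getrefcount(obj)
extra = obj   # analogous to the reference parked on async_sendfile
after = sys.getrefcount(obj)
```

Dropping either reference too early is exactly the kind of bug the old `sendfile_obj` juggling invited.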
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/uwsgi-2.0.18/plugins/python/wsgi_subhandler.c new/uwsgi-2.0.19.1/plugins/python/wsgi_subhandler.c
--- old/uwsgi-2.0.18/plugins/python/wsgi_subhandler.c 2019-02-09 15:48:07.000000000 +0100
+++ new/uwsgi-2.0.19.1/plugins/python/wsgi_subhandler.c 2020-06-17 11:03:34.000000000 +0200
@@ -76,7 +76,18 @@
return 1;
}
return 0;
-}
+}
+
+/*
+ * attempt to make a fd from a file-like PyObject - returns -1 on error.
+ */
+static int uwsgi_python_try_filelike_as_fd(PyObject *filelike) {
+ int fd;
+ if ((fd = PyObject_AsFileDescriptor(filelike)) < 0) {
+ PyErr_Clear();
+ }
+ return fd;
+}
/*
this is a hack for supporting non-file object passed to wsgi.file_wrapper
@@ -253,7 +264,9 @@
if (uwsgi_python_send_body(wsgi_req, (PyObject *)wsgi_req->async_result)) goto clear;
}
- if (wsgi_req->sendfile_obj == wsgi_req->async_result) {
+ // Check if wsgi.file_wrapper has been used.
+ if (wsgi_req->async_sendfile == wsgi_req->async_result) {
+ wsgi_req->sendfile_fd = uwsgi_python_try_filelike_as_fd((PyObject *)wsgi_req->async_sendfile);
if (wsgi_req->sendfile_fd >= 0) {
UWSGI_RELEASE_GIL
uwsgi_response_sendfile_do(wsgi_req, wsgi_req->sendfile_fd, 0, 0);
@@ -299,8 +312,14 @@
Py_DECREF(pychunk);
goto clear;
}
- }
- else if (wsgi_req->sendfile_obj == pychunk) {
+ } else if (wsgi_req->async_sendfile == pychunk) {
+ //
+ // XXX: It's not clear whether this can ever be sensibly reached
+ // based on PEP 333/3333 - it would mean the iterator yielded
+ // the result of wsgi.file_wrapper. However, the iterator should
+ // only ever yield `bytes`...
+ //
+ wsgi_req->sendfile_fd = uwsgi_python_try_filelike_as_fd((PyObject *)wsgi_req->async_sendfile);
if (wsgi_req->sendfile_fd >= 0) {
UWSGI_RELEASE_GIL
uwsgi_response_sendfile_do(wsgi_req, wsgi_req->sendfile_fd, 0, 0);
@@ -310,22 +329,30 @@
else if (PyObject_HasAttrString(pychunk, "read")) {
uwsgi_python_consume_file_wrapper_read(wsgi_req, pychunk);
}
-
uwsgi_py_check_write_errors {
uwsgi_py_write_exception(wsgi_req);
Py_DECREF(pychunk);
goto clear;
}
+ } else if (pychunk != Py_None) {
+ // The iterator returned something that we were not able to handle.
+ PyObject *pystr = PyObject_Repr(pychunk);
+#ifdef PYTHREE
+ const char *cstr = PyUnicode_AsUTF8(pystr);
+#else
+ const char *cstr = PyString_AsString(pystr);
+#endif
+ uwsgi_log("[ERROR] Unhandled object from iterator: %s (%p)\n", cstr, pychunk);
+ Py_DECREF(pystr);
}
-
Py_DECREF(pychunk);
return UWSGI_AGAIN;
clear:
-
- if (wsgi_req->sendfile_fd != -1) {
- Py_DECREF((PyObject *)wsgi_req->async_sendfile);
+ // Release the reference that we took in py_uwsgi_sendfile.
+ if (wsgi_req->async_sendfile != NULL) {
+ Py_DECREF((PyObject *) wsgi_req->async_sendfile);
}
if (wsgi_req->async_placeholder) {
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/uwsgi-2.0.18/plugins/pyuwsgi/pyuwsgi.c new/uwsgi-2.0.19.1/plugins/pyuwsgi/pyuwsgi.c
--- old/uwsgi-2.0.18/plugins/pyuwsgi/pyuwsgi.c 2019-02-09 15:48:07.000000000 +0100
+++ new/uwsgi-2.0.19.1/plugins/pyuwsgi/pyuwsgi.c 2020-06-17 11:03:34.000000000 +0200
@@ -82,7 +82,7 @@
PyObject *py_str = PyObject_Str(item);
PyList_Append(args_li, py_str);
#ifdef PYTHREE
- char *str = PyUnicode_AsUTF8(py_str);
+ const char *str = PyUnicode_AsUTF8(py_str);
size += strlen(str) + 1;
#else
size += PyObject_Length(item) + 1;
@@ -102,7 +102,7 @@
for (i = 0; i < new_argc; i++) {
PyObject *arg = PyList_GetItem(args_li, i);
#ifdef PYTHREE
- char *arg_str = PyUnicode_AsUTF8(arg);
+ const char *arg_str = PyUnicode_AsUTF8(arg);
#else
char *arg_str = PyString_AsString(arg);
#endif
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/uwsgi-2.0.18/plugins/redislog/redislog_plugin.c new/uwsgi-2.0.19.1/plugins/redislog/redislog_plugin.c
--- old/uwsgi-2.0.18/plugins/redislog/redislog_plugin.c 2019-02-09 15:48:07.000000000 +0100
+++ new/uwsgi-2.0.19.1/plugins/redislog/redislog_plugin.c 2020-06-17 11:03:34.000000000 +0200
@@ -159,7 +159,7 @@
uredislog->fd = uwsgi_connect(uredislog->address, uwsgi.socket_timeout, 0);
if (uredislog->password) {
setup_iov.iov_len = snprintf(
- setup_buf, sizeof (setup_buf), "*2\r\n$4\r\nauth\r\n$%lu\r\n%*s\r\n",
+ setup_buf, sizeof (setup_buf), "*2\r\n$4\r\nauth\r\n$%zu\r\n%*s\r\n",
strlen(uredislog->password), (int)strlen(uredislog->password), uredislog->password);
setup_iov.iov_base = setup_buf;
ret = writev(uredislog->fd, &setup_iov, 1);
@@ -172,7 +172,7 @@
}
if (uredislog->id) {
setup_iov.iov_len = snprintf(
- setup_buf, sizeof (setup_buf), "*2\r\n$6\r\nselect\r\n$%lu\r\n%*s\r\n",
+ setup_buf, sizeof (setup_buf), "*2\r\n$6\r\nselect\r\n$%zu\r\n%*s\r\n",
strlen(uredislog->id), (int)strlen(uredislog->id), uredislog->id);
setup_iov.iov_base = setup_buf;
ret = writev(uredislog->fd, &setup_iov, 1);
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/uwsgi-2.0.18/plugins/router_static/router_static.c new/uwsgi-2.0.19.1/plugins/router_static/router_static.c
--- old/uwsgi-2.0.18/plugins/router_static/router_static.c 2019-02-09 15:48:07.000000000 +0100
+++ new/uwsgi-2.0.19.1/plugins/router_static/router_static.c 2020-06-17 11:03:34.000000000 +0200
@@ -239,7 +239,7 @@
}
if (wsgi_req->socket->can_offload) {
- if (!uwsgi_offload_request_sendfile_do(wsgi_req, fd, st.st_size)) {
+ if (!uwsgi_offload_request_sendfile_do(wsgi_req, fd, 0, st.st_size)) {
wsgi_req->via = UWSGI_VIA_OFFLOAD;
wsgi_req->response_size += st.st_size;
// the fd will be closed by the offload engine
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/uwsgi-2.0.18/plugins/transformation_offload/offload.c new/uwsgi-2.0.19.1/plugins/transformation_offload/offload.c
--- old/uwsgi-2.0.18/plugins/transformation_offload/offload.c 2019-02-09 15:48:07.000000000 +0100
+++ new/uwsgi-2.0.19.1/plugins/transformation_offload/offload.c 2020-06-17 11:03:34.000000000 +0200
@@ -25,7 +25,7 @@
struct uwsgi_transformation *orig_ut = (struct uwsgi_transformation *) ut->data;
// sendfile offload
if (orig_ut->fd > -1) {
- if (!uwsgi_offload_request_sendfile_do(wsgi_req, orig_ut->fd, orig_ut->len)) {
+ if (!uwsgi_offload_request_sendfile_do(wsgi_req, orig_ut->fd, 0, orig_ut->len)) {
// the fd will be closed by the offload engine
orig_ut->fd = -1;
wsgi_req->via = UWSGI_VIA_OFFLOAD;
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/uwsgi-2.0.18/proto/http.c new/uwsgi-2.0.19.1/proto/http.c
--- old/uwsgi-2.0.18/proto/http.c 2019-02-09 15:48:07.000000000 +0100
+++ new/uwsgi-2.0.19.1/proto/http.c 2020-06-17 11:03:34.000000000 +0200
@@ -276,7 +276,7 @@
}
}
else {
- char *server_port = strchr(wsgi_req->socket->name, ':');
+ char *server_port = strrchr(wsgi_req->socket->name, ':');
if (server_port) {
wsgi_req->uh->pktsize += proto_base_add_uwsgi_var(wsgi_req, "SERVER_PORT", 11, server_port+1, strlen(server_port+1));
}
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/uwsgi-2.0.18/setup.py new/uwsgi-2.0.19.1/setup.py
--- old/uwsgi-2.0.18/setup.py 2019-02-09 15:48:07.000000000 +0100
+++ new/uwsgi-2.0.19.1/setup.py 2020-06-17 11:03:34.000000000 +0200
@@ -136,5 +136,7 @@
'Programming Language :: Python :: 3.4',
'Programming Language :: Python :: 3.5',
'Programming Language :: Python :: 3.6',
+ 'Programming Language :: Python :: 3.7',
+ 'Programming Language :: Python :: 3.8',
],
)
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/uwsgi-2.0.18/tests/static/test.txt new/uwsgi-2.0.19.1/tests/static/test.txt
--- old/uwsgi-2.0.18/tests/static/test.txt 1970-01-01 01:00:00.000000000 +0100
+++ new/uwsgi-2.0.19.1/tests/static/test.txt 2020-06-17 11:03:34.000000000 +0200
@@ -0,0 +1 @@
+cookie
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/uwsgi-2.0.18/tests/static/test2.txt new/uwsgi-2.0.19.1/tests/static/test2.txt
--- old/uwsgi-2.0.18/tests/static/test2.txt 1970-01-01 01:00:00.000000000 +0100
+++ new/uwsgi-2.0.19.1/tests/static/test2.txt 2020-06-17 11:03:34.000000000 +0200
@@ -0,0 +1 @@
+cookie2
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/uwsgi-2.0.18/tests/testfilewrapper.py new/uwsgi-2.0.19.1/tests/testfilewrapper.py
--- old/uwsgi-2.0.18/tests/testfilewrapper.py 1970-01-01 01:00:00.000000000 +0100
+++ new/uwsgi-2.0.19.1/tests/testfilewrapper.py 2020-06-17 11:03:34.000000000 +0200
@@ -0,0 +1,122 @@
+from __future__ import print_function
+import gc
+import io
+import os.path
+import time
+
+import flask
+import flask.helpers
+
+
+application = flask.Flask(__name__)
+
+FILENAME = os.path.join(os.path.dirname(__file__), "static", "test.txt")
+FILENAME2 = os.path.join(os.path.dirname(__file__), "static", "test2.txt")
+
+@application.after_request
+def _after(response):
+ gc.collect()
+ fds = os.listdir("/proc/self/fd")
+ print("PY: objects:", len(gc.get_objects()), "fds:", len(fds))
+ return response
+
+(a)application.route("/")
+def index():
+ return "HELLO\n"
+
+(a)application.route("/1")
+def send_file_1():
+ fp = open(FILENAME, "rb")
+ return flask.send_file(fp, attachment_filename="test.txt")
+
+
+(a)application.route("/2")
+def send_file_2():
+ bio = io.BytesIO(b"cookie\n")
+ return flask.send_file(bio, attachment_filename="test.txt")
+
+
+(a)application.route("/3")
+def send_file_3():
+ """
+ What happens if we call the wsgi.file_wrapper twice?
+
+ This should respond with cookie2
+ """
+ fp = open(FILENAME, "rb")
+ flask.send_file(fp, attachment_filename="test.txt")
+ fp = open(FILENAME2, "rb")
+ return flask.send_file(fp, attachment_filename="test.txt")
+
+
+(a)application.route("/4")
+def send_file_4():
+ """
+ Non-filelike object to send_file/wrap_file/wsgi.file_wrapper.
+
+ AttributeError on the call to wsgi.file_wrapper.
+ """
+ return flask.send_file(object(), attachment_filename="test.txt")
+
+
+(a)application.route("/stream1")
+def stream1():
+ """
+ Unrelated to wsgi.file_wrapper, just ensuring the iterator stuff still works.
+ """
+ def _yield():
+ start = time.time()
+ for i in range(3):
+ time.sleep(0.1)
+ yield " {:.2f} cookie".format(time.time() - start).encode("utf-8")
+ yield b"\n"
+ return flask.Response(_yield(), mimetype="text/plain")
+
+
+(a)application.route("/stream2")
+def stream2():
+ """
+ Yielding the result of a wrap_file call with a file object.
+
+ gunicorn / werkzeug do not support this as it's not required.
+ """
+ fp = open(FILENAME, "rb")
+ resp = flask.helpers.wrap_file(flask.request.environ, fp)
+ print("PY: resp after return", hex(id(resp)))
+
+ def _yield():
+ print("PY: _yield() run", hex(id(resp)), repr(resp))
+ yield resp
+ return flask.Response(_yield())
+
+
+(a)application.route("/stream3")
+def stream3():
+ """
+ Yielding the result of a wrap_file call with a BytesIO object.
+
+ gunicorn / werkzeug do not support this as it's not required.
+ """
+ bio = io.BytesIO(b"cookie\n")
+ resp = flask.helpers.wrap_file(flask.request.environ, bio)
+
+ def _yield():
+ yield resp
+ return flask.Response(_yield())
+
+
+(a)application.route("/stream4")
+def stream4():
+ """
+ werkzeug logs: AssertionError: applications must write bytes
+ gunicorn logs: TypeError: <Response streamed [200 OK]> is not a byte
+ uwsgi didn't log, should now..
+ """
+ fp = open(FILENAME, "rb")
+ resp = flask.send_file(fp, attachment_filename="test.txt")
+ print("PY: resp after return", hex(id(resp)))
+
+ def _yield():
+ print("PY: _yield() run", hex(id(resp)), repr(resp))
+ yield resp
+ return flask.Response(_yield(), direct_passthrough=True)
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/uwsgi-2.0.18/tests/testrpc.py new/uwsgi-2.0.19.1/tests/testrpc.py
--- old/uwsgi-2.0.18/tests/testrpc.py 1970-01-01 01:00:00.000000000 +0100
+++ new/uwsgi-2.0.19.1/tests/testrpc.py 2020-06-17 11:03:34.000000000 +0200
@@ -0,0 +1,13 @@
+import uwsgi
+
+def hello():
+ pass
+
+def application(env, start_response):
+ try:
+ uwsgi.register_rpc("A"*300, hello)
+ start_response('500 Buffer Overflow', [('Content-Type', 'text/plain')])
+ except ValueError:
+ start_response('200 OK', [('Content-Type', 'text/plain')])
+
+ return ()
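The new testrpc.py checks that registering an oversized RPC name ("A" * 300) raises `ValueError` rather than overflowing uwsgi's fixed-size name buffer. The shape of the check it exercises, as a standalone sketch (the 255-byte limit is an assumption for illustration, not the exact upstream constant):

```python
def check_rpc_name(name, limit=255):
    """Reject RPC names too long for a fixed-size buffer, raising
    ValueError the way the patched uwsgi.register_rpc() does.
    NOTE: limit=255 is a hypothetical value for this sketch."""
    if len(name) > limit:
        raise ValueError("uwsgi.register_rpc() name too long")
```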
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/uwsgi-2.0.18/tests/testworkers.py new/uwsgi-2.0.19.1/tests/testworkers.py
--- old/uwsgi-2.0.18/tests/testworkers.py 1970-01-01 01:00:00.000000000 +0100
+++ new/uwsgi-2.0.19.1/tests/testworkers.py 2020-06-17 11:03:34.000000000 +0200
@@ -0,0 +1,26 @@
+"""
+Regression test for #2056 - uwsgi.workers() leaking objects.
+"""
+import uwsgi
+import gc
+
+
+def application(env, start_response):
+ gc.collect()
+ start_objs = len(gc.get_objects())
+
+ for i in range(200):
+ uwsgi.workers()
+
+ gc.collect()
+ end_objs = len(gc.get_objects())
+ diff_objs = end_objs - start_objs
+
+ # Sometimes there is a spurious diff of 4 objects or so.
+ if diff_objs > 10:
+ start_response('500 Leaking', [('Content-Type', 'text/plain')])
+ yield "Leaking objects...\n".format(diff_objs).encode("utf-8")
+ else:
+ start_response('200 OK', [('Content-Type', 'text/plain')])
+
+ yield "{} {} {}\n".format(start_objs, end_objs, diff_objs).encode("utf-8")
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/uwsgi-2.0.18/tests/travis.sh new/uwsgi-2.0.19.1/tests/travis.sh
--- old/uwsgi-2.0.18/tests/travis.sh 2019-02-09 15:48:07.000000000 +0100
+++ new/uwsgi-2.0.19.1/tests/travis.sh 2020-06-17 11:03:34.000000000 +0200
@@ -1,4 +1,5 @@
#!/bin/bash
+set -u
txtund=$(tput sgr 0 1) # underline
@@ -57,13 +58,13 @@
test_python() {
date > reload.txt
rm -f uwsgi.log
- echo -e "${bldyel}================== TESTING $1 =====================${txtrst}"
+ echo -e "${bldyel}================== TESTING $1 $2 =====================${txtrst}"
echo -e "${bldyel}>>> Spawning uWSGI python app${txtrst}"
echo -en "${bldred}"
- ./uwsgi --master --plugin 0:$1 --http :8080 --exit-on-reload --touch-reload reload.txt --wsgi-file tests/staticfile.py --daemonize uwsgi.log
+ ./uwsgi --master --plugin 0:$1 --http :8080 --exit-on-reload --touch-reload reload.txt --wsgi-file $2 --daemonize uwsgi.log
echo -en "${txtrst}"
http_test "http://localhost:8080/"
- echo -e "${bldyel}===================== DONE $1 =====================${txtrst}\n\n"
+ echo -e "${bldyel}===================== DONE $1 $2 =====================${txtrst}\n\n"
}
@@ -71,31 +72,35 @@
date > reload.txt
rm -f uwsgi.log
# the code assumes that ruby environment is activated by `rvm use`
- echo -e "${bldyel}================== TESTING $1 =====================${txtrst}"
+ echo -e "${bldyel}================== TESTING $1 $2 =====================${txtrst}"
echo -e "${bldyel}>>> Installing sinatra gem using gem${txtrst}"
gem install sinatra || die
echo -e "${bldyel}>>> Spawning uWSGI rack app${txtrst}"
echo -en "${bldred}"
- ./uwsgi --master --plugin 0:$1 --http :8080 --exit-on-reload --touch-reload reload.txt --rack examples/config2.ru --daemonize uwsgi.log
+ ./uwsgi --master --plugin 0:$1 --http :8080 --exit-on-reload --touch-reload reload.txt --rack $2 --daemonize uwsgi.log
echo -en "${txtrst}"
http_test "http://localhost:8080/hi"
- echo -e "${bldyel}===================== DONE $1 =====================${txtrst}\n\n"
+ echo -e "${bldyel}===================== DONE $1 $2 =====================${txtrst}\n\n"
}
while read PV ; do
- test_python $PV
+ for WSGI_FILE in tests/staticfile.py tests/testworkers.py tests/testrpc.py ; do
+ test_python $PV $WSGI_FILE
+ done
done < <(cat .travis.yml | grep "plugins/python base" | sed s_".*plugins/python base "_""_g)
while read RV ; do
- test_rack $RV
+ for RACK in examples/config2.ru ; do
+ test_rack $RV $RACK
+ done
done < <(cat .travis.yml | grep "plugins/rack base" | sed s_".*plugins/rack base "_""_g)
-echo "${bldgre}>>> $SUCCESS SUCCESSFUL PLUGIN(S)${txtrst}"
+echo "${bldgre}>>> $SUCCESS SUCCESSFUL TEST(S)${txtrst}"
if [ $ERROR -ge 1 ]; then
- echo "${bldred}>>> $ERROR FAILED PLUGIN(S)${txtrst}"
+ echo "${bldred}>>> $ERROR FAILED TEST(S)${txtrst}"
exit 1
fi
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/uwsgi-2.0.18/tests/werkzeug.py new/uwsgi-2.0.19.1/tests/werkzeug.py
--- old/uwsgi-2.0.18/tests/werkzeug.py 2019-02-09 15:48:07.000000000 +0100
+++ new/uwsgi-2.0.19.1/tests/werkzeug.py 1970-01-01 01:00:00.000000000 +0100
@@ -1,5 +0,0 @@
-import uwsgi
-
-print(uwsgi.opt)
-print(uwsgi.magic_table)
-from werkzeug.testapp import test_app as application
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/uwsgi-2.0.18/tests/werkzeug_app.py new/uwsgi-2.0.19.1/tests/werkzeug_app.py
--- old/uwsgi-2.0.18/tests/werkzeug_app.py 1970-01-01 01:00:00.000000000 +0100
+++ new/uwsgi-2.0.19.1/tests/werkzeug_app.py 2020-06-17 11:03:34.000000000 +0200
@@ -0,0 +1,5 @@
+import uwsgi
+
+print(uwsgi.opt)
+print(uwsgi.magic_table)
+from werkzeug.testapp import test_app as application
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/uwsgi-2.0.18/uwsgi.gemspec new/uwsgi-2.0.19.1/uwsgi.gemspec
--- old/uwsgi-2.0.18/uwsgi.gemspec 2019-02-09 15:48:07.000000000 +0100
+++ new/uwsgi-2.0.19.1/uwsgi.gemspec 2020-06-17 11:03:34.000000000 +0200
@@ -2,7 +2,7 @@
s.name = 'uwsgi'
s.license = 'GPL-2'
s.version = `python -c "import uwsgiconfig as uc; print uc.uwsgi_version"`.sub(/-dev-.*/,'')
- s.date = '2019-02-09'
+ s.date = '2020-06-14'
s.summary = "uWSGI"
s.description = "The uWSGI server for Ruby/Rack"
s.authors = ["Unbit"]
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/uwsgi-2.0.18/uwsgi.h new/uwsgi-2.0.19.1/uwsgi.h
--- old/uwsgi-2.0.18/uwsgi.h 2019-02-09 15:48:07.000000000 +0100
+++ new/uwsgi-2.0.19.1/uwsgi.h 2020-06-17 11:03:34.000000000 +0200
@@ -614,6 +614,8 @@
char *chdir;
int max_throttle;
+
+ int notifypid;
};
struct uwsgi_logger {
@@ -1631,6 +1633,8 @@
struct sockaddr_in6 sin6;
struct sockaddr_un sun;
} client_addr;
+
+ uint8_t websocket_is_fin;
};
@@ -2837,6 +2841,13 @@
#ifdef UWSGI_SSL
int tlsv1;
#endif
+
+ // uWSGI 2.0.19
+ int emperor_graceful_shutdown;
+ int is_chrooted;
+ struct uwsgi_buffer *websockets_continuation_buffer;
+
+ uint64_t max_worker_lifetime_delta;
};
struct uwsgi_rpc {
@@ -3380,6 +3391,7 @@
char *uwsgi_num2str(int);
char *uwsgi_float2str(float);
char *uwsgi_64bit2str(int64_t);
+char *uwsgi_size2str(size_t);
char *magic_sub(char *, size_t, size_t *, char *[]);
void init_magic_table(char *[]);
@@ -4405,7 +4417,7 @@
void uwsgi_offload_engines_register_all(void);
struct uwsgi_thread *uwsgi_offload_thread_start(void);
-int uwsgi_offload_request_sendfile_do(struct wsgi_request *, int, size_t);
+int uwsgi_offload_request_sendfile_do(struct wsgi_request *, int, size_t, size_t);
int uwsgi_offload_request_net_do(struct wsgi_request *, char *, struct uwsgi_buffer *);
int uwsgi_offload_request_memory_do(struct wsgi_request *, char *, size_t);
int uwsgi_offload_request_pipe_do(struct wsgi_request *, int, size_t);
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/uwsgi-2.0.18/uwsgiconfig.py new/uwsgi-2.0.19.1/uwsgiconfig.py
--- old/uwsgi-2.0.18/uwsgiconfig.py 2019-02-09 15:48:07.000000000 +0100
+++ new/uwsgi-2.0.19.1/uwsgiconfig.py 2020-06-17 11:03:34.000000000 +0200
@@ -1,6 +1,6 @@
# uWSGI build system
-uwsgi_version = '2.0.18'
+uwsgi_version = '2.0.19.1'
import os
import re
@@ -27,6 +27,10 @@
except:
import configparser as ConfigParser
+try:
+ from shlex import quote
+except ImportError:
+ from pipes import quote
PY3 = sys.version_info[0] == 3
@@ -203,11 +207,12 @@
def spcall3(cmd):
p = subprocess.Popen(cmd, shell=True, stdin=open('/dev/null'), stderr=subprocess.PIPE, stdout=subprocess.PIPE)
+ (out, err) = p.communicate()
- if p.wait() == 0:
+ if p.returncode == 0:
if sys.version_info[0] > 2:
- return p.stderr.read().rstrip().decode()
- return p.stderr.read().rstrip()
+ return err.rstrip().decode()
+ return err.rstrip()
else:
return None
@@ -592,7 +597,7 @@
t.join()
print("*** uWSGI linking ***")
- ldline = "%s -o %s %s %s %s" % (GCC, bin_name, ' '.join(uniq_warnings(ldflags)),
+ ldline = "%s -o %s %s %s %s" % (GCC, quote(bin_name), ' '.join(uniq_warnings(ldflags)),
' '.join(map(add_o, gcc_list)), ' '.join(uniq_warnings(libs)))
print(ldline)
ret = os.system(ldline)
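The two uwsgiconfig.py fixes above follow well-known patterns: `communicate()` drains the child's pipes while waiting (checking `p.wait()` first can deadlock once the child fills a pipe buffer), and `shlex.quote` shell-escapes the binary name embedded in the `os.system` link line. A minimal Python 3 sketch of both (simplified from the patch; the example path is hypothetical):

```python
import subprocess

try:
    from shlex import quote  # Python 3.3+
except ImportError:
    from pipes import quote  # fallback for very old Pythons

def spcall3(cmd):
    # communicate() reads stdout/stderr concurrently with waiting, so a
    # chatty child cannot fill a pipe buffer and block forever.
    p = subprocess.Popen(cmd, shell=True, stdin=subprocess.DEVNULL,
                         stdout=subprocess.PIPE, stderr=subprocess.PIPE)
    out, err = p.communicate()
    if p.returncode == 0:
        return err.rstrip().decode()
    return None

# quote() makes a path with spaces safe on a shell-interpreted command line
binary = "my build/uwsgi"  # hypothetical output path containing a space
ldline = "gcc -o %s main.o" % quote(binary)
```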
[opensuse-commit] commit b4 for openSUSE:Factory
by User for buildservice source handling 29 Nov '20
Hello community,
here is the log from the commit of package b4 for openSUSE:Factory checked in at 2020-11-29 12:29:25
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/b4 (Old)
and /work/SRC/openSUSE:Factory/.b4.new.5913 (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Package is "b4"
Sun Nov 29 12:29:25 2020 rev:7 rq:851292 version:0.5.3+0
Changes:
--------
--- /work/SRC/openSUSE:Factory/b4/b4.changes 2020-11-23 18:37:55.912587726 +0100
+++ /work/SRC/openSUSE:Factory/.b4.new.5913/b4.changes 2020-11-29 12:29:30.638067862 +0100
@@ -1,0 +2,10 @@
+Fri Nov 27 11:02:21 UTC 2020 - jslaby(a)suse.cz
+
+- Update to version 0.5.3+0:
+ * Increment version to 0.5.3 in prep for release
+ * Unbreak thanks-tracking
+ * Fix crash on incomplete series thanks tracking
+ * Improve ty with cherrypicked subsets
+ * Unquote msgid if we're getting a full https URL
+
+-------------------------------------------------------------------
Old:
----
b4-0.5.2+9.obscpio
New:
----
b4-0.5.3+0.obscpio
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Other differences:
------------------
++++++ b4.spec ++++++
--- /var/tmp/diff_new_pack.Uff30E/_old 2020-11-29 12:29:31.130068360 +0100
+++ /var/tmp/diff_new_pack.Uff30E/_new 2020-11-29 12:29:31.134068363 +0100
@@ -17,9 +17,9 @@
%define skip_python2 1
-%define version_unconverted 0.5.2+9
+%define version_unconverted 0.5.3+0
Name: b4
-Version: 0.5.2+9
+Version: 0.5.3+0
Release: 0
Summary: Helper scripts for kernel.org patches
License: GPL-2.0-or-later
++++++ _servicedata ++++++
--- /var/tmp/diff_new_pack.Uff30E/_old 2020-11-29 12:29:31.178068409 +0100
+++ /var/tmp/diff_new_pack.Uff30E/_new 2020-11-29 12:29:31.178068409 +0100
@@ -1,4 +1,4 @@
<servicedata>
<service name="tar_scm">
<param name="url">git://git.kernel.org/pub/scm/utils/b4/b4.git</param>
- <param name="changesrevision">46b9e092457c2921653704a4f824c9674f5967cb</param></service></servicedata>
\ No newline at end of file
+ <param name="changesrevision">08d58022600f213e8f71d192fb672d03593f2c62</param></service></servicedata>
\ No newline at end of file
++++++ b4-0.5.2+9.obscpio -> b4-0.5.3+0.obscpio ++++++
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/b4-0.5.2+9/b4/__init__.py new/b4-0.5.3+0/b4/__init__.py
--- old/b4-0.5.2+9/b4/__init__.py 2020-11-17 21:54:52.000000000 +0100
+++ new/b4-0.5.3+0/b4/__init__.py 2020-11-25 23:04:35.000000000 +0100
@@ -28,7 +28,7 @@
charset.add_charset('utf-8', None)
emlpolicy = email.policy.EmailPolicy(utf8=True, cte_type='8bit', max_line_length=None)
-__VERSION__ = '0.5.2'
+__VERSION__ = '0.5.3'
ATTESTATION_FORMAT_VER = '0.1'
logger = logging.getLogger('b4')
@@ -1922,7 +1922,7 @@
matches = re.search(r'^https?://[^/]+/([^/]+)/([^/]+@[^/]+)', msgid, re.IGNORECASE)
if matches:
chunks = matches.groups()
- msgid = chunks[1]
+ msgid = urllib.parse.unquote(chunks[1])
# Infer the project name from the URL, if possible
if chunks[0] != 'r':
cmdargs.useproject = chunks[0]
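The hunk above makes b4 percent-decode the message-id component when the user pastes a full archive URL instead of a bare msgid. A self-contained sketch of the behavior (the helper name is illustrative, not b4's API):

```python
import re
import urllib.parse

def msgid_from_url(msgid):
    # Accept either a bare message-id or a full https://.../<project>/<msgid> URL
    matches = re.search(r'^https?://[^/]+/([^/]+)/([^/]+@[^/]+)', msgid,
                        re.IGNORECASE)
    if matches:
        chunks = matches.groups()
        # Undo percent-escapes (e.g. %2B for '+') before using the msgid
        msgid = urllib.parse.unquote(chunks[1])
    return msgid
```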
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/b4-0.5.2+9/b4/mbox.py new/b4-0.5.3+0/b4/mbox.py
--- old/b4-0.5.2+9/b4/mbox.py 2020-11-17 21:54:52.000000000 +0100
+++ new/b4-0.5.3+0/b4/mbox.py 2020-11-25 23:04:35.000000000 +0100
@@ -246,10 +246,6 @@
def thanks_record_am(lser, cherrypick=None):
- if not lser.complete:
- logger.debug('Incomplete series, not tracking for thanks')
- return
-
# Are we tracking this already?
datadir = b4.get_data_dir()
slug = lser.get_slug(extended=True)
@@ -257,24 +253,39 @@
patches = list()
at = 0
- for pmsg in lser.patches[1:]:
- at += 1
+ padlen = len(str(lser.expected))
+ lmsg = None
+
+ for pmsg in lser.patches:
if pmsg is None:
+ at += 1
+ continue
+
+ if lmsg is None:
+ lmsg = pmsg
+
+ if not pmsg.has_diff:
+ # Don't care about the cover letter
+ at += 1
continue
if cherrypick is not None and at not in cherrypick:
logger.debug('Skipped non-cherrypicked: %s', at)
+ at += 1
continue
pmsg.load_hashes()
if pmsg.attestation is None:
logger.debug('Unable to get hashes for all patches, not tracking for thanks')
return
- patches.append((pmsg.subject, pmsg.pwhash, pmsg.msgid))
- lmsg = lser.patches[0]
+ prefix = '%s/%s' % (str(pmsg.counter).zfill(padlen), pmsg.expected)
+ patches.append((pmsg.subject, pmsg.pwhash, pmsg.msgid, prefix))
+ at += 1
+
if lmsg is None:
- lmsg = lser.patches[1]
+ logger.debug('All patches missing, not tracking for thanks')
+ return
allto = email.utils.getaddresses([str(x) for x in lmsg.msg.get_all('to', [])])
allcc = email.utils.getaddresses([str(x) for x in lmsg.msg.get_all('cc', [])])
@@ -289,6 +300,7 @@
'references': b4.LoreMessage.clean_header(lmsg.msg['References']),
'sentdate': b4.LoreMessage.clean_header(lmsg.msg['Date']),
'quote': b4.make_quote(lmsg.body, maxlines=5),
+ 'cherrypick': cherrypick is not None,
'patches': patches,
}
fullpath = os.path.join(datadir, filename)
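The `padlen`/`prefix` logic introduced above zero-pads each patch's position to the width of the series size, so summaries line up in columns. A tiny sketch (function name is illustrative):

```python
def patch_prefix(counter, expected):
    # Width of the padding follows the series size: 2 of 10 -> "02/10",
    # 2 of 100 -> "002/100".
    padlen = len(str(expected))
    return '%s/%s' % (str(counter).zfill(padlen), expected)
```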
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/b4-0.5.2+9/b4/ty.py new/b4-0.5.3+0/b4/ty.py
--- old/b4-0.5.2+9/b4/ty.py 2020-11-17 21:54:52.000000000 +0100
+++ new/b4-0.5.3+0/b4/ty.py 2020-11-25 23:04:35.000000000 +0100
@@ -99,10 +99,11 @@
else:
msg['References'] = '<%s>' % jsondata['msgid']
- if jsondata['subject'].find('Re: ') < 0:
- msg['Subject'] = 'Re: %s' % jsondata['subject']
+ subject = re.sub(r'^Re:\s+', '', jsondata['subject'], flags=re.I)
+ if jsondata.get('cherrypick'):
+ msg['Subject'] = 'Re: (subset) ' + subject
else:
- msg['Subject'] = jsondata['subject']
+ msg['Subject'] = 'Re: ' + subject
mydomain = jsondata['myemail'].split('@')[1]
msg['Message-Id'] = email.utils.make_msgid(idstring='b4-ty', domain=mydomain)
@@ -189,11 +190,13 @@
# We need to find all of them in the commits
found = list()
matches = 0
+ at = 0
for patch in jsondata['patches']:
+ at += 1
logger.debug('Checking %s', patch)
if patch[1] in patchids:
logger.debug('Found: %s', patch[0])
- found.append(commits[patch[1]])
+ found.append((at, commits[patch[1]][0]))
matches += 1
else:
# try to locate by subject
@@ -201,7 +204,7 @@
for pwhash, commit in commits.items():
if commit[1] == patch[0]:
logger.debug('Matched using subject')
- found.append(commit)
+ found.append((at, commit[0]))
success = True
matches += 1
break
@@ -209,7 +212,7 @@
for tracker in commit[2]:
if tracker.find(patch[2]) >= 0:
logger.debug('Matched using recorded message-id')
- found.append(commit)
+ found.append((at, commit[0]))
success = True
matches += 1
break
@@ -218,7 +221,7 @@
if not success:
logger.debug(' Failed to find a match for: %s', patch[0])
- found.append((None, patch[0]))
+ found.append((at, None))
return found
@@ -320,24 +323,27 @@
if not cidmask:
cidmask = 'commit: %s'
slines = list()
- counter = 0
nomatch = 0
padlen = len(str(len(commits)))
- for commit in commits:
- counter += 1
- prefix = '[%s/%s] ' % (str(counter).zfill(padlen), len(commits))
- slines.append('%s%s' % (prefix, commit[1]))
- if commit[0] is None:
+ patches = jsondata['patches']
+ for at, cid in commits:
+ try:
+ prefix = '[%s] ' % patches[at-1][3]
+ except IndexError:
+ prefix = '[%s/%s] ' % (str(at).zfill(padlen), len(commits))
+ slines.append('%s%s' % (prefix, patches[at-1][0]))
+ if cid is None:
slines.append('%s(no commit info)' % (' ' * len(prefix)))
nomatch += 1
else:
- slines.append('%s%s' % (' ' * len(prefix), cidmask % commit[0]))
+ slines.append('%s%s' % (' ' * len(prefix), cidmask % cid))
jsondata['summary'] = '\n'.join(slines)
- if nomatch == counter:
+ if nomatch == len(commits):
logger.critical(' WARNING: None of the patches matched for: %s', jsondata['subject'])
logger.critical(' Please review the resulting message')
elif nomatch > 0:
- logger.critical(' WARNING: Could not match %s of %s patches in: %s', nomatch, counter, jsondata['subject'])
+ logger.critical(' WARNING: Could not match %s of %s patches in: %s',
+ nomatch, len(commits), jsondata['subject'])
logger.critical(' Please review the resulting message')
msg = make_reply(thanks_template, jsondata)
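The ty.py subject change above first strips any existing "Re: " prefix (case-insensitively) so replies never stack prefixes, then re-adds it, inserting "(subset)" when only a cherry-picked subset was applied. Sketched as a standalone helper (name is illustrative):

```python
import re

def reply_subject(subject, cherrypicked=False):
    # Remove one leading "Re: " so we never produce "Re: Re: ..."
    subject = re.sub(r'^Re:\s+', '', subject, flags=re.I)
    if cherrypicked:
        return 'Re: (subset) ' + subject
    return 'Re: ' + subject
```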
++++++ b4.obsinfo ++++++
--- /var/tmp/diff_new_pack.Uff30E/_old 2020-11-29 12:29:31.278068510 +0100
+++ /var/tmp/diff_new_pack.Uff30E/_new 2020-11-29 12:29:31.282068513 +0100
@@ -1,5 +1,5 @@
name: b4
-version: 0.5.2+9
-mtime: 1605646492
-commit: 46b9e092457c2921653704a4f824c9674f5967cb
+version: 0.5.3+0
+mtime: 1606341875
+commit: 08d58022600f213e8f71d192fb672d03593f2c62
[opensuse-commit] commit python-pathvalidate for openSUSE:Factory
by User for buildservice source handling 29 Nov '20
Hello community,
here is the log from the commit of package python-pathvalidate for openSUSE:Factory checked in at 2020-11-29 12:29:21
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/python-pathvalidate (Old)
and /work/SRC/openSUSE:Factory/.python-pathvalidate.new.5913 (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Package is "python-pathvalidate"
Sun Nov 29 12:29:21 2020 rev:5 rq:851317 version:2.3.0
Changes:
--------
--- /work/SRC/openSUSE:Factory/python-pathvalidate/python-pathvalidate.changes 2020-03-29 14:27:09.242142253 +0200
+++ /work/SRC/openSUSE:Factory/.python-pathvalidate.new.5913/python-pathvalidate.changes 2020-11-29 12:29:24.438061591 +0100
@@ -1,0 +2,7 @@
+Thu Nov 26 10:16:44 UTC 2020 - John Vandenberg <jayvdb(a)gmail.com>
+
+- Update to v2.3.0
+ * Change not to process for "."/".." by sanitization functions
+ * Change to normalize with sanitize_filepath in default
+
+-------------------------------------------------------------------
Old:
----
pathvalidate-2.2.2.tar.gz
New:
----
pathvalidate-2.3.0.tar.gz
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Other differences:
------------------
++++++ python-pathvalidate.spec ++++++
--- /var/tmp/diff_new_pack.Ct4sAR/_old 2020-11-29 12:29:25.650062817 +0100
+++ /var/tmp/diff_new_pack.Ct4sAR/_new 2020-11-29 12:29:25.654062821 +0100
@@ -19,7 +19,7 @@
%{?!python_module:%define python_module() python-%{**} python3-%{**}}
%define skip_python2 1
Name: python-pathvalidate
-Version: 2.2.2
+Version: 2.3.0
Release: 0
Summary: Python library to sanitize/validate a string such as filenames
License: MIT
++++++ pathvalidate-2.2.2.tar.gz -> pathvalidate-2.3.0.tar.gz ++++++
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/pathvalidate-2.2.2/PKG-INFO new/pathvalidate-2.3.0/PKG-INFO
--- old/pathvalidate-2.2.2/PKG-INFO 2020-03-28 13:31:57.910138000 +0100
+++ new/pathvalidate-2.3.0/PKG-INFO 2020-05-03 17:39:06.286724300 +0200
@@ -1,6 +1,6 @@
Metadata-Version: 2.1
Name: pathvalidate
-Version: 2.2.2
+Version: 2.3.0
Summary: pathvalidate is a Python library to sanitize/validate a string such as filenames/file-paths/etc.
Home-page: https://github.com/thombashi/pathvalidate
Author: Tsuyoshi Hombashi
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/pathvalidate-2.2.2/pathvalidate/__version__.py new/pathvalidate-2.3.0/pathvalidate/__version__.py
--- old/pathvalidate-2.2.2/pathvalidate/__version__.py 2020-03-28 13:17:03.000000000 +0100
+++ new/pathvalidate-2.3.0/pathvalidate/__version__.py 2020-05-03 17:27:25.000000000 +0200
@@ -1,6 +1,6 @@
__author__ = "Tsuyoshi Hombashi"
__copyright__ = "Copyright 2016, {}".format(__author__)
__license__ = "MIT License"
-__version__ = "2.2.2"
+__version__ = "2.3.0"
__maintainer__ = __author__
__email__ = "tsuyoshi.hombashi(a)gmail.com"
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/pathvalidate-2.2.2/pathvalidate/_base.py new/pathvalidate-2.3.0/pathvalidate/_base.py
--- old/pathvalidate-2.2.2/pathvalidate/_base.py 2020-03-28 12:55:43.000000000 +0100
+++ new/pathvalidate-2.3.0/pathvalidate/_base.py 2020-05-03 16:45:20.000000000 +0200
@@ -24,7 +24,7 @@
@property
def reserved_keywords(self) -> Tuple[str, ...]:
- return (".", "..")
+ return tuple()
@property
def min_len(self) -> int:
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/pathvalidate-2.2.2/pathvalidate/_filename.py new/pathvalidate-2.3.0/pathvalidate/_filename.py
--- old/pathvalidate-2.2.2/pathvalidate/_filename.py 2020-03-28 07:34:36.000000000 +0100
+++ new/pathvalidate-2.3.0/pathvalidate/_filename.py 2020-05-03 16:08:02.000000000 +0200
@@ -181,6 +181,9 @@
platform=Platform.WINDOWS,
)
+ if unicode_filename in (".", ".."):
+ return
+
if unicode_filename[-1] in (" ", "."):
raise InvalidCharError(
self._ERROR_MSG_TEMPLATE.format(
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/pathvalidate-2.2.2/pathvalidate/_filepath.py new/pathvalidate-2.3.0/pathvalidate/_filepath.py
--- old/pathvalidate-2.2.2/pathvalidate/_filepath.py 2020-03-28 09:11:53.000000000 +0100
+++ new/pathvalidate-2.3.0/pathvalidate/_filepath.py 2020-05-03 17:19:50.000000000 +0200
@@ -43,6 +43,7 @@
max_len: Optional[int] = None,
platform: PlatformType = None,
check_reserved: bool = True,
+ normalize: bool = True,
) -> None:
super().__init__(
min_len=min_len, max_len=max_len, check_reserved=check_reserved, platform=platform,
@@ -61,6 +62,7 @@
check_reserved=check_reserved,
platform=self.platform,
)
+ self.__normalize = normalize
if self._is_universal() or self._is_windows():
self.__split_drive = ntpath.splitdrive
@@ -73,9 +75,13 @@
self.__fpath_validator.validate_abspath(value)
- unicode_file_path = preprocess(value)
- drive, unicode_file_path = self.__split_drive(unicode_file_path)
- sanitized_path = self._sanitize_regexp.sub(replacement_text, unicode_file_path)
+ unicode_filepath = preprocess(value)
+
+ if self.__normalize:
+ unicode_filepath = os.path.normpath(unicode_filepath)
+
+ drive, unicode_filepath = self.__split_drive(unicode_filepath)
+ sanitized_path = self._sanitize_regexp.sub(replacement_text, unicode_filepath)
if self._is_windows():
path_separator = "\\"
else:
@@ -158,9 +164,9 @@
if not value:
return
- file_path = os.path.normpath(value)
- unicode_file_path = preprocess(file_path)
- value_len = len(unicode_file_path)
+ filepath = os.path.normpath(value)
+ unicode_filepath = preprocess(filepath)
+ value_len = len(unicode_filepath)
if value_len > self.max_len:
raise InvalidLengthError(
@@ -173,18 +179,18 @@
)
)
- self._validate_reserved_keywords(unicode_file_path)
- unicode_file_path = unicode_file_path.replace("\\", "/")
- for entry in unicode_file_path.split("/"):
+ self._validate_reserved_keywords(unicode_filepath)
+ unicode_filepath = unicode_filepath.replace("\\", "/")
+ for entry in unicode_filepath.split("/"):
if not entry or entry in (".", ".."):
continue
self.__fname_validator._validate_reserved_keywords(entry)
if self._is_universal() or self._is_windows():
- self.__validate_win_file_path(unicode_file_path)
+ self.__validate_win_filepath(unicode_filepath)
else:
- self.__validate_unix_file_path(unicode_file_path)
+ self.__validate_unix_filepath(unicode_filepath)
def validate_abspath(self, value: PathType) -> None:
value = str(value)
@@ -225,26 +231,26 @@
if not self._is_windows() and drive and is_nt_abs:
raise err_object
- def __validate_unix_file_path(self, unicode_file_path: str) -> None:
- match = _RE_INVALID_PATH.findall(unicode_file_path)
+ def __validate_unix_filepath(self, unicode_filepath: str) -> None:
+ match = _RE_INVALID_PATH.findall(unicode_filepath)
if match:
raise InvalidCharError(
self._ERROR_MSG_TEMPLATE.format(
- invalid=findall_to_str(match), value=repr(unicode_file_path)
+ invalid=findall_to_str(match), value=repr(unicode_filepath)
)
)
- def __validate_win_file_path(self, unicode_file_path: str) -> None:
- match = _RE_INVALID_WIN_PATH.findall(unicode_file_path)
+ def __validate_win_filepath(self, unicode_filepath: str) -> None:
+ match = _RE_INVALID_WIN_PATH.findall(unicode_filepath)
if match:
raise InvalidCharError(
self._ERROR_MSG_TEMPLATE.format(
- invalid=findall_to_str(match), value=repr(unicode_file_path)
+ invalid=findall_to_str(match), value=repr(unicode_filepath)
),
platform=Platform.WINDOWS,
)
- _drive, value = self.__split_drive(unicode_file_path)
+ _drive, value = self.__split_drive(unicode_filepath)
if value:
match_reserved = self._RE_NTFS_RESERVED.search(value)
if match_reserved:
@@ -334,6 +340,7 @@
platform: Optional[str] = None,
max_len: Optional[int] = None,
check_reserved: bool = True,
+ normalize: bool = True,
) -> PathType:
"""Make a valid file path from a string.
@@ -363,7 +370,7 @@
max_len:
Maximum length of the ``file_path`` length. Truncate the name if the ``file_path``
length exceedd this value. If the value is |None|,
- automatically determined by the ``platform``:
+ ``max_len`` will automatically determined by the ``platform``:
- ``Linux``: 4096
- ``macOS``: 1024
@@ -371,6 +378,8 @@
- ``universal``: 260
check_reserved:
If |True|, sanitize reserved names of the ``platform``.
+ normalize:
+ If |True|, normalize the the file path.
Returns:
Same type as the argument (str or PathLike object):
@@ -385,7 +394,7 @@
"""
return FilePathSanitizer(
- platform=platform, max_len=max_len, check_reserved=check_reserved
+ platform=platform, max_len=max_len, check_reserved=check_reserved, normalize=normalize
).sanitize(file_path, replacement_text)
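The new `normalize` flag above runs the input through `os.path.normpath` before the invalid-character pass. What that normalization does can be seen directly (inputs borrowed from the new test table; `posixpath` is used here so the results are the same on any host):

```python
import posixpath

# normalize=True (the new default) collapses duplicate separators and
# resolves '.'/'..' segments before sanitization:
examples = {
    'a//b/c.txt': 'a/b/c.txt',
    '../a/./../b/c.txt': '../b/c.txt',
    './': '.',
}
for raw, collapsed in examples.items():
    assert posixpath.normpath(raw) == collapsed
```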
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/pathvalidate-2.2.2/pathvalidate/_symbol.py new/pathvalidate-2.3.0/pathvalidate/_symbol.py
--- old/pathvalidate-2.2.2/pathvalidate/_symbol.py 2020-03-28 07:29:05.000000000 +0100
+++ new/pathvalidate-2.3.0/pathvalidate/_symbol.py 2020-05-03 16:25:54.000000000 +0200
@@ -17,12 +17,14 @@
def validate_unprintable(text: str) -> None:
+ # deprecated
match_list = __RE_UNPRINTABLE.findall(preprocess(text))
if match_list:
raise InvalidCharError("unprintable character found: {}".format(match_list))
def replace_unprintable(text: str, replacement_text: str = "") -> str:
+ # deprecated
try:
return __RE_UNPRINTABLE.sub(replacement_text, preprocess(text))
except (TypeError, AttributeError):
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/pathvalidate-2.2.2/pathvalidate.egg-info/PKG-INFO new/pathvalidate-2.3.0/pathvalidate.egg-info/PKG-INFO
--- old/pathvalidate-2.2.2/pathvalidate.egg-info/PKG-INFO 2020-03-28 13:31:57.000000000 +0100
+++ new/pathvalidate-2.3.0/pathvalidate.egg-info/PKG-INFO 2020-05-03 17:39:06.000000000 +0200
@@ -1,6 +1,6 @@
Metadata-Version: 2.1
Name: pathvalidate
-Version: 2.2.2
+Version: 2.3.0
Summary: pathvalidate is a Python library to sanitize/validate a string such as filenames/file-paths/etc.
Home-page: https://github.com/thombashi/pathvalidate
Author: Tsuyoshi Hombashi
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/pathvalidate-2.2.2/test/_common.py new/pathvalidate-2.3.0/test/_common.py
--- old/pathvalidate-2.2.2/test/_common.py 2020-01-11 06:04:50.000000000 +0100
+++ new/pathvalidate-2.3.0/test/_common.py 2020-05-03 16:47:24.000000000 +0200
@@ -75,8 +75,6 @@
INVALID_PYTHON_VAR_CHARS = INVALID_JS_VAR_CHARS + ("$",)
WIN_RESERVED_FILE_NAMES = [
- ".",
- "..",
"CON",
"con",
"PRN",
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/pathvalidate-2.2.2/test/test_filename.py new/pathvalidate-2.3.0/test/test_filename.py
--- old/pathvalidate-2.2.2/test/test_filename.py 2020-03-28 13:13:02.000000000 +0100
+++ new/pathvalidate-2.3.0/test/test_filename.py 2020-05-03 16:51:10.000000000 +0200
@@ -69,8 +69,6 @@
[
"windows",
(
- ".",
- "..",
"CON",
"PRN",
"AUX",
@@ -96,8 +94,8 @@
"LPT9",
),
],
- ["linux", (".", "..")],
- ["macos", (".", "..", ":")],
+ ["linux", ()],
+ ["macos", (":",)],
],
)
def test_normal_reserved_keywords(self, test_platform, expected):
@@ -266,22 +264,24 @@
for reserved_keyword, platform in product(
WIN_RESERVED_FILE_NAMES, ["windows", "universal"]
)
- if reserved_keyword not in [".", ".."]
]
+ [
- [reserved_keyword, platform, ValidationError]
+ [reserved_keyword, platform, None]
for reserved_keyword, platform in product([".", ".."], ["posix", "linux", "macos"])
]
+ [[":", "posix", ValidationError], [":", "macos", ValidationError],],
)
def test_exception_reserved_name(self, value, platform, expected):
- with pytest.raises(expected) as e:
+ if expected is None:
validate_filename(value, platform=platform)
- assert e.value.reason == ErrorReason.RESERVED_NAME
- assert e.value.reserved_name
- assert e.value.reusable_name is False
+ else:
+ with pytest.raises(expected) as e:
+ validate_filename(value, platform=platform)
+ assert e.value.reason == ErrorReason.RESERVED_NAME
+ assert e.value.reserved_name
+ assert e.value.reusable_name is False
- assert not is_valid_filename(value, platform=platform)
+ assert not is_valid_filename(value, platform=platform)
@pytest.mark.parametrize(
["value", "platform"],
@@ -444,7 +444,10 @@
[reserved.upper(), "windows", reserved.upper() + "_"]
for reserved in WIN_RESERVED_FILE_NAMES
]
- + [[reserved, "linux", reserved + "_"] for reserved in (".", "..")],
+ + [
+ [reserved_keyword, platform, reserved_keyword]
+ for reserved_keyword, platform in product([".", ".."], ["windows", "universal"])
+ ],
)
def test_normal_reserved_name(self, value, test_platform, expected):
filename = sanitize_filename(value, platform=test_platform)
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/pathvalidate-2.2.2/test/test_filepath.py new/pathvalidate-2.3.0/test/test_filepath.py
--- old/pathvalidate-2.2.2/test/test_filepath.py 2020-03-28 13:24:54.000000000 +0100
+++ new/pathvalidate-2.3.0/test/test_filepath.py 2020-05-03 17:33:22.000000000 +0200
@@ -4,6 +4,7 @@
import platform as m_platform
import random
+import sys
from collections import OrderedDict
from itertools import chain, product
from pathlib import Path
@@ -60,12 +61,7 @@
@pytest.mark.parametrize(
["test_platform", "expected"],
- [
- ["windows", (".", "..",),],
- ["posix", (".", "..", "/", ":")],
- ["linux", (".", "..", "/")],
- ["macos", (".", "..", "/", ":")],
- ],
+ [["windows", tuple()], ["posix", ("/", ":")], ["linux", ("/",)], ["macos", ("/", ":")],],
)
def test_normal_reserved_keywords(self, test_platform, expected):
assert FilePathValidator(255, platform=test_platform).reserved_keywords == expected
@@ -241,15 +237,15 @@
["test_platform", "value", "expected"],
[
["linux", "a/b/c.txt", None],
- ["linux", "a/b?/c.txt", None],
+ ["linux", "a//b?/c.txt", None],
["linux", "../a/./../b/c.txt", None],
["windows", "a/b/c.txt", None],
- ["windows", "a/b?/c.txt", ValidationError],
+ ["windows", "a//b?/c.txt", ValidationError],
["windows", "../a/./../b/c.txt", None],
["universal", "a/b/c.txt", None],
["universal", "./a/b/c.txt", None],
["universal", "../a/./../b/c.txt", None],
- ["universal", "a/b?/c.txt", ValidationError],
+ ["universal", "a//b?/c.txt", ValidationError],
],
)
def test_normal_rel_path(self, test_platform, value, expected):
@@ -506,7 +502,6 @@
"abc/{}_/xyz".format(reserved_keyword),
]
for reserved_keyword, platform in product(WIN_RESERVED_FILE_NAMES, ["universal"])
- if reserved_keyword not in [".", ".."]
]
+ [
[
@@ -515,7 +510,6 @@
"/abc/{}/xyz".format(reserved_keyword),
]
for reserved_keyword, platform in product(WIN_RESERVED_FILE_NAMES, ["linux"])
- if reserved_keyword not in [".", ".."]
]
+ [
[
@@ -524,7 +518,6 @@
"abc/{}_.txt".format(reserved_keyword),
]
for reserved_keyword, platform in product(WIN_RESERVED_FILE_NAMES, ["universal"])
- if reserved_keyword not in [".", ".."]
]
+ [
[
@@ -533,7 +526,6 @@
"/abc/{}.txt".format(reserved_keyword),
]
for reserved_keyword, platform in product(WIN_RESERVED_FILE_NAMES, ["linux"])
- if reserved_keyword not in [".", ".."]
]
+ [
[
@@ -542,7 +534,6 @@
"C:\\abc\\{}_.txt".format(reserved_keyword),
]
for reserved_keyword, platform in product(WIN_RESERVED_FILE_NAMES, ["windows"])
- if reserved_keyword not in [".", ".."]
]
+ [
["{}\\{}".format(drive, filename), platform, "{}\\{}_".format(drive, filename)]
@@ -588,6 +579,44 @@
assert is_valid_filepath(sanitized_name)
@pytest.mark.parametrize(
+ ["test_platform", "value", "expected"],
+ [
+ ["linux", "a/b/c.txt", "a/b/c.txt"],
+ ["linux", "a//b?/c.txt", "a/b?/c.txt"],
+ ["linux", "../a/./../b/c.txt", "../b/c.txt"],
+ ["windows", "a/b/c.txt", "a\\b\\c.txt"],
+ ["windows", "a//b?/c.txt", "a\\b\\c.txt"],
+ ["windows", "../a/./../b/c.txt", "..\\b\\c.txt"],
+ ["universal", "a/b/c.txt", "a/b/c.txt"],
+ ["universal", "./", "."],
+ ["universal", "./a/b/c.txt", "a/b/c.txt"],
+ ["universal", "../a/./../b/c.txt", "../b/c.txt"],
+ ["universal", "a//b?/c.txt", "a/b/c.txt"],
+ ],
+ )
+ def test_normal_rel_path(self, test_platform, value, expected):
+ assert sanitize_filepath(value, platform=test_platform) == expected
+
+ @pytest.mark.parametrize(
+ ["test_platform", "value", "expected"],
+ [
+ ["linux", "a/b/c.txt", "a/b/c.txt"],
+ ["linux", "a//b?/c.txt", "a/b?/c.txt"],
+ ["linux", "../a/./../b/c.txt", "../a/./../b/c.txt"],
+ ["windows", "a/b/c.txt", "a\\b\\c.txt"],
+ ["windows", "a//b?/c.txt", "a\\b\\c.txt"],
+ ["windows", "../a/./../b/c.txt", "..\\a\\.\\..\\b\\c.txt"],
+ ["universal", "a/b/c.txt", "a/b/c.txt"],
+ ["universal", "./", "."],
+ ["universal", "./a/b/c.txt", "./a/b/c.txt"],
+ ["universal", "../a/./../b/c.txt", "../a/./../b/c.txt"],
+ ["universal", "a//b?/c.txt", "a/b/c.txt"],
+ ],
+ )
+ def test_normal_not_normalize(self, test_platform, value, expected):
+ assert sanitize_filepath(value, platform=test_platform, normalize=False) == expected
+
+ @pytest.mark.parametrize(
["value", "expected"], [["", ""], [None, ""],],
)
def test_normal_null_values(self, value, expected):
@@ -653,6 +682,7 @@
with pytest.raises(expected):
sanitize_filepath(value, platform="auto")
+ @pytest.mark.skipif(sys.version_info < (3, 6), reason="requires python3.6 or higher")
@pytest.mark.parametrize(
["value", "expected"],
[[1, TypeError], [True, TypeError], [nan, TypeError], [inf, TypeError]],
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/pathvalidate-2.2.2/tox.ini new/pathvalidate-2.3.0/tox.ini
--- old/pathvalidate-2.2.2/tox.ini 2020-03-20 11:58:55.000000000 +0100
+++ new/pathvalidate-2.3.0/tox.ini 2020-04-27 13:07:15.000000000 +0200
@@ -24,11 +24,10 @@
wheel
commands =
python setup.py sdist bdist_wheel
- twine check dist/*
+ twine check dist/*.whl dist/*.tar.gz
python setup.py clean --all
[testenv:clean]
-basepython = python3.8
deps =
cleanpy
commands =
@@ -70,12 +69,11 @@
commands =
python setup.py check
mypy pathvalidate --show-error-context --show-error-codes --python-version 3.5
- pytype --keep-going --jobs 2 --disable import-error pathvalidate
- codespell pathvalidate docs examples test -q2 --check-filenames
+ pytype --keep-going --jobs 4 --disable import-error pathvalidate
+ codespell pathvalidate docs/pages examples test -q2 --check-filenames
pylama
[testenv:readme]
-basepython = python3.8
changedir = docs
deps =
readmemaker>=1.0.0
@@ -83,7 +81,6 @@
python make_readme.py
[testenv:release]
-basepython = python3.8
deps =
releasecmd>=0.3.1,<1
commands =
[opensuse-commit] commit picom for openSUSE:Factory
by User for buildservice source handling 29 Nov '20
Hello community,
here is the log from the commit of package picom for openSUSE:Factory checked in at 2020-11-29 12:29:17
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/picom (Old)
and /work/SRC/openSUSE:Factory/.picom.new.5913 (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Package is "picom"
Sun Nov 29 12:29:17 2020 rev:3 rq:851310 version:8.2
Changes:
--------
--- /work/SRC/openSUSE:Factory/picom/picom.changes 2020-06-15 20:29:54.910250567 +0200
+++ /work/SRC/openSUSE:Factory/.picom.new.5913/picom.changes 2020-11-29 12:29:21.286058404 +0100
@@ -1,0 +2,9 @@
+Thu Nov 26 12:34:36 UTC 2020 - Dirk Mueller <dmueller(a)suse.com>
+
+- update to 8.2:
+ * Fixes assertion failures related to WIN_FLAGS_SHADOW_STALE, see #479
+ * write-pid-path in configuration file now accepted, see #492
+ * Pid files are now deleted during shutdown, see #492
+ * Build fixes for certain platforms, see #501, #502
+
+-------------------------------------------------------------------
Old:
----
v8.tar.gz
New:
----
v8.2.tar.gz
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Other differences:
------------------
++++++ picom.spec ++++++
--- /var/tmp/diff_new_pack.6XgtRs/_old 2020-11-29 12:29:22.026059151 +0100
+++ /var/tmp/diff_new_pack.6XgtRs/_new 2020-11-29 12:29:22.030059156 +0100
@@ -17,13 +17,13 @@
Name: picom
-Version: 8
+Version: 8.2
Release: 0
Summary: Stand-alone compositor for X11
License: MPL-2.0 AND MIT
Group: System/X11/Utilities
URL: https://github.com/yshui/picom
-Source0: https://github.com/yshui/picom/archive/v8.tar.gz
+Source0: https://github.com/yshui/picom/archive/v%{version}.tar.gz
Source1: picom.desktop
BuildRequires: asciidoc
BuildRequires: c_compiler
++++++ v8.tar.gz -> v8.2.tar.gz ++++++
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/picom-8/bin/picom-trans new/picom-8.2/bin/picom-trans
--- old/picom-8/bin/picom-trans 2020-04-21 20:33:17.000000000 +0200
+++ new/picom-8.2/bin/picom-trans 2020-10-24 10:44:12.000000000 +0200
@@ -160,7 +160,7 @@
fi
# Find the line number of the target window in the window tree
- lineno=$(echo -n "$treeout" | grep -nw "$wid" | head -n1 | cut -d ':' -f 1)
+ lineno=$(echo -n "$treeout" | grep -nw "^\s*$wid" | head -n1 | cut -d ':' -f 1)
if test -z "$lineno"; then
echo 'Failed to find window in window tree.'
@@ -169,7 +169,7 @@
# Find the highest ancestor of the target window below
topmost=$(echo -n "$treeout" \
- | head -n $((lineno + 1)) \
+ | head -n $lineno \
| sed -n 's/^ \(0x[[:xdigit:]]*\).*/\1/p' \
| tail -n 1)
fi
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/picom-8/man/picom.1.asciidoc new/picom-8.2/man/picom.1.asciidoc
--- old/picom-8/man/picom.1.asciidoc 2020-04-21 20:33:17.000000000 +0200
+++ new/picom-8.2/man/picom.1.asciidoc 2020-10-24 10:44:12.000000000 +0200
@@ -86,7 +86,7 @@
Look for configuration file at the path. See *CONFIGURATION FILES* section below for where picom looks for a configuration file by default. Use `/dev/null` to avoid loading configuration file.
*--write-pid-path* 'PATH'::
- Write process ID to a file.
+ Write process ID to a file. it is recommended to use an absolute path.
*--shadow-red* 'VALUE'::
Red color value of shadow (0.0 - 1.0, defaults to 0).
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/picom-8/meson.build new/picom-8.2/meson.build
--- old/picom-8/meson.build 2020-04-21 20:33:17.000000000 +0200
+++ new/picom-8.2/meson.build 2020-10-24 10:44:12.000000000 +0200
@@ -1,4 +1,4 @@
-project('picom', 'c', version: '8',
+project('picom', 'c', version: '8.2',
default_options: ['c_std=c11'])
cc = meson.get_compiler('c')
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/picom-8/src/backend/gl/gl_common.c new/picom-8.2/src/backend/gl/gl_common.c
--- old/picom-8/src/backend/gl/gl_common.c 2020-04-21 20:33:17.000000000 +0200
+++ new/picom-8.2/src/backend/gl/gl_common.c 2020-10-24 10:44:12.000000000 +0200
@@ -485,7 +485,7 @@
// ri, rx, ry, rxe, rye, rdx, rdy, rdxe, rdye);
memcpy(&coord[i * 16],
- (GLint[][2]){
+ ((GLint[][2]){
{vx1, vy1},
{texture_x1, texture_y1},
{vx2, vy1},
@@ -494,11 +494,12 @@
{texture_x2, texture_y2},
{vx1, vy2},
{texture_x1, texture_y2},
- },
+ }),
sizeof(GLint[2]) * 8);
GLuint u = (GLuint)(i * 4);
- memcpy(&indices[i * 6], (GLuint[]){u + 0, u + 1, u + 2, u + 2, u + 3, u + 0},
+ memcpy(&indices[i * 6],
+ ((GLuint[]){u + 0, u + 1, u + 2, u + 2, u + 3, u + 0}),
sizeof(GLuint) * 6);
}
}
@@ -885,10 +886,13 @@
for (int i = 0; i < nrects; i++) {
GLint y1 = y_inverted ? height - rect[i].y2 : rect[i].y1,
y2 = y_inverted ? height - rect[i].y1 : rect[i].y2;
+ // clang-format off
memcpy(&coord[i * 8],
- (GLint[][2]){
- {rect[i].x1, y1}, {rect[i].x2, y1}, {rect[i].x2, y2}, {rect[i].x1, y2}},
+ ((GLint[][2]){
+ {rect[i].x1, y1}, {rect[i].x2, y1},
+ {rect[i].x2, y2}, {rect[i].x1, y2}}),
sizeof(GLint[2]) * 4);
+ // clang-format on
indices[i * 6 + 0] = (GLuint)i * 4 + 0;
indices[i * 6 + 1] = (GLuint)i * 4 + 1;
indices[i * 6 + 2] = (GLuint)i * 4 + 2;
@@ -1383,15 +1387,16 @@
for (int i = 0; i < nrects; i++) {
// clang-format off
memcpy(&coord[i * 8],
- (GLint[]){rect[i].x1, gd->height - rect[i].y2,
+ ((GLint[]){rect[i].x1, gd->height - rect[i].y2,
rect[i].x2, gd->height - rect[i].y2,
rect[i].x2, gd->height - rect[i].y1,
- rect[i].x1, gd->height - rect[i].y1},
+ rect[i].x1, gd->height - rect[i].y1}),
sizeof(GLint) * 8);
// clang-format on
GLuint u = (GLuint)(i * 4);
- memcpy(&indices[i * 6], (GLuint[]){u + 0, u + 1, u + 2, u + 2, u + 3, u + 0},
+ memcpy(&indices[i * 6],
+ ((GLuint[]){u + 0, u + 1, u + 2, u + 2, u + 3, u + 0}),
sizeof(GLuint) * 6);
}
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/picom-8/src/backend/xrender/xrender.c new/picom-8.2/src/backend/xrender/xrender.c
--- old/picom-8/src/backend/xrender/xrender.c 2020-04-21 20:33:17.000000000 +0200
+++ new/picom-8.2/src/backend/xrender/xrender.c 2020-10-24 10:44:12.000000000 +0200
@@ -166,6 +166,7 @@
if (!tmp_picture[0] || !tmp_picture[1]) {
log_error("Failed to build intermediate Picture.");
pixman_region32_fini(&reg_op);
+ pixman_region32_fini(&reg_op_resized);
return false;
}
@@ -242,6 +243,7 @@
xcb_render_free_picture(c, tmp_picture[0]);
xcb_render_free_picture(c, tmp_picture[1]);
pixman_region32_fini(&reg_op);
+ pixman_region32_fini(&reg_op_resized);
return true;
}
@@ -266,6 +268,8 @@
x_create_picture_with_visual_and_pixmap(base->c, fmt.visual, pixmap, 0, NULL);
img->owned = owned;
img->visual = fmt.visual;
+ free(r);
+
if (img->pict == XCB_NONE) {
free(img);
return NULL;
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/picom-8/src/c2.c new/picom-8.2/src/c2.c
--- old/picom-8/src/c2.c 2020-04-21 20:33:17.000000000 +0200
+++ new/picom-8.2/src/c2.c 2020-10-24 10:44:12.000000000 +0200
@@ -267,9 +267,10 @@
static inline c2_ptr_t c2h_comb_tree(c2_b_op_t op, c2_ptr_t p1, c2_ptr_t p2) {
c2_ptr_t p = {.isbranch = true, .b = cmalloc(c2_b_t)};
+ p.b->neg = false;
+ p.b->op = op;
p.b->opr1 = p1;
p.b->opr2 = p2;
- p.b->op = op;
return p;
}
@@ -369,6 +370,7 @@
#ifdef DEBUG_C2
log_trace("(\"%s\"): ", pattern);
c2_dump(plptr->ptr);
+ putchar('\n');
#endif
return plptr;
@@ -1197,7 +1199,7 @@
}
c2_dump(pbranch->opr2);
- printf(")");
+ printf(") ");
}
// For a leaf
else {
@@ -1507,8 +1509,9 @@
}
#ifdef DEBUG_WINMATCH
- log_trace("(%#010lx): branch: result = %d, pattern = ", w->id, result);
+ log_trace("(%#010x): branch: result = %d, pattern = ", w->base.id, result);
c2_dump(cond);
+ putchar('\n');
#endif
}
// Handle a leaf
@@ -1527,10 +1530,11 @@
}
#ifdef DEBUG_WINMATCH
- log_trace("(%#010lx): leaf: result = %d, error = %d, "
- "client = %#010lx, pattern = ",
- w->id, result, error, w->client_win);
+ log_trace("(%#010x): leaf: result = %d, error = %d, "
+ "client = %#010x, pattern = ",
+ w->base.id, result, error, w->client_win);
c2_dump(cond);
+ putchar('\n');
#endif
}
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/picom-8/src/common.h new/picom-8.2/src/common.h
--- old/picom-8/src/common.h 2020-04-21 20:33:17.000000000 +0200
+++ new/picom-8.2/src/common.h 2020-10-24 10:44:12.000000000 +0200
@@ -230,6 +230,10 @@
bool tmout_unredir_hit;
/// Whether we need to redraw the screen
bool redraw_needed;
+
+ /// Cache a xfixes region so we don't need to allocate it everytime.
+ /// A workaround for yshui/picom#301
+ xcb_xfixes_region_t damaged_region;
/// The region needs to painted on next paint.
region_t *damage;
/// The region damaged on the last paint.
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/picom-8/src/compiler.h new/picom-8.2/src/compiler.h
--- old/picom-8/src/compiler.h 2020-04-21 20:33:17.000000000 +0200
+++ new/picom-8.2/src/compiler.h 2020-10-24 10:44:12.000000000 +0200
@@ -111,6 +111,6 @@
typedef unsigned long ulong;
typedef unsigned int uint;
-static inline int popcount(uint x) {
- return __builtin_popcount(x);
+static inline int attr_const popcntul(unsigned long a) {
+ return __builtin_popcountl(a);
}
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/picom-8/src/config_libconfig.c new/picom-8.2/src/config_libconfig.c
--- old/picom-8/src/config_libconfig.c 2020-04-21 20:33:17.000000000 +0200
+++ new/picom-8.2/src/config_libconfig.c 2020-10-24 10:44:12.000000000 +0200
@@ -634,6 +634,15 @@
config_setting_lookup_float(blur_cfg, "deviation", &opt->blur_deviation);
}
+ // --write-pid-path
+ if (config_lookup_string(&cfg, "write-pid-path", &sval)) {
+ if (*sval != '/') {
+ log_warn("The write-pid-path in your configuration file is not"
+ " an absolute path");
+ }
+ opt->write_pid_path = strdup(sval);
+ }
+
// Wintype settings
// XXX ! Refactor all the wintype_* arrays into a struct
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/picom-8/src/event.c new/picom-8.2/src/event.c
--- old/picom-8/src/event.c 2020-04-21 20:33:17.000000000 +0200
+++ new/picom-8.2/src/event.c 2020-10-24 10:44:12.000000000 +0200
@@ -410,6 +410,7 @@
region_t region;
pixman_region32_init_rects(&region, rects, nrects);
add_damage(ps, &region);
+ pixman_region32_fini(&region);
}
static inline void ev_expose(session_t *ps, xcb_expose_event_t *ev) {
@@ -590,11 +591,9 @@
set_ignore_cookie(
ps, xcb_damage_subtract(ps->c, w->damage, XCB_NONE, XCB_NONE));
} else {
- xcb_xfixes_region_t tmp = x_new_id(ps->c);
- xcb_xfixes_create_region(ps->c, tmp, 0, NULL);
- set_ignore_cookie(ps, xcb_damage_subtract(ps->c, w->damage, XCB_NONE, tmp));
- x_fetch_region(ps->c, tmp, &parts);
- xcb_xfixes_destroy_region(ps->c, tmp);
+ set_ignore_cookie(
+ ps, xcb_damage_subtract(ps->c, w->damage, XCB_NONE, ps->damaged_region));
+ x_fetch_region(ps->c, ps->damaged_region, &parts);
pixman_region32_translate(&parts, w->g.x + w->g.border_width,
w->g.y + w->g.border_width);
}
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/picom-8/src/options.c new/picom-8.2/src/options.c
--- old/picom-8/src/options.c 2020-04-21 20:33:17.000000000 +0200
+++ new/picom-8.2/src/options.c 2020-10-24 10:44:12.000000000 +0200
@@ -782,7 +782,11 @@
P_CASELONG(309, unredir_if_possible_delay);
case 310:
// --write-pid-path
+ free(opt->write_pid_path);
opt->write_pid_path = strdup(optarg);
+ if (*opt->write_pid_path != '/') {
+ log_warn("--write-pid-path is not an absolute path");
+ }
break;
P_CASEBOOL(311, vsync_use_glfinish);
case 312:
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/picom-8/src/picom.c new/picom-8.2/src/picom.c
--- old/picom-8/src/picom.c 2020-04-21 20:33:17.000000000 +0200
+++ new/picom-8.2/src/picom.c 2020-10-24 10:44:12.000000000 +0200
@@ -1310,7 +1310,13 @@
static void refresh_windows(session_t *ps) {
win_stack_foreach_managed(w, &ps->window_stack) {
- win_process_flags(ps, w);
+ win_process_update_flags(ps, w);
+ }
+}
+
+static void refresh_images(session_t *ps) {
+ win_stack_foreach_managed(w, &ps->window_stack) {
+ win_process_image_flags(ps, w);
}
}
@@ -1351,7 +1357,7 @@
// stale.
handle_root_flags(ps);
- // Process window flags
+ // Process window flags (window mapping)
refresh_windows(ps);
{
@@ -1363,6 +1369,9 @@
free(r);
}
+ // Process window flags (stale images)
+ refresh_images(ps);
+
e = xcb_request_check(ps->c, xcb_ungrab_server_checked(ps->c));
if (e) {
log_fatal_x_error(e, "failed to ungrab x server");
@@ -1741,6 +1750,12 @@
XCB_XFIXES_MINOR_VERSION)
.sequence);
+ ps->damaged_region = x_new_id(ps->c);
+ if (!XCB_AWAIT_VOID(xcb_xfixes_create_region, ps->c, ps->damaged_region, 0, NULL)) {
+ log_fatal("Failed to create a XFixes region");
+ goto err;
+ }
+
ext_info = xcb_get_extension_data(ps->c, &xcb_glx_id);
if (ext_info && ext_info->present) {
ps->glx_exists = true;
@@ -2270,6 +2285,11 @@
ps->debug_window = XCB_NONE;
}
+ if (ps->damaged_region != XCB_NONE) {
+ xcb_xfixes_destroy_region(ps->c, ps->damaged_region);
+ ps->damaged_region = XCB_NONE;
+ }
+
if (ps->o.experimental_backends) {
// backend is deinitialized in unredirect()
assert(ps->backend_data == NULL);
@@ -2383,6 +2403,7 @@
// Main loop
bool quit = false;
int ret_code = 0;
+ char *pid_file = NULL;
do {
Display *dpy = XOpenDisplay(NULL);
@@ -2425,6 +2446,9 @@
}
session_run(ps_g);
quit = ps_g->quit;
+ if (quit && ps_g->o.write_pid_path) {
+ pid_file = strdup(ps_g->o.write_pid_path);
+ }
session_destroy(ps_g);
free(ps_g);
ps_g = NULL;
@@ -2434,6 +2458,10 @@
} while (!quit);
free(config_file);
+ if (pid_file) {
+ log_trace("remove pid file %s", pid_file);
+ unlink(pid_file);
+ }
log_deinit_tls();
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/picom-8/src/utils.h new/picom-8.2/src/utils.h
--- old/picom-8/src/utils.h 2020-04-21 20:33:17.000000000 +0200
+++ new/picom-8.2/src/utils.h 2020-10-24 10:44:12.000000000 +0200
@@ -135,10 +135,6 @@
/// clamp `val` into interval [min, max]
#define clamp(val, min, max) max2(min2(val, max), min)
-static inline int attr_const popcountl(unsigned long a) {
- return __builtin_popcountl(a);
-}
-
/**
* Normalize a double value to a specific range.
*
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/picom-8/src/win.c new/picom-8.2/src/win.c
--- old/picom-8/src/win.c 2020-04-21 20:33:17.000000000 +0200
+++ new/picom-8.2/src/win.c 2020-10-24 10:44:12.000000000 +0200
@@ -320,12 +320,21 @@
}
}
-void win_process_flags(session_t *ps, struct managed_win *w) {
+void win_process_update_flags(session_t *ps, struct managed_win *w) {
if (win_check_flags_all(w, WIN_FLAGS_MAPPED)) {
map_win_start(ps, w);
win_clear_flags(w, WIN_FLAGS_MAPPED);
}
+ if (win_check_flags_all(w, WIN_FLAGS_CLIENT_STALE)) {
+ win_recheck_client(ps, w);
+ win_clear_flags(w, WIN_FLAGS_CLIENT_STALE);
+ }
+}
+
+void win_process_image_flags(session_t *ps, struct managed_win *w) {
+ assert(!win_check_flags_all(w, WIN_FLAGS_MAPPED));
+
// Not a loop
while (win_check_flags_any(w, WIN_FLAGS_IMAGES_STALE) &&
!win_check_flags_all(w, WIN_FLAGS_IMAGE_ERROR)) {
@@ -370,11 +379,6 @@
if (win_check_flags_any(w, WIN_FLAGS_IMAGES_STALE)) {
win_clear_flags(w, WIN_FLAGS_IMAGES_STALE);
}
-
- if (win_check_flags_all(w, WIN_FLAGS_CLIENT_STALE)) {
- win_recheck_client(ps, w);
- win_clear_flags(w, WIN_FLAGS_CLIENT_STALE);
- }
}
/**
@@ -1886,7 +1890,7 @@
// Clear PIXMAP_STALE flag, since the window is destroyed there is no
// pixmap available so STALE doesn't make sense.
// Do this before changing the window state to destroying
- win_clear_flags(mw, WIN_FLAGS_PIXMAP_STALE);
+ win_clear_flags(mw, WIN_FLAGS_IMAGES_STALE);
// Update state flags of a managed window
mw->state = WSTATE_DESTROYING;
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/picom-8/src/win.h new/picom-8.2/src/win.h
--- old/picom-8/src/win.h 2020-04-21 20:33:17.000000000 +0200
+++ new/picom-8.2/src/win.h 2020-10-24 10:44:12.000000000 +0200
@@ -254,8 +254,10 @@
#endif
};
-/// Process pending images flags on a window. Has to be called in X critical section
-void win_process_flags(session_t *ps, struct managed_win *w);
+/// Process pending updates/images flags on a window. Has to be called in X critical
+/// section
+void win_process_update_flags(session_t *ps, struct managed_win *w);
+void win_process_image_flags(session_t *ps, struct managed_win *w);
/// Bind a shadow to the window, with color `c` and shadow kernel `kernel`
bool win_bind_shadow(struct backend_base *b, struct managed_win *w, struct color c,
struct conv *kernel);
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/picom-8/src/x.c new/picom-8.2/src/x.c
--- old/picom-8/src/x.c 2020-04-21 20:33:17.000000000 +0200
+++ new/picom-8.2/src/x.c 2020-10-24 10:44:12.000000000 +0200
@@ -612,10 +612,10 @@
return (struct xvisual_info){-1, -1, -1, -1, -1, 0};
}
- int red_size = popcountl(pictfmt->direct.red_mask),
- blue_size = popcountl(pictfmt->direct.blue_mask),
- green_size = popcountl(pictfmt->direct.green_mask),
- alpha_size = popcountl(pictfmt->direct.alpha_mask);
+ int red_size = popcntul(pictfmt->direct.red_mask),
+ blue_size = popcntul(pictfmt->direct.blue_mask),
+ green_size = popcntul(pictfmt->direct.green_mask),
+ alpha_size = popcntul(pictfmt->direct.alpha_mask);
return (struct xvisual_info){
.red_size = red_size,
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/picom-8/tests/configs/issue394.conf new/picom-8.2/tests/configs/issue394.conf
--- old/picom-8/tests/configs/issue394.conf 1970-01-01 01:00:00.000000000 +0100
+++ new/picom-8.2/tests/configs/issue394.conf 2020-10-24 10:44:12.000000000 +0200
@@ -0,0 +1,4 @@
+fading = true;
+fade-in-step = 1;
+fade-out-step = 0.01;
+shadow = true;
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/picom-8/tests/run_tests.sh new/picom-8.2/tests/run_tests.sh
--- old/picom-8/tests/run_tests.sh 2020-04-21 20:33:17.000000000 +0200
+++ new/picom-8.2/tests/run_tests.sh 2020-10-24 10:44:12.000000000 +0200
@@ -15,3 +15,4 @@
./run_one_test.sh $exe configs/issue314.conf testcases/issue314_2.py
./run_one_test.sh $exe configs/issue314.conf testcases/issue314_3.py
./run_one_test.sh $exe /dev/null testcases/issue299.py
+./run_one_test.sh $exe configs/issue394.conf testcases/issue394.py
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/picom-8/tests/testcases/common.py new/picom-8.2/tests/testcases/common.py
--- old/picom-8/tests/testcases/common.py 2020-04-21 20:33:17.000000000 +0200
+++ new/picom-8.2/tests/testcases/common.py 2020-10-24 10:44:12.000000000 +0200
@@ -28,6 +28,11 @@
str_type = to_atom(conn, "STRING")
conn.core.ChangePropertyChecked(xproto.PropMode.Replace, wid, prop_name, str_type, 8, len(name), name).check()
+def set_window_size(conn, wid, width, height):
+ value_mask = xproto.ConfigWindow.Width | xproto.ConfigWindow.Height
+ value_list = [width, height]
+ conn.core.ConfigureWindowChecked(wid, value_mask, value_list).check()
+
def find_picom_window(conn):
prop_name = to_atom(conn, "WM_NAME")
setup = conn.get_setup()
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/picom-8/tests/testcases/issue394.py new/picom-8.2/tests/testcases/issue394.py
--- old/picom-8/tests/testcases/issue394.py 1970-01-01 01:00:00.000000000 +0100
+++ new/picom-8.2/tests/testcases/issue394.py 2020-10-24 10:44:12.000000000 +0200
@@ -0,0 +1,35 @@
+#!/usr/bin/env python
+
+import xcffib.xproto as xproto
+import xcffib
+import time
+from common import set_window_name, set_window_size
+
+conn = xcffib.connect()
+setup = conn.get_setup()
+root = setup.roots[0].root
+visual = setup.roots[0].root_visual
+depth = setup.roots[0].root_depth
+
+# issue 394 is caused by a window getting a size update just before destroying leading to a shadow update on destroyed window.
+wid = conn.generate_id()
+print("Window id is ", hex(wid))
+
+# Create a window
+conn.core.CreateWindowChecked(depth, wid, root, 0, 0, 100, 100, 0, xproto.WindowClass.InputOutput, visual, 0, []).check()
+
+# Set Window name so it doesn't get a shadow
+set_window_name(conn, wid, "Test Window")
+
+# Map the window
+print("mapping")
+conn.core.MapWindowChecked(wid).check()
+
+time.sleep(0.5)
+
+# Resize the window and destroy
+print("resize and destroy")
+set_window_size(conn, wid, 150, 150)
+conn.core.DestroyWindowChecked(wid).check()
+
+time.sleep(0.5)
[opensuse-commit] commit imapfilter for openSUSE:Factory
by User for buildservice source handling 29 Nov '20
Hello community,
here is the log from the commit of package imapfilter for openSUSE:Factory checked in at 2020-11-29 12:29:12
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/imapfilter (Old)
and /work/SRC/openSUSE:Factory/.imapfilter.new.5913 (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Package is "imapfilter"
Sun Nov 29 12:29:12 2020 rev:46 rq:851277 version:2.7.4
Changes:
--------
--- /work/SRC/openSUSE:Factory/imapfilter/imapfilter.changes 2020-11-17 21:25:17.297377978 +0100
+++ /work/SRC/openSUSE:Factory/.imapfilter.new.5913/imapfilter.changes 2020-11-29 12:29:20.558057666 +0100
@@ -1,0 +2,7 @@
+Thu Nov 26 18:03:20 UTC 2020 - Arun Persaud <arun(a)gmx.de>
+
+- update to version 2.7.4:
+ * Bug fix; incorrect argument to regular expression compile
+ function.
+
+-------------------------------------------------------------------
Old:
----
imapfilter-2.7.3.tar.gz
New:
----
imapfilter-2.7.4.tar.gz
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Other differences:
------------------
++++++ imapfilter.spec ++++++
--- /var/tmp/diff_new_pack.7jWDGD/_old 2020-11-29 12:29:21.054058169 +0100
+++ /var/tmp/diff_new_pack.7jWDGD/_new 2020-11-29 12:29:21.058058173 +0100
@@ -17,7 +17,7 @@
Name: imapfilter
-Version: 2.7.3
+Version: 2.7.4
Release: 0
Summary: A mail filtering utility
License: MIT
++++++ imapfilter-2.7.3.tar.gz -> imapfilter-2.7.4.tar.gz ++++++
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/imapfilter-2.7.3/NEWS new/imapfilter-2.7.4/NEWS
--- old/imapfilter-2.7.3/NEWS 2020-11-14 19:48:19.000000000 +0100
+++ new/imapfilter-2.7.4/NEWS 2020-11-18 22:30:58.000000000 +0100
@@ -1,3 +1,6 @@
+IMAPFilter 2.7.4 - 18 Nov 2020
+ - Bug fix; incorrect argument to regular expression compile function.
+
IMAPFilter 2.7.3 - 14 Nov 2020
- Bug fix; incorrect free of compiled pattern.
- Unexpected network errors and IMAP BYE are now logged.
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/imapfilter-2.7.3/src/regex.lua new/imapfilter-2.7.4/src/regex.lua
--- old/imapfilter-2.7.3/src/regex.lua 2020-11-14 19:48:19.000000000 +0100
+++ new/imapfilter-2.7.4/src/regex.lua 2020-11-18 22:30:58.000000000 +0100
@@ -7,7 +7,7 @@
_regex_cache.mt.__index = function (self, key)
- local r, compiled = ifre.compile(pattern)
+ local r, compiled = ifre.compile(key)
if not r then return end
self[key] = compiled
return compiled
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/imapfilter-2.7.3/src/version.h new/imapfilter-2.7.4/src/version.h
--- old/imapfilter-2.7.3/src/version.h 2020-11-14 19:48:19.000000000 +0100
+++ new/imapfilter-2.7.4/src/version.h 2020-11-18 22:30:58.000000000 +0100
@@ -3,7 +3,7 @@
/* Program's version number. */
-#define VERSION "2.7.3"
+#define VERSION "2.7.4"
/* Program's copyright. */
#define COPYRIGHT "Copyright (c) 2001-2020 Eleftherios Chatzimparmpas"
[opensuse-commit] commit python36 for openSUSE:Factory
by User for buildservice source handling 29 Nov '20
Hello community,
here is the log from the commit of package python36 for openSUSE:Factory checked in at 2020-11-29 12:29:06
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/python36 (Old)
and /work/SRC/openSUSE:Factory/.python36.new.5913 (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Package is "python36"
Sun Nov 29 12:29:06 2020 rev:3 rq:851367 version:3.6.12
Changes:
--------
--- /work/SRC/openSUSE:Factory/python36/python36.changes 2020-08-25 09:35:03.656081382 +0200
+++ /work/SRC/openSUSE:Factory/.python36.new.5913/python36.changes 2020-11-29 12:29:16.834053900 +0100
@@ -1,0 +2,58 @@
+Fri Nov 27 15:59:09 UTC 2020 - Markéta Machová <mmachova(a)suse.com>
+
+- Pin Sphinx version to fix doc subpackage
+
+-------------------------------------------------------------------
+Wed Nov 25 17:16:15 UTC 2020 - Matej Cepl <mcepl(a)suse.com>
+
+- Change setuptools and pip version numbers according to new wheels
+- Add ignore_pip_deprec_warn.patch to switch of persistently
+ failing test.
+
+-------------------------------------------------------------------
+Tue Nov 24 17:38:21 UTC 2020 - Matej Cepl <mcepl(a)suse.com>
+
+- Replace bundled wheels for pip and setuptools with the updated ones
+ (bsc#1176262 CVE-2019-20916).
+
+-------------------------------------------------------------------
+Tue Oct 13 09:23:03 UTC 2020 - Marketa Calabkova <mcalabkova(a)suse.com>
+
+- Handful of changes to make python36 compatible with SLE15 and SLE12
+ (jsc#ECO-2799, jsc#SLE-13738)
+- Rebase bpo23395-PyErr_SetInterrupt-signal.patch
+
+-------------------------------------------------------------------
+Fri Oct 9 16:05:50 UTC 2020 - Dominique Leuenberger <dimstar(a)opensuse.org>
+
+- Fix build with RPM 4.16: error: bare words are no longer
+ supported, please use "...": x86 == ppc.
+
+-------------------------------------------------------------------
+Fri Oct 9 08:06:30 UTC 2020 - Matej Cepl <mcepl(a)suse.com>
+
+- Fix installing .desktop file
+
+-------------------------------------------------------------------
+Fri Sep 25 06:58:03 UTC 2020 - Dominique Leuenberger <dimstar(a)opensuse.org>
+
+- Buildrequire timezone only for general flavor. It's used in this
+ flavor for the test suite.
+
+-------------------------------------------------------------------
+Wed Sep 2 20:31:39 UTC 2020 - Matej Cepl <mcepl(a)suse.com>
+
+- Add faulthandler_stack_overflow_on_GCC10.patch to make build
+ working even with GCC10 (bpo#38965).
+
+-------------------------------------------------------------------
+Tue Sep 1 10:22:30 UTC 2020 - Matej Cepl <mcepl(a)suse.com>
+
+- Just cleanup and reordering items to synchronize with python38
+
+-------------------------------------------------------------------
+Mon Aug 31 11:12:31 UTC 2020 - Tomáš Chvátal <tchvatal(a)suse.com>
+
+- Format with spec-cleaner
+
+-------------------------------------------------------------------
@@ -50,0 +109 @@
+- This release also fixes CVE-2020-26116 (bsc#1177211) and CVE-2019-20907 (bsc#1174091).
@@ -84 +143 @@
- * Include more security fixes
+ * Include more security fixes (CVE-2019-18348, bsc#1155094)
New:
----
faulthandler_stack_overflow_on_GCC10.patch
ignore_pip_deprec_warn.patch
pip-20.2.3-py2.py3-none-any.whl
setuptools-44.1.1-py2.py3-none-any.whl
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Other differences:
------------------
++++++ python36.spec ++++++
--- /var/tmp/diff_new_pack.ttr46C/_old 2020-11-29 12:29:18.090055171 +0100
+++ /var/tmp/diff_new_pack.ttr46C/_new 2020-11-29 12:29:18.094055175 +0100
@@ -66,13 +66,13 @@
%define so_minor 0
%define so_version %{python_version_soname}%{abi_kind}%{so_major}_%{so_minor}
# rpm and python have different ideas about what is an arch-dependent name, so:
-%if %{__isa_name} == ppc
+%if "%{__isa_name}" == "ppc"
%define archname %(echo %{_arch} | sed s/ppc/powerpc/)
%else
%define archname %{_arch}
%endif
# our arm has Hardware-Floatingpoint
-%if %{_arch} == arm
+%if "%{_arch}" == "arm"
%define armsuffix hf
%endif
# pyexpat.cpython-35m-x86_64-linux-gnu
@@ -101,14 +101,18 @@
Source9: import_failed.map
Source10: pre_checkin.sh
Source11: skipped_tests.py
-Source19: idle3.desktop
-Source20: idle3.appdata.xml
-Source99: https://www.python.org/static/files/pubkeys.txt#/python.keyring
+Source12: idle3.desktop
+Source13: idle3.appdata.xml
+
+# Fixed bundled wheels
+Source20: setuptools-44.1.1-py2.py3-none-any.whl
+Source21: pip-20.2.3-py2.py3-none-any.whl
+
# The following files are not used in the build.
# They are listed here to work around missing functionality in rpmbuild,
# which would otherwise exclude them from distributed src.rpm files.
+Source99: https://www.python.org/static/files/pubkeys.txt#/python.keyring
Source100: PACKAGING-NOTES
-### COMMON-PATCH-BEGIN ###
# implement "--record-rpm" option for distutils installations
Patch01: Python-3.0b1-record-rpm.patch
# support lib-vs-lib64 distinction
@@ -159,7 +163,13 @@
Patch36: riscv64-support.patch
# PATCH-FIX-UPSTREAM riscv64-ctypes.patch bpo-35847: RISC-V needs CTYPES_PASS_BY_REF_HACK (GH-11694)
Patch37: riscv64-ctypes.patch
-### COMMON-PATCH-END ###
+# PATCH-FIX-UPSTREAM faulthandler._stack_overflow_on_GCC10.patch bpo#38965 mcepl(a)suse.com
+# Fix faulthandler._stack_overflow() on GCC 10
+Patch38: faulthandler_stack_overflow_on_GCC10.patch
+# PATCH-FIX-UPSTREAM ignore_pip_deprec_warn.patch mcepl(a)suse.com
+# Ignore deprecation warning for old version of pip
+Patch39: ignore_pip_deprec_warn.patch
+
BuildRequires: automake
BuildRequires: fdupes
BuildRequires: gmp-devel
@@ -167,7 +177,6 @@
BuildRequires: netcfg
BuildRequires: openssl-devel
BuildRequires: pkgconfig
-BuildRequires: timezone
BuildRequires: xz
BuildRequires: pkgconfig(bzip2)
BuildRequires: pkgconfig(expat)
@@ -181,9 +190,7 @@
%if %{with doc}
# Here we just run sphinx and we can use generic one, we don't need
# the flavor variant
-BuildRequires: python3-Sphinx < 3.0
-BuildRequires: python3-python-docs-theme
-BuildRequires: python3-sphinxcontrib-qthelp >= 1.0.2
+BuildRequires: python3-Sphinx < 3
%endif
%if %{with general}
# required for idle3 (.desktop and .appdata.xml files)
@@ -193,6 +200,7 @@
BuildRequires: gettext
BuildRequires: readline-devel
BuildRequires: sqlite-devel
+BuildRequires: timezone
BuildRequires: update-desktop-files
BuildRequires: pkgconfig(ncurses)
BuildRequires: pkgconfig(tk)
@@ -204,6 +212,7 @@
Provides: python = %{python_version}
%if %{primary_interpreter}
Provides: python3 = %{python_version}
+Obsoletes: python3 <= %{python_version}
%endif
%endif
@@ -226,6 +235,7 @@
Requires: %{python_pkg_name} = %{version}
%if %{primary_interpreter}
Provides: python3-tk = %{version}
+Obsoletes: python3-tk < %{version}
%endif
%description -n %{python_pkg_name}-tk
@@ -235,7 +245,8 @@
Summary: Python Interface to the (N)Curses Library
Requires: %{python_pkg_name} = %{version}
%if %{primary_interpreter}
-Provides: python3-curses
+Provides: python3-curses = %{version}
+Obsoletes: python3-curses < %{version}
%endif
%description -n %{python_pkg_name}-curses
@@ -246,7 +257,8 @@
Summary: Python Interface to the GDBM Library
Requires: %{python_pkg_name} = %{version}
%if %{primary_interpreter}
-Provides: python3-dbm
+Provides: python3-dbm = %{version}
+Obsoletes: python3-dbm < %{version}
%endif
%description -n %{python_pkg_name}-dbm
@@ -259,6 +271,7 @@
Requires: %{python_pkg_name}-tk
%if %{primary_interpreter}
Provides: python3-idle = %{version}
+Obsoletes: python3-idle < %{version}
%endif
%description -n %{python_pkg_name}-idle
@@ -272,6 +285,7 @@
Enhances: %{python_pkg_name} = %{python_version}
%if %{primary_interpreter}
Provides: python3-doc = %{version}
+Obsoletes: python3-doc < %{version}
%endif
%description -n %{python_pkg_name}-doc
@@ -283,6 +297,7 @@
Summary: Additional Package Documentation for Python 3 in devhelp format
%if %{primary_interpreter}
Provides: python3-doc-devhelp = %{version}
+Obsoletes: python3-doc-devhelp < %{version}
%endif
%description -n %{python_pkg_name}-doc-devhelp
@@ -306,22 +321,15 @@
Provides: %{python_pkg_name}-typing = %{version}
# python3-xml was merged into python3, now moved into -base
Provides: %{python_pkg_name}-xml = %{version}
-# python-importlib-metadata was specifical project which was merged into 3.8
-Provides: %{python_pkg_name}-importlib-metadata = %{version}
-# python-importlib_resources is a backport of 3.7 behaviour into older pythons
-Provides: %{python_pkg_name}-importlib_resources = %{version}
%if %{primary_interpreter}
Provides: python3-asyncio = %{version}
Provides: python3-base = %{version}
Obsoletes: python3-asyncio < %{version}
+Obsoletes: python3-base < %{version}
Provides: python3-typing = %{version}
Obsoletes: python3-typing < %{version}
Provides: python3-xml = %{version}
Obsoletes: python3-xml < %{version}
-Provides: python3-importlib-metadata = %{version}
-Obsoletes: python3-importlib-metadata < %{version}
-Provides: python3-importlib_resources = %{version}
-Obsoletes: python3-importlib_resources < %{version}
%endif
%description -n %{python_pkg_name}-base
@@ -346,6 +354,7 @@
Provides: python3-tools = %{version}
Obsoletes: python3-2to3 < %{version}
Obsoletes: python3-demo < %{version}
+Obsoletes: python3-tools < %{version}
%endif
%description -n %{python_pkg_name}-tools
@@ -357,6 +366,7 @@
Requires: %{python_pkg_name}-base = %{version}
%if %{primary_interpreter}
Provides: python3-devel = %{version}
+Obsoletes: python3-devel < %{version}
%endif
%description -n %{python_pkg_name}-devel
@@ -376,6 +386,7 @@
Requires: %{python_pkg_name}-tk = %{version}
%if %{primary_interpreter}
Provides: python3-testsuite = %{version}
+Obsoletes: python3-testsuite < %{version}
%endif
%description -n %{python_pkg_name}-testsuite
@@ -421,6 +432,8 @@
%patch35 -p1
%patch36 -p1
%patch37 -p1
+%patch38 -p1
+%patch39 -p1
# drop Autoconf version requirement
sed -i 's/^AC_PREREQ/dnl AC_PREREQ/' configure.ac
@@ -441,6 +454,15 @@
# drop duplicate README from site-packages
rm Lib/site-packages/README.txt
+# Replace bundled wheels with the updated ones
+rm -v Lib/ensurepip/_bundled/*.whl
+cp -v %{SOURCE20} %{SOURCE21} Lib/ensurepip/_bundled/
+STVER=$(basename %{SOURCE20}|cut -d- -f2)
+PIPVER=$(basename %{SOURCE21}|cut -d- -f2)
+sed -E -i -e "s/^(\s*_SETUPTOOLS_VERSION\s+=\s+)\"[0-9.]+\"/\1\"${STVER}\"/" \
+ -e "s/^(\s*_PIP_VERSION\s+=\s+)\"[0-9.]+\"/\1\"${PIPVER}\"/" \
+ Lib/ensurepip/__init__.py
+
%build
%if %{with doc}
TODAY_DATE=`date -r %{SOURCE0} "+%%B %%d, %%Y"`
@@ -448,7 +470,7 @@
cd Doc
sed -i "s/^today = .*/today = '$TODAY_DATE'/" conf.py
-%make_build -j1 html
+make %{?_smp_mflags} -j1 html
# Build also devhelp files
sphinx-build -a -b devhelp . build/devhelp
@@ -456,7 +478,11 @@
%else
%define _lto_cflags %{nil}
# use rpm_opt_flags
+%if 0%{?suse_version} < 1500
+export OPT="%{optflags} -DOPENSSL_LOAD_CONF -fwrapv $(pkg-config --cflags-only-I libffi)"
+%else
export OPT="%{optflags} -DOPENSSL_LOAD_CONF -fwrapv $(pkg-config --cflags-only-I libffi) -fno-semantic-interposition"
+%endif
touch -r %{SOURCE0} Makefile.pre.in
@@ -480,14 +506,14 @@
--enable-loadable-sqlite-extensions
# prevent make from trying to rebuild PYTHON_FOR_GEN stuff
-%make_build -t Python/Python-ast.c \
+make -t Python/Python-ast.c \
Include/Python-ast.h \
Objects/typeslots.inc \
Python/opcode_targets.h \
Include/opcode.h
%if %{with general}
-%make_build
+make %{?_smp_mflags}
%endif
%if %{with base}
%if %{with profileopt}
@@ -496,7 +522,7 @@
target=all
%endif
LD_LIBRARY_PATH=.:$LD_LIBRARY_PATH \
- %make_build $target
+ make %{?_smp_mflags} $target
%endif
%endif
@@ -512,10 +538,12 @@
# test_multiprocessing_forkserver is racy
EXCLUDE="$EXCLUDE test_multiprocessing_forkserver"
%endif
+%ifarch ppc ppc64 ppc64le
# exclude test_faulthandler due to bnc#831629
EXCLUDE="$EXCLUDE test_faulthandler"
+%endif
# some tests break in QEMU
-%if 0%{?qemu_user_space_build} > 0
+%if 0%{?qemu_user_space_build}
EXCLUDE="$EXCLUDE test_multiprocessing_forkserver test_multiprocessing_spawn test_posix test_os test_socket"
# qemu bug (siginterrupt handling)
EXCLUDE="$EXCLUDE test_signal"
@@ -538,7 +566,7 @@
# Use timeout, like make target buildbottest
# We cannot run tests parallel, because osc build environment doesn’t
# have /dev/shm
-%make_build -j1 test TESTOPTS="-u curses -v -x $EXCLUDE --timeout=1200"
+make %{?_smp_mflags} -j1 test TESTOPTS="-u curses -v -x $EXCLUDE --timeout=3000"
# use network, be verbose:
#make test TESTOPTS="-l -u network -v"
%endif
@@ -589,14 +617,15 @@
done
for library in \
- array _asyncio audioop binascii _bisect _bz2 cmath _codecs_* _crypt _csv \
- _ctypes _datetime _decimal fcntl grp _hashlib _heapq _json _lsprof \
- _lzma math mmap _multibytecodec _multiprocessing _opcode ossaudiodev \
- parser _pickle _posixsubprocess _random resource select _ssl _socket spwd \
- _struct syslog termios _testbuffer _testimportmultiple _testmultiphase \
- unicodedata zlib _ctypes_test _testcapi xxlimited \
- _elementtree pyexpat \
- _md5 _sha1 _sha256 _sha512 _blake2 _sha3
+ array _asyncio audioop binascii _bisect _bz2 cmath _codecs_* \
+ _crypt _csv _ctypes _datetime _decimal fcntl grp \
+ _hashlib _heapq _json _lsprof _lzma math mmap _multibytecodec \
+ _multiprocessing _opcode ossaudiodev parser _pickle \
+ _posixsubprocess _random resource select _ssl _socket spwd \
+ _struct syslog termios _testbuffer _testimportmultiple \
+ _testmultiphase unicodedata zlib _ctypes_test _testcapi xxlimited \
+ _elementtree pyexpat _md5 _sha1 \
+ _sha256 _sha512 _blake2 _sha3
do
eval rm "%{buildroot}%{sitedir}/lib-dynload/$library.*"
done
@@ -622,13 +651,15 @@
done
# install idle desktop file
-cp %{SOURCE19} idle%{python_version}.desktop
+cp %{SOURCE12} idle%{python_version}.desktop
sed -i -e 's:idle3:idle%{python_version}:g' idle%{python_version}.desktop
+mkdir -p %{buildroot}%{_datadir}/applications
install -m 644 -D -t %{buildroot}%{_datadir}/applications idle%{python_version}.desktop
%suse_update_desktop_file idle%{python_version}
-cp %{SOURCE20} idle%{python_version}.appdata.xml
+cp %{SOURCE13} idle%{python_version}.appdata.xml
sed -i -e 's:idle3.desktop:idle%{python_version}.desktop:g' idle%{python_version}.appdata.xml
+mkdir -p %{buildroot}%{_datadir}/metainfo
install -m 644 -D -t %{buildroot}%{_datadir}/metainfo idle%{python_version}.appdata.xml
appstream-util validate-relax --nonet %{buildroot}%{_datadir}/metainfo/idle%{python_version}.appdata.xml
@@ -773,6 +804,7 @@
%doc Lib/idlelib/ChangeLog
%{_bindir}/idle%{python_version}
%{_datadir}/applications/idle%{python_version}.desktop
+%dir %{_datadir}/metainfo
%{_datadir}/metainfo/idle%{python_version}.appdata.xml
%{_datadir}/icons/hicolor/*/apps/idle%{python_version}.png
%dir %{_datadir}/icons/hicolor
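The ensurepip hunk in the spec diff above derives the setuptools and pip versions from the bundled wheel filenames before patching them into Lib/ensurepip/__init__.py. A minimal sketch of that extraction step, using hypothetical wheel filenames (the real ones come from %{SOURCE20} and %{SOURCE21}, whose versions are not shown in this changelog):

```shell
# Wheel filenames follow the name-version-tags.whl convention (PEP 427),
# so the version is the second dash-separated field.
# The filenames below are illustrative examples only.
st_whl=setuptools-50.3.2-py3-none-any.whl
pip_whl=pip-20.2.4-py3-none-any.whl

STVER=$(basename "$st_whl" | cut -d- -f2)
PIPVER=$(basename "$pip_whl" | cut -d- -f2)

echo "$STVER $PIPVER"   # 50.3.2 20.2.4
```

The same two variables are then interpolated into the sed expressions that rewrite _SETUPTOOLS_VERSION and _PIP_VERSION.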
++++++ bpo23395-PyErr_SetInterrupt-signal.patch ++++++
--- /var/tmp/diff_new_pack.ttr46C/_old 2020-11-29 12:29:18.186055268 +0100
+++ /var/tmp/diff_new_pack.ttr46C/_new 2020-11-29 12:29:18.186055268 +0100
@@ -119,7 +119,7 @@
unittest.main()
--- a/Misc/ACKS
+++ b/Misc/ACKS
-@@ -247,7 +247,7 @@ Donn Cave
+@@ -248,7 +248,7 @@ Donn Cave
Charles Cazabon
Jesús Cea Avión
Per Cederqvist
++++++ faulthandler_stack_overflow_on_GCC10.patch ++++++
From 5044c889dfced2f43e2cccb673d889a4882f6b3b Mon Sep 17 00:00:00 2001
From: "Miss Islington (bot)"
<31488909+miss-islington(a)users.noreply.github.com>
Date: Wed, 4 Dec 2019 12:29:22 -0800
Subject: [PATCH] bpo-38965: Fix faulthandler._stack_overflow() on GCC 10
(GH-17467)
Use the "volatile" keyword to prevent tail call optimization
on any compiler, rather than relying on compiler specific pragma.
(cherry picked from commit 8b787964e0a647caa0558b7c29ae501470d727d9)
Co-authored-by: Victor Stinner <vstinner(a)python.org>
---
.../2019-12-04-17-08-55.bpo-38965.yqax3m.rst | 3 +++
Modules/faulthandler.c | 16 ++++++----------
2 files changed, 9 insertions(+), 10 deletions(-)
create mode 100644 Misc/NEWS.d/next/Tests/2019-12-04-17-08-55.bpo-38965.yqax3m.rst
--- /dev/null
+++ b/Misc/NEWS.d/next/Tests/2019-12-04-17-08-55.bpo-38965.yqax3m.rst
@@ -0,0 +1,3 @@
+Fix test_faulthandler on GCC 10. Use the "volatile" keyword in
+``faulthandler._stack_overflow()`` to prevent tail call optimization on any
+compiler, rather than relying on compiler specific pragma.
--- a/Modules/faulthandler.c
+++ b/Modules/faulthandler.c
@@ -1091,18 +1091,14 @@ faulthandler_fatal_error_py(PyObject *se
#if defined(HAVE_SIGALTSTACK) && defined(HAVE_SIGACTION)
#define FAULTHANDLER_STACK_OVERFLOW
-#ifdef __INTEL_COMPILER
- /* Issue #23654: Turn off ICC's tail call optimization for the
- * stack_overflow generator. ICC turns the recursive tail call into
- * a loop. */
-# pragma intel optimization_level 0
-#endif
-static
-uintptr_t
+static uintptr_t
stack_overflow(uintptr_t min_sp, uintptr_t max_sp, size_t *depth)
{
- /* allocate 4096 bytes on the stack at each call */
- unsigned char buffer[4096];
+ /* Allocate (at least) 4096 bytes on the stack at each call.
+
+ bpo-23654, bpo-38965: use volatile keyword to prevent tail call
+ optimization. */
+ volatile unsigned char buffer[4096];
uintptr_t sp = (uintptr_t)&buffer;
*depth += 1;
if (sp < min_sp || max_sp < sp)
++++++ ignore_pip_deprec_warn.patch ++++++
--- a/Lib/test/test_venv.py
+++ b/Lib/test/test_venv.py
@@ -438,6 +438,7 @@ class EnsurePipTest(BaseTest):
' module unconditionally')
# Issue #26610: pip/pep425tags.py requires ctypes
@unittest.skipUnless(ctypes, 'pip requires ctypes')
+ @unittest.skip("Doesn't work with modified wheels")
@requires_zlib
def test_with_pip(self):
self.do_test_with_pip(False)
++++++ pep538_coerce_legacy_c_locale.patch ++++++
--- /var/tmp/diff_new_pack.ttr46C/_old 2020-11-29 12:29:18.258055341 +0100
+++ /var/tmp/diff_new_pack.ttr46C/_new 2020-11-29 12:29:18.258055341 +0100
@@ -1,5 +1,3 @@
-diff --git a/Doc/using/cmdline.rst b/Doc/using/cmdline.rst
-index d14793a..65aa3ad 100644
--- a/Doc/using/cmdline.rst
+++ b/Doc/using/cmdline.rst
@@ -728,6 +728,45 @@ conflict.
@@ -48,8 +46,6 @@
Debug-mode variables
~~~~~~~~~~~~~~~~~~~~
-diff --git a/Lib/test/support/script_helper.py b/Lib/test/support/script_helper.py
-index 507dc48..c3cb720 100644
--- a/Lib/test/support/script_helper.py
+++ b/Lib/test/support/script_helper.py
@@ -56,8 +56,35 @@ def interpreter_requires_environment():
@@ -90,7 +86,7 @@
# Executing the interpreter in a subprocess
-@@ -115,30 +142,7 @@ def run_python_until_end(*args, **env_vars):
+@@ -115,30 +142,7 @@ def run_python_until_end(*args, **env_va
def _assert_python(expected_success, *args, **env_vars):
res, cmd_line = run_python_until_end(*args, **env_vars)
if (res.rc and expected_success) or (not res.rc and not expected_success):
@@ -122,9 +118,6 @@
return res
def assert_python_ok(*args, **env_vars):
-diff --git a/Lib/test/test_c_locale_coercion.py b/Lib/test/test_c_locale_coercion.py
-new file mode 100644
-index 0000000..635c98f
--- /dev/null
+++ b/Lib/test/test_c_locale_coercion.py
@@ -0,0 +1,371 @@
@@ -499,8 +492,6 @@
+
+if __name__ == "__main__":
+ test_main()
-diff --git a/Lib/test/test_cmd_line.py b/Lib/test/test_cmd_line.py
-index 38156b4..5922ed9 100644
--- a/Lib/test/test_cmd_line.py
+++ b/Lib/test/test_cmd_line.py
@@ -153,6 +153,7 @@ class CmdLineTest(unittest.TestCase):
@@ -511,8 +502,6 @@
code = (
b'import locale; '
b'print(ascii("' + undecodable + b'"), '
-diff --git a/Lib/test/test_sys.py b/Lib/test/test_sys.py
-index 7866a5c..b41239a 100644
--- a/Lib/test/test_sys.py
+++ b/Lib/test/test_sys.py
@@ -680,6 +680,7 @@ class SysModuleTest(unittest.TestCase):
@@ -523,8 +512,6 @@
code = '\n'.join((
'import sys',
'def dump(name):',
-diff --git a/Modules/main.c b/Modules/main.c
-index 585d696..96d8be4 100644
--- a/Modules/main.c
+++ b/Modules/main.c
@@ -107,7 +107,11 @@ static const char usage_6[] =
@@ -540,8 +527,6 @@
static int
usage(int exitcode, const wchar_t* program)
-diff --git a/Programs/_testembed.c b/Programs/_testembed.c
-index 813cf30..2a64092 100644
--- a/Programs/_testembed.c
+++ b/Programs/_testembed.c
@@ -1,4 +1,5 @@
@@ -550,8 +535,6 @@
#include "pythread.h"
#include <stdio.h>
-diff --git a/Programs/python.c b/Programs/python.c
-index a7afbc7..03f8295 100644
--- a/Programs/python.c
+++ b/Programs/python.c
@@ -15,6 +15,21 @@ wmain(int argc, wchar_t **argv)
@@ -622,11 +605,9 @@
for (i = 0; i < argc; i++) {
PyMem_RawFree(argv_copy2[i]);
-diff --git a/Python/pylifecycle.c b/Python/pylifecycle.c
-index ecfdfee..4fee178 100644
--- a/Python/pylifecycle.c
+++ b/Python/pylifecycle.c
-@@ -167,6 +167,7 @@ Py_SetStandardStreamEncoding(const char *encoding, const char *errors)
+@@ -167,6 +167,7 @@ Py_SetStandardStreamEncoding(const char
return 0;
}
@@ -634,7 +615,7 @@
/* Global initializations. Can be undone by Py_FinalizeEx(). Don't
call this twice without an intervening Py_FinalizeEx() call. When
initializations fail, a fatal error is issued and the function does
-@@ -301,6 +302,183 @@ import_init(PyInterpreterState *interp, PyObject *sysmod)
+@@ -301,6 +302,183 @@ import_init(PyInterpreterState *interp,
}
@@ -818,7 +799,7 @@
void
_Py_InitializeEx_Private(int install_sigs, int install_importlib)
{
-@@ -315,11 +493,19 @@ _Py_InitializeEx_Private(int install_sigs, int install_importlib)
+@@ -315,11 +493,19 @@ _Py_InitializeEx_Private(int install_sig
initialized = 1;
_Py_Finalizing = NULL;
@@ -839,7 +820,7 @@
#endif
if ((p = Py_GETENV("PYTHONDEBUG")) && *p != '\0')
-@@ -1247,12 +1433,8 @@ initstdio(void)
+@@ -1250,12 +1436,8 @@ initstdio(void)
}
}
if (!errors && !(pythonioencoding && *pythonioencoding)) {
@@ -854,11 +835,9 @@
}
}
-diff --git a/configure.ac b/configure.ac
-index 3f2459a..7444486 100644
--- a/configure.ac
+++ b/configure.ac
-@@ -3360,6 +3360,40 @@ then
+@@ -3417,6 +3417,40 @@ then
fi
AC_MSG_RESULT($with_pymalloc)
++++++ python-3.6-CVE-2017-18207.patch ++++++
--- /var/tmp/diff_new_pack.ttr46C/_old 2020-11-29 12:29:18.286055369 +0100
+++ /var/tmp/diff_new_pack.ttr46C/_new 2020-11-29 12:29:18.290055373 +0100
@@ -7,11 +7,9 @@
Lib/wave.py | 2 ++
1 file changed, 2 insertions(+)
-diff --git a/Lib/wave.py b/Lib/wave.py
-index cf94d5af72b4..6db5a2e9cc96 100644
--- a/Lib/wave.py
+++ b/Lib/wave.py
-@@ -259,6 +259,8 @@ def _read_fmt_chunk(self, chunk):
+@@ -258,6 +258,8 @@ class Wave_read:
self._sampwidth = (sampwidth + 7) // 8
else:
raise Error('unknown format: %r' % (wFormatTag,))
++++++ riscv64-ctypes.patch ++++++
--- /var/tmp/diff_new_pack.ttr46C/_old 2020-11-29 12:29:18.310055393 +0100
+++ /var/tmp/diff_new_pack.ttr46C/_new 2020-11-29 12:29:18.310055393 +0100
@@ -13,18 +13,13 @@
2 files changed, 2 insertions(+), 1 deletion(-)
create mode 100644 Misc/NEWS.d/next/Library/2019-01-29-09-11-09.bpo-35847.eiSi4t.rst
-diff --git a/Misc/NEWS.d/next/Library/2019-01-29-09-11-09.bpo-35847.eiSi4t.rst b/Misc/NEWS.d/next/Library/2019-01-29-09-11-09.bpo-35847.eiSi4t.rst
-new file mode 100644
-index 0000000000..e3775f96f3
--- /dev/null
+++ b/Misc/NEWS.d/next/Library/2019-01-29-09-11-09.bpo-35847.eiSi4t.rst
@@ -0,0 +1 @@
+RISC-V needed the CTYPES_PASS_BY_REF_HACK. Fixes ctypes Structure test_pass_by_value.
-diff --git a/Modules/_ctypes/callproc.c b/Modules/_ctypes/callproc.c
-index a7965c19b7..bed5364020 100644
--- a/Modules/_ctypes/callproc.c
+++ b/Modules/_ctypes/callproc.c
-@@ -1058,7 +1058,7 @@ GetComError(HRESULT errcode, GUID *riid, IUnknown *pIunk)
+@@ -1063,7 +1063,7 @@ GetComError(HRESULT errcode, GUID *riid,
#endif
#if (defined(__x86_64__) && (defined(__MINGW64__) || defined(__CYGWIN__))) || \
@@ -33,6 +28,3 @@
#define CTYPES_PASS_BY_REF_HACK
#define POW2(x) (((x & ~(x - 1)) == x) ? x : 0)
#define IS_PASS_BY_REF(x) (x > 8 || !POW2(x))
---
-2.28.0
-
++++++ riscv64-support.patch ++++++
--- /var/tmp/diff_new_pack.ttr46C/_old 2020-11-29 12:29:18.318055401 +0100
+++ /var/tmp/diff_new_pack.ttr46C/_new 2020-11-29 12:29:18.318055401 +0100
@@ -13,19 +13,14 @@
3 files changed, 71 insertions(+), 1 deletion(-)
create mode 100644 Misc/NEWS.d/next/Build/2018-04-30-16-53-00.bpo-33377.QBh6vP.rst
-diff --git a/Misc/NEWS.d/next/Build/2018-04-30-16-53-00.bpo-33377.QBh6vP.rst b/Misc/NEWS.d/next/Build/2018-04-30-16-53-00.bpo-33377.QBh6vP.rst
-new file mode 100644
-index 0000000000..f5dbd23c7c
--- /dev/null
+++ b/Misc/NEWS.d/next/Build/2018-04-30-16-53-00.bpo-33377.QBh6vP.rst
@@ -0,0 +1,2 @@
+Add new triplets for mips r6 and riscv variants (used in extension
+suffixes).
-diff --git a/configure b/configure
-index f9eee2c028..673cfbd3cf 100755
--- a/configure
+++ b/configure
-@@ -781,6 +781,7 @@ infodir
+@@ -785,6 +785,7 @@ infodir
docdir
oldincludedir
includedir
@@ -33,7 +28,7 @@
localstatedir
sharedstatedir
sysconfdir
-@@ -893,6 +894,7 @@ datadir='${datarootdir}'
+@@ -898,6 +899,7 @@ datadir='${datarootdir}'
sysconfdir='${prefix}/etc'
sharedstatedir='${prefix}/com'
localstatedir='${prefix}/var'
@@ -41,7 +36,7 @@
includedir='${prefix}/include'
oldincludedir='/usr/include'
docdir='${datarootdir}/doc/${PACKAGE_TARNAME}'
-@@ -1145,6 +1147,15 @@ do
+@@ -1150,6 +1152,15 @@ do
| -silent | --silent | --silen | --sile | --sil)
silent=yes ;;
@@ -57,7 +52,7 @@
-sbindir | --sbindir | --sbindi | --sbind | --sbin | --sbi | --sb)
ac_prev=sbindir ;;
-sbindir=* | --sbindir=* | --sbindi=* | --sbind=* | --sbin=* \
-@@ -1282,7 +1293,7 @@ fi
+@@ -1287,7 +1298,7 @@ fi
for ac_var in exec_prefix prefix bindir sbindir libexecdir datarootdir \
datadir sysconfdir sharedstatedir localstatedir includedir \
oldincludedir docdir infodir htmldir dvidir pdfdir psdir \
@@ -66,7 +61,7 @@
do
eval ac_val=\$$ac_var
# Remove trailing slashes.
-@@ -1435,6 +1446,7 @@ Fine tuning of the installation directories:
+@@ -1440,6 +1451,7 @@ Fine tuning of the installation director
--sysconfdir=DIR read-only single-machine data [PREFIX/etc]
--sharedstatedir=DIR modifiable architecture-independent data [PREFIX/com]
--localstatedir=DIR modifiable single-machine data [PREFIX/var]
@@ -74,7 +69,7 @@
--libdir=DIR object code libraries [EPREFIX/lib]
--includedir=DIR C header files [PREFIX/include]
--oldincludedir=DIR C header files for non-gcc [/usr/include]
-@@ -5238,6 +5250,26 @@ cat >> conftest.c <<EOF
+@@ -5261,6 +5273,26 @@ cat >> conftest.c <<EOF
ia64-linux-gnu
# elif defined(__m68k__) && !defined(__mcoldfire__)
m68k-linux-gnu
@@ -101,7 +96,7 @@
# elif defined(__mips_hard_float) && defined(_MIPSEL)
# if _MIPS_SIM == _ABIO32
mipsel-linux-gnu
-@@ -5280,6 +5312,14 @@ cat >> conftest.c <<EOF
+@@ -5303,6 +5335,14 @@ cat >> conftest.c <<EOF
sparc64-linux-gnu
# elif defined(__sparc__)
sparc-linux-gnu
@@ -116,11 +111,9 @@
# else
# error unknown platform triplet
# endif
-diff --git a/configure.ac b/configure.ac
-index b83abee18c..419bc34eab 100644
--- a/configure.ac
+++ b/configure.ac
-@@ -781,6 +781,26 @@ cat >> conftest.c <<EOF
+@@ -804,6 +804,26 @@ cat >> conftest.c <<EOF
ia64-linux-gnu
# elif defined(__m68k__) && !defined(__mcoldfire__)
m68k-linux-gnu
@@ -147,7 +140,7 @@
# elif defined(__mips_hard_float) && defined(_MIPSEL)
# if _MIPS_SIM == _ABIO32
mipsel-linux-gnu
-@@ -823,6 +843,14 @@ cat >> conftest.c <<EOF
+@@ -846,6 +866,14 @@ cat >> conftest.c <<EOF
sparc64-linux-gnu
# elif defined(__sparc__)
sparc-linux-gnu
@@ -162,6 +155,3 @@
# else
# error unknown platform triplet
# endif
---
-2.28.0
-
[opensuse-commit] commit imagewriter for openSUSE:Factory
by User for buildservice source handling 29 Nov '20
Hello community,
here is the log from the commit of package imagewriter for openSUSE:Factory checked in at 2020-11-29 12:29:03
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/imagewriter (Old)
and /work/SRC/openSUSE:Factory/.imagewriter.new.5913 (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Package is "imagewriter"
Sun Nov 29 12:29:03 2020 rev:24 rq:851283 version:1.10.1432200249.1d253d9
Changes:
--------
--- /work/SRC/openSUSE:Factory/imagewriter/imagewriter.changes 2018-05-15 10:31:50.227636681 +0200
+++ /work/SRC/openSUSE:Factory/.imagewriter.new.5913/imagewriter.changes 2020-11-29 12:29:06.886043838 +0100
@@ -1,0 +2,5 @@
+Mon Nov 9 17:15:13 UTC 2020 - Hans-Peter Jansen <hpj(a)urpla.net>
+
+- Apply 0001-remove-include-sys-sysctl.h.patch
+
+-------------------------------------------------------------------
New:
----
0001-remove-include-sys-sysctl.h.patch
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Other differences:
------------------
++++++ imagewriter.spec ++++++
--- /var/tmp/diff_new_pack.z2vPvn/_old 2020-11-29 12:29:07.454044413 +0100
+++ /var/tmp/diff_new_pack.z2vPvn/_new 2020-11-29 12:29:07.458044417 +0100
@@ -69,6 +69,7 @@
Group: Hardware/Other
URL: https://github.com/openSUSE/imagewriter
Source0: imagewriter-%{version}.tar.xz
+Patch0: 0001-remove-include-sys-sysctl.h.patch
BuildRequires: %{backend}
BuildRequires: %{breq}
BuildRequires: gcc-c++
@@ -80,8 +81,7 @@
A graphical utility for writing raw disk images & hybrid ISOs to USB keys.
%prep
-%setup -q
-
+%autosetup -p1
%build
# Create qmake cache file for building and use optflags.
++++++ 0001-remove-include-sys-sysctl.h.patch ++++++
From 3164e25267243ef4983f53ef5c1f849d3301c36f Mon Sep 17 00:00:00 2001
From: Hans-Peter Jansen <hp(a)urpla.net>
Date: Mon, 9 Nov 2020 18:06:49 +0100
Subject: [PATCH] remove include <sys/sysctl.h>
---
MainWindow.cpp | 1 -
1 file changed, 1 deletion(-)
diff --git a/MainWindow.cpp b/MainWindow.cpp
index c32bf9e..c634613 100644
--- a/MainWindow.cpp
+++ b/MainWindow.cpp
@@ -37,7 +37,6 @@
#include <unistd.h>
#include <sys/types.h>
-#include <sys/sysctl.h>
#ifdef USEUDISKS2
#include "udisks2_interface.h"
--
2.29.2