openSUSE Commits
December 2020
- 1 participant
- 2154 discussions
21 Dec '20
Hello community,
here is the log from the commit of package 00Meta for openSUSE:Leap:15.2:Images checked in at 2020-12-21 11:02:29
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Leap:15.2:Images/00Meta (Old)
and /work/SRC/openSUSE:Leap:15.2:Images/.00Meta.new.5145 (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Package is "00Meta"
Mon Dec 21 11:02:29 2020 rev:700 rq: version:unknown
Changes:
--------
New Changes file:
NO CHANGES FILE!!!
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Other differences:
------------------
++++++ version_totest ++++++
--- /var/tmp/diff_new_pack.mjMnxT/_old 2020-12-21 11:02:30.906532821 +0100
+++ /var/tmp/diff_new_pack.mjMnxT/_new 2020-12-21 11:02:30.906532821 +0100
@@ -1 +1 @@
-31.297
\ No newline at end of file
+31.298
\ No newline at end of file
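The `\ No newline at end of file` markers above are diff's way of flagging that version_totest stores its version string without a trailing newline, on both the old and the new side. A minimal reproduction with throwaway file names (not part of the commit):

```shell
# Recreate the situation: two one-line files written without a trailing newline
printf '31.297' > _old
printf '31.298' > _new
# diff marks the missing newline on both sides; exit status 1 only means "files differ"
diff -u _old _new || true
```

The marker is purely informational: the files still compare and patch correctly, it just warns that the last line is not newline-terminated.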
21 Dec '20
Hello community,
here is the log from the commit of package gnu_parallel for openSUSE:Factory checked in at 2020-12-21 10:24:44
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/gnu_parallel (Old)
and /work/SRC/openSUSE:Factory/.gnu_parallel.new.5145 (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Package is "gnu_parallel"
Mon Dec 21 10:24:44 2020 rev:61 rq:857726 version:20201122
Changes:
--------
--- /work/SRC/openSUSE:Factory/gnu_parallel/gnu_parallel.changes 2020-11-02 10:36:52.555316557 +0100
+++ /work/SRC/openSUSE:Factory/.gnu_parallel.new.5145/gnu_parallel.changes 2020-12-21 10:27:12.580237847 +0100
@@ -1,0 +2,6 @@
+Sun Dec 20 13:20:06 UTC 2020 - Dirk Müller <dmueller(a)suse.com>
+
+- update to 20201122:
+ * Bug fixes and man page updates.
+
+-------------------------------------------------------------------
Old:
----
parallel-20201022.tar.bz2
parallel-20201022.tar.bz2.sig
New:
----
parallel-20201122.tar.bz2
parallel-20201122.tar.bz2.sig
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Other differences:
------------------
++++++ gnu_parallel.spec ++++++
--- /var/tmp/diff_new_pack.ipodA4/_old 2020-12-21 10:27:13.364238736 +0100
+++ /var/tmp/diff_new_pack.ipodA4/_new 2020-12-21 10:27:13.368238741 +0100
@@ -17,7 +17,7 @@
Name: gnu_parallel
-Version: 20201022
+Version: 20201122
Release: 0
Summary: Shell tool for executing jobs in parallel
License: GPL-3.0-or-later
++++++ parallel-20201022.tar.bz2 -> parallel-20201122.tar.bz2 ++++++
++++ 3747 lines of diff (skipped)
++++++ parallel-20201022.tar.bz2.sig -> parallel-20201122.tar.bz2.sig ++++++
--- /work/SRC/openSUSE:Factory/gnu_parallel/parallel-20201022.tar.bz2.sig 2020-11-02 10:36:52.683316645 +0100
+++ /work/SRC/openSUSE:Factory/.gnu_parallel.new.5145/parallel-20201122.tar.bz2.sig 2020-12-21 10:27:12.676237957 +0100
@@ -2,7 +2,7 @@
# To check the signature run:
# echo | gpg
-# gpg --auto-key-locate keyserver --keyserver-options auto-key-retrieve parallel-20201022.tar.bz2.sig
+# gpg --auto-key-locate keyserver --keyserver-options auto-key-retrieve parallel-20201122.tar.bz2.sig
echo | gpg 2>/dev/null
gpg --auto-key-locate keyserver --keyserver-options auto-key-retrieve $0
@@ -10,32 +10,32 @@
-----BEGIN PGP SIGNATURE-----
-iQUHBAABCgAdFiEEzaAaQgjE90UGEH570atFFoiIiIgFAl+R5ZgACgkQ0atFFoiI
-iIjz4SaeLIx4rlxp5OLgXmj+0qwW+243aAZ1HNvNYKorDms+v6M0ffYGIYjFK2+7
-6iLPUQpglkDN5W59LcGKKMxL+sfsKzdHkDdnMX1LHUIirugEn+cPSrLDxM01BAgK
-lTfAzu5CaGh4OO8zjNGA+0aadZBFdwHiJXTJju/P4agl6K3a6J65NcwT5NbTvGoQ
-vMVSylAHJ6a+S6PhonsDCG7C7QHOsnXdRpgDG0H5fjTk5HaaHztcBTUyCBnyEeGa
-4ki0ROUJrhyx2fgoKpsbvQIz9HwvaPEvfLlC73s/b36bOe0JgOUshkYUVgBhUozQ
-7N4/ann3jRBhZcqQkuoWduWosve7V6I+E/F4SRS9pkCMla9xoBBI6v7TXOtrR1Ml
-Cc6GBSU/OaHdO5R8gX4vGy4j17XPt+ILyo2DcGXRZ14BAudOkOdU4beO3yrkHlJ3
-4FSlGCQ9CyBF2h3M3ICyTW//T8tzSvAec7lThHwUv7eqNP2VZr34519YGC43xFb3
-Xmtcv7bmk8V4g1jE0L1sIxTMR3YPR9eh71AxTpNw2OzrrXgMIg5WRpVZEKh43pZQ
-pReyRgwJNZlUGCAINGbjd3U0L91AtKCJzsvjloQ+vbzr+ZyfXPITc3LOyEN5XPBl
-5esAwmrolgr0C8WWUtBWKPAdgy6d0F1S9VvTLBaZ1Hk9bYKK1nM4VxnITD8XnKj+
-OZZapHwr3yUKr3oW49YZM2TVWCgr9gK4xpF63+6Ze8xQdoz/lMWBFiZuoKRApR2U
-SEi3TIxoBxktT9W/gjhwtl4gWRCgvxUAzd/bddFTHGrOEgnc6Zeyeg5d5lCU/umd
-o6xQTkudcfLSFAesReN+qkkacz8s1M9LI2Qs7GQgMt6RYJskD72yiOJDxjX3TA+G
-9QTJjMO2MM+m+QnGFdL6bLNGUovTR3r3U8zFUVxowx9kLQ2FAHF9r5WmD4U2El2Z
-7n3Xn6FL26aSN9bkntR0V3IYT3pf/gkV66mdKRc+C/u99CSajWz9Fm9AZkvwDJkU
-c+OckfqFmSg/tkSr8sKlgXbGJZZzef5vPxh4r0fk9q3UqSwHKKLIzjJ0Ztukv+se
-j2dk11oY2p1xJ2itBjrnwWSJRisRtDpTKmyddeGAduHh2MzxHCQs8K8ErIH1b9Y4
-dNTs5DQGMLgVK3rOlMYG6gz7GF2OGcdAE/7uVbKYJFx0BLEiEqbqER4KlOtPi+L1
-3ZPbjjZqBJdoEDi3PugD2EquuyKTkhqLgggOQjuhTWnZqd5Ys/rlXAkhqZvHmvCD
-0gz6QHKV6lrMAbQhf0URvEsaV/puxhsAulIya7VUtETmSz/k0wtd0aSI2FRytZhX
-vCziAn57STk02ARgVxK2vUw69IgZJ7HbXM0Emnanudy8rzmzc9yg8o/bFOh2EMT7
-soG5uk15Uqc4HAbR4TUk579yEIJlHxAcXMiuC2A5wEa7YYBBRUa6mvpcVsQPwF88
-zSTDikr3vgm/1gp9KGDZhvp4QvdaF8++1targJcI8bAfEDaou9aWg4eXKO1XxLK8
-r1V94RX3IG61Qjty7qW4kK/PrTaf8IEXNwlwUxgTkt/s+nUj20JJG4G80Qmte5UK
-ED1FIZSXVF3Bfekmal07qTo64XxSgglpVF32C+cT+h3q4UoBvGgl/KRW
-=Nah0
+iQUHBAABCgAdFiEEzaAaQgjE90UGEH570atFFoiIiIgFAl+6gMMACgkQ0atFFoiI
+iIiT0CafeUHaLOzHslceNXHqKPY0/VIu+OyWw4+PvwcxnvI783equdksrJnf1ArI
+aqGxdS9ZEK38rjXB9hUajL8n+ArAtrcEKzY7VBoU1h7y8a3cjCsYl9Rf/tyYWgK6
+0Gh+lXOR0W8LEEeCJi8L7ADvJaUsShvBSyEzZtDvKwRJbLpAOXTsucTNPDmEEcm2
+D0KFuIhssjzDm0YixIpq+ZNePI5Q6E6GslPOnAZNrt1ah/UQ65+/U2UzLmxCu4ik
+Vfp7gQcjMc8vPuHwnQqgG0FinJImz4ecRS8dLbAxVJkMOOC0/OwLGadmFzjurfBZ
+Dl1nPCjj03GcfengwEjJGUOh4Y5aL8nmgm5KHj86QsvLyHdtk+uyNy/D3gqZS8BO
+9e0cMF3L0aW8lVUKaPiJ5c/l+p7zhCjsSj1s5pIkRNBHuGAxiDT1JMG/cBLnVC0k
+Pac9b2k6Gw+FEbgeiATzkGzkgkQSdjFjFzQ9zjm9RAyKLA6wZAt8CNKoKG7V2G13
+EALExYLVNzJNlS4kfheNtNsUDVRMHB4VUIJmEX5CWNM8SJqMdp961PmLjIr2KdRo
+MQ9XlPI2TnBNmlgPfO1MwXR1Z4CrA/01/36rfW1HplIJqjuNYlNQUwAk2OF4PuUd
+lix5aC1b8q8L9su/674LgPks6IJiw/YzurKQSbHmuMtwPwmsRosAcB0LyJnpLVJe
+CpmNAoKCSxRvjpBgtvP0Dk2Ck911h1LlvVP4O97eE/YtcNZl8xd/VPnDOXQZoOS6
+X5D+TyBQpCwsKo21H5kgvrbmXM2njbMSbe3zNxiOHycVqP5x5SgusHoL5k0IXRNQ
+jX29zdA/Ut8ilOjkoJkfBYy3mS0mEXxrsfnYUGi1Ju+i9RcERyXMhNX1cqE/hyVw
+S6UHPtEfXYZ131Wqnozc22NWeudoWX+mWtU/ErqHEOt/9oW4NtUzHQQ7dv/Nhfzu
+d8nlxsExejGqHTJRib+uBP3ByB061BNeX8M2vHn4YLoQIS/KwRAxa70jcwu6xMBi
+AKiqMEDY6Cu3wtRkwkmDRGYuX3fdBmWoxHCBQFfUcxLMNhi0XoyS/VxrM/7InVph
+fOjOjCwVQP7sMo3kwx1cO+pG/WTiW7mQ3S4kHV1LnaIyUOzxoBOYT0Uhz7wRPXH/
+vihLp1Dsa7LzGKn6EfZ3PcRsLODWNeWEBWZ58wTogK3mbtdngoR7XEVFuRuWKL+8
+AlQUZXVL+B+TmeFkkyhaZ8EMi+YRNAY9+ZOEjLz6bzx50S8MDN4/cOz7psp1sVqN
+pVrWN+DoINrzlgVhy33Ko4+sYqo4sENyt4j5mJ1G4Zchi46FNu/MRYHPqBT1yVID
+dWHFH4VIyQntyhdDMmdCG5KfGrDSM0YaMA33qlvhJphB6WJ7aFcUd1QEnbkPP8uk
+pMeSwFJLHyOO4PxXAE6d2wCOuCMCUjVAXpUjQJX3nKX0KAysts6PGdpYVCcTXeq0
+WGJCW14xI+qhQMR4t5KbWAq6pIPzbWT1HsSxExNCVBwexsEXsCVRLKqHNIw5nJ3z
+q6erY1bpd/7K3zv1iUZ1GzB2tDd0qfC0EFOoZeeutlWH2o9hSjLj1skwEbAYaDgl
+hg4odJZiwdtDdMKKVX48siT/17Dsj+NRVMbv4zhZZVYEeLIFo4layKxz
+=d5Me
-----END PGP SIGNATURE-----
21 Dec '20
Hello community,
here is the log from the commit of package imapfilter for openSUSE:Factory checked in at 2020-12-21 10:24:42
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/imapfilter (Old)
and /work/SRC/openSUSE:Factory/.imapfilter.new.5145 (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Package is "imapfilter"
Mon Dec 21 10:24:42 2020 rev:47 rq:857149 version:2.7.5
Changes:
--------
--- /work/SRC/openSUSE:Factory/imapfilter/imapfilter.changes 2020-11-29 12:29:20.558057666 +0100
+++ /work/SRC/openSUSE:Factory/.imapfilter.new.5145/imapfilter.changes 2020-12-21 10:27:11.780236941 +0100
@@ -1,0 +2,7 @@
+Sun Dec 20 12:51:29 UTC 2020 - Dirk Müller <dmueller(a)suse.com>
+
+- update to 2.7.5:
+ * New "hostnames" option can be used to disable hostname validation.
+ * Bug fix; "certificates" option incorrectly controlled hostname validation.
+
+-------------------------------------------------------------------
Old:
----
imapfilter-2.7.4.tar.gz
New:
----
imapfilter-2.7.5.tar.gz
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Other differences:
------------------
++++++ imapfilter.spec ++++++
--- /var/tmp/diff_new_pack.C1A3ir/_old 2020-12-21 10:27:12.356237594 +0100
+++ /var/tmp/diff_new_pack.C1A3ir/_new 2020-12-21 10:27:12.360237598 +0100
@@ -17,13 +17,13 @@
Name: imapfilter
-Version: 2.7.4
+Version: 2.7.5
Release: 0
Summary: A mail filtering utility
License: MIT
Group: Productivity/Networking/Email/Utilities
URL: https://github.com/lefcha/imapfilter
-Source: %{name}-%{version}.tar.gz
+Source: https://github.com/lefcha/imapfilter/archive/v%{version}.tar.gz#/%{name}-%{…
BuildRequires: lua-devel >= 5.1
BuildRequires: openssl-devel
BuildRequires: pcre2-devel
@@ -44,7 +44,7 @@
%setup -q
%build
-make PREFIX="%{_prefix}" MANDIR="%{_mandir}" MYCFLAGS="%{optflags} -I%{lua_incdir}" %{?_smp_mflags}
+%make_build PREFIX="%{_prefix}" MANDIR="%{_mandir}" MYCFLAGS="%{optflags} -I%{lua_incdir}"
%install
%make_install PREFIX="%{_prefix}" MANDIR="%{_mandir}"
++++++ imapfilter-2.7.4.tar.gz -> imapfilter-2.7.5.tar.gz ++++++
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/imapfilter-2.7.4/NEWS new/imapfilter-2.7.5/NEWS
--- old/imapfilter-2.7.4/NEWS 2020-11-18 22:30:58.000000000 +0100
+++ new/imapfilter-2.7.5/NEWS 2020-12-05 22:35:02.000000000 +0100
@@ -1,3 +1,7 @@
+IMAPFilter 2.7.5 - 5 Dec 2020
+ - New "hostnames" option can be used to disable hostname validation.
+ - Bug fix; "certificates" option incorrectly controlled hostname validation.
+
IMAPFilter 2.7.4 - 18 Nov 2020
- Bug fix; incorrect argument to regular expression compile function.
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/imapfilter-2.7.4/doc/imapfilter_config.5 new/imapfilter-2.7.5/doc/imapfilter_config.5
--- old/imapfilter-2.7.4/doc/imapfilter_config.5 2020-11-18 22:30:58.000000000 +0100
+++ new/imapfilter-2.7.5/doc/imapfilter_config.5 2020-12-05 22:35:02.000000000 +0100
@@ -1,4 +1,4 @@
-.Dd Aug 26, 2018
+.Dd Dec 5, 2020
.Dt IMAPFILTER_CONFIG 5
.Os
.Sh NAME
@@ -126,6 +126,13 @@
.Vt boolean
as a value. Default is
.Dq true .
+.It Va hostnames
+When this option is enabled, the server hostname is validated, in
+order to verify the client is talking to the correct server. This variable
+takes a
+.Vt boolean
+as a value. Default is
+.Dq true .
.It Va info
When this options is enabled, a summary of the program's actions is printed,
while processing mailboxes. This variable takes a
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/imapfilter-2.7.4/src/lua.c new/imapfilter-2.7.5/src/lua.c
--- old/imapfilter-2.7.4/src/lua.c 2020-11-18 22:30:58.000000000 +0100
+++ new/imapfilter-2.7.5/src/lua.c 2020-12-05 22:35:02.000000000 +0100
@@ -136,6 +136,7 @@
set_table_boolean("crammd5", 1);
set_table_boolean("create", 0);
set_table_boolean("expunge", 1);
+ set_table_boolean("hostnames", 1);
set_table_number("keepalive", 29);
set_table_boolean("namespace", 1);
set_table_boolean("persist", 0);
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/imapfilter-2.7.4/src/socket.c new/imapfilter-2.7.5/src/socket.c
--- old/imapfilter-2.7.4/src/socket.c 2020-11-18 22:30:58.000000000 +0100
+++ new/imapfilter-2.7.5/src/socket.c 2020-12-05 22:35:02.000000000 +0100
@@ -142,7 +142,7 @@
if (!(ssn->sslconn = SSL_new(ctx)))
goto fail;
- if (get_option_boolean("certificates")) {
+ if (get_option_boolean("hostnames")) {
#if OPENSSL_VERSION_NUMBER >= 0x10100000L
SSL_set_hostflags(ssn->sslconn,
X509_CHECK_FLAG_NO_PARTIAL_WILDCARDS);
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/imapfilter-2.7.4/src/version.h new/imapfilter-2.7.5/src/version.h
--- old/imapfilter-2.7.4/src/version.h 2020-11-18 22:30:58.000000000 +0100
+++ new/imapfilter-2.7.5/src/version.h 2020-12-05 22:35:02.000000000 +0100
@@ -3,7 +3,7 @@
/* Program's version number. */
-#define VERSION "2.7.4"
+#define VERSION "2.7.5"
/* Program's copyright. */
#define COPYRIGHT "Copyright (c) 2001-2020 Eleftherios Chatzimparmpas"
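For reference, the option introduced in 2.7.5 is set from the imapfilter Lua configuration file. A minimal sketch (the account details are placeholders, not taken from the commit), using the documented `options` table:

```lua
-- Disable hostname validation (new in 2.7.5). Certificate verification
-- is now governed separately by options.certificates.
options.hostnames = false

account = IMAP {
    server = 'imap.example.com',
    username = 'user',
    password = 'secret',
}
```

This is exactly the behaviour the bug fix above separates: before 2.7.5, `certificates` incorrectly controlled hostname validation as well.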
21 Dec '20
Hello community,
here is the log from the commit of package arpack-ng for openSUSE:Factory checked in at 2020-12-21 10:24:39
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/arpack-ng (Old)
and /work/SRC/openSUSE:Factory/.arpack-ng.new.5145 (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Package is "arpack-ng"
Mon Dec 21 10:24:39 2020 rev:16 rq:857140 version:3.8.0
Changes:
--------
--- /work/SRC/openSUSE:Factory/arpack-ng/arpack-ng.changes 2020-07-01 14:26:27.686724115 +0200
+++ /work/SRC/openSUSE:Factory/.arpack-ng.new.5145/arpack-ng.changes 2020-12-21 10:27:10.740235762 +0100
@@ -1,0 +2,63 @@
+Thu Dec 10 00:31:22 UTC 2020 - Atri Bhattacharya <badshah400(a)gmail.com>
+
+- Update to version 3.8.0:
+ * Bug fixes:
+ - bmat return "G" instead of "B" for generalized matrix in
+ arpack.hpp.
+ - pass arrays of chars as scalar in fortran calls in order not
+ to crash when calling subroutines through icb interface.
+ - fix 'Unknown CMake command "check_symbol_exists".' when
+ ICB=ON.
+ - fix arpackdef.h (resp. arpackicb.h) must be included only by
+ C/C++ (resp. F77/F90).
+ - iparam/ipntr sizes may change depending on cases.
+ - ILP64 support: using debug_c and stat_c.
+ - fix check precision which may fail with some ATLAS versions.
+ - fix 'eval: Syntax error: "(" unexpected' error at build
+ time.
+ - ICB using rvec/select: rvec/select turned to integer bool
+ should be, but, is not always supported (depend on compiler,
+ options).
+ * arpackmm:
+ - extract arpackSolver.hpp from arpackmm.cpp.
+ - arpackSolver/arpackmm: switch eigen version to 3.3.
+ - arpackmm: add --slvItrPC option (PC: Jacobi, ILU).
+ - arpackmm: add --slv LLT LDLT (for SPD matrices).
+ - arpackmm: add --simplePrec option (to enable use of s*upd).
+ - arpackmm: add --dense option.
+ * pyarpack: python binding based on Boost.Python.Numpy exposing
+ C++ API.
+ * autotools: provide *.cmake files (in addition to *.pc file).
+ * Only build shared libraries by default. To build static
+ libraries, use --enable-static (autotools) or
+ -DBUILD_SHARED_LIBS=OFF (cmake).
+ * [CLEAN] arpackSolver API: more convenient, suppress template
+ parameters when possible.
+ * Add parpack.pc and arpackSolver.pc.
+ * Support of gfortran 10.
+- Drop patches incorporated upstream:
+ * arpack-ng-gcc10.patch.
+ * arpack-ng-double-comparison.patch.
+- New patches:
+ * arpack-ng-python-module-installdir.patch to move python
+ module to standard python sitearch.
+- Enable pyarpack: a python interface for arpack-ng, and split it
+ out into a new package: python3-arpack-ng; this module is only
+ built once -- for the serial flavor:
+ * Disabled for i586 (gh#opencollab/arpack-ng#289).
+ * Disabled for openSUSE < 1550: boost too old.
+- Add _constraints to allow enough memory (12 GB) and disk size
+ (3 GB) required to build pyarpack (only for x86_64, aarch64).
+- Switch to building with cmake:
+ * Add BuildRequires: cmake.
+ * Pass -DCMAKE_INSTALL_<foo> options to suggest correct install
+ paths for different flavors.
+ * CMAKE_CXX_COMPILER_VERSION: GCC version is required to build
+ pyarpack
+ * To work around broken rpath handling in Leap 15.2's macros,
+ pass `-DCMAKE_SKIP_RPATH=OFF -DCMAKE_SKIP_INSTALL_RPATH=ON`
+ (do this on all distro versions as it doesn't hurt)
+- Use arpack-ng-%{version} as the naming format for the source
+ tarball.
+
+-------------------------------------------------------------------
Old:
----
3.7.0.tar.gz
arpack-ng-double-comparison.patch
arpack-ng-gcc10.patch
New:
----
_constraints
arpack-ng-3.8.0.tar.gz
arpack-ng-python-module-installdir.patch
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Other differences:
------------------
++++++ arpack-ng.spec ++++++
--- /var/tmp/diff_new_pack.qlYpF1/_old 2020-12-21 10:27:11.528236655 +0100
+++ /var/tmp/diff_new_pack.qlYpF1/_new 2020-12-21 10:27:11.528236655 +0100
@@ -56,6 +56,12 @@
%{bcond_with hpc}
%endif
+%if "%flavor" == "openmpi4"
+%define mpi_family openmpi
+%define mpi_ver 4
+%{bcond_with hpc}
+%endif
+
# openmpi 1 was called just "openmpi" in Leap 15.x/SLE15
%if 0%{?suse_version} >= 1550 || "%{mpi_family}" != "openmpi" || "%{mpi_ver}" != "1"
%define mpi_ext %{?mpi_ver}
@@ -77,27 +83,54 @@
%global my_incdir %{_includedir}
%endif
+# 3. OOM on i586: https://github.com/opencollab/arpack-ng/issues/289
+%ifarch i586
+%bcond_with pyarpack
+%else
+# 2. Boost too old for 15.2 and earlier
+%if 0%{?suse_version} < 1550
+%bcond_with pyarpack
+%else
+# 1. Build python module once: for serial flavor only
+%if %{with mpi}
+%bcond_with pyarpack
+%else
+%bcond_without pyarpack
+%endif
+# /1
+%endif
+# /2
+%endif
+# /3
+
Name: %{pkgname}
-Version: 3.7.0
+Version: 3.8.0
Release: 0
Summary: Fortran77 subroutines for solving large scale eigenvalue problems
License: BSD-3-Clause
Group: System/Libraries
URL: https://github.com/opencollab/arpack-ng
-Source0: https://github.com/opencollab/arpack-ng/archive/%{version}.tar.gz
-# PATCH-FIX-UPSTREAM arpack-ng-gcc10.patch gh#opencollab/arpack-ng#239 gh#opencollab/arpack-ng#245 badshah400(a)gmail.com -- Fix building against GCC 10, patches taken from upstream commits
-Patch0: arpack-ng-gcc10.patch
-# PATCH-FIX-UPSTREAM arpack-ng-double-comparison.patch gh#opencollab/arpack-ng#269 badshah400(a)gmail.com -- Compare difference to zerop to test float equivalence in TESTS/bug_79_double_complex.f; fixes build failure for i586
-Patch1: arpack-ng-double-comparison.patch
+Source0: https://github.com/opencollab/arpack-ng/archive/%{version}.tar.gz#/arpack-n…
+# PATCH-FEATURE-OPENSUSE arpack-ng-python-module-installdir.patch badshah400(a)gmail.com -- Install python module to standard python sitearch instead of libdir
+Patch0: arpack-ng-python-module-installdir.patch
%if %{with mpi}
BuildRequires: %{mpi_family}%{?mpi_ext}-devel
%endif
-BuildRequires: autoconf
BuildRequires: blas-devel
+BuildRequires: cmake
+BuildRequires: gcc-c++
BuildRequires: gcc-fortran
BuildRequires: lapack-devel
+BuildRequires: libopenblas_pthreads-devel
BuildRequires: libtool
BuildRequires: pkg-config
+BuildRequires: pkgconfig(eigen3)
+%if %{with pyarpack}
+BuildRequires: libboost_numpy3-devel
+BuildRequires: libboost_python3-devel
+BuildRequires: python3-devel
+BuildRequires: python3-numpy
+%endif
%description
ARPACK is a collection of Fortran77 subroutines designed to solve
@@ -139,10 +172,17 @@
large scale eigenvalue problems. This package contains the so
library links used for building arpack based applications.
+%package -n python3-%{name}
+Summary: Python bindings for ARPACK
+Group: Development/Libraries/Python
+
+%description -n python3-%{name}
+ARPACK is a collection of Fortran77 subroutines designed to solve
+large scale eigenvalue problems. This package provides the python
+bindings for ARPACK.
+
%prep
-%setup -q -n arpack-ng-%{version}
-%patch0 -p1
-%patch1 -p1
+%autosetup -p1 -n arpack-ng-%{version}
# create baselibs.conf based on flavor
cat > %{_sourcedir}/baselibs.conf <<EOF
@@ -161,22 +201,26 @@
export CXXFLAGS="%{optflags} -fPIC"
%if %{with mpi}
+source %{my_prefix}/bin/mpivars.sh
+export CC=%{my_prefix}/bin/mpicc
+export CXX=%{my_prefix}/bin/mpic++
export F77=%{my_prefix}/bin/mpif77
export MPIF77=%{my_prefix}/bin/mpif77
export LD_LIBRARY_PATH=%{my_prefix}/%{_lib}
%endif
-%global orig_prefix %{_prefix}
-%define _prefix %{my_prefix}
-sh bootstrap
-%configure --disable-static \
- %{?with_mpi: --enable-mpi} \
- %{nil}
-%define _prefix %{orig_prefix}
-make %{?_smp_mflags}
+%cmake \
+ -DCMAKE_INSTALL_PREFIX:PATH=%{my_prefix} \
+ -DCMAKE_INSTALL_LIBDIR:PATH=%{my_libdir} \
+ -DCMAKE_SKIP_RPATH:BOOL=OFF \
+ -DCMAKE_SKIP_INSTALL_RPATH:BOOL=ON \
+ -DCMAKE_CXX_COMPILER_VERSION=$(gcc -dumpfullversion) \
+ -DMPI:BOOL=%{?with_mpi:ON}%{!?with_mpi:OFF} \
+ -DPYTHON3:BOOL=%{?with_pyarpack:ON}%{!?with_pyarpack:OFF}
+%cmake_build
%install
-%make_install
+%cmake_install
find %{buildroot} -type f -name "*.la" -delete -print
# Remove sequential version files
@@ -189,9 +233,9 @@
%check
%if %{with mpi}
-export PATH="%{my_prefix}/bin/:$PATH"
+source %{my_prefix}/bin/mpivars.sh
%endif
-%make_build check
+%ctest
%post -n %{libname} -p /sbin/ldconfig
%postun -n %{libname} -p /sbin/ldconfig
@@ -208,6 +252,14 @@
%if %{without mpi}
%dir %{_libdir}/pkgconfig
%{_libdir}/pkgconfig/*.pc
+%else
+%dir %{my_libdir}/cmake
+%endif
+%{my_libdir}/cmake/arpack-ng/
+
+%if %{with pyarpack}
+%files -n python3-%{name}
+%{python3_sitearch}/*.so
%endif
%changelog
++++++ _constraints ++++++
<?xml version="1.0" encoding="UTF-8"?>
<constraints>
<!-- Only for x86_64, aarch64 where pyarpack is built -->
<overwrite>
<conditions>
<arch>x86_64</arch>
<arch>aarch64</arch>
</conditions>
<hardware>
<disk>
<size unit="G">3</size>
</disk>
<physicalmemory>
<size unit="G">12</size>
</physicalmemory>
</hardware>
</overwrite>
</constraints>
++++++ arpack-ng-python-module-installdir.patch ++++++
Index: arpack-ng-3.8.0/CMakeLists.txt
===================================================================
--- arpack-ng-3.8.0.orig/CMakeLists.txt
+++ arpack-ng-3.8.0/CMakeLists.txt
@@ -608,8 +608,8 @@ if(ICB)
target_include_directories(pyarpack PUBLIC ${pyarpack_HDR} ${EIGEN3_INCLUDE_DIR} ${Boost_INCLUDE_DIRS} ${PYTHON_INCLUDE_DIRS})
target_link_libraries(pyarpack BLAS::BLAS LAPACK::LAPACK ${Boost_LIBRARIES} ${PYTHON_LIBRARIES})
install(TARGETS pyarpack
- ARCHIVE DESTINATION ${CMAKE_INSTALL_LIBDIR}/pyarpack
- LIBRARY DESTINATION ${CMAKE_INSTALL_LIBDIR}/pyarpack)
+ ARCHIVE DESTINATION ${CMAKE_INSTALL_LIBDIR}/python${PYTHON_VERSION_MAJOR}.${PYTHON_VERSION_MINOR}/site-packages
+ LIBRARY DESTINATION ${CMAKE_INSTALL_LIBDIR}/python${PYTHON_VERSION_MAJOR}.${PYTHON_VERSION_MINOR}/site-packages)
configure_file("${PROJECT_SOURCE_DIR}/EXAMPLES/PYARPACK/pyarpackSparseBiCGDiag.py.in" "${CMAKE_BINARY_DIR}/pyarpackSparseBiCGDiag.py" @ONLY)
add_test(NAME pyarpackSparseBiCGDiag_tst COMMAND ${PYTHON_EXECUTABLE} pyarpackSparseBiCGDiag.py)
set_tests_properties(pyarpackSparseBiCGDiag_tst PROPERTIES ENVIRONMENT PYTHONPATH=${CMAKE_BINARY_DIR}/lib:$ENV{PYTHONPATH})
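The nested conditionals in arpack-ng.spec above rely on rpm's bcond macros, whose naming is somewhat counter-intuitive. A standalone reminder sketch, not taken from the package:

```spec
# %bcond_with X    : build WITHOUT X by default, expose a --with X option
# %bcond_without X : build WITH X by default, expose a --without X option
%bcond_with pyarpack

%if %{with pyarpack}
BuildRequires:  python3-devel
%endif
```

So `%bcond_with pyarpack` in the branches above turns the feature off, while the innermost `%bcond_without pyarpack` is the only path on which the python module is actually built.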
21 Dec '20
Hello community,
here is the log from the commit of package kokkos for openSUSE:Factory checked in at 2020-12-21 10:24:37
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/kokkos (Old)
and /work/SRC/openSUSE:Factory/.kokkos.new.5145 (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Package is "kokkos"
Mon Dec 21 10:24:37 2020 rev:4 rq:857139 version:3.3.00
Changes:
--------
--- /work/SRC/openSUSE:Factory/kokkos/kokkos.changes 2020-09-25 16:35:20.624093240 +0200
+++ /work/SRC/openSUSE:Factory/.kokkos.new.5145/kokkos.changes 2020-12-21 10:27:09.432234279 +0100
@@ -1,0 +2,141 @@
+Sat Dec 19 15:05:39 UTC 2020 - Christoph Junghans <junghans(a)votca.org>
+
+- dropped 3308.patch - merged upstream
+- Version bump to 3.3.00:
+ - Features:
+ - Require C++14 as minimum C++ standard. C++17 and C++20 are
+ supported too.
+ - HIP backend is nearly feature complete. Kokkos Dynamic Task
+ Graphs are missing.
+ - Major update for OpenMPTarget: many capabilities now work.
+ For details contact us.
+ - Added DPC++/SYCL backend: primary capabilities are working.
+ - Added Kokkos Graph API analogous to CUDA Graphs.
+ - Added parallel_scan support with TeamThreadRange
+ [gh#kokkos/kokkos#3536]
+ - Added Logical Memory Spaces [gh#kokkos/kokkos#3546]
+ - Added initial half precision support [gh#kokkos/kokkos#3439]
+ - Experimental feature: control cuda occupancy
+ [gh#kokkos/kokkos#3379]
+ - Implemented enhancements Backends and Archs:
+ - Add a64fx and fujitsu Compiler support
+ [gh#kokkos/kokkos#3614]
+ - Adding support for AMD gfx908 architecture
+ [gh#kokkos/kokkos#3375]
+ - SYCL parallel_for MDRangePolicy [gh#kokkos/kokkos#3583]
+ - SYCL add parallel_scan [gh#kokkos/kokkos#3577]
+ - SYCL custom reductions [gh#kokkos/kokkos#3544]
+ - SYCL Enable container unit tests [gh#kokkos/kokkos#3550]
+ - SYCL feature level 5 [gh#kokkos/kokkos#3480]
+ - SYCL Feature level 4 (parallel_for) [gh#kokkos/kokkos#3474]
+ - SYCL feature level 3 [gh#kokkos/kokkos#3451]
+ - SYCL feature level 2 [gh#kokkos/kokkos#3447]
+ - OpenMPTarget: Hierarchical reduction for + operator on
+ scalars [gh#kokkos/kokkos#3504]
+ - OpenMPTarget hierarchical [gh#kokkos/kokkos#3411]
+ - HIP Add Impl::atomic_[store,load] [gh#kokkos/kokkos#3440]
+ - HIP enable global lock arrays [gh#kokkos/kokkos#3418]
+ - HIP Implement multiple occupancy paths for various HIP
+ kernel launchers [gh#kokkos/kokkos#3366]
+ - Implemented enhancements Policies:
+ - MDRangePolicy: Let it be semiregular [gh#kokkos/kokkos#3494]
+ - MDRangePolicy: Check narrowing conversion in construction
+ [gh#kokkos/kokkos#3527]
+ - MDRangePolicy: CombinedReducers support
+ [gh#kokkos/kokkos#3395]
+ - Kokkos Graph: Interface and Default Implementation
+ [gh#kokkos/kokkos#3362]
+ - Kokkos Graph: add Cuda Graph implementation
+ [gh#kokkos/kokkos#3369]
+ - TeamPolicy: implemented autotuning of team sizes and vector
+ lengths [gh#kokkos/kokkos#3206]
+ - RangePolicy: Initialize all data members in default
+ constructor [gh#kokkos/kokkos#3509]
+ - Implemented enhancements BuildSystem:
+ - Auto-generate core test files for all backends
+ [gh#kokkos/kokkos#3488]
+ - Avoid rewriting test files when calling cmake
+ [gh#kokkos/kokkos#3548]
+ - RULE_LAUNCH_COMPILE and RULE_LAUNCH_LINK system for
+ nvcc_wrapper [gh#kokkos/kokkos#3136]
+ - Adding -include as a known argument to nvcc_wrapper
+ [gh#kokkos/kokkos#3434]
+ - Install hpcbind script [gh#kokkos/kokkos#3402]
+ - cmake/kokkos_tribits.cmake: add parsing for args
+ [gh#kokkos/kokkos#3457]
+ - Implemented enhancements Tools:
+ - Changed namespacing of
+ Kokkos::Tools::Impl::Impl::tune_policy
+ [gh#kokkos/kokkos#3455]
+ - Delegate to an impl allocate/deallocate method to allow
+ specifying a SpaceHandle for MemorySpaces
+ [gh#kokkos/kokkos#3530]
+ - Use the Kokkos Profiling interface rather than the Impl
+ interface [gh#kokkos/kokkos#3518]
+ - Runtime option for tuning [gh#kokkos/kokkos#3459]
+ - Dual View Tool Events [gh#kokkos/kokkos#3326]
+ - Implemented enhancements Other:
+ - Abort on errors instead of just printing
+ [gh#kokkos/kokkos#3528]
+ - Enable C++14 macros unconditionally [gh#kokkos/kokkos#3449]
+ - Make ViewMapping trivially copyable [gh#kokkos/kokkos#3436]
+ - Rename struct ViewMapping to class [gh#kokkos/kokkos#3435]
+ - Replace enums in Kokkos_ViewMapping.hpp (removes -Wextra)
+ [gh#kokkos/kokkos#3422]
+ - Use bool for enums representing bools
+ [gh#kokkos/kokkos#3416]
+ - Fence active instead of default execution space instances
+ [gh#kokkos/kokkos#3388]
+ - Refactor parallel_reduce fence usage [gh#kokkos/kokkos#3359]
+ - Moved Space EBO helpers to Kokkos_EBO
+ [gh#kokkos/kokkos#3357]
+ - Add remove_cvref type trait [gh#kokkos/kokkos#3340]
+ - Adding identity type traits and update definition of
+ identity_t alias [gh#kokkos/kokkos#3339]
+ - Add is_specialization_of type trait [gh#kokkos/kokkos#3338]
+ - Make ScratchMemorySpace semi-regular [gh#kokkos/kokkos#3309]
+ - Optimize min/max atomics with early exit on no-op case
+ [gh#kokkos/kokkos#3265]
+ - Refactor Backend Development [gh#kokkos/kokkos#2941]
+ - Fixed bugs:
+ - Fixup MDRangePolicy construction from Kokkos arrays
+ [gh#kokkos/kokkos#3591]
+ - Add atomic functions for unsigned long long using gcc
+ built-in [gh#kokkos/kokkos#3588]
+ - Fixup silent pointless comparison with zero in
+ checked_narrow_cast (compiler workaround)
+ [gh#kokkos/kokkos#3566]
+ - Fixes for ROCm 3.9 [gh#kokkos/kokkos#3565]
+ - Fix windows build issues which crept in for the CUDA build
+ [gh#kokkos/kokkos#3532]
+ - HIP Fix atomics of large data types and clean up lock arrays
+ [gh#kokkos/kokkos#3529]
+ - Pthreads fix exception resulting from 0 grain size
+ [gh#kokkos/kokkos#3510]
+ - Fixup do not require atomic operation to be default
+ constructible [gh#kokkos/kokkos#3503]
+ - Fix race condition in HIP backend [gh#kokkos/kokkos#3467]
+ - Replace KOKKOS_DEBUG with KOKKOS_ENABLE_DEBUG
+ [gh#kokkos/kokkos#3458]
+ - Fix multi-stream team scratch space definition for HIP
+ [gh#kokkos/kokkos#3398]
+ - HIP fix template deduction [gh#kokkos/kokkos#3393]
+ - Fix compiling with HIP and C++17 [gh#kokkos/kokkos#3390]
+ - Fix sigFPE in HIP blocksize deduction
+ [gh#kokkos/kokkos#3378]
+ - Type alias change: replace CS with CTS to avoid conflicts
+ with NVSHMEM [gh#kokkos/kokkos#3348]
+ - Clang compilation of CUDA backend on Windows
+ [gh#kokkos/kokkos#3345]
+ - Fix HBW support [gh#kokkos/kokkos#3343]
+ - Added missing fences to unique token [gh#kokkos/kokkos#3260]
+ - Incompatibilities:
+ - Remove unused utilities (forward, move, and expand_variadic)
+ from Kokkos::Impl [gh#kokkos/kokkos#3535]
+ - Remove unused traits [gh#kokkos/kokkos#3534]
+ - HIP: Remove old HCC code [gh#kokkos/kokkos#3301]
+ - Prepare for deprecation of ViewAllocateWithoutInitializing
+ [gh#kokkos/kokkos#3264]
+ - Remove ROCm backend [gh#kokkos/kokkos#3148]
+
+-------------------------------------------------------------------
Old:
----
3308.patch
kokkos-3.2.00.tar.gz
New:
----
kokkos-3.3.00.tar.gz
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Other differences:
------------------
++++++ kokkos.spec ++++++
--- /var/tmp/diff_new_pack.bQgt45/_old 2020-12-21 10:27:09.976234896 +0100
+++ /var/tmp/diff_new_pack.bQgt45/_new 2020-12-21 10:27:09.976234896 +0100
@@ -18,7 +18,7 @@
Name: kokkos
-Version: 3.2.00
+Version: 3.3.00
Release: 0
%define sover 3
Summary: A C++ Performance Portability Programming
@@ -29,8 +29,6 @@
URL: https://github.com/kokkos/kokkos
Source0: https://github.com/kokkos/kokkos/archive/%{version}.tar.gz#/%{name}-%{versi…
-# PATCH-FIX-UPSTREAM 3308.patch - fix naming of printer-tool
-Patch0: https://github.com/kokkos/kokkos/pull/3308.patch
BuildRequires: cmake >= 3.0
BuildRequires: gcc-c++
@@ -70,7 +68,6 @@
%prep
%setup -q
-%patch0 -p1
%build
%{cmake} \
@@ -103,5 +100,7 @@
%{_libdir}/cmake/Kokkos
%{_includedir}/kokkos
%{_bindir}/nvcc_wrapper
+%{_bindir}/hpcbind
+%{_bindir}/kokkos_launch_compiler
%changelog
++++++ kokkos-3.2.00.tar.gz -> kokkos-3.3.00.tar.gz ++++++
++++ 98463 lines of diff (skipped)
Hello community,
here is the log from the commit of package HepMC for openSUSE:Factory checked in at 2020-12-21 10:24:35
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/HepMC (Old)
and /work/SRC/openSUSE:Factory/.HepMC.new.5145 (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Package is "HepMC"
Mon Dec 21 10:24:35 2020 rev:10 rq:857138 version:3.2.3
Changes:
--------
--- /work/SRC/openSUSE:Factory/HepMC/HepMC.changes 2020-07-17 20:54:31.265150035 +0200
+++ /work/SRC/openSUSE:Factory/.HepMC.new.5145/HepMC.changes 2020-12-21 10:27:08.392233100 +0100
@@ -1,0 +2,17 @@
+Thu Dec 17 23:32:40 UTC 2020 - Atri Bhattacharya <badshah400(a)gmail.com>
+
+- Update to version 3.2.3:
+ * Documentation and copyright years were updated.
+ * Python bindings were regenerated with binder 1.1.0 and the
+ pybind11 copy updated to 2.6.0.
+ * Removed the #ifdefs around functions that had to be excluded
+ from bindings because binder 1.0.0 failed to exclude them.
+ * A python test with attributes was added.
+ * ReaderLHEF was fixed to treat more complicated input
+ correctly.
+ * Included updates to many python scripts.
+ * Included updates to CMake scripts.
+- Run tests; need to pass CMAKE_SKIP_RPATH=OFF to cmake to fix
+ rpath handling in Leap 15.2 (and doesn't hurt generally).
+
+-------------------------------------------------------------------
Old:
----
HepMC3-3.2.2.tar.gz
New:
----
HepMC3-3.2.3.tar.gz
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Other differences:
------------------
++++++ HepMC.spec ++++++
--- /var/tmp/diff_new_pack.5yH2FN/_old 2020-12-21 10:27:09.220234039 +0100
+++ /var/tmp/diff_new_pack.5yH2FN/_new 2020-12-21 10:27:09.224234044 +0100
@@ -21,7 +21,7 @@
Name: HepMC
%define lname libHepMC3-1
-Version: 3.2.2
+Version: 3.2.3
Release: 0
Summary: An event record for High Energy Physics Monte Carlo Generators in C++
# Python bindings are BSD-3-Clause, packaged separately
@@ -112,8 +112,10 @@
-DCMAKE_INSTALL_DOCDIR:PATH=%{_docdir}/%{name} \
-DHEPMC3_BUILD_STATIC_LIBS:BOOL=OFF \
-DHEPMC3_PYTHON_VERSIONS:STRING="%{py3_ver}" \
+ -DCMAKE_SKIP_RPATH:BOOL=OFF \
-DCMAKE_SKIP_INSTALL_RPATH:BOOL=ON \
- -DHEPMC3_BUILD_EXAMPLES:BOOL=ON
+ -DHEPMC3_BUILD_EXAMPLES:BOOL=ON \
+ -DHEPMC3_ENABLE_TEST:BOOL=ON
%cmake_build
@@ -122,6 +124,9 @@
%fdupes %{buildroot}%{_docdir}/%{name}/
+%check
+%ctest
+
%post -n %{lname} -p /sbin/ldconfig
%postun -n %{lname} -p /sbin/ldconfig
@@ -131,7 +136,7 @@
%files devel
%license LICENCE COPYING
-%doc README* ChangeLog DESIGN
+%doc README* ChangeLog
%{_bindir}/HepMC3-config
%{_libdir}/libHepMC3.so
%{_libdir}/libHepMC3search.so
++++++ HepMC3-3.2.2.tar.gz -> HepMC3-3.2.3.tar.gz ++++++
/work/SRC/openSUSE:Factory/HepMC/HepMC3-3.2.2.tar.gz /work/SRC/openSUSE:Factory/.HepMC.new.5145/HepMC3-3.2.3.tar.gz differ: char 13, line 1
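The changelog's rpath note corresponds to the two CMAKE_SKIP_* flags in the spec diff above: rpaths are kept in the build tree so the test binaries can locate libHepMC3.so when %ctest runs, but are stripped at install time, as distro policy requires. Outside the spec file, the same arrangement looks roughly like this (a sketch; directory names are assumptions, not from the package):

```shell
# Keep build-tree rpaths (OFF = do not skip them), so tests run from
# the build directory without LD_LIBRARY_PATH tricks, but skip rpaths
# in the installed binaries (ON).
cmake -S . -B build \
  -DCMAKE_SKIP_RPATH:BOOL=OFF \
  -DCMAKE_SKIP_INSTALL_RPATH:BOOL=ON \
  -DHEPMC3_ENABLE_TEST:BOOL=ON
cmake --build build
# ctest finds libHepMC3.so via the build-tree rpath.
ctest --test-dir build
```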
Hello community,
here is the log from the commit of package qalculate-gtk for openSUSE:Factory checked in at 2020-12-21 10:24:33
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/qalculate-gtk (Old)
and /work/SRC/openSUSE:Factory/.qalculate-gtk.new.5145 (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Package is "qalculate-gtk"
Mon Dec 21 10:24:33 2020 rev:6 rq:857100 version:3.15.0
Changes:
--------
--- /work/SRC/openSUSE:Factory/qalculate-gtk/qalculate-gtk.changes 2020-12-03 18:43:09.406191615 +0100
+++ /work/SRC/openSUSE:Factory/.qalculate-gtk.new.5145/qalculate-gtk.changes 2020-12-21 10:27:07.208231759 +0100
@@ -1,0 +2,12 @@
+Thu Dec 10 07:38:05 UTC 2020 - Paolo Stivanin <info(a)paolostivanin.com>
+
+- Update to version 3.15.0:
+ * Replace equals button with a clickable icon in the upper right
+ corner of the expression entry
+ * Optional extra column of customizable keypad buttons
+ * Use icons for "Value", "Text", and "Copy" in history view and
+ hide all buttons when keypad is shown simultaneously
+ * Add "Exact" menu item to result popup menu when appropriate
+ * Fix segfault in unit manager
+
+-------------------------------------------------------------------
Old:
----
qalculate-gtk-3.14.0.tar.gz
New:
----
qalculate-gtk-3.15.0.tar.gz
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Other differences:
------------------
++++++ qalculate-gtk.spec ++++++
--- /var/tmp/diff_new_pack.WD5ifQ/_old 2020-12-21 10:27:07.780232407 +0100
+++ /var/tmp/diff_new_pack.WD5ifQ/_new 2020-12-21 10:27:07.784232411 +0100
@@ -17,7 +17,7 @@
Name: qalculate-gtk
-Version: 3.14.0
+Version: 3.15.0
Release: 0
Summary: Multi-purpose cross-platform desktop calculator
License: GPL-2.0-or-later
++++++ qalculate-gtk-3.14.0.tar.gz -> qalculate-gtk-3.15.0.tar.gz ++++++
++++ 28122 lines of diff (skipped)
Hello community,
here is the log from the commit of package youtube-dl for openSUSE:Factory checked in at 2020-12-21 10:24:31
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/youtube-dl (Old)
and /work/SRC/openSUSE:Factory/.youtube-dl.new.5145 (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Package is "youtube-dl"
Mon Dec 21 10:24:31 2020 rev:152 rq:857759 version:2020.12.14
Changes:
--------
--- /work/SRC/openSUSE:Factory/youtube-dl/python-youtube-dl.changes 2020-12-12 20:36:15.242095541 +0100
+++ /work/SRC/openSUSE:Factory/.youtube-dl.new.5145/python-youtube-dl.changes 2020-12-21 10:27:05.416229727 +0100
@@ -1,0 +2,10 @@
+Sun Dec 13 19:35:10 UTC 2020 - Jan Engelhardt <jengelh(a)inai.de>
+
+- Update to release 2020.12.14
+ * youtube: Add some invidious instances
+ * itv: clean description from HTML tags
+ * linuxacademy: Fix authentication and extraction
+ * downloader/hls: Delegate manifests with media initialization
+ to ffmpeg
+
+-------------------------------------------------------------------
youtube-dl.changes: same change
Old:
----
youtube-dl-2020.12.12.tar.gz
youtube-dl-2020.12.12.tar.gz.sig
New:
----
youtube-dl-2020.12.14.tar.gz
youtube-dl-2020.12.14.tar.gz.sig
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Other differences:
------------------
++++++ python-youtube-dl.spec ++++++
--- /var/tmp/diff_new_pack.ntxBsv/_old 2020-12-21 10:27:06.596231065 +0100
+++ /var/tmp/diff_new_pack.ntxBsv/_new 2020-12-21 10:27:06.600231069 +0100
@@ -19,7 +19,7 @@
%define modname youtube-dl
%{?!python_module:%define python_module() python-%{**} python3-%{**}}
Name: python-youtube-dl
-Version: 2020.12.12
+Version: 2020.12.14
Release: 0
Summary: A Python module for downloading from video sites for offline watching
License: SUSE-Public-Domain AND CC-BY-SA-3.0
++++++ youtube-dl.spec ++++++
--- /var/tmp/diff_new_pack.ntxBsv/_old 2020-12-21 10:27:06.628231101 +0100
+++ /var/tmp/diff_new_pack.ntxBsv/_new 2020-12-21 10:27:06.628231101 +0100
@@ -17,7 +17,7 @@
Name: youtube-dl
-Version: 2020.12.12
+Version: 2020.12.14
Release: 0
Summary: A tool for downloading from video sites for offline watching
License: SUSE-Public-Domain AND CC-BY-SA-3.0
++++++ youtube-dl-2020.12.12.tar.gz -> youtube-dl-2020.12.14.tar.gz ++++++
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/youtube-dl/ChangeLog new/youtube-dl/ChangeLog
--- old/youtube-dl/ChangeLog 2020-12-12 01:09:56.000000000 +0100
+++ new/youtube-dl/ChangeLog 2020-12-13 18:57:08.000000000 +0100
@@ -1,3 +1,30 @@
+version 2020.12.14
+
+Core
+* [extractor/common] Improve JSON-LD interaction statistic extraction (#23306)
+* [downloader/hls] Delegate manifests with media initialization to ffmpeg
++ [extractor/common] Document duration meta field for playlists
+
+Extractors
+* [mdr] Bypass geo restriction
+* [mdr] Improve extraction (#24346, #26873)
+* [yandexmusic:album] Improve album title extraction (#27418)
+* [eporner] Fix view count extraction and make optional (#23306)
++ [eporner] Extend URL regular expression
+* [eporner] Fix hash extraction and extend _VALID_URL (#27396)
+* [slideslive] Use m3u8 entry protocol for m3u8 formats (#27400)
+* [twitcasting] Fix format extraction and improve info extraction (#24868)
+* [linuxacademy] Fix authentication and extraction (#21129, #26223, #27402)
+* [itv] Clean description from HTML tags (#27399)
+* [vlive] Sort live formats (#27404)
+ * [hotstar] Fix and improve extraction
+ * Fix format extraction (#26690)
+ + Extract thumbnail URL (#16079, #20412)
+ + Add support for country specific playlist URLs (#23496)
+ * Select the last id in video URL (#26412)
++ [youtube] Add some invidious instances (#27373)
+
+
version 2020.12.12
Core
@@ -106,7 +133,7 @@
Extractors
+ [tva] Add support for qub.ca (#27235)
-+ [toggle] Detect DRM protected videos (closes #16479)(closes #20805)
++ [toggle] Detect DRM protected videos (#16479, #20805)
+ [toggle] Add support for new MeWatch URLs (#27256)
* [youtube:tab] Extract channels only from channels tab (#27266)
+ [cspan] Extract info from jwplayer data (#3672, #3734, #10638, #13030,
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/youtube-dl/test/test_InfoExtractor.py new/youtube-dl/test/test_InfoExtractor.py
--- old/youtube-dl/test/test_InfoExtractor.py 2020-12-12 01:08:09.000000000 +0100
+++ new/youtube-dl/test/test_InfoExtractor.py 2020-12-13 18:56:48.000000000 +0100
@@ -98,6 +98,55 @@
self.assertRaises(RegexNotFoundError, ie._html_search_meta, 'z', html, None, fatal=True)
self.assertRaises(RegexNotFoundError, ie._html_search_meta, ('z', 'x'), html, None, fatal=True)
+ def test_search_json_ld_realworld(self):
+ # https://github.com/ytdl-org/youtube-dl/issues/23306
+ expect_dict(
+ self,
+ self.ie._search_json_ld(r'''<script type="application/ld+json">
+{
+"@context": "http://schema.org/",
+"@type": "VideoObject",
+"name": "1 On 1 With Kleio",
+"url": "https://www.eporner.com/hd-porn/xN49A1cT3eB/1-On-1-With-Kleio/",
+"duration": "PT0H12M23S",
+"thumbnailUrl": ["https://static-eu-cdn.eporner.com/thumbs/static4/7/78/780/780814/9_360.jpg", "https://imggen.eporner.com/780814/1920/1080/9.jpg"],
+"contentUrl": "https://gvideo.eporner.com/xN49A1cT3eB/xN49A1cT3eB.mp4",
+"embedUrl": "https://www.eporner.com/embed/xN49A1cT3eB/1-On-1-With-Kleio/",
+"image": "https://static-eu-cdn.eporner.com/thumbs/static4/7/78/780/780814/9_360.jpg",
+"width": "1920",
+"height": "1080",
+"encodingFormat": "mp4",
+"bitrate": "6617kbps",
+"isFamilyFriendly": "False",
+"description": "Kleio Valentien",
+"uploadDate": "2015-12-05T21:24:35+01:00",
+"interactionStatistic": {
+"@type": "InteractionCounter",
+"interactionType": { "@type": "http://schema.org/WatchAction" },
+"userInteractionCount": 1120958
+}, "aggregateRating": {
+"@type": "AggregateRating",
+"ratingValue": "88",
+"ratingCount": "630",
+"bestRating": "100",
+"worstRating": "0"
+}, "actor": [{
+"@type": "Person",
+"name": "Kleio Valentien",
+"url": "https://www.eporner.com/pornstar/kleio-valentien/"
+}]}
+</script>''', None),
+ {
+ 'title': '1 On 1 With Kleio',
+ 'description': 'Kleio Valentien',
+ 'url': 'https://gvideo.eporner.com/xN49A1cT3eB/xN49A1cT3eB.mp4',
+ 'timestamp': 1449347075,
+ 'duration': 743.0,
+ 'view_count': 1120958,
+ 'width': 1920,
+ 'height': 1080,
+ })
+
def test_download_json(self):
uri = encode_data_uri(b'{"foo": "blah"}', 'application/json')
self.assertEqual(self.ie._download_json(uri, None), {'foo': 'blah'})
Binary files old/youtube-dl/youtube-dl and new/youtube-dl/youtube-dl differ
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/youtube-dl/youtube_dl/downloader/hls.py new/youtube-dl/youtube_dl/downloader/hls.py
--- old/youtube-dl/youtube_dl/downloader/hls.py 2020-12-12 01:08:09.000000000 +0100
+++ new/youtube-dl/youtube_dl/downloader/hls.py 2020-12-13 18:56:48.000000000 +0100
@@ -42,11 +42,13 @@
# no segments will definitely be appended to the end of the playlist.
# r'#EXT-X-PLAYLIST-TYPE:EVENT', # media segments may be appended to the end of
# # event media playlists [4]
+ r'#EXT-X-MAP:', # media initialization [5]
# 1. https://tools.ietf.org/html/draft-pantos-http-live-streaming-17#section-4.3…
# 2. https://tools.ietf.org/html/draft-pantos-http-live-streaming-17#section-4.3…
# 3. https://tools.ietf.org/html/draft-pantos-http-live-streaming-17#section-4.3…
# 4. https://tools.ietf.org/html/draft-pantos-http-live-streaming-17#section-4.3…
+ # 5. https://tools.ietf.org/html/draft-pantos-http-live-streaming-17#section-4.3…
)
check_results = [not re.search(feature, manifest) for feature in UNSUPPORTED_FEATURES]
is_aes128_enc = '#EXT-X-KEY:METHOD=AES-128' in manifest
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/youtube-dl/youtube_dl/extractor/common.py new/youtube-dl/youtube_dl/extractor/common.py
--- old/youtube-dl/youtube_dl/extractor/common.py 2020-12-12 01:08:09.000000000 +0100
+++ new/youtube-dl/youtube_dl/extractor/common.py 2020-12-13 18:56:48.000000000 +0100
@@ -336,8 +336,8 @@
object, each element of which is a valid dictionary by this specification.
Additionally, playlists can have "id", "title", "description", "uploader",
- "uploader_id", "uploader_url" attributes with the same semantics as videos
- (see above).
+ "uploader_id", "uploader_url", "duration" attributes with the same semantics
+ as videos (see above).
_type "multi_video" indicates that there are multiple videos that
@@ -1237,8 +1237,16 @@
'ViewAction': 'view',
}
+ def extract_interaction_type(e):
+ interaction_type = e.get('interactionType')
+ if isinstance(interaction_type, dict):
+ interaction_type = interaction_type.get('@type')
+ return str_or_none(interaction_type)
+
def extract_interaction_statistic(e):
interaction_statistic = e.get('interactionStatistic')
+ if isinstance(interaction_statistic, dict):
+ interaction_statistic = [interaction_statistic]
if not isinstance(interaction_statistic, list):
return
for is_e in interaction_statistic:
@@ -1246,8 +1254,8 @@
continue
if is_e.get('@type') != 'InteractionCounter':
continue
- interaction_type = is_e.get('interactionType')
- if not isinstance(interaction_type, compat_str):
+ interaction_type = extract_interaction_type(is_e)
+ if not interaction_type:
continue
# For interaction count some sites provide string instead of
# an integer (as per spec) with non digit characters (e.g. ",")
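The common.py hunks above harden JSON-LD `interactionStatistic` handling: sites emit it either as a single object or as a list, and `interactionType` may be a plain string or a nested `{"@type": ...}` object (as in the eporner test case earlier in the diff). A simplified standalone sketch of that normalization (not the extractor's full code; the helper name is illustrative):

```python
def extract_view_count(json_ld):
    """Return userInteractionCount for a WatchAction counter, else None."""
    stats = json_ld.get('interactionStatistic')
    if isinstance(stats, dict):
        # Single object instead of a list: wrap it.
        stats = [stats]
    if not isinstance(stats, list):
        return None
    for entry in stats:
        if not isinstance(entry, dict):
            continue
        if entry.get('@type') != 'InteractionCounter':
            continue
        itype = entry.get('interactionType')
        if isinstance(itype, dict):
            # Nested form: {"@type": "http://schema.org/WatchAction"}
            itype = itype.get('@type')
        if isinstance(itype, str) and itype.endswith('WatchAction'):
            return entry.get('userInteractionCount')
    return None

# Both shapes yield the same count:
nested = {'interactionStatistic': {
    '@type': 'InteractionCounter',
    'interactionType': {'@type': 'http://schema.org/WatchAction'},
    'userInteractionCount': 1120958,
}}
print(extract_view_count(nested))  # 1120958
```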
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/youtube-dl/youtube_dl/extractor/eporner.py new/youtube-dl/youtube_dl/extractor/eporner.py
--- old/youtube-dl/youtube_dl/extractor/eporner.py 2020-12-12 01:08:09.000000000 +0100
+++ new/youtube-dl/youtube_dl/extractor/eporner.py 2020-12-13 18:56:48.000000000 +0100
@@ -16,7 +16,7 @@
class EpornerIE(InfoExtractor):
- _VALID_URL = r'https?://(?:www\.)?eporner\.com/(?:hd-porn|embed)/(?P<id>\w+)(?:/(?P<display_id>[\w-]+))?'
+ _VALID_URL = r'https?://(?:www\.)?eporner\.com/(?:(?:hd-porn|embed)/|video-)(?P<id>\w+)(?:/(?P<display_id>[\w-]+))?'
_TESTS = [{
'url': 'http://www.eporner.com/hd-porn/95008/Infamous-Tiffany-Teen-Strip-Tease-Vide…',
'md5': '39d486f046212d8e1b911c52ab4691f8',
@@ -43,7 +43,10 @@
'url': 'http://www.eporner.com/hd-porn/3YRUtzMcWn0',
'only_matching': True,
}, {
- 'url': 'http://www.eporner.com/hd-porn/3YRUtzMcWn0',
+ 'url': 'http://www.eporner.com/embed/3YRUtzMcWn0',
+ 'only_matching': True,
+ }, {
+ 'url': 'https://www.eporner.com/video-FJsA19J3Y3H/one-of-the-greats/',
'only_matching': True,
}]
@@ -57,7 +60,7 @@
video_id = self._match_id(urlh.geturl())
hash = self._search_regex(
- r'hash\s*:\s*["\']([\da-f]{32})', webpage, 'hash')
+ r'hash\s*[:=]\s*["\']([\da-f]{32})', webpage, 'hash')
title = self._og_search_title(webpage, default=None) or self._html_search_regex(
r'<title>(.+?) - EPORNER', webpage, 'title')
@@ -115,8 +118,8 @@
duration = parse_duration(self._html_search_meta(
'duration', webpage, default=None))
view_count = str_to_int(self._search_regex(
- r'id="cinemaviews">\s*([0-9,]+)\s*<small>views',
- webpage, 'view count', fatal=False))
+ r'id=["\']cinemaviews1["\'][^>]*>\s*([0-9,]+)',
+ webpage, 'view count', default=None))
return merge_dicts(json_ld, {
'id': video_id,
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/youtube-dl/youtube_dl/extractor/hotstar.py new/youtube-dl/youtube_dl/extractor/hotstar.py
--- old/youtube-dl/youtube_dl/extractor/hotstar.py 2020-12-12 01:08:09.000000000 +0100
+++ new/youtube-dl/youtube_dl/extractor/hotstar.py 2020-12-13 18:56:48.000000000 +0100
@@ -3,6 +3,7 @@
import hashlib
import hmac
+import json
import re
import time
import uuid
@@ -25,43 +26,50 @@
class HotStarBaseIE(InfoExtractor):
_AKAMAI_ENCRYPTION_KEY = b'\x05\xfc\x1a\x01\xca\xc9\x4b\xc4\x12\xfc\x53\x12\x07\x75\xf9\xee'
- def _call_api_impl(self, path, video_id, query):
+ def _call_api_impl(self, path, video_id, headers, query, data=None):
st = int(time.time())
exp = st + 6000
auth = 'st=%d~exp=%d~acl=/*' % (st, exp)
auth += '~hmac=' + hmac.new(self._AKAMAI_ENCRYPTION_KEY, auth.encode(), hashlib.sha256).hexdigest()
- response = self._download_json(
- 'https://api.hotstar.com/' + path, video_id, headers={
- 'hotstarauth': auth,
- 'x-country-code': 'IN',
- 'x-platform-code': 'JIO',
- }, query=query)
- if response['statusCode'] != 'OK':
- raise ExtractorError(
- response['body']['message'], expected=True)
- return response['body']['results']
+ h = {'hotstarauth': auth}
+ h.update(headers)
+ return self._download_json(
+ 'https://api.hotstar.com/' + path,
+ video_id, headers=h, query=query, data=data)
def _call_api(self, path, video_id, query_name='contentId'):
- return self._call_api_impl(path, video_id, {
+ response = self._call_api_impl(path, video_id, {
+ 'x-country-code': 'IN',
+ 'x-platform-code': 'JIO',
+ }, {
query_name: video_id,
'tas': 10000,
})
+ if response['statusCode'] != 'OK':
+ raise ExtractorError(
+ response['body']['message'], expected=True)
+ return response['body']['results']
- def _call_api_v2(self, path, video_id):
- return self._call_api_impl(
- '%s/in/contents/%s' % (path, video_id), video_id, {
- 'desiredConfig': 'encryption:plain;ladder:phone,tv;package:hls,dash',
- 'client': 'mweb',
- 'clientVersion': '6.18.0',
- 'deviceId': compat_str(uuid.uuid4()),
- 'osName': 'Windows',
- 'osVersion': '10',
- })
+ def _call_api_v2(self, path, video_id, headers, query=None, data=None):
+ h = {'X-Request-Id': compat_str(uuid.uuid4())}
+ h.update(headers)
+ try:
+ return self._call_api_impl(
+ path, video_id, h, query, data)
+ except ExtractorError as e:
+ if isinstance(e.cause, compat_HTTPError):
+ if e.cause.code == 402:
+ self.raise_login_required()
+ message = self._parse_json(e.cause.read().decode(), video_id)['message']
+ if message in ('Content not available in region', 'Country is not supported'):
+ raise self.raise_geo_restricted(message)
+ raise ExtractorError(message)
+ raise e
class HotStarIE(HotStarBaseIE):
IE_NAME = 'hotstar'
- _VALID_URL = r'https?://(?:www\.)?hotstar\.com/(?:.+?[/-])?(?P<id>\d{10})'
+ _VALID_URL = r'https?://(?:www\.)?hotstar\.com/(?:.+[/-])?(?P<id>\d{10})'
_TESTS = [{
# contentData
'url': 'https://www.hotstar.com/can-you-not-spread-rumours/1000076273',
@@ -92,8 +100,13 @@
# only available via api v2
'url': 'https://www.hotstar.com/tv/ek-bhram-sarvagun-sampanna/s-2116/janhvi-targets…',
'only_matching': True,
+ }, {
+ 'url': 'https://www.hotstar.com/in/tv/start-music/1260005217/cooks-vs-comalis/11000…',
+ 'only_matching': True,
}]
_GEO_BYPASS = False
+ _DEVICE_ID = None
+ _USER_TOKEN = None
def _real_extract(self, url):
video_id = self._match_id(url)
@@ -121,7 +134,30 @@
headers = {'Referer': url}
formats = []
geo_restricted = False
- playback_sets = self._call_api_v2('h/v2/play', video_id)['playBackSets']
+
+ if not self._USER_TOKEN:
+ self._DEVICE_ID = compat_str(uuid.uuid4())
+ self._USER_TOKEN = self._call_api_v2('um/v3/users', video_id, {
+ 'X-HS-Platform': 'PCTV',
+ 'Content-Type': 'application/json',
+ }, data=json.dumps({
+ 'device_ids': [{
+ 'id': self._DEVICE_ID,
+ 'type': 'device_id',
+ }],
+ }).encode())['user_identity']
+
+ playback_sets = self._call_api_v2(
+ 'play/v2/playback/content/' + video_id, video_id, {
+ 'X-HS-Platform': 'web',
+ 'X-HS-AppVersion': '6.99.1',
+ 'X-HS-UserToken': self._USER_TOKEN,
+ }, query={
+ 'device-id': self._DEVICE_ID,
+ 'desired-config': 'encryption:plain',
+ 'os-name': 'Windows',
+ 'os-version': '10',
+ })['data']['playBackSets']
for playback_set in playback_sets:
if not isinstance(playback_set, dict):
continue
@@ -163,19 +199,22 @@
for f in formats:
f.setdefault('http_headers', {}).update(headers)
+ image = try_get(video_data, lambda x: x['image']['h'], compat_str)
+
return {
'id': video_id,
'title': title,
+ 'thumbnail': 'https://img1.hotstarext.com/image/upload/' + image if image else None,
'description': video_data.get('description'),
'duration': int_or_none(video_data.get('duration')),
'timestamp': int_or_none(video_data.get('broadcastDate') or video_data.get('startDate')),
'formats': formats,
'channel': video_data.get('channelName'),
- 'channel_id': video_data.get('channelId'),
+ 'channel_id': str_or_none(video_data.get('channelId')),
'series': video_data.get('showName'),
'season': video_data.get('seasonName'),
'season_number': int_or_none(video_data.get('seasonNo')),
- 'season_id': video_data.get('seasonId'),
+ 'season_id': str_or_none(video_data.get('seasonId')),
'episode': title,
'episode_number': int_or_none(video_data.get('episodeNo')),
}
@@ -183,7 +222,7 @@
class HotStarPlaylistIE(HotStarBaseIE):
IE_NAME = 'hotstar:playlist'
- _VALID_URL = r'https?://(?:www\.)?hotstar\.com/tv/[^/]+/s-\w+/list/[^/]+/t-(?P<id>\w+)'
+ _VALID_URL = r'https?://(?:www\.)?hotstar\.com/(?:[a-z]{2}/)?tv/[^/]+/s-\w+/list/[^/]+/t-(?P<id>\w+)'
_TESTS = [{
'url': 'https://www.hotstar.com/tv/savdhaan-india/s-26/list/popular-clips/t-3_2_26',
'info_dict': {
@@ -193,6 +232,9 @@
}, {
'url': 'https://www.hotstar.com/tv/savdhaan-india/s-26/list/extras/t-2480',
'only_matching': True,
+ }, {
+ 'url': 'https://www.hotstar.com/us/tv/masterchef-india/s-830/list/episodes/t-1_2_830',
+ 'only_matching': True,
}]
def _real_extract(self, url):
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/youtube-dl/youtube_dl/extractor/itv.py new/youtube-dl/youtube_dl/extractor/itv.py
--- old/youtube-dl/youtube_dl/extractor/itv.py 2020-12-12 01:08:15.000000000 +0100
+++ new/youtube-dl/youtube_dl/extractor/itv.py 2020-12-13 18:56:48.000000000 +0100
@@ -7,6 +7,7 @@
from .common import InfoExtractor
from .brightcove import BrightcoveNewIE
from ..utils import (
+ clean_html,
determine_ext,
extract_attributes,
get_element_by_class,
@@ -14,7 +15,6 @@
merge_dicts,
parse_duration,
smuggle_url,
- strip_or_none,
url_or_none,
)
@@ -146,7 +146,7 @@
'formats': formats,
'subtitles': subtitles,
'duration': parse_duration(video_data.get('Duration')),
- 'description': strip_or_none(get_element_by_class('episode-info__synopsis', webpage)),
+ 'description': clean_html(get_element_by_class('episode-info__synopsis', webpage)),
}, info)
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/youtube-dl/youtube_dl/extractor/linuxacademy.py new/youtube-dl/youtube_dl/extractor/linuxacademy.py
--- old/youtube-dl/youtube_dl/extractor/linuxacademy.py 2020-12-12 01:08:09.000000000 +0100
+++ new/youtube-dl/youtube_dl/extractor/linuxacademy.py 2020-12-13 18:56:48.000000000 +0100
@@ -8,11 +8,15 @@
from ..compat import (
compat_b64decode,
compat_HTTPError,
+ compat_str,
)
from ..utils import (
+ clean_html,
ExtractorError,
- orderedSet,
- unescapeHTML,
+ js_to_json,
+ parse_duration,
+ try_get,
+ unified_timestamp,
urlencode_postdata,
urljoin,
)
@@ -28,11 +32,15 @@
)
'''
_TESTS = [{
- 'url': 'https://linuxacademy.com/cp/courses/lesson/course/1498/lesson/2/module/154',
+ 'url': 'https://linuxacademy.com/cp/courses/lesson/course/7971/lesson/2/module/675',
'info_dict': {
- 'id': '1498-2',
+ 'id': '7971-2',
'ext': 'mp4',
- 'title': "Introduction to the Practitioner's Brief",
+ 'title': 'What Is Data Science',
+ 'description': 'md5:c574a3c20607144fb36cb65bdde76c99',
+ 'timestamp': 1607387907,
+ 'upload_date': '20201208',
+ 'duration': 304,
},
'params': {
'skip_download': True,
@@ -46,7 +54,8 @@
'info_dict': {
'id': '154',
'title': 'AWS Certified Cloud Practitioner',
- 'description': 'md5:039db7e60e4aac9cf43630e0a75fa834',
+ 'description': 'md5:a68a299ca9bb98d41cca5abc4d4ce22c',
+ 'duration': 28835,
},
'playlist_count': 41,
'skip': 'Requires Linux Academy account credentials',
@@ -74,6 +83,7 @@
self._AUTHORIZE_URL, None, 'Downloading authorize page', query={
'client_id': self._CLIENT_ID,
'response_type': 'token id_token',
+ 'response_mode': 'web_message',
'redirect_uri': self._ORIGIN_URL,
'scope': 'openid email user_impersonation profile',
'audience': self._ORIGIN_URL,
@@ -129,7 +139,13 @@
access_token = self._search_regex(
r'access_token=([^=&]+)', urlh.geturl(),
- 'access token')
+ 'access token', default=None)
+ if not access_token:
+ access_token = self._parse_json(
+ self._search_regex(
+ r'authorizationResponse\s*=\s*({.+?})\s*;', callback_page,
+ 'authorization response'), None,
+ transform_source=js_to_json)['response']['access_token']
self._download_webpage(
'https://linuxacademy.com/cp/login/tokenValidateLogin/token/%s'
@@ -144,30 +160,84 @@
# course path
if course_id:
- entries = [
- self.url_result(
- urljoin(url, lesson_url), ie=LinuxAcademyIE.ie_key())
- for lesson_url in orderedSet(re.findall(
- r'<a[^>]+\bhref=["\'](/cp/courses/lesson/course/\d+/lesson/\d+/module/\d+)',
- webpage))]
- title = unescapeHTML(self._html_search_regex(
- (r'class=["\']course-title["\'][^>]*>(?P<value>[^<]+)',
- r'var\s+title\s*=\s*(["\'])(?P<value>(?:(?!\1).)+)\1'),
- webpage, 'title', default=None, group='value'))
- description = unescapeHTML(self._html_search_regex(
- r'var\s+description\s*=\s*(["\'])(?P<value>(?:(?!\1).)+)\1',
- webpage, 'description', default=None, group='value'))
- return self.playlist_result(entries, course_id, title, description)
+ module = self._parse_json(
+ self._search_regex(
+ r'window\.module\s*=\s*({.+?})\s*;', webpage, 'module'),
+ item_id)
+ entries = []
+ chapter_number = None
+ chapter = None
+ chapter_id = None
+ for item in module['items']:
+ if not isinstance(item, dict):
+ continue
+
+ def type_field(key):
+ return (try_get(item, lambda x: x['type'][key], compat_str) or '').lower()
+ type_fields = (type_field('name'), type_field('slug'))
+ # Move to next module section
+ if 'section' in type_fields:
+ chapter = item.get('course_name')
+ chapter_id = item.get('course_module')
+ chapter_number = 1 if not chapter_number else chapter_number + 1
+ continue
+ # Skip non-lessons
+ if 'lesson' not in type_fields:
+ continue
+ lesson_url = urljoin(url, item.get('url'))
+ if not lesson_url:
+ continue
+ title = item.get('title') or item.get('lesson_name')
+ description = item.get('md_desc') or clean_html(item.get('description')) or clean_html(item.get('text'))
+ entries.append({
+ '_type': 'url_transparent',
+ 'url': lesson_url,
+ 'ie_key': LinuxAcademyIE.ie_key(),
+ 'title': title,
+ 'description': description,
+ 'timestamp': unified_timestamp(item.get('date')) or unified_timestamp(item.get('created_on')),
+ 'duration': parse_duration(item.get('duration')),
+ 'chapter': chapter,
+ 'chapter_id': chapter_id,
+ 'chapter_number': chapter_number,
+ })
+ return {
+ '_type': 'playlist',
+ 'entries': entries,
+ 'id': course_id,
+ 'title': module.get('title'),
+ 'description': module.get('md_desc') or clean_html(module.get('desc')),
+ 'duration': parse_duration(module.get('duration')),
+ }
# single video path
- info = self._extract_jwplayer_data(
- webpage, item_id, require_title=False, m3u8_id='hls',)
- title = self._search_regex(
- (r'>Lecture\s*:\s*(?P<value>[^<]+)',
- r'lessonName\s*=\s*(["\'])(?P<value>(?:(?!\1).)+)\1'), webpage,
- 'title', group='value')
- info.update({
+ m3u8_url = self._parse_json(
+ self._search_regex(
+ r'player\.playlist\s*=\s*(\[.+?\])\s*;', webpage, 'playlist'),
+ item_id)[0]['file']
+ formats = self._extract_m3u8_formats(
+ m3u8_url, item_id, 'mp4', entry_protocol='m3u8_native',
+ m3u8_id='hls')
+ self._sort_formats(formats)
+ info = {
'id': item_id,
- 'title': title,
- })
+ 'formats': formats,
+ }
+ lesson = self._parse_json(
+ self._search_regex(
+ (r'window\.lesson\s*=\s*({.+?})\s*;',
+ r'player\.lesson\s*=\s*({.+?})\s*;'),
+ webpage, 'lesson', default='{}'), item_id, fatal=False)
+ if lesson:
+ info.update({
+ 'title': lesson.get('lesson_name'),
+ 'description': lesson.get('md_desc') or clean_html(lesson.get('desc')),
+ 'timestamp': unified_timestamp(lesson.get('date')) or unified_timestamp(lesson.get('created_on')),
+ 'duration': parse_duration(lesson.get('duration')),
+ })
+ if not info.get('title'):
+ info['title'] = self._search_regex(
+ (r'>Lecture\s*:\s*(?P<value>[^<]+)',
+ r'lessonName\s*=\s*(["\'])(?P<value>(?:(?!\1).)+)\1'), webpage,
+ 'title', group='value')
return info
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/youtube-dl/youtube_dl/extractor/mdr.py new/youtube-dl/youtube_dl/extractor/mdr.py
--- old/youtube-dl/youtube_dl/extractor/mdr.py 2020-12-12 01:08:09.000000000 +0100
+++ new/youtube-dl/youtube_dl/extractor/mdr.py 2020-12-13 18:56:48.000000000 +0100
@@ -2,12 +2,16 @@
from __future__ import unicode_literals
from .common import InfoExtractor
-from ..compat import compat_urlparse
+from ..compat import (
+ compat_str,
+ compat_urlparse,
+)
from ..utils import (
determine_ext,
int_or_none,
parse_duration,
parse_iso8601,
+ url_or_none,
xpath_text,
)
@@ -16,6 +20,8 @@
IE_DESC = 'MDR.DE and KiKA'
_VALID_URL = r'https?://(?:www\.)?(?:mdr|kika)\.de/(?:.*)/[a-z-]+-?(?P<id>\d+)(?:_.+?)?\.html'
+ _GEO_COUNTRIES = ['DE']
+
_TESTS = [{
# MDR regularly deletes its videos
'url': 'http://www.mdr.de/fakt/video189002.html',
@@ -67,6 +73,22 @@
'uploader': 'MITTELDEUTSCHER RUNDFUNK',
},
}, {
+ # empty bitrateVideo and bitrateAudio
+ 'url': 'https://www.kika.de/filme/sendung128372_zc-572e3f45_zs-1d9fb70e.html',
+ 'info_dict': {
+ 'id': '128372',
+ 'ext': 'mp4',
+ 'title': 'Der kleine Wichtel kehrt zurück',
+ 'description': 'md5:f77fafdff90f7aa1e9dca14f662c052a',
+ 'duration': 4876,
+ 'timestamp': 1607823300,
+ 'upload_date': '20201213',
+ 'uploader': 'ZDF',
+ },
+ 'params': {
+ 'skip_download': True,
+ },
+ }, {
'url': 'http://www.kika.de/baumhaus/sendungen/video19636_zc-fea7f8a0_zs-4bf89c60.ht…',
'only_matching': True,
}, {
@@ -91,10 +113,13 @@
title = xpath_text(doc, ['./title', './broadcast/broadcastName'], 'title', fatal=True)
+ type_ = xpath_text(doc, './type', default=None)
+
formats = []
processed_urls = []
for asset in doc.findall('./assets/asset'):
for source in (
+ 'download',
'progressiveDownload',
'dynamicHttpStreamingRedirector',
'adaptiveHttpStreamingRedirector'):
@@ -102,63 +127,49 @@
if url_el is None:
continue
- video_url = url_el.text
- if video_url in processed_urls:
+ video_url = url_or_none(url_el.text)
+ if not video_url or video_url in processed_urls:
continue
processed_urls.append(video_url)
- vbr = int_or_none(xpath_text(asset, './bitrateVideo', 'vbr'), 1000)
- abr = int_or_none(xpath_text(asset, './bitrateAudio', 'abr'), 1000)
-
- ext = determine_ext(url_el.text)
+ ext = determine_ext(video_url)
if ext == 'm3u8':
- url_formats = self._extract_m3u8_formats(
+ formats.extend(self._extract_m3u8_formats(
video_url, video_id, 'mp4', entry_protocol='m3u8_native',
- preference=0, m3u8_id='HLS', fatal=False)
+ preference=0, m3u8_id='HLS', fatal=False))
elif ext == 'f4m':
- url_formats = self._extract_f4m_formats(
+ formats.extend(self._extract_f4m_formats(
video_url + '?hdcore=3.7.0&plugin=aasp-3.7.0.39.44', video_id,
- preference=0, f4m_id='HDS', fatal=False)
+ preference=0, f4m_id='HDS', fatal=False))
else:
media_type = xpath_text(asset, './mediaType', 'media type', default='MP4')
vbr = int_or_none(xpath_text(asset, './bitrateVideo', 'vbr'), 1000)
abr = int_or_none(xpath_text(asset, './bitrateAudio', 'abr'), 1000)
filesize = int_or_none(xpath_text(asset, './fileSize', 'file size'))
+ format_id = [media_type]
+ if vbr or abr:
+ format_id.append(compat_str(vbr or abr))
+
f = {
'url': video_url,
- 'format_id': '%s-%d' % (media_type, vbr or abr),
+ 'format_id': '-'.join(format_id),
'filesize': filesize,
'abr': abr,
- 'preference': 1,
+ 'vbr': vbr,
}
if vbr:
- width = int_or_none(xpath_text(asset, './frameWidth', 'width'))
- height = int_or_none(xpath_text(asset, './frameHeight', 'height'))
f.update({
- 'vbr': vbr,
- 'width': width,
- 'height': height,
+ 'width': int_or_none(xpath_text(asset, './frameWidth', 'width')),
+ 'height': int_or_none(xpath_text(asset, './frameHeight', 'height')),
})
- url_formats = [f]
-
- if not url_formats:
- continue
-
- if not vbr:
- for f in url_formats:
- abr = f.get('tbr') or abr
- if 'tbr' in f:
- del f['tbr']
- f.update({
- 'abr': abr,
- 'vcodec': 'none',
- })
+ if type_ == 'audio':
+ f['vcodec'] = 'none'
- formats.extend(url_formats)
+ formats.append(f)
self._sort_formats(formats)
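For context on the mdr.py hunk above: the patch swaps bare `url_el.text` handling for `url_or_none` and builds `format_id` from a list so that the new KiKA test case (empty `bitrateVideo`/`bitrateAudio`) no longer crashes the `'%s-%d'` formatting. A minimal sketch of that behavior, using simplified reimplementations of the `youtube_dl.utils` helpers (illustrative only, not the library code):

```python
def url_or_none(url):
    # Simplified sketch of youtube_dl.utils.url_or_none: return the URL
    # only if it looks like a usable http(s)-style URL, else None, so the
    # extractor can skip empty or junk <...Url/> elements in the XML.
    if not url or not isinstance(url, str):
        return None
    url = url.strip()
    return url if url.startswith(('http://', 'https://', '//')) else None


def int_or_none(v, scale=1):
    # Simplified sketch of youtube_dl.utils.int_or_none: coerce to int,
    # tolerating None/empty strings. The patch passes scale=1000 to turn
    # bits/s from the feed into kbit/s.
    if v is None or v == '':
        return None
    try:
        return int(v) // scale
    except (TypeError, ValueError):
        return None


def make_format_id(media_type, vbr, abr):
    # Mirrors the patched format_id construction: with both bitrates
    # missing (the new KiKA test case), the id degrades to just 'MP4'
    # instead of raising on the old '%s-%d' % (media_type, vbr or abr).
    parts = [media_type]
    if vbr or abr:
        parts.append(str(vbr or abr))
    return '-'.join(parts)
```

With empty bitrate elements, `int_or_none('')` yields `None` for both `vbr` and `abr`, and `make_format_id('MP4', None, None)` returns `'MP4'`, which is exactly the failure mode the new test exercises.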
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/youtube-dl/youtube_dl/extractor/slideslive.py new/youtube-dl/youtube_dl/extractor/slideslive.py
--- old/youtube-dl/youtube_dl/extractor/slideslive.py 2020-12-12 01:08:09.000000000 +0100
+++ new/youtube-dl/youtube_dl/extractor/slideslive.py 2020-12-13 18:56:48.000000000 +0100
@@ -83,9 +83,10 @@
else:
formats = []
_MANIFEST_PATTERN = 'https://01.cdn.yoda.slideslive.com/%s/master.%s'
+ # use `m3u8` entry_protocol until EXT-X-MAP is properly supported by `m3u8_native` entry_protocol
formats.extend(self._extract_m3u8_formats(
- _MANIFEST_PATTERN % (service_id, 'm3u8'), service_id, 'mp4',
- entry_protocol='m3u8_native', m3u8_id='hls', fatal=False))
+ _MANIFEST_PATTERN % (service_id, 'm3u8'),
+ service_id, 'mp4', m3u8_id='hls', fatal=False))
formats.extend(self._extract_mpd_formats(
_MANIFEST_PATTERN % (service_id, 'mpd'), service_id,
mpd_id='dash', fatal=False))
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/youtube-dl/youtube_dl/extractor/twitcasting.py new/youtube-dl/youtube_dl/extractor/twitcasting.py
--- old/youtube-dl/youtube_dl/extractor/twitcasting.py 2020-12-12 01:08:09.000000000 +0100
+++ new/youtube-dl/youtube_dl/extractor/twitcasting.py 2020-12-13 18:56:48.000000000 +0100
@@ -1,11 +1,20 @@
# coding: utf-8
from __future__ import unicode_literals
-from .common import InfoExtractor
-from ..utils import urlencode_postdata
-
import re
+from .common import InfoExtractor
+from ..utils import (
+ clean_html,
+ float_or_none,
+ get_element_by_class,
+ get_element_by_id,
+ parse_duration,
+ str_to_int,
+ unified_timestamp,
+ urlencode_postdata,
+)
+
class TwitCastingIE(InfoExtractor):
_VALID_URL = r'https?://(?:[^/]+\.)?twitcasting\.tv/(?P<uploader_id>[^/]+)/movie/(?P<id>\d+)'
@@ -17,8 +26,12 @@
'ext': 'mp4',
'title': 'Live #2357609',
'uploader_id': 'ivetesangalo',
- 'description': "Moi! I'm live on TwitCasting from my iPhone.",
+ 'description': 'Twitter Oficial da cantora brasileira Ivete Sangalo.',
'thumbnail': r're:^https?://.*\.jpg$',
+ 'upload_date': '20110822',
+ 'timestamp': 1314010824,
+ 'duration': 32,
+ 'view_count': int,
},
'params': {
'skip_download': True,
@@ -30,8 +43,12 @@
'ext': 'mp4',
'title': 'Live playing something #3689740',
'uploader_id': 'mttbernardini',
- 'description': "I'm live on TwitCasting from my iPad. password: abc (Santa Marinella/Lazio, Italia)",
+ 'description': 'Salve, io sono Matto (ma con la e). Questa è la mia presentazione, in quanto sono letteralmente matto (nel senso di strano), con qualcosa in più.',
'thumbnail': r're:^https?://.*\.jpg$',
+ 'upload_date': '20120212',
+ 'timestamp': 1329028024,
+ 'duration': 681,
+ 'view_count': int,
},
'params': {
'skip_download': True,
@@ -40,9 +57,7 @@
}]
def _real_extract(self, url):
- mobj = re.match(self._VALID_URL, url)
- video_id = mobj.group('id')
- uploader_id = mobj.group('uploader_id')
+ uploader_id, video_id = re.match(self._VALID_URL, url).groups()
video_password = self._downloader.params.get('videopassword')
request_data = None
@@ -52,30 +67,45 @@
})
webpage = self._download_webpage(url, video_id, data=request_data)
- title = self._html_search_regex(
- r'(?s)<[^>]+id=["\']movietitle[^>]+>(.+?)</',
- webpage, 'title', default=None) or self._html_search_meta(
- 'twitter:title', webpage, fatal=True)
+ title = clean_html(get_element_by_id(
+ 'movietitle', webpage)) or self._html_search_meta(
+ ['og:title', 'twitter:title'], webpage, fatal=True)
+ video_js_data = {}
m3u8_url = self._search_regex(
- (r'data-movie-url=(["\'])(?P<url>(?:(?!\1).)+)\1',
- r'(["\'])(?P<url>http.+?\.m3u8.*?)\1'),
- webpage, 'm3u8 url', group='url')
+ r'data-movie-url=(["\'])(?P<url>(?:(?!\1).)+)\1',
+ webpage, 'm3u8 url', group='url', default=None)
+ if not m3u8_url:
+ video_js_data = self._parse_json(self._search_regex(
+ r"data-movie-playlist='(\[[^']+\])'",
+ webpage, 'movie playlist'), video_id)[0]
+ m3u8_url = video_js_data['source']['url']
+ # use `m3u8` entry_protocol until EXT-X-MAP is properly supported by `m3u8_native` entry_protocol
formats = self._extract_m3u8_formats(
- m3u8_url, video_id, ext='mp4', entry_protocol='m3u8_native',
- m3u8_id='hls')
+ m3u8_url, video_id, 'mp4', m3u8_id='hls')
- thumbnail = self._og_search_thumbnail(webpage)
- description = self._og_search_description(
- webpage, default=None) or self._html_search_meta(
- 'twitter:description', webpage)
+ thumbnail = video_js_data.get('thumbnailUrl') or self._og_search_thumbnail(webpage)
+ description = clean_html(get_element_by_id(
+ 'authorcomment', webpage)) or self._html_search_meta(
+ ['description', 'og:description', 'twitter:description'], webpage)
+ duration = float_or_none(video_js_data.get(
+ 'duration'), 1000) or parse_duration(clean_html(
+ get_element_by_class('tw-player-duration-time', webpage)))
+ view_count = str_to_int(self._search_regex(
+ r'Total\s*:\s*([\d,]+)\s*Views', webpage, 'views', None))
+ timestamp = unified_timestamp(self._search_regex(
+ r'data-toggle="true"[^>]+datetime="([^"]+)"',
+ webpage, 'datetime', None))
return {
'id': video_id,
'title': title,
'description': description,
'thumbnail': thumbnail,
+ 'timestamp': timestamp,
'uploader_id': uploader_id,
+ 'duration': duration,
+ 'view_count': view_count,
'formats': formats,
}
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/youtube-dl/youtube_dl/extractor/vlive.py new/youtube-dl/youtube_dl/extractor/vlive.py
--- old/youtube-dl/youtube_dl/extractor/vlive.py 2020-12-12 01:08:09.000000000 +0100
+++ new/youtube-dl/youtube_dl/extractor/vlive.py 2020-12-13 18:56:48.000000000 +0100
@@ -155,6 +155,7 @@
'old/v3/live/%s/playInfo',
video_id)['result']['adaptiveStreamUrl']
formats = self._extract_m3u8_formats(stream_url, video_id, 'mp4')
+ self._sort_formats(formats)
info = get_common_fields()
info.update({
'title': self._live_title(video['title']),
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/youtube-dl/youtube_dl/extractor/yandexmusic.py new/youtube-dl/youtube_dl/extractor/yandexmusic.py
--- old/youtube-dl/youtube_dl/extractor/yandexmusic.py 2020-12-12 01:08:09.000000000 +0100
+++ new/youtube-dl/youtube_dl/extractor/yandexmusic.py 2020-12-13 18:56:48.000000000 +0100
@@ -260,6 +260,14 @@
},
'playlist_count': 33,
# 'skip': 'Travis CI servers blocked by YandexMusic',
+ }, {
+ # empty artists
+ 'url': 'https://music.yandex.ru/album/9091882',
+ 'info_dict': {
+ 'id': '9091882',
+ 'title': 'ТЕД на русском',
+ },
+ 'playlist_count': 187,
}]
def _real_extract(self, url):
@@ -273,7 +281,10 @@
entries = self._build_playlist([track for volume in album['volumes'] for track in volume])
- title = '%s - %s' % (album['artists'][0]['name'], album['title'])
+ title = album['title']
+ artist = try_get(album, lambda x: x['artists'][0]['name'], compat_str)
+ if artist:
+ title = '%s - %s' % (artist, title)
year = album.get('year')
if year:
title += ' (%s)' % year
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/youtube-dl/youtube_dl/extractor/youtube.py new/youtube-dl/youtube_dl/extractor/youtube.py
--- old/youtube-dl/youtube_dl/extractor/youtube.py 2020-12-12 01:08:09.000000000 +0100
+++ new/youtube-dl/youtube_dl/extractor/youtube.py 2020-12-13 18:56:48.000000000 +0100
@@ -319,10 +319,18 @@
(?:www\.)?invidious\.kabi\.tk/|
(?:www\.)?invidious\.13ad\.de/|
(?:www\.)?invidious\.mastodon\.host/|
+ (?:www\.)?invidious\.zapashcanon\.fr/|
+ (?:www\.)?invidious\.kavin\.rocks/|
+ (?:www\.)?invidious\.tube/|
+ (?:www\.)?invidiou\.site/|
+ (?:www\.)?invidious\.site/|
+ (?:www\.)?invidious\.xyz/|
(?:www\.)?invidious\.nixnet\.xyz/|
(?:www\.)?invidious\.drycat\.fr/|
(?:www\.)?tube\.poal\.co/|
+ (?:www\.)?tube\.connect\.cafe/|
(?:www\.)?vid\.wxzm\.sx/|
+ (?:www\.)?vid\.mint\.lgbt/|
(?:www\.)?yewtu\.be/|
(?:www\.)?yt\.elukerio\.org/|
(?:www\.)?yt\.lelux\.fi/|
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/youtube-dl/youtube_dl/version.py new/youtube-dl/youtube_dl/version.py
--- old/youtube-dl/youtube_dl/version.py 2020-12-12 01:09:56.000000000 +0100
+++ new/youtube-dl/youtube_dl/version.py 2020-12-13 18:57:08.000000000 +0100
@@ -1,3 +1,3 @@
from __future__ import unicode_literals
-__version__ = '2020.12.12'
+__version__ = '2020.12.14'
Hello community,
here is the log from the commit of package enet for openSUSE:Factory checked in at 2020-12-21 10:24:29
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/enet (Old)
and /work/SRC/openSUSE:Factory/.enet.new.5145 (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Package is "enet"
Mon Dec 21 10:24:29 2020 rev:7 rq:857146 version:1.3.17
Changes:
--------
--- /work/SRC/openSUSE:Factory/enet/enet.changes 2020-09-21 17:22:58.987908422 +0200
+++ /work/SRC/openSUSE:Factory/.enet.new.5145/enet.changes 2020-12-21 10:27:04.404228580 +0100
@@ -1,0 +2,6 @@
+Sun Dec 20 12:41:24 UTC 2020 - Dirk Müller <dmueller(a)suse.com>
+
+- update to 1.3.17:
+  * fixes for sender getting too far ahead of receiver that can cause instability with reliable packets
+
+-------------------------------------------------------------------
Old:
----
enet-1.3.16.tar.gz
New:
----
enet-1.3.17.tar.gz
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Other differences:
------------------
++++++ enet.spec ++++++
--- /var/tmp/diff_new_pack.b8QLRo/_old 2020-12-21 10:27:04.968229219 +0100
+++ /var/tmp/diff_new_pack.b8QLRo/_new 2020-12-21 10:27:04.972229224 +0100
@@ -18,7 +18,7 @@
%define sover 7
Name: enet
-Version: 1.3.16
+Version: 1.3.17
Release: 0
Summary: Network Communication Layer on Top of UDP
License: MIT
++++++ enet-1.3.16.tar.gz -> enet-1.3.17.tar.gz ++++++
++++ 1969 lines of diff (skipped)
Hello community,
here is the log from the commit of package fprintd for openSUSE:Factory checked in at 2020-12-21 10:24:27
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/fprintd (Old)
and /work/SRC/openSUSE:Factory/.fprintd.new.5145 (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Package is "fprintd"
Mon Dec 21 10:24:27 2020 rev:12 rq:857141 version:1.90.8
Changes:
--------
--- /work/SRC/openSUSE:Factory/fprintd/fprintd.changes 2020-12-09 22:21:53.735704238 +0100
+++ /work/SRC/openSUSE:Factory/.fprintd.new.5145/fprintd.changes 2020-12-21 10:27:03.232227251 +0100
@@ -1,0 +2,23 @@
+Sat Dec 12 21:51:37 UTC 2020 - Martin Hauke <mardnh(a)gmx.de>
+
+- Update to version 1.90.8
+ It seems that we are finally reaching the end of the tunnel with
+ regard to regressions. One more issue that cropped up was that a
+ pam_fprintd fix to avoid a possible authentication bypass caused
+ issues when fprintd was just started on demand.
+ Highlights:
+ * pam: Only listen to NameOwnerChanged after fprintd is known to
+ run.
+ * Place new ObjectManager DBus API at /net/reactivated/Fprint
+
+-------------------------------------------------------------------
+Wed Dec 9 19:22:16 UTC 2020 - Martin Hauke <mardnh(a)gmx.de>
+
+- Update to version 1.90.7
+  While 1.90.6 fixed a number of issues, we did have a bad
+  regression causing pam_fprintd to crash when there are no
+  fingerprint devices installed.
+ Highlights:
+ * pam: Guard strdup calls against NULL pointers
+
+-------------------------------------------------------------------
@@ -4 +27 @@
-- Update to version 1.90.5
+- Update to version 1.90.6
Old:
----
fprintd-1.90.6.tar.bz2
New:
----
fprintd-1.90.8.tar.bz2
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Other differences:
------------------
++++++ fprintd.spec ++++++
--- /var/tmp/diff_new_pack.q8cLMJ/_old 2020-12-21 10:27:03.792227887 +0100
+++ /var/tmp/diff_new_pack.q8cLMJ/_new 2020-12-21 10:27:03.792227887 +0100
@@ -16,10 +16,10 @@
#
-%define gitlabhash 52058c1ea0c3bd0eeb6e10c81af98aa687227f7f
+%define gitlabhash 7d22a2b5b9d323638bb213aefb8627d897c8e482
Name: fprintd
-Version: 1.90.6
+Version: 1.90.8
Release: 0
Summary: D-Bus service for Fingerprint reader access
License: GPL-2.0-or-later
++++++ fprintd-1.90.6.tar.bz2 -> fprintd-1.90.8.tar.bz2 ++++++
++++ 8867 lines of diff (skipped)