openSUSE Commits
Script 'mail_helper' called by obssrc
Hello community,
here is the log from the commit of package sqlite-jdbc for openSUSE:Factory checked in at 2024-03-29 13:10:25
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/sqlite-jdbc (Old)
and /work/SRC/openSUSE:Factory/.sqlite-jdbc.new.1905 (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Package is "sqlite-jdbc"
Fri Mar 29 13:10:25 2024 rev:17 rq:1163406 version:3.45.2.0
Changes:
--------
--- /work/SRC/openSUSE:Factory/sqlite-jdbc/sqlite-jdbc.changes 2024-02-22 20:59:01.740288935 +0100
+++ /work/SRC/openSUSE:Factory/.sqlite-jdbc.new.1905/sqlite-jdbc.changes 2024-03-29 13:13:13.996147826 +0100
@@ -1,0 +2,30 @@
+Wed Mar 13 09:11:38 UTC 2024 - Anton Shvetz <shvetz.anton(a)gmail.com>
+
+- Update to v3.45.2.0
+ * Features
+ ~ sqlite
+ + upgrade to sqlite 3.45.2 (c56fbf1)
+ * Perf
+     ~ CoreStatement uses optimized regex for generated key matches
+ (95b8efa)
+ * Build
+ ~ deps
+ + bump org.apache.maven.plugins:maven-gpg-plugin (3b83760)
+ + bump org.jreleaser:jreleaser-maven-plugin (9ccd1e7)
+ + bump org.graalvm.buildtools:native-maven-plugin (eca45e5)
+ + bump andymckay/cancel-action from 0.3 to 0.4 (b11f8be)
+ + bump org.graalvm.buildtools:native-maven-plugin (cdad828)
+ ~ deps-dev
+ + bump org.mockito:mockito-core from 5.10.0 to 5.11.0
+ (07b38af)
+ + bump org.junit.jupiter:junit-jupiter (6c2e966)
+ + bump org.assertj:assertj-core from 3.25.2 to 3.25.3
+ (daca050)
+ ~ unscoped
+ + use BC signer (c84d122)
+ * Documentation
+ ~ add gpg key to README (18c0bd4), closes #1049
+ ~ adding try-with-resources to examples and demo. related #938
+ (9a072d3), closes #938
+
+-------------------------------------------------------------------
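The changelog's documentation item about "adding try-with-resources to examples and demo" refers to the standard Java pattern for deterministic resource cleanup. A minimal generic sketch of the pattern follows; it uses a hypothetical `Resource` class standing in for sqlite-jdbc's `Connection`/`Statement` (which are `AutoCloseable` in the same way), so it runs without the JDBC driver on the classpath:

```java
import java.util.ArrayList;
import java.util.List;

// Minimal try-with-resources sketch: resources declared in the try header
// are closed automatically, in reverse declaration order, even if the body
// throws. JDBC's Connection, Statement, and ResultSet follow this contract.
public class TryWithResourcesDemo {
    /** Event log so the open/close ordering is observable. */
    public static final List<String> LOG = new ArrayList<>();

    /** Hypothetical resource; not a real sqlite-jdbc class. */
    static class Resource implements AutoCloseable {
        final String name;
        Resource(String name) { this.name = name; LOG.add("open " + name); }
        @Override public void close() { LOG.add("close " + name); }
    }

    public static void main(String[] args) {
        try (Resource conn = new Resource("connection");
             Resource stmt = new Resource("statement")) {
            LOG.add("work");
        } // stmt is closed first, then conn
        System.out.println(LOG);
    }
}
```

With sqlite-jdbc itself the same shape would be `try (Connection conn = DriverManager.getConnection("jdbc:sqlite:sample.db"); Statement stmt = conn.createStatement()) { ... }`, which guarantees the statement and connection are closed even on an exception.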
Old:
----
sqlite-amalgamation-3450100.zip
sqlite-jdbc-3.45.1.0.tar.gz
New:
----
sqlite-amalgamation-3450200.zip
sqlite-jdbc-3.45.2.0.tar.gz
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Other differences:
------------------
++++++ sqlite-jdbc.spec ++++++
--- /var/tmp/diff_new_pack.dqjF1p/_old 2024-03-29 13:13:14.824178243 +0100
+++ /var/tmp/diff_new_pack.dqjF1p/_new 2024-03-29 13:13:14.828178390 +0100
@@ -17,8 +17,8 @@
%{!?make_build:%global make_build make %{?_smp_mflags}}
-%global version 3.45.1.0
-%global amalgamation_version 3450100
+%global version 3.45.2.0
+%global amalgamation_version 3450200
%global debug_package %{nil}
Name: sqlite-jdbc
Version: %{version}
++++++ sqlite-amalgamation-3450100.zip -> sqlite-amalgamation-3450200.zip ++++++
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/sqlite-amalgamation-3450100/shell.c new/sqlite-amalgamation-3450200/shell.c
--- old/sqlite-amalgamation-3450100/shell.c 2024-01-30 17:24:03.000000000 +0100
+++ new/sqlite-amalgamation-3450200/shell.c 2024-03-12 12:23:08.000000000 +0100
@@ -580,6 +580,9 @@
#ifndef HAVE_CONSOLE_IO_H
# include "console_io.h"
#endif
+#if defined(_MSC_VER)
+# pragma warning(disable : 4204)
+#endif
#ifndef SQLITE_CIO_NO_TRANSLATE
# if (defined(_WIN32) || defined(WIN32)) && !SQLITE_OS_WINRT
@@ -678,6 +681,10 @@
# endif
}
+# ifndef ENABLE_VIRTUAL_TERMINAL_PROCESSING
+# define ENABLE_VIRTUAL_TERMINAL_PROCESSING (0x4)
+# endif
+
# if CIO_WIN_WC_XLATE
/* Define console modes for use with the Windows Console API. */
# define SHELL_CONI_MODE \
@@ -1228,6 +1235,10 @@
}
#endif /* !defined(SQLITE_CIO_NO_TRANSLATE) */
+#if defined(_MSC_VER)
+# pragma warning(default : 4204)
+#endif
+
#undef SHELL_INVALID_FILE_PTR
/************************* End ../ext/consio/console_io.c ********************/
@@ -20619,6 +20630,7 @@
rc = sqlite3_step(pStmt);
if( rc!=SQLITE_ROW ) return;
nColumn = sqlite3_column_count(pStmt);
+ if( nColumn==0 ) goto columnar_end;
nAlloc = nColumn*4;
if( nAlloc<=0 ) nAlloc = 1;
azData = sqlite3_malloc64( nAlloc*sizeof(char*) );
@@ -20704,7 +20716,6 @@
if( n>p->actualWidth[j] ) p->actualWidth[j] = n;
}
if( seenInterrupt ) goto columnar_end;
- if( nColumn==0 ) goto columnar_end;
switch( p->cMode ){
case MODE_Column: {
colSep = " ";
@@ -25553,16 +25564,15 @@
#ifndef SQLITE_SHELL_FIDDLE
if( c=='i' && cli_strncmp(azArg[0], "import", n)==0 ){
char *zTable = 0; /* Insert data into this table */
- char *zSchema = 0; /* within this schema (may default to "main") */
+ char *zSchema = 0; /* Schema of zTable */
char *zFile = 0; /* Name of file to extra content from */
sqlite3_stmt *pStmt = NULL; /* A statement */
int nCol; /* Number of columns in the table */
- int nByte; /* Number of bytes in an SQL string */
+ i64 nByte; /* Number of bytes in an SQL string */
int i, j; /* Loop counters */
int needCommit; /* True to COMMIT or ROLLBACK at end */
int nSep; /* Number of bytes in p->colSeparator[] */
- char *zSql; /* An SQL statement */
- char *zFullTabName; /* Table name with schema if applicable */
+ char *zSql = 0; /* An SQL statement */
ImportCtx sCtx; /* Reader context */
char *(SQLITE_CDECL *xRead)(ImportCtx*); /* Func to read one value */
int eVerbose = 0; /* Larger for more console output */
@@ -25696,24 +25706,14 @@
while( (nSkip--)>0 ){
while( xRead(&sCtx) && sCtx.cTerm==sCtx.cColSep ){}
}
- if( zSchema!=0 ){
- zFullTabName = sqlite3_mprintf("\"%w\".\"%w\"", zSchema, zTable);
- }else{
- zFullTabName = sqlite3_mprintf("\"%w\"", zTable);
- }
- zSql = sqlite3_mprintf("SELECT * FROM %s", zFullTabName);
- if( zSql==0 || zFullTabName==0 ){
- import_cleanup(&sCtx);
- shell_out_of_memory();
- }
- nByte = strlen30(zSql);
- rc = sqlite3_prepare_v2(p->db, zSql, -1, &pStmt, 0);
import_append_char(&sCtx, 0); /* To ensure sCtx.z is allocated */
- if( rc && sqlite3_strglob("no such table: *", sqlite3_errmsg(p->db))==0 ){
+ if( sqlite3_table_column_metadata(p->db, zSchema, zTable,0,0,0,0,0,0) ){
+ /* Table does not exist. Create it. */
sqlite3 *dbCols = 0;
char *zRenames = 0;
char *zColDefs;
- zCreate = sqlite3_mprintf("CREATE TABLE %s", zFullTabName);
+ zCreate = sqlite3_mprintf("CREATE TABLE \"%w\".\"%w\"",
+ zSchema ? zSchema : "main", zTable);
while( xRead(&sCtx) ){
zAutoColumn(sCtx.z, &dbCols, 0);
if( sCtx.cTerm!=sCtx.cColSep ) break;
@@ -25728,34 +25728,50 @@
assert(dbCols==0);
if( zColDefs==0 ){
eputf("%s: empty file\n", sCtx.zFile);
- import_fail:
- sqlite3_free(zCreate);
- sqlite3_free(zSql);
- sqlite3_free(zFullTabName);
import_cleanup(&sCtx);
rc = 1;
goto meta_command_exit;
}
zCreate = sqlite3_mprintf("%z%z\n", zCreate, zColDefs);
+ if( zCreate==0 ){
+ import_cleanup(&sCtx);
+ shell_out_of_memory();
+ }
if( eVerbose>=1 ){
oputf("%s\n", zCreate);
}
rc = sqlite3_exec(p->db, zCreate, 0, 0, 0);
+ sqlite3_free(zCreate);
+ zCreate = 0;
if( rc ){
eputf("%s failed:\n%s\n", zCreate, sqlite3_errmsg(p->db));
- goto import_fail;
+ import_cleanup(&sCtx);
+ rc = 1;
+ goto meta_command_exit;
}
- sqlite3_free(zCreate);
- zCreate = 0;
- rc = sqlite3_prepare_v2(p->db, zSql, -1, &pStmt, 0);
}
+ zSql = sqlite3_mprintf("SELECT count(*) FROM pragma_table_info(%Q,%Q);",
+ zTable, zSchema);
+ if( zSql==0 ){
+ import_cleanup(&sCtx);
+ shell_out_of_memory();
+ }
+ nByte = strlen(zSql);
+ rc = sqlite3_prepare_v2(p->db, zSql, -1, &pStmt, 0);
+ sqlite3_free(zSql);
+ zSql = 0;
if( rc ){
if (pStmt) sqlite3_finalize(pStmt);
eputf("Error: %s\n", sqlite3_errmsg(p->db));
- goto import_fail;
+ import_cleanup(&sCtx);
+ rc = 1;
+ goto meta_command_exit;
+ }
+ if( sqlite3_step(pStmt)==SQLITE_ROW ){
+ nCol = sqlite3_column_int(pStmt, 0);
+ }else{
+ nCol = 0;
}
- sqlite3_free(zSql);
- nCol = sqlite3_column_count(pStmt);
sqlite3_finalize(pStmt);
pStmt = 0;
if( nCol==0 ) return 0; /* no columns, no error */
@@ -25764,7 +25780,12 @@
import_cleanup(&sCtx);
shell_out_of_memory();
}
- sqlite3_snprintf(nByte+20, zSql, "INSERT INTO %s VALUES(?", zFullTabName);
+ if( zSchema ){
+ sqlite3_snprintf(nByte+20, zSql, "INSERT INTO \"%w\".\"%w\" VALUES(?",
+ zSchema, zTable);
+ }else{
+ sqlite3_snprintf(nByte+20, zSql, "INSERT INTO \"%w\" VALUES(?", zTable);
+ }
j = strlen30(zSql);
for(i=1; i<nCol; i++){
zSql[j++] = ',';
@@ -25776,13 +25797,15 @@
oputf("Insert using: %s\n", zSql);
}
rc = sqlite3_prepare_v2(p->db, zSql, -1, &pStmt, 0);
+ sqlite3_free(zSql);
+ zSql = 0;
if( rc ){
eputf("Error: %s\n", sqlite3_errmsg(p->db));
if (pStmt) sqlite3_finalize(pStmt);
- goto import_fail;
+ import_cleanup(&sCtx);
+ rc = 1;
+ goto meta_command_exit;
}
- sqlite3_free(zSql);
- sqlite3_free(zFullTabName);
needCommit = sqlite3_get_autocommit(p->db);
if( needCommit ) sqlite3_exec(p->db, "BEGIN", 0, 0, 0);
do{
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/sqlite-amalgamation-3450100/sqlite3.c new/sqlite-amalgamation-3450200/sqlite3.c
--- old/sqlite-amalgamation-3450100/sqlite3.c 2024-01-30 17:24:03.000000000 +0100
+++ new/sqlite-amalgamation-3450200/sqlite3.c 2024-03-12 12:23:08.000000000 +0100
@@ -1,6 +1,6 @@
/******************************************************************************
** This file is an amalgamation of many separate C source files from SQLite
-** version 3.45.1. By combining all the individual C code files into this
+** version 3.45.2. By combining all the individual C code files into this
** single large file, the entire code can be compiled as a single translation
** unit. This allows many compilers to do optimizations that would not be
** possible if the files were compiled separately. Performance improvements
@@ -18,7 +18,7 @@
** separate file. This file contains only code for the core SQLite library.
**
** The content in this amalgamation comes from Fossil check-in
-** e876e51a0ed5c5b3126f52e532044363a014.
+** d8cd6d49b46a395b13955387d05e9e1a2a47.
*/
#define SQLITE_CORE 1
#define SQLITE_AMALGAMATION 1
@@ -459,9 +459,9 @@
** [sqlite3_libversion_number()], [sqlite3_sourceid()],
** [sqlite_version()] and [sqlite_source_id()].
*/
-#define SQLITE_VERSION "3.45.1"
-#define SQLITE_VERSION_NUMBER 3045001
-#define SQLITE_SOURCE_ID "2024-01-30 16:01:20 e876e51a0ed5c5b3126f52e532044363a014bc594cfefa87ffb5b82257cc467a"
+#define SQLITE_VERSION "3.45.2"
+#define SQLITE_VERSION_NUMBER 3045002
+#define SQLITE_SOURCE_ID "2024-03-12 11:06:23 d8cd6d49b46a395b13955387d05e9e1a2a47e54fb99f3c9b59835bbefad6af77"
/*
** CAPI3REF: Run-Time Library Version Numbers
@@ -733,6 +733,8 @@
** the 1st parameter to sqlite3_exec() while sqlite3_exec() is running.
** <li> The application must not modify the SQL statement text passed into
** the 2nd parameter of sqlite3_exec() while sqlite3_exec() is running.
+** <li> The application must not dereference the arrays or string pointers
+** passed as the 3rd and 4th callback parameters after it returns.
** </ul>
*/
SQLITE_API int sqlite3_exec(
@@ -15097,6 +15099,7 @@
** 0x00010000 Beginning of DELETE/INSERT/UPDATE processing
** 0x00020000 Transform DISTINCT into GROUP BY
** 0x00040000 SELECT tree dump after all code has been generated
+** 0x00080000 NOT NULL strength reduction
*/
/*
@@ -19346,6 +19349,7 @@
#define NC_InAggFunc 0x020000 /* True if analyzing arguments to an agg func */
#define NC_FromDDL 0x040000 /* SQL text comes from sqlite_schema */
#define NC_NoSelect 0x080000 /* Do not descend into sub-selects */
+#define NC_Where 0x100000 /* Processing WHERE clause of a SELECT */
#define NC_OrderAgg 0x8000000 /* Has an aggregate other than count/min/max */
/*
@@ -19369,6 +19373,7 @@
Expr *pUpsertWhere; /* WHERE clause for the ON CONFLICT UPDATE */
Upsert *pNextUpsert; /* Next ON CONFLICT clause in the list */
u8 isDoUpdate; /* True for DO UPDATE. False for DO NOTHING */
+ u8 isDup; /* True if 2nd or later with same pUpsertIdx */
/* Above this point is the parse tree for the ON CONFLICT clauses.
** The next group of fields stores intermediate data. */
void *pToFree; /* Free memory when deleting the Upsert object */
@@ -21444,7 +21449,7 @@
SQLITE_PRIVATE Upsert *sqlite3UpsertNew(sqlite3*,ExprList*,Expr*,ExprList*,Expr*,Upsert*);
SQLITE_PRIVATE void sqlite3UpsertDelete(sqlite3*,Upsert*);
SQLITE_PRIVATE Upsert *sqlite3UpsertDup(sqlite3*,Upsert*);
-SQLITE_PRIVATE int sqlite3UpsertAnalyzeTarget(Parse*,SrcList*,Upsert*);
+SQLITE_PRIVATE int sqlite3UpsertAnalyzeTarget(Parse*,SrcList*,Upsert*,Upsert*);
SQLITE_PRIVATE void sqlite3UpsertDoUpdate(Parse*,Upsert*,Table*,Index*,int);
SQLITE_PRIVATE Upsert *sqlite3UpsertOfIndex(Upsert*,Index*);
SQLITE_PRIVATE int sqlite3UpsertNextIsIPK(Upsert*);
@@ -31309,6 +31314,7 @@
if( xtype==etFLOAT ){
iRound = -precision;
}else if( xtype==etGENERIC ){
+ if( precision==0 ) precision = 1;
iRound = precision;
}else{
iRound = precision+1;
@@ -35199,6 +35205,9 @@
u64 s2;
rr[0] = (double)s;
s2 = (u64)rr[0];
+#if defined(_MSC_VER) && _MSC_VER<1700
+ if( s2==0x8000000000000000LL ){ s2 = 2*(u64)(0.5*rr[0]); }
+#endif
rr[1] = s>=s2 ? (double)(s - s2) : -(double)(s2 - s);
if( e>0 ){
while( e>=100 ){
@@ -35641,7 +35650,7 @@
assert( p->n>0 );
assert( p->n<sizeof(p->zBuf) );
p->iDP = p->n + exp;
- if( iRound<0 ){
+ if( iRound<=0 ){
iRound = p->iDP - iRound;
if( iRound==0 && p->zBuf[i+1]>='5' ){
iRound = 1;
@@ -53262,6 +53271,14 @@
pOut = 0;
}else{
sz = sqlite3_column_int64(pStmt, 0)*szPage;
+ if( sz==0 ){
+ sqlite3_reset(pStmt);
+ sqlite3_exec(db, "BEGIN IMMEDIATE; COMMIT;", 0, 0, 0);
+ rc = sqlite3_step(pStmt);
+ if( rc==SQLITE_ROW ){
+ sz = sqlite3_column_int64(pStmt, 0)*szPage;
+ }
+ }
if( piSize ) *piSize = sz;
if( mFlags & SQLITE_SERIALIZE_NOCOPY ){
pOut = 0;
@@ -77088,7 +77105,10 @@
n = nHeader + nPayload;
testcase( n==3 );
testcase( n==4 );
- if( n<4 ) n = 4;
+ if( n<4 ){
+ n = 4;
+ pPayload[nPayload] = 0;
+ }
*pnSize = n;
assert( nSrc<=nPayload );
testcase( nSrc<nPayload );
@@ -79534,7 +79554,10 @@
if( flags & BTREE_PREFORMAT ){
rc = SQLITE_OK;
szNew = p->pBt->nPreformatSize;
- if( szNew<4 ) szNew = 4;
+ if( szNew<4 ){
+ szNew = 4;
+ newCell[3] = 0;
+ }
if( ISAUTOVACUUM(p->pBt) && szNew>pPage->maxLocal ){
CellInfo info;
pPage->xParseCell(pPage, newCell, &info);
@@ -88379,6 +88402,23 @@
pMem->flags = IsNaN(x) ? MEM_Null : MEM_Real;
}
}
+static int serialGet7(
+ const unsigned char *buf, /* Buffer to deserialize from */
+ Mem *pMem /* Memory cell to write value into */
+){
+ u64 x = FOUR_BYTE_UINT(buf);
+ u32 y = FOUR_BYTE_UINT(buf+4);
+ x = (x<<32) + y;
+ assert( sizeof(x)==8 && sizeof(pMem->u.r)==8 );
+ swapMixedEndianFloat(x);
+ memcpy(&pMem->u.r, &x, sizeof(x));
+ if( IsNaN(x) ){
+ pMem->flags = MEM_Null;
+ return 1;
+ }
+ pMem->flags = MEM_Real;
+ return 0;
+}
SQLITE_PRIVATE void sqlite3VdbeSerialGet(
const unsigned char *buf, /* Buffer to deserialize from */
u32 serial_type, /* Serial type to deserialize */
@@ -89058,7 +89098,7 @@
}else if( serial_type==0 ){
rc = -1;
}else if( serial_type==7 ){
- sqlite3VdbeSerialGet(&aKey1[d1], serial_type, &mem1);
+ serialGet7(&aKey1[d1], &mem1);
rc = -sqlite3IntFloatCompare(pRhs->u.i, mem1.u.r);
}else{
i64 lhs = vdbeRecordDecodeInt(serial_type, &aKey1[d1]);
@@ -89083,14 +89123,18 @@
}else if( serial_type==0 ){
rc = -1;
}else{
- sqlite3VdbeSerialGet(&aKey1[d1], serial_type, &mem1);
if( serial_type==7 ){
- if( mem1.u.r<pRhs->u.r ){
+ if( serialGet7(&aKey1[d1], &mem1) ){
+ rc = -1; /* mem1 is a NaN */
+ }else if( mem1.u.r<pRhs->u.r ){
rc = -1;
}else if( mem1.u.r>pRhs->u.r ){
rc = +1;
+ }else{
+ assert( rc==0 );
}
}else{
+ sqlite3VdbeSerialGet(&aKey1[d1], serial_type, &mem1);
rc = sqlite3IntFloatCompare(mem1.u.i, pRhs->u.r);
}
}
@@ -89160,7 +89204,14 @@
/* RHS is null */
else{
serial_type = aKey1[idx1];
- rc = (serial_type!=0 && serial_type!=10);
+ if( serial_type==0
+ || serial_type==10
+ || (serial_type==7 && serialGet7(&aKey1[d1], &mem1)!=0)
+ ){
+ assert( rc==0 );
+ }else{
+ rc = 1;
+ }
}
if( rc!=0 ){
@@ -94858,7 +94909,9 @@
}
}
}else if( affinity==SQLITE_AFF_TEXT && ((flags1 | flags3) & MEM_Str)!=0 ){
- if( (flags1 & MEM_Str)==0 && (flags1&(MEM_Int|MEM_Real|MEM_IntReal))!=0 ){
+ if( (flags1 & MEM_Str)!=0 ){
+ pIn1->flags &= ~(MEM_Int|MEM_Real|MEM_IntReal);
+ }else if( (flags1&(MEM_Int|MEM_Real|MEM_IntReal))!=0 ){
testcase( pIn1->flags & MEM_Int );
testcase( pIn1->flags & MEM_Real );
testcase( pIn1->flags & MEM_IntReal );
@@ -94867,7 +94920,9 @@
flags1 = (pIn1->flags & ~MEM_TypeMask) | (flags1 & MEM_TypeMask);
if( NEVER(pIn1==pIn3) ) flags3 = flags1 | MEM_Str;
}
- if( (flags3 & MEM_Str)==0 && (flags3&(MEM_Int|MEM_Real|MEM_IntReal))!=0 ){
+ if( (flags3 & MEM_Str)!=0 ){
+ pIn3->flags &= ~(MEM_Int|MEM_Real|MEM_IntReal);
+ }else if( (flags3&(MEM_Int|MEM_Real|MEM_IntReal))!=0 ){
testcase( pIn3->flags & MEM_Int );
testcase( pIn3->flags & MEM_Real );
testcase( pIn3->flags & MEM_IntReal );
@@ -106212,6 +106267,8 @@
assert( iCol>=0 && iCol<pEList->nExpr );
pOrig = pEList->a[iCol].pExpr;
assert( pOrig!=0 );
+ assert( !ExprHasProperty(pExpr, EP_Reduced|EP_TokenOnly) );
+ if( pExpr->pAggInfo ) return;
db = pParse->db;
pDup = sqlite3ExprDup(db, pOrig, 0);
if( db->mallocFailed ){
@@ -107097,6 +107154,19 @@
** resolved. This prevents "column" from being counted as having been
** referenced, which might prevent a SELECT from being erroneously
** marked as correlated.
+ **
+ ** 2024-03-28: Beware of aggregates. A bare column of aggregated table
+ ** can still evaluate to NULL even though it is marked as NOT NULL.
+ ** Example:
+ **
+ ** CREATE TABLE t1(a INT NOT NULL);
+ ** SELECT a, a IS NULL, a IS NOT NULL, count(*) FROM t1;
+ **
+ ** The "a IS NULL" and "a IS NOT NULL" expressions cannot be optimized
+ ** here because at the time this case is hit, we do not yet know whether
+ ** or not t1 is being aggregated. We have to assume the worst and omit
+ ** the optimization. The only time it is safe to apply this optimization
+ ** is within the WHERE clause.
*/
case TK_NOTNULL:
case TK_ISNULL: {
@@ -107107,19 +107177,36 @@
anRef[i] = p->nRef;
}
sqlite3WalkExpr(pWalker, pExpr->pLeft);
- if( 0==sqlite3ExprCanBeNull(pExpr->pLeft) && !IN_RENAME_OBJECT ){
- testcase( ExprHasProperty(pExpr, EP_OuterON) );
- assert( !ExprHasProperty(pExpr, EP_IntValue) );
- pExpr->u.iValue = (pExpr->op==TK_NOTNULL);
- pExpr->flags |= EP_IntValue;
- pExpr->op = TK_INTEGER;
+ if( IN_RENAME_OBJECT ) return WRC_Prune;
+ if( sqlite3ExprCanBeNull(pExpr->pLeft) ){
+ /* The expression can be NULL. So the optimization does not apply */
+ return WRC_Prune;
+ }
- for(i=0, p=pNC; p && i<ArraySize(anRef); p=p->pNext, i++){
- p->nRef = anRef[i];
+ for(i=0, p=pNC; p; p=p->pNext, i++){
+ if( (p->ncFlags & NC_Where)==0 ){
+ return WRC_Prune; /* Not in a WHERE clause. Unsafe to optimize. */
}
- sqlite3ExprDelete(pParse->db, pExpr->pLeft);
- pExpr->pLeft = 0;
}
+ testcase( ExprHasProperty(pExpr, EP_OuterON) );
+ assert( !ExprHasProperty(pExpr, EP_IntValue) );
+#if TREETRACE_ENABLED
+ if( sqlite3TreeTrace & 0x80000 ){
+ sqlite3DebugPrintf(
+ "NOT NULL strength reduction converts the following to %d:\n",
+ pExpr->op==TK_NOTNULL
+ );
+ sqlite3ShowExpr(pExpr);
+ }
+#endif /* TREETRACE_ENABLED */
+ pExpr->u.iValue = (pExpr->op==TK_NOTNULL);
+ pExpr->flags |= EP_IntValue;
+ pExpr->op = TK_INTEGER;
+ for(i=0, p=pNC; p && i<ArraySize(anRef); p=p->pNext, i++){
+ p->nRef = anRef[i];
+ }
+ sqlite3ExprDelete(pParse->db, pExpr->pLeft);
+ pExpr->pLeft = 0;
return WRC_Prune;
}
@@ -108019,7 +108106,9 @@
}
if( sqlite3ResolveExprNames(&sNC, p->pHaving) ) return WRC_Abort;
}
+ sNC.ncFlags |= NC_Where;
if( sqlite3ResolveExprNames(&sNC, p->pWhere) ) return WRC_Abort;
+ sNC.ncFlags &= ~NC_Where;
/* Resolve names in table-valued-function arguments */
for(i=0; i<p->pSrc->nSrc; i++){
@@ -128947,13 +129036,13 @@
double r1, r2;
const char *zVal;
r1 = sqlite3_value_double(pValue);
- sqlite3_str_appendf(pStr, "%!.15g", r1);
+ sqlite3_str_appendf(pStr, "%!0.15g", r1);
zVal = sqlite3_str_value(pStr);
if( zVal ){
sqlite3AtoF(zVal, &r2, pStr->nChar, SQLITE_UTF8);
if( r1!=r2 ){
sqlite3_str_reset(pStr);
- sqlite3_str_appendf(pStr, "%!.20e", r1);
+ sqlite3_str_appendf(pStr, "%!0.20e", r1);
}
}
break;
@@ -129255,7 +129344,7 @@
}
if( zPattern[0]==0 ){
assert( sqlite3_value_type(argv[1])!=SQLITE_NULL );
- sqlite3_result_value(context, argv[0]);
+ sqlite3_result_text(context, (const char*)zStr, nStr, SQLITE_TRANSIENT);
return;
}
nPattern = sqlite3_value_bytes(argv[1]);
@@ -133175,7 +133264,7 @@
pNx->iDataCur = iDataCur;
pNx->iIdxCur = iIdxCur;
if( pNx->pUpsertTarget ){
- if( sqlite3UpsertAnalyzeTarget(pParse, pTabList, pNx) ){
+ if( sqlite3UpsertAnalyzeTarget(pParse, pTabList, pNx, pUpsert) ){
goto insert_cleanup;
}
}
@@ -139474,31 +139563,7 @@
int mxCol; /* Maximum non-virtual column number */
if( pObjTab && pObjTab!=pTab ) continue;
- if( !IsOrdinaryTable(pTab) ){
-#ifndef SQLITE_OMIT_VIRTUALTABLE
- sqlite3_vtab *pVTab;
- int a1;
- if( !IsVirtual(pTab) ) continue;
- if( pTab->nCol<=0 ){
- const char *zMod = pTab->u.vtab.azArg[0];
- if( sqlite3HashFind(&db->aModule, zMod)==0 ) continue;
- }
- sqlite3ViewGetColumnNames(pParse, pTab);
- if( pTab->u.vtab.p==0 ) continue;
- pVTab = pTab->u.vtab.p->pVtab;
- if( NEVER(pVTab==0) ) continue;
- if( NEVER(pVTab->pModule==0) ) continue;
- if( pVTab->pModule->iVersion<4 ) continue;
- if( pVTab->pModule->xIntegrity==0 ) continue;
- sqlite3VdbeAddOp3(v, OP_VCheck, i, 3, isQuick);
- pTab->nTabRef++;
- sqlite3VdbeAppendP4(v, pTab, P4_TABLEREF);
- a1 = sqlite3VdbeAddOp1(v, OP_IsNull, 3); VdbeCoverage(v);
- integrityCheckResultRow(v);
- sqlite3VdbeJumpHere(v, a1);
-#endif
- continue;
- }
+ if( !IsOrdinaryTable(pTab) ) continue;
if( isQuick || HasRowid(pTab) ){
pPk = 0;
r2 = 0;
@@ -139633,6 +139698,7 @@
** is REAL, we have to load the actual data using OP_Column
** to reliably determine if the value is a NULL. */
sqlite3VdbeAddOp3(v, OP_Column, p1, p3, 3);
+ sqlite3ColumnDefault(v, pTab, j, 3);
jmp3 = sqlite3VdbeAddOp2(v, OP_NotNull, 3, labelOk);
VdbeCoverage(v);
}
@@ -139823,6 +139889,38 @@
}
}
}
+
+#ifndef SQLITE_OMIT_VIRTUALTABLE
+ /* Second pass to invoke the xIntegrity method on all virtual
+ ** tables.
+ */
+ for(x=sqliteHashFirst(pTbls); x; x=sqliteHashNext(x)){
+ Table *pTab = sqliteHashData(x);
+ sqlite3_vtab *pVTab;
+ int a1;
+ if( pObjTab && pObjTab!=pTab ) continue;
+ if( IsOrdinaryTable(pTab) ) continue;
+ if( !IsVirtual(pTab) ) continue;
+ if( pTab->nCol<=0 ){
+ const char *zMod = pTab->u.vtab.azArg[0];
+ if( sqlite3HashFind(&db->aModule, zMod)==0 ) continue;
+ }
+ sqlite3ViewGetColumnNames(pParse, pTab);
+ if( pTab->u.vtab.p==0 ) continue;
+ pVTab = pTab->u.vtab.p->pVtab;
+ if( NEVER(pVTab==0) ) continue;
+ if( NEVER(pVTab->pModule==0) ) continue;
+ if( pVTab->pModule->iVersion<4 ) continue;
+ if( pVTab->pModule->xIntegrity==0 ) continue;
+ sqlite3VdbeAddOp3(v, OP_VCheck, i, 3, isQuick);
+ pTab->nTabRef++;
+ sqlite3VdbeAppendP4(v, pTab, P4_TABLEREF);
+ a1 = sqlite3VdbeAddOp1(v, OP_IsNull, 3); VdbeCoverage(v);
+ integrityCheckResultRow(v);
+ sqlite3VdbeJumpHere(v, a1);
+ continue;
+ }
+#endif
}
{
static const int iLn = VDBE_OFFSET_LINENO(2);
@@ -153460,7 +153558,8 @@
SQLITE_PRIVATE int sqlite3UpsertAnalyzeTarget(
Parse *pParse, /* The parsing context */
SrcList *pTabList, /* Table into which we are inserting */
- Upsert *pUpsert /* The ON CONFLICT clauses */
+ Upsert *pUpsert, /* The ON CONFLICT clauses */
+ Upsert *pAll /* Complete list of all ON CONFLICT clauses */
){
Table *pTab; /* That table into which we are inserting */
int rc; /* Result code */
@@ -153563,6 +153662,14 @@
continue;
}
pUpsert->pUpsertIdx = pIdx;
+ if( sqlite3UpsertOfIndex(pAll,pIdx)!=pUpsert ){
+ /* Really this should be an error. The isDup ON CONFLICT clause will
+ ** never fire. But this problem was not discovered until three years
+ ** after multi-CONFLICT upsert was added, and so we silently ignore
+ ** the problem to prevent breaking applications that might actually
+ ** have redundant ON CONFLICT clauses. */
+ pUpsert->isDup = 1;
+ }
break;
}
if( pUpsert->pUpsertIdx==0 ){
@@ -153589,9 +153696,13 @@
Upsert *pNext;
if( NEVER(pUpsert==0) ) return 0;
pNext = pUpsert->pNextUpsert;
- if( pNext==0 ) return 1;
- if( pNext->pUpsertTarget==0 ) return 1;
- if( pNext->pUpsertIdx==0 ) return 1;
+ while( 1 /*exit-by-return*/ ){
+ if( pNext==0 ) return 1;
+ if( pNext->pUpsertTarget==0 ) return 1;
+ if( pNext->pUpsertIdx==0 ) return 1;
+ if( !pNext->isDup ) return 0;
+ pNext = pNext->pNextUpsert;
+ }
return 0;
}
@@ -204785,6 +204896,7 @@
case '[': {
/* Parse array */
iThis = pParse->nBlob;
+ assert( i<=(u32)pParse->nJson );
jsonBlobAppendNode(pParse, JSONB_ARRAY, pParse->nJson - i, 0);
iStart = pParse->nBlob;
if( pParse->oom ) return -1;
@@ -205183,6 +205295,10 @@
JsonParse px;
memset(&px, 0, sizeof(px));
jsonStringTerminate(pStr);
+ if( pStr->eErr ){
+ sqlite3_result_error_nomem(pStr->pCtx);
+ return;
+ }
px.zJson = pStr->zBuf;
px.nJson = pStr->nUsed;
px.db = sqlite3_context_db_handle(pStr->pCtx);
@@ -206508,8 +206624,9 @@
}
p->zJson = (char*)sqlite3_value_text(pArg);
p->nJson = sqlite3_value_bytes(pArg);
+ if( db->mallocFailed ) goto json_pfa_oom;
if( p->nJson==0 ) goto json_pfa_malformed;
- if( NEVER(p->zJson==0) ) goto json_pfa_oom;
+ assert( p->zJson!=0 );
if( jsonConvertTextToBlob(p, (flgs & JSON_KEEPERROR) ? 0 : ctx) ){
if( flgs & JSON_KEEPERROR ){
p->nErr = 1;
@@ -206675,10 +206792,10 @@
if( sz==0 && x<=JSONB_FALSE ){
sqlite3_str_append(pOut, "\n", 1);
}else{
- u32 i;
+ u32 j;
sqlite3_str_appendall(pOut, ": \"");
- for(i=iStart+n; i<iStart+n+sz; i++){
- u8 c = pParse->aBlob[i];
+ for(j=iStart+n; j<iStart+n+sz; j++){
+ u8 c = pParse->aBlob[j];
if( c<0x20 || c>=0x7f ) c = '.';
sqlite3_str_append(pOut, (char*)&c, 1);
}
@@ -208086,6 +208203,9 @@
case JEACH_VALUE: {
u32 i = jsonSkipLabel(p);
jsonReturnFromBlob(&p->sParse, i, ctx, 1);
+ if( (p->sParse.aBlob[i] & 0x0f)>=JSONB_ARRAY ){
+ sqlite3_result_subtype(ctx, JSON_SUBTYPE);
+ }
break;
}
case JEACH_TYPE: {
@@ -208132,9 +208252,9 @@
case JEACH_JSON: {
if( p->sParse.zJson==0 ){
sqlite3_result_blob(ctx, p->sParse.aBlob, p->sParse.nBlob,
- SQLITE_STATIC);
+ SQLITE_TRANSIENT);
}else{
- sqlite3_result_text(ctx, p->sParse.zJson, -1, SQLITE_STATIC);
+ sqlite3_result_text(ctx, p->sParse.zJson, -1, SQLITE_TRANSIENT);
}
break;
}
@@ -209160,11 +209280,9 @@
** Clear the Rtree.pNodeBlob object
*/
static void nodeBlobReset(Rtree *pRtree){
- if( pRtree->pNodeBlob && pRtree->inWrTrans==0 && pRtree->nCursor==0 ){
- sqlite3_blob *pBlob = pRtree->pNodeBlob;
- pRtree->pNodeBlob = 0;
- sqlite3_blob_close(pBlob);
- }
+ sqlite3_blob *pBlob = pRtree->pNodeBlob;
+ pRtree->pNodeBlob = 0;
+ sqlite3_blob_close(pBlob);
}
/*
@@ -209208,7 +209326,6 @@
&pRtree->pNodeBlob);
}
if( rc ){
- nodeBlobReset(pRtree);
*ppNode = 0;
/* If unable to open an sqlite3_blob on the desired row, that can only
** be because the shadow tables hold erroneous data. */
@@ -209268,6 +209385,7 @@
}
*ppNode = pNode;
}else{
+ nodeBlobReset(pRtree);
if( pNode ){
pRtree->nNodeRef--;
sqlite3_free(pNode);
@@ -209412,6 +209530,7 @@
int iCoord, /* Which coordinate to extract */
RtreeCoord *pCoord /* OUT: Space to write result to */
){
+ assert( iCell<NCELL(pNode) );
readCoord(&pNode->zData[12 + pRtree->nBytesPerCell*iCell + 4*iCoord], pCoord);
}
@@ -209601,7 +209720,9 @@
sqlite3_finalize(pCsr->pReadAux);
sqlite3_free(pCsr);
pRtree->nCursor--;
- nodeBlobReset(pRtree);
+ if( pRtree->nCursor==0 && pRtree->inWrTrans==0 ){
+ nodeBlobReset(pRtree);
+ }
return SQLITE_OK;
}
@@ -210186,7 +210307,11 @@
int rc = SQLITE_OK;
RtreeNode *pNode = rtreeNodeOfFirstSearchPoint(pCsr, &rc);
if( rc==SQLITE_OK && ALWAYS(p) ){
- *pRowid = nodeGetRowid(RTREE_OF_CURSOR(pCsr), pNode, p->iCell);
+ if( p->iCell>=NCELL(pNode) ){
+ rc = SQLITE_ABORT;
+ }else{
+ *pRowid = nodeGetRowid(RTREE_OF_CURSOR(pCsr), pNode, p->iCell);
+ }
}
return rc;
}
@@ -210204,6 +210329,7 @@
if( rc ) return rc;
if( NEVER(p==0) ) return SQLITE_OK;
+ if( p->iCell>=NCELL(pNode) ) return SQLITE_ABORT;
if( i==0 ){
sqlite3_result_int64(ctx, nodeGetRowid(pRtree, pNode, p->iCell));
}else if( i<=pRtree->nDim2 ){
@@ -211685,8 +211811,7 @@
*/
static int rtreeBeginTransaction(sqlite3_vtab *pVtab){
Rtree *pRtree = (Rtree *)pVtab;
- assert( pRtree->inWrTrans==0 );
- pRtree->inWrTrans++;
+ pRtree->inWrTrans = 1;
return SQLITE_OK;
}
@@ -211700,6 +211825,9 @@
nodeBlobReset(pRtree);
return SQLITE_OK;
}
+static int rtreeRollback(sqlite3_vtab *pVtab){
+ return rtreeEndTransaction(pVtab);
+}
/*
** The xRename method for rtree module virtual tables.
@@ -211818,7 +211946,7 @@
rtreeBeginTransaction, /* xBegin - begin transaction */
rtreeEndTransaction, /* xSync - sync transaction */
rtreeEndTransaction, /* xCommit - commit transaction */
- rtreeEndTransaction, /* xRollback - rollback transaction */
+ rtreeRollback, /* xRollback - rollback transaction */
0, /* xFindFunction - function overloading */
rtreeRename, /* xRename - rename the table */
rtreeSavepoint, /* xSavepoint */
@@ -245377,23 +245505,26 @@
static void fts5TokendataIterNext(Fts5Iter *pIter, int bFrom, i64 iFrom){
int ii;
Fts5TokenDataIter *pT = pIter->pTokenDataIter;
+ Fts5Index *pIndex = pIter->pIndex;
for(ii=0; ii<pT->nIter; ii++){
Fts5Iter *p = pT->apIter[ii];
if( p->base.bEof==0
&& (p->base.iRowid==pIter->base.iRowid || (bFrom && p->base.iRowid<iFrom))
){
- fts5MultiIterNext(p->pIndex, p, bFrom, iFrom);
+ fts5MultiIterNext(pIndex, p, bFrom, iFrom);
while( bFrom && p->base.bEof==0
&& p->base.iRowid<iFrom
- && p->pIndex->rc==SQLITE_OK
+ && pIndex->rc==SQLITE_OK
){
- fts5MultiIterNext(p->pIndex, p, 0, 0);
+ fts5MultiIterNext(pIndex, p, 0, 0);
}
}
}
- fts5IterSetOutputsTokendata(pIter);
+ if( pIndex->rc==SQLITE_OK ){
+ fts5IterSetOutputsTokendata(pIter);
+ }
}
/*
@@ -250547,7 +250678,7 @@
){
assert( nArg==0 );
UNUSED_PARAM2(nArg, apUnused);
- sqlite3_result_text(pCtx, "fts5: 2024-01-30 16:01:20 e876e51a0ed5c5b3126f52e532044363a014bc594cfefa87ffb5b82257cc467a", -1, SQLITE_TRANSIENT);
+ sqlite3_result_text(pCtx, "fts5: 2024-03-12 11:06:23 d8cd6d49b46a395b13955387d05e9e1a2a47e54fb99f3c9b59835bbefad6af77", -1, SQLITE_TRANSIENT);
}
/*
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/sqlite-amalgamation-3450100/sqlite3.h new/sqlite-amalgamation-3450200/sqlite3.h
--- old/sqlite-amalgamation-3450100/sqlite3.h 2024-01-30 17:24:03.000000000 +0100
+++ new/sqlite-amalgamation-3450200/sqlite3.h 2024-03-12 12:23:08.000000000 +0100
@@ -146,9 +146,9 @@
** [sqlite3_libversion_number()], [sqlite3_sourceid()],
** [sqlite_version()] and [sqlite_source_id()].
*/
-#define SQLITE_VERSION "3.45.1"
-#define SQLITE_VERSION_NUMBER 3045001
-#define SQLITE_SOURCE_ID "2024-01-30 16:01:20 e876e51a0ed5c5b3126f52e532044363a014bc594cfefa87ffb5b82257cc467a"
+#define SQLITE_VERSION "3.45.2"
+#define SQLITE_VERSION_NUMBER 3045002
+#define SQLITE_SOURCE_ID "2024-03-12 11:06:23 d8cd6d49b46a395b13955387d05e9e1a2a47e54fb99f3c9b59835bbefad6af77"
/*
** CAPI3REF: Run-Time Library Version Numbers
@@ -420,6 +420,8 @@
** the 1st parameter to sqlite3_exec() while sqlite3_exec() is running.
** <li> The application must not modify the SQL statement text passed into
** the 2nd parameter of sqlite3_exec() while sqlite3_exec() is running.
+** <li> The application must not dereference the arrays or string pointers
+** passed as the 3rd and 4th callback parameters after it returns.
** </ul>
*/
SQLITE_API int sqlite3_exec(
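Given the version bump in the hunk above, applications can confirm at run time which SQLite library they are actually linked against. A quick check from Python's bundled binding (the values shown are build-dependent, so no specific version is asserted):

```python
import sqlite3

# Version of the SQLite library the interpreter is linked against,
# as a display string and as a comparable tuple.
print(sqlite3.sqlite_version)        # e.g. "3.45.2" (varies by build)
print(sqlite3.sqlite_version_info)   # e.g. (3, 45, 2)

# The tuple form is what you want for feature gating:
has_modern_sqlite = sqlite3.sqlite_version_info >= (3, 45, 0)
```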
++++++ sqlite-jdbc-3.45.1.0.tar.gz -> sqlite-jdbc-3.45.2.0.tar.gz ++++++
/work/SRC/openSUSE:Factory/sqlite-jdbc/sqlite-jdbc-3.45.1.0.tar.gz /work/SRC/openSUSE:Factory/.sqlite-jdbc.new.1905/sqlite-jdbc-3.45.2.0.tar.gz differ: char 13, line 1
Script 'mail_helper' called by obssrc
Hello community,
here is the log from the commit of package mediainfo for openSUSE:Factory checked in at 2024-03-29 13:10:23
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/mediainfo (Old)
and /work/SRC/openSUSE:Factory/.mediainfo.new.1905 (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Package is "mediainfo"
Fri Mar 29 13:10:23 2024 rev:37 rq:1163404 version:24.03
Changes:
--------
--- /work/SRC/openSUSE:Factory/mediainfo/mediainfo.changes 2024-02-02 15:47:57.258154604 +0100
+++ /work/SRC/openSUSE:Factory/.mediainfo.new.1905/mediainfo.changes 2024-03-29 13:13:11.772066124 +0100
@@ -1,0 +2,16 @@
+Thu Mar 28 21:13:16 UTC 2024 - Maxime Gervais <maxime@mediaarea.net>
+
+- Update to version 24.03
+ Added features:
+ * ADM: ADM v3, including profile element, support
+ * ADM: conformance checks on AdvSS Emission profile
+ * Dolby E: display more AC-3 metadata items
+ * MOV/MP4: parsing of rtmd (real time metadata) tracks
+ * PNG: packing kind (linear or indexed)
+ * WAV: support of 4+ GiB axml (useful for huge ADM content)
+ Fixed bugs:
+ * MPEG-H: fix uninitialized values leading to random behavior
+ * PDF: fix crash with corrupted files
+ * MOV/MP4: fix bit depth info for some PCM tracks with pcmC box
+
+ -------------------------------------------------------------------
Old:
----
mediainfo_24.01.tar.xz
New:
----
mediainfo_24.03.tar.xz
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Other differences:
------------------
++++++ mediainfo.spec ++++++
--- /var/tmp/diff_new_pack.EqBY4O/_old 2024-03-29 13:13:12.240083317 +0100
+++ /var/tmp/diff_new_pack.EqBY4O/_new 2024-03-29 13:13:12.240083317 +0100
@@ -18,7 +18,7 @@
Name: mediainfo
-Version: 24.01
+Version: 24.03
Release: 0
Summary: Audio/video file technical and tag information utility
License: GPL-2.0-or-later
++++++ mediainfo_24.01.tar.xz -> mediainfo_24.03.tar.xz ++++++
++++ 13119 lines of diff (skipped)
Hello community,
here is the log from the commit of package libmediainfo for openSUSE:Factory checked in at 2024-03-29 13:10:23
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/libmediainfo (Old)
and /work/SRC/openSUSE:Factory/.libmediainfo.new.1905 (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Package is "libmediainfo"
Fri Mar 29 13:10:23 2024 rev:35 rq:1163403 version:24.03
Changes:
--------
--- /work/SRC/openSUSE:Factory/libmediainfo/libmediainfo.changes 2024-02-02 15:47:55.694097749 +0100
+++ /work/SRC/openSUSE:Factory/.libmediainfo.new.1905/libmediainfo.changes 2024-03-29 13:13:09.559984864 +0100
@@ -1,0 +2,16 @@
+Thu Mar 28 21:13:16 UTC 2024 - Maxime Gervais <maxime@mediaarea.net>
+
+- Update to version 24.03
+ Added features:
+ * ADM: ADM v3, including profile element, support
+ * ADM: conformance checks on AdvSS Emission profile
+ * Dolby E: display more AC-3 metadata items
+ * MOV/MP4: parsing of rtmd (real time metadata) tracks
+ * PNG: packing kind (linear or indexed)
+ * WAV: support of 4+ GiB axml (useful for huge ADM content)
+ Fixed bugs:
+ * MPEG-H: fix uninitialized values leading to random behavior
+ * PDF: fix crash with corrupted files
+ * MOV/MP4: fix bit depth info for some PCM tracks with pcmC box
+
+-------------------------------------------------------------------
Old:
----
libmediainfo_24.01.tar.xz
New:
----
libmediainfo_24.03.tar.xz
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Other differences:
------------------
++++++ libmediainfo.spec ++++++
--- /var/tmp/diff_new_pack.mfYLTa/_old 2024-03-29 13:13:10.020001762 +0100
+++ /var/tmp/diff_new_pack.mfYLTa/_new 2024-03-29 13:13:10.024001909 +0100
@@ -19,7 +19,7 @@
%define sover 0
Name: libmediainfo
-Version: 24.01
+Version: 24.03
Release: 0
Summary: Library for supplying technical and tag information about a video or audio file
License: BSD-2-Clause
++++++ libmediainfo_24.01.tar.xz -> libmediainfo_24.03.tar.xz ++++++
++++ 9907 lines of diff (skipped)
Hello community,
here is the log from the commit of package esbuild for openSUSE:Factory checked in at 2024-03-29 13:10:21
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/esbuild (Old)
and /work/SRC/openSUSE:Factory/.esbuild.new.1905 (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Package is "esbuild"
Fri Mar 29 13:10:21 2024 rev:8 rq:1163400 version:0.20.2
Changes:
--------
--- /work/SRC/openSUSE:Factory/esbuild/esbuild.changes 2024-02-12 18:55:08.500851075 +0100
+++ /work/SRC/openSUSE:Factory/.esbuild.new.1905/esbuild.changes 2024-03-29 13:13:08.679952535 +0100
@@ -1,0 +2,16 @@
+Thu Mar 28 21:20:38 UTC 2024 - Avindra Goolcharan <avindra@opensuse.org>
+
+- roll 0.20.0...0.20.2
+ * Support TypeScript experimental decorators on `abstract` class fields #3684
+ * constant folding for JavaScript inequality operators #3645
+ * Fix cross-platform non-determinism with CSS color
+ space transformations #3650
+ * Fix a bug with the CSS nesting transform for older browsers
+ * Work around issues with Deno 1.31+ #3682
+ * Handle Yarn Plug'n'Play edge case with `tsconfig.json` #3698
+ * Empty enums should behave like an object literal #3657
+ * Improve dead code removal of `switch` statements #3659
+ * JSON loader now preserves `__proto__` properties #3700
+ * Other bug fixes
+
+-------------------------------------------------------------------
Old:
----
esbuild-0.20.0.tar.gz
New:
----
esbuild-0.20.2.tar.gz
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Other differences:
------------------
++++++ esbuild.spec ++++++
--- /var/tmp/diff_new_pack.qezMY7/_old 2024-03-29 13:13:09.127968993 +0100
+++ /var/tmp/diff_new_pack.qezMY7/_new 2024-03-29 13:13:09.127968993 +0100
@@ -21,7 +21,7 @@
%global tag v%{version}
%global extractdir0 esbuild-%{version}
Name: esbuild
-Version: 0.20.0
+Version: 0.20.2
Release: 0
Summary: A JavaScript bundler written for speed
License: MIT
++++++ esbuild-0.20.0.tar.gz -> esbuild-0.20.2.tar.gz ++++++
++++ 4001 lines of diff (skipped)
Hello community,
here is the log from the commit of package orthanc for openSUSE:Factory checked in at 2024-03-29 13:10:06
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/orthanc (Old)
and /work/SRC/openSUSE:Factory/.orthanc.new.1905 (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Package is "orthanc"
Fri Mar 29 13:10:06 2024 rev:33 rq:1163376 version:1.12.3
Changes:
--------
--- /work/SRC/openSUSE:Factory/orthanc/orthanc.changes 2024-03-20 21:17:16.128490786 +0100
+++ /work/SRC/openSUSE:Factory/.orthanc.new.1905/orthanc.changes 2024-03-29 13:11:15.571795947 +0100
@@ -1,0 +2,5 @@
+Thu Mar 21 16:31:18 UTC 2024 - Christophe Marin <christophe@krop.fr>
+
+- Update orthanc-source requirements to fix the orthanc-wsi build
+
+-------------------------------------------------------------------
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Other differences:
------------------
++++++ orthanc.spec ++++++
--- /var/tmp/diff_new_pack.Qc7q3q/_old 2024-03-29 13:11:16.147817118 +0100
+++ /var/tmp/diff_new_pack.Qc7q3q/_new 2024-03-29 13:11:16.151817265 +0100
@@ -126,6 +126,8 @@
%package source
Summary: This package includes the source files for Orthanc
Group: Development/Sources
+# DcmtkConfiguration.cmake looks for dicom.dic
+Requires: dcmtk
%description source
This package includes the source files for Orthanc. Use it in conjunction with the -devel package
Hello community,
here is the log from the commit of package python-tldextract for openSUSE:Factory checked in at 2024-03-29 13:10:05
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/python-tldextract (Old)
and /work/SRC/openSUSE:Factory/.python-tldextract.new.1905 (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Package is "python-tldextract"
Fri Mar 29 13:10:05 2024 rev:24 rq:1163368 version:5.1.2
Changes:
--------
--- /work/SRC/openSUSE:Factory/python-tldextract/python-tldextract.changes 2023-12-08 22:34:30.753428837 +0100
+++ /work/SRC/openSUSE:Factory/.python-tldextract.new.1905/python-tldextract.changes 2024-03-29 13:11:14.911771688 +0100
@@ -1,0 +2,10 @@
+Thu Mar 28 16:29:56 UTC 2024 - Mia Herkt <mia@0x0.st>
+
+- Update to 5.1.2:
+ * Remove socket.inet_pton, to fix platform-dependent IP parsing
+ #gh/john-kurkowski/tldextract#318
+ * Use non-capturing groups for IPv4 address detection, for a
+ slight speed boost
+ #gh/john-kurkowski/tldextract#323
+
+-------------------------------------------------------------------
Old:
----
tldextract-5.1.1.tar.gz
New:
----
tldextract-5.1.2.tar.gz
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Other differences:
------------------
++++++ python-tldextract.spec ++++++
--- /var/tmp/diff_new_pack.qX88x5/_old 2024-03-29 13:11:15.363788302 +0100
+++ /var/tmp/diff_new_pack.qX88x5/_new 2024-03-29 13:11:15.363788302 +0100
@@ -1,7 +1,7 @@
#
# spec file for package python-tldextract
#
-# Copyright (c) 2023 SUSE LLC
+# Copyright (c) 2024 SUSE LLC
#
# All modifications and additions to the file contributed by third parties
# remain the property of their copyright owners, unless otherwise agreed
@@ -19,7 +19,7 @@
%define oldpython python
%{?sle15_python_module_pythons}
Name: python-tldextract
-Version: 5.1.1
+Version: 5.1.2
Release: 0
Summary: Python module to separate the TLD of a URL
License: BSD-3-Clause
@@ -38,6 +38,7 @@
BuildRequires: %{python_module setuptools_scm}
BuildRequires: %{python_module setuptools}
BuildRequires: %{python_module six}
+BuildRequires: %{python_module syrupy}
BuildRequires: %{python_module wheel}
BuildRequires: fdupes
BuildRequires: python-rpm-macros
@@ -46,10 +47,9 @@
Requires: python-requests >= 2.1.0
Requires: python-requests-file >= 1.4
Requires(post): update-alternatives
-Requires(postun):update-alternatives
+Requires(postun): update-alternatives
Obsoletes: %{oldpython}-tldextract <= 2.0.1
BuildArch: noarch
-
%python_subpackages
%description
++++++ tldextract-5.1.1.tar.gz -> tldextract-5.1.2.tar.gz ++++++
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/tldextract-5.1.1/.github/workflows/ci.yml new/tldextract-5.1.2/.github/workflows/ci.yml
--- old/tldextract-5.1.1/.github/workflows/ci.yml 2023-11-13 21:07:39.000000000 +0100
+++ new/tldextract-5.1.2/.github/workflows/ci.yml 2024-03-16 01:08:14.000000000 +0100
@@ -1,5 +1,11 @@
name: build
-on: [push, pull_request]
+on:
+ pull_request: {}
+ push:
+ branches:
+ - "master"
+ tags-ignore:
+ - "**"
jobs:
test:
strategy:
@@ -14,6 +20,8 @@
{python-version: "3.11", toxenv: "py311"},
{python-version: "3.12", toxenv: "py312"},
{python-version: "pypy3.8", toxenv: "pypy38"},
+ {python-version: "pypy3.9", toxenv: "pypy39"},
+ {python-version: "pypy3.10", toxenv: "pypy310"},
]
include:
- os: ubuntu-latest
@@ -27,13 +35,13 @@
- name: Check out repository
uses: actions/checkout@v4
- name: Setup Python
- uses: actions/setup-python@v4
+ uses: actions/setup-python@v5
with:
python-version: ${{ matrix.language.python-version }}
+ check-latest: true
- name: Install Python requirements
run: |
- pip install --upgrade pip
- pip install --upgrade --editable '.[testing]'
+ pip install --upgrade tox
- name: Test
run: tox
env:
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/tldextract-5.1.1/CHANGELOG.md new/tldextract-5.1.2/CHANGELOG.md
--- old/tldextract-5.1.1/CHANGELOG.md 2023-11-17 04:39:20.000000000 +0100
+++ new/tldextract-5.1.2/CHANGELOG.md 2024-03-19 04:59:18.000000000 +0100
@@ -3,6 +3,16 @@
After upgrading, update your cache file by deleting it or via `tldextract
--update`.
+## 5.1.2 (2024-03-18)
+
+* Bugfixes
+ * Remove `socket.inet_pton`, to fix platform-dependent IP parsing ([#318](https://github.com/john-kurkowski/tldextract/issues/318))
+ * Use non-capturing groups for IPv4 address detection, for a slight speed boost ([#323](https://github.com/john-kurkowski/tldextract/issues/323))
+* Misc.
+ * Add CI for PyPy3.9 and PyPy3.10 ([#316](https://github.com/john-kurkowski/tldextract/issues/316))
+ * Add script to automate package release process ([#325](https://github.com/john-kurkowski/tldextract/issues/325))
+ * Update LICENSE copyright years
+
## 5.1.1 (2023-11-16)
* Bugfixes
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/tldextract-5.1.1/LICENSE new/tldextract-5.1.2/LICENSE
--- old/tldextract-5.1.1/LICENSE 2022-05-03 21:53:33.000000000 +0200
+++ new/tldextract-5.1.2/LICENSE 2024-03-19 04:42:37.000000000 +0100
@@ -1,6 +1,6 @@
BSD 3-Clause License
-Copyright (c) 2020, John Kurkowski
+Copyright (c) 2013-2024, John Kurkowski
All rights reserved.
Redistribution and use in source and binary forms, with or without
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/tldextract-5.1.1/PKG-INFO new/tldextract-5.1.2/PKG-INFO
--- old/tldextract-5.1.1/PKG-INFO 2023-11-17 04:43:35.332430400 +0100
+++ new/tldextract-5.1.2/PKG-INFO 2024-03-19 05:07:53.174141200 +0100
@@ -1,6 +1,6 @@
Metadata-Version: 2.1
Name: tldextract
-Version: 5.1.1
+Version: 5.1.2
Summary: Accurately separates a URL's subdomain, domain, and public suffix, using the Public Suffix List (PSL). By default, this includes the public ICANN TLDs and their exceptions. You can optionally support the Public Suffix List's private domains as well.
Author-email: John Kurkowski <john.kurkowski@gmail.com>
License: BSD-3-Clause
@@ -22,6 +22,9 @@
Requires-Dist: requests>=2.1.0
Requires-Dist: requests-file>=1.4
Requires-Dist: filelock>=3.0.8
+Provides-Extra: release
+Requires-Dist: build; extra == "release"
+Requires-Dist: twine; extra == "release"
Provides-Extra: testing
Requires-Dist: black; extra == "testing"
Requires-Dist: mypy; extra == "testing"
@@ -30,6 +33,7 @@
Requires-Dist: pytest-mock; extra == "testing"
Requires-Dist: responses; extra == "testing"
Requires-Dist: ruff; extra == "testing"
+Requires-Dist: syrupy; extra == "testing"
Requires-Dist: tox; extra == "testing"
Requires-Dist: types-filelock; extra == "testing"
Requires-Dist: types-requests; extra == "testing"
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/tldextract-5.1.1/pyproject.toml new/tldextract-5.1.2/pyproject.toml
--- old/tldextract-5.1.1/pyproject.toml 2023-11-13 21:07:39.000000000 +0100
+++ new/tldextract-5.1.2/pyproject.toml 2024-03-16 01:08:12.000000000 +0100
@@ -41,6 +41,10 @@
]
[project.optional-dependencies]
+release = [
+ "build",
+ "twine",
+]
testing = [
"black",
"mypy",
@@ -49,6 +53,7 @@
"pytest-mock",
"responses",
"ruff",
+ "syrupy",
"tox",
"types-filelock",
"types-requests",
@@ -79,12 +84,13 @@
version = {attr = "setuptools_scm.get_version"}
[tool.mypy]
+explicit_package_bases = true
strict = true
[tool.pytest.ini_options]
addopts = "--doctest-modules"
-[tool.ruff]
+[tool.ruff.lint]
select = [
"A",
"B",
@@ -101,5 +107,5 @@
"E501", # line too long; if Black does its job, not worried about the rare long line
]
-[tool.ruff.pydocstyle]
+[tool.ruff.lint.pydocstyle]
convention = "pep257"
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/tldextract-5.1.1/scripts/release.py new/tldextract-5.1.2/scripts/release.py
--- old/tldextract-5.1.1/scripts/release.py 1970-01-01 01:00:00.000000000 +0100
+++ new/tldextract-5.1.2/scripts/release.py 2024-03-16 01:07:21.000000000 +0100
@@ -0,0 +1,238 @@
+"""
+This script automates the release process for a Python package.
+
+It will:
+- Add a git tag for the given version.
+- Remove the previous dist folder.
+- Create a build.
+- Ask the user to verify the build.
+- Upload the build to PyPI.
+- Push all git tags to the remote.
+- Create a draft release on GitHub using the version notes in CHANGELOG.md.
+
+Prerequisites:
+ - This must be run from the root of the repository.
+ - The repo must have a clean git working tree.
+ - The user must have the GITHUB_TOKEN environment variable set to a valid GitHub personal access token.
+ - The user will need credentials for the PyPI repository, which the user will be prompted for during the upload step. The user will need to paste the token manually from a password manager or similar.
+ - The CHANGELOG.md file must already contain an entry for the version being released.
+ - Install requirements with: pip install --upgrade --editable '.[release]'
+
+"""
+
+from __future__ import annotations
+
+import os
+import re
+import subprocess
+import sys
+from pathlib import Path
+
+import requests
+
+
+def add_git_tag_for_version(version: str) -> None:
+ """Add a git tag for the given version."""
+ subprocess.run(["git", "tag", "-a", version, "-m", version], check=True)
+ print(f"Version {version} tag added successfully.")
+
+
+def remove_previous_dist() -> None:
+ """Check for dist folder, and if it exists, remove it."""
+ subprocess.run(["rm", "-rf", Path("dist")], check=True)
+ print("Previous dist folder removed successfully.")
+
+
+def create_build() -> None:
+ """Create a build."""
+ subprocess.run(["python", "-m", "build"], check=True)
+ print("Build created successfully.")
+
+
+def verify_build(is_test: str) -> None:
+ """Verify the build.
+
+ Print the archives in dist/ and ask the user to manually inspect and
+ confirm they contain the expected files, e.g. source files and test files.
+ """
+ build_files = os.listdir("dist")
+ if len(build_files) != 2:
+ print(
+ "WARNING: dist folder contains incorrect number of files.", file=sys.stderr
+ )
+ print("Contents of dist folder:")
+ subprocess.run(["ls", "-l", Path("dist")], check=True)
+ print("Contents of tar files in dist folder:")
+ for build_file in build_files:
+ subprocess.run(["tar", "tvf", Path("dist") / build_file], check=True)
+ confirmation = input("Does the build look correct? (y/n): ")
+ if confirmation == "y":
+ print("Build verified successfully.")
+ upload_build_to_pypi(is_test)
+ push_git_tags()
+ else:
+ raise Exception("Could not verify. Build was not uploaded.")
+
+
+def generate_github_release_notes_body(token: str, version: str) -> str:
+ """Generate and grab release notes URL from Github."""
+ response = requests.post(
+ "https://api.github.com/repos/john-kurkowski/tldextract/releases/generate-no…",
+ headers={
+ "Accept": "application/vnd.github+json",
+ "Authorization": f"Bearer {token}",
+ "X-GitHub-Api-Version": "2022-11-28",
+ },
+ json={"tag_name": version},
+ )
+
+ try:
+ response.raise_for_status()
+ except requests.exceptions.HTTPError as err:
+ print(
+ f"WARNING: Failed to generate release notes from Github: {err}",
+ file=sys.stderr,
+ )
+ return ""
+ return str(response.json()["body"])
+
+
+def get_release_notes_url(body: str) -> str:
+ """Parse the release notes content to get the changelog URL."""
+ url_pattern = re.compile(r"\*\*Full Changelog\*\*: (.*)$")
+ match = url_pattern.search(body)
+ if match:
+ return match.group(1)
+ else:
+ print(
+ "WARNING: Failed to parse release notes URL from GitHub response.",
+ file=sys.stderr,
+ )
+ return ""
+
+
+def get_changelog_release_notes(release_notes_url: str, version: str) -> str:
+ """Get the changelog release notes.
+
+ Uses a regex starting on a heading beginning with the version number
+ literal, and matching until the next heading. Using regex to match markup
+ is brittle. Consider a Markdown-parsing library instead.
+ """
+ with open("CHANGELOG.md") as file:
+ changelog_text = file.read()
+ pattern = re.compile(rf"## {re.escape(version)}[^\n]*(.*?)## ", re.DOTALL)
+ match = pattern.search(changelog_text)
+ if match:
+ return str(match.group(1)).strip()
+ else:
+ print(
+ f"WARNING: Failed to parse changelog release notes. Manually copy this version's notes from the CHANGELOG.md file to {release_notes_url}.",
+ file=sys.stderr,
+ )
+ return ""
+
+
+def create_release_notes_body(token: str, version: str) -> str:
+ """Compile the release notes."""
+ github_release_body = generate_github_release_notes_body(token, version)
+ release_notes_url = get_release_notes_url(github_release_body)
+ changelog_notes = get_changelog_release_notes(release_notes_url, version)
+ full_release_notes = f"{changelog_notes}\n\n**Full Changelog**: {release_notes_url}"
+ return full_release_notes
+
+
+def create_github_release_draft(token: str, version: str) -> None:
+ """Create a release on GitHub."""
+ release_body = create_release_notes_body(token, version)
+ response = requests.post(
+ "https://api.github.com/repos/john-kurkowski/tldextract/releases",
+ headers={
+ "Accept": "application/vnd.github+json",
+ "Authorization": f"Bearer {token}",
+ "X-GitHub-Api-Version": "2022-11-28",
+ },
+ json={
+ "tag_name": version,
+ "name": version,
+ "body": release_body,
+ "draft": True,
+ "prerelease": False,
+ },
+ )
+
+ try:
+ response.raise_for_status()
+ except requests.exceptions.HTTPError as err:
+ print(
+ f"WARNING: Failed to create release on Github: {err}",
+ file=sys.stderr,
+ )
+ return
+ print(f'Release created successfully: {response.json()["html_url"]}')
+
+
+def upload_build_to_pypi(is_test: str) -> None:
+ """Upload the build to PyPI."""
+ repository: list[str | Path] = (
+ [] if is_test == "n" else ["--repository", "testpypi"]
+ )
+ upload_command = ["twine", "upload", *repository, Path("dist") / "*"]
+ subprocess.run(
+ upload_command,
+ check=True,
+ )
+
+
+def push_git_tags() -> None:
+ """Push all git tags to the remote."""
+ subprocess.run(["git", "push", "--tags", "origin", "master"], check=True)
+
+
+def check_for_clean_working_tree() -> None:
+ """Check for a clean git working tree."""
+ git_status = subprocess.run(
+ ["git", "status", "--porcelain"], capture_output=True, text=True
+ )
+ if git_status.stdout:
+ print(
+ "Git working tree is not clean. Please commit or stash changes.",
+ file=sys.stderr,
+ )
+ sys.exit(1)
+
+
+def get_env_github_token() -> str:
+ """Check for the GITHUB_TOKEN environment variable."""
+ github_token = os.environ.get("GITHUB_TOKEN")
+ if not github_token:
+ print("GITHUB_TOKEN environment variable not set.", file=sys.stderr)
+ sys.exit(1)
+ return github_token
+
+
+def get_is_test_response() -> str:
+ """Ask the user if this is a test release."""
+ while True:
+ is_test = input("Is this a test release? (y/n): ")
+ if is_test in ["y", "n"]:
+ return is_test
+ else:
+ print("Invalid input. Please enter 'y' or 'n.'")
+
+
+def main() -> None:
+ """Run the main program."""
+ check_for_clean_working_tree()
+ github_token = get_env_github_token()
+ is_test = get_is_test_response()
+ version_number = input("Enter the version number: ")
+
+ add_git_tag_for_version(version_number)
+ remove_previous_dist()
+ create_build()
+ verify_build(is_test)
+ create_github_release_draft(github_token, version_number)
+
+
+if __name__ == "__main__":
+ main()
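The changelog-slicing regex in `get_changelog_release_notes()` above can be exercised in isolation. This is a standalone sketch of the same pattern (the sample changelog text here is invented for illustration; note that, as the script's own docstring warns, regex-over-Markdown is brittle, and this pattern will not match the oldest entry in the file because it requires a following `## ` heading):

```python
import re


def get_changelog_release_notes(changelog_text: str, version: str) -> str:
    # Same pattern as scripts/release.py: capture from the version heading
    # up to (but not including) the next "## " heading.
    pattern = re.compile(rf"## {re.escape(version)}[^\n]*(.*?)## ", re.DOTALL)
    match = pattern.search(changelog_text)
    return match.group(1).strip() if match else ""


sample = (
    "# Changelog\n\n"
    "## 5.1.2 (2024-03-18)\n\n"
    "* Bugfixes\n\n"
    "## 5.1.1 (2023-11-16)\n"
)
print(get_changelog_release_notes(sample, "5.1.2"))  # -> * Bugfixes
```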
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/tldextract-5.1.1/tests/__snapshots__/test_release.ambr new/tldextract-5.1.2/tests/__snapshots__/test_release.ambr
--- old/tldextract-5.1.1/tests/__snapshots__/test_release.ambr 1970-01-01 01:00:00.000000000 +0100
+++ new/tldextract-5.1.2/tests/__snapshots__/test_release.ambr 2024-03-16 01:07:21.000000000 +0100
@@ -0,0 +1,244 @@
+# serializer version: 1
+# name: test_happy_path
+ dict({
+ 'input': _CallList([
+ _Call(
+ '',
+ tuple(
+ 'Is this a test release? (y/n): ',
+ ),
+ dict({
+ }),
+ ),
+ _Call(
+ '',
+ tuple(
+ 'Enter the version number: ',
+ ),
+ dict({
+ }),
+ ),
+ _Call(
+ '',
+ tuple(
+ 'Does the build look correct? (y/n): ',
+ ),
+ dict({
+ }),
+ ),
+ ]),
+ 'listdir': _CallList([
+ _Call(
+ '',
+ tuple(
+ 'dist',
+ ),
+ dict({
+ }),
+ ),
+ ]),
+ 'requests': _CallList([
+ _Call(
+ '',
+ tuple(
+ 'https://api.github.com/repos/john-kurkowski/tldextract/releases/generate-no…',
+ ),
+ dict({
+ 'headers': dict({
+ 'Accept': 'application/vnd.github+json',
+ 'Authorization': 'Bearer fake-token',
+ 'X-GitHub-Api-Version': '2022-11-28',
+ }),
+ 'json': dict({
+ 'tag_name': '5.0.1',
+ }),
+ }),
+ ),
+ _Call(
+ '',
+ tuple(
+ 'https://api.github.com/repos/john-kurkowski/tldextract/releases',
+ ),
+ dict({
+ 'headers': dict({
+ 'Accept': 'application/vnd.github+json',
+ 'Authorization': 'Bearer fake-token',
+ 'X-GitHub-Api-Version': '2022-11-28',
+ }),
+ 'json': dict({
+ 'body': '''
+ * Bugfixes
+ * Indicate MD5 not used in a security context (FIPS compliance) ([#309](https://github.com/john-kurkowski/tldextract/issues/309))
+ * Misc.
+ * Increase typecheck aggression
+
+ **Full Changelog**: fake-body
+ ''',
+ 'draft': True,
+ 'name': '5.0.1',
+ 'prerelease': False,
+ 'tag_name': '5.0.1',
+ }),
+ }),
+ ),
+ ]),
+ 'subprocess': _CallList([
+ _Call(
+ '',
+ tuple(
+ list([
+ 'git',
+ 'status',
+ '--porcelain',
+ ]),
+ ),
+ dict({
+ 'capture_output': True,
+ 'text': True,
+ }),
+ ),
+ _Call(
+ '',
+ tuple(
+ list([
+ 'git',
+ 'tag',
+ '-a',
+ '5.0.1',
+ '-m',
+ '5.0.1',
+ ]),
+ ),
+ dict({
+ 'check': True,
+ }),
+ ),
+ _Call(
+ '',
+ tuple(
+ list([
+ 'rm',
+ '-rf',
+ PosixPath('dist'),
+ ]),
+ ),
+ dict({
+ 'check': True,
+ }),
+ ),
+ _Call(
+ '',
+ tuple(
+ list([
+ 'python',
+ '-m',
+ 'build',
+ ]),
+ ),
+ dict({
+ 'check': True,
+ }),
+ ),
+ _Call(
+ '',
+ tuple(
+ list([
+ 'ls',
+ '-l',
+ PosixPath('dist'),
+ ]),
+ ),
+ dict({
+ 'check': True,
+ }),
+ ),
+ _Call(
+ '',
+ tuple(
+ list([
+ 'tar',
+ 'tvf',
+ PosixPath('dist/archive1'),
+ ]),
+ ),
+ dict({
+ 'check': True,
+ }),
+ ),
+ _Call(
+ '',
+ tuple(
+ list([
+ 'tar',
+ 'tvf',
+ PosixPath('dist/archive2'),
+ ]),
+ ),
+ dict({
+ 'check': True,
+ }),
+ ),
+ _Call(
+ '',
+ tuple(
+ list([
+ 'tar',
+ 'tvf',
+ PosixPath('dist/archive3'),
+ ]),
+ ),
+ dict({
+ 'check': True,
+ }),
+ ),
+ _Call(
+ '',
+ tuple(
+ list([
+ 'twine',
+ 'upload',
+ '--repository',
+ 'testpypi',
+ PosixPath('dist/*'),
+ ]),
+ ),
+ dict({
+ 'check': True,
+ }),
+ ),
+ _Call(
+ '',
+ tuple(
+ list([
+ 'git',
+ 'push',
+ '--tags',
+ 'origin',
+ 'master',
+ ]),
+ ),
+ dict({
+ 'check': True,
+ }),
+ ),
+ ]),
+ })
+# ---
+# name: test_happy_path.1
+ '''
+ Version 5.0.1 tag added successfully.
+ Previous dist folder removed successfully.
+ Build created successfully.
+ Contents of dist folder:
+ Contents of tar files in dist folder:
+ Build verified successfully.
+ Release created successfully: https://github.com/path/to/release
+
+ '''
+# ---
+# name: test_happy_path.2
+ '''
+ WARNING: dist folder contains incorrect number of files.
+
+ '''
+# ---
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/tldextract-5.1.1/tests/main_test.py new/tldextract-5.1.2/tests/main_test.py
--- old/tldextract-5.1.1/tests/main_test.py 2023-10-28 20:47:14.000000000 +0200
+++ new/tldextract-5.1.2/tests/main_test.py 2024-03-08 22:53:51.000000000 +0100
@@ -4,6 +4,7 @@
import logging
import os
+import sys
import tempfile
from collections.abc import Sequence
from pathlib import Path
@@ -17,7 +18,7 @@
import tldextract
import tldextract.suffix_list
from tldextract.cache import DiskCache
-from tldextract.remote import inet_pton, lenient_netloc, looks_like_ip
+from tldextract.remote import lenient_netloc, looks_like_ip, looks_like_ipv6
from tldextract.suffix_list import SuffixListNotFound
from tldextract.tldextract import ExtractResult
@@ -152,21 +153,24 @@
)
-@pytest.mark.skipif(not inet_pton, reason="inet_pton unavailable")
-def test_looks_like_ip_with_inet_pton() -> None:
- """Test preferred function to check if a string looks like an IP address."""
- assert looks_like_ip("1.1.1.1", inet_pton) is True
- assert looks_like_ip("a.1.1.1", inet_pton) is False
- assert looks_like_ip("1.1.1.1\n", inet_pton) is False
- assert looks_like_ip("256.256.256.256", inet_pton) is False
-
-
-def test_looks_like_ip_without_inet_pton() -> None:
- """Test fallback function to check if a string looks like an IP address."""
- assert looks_like_ip("1.1.1.1", None) is True
- assert looks_like_ip("a.1.1.1", None) is False
- assert looks_like_ip("1.1.1.1\n", None) is False
- assert looks_like_ip("256.256.256.256", None) is False
+def test_looks_like_ip() -> None:
+ """Test function to check if a string looks like an IPv4 address."""
+ assert looks_like_ip("1.1.1.1") is True
+ assert looks_like_ip("1.1.1.01") is False
+ assert looks_like_ip("a.1.1.1") is False
+ assert looks_like_ip("1.1.1.1\n") is False
+ assert looks_like_ip("256.256.256.256") is False
+
+
+def test_looks_like_ipv6() -> None:
+ """Test function to check if a string looks like an IPv6 address."""
+ assert looks_like_ipv6("::") is True
+ assert looks_like_ipv6("aBcD:ef01:2345:6789:aBcD:ef01:aaaa:2288") is True
+ assert looks_like_ipv6("aBcD:ef01:2345:6789:aBcD:ef01:127.0.0.1") is True
+ assert looks_like_ipv6("ZBcD:ef01:2345:6789:aBcD:ef01:127.0.0.1") is False
+ if sys.version_info >= (3, 8, 12): # noqa: UP036
+ assert looks_like_ipv6("aBcD:ef01:2345:6789:aBcD:ef01:127.0.0.01") is False
+ assert looks_like_ipv6("aBcD:ef01:2345:6789:aBcD:") is False
def test_similar_to_ip() -> None:
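For reference, the two predicates exercised in the new tests above can be approximated with nothing but the standard library. This sketch uses the `ipaddress` module rather than tldextract's actual implementation (which this release moved away from `socket.inet_pton`), so treat it only as an illustration of the expected behavior; strict leading-zero rejection requires Python 3.9.5+/3.8.12+:

```python
import ipaddress


def looks_like_ip(maybe_ip: str) -> bool:
    """True for a strict dotted-quad IPv4 address (no whitespace, no out-of-range octets)."""
    try:
        ipaddress.IPv4Address(maybe_ip)
        return True
    except ValueError:
        return False


def looks_like_ipv6(maybe_ip: str) -> bool:
    """True for a valid IPv6 address, including embedded-IPv4 forms."""
    try:
        ipaddress.IPv6Address(maybe_ip)
        return True
    except ValueError:
        return False
```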
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/tldextract-5.1.1/tests/test_cache.py new/tldextract-5.1.2/tests/test_cache.py
--- old/tldextract-5.1.1/tests/test_cache.py 2023-11-13 21:07:39.000000000 +0100
+++ new/tldextract-5.1.2/tests/test_cache.py 2024-03-08 22:53:51.000000000 +0100
@@ -1,4 +1,5 @@
"""Test the caching functionality."""
+
from __future__ import annotations
import sys
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/tldextract-5.1.1/tests/test_release.py new/tldextract-5.1.2/tests/test_release.py
--- old/tldextract-5.1.1/tests/test_release.py 1970-01-01 01:00:00.000000000 +0100
+++ new/tldextract-5.1.2/tests/test_release.py 2024-03-19 04:36:52.000000000 +0100
@@ -0,0 +1,95 @@
+"""Test the library maintainer release script."""
+
+from __future__ import annotations
+
+import dataclasses
+import sys
+from collections.abc import Iterator
+from typing import Any
+from unittest import mock
+
+import pytest
+from syrupy.assertion import SnapshotAssertion
+
+from scripts import release
+
+
+@dataclasses.dataclass
+class Mocks:
+ """Collection of all mocked objects used in the release script."""
+
+ input: mock.Mock
+ listdir: mock.Mock
+ requests: mock.Mock
+ subprocess: mock.Mock
+
+ @property
+ def mock_calls(self) -> dict[str, Any]:
+ """A dict of _all_ calls to this class's mock objects."""
+ return {
+ k.name: getattr(self, k.name).mock_calls for k in dataclasses.fields(self)
+ }
+
+
+@pytest.fixture
+def mocks() -> Iterator[Mocks]:
+ """Stub network and subprocesses."""
+ with mock.patch("builtins.input") as mock_input, mock.patch(
+ "os.listdir"
+ ) as mock_listdir, mock.patch("requests.post") as mock_requests, mock.patch(
+ "subprocess.run"
+ ) as mock_subprocess:
+ yield Mocks(
+ input=mock_input,
+ listdir=mock_listdir,
+ requests=mock_requests,
+ subprocess=mock_subprocess,
+ )
+
+
+@pytest.mark.skipif(
+ sys.platform == "win32", reason="Snapshot paths are different on Windows"
+)
+def test_happy_path(
+ capsys: pytest.CaptureFixture[str],
+ mocks: Mocks,
+ monkeypatch: pytest.MonkeyPatch,
+ snapshot: SnapshotAssertion,
+) -> None:
+ """Test the release script happy path.
+
+ Simulate user input for a typical, existing release.
+
+ This one test case covers most lines of the release script, without
+ actually making network requests or running subprocesses. For an
+ infrequently used script, this coverage is useful without being too brittle
+ to change.
+ """
+ monkeypatch.setenv("GITHUB_TOKEN", "fake-token")
+
+ mocks.input.side_effect = ["y", "5.0.1", "y"]
+
+ mocks.listdir.return_value = ["archive1", "archive2", "archive3"]
+
+ def mock_post(*args: Any, **kwargs: Any) -> mock.Mock:
+ """Return _one_ response JSON that happens to match expectations for multiple requests."""
+ return mock.Mock(
+ json=mock.Mock(
+ return_value={
+ "body": "Body start **Full Changelog**: fake-body",
+ "html_url": "https://github.com/path/to/release",
+ }
+ ),
+ )
+
+ mocks.requests.side_effect = mock_post
+
+ mocks.subprocess.return_value.stdout = ""
+
+ release.main()
+
+ out, err = capsys.readouterr()
+
+ assert mocks.mock_calls == snapshot
+ assert out == snapshot
+ assert err == snapshot
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/tldextract-5.1.1/tests/test_trie.py new/tldextract-5.1.2/tests/test_trie.py
--- old/tldextract-5.1.1/tests/test_trie.py 2023-10-11 10:28:34.000000000 +0200
+++ new/tldextract-5.1.2/tests/test_trie.py 2024-03-08 22:53:51.000000000 +0100
@@ -1,4 +1,5 @@
"""Trie tests."""
+
from itertools import permutations
from tldextract.tldextract import Trie
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/tldextract-5.1.1/tldextract/__main__.py new/tldextract-5.1.2/tldextract/__main__.py
--- old/tldextract-5.1.1/tldextract/__main__.py 2022-06-06 01:36:58.000000000 +0200
+++ new/tldextract-5.1.2/tldextract/__main__.py 2024-03-08 22:53:51.000000000 +0100
@@ -1,6 +1,5 @@
"""tldextract __main__."""
-
from .cli import main
if __name__ == "__main__":
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/tldextract-5.1.1/tldextract/_version.py new/tldextract-5.1.2/tldextract/_version.py
--- old/tldextract-5.1.1/tldextract/_version.py 2023-11-17 04:43:35.000000000 +0100
+++ new/tldextract-5.1.2/tldextract/_version.py 2024-03-19 05:07:52.000000000 +0100
@@ -12,5 +12,5 @@
__version_tuple__: VERSION_TUPLE
version_tuple: VERSION_TUPLE
-__version__ = version = '5.1.1'
-__version_tuple__ = version_tuple = (5, 1, 1)
+__version__ = version = '5.1.2'
+__version_tuple__ = version_tuple = (5, 1, 2)
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/tldextract-5.1.1/tldextract/cache.py new/tldextract-5.1.2/tldextract/cache.py
--- old/tldextract-5.1.1/tldextract/cache.py 2023-11-13 21:07:39.000000000 +0100
+++ new/tldextract-5.1.2/tldextract/cache.py 2024-03-08 22:53:51.000000000 +0100
@@ -1,4 +1,5 @@
"""Helpers."""
+
from __future__ import annotations
import errno
@@ -37,8 +38,7 @@
def get_pkg_unique_identifier() -> str:
- """
- Generate an identifier unique to the python version, tldextract version, and python instance.
+ """Generate an identifier unique to the python version, tldextract version, and python instance.
This will prevent interference between virtualenvs and issues that might arise when installing
a new version of tldextract
@@ -65,8 +65,7 @@
def get_cache_dir() -> str:
- """
- Get a cache dir that we have permission to write to.
+ """Get a cache dir that we have permission to write to.
Try to follow the XDG standard, but if that doesn't work fallback to the package directory
http://specifications.freedesktop.org/basedir-spec/basedir-spec-latest.html
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/tldextract-5.1.1/tldextract/remote.py new/tldextract-5.1.2/tldextract/remote.py
--- old/tldextract-5.1.1/tldextract/remote.py 2023-10-11 21:15:35.000000000 +0200
+++ new/tldextract-5.1.2/tldextract/remote.py 2024-03-08 23:02:00.000000000 +0100
@@ -3,19 +3,13 @@
from __future__ import annotations
import re
-from collections.abc import Callable
from ipaddress import AddressValueError, IPv6Address
from urllib.parse import scheme_chars
-inet_pton: Callable[[int, str], bytes] | None
-try:
- from socket import AF_INET, AF_INET6, inet_pton # Availability: Unix, Windows.
-except ImportError:
- inet_pton = None
-
IP_RE = re.compile(
- r"^(([0-9]|[1-9][0-9]|1[0-9]{2}|2[0-4][0-9]|25[0-5])\.)"
- r"{3}([0-9]|[1-9][0-9]|1[0-9]{2}|2[0-4][0-9]|25[0-5])$"
+ r"^(?:(?:[0-9]|[1-9][0-9]|1[0-9]{2}|2[0-4][0-9]|25[0-5])\.)"
+ r"{3}(?:[0-9]|[1-9][0-9]|1[0-9]{2}|2[0-4][0-9]|25[0-5])$",
+ re.ASCII,
)
scheme_chars_set = set(scheme_chars)
@@ -59,32 +53,16 @@
return url[double_slashes_start + 2 :]
-def looks_like_ip(
- maybe_ip: str, pton: Callable[[int, str], bytes] | None = inet_pton
-) -> bool:
- """Check whether the given str looks like an IP address."""
+def looks_like_ip(maybe_ip: str) -> bool:
+ """Check whether the given str looks like an IPv4 address."""
if not maybe_ip[0].isdigit():
return False
- if pton is not None:
- try:
- pton(AF_INET, maybe_ip)
- return True
- except OSError:
- return False
return IP_RE.fullmatch(maybe_ip) is not None
-def looks_like_ipv6(
- maybe_ip: str, pton: Callable[[int, str], bytes] | None = inet_pton
-) -> bool:
+def looks_like_ipv6(maybe_ip: str) -> bool:
"""Check whether the given str looks like an IPv6 address."""
- if pton is not None:
- try:
- pton(AF_INET6, maybe_ip)
- return True
- except OSError:
- return False
try:
IPv6Address(maybe_ip)
except AddressValueError:
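Reassembled from the hunks above, the simplified 5.1.2 helpers are self-contained: a strict ASCII-only regex for IPv4 and `ipaddress.IPv6Address` for IPv6, with no runtime dependency on `socket.inet_pton`. A runnable sketch (surrounding module code omitted):

```python
# Sketch of the 5.1.2 approach: pure-regex IPv4 check plus ipaddress-based
# IPv6 check, independent of socket.inet_pton availability.
import re
from ipaddress import AddressValueError, IPv6Address

# Same octet pattern as the patched IP_RE: 0-255, no leading zeros.
IP_RE = re.compile(
    r"^(?:(?:[0-9]|[1-9][0-9]|1[0-9]{2}|2[0-4][0-9]|25[0-5])\.)"
    r"{3}(?:[0-9]|[1-9][0-9]|1[0-9]{2}|2[0-4][0-9]|25[0-5])$",
    re.ASCII,
)


def looks_like_ip(maybe_ip: str) -> bool:
    """Check whether the given str looks like an IPv4 address."""
    if not maybe_ip[0].isdigit():  # cheap rejection before the regex
        return False
    return IP_RE.fullmatch(maybe_ip) is not None


def looks_like_ipv6(maybe_ip: str) -> bool:
    """Check whether the given str looks like an IPv6 address."""
    try:
        IPv6Address(maybe_ip)
    except AddressValueError:
        return False
    return True
```

Note that `re.fullmatch` also rejects a trailing newline, which `re.match` plus `$` would tolerate; that is what the new `"1.1.1.1\n"` test case in main_test.py pins down.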
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/tldextract-5.1.1/tldextract/tldextract.py new/tldextract-5.1.2/tldextract/tldextract.py
--- old/tldextract-5.1.1/tldextract/tldextract.py 2023-10-28 20:47:14.000000000 +0200
+++ new/tldextract-5.1.2/tldextract/tldextract.py 2024-03-08 22:59:46.000000000 +0100
@@ -75,8 +75,7 @@
@property
def registered_domain(self) -> str:
- """
- Joins the domain and suffix fields with a dot, if they're both set.
+ """Joins the domain and suffix fields with a dot, if they're both set.
>>> extract('http://forums.bbc.co.uk').registered_domain
'bbc.co.uk'
@@ -89,8 +88,7 @@
@property
def fqdn(self) -> str:
- """
- Returns a Fully Qualified Domain Name, if there is a proper domain/suffix.
+ """Returns a Fully Qualified Domain Name, if there is a proper domain/suffix.
>>> extract('http://forums.bbc.co.uk/path/to/file').fqdn
'forums.bbc.co.uk'
@@ -103,8 +101,7 @@
@property
def ipv4(self) -> str:
- """
- Returns the ipv4 if that is what the presented domain/url is.
+ """Returns the ipv4 if that is what the presented domain/url is.
>>> extract('http://127.0.0.1/path/to/file').ipv4
'127.0.0.1'
@@ -123,8 +120,7 @@
@property
def ipv6(self) -> str:
- """
- Returns the ipv6 if that is what the presented domain/url is.
+ """Returns the ipv6 if that is what the presented domain/url is.
>>> extract('http://[aBcD:ef01:2345:6789:aBcD:ef01:127.0.0.1]/path/to/file').ipv6
'aBcD:ef01:2345:6789:aBcD:ef01:127.0.0.1'
@@ -334,8 +330,7 @@
@property
def tlds(self, session: requests.Session | None = None) -> list[str]:
- """
- Returns the list of tld's used by default.
+ """Returns the list of tld's used by default.
This will vary based on `include_psl_private_domains` and `extra_suffixes`
"""
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/tldextract-5.1.1/tldextract.egg-info/PKG-INFO new/tldextract-5.1.2/tldextract.egg-info/PKG-INFO
--- old/tldextract-5.1.1/tldextract.egg-info/PKG-INFO 2023-11-17 04:43:35.000000000 +0100
+++ new/tldextract-5.1.2/tldextract.egg-info/PKG-INFO 2024-03-19 05:07:53.000000000 +0100
@@ -1,6 +1,6 @@
Metadata-Version: 2.1
Name: tldextract
-Version: 5.1.1
+Version: 5.1.2
Summary: Accurately separates a URL's subdomain, domain, and public suffix, using the Public Suffix List (PSL). By default, this includes the public ICANN TLDs and their exceptions. You can optionally support the Public Suffix List's private domains as well.
Author-email: John Kurkowski <john.kurkowski@gmail.com>
License: BSD-3-Clause
@@ -22,6 +22,9 @@
Requires-Dist: requests>=2.1.0
Requires-Dist: requests-file>=1.4
Requires-Dist: filelock>=3.0.8
+Provides-Extra: release
+Requires-Dist: build; extra == "release"
+Requires-Dist: twine; extra == "release"
Provides-Extra: testing
Requires-Dist: black; extra == "testing"
Requires-Dist: mypy; extra == "testing"
@@ -30,6 +33,7 @@
Requires-Dist: pytest-mock; extra == "testing"
Requires-Dist: responses; extra == "testing"
Requires-Dist: ruff; extra == "testing"
+Requires-Dist: syrupy; extra == "testing"
Requires-Dist: tox; extra == "testing"
Requires-Dist: types-filelock; extra == "testing"
Requires-Dist: types-requests; extra == "testing"
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/tldextract-5.1.1/tldextract.egg-info/SOURCES.txt new/tldextract-5.1.2/tldextract.egg-info/SOURCES.txt
--- old/tldextract-5.1.1/tldextract.egg-info/SOURCES.txt 2023-11-17 04:43:35.000000000 +0100
+++ new/tldextract-5.1.2/tldextract.egg-info/SOURCES.txt 2024-03-19 05:07:53.000000000 +0100
@@ -6,6 +6,7 @@
tox.ini
.github/FUNDING.yml
.github/workflows/ci.yml
+scripts/release.py
tests/__init__.py
tests/cli_test.py
tests/conftest.py
@@ -14,7 +15,9 @@
tests/main_test.py
tests/test_cache.py
tests/test_parallel.py
+tests/test_release.py
tests/test_trie.py
+tests/__snapshots__/test_release.ambr
tests/fixtures/fake_suffix_list_fixture.dat
tldextract/.tld_set_snapshot
tldextract/__init__.py
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/tldextract-5.1.1/tldextract.egg-info/requires.txt new/tldextract-5.1.2/tldextract.egg-info/requires.txt
--- old/tldextract-5.1.1/tldextract.egg-info/requires.txt 2023-11-17 04:43:35.000000000 +0100
+++ new/tldextract-5.1.2/tldextract.egg-info/requires.txt 2024-03-19 05:07:53.000000000 +0100
@@ -3,6 +3,10 @@
requests-file>=1.4
filelock>=3.0.8
+[release]
+build
+twine
+
[testing]
black
mypy
@@ -11,6 +15,7 @@
pytest-mock
responses
ruff
+syrupy
tox
types-filelock
types-requests
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/tldextract-5.1.1/tox.ini new/tldextract-5.1.2/tox.ini
--- old/tldextract-5.1.1/tox.ini 2023-11-13 21:07:39.000000000 +0100
+++ new/tldextract-5.1.2/tox.ini 2024-03-16 01:07:21.000000000 +0100
@@ -1,5 +1,5 @@
[tox]
-envlist = py{38,39,310,311,312,py38},codestyle,lint,typecheck
+envlist = py{38,39,310,311,312,py38,py39,py310},codestyle,lint,typecheck
[testenv]
commands = pytest {posargs}
@@ -18,5 +18,5 @@
[testenv:typecheck]
basepython = python3.8
-commands = mypy --show-error-codes tldextract tests
+commands = mypy --show-error-codes scripts tldextract tests
extras = testing
Script 'mail_helper' called by obssrc
Hello community,
here is the log from the commit of package parsec for openSUSE:Factory checked in at 2024-03-29 13:10:01
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/parsec (Old)
and /work/SRC/openSUSE:Factory/.parsec.new.1905 (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Package is "parsec"
Fri Mar 29 13:10:01 2024 rev:23 rq:1163363 version:1.4.0~rc2
Changes:
--------
--- /work/SRC/openSUSE:Factory/parsec/parsec.changes 2023-10-30 22:11:33.777937886 +0100
+++ /work/SRC/openSUSE:Factory/.parsec.new.1905/parsec.changes 2024-03-29 13:11:08.323529544 +0100
@@ -1,0 +2,6 @@
+Thu Mar 28 15:23:19 UTC 2024 - Guillaume GARDET <guillaume.gardet@opensuse.org>
+
+- Update to 1.4.0-rc2:
+ * Full changelog: https://github.com/parallaxsecond/parsec/compare/1.3.0...1.4.0-rc2
+
+-------------------------------------------------------------------
Old:
----
cargo_config
parsec-1.3.0.tar.gz
New:
----
parsec-1.4.0-rc2.tar.gz
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Other differences:
------------------
++++++ parsec.spec ++++++
--- /var/tmp/diff_new_pack.2ckuRc/_old 2024-03-29 13:11:12.571685681 +0100
+++ /var/tmp/diff_new_pack.2ckuRc/_new 2024-03-29 13:11:12.587686269 +0100
@@ -1,7 +1,7 @@
#
# spec file for package parsec
#
-# Copyright (c) 2023 SUSE LLC
+# Copyright (c) 2024 SUSE LLC
#
# All modifications and additions to the file contributed by third parties
# remain the property of their copyright owners, unless otherwise agreed
@@ -17,18 +17,17 @@
%global rustflags '-Clink-arg=-Wl,-z,relro,-z,now'
-%define archive_version 1.3.0
+%define archive_version 1.4.0-rc2
%{?systemd_ordering}
Name: parsec
-Version: 1.3.0
+Version: 1.4.0~rc2
Release: 0
Summary: Platform AbstRaction for SECurity
License: Apache-2.0
URL: https://parallaxsecond.github.io/parsec-book
Source0: https://github.com/parallaxsecond/parsec/archive/%{archive_version}.tar.gz#…
Source1: vendor.tar.xz
-Source2: cargo_config
Source3: parsec.service
Source4: config.toml
Source5: parsec.conf
@@ -52,7 +51,6 @@
BuildRequires: pkgconfig(tss2-esys) >= 2.3.3
# opensc is used to initialize HSM keys (PKCS#11 backend)
Recommends: opensc
-%sysusers_requires
# /dev/tpm* are owned by tss user
Requires(pre): system-user-tss
# tpm2-0-tss holds the udev rule to make /dev/tpm* owned by tss user
@@ -67,12 +65,17 @@
This abstraction layer keeps workloads decoupled from physical platform details,
enabling cloud-native delivery flows within the data center and at the edge.
+%package -n system-user-parsec
+Summary: System user and group parsec
+%sysusers_requires
+
+%description -n system-user-parsec
+Package to install system user 'parsec'
+
%prep
%setup -q -a1 -a10 -n parsec-%{archive_version}
rmdir trusted-services-vendor
mv trusted-services-389b506 trusted-services-vendor
-rm -rf .cargo && mkdir .cargo
-cp %{SOURCE2} .cargo/config
# Enable all providers
sed -i -e 's#default = \["unix-peer-credentials-authenticator"\]##' Cargo.toml
# Features available in 1.2.0-rc1:
@@ -116,7 +119,7 @@
export PROTOC_INCLUDE=%{_includedir}
%cargo_test -- --lib
-%pre -f parsec.pre
+%pre -n system-user-parsec -f parsec.pre
%service_add_pre parsec.service
%post
@@ -138,5 +141,7 @@
%{_libexecdir}/parsec
%{_tmpfilesdir}/parsec.conf
%{_unitdir}/parsec.service
+
+%files -n system-user-parsec
%{_sysusersdir}/system-user-parsec.conf
++++++ _service ++++++
--- /var/tmp/diff_new_pack.2ckuRc/_old 2024-03-29 13:11:12.939699207 +0100
+++ /var/tmp/diff_new_pack.2ckuRc/_new 2024-03-29 13:11:12.979700677 +0100
@@ -1,11 +1,11 @@
<services>
<service name="cargo_vendor" mode="manual">
<param name="compression">xz</param>
- <param name="srcdir">parsec-1.3.0</param>
+ <param name="srcdir">parsec-1.4.0-rc1</param>
<param name="update">true</param>
</service>
<service name="cargo_audit" mode="manual">
- <param name="srcdir">parsec-1.3.0</param>
+ <param name="srcdir">parsec-1.4.0-rc1</param>
</service>
</services>
++++++ vendor.tar.xz ++++++
/work/SRC/openSUSE:Factory/parsec/vendor.tar.xz /work/SRC/openSUSE:Factory/.parsec.new.1905/vendor.tar.xz differ: char 8, line 1
here is the log from the commit of package ansible-core for openSUSE:Factory checked in at 2024-03-29 13:10:00
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/ansible-core (Old)
and /work/SRC/openSUSE:Factory/.ansible-core.new.1905 (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Package is "ansible-core"
Fri Mar 29 13:10:00 2024 rev:26 rq:1163355 version:2.16.5
Changes:
--------
--- /work/SRC/openSUSE:Factory/ansible-core/ansible-core.changes 2024-03-17 22:16:38.569795484 +0100
+++ /work/SRC/openSUSE:Factory/.ansible-core.new.1905/ansible-core.changes 2024-03-29 13:11:06.135449124 +0100
@@ -1,0 +2,22 @@
+Wed Mar 27 19:54:49 UTC 2024 - Johannes Kastl <opensuse_buildservice@ojkastl.de>
+
+- update to 2.16.5:
+ https://github.com/ansible/ansible/blob/v2.16.5/changelogs/CHANGELOG-v2.16.…
+ * Minor Changes
+ - ansible-test - Add a work-around for permission denied errors
+ when using pytest >= 8 on multi-user systems with an
+ installed version of ansible-test.
+ * Bugfixes
+ - Fix an issue when setting a plugin name from an unsafe source
+ resulted in ValueError: unmarshallable object (#82708)
+ - Harden python templates for respawn and ansiballz around str
+ literal quoting
+ - ansible-test - The libexpat package is automatically upgraded
+ during remote bootstrapping to maintain compatibility with
+ newer Python packages.
+ - template - Fix error when templating an unsafe string which
+ corresponds to an invalid type in Python (#82600).
+ - winrm - does not hang when attempting to get process output
+ when stdin write failed
+
+-------------------------------------------------------------------
Old:
----
ansible-core-2.16.4.tar.gz
New:
----
ansible-core-2.16.5.tar.gz
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Other differences:
------------------
++++++ ansible-core.spec ++++++
--- /var/tmp/diff_new_pack.JbWeO7/_old 2024-03-29 13:11:06.747471618 +0100
+++ /var/tmp/diff_new_pack.JbWeO7/_new 2024-03-29 13:11:06.751471765 +0100
@@ -38,7 +38,7 @@
%endif
Name: ansible-core
-Version: 2.16.4
+Version: 2.16.5
Release: 0
Summary: Radically simple IT automation
License: GPL-3.0-or-later
++++++ ansible-core-2.16.4.tar.gz -> ansible-core-2.16.5.tar.gz ++++++
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/ansible-core-2.16.4/PKG-INFO new/ansible-core-2.16.5/PKG-INFO
--- old/ansible-core-2.16.4/PKG-INFO 2024-02-26 22:04:58.000000000 +0100
+++ new/ansible-core-2.16.5/PKG-INFO 2024-03-25 18:07:00.000000000 +0100
@@ -1,6 +1,6 @@
Metadata-Version: 2.1
Name: ansible-core
-Version: 2.16.4
+Version: 2.16.5
Summary: Radically simple IT automation
Home-page: https://ansible.com/
Author: Ansible, Inc.
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/ansible-core-2.16.4/changelogs/CHANGELOG-v2.16.rst new/ansible-core-2.16.5/changelogs/CHANGELOG-v2.16.rst
--- old/ansible-core-2.16.4/changelogs/CHANGELOG-v2.16.rst 2024-02-26 22:04:58.000000000 +0100
+++ new/ansible-core-2.16.5/changelogs/CHANGELOG-v2.16.rst 2024-03-25 18:07:00.000000000 +0100
@@ -5,6 +5,30 @@
.. contents:: Topics
+v2.16.5
+=======
+
+Release Summary
+---------------
+
+| Release Date: 2024-03-25
+| `Porting Guide <https://docs.ansible.com/ansible-core/2.16/porting_guides/porting_guide_cor…>`__
+
+
+Minor Changes
+-------------
+
+- ansible-test - Add a work-around for permission denied errors when using ``pytest >= 8`` on multi-user systems with an installed version of ``ansible-test``.
+
+Bugfixes
+--------
+
+- Fix an issue when setting a plugin name from an unsafe source resulted in ``ValueError: unmarshallable object`` (https://github.com/ansible/ansible/issues/82708)
+- Harden python templates for respawn and ansiballz around str literal quoting
+- ansible-test - The ``libexpat`` package is automatically upgraded during remote bootstrapping to maintain compatibility with newer Python packages.
+- template - Fix error when templating an unsafe string which corresponds to an invalid type in Python (https://github.com/ansible/ansible/issues/82600).
+- winrm - does not hang when attempting to get process output when stdin write failed
+
v2.16.4
=======
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/ansible-core-2.16.4/changelogs/changelog.yaml new/ansible-core-2.16.5/changelogs/changelog.yaml
--- old/ansible-core-2.16.4/changelogs/changelog.yaml 2024-02-26 22:04:58.000000000 +0100
+++ new/ansible-core-2.16.5/changelogs/changelog.yaml 2024-03-25 18:07:00.000000000 +0100
@@ -933,3 +933,45 @@
- fix-expect-indefinite-timeout.yml
- fix-vars-plugins-in-roles.yml
release_date: '2024-02-19'
+ 2.16.5:
+ changes:
+ bugfixes:
+ - ansible-test - The ``libexpat`` package is automatically upgraded during remote
+ bootstrapping to maintain compatibility with newer Python packages.
+ release_summary: '| Release Date: 2024-03-25
+
+ | `Porting Guide <https://docs.ansible.com/ansible-core/2.16/porting_guides/porting_guide_cor…>`__
+
+ '
+ codename: All My Love
+ fragments:
+ - 2.16.5_summary.yaml
+ - ansible-test-alpine-libexpat.yml
+ release_date: '2024-03-25'
+ 2.16.5rc1:
+ changes:
+ bugfixes:
+ - 'Fix an issue when setting a plugin name from an unsafe source resulted in
+ ``ValueError: unmarshallable object`` (https://github.com/ansible/ansible/issues/82708)'
+ - Harden python templates for respawn and ansiballz around str literal quoting
+ - template - Fix error when templating an unsafe string which corresponds to
+ an invalid type in Python (https://github.com/ansible/ansible/issues/82600).
+ - winrm - does not hang when attempting to get process output when stdin write
+ failed
+ minor_changes:
+ - ansible-test - Add a work-around for permission denied errors when using ``pytest
+ >= 8`` on multi-user systems with an installed version of ``ansible-test``.
+ release_summary: '| Release Date: 2024-03-18
+
+ | `Porting Guide <https://docs.ansible.com/ansible-core/2.16/porting_guides/porting_guide_cor…>`__
+
+ '
+ codename: All My Love
+ fragments:
+ - 2.16.5rc1_summary.yaml
+ - 82675-fix-unsafe-templating-leading-to-type-error.yml
+ - 82708-unsafe-plugin-name-error.yml
+ - ansible-test-pytest-8.yml
+ - py-tmpl-hardening.yml
+ - winrm-timeout.yml
+ release_date: '2024-03-18'
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/ansible-core-2.16.4/lib/ansible/executor/module_common.py new/ansible-core-2.16.5/lib/ansible/executor/module_common.py
--- old/ansible-core-2.16.4/lib/ansible/executor/module_common.py 2024-02-26 22:04:58.000000000 +0100
+++ new/ansible-core-2.16.5/lib/ansible/executor/module_common.py 2024-03-25 18:07:00.000000000 +0100
@@ -167,7 +167,7 @@
else:
PY3 = True
- ZIPDATA = """%(zipdata)s"""
+ ZIPDATA = %(zipdata)r
# Note: temp_path isn't needed once we switch to zipimport
def invoke_module(modlib_path, temp_path, json_params):
@@ -197,7 +197,7 @@
basic._ANSIBLE_ARGS = json_params
%(coverage)s
# Run the module! By importing it as '__main__', it thinks it is executing as a script
- runpy.run_module(mod_name='%(module_fqn)s', init_globals=dict(_module_fqn='%(module_fqn)s', _modlib_path=modlib_path),
+ runpy.run_module(mod_name=%(module_fqn)r, init_globals=dict(_module_fqn=%(module_fqn)r, _modlib_path=modlib_path),
run_name='__main__', alter_sys=True)
# Ansible modules must exit themselves
@@ -288,7 +288,7 @@
basic._ANSIBLE_ARGS = json_params
# Run the module! By importing it as '__main__', it thinks it is executing as a script
- runpy.run_module(mod_name='%(module_fqn)s', init_globals=None, run_name='__main__', alter_sys=True)
+ runpy.run_module(mod_name=%(module_fqn)r, init_globals=None, run_name='__main__', alter_sys=True)
# Ansible modules must exit themselves
print('{"msg": "New-style module did not handle its own exit", "failed": true}')
@@ -313,9 +313,9 @@
# store this in remote_tmpdir (use system tempdir instead)
# Only need to use [ansible_module]_payload_ in the temp_path until we move to zipimport
# (this helps ansible-test produce coverage stats)
- temp_path = tempfile.mkdtemp(prefix='ansible_%(ansible_module)s_payload_')
+ temp_path = tempfile.mkdtemp(prefix='ansible_' + %(ansible_module)r + '_payload_')
- zipped_mod = os.path.join(temp_path, 'ansible_%(ansible_module)s_payload.zip')
+ zipped_mod = os.path.join(temp_path, 'ansible_' + %(ansible_module)r + '_payload.zip')
with open(zipped_mod, 'wb') as modlib:
modlib.write(base64.b64decode(ZIPDATA))
@@ -338,7 +338,7 @@
'''
ANSIBALLZ_COVERAGE_TEMPLATE = '''
- os.environ['COVERAGE_FILE'] = '%(coverage_output)s=python-%%s=coverage' %% '.'.join(str(v) for v in sys.version_info[:2])
+ os.environ['COVERAGE_FILE'] = %(coverage_output)r + '=python-%%s=coverage' %% '.'.join(str(v) for v in sys.version_info[:2])
import atexit
@@ -348,7 +348,7 @@
print('{"msg": "Could not import `coverage` module.", "failed": true}')
sys.exit(1)
- cov = coverage.Coverage(config_file='%(coverage_config)s')
+ cov = coverage.Coverage(config_file=%(coverage_config)r)
def atexit_coverage():
cov.stop()
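The `%(zipdata)r`-style changes above are the "Harden python templates for respawn and ansiballz around str literal quoting" bugfix from the changelog. A minimal illustration of the difference (hypothetical template names, not the actual AnsiBallZ wrapper):

```python
# With %s the value is pasted between hand-written quotes, so a quote
# character inside the value terminates the generated literal early.
# With %r, Python itself emits a correctly quoted and escaped source
# literal that round-trips.
import ast

value = "x'; import os  # "           # contains a single quote

unsafe = "mod_name = '%(module_fqn)s'" % {"module_fqn": value}
safe = "mod_name = %(module_fqn)r" % {"module_fqn": value}

# unsafe == "mod_name = 'x'; import os  # '"  (literal closed early)
# safe parses back to the exact original value:
assert ast.literal_eval(safe.split(" = ", 1)[1]) == value
```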
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/ansible-core-2.16.4/lib/ansible/module_utils/ansible_release.py new/ansible-core-2.16.5/lib/ansible/module_utils/ansible_release.py
--- old/ansible-core-2.16.4/lib/ansible/module_utils/ansible_release.py 2024-02-26 22:04:58.000000000 +0100
+++ new/ansible-core-2.16.5/lib/ansible/module_utils/ansible_release.py 2024-03-25 18:07:00.000000000 +0100
@@ -19,6 +19,6 @@
from __future__ import (absolute_import, division, print_function)
__metaclass__ = type
-__version__ = '2.16.4'
+__version__ = '2.16.5'
__author__ = 'Ansible, Inc.'
__codename__ = "All My Love"
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/ansible-core-2.16.4/lib/ansible/module_utils/common/respawn.py new/ansible-core-2.16.5/lib/ansible/module_utils/common/respawn.py
--- old/ansible-core-2.16.4/lib/ansible/module_utils/common/respawn.py 2024-02-26 22:04:58.000000000 +0100
+++ new/ansible-core-2.16.5/lib/ansible/module_utils/common/respawn.py 2024-03-25 18:07:00.000000000 +0100
@@ -8,7 +8,7 @@
import subprocess
import sys
-from ansible.module_utils.common.text.converters import to_bytes, to_native
+from ansible.module_utils.common.text.converters import to_bytes
def has_respawned():
@@ -79,10 +79,9 @@
import runpy
import sys
-module_fqn = '{module_fqn}'
-modlib_path = '{modlib_path}'
-smuggled_args = b"""{smuggled_args}""".strip()
-
+module_fqn = {module_fqn!r}
+modlib_path = {modlib_path!r}
+smuggled_args = {smuggled_args!r}
if __name__ == '__main__':
sys.path.insert(0, modlib_path)
@@ -93,6 +92,6 @@
runpy.run_module(module_fqn, init_globals=dict(_respawned=True), run_name='__main__', alter_sys=True)
'''
- respawn_code = respawn_code_template.format(module_fqn=module_fqn, modlib_path=modlib_path, smuggled_args=to_native(smuggled_args))
+ respawn_code = respawn_code_template.format(module_fqn=module_fqn, modlib_path=modlib_path, smuggled_args=smuggled_args.strip())
return respawn_code
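The same quoting hardening applies to the respawn template above: `{smuggled_args!r}` lets `str.format` emit a proper bytes literal, instead of pasting decoded text between hand-written `b"""..."""` quotes. A toy sketch (hypothetical miniature template, not the real `respawn_code_template`):

```python
# Embed a bytes payload into generated source with str.format's !r
# conversion; quotes inside the payload are escaped automatically.
template = "smuggled_args = {smuggled_args!r}\n"

payload = b'{"arg": "va\'lue"}\n'     # quotes and trailing newline included
code = template.format(smuggled_args=payload.strip())

ns: dict = {}
exec(code, ns)                        # run the generated snippet
assert ns["smuggled_args"] == payload.strip()
```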
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/ansible-core-2.16.4/lib/ansible/plugins/connection/winrm.py new/ansible-core-2.16.5/lib/ansible/plugins/connection/winrm.py
--- old/ansible-core-2.16.4/lib/ansible/plugins/connection/winrm.py 2024-02-26 22:04:58.000000000 +0100
+++ new/ansible-core-2.16.5/lib/ansible/plugins/connection/winrm.py 2024-03-25 18:07:00.000000000 +0100
@@ -172,6 +172,7 @@
import subprocess
import time
import typing as t
+import xml.etree.ElementTree as ET
from inspect import getfullargspec
from urllib.parse import urlunsplit
@@ -189,7 +190,6 @@
from ansible.module_utils.json_utils import _filter_non_json_lines
from ansible.module_utils.parsing.convert_bool import boolean
from ansible.module_utils.common.text.converters import to_bytes, to_native, to_text
-from ansible.module_utils.six import binary_type
from ansible.plugins.connection import ConnectionBase
from ansible.plugins.shell.powershell import _parse_clixml
from ansible.plugins.shell.powershell import ShellBase as PowerShellBase
@@ -199,7 +199,6 @@
try:
import winrm
- from winrm import Response
from winrm.exceptions import WinRMError, WinRMOperationTimeoutError
from winrm.protocol import Protocol
import requests.exceptions
@@ -547,13 +546,84 @@
stream['@End'] = 'true'
protocol.send_message(xmltodict.unparse(rq))
+ def _winrm_get_raw_command_output(
+ self,
+ protocol: winrm.Protocol,
+ shell_id: str,
+ command_id: str,
+ ) -> tuple[bytes, bytes, int, bool]:
+ rq = {'env:Envelope': protocol._get_soap_header(
+ resource_uri='http://schemas.microsoft.com/wbem/wsman/1/windows/shell/cmd',
+ action='http://schemas.microsoft.com/wbem/wsman/1/windows/shell/Receive',
+ shell_id=shell_id)}
+
+ stream = rq['env:Envelope'].setdefault('env:Body', {}).setdefault('rsp:Receive', {})\
+ .setdefault('rsp:DesiredStream', {})
+ stream['@CommandId'] = command_id
+ stream['#text'] = 'stdout stderr'
+
+ res = protocol.send_message(xmltodict.unparse(rq))
+ root = ET.fromstring(res)
+ stream_nodes = [
+ node for node in root.findall('.//*')
+ if node.tag.endswith('Stream')]
+ stdout = []
+ stderr = []
+ return_code = -1
+ for stream_node in stream_nodes:
+ if not stream_node.text:
+ continue
+ if stream_node.attrib['Name'] == 'stdout':
+ stdout.append(base64.b64decode(stream_node.text.encode('ascii')))
+ elif stream_node.attrib['Name'] == 'stderr':
+ stderr.append(base64.b64decode(stream_node.text.encode('ascii')))
+
+ command_done = len([
+ node for node in root.findall('.//*')
+ if node.get('State', '').endswith('CommandState/Done')]) == 1
+ if command_done:
+ return_code = int(
+ next(node for node in root.findall('.//*')
+ if node.tag.endswith('ExitCode')).text)
+
+ return b"".join(stdout), b"".join(stderr), return_code, command_done
+
+ def _winrm_get_command_output(
+ self,
+ protocol: winrm.Protocol,
+ shell_id: str,
+ command_id: str,
+ try_once: bool = False,
+ ) -> tuple[bytes, bytes, int]:
+ stdout_buffer, stderr_buffer = [], []
+ command_done = False
+ return_code = -1
+
+ while not command_done:
+ try:
+ stdout, stderr, return_code, command_done = \
+ self._winrm_get_raw_command_output(protocol, shell_id, command_id)
+ stdout_buffer.append(stdout)
+ stderr_buffer.append(stderr)
+
+ # If we were able to get output at least once then we should be
+ # able to get the rest.
+ try_once = False
+ except WinRMOperationTimeoutError:
+ # This is an expected error when waiting for a long-running process,
+ # just silently retry if we haven't been set to do one attempt.
+ if try_once:
+ break
+ continue
+ return b''.join(stdout_buffer), b''.join(stderr_buffer), return_code
+
def _winrm_exec(
self,
command: str,
args: t.Iterable[bytes] = (),
from_exec: bool = False,
stdin_iterator: t.Iterable[tuple[bytes, bool]] = None,
- ) -> winrm.Response:
+ ) -> tuple[int, bytes, bytes]:
if not self.protocol:
self.protocol = self._winrm_connect()
self._connected = True
@@ -576,38 +646,40 @@
display.debug(traceback.format_exc())
stdin_push_failed = True
- # NB: this can hang if the receiver is still running (eg, network failed a Send request but the server's still happy).
- # FUTURE: Consider adding pywinrm status check/abort operations to see if the target is still running after a failure.
- resptuple = self.protocol.get_command_output(self.shell_id, command_id)
- # ensure stdout/stderr are text for py3
- # FUTURE: this should probably be done internally by pywinrm
- response = Response(tuple(to_text(v) if isinstance(v, binary_type) else v for v in resptuple))
+ # Even on a failure above we try at least once to get the output
+ # in case the stdin was actually written and it ran normally.
+ b_stdout, b_stderr, rc = self._winrm_get_command_output(
+ self.protocol,
+ self.shell_id,
+ command_id,
+ try_once=stdin_push_failed,
+ )
+ stdout = to_text(b_stdout)
+ stderr = to_text(b_stderr)
- # TODO: check result from response and set stdin_push_failed if we have nonzero
if from_exec:
- display.vvvvv('WINRM RESULT %r' % to_text(response), host=self._winrm_host)
- else:
- display.vvvvvv('WINRM RESULT %r' % to_text(response), host=self._winrm_host)
-
- display.vvvvvv('WINRM STDOUT %s' % to_text(response.std_out), host=self._winrm_host)
- display.vvvvvv('WINRM STDERR %s' % to_text(response.std_err), host=self._winrm_host)
+ display.vvvvv('WINRM RESULT <Response code %d, out %r, err %r>' % (rc, stdout, stderr), host=self._winrm_host)
+ display.vvvvvv('WINRM RC %d' % rc, host=self._winrm_host)
+ display.vvvvvv('WINRM STDOUT %s' % stdout, host=self._winrm_host)
+ display.vvvvvv('WINRM STDERR %s' % stderr, host=self._winrm_host)
+
+ # This is done after logging so we can still see the raw stderr for
+ # debugging purposes.
+ if b_stderr.startswith(b"#< CLIXML"):
+ b_stderr = _parse_clixml(b_stderr)
+ stderr = to_text(b_stderr)
if stdin_push_failed:
# There are cases where the stdin input failed but the WinRM service still processed it. We attempt to
# see if stdout contains a valid json return value so we can ignore this error
try:
- filtered_output, dummy = _filter_non_json_lines(response.std_out)
+ filtered_output, dummy = _filter_non_json_lines(stdout)
json.loads(filtered_output)
except ValueError:
# stdout does not contain a return response, stdin input was a fatal error
- stderr = to_bytes(response.std_err, encoding='utf-8')
- if stderr.startswith(b"#< CLIXML"):
- stderr = _parse_clixml(stderr)
-
- raise AnsibleError('winrm send_input failed; \nstdout: %s\nstderr %s'
- % (to_native(response.std_out), to_native(stderr)))
+ raise AnsibleError(f'winrm send_input failed; \nstdout: {stdout}\nstderr {stderr}')
- return response
+ return rc, b_stdout, b_stderr
except requests.exceptions.Timeout as exc:
raise AnsibleConnectionFailure('winrm connection error: %s' % to_native(exc))
finally:
@@ -653,20 +725,7 @@
if in_data:
stdin_iterator = self._wrapper_payload_stream(in_data)
- result = self._winrm_exec(cmd_parts[0], cmd_parts[1:], from_exec=True, stdin_iterator=stdin_iterator)
-
- result.std_out = to_bytes(result.std_out)
- result.std_err = to_bytes(result.std_err)
-
- # parse just stderr from CLIXML output
- if result.std_err.startswith(b"#< CLIXML"):
- try:
- result.std_err = _parse_clixml(result.std_err)
- except Exception:
- # unsure if we're guaranteed a valid xml doc- use raw output in case of error
- pass
-
- return (result.status_code, result.std_out, result.std_err)
+ return self._winrm_exec(cmd_parts[0], cmd_parts[1:], from_exec=True, stdin_iterator=stdin_iterator)
# FUTURE: determine buffer size at runtime via remote winrm config?
def _put_file_stdin_iterator(self, in_path: str, out_path: str, buffer_size: int = 250000) -> t.Iterable[tuple[bytes, bool]]:
@@ -724,19 +783,18 @@
script = script_template.format(self._shell._escape(out_path))
cmd_parts = self._shell._encode_script(script, as_list=True, strict_mode=False, preserve_rc=False)
- result = self._winrm_exec(cmd_parts[0], cmd_parts[1:], stdin_iterator=self._put_file_stdin_iterator(in_path, out_path))
- # TODO: improve error handling
- if result.status_code != 0:
- raise AnsibleError(to_native(result.std_err))
+ status_code, b_stdout, b_stderr = self._winrm_exec(cmd_parts[0], cmd_parts[1:], stdin_iterator=self._put_file_stdin_iterator(in_path, out_path))
+ stdout = to_text(b_stdout)
+ stderr = to_text(b_stderr)
+
+ if status_code != 0:
+ raise AnsibleError(stderr)
try:
- put_output = json.loads(result.std_out)
+ put_output = json.loads(stdout)
except ValueError:
# stdout does not contain a valid response
- stderr = to_bytes(result.std_err, encoding='utf-8')
- if stderr.startswith(b"#< CLIXML"):
- stderr = _parse_clixml(stderr)
- raise AnsibleError('winrm put_file failed; \nstdout: %s\nstderr %s' % (to_native(result.std_out), to_native(stderr)))
+ raise AnsibleError('winrm put_file failed; \nstdout: %s\nstderr %s' % (stdout, stderr))
remote_sha1 = put_output.get("sha1")
if not remote_sha1:
@@ -788,13 +846,16 @@
''' % dict(buffer_size=buffer_size, path=self._shell._escape(in_path), offset=offset)
display.vvvvv('WINRM FETCH "%s" to "%s" (offset=%d)' % (in_path, out_path, offset), host=self._winrm_host)
cmd_parts = self._shell._encode_script(script, as_list=True, preserve_rc=False)
- result = self._winrm_exec(cmd_parts[0], cmd_parts[1:])
- if result.status_code != 0:
- raise IOError(to_native(result.std_err))
- if result.std_out.strip() == '[DIR]':
+ status_code, b_stdout, b_stderr = self._winrm_exec(cmd_parts[0], cmd_parts[1:])
+ stdout = to_text(b_stdout)
+ stderr = to_text(b_stderr)
+
+ if status_code != 0:
+ raise IOError(stderr)
+ if stdout.strip() == '[DIR]':
data = None
else:
- data = base64.b64decode(result.std_out.strip())
+ data = base64.b64decode(stdout.strip())
if data is None:
break
else:
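The polling logic added in `_winrm_get_command_output` above can be sketched standalone. Everything below (the exception class, `fetch_raw`, the simulated server) is a hypothetical stand-in for the pywinrm pieces, showing only the accumulate-until-done shape of the loop:

```python
class OperationTimeout(Exception):
    """Stand-in for pywinrm's WinRMOperationTimeoutError."""


def get_command_output(fetch_raw, try_once=False):
    """Accumulate output chunks until the remote command reports done.

    fetch_raw mimics _winrm_get_raw_command_output: it returns a tuple
    (stdout, stderr, rc, done), or raises OperationTimeout while the
    command is still running and has produced no new output.
    """
    stdout_buffer, stderr_buffer = [], []
    return_code = -1
    command_done = False

    while not command_done:
        try:
            stdout, stderr, return_code, command_done = fetch_raw()
            stdout_buffer.append(stdout)
            stderr_buffer.append(stderr)
            # Once output arrived at least once, keep polling even if
            # the caller only asked for a single attempt.
            try_once = False
        except OperationTimeout:
            # Expected while waiting on a long-running process; retry
            # unless we were told to make exactly one attempt.
            if try_once:
                break

    return b"".join(stdout_buffer), b"".join(stderr_buffer), return_code


# Simulated server: one timeout, then two chunks of output.
responses = iter([
    OperationTimeout(),
    (b"hel", b"", -1, False),
    (b"lo", b"", 0, True),
])


def fake_fetch():
    item = next(responses)
    if isinstance(item, Exception):
        raise item
    return item


out, err, rc = get_command_output(fake_fetch)
```

The first timeout is swallowed and polling continues, so the two chunks are joined into a single stdout value with the final return code.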
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/ansible-core-2.16.4/lib/ansible/plugins/loader.py new/ansible-core-2.16.5/lib/ansible/plugins/loader.py
--- old/ansible-core-2.16.4/lib/ansible/plugins/loader.py 2024-02-26 22:04:58.000000000 +0100
+++ new/ansible-core-2.16.5/lib/ansible/plugins/loader.py 2024-03-25 18:07:00.000000000 +0100
@@ -35,6 +35,7 @@
from ansible.utils.collection_loader._collection_finder import _AnsibleCollectionFinder, _get_collection_metadata
from ansible.utils.display import Display
from ansible.utils.plugin_docs import add_fragments
+from ansible.utils.unsafe_proxy import _is_unsafe
# TODO: take the packaging dep, or vendor SpecifierSet?
@@ -862,6 +863,17 @@
def get_with_context(self, name, *args, **kwargs):
''' instantiates a plugin of the given name using arguments '''
+ if _is_unsafe(name):
+ # Objects constructed using the name wrapped as unsafe remain
+ # (correctly) unsafe. Using such unsafe objects in places
+ # where underlying types (builtin string in this case) are
+ # expected can cause problems.
+ # One such case is importlib.abc.Loader.exec_module failing
+ # with "ValueError: unmarshallable object" because the module
+ # object is created with the __path__ attribute being wrapped
+ # as unsafe which isn't marshallable.
+ # Manually removing the unsafe wrapper prevents such issues.
+ name = name._strip_unsafe()
found_in_cache = True
class_only = kwargs.pop('class_only', False)
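The failure mode the loader change guards against can be reproduced without Ansible: `marshal` only accepts exact builtin types, so a `str` subclass (here `UnsafeText`, a hypothetical stand-in for Ansible's unsafe string wrapper) raises the "unmarshallable object" ValueError the comment mentions, while the plain string obtained by stripping the wrapper marshals fine:

```python
import marshal


class UnsafeText(str):
    """Hypothetical stand-in for Ansible's unsafe string wrapper."""


name = UnsafeText("ping")

# marshal rejects str subclasses; this is the failure the loader
# change avoids by unwrapping the name first.
try:
    marshal.dumps(name)
    marshallable = True
except ValueError:
    marshallable = False

# Analogous to name._strip_unsafe(): back to a plain builtin str.
plain = str(name)
```

With the subclass, `marshal.dumps` fails; after normalizing to `str` it succeeds, which is why the loader unwraps the name before module objects are built from it.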
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/ansible-core-2.16.4/lib/ansible/release.py new/ansible-core-2.16.5/lib/ansible/release.py
--- old/ansible-core-2.16.4/lib/ansible/release.py 2024-02-26 22:04:58.000000000 +0100
+++ new/ansible-core-2.16.5/lib/ansible/release.py 2024-03-25 18:07:00.000000000 +0100
@@ -19,6 +19,6 @@
from __future__ import (absolute_import, division, print_function)
__metaclass__ = type
-__version__ = '2.16.4'
+__version__ = '2.16.5'
__author__ = 'Ansible, Inc.'
__codename__ = "All My Love"
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/ansible-core-2.16.4/lib/ansible/template/native_helpers.py new/ansible-core-2.16.5/lib/ansible/template/native_helpers.py
--- old/ansible-core-2.16.4/lib/ansible/template/native_helpers.py 2024-02-26 22:04:58.000000000 +0100
+++ new/ansible-core-2.16.5/lib/ansible/template/native_helpers.py 2024-03-25 18:07:00.000000000 +0100
@@ -67,7 +67,7 @@
)
)
)
- except (ValueError, SyntaxError, MemoryError):
+ except (TypeError, ValueError, SyntaxError, MemoryError):
pass
return out
@@ -129,7 +129,7 @@
# parse the string ourselves without removing leading spaces/tabs.
ast.parse(out, mode='eval')
)
- except (ValueError, SyntaxError, MemoryError):
+ except (TypeError, ValueError, SyntaxError, MemoryError):
return out
if isinstance(evaled, string_types):
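Both hunks above widen the same except clause around a literal-evaluation attempt. The pattern can be sketched in isolation (the function name is illustrative, not the ansible helper): evaluate the string if it is a valid Python literal, otherwise return it unchanged, now also tolerating `TypeError`:

```python
import ast


def best_effort_eval(expr):
    """Return the literal value of expr, or expr unchanged if it
    cannot be evaluated as a Python literal."""
    try:
        return ast.literal_eval(expr)
    except (TypeError, ValueError, SyntaxError, MemoryError):
        return expr


value = best_effort_eval("[1, 2, 3]")     # a real literal: evaluated
text = best_effort_eval("not a literal")  # unparsable: returned as-is
```

The fix simply adds `TypeError` to the tuple so inputs that make parsing blow up with that exception fall back to the raw string instead of propagating.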
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/ansible-core-2.16.4/lib/ansible_core.egg-info/PKG-INFO new/ansible-core-2.16.5/lib/ansible_core.egg-info/PKG-INFO
--- old/ansible-core-2.16.4/lib/ansible_core.egg-info/PKG-INFO 2024-02-26 22:04:58.000000000 +0100
+++ new/ansible-core-2.16.5/lib/ansible_core.egg-info/PKG-INFO 2024-03-25 18:07:00.000000000 +0100
@@ -1,6 +1,6 @@
Metadata-Version: 2.1
Name: ansible-core
-Version: 2.16.4
+Version: 2.16.5
Summary: Radically simple IT automation
Home-page: https://ansible.com/
Author: Ansible, Inc.
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/ansible-core-2.16.4/lib/ansible_core.egg-info/SOURCES.txt new/ansible-core-2.16.5/lib/ansible_core.egg-info/SOURCES.txt
--- old/ansible-core-2.16.4/lib/ansible_core.egg-info/SOURCES.txt 2024-02-26 22:04:58.000000000 +0100
+++ new/ansible-core-2.16.5/lib/ansible_core.egg-info/SOURCES.txt 2024-03-25 18:07:00.000000000 +0100
@@ -3104,7 +3104,9 @@
test/integration/targets/plugin_filtering/tempfile.yml
test/integration/targets/plugin_loader/aliases
test/integration/targets/plugin_loader/runme.sh
+test/integration/targets/plugin_loader/unsafe_plugin_name.yml
test/integration/targets/plugin_loader/use_coll_name.yml
+test/integration/targets/plugin_loader/collections/ansible_collections/n/c/plugins/action/a.py
test/integration/targets/plugin_loader/file_collision/play.yml
test/integration/targets/plugin_loader/file_collision/roles/r1/filter_plugins/custom.py
test/integration/targets/plugin_loader/file_collision/roles/r1/filter_plugins/filter1.yml
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/ansible-core-2.16.4/test/integration/targets/ansible-test-container/runme.py new/ansible-core-2.16.5/test/integration/targets/ansible-test-container/runme.py
--- old/ansible-core-2.16.4/test/integration/targets/ansible-test-container/runme.py 2024-02-26 22:04:58.000000000 +0100
+++ new/ansible-core-2.16.5/test/integration/targets/ansible-test-container/runme.py 2024-03-25 18:07:00.000000000 +0100
@@ -1058,9 +1058,9 @@
packages = ['docker', 'podman', 'openssl', 'crun', 'ip6tables']
run_command('apk', 'add', *packages)
- # 3.18 only contains crun 1.8.4, to get 1.9.2 to resolve the run/shm issue, install crun from edge
+ # 3.18 only contains crun 1.8.4, to get 1.9.2 to resolve the run/shm issue, install crun from 3.19
# Remove once we update to 3.19
- run_command('apk', 'upgrade', '-U', '--repository=http://dl-cdn.alpinelinux.org/alpine/edge/community', 'crun')
+ run_command('apk', 'upgrade', '-U', '--repository=http://dl-cdn.alpinelinux.org/alpine/v3.19/community', 'crun')
run_command('service', 'docker', 'start')
run_command('modprobe', 'tun')
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/ansible-core-2.16.4/test/integration/targets/connection_psrp/tests.yml new/ansible-core-2.16.5/test/integration/targets/connection_psrp/tests.yml
--- old/ansible-core-2.16.4/test/integration/targets/connection_psrp/tests.yml 2024-02-26 22:04:58.000000000 +0100
+++ new/ansible-core-2.16.5/test/integration/targets/connection_psrp/tests.yml 2024-03-25 18:07:00.000000000 +0100
@@ -32,15 +32,8 @@
- raw_out.stdout_lines[4] == "winrm"
- raw_out.stdout_lines[5] == "string - \U0001F4A9"
- # Become only works on Server 2008 when running with basic auth, skip this host for now as it is too complicated to
- # override the auth protocol in the tests.
- - name: check if we running on Server 2008
- win_shell: '[System.Environment]::OSVersion.Version -ge [Version]"6.1"'
- register: os_version
-
- name: test out become with psrp
win_whoami:
- when: os_version|bool
register: whoami_out
become: yes
become_method: runas
@@ -50,7 +43,6 @@
assert:
that:
- whoami_out.account.sid == "S-1-5-18"
- when: os_version|bool
- name: test out async with psrp
win_shell: Start-Sleep -Seconds 2; Write-Output abc
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/ansible-core-2.16.4/test/integration/targets/plugin_loader/collections/ansible_collections/n/c/plugins/action/a.py new/ansible-core-2.16.5/test/integration/targets/plugin_loader/collections/ansible_collections/n/c/plugins/action/a.py
--- old/ansible-core-2.16.4/test/integration/targets/plugin_loader/collections/ansible_collections/n/c/plugins/action/a.py 1970-01-01 01:00:00.000000000 +0100
+++ new/ansible-core-2.16.5/test/integration/targets/plugin_loader/collections/ansible_collections/n/c/plugins/action/a.py 2024-03-25 18:07:00.000000000 +0100
@@ -0,0 +1,6 @@
+from ansible.plugins.action import ActionBase
+
+
+class ActionModule(ActionBase):
+ def run(self, tmp=None, task_vars=None):
+ return {"nca_executed": True}
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/ansible-core-2.16.4/test/integration/targets/plugin_loader/runme.sh new/ansible-core-2.16.5/test/integration/targets/plugin_loader/runme.sh
--- old/ansible-core-2.16.4/test/integration/targets/plugin_loader/runme.sh 2024-02-26 22:04:58.000000000 +0100
+++ new/ansible-core-2.16.5/test/integration/targets/plugin_loader/runme.sh 2024-03-25 18:07:00.000000000 +0100
@@ -37,3 +37,5 @@
# test filter loading ignoring duplicate file basename
ansible-playbook file_collision/play.yml "$@"
+
+ANSIBLE_COLLECTIONS_PATH=$PWD/collections ansible-playbook unsafe_plugin_name.yml "$@"
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/ansible-core-2.16.4/test/integration/targets/plugin_loader/unsafe_plugin_name.yml new/ansible-core-2.16.5/test/integration/targets/plugin_loader/unsafe_plugin_name.yml
--- old/ansible-core-2.16.4/test/integration/targets/plugin_loader/unsafe_plugin_name.yml 1970-01-01 01:00:00.000000000 +0100
+++ new/ansible-core-2.16.5/test/integration/targets/plugin_loader/unsafe_plugin_name.yml 2024-03-25 18:07:00.000000000 +0100
@@ -0,0 +1,9 @@
+- hosts: localhost
+ gather_facts: false
+ tasks:
+ - action: !unsafe n.c.a
+ register: r
+
+ - assert:
+ that:
+ - r.nca_executed
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/ansible-core-2.16.4/test/integration/targets/template/unsafe.yml new/ansible-core-2.16.5/test/integration/targets/template/unsafe.yml
--- old/ansible-core-2.16.4/test/integration/targets/template/unsafe.yml 2024-02-26 22:04:58.000000000 +0100
+++ new/ansible-core-2.16.5/test/integration/targets/template/unsafe.yml 2024-03-25 18:07:00.000000000 +0100
@@ -3,6 +3,7 @@
vars:
nottemplated: this should not be seen
imunsafe: !unsafe '{{ nottemplated }}'
+ unsafe_set: !unsafe '{{ "test" }}'
tasks:
- set_fact:
@@ -12,11 +13,15 @@
- set_fact:
this_always_safe: '{{ imunsafe }}'
+ - set_fact:
+ this_unsafe_set: "{{ unsafe_set }}"
+
- name: ensure nothing was templated
assert:
that:
- this_always_safe == imunsafe
- imunsafe == this_was_unsafe.strip()
+ - unsafe_set == this_unsafe_set.strip()
- hosts: localhost
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/ansible-core-2.16.4/test/lib/ansible_test/_internal/commands/units/__init__.py new/ansible-core-2.16.5/test/lib/ansible_test/_internal/commands/units/__init__.py
--- old/ansible-core-2.16.4/test/lib/ansible_test/_internal/commands/units/__init__.py 2024-02-26 22:04:58.000000000 +0100
+++ new/ansible-core-2.16.5/test/lib/ansible_test/_internal/commands/units/__init__.py 2024-03-25 18:07:00.000000000 +0100
@@ -261,6 +261,7 @@
'--junit-xml', os.path.join(ResultType.JUNIT.path, 'python%s-%s-units.xml' % (python.version, test_context)),
'--strict-markers', # added in pytest 4.5.0
'--rootdir', data_context().content.root,
+ '--confcutdir', data_context().content.root, # avoid permission errors when running from an installed version and using pytest >= 8
] # fmt:skip
if not data_context().content.collection:
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/ansible-core-2.16.4/test/lib/ansible_test/_util/target/setup/bootstrap.sh new/ansible-core-2.16.5/test/lib/ansible_test/_util/target/setup/bootstrap.sh
--- old/ansible-core-2.16.4/test/lib/ansible_test/_util/target/setup/bootstrap.sh 2024-02-26 22:04:58.000000000 +0100
+++ new/ansible-core-2.16.5/test/lib/ansible_test/_util/target/setup/bootstrap.sh 2024-03-25 18:07:00.000000000 +0100
@@ -111,6 +111,15 @@
echo "Failed to install packages. Sleeping before trying again..."
sleep 10
done
+
+ # Upgrade the `libexpat` package to ensure that an upgraded Python (`pyexpat`) continues to work.
+ while true; do
+ # shellcheck disable=SC2086
+ apk upgrade -q libexpat \
+ && break
+ echo "Failed to upgrade libexpat. Sleeping before trying again..."
+ sleep 10
+ done
}
bootstrap_remote_fedora()
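The added hunk reuses the retry-until-success idiom already present in bootstrap.sh. A minimal sketch of that idiom, with `true` standing in for the real `apk upgrade -q libexpat` call and the counter added only for illustration:

```shell
#!/bin/sh
# Hypothetical sketch of the retry loop pattern from bootstrap.sh.
attempts=0
while true; do
    attempts=$((attempts + 1))
    # `true` stands in for: apk upgrade -q libexpat
    true && break
    echo "Failed to upgrade. Sleeping before trying again..."
    sleep 10
done
echo "succeeded after $attempts attempt(s)"
```

Because the command-and-break sit on one `&&` chain, the loop exits on the first success and otherwise sleeps and retries indefinitely, matching the package-install loop a few lines earlier in the script.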
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/ansible-core-2.16.4/test/units/plugins/connection/test_winrm.py new/ansible-core-2.16.5/test/units/plugins/connection/test_winrm.py
--- old/ansible-core-2.16.4/test/units/plugins/connection/test_winrm.py 2024-02-26 22:04:58.000000000 +0100
+++ new/ansible-core-2.16.5/test/units/plugins/connection/test_winrm.py 2024-03-25 18:07:00.000000000 +0100
@@ -471,7 +471,7 @@
mock_proto = MagicMock()
mock_proto.run_command.return_value = "command_id"
- mock_proto.get_command_output.side_effect = requests_exc.Timeout("msg")
+ mock_proto.send_message.side_effect = requests_exc.Timeout("msg")
conn._connected = True
conn._winrm_host = 'hostname'
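The test change above moves the simulated timeout from `get_command_output` to the lower-level `send_message`, reflecting where the refactored code now talks to the transport. How `side_effect` makes a mocked method raise can be shown in a small sketch; `FakeTimeout` is a stand-in for `requests.exceptions.Timeout` so the example needs no third-party imports:

```python
from unittest.mock import MagicMock


class FakeTimeout(Exception):
    """Stand-in for requests.exceptions.Timeout."""


mock_proto = MagicMock()
# Any call to send_message now raises instead of returning a value.
mock_proto.send_message.side_effect = FakeTimeout("msg")

try:
    mock_proto.send_message(b"payload")
    raised = False
except FakeTimeout:
    raised = True
```

Setting `side_effect` on the mocked transport method is what lets the unit test exercise the connection plugin's timeout-to-`AnsibleConnectionFailure` translation without a real WinRM endpoint.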
Script 'mail_helper' called by obssrc
Hello community,
here is the log from the commit of package ansible for openSUSE:Factory checked in at 2024-03-29 13:09:58
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/ansible (Old)
and /work/SRC/openSUSE:Factory/.ansible.new.1905 (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Package is "ansible"
Fri Mar 29 13:09:58 2024 rev:110 rq:1163353 version:9.4.0
Changes:
--------
--- /work/SRC/openSUSE:Factory/ansible/ansible.changes 2024-03-17 22:16:47.530130494 +0100
+++ /work/SRC/openSUSE:Factory/.ansible.new.1905/ansible.changes 2024-03-29 13:11:03.759361793 +0100
@@ -1,0 +2,11 @@
+Wed Mar 27 19:56:06 UTC 2024 - Johannes Kastl <opensuse_buildservice(a)ojkastl.de>
+
+- update to 9.4.0:
+ Ansible 9.4.0 includes ansible-core 2.16.5 as well as a curated
+ set of Ansible collections that provide a vast number of modules
+ and plugins.
+ Collections which have opted-in to being a part of the Ansible 9
+ unified changelog will have an entry on this page:
+ https://github.com/ansible-community/ansible-build-data/blob/main/9/CHANGEL…
+
+-------------------------------------------------------------------
@@ -5 +16 @@
- Ansible 9.2.0 includes ansible-core 2.16.4 as well as a curated
+ Ansible 9.3.0 includes ansible-core 2.16.4 as well as a curated
Old:
----
ansible-9.3.0.tar.gz
New:
----
ansible-9.4.0.tar.gz
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Other differences:
------------------
++++++ ansible.spec ++++++
--- /var/tmp/diff_new_pack.dy5Okj/_old 2024-03-29 13:11:04.663395020 +0100
+++ /var/tmp/diff_new_pack.dy5Okj/_new 2024-03-29 13:11:04.663395020 +0100
@@ -38,7 +38,7 @@
%endif
Name: ansible
-Version: 9.3.0
+Version: 9.4.0
Release: 0
Summary: Radically simple IT automation
License: GPL-3.0+
@@ -54,11 +54,12 @@
BuildRequires: dos2unix
# SECTION test requirements
-BuildRequires: ansible-core >= 2.16.4
+BuildRequires: ansible-core >= 2.16.5
# /SECTION
+#
Requires: %{ansible_python}-base >= 3.10
-Requires: ansible-core >= 2.16.4
+Requires: ansible-core >= 2.16.5
# Do not check any files in collections for requires
%global __requires_exclude_from ^%{ansible_python_sitelib}/.*$
++++++ ansible-9.3.0.tar.gz -> ansible-9.4.0.tar.gz ++++++
/work/SRC/openSUSE:Factory/ansible/ansible-9.3.0.tar.gz /work/SRC/openSUSE:Factory/.ansible.new.1905/ansible-9.4.0.tar.gz differ: char 5, line 1
Hello community,
here is the log from the commit of package php-pear-Net_LDAP2 for openSUSE:Factory checked in at 2024-03-29 13:09:57
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/php-pear-Net_LDAP2 (Old)
and /work/SRC/openSUSE:Factory/.php-pear-Net_LDAP2.new.1905 (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Package is "php-pear-Net_LDAP2"
Fri Mar 29 13:09:57 2024 rev:2 rq:1163358 version:2.3.0
Changes:
--------
--- /work/SRC/openSUSE:Factory/php-pear-Net_LDAP2/php-pear-Net_LDAP2.changes 2019-12-13 12:05:58.705365804 +0100
+++ /work/SRC/openSUSE:Factory/.php-pear-Net_LDAP2.new.1905/php-pear-Net_LDAP2.changes 2024-03-29 13:10:48.490800613 +0100
@@ -1,0 +2,6 @@
+Thu Mar 28 14:20:45 UTC 2024 - pgajdos(a)suse.com
+
+- version update to 2.3.0
+ * Fixes for PHP 8
+
+-------------------------------------------------------------------
Old:
----
Net_LDAP2-2.2.0.tgz
New:
----
v2.3.0.tar.gz
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Other differences:
------------------
++++++ php-pear-Net_LDAP2.spec ++++++
--- /var/tmp/diff_new_pack.Ea2E6L/_old 2024-03-29 13:10:49.870851336 +0100
+++ /var/tmp/diff_new_pack.Ea2E6L/_new 2024-03-29 13:10:49.874851482 +0100
@@ -1,7 +1,7 @@
#
# spec file for package php-pear-Net_LDAP2
#
-# Copyright (c) 2019 SUSE LLC
+# Copyright (c) 2024 SUSE LLC
#
# All modifications and additions to the file contributed by third parties
# remain the property of their copyright owners, unless otherwise agreed
@@ -19,13 +19,13 @@
%define pear_name Net_LDAP2
Name: php-pear-Net_LDAP2
-Version: 2.2.0
+Version: 2.3.0
Release: 0
Summary: Object oriented interface for searching and manipulating LDAP-entries
License: LGPL-3.0-only
Group: Development/Libraries/Other
-URL: https://pear.php.net/package/%{pear_name}
-Source: https://pear.php.net/get/%{pear_name}-%{version}.tgz
+URL: https://github.com/pear/%{pear_name}
+Source: https://github.com/pear/%{pear_name}/archive/refs/tags/v%{version}.tar.gz
BuildRequires: php-devel
BuildRequires: php-pear
Requires: php-ldap