openSUSE Commits
Script 'mail_helper' called by obssrc
Hello community,
here is the log from the commit of package hyprpaper for openSUSE:Factory checked in at 2024-06-07 15:03:09
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/hyprpaper (Old)
and /work/SRC/openSUSE:Factory/.hyprpaper.new.24587 (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Package is "hyprpaper"
Fri Jun 7 15:03:09 2024 rev:4 rq:1178995 version:0.7.0
Changes:
--------
--- /work/SRC/openSUSE:Factory/hyprpaper/hyprpaper.changes 2024-04-02 16:45:00.438711731 +0200
+++ /work/SRC/openSUSE:Factory/.hyprpaper.new.24587/hyprpaper.changes 2024-06-07 15:03:33.288183759 +0200
@@ -1,0 +2,26 @@
+Thu Jun 6 10:00:17 UTC 2024 - Joshua Smith <smolsheep(a)opensuse.org>
+
+- Simplify install to just cmake
+- Remove now-unneeded make protocols
+- Update to version 0.7.0:
+ Fixes
+ * Fixed IPC with wildcards
+ * Added unload unused
+ * Moved socket to match hyprland 0.40.0
+ MRs
+ * Disable splash message by default
+ * readme: fix typos
+ * ipc: Added listloaded and listactive requests
+ * Fix error checking while changing wallpaper.
+ * Updated link to reflect arch package movement from community to
+ extra
+ * Nix: add home-manager module
+ * Set standard exclusively for c++
+ * Add OpenSuse to the installer method list
+ * hyprpaper: add splash_color configuration option
+ * Added missing hyprlang-devel dependency for Fedora
+ * Remove comma from monitor description
+ * Fix typo in hm-module.nix
+ * Move socket to XDG_RUNTIME_DIR
+
+-------------------------------------------------------------------
Old:
----
hyprpaper-0.6.0.tar.gz
New:
----
hyprpaper-0.7.0.tar.gz
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Other differences:
------------------
++++++ hyprpaper.spec ++++++
--- /var/tmp/diff_new_pack.oxCsdy/_old 2024-06-07 15:03:33.816202995 +0200
+++ /var/tmp/diff_new_pack.oxCsdy/_new 2024-06-07 15:03:33.816202995 +0200
@@ -19,7 +19,7 @@
%global __builder ninja
Name: hyprpaper
Summary: Wayland wallpaper utility with IPC controls
-Version: 0.6.0
+Version: 0.7.0
Release: 0
License: BSD-3-Clause
URL: https://github.com/hyprwm/hyprpaper
@@ -51,14 +51,11 @@
%autosetup
%build
-# Necessary to allow build
-make protocols
-
%cmake
%cmake_build
%install
-install -Dm0755 -t "%{buildroot}%{_bindir}" "%{_builddir}/%{name}-%{version}/build/%{name}"
+%cmake_install
%files
%_bindir/hyprpaper
++++++ hyprpaper-0.6.0.tar.gz -> hyprpaper-0.7.0.tar.gz ++++++
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/hyprpaper-0.6.0/CMakeLists.txt new/hyprpaper-0.7.0/CMakeLists.txt
--- old/hyprpaper-0.6.0/CMakeLists.txt 2024-01-02 22:22:34.000000000 +0100
+++ new/hyprpaper-0.7.0/CMakeLists.txt 2024-04-28 23:25:36.000000000 +0200
@@ -35,8 +35,38 @@
#
#
+find_program(WaylandScanner NAMES wayland-scanner)
+message(STATUS "Found WaylandScanner at ${WaylandScanner}")
+execute_process(
+ COMMAND pkg-config --variable=pkgdatadir wayland-protocols
+ WORKING_DIRECTORY ${CMAKE_SOURCE_DIR}
+ OUTPUT_VARIABLE WAYLAND_PROTOCOLS_DIR
+ OUTPUT_STRIP_TRAILING_WHITESPACE)
+message(STATUS "Found wayland-protocols at ${WAYLAND_PROTOCOLS_DIR}")
+
+function(protocol protoPath protoName external)
+ if (external)
+ execute_process(
+ COMMAND ${WaylandScanner} client-header ${protoPath} ${protoName}-protocol.h
+ WORKING_DIRECTORY ${CMAKE_SOURCE_DIR})
+ execute_process(
+ COMMAND ${WaylandScanner} private-code ${protoPath} ${protoName}-protocol.c
+ WORKING_DIRECTORY ${CMAKE_SOURCE_DIR})
+ target_sources(hyprpaper PRIVATE ${protoName}-protocol.h ${protoName}-protocol.c)
+ else()
+ execute_process(
+ COMMAND ${WaylandScanner} client-header ${WAYLAND_PROTOCOLS_DIR}/${protoPath} ${protoName}-protocol.h
+ WORKING_DIRECTORY ${CMAKE_SOURCE_DIR})
+ execute_process(
+ COMMAND ${WaylandScanner} private-code ${WAYLAND_PROTOCOLS_DIR}/${protoPath} ${protoName}-protocol.c
+ WORKING_DIRECTORY ${CMAKE_SOURCE_DIR})
+ target_sources(hyprpaper PRIVATE ${protoName}-protocol.h ${protoName}-protocol.c)
+ endif()
+endfunction()
+
include_directories(.)
-add_compile_options(-std=c++2b -DWLR_USE_UNSTABLE )
+set(CMAKE_CXX_STANDARD 23)
+add_compile_options(-DWLR_USE_UNSTABLE)
add_compile_options(-Wall -Wextra -Wno-unused-parameter -Wno-unused-value -Wno-missing-field-initializers -Wno-narrowing)
find_package(Threads REQUIRED)
@@ -47,6 +77,11 @@
add_executable(hyprpaper ${SRCFILES})
+protocol("protocols/wlr-layer-shell-unstable-v1.xml" "wlr-layer-shell-unstable-v1" true)
+protocol("stable/xdg-shell/xdg-shell.xml" "xdg-shell" false)
+protocol("stable/viewporter/viewporter.xml" "viewporter" false)
+protocol("staging/fractional-scale/fractional-scale-v1.xml" "fractional-scale-v1" false)
+
target_compile_definitions(hyprpaper PRIVATE "-DGIT_COMMIT_HASH=\"${GIT_COMMIT_HASH}\"")
target_compile_definitions(hyprpaper PRIVATE "-DGIT_BRANCH=\"${GIT_BRANCH}\"")
target_compile_definitions(hyprpaper PRIVATE "-DGIT_COMMIT_MESSAGE=\"${GIT_COMMIT_MESSAGE}\"")
@@ -66,10 +101,6 @@
pthread
magic
${CMAKE_THREAD_LIBS_INIT}
- ${CMAKE_SOURCE_DIR}/wlr-layer-shell-unstable-v1-protocol.o
- ${CMAKE_SOURCE_DIR}/xdg-shell-protocol.o
- ${CMAKE_SOURCE_DIR}/fractional-scale-v1-protocol.o
- ${CMAKE_SOURCE_DIR}/viewporter-protocol.o
wayland-cursor
)
@@ -78,3 +109,5 @@
SET(CMAKE_EXE_LINKER_FLAGS "${CMAKE_EXE_LINKER_FLAGS} -pg -no-pie -fno-builtin")
SET(CMAKE_SHARED_LINKER_FLAGS "${CMAKE_SHARED_LINKER_FLAGS} -pg -no-pie -fno-builtin")
ENDIF(CMAKE_BUILD_TYPE MATCHES Debug OR CMAKE_BUILD_TYPE MATCHES DEBUG)
+
+install(TARGETS hyprpaper)
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/hyprpaper-0.6.0/Makefile new/hyprpaper-0.7.0/Makefile
--- old/hyprpaper-0.6.0/Makefile 2024-01-02 22:22:34.000000000 +0100
+++ new/hyprpaper-0.7.0/Makefile 1970-01-01 01:00:00.000000000 +0100
@@ -1,74 +0,0 @@
-PREFIX ?= /usr
-CFLAGS ?= -g -Wall -Wextra -Werror -Wno-unused-parameter -Wno-sign-compare -Wno-unused-function -Wno-unused-variable -Wno-unused-result -Wdeclaration-after-statement
-
-CFLAGS += -I. -DWLR_USE_UNSTABLE -std=c99
-
-WAYLAND_PROTOCOLS=$(shell pkg-config --variable=pkgdatadir wayland-protocols)
-WAYLAND_SCANNER=$(shell pkg-config --variable=wayland_scanner wayland-scanner)
-
-PKGS = wlroots wayland-server
-CFLAGS += $(foreach p,$(PKGS),$(shell pkg-config --cflags $(p)))
-LDLIBS += $(foreach p,$(PKGS),$(shell pkg-config --libs $(p)))
-
-wlr-layer-shell-unstable-v1-protocol.h:
- $(WAYLAND_SCANNER) client-header \
- protocols/wlr-layer-shell-unstable-v1.xml $@
-
-wlr-layer-shell-unstable-v1-protocol.c:
- $(WAYLAND_SCANNER) private-code \
- protocols/wlr-layer-shell-unstable-v1.xml $@
-
-wlr-layer-shell-unstable-v1-protocol.o: wlr-layer-shell-unstable-v1-protocol.h
-
-xdg-shell-protocol.h:
- $(WAYLAND_SCANNER) client-header \
- $(WAYLAND_PROTOCOLS)/stable/xdg-shell/xdg-shell.xml $@
-
-xdg-shell-protocol.c:
- $(WAYLAND_SCANNER) private-code \
- $(WAYLAND_PROTOCOLS)/stable/xdg-shell/xdg-shell.xml $@
-
-xdg-shell-protocol.o: xdg-shell-protocol.h
-
-fractional-scale-v1-protocol.h:
- $(WAYLAND_SCANNER) client-header \
- $(WAYLAND_PROTOCOLS)/staging/fractional-scale/fractional-scale-v1.xml $@
-
-fractional-scale-v1-protocol.c:
- $(WAYLAND_SCANNER) private-code \
- $(WAYLAND_PROTOCOLS)/staging/fractional-scale/fractional-scale-v1.xml $@
-
-fractional-scale-v1-protocol.o: fractional-scale-v1-protocol.h
-
-viewporter-protocol.h:
- $(WAYLAND_SCANNER) client-header \
- $(WAYLAND_PROTOCOLS)/stable/viewporter/viewporter.xml $@
-
-viewporter-protocol.c:
- $(WAYLAND_SCANNER) private-code \
- $(WAYLAND_PROTOCOLS)/stable/viewporter/viewporter.xml $@
-
-viewporter-protocol.o: viewporter-protocol.h
-
-protocols: wlr-layer-shell-unstable-v1-protocol.o xdg-shell-protocol.o fractional-scale-v1-protocol.o viewporter-protocol.o
-
-clear:
- rm -rf build
- rm -f *.o *-protocol.h *-protocol.c
-
-release:
- mkdir -p build && cmake --no-warn-unused-cli -DCMAKE_BUILD_TYPE:STRING=Release -H./ -B./build -G Ninja
- cmake --build ./build --config Release --target all -j 10
-
-debug:
- mkdir -p build && cmake --no-warn-unused-cli -DCMAKE_BUILD_TYPE:STRING=Debug -H./ -B./build -G Ninja
- cmake --build ./build --config Debug --target all -j 10
-
-all:
- make clear
- make protocols
- make release
-
-install:
- make all
- cp ./build/hyprpaper $(PREFIX)/bin -f
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/hyprpaper-0.6.0/README.md new/hyprpaper-0.7.0/README.md
--- old/hyprpaper-0.6.0/README.md 2024-01-02 22:22:34.000000000 +0100
+++ new/hyprpaper-0.7.0/README.md 2024-04-28 23:25:36.000000000 +0200
@@ -11,7 +11,9 @@
# Installation
-[Arch Linux](https://archlinux.org/packages/community/x86_64/hyprpaper/): `pacman -S hyprpaper`
+[Arch Linux](https://archlinux.org/packages/extra/x86_64/hyprpaper/): `pacman -S hyprpaper`
+
+[OpenSuse Linux](https://software.opensuse.org/package/hyprpaper): `zypper install hyprpaper`
## Manual:
@@ -34,7 +36,7 @@
To install all of these in Fedora, run this command:
```
-sudo dnf install wayland-devel wayland-protocols-devel pango-devel cairo-devel file-devel libglvnd-devel libglvnd-core-devel libjpeg-turbo-devel libwebp-devel gcc-c++
+sudo dnf install wayland-devel wayland-protocols-devel hyprlang-devel pango-devel cairo-devel file-devel libglvnd-devel libglvnd-core-devel libjpeg-turbo-devel libwebp-devel gcc-c++
```
On Arch:
@@ -49,13 +51,18 @@
### Building
-```
-git clone https://github.com/hyprwm/hyprpaper
-cd hyprpaper
-make all
+Building is done via CMake:
+
+```sh
+cmake --no-warn-unused-cli -DCMAKE_BUILD_TYPE:STRING=Release -DCMAKE_INSTALL_PREFIX:PATH=/usr -S . -B ./build
+cmake --build ./build --config Release --target hyprpaper -j`nproc 2>/dev/null || getconf NPROCESSORS_CONF`
```
-*the output binary will be in `./build/`, copy it to your PATH, e.g. `/usr/bin`*
+Install with:
+
+```sh
+cmake --install ./build
+```
# Usage
@@ -68,14 +75,22 @@
preload = /path/to/next_image.png
# .. more preloads
-#set the default wallpaper(s) seen on inital workspace(s) --depending on the number of monitors used
+#set the default wallpaper(s) seen on initial workspace(s) --depending on the number of monitors used
wallpaper = monitor1,/path/to/image.png
#if more than one monitor in use, can load a 2nd image
wallpaper = monitor2,/path/to/next_image.png
# .. more monitors
+
+#enable splash text rendering over the wallpaper
+splash = true
+
+#fully disable ipc
+# ipc = off
+
+
```
-Preload will tell Hyprland to load a particular image (supported formats: png, jpg, jpeg, webp). Wallpaper will apply the wallpaper to the selected output (`monitor` is the monitor's name, easily can be retrieved with `hyprctl monitors`. You can leave it empty for a wildcard (aka fallback). You can also use `desc:` followed by the monitor's description without the (PORT) at the end)
+Preload will tell Hyprland to load a particular image (supported formats: png, jpg, jpeg, webp). Wallpaper will apply the wallpaper to the selected output (`monitor` is the monitor's name, easily can be retrieved with `hyprctl monitors`. You can leave it empty to set all monitors without an active wallpaper. You can also use `desc:` followed by the monitor's description without the (PORT) at the end)
You may add `contain:` before the file path in `wallpaper=` to set the mode to contain instead of cover:
@@ -120,7 +135,7 @@
#yes use quotes around desired monitor and wallpaper
#... continued with desired amount
```
-With the varibles created we can now "exec" the actions.
+With the variables created we can now "exec" the actions.
Remember in Hyprland we can bind more than one action to a key so in the case where we'd like to change the wallpaper when we switch workspace we have to ensure that the actions are bound to the same key such as...
@@ -144,6 +159,13 @@
bind=SUPERSHIFT,1,exec,$w1 #SuperKey + Shift + 1 switches to wallpaper $w1 on DP-1 as defined in the variable
```
+## Getting information from hyprpaper
+You can also use `hyprctl hyprpaper` to get information about the state of hyprpaper using the following commands:
+```
+listloaded - lists the wallpapers that are currently preloaded (useful for dynamically preloading and unloading)
listactive - prints the active wallpapers hyprpaper is displaying, along with its associated monitor
+```
+
# Battery life
Since the IPC has to tick every now and then, and poll in the background, battery life might be a tiny bit worse with IPC on. If you want to fully disable it, use
```
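For reference, the configuration keys touched by this update (splash, splash_offset, ipc, and the new splash_color) combine into a minimal hyprpaper.conf along these lines; the paths and the monitor name are placeholders:

```
preload = ~/wallpapers/forest.png
wallpaper = DP-1,~/wallpapers/forest.png

# render the splash text over the wallpaper (the default is now off)
splash = true
splash_offset = 2.0
# new in 0.7.0: ARGB splash text colour (default 0x55ffffff)
splash_color = 0x55ffffff

# disable IPC entirely to save a little battery
# ipc = off
```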
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/hyprpaper-0.6.0/flake.lock new/hyprpaper-0.7.0/flake.lock
--- old/hyprpaper-0.6.0/flake.lock 2024-01-02 22:22:34.000000000 +0100
+++ new/hyprpaper-0.7.0/flake.lock 2024-04-28 23:25:36.000000000 +0200
@@ -2,14 +2,15 @@
"nodes": {
"hyprlang": {
"inputs": {
- "nixpkgs": "nixpkgs"
+ "nixpkgs": "nixpkgs",
+ "systems": "systems"
},
"locked": {
- "lastModified": 1704230242,
- "narHash": "sha256-S8DM+frECqmAdaUb3y5n3RjY73ajZcL5rnmx5YO+CkY=",
+ "lastModified": 1711250455,
+ "narHash": "sha256-LSq1ZsTpeD7xsqvlsepDEelWRDtAhqwetp6PusHXJRo=",
"owner": "hyprwm",
"repo": "hyprlang",
- "rev": "db5e1399b90d5a339330bdd49c5bca6fe58d6f60",
+ "rev": "b3e430f81f3364c5dd1a3cc9995706a4799eb3fa",
"type": "github"
},
"original": {
@@ -20,11 +21,11 @@
},
"nixpkgs": {
"locked": {
- "lastModified": 1702645756,
- "narHash": "sha256-qKI6OR3TYJYQB3Q8mAZ+DG4o/BR9ptcv9UnRV2hzljc=",
- "owner": "nixos",
+ "lastModified": 1708475490,
+ "narHash": "sha256-g1v0TsWBQPX97ziznfJdWhgMyMGtoBFs102xSYO4syU=",
+ "owner": "NixOS",
"repo": "nixpkgs",
- "rev": "40c3c94c241286dd2243ea34d3aef8a488f9e4d0",
+ "rev": "0e74ca98a74bc7270d28838369593635a5db3260",
"type": "github"
},
"original": {
@@ -36,11 +37,11 @@
},
"nixpkgs_2": {
"locked": {
- "lastModified": 1703637592,
- "narHash": "sha256-8MXjxU0RfFfzl57Zy3OfXCITS0qWDNLzlBAdwxGZwfY=",
+ "lastModified": 1711163522,
+ "narHash": "sha256-YN/Ciidm+A0fmJPWlHBGvVkcarYWSC+s3NTPk/P+q3c=",
"owner": "NixOS",
"repo": "nixpkgs",
- "rev": "cfc3698c31b1fb9cdcf10f36c9643460264d0ca8",
+ "rev": "44d0940ea560dee511026a53f0e2e2cde489b4d4",
"type": "github"
},
"original": {
@@ -53,7 +54,38 @@
"root": {
"inputs": {
"hyprlang": "hyprlang",
- "nixpkgs": "nixpkgs_2"
+ "nixpkgs": "nixpkgs_2",
+ "systems": "systems_2"
+ }
+ },
+ "systems": {
+ "locked": {
+ "lastModified": 1689347949,
+ "narHash": "sha256-12tWmuL2zgBgZkdoB6qXZsgJEH9LR3oUgpaQq2RbI80=",
+ "owner": "nix-systems",
+ "repo": "default-linux",
+ "rev": "31732fcf5e8fea42e59c2488ad31a0e651500f68",
+ "type": "github"
+ },
+ "original": {
+ "owner": "nix-systems",
+ "repo": "default-linux",
+ "type": "github"
+ }
+ },
+ "systems_2": {
+ "locked": {
+ "lastModified": 1689347949,
+ "narHash": "sha256-12tWmuL2zgBgZkdoB6qXZsgJEH9LR3oUgpaQq2RbI80=",
+ "owner": "nix-systems",
+ "repo": "default-linux",
+ "rev": "31732fcf5e8fea42e59c2488ad31a0e651500f68",
+ "type": "github"
+ },
+ "original": {
+ "owner": "nix-systems",
+ "repo": "default-linux",
+ "type": "github"
}
}
},
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/hyprpaper-0.6.0/flake.nix new/hyprpaper-0.7.0/flake.nix
--- old/hyprpaper-0.6.0/flake.nix 2024-01-02 22:22:34.000000000 +0100
+++ new/hyprpaper-0.7.0/flake.nix 2024-04-28 23:25:36.000000000 +0200
@@ -5,40 +5,53 @@
nixpkgs.url = "github:NixOS/nixpkgs/nixos-unstable";
hyprlang.url = "github:hyprwm/hyprlang";
+
+ systems.url = "github:nix-systems/default-linux";
};
outputs = {
self,
nixpkgs,
+ systems,
...
} @ inputs: let
inherit (nixpkgs) lib;
- genSystems = lib.genAttrs [
- # Add more systems if they are supported
- "x86_64-linux"
- "aarch64-linux"
- ];
- pkgsFor = nixpkgs.legacyPackages;
+ eachSystem = lib.genAttrs (import systems);
+
+ pkgsFor = eachSystem (system:
+ import nixpkgs {
+ localSystem.system = system;
+ overlays = with self.overlays; [hyprpaper];
+ });
mkDate = longDate: (lib.concatStringsSep "-" [
- (__substring 0 4 longDate)
- (__substring 4 2 longDate)
- (__substring 6 2 longDate)
+ (builtins.substring 0 4 longDate)
+ (builtins.substring 4 2 longDate)
+ (builtins.substring 6 2 longDate)
]);
in {
- overlays.default = _: prev: rec {
- hyprpaper = prev.callPackage ./nix/default.nix {
- stdenv = prev.gcc13Stdenv;
- version = "0.pre" + "+date=" + (mkDate (self.lastModifiedDate or "19700101")) + "_" + (self.shortRev or "dirty");
- inherit (prev.xorg) libXdmcp;
- inherit (inputs.hyprlang.packages.${prev.system}) hyprlang;
+ overlays = {
+ default = self.overlays.hyprpaper;
+ hyprpaper = final: prev: rec {
+ hyprpaper = final.callPackage ./nix/default.nix {
+ stdenv = final.gcc13Stdenv;
+ version = "0.pre" + "+date=" + (mkDate (self.lastModifiedDate or "19700101")) + "_" + (self.shortRev or "dirty");
+ inherit (final.xorg) libXdmcp;
+ inherit (inputs.hyprlang.packages.${final.system}) hyprlang;
+ };
+ hyprpaper-debug = hyprpaper.override {debug = true;};
};
- hyprpaper-debug = hyprpaper.override {debug = true;};
};
- packages = genSystems (system:
- (self.overlays.default null pkgsFor.${system})
- // {default = self.packages.${system}.hyprpaper;});
+ packages = eachSystem (system: {
+ default = self.packages.${system}.hyprpaper;
+ inherit (pkgsFor.${system}) hyprpaper hyprpaper-debug;
+ });
+
+ homeManagerModules = {
+ default = self.homeManagerModules.hyprpaper;
+ hyprpaper = import ./nix/hm-module.nix self;
+ };
- formatter = genSystems (system: pkgsFor.${system}.alejandra);
+ formatter = eachSystem (system: pkgsFor.${system}.alejandra);
};
}
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/hyprpaper-0.6.0/nix/default.nix new/hyprpaper-0.7.0/nix/default.nix
--- old/hyprpaper-0.6.0/nix/default.nix 2024-01-02 22:22:34.000000000 +0100
+++ new/hyprpaper-0.7.0/nix/default.nix 2024-04-28 23:25:36.000000000 +0200
@@ -3,7 +3,6 @@
stdenv,
pkg-config,
cmake,
- ninja,
cairo,
expat,
file,
@@ -30,11 +29,16 @@
stdenv.mkDerivation {
pname = "hyprpaper" + lib.optionalString debug "-debug";
inherit version;
+
src = ../.;
+ cmakeBuildType =
+ if debug
+ then "Debug"
+ else "Release";
+
nativeBuildInputs = [
cmake
- ninja
pkg-config
];
@@ -61,33 +65,6 @@
util-linux
];
- configurePhase = ''
- runHook preConfigure
-
- make protocols
-
- runHook postConfigure
- '';
-
- buildPhase = ''
- runHook preBuild
-
- make release
-
- runHook postBuild
- '';
-
- installPhase = ''
- runHook preInstall
-
- mkdir -p $out/{bin,share/licenses}
-
- install -Dm755 build/hyprpaper -t $out/bin
- install -Dm644 LICENSE -t $out/share/licenses/hyprpaper
-
- runHook postInstall
- '';
-
meta = with lib; {
homepage = "https://github.com/hyprwm/hyprpaper";
description = "A blazing fast wayland wallpaper utility with IPC controls";
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/hyprpaper-0.6.0/nix/hm-module.nix new/hyprpaper-0.7.0/nix/hm-module.nix
--- old/hyprpaper-0.6.0/nix/hm-module.nix 1970-01-01 01:00:00.000000000 +0100
+++ new/hyprpaper-0.7.0/nix/hm-module.nix 2024-04-28 23:25:36.000000000 +0200
@@ -0,0 +1,102 @@
+self: {
+ config,
+ pkgs,
+ lib,
+ ...
+}: let
+ inherit (builtins) toString;
+ inherit (lib.types) bool float listOf package str;
+ inherit (lib.modules) mkIf;
+ inherit (lib.options) mkOption mkEnableOption;
+ inherit (lib.meta) getExe;
+
+ boolToString = x:
+ if x
+ then "true"
+ else "false";
+ cfg = config.services.hyprpaper;
+in {
+ options.services.hyprpaper = {
+ enable = mkEnableOption "Hyprpaper, Hyprland's wallpaper utility";
+
+ package = mkOption {
+ description = "The hyprpaper package";
+ type = package;
+ default = self.packages.${pkgs.stdenv.hostPlatform.system}.hyprpaper;
+ };
+
+ ipc = mkOption {
+ description = "Whether to enable IPC";
+ type = bool;
+ default = true;
+ };
+
+ splash = mkOption {
+ description = "Enable rendering of the hyprland splash over the wallpaper";
+ type = bool;
+ default = false;
+ };
+
+ splash_offset = mkOption {
+ description = "How far (in % of height) up should the splash be displayed";
+ type = float;
+ default = 2.0;
+ };
+
+ preloads = mkOption {
+ description = "List of paths to images that will be loaded into memory.";
+ type = listOf str;
+ example = [
+ "~/Images/wallpapers/forest.png"
+ "~/Images/wallpapers/desert.png"
+ ];
+ };
+
+ wallpapers = mkOption {
+ description = "The wallpapers";
+ type = listOf str;
+ example = [
+ "eDP-1,~/Images/wallpapers/forest.png"
+ "DP-7,~/Images/wallpapers/desert.png"
+ ];
+ };
+ };
+
+ config = mkIf cfg.enable {
+ xdg.configFile."hypr/hyprpaper.conf".text = ''
+ ipc = ${
+ if cfg.ipc
+ then "on"
+ else "off"
+ }
+ splash = ${boolToString cfg.splash}
+ splash_offset = ${toString cfg.splash_offset}
+
+ ${
+ builtins.concatStringsSep "\n"
+ (
+ map (preload: "preload = ${preload}") cfg.preloads
+ )
+ }
+ ${
+ builtins.concatStringsSep "\n"
+ (
+ map (wallpaper: "wallpaper = ${wallpaper}") cfg.wallpapers
+ )
+ }
+ '';
+
+ systemd.user.services.hyprpaper = {
+ Unit = {
+ Description = "Hyprland wallpaper daemon";
+ PartOf = ["graphical-session.target"];
+ };
+
+ Service = {
+ ExecStart = "${getExe cfg.package}";
+ Restart = "on-failure";
+ };
+ Install.WantedBy = ["graphical-session.target"];
+ };
+ };
+}
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/hyprpaper-0.6.0/src/Hyprpaper.cpp new/hyprpaper-0.7.0/src/Hyprpaper.cpp
--- old/hyprpaper-0.6.0/src/Hyprpaper.cpp 2024-01-02 22:22:34.000000000 +0100
+++ new/hyprpaper-0.7.0/src/Hyprpaper.cpp 2024-04-28 23:25:36.000000000 +0200
@@ -15,11 +15,6 @@
removeOldHyprpaperImages();
- g_pConfigManager = std::make_unique<CConfigManager>();
- g_pIPCSocket = std::make_unique<CIPCSocket>();
-
- g_pConfigManager->parse();
-
m_sDisplay = (wl_display*)wl_display_connect(nullptr);
if (!m_sDisplay) {
@@ -27,25 +22,36 @@
exit(1);
}
+ // run
+ wl_registry* registry = wl_display_get_registry(m_sDisplay);
+ wl_registry_add_listener(registry, &Events::registryListener, nullptr);
+
+ wl_display_roundtrip(m_sDisplay);
+
+ while (m_vMonitors.size() < 1 || m_vMonitors[0]->name.empty()) {
+ wl_display_dispatch(m_sDisplay);
+ }
+
+ g_pConfigManager = std::make_unique<CConfigManager>();
+ g_pIPCSocket = std::make_unique<CIPCSocket>();
+
+ g_pConfigManager->parse();
+
preloadAllWallpapersFromConfig();
if (std::any_cast<Hyprlang::INT>(g_pConfigManager->config->getConfigValue("ipc")))
g_pIPCSocket->initialize();
- // run
- wl_registry* registry = wl_display_get_registry(m_sDisplay);
- wl_registry_add_listener(registry, &Events::registryListener, nullptr);
-
- while (wl_display_dispatch(m_sDisplay) != -1) {
+ do {
std::lock_guard<std::mutex> lg(m_mtTickMutex);
tick(true);
- }
+ } while (wl_display_dispatch(m_sDisplay) != -1);
unlockSingleInstance();
}
void CHyprpaper::tick(bool force) {
- bool reload = g_pIPCSocket->mainThreadParseRequest();
+ bool reload = g_pIPCSocket && g_pIPCSocket->mainThreadParseRequest();
if (!reload && !force)
return;
@@ -453,6 +459,10 @@
void CHyprpaper::renderWallpaperForMonitor(SMonitor* pMonitor) {
static auto* const PRENDERSPLASH = reinterpret_cast<Hyprlang::INT* const*>(g_pConfigManager->config->getConfigValuePtr("splash")->getDataStaticPtr());
static auto* const PSPLASHOFFSET = reinterpret_cast<Hyprlang::FLOAT* const*>(g_pConfigManager->config->getConfigValuePtr("splash_offset")->getDataStaticPtr());
+
+ if (!m_mMonitorActiveWallpaperTargets[pMonitor])
+ recheckMonitor(pMonitor);
+
const auto PWALLPAPERTARGET = m_mMonitorActiveWallpaperTargets[pMonitor];
const auto CONTAIN = m_mMonitorWallpaperRenderData[pMonitor->name].contain;
@@ -522,7 +532,11 @@
const auto FONTSIZE = (int)(DIMENSIONS.y / 76.0 / scale);
cairo_set_font_size(PCAIRO, FONTSIZE);
- cairo_set_source_rgba(PCAIRO, 1.0, 1.0, 1.0, 0.32);
+ static auto* const PSPLASHCOLOR = reinterpret_cast<Hyprlang::INT* const*>(g_pConfigManager->config->getConfigValuePtr("splash_color")->getDataStaticPtr());
+
+ Debug::log(LOG, "Splash color: %x", **PSPLASHCOLOR);
+
+ cairo_set_source_rgba(PCAIRO, ((**PSPLASHCOLOR >> 16) & 0xFF) / 255.0, ((**PSPLASHCOLOR >> 8) & 0xFF) / 255.0, (**PSPLASHCOLOR & 0xFF) / 255.0, ((**PSPLASHCOLOR >> 24) & 0xFF) / 255.0);
cairo_text_extents_t textExtents;
cairo_text_extents(PCAIRO, SPLASH.c_str(), &textExtents);
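The new splash_color handling above packs the colour into a single 0xAARRGGBB integer. A small Python sketch of the same unpacking done by the cairo_set_source_rgba call (illustrative only; the function name is ours):

```python
def argb_to_rgba(color: int) -> tuple:
    """Unpack a 0xAARRGGBB integer into normalized (r, g, b, a) floats."""
    return (
        ((color >> 16) & 0xFF) / 255.0,  # red
        ((color >> 8) & 0xFF) / 255.0,   # green
        (color & 0xFF) / 255.0,          # blue
        ((color >> 24) & 0xFF) / 255.0,  # alpha
    )

# the default splash_color added in ConfigManager.cpp:
# white at roughly 33% opacity
r, g, b, a = argb_to_rgba(0x55FFFFFF)
```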
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/hyprpaper-0.6.0/src/config/ConfigManager.cpp new/hyprpaper-0.7.0/src/config/ConfigManager.cpp
--- old/hyprpaper-0.6.0/src/config/ConfigManager.cpp 2024-01-02 22:22:34.000000000 +0100
+++ new/hyprpaper-0.7.0/src/config/ConfigManager.cpp 2024-04-28 23:25:36.000000000 +0200
@@ -40,6 +40,20 @@
g_pHyprpaper->m_mMonitorActiveWallpapers[MONITOR] = WALLPAPER;
g_pHyprpaper->m_mMonitorWallpaperRenderData[MONITOR].contain = contain;
+ if (MONITOR.empty()) {
+ for (auto& m : g_pHyprpaper->m_vMonitors) {
+ if (!m->hasATarget || m->wildcard) {
+ g_pHyprpaper->clearWallpaperFromMonitor(m->name);
+ g_pHyprpaper->m_mMonitorActiveWallpapers[m->name] = WALLPAPER;
+ g_pHyprpaper->m_mMonitorWallpaperRenderData[m->name].contain = contain;
+ }
+ }
+ } else {
+ const auto PMON = g_pHyprpaper->getMonitorFromName(MONITOR);
+ if (PMON)
+ PMON->wildcard = false;
+ }
+
return result;
}
@@ -70,17 +84,18 @@
std::vector<std::string> toUnload;
for (auto& [name, target] : g_pHyprpaper->m_mWallpaperTargets) {
-
- bool exists = false;
- for (auto& [mon, target2] : g_pHyprpaper->m_mMonitorActiveWallpaperTargets) {
- if (&target == target2) {
- exists = true;
- break;
+ if (VALUE == "unused") {
+ bool exists = false;
+ for (auto& [mon, target2] : g_pHyprpaper->m_mMonitorActiveWallpaperTargets) {
+ if (&target == target2) {
+ exists = true;
+ break;
+ }
}
- }
- if (exists)
- continue;
+ if (exists)
+ continue;
+ }
toUnload.emplace_back(name);
}
@@ -96,7 +111,7 @@
const std::string VALUE = V;
auto WALLPAPER = VALUE;
- if (VALUE == "all")
+ if (VALUE == "all" || VALUE == "unused")
return handleUnloadAll(C, V);
if (WALLPAPER[0] == '~') {
@@ -118,9 +133,10 @@
config = std::make_unique<Hyprlang::CConfig>(configPath.c_str(), Hyprlang::SConfigOptions{.allowMissingConfig = true});
- config->addConfigValue("ipc", {1L});
- config->addConfigValue("splash", {1L});
- config->addConfigValue("splash_offset", {2.F});
+ config->addConfigValue("ipc", Hyprlang::INT{1L});
+ config->addConfigValue("splash", Hyprlang::INT{0L});
+ config->addConfigValue("splash_offset", Hyprlang::FLOAT{2.F});
+ config->addConfigValue("splash_color", Hyprlang::INT{0x55ffffff});
config->registerHandler(&handleWallpaper, "wallpaper", {.allowFlags = false});
config->registerHandler(&handleUnload, "unload", {.allowFlags = false});
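The reworked handleUnloadAll above now distinguishes `unload all` from the new `unload unused`. The selection logic can be sketched in Python (the argument names are hypothetical stand-ins for the C++ target maps):

```python
def wallpapers_to_unload(loaded, active_targets, value):
    """Pick names to unload: everything for 'all', only inactive ones for 'unused'."""
    if value == "unused":
        # keep any wallpaper still referenced by a monitor's active target
        return [name for name in loaded if name not in active_targets]
    return list(loaded)  # 'all' unloads every preloaded wallpaper

# 'forest.png' is still shown on a monitor, so only 'desert.png' is unloaded
unused = wallpapers_to_unload(["forest.png", "desert.png"], {"forest.png"}, "unused")
```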
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/hyprpaper-0.6.0/src/events/Events.cpp new/hyprpaper-0.7.0/src/events/Events.cpp
--- old/hyprpaper-0.6.0/src/events/Events.cpp 2024-01-02 22:22:34.000000000 +0100
+++ new/hyprpaper-0.7.0/src/events/Events.cpp 2024-04-28 23:25:36.000000000 +0200
@@ -1,72 +1,76 @@
#include "Events.hpp"
#include "../Hyprpaper.hpp"
-void Events::geometry(void *data, wl_output *output, int32_t x, int32_t y, int32_t width_mm, int32_t height_mm, int32_t subpixel, const char *make, const char *model, int32_t transform) {
+void Events::geometry(void* data, wl_output* output, int32_t x, int32_t y, int32_t width_mm, int32_t height_mm, int32_t subpixel, const char* make, const char* model, int32_t transform) {
// ignored
}
-void Events::mode(void *data, wl_output *output, uint32_t flags, int32_t width, int32_t height, int32_t refresh) {
+void Events::mode(void* data, wl_output* output, uint32_t flags, int32_t width, int32_t height, int32_t refresh) {
const auto PMONITOR = (SMonitor*)data;
PMONITOR->size = Vector2D(width, height);
}
-void Events::done(void *data, wl_output *wl_output) {
+void Events::done(void* data, wl_output* wl_output) {
const auto PMONITOR = (SMonitor*)data;
PMONITOR->readyForLS = true;
std::lock_guard<std::mutex> lg(g_pHyprpaper->m_mtTickMutex);
- g_pHyprpaper->tick(true);
+ if (g_pConfigManager) // don't tick if this is the first roundtrip
+ g_pHyprpaper->tick(true);
}
-void Events::scale(void *data, wl_output *wl_output, int32_t scale) {
+void Events::scale(void* data, wl_output* wl_output, int32_t scale) {
const auto PMONITOR = (SMonitor*)data;
PMONITOR->scale = scale;
}
-void Events::name(void *data, wl_output *wl_output, const char *name) {
+void Events::name(void* data, wl_output* wl_output, const char* name) {
const auto PMONITOR = (SMonitor*)data;
PMONITOR->name = name;
}
-void Events::description(void *data, wl_output *wl_output, const char *description) {
+void Events::description(void* data, wl_output* wl_output, const char* description) {
const auto PMONITOR = (SMonitor*)data;
+    // remove comma character from description. This allows monitor-specific rules to work on monitors with a comma in their description
+ std::string m_description = description;
+ std::erase(m_description, ',');
- PMONITOR->description = description;
+ PMONITOR->description = m_description;
}
-void Events::handleCapabilities(void *data, wl_seat *wl_seat, uint32_t capabilities) {
+void Events::handleCapabilities(void* data, wl_seat* wl_seat, uint32_t capabilities) {
if (capabilities & WL_SEAT_CAPABILITY_POINTER) {
wl_pointer_add_listener(wl_seat_get_pointer(wl_seat), &pointerListener, wl_seat);
}
}
-void Events::handlePointerLeave(void *data, struct wl_pointer *wl_pointer, uint32_t serial, struct wl_surface *surface) {
+void Events::handlePointerLeave(void* data, struct wl_pointer* wl_pointer, uint32_t serial, struct wl_surface* surface) {
// ignored
wl_surface_commit(surface);
g_pHyprpaper->m_pLastMonitor = nullptr;
}
-void Events::handlePointerAxis(void *data, wl_pointer *wl_pointer, uint32_t time, uint32_t axis, wl_fixed_t value) {
+void Events::handlePointerAxis(void* data, wl_pointer* wl_pointer, uint32_t time, uint32_t axis, wl_fixed_t value) {
// ignored
}
-void Events::handlePointerMotion(void *data, struct wl_pointer *wl_pointer, uint32_t time, wl_fixed_t surface_x, wl_fixed_t surface_y) {
+void Events::handlePointerMotion(void* data, struct wl_pointer* wl_pointer, uint32_t time, wl_fixed_t surface_x, wl_fixed_t surface_y) {
// ignored
if (g_pHyprpaper->m_pLastMonitor) {
wl_surface_commit(g_pHyprpaper->m_pLastMonitor->pCurrentLayerSurface->pSurface);
}
}
-void Events::handlePointerButton(void *data, struct wl_pointer *wl_pointer, uint32_t serial, uint32_t time, uint32_t button, uint32_t button_state) {
+void Events::handlePointerButton(void* data, struct wl_pointer* wl_pointer, uint32_t serial, uint32_t time, uint32_t button, uint32_t button_state) {
// ignored
}
-void Events::handlePointerEnter(void *data, struct wl_pointer *wl_pointer, uint32_t serial, struct wl_surface *surface, wl_fixed_t surface_x, wl_fixed_t surface_y) {
+void Events::handlePointerEnter(void* data, struct wl_pointer* wl_pointer, uint32_t serial, struct wl_surface* surface, wl_fixed_t surface_x, wl_fixed_t surface_y) {
for (auto& mon : g_pHyprpaper->m_vMonitors) {
if (mon->pCurrentLayerSurface->pSurface == surface) {
g_pHyprpaper->m_pLastMonitor = mon.get();
@@ -79,7 +83,7 @@
}
}
-void Events::ls_configure(void *data, zwlr_layer_surface_v1 *surface, uint32_t serial, uint32_t width, uint32_t height) {
+void Events::ls_configure(void* data, zwlr_layer_surface_v1* surface, uint32_t serial, uint32_t width, uint32_t height) {
const auto PLAYERSURFACE = (CLayerSurface*)data;
PLAYERSURFACE->m_pMonitor->size = Vector2D(width, height);
@@ -91,7 +95,7 @@
Debug::log(LOG, "configure for %s", PLAYERSURFACE->m_pMonitor->name.c_str());
}
-void Events::handleLSClosed(void *data, zwlr_layer_surface_v1 *zwlr_layer_surface_v1) {
+void Events::handleLSClosed(void* data, zwlr_layer_surface_v1* zwlr_layer_surface_v1) {
const auto PLAYERSURFACE = (CLayerSurface*)data;
for (auto& m : g_pHyprpaper->m_vMonitors) {
@@ -107,18 +111,18 @@
}
}
-void Events::handleGlobal(void *data, struct wl_registry *registry, uint32_t name, const char *interface, uint32_t version) {
+void Events::handleGlobal(void* data, struct wl_registry* registry, uint32_t name, const char* interface, uint32_t version) {
if (strcmp(interface, wl_compositor_interface.name) == 0) {
- g_pHyprpaper->m_sCompositor = (wl_compositor *)wl_registry_bind(registry, name, &wl_compositor_interface, 4);
+ g_pHyprpaper->m_sCompositor = (wl_compositor*)wl_registry_bind(registry, name, &wl_compositor_interface, 4);
} else if (strcmp(interface, wl_shm_interface.name) == 0) {
- g_pHyprpaper->m_sSHM = (wl_shm *)wl_registry_bind(registry, name, &wl_shm_interface, 1);
+ g_pHyprpaper->m_sSHM = (wl_shm*)wl_registry_bind(registry, name, &wl_shm_interface, 1);
} else if (strcmp(interface, wl_output_interface.name) == 0) {
g_pHyprpaper->m_mtTickMutex.lock();
const auto PMONITOR = g_pHyprpaper->m_vMonitors.emplace_back(std::make_unique<SMonitor>()).get();
PMONITOR->wayland_name = name;
PMONITOR->name = "";
- PMONITOR->output = (wl_output *)wl_registry_bind(registry, name, &wl_output_interface, 4);
+ PMONITOR->output = (wl_output*)wl_registry_bind(registry, name, &wl_output_interface, 4);
wl_output_add_listener(PMONITOR->output, &Events::outputListener, PMONITOR);
g_pHyprpaper->m_mtTickMutex.unlock();
@@ -133,7 +137,7 @@
}
}
-void Events::handleGlobalRemove(void *data, struct wl_registry *registry, uint32_t name) {
+void Events::handleGlobalRemove(void* data, struct wl_registry* registry, uint32_t name) {
for (auto& m : g_pHyprpaper->m_vMonitors) {
if (m->wayland_name == name) {
Debug::log(LOG, "Destroying output %s", m->name.c_str());
@@ -144,10 +148,10 @@
}
}
-void Events::handlePreferredScale(void *data, wp_fractional_scale_v1* fractionalScaleInfo, uint32_t scale) {
+void Events::handlePreferredScale(void* data, wp_fractional_scale_v1* fractionalScaleInfo, uint32_t scale) {
const double SCALE = scale / 120.0;
- CLayerSurface *const pLS = (CLayerSurface*)data;
+ CLayerSurface* const pLS = (CLayerSurface*)data;
Debug::log(LOG, "handlePreferredScale: %.2lf for %lx", SCALE, pLS);
@@ -157,4 +161,3 @@
g_pHyprpaper->tick(true);
}
}
-
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/hyprpaper-0.6.0/src/helpers/Monitor.hpp new/hyprpaper-0.7.0/src/helpers/Monitor.hpp
--- old/hyprpaper-0.6.0/src/helpers/Monitor.hpp 2024-01-02 22:22:34.000000000 +0100
+++ new/hyprpaper-0.7.0/src/helpers/Monitor.hpp 2024-04-28 23:25:36.000000000 +0200
@@ -1,8 +1,8 @@
#pragma once
#include "../defines.hpp"
-#include "PoolBuffer.hpp"
#include "../render/LayerSurface.hpp"
+#include "PoolBuffer.hpp"
struct SMonitor {
std::string name = "";
@@ -15,6 +15,8 @@
bool readyForLS = false;
bool hasATarget = true;
+ bool wildcard = true;
+
uint32_t configureSerial = 0;
SPoolBuffer buffer;
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/hyprpaper-0.6.0/src/ipc/Socket.cpp new/hyprpaper-0.7.0/src/ipc/Socket.cpp
--- old/hyprpaper-0.6.0/src/ipc/Socket.cpp 2024-01-02 22:22:34.000000000 +0100
+++ new/hyprpaper-0.7.0/src/ipc/Socket.cpp 2024-04-28 23:25:36.000000000 +0200
@@ -11,6 +11,7 @@
#include <sys/types.h>
#include <sys/un.h>
#include <unistd.h>
+#include <pwd.h>
void CIPCSocket::initialize() {
std::thread([&]() {
@@ -24,12 +25,14 @@
sockaddr_un SERVERADDRESS = {.sun_family = AF_UNIX};
const auto HISenv = getenv("HYPRLAND_INSTANCE_SIGNATURE");
+ const std::string USERID = std::to_string(getpwuid(getuid())->pw_uid);
- std::string socketPath = HISenv ? "/tmp/hypr/" + std::string(HISenv) + "/.hyprpaper.sock" : "/tmp/hypr/.hyprpaper.sock";
+ const auto USERDIR = "/run/user/" + USERID + "/hypr/";
- if (!HISenv) {
- mkdir("/tmp/hypr", S_IRWXU | S_IRWXG);
- }
+ std::string socketPath = HISenv ? USERDIR + std::string(HISenv) + "/.hyprpaper.sock" : USERDIR + ".hyprpaper.sock";
+
+ if (!HISenv)
+ mkdir(USERDIR.c_str(), S_IRWXU);
unlink(socketPath.c_str());
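The hunk above moves the IPC socket from world-shared `/tmp/hypr` to the per-user runtime directory `/run/user/<uid>/hypr/`, which is why `<pwd.h>` is now included. A sketch of the resulting path logic under that assumption (function name is illustrative):

```cpp
#include <pwd.h>
#include <unistd.h>
#include <string>

// Build the hyprpaper socket path in the caller's per-user runtime dir.
// `his` is the HYPRLAND_INSTANCE_SIGNATURE value, or nullptr outside Hyprland.
std::string hyprpaperSocketPath(const char* his) {
    const std::string userdir =
        "/run/user/" + std::to_string(getpwuid(getuid())->pw_uid) + "/hypr/";
    return his ? userdir + his + "/.hyprpaper.sock"
               : userdir + ".hyprpaper.sock";
}
```

Note the directory is created with `S_IRWXU` only, so other users cannot reach the socket.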
@@ -90,34 +93,75 @@
std::string copy = m_szRequest;
- // now we can work on the copy
-
if (copy == "")
return false;
+ // now we can work on the copy
+
Debug::log(LOG, "Received a request: %s", copy.c_str());
- // parse
+ // set default reply
+ m_szReply = "ok";
+ m_bReplyReady = true;
+ m_bRequestReady = false;
+
+ // config commands
if (copy.find("wallpaper") == 0 || copy.find("preload") == 0 || copy.find("unload") == 0) {
const auto RESULT = g_pConfigManager->config->parseDynamic(copy.substr(0, copy.find_first_of(' ')).c_str(), copy.substr(copy.find_first_of(' ') + 1).c_str());
if (RESULT.error) {
m_szReply = RESULT.getError();
- m_bReplyReady = true;
- m_bRequestReady = false;
return false;
}
- } else {
- m_szReply = "invalid command";
- m_bReplyReady = true;
- m_bRequestReady = false;
- return false;
+
+ return true;
}
- m_szReply = "ok";
- m_bReplyReady = true;
- m_bRequestReady = false;
+ if (copy.find("listloaded") == 0) {
+
+ const auto numWallpapersLoaded = g_pHyprpaper->m_mWallpaperTargets.size();
+ Debug::log(LOG, "numWallpapersLoaded: %d", numWallpapersLoaded);
+
+ if (numWallpapersLoaded == 0) {
+ m_szReply = "no wallpapers loaded";
+ return false;
+ }
+
+ m_szReply = "";
+ long unsigned int i = 0;
+ for (auto& [name, target] : g_pHyprpaper->m_mWallpaperTargets) {
+ m_szReply += name;
+ i++;
+ if (i < numWallpapersLoaded)
+ m_szReply += '\n'; // don't add a newline after the last entry
+ }
+
+ return true;
+ }
+
+ if (copy.find("listactive") == 0) {
+
+ const auto numWallpapersActive = g_pHyprpaper->m_mMonitorActiveWallpapers.size();
+ Debug::log(LOG, "numWallpapersActive: %d", numWallpapersActive);
+
+ if (numWallpapersActive == 0) {
+ m_szReply = "no wallpapers active";
+ return false;
+ }
+
+ m_szReply = "";
+ long unsigned int i = 0;
+ for (auto& [mon, path1] : g_pHyprpaper->m_mMonitorActiveWallpapers) {
+ m_szReply += mon + " = " + path1;
+ i++;
+ if (i < numWallpapersActive)
+ m_szReply += '\n'; // don't add a newline after the last entry
+ }
+
+ return true;
+ }
- return true;
+ m_szReply = "invalid command";
+ return false;
}
Script 'mail_helper' called by obssrc
Hello community,
here is the log from the commit of package python-pydantic for openSUSE:Factory checked in at 2024-06-07 15:02:31
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/python-pydantic (Old)
and /work/SRC/openSUSE:Factory/.python-pydantic.new.24587 (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Package is "python-pydantic"
Fri Jun 7 15:02:31 2024 rev:27 rq:1179027 version:2.7.3
Changes:
--------
--- /work/SRC/openSUSE:Factory/python-pydantic/python-pydantic.changes 2024-06-04 12:50:41.586872917 +0200
+++ /work/SRC/openSUSE:Factory/.python-pydantic.new.24587/python-pydantic.changes 2024-06-07 15:02:47.714523522 +0200
@@ -1,0 +2,13 @@
+Thu Jun 6 14:14:46 UTC 2024 - Dirk Müller <dmueller(a)suse.com>
+
+- update to 2.7.3:
+ * Bump `pydantic-core` to `v2.18.4`
+ * Fix u-style unicode strings in Python by @samuelcolvin in
+ pydantic/jiter#110
+ * Replace `__spec__.parent` with `__package__`
+ * Fix validation of `int`s with leading unary minus
+ * Fix `str` subclass validation for enums
+ * Support `BigInt`s in `Literal`s and `Enum`s
+ * Fix: uuid - allow `str` subclass as input
+
+-------------------------------------------------------------------
Old:
----
pydantic-2.7.1.tar.gz
New:
----
pydantic-2.7.3.tar.gz
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Other differences:
------------------
++++++ python-pydantic.spec ++++++
--- /var/tmp/diff_new_pack.CchWR2/_old 2024-06-07 15:02:48.498552083 +0200
+++ /var/tmp/diff_new_pack.CchWR2/_new 2024-06-07 15:02:48.498552083 +0200
@@ -27,7 +27,7 @@
%endif
%{?sle15_python_module_pythons}
Name: python-pydantic%{psuffix}
-Version: 2.7.1
+Version: 2.7.3
Release: 0
Summary: Data validation and settings management using python type hinting
License: MIT
@@ -58,7 +58,7 @@
%if 0%{?python_version_nodots} < 310
Requires: python-eval-type-backport
%endif
-Requires: python-pydantic-core == 2.18.2
+Requires: python-pydantic-core == 2.18.4
Requires: python-typing_extensions >= 4.6.1
Suggests: python-email-validator >= 2.0
BuildArch: noarch
++++++ pydantic-2.7.1.tar.gz -> pydantic-2.7.3.tar.gz ++++++
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/pydantic-2.7.1/HISTORY.md new/pydantic-2.7.3/HISTORY.md
--- old/pydantic-2.7.1/HISTORY.md 2024-04-23 14:59:19.000000000 +0200
+++ new/pydantic-2.7.3/HISTORY.md 2024-06-03 20:32:51.000000000 +0200
@@ -1,3 +1,35 @@
+## v2.7.3 (2024-06-03)
+
+[GitHub release](https://github.com/pydantic/pydantic/releases/tag/v2.7.3)
+
+### What's Changed
+
+#### Packaging
+
+* Bump `pydantic-core` to `v2.18.4` by @sydney-runkle in [#9550](https://github.com/pydantic/pydantic/pull/9550)
+
+#### Fixes
+
+* Fix u-style unicode strings in Python by @samuelcolvin in [pydantic/jiter#110](https://github.com/pydantic/jiter/pull/110)
+
+## v2.7.2 (2024-05-28)
+
+[GitHub release](https://github.com/pydantic/pydantic/releases/tag/v2.7.2)
+
+### What's Changed
+
+#### Packaging
+
+* Bump `pydantic-core` to `v2.18.3` by @sydney-runkle in [#9515](https://github.com/pydantic/pydantic/pull/9515)
+
+#### Fixes
+
+* Replace `__spec__.parent` with `__package__` by @hramezani in [#9331](https://github.com/pydantic/pydantic/pull/9331)
+* Fix validation of `int`s with leading unary minus by @RajatRajdeep in [pydantic/pydantic-core#1291](https://github.com/pydantic/pydantic-core/pull…
+* Fix `str` subclass validation for enums by @sydney-runkle in [pydantic/pydantic-core#1273](https://github.com/pydantic/pydantic-core/pull…
+* Support `BigInt`s in `Literal`s and `Enum`s by @samuelcolvin in [pydantic/pydantic-core#1297](https://github.com/pydantic/pydantic-core/pull…
+* Fix: uuid - allow `str` subclass as input by @davidhewitt in [pydantic/pydantic-core#1296](https://github.com/pydantic/pydantic-core/pull…
+
## v2.7.1 (2024-04-23)
[GitHub release](https://github.com/pydantic/pydantic/releases/tag/v2.7.1)
@@ -6,7 +38,7 @@
#### Packaging
-* Bump `pydantic-core` to `v2.18.2` by @sydney-runkle in [#9307](https://github.com/pydantic/pydantic/pull/9307)
+* Bump `pydantic-core` to `v2.18.3` by @sydney-runkle in [#9307](https://github.com/pydantic/pydantic/pull/9307)
#### New Features
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/pydantic-2.7.1/docs/theme/announce.html new/pydantic-2.7.3/docs/theme/announce.html
--- old/pydantic-2.7.1/docs/theme/announce.html 2024-04-23 14:59:19.000000000 +0200
+++ new/pydantic-2.7.3/docs/theme/announce.html 2024-06-03 20:32:51.000000000 +0200
@@ -1,3 +1,5 @@
<!-- the following line is displayed in the announcement bar -->
<!-- keep length under 164 characters (less HTML tags) to fit on 1280px desktop window -->
-<a href="https://docs.pydantic.dev/2.0/blog/pydantic-v2-final/">Pydantic V2</a> is here 🚀! Upgrading an existing app? See the <a href="https://docs.pydantic.dev/2.0/migration/">Migration Guide</a> for tips on essential changes from Pydantic V1!
+<b>We're live!</b> <a href="https://pydantic.dev/logfire">Pydantic Logfire</a> is out in open beta! 🎉<br>
+Logfire is a new observability tool for Python, from the creators of Pydantic, with great Pydantic support.
+Please try it, and tell us <a href="https://docs.pydantic.dev/logfire/help/">what you think</a>!
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/pydantic-2.7.1/pdm.lock new/pydantic-2.7.3/pdm.lock
--- old/pydantic-2.7.1/pdm.lock 2024-04-23 14:59:19.000000000 +0200
+++ new/pydantic-2.7.3/pdm.lock 2024-06-03 20:32:51.000000000 +0200
@@ -5,7 +5,7 @@
groups = ["default", "docs", "email", "linting", "memray", "mypy", "testing", "testing-extra"]
strategy = ["cross_platform"]
lock_version = "4.4.1"
-content_hash = "sha256:1f3f711d35fa13627c92e663011e3ca007476b79f4bc440570cccbf517fd3521"
+content_hash = "sha256:cf00208ac8f950e2c39119f826124fb055cb78d10f97ad39777a2121aa18b3d4"
[[package]]
name = "annotated-types"
@@ -1090,92 +1090,92 @@
[[package]]
name = "pydantic-core"
-version = "2.18.2"
+version = "2.18.4"
requires_python = ">=3.8"
summary = "Core functionality for Pydantic validation and serialization"
dependencies = [
"typing-extensions!=4.7.0,>=4.6.0",
]
files = [
- {file = "pydantic_core-2.18.2-cp310-cp310-macosx_10_12_x86_64.whl", hash = "sha256:9e08e867b306f525802df7cd16c44ff5ebbe747ff0ca6cf3fde7f36c05a59a81"},
- {file = "pydantic_core-2.18.2-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:f0a21cbaa69900cbe1a2e7cad2aa74ac3cf21b10c3efb0fa0b80305274c0e8a2"},
- {file = "pydantic_core-2.18.2-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0680b1f1f11fda801397de52c36ce38ef1c1dc841a0927a94f226dea29c3ae3d"},
- {file = "pydantic_core-2.18.2-cp310-cp310-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:95b9d5e72481d3780ba3442eac863eae92ae43a5f3adb5b4d0a1de89d42bb250"},
- {file = "pydantic_core-2.18.2-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:c4fcf5cd9c4b655ad666ca332b9a081112cd7a58a8b5a6ca7a3104bc950f2038"},
- {file = "pydantic_core-2.18.2-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:9b5155ff768083cb1d62f3e143b49a8a3432e6789a3abee8acd005c3c7af1c74"},
- {file = "pydantic_core-2.18.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:553ef617b6836fc7e4df130bb851e32fe357ce36336d897fd6646d6058d980af"},
- {file = "pydantic_core-2.18.2-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:b89ed9eb7d616ef5714e5590e6cf7f23b02d0d539767d33561e3675d6f9e3857"},
- {file = "pydantic_core-2.18.2-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:75f7e9488238e920ab6204399ded280dc4c307d034f3924cd7f90a38b1829563"},
- {file = "pydantic_core-2.18.2-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:ef26c9e94a8c04a1b2924149a9cb081836913818e55681722d7f29af88fe7b38"},
- {file = "pydantic_core-2.18.2-cp310-none-win32.whl", hash = "sha256:182245ff6b0039e82b6bb585ed55a64d7c81c560715d1bad0cbad6dfa07b4027"},
- {file = "pydantic_core-2.18.2-cp310-none-win_amd64.whl", hash = "sha256:e23ec367a948b6d812301afc1b13f8094ab7b2c280af66ef450efc357d2ae543"},
- {file = "pydantic_core-2.18.2-cp311-cp311-macosx_10_12_x86_64.whl", hash = "sha256:219da3f096d50a157f33645a1cf31c0ad1fe829a92181dd1311022f986e5fbe3"},
- {file = "pydantic_core-2.18.2-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:cc1cfd88a64e012b74e94cd00bbe0f9c6df57049c97f02bb07d39e9c852e19a4"},
- {file = "pydantic_core-2.18.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:05b7133a6e6aeb8df37d6f413f7705a37ab4031597f64ab56384c94d98fa0e90"},
- {file = "pydantic_core-2.18.2-cp311-cp311-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:224c421235f6102e8737032483f43c1a8cfb1d2f45740c44166219599358c2cd"},
- {file = "pydantic_core-2.18.2-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:b14d82cdb934e99dda6d9d60dc84a24379820176cc4a0d123f88df319ae9c150"},
- {file = "pydantic_core-2.18.2-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:2728b01246a3bba6de144f9e3115b532ee44bd6cf39795194fb75491824a1413"},
- {file = "pydantic_core-2.18.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:470b94480bb5ee929f5acba6995251ada5e059a5ef3e0dfc63cca287283ebfa6"},
- {file = "pydantic_core-2.18.2-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:997abc4df705d1295a42f95b4eec4950a37ad8ae46d913caeee117b6b198811c"},
- {file = "pydantic_core-2.18.2-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:75250dbc5290e3f1a0f4618db35e51a165186f9034eff158f3d490b3fed9f8a0"},
- {file = "pydantic_core-2.18.2-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:4456f2dca97c425231d7315737d45239b2b51a50dc2b6f0c2bb181fce6207664"},
- {file = "pydantic_core-2.18.2-cp311-none-win32.whl", hash = "sha256:269322dcc3d8bdb69f054681edff86276b2ff972447863cf34c8b860f5188e2e"},
- {file = "pydantic_core-2.18.2-cp311-none-win_amd64.whl", hash = "sha256:800d60565aec896f25bc3cfa56d2277d52d5182af08162f7954f938c06dc4ee3"},
- {file = "pydantic_core-2.18.2-cp311-none-win_arm64.whl", hash = "sha256:1404c69d6a676245199767ba4f633cce5f4ad4181f9d0ccb0577e1f66cf4c46d"},
- {file = "pydantic_core-2.18.2-cp312-cp312-macosx_10_12_x86_64.whl", hash = "sha256:fb2bd7be70c0fe4dfd32c951bc813d9fe6ebcbfdd15a07527796c8204bd36242"},
- {file = "pydantic_core-2.18.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:6132dd3bd52838acddca05a72aafb6eab6536aa145e923bb50f45e78b7251043"},
- {file = "pydantic_core-2.18.2-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:d7d904828195733c183d20a54230c0df0eb46ec746ea1a666730787353e87182"},
- {file = "pydantic_core-2.18.2-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:c9bd70772c720142be1020eac55f8143a34ec9f82d75a8e7a07852023e46617f"},
- {file = "pydantic_core-2.18.2-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:2b8ed04b3582771764538f7ee7001b02e1170223cf9b75dff0bc698fadb00cf3"},
- {file = "pydantic_core-2.18.2-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:e6dac87ddb34aaec85f873d737e9d06a3555a1cc1a8e0c44b7f8d5daeb89d86f"},
- {file = "pydantic_core-2.18.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7ca4ae5a27ad7a4ee5170aebce1574b375de390bc01284f87b18d43a3984df72"},
- {file = "pydantic_core-2.18.2-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:886eec03591b7cf058467a70a87733b35f44707bd86cf64a615584fd72488b7c"},
- {file = "pydantic_core-2.18.2-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:ca7b0c1f1c983e064caa85f3792dd2fe3526b3505378874afa84baf662e12241"},
- {file = "pydantic_core-2.18.2-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:4b4356d3538c3649337df4074e81b85f0616b79731fe22dd11b99499b2ebbdf3"},
- {file = "pydantic_core-2.18.2-cp312-none-win32.whl", hash = "sha256:8b172601454f2d7701121bbec3425dd71efcb787a027edf49724c9cefc14c038"},
- {file = "pydantic_core-2.18.2-cp312-none-win_amd64.whl", hash = "sha256:b1bd7e47b1558ea872bd16c8502c414f9e90dcf12f1395129d7bb42a09a95438"},
- {file = "pydantic_core-2.18.2-cp312-none-win_arm64.whl", hash = "sha256:98758d627ff397e752bc339272c14c98199c613f922d4a384ddc07526c86a2ec"},
- {file = "pydantic_core-2.18.2-cp38-cp38-macosx_10_12_x86_64.whl", hash = "sha256:9fdad8e35f278b2c3eb77cbdc5c0a49dada440657bf738d6905ce106dc1de439"},
- {file = "pydantic_core-2.18.2-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:1d90c3265ae107f91a4f279f4d6f6f1d4907ac76c6868b27dc7fb33688cfb347"},
- {file = "pydantic_core-2.18.2-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:390193c770399861d8df9670fb0d1874f330c79caaca4642332df7c682bf6b91"},
- {file = "pydantic_core-2.18.2-cp38-cp38-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:82d5d4d78e4448683cb467897fe24e2b74bb7b973a541ea1dcfec1d3cbce39fb"},
- {file = "pydantic_core-2.18.2-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:4774f3184d2ef3e14e8693194f661dea5a4d6ca4e3dc8e39786d33a94865cefd"},
- {file = "pydantic_core-2.18.2-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:d4d938ec0adf5167cb335acb25a4ee69a8107e4984f8fbd2e897021d9e4ca21b"},
- {file = "pydantic_core-2.18.2-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:e0e8b1be28239fc64a88a8189d1df7fad8be8c1ae47fcc33e43d4be15f99cc70"},
- {file = "pydantic_core-2.18.2-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:868649da93e5a3d5eacc2b5b3b9235c98ccdbfd443832f31e075f54419e1b96b"},
- {file = "pydantic_core-2.18.2-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:78363590ef93d5d226ba21a90a03ea89a20738ee5b7da83d771d283fd8a56761"},
- {file = "pydantic_core-2.18.2-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:852e966fbd035a6468fc0a3496589b45e2208ec7ca95c26470a54daed82a0788"},
- {file = "pydantic_core-2.18.2-cp38-none-win32.whl", hash = "sha256:6a46e22a707e7ad4484ac9ee9f290f9d501df45954184e23fc29408dfad61350"},
- {file = "pydantic_core-2.18.2-cp38-none-win_amd64.whl", hash = "sha256:d91cb5ea8b11607cc757675051f61b3d93f15eca3cefb3e6c704a5d6e8440f4e"},
- {file = "pydantic_core-2.18.2-cp39-cp39-macosx_10_12_x86_64.whl", hash = "sha256:ae0a8a797a5e56c053610fa7be147993fe50960fa43609ff2a9552b0e07013e8"},
- {file = "pydantic_core-2.18.2-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:042473b6280246b1dbf530559246f6842b56119c2926d1e52b631bdc46075f2a"},
- {file = "pydantic_core-2.18.2-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1a388a77e629b9ec814c1b1e6b3b595fe521d2cdc625fcca26fbc2d44c816804"},
- {file = "pydantic_core-2.18.2-cp39-cp39-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:e25add29b8f3b233ae90ccef2d902d0ae0432eb0d45370fe315d1a5cf231004b"},
- {file = "pydantic_core-2.18.2-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:f459a5ce8434614dfd39bbebf1041952ae01da6bed9855008cb33b875cb024c0"},
- {file = "pydantic_core-2.18.2-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:eff2de745698eb46eeb51193a9f41d67d834d50e424aef27df2fcdee1b153845"},
- {file = "pydantic_core-2.18.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:a8309f67285bdfe65c372ea3722b7a5642680f3dba538566340a9d36e920b5f0"},
- {file = "pydantic_core-2.18.2-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:f93a8a2e3938ff656a7c1bc57193b1319960ac015b6e87d76c76bf14fe0244b4"},
- {file = "pydantic_core-2.18.2-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:22057013c8c1e272eb8d0eebc796701167d8377441ec894a8fed1af64a0bf399"},
- {file = "pydantic_core-2.18.2-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:cfeecd1ac6cc1fb2692c3d5110781c965aabd4ec5d32799773ca7b1456ac636b"},
- {file = "pydantic_core-2.18.2-cp39-none-win32.whl", hash = "sha256:0d69b4c2f6bb3e130dba60d34c0845ba31b69babdd3f78f7c0c8fae5021a253e"},
- {file = "pydantic_core-2.18.2-cp39-none-win_amd64.whl", hash = "sha256:d9319e499827271b09b4e411905b24a426b8fb69464dfa1696258f53a3334641"},
- {file = "pydantic_core-2.18.2-pp310-pypy310_pp73-macosx_10_12_x86_64.whl", hash = "sha256:a1874c6dd4113308bd0eb568418e6114b252afe44319ead2b4081e9b9521fe75"},
- {file = "pydantic_core-2.18.2-pp310-pypy310_pp73-macosx_11_0_arm64.whl", hash = "sha256:ccdd111c03bfd3666bd2472b674c6899550e09e9f298954cfc896ab92b5b0e6d"},
- {file = "pydantic_core-2.18.2-pp310-pypy310_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:e18609ceaa6eed63753037fc06ebb16041d17d28199ae5aba0052c51449650a9"},
- {file = "pydantic_core-2.18.2-pp310-pypy310_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6e5c584d357c4e2baf0ff7baf44f4994be121e16a2c88918a5817331fc7599d7"},
- {file = "pydantic_core-2.18.2-pp310-pypy310_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:43f0f463cf89ace478de71a318b1b4f05ebc456a9b9300d027b4b57c1a2064fb"},
- {file = "pydantic_core-2.18.2-pp310-pypy310_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:e1b395e58b10b73b07b7cf740d728dd4ff9365ac46c18751bf8b3d8cca8f625a"},
- {file = "pydantic_core-2.18.2-pp310-pypy310_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:0098300eebb1c837271d3d1a2cd2911e7c11b396eac9661655ee524a7f10587b"},
- {file = "pydantic_core-2.18.2-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:36789b70d613fbac0a25bb07ab3d9dba4d2e38af609c020cf4d888d165ee0bf3"},
- {file = "pydantic_core-2.18.2-pp39-pypy39_pp73-macosx_10_12_x86_64.whl", hash = "sha256:3f9a801e7c8f1ef8718da265bba008fa121243dfe37c1cea17840b0944dfd72c"},
- {file = "pydantic_core-2.18.2-pp39-pypy39_pp73-macosx_11_0_arm64.whl", hash = "sha256:3a6515ebc6e69d85502b4951d89131ca4e036078ea35533bb76327f8424531ce"},
- {file = "pydantic_core-2.18.2-pp39-pypy39_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:20aca1e2298c56ececfd8ed159ae4dde2df0781988c97ef77d5c16ff4bd5b400"},
- {file = "pydantic_core-2.18.2-pp39-pypy39_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:223ee893d77a310a0391dca6df00f70bbc2f36a71a895cecd9a0e762dc37b349"},
- {file = "pydantic_core-2.18.2-pp39-pypy39_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:2334ce8c673ee93a1d6a65bd90327588387ba073c17e61bf19b4fd97d688d63c"},
- {file = "pydantic_core-2.18.2-pp39-pypy39_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:cbca948f2d14b09d20268cda7b0367723d79063f26c4ffc523af9042cad95592"},
- {file = "pydantic_core-2.18.2-pp39-pypy39_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:b3ef08e20ec49e02d5c6717a91bb5af9b20f1805583cb0adfe9ba2c6b505b5ae"},
- {file = "pydantic_core-2.18.2-pp39-pypy39_pp73-win_amd64.whl", hash = "sha256:c6fdc8627910eed0c01aed6a390a252fe3ea6d472ee70fdde56273f198938374"},
- {file = "pydantic_core-2.18.2.tar.gz", hash = "sha256:2e29d20810dfc3043ee13ac7d9e25105799817683348823f305ab3f349b9386e"},
+ {file = "pydantic_core-2.18.4-cp310-cp310-macosx_10_12_x86_64.whl", hash = "sha256:f76d0ad001edd426b92233d45c746fd08f467d56100fd8f30e9ace4b005266e4"},
+ {file = "pydantic_core-2.18.4-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:59ff3e89f4eaf14050c8022011862df275b552caef8082e37b542b066ce1ff26"},
+ {file = "pydantic_core-2.18.4-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a55b5b16c839df1070bc113c1f7f94a0af4433fcfa1b41799ce7606e5c79ce0a"},
+ {file = "pydantic_core-2.18.4-cp310-cp310-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:4d0dcc59664fcb8974b356fe0a18a672d6d7cf9f54746c05f43275fc48636851"},
+ {file = "pydantic_core-2.18.4-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:8951eee36c57cd128f779e641e21eb40bc5073eb28b2d23f33eb0ef14ffb3f5d"},
+ {file = "pydantic_core-2.18.4-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:4701b19f7e3a06ea655513f7938de6f108123bf7c86bbebb1196eb9bd35cf724"},
+ {file = "pydantic_core-2.18.4-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:e00a3f196329e08e43d99b79b286d60ce46bed10f2280d25a1718399457e06be"},
+ {file = "pydantic_core-2.18.4-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:97736815b9cc893b2b7f663628e63f436018b75f44854c8027040e05230eeddb"},
+ {file = "pydantic_core-2.18.4-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:6891a2ae0e8692679c07728819b6e2b822fb30ca7445f67bbf6509b25a96332c"},
+ {file = "pydantic_core-2.18.4-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:bc4ff9805858bd54d1a20efff925ccd89c9d2e7cf4986144b30802bf78091c3e"},
+ {file = "pydantic_core-2.18.4-cp310-none-win32.whl", hash = "sha256:1b4de2e51bbcb61fdebd0ab86ef28062704f62c82bbf4addc4e37fa4b00b7cbc"},
+ {file = "pydantic_core-2.18.4-cp310-none-win_amd64.whl", hash = "sha256:6a750aec7bf431517a9fd78cb93c97b9b0c496090fee84a47a0d23668976b4b0"},
+ {file = "pydantic_core-2.18.4-cp311-cp311-macosx_10_12_x86_64.whl", hash = "sha256:942ba11e7dfb66dc70f9ae66b33452f51ac7bb90676da39a7345e99ffb55402d"},
+ {file = "pydantic_core-2.18.4-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:b2ebef0e0b4454320274f5e83a41844c63438fdc874ea40a8b5b4ecb7693f1c4"},
+ {file = "pydantic_core-2.18.4-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a642295cd0c8df1b86fc3dced1d067874c353a188dc8e0f744626d49e9aa51c4"},
+ {file = "pydantic_core-2.18.4-cp311-cp311-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:5f09baa656c904807e832cf9cce799c6460c450c4ad80803517032da0cd062e2"},
+ {file = "pydantic_core-2.18.4-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:98906207f29bc2c459ff64fa007afd10a8c8ac080f7e4d5beff4c97086a3dabd"},
+ {file = "pydantic_core-2.18.4-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:19894b95aacfa98e7cb093cd7881a0c76f55731efad31073db4521e2b6ff5b7d"},
+ {file = "pydantic_core-2.18.4-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:0fbbdc827fe5e42e4d196c746b890b3d72876bdbf160b0eafe9f0334525119c8"},
+ {file = "pydantic_core-2.18.4-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:f85d05aa0918283cf29a30b547b4df2fbb56b45b135f9e35b6807cb28bc47951"},
+ {file = "pydantic_core-2.18.4-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:e85637bc8fe81ddb73fda9e56bab24560bdddfa98aa64f87aaa4e4b6730c23d2"},
+ {file = "pydantic_core-2.18.4-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:2f5966897e5461f818e136b8451d0551a2e77259eb0f73a837027b47dc95dab9"},
+ {file = "pydantic_core-2.18.4-cp311-none-win32.whl", hash = "sha256:44c7486a4228413c317952e9d89598bcdfb06399735e49e0f8df643e1ccd0558"},
+ {file = "pydantic_core-2.18.4-cp311-none-win_amd64.whl", hash = "sha256:8a7164fe2005d03c64fd3b85649891cd4953a8de53107940bf272500ba8a788b"},
+ {file = "pydantic_core-2.18.4-cp311-none-win_arm64.whl", hash = "sha256:4e99bc050fe65c450344421017f98298a97cefc18c53bb2f7b3531eb39bc7805"},
+ {file = "pydantic_core-2.18.4-cp312-cp312-macosx_10_12_x86_64.whl", hash = "sha256:6f5c4d41b2771c730ea1c34e458e781b18cc668d194958e0112455fff4e402b2"},
+ {file = "pydantic_core-2.18.4-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:2fdf2156aa3d017fddf8aea5adfba9f777db1d6022d392b682d2a8329e087cef"},
+ {file = "pydantic_core-2.18.4-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4748321b5078216070b151d5271ef3e7cc905ab170bbfd27d5c83ee3ec436695"},
+ {file = "pydantic_core-2.18.4-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:847a35c4d58721c5dc3dba599878ebbdfd96784f3fb8bb2c356e123bdcd73f34"},
+ {file = "pydantic_core-2.18.4-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:3c40d4eaad41f78e3bbda31b89edc46a3f3dc6e171bf0ecf097ff7a0ffff7cb1"},
+ {file = "pydantic_core-2.18.4-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:21a5e440dbe315ab9825fcd459b8814bb92b27c974cbc23c3e8baa2b76890077"},
+ {file = "pydantic_core-2.18.4-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:01dd777215e2aa86dfd664daed5957704b769e726626393438f9c87690ce78c3"},
+ {file = "pydantic_core-2.18.4-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:4b06beb3b3f1479d32befd1f3079cc47b34fa2da62457cdf6c963393340b56e9"},
+ {file = "pydantic_core-2.18.4-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:564d7922e4b13a16b98772441879fcdcbe82ff50daa622d681dd682175ea918c"},
+ {file = "pydantic_core-2.18.4-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:0eb2a4f660fcd8e2b1c90ad566db2b98d7f3f4717c64fe0a83e0adb39766d5b8"},
+ {file = "pydantic_core-2.18.4-cp312-none-win32.whl", hash = "sha256:8b8bab4c97248095ae0c4455b5a1cd1cdd96e4e4769306ab19dda135ea4cdb07"},
+ {file = "pydantic_core-2.18.4-cp312-none-win_amd64.whl", hash = "sha256:14601cdb733d741b8958224030e2bfe21a4a881fb3dd6fbb21f071cabd48fa0a"},
+ {file = "pydantic_core-2.18.4-cp312-none-win_arm64.whl", hash = "sha256:c1322d7dd74713dcc157a2b7898a564ab091ca6c58302d5c7b4c07296e3fd00f"},
+ {file = "pydantic_core-2.18.4-cp38-cp38-macosx_10_12_x86_64.whl", hash = "sha256:823be1deb01793da05ecb0484d6c9e20baebb39bd42b5d72636ae9cf8350dbd2"},
+ {file = "pydantic_core-2.18.4-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:ebef0dd9bf9b812bf75bda96743f2a6c5734a02092ae7f721c048d156d5fabae"},
+ {file = "pydantic_core-2.18.4-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ae1d6df168efb88d7d522664693607b80b4080be6750c913eefb77e34c12c71a"},
+ {file = "pydantic_core-2.18.4-cp38-cp38-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:f9899c94762343f2cc2fc64c13e7cae4c3cc65cdfc87dd810a31654c9b7358cc"},
+ {file = "pydantic_core-2.18.4-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:99457f184ad90235cfe8461c4d70ab7dd2680e28821c29eca00252ba90308c78"},
+ {file = "pydantic_core-2.18.4-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:18f469a3d2a2fdafe99296a87e8a4c37748b5080a26b806a707f25a902c040a8"},
+ {file = "pydantic_core-2.18.4-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b7cdf28938ac6b8b49ae5e92f2735056a7ba99c9b110a474473fd71185c1af5d"},
+ {file = "pydantic_core-2.18.4-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:938cb21650855054dc54dfd9120a851c974f95450f00683399006aa6e8abb057"},
+ {file = "pydantic_core-2.18.4-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:44cd83ab6a51da80fb5adbd9560e26018e2ac7826f9626bc06ca3dc074cd198b"},
+ {file = "pydantic_core-2.18.4-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:972658f4a72d02b8abfa2581d92d59f59897d2e9f7e708fdabe922f9087773af"},
+ {file = "pydantic_core-2.18.4-cp38-none-win32.whl", hash = "sha256:1d886dc848e60cb7666f771e406acae54ab279b9f1e4143babc9c2258213daa2"},
+ {file = "pydantic_core-2.18.4-cp38-none-win_amd64.whl", hash = "sha256:bb4462bd43c2460774914b8525f79b00f8f407c945d50881568f294c1d9b4443"},
+ {file = "pydantic_core-2.18.4-cp39-cp39-macosx_10_12_x86_64.whl", hash = "sha256:44a688331d4a4e2129140a8118479443bd6f1905231138971372fcde37e43528"},
+ {file = "pydantic_core-2.18.4-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:a2fdd81edd64342c85ac7cf2753ccae0b79bf2dfa063785503cb85a7d3593223"},
+ {file = "pydantic_core-2.18.4-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:86110d7e1907ab36691f80b33eb2da87d780f4739ae773e5fc83fb272f88825f"},
+ {file = "pydantic_core-2.18.4-cp39-cp39-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:46387e38bd641b3ee5ce247563b60c5ca098da9c56c75c157a05eaa0933ed154"},
+ {file = "pydantic_core-2.18.4-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:123c3cec203e3f5ac7b000bd82235f1a3eced8665b63d18be751f115588fea30"},
+ {file = "pydantic_core-2.18.4-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:dc1803ac5c32ec324c5261c7209e8f8ce88e83254c4e1aebdc8b0a39f9ddb443"},
+ {file = "pydantic_core-2.18.4-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:53db086f9f6ab2b4061958d9c276d1dbe3690e8dd727d6abf2321d6cce37fa94"},
+ {file = "pydantic_core-2.18.4-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:abc267fa9837245cc28ea6929f19fa335f3dc330a35d2e45509b6566dc18be23"},
+ {file = "pydantic_core-2.18.4-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:a0d829524aaefdebccb869eed855e2d04c21d2d7479b6cada7ace5448416597b"},
+ {file = "pydantic_core-2.18.4-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:509daade3b8649f80d4e5ff21aa5673e4ebe58590b25fe42fac5f0f52c6f034a"},
+ {file = "pydantic_core-2.18.4-cp39-none-win32.whl", hash = "sha256:ca26a1e73c48cfc54c4a76ff78df3727b9d9f4ccc8dbee4ae3f73306a591676d"},
+ {file = "pydantic_core-2.18.4-cp39-none-win_amd64.whl", hash = "sha256:c67598100338d5d985db1b3d21f3619ef392e185e71b8d52bceacc4a7771ea7e"},
+ {file = "pydantic_core-2.18.4-pp310-pypy310_pp73-macosx_10_12_x86_64.whl", hash = "sha256:574d92eac874f7f4db0ca653514d823a0d22e2354359d0759e3f6a406db5d55d"},
+ {file = "pydantic_core-2.18.4-pp310-pypy310_pp73-macosx_11_0_arm64.whl", hash = "sha256:1f4d26ceb5eb9eed4af91bebeae4b06c3fb28966ca3a8fb765208cf6b51102ab"},
+ {file = "pydantic_core-2.18.4-pp310-pypy310_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:77450e6d20016ec41f43ca4a6c63e9fdde03f0ae3fe90e7c27bdbeaece8b1ed4"},
+ {file = "pydantic_core-2.18.4-pp310-pypy310_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:d323a01da91851a4f17bf592faf46149c9169d68430b3146dcba2bb5e5719abc"},
+ {file = "pydantic_core-2.18.4-pp310-pypy310_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:43d447dd2ae072a0065389092a231283f62d960030ecd27565672bd40746c507"},
+ {file = "pydantic_core-2.18.4-pp310-pypy310_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:578e24f761f3b425834f297b9935e1ce2e30f51400964ce4801002435a1b41ef"},
+ {file = "pydantic_core-2.18.4-pp310-pypy310_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:81b5efb2f126454586d0f40c4d834010979cb80785173d1586df845a632e4e6d"},
+ {file = "pydantic_core-2.18.4-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:ab86ce7c8f9bea87b9d12c7f0af71102acbf5ecbc66c17796cff45dae54ef9a5"},
+ {file = "pydantic_core-2.18.4-pp39-pypy39_pp73-macosx_10_12_x86_64.whl", hash = "sha256:90afc12421df2b1b4dcc975f814e21bc1754640d502a2fbcc6d41e77af5ec312"},
+ {file = "pydantic_core-2.18.4-pp39-pypy39_pp73-macosx_11_0_arm64.whl", hash = "sha256:51991a89639a912c17bef4b45c87bd83593aee0437d8102556af4885811d59f5"},
+ {file = "pydantic_core-2.18.4-pp39-pypy39_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:293afe532740370aba8c060882f7d26cfd00c94cae32fd2e212a3a6e3b7bc15e"},
+ {file = "pydantic_core-2.18.4-pp39-pypy39_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b48ece5bde2e768197a2d0f6e925f9d7e3e826f0ad2271120f8144a9db18d5c8"},
+ {file = "pydantic_core-2.18.4-pp39-pypy39_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:eae237477a873ab46e8dd748e515c72c0c804fb380fbe6c85533c7de51f23a8f"},
+ {file = "pydantic_core-2.18.4-pp39-pypy39_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:834b5230b5dfc0c1ec37b2fda433b271cbbc0e507560b5d1588e2cc1148cf1ce"},
+ {file = "pydantic_core-2.18.4-pp39-pypy39_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:e858ac0a25074ba4bce653f9b5d0a85b7456eaddadc0ce82d3878c22489fa4ee"},
+ {file = "pydantic_core-2.18.4-pp39-pypy39_pp73-win_amd64.whl", hash = "sha256:2fd41f6eff4c20778d717af1cc50eca52f5afe7805ee530a4fbd0bae284f16e9"},
+ {file = "pydantic_core-2.18.4.tar.gz", hash = "sha256:ec3beeada09ff865c344ff3bc2f427f5e6c26401cc6113d77e372c3fdac73864"},
]
[[package]]
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/pydantic-2.7.1/pydantic/__init__.py new/pydantic-2.7.3/pydantic/__init__.py
--- old/pydantic-2.7.1/pydantic/__init__.py 2024-04-23 14:59:19.000000000 +0200
+++ new/pydantic-2.7.3/pydantic/__init__.py 2024-06-03 20:32:51.000000000 +0200
@@ -220,145 +220,145 @@
# A mapping of {<member name>: (package, <module name>)} defining dynamic imports
_dynamic_imports: 'dict[str, tuple[str, str]]' = {
- 'dataclasses': (__package__, '__module__'),
+ 'dataclasses': (__spec__.parent, '__module__'),
# functional validators
- 'field_validator': (__package__, '.functional_validators'),
- 'model_validator': (__package__, '.functional_validators'),
- 'AfterValidator': (__package__, '.functional_validators'),
- 'BeforeValidator': (__package__, '.functional_validators'),
- 'PlainValidator': (__package__, '.functional_validators'),
- 'WrapValidator': (__package__, '.functional_validators'),
- 'SkipValidation': (__package__, '.functional_validators'),
- 'InstanceOf': (__package__, '.functional_validators'),
+ 'field_validator': (__spec__.parent, '.functional_validators'),
+ 'model_validator': (__spec__.parent, '.functional_validators'),
+ 'AfterValidator': (__spec__.parent, '.functional_validators'),
+ 'BeforeValidator': (__spec__.parent, '.functional_validators'),
+ 'PlainValidator': (__spec__.parent, '.functional_validators'),
+ 'WrapValidator': (__spec__.parent, '.functional_validators'),
+ 'SkipValidation': (__spec__.parent, '.functional_validators'),
+ 'InstanceOf': (__spec__.parent, '.functional_validators'),
# JSON Schema
- 'WithJsonSchema': (__package__, '.json_schema'),
+ 'WithJsonSchema': (__spec__.parent, '.json_schema'),
# functional serializers
- 'field_serializer': (__package__, '.functional_serializers'),
- 'model_serializer': (__package__, '.functional_serializers'),
- 'PlainSerializer': (__package__, '.functional_serializers'),
- 'SerializeAsAny': (__package__, '.functional_serializers'),
- 'WrapSerializer': (__package__, '.functional_serializers'),
+ 'field_serializer': (__spec__.parent, '.functional_serializers'),
+ 'model_serializer': (__spec__.parent, '.functional_serializers'),
+ 'PlainSerializer': (__spec__.parent, '.functional_serializers'),
+ 'SerializeAsAny': (__spec__.parent, '.functional_serializers'),
+ 'WrapSerializer': (__spec__.parent, '.functional_serializers'),
# config
- 'ConfigDict': (__package__, '.config'),
- 'with_config': (__package__, '.config'),
+ 'ConfigDict': (__spec__.parent, '.config'),
+ 'with_config': (__spec__.parent, '.config'),
# validate call
- 'validate_call': (__package__, '.validate_call_decorator'),
+ 'validate_call': (__spec__.parent, '.validate_call_decorator'),
# errors
- 'PydanticErrorCodes': (__package__, '.errors'),
- 'PydanticUserError': (__package__, '.errors'),
- 'PydanticSchemaGenerationError': (__package__, '.errors'),
- 'PydanticImportError': (__package__, '.errors'),
- 'PydanticUndefinedAnnotation': (__package__, '.errors'),
- 'PydanticInvalidForJsonSchema': (__package__, '.errors'),
+ 'PydanticErrorCodes': (__spec__.parent, '.errors'),
+ 'PydanticUserError': (__spec__.parent, '.errors'),
+ 'PydanticSchemaGenerationError': (__spec__.parent, '.errors'),
+ 'PydanticImportError': (__spec__.parent, '.errors'),
+ 'PydanticUndefinedAnnotation': (__spec__.parent, '.errors'),
+ 'PydanticInvalidForJsonSchema': (__spec__.parent, '.errors'),
# fields
- 'Field': (__package__, '.fields'),
- 'computed_field': (__package__, '.fields'),
- 'PrivateAttr': (__package__, '.fields'),
+ 'Field': (__spec__.parent, '.fields'),
+ 'computed_field': (__spec__.parent, '.fields'),
+ 'PrivateAttr': (__spec__.parent, '.fields'),
# alias
- 'AliasChoices': (__package__, '.aliases'),
- 'AliasGenerator': (__package__, '.aliases'),
- 'AliasPath': (__package__, '.aliases'),
+ 'AliasChoices': (__spec__.parent, '.aliases'),
+ 'AliasGenerator': (__spec__.parent, '.aliases'),
+ 'AliasPath': (__spec__.parent, '.aliases'),
# main
- 'BaseModel': (__package__, '.main'),
- 'create_model': (__package__, '.main'),
+ 'BaseModel': (__spec__.parent, '.main'),
+ 'create_model': (__spec__.parent, '.main'),
# network
- 'AnyUrl': (__package__, '.networks'),
- 'AnyHttpUrl': (__package__, '.networks'),
- 'FileUrl': (__package__, '.networks'),
- 'HttpUrl': (__package__, '.networks'),
- 'FtpUrl': (__package__, '.networks'),
- 'WebsocketUrl': (__package__, '.networks'),
- 'AnyWebsocketUrl': (__package__, '.networks'),
- 'UrlConstraints': (__package__, '.networks'),
- 'EmailStr': (__package__, '.networks'),
- 'NameEmail': (__package__, '.networks'),
- 'IPvAnyAddress': (__package__, '.networks'),
- 'IPvAnyInterface': (__package__, '.networks'),
- 'IPvAnyNetwork': (__package__, '.networks'),
- 'PostgresDsn': (__package__, '.networks'),
- 'CockroachDsn': (__package__, '.networks'),
- 'AmqpDsn': (__package__, '.networks'),
- 'RedisDsn': (__package__, '.networks'),
- 'MongoDsn': (__package__, '.networks'),
- 'KafkaDsn': (__package__, '.networks'),
- 'NatsDsn': (__package__, '.networks'),
- 'MySQLDsn': (__package__, '.networks'),
- 'MariaDBDsn': (__package__, '.networks'),
- 'ClickHouseDsn': (__package__, '.networks'),
- 'validate_email': (__package__, '.networks'),
+ 'AnyUrl': (__spec__.parent, '.networks'),
+ 'AnyHttpUrl': (__spec__.parent, '.networks'),
+ 'FileUrl': (__spec__.parent, '.networks'),
+ 'HttpUrl': (__spec__.parent, '.networks'),
+ 'FtpUrl': (__spec__.parent, '.networks'),
+ 'WebsocketUrl': (__spec__.parent, '.networks'),
+ 'AnyWebsocketUrl': (__spec__.parent, '.networks'),
+ 'UrlConstraints': (__spec__.parent, '.networks'),
+ 'EmailStr': (__spec__.parent, '.networks'),
+ 'NameEmail': (__spec__.parent, '.networks'),
+ 'IPvAnyAddress': (__spec__.parent, '.networks'),
+ 'IPvAnyInterface': (__spec__.parent, '.networks'),
+ 'IPvAnyNetwork': (__spec__.parent, '.networks'),
+ 'PostgresDsn': (__spec__.parent, '.networks'),
+ 'CockroachDsn': (__spec__.parent, '.networks'),
+ 'AmqpDsn': (__spec__.parent, '.networks'),
+ 'RedisDsn': (__spec__.parent, '.networks'),
+ 'MongoDsn': (__spec__.parent, '.networks'),
+ 'KafkaDsn': (__spec__.parent, '.networks'),
+ 'NatsDsn': (__spec__.parent, '.networks'),
+ 'MySQLDsn': (__spec__.parent, '.networks'),
+ 'MariaDBDsn': (__spec__.parent, '.networks'),
+ 'ClickHouseDsn': (__spec__.parent, '.networks'),
+ 'validate_email': (__spec__.parent, '.networks'),
# root_model
- 'RootModel': (__package__, '.root_model'),
+ 'RootModel': (__spec__.parent, '.root_model'),
# types
- 'Strict': (__package__, '.types'),
- 'StrictStr': (__package__, '.types'),
- 'conbytes': (__package__, '.types'),
- 'conlist': (__package__, '.types'),
- 'conset': (__package__, '.types'),
- 'confrozenset': (__package__, '.types'),
- 'constr': (__package__, '.types'),
- 'StringConstraints': (__package__, '.types'),
- 'ImportString': (__package__, '.types'),
- 'conint': (__package__, '.types'),
- 'PositiveInt': (__package__, '.types'),
- 'NegativeInt': (__package__, '.types'),
- 'NonNegativeInt': (__package__, '.types'),
- 'NonPositiveInt': (__package__, '.types'),
- 'confloat': (__package__, '.types'),
- 'PositiveFloat': (__package__, '.types'),
- 'NegativeFloat': (__package__, '.types'),
- 'NonNegativeFloat': (__package__, '.types'),
- 'NonPositiveFloat': (__package__, '.types'),
- 'FiniteFloat': (__package__, '.types'),
- 'condecimal': (__package__, '.types'),
- 'condate': (__package__, '.types'),
- 'UUID1': (__package__, '.types'),
- 'UUID3': (__package__, '.types'),
- 'UUID4': (__package__, '.types'),
- 'UUID5': (__package__, '.types'),
- 'FilePath': (__package__, '.types'),
- 'DirectoryPath': (__package__, '.types'),
- 'NewPath': (__package__, '.types'),
- 'Json': (__package__, '.types'),
- 'Secret': (__package__, '.types'),
- 'SecretStr': (__package__, '.types'),
- 'SecretBytes': (__package__, '.types'),
- 'StrictBool': (__package__, '.types'),
- 'StrictBytes': (__package__, '.types'),
- 'StrictInt': (__package__, '.types'),
- 'StrictFloat': (__package__, '.types'),
- 'PaymentCardNumber': (__package__, '.types'),
- 'ByteSize': (__package__, '.types'),
- 'PastDate': (__package__, '.types'),
- 'FutureDate': (__package__, '.types'),
- 'PastDatetime': (__package__, '.types'),
- 'FutureDatetime': (__package__, '.types'),
- 'AwareDatetime': (__package__, '.types'),
- 'NaiveDatetime': (__package__, '.types'),
- 'AllowInfNan': (__package__, '.types'),
- 'EncoderProtocol': (__package__, '.types'),
- 'EncodedBytes': (__package__, '.types'),
- 'EncodedStr': (__package__, '.types'),
- 'Base64Encoder': (__package__, '.types'),
- 'Base64Bytes': (__package__, '.types'),
- 'Base64Str': (__package__, '.types'),
- 'Base64UrlBytes': (__package__, '.types'),
- 'Base64UrlStr': (__package__, '.types'),
- 'GetPydanticSchema': (__package__, '.types'),
- 'Tag': (__package__, '.types'),
- 'Discriminator': (__package__, '.types'),
- 'JsonValue': (__package__, '.types'),
- 'OnErrorOmit': (__package__, '.types'),
+ 'Strict': (__spec__.parent, '.types'),
+ 'StrictStr': (__spec__.parent, '.types'),
+ 'conbytes': (__spec__.parent, '.types'),
+ 'conlist': (__spec__.parent, '.types'),
+ 'conset': (__spec__.parent, '.types'),
+ 'confrozenset': (__spec__.parent, '.types'),
+ 'constr': (__spec__.parent, '.types'),
+ 'StringConstraints': (__spec__.parent, '.types'),
+ 'ImportString': (__spec__.parent, '.types'),
+ 'conint': (__spec__.parent, '.types'),
+ 'PositiveInt': (__spec__.parent, '.types'),
+ 'NegativeInt': (__spec__.parent, '.types'),
+ 'NonNegativeInt': (__spec__.parent, '.types'),
+ 'NonPositiveInt': (__spec__.parent, '.types'),
+ 'confloat': (__spec__.parent, '.types'),
+ 'PositiveFloat': (__spec__.parent, '.types'),
+ 'NegativeFloat': (__spec__.parent, '.types'),
+ 'NonNegativeFloat': (__spec__.parent, '.types'),
+ 'NonPositiveFloat': (__spec__.parent, '.types'),
+ 'FiniteFloat': (__spec__.parent, '.types'),
+ 'condecimal': (__spec__.parent, '.types'),
+ 'condate': (__spec__.parent, '.types'),
+ 'UUID1': (__spec__.parent, '.types'),
+ 'UUID3': (__spec__.parent, '.types'),
+ 'UUID4': (__spec__.parent, '.types'),
+ 'UUID5': (__spec__.parent, '.types'),
+ 'FilePath': (__spec__.parent, '.types'),
+ 'DirectoryPath': (__spec__.parent, '.types'),
+ 'NewPath': (__spec__.parent, '.types'),
+ 'Json': (__spec__.parent, '.types'),
+ 'Secret': (__spec__.parent, '.types'),
+ 'SecretStr': (__spec__.parent, '.types'),
+ 'SecretBytes': (__spec__.parent, '.types'),
+ 'StrictBool': (__spec__.parent, '.types'),
+ 'StrictBytes': (__spec__.parent, '.types'),
+ 'StrictInt': (__spec__.parent, '.types'),
+ 'StrictFloat': (__spec__.parent, '.types'),
+ 'PaymentCardNumber': (__spec__.parent, '.types'),
+ 'ByteSize': (__spec__.parent, '.types'),
+ 'PastDate': (__spec__.parent, '.types'),
+ 'FutureDate': (__spec__.parent, '.types'),
+ 'PastDatetime': (__spec__.parent, '.types'),
+ 'FutureDatetime': (__spec__.parent, '.types'),
+ 'AwareDatetime': (__spec__.parent, '.types'),
+ 'NaiveDatetime': (__spec__.parent, '.types'),
+ 'AllowInfNan': (__spec__.parent, '.types'),
+ 'EncoderProtocol': (__spec__.parent, '.types'),
+ 'EncodedBytes': (__spec__.parent, '.types'),
+ 'EncodedStr': (__spec__.parent, '.types'),
+ 'Base64Encoder': (__spec__.parent, '.types'),
+ 'Base64Bytes': (__spec__.parent, '.types'),
+ 'Base64Str': (__spec__.parent, '.types'),
+ 'Base64UrlBytes': (__spec__.parent, '.types'),
+ 'Base64UrlStr': (__spec__.parent, '.types'),
+ 'GetPydanticSchema': (__spec__.parent, '.types'),
+ 'Tag': (__spec__.parent, '.types'),
+ 'Discriminator': (__spec__.parent, '.types'),
+ 'JsonValue': (__spec__.parent, '.types'),
+ 'OnErrorOmit': (__spec__.parent, '.types'),
# type_adapter
- 'TypeAdapter': (__package__, '.type_adapter'),
+ 'TypeAdapter': (__spec__.parent, '.type_adapter'),
# warnings
- 'PydanticDeprecatedSince20': (__package__, '.warnings'),
- 'PydanticDeprecatedSince26': (__package__, '.warnings'),
- 'PydanticDeprecationWarning': (__package__, '.warnings'),
+ 'PydanticDeprecatedSince20': (__spec__.parent, '.warnings'),
+ 'PydanticDeprecatedSince26': (__spec__.parent, '.warnings'),
+ 'PydanticDeprecationWarning': (__spec__.parent, '.warnings'),
# annotated handlers
- 'GetCoreSchemaHandler': (__package__, '.annotated_handlers'),
- 'GetJsonSchemaHandler': (__package__, '.annotated_handlers'),
+ 'GetCoreSchemaHandler': (__spec__.parent, '.annotated_handlers'),
+ 'GetJsonSchemaHandler': (__spec__.parent, '.annotated_handlers'),
# generate schema from ._internal
- 'GenerateSchema': (__package__, '._internal._generate_schema'),
+ 'GenerateSchema': (__spec__.parent, '._internal._generate_schema'),
# pydantic_core stuff
'ValidationError': ('pydantic_core', '.'),
'ValidationInfo': ('pydantic_core', '.core_schema'),
@@ -367,13 +367,13 @@
'FieldSerializationInfo': ('pydantic_core', '.core_schema'),
'SerializerFunctionWrapHandler': ('pydantic_core', '.core_schema'),
# deprecated, mostly not included in __all__
- 'root_validator': (__package__, '.deprecated.class_validators'),
- 'validator': (__package__, '.deprecated.class_validators'),
- 'BaseConfig': (__package__, '.deprecated.config'),
- 'Extra': (__package__, '.deprecated.config'),
- 'parse_obj_as': (__package__, '.deprecated.tools'),
- 'schema_of': (__package__, '.deprecated.tools'),
- 'schema_json_of': (__package__, '.deprecated.tools'),
+ 'root_validator': (__spec__.parent, '.deprecated.class_validators'),
+ 'validator': (__spec__.parent, '.deprecated.class_validators'),
+ 'BaseConfig': (__spec__.parent, '.deprecated.config'),
+ 'Extra': (__spec__.parent, '.deprecated.config'),
+ 'parse_obj_as': (__spec__.parent, '.deprecated.tools'),
+ 'schema_of': (__spec__.parent, '.deprecated.tools'),
+ 'schema_json_of': (__spec__.parent, '.deprecated.tools'),
'FieldValidationInfo': ('pydantic_core', '.core_schema'),
}
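The sweep above from `__package__` to `__spec__.parent` tracks the upstream pydantic 2.7.2+ change: `__spec__.parent` is the value the import machinery itself uses to anchor relative imports, and for a normally imported package the two agree. A minimal sketch of the lazy-import pattern this `_dynamic_imports` mapping feeds, using the stdlib `json` package as a stand-in (the mapping entry here is illustrative, not pydantic's):

```python
import importlib
import json

# For a regularly imported package the two attributes agree, but
# __spec__.parent is the authoritative anchor for relative imports.
assert json.__package__ == json.__spec__.parent == "json"

# Sketch of the dynamic-import lookup a module-level __getattr__ performs:
# map an exported name to (package, relative module), then import lazily.
_dynamic_imports = {"JSONDecoder": (json.__spec__.parent, ".decoder")}

def lazy_get(name):
    package, module = _dynamic_imports[name]
    # Relative import resolved against the recorded package name.
    mod = importlib.import_module(module, package)
    return getattr(mod, name)

assert lazy_get("JSONDecoder") is json.decoder.JSONDecoder
```

The gain is robustness: `__spec__.parent` keeps working in contexts where `__package__` may be unset or stale (e.g. modules executed in unusual ways), which is why the patch replaces it wholesale.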
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/pydantic-2.7.1/pydantic/_internal/_generate_schema.py new/pydantic-2.7.3/pydantic/_internal/_generate_schema.py
--- old/pydantic-2.7.1/pydantic/_internal/_generate_schema.py 2024-04-23 14:59:19.000000000 +0200
+++ new/pydantic-2.7.3/pydantic/_internal/_generate_schema.py 2024-06-03 20:32:51.000000000 +0200
@@ -1656,15 +1656,20 @@
bound = typevar.__bound__
constraints = typevar.__constraints__
- default = getattr(typevar, '__default__', None)
- if (bound is not None) + (len(constraints) != 0) + (default is not None) > 1:
+ try:
+ typevar_has_default = typevar.has_default() # type: ignore
+ except AttributeError:
+ # could still have a default if it's an old version of typing_extensions.TypeVar
+ typevar_has_default = getattr(typevar, '__default__', None) is not None
+
+ if (bound is not None) + (len(constraints) != 0) + typevar_has_default > 1:
raise NotImplementedError(
'Pydantic does not support mixing more than one of TypeVar bounds, constraints and defaults'
)
- if default is not None:
- return self.generate_schema(default)
+ if typevar_has_default:
+ return self.generate_schema(typevar.__default__) # type: ignore
elif constraints:
return self._union_schema(typing.Union[constraints]) # type: ignore
elif bound:
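The hunk above swaps a direct `__default__` probe for the `has_default()` API: newer `typing_extensions.TypeVar` (and `typing.TypeVar` on Python 3.13+, per PEP 696) reports defaults through `has_default()`, with `__default__` holding a sentinel rather than `None` when unset. A standalone sketch of the equivalent detection logic (this helper name is illustrative, not pydantic's):

```python
import typing

def typevar_has_default(tv) -> bool:
    """Mirror the patched check: prefer has_default(), fall back to __default__."""
    try:
        # typing_extensions >= 4.4 and Python 3.13+ TypeVar expose this.
        return bool(tv.has_default())
    except AttributeError:
        # Older TypeVar objects: a default, if present, lives in __default__.
        return getattr(tv, "__default__", None) is not None

T = typing.TypeVar("T")
assert typevar_has_default(T) is False  # a plain TypeVar carries no default
```

The try/except ordering matters: probing `has_default()` first handles new-style TypeVars correctly, while the `getattr` fallback preserves behavior for older `typing_extensions` releases, exactly as the patch comments note.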
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/pydantic-2.7.1/pydantic/version.py new/pydantic-2.7.3/pydantic/version.py
--- old/pydantic-2.7.1/pydantic/version.py 2024-04-23 14:59:19.000000000 +0200
+++ new/pydantic-2.7.3/pydantic/version.py 2024-06-03 20:32:51.000000000 +0200
@@ -3,7 +3,7 @@
__all__ = 'VERSION', 'version_info'
-VERSION = '2.7.1'
+VERSION = '2.7.3'
"""The version of Pydantic."""
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/pydantic-2.7.1/pyproject.toml new/pydantic-2.7.3/pyproject.toml
--- old/pydantic-2.7.1/pyproject.toml 2024-04-23 14:59:19.000000000 +0200
+++ new/pydantic-2.7.3/pyproject.toml 2024-06-03 20:32:51.000000000 +0200
@@ -48,7 +48,7 @@
dependencies = [
'typing-extensions>=4.6.1',
'annotated-types>=0.4.0',
- "pydantic-core==2.18.2",
+ "pydantic-core==2.18.4",
]
dynamic = ['version', 'readme']
Script 'mail_helper' called by obssrc
Hello community,
here is the log from the commit of package autofs for openSUSE:Factory checked in at 2024-06-07 15:02:29
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/autofs (Old)
and /work/SRC/openSUSE:Factory/.autofs.new.24587 (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Package is "autofs"
Fri Jun 7 15:02:29 2024 rev:135 rq:1179026 version:5.1.9
Changes:
--------
--- /work/SRC/openSUSE:Factory/autofs/autofs.changes 2024-05-07 18:02:52.829365347 +0200
+++ /work/SRC/openSUSE:Factory/.autofs.new.24587/autofs.changes 2024-06-07 15:02:46.278471206 +0200
@@ -1,0 +2,9 @@
+Mon Jun 3 02:21:09 UTC 2024 - David Disseldorp <ddiss(a)suse.com>
+
+- Fix xmlStructuredErrorFunc callback parameter type (bsc#1221682)
+ * Refresh autofs-5.1.1-dbus-udisks-monitor.patch
+- Use upstream sasl_callback_t patch
+ * Remove autofs-5.1.9-cast-sasl_callback_t-function-pointers.patch
+ * Add autofs-5.1.9-Fix-incompatible-function-pointer-types.patch
+
+-------------------------------------------------------------------
Old:
----
autofs-5.1.9-cast-sasl_callback_t-function-pointers.patch
New:
----
autofs-5.1.9-Fix-incompatible-function-pointer-types.patch
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Other differences:
------------------
++++++ autofs.spec ++++++
--- /var/tmp/diff_new_pack.oyrro6/_old 2024-06-07 15:02:47.050499331 +0200
+++ /var/tmp/diff_new_pack.oyrro6/_new 2024-06-07 15:02:47.050499331 +0200
@@ -61,7 +61,7 @@
Patch108: autofs-suse-manpage-remove-initdir.patch
# bsc#1221682 - GCC 14: autofs package fails
Patch109: autofs-5.1.9-fix-ldap_parse_page_control-check.patch
-Patch110: autofs-5.1.9-cast-sasl_callback_t-function-pointers.patch
+Patch110: autofs-5.1.9-Fix-incompatible-function-pointer-types.patch
BuildRequires: autoconf
BuildRequires: bison
BuildRequires: cyrus-sasl-devel
++++++ autofs-5.1.1-dbus-udisks-monitor.patch ++++++
--- /var/tmp/diff_new_pack.oyrro6/_old 2024-06-07 15:02:47.110501517 +0200
+++ /var/tmp/diff_new_pack.oyrro6/_new 2024-06-07 15:02:47.110501517 +0200
@@ -1,5 +1,6 @@
ddiss: rebase atop 37fda2c ("autofs-5.1.8 - add soucre parameter to
module functions")
+ Fix xmlStructuredErrorFunc callback parameter type (bsc#1221682)
---
Makefile.conf.in | 3
@@ -1970,7 +1971,7 @@
+}
+
+#ifdef LIBXML_TREE_ENABLED
-+static void xmlerror(void *context, xmlErrorPtr err)
++static void xmlerror(void *context, const xmlError *err)
+{
+ struct lookup_context *ctxt = (struct lookup_context*)context;
+ char *message = err->message;
++++++ autofs-5.1.9-Fix-incompatible-function-pointer-types.patch ++++++
From b7ff971bb8aa3fc609bb531ddc4c2ce56226383f Mon Sep 17 00:00:00 2001
From: Florian Weimer <fweimer(a)redhat.com>
Date: Mon, 18 Dec 2023 13:48:18 +0100
Subject: [PATCH] autofs-5.1.9 - Fix incompatible function pointer types in
cyrus-sasl module
Add casts to SASL callbacks to avoid incompatible-pointer-types
errors. Avoids a build failure with stricter compilers.
Signed-off-by: Florian Weimer <fweimer(a)redhat.com>
Signed-off-by: Ian Kent <raven(a)themaw.net>
Reviewed-by: David Disseldorp <ddiss(a)suse.de>
---
CHANGELOG | 2 ++
modules/cyrus-sasl.c | 14 +++++++-------
2 files changed, 9 insertions(+), 7 deletions(-)
diff --git a/CHANGELOG b/CHANGELOG
index 3e47daa..fd9d861 100644
--- a/CHANGELOG
+++ b/CHANGELOG
@@ -1,4 +1,6 @@
+- Fix incompatible function pointer types in cyrus-sasl module.
+
02/11/2023 autofs-5.1.9
- fix kernel mount status notification.
- fix fedfs build flags.
diff --git a/modules/cyrus-sasl.c b/modules/cyrus-sasl.c
index e742eaf..78b7794 100644
--- a/modules/cyrus-sasl.c
+++ b/modules/cyrus-sasl.c
@@ -109,17 +109,17 @@ static int getpass_func(sasl_conn_t *, void *, int, sasl_secret_t **);
static int getuser_func(void *, int, const char **, unsigned *);
static sasl_callback_t callbacks[] = {
- { SASL_CB_USER, &getuser_func, NULL },
- { SASL_CB_AUTHNAME, &getuser_func, NULL },
- { SASL_CB_PASS, &getpass_func, NULL },
+ { SASL_CB_USER, (int(*)(void)) &getuser_func, NULL },
+ { SASL_CB_AUTHNAME, (int(*)(void)) &getuser_func, NULL },
+ { SASL_CB_PASS, (int(*)(void)) &getpass_func, NULL },
{ SASL_CB_LIST_END, NULL, NULL },
};
static sasl_callback_t debug_callbacks[] = {
- { SASL_CB_LOG, &sasl_log_func, NULL },
- { SASL_CB_USER, &getuser_func, NULL },
- { SASL_CB_AUTHNAME, &getuser_func, NULL },
- { SASL_CB_PASS, &getpass_func, NULL },
+ { SASL_CB_LOG, (int(*)(void)) &sasl_log_func, NULL },
+ { SASL_CB_USER, (int(*)(void)) &getuser_func, NULL },
+ { SASL_CB_AUTHNAME, (int(*)(void)) &getuser_func, NULL },
+ { SASL_CB_PASS, (int(*)(void)) &getpass_func, NULL },
{ SASL_CB_LIST_END, NULL, NULL },
};
--
2.35.3
Script 'mail_helper' called by obssrc
Hello community,
here is the log from the commit of package pmdk for openSUSE:Factory checked in at 2024-06-07 15:02:23
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/pmdk (Old)
and /work/SRC/openSUSE:Factory/.pmdk.new.24587 (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Package is "pmdk"
Fri Jun 7 15:02:23 2024 rev:16 rq:1179000 version:2.1.0
Changes:
--------
--- /work/SRC/openSUSE:Factory/pmdk/pmdk.changes 2024-05-28 17:28:01.230257423 +0200
+++ /work/SRC/openSUSE:Factory/.pmdk.new.24587/pmdk.changes 2024-06-07 15:02:38.554189813 +0200
@@ -1,0 +2,6 @@
+Thu Jun 6 09:22:13 UTC 2024 - Guillaume GARDET <guillaume.gardet(a)opensuse.org>
+
+- Add patch to fix build on aarch64:
+ * 6096.patch
+
+-------------------------------------------------------------------
New:
----
6096.patch
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Other differences:
------------------
++++++ pmdk.spec ++++++
--- /var/tmp/diff_new_pack.dT0bv0/_old 2024-06-07 15:02:39.138211089 +0200
+++ /var/tmp/diff_new_pack.dT0bv0/_new 2024-06-07 15:02:39.138211089 +0200
@@ -31,6 +31,8 @@
Source2: https://github.com/pmem/pmdk/releases/download/%version/%name-%version.tar.…
Source10: pregen-doc.tar.xz
Source99: gen-doc.sh
+# PATCH-FIX-UPSTREAM - https://github.com/pmem/pmdk/pull/6096
+Patch1: 6096.patch
BuildRequires: automake
BuildRequires: fdupes
BuildRequires: man
@@ -170,7 +172,7 @@
Documentation for the pmem library interface.
%prep
-%autosetup -p0 -a10
+%autosetup -p1 -a10
# we have pregenerated pages
find doc -type f -name "*.[0-9].md" -delete
++++++ 6096.patch ++++++
From 85d138490d1c314d337bb77659fb08bf62c4c099 Mon Sep 17 00:00:00 2001
From: Guillaume Gardet <Guillaume.Gardet(a)arm.com>
Date: Wed, 5 Jun 2024 10:45:16 +0200
Subject: [PATCH] Add missing log_internal.h to fix build on aarch64
---
src/libpmem2/aarch64/init.c | 1 +
1 file changed, 1 insertion(+)
diff --git a/src/libpmem2/aarch64/init.c b/src/libpmem2/aarch64/init.c
index d4dd8812b21..f0b504b4b89 100644
--- a/src/libpmem2/aarch64/init.c
+++ b/src/libpmem2/aarch64/init.c
@@ -7,6 +7,7 @@
#include "auto_flush.h"
#include "flush.h"
+#include "log_internal.h"
#include "out.h"
#include "pmem2_arch.h"
Script 'mail_helper' called by obssrc
Hello community,
here is the log from the commit of package python-ruff for openSUSE:Factory checked in at 2024-06-07 15:02:21
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/python-ruff (Old)
and /work/SRC/openSUSE:Factory/.python-ruff.new.24587 (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Package is "python-ruff"
Fri Jun 7 15:02:21 2024 rev:29 rq:1178989 version:0.4.8
Changes:
--------
--- /work/SRC/openSUSE:Factory/python-ruff/python-ruff.changes 2024-05-14 13:38:06.130018986 +0200
+++ /work/SRC/openSUSE:Factory/.python-ruff.new.24587/python-ruff.changes 2024-06-07 15:02:35.586081686 +0200
@@ -1,0 +2,106 @@
+Thu Jun 6 09:28:05 UTC 2024 - Ondřej Súkup <mimi.vx(a)gmail.com>
+
+- update ruff:
+* 0.4.8
+ * Performance
+ * Linter performance has been improved by around 10% on some microbenchmarks
+ * by refactoring the lexer and parser to maintain synchronicity between them
+ * Preview features
+ * [flake8-bugbear] Implement return-in-generator (B901)
+ * [flake8-pyi] Implement PYI063
+ * [pygrep_hooks] Check blanket ignores via file-level pragmas (PGH004)
+ * Rule changes
+ * [pyupgrade] Update UP035 for Python 3.13 and the latest version of typing_extensions
+ * [numpy] Update NPY001 rule for NumPy 2.0
+ * Server
+ * Formatting a document with syntax problems no longer spams a visible error popup
+ * CLI
+ * Add RDJson support for --output-format flag
+ * Bug fixes
+ * [pyupgrade] Write empty string in lieu of panic when fixing UP032
+ * [flake8-simplify] Simplify double negatives in SIM103
+ * Ensure the expression generator adds a newline before type statements
+ * Respect per-file ignores for blanket and redirected noqa rules
+* 0.4.7
+ * Preview features
+ * [flake8-pyi] Implement PYI064
+ * [flake8-pyi] Implement PYI066
+ * [flake8-pyi] Implement PYI057
+ * [pyflakes] Enable F822 in __init__.py files by default
+ * Formatter
+ * Fix incorrect placement of trailing stub function comments
+ * Server
+ * Respect file exclusions in ruff server
+ * Add support for documents not exist on disk
+ * Add Vim and Kate setup guide for ruff server
+ * Bug fixes
+ * Avoid removing newlines between docstring headers and rST blocks
+ * Infer indentation with imports when logical indent is absent
+ * Use char index rather than position for indent slice
+ * [flake8-comprehension] Strip parentheses around generators in C400
+ * Mark repeated-isinstance-calls as unsafe on Python 3.10 and later
+* 0.4.6
+ * Breaking changes
+ * Use project-relative paths when calculating GitLab fingerprints
+ * Preview features
+ * [flake8-async] Sleep with >24 hour interval should usually sleep forever (ASYNC116)
+ * Rule changes
+ * [numpy] Add missing functions to NumPy 2.0 migration rule
+ * [mccabe] Consider irrefutable pattern similar to if .. else for C901
+ * Consider match-case statements for C901, PLR0912, and PLR0915
+ * Remove empty strings when converting to f-string (UP032)
+ * [flake8-bandit] request-without-timeout should warn for requests.request
+ * [flake8-self] Ignore sunder accesses in flake8-self rules
+ * [pyupgrade] Lint for TypeAliasType usages (UP040)
+ * Server
+ * Respect excludes in ruff server configuration discovery
+ * Use default settings if initialization options is empty or not provided
+ * ruff server correctly treats .pyi files as stub files
+ * ruff server searches for configuration in parent directories
+ * ruff server: An empty code action filter no longer returns notebook source actions
+ * Bug fixes
+ * [flake8-logging-format] Fix autofix title in logging-warn (G010)
+ * [refurb] Avoid recommending operator.itemgetter with dependence on lambda arguments
+ * [flake8-simplify] Avoid recommending context manager in __enter__ implementations
+ * Create intermediary directories for --output-file
+ * Propagate reads on global variables
+ * Treat all singledispatch arguments as runtime-required
+* 0.4.5
+ * Ruff's language server is now in Beta
+ * Rule changes
+ * [flake8-future-annotations] Reword future-rewritable-type-annotation (FA100) message
+ * [pycodestyle] Consider soft keywords for E27 rules
+ * [pyflakes] Recommend adding unused import bindings to __all__
+ * [pyflakes] Update documentation and deprecate ignore_init_module_imports
+ * [pyupgrade] Mark quotes as unnecessary for non-evaluated annotations
+ * Formatter
+ * Avoid multiline quotes warning with quote-style = preserve
+ * Server
+ * Support Jupyter Notebook files
+ * Support noqa comment code actions
+ * Fix automatic configuration reloading
+ * Fix several issues with configuration in Neovim and Helix
+ * CLI
+ * Add --output-format as a CLI option for ruff config
+ * Bug fixes
+ * Avoid PLE0237 for property with setter
+ * Avoid TCH005 for if stmt with elif/else block
+ * Avoid flagging __future__ annotations as required for non-evaluated type annotations
+ * Check for ruff executable in 'bin' directory as installed by 'pip install --target'.
+ * Sort edits prior to deduplicating in quotation fix
+ * Treat escaped newline as valid sequence
+ * [flake8-pie] Preserve parentheses in unnecessary-dict-kwargs
+ * [pylint] Ignore __slots__ with dynamic values
+ * [pylint] Remove try body from branch counting
+ * [refurb] Respect operator precedence in FURB110
+ * Documentation
+ * Add --preview to the README
+ * Add Python 3.13 to list of allowed Python versions
+ * Simplify Neovim setup documentation
+ * Update CONTRIBUTING.md to reflect the new parser
+ * Update server documentation with new migration guide
+ * [pycodestyle] Clarify motivation for E713 and E714
+ * [pyflakes] Update docs to describe WAI behavior (F541)
+ * [pylint] Clearly indicate what is counted as a branch
+
+-------------------------------------------------------------------
Old:
----
ruff-0.4.4.tar.gz
New:
----
ruff-0.4.8.tar.gz
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Other differences:
------------------
++++++ python-ruff.spec ++++++
--- /var/tmp/diff_new_pack.yHG644/_old 2024-06-07 15:02:37.306144347 +0200
+++ /var/tmp/diff_new_pack.yHG644/_new 2024-06-07 15:02:37.306144347 +0200
@@ -19,7 +19,7 @@
%bcond_without libalternatives
%{?sle15_python_module_pythons}
Name: python-ruff
-Version: 0.4.4
+Version: 0.4.8
Release: 0
Summary: An extremely fast Python linter, written in Rust
License: MIT
++++++ ruff-0.4.4.tar.gz -> ruff-0.4.8.tar.gz ++++++
++++ 42446 lines of diff (skipped)
++++++ vendor.tar.zst ++++++
/work/SRC/openSUSE:Factory/python-ruff/vendor.tar.zst /work/SRC/openSUSE:Factory/.python-ruff.new.24587/vendor.tar.zst differ: char 7, line 1
here is the log from the commit of package python-Faker for openSUSE:Factory checked in at 2024-06-07 15:02:18
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/python-Faker (Old)
and /work/SRC/openSUSE:Factory/.python-Faker.new.24587 (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Package is "python-Faker"
Fri Jun 7 15:02:18 2024 rev:55 rq:1178962 version:25.5.0
Changes:
--------
--- /work/SRC/openSUSE:Factory/python-Faker/python-Faker.changes 2024-05-13 01:24:22.834201470 +0200
+++ /work/SRC/openSUSE:Factory/.python-Faker.new.24587/python-Faker.changes 2024-06-07 15:02:30.813907837 +0200
@@ -1,0 +2,10 @@
+Thu Jun 6 08:44:57 UTC 2024 - Dirk Müller <dmueller(a)suse.com>
+
+- update to 25.5.0:
+ * Fix data in geo for `pl_PL`. Thanks @george0st.
+ * Add landmarks in `geo` for `pl_PL`. Thanks @george0st.
+ * Add more iOS versions to `user_agent`. Thanks @george0st.
+ * Update VAT generation in `nl_BE` to align with correct
+ Belgian format. Thanks @JorisSpruyt.
+
+-------------------------------------------------------------------
Old:
----
Faker-25.1.0.tar.gz
New:
----
Faker-25.5.0.tar.gz
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Other differences:
------------------
++++++ python-Faker.spec ++++++
--- /var/tmp/diff_new_pack.8oltyu/_old 2024-06-07 15:02:31.665938876 +0200
+++ /var/tmp/diff_new_pack.8oltyu/_new 2024-06-07 15:02:31.665938876 +0200
@@ -18,7 +18,7 @@
%{?sle15_python_module_pythons}
Name: python-Faker
-Version: 25.1.0
+Version: 25.5.0
Release: 0
Summary: Python package that generates fake data
License: MIT
++++++ Faker-25.1.0.tar.gz -> Faker-25.5.0.tar.gz ++++++
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/Faker-25.1.0/CHANGELOG.md new/Faker-25.5.0/CHANGELOG.md
--- old/Faker-25.1.0/CHANGELOG.md 2024-05-08 23:21:35.000000000 +0200
+++ new/Faker-25.5.0/CHANGELOG.md 2024-06-04 22:17:38.000000000 +0200
@@ -1,5 +1,21 @@
## Changelog
+### [v25.5.0 - 2024-05-04](https://github.com/joke2k/faker/compare/v25.4.0...v25.5.0)
+
+* Fix data in geo for `pl_PL`. Thanks @george0st.
+
+### [v25.4.0 - 2024-05-03](https://github.com/joke2k/faker/compare/v25.3.0...v25.4.0)
+
+* Add landmarks in `geo` for `pl_PL`. Thanks @george0st.
+
+### [v25.3.0 - 2024-05-28](https://github.com/joke2k/faker/compare/v25.2.0...v25.3.0)
+
+* Add more iOS versions to `user_agent`. Thanks @george0st.
+
+### [v25.2.0 - 2024-05-13](https://github.com/joke2k/faker/compare/v25.1.0...v25.2.0)
+
+* Update VAT generation in `nl_BE` to align with correct Belgian format. Thanks @JorisSpruyt.
+
### [v25.1.0 - 2024-05-08](https://github.com/joke2k/faker/compare/v25.0.1...v25.1.0)
* Add geo for `pl_PL`. Thanks @george0st.
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/Faker-25.1.0/CONTRIBUTING.rst new/Faker-25.5.0/CONTRIBUTING.rst
--- old/Faker-25.1.0/CONTRIBUTING.rst 2024-04-04 16:05:57.000000000 +0200
+++ new/Faker-25.5.0/CONTRIBUTING.rst 2024-06-04 22:16:42.000000000 +0200
@@ -11,6 +11,7 @@
- Clearly describe the issue including steps to reproduce when it is a bug.
- Make sure you fill in the earliest version that you know has the issue.
- Fork the repository on GitHub
+- Please only make changes or add data to locales you're familiar with.
Making Changes
--------------
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/Faker-25.1.0/Faker.egg-info/PKG-INFO new/Faker-25.5.0/Faker.egg-info/PKG-INFO
--- old/Faker-25.1.0/Faker.egg-info/PKG-INFO 2024-05-08 23:27:08.000000000 +0200
+++ new/Faker-25.5.0/Faker.egg-info/PKG-INFO 2024-06-04 22:18:19.000000000 +0200
@@ -1,6 +1,6 @@
Metadata-Version: 2.1
Name: Faker
-Version: 25.1.0
+Version: 25.5.0
Summary: Faker is a Python package that generates fake data for you.
Home-page: https://github.com/joke2k/faker
Author: joke2k
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/Faker-25.1.0/PKG-INFO new/Faker-25.5.0/PKG-INFO
--- old/Faker-25.1.0/PKG-INFO 2024-05-08 23:27:09.551867500 +0200
+++ new/Faker-25.5.0/PKG-INFO 2024-06-04 22:18:20.182764000 +0200
@@ -1,6 +1,6 @@
Metadata-Version: 2.1
Name: Faker
-Version: 25.1.0
+Version: 25.5.0
Summary: Faker is a Python package that generates fake data for you.
Home-page: https://github.com/joke2k/faker
Author: joke2k
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/Faker-25.1.0/VERSION new/Faker-25.5.0/VERSION
--- old/Faker-25.1.0/VERSION 2024-05-08 23:22:33.000000000 +0200
+++ new/Faker-25.5.0/VERSION 2024-06-04 22:18:07.000000000 +0200
@@ -1 +1 @@
-25.1.0
+25.5.0
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/Faker-25.1.0/faker/__init__.py new/Faker-25.5.0/faker/__init__.py
--- old/Faker-25.1.0/faker/__init__.py 2024-05-08 23:22:33.000000000 +0200
+++ new/Faker-25.5.0/faker/__init__.py 2024-06-04 22:18:07.000000000 +0200
@@ -2,6 +2,6 @@
from faker.generator import Generator
from faker.proxy import Faker
-VERSION = "25.1.0"
+VERSION = "25.5.0"
__all__ = ("Factory", "Generator", "Faker")
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/Faker-25.1.0/faker/providers/address/it_IT/__init__.py new/Faker-25.5.0/faker/providers/address/it_IT/__init__.py
--- old/Faker-25.1.0/faker/providers/address/it_IT/__init__.py 2023-02-28 21:39:57.000000000 +0100
+++ new/Faker-25.5.0/faker/providers/address/it_IT/__init__.py 2024-06-03 16:06:05.000000000 +0200
@@ -4,11 +4,7 @@
def getcities(fulldict):
- cities = []
- for cap in fulldict:
- for c in fulldict[cap]:
- cities.append(c[0]) if c[0] not in cities else cities
- return cities
+ return list({c[0] for _cap, cities in fulldict.items() for c in cities})
class Provider(AddressProvider):
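The refactor above collapses the old append-if-absent loop into a single set comprehension. As a hedged illustration (the CAP-to-cities mapping below is made up for the example, not Faker's real it_IT data), a miniature version behaves like this:

```python
# Hypothetical CAP -> [(city, province), ...] mapping in the shape the helper
# expects; the real Faker it_IT data set is far larger.
fulldict = {
    "00118": [("Roma", "RM")],
    "00121": [("Roma", "RM"), ("Ostia", "RM")],
}

def getcities(fulldict):
    # Same one-liner as the refactor: the set comprehension dedupes city
    # names; list() then materialises them. Note the set drops the original
    # insertion order that the old loop preserved.
    return list({c[0] for _cap, cities in fulldict.items() for c in cities})

print(sorted(getcities(fulldict)))  # ['Ostia', 'Roma']
```

Sorting in the example only makes the output deterministic; the helper itself returns the unique city names in arbitrary order.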
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/Faker-25.1.0/faker/providers/file/__init__.py new/Faker-25.5.0/faker/providers/file/__init__.py
--- old/Faker-25.1.0/faker/providers/file/__init__.py 2024-04-29 17:31:08.000000000 +0200
+++ new/Faker-25.5.0/faker/providers/file/__init__.py 2024-06-03 16:06:05.000000000 +0200
@@ -337,7 +337,7 @@
if prefix is None:
prefix = self.random_element(self.unix_device_prefixes)
suffix: str = self.random_element(string.ascii_lowercase)
- path = "/dev/{}{}".format(prefix, suffix)
+ path = f"/dev/{prefix}{suffix}"
return path
def unix_partition(self, prefix: Optional[str] = None) -> str:
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/Faker-25.1.0/faker/providers/geo/pl_PL/__init__.py new/Faker-25.5.0/faker/providers/geo/pl_PL/__init__.py
--- old/Faker-25.1.0/faker/providers/geo/pl_PL/__init__.py 2024-05-08 23:19:51.000000000 +0200
+++ new/Faker-25.5.0/faker/providers/geo/pl_PL/__init__.py 2024-06-04 22:16:42.000000000 +0200
@@ -5,25 +5,44 @@
# Source:
# https://latitude.to/map/pl/poland/cities/
land_coords = (
- ("52.22977", "21.01178", "Warsaw", "PL", "Europe/Warsaw"),
- ("51.75", "19.46667", "Łódź", "PL", "Europe/Warsaw"),
- ("50.06143", "19.93658", "Krakow", "PL", "Europe/Warsaw"),
- ("51.1", "17.03333", "Wrocław", "PL", "Europe/Warsaw"),
- ("52.40692", "16.92993", "Poznań", "PL", "Europe/Warsaw"),
- ("54.35205", "18.64637", "Gdańsk", "PL", "Europe/Warsaw"),
- ("53.42894", "14.55302", "Szczecin", "PL", "Europe/Warsaw"),
- ("53.1235", "18.00762", "Bydgoszcz", "PL", "Europe/Warsaw"),
- ("51.25", "22.56667", "Lublin", "PL", "Europe/Warsaw"),
- ("50.25841", "19.02754", "Katowice", "PL", "Europe/Warsaw"),
- ("53.13333", "23.16433", "Białystok", "PL", "Europe/Warsaw"),
- ("54.51889", "18.53188", "Gdynia", "PL", "Europe/Warsaw"),
- ("50.79646", "19.12409", "Częstochowa", "PL", "Europe/Warsaw"),
- ("50.28682", "19.10385", "Sosnowiec", "PL", "Europe/Warsaw"),
- ("51.40253", "21.14714", "Radom", "PL", "Europe/Warsaw"),
- ("52.1934", "21.03487", "Mokotów", "PL", "Europe/Warsaw"),
- ("53.01375", "18.59814", "Toruń", "PL", "Europe/Warsaw"),
- ("50.87033", "20.62752", "Kielce", "PL", "Europe/Warsaw"),
- ("50.29761", "18.67658", "Gliwice", "PL", "Europe/Warsaw"),
- ("50.32492", "18.78576", "Zabrze", "PL", "Europe/Warsaw"),
- ("50.34802", "18.93282", "Bytom", "PL", "Europe/Warsaw"),
+ ("52.22977", "21.01178", "Warsaw", "PL", "Europe/Warszawa"),
+ ("51.75", "19.46667", "Łódź", "PL", "Europe/Warszawa"),
+ ("50.06143", "19.93658", "Krakow", "PL", "Europe/Warszawa"),
+ ("51.1", "17.03333", "Wrocław", "PL", "Europe/Warszawa"),
+ ("52.40692", "16.92993", "Poznań", "PL", "Europe/Warszawa"),
+ ("54.35205", "18.64637", "Gdańsk", "PL", "Europe/Warszawa"),
+ ("53.42894", "14.55302", "Szczecin", "PL", "Europe/Warszawa"),
+ ("53.1235", "18.00762", "Bydgoszcz", "PL", "Europe/Warszawa"),
+ ("51.25", "22.56667", "Lublin", "PL", "Europe/Warszawa"),
+ ("50.25841", "19.02754", "Katowice", "PL", "Europe/Warszawa"),
+ ("53.13333", "23.16433", "Białystok", "PL", "Europe/Warszawa"),
+ ("54.51889", "18.53188", "Gdynia", "PL", "Europe/Warszawa"),
+ ("50.79646", "19.12409", "Częstochowa", "PL", "Europe/Warszawa"),
+ ("50.28682", "19.10385", "Sosnowiec", "PL", "Europe/Warszawa"),
+ ("51.40253", "21.14714", "Radom", "PL", "Europe/Warszawa"),
+ ("52.1934", "21.03487", "Mokotów", "PL", "Europe/Warszawa"),
+ ("53.01375", "18.59814", "Toruń", "PL", "Europe/Warszawa"),
+ ("50.87033", "20.62752", "Kielce", "PL", "Europe/Warszawa"),
+ ("50.29761", "18.67658", "Gliwice", "PL", "Europe/Warszawa"),
+ ("50.32492", "18.78576", "Zabrze", "PL", "Europe/Warszawa"),
+ ("50.34802", "18.93282", "Bytom", "PL", "Europe/Warszawa"),
+ ("49.82245", "19.04686", "Bielsko-Biala", "PL", "Europe/Warszawa"),
+ ("53.77995", "20.49416", "Olsztyn", "PL", "Europe/Warszawa"),
+ ("50.04132", "21.99901", "Rzeszów", "PL", "Europe/Warszawa"),
+ ("52.15051", "21.05041", "Ursynów", "PL", "Europe/Warszawa"),
+ ("50.2584", "18.85632", "Ruda Śląska", "PL", "Europe/Warszawa"),
+ ("52.2401", "20.98869", "Wola", "PL", "Europe/Warszawa"),
+ ("50.09713", "18.54179", "Rybnik", "PL", "Europe/Warszawa"),
+ ("52.29242", "20.93531", "Bielany", "PL", "Europe/Warszawa"),
+ ("50.31818", "19.2374", "Dąbrowa Górnicza", "PL", "Europe/Warszawa"),
+ ("50.13717", "18.96641", "Tychy", "PL", "Europe/Warszawa"),
+ ("50.67211", "17.92533", "Opole", "PL", "Europe/Warszawa"),
+ ("54.1522", "19.40884", "Elblag", "PL", "Europe/Warszawa"),
+ ("52.54682", "19.70638", "Płock", "PL", "Europe/Warszawa"),
+ ("50.77141", "16.28432", "Wałbrzych", "PL", "Europe/Warszawa"),
+ ("52.73679", "15.22878", "Gorzów Wielkopolski", "PL", "Europe/Warszawa"),
+ ("52.29185", "21.04845", "Targówek", "PL", "Europe/Warszawa"),
+ ("52.64817", "19.0678", "Włocławek", "PL", "Europe/Warszawa"),
+ ("51.93548", "15.50643", "Zielona Góra", "PL", "Europe/Warszawa"),
+ ("50.01381", "20.98698", "Tarnów", "PL", "Europe/Warszawa"),
)
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/Faker-25.1.0/faker/providers/ssn/nl_BE/__init__.py new/Faker-25.5.0/faker/providers/ssn/nl_BE/__init__.py
--- old/Faker-25.1.0/faker/providers/ssn/nl_BE/__init__.py 2023-02-09 21:33:45.000000000 +0100
+++ new/Faker-25.5.0/faker/providers/ssn/nl_BE/__init__.py 2024-05-13 18:01:49.000000000 +0200
@@ -57,8 +57,18 @@
vat_id_formats = ("BE##########",)
def vat_id(self) -> str:
+ vat_id_random_section = "#######"
+
+ vat_id_possible_initial_numbers = ("0", "1")
"""
http://ec.europa.eu/taxation_customs/vies/faq.html#item_11
- :return: A random Belgian VAT ID
+ https://en.wikipedia.org/wiki/VAT_identification_number
+ :return: A random Belgian VAT ID starting with 0 or 1 and has a correct checksum with a modulo 97 check
"""
- return self.bothify(self.random_element(self.vat_id_formats))
+ generated_initial_number: str = self.random_element(vat_id_possible_initial_numbers)
+ vat_without_check = self.bothify(f"{generated_initial_number}{vat_id_random_section}")
+ vat_as_int = int(vat_without_check)
+ vat_check = 97 - (vat_as_int % 97)
+ vat_check_str = f"{vat_check:0>2}"
+
+ return f"BE{vat_without_check}{vat_check_str}"
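The new generator enforces the Belgian rule that the last two digits are a modulo-97 check over the first eight. A standalone sketch of that rule (a hypothetical re-implementation for illustration, not Faker's actual provider code):

```python
import random

def be_vat_id(rng: random.Random) -> str:
    # First digit 0 or 1, then seven random digits, as in the new generator.
    body = rng.choice("01") + "".join(rng.choice("0123456789") for _ in range(7))
    # Check digits: 97 minus (first eight digits mod 97), zero-padded to two.
    check = 97 - (int(body) % 97)
    return f"BE{body}{check:02d}"

vat = be_vat_id(random.Random(42))
# Every ID built this way satisfies the standard validation rule:
assert 97 - (int(vat[2:10]) % 97) == int(vat[10:12])
print(vat)
```

Because the check value lies in 1..97, `{check:02d}` always yields exactly two digits, so the result is always "BE" followed by ten digits.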
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/Faker-25.1.0/faker/providers/user_agent/__init__.py new/Faker-25.5.0/faker/providers/user_agent/__init__.py
--- old/Faker-25.1.0/faker/providers/user_agent/__init__.py 2024-04-29 17:31:08.000000000 +0200
+++ new/Faker-25.5.0/faker/providers/user_agent/__init__.py 2024-06-03 16:06:04.000000000 +0200
@@ -117,11 +117,16 @@
# sources
# https://en.wikipedia.org/wiki/IOS_version_history
ios_versions: ElementsType[str] = (
+ "1.1.5",
+ "2.2.1",
"3.1.3",
+ "3.2.2",
"4.2.1",
+ "4.3.5",
"5.1.1",
"6.1.6",
"7.1.2",
+ "8.4.1",
"9.3.5",
"9.3.6",
"10.3.3",
@@ -129,12 +134,15 @@
"11.4.1",
"12.4.4",
"12.4.8",
+ "12.5.7",
"13.5.1",
+ "13.7",
"14.2",
"14.2.1",
"14.8.1",
"15.8.2",
"16.7.6",
+ "16.7.7",
"17.1",
"17.1.1",
"17.1.2",
@@ -143,6 +151,7 @@
"17.3",
"17.3.1",
"17.4",
+ "17.4.1",
)
def mac_processor(self) -> str:
here is the log from the commit of package openvswitch for openSUSE:Factory checked in at 2024-06-07 15:02:15
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/openvswitch (Old)
and /work/SRC/openSUSE:Factory/.openvswitch.new.24587 (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Package is "openvswitch"
Fri Jun 7 15:02:15 2024 rev:75 rq:1178928 version:unknown
Changes:
--------
--- /work/SRC/openSUSE:Factory/openvswitch/openvswitch.changes 2024-03-03 20:19:05.577334762 +0100
+++ /work/SRC/openSUSE:Factory/.openvswitch.new.24587/openvswitch.changes 2024-06-07 15:02:25.841726701 +0200
@@ -1,0 +2,8 @@
+Tue Jun 4 09:48:39 UTC 2024 - Martin Jambor <mjambor(a)suse.com>
+
+- GCC 14 started to advertise c_atomic extension, older versions
+ didn't do that. Add check for __clang__, so GCC doesn't include
+ headers designed for Clang
+ (openvswitch-2.17.8-gcc14-build-fix.patch) [boo#1225906]
+
+-------------------------------------------------------------------
New:
----
openvswitch-2.17.8-gcc14-build-fix.patch
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Other differences:
------------------
++++++ openvswitch.spec ++++++
--- /var/tmp/diff_new_pack.6vJL30/_old 2024-06-07 15:02:27.149774353 +0200
+++ /var/tmp/diff_new_pack.6vJL30/_new 2024-06-07 15:02:27.153774499 +0200
@@ -83,6 +83,8 @@
Patch6: CVE-2023-5366.patch
# Fix CVE-2023-3966 [bsc#1219465] -- Invalid memory access in Geneve with HW offload
Patch7: openvswitch-CVE-2023-3966.patch
+# boo#1225906: Restore build with gcc14
+Patch8: openvswitch-2.17.8-gcc14-build-fix.patch
#OVN patches
# PATCH-FIX-OPENSUSE: 0001-Run-ovn-as-openvswitch-openvswitch.patch
Patch20: 0001-Run-ovn-as-openvswitch-openvswitch.patch
@@ -133,9 +135,9 @@
BuildRequires: python3-rpm-macros
BuildRequires: systemd-units
Requires(post): systemd-units
-Requires(postun):systemd-units
+Requires(postun): systemd-units
Requires(pre): shadow-utils
-Requires(preun):systemd-units
+Requires(preun): systemd-units
%endif
# Needed by the testsuite
%if %{with check}
@@ -425,6 +427,7 @@
%patch -P 5 -p1
%patch -P 6 -p1
%patch -P 7 -p1
+%patch -P 8 -p1
# remove python/ovs/dirs.py - this is generated from template to have proper paths
rm python/ovs/dirs.py
cd %{ovn_dir}
++++++ openvswitch-2.17.8-gcc14-build-fix.patch ++++++
From 335a5deac3ff91448ca14651e92f39dfdd512fcf Mon Sep 17 00:00:00 2001
From: Ilya Maximets <i.maximets(a)ovn.org>
Date: Thu, 18 Jan 2024 15:59:05 +0100
Subject: [PATCH] ovs-atomic: Fix inclusion of Clang header by GCC 14.
GCC 14 started to advertise c_atomic extension, older versions didn't
do that. Add check for __clang__, so GCC doesn't include headers
designed for Clang.
Another option would be to prefer stdatomic implementation instead,
but some older versions of Clang are not able to use stdatomic.h
supplied by GCC as described in commit:
07ece367fb5f ("ovs-atomic: Prefer Clang intrinsics over <stdatomic.h>.")
This change fixes OVS build with GCC on Fedora Rawhide (40).
Reported-by: Jakob Meng <code(a)jakobmeng.de>
Acked-by: Jakob Meng <jmeng(a)redhat.com>
Acked-by: Eelco Chaudron <echaudro(a)redhat.com>
Acked-by: Simon Horman <horms(a)ovn.org>
Signed-off-by: Ilya Maximets <i.maximets(a)ovn.org>
---
lib/ovs-atomic.h | 2 +-
1 file changed, 1 insertion(+), 1 deletion(-)
diff --git a/lib/ovs-atomic.h b/lib/ovs-atomic.h
index ab9ce6b2e0f..f140d25feba 100644
--- a/lib/ovs-atomic.h
+++ b/lib/ovs-atomic.h
@@ -328,7 +328,7 @@
#if __CHECKER__
/* sparse doesn't understand some GCC extensions we use. */
#include "ovs-atomic-pthreads.h"
- #elif __has_extension(c_atomic)
+ #elif __clang__ && __has_extension(c_atomic)
#include "ovs-atomic-clang.h"
#elif HAVE_ATOMIC && __cplusplus >= 201103L
#include "ovs-atomic-c++.h"
here is the log from the commit of package libinput for openSUSE:Factory checked in at 2024-06-07 15:02:09
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/libinput (Old)
and /work/SRC/openSUSE:Factory/.libinput.new.24587 (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Package is "libinput"
Fri Jun 7 15:02:09 2024 rev:119 rq:1178914 version:1.26.0
Changes:
--------
--- /work/SRC/openSUSE:Factory/libinput/libinput.changes 2024-03-26 19:29:29.369855706 +0100
+++ /work/SRC/openSUSE:Factory/.libinput.new.24587/libinput.changes 2024-06-07 15:02:18.269450845 +0200
@@ -1,0 +2,12 @@
+Thu Jun 6 07:04:23 UTC 2024 - Jan Engelhardt <jengelh(a)inai.de>
+
+- Update to release 1.26
+ * Touchpads can now configure a clickfinger button map
+ * Tablet pads now have an API for relative dials.
+ * A new configuration option for tablet tools allow reducing
+ the available logical range.
+ * Tablet tools can now use BTN_STYLUS3 too and tablet pad strip
+ support should now work for non-Wacom devices, for systems
+ where the kernel driver implements it.
+
+-------------------------------------------------------------------
Old:
----
libinput-1.25.0.tar.gz
New:
----
libinput-1.26.0.tar.gz
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Other differences:
------------------
++++++ libinput.spec ++++++
--- /var/tmp/diff_new_pack.V398ct/_old 2024-06-07 15:02:19.169483633 +0200
+++ /var/tmp/diff_new_pack.V398ct/_new 2024-06-07 15:02:19.173483779 +0200
@@ -37,7 +37,7 @@
%define lname libinput10
%define pname libinput
Name: libinput%{?xsuffix}
-Version: 1.25.0
+Version: 1.26.0
Release: 0
Summary: Input device and event processing library
License: MIT
++++++ libinput-1.25.0.tar.gz -> libinput-1.26.0.tar.gz ++++++
++++ 5427 lines of diff (skipped)
here is the log from the commit of package python-coverage for openSUSE:Factory checked in at 2024-06-07 15:02:08
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/python-coverage (Old)
and /work/SRC/openSUSE:Factory/.python-coverage.new.24587 (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Package is "python-coverage"
Fri Jun 7 15:02:08 2024 rev:63 rq:1178912 version:7.5.3
Changes:
--------
--- /work/SRC/openSUSE:Factory/python-coverage/python-coverage.changes 2024-05-16 17:12:31.694165865 +0200
+++ /work/SRC/openSUSE:Factory/.python-coverage.new.24587/python-coverage.changes 2024-06-07 15:02:15.501350003 +0200
@@ -1,0 +2,30 @@
+Thu Jun 6 07:29:28 UTC 2024 - Dirk Müller <dmueller(a)suse.com>
+
+- update to 7.5.3:
+ * Performance improvements for combining data files, especially
+ when measuring line coverage. A few different quadratic
+ behaviors were eliminated. In one extreme case of combining
+ 700+ data files, the time dropped from more than three hours
+ to seven minutes. Thanks for Kraken Tech for funding the
+ fix.
+ * Performance improvements for generating HTML reports, with a
+ side benefit of reducing memory use, closing issue 1791.
+ Thanks to Daniel Diniz for helping to diagnose the problem.
+ * Fix: nested matches of exclude patterns could exclude too
+ much code, as reported in issue 1779. This is now fixed.
+ * Changed: previously, coverage.py would consider a module
+ docstring to be an executable statement if it appeared after
+ line 1 in the file, but not executable if it was the first
+ line. Now module docstrings are never counted as executable
+ statements. This can change coverage.py's count of the
+ number of statements in a file, which can slightly change the
+ coverage percentage reported.
+ * In the HTML report, the filter term and "hide covered"
+ checkbox settings are remembered between viewings, thanks to
+ Daniel Diniz.
+ * Python 3.13.0b1 is supported.
+ * Fix: parsing error handling is improved to ensure bizarre
+ source files are handled gracefully, and to unblock oss-fuzz
+ fuzzing, thanks to Liam DeVoe. Closes issue 1787.
+
+-------------------------------------------------------------------
Old:
----
coverage-7.5.1.tar.gz
New:
----
coverage-7.5.3.tar.gz
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Other differences:
------------------
++++++ python-coverage.spec ++++++
--- /var/tmp/diff_new_pack.XK7TeO/_old 2024-06-07 15:02:17.021405379 +0200
+++ /var/tmp/diff_new_pack.XK7TeO/_new 2024-06-07 15:02:17.021405379 +0200
@@ -18,7 +18,7 @@
%{?sle15_python_module_pythons}
Name: python-coverage
-Version: 7.5.1
+Version: 7.5.3
Release: 0
Summary: Code coverage measurement for Python
License: Apache-2.0
++++++ coverage-7.5.1.tar.gz -> coverage-7.5.3.tar.gz ++++++
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/coverage-7.5.1/.github/workflows/python-nightly.yml new/coverage-7.5.3/.github/workflows/python-nightly.yml
--- old/coverage-7.5.1/.github/workflows/python-nightly.yml 2024-05-04 16:44:25.000000000 +0200
+++ new/coverage-7.5.3/.github/workflows/python-nightly.yml 2024-05-28 15:52:29.000000000 +0200
@@ -58,6 +58,7 @@
# https://launchpad.net/~deadsnakes/+archive/ubuntu/nightly/+packages
- "3.12-dev"
- "3.13-dev"
+ - "3.14-dev"
# https://github.com/actions/setup-python#available-versions-of-pypy
- "pypy-3.8-nightly"
- "pypy-3.9-nightly"
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/coverage-7.5.1/CHANGES.rst new/coverage-7.5.3/CHANGES.rst
--- old/coverage-7.5.1/CHANGES.rst 2024-05-04 16:44:25.000000000 +0200
+++ new/coverage-7.5.3/CHANGES.rst 2024-05-28 15:52:29.000000000 +0200
@@ -22,6 +22,53 @@
.. scriv-start-here
+.. _changes_7-5-3:
+
+Version 7.5.3 — 2024-05-28
+--------------------------
+
+- Performance improvements for combining data files, especially when measuring
+ line coverage. A few different quadratic behaviors were eliminated. In one
+ extreme case of combining 700+ data files, the time dropped from more than
+ three hours to seven minutes. Thanks for Kraken Tech for funding the fix.
+
+- Performance improvements for generating HTML reports, with a side benefit of
+ reducing memory use, closing `issue 1791`_. Thanks to Daniel Diniz for
+ helping to diagnose the problem.
+
+.. _issue 1791: https://github.com/nedbat/coveragepy/issues/1791
+
+
+.. _changes_7-5-2:
+
+Version 7.5.2 — 2024-05-24
+--------------------------
+
+- Fix: nested matches of exclude patterns could exclude too much code, as
+ reported in `issue 1779`_. This is now fixed.
+
+- Changed: previously, coverage.py would consider a module docstring to be an
+ executable statement if it appeared after line 1 in the file, but not
+ executable if it was the first line. Now module docstrings are never counted
+ as executable statements. This can change coverage.py's count of the number
+ of statements in a file, which can slightly change the coverage percentage
+ reported.
+
+- In the HTML report, the filter term and "hide covered" checkbox settings are
+ remembered between viewings, thanks to `Daniel Diniz <pull 1776_>`_.
+
+- Python 3.13.0b1 is supported.
+
+- Fix: parsing error handling is improved to ensure bizarre source files are
+ handled gracefully, and to unblock oss-fuzz fuzzing, thanks to `Liam DeVoe
+ <pull 1788_>`_. Closes `issue 1787`_.
+
+.. _pull 1776: https://github.com/nedbat/coveragepy/pull/1776
+.. _issue 1779: https://github.com/nedbat/coveragepy/issues/1779
+.. _issue 1787: https://github.com/nedbat/coveragepy/issues/1787
+.. _pull 1788: https://github.com/nedbat/coveragepy/pull/1788
+
+
.. _changes_7-5-1:
Version 7.5.1 — 2024-05-04
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/coverage-7.5.1/CONTRIBUTORS.txt new/coverage-7.5.3/CONTRIBUTORS.txt
--- old/coverage-7.5.1/CONTRIBUTORS.txt 2024-05-04 16:44:25.000000000 +0200
+++ new/coverage-7.5.3/CONTRIBUTORS.txt 2024-05-28 15:52:29.000000000 +0200
@@ -132,6 +132,7 @@
Leonardo Pistone
Lewis Gaul
Lex Berezhny
+Liam DeVoe
Loïc Dachary
Lorenzo Micò
Louis Heredero
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/coverage-7.5.1/PKG-INFO new/coverage-7.5.3/PKG-INFO
--- old/coverage-7.5.1/PKG-INFO 2024-05-04 16:44:36.584669000 +0200
+++ new/coverage-7.5.3/PKG-INFO 2024-05-28 15:52:36.482906000 +0200
@@ -1,12 +1,12 @@
Metadata-Version: 2.1
Name: coverage
-Version: 7.5.1
+Version: 7.5.3
Summary: Code coverage measurement for Python
Home-page: https://github.com/nedbat/coveragepy
-Author: Ned Batchelder and 226 others
+Author: Ned Batchelder and 227 others
Author-email: ned(a)nedbatchelder.com
License: Apache-2.0
-Project-URL: Documentation, https://coverage.readthedocs.io/en/7.5.1
+Project-URL: Documentation, https://coverage.readthedocs.io/en/7.5.3
Project-URL: Funding, https://tidelift.com/subscription/pkg/pypi-coverage?utm_source=pypi-coverag…
Project-URL: Issues, https://github.com/nedbat/coveragepy/issues
Project-URL: Mastodon, https://hachyderm.io/@coveragepy
@@ -62,13 +62,13 @@
.. PYVERSIONS
-* Python 3.8 through 3.12, and 3.13.0a6 and up.
+* Python 3.8 through 3.12, and 3.13.0b1 and up.
* PyPy3 versions 3.8 through 3.10.
Documentation is on `Read the Docs`_. Code repository and issue tracker are on
`GitHub`_.
-.. _Read the Docs: https://coverage.readthedocs.io/en/7.5.1/
+.. _Read the Docs: https://coverage.readthedocs.io/en/7.5.3/
.. _GitHub: https://github.com/nedbat/coveragepy
**New in 7.x:**
@@ -112,7 +112,7 @@
Looking to run ``coverage`` on your test suite? See the `Quick Start section`_
of the docs.
-.. _Quick Start section: https://coverage.readthedocs.io/en/7.5.1/#quick-start
+.. _Quick Start section: https://coverage.readthedocs.io/en/7.5.3/#quick-start
Change history
@@ -120,7 +120,7 @@
The complete history of changes is on the `change history page`_.
-.. _change history page: https://coverage.readthedocs.io/en/7.5.1/changes.html
+.. _change history page: https://coverage.readthedocs.io/en/7.5.3/changes.html
Code of Conduct
@@ -139,7 +139,7 @@
Found a bug? Want to help improve the code or documentation? See the
`Contributing section`_ of the docs.
-.. _Contributing section: https://coverage.readthedocs.io/en/7.5.1/contributing.html
+.. _Contributing section: https://coverage.readthedocs.io/en/7.5.3/contributing.html
Security
@@ -167,7 +167,7 @@
:target: https://github.com/nedbat/coveragepy/actions/workflows/quality.yml
:alt: Quality check status
.. |docs| image:: https://readthedocs.org/projects/coverage/badge/?version=latest&style=flat
- :target: https://coverage.readthedocs.io/en/7.5.1/
+ :target: https://coverage.readthedocs.io/en/7.5.3/
:alt: Documentation
.. |kit| image:: https://img.shields.io/pypi/v/coverage
:target: https://pypi.org/project/coverage/
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/coverage-7.5.1/README.rst new/coverage-7.5.3/README.rst
--- old/coverage-7.5.1/README.rst 2024-05-04 16:44:25.000000000 +0200
+++ new/coverage-7.5.3/README.rst 2024-05-28 15:52:29.000000000 +0200
@@ -25,7 +25,7 @@
.. PYVERSIONS
-* Python 3.8 through 3.12, and 3.13.0a6 and up.
+* Python 3.8 through 3.12, and 3.13.0b1 and up.
* PyPy3 versions 3.8 through 3.10.
Documentation is on `Read the Docs`_. Code repository and issue tracker are on
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/coverage-7.5.1/coverage/control.py new/coverage-7.5.3/coverage/control.py
--- old/coverage-7.5.1/coverage/control.py 2024-05-04 16:44:25.000000000 +0200
+++ new/coverage-7.5.3/coverage/control.py 2024-05-28 15:52:29.000000000 +0200
@@ -998,7 +998,7 @@
if self.config.paths:
mapped_data = CoverageData(warn=self._warn, debug=self._debug, no_disk=True)
if self._data is not None:
- mapped_data.update(self._data, aliases=self._make_aliases())
+ mapped_data.update(self._data, map_path=self._make_aliases().map)
self._data = mapped_data
def report(
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/coverage-7.5.1/coverage/data.py new/coverage-7.5.3/coverage/data.py
--- old/coverage-7.5.1/coverage/data.py 2024-05-04 16:44:25.000000000 +0200
+++ new/coverage-7.5.3/coverage/data.py 2024-05-28 15:52:29.000000000 +0200
@@ -12,6 +12,7 @@
from __future__ import annotations
+import functools
import glob
import hashlib
import os.path
@@ -134,6 +135,11 @@
if strict and not files_to_combine:
raise NoDataError("No data to combine")
+ if aliases is None:
+ map_path = None
+ else:
+ map_path = functools.lru_cache(maxsize=None)(aliases.map)
+
file_hashes = set()
combined_any = False
@@ -176,7 +182,7 @@
message(f"Couldn't combine data file {rel_file_name}: {exc}")
delete_this_one = False
else:
- data.update(new_data, aliases=aliases)
+ data.update(new_data, map_path=map_path)
combined_any = True
if message:
message(f"Combined data file {rel_file_name}")
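The combine path now memoizes the alias mapping: ``functools.lru_cache(maxsize=None)`` wraps the bound ``aliases.map`` method so each distinct path is translated only once per combine. A rough sketch of the pattern with a stand-in mapper (not coverage.py's ``PathAliases``):

```python
import functools

class PathMapper:
    """Stand-in for an alias mapper whose map() may be expensive."""
    def __init__(self) -> None:
        self.calls = 0

    def map(self, path: str) -> str:
        self.calls += 1
        return path.replace("\\", "/")

mapper = PathMapper()
# Memoize the bound method: repeated paths hit the cache, not map().
map_path = functools.lru_cache(maxsize=None)(mapper.map)

for p in ["a\\b.py", "a\\b.py", "c.py", "a\\b.py"]:
    map_path(p)

print(mapper.calls)  # 2: only the two distinct paths were actually mapped
```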
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/coverage-7.5.1/coverage/html.py new/coverage-7.5.3/coverage/html.py
--- old/coverage-7.5.1/coverage/html.py 2024-05-04 16:44:25.000000000 +0200
+++ new/coverage-7.5.3/coverage/html.py 2024-05-28 15:52:29.000000000 +0200
@@ -597,7 +597,7 @@
"regions": index_page.summaries,
"totals": index_page.totals,
"noun": index_page.noun,
- "column2": index_page.noun if index_page.noun != "file" else "",
+ "region_noun": index_page.noun if index_page.noun != "file" else "",
"skip_covered": self.skip_covered,
"skipped_covered_msg": skipped_covered_msg,
"skipped_empty_msg": skipped_empty_msg,
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/coverage-7.5.1/coverage/htmlfiles/coverage_html.js new/coverage-7.5.3/coverage/htmlfiles/coverage_html.js
--- old/coverage-7.5.1/coverage/htmlfiles/coverage_html.js 2024-05-04 16:44:25.000000000 +0200
+++ new/coverage-7.5.3/coverage/htmlfiles/coverage_html.js 2024-05-28 15:52:29.000000000 +0200
@@ -125,6 +125,16 @@
// Create the events for the filter box.
coverage.wire_up_filter = function () {
+ // Populate the filter and hide100 inputs if there are saved values for them.
+ const saved_filter_value = localStorage.getItem(coverage.FILTER_STORAGE);
+ if (saved_filter_value) {
+ document.getElementById("filter").value = saved_filter_value;
+ }
+ const saved_hide100_value = localStorage.getItem(coverage.HIDE100_STORAGE);
+ if (saved_hide100_value) {
+ document.getElementById("hide100").checked = JSON.parse(saved_hide100_value);
+ }
+
// Cache elements.
const table = document.querySelector("table.index");
const table_body_rows = table.querySelectorAll("tbody tr");
@@ -138,8 +148,12 @@
totals[totals.length - 1] = { "numer": 0, "denom": 0 }; // nosemgrep: eslint.detect-object-injection
var text = document.getElementById("filter").value;
+ // Store filter value
+ localStorage.setItem(coverage.FILTER_STORAGE, text);
const casefold = (text === text.toLowerCase());
const hide100 = document.getElementById("hide100").checked;
+ // Store hide value.
+ localStorage.setItem(coverage.HIDE100_STORAGE, JSON.stringify(hide100));
// Hide / show elements.
table_body_rows.forEach(row => {
@@ -240,6 +254,8 @@
document.getElementById("filter").dispatchEvent(new Event("input"));
document.getElementById("hide100").dispatchEvent(new Event("input"));
};
+coverage.FILTER_STORAGE = "COVERAGE_FILTER_VALUE";
+coverage.HIDE100_STORAGE = "COVERAGE_HIDE100_VALUE";
// Set up the click-to-sort columns.
coverage.wire_up_sorting = function () {
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/coverage-7.5.1/coverage/htmlfiles/index.html new/coverage-7.5.3/coverage/htmlfiles/index.html
--- old/coverage-7.5.1/coverage/htmlfiles/index.html 2024-05-04 16:44:25.000000000 +0200
+++ new/coverage-7.5.3/coverage/htmlfiles/index.html 2024-05-28 15:52:29.000000000 +0200
@@ -31,7 +31,7 @@
<div class="keyhelp">
<p>
<kbd>f</kbd>
- {% if column2 %}
+ {% if region_noun %}
<kbd>n</kbd>
{% endif %}
<kbd>s</kbd>
@@ -83,8 +83,8 @@
{# The title="" attr doesn't work in Safari. #}
<tr class="tablehead" title="Click to sort">
<th id="file" class="name left" aria-sort="none" data-shortcut="f">File<span class="arrows"></span></th>
- {% if column2 %}
- <th id="region" class="name left" aria-sort="none" data-default-sort-order="ascending" data-shortcut="n">{{ column2 }}<span class="arrows"></span></th>
+ {% if region_noun %}
+ <th id="region" class="name left" aria-sort="none" data-default-sort-order="ascending" data-shortcut="n">{{ region_noun }}<span class="arrows"></span></th>
{% endif %}
<th id="statements" aria-sort="none" data-default-sort-order="descending" data-shortcut="s">statements<span class="arrows"></span></th>
<th id="missing" aria-sort="none" data-default-sort-order="descending" data-shortcut="m">missing<span class="arrows"></span></th>
@@ -100,7 +100,7 @@
{% for region in regions %}
<tr class="region">
<td class="name left"><a href="{{region.url}}">{{region.file}}</a></td>
- {% if column2 %}
+ {% if region_noun %}
<td class="name left"><a href="{{region.url}}">{{region.description}}</a></td>
{% endif %}
<td>{{region.nums.n_statements}}</td>
@@ -117,7 +117,7 @@
<tfoot>
<tr class="total">
<td class="name left">Total</td>
- {% if column2 %}
+ {% if region_noun %}
<td class="name left"> </td>
{% endif %}
<td>{{totals.n_statements}}</td>
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/coverage-7.5.1/coverage/misc.py new/coverage-7.5.3/coverage/misc.py
--- old/coverage-7.5.1/coverage/misc.py 2024-05-04 16:44:25.000000000 +0200
+++ new/coverage-7.5.3/coverage/misc.py 2024-05-28 15:52:29.000000000 +0200
@@ -13,7 +13,6 @@
import importlib
import importlib.util
import inspect
-import locale
import os
import os.path
import re
@@ -22,7 +21,7 @@
from types import ModuleType
from typing import (
- Any, IO, Iterable, Iterator, Mapping, NoReturn, Sequence, TypeVar,
+ Any, Iterable, Iterator, Mapping, NoReturn, Sequence, TypeVar,
)
from coverage.exceptions import CoverageException
@@ -156,18 +155,6 @@
ensure_dir(os.path.dirname(path))
-def output_encoding(outfile: IO[str] | None = None) -> str:
- """Determine the encoding to use for output written to `outfile` or stdout."""
- if outfile is None:
- outfile = sys.stdout
- encoding = (
- getattr(outfile, "encoding", None) or
- getattr(sys.__stdout__, "encoding", None) or
- locale.getpreferredencoding()
- )
- return encoding
-
-
class Hasher:
"""Hashes Python data for fingerprinting."""
def __init__(self) -> None:
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/coverage-7.5.1/coverage/parser.py new/coverage-7.5.3/coverage/parser.py
--- old/coverage-7.5.1/coverage/parser.py 2024-05-04 16:44:25.000000000 +0200
+++ new/coverage-7.5.3/coverage/parser.py 2024-05-28 15:52:29.000000000 +0200
@@ -25,7 +25,7 @@
from coverage.bytecode import code_objects
from coverage.debug import short_stack
from coverage.exceptions import NoSource, NotPython
-from coverage.misc import join_regex, nice_pair
+from coverage.misc import nice_pair
from coverage.phystokens import generate_tokens
from coverage.types import TArc, TLineNo
@@ -62,8 +62,8 @@
self.exclude = exclude
- # The text lines of the parsed code.
- self.lines: list[str] = self.text.split("\n")
+ # The parsed AST of the text.
+ self._ast_root: ast.AST | None = None
# The normalized line numbers of the statements in the code. Exclusions
# are taken into account, and statements are adjusted to their first
@@ -101,19 +101,16 @@
self._all_arcs: set[TArc] | None = None
self._missing_arc_fragments: TArcFragments | None = None
- @functools.lru_cache()
- def lines_matching(self, *regexes: str) -> set[TLineNo]:
- """Find the lines matching one of a list of regexes.
+ def lines_matching(self, regex: str) -> set[TLineNo]:
+ """Find the lines matching a regex.
- Returns a set of line numbers, the lines that contain a match for one
- of the regexes in `regexes`. The entire line needn't match, just a
- part of it.
+ Returns a set of line numbers, the lines that contain a match for
+ `regex`. The entire line needn't match, just a part of it.
"""
- combined = join_regex(regexes)
- regex_c = re.compile(combined)
+ regex_c = re.compile(regex)
matches = set()
- for i, ltext in enumerate(self.lines, start=1):
+ for i, ltext in enumerate(self.text.split("\n"), start=1):
if regex_c.search(ltext):
matches.add(self._multiline.get(i, i))
return matches
@@ -127,26 +124,18 @@
# Find lines which match an exclusion pattern.
if self.exclude:
self.raw_excluded = self.lines_matching(self.exclude)
+ self.excluded = set(self.raw_excluded)
- # Tokenize, to find excluded suites, to find docstrings, and to find
- # multi-line statements.
-
- # The last token seen. Start with INDENT to get module docstrings
- prev_toktype: int = token.INDENT
# The current number of indents.
indent: int = 0
# An exclusion comment will exclude an entire clause at this indent.
exclude_indent: int = 0
# Are we currently excluding lines?
excluding: bool = False
- # Are we excluding decorators now?
- excluding_decorators: bool = False
# The line number of the first line in a multi-line statement.
first_line: int = 0
# Is the file empty?
empty: bool = True
- # Is this the first token on a line?
- first_on_line: bool = True
# Parenthesis (and bracket) nesting level.
nesting: int = 0
@@ -162,42 +151,22 @@
indent += 1
elif toktype == token.DEDENT:
indent -= 1
- elif toktype == token.NAME:
- if ttext == "class":
- # Class definitions look like branches in the bytecode, so
- # we need to exclude them. The simplest way is to note the
- # lines with the "class" keyword.
- self.raw_classdefs.add(slineno)
elif toktype == token.OP:
if ttext == ":" and nesting == 0:
should_exclude = (
- self.raw_excluded.intersection(range(first_line, elineno + 1))
- or excluding_decorators
+ self.excluded.intersection(range(first_line, elineno + 1))
)
if not excluding and should_exclude:
# Start excluding a suite. We trigger off of the colon
# token so that the #pragma comment will be recognized on
# the same line as the colon.
- self.raw_excluded.add(elineno)
+ self.excluded.add(elineno)
exclude_indent = indent
excluding = True
- excluding_decorators = False
- elif ttext == "@" and first_on_line:
- # A decorator.
- if elineno in self.raw_excluded:
- excluding_decorators = True
- if excluding_decorators:
- self.raw_excluded.add(elineno)
elif ttext in "([{":
nesting += 1
elif ttext in ")]}":
nesting -= 1
- elif toktype == token.STRING:
- if prev_toktype == token.INDENT:
- # Strings that are first on an indented line are docstrings.
- # (a trick from trace.py in the stdlib.) This works for
- # 99.9999% of cases.
- self.raw_docstrings.update(range(slineno, elineno+1))
elif toktype == token.NEWLINE:
if first_line and elineno != first_line:
# We're at the end of a line, and we've ended on a
@@ -206,7 +175,6 @@
for l in range(first_line, elineno+1):
self._multiline[l] = first_line
first_line = 0
- first_on_line = True
if ttext.strip() and toktype != tokenize.COMMENT:
# A non-white-space token.
@@ -218,10 +186,7 @@
if excluding and indent <= exclude_indent:
excluding = False
if excluding:
- self.raw_excluded.add(elineno)
- first_on_line = False
-
- prev_toktype = toktype
+ self.excluded.add(elineno)
# Find the starts of the executable statements.
if not empty:
@@ -234,6 +199,34 @@
if env.PYBEHAVIOR.module_firstline_1 and self._multiline:
self._multiline[1] = min(self.raw_statements)
+ self.excluded = self.first_lines(self.excluded)
+
+ # AST lets us find classes, docstrings, and decorator-affected
+ # functions and classes.
+ assert self._ast_root is not None
+ for node in ast.walk(self._ast_root):
+ # Find class definitions.
+ if isinstance(node, ast.ClassDef):
+ self.raw_classdefs.add(node.lineno)
+ # Find docstrings.
+ if isinstance(node, (ast.ClassDef, ast.FunctionDef, ast.AsyncFunctionDef, ast.Module)):
+ if node.body:
+ first = node.body[0]
+ if (
+ isinstance(first, ast.Expr)
+ and isinstance(first.value, ast.Constant)
+ and isinstance(first.value.value, str)
+ ):
+ self.raw_docstrings.update(
+ range(first.lineno, cast(int, first.end_lineno) + 1)
+ )
+ # Exclusions carry from decorators and signatures to the bodies of
+ # functions and classes.
+ if isinstance(node, (ast.ClassDef, ast.FunctionDef, ast.AsyncFunctionDef)):
+ first_line = min((d.lineno for d in node.decorator_list), default=node.lineno)
+ if self.excluded.intersection(range(first_line, node.lineno + 1)):
+ self.excluded.update(range(first_line, cast(int, node.end_lineno) + 1))
+
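The AST walk above derives a decorated definition's first line from its ``decorator_list``; a small self-contained illustration (hypothetical sample source, not from the patch):

```python
import ast

source = """\
@decorator_one
@decorator_two
def func():
    pass
"""

tree = ast.parse(source)
func = tree.body[0]
assert isinstance(func, ast.FunctionDef)

# The earliest line covering the whole definition includes its decorators;
# func.lineno is the "def" line, end_lineno the last line of the body.
first_line = min((d.lineno for d in func.decorator_list), default=func.lineno)
print(first_line, func.lineno, func.end_lineno)  # 1 3 4
```

An exclusion anywhere in ``range(first_line, func.lineno + 1)`` can then be extended over the whole body, which is how decorator exclusions carry into functions and classes.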
@functools.lru_cache(maxsize=1000)
def first_line(self, lineno: TLineNo) -> TLineNo:
"""Return the first line number of the statement including `lineno`."""
@@ -268,6 +261,7 @@
"""
try:
+ self._ast_root = ast.parse(self.text)
self._raw_parse()
except (tokenize.TokenError, IndentationError, SyntaxError) as err:
if hasattr(err, "lineno"):
@@ -279,8 +273,6 @@
f"{err.args[0]!r} at line {lineno}",
) from err
- self.excluded = self.first_lines(self.raw_excluded)
-
ignore = self.excluded | self.raw_docstrings
starts = self.raw_statements - ignore
self.statements = self.first_lines(starts) - ignore
@@ -303,7 +295,8 @@
`_all_arcs` is the set of arcs in the code.
"""
- aaa = AstArcAnalyzer(self.text, self.raw_statements, self._multiline)
+ assert self._ast_root is not None
+ aaa = AstArcAnalyzer(self._ast_root, self.raw_statements, self._multiline)
aaa.analyze()
self._all_arcs = set()
@@ -403,14 +396,9 @@
self.code = code
else:
assert filename is not None
- try:
- self.code = compile(text, filename, "exec", dont_inherit=True)
- except SyntaxError as synerr:
- raise NotPython(
- "Couldn't parse '%s' as Python source: '%s' at line %d" % (
- filename, synerr.msg, synerr.lineno or 0,
- ),
- ) from synerr
+ # We only get here if earlier ast parsing succeeded, so no need to
+ # catch errors.
+ self.code = compile(text, filename, "exec", dont_inherit=True)
def child_parsers(self) -> Iterable[ByteParser]:
"""Iterate over all the code objects nested within this one.
@@ -685,11 +673,11 @@
def __init__(
self,
- text: str,
+ root_node: ast.AST,
statements: set[TLineNo],
multiline: dict[TLineNo, TLineNo],
) -> None:
- self.root_node = ast.parse(text)
+ self.root_node = root_node
# TODO: I think this is happening in too many places.
self.statements = {multiline.get(l, l) for l in statements}
self.multiline = multiline
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/coverage-7.5.1/coverage/phystokens.py new/coverage-7.5.3/coverage/phystokens.py
--- old/coverage-7.5.1/coverage/phystokens.py 2024-05-04 16:44:25.000000000 +0200
+++ new/coverage-7.5.3/coverage/phystokens.py 2024-05-28 15:52:29.000000000 +0200
@@ -6,7 +6,6 @@
from __future__ import annotations
import ast
-import functools
import io
import keyword
import re
@@ -163,20 +162,15 @@
yield line
-@functools.lru_cache(maxsize=100)
def generate_tokens(text: str) -> TokenInfos:
- """A cached version of `tokenize.generate_tokens`.
+ """A helper around `tokenize.generate_tokens`.
+
+ Originally this was used to cache the results, but it didn't seem to make
+ reporting go faster, and caused issues with using too much memory.
- When reporting, coverage.py tokenizes files twice, once to find the
- structure of the file, and once to syntax-color it. Tokenizing is
- expensive, and easily cached.
-
- Unfortunately, the HTML report code tokenizes all the files the first time
- before then tokenizing them a second time, so we cache many. Ideally we'd
- rearrange the code to tokenize each file twice before moving onto the next.
"""
readline = io.StringIO(text).readline
- return list(tokenize.generate_tokens(readline))
+ return tokenize.generate_tokens(readline)
def source_encoding(source: bytes) -> str:
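``generate_tokens`` now returns the stdlib's lazy iterator directly instead of a cached list; for reference, a minimal use of the underlying call it wraps (a sketch on a one-line source):

```python
import io
import tokenize

text = "x = 1\n"
readline = io.StringIO(text).readline

# tokenize.generate_tokens yields TokenInfo tuples lazily.
tokens = list(tokenize.generate_tokens(readline))
print([tokenize.tok_name[t.type] for t in tokens])
# ['NAME', 'OP', 'NUMBER', 'NEWLINE', 'ENDMARKER']
```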
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/coverage-7.5.1/coverage/python.py new/coverage-7.5.3/coverage/python.py
--- old/coverage-7.5.1/coverage/python.py 2024-05-04 16:44:25.000000000 +0200
+++ new/coverage-7.5.3/coverage/python.py 2024-05-28 15:52:29.000000000 +0200
@@ -206,8 +206,10 @@
def no_branch_lines(self) -> set[TLineNo]:
assert self.coverage is not None
no_branch = self.parser.lines_matching(
- join_regex(self.coverage.config.partial_list),
- join_regex(self.coverage.config.partial_always_list),
+ join_regex(
+ self.coverage.config.partial_list
+ + self.coverage.config.partial_always_list
+ )
)
return no_branch
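Since ``lines_matching`` now takes a single regex, the two pattern lists are concatenated and joined once. ``join_regex`` combines patterns into one alternation, roughly as below (a sketch with simplified patterns, not coverage.py's defaults):

```python
import re

def join_regex(regexes):
    """Combine a list of regexes into one that matches any of them."""
    return "|".join(f"(?:{r})" for r in regexes)

partial = [r"#\s*pragma[:\s]?\s*no\s*branch"]
partial_always = [r"while\s*True\s*:"]

combined = join_regex(partial + partial_always)
assert re.search(combined, "while True:  # loop forever")
assert re.search(combined, "    break  # pragma: no branch")
```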
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/coverage-7.5.1/coverage/pytracer.py new/coverage-7.5.3/coverage/pytracer.py
--- old/coverage-7.5.1/coverage/pytracer.py 2024-05-04 16:44:25.000000000 +0200
+++ new/coverage-7.5.3/coverage/pytracer.py 2024-05-28 15:52:29.000000000 +0200
@@ -166,12 +166,12 @@
if event == "call":
# Should we start a new context?
if self.should_start_context and self.context is None:
- context_maybe = self.should_start_context(frame)
+ context_maybe = self.should_start_context(frame) # pylint: disable=not-callable
if context_maybe is not None:
self.context = context_maybe
started_context = True
assert self.switch_context is not None
- self.switch_context(self.context)
+ self.switch_context(self.context) # pylint: disable=not-callable
else:
started_context = False
else:
@@ -280,7 +280,7 @@
if self.started_context:
assert self.switch_context is not None
self.context = None
- self.switch_context(None)
+ self.switch_context(None) # pylint: disable=not-callable
return self._cached_bound_method_trace
def start(self) -> TTraceFn:
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/coverage-7.5.1/coverage/sqldata.py new/coverage-7.5.3/coverage/sqldata.py
--- old/coverage-7.5.1/coverage/sqldata.py 2024-05-04 16:44:25.000000000 +0200
+++ new/coverage-7.5.3/coverage/sqldata.py 2024-05-28 15:52:29.000000000 +0200
@@ -21,13 +21,12 @@
import zlib
from typing import (
- cast, Any, Collection, Mapping,
+ cast, Any, Callable, Collection, Mapping,
Sequence,
)
from coverage.debug import NoDebugging, auto_repr
from coverage.exceptions import CoverageException, DataError
-from coverage.files import PathAliases
from coverage.misc import file_be_gone, isolate_module
from coverage.numbits import numbits_to_nums, numbits_union, nums_to_numbits
from coverage.sqlitedb import SqliteDb
@@ -647,12 +646,16 @@
continue
con.execute_void(sql, (file_id,))
- def update(self, other_data: CoverageData, aliases: PathAliases | None = None) -> None:
- """Update this data with data from several other :class:`CoverageData` instances.
+ def update(
+ self,
+ other_data: CoverageData,
+ map_path: Callable[[str], str] | None = None,
+ ) -> None:
+ """Update this data with data from another :class:`CoverageData`.
- If `aliases` is provided, it's a `PathAliases` object that is used to
- re-map paths to match the local machine's. Note: `aliases` is None
- only when called directly from the test suite.
+ If `map_path` is provided, it's a function that re-maps paths to match
+ the local machine's. Note: `map_path` is None only when called
+ directly from the test suite.
"""
if self._debug.should("dataop"):
@@ -664,7 +667,7 @@
if self._has_arcs and other_data._has_lines:
raise DataError("Can't combine line data with arc data")
- aliases = aliases or PathAliases()
+ map_path = map_path or (lambda p: p)
# Force the database we're writing to to exist before we start nesting contexts.
self._start_using()
@@ -674,7 +677,7 @@
with other_data._connect() as con:
# Get files data.
with con.execute("select path from file") as cur:
- files = {path: aliases.map(path) for (path,) in cur}
+ files = {path: map_path(path) for (path,) in cur}
# Get contexts data.
with con.execute("select context from context") as cur:
@@ -729,7 +732,7 @@
"inner join file on file.id = tracer.file_id",
) as cur:
this_tracers.update({
- aliases.map(path): tracer
+ map_path(path): tracer
for path, tracer in cur
})
@@ -767,27 +770,15 @@
# Prepare arc and line rows to be inserted by converting the file
# and context strings with integer ids. Then use the efficient
# `executemany()` to insert all rows at once.
- arc_rows = (
- (file_ids[file], context_ids[context], fromno, tono)
- for file, context, fromno, tono in arcs
- )
-
- # Get line data.
- with con.execute(
- "select file.path, context.context, line_bits.numbits " +
- "from line_bits " +
- "inner join file on file.id = line_bits.file_id " +
- "inner join context on context.id = line_bits.context_id",
- ) as cur:
- for path, context, numbits in cur:
- key = (aliases.map(path), context)
- if key in lines:
- numbits = numbits_union(lines[key], numbits)
- lines[key] = numbits
if arcs:
self._choose_lines_or_arcs(arcs=True)
+ arc_rows = (
+ (file_ids[file], context_ids[context], fromno, tono)
+ for file, context, fromno, tono in arcs
+ )
+
# Write the combined data.
con.executemany_void(
"insert or ignore into arc " +
@@ -797,15 +788,25 @@
if lines:
self._choose_lines_or_arcs(lines=True)
- con.execute_void("delete from line_bits")
+
+ for (file, context), numbits in lines.items():
+ with con.execute(
+ "select numbits from line_bits where file_id = ? and context_id = ?",
+ (file_ids[file], context_ids[context]),
+ ) as cur:
+ existing = list(cur)
+ if existing:
+ lines[(file, context)] = numbits_union(numbits, existing[0][0])
+
con.executemany_void(
- "insert into line_bits " +
+ "insert or replace into line_bits " +
"(file_id, context_id, numbits) values (?, ?, ?)",
[
(file_ids[file], context_ids[context], numbits)
for (file, context), numbits in lines.items()
],
)
+
con.executemany_void(
"insert or ignore into tracer (file_id, tracer) values (?, ?)",
((file_ids[filename], tracer) for filename, tracer in tracer_map.items()),
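The rewritten ``line_bits`` merge reads any existing row, unions the bit sets, and upserts with ``insert or replace`` instead of wiping the table first. A self-contained sketch of that pattern with stdlib ``sqlite3``, where a byte-wise OR stands in for coverage.py's ``numbits_union``:

```python
import sqlite3

def union_blobs(a: bytes, b: bytes) -> bytes:
    """Byte-wise OR of two bit-set blobs (stand-in for numbits_union)."""
    if len(a) < len(b):
        a, b = b, a
    return bytes(x | y for x, y in zip(a, b.ljust(len(a), b"\x00")))

con = sqlite3.connect(":memory:")
con.execute(
    "create table line_bits (file_id int, context_id int, numbits blob, "
    "primary key (file_id, context_id))"
)
con.execute("insert into line_bits values (1, 1, ?)", (b"\x05",))  # bits {0, 2}

# Merge incoming data for the same (file, context) instead of deleting rows.
incoming = b"\x0a"  # bits {1, 3}
row = con.execute(
    "select numbits from line_bits where file_id = 1 and context_id = 1"
).fetchone()
if row:
    incoming = union_blobs(incoming, row[0])
con.execute("insert or replace into line_bits values (1, 1, ?)", (incoming,))

merged = con.execute(
    "select numbits from line_bits where file_id = 1 and context_id = 1"
).fetchone()[0]
print(merged)  # b'\x0f' -- the union of both bit sets
```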
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/coverage-7.5.1/coverage/version.py new/coverage-7.5.3/coverage/version.py
--- old/coverage-7.5.1/coverage/version.py 2024-05-04 16:44:25.000000000 +0200
+++ new/coverage-7.5.3/coverage/version.py 2024-05-28 15:52:29.000000000 +0200
@@ -8,7 +8,7 @@
# version_info: same semantics as sys.version_info.
# _dev: the .devN suffix if any.
-version_info = (7, 5, 1, "final", 0)
+version_info = (7, 5, 3, "final", 0)
_dev = 0
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/coverage-7.5.1/coverage.egg-info/PKG-INFO new/coverage-7.5.3/coverage.egg-info/PKG-INFO
--- old/coverage-7.5.1/coverage.egg-info/PKG-INFO 2024-05-04 16:44:36.000000000 +0200
+++ new/coverage-7.5.3/coverage.egg-info/PKG-INFO 2024-05-28 15:52:36.000000000 +0200
@@ -1,12 +1,12 @@
Metadata-Version: 2.1
Name: coverage
-Version: 7.5.1
+Version: 7.5.3
Summary: Code coverage measurement for Python
Home-page: https://github.com/nedbat/coveragepy
-Author: Ned Batchelder and 226 others
+Author: Ned Batchelder and 227 others
Author-email: ned(a)nedbatchelder.com
License: Apache-2.0
-Project-URL: Documentation, https://coverage.readthedocs.io/en/7.5.1
+Project-URL: Documentation, https://coverage.readthedocs.io/en/7.5.3
Project-URL: Funding, https://tidelift.com/subscription/pkg/pypi-coverage?utm_source=pypi-coverag…
Project-URL: Issues, https://github.com/nedbat/coveragepy/issues
Project-URL: Mastodon, https://hachyderm.io/@coveragepy
@@ -62,13 +62,13 @@
.. PYVERSIONS
-* Python 3.8 through 3.12, and 3.13.0a6 and up.
+* Python 3.8 through 3.12, and 3.13.0b1 and up.
* PyPy3 versions 3.8 through 3.10.
Documentation is on `Read the Docs`_. Code repository and issue tracker are on
`GitHub`_.
-.. _Read the Docs: https://coverage.readthedocs.io/en/7.5.1/
+.. _Read the Docs: https://coverage.readthedocs.io/en/7.5.3/
.. _GitHub: https://github.com/nedbat/coveragepy
**New in 7.x:**
@@ -112,7 +112,7 @@
Looking to run ``coverage`` on your test suite? See the `Quick Start section`_
of the docs.
-.. _Quick Start section: https://coverage.readthedocs.io/en/7.5.1/#quick-start
+.. _Quick Start section: https://coverage.readthedocs.io/en/7.5.3/#quick-start
Change history
@@ -120,7 +120,7 @@
The complete history of changes is on the `change history page`_.
-.. _change history page: https://coverage.readthedocs.io/en/7.5.1/changes.html
+.. _change history page: https://coverage.readthedocs.io/en/7.5.3/changes.html
Code of Conduct
@@ -139,7 +139,7 @@
Found a bug? Want to help improve the code or documentation? See the
`Contributing section`_ of the docs.
-.. _Contributing section: https://coverage.readthedocs.io/en/7.5.1/contributing.html
+.. _Contributing section: https://coverage.readthedocs.io/en/7.5.3/contributing.html
Security
@@ -167,7 +167,7 @@
:target: https://github.com/nedbat/coveragepy/actions/workflows/quality.yml
:alt: Quality check status
.. |docs| image:: https://readthedocs.org/projects/coverage/badge/?version=latest&style=flat
- :target: https://coverage.readthedocs.io/en/7.5.1/
+ :target: https://coverage.readthedocs.io/en/7.5.3/
:alt: Documentation
.. |kit| image:: https://img.shields.io/pypi/v/coverage
:target: https://pypi.org/project/coverage/
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/coverage-7.5.1/doc/branch.rst new/coverage-7.5.3/doc/branch.rst
--- old/coverage-7.5.1/doc/branch.rst 2024-05-04 16:44:25.000000000 +0200
+++ new/coverage-7.5.3/doc/branch.rst 2024-05-28 15:52:29.000000000 +0200
@@ -116,3 +116,16 @@
at some point. Coverage.py can't work that out on its own, but the "no branch"
pragma indicates that the branch is known to be partial, and the line is not
flagged.
+
+Generator expressions
+=====================
+
+Generator expressions may also report partial branch coverage. Consider the
+following example::
+
+ value = next(i for i in range(1))
+
+While we might expect this line of code to be reported as covered, the
+generator is never iterated to the point where ``StopIteration`` is raised,
+which is the indication that the loop is complete. This is another case
+where adding ``# pragma: no branch`` may be desirable.
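The partial iteration is easy to observe directly: ``next()`` consumes only the first value, so ``StopIteration`` is never raised on that line:

```python
gen = (i for i in range(1))
value = next(gen)   # consumes the only value; the loop is not yet "complete"
print(value)        # 0

# Only a further next() call exhausts the generator and raises StopIteration.
try:
    next(gen)
except StopIteration:
    print("exhausted")
```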
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/coverage-7.5.1/doc/changes.rst new/coverage-7.5.3/doc/changes.rst
--- old/coverage-7.5.1/doc/changes.rst 2024-05-04 16:44:25.000000000 +0200
+++ new/coverage-7.5.3/doc/changes.rst 2024-05-28 15:52:29.000000000 +0200
@@ -845,10 +845,10 @@
would cause a "No data to report" error, as reported in `issue 549`_. This is
now fixed; thanks, Loïc Dachary.
-- If-statements can be optimized away during compilation, for example, `if 0:`
- or `if __debug__:`. Coverage.py had problems properly understanding these
- statements which existed in the source, but not in the compiled bytecode.
- This problem, reported in `issue 522`_, is now fixed.
+- If-statements can be optimized away during compilation, for example,
+ ``if 0:`` or ``if __debug__:``. Coverage.py had problems properly
+ understanding these statements which existed in the source, but not in the
+ compiled bytecode. This problem, reported in `issue 522`_, is now fixed.
- If you specified ``--source`` as a directory, then coverage.py would look for
importable Python files in that directory, and could identify ones that had
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/coverage-7.5.1/doc/conf.py new/coverage-7.5.3/doc/conf.py
--- old/coverage-7.5.1/doc/conf.py 2024-05-04 16:44:25.000000000 +0200
+++ new/coverage-7.5.3/doc/conf.py 2024-05-28 15:52:29.000000000 +0200
@@ -67,11 +67,11 @@
# @@@ editable
copyright = "2009–2024, Ned Batchelder" # pylint: disable=redefined-builtin
# The short X.Y.Z version.
-version = "7.5.1"
+version = "7.5.3"
# The full version, including alpha/beta/rc tags.
-release = "7.5.1"
+release = "7.5.3"
# The date of release, in "monthname day, year" format.
-release_date = "May 4, 2024"
+release_date = "May 28, 2024"
# @@@ end
rst_epilog = f"""
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/coverage-7.5.1/doc/index.rst new/coverage-7.5.3/doc/index.rst
--- old/coverage-7.5.1/doc/index.rst 2024-05-04 16:44:25.000000000 +0200
+++ new/coverage-7.5.3/doc/index.rst 2024-05-28 15:52:29.000000000 +0200
@@ -18,7 +18,7 @@
.. PYVERSIONS
-* Python 3.8 through 3.12, and 3.13.0a6 and up.
+* Python 3.8 through 3.12, and 3.13.0b1 and up.
* PyPy3 versions 3.8 through 3.10.
.. ifconfig:: prerelease
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/coverage-7.5.1/doc/requirements.in new/coverage-7.5.3/doc/requirements.in
--- old/coverage-7.5.1/doc/requirements.in 2024-05-04 16:44:25.000000000 +0200
+++ new/coverage-7.5.3/doc/requirements.in 2024-05-28 15:52:29.000000000 +0200
@@ -14,5 +14,6 @@
sphinx-autobuild
sphinx_rtd_theme
sphinx-code-tabs
+sphinx-lint
sphinxcontrib-restbuilder
sphinxcontrib-spelling
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/coverage-7.5.1/doc/requirements.pip new/coverage-7.5.3/doc/requirements.pip
--- old/coverage-7.5.1/doc/requirements.pip 2024-05-04 16:44:25.000000000 +0200
+++ new/coverage-7.5.3/doc/requirements.pip 2024-05-28 15:52:29.000000000 +0200
@@ -12,7 +12,7 @@
# watchfiles
attrs==23.2.0
# via scriv
-babel==2.14.0
+babel==2.15.0
# via sphinx
certifi==2024.2.2
# via requests
@@ -45,7 +45,7 @@
# requests
imagesize==1.4.1
# via sphinx
-jinja2==3.1.3
+jinja2==3.1.4
# via
# scriv
# sphinx
@@ -59,14 +59,18 @@
# via sphinx
pbr==6.0.0
# via stevedore
+polib==1.2.0
+ # via sphinx-lint
pyenchant==3.2.2
# via
# -r doc/requirements.in
# sphinxcontrib-spelling
-pygments==2.17.2
+pygments==2.18.0
# via
# doc8
# sphinx
+regex==2024.4.28
+ # via sphinx-lint
requests==2.31.0
# via
# scriv
@@ -92,6 +96,8 @@
# via -r doc/requirements.in
sphinx-code-tabs==0.5.5
# via -r doc/requirements.in
+sphinx-lint==0.9.1
+ # via -r doc/requirements.in
sphinx-rtd-theme==2.0.0
# via -r doc/requirements.in
sphinxcontrib-applehelp==1.0.8
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/coverage-7.5.1/howto.txt new/coverage-7.5.3/howto.txt
--- old/coverage-7.5.1/howto.txt 2024-05-04 16:44:25.000000000 +0200
+++ new/coverage-7.5.3/howto.txt 2024-05-28 15:52:29.000000000 +0200
@@ -34,10 +34,11 @@
- check in the new sample html
$ make relcommit2
- Done with changes to source files
- - check them in on the release prep branch
- - wait for ci to finish
- - merge to master
- - git push
+ - g puo; gshipit
+ - check them in on the release prep branch
+ - wait for ci to finish
+ - merge to master
+ - git push
- Start the kits:
- opvars github
- Trigger the kit GitHub Action
@@ -77,10 +78,8 @@
- IF NOT PRE-RELEASE:
- @ https://readthedocs.org/dashboard/coverage/advanced/
- change the "default version" to the new version
- - @ https://readthedocs.org/projects/coverage/builds/
- - manually build "latest"
- - wait for the new tag build to finish successfully.
- Once CI passes, merge the bump-version branch to master and push it
+ - gshipit
- things to automate:
- readthedocs api to do the readthedocs changes
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/coverage-7.5.1/igor.py new/coverage-7.5.3/igor.py
--- old/coverage-7.5.1/igor.py 2024-05-04 16:44:25.000000000 +0200
+++ new/coverage-7.5.3/igor.py 2024-05-28 15:52:29.000000000 +0200
@@ -248,6 +248,7 @@
os.getenv("COVERAGE_DYNCTX") or os.getenv("COVERAGE_CONTEXT"),
)
cov.html_report(show_contexts=show_contexts)
+ cov.json_report(show_contexts=show_contexts, pretty_print=True)
def do_test_with_core(core, *runner_args):
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/coverage-7.5.1/lab/extract_code.py new/coverage-7.5.3/lab/extract_code.py
--- old/coverage-7.5.1/lab/extract_code.py 2024-05-04 16:44:25.000000000 +0200
+++ new/coverage-7.5.3/lab/extract_code.py 2024-05-28 15:52:29.000000000 +0200
@@ -5,8 +5,8 @@
Use this to copy some indented code from the coverage.py test suite into a
standalone file for deeper testing, or writing bug reports.
-Give it a file name and a line number, and it will find the indentend
-multiline string containing that line number, and output the dedented
+Give it a file name and a line number, and it will find the indented
+multi-line string containing that line number, and output the dedented
contents of the string.
If tests/test_arcs.py has this (partial) content::
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/coverage-7.5.1/lab/parser.py new/coverage-7.5.3/lab/parser.py
--- old/coverage-7.5.1/lab/parser.py 2024-05-04 16:44:25.000000000 +0200
+++ new/coverage-7.5.3/lab/parser.py 2024-05-28 15:52:29.000000000 +0200
@@ -80,7 +80,7 @@
if options.dis:
print("Main code:")
- disassemble(pyparser)
+ disassemble(pyparser.text)
arcs = pyparser.arcs()
@@ -95,8 +95,8 @@
exit_counts = pyparser.exit_counts()
- for lineno, ltext in enumerate(pyparser.lines, start=1):
- marks = [' ', ' ', ' ', ' ', ' ']
+ for lineno, ltext in enumerate(pyparser.text.splitlines(), start=1):
+ marks = [' '] * 6
a = ' '
if lineno in pyparser.raw_statements:
marks[0] = '-'
@@ -110,7 +110,13 @@
if lineno in pyparser.raw_classdefs:
marks[3] = 'C'
if lineno in pyparser.raw_excluded:
- marks[4] = 'x'
+ marks[4] = 'X'
+ elif lineno in pyparser.excluded:
+ marks[4] = '×'
+ if lineno in pyparser._multiline.values():
+ marks[5] = 'o'
+ elif lineno in pyparser._multiline.keys():
+ marks[5] = '.'
if arc_chars:
a = arc_chars[lineno].ljust(arc_width)
@@ -173,13 +179,13 @@
yield code
-def disassemble(pyparser):
+def disassemble(text):
"""Disassemble code, for ad-hoc experimenting."""
- code = compile(pyparser.text, "", "exec", dont_inherit=True)
+ code = compile(text, "", "exec", dont_inherit=True)
for code_obj in all_code_objects(code):
- if pyparser.text:
- srclines = pyparser.text.splitlines()
+ if text:
+ srclines = text.splitlines()
else:
srclines = None
print("\n%s: " % code_obj)
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/coverage-7.5.1/tests/coveragetest.py new/coverage-7.5.3/tests/coveragetest.py
--- old/coverage-7.5.1/tests/coveragetest.py 2024-05-04 16:44:25.000000000 +0200
+++ new/coverage-7.5.3/tests/coveragetest.py 2024-05-28 15:52:29.000000000 +0200
@@ -151,7 +151,7 @@
self,
text: str,
lines: Sequence[TLineNo] | Sequence[list[TLineNo]] | None = None,
- missing: str | Sequence[str] = "",
+ missing: str = "",
report: str = "",
excludes: Iterable[str] | None = None,
partials: Iterable[str] = (),
@@ -226,15 +226,8 @@
assert False, f"None of the lines choices matched {statements!r}"
missing_formatted = analysis.missing_formatted()
- if isinstance(missing, str):
- msg = f"missing: {missing_formatted!r} != {missing!r}"
- assert missing_formatted == missing, msg
- else:
- for missing_list in missing:
- if missing_formatted == missing_list:
- break
- else:
- assert False, f"None of the missing choices matched {missing_formatted!r}"
+ msg = f"missing: {missing_formatted!r} != {missing!r}"
+ assert missing_formatted == missing, msg
if arcs is not None:
# print("Possible arcs:")
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/coverage-7.5.1/tests/gold/html/support/coverage_html.js new/coverage-7.5.3/tests/gold/html/support/coverage_html.js
--- old/coverage-7.5.1/tests/gold/html/support/coverage_html.js 2024-05-04 16:44:25.000000000 +0200
+++ new/coverage-7.5.3/tests/gold/html/support/coverage_html.js 2024-05-28 15:52:29.000000000 +0200
@@ -125,6 +125,16 @@
// Create the events for the filter box.
coverage.wire_up_filter = function () {
+ // Populate the filter and hide100 inputs if there are saved values for them.
+ const saved_filter_value = localStorage.getItem(coverage.FILTER_STORAGE);
+ if (saved_filter_value) {
+ document.getElementById("filter").value = saved_filter_value;
+ }
+ const saved_hide100_value = localStorage.getItem(coverage.HIDE100_STORAGE);
+ if (saved_hide100_value) {
+ document.getElementById("hide100").checked = JSON.parse(saved_hide100_value);
+ }
+
// Cache elements.
const table = document.querySelector("table.index");
const table_body_rows = table.querySelectorAll("tbody tr");
@@ -138,8 +148,12 @@
totals[totals.length - 1] = { "numer": 0, "denom": 0 }; // nosemgrep: eslint.detect-object-injection
var text = document.getElementById("filter").value;
+ // Store filter value
+ localStorage.setItem(coverage.FILTER_STORAGE, text);
const casefold = (text === text.toLowerCase());
const hide100 = document.getElementById("hide100").checked;
+ // Store hide value.
+ localStorage.setItem(coverage.HIDE100_STORAGE, JSON.stringify(hide100));
// Hide / show elements.
table_body_rows.forEach(row => {
@@ -240,6 +254,8 @@
document.getElementById("filter").dispatchEvent(new Event("input"));
document.getElementById("hide100").dispatchEvent(new Event("input"));
};
+coverage.FILTER_STORAGE = "COVERAGE_FILTER_VALUE";
+coverage.HIDE100_STORAGE = "COVERAGE_HIDE100_VALUE";
// Set up the click-to-sort columns.
coverage.wire_up_sorting = function () {
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/coverage-7.5.1/tests/helpers.py new/coverage-7.5.3/tests/helpers.py
--- old/coverage-7.5.1/tests/helpers.py 2024-05-04 16:44:25.000000000 +0200
+++ new/coverage-7.5.3/tests/helpers.py 2024-05-28 15:52:29.000000000 +0200
@@ -9,6 +9,7 @@
import contextlib
import dis
import io
+import locale
import os
import os.path
import re
@@ -28,7 +29,6 @@
from coverage import env
from coverage.debug import DebugControl
from coverage.exceptions import CoverageWarning
-from coverage.misc import output_encoding
from coverage.types import TArc, TLineNo
@@ -44,11 +44,13 @@
with open("/tmp/processes.txt", "a") as proctxt: # type: ignore[unreachable]
print(os.getenv("PYTEST_CURRENT_TEST", "unknown"), file=proctxt, flush=True)
+ encoding = os.device_encoding(1) or locale.getpreferredencoding()
+
# In some strange cases (PyPy3 in a virtualenv!?) the stdout encoding of
# the subprocess is set incorrectly to ascii. Use an environment variable
# to force the encoding to be the same as ours.
sub_env = dict(os.environ)
- sub_env['PYTHONIOENCODING'] = output_encoding()
+ sub_env['PYTHONIOENCODING'] = encoding
proc = subprocess.Popen(
cmd,
@@ -62,7 +64,7 @@
status = proc.returncode
# Get the output, and canonicalize it to strings with newlines.
- output_str = output.decode(output_encoding()).replace("\r", "")
+ output_str = output.decode(encoding).replace("\r", "")
return status, output_str
@@ -114,8 +116,11 @@
print(f"# {os.path.abspath(filename)}", file=fdis)
cur_test = os.getenv("PYTEST_CURRENT_TEST", "unknown")
print(f"# PYTEST_CURRENT_TEST = {cur_test}", file=fdis)
+ kwargs = {}
+ if env.PYVERSION >= (3, 13):
+ kwargs["show_offsets"] = True
try:
- dis.dis(text, file=fdis)
+ dis.dis(text, file=fdis, **kwargs)
except Exception as exc:
# Some tests make .py files that aren't Python, so dis will
# fail, which is expected.
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/coverage-7.5.1/tests/test_coverage.py new/coverage-7.5.3/tests/test_coverage.py
--- old/coverage-7.5.1/tests/test_coverage.py 2024-05-04 16:44:25.000000000 +0200
+++ new/coverage-7.5.3/tests/test_coverage.py 2024-05-28 15:52:29.000000000 +0200
@@ -41,15 +41,6 @@
[1,2,3],
missing="3",
)
- # You can specify a list of possible missing lines.
- self.check_coverage("""\
- a = 1
- if a == 2:
- a = 3
- """,
- [1,2,3],
- missing=("47-49", "3", "100,102"),
- )
def test_failed_coverage(self) -> None:
# If the lines are wrong, the message shows right and wrong.
@@ -79,17 +70,6 @@
[1,2,3],
missing="37",
)
- # If the missing lines possibilities are wrong, the msg shows right.
- msg = r"None of the missing choices matched '3'"
- with pytest.raises(AssertionError, match=msg):
- self.check_coverage("""\
- a = 1
- if a == 2:
- a = 3
- """,
- [1,2,3],
- missing=("37", "4-10"),
- )
def test_exceptions_really_fail(self) -> None:
# An assert in the checked code will really raise up to us.
@@ -502,6 +482,7 @@
)
def test_strange_unexecuted_continue(self) -> None:
+ # This used to be true, but no longer is:
# Peephole optimization of jumps to jumps can mean that some statements
# never hit the line tracer. The behavior is different in different
# versions of Python, so be careful when running this test.
@@ -529,7 +510,7 @@
assert a == 33 and b == 50 and c == 50
""",
lines=[1,2,3,4,5,6,8,9,10, 12,13,14,15,16,17,19,20,21],
- missing=["", "6"],
+ missing="",
)
def test_import(self) -> None:
@@ -682,14 +663,13 @@
""",
[2, 3],
)
- lines = [2, 3, 4]
self.check_coverage("""\
- # Start with a comment, because it changes the behavior(!?)
+ # Start with a comment, even though it doesn't change the behavior.
'''I am a module docstring.'''
a = 3
b = 4
""",
- lines,
+ [3, 4],
)
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/coverage-7.5.1/tests/test_parser.py new/coverage-7.5.3/tests/test_parser.py
--- old/coverage-7.5.1/tests/test_parser.py 2024-05-04 16:44:25.000000000 +0200
+++ new/coverage-7.5.3/tests/test_parser.py 2024-05-28 15:52:29.000000000 +0200
@@ -124,30 +124,23 @@
""")
assert parser.exit_counts() == { 1:1, 2:1, 3:1, 6:1 }
- def test_indentation_error(self) -> None:
- msg = (
- "Couldn't parse '<code>' as Python source: " +
- "'unindent does not match any outer indentation level.*' at line 3"
- )
- with pytest.raises(NotPython, match=msg):
- _ = self.parse_text("""\
- 0 spaces
- 2
- 1
- """)
-
- def test_token_error(self) -> None:
- submsgs = [
- r"EOF in multi-line string", # before 3.12.0b1
- r"unterminated triple-quoted string literal .detected at line 1.", # after 3.12.0b1
- ]
- msg = (
- r"Couldn't parse '<code>' as Python source: '"
- + r"(" + "|".join(submsgs) + ")"
- + r"' at line 1"
- )
+ @pytest.mark.parametrize("text", [
+ pytest.param("0 spaces\n 2\n 1", id="bad_indent"),
+ pytest.param("'''", id="string_eof"),
+ pytest.param("$hello", id="dollar"),
+ # on 3.10 this passes ast.parse but fails on tokenize.generate_tokens
+ pytest.param(
+ "\r'\\\n'''",
+ id="leading_newline_eof",
+ marks=[
+ pytest.mark.skipif(env.PYVERSION >= (3, 12), reason="parses fine in 3.12"),
+ ]
+ )
+ ])
+ def test_not_python(self, text: str) -> None:
+ msg = r"Couldn't parse '<code>' as Python source: '.*' at line \d+"
with pytest.raises(NotPython, match=msg):
- _ = self.parse_text("'''")
+ _ = self.parse_text(text)
def test_empty_decorated_function(self) -> None:
parser = self.parse_text("""\
@@ -180,6 +173,20 @@
assert expected_arcs == parser.arcs()
assert expected_exits == parser.exit_counts()
+ def test_module_docstrings(self) -> None:
+ parser = self.parse_text("""\
+ '''The docstring on line 1'''
+ a = 2
+ """)
+ assert {2} == parser.statements
+
+ parser = self.parse_text("""\
+ # Docstring is not line 1
+ '''The docstring on line 2'''
+ a = 3
+ """)
+ assert {3} == parser.statements
+
def test_fuzzed_double_parse(self) -> None:
# https://bugs.chromium.org/p/oss-fuzz/issues/detail?id=50381
# The second parse used to raise `TypeError: 'NoneType' object is not iterable`
@@ -740,6 +747,10 @@
assert parser.raw_statements == raw_statements
assert parser.statements == set()
+ @pytest.mark.xfail(
+ env.PYPY and env.PYVERSION[:2] == (3, 8),
+ reason="AST doesn't mark end of classes correctly",
+ )
def test_class_decorator_pragmas(self) -> None:
parser = self.parse_text("""\
class Foo(object):
@@ -754,6 +765,22 @@
assert parser.raw_statements == {1, 2, 3, 5, 6, 7, 8}
assert parser.statements == {1, 2, 3}
+ def test_over_exclusion_bug1779(self) -> None:
+ # https://github.com/nedbat/coveragepy/issues/1779
+ parser = self.parse_text("""\
+ import abc
+
+ class MyProtocol: # nocover 3
+ @abc.abstractmethod # nocover 4
+ def my_method(self) -> int:
+ ... # 6
+
+ def function() -> int:
+ return 9
+ """)
+ assert parser.raw_statements == {1, 3, 4, 5, 6, 8, 9}
+ assert parser.statements == {1, 8, 9}
+
class ParserMissingArcDescriptionTest(PythonParserTestBase):
"""Tests for PythonParser.missing_arc_description."""
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/coverage-7.5.1/tests/test_report.py new/coverage-7.5.3/tests/test_report.py
--- old/coverage-7.5.1/tests/test_report.py 2024-05-04 16:44:25.000000000 +0200
+++ new/coverage-7.5.3/tests/test_report.py 2024-05-28 15:52:29.000000000 +0200
@@ -668,6 +668,34 @@
assert "not_covered.py 3 3 0.000000%" in report
assert "TOTAL 3 3 0.000000%" in report
+ def test_report_module_docstrings(self) -> None:
+ self.make_file("main.py", """\
+ # Line 1
+ '''Line 2 docstring.'''
+ import other
+ a = 4
+ """)
+ self.make_file("other.py", """\
+ '''Line 1'''
+ a = 2
+ """)
+ cov = coverage.Coverage()
+ self.start_import_stop(cov, "main")
+ report = self.get_report(cov)
+
+ # Name Stmts Miss Cover
+ # ------------------------------
+ # main.py 2 0 100%
+ # other.py 1 0 100%
+ # ------------------------------
+ # TOTAL 3 0 100%
+
+ assert self.line_count(report) == 6, report
+ squeezed = self.squeezed_lines(report)
+ assert squeezed[2] == "main.py 2 0 100%"
+ assert squeezed[3] == "other.py 1 0 100%"
+ assert squeezed[5] == "TOTAL 3 0 100%"
+
def test_dotpy_not_python(self) -> None:
# We run a .py file, and when reporting, we can't parse it as Python.
# We should get an error message in the report.
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/coverage-7.5.1/tests/test_setup.py new/coverage-7.5.3/tests/test_setup.py
--- old/coverage-7.5.1/tests/test_setup.py 2024-05-04 16:44:25.000000000 +0200
+++ new/coverage-7.5.3/tests/test_setup.py 2024-05-28 15:52:29.000000000 +0200
@@ -9,7 +9,10 @@
from typing import List, cast
+import pytest
+
import coverage
+from coverage import env
from tests.coveragetest import CoverageTest
@@ -35,6 +38,10 @@
assert "github.com/nedbat/coveragepy" in out[2]
assert "Ned Batchelder" in out[3]
+ @pytest.mark.skipif(
+ env.PYVERSION[3:5] == ("alpha", 0),
+ reason="don't expect classifiers until labelled builds",
+ )
def test_more_metadata(self) -> None:
# Let's be sure we pick up our own setup.py
# CoverageTest restores the original sys.path for us.
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/coverage-7.5.1/tox.ini new/coverage-7.5.3/tox.ini
--- old/coverage-7.5.1/tox.ini 2024-05-04 16:44:25.000000000 +0200
+++ new/coverage-7.5.3/tox.ini 2024-05-28 15:52:29.000000000 +0200
@@ -46,7 +46,7 @@
python -m pip install {env:COVERAGE_PIP_ARGS} -q -e .
python igor.py test_with_core ctrace {posargs}
- py3{12,13},anypy: python igor.py test_with_core sysmon {posargs}
+ py3{12,13,14},anypy: python igor.py test_with_core sysmon {posargs}
# Remove the C extension so that we can test the PyTracer
python igor.py remove_extension
@@ -76,6 +76,7 @@
# If this command fails, see the comment at the top of doc/cmd.rst
python -m cogapp -cP --check --verbosity=1 doc/*.rst
doc8 -q --ignore-path 'doc/_*' doc CHANGES.rst README.rst
+ sphinx-lint doc CHANGES.rst README.rst
sphinx-build -b html -aEnqW doc doc/_build/html
rst2html.py --strict README.rst doc/_build/trash
- sphinx-build -b html -b linkcheck -aEnq doc doc/_build/html
@@ -96,7 +97,7 @@
# If this command fails, see the comment at the top of doc/cmd.rst
python -m cogapp -cP --check --verbosity=1 doc/*.rst
python -m cogapp -cP --check --verbosity=1 .github/workflows/*.yml
- python -m pylint --notes= --ignore-paths 'doc/_build/.*' {env:LINTABLE}
+ python -m pylint -j 0 --notes= --ignore-paths 'doc/_build/.*' {env:LINTABLE}
check-manifest --ignore 'doc/sample_html/*,.treerc'
# If 'build -q' becomes a thing (https://github.com/pypa/build/issues/188),
# this can be simplified:
@@ -128,4 +129,5 @@
3.11 = py311
3.12 = py312
3.13 = py313
+ 3.14 = py314
pypy-3 = pypy3
Script 'mail_helper' called by obssrc
Hello community,
here is the log from the commit of package python-apache-libcloud for openSUSE:Factory checked in at 2024-06-07 15:02:06
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/python-apache-libcloud (Old)
and /work/SRC/openSUSE:Factory/.python-apache-libcloud.new.24587 (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Package is "python-apache-libcloud"
Fri Jun 7 15:02:06 2024 rev:46 rq:1178871 version:3.8.0
Changes:
--------
--- /work/SRC/openSUSE:Factory/python-apache-libcloud/python-apache-libcloud.changes 2023-10-02 20:04:12.445930083 +0200
+++ /work/SRC/openSUSE:Factory/.python-apache-libcloud.new.24587/python-apache-libcloud.changes 2024-06-07 15:02:11.161191893 +0200
@@ -1,0 +2,10 @@
+Thu Jun 6 02:51:26 UTC 2024 - Steve Kowalik <steven.kowalik(a)suse.com>
+
+- Switch to pyproject and autosetup macros.
+- No more greedy globs in %files.
+- Tidy up {Build,}Requires on typing and lxml.
+- Stop skipping a bunch of tests, requests-mock has been fixed.
+- Add patches support-pytest-8.patch, support-pytest-8.2.patch:
+ * Support running under pytest 8.2.x.
+
+-------------------------------------------------------------------
New:
----
support-pytest-8.2.patch
support-pytest-8.patch
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Other differences:
------------------
++++++ python-apache-libcloud.spec ++++++
--- /var/tmp/diff_new_pack.3Tgl2U/_old 2024-06-07 15:02:13.809288362 +0200
+++ /var/tmp/diff_new_pack.3Tgl2U/_new 2024-06-07 15:02:13.821288799 +0200
@@ -1,7 +1,7 @@
#
# spec file for package python-apache-libcloud
#
-# Copyright (c) 2023 SUSE LLC
+# Copyright (c) 2024 SUSE LLC
#
# All modifications and additions to the file contributed by third parties
# remain the property of their copyright owners, unless otherwise agreed
@@ -22,7 +22,6 @@
Release: 0
Summary: Abstraction over multiple cloud provider APIs
License: Apache-2.0
-Group: Development/Languages/Python
URL: https://libcloud.apache.org
Source0: https://downloads.apache.org/libcloud/apache-libcloud-%{version}.tar.gz
Source1: https://downloads.apache.org/libcloud/apache-libcloud-%{version}.tar.gz.asc
@@ -30,22 +29,24 @@
Source2: https://www.apache.org/dist/libcloud/KEYS#/%{name}.keyring
Patch1: gce_image_projects.patch
Patch2: ec2_create_node.patch
+# PATCH-FIX-UPSTREAM gh#apache/libcloud#1994
+Patch3: support-pytest-8.patch
+# PATCH-FIX-UPSTREAM gh#apache/libcloud#2014
+Patch4: support-pytest-8.2.patch
BuildRequires: %{python_module base >= 3.7}
BuildRequires: %{python_module fasteners}
BuildRequires: %{python_module libvirt-python}
-BuildRequires: %{python_module lxml}
BuildRequires: %{python_module paramiko}
+BuildRequires: %{python_module pip}
BuildRequires: %{python_module pyOpenSSL}
BuildRequires: %{python_module pytest}
BuildRequires: %{python_module requests-mock}
BuildRequires: %{python_module setuptools}
-BuildRequires: %{python_module typing}
+BuildRequires: %{python_module wheel}
BuildRequires: %{python_module xml}
BuildRequires: fdupes
BuildRequires: python-rpm-macros
-Requires: python-lxml
Requires: python-requests
-Requires: python-typing
Suggests: python-libvirt-python
Suggests: python-fastners
Suggests: python-paramiko
@@ -58,18 +59,17 @@
differences among multiple cloud provider APIs.
%prep
-%setup -q -n apache-libcloud-%{version}
-%autopatch -p1
+%autosetup -p1 -n apache-libcloud-%{version}
sed -i '/^#!/d' demos/gce_demo.py
chmod a-x demos/gce_demo.py
# Setup tests
cp libcloud/test/secrets.py-dist libcloud/test/secrets.py
%build
-%python_build
+%pyproject_wheel
%install
-%python_install
+%pyproject_install
find %{buildroot} -name '*.DS_Store' -delete
find %{buildroot} -name '*.json' -size 0 -delete
find %{buildroot} -name '*.pem' -size 0 -delete
@@ -92,25 +92,15 @@
donttest+=" or test_connection_timeout_raised"
donttest+=" or test_retry_on_all_default_retry_exception_classes"
-# Skip tests broken because requests-mock incompatibility with urllib3 >= 2.0.0
-# gh#jamielennox/requests-mock#228
-donttest+=" or test_openstack.py"
-donttest+=" or test_rackspace.py"
-donttest+=" or test_scaleway.py"
-donttest+=" or test_vcloud.py"
-donttest+=" or test_vultr_v2.py"
-donttest+=" or test_aurora.py"
-donttest+=" or test_azure_blobs.py"
-donttest+=" or test_cloudfiles.py"
-donttest+=" or test_google_storage.py"
-donttest+=" or test_oss.py"
-donttest+=" or test_ovh.py"
-donttest+=" or test_s3.py"
+# Works upstream, requires other changes made, drop after upgrade
+# from 3.8.0
+donttest+=" or test_init_once_and_debug_mode"
%pytest -k "not ($donttest)"
%files %{python_files}
%license LICENSE
%doc CHANGES.rst README.rst demos/ example_*.py
-%{python_sitelib}/*
+%{python_sitelib}/libcloud
+%{python_sitelib}/apache_libcloud-%{version}.dist-info
++++++ support-pytest-8.2.patch ++++++
From 0b69d0bf23b6c2edb1e2002f47ff2df0080e96d9 Mon Sep 17 00:00:00 2001
From: Steve Kowalik <steven(a)wedontsleep.org>
Date: Thu, 6 Jun 2024 12:25:15 +1000
Subject: [PATCH] Mark MockHttp as not for collection by pytest
pytest 8.2.0 contains a regression that will collect non-test classes,
so as to be explicit about it, mark MockHttp (and therefore all of its
children classes) as not to be collected.
---
libcloud/test/__init__.py | 2 ++
1 file changed, 2 insertions(+)
diff --git a/libcloud/test/__init__.py b/libcloud/test/__init__.py
index d45c82c84d..d0da40c74a 100644
--- a/libcloud/test/__init__.py
+++ b/libcloud/test/__init__.py
@@ -97,6 +97,8 @@ class MockHttp(LibcloudConnection):
(int status, str body, dict headers, str reason)
"""
+ # pytest may collect this class, and we don't need or want that
+ __test__ = False
type = None
use_param = None # will use this param to namespace the request function
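The `__test__ = False` attribute added above is pytest's documented opt-out: classes carrying it are skipped during collection even if their name would otherwise match. A minimal sketch of the idea — note that `would_collect` is a simplified stand-in for illustration, not pytest's actual collection logic:

```python
class TestLikeHelper:
    # Prevent pytest from collecting this helper class even though its
    # name matches the default `Test*` collection pattern.
    __test__ = False


def would_collect(cls):
    # Crude approximation of pytest's class-collection rule: the name
    # must match the configured pattern, and `__test__` (default True)
    # must not be set to a false value.
    return cls.__name__.startswith("Test") and bool(getattr(cls, "__test__", True))


helper_collected = would_collect(TestLikeHelper)
```

Because `MockHttp` sets the attribute on the base class, every subclass (like `DigitalOceanCommonMockHttp` below) inherits the opt-out automatically.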
++++++ support-pytest-8.patch ++++++
From 6a646d0c3fd3c1d33183bb338235aaa43f86d1f8 Mon Sep 17 00:00:00 2001
From: "dependabot[bot]" <49699333+dependabot[bot](a)users.noreply.github.com>
Date: Mon, 26 Feb 2024 17:44:37 +0000
Subject: [PATCH 01/15] Bump pytest from 7.4.0 to 8.0.2
Bumps [pytest](https://github.com/pytest-dev/pytest) from 7.4.0 to 8.0.2.
- [Release notes](https://github.com/pytest-dev/pytest/releases)
- [Changelog](https://github.com/pytest-dev/pytest/blob/main/CHANGELOG.rst)
- [Commits](https://github.com/pytest-dev/pytest/compare/7.4.0...8.0.2)
---
updated-dependencies:
- dependency-name: pytest
dependency-type: direct:production
update-type: version-update:semver-major
...
Signed-off-by: dependabot[bot] <support(a)github.com>
---
requirements-tests.txt | 2 +-
1 file changed, 1 insertion(+), 1 deletion(-)
Index: apache-libcloud-3.8.0/requirements-tests.txt
===================================================================
--- apache-libcloud-3.8.0.orig/requirements-tests.txt
+++ apache-libcloud-3.8.0/requirements-tests.txt
@@ -1,7 +1,7 @@
coverage[toml]==7.2.7; python_version >= '3.8'
requests>=2.31.0
requests_mock==1.11.0
-pytest==7.4.0
+pytest==8.0.2
pytest-xdist==3.3.1
pytest-timeout==2.1.0
pytest-benchmark[histogram]==4.0.0
Index: apache-libcloud-3.8.0/tox.ini
===================================================================
--- apache-libcloud-3.8.0.orig/tox.ini
+++ apache-libcloud-3.8.0/tox.ini
@@ -39,8 +39,11 @@ setenv =
# To avoid per-test function process safety issues we run all tests in a single
# file in the same worker process.
# for pytest-xdist, we want to distribute tests by file aka --dist loadfile
+# Tests which are not safe to run in parallel are marked with "serial" tag
+# and run separately at the end
commands = cp libcloud/test/secrets.py-dist libcloud/test/secrets.py
- pytest --color=yes -rsx -vvv --capture=tee-sys -o log_cli=True --durations=10 --timeout=15 -n auto --dist loadfile --ignore libcloud/test/benchmarks/ --ignore-glob "*test_list_objects_filtering_performance*"
+ pytest --color=yes -rsx -vvv --capture=tee-sys -o log_cli=True --durations=10 --timeout=15 -n auto --dist loadfile --ignore libcloud/test/benchmarks/ --ignore-glob "*test_list_objects_filtering_performance*" -m "not serial"
+ pytest --color=yes -rsx -vvv --capture=tee-sys -o log_cli=True --durations=10 --timeout=15 --ignore libcloud/test/benchmarks/ --ignore-glob "*test_list_objects_filtering_performance*" -m "serial"
[testenv:py3.6-dist]
# Verify library installs without any dependencies when using python setup.py
Index: apache-libcloud-3.8.0/libcloud/test/test_init.py
===================================================================
--- apache-libcloud-3.8.0.orig/libcloud/test/test_init.py
+++ apache-libcloud-3.8.0/libcloud/test/test_init.py
@@ -19,6 +19,8 @@ import logging
import tempfile
from unittest.mock import patch
+import pytest
+
import libcloud
from libcloud import _init_once
from libcloud.base import DriverTypeNotFoundError
@@ -38,6 +40,7 @@ class TestUtils(unittest.TestCase):
if "LIBCLOUD_DEBUG" in os.environ:
del os.environ["LIBCLOUD_DEBUG"]
+ @pytest.mark.serial
def test_init_once_and_debug_mode(self):
if have_paramiko:
paramiko_logger = logging.getLogger("paramiko")
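The tox.ini hunk above splits the suite into two invocations: `-m "not serial"` under `pytest-xdist`, then `-m "serial"` in a single process. A sketch of that selection split — the test names and the `select` helper here are hand-rolled illustrations, not pytest internals:

```python
# Map each test to its set of marker names; only one test carries "serial",
# mirroring test_init_once_and_debug_mode in the patch above.
tests = {
    "test_init_once_and_debug_mode": {"serial"},
    "test_list_nodes": set(),
    "test_authentication": set(),
}


def select(tests, marker, negate):
    # Simplified stand-in for pytest's -m "serial" / -m "not serial":
    # keep a test when its marker membership (dis)agrees with `negate`.
    return sorted(
        name for name, marks in tests.items()
        if (marker in marks) != negate
    )


parallel_run = select(tests, "serial", negate=True)   # the -n auto run
serial_run = select(tests, "serial", negate=False)    # the follow-up run
```

Splitting this way keeps process-global state (like the logging setup that test touches) out of the parallel workers.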
Index: apache-libcloud-3.8.0/libcloud/test/common/test_digitalocean_v2.py
===================================================================
--- apache-libcloud-3.8.0.orig/libcloud/test/common/test_digitalocean_v2.py
+++ apache-libcloud-3.8.0/libcloud/test/common/test_digitalocean_v2.py
@@ -15,22 +15,28 @@
import sys
import unittest
+from libcloud.http import LibcloudConnection
from libcloud.test import MockHttp, LibcloudTestCase
from libcloud.utils.py3 import httplib
from libcloud.common.types import InvalidCredsError
from libcloud.test.secrets import DIGITALOCEAN_v2_PARAMS
from libcloud.test.file_fixtures import FileFixtures
-from libcloud.common.digitalocean import DigitalOceanBaseDriver
+from libcloud.common.digitalocean import DigitalOceanBaseDriver, DigitalOcean_v2_BaseDriver
class DigitalOceanTests(LibcloudTestCase):
def setUp(self):
- DigitalOceanBaseDriver.connectionCls.conn_class = DigitalOceanMockHttp
- DigitalOceanMockHttp.type = None
+ DigitalOceanBaseDriver.connectionCls.conn_class = DigitalOceanCommonMockHttp
+ DigitalOcean_v2_BaseDriver.connectionCls.conn_class = DigitalOceanCommonMockHttp
+ DigitalOceanCommonMockHttp.type = None
self.driver = DigitalOceanBaseDriver(*DIGITALOCEAN_v2_PARAMS)
+ def tearDown(self):
+ LibcloudConnection.type = None
+ DigitalOceanCommonMockHttp.type = None
+
def test_authentication(self):
- DigitalOceanMockHttp.type = "UNAUTHORIZED"
+ DigitalOceanCommonMockHttp.type = "UNAUTHORIZED"
self.assertRaises(InvalidCredsError, self.driver.ex_account_info)
def test_ex_account_info(self):
@@ -51,13 +57,13 @@ class DigitalOceanTests(LibcloudTestCase
self.assertEqual(action["type"], "power_on")
def test__paginated_request(self):
- DigitalOceanMockHttp.type = "page_1"
+ DigitalOceanCommonMockHttp.type = "page_1"
actions = self.driver._paginated_request("/v2/actions", "actions")
self.assertEqual(actions[0]["id"], 12345671)
self.assertEqual(actions[0]["status"], "completed")
-class DigitalOceanMockHttp(MockHttp):
+class DigitalOceanCommonMockHttp(MockHttp):
fixtures = FileFixtures("common", "digitalocean")
response = {
Index: apache-libcloud-3.8.0/libcloud/test/compute/test_digitalocean_v2.py
===================================================================
--- apache-libcloud-3.8.0.orig/libcloud/test/compute/test_digitalocean_v2.py
+++ apache-libcloud-3.8.0/libcloud/test/compute/test_digitalocean_v2.py
@@ -16,6 +16,7 @@ import sys
import unittest
from datetime import datetime
+from libcloud.http import LibcloudConnection
from libcloud.test import MockHttp, LibcloudTestCase
from libcloud.utils.py3 import httplib, assertRaisesRegex
from libcloud.common.types import InvalidCredsError
@@ -23,8 +24,15 @@ from libcloud.compute.base import NodeIm
from libcloud.test.secrets import DIGITALOCEAN_v1_PARAMS, DIGITALOCEAN_v2_PARAMS
from libcloud.utils.iso8601 import UTC
from libcloud.test.file_fixtures import ComputeFileFixtures
-from libcloud.common.digitalocean import DigitalOcean_v1_Error
-from libcloud.compute.drivers.digitalocean import DigitalOceanNodeDriver
+from libcloud.common.digitalocean import (
+ DigitalOcean_v1_Error,
+ DigitalOceanBaseDriver,
+ DigitalOcean_v2_BaseDriver,
+)
+from libcloud.compute.drivers.digitalocean import (
+ DigitalOceanNodeDriver,
+ DigitalOcean_v2_NodeDriver,
+)
try:
import simplejson as json
@@ -35,10 +43,17 @@ except ImportError:
# class DigitalOceanTests(unittest.TestCase, TestCaseMixin):
class DigitalOcean_v2_Tests(LibcloudTestCase):
def setUp(self):
- DigitalOceanNodeDriver.connectionCls.conn_class = DigitalOceanMockHttp
- DigitalOceanMockHttp.type = None
+ DigitalOceanBaseDriver.connectionCls.conn_class = DigitalOceanComputeMockHttp
+ DigitalOcean_v2_BaseDriver.connectionCls.conn_class = DigitalOceanComputeMockHttp
+ DigitalOceanNodeDriver.connectionCls.conn_class = DigitalOceanComputeMockHttp
+ DigitalOcean_v2_NodeDriver.connectionCls.conn_class = DigitalOceanComputeMockHttp
+ DigitalOceanComputeMockHttp.type = None
self.driver = DigitalOceanNodeDriver(*DIGITALOCEAN_v2_PARAMS)
+ def tearDown(self):
+ LibcloudConnection.type = None
+ DigitalOceanComputeMockHttp.type = None
+
def test_v1_Error(self):
self.assertRaises(
DigitalOcean_v1_Error,
@@ -56,7 +71,7 @@ class DigitalOcean_v2_Tests(LibcloudTest
)
def test_authentication(self):
- DigitalOceanMockHttp.type = "UNAUTHORIZED"
+ DigitalOceanComputeMockHttp.type = "UNAUTHORIZED"
self.assertRaises(InvalidCredsError, self.driver.list_nodes)
def test_list_images_success(self):
@@ -128,7 +143,7 @@ class DigitalOcean_v2_Tests(LibcloudTest
size = self.driver.list_sizes()[0]
location = self.driver.list_locations()[0]
- DigitalOceanMockHttp.type = "INVALID_IMAGE"
+ DigitalOceanComputeMockHttp.type = "INVALID_IMAGE"
expected_msg = (
r"You specified an invalid image for Droplet creation."
+ r" \(code: (404|HTTPStatus.NOT_FOUND)\)"
@@ -146,13 +161,13 @@ class DigitalOcean_v2_Tests(LibcloudTest
def test_reboot_node_success(self):
node = self.driver.list_nodes()[0]
- DigitalOceanMockHttp.type = "REBOOT"
+ DigitalOceanComputeMockHttp.type = "REBOOT"
result = self.driver.reboot_node(node)
self.assertTrue(result)
def test_create_image_success(self):
node = self.driver.list_nodes()[0]
- DigitalOceanMockHttp.type = "SNAPSHOT"
+ DigitalOceanComputeMockHttp.type = "SNAPSHOT"
result = self.driver.create_image(node, "My snapshot")
self.assertTrue(result)
@@ -164,62 +179,62 @@ class DigitalOcean_v2_Tests(LibcloudTest
def test_delete_image_success(self):
image = self.driver.get_image(12345)
- DigitalOceanMockHttp.type = "DESTROY"
+ DigitalOceanComputeMockHttp.type = "DESTROY"
result = self.driver.delete_image(image)
self.assertTrue(result)
def test_ex_power_on_node_success(self):
node = self.driver.list_nodes()[0]
- DigitalOceanMockHttp.type = "POWERON"
+ DigitalOceanComputeMockHttp.type = "POWERON"
result = self.driver.ex_power_on_node(node)
self.assertTrue(result)
def test_ex_shutdown_node_success(self):
node = self.driver.list_nodes()[0]
- DigitalOceanMockHttp.type = "SHUTDOWN"
+ DigitalOceanComputeMockHttp.type = "SHUTDOWN"
result = self.driver.ex_shutdown_node(node)
self.assertTrue(result)
def test_ex_hard_reboot_success(self):
node = self.driver.list_nodes()[0]
- DigitalOceanMockHttp.type = "POWERCYCLE"
+ DigitalOceanComputeMockHttp.type = "POWERCYCLE"
result = self.driver.ex_hard_reboot(node)
self.assertTrue(result)
def test_ex_rebuild_node_success(self):
node = self.driver.list_nodes()[0]
- DigitalOceanMockHttp.type = "REBUILD"
+ DigitalOceanComputeMockHttp.type = "REBUILD"
result = self.driver.ex_rebuild_node(node)
self.assertTrue(result)
def test_ex_resize_node_success(self):
node = self.driver.list_nodes()[0]
size = self.driver.list_sizes()[0]
- DigitalOceanMockHttp.type = "RESIZE"
+ DigitalOceanComputeMockHttp.type = "RESIZE"
result = self.driver.ex_resize_node(node, size)
self.assertTrue(result)
def test_destroy_node_success(self):
node = self.driver.list_nodes()[0]
- DigitalOceanMockHttp.type = "DESTROY"
+ DigitalOceanComputeMockHttp.type = "DESTROY"
result = self.driver.destroy_node(node)
self.assertTrue(result)
def test_ex_change_kernel_success(self):
node = self.driver.list_nodes()[0]
- DigitalOceanMockHttp.type = "KERNELCHANGE"
+ DigitalOceanComputeMockHttp.type = "KERNELCHANGE"
result = self.driver.ex_change_kernel(node, 7515)
self.assertTrue(result)
def test_ex_enable_ipv6_success(self):
node = self.driver.list_nodes()[0]
- DigitalOceanMockHttp.type = "ENABLEIPV6"
+ DigitalOceanComputeMockHttp.type = "ENABLEIPV6"
result = self.driver.ex_enable_ipv6(node)
self.assertTrue(result)
def test_ex_rename_node_success(self):
node = self.driver.list_nodes()[0]
- DigitalOceanMockHttp.type = "RENAME"
+ DigitalOceanComputeMockHttp.type = "RENAME"
result = self.driver.ex_rename_node(node, "fedora helios")
self.assertTrue(result)
@@ -231,7 +246,7 @@ class DigitalOcean_v2_Tests(LibcloudTest
self.assertEqual(keys[0].public_key, "ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAAAQQDGk5 example")
def test_create_key_pair(self):
- DigitalOceanMockHttp.type = "CREATE"
+ DigitalOceanComputeMockHttp.type = "CREATE"
key = self.driver.create_key_pair(
name="test1", public_key="ssh-rsa AAAAB3NzaC1yc2EAAAADAQsxRiUKn example"
)
@@ -250,7 +265,7 @@ class DigitalOcean_v2_Tests(LibcloudTest
self.assertEqual(nodes[0]["size_slug"], "s-1vcpu-1gb")
def test__paginated_request_two_pages(self):
- DigitalOceanMockHttp.type = "PAGE_ONE"
+ DigitalOceanComputeMockHttp.type = "PAGE_ONE"
nodes = self.driver._paginated_request("/v2/droplets", "droplets")
self.assertEqual(len(nodes), 2)
@@ -264,13 +279,13 @@ class DigitalOcean_v2_Tests(LibcloudTest
self.assertEqual(volume.driver, self.driver)
def test_list_volumes_empty(self):
- DigitalOceanMockHttp.type = "EMPTY"
+ DigitalOceanComputeMockHttp.type = "EMPTY"
volumes = self.driver.list_volumes()
self.assertEqual(len(volumes), 0)
def test_create_volume(self):
nyc1 = [r for r in self.driver.list_locations() if r.id == "nyc1"][0]
- DigitalOceanMockHttp.type = "CREATE"
+ DigitalOceanComputeMockHttp.type = "CREATE"
volume = self.driver.create_volume(4, "example", nyc1)
self.assertEqual(volume.id, "62766883-2c28-11e6-b8e6-000f53306ae1")
self.assertEqual(volume.name, "example")
@@ -280,19 +295,19 @@ class DigitalOcean_v2_Tests(LibcloudTest
def test_attach_volume(self):
node = self.driver.list_nodes()[0]
volume = self.driver.list_volumes()[0]
- DigitalOceanMockHttp.type = "ATTACH"
+ DigitalOceanComputeMockHttp.type = "ATTACH"
resp = self.driver.attach_volume(node, volume)
self.assertTrue(resp)
def test_detach_volume(self):
volume = self.driver.list_volumes()[0]
- DigitalOceanMockHttp.type = "DETACH"
+ DigitalOceanComputeMockHttp.type = "DETACH"
resp = self.driver.detach_volume(volume)
self.assertTrue(resp)
def test_destroy_volume(self):
volume = self.driver.list_volumes()[0]
- DigitalOceanMockHttp.type = "DESTROY"
+ DigitalOceanComputeMockHttp.type = "DESTROY"
resp = self.driver.destroy_volume(volume)
self.assertTrue(resp)
@@ -307,7 +322,7 @@ class DigitalOcean_v2_Tests(LibcloudTest
def test_create_volume_snapshot(self):
volume = self.driver.list_volumes()[0]
- DigitalOceanMockHttp.type = "CREATE"
+ DigitalOceanComputeMockHttp.type = "CREATE"
snapshot = self.driver.create_volume_snapshot(volume, "test-snapshot")
self.assertEqual(snapshot.id, "c0def940-9324-11e6-9a56-000f533176b1")
self.assertEqual(snapshot.name, "test-snapshot")
@@ -316,7 +331,7 @@ class DigitalOcean_v2_Tests(LibcloudTest
def test_delete_volume_snapshot(self):
volume = self.driver.list_volumes()[0]
snapshot = self.driver.list_volume_snapshots(volume)[0]
- DigitalOceanMockHttp.type = "DELETE"
+ DigitalOceanComputeMockHttp.type = "DELETE"
result = self.driver.delete_volume_snapshot(snapshot)
self.assertTrue(result)
@@ -396,7 +411,7 @@ class DigitalOcean_v2_Tests(LibcloudTest
self.assertTrue(ret)
-class DigitalOceanMockHttp(MockHttp):
+class DigitalOceanComputeMockHttp(MockHttp):
fixtures = ComputeFileFixtures("digitalocean_v2")
def _v2_regions(self, method, url, body, headers):
Index: apache-libcloud-3.8.0/libcloud/test/dns/test_digitalocean.py
===================================================================
--- apache-libcloud-3.8.0.orig/libcloud/test/dns/test_digitalocean.py
+++ apache-libcloud-3.8.0/libcloud/test/dns/test_digitalocean.py
@@ -15,20 +15,28 @@
import sys
import unittest
+from libcloud.http import LibcloudConnection
from libcloud.test import MockHttp, LibcloudTestCase
from libcloud.dns.types import RecordType
from libcloud.utils.py3 import httplib
from libcloud.test.secrets import DIGITALOCEAN_v2_PARAMS
from libcloud.test.file_fixtures import DNSFileFixtures
+from libcloud.common.digitalocean import DigitalOceanBaseDriver, DigitalOcean_v2_BaseDriver
from libcloud.dns.drivers.digitalocean import DigitalOceanDNSDriver
class DigitalOceanDNSTests(LibcloudTestCase):
def setUp(self):
+ DigitalOceanBaseDriver.connectionCls.conn_class = DigitalOceanDNSMockHttp
+ DigitalOcean_v2_BaseDriver.connectionCls.conn_class = DigitalOceanDNSMockHttp
DigitalOceanDNSDriver.connectionCls.conn_class = DigitalOceanDNSMockHttp
DigitalOceanDNSMockHttp.type = None
self.driver = DigitalOceanDNSDriver(*DIGITALOCEAN_v2_PARAMS)
+ def tearDown(self):
+ LibcloudConnection.type = None
+ DigitalOceanDNSMockHttp.type = None
+
def test_list_zones(self):
zones = self.driver.list_zones()
self.assertTrue(len(zones) >= 1)
Index: apache-libcloud-3.8.0/libcloud/test/__init__.py
===================================================================
--- apache-libcloud-3.8.0.orig/libcloud/test/__init__.py
+++ apache-libcloud-3.8.0/libcloud/test/__init__.py
@@ -205,7 +205,7 @@ class MockHttp(LibcloudConnection):
) # Python 3.7 no longer quotes ~
if type:
- meth_name = "{}_{}".format(meth_name, self.type)
+ meth_name = "{}_{}".format(meth_name, type)
if use_param and use_param in qs:
param = qs[use_param][0].replace(".", "_").replace("-", "_")
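The one-character fix in the hunk above (`self.type` → `type`) is easy to miss, so here is a minimal standalone sketch, with hypothetical names, of why it matters: the `type` attribute lives on the mock *class*, so any test that assigns it mutates the value seen by every in-flight request. Snapshotting it into a local variable and using the snapshot throughout keeps a single dispatch consistent even if the class attribute is reassigned mid-request.

```python
class FakeMockHttp:
    # Class-level attribute: shared by all instances and subclasses,
    # and mutated freely by tests (e.g. FakeMockHttp.type = "REBOOT").
    type = None

    def dispatch(self, meth_name):
        # Snapshot once, up front -- mirroring the patched MockHttp code.
        type_ = self.type
        if type_:
            # Using the snapshot (not self.type) means a concurrent
            # reassignment of FakeMockHttp.type by another test cannot
            # change the fixture method name halfway through building it.
            meth_name = "{}_{}".format(meth_name, type_)
        return meth_name
```

For example, `FakeMockHttp().dispatch("_v2_account")` resolves to `_v2_account_UNAUTHORIZED` while `FakeMockHttp.type` is set, and back to `_v2_account` once a `tearDown` resets the attribute to `None`.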
Index: apache-libcloud-3.8.0/CHANGES.rst
===================================================================
--- apache-libcloud-3.8.0.orig/CHANGES.rst
+++ apache-libcloud-3.8.0/CHANGES.rst
@@ -1,6 +1,22 @@
Changelog
=========
+Other / Development
+~~~~~~~~~~~~~~~~~~~
+
+- pytest library used for running tests and microbenchmarks has been upgraded to
+ v8.1.
+
+ Changes in the pytest test discovery and collection mechanism and ordering
+ have uncovered some race conditions and cross-test pollution, which have
+ been addressed.
+
+ All tests now pass, but it is possible that some race conditions are still
+ hiding and may only pop up in the future (since we run tests in parallel
+ and the order in which they run is not fully deterministic).
+ (#1994)
+ [Tomaz Muraus - @Kami]
+
Changes in Apache Libcloud 3.8.0
--------------------------------
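The `setUp`/`tearDown` changes applied across the test files in this patch all follow one pattern: because several driver classes share connection machinery through a common base class, every class in the hierarchy must be pointed at the same mock, and the mock's class-level state must be reset after each test. A simplified sketch, using invented stand-in names rather than the real libcloud classes:

```python
import unittest


class BaseDriverConnection:
    conn_class = None  # swapped out for a mock in tests


class BaseDriver:
    connectionCls = BaseDriverConnection


class NodeDriver(BaseDriver):
    pass


class ComputeMockHttp:
    # Class attribute: shared across every test that touches this mock,
    # so it must be reset explicitly or it leaks into later tests.
    type = None


class ExampleTests(unittest.TestCase):
    def setUp(self):
        # Patch the whole hierarchy, not just the leaf class, so requests
        # routed through base-class connections also hit the mock.
        BaseDriver.connectionCls.conn_class = ComputeMockHttp
        NodeDriver.connectionCls.conn_class = ComputeMockHttp
        ComputeMockHttp.type = None

    def tearDown(self):
        # Reset shared class-level state so the next test -- possibly in a
        # different file, possibly run in parallel -- starts clean.
        ComputeMockHttp.type = None

    def test_mock_is_wired(self):
        self.assertIs(NodeDriver.connectionCls.conn_class, ComputeMockHttp)
```

This is why the patch renames each suite's mock (`DigitalOceanCommonMockHttp`, `DigitalOceanComputeMockHttp`, `DigitalOceanDNSMockHttp`): distinct class objects per suite mean one suite's leftover `type` can never be read by another.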