openSUSE Commits
July 2016
Hello community,
here is the log from the commit of package extreme-tuxracer for openSUSE:Factory checked in at 2016-07-28 23:46:58
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/extreme-tuxracer (Old)
and /work/SRC/openSUSE:Factory/.extreme-tuxracer.new (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Package is "extreme-tuxracer"
Changes:
--------
--- /work/SRC/openSUSE:Factory/extreme-tuxracer/extreme-tuxracer.changes 2016-04-22 16:24:46.000000000 +0200
+++ /work/SRC/openSUSE:Factory/.extreme-tuxracer.new/extreme-tuxracer.changes 2016-07-28 23:47:06.000000000 +0200
@@ -1,0 +2,7 @@
+Sun Jul 24 16:05:36 UTC 2016 - nemysis(a)openSUSE.org
+
+- Update to 0.7.3, please see
+
+ /usr/share/doc/packages/extreme-tuxracer/NEWS
+
+-------------------------------------------------------------------
Old:
----
etr-0.7.2.tar.xz
New:
----
etr-0.7.3.tar.xz
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Other differences:
------------------
++++++ extreme-tuxracer.spec ++++++
--- /var/tmp/diff_new_pack.kivWzi/_old 2016-07-28 23:47:09.000000000 +0200
+++ /var/tmp/diff_new_pack.kivWzi/_new 2016-07-28 23:47:09.000000000 +0200
@@ -19,7 +19,7 @@
%define oname etr
Name: extreme-tuxracer
-Version: 0.7.2
+Version: 0.7.3
Release: 0
Summary: Open source racing game featuring Tux the Linux Penguin
License: GPL-2.0+
++++++ etr-0.7.2.tar.xz -> etr-0.7.3.tar.xz ++++++
/work/SRC/openSUSE:Factory/extreme-tuxracer/etr-0.7.2.tar.xz /work/SRC/openSUSE:Factory/.extreme-tuxracer.new/etr-0.7.3.tar.xz differ: char 27, line 1
Hello community,
here is the log from the commit of package seccheck for openSUSE:Factory checked in at 2016-07-28 23:46:51
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/seccheck (Old)
and /work/SRC/openSUSE:Factory/.seccheck.new (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Package is "seccheck"
Changes:
--------
--- /work/SRC/openSUSE:Factory/seccheck/seccheck.changes 2015-10-30 13:43:52.000000000 +0100
+++ /work/SRC/openSUSE:Factory/.seccheck.new/seccheck.changes 2016-07-28 23:46:53.000000000 +0200
@@ -1,0 +2,5 @@
+Sun Jul 24 21:10:23 UTC 2016 - vpereira(a)suse.com
+
+- fixed bnc#985802: security monthly reports were broken
+
+-------------------------------------------------------------------
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Other differences:
------------------
++++++ seccheck.spec ++++++
--- /var/tmp/diff_new_pack.jBMex3/_old 2016-07-28 23:46:54.000000000 +0200
+++ /var/tmp/diff_new_pack.jBMex3/_new 2016-07-28 23:46:54.000000000 +0200
@@ -1,7 +1,7 @@
#
# spec file for package seccheck
#
-# Copyright (c) 2015 SUSE LINUX GmbH, Nuernberg, Germany.
+# Copyright (c) 2016 SUSE LINUX GmbH, Nuernberg, Germany.
#
# All modifications and additions to the file contributed by third parties
# remain the property of their copyright owners, unless otherwise agreed
++++++ master.zip ++++++
Files /var/tmp/diff_new_pack.jBMex3/_old and /var/tmp/diff_new_pack.jBMex3/_new differ
Hello community,
here is the log from the commit of package wine for openSUSE:Factory checked in at 2016-07-28 23:46:49
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/wine (Old)
and /work/SRC/openSUSE:Factory/.wine.new (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Package is "wine"
Changes:
--------
--- /work/SRC/openSUSE:Factory/wine/wine.changes 2016-07-01 10:00:05.000000000 +0200
+++ /work/SRC/openSUSE:Factory/.wine.new/wine.changes 2016-07-28 23:46:51.000000000 +0200
@@ -1,0 +2,18 @@
+Sat Jul 23 19:31:42 UTC 2016 - meissner(a)suse.com
+
+- Updated to 1.9.15 development snapshot
+ - More shader instructions in Direct3D.
+ - Performance improvements in GDI.
+ - Better multi-joystick support on macOS.
+ - Active Scripting improvements.
+ - Additional stream support in the C++ runtime.
+ - Various bug fixes.
+- Updated to 1.9.14 development snapshot
+ - More Shader Model 5 support in Direct3D.
+ - Some more write support in WebServices.
+ - Performance improvements in GDI.
+ - Some more progress towards the Direct3D command stream.
+ - Various bug fixes.
+- updated winetricks
+
+-------------------------------------------------------------------
Old:
----
wine-1.9.13.tar.bz2
wine-1.9.13.tar.bz2.sign
New:
----
wine-1.9.15.tar.bz2
wine-1.9.15.tar.bz2.sign
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Other differences:
------------------
++++++ wine.spec ++++++
--- /var/tmp/diff_new_pack.YmK8Ei/_old 2016-07-28 23:46:52.000000000 +0200
+++ /var/tmp/diff_new_pack.YmK8Ei/_new 2016-07-28 23:46:52.000000000 +0200
@@ -53,8 +53,8 @@
BuildRequires: update-desktop-files
BuildRequires: valgrind-devel
BuildRequires: xorg-x11-devel
-%define realver 1.9.13
-Version: 1.9.13
+%define realver 1.9.15
+Version: 1.9.15
Release: 0
Summary: An MS Windows Emulator
License: LGPL-2.1+
++++++ wine-1.9.13.tar.bz2 -> wine-1.9.15.tar.bz2 ++++++
/work/SRC/openSUSE:Factory/wine/wine-1.9.13.tar.bz2 /work/SRC/openSUSE:Factory/.wine.new/wine-1.9.15.tar.bz2 differ: char 11, line 1
++++++ winetricks ++++++
++++ 1342 lines (skipped)
++++ between /work/SRC/openSUSE:Factory/wine/winetricks
++++ and /work/SRC/openSUSE:Factory/.wine.new/winetricks
++++++ winetricks.1 ++++++
--- /var/tmp/diff_new_pack.YmK8Ei/_old 2016-07-28 23:46:53.000000000 +0200
+++ /var/tmp/diff_new_pack.YmK8Ei/_new 2016-07-28 23:46:53.000000000 +0200
@@ -1,5 +1,5 @@
.\" -*- nroff -*-
-.TH WINETRICKS 1 "June 2016" "Winetricks 20160622" "Wine Package Manager"
+.TH WINETRICKS 1 "June 2016" "Winetricks 20160628" "Wine Package Manager"
.SH NAME
winetricks \- manage virtual Windows environments using Wine
.SH SYNOPSIS
Hello community,
here is the log from the commit of package meson for openSUSE:Factory checked in at 2016-07-28 23:46:47
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/meson (Old)
and /work/SRC/openSUSE:Factory/.meson.new (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Package is "meson"
Changes:
--------
--- /work/SRC/openSUSE:Factory/meson/meson.changes 2016-05-17 17:16:20.000000000 +0200
+++ /work/SRC/openSUSE:Factory/.meson.new/meson.changes 2016-07-28 23:46:49.000000000 +0200
@@ -1,0 +2,7 @@
+Sat Jul 23 16:15:39 UTC 2016 - sor.alexei(a)meowr.ru
+
+- Update to version 0.32.0:
+ * No changelog available.
+- Remove meson-gui package: GUI was removed upstream.
+
+-------------------------------------------------------------------
Old:
----
meson-0.31.0.tar.gz
meson-0.31.0.tar.gz.asc
New:
----
meson-0.32.0.tar.gz
meson-0.32.0.tar.gz.asc
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Other differences:
------------------
++++++ meson.spec ++++++
--- /var/tmp/diff_new_pack.TNLVfQ/_old 2016-07-28 23:46:49.000000000 +0200
+++ /var/tmp/diff_new_pack.TNLVfQ/_new 2016-07-28 23:46:49.000000000 +0200
@@ -18,7 +18,7 @@
%define _name mesonbuild
Name: meson
-Version: 0.31.0
+Version: 0.32.0
Release: 0
Summary: High productivity build system
License: Apache-2.0
@@ -42,7 +42,7 @@
BuildRequires: mono-core
BuildRequires: mono-devel
BuildRequires: ninja
-BuildRequires: pkg-config
+BuildRequires: pkgconfig
BuildRequires: python3 >= 3.4
BuildRequires: python3-devel
BuildRequires: python3-gobject
@@ -54,24 +54,18 @@
BuildRequires: pkgconfig(protobuf)
BuildRequires: pkgconfig(zlib)
Requires: ninja
+# meson-gui was last used in openSUSE Leap 42.1.
+Provides: %{name}-gui = %{version}
+Obsoletes: %{name}-gui < %{version}
BuildArch: noarch
%description
-Meson is a build system designed to optimize programmer
+Meson is a build system designed to optimise programmer
productivity. It aims to do this by providing simple,
out-of-the-box support for modern software development tools and
practices, such as unit tests, coverage reports, Valgrind, CCache
and the like.
-%package gui
-Summary: GUI for high productivity build system
-Group: Development/Tools/Building
-Requires: %{name} = %{version}
-Requires: python3-qt5
-
-%description gui
-Graphical user interface for the high productivity build system.
-
%prep
%setup -q
@@ -90,7 +84,8 @@
for file in %{buildroot}%{_bindir}/*.py; do
mv -fv "$file" "${file%.py}"
done
-install -Dm 0644 data/macros.%{name} %{buildroot}%{_rpmconfigdir}/macros.d/macros.%{name}
+install -Dm 0644 data/macros.%{name} \
+ %{buildroot}%{_rpmconfigdir}/macros.d/macros.%{name}
%check
export MESON_PRINT_TEST_OUTPUT=1
@@ -104,9 +99,6 @@
%{_bindir}/%{name}introspect
%{_bindir}/wraptool
%{python3_sitelib}/%{_name}/
-%exclude %{python3_sitelib}/%{_name}/*.ui
-%exclude %{python3_sitelib}/%{_name}/mgui.py
-%exclude %{python3_sitelib}/%{_name}/__pycache__/mgui.*
%{python3_sitelib}/%{name}-*
%{_rpmconfigdir}/macros.d/macros.%{name}
%{_mandir}/man1/%{name}.1%{?ext_man}
@@ -114,13 +106,4 @@
%{_mandir}/man1/%{name}introspect.1%{?ext_man}
%{_mandir}/man1/wraptool.1%{?ext_man}
-%files gui
-%defattr(-,root,root)
-%doc authors.txt contributing.txt COPYING
-%{_bindir}/%{name}gui
-%{python3_sitelib}/%{_name}/*.ui
-%{python3_sitelib}/%{_name}/mgui.py
-%{python3_sitelib}/%{_name}/__pycache__/mgui.*
-%{_mandir}/man1/%{name}gui.1%{?ext_man}
-
%changelog
++++++ meson-0.31.0.tar.gz -> meson-0.32.0.tar.gz ++++++
++++ 4862 lines of diff (skipped)
Hello community,
here is the log from the commit of package python3-bsddb3 for openSUSE:Factory checked in at 2016-07-28 23:46:41
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/python3-bsddb3 (Old)
and /work/SRC/openSUSE:Factory/.python3-bsddb3.new (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Package is "python3-bsddb3"
Changes:
--------
--- /work/SRC/openSUSE:Factory/python3-bsddb3/python3-bsddb3.changes 2016-05-25 21:23:46.000000000 +0200
+++ /work/SRC/openSUSE:Factory/.python3-bsddb3.new/python3-bsddb3.changes 2016-07-28 23:46:47.000000000 +0200
@@ -1,0 +2,16 @@
+Sat Jul 23 18:09:04 UTC 2016 - arun(a)gmx.de
+
+- update to version 6.2.1:
+ * Correctly detect Berkeley DB installations in modern 64 bits
+ Debians.
+
+- changes from version 6.2.0:
+ * Support Berkeley DB 6.2.x.
+ * Declare Python 3.5 support for PyPI.
+ * Drop support for Python 3.2. If you need compatibility with that
+ version, you can keep using old releases of these bindings.
+ * Drop support for Berkeley DB 5.0, 5.2 and 6.0. If you need
+ compatibility with those versions, you can keep using old releases
+ of these bindings.
+
+-------------------------------------------------------------------
@@ -7 +22,0 @@
-
Old:
----
bsddb3-6.1.1.tar.gz
New:
----
bsddb3-6.2.1.tar.gz
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Other differences:
------------------
++++++ python3-bsddb3.spec ++++++
--- /var/tmp/diff_new_pack.3XUP9q/_old 2016-07-28 23:46:48.000000000 +0200
+++ /var/tmp/diff_new_pack.3XUP9q/_new 2016-07-28 23:46:48.000000000 +0200
@@ -17,7 +17,7 @@
Name: python3-bsddb3
-Version: 6.1.1
+Version: 6.2.1
Release: 0
Url: http://pypi.python.org/pypi/bsddb3
Summary: Python interface for Berkeley DB
++++++ bsddb3-6.1.1.tar.gz -> bsddb3-6.2.1.tar.gz ++++++
++++ 22143 lines of diff (skipped)
Hello community,
here is the log from the commit of package afl for openSUSE:Factory checked in at 2016-07-28 23:46:39
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/afl (Old)
and /work/SRC/openSUSE:Factory/.afl.new (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Package is "afl"
Changes:
--------
--- /work/SRC/openSUSE:Factory/afl/afl.changes 2016-07-01 09:59:30.000000000 +0200
+++ /work/SRC/openSUSE:Factory/.afl.new/afl.changes 2016-07-28 23:46:41.000000000 +0200
@@ -1,0 +2,15 @@
+Sat Jul 23 19:10:30 UTC 2016 - astieger(a)suse.com
+
+- afl 2.21b:
+ * Minor UI fixes
+- includes changes from 2.20b:
+ * Revamp handling of variable paths
+ * Stability improvements
+ * Include current input bitmap density in UI
+ * Add experimental support for parallelizing -M.
+- includes changes from 2.19b:
+ * Ensure auto CPU binding happens at non-overlapping times
+- includes changes from 2.18b:
+ * Performance improvements
+
+-------------------------------------------------------------------
Old:
----
afl-2.17b.tgz
New:
----
afl-2.21b.tgz
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Other differences:
------------------
++++++ afl.spec ++++++
--- /var/tmp/diff_new_pack.oASzMh/_old 2016-07-28 23:46:42.000000000 +0200
+++ /var/tmp/diff_new_pack.oASzMh/_new 2016-07-28 23:46:42.000000000 +0200
@@ -17,7 +17,7 @@
Name: afl
-Version: 2.17b
+Version: 2.21b
Release: 0
Summary: American fuzzy lop is a security-oriented fuzzer
License: Apache-2.0
++++++ afl-2.17b.tgz -> afl-2.21b.tgz ++++++
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/afl-2.17b/afl-as.h new/afl-2.21b/afl-as.h
--- old/afl-2.17b/afl-as.h 2016-06-21 06:44:52.000000000 +0200
+++ new/afl-2.21b/afl-as.h 2016-07-04 22:08:29.000000000 +0200
@@ -98,7 +98,7 @@
of every .c file. This should have no impact in any practical sense.
Another side effect of this design is that getenv() will be called once per
- every .o file when running in non-instrumented mode; an since getenv() tends
+ every .o file when running in non-instrumented mode; and since getenv() tends
to be optimized in funny ways, we need to be very careful to save every
oddball register it may touch.
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/afl-2.17b/afl-fuzz.c new/afl-2.21b/afl-fuzz.c
--- old/afl-2.17b/afl-fuzz.c 2016-06-28 07:22:19.000000000 +0200
+++ new/afl-2.21b/afl-fuzz.c 2016-07-23 01:21:04.000000000 +0200
@@ -112,12 +112,12 @@
in_place_resume, /* Attempt in-place resume? */
auto_changed, /* Auto-generated tokens changed? */
no_cpu_meter_red, /* Feng shui on the status screen */
- no_var_check, /* Don't detect variable behavior */
shuffle_queue, /* Shuffle input queue? */
bitmap_changed = 1, /* Time to update bitmap? */
qemu_mode, /* Running in QEMU mode? */
skip_requested, /* Skip request, via SIGUSR1 */
- run_over10m; /* Run time over 10 minutes? */
+ run_over10m, /* Run time over 10 minutes? */
+ persistent_mode; /* Running in persistent mode? */
static s32 out_fd, /* Persistent fd for out_file */
dev_urandom_fd = -1, /* Persistent fd for /dev/urandom */
@@ -135,6 +135,8 @@
virgin_hang[MAP_SIZE], /* Bits we haven't seen in hangs */
virgin_crash[MAP_SIZE]; /* Bits we haven't seen in crashes */
+static u8 var_bytes[MAP_SIZE]; /* Bytes that appear to be variable */
+
static s32 shm_id; /* ID of the SHM region */
static volatile u8 stop_soon, /* Ctrl-C pressed? */
@@ -154,6 +156,7 @@
cur_depth, /* Current path depth */
max_depth, /* Max path depth */
useless_at_start, /* Number of useless starting paths */
+ var_byte_count, /* Bitmap bytes with var behavior */
current_entry, /* Current queue entry ID */
havoc_div = 1; /* Cycle count divisor for havoc */
@@ -166,6 +169,7 @@
last_path_time, /* Time for most recent path (ms) */
last_crash_time, /* Time for most recent crash (ms) */
last_hang_time, /* Time for most recent hang (ms) */
+ last_crash_execs, /* Exec counter at last crash */
queue_cycle, /* Queue round counter */
cycles_wo_finds, /* Cycles without any new paths */
trim_execs, /* Execs done to trim input files */
@@ -183,6 +187,8 @@
static s32 stage_cur, stage_max; /* Stage progression */
static s32 splicing_with = -1; /* Splicing with which test case? */
+static u32 master_id, master_max; /* Master instance job splitting */
+
static u32 syncing_case; /* Syncing with case #... */
static s32 stage_cur_byte, /* Byte offset of current stage op */
@@ -343,7 +349,7 @@
static inline u32 UR(u32 limit) {
- if (!rand_cnt--) {
+ if (unlikely(!rand_cnt--)) {
u32 seed[2];
@@ -863,9 +869,6 @@
This function is called after every exec() on a fairly large buffer, so
it needs to be fast. We do this in 32-bit and 64-bit flavors. */
-#define FFL(_b) (0xffULL << ((_b) << 3))
-#define FF(_b) (0xff << ((_b) << 3))
-
static inline u8 has_new_bits(u8* virgin_map) {
#ifdef __x86_64__
@@ -888,53 +891,39 @@
while (i--) {
-#ifdef __x86_64__
-
- u64 cur = *current;
- u64 vir = *virgin;
-
-#else
-
- u32 cur = *current;
- u32 vir = *virgin;
-
-#endif /* ^__x86_64__ */
+ /* Optimize for (*current & *virgin) == 0 - i.e., no bits in current bitmap
+ that have not been already cleared from the virgin map - since this will
+ almost always be the case. */
- /* Optimize for *current == ~*virgin, since this will almost always be the
- case. */
+ if (unlikely(*current) && unlikely(*current & *virgin)) {
- if (cur & vir) {
+ if (likely(ret < 2)) {
- if (ret < 2) {
+ u8* cur = (u8*)current;
+ u8* vir = (u8*)virgin;
- /* This trace did not have any new bytes yet; see if there's any
- current[] byte that is non-zero when virgin[] is 0xff. */
+ /* Looks like we have not found any new bytes yet; see if any non-zero
+ bytes in current[] are pristine in virgin[]. */
#ifdef __x86_64__
- if (((cur & FFL(0)) && (vir & FFL(0)) == FFL(0)) ||
- ((cur & FFL(1)) && (vir & FFL(1)) == FFL(1)) ||
- ((cur & FFL(2)) && (vir & FFL(2)) == FFL(2)) ||
- ((cur & FFL(3)) && (vir & FFL(3)) == FFL(3)) ||
- ((cur & FFL(4)) && (vir & FFL(4)) == FFL(4)) ||
- ((cur & FFL(5)) && (vir & FFL(5)) == FFL(5)) ||
- ((cur & FFL(6)) && (vir & FFL(6)) == FFL(6)) ||
- ((cur & FFL(7)) && (vir & FFL(7)) == FFL(7))) ret = 2;
+ if ((cur[0] && vir[0] == 0xff) || (cur[1] && vir[1] == 0xff) ||
+ (cur[2] && vir[2] == 0xff) || (cur[3] && vir[3] == 0xff) ||
+ (cur[4] && vir[4] == 0xff) || (cur[5] && vir[5] == 0xff) ||
+ (cur[6] && vir[6] == 0xff) || (cur[7] && vir[7] == 0xff)) ret = 2;
else ret = 1;
#else
- if (((cur & FF(0)) && (vir & FF(0)) == FF(0)) ||
- ((cur & FF(1)) && (vir & FF(1)) == FF(1)) ||
- ((cur & FF(2)) && (vir & FF(2)) == FF(2)) ||
- ((cur & FF(3)) && (vir & FF(3)) == FF(3))) ret = 2;
+ if ((cur[0] && vir[0] == 0xff) || (cur[1] && vir[1] == 0xff) ||
+ (cur[2] && vir[2] == 0xff) || (cur[3] && vir[3] == 0xff)) ret = 2;
else ret = 1;
#endif /* ^__x86_64__ */
}
- *virgin = vir & ~cur;
+ *virgin &= ~*current;
}
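The rewritten scan above can be reduced to a standalone sketch. This is not afl's exact code: MAP_WORDS is a tiny stand-in for afl's real map size, and the function is simplified to the 64-bit flavor only, but it shows the same fast path (skip words with no new bits) and the same byte-level test for "pristine in virgin[]".

```c
#include <assert.h>
#include <stdint.h>

#define MAP_WORDS 4  /* tiny stand-in; afl walks MAP_SIZE / 8 words */

/* Returns 2 if some brand-new bitmap byte was hit, 1 if only a new hit
   count in an already-seen byte, 0 if the trace adds nothing.  Seen bits
   are cleared from virgin[] as a side effect, exactly like has_new_bits(). */
static int has_new_bits(uint64_t *current, uint64_t *virgin) {
    int ret = 0;
    for (int i = 0; i < MAP_WORDS; i++) {
        /* Fast path: almost always no bits survive in current & virgin. */
        if (current[i] && (current[i] & virgin[i])) {
            if (ret < 2) {
                const uint8_t *cur = (const uint8_t *)&current[i];
                const uint8_t *vir = (const uint8_t *)&virgin[i];
                ret = 1;
                for (int b = 0; b < 8; b++)
                    if (cur[b] && vir[b] == 0xff) { ret = 2; break; }
            }
            virgin[i] &= ~current[i];
        }
    }
    return ret;
}
```

Calling it twice with the same trace demonstrates the side effect: the first call reports new coverage, the second returns 0 because the bits were cleared from the virgin map.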
@@ -982,6 +971,8 @@
}
+#define FF(_b) (0xff << ((_b) << 3))
+
/* Count the number of bytes set in the bitmap. Called fairly sporadically,
mostly to update the status screen or calibrate and examine confirmed
new paths. */
@@ -1060,7 +1051,7 @@
/* Optimize for sparse bitmaps. */
- if (*mem) {
+ if (unlikely(*mem)) {
u8* mem8 = (u8*)mem;
@@ -1091,7 +1082,7 @@
/* Optimize for sparse bitmaps. */
- if (*mem) {
+ if (unlikely(*mem)) {
u8* mem8 = (u8*)mem;
@@ -1114,7 +1105,7 @@
preprocessing step for any newly acquired traces. Called on every exec,
must be fast. */
-static const u8 count_class_lookup[256] = {
+static const u8 count_class_lookup8[256] = {
[0] = 0,
[1] = 1,
@@ -1128,6 +1119,22 @@
};
+static u16 count_class_lookup16[65536];
+
+
+static void init_count_class16(void) {
+
+ u32 b1, b2;
+
+ for (b1 = 0; b1 < 256; b1++)
+ for (b2 = 0; b2 < 256; b2++)
+ count_class_lookup16[(b1 << 8) + b2] =
+ (count_class_lookup8[b1] << 8) |
+ count_class_lookup8[b2];
+
+}
+
+
#ifdef __x86_64__
static inline void classify_counts(u64* mem) {
@@ -1138,18 +1145,14 @@
/* Optimize for sparse bitmaps. */
- if (*mem) {
+ if (unlikely(*mem)) {
- u8* mem8 = (u8*)mem;
+ u16* mem16 = (u16*)mem;
- mem8[0] = count_class_lookup[mem8[0]];
- mem8[1] = count_class_lookup[mem8[1]];
- mem8[2] = count_class_lookup[mem8[2]];
- mem8[3] = count_class_lookup[mem8[3]];
- mem8[4] = count_class_lookup[mem8[4]];
- mem8[5] = count_class_lookup[mem8[5]];
- mem8[6] = count_class_lookup[mem8[6]];
- mem8[7] = count_class_lookup[mem8[7]];
+ mem16[0] = count_class_lookup16[mem16[0]];
+ mem16[1] = count_class_lookup16[mem16[1]];
+ mem16[2] = count_class_lookup16[mem16[2]];
+ mem16[3] = count_class_lookup16[mem16[3]];
}
@@ -1169,14 +1172,12 @@
/* Optimize for sparse bitmaps. */
- if (*mem) {
+ if (unlikely(*mem)) {
- u8* mem8 = (u8*)mem;
+ u16* mem16 = (u16*)mem;
- mem8[0] = count_class_lookup[mem8[0]];
- mem8[1] = count_class_lookup[mem8[1]];
- mem8[2] = count_class_lookup[mem8[2]];
- mem8[3] = count_class_lookup[mem8[3]];
+ mem16[0] = count_class_lookup16[mem16[0]];
+ mem16[1] = count_class_lookup16[mem16[1]];
}
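The hunks above replace eight 8-bit table lookups per word with four 16-bit ones. A self-contained sketch of the idea follows; the bucket boundaries in init_count_class8() are written out here from afl's documented count classes and are an assumption, not copied from this diff, which only shows the 16-bit table being derived from the 8-bit one.

```c
#include <assert.h>
#include <stdint.h>

static uint8_t  count_class_lookup8[256];
static uint16_t count_class_lookup16[65536];

/* Bucket raw hit counts into coarse classes (0,1,2,4,8,16,32,64,128). */
static void init_count_class8(void) {
    for (int i = 0; i < 256; i++) {
        uint8_t v;
        if      (i == 0)   v = 0;
        else if (i == 1)   v = 1;
        else if (i == 2)   v = 2;
        else if (i == 3)   v = 4;
        else if (i <= 7)   v = 8;
        else if (i <= 15)  v = 16;
        else if (i <= 31)  v = 32;
        else if (i <= 127) v = 64;
        else               v = 128;
        count_class_lookup8[i] = v;
    }
}

/* Precompute every byte pair so classify touches half as many cells. */
static void init_count_class16(void) {
    for (uint32_t b1 = 0; b1 < 256; b1++)
        for (uint32_t b2 = 0; b2 < 256; b2++)
            count_class_lookup16[(b1 << 8) + b2] =
                (uint16_t)((count_class_lookup8[b1] << 8) |
                           count_class_lookup8[b2]);
}

/* One 64-bit word now needs four 16-bit lookups, not eight 8-bit ones. */
static void classify_word(uint64_t *mem) {
    if (*mem) {  /* optimize for sparse bitmaps, as in the diff */
        uint16_t *mem16 = (uint16_t *)mem;
        mem16[0] = count_class_lookup16[mem16[0]];
        mem16[1] = count_class_lookup16[mem16[1]];
        mem16[2] = count_class_lookup16[mem16[2]];
        mem16[3] = count_class_lookup16[mem16[3]];
    }
}
```

The 128 KiB of extra table is paid once at startup (the diff wires init_count_class16() in right after setup_shm()), in exchange for fewer lookups on the hot per-exec path.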
@@ -2520,7 +2521,11 @@
static u8 calibrate_case(char** argv, struct queue_entry* q, u8* use_mem,
u32 handicap, u8 from_queue) {
- u8 fault = 0, new_bits = 0, var_detected = 0, first_run = (q->exec_cksum == 0);
+ static u8 first_trace[MAP_SIZE];
+
+ u8 fault = 0, new_bits = 0, var_detected = 0,
+ first_run = (q->exec_cksum == 0);
+
u64 start_us, stop_us;
s32 old_sc = stage_cur, old_sm = stage_max, old_tmout = exec_tmout;
@@ -2537,7 +2542,7 @@
q->cal_failed++;
stage_name = "calibration";
- stage_max = no_var_check ? CAL_CYCLES_NO_VAR : CAL_CYCLES;
+ stage_max = CAL_CYCLES;
/* Make sure the forkserver is up before we do anything, and let's not
count its spin-up time toward binary calibration. */
@@ -2545,6 +2550,8 @@
if (dumb_mode != 1 && !no_forkserver && !forksrv_pid)
init_forkserver(argv);
+ if (q->exec_cksum) memcpy(first_trace, trace_bits, MAP_SIZE);
+
start_us = get_cur_time_us();
for (stage_cur = 0; stage_cur < stage_max; stage_cur++) {
@@ -2574,12 +2581,24 @@
u8 hnb = has_new_bits(virgin_bits);
if (hnb > new_bits) new_bits = hnb;
- if (!no_var_check && q->exec_cksum) {
+ if (q->exec_cksum) {
+
+ u32 i;
+
+ for (i = 0; i < MAP_SIZE; i++)
+ if (!var_bytes[i] && first_trace[i] != trace_bits[i]) {
+ var_bytes[i] = 1;
+ stage_max = CAL_CYCLES_LONG;
+ }
var_detected = 1;
- stage_max = CAL_CYCLES_LONG;
- } else q->exec_cksum = cksum;
+ } else {
+
+ q->exec_cksum = cksum;
+ memcpy(first_trace, trace_bits, MAP_SIZE);
+
+ }
}
@@ -2618,9 +2637,15 @@
/* Mark variable paths. */
- if (var_detected && !q->var_behavior) {
- mark_as_variable(q);
- queued_variable++;
+ if (var_detected) {
+
+ var_byte_count = count_bytes(var_bytes);
+
+ if (!q->var_behavior) {
+ mark_as_variable(q);
+ queued_variable++;
+ }
+
}
stage_name = old_sn;
@@ -3209,6 +3234,7 @@
unique_crashes++;
last_crash_time = get_cur_time();
+ last_crash_execs = total_execs;
break;
@@ -3306,9 +3332,9 @@
/* Update stats file for unattended monitoring. */
-static void write_stats_file(double bitmap_cvg, double eps) {
+static void write_stats_file(double bitmap_cvg, double stability, double eps) {
- static double last_bcvg, last_eps;
+ static double last_bcvg, last_stab, last_eps;
u8* fn = alloc_printf("%s/fuzzer_stats", out_dir);
s32 fd;
@@ -3327,46 +3353,51 @@
/* Keep last values in case we're called from another context
where exec/sec stats and such are not readily available. */
- if (!bitmap_cvg && !eps) {
+ if (!bitmap_cvg && !stability && !eps) {
bitmap_cvg = last_bcvg;
+ stability = last_stab;
eps = last_eps;
} else {
last_bcvg = bitmap_cvg;
+ last_stab = stability;
last_eps = eps;
}
- fprintf(f, "start_time : %llu\n"
- "last_update : %llu\n"
- "fuzzer_pid : %u\n"
- "cycles_done : %llu\n"
- "execs_done : %llu\n"
- "execs_per_sec : %0.02f\n"
- "paths_total : %u\n"
- "paths_favored : %u\n"
- "paths_found : %u\n"
- "paths_imported : %u\n"
- "max_depth : %u\n"
- "cur_path : %u\n"
- "pending_favs : %u\n"
- "pending_total : %u\n"
- "variable_paths : %u\n"
- "bitmap_cvg : %0.02f%%\n"
- "unique_crashes : %llu\n"
- "unique_hangs : %llu\n"
- "last_path : %llu\n"
- "last_crash : %llu\n"
- "last_hang : %llu\n"
- "exec_timeout : %u\n"
- "afl_banner : %s\n"
- "afl_version : " VERSION "\n"
- "command_line : %s\n",
+ fprintf(f, "start_time : %llu\n"
+ "last_update : %llu\n"
+ "fuzzer_pid : %u\n"
+ "cycles_done : %llu\n"
+ "execs_done : %llu\n"
+ "execs_per_sec : %0.02f\n"
+ "paths_total : %u\n"
+ "paths_favored : %u\n"
+ "paths_found : %u\n"
+ "paths_imported : %u\n"
+ "max_depth : %u\n"
+ "cur_path : %u\n"
+ "pending_favs : %u\n"
+ "pending_total : %u\n"
+ "variable_paths : %u\n"
+ "stability : %0.02f%%\n"
+ "bitmap_cvg : %0.02f%%\n"
+ "unique_crashes : %llu\n"
+ "unique_hangs : %llu\n"
+ "last_path : %llu\n"
+ "last_crash : %llu\n"
+ "last_hang : %llu\n"
+ "execs_since_crash : %llu\n"
+ "exec_timeout : %u\n"
+ "afl_banner : %s\n"
+ "afl_version : " VERSION "\n"
+ "command_line : %s\n",
start_time / 1000, get_cur_time() / 1000, getpid(),
queue_cycle ? (queue_cycle - 1) : 0, total_execs, eps,
queued_paths, queued_favored, queued_discovered, queued_imported,
max_depth, current_entry, pending_favored, pending_not_fuzzed,
- queued_variable, bitmap_cvg, unique_crashes, unique_hangs,
- last_path_time / 1000, last_crash_time / 1000,
- last_hang_time / 1000, exec_tmout, use_banner, orig_cmdline);
+ queued_variable, stability, bitmap_cvg, unique_crashes,
+ unique_hangs, last_path_time / 1000, last_crash_time / 1000,
+ last_hang_time / 1000, total_execs - last_crash_execs,
+ exec_tmout, use_banner, orig_cmdline);
/* ignore errors */
fclose(f);
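The fuzzer_stats file written above is a simple padded "key : value" text format. As an illustration only (this helper is hypothetical, not part of afl), a consumer could split each line like this:

```c
#include <assert.h>
#include <string.h>

/* Split one "key        : value" line of the fuzzer_stats format above.
   Returns 1 on success; key/val buffers are assumed large enough. */
static int parse_stat_line(const char *line, char *key, char *val) {
    const char *colon = strchr(line, ':');
    if (!colon) return 0;

    /* copy the key, dropping the padding before the colon */
    size_t k = (size_t)(colon - line);
    while (k && line[k - 1] == ' ') k--;
    memcpy(key, line, k);
    key[k] = 0;

    /* skip ": " and copy the value, dropping a trailing newline */
    const char *v = colon + 1;
    while (*v == ' ') v++;
    strcpy(val, v);
    size_t n = strlen(val);
    if (n && val[n - 1] == '\n') val[n - 1] = 0;
    return 1;
}
```

This also shows why the diff widens the key column: old and new files stay parseable by the same line splitter, only the padding width changes.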
@@ -3789,7 +3820,7 @@
static u64 last_stats_ms, last_plot_ms, last_ms, last_execs;
static double avg_exec;
- double t_byte_ratio;
+ double t_byte_ratio, stab_ratio;
u64 cur_ms;
u32 t_bytes, t_bits;
@@ -3842,12 +3873,17 @@
t_bytes = count_non_255_bytes(virgin_bits);
t_byte_ratio = ((double)t_bytes * 100) / MAP_SIZE;
+ if (t_bytes)
+ stab_ratio = 100 - ((double)var_byte_count) * 100 / t_bytes;
+ else
+ stab_ratio = 100;
+
/* Roughly every minute, update fuzzer stats and save auto tokens. */
if (cur_ms - last_stats_ms > STATS_UPDATE_SEC * 1000) {
last_stats_ms = cur_ms;
- write_stats_file(t_byte_ratio, avg_exec);
+ write_stats_file(t_byte_ratio, stab_ratio, avg_exec);
save_auto();
write_bitmap();
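The stability figure introduced above combines two pieces from this diff: the per-byte var_bytes[] bookkeeping done during calibration and the ratio computed in the hunk just shown. A minimal sketch, with MAP_SZ as a tiny stand-in for afl's MAP_SIZE:

```c
#include <assert.h>
#include <stdint.h>

#define MAP_SZ 16  /* tiny stand-in for afl's MAP_SIZE */

static uint8_t var_bytes[MAP_SZ];  /* bytes that ever behaved variably */

/* Compare a calibration run against the first trace of the same input,
   remember which bitmap bytes moved, and return the running total
   (var_byte_count in the diff). */
static uint32_t update_var_bytes(const uint8_t *first, const uint8_t *cur) {
    uint32_t count = 0;
    for (int i = 0; i < MAP_SZ; i++) {
        if (first[i] != cur[i]) var_bytes[i] = 1;
        count += var_bytes[i];
    }
    return count;
}

/* Stability shown in the UI: share of seen bytes that never varied. */
static double stability_pct(uint32_t t_bytes, uint32_t var_byte_count) {
    if (!t_bytes) return 100.0;
    return 100.0 - (double)var_byte_count * 100.0 / t_bytes;
}
```

So a target with 8 populated bitmap bytes, 2 of which change between identical runs, reports 75% stability; a fully deterministic target reports 100%.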
@@ -4009,8 +4045,8 @@
SAYF(bV bSTOP " now processing : " cRST "%-17s " bSTG bV bSTOP, tmp);
-
- sprintf(tmp, "%s (%0.02f%%)", DI(t_bytes), t_byte_ratio);
+ sprintf(tmp, "%0.02f%% / %0.02f%%", ((double)queue_cur->bitmap_size) *
+ 100 / MAP_SIZE, t_byte_ratio);
SAYF(" map density : %s%-21s " bSTG bV "\n", t_byte_ratio > 70 ? cLRD :
((t_bytes < 200 && !dumb_mode) ? cPIN : cRST), tmp);
@@ -4154,9 +4190,13 @@
DI(stage_finds[STAGE_HAVOC]), DI(stage_cycles[STAGE_HAVOC]),
DI(stage_finds[STAGE_SPLICE]), DI(stage_cycles[STAGE_SPLICE]));
- SAYF(bV bSTOP " havoc : " cRST "%-37s " bSTG bV bSTOP
- " variable : %s%-10s " bSTG bV "\n", tmp, queued_variable ? cLRD : cRST,
- no_var_check ? (u8*)"n/a" : DI(queued_variable));
+ SAYF(bV bSTOP " havoc : " cRST "%-37s " bSTG bV bSTOP, tmp);
+
+ if (t_bytes) sprintf(tmp, "%0.02f%%", stab_ratio);
+ else strcpy(tmp, "n/a");
+
+ SAYF(" stability : %s%-10s " bSTG bV "\n", stab_ratio < 90 ? cLRD :
+ ((queued_variable && !persistent_mode) ? cMGN : cRST), tmp);
if (!bytes_trim_out) {
@@ -4967,6 +5007,12 @@
if (skip_deterministic || queue_cur->was_fuzzed || queue_cur->passed_det)
goto havoc_stage;
+ /* Skip deterministic fuzzing if exec path checksum puts this out of scope
+ for this master instance. */
+
+ if (master_max && (queue_cur->exec_cksum % master_max) != master_id - 1)
+ goto havoc_stage;
+
/*********************************************
* SIMPLE BITFLIP (+dictionary construction) *
*********************************************/
@@ -5136,7 +5182,7 @@
/* Effector map setup. These macros calculate:
EFF_APOS - position of a particular file offset in the map.
- EFF_ALEN - length of an map with a particular number of bytes.
+ EFF_ALEN - length of a map with a particular number of bytes.
EFF_SPAN_ALEN - map span for a sequence of bytes.
*/
@@ -6567,8 +6613,14 @@
path = alloc_printf("%s/%s", qd_path, qd_ent->d_name);
+ /* Allow this to fail in case the other fuzzer is resuming or so... */
+
fd = open(path, O_RDONLY);
- if (fd < 0) PFATAL("Unable to open '%s'", path);
+
+ if (fd < 0) {
+ ck_free(path);
+ continue;
+ }
if (fstat(fd, &st)) PFATAL("fstat() failed");
@@ -6802,7 +6854,6 @@
OKF(cPIN "Persistent mode binary detected.");
setenv(PERSIST_ENV_VAR, "1", 1);
- no_var_check = 1;
} else if (getenv("AFL_PERSISTENT")) {
@@ -6814,6 +6865,7 @@
OKF(cPIN "Deferred forkserver binary detected.");
setenv(DEFER_ENV_VAR, "1", 1);
+ persistent_mode = 1;
} else if (getenv("AFL_DEFER_FORKSRV")) {
@@ -7556,18 +7608,23 @@
u8 *extras_dir = 0;
u8 mem_limit_given = 0;
u8 exit_1 = !!getenv("AFL_BENCH_JUST_ONE");
-
char** use_argv;
+ struct timeval tv;
+ struct timezone tz;
+
SAYF(cCYA "afl-fuzz " cBRI VERSION cRST " by <lcamtuf(a)google.com>\n");
doc_path = access(DOC_PATH, F_OK) ? "docs" : DOC_PATH;
+ gettimeofday(&tv, &tz);
+ srandom(tv.tv_sec ^ tv.tv_usec ^ getpid());
+
while ((opt = getopt(argc, argv, "+i:o:f:m:t:T:dnCB:S:M:x:Q")) > 0)
switch (opt) {
- case 'i':
+ case 'i': /* input dir */
if (in_dir) FATAL("Multiple -i options not supported");
in_dir = optarg;
@@ -7582,15 +7639,30 @@
out_dir = optarg;
break;
- case 'M':
+ case 'M': /* master sync ID */
force_deterministic = 1;
/* Fall through */
- case 'S': /* sync ID */
+ case 'S': { /* secondary sync ID */
+
+ u8* c;
+
+ if (sync_id) FATAL("Multiple -S or -M options not supported");
+ sync_id = optarg;
+
+ if ((c = strchr(sync_id, ':'))) {
+
+ *c = 0;
+
+ if (sscanf(c + 1, "%u/%u", &master_id, &master_max) != 2 ||
+ !master_id || !master_max || master_id > master_max ||
+ master_max > 1000000) FATAL("Bogus master ID passed to -M");
+
+ }
+
+ }
- if (sync_id) FATAL("Multiple -S or -M options not supported");
- sync_id = optarg;
break;
case 'f': /* target file */
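The experimental -M parallelization seen in the hunks above has two halves: parsing an optional "name:id/total" job spec, and skipping the deterministic stages for queue entries whose checksum falls outside this instance's slice. A standalone sketch of both, extracted and simplified from the diff:

```c
#include <assert.h>
#include <stdint.h>
#include <stdio.h>
#include <string.h>

static uint32_t master_id, master_max;

/* Parse the extended -M syntax "name:id/total"; truncates the sync ID at
   the colon.  Returns 1 on a valid spec, 0 if none present, -1 if bogus. */
static int parse_master(char *sync_id) {
    char *c = strchr(sync_id, ':');
    if (!c) return 0;
    *c = 0;
    if (sscanf(c + 1, "%u/%u", &master_id, &master_max) != 2 ||
        !master_id || !master_max || master_id > master_max ||
        master_max > 1000000) return -1;
    return 1;
}

/* A master instance only runs deterministic fuzzing for queue entries
   whose exec path checksum lands in its slice of the keyspace. */
static int det_in_scope(uint32_t exec_cksum) {
    return !master_max || (exec_cksum % master_max) == master_id - 1;
}
```

Since checksums are roughly uniform, the modulo split hands each of the total instances about 1/total of the deterministic work without any coordination beyond the shared sync directory.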
@@ -7599,13 +7671,13 @@
out_file = optarg;
break;
- case 'x':
+ case 'x': /* dictionary */
if (extras_dir) FATAL("Multiple -x options not supported");
extras_dir = optarg;
break;
- case 't': {
+ case 't': { /* timeout */
u8 suffix = 0;
@@ -7622,7 +7694,7 @@
}
- case 'm': {
+ case 'm': { /* mem limit */
u8 suffix = 'M';
@@ -7659,14 +7731,14 @@
break;
- case 'd':
+ case 'd': /* skip deterministic */
if (skip_deterministic) FATAL("Multiple -d options not supported");
skip_deterministic = 1;
use_splicing = 1;
break;
- case 'B':
+ case 'B': /* load bitmap */
/* This is a secret undocumented option! It is useful if you find
an interesting test case during a normal fuzzing process, and want
@@ -7685,26 +7757,26 @@
read_bitmap(in_bitmap);
break;
- case 'C':
+ case 'C': /* crash mode */
if (crash_mode) FATAL("Multiple -C options not supported");
crash_mode = FAULT_CRASH;
break;
- case 'n':
+ case 'n': /* dumb mode */
if (dumb_mode) FATAL("Multiple -n options not supported");
if (getenv("AFL_DUMB_FORKSRV")) dumb_mode = 2; else dumb_mode = 1;
break;
- case 'T':
+ case 'T': /* banner */
if (use_banner) FATAL("Multiple -T options not supported");
use_banner = optarg;
break;
- case 'Q':
+ case 'Q': /* QEMU mode */
if (qemu_mode) FATAL("Multiple -Q options not supported");
qemu_mode = 1;
@@ -7738,7 +7810,6 @@
if (getenv("AFL_NO_FORKSRV")) no_forkserver = 1;
if (getenv("AFL_NO_CPU_RED")) no_cpu_meter_red = 1;
- if (getenv("AFL_NO_VAR_CHECK")) no_var_check = 1;
if (getenv("AFL_SHUFFLE_QUEUE")) shuffle_queue = 1;
if (dumb_mode == 2 && no_forkserver)
@@ -7764,6 +7835,7 @@
setup_post();
setup_shm();
+ init_count_class16();
setup_dirs_fds();
read_testcases();
@@ -7796,7 +7868,7 @@
seek_to = find_start_position();
- write_stats_file(0, 0);
+ write_stats_file(0, 0, 0);
save_auto();
if (stop_soon) goto stop_fuzzing;
@@ -7872,7 +7944,7 @@
if (queue_cur) show_stats();
write_bitmap();
- write_stats_file(0, 0);
+ write_stats_file(0, 0, 0);
save_auto();
stop_fuzzing:
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/afl-2.17b/config.h new/afl-2.21b/config.h
--- old/afl-2.17b/config.h 2016-06-27 21:06:37.000000000 +0200
+++ new/afl-2.21b/config.h 2016-07-23 01:21:30.000000000 +0200
@@ -21,7 +21,7 @@
/* Version string: */
-#define VERSION "2.17b"
+#define VERSION "2.21b"
/******************************************************
* *
@@ -61,13 +61,9 @@
/* Number of calibration cycles per every new test case (and for test
cases that show variable behavior): */
-#define CAL_CYCLES 10
+#define CAL_CYCLES 8
#define CAL_CYCLES_LONG 40
-/* The same, but when AFL_NO_VAR_CHECK is set in the environment: */
-
-#define CAL_CYCLES_NO_VAR 4
-
/* Number of subsequent hangs before abandoning an input file: */
#define HANG_LIMIT 250
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/afl-2.17b/docs/ChangeLog new/afl-2.21b/docs/ChangeLog
--- old/afl-2.17b/docs/ChangeLog 2016-06-28 04:17:51.000000000 +0200
+++ new/afl-2.21b/docs/ChangeLog 2016-07-23 01:21:22.000000000 +0200
@@ -13,10 +13,50 @@
sending a mail to <afl-users+subscribe(a)googlegroups.com>.
Not sure if you should upgrade? The lowest currently recommended version
-is 2.07b. If you're stuck on an earlier release, it's strongly advisable
+is 2.18b. If you're stuck on an earlier release, it's strongly advisable
to get on with the times.
--------------
+Version 2.21b:
+--------------
+
+ - Added some crash reporting notes for Solaris in docs/INSTALL, as
+ investigated by Martin Carpenter.
+
+ - Fixed a minor UI mix-up with havoc strategy stats.
+
+--------------
+Version 2.20b:
+--------------
+
+ - Revamped the handling of variable paths, replacing path count with a
+ "stability" score to give users a much better signal. Based on the
+ feedback from Vegard Nossum.
+
+ - Made a stability improvement to the syncing behavior with resuming
+ fuzzers. Based on the feedback from Vegard.
+
+ - Changed the UI to include current input bitmap density along with
+ total density. Ditto.
+
+ - Added experimental support for parallelizing -M.
+
+--------------
+Version 2.19b:
+--------------
+
+ - Made a fix to make sure that auto CPU binding happens at non-overlapping
+ times.
+
+--------------
+Version 2.18b:
+--------------
+
+ - Made several performance improvements to has_new_bits() and
+ classify_counts(). This should offer a robust performance bump with
+ fast targets.
+
+--------------
Version 2.17b:
--------------
@@ -1495,7 +1535,7 @@
- Refactored the code slightly to make more frequent updates to fuzzer_stats
and to provide more detail about synchronization.
- - Added a fflush(stdout) call for non-tty operation, as requested by
+ - Added an fflush(stdout) call for non-tty operation, as requested by
Joonas Kuorilehto.
- Added some detail to fuzzer_stats for parity with plot_file.
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/afl-2.17b/docs/INSTALL new/afl-2.21b/docs/INSTALL
--- old/afl-2.17b/docs/INSTALL 2016-06-07 20:17:00.000000000 +0200
+++ new/afl-2.21b/docs/INSTALL 2016-07-22 20:48:18.000000000 +0200
@@ -140,11 +140,13 @@
Do *not* specify --with-as=/usr/gnu/bin/as - this will produce a GCC binary that
ignores the -B flag and you will be back to square one.
-If you have system-wide crash reporting enabled, you may run into problems
-similar to the gotchas for Linux and MacOS X, but I have not verified this.
-More information about AppCrash can be found here:
+Note that Solaris reportedly comes with crash reporting enabled, which causes
+problems with crashes being misinterpreted as hangs, similarly to the gotchas
+for Linux and MacOS X. AFL does not auto-detect crash reporting on this
+particular platform, but you may need to run the following command:
- http://www.oracle.com/technetwork/server-storage/solaris10/app-crash-142906…
+$ coreadm -d global -d global-setid -d process -d proc-setid \
+ -d kzone -d log
User emulation mode of QEMU is not available on Solaris, so black-box
instrumentation mode (-Q) will not work.
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/afl-2.17b/docs/README new/afl-2.21b/docs/README
--- old/afl-2.17b/docs/README 2016-06-19 00:24:18.000000000 +0200
+++ new/afl-2.21b/docs/README 2016-07-21 20:54:39.000000000 +0200
@@ -464,6 +464,7 @@
Daniel Godas-Lopez Franjo Ivancic
Austin Seipp Daniel Komaromy
Daniel Binderman Jonathan Metzman
+ Vegard Nossum
Thank you!
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/afl-2.17b/docs/env_variables.txt new/afl-2.21b/docs/env_variables.txt
--- old/afl-2.17b/docs/env_variables.txt 2016-06-27 20:43:23.000000000 +0200
+++ new/afl-2.21b/docs/env_variables.txt 2016-07-21 20:42:24.000000000 +0200
@@ -99,11 +99,6 @@
normally done when starting up the forkserver and causes a pretty
significant performance drop.
- - Setting AFL_NO_VAR_CHECK skips the detection of variable test cases,
- greatly speeding up session resumption and path discovery for complex
- multi-threaded apps (but depriving you of a potentially useful signal
- in more orderly programs).
-
- AFL_EXIT_WHEN_DONE causes afl-fuzz to terminate when all existing paths
have been fuzzed and there were no new finds for a while. This would be
normally indicated by the cycle counter in the UI turning green. May be
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/afl-2.17b/docs/parallel_fuzzing.txt new/afl-2.21b/docs/parallel_fuzzing.txt
--- old/afl-2.17b/docs/parallel_fuzzing.txt 2016-04-23 08:54:06.000000000 +0200
+++ new/afl-2.21b/docs/parallel_fuzzing.txt 2016-07-21 22:21:48.000000000 +0200
@@ -51,13 +51,25 @@
for any test cases found by other fuzzers - and will incorporate them into
its own fuzzing when they are deemed interesting enough.
-The only difference between the -M and -S modes is that the master instance
-will still perform deterministic checks; while the secondary instances will
+The difference between the -M and -S modes is that the master instance will
+still perform deterministic checks; while the secondary instances will
proceed straight to random tweaks. If you don't want to do deterministic
fuzzing at all, it's OK to run all instances with -S. With very slow or complex
targets, or when running heavily parallelized jobs, this is usually a good plan.
-You can monitor the progress of your jobs from the command line with the
+Note that running multiple -M instances is wasteful, although there is
+experimental support for parallelizing the deterministic checks. To leverage
+that, you need to create -M instances like so:
+
+$ ./afl-fuzz -i testcase_dir -o sync_dir -M masterA:1/3 [...]
+$ ./afl-fuzz -i testcase_dir -o sync_dir -M masterB:2/3 [...]
+$ ./afl-fuzz -i testcase_dir -o sync_dir -M masterC:3/3 [...]
+
+...where the first value after ':' is the sequential ID of a particular master
+instance (starting at 1), and the second value is the total number of fuzzers to
+distribute the deterministic fuzzing across.
+
+You can also monitor the progress of your jobs from the command line with the
provided afl-whatsup tool. When the instances are no longer finding new paths,
it's probably time to stop.
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/afl-2.17b/docs/sister_projects.txt new/afl-2.21b/docs/sister_projects.txt
--- old/afl-2.17b/docs/sister_projects.txt 2016-06-27 21:33:46.000000000 +0200
+++ new/afl-2.21b/docs/sister_projects.txt 2016-07-12 04:42:04.000000000 +0200
@@ -78,6 +78,13 @@
https://www.nccgroup.trust/us/about-us/newsroom-and-events/blog/2016/june/p…
+WinAFL (Ivan Fratric)
+---------------------
+
+ As the name implies, allows you to fuzz Windows binaries (using DynamoRio).
+
+ https://github.com/ivanfratric/winafl
+
----------------
Network fuzzing:
----------------
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/afl-2.17b/docs/status_screen.txt new/afl-2.21b/docs/status_screen.txt
--- old/afl-2.17b/docs/status_screen.txt 2016-02-21 01:01:37.000000000 +0100
+++ new/afl-2.21b/docs/status_screen.txt 2016-07-21 22:37:18.000000000 +0200
@@ -119,7 +119,7 @@
---------------
+--------------------------------------+
- | map density : 4763 (29.07%) |
+ | map density : 10.15% / 29.07% |
| count coverage : 4.03 bits/tuple |
+--------------------------------------+
@@ -127,7 +127,11 @@
instrumentation embedded in the target binary.
The first line in the box tells you how many branch tuples we have already
-hit, in proportion to how much the bitmap can hold. Be wary of extremes:
+hit, in proportion to how much the bitmap can hold. The number on the left
+describes the current input; the one on the right is the value for the entire
+input corpus.
+
+Be wary of extremes:
- Absolute numbers below 200 or so suggest one of three things: that the
program is extremely simple; that it is not instrumented properly (e.g.,
@@ -271,7 +275,7 @@
| pend fav : 583 |
| own finds : 0 |
| imported : 0 |
- | variable : 0 |
+ | stability : 100.00% |
+---------------------+
The first field in this section tracks the path depth reached through the
@@ -291,27 +295,33 @@
imported from other fuzzer instances when doing parallelized fuzzing; and the
number of inputs that produce seemingly variable behavior in the tested binary.
-That last bit is actually fairly interesting. There are four quasi-common
-explanations for variable behavior of the tested program:
-
- - Use of uninitialized memory in conjunction with some intrinsic sources of
- entropy in the tested binary. This can be indicative of a security bug.
-
- - Attempts to create files that were already created during previous runs, or
- otherwise interact with some form of persistent state. This is harmless,
- but you may want to instruct the targeted program to write to stdout or to
- /dev/null to avoid surprises (and disable the creation of temporary files
- and similar artifacts, if applicable).
-
- - Hitting functionality that is actually designed to behave randomly. For
- example, when fuzzing sqlite, the fuzzer will dutifully detect variable
- behavior once the mutation engine generates something like:
-
- select random();
-
- - Multiple threads executing at once in semi-random order. This is usually
- just a nuisance, but if the number of variable paths is very high, try the
- following options:
+That last bit is actually fairly interesting: it measures the consistency of
+observed traces. If a program always behaves the same for the same input data,
+it will earn a score of 100%. When the value is over 90%, the fuzzing process
+is still unlikely to be negatively affected. If it gets much lower, you may
+be in trouble, since AFL will have difficulty discerning between meaningful
+and "phantom" effects of tweaking the input file.
+
+Now, most targets will just get a 100% score, but when you see lower figures,
+there are several things to look at:
+
+ - The use of uninitialized memory in conjunction with some intrinsic sources
+ of entropy in the tested binary. Harmless to AFL, but could be indicative
+ of a security bug.
+
+ - Attempts to manipulate persistent resources, such as left over temporary
+ files or shared memory objects. This is usually harmless, but you may want
+ to double-check to make sure the program isn't bailing out prematurely.
+ Running out of disk space, SHM handles, or other global resources can
+ trigger this, too.
+
+ - Hitting some functionality that is actually designed to behave randomly.
+ Generally harmless. For example, when fuzzing sqlite, an input like
+ 'select random();' will trigger a variable execution path.
+
+ - Multiple threads executing at once in semi-random order. This is harmless
+ when the 'stability' metric stays over 90% or so, but can become an issue
+ if not. Here's what to try:
- Use afl-clang-fast from llvm_mode/ - it uses a thread-local tracking
model that is less prone to concurrency issues,
@@ -323,17 +333,10 @@
- Replace pthreads with GNU Pth (https://www.gnu.org/software/pth/) which
allows you to use a deterministic scheduler.
-Less likely causes may include running out of disk space, SHM handles, or other
-globally limited resources.
-
The paths where variable behavior is detected are marked with a matching entry
in the <out_dir>/queue/.state/variable_behavior/ directory, so you can look
them up easily.
-If you can't suppress variable behavior and don't want to see these warnings,
-simply set AFL_NO_VAR_CHECK=1 in the environment before running afl-fuzz. This
-will also dramatically speed up session resumption.
-
9) CPU load
-----------
@@ -378,6 +381,7 @@
- cur_path - currently processed entry number
- pending_favs - number of favored entries still waiting to be fuzzed
- pending_total - number of all entries waiting to be fuzzed
+ - stability - percentage of bitmap bytes that behave consistently
- variable_paths - number of test cases showing variable behavior
- unique_crashes - number of unique crashes recorded
- unique_hangs - number of unique hangs encountered
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/afl-2.17b/llvm_mode/README.llvm new/afl-2.21b/llvm_mode/README.llvm
--- old/afl-2.17b/llvm_mode/README.llvm 2016-06-07 20:14:22.000000000 +0200
+++ new/afl-2.21b/llvm_mode/README.llvm 2016-07-21 20:42:06.000000000 +0200
@@ -163,8 +163,8 @@
When running in this mode, the execution paths will inherently vary a bit
depending on whether the input loop is being entered for the first time or
-executed again. To avoid spurious warnings, the feature implies
-AFL_NO_VAR_CHECK and hides the "variable path" warnings in the UI.
+executed again. This can cause the "stability" metric in the UI to dip
+slightly under 100%.
PS. Because there are task switches still involved, the mode isn't as fast as
"pure" in-process fuzzing offered, say, by LLVM's LibFuzzer; but it is a lot
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/afl-2.17b/types.h new/afl-2.21b/types.h
--- old/afl-2.17b/types.h 2015-02-09 06:06:27.000000000 +0100
+++ new/afl-2.21b/types.h 2016-07-03 06:32:05.000000000 +0200
@@ -76,4 +76,7 @@
#define MEM_BARRIER() \
asm volatile("" ::: "memory")
+#define likely(_x) __builtin_expect(!!(_x), 1)
+#define unlikely(_x) __builtin_expect(!!(_x), 0)
+
#endif /* ! _HAVE_TYPES_H */
Hello community,
here is the log from the commit of package nomacs for openSUSE:Factory checked in at 2016-07-28 23:46:34
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/nomacs (Old)
and /work/SRC/openSUSE:Factory/.nomacs.new (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Package is "nomacs"
Changes:
--------
--- /work/SRC/openSUSE:Factory/nomacs/nomacs.changes 2016-06-19 10:50:30.000000000 +0200
+++ /work/SRC/openSUSE:Factory/.nomacs.new/nomacs.changes 2016-07-28 23:46:38.000000000 +0200
@@ -1,0 +2,17 @@
+Sat Jul 23 16:15:39 UTC 2016 - sor.alexei(a)meowr.ru
+
+- Update to version 3.4:
+ * Rework Batch UI.
+ * Add Batch Profiles.
+ * Cropping to metadata.
+ * Improve RGB to Gray.
+ * Fix crash on delete.
+ * Fix slow thumbnail rendering.
+ * Improve start-up time.
+ * Fix RAW/PSD orientation.
+- Remove nomacs-3.2.0-gcc6.patch: fixed upstream.
+- Remove /usr/lib(|64)/libnomacs*.so files: for development, yet
+  no development seems possible for nomacs now.
+- Add ldconfig to post(|un).
+
+-------------------------------------------------------------------
Old:
----
nomacs-3.2.0-gcc6.patch
nomacs-3.2.0-source.tar.bz2
New:
----
nomacs-3.4.tar.gz
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Other differences:
------------------
++++++ nomacs.spec ++++++
--- /var/tmp/diff_new_pack.7IVxRS/_old 2016-07-28 23:46:40.000000000 +0200
+++ /var/tmp/diff_new_pack.7IVxRS/_new 2016-07-28 23:46:40.000000000 +0200
@@ -17,17 +17,14 @@
Name: nomacs
-Version: 3.2.0
+Version: 3.4
Release: 0
Summary: Lightweight image viewer
License: GPL-3.0+
Group: Productivity/Graphics/Viewers
Url: http://nomacs.org/
-Source: https://github.com/%{name}/%{name}/releases/download/%{version}/%{name}-%{v…
-# PATCH-FIX-UPSTREAM nomacs-3.2.0-gcc6.patch boo#985374 sergio(a)serjux.com -- Fix GCC6 errors.
-Patch0: %{name}-3.2.0-gcc6.patch
-BuildRequires: cmake >= 2.6
-BuildRequires: dos2unix
+Source: https://github.com/%{name}/%{name}/archive/%{version}.tar.gz#/%{name}-%{ver…
+BuildRequires: cmake >= 2.8
BuildRequires: gcc-c++
BuildRequires: libqt5-linguist-devel >= 5.2
BuildRequires: opencv-qt5-devel >= 2.4.6
@@ -54,40 +51,40 @@
%prep
%setup -q
-%patch0 -p1
-dos2unix Readme/*
-find src -type f -name '*.cpp' | while read f; do
- # Fix encoding issues in nomacs with Qt5.
- dos2unix "$f" && \
- iconv -f iso-8859-2 -t utf-8 "$f" > "$f.new" && \
- mv -f "$f.new" "$f"
-done
+sed -i 's/\r$//g' ImageLounge/Readme/*
%build
+pushd ImageLounge/
%cmake \
-DCMAKE_C_FLAGS='%{optflags} -fno-strict-aliasing' \
-DCMAKE_CXX_FLAGS='%{optflags} -fno-strict-aliasing' \
-DCMAKE_SHARED_LINKER_FLAGS=""
make %{?_smp_mflags}
+popd
%install
+pushd ImageLounge/
%cmake_install
-install -Dm 0644 %{name}.appdata.xml \
+popd
+install -Dm 0644 ImageLounge/%{name}.appdata.xml \
%{buildroot}%{_datadir}/appdata/%{name}.appdata.xml
+
+rm %{buildroot}%{_libdir}/lib%{name}*.so
%suse_update_desktop_file %{name}
%find_lang %{name} --with-qt
%post
%desktop_database_post
+/sbin/ldconfig
%postun
%desktop_database_postun
+/sbin/ldconfig
%files
%defattr(-,root,root)
-%doc Readme/*
+%doc ImageLounge/Readme/*
%{_bindir}/%{name}
-%{_libdir}/lib%{name}*.so
%{_libdir}/lib%{name}*.so.*
%{_datadir}/%{name}/
%{_datadir}/applications/%{name}.desktop
Hello community,
here is the log from the commit of package python3-beautifulsoup4 for openSUSE:Factory checked in at 2016-07-28 23:46:31
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/python3-beautifulsoup4 (Old)
and /work/SRC/openSUSE:Factory/.python3-beautifulsoup4.new (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Package is "python3-beautifulsoup4"
Changes:
--------
--- /work/SRC/openSUSE:Factory/python3-beautifulsoup4/python3-beautifulsoup4-doc.changes 2016-05-25 21:27:38.000000000 +0200
+++ /work/SRC/openSUSE:Factory/.python3-beautifulsoup4.new/python3-beautifulsoup4-doc.changes 2016-07-28 23:46:33.000000000 +0200
@@ -1,0 +2,42 @@
+Wed Jul 20 15:07:27 UTC 2016 - arun(a)gmx.de
+
+- update to version 4.5.0:
+ * Beautiful Soup is no longer compatible with Python 2.6. This
+ actually happened a few releases ago, but it's now official.
+ * Beautiful Soup will now work with versions of html5lib greater
+ than 0.99999999. [bug=1603299]
+ * If a search against each individual value of a multi-valued
+ attribute fails, the search will be run one final time against the
+ complete attribute value considered as a single string. That is,
+ if a tag has class="foo bar" and neither "foo" nor "bar" matches,
+ but "foo bar" does, the tag is now considered a match.
+ This happened in previous versions, but only when the value being
+ searched for was a string. Now it also works when that value is a
+ regular expression, a list of strings, etc. [bug=1476868]
+ * Fixed a bug that deranged the tree when a whitespace element was
+ reparented into a tag that contained an identical whitespace
+ element. [bug=1505351]
+ * Added support for CSS selector values that contain quoted spaces,
+ such as tag[style="display: foo"]. [bug=1540588]
+ * Corrected handling of XML processing instructions. [bug=1504393]
+ * Corrected an encoding error that happened when a BeautifulSoup
+ object was copied. [bug=1554439]
+ * The contents of <textarea> tags will no longer be modified when
+ the tree is prettified. [bug=1555829]
+ * When a BeautifulSoup object is pickled but its tree builder cannot
+ be pickled, its .builder attribute is set to None instead of being
+ destroyed. This avoids a performance problem once the object is
+ unpickled. [bug=1523629]
+ * Specify the file and line number when warning about a
+ BeautifulSoup object being instantiated without a parser being
+ specified. [bug=1574647]
+ * The `limit` argument to `select()` now works correctly, though
+ it's not implemented very efficiently. [bug=1520530]
+ * Fixed a Python 3 ByteWarning when a URL was passed in as though it
+ were markup. Thanks to James Salter for a patch and
+ test. [bug=1533762]
+ * We don't run the check for a filename passed in as markup if the
+ 'filename' contains a less-than character; the less-than character
+ indicates it's most likely a very small document. [bug=1577864]
+
+-------------------------------------------------------------------
@@ -11 +52,0 @@
-
python3-beautifulsoup4.changes: same change
Old:
----
beautifulsoup4-4.4.1.tar.gz
New:
----
beautifulsoup4-4.5.0.tar.gz
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Other differences:
------------------
++++++ python3-beautifulsoup4-doc.spec ++++++
--- /var/tmp/diff_new_pack.F2JBPt/_old 2016-07-28 23:46:34.000000000 +0200
+++ /var/tmp/diff_new_pack.F2JBPt/_new 2016-07-28 23:46:34.000000000 +0200
@@ -17,7 +17,7 @@
Name: python3-beautifulsoup4-doc
-Version: 4.4.1
+Version: 4.5.0
Release: 0
Summary: Documentation for python3-beautifulsoup4
License: MIT
@@ -25,8 +25,8 @@
Url: http://www.crummy.com/software/BeautifulSoup/
Source: https://files.pythonhosted.org/packages/source/b/beautifulsoup4/beautifulso…
BuildRoot: %{_tmppath}/%{name}-%{version}-build
-BuildRequires: python3-beautifulsoup4 = %{version}
BuildRequires: python3-Sphinx
+BuildRequires: python3-beautifulsoup4 = %{version}
Requires: python3-beautifulsoup4 = %{version}
BuildArch: noarch
++++++ python3-beautifulsoup4.spec ++++++
--- /var/tmp/diff_new_pack.F2JBPt/_old 2016-07-28 23:46:34.000000000 +0200
+++ /var/tmp/diff_new_pack.F2JBPt/_new 2016-07-28 23:46:34.000000000 +0200
@@ -17,7 +17,7 @@
Name: python3-beautifulsoup4
-Version: 4.4.1
+Version: 4.5.0
Release: 0
Summary: HTML/XML Parser for Quick-Turnaround Applications Like Screen-Scraping
License: MIT
++++++ beautifulsoup4-4.4.1.tar.gz -> beautifulsoup4-4.5.0.tar.gz ++++++
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/beautifulsoup4-4.4.1/COPYING.txt new/beautifulsoup4-4.5.0/COPYING.txt
--- old/beautifulsoup4-4.4.1/COPYING.txt 2015-06-24 12:56:23.000000000 +0200
+++ new/beautifulsoup4-4.5.0/COPYING.txt 2016-07-16 17:25:37.000000000 +0200
@@ -1,6 +1,6 @@
Beautiful Soup is made available under the MIT license:
- Copyright (c) 2004-2015 Leonard Richardson
+ Copyright (c) 2004-2016 Leonard Richardson
Permission is hereby granted, free of charge, to any person obtaining
a copy of this software and associated documentation files (the
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/beautifulsoup4-4.4.1/NEWS.txt new/beautifulsoup4-4.5.0/NEWS.txt
--- old/beautifulsoup4-4.4.1/NEWS.txt 2015-09-29 01:53:36.000000000 +0200
+++ new/beautifulsoup4-4.5.0/NEWS.txt 2016-07-20 02:35:09.000000000 +0200
@@ -1,3 +1,56 @@
+= 4.5.0 (20160719) =
+
+* Beautiful Soup is no longer compatible with Python 2.6. This
+ actually happened a few releases ago, but it's now official.
+
+* Beautiful Soup will now work with versions of html5lib greater than
+ 0.99999999. [bug=1603299]
+
+* If a search against each individual value of a multi-valued
+ attribute fails, the search will be run one final time against the
+ complete attribute value considered as a single string. That is, if
+ a tag has class="foo bar" and neither "foo" nor "bar" matches, but
+ "foo bar" does, the tag is now considered a match.
+
+ This happened in previous versions, but only when the value being
+ searched for was a string. Now it also works when that value is
+ a regular expression, a list of strings, etc. [bug=1476868]
+
+* Fixed a bug that deranged the tree when a whitespace element was
+ reparented into a tag that contained an identical whitespace
+ element. [bug=1505351]
+
+* Added support for CSS selector values that contain quoted spaces,
+ such as tag[style="display: foo"]. [bug=1540588]
+
+* Corrected handling of XML processing instructions. [bug=1504393]
+
+* Corrected an encoding error that happened when a BeautifulSoup
+ object was copied. [bug=1554439]
+
+* The contents of <textarea> tags will no longer be modified when the
+ tree is prettified. [bug=1555829]
+
+* When a BeautifulSoup object is pickled but its tree builder cannot
+ be pickled, its .builder attribute is set to None instead of being
+ destroyed. This avoids a performance problem once the object is
+ unpickled. [bug=1523629]
+
+* Specify the file and line number when warning about a
+ BeautifulSoup object being instantiated without a parser being
+ specified. [bug=1574647]
+
+* The `limit` argument to `select()` now works correctly, though it's
+ not implemented very efficiently. [bug=1520530]
+
+* Fixed a Python 3 ByteWarning when a URL was passed in as though it
+ were markup. Thanks to James Salter for a patch and
+ test. [bug=1533762]
+
+* We don't run the check for a filename passed in as markup if the
+ 'filename' contains a less-than character; the less-than character
+ indicates it's most likely a very small document. [bug=1577864]
+
= 4.4.1 (20150928) =
* Fixed a bug that deranged the tree when part of it was
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/beautifulsoup4-4.4.1/PKG-INFO new/beautifulsoup4-4.5.0/PKG-INFO
--- old/beautifulsoup4-4.4.1/PKG-INFO 2015-09-29 02:19:48.000000000 +0200
+++ new/beautifulsoup4-4.5.0/PKG-INFO 2016-07-20 12:38:04.000000000 +0200
@@ -1,6 +1,6 @@
Metadata-Version: 1.1
Name: beautifulsoup4
-Version: 4.4.1
+Version: 4.5.0
Summary: Screen-scraping library
Home-page: http://www.crummy.com/software/BeautifulSoup/bs4/
Author: Leonard Richardson
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/beautifulsoup4-4.4.1/beautifulsoup4.egg-info/PKG-INFO new/beautifulsoup4-4.5.0/beautifulsoup4.egg-info/PKG-INFO
--- old/beautifulsoup4-4.4.1/beautifulsoup4.egg-info/PKG-INFO 2015-09-29 02:19:48.000000000 +0200
+++ new/beautifulsoup4-4.5.0/beautifulsoup4.egg-info/PKG-INFO 2016-07-20 12:38:04.000000000 +0200
@@ -1,6 +1,6 @@
Metadata-Version: 1.1
Name: beautifulsoup4
-Version: 4.4.1
+Version: 4.5.0
Summary: Screen-scraping library
Home-page: http://www.crummy.com/software/BeautifulSoup/bs4/
Author: Leonard Richardson
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/beautifulsoup4-4.4.1/beautifulsoup4.egg-info/requires.txt new/beautifulsoup4-4.5.0/beautifulsoup4.egg-info/requires.txt
--- old/beautifulsoup4-4.4.1/beautifulsoup4.egg-info/requires.txt 2015-09-29 02:19:48.000000000 +0200
+++ new/beautifulsoup4-4.5.0/beautifulsoup4.egg-info/requires.txt 2016-07-20 12:38:04.000000000 +0200
@@ -1,7 +1,6 @@
+[html5lib]
+html5lib
[lxml]
lxml
-
-[html5lib]
-html5lib
\ No newline at end of file
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/beautifulsoup4-4.4.1/bs4/__init__.py new/beautifulsoup4-4.5.0/bs4/__init__.py
--- old/beautifulsoup4-4.4.1/bs4/__init__.py 2015-09-29 02:09:17.000000000 +0200
+++ new/beautifulsoup4-4.5.0/bs4/__init__.py 2016-07-20 02:28:09.000000000 +0200
@@ -5,8 +5,8 @@
Beautiful Soup uses a pluggable XML or HTML parser to parse a
(possibly invalid) document into a tree representation. Beautiful Soup
-provides provides methods and Pythonic idioms that make it easy to
-navigate, search, and modify the parse tree.
+provides methods and Pythonic idioms that make it easy to navigate,
+search, and modify the parse tree.
Beautiful Soup works with Python 2.6 and up. It works better if lxml
and/or html5lib is installed.
@@ -14,17 +14,22 @@
For more than you ever wanted to know about Beautiful Soup, see the
documentation:
http://www.crummy.com/software/BeautifulSoup/bs4/doc/
+
"""
+# Use of this source code is governed by a BSD-style license that can be
+# found in the LICENSE file.
+
__author__ = "Leonard Richardson (leonardr(a)segfault.org)"
-__version__ = "4.4.1"
-__copyright__ = "Copyright (c) 2004-2015 Leonard Richardson"
+__version__ = "4.5.0"
+__copyright__ = "Copyright (c) 2004-2016 Leonard Richardson"
__license__ = "MIT"
__all__ = ['BeautifulSoup']
import os
import re
+import traceback
import warnings
from .builder import builder_registry, ParserRejectedMarkup
@@ -77,7 +82,7 @@
ASCII_SPACES = '\x20\x0a\x09\x0c\x0d'
- NO_PARSER_SPECIFIED_WARNING = "No parser was explicitly specified, so I'm using the best available %(markup_type)s parser for this system (\"%(parser)s\"). This usually isn't a problem, but if you run this code on another system, or in a different virtual environment, it may use a different parser and behave differently.\n\nTo get rid of this warning, change this:\n\n BeautifulSoup([your markup])\n\nto this:\n\n BeautifulSoup([your markup], \"%(parser)s\")\n"
+ NO_PARSER_SPECIFIED_WARNING = "No parser was explicitly specified, so I'm using the best available %(markup_type)s parser for this system (\"%(parser)s\"). This usually isn't a problem, but if you run this code on another system, or in a different virtual environment, it may use a different parser and behave differently.\n\nThe code that caused this warning is on line %(line_number)s of the file %(filename)s. To get rid of this warning, change code that looks like this:\n\n BeautifulSoup([your markup])\n\nto this:\n\n BeautifulSoup([your markup], \"%(parser)s\")\n"
def __init__(self, markup="", features=None, builder=None,
parse_only=None, from_encoding=None, exclude_encodings=None,
@@ -137,6 +142,10 @@
from_encoding = from_encoding or deprecated_argument(
"fromEncoding", "from_encoding")
+ if from_encoding and isinstance(markup, unicode):
+ warnings.warn("You provided Unicode markup but also provided a value for from_encoding. Your from_encoding will be ignored.")
+ from_encoding = None
+
if len(kwargs) > 0:
arg = kwargs.keys().pop()
raise TypeError(
@@ -161,19 +170,29 @@
markup_type = "XML"
else:
markup_type = "HTML"
+
+ caller = traceback.extract_stack()[0]
+ filename = caller[0]
+ line_number = caller[1]
warnings.warn(self.NO_PARSER_SPECIFIED_WARNING % dict(
+ filename=filename,
+ line_number=line_number,
parser=builder.NAME,
markup_type=markup_type))
self.builder = builder
self.is_xml = builder.is_xml
+ self.known_xml = self.is_xml
self.builder.soup = self
self.parse_only = parse_only
if hasattr(markup, 'read'): # It's a file-type object.
markup = markup.read()
- elif len(markup) <= 256:
+ elif len(markup) <= 256 and (
+ (isinstance(markup, bytes) and not b'<' in markup)
+ or (isinstance(markup, unicode) and not u'<' in markup)
+ ):
# Print out warnings for a couple beginner problems
# involving passing non-markup to Beautiful Soup.
# Beautiful Soup will still parse the input as markup,
@@ -195,16 +214,10 @@
if isinstance(markup, unicode):
markup = markup.encode("utf8")
warnings.warn(
- '"%s" looks like a filename, not markup. You should probably open this file and pass the filehandle into Beautiful Soup.' % markup)
- if markup[:5] == "http:" or markup[:6] == "https:":
- # TODO: This is ugly but I couldn't get it to work in
- # Python 3 otherwise.
- if ((isinstance(markup, bytes) and not b' ' in markup)
- or (isinstance(markup, unicode) and not u' ' in markup)):
- if isinstance(markup, unicode):
- markup = markup.encode("utf8")
- warnings.warn(
- '"%s" looks like a URL. Beautiful Soup is not an HTTP client. You should probably use an HTTP client to get the document behind the URL, and feed that document to Beautiful Soup.' % markup)
+ '"%s" looks like a filename, not markup. You should '
+ 'probably open this file and pass the filehandle into '
+ 'Beautiful Soup.' % markup)
+ self._check_markup_is_url(markup)
for (self.markup, self.original_encoding, self.declared_html_encoding,
self.contains_replacement_characters) in (
@@ -223,15 +236,52 @@
self.builder.soup = None
def __copy__(self):
- return type(self)(self.encode(), builder=self.builder)
+ copy = type(self)(
+ self.encode('utf-8'), builder=self.builder, from_encoding='utf-8'
+ )
+
+ # Although we encoded the tree to UTF-8, that may not have
+ # been the encoding of the original markup. Set the copy's
+ # .original_encoding to reflect the original object's
+ # .original_encoding.
+ copy.original_encoding = self.original_encoding
+ return copy
def __getstate__(self):
# Frequently a tree builder can't be pickled.
d = dict(self.__dict__)
if 'builder' in d and not self.builder.picklable:
- del d['builder']
+ d['builder'] = None
return d
+ @staticmethod
+ def _check_markup_is_url(markup):
+ """
+ Check if markup looks like it's actually a URL and issue a warning
+ if so. Markup can be unicode or str (py2) / bytes (py3).
+ """
+ if isinstance(markup, bytes):
+ space = b' '
+ cant_start_with = (b"http:", b"https:")
+ elif isinstance(markup, unicode):
+ space = u' '
+ cant_start_with = (u"http:", u"https:")
+ else:
+ return
+
+ if any(markup.startswith(prefix) for prefix in cant_start_with):
+ if not space in markup:
+ if isinstance(markup, bytes):
+ decoded_markup = markup.decode('utf-8', 'replace')
+ else:
+ decoded_markup = markup
+ warnings.warn(
+ '"%s" looks like a URL. Beautiful Soup is not an'
+ ' HTTP client. You should probably use an HTTP client like'
+ ' requests to get the document behind the URL, and feed'
+ ' that document to Beautiful Soup.' % decoded_markup
+ )
+
def _feed(self):
# Convert the document to Unicode.
self.builder.reset()
@@ -335,7 +385,18 @@
if parent.next_sibling:
# This node is being inserted into an element that has
# already been parsed. Deal with any dangling references.
- index = parent.contents.index(o)
+ index = len(parent.contents)-1
+ while index >= 0:
+ if parent.contents[index] is o:
+ break
+ index -= 1
+ else:
+ raise ValueError(
+ "Error building tree: supposedly %r was inserted "
+ "into %r after the fact, but I don't see it!" % (
+ o, parent
+ )
+ )
if index == 0:
previous_element = parent
previous_sibling = None
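The hunk above replaces `parent.contents.index(o)` with a reverse identity scan. The distinction matters because `list.index` compares with `==`, so when a tree contains two equal-but-distinct nodes (such as identical whitespace strings), the equality lookup can resolve to the wrong position. A minimal stdlib sketch of the difference; the list "nodes" here are illustrative stand-ins, not bs4's `NavigableString`:

```python
# Two equal but distinct "nodes" (stand-ins for identical
# whitespace strings in a parse tree).
node_a = [" "]
node_b = [" "]
contents = [node_a, node_b]

# list.index compares with ==, so it returns the FIRST equal
# element -- the wrong node here:
assert contents.index(node_b) == 0

# The reverse identity scan from the patch finds the exact object:
index = len(contents) - 1
while index >= 0:
    if contents[index] is node_b:
        break
    index -= 1
assert index == 1
```

This is why the patched code can raise a `ValueError` only when the object is genuinely absent, rather than silently picking an equal sibling.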
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/beautifulsoup4-4.4.1/bs4/builder/__init__.py new/beautifulsoup4-4.5.0/bs4/builder/__init__.py
--- old/beautifulsoup4-4.4.1/bs4/builder/__init__.py 2015-06-28 21:48:48.000000000 +0200
+++ new/beautifulsoup4-4.5.0/bs4/builder/__init__.py 2016-07-20 02:28:09.000000000 +0200
@@ -1,9 +1,13 @@
+# Use of this source code is governed by a BSD-style license that can be
+# found in the LICENSE file.
+
from collections import defaultdict
import itertools
import sys
from bs4.element import (
CharsetMetaAttributeValue,
ContentMetaAttributeValue,
+ HTMLAwareEntitySubstitution,
whitespace_re
)
@@ -227,7 +231,7 @@
Such as which tags are empty-element tags.
"""
- preserve_whitespace_tags = set(['pre', 'textarea'])
+ preserve_whitespace_tags = HTMLAwareEntitySubstitution.preserve_whitespace_tags
empty_element_tags = set(['br' , 'hr', 'input', 'img', 'meta',
'spacer', 'link', 'frame', 'base'])
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/beautifulsoup4-4.4.1/bs4/builder/_html5lib.py new/beautifulsoup4-4.5.0/bs4/builder/_html5lib.py
--- old/beautifulsoup4-4.4.1/bs4/builder/_html5lib.py 2015-09-29 01:48:58.000000000 +0200
+++ new/beautifulsoup4-4.5.0/bs4/builder/_html5lib.py 2016-07-17 17:31:37.000000000 +0200
@@ -1,8 +1,10 @@
+# Use of this source code is governed by a BSD-style license that can be
+# found in the LICENSE file.
+
__all__ = [
'HTML5TreeBuilder',
]
-from pdb import set_trace
import warnings
from bs4.builder import (
PERMISSIVE,
@@ -23,6 +25,15 @@
Tag,
)
+try:
+ # Pre-0.99999999
+ from html5lib.treebuilders import _base as treebuilder_base
+ new_html5lib = False
+except ImportError, e:
+ # 0.99999999 and up
+ from html5lib.treebuilders import base as treebuilder_base
+ new_html5lib = True
+
class HTML5TreeBuilder(HTMLTreeBuilder):
"""Use html5lib to build a tree."""
@@ -47,7 +58,14 @@
if self.soup.parse_only is not None:
warnings.warn("You provided a value for parse_only, but the html5lib tree builder doesn't support parse_only. The entire document will be parsed.")
parser = html5lib.HTMLParser(tree=self.create_treebuilder)
- doc = parser.parse(markup, encoding=self.user_specified_encoding)
+
+ extra_kwargs = dict()
+ if not isinstance(markup, unicode):
+ if new_html5lib:
+ extra_kwargs['override_encoding'] = self.user_specified_encoding
+ else:
+ extra_kwargs['encoding'] = self.user_specified_encoding
+ doc = parser.parse(markup, **extra_kwargs)
# Set the character encoding detected by the tokenizer.
if isinstance(markup, unicode):
@@ -55,7 +73,13 @@
# charEncoding to UTF-8 if it gets Unicode input.
doc.original_encoding = None
else:
- doc.original_encoding = parser.tokenizer.stream.charEncoding[0]
+ original_encoding = parser.tokenizer.stream.charEncoding[0]
+ if not isinstance(original_encoding, basestring):
+ # In 0.99999999 and up, the encoding is an html5lib
+ # Encoding object. We want to use a string for compatibility
+ # with other tree builders.
+ original_encoding = original_encoding.name
+ doc.original_encoding = original_encoding
def create_treebuilder(self, namespaceHTMLElements):
self.underlying_builder = TreeBuilderForHtml5lib(
@@ -67,7 +91,7 @@
return u'<html><head></head><body>%s</body></html>' % fragment
-class TreeBuilderForHtml5lib(html5lib.treebuilders._base.TreeBuilder):
+class TreeBuilderForHtml5lib(treebuilder_base.TreeBuilder):
def __init__(self, soup, namespaceHTMLElements):
self.soup = soup
@@ -105,7 +129,7 @@
return self.soup
def getFragment(self):
- return html5lib.treebuilders._base.TreeBuilder.getFragment(self).element
+ return treebuilder_base.TreeBuilder.getFragment(self).element
class AttrList(object):
def __init__(self, element):
@@ -137,9 +161,9 @@
return name in list(self.attrs.keys())
-class Element(html5lib.treebuilders._base.Node):
+class Element(treebuilder_base.Node):
def __init__(self, element, soup, namespace):
- html5lib.treebuilders._base.Node.__init__(self, element.name)
+ treebuilder_base.Node.__init__(self, element.name)
self.element = element
self.soup = soup
self.namespace = namespace
@@ -324,7 +348,7 @@
class TextNode(Element):
def __init__(self, element, soup):
- html5lib.treebuilders._base.Node.__init__(self, None)
+ treebuilder_base.Node.__init__(self, None)
self.element = element
self.soup = soup
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/beautifulsoup4-4.4.1/bs4/builder/_htmlparser.py new/beautifulsoup4-4.5.0/bs4/builder/_htmlparser.py
--- old/beautifulsoup4-4.4.1/bs4/builder/_htmlparser.py 2015-06-28 21:49:08.000000000 +0200
+++ new/beautifulsoup4-4.5.0/bs4/builder/_htmlparser.py 2016-07-17 21:10:15.000000000 +0200
@@ -1,5 +1,8 @@
"""Use the HTMLParser library to parse HTML files that aren't too bad."""
+# Use of this source code is governed by a BSD-style license that can be
+# found in the LICENSE file.
+
__all__ = [
'HTMLParserTreeBuilder',
]
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/beautifulsoup4-4.4.1/bs4/builder/_lxml.py new/beautifulsoup4-4.5.0/bs4/builder/_lxml.py
--- old/beautifulsoup4-4.4.1/bs4/builder/_lxml.py 2015-06-28 21:49:20.000000000 +0200
+++ new/beautifulsoup4-4.5.0/bs4/builder/_lxml.py 2016-07-17 00:35:57.000000000 +0200
@@ -1,3 +1,5 @@
+# Use of this source code is governed by a BSD-style license that can be
+# found in the LICENSE file.
__all__ = [
'LXMLTreeBuilderForXML',
'LXMLTreeBuilder',
@@ -12,6 +14,7 @@
Doctype,
NamespacedAttribute,
ProcessingInstruction,
+ XMLProcessingInstruction,
)
from bs4.builder import (
FAST,
@@ -103,6 +106,10 @@
# iterate over the encodings, and tell lxml to try to parse
# the document as each one in turn.
is_html = not self.is_xml
+ if is_html:
+ self.processing_instruction_class = ProcessingInstruction
+ else:
+ self.processing_instruction_class = XMLProcessingInstruction
try_encodings = [user_specified_encoding, document_declared_encoding]
detector = EncodingDetector(
markup, try_encodings, is_html, exclude_encodings)
@@ -201,7 +208,7 @@
def pi(self, target, data):
self.soup.endData()
self.soup.handle_data(target + ' ' + data)
- self.soup.endData(ProcessingInstruction)
+ self.soup.endData(self.processing_instruction_class)
def data(self, content):
self.soup.handle_data(content)
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/beautifulsoup4-4.4.1/bs4/dammit.py new/beautifulsoup4-4.5.0/bs4/dammit.py
--- old/beautifulsoup4-4.4.1/bs4/dammit.py 2015-09-29 01:58:41.000000000 +0200
+++ new/beautifulsoup4-4.5.0/bs4/dammit.py 2016-07-17 21:14:33.000000000 +0200
@@ -6,9 +6,10 @@
Feed Parser. It works best on XML and HTML, but it does not rewrite the
XML or HTML to reflect a new encoding; that's the tree builder's job.
"""
+# Use of this source code is governed by a BSD-style license that can be
+# found in the LICENSE file.
__license__ = "MIT"
-from pdb import set_trace
import codecs
from htmlentitydefs import codepoint2name
import re
@@ -346,7 +347,7 @@
self.tried_encodings = []
self.contains_replacement_characters = False
self.is_html = is_html
-
+ self.log = logging.getLogger(__name__)
self.detector = EncodingDetector(
markup, override_encodings, is_html, exclude_encodings)
@@ -376,9 +377,10 @@
if encoding != "ascii":
u = self._convert_from(encoding, "replace")
if u is not None:
- logging.warning(
+ self.log.warning(
"Some characters could not be decoded, and were "
- "replaced with REPLACEMENT CHARACTER.")
+ "replaced with REPLACEMENT CHARACTER."
+ )
self.contains_replacement_characters = True
break
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/beautifulsoup4-4.4.1/bs4/diagnose.py new/beautifulsoup4-4.5.0/bs4/diagnose.py
--- old/beautifulsoup4-4.4.1/bs4/diagnose.py 2015-09-29 01:56:24.000000000 +0200
+++ new/beautifulsoup4-4.5.0/bs4/diagnose.py 2016-07-16 17:27:02.000000000 +0200
@@ -1,5 +1,7 @@
"""Diagnostic functions, mainly for use when doing tech support."""
+# Use of this source code is governed by a BSD-style license that can be
+# found in the LICENSE file.
__license__ = "MIT"
import cProfile
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/beautifulsoup4-4.4.1/bs4/element.py new/beautifulsoup4-4.5.0/bs4/element.py
--- old/beautifulsoup4-4.4.1/bs4/element.py 2015-09-29 01:56:01.000000000 +0200
+++ new/beautifulsoup4-4.5.0/bs4/element.py 2016-07-20 02:28:09.000000000 +0200
@@ -1,8 +1,10 @@
+# Use of this source code is governed by a BSD-style license that can be
+# found in the LICENSE file.
__license__ = "MIT"
-from pdb import set_trace
import collections
import re
+import shlex
import sys
import warnings
from bs4.dammit import EntitySubstitution
@@ -99,6 +101,8 @@
preformatted_tags = set(["pre"])
+ preserve_whitespace_tags = set(['pre', 'textarea'])
+
@classmethod
def _substitute_if_appropriate(cls, ns, f):
if (isinstance(ns, NavigableString)
@@ -169,11 +173,19 @@
This is used when mapping a formatter name ("minimal") to an
appropriate function (one that performs entity-substitution on
- the contents of <script> and <style> tags, or not). It's
+ the contents of <script> and <style> tags, or not). It can be
inefficient, but it should be called very rarely.
"""
+ if self.known_xml is not None:
+ # Most of the time we will have determined this when the
+ # document is parsed.
+ return self.known_xml
+
+ # Otherwise, it's likely that this element was created by
+ # direct invocation of the constructor from within the user's
+ # Python code.
if self.parent is None:
- # This is the top-level object. It should have .is_xml set
+ # This is the top-level object. It should have .known_xml set
# from tree creation. If not, take a guess--BS is usually
# used on HTML markup.
return getattr(self, 'is_xml', False)
@@ -677,6 +689,11 @@
PREFIX = ''
SUFFIX = ''
+ # We can't tell just by looking at a string whether it's contained
+ # in an XML document or an HTML document.
+
+ known_xml = None
+
def __new__(cls, value):
"""Create a new NavigableString.
@@ -743,10 +760,16 @@
SUFFIX = u']]>'
class ProcessingInstruction(PreformattedString):
+ """An SGML processing instruction."""
PREFIX = u'<?'
SUFFIX = u'>'
+class XMLProcessingInstruction(ProcessingInstruction):
+ """An XML processing instruction."""
+ PREFIX = u'<?'
+ SUFFIX = u'?>'
+
class Comment(PreformattedString):
PREFIX = u'<!--'
@@ -781,7 +804,8 @@
"""Represents a found HTML tag with its attributes and contents."""
def __init__(self, parser=None, builder=None, name=None, namespace=None,
- prefix=None, attrs=None, parent=None, previous=None):
+ prefix=None, attrs=None, parent=None, previous=None,
+ is_xml=None):
"Basic constructor."
if parser is None:
@@ -795,6 +819,14 @@
self.name = name
self.namespace = namespace
self.prefix = prefix
+ if builder is not None:
+ preserve_whitespace_tags = builder.preserve_whitespace_tags
+ else:
+ if is_xml:
+ preserve_whitespace_tags = []
+ else:
+ preserve_whitespace_tags = HTMLAwareEntitySubstitution.preserve_whitespace_tags
+ self.preserve_whitespace_tags = preserve_whitespace_tags
if attrs is None:
attrs = {}
elif attrs:
@@ -805,6 +837,13 @@
attrs = dict(attrs)
else:
attrs = dict(attrs)
+
+ # If possible, determine ahead of time whether this tag is an
+ # XML tag.
+ if builder:
+ self.known_xml = builder.is_xml
+ else:
+ self.known_xml = is_xml
self.attrs = attrs
self.contents = []
self.setup(parent, previous)
@@ -824,7 +863,7 @@
Its contents are a copy of the old Tag's contents.
"""
clone = type(self)(None, self.builder, self.name, self.namespace,
- self.nsprefix, self.attrs)
+ self.nsprefix, self.attrs, is_xml=self._is_xml)
for attr in ('can_be_empty_element', 'hidden'):
setattr(clone, attr, getattr(self, attr))
for child in self.contents:
@@ -997,7 +1036,7 @@
tag_name, tag_name))
return self.find(tag_name)
# We special case contents to avoid recursion.
- elif not tag.startswith("__") and not tag=="contents":
+ elif not tag.startswith("__") and not tag == "contents":
return self.find(tag)
raise AttributeError(
"'%s' object has no attribute '%s'" % (self.__class__, tag))
@@ -1057,10 +1096,11 @@
def _should_pretty_print(self, indent_level):
"""Should this tag be pretty-printed?"""
+
return (
- indent_level is not None and
- (self.name not in HTMLAwareEntitySubstitution.preformatted_tags
- or self._is_xml))
+ indent_level is not None
+ and self.name not in self.preserve_whitespace_tags
+ )
def decode(self, indent_level=None,
eventual_encoding=DEFAULT_OUTPUT_ENCODING,
@@ -1280,6 +1320,7 @@
_selector_combinators = ['>', '+', '~']
_select_debug = False
+ quoted_colon = re.compile('"[^"]*:[^"]*"')
def select_one(self, selector):
"""Perform a CSS selection operation on the current element."""
value = self.select(selector, limit=1)
@@ -1305,8 +1346,7 @@
if limit and len(context) >= limit:
break
return context
-
- tokens = selector.split()
+ tokens = shlex.split(selector)
current_context = [self]
if tokens[-1] in self._selector_combinators:
@@ -1358,7 +1398,7 @@
return classes.issubset(candidate.get('class', []))
checker = classes_match
- elif ':' in token:
+ elif ':' in token and not self.quoted_colon.search(token):
# Pseudo-class
tag_name, pseudo = token.split(':', 1)
if tag_name == '':
@@ -1389,11 +1429,8 @@
self.count += 1
if self.count == self.destination:
return True
- if self.count > self.destination:
- # Stop the generator that's sending us
- # these things.
- raise StopIteration()
- return False
+ else:
+ return False
checker = Counter(pseudo_value).nth_child_of_type
else:
raise NotImplementedError(
@@ -1498,13 +1535,12 @@
# don't include it in the context more than once.
new_context.append(candidate)
new_context_ids.add(id(candidate))
- if limit and len(new_context) >= limit:
- break
elif self._select_debug:
print " FAILURE %s %s" % (candidate.name, repr(candidate.attrs))
-
current_context = new_context
+ if limit and len(current_context) >= limit:
+ current_context = current_context[:limit]
if self._select_debug:
print "Final verdict:"
@@ -1668,21 +1704,15 @@
if isinstance(markup, list) or isinstance(markup, tuple):
# This should only happen when searching a multi-valued attribute
# like 'class'.
- if (isinstance(match_against, unicode)
- and ' ' in match_against):
- # A bit of a special case. If they try to match "foo
- # bar" on a multivalue attribute's value, only accept
- # the literal value "foo bar"
- #
- # XXX This is going to be pretty slow because we keep
- # splitting match_against. But it shouldn't come up
- # too often.
- return (whitespace_re.split(match_against) == markup)
- else:
- for item in markup:
- if self._matches(item, match_against):
- return True
- return False
+ for item in markup:
+ if self._matches(item, match_against):
+ return True
+ # We didn't match any particular value of the multivalue
+ # attribute, but maybe we match the attribute value when
+ # considered as a string.
+ if self._matches(' '.join(markup), match_against):
+ return True
+ return False
if match_against is True:
# True matches any non-None value.
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/beautifulsoup4-4.4.1/bs4/testing.py new/beautifulsoup4-4.5.0/bs4/testing.py
--- old/beautifulsoup4-4.4.1/bs4/testing.py 2015-09-29 01:56:34.000000000 +0200
+++ new/beautifulsoup4-4.5.0/bs4/testing.py 2016-07-17 04:18:57.000000000 +0200
@@ -1,5 +1,7 @@
"""Helper classes for tests."""
+# Use of this source code is governed by a BSD-style license that can be
+# found in the LICENSE file.
__license__ = "MIT"
import pickle
@@ -215,9 +217,22 @@
self.assertEqual(comment, baz.previous_element)
def test_preserved_whitespace_in_pre_and_textarea(self):
- """Whitespace must be preserved in <pre> and <textarea> tags."""
- self.assertSoupEquals("<pre> </pre>")
- self.assertSoupEquals("<textarea> woo </textarea>")
+ """Whitespace must be preserved in <pre> and <textarea> tags,
+ even if that would mean not prettifying the markup.
+ """
+ pre_markup = "<pre> </pre>"
+ textarea_markup = "<textarea> woo\nwoo </textarea>"
+ self.assertSoupEquals(pre_markup)
+ self.assertSoupEquals(textarea_markup)
+
+ soup = self.soup(pre_markup)
+ self.assertEqual(soup.pre.prettify(), pre_markup)
+
+ soup = self.soup(textarea_markup)
+ self.assertEqual(soup.textarea.prettify(), textarea_markup)
+
+ soup = self.soup("<textarea></textarea>")
+ self.assertEqual(soup.textarea.prettify(), "<textarea></textarea>")
def test_nested_inline_elements(self):
"""Inline elements can be nested indefinitely."""
@@ -480,7 +495,9 @@
hebrew_document = b'<html><head><title>Hebrew (ISO 8859-8) in Visual Directionality</title></head><body><h1>Hebrew (ISO 8859-8) in Visual Directionality</h1>\xed\xe5\xec\xf9</body></html>'
soup = self.soup(
hebrew_document, from_encoding="iso8859-8")
- self.assertEqual(soup.original_encoding, 'iso8859-8')
+ # Some tree builders call it iso8859-8, others call it iso-8859-8.
+ # That's not a difference we really care about.
+ assert soup.original_encoding in ('iso8859-8', 'iso-8859-8')
self.assertEqual(
soup.encode('utf-8'),
hebrew_document.decode("iso8859-8").encode("utf-8"))
@@ -563,6 +580,11 @@
soup = self.soup(markup)
self.assertEqual(markup, soup.encode("utf8"))
+ def test_processing_instruction(self):
+ markup = b"""<?xml version="1.0" encoding="utf8"?>\n<?PITarget PIContent?>"""
+ soup = self.soup(markup)
+ self.assertEqual(markup, soup.encode("utf8"))
+
def test_real_xhtml_document(self):
"""A real XHTML document should come out *exactly* the same as it went in."""
markup = b"""<?xml version="1.0" encoding="utf-8"?>
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/beautifulsoup4-4.4.1/bs4/tests/test_html5lib.py new/beautifulsoup4-4.5.0/bs4/tests/test_html5lib.py
--- old/beautifulsoup4-4.4.1/bs4/tests/test_html5lib.py 2015-09-29 01:51:22.000000000 +0200
+++ new/beautifulsoup4-4.5.0/bs4/tests/test_html5lib.py 2016-07-17 17:42:40.000000000 +0200
@@ -84,6 +84,17 @@
self.assertEqual(u"<body><p><em>foo</em></p><em>\n</em><p><em>bar<a></a></em></p>\n</body>", soup.body.decode())
self.assertEqual(2, len(soup.find_all('p')))
+ def test_reparented_markup_containing_identical_whitespace_nodes(self):
+ """Verify that we keep the two whitespace nodes in this
+ document distinct when reparenting the adjacent <tbody> tags.
+ """
+ markup = '<table> <tbody><tbody><ims></tbody> </table>'
+ soup = self.soup(markup)
+ space1, space2 = soup.find_all(string=' ')
+ tbody1, tbody2 = soup.find_all('tbody')
+ assert space1.next_element is tbody1
+ assert tbody2.next_element is space2
+
def test_processing_instruction(self):
"""Processing instructions become comments."""
markup = b"""<?PITarget PIContent?>"""
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/beautifulsoup4-4.4.1/bs4/tests/test_soup.py new/beautifulsoup4-4.5.0/bs4/tests/test_soup.py
--- old/beautifulsoup4-4.4.1/bs4/tests/test_soup.py 2015-07-05 19:19:39.000000000 +0200
+++ new/beautifulsoup4-4.5.0/bs4/tests/test_soup.py 2016-07-16 17:55:30.000000000 +0200
@@ -118,15 +118,34 @@
soup = self.soup(filename)
self.assertEqual(0, len(w))
- def test_url_warning(self):
- with warnings.catch_warnings(record=True) as w:
- soup = self.soup("http://www.crummy.com/")
- msg = str(w[0].message)
- self.assertTrue("looks like a URL" in msg)
+ def test_url_warning_with_bytes_url(self):
+ with warnings.catch_warnings(record=True) as warning_list:
+ soup = self.soup(b"http://www.crummybytes.com/")
+ # Note that this isn't the only warning that can be raised
+ # during execution.
+ self.assertTrue(any("looks like a URL" in str(w.message)
+ for w in warning_list))
+
+ def test_url_warning_with_unicode_url(self):
+ with warnings.catch_warnings(record=True) as warning_list:
+ # Note: this URL must differ from the bytes one, otherwise
+ # Python's warnings system swallows the second warning.
+ soup = self.soup(u"http://www.crummyunicode.com/")
+ self.assertTrue(any("looks like a URL" in str(w.message)
+ for w in warning_list))
+
+ def test_url_warning_with_bytes_and_space(self):
+ with warnings.catch_warnings(record=True) as warning_list:
+ soup = self.soup(b"http://www.crummybytes.com/ is great")
+ self.assertFalse(any("looks like a URL" in str(w.message)
+ for w in warning_list))
+
+ def test_url_warning_with_unicode_and_space(self):
+ with warnings.catch_warnings(record=True) as warning_list:
+ soup = self.soup(u"http://www.crummyuncode.com/ is great")
+ self.assertFalse(any("looks like a URL" in str(w.message)
+ for w in warning_list))
- with warnings.catch_warnings(record=True) as w:
- soup = self.soup("http://www.crummy.com/ is great")
- self.assertEqual(0, len(w))
class TestSelectiveParsing(SoupTest):
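The comment in `test_url_warning_with_unicode_url` above points at a real Python behavior: under the default filter, the warnings machinery reports a given (message, category, line) combination only once, so a second identical warning is swallowed. A small stdlib demonstration of why the two test URLs must differ:

```python
import warnings

def emit(message):
    # All calls warn from this same line, so the per-module registry
    # key (message, category, lineno) collides for repeated messages.
    warnings.warn(message)

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("default")  # report each distinct warning once
    emit("http://www.crummybytes.com/ looks like a URL")
    emit("http://www.crummybytes.com/ looks like a URL")  # swallowed
    emit("http://www.crummyunicode.com/ looks like a URL")

# Only two warnings were actually recorded.
assert len(caught) == 2
```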
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/beautifulsoup4-4.4.1/bs4/tests/test_tree.py new/beautifulsoup4-4.5.0/bs4/tests/test_tree.py
--- old/beautifulsoup4-4.4.1/bs4/tests/test_tree.py 2015-09-29 01:42:21.000000000 +0200
+++ new/beautifulsoup4-4.5.0/bs4/tests/test_tree.py 2016-07-20 02:51:35.000000000 +0200
@@ -222,6 +222,17 @@
self.assertSelects(
tree.find_all(id_matches_name), ["Match 1.", "Match 2."])
+ def test_find_with_multi_valued_attribute(self):
+ soup = self.soup(
+ "<div class='a b'>1</div><div class='a c'>2</div><div class='a d'>3</div>"
+ )
+ r1 = soup.find('div', 'a d')
+ r2 = soup.find('div', re.compile(r'a d'))
+ r3, r4 = soup.find_all('div', ['a b', 'a d'])
+ self.assertEqual('3', r1.string)
+ self.assertEqual('3', r2.string)
+ self.assertEqual('1', r3.string)
+ self.assertEqual('3', r4.string)
class TestFindAllByAttribute(TreeTest):
@@ -294,10 +305,10 @@
f = tree.find_all("gar", class_=re.compile("a"))
self.assertSelects(f, ["Found it"])
- # Since the class is not the string "foo bar", but the two
- # strings "foo" and "bar", this will not find anything.
+ # If the search fails to match the individual strings "foo" and "bar",
+ # it will be tried against the combined string "foo bar".
f = tree.find_all("gar", class_=re.compile("o b"))
- self.assertSelects(f, [])
+ self.assertSelects(f, ["Found it"])
def test_find_all_with_non_dictionary_for_attrs_finds_by_class(self):
soup = self.soup("<a class='bar'>Found it</a>")
@@ -1328,6 +1339,13 @@
copied = copy.deepcopy(self.tree)
self.assertEqual(copied.decode(), self.tree.decode())
+ def test_copy_preserves_encoding(self):
+ soup = BeautifulSoup(b'<p> </p>', 'html.parser')
+ encoding = soup.original_encoding
+ copy = soup.__copy__()
+ self.assertEqual(u"<p> </p>", unicode(copy))
+ self.assertEqual(encoding, copy.original_encoding)
+
def test_unicode_pickle(self):
# A tree containing Unicode characters can be pickled.
html = u"<b>\N{SNOWMAN}</b>"
@@ -1676,8 +1694,8 @@
def setUp(self):
self.soup = BeautifulSoup(self.HTML, 'html.parser')
- def assertSelects(self, selector, expected_ids):
- el_ids = [el['id'] for el in self.soup.select(selector)]
+ def assertSelects(self, selector, expected_ids, **kwargs):
+ el_ids = [el['id'] for el in self.soup.select(selector, **kwargs)]
el_ids.sort()
expected_ids.sort()
self.assertEqual(expected_ids, el_ids,
@@ -1720,6 +1738,13 @@
for selector in ('html div', 'html body div', 'body div'):
self.assertSelects(selector, ['data1', 'main', 'inner', 'footer'])
+
+ def test_limit(self):
+ self.assertSelects('html div', ['main'], limit=1)
+ self.assertSelects('html body div', ['inner', 'main'], limit=2)
+ self.assertSelects('body div', ['data1', 'main', 'inner', 'footer'],
+ limit=10)
+
def test_tag_no_match(self):
self.assertEqual(len(self.soup.select('del')), 0)
@@ -1902,6 +1927,14 @@
('div[data-tag]', ['data1'])
)
+ def test_quoted_space_in_selector_name(self):
+ html = """<div style="display: wrong">nope</div>
+ <div style="display: right">yes</div>
+ """
+ soup = BeautifulSoup(html, 'html.parser')
+ [chosen] = soup.select('div[style="display: right"]')
+ self.assertEqual("yes", chosen.string)
+
def test_unsupported_pseudoclass(self):
self.assertRaises(
NotImplementedError, self.soup.select, "a:no-such-pseudoclass")
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/beautifulsoup4-4.4.1/doc/source/index.rst new/beautifulsoup4-4.5.0/doc/source/index.rst
--- old/beautifulsoup4-4.4.1/doc/source/index.rst 2015-09-29 00:46:53.000000000 +0200
+++ new/beautifulsoup4-4.5.0/doc/source/index.rst 2015-11-24 13:36:12.000000000 +0100
@@ -1649,7 +1649,7 @@
soup.select("title")
# [<title>The Dormouse's story</title>]
- soup.select("p nth-of-type(3)")
+ soup.select("p:nth-of-type(3)")
# [<p class="story">...</p>]
Find tags beneath other tags::
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/beautifulsoup4-4.4.1/setup.py new/beautifulsoup4-4.5.0/setup.py
--- old/beautifulsoup4-4.4.1/setup.py 2015-09-29 02:11:15.000000000 +0200
+++ new/beautifulsoup4-4.5.0/setup.py 2016-07-20 12:37:28.000000000 +0200
@@ -5,7 +5,7 @@
setup(
name="beautifulsoup4",
- version = "4.4.1",
+ version = "4.5.0",
author="Leonard Richardson",
author_email='leonardr(a)segfault.org',
url="http://www.crummy.com/software/BeautifulSoup/bs4/",
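The new `_check_markup_is_url` helper in the bs4 patch above centralizes the URL heuristic that was previously inlined in `__init__`. A Python 3 sketch of the same logic (`unicode` becomes `str`; unlike the original, this version returns a boolean so it can be tested directly, and the warning text is abbreviated):

```python
import warnings

def check_markup_is_url(markup):
    """Warn and return True when a short, space-free string starting
    with http:/https: is passed where markup was expected."""
    if isinstance(markup, bytes):
        space, prefixes = b' ', (b"http:", b"https:")
    elif isinstance(markup, str):
        space, prefixes = ' ', ("http:", "https:")
    else:
        # File objects, None, etc. are handled elsewhere.
        return False
    if any(markup.startswith(p) for p in prefixes) and space not in markup:
        if isinstance(markup, bytes):
            markup = markup.decode('utf-8', 'replace')
        warnings.warn('"%s" looks like a URL, not markup.' % markup)
        return True
    return False
```

As in the patch, a string containing a space is assumed to be prose rather than a URL, so no warning is issued for it.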
Hello community,
here is the log from the commit of package python3-PyMySQL for openSUSE:Factory checked in at 2016-07-28 23:46:25
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/python3-PyMySQL (Old)
and /work/SRC/openSUSE:Factory/.python3-PyMySQL.new (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Package is "python3-PyMySQL"
Changes:
--------
--- /work/SRC/openSUSE:Factory/python3-PyMySQL/python3-PyMySQL.changes 2016-05-25 21:23:12.000000000 +0200
+++ /work/SRC/openSUSE:Factory/.python3-PyMySQL.new/python3-PyMySQL.changes 2016-07-28 23:46:30.000000000 +0200
@@ -1,0 +2,28 @@
+Sat Jul 23 17:32:11 UTC 2016 - arun(a)gmx.de
+
+- specfile:
+ *updated url
+
+- update to version 0.7.5:
+ * Fix exception raised while importing when getpwuid() fails (#472)
+ * SSCursor supports LOAD DATA LOCAL INFILE (#473)
+ * Fix encoding error happen for JSON type (#477)
+ * Fix test fail on Python 2.7 and MySQL 5.7 (#478)
+
+- changes from version 0.7.4:
+ * Fix AttributeError may happen while Connection.__del__ (#463)
+ * Fix SyntaxError in test_cursor. (#464)
+ * frozenset support for query value. (#461)
+ * Start using readthedocs.io
+
+- changes from version 0.7.3:
+ * Add read_timeout and write_timeout option.
+ * Support serialization customization by `conv` option.
+ * Unknown type is converted by `str()`, for MySQLdb compatibility.
+ * Support '%%' in `Cursor.executemany()`
+ * Support REPLACE statement in `Cursor.executemany()`
+ * Fix handling incomplete row caused by 'SHOW SLAVE HOSTS'.
+ * Fix decode error when use_unicode=False on PY3
+ * Fix port option in my.cnf file is ignored.
+
+-------------------------------------------------------------------
@@ -6 +33,0 @@
-
Old:
----
PyMySQL-0.7.2.tar.gz
New:
----
PyMySQL-0.7.5.tar.gz
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Other differences:
------------------
++++++ python3-PyMySQL.spec ++++++
--- /var/tmp/diff_new_pack.ZQWNB7/_old 2016-07-28 23:46:32.000000000 +0200
+++ /var/tmp/diff_new_pack.ZQWNB7/_new 2016-07-28 23:46:32.000000000 +0200
@@ -17,12 +17,12 @@
Name: python3-PyMySQL
-Version: 0.7.2
+Version: 0.7.5
Release: 0
Summary: Pure Python MySQL Driver
License: MIT
Group: Development/Languages/Python
-Url: http://code.google.com/p/pymysql
+Url: https://github.com/PyMySQL/PyMySQL
Source: https://files.pythonhosted.org/packages/source/P/PyMySQL/PyMySQL-%{version}…
BuildRequires: python3-devel
BuildRequires: python3-setuptools
++++++ PyMySQL-0.7.2.tar.gz -> PyMySQL-0.7.5.tar.gz ++++++
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/PyMySQL-0.7.2/CHANGELOG new/PyMySQL-0.7.5/CHANGELOG
--- old/PyMySQL-0.7.2/CHANGELOG 2016-02-24 11:38:43.000000000 +0100
+++ new/PyMySQL-0.7.5/CHANGELOG 2016-06-28 14:18:40.000000000 +0200
@@ -1,5 +1,37 @@
# Changes
+## 0.7.5
+
+Release date: 2016-06-28
+
+* Fix exception raised while importing when getpwuid() fails (#472)
+* SSCursor supports LOAD DATA LOCAL INFILE (#473)
+* Fix encoding error happen for JSON type (#477)
+* Fix test fail on Python 2.7 and MySQL 5.7 (#478)
+
+## 0.7.4
+
+Release date: 2016-05-26
+
+* Fix AttributeError may happen while Connection.__del__ (#463)
+* Fix SyntaxError in test_cursor. (#464)
+* frozenset support for query value. (#461)
+* Start using readthedocs.io
+
+## 0.7.3
+
+Release date: 2016-05-19
+
+* Add read_timeout and write_timeout option.
+* Support serialization customization by `conv` option.
+* Unknown type is converted by `str()`, for MySQLdb compatibility.
+* Support '%%' in `Cursor.executemany()`
+* Support REPLACE statement in `Cursor.executemany()`
+* Fix handling incomplete row caused by 'SHOW SLAVE HOSTS'.
+* Fix decode error when use_unicode=False on PY3
+* Fix port option in my.cnf file is ignored.
+
+
## 0.7.2
Release date: 2016-02-24
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/PyMySQL-0.7.2/PKG-INFO new/PyMySQL-0.7.5/PKG-INFO
--- old/PyMySQL-0.7.2/PKG-INFO 2016-02-24 11:40:57.000000000 +0100
+++ new/PyMySQL-0.7.5/PKG-INFO 2016-06-28 14:18:51.000000000 +0200
@@ -1,6 +1,6 @@
Metadata-Version: 1.1
Name: PyMySQL
-Version: 0.7.2
+Version: 0.7.5
Summary: Pure Python MySQL Driver
Home-page: https://github.com/PyMySQL/PyMySQL/
Author: INADA Naoki
@@ -8,15 +8,14 @@
License: MIT
Description: UNKNOWN
Platform: UNKNOWN
+Classifier: Development Status :: 5 - Production/Stable
Classifier: Programming Language :: Python :: 2
Classifier: Programming Language :: Python :: 2.7
Classifier: Programming Language :: Python :: 3
-Classifier: Programming Language :: Python :: 3.3
Classifier: Programming Language :: Python :: 3.4
Classifier: Programming Language :: Python :: 3.5
Classifier: Programming Language :: Python :: Implementation :: CPython
Classifier: Programming Language :: Python :: Implementation :: PyPy
-Classifier: Development Status :: 4 - Beta
Classifier: Intended Audience :: Developers
Classifier: License :: OSI Approved :: MIT License
Classifier: Topic :: Database
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/PyMySQL-0.7.2/PyMySQL.egg-info/PKG-INFO new/PyMySQL-0.7.5/PyMySQL.egg-info/PKG-INFO
--- old/PyMySQL-0.7.2/PyMySQL.egg-info/PKG-INFO 2016-02-24 11:40:56.000000000 +0100
+++ new/PyMySQL-0.7.5/PyMySQL.egg-info/PKG-INFO 2016-06-28 14:18:50.000000000 +0200
@@ -1,6 +1,6 @@
Metadata-Version: 1.1
Name: PyMySQL
-Version: 0.7.2
+Version: 0.7.5
Summary: Pure Python MySQL Driver
Home-page: https://github.com/PyMySQL/PyMySQL/
Author: INADA Naoki
@@ -8,15 +8,14 @@
License: MIT
Description: UNKNOWN
Platform: UNKNOWN
+Classifier: Development Status :: 5 - Production/Stable
Classifier: Programming Language :: Python :: 2
Classifier: Programming Language :: Python :: 2.7
Classifier: Programming Language :: Python :: 3
-Classifier: Programming Language :: Python :: 3.3
Classifier: Programming Language :: Python :: 3.4
Classifier: Programming Language :: Python :: 3.5
Classifier: Programming Language :: Python :: Implementation :: CPython
Classifier: Programming Language :: Python :: Implementation :: PyPy
-Classifier: Development Status :: 4 - Beta
Classifier: Intended Audience :: Developers
Classifier: License :: OSI Approved :: MIT License
Classifier: Topic :: Database
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/PyMySQL-0.7.2/PyMySQL.egg-info/SOURCES.txt new/PyMySQL-0.7.5/PyMySQL.egg-info/SOURCES.txt
--- old/PyMySQL-0.7.2/PyMySQL.egg-info/SOURCES.txt 2016-02-24 11:40:57.000000000 +0100
+++ new/PyMySQL-0.7.5/PyMySQL.egg-info/SOURCES.txt 2016-06-28 14:18:51.000000000 +0200
@@ -39,7 +39,6 @@
pymysql/tests/test_connection.py
pymysql/tests/test_converters.py
pymysql/tests/test_cursor.py
-pymysql/tests/test_example.py
pymysql/tests/test_issues.py
pymysql/tests/test_load_local.py
pymysql/tests/test_nextset.py
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/PyMySQL-0.7.2/README.rst new/PyMySQL-0.7.5/README.rst
--- old/PyMySQL-0.7.2/README.rst 2016-02-24 11:30:58.000000000 +0100
+++ new/PyMySQL-0.7.5/README.rst 2016-05-24 10:05:28.000000000 +0200
@@ -1,18 +1,30 @@
-=======
-PyMySQL
-=======
+.. image:: https://readthedocs.org/projects/pymysql/badge/?version=latest
+ :target: http://pymysql.readthedocs.io/en/latest/?badge=latest
+ :alt: Documentation Status
.. image:: https://travis-ci.org/PyMySQL/PyMySQL.svg?branch=master
- :target: https://travis-ci.org/PyMySQL/PyMySQL
+ :target: https://travis-ci.org/PyMySQL/PyMySQL
.. image:: https://coveralls.io/repos/PyMySQL/PyMySQL/badge.svg?branch=master&service=…
- :target: https://coveralls.io/github/PyMySQL/PyMySQL?branch=master
+ :target: https://coveralls.io/github/PyMySQL/PyMySQL?branch=master
+
+.. image:: https://img.shields.io/badge/license-MIT-blue.svg
+ :target: https://github.com/PyMySQL/PyMySQL/blob/master/LICENSE
+
+
+PyMySQL
+=======
.. contents::
This package contains a pure-Python MySQL client library. The goal of PyMySQL
is to be a drop-in replacement for MySQLdb and work on CPython, PyPy and IronPython.
+NOTE: PyMySQL doesn't support low level APIs `_mysql` provides like `data_seek`,
+`store_result`, and `use_result`. You should use high level APIs defined in PEP 294.
+But some APIs like `autocommit` and `ping` are supported because PEP 294 doesn't cover
+their usecase.
+
Requirements
-------------
@@ -42,38 +54,14 @@
$ pip install PyMySQL
-Alternatively (e.g. if ``pip`` is not available), a tarball can be downloaded
-from GitHub and installed with Setuptools::
- $ # X.X is the desired PyMySQL version (e.g. 0.5 or 0.6).
- $ curl -L https://github.com/PyMySQL/PyMySQL/tarball/pymysql-X.X | tar xz
- $ cd PyMySQL*
- $ python setup.py install
- $ # The folder PyMySQL* can be safely removed now.
-
-Test Suite
-----------
-
-If you would like to run the test suite, create database for test like this::
-
- mysql -e 'create database test_pymysql DEFAULT CHARACTER SET utf8 DEFAULT COLLATE utf8_general_ci;'
- mysql -e 'create database test_pymysql2 DEFAULT CHARACTER SET utf8 DEFAULT COLLATE utf8_general_ci;'
-
-Then, copy the file ``.travis.databases.json`` to ``pymysql/tests/databases.json``
-and edit the new file to match your MySQL configuration::
-
- $ cp .travis.databases.json pymysql/tests/databases.json
- $ $EDITOR pymysql/tests/databases.json
-
-To run all the tests, execute the script ``runtests.py``::
-
- $ python runtests.py
-
-A ``tox.ini`` file is also provided for conveniently running tests on multiple
-Python versions::
+Documentation
+-------------
- $ tox
+Documentation is available online: http://pymysql.readthedocs.io/
+For support, please refer to the `StackOverflow
+<http://stackoverflow.com/questions/tagged/pymysql>`_.
Example
-------
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/PyMySQL-0.7.2/pymysql/__init__.py new/PyMySQL-0.7.5/pymysql/__init__.py
--- old/PyMySQL-0.7.2/pymysql/__init__.py 2016-02-24 11:39:11.000000000 +0100
+++ new/PyMySQL-0.7.5/pymysql/__init__.py 2016-06-28 14:18:40.000000000 +0200
@@ -33,7 +33,7 @@
DateFromTicks, TimeFromTicks, TimestampFromTicks
-VERSION = (0, 7, 2, None)
+VERSION = (0, 7, 5, None)
threadsafety = 1
apilevel = "2.0"
paramstyle = "pyformat"
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/PyMySQL-0.7.2/pymysql/connections.py new/PyMySQL-0.7.5/pymysql/connections.py
--- old/PyMySQL-0.7.2/pymysql/connections.py 2016-02-24 11:30:58.000000000 +0100
+++ new/PyMySQL-0.7.5/pymysql/connections.py 2016-06-28 13:27:57.000000000 +0200
@@ -18,8 +18,7 @@
from .charset import MBLENGTH, charset_by_name, charset_by_id
from .constants import CLIENT, COMMAND, FIELD_TYPE, SERVER_STATUS
-from .converters import (
- escape_item, encoders, decoders, escape_string, through)
+from .converters import escape_item, escape_string, through, conversions as _conv
from .cursors import Cursor
from .optionfile import Parser
from .util import byte2int, int2byte
@@ -36,7 +35,8 @@
import getpass
DEFAULT_USER = getpass.getuser()
del getpass
-except ImportError:
+except (ImportError, KeyError):
+ # KeyError occurs when there's no entry in OS database for a current user.
DEFAULT_USER = None
@@ -88,6 +88,7 @@
FIELD_TYPE.BLOB,
FIELD_TYPE.LONG_BLOB,
FIELD_TYPE.MEDIUM_BLOB,
+ FIELD_TYPE.JSON,
FIELD_TYPE.STRING,
FIELD_TYPE.TINY_BLOB,
FIELD_TYPE.VAR_STRING,
@@ -523,19 +524,19 @@
connect().
"""
- socket = None
+ _sock = None
_auth_plugin_name = ''
def __init__(self, host=None, user=None, password="",
- database=None, port=3306, unix_socket=None,
+ database=None, port=0, unix_socket=None,
charset='', sql_mode=None,
- read_default_file=None, conv=decoders, use_unicode=None,
+ read_default_file=None, conv=None, use_unicode=None,
client_flag=0, cursorclass=Cursor, init_command=None,
connect_timeout=None, ssl=None, read_default_group=None,
compress=None, named_pipe=None, no_delay=None,
autocommit=False, db=None, passwd=None, local_infile=False,
max_allowed_packet=16*1024*1024, defer_connect=False,
- auth_plugin_map={}):
+ auth_plugin_map={}, read_timeout=None, write_timeout=None):
"""
Establish a connection to the MySQL database. Accepts several
arguments:
@@ -544,15 +545,16 @@
user: Username to log in as
password: Password to use.
database: Database to use, None to not use a particular one.
- port: MySQL port to use, default is usually OK.
+ port: MySQL port to use, default is usually OK. (default: 3306)
unix_socket: Optionally, you can use a unix socket rather than TCP/IP.
charset: Charset you want to use.
sql_mode: Default SQL_MODE to use.
read_default_file:
Specifies my.cnf file to read these parameters from under the [client] section.
conv:
- Decoders dictionary to use instead of the default one.
- This is used to provide custom marshalling of types. See converters.
+ Conversion dictionary to use instead of the default one.
+ This is used to provide custom marshalling and unmarshaling of types.
+ See converters.
use_unicode:
Whether or not to default to unicode strings.
This option defaults to true for Py3k.
@@ -635,11 +637,17 @@
charset = _config("default-character-set", charset)
self.host = host or "localhost"
- self.port = port
+ self.port = port or 3306
self.user = user or DEFAULT_USER
self.password = password or ""
self.db = database
self.unix_socket = unix_socket
+ if read_timeout is not None and read_timeout <= 0:
+ raise ValueError("read_timeout should be >= 0")
+ self._read_timeout = read_timeout
+ if write_timeout is not None and write_timeout <= 0:
+ raise ValueError("write_timeout should be >= 0")
+ self._write_timeout = write_timeout
if charset:
self.charset = charset
self.use_unicode = True
@@ -667,14 +675,17 @@
#: specified autocommit mode. None means use server default.
self.autocommit_mode = autocommit
- self.encoders = encoders # Need for MySQLdb compatibility.
- self.decoders = conv
+ if conv is None:
+ conv = _conv
+ # Need for MySQLdb compatibility.
+ self.encoders = dict([(k, v) for (k, v) in conv.items() if type(k) is not int])
+ self.decoders = dict([(k, v) for (k, v) in conv.items() if type(k) is int])
self.sql_mode = sql_mode
self.init_command = init_command
self.max_allowed_packet = max_allowed_packet
self._auth_plugin_map = auth_plugin_map
if defer_connect:
- self.socket = None
+ self._sock = None
else:
self.connect()
@@ -697,7 +708,7 @@
def close(self):
"""Send the quit message and close the socket"""
- if self.socket is None:
+ if self._sock is None:
raise err.Error("Already closed")
send_data = struct.pack('<iB', 1, COMMAND.COM_QUIT)
try:
@@ -705,22 +716,22 @@
except Exception:
pass
finally:
- sock = self.socket
- self.socket = None
+ sock = self._sock
+ self._sock = None
self._rfile = None
sock.close()
@property
def open(self):
- return self.socket is not None
+ return self._sock is not None
def __del__(self):
- if self.socket:
+ if self._sock:
try:
- self.socket.close()
+ self._sock.close()
except:
pass
- self.socket = None
+ self._sock = None
self._rfile = None
def autocommit(self, value):
@@ -770,19 +781,25 @@
return result.rows
def select_db(self, db):
- '''Set current db'''
+ """Set current db"""
self._execute_command(COMMAND.COM_INIT_DB, db)
self._read_ok_packet()
def escape(self, obj, mapping=None):
- """Escape whatever value you pass to it"""
+ """Escape whatever value you pass to it.
+
+ Non-standard, for internal use; do not use this in your applications.
+ """
if isinstance(obj, str_type):
return "'" + self.escape_string(obj) + "'"
return escape_item(obj, self.charset, mapping=mapping)
def literal(self, obj):
- '''Alias for escape()'''
- return self.escape(obj)
+ """Alias for escape()
+
+ Non-standard, for internal use; do not use this in your applications.
+ """
+ return self.escape(obj, self.encoders)
def escape_string(self, s):
if (self.server_status &
@@ -834,7 +851,7 @@
def ping(self, reconnect=True):
"""Check if the server is alive"""
- if self.socket is None:
+ if self._sock is None:
if reconnect:
self.connect()
reconnect = False
@@ -883,7 +900,7 @@
sock.setsockopt(socket.IPPROTO_TCP, socket.TCP_NODELAY, 1)
sock.settimeout(None)
sock.setsockopt(socket.SOL_SOCKET, socket.SO_KEEPALIVE, 1)
- self.socket = sock
+ self._sock = sock
self._rfile = _makefile(sock, 'rb')
self._next_seq_id = 0
@@ -967,6 +984,7 @@
return packet
def _read_bytes(self, num_bytes):
+ self._sock.settimeout(self._read_timeout)
while True:
try:
data = self._rfile.read(num_bytes)
@@ -983,8 +1001,9 @@
return data
def _write_bytes(self, data):
+ self._sock.settimeout(self._write_timeout)
try:
- self.socket.sendall(data)
+ self._sock.sendall(data)
except IOError as e:
raise err.OperationalError(2006, "MySQL server has gone away (%r)" % (e,))
@@ -1012,7 +1031,7 @@
return 0
def _execute_command(self, command, sql):
- if not self.socket:
+ if not self._sock:
raise err.InterfaceError("(0, '')")
# If the last query was unbuffered, make sure it finishes before
@@ -1066,8 +1085,8 @@
if self.ssl and self.server_capabilities & CLIENT.SSL:
self.write_packet(data_init)
- self.socket = self.ctx.wrap_socket(self.socket, server_hostname=self.host)
- self._rfile = _makefile(self.socket, 'rb')
+ self._sock = self.ctx.wrap_socket(self._sock, server_hostname=self.host)
+ self._rfile = _makefile(self._sock, 'rb')
data = data_init + self.user + b'\0'
@@ -1301,6 +1320,10 @@
self._read_ok_packet(first_packet)
self.unbuffered_active = False
self.connection = None
+ elif first_packet.is_load_local_packet():
+ self._read_load_local_packet(first_packet)
+ self.unbuffered_active = False
+ self.connection = None
else:
self.field_count = first_packet.read_length_encoded_integer()
self._get_descriptions()
@@ -1390,7 +1413,12 @@
def _read_row_from_packet(self, packet):
row = []
for encoding, converter in self.converters:
- data = packet.read_length_coded_string()
+ try:
+ data = packet.read_length_coded_string()
+ except IndexError:
+ # No more columns in this row
+ # See https://github.com/PyMySQL/PyMySQL/pull/434
+ break
if data is not None:
if encoding is not None:
data = data.decode(encoding)
@@ -1441,7 +1469,7 @@
def send_data(self):
"""Send data packets from the local file to the server"""
- if not self.connection.socket:
+ if not self.connection._sock:
raise err.InterfaceError("(0, '')")
conn = self.connection
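The read/write timeout support added in the connections.py hunk above boils down to a small validation rule in `Connection.__init__` plus a per-direction `settimeout()` call before each socket read or write. A minimal standalone sketch of the validation part (the socket plumbing is omitted):

```python
def validate_timeouts(read_timeout=None, write_timeout=None):
    # Mirrors the 0.7.3 checks: a timeout is either None (disabled)
    # or a positive number of seconds. Note the upstream error message
    # says ">= 0" while the check actually rejects 0 as well.
    if read_timeout is not None and read_timeout <= 0:
        raise ValueError("read_timeout should be >= 0")
    if write_timeout is not None and write_timeout <= 0:
        raise ValueError("write_timeout should be >= 0")
    return read_timeout, write_timeout

print(validate_timeouts(5.0, None))
```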
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/PyMySQL-0.7.2/pymysql/constants/FIELD_TYPE.py new/PyMySQL-0.7.5/pymysql/constants/FIELD_TYPE.py
--- old/PyMySQL-0.7.2/pymysql/constants/FIELD_TYPE.py 2014-08-29 13:32:54.000000000 +0200
+++ new/PyMySQL-0.7.5/pymysql/constants/FIELD_TYPE.py 2016-06-28 13:27:57.000000000 +0200
@@ -17,6 +17,7 @@
NEWDATE = 14
VARCHAR = 15
BIT = 16
+JSON = 245
NEWDECIMAL = 246
ENUM = 247
SET = 248
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/PyMySQL-0.7.2/pymysql/converters.py new/PyMySQL-0.7.5/pymysql/converters.py
--- old/PyMySQL-0.7.2/pymysql/converters.py 2016-02-24 11:30:58.000000000 +0100
+++ new/PyMySQL-0.7.5/pymysql/converters.py 2016-05-24 09:22:27.000000000 +0200
@@ -109,7 +109,7 @@
return u"'%s'" % _escape_unicode(value)
def escape_str(value, mapping=None):
- return "'%s'" % escape_string(value, mapping)
+ return "'%s'" % escape_string(str(value), mapping)
def escape_None(value, mapping=None):
return 'NULL'
@@ -161,6 +161,8 @@
True
"""
+ if not PY2 and isinstance(obj, (bytes, bytearray)):
+ obj = obj.decode('ascii')
if ' ' in obj:
sep = ' '
elif 'T' in obj:
@@ -196,6 +198,8 @@
can accept values as (+|-)DD HH:MM:SS. The latter format will not
be parsed correctly by this function.
"""
+ if not PY2 and isinstance(obj, (bytes, bytearray)):
+ obj = obj.decode('ascii')
try:
microseconds = 0
if "." in obj:
@@ -238,6 +242,8 @@
to be treated as time-of-day and not a time offset, then you can
use set this function as the converter for FIELD_TYPE.TIME.
"""
+ if not PY2 and isinstance(obj, (bytes, bytearray)):
+ obj = obj.decode('ascii')
try:
microseconds = 0
if "." in obj:
@@ -263,6 +269,8 @@
True
"""
+ if not PY2 and isinstance(obj, (bytes, bytearray)):
+ obj = obj.decode('ascii')
try:
return datetime.date(*[ int(x) for x in obj.split('-', 2) ])
except ValueError:
@@ -290,6 +298,8 @@
True
"""
+ if not PY2 and isinstance(timestamp, (bytes, bytearray)):
+ timestamp = timestamp.decode('ascii')
if timestamp[4] == '-':
return convert_datetime(timestamp)
timestamp += "0"*(14-len(timestamp)) # padding
@@ -302,6 +312,8 @@
return None
def convert_set(s):
+ if isinstance(s, (bytes, bytearray)):
+ return set(s.split(b","))
return set(s.split(","))
@@ -343,6 +355,7 @@
tuple: escape_sequence,
list: escape_sequence,
set: escape_sequence,
+ frozenset: escape_sequence,
dict: escape_dict,
bytearray: escape_bytes,
type(None): escape_None,
@@ -385,7 +398,6 @@
# for MySQLdb compatibility
-conversions = decoders
-
-def Thing2Literal(obj):
- return escape_str(str(obj))
+conversions = encoders.copy()
+conversions.update(decoders)
+Thing2Literal = escape_str
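Several converters in this file gain the same guard: on Python 3 the wire value may arrive as `bytes`, so it is decoded as ASCII before parsing. A minimal standalone version of the `convert_date` case, following the diff above:

```python
import datetime

def convert_date(obj):
    # Sketch of the 0.7.3+ behaviour: accept bytes/bytearray as input
    if isinstance(obj, (bytes, bytearray)):
        obj = obj.decode('ascii')
    try:
        return datetime.date(*[int(x) for x in obj.split('-', 2)])
    except ValueError:
        return None

print(convert_date(b'2016-06-28'))
```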
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/PyMySQL-0.7.2/pymysql/cursors.py new/PyMySQL-0.7.5/pymysql/cursors.py
--- old/PyMySQL-0.7.2/pymysql/cursors.py 2016-02-24 11:35:47.000000000 +0100
+++ new/PyMySQL-0.7.5/pymysql/cursors.py 2016-06-28 13:27:57.000000000 +0200
@@ -12,8 +12,11 @@
#: Regular expression for :meth:`Cursor.executemany`.
#: executemany only suports simple bulk insert.
#: You can use it to load large dataset.
-RE_INSERT_VALUES = re.compile(r"""(INSERT\s.+\sVALUES\s+)(\(\s*(?:%s|%\(.+\)s)\s*(?:,\s*(?:%s|%\(.+\)s)\s*)*\))(\s*(?:ON DUPLICATE.*)?)\Z""",
- re.IGNORECASE | re.DOTALL)
+RE_INSERT_VALUES = re.compile(
+ r"\s*((?:INSERT|REPLACE)\s.+\sVALUES?\s+)" +
+ r"(\(\s*(?:%s|%\(.+\)s)\s*(?:,\s*(?:%s|%\(.+\)s)\s*)*\))" +
+ r"(\s*(?:ON DUPLICATE.*)?)\Z",
+ re.IGNORECASE | re.DOTALL)
class Cursor(object):
@@ -141,7 +144,7 @@
:param str query: Query to execute.
- :param args: arameters used with query. (optional)
+ :param args: parameters used with query. (optional)
:type args: tuple, list or dict
:return: Number of affected rows
@@ -160,17 +163,23 @@
return result
def executemany(self, query, args):
+ # type: (str, list) -> int
"""Run several data against one query
- PyMySQL can execute bulkinsert for query like 'INSERT ... VALUES (%s)'.
- In other form of queries, just run :meth:`execute` many times.
+ :param query: query to execute on server
+ :param args: Sequence of sequences or mappings. It is used as parameter.
+ :return: Number of rows affected, if any.
+
+ This method improves performance on multiple-row INSERT and
+ REPLACE. Otherwise it is equivalent to looping over args with
+ execute().
"""
if not args:
return
m = RE_INSERT_VALUES.match(query)
if m:
- q_prefix = m.group(1)
+ q_prefix = m.group(1) % ()
q_values = m.group(2).rstrip()
q_postfix = m.group(3) or ''
assert q_values[0] == '(' and q_values[-1] == ')'
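The rewritten `RE_INSERT_VALUES` above is what enables REPLACE statements and `%%` placeholders in `Cursor.executemany()`. Copying the regex verbatim from the diff, a quick check of both behaviours (the table and column names here are made up):

```python
import re

# The executemany regex as rewritten in PyMySQL 0.7.3 (from the diff above)
RE_INSERT_VALUES = re.compile(
    r"\s*((?:INSERT|REPLACE)\s.+\sVALUES?\s+)" +
    r"(\(\s*(?:%s|%\(.+\)s)\s*(?:,\s*(?:%s|%\(.+\)s)\s*)*\))" +
    r"(\s*(?:ON DUPLICATE.*)?)\Z",
    re.IGNORECASE | re.DOTALL)

# REPLACE is now accepted alongside INSERT:
m = RE_INSERT_VALUES.match("REPLACE INTO test (data) VALUES (%s)")
print(m is not None)

# '%%' in the prefix survives the match; `q_prefix = m.group(1) % ()`
# then collapses it to a literal '%':
m2 = RE_INSERT_VALUES.match("INSERT INTO t (`A%%`, `B%%`) VALUES (%s, %s)")
print(m2.group(1) % ())
```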
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/PyMySQL-0.7.2/pymysql/tests/__init__.py new/PyMySQL-0.7.5/pymysql/tests/__init__.py
--- old/PyMySQL-0.7.2/pymysql/tests/__init__.py 2016-02-24 11:30:58.000000000 +0100
+++ new/PyMySQL-0.7.5/pymysql/tests/__init__.py 2016-06-14 10:59:53.000000000 +0200
@@ -1,12 +1,14 @@
-from pymysql.tests.test_issues import *
-from pymysql.tests.test_basic import *
-from pymysql.tests.test_nextset import *
+# Sorted by alphabetical order
from pymysql.tests.test_DictCursor import *
-from pymysql.tests.test_connection import *
from pymysql.tests.test_SSCursor import *
+from pymysql.tests.test_basic import *
+from pymysql.tests.test_connection import *
+from pymysql.tests.test_converters import *
+from pymysql.tests.test_cursor import *
+from pymysql.tests.test_issues import *
from pymysql.tests.test_load_local import *
+from pymysql.tests.test_nextset import *
from pymysql.tests.test_optionfile import *
-from pymysql.tests.test_converters import *
from pymysql.tests.thirdparty import *
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/PyMySQL-0.7.2/pymysql/tests/test_basic.py new/PyMySQL-0.7.5/pymysql/tests/test_basic.py
--- old/PyMySQL-0.7.2/pymysql/tests/test_basic.py 2015-02-28 19:44:12.000000000 +0100
+++ new/PyMySQL-0.7.5/pymysql/tests/test_basic.py 2016-05-24 09:22:27.000000000 +0200
@@ -42,11 +42,15 @@
c.execute("delete from test_datatypes")
- # check sequence type
- c.execute("insert into test_datatypes (i, l) values (2,4), (6,8), (10,12)")
- c.execute("select l from test_datatypes where i in %s order by i", ((2,6),))
- r = c.fetchall()
- self.assertEqual(((4,),(8,)), r)
+ # check sequences type
+ for seq_type in (tuple, list, set, frozenset):
+ c.execute("insert into test_datatypes (i, l) values (2,4), (6,8), (10,12)")
+ seq = seq_type([2,6])
+ c.execute("select l from test_datatypes where i in %s order by i", (seq,))
+ r = c.fetchall()
+ self.assertEqual(((4,),(8,)), r)
+ c.execute("delete from test_datatypes")
+
finally:
c.execute("drop table test_datatypes")
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/PyMySQL-0.7.2/pymysql/tests/test_cursor.py new/PyMySQL-0.7.5/pymysql/tests/test_cursor.py
--- old/PyMySQL-0.7.2/pymysql/tests/test_cursor.py 2016-02-24 11:30:58.000000000 +0100
+++ new/PyMySQL-0.7.5/pymysql/tests/test_cursor.py 2016-06-14 10:59:53.000000000 +0200
@@ -34,18 +34,8 @@
c2 = conn.cursor()
- with warnings.catch_warnings(record=True) as log:
- warnings.filterwarnings("always")
-
- c2.execute("select 1")
-
- self.assertGreater(len(log), 0)
- self.assertEqual(
- "Previous unbuffered result was left incomplete",
- str(log[-1].message))
- self.assertEqual(
- c2.fetchone(), (1,)
- )
+ c2.execute("select 1")
+ self.assertEqual(c2.fetchone(), (1,))
self.assertIsNone(c2.fetchone())
def test_cleanup_rows_buffered(self):
@@ -89,13 +79,26 @@
self.assertIsNotNone(m, 'error parse %(id_name)s')
self.assertEqual(m.group(3), ' ON duplicate update', 'group 3 not ON duplicate update, bug in RE_INSERT_VALUES?')
- # cursor._executed myst bee "insert into test (data) values (0),(1),(2),(3),(4),(5),(6),(7),(8),(9)"
+ # cursor._executed must bee "insert into test (data) values (0),(1),(2),(3),(4),(5),(6),(7),(8),(9)"
# list args
- data = xrange(10)
+ data = range(10)
cursor.executemany("insert into test (data) values (%s)", data)
- self.assertTrue(cursor._executed.endswith(",(7),(8),(9)"), 'execute many with %s not in one query')
+ self.assertTrue(cursor._executed.endswith(b",(7),(8),(9)"), 'execute many with %s not in one query')
# dict args
- data_dict = [{'data': i} for i in xrange(10)]
+ data_dict = [{'data': i} for i in range(10)]
cursor.executemany("insert into test (data) values (%(data)s)", data_dict)
- self.assertTrue(cursor._executed.endswith(",(7),(8),(9)"), 'execute many with %(data)s not in one query')
+ self.assertTrue(cursor._executed.endswith(b",(7),(8),(9)"), 'execute many with %(data)s not in one query')
+
+ # %% in column set
+ cursor.execute("""\
+ CREATE TABLE percent_test (
+ `A%` INTEGER,
+ `B%` INTEGER)""")
+ try:
+ q = "INSERT INTO percent_test (`A%%`, `B%%`) VALUES (%s, %s)"
+ self.assertIsNotNone(pymysql.cursors.RE_INSERT_VALUES.match(q))
+ cursor.executemany(q, [(3, 4), (5, 6)])
+ self.assertTrue(cursor._executed.endswith(b"(3, 4),(5, 6)"), "executemany with %% not in one query")
+ finally:
+ cursor.execute("DROP TABLE IF EXISTS percent_test")
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/PyMySQL-0.7.2/pymysql/tests/test_example.py new/PyMySQL-0.7.5/pymysql/tests/test_example.py
--- old/PyMySQL-0.7.2/pymysql/tests/test_example.py 2014-08-29 13:32:54.000000000 +0200
+++ new/PyMySQL-0.7.5/pymysql/tests/test_example.py 1970-01-01 01:00:00.000000000 +0100
@@ -1,32 +0,0 @@
-import pymysql
-from pymysql.tests import base
-
-class TestExample(base.PyMySQLTestCase):
- def test_example(self):
- conn = pymysql.connect(host='127.0.0.1', port=3306, user='root', passwd='', db='mysql')
-
-
- cur = conn.cursor()
-
- cur.execute("SELECT Host,User FROM user")
-
- # print cur.description
-
- # r = cur.fetchall()
- # print r
- # ...or...
- u = False
-
- for r in cur.fetchall():
- u = u or conn.user in r
-
- self.assertTrue(u)
-
- cur.close()
- conn.close()
-
-__all__ = ["TestExample"]
-
-if __name__ == "__main__":
- import unittest
- unittest.main()
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/PyMySQL-0.7.2/pymysql/tests/test_issues.py new/PyMySQL-0.7.5/pymysql/tests/test_issues.py
--- old/PyMySQL-0.7.2/pymysql/tests/test_issues.py 2016-02-24 11:30:58.000000000 +0100
+++ new/PyMySQL-0.7.5/pymysql/tests/test_issues.py 2016-06-28 13:58:08.000000000 +0200
@@ -457,7 +457,7 @@
# select WKT
query = "SELECT AsText(geom) FROM issue363"
- if sys.version_info[0:2] >= (3,2) and self.mysql_server_is(conn, (5, 7, 0)):
+ if self.mysql_server_is(conn, (5, 7, 0)):
with self.assertWarns(pymysql.err.Warning) as cm:
cur.execute(query)
else:
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/PyMySQL-0.7.2/pymysql/tests/test_load_local.py new/PyMySQL-0.7.5/pymysql/tests/test_load_local.py
--- old/PyMySQL-0.7.2/pymysql/tests/test_load_local.py 2016-02-24 11:30:58.000000000 +0100
+++ new/PyMySQL-0.7.5/pymysql/tests/test_load_local.py 2016-06-14 10:59:53.000000000 +0200
@@ -1,4 +1,4 @@
-from pymysql import OperationalError, Warning
+from pymysql import cursors, OperationalError, Warning
from pymysql.tests import base
import os
@@ -42,6 +42,28 @@
finally:
c.execute("DROP TABLE test_load_local")
+ def test_unbuffered_load_file(self):
+ """Test unbuffered load local infile with a valid file"""
+ conn = self.connections[0]
+ c = conn.cursor(cursors.SSCursor)
+ c.execute("CREATE TABLE test_load_local (a INTEGER, b INTEGER)")
+ filename = os.path.join(os.path.dirname(os.path.realpath(__file__)),
+ 'data',
+ 'load_local_data.txt')
+ try:
+ c.execute(
+ ("LOAD DATA LOCAL INFILE '{0}' INTO TABLE " +
+ "test_load_local FIELDS TERMINATED BY ','").format(filename)
+ )
+ c.execute("SELECT COUNT(*) FROM test_load_local")
+ self.assertEqual(22749, c.fetchone()[0])
+ finally:
+ c.close()
+ conn.close()
+ conn.connect()
+ c = conn.cursor()
+ c.execute("DROP TABLE test_load_local")
+
def test_load_warnings(self):
"""Test load local infile produces the appropriate warnings"""
conn = self.connections[0]
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/PyMySQL-0.7.2/runtests.py new/PyMySQL-0.7.5/runtests.py
--- old/PyMySQL-0.7.2/runtests.py 2015-02-04 18:10:29.000000000 +0100
+++ new/PyMySQL-0.7.5/runtests.py 2016-05-18 11:03:00.000000000 +0200
@@ -10,6 +10,7 @@
@atexit.register
def report_uncollectable():
+ import gc
if not gc.garbage:
print("No garbages!")
return
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/PyMySQL-0.7.2/setup.cfg new/PyMySQL-0.7.5/setup.cfg
--- old/PyMySQL-0.7.2/setup.cfg 2016-02-24 11:40:57.000000000 +0100
+++ new/PyMySQL-0.7.5/setup.cfg 2016-06-28 14:18:51.000000000 +0200
@@ -7,7 +7,7 @@
max-line-length = 119
[egg_info]
-tag_svn_revision = 0
-tag_date = 0
tag_build =
+tag_date = 0
+tag_svn_revision = 0
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/PyMySQL-0.7.2/setup.py new/PyMySQL-0.7.5/setup.py
--- old/PyMySQL-0.7.2/setup.py 2016-02-24 11:39:26.000000000 +0100
+++ new/PyMySQL-0.7.5/setup.py 2016-06-14 10:59:53.000000000 +0200
@@ -20,15 +20,14 @@
license="MIT",
packages=find_packages(),
classifiers=[
+ 'Development Status :: 5 - Production/Stable',
'Programming Language :: Python :: 2',
'Programming Language :: Python :: 2.7',
'Programming Language :: Python :: 3',
- 'Programming Language :: Python :: 3.3',
'Programming Language :: Python :: 3.4',
'Programming Language :: Python :: 3.5',
'Programming Language :: Python :: Implementation :: CPython',
'Programming Language :: Python :: Implementation :: PyPy',
- 'Development Status :: 4 - Beta',
'Intended Audience :: Developers',
'License :: OSI Approved :: MIT License',
'Topic :: Database',

Hello community,
here is the log from the commit of package python3-aiohttp for openSUSE:Factory checked in at 2016-07-28 23:46:22
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/python3-aiohttp (Old)
and /work/SRC/openSUSE:Factory/.python3-aiohttp.new (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Package is "python3-aiohttp"
Changes:
--------
--- /work/SRC/openSUSE:Factory/python3-aiohttp/python3-aiohttp.changes 2016-07-18 21:25:40.000000000 +0200
+++ /work/SRC/openSUSE:Factory/.python3-aiohttp.new/python3-aiohttp.changes 2016-07-28 23:46:25.000000000 +0200
@@ -1,0 +2,10 @@
+Sat Jul 23 17:28:04 UTC 2016 - arun(a)gmx.de
+
+- update to version 0.22.2:
+ * Suppress CancelledError when Timeout raises TimeoutError #970
+ * Don’t expose aiohttp.__version__
+ * Add unsafe parameter to CookieJar #968
+ * Use unsafe cookie jar in test client tools
+ * Expose aiohttp.CookieJar name
+
+-------------------------------------------------------------------
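The `unsafe` CookieJar parameter mentioned above relates to cookies set by bare-IP origins: by default such cookies are rejected, which is why the test client tools switch to an unsafe jar (the test server runs on 127.0.0.1). A toy illustration of the idea only (an assumption for exposition, not aiohttp's actual implementation):

```python
import ipaddress

def accept_cookie_for(host, unsafe=False):
    # Hypothetical helper: reject cookies from bare-IP hosts unless
    # unsafe=True, mirroring the behaviour the changelog describes.
    try:
        ipaddress.ip_address(host)
        is_ip = True
    except ValueError:
        is_ip = False
    return unsafe or not is_ip

print(accept_cookie_for("example.com"),
      accept_cookie_for("127.0.0.1"),
      accept_cookie_for("127.0.0.1", unsafe=True))
```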
Old:
----
aiohttp-0.22.1.tar.gz
New:
----
aiohttp-0.22.2.tar.gz
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Other differences:
------------------
++++++ python3-aiohttp.spec ++++++
--- /var/tmp/diff_new_pack.y263mf/_old 2016-07-28 23:46:26.000000000 +0200
+++ /var/tmp/diff_new_pack.y263mf/_new 2016-07-28 23:46:26.000000000 +0200
@@ -17,7 +17,7 @@
Name: python3-aiohttp
-Version: 0.22.1
+Version: 0.22.2
Release: 0
Url: https://pypi.python.org/pypi/aiohttp
Summary: Http client/server for asyncio
++++++ aiohttp-0.22.1.tar.gz -> aiohttp-0.22.2.tar.gz ++++++
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/aiohttp-0.22.1/CHANGES.txt new/aiohttp-0.22.2/CHANGES.txt
--- old/aiohttp-0.22.1/CHANGES.txt 2016-07-16 13:35:59.000000000 +0200
+++ new/aiohttp-0.22.2/CHANGES.txt 2016-07-23 15:31:40.000000000 +0200
@@ -1,14 +1,28 @@
CHANGES
=======
-0.22.1 (08-16-2016)
+0.22.2 (07-23-2016)
+-------------------
+
+- Suppress CancelledError when Timeout raises TimeoutError #970
+
+- Don't expose `aiohttp.__version__`
+
+- Add unsafe parameter to CookieJar #968
+
+- Use unsafe cookie jar in test client tools
+
+- Expose aiohttp.CookieJar name
+
+
+0.22.1 (07-16-2016)
-------------------
- Large cookie expiration/max-age doesn't break an event loop from now
(fixes #967)
-0.22.0 (08-15-2016)
+0.22.0 (07-15-2016)
-------------------
- Fix bug in serving static directory #803
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/aiohttp-0.22.1/PKG-INFO new/aiohttp-0.22.2/PKG-INFO
--- old/aiohttp-0.22.1/PKG-INFO 2016-07-16 13:37:29.000000000 +0200
+++ new/aiohttp-0.22.2/PKG-INFO 2016-07-23 15:33:07.000000000 +0200
@@ -1,6 +1,6 @@
Metadata-Version: 1.1
Name: aiohttp
-Version: 0.22.1
+Version: 0.22.2
Summary: http client/server for asyncio
Home-page: https://github.com/KeepSafe/aiohttp/
Author: Andrew Svetlov
@@ -150,14 +150,28 @@
CHANGES
=======
- 0.22.1 (08-16-2016)
+ 0.22.2 (07-23-2016)
+ -------------------
+
+ - Suppress CancelledError when Timeout raises TimeoutError #970
+
+ - Don't expose `aiohttp.__version__`
+
+ - Add unsafe parameter to CookieJar #968
+
+ - Use unsafe cookie jar in test client tools
+
+ - Expose aiohttp.CookieJar name
+
+
+ 0.22.1 (07-16-2016)
-------------------
- Large cookie expiration/max-age doesn't break an event loop from now
(fixes #967)
- 0.22.0 (08-15-2016)
+ 0.22.0 (07-15-2016)
-------------------
- Fix bug in serving static directory #803
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/aiohttp-0.22.1/aiohttp/__init__.py new/aiohttp-0.22.2/aiohttp/__init__.py
--- old/aiohttp-0.22.1/aiohttp/__init__.py 2016-07-16 13:35:59.000000000 +0200
+++ new/aiohttp-0.22.2/aiohttp/__init__.py 2016-07-23 15:31:40.000000000 +0200
@@ -1,6 +1,6 @@
# This relies on each of the submodules having an __all__ variable.
-__version__ = '0.22.1'
+__version__ = '0.22.2'
import multidict # noqa
@@ -30,4 +30,4 @@
multidict.__all__ + # noqa
multipart.__all__ + # noqa
websocket_client.__all__ + # noqa
- ('hdrs', '__version__', 'FileSender'))
+ ('hdrs', 'FileSender'))
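The hunk above implements "Don't expose `aiohttp.__version__`": the name is removed from the package's `__all__`, so it no longer leaks through star-imports, while `aiohttp.__version__` remains accessible as a plain attribute. A small stdlib sketch of that `__all__` semantics, using a hypothetical throwaway module rather than aiohttp itself:

```python
# Demonstrates how dropping a name from __all__ affects `from m import *`
# without removing the attribute itself. `mini_aiohttp` is a made-up module.
import sys
import types

mod = types.ModuleType("mini_aiohttp")
mod.__version__ = "0.22.2"
mod.hdrs = object()
mod.__all__ = ("hdrs",)  # '__version__' deliberately omitted, as in 0.22.2
sys.modules["mini_aiohttp"] = mod

ns = {}
exec("from mini_aiohttp import *", ns)
assert "hdrs" in ns                  # still exported via __all__
assert "__version__" not in ns       # no longer star-imported
assert mod.__version__ == "0.22.2"   # direct attribute access still works
```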
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/aiohttp-0.22.1/aiohttp/client.py new/aiohttp-0.22.2/aiohttp/client.py
--- old/aiohttp-0.22.1/aiohttp/client.py 2016-07-16 13:35:59.000000000 +0200
+++ new/aiohttp-0.22.2/aiohttp/client.py 2016-07-23 15:31:40.000000000 +0200
@@ -152,8 +152,7 @@
redirects = 0
history = []
- if not isinstance(method, upstr):
- method = upstr(method)
+ method = upstr(method)
# Merge with default headers and transform to CIMultiDict
headers = self._prepare_headers(headers)
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/aiohttp-0.22.1/aiohttp/helpers.py new/aiohttp-0.22.2/aiohttp/helpers.py
--- old/aiohttp-0.22.1/aiohttp/helpers.py 2016-07-16 13:35:59.000000000 +0200
+++ new/aiohttp-0.22.2/aiohttp/helpers.py 2016-07-23 15:31:40.000000000 +0200
@@ -27,7 +27,7 @@
__all__ = ('BasicAuth', 'create_future', 'FormData', 'parse_mimetype',
- 'Timeout')
+ 'Timeout', 'CookieJar')
class BasicAuth(namedtuple('BasicAuth', ['login', 'password', 'encoding'])):
@@ -561,7 +561,7 @@
if exc_type is asyncio.CancelledError and self._cancelled:
self._cancel_handler = None
self._task = None
- raise asyncio.TimeoutError
+ raise asyncio.TimeoutError from None
if self._timeout is not None:
self._cancel_handler.cancel()
self._cancel_handler = None
@@ -587,9 +587,10 @@
DATE_YEAR_RE = re.compile("(\d{2,4})")
- def __init__(self, *, loop=None):
+ def __init__(self, *, unsafe=False, loop=None):
super().__init__(loop=loop)
self._host_only_cookies = set()
+ self._unsafe = unsafe
def _expire_cookie(self, when, name, DAY=24*3600):
now = self._loop.time()
@@ -608,7 +609,7 @@
url_parsed = urlsplit(response_url or "")
hostname = url_parsed.hostname
- if is_ip_address(hostname):
+ if not self._unsafe and is_ip_address(hostname):
# Don't accept cookies from IPs
return
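The `raise asyncio.TimeoutError from None` change in the `helpers.py` hunk above is what "Suppress CancelledError when Timeout raises TimeoutError" (#970) amounts to: the internal `CancelledError` is still recorded as `__context__`, but `from None` sets `__suppress_context__`, so it no longer clutters the traceback. A stdlib sketch of the mechanism (the `convert` function is hypothetical, standing in for `Timeout.__exit__`):

```python
# Shows the effect of `raise ... from None` on exception chaining.
# `convert` is a made-up stand-in for aiohttp's Timeout.__exit__ logic.
import asyncio

def convert(suppress):
    try:
        raise asyncio.CancelledError()
    except asyncio.CancelledError:
        if suppress:
            raise asyncio.TimeoutError() from None  # 0.22.2 behaviour
        raise asyncio.TimeoutError()                # pre-0.22.2: chain shown

for suppress in (False, True):
    try:
        convert(suppress)
    except asyncio.TimeoutError as exc:
        # __context__ is set either way; `from None` only flags suppression.
        assert isinstance(exc.__context__, asyncio.CancelledError)
        assert exc.__suppress_context__ is suppress
```

The new `test_timeout_suppress_exception_chain` test at the bottom of the diff asserts exactly this `__suppress_context__` flag.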
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/aiohttp-0.22.1/aiohttp/test_utils.py new/aiohttp-0.22.2/aiohttp/test_utils.py
--- old/aiohttp-0.22.1/aiohttp/test_utils.py 2016-07-16 13:35:59.000000000 +0200
+++ new/aiohttp-0.22.2/aiohttp/test_utils.py 2016-07-23 15:31:40.000000000 +0200
@@ -349,7 +349,10 @@
self._server = None
if not loop.is_running():
loop.run_until_complete(self.start_server())
- self._session = ClientSession(loop=self._loop)
+ self._session = ClientSession(
+ loop=self._loop,
+ cookie_jar=aiohttp.CookieJar(unsafe=True,
+ loop=self._loop))
self._root = '{}://{}:{}'.format(protocol, self._address, self.port)
self._closed = False
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/aiohttp-0.22.1/aiohttp.egg-info/PKG-INFO new/aiohttp-0.22.2/aiohttp.egg-info/PKG-INFO
--- old/aiohttp-0.22.1/aiohttp.egg-info/PKG-INFO 2016-07-16 13:37:28.000000000 +0200
+++ new/aiohttp-0.22.2/aiohttp.egg-info/PKG-INFO 2016-07-23 15:33:06.000000000 +0200
@@ -1,6 +1,6 @@
Metadata-Version: 1.1
Name: aiohttp
-Version: 0.22.1
+Version: 0.22.2
Summary: http client/server for asyncio
Home-page: https://github.com/KeepSafe/aiohttp/
Author: Andrew Svetlov
@@ -150,14 +150,28 @@
CHANGES
=======
- 0.22.1 (08-16-2016)
+ 0.22.2 (07-23-2016)
+ -------------------
+
+ - Suppress CancelledError when Timeout raises TimeoutError #970
+
+ - Don't expose `aiohttp.__version__`
+
+ - Add unsafe parameter to CookieJar #968
+
+ - Use unsafe cookie jar in test client tools
+
+ - Expose aiohttp.CookieJar name
+
+
+ 0.22.1 (07-16-2016)
-------------------
- Large cookie expiration/max-age doesn't break an event loop from now
(fixes #967)
- 0.22.0 (08-15-2016)
+ 0.22.0 (07-15-2016)
-------------------
- Fix bug in serving static directory #803
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/aiohttp-0.22.1/docs/client.rst new/aiohttp-0.22.2/docs/client.rst
--- old/aiohttp-0.22.1/docs/client.rst 2016-07-16 13:35:59.000000000 +0200
+++ new/aiohttp-0.22.2/docs/client.rst 2016-07-23 15:31:40.000000000 +0200
@@ -386,6 +386,24 @@
:class:`~aiohttp.ClientSession` supports keep-alive requests
and connection pooling out-of-the-box.
+.. _aiohttp-client-cookie-safety:
+
+Cookie safety
+-------------
+
+By default :class:`~aiohttp.ClientSession` uses strict version of
+:class:`~aiohttp.CookieJar`. :rfc:`2109` explicitly forbids cookie
+accepting from URLs with IP address instead of DNS name
+(e.g. `http://127.0.0.1:80/cookie`).
+
+It's good but sometimes for testing we need to enable support for such
+cookies. It should be done by passing `unsafe=True` to
+:class:`~aiohttp.CookieJar` constructor::
+
+
+ jar = aiohttp.CookieJar(unsafe=True)
+ session = aiohttp.ClientSession(cookie_jar=jar)
+
Connectors
----------
@@ -421,8 +439,8 @@
aiodns is required.
from aiohttp.resolver import AsyncResolver
-
-
+
+
resolver = AsyncResolver(nameservers=["8.8.8.8", "8.8.4.4"])
conn = aiohttp.TCPConnector(resolver=resolver)
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/aiohttp-0.22.1/tests/test_timeout.py new/aiohttp-0.22.2/tests/test_timeout.py
--- old/aiohttp-0.22.1/tests/test_timeout.py 2016-07-16 13:35:59.000000000 +0200
+++ new/aiohttp-0.22.2/tests/test_timeout.py 2016-07-23 15:31:40.000000000 +0200
@@ -165,3 +165,13 @@
yield from task
assert task.cancelled()
assert task.done()
+
+
+@pytest.mark.run_loop
+def test_timeout_suppress_exception_chain(loop):
+
+ with pytest.raises(asyncio.TimeoutError) as ctx:
+ with Timeout(0.01, loop=loop) as t:
+ yield from asyncio.sleep(10, loop=loop)
+ assert t._loop is loop
+ assert ctx.value.__suppress_context__