openSUSE Commits
Hello community,
here is the log from the commit of package gnu_parallel for openSUSE:Factory checked in at 2018-11-26 10:31:20
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/gnu_parallel (Old)
and /work/SRC/openSUSE:Factory/.gnu_parallel.new.19453 (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Package is "gnu_parallel"
Mon Nov 26 10:31:20 2018 rev:50 rq:651434 version:20181122
Changes:
--------
--- /work/SRC/openSUSE:Factory/gnu_parallel/gnu_parallel.changes 2018-11-15 12:41:37.938169240 +0100
+++ /work/SRC/openSUSE:Factory/.gnu_parallel.new.19453/gnu_parallel.changes 2018-11-26 10:31:58.988910059 +0100
@@ -1,0 +2,6 @@
+Fri Nov 23 17:26:52 UTC 2018 - Jan Engelhardt <jengelh(a)inai.de>
+
+- Update to new upstream release 20181122
+ * Experimental simpler job flow control.
+
+-------------------------------------------------------------------
Old:
----
parallel-20181022.tar.bz2
parallel-20181022.tar.bz2.sig
New:
----
parallel-20181122.tar.bz2
parallel-20181122.tar.bz2.sig
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Other differences:
------------------
++++++ gnu_parallel.spec ++++++
--- /var/tmp/diff_new_pack.ot0hmR/_old 2018-11-26 10:32:05.084902916 +0100
+++ /var/tmp/diff_new_pack.ot0hmR/_new 2018-11-26 10:32:05.088902912 +0100
@@ -17,7 +17,7 @@
Name: gnu_parallel
-Version: 20181022
+Version: 20181122
Release: 0
Summary: Shell tool for executing jobs in parallel
License: GPL-3.0-or-later
++++++ parallel-20181022.tar.bz2 -> parallel-20181122.tar.bz2 ++++++
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/parallel-20181022/NEWS new/parallel-20181122/NEWS
--- old/parallel-20181022/NEWS 2018-10-23 00:56:52.000000000 +0200
+++ new/parallel-20181122/NEWS 2018-11-23 00:35:17.000000000 +0100
@@ -1,3 +1,13 @@
+20181122
+
+* Experimental simpler job flow control.
+
+* Run time-consuming commands in parallel with GNU parallel
+ https://qiita.com//grohiro/items/4db3fa951a4778c5c479
+
+* Bug fixes and man page updates.
+
+
20181022
* env_parallel.fish: --session support (alpha quality)
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/parallel-20181022/README new/parallel-20181122/README
--- old/parallel-20181022/README 2018-10-23 00:48:01.000000000 +0200
+++ new/parallel-20181122/README 2018-11-23 00:32:16.000000000 +0100
@@ -44,9 +44,9 @@
Full installation of GNU Parallel is as simple as:
- wget https://ftpmirror.gnu.org/parallel/parallel-20181022.tar.bz2
- bzip2 -dc parallel-20181022.tar.bz2 | tar xvf -
- cd parallel-20181022
+ wget https://ftpmirror.gnu.org/parallel/parallel-20181122.tar.bz2
+ bzip2 -dc parallel-20181122.tar.bz2 | tar xvf -
+ cd parallel-20181122
./configure && make && sudo make install
@@ -55,9 +55,9 @@
If you are not root you can add ~/bin to your path and install in
~/bin and ~/share:
- wget https://ftpmirror.gnu.org/parallel/parallel-20181022.tar.bz2
- bzip2 -dc parallel-20181022.tar.bz2 | tar xvf -
- cd parallel-20181022
+ wget https://ftpmirror.gnu.org/parallel/parallel-20181122.tar.bz2
+ bzip2 -dc parallel-20181122.tar.bz2 | tar xvf -
+ cd parallel-20181122
./configure --prefix=$HOME && make && make install
Or if your system lacks 'make' you can simply copy src/parallel
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/parallel-20181022/configure new/parallel-20181122/configure
--- old/parallel-20181022/configure 2018-10-23 00:48:17.000000000 +0200
+++ new/parallel-20181122/configure 2018-11-23 00:32:28.000000000 +0100
@@ -1,6 +1,6 @@
#! /bin/sh
# Guess values for system-dependent variables and create Makefiles.
-# Generated by GNU Autoconf 2.69 for parallel 20181022.
+# Generated by GNU Autoconf 2.69 for parallel 20181122.
#
# Report bugs to <bug-parallel(a)gnu.org>.
#
@@ -579,8 +579,8 @@
# Identity of this package.
PACKAGE_NAME='parallel'
PACKAGE_TARNAME='parallel'
-PACKAGE_VERSION='20181022'
-PACKAGE_STRING='parallel 20181022'
+PACKAGE_VERSION='20181122'
+PACKAGE_STRING='parallel 20181122'
PACKAGE_BUGREPORT='bug-parallel(a)gnu.org'
PACKAGE_URL=''
@@ -1214,7 +1214,7 @@
# Omit some internal or obsolete options to make the list less imposing.
# This message is too long to be a string in the A/UX 3.1 sh.
cat <<_ACEOF
-\`configure' configures parallel 20181022 to adapt to many kinds of systems.
+\`configure' configures parallel 20181122 to adapt to many kinds of systems.
Usage: $0 [OPTION]... [VAR=VALUE]...
@@ -1281,7 +1281,7 @@
if test -n "$ac_init_help"; then
case $ac_init_help in
- short | recursive ) echo "Configuration of parallel 20181022:";;
+ short | recursive ) echo "Configuration of parallel 20181122:";;
esac
cat <<\_ACEOF
@@ -1357,7 +1357,7 @@
test -n "$ac_init_help" && exit $ac_status
if $ac_init_version; then
cat <<\_ACEOF
-parallel configure 20181022
+parallel configure 20181122
generated by GNU Autoconf 2.69
Copyright (C) 2012 Free Software Foundation, Inc.
@@ -1374,7 +1374,7 @@
This file contains any messages produced by compilers while
running configure, to aid debugging if configure makes a mistake.
-It was created by parallel $as_me 20181022, which was
+It was created by parallel $as_me 20181122, which was
generated by GNU Autoconf 2.69. Invocation command line was
$ $0 $@
@@ -2237,7 +2237,7 @@
# Define the identity of the package.
PACKAGE='parallel'
- VERSION='20181022'
+ VERSION='20181122'
cat >>confdefs.h <<_ACEOF
@@ -2880,7 +2880,7 @@
# report actual input values of CONFIG_FILES etc. instead of their
# values after options handling.
ac_log="
-This file was extended by parallel $as_me 20181022, which was
+This file was extended by parallel $as_me 20181122, which was
generated by GNU Autoconf 2.69. Invocation command line was
CONFIG_FILES = $CONFIG_FILES
@@ -2942,7 +2942,7 @@
cat >>$CONFIG_STATUS <<_ACEOF || ac_write_fail=1
ac_cs_config="`$as_echo "$ac_configure_args" | sed 's/^ //; s/[\\""\`\$]/\\\\&/g'`"
ac_cs_version="\\
-parallel config.status 20181022
+parallel config.status 20181122
configured by $0, generated by GNU Autoconf 2.69,
with options \\"\$ac_cs_config\\"
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/parallel-20181022/configure.ac new/parallel-20181122/configure.ac
--- old/parallel-20181022/configure.ac 2018-10-23 00:48:00.000000000 +0200
+++ new/parallel-20181122/configure.ac 2018-11-23 00:32:16.000000000 +0100
@@ -1,4 +1,4 @@
-AC_INIT([parallel], [20181022], [bug-parallel(a)gnu.org]
+AC_INIT([parallel], [20181122], [bug-parallel(a)gnu.org]
AM_INIT_AUTOMAKE([-Wall -Werror foreign])
AC_CONFIG_HEADERS([config.h])
AC_CONFIG_FILES([
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/parallel-20181022/src/niceload new/parallel-20181122/src/niceload
--- old/parallel-20181022/src/niceload 2018-10-23 00:48:01.000000000 +0200
+++ new/parallel-20181122/src/niceload 2018-11-23 00:32:16.000000000 +0100
@@ -24,7 +24,7 @@
use strict;
use Getopt::Long;
$Global::progname="niceload";
-$Global::version = 20181022;
+$Global::version = 20181122;
Getopt::Long::Configure("bundling","require_order");
get_options_from_array(\@ARGV) || die_usage();
if($opt::version) {
@@ -1147,7 +1147,7 @@
# throw away all except the last Device:-section
my @iostat;
for(reverse @iostat_out) {
- /Device:/ and last;
+ /Device/ and last;
push @iostat, (split(/\s+/,$_))[13];
}
my $io = ::max(@iostat);
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/parallel-20181022/src/parallel new/parallel-20181122/src/parallel
--- old/parallel-20181022/src/parallel 2018-10-23 00:48:01.000000000 +0200
+++ new/parallel-20181122/src/parallel 2018-11-23 00:32:16.000000000 +0100
@@ -120,7 +120,7 @@
$sem = acquire_semaphore();
}
$SIG{TERM} = \&start_no_new_jobs;
-start_more_jobs();
+while(start_more_jobs()) {}
if($opt::tee) {
# All jobs must be running in parallel for --tee
$Global::start_no_new_jobs = 1;
@@ -620,6 +620,7 @@
my $sleep =1;
while($Global::total_running > 0) {
$sleep = ::reap_usleep($sleep);
+ start_more_jobs();
}
}
$Global::start_no_new_jobs ||= 1;
@@ -1554,7 +1555,7 @@
sub init_globals {
# Defaults:
- $Global::version = 20181022;
+ $Global::version = 20181122;
$Global::progname = 'parallel';
$Global::infinity = 2**31;
$Global::debug = 0;
@@ -2668,7 +2669,6 @@
# Returns:
# $jobs_started = number of jobs started
my $jobs_started = 0;
- my $jobs_started_this_round = 0;
if($Global::start_no_new_jobs) {
return $jobs_started;
}
@@ -2678,65 +2678,61 @@
changed_procs_file();
changed_sshloginfile();
}
- do {
- $jobs_started_this_round = 0;
- # This will start 1 job on each --sshlogin (if possible)
- # thus distribute the jobs on the --sshlogins round robin
- for my $sshlogin (values %Global::host) {
- if($Global::JobQueue->empty() and not $opt::pipe) {
- # No more jobs in the queue
- last;
- }
- debug("run", "Running jobs before on ", $sshlogin->string(), ": ",
- $sshlogin->jobs_running(), "\n");
- if ($sshlogin->jobs_running() < $sshlogin->max_jobs_running()) {
- if($opt::delay
- and
- $opt::delay > ::now() - $Global::newest_starttime) {
- # It has been too short since last start
- next;
- }
- if($opt::load and $sshlogin->loadavg_too_high()) {
- # The load is too high or unknown
- next;
- }
- if($opt::noswap and $sshlogin->swapping()) {
- # The server is swapping
- next;
- }
- if($opt::limit and $sshlogin->limit()) {
- # Over limit
- next;
- }
- if($opt::memfree and $sshlogin->memfree() < $opt::memfree) {
- # The server has not enough mem free
- ::debug("mem", "Not starting job: not enough mem\n");
- next;
- }
- if($sshlogin->too_fast_remote_login()) {
- # It has been too short since
- next;
- }
- debug("run", $sshlogin->string(),
- " has ", $sshlogin->jobs_running(),
- " out of ", $sshlogin->max_jobs_running(),
- " jobs running. Start another.\n");
- if(start_another_job($sshlogin) == 0) {
- # No more jobs to start on this $sshlogin
- debug("run","No jobs started on ",
- $sshlogin->string(), "\n");
- next;
- }
- $sshlogin->inc_jobs_running();
- $sshlogin->set_last_login_at(::now());
- $jobs_started++;
- $jobs_started_this_round++;
- }
- debug("run","Running jobs after on ", $sshlogin->string(), ": ",
- $sshlogin->jobs_running(), " of ",
- $sshlogin->max_jobs_running(), "\n");
+ # This will start 1 job on each --sshlogin (if possible)
+ # thus distribute the jobs on the --sshlogins round robin
+ for my $sshlogin (values %Global::host) {
+ if($Global::JobQueue->empty() and not $opt::pipe) {
+ # No more jobs in the queue
+ last;
}
- } while($jobs_started_this_round);
+ debug("run", "Running jobs before on ", $sshlogin->string(), ": ",
+ $sshlogin->jobs_running(), "\n");
+ if ($sshlogin->jobs_running() < $sshlogin->max_jobs_running()) {
+ if($opt::delay
+ and
+ $opt::delay > ::now() - $Global::newest_starttime) {
+ # It has been too short since last start
+ next;
+ }
+ if($opt::load and $sshlogin->loadavg_too_high()) {
+ # The load is too high or unknown
+ next;
+ }
+ if($opt::noswap and $sshlogin->swapping()) {
+ # The server is swapping
+ next;
+ }
+ if($opt::limit and $sshlogin->limit()) {
+ # Over limit
+ next;
+ }
+ if($opt::memfree and $sshlogin->memfree() < $opt::memfree) {
+ # The server has not enough mem free
+ ::debug("mem", "Not starting job: not enough mem\n");
+ next;
+ }
+ if($sshlogin->too_fast_remote_login()) {
+ # It has been too short since
+ next;
+ }
+ debug("run", $sshlogin->string(),
+ " has ", $sshlogin->jobs_running(),
+ " out of ", $sshlogin->max_jobs_running(),
+ " jobs running. Start another.\n");
+ if(start_another_job($sshlogin) == 0) {
+ # No more jobs to start on this $sshlogin
+ debug("run","No jobs started on ",
+ $sshlogin->string(), "\n");
+ next;
+ }
+ $sshlogin->inc_jobs_running();
+ $sshlogin->set_last_login_at(::now());
+ $jobs_started++;
+ }
+ debug("run","Running jobs after on ", $sshlogin->string(), ": ",
+ $sshlogin->jobs_running(), " of ",
+ $sshlogin->max_jobs_running(), "\n");
+ }
return $jobs_started;
}
@@ -2912,8 +2908,8 @@
}
# * because of loadavg
# * because of too little time between each ssh login.
- start_more_jobs();
$sleep = ::reap_usleep($sleep);
+ start_more_jobs();
if($Global::max_jobs_running == 0) {
::warning("There are no job slots available. Increase --jobs.");
}
@@ -2921,6 +2917,7 @@
while($opt::sqlmaster and not $Global::sql->finished()) {
# SQL master
$sleep = ::reap_usleep($sleep);
+ start_more_jobs();
if($Global::start_sqlworker) {
# Start an SQL worker as we are now sure there is work to do
$Global::start_sqlworker = 0;
@@ -4054,27 +4051,24 @@
# @pids_reaped = PIDs of children finished
my $stiff;
my @pids_reaped;
- my $children_reaped = 0;
+ my $total_reaped;
debug("run", "Reaper ");
- # For efficiency surround with BEGIN/COMMIT when using $opt::sqlmaster
- $opt::sqlmaster and $Global::sql->run("BEGIN;");
- while (($stiff = waitpid(-1, &WNOHANG)) > 0) {
+ if (($stiff = waitpid(-1, &WNOHANG)) > 0) {
# $stiff = pid of dead process
if(wantarray) {
push(@pids_reaped,$stiff);
- } else {
- $children_reaped++;
}
- if($Global::sshmaster{$stiff}) {
- # This is one of the ssh -M: ignore
- next;
- }
- my $job = $Global::running{$stiff};
+ $total_reaped++;
+ if($Global::sshmaster{$stiff}) {
+ # This is one of the ssh -M: ignore
+ next;
+ }
+ my $job = $Global::running{$stiff};
# '-a <(seq 10)' will give us a pid not in %Global::running
- $job or next;
- delete $Global::running{$stiff};
- $Global::total_running--;
+ $job or return 0;
+ delete $Global::running{$stiff};
+ $Global::total_running--;
if($job->{'commandline'}{'skip'}) {
# $job->skip() was called
$job->set_exitstatus(-2);
@@ -4084,11 +4078,12 @@
$job->set_exitsignal($? & 127);
}
- debug("run", "seq ",$job->seq()," died (", $job->exitstatus(), ")");
- $job->set_endtime(::now());
- my $sshlogin = $job->sshlogin();
- $sshlogin->dec_jobs_running();
- if($job->should_be_retried()) {
+ debug("run", "seq ",$job->seq()," died (", $job->exitstatus(), ")");
+ $job->set_endtime(::now());
+ my $sshlogin = $job->sshlogin();
+ $sshlogin->dec_jobs_running();
+ if($job->should_be_retried()) {
+ # Free up file handles
$job->free_ressources();
} else {
# The job is done
@@ -4109,18 +4104,17 @@
::kill_sleep_seq($job->pid());
::killall();
::wait_and_exit($Global::halt_exitstatus);
- }
- }
+ }
+ }
$job->cleanup();
- start_more_jobs();
+
if($opt::progress) {
my %progress = progress();
::status_no_nl("\r",$progress{'status'});
}
}
- $opt::sqlmaster and $Global::sql->run("COMMIT;");
debug("run", "done ");
- return wantarray ? @pids_reaped : $children_reaped;
+ return wantarray ? @pids_reaped : $total_reaped;
}
@@ -5102,6 +5096,7 @@
# $ms*1.1 if no children reaped
my $ms = shift;
if(reaper()) {
+ while(reaper()) {}
if(not $Global::total_completed % 100) {
if($opt::timeout) {
# Force cleaning the timeout queue for every 1000 jobs
@@ -7947,7 +7942,7 @@
$command =
'cat > $PARALLEL_TMP;'.
$command.";". postpone_exit_and_cleanup().
- '$PARALLEL_TMP';
+ '$PARALLEL_TMP';
} elsif($opt::fifo) {
# Prepend fifo-wrapper. In essence:
# mkfifo {}
@@ -8280,7 +8275,7 @@
my $self = shift;
my $command = shift;
# TODO test that *sh -c 'parallel --env' use *sh
- if(not defined $self->{'sshlogin_wrap'}) {
+ if(not defined $self->{'sshlogin_wrap'}{$command}) {
my $sshlogin = $self->sshlogin();
my $serverlogin = $sshlogin->serverlogin();
my $quoted_remote_command;
@@ -8310,12 +8305,12 @@
$command =~ /\n/) {
# csh does not deal well with > 1000 chars in one word
# csh does not deal well with $ENV with \n
- $self->{'sshlogin_wrap'} = base64_wrap($perl_code);
+ $self->{'sshlogin_wrap'}{$command} = base64_wrap($perl_code);
} else {
- $self->{'sshlogin_wrap'} = "perl -e ".::Q($perl_code);
+ $self->{'sshlogin_wrap'}{$command} = "perl -e ".::Q($perl_code);
}
} else {
- $self->{'sshlogin_wrap'} = $command;
+ $self->{'sshlogin_wrap'}{$command} = $command;
}
} else {
my $pwd = "";
@@ -8357,7 +8352,7 @@
# We need to save the exit status of the job
$post = '_EXIT_status=$?; ' . $post . ' exit $_EXIT_status;';
}
- $self->{'sshlogin_wrap'} =
+ $self->{'sshlogin_wrap'}{$command} =
($pre
. "$sshcmd $serverlogin -- exec "
. $quoted_remote_command
@@ -8365,7 +8360,7 @@
. $post);
}
}
- return $self->{'sshlogin_wrap'};
+ return $self->{'sshlogin_wrap'}{$command};
}
sub transfer {
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/parallel-20181022/src/parallel_alternatives.7 new/parallel-20181122/src/parallel_alternatives.7
--- old/parallel-20181022/src/parallel_alternatives.7 2018-10-22 20:46:44.000000000 +0200
+++ new/parallel-20181122/src/parallel_alternatives.7 2018-11-10 15:01:55.000000000 +0100
@@ -129,7 +129,7 @@
.\" ========================================================================
.\"
.IX Title "PARALLEL_ALTERNATIVES 7"
-.TH PARALLEL_ALTERNATIVES 7 "2018-10-22" "20180922" "parallel"
+.TH PARALLEL_ALTERNATIVES 7 "2018-10-23" "20181022" "parallel"
.\" For nroff, turn off justification. Always turn off hyphenation; it makes
.\" way too many mistakes in technical documents.
.if n .ad l
@@ -1936,12 +1936,17 @@
dependency graph described in a file, so this is similar to \fBmake\fR.
.PP
https://github.com/cetra3/lorikeet (Last checked: 2018\-10)
+.SS "\s-1DIFFERENCES BETWEEN\s0 spp \s-1AND GNU\s0 Parallel"
+.IX Subsection "DIFFERENCES BETWEEN spp AND GNU Parallel"
+\&\fBspp\fR can run jobs in parallel. \fBspp\fR does not use a command
+template to generate the jobs, but requires jobs to be in a
+file. Output from the jobs mix.
+.PP
+https://github.com/john01dav/spp
.SS "Todo"
.IX Subsection "Todo"
Url for spread
.PP
-https://github.com/john01dav/spp
-.PP
https://github.com/amritb/with\-this.git
.PP
https://github.com/fd0/machma Requires Go >= 1.7.
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/parallel-20181022/src/parallel_alternatives.html new/parallel-20181122/src/parallel_alternatives.html
--- old/parallel-20181022/src/parallel_alternatives.html 2018-10-22 20:46:54.000000000 +0200
+++ new/parallel-20181122/src/parallel_alternatives.html 2018-11-10 15:01:55.000000000 +0100
@@ -68,6 +68,7 @@
<li><a href="#DIFFERENCES-BETWEEN-map-soveran-AND-GNU-Parallel">DIFFERENCES BETWEEN map(soveran) AND GNU Parallel</a></li>
<li><a href="#DIFFERENCES-BETWEEN-loop-AND-GNU-Parallel">DIFFERENCES BETWEEN loop AND GNU Parallel</a></li>
<li><a href="#DIFFERENCES-BETWEEN-lorikeet-AND-GNU-Parallel">DIFFERENCES BETWEEN lorikeet AND GNU Parallel</a></li>
+ <li><a href="#DIFFERENCES-BETWEEN-spp-AND-GNU-Parallel">DIFFERENCES BETWEEN spp AND GNU Parallel</a></li>
<li><a href="#Todo">Todo</a></li>
</ul>
</li>
@@ -1527,12 +1528,16 @@
<p>https://github.com/cetra3/lorikeet (Last checked: 2018-10)</p>
-<h2 id="Todo">Todo</h2>
+<h2 id="DIFFERENCES-BETWEEN-spp-AND-GNU-Parallel">DIFFERENCES BETWEEN spp AND GNU Parallel</h2>
-<p>Url for spread</p>
+<p><b>spp</b> can run jobs in parallel. <b>spp</b> does not use a command template to generate the jobs, but requires jobs to be in a file. Output from the jobs mix.</p>
<p>https://github.com/john01dav/spp</p>
+<h2 id="Todo">Todo</h2>
+
+<p>Url for spread</p>
+
<p>https://github.com/amritb/with-this.git</p>
<p>https://github.com/fd0/machma Requires Go >= 1.7.</p>
Binary files old/parallel-20181022/src/parallel_alternatives.pdf and new/parallel-20181122/src/parallel_alternatives.pdf differ
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/parallel-20181022/src/parallel_alternatives.pod new/parallel-20181122/src/parallel_alternatives.pod
--- old/parallel-20181022/src/parallel_alternatives.pod 2018-10-22 20:17:25.000000000 +0200
+++ new/parallel-20181122/src/parallel_alternatives.pod 2018-10-24 01:07:54.000000000 +0200
@@ -1716,12 +1716,19 @@
https://github.com/cetra3/lorikeet (Last checked: 2018-10)
+=head2 DIFFERENCES BETWEEN spp AND GNU Parallel
+
+B<spp> can run jobs in parallel. B<spp> does not use a command
+template to generate the jobs, but requires jobs to be in a
+file. Output from the jobs mix.
+
+https://github.com/john01dav/spp
=head2 Todo
Url for spread
-https://github.com/john01dav/spp
+
https://github.com/amritb/with-this.git
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/parallel-20181022/src/parallel_alternatives.texi new/parallel-20181122/src/parallel_alternatives.texi
--- old/parallel-20181022/src/parallel_alternatives.texi 2018-10-22 20:47:26.000000000 +0200
+++ new/parallel-20181122/src/parallel_alternatives.texi 2018-11-10 15:01:55.000000000 +0100
@@ -68,6 +68,7 @@
* DIFFERENCES BETWEEN map(soveran) AND GNU Parallel::
* DIFFERENCES BETWEEN loop AND GNU Parallel::
* DIFFERENCES BETWEEN lorikeet AND GNU Parallel::
+* DIFFERENCES BETWEEN spp AND GNU Parallel::
* Todo::
@end menu
@@ -1962,13 +1963,20 @@
https://github.com/cetra3/lorikeet (Last checked: 2018-10)
+@node DIFFERENCES BETWEEN spp AND GNU Parallel
+@section DIFFERENCES BETWEEN spp AND GNU Parallel
+
+@strong{spp} can run jobs in parallel. @strong{spp} does not use a command
+template to generate the jobs, but requires jobs to be in a
+file. Output from the jobs mix.
+
+https://github.com/john01dav/spp
+
@node Todo
@section Todo
Url for spread
-https://github.com/john01dav/spp
-
https://github.com/amritb/with-this.git
https://github.com/fd0/machma Requires Go >= 1.7.
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/parallel-20181022/src/sem new/parallel-20181122/src/sem
--- old/parallel-20181022/src/sem 2018-10-23 00:48:01.000000000 +0200
+++ new/parallel-20181122/src/sem 2018-11-23 00:32:16.000000000 +0100
@@ -120,7 +120,7 @@
$sem = acquire_semaphore();
}
$SIG{TERM} = \&start_no_new_jobs;
-start_more_jobs();
+while(start_more_jobs()) {}
if($opt::tee) {
# All jobs must be running in parallel for --tee
$Global::start_no_new_jobs = 1;
@@ -620,6 +620,7 @@
my $sleep =1;
while($Global::total_running > 0) {
$sleep = ::reap_usleep($sleep);
+ start_more_jobs();
}
}
$Global::start_no_new_jobs ||= 1;
@@ -1554,7 +1555,7 @@
sub init_globals {
# Defaults:
- $Global::version = 20181022;
+ $Global::version = 20181122;
$Global::progname = 'parallel';
$Global::infinity = 2**31;
$Global::debug = 0;
@@ -2668,7 +2669,6 @@
# Returns:
# $jobs_started = number of jobs started
my $jobs_started = 0;
- my $jobs_started_this_round = 0;
if($Global::start_no_new_jobs) {
return $jobs_started;
}
@@ -2678,65 +2678,61 @@
changed_procs_file();
changed_sshloginfile();
}
- do {
- $jobs_started_this_round = 0;
- # This will start 1 job on each --sshlogin (if possible)
- # thus distribute the jobs on the --sshlogins round robin
- for my $sshlogin (values %Global::host) {
- if($Global::JobQueue->empty() and not $opt::pipe) {
- # No more jobs in the queue
- last;
- }
- debug("run", "Running jobs before on ", $sshlogin->string(), ": ",
- $sshlogin->jobs_running(), "\n");
- if ($sshlogin->jobs_running() < $sshlogin->max_jobs_running()) {
- if($opt::delay
- and
- $opt::delay > ::now() - $Global::newest_starttime) {
- # It has been too short since last start
- next;
- }
- if($opt::load and $sshlogin->loadavg_too_high()) {
- # The load is too high or unknown
- next;
- }
- if($opt::noswap and $sshlogin->swapping()) {
- # The server is swapping
- next;
- }
- if($opt::limit and $sshlogin->limit()) {
- # Over limit
- next;
- }
- if($opt::memfree and $sshlogin->memfree() < $opt::memfree) {
- # The server has not enough mem free
- ::debug("mem", "Not starting job: not enough mem\n");
- next;
- }
- if($sshlogin->too_fast_remote_login()) {
- # It has been too short since
- next;
- }
- debug("run", $sshlogin->string(),
- " has ", $sshlogin->jobs_running(),
- " out of ", $sshlogin->max_jobs_running(),
- " jobs running. Start another.\n");
- if(start_another_job($sshlogin) == 0) {
- # No more jobs to start on this $sshlogin
- debug("run","No jobs started on ",
- $sshlogin->string(), "\n");
- next;
- }
- $sshlogin->inc_jobs_running();
- $sshlogin->set_last_login_at(::now());
- $jobs_started++;
- $jobs_started_this_round++;
- }
- debug("run","Running jobs after on ", $sshlogin->string(), ": ",
- $sshlogin->jobs_running(), " of ",
- $sshlogin->max_jobs_running(), "\n");
+ # This will start 1 job on each --sshlogin (if possible)
+ # thus distribute the jobs on the --sshlogins round robin
+ for my $sshlogin (values %Global::host) {
+ if($Global::JobQueue->empty() and not $opt::pipe) {
+ # No more jobs in the queue
+ last;
}
- } while($jobs_started_this_round);
+ debug("run", "Running jobs before on ", $sshlogin->string(), ": ",
+ $sshlogin->jobs_running(), "\n");
+ if ($sshlogin->jobs_running() < $sshlogin->max_jobs_running()) {
+ if($opt::delay
+ and
+ $opt::delay > ::now() - $Global::newest_starttime) {
+ # It has been too short since last start
+ next;
+ }
+ if($opt::load and $sshlogin->loadavg_too_high()) {
+ # The load is too high or unknown
+ next;
+ }
+ if($opt::noswap and $sshlogin->swapping()) {
+ # The server is swapping
+ next;
+ }
+ if($opt::limit and $sshlogin->limit()) {
+ # Over limit
+ next;
+ }
+ if($opt::memfree and $sshlogin->memfree() < $opt::memfree) {
+ # The server has not enough mem free
+ ::debug("mem", "Not starting job: not enough mem\n");
+ next;
+ }
+ if($sshlogin->too_fast_remote_login()) {
+ # It has been too short since
+ next;
+ }
+ debug("run", $sshlogin->string(),
+ " has ", $sshlogin->jobs_running(),
+ " out of ", $sshlogin->max_jobs_running(),
+ " jobs running. Start another.\n");
+ if(start_another_job($sshlogin) == 0) {
+ # No more jobs to start on this $sshlogin
+ debug("run","No jobs started on ",
+ $sshlogin->string(), "\n");
+ next;
+ }
+ $sshlogin->inc_jobs_running();
+ $sshlogin->set_last_login_at(::now());
+ $jobs_started++;
+ }
+ debug("run","Running jobs after on ", $sshlogin->string(), ": ",
+ $sshlogin->jobs_running(), " of ",
+ $sshlogin->max_jobs_running(), "\n");
+ }
return $jobs_started;
}
@@ -2912,8 +2908,8 @@
}
# * because of loadavg
# * because of too little time between each ssh login.
- start_more_jobs();
$sleep = ::reap_usleep($sleep);
+ start_more_jobs();
if($Global::max_jobs_running == 0) {
::warning("There are no job slots available. Increase --jobs.");
}
@@ -2921,6 +2917,7 @@
while($opt::sqlmaster and not $Global::sql->finished()) {
# SQL master
$sleep = ::reap_usleep($sleep);
+ start_more_jobs();
if($Global::start_sqlworker) {
# Start an SQL worker as we are now sure there is work to do
$Global::start_sqlworker = 0;
@@ -4054,27 +4051,24 @@
# @pids_reaped = PIDs of children finished
my $stiff;
my @pids_reaped;
- my $children_reaped = 0;
+ my $total_reaped;
debug("run", "Reaper ");
- # For efficiency surround with BEGIN/COMMIT when using $opt::sqlmaster
- $opt::sqlmaster and $Global::sql->run("BEGIN;");
- while (($stiff = waitpid(-1, &WNOHANG)) > 0) {
+ if (($stiff = waitpid(-1, &WNOHANG)) > 0) {
# $stiff = pid of dead process
if(wantarray) {
push(@pids_reaped,$stiff);
- } else {
- $children_reaped++;
}
- if($Global::sshmaster{$stiff}) {
- # This is one of the ssh -M: ignore
- next;
- }
- my $job = $Global::running{$stiff};
+ $total_reaped++;
+ if($Global::sshmaster{$stiff}) {
+ # This is one of the ssh -M: ignore
+ next;
+ }
+ my $job = $Global::running{$stiff};
# '-a <(seq 10)' will give us a pid not in %Global::running
- $job or next;
- delete $Global::running{$stiff};
- $Global::total_running--;
+ $job or return 0;
+ delete $Global::running{$stiff};
+ $Global::total_running--;
if($job->{'commandline'}{'skip'}) {
# $job->skip() was called
$job->set_exitstatus(-2);
@@ -4084,11 +4078,12 @@
$job->set_exitsignal($? & 127);
}
- debug("run", "seq ",$job->seq()," died (", $job->exitstatus(), ")");
- $job->set_endtime(::now());
- my $sshlogin = $job->sshlogin();
- $sshlogin->dec_jobs_running();
- if($job->should_be_retried()) {
+ debug("run", "seq ",$job->seq()," died (", $job->exitstatus(), ")");
+ $job->set_endtime(::now());
+ my $sshlogin = $job->sshlogin();
+ $sshlogin->dec_jobs_running();
+ if($job->should_be_retried()) {
+ # Free up file handles
$job->free_ressources();
} else {
# The job is done
@@ -4109,18 +4104,17 @@
::kill_sleep_seq($job->pid());
::killall();
::wait_and_exit($Global::halt_exitstatus);
- }
- }
+ }
+ }
$job->cleanup();
- start_more_jobs();
+
if($opt::progress) {
my %progress = progress();
::status_no_nl("\r",$progress{'status'});
}
}
- $opt::sqlmaster and $Global::sql->run("COMMIT;");
debug("run", "done ");
- return wantarray ? @pids_reaped : $children_reaped;
+ return wantarray ? @pids_reaped : $total_reaped;
}
@@ -5102,6 +5096,7 @@
# $ms*1.1 if no children reaped
my $ms = shift;
if(reaper()) {
+ while(reaper()) {}
if(not $Global::total_completed % 100) {
if($opt::timeout) {
# Force cleaning the timeout queue for every 1000 jobs
@@ -7947,7 +7942,7 @@
$command =
'cat > $PARALLEL_TMP;'.
$command.";". postpone_exit_and_cleanup().
- '$PARALLEL_TMP';
+ '$PARALLEL_TMP';
} elsif($opt::fifo) {
# Prepend fifo-wrapper. In essence:
# mkfifo {}
@@ -8280,7 +8275,7 @@
my $self = shift;
my $command = shift;
# TODO test that *sh -c 'parallel --env' use *sh
- if(not defined $self->{'sshlogin_wrap'}) {
+ if(not defined $self->{'sshlogin_wrap'}{$command}) {
my $sshlogin = $self->sshlogin();
my $serverlogin = $sshlogin->serverlogin();
my $quoted_remote_command;
@@ -8310,12 +8305,12 @@
$command =~ /\n/) {
# csh does not deal well with > 1000 chars in one word
# csh does not deal well with $ENV with \n
- $self->{'sshlogin_wrap'} = base64_wrap($perl_code);
+ $self->{'sshlogin_wrap'}{$command} = base64_wrap($perl_code);
} else {
- $self->{'sshlogin_wrap'} = "perl -e ".::Q($perl_code);
+ $self->{'sshlogin_wrap'}{$command} = "perl -e ".::Q($perl_code);
}
} else {
- $self->{'sshlogin_wrap'} = $command;
+ $self->{'sshlogin_wrap'}{$command} = $command;
}
} else {
my $pwd = "";
@@ -8357,7 +8352,7 @@
# We need to save the exit status of the job
$post = '_EXIT_status=$?; ' . $post . ' exit $_EXIT_status;';
}
- $self->{'sshlogin_wrap'} =
+ $self->{'sshlogin_wrap'}{$command} =
($pre
. "$sshcmd $serverlogin -- exec "
. $quoted_remote_command
@@ -8365,7 +8360,7 @@
. $post);
}
}
- return $self->{'sshlogin_wrap'};
+ return $self->{'sshlogin_wrap'}{$command};
}
sub transfer {
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/parallel-20181022/src/sql new/parallel-20181122/src/sql
--- old/parallel-20181022/src/sql 2018-10-23 00:48:01.000000000 +0200
+++ new/parallel-20181122/src/sql 2018-11-23 00:32:16.000000000 +0100
@@ -576,7 +576,7 @@
exit ($err);
sub parse_options {
- $Global::version = 20181022;
+ $Global::version = 20181122;
$Global::progname = 'sql';
# This must be done first as this may exec myself
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/parallel-20181022/src/sql.1 new/parallel-20181122/src/sql.1
--- old/parallel-20181022/src/sql.1 2018-10-23 00:48:24.000000000 +0200
+++ new/parallel-20181122/src/sql.1 2018-11-23 00:32:31.000000000 +0100
@@ -129,7 +129,7 @@
.\" ========================================================================
.\"
.IX Title "SQL 1"
-.TH SQL 1 "2018-10-22" "20181022" "parallel"
+.TH SQL 1 "2018-11-22" "20181122" "parallel"
.\" For nroff, turn off justification. Always turn off hyphenation; it makes
.\" way too many mistakes in technical documents.
.if n .ad l
++++++ parallel-20181022.tar.bz2.sig -> parallel-20181122.tar.bz2.sig ++++++
--- /work/SRC/openSUSE:Factory/gnu_parallel/parallel-20181022.tar.bz2.sig 2018-11-15 12:41:41.230165493 +0100
+++ /work/SRC/openSUSE:Factory/.gnu_parallel.new.19453/parallel-20181122.tar.bz2.sig 2018-11-26 10:32:03.960904233 +0100
@@ -2,7 +2,7 @@
# To check the signature run:
# echo | gpg
-# gpg --auto-key-locate keyserver --keyserver-options auto-key-retrieve parallel-20181022.tar.bz2.sig
+# gpg --auto-key-locate keyserver --keyserver-options auto-key-retrieve parallel-20181122.tar.bz2.sig
echo | gpg 2>/dev/null
gpg --auto-key-locate keyserver --keyserver-options auto-key-retrieve $0
@@ -10,32 +10,32 @@
-----BEGIN PGP SIGNATURE-----
-iQTwBAABCgAGBQJbzlrhAAoJENGrRRaIiIiIjJkmoINuu2EzJxlOlIPwri2NZQ2R
-F2URNBuzaW3iP+dxiAN3Lz2ABnfPv71+MuRTwPkhu6DhoPaO3FS6wz9WEPknxcE8
-/xejg8xs5fLFNL4TTVemOsC5SsGOIjqpzLw6L5HV2iCzeYUdFDMcP9QRUca2lu0B
-fVTXfuuOsUICb6F6VjCWfa+2cBXCpuMjg7SuWGTcFF0DyuussCQZcq3BGZNtSrhO
-ipoOENKHqcme2dfRSZangWkd7wigxVOyNmUTg+qIq1/b1qhK1XBib0cp7fA6nFOd
-SK0mZNav0jMxQHudBfEpRXRVT3C55uKi7vjqMuZA/XvPQaiYefd2vpFjLLQw0OSV
-Oyw7P4CrgN7oUBHCiJfTqd+hk+u5kPfU3kmi5Yy87FNAvR5XDbw7+aYfy/mxpiwz
-3PccSeikGe+HGzuWwLWEdEnnYeY6wcpBtREBHxeLycsVD+sKXLS2Gwx5b5mYeIrl
-svfNxdw0bNRzFNt69mtq9AZMTPnMdP1K3IFyV72j3rzFyMhwUfHfE6sIoumex3tk
-71x5FZXJn3ELqGKhVRHGKvnSZLu4ZPcQYG/Pd563Fz/8+bjeVPyY56AKF3VdViYP
-CmEGH3aQ+QvT/CDWBdA9ymBx1zkZP/v2uBwZzmeWKvAdCOZ20WqLDYzyW3LgLigT
-0gzLys+i35Q8+VCYILybEbJIdSLgniyahZXlPk+GMz0qUFCnRYuNdfFjBq/fDqTI
-wf+6b2UFhSzZc492QO91z+Ano9sRJLMGOwGoebQhY8R6LhSfFLEnjCNbCYaIQSRI
-yIgidGZenjHRfWaUYxJEEeYl/nPupFoiOAoCBuUwO+T+heeluoXW5QKiFYPKB5yl
-46FTaGTcTn60uTfRmLgPwrfGKSUJOmZ1Xwgsqq9cfayaYJhHcUsiJE8qusjlPlVG
-/DDaCJ0zfwkj5jklVA1H/swI3sIpG+dvcWkSlfqCK05uAgZX8sNZVSuSzZjJG0vr
-/JYQGvIPnA02DL8H5AC66hZtbqAHXnfwdoPndWssTmDb62z3OKY3os1gLITAfMz4
-seSIivAxZYKmrjDNF09s/lF7ihv4Pi1X9iV1HIJaeyHKE3YqvPL5xpovFlBggnRy
-ZsG3CmoBmATLE1jO3z/Pucm5LUG97sD3WR/sgm7NJJPAqAlX9jh+VdYC3AEPNEoR
-xfKt6LUsdNv9VwpPJF8WrJwc5k6FRT4Td1XUrjZfXmJXPswIqBjlFsI1IameT+WA
-AvMmmWVOkecK672XizBqObNAtIIJH0ER7a4DsPa/Pre8cs/Gy2a2GPajHu22DSsp
-gV5SGefmc2PwUxun6L4ssuFCp+Cfzm3kDH3aoaCi8C8ljDGmlRABFL559I7d37qE
-8d/jYTRjM4zXK59o41Dmew2Pu9154ht8mblelqwPdU3zF30BYsFubx7frSH7LSen
-hzeTuBqpuYHfm23f48Mf4DpXdXJ3DzyldVjZ1pUufryTOMQPayYRTFvLwjUbW2jB
-a1Xs+7r0GSsHxrWodZRZ3Ljl4crOpDsL77phhYkebfzfqhaboCcoPj9pF2fDJR/G
-axIif1rSO8XBPtr3kPJllrV1uoqRQ4493oqmkqJcyqL4JPYQEgm2E0bEbIn9qwg9
-8l81eS0sLwIoSDTxpffM40Pw5w==
-=m7d/
+iQTwBAABCgAGBQJb9z2JAAoJENGrRRaIiIiI6S0mn1HMIHl+3Wxog/PHiqA3N7Of
+KRVZGzSTClo9dBFchs86bm20t8D7zFmTLdiSfTqh/CnNUlU530Fd2pSpcvAQECmQ
+XmDrlWTkhuRYTWO/FS/Ngh0+IfVBt4ycJiO0p46OODxdp64ewXHJ8KstUy607uHv
+cRnw1rKEqu8suUMx9rdZii6b+STAPNNG0/KyJnvuiiMtwVCZEFSTFS509lIKWsQH
+ffkzqUNoNxy7hwkRXVD7zA9X35Iyh55TMsmhBDvn+6cDNv0AHyNHTXnQWqmok83A
+0SHkK6jwndmFJWYZCNcmjykg8bW94IYO/ThoRzcthD54FgcTpKgutQtjlpSKqFBy
+LjP/shHbNpJSDsnxwxcppQL2XvWz79LnTt5dKWQRQcj5Ijh0y1NiZnBP8xreVLrb
+Rx+SN6gaiUtd1ig5hi/okfIRJnd2nCxAVyvrhS6cRy1XdYJf55lc3+a5p36YBT3Q
+o0jmFzVsSeSnG0v17zrJiyP0SDJrmhFHLI+qN+jxLu9/lqvMqhCSzlDOjwhv0mGq
+v1x3D4sYCX1cg61CvCl532hQILU0tf7duaMlAZNnsWz4cN2o9twFTi/TTQMbku8T
+C05kMgbT4xQzs62LMZJ0iaD7itTGog1Rxov3Nu6i+3Tm4kv9iYKP8BdjDJ3WckGd
+CBhDgBNU2OH0KGhw6rNqSsb+G+E0xZx5dO+7KByA/PCKz9n070Mhh3LUdS9lFvi3
+OO/CuaT6PGGCsTevfResqAFcKrpP+fxUWnyFF5pxVgNHpKcxX2by9RcGYDXQYpu4
+ntDnxq5idarHhj+kVzc8ehX6KVxuXVm3Arkx5QeWbVnf33lWak6rGqW+lUZlM3Dd
+LZ2JKnantHjjIDzQBfMHMDt4/hPVroOwEu6uCSyWid2T80ty8DeNBH+XJY+kS8TT
+zoPslLgp9KQedlApIr+2tAoiTI4n5wQArMAUx9MU0a7qWRHYIyXK3iyHM+63dJuw
+vYkEi6sfn1/27EZWwiKDkLeIOR6jeZNdy7q43jjyGs05m1uhY/PfAQ6ofshlu+XR
+na6gu0g7A24iOG89brptxDa2pfm0KSVjrC1t4WrjDSJ1X/JCCBAIaOAlYEskEOni
+fWFYkcNEW/73b3uKFHTXl/Pjhcr0aK1xmybP1PlX/WDQpNwuNZulJpvZuY1bxAsa
+4jzCjddL+VwO17d3auU8jKh/IUMUV62brOnTUeJ3lAC+aF6SF7SXVm61JTNOa1i2
+9wIt+5nJ3skQq7gs4S6IkJXSb3yCSvn2MfntULzcOc/5IaebEZ0Aya1RZRr/mqUt
+nk/yhe0V/Jwnp0hVlsWIC4IftGk6mVUOio5mPfcJZ4fvEllLcQKvAAIvjGU0RMAr
+LLMsuWyKydIY6LtjQil20/IAlkSz/948XyDfQjqsoho1UJoPp9yM8ilkAUijzkaA
+2zTyOApT5Ts6w96WYSqMIrFwqii2ACP/xIv3g3XpupxPHPzwWFa2LrsyIhbR263X
+hARwQLvugG96ef/aSEJJSZaGM6J02ntU5KjOGPBPcTx/tq1vh9YTnxhWX3G8YT7Z
+Ivw4oS+wfw2fTerNqLwH+Ey38eMtsWPMojfJUo3D+dj+qnfTdcxjyVZEQWntceaB
+xfgRRHFG5f9tZ6Gpu6pZhhPP6w==
+=GJ5s
-----END PGP SIGNATURE-----
Hello community,
here is the log from the commit of package youtube-dl for openSUSE:Factory checked in at 2018-11-26 10:31:06
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/youtube-dl (Old)
and /work/SRC/openSUSE:Factory/.youtube-dl.new.19453 (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Package is "youtube-dl"
Mon Nov 26 10:31:06 2018 rev:88 rq:651431 version:2018.11.23
Changes:
--------
--- /work/SRC/openSUSE:Factory/youtube-dl/python-youtube-dl.changes 2018-11-12 09:45:42.484801875 +0100
+++ /work/SRC/openSUSE:Factory/.youtube-dl.new.19453/python-youtube-dl.changes 2018-11-26 10:31:50.496920009 +0100
@@ -1,0 +2,28 @@
+Fri Nov 23 13:34:30 UTC 2018 - sschricker(a)suse.de
+
+- Update to new upstream release 2018.11.23
+ * [mixcloud] Fallback to hardcoded decryption key
+ * [nbc:news] Fix article extraction
+ * [foxsports] Fix extraction
+ * [ciscolive] Add support for ciscolive.cisco.com
+ * [nzz] Relax kaltura regex
+ * [kaltura] Limit requested MediaEntry fields
+ * [americastestkitchen] Add support for zype embeds
+ * [nova:embed] Fix extraction
+
+-------------------------------------------------------------------
+Sun Nov 18 01:35:33 UTC 2018 - seb95.scou(a)gmail.com
+
+- Update to new upstream release 2018.11.18
+ * [wwe] Add support for wwe.com
+ * [vk] Detect geo restriction
+ * [openload] Use original host during extraction
+ * [atvat] Fix extraction
+ * [rte] Add support for new API endpoint
+ * [tnaflixnetwork:embed] Fix extraction
+ * [picarto] Use API and add token support
+ * [zype] Add support for player.zype.com
+ * [vivo] Fix extraction
+ * [ruutu] Update API endpoint
+
+-------------------------------------------------------------------
youtube-dl.changes: same change
Old:
----
youtube-dl-2018.11.07.tar.gz
youtube-dl-2018.11.07.tar.gz.sig
New:
----
youtube-dl-2018.11.23.tar.gz
youtube-dl-2018.11.23.tar.gz.sig
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Other differences:
------------------
++++++ python-youtube-dl.spec ++++++
--- /var/tmp/diff_new_pack.MO6Oxm/_old 2018-11-26 10:31:54.816914947 +0100
+++ /var/tmp/diff_new_pack.MO6Oxm/_new 2018-11-26 10:31:54.820914943 +0100
@@ -19,7 +19,7 @@
%define modname youtube-dl
%{?!python_module:%define python_module() python-%{**} python3-%{**}}
Name: python-youtube-dl
-Version: 2018.11.07
+Version: 2018.11.23
Release: 0
Summary: A python module for downloading from video sites for offline watching
License: SUSE-Public-Domain AND CC-BY-SA-3.0
++++++ youtube-dl.spec ++++++
--- /var/tmp/diff_new_pack.MO6Oxm/_old 2018-11-26 10:31:54.844914914 +0100
+++ /var/tmp/diff_new_pack.MO6Oxm/_new 2018-11-26 10:31:54.844914914 +0100
@@ -17,7 +17,7 @@
Name: youtube-dl
-Version: 2018.11.07
+Version: 2018.11.23
Release: 0
Summary: A tool for downloading from video sites for offline watching
License: SUSE-Public-Domain AND CC-BY-SA-3.0
@@ -42,23 +42,22 @@
rm -f youtube-dl
%build
-perl -i -pe '
- s{^PREFIX\ \?=\ %_prefix/local}{PREFIX ?= %_prefix};
- s{^BINDIR\ \?=\ \$\(PREFIX\)/bin}{BINDIR\ \?=\ %_bindir};
- s{^MANDIR\ \?=\ \$\(PREFIX\)/man}{MANDIR\ \?=\ %_mandir};
- s{^SHAREDIR\ \?=\ \$\(PREFIX\)/share}{SHAREDIR\ \?=\ %_datadir};' Makefile
make %{?_smp_mflags}
%install
-%make_install
+install -D -m 755 youtube-dl %buildroot/%_bindir/youtube-dl
+install -D -m 644 youtube-dl.bash-completion %buildroot/%_datadir/bash-completion/completions/youtube-dl
+install -D -m 644 youtube-dl.zsh %buildroot/%_datadir/zsh/site-functions/_youtube-dl
+install -D -m 644 youtube-dl.fish %buildroot/%_datadir/fish/completions/youtube-dl.fish
+install -D -m 644 youtube-dl.1 %buildroot/%_mandir/man1/youtube-dl.1
%files
%license LICENSE
%doc README.txt
%_bindir/youtube-dl
-%_mandir/man1/*
-%config %_sysconfdir/bash_completion.d/
-%config %_sysconfdir/fish/
-%config %_datadir/zsh/
+%_mandir/man1/youtube-dl.1*
+%_datadir/fish/
+%_datadir/zsh/
+%_datadir/bash-completion/
%changelog
++++++ youtube-dl-2018.11.07.tar.gz -> youtube-dl-2018.11.23.tar.gz ++++++
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/youtube-dl/ChangeLog new/youtube-dl/ChangeLog
--- old/youtube-dl/ChangeLog 2018-11-06 19:38:22.000000000 +0100
+++ new/youtube-dl/ChangeLog 2018-11-22 18:16:43.000000000 +0100
@@ -1,3 +1,40 @@
+version 2018.11.23
+
+Core
++ [setup.py] Add more relevant classifiers
+
+Extractors
+* [mixcloud] Fallback to hardcoded decryption key (#18016)
+* [nbc:news] Fix article extraction (#16194)
+* [foxsports] Fix extraction (#17543)
+* [loc] Relax regular expression and improve formats extraction
++ [ciscolive] Add support for ciscolive.cisco.com (#17984)
+* [nzz] Relax kaltura regex (#18228)
+* [sixplay] Fix formats extraction
+* [bitchute] Improve title extraction
+* [kaltura] Limit requested MediaEntry fields
++ [americastestkitchen] Add support for zype embeds (#18225)
++ [pornhub] Add pornhub.net alias
+* [nova:embed] Fix extraction (#18222)
+
+
+version 2018.11.18
+
+Extractors
++ [wwe] Extract subtitles
++ [wwe] Add support for playlistst (#14781)
++ [wwe] Add support for wwe.com (#14781, #17450)
+* [vk] Detect geo restriction (#17767)
+* [openload] Use original host during extraction (#18211)
+* [atvat] Fix extraction (#18041)
++ [rte] Add support for new API endpoint (#18206)
+* [tnaflixnetwork:embed] Fix extraction (#18205)
+* [picarto] Use API and add token support (#16518)
++ [zype] Add support for player.zype.com (#18143)
+* [vivo] Fix extraction (#18139)
+* [ruutu] Update API endpoint (#18138)
+
+
version 2018.11.07
Extractors
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/youtube-dl/docs/supportedsites.md new/youtube-dl/docs/supportedsites.md
--- old/youtube-dl/docs/supportedsites.md 2018-11-06 19:38:25.000000000 +0100
+++ new/youtube-dl/docs/supportedsites.md 2018-11-22 18:16:45.000000000 +0100
@@ -163,6 +163,8 @@
- **chirbit**
- **chirbit:profile**
- **Cinchcast**
+ - **CiscoLiveSearch**
+ - **CiscoLiveSession**
- **CJSW**
- **cliphunter**
- **Clippit**
@@ -1080,6 +1082,7 @@
- **wrzuta.pl:playlist**
- **WSJ**: Wall Street Journal
- **WSJArticle**
+ - **WWE**
- **XBef**
- **XboxClips**
- **XFileShare**: XFileShare based sites: DaClips, FileHoot, GorillaVid, MovPod, PowerWatch, Rapidvideo.ws, TheVideoBee, Vidto, Streamin.To, XVIDSTAGE, Vid ABC, VidBom, vidlo, RapidVideo.TV, FastVideo.me
@@ -1139,3 +1142,4 @@
- **ZDF**
- **ZDFChannel**
- **zingmp3**: mp3.zing.vn
+ - **Zype**
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/youtube-dl/setup.py new/youtube-dl/setup.py
--- old/youtube-dl/setup.py 2018-11-06 19:36:15.000000000 +0100
+++ new/youtube-dl/setup.py 2018-11-21 23:55:12.000000000 +0100
@@ -124,6 +124,8 @@
'Development Status :: 5 - Production/Stable',
'Environment :: Console',
'License :: Public Domain',
+ 'Programming Language :: Python',
+ 'Programming Language :: Python :: 2',
'Programming Language :: Python :: 2.6',
'Programming Language :: Python :: 2.7',
'Programming Language :: Python :: 3',
@@ -132,6 +134,13 @@
'Programming Language :: Python :: 3.4',
'Programming Language :: Python :: 3.5',
'Programming Language :: Python :: 3.6',
+ 'Programming Language :: Python :: 3.7',
+ 'Programming Language :: Python :: 3.8',
+ 'Programming Language :: Python :: Implementation',
+ 'Programming Language :: Python :: Implementation :: CPython',
+ 'Programming Language :: Python :: Implementation :: IronPython',
+ 'Programming Language :: Python :: Implementation :: Jython',
+ 'Programming Language :: Python :: Implementation :: PyPy',
],
cmdclass={'build_lazy_extractors': build_lazy_extractors},
Binary files old/youtube-dl/youtube-dl and new/youtube-dl/youtube-dl differ
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/youtube-dl/youtube_dl/extractor/americastestkitchen.py new/youtube-dl/youtube_dl/extractor/americastestkitchen.py
--- old/youtube-dl/youtube_dl/extractor/americastestkitchen.py 2018-11-06 19:36:16.000000000 +0100
+++ new/youtube-dl/youtube_dl/extractor/americastestkitchen.py 2018-11-21 23:55:12.000000000 +0100
@@ -43,10 +43,6 @@
webpage = self._download_webpage(url, video_id)
- partner_id = self._search_regex(
- r'src=["\'](?:https?:)?//(?:[^/]+\.)kaltura\.com/(?:[^/]+/)*(?:p|partner_id)/(\d+)',
- webpage, 'kaltura partner id')
-
video_data = self._parse_json(
self._search_regex(
r'window\.__INITIAL_STATE__\s*=\s*({.+?})\s*;\s*</script>',
@@ -58,7 +54,18 @@
(lambda x: x['episodeDetail']['content']['data'],
lambda x: x['videoDetail']['content']['data']), dict)
ep_meta = ep_data.get('full_video', {})
- external_id = ep_data.get('external_id') or ep_meta['external_id']
+
+ zype_id = ep_meta.get('zype_id')
+ if zype_id:
+ embed_url = 'https://player.zype.com/embed/%s.js?api_key=jZ9GUhRmxcPvX7M3SlfejB6Hle9jyHT…' % zype_id
+ ie_key = 'Zype'
+ else:
+ partner_id = self._search_regex(
+ r'src=["\'](?:https?:)?//(?:[^/]+\.)kaltura\.com/(?:[^/]+/)*(?:p|partner_id)/(\d+)',
+ webpage, 'kaltura partner id')
+ external_id = ep_data.get('external_id') or ep_meta['external_id']
+ embed_url = 'kaltura:%s:%s' % (partner_id, external_id)
+ ie_key = 'Kaltura'
title = ep_data.get('title') or ep_meta.get('title')
description = clean_html(ep_meta.get('episode_description') or ep_data.get(
@@ -72,8 +79,8 @@
return {
'_type': 'url_transparent',
- 'url': 'kaltura:%s:%s' % (partner_id, external_id),
- 'ie_key': 'Kaltura',
+ 'url': embed_url,
+ 'ie_key': ie_key,
'title': title,
'description': description,
'thumbnail': thumbnail,
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/youtube-dl/youtube_dl/extractor/atvat.py new/youtube-dl/youtube_dl/extractor/atvat.py
--- old/youtube-dl/youtube_dl/extractor/atvat.py 2018-11-06 19:36:16.000000000 +0100
+++ new/youtube-dl/youtube_dl/extractor/atvat.py 2018-11-21 23:55:00.000000000 +0100
@@ -28,8 +28,10 @@
display_id = self._match_id(url)
webpage = self._download_webpage(url, display_id)
video_data = self._parse_json(unescapeHTML(self._search_regex(
- r'class="[^"]*jsb_video/FlashPlayer[^"]*"[^>]+data-jsb="([^"]+)"',
- webpage, 'player data')), display_id)['config']['initial_video']
+ [r'flashPlayerOptions\s*=\s*(["\'])(?P<json>(?:(?!\1).)+)\1',
+ r'class="[^"]*jsb_video/FlashPlayer[^"]*"[^>]+data-jsb="(?P<json>[^"]+)"'],
+ webpage, 'player data', group='json')),
+ display_id)['config']['initial_video']
video_id = video_data['id']
video_title = video_data['title']
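The atvat change above makes the extractor try a new `flashPlayerOptions` pattern before falling back to the old `data-jsb` markup, pulling the JSON out of a shared named group. A minimal sketch of that try-each-pattern-and-return-the-named-group approach (the `search_regex` helper here is an illustration, not youtube-dl's actual `_search_regex`):

```python
import re

def search_regex(patterns, text, group):
    # Try each pattern in order; return the requested named group
    # from the first pattern that matches, or None if none do.
    for pattern in patterns:
        m = re.search(pattern, text)
        if m:
            return m.group(group)
    return None

# The two patterns from the diff: new-style JS assignment first,
# old-style FlashPlayer markup as the fallback.
PATTERNS = [
    r'flashPlayerOptions\s*=\s*(["\'])(?P<json>(?:(?!\1).)+)\1',
    r'class="[^"]*jsb_video/FlashPlayer[^"]*"[^>]+data-jsb="(?P<json>[^"]+)"',
]

new_markup = 'var flashPlayerOptions = "eyJpZCI6MX0=";'
old_markup = '<div class="x jsb_video/FlashPlayer y" data-jsb="{&quot;id&quot;:1}">'
```

Because both alternatives capture into the same `(?P<json>...)` group, the caller never needs to know which pattern matched.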
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/youtube-dl/youtube_dl/extractor/bitchute.py new/youtube-dl/youtube_dl/extractor/bitchute.py
--- old/youtube-dl/youtube_dl/extractor/bitchute.py 2018-11-06 19:36:16.000000000 +0100
+++ new/youtube-dl/youtube_dl/extractor/bitchute.py 2018-11-21 23:55:12.000000000 +0100
@@ -37,7 +37,7 @@
'User-Agent': 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_14_0) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/69.0.3497.57 Safari/537.36',
})
- title = self._search_regex(
+ title = self._html_search_regex(
(r'<[^>]+\bid=["\']video-title[^>]+>([^<]+)', r'<title>([^<]+)'),
webpage, 'title', default=None) or self._html_search_meta(
'description', webpage, 'title',
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/youtube-dl/youtube_dl/extractor/ciscolive.py new/youtube-dl/youtube_dl/extractor/ciscolive.py
--- old/youtube-dl/youtube_dl/extractor/ciscolive.py 1970-01-01 01:00:00.000000000 +0100
+++ new/youtube-dl/youtube_dl/extractor/ciscolive.py 2018-11-21 23:55:12.000000000 +0100
@@ -0,0 +1,142 @@
+# coding: utf-8
+from __future__ import unicode_literals
+
+import itertools
+
+from .common import InfoExtractor
+from ..compat import (
+ compat_parse_qs,
+ compat_urllib_parse_urlparse,
+)
+from ..utils import (
+ clean_html,
+ float_or_none,
+ int_or_none,
+ try_get,
+ urlencode_postdata,
+)
+
+
+class CiscoLiveBaseIE(InfoExtractor):
+ # These appear to be constant across all Cisco Live presentations
+ # and are not tied to any user session or event
+ RAINFOCUS_API_URL = 'https://events.rainfocus.com/api/%s'
+ RAINFOCUS_API_PROFILE_ID = 'Na3vqYdAlJFSxhYTYQGuMbpafMqftalz'
+ RAINFOCUS_WIDGET_ID = 'n6l4Lo05R8fiy3RpUBm447dZN8uNWoye'
+ BRIGHTCOVE_URL_TEMPLATE = 'http://players.brightcove.net/5647924234001/SyK2FdqjM_default/index.html?vi…'
+
+ HEADERS = {
+ 'Origin': 'https://ciscolive.cisco.com',
+ 'rfApiProfileId': RAINFOCUS_API_PROFILE_ID,
+ 'rfWidgetId': RAINFOCUS_WIDGET_ID,
+ }
+
+ def _call_api(self, ep, rf_id, query, referrer, note=None):
+ headers = self.HEADERS.copy()
+ headers['Referer'] = referrer
+ return self._download_json(
+ self.RAINFOCUS_API_URL % ep, rf_id, note=note,
+ data=urlencode_postdata(query), headers=headers)
+
+ def _parse_rf_item(self, rf_item):
+ event_name = rf_item.get('eventName')
+ title = rf_item['title']
+ description = clean_html(rf_item.get('abstract'))
+ presenter_name = try_get(rf_item, lambda x: x['participants'][0]['fullName'])
+ bc_id = rf_item['videos'][0]['url']
+ bc_url = self.BRIGHTCOVE_URL_TEMPLATE % bc_id
+ duration = float_or_none(try_get(rf_item, lambda x: x['times'][0]['length']))
+ location = try_get(rf_item, lambda x: x['times'][0]['room'])
+
+ if duration:
+ duration = duration * 60
+
+ return {
+ '_type': 'url_transparent',
+ 'url': bc_url,
+ 'ie_key': 'BrightcoveNew',
+ 'title': title,
+ 'description': description,
+ 'duration': duration,
+ 'creator': presenter_name,
+ 'location': location,
+ 'series': event_name,
+ }
+
+
+class CiscoLiveSessionIE(CiscoLiveBaseIE):
+ _VALID_URL = r'https?://ciscolive\.cisco\.com/on-demand-library/\??[^#]*#/session/(?P<id>[^/?&]+)'
+ _TEST = {
+ 'url': 'https://ciscolive.cisco.com/on-demand-library/?#/session/1423353499155001Fo…',
+ 'md5': 'c98acf395ed9c9f766941c70f5352e22',
+ 'info_dict': {
+ 'id': '5803694304001',
+ 'ext': 'mp4',
+ 'title': '13 Smart Automations to Monitor Your Cisco IOS Network',
+ 'description': 'md5:ec4a436019e09a918dec17714803f7cc',
+ 'timestamp': 1530305395,
+ 'upload_date': '20180629',
+ 'uploader_id': '5647924234001',
+ 'location': '16B Mezz.',
+ },
+ }
+
+ def _real_extract(self, url):
+ rf_id = self._match_id(url)
+ rf_result = self._call_api('session', rf_id, {'id': rf_id}, url)
+ return self._parse_rf_item(rf_result['items'][0])
+
+
+class CiscoLiveSearchIE(CiscoLiveBaseIE):
+ _VALID_URL = r'https?://ciscolive\.cisco\.com/on-demand-library/'
+ _TESTS = [{
+ 'url': 'https://ciscolive.cisco.com/on-demand-library/?search.event=ciscoliveus2018…',
+ 'info_dict': {
+ 'title': 'Search query',
+ },
+ 'playlist_count': 5,
+ }, {
+ 'url': 'https://ciscolive.cisco.com/on-demand-library/?search.technology=scpsTechno…',
+ 'only_matching': True,
+ }]
+
+ @classmethod
+ def suitable(cls, url):
+ return False if CiscoLiveSessionIE.suitable(url) else super(CiscoLiveSearchIE, cls).suitable(url)
+
+ @staticmethod
+ def _check_bc_id_exists(rf_item):
+ return int_or_none(try_get(rf_item, lambda x: x['videos'][0]['url'])) is not None
+
+ def _entries(self, query, url):
+ query['size'] = 50
+ query['from'] = 0
+ for page_num in itertools.count(1):
+ results = self._call_api(
+ 'search', None, query, url,
+ 'Downloading search JSON page %d' % page_num)
+ sl = try_get(results, lambda x: x['sectionList'][0], dict)
+ if sl:
+ results = sl
+ items = results.get('items')
+ if not items or not isinstance(items, list):
+ break
+ for item in items:
+ if not isinstance(item, dict):
+ continue
+ if not self._check_bc_id_exists(item):
+ continue
+ yield self._parse_rf_item(item)
+ size = int_or_none(results.get('size'))
+ if size is not None:
+ query['size'] = size
+ total = int_or_none(results.get('total'))
+ if total is not None and query['from'] + query['size'] > total:
+ break
+ query['from'] += query['size']
+
+ def _real_extract(self, url):
+ query = compat_parse_qs(compat_urllib_parse_urlparse(url).query)
+ query['type'] = 'session'
+ return self.playlist_result(
+ self._entries(query, url), playlist_title='Search query')
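The `_entries` generator in the new ciscolive extractor above pages through the RainFocus search API with `from`/`size` offsets until the reported `total` is exhausted. A simplified, self-contained sketch of that pagination loop (the `fetch` callback and `paged_items` name are assumptions for illustration; the real code also lets the server override the page size):

```python
def paged_items(fetch, size=50):
    # Generic offset pagination: request `size` items starting at `frm`,
    # stop when the page is empty or `frm + size` reaches the total.
    frm = 0
    while True:
        page = fetch(frm, size)
        items = page.get('items')
        if not items:
            break
        for item in items:
            yield item
        total = page.get('total')
        if total is not None and frm + size >= total:
            break
        frm += size

# Fake backend with 120 items, served in pages of 50.
data = list(range(120))

def fake_fetch(frm, size):
    return {'items': data[frm:frm + size], 'total': len(data)}
```

Yielding items as a generator lets the caller (here, `playlist_result`) stop early without fetching every page.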
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/youtube-dl/youtube_dl/extractor/extractors.py new/youtube-dl/youtube_dl/extractor/extractors.py
--- old/youtube-dl/youtube_dl/extractor/extractors.py 2018-11-06 19:36:17.000000000 +0100
+++ new/youtube-dl/youtube_dl/extractor/extractors.py 2018-11-21 23:55:12.000000000 +0100
@@ -194,6 +194,10 @@
ChirbitProfileIE,
)
from .cinchcast import CinchcastIE
+from .ciscolive import (
+ CiscoLiveSessionIE,
+ CiscoLiveSearchIE,
+)
from .cjsw import CJSWIE
from .cliphunter import CliphunterIE
from .clippit import ClippitIE
@@ -1386,6 +1390,7 @@
WSJIE,
WSJArticleIE,
)
+from .wwe import WWEIE
from .xbef import XBefIE
from .xboxclips import XboxClipsIE
from .xfileshare import XFileShareIE
@@ -1478,3 +1483,4 @@
)
from .zdf import ZDFIE, ZDFChannelIE
from .zingmp3 import ZingMp3IE
+from .zype import ZypeIE
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/youtube-dl/youtube_dl/extractor/foxsports.py new/youtube-dl/youtube_dl/extractor/foxsports.py
--- old/youtube-dl/youtube_dl/extractor/foxsports.py 2018-11-06 19:36:17.000000000 +0100
+++ new/youtube-dl/youtube_dl/extractor/foxsports.py 2018-11-21 23:55:12.000000000 +0100
@@ -1,43 +1,33 @@
from __future__ import unicode_literals
from .common import InfoExtractor
-from ..utils import (
- smuggle_url,
- update_url_query,
-)
class FoxSportsIE(InfoExtractor):
- _VALID_URL = r'https?://(?:www\.)?foxsports\.com/(?:[^/]+/)*(?P<id>[^/]+)'
+ _VALID_URL = r'https?://(?:www\.)?foxsports\.com/(?:[^/]+/)*video/(?P<id>\d+)'
_TEST = {
'url': 'http://www.foxsports.com/tennessee/video/432609859715',
'md5': 'b49050e955bebe32c301972e4012ac17',
'info_dict': {
- 'id': 'bwduI3X_TgUB',
+ 'id': '432609859715',
'ext': 'mp4',
'title': 'Courtney Lee on going up 2-0 in series vs. Blazers',
'description': 'Courtney Lee talks about Memphis being focused.',
- 'upload_date': '20150423',
- 'timestamp': 1429761109,
+ # TODO: fix timestamp
+ 'upload_date': '19700101', # '20150423',
+ # 'timestamp': 1429761109,
'uploader': 'NEWA-FNG-FOXSPORTS',
},
+ 'params': {
+ # m3u8 download
+ 'skip_download': True,
+ },
'add_ie': ['ThePlatform'],
}
def _real_extract(self, url):
video_id = self._match_id(url)
- webpage = self._download_webpage(url, video_id)
-
- config = self._parse_json(
- self._html_search_regex(
- r"""class="[^"]*(?:fs-player|platformPlayer-wrapper)[^"]*".+?data-player-config='([^']+)'""",
- webpage, 'data player config'),
- video_id)
-
- return self.url_result(smuggle_url(update_url_query(
- config['releaseURL'], {
- 'mbr': 'true',
- 'switch': 'http',
- }), {'force_smil_url': True}))
+ return self.url_result(
+ 'https://feed.theplatform.com/f/BKQ29B/foxsports-all?byId=' + video_id, 'ThePlatformFeed')
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/youtube-dl/youtube_dl/extractor/generic.py new/youtube-dl/youtube_dl/extractor/generic.py
--- old/youtube-dl/youtube_dl/extractor/generic.py 2018-11-06 19:36:17.000000000 +0100
+++ new/youtube-dl/youtube_dl/extractor/generic.py 2018-11-21 23:55:01.000000000 +0100
@@ -114,6 +114,7 @@
from .foxnews import FoxNewsIE
from .viqeo import ViqeoIE
from .expressen import ExpressenIE
+from .zype import ZypeIE
class GenericIE(InfoExtractor):
@@ -2071,6 +2072,20 @@
'playlist_count': 6,
},
{
+ # Zype embed
+ 'url': 'https://www.cookscountry.com/episode/554-smoky-barbecue-favorites',
+ 'info_dict': {
+ 'id': '5b400b834b32992a310622b9',
+ 'ext': 'mp4',
+ 'title': 'Smoky Barbecue Favorites',
+ 'thumbnail': r're:^https?://.*\.jpe?g',
+ },
+ 'add_ie': [ZypeIE.ie_key()],
+ 'params': {
+ 'skip_download': True,
+ },
+ },
+ {
# videojs embed
'url': 'https://video.sibnet.ru/shell.php?videoid=3422904',
'info_dict': {
@@ -3129,6 +3144,11 @@
return self.playlist_from_matches(
expressen_urls, video_id, video_title, ie=ExpressenIE.ie_key())
+ zype_urls = ZypeIE._extract_urls(webpage)
+ if zype_urls:
+ return self.playlist_from_matches(
+ zype_urls, video_id, video_title, ie=ZypeIE.ie_key())
+
# Look for HTML5 media
entries = self._parse_html5_media_entries(url, webpage, video_id, m3u8_id='hls')
if entries:
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/youtube-dl/youtube_dl/extractor/kaltura.py new/youtube-dl/youtube_dl/extractor/kaltura.py
--- old/youtube-dl/youtube_dl/extractor/kaltura.py 2018-11-06 19:36:17.000000000 +0100
+++ new/youtube-dl/youtube_dl/extractor/kaltura.py 2018-11-21 23:55:12.000000000 +0100
@@ -192,6 +192,8 @@
'entryId': video_id,
'service': 'baseentry',
'ks': '{1:result:ks}',
+ 'responseProfile:fields': 'createdAt,dataUrl,duration,name,plays,thumbnailUrl,userId',
+ 'responseProfile:type': 1,
},
{
'action': 'getbyentryid',
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/youtube-dl/youtube_dl/extractor/libraryofcongress.py new/youtube-dl/youtube_dl/extractor/libraryofcongress.py
--- old/youtube-dl/youtube_dl/extractor/libraryofcongress.py 2018-11-06 19:36:17.000000000 +0100
+++ new/youtube-dl/youtube_dl/extractor/libraryofcongress.py 2018-11-21 23:55:12.000000000 +0100
@@ -16,16 +16,15 @@
class LibraryOfCongressIE(InfoExtractor):
IE_NAME = 'loc'
IE_DESC = 'Library of Congress'
- _VALID_URL = r'https?://(?:www\.)?loc\.gov/(?:item/|today/cyberlc/feature_wdesc\.php\?.*\brec=)(?P<id>[0-9]+)'
+ _VALID_URL = r'https?://(?:www\.)?loc\.gov/(?:item/|today/cyberlc/feature_wdesc\.php\?.*\brec=)(?P<id>[0-9a-z_.]+)'
_TESTS = [{
# embedded via <div class="media-player"
'url': 'http://loc.gov/item/90716351/',
- 'md5': '353917ff7f0255aa6d4b80a034833de8',
+ 'md5': '6ec0ae8f07f86731b1b2ff70f046210a',
'info_dict': {
'id': '90716351',
'ext': 'mp4',
'title': "Pa's trip to Mars",
- 'thumbnail': r're:^https?://.*\.jpg$',
'duration': 0,
'view_count': int,
},
@@ -57,6 +56,12 @@
'params': {
'skip_download': True,
},
+ }, {
+ 'url': 'https://www.loc.gov/item/ihas.200197114/',
+ 'only_matching': True,
+ }, {
+ 'url': 'https://www.loc.gov/item/afc1981005_afs20503/',
+ 'only_matching': True,
}]
def _real_extract(self, url):
@@ -67,12 +72,13 @@
(r'id=(["\'])media-player-(?P<id>.+?)\1',
r'<video[^>]+id=(["\'])uuid-(?P<id>.+?)\1',
r'<video[^>]+data-uuid=(["\'])(?P<id>.+?)\1',
- r'mediaObjectId\s*:\s*(["\'])(?P<id>.+?)\1'),
+ r'mediaObjectId\s*:\s*(["\'])(?P<id>.+?)\1',
+ r'data-tab="share-media-(?P<id>[0-9A-F]{32})"'),
webpage, 'media id', group='id')
data = self._download_json(
'https://media.loc.gov/services/v1/media?id=%s&context=json' % media_id,
- video_id)['mediaObject']
+ media_id)['mediaObject']
derivative = data['derivatives'][0]
media_url = derivative['derivativeUrl']
@@ -89,25 +95,29 @@
if ext not in ('mp4', 'mp3'):
media_url += '.mp4' if is_video else '.mp3'
- if 'vod/mp4:' in media_url:
- formats = [{
- 'url': media_url.replace('vod/mp4:', 'hls-vod/media/') + '.m3u8',
+ formats = []
+ if '/vod/mp4:' in media_url:
+ formats.append({
+ 'url': media_url.replace('/vod/mp4:', '/hls-vod/media/') + '.m3u8',
'format_id': 'hls',
'ext': 'mp4',
'protocol': 'm3u8_native',
'quality': 1,
- }]
- elif 'vod/mp3:' in media_url:
- formats = [{
- 'url': media_url.replace('vod/mp3:', ''),
- 'vcodec': 'none',
- }]
+ })
+ http_format = {
+ 'url': re.sub(r'(://[^/]+/)(?:[^/]+/)*(?:mp4|mp3):', r'\1', media_url),
+ 'format_id': 'http',
+ 'quality': 1,
+ }
+ if not is_video:
+ http_format['vcodec'] = 'none'
+ formats.append(http_format)
download_urls = set()
for m in re.finditer(
r'<option[^>]+value=(["\'])(?P<url>.+?)\1[^>]+data-file-download=[^>]+>\s*(?P<id>.+?)(?:(?: |\s+)\((?P<size>.+?)\))?\s*<', webpage):
format_id = m.group('id').lower()
- if format_id == 'gif':
+ if format_id in ('gif', 'jpeg'):
continue
download_url = m.group('url')
if download_url in download_urls:
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/youtube-dl/youtube_dl/extractor/mixcloud.py new/youtube-dl/youtube_dl/extractor/mixcloud.py
--- old/youtube-dl/youtube_dl/extractor/mixcloud.py 2018-11-06 19:36:17.000000000 +0100
+++ new/youtube-dl/youtube_dl/extractor/mixcloud.py 2018-11-21 23:55:12.000000000 +0100
@@ -161,11 +161,17 @@
stream_info = info_json['streamInfo']
formats = []
+ def decrypt_url(f_url):
+ for k in (key, 'IFYOUWANTTHEARTISTSTOGETPAIDDONOTDOWNLOADFROMMIXCLOUD'):
+ decrypted_url = self._decrypt_xor_cipher(k, f_url)
+ if re.search(r'^https?://[0-9a-z.]+/[0-9A-Za-z/.?=&_-]+$', decrypted_url):
+ return decrypted_url
+
for url_key in ('url', 'hlsUrl', 'dashUrl'):
format_url = stream_info.get(url_key)
if not format_url:
continue
- decrypted = self._decrypt_xor_cipher(key, compat_b64decode(format_url))
+ decrypted = decrypt_url(compat_b64decode(format_url))
if not decrypted:
continue
if url_key == 'hlsUrl':
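The mixcloud fix above tries the page-derived key first and then a hardcoded fallback key, accepting whichever decryption yields a plausible URL. The underlying cipher is a repeating-key XOR over a base64-decoded payload; a minimal sketch (the `xor_decrypt` helper is an illustration, not the extractor's actual `_decrypt_xor_cipher`):

```python
import base64
import itertools

def xor_decrypt(key, text):
    # XOR each character of `text` with the repeating key.
    # XOR is symmetric, so the same function encrypts and decrypts.
    return ''.join(
        chr(ord(c) ^ ord(k))
        for c, k in zip(text, itertools.cycle(key)))

KEY = 'IFYOUWANTTHEARTISTSTOGETPAIDDONOTDOWNLOADFROMMIXCLOUD'
secret = 'https://example.com/stream.mp4'

# Simulate what the site serves: the XORed URL, base64-encoded.
encoded = base64.b64encode(xor_decrypt(KEY, secret).encode('latin-1')).decode()

# What the extractor does: base64-decode, then XOR with a candidate key.
decoded = xor_decrypt(KEY, base64.b64decode(encoded).decode('latin-1'))
```

Decrypting with the wrong key produces garbage rather than an error, which is why the patched code validates each candidate result against a URL-shaped regex before accepting it.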
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/youtube-dl/youtube_dl/extractor/nbc.py new/youtube-dl/youtube_dl/extractor/nbc.py
--- old/youtube-dl/youtube_dl/extractor/nbc.py 2018-11-06 19:36:17.000000000 +0100
+++ new/youtube-dl/youtube_dl/extractor/nbc.py 2018-11-21 23:55:12.000000000 +0100
@@ -9,10 +9,8 @@
from .adobepass import AdobePassIE
from ..compat import compat_urllib_parse_unquote
from ..utils import (
- find_xpath_attr,
smuggle_url,
try_get,
- unescapeHTML,
update_url_query,
int_or_none,
)
@@ -269,27 +267,14 @@
class NBCNewsIE(ThePlatformIE):
- _VALID_URL = r'''(?x)https?://(?:www\.)?(?:nbcnews|today|msnbc)\.com/
- (?:video/.+?/(?P<id>\d+)|
- ([^/]+/)*(?:.*-)?(?P<mpx_id>[^/?]+))
- '''
+ _VALID_URL = r'(?x)https?://(?:www\.)?(?:nbcnews|today|msnbc)\.com/([^/]+/)*(?:.*-)?(?P<id>[^/?]+)'
_TESTS = [
{
- 'url': 'http://www.nbcnews.com/video/nbc-news/52753292',
- 'md5': '47abaac93c6eaf9ad37ee6c4463a5179',
- 'info_dict': {
- 'id': '52753292',
- 'ext': 'flv',
- 'title': 'Crew emerges after four-month Mars food study',
- 'description': 'md5:24e632ffac72b35f8b67a12d1b6ddfc1',
- },
- },
- {
'url': 'http://www.nbcnews.com/watch/nbcnews-com/how-twitter-reacted-to-the-snowden…',
'md5': 'af1adfa51312291a017720403826bb64',
'info_dict': {
- 'id': 'p_tweet_snow_140529',
+ 'id': '269389891880',
'ext': 'mp4',
'title': 'How Twitter Reacted To The Snowden Interview',
'description': 'md5:65a0bd5d76fe114f3c2727aa3a81fe64',
@@ -313,7 +298,7 @@
'url': 'http://www.nbcnews.com/nightly-news/video/nightly-news-with-brian-williams-…',
'md5': '73135a2e0ef819107bbb55a5a9b2a802',
'info_dict': {
- 'id': 'nn_netcast_150204',
+ 'id': '394064451844',
'ext': 'mp4',
'title': 'Nightly News with Brian Williams Full Broadcast (February 4)',
'description': 'md5:1c10c1eccbe84a26e5debb4381e2d3c5',
@@ -326,7 +311,7 @@
'url': 'http://www.nbcnews.com/business/autos/volkswagen-11-million-vehicles-could-…',
'md5': 'a49e173825e5fcd15c13fc297fced39d',
'info_dict': {
- 'id': 'x_lon_vwhorn_150922',
+ 'id': '529953347624',
'ext': 'mp4',
'title': 'Volkswagen U.S. Chief:\xa0 We Have Totally Screwed Up',
'description': 'md5:c8be487b2d80ff0594c005add88d8351',
@@ -339,7 +324,7 @@
'url': 'http://www.today.com/video/see-the-aurora-borealis-from-space-in-stunning-n…',
'md5': '118d7ca3f0bea6534f119c68ef539f71',
'info_dict': {
- 'id': 'tdy_al_space_160420',
+ 'id': '669831235788',
'ext': 'mp4',
'title': 'See the aurora borealis from space in stunning new NASA video',
'description': 'md5:74752b7358afb99939c5f8bb2d1d04b1',
@@ -352,7 +337,7 @@
'url': 'http://www.msnbc.com/all-in-with-chris-hayes/watch/the-chaotic-gop-immigrat…',
'md5': '6d236bf4f3dddc226633ce6e2c3f814d',
'info_dict': {
- 'id': 'n_hayes_Aimm_140801_272214',
+ 'id': '314487875924',
'ext': 'mp4',
'title': 'The chaotic GOP immigration vote',
'description': 'The Republican House votes on a border bill that has no chance of getting through the Senate or signed by the President and is drawing criticism from all sides.',
@@ -374,60 +359,22 @@
]
def _real_extract(self, url):
- mobj = re.match(self._VALID_URL, url)
- video_id = mobj.group('id')
- if video_id is not None:
- all_info = self._download_xml('http://www.nbcnews.com/id/%s/displaymode/1219' % video_id, video_id)
- info = all_info.find('video')
-
- return {
- 'id': video_id,
- 'title': info.find('headline').text,
- 'ext': 'flv',
- 'url': find_xpath_attr(info, 'media', 'type', 'flashVideo').text,
- 'description': info.find('caption').text,
- 'thumbnail': find_xpath_attr(info, 'media', 'type', 'thumbnail').text,
- }
- else:
- # "feature" and "nightly-news" pages use theplatform.com
- video_id = mobj.group('mpx_id')
+ video_id = self._match_id(url)
+ if not video_id.isdigit():
webpage = self._download_webpage(url, video_id)
- filter_param = 'byId'
- bootstrap_json = self._search_regex(
- [r'(?m)(?:var\s+(?:bootstrapJson|playlistData)|NEWS\.videoObj)\s*=\s*({.+});?\s*$',
- r'videoObj\s*:\s*({.+})', r'data-video="([^"]+)"',
- r'jQuery\.extend\(Drupal\.settings\s*,\s*({.+?})\);'],
- webpage, 'bootstrap json', default=None)
- if bootstrap_json:
- bootstrap = self._parse_json(
- bootstrap_json, video_id, transform_source=unescapeHTML)
-
- info = None
- if 'results' in bootstrap:
- info = bootstrap['results'][0]['video']
- elif 'video' in bootstrap:
- info = bootstrap['video']
- elif 'msnbcVideoInfo' in bootstrap:
- info = bootstrap['msnbcVideoInfo']['meta']
- elif 'msnbcThePlatform' in bootstrap:
- info = bootstrap['msnbcThePlatform']['videoPlayer']['video']
- else:
- info = bootstrap
-
- if 'guid' in info:
- video_id = info['guid']
- filter_param = 'byGuid'
- elif 'mpxId' in info:
- video_id = info['mpxId']
-
- return {
- '_type': 'url_transparent',
- 'id': video_id,
- # http://feed.theplatform.com/f/2E2eJC/nbcnews also works
- 'url': update_url_query('http://feed.theplatform.com/f/2E2eJC/nnd_NBCNews', {filter_param: video_id}),
- 'ie_key': 'ThePlatformFeed',
- }
+ data = self._parse_json(self._search_regex(
+ r'window\.__data\s*=\s*({.+});', webpage,
+ 'bootstrap json'), video_id)
+ video_id = data['article']['content'][0]['primaryMedia']['video']['mpxMetadata']['id']
+
+ return {
+ '_type': 'url_transparent',
+ 'id': video_id,
+ # http://feed.theplatform.com/f/2E2eJC/nbcnews also works
+ 'url': update_url_query('http://feed.theplatform.com/f/2E2eJC/nnd_NBCNews', {'byId': video_id}),
+ 'ie_key': 'ThePlatformFeed',
+ }
class NBCOlympicsIE(InfoExtractor):
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/youtube-dl/youtube_dl/extractor/nova.py new/youtube-dl/youtube_dl/extractor/nova.py
--- old/youtube-dl/youtube_dl/extractor/nova.py 2018-11-06 19:36:17.000000000 +0100
+++ new/youtube-dl/youtube_dl/extractor/nova.py 2018-11-21 23:55:12.000000000 +0100
@@ -35,7 +35,7 @@
bitrates = self._parse_json(
self._search_regex(
- r'(?s)bitrates\s*=\s*({.+?})\s*;', webpage, 'formats'),
+ r'(?s)(?:src|bitrates)\s*=\s*({.+?})\s*;', webpage, 'formats'),
video_id, transform_source=js_to_json)
QUALITIES = ('lq', 'mq', 'hq', 'hd')
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/youtube-dl/youtube_dl/extractor/nzz.py new/youtube-dl/youtube_dl/extractor/nzz.py
--- old/youtube-dl/youtube_dl/extractor/nzz.py 2018-11-06 19:36:17.000000000 +0100
+++ new/youtube-dl/youtube_dl/extractor/nzz.py 2018-11-21 23:55:12.000000000 +0100
@@ -11,20 +11,27 @@
class NZZIE(InfoExtractor):
_VALID_URL = r'https?://(?:www\.)?nzz\.ch/(?:[^/]+/)*[^/?#]+-ld\.(?P<id>\d+)'
- _TEST = {
+ _TESTS = [{
'url': 'http://www.nzz.ch/zuerich/gymizyte/gymizyte-schreiben-schueler-heute-noch-d…',
'info_dict': {
'id': '9153',
},
'playlist_mincount': 6,
- }
+ }, {
+ 'url': 'https://www.nzz.ch/video/nzz-standpunkte/cvp-auf-der-suche-nach-dem-mass-de…',
+ 'info_dict': {
+ 'id': '1368112',
+ },
+ 'playlist_count': 1,
+ }]
def _real_extract(self, url):
page_id = self._match_id(url)
webpage = self._download_webpage(url, page_id)
entries = []
- for player_element in re.findall(r'(<[^>]+class="kalturaPlayer"[^>]*>)', webpage):
+ for player_element in re.findall(
+ r'(<[^>]+class="kalturaPlayer[^"]*"[^>]*>)', webpage):
player_params = extract_attributes(player_element)
if player_params.get('data-type') not in ('kaltura_singleArticle',):
self.report_warning('Unsupported player type')
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/youtube-dl/youtube_dl/extractor/openload.py new/youtube-dl/youtube_dl/extractor/openload.py
--- old/youtube-dl/youtube_dl/extractor/openload.py 2018-11-06 19:36:17.000000000 +0100
+++ new/youtube-dl/youtube_dl/extractor/openload.py 2018-11-21 23:55:02.000000000 +0100
@@ -243,7 +243,18 @@
class OpenloadIE(InfoExtractor):
- _VALID_URL = r'https?://(?:www\.)?(?:openload\.(?:co|io|link)|oload\.(?:tv|stream|site|xyz|win|download|cloud|cc|icu|fun))/(?:f|embed)/(?P<id>[a-zA-Z0-9-_]+)'
+ _VALID_URL = r'''(?x)
+ https?://
+ (?P<host>
+ (?:www\.)?
+ (?:
+ openload\.(?:co|io|link)|
+ oload\.(?:tv|stream|site|xyz|win|download|cloud|cc|icu|fun)
+ )
+ )/
+ (?:f|embed)/
+ (?P<id>[a-zA-Z0-9-_]+)
+ '''
_TESTS = [{
'url': 'https://openload.co/f/kUEfGclsU9o',
@@ -334,8 +345,11 @@
webpage)
def _real_extract(self, url):
- video_id = self._match_id(url)
- url_pattern = 'https://openload.co/%%s/%s/' % video_id
+ mobj = re.match(self._VALID_URL, url)
+ host = mobj.group('host')
+ video_id = mobj.group('id')
+
+ url_pattern = 'https://%s/%%s/%s/' % (host, video_id)
headers = {
'User-Agent': self._USER_AGENT,
}
@@ -368,7 +382,7 @@
r'>\s*([\w~-]+~[a-f0-9:]+~[\w~-]+)'), webpage,
'stream URL'))
- video_url = 'https://openload.co/stream/%s?mime=true' % decoded_id
+ video_url = 'https://%s/stream/%s?mime=true' % (host, decoded_id)
title = self._og_search_title(webpage, default=None) or self._search_regex(
r'<span[^>]+class=["\']title["\'][^>]*>([^<]+)', webpage,
@@ -379,7 +393,7 @@
entry = entries[0] if entries else {}
subtitles = entry.get('subtitles')
- info_dict = {
+ return {
'id': video_id,
'title': title,
'thumbnail': entry.get('thumbnail') or self._og_search_thumbnail(webpage, default=None),
@@ -388,4 +402,3 @@
'subtitles': subtitles,
'http_headers': headers,
}
- return info_dict
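Incidentally, the `url_pattern` line in this openload hunk relies on `%%` escaping: the first formatting pass fills in `host` and `video_id` while leaving a literal `%s` slot for the later `'f'`/`'embed'` substitution. A quick illustration (the values here are made up):

```python
# Double-percent escaping: '%%s' survives the first % formatting
# as a literal '%s' placeholder for a second formatting pass.
host = 'oload.tv'
video_id = 'kUEfGclsU9o'
url_pattern = 'https://%s/%%s/%s/' % (host, video_id)
embed_url = url_pattern % 'embed'
```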
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/youtube-dl/youtube_dl/extractor/picarto.py new/youtube-dl/youtube_dl/extractor/picarto.py
--- old/youtube-dl/youtube_dl/extractor/picarto.py 2018-11-06 19:36:17.000000000 +0100
+++ new/youtube-dl/youtube_dl/extractor/picarto.py 2018-11-21 23:55:02.000000000 +0100
@@ -1,6 +1,7 @@
# coding: utf-8
from __future__ import unicode_literals
+import re
import time
from .common import InfoExtractor
@@ -15,7 +16,7 @@
class PicartoIE(InfoExtractor):
- _VALID_URL = r'https?://(?:www.)?picarto\.tv/(?P<id>[a-zA-Z0-9]+)'
+ _VALID_URL = r'https?://(?:www.)?picarto\.tv/(?P<id>[a-zA-Z0-9]+)(?:/(?P<token>[a-zA-Z0-9]+))?'
_TEST = {
'url': 'https://picarto.tv/Setz',
'info_dict': {
@@ -33,20 +34,14 @@
return False if PicartoVodIE.suitable(url) else super(PicartoIE, cls).suitable(url)
def _real_extract(self, url):
- channel_id = self._match_id(url)
- stream_page = self._download_webpage(url, channel_id)
+ mobj = re.match(self._VALID_URL, url)
+ channel_id = mobj.group('id')
- if '>This channel does not exist' in stream_page:
- raise ExtractorError(
- 'Channel %s does not exist' % channel_id, expected=True)
-
- player = self._parse_json(
- self._search_regex(
- r'(?s)playerSettings\[\d+\]\s*=\s*(\{.+?\}\s*\n)', stream_page,
- 'player settings'),
- channel_id, transform_source=js_to_json)
+ metadata = self._download_json(
+ 'https://api.picarto.tv/v1/channel/name/' + channel_id,
+ channel_id)
- if player.get('online') is False:
+ if metadata.get('online') is False:
raise ExtractorError('Stream is offline', expected=True)
cdn_data = self._download_json(
@@ -54,20 +49,13 @@
data=urlencode_postdata({'loadbalancinginfo': channel_id}),
note='Downloading load balancing info')
- def get_event(key):
- return try_get(player, lambda x: x['event'][key], compat_str) or ''
-
+ token = mobj.group('token') or 'public'
params = {
- 'token': player.get('token') or '',
- 'ticket': get_event('ticket'),
'con': int(time.time() * 1000),
- 'type': get_event('ticket'),
- 'scope': get_event('scope'),
+ 'token': token,
}
prefered_edge = cdn_data.get('preferedEdge')
- default_tech = player.get('defaultTech')
-
formats = []
for edge in cdn_data['edges']:
@@ -81,8 +69,6 @@
preference = 0
if edge_id == prefered_edge:
preference += 1
- if tech_type == default_tech:
- preference += 1
format_id = []
if edge_id:
format_id.append(edge_id)
@@ -109,7 +95,7 @@
continue
self._sort_formats(formats)
- mature = player.get('mature')
+ mature = metadata.get('adult')
if mature is None:
age_limit = None
else:
@@ -117,9 +103,11 @@
return {
'id': channel_id,
- 'title': self._live_title(channel_id),
+ 'title': self._live_title(metadata.get('title') or channel_id),
'is_live': True,
- 'thumbnail': player.get('vodThumb'),
+ 'thumbnail': try_get(metadata, lambda x: x['thumbnails']['web']),
+ 'channel': channel_id,
+ 'channel_url': 'https://picarto.tv/%s' % channel_id,
'age_limit': age_limit,
'formats': formats,
}
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/youtube-dl/youtube_dl/extractor/pornhub.py new/youtube-dl/youtube_dl/extractor/pornhub.py
--- old/youtube-dl/youtube_dl/extractor/pornhub.py 2018-11-06 19:36:17.000000000 +0100
+++ new/youtube-dl/youtube_dl/extractor/pornhub.py 2018-11-21 23:55:12.000000000 +0100
@@ -27,7 +27,7 @@
_VALID_URL = r'''(?x)
https?://
(?:
- (?:[^/]+\.)?pornhub\.com/(?:(?:view_video\.php|video/show)\?viewkey=|embed/)|
+ (?:[^/]+\.)?pornhub\.(?:com|net)/(?:(?:view_video\.php|video/show)\?viewkey=|embed/)|
(?:www\.)?thumbzilla\.com/video/
)
(?P<id>[\da-z]+)
@@ -121,6 +121,9 @@
}, {
'url': 'http://www.pornhub.com/video/show?viewkey=648719015',
'only_matching': True,
+ }, {
+ 'url': 'https://www.pornhub.net/view_video.php?viewkey=203640933',
+ 'only_matching': True,
}]
@staticmethod
@@ -340,7 +343,7 @@
class PornHubPlaylistIE(PornHubPlaylistBaseIE):
- _VALID_URL = r'https?://(?:[^/]+\.)?pornhub\.com/playlist/(?P<id>\d+)'
+ _VALID_URL = r'https?://(?:[^/]+\.)?pornhub\.(?:com|net)/playlist/(?P<id>\d+)'
_TESTS = [{
'url': 'http://www.pornhub.com/playlist/4667351',
'info_dict': {
@@ -355,7 +358,7 @@
class PornHubUserVideosIE(PornHubPlaylistBaseIE):
- _VALID_URL = r'https?://(?:[^/]+\.)?pornhub\.com/(?:(?:user|channel)s|model|pornstar)/(?P<id>[^/]+)/videos'
+ _VALID_URL = r'https?://(?:[^/]+\.)?pornhub\.(?:com|net)/(?:(?:user|channel)s|model|pornstar)/(?P<id>[^/]+)/videos'
_TESTS = [{
'url': 'http://www.pornhub.com/users/zoe_ph/videos/public',
'info_dict': {
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/youtube-dl/youtube_dl/extractor/rte.py new/youtube-dl/youtube_dl/extractor/rte.py
--- old/youtube-dl/youtube_dl/extractor/rte.py 2018-11-06 19:36:18.000000000 +0100
+++ new/youtube-dl/youtube_dl/extractor/rte.py 2018-11-21 23:55:02.000000000 +0100
@@ -8,7 +8,10 @@
from ..utils import (
float_or_none,
parse_iso8601,
+ str_or_none,
+ try_get,
unescapeHTML,
+ url_or_none,
ExtractorError,
)
@@ -17,65 +20,87 @@
def _real_extract(self, url):
item_id = self._match_id(url)
- try:
- json_string = self._download_json(
- 'http://www.rte.ie/rteavgen/getplaylist/?type=web&format=json&id=' + item_id,
- item_id)
- except ExtractorError as ee:
- if isinstance(ee.cause, compat_HTTPError) and ee.cause.code == 404:
- error_info = self._parse_json(ee.cause.read().decode(), item_id, fatal=False)
- if error_info:
- raise ExtractorError(
- '%s said: %s' % (self.IE_NAME, error_info['message']),
- expected=True)
- raise
-
- # NB the string values in the JSON are stored using XML escaping(!)
- show = json_string['shows'][0]
- title = unescapeHTML(show['title'])
- description = unescapeHTML(show.get('description'))
- thumbnail = show.get('thumbnail')
- duration = float_or_none(show.get('duration'), 1000)
- timestamp = parse_iso8601(show.get('published'))
-
- mg = show['media:group'][0]
-
+ info_dict = {}
formats = []
- if mg.get('url'):
- m = re.match(r'(?P<url>rtmpe?://[^/]+)/(?P<app>.+)/(?P<playpath>mp4:.*)', mg['url'])
- if m:
- m = m.groupdict()
- formats.append({
- 'url': m['url'] + '/' + m['app'],
- 'app': m['app'],
- 'play_path': m['playpath'],
- 'player_url': url,
- 'ext': 'flv',
- 'format_id': 'rtmp',
- })
-
- if mg.get('hls_server') and mg.get('hls_url'):
- formats.extend(self._extract_m3u8_formats(
- mg['hls_server'] + mg['hls_url'], item_id, 'mp4',
- entry_protocol='m3u8_native', m3u8_id='hls', fatal=False))
-
- if mg.get('hds_server') and mg.get('hds_url'):
- formats.extend(self._extract_f4m_formats(
- mg['hds_server'] + mg['hds_url'], item_id,
- f4m_id='hds', fatal=False))
+ ENDPOINTS = (
+ 'https://feeds.rasset.ie/rteavgen/player/playlist?type=iptv&format=json&show…',
+ 'http://www.rte.ie/rteavgen/getplaylist/?type=web&format=json&id=',
+ )
+
+ for num, ep_url in enumerate(ENDPOINTS, start=1):
+ try:
+ data = self._download_json(ep_url + item_id, item_id)
+ except ExtractorError as ee:
+ if num < len(ENDPOINTS) or formats:
+ continue
+ if isinstance(ee.cause, compat_HTTPError) and ee.cause.code == 404:
+ error_info = self._parse_json(ee.cause.read().decode(), item_id, fatal=False)
+ if error_info:
+ raise ExtractorError(
+ '%s said: %s' % (self.IE_NAME, error_info['message']),
+ expected=True)
+ raise
+
+ # NB the string values in the JSON are stored using XML escaping(!)
+ show = try_get(data, lambda x: x['shows'][0], dict)
+ if not show:
+ continue
+
+ if not info_dict:
+ title = unescapeHTML(show['title'])
+ description = unescapeHTML(show.get('description'))
+ thumbnail = show.get('thumbnail')
+ duration = float_or_none(show.get('duration'), 1000)
+ timestamp = parse_iso8601(show.get('published'))
+ info_dict = {
+ 'id': item_id,
+ 'title': title,
+ 'description': description,
+ 'thumbnail': thumbnail,
+ 'timestamp': timestamp,
+ 'duration': duration,
+ }
+
+ mg = try_get(show, lambda x: x['media:group'][0], dict)
+ if not mg:
+ continue
+
+ if mg.get('url'):
+ m = re.match(r'(?P<url>rtmpe?://[^/]+)/(?P<app>.+)/(?P<playpath>mp4:.*)', mg['url'])
+ if m:
+ m = m.groupdict()
+ formats.append({
+ 'url': m['url'] + '/' + m['app'],
+ 'app': m['app'],
+ 'play_path': m['playpath'],
+ 'player_url': url,
+ 'ext': 'flv',
+ 'format_id': 'rtmp',
+ })
+
+ if mg.get('hls_server') and mg.get('hls_url'):
+ formats.extend(self._extract_m3u8_formats(
+ mg['hls_server'] + mg['hls_url'], item_id, 'mp4',
+ entry_protocol='m3u8_native', m3u8_id='hls', fatal=False))
+
+ if mg.get('hds_server') and mg.get('hds_url'):
+ formats.extend(self._extract_f4m_formats(
+ mg['hds_server'] + mg['hds_url'], item_id,
+ f4m_id='hds', fatal=False))
+
+ mg_rte_server = str_or_none(mg.get('rte:server'))
+ mg_url = str_or_none(mg.get('url'))
+ if mg_rte_server and mg_url:
+ hds_url = url_or_none(mg_rte_server + mg_url)
+ if hds_url:
+ formats.extend(self._extract_f4m_formats(
+ hds_url, item_id, f4m_id='hds', fatal=False))
self._sort_formats(formats)
- return {
- 'id': item_id,
- 'title': title,
- 'description': description,
- 'thumbnail': thumbnail,
- 'timestamp': timestamp,
- 'duration': duration,
- 'formats': formats,
- }
+ info_dict['formats'] = formats
+ return info_dict
class RteIE(RteBaseIE):
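The rewritten RTE extractor walks a tuple of endpoints and only re-raises when the last one fails too. Stripped of the youtube-dl specifics, the control flow looks roughly like this (hypothetical function, for illustration only):

```python
def fetch_first_available(endpoints, download):
    # Try each endpoint in order; swallow errors except on the last one,
    # mirroring the "if num < len(ENDPOINTS): continue" check in the patch.
    for num, endpoint in enumerate(endpoints, start=1):
        try:
            return download(endpoint)
        except Exception:
            if num < len(endpoints):
                continue
            raise
```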
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/youtube-dl/youtube_dl/extractor/ruutu.py new/youtube-dl/youtube_dl/extractor/ruutu.py
--- old/youtube-dl/youtube_dl/extractor/ruutu.py 2018-11-06 19:36:18.000000000 +0100
+++ new/youtube-dl/youtube_dl/extractor/ruutu.py 2018-11-21 23:55:02.000000000 +0100
@@ -65,7 +65,8 @@
video_id = self._match_id(url)
video_xml = self._download_xml(
- 'http://gatling.ruutu.fi/media-xml-cache?id=%s' % video_id, video_id)
+ 'https://gatling.nelonenmedia.fi/media-xml-cache', video_id,
+ query={'id': video_id})
formats = []
processed_urls = []
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/youtube-dl/youtube_dl/extractor/shared.py new/youtube-dl/youtube_dl/extractor/shared.py
--- old/youtube-dl/youtube_dl/extractor/shared.py 2018-11-06 19:36:18.000000000 +0100
+++ new/youtube-dl/youtube_dl/extractor/shared.py 2018-11-21 23:55:02.000000000 +0100
@@ -5,6 +5,7 @@
from ..utils import (
ExtractorError,
int_or_none,
+ url_or_none,
urlencode_postdata,
)
@@ -86,9 +87,16 @@
}
def _extract_video_url(self, webpage, video_id, *args):
+ def decode_url(encoded_url):
+ return compat_b64decode(encoded_url).decode('utf-8')
+
+ stream_url = url_or_none(decode_url(self._search_regex(
+ r'data-stream\s*=\s*(["\'])(?P<url>(?:(?!\1).)+)\1', webpage,
+ 'stream url', default=None, group='url')))
+ if stream_url:
+ return stream_url
return self._parse_json(
self._search_regex(
r'InitializeStream\s*\(\s*(["\'])(?P<url>(?:(?!\1).)+)\1',
webpage, 'stream', group='url'),
- video_id,
- transform_source=lambda x: compat_b64decode(x).decode('utf-8'))[0]
+ video_id, transform_source=decode_url)[0]
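In the shared.py change, the `data-stream` attribute holds a base64-encoded URL, and the patch validates the decoded value before using it. A self-contained sketch of the decode-and-validate step (`url_or_none` here is a simplified stand-in for youtube-dl's utils helper):

```python
import base64

def decode_url(encoded_url):
    # Mirror of the helper added in the patch: base64 -> text.
    return base64.b64decode(encoded_url).decode('utf-8')

def url_or_none(url):
    # Simplified stand-in for youtube_dl.utils.url_or_none: keep only
    # values that look like absolute or protocol-relative URLs.
    if isinstance(url, str) and url.startswith(('http://', 'https://', '//')):
        return url
    return None
```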
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/youtube-dl/youtube_dl/extractor/sixplay.py new/youtube-dl/youtube_dl/extractor/sixplay.py
--- old/youtube-dl/youtube_dl/extractor/sixplay.py 2018-11-06 19:36:18.000000000 +0100
+++ new/youtube-dl/youtube_dl/extractor/sixplay.py 2018-11-21 23:55:12.000000000 +0100
@@ -64,7 +64,7 @@
for asset in clip_data['assets']:
asset_url = asset.get('full_physical_path')
protocol = asset.get('protocol')
- if not asset_url or protocol == 'primetime' or asset_url in urls:
+ if not asset_url or protocol == 'primetime' or asset.get('type') == 'usp_hlsfp_h264' or asset_url in urls:
continue
urls.append(asset_url)
container = asset.get('video_container')
@@ -81,19 +81,17 @@
if not urlh:
continue
asset_url = urlh.geturl()
- asset_url = re.sub(r'/([^/]+)\.ism/[^/]*\.m3u8', r'/\1.ism/\1.m3u8', asset_url)
- formats.extend(self._extract_m3u8_formats(
- asset_url, video_id, 'mp4', 'm3u8_native',
- m3u8_id='hls', fatal=False))
- formats.extend(self._extract_f4m_formats(
- asset_url.replace('.m3u8', '.f4m'),
- video_id, f4m_id='hds', fatal=False))
- formats.extend(self._extract_mpd_formats(
- asset_url.replace('.m3u8', '.mpd'),
- video_id, mpd_id='dash', fatal=False))
- formats.extend(self._extract_ism_formats(
- re.sub(r'/[^/]+\.m3u8', '/Manifest', asset_url),
- video_id, ism_id='mss', fatal=False))
+ for i in range(3, 0, -1):
+ asset_url = asset_url.replace('_sd1/', '_sd%d/' % i)
+ m3u8_formats = self._extract_m3u8_formats(
+ asset_url, video_id, 'mp4', 'm3u8_native',
+ m3u8_id='hls', fatal=False)
+ formats.extend(m3u8_formats)
+ formats.extend(self._extract_mpd_formats(
+ asset_url.replace('.m3u8', '.mpd'),
+ video_id, mpd_id='dash', fatal=False))
+ if m3u8_formats:
+ break
else:
formats.extend(self._extract_m3u8_formats(
asset_url, video_id, 'mp4', 'm3u8_native',
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/youtube-dl/youtube_dl/extractor/theplatform.py new/youtube-dl/youtube_dl/extractor/theplatform.py
--- old/youtube-dl/youtube_dl/extractor/theplatform.py 2018-11-06 19:36:18.000000000 +0100
+++ new/youtube-dl/youtube_dl/extractor/theplatform.py 2018-11-21 23:55:12.000000000 +0100
@@ -343,7 +343,7 @@
def _extract_feed_info(self, provider_id, feed_id, filter_query, video_id, custom_fields=None, asset_types_query={}, account_id=None):
real_url = self._URL_TEMPLATE % (self.http_scheme(), provider_id, feed_id, filter_query)
entry = self._download_json(real_url, video_id)['entries'][0]
- main_smil_url = 'http://link.theplatform.com/s/%s/media/guid/%d/%s' % (provider_id, account_id, entry['guid']) if account_id else None
+ main_smil_url = 'http://link.theplatform.com/s/%s/media/guid/%d/%s' % (provider_id, account_id, entry['guid']) if account_id else entry.get('plmedia$publicUrl')
formats = []
subtitles = {}
@@ -356,7 +356,8 @@
if first_video_id is None:
first_video_id = cur_video_id
duration = float_or_none(item.get('plfile$duration'))
- for asset_type in item['plfile$assetTypes']:
+ file_asset_types = item.get('plfile$assetTypes') or compat_parse_qs(compat_urllib_parse_urlparse(smil_url).query)['assetTypes']
+ for asset_type in file_asset_types:
if asset_type in asset_types:
continue
asset_types.append(asset_type)
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/youtube-dl/youtube_dl/extractor/tnaflix.py new/youtube-dl/youtube_dl/extractor/tnaflix.py
--- old/youtube-dl/youtube_dl/extractor/tnaflix.py 2018-11-06 19:36:18.000000000 +0100
+++ new/youtube-dl/youtube_dl/extractor/tnaflix.py 2018-11-21 23:55:02.000000000 +0100
@@ -18,8 +18,9 @@
class TNAFlixNetworkBaseIE(InfoExtractor):
# May be overridden in descendants if necessary
_CONFIG_REGEX = [
- r'flashvars\.config\s*=\s*escape\("([^"]+)"',
- r'<input[^>]+name="config\d?" value="([^"]+)"',
+ r'flashvars\.config\s*=\s*escape\("(?P<url>[^"]+)"',
+ r'<input[^>]+name="config\d?" value="(?P<url>[^"]+)"',
+ r'config\s*=\s*(["\'])(?P<url>(?:https?:)?//(?:(?!\1).)+)\1',
]
_HOST = 'tna'
_VKEY_SUFFIX = ''
@@ -85,7 +86,8 @@
webpage = self._download_webpage(url, display_id)
cfg_url = self._proto_relative_url(self._html_search_regex(
- self._CONFIG_REGEX, webpage, 'flashvars.config', default=None), 'http:')
+ self._CONFIG_REGEX, webpage, 'flashvars.config', default=None,
+ group='url'), 'http:')
if not cfg_url:
inputs = self._hidden_inputs(webpage)
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/youtube-dl/youtube_dl/extractor/vk.py new/youtube-dl/youtube_dl/extractor/vk.py
--- old/youtube-dl/youtube_dl/extractor/vk.py 2018-11-06 19:36:18.000000000 +0100
+++ new/youtube-dl/youtube_dl/extractor/vk.py 2018-11-21 23:55:03.000000000 +0100
@@ -293,8 +293,12 @@
# This video is no longer available, because its author has been blocked.
'url': 'https://vk.com/video-10639516_456240611',
'only_matching': True,
- }
- ]
+ },
+ {
+ # The video is not available in your region.
+ 'url': 'https://vk.com/video-51812607_171445436',
+ 'only_matching': True,
+ }]
def _real_extract(self, url):
mobj = re.match(self._VALID_URL, url)
@@ -354,6 +358,9 @@
r'<!>This video is no longer available, because it has been deleted.':
'Video %s is no longer available, because it has been deleted.',
+
+ r'<!>The video .+? is not available in your region.':
+ 'Video %s is not available in your region.',
}
for error_re, error_msg in ERRORS.items():
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/youtube-dl/youtube_dl/extractor/wwe.py new/youtube-dl/youtube_dl/extractor/wwe.py
--- old/youtube-dl/youtube_dl/extractor/wwe.py 1970-01-01 01:00:00.000000000 +0100
+++ new/youtube-dl/youtube_dl/extractor/wwe.py 2018-11-21 23:55:03.000000000 +0100
@@ -0,0 +1,140 @@
+from __future__ import unicode_literals
+
+import re
+
+from .common import InfoExtractor
+from ..compat import compat_str
+from ..utils import (
+ try_get,
+ unescapeHTML,
+ url_or_none,
+ urljoin,
+)
+
+
+class WWEBaseIE(InfoExtractor):
+ _SUBTITLE_LANGS = {
+ 'English': 'en',
+ 'Deutsch': 'de',
+ }
+
+ def _extract_entry(self, data, url, video_id=None):
+ video_id = compat_str(video_id or data['nid'])
+ title = data['title']
+
+ formats = self._extract_m3u8_formats(
+ data['file'], video_id, 'mp4', entry_protocol='m3u8_native',
+ m3u8_id='hls')
+
+ description = data.get('description')
+ thumbnail = urljoin(url, data.get('image'))
+ series = data.get('show_name')
+ episode = data.get('episode_name')
+
+ subtitles = {}
+ tracks = data.get('tracks')
+ if isinstance(tracks, list):
+ for track in tracks:
+ if not isinstance(track, dict):
+ continue
+ if track.get('kind') != 'captions':
+ continue
+ track_file = url_or_none(track.get('file'))
+ if not track_file:
+ continue
+ label = track.get('label')
+ lang = self._SUBTITLE_LANGS.get(label, label) or 'en'
+ subtitles.setdefault(lang, []).append({
+ 'url': track_file,
+ })
+
+ return {
+ 'id': video_id,
+ 'title': title,
+ 'description': description,
+ 'thumbnail': thumbnail,
+ 'series': series,
+ 'episode': episode,
+ 'formats': formats,
+ 'subtitles': subtitles,
+ }
+
+
+class WWEIE(WWEBaseIE):
+ _VALID_URL = r'https?://(?:[^/]+\.)?wwe\.com/(?:[^/]+/)*videos/(?P<id>[^/?#&]+)'
+ _TESTS = [{
+ 'url': 'https://www.wwe.com/videos/daniel-bryan-vs-andrade-cien-almas-smackdown-liv…',
+ 'md5': '92811c6a14bfc206f7a6a9c5d9140184',
+ 'info_dict': {
+ 'id': '40048199',
+ 'ext': 'mp4',
+ 'title': 'Daniel Bryan vs. Andrade "Cien" Almas: SmackDown LIVE, Sept. 4, 2018',
+ 'description': 'md5:2d7424dbc6755c61a0e649d2a8677f67',
+ 'thumbnail': r're:^https?://.*\.jpg$',
+ }
+ }, {
+ 'url': 'https://de.wwe.com/videos/gran-metalik-vs-tony-nese-wwe-205-live-sept-4-2018',
+ 'only_matching': True,
+ }]
+
+ def _real_extract(self, url):
+ display_id = self._match_id(url)
+ webpage = self._download_webpage(url, display_id)
+
+ landing = self._parse_json(
+ self._html_search_regex(
+ r'(?s)Drupal\.settings\s*,\s*({.+?})\s*\)\s*;',
+ webpage, 'drupal settings'),
+ display_id)['WWEVideoLanding']
+
+ data = landing['initialVideo']['playlist'][0]
+ video_id = landing.get('initialVideoId')
+
+ info = self._extract_entry(data, url, video_id)
+ info['display_id'] = display_id
+ return info
+
+
+class WWEPlaylistIE(WWEBaseIE):
+ _VALID_URL = r'https?://(?:[^/]+\.)?wwe\.com/(?:[^/]+/)*(?P<id>[^/?#&]+)'
+ _TESTS = [{
+ 'url': 'https://www.wwe.com/shows/raw/2018-11-12',
+ 'info_dict': {
+ 'id': '2018-11-12',
+ },
+ 'playlist_mincount': 11,
+ }, {
+ 'url': 'http://www.wwe.com/article/walk-the-prank-wwe-edition',
+ 'only_matching': True,
+ }, {
+ 'url': 'https://www.wwe.com/shows/wwenxt/article/matt-riddle-interview',
+ 'only_matching': True,
+ }]
+
+ @classmethod
+ def suitable(cls, url):
+ return False if WWEIE.suitable(url) else super(WWEPlaylistIE, cls).suitable(url)
+
+ def _real_extract(self, url):
+ display_id = self._match_id(url)
+ webpage = self._download_webpage(url, display_id)
+
+ entries = []
+ for mobj in re.finditer(
+ r'data-video\s*=\s*(["\'])(?P<data>{.+?})\1', webpage):
+ video = self._parse_json(
+ mobj.group('data'), display_id, transform_source=unescapeHTML,
+ fatal=False)
+ if not video:
+ continue
+ data = try_get(video, lambda x: x['playlist'][0], dict)
+ if not data:
+ continue
+ try:
+ entry = self._extract_entry(data, url)
+ except Exception:
+ continue
+ entry['extractor_key'] = WWEIE.ie_key()
+ entries.append(entry)
+
+ return self.playlist_result(entries, display_id)
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/youtube-dl/youtube_dl/extractor/zype.py new/youtube-dl/youtube_dl/extractor/zype.py
--- old/youtube-dl/youtube_dl/extractor/zype.py 1970-01-01 01:00:00.000000000 +0100
+++ new/youtube-dl/youtube_dl/extractor/zype.py 2018-11-21 23:55:03.000000000 +0100
@@ -0,0 +1,57 @@
+# coding: utf-8
+from __future__ import unicode_literals
+
+import re
+
+from .common import InfoExtractor
+
+
+class ZypeIE(InfoExtractor):
+ _VALID_URL = r'https?://player\.zype\.com/embed/(?P<id>[\da-fA-F]+)\.js\?.*?api_key=[^&]+'
+ _TEST = {
+ 'url': 'https://player.zype.com/embed/5b400b834b32992a310622b9.js?api_key=jZ9GUhRmx…',
+ 'md5': 'eaee31d474c76a955bdaba02a505c595',
+ 'info_dict': {
+ 'id': '5b400b834b32992a310622b9',
+ 'ext': 'mp4',
+ 'title': 'Smoky Barbecue Favorites',
+ 'thumbnail': r're:^https?://.*\.jpe?g',
+ },
+ }
+
+ @staticmethod
+ def _extract_urls(webpage):
+ return [
+ mobj.group('url')
+ for mobj in re.finditer(
+ r'<script[^>]+\bsrc=(["\'])(?P<url>(?:https?:)?//player\.zype\.com/embed/[\da-fA-F]+\.js\?.*?api_key=.+?)\1',
+ webpage)]
+
+ def _real_extract(self, url):
+ video_id = self._match_id(url)
+
+ webpage = self._download_webpage(url, video_id)
+
+ title = self._search_regex(
+ r'video_title\s*[:=]\s*(["\'])(?P<value>(?:(?!\1).)+)\1', webpage,
+ 'title', group='value')
+
+ m3u8_url = self._search_regex(
+ r'(["\'])(?P<url>(?:(?!\1).)+\.m3u8(?:(?!\1).)*)\1', webpage,
+ 'm3u8 url', group='url')
+
+ formats = self._extract_m3u8_formats(
+ m3u8_url, video_id, 'mp4', entry_protocol='m3u8_native',
+ m3u8_id='hls')
+ self._sort_formats(formats)
+
+ thumbnail = self._search_regex(
+ r'poster\s*[:=]\s*(["\'])(?P<url>(?:(?!\1).)+)\1', webpage, 'thumbnail',
+ default=False, group='url')
+
+ return {
+ 'id': video_id,
+ 'title': title,
+ 'thumbnail': thumbnail,
+ 'formats': formats,
+ }
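The new Zype extractor's `_extract_urls` scans a page for embed script tags with the quoted-attribute regex shown above. The same pattern works standalone (sketch only; the sample HTML below is invented):

```python
import re

# Same pattern as ZypeIE._extract_urls above: a quoted src attribute
# pointing at a player.zype.com embed script with an api_key parameter.
ZYPE_RE = (r'<script[^>]+\bsrc=(["\'])(?P<url>(?:https?:)?'
           r'//player\.zype\.com/embed/[\da-fA-F]+\.js\?.*?api_key=.+?)\1')

def extract_zype_urls(webpage):
    return [m.group('url') for m in re.finditer(ZYPE_RE, webpage)]
```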
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/youtube-dl/youtube_dl/version.py new/youtube-dl/youtube_dl/version.py
--- old/youtube-dl/youtube_dl/version.py 2018-11-06 19:38:22.000000000 +0100
+++ new/youtube-dl/youtube_dl/version.py 2018-11-22 18:16:43.000000000 +0100
@@ -1,3 +1,3 @@
from __future__ import unicode_literals
-__version__ = '2018.11.07'
+__version__ = '2018.11.23'
Hello community,
here is the log from the commit of package percona-toolkit for openSUSE:Factory checked in at 2018-11-26 10:30:58
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/percona-toolkit (Old)
and /work/SRC/openSUSE:Factory/.percona-toolkit.new.19453 (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Package is "percona-toolkit"
Mon Nov 26 10:30:58 2018 rev:33 rq:651380 version:3.0.12
Changes:
--------
--- /work/SRC/openSUSE:Factory/percona-toolkit/percona-toolkit.changes 2018-08-27 13:48:56.584516595 +0200
+++ /work/SRC/openSUSE:Factory/.percona-toolkit.new.19453/percona-toolkit.changes 2018-11-26 10:31:48.964921804 +0100
@@ -1,0 +2,9 @@
+Fri Nov 23 12:49:32 UTC 2018 - astieger(a)suse.com
+
+- update to Percona Toolkit 3.0.12:
+ * pt-archiver failed with UTF-8 chars
+ * pt-online-schema-change failed on UK and NULLs
+ * Better usage of ENUM fields in keys in NibbleIterator
+ * pt-mysql-summary may get stuck when Time: NULL in processlist
+
+-------------------------------------------------------------------
Old:
----
percona-toolkit-3.0.11.tar.gz
New:
----
percona-toolkit-3.0.12.tar.gz
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Other differences:
------------------
++++++ percona-toolkit.spec ++++++
--- /var/tmp/diff_new_pack.k9HC7J/_old 2018-11-26 10:31:50.296920244 +0100
+++ /var/tmp/diff_new_pack.k9HC7J/_new 2018-11-26 10:31:50.296920244 +0100
@@ -17,7 +17,7 @@
Name: percona-toolkit
-Version: 3.0.11
+Version: 3.0.12
Release: 0
Summary: Advanced MySQL and system command-line tools
License: GPL-2.0-only
++++++ percona-toolkit-3.0.11.tar.gz -> percona-toolkit-3.0.12.tar.gz ++++++
/work/SRC/openSUSE:Factory/percona-toolkit/percona-toolkit-3.0.11.tar.gz /work/SRC/openSUSE:Factory/.percona-toolkit.new.19453/percona-toolkit-3.0.12.tar.gz differ: char 5, line 1
Hello community,
here is the log from the commit of package goaccess for openSUSE:Factory checked in at 2018-11-26 10:30:52
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/goaccess (Old)
and /work/SRC/openSUSE:Factory/.goaccess.new.19453 (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Package is "goaccess"
Mon Nov 26 10:30:52 2018 rev:2 rq:651371 version:1.3
Changes:
--------
--- /work/SRC/openSUSE:Factory/goaccess/goaccess.changes 2018-10-18 15:39:51.142057259 +0200
+++ /work/SRC/openSUSE:Factory/.goaccess.new.19453/goaccess.changes 2018-11-26 10:31:30.232943755 +0100
@@ -1,0 +2,75 @@
+Fri Nov 23 11:39:52 UTC 2018 - mvetter(a)suse.com
+
+- Update to 1.3
+ * Added ability to store accumulated processing time into DB_GEN_STATS tcb
+ file via '--accumulated-time' command line option.
+ * Added additional Apache status codes to the list.
+ * Added a few feed readers to the list.
+ * Added more OSs to the list of OSs.
+ * Added --anonymize-ip command line option to anonymize ip addresses.
+ * Added --browsers-file command line option to load a list of crawlers from a
+ text file.
+ * Added byte unit (PiB) to formatters.
+ * Added translations.
+ * Added '%h' date specifier to the allowed date character specifiers.
+ * Added "HeadlessChrome" to the list of browsers.
+ * Added --hide-referer command line option to hide referers from report.
+ * Added HTTP status code 429 (TOO MANY REQUESTS).
+ * Added IGNORE_LEVEL_PANEL and IGNORE_LEVEL_REQ definitions.
+ * Added --ignore-referer-report command line option to hide referers from
+ output.
+ * Added "Mastodon" user-agent to the list of crawlers/unix-like.
+ * Added new fontawesome icons and use angle arrows in HTML paging.
+ * Added new purple theme to HTML report and default to it.
+ * Added --no-parsing-spinner command line option to switch off parsing
+ spinner.
+ * Added .ogv and ogg static file extension (ogg video, Ogg Vorbis audio).
+ * Added OS X version numbers when outputting with --real-os.
+ * Added parsing mechanism in an attempt to capture more bots and to include
+ unspecified bots/crawlers.
+ * Added --pidfile command line option to the default config file.
+ * Added SSL support for Docker goaccess build.
+ * Added support to the WebSocket server for openssl-1.1*.
+ * Added the ability to show/hide a chart per panel in the HTML report.
+ * Added transparency to the navigation bar of the HTML report.
+ * Added "WhatsApp" user-agent to the list of crawlers.
+ * Changed default db folder so it adds the process id (PID). --db-path is
+ required now when using --load-from-disk.
+ * Changed Dockerfile to build from the current source.
+ * Changed 'hits' to be right-aligned on TUI.
+ * Changed to use faster slide animations on HTML report.
+ * Changed wording from 'Bandwidth' to the proper term 'Tx. Amount'.
+ * Ensure database filenames used by btree are less predictable.
+ * Ensure HTML templates, CSS and JS files are minified when outputting
+ report.
+ * Ensure key phrases from Google are added even when https is used.
+ * Ensure live report updates data & charts if tab/document has focus.
+ * Ensure multiple 'Yandex' crawlers are properly parsed.
+ * Ensure Safari has priority over most crawlers except the ones that are
+ known to have it.
+ * Ensure the request protocol on its own is properly parsed.
+ * Ensure the right number of tests are performed against the given log.
+ * Ensure user configuration is parsed first when available.
+ * Ensure wss:// is used when connecting via HTTPS.
+ * Ensure XFF parser takes into account escaped braces.
+ * Fixed a regression where fifo-in/out would fail with ENXIO.
+ * Fixed a regression where it would return EXIT_FAILURE on an empty log.
+ * Fixed a (ssh) pipeline problem with fgetline()/fgets() when there is a race
+ for data on stdin.
+ * Fixed broken X-Forwarded-For (XFF) %~ specifier in certain parsing cases.
+ * Fixed conf.filenames duplication problem if logs are via pipe.
+ * Fixed float percent value on JSON/HTML output for locales using decimal comma.
+ * Fixed issue where it was not possible to establish a Web Socket connection
+ when attempting to parse and extract HTTP method.
+ * Fixed issue where log formats with pipe delimiter were not properly parsed.
+ * Fixed memory leak after config file path has been set (housekeeping).
+ * Fixed memory leak when adding host to holder introduced in c052d1ea.
+ * Fixed possible memory leak when hiding specific referrers.
+ * Fixed several JS jshint warnings.
+ * Fixed sudo installs on TravisCI.
+ * Fixed UNDEFINED time range in HTML report when VISITORS panel was ignored.
+ * Fixed unnecessary closing span tags from template.
+ * Fixed use-after-free when two color items were found on color_list.
+- Add language package
+
+-------------------------------------------------------------------
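Among the 1.3 additions above, --anonymize-ip masks client addresses before goaccess stores them. A rough Python sketch of the usual approach, zeroing the host bits of the address; the exact mask widths goaccess uses are an assumption here, not taken from its source:

```python
import ipaddress

def anonymize_ip(addr: str) -> str:
    """Mask the host portion of an address: the last octet for IPv4,
    everything past /48 for IPv6 (a common anonymization convention,
    assumed rather than read from goaccess itself)."""
    ip = ipaddress.ip_address(addr)
    prefix = 24 if ip.version == 4 else 48
    net = ipaddress.ip_network(f"{addr}/{prefix}", strict=False)
    return str(net.network_address)

print(anonymize_ip("203.0.113.42"))    # 203.0.113.0
print(anonymize_ip("2001:db8::1234"))  # 2001:db8::
```

The point of masking at ingest time is that the original address never reaches the on-disk database, which matters for the btree/tcb storage options the changelog also touches.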
Old:
----
goaccess-1.2.tar.gz
New:
----
goaccess-1.3.tar.gz
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Other differences:
------------------
++++++ goaccess.spec ++++++
--- /var/tmp/diff_new_pack.zPHczS/_old 2018-11-26 10:31:30.864943015 +0100
+++ /var/tmp/diff_new_pack.zPHczS/_new 2018-11-26 10:31:30.864943015 +0100
@@ -19,7 +19,7 @@
Name: goaccess
-Version: 1.2
+Version: 1.3
Release: 0
Summary: Apache Web Log Analyzer
License: GPL-2.0-or-later
@@ -32,11 +32,14 @@
BuildRequires: libGeoIP-devel
BuildRequires: make
BuildRequires: ncurses-devel
+Recommends: %{name}-lang
%description
GoAccess is an Apache web log analyzer that provides HTTP statistics
for system administrators that require a visual report on the fly.
+%lang_package
+
%prep
%setup -q
@@ -50,12 +53,15 @@
%install
%make_install
-%files
+%find_lang %{name}
+
+%files -f %{name}.lang
%license COPYING
%doc AUTHORS ChangeLog NEWS README TODO
%{_bindir}/goaccess
%{_mandir}/man1/goaccess.1%{?ext_man}
-%config(noreplace) %{_sysconfdir}/goaccess.conf
-%{_datadir}/doc/goaccess/
+%dir %{_sysconfdir}/goaccess/
+%config %{_sysconfdir}/goaccess/browsers.list
+%config %{_sysconfdir}/goaccess/goaccess.conf
%changelog
++++++ goaccess-1.2.tar.gz -> goaccess-1.3.tar.gz ++++++
++++ 23442 lines of diff (skipped)
Hello community,
here is the log from the commit of package nextcloud for openSUSE:Factory checked in at 2018-11-26 10:30:36
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/nextcloud (Old)
and /work/SRC/openSUSE:Factory/.nextcloud.new.19453 (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Package is "nextcloud"
Mon Nov 26 10:30:36 2018 rev:20 rq:651370 version:14.0.4
Changes:
--------
--- /work/SRC/openSUSE:Factory/nextcloud/nextcloud.changes 2018-10-15 09:43:30.267367459 +0200
+++ /work/SRC/openSUSE:Factory/.nextcloud.new.19453/nextcloud.changes 2018-11-26 10:31:24.656950290 +0100
@@ -1,0 +2,94 @@
+Thu Nov 22 16:56:26 UTC 2018 - ecsos(a)opensuse.org
+
+- update to 14.0.4
+ - Allow overwrite.cli.url without trailing slash (server#11772)
+ - Remove duplicate call to decodeURIComponent (server#11781)
+ - Check for empty string (server#11783)
+ - Add "Referrer-Policy" to htaccess file, addresses issue #11099
+ (server#11798)
+ - Always query the lookup server in a global scale setup
+ (server#11800)
+ - Fix a case where "password_by_talk" was not a boolean
+ (server#11851)
+ - Add .l10nignore files for compiled assets (server#11925)
+ - Properly escape column name in "createFunction" call
+ (server#11929)
+ - Allow userId to be null (server#11939)
+ - Allow "same-origin" as "Referrer-Policy" (Backport to stable14)
+ (server#11950)
+ - Do not emit preHooks twice on non-part-storage (server#11961)
+ - Filter null values for UserManager::getByEmail (server#11976)
+ - Allow local delivery of schedule message while prohibiting
+ FreeBusy requests (server#11979)
+ - Load apps/APP/l10n/*.js and themes/THEME/apps/APP/l10n/*.js
+ (server#11990)
+ - Lazy open first source stream in assemblystream (server#11994)
+ - Fix opening a section again in the Files app (server#11995)
+ - Remove cookies from Clear-Site-Data Header (server#12005)
+ - Forwarded ExpiredTokenException (server#12032)
+ - Allow chunked uploads even if your quota is not sufficient
+ (server#12040)
+ - Improve encrypt all / decrypt all (server#12045)
+ - Double check for failed cache with a shared storage
+ (server#12108)
+ - Implement the size of an assembly stream (server#12111)
+ - Bring the browser window of an actor to the foreground when
+ acting as him (server#12120)
+ - Move acceptance tests that crash the PHP built-in server to
+ Apache (server#12121)
+ - Remove unneeded empty search attribute values, fixes #12086
+ (server#12122)
+ - Fixes wrong variable usage (server#12137)
+ - LDAP: announce display name changes so that addressbook picks
+ it up (server#12141)
+ - Bruteforce protection handling in combination with
+ (server#12160)
+ - Add global site selector as user back-end which doesn't support
+ password confirmation (server#12184)
+ - Do not set indeterminate state for file shares (server#12187)
+ - Revert "Wait for cron to finish before running upgrade command"
+ (server#12197)
+ - Fix bug #12151: fix list formatting by correcting malformed
+ html (server#12202)
+ - A folder should get a folder mimetype (server#12297)
+ - Use the proper server for the apptoken flow login
+ (server#12299)
+ - Do not log FileLock as exception (server#12300)
+ - Set the filemodel before rendering the detailsview (server#12301)
+ - Disabled ldap fix (server#12331)
+ - Fix - Add to favorites not working in IE11 (server#12339)
+ - Remove arrow function for ie compatibility (server#12341)
+ - Fix default types of activity event member variables
+ (server#12353)
+ - Suppress wrong audit log messages about failed login attempts
+ (server#12372)
+ - Add fix for IE11 flexbox height bug (server#12374)
+ - Properly search the root of a shared external storage
+ (server#12375)
+ - Fix app update available check (server#12412)
+ - Use nextcloud-password-confirmation (server#12416)
+ - Fix IE rule for min width (server#12431)
+ - Added cache override to ensure an always up-to-date
+ accessibility css (server#12432)
+ - Unique constraint and deadlock fixes for filecache and
+ file_locks (server#12433)
+ - Fix app menu calculation for random size of the right header
+ (server#12440)
+ - Fix missing quickaccess favorite folder on add (server#12441)
+ - Fixes dav share issue with owner (server#12459)
+ - Fix wrong share popover opening on share link (server#12482)
+ - Only use width and opacity for transition (server#12492)
+ - Forward object not found error in swift as dav 404
+ (server#12502)
+ - Fix the warning appearing in the admin section when
+ mail_smtpmode is not configured (server#12529)
+ - Remove unused svg api route (server#12542)
+ - Bearer tokens are app token (server#12545)
+ - Handle permission in update of share better (server#12561)
+ - Correctly restrict affected users when using command to send
+ emails (activity#312)
+ - Improve code blocks in markdown rendering (files_texteditor#121)
+ - Properly escape column name in "createFunction" call
+ (survey_client#85)
+
+-------------------------------------------------------------------
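The "Referrer-Policy" entry in the changelog above (server#11798, with "same-origin" later allowed in server#11950) refers to a response header Nextcloud now ships via its .htaccess. A representative mod_headers fragment; the policy value "no-referrer" is illustrative, since the exact value Nextcloud sets is not shown in this log:

```apache
# Requires Apache's mod_headers; sets the policy on every response.
<IfModule mod_headers.c>
  Header always set Referrer-Policy "no-referrer"
</IfModule>
```

Browsers honoring the header then omit (or trim) the Referer header on outgoing navigation, which keeps internal share URLs from leaking to linked sites.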
Old:
----
nextcloud-14.0.3.tar.bz2
New:
----
nextcloud-14.0.4.tar.bz2
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Other differences:
------------------
++++++ nextcloud.spec ++++++
--- /var/tmp/diff_new_pack.56Yzjw/_old 2018-11-26 10:31:26.164948523 +0100
+++ /var/tmp/diff_new_pack.56Yzjw/_new 2018-11-26 10:31:26.168948519 +0100
@@ -49,7 +49,7 @@
%endif
Name: nextcloud
-Version: 14.0.3
+Version: 14.0.4
Release: 0
Summary: File hosting service
License: AGPL-3.0-only
++++++ nextcloud-14.0.3.tar.bz2 -> nextcloud-14.0.4.tar.bz2 ++++++
/work/SRC/openSUSE:Factory/nextcloud/nextcloud-14.0.3.tar.bz2 /work/SRC/openSUSE:Factory/.nextcloud.new.19453/nextcloud-14.0.4.tar.bz2 differ: char 11, line 1
Hello community,
here is the log from the commit of package way-cooler for openSUSE:Factory checked in at 2018-11-26 10:30:26
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/way-cooler (Old)
and /work/SRC/openSUSE:Factory/.way-cooler.new.19453 (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Package is "way-cooler"
Mon Nov 26 10:30:26 2018 rev:4 rq:651358 version:0.8.1
Changes:
--------
--- /work/SRC/openSUSE:Factory/way-cooler/way-cooler.changes 2018-01-25 12:41:43.591341550 +0100
+++ /work/SRC/openSUSE:Factory/.way-cooler.new.19453/way-cooler.changes 2018-11-26 10:31:11.208966052 +0100
@@ -1,0 +2,7 @@
+Thu Nov 22 18:59:37 UTC 2018 - mvetter(a)suse.com
+
+- Update to 0.8.1:
+ * Fix build with Rust 1.30
+ * See https://github.com/way-cooler/way-cooler/issues/595
+
+-------------------------------------------------------------------
Old:
----
v0.8.0.tar.gz
New:
----
v0.8.1.tar.gz
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Other differences:
------------------
++++++ way-cooler.spec ++++++
--- /var/tmp/diff_new_pack.9tSmPC/_old 2018-11-26 10:31:14.420962287 +0100
+++ /var/tmp/diff_new_pack.9tSmPC/_new 2018-11-26 10:31:14.424962282 +0100
@@ -17,7 +17,7 @@
Name: way-cooler
-Version: 0.8.0
+Version: 0.8.1
Release: 0
Summary: Customizable Wayland compositor written in Rust
License: MIT
@@ -73,7 +73,7 @@
%doc README.md LICENSE
%{_bindir}/way-cooler
%dir %{_sysconfdir}/way-cooler
-%{_sysconfdir}/way-cooler/init.lua
+%config %{_sysconfdir}/way-cooler/init.lua
%dir %{_datadir}/wayland-sessions
%{_datadir}/wayland-sessions/way-cooler.desktop
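The spec change above marks init.lua with %config, which tells RPM to treat it as a configuration file on upgrade instead of silently overwriting local edits. A minimal illustration of the two variants in a %files section; the second path is hypothetical, added only to contrast the behaviors:

```
%files
# Plain %config: a locally modified file is saved as .rpmsave and
# replaced by the packaged version on upgrade.
%config %{_sysconfdir}/way-cooler/init.lua
# %config(noreplace): the locally modified file is kept, and the new
# packaged version is written alongside it as .rpmnew instead.
%config(noreplace) %{_sysconfdir}/way-cooler/example.conf
```

For a user-edited compositor init script, %config(noreplace) is often the friendlier choice, but either is an improvement over shipping the file unmarked.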
++++++ v0.8.0.tar.gz -> v0.8.1.tar.gz ++++++
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/way-cooler-0.8.0/Cargo.lock new/way-cooler-0.8.1/Cargo.lock
--- old/way-cooler-0.8.0/Cargo.lock 2018-01-24 23:54:55.000000000 +0100
+++ new/way-cooler-0.8.1/Cargo.lock 2018-11-22 18:42:29.000000000 +0100
@@ -28,7 +28,7 @@
[[package]]
name = "bitflags"
-version = "1.0.1"
+version = "1.0.4"
source = "registry+https://github.com/rust-lang/crates.io-index"
[[package]]
@@ -45,7 +45,7 @@
"cairo-sys-rs 0.4.0 (registry+https://github.com/rust-lang/crates.io-index)",
"glib 0.3.1 (registry+https://github.com/rust-lang/crates.io-index)",
"glib-sys 0.4.0 (registry+https://github.com/rust-lang/crates.io-index)",
- "libc 0.2.34 (registry+https://github.com/rust-lang/crates.io-index)",
+ "libc 0.2.44 (registry+https://github.com/rust-lang/crates.io-index)",
"winapi 0.2.8 (registry+https://github.com/rust-lang/crates.io-index)",
]
@@ -54,23 +54,36 @@
version = "0.4.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
dependencies = [
- "libc 0.2.34 (registry+https://github.com/rust-lang/crates.io-index)",
- "pkg-config 0.3.9 (registry+https://github.com/rust-lang/crates.io-index)",
+ "libc 0.2.44 (registry+https://github.com/rust-lang/crates.io-index)",
+ "pkg-config 0.3.14 (registry+https://github.com/rust-lang/crates.io-index)",
"winapi 0.2.8 (registry+https://github.com/rust-lang/crates.io-index)",
]
[[package]]
+name = "cc"
+version = "1.0.25"
+source = "registry+https://github.com/rust-lang/crates.io-index"
+
+[[package]]
name = "cfg-if"
-version = "0.1.2"
+version = "0.1.6"
source = "registry+https://github.com/rust-lang/crates.io-index"
[[package]]
+name = "cloudabi"
+version = "0.0.3"
+source = "registry+https://github.com/rust-lang/crates.io-index"
+dependencies = [
+ "bitflags 1.0.4 (registry+https://github.com/rust-lang/crates.io-index)",
+]
+
+[[package]]
name = "dbus"
version = "0.4.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
dependencies = [
- "libc 0.2.34 (registry+https://github.com/rust-lang/crates.io-index)",
- "pkg-config 0.3.9 (registry+https://github.com/rust-lang/crates.io-index)",
+ "libc 0.2.44 (registry+https://github.com/rust-lang/crates.io-index)",
+ "pkg-config 0.3.14 (registry+https://github.com/rust-lang/crates.io-index)",
]
[[package]]
@@ -91,15 +104,15 @@
[[package]]
name = "dlib"
-version = "0.4.0"
+version = "0.4.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
dependencies = [
- "libloading 0.4.3 (registry+https://github.com/rust-lang/crates.io-index)",
+ "libloading 0.5.0 (registry+https://github.com/rust-lang/crates.io-index)",
]
[[package]]
name = "dtoa"
-version = "0.4.2"
+version = "0.4.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
[[package]]
@@ -108,7 +121,7 @@
source = "registry+https://github.com/rust-lang/crates.io-index"
dependencies = [
"bitflags 0.6.0 (registry+https://github.com/rust-lang/crates.io-index)",
- "libc 0.2.34 (registry+https://github.com/rust-lang/crates.io-index)",
+ "libc 0.2.44 (registry+https://github.com/rust-lang/crates.io-index)",
"wayland-sys 0.9.10 (registry+https://github.com/rust-lang/crates.io-index)",
]
@@ -123,26 +136,26 @@
[[package]]
name = "fixedbitset"
-version = "0.1.8"
+version = "0.1.9"
source = "registry+https://github.com/rust-lang/crates.io-index"
[[package]]
name = "fuchsia-zircon"
-version = "0.3.2"
+version = "0.3.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
dependencies = [
- "bitflags 1.0.1 (registry+https://github.com/rust-lang/crates.io-index)",
- "fuchsia-zircon-sys 0.3.2 (registry+https://github.com/rust-lang/crates.io-index)",
+ "bitflags 1.0.4 (registry+https://github.com/rust-lang/crates.io-index)",
+ "fuchsia-zircon-sys 0.3.3 (registry+https://github.com/rust-lang/crates.io-index)",
]
[[package]]
name = "fuchsia-zircon-sys"
-version = "0.3.2"
+version = "0.3.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
[[package]]
name = "gcc"
-version = "0.3.54"
+version = "0.3.55"
source = "registry+https://github.com/rust-lang/crates.io-index"
[[package]]
@@ -154,7 +167,7 @@
"glib 0.3.1 (registry+https://github.com/rust-lang/crates.io-index)",
"glib-sys 0.4.0 (registry+https://github.com/rust-lang/crates.io-index)",
"gobject-sys 0.4.0 (registry+https://github.com/rust-lang/crates.io-index)",
- "libc 0.2.34 (registry+https://github.com/rust-lang/crates.io-index)",
+ "libc 0.2.44 (registry+https://github.com/rust-lang/crates.io-index)",
]
[[package]]
@@ -166,14 +179,17 @@
"gio-sys 0.4.0 (registry+https://github.com/rust-lang/crates.io-index)",
"glib-sys 0.4.0 (registry+https://github.com/rust-lang/crates.io-index)",
"gobject-sys 0.4.0 (registry+https://github.com/rust-lang/crates.io-index)",
- "libc 0.2.34 (registry+https://github.com/rust-lang/crates.io-index)",
- "pkg-config 0.3.9 (registry+https://github.com/rust-lang/crates.io-index)",
+ "libc 0.2.44 (registry+https://github.com/rust-lang/crates.io-index)",
+ "pkg-config 0.3.14 (registry+https://github.com/rust-lang/crates.io-index)",
]
[[package]]
name = "getopts"
-version = "0.2.15"
+version = "0.2.18"
source = "registry+https://github.com/rust-lang/crates.io-index"
+dependencies = [
+ "unicode-width 0.1.5 (registry+https://github.com/rust-lang/crates.io-index)",
+]
[[package]]
name = "gio-sys"
@@ -183,8 +199,8 @@
"bitflags 0.9.1 (registry+https://github.com/rust-lang/crates.io-index)",
"glib-sys 0.4.0 (registry+https://github.com/rust-lang/crates.io-index)",
"gobject-sys 0.4.0 (registry+https://github.com/rust-lang/crates.io-index)",
- "libc 0.2.34 (registry+https://github.com/rust-lang/crates.io-index)",
- "pkg-config 0.3.9 (registry+https://github.com/rust-lang/crates.io-index)",
+ "libc 0.2.44 (registry+https://github.com/rust-lang/crates.io-index)",
+ "pkg-config 0.3.14 (registry+https://github.com/rust-lang/crates.io-index)",
]
[[package]]
@@ -196,7 +212,7 @@
"glib-sys 0.4.0 (registry+https://github.com/rust-lang/crates.io-index)",
"gobject-sys 0.4.0 (registry+https://github.com/rust-lang/crates.io-index)",
"lazy_static 0.2.11 (registry+https://github.com/rust-lang/crates.io-index)",
- "libc 0.2.34 (registry+https://github.com/rust-lang/crates.io-index)",
+ "libc 0.2.44 (registry+https://github.com/rust-lang/crates.io-index)",
]
[[package]]
@@ -205,8 +221,8 @@
source = "registry+https://github.com/rust-lang/crates.io-index"
dependencies = [
"bitflags 0.9.1 (registry+https://github.com/rust-lang/crates.io-index)",
- "libc 0.2.34 (registry+https://github.com/rust-lang/crates.io-index)",
- "pkg-config 0.3.9 (registry+https://github.com/rust-lang/crates.io-index)",
+ "libc 0.2.44 (registry+https://github.com/rust-lang/crates.io-index)",
+ "pkg-config 0.3.14 (registry+https://github.com/rust-lang/crates.io-index)",
]
[[package]]
@@ -216,8 +232,8 @@
dependencies = [
"bitflags 0.9.1 (registry+https://github.com/rust-lang/crates.io-index)",
"glib-sys 0.4.0 (registry+https://github.com/rust-lang/crates.io-index)",
- "libc 0.2.34 (registry+https://github.com/rust-lang/crates.io-index)",
- "pkg-config 0.3.9 (registry+https://github.com/rust-lang/crates.io-index)",
+ "libc 0.2.44 (registry+https://github.com/rust-lang/crates.io-index)",
+ "pkg-config 0.3.14 (registry+https://github.com/rust-lang/crates.io-index)",
]
[[package]]
@@ -249,12 +265,12 @@
[[package]]
name = "lazy_static"
-version = "1.0.0"
+version = "1.2.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
[[package]]
name = "libc"
-version = "0.2.34"
+version = "0.2.44"
source = "registry+https://github.com/rust-lang/crates.io-index"
[[package]]
@@ -270,12 +286,11 @@
[[package]]
name = "libloading"
-version = "0.4.3"
+version = "0.5.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
dependencies = [
- "kernel32-sys 0.2.2 (registry+https://github.com/rust-lang/crates.io-index)",
- "lazy_static 1.0.0 (registry+https://github.com/rust-lang/crates.io-index)",
- "winapi 0.2.8 (registry+https://github.com/rust-lang/crates.io-index)",
+ "cc 1.0.25 (registry+https://github.com/rust-lang/crates.io-index)",
+ "winapi 0.3.6 (registry+https://github.com/rust-lang/crates.io-index)",
]
[[package]]
@@ -283,15 +298,15 @@
version = "0.3.9"
source = "registry+https://github.com/rust-lang/crates.io-index"
dependencies = [
- "log 0.4.0 (registry+https://github.com/rust-lang/crates.io-index)",
+ "log 0.4.6 (registry+https://github.com/rust-lang/crates.io-index)",
]
[[package]]
name = "log"
-version = "0.4.0"
+version = "0.4.6"
source = "registry+https://github.com/rust-lang/crates.io-index"
dependencies = [
- "cfg-if 0.1.2 (registry+https://github.com/rust-lang/crates.io-index)",
+ "cfg-if 0.1.6 (registry+https://github.com/rust-lang/crates.io-index)",
]
[[package]]
@@ -299,7 +314,7 @@
version = "0.1.11"
source = "registry+https://github.com/rust-lang/crates.io-index"
dependencies = [
- "libc 0.2.34 (registry+https://github.com/rust-lang/crates.io-index)",
+ "libc 0.2.44 (registry+https://github.com/rust-lang/crates.io-index)",
]
[[package]]
@@ -308,8 +323,8 @@
source = "registry+https://github.com/rust-lang/crates.io-index"
dependencies = [
"bitflags 0.4.0 (registry+https://github.com/rust-lang/crates.io-index)",
- "cfg-if 0.1.2 (registry+https://github.com/rust-lang/crates.io-index)",
- "libc 0.2.34 (registry+https://github.com/rust-lang/crates.io-index)",
+ "cfg-if 0.1.6 (registry+https://github.com/rust-lang/crates.io-index)",
+ "libc 0.2.44 (registry+https://github.com/rust-lang/crates.io-index)",
"rustc_version 0.1.7 (registry+https://github.com/rust-lang/crates.io-index)",
"semver 0.1.20 (registry+https://github.com/rust-lang/crates.io-index)",
"void 1.0.2 (registry+https://github.com/rust-lang/crates.io-index)",
@@ -321,79 +336,123 @@
source = "registry+https://github.com/rust-lang/crates.io-index"
dependencies = [
"bitflags 0.9.1 (registry+https://github.com/rust-lang/crates.io-index)",
- "cfg-if 0.1.2 (registry+https://github.com/rust-lang/crates.io-index)",
- "libc 0.2.34 (registry+https://github.com/rust-lang/crates.io-index)",
+ "cfg-if 0.1.6 (registry+https://github.com/rust-lang/crates.io-index)",
+ "libc 0.2.44 (registry+https://github.com/rust-lang/crates.io-index)",
"void 1.0.2 (registry+https://github.com/rust-lang/crates.io-index)",
]
[[package]]
name = "num-traits"
-version = "0.1.41"
+version = "0.1.43"
+source = "registry+https://github.com/rust-lang/crates.io-index"
+dependencies = [
+ "num-traits 0.2.6 (registry+https://github.com/rust-lang/crates.io-index)",
+]
+
+[[package]]
+name = "num-traits"
+version = "0.2.6"
source = "registry+https://github.com/rust-lang/crates.io-index"
[[package]]
name = "ordermap"
-version = "0.3.2"
+version = "0.3.5"
source = "registry+https://github.com/rust-lang/crates.io-index"
[[package]]
name = "petgraph"
-version = "0.4.10"
+version = "0.4.13"
source = "registry+https://github.com/rust-lang/crates.io-index"
dependencies = [
- "fixedbitset 0.1.8 (registry+https://github.com/rust-lang/crates.io-index)",
- "ordermap 0.3.2 (registry+https://github.com/rust-lang/crates.io-index)",
+ "fixedbitset 0.1.9 (registry+https://github.com/rust-lang/crates.io-index)",
+ "ordermap 0.3.5 (registry+https://github.com/rust-lang/crates.io-index)",
]
[[package]]
name = "phf"
-version = "0.7.21"
+version = "0.7.23"
source = "registry+https://github.com/rust-lang/crates.io-index"
dependencies = [
- "phf_shared 0.7.21 (registry+https://github.com/rust-lang/crates.io-index)",
+ "phf_shared 0.7.23 (registry+https://github.com/rust-lang/crates.io-index)",
]
[[package]]
name = "phf_codegen"
-version = "0.7.21"
+version = "0.7.23"
source = "registry+https://github.com/rust-lang/crates.io-index"
dependencies = [
- "phf_generator 0.7.21 (registry+https://github.com/rust-lang/crates.io-index)",
- "phf_shared 0.7.21 (registry+https://github.com/rust-lang/crates.io-index)",
+ "phf_generator 0.7.23 (registry+https://github.com/rust-lang/crates.io-index)",
+ "phf_shared 0.7.23 (registry+https://github.com/rust-lang/crates.io-index)",
]
[[package]]
name = "phf_generator"
-version = "0.7.21"
+version = "0.7.23"
source = "registry+https://github.com/rust-lang/crates.io-index"
dependencies = [
- "phf_shared 0.7.21 (registry+https://github.com/rust-lang/crates.io-index)",
- "rand 0.3.19 (registry+https://github.com/rust-lang/crates.io-index)",
+ "phf_shared 0.7.23 (registry+https://github.com/rust-lang/crates.io-index)",
+ "rand 0.5.5 (registry+https://github.com/rust-lang/crates.io-index)",
]
[[package]]
name = "phf_shared"
-version = "0.7.21"
+version = "0.7.23"
source = "registry+https://github.com/rust-lang/crates.io-index"
dependencies = [
- "siphasher 0.2.2 (registry+https://github.com/rust-lang/crates.io-index)",
+ "siphasher 0.2.3 (registry+https://github.com/rust-lang/crates.io-index)",
]
[[package]]
name = "pkg-config"
-version = "0.3.9"
+version = "0.3.14"
+source = "registry+https://github.com/rust-lang/crates.io-index"
+
+[[package]]
+name = "rand"
+version = "0.3.22"
+source = "registry+https://github.com/rust-lang/crates.io-index"
+dependencies = [
+ "fuchsia-zircon 0.3.3 (registry+https://github.com/rust-lang/crates.io-index)",
+ "libc 0.2.44 (registry+https://github.com/rust-lang/crates.io-index)",
+ "rand 0.4.3 (registry+https://github.com/rust-lang/crates.io-index)",
+]
+
+[[package]]
+name = "rand"
+version = "0.4.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
+dependencies = [
+ "fuchsia-zircon 0.3.3 (registry+https://github.com/rust-lang/crates.io-index)",
+ "libc 0.2.44 (registry+https://github.com/rust-lang/crates.io-index)",
+ "winapi 0.3.6 (registry+https://github.com/rust-lang/crates.io-index)",
+]
[[package]]
name = "rand"
-version = "0.3.19"
+version = "0.5.5"
+source = "registry+https://github.com/rust-lang/crates.io-index"
+dependencies = [
+ "cloudabi 0.0.3 (registry+https://github.com/rust-lang/crates.io-index)",
+ "fuchsia-zircon 0.3.3 (registry+https://github.com/rust-lang/crates.io-index)",
+ "libc 0.2.44 (registry+https://github.com/rust-lang/crates.io-index)",
+ "rand_core 0.2.2 (registry+https://github.com/rust-lang/crates.io-index)",
+ "winapi 0.3.6 (registry+https://github.com/rust-lang/crates.io-index)",
+]
+
+[[package]]
+name = "rand_core"
+version = "0.2.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
dependencies = [
- "fuchsia-zircon 0.3.2 (registry+https://github.com/rust-lang/crates.io-index)",
- "libc 0.2.34 (registry+https://github.com/rust-lang/crates.io-index)",
+ "rand_core 0.3.0 (registry+https://github.com/rust-lang/crates.io-index)",
]
[[package]]
+name = "rand_core"
+version = "0.3.0"
+source = "registry+https://github.com/rust-lang/crates.io-index"
+
+[[package]]
name = "regex"
version = "0.1.80"
source = "registry+https://github.com/rust-lang/crates.io-index"
@@ -415,8 +474,8 @@
version = "0.9.7"
source = "registry+https://github.com/rust-lang/crates.io-index"
dependencies = [
- "gcc 0.3.54 (registry+https://github.com/rust-lang/crates.io-index)",
- "libc 0.2.34 (registry+https://github.com/rust-lang/crates.io-index)",
+ "gcc 0.3.55 (registry+https://github.com/rust-lang/crates.io-index)",
+ "libc 0.2.44 (registry+https://github.com/rust-lang/crates.io-index)",
]
[[package]]
@@ -438,7 +497,7 @@
source = "registry+https://github.com/rust-lang/crates.io-index"
dependencies = [
"bitflags 0.7.0 (registry+https://github.com/rust-lang/crates.io-index)",
- "libc 0.2.34 (registry+https://github.com/rust-lang/crates.io-index)",
+ "libc 0.2.44 (registry+https://github.com/rust-lang/crates.io-index)",
"wayland-sys 0.6.0 (registry+https://github.com/rust-lang/crates.io-index)",
]
@@ -457,15 +516,15 @@
version = "0.9.10"
source = "registry+https://github.com/rust-lang/crates.io-index"
dependencies = [
- "dtoa 0.4.2 (registry+https://github.com/rust-lang/crates.io-index)",
+ "dtoa 0.4.3 (registry+https://github.com/rust-lang/crates.io-index)",
"itoa 0.3.4 (registry+https://github.com/rust-lang/crates.io-index)",
- "num-traits 0.1.41 (registry+https://github.com/rust-lang/crates.io-index)",
+ "num-traits 0.1.43 (registry+https://github.com/rust-lang/crates.io-index)",
"serde 0.9.15 (registry+https://github.com/rust-lang/crates.io-index)",
]
[[package]]
name = "siphasher"
-version = "0.2.2"
+version = "0.2.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
[[package]]
@@ -473,8 +532,8 @@
version = "0.3.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
dependencies = [
- "phf 0.7.21 (registry+https://github.com/rust-lang/crates.io-index)",
- "phf_codegen 0.7.21 (registry+https://github.com/rust-lang/crates.io-index)",
+ "phf 0.7.23 (registry+https://github.com/rust-lang/crates.io-index)",
+ "phf_codegen 0.7.23 (registry+https://github.com/rust-lang/crates.io-index)",
"serde_json 0.9.10 (registry+https://github.com/rust-lang/crates.io-index)",
]
@@ -484,7 +543,7 @@
source = "registry+https://github.com/rust-lang/crates.io-index"
dependencies = [
"kernel32-sys 0.2.2 (registry+https://github.com/rust-lang/crates.io-index)",
- "libc 0.2.34 (registry+https://github.com/rust-lang/crates.io-index)",
+ "libc 0.2.44 (registry+https://github.com/rust-lang/crates.io-index)",
]
[[package]]
@@ -501,6 +560,11 @@
source = "registry+https://github.com/rust-lang/crates.io-index"
[[package]]
+name = "unicode-width"
+version = "0.1.5"
+source = "registry+https://github.com/rust-lang/crates.io-index"
+
+[[package]]
name = "utf8-ranges"
version = "0.1.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
@@ -510,7 +574,7 @@
version = "0.3.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
dependencies = [
- "rand 0.3.19 (registry+https://github.com/rust-lang/crates.io-index)",
+ "rand 0.3.22 (registry+https://github.com/rust-lang/crates.io-index)",
"rustc-serialize 0.3.24 (registry+https://github.com/rust-lang/crates.io-index)",
]
@@ -521,7 +585,7 @@
[[package]]
name = "way-cooler"
-version = "0.8.0"
+version = "0.8.1"
dependencies = [
"bitflags 0.7.0 (registry+https://github.com/rust-lang/crates.io-index)",
"cairo-rs 0.2.0 (registry+https://github.com/rust-lang/crates.io-index)",
@@ -531,26 +595,26 @@
"dummy-rustwlc 0.7.1 (registry+https://github.com/rust-lang/crates.io-index)",
"env_logger 0.3.5 (registry+https://github.com/rust-lang/crates.io-index)",
"gdk-pixbuf 0.2.0 (registry+https://github.com/rust-lang/crates.io-index)",
- "getopts 0.2.15 (registry+https://github.com/rust-lang/crates.io-index)",
+ "getopts 0.2.18 (registry+https://github.com/rust-lang/crates.io-index)",
"glib 0.3.1 (registry+https://github.com/rust-lang/crates.io-index)",
"json_macro 0.1.1 (registry+https://github.com/rust-lang/crates.io-index)",
"lazy_static 0.2.11 (registry+https://github.com/rust-lang/crates.io-index)",
"log 0.3.9 (registry+https://github.com/rust-lang/crates.io-index)",
"nix 0.6.0 (registry+https://github.com/rust-lang/crates.io-index)",
- "petgraph 0.4.10 (registry+https://github.com/rust-lang/crates.io-index)",
+ "petgraph 0.4.13 (registry+https://github.com/rust-lang/crates.io-index)",
"rlua 0.9.7 (registry+https://github.com/rust-lang/crates.io-index)",
"rustc-serialize 0.3.24 (registry+https://github.com/rust-lang/crates.io-index)",
"rustwlc 0.7.0 (registry+https://github.com/rust-lang/crates.io-index)",
"uuid 0.3.1 (registry+https://github.com/rust-lang/crates.io-index)",
- "wayland-scanner 0.12.4 (registry+https://github.com/rust-lang/crates.io-index)",
- "wayland-server 0.12.4 (registry+https://github.com/rust-lang/crates.io-index)",
- "wayland-sys 0.12.4 (registry+https://github.com/rust-lang/crates.io-index)",
- "xcb 0.8.1 (registry+https://github.com/rust-lang/crates.io-index)",
+ "wayland-scanner 0.12.5 (registry+https://github.com/rust-lang/crates.io-index)",
+ "wayland-server 0.12.5 (registry+https://github.com/rust-lang/crates.io-index)",
+ "wayland-sys 0.12.5 (registry+https://github.com/rust-lang/crates.io-index)",
+ "xcb 0.8.2 (registry+https://github.com/rust-lang/crates.io-index)",
]
[[package]]
name = "wayland-scanner"
-version = "0.12.4"
+version = "0.12.5"
source = "registry+https://github.com/rust-lang/crates.io-index"
dependencies = [
"xml-rs 0.7.0 (registry+https://github.com/rust-lang/crates.io-index)",
@@ -558,15 +622,15 @@
[[package]]
name = "wayland-server"
-version = "0.12.4"
+version = "0.12.5"
source = "registry+https://github.com/rust-lang/crates.io-index"
dependencies = [
- "bitflags 1.0.1 (registry+https://github.com/rust-lang/crates.io-index)",
- "libc 0.2.34 (registry+https://github.com/rust-lang/crates.io-index)",
+ "bitflags 1.0.4 (registry+https://github.com/rust-lang/crates.io-index)",
+ "libc 0.2.44 (registry+https://github.com/rust-lang/crates.io-index)",
"nix 0.9.0 (registry+https://github.com/rust-lang/crates.io-index)",
"token_store 0.1.2 (registry+https://github.com/rust-lang/crates.io-index)",
- "wayland-scanner 0.12.4 (registry+https://github.com/rust-lang/crates.io-index)",
- "wayland-sys 0.12.4 (registry+https://github.com/rust-lang/crates.io-index)",
+ "wayland-scanner 0.12.5 (registry+https://github.com/rust-lang/crates.io-index)",
+ "wayland-sys 0.12.5 (registry+https://github.com/rust-lang/crates.io-index)",
]
[[package]]
@@ -575,7 +639,7 @@
source = "registry+https://github.com/rust-lang/crates.io-index"
dependencies = [
"dlib 0.3.1 (registry+https://github.com/rust-lang/crates.io-index)",
- "libc 0.2.34 (registry+https://github.com/rust-lang/crates.io-index)",
+ "libc 0.2.44 (registry+https://github.com/rust-lang/crates.io-index)",
]
[[package]]
@@ -585,17 +649,17 @@
dependencies = [
"dlib 0.3.1 (registry+https://github.com/rust-lang/crates.io-index)",
"lazy_static 0.2.11 (registry+https://github.com/rust-lang/crates.io-index)",
- "libc 0.2.34 (registry+https://github.com/rust-lang/crates.io-index)",
+ "libc 0.2.44 (registry+https://github.com/rust-lang/crates.io-index)",
]
[[package]]
name = "wayland-sys"
-version = "0.12.4"
+version = "0.12.5"
source = "registry+https://github.com/rust-lang/crates.io-index"
dependencies = [
- "dlib 0.4.0 (registry+https://github.com/rust-lang/crates.io-index)",
- "lazy_static 0.2.11 (registry+https://github.com/rust-lang/crates.io-index)",
- "libc 0.2.34 (registry+https://github.com/rust-lang/crates.io-index)",
+ "dlib 0.4.1 (registry+https://github.com/rust-lang/crates.io-index)",
+ "lazy_static 1.2.0 (registry+https://github.com/rust-lang/crates.io-index)",
+ "libc 0.2.44 (registry+https://github.com/rust-lang/crates.io-index)",
]
[[package]]
@@ -604,17 +668,36 @@
source = "registry+https://github.com/rust-lang/crates.io-index"
[[package]]
+name = "winapi"
+version = "0.3.6"
+source = "registry+https://github.com/rust-lang/crates.io-index"
+dependencies = [
+ "winapi-i686-pc-windows-gnu 0.4.0 (registry+https://github.com/rust-lang/crates.io-index)",
+ "winapi-x86_64-pc-windows-gnu 0.4.0 (registry+https://github.com/rust-lang/crates.io-index)",
+]
+
+[[package]]
name = "winapi-build"
version = "0.1.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
[[package]]
+name = "winapi-i686-pc-windows-gnu"
+version = "0.4.0"
+source = "registry+https://github.com/rust-lang/crates.io-index"
+
+[[package]]
+name = "winapi-x86_64-pc-windows-gnu"
+version = "0.4.0"
+source = "registry+https://github.com/rust-lang/crates.io-index"
+
+[[package]]
name = "xcb"
-version = "0.8.1"
+version = "0.8.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
dependencies = [
- "libc 0.2.34 (registry+https://github.com/rust-lang/crates.io-index)",
- "log 0.3.9 (registry+https://github.com/rust-lang/crates.io-index)",
+ "libc 0.2.44 (registry+https://github.com/rust-lang/crates.io-index)",
+ "log 0.4.6 (registry+https://github.com/rust-lang/crates.io-index)",
]
[[package]]
@@ -622,7 +705,7 @@
version = "0.7.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
dependencies = [
- "bitflags 1.0.1 (registry+https://github.com/rust-lang/crates.io-index)",
+ "bitflags 1.0.4 (registry+https://github.com/rust-lang/crates.io-index)",
]
[metadata]
@@ -631,25 +714,27 @@
"checksum bitflags 0.6.0 (registry+https://github.com/rust-lang/crates.io-index)" = "72cd7314bd4ee024071241147222c706e80385a1605ac7d4cd2fcc339da2ae46"
"checksum bitflags 0.7.0 (registry+https://github.com/rust-lang/crates.io-index)" = "aad18937a628ec6abcd26d1489012cc0e18c21798210f491af69ded9b881106d"
"checksum bitflags 0.9.1 (registry+https://github.com/rust-lang/crates.io-index)" = "4efd02e230a02e18f92fc2735f44597385ed02ad8f831e7c1c1156ee5e1ab3a5"
-"checksum bitflags 1.0.1 (registry+https://github.com/rust-lang/crates.io-index)" = "b3c30d3802dfb7281680d6285f2ccdaa8c2d8fee41f93805dba5c4cf50dc23cf"
+"checksum bitflags 1.0.4 (registry+https://github.com/rust-lang/crates.io-index)" = "228047a76f468627ca71776ecdebd732a3423081fcf5125585bcd7c49886ce12"
"checksum c_vec 1.2.1 (registry+https://github.com/rust-lang/crates.io-index)" = "6237ac5a4b1e81c213c24c6437964c61e646df910a914b4ab1487b46df20bd13"
"checksum cairo-rs 0.2.0 (registry+https://github.com/rust-lang/crates.io-index)" = "a9d336f1b2ff46c17475a14360de7f456707008da475c54824887e52e453ab00"
"checksum cairo-sys-rs 0.4.0 (registry+https://github.com/rust-lang/crates.io-index)" = "9e8a1e2a76ac09b959788c2c30a355d693ce6f7f7d7268f6d1dd5d8c3359c521"
-"checksum cfg-if 0.1.2 (registry+https://github.com/rust-lang/crates.io-index)" = "d4c819a1287eb618df47cc647173c5c4c66ba19d888a6e50d605672aed3140de"
+"checksum cc 1.0.25 (registry+https://github.com/rust-lang/crates.io-index)" = "f159dfd43363c4d08055a07703eb7a3406b0dac4d0584d96965a3262db3c9d16"
+"checksum cfg-if 0.1.6 (registry+https://github.com/rust-lang/crates.io-index)" = "082bb9b28e00d3c9d39cc03e64ce4cea0f1bb9b3fde493f0cbc008472d22bdf4"
+"checksum cloudabi 0.0.3 (registry+https://github.com/rust-lang/crates.io-index)" = "ddfc5b9aa5d4507acaf872de71051dfd0e309860e88966e1051e462a077aac4f"
"checksum dbus 0.4.1 (registry+https://github.com/rust-lang/crates.io-index)" = "58ec7b4cac6f79f36af1cd9cfdb9b935fc5a4e899f494ee03a3a6165f7d10b4b"
"checksum dbus-macros 0.0.6 (registry+https://github.com/rust-lang/crates.io-index)" = "e1e013b945c57472e5c016f3695114b00c774f03feed9b03df78a9759bb42beb"
"checksum dlib 0.3.1 (registry+https://github.com/rust-lang/crates.io-index)" = "148bce4ce1c36c4509f29cb54e62c2bd265551a9b00b38070fad551a851866ec"
-"checksum dlib 0.4.0 (registry+https://github.com/rust-lang/crates.io-index)" = "95518d8f88d556e62c9b3014629f21bdad97a9fdfee85c68a185e3980af29e7c"
-"checksum dtoa 0.4.2 (registry+https://github.com/rust-lang/crates.io-index)" = "09c3753c3db574d215cba4ea76018483895d7bff25a31b49ba45db21c48e50ab"
+"checksum dlib 0.4.1 (registry+https://github.com/rust-lang/crates.io-index)" = "77e51249a9d823a4cb79e3eca6dcd756153e8ed0157b6c04775d04bf1b13b76a"
+"checksum dtoa 0.4.3 (registry+https://github.com/rust-lang/crates.io-index)" = "6d301140eb411af13d3115f9a562c85cc6b541ade9dfa314132244aaee7489dd"
"checksum dummy-rustwlc 0.7.1 (registry+https://github.com/rust-lang/crates.io-index)" = "0dff1c82c8a98a5a3c0e1fb92e7c3090d7bbb1bb44266e420b9034ee58b42cb0"
"checksum env_logger 0.3.5 (registry+https://github.com/rust-lang/crates.io-index)" = "15abd780e45b3ea4f76b4e9a26ff4843258dd8a3eed2775a0e7368c2e7936c2f"
-"checksum fixedbitset 0.1.8 (registry+https://github.com/rust-lang/crates.io-index)" = "85cb8fec437468d86dc7c83ca7cfc933341d561873275f22dd5eedefa63a6478"
-"checksum fuchsia-zircon 0.3.2 (registry+https://github.com/rust-lang/crates.io-index)" = "bd510087c325af53ba24f3be8f1c081b0982319adcb8b03cad764512923ccc19"
-"checksum fuchsia-zircon-sys 0.3.2 (registry+https://github.com/rust-lang/crates.io-index)" = "08b3a6f13ad6b96572b53ce7af74543132f1a7055ccceb6d073dd36c54481859"
-"checksum gcc 0.3.54 (registry+https://github.com/rust-lang/crates.io-index)" = "5e33ec290da0d127825013597dbdfc28bee4964690c7ce1166cbc2a7bd08b1bb"
+"checksum fixedbitset 0.1.9 (registry+https://github.com/rust-lang/crates.io-index)" = "86d4de0081402f5e88cdac65c8dcdcc73118c1a7a465e2a05f0da05843a8ea33"
+"checksum fuchsia-zircon 0.3.3 (registry+https://github.com/rust-lang/crates.io-index)" = "2e9763c69ebaae630ba35f74888db465e49e259ba1bc0eda7d06f4a067615d82"
+"checksum fuchsia-zircon-sys 0.3.3 (registry+https://github.com/rust-lang/crates.io-index)" = "3dcaa9ae7725d12cdb85b3ad99a434db70b468c09ded17e012d86b5c1010f7a7"
+"checksum gcc 0.3.55 (registry+https://github.com/rust-lang/crates.io-index)" = "8f5f3913fa0bfe7ee1fd8248b6b9f42a5af4b9d65ec2dd2c3c26132b950ecfc2"
"checksum gdk-pixbuf 0.2.0 (registry+https://github.com/rust-lang/crates.io-index)" = "caf05dab73febcc6e90abaff8f24cfe1cf1bd2222cd648ddfe337bf3b994489f"
"checksum gdk-pixbuf-sys 0.4.0 (registry+https://github.com/rust-lang/crates.io-index)" = "85eb441420653b33e5a29d13227ea34995383e65bf4f33b16492ec95e44a8996"
-"checksum getopts 0.2.15 (registry+https://github.com/rust-lang/crates.io-index)" = "65922871abd2f101a2eb0eaebadc66668e54a87ad9c3dd82520b5f86ede5eff9"
+"checksum getopts 0.2.18 (registry+https://github.com/rust-lang/crates.io-index)" = "0a7292d30132fb5424b354f5dc02512a86e4c516fe544bb7a25e7f266951b797"
"checksum gio-sys 0.4.0 (registry+https://github.com/rust-lang/crates.io-index)" = "189969f8189604c371d42b613d928c9d17fcfbf6e175d6b0ce9475a950f76dc6"
"checksum glib 0.3.1 (registry+https://github.com/rust-lang/crates.io-index)" = "d4da1d7f4bdc5c708d8ce4df1ac440dcb2f9d97d937c989032185a48aeef1d10"
"checksum glib-sys 0.4.0 (registry+https://github.com/rust-lang/crates.io-index)" = "cdd7d911c5dc610aabe37caae7d3b9d2cfe6d8f4c85ff4c062f3d6f490e75067"
@@ -658,24 +743,29 @@
"checksum json_macro 0.1.1 (registry+https://github.com/rust-lang/crates.io-index)" = "cae828a10b461ca79f7a1e366e4aead4c4bd8fc9c39a57546c39cbda8ee3a0c1"
"checksum kernel32-sys 0.2.2 (registry+https://github.com/rust-lang/crates.io-index)" = "7507624b29483431c0ba2d82aece8ca6cdba9382bff4ddd0f7490560c056098d"
"checksum lazy_static 0.2.11 (registry+https://github.com/rust-lang/crates.io-index)" = "76f033c7ad61445c5b347c7382dd1237847eb1bce590fe50365dcb33d546be73"
-"checksum lazy_static 1.0.0 (registry+https://github.com/rust-lang/crates.io-index)" = "c8f31047daa365f19be14b47c29df4f7c3b581832407daabe6ae77397619237d"
-"checksum libc 0.2.34 (registry+https://github.com/rust-lang/crates.io-index)" = "36fbc8a8929c632868295d0178dd8f63fc423fd7537ad0738372bd010b3ac9b0"
+"checksum lazy_static 1.2.0 (registry+https://github.com/rust-lang/crates.io-index)" = "a374c89b9db55895453a74c1e38861d9deec0b01b405a82516e9d5de4820dea1"
+"checksum libc 0.2.44 (registry+https://github.com/rust-lang/crates.io-index)" = "10923947f84a519a45c8fefb7dd1b3e8c08747993381adee176d7a82b4195311"
"checksum libloading 0.3.4 (registry+https://github.com/rust-lang/crates.io-index)" = "0a020ac941774eb37e9d13d418c37b522e76899bfc4e7b1a600d529a53f83a66"
-"checksum libloading 0.4.3 (registry+https://github.com/rust-lang/crates.io-index)" = "fd38073de8f7965d0c17d30546d4bb6da311ab428d1c7a3fc71dff7f9d4979b9"
+"checksum libloading 0.5.0 (registry+https://github.com/rust-lang/crates.io-index)" = "9c3ad660d7cb8c5822cd83d10897b0f1f1526792737a179e73896152f85b88c2"
"checksum log 0.3.9 (registry+https://github.com/rust-lang/crates.io-index)" = "e19e8d5c34a3e0e2223db8e060f9e8264aeeb5c5fc64a4ee9965c062211c024b"
-"checksum log 0.4.0 (registry+https://github.com/rust-lang/crates.io-index)" = "b3a89a0c46ba789b8a247d4c567aed4d7c68e624672d238b45cc3ec20dc9f940"
+"checksum log 0.4.6 (registry+https://github.com/rust-lang/crates.io-index)" = "c84ec4b527950aa83a329754b01dbe3f58361d1c5efacd1f6d68c494d08a17c6"
"checksum memchr 0.1.11 (registry+https://github.com/rust-lang/crates.io-index)" = "d8b629fb514376c675b98c1421e80b151d3817ac42d7c667717d282761418d20"
"checksum nix 0.6.0 (registry+https://github.com/rust-lang/crates.io-index)" = "7a7bb1da2be7da3cbffda73fc681d509ffd9e665af478d2bee1907cee0bc64b2"
"checksum nix 0.9.0 (registry+https://github.com/rust-lang/crates.io-index)" = "a2c5afeb0198ec7be8569d666644b574345aad2e95a53baf3a532da3e0f3fb32"
-"checksum num-traits 0.1.41 (registry+https://github.com/rust-lang/crates.io-index)" = "cacfcab5eb48250ee7d0c7896b51a2c5eec99c1feea5f32025635f5ae4b00070"
-"checksum ordermap 0.3.2 (registry+https://github.com/rust-lang/crates.io-index)" = "40cd62c688b4d8078c3f0eef70d7762f17c08bf52b225799ddcb4cf275dd1f19"
-"checksum petgraph 0.4.10 (registry+https://github.com/rust-lang/crates.io-index)" = "28d0872a49ce3ee71b345f4fa675afe394d9e0d077f8eeeb3d04081724065d67"
-"checksum phf 0.7.21 (registry+https://github.com/rust-lang/crates.io-index)" = "cb325642290f28ee14d8c6201159949a872f220c62af6e110a56ea914fbe42fc"
-"checksum phf_codegen 0.7.21 (registry+https://github.com/rust-lang/crates.io-index)" = "d62594c0bb54c464f633175d502038177e90309daf2e0158be42ed5f023ce88f"
-"checksum phf_generator 0.7.21 (registry+https://github.com/rust-lang/crates.io-index)" = "6b07ffcc532ccc85e3afc45865469bf5d9e4ef5bfcf9622e3cfe80c2d275ec03"
-"checksum phf_shared 0.7.21 (registry+https://github.com/rust-lang/crates.io-index)" = "07e24b0ca9643bdecd0632f2b3da6b1b89bbb0030e0b992afc1113b23a7bc2f2"
-"checksum pkg-config 0.3.9 (registry+https://github.com/rust-lang/crates.io-index)" = "3a8b4c6b8165cd1a1cd4b9b120978131389f64bdaf456435caa41e630edba903"
-"checksum rand 0.3.19 (registry+https://github.com/rust-lang/crates.io-index)" = "9e7944d95d25ace8f377da3ac7068ce517e4c646754c43a1b1849177bbf72e59"
+"checksum num-traits 0.1.43 (registry+https://github.com/rust-lang/crates.io-index)" = "92e5113e9fd4cc14ded8e499429f396a20f98c772a47cc8622a736e1ec843c31"
+"checksum num-traits 0.2.6 (registry+https://github.com/rust-lang/crates.io-index)" = "0b3a5d7cc97d6d30d8b9bc8fa19bf45349ffe46241e8816f50f62f6d6aaabee1"
+"checksum ordermap 0.3.5 (registry+https://github.com/rust-lang/crates.io-index)" = "a86ed3f5f244b372d6b1a00b72ef7f8876d0bc6a78a4c9985c53614041512063"
+"checksum petgraph 0.4.13 (registry+https://github.com/rust-lang/crates.io-index)" = "9c3659d1ee90221741f65dd128d9998311b0e40c5d3c23a62445938214abce4f"
+"checksum phf 0.7.23 (registry+https://github.com/rust-lang/crates.io-index)" = "cec29da322b242f4c3098852c77a0ca261c9c01b806cae85a5572a1eb94db9a6"
+"checksum phf_codegen 0.7.23 (registry+https://github.com/rust-lang/crates.io-index)" = "7d187f00cd98d5afbcd8898f6cf181743a449162aeb329dcd2f3849009e605ad"
+"checksum phf_generator 0.7.23 (registry+https://github.com/rust-lang/crates.io-index)" = "03dc191feb9b08b0dc1330d6549b795b9d81aec19efe6b4a45aec8d4caee0c4b"
+"checksum phf_shared 0.7.23 (registry+https://github.com/rust-lang/crates.io-index)" = "b539898d22d4273ded07f64a05737649dc69095d92cb87c7097ec68e3f150b93"
+"checksum pkg-config 0.3.14 (registry+https://github.com/rust-lang/crates.io-index)" = "676e8eb2b1b4c9043511a9b7bea0915320d7e502b0a079fb03f9635a5252b18c"
+"checksum rand 0.3.22 (registry+https://github.com/rust-lang/crates.io-index)" = "15a732abf9d20f0ad8eeb6f909bf6868722d9a06e1e50802b6a70351f40b4eb1"
+"checksum rand 0.4.3 (registry+https://github.com/rust-lang/crates.io-index)" = "8356f47b32624fef5b3301c1be97e5944ecdd595409cc5da11d05f211db6cfbd"
+"checksum rand 0.5.5 (registry+https://github.com/rust-lang/crates.io-index)" = "e464cd887e869cddcae8792a4ee31d23c7edd516700695608f5b98c67ee0131c"
+"checksum rand_core 0.2.2 (registry+https://github.com/rust-lang/crates.io-index)" = "1961a422c4d189dfb50ffa9320bf1f2a9bd54ecb92792fb9477f99a1045f3372"
+"checksum rand_core 0.3.0 (registry+https://github.com/rust-lang/crates.io-index)" = "0905b6b7079ec73b314d4c748701f6931eb79fd97c668caa3f1899b22b32c6db"
"checksum regex 0.1.80 (registry+https://github.com/rust-lang/crates.io-index)" = "4fd4ace6a8cf7860714a2c2280d6c1f7e6a413486c13298bbc86fd3da019402f"
"checksum regex-syntax 0.3.9 (registry+https://github.com/rust-lang/crates.io-index)" = "f9ec002c35e86791825ed294b50008eea9ddfc8def4420124fbc6b08db834957"
"checksum rlua 0.9.7 (registry+https://github.com/rust-lang/crates.io-index)" = "fb7eec2f28b4812553f10b2e69888a5a674b2e761f5a39731c14541c32bdb6d8"
@@ -685,20 +775,24 @@
"checksum semver 0.1.20 (registry+https://github.com/rust-lang/crates.io-index)" = "d4f410fedcf71af0345d7607d246e7ad15faaadd49d240ee3b24e5dc21a820ac"
"checksum serde 0.9.15 (registry+https://github.com/rust-lang/crates.io-index)" = "34b623917345a631dc9608d5194cc206b3fe6c3554cd1c75b937e55e285254af"
"checksum serde_json 0.9.10 (registry+https://github.com/rust-lang/crates.io-index)" = "ad8bcf487be7d2e15d3d543f04312de991d631cfe1b43ea0ade69e6a8a5b16a1"
-"checksum siphasher 0.2.2 (registry+https://github.com/rust-lang/crates.io-index)" = "0df90a788073e8d0235a67e50441d47db7c8ad9debd91cbf43736a2a92d36537"
+"checksum siphasher 0.2.3 (registry+https://github.com/rust-lang/crates.io-index)" = "0b8de496cf83d4ed58b6be86c3a275b8602f6ffe98d3024a869e124147a9a3ac"
"checksum target_build_utils 0.3.1 (registry+https://github.com/rust-lang/crates.io-index)" = "013d134ae4a25ee744ad6129db589018558f620ddfa44043887cdd45fa08e75c"
"checksum thread-id 2.0.0 (registry+https://github.com/rust-lang/crates.io-index)" = "a9539db560102d1cef46b8b78ce737ff0bb64e7e18d35b2a5688f7d097d0ff03"
"checksum thread_local 0.2.7 (registry+https://github.com/rust-lang/crates.io-index)" = "8576dbbfcaef9641452d5cf0df9b0e7eeab7694956dd33bb61515fb8f18cfdd5"
"checksum token_store 0.1.2 (registry+https://github.com/rust-lang/crates.io-index)" = "a686838375fc11103b9c1529c6508320b7bd5e2401cd62831ca51b3e82e61849"
+"checksum unicode-width 0.1.5 (registry+https://github.com/rust-lang/crates.io-index)" = "882386231c45df4700b275c7ff55b6f3698780a650026380e72dabe76fa46526"
"checksum utf8-ranges 0.1.3 (registry+https://github.com/rust-lang/crates.io-index)" = "a1ca13c08c41c9c3e04224ed9ff80461d97e121589ff27c753a16cb10830ae0f"
"checksum uuid 0.3.1 (registry+https://github.com/rust-lang/crates.io-index)" = "1a9ff57156caf7e22f37baf3c9d8f6ce8194842c23419dafcb0716024514d162"
"checksum void 1.0.2 (registry+https://github.com/rust-lang/crates.io-index)" = "6a02e4885ed3bc0f2de90ea6dd45ebcbb66dacffe03547fadbb0eeae2770887d"
-"checksum wayland-scanner 0.12.4 (registry+https://github.com/rust-lang/crates.io-index)" = "64b0d284bc45314185ce89780d52e9f40c94e9daa958ff03203cc7e5672734e6"
-"checksum wayland-server 0.12.4 (registry+https://github.com/rust-lang/crates.io-index)" = "db9e55f96552f808a74c4314ede86ed667797412adf8ec1fdd03e99239ffd829"
-"checksum wayland-sys 0.12.4 (registry+https://github.com/rust-lang/crates.io-index)" = "b92ae34392c489c4223afde3048ead461b06bdaeb6e50beb86fb63944e341c1b"
+"checksum wayland-scanner 0.12.5 (registry+https://github.com/rust-lang/crates.io-index)" = "dcffa55a621e6f2c3d436de64d840fc325e1d0a467b92ee5e7292e17552e08ad"
+"checksum wayland-server 0.12.5 (registry+https://github.com/rust-lang/crates.io-index)" = "c7fad257bdd075cd9cf7c43b28bb6f0559a88e536bf8a2085ed028cca0f6279e"
+"checksum wayland-sys 0.12.5 (registry+https://github.com/rust-lang/crates.io-index)" = "377a2f83063c463e801ca10ae8cb9666e6e597eecac0049ac36cc7b9a83b0db3"
"checksum wayland-sys 0.6.0 (registry+https://github.com/rust-lang/crates.io-index)" = "3a6dd94a0fbbd2fa8fdcd95466d602284743adff37dde0250ad1c71f5b60eeeb"
"checksum wayland-sys 0.9.10 (registry+https://github.com/rust-lang/crates.io-index)" = "b433ca9dbd9289a8ae8a5c49148d2a0e724b89432d7648727ca553027c247c47"
"checksum winapi 0.2.8 (registry+https://github.com/rust-lang/crates.io-index)" = "167dc9d6949a9b857f3451275e911c3f44255842c1f7a76f33c55103a909087a"
+"checksum winapi 0.3.6 (registry+https://github.com/rust-lang/crates.io-index)" = "92c1eb33641e276cfa214a0522acad57be5c56b10cb348b3c5117db75f3ac4b0"
"checksum winapi-build 0.1.1 (registry+https://github.com/rust-lang/crates.io-index)" = "2d315eee3b34aca4797b2da6b13ed88266e6d612562a0c46390af8299fc699bc"
-"checksum xcb 0.8.1 (registry+https://github.com/rust-lang/crates.io-index)" = "400cebeaedeca931825f11606874080f18aa51370dd3d7e11bc08d5aac8b3142"
+"checksum winapi-i686-pc-windows-gnu 0.4.0 (registry+https://github.com/rust-lang/crates.io-index)" = "ac3b87c63620426dd9b991e5ce0329eff545bccbbb34f3be09ff6fb6ab51b7b6"
+"checksum winapi-x86_64-pc-windows-gnu 0.4.0 (registry+https://github.com/rust-lang/crates.io-index)" = "712e227841d057c1ee1cd2fb22fa7e5a5461ae8e48fa2ca79ec42cfc1931183f"
+"checksum xcb 0.8.2 (registry+https://github.com/rust-lang/crates.io-index)" = "5e917a3f24142e9ff8be2414e36c649d47d6cc2ba81f16201cdef96e533e02de"
"checksum xml-rs 0.7.0 (registry+https://github.com/rust-lang/crates.io-index)" = "3c1cb601d29fe2c2ac60a2b2e5e293994d87a1f6fa9687a31a15270f909be9c2"
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/way-cooler-0.8.0/Cargo.toml new/way-cooler-0.8.1/Cargo.toml
--- old/way-cooler-0.8.0/Cargo.toml 2018-01-24 23:54:55.000000000 +0100
+++ new/way-cooler-0.8.1/Cargo.toml 2018-11-22 18:42:29.000000000 +0100
@@ -1,7 +1,7 @@
[package]
name = "way-cooler"
description = "Customizeable Wayland compositor written in Rust"
-version = "0.8.0"
+version = "0.8.1"
repository = "https://github.com/Immington-Industries/way-cooler/"
keywords = ["Wayland", "compositor", "window", "manager", "wlc"]
readme = "README.md"
++++++ vendor.tar.xz ++++++
/work/SRC/openSUSE:Factory/way-cooler/vendor.tar.xz /work/SRC/openSUSE:Factory/.way-cooler.new.19453/vendor.tar.xz differ: char 25, line 1
Hello community,
here is the log from the commit of package yast2-instserver for openSUSE:Factory checked in at 2018-11-26 10:30:13
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/yast2-instserver (Old)
and /work/SRC/openSUSE:Factory/.yast2-instserver.new.19453 (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Package is "yast2-instserver"
Mon Nov 26 10:30:13 2018 rev:57 rq:651357 version:4.1.3
Changes:
--------
--- /work/SRC/openSUSE:Factory/yast2-instserver/yast2-instserver.changes 2018-11-01 19:57:08.533915775 +0100
+++ /work/SRC/openSUSE:Factory/.yast2-instserver.new.19453/yast2-instserver.changes 2018-11-26 10:30:56.780982963 +0100
@@ -1,0 +2,7 @@
+Wed Nov 21 16:19:19 UTC 2018 - dgonzalez(a)suse.com
+
+- Use the nfs-server service real name instead an alias
+ (bsc#1116779).
+- 4.1.3
+
+-------------------------------------------------------------------
Old:
----
yast2-instserver-4.1.2.tar.bz2
New:
----
yast2-instserver-4.1.3.tar.bz2
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Other differences:
------------------
++++++ yast2-instserver.spec ++++++
--- /var/tmp/diff_new_pack.CCYS9N/_old 2018-11-26 10:30:57.672981917 +0100
+++ /var/tmp/diff_new_pack.CCYS9N/_new 2018-11-26 10:30:57.676981913 +0100
@@ -17,7 +17,7 @@
Name: yast2-instserver
-Version: 4.1.2
+Version: 4.1.3
Release: 0
BuildRoot: %{_tmppath}/%{name}-%{version}-build
++++++ yast2-instserver-4.1.2.tar.bz2 -> yast2-instserver-4.1.3.tar.bz2 ++++++
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/yast2-instserver-4.1.2/package/yast2-instserver.changes new/yast2-instserver-4.1.3/package/yast2-instserver.changes
--- old/yast2-instserver-4.1.2/package/yast2-instserver.changes 2018-10-16 15:28:04.000000000 +0200
+++ new/yast2-instserver-4.1.3/package/yast2-instserver.changes 2018-11-23 12:19:14.000000000 +0100
@@ -1,4 +1,11 @@
-------------------------------------------------------------------
+Wed Nov 21 16:19:19 UTC 2018 - dgonzalez(a)suse.com
+
+- Use the nfs-server service real name instead an alias
+ (bsc#1116779).
+- 4.1.3
+
+-------------------------------------------------------------------
Tue Oct 16 11:58:29 CEST 2018 - schubi(a)suse.de
- Fixed path to license file. Build error in bsc#1110037.
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/yast2-instserver-4.1.2/package/yast2-instserver.spec new/yast2-instserver-4.1.3/package/yast2-instserver.spec
--- old/yast2-instserver-4.1.2/package/yast2-instserver.spec 2018-10-16 15:28:04.000000000 +0200
+++ new/yast2-instserver-4.1.3/package/yast2-instserver.spec 2018-11-23 12:19:14.000000000 +0100
@@ -17,7 +17,7 @@
Name: yast2-instserver
-Version: 4.1.2
+Version: 4.1.3
Release: 0
BuildRoot: %{_tmppath}/%{name}-%{version}-build
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/yast2-instserver-4.1.2/src/modules/Instserver.rb new/yast2-instserver-4.1.3/src/modules/Instserver.rb
--- old/yast2-instserver-4.1.2/src/modules/Instserver.rb 2018-10-16 15:28:04.000000000 +0200
+++ new/yast2-instserver-4.1.3/src/modules/Instserver.rb 2018-11-23 12:19:14.000000000 +0100
@@ -13,6 +13,8 @@
module Yast
class InstserverClass < Module
+ NFS_SERVER_SERVICE = "nfs-server".freeze
+
def main
textdomain "instserver"
@@ -549,11 +551,11 @@
ConfigureService("nfs_server_auto", nfs)
- Service.Enable("nfsserver")
- if Service.Status("nfsserver") == 0
- Service.Reload("nfsserver")
+ Service.Enable(NFS_SERVER_SERVICE)
+ if Service.Status(NFS_SERVER_SERVICE) == 0
+ Service.Reload(NFS_SERVER_SERVICE)
else
- Service.Start("nfsserver")
+ Service.Start(NFS_SERVER_SERVICE)
end
firewalld.write
@@ -1013,7 +1015,7 @@
# is the directory in /etc/exports?
return false if !NFSExported(dir)
- nfsserver_running = Service.Status("nfsserver") == 0
+ nfsserver_running = Service.Status(NFS_SERVER_SERVICE) == 0
Builtins.y2milestone("NFS server running: %1", nfsserver_running)
# is the nfsserver running?
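The reload-or-start pattern in the hunks above (enable the unit, reload it if already running, otherwise start it) can be sketched as follows; the `service` object here is a hypothetical stand-in for YaST's `Service` module, where a status of 0 means the service is running:

```python
NFS_SERVER_SERVICE = "nfs-server"

def ensure_nfs_running(service):
    """Enable the nfs-server unit, then reload it when it is already
    running (status == 0) or start it otherwise -- mirroring the
    Instserver.rb change above."""
    service.enable(NFS_SERVER_SERVICE)
    if service.status(NFS_SERVER_SERVICE) == 0:
        service.reload(NFS_SERVER_SERVICE)
    else:
        service.start(NFS_SERVER_SERVICE)
```

Using the real unit name rather than the `nfsserver` alias is what bsc#1116779 asked for, since aliases are not guaranteed to resolve on every service backend.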
Hello community,
here is the log from the commit of package azure-cli for openSUSE:Factory checked in at 2018-11-26 10:30:00
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/azure-cli (Old)
and /work/SRC/openSUSE:Factory/.azure-cli.new.19453 (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Package is "azure-cli"
Mon Nov 26 10:30:00 2018 rev:6 rq:651347 version:2.0.45
Changes:
--------
--- /work/SRC/openSUSE:Factory/azure-cli/azure-cli.changes 2018-11-15 12:41:15.102195237 +0100
+++ /work/SRC/openSUSE:Factory/.azure-cli.new.19453/azure-cli.changes 2018-11-26 10:30:51.672988951 +0100
@@ -1,0 +2,5 @@
+Wed Nov 21 16:15:57 UTC 2018 - okurz(a)suse.com
+
+- Add multibuild package self-test
+
+-------------------------------------------------------------------
New:
----
_multibuild
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Other differences:
------------------
++++++ azure-cli.spec ++++++
--- /var/tmp/diff_new_pack.442yTv/_old 2018-11-26 10:30:52.432988060 +0100
+++ /var/tmp/diff_new_pack.442yTv/_new 2018-11-26 10:30:52.436988055 +0100
@@ -12,11 +12,26 @@
# license that conforms to the Open Source Definition (Version 1.9)
# published by the Open Source Initiative.
-# Please submit bugfixes or comments via https://bugs.opensuse.org/
+# Please submit bugfixes or comments via http://bugs.opensuse.org/
#
+# Define just "test" as a package in _multibuild file to distinguish test
+# instructions here
+%if "@BUILD_FLAVOR@" == ""
+%define _test 0
+%define name_ext %nil
+%else
+%define _test 1
+%define name_ext -test
+%endif
+
+%if !%{?_test}
Name: azure-cli
+%else
+Name: azure-cli%{?name_ext}
+%endif
+%define short_name azure-cli
Version: 2.0.45
Release: 0
Summary: Microsoft Azure CLI 2.0
@@ -26,6 +41,9 @@
Source: https://files.pythonhosted.org/packages/source/a/azure-cli/azure-cli-%{vers…
Source1: LICENSE.txt
Patch1: ac_use-python3-by-default.patch
+%if 0%{?_test}
+BuildRequires: %{short_name} = %{version}
+%else
BuildRequires: azure-cli-nspkg
BuildRequires: python3-azure-nspkg
BuildRequires: python3-devel
@@ -83,27 +101,44 @@
Conflicts: azure-cli < 2.0.0
BuildArch: noarch
+%endif
%description
Microsoft Azure CLI 2.0 Command Line Utilities
%prep
+%if 0%{?_test}
+# workaround to prevent post/install failing assuming this file for whatever
+# reason
+touch %{_sourcedir}/%{short_name}
+%else
%setup -q -n azure-cli-%{version}
%patch1 -p1
+%endif
%build
+%if 0%{?_test}
+az --help
+%else
install -m 644 %{SOURCE1} %{_builddir}/azure-cli-%{version}
python3 setup.py build
+%endif
%install
+%if 0%{?_test}
+# disable debug packages in package test to prevent error about missing files
+%define debug_package %{nil}
+%else
python3 setup.py install --root=%{buildroot} --prefix=%{_prefix} --install-lib=%{python3_sitelib}
install -DTm644 %{buildroot}%{_bindir}/az.completion.sh %{buildroot}/etc/bash_completion.d/az.completion.sh
rm -rf %{buildroot}%{python3_sitelib}/azure/cli/__init__.*
rm -rf %{buildroot}%{python3_sitelib}/azure/cli/__pycache__
rm -rf %{buildroot}%{python3_sitelib}/azure/__init__.*
rm -rf %{buildroot}%{python3_sitelib}/azure/__pycache__
+%endif
%files
+%if !%{?_test}
%defattr(-,root,root,-)
%doc HISTORY.rst README.rst
%license LICENSE.txt
@@ -113,5 +148,6 @@
%exclude /usr/bin/az.completion.sh
%{python3_sitelib}/azure/cli
%{python3_sitelib}/azure_cli-*.egg-info
+%endif
%changelog
++++++ _multibuild ++++++
<multibuild>
<package>test</package>
</multibuild>
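The `@BUILD_FLAVOR@` branching added to the spec above can be summarized in a small sketch (a hypothetical helper, assuming the behavior shown in the diff): an empty flavor is the normal package build, while any flavor declared in `_multibuild` (here only `test`) switches `%_test` on and appends the hardcoded `-test` suffix to the package name.

```python
def spec_names(build_flavor: str) -> dict:
    """Mirror the %_test / %name_ext branching in azure-cli.spec:
    empty @BUILD_FLAVOR@ -> regular build, anything else -> self-test."""
    if build_flavor == "":
        return {"_test": 0, "name": "azure-cli"}
    # the spec defines name_ext as the literal "-test" for any flavor
    return {"_test": 1, "name": "azure-cli" + "-test"}
```

In the test flavor the spec then build-requires the already-built `azure-cli` package and simply runs `az --help` as a smoke test instead of rebuilding from source.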
Hello community,
here is the log from the commit of package buildah for openSUSE:Factory checked in at 2018-11-26 10:29:53
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/buildah (Old)
and /work/SRC/openSUSE:Factory/.buildah.new.19453 (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Package is "buildah"
Mon Nov 26 10:29:53 2018 rev:11 rq:651345 version:1.5
Changes:
--------
--- /work/SRC/openSUSE:Factory/buildah/buildah.changes 2018-11-06 14:37:41.248750304 +0100
+++ /work/SRC/openSUSE:Factory/.buildah.new.19453/buildah.changes 2018-11-26 10:30:41.101001344 +0100
@@ -1,0 +2,34 @@
+Fri Nov 23 07:57:58 UTC 2018 - Valentin Rothberg <vrothberg(a)suse.com>
+
+- Changelog for v1.5-1 (2018-11-21)
+ * Print command in SystemExec as debug information
+ * Sort CLI flags
+ * Update imagebuild depdency to support heading ARGs in Dockerfile
+ * rootless: do not specify --rootless to the OCI runtime
+ * Exclude --force-rm from common bud cli flags
+ * run: bind mount /etc/hosts and /etc/resolv.conf if not in a volume
+ * rootless: use slirp4netns to setup the network namespace
+ * rootless: only discard network configuration names
+ * run: only set up /etc/hosts or /etc/resolv.conf with network
+ * Handle directories better in bud -f
+ * common: support a per-user registries conf file
+ * unshare: do not override the configuration
+ * common: honor the rootless configuration file
+ * unshare: create a new mount namespace
+ * unshare: support libpod rootless pkg
+ * Allow container storage to manage the SELinux labels
+ * imagebuilder.BuildDockerfiles: return the image ID
+ * Allow setting --no-pivot default with an env var
+ * Add man page and bash completion, for --no-pivot
+ * Add the --no-pivot flag to the run command
+ * Improve reporting about individual pull failures
+ * Fix From As in Dockerfile
+ * Sort CLI flags of buildah bud
+ * unshare: detect when unprivileged userns are disabled
+ * buildah: use the same logic for XDG_RUNTIME_DIR as podman
+ * Make sure we log or return every error
+ * Correctly set DockerInsecureSkipTLSVerify when pulling images
+ * chroot: set up seccomp and capabilities after supplemental groups
+ * chroot: fix capabilities list setup and application
+
+-------------------------------------------------------------------
Old:
----
buildah-1.4.tar.xz
New:
----
buildah-1.5.tar.xz
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Other differences:
------------------
++++++ buildah.spec ++++++
--- /var/tmp/diff_new_pack.r3mgMl/_old 2018-11-26 10:30:44.872996922 +0100
+++ /var/tmp/diff_new_pack.r3mgMl/_new 2018-11-26 10:30:44.872996922 +0100
@@ -23,7 +23,7 @@
%define with_libostree 1
%endif
Name: buildah
-Version: 1.4
+Version: 1.5
Release: 0
Summary: Tool for building OCI containers
License: Apache-2.0
++++++ _service ++++++
--- /var/tmp/diff_new_pack.r3mgMl/_old 2018-11-26 10:30:44.896996894 +0100
+++ /var/tmp/diff_new_pack.r3mgMl/_new 2018-11-26 10:30:44.896996894 +0100
@@ -4,8 +4,8 @@
<param name="url">https://github.com/containers/buildah.git</param>
<param name="scm">git</param>
<param name="filename">buildah</param>
-<param name="versionformat">1.4</param>
-<param name="revision">v1.4</param>
+<param name="versionformat">1.5</param>
+<param name="revision">v1.5</param>
</service>
<service name="recompress" mode="disabled">
++++++ buildah-1.4.tar.xz -> buildah-1.5.tar.xz ++++++
++++ 9369 lines of diff (skipped)
Hello community,
here is the log from the commit of package python-ipy for openSUSE:Factory checked in at 2018-11-26 10:29:50
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/python-ipy (Old)
and /work/SRC/openSUSE:Factory/.python-ipy.new.19453 (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Package is "python-ipy"
Mon Nov 26 10:29:50 2018 rev:8 rq:651338 version:0.83
Changes:
--------
--- /work/SRC/openSUSE:Factory/python-ipy/python-ipy.changes 2013-12-09 16:59:44.000000000 +0100
+++ /work/SRC/openSUSE:Factory/.python-ipy.new.19453/python-ipy.changes 2018-11-26 10:30:27.601017171 +0100
@@ -1,0 +2,12 @@
+Thu Nov 22 14:26:39 UTC 2018 - jsegitz(a)suse.com
+
+- Update to version 0.83
+ * Add carrier grade NAT ranges
+ * Unbreak lots of packaging systems by not having a letter in the release version
+ * Correct x.next() -> next(x) for Python 3 compatibility
+ * Add support for array slices
+ * Add __and__ and isdisjoint for IPSet
+ * Fix a bug in IPSet where contains may incorrectly return false
+- Moved to singlespec and added explicit license
+
+-------------------------------------------------------------------
Old:
----
IPy-0.81.tar.gz
New:
----
IPy-0.83.tar.gz
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Other differences:
------------------
++++++ python-ipy.spec ++++++
--- /var/tmp/diff_new_pack.SHcjF4/_old 2018-11-26 10:30:29.861014521 +0100
+++ /var/tmp/diff_new_pack.SHcjF4/_new 2018-11-26 10:30:29.861014521 +0100
@@ -1,7 +1,7 @@
#
# spec file for package python-ipy
#
-# Copyright (c) 2013 SUSE LINUX Products GmbH, Nuernberg, Germany.
+# Copyright (c) 2018 SUSE LINUX GmbH, Nuernberg, Germany.
#
# All modifications and additions to the file contributed by third parties
# remain the property of their copyright owners, unless otherwise agreed
@@ -16,23 +16,21 @@
#
-%define modname IPy
-
+%{?!python_module:%define python_module() python-%{**} python3-%{**}}
Name: python-ipy
-Version: 0.81
+Version: 0.83
Release: 0
+Summary: Class and tools for handling of IPv4 and IPv6 addresses and networks
License: BSD-3-Clause
-Summary: Class and Tools for Handling of IPv4 and IPv6 Addresses and Networks
-Url: http://software.inl.fr/trac/wiki/IPy
Group: Development/Languages/Python
-Source: https://pypi.python.org/packages/source/I/IPy/IPy-%{version}.tar.gz
-BuildRequires: python-devel
-BuildRoot: %{_tmppath}/%{name}-%{version}-build
-%if 0%{?suse_version} && 0%{?suse_version} <= 1110
-%{!?python_sitelib: %global python_sitelib %(python -c "from distutils.sysconfig import get_python_lib; print get_python_lib()")}
-%else
+Url: https://github.com/autocracy/python-ipy
+Source: https://files.pythonhosted.org/packages/source/I/IPy/IPy-%{version}.tar.gz
+BuildRequires: %{python_module setuptools}
+BuildRequires: fdupes
+BuildRequires: python-rpm-macros
BuildArch: noarch
-%endif
+
+%python_subpackages
%description
The IP class allows a comfortable parsing and handling for most
@@ -42,17 +40,18 @@
so funky stuff like a netmask of 0xffffff0f can't be done here.
%prep
-%setup -q -n %{modname}-%{version}
+%setup -q -n IPy-%{version}
%build
-python setup.py build
+%python_build
%install
-python setup.py install --prefix=%{_prefix} --root=%{buildroot}
-
-%files
-%defattr(-,root,root)
-%doc AUTHORS COPYING ChangeLog README
-%{python_sitelib}
+%python_install
+%python_expand %fdupes %{buildroot}%{$python_sitelib}
+#install COPYING %{buildroot}
+
+%files %{python_files}
+%{python_sitelib}/*
+%license COPYING
%changelog
++++++ IPy-0.81.tar.gz -> IPy-0.83.tar.gz ++++++
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/IPy-0.81/.gitignore new/IPy-0.83/.gitignore
--- old/IPy-0.81/.gitignore 1970-01-01 01:00:00.000000000 +0100
+++ new/IPy-0.83/.gitignore 2015-04-05 02:48:02.000000000 +0200
@@ -0,0 +1,2 @@
+*.pyc
+*.swp
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/IPy-0.81/AUTHORS new/IPy-0.83/AUTHORS
--- old/IPy-0.81/AUTHORS 2013-03-27 01:53:18.000000000 +0100
+++ new/IPy-0.83/AUTHORS 2015-04-05 02:48:02.000000000 +0200
@@ -1,12 +1,12 @@
Authors
=======
+Jeff Ferland <jeff AT storyinmemo.com>
+ > Current maintainer, versions 0.76 through latest
+Victor Stinner <victor.stinner AT gmail.com>
+ > Maintainer, versions 0.5 through 0.75
Maximillian Dornseif <md AT hudora.de>
> IPy author and maintainer until the version 0.42
-Victor Stinner <victor.stinner AT haypocalc.com>
- > new maintainer since the version 0.5
-Jeff Ferland <jeff AT storyinmemo.com>
- > New maintainer from version 0.76
Contributors
============
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/IPy-0.81/ChangeLog new/IPy-0.83/ChangeLog
--- old/IPy-0.81/ChangeLog 2013-04-08 19:25:56.000000000 +0200
+++ new/IPy-0.83/ChangeLog 2015-04-05 02:48:02.000000000 +0200
@@ -1,4 +1,24 @@
+Version 0.83 (2015-04-04)
+------------
+ * Add carrier grade NAT ranges
+ * Unbreak lots of packaging systems by not having a letter in the release version
+
+Version 0.82a (2014-10-07)
+------------
+ * Fix version numbers in files
+ * Correct x.next() -> next(x) for Python 3 compatibility
+
+Version 0.82 (2014-10-06)
+------------
+
+ * Add support for array slices
+ * Add __and__ and isdisjoint for IPSet
+ * Fix a bug in IPSet where contains may incorrectly return false
+ * Added some fuzz testing
+
Version 0.81 (2013-04-08)
+------------
+
* Correct reverseName() for IPv6 addresses, so IP('::1').reverseName() returns correct.
* Add network mask awareness to v46map()
* Fix Python 3 errors in IPSet class
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/IPy-0.81/IPy.py new/IPy-0.83/IPy.py
--- old/IPy-0.81/IPy.py 2013-04-01 23:42:25.000000000 +0200
+++ new/IPy-0.83/IPy.py 2015-04-05 02:48:02.000000000 +0200
@@ -6,7 +6,7 @@
https://github.com/haypo/python-ipy
"""
-__version__ = '0.81'
+__version__ = '0.83'
import bisect
import collections
@@ -20,6 +20,7 @@
'0': 'PUBLIC', # fall back
'00000000': 'PRIVATE', # 0/8
'00001010': 'PRIVATE', # 10/8
+ '0110010001': 'CARRIER_GRADE_NAT', #100.64/10
'01111111': 'PRIVATE', # 127.0/8
'1': 'PUBLIC', # fall back
'1010100111111110': 'PRIVATE', # 169.254/16
@@ -610,6 +611,8 @@
IP('127.0.0.3')
"""
+ if isinstance(key, slice):
+ return [self.ip + int(x) for x in xrange(*key.indices(len(self)))]
if not isinstance(key, INT_TYPES):
raise TypeError
if key < 0:
@@ -959,6 +962,8 @@
>>> print(str(ip[-1]))
127.0.0.3
"""
+ if isinstance(key, slice):
+ return [IP(IPint.__getitem__(self, x), ipversion=self._ipversion) for x in xrange(*key.indices(len(self)))]
return IP(IPint.__getitem__(self, key), ipversion=self._ipversion)
def __repr__(self):
@@ -1012,12 +1017,7 @@
raise ValueError("%s cannot be converted to an IPv4 address."
% repr(self))
-try:
- IPSetBaseClass = collections.MutableSet
-except AttributeError:
- IPSetBaseClass = object
-
-class IPSet(IPSetBaseClass):
+class IPSet(collections.MutableSet):
def __init__(self, iterable=[]):
# Make sure it's iterable, otherwise wrap
if not isinstance(iterable, collections.Iterable):
@@ -1038,13 +1038,11 @@
#Don't dig through more-specific ranges
ip_mask = ip._prefixlen
valid_masks = [x for x in valid_masks if x <= ip_mask]
- for mask in valid_masks:
+ for mask in sorted(valid_masks):
i = bisect.bisect(self.prefixtable[mask], ip)
# Because of sorting order, a match can only occur in the prefix
# that comes before the result of the search.
- if i == 0:
- return False
- if ip in self.prefixtable[mask][i - 1]:
+ if i and ip in self.prefixtable[mask][i - 1]:
return True
def __iter__(self):
@@ -1063,6 +1061,31 @@
new.discard(prefix)
return new
+ def __and__(self, other):
+ left = iter(self.prefixes)
+ right = iter(other.prefixes)
+ result = []
+ try:
+ l = next(left)
+ r = next(right)
+ while True:
+ # iterate over prefixes in order, keeping the smaller of the
+ # two if they overlap
+ if l in r:
+ result.append(l)
+ l = next(left)
+ continue
+ elif r in l:
+ result.append(r)
+ r = next(right)
+ continue
+ if l < r:
+ l = next(left)
+ else:
+ r = next(right)
+ except StopIteration:
+ return IPSet(result)
+
def __repr__(self):
return '%s([' % self.__class__.__name__ + ', '.join(map(repr, self.prefixes)) + '])'
@@ -1120,6 +1143,22 @@
self.optimize()
+ def isdisjoint(self, other):
+ left = iter(self.prefixes)
+ right = iter(other.prefixes)
+ try:
+ l = next(left)
+ r = next(right)
+ while True:
+ if l in r or r in l:
+ return False
+ if l < r:
+ l = next(left)
+ else:
+ r = next(right)
+ except StopIteration:
+ return True
+
def optimize(self):
# The algorithm below *depends* on the sort order
self.prefixes.sort()
@@ -1597,7 +1636,7 @@
# Start cutting in half, recursively
prefixes = [
IP('%s/%d' % (prefix[0], prefix._prefixlen + 1)),
- IP('%s/%d' % (prefix[prefix.len() / 2], prefix._prefixlen + 1)),
+ IP('%s/%d' % (prefix[int(prefix.len() / 2)], prefix._prefixlen + 1)),
]
if subprefix in prefixes[0]:
return _remove_subprefix(prefixes[0], subprefix) + IPSet([prefixes[1]])
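The `__contains__` change in the hunk above replaces an early `return False` with `if i and ...`, so the lookup keeps checking the remaining prefix lengths instead of giving up after the first bucket. A stdlib-only sketch of that bisect-per-prefix-length lookup, using `ipaddress` rather than IPy (the helper names here are illustrative, not IPy's internals):

```python
import bisect
import ipaddress
from collections import defaultdict

def build_table(prefixes):
    # group networks by prefix length, keeping each bucket sorted
    table = defaultdict(list)
    for p in prefixes:
        bisect.insort(table[p.prefixlen], p)
    return table

def contains(table, ip):
    # Check every prefix length; because of the sort order, a match can
    # only be the entry immediately before the insertion point.
    for plen, bucket in table.items():
        i = bisect.bisect(bucket, ipaddress.ip_network(ip))
        # do NOT return False early here: other prefix lengths may still match
        if i and ipaddress.ip_address(ip) in bucket[i - 1]:
            return True
    return False
```

The regression case from the test suite (`10.0.0.0` in `{10.0.0.0/8, 128.0.0.0/1}`) now succeeds even though the /1 bucket yields an insertion point of zero.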
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/IPy-0.81/Makefile new/IPy-0.83/Makefile
--- old/IPy-0.81/Makefile 1970-01-01 01:00:00.000000000 +0100
+++ new/IPy-0.83/Makefile 2015-04-05 02:48:02.000000000 +0200
@@ -0,0 +1,27 @@
+.PHONY: tests egg install clean
+
+PYTHON=python
+
+tests:
+ @echo "[ run unit tests in python 2 ]"
+ PYTHONPATH=$(PWD) $(PYTHON)2.6 test/test_IPy.py || exit $$?
+ @echo "[ run unit tests in python 3 ]"
+ PYTHONPATH=$(PWD) $(PYTHON)3.4 test/test_IPy.py || exit $$?
+ @echo
+ @echo "[ test README in python 2 ]"
+ $(PYTHON)2.6 test_doc.py || exit $$?
+ @echo "[ test README in python 3 ]"
+ $(PYTHON)3.4 test_doc.py || exit $$?
+
+egg: clean
+ $(PYTHON) setup.py sdist bdist_egg
+
+IPy.html: README
+ rst2html README $@ --stylesheet=rest.css
+
+install:
+ ./setup.py install
+
+clean:
+ rm -rf IPy.html *.pyc build dist IPy.egg-info
+
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/IPy-0.81/PKG-INFO new/IPy-0.83/PKG-INFO
--- old/IPy-0.81/PKG-INFO 2013-04-08 23:22:31.000000000 +0200
+++ new/IPy-0.83/PKG-INFO 1970-01-01 01:00:00.000000000 +0100
@@ -1,306 +0,0 @@
-Metadata-Version: 1.1
-Name: IPy
-Version: 0.81
-Summary: Class and tools for handling of IPv4 and IPv6 addresses and networks
-Home-page: https://github.com/haypo/python-ipy
-Author: Jeff Ferland
-Author-email: jeff AT storyinmemo.com
-License: BSD License
-Download-URL: https://github.com/haypo/python-ipy
-Description: IPy - class and tools for handling of IPv4 and IPv6 addresses and networks.
-
- Website: https://github.com/haypo/python-ipy/
-
- Presentation of the API
- =======================
-
- The IP class allows a comfortable parsing and handling for most
- notations in use for IPv4 and IPv6 addresses and networks. It was
- greatly inspired by RIPE's Perl module NET::IP's interface but
- doesn't share the implementation. It doesn't share non-CIDR netmasks,
- so funky stuff like a netmask of 0xffffff0f can't be done here.
-
- >>> from IPy import IP
- >>> ip = IP('127.0.0.0/30')
- >>> for x in ip:
- ... print(x)
- ...
- 127.0.0.0
- 127.0.0.1
- 127.0.0.2
- 127.0.0.3
- >>> ip2 = IP('0x7f000000/30')
- >>> ip == ip2
- 1
- >>> ip.reverseNames()
- ['0.0.0.127.in-addr.arpa.', '1.0.0.127.in-addr.arpa.', '2.0.0.127.in-addr.arpa.', '3.0.0.127.in-addr.arpa.']
- >>> ip.reverseName()
- '0-3.0.0.127.in-addr.arpa.'
- >>> ip.iptype()
- 'PRIVATE'
-
-
- Supports most IP address formats
- ================================
-
- It can detect about a dozen different ways of expressing IP addresses
- and networks, parse them and distinguish between IPv4 and IPv6 addresses:
-
- >>> IP('10.0.0.0/8').version()
- 4
- >>> IP('::1').version()
- 6
-
- IPv4 addresses
- --------------
-
- >>> print(IP(0x7f000001))
- 127.0.0.1
- >>> print(IP('0x7f000001'))
- 127.0.0.1
- >>> print(IP('127.0.0.1'))
- 127.0.0.1
- >>> print(IP('10'))
- 10.0.0.0
-
- IPv6 addresses
- --------------
-
- >>> print(IP('1080:0:0:0:8:800:200C:417A'))
- 1080::8:800:200c:417a
- >>> print(IP('1080::8:800:200C:417A'))
- 1080::8:800:200c:417a
- >>> print(IP('::1'))
- ::1
- >>> print(IP('::13.1.68.3'))
- ::d01:4403
-
- Network mask and prefixes
- -------------------------
-
- >>> print(IP('127.0.0.0/8'))
- 127.0.0.0/8
- >>> print(IP('127.0.0.0/255.0.0.0'))
- 127.0.0.0/8
- >>> print(IP('127.0.0.0-127.255.255.255'))
- 127.0.0.0/8
-
-
- Derive network address
- ===========================
-
- IPy can transform an IP address into a network address by applying the given
- netmask:
- >>> print(IP('127.0.0.1/255.0.0.0', make_net=True))
- 127.0.0.0/8
-
- This can also be done for existing IP instances:
- >>> print(IP('127.0.0.1').make_net('255.0.0.0'))
- 127.0.0.0/8
-
-
- Convert address to string
- =========================
-
- Nearly all class methods which return a string have an optional
- parameter 'wantprefixlen' which controls if the prefixlen or netmask
- is printed. Per default the prefilen is always shown if the network
- contains more than one address::
-
- wantprefixlen == 0 / None don't return anything 1.2.3.0
- wantprefixlen == 1 /prefix 1.2.3.0/24
- wantprefixlen == 2 /netmask 1.2.3.0/255.255.255.0
- wantprefixlen == 3 -lastip 1.2.3.0-1.2.3.255
-
- You can also change the defaults on an per-object basis by fiddling with
- the class members:
-
- * NoPrefixForSingleIp
- * WantPrefixLen
-
- Examples of string conversions:
-
- >>> IP('10.0.0.0/32').strNormal()
- '10.0.0.0'
- >>> IP('10.0.0.0/24').strNormal()
- '10.0.0.0/24'
- >>> IP('10.0.0.0/24').strNormal(0)
- '10.0.0.0'
- >>> IP('10.0.0.0/24').strNormal(1)
- '10.0.0.0/24'
- >>> IP('10.0.0.0/24').strNormal(2)
- '10.0.0.0/255.255.255.0'
- >>> IP('10.0.0.0/24').strNormal(3)
- '10.0.0.0-10.0.0.255'
- >>> ip = IP('10.0.0.0')
- >>> print(ip)
- 10.0.0.0
- >>> ip.NoPrefixForSingleIp = None
- >>> print(ip)
- 10.0.0.0/32
- >>> ip.WantPrefixLen = 3
- >>> print(ip)
- 10.0.0.0-10.0.0.0
-
- Work with multiple networks
- ===========================
-
- Simple addition of neighboring netblocks that can be aggregated will yield
- a parent network of both, but more complex range mapping and aggregation
- requires is available with the IPSet class which will hold any number of
- unique address ranges and will aggregate overlapping ranges.
-
- >>> from IPy import IP, IPSet
- >>> IP('10.0.0.0/22') - IP('10.0.2.0/24')
- IPSet([IP('10.0.0.0/23'), IP('10.0.3.0/24')])
- >>> IPSet([IP('10.0.0.0/23'), IP('10.0.3.0/24'), IP('10.0.2.0/24')])
- IPSet([IP('10.0.0.0/22')])
- >>> s = IPSet([IP('10.0.0.0/22')])
- >>> s.add(IP('192.168.1.0/29'))
- >>> s
- IPSet([IP('10.0.0.0/22'), IP('192.168.1.0/29')])
- >>> s.discard(IP('192.168.1.2'))
- >>> s
- IPSet([IP('10.0.0.0/22'), IP('192.168.1.0/31'), IP('192.168.1.3'), IP('192.168.1.4/30')])
-
- Compatibility and links
- =======================
-
- IPy 0.81 works on Python version 2.5 - 3.3.
-
- This Python module is under BSD license: see COPYING file.
-
- Further Information might be available at:
- https://github.com/haypo/python-ipy
-
- What's new
- ==========
-
- Version 0.81 (2013-04-08)
- * Correct reverseName() for IPv6 addresses, so IP('::1').reverseName() returns correct.
- * Add network mask awareness to v46map()
- * Fix Python 3 errors in IPSet class
- * Make IPSet base class be object when MutableSet isn't available, fixing
- errors in Python 2.5
-
- Version 0.80 (2013-03-26)
- ------------
-
- * Drop support of Python older than 2.4
- * Python 3 does not need 2to3 conversion anymore (same code base)
- * Fix adding of non-adjacent networks:
- 192.168.0.0/24 + 192.168.255.0/24 made 192.168.0.0/23
- * Fix adding networks that don't create a valid subnet:
- 192.168.1.0/24 + 192.168.2.0/24 made 192.168.1.0/23
- * Fix adding with an IPv6 address where .int() was < 32 bits made IPy believe it
- was an IPv4 address:
- ::ffff:0/112 + ::1:0:0/112 made 255.255.0.0/111
- * Add support of IPSets
- * Add support for subtracting a network range
- * Prevent IPv4 and IPv6 ranges from saying they contain each other
- * Add a .v46map() method to convert mapped address ranges
- such as IP('::ffff:192.168.1.1'); RFC 4291
- * Change sort order to more natural:
- IPv4 before IPv6; less-specific prefixes first (/0 before /32)
-
-
- Version 0.76 (2013-03-19)
- -------------------------
-
- * ip == other and ip != other doesn't fail with an exception anymore if other
- is not a IP object
- * Add IP.get_mac() method: get the 802.3 MAC address from IPv6 RFC 2464
- address.
- * Fix IP('::/0')[0]: return an IPv6 instead of an IPv4 address
-
- Version 0.75 (2011-04-12)
- -------------------------
-
- * IP('::/0').netmask() gives IP('::') instead of IP('0.0.0.0')
-
- Version 0.74 (2011-02-16)
- -------------------------
-
- * Fix tests for Python 3.1 and 3.2
- * ip.__nonzero__() and (ipa in ipb) return a bool instead of 0 or 1
- * IP('0.0.0.0/0') + IP('0.0.0.0/0') raises an error, fix written by Arfrever
-
- Version 0.73 (2011-02-15)
- -------------------------
-
- * Support Python 3: setup.py runs 2to3
- * Update the ranges for IPv6 IPs
- * Fix reverseName() and reverseNames() for IPv4 in IPv6 addresses
- * Drop support of Python < 2.5
-
- Version 0.72 (2010-11-23)
- -------------------------
-
- * Include examples and MANIFEST.in in source build (add them to
- MANIFEST.in)
- * Remove __rcsid__ constant from IPy module
-
- Version 0.71 (2010-10-01)
- -------------------------
-
- * Use xrange() instead of range()
- * Use isinstance(x, int) instead of type(x) == types.IntType
- * Prepare support of Python3 (use integer division: x // y)
- * Fix IP(long) constructor: ensure that the address is not too large
- * Constructor raise a TypeError if the type is not int, long,
- str or unicode
- * 223.0.0.0/8 is now public (belongs to APNIC)
-
- Version 0.70 (2009-10-29)
- -------------------------
-
- * New "major" version because it may break compatibility
- * Fix __cmp__(): IP('0.0.0.0/0') and IP('0.0.0.0') are not equal
- * Fix IP.net() of the network "::/0": "::" instead of "0.0.0.0".
- IPy 0.63 should fix this bug, but it wasn't.
-
- Version 0.64 (2009-08-19)
- -------------------------
-
- * Create MANIFEST.in to fix setup.py bdist_rpm, fix by Robert Nickel
-
- Version 0.63 (2009-06-23)
- -------------------------
-
- * Fix formatting of "IPv4 in IPv6" network, eg. IP('::ffff:192.168.10.0/120'),
- the netmask ("/120" in the example) was missing!
-
- Version 0.62 (2008-07-15)
- -------------------------
-
- * Fix reverse DNS of IPv6 address: use ".ip6.arpa." suffix instead of
- deprecated ".ip6.int." suffix
-
- Version 0.61 (2008-06-12)
- -------------------------
-
- * Patch from Aras Vaichas allowing the [-1] operator
- to work with an IP object of size 1.
-
- Version 0.60 (2008-05-16)
- -------------------------
-
- * strCompressed() formats '::ffff:a.b.c.d' correctly
- * Use strCompressed() instead of strFullsize() to format IP addresses,
- ouput is smarter with IPv6 address
- * Remove check_addr_prefixlen because it generates invalid IP address
-Keywords: ipv4 ipv6 netmask
-Platform: UNKNOWN
-Classifier: Development Status :: 5 - Production/Stable
-Classifier: Intended Audience :: Developers
-Classifier: Intended Audience :: System Administrators
-Classifier: Environment :: Plugins
-Classifier: Topic :: Software Development :: Libraries :: Python Modules
-Classifier: Topic :: Communications
-Classifier: Topic :: Internet
-Classifier: Topic :: System :: Networking
-Classifier: License :: OSI Approved :: BSD License
-Classifier: Operating System :: OS Independent
-Classifier: Natural Language :: English
-Classifier: Programming Language :: Python
-Classifier: Programming Language :: Python :: 3
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/IPy-0.81/README new/IPy-0.83/README
--- old/IPy-0.81/README 2013-04-08 20:30:02.000000000 +0200
+++ new/IPy-0.83/README 2015-04-05 02:48:02.000000000 +0200
@@ -1,6 +1,6 @@
IPy - class and tools for handling of IPv4 and IPv6 addresses and networks.
-Website: https://github.com/haypo/python-ipy/
+Website: https://github.com/autocracy/python-ipy/
Presentation of the API
=======================
@@ -154,12 +154,39 @@
>>> s
IPSet([IP('10.0.0.0/22'), IP('192.168.1.0/31'), IP('192.168.1.3'), IP('192.168.1.4/30')])
+IPSet supports the `set` method `isdisjoint`:
+
+ >>> s.isdisjoint(IPSet([IP('192.168.0.0/16')]))
+ False
+ >>> s.isdisjoint(IPSet([IP('172.16.0.0/12')]))
+ True
+
+IPSet supports intersection:
+
+ >>> s & IPSet([IP('10.0.0.0/8')])
+ IPSet([IP('10.0.0.0/22')])
+
Compatibility and links
=======================
-IPy 0.81 works on Python version 2.5 - 3.3.
+IPy 0.83 works on Python version 2.6 - 3.4.
+
+The IP module should work in Python 2.5 as long as the subtraction operation
+is not used. IPSet requires features of the collecitons class which appear
+in Python 2.6, though they can be backported.
+
+Errata
+======
+
+When using IPv6 addresses, it is best to compare using IP().len() instead of
+len(IP). Addresses with an integer value > 64 bits can break the 2nd method.
+See http://stackoverflow.com/questions/15650878 for more info.
+
+Fuzz testing for IPSet will throw spurious errors when the IPSet module
+combines two smaller prefixes into a larger prefix that matches the random
+prefix tested against.
This Python module is under BSD license: see COPYING file.
Further Information might be available at:
-https://github.com/haypo/python-ipy
+https://github.com/autocracy/python-ipy
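The `isdisjoint` and `__and__` methods documented above both walk two sorted prefix lists in lockstep, advancing whichever iterator is behind. A stdlib-only sketch of that merge walk over `ipaddress` networks (a minimal illustration, not IPy's actual code):

```python
import ipaddress

def isdisjoint(prefixes_a, prefixes_b):
    """Merge-walk two sorted lists of non-overlapping networks."""
    a, b = iter(sorted(prefixes_a)), iter(sorted(prefixes_b))
    try:
        x, y = next(a), next(b)
        while True:
            if x.overlaps(y):      # one contains (part of) the other
                return False
            if x < y:              # advance the iterator that is behind
                x = next(a)
            else:
                y = next(b)
    except StopIteration:
        # one side was exhausted without finding an overlap
        return True
```

Exhausting either list without an overlap proves disjointness, which is why the whole loop body sits inside the `try` and `StopIteration` is the success path.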
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/IPy-0.81/setup.py new/IPy-0.83/setup.py
--- old/IPy-0.81/setup.py 2013-04-08 20:46:39.000000000 +0200
+++ new/IPy-0.83/setup.py 2015-04-05 02:48:02.000000000 +0200
@@ -24,7 +24,7 @@
import sys
from distutils.core import setup
-VERSION = '0.81'
+VERSION = '0.83'
options = {}
@@ -54,7 +54,7 @@
'Programming Language :: Python',
'Programming Language :: Python :: 3',
]
-URL = "https://github.com/haypo/python-ipy"
+URL = "https://github.com/autocracy/python-ipy"
setup(
name="IPy",
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/IPy-0.81/test/test_IPy.py new/IPy-0.83/test/test_IPy.py
--- old/IPy-0.81/test/test_IPy.py 2013-03-27 01:53:18.000000000 +0100
+++ new/IPy-0.83/test/test_IPy.py 2015-04-05 02:48:02.000000000 +0200
@@ -16,8 +16,6 @@
import unittest
import random
-testloops = 250
-
class parseAddress(unittest.TestCase):
okValues = [('FEDC:BA98:7654:3210:FEDC:BA98:7654:3210', 338770000845734292534325025077361652240),
('FEDCBA9876543210FEDCBA9876543210', 338770000845734292534325025077361652240),
@@ -202,21 +200,6 @@
self.assertRaises(ValueError, IPy.intToIp, 1, 7)
self.assertRaises(ValueError, IPy.intToIp, 1, 8)
-class ParseAndBack(unittest.TestCase):
- def testRandomValuesv4(self):
- for i in range(testloops):
- question = random.randrange(0x7fffffff) + random.randrange(0x7fffffff)
- self.assertEqual(IPy.parseAddress(IPy.intToIp(question, 4)), (question, 4), hex(question))
-
- def testRandomValuesv6(self):
- for i in range(testloops):
- question = ((random.randrange(0x7fffffff) + random.randrange(0x7fffffff)) +
- ((random.randrange(0x7fffffff) + random.randrange(0x7fffffff)) << 32) +
- ((random.randrange(0x7fffffff) + random.randrange(0x7fffffff)) << 64) +
- ((random.randrange(0x7fffffff) + random.randrange(0x7fffffff)) << 96))
- self.assertEqual(IPy.parseAddress(IPy.intToIp(question, 6)), (question, 6), hex(question))
-
-
class _countXBits(unittest.TestCase):
def testCount1Bits(self):
self.assertEqual(IPy._count1Bits(0), 0)
@@ -463,6 +446,7 @@
self.assertEqual(ip[0], ip.net())
self.assertEqual(ip[-1], ip.broadcast())
self.assertTrue(ip[255])
+ self.assertTrue(isinstance(ip[4::4], list))
self.assertRaises(IndexError, ip.__getitem__, 256)
def testStr(self):
@@ -844,10 +828,53 @@
self.t.discard(self.sixRange)
self.assertEqual(self.t, self.c)
+ def testAnd(self):
+ ten24s = IPy.IPSet([
+ IPy.IP('10.0.1.0/24'),
+ IPy.IP('10.0.3.0/24'),
+ IPy.IP('10.0.5.0/24'),
+ IPy.IP('10.0.7.0/24'),
+ ])
+
+ self.assertEqual(ten24s & IPy.IPSet([IPy.IP('10.0.1.10')]),
+ IPy.IPSet([IPy.IP('10.0.1.10')]))
+
+ self.assertEqual(ten24s & IPy.IPSet([
+ IPy.IP('10.0.0.99'),
+ IPy.IP('10.0.1.10'),
+ IPy.IP('10.0.3.40'),
+ IPy.IP('11.1.1.99'),
+ ]), IPy.IPSet([
+ IPy.IP('10.0.1.10'),
+ IPy.IP('10.0.3.40'),
+ ]))
+
def testContains(self):
self.assertTrue(IPy.IP('192.168.15.32/28') in self.t)
self.assertFalse(IPy.IP('192.169.15.32/28') in self.t)
+ # test for a regression where __contains__ prematurely returns False
+ # after testing a prefix length where all IP instances are greater than
+ # the query IP.
+ ipset = IPy.IPSet([IPy.IP('10.0.0.0/8'), IPy.IP('128.0.0.0/1')])
+ self.assertTrue(IPy.IP('10.0.0.0') in ipset)
+
+ def testIsdisjoint(self):
+ self.assertTrue(IPy.IPSet([IPy.IP('0.0.0.0/1')])
+ .isdisjoint(IPy.IPSet([IPy.IP('128.0.0.0/1')])))
+ self.assertFalse(IPy.IPSet([IPy.IP('0.0.0.0/1')])
+ .isdisjoint(IPy.IPSet([IPy.IP('0.0.0.0/2')])))
+ self.assertFalse(IPy.IPSet([IPy.IP('0.0.0.0/2')])
+ .isdisjoint(IPy.IPSet([IPy.IP('0.0.0.0/1')])))
+ self.assertFalse(IPy.IPSet([IPy.IP('0.0.0.0/2')])
+ .isdisjoint(IPy.IPSet([IPy.IP('0.1.2.3')])))
+ self.assertFalse(IPy.IPSet([IPy.IP('0.1.2.3')])
+ .isdisjoint(IPy.IPSet([IPy.IP('0.0.0.0/2')])))
+ self.assertTrue(IPy.IPSet([IPy.IP('1.1.1.1'), IPy.IP('1.1.1.3')])
+ .isdisjoint(IPy.IPSet([IPy.IP('1.1.1.2'), IPy.IP('1.1.1.4')])))
+ self.assertFalse(IPy.IPSet([IPy.IP('1.1.1.1'), IPy.IP('1.1.1.3'), IPy.IP('1.1.2.0/24')])
+ .isdisjoint(IPy.IPSet([IPy.IP('1.1.2.2'), IPy.IP('1.1.1.4')])))
+
class RegressionTest(unittest.TestCase):
def testNulNetmask(self):
ip = timeout(IPy.IP, ["0.0.0.0/0.0.0.0"], timeout_duration=0.250, default=None)
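The `ip[4::4]` assertion added in the hunk above exercises the new slice support in `__getitem__`, which returns a plain list of addresses. For comparison, the same effect with the stdlib `ipaddress` module (a minimal sketch, not IPy code):

```python
import ipaddress

net = ipaddress.ip_network('127.0.0.0/24')
# IPy's new ip[4::4] yields every 4th address starting at index 4;
# with ipaddress, materialize the network and slice the resulting list
addrs = list(net)[4::4]
```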
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/IPy-0.81/test/test_fuzz.py new/IPy-0.83/test/test_fuzz.py
--- old/IPy-0.81/test/test_fuzz.py 1970-01-01 01:00:00.000000000 +0100
+++ new/IPy-0.83/test/test_fuzz.py 2015-04-05 02:48:02.000000000 +0200
@@ -0,0 +1,109 @@
+"""Fuzing for IPy.py"""
+
+# TODO: unify assert / FilIf usage
+
+import sys
+import functools
+import itertools
+sys.path.append('.')
+sys.path.append('..')
+
+import IPy
+import unittest
+import random
+
+if sys.version_info >= (3,):
+ xrange = range
+
+# on Python-2.7 and higher, we use load_tests to multiply out the test cases so that unittest
+# represents each as an individual test case.
+def iterate_27(n):
+ def wrap(func):
+ func.iterations = n
+ return func
+ return wrap
+
+def load_tests(loader, tests, pattern):
+ def expand(tests):
+ if isinstance(tests, unittest.TestCase):
+ method_name = tests._testMethodName
+ meth = getattr(tests, method_name)
+ if hasattr(meth, 'iterations'):
+ tests = unittest.TestSuite(type(tests)(method_name) for i in xrange(meth.iterations))
+ else:
+ tests = unittest.TestSuite(expand(t) for t in tests)
+ return tests
+ return expand(tests)
+
+# On older Pythons, we run the requisite iterations directly, in a single test case.
+def iterate_old(n):
+ def wrap(func):
+ @functools.wraps(func)
+ def replacement(*args):
+ for i in xrange(n):
+ func(*args)
+ return replacement
+ return wrap
+
+if sys.version_info >= (2,7):
+ iterate = iterate_27
+else:
+ iterate = iterate_old
+
+# utilities
+
+def random_ipv4_prefix():
+ prefixlen = random.randrange(32)
+ int_ip = random.randrange(IPy.MAX_IPV4_ADDRESS)
+ int_ip &= 0xffffffff << (32-prefixlen)
+ return IPy.IP('.'.join(map(str, (int_ip >> 24,
+ (int_ip >> 16) & 0xff,
+ (int_ip >> 8) & 0xff,
+ int_ip & 0xff)))
+ + '/%d' % prefixlen)
+
+# tests
+
+class ParseAndBack(unittest.TestCase):
+
+ @iterate(500)
+ def testRandomValuesv4(self):
+ question = random.randrange(0xffffffff)
+ self.assertEqual(IPy.parseAddress(IPy.intToIp(question, 4)), (question, 4), hex(question))
+
+ @iterate(500)
+ def testRandomValuesv6(self):
+ question = random.randrange(0xffffffffffffffffffffffffffffffff)
+ self.assertEqual(IPy.parseAddress(IPy.intToIp(question, 6)), (question, 6), hex(question))
+
+class TestIPSet(unittest.TestCase):
+
+ @iterate(1000)
+ def testRandomContains(self):
+ prefixes = [random_ipv4_prefix() for i in xrange(random.randrange(50))]
+ question = random_ipv4_prefix()
+ answer = any(question in pfx for pfx in prefixes)
+ ipset = IPy.IPSet(prefixes)
+ self.assertEqual(question in ipset, answer,
+ "%s in %s != %s (made from %s)" % (question, ipset, answer, prefixes))
+
+
+ @iterate(1000)
+ def testRandomDisjoint(self):
+ prefixes1 = [random_ipv4_prefix() for i in xrange(random.randrange(50))]
+ prefixes2 = [random_ipv4_prefix() for i in xrange(random.randrange(50))]
+ # test disjointness the stupid way
+ disjoint = True
+ for p1, p2 in itertools.product(prefixes1, prefixes2):
+ if p1 in p2 or p2 in p1:
+ disjoint = False
+ break
+ ipset1 = IPy.IPSet(prefixes1)
+ ipset2 = IPy.IPSet(prefixes2)
+ self.assertEqual(ipset1.isdisjoint(ipset2), disjoint,
+ "%s.isdisjoint(%s) != %s" % (ipset1, ipset2, disjoint))
+ self.assertEqual(ipset2.isdisjoint(ipset1), disjoint,
+ "%s.isdisjoint(%s) != %s" % (ipset2, ipset1, disjoint))
+
+if __name__ == "__main__":
+ unittest.main()
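The mask arithmetic in `random_ipv4_prefix` above (shift a 32-bit mask, AND it in, then format four octets by hand) has a stdlib equivalent: `ipaddress.ip_network` with `strict=False` zeroes the host bits itself. A sketch under that assumption (the seeded `rng` parameter is an addition for reproducibility, not part of the fuzz test):

```python
import random
import ipaddress

def random_ipv4_prefix(rng=random.Random(0)):
    plen = rng.randrange(32)        # prefix length 0..31, as in the fuzz test
    raw = rng.randrange(2 ** 32)    # random 32-bit value, host bits still set
    # strict=False masks the host bits instead of raising ValueError
    return ipaddress.ip_network((raw, plen), strict=False)

pfx = random_ipv4_prefix()
```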