openSUSE Commits
Hello community,
here is the log from the commit of package python-pypet for openSUSE:Factory checked in at 2020-06-30 21:56:45
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/python-pypet (Old)
and /work/SRC/openSUSE:Factory/.python-pypet.new.3060 (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Package is "python-pypet"
Tue Jun 30 21:56:45 2020 rev:4 rq:817771 version:0.5.1
Changes:
--------
--- /work/SRC/openSUSE:Factory/python-pypet/python-pypet.changes 2018-08-03 12:40:30.563855943 +0200
+++ /work/SRC/openSUSE:Factory/.python-pypet.new.3060/python-pypet.changes 2020-06-30 21:56:48.266821850 +0200
@@ -1,0 +2,12 @@
+Tue Jun 23 20:05:26 UTC 2020 - Todd R <toddrme2178(a)gmail.com>
+
+- Update to 0.5.1
+ * Updated package description to automatically convert md to rst for pypi
+ * Updated pngpath for Sphinx
+- Update to 0.5.0
+ * Fix to work with pandas 1.0
+ * Fix to work with brian2 2.3
+ * Fix to work with Python 3.7 and 3.8
+ * Removal `expectedrows` and `filters` option for HDF5Storage.put as this is no longer supported by pandas
+
+-------------------------------------------------------------------
Old:
----
pypet-0.4.3.tar.gz
New:
----
pypet-0.5.1.tar.gz
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Other differences:
------------------
++++++ python-pypet.spec ++++++
--- /var/tmp/diff_new_pack.6NAytZ/_old 2020-06-30 21:56:48.890823781 +0200
+++ /var/tmp/diff_new_pack.6NAytZ/_new 2020-06-30 21:56:48.894823793 +0200
@@ -1,7 +1,7 @@
#
# spec file for package python-pypet
#
-# Copyright (c) 2018 SUSE LINUX GmbH, Nuernberg, Germany.
+# Copyright (c) 2020 SUSE LLC
#
# All modifications and additions to the file contributed by third parties
# remain the property of their copyright owners, unless otherwise agreed
@@ -12,16 +12,15 @@
# license that conforms to the Open Source Definition (Version 1.9)
# published by the Open Source Initiative.
-# Please submit bugfixes or comments via http://bugs.opensuse.org/
+# Please submit bugfixes or comments via https://bugs.opensuse.org/
#
%{?!python_module:%define python_module() python-%{**} python3-%{**}}
%define skip_python2 1
-# Tests take forever
-%bcond_with test
+%bcond_without test
Name: python-pypet
-Version: 0.4.3
+Version: 0.5.1
Release: 0
Summary: Parameter exploration and storage of results for numerical simulations
License: BSD-3-Clause
@@ -36,7 +35,6 @@
BuildRequires: %{python_module numpy >= 1.6.1}
BuildRequires: %{python_module pandas >= 0.15.0}
BuildRequires: %{python_module scipy >= 0.9.0}
-BuildRequires: %{python_module scoop >= 0.7.1}
BuildRequires: %{python_module tables >= 3.1.1}
%endif
Requires: python-numpy >= 1.6.1
@@ -74,11 +72,13 @@
%if %{with test}
%check
export LANG=en_US.UTF-8
-%python_exec setup.py test
+pushd pypet/tests
+%{python_expand export PYTHONPATH=%{buildroot}%{$python_sitelib}
+$python -B all_single_core_tests.py
+}
%endif
%files %{python_files}
-%defattr(-,root,root,-)
%doc README.md
%license LICENSE
%{python_sitelib}/*
++++++ pypet-0.4.3.tar.gz -> pypet-0.5.1.tar.gz ++++++
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/pypet-0.4.3/LICENSE new/pypet-0.5.1/LICENSE
--- old/pypet-0.4.3/LICENSE 2018-06-24 19:39:52.000000000 +0200
+++ new/pypet-0.5.1/LICENSE 2020-06-01 22:42:02.000000000 +0200
@@ -1,4 +1,4 @@
-Copyright (c) 2013-2018, Robert Meyer
+Copyright (c) 2013-2020, Robert Meyer
All rights reserved.
Redistribution and use in source and binary forms, with or without modification,
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/pypet-0.4.3/PKG-INFO new/pypet-0.5.1/PKG-INFO
--- old/pypet-0.4.3/PKG-INFO 2018-06-24 20:51:52.000000000 +0200
+++ new/pypet-0.5.1/PKG-INFO 2020-06-02 12:57:56.386352800 +0200
@@ -1,19 +1,45 @@
-Metadata-Version: 1.1
+Metadata-Version: 1.2
Name: pypet
-Version: 0.4.3
+Version: 0.5.1
Summary: A toolkit for numerical simulations to allow easy parameter exploration and storage of results.
Home-page: https://github.com/SmokinCaterpillar/pypet
Author: Robert Meyer
-Author-email: robert.meyer(a)ni.tu-berlin.de
+Author-email: robert.meyer(a)alcemy.tech
License: BSD
-Description: # pypet
+Description:
+ pypet
+ =====
+
+
+ .. image:: https://travis-ci.org/SmokinCaterpillar/pypet.svg?branch=master
+ :target: https://travis-ci.org/SmokinCaterpillar/pypet
+ :alt: Travis Build Status
+
+
+ .. image:: https://ci.appveyor.com/api/projects/status/9amhj3iyf105xa2y/branch/master?…
+ :target: https://ci.appveyor.com/project/SmokinCaterpillar/pypet/branch/master
+ :alt: Appveyor Build status
+
+
+ .. image:: https://coveralls.io/repos/github/SmokinCaterpillar/pypet/badge.svg?branch=…
+ :target: https://coveralls.io/github/SmokinCaterpillar/pypet?branch=master
+ :alt: Coverage Status
+
+
+ .. image:: https://api.codacy.com/project/badge/grade/86268960751442799fcf6192b36e386f
+ :target: https://www.codacy.com/app/robert-meyer/pypet
+ :alt: Codacy Badge
+
+
+ .. image:: https://badge.fury.io/py/pypet.svg
+ :target: https://badge.fury.io/py/pypet
+ :alt: PyPI version
+
+
+ .. image:: https://readthedocs.org/projects/pypet/badge/?version=latest
+ :target: http://pypet.readthedocs.io/en/latest/?badge=latest
+ :alt: Documentation Status
- [![Travis Build Status](https://travis-ci.org/SmokinCaterpillar/pypet.svg?branch=master)](h…
- [![Appveyor Build status](https://ci.appveyor.com/api/projects/status/9amhj3iyf105xa2y/branch…
- [![Coverage Status](https://coveralls.io/repos/github/SmokinCaterpillar/pypet/badge.svg?branch=master)](https://coveralls.io/github/SmokinCaterpillar/pypet?branch=master)
- [![Codacy Badge](https://api.codacy.com/project/badge/grade/86268960751442799fcf6192b…
- [![PyPI version](https://badge.fury.io/py/pypet.svg)](https://badge.fury.io/py/pypet)
- [![Documentation Status](https://readthedocs.org/projects/pypet/badge/?version=latest)](http…
The new python parameter exploration toolkit:
*pypet* manages exploration of the parameter space
@@ -24,58 +50,68 @@
from a single source. Data I/O of your simulations and
analyses becomes a piece of cake!
+ Requirements
+ ------------
- ## Requirements
-
- Python 3.5 or 3.6 and
+ Python 3.6, 3.7 or 3.8 and
- * tables >= 3.1.1
- * pandas >= 0.20.0
+ *
+ tables >= 3.5.0
- * numpy >= 1.12.0
+ *
+ pandas >= 1.0.0
- * scipy >= 0.17.0
+ *
+ numpy >= 1.16.0
- * HDF5 >= 1.8.9
+ *
+ scipy >= 1.3.0
+ *
+ HDF5 >= 1.10.0
There are also some optional packages that you can but do not have to install.
If you want to combine *pypet* with SCOOP you need
+
* scoop >= 0.7.1
For git integration you additionally need
- * GitPython >= 0.3.1
+
+ * GitPython >= 3.1.3
To utilize the cap feature for multiprocessing you need
- * psutil >= 2.0.0
+
+ * psutil >= 5.7.0
To utilize the continuing of crashed trajectories you need
- * dill >= 0.2.1
+
+ * dill >= 0.3.1
Automatic Sumatra records are supported for
- * Sumatra >= 0.7.1
+ * Sumatra >= 0.7.1
- ## Python 2.7
+ Python 2.7
+ ----------
This release no longer supports Python 2.7.
If you are still using Python 2.7, you need to
use the pypet legacy version 0.3.0 (https://pypi.python.org/pypi/pypet/0.3.0).
-
- # What is pypet all about?
+ What is pypet all about?
+ ========================
Whenever you do numerical simulations in science, you come across two major challenges.
First, you need some way to save your data. Secondly, you extensively explore the parameter space.
In order to accomplish both you write some hacky I/O functionality to get it done the quick and
- dirty way. This means storing stuff into text files, as *MATLAB* *m*-files,
+ dirty way. This means storing stuff into text files, as *MATLAB* *m*\ -files,
or whatever comes in handy.
After a while and many simulations later, you want to look back at some of your very
@@ -90,33 +126,38 @@
that was not specific to my current simulations, but I could also use for future scientific
projects right out of the box.
- The python parameter exploration toolkit (*pypet*) provides a framework to define *parameters*
+ The python parameter exploration toolkit (\ *pypet*\ ) provides a framework to define *parameters*
that you need to run your simulations. You can actively explore these by following a
*trajectory* through the space spanned by the parameters.
And finally, you can get your *results* together and store everything appropriately to disk.
The storage format of choice is HDF5 (http://www.hdfgroup.org/HDF5/) via PyTables
(http://www.pytables.org/).
-
- ## Package Organization
+ Package Organization
+ --------------------
This project encompasses these core modules:
- * The `pypet.environment` module for handling the running of simulations
- * The `pypet.trajectory` module for managing the parameters and results,
- and providing a way to *explore* your parameter space. Somewhat related is also the
- `pypet.naturalnaming` module, that provides functionality to access and put data into
- the *trajectory*.
+ *
+ The ``pypet.environment`` module for handling the running of simulations
- * The `pypet.parameters` module including containers for parameters and results
+ *
+ The ``pypet.trajectory`` module for managing the parameters and results,
+ and providing a way to *explore* your parameter space. Somewhat related is also the
+ ``pypet.naturalnaming`` module, that provides functionality to access and put data into
+ the *trajectory*.
- * The `pypet.storageservice` for saving your data to disk
+ *
+ The ``pypet.parameters`` module including containers for parameters and results
+ *
+ The ``pypet.storageservice`` for saving your data to disk
- ## Install
+ Install
+ -------
- If you don't have all prerequisites (*numpy*, *scipy*, *tables*, *pandas*) install them first.
+ If you don't have all prerequisites (\ *numpy*\ , *scipy*\ , *tables*\ , *pandas*\ ) install them first.
These are standard python packages, so chances are high that they are already installed.
By the way, in case you use the python package manager ``pip``
you can list all installed packages with ``pip freeze``.
@@ -130,13 +171,13 @@
**Or**
- In case you use **Windows**, you have to download the tar file from https://pypi.python.org/pypi/pypet
+ In case you use **Windows**\ , you have to download the tar file from https://pypi.python.org/pypi/pypet
and unzip it. Next, open a windows terminal
- and navigate to your unpacked *pypet* files to the folder containing the `setup.py` file.
+ and navigate to your unpacked *pypet* files to the folder containing the ``setup.py`` file.
As above run from the terminal ``python setup.py install``.
-
- ## Documentation and Support
+ Documentation and Support
+ -------------------------
Documentation can be found on http://pypet.readthedocs.org/.
@@ -144,137 +185,162 @@
If you have any further questions feel free to contact me at **robert.meyer (at) ni.tu-berlin.de**.
+ Main Features
+ -------------
- ## Main Features
- * **Novel tree container** `Trajectory`, for handling and managing of
+ *
+ **Novel tree container** ``Trajectory``\ , for handling and managing of
parameters and results of numerical simulations
- * **Group** your parameters and results into meaningful categories
+ *
+ **Group** your parameters and results into meaningful categories
- * Access data via **natural naming**, e.g. `traj.parameters.traffic.ncars`
+ *
+ Access data via **natural naming**\ , e.g. ``traj.parameters.traffic.ncars``
- * Automatic **storage** of simulation data into HDF5 files via PyTables
+ *
+ Automatic **storage** of simulation data into HDF5 files via PyTables
- * Support for many different **data formats**
+ *
+ Support for many different **data formats**
- * python native data types: bool, int, long, float, str, complex
- * list, tuple, dict
+ *
+ python native data types: bool, int, long, float, str, complex
- * Numpy arrays and matrices
+ *
+ list, tuple, dict
- * Scipy sparse matrices
+ *
+ Numpy arrays and matrices
- * pandas DataFrames (http://pandas.pydata.org/)
+ *
+ Scipy sparse matrices
- * BRIAN2 quantities and monitors (http://briansimulator.org/)
+ *
+ pandas DataFrames (http://pandas.pydata.org/)
- * Easily **extendable** to other data formats!
+ *
+ BRIAN2 quantities and monitors (http://briansimulator.org/)
- * **Exploration** of the parameter space of your simulations
+ *
+ Easily **extendable** to other data formats!
- * **Merging** of *trajectories* residing in the same space
+ *
+ **Exploration** of the parameter space of your simulations
- * Support for **multiprocessing**, *pypet* can run your simulations in parallel
+ *
+ **Merging** of *trajectories* residing in the same space
- * **Analyse** your data on-the-fly during multiprocessing
+ *
+ Support for **multiprocessing**\ , *pypet* can run your simulations in parallel
- * **Adaptively** explore tha parameter space combining *pypet* with optimization
+ *
+ **Analyse** your data on-the-fly during multiprocessing
+
+ *
+ **Adaptively** explore tha parameter space combining *pypet* with optimization
tools like the evolutionary algorithms framework DEAP (http://deap.readthedocs.org/en/)
- * **Dynamic Loading**, load only the parts of your data you currently need
+ *
+ **Dynamic Loading**\ , load only the parts of your data you currently need
- * **Resume** a crashed or halted simulation
+ *
+ **Resume** a crashed or halted simulation
- * **Annotate** your parameters, results and groups
+ *
+ **Annotate** your parameters, results and groups
- * **Git Integration**, let *pypet* make automatic commits of your codebase
+ *
+ **Git Integration**\ , let *pypet* make automatic commits of your codebase
- * **Sumatra Integration**, let *pypet* add your simulations to the *electronic lab notebook* tool
+ *
+ **Sumatra Integration**\ , let *pypet* add your simulations to the *electronic lab notebook* tool
Sumatra (http://neuralensemble.org/sumatra/)
-
- * *pypet* can be used on **computing clusters** or multiple servers at once if it is combined with
- SCOOP (http://scoop.readthedocs.org/)
+ *
+ *pypet* can be used on **computing clusters** or multiple servers at once if it is combined with
+ SCOOP (http://scoop.readthedocs.org/)
- # Quick Working Example
+ Quick Working Example
+ =====================
The best way to show how stuff works is by giving examples. I will start right away with a
very simple code snippet.
Well, what we have in mind is some sort of numerical simulation. For now we will keep it simple,
- let's say we need to simulate the multiplication of 2 values, i.e. `z=x*y`.
- We have two objectives, a) we want to store results of this simulation `z` and
- b) we want to explore the parameter space and try different values of `x` and `y`.
+ let's say we need to simulate the multiplication of 2 values, i.e. ``z=x*y``.
+ We have two objectives, a) we want to store results of this simulation ``z`` and
+ b) we want to explore the parameter space and try different values of ``x`` and ``y``.
Let's take a look at the snippet at once:
- ```python
- from pypet import Environment, cartesian_product
+ .. code-block:: python
+
+ from pypet import Environment, cartesian_product
- def multiply(traj):
- """Example of a sophisticated simulation that involves multiplying two values.
+ def multiply(traj):
+ """Example of a sophisticated simulation that involves multiplying two values.
- :param traj:
+ :param traj:
- Trajectory containing the parameters in a particular combination,
- it also serves as a container for results.
+ Trajectory containing the parameters in a particular combination,
+ it also serves as a container for results.
- """
- z=traj.x * traj.y
- traj.f_add_result('z',z, comment='I am the product of two values!')
+ """
+ z=traj.x * traj.y
+ traj.f_add_result('z',z, comment='I am the product of two values!')
- # Create an environment that handles running our simulation
- env = Environment(trajectory='Multiplication',filename='./HDF/example_01.hdf5',
- file_title='Example_01',
- comment = 'I am the first example!')
+ # Create an environment that handles running our simulation
+ env = Environment(trajectory='Multiplication',filename='./HDF/example_01.hdf5',
+ file_title='Example_01',
+ comment = 'I am the first example!')
- # Get the trajectory from the environment
- traj = env.trajectory
+ # Get the trajectory from the environment
+ traj = env.trajectory
- # Add both parameters
- traj.f_add_parameter('x', 1.0, comment='Im the first dimension!')
- traj.f_add_parameter('y', 1.0, comment='Im the second dimension!')
+ # Add both parameters
+ traj.f_add_parameter('x', 1.0, comment='Im the first dimension!')
+ traj.f_add_parameter('y', 1.0, comment='Im the second dimension!')
- # Explore the parameters with a cartesian product
- traj.f_explore(cartesian_product({'x':[1.0,2.0,3.0,4.0], 'y':[6.0,7.0,8.0]}))
+ # Explore the parameters with a cartesian product
+ traj.f_explore(cartesian_product({'x':[1.0,2.0,3.0,4.0], 'y':[6.0,7.0,8.0]}))
- # Run the simulation with all parameter combinations
- env.run(multiply)
- ```
+ # Run the simulation with all parameter combinations
+ env.run(multiply)
And now let's go through it one by one. At first we have a job to do, that is multiplying two
values:
- ```python
- def multiply(traj):
- """Example of a sophisticated simulation that involves multiplying two values.
+ .. code-block:: python
+
+ def multiply(traj):
+ """Example of a sophisticated simulation that involves multiplying two values.
- :param traj:
+ :param traj:
- Trajectory containing the parameters in a particular combination,
- it also serves as a container for results.
+ Trajectory containing the parameters in a particular combination,
+ it also serves as a container for results.
- """
- z=traj.x * traj.y
- traj.f_add_result('z',z, comment='I am the product of two values!')
- ```
+ """
+ z=traj.x * traj.y
+ traj.f_add_result('z',z, comment='I am the product of two values!')
- This is our simulation function `multiply`. The function uses a so called *trajectory*
+ This is our simulation function ``multiply``. The function uses a so called *trajectory*
container which manages our parameters. We can access the parameters simply by natural naming,
- as seen above via `traj.x` and `traj.y`. The value of `z` is simply added as a result
- to the `traj` object.
+ as seen above via ``traj.x`` and ``traj.y``. The value of ``z`` is simply added as a result
+ to the ``traj`` object.
After the definition of the job that we want to simulate, we create an environment which
will run the simulation.
- ```python
- # Create an environment that handles running our simulation
- env = Environment(trajectory='Multiplication',filename='./HDF/example_01.hdf5',
- file_title='Example_01',
- comment = 'I am the first example!')
- ```
+ .. code-block:: python
+
+ # Create an environment that handles running our simulation
+ env = Environment(trajectory='Multiplication',filename='./HDF/example_01.hdf5',
+ file_title='Example_01',
+ comment = 'I am the first example!')
The environment uses some parameters here, that is the name of the new trajectory, a filename to
store the trajectory into, the title of the file, and a comment that is added to the trajectory.
@@ -283,41 +349,41 @@
Check out the documentation (http://pypet.readthedocs.org/) if you want to know more.
The environment will automatically generate a trajectory for us which we can access via:
- ```python
- # Get the trajectory from the environment
- traj = env.trajectory
- ```
+ .. code-block:: python
+
+ # Get the trajectory from the environment
+ traj = env.trajectory
Now we need to populate our trajectory with our parameters. They are added with the default values
- of `x=y=1.0`.
+ of ``x=y=1.0``.
- ```python
- # Add both parameters
- traj.f_add_parameter('x', 1.0, comment='Im the first dimension!')
- traj.f_add_parameter('y', 1.0, comment='Im the second dimension!')
- ```
-
- Well, calculating `1.0 * 1.0` is quite boring, we want to figure out more products, that is
- the results of the cartesian product set `{1.0,2.0,3.0,4.0} x {6.0,7.0,8.0}`.
- Therefore, we use `f_explore` in combination with the builder function
- `cartesian_product`.
-
- ```python
- # Explore the parameters with a cartesian product
- traj.f_explore(cartesian_product({'x':[1.0,2.0,3.0,4.0], 'y':[6.0,7.0,8.0]}))
- ```
+ .. code-block:: python
- Finally, we need to tell the environment to run our job `multiply` with all parameter
+ # Add both parameters
+ traj.f_add_parameter('x', 1.0, comment='Im the first dimension!')
+ traj.f_add_parameter('y', 1.0, comment='Im the second dimension!')
+
+ Well, calculating ``1.0 * 1.0`` is quite boring, we want to figure out more products, that is
+ the results of the cartesian product set ``{1.0,2.0,3.0,4.0} x {6.0,7.0,8.0}``.
+ Therefore, we use ``f_explore`` in combination with the builder function
+ ``cartesian_product``.
+
+ .. code-block:: python
+
+ # Explore the parameters with a cartesian product
+ traj.f_explore(cartesian_product({'x':[1.0,2.0,3.0,4.0], 'y':[6.0,7.0,8.0]}))
+
+ Finally, we need to tell the environment to run our job ``multiply`` with all parameter
combinations.
- ```python
- # Run the simulation with all parameter combinations
- env.run(multiply)
- ```
-
- And that's it. The environment will evoke the function `multiply` now 12 times with
- all parameter combinations. Every time it will pass a `traj` container with another one of these
- 12 combinations of different `x` and `y` values to calculate the value of `z`.
+ .. code-block:: python
+
+ # Run the simulation with all parameter combinations
+ env.run(multiply)
+
+ And that's it. The environment will evoke the function ``multiply`` now 12 times with
+ all parameter combinations. Every time it will pass a ``traj`` container with another one of these
+ 12 combinations of different ``x`` and ``y`` values to calculate the value of ``z``.
Moreover, the environment and the storage service will have taken care about the storage
of our trajectory - including the results we have computed - into an HDF5 file.
@@ -326,82 +392,91 @@
Cheers,
Robert
+ Miscellaneous
+ =============
- # Miscellaneous
+ Acknowledgements
+ ----------------
- ## Acknowledgements
- * Thanks to Robert Pröpper and Philipp Meier for answering all my Python questions
+ *
+ Thanks to Robert Pröpper and Philipp Meier for answering all my Python questions
- You might want to check out their SpykeViewer (https://github.com/rproepp/spykeviewer)
- tool for visualization of MEA recordings and NEO (http://pythonhosted.org/neo) data
+ You might want to check out their SpykeViewer (https://github.com/rproepp/spykeviewer)
+ tool for visualization of MEA recordings and NEO (http://pythonhosted.org/neo) data
- * Thanks to Owen Mackwood for his SNEP toolbox which provided the initial ideas
- for this project
-
- * Thanks to Mehmet Nevvaf Timur for his work on the SCOOP integration and the ``'NETQUEUE'`` feature
+ *
+ Thanks to Owen Mackwood for his SNEP toolbox which provided the initial ideas
+ for this project
- * Thanks to Henri Bunting for his work on the BRIAN2 subpackage
+ *
+ Thanks to Mehmet Nevvaf Timur for his work on the SCOOP integration and the ``'NETQUEUE'`` feature
- * Thanks to the BCCN Berlin (http://www.bccn-berlin.de),
- the Research Training Group GRK 1589/1, and the
- Neural Information Processing Group ( http://www.ni.tu-berlin.de) for support
+ *
+ Thanks to Henri Bunting for his work on the BRIAN2 subpackage
+ *
+ Thanks to the BCCN Berlin (http://www.bccn-berlin.de),
+ the Research Training Group GRK 1589/1, and the
+ Neural Information Processing Group ( http://www.ni.tu-berlin.de) for support
- ## Tests
+ Tests
+ -----
- Tests can be found in `pypet/tests`.
+ Tests can be found in ``pypet/tests``.
Note that they involve heavy file I/O and you need privileges
to write files to a temporary folder.
- The tests suite will make use of the `tempfile.gettempdir()` function to
+ The tests suite will make use of the ``tempfile.gettempdir()`` function to
create such a temporary folder.
- Each test module can be run individually, for instance `$ python trajectory_test.py`.
+ Each test module can be run individually, for instance ``$ python trajectory_test.py``.
- You can run **all** tests with `$ python all_tests.py` which can also be found under
- `pypet/tests`.
- You can pass additional arguments as `$ python all_tests.py -k --folder=myfolder/`
- with `-k` to keep the HDF5 and log files created by the tests
+ You can run **all** tests with ``$ python all_tests.py`` which can also be found under
+ ``pypet/tests``.
+ You can pass additional arguments as ``$ python all_tests.py -k --folder=myfolder/``
+ with ``-k`` to keep the HDF5 and log files created by the tests
(if you want to inspect them, otherwise they will be deleted after the completed tests),
- and `--folder=` to specify a folder where to store the HDF5 files instead of the temporary one.
- If the folder cannot be created, the program defaults to `tempfile.gettempdir()`.
+ and ``--folder=`` to specify a folder where to store the HDF5 files instead of the temporary one.
+ If the folder cannot be created, the program defaults to ``tempfile.gettempdir()``.
Running all tests can take up to 20 minutes. The test suite encompasses more than **1000** tests
- and has a code coverage of about **90%**!
+ and has a code coverage of about **90%**\ !
- Moreover, *pypet* is constantly tested with Python 3.5 and 3.6 for **Linux** using
+ Moreover, *pypet* is constantly tested with Python 3.7 and 3.8 for **Linux** using
Travis-CI. Testing for **Windows** platforms is performed via Appveyor.
The source code is available at https://github.com/SmokinCaterpillar/pypet/.
-
- ## License
+ License
+ -------
BSD, please read LICENSE file.
-
- ## Legal Notice
+ Legal Notice
+ ------------
*pypet* was created by Robert Meyer at the Neural Information Processing Group (TU Berlin),
supported by the Research Training Group GRK 1589/1.
+ Contact
+ -------
- ## Contact
-
- **robert.meyer (at) ni.tu-berlin.de**
+ **robert.meyer (at) alcemy.tech**
- Marchstr. 23
+ alcemy GmbH
- MAR 5.046
+ Choriner Str. 83
- D-10587 Berlin
+ 10119 Berlin, Germany
Platform: UNKNOWN
Classifier: Development Status :: 4 - Beta
Classifier: Programming Language :: Python :: 3.6
-Classifier: Programming Language :: Python :: 3.5
+Classifier: Programming Language :: Python :: 3.7
+Classifier: Programming Language :: Python :: 3.8
Classifier: Intended Audience :: Science/Research
Classifier: Natural Language :: English
Classifier: Operating System :: OS Independent
Classifier: Topic :: Scientific/Engineering
Classifier: License :: OSI Approved :: BSD License
Classifier: Topic :: Utilities
+Requires-Python: >=3.6
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/pypet-0.4.3/README.md new/pypet-0.5.1/README.md
--- old/pypet-0.4.3/README.md 2018-06-24 19:39:52.000000000 +0200
+++ new/pypet-0.5.1/README.md 2020-06-02 09:48:52.000000000 +0200
@@ -19,17 +19,17 @@
## Requirements
-Python 3.5 or 3.6 and
+Python 3.6, 3.7 or 3.8 and
-* tables >= 3.1.1
+* tables >= 3.5.0
-* pandas >= 0.20.0
+* pandas >= 1.0.0
-* numpy >= 1.12.0
+* numpy >= 1.16.0
-* scipy >= 0.17.0
+* scipy >= 1.3.0
-* HDF5 >= 1.8.9
+* HDF5 >= 1.10.0
There are also some optional packages that you can but do not have to install.
@@ -40,15 +40,15 @@
For git integration you additionally need
-* GitPython >= 0.3.1
+* GitPython >= 3.1.3
To utilize the cap feature for multiprocessing you need
-* psutil >= 2.0.0
+* psutil >= 5.7.0
To utilize the continuing of crashed trajectories you need
-* dill >= 0.2.1
+* dill >= 0.3.1
Automatic Sumatra records are supported for
@@ -361,7 +361,7 @@
Running all tests can take up to 20 minutes. The test suite encompasses more than **1000** tests
and has a code coverage of about **90%**!
-Moreover, *pypet* is constantly tested with Python 3.5 and 3.6 for **Linux** using
+Moreover, *pypet* is constantly tested with Python 3.7 and 3.8 for **Linux** using
Travis-CI. Testing for **Windows** platforms is performed via Appveyor.
The source code is available at https://github.com/SmokinCaterpillar/pypet/.
@@ -379,10 +379,10 @@
## Contact
-**robert.meyer (at) ni.tu-berlin.de**
+**robert.meyer (at) alcemy.tech**
-Marchstr. 23
+alcemy GmbH
-MAR 5.046
+Choriner Str. 83
-D-10587 Berlin
+10119 Berlin, Germany
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/pypet-0.4.3/pypet/_version.py new/pypet-0.5.1/pypet/_version.py
--- old/pypet-0.4.3/pypet/_version.py 2018-06-24 19:49:53.000000000 +0200
+++ new/pypet-0.5.1/pypet/_version.py 2020-06-02 09:53:26.000000000 +0200
@@ -1 +1 @@
-__version__ = '0.4.3'
+__version__ = '0.5.1'
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/pypet-0.4.3/pypet/brian2/parameter.py new/pypet-0.5.1/pypet/brian2/parameter.py
--- old/pypet-0.4.3/pypet/brian2/parameter.py 2018-06-24 19:39:52.000000000 +0200
+++ new/pypet-0.5.1/pypet/brian2/parameter.py 2020-06-01 22:42:02.000000000 +0200
@@ -44,7 +44,7 @@
return unit_from_expression(expr)
elif expr.__class__ is ast.Name:
return ALLUNITS[expr.id]
- elif expr.__class__ is ast.Num:
+ elif expr.__class__ is ast.Num or expr.__class__ is ast.Constant:
return expr.n
elif expr.__class__ is ast.UnaryOp:
op = expr.op.__class__.__name__
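The hunk above tracks an AST change in Python 3.8: numeric literals now parse as `ast.Constant`, so an identity check against `ast.Num` alone no longer matches them, even though the legacy `.n` attribute still works. A minimal standalone demonstration (not pypet code):

```python
import ast

# Since Python 3.8 a numeric literal parses as ast.Constant, so the
# old identity test `expr.__class__ is ast.Num` evaluates to False;
# the legacy .n attribute still returns the value.
node = ast.parse("3.5", mode="eval").body

print(node.__class__ is ast.Num)       # False on Python 3.8+
print(node.__class__ is ast.Constant)  # True on Python 3.8+
print(node.n)
```

This is why the patch accepts either class rather than replacing the `ast.Num` branch outright, keeping the code working across interpreter versions.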
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/pypet-0.4.3/pypet/storageservice.py new/pypet-0.5.1/pypet/storageservice.py
--- old/pypet-0.4.3/pypet/storageservice.py 2018-06-24 19:39:52.000000000 +0200
+++ new/pypet-0.5.1/pypet/storageservice.py 2020-06-01 22:42:02.000000000 +0200
@@ -4182,8 +4182,8 @@
"""
try:
if 'filters' not in kwargs:
- filters = self._all_get_filters(kwargs)
- kwargs['filters'] = filters
+ self._logger.debug(
+ 'filters are no longer supported by pandas')
if 'format' not in kwargs:
kwargs['format'] = self.pandas_format
if 'encoding' not in kwargs:
@@ -4198,8 +4198,9 @@
else:
self._logger.debug('Appending to pandas data `%s` in `%s`' % (key, fullname))
- if data is not None and (kwargs['format'] == 'f' or kwargs['format'] == 'fixed'):
- kwargs['expectedrows'] = data.shape[0]
+ if 'expectedrows' in kwargs:
+ self._logger.debug('expectedrows no longer supported by pandas, will '
+ 'ignore the option')
name = group._v_pathname + '/' + key
self._hdf5store.put(name, data, **kwargs)
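The two changes above both drop pytables options (``filters``, ``expectedrows``) that newer pandas ``HDFStore.put()`` no longer accepts. The pattern can be sketched as a small helper; ``sanitize_hdf_kwargs`` is a hypothetical name, not a pypet function:

```python
def sanitize_hdf_kwargs(kwargs, logger=print):
    """Warn about and drop kwargs that newer pandas HDFStore.put() rejects.

    Hypothetical helper illustrating the pattern in the patch above; the
    actual code only logs and relies on the options never being set.
    """
    for obsolete in ('filters', 'expectedrows'):
        if obsolete in kwargs:
            logger('`%s` is no longer supported by pandas, '
                   'ignoring the option' % obsolete)
            kwargs.pop(obsolete)
    return kwargs

cleaned = sanitize_hdf_kwargs({'filters': None, 'format': 'fixed'})
print(cleaned)  # {'format': 'fixed'}
```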
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/pypet-0.4.3/pypet/utils/comparisons.py new/pypet-0.5.1/pypet/utils/comparisons.py
--- old/pypet-0.4.3/pypet/utils/comparisons.py 2018-06-24 19:39:52.000000000 +0200
+++ new/pypet-0.5.1/pypet/utils/comparisons.py 2020-06-01 22:42:02.000000000 +0200
@@ -184,7 +184,7 @@
new_frame = a == b
new_frame = new_frame | (pd.isnull(a) & pd.isnull(b))
if isinstance(new_frame, pd.DataFrame):
- return np.all(new_frame.as_matrix())
+ return np.all(new_frame.values)
except (ValueError, TypeError):
# The Value Error can happen if the data frame is of dtype=object and contains
# numpy arrays. Numpy array comparisons do not evaluate to a single truth value
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/pypet-0.4.3/pypet/utils/helpful_classes.py new/pypet-0.5.1/pypet/utils/helpful_classes.py
--- old/pypet-0.4.3/pypet/utils/helpful_classes.py 2018-06-24 12:01:13.000000000 +0200
+++ new/pypet-0.5.1/pypet/utils/helpful_classes.py 2020-06-01 22:42:02.000000000 +0200
@@ -62,7 +62,12 @@
def __iter__(self):
while True:
- yield self.next()
+ try:
+ yield self.next()
+ except StopIteration:
+ # new behavior since PEP479
+ # one should return to stop iteration
+ return
class ChainMap(object):
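The try/except added above reflects PEP 479: from Python 3.7 on, a ``StopIteration`` escaping a generator body is converted into a ``RuntimeError`` instead of silently ending iteration, so the generator must catch it and ``return``. A standalone sketch:

```python
def broken(it):
    while True:
        yield next(it)  # StopIteration leaks out -> RuntimeError (PEP 479)

def fixed(it):
    while True:
        try:
            yield next(it)
        except StopIteration:
            return  # returning is the correct way to end the generator

print(list(fixed(iter([1, 2]))))  # [1, 2]
try:
    list(broken(iter([1, 2])))
except RuntimeError:
    print('PEP 479: StopIteration became RuntimeError')
```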
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/pypet-0.4.3/pypet.egg-info/PKG-INFO new/pypet-0.5.1/pypet.egg-info/PKG-INFO
--- old/pypet-0.4.3/pypet.egg-info/PKG-INFO 2018-06-24 20:51:52.000000000 +0200
+++ new/pypet-0.5.1/pypet.egg-info/PKG-INFO 2020-06-02 12:57:56.000000000 +0200
@@ -1,19 +1,45 @@
-Metadata-Version: 1.1
+Metadata-Version: 1.2
Name: pypet
-Version: 0.4.3
+Version: 0.5.1
Summary: A toolkit for numerical simulations to allow easy parameter exploration and storage of results.
Home-page: https://github.com/SmokinCaterpillar/pypet
Author: Robert Meyer
-Author-email: robert.meyer(a)ni.tu-berlin.de
+Author-email: robert.meyer(a)alcemy.tech
License: BSD
-Description: # pypet
+Description:
+ pypet
+ =====
+
+
+ .. image:: https://travis-ci.org/SmokinCaterpillar/pypet.svg?branch=master
+ :target: https://travis-ci.org/SmokinCaterpillar/pypet
+ :alt: Travis Build Status
+
+
+ .. image:: https://ci.appveyor.com/api/projects/status/9amhj3iyf105xa2y/branch/master?…
+ :target: https://ci.appveyor.com/project/SmokinCaterpillar/pypet/branch/master
+ :alt: Appveyor Build status
+
+
+ .. image:: https://coveralls.io/repos/github/SmokinCaterpillar/pypet/badge.svg?branch=…
+ :target: https://coveralls.io/github/SmokinCaterpillar/pypet?branch=master
+ :alt: Coverage Status
+
+
+ .. image:: https://api.codacy.com/project/badge/grade/86268960751442799fcf6192b36e386f
+ :target: https://www.codacy.com/app/robert-meyer/pypet
+ :alt: Codacy Badge
+
+
+ .. image:: https://badge.fury.io/py/pypet.svg
+ :target: https://badge.fury.io/py/pypet
+ :alt: PyPI version
+
+
+ .. image:: https://readthedocs.org/projects/pypet/badge/?version=latest
+ :target: http://pypet.readthedocs.io/en/latest/?badge=latest
+ :alt: Documentation Status
- [![Travis Build Status](https://travis-ci.org/SmokinCaterpillar/pypet.svg?branch=master)](h…
- [![Appveyor Build status](https://ci.appveyor.com/api/projects/status/9amhj3iyf105xa2y/branch…
- [![Coverage Status](https://coveralls.io/repos/github/SmokinCaterpillar/pypet/badge.svg?branch=master)](https://coveralls.io/github/SmokinCaterpillar/pypet?branch=master)
- [![Codacy Badge](https://api.codacy.com/project/badge/grade/86268960751442799fcf6192b…
- [![PyPI version](https://badge.fury.io/py/pypet.svg)](https://badge.fury.io/py/pypet)
- [![Documentation Status](https://readthedocs.org/projects/pypet/badge/?version=latest)](http…
The new python parameter exploration toolkit:
*pypet* manages exploration of the parameter space
@@ -24,58 +50,68 @@
from a single source. Data I/O of your simulations and
analyses becomes a piece of cake!
+ Requirements
+ ------------
- ## Requirements
-
- Python 3.5 or 3.6 and
+ Python 3.6, 3.7 or 3.8 and
- * tables >= 3.1.1
- * pandas >= 0.20.0
+ *
+ tables >= 3.5.0
- * numpy >= 1.12.0
+ *
+ pandas >= 1.0.0
- * scipy >= 0.17.0
+ *
+ numpy >= 1.16.0
- * HDF5 >= 1.8.9
+ *
+ scipy >= 1.3.0
+ *
+ HDF5 >= 1.10.0
There are also some optional packages that you can but do not have to install.
If you want to combine *pypet* with SCOOP you need
+
* scoop >= 0.7.1
For git integration you additionally need
- * GitPython >= 0.3.1
+
+ * GitPython >= 3.1.3
To utilize the cap feature for multiprocessing you need
- * psutil >= 2.0.0
+
+ * psutil >= 5.7.0
To utilize the continuing of crashed trajectories you need
- * dill >= 0.2.1
+
+ * dill >= 0.3.1
Automatic Sumatra records are supported for
- * Sumatra >= 0.7.1
+ * Sumatra >= 0.7.1
- ## Python 2.7
+ Python 2.7
+ ----------
This release no longer supports Python 2.7.
If you are still using Python 2.7, you need to
use the pypet legacy version 0.3.0 (https://pypi.python.org/pypi/pypet/0.3.0).
-
- # What is pypet all about?
+ What is pypet all about?
+ ========================
Whenever you do numerical simulations in science, you come across two major challenges.
First, you need some way to save your data. Secondly, you extensively explore the parameter space.
In order to accomplish both you write some hacky I/O functionality to get it done the quick and
- dirty way. This means storing stuff into text files, as *MATLAB* *m*-files,
+ dirty way. This means storing stuff into text files, as *MATLAB* *m*\ -files,
or whatever comes in handy.
After a while and many simulations later, you want to look back at some of your very
@@ -90,33 +126,38 @@
that was not specific to my current simulations, but I could also use for future scientific
projects right out of the box.
- The python parameter exploration toolkit (*pypet*) provides a framework to define *parameters*
+ The python parameter exploration toolkit (\ *pypet*\ ) provides a framework to define *parameters*
that you need to run your simulations. You can actively explore these by following a
*trajectory* through the space spanned by the parameters.
And finally, you can get your *results* together and store everything appropriately to disk.
The storage format of choice is HDF5 (http://www.hdfgroup.org/HDF5/) via PyTables
(http://www.pytables.org/).
-
- ## Package Organization
+ Package Organization
+ --------------------
This project encompasses these core modules:
- * The `pypet.environment` module for handling the running of simulations
- * The `pypet.trajectory` module for managing the parameters and results,
- and providing a way to *explore* your parameter space. Somewhat related is also the
- `pypet.naturalnaming` module, that provides functionality to access and put data into
- the *trajectory*.
+ *
+ The ``pypet.environment`` module for handling the running of simulations
- * The `pypet.parameters` module including containers for parameters and results
+ *
+ The ``pypet.trajectory`` module for managing the parameters and results,
+ and providing a way to *explore* your parameter space. Somewhat related is also the
+ ``pypet.naturalnaming`` module, that provides functionality to access and put data into
+ the *trajectory*.
- * The `pypet.storageservice` for saving your data to disk
+ *
+ The ``pypet.parameters`` module including containers for parameters and results
+ *
+ The ``pypet.storageservice`` for saving your data to disk
- ## Install
+ Install
+ -------
- If you don't have all prerequisites (*numpy*, *scipy*, *tables*, *pandas*) install them first.
+ If you don't have all prerequisites (\ *numpy*\ , *scipy*\ , *tables*\ , *pandas*\ ) install them first.
These are standard python packages, so chances are high that they are already installed.
By the way, in case you use the python package manager ``pip``
you can list all installed packages with ``pip freeze``.
@@ -130,13 +171,13 @@
**Or**
- In case you use **Windows**, you have to download the tar file from https://pypi.python.org/pypi/pypet
+ In case you use **Windows**\ , you have to download the tar file from https://pypi.python.org/pypi/pypet
and unzip it. Next, open a windows terminal
- and navigate to your unpacked *pypet* files to the folder containing the `setup.py` file.
+ and navigate to your unpacked *pypet* files to the folder containing the ``setup.py`` file.
As above run from the terminal ``python setup.py install``.
-
- ## Documentation and Support
+ Documentation and Support
+ -------------------------
Documentation can be found on http://pypet.readthedocs.org/.
@@ -144,137 +185,162 @@
If you have any further questions feel free to contact me at **robert.meyer (at) ni.tu-berlin.de**.
+ Main Features
+ -------------
- ## Main Features
- * **Novel tree container** `Trajectory`, for handling and managing of
+ *
+ **Novel tree container** ``Trajectory``\ , for handling and managing of
parameters and results of numerical simulations
- * **Group** your parameters and results into meaningful categories
+ *
+ **Group** your parameters and results into meaningful categories
- * Access data via **natural naming**, e.g. `traj.parameters.traffic.ncars`
+ *
+ Access data via **natural naming**\ , e.g. ``traj.parameters.traffic.ncars``
- * Automatic **storage** of simulation data into HDF5 files via PyTables
+ *
+ Automatic **storage** of simulation data into HDF5 files via PyTables
- * Support for many different **data formats**
+ *
+ Support for many different **data formats**
- * python native data types: bool, int, long, float, str, complex
- * list, tuple, dict
+ *
+ python native data types: bool, int, long, float, str, complex
- * Numpy arrays and matrices
+ *
+ list, tuple, dict
- * Scipy sparse matrices
+ *
+ Numpy arrays and matrices
- * pandas DataFrames (http://pandas.pydata.org/)
+ *
+ Scipy sparse matrices
- * BRIAN2 quantities and monitors (http://briansimulator.org/)
+ *
+ pandas DataFrames (http://pandas.pydata.org/)
- * Easily **extendable** to other data formats!
+ *
+ BRIAN2 quantities and monitors (http://briansimulator.org/)
- * **Exploration** of the parameter space of your simulations
+ *
+ Easily **extendable** to other data formats!
- * **Merging** of *trajectories* residing in the same space
+ *
+ **Exploration** of the parameter space of your simulations
- * Support for **multiprocessing**, *pypet* can run your simulations in parallel
+ *
+ **Merging** of *trajectories* residing in the same space
- * **Analyse** your data on-the-fly during multiprocessing
+ *
+ Support for **multiprocessing**\ , *pypet* can run your simulations in parallel
- * **Adaptively** explore tha parameter space combining *pypet* with optimization
+ *
+ **Analyse** your data on-the-fly during multiprocessing
+
+ *
+     **Adaptively** explore the parameter space combining *pypet* with optimization
tools like the evolutionary algorithms framework DEAP (http://deap.readthedocs.org/en/)
- * **Dynamic Loading**, load only the parts of your data you currently need
+ *
+ **Dynamic Loading**\ , load only the parts of your data you currently need
- * **Resume** a crashed or halted simulation
+ *
+ **Resume** a crashed or halted simulation
- * **Annotate** your parameters, results and groups
+ *
+ **Annotate** your parameters, results and groups
- * **Git Integration**, let *pypet* make automatic commits of your codebase
+ *
+ **Git Integration**\ , let *pypet* make automatic commits of your codebase
- * **Sumatra Integration**, let *pypet* add your simulations to the *electronic lab notebook* tool
+ *
+ **Sumatra Integration**\ , let *pypet* add your simulations to the *electronic lab notebook* tool
Sumatra (http://neuralensemble.org/sumatra/)
-
- * *pypet* can be used on **computing clusters** or multiple servers at once if it is combined with
- SCOOP (http://scoop.readthedocs.org/)
+ *
+ *pypet* can be used on **computing clusters** or multiple servers at once if it is combined with
+ SCOOP (http://scoop.readthedocs.org/)
- # Quick Working Example
+ Quick Working Example
+ =====================
The best way to show how stuff works is by giving examples. I will start right away with a
very simple code snippet.
Well, what we have in mind is some sort of numerical simulation. For now we will keep it simple,
- let's say we need to simulate the multiplication of 2 values, i.e. `z=x*y`.
- We have two objectives, a) we want to store results of this simulation `z` and
- b) we want to explore the parameter space and try different values of `x` and `y`.
+ let's say we need to simulate the multiplication of 2 values, i.e. ``z=x*y``.
+ We have two objectives, a) we want to store results of this simulation ``z`` and
+ b) we want to explore the parameter space and try different values of ``x`` and ``y``.
Let's take a look at the snippet at once:
- ```python
- from pypet import Environment, cartesian_product
+ .. code-block:: python
+
+ from pypet import Environment, cartesian_product
- def multiply(traj):
- """Example of a sophisticated simulation that involves multiplying two values.
+ def multiply(traj):
+ """Example of a sophisticated simulation that involves multiplying two values.
- :param traj:
+ :param traj:
- Trajectory containing the parameters in a particular combination,
- it also serves as a container for results.
+ Trajectory containing the parameters in a particular combination,
+ it also serves as a container for results.
- """
- z=traj.x * traj.y
- traj.f_add_result('z',z, comment='I am the product of two values!')
+ """
+ z=traj.x * traj.y
+ traj.f_add_result('z',z, comment='I am the product of two values!')
- # Create an environment that handles running our simulation
- env = Environment(trajectory='Multiplication',filename='./HDF/example_01.hdf5',
- file_title='Example_01',
- comment = 'I am the first example!')
+ # Create an environment that handles running our simulation
+ env = Environment(trajectory='Multiplication',filename='./HDF/example_01.hdf5',
+ file_title='Example_01',
+ comment = 'I am the first example!')
- # Get the trajectory from the environment
- traj = env.trajectory
+ # Get the trajectory from the environment
+ traj = env.trajectory
- # Add both parameters
- traj.f_add_parameter('x', 1.0, comment='Im the first dimension!')
- traj.f_add_parameter('y', 1.0, comment='Im the second dimension!')
+ # Add both parameters
+ traj.f_add_parameter('x', 1.0, comment='Im the first dimension!')
+ traj.f_add_parameter('y', 1.0, comment='Im the second dimension!')
- # Explore the parameters with a cartesian product
- traj.f_explore(cartesian_product({'x':[1.0,2.0,3.0,4.0], 'y':[6.0,7.0,8.0]}))
+ # Explore the parameters with a cartesian product
+ traj.f_explore(cartesian_product({'x':[1.0,2.0,3.0,4.0], 'y':[6.0,7.0,8.0]}))
- # Run the simulation with all parameter combinations
- env.run(multiply)
- ```
+ # Run the simulation with all parameter combinations
+ env.run(multiply)
And now let's go through it one by one. At first we have a job to do, that is multiplying two
values:
- ```python
- def multiply(traj):
- """Example of a sophisticated simulation that involves multiplying two values.
+ .. code-block:: python
+
+ def multiply(traj):
+ """Example of a sophisticated simulation that involves multiplying two values.
- :param traj:
+ :param traj:
- Trajectory containing the parameters in a particular combination,
- it also serves as a container for results.
+ Trajectory containing the parameters in a particular combination,
+ it also serves as a container for results.
- """
- z=traj.x * traj.y
- traj.f_add_result('z',z, comment='I am the product of two values!')
- ```
+ """
+ z=traj.x * traj.y
+ traj.f_add_result('z',z, comment='I am the product of two values!')
- This is our simulation function `multiply`. The function uses a so called *trajectory*
+ This is our simulation function ``multiply``. The function uses a so called *trajectory*
container which manages our parameters. We can access the parameters simply by natural naming,
- as seen above via `traj.x` and `traj.y`. The value of `z` is simply added as a result
- to the `traj` object.
+ as seen above via ``traj.x`` and ``traj.y``. The value of ``z`` is simply added as a result
+ to the ``traj`` object.
After the definition of the job that we want to simulate, we create an environment which
will run the simulation.
- ```python
- # Create an environment that handles running our simulation
- env = Environment(trajectory='Multiplication',filename='./HDF/example_01.hdf5',
- file_title='Example_01',
- comment = 'I am the first example!')
- ```
+ .. code-block:: python
+
+ # Create an environment that handles running our simulation
+ env = Environment(trajectory='Multiplication',filename='./HDF/example_01.hdf5',
+ file_title='Example_01',
+ comment = 'I am the first example!')
The environment uses some parameters here, that is the name of the new trajectory, a filename to
store the trajectory into, the title of the file, and a comment that is added to the trajectory.
@@ -283,41 +349,41 @@
Check out the documentation (http://pypet.readthedocs.org/) if you want to know more.
The environment will automatically generate a trajectory for us which we can access via:
- ```python
- # Get the trajectory from the environment
- traj = env.trajectory
- ```
+ .. code-block:: python
+
+ # Get the trajectory from the environment
+ traj = env.trajectory
Now we need to populate our trajectory with our parameters. They are added with the default values
- of `x=y=1.0`.
+ of ``x=y=1.0``.
- ```python
- # Add both parameters
- traj.f_add_parameter('x', 1.0, comment='Im the first dimension!')
- traj.f_add_parameter('y', 1.0, comment='Im the second dimension!')
- ```
-
- Well, calculating `1.0 * 1.0` is quite boring, we want to figure out more products, that is
- the results of the cartesian product set `{1.0,2.0,3.0,4.0} x {6.0,7.0,8.0}`.
- Therefore, we use `f_explore` in combination with the builder function
- `cartesian_product`.
-
- ```python
- # Explore the parameters with a cartesian product
- traj.f_explore(cartesian_product({'x':[1.0,2.0,3.0,4.0], 'y':[6.0,7.0,8.0]}))
- ```
+ .. code-block:: python
- Finally, we need to tell the environment to run our job `multiply` with all parameter
+ # Add both parameters
+ traj.f_add_parameter('x', 1.0, comment='Im the first dimension!')
+ traj.f_add_parameter('y', 1.0, comment='Im the second dimension!')
+
+ Well, calculating ``1.0 * 1.0`` is quite boring, we want to figure out more products, that is
+ the results of the cartesian product set ``{1.0,2.0,3.0,4.0} x {6.0,7.0,8.0}``.
+ Therefore, we use ``f_explore`` in combination with the builder function
+ ``cartesian_product``.
+
+ .. code-block:: python
+
+ # Explore the parameters with a cartesian product
+ traj.f_explore(cartesian_product({'x':[1.0,2.0,3.0,4.0], 'y':[6.0,7.0,8.0]}))
+
+ Finally, we need to tell the environment to run our job ``multiply`` with all parameter
combinations.
- ```python
- # Run the simulation with all parameter combinations
- env.run(multiply)
- ```
-
- And that's it. The environment will evoke the function `multiply` now 12 times with
- all parameter combinations. Every time it will pass a `traj` container with another one of these
- 12 combinations of different `x` and `y` values to calculate the value of `z`.
+ .. code-block:: python
+
+ # Run the simulation with all parameter combinations
+ env.run(multiply)
+
+         And that's it. The environment will invoke the function ``multiply`` now 12 times with
+ all parameter combinations. Every time it will pass a ``traj`` container with another one of these
+ 12 combinations of different ``x`` and ``y`` values to calculate the value of ``z``.
Moreover, the environment and the storage service will have taken care about the storage
of our trajectory - including the results we have computed - into an HDF5 file.
@@ -326,82 +392,91 @@
Cheers,
Robert
+ Miscellaneous
+ =============
- # Miscellaneous
+ Acknowledgements
+ ----------------
- ## Acknowledgements
- * Thanks to Robert Pröpper and Philipp Meier for answering all my Python questions
+ *
+ Thanks to Robert Pröpper and Philipp Meier for answering all my Python questions
- You might want to check out their SpykeViewer (https://github.com/rproepp/spykeviewer)
- tool for visualization of MEA recordings and NEO (http://pythonhosted.org/neo) data
+ You might want to check out their SpykeViewer (https://github.com/rproepp/spykeviewer)
+ tool for visualization of MEA recordings and NEO (http://pythonhosted.org/neo) data
- * Thanks to Owen Mackwood for his SNEP toolbox which provided the initial ideas
- for this project
-
- * Thanks to Mehmet Nevvaf Timur for his work on the SCOOP integration and the ``'NETQUEUE'`` feature
+ *
+ Thanks to Owen Mackwood for his SNEP toolbox which provided the initial ideas
+ for this project
- * Thanks to Henri Bunting for his work on the BRIAN2 subpackage
+ *
+ Thanks to Mehmet Nevvaf Timur for his work on the SCOOP integration and the ``'NETQUEUE'`` feature
- * Thanks to the BCCN Berlin (http://www.bccn-berlin.de),
- the Research Training Group GRK 1589/1, and the
- Neural Information Processing Group ( http://www.ni.tu-berlin.de) for support
+ *
+ Thanks to Henri Bunting for his work on the BRIAN2 subpackage
+ *
+ Thanks to the BCCN Berlin (http://www.bccn-berlin.de),
+ the Research Training Group GRK 1589/1, and the
+ Neural Information Processing Group ( http://www.ni.tu-berlin.de) for support
- ## Tests
+ Tests
+ -----
- Tests can be found in `pypet/tests`.
+ Tests can be found in ``pypet/tests``.
Note that they involve heavy file I/O and you need privileges
to write files to a temporary folder.
- The tests suite will make use of the `tempfile.gettempdir()` function to
+ The tests suite will make use of the ``tempfile.gettempdir()`` function to
create such a temporary folder.
- Each test module can be run individually, for instance `$ python trajectory_test.py`.
+ Each test module can be run individually, for instance ``$ python trajectory_test.py``.
- You can run **all** tests with `$ python all_tests.py` which can also be found under
- `pypet/tests`.
- You can pass additional arguments as `$ python all_tests.py -k --folder=myfolder/`
- with `-k` to keep the HDF5 and log files created by the tests
+ You can run **all** tests with ``$ python all_tests.py`` which can also be found under
+ ``pypet/tests``.
+ You can pass additional arguments as ``$ python all_tests.py -k --folder=myfolder/``
+ with ``-k`` to keep the HDF5 and log files created by the tests
(if you want to inspect them, otherwise they will be deleted after the completed tests),
- and `--folder=` to specify a folder where to store the HDF5 files instead of the temporary one.
- If the folder cannot be created, the program defaults to `tempfile.gettempdir()`.
+ and ``--folder=`` to specify a folder where to store the HDF5 files instead of the temporary one.
+ If the folder cannot be created, the program defaults to ``tempfile.gettempdir()``.
Running all tests can take up to 20 minutes. The test suite encompasses more than **1000** tests
- and has a code coverage of about **90%**!
+ and has a code coverage of about **90%**\ !
- Moreover, *pypet* is constantly tested with Python 3.5 and 3.6 for **Linux** using
+ Moreover, *pypet* is constantly tested with Python 3.7 and 3.8 for **Linux** using
Travis-CI. Testing for **Windows** platforms is performed via Appveyor.
The source code is available at https://github.com/SmokinCaterpillar/pypet/.
-
- ## License
+ License
+ -------
BSD, please read LICENSE file.
-
- ## Legal Notice
+ Legal Notice
+ ------------
*pypet* was created by Robert Meyer at the Neural Information Processing Group (TU Berlin),
supported by the Research Training Group GRK 1589/1.
+ Contact
+ -------
- ## Contact
-
- **robert.meyer (at) ni.tu-berlin.de**
+ **robert.meyer (at) alcemy.tech**
- Marchstr. 23
+ alcemy GmbH
- MAR 5.046
+ Choriner Str. 83
- D-10587 Berlin
+ 10119 Berlin, Germany
Platform: UNKNOWN
Classifier: Development Status :: 4 - Beta
Classifier: Programming Language :: Python :: 3.6
-Classifier: Programming Language :: Python :: 3.5
+Classifier: Programming Language :: Python :: 3.7
+Classifier: Programming Language :: Python :: 3.8
Classifier: Intended Audience :: Science/Research
Classifier: Natural Language :: English
Classifier: Operating System :: OS Independent
Classifier: Topic :: Scientific/Engineering
Classifier: License :: OSI Approved :: BSD License
Classifier: Topic :: Utilities
+Requires-Python: >=3.6
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/pypet-0.4.3/setup.py new/pypet-0.5.1/setup.py
--- old/pypet-0.4.3/setup.py 2018-06-24 19:39:52.000000000 +0200
+++ new/pypet-0.5.1/setup.py 2020-06-02 11:52:57.000000000 +0200
@@ -1,18 +1,22 @@
__author__ = 'Robert Meyer'
import re
-import sys
-
try:
from setuptools import setup
except ImportError:
from distutils.core import setup
+try:
+ # Used to convert md to rst for pypi, otherwise not needed
+ import m2r
+except ImportError:
+ m2r = None
install_requires=[
- 'tables',
- 'pandas',
- 'numpy',
- 'scipy']
+ 'tables',
+ 'pandas',
+ 'numpy',
+ 'scipy'
+]
# For versioning, Version found in pypet._version.py
verstrline = open('pypet/_version.py', "rt").read()
@@ -24,6 +28,14 @@
else:
raise RuntimeError('Unable to find version in pypet/_version.py')
+description = ('A toolkit for numerical simulations to allow '
+ 'easy parameter exploration and storage of results.')
+if m2r is None:
+ long_description = description
+else:
+ # convert markdown to rst
+ long_description = m2r.convert(open('README.md').read())
+
setup(
name='pypet',
version=verstr,
@@ -41,19 +53,21 @@
package_data={'pypet.tests': ['testdata/*.hdf5'], 'pypet': ['logging/*.ini']},
license='BSD',
author='Robert Meyer',
- author_email='robert.meyer(a)ni.tu-berlin.de',
- description='A toolkit for numerical simulations to allow easy parameter exploration and storage of results.',
- long_description=open('README.md').read(),
+ author_email='robert.meyer(a)alcemy.tech',
+ description=description,
+ long_description=long_description,
url='https://github.com/SmokinCaterpillar/pypet',
install_requires=install_requires,
classifiers=[
'Development Status :: 4 - Beta',
'Programming Language :: Python :: 3.6',
- 'Programming Language :: Python :: 3.5',
+ 'Programming Language :: Python :: 3.7',
+ 'Programming Language :: Python :: 3.8',
'Intended Audience :: Science/Research',
'Natural Language :: English',
'Operating System :: OS Independent',
'Topic :: Scientific/Engineering',
'License :: OSI Approved :: BSD License',
- 'Topic :: Utilities']
+ 'Topic :: Utilities'],
+ python_requires='>=3.6',
)
\ No newline at end of file
Hello community,
here is the log from the commit of package python-scikit-dsp-comm for openSUSE:Factory checked in at 2020-06-30 21:56:42
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/python-scikit-dsp-comm (Old)
and /work/SRC/openSUSE:Factory/.python-scikit-dsp-comm.new.3060 (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Package is "python-scikit-dsp-comm"
Tue Jun 30 21:56:42 2020 rev:2 rq:817769 version:1.1.0
Changes:
--------
--- /work/SRC/openSUSE:Factory/python-scikit-dsp-comm/python-scikit-dsp-comm.changes 2018-11-06 14:33:46.209097881 +0100
+++ /work/SRC/openSUSE:Factory/.python-scikit-dsp-comm.new.3060/python-scikit-dsp-comm.changes 2020-06-30 21:56:46.414816119 +0200
@@ -1,0 +2,69 @@
+Mon Jun 29 18:52:20 UTC 2020 - Todd R <toddrme2178(a)gmail.com>
+
+- Update to version 1.1.0
+ * Added imports to examples
+ * Adding FIR and IIR filter design notebook
+ * Adding FIR csv file
+ * Adding a device check inside pyaudio_helper before going to pyaudio;
+ * Adding audio resources for Real-Time ipynb
+ * Adding base stripped ipynb for pyaudio example
+ * Adding block diagram figure and updated notebook
+ * Adding final images to notebook
+ * Adding full name for music buffer plot image
+ * Adding images to manifest so they will show up on pypi docs next time.
+ * Adding ipykernel; missing requirement for jupyter notebooks
+ * Adding ipynb with changes for paths to files; first widget image added
+ * Adding jupyter notebook and toctree setup
+ * Adding mic noise file
+ * Adding multirate processing notebook
+ * Adding music buffer capture plot to notebook
+ * Adding nb examples to a toctree
+ * Adding nbsphinx setup
+ * Adding new cell for random seed
+ * Adding path change for three band equalizer csv
+ * Adding readme file to docs
+ * Adding reqs for widget building
+ * Adding spaces for proper latex parsing
+ * Adding the Convolutional Codes notebook to the docs
+ * Adding three band equalizer csv file
+ * Adds a check for integer values of M only for downsample; Fixes: #16
+ * Also need ipywidgets
+ * Bumping version for new releaes with viterbi rate 1/3
+ * FIR and IIR Filter Design gets its own section
+ * Fixing doc strings for viterbi
+ * Fixing in out check function and warning text
+ * Include license in sdists
+ * Include license in wheels
+ * Initializing class elements to blank for accessors
+ * Making changes to PLL_cbb docstring
+ * Making return string doc more explicit, refs: #12
+ * Minimum matplotlib requirement added for rtd
+ * Modifying PLL doc string
+ * Modifying doc string to remove indents
+ * Modifying phase step doc string
+ * Modifying time step doc string
+ * Moving print lines to logging output at debug level
+ * New file to hold jupyter notebook example links
+ * Now using one file for versioning project; setup and docs
+ * Optfir module functionality was rewritten into fir_design_helper
+ * Raising warnings to errors to force correct usage of devices
+ * Refactoring test to deconflict naming for rate 1/3
+ * Remove if statement from viterbi_decoder. Get length from rate denomi…
+ * Remove rate if statements from bm_calc. Get length from rate denomina…
+ * Removed optfir from __init__.py.
+ * Removing optfir dependency and fix mlab.find warnings.
+ * Removing optfir from docs
+ * Require 0/1 integers for hard decision Viterbi decoding, add examples…
+ * Test for saturation limit in simpleQuant.
+ * Use a ValueError to raise an error instead of exiting a users process
+ * Use fraction for rate instead of string
+ * Using dc MPSK_bb which has fixes for Ns
+ * Using markdown long description instead of converting with pandoc
+ * extend to rate 1/3 convolutional codes
+ * from future needs to be moved to first line
+ * ignore vs code JSON file.
+ * implement two channel (stereo) in loop_audio class
+ * only_directive is builtin now so removing from extensions list
+ * using Regexp for 2.7 compat
+
+-------------------------------------------------------------------
Old:
----
LICENSE.md
scikit-dsp-comm-0.0.5.tar.gz
New:
----
scikit-dsp-comm-1.1.0.tar.gz
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Other differences:
------------------
++++++ python-scikit-dsp-comm.spec ++++++
--- /var/tmp/diff_new_pack.FKoYeL/_old 2020-06-30 21:56:47.058818111 +0200
+++ /var/tmp/diff_new_pack.FKoYeL/_new 2020-06-30 21:56:47.062818124 +0200
@@ -1,7 +1,7 @@
#
# spec file for package python-scikit-dsp-comm
#
-# Copyright (c) 2018 SUSE LINUX GmbH, Nuernberg, Germany.
+# Copyright (c) 2020 SUSE LLC
#
# All modifications and additions to the file contributed by third parties
# remain the property of their copyright owners, unless otherwise agreed
@@ -18,14 +18,13 @@
%{?!python_module:%define python_module() python-%{**} python3-%{**}}
Name: python-scikit-dsp-comm
-Version: 0.0.5
+Version: 1.1.0
Release: 0
Summary: DSP and Comm package for Python
License: BSD-2-Clause
Group: Development/Languages/Python
Url: https://github.com/mwickert/scikit-dsp-comm
Source0: https://files.pythonhosted.org/packages/source/s/scikit-dsp-comm/scikit-dsp…
-Source10: https://raw.githubusercontent.com/mwickert/scikit-dsp-comm/v%{version}/LICE…
Source100: python-scikit-dsp-comm-rpmlintrc
BuildRequires: %{python_module devel}
BuildRequires: %{python_module setuptools}
@@ -35,7 +34,6 @@
BuildRequires: %{python_module matplotlib}
BuildRequires: %{python_module nose}
BuildRequires: %{python_module numpy}
-BuildRequires: %{python_module numpy}
BuildRequires: %{python_module scipy}
BuildRequires: %{python_module tox}
# /SECTION
@@ -55,7 +53,6 @@
%prep
%setup -q -n scikit-dsp-comm-%{version}
-cp %{SOURCE10} .
%build
%python_build
++++++ scikit-dsp-comm-0.0.5.tar.gz -> scikit-dsp-comm-1.1.0.tar.gz ++++++
++++ 1609 lines of diff (skipped)
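The changelog entry above, "Use a ValueError to raise an error instead of exiting a users process", reflects a general library-design rule: library code should raise an exception rather than call `sys.exit()`, so the calling application decides how to handle the failure. A minimal sketch of the pattern (the function name, accepted rates, and message are illustrative, not taken from scikit-dsp-comm):

```python
import sys


def set_rate_exiting(rate):
    # Anti-pattern: a bad argument kills the caller's whole process.
    if rate not in ("1/2", "1/3"):
        print("unsupported rate")
        sys.exit(1)
    return rate


def set_rate(rate):
    # Library-friendly: raise, and let the application decide what to do.
    if rate not in ("1/2", "1/3"):
        raise ValueError("unsupported rate: %r" % (rate,))
    return rate


# The caller can now recover instead of dying:
try:
    set_rate("2/3")
except ValueError as err:
    print("recovered:", err)  # the process keeps running
```

The exiting variant is shown only for contrast; in test suites and interactive sessions, `sys.exit()` inside a library is particularly painful because it aborts the interpreter mid-run.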
Hello community,
here is the log from the commit of package python-infinity for openSUSE:Factory checked in at 2020-06-30 21:56:39
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/python-infinity (Old)
and /work/SRC/openSUSE:Factory/.python-infinity.new.3060 (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Package is "python-infinity"
Tue Jun 30 21:56:39 2020 rev:3 rq:817766 version:1.5
Changes:
--------
--- /work/SRC/openSUSE:Factory/python-infinity/python-infinity.changes 2019-01-24 14:14:03.571317813 +0100
+++ /work/SRC/openSUSE:Factory/.python-infinity.new.3060/python-infinity.changes 2020-06-30 21:56:45.058811923 +0200
@@ -1,0 +2,6 @@
+Sat Jun 27 02:40:34 UTC 2020 - Todd R <toddrme2178(a)gmail.com>
+
+- Update to 1.5
+ * Removed py27, py33 support
+
+-------------------------------------------------------------------
Old:
----
infinity-1.4.tar.gz
New:
----
infinity-1.5.tar.gz
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Other differences:
------------------
++++++ python-infinity.spec ++++++
--- /var/tmp/diff_new_pack.UeTAGw/_old 2020-06-30 21:56:45.854814386 +0200
+++ /var/tmp/diff_new_pack.UeTAGw/_new 2020-06-30 21:56:45.854814386 +0200
@@ -1,7 +1,7 @@
#
# spec file for package python-infinity
#
-# Copyright (c) 2018 SUSE LINUX GmbH, Nuernberg, Germany.
+# Copyright (c) 2020 SUSE LLC
#
# All modifications and additions to the file contributed by third parties
# remain the property of their copyright owners, unless otherwise agreed
@@ -19,7 +19,7 @@
%{?!python_module:%define python_module() python-%{**} python3-%{**}}
%bcond_without test
Name: python-infinity
-Version: 1.4
+Version: 1.5
Release: 0
Summary: All-in-one infinity value for Python
License: BSD-3-Clause
@@ -58,7 +58,6 @@
%endif
%files %{python_files}
-%defattr(-,root,root,-)
%doc CHANGES.rst README.rst
%license LICENSE
%{python_sitelib}/*
++++++ infinity-1.4.tar.gz -> infinity-1.5.tar.gz ++++++
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/infinity-1.4/CHANGES.rst new/infinity-1.5/CHANGES.rst
--- old/infinity-1.4/CHANGES.rst 2016-04-01 08:12:10.000000000 +0200
+++ new/infinity-1.5/CHANGES.rst 2020-06-03 12:27:35.000000000 +0200
@@ -4,6 +4,12 @@
Here you can see the full list of changes between each infinity release.
+1.5 (2020-06-03)
+----------------
+
+- Removed py27, py33 support
+
+
1.4 (2016-04-01)
----------------
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/infinity-1.4/PKG-INFO new/infinity-1.5/PKG-INFO
--- old/infinity-1.4/PKG-INFO 2016-04-01 08:14:49.000000000 +0200
+++ new/infinity-1.5/PKG-INFO 2020-06-03 12:33:09.000000000 +0200
@@ -1,6 +1,6 @@
-Metadata-Version: 1.1
+Metadata-Version: 2.1
Name: infinity
-Version: 1.4
+Version: 1.5
Summary: All-in-one infinity value for Python. Can be compared to any object.
Home-page: https://github.com/kvesteri/infinity
Author: Konsta Vesterinen
@@ -18,10 +18,6 @@
Classifier: License :: OSI Approved :: BSD License
Classifier: Operating System :: OS Independent
Classifier: Programming Language :: Python
-Classifier: Programming Language :: Python
-Classifier: Programming Language :: Python :: 2
-Classifier: Programming Language :: Python :: 2.6
-Classifier: Programming Language :: Python :: 2.7
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.3
Classifier: Programming Language :: Python :: 3.4
@@ -30,3 +26,4 @@
Classifier: Programming Language :: Python :: Implementation :: PyPy
Classifier: Topic :: Internet :: WWW/HTTP :: Dynamic Content
Classifier: Topic :: Software Development :: Libraries :: Python Modules
+Provides-Extra: test
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/infinity-1.4/infinity.egg-info/PKG-INFO new/infinity-1.5/infinity.egg-info/PKG-INFO
--- old/infinity-1.4/infinity.egg-info/PKG-INFO 2016-04-01 08:14:48.000000000 +0200
+++ new/infinity-1.5/infinity.egg-info/PKG-INFO 2020-06-03 12:33:08.000000000 +0200
@@ -1,6 +1,6 @@
-Metadata-Version: 1.1
+Metadata-Version: 2.1
Name: infinity
-Version: 1.4
+Version: 1.5
Summary: All-in-one infinity value for Python. Can be compared to any object.
Home-page: https://github.com/kvesteri/infinity
Author: Konsta Vesterinen
@@ -18,10 +18,6 @@
Classifier: License :: OSI Approved :: BSD License
Classifier: Operating System :: OS Independent
Classifier: Programming Language :: Python
-Classifier: Programming Language :: Python
-Classifier: Programming Language :: Python :: 2
-Classifier: Programming Language :: Python :: 2.6
-Classifier: Programming Language :: Python :: 2.7
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.3
Classifier: Programming Language :: Python :: 3.4
@@ -30,3 +26,4 @@
Classifier: Programming Language :: Python :: Implementation :: PyPy
Classifier: Topic :: Internet :: WWW/HTTP :: Dynamic Content
Classifier: Topic :: Software Development :: Libraries :: Python Modules
+Provides-Extra: test
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/infinity-1.4/infinity.py new/infinity-1.5/infinity.py
--- old/infinity-1.4/infinity.py 2016-04-01 08:13:12.000000000 +0200
+++ new/infinity-1.5/infinity.py 2020-06-03 12:30:10.000000000 +0200
@@ -1,11 +1,6 @@
-try:
- from functools import total_ordering
-except ImportError:
- # Use Python 2.6 port
- from total_ordering import total_ordering
+from functools import total_ordering
-
-__version__ = '1.4'
+__version__ = '1.5'
@total_ordering
@@ -91,7 +86,7 @@
__rfloordiv__ = __rdiv__
def __mul__(self, other):
- if other is 0:
+ if other == 0:
return NotImplemented
return Infinity(
other > 0 and self.positive or other < 0 and not self.positive
@@ -100,7 +95,7 @@
__rmul__ = __mul__
def __pow__(self, other):
- if other is 0:
+ if other == 0:
return NotImplemented
elif other == -self:
return -0.0 if not self.positive else 0.0
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/infinity-1.4/setup.cfg new/infinity-1.5/setup.cfg
--- old/infinity-1.4/setup.cfg 2016-04-01 08:14:49.000000000 +0200
+++ new/infinity-1.5/setup.cfg 2020-06-03 12:33:09.000000000 +0200
@@ -1,5 +1,4 @@
[egg_info]
tag_build =
tag_date = 0
-tag_svn_revision = 0
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/infinity-1.4/setup.py new/infinity-1.5/setup.py
--- old/infinity-1.4/setup.py 2016-01-28 17:34:45.000000000 +0100
+++ new/infinity-1.5/setup.py 2020-06-03 12:25:33.000000000 +0200
@@ -61,10 +61,6 @@
'License :: OSI Approved :: BSD License',
'Operating System :: OS Independent',
'Programming Language :: Python',
- 'Programming Language :: Python',
- 'Programming Language :: Python :: 2',
- 'Programming Language :: Python :: 2.6',
- 'Programming Language :: Python :: 2.7',
'Programming Language :: Python :: 3',
'Programming Language :: Python :: 3.3',
'Programming Language :: Python :: 3.4',
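The `other is 0` → `other == 0` hunks in the infinity diff above fix a real bug class: `is` tests object identity, while `==` tests value equality, and whether two equal numbers are the same object is a CPython implementation detail (CPython 3.8+ even emits a SyntaxWarning for `is` comparisons against literals). A minimal illustration, independent of the infinity package and assuming nothing beyond standard Python semantics:

```python
zero_int = 0
zero_float = 0.0

# Value equality holds across numeric types:
print(zero_float == zero_int)   # True: 0.0 equals 0 numerically

# Identity does not: a float object is never the int object 0,
# so `other is 0` silently misses float and Decimal zeros.
print(zero_float is zero_int)   # False
```

This is why the upstream fix matters for `Infinity.__mul__` and `__pow__`: with `is 0`, multiplying infinity by `0.0` would have skipped the `NotImplemented` branch entirely.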
Hello community,
here is the log from the commit of package python-google-cloud-storage for openSUSE:Factory checked in at 2020-06-30 21:56:37
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/python-google-cloud-storage (Old)
and /work/SRC/openSUSE:Factory/.python-google-cloud-storage.new.3060 (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Package is "python-google-cloud-storage"
Tue Jun 30 21:56:37 2020 rev:8 rq:817772 version:1.29.0
Changes:
--------
--- /work/SRC/openSUSE:Factory/python-google-cloud-storage/python-google-cloud-storage.changes 2020-06-19 17:24:07.088022186 +0200
+++ /work/SRC/openSUSE:Factory/.python-google-cloud-storage.new.3060/python-google-cloud-storage.changes 2020-06-30 21:56:43.930808432 +0200
@@ -1,0 +2,6 @@
+Mon Jun 29 15:55:00 UTC 2020 - Sean Marlow <sean.marlow(a)suse.com>
+
+- Update version requirement for mock package.
+ + The tests require at least version 3.0.0.
+
+-------------------------------------------------------------------
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Other differences:
------------------
++++++ python-google-cloud-storage.spec ++++++
--- /var/tmp/diff_new_pack.9kXuQg/_old 2020-06-30 21:56:44.682810759 +0200
+++ /var/tmp/diff_new_pack.9kXuQg/_new 2020-06-30 21:56:44.686810771 +0200
@@ -27,7 +27,7 @@
BuildRequires: %{python_module google-auth >= 1.11.0}
BuildRequires: %{python_module google-cloud-core >= 1.2.0}
BuildRequires: %{python_module google-resumable-media >= 0.5.0}
-BuildRequires: %{python_module mock}
+BuildRequires: %{python_module mock >= 3.0.0}
BuildRequires: %{python_module pytest}
BuildRequires: %{python_module setuptools}
BuildRequires: fdupes
Hello community,
here is the log from the commit of package python-torch for openSUSE:Factory checked in at 2020-06-30 21:56:34
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/python-torch (Old)
and /work/SRC/openSUSE:Factory/.python-torch.new.3060 (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Package is "python-torch"
Tue Jun 30 21:56:34 2020 rev:4 rq:817740 version:1.5.1
Changes:
--------
--- /work/SRC/openSUSE:Factory/python-torch/python-torch.changes 2020-05-01 11:15:56.756185758 +0200
+++ /work/SRC/openSUSE:Factory/.python-torch.new.3060/python-torch.changes 2020-06-30 21:56:38.490791599 +0200
@@ -1,0 +2,36 @@
+Tue Jun 23 15:28:57 UTC 2020 - Christian Goll <cgoll(a)suse.com>
+
+- updated to new stable release 1.5.1, which has the following changes:
+ This release includes several major new API additions and improvements. These
+ include new APIs for autograd allowing for easy computation of hessians and
+ jacobians, a significant update to the C++ frontend, ‘channels last’ memory
+ format for more performant computer vision models, a stable release of the
+ distributed RPC framework used for model parallel training, and a new API
+ that allows for the creation of Custom C++ Classes that was inspired by
+ PyBind. Additionally torch_xla 1.5 is now available and tested with the
+ PyTorch 1.5 release providing a mature Cloud TPU experience.
+ * see release.html for detailed information
+- added patches:
+ * fix-call-of-onnxInitGraph.patch for API mismatch in onnx
+ * fix-mov-operand-for-gcc.patch for aarch64 operands
+
+- removed sources:
+ * cpuinfo-89fe1695edf9ee14c22f815f24bac45577a4f135.tar.gz
+ * gloo-7c541247a6fa49e5938e304ab93b6da661823d0f.tar.gz
+ * onnx-fea8568cac61a482ed208748fdc0e1a8e47f62f5.tar.gz
+ * psimd-90a938f30ba414ada2f4b00674ee9631d7d85e19.tar.gz
+ * pthreadpool-13da0b4c21d17f94150713366420baaf1b5a46f4.tar.gz
+- added sources:
+ * cpuinfo-0e6bde92b343c5fbcfe34ecd41abf9515d54b4a7.tar.gz
+ * gloo-113bde13035594cafdca247be953610b53026553.tar.gz
+ * onnx-9fdae4c68960a2d44cd1cc871c74a6a9d469fa1f.tar.gz
+ * psimd-10b4ffc6ea9e2e11668f86969586f88bc82aaefa.tar.gz
+ * pthreadpool-d465747660ecf9ebbaddf8c3db37e4a13d0c9103.tar.gz
+
+-------------------------------------------------------------------
+Tue Jun 23 09:25:06 UTC 2020 - Christian Goll <cgoll(a)suse.com>
+
+- updated to bugfix release 1.4.1 and added _multibuild file so
+  that cuda versions can be built on the command line
+
+-------------------------------------------------------------------
Old:
----
cpuinfo-89fe1695edf9ee14c22f815f24bac45577a4f135.tar.gz
gloo-7c541247a6fa49e5938e304ab93b6da661823d0f.tar.gz
onnx-fea8568cac61a482ed208748fdc0e1a8e47f62f5.tar.gz
psimd-90a938f30ba414ada2f4b00674ee9631d7d85e19.tar.gz
pthreadpool-13da0b4c21d17f94150713366420baaf1b5a46f4.tar.gz
pytorch-1.4.0.tar.gz
New:
----
XNNPACK-7493bfb9d412e59529bcbced6a902d44cfa8ea1c.tar.gz
_multibuild
cpuinfo-0e6bde92b343c5fbcfe34ecd41abf9515d54b4a7.tar.gz
fix-call-of-onnxInitGraph.patch
fix-mov-operand-for-gcc.patch
gloo-113bde13035594cafdca247be953610b53026553.tar.gz
onnx-9fdae4c68960a2d44cd1cc871c74a6a9d469fa1f.tar.gz
psimd-10b4ffc6ea9e2e11668f86969586f88bc82aaefa.tar.gz
pthreadpool-d465747660ecf9ebbaddf8c3db37e4a13d0c9103.tar.gz
pytorch-1.5.1.tar.gz
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Other differences:
------------------
++++++ python-torch.spec ++++++
--- /var/tmp/diff_new_pack.a452ro/_old 2020-06-30 21:56:39.978796203 +0200
+++ /var/tmp/diff_new_pack.a452ro/_new 2020-06-30 21:56:39.982796216 +0200
@@ -21,8 +21,19 @@
%define skip_python2 1
%define pname torch
+%global flavor @BUILD_FLAVOR@%{nil}
+
+%if "%{flavor}" == "standard"
+%bcond_with cuda
+%endif
+
+%if "%{flavor}" == "cuda-10-2"
+%bcond_without cuda
+%define cudaver 10-2
+%endif
+
Name: python-torch
-Version: 1.4.0
+Version: 1.5.1
Release: 0
Summary: Deep learning framework aka pytorch/Caffe2
License: BSD-2-Clause AND BSD-3-Clause AND MIT AND Zlib AND BSL-1.0 AND Apache-2.0
@@ -31,21 +42,21 @@
Source0: https://github.com/pytorch/pytorch/archive/v%{version}.tar.gz#/%{srcname}-%…
Source1: releases.html
#License10: BSD-3-Clause
-Source10: https://github.com/facebookincubator/gloo/archive/7c541247a6fa49e5938e304ab…
+Source10: https://github.com/facebookincubator/gloo/archive/113bde13035594cafdca247be…
#License12: BSD-2-Clause
-Source12: https://github.com/pytorch/cpuinfo/archive/89fe1695edf9ee14c22f815f24bac455…
+Source12: https://github.com/pytorch/cpuinfo/archive/0e6bde92b343c5fbcfe34ecd41abf951…
#License13: BSL-1.0
Source13: https://github.com/zdevito/sleef/archive/7f523de651585fe25cade462efccca647d…
#License14: BSD-3-Clause
Source14: https://github.com/pybind/pybind11/archive/25abf7efba0b2990f5a6dfb0a31bc65c…
# License15: MIT
-Source15: https://github.com/onnx/onnx/archive/fea8568cac61a482ed208748fdc0e1a8e47f62…
+Source15: https://github.com/onnx/onnx/archive/9fdae4c68960a2d44cd1cc871c74a6a9d469fa…
#License16: BSD-2-Clause
-Source16: https://github.com/Maratyszcza/pthreadpool/archive/13da0b4c21d17f9415071336…
+Source16: https://github.com/Maratyszcza/pthreadpool/archive/d465747660ecf9ebbaddf8c3…
# License17: MIT
Source17: https://github.com/Maratyszcza/FXdiv/archive/b742d1143724d646cd0f914646f124…
# License18: MIT
-Source18: https://github.com/Maratyszcza/psimd/archive/90a938f30ba414ada2f4b00674ee96…
+Source18: https://github.com/Maratyszcza/psimd/archive/10b4ffc6ea9e2e11668f86969586f8…
# License19: MIT
Source19: https://github.com/Maratyszcza/FP16/archive/febbb1c163726b5db24bed55cc9dc42…
#License20: Apache-2.0
@@ -54,9 +65,13 @@
Source21: https://github.com/houseroad/foxi/archive/97fe555430a857581b9b826ecd955e4f0…
# License22: MIT
Source22: https://github.com/pytorch/QNNPACK/archive/7d2a4e9931a82adc3814275b6219a03e…
+# License: BSD-3-Clause
+Source23: https://github.com/google/XNNPACK/archive/7493bfb9d412e59529bcbced6a902d44c…
Patch0: removed-peachpy-depedency.patch
Patch1: skip-third-party-check.patch
+Patch2: fix-call-of-onnxInitGraph.patch
+Patch3: fix-mov-operand-for-gcc.patch
# A python call to cmake fails with a return code of 1 on this arch, disable it for now.
ExcludeArch: %ix86
@@ -96,6 +111,20 @@
BuildRequires: protobuf-c
BuildRequires: protobuf-devel
BuildRequires: snappy-devel
+%if %{with cuda}
+BuildRequires: cuda-compiler-%cudaver
+BuildRequires: cuda-cudart-dev-%cudaver
+BuildRequires: cuda-libraries-dev-%cudaver
+BuildRequires: cuda-misc-headers-%cudaver
+BuildRequires: cuda-nsight-%cudaver
+BuildRequires: cuda-toolkit-%cudaver
+%if 0%{?suse_version} > 1500
+BuildRequires: gcc7
+BuildRequires: gcc7-c++
+%endif
+BuildRequires: libcudnn7-devel
+BuildRequires: libnccl-devel
+%endif
BuildRoot: %{_tmppath}/%{name}-%{version}-build
Requires: python-future
Requires: python-leveldb
@@ -106,6 +135,10 @@
Provides: python-caffe2 = %version
Provides: python-pytorch = %version
+%if "%flavor" == ""
+ExclusiveArch: do_not_build
+%endif
+
%python_subpackages
%description
@@ -176,39 +209,42 @@
%make_depend_src %{SOURCE20} gemmlowp/gemmlowp
%make_depend_src %{SOURCE21}
%make_depend_src %{SOURCE22}
-# link system eigen to right place
-rmdir eigen
-ln -s /usr/include/eigen3 eigen
-cd ..
+%make_depend_src %{SOURCE23}
%build
-#export CC=gcc-7
-#export CXX=g++-7
-export USE_NNPACK=0
-export USE_CUDNN=0
-export USE_TEST=0
-export USE_LEVELDB=ON
-export USE_LMDB=ON
-export USE_FBGEMM=0
-export USE_SYSTEM_LIB="tbb,fbgemm,fbgemm/third_party/asmjit,onnx/third_party/benchmark"
-export BUILD_CUSTOM_PROTOBUF=OFF
-export BUILD_TEST=0
+%define buildvars \
+ export USE_NNPACK=OFF \
+ %if %{with cuda} \
+ export USE_CUDNN=ON \
+ export USE_SYSTEM_NCCL=ON \
+ export PATH="/usr/local/cuda-10.1/bin:$PATH" \
+ export CPLUS_INCLUDE_PATH="/usr/local/cuda-10.1/include" \
+ export C_INCLUDE_PATH="/usr/local/cuda-10.1/include" \
+ export LD_LIBRARY_PATH="/usr/local/cuda-10.1/lib" \
+ export NCCL_INCLUDE_DIR="/usr/include/" \
+ %if 0%{?suse_version} > 1500 \
+ export CC=gcc-7 \
+ export CXX=g++-7 \
+ %endif \
+ %else \
+ export USE_CUDNN=OFF \
+ %endif \
+ export USE_TEST=OFF \
+ export USE_LEVELDB=ON \
+ export USE_LMDB=ON \
+ export USE_FBGEMM=OFF \
+ export USE_SYSTEM_LIB="tbb,fbgemm,fbgemm/third_party/asmjit,onnx/third_party/benchmark" \
+ export USE_SYSTEM_EIGEN_INSTALL=ON \
+ export BUILD_CUSTOM_PROTOBUF=OFF \
+ export BUILD_TEST=OFF \
+ export MAX_JOBS=%{?jobs} \
+
+%buildvars
%limit_build -m 2000
-export MAX_JOBS=%{?jobs}
%python_build
%install
-export USE_NNPACK=0
-export USE_CUDNN=0
-export USE_TEST=0
-export USE_LEVELDB=ON
-export USE_LMDB=ON
-export USE_FBGEMM=0
-export USE_SYSTEM_LIB="tbb,fbgemm,fbgemm/third_party/asmjit,onnx/third_party/benchmark"
-export BUILD_CUSTOM_PROTOBUF=OFF
-export BUILD_TEST=1
-%limit_build -m 2000
-export MAX_JOBS=%{?jobs}
+%buildvars
%python_install
%python_expand %fdupes %{buildroot}%{$python_sitearch}
++++++ _multibuild ++++++
<multibuild>
<package>standard</package>
</multibuild>
++++++ cpuinfo-89fe1695edf9ee14c22f815f24bac45577a4f135.tar.gz -> cpuinfo-0e6bde92b343c5fbcfe34ecd41abf9515d54b4a7.tar.gz ++++++
++++ 2375 lines of diff (skipped)
++++++ fix-call-of-onnxInitGraph.patch ++++++
From 872d5e67e06e8fbde32d31dd91e07fc137677d9d Mon Sep 17 00:00:00 2001
From: Christian Goll <cgoll(a)suse.de>
Date: Tue, 23 Jun 2020 16:55:25 +0200
Subject: [PATCH] fix call of onnxInitGraph() Removed max_seq_size_ as
onnxInitGraph() does not need this argument any more
---
caffe2/opt/onnxifi_op.h | 1 -
1 file changed, 1 deletion(-)
diff --git a/caffe2/opt/onnxifi_op.h b/caffe2/opt/onnxifi_op.h
index c45ad7c9f8..b5df81ef03 100644
--- a/caffe2/opt/onnxifi_op.h
+++ b/caffe2/opt/onnxifi_op.h
@@ -256,7 +256,6 @@ class OnnxifiOp final : public Operator<Context> {
weight_descs.size(),
weight_descs.data(),
&graph,
- static_cast<uint32_t>(max_seq_size_),
defered_blob_reader),
ONNXIFI_STATUS_SUCCESS);
--
2.25.0
++++++ fix-mov-operand-for-gcc.patch ++++++
From 5c318611978a7f9add5b889ad70e4af5b10a9c00 Mon Sep 17 00:00:00 2001
From: Edward Swarthout <Ed.Swarthout(a)nxp.com>
Date: Sat, 8 Feb 2020 12:53:07 -0600
Subject: [PATCH] QNNPACK: q8gemm/8x8-dq-aarch64-neon.S fix mov operand for gcc
Unlike clang, GNU assembler does not support 4s on neon mov, so use 16b.
Fixes:
8x8-dq-aarch64-neon.S: Assembler messages:
8x8-dq-aarch64-neon.S:657:
Error: operand mismatch -- `mov V8.4s,V9.4s'
Info: did you mean this?
Info: mov v8.8b, v9.8b
Info: other valid variant(s):
Info: mov v8.16b, v9.16b
Signed-off-by: Edward Swarthout <Ed.Swarthout(a)nxp.com>
---
.../cpu/qnnpack/src/q8gemm/8x8-dq-aarch64-neon.S | 16 ++++++++--------
1 file changed, 8 insertions(+), 8 deletions(-)
diff --git a/aten/src/ATen/native/quantized/cpu/qnnpack/src/q8gemm/8x8-dq-aarch64-neon.S b/aten/src/ATen/native/quantized/cpu/qnnpack/src/q8gemm/8x8-dq-aarch64-neon.S
index 7dc861110186a..60ad8d1d4b340 100644
--- a/aten/src/ATen/native/quantized/cpu/qnnpack/src/q8gemm/8x8-dq-aarch64-neon.S
+++ b/aten/src/ATen/native/quantized/cpu/qnnpack/src/q8gemm/8x8-dq-aarch64-neon.S
@@ -659,14 +659,14 @@ BEGIN_FUNCTION pytorch_q8gemm_dq_ukernel_8x8__aarch64_neon
SUB x1, x1, 4
- MOV V8.4s, V9.4s
- MOV v10.4s, v11.4s
- MOV v12.4s, V13.4s
- MOV V14.4s, V15.4s
- MOV V16.4s, V17.4s
- MOV V18.4s, V19.4s
- MOV V20.4s, V21.4s
- MOV V22.4s, V23.4s
+ MOV V8.16b, V9.16b
+ MOV v10.16b, v11.16b
+ MOV v12.16b, V13.16b
+ MOV V14.16b, V15.16b
+ MOV V16.16b, V17.16b
+ MOV V18.16b, V19.16b
+ MOV V20.16b, V21.16b
+ MOV V22.16b, V23.16b
5:
CMP x1, 2
++++++ gloo-7c541247a6fa49e5938e304ab93b6da661823d0f.tar.gz -> gloo-113bde13035594cafdca247be953610b53026553.tar.gz ++++++
++++ 2209 lines of diff (skipped)
++++++ onnx-fea8568cac61a482ed208748fdc0e1a8e47f62f5.tar.gz -> onnx-9fdae4c68960a2d44cd1cc871c74a6a9d469fa1f.tar.gz ++++++
++++ 9774 lines of diff (skipped)
++++++ psimd-90a938f30ba414ada2f4b00674ee9631d7d85e19.tar.gz -> psimd-10b4ffc6ea9e2e11668f86969586f88bc82aaefa.tar.gz ++++++
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/psimd-90a938f30ba414ada2f4b00674ee9631d7d85e19/LICENSE new/psimd-10b4ffc6ea9e2e11668f86969586f88bc82aaefa/LICENSE
--- old/psimd-90a938f30ba414ada2f4b00674ee9631d7d85e19/LICENSE 2018-09-06 18:11:46.000000000 +0200
+++ new/psimd-10b4ffc6ea9e2e11668f86969586f88bc82aaefa/LICENSE 2019-12-26 20:22:39.000000000 +0100
@@ -2,6 +2,7 @@
Copyright (c) 2017 Facebook Inc.
Copyright (c) 2014-2017 Georgia Institute of Technology
+Copyright 2019 Google LLC
Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/psimd-90a938f30ba414ada2f4b00674ee9631d7d85e19/include/psimd.h new/psimd-10b4ffc6ea9e2e11668f86969586f88bc82aaefa/include/psimd.h
--- old/psimd-90a938f30ba414ada2f4b00674ee9631d7d85e19/include/psimd.h 2018-09-06 18:11:46.000000000 +0200
+++ new/psimd-10b4ffc6ea9e2e11668f86969586f88bc82aaefa/include/psimd.h 2019-12-26 20:22:39.000000000 +0100
@@ -295,20 +295,84 @@
return *((const psimd_f32*) address);
}
+ PSIMD_INTRINSIC psimd_s8 psimd_load_splat_s8(const void* address) {
+ return psimd_splat_s8(*((const int8_t*) address));
+ }
+
+ PSIMD_INTRINSIC psimd_u8 psimd_load_splat_u8(const void* address) {
+ return psimd_splat_u8(*((const uint8_t*) address));
+ }
+
+ PSIMD_INTRINSIC psimd_s16 psimd_load_splat_s16(const void* address) {
+ return psimd_splat_s16(*((const int16_t*) address));
+ }
+
+ PSIMD_INTRINSIC psimd_u16 psimd_load_splat_u16(const void* address) {
+ return psimd_splat_u16(*((const uint16_t*) address));
+ }
+
+ PSIMD_INTRINSIC psimd_s32 psimd_load_splat_s32(const void* address) {
+ return psimd_splat_s32(*((const int32_t*) address));
+ }
+
+ PSIMD_INTRINSIC psimd_u32 psimd_load_splat_u32(const void* address) {
+ return psimd_splat_u32(*((const uint32_t*) address));
+ }
+
+ PSIMD_INTRINSIC psimd_f32 psimd_load_splat_f32(const void* address) {
+ return psimd_splat_f32(*((const float*) address));
+ }
+
+ PSIMD_INTRINSIC psimd_s32 psimd_load1_s32(const void* address) {
+ return (psimd_s32) { *((const int32_t*) address), 0, 0, 0 };
+ }
+
+ PSIMD_INTRINSIC psimd_u32 psimd_load1_u32(const void* address) {
+ return (psimd_u32) { *((const uint32_t*) address), 0, 0, 0 };
+ }
+
PSIMD_INTRINSIC psimd_f32 psimd_load1_f32(const void* address) {
return (psimd_f32) { *((const float*) address), 0.0f, 0.0f, 0.0f };
}
+ PSIMD_INTRINSIC psimd_s32 psimd_load2_s32(const void* address) {
+ const int32_t* address_s32 = (const int32_t*) address;
+ return (psimd_s32) { address_s32[0], address_s32[1], 0, 0 };
+ }
+
+ PSIMD_INTRINSIC psimd_u32 psimd_load2_u32(const void* address) {
+ const uint32_t* address_u32 = (const uint32_t*) address;
+ return (psimd_u32) { address_u32[0], address_u32[1], 0, 0 };
+ }
+
PSIMD_INTRINSIC psimd_f32 psimd_load2_f32(const void* address) {
const float* address_f32 = (const float*) address;
return (psimd_f32) { address_f32[0], address_f32[1], 0.0f, 0.0f };
}
+ PSIMD_INTRINSIC psimd_s32 psimd_load3_s32(const void* address) {
+ const int32_t* address_s32 = (const int32_t*) address;
+ return (psimd_s32) { address_s32[0], address_s32[1], address_s32[2], 0 };
+ }
+
+ PSIMD_INTRINSIC psimd_u32 psimd_load3_u32(const void* address) {
+ const uint32_t* address_u32 = (const uint32_t*) address;
+ return (psimd_u32) { address_u32[0], address_u32[1], address_u32[2], 0 };
+ }
+
PSIMD_INTRINSIC psimd_f32 psimd_load3_f32(const void* address) {
const float* address_f32 = (const float*) address;
return (psimd_f32) { address_f32[0], address_f32[1], address_f32[2], 0.0f };
}
+ PSIMD_INTRINSIC psimd_s32 psimd_load4_s32(const void* address) {
+ return psimd_load_s32(address);
+ }
+
+ PSIMD_INTRINSIC psimd_u32 psimd_load4_u32(const void* address) {
+ return psimd_load_u32(address);
+ }
+
PSIMD_INTRINSIC psimd_f32 psimd_load4_f32(const void* address) {
return psimd_load_f32(address);
}
@@ -403,16 +467,50 @@
*((psimd_f32*) address) = value;
}
+ PSIMD_INTRINSIC void psimd_store1_s32(void* address, psimd_s32 value) {
+ *((int32_t*) address) = value[0];
+ }
+
+ PSIMD_INTRINSIC void psimd_store1_u32(void* address, psimd_u32 value) {
+ *((uint32_t*) address) = value[0];
+ }
+
PSIMD_INTRINSIC void psimd_store1_f32(void* address, psimd_f32 value) {
*((float*) address) = value[0];
}
+ PSIMD_INTRINSIC void psimd_store2_s32(void* address, psimd_s32 value) {
+ int32_t* address_s32 = (int32_t*) address;
+ address_s32[0] = value[0];
+ address_s32[1] = value[1];
+ }
+
+ PSIMD_INTRINSIC void psimd_store2_u32(void* address, psimd_u32 value) {
+ uint32_t* address_u32 = (uint32_t*) address;
+ address_u32[0] = value[0];
+ address_u32[1] = value[1];
+ }
+
PSIMD_INTRINSIC void psimd_store2_f32(void* address, psimd_f32 value) {
float* address_f32 = (float*) address;
address_f32[0] = value[0];
address_f32[1] = value[1];
}
+ PSIMD_INTRINSIC void psimd_store3_s32(void* address, psimd_s32 value) {
+ int32_t* address_s32 = (int32_t*) address;
+ address_s32[0] = value[0];
+ address_s32[1] = value[1];
+ address_s32[2] = value[2];
+ }
+
+ PSIMD_INTRINSIC void psimd_store3_u32(void* address, psimd_u32 value) {
+ uint32_t* address_u32 = (uint32_t*) address;
+ address_u32[0] = value[0];
+ address_u32[1] = value[1];
+ address_u32[2] = value[2];
+ }
+
PSIMD_INTRINSIC void psimd_store3_f32(void* address, psimd_f32 value) {
float* address_f32 = (float*) address;
address_f32[0] = value[0];
@@ -420,6 +518,14 @@
address_f32[2] = value[2];
}
+ PSIMD_INTRINSIC void psimd_store4_s32(void* address, psimd_s32 value) {
+ psimd_store_s32(address, value);
+ }
+
+ PSIMD_INTRINSIC void psimd_store4_u32(void* address, psimd_u32 value) {
+ psimd_store_u32(address, value);
+ }
+
PSIMD_INTRINSIC void psimd_store4_f32(void* address, psimd_f32 value) {
psimd_store_f32(address, value);
}
@@ -553,65 +659,103 @@
#endif
}
+ /* Quasi-Fused Multiply-Add */
+ PSIMD_INTRINSIC psimd_f32 psimd_qfma_f32(psimd_f32 a, psimd_f32 b, psimd_f32 c) {
+ #if defined(__aarch64__) || defined(__ARM_NEON__) && defined(__ARM_FEATURE_FMA)
+ return (psimd_f32) vfmaq_f32((float32x4_t) a, (float32x4_t) b, (float32x4_t) c);
+ #elif (defined(__x86_64__) || defined(__i386__) || defined(__i686__)) && defined(__FMA__)
+ return (psimd_f32) _mm_fmadd_ps((__m128) c, (__m128) a, (__m128) b);
+ #elif (defined(__x86_64__) || defined(__i386__) || defined(__i686__)) && defined(__FMA4__)
+ return (psimd_f32) _mm_macc_ps((__m128) c, (__m128) a, (__m128) b);
+ #elif defined(__wasm__) && defined(__wasm_simd128__) && defined(__clang__)
+ return (psimd_f32) __builtin_wasm_qfma_f32x4(a, b, c);
+ #else
+ return a + b * c;
+ #endif
+ }
+
+ PSIMD_INTRINSIC psimd_f32 psimd_div_f32(psimd_f32 a, psimd_f32 b) {
+ return a / b;
+ }
+
/* Vector and */
PSIMD_INTRINSIC psimd_f32 psimd_andmask_f32(psimd_s32 mask, psimd_f32 v) {
return (psimd_f32) (mask & (psimd_s32) v);
}
+ /* Vector and-not */
+ PSIMD_INTRINSIC psimd_f32 psimd_andnotmask_f32(psimd_s32 mask, psimd_f32 v) {
+ return (psimd_f32) (~mask & (psimd_s32) v);
+ }
+
/* Vector blend */
PSIMD_INTRINSIC psimd_s8 psimd_blend_s8(psimd_s8 mask, psimd_s8 a, psimd_s8 b) {
#if defined(__ARM_NEON__) || defined(__ARM_NEON)
return (psimd_s8) vbslq_s8((uint8x16_t) mask, (int8x16_t) a, (int8x16_t) b);
+ #elif defined(__wasm__) && defined(__wasm_simd128__) && defined(__clang__)
+ return (psimd_s8) __builtin_wasm_bitselect(a, b, mask);
#else
return (mask & a) | (~mask & b);
#endif
}
- PSIMD_INTRINSIC psimd_u8 psimd_blend_u8(psimd_u8 mask, psimd_u8 a, psimd_u8 b) {
+ PSIMD_INTRINSIC psimd_u8 psimd_blend_u8(psimd_s8 mask, psimd_u8 a, psimd_u8 b) {
#if defined(__ARM_NEON__) || defined(__ARM_NEON)
return (psimd_u8) vbslq_u8((uint8x16_t) mask, (uint8x16_t) a, (uint8x16_t) b);
+ #elif defined(__wasm__) && defined(__wasm_simd128__) && defined(__clang__)
+ return (psimd_u8) __builtin_wasm_bitselect(a, b, mask);
#else
- return (mask & a) | (~mask & b);
+ return (psimd_u8) ((mask & (psimd_s8) a) | (~mask & (psimd_s8) b));
#endif
}
PSIMD_INTRINSIC psimd_s16 psimd_blend_s16(psimd_s16 mask, psimd_s16 a, psimd_s16 b) {
#if defined(__ARM_NEON__) || defined(__ARM_NEON)
return (psimd_s16) vbslq_s16((uint16x8_t) mask, (int16x8_t) a, (int16x8_t) b);
+ #elif defined(__wasm__) && defined(__wasm_simd128__) && defined(__clang__)
+ return (psimd_s16) __builtin_wasm_bitselect(a, b, mask);
#else
return (mask & a) | (~mask & b);
#endif
}
- PSIMD_INTRINSIC psimd_u16 psimd_blend_u16(psimd_u16 mask, psimd_u16 a, psimd_u16 b) {
+ PSIMD_INTRINSIC psimd_u16 psimd_blend_u16(psimd_s16 mask, psimd_u16 a, psimd_u16 b) {
#if defined(__ARM_NEON__) || defined(__ARM_NEON)
return (psimd_u16) vbslq_u16((uint16x8_t) mask, (uint16x8_t) a, (uint16x8_t) b);
+ #elif defined(__wasm__) && defined(__wasm_simd128__) && defined(__clang__)
+ return (psimd_u16) __builtin_wasm_bitselect(a, b, mask);
#else
- return (mask & a) | (~mask & b);
+ return (psimd_u16) ((mask & (psimd_s16) a) | (~mask & (psimd_s16) b));
#endif
}
PSIMD_INTRINSIC psimd_s32 psimd_blend_s32(psimd_s32 mask, psimd_s32 a, psimd_s32 b) {
#if defined(__ARM_NEON__) || defined(__ARM_NEON)
return (psimd_s32) vbslq_s32((uint32x4_t) mask, (int32x4_t) a, (int32x4_t) b);
+ #elif defined(__wasm__) && defined(__wasm_simd128__) && defined(__clang__)
+ return (psimd_s32) __builtin_wasm_bitselect(a, b, mask);
#else
return (mask & a) | (~mask & b);
#endif
}
- PSIMD_INTRINSIC psimd_u32 psimd_blend_u32(psimd_u32 mask, psimd_u32 a, psimd_u32 b) {
+ PSIMD_INTRINSIC psimd_u32 psimd_blend_u32(psimd_s32 mask, psimd_u32 a, psimd_u32 b) {
#if defined(__ARM_NEON__) || defined(__ARM_NEON)
return (psimd_u32) vbslq_u32((uint32x4_t) mask, (uint32x4_t) a, (uint32x4_t) b);
+ #elif defined(__wasm__) && defined(__wasm_simd128__) && defined(__clang__)
+ return (psimd_u32) __builtin_wasm_bitselect(a, b, mask);
#else
- return (mask & a) | (~mask & b);
+ return (psimd_u32) ((mask & (psimd_s32) a) | (~mask & (psimd_s32) b));
#endif
}
PSIMD_INTRINSIC psimd_f32 psimd_blend_f32(psimd_s32 mask, psimd_f32 a, psimd_f32 b) {
#if defined(__ARM_NEON__) || defined(__ARM_NEON)
return (psimd_f32) vbslq_f32((uint32x4_t) mask, (float32x4_t) a, (float32x4_t) b);
+ #elif defined(__wasm__) && defined(__wasm_simd128__) && defined(__clang__)
+ return (psimd_f32) __builtin_wasm_bitselect(a, b, mask);
#else
- return (psimd_f32) psimd_blend_s32(mask, (psimd_s32) a, (psimd_s32) b);
+ return (psimd_f32) ((mask & (psimd_s32) a) | (~mask & (psimd_s32) b));
#endif
}
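The blend hunks above all implement the same bit-select idiom that NEON's `vbslq_*` and clang's `__builtin_wasm_bitselect` provide natively: each mask bit chooses between the corresponding bits of `a` and `b`. A minimal scalar C sketch of that fallback (the function name is hypothetical, not part of psimd):

```c
#include <stdint.h>

/* Scalar model of the generic fallback (mask & a) | (~mask & b):
   wherever a mask bit is 1 the result takes a's bit, elsewhere b's.
   psimd applies this per lane across a 128-bit vector. */
static uint8_t blend_u8(uint8_t mask, uint8_t a, uint8_t b) {
    return (uint8_t) ((mask & a) | (~mask & b));
}
```

An all-ones mask (0xFF) selects `a` outright, an all-zeros mask selects `b`, and a partial mask such as 0xF0 mixes the high nibble of `a` with the low nibble of `b`. The casts added in the patch exist because the mask parameters are now signed vector types while `a` and `b` stay unsigned, and bitwise operators on GCC-style vector types require matching element types.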
@@ -621,7 +765,7 @@
}
PSIMD_INTRINSIC psimd_u8 psimd_signblend_u8(psimd_s8 x, psimd_u8 a, psimd_u8 b) {
- return psimd_blend_u8((psimd_u8) (x >> psimd_splat_s8(7)), a, b);
+ return psimd_blend_u8((x >> psimd_splat_s8(7)), a, b);
}
PSIMD_INTRINSIC psimd_s16 psimd_signblend_s16(psimd_s16 x, psimd_s16 a, psimd_s16 b) {
@@ -629,7 +773,7 @@
}
PSIMD_INTRINSIC psimd_u16 psimd_signblend_u16(psimd_s16 x, psimd_u16 a, psimd_u16 b) {
- return psimd_blend_u16((psimd_u16) (x >> psimd_splat_s16(15)), a, b);
+ return psimd_blend_u16((x >> psimd_splat_s16(15)), a, b);
}
PSIMD_INTRINSIC psimd_s32 psimd_signblend_s32(psimd_s32 x, psimd_s32 a, psimd_s32 b) {
@@ -637,7 +781,7 @@
}
PSIMD_INTRINSIC psimd_u32 psimd_signblend_u32(psimd_s32 x, psimd_u32 a, psimd_u32 b) {
- return psimd_blend_u32((psimd_u32) (x >> psimd_splat_s32(31)), a, b);
+ return psimd_blend_u32((x >> psimd_splat_s32(31)), a, b);
}
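The signblend hunks drop the `(psimd_uN)` casts on `x >> psimd_splat_*()` because the blend helpers now take signed masks directly. The shift itself is what builds the mask: it smears the sign bit across the whole lane. A scalar sketch (assuming arithmetic right shift of negative values, which the compilers psimd targets all provide):

```c
#include <stdint.h>

/* For an int8_t lane, x >> 7 yields 0 for non-negative x and -1
   (all ones, 0xFF) for negative x, turning the sign bit into a
   full-width blend mask with no branch. */
static uint8_t signblend_u8(int8_t x, uint8_t a, uint8_t b) {
    uint8_t mask = (uint8_t) (x >> 7);   /* 0x00 or 0xFF */
    return (uint8_t) ((mask & a) | (~mask & b));
}
```

So negative `x` selects `a` and non-negative `x` selects `b`, lane by lane.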
PSIMD_INTRINSIC psimd_f32 psimd_signblend_f32(psimd_f32 x, psimd_f32 a, psimd_f32 b) {
@@ -648,7 +792,7 @@
/* Vector absolute value */
PSIMD_INTRINSIC psimd_f32 psimd_abs_f32(psimd_f32 v) {
const psimd_s32 mask = (psimd_s32) psimd_splat_f32(-0.0f);
- return (psimd_f32) ((psimd_s32) v & mask);
+ return (psimd_f32) ((psimd_s32) v & ~mask);
}
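The `psimd_abs_f32` hunk fixes a genuine bug: with `mask` holding the bits of `-0.0f` (just the IEEE 754 sign bit), `v & mask` keeps *only* the sign bit, while the corrected `v & ~mask` clears the sign and keeps the magnitude. A scalar model of the corrected code (hypothetical helper; `memcpy` stands in for the vector bit-casts):

```c
#include <stdint.h>
#include <string.h>

/* -0.0f is 0x80000000: the sign bit alone.  ANDing with its
   complement clears the sign and leaves all magnitude bits intact,
   which is exactly |v| for IEEE 754 floats. */
static float abs_f32_scalar(float v) {
    const float neg_zero = -0.0f;
    uint32_t bits, mask;
    memcpy(&bits, &v, sizeof bits);
    memcpy(&mask, &neg_zero, sizeof mask);
    bits &= ~mask;                       /* clear sign bit only */
    float r;
    memcpy(&r, &bits, sizeof r);
    return r;
}
```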
/* Vector negation */
@@ -709,6 +853,8 @@
PSIMD_INTRINSIC psimd_f32 psimd_max_f32(psimd_f32 a, psimd_f32 b) {
#if defined(__ARM_NEON__) || defined(__ARM_NEON)
return (psimd_f32) vmaxq_f32((float32x4_t) a, (float32x4_t) b);
+ #elif defined(__wasm__) && defined(__wasm_simd128__) && defined(__clang__)
+ return __builtin_wasm_max_f32x4(a, b);
#else
return psimd_blend_f32(a > b, a, b);
#endif
@@ -766,6 +912,8 @@
PSIMD_INTRINSIC psimd_f32 psimd_min_f32(psimd_f32 a, psimd_f32 b) {
#if defined(__ARM_NEON__) || defined(__ARM_NEON)
return (psimd_f32) vminq_f32((float32x4_t) a, (float32x4_t) b);
+ #elif defined(__wasm__) && defined(__wasm_simd128__) && defined(__clang__)
+ return __builtin_wasm_min_f32x4(a, b);
#else
return psimd_blend_f32(a < b, a, b);
#endif
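The min/max hunks add wasm builtins but keep the compare-and-blend fallback. A scalar sketch of that fallback's semantics, including the NaN caveat it carries (function names hypothetical):

```c
#include <math.h>

/* Scalar model of the fallback paths: psimd_max_f32 is
   psimd_blend_f32(a > b, a, b) and psimd_min_f32 is the mirror image.
   If either operand is NaN the comparison is false and b is returned,
   which need not match what NEON vmaxq_f32/vminq_f32 or the wasm
   builtins produce for NaN inputs. */
static float max_f32_scalar(float a, float b) { return (a > b) ? a : b; }
static float min_f32_scalar(float a, float b) { return (a < b) ? a : b; }
```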
++++++ pthreadpool-13da0b4c21d17f94150713366420baaf1b5a46f4.tar.gz -> pthreadpool-d465747660ecf9ebbaddf8c3db37e4a13d0c9103.tar.gz ++++++
++++ 5728 lines of diff (skipped)
++++++ pytorch-1.4.0.tar.gz -> pytorch-1.5.1.tar.gz ++++++
/work/SRC/openSUSE:Factory/python-torch/pytorch-1.4.0.tar.gz /work/SRC/openSUSE:Factory/.python-torch.new.3060/pytorch-1.5.1.tar.gz differ: char 13, line 1
++++++ releases.html ++++++
++++ 4843 lines (skipped)
++++ between /work/SRC/openSUSE:Factory/python-torch/releases.html
++++ and /work/SRC/openSUSE:Factory/.python-torch.new.3060/releases.html
Hello community,
here is the log from the commit of package gsequencer for openSUSE:Factory checked in at 2020-06-30 21:56:28
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/gsequencer (Old)
and /work/SRC/openSUSE:Factory/.gsequencer.new.3060 (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Package is "gsequencer"
Tue Jun 30 21:56:28 2020 rev:72 rq:817564 version:3.4.9
Changes:
--------
--- /work/SRC/openSUSE:Factory/gsequencer/gsequencer.changes 2020-06-25 15:10:11.349885114 +0200
+++ /work/SRC/openSUSE:Factory/.gsequencer.new.3060/gsequencer.changes 2020-06-30 21:56:34.062777897 +0200
@@ -1,0 +2,6 @@
+Sun Jun 28 04:28:20 UTC 2020 - Joël Krähemann <jkraehemann(a)gmail.com>
+
+- new upstream v3.4.9 compute phase using fmod() in
+ ags_filter_util.c
+
+-------------------------------------------------------------------
Old:
----
gsequencer-3.4.6.tar.gz
New:
----
gsequencer-3.4.9.tar.gz
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Other differences:
------------------
++++++ gsequencer.spec ++++++
--- /var/tmp/diff_new_pack.FSOHH7/_old 2020-06-30 21:56:34.966780694 +0200
+++ /var/tmp/diff_new_pack.FSOHH7/_new 2020-06-30 21:56:34.970780706 +0200
@@ -22,7 +22,7 @@
# activated with --with run_functional_tests command line switch.
%bcond_with run_functional_tests
Name: gsequencer
-Version: 3.4.6
+Version: 3.4.9
Release: 0
Summary: Audio processing engine
License: GPL-3.0-or-later AND AGPL-3.0-or-later AND GFDL-1.3-only
++++++ gsequencer-3.4.6.tar.gz -> gsequencer-3.4.9.tar.gz ++++++
/work/SRC/openSUSE:Factory/gsequencer/gsequencer-3.4.6.tar.gz /work/SRC/openSUSE:Factory/.gsequencer.new.3060/gsequencer-3.4.9.tar.gz differ: char 115, line 1
++++++ gsequencer.1-improved-glib-compatibility.patch ++++++
--- /var/tmp/diff_new_pack.FSOHH7/_old 2020-06-30 21:56:35.010780830 +0200
+++ /var/tmp/diff_new_pack.FSOHH7/_new 2020-06-30 21:56:35.014780843 +0200
@@ -1,6 +1,6 @@
---- configure.ac.orig 2020-06-07 00:44:49.780581237 +0200
-+++ configure.ac 2020-06-07 00:46:14.617504085 +0200
-@@ -340,23 +340,23 @@
+--- configure.ac.orig 2020-06-28 06:45:52.219135439 +0200
++++ configure.ac 2020-06-28 06:46:54.720493269 +0200
+@@ -340,11 +340,11 @@
AC_SUBST(FFTW_CFLAGS)
AC_SUBST(FFTW_LIBS)
@@ -14,8 +14,9 @@
AC_SUBST(GOBJECT_CFLAGS)
AC_SUBST(GOBJECT_LIBS)
- AC_DEFINE([HAVE_GLIB_2_6], [1], [GLib 2.6 available])
+@@ -352,12 +352,12 @@
AC_DEFINE([HAVE_GLIB_2_44], [1], [GLib 2.44 available])
+ AC_DEFINE([HAVE_GLIB_2_52], [1], [GLib 2.52 available])
AC_DEFINE([HAVE_GLIB_2_54], [1], [GLib 2.54 available])
-AC_DEFINE([HAVE_GLIB_2_56], [1], [GLib 2.56 available])
+AC_DEFINE([HAVE_GLIB_2_56], [0], [GLib 2.56 available])
Hello community,
here is the log from the commit of package monitoring-plugins-mailstat for openSUSE:Factory checked in at 2020-06-30 21:56:26
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/monitoring-plugins-mailstat (Old)
and /work/SRC/openSUSE:Factory/.monitoring-plugins-mailstat.new.3060 (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Package is "monitoring-plugins-mailstat"
Tue Jun 30 21:56:26 2020 rev:2 rq:816225 version:0.9.1
Changes:
--------
--- /work/SRC/openSUSE:Factory/monitoring-plugins-mailstat/monitoring-plugins-mailstat.changes 2016-02-11 12:36:37.000000000 +0100
+++ /work/SRC/openSUSE:Factory/.monitoring-plugins-mailstat.new.3060/monitoring-plugins-mailstat.changes 2020-06-30 21:56:32.002771523 +0200
@@ -1,0 +2,8 @@
+Sun Jun 21 09:51:12 UTC 2020 - lars(a)linux-schulserver.de - 0.9.1
+
+- require perl(RRDs) for the images and mailgraph for the stats
+- add an example check_mailstat.cfg NRPE definition
+- added gpl-3.0.txt license file
+- ran spec-cleaner
+
+-------------------------------------------------------------------
New:
----
check_mailstat.cfg
gpl-3.0.txt
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Other differences:
------------------
++++++ monitoring-plugins-mailstat.spec ++++++
--- /var/tmp/diff_new_pack.fbo3X4/_old 2020-06-30 21:56:32.738773800 +0200
+++ /var/tmp/diff_new_pack.fbo3X4/_new 2020-06-30 21:56:32.738773800 +0200
@@ -1,7 +1,7 @@
#
# spec file for package monitoring-plugins-mailstat
#
-# Copyright (c) 2014 SUSE LINUX Products GmbH, Nuernberg, Germany.
+# Copyright (c) 2020 SUSE LLC
#
# All modifications and additions to the file contributed by third parties
# remain the property of their copyright owners, unless otherwise agreed
@@ -12,34 +12,36 @@
# license that conforms to the Open Source Definition (Version 1.9)
# published by the Open Source Initiative.
-# Please submit bugfixes or comments via http://bugs.opensuse.org/
+# Please submit bugfixes or comments via https://bugs.opensuse.org/
#
Name: monitoring-plugins-mailstat
Version: 0.9.1
-Release: 100
+Release: 0
Summary: Monitoring mail server statistics
-License: GPL-3.0+
+License: GPL-3.0-or-later
Group: System/Monitoring
-Url: http://linuxplayer.org/2010/12/check_mailstat-pl-a-nagios-plugin-for-monito…
+URL: http://linuxplayer.org/2010/12/check_mailstat-pl-a-nagios-plugin-for-monito…
Source0: check_mailstat_plugin_v%{version}.zip
Source1: monitoring-plugins-mailstat-rpmlintrc
+Source2: check_mailstat.cfg
+Source3: gpl-3.0.txt
# PATCH-FIX-UPSTREAM -- allow to configure the path name of the statistics file via -s option
Patch1: check_mailstat_plugin_v0.9.1-stat_file.patch
# PATCH-FIX-UPSTREAM -- write out the initial values if there is no old file instead of all zero (confuses people)
Patch2: check_mailstat_plugin_v0.9.1-initial_values.patch
-%if 0%{?suse_version} > 1010
-# nagios can execute the script with embedded perl
-Recommends: perl
-%endif
BuildRequires: nagios-rpm-macros
BuildRequires: unzip
+Requires: mailgraph
+Requires: monitoring-plugins-common
+Requires: perl(RRDs)
Provides: nagios-plugins-mailstat = %{version}-%{release}
Obsoletes: nagios-plugins-mailstat < %{version}-%{release}
-Requires: monitoring-plugins-common
-BuildRoot: %{_tmppath}/%{name}-%{version}-build
BuildArch: noarch
+# nagios can execute the script with embedded perl
+Recommends: perl
+BuildRoot: %{_tmppath}/%{name}-%{version}-build
%description
This plugin includes a patch for mailgraph so that it will also output its
@@ -53,6 +55,7 @@
%patch1 -p1
%patch2 -p1
sed -i "s|\r||g" README.txt
+install -m0644 %{SOURCE3} .
%build
@@ -60,19 +63,20 @@
install -D -m755 check_mailstat.pl %{buildroot}/%{nagios_plugindir}/check_mailstat
ln -s %{nagios_plugindir}/check_mailstat %{buildroot}/%{nagios_plugindir}/check_mailstat.pl
install -D -m644 extra/check_mailstat.php %{buildroot}/%{pnp4nagios_templatedir}/check_mailstat.php
-
-%clean
-rm -rf %{buildroot}
+install -D -m644 %{SOURCE2} %{buildroot}%{nrpe_sysconfdir}/check_mailstat.cfg
%files
%defattr(-,root,root)
%doc README.txt
+%license gpl-3.0.txt
# avoid build dependency of nagios - own the dirs
%dir %{nagios_libdir}
%dir %{nagios_plugindir}
+%dir %{nrpe_sysconfdir}
%{nagios_plugindir}/check_mailstat*
%dir %{pnp4nagios_datarootdir}
%dir %{pnp4nagios_templatedir}
%config(noreplace) %{pnp4nagios_templatedir}/check_mailstat.php
+%config(noreplace) %{nrpe_sysconfdir}/check_mailstat.cfg
%changelog
++++++ check_mailstat.cfg ++++++
# the following check requires an additional command option
# for the running mailgraph process:
# -s /var/log/mailgraph/mailgraph.stats
# Please make sure that the file and a file with the
# same name but ending .old exist and are writable by
# the user executing the check below. Example:
# touch /var/log/mailgraph/mailgraph.stats{,old}
# chown nagios: /var/log/mailgraph/mailgraph.stats{,old}
command[check_mailstat]=/usr/lib/nagios/plugins/check_mailstat -w 1000:0:0:0:0:10 -c 2000:0:0:0:0:30 -s /var/log/mailgraph/mailgraph.statsi
++++++ gpl-3.0.txt ++++++
++++ 674 lines (skipped)
Hello community,
here is the log from the commit of package erlang for openSUSE:Factory checked in at 2020-06-30 21:56:24
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/erlang (Old)
and /work/SRC/openSUSE:Factory/.erlang.new.3060 (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Package is "erlang"
Tue Jun 30 21:56:24 2020 rev:103 rq:815752 version:23.0.2
Changes:
--------
--- /work/SRC/openSUSE:Factory/erlang/erlang.changes 2020-03-25 23:50:27.116018096 +0100
+++ /work/SRC/openSUSE:Factory/.erlang.new.3060/erlang.changes 2020-06-30 21:56:28.358760246 +0200
@@ -1,0 +2,84 @@
+Fri Jun 12 07:24:09 UTC 2020 - Matwey Kornilov <matwey.kornilov(a)gmail.com>
+
+- Changes for 23.0.2:
+ * erts: Fixed bug when sending an export fun (eg lists:reverse/1)
+ on a not yet established connection. It could cause VM crash.
+ Bug exists since OTP 23.0.
+ * megaco: The mini parser could not properly decode some IPv6
+ addresses.
+- Changes for 23.0.1:
+ * erts: The functionality utilized by BIFs for temporary
+ disabling of garbage collection while yielding could cause
+ system task queues to become inconsistent on a process
+ executing such a BIF. Process system tasks are for example
+ utilized when purging code, garbage collecting literal data,
+ and when issuing an ordinary garbage collection from another
+ process. The bug does not trigger frequently. Multiple code
+ purges in direct sequence makes it more likely that this bug is
+ triggered. In the cases observed, this has resulted in a
+ hanging code purge operation.
+ * erts: SCTP and UDP recv/2,3 hangs indefinitely if socket is
+ closed while recv is called (socket in passive mode).
+ * compiler: In rare circumstances, a guard using 'not' could
+ evaluate to the wrong boolean value.
+ * compiler: A guard expression that referenced a variable bound
+ to a boolean expression could evaluate to the wrong value.
+
+-------------------------------------------------------------------
+Fri Jun 12 06:36:45 UTC 2020 - Matwey Kornilov <matwey.kornilov(a)gmail.com>
+
+- Version 23.0:
+- Potential Incompatibilities:
+ * SSL:Support for SSL 3.0 is completely removed. TLS 1.3 is added
+ to the list of default supported versions.
+ * erl_interface: Removed the deprecated parts of erl_interface
+ (erl_interface.h and essentially all C functions with prefix
+ erl_).
+ * The deprecated erlang:get_stacktrace/0 BIF now returns an empty
+ list instead of a stacktrace. erlang:get_stacktrace/0 is
+ scheduled for removal in OTP 24.
+- Improvements and new features:
+ * ssh: OpenSSH 6.5 introduced a new file representation of keys
+ called openssh-key-v1. This is now supported with the exception
+ of handling encrypted keys.
+ * Algorithm configuration can now be done in a .config file. This
+ is useful, for example, to enable an algorithm that is disabled by
+ default without needing to change the code.
+ * SSL: Support for the middlebox compatibility mode makes the TLS
+ 1.3 handshake look more like a TLS 1.2 handshake and increases
+ the chance of successfully establishing TLS 1.3 connections
+ through legacy middleboxes.
+ * Add support for key exchange with Edward curves and PSS-RSA
+ padding in signature verification
+ * The possibility to run Erlang distribution without relying on
+ EPMD has been extended.
+ * A first EXPERIMENTAL socket backend to gen_tcp and inet has been
+ implemented. gen_udp and gen_sctp will follow.
+ * Putting {inet_backend, socket} as first option to listen() or
+ connect() makes it easy to try this for existing code
+ * A new module erpc in kernel which implements an enhanced subset
+ of the operations provided by the rpc module. Enhanced in the
+ sense that it makes it possible to distinguish between returned
+ value, raised exceptions and other errors. erpc also has better
+ performance and scalability than the original rpc implementation.
+ This by utilizing the newly introduced spawn_request() BIF. Also
+ the rpc module benefits from these improvements by utilizing erpc
+ when possible.
+ * Scalability and performance Improvements plus new functionality
+ regarding distributed spawn operations.
+ * In binary matching, the size of the segment to be matched is now
+ allowed to be a guard expression (EEP-52)
+ * When matching with maps the keys can now be guard expressions
+ (EEP-52).
+ * ssh: support for TCP/IP port forwarding, a.k.a. tunnelling, a.k.a.
+ tcp-forward/direct-tcp, is implemented. In the OpenSSH client,
+ this corresponds to the options -L and -R.
+ * Allow underscores in numeric literals to improve readability.
+ Examples: 123_456_789, 16#1234_ABCD.
+ * New functions in the shell for displaying documentation for
+ Erlang modules, functions and types.
+ * kernel: The module pg with a new implementation of distributed
+ named process groups is introduced. The old module pg2 is
+ deprecated and scheduled for removal in OTP 24.
+
+-------------------------------------------------------------------
Old:
----
OTP-22.3.tar.gz
New:
----
OTP-23.0.2.tar.gz
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Other differences:
------------------
++++++ erlang.spec ++++++
--- /var/tmp/diff_new_pack.IWojYR/_old 2020-06-30 21:56:29.714764442 +0200
+++ /var/tmp/diff_new_pack.IWojYR/_new 2020-06-30 21:56:29.714764442 +0200
@@ -23,7 +23,7 @@
%define _fillupdir %{_localstatedir}/adm/fillup-templates
%endif
Name: erlang
-Version: 22.3
+Version: 23.0.2
Release: 0
Summary: General-purpose programming language and runtime environment
License: Apache-2.0
++++++ OTP-22.3.tar.gz -> OTP-23.0.2.tar.gz ++++++
/work/SRC/openSUSE:Factory/erlang/OTP-22.3.tar.gz /work/SRC/openSUSE:Factory/.erlang.new.3060/OTP-23.0.2.tar.gz differ: char 13, line 1
++++++ erlang-not-install-misc.patch ++++++
--- /var/tmp/diff_new_pack.IWojYR/_old 2020-06-30 21:56:29.798764702 +0200
+++ /var/tmp/diff_new_pack.IWojYR/_new 2020-06-30 21:56:29.802764715 +0200
@@ -3,11 +3,11 @@
Subject: [PATCH] Do not format man-pages and do not install miscellaneous
utilities for dealing with man-pages.
-Index: otp-OTP-22.1/erts/etc/common/Makefile.in
+Index: otp-OTP-23.0/erts/etc/common/Makefile.in
===================================================================
---- otp-OTP-22.1.orig/erts/etc/common/Makefile.in
-+++ otp-OTP-22.1/erts/etc/common/Makefile.in
-@@ -504,10 +504,6 @@ endif
+--- otp-OTP-23.0.orig/erts/etc/common/Makefile.in
++++ otp-OTP-23.0/erts/etc/common/Makefile.in
+@@ -521,10 +521,6 @@ endif
ifneq ($(INSTALL_TOP_BIN),)
$(INSTALL_PROGRAM) $(INSTALL_TOP_BIN) "$(RELEASE_PATH)"
endif
@@ -18,11 +18,11 @@
ifneq ($(INSTALL_SRC),)
$(INSTALL_DIR) "$(RELEASE_PATH)/erts-$(VSN)/src"
$(INSTALL_DATA) $(INSTALL_SRC) "$(RELEASE_PATH)/erts-$(VSN)/src"
-Index: otp-OTP-22.1/erts/etc/unix/Install.src
+Index: otp-OTP-23.0/erts/etc/unix/Install.src
===================================================================
---- otp-OTP-22.1.orig/erts/etc/unix/Install.src
-+++ otp-OTP-22.1/erts/etc/unix/Install.src
-@@ -141,14 +141,5 @@ cp -p ../releases/%I_SYSTEM_VSN%/start_*
+--- otp-OTP-23.0.orig/erts/etc/unix/Install.src
++++ otp-OTP-23.0/erts/etc/unix/Install.src
+@@ -142,14 +142,5 @@ cp -p ../releases/%I_SYSTEM_VSN%/start_*
cp -p ../releases/%I_SYSTEM_VSN%/no_dot_erlang.boot .
cp -p $Name.boot start.boot
cp -p ../releases/%I_SYSTEM_VSN%/$Name.script start.script
++++++ otp-R16B-rpath.patch ++++++
--- /var/tmp/diff_new_pack.IWojYR/_old 2020-06-30 21:56:29.834764814 +0200
+++ /var/tmp/diff_new_pack.IWojYR/_new 2020-06-30 21:56:29.834764814 +0200
@@ -1,8 +1,8 @@
-Index: otp-OTP-22.1/lib/crypto/c_src/Makefile.in
+Index: otp-OTP-23.0/lib/crypto/c_src/Makefile.in
===================================================================
---- otp-OTP-22.1.orig/lib/crypto/c_src/Makefile.in
-+++ otp-OTP-22.1/lib/crypto/c_src/Makefile.in
-@@ -113,7 +113,7 @@ TEST_ENGINE_LIB = $(LIBDIR)/otp_test_eng
+--- otp-OTP-23.0.orig/lib/crypto/c_src/Makefile.in
++++ otp-OTP-23.0/lib/crypto/c_src/Makefile.in
+@@ -114,7 +114,7 @@ TEST_ENGINE_LIB = $(LIBDIR)/otp_test_eng
DYNAMIC_CRYPTO_LIB=@SSL_DYNAMIC_ONLY@
ifeq ($(DYNAMIC_CRYPTO_LIB),yes)
@@ -11,10 +11,10 @@
CRYPTO_LINK_LIB=$(SSL_DED_LD_RUNTIME_LIBRARY_PATH) -L$(SSL_LIBDIR) -l$(SSL_CRYPTO_LIBNAME)
EXTRA_FLAGS = -DHAVE_DYNAMIC_CRYPTO_LIB
else
-Index: otp-OTP-22.1/lib/crypto/priv/Makefile
+Index: otp-OTP-23.0/lib/crypto/priv/Makefile
===================================================================
---- otp-OTP-22.1.orig/lib/crypto/priv/Makefile
-+++ otp-OTP-22.1/lib/crypto/priv/Makefile
+--- otp-OTP-23.0.orig/lib/crypto/priv/Makefile
++++ otp-OTP-23.0/lib/crypto/priv/Makefile
@@ -61,7 +61,7 @@ OBJS = $(OBJDIR)/crypto.o
# ----------------------------------------------------
Hello community,
here is the log from the commit of package xrootd for openSUSE:Factory checked in at 2020-06-30 21:56:18
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/xrootd (Old)
and /work/SRC/openSUSE:Factory/.xrootd.new.3060 (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Package is "xrootd"
Tue Jun 30 21:56:18 2020 rev:19 rq:810692 version:4.12.1
Changes:
--------
--- /work/SRC/openSUSE:Factory/xrootd/xrootd.changes 2018-07-06 10:40:58.451327655 +0200
+++ /work/SRC/openSUSE:Factory/.xrootd.new.3060/xrootd.changes 2020-06-30 21:56:19.954734241 +0200
@@ -1,0 +2,180 @@
+Thu May 28 21:02:06 UTC 2020 - Atri Bhattacharya <badshah400(a)gmail.com>
+
+- Update to version 4.12.1:
+ * [XrdXrootdVoms] Fix run-time lib dependencies.
+ * [xrdcp] Don't create unwanted dir if --recursive option was
+ used and the source is a file.
+- Changes from version 4.12.0:
+ * New Features:
+ - [Server] Use redirector to find directory to be listed (R5
+ backport).
+ - [XrdCl] More effective error recovery policy when the source
+ is a metalink.
+ - [xrdcp] Implement bandwidth limiting (--xrate option).
+ - [xrdcp] Implement ability to continue interrupted transfer
+ (--continue).
+ - [xrdcp/Python] Add an option (--zip-mtln-cksum) to use the
+ checksum from a metalink with files extracted from ZIP
+ archives.
+ - [xrdcp/Python] Automatically infer checksum type if asked
+ to.
+ - [xrdcp/Python] Add an option (--rm-bad-cksum) to
+ automatically remove destination file if the checksum check
+ failed.
+ * Bug fixes:
+ - [Server] Correct sequencing of an async close to prevent a
+ SEGV.
+ - [Server] Add missing initializer to avoid TPC SEGV on busy
+ systems.
+ - [Server] Initialize the XrdAccEntityInfo structure.
+ - [XrdHttp] Fix MKCOL response when we have an EEXIST.
+ - [XrdHttp] Periodically reload verify cert store.
+ - [XrdHttp] Disable session cache.
+ - [XrdCl] Don't set on-connection handler for local files.
+ - [XrdCl] Don't set the stream error window for auth errors.
+ - [XrdCl] Fix race condition resulting in dangling pointer to
+ SidMgr.
+ - [XrdCl] Make Channel operations thread-safe in respect to
+ ForceDisconnect.
+ * [CMake] Replace XRootD find module with config module.
+ * [RPM/CMake] Don't build XrdCns.
+ * [Packaging] Add xrootd-voms and xrootd-voms-devel packages
+ equivalent.
+ * [Packaging] Add additional private headers for vomsxrd to the
+ vomsxrd packages.
+ * [Python] Support PyPy.
+- Only package the COPYING.LGPL file and drop all extraneous
+ COPYING* files; the LICENSE file makes it clear that the
+ software is licensed under LGPL-3.0-or-later.
+
+-------------------------------------------------------------------
+Fri Apr 3 01:40:54 UTC 2020 - Atri Bhattacharya <badshah400(a)gmail.com>
+
+- Update to version 4.11.3:
+ * [Server] Avoid SEGV when skipping over large if/else/fi
+ blocks.
+ * [XrdHttp] Fix curl speed limit to be really around 10KB/s.
+ * [xrootdfs] Make sure xrootdfs_create() checks return code of
+ xrootdfs_do_create().
+ * [XrdHttp] Change the default ssl cipher.
+ * [XrdHttp] Enable elliptic-curve support for OpenSSL 1.0.2+.
+ * [XrdHttp] Use Mozilla cipher list only with OpenSSL 1.0.2+.
+ * [XrdCl] When xrdcp reports an error refine the message so it
+ is clear whether the error comes from the source or
+ destination.
+ * [XrdCl] Make sure on error local redirect is not retried
+ infinitely.
+ * [XrdXrootdConfig] Fixed wrong segsz parameter.
+ * [XrdHttp] Give a chance to map the issuer name in the case of
+ a proxy certificate (needed to accommodate systems that do
+ user mapping, e.g. eos)
+ * [XrdHttp] Fix the logic for determining SecEntity.name.
+ * [XrdHttp] Don't overwrite SecEntity.name after the gridmap
+ phase.
+ * [xrootdfs] Make sure xrootdfs_create() does check the return
+ code of xrootdfs_do_create().
+ * [XrdSecgsi] In the case of delegation, give client a chance to
+ use XrdSecGSISRVNAMES to limit where it is being redirected
+ to.
+ * [Python] Make rpath relative to the module.
+ * [Debian] Proper Debian pkg naming.
+ * [XrdCl] Use glibc semaphores for rhel>=7.
+ * [Server] Export pointers to Acc, Ofs, and Sfs plugin into the
+ XrdEnv.
+ * [XrdMacaroons] Use env ptrs to get authorize obj.
+- Changes from versions 4.11.0 through 4.11.2:
+ * See
+ https://github.com/xrootd/xrootd/blob/v4.11.3/docs/ReleaseNotes.txt.
+- Remove a spurious 'exit 0' line from %pre scriptlet.
+- Recommend logrotate for xrootd-server as it installs a file in
+ /etc/logrotate.d/.
+- Use ninja for building:
+ * Add BuildRequires: ninja.
+ * Use %cmake_build instead of make.
+ * Define __builder to ninja so cmake uses ninja instead of make
+ for building.
+
+-------------------------------------------------------------------
+Mon Aug 19 12:10:46 UTC 2019 - Atri Bhattacharya <badshah400(a)gmail.com>
+
+- Update to version 4.10.0:
+ * New Features
- [POSIX] Add methods to the cache mngt object to get status
+ of a cached file.
+ - [Server] Add xrd.network dyndns option for dynamic DNS
+ handling.
+ - [Server] Properly handle dynamic DNS environments.
+ - [Server] Add evict option to the prepare request.
+ - [Server] Allow better handling for proxy caching clusters.
+ - [Server] Allow configurable posc sync level.
+ - [Server] Implement framework for a kXR_prepare plug-in.
+ - [XrdCl] Implement streaming dirls, closes #225
+ - [XrdCl] Retry policy changes: see details in #950.
+ - [XrdCl] Add switch/flag to enforce zip archive listing.
+ - [XrdCl] Preserve tried/triedrc cgi info on redirect for
+ kXR_locate request, #closes #944
+ - [XrdCl] Implement prepare evict and expose in xrdfs.
+ - [XrdCl] Expose prepare abort and query prepare.
+ - [XrdCl] Add tpc.scgi if delegation is being used.
+ - [Python] Expose chunked dirlist API in Python.
+ - [Python] Support globbing.
+ - [XrdClHttp] Add XrdClHttp submodule.
+ - [XCache] Implement write-queue number of blocks / threads
+ config options. (#963)
+ - [CMake] Add switch for XrdCl builds only.
+ * Major bug fixes:
+ - [XrdCl] Allow to cancel classical copy jobs.
+ - [XrdCl] Fix race condition in error recovery.
+ - [XrdCl] Prevent race condition between Request T/O and
+ socket read.
+ - [XCache] Check for errors during XrdOssFile::FSync - do not
+ write cinfo file if it fails.
+ - [XCache] Deny write access to a file in initial Prepare()
+ call, fixes #663
+ - [XCache] Review and correct error handling in POSIX and
+ XCache, implement XCache::Unlink()
+ - [XrdTpc] Always query dual stack for HTTP TPC.
+ - [XrdTpc] Do not include authz header in the filename.
+ - [XrdTpc] Add curl handle cleanup on redirections or errors.
+ - [XrdHttp] Include provided opaque information in the
+ redirection.
+ - [XrdHttp] Fix digest names returned to clients.
+ - [XrdHttp] Fix opaque info handling on redirections.
+ - [XrdMacaroon] Fix macaroon authorization config.
+ - [XrdSecgsi] Make proxy cache path aware.
+ - [XrdSecgsi] XrdSecProtocolgsi::Encrypt - set IV correctly
+ and report correct size.
+ * Minor bug fixes and miscellaneous changes: see
+ https://github.com/xrootd/xrootd/blob/master/docs/ReleaseNotes.txt
+- Use github URL for source tarball.
+- Replace xrootd-devel by xroots-libs-devel for
+ xrootd-server-devel's Requires; xrootd-devel does not exist.
+
+-------------------------------------------------------------------
+Thu Jan 17 19:02:48 UTC 2019 - Jan Engelhardt <jengelh(a)inai.de>
+
+- Disable ceph since linking to it fails.
+
+-------------------------------------------------------------------
+Wed Dec 19 10:49:19 UTC 2018 - Jan Engelhardt <jengelh(a)inai.de>
+
+- Trim long repeated descriptions in subpackages.
+- Trim future aims. Run fdupes over all of the usr subvolume.
+
+-------------------------------------------------------------------
+Wed Dec 5 16:29:57 UTC 2018 - Todd R <toddrme2178(a)gmail.com>
+
+- Update to version 4.8.5
+- Remove upstream-included xrootd-gcc6-fix.patch
+- Remove unneeded xrootd-gcc8-fix.patch
+- Merge cl and cl-devel subpackages into client and client-devel
+ subpackages, respectively, to match upstream
+- Split client and server libraries into subpackages to simplify
+ dependencies, simplify installation, and better match upstream.
+- Switch to cmake-based install
+- Switch from sysv-init to systemd.
+- Build and package python3 bindings
+- Build and package ceph storage backend
+- Build and package documentation
+
+-------------------------------------------------------------------
Old:
----
cmsd
frm_purged
frm_xfrd
xrootd
xrootd-3.3.6.tar.gz
xrootd-gcc6-fix.patch
xrootd-gcc8-fix.patch
New:
----
xrootd-4.12.1.tar.gz
xrootd-rpmlintrc
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Other differences:
------------------
++++++ xrootd.spec ++++++
++++ 900 lines (skipped)
++++ between /work/SRC/openSUSE:Factory/xrootd/xrootd.spec
++++ and /work/SRC/openSUSE:Factory/.xrootd.new.3060/xrootd.spec
++++++ xrootd-3.3.6.tar.gz -> xrootd-4.12.1.tar.gz ++++++
++++ 235052 lines of diff (skipped)
++++++ xrootd-rpmlintrc ++++++
addFilter("devel-file-in-non-devel-package .*/libXrdPosixPreload.so")
addFilter("no-soname .*/lib.*-[0-9]+.so")
addFilter("xrootd.*-libs\..* shlib-policy-missing-suffix")
Hello community,
here is the log from the commit of package kmymoney for openSUSE:Factory checked in at 2020-06-30 21:56:14
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/kmymoney (Old)
and /work/SRC/openSUSE:Factory/.kmymoney.new.3060 (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Package is "kmymoney"
Tue Jun 30 21:56:14 2020 rev:71 rq:814786 version:5.1.0
Changes:
--------
--- /work/SRC/openSUSE:Factory/kmymoney/kmymoney.changes 2020-01-24 13:13:54.537499232 +0100
+++ /work/SRC/openSUSE:Factory/.kmymoney.new.3060/kmymoney.changes 2020-06-30 21:56:16.154722483 +0200
@@ -1,0 +2,77 @@
+Mon Jun 15 19:27:12 UTC 2020 - Wolfgang Bauer <wbauer(a)tmo.at>
+
+- Update to 5.1.0
+- Bugs fixed:
+ * OFX import leaves brokerage account field blank for nested
+ accounts (kde#350360)
+ * KF5 ofximporter "Map account" fails on MS Windows (kde#396286)
+ * report's chart mess with data if there are too many data
+ (kde#399261)
+ * Some ui files are not compilable after editing with designer
+ (kde#416534)
+ * Message Box Doesn't Size (kde#416577)
+ * Import a QFX file fails on MacOS (kde#416621)
+ * Missing german translation (kde#416711)
+ * Summary values are not updated for investment transaction of
+ type interest income (kde#416746)
+ * libofx dtd files are not found in AppImage (kde#416827)
+ * Investment reports should ignore setting for "Show equity
+ accounts" (kde#416902)
+ * After the migration to aq6 the change of views takes a long
+ time (kde#416963)
+ * cannot find yahoo finance under online quotes (kde#417142)
+ * Request: Use latest values to fill in transaction (kde#418334)
+ * Script based online quotes do not work in the AppImage version
+ (kde#418823)
+ * Backup (kde#419082)
+ * Data displayed in scheduled transaction and home page are
+ sometimes not consistent (kde#419113)
+ * Balance of budget is shown incorrect (kde#419554)
+ * MyMoneyStatementReader uses base currency instead brokerage
+ account's own when adding new price (kde#419974)
+ * When importing transactions, we're matching against the other
+ transactions also being imported (kde#419975)
+ * BUY/SELL information ignored when importing OFX investment
+ transactions (kde#420056)
+ * Startlogo not translated in french (kde#420082)
+ * Indian Rupee has new symbol since 7 years,it is ₹ (kde#420422)
+ * QIF importer ignores new investments (kde#420584)
+ * A sum of multiple rows selected is incorrect for securities
+ with fraction > 100 (kde#420593)
+ * Inaccurate decimal precision of South Korean Won (KRW)
+ (kde#420683)
+ * After upgrade from Fedora 31 to 32, one of my checking accounts
+ shows a huge negative "Cleared" balance (kde#420761)
+ * Incorrect ordinate axis labels when zooming a chart
+ (kde#420767)
+ * Crash in "Edit loan Wizard" (kde#420931)
+ * Freeze: logarithmic vertical axis and negative data range From
+ value (kde#421056)
+ * Logarithmic vertical axis has multiple zero labels (kde#421105)
+ * Securities Dialog "Market" field not populated with existing
+ data on edit (kde#421126)
+ * Networth "account balances by institution" provides incorrect
+ results (kde#421260)
+ * Account context menu's Reconcile option opens incorrect ledger
+ (kde#421307)
+ * New Account Wizard throws exception on empty payment method
+ selected (kde#421569)
+ * New Account Wizard is not asking if the user wants to add a new
+ payee (kde#421691)
+ * Scheduled monthly transaction will only change first date
+ (kde#421750)
+ * Anonymised files are no longer created (kde#421757)
+ * SEGFAULT occurring when marking an account as preferred
+ (kde#421900)
+ * Incorrect account hierarchy if an account is marked as
+ preferred (kde#422012)
+ * Budget view displays all account types (kde#422196)
+ * KMyMoney crashes when navigating backwards through CSV import
+ wizard (kde#422200)
+ * Search widget in the Budgets view ignores user input
+ (kde#422480)
+- Enhancements:
+ * Add option "Reverse charges and payments" to OFX import
+ (kde#416279)
+
+-------------------------------------------------------------------
Old:
----
kmymoney-5.0.8.tar.xz
New:
----
kmymoney-5.1.0.tar.xz
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Other differences:
------------------
++++++ kmymoney.spec ++++++
--- /var/tmp/diff_new_pack.sbHf68/_old 2020-06-30 21:56:17.146725552 +0200
+++ /var/tmp/diff_new_pack.sbHf68/_new 2020-06-30 21:56:17.150725564 +0200
@@ -28,7 +28,7 @@
%bcond_with qtwebengine
%endif
Name: kmymoney
-Version: 5.0.8
+Version: 5.1.0
Release: 0
Summary: A Personal Finance Manager by KDE
License: GPL-2.0-only OR GPL-3.0-only
++++++ kmymoney-5.0.8.tar.xz -> kmymoney-5.1.0.tar.xz ++++++
/work/SRC/openSUSE:Factory/kmymoney/kmymoney-5.0.8.tar.xz /work/SRC/openSUSE:Factory/.kmymoney.new.3060/kmymoney-5.1.0.tar.xz differ: char 26, line 1