openSUSE Commits
December 2018
Hello community,
here is the log from the commit of package python-social-auth-app-django for openSUSE:Factory checked in at 2018-12-03 10:12:32
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/python-social-auth-app-django (Old)
and /work/SRC/openSUSE:Factory/.python-social-auth-app-django.new.19453 (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Package is "python-social-auth-app-django"
Mon Dec 3 10:12:32 2018 rev:2 rq:653426 version:3.1.0
Changes:
--------
--- /work/SRC/openSUSE:Factory/python-social-auth-app-django/python-social-auth-app-django.changes 2018-08-03 12:36:56.759537879 +0200
+++ /work/SRC/openSUSE:Factory/.python-social-auth-app-django.new.19453/python-social-auth-app-django.changes 2018-12-03 10:12:40.215582379 +0100
@@ -1,0 +2,18 @@
+Fri Nov 30 10:20:04 UTC 2018 - Matthias Fehring <buschmann23(a)opensuse.org>
+
+- Update to version 3.1.0
+ * Updated JSONField.from_db_value signature to support multiple
+ Django versions by accepting just the needed parameters.
+- Changes from version 3.0.0
+ * Reduce log level of exceptions to INFO if messages app is installed
+ * Encode association secret with encodebytes if available
+ * Decode association secret for proper storage
+ * Remove obsolete code from JSONField
+ * Pass user as keyword argument to do_complete
+ * Cleanup username when using email as username
+ * Drop Python 3.3 support
+ * Correct spelling errors
+ * Correct version that renamed field.rel
+ * Reduce error logs in SocialAuthExceptionMiddleware
+
+-------------------------------------------------------------------
Old:
----
social-auth-app-django-2.1.0.tar.gz
New:
----
social-auth-app-django-3.1.0.tar.gz
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Other differences:
------------------
++++++ python-social-auth-app-django.spec ++++++
--- /var/tmp/diff_new_pack.AjgzDf/_old 2018-12-03 10:12:40.899581745 +0100
+++ /var/tmp/diff_new_pack.AjgzDf/_new 2018-12-03 10:12:40.899581745 +0100
@@ -12,29 +12,29 @@
# license that conforms to the Open Source Definition (Version 1.9)
# published by the Open Source Initiative.
-# Please submit bugfixes or comments via http://bugs.opensuse.org/
+# Please submit bugfixes or comments via https://bugs.opensuse.org/
#
%{?!python_module:%define python_module() python-%{**} python3-%{**}}
Name: python-social-auth-app-django
-Version: 2.1.0
+Version: 3.1.0
Release: 0
Summary: Python Social Authentication, Django integration
License: BSD-3-Clause
Group: Development/Languages/Python
URL: https://github.com/python-social-auth/social-app-django
Source: https://files.pythonhosted.org/packages/source/s/social-auth-app-django/soc…
+BuildRequires: %{python_module Django >= 1.11}
BuildRequires: %{python_module mock}
BuildRequires: %{python_module setuptools}
BuildRequires: %{python_module six}
BuildRequires: %{python_module social-auth-core >= 1.2.0}
-BuildRequires: %{python_module Django >= 1.11}
BuildRequires: fdupes
BuildRequires: python-rpm-macros
+Requires: python-Django >= 1.11
Requires: python-six
Requires: python-social-auth-core >= 1.2.0
-Requires: python-Django >= 1.11
BuildArch: noarch
%python_subpackages
++++++ social-auth-app-django-2.1.0.tar.gz -> social-auth-app-django-3.1.0.tar.gz ++++++
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/social-auth-app-django-2.1.0/CHANGELOG.md new/social-auth-app-django-3.1.0/CHANGELOG.md
--- old/social-auth-app-django-2.1.0/CHANGELOG.md 2017-12-22 13:44:21.000000000 +0100
+++ new/social-auth-app-django-3.1.0/CHANGELOG.md 2018-10-31 15:43:54.000000000 +0100
@@ -5,7 +5,25 @@
The format is based on [Keep a Changelog](http://keepachangelog.com/)
and this project adheres to [Semantic Versioning](http://semver.org/).
-## [Unreleased](https://github.com/python-social-auth/social-app-django/commits…
+## [3.1.0](https://github.com/python-social-auth/social-app-django/releases/tag… - 2018-10-31
+
+### Changed
+- Updated `JSONField.from_db_value` signature to support multiple Django
+ versions by accepting just the needed parameters.
+
+## [3.0.0](https://github.com/python-social-auth/social-app-django/releases/tag… - 2018-10-28
+
+### Changed
+- Reduce log level of exceptions to `INFO` if messages app is installed
+- Encode association secret with `encodebytes` if available
+- Decode association secret for proper storage
+- Remove obsolete code from JSONField
+- Pass `user` as keyword argument to `do_complete`
+- Cleanup `username` when using email as username
+- Drop Python 3.3 support
+- Correct spelling errors
+- Correct version that renamed `field.rel`
+- Reduce error logs in `SocialAuthExceptionMiddleware`
## [2.1.0](https://github.com/python-social-auth/social-app-django/releases/tag… - 2017-12-22
@@ -13,6 +31,7 @@
- Use Django `urlquote` since it handles unicode
- Remove version check in favor of import error catch
- Remove call to deprecated method `_get_val_from_obj()`
+- Drop Python 3.3 support
## [2.0.0](https://github.com/python-social-auth/social-app-django/releases/tag… - 2017-10-28
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/social-auth-app-django-2.1.0/PKG-INFO new/social-auth-app-django-3.1.0/PKG-INFO
--- old/social-auth-app-django-2.1.0/PKG-INFO 2017-12-22 13:49:21.000000000 +0100
+++ new/social-auth-app-django-3.1.0/PKG-INFO 2018-10-31 15:46:24.000000000 +0100
@@ -1,16 +1,16 @@
Metadata-Version: 1.1
Name: social-auth-app-django
-Version: 2.1.0
+Version: 3.1.0
Summary: Python Social Authentication, Django integration.
Home-page: https://github.com/python-social-auth/social-app-django
Author: Matias Aguirre
Author-email: matiasaguirre(a)gmail.com
License: BSD
-Description-Content-Type: UNKNOWN
Description: # Python Social Auth - Django
[![Build Status](https://travis-ci.org/python-social-auth/social-app-django.svg?bran…
[![Donate](https://img.shields.io/badge/Donate-PayPal-orange.svg)](https://www.paypal.com/cgi-bin/webscr?cmd=_donations&business=matiasaguirre%40gmail%2ecom&lc=US&item_name=Python%20Social%20Auth&no_note=0¤cy_code=USD&bn=PP%2dDonationsBF%3abtn_donate_SM%2egif%3aNonHostedGuest)
+ [![PyPI version](https://badge.fury.io/py/social-auth-app-django.svg)](https://badge.fury.io/py/social-auth-app-django)
Python Social Auth is an easy to setup social authentication/registration
mechanism with support for several frameworks and auth providers.
@@ -54,7 +54,7 @@
## Donations
- This project is maintened on my spare time, consider donating to keep
+ This project is maintained on my spare time, consider donating to keep
it improving.
[![Donate](https://img.shields.io/badge/Donate-PayPal-orange.svg)](https://www.paypal.com/cgi-bin/webscr?cmd=_donations&business=matiasaguirre%40gmail%2ecom&lc=US&item_name=Python%20Social%20Auth&no_note=0¤cy_code=USD&bn=PP%2dDonationsBF%3abtn_donate_SM%2egif%3aNonHostedGuest)
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/social-auth-app-django-2.1.0/README.md new/social-auth-app-django-3.1.0/README.md
--- old/social-auth-app-django-2.1.0/README.md 2017-05-06 14:05:26.000000000 +0200
+++ new/social-auth-app-django-3.1.0/README.md 2018-10-28 21:12:02.000000000 +0100
@@ -2,6 +2,7 @@
[![Build Status](https://travis-ci.org/python-social-auth/social-app-django.svg?bran…
[![Donate](https://img.shields.io/badge/Donate-PayPal-orange.svg)](https://www.paypal.com/cgi-bin/webscr?cmd=_donations&business=matiasaguirre%40gmail%2ecom&lc=US&item_name=Python%20Social%20Auth&no_note=0¤cy_code=USD&bn=PP%2dDonationsBF%3abtn_donate_SM%2egif%3aNonHostedGuest)
+[![PyPI version](https://badge.fury.io/py/social-auth-app-django.svg)](https://badge.fury.io/py/social-auth-app-django)
Python Social Auth is an easy to setup social authentication/registration
mechanism with support for several frameworks and auth providers.
@@ -45,7 +46,7 @@
## Donations
-This project is maintened on my spare time, consider donating to keep
+This project is maintained on my spare time, consider donating to keep
it improving.
[![Donate](https://img.shields.io/badge/Donate-PayPal-orange.svg)](https://www.paypal.com/cgi-bin/webscr?cmd=_donations&business=matiasaguirre%40gmail%2ecom&lc=US&item_name=Python%20Social%20Auth&no_note=0¤cy_code=USD&bn=PP%2dDonationsBF%3abtn_donate_SM%2egif%3aNonHostedGuest)
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/social-auth-app-django-2.1.0/setup.py new/social-auth-app-django-3.1.0/setup.py
--- old/social-auth-app-django-2.1.0/setup.py 2017-01-01 15:37:05.000000000 +0100
+++ new/social-auth-app-django-3.1.0/setup.py 2018-08-20 15:43:51.000000000 +0200
@@ -34,7 +34,9 @@
url='https://github.com/python-social-auth/social-app-django',
packages=[
'social_django',
- 'social_django.migrations'
+ 'social_django.migrations',
+ 'social_django.management',
+ 'social_django.management.commands',
],
long_description=long_description(),
install_requires=load_requirements(),
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/social-auth-app-django-2.1.0/social_auth_app_django.egg-info/PKG-INFO new/social-auth-app-django-3.1.0/social_auth_app_django.egg-info/PKG-INFO
--- old/social-auth-app-django-2.1.0/social_auth_app_django.egg-info/PKG-INFO 2017-12-22 13:49:21.000000000 +0100
+++ new/social-auth-app-django-3.1.0/social_auth_app_django.egg-info/PKG-INFO 2018-10-31 15:46:24.000000000 +0100
@@ -1,16 +1,16 @@
Metadata-Version: 1.1
Name: social-auth-app-django
-Version: 2.1.0
+Version: 3.1.0
Summary: Python Social Authentication, Django integration.
Home-page: https://github.com/python-social-auth/social-app-django
Author: Matias Aguirre
Author-email: matiasaguirre(a)gmail.com
License: BSD
-Description-Content-Type: UNKNOWN
Description: # Python Social Auth - Django
[![Build Status](https://travis-ci.org/python-social-auth/social-app-django.svg?bran…
[![Donate](https://img.shields.io/badge/Donate-PayPal-orange.svg)](https://www.paypal.com/cgi-bin/webscr?cmd=_donations&business=matiasaguirre%40gmail%2ecom&lc=US&item_name=Python%20Social%20Auth&no_note=0¤cy_code=USD&bn=PP%2dDonationsBF%3abtn_donate_SM%2egif%3aNonHostedGuest)
+ [![PyPI version](https://badge.fury.io/py/social-auth-app-django.svg)](https://badge.fury.io/py/social-auth-app-django)
Python Social Auth is an easy to setup social authentication/registration
mechanism with support for several frameworks and auth providers.
@@ -54,7 +54,7 @@
## Donations
- This project is maintened on my spare time, consider donating to keep
+ This project is maintained on my spare time, consider donating to keep
it improving.
[![Donate](https://img.shields.io/badge/Donate-PayPal-orange.svg)](https://www.paypal.com/cgi-bin/webscr?cmd=_donations&business=matiasaguirre%40gmail%2ecom&lc=US&item_name=Python%20Social%20Auth&no_note=0¤cy_code=USD&bn=PP%2dDonationsBF%3abtn_donate_SM%2egif%3aNonHostedGuest)
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/social-auth-app-django-2.1.0/social_django/__init__.py new/social-auth-app-django-3.1.0/social_django/__init__.py
--- old/social-auth-app-django-2.1.0/social_django/__init__.py 2017-12-22 13:46:23.000000000 +0100
+++ new/social-auth-app-django-3.1.0/social_django/__init__.py 2018-10-31 15:45:18.000000000 +0100
@@ -1,4 +1,4 @@
-__version__ = '2.1.0'
+__version__ = '3.1.0'
from social_core.backends.base import BaseAuth
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/social-auth-app-django-2.1.0/social_django/compat.py new/social-auth-app-django-3.1.0/social_django/compat.py
--- old/social-auth-app-django-2.1.0/social_django/compat.py 2017-12-22 13:40:50.000000000 +0100
+++ new/social-auth-app-django-3.1.0/social_django/compat.py 2018-10-28 21:18:39.000000000 +0100
@@ -16,7 +16,7 @@
def get_rel_model(field):
- if django.VERSION >= (2, 0):
+ if django.VERSION >= (1, 9):
return field.remote_field.model
user_model = field.rel.to
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/social-auth-app-django-2.1.0/social_django/fields.py new/social-auth-app-django-3.1.0/social_django/fields.py
--- old/social-auth-app-django-2.1.0/social_django/fields.py 2017-12-22 13:38:26.000000000 +0100
+++ new/social-auth-app-django-3.1.0/social_django/fields.py 2018-10-31 15:41:57.000000000 +0100
@@ -1,35 +1,17 @@
import json
import six
-import functools
-
-import django
from django.core.exceptions import ValidationError
from django.conf import settings
from django.db import models
+from django.utils.encoding import force_text
from social_core.utils import setting_name
-try:
- from django.utils.encoding import smart_unicode as smart_text
- smart_text # placate pyflakes
-except ImportError:
- from django.utils.encoding import smart_text
-
-# SubfieldBase causes RemovedInDjango110Warning in 1.8 and 1.9, and
-# will not work in 1.10 or later
-if django.VERSION[:2] >= (1, 8):
- field_metaclass = type
-else:
- from django.db.models import SubfieldBase
- field_metaclass = SubfieldBase
-
-field_class = functools.partial(six.with_metaclass, field_metaclass)
-
if getattr(settings, setting_name('POSTGRES_JSONFIELD'), False):
from django.contrib.postgres.fields import JSONField as JSONFieldBase
else:
- JSONFieldBase = field_class(models.TextField)
+ JSONFieldBase = models.TextField
class JSONField(JSONFieldBase):
@@ -41,7 +23,7 @@
kwargs.setdefault('default', dict)
super(JSONField, self).__init__(*args, **kwargs)
- def from_db_value(self, value, expression, connection, context):
+ def from_db_value(self, value, *args, **kwargs):
return self.to_python(value)
def to_python(self, value):
@@ -56,10 +38,6 @@
value = six.text_type(value, 'utf-8')
if isinstance(value, six.string_types):
try:
- # with django 1.6 i have '"{}"' as default value here
- if value[0] == value[-1] == '"':
- value = value[1:-1]
-
return json.loads(value)
except Exception as err:
raise ValidationError(str(err))
@@ -85,10 +63,9 @@
def value_to_string(self, obj):
"""Return value from object converted to string properly"""
- return smart_text(self.value_from_object(obj))
+ return force_text(self.value_from_object(obj))
def value_from_object(self, obj):
"""Return value dumped to string."""
orig_val = super(JSONField, self).value_from_object(obj)
return self.get_prep_value(orig_val)
-
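The fields.py hunk above replaces the fixed `from_db_value(self, value, expression, connection, context)` signature with a `*args, **kwargs` form, because Django versions before 2.0 pass an extra `context` argument that later versions dropped. A minimal sketch of why accepting only the needed parameter works across both call shapes (class and method names here are illustrative, not the package's API):

```python
import json


class CompatJSONField:
    """Sketch of a field whose from_db_value survives Django's signature
    change: Django < 2.0 calls it with (value, expression, connection,
    context), Django >= 2.0 with (value, expression, connection).
    Only `value` is needed to deserialize, so swallow the rest."""

    def from_db_value(self, value, *args, **kwargs):
        # Extra positional arguments vary by Django version; ignore them.
        return self.to_python(value)

    def to_python(self, value):
        if isinstance(value, str):
            return json.loads(value)
        return value


field = CompatJSONField()
# Simulate the old (four-argument) and new (three-argument) call shapes:
assert field.from_db_value('{"a": 1}', None, None, None) == {"a": 1}
assert field.from_db_value('{"a": 1}', None, None) == {"a": 1}
```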
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/social-auth-app-django-2.1.0/social_django/middleware.py new/social-auth-app-django-3.1.0/social_django/middleware.py
--- old/social-auth-app-django-2.1.0/social_django/middleware.py 2017-10-28 15:21:25.000000000 +0200
+++ new/social-auth-app-django-3.1.0/social_django/middleware.py 2018-08-20 15:43:51.000000000 +0200
@@ -1,6 +1,7 @@
# -*- coding: utf-8 -*-
import six
+from django.apps import apps
from django.conf import settings
from django.contrib import messages
from django.contrib.messages.api import MessageFailure
@@ -33,17 +34,21 @@
backend_name = getattr(backend, 'name', 'unknown-backend')
message = self.get_message(request, exception)
- social_logger.error(message)
-
url = self.get_redirect_uri(request, exception)
- try:
- messages.error(request, message,
- extra_tags='social-auth ' + backend_name)
- except MessageFailure:
- if url:
- url += ('?' in url and '&' or '?') + \
- 'message={0}&backend={1}'.format(urlquote(message),
- backend_name)
+
+ if apps.is_installed('django.contrib.messages'):
+ social_logger.info(message)
+ try:
+ messages.error(request, message,
+ extra_tags='social-auth ' + backend_name)
+ except MessageFailure:
+ if url:
+ url += ('?' in url and '&' or '?') + \
+ 'message={0}&backend={1}'.format(urlquote(message),
+ backend_name)
+ else:
+ social_logger.error(message)
+
if url:
return redirect(url)
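The middleware hunk above implements the "reduce log level of exceptions to INFO if messages app is installed" changelog entry: when `django.contrib.messages` is available the exception is already surfaced to the user, so an ERROR log entry would be redundant noise. A standalone sketch of that decision, with the Django app check reduced to a boolean (the function name is illustrative):

```python
import logging

logging.basicConfig(level=logging.DEBUG)
social_logger = logging.getLogger("social")


def report_exception(message, messages_app_installed):
    """Sketch of the log-level demotion above: if the messages app can
    show the error to the user, log at INFO; otherwise the log is the
    only record of the failure, so keep ERROR."""
    if messages_app_installed:
        social_logger.info(message)   # user already sees the message
    else:
        social_logger.error(message)  # log is the only record
```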
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/social-auth-app-django-2.1.0/social_django/storage.py new/social-auth-app-django-3.1.0/social_django/storage.py
--- old/social-auth-app-django-2.1.0/social_django/storage.py 2017-01-28 15:06:05.000000000 +0100
+++ new/social-auth-app-django-3.1.0/social_django/storage.py 2018-10-28 21:44:39.000000000 +0100
@@ -2,6 +2,7 @@
import base64
import six
import sys
+from django.core.exceptions import FieldDoesNotExist
from django.db import transaction
from django.db.utils import IntegrityError
@@ -58,8 +59,16 @@
@classmethod
def create_user(cls, *args, **kwargs):
username_field = cls.username_field()
- if 'username' in kwargs and username_field not in kwargs:
- kwargs[username_field] = kwargs.pop('username')
+ if 'username' in kwargs:
+ if username_field not in kwargs:
+ kwargs[username_field] = kwargs.pop('username')
+ else:
+ # If username_field is 'email' and there is no field named "username"
+ # then latest should be removed from kwargs.
+ try:
+ cls.user_model()._meta.get_field('username')
+ except FieldDoesNotExist:
+ kwargs.pop('username')
try:
if hasattr(transaction, 'atomic'):
# In Django versions that have an "atomic" transaction decorator / context
@@ -152,7 +161,11 @@
except cls.DoesNotExist:
assoc = cls(server_url=server_url,
handle=association.handle)
- assoc.secret = base64.encodestring(association.secret)
+
+ try:
+ assoc.secret = base64.encodebytes(association.secret).decode()
+ except AttributeError:
+ assoc.secret = base64.encodestring(association.secret).decode()
assoc.issued = association.issued
assoc.lifetime = association.lifetime
assoc.assoc_type = association.assoc_type
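The storage.py hunk above covers two changelog entries at once: Python 3 renamed `base64.encodestring` to `base64.encodebytes` (the old alias was later removed entirely), and the encoded secret must be `.decode()`d so a text string rather than raw bytes is stored. A self-contained sketch of the same fallback (the helper name is illustrative):

```python
import base64


def encode_secret(secret: bytes) -> str:
    """Sketch of the association-secret change above: prefer the
    Python 3 name encodebytes, fall back to encodestring on old
    interpreters, and decode to text for proper storage."""
    try:
        encoded = base64.encodebytes(secret)
    except AttributeError:  # interpreters predating the rename
        encoded = base64.encodestring(secret)
    return encoded.decode()


# Matches the value asserted in the updated tests/test_models.py above,
# where the association secret is the single byte b"b":
assert encode_secret(b"b") == "Yg==\n"
```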
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/social-auth-app-django-2.1.0/social_django/views.py new/social-auth-app-django-3.1.0/social_django/views.py
--- old/social-auth-app-django-2.1.0/social_django/views.py 2017-10-28 15:21:25.000000000 +0200
+++ new/social-auth-app-django-3.1.0/social_django/views.py 2018-10-28 21:12:02.000000000 +0100
@@ -28,7 +28,7 @@
@psa('{0}:complete'.format(NAMESPACE))
def complete(request, backend, *args, **kwargs):
"""Authentication complete view"""
- return do_complete(request.backend, _do_login, request.user,
+ return do_complete(request.backend, _do_login, user=request.user,
redirect_name=REDIRECT_FIELD_NAME, request=request,
*args, **kwargs)
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/social-auth-app-django-2.1.0/tests/test_models.py new/social-auth-app-django-3.1.0/tests/test_models.py
--- old/social-auth-app-django-2.1.0/tests/test_models.py 2017-10-28 15:21:25.000000000 +0200
+++ new/social-auth-app-django-3.1.0/tests/test_models.py 2018-10-28 21:15:18.000000000 +0100
@@ -188,6 +188,7 @@
qs = Association.get(handle='a')
self.assertEqual(qs.count(), 1)
+ self.assertEqual(qs[0].secret, 'Yg==\n')
Association.remove(ids_to_delete=[qs.first().id])
self.assertEqual(Association.objects.count(), 0)
Hello community,
here is the log from the commit of package python-cassandra-driver for openSUSE:Factory checked in at 2018-12-03 10:12:30
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/python-cassandra-driver (Old)
and /work/SRC/openSUSE:Factory/.python-cassandra-driver.new.19453 (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Package is "python-cassandra-driver"
Mon Dec 3 10:12:30 2018 rev:6 rq:653425 version:3.16.0
Changes:
--------
--- /work/SRC/openSUSE:Factory/python-cassandra-driver/python-cassandra-driver.changes 2018-11-01 14:46:10.550866625 +0100
+++ /work/SRC/openSUSE:Factory/.python-cassandra-driver.new.19453/python-cassandra-driver.changes 2018-12-03 10:12:36.531585796 +0100
@@ -1,0 +2,22 @@
+Sat Dec 1 18:26:17 UTC 2018 - Arun Persaud <arun(a)gmx.de>
+
+- update to version 3.16.0:
+ * Bug Fixes
+ + Improve and fix socket error-catching code in nonblocking-socket
+ reactors (PYTHON-1024)
+ + Non-ASCII characters in schema break CQL string generation
+ (PYTHON-1008)
+ + Fix OSS driver's virtual table support against DSE 6.0.X and
+ future server releases (PYTHON-1020)
+ + ResultSet.one() fails if the row_factory is using a generator
+ (PYTHON-1026)
+ + Log profile name on attempt to create existing profile
+ (PYTHON-944)
+ + Cluster instantiation fails if any contact points' hostname
+ resolution fails (PYTHON-895)
+ * Other
+ + Fix tests when RF is not maintained if we decomission a node
+ (PYTHON-1017)
+ + Fix wrong use of ResultSet indexing (PYTHON-1015)
+
+-------------------------------------------------------------------
Old:
----
3.15.1.tar.gz
New:
----
3.16.0.tar.gz
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Other differences:
------------------
++++++ python-cassandra-driver.spec ++++++
--- /var/tmp/diff_new_pack.jgSJiV/_old 2018-12-03 10:12:37.175585199 +0100
+++ /var/tmp/diff_new_pack.jgSJiV/_new 2018-12-03 10:12:37.175585199 +0100
@@ -18,7 +18,7 @@
%{?!python_module:%define python_module() python-%{**} python3-%{**}}
Name: python-cassandra-driver
-Version: 3.15.1
+Version: 3.16.0
Release: 0
Summary: Python driver for Cassandra
License: Apache-2.0
++++++ 3.15.1.tar.gz -> 3.16.0.tar.gz ++++++
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/python-driver-3.15.1/CHANGELOG.rst new/python-driver-3.16.0/CHANGELOG.rst
--- old/python-driver-3.15.1/CHANGELOG.rst 2018-09-06 21:55:48.000000000 +0200
+++ new/python-driver-3.16.0/CHANGELOG.rst 2018-11-12 19:19:38.000000000 +0100
@@ -1,3 +1,21 @@
+3.16.0
+======
+November 12, 2018
+
+Bug Fixes
+---------
+* Improve and fix socket error-catching code in nonblocking-socket reactors (PYTHON-1024)
+* Non-ASCII characters in schema break CQL string generation (PYTHON-1008)
+* Fix OSS driver's virtual table support against DSE 6.0.X and future server releases (PYTHON-1020)
+* ResultSet.one() fails if the row_factory is using a generator (PYTHON-1026)
+* Log profile name on attempt to create existing profile (PYTHON-944)
+* Cluster instantiation fails if any contact points' hostname resolution fails (PYTHON-895)
+
+Other
+-----
+* Fix tests when RF is not maintained if we decomission a node (PYTHON-1017)
+* Fix wrong use of ResultSet indexing (PYTHON-1015)
+
3.15.1
======
September 6, 2018
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/python-driver-3.15.1/README-dev.rst new/python-driver-3.16.0/README-dev.rst
--- old/python-driver-3.15.1/README-dev.rst 2018-09-06 21:55:48.000000000 +0200
+++ new/python-driver-3.16.0/README-dev.rst 2018-11-12 19:19:38.000000000 +0100
@@ -79,35 +79,35 @@
=================
In order for the extensions to be built and used in the test, run::
- python setup.py nosetests
+ nosetests
You can run a specific test module or package like so::
- python setup.py nosetests -w tests/unit/
+ nosetests -w tests/unit/
You can run a specific test method like so::
- python setup.py nosetests -w tests/unit/test_connection.py:ConnectionTest.test_bad_protocol_version
+ nosetests -w tests/unit/test_connection.py:ConnectionTest.test_bad_protocol_version
Seeing Test Logs in Real Time
-----------------------------
Sometimes it's useful to output logs for the tests as they run::
- python setup.py nosetests -w tests/unit/ --nocapture --nologcapture
+ nosetests -w tests/unit/ --nocapture --nologcapture
Use tee to capture logs and see them on your terminal::
- python setup.py nosetests -w tests/unit/ --nocapture --nologcapture 2>&1 | tee test.log
+ nosetests -w tests/unit/ --nocapture --nologcapture 2>&1 | tee test.log
Specifying a Cassandra Version for Integration Tests
----------------------------------------------------
You can specify a cassandra version with the ``CASSANDRA_VERSION`` environment variable::
- CASSANDRA_VERSION=2.0.9 python setup.py nosetests -w tests/integration/standard
+ CASSANDRA_VERSION=2.0.9 nosetests -w tests/integration/standard
You can also specify a cassandra directory (to test unreleased versions)::
- CASSANDRA_DIR=/home/thobbs/cassandra python setup.py nosetests -w tests/integration/standard
+ CASSANDRA_DIR=/home/thobbs/cassandra nosetests -w tests/integration/standard
Specifying the usage of an already running Cassandra cluster
----------------------------------------------------
@@ -120,7 +120,7 @@
The protocol version defaults to 1 for cassandra 1.2 and 2 otherwise. You can explicitly set
it with the ``PROTOCOL_VERSION`` environment variable::
- PROTOCOL_VERSION=3 python setup.py nosetests -w tests/integration/standard
+ PROTOCOL_VERSION=3 nosetests -w tests/integration/standard
Testing Multiple Python Versions
--------------------------------
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/python-driver-3.15.1/build.yaml new/python-driver-3.16.0/build.yaml
--- old/python-driver-3.15.1/build.yaml 2018-09-06 21:55:48.000000000 +0200
+++ new/python-driver-3.16.0/build.yaml 2018-11-12 19:19:38.000000000 +0100
@@ -20,6 +20,17 @@
exclude:
- python: [3.4, 3.6]
+ commit_long_test:
+ schedule: per_commit
+ branches:
+ include: [/long-python.*/]
+ env_vars: |
+ EVENT_LOOP_MANAGER='libev'
+ matrix:
+ exclude:
+ - python: [3.4, 3.6]
+ - cassandra: ['2.0', '2.1', '3.0']
+
commit_branches:
schedule: per_commit
branches:
@@ -107,6 +118,7 @@
- '3.0'
- '3.11'
- 'test-dse'
+ - 'dse-6.7'
env:
CYTHON:
@@ -148,6 +160,13 @@
exit 0
fi
+ if [[ $CCM_IS_DSE == 'true' ]]; then
+ # We only use a DSE version for unreleased DSE versions, so we only need to run the smoke tests here
+ echo "CCM_IS_DSE: $CCM_IS_DSE"
+ echo "==========RUNNING SMOKE TESTS==========="
+ EVENT_LOOP_MANAGER=$EVENT_LOOP_MANAGER CCM_ARGS="$CCM_ARGS" CASSANDRA_VERSION=$CCM_CASSANDRA_VERSION DSE_VERSION='6.7.0' MAPPED_CASSANDRA_VERSION=$MAPPED_CASSANDRA_VERSION VERIFY_CYTHON=$FORCE_CYTHON nosetests -s -v --logging-format="[%(levelname)s] %(asctime)s %(thread)d: %(message)s" --with-ignore-docstrings --with-xunit --xunit-file=standard_results.xml tests/integration/standard/test_dse.py || true
+ exit 0
+ fi
# Run the unit tests, this is not done in travis because
# it takes too much time for the whole matrix to build with cython
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/python-driver-3.15.1/cassandra/__init__.py new/python-driver-3.16.0/cassandra/__init__.py
--- old/python-driver-3.15.1/cassandra/__init__.py 2018-09-06 21:55:48.000000000 +0200
+++ new/python-driver-3.16.0/cassandra/__init__.py 2018-11-12 19:19:38.000000000 +0100
@@ -22,7 +22,7 @@
logging.getLogger('cassandra').addHandler(NullHandler())
-__version_info__ = (3, 15, 1)
+__version_info__ = (3, 16, 0)
__version__ = '.'.join(map(str, __version_info__))
@@ -686,3 +686,13 @@
for more details.
"""
pass
+
+
+class UnresolvableContactPoints(DriverException):
+ """
+ The driver was unable to resolve any provided hostnames.
+
+ Note that this is *not* raised when a :class:`.Cluster` is created with no
+ contact points, only when lookup fails for all hosts
+ """
+ pass
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/python-driver-3.15.1/cassandra/cluster.py new/python-driver-3.16.0/cassandra/cluster.py
--- old/python-driver-3.15.1/cassandra/cluster.py 2018-09-06 21:55:48.000000000 +0200
+++ new/python-driver-3.16.0/cassandra/cluster.py 2018-11-12 19:19:38.000000000 +0100
@@ -43,7 +43,8 @@
from cassandra import (ConsistencyLevel, AuthenticationFailed,
OperationTimedOut, UnsupportedOperation,
- SchemaTargetType, DriverException, ProtocolVersion)
+ SchemaTargetType, DriverException, ProtocolVersion,
+ UnresolvableContactPoints)
from cassandra.connection import (ConnectionException, ConnectionShutdown,
ConnectionHeartbeat, ProtocolVersionUnsupported)
from cassandra.cqltypes import UserType
@@ -86,6 +87,7 @@
import gevent.socket
return socket.socket is gevent.socket.socket
+
# default to gevent when we are monkey patched with gevent, eventlet when
# monkey patched with eventlet, otherwise if libev is available, use that as
# the default because it's fastest. Otherwise, use asyncore.
@@ -181,6 +183,7 @@
for cluster in clusters:
cluster.shutdown()
+
atexit.register(_shutdown_clusters)
@@ -190,6 +193,35 @@
return DCAwareRoundRobinPolicy()
+def _addrinfo_or_none(contact_point, port):
+ """
+ A helper function that wraps socket.getaddrinfo and returns None
+ when it fails to, e.g. resolve one of the hostnames. Used to address
+ PYTHON-895.
+ """
+ try:
+ return socket.getaddrinfo(contact_point, port,
+ socket.AF_UNSPEC, socket.SOCK_STREAM)
+ except socket.gaierror:
+ log.debug('Could not resolve hostname "{}" '
+ 'with port {}'.format(contact_point, port))
+ return None
+
+
+def _resolve_contact_points(contact_points, port):
+ resolved = tuple(_addrinfo_or_none(p, port)
+ for p in contact_points)
+
+ if resolved and all((x is None for x in resolved)):
+ raise UnresolvableContactPoints(contact_points, port)
+
+ resolved = tuple(r for r in resolved if r is not None)
+
+ return [endpoint[4][0]
+ for addrinfo in resolved
+ for endpoint in addrinfo]
+
+
class ExecutionProfile(object):
load_balancing_policy = None
"""
@@ -822,8 +854,8 @@
self.port = port
- self.contact_points_resolved = [endpoint[4][0] for a in self.contact_points
- for endpoint in socket.getaddrinfo(a, self.port, socket.AF_UNSPEC, socket.SOCK_STREAM)]
+ self.contact_points_resolved = _resolve_contact_points(self.contact_points,
+ self.port)
self.compression = compression
@@ -1058,7 +1090,7 @@
if self._config_mode == _ConfigMode.LEGACY:
raise ValueError("Cannot add execution profiles when legacy parameters are set explicitly.")
if name in self.profile_manager.profiles:
- raise ValueError("Profile %s already exists")
+ raise ValueError("Profile {} already exists".format(name))
contact_points_but_no_lbp = (
self._contact_points_explicit and not
profile._load_balancing_policy_explicit)
@@ -1086,7 +1118,6 @@
if not_done:
raise OperationTimedOut("Failed to create all new connection pools in the %ss timeout.")
-
def get_min_requests_per_connection(self, host_distance):
return self._min_requests_per_connection[host_distance]
@@ -2029,7 +2060,6 @@
.. versionadded:: 3.8.0
"""
-
encoder = None
"""
A :class:`~cassandra.encoder.Encoder` instance that will be used when
@@ -2219,7 +2249,6 @@
load_balancing_policy = execution_profile.load_balancing_policy
spec_exec_policy = execution_profile.speculative_execution_policy
-
fetch_size = query.fetch_size
if fetch_size is FETCH_SIZE_UNSET and self._protocol_version >= 2:
fetch_size = self.default_fetch_size
@@ -4244,7 +4273,14 @@
you know a query returns a single row. Consider using an iterator if the
ResultSet contains more than one row.
"""
- return self._current_rows[0] if self._current_rows else None
+ row = None
+ if self._current_rows:
+ try:
+ row = self._current_rows[0]
+ except TypeError: # generator object is not subscriptable, PYTHON-1026
+ row = next(iter(self._current_rows))
+
+ return row
def __iter__(self):
if self._list_mode:
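The new contact-point resolution above (PYTHON-895) tolerates individual DNS failures and raises only when no contact point resolves at all. A minimal standalone sketch of that behavior follows; the function names and the injectable `resolver` parameter are illustrative conveniences, not the driver's actual API:

```python
import socket


class UnresolvableContactPoints(Exception):
    """Illustrative stand-in for the driver's exception of the same name."""


def resolve_contact_points(contact_points, port, resolver=socket.getaddrinfo):
    """Resolve each contact point, skipping ones that fail; raise only if
    every single one is unresolvable (mirrors the PYTHON-895 behavior)."""
    def addrinfo_or_none(host):
        try:
            return resolver(host, port, socket.AF_UNSPEC, socket.SOCK_STREAM)
        except socket.gaierror:
            return None

    resolved = [addrinfo_or_none(cp) for cp in contact_points]
    if resolved and all(r is None for r in resolved):
        raise UnresolvableContactPoints(contact_points, port)
    # keep the address portion of each (family, type, proto, canonname, sockaddr)
    return [sockaddr[0]
            for addrinfo in resolved if addrinfo is not None
            for (_, _, _, _, sockaddr) in addrinfo]
```

Injecting the resolver keeps the sketch testable without touching real DNS, which is also why the simulacron tests further down can exercise both the one-bad-hostname and all-bad-hostnames paths.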
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/python-driver-3.15.1/cassandra/cqlengine/query.py new/python-driver-3.16.0/cassandra/cqlengine/query.py
--- old/python-driver-3.15.1/cassandra/cqlengine/query.py 2018-09-06 21:55:48.000000000 +0200
+++ new/python-driver-3.16.0/cassandra/cqlengine/query.py 2018-11-12 19:19:38.000000000 +0100
@@ -76,7 +76,7 @@
except Exception:
applied = True # result was not LWT form
if not applied:
- raise LWTException(result[0])
+ raise LWTException(result.one())
class AbstractQueryableColumn(UnicodeMixin):
@@ -841,7 +841,7 @@
query = self._select_query()
query.count = True
result = self._execute(query)
- count_row = result[0].popitem()
+ count_row = result.one().popitem()
self._count = count_row[1]
return self._count
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/python-driver-3.15.1/cassandra/encoder.py new/python-driver-3.16.0/cassandra/encoder.py
--- old/python-driver-3.15.1/cassandra/encoder.py 2018-09-06 21:55:48.000000000 +0200
+++ new/python-driver-3.16.0/cassandra/encoder.py 2018-11-12 19:19:38.000000000 +0100
@@ -224,12 +224,15 @@
"""
return '{%s}' % ', '.join(self.mapping.get(type(v), self.cql_encode_object)(v) for v in val)
- def cql_encode_all_types(self, val):
+ def cql_encode_all_types(self, val, as_text_type=False):
"""
Converts any type into a CQL string, defaulting to ``cql_encode_object``
if :attr:`~Encoder.mapping` does not contain an entry for the type.
"""
- return self.mapping.get(type(val), self.cql_encode_object)(val)
+ encoded = self.mapping.get(type(val), self.cql_encode_object)(val)
+ if as_text_type and not isinstance(encoded, six.text_type):
+ return encoded.decode('utf-8')
+ return encoded
if six.PY3:
def cql_encode_ipaddress(self, val):
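The `as_text_type` flag added above guarantees the encoder hands back text even when a mapping function produced bytes, which is what the PYTHON-1008 index-options fix in metadata.py relies on. The normalization in isolation (the function name is illustrative, not part of the driver):

```python
def ensure_text(encoded):
    """Return `encoded` as text, decoding UTF-8 bytes if necessary
    (mirrors the `as_text_type=True` path of cql_encode_all_types)."""
    if isinstance(encoded, bytes):
        return encoded.decode('utf-8')
    return encoded
```

This keeps `str + encoded` concatenations (like building the `WITH OPTIONS = ...` clause) from failing on Python 3 when an encoder returns bytes.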
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/python-driver-3.15.1/cassandra/io/asyncorereactor.py new/python-driver-3.16.0/cassandra/io/asyncorereactor.py
--- old/python-driver-3.15.1/cassandra/io/asyncorereactor.py 2018-09-06 21:55:48.000000000 +0200
+++ new/python-driver-3.16.0/cassandra/io/asyncorereactor.py 2018-11-12 19:19:38.000000000 +0100
@@ -424,10 +424,14 @@
break
except socket.error as err:
if ssl and isinstance(err, ssl.SSLError):
- if err.args[0] not in (ssl.SSL_ERROR_WANT_READ, ssl.SSL_ERROR_WANT_WRITE):
+ if err.args[0] in (ssl.SSL_ERROR_WANT_READ, ssl.SSL_ERROR_WANT_WRITE):
+ return
+ else:
self.defunct(err)
return
- elif err.args[0] not in NONBLOCKING:
+ elif err.args[0] in NONBLOCKING:
+ return
+ else:
self.defunct(err)
return
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/python-driver-3.15.1/cassandra/io/libevreactor.py new/python-driver-3.16.0/cassandra/io/libevreactor.py
--- old/python-driver-3.15.1/cassandra/io/libevreactor.py 2018-09-06 21:55:48.000000000 +0200
+++ new/python-driver-3.16.0/cassandra/io/libevreactor.py 2018-11-12 19:19:38.000000000 +0100
@@ -344,10 +344,14 @@
break
except socket.error as err:
if ssl and isinstance(err, ssl.SSLError):
- if err.args[0] not in (ssl.SSL_ERROR_WANT_READ, ssl.SSL_ERROR_WANT_WRITE):
+ if err.args[0] in (ssl.SSL_ERROR_WANT_READ, ssl.SSL_ERROR_WANT_WRITE):
+ return
+ else:
self.defunct(err)
return
- elif err.args[0] not in NONBLOCKING:
+ elif err.args[0] in NONBLOCKING:
+ return
+ else:
self.defunct(err)
return
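Both reactor hunks above (asyncorereactor.py and libevreactor.py) invert the same control flow: retryable conditions (SSL want-read/want-write, EAGAIN-style nonblocking errors) simply return and let the event loop try again, while anything else marks the connection defunct. The decision can be sketched in isolation; the names here are illustrative:

```python
import errno
import ssl

# error codes meaning "try again later" rather than "connection is broken"
NONBLOCKING = frozenset({errno.EAGAIN, errno.EWOULDBLOCK})
SSL_RETRYABLE = frozenset({ssl.SSL_ERROR_WANT_READ, ssl.SSL_ERROR_WANT_WRITE})


def should_retry(err_code, is_ssl_error):
    """True if the socket error is transient and the read should be retried
    on the next reactor tick; False means the connection should be defuncted."""
    retryable = SSL_RETRYABLE if is_ssl_error else NONBLOCKING
    return err_code in retryable
```

Writing the retryable case first (as the patch now does) makes the early-return path explicit instead of burying it in a negated membership test.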
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/python-driver-3.15.1/cassandra/metadata.py new/python-driver-3.16.0/cassandra/metadata.py
--- old/python-driver-3.15.1/cassandra/metadata.py 2018-09-06 21:55:48.000000000 +0200
+++ new/python-driver-3.16.0/cassandra/metadata.py 2018-11-12 19:19:38.000000000 +0100
@@ -1435,7 +1435,9 @@
index_target,
class_name)
if options:
- ret += " WITH OPTIONS = %s" % Encoder().cql_encode_all_types(options)
+ # PYTHON-1008: `ret` will always be a unicode
+ opts_cql_encoded = _encoder.cql_encode_all_types(options, as_text_type=True)
+ ret += " WITH OPTIONS = %s" % opts_cql_encoded
return ret
def export_as_string(self):
@@ -1665,8 +1667,30 @@
self.connection = connection
self.timeout = timeout
- def _handle_results(self, success, result):
- if success:
+ def _handle_results(self, success, result, expected_failures=tuple()):
+ """
+ Given a bool and a ResultSet (the form returned per result from
+ Connection.wait_for_responses), return a dictionary containing the
+ results. Used to process results from asynchronous queries to system
+ tables.
+
+ ``expected_failures`` will usually be used to allow callers to ignore
+ ``InvalidRequest`` errors caused by a missing system keyspace. For
+ example, some DSE versions report a 4.X server version, but do not have
+ virtual tables. Thus, running against 4.X servers, SchemaParserV4 uses
+ expected_failures to make a best-effort attempt to read those
+ keyspaces, but treat them as empty if they're not found.
+
+ :param success: A boolean representing whether or not the query
+ succeeded
+ :param result: The resultset in question.
+ :param expected_failures: An Exception class or an iterable thereof. If the
+ query failed, but raised an instance of an expected failure class, this
+ will ignore the failure and return an empty list.
+ """
+ if not success and isinstance(result, expected_failures):
+ return []
+ elif success:
return dict_factory(*result.results) if result else []
else:
raise result
@@ -1782,11 +1806,9 @@
table_result = self._handle_results(cf_success, cf_result)
col_result = self._handle_results(col_success, col_result)
- # handle the triggers table not existing in Cassandra 1.2
- if not triggers_success and isinstance(triggers_result, InvalidRequest):
- triggers_result = []
- else:
- triggers_result = self._handle_results(triggers_success, triggers_result)
+ # the triggers table doesn't exist in C* 1.2
+ triggers_result = self._handle_results(triggers_success, triggers_result,
+ expected_failures=InvalidRequest)
if table_result:
return self._build_table_metadata(table_result[0], col_result, triggers_result)
@@ -2529,12 +2551,21 @@
self.indexes_result = self._handle_results(indexes_success, indexes_result)
self.views_result = self._handle_results(views_success, views_result)
# V4-only results
- self.virtual_keyspaces_result = self._handle_results(virtual_ks_success,
- virtual_ks_result)
- self.virtual_tables_result = self._handle_results(virtual_table_success,
- virtual_table_result)
- self.virtual_columns_result = self._handle_results(virtual_column_success,
- virtual_column_result)
+ # These tables don't exist in some DSE versions reporting 4.X so we can
+ # ignore them if we got an error
+ self.virtual_keyspaces_result = self._handle_results(
+ virtual_ks_success, virtual_ks_result,
+ expected_failures=InvalidRequest
+ )
+ self.virtual_tables_result = self._handle_results(
+ virtual_table_success, virtual_table_result,
+ expected_failures=InvalidRequest
+ )
+ self.virtual_columns_result = self._handle_results(
+ virtual_column_success, virtual_column_result,
+ expected_failures=InvalidRequest
+ )
+
self._aggregate_results()
def _aggregate_results(self):
@@ -2720,9 +2751,7 @@
def get_schema_parser(connection, server_version, timeout):
server_major_version = int(server_version.split('.')[0])
- # check for DSE version
- has_build_version = len(server_version.split('.')) > 3
- if server_major_version >= 4 and not has_build_version:
+ if server_major_version >= 4:
return SchemaParserV4(connection, timeout)
if server_major_version >= 3:
return SchemaParserV3(connection, timeout)
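The `expected_failures` parameter added to `_handle_results` above centralizes a pattern: a query against a possibly missing system table should be treated as empty rather than fatal when it fails with an anticipated error type. A self-contained sketch of that pattern, with simplified names and without the driver's `dict_factory` post-processing:

```python
def handle_results(success, result, expected_failures=()):
    """Return the results on success; return [] if the failure is one we
    anticipated (e.g. a missing system table); otherwise re-raise it."""
    if not success and isinstance(result, expected_failures):
        return []
    if success:
        return result if result else []
    raise result
```

The default `()` makes `isinstance(result, ())` always false, so callers that pass no expected failures get the old strict behavior unchanged.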
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/python-driver-3.15.1/docs.yaml new/python-driver-3.16.0/docs.yaml
--- old/python-driver-3.15.1/docs.yaml 2018-09-06 21:55:48.000000000 +0200
+++ new/python-driver-3.16.0/docs.yaml 2018-11-12 19:19:38.000000000 +0100
@@ -1,6 +1,7 @@
title: DataStax Python Driver for Apache Cassandra
summary: DataStax Python Driver for Apache Cassandra Documentation
output: docs/_build/
+swiftype_drivers: pythondrivers
checks:
external_links:
exclude:
@@ -21,6 +22,8 @@
# build extensions like libev
CASS_DRIVER_NO_CYTHON=1 python setup.py build_ext --inplace --force
versions:
+ - name: '3.16'
+ ref: '3.16.0'
- name: '3.15'
ref: '2ce0bd97'
- name: '3.14'
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/python-driver-3.15.1/tests/integration/long/test_loadbalancingpolicies.py new/python-driver-3.16.0/tests/integration/long/test_loadbalancingpolicies.py
--- old/python-driver-3.15.1/tests/integration/long/test_loadbalancingpolicies.py 2018-09-06 21:55:48.000000000 +0200
+++ new/python-driver-3.16.0/tests/integration/long/test_loadbalancingpolicies.py 2018-11-12 19:19:38.000000000 +0100
@@ -65,13 +65,20 @@
self.probe_session = self.probe_cluster.connect()
def _wait_for_nodes_up(self, nodes, cluster=None):
+ log.debug('entered: _wait_for_nodes_up(nodes={ns}, '
+ 'cluster={cs})'.format(ns=nodes,
+ cs=cluster))
if not cluster:
+ log.debug('connecting to cluster')
self._connect_probe_cluster()
cluster = self.probe_cluster
for n in nodes:
wait_for_up(cluster, n)
def _wait_for_nodes_down(self, nodes, cluster=None):
+ log.debug('entered: _wait_for_nodes_down(nodes={ns}, '
+ 'cluster={cs})'.format(ns=nodes,
+ cs=cluster))
if not cluster:
self._connect_probe_cluster()
cluster = self.probe_cluster
@@ -87,6 +94,11 @@
def _insert(self, session, keyspace, count=12,
consistency_level=ConsistencyLevel.ONE):
+ log.debug('entered _insert('
+ 'session={session}, keyspace={keyspace}, '
+ 'count={count}, consistency_level={consistency_level}'
+ ')'.format(session=session, keyspace=keyspace, count=count,
+ consistency_level=consistency_level))
session.execute('USE %s' % keyspace)
ss = SimpleStatement('INSERT INTO cf(k, i) VALUES (0, 0)', consistency_level=consistency_level)
@@ -94,6 +106,7 @@
while tries < 100:
try:
execute_concurrent_with_args(session, ss, [None] * count)
+ log.debug('Completed _insert on try #{}'.format(tries + 1))
return
except (OperationTimedOut, WriteTimeout, WriteFailure):
ex_type, ex, tb = sys.exc_info()
@@ -105,6 +118,13 @@
def _query(self, session, keyspace, count=12,
consistency_level=ConsistencyLevel.ONE, use_prepared=False):
+ log.debug('entered _query('
+ 'session={session}, keyspace={keyspace}, '
+ 'count={count}, consistency_level={consistency_level}, '
+ 'use_prepared={use_prepared}'
+ ')'.format(session=session, keyspace=keyspace, count=count,
+ consistency_level=consistency_level,
+ use_prepared=use_prepared))
if use_prepared:
query_string = 'SELECT * FROM %s.cf WHERE k = ?' % keyspace
if not self.prepared or self.prepared.query_string != query_string:
@@ -549,7 +569,7 @@
self._check_query_order_changes(session=session, keyspace=keyspace)
- #check TokenAwarePolicy still return the remaining replicas when one goes down
+ # check TokenAwarePolicy still return the remaining replicas when one goes down
self.coordinator_stats.reset_counts()
stop(2)
self._wait_for_nodes_down([2], cluster)
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/python-driver-3.15.1/tests/integration/long/utils.py new/python-driver-3.16.0/tests/integration/long/utils.py
--- old/python-driver-3.15.1/tests/integration/long/utils.py 2018-09-06 21:55:48.000000000 +0200
+++ new/python-driver-3.16.0/tests/integration/long/utils.py 2018-11-12 19:19:38.000000000 +0100
@@ -17,8 +17,9 @@
import time
from collections import defaultdict
-from ccmlib.node import Node
+from ccmlib.node import Node, ToolError
+from nose.tools import assert_in
from cassandra.query import named_tuple_factory
from cassandra.cluster import ConsistencyLevel
@@ -35,6 +36,7 @@
self.coordinator_counts = defaultdict(int)
def add_coordinator(self, future):
+ log.debug('adding coordinator from {}'.format(future))
future.result()
coordinator = future._current_host.address
self.coordinator_counts[coordinator] += 1
@@ -100,11 +102,24 @@
def decommission(node):
- get_node(node).decommission()
+ try:
+ get_node(node).decommission()
+ except ToolError as e:
+ expected_errs = (('Not enough live nodes to maintain replication '
+ 'factor in keyspace system_distributed'),
+ 'Perform a forceful decommission to ignore.')
+ for err in expected_errs:
+ assert_in(err, e.stdout)
+ # in this case, we're running against a C* version with CASSANDRA-12510
+ # applied and need to decommission with `--force`
+ get_node(node).decommission(force=True)
get_node(node).stop()
def bootstrap(node, data_center=None, token=None):
+ log.debug('called bootstrap('
+ 'node={node}, data_center={data_center}, '
+ 'token={token})')
node_instance = Node('node%s' % node,
get_cluster(),
auto_bootstrap=False,
@@ -118,12 +133,15 @@
try:
start(node)
- except:
+ except Exception as e0:
+ log.debug('failed 1st bootstrap attempt with: \n{}'.format(e0))
# Try only twice
try:
start(node)
- except:
+ except Exception as e1:
+ log.debug('failed 2nd bootstrap attempt with: \n{}'.format(e1))
log.error('Added node failed to start twice.')
+ raise e1
def ring(node):
@@ -140,7 +158,7 @@
log.debug("Done waiting for node %s to be up", node)
return
else:
- log.debug("Host is still marked down, waiting")
+ log.debug("Host {} is still marked down, waiting".format(addr))
tries += 1
time.sleep(1)
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/python-driver-3.15.1/tests/integration/simulacron/__init__.py new/python-driver-3.16.0/tests/integration/simulacron/__init__.py
--- old/python-driver-3.15.1/tests/integration/simulacron/__init__.py 2018-09-06 21:55:48.000000000 +0200
+++ new/python-driver-3.16.0/tests/integration/simulacron/__init__.py 2018-11-12 19:19:38.000000000 +0100
@@ -35,20 +35,24 @@
class SimulacronCluster(SimulacronBase):
+
+ cluster, connect = None, True
+
@classmethod
def setUpClass(cls):
if SIMULACRON_JAR is None or CASSANDRA_VERSION < Version("2.1"):
return
start_and_prime_singledc()
- cls.cluster = Cluster(protocol_version=PROTOCOL_VERSION, compression=False)
- cls.session = cls.cluster.connect(wait_for_all_pools=True)
+ if cls.connect:
+ cls.cluster = Cluster(protocol_version=PROTOCOL_VERSION, compression=False)
+ cls.session = cls.cluster.connect(wait_for_all_pools=True)
@classmethod
def tearDownClass(cls):
if SIMULACRON_JAR is None or CASSANDRA_VERSION < Version("2.1"):
return
- cls.cluster.shutdown()
+ if cls.cluster:
+ cls.cluster.shutdown()
stop_simulacron()
-
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/python-driver-3.15.1/tests/integration/simulacron/test_cluster.py new/python-driver-3.16.0/tests/integration/simulacron/test_cluster.py
--- old/python-driver-3.15.1/tests/integration/simulacron/test_cluster.py 2018-09-06 21:55:48.000000000 +0200
+++ new/python-driver-3.16.0/tests/integration/simulacron/test_cluster.py 2018-11-12 19:19:38.000000000 +0100
@@ -17,10 +17,13 @@
import unittest # noqa
from tests.integration.simulacron import SimulacronCluster
-from tests.integration import requiressimulacron
+from tests.integration import (requiressimulacron, PROTOCOL_VERSION)
from tests.integration.simulacron.utils import prime_query
-from cassandra import WriteTimeout, WriteType, ConsistencyLevel
+from cassandra import (WriteTimeout, WriteType,
+ ConsistencyLevel, UnresolvableContactPoints)
+from cassandra.cluster import Cluster
+
@requiressimulacron
class ClusterTests(SimulacronCluster):
@@ -53,3 +56,25 @@
self.assertIn(consistency, str(wt))
self.assertIn(str(received_responses), str(wt))
self.assertIn(str(required_responses), str(wt))
+
+
+@requiressimulacron
+class ClusterDNSResolutionTests(SimulacronCluster):
+
+ connect = False
+
+ def tearDown(self):
+ if self.cluster:
+ self.cluster.shutdown()
+
+ def test_connection_with_one_unresolvable_contact_point(self):
+ # shouldn't raise anything due to name resolution failures
+ self.cluster = Cluster(['127.0.0.1', 'dns.invalid'],
+ protocol_version=PROTOCOL_VERSION,
+ compression=False)
+
+ def test_connection_with_only_unresolvable_contact_points(self):
+ with self.assertRaises(UnresolvableContactPoints):
+ self.cluster = Cluster(['dns.invalid'],
+ protocol_version=PROTOCOL_VERSION,
+ compression=False)
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/python-driver-3.15.1/tests/integration/standard/test_dse.py new/python-driver-3.16.0/tests/integration/standard/test_dse.py
--- old/python-driver-3.15.1/tests/integration/standard/test_dse.py 2018-09-06 21:55:48.000000000 +0200
+++ new/python-driver-3.16.0/tests/integration/standard/test_dse.py 2018-11-12 19:19:38.000000000 +0100
@@ -18,6 +18,7 @@
from cassandra.cluster import Cluster
from tests import notwindows
+from tests.unit.cython.utils import notcython
from tests.integration import (execute_until_pass,
execute_with_long_wait_retry, use_cluster)
@@ -27,8 +28,12 @@
import unittest # noqa
+CCM_IS_DSE = (os.environ.get('CCM_IS_DSE', None) == 'true')
+
+
@unittest.skipIf(os.environ.get('CCM_ARGS', None), 'environment has custom CCM_ARGS; skipping')
@notwindows
+@notcython # no need to double up on this test; also __default__ setting doesn't work
class DseCCMClusterTest(unittest.TestCase):
"""
This class can be executed setting the DSE_VERSION variable, for example:
@@ -42,6 +47,10 @@
def test_dse_60(self):
self._test_basic(Version('6.0.2'))
+ @unittest.skipUnless(CCM_IS_DSE, 'DSE version unavailable')
+ def test_dse_67(self):
+ self._test_basic(Version('6.7.0'))
+
def _test_basic(self, dse_version):
"""
Test basic connection and usage
@@ -52,7 +61,8 @@
use_cluster(cluster_name=cluster_name, nodes=[3],
dse_cluster=True, dse_options={}, dse_version=dse_version)
- cluster = Cluster()
+ cluster = Cluster(
+ allow_beta_protocol_version=(dse_version >= Version('6.7.0')))
session = cluster.connect()
result = execute_until_pass(
session,
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/python-driver-3.15.1/tests/integration/standard/test_row_factories.py new/python-driver-3.16.0/tests/integration/standard/test_row_factories.py
--- old/python-driver-3.15.1/tests/integration/standard/test_row_factories.py 2018-09-06 21:55:48.000000000 +0200
+++ new/python-driver-3.16.0/tests/integration/standard/test_row_factories.py 2018-11-12 19:19:38.000000000 +0100
@@ -181,6 +181,32 @@
self.assertEqual(result[1]['k'], result[1]['v'])
self.assertEqual(result[1]['k'], 2)
+ def test_generator_row_factory(self):
+ """
+ Test that ResultSet.one() works with a row_factory that contains a generator.
+
+ @since 3.16
+ @jira_ticket PYTHON-1026
+ @expected_result one() returns the first row
+
+ @test_category queries
+ """
+ def generator_row_factory(column_names, rows):
+ return _gen_row_factory(rows)
+
+ def _gen_row_factory(rows):
+ for r in rows:
+ yield r
+
+ session = self.session
+ session.row_factory = generator_row_factory
+
+ session.execute(self.insert1)
+ result = session.execute(self.select)
+ self.assertIsInstance(result, ResultSet)
+ first_row = result.one()
+ self.assertEqual(first_row[0], first_row[1])
+
class NamedTupleFactoryAndNumericColNamesTests(unittest.TestCase):
"""
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/python-driver-3.15.1/tests/unit/io/utils.py new/python-driver-3.16.0/tests/unit/io/utils.py
--- old/python-driver-3.15.1/tests/unit/io/utils.py 2018-09-06 21:55:48.000000000 +0200
+++ new/python-driver-3.16.0/tests/unit/io/utils.py 2018-11-12 19:19:38.000000000 +0100
@@ -25,9 +25,11 @@
from mock import Mock
import errno
+import logging
import math
import os
from socket import error as socket_error
+import ssl
try:
import unittest2 as unittest
@@ -37,6 +39,9 @@
import time
+log = logging.getLogger(__name__)
+
+
class TimerCallback(object):
invoked = False
@@ -247,18 +252,31 @@
return c
def test_eagain_on_buffer_size(self):
+ self._check_error_recovery_on_buffer_size(errno.EAGAIN)
+
+ def test_ewouldblock_on_buffer_size(self):
+ self._check_error_recovery_on_buffer_size(errno.EWOULDBLOCK)
+
+ def test_sslwantread_on_buffer_size(self):
+ self._check_error_recovery_on_buffer_size(ssl.SSL_ERROR_WANT_READ)
+
+ def test_sslwantwrite_on_buffer_size(self):
+ self._check_error_recovery_on_buffer_size(ssl.SSL_ERROR_WANT_WRITE)
+
+ def _check_error_recovery_on_buffer_size(self, error_code):
c = self.test_successful_connection()
header = six.b('\x00\x00\x00\x00') + int32_pack(20000)
responses = [
header + (six.b('a') * (4096 - len(header))),
six.b('a') * 4096,
- socket_error(errno.EAGAIN),
+ socket_error(error_code),
six.b('a') * 100,
- socket_error(errno.EAGAIN)]
+ socket_error(error_code)]
def side_effect(*args):
response = responses.pop(0)
+ log.debug('about to mock return {}'.format(response))
if isinstance(response, socket_error):
raise response
else:
@@ -266,7 +284,6 @@
self.get_socket(c).recv.side_effect = side_effect
c.handle_read(*self.null_handle_function_args)
- self.assertEqual(c._current_frame.end_pos, 20000 + len(header))
# the EAGAIN prevents it from reading the last 100 bytes
c._iobuf.seek(0, os.SEEK_END)
pos = c._iobuf.tell()
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/python-driver-3.15.1/tests/unit/test_metadata.py new/python-driver-3.16.0/tests/unit/test_metadata.py
--- old/python-driver-3.15.1/tests/unit/test_metadata.py 2018-09-06 21:55:48.000000000 +0200
+++ new/python-driver-3.16.0/tests/unit/test_metadata.py 2018-11-12 19:19:38.000000000 +0100
@@ -18,6 +18,7 @@
import unittest # noqa
from binascii import unhexlify
+import logging
from mock import Mock
import os
import six
@@ -38,6 +39,9 @@
from cassandra.pool import Host
+log = logging.getLogger(__name__)
+
+
class StrategiesTest(unittest.TestCase):
@classmethod
@@ -536,9 +540,12 @@
def test_index(self):
im = IndexMetadata(self.name, self.name, self.name, kind='', index_options={'target': self.name})
- im.export_as_string()
+ log.debug(im.export_as_string())
im = IndexMetadata(self.name, self.name, self.name, kind='CUSTOM', index_options={'target': self.name, 'class_name': 'Class'})
- im.export_as_string()
+ log.debug(im.export_as_string())
+ # PYTHON-1008
+ im = IndexMetadata(self.name, self.name, self.name, kind='CUSTOM', index_options={'target': self.name, 'class_name': 'Class', 'delimiter': self.name})
+ log.debug(im.export_as_string())
def test_function(self):
fm = Function(self.name, self.name, (u'int', u'int'), (u'x', u'y'), u'int', u'language', self.name, False)
Hello community,
here is the log from the commit of package python-llvmlite for openSUSE:Factory checked in at 2018-12-03 10:12:23
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/python-llvmlite (Old)
and /work/SRC/openSUSE:Factory/.python-llvmlite.new.19453 (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Package is "python-llvmlite"
Mon Dec 3 10:12:23 2018 rev:9 rq:653424 version:0.26.0
Changes:
--------
--- /work/SRC/openSUSE:Factory/python-llvmlite/python-llvmlite.changes 2018-10-11 11:58:27.721793909 +0200
+++ /work/SRC/openSUSE:Factory/.python-llvmlite.new.19453/python-llvmlite.changes 2018-12-03 10:12:34.759587440 +0100
@@ -1,0 +2,20 @@
+Sat Dec 1 18:33:44 UTC 2018 - Arun Persaud <arun(a)gmx.de>
+
+- update to version 0.26.0:
+ * The primary new feature in this release is support for generation
+ of Intel JIT events, which makes profiling of JIT compiled code in
+ Intel VTune possible. This release also contains some minor build
+ improvements for ARMv7, and some small fixes.
+ * LLVM 7 support was originally slated for this release, but had to
+ be delayed after some issues arose in testing. LLVM 6 is still
+ required for llvmlite.
+ * PR #409: Use native cmake on armv7l
+ * PR #407: Throttle thread count for llvm build on armv7l.
+ * PR #403: Add shutdown detection to ObjectRef __del__ method.
+ * PR #400: conda recipe: add make as build dep
+ * PR #399: Add get_element_offset to TargetData
+ * PR #398: Fix gep method call on Constant objects
+ * PR #395: Fix typo in irbuilder documentation
+ * PR #394: Enable IntelJIT events for LLVM for VTune support
+
+-------------------------------------------------------------------
Old:
----
llvmlite-0.25.0.tar.gz
New:
----
llvmlite-0.26.0.tar.gz
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Other differences:
------------------
++++++ python-llvmlite.spec ++++++
--- /var/tmp/diff_new_pack.56h5WV/_old 2018-12-03 10:12:35.283586954 +0100
+++ /var/tmp/diff_new_pack.56h5WV/_new 2018-12-03 10:12:35.287586950 +0100
@@ -18,7 +18,7 @@
%{?!python_module:%define python_module() python-%{**} python3-%{**}}
Name: python-llvmlite
-Version: 0.25.0
+Version: 0.26.0
Release: 0
Summary: Lightweight wrapper around basic LLVM functionality
License: BSD-2-Clause
++++++ llvmlite-0.25.0.tar.gz -> llvmlite-0.26.0.tar.gz ++++++
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/llvmlite-0.25.0/PKG-INFO new/llvmlite-0.26.0/PKG-INFO
--- old/llvmlite-0.25.0/PKG-INFO 2018-09-21 21:32:23.000000000 +0200
+++ new/llvmlite-0.26.0/PKG-INFO 2018-11-28 15:59:12.000000000 +0100
@@ -1,6 +1,6 @@
Metadata-Version: 1.1
Name: llvmlite
-Version: 0.25.0
+Version: 0.26.0
Summary: lightweight wrapper around basic LLVM functionality
Home-page: http://llvmlite.pydata.org
Author: Continuum Analytics, Inc.
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/llvmlite-0.25.0/ffi/core.cpp new/llvmlite-0.26.0/ffi/core.cpp
--- old/llvmlite-0.25.0/ffi/core.cpp 2018-09-21 21:31:30.000000000 +0200
+++ new/llvmlite-0.26.0/ffi/core.cpp 2018-11-28 15:58:46.000000000 +0100
@@ -29,6 +29,16 @@
return LLVMGetGlobalContext();
}
+API_EXPORT(LLVMContextRef)
+LLVMPY_ContextCreate() {
+ return LLVMContextCreate();
+}
+
+API_EXPORT(void)
+LLVMPY_ContextDispose(LLVMContextRef context) {
+ return LLVMContextDispose(context);
+}
+
API_EXPORT(void)
LLVMPY_SetCommandLine(const char *name, const char *option)
{
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/llvmlite-0.25.0/ffi/core.h new/llvmlite-0.26.0/ffi/core.h
--- old/llvmlite-0.25.0/ffi/core.h 2018-09-21 21:31:30.000000000 +0200
+++ new/llvmlite-0.26.0/ffi/core.h 2018-11-28 15:58:46.000000000 +0100
@@ -30,6 +30,9 @@
API_EXPORT(LLVMContextRef)
LLVMPY_GetGlobalContext();
+API_EXPORT(LLVMContextRef)
+LLVMPY_ContextCreate();
+
} /* end extern "C" */
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/llvmlite-0.25.0/ffi/executionengine.cpp new/llvmlite-0.26.0/ffi/executionengine.cpp
--- old/llvmlite-0.25.0/ffi/executionengine.cpp 2018-09-21 21:31:30.000000000 +0200
+++ new/llvmlite-0.26.0/ffi/executionengine.cpp 2018-11-28 15:58:46.000000000 +0100
@@ -153,15 +153,24 @@
API_EXPORT(bool)
LLVMPY_EnableJITEvents(LLVMExecutionEngineRef EE)
{
+ llvm::JITEventListener *listener;
+ bool result = false;
+
#ifdef __linux__
- llvm::JITEventListener *listener = llvm::JITEventListener::createOProfileJITEventListener();
+ listener = llvm::JITEventListener::createOProfileJITEventListener();
// if listener is null, then LLVM was not compiled for OProfile JIT events.
if (listener) {
llvm::unwrap(EE)->RegisterJITEventListener(listener);
- return true;
+ result = true;
}
#endif
- return false;
+ listener = llvm::JITEventListener::createIntelJITEventListener();
+ // if listener is null, then LLVM was not compiled for Intel JIT events.
+ if (listener) {
+ llvm::unwrap(EE)->RegisterJITEventListener(listener);
+ result = true;
+ }
+ return result;
}
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/llvmlite-0.25.0/ffi/module.cpp new/llvmlite-0.26.0/ffi/module.cpp
--- old/llvmlite-0.25.0/ffi/module.cpp 2018-09-21 21:31:30.000000000 +0200
+++ new/llvmlite-0.26.0/ffi/module.cpp 2018-11-28 15:58:46.000000000 +0100
@@ -3,6 +3,7 @@
#include "llvm-c/Core.h"
#include "llvm-c/Analysis.h"
#include "llvm/IR/Module.h"
+#include "llvm/IR/TypeFinder.h"
#include "core.h"
@@ -35,6 +36,26 @@
struct OpaqueFunctionsIterator;
typedef OpaqueFunctionsIterator* LLVMFunctionsIteratorRef;
+/* module types iterator */
+class TypesIterator {
+private:
+ llvm::TypeFinder finder;
+ using const_iterator = llvm::TypeFinder::const_iterator;
+ const_iterator cur;
+public:
+ TypesIterator(llvm::Module& m, bool namedOnly) : finder(llvm::TypeFinder()) {
+ finder.run(m, namedOnly);
+ cur = finder.begin();
+ }
+ const llvm::Type* next() {
+ if (cur != finder.end()) {
+ return *cur++;
+ }
+ return nullptr;
+ }
+};
+
+typedef TypesIterator* LLVMTypesIteratorRef;
//
// Local helper functions
@@ -62,6 +83,16 @@
return reinterpret_cast<FunctionsIterator *>(GI);
}
+static LLVMTypesIteratorRef
+wrap(TypesIterator* TyI) {
+ return reinterpret_cast<LLVMTypesIteratorRef>(TyI);
+}
+
+static TypesIterator*
+unwrap(LLVMTypesIteratorRef TyI) {
+ return reinterpret_cast<TypesIterator*>(TyI);
+}
+
} // end namespace llvm
@@ -119,6 +150,13 @@
return wrap(unwrap(M)->getGlobalVariable(Name));
}
+API_EXPORT(LLVMTypeRef)
+LLVMPY_GetNamedStructType(LLVMModuleRef M,
+ const char *Name)
+{
+ return LLVMGetTypeByName(M, Name);
+}
+
API_EXPORT(int)
LLVMPY_VerifyModule(LLVMModuleRef M, char **OutMsg)
@@ -177,6 +215,14 @@
mod->end()));
}
+API_EXPORT(LLVMTypesIteratorRef)
+LLVMPY_ModuleTypesIter(LLVMModuleRef M)
+{
+ llvm::Module* mod = llvm::unwrap(M);
+ auto* iter = new TypesIterator(*mod, false);
+ return llvm::wrap(iter);
+}
+
/*
These functions return NULL if we are at the end
@@ -205,6 +251,12 @@
}
}
+API_EXPORT(LLVMTypeRef)
+LLVMPY_TypesIterNext(LLVMTypesIteratorRef TyI)
+{
+ return llvm::wrap(llvm::unwrap(TyI)->next());
+}
+
API_EXPORT(void)
LLVMPY_DisposeGlobalsIter(LLVMGlobalsIteratorRef GI)
{
@@ -217,6 +269,14 @@
delete llvm::unwrap(GI);
}
+API_EXPORT(void)
+LLVMPY_DisposeTypesIter(LLVMTypesIteratorRef TyI)
+{
+ delete llvm::unwrap(TyI);
+}
+
+
+
API_EXPORT(LLVMModuleRef)
LLVMPY_CloneModule(LLVMModuleRef M)
{
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/llvmlite-0.25.0/ffi/targets.cpp new/llvmlite-0.26.0/ffi/targets.cpp
--- old/llvmlite-0.25.0/ffi/targets.cpp 2018-09-21 21:31:30.000000000 +0200
+++ new/llvmlite-0.26.0/ffi/targets.cpp 2018-11-28 15:58:46.000000000 +0100
@@ -119,6 +119,15 @@
}
API_EXPORT(long long)
+LLVMPY_OffsetOfElement(LLVMTargetDataRef TD, LLVMTypeRef Ty, int Element)
+{
+ llvm::Type *tp = llvm::unwrap(Ty);
+ if (!tp->isStructTy())
+ return -1;
+ return (long long) LLVMOffsetOfElement(TD, Ty, Element);
+}
+
+API_EXPORT(long long)
LLVMPY_ABISizeOfElementType(LLVMTargetDataRef TD, LLVMTypeRef Ty)
{
llvm::Type *tp = llvm::unwrap(Ty);
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/llvmlite-0.25.0/llvmlite/_version.py new/llvmlite-0.26.0/llvmlite/_version.py
--- old/llvmlite-0.25.0/llvmlite/_version.py 2018-09-21 21:32:23.000000000 +0200
+++ new/llvmlite-0.26.0/llvmlite/_version.py 2018-11-28 15:59:12.000000000 +0100
@@ -4,8 +4,8 @@
# unpacked source archive. Distribution tarballs contain a pre-generated copy
# of this file.
-version_version = '0.25.0'
-version_full = '9af98a608a49278dbc4ce5dc743152f2341b6a87'
+version_version = '0.26.0'
+version_full = 'f63f7b0b67bfc159a9cc4e6a9728c67f8c690825'
def get_versions(default={}, verbose=False):
return {'version': version_version, 'full': version_full}
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/llvmlite-0.25.0/llvmlite/binding/__init__.py new/llvmlite-0.26.0/llvmlite/binding/__init__.py
--- old/llvmlite-0.25.0/llvmlite/binding/__init__.py 2018-09-21 21:31:30.000000000 +0200
+++ new/llvmlite-0.26.0/llvmlite/binding/__init__.py 2018-11-28 15:58:46.000000000 +0100
@@ -15,3 +15,4 @@
from .value import *
from .analysis import *
from .object_file import *
+from .context import *
\ No newline at end of file
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/llvmlite-0.25.0/llvmlite/binding/context.py new/llvmlite-0.26.0/llvmlite/binding/context.py
--- old/llvmlite-0.25.0/llvmlite/binding/context.py 1970-01-01 01:00:00.000000000 +0100
+++ new/llvmlite-0.26.0/llvmlite/binding/context.py 2018-11-28 15:58:46.000000000 +0100
@@ -0,0 +1,30 @@
+from __future__ import print_function, absolute_import
+
+from . import ffi
+
+
+def create_context():
+ return ContextRef(ffi.lib.LLVMPY_ContextCreate())
+
+
+def get_global_context():
+ return GlobalContextRef(ffi.lib.LLVMPY_GetGlobalContext())
+
+
+class ContextRef(ffi.ObjectRef):
+ def __init__(self, context_ptr):
+ super(ContextRef, self).__init__(context_ptr)
+
+ def _dispose(self):
+ ffi.lib.LLVMPY_ContextDispose(self)
+
+class GlobalContextRef(ContextRef):
+ def _dispose(self):
+ pass
+
+
+ffi.lib.LLVMPY_GetGlobalContext.restype = ffi.LLVMContextRef
+
+ffi.lib.LLVMPY_ContextCreate.restype = ffi.LLVMContextRef
+
+ffi.lib.LLVMPY_ContextDispose.argtypes = [ffi.LLVMContextRef]
\ No newline at end of file
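The ContextRef/GlobalContextRef split in the new context.py implements a common ownership pattern: a context you created is disposed when closed, while the process-wide global context must never be freed. A minimal pure-Python sketch of that pattern (class and attribute names are illustrative, not llvmlite's actual ffi internals):

```python
class ObjectRef:
    """Minimal stand-in for llvmlite's ffi.ObjectRef wrapper."""
    def __init__(self, ptr):
        self._ptr = ptr
        self.disposed = False

    def close(self):
        if self._ptr is not None:
            self._dispose()
            self._ptr = None

    def _dispose(self):
        raise NotImplementedError


class ContextRef(ObjectRef):
    def _dispose(self):
        # llvmlite calls ffi.lib.LLVMPY_ContextDispose(self) here.
        self.disposed = True


class GlobalContextRef(ContextRef):
    def _dispose(self):
        # The global context belongs to LLVM itself; freeing it would
        # invalidate every module parsed into it, so disposal is a no-op.
        pass
```

Closing an owned context releases it, while closing the global wrapper does nothing, which is exactly why the subclass overrides `_dispose` with `pass`.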
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/llvmlite-0.25.0/llvmlite/binding/ffi.py new/llvmlite-0.26.0/llvmlite/binding/ffi.py
--- old/llvmlite-0.25.0/llvmlite/binding/ffi.py 2018-09-21 21:31:30.000000000 +0200
+++ new/llvmlite-0.26.0/llvmlite/binding/ffi.py 2018-11-28 15:58:46.000000000 +0100
@@ -26,6 +26,7 @@
LLVMMemoryBufferRef = _make_opaque_ref("LLVMMemoryBuffer")
LLVMGlobalsIterator = _make_opaque_ref("LLVMGlobalsIterator")
LLVMFunctionsIterator = _make_opaque_ref("LLVMFunctionsIterator")
+LLVMTypesIterator = _make_opaque_ref("LLVMTypesIterator")
LLVMObjectCacheRef = _make_opaque_ref("LLVMObjectCache")
LLVMObjectFileRef = _make_opaque_ref("LLVMObjectFile")
LLVMSectionIteratorRef = _make_opaque_ref("LLVMSectionIterator")
@@ -164,7 +165,8 @@
# Avoid errors trying to rely on globals and modules at interpreter
# shutdown.
if not _is_shutting_down():
- self.close()
+ if self.close is not None:
+ self.close()
def __str__(self):
if self._ptr is None:
@@ -238,13 +240,20 @@
def __exit__(self, exc_type, exc_val, exc_tb):
self.close()
- def __del__(self):
- if self.close is not None:
- self.close()
+ def __del__(self, _is_shutting_down=_is_shutting_down):
+ if not _is_shutting_down():
+ if self.close is not None:
+ self.close()
def __bool__(self):
return bool(self._ptr)
+ def __eq__(self, other):
+ if not hasattr(other, "_ptr"):
+ return False
+ return ctypes.addressof(self._ptr[0]) == \
+ ctypes.addressof(other._ptr[0])
+
__nonzero__ = __bool__
# XXX useful?
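The new `__eq__` in ffi.py compares the addresses the two wrappers point at, not the wrapper objects themselves, which is what lets two `get_global_context()` calls compare equal in the test suite. A self-contained ctypes illustration of address-based identity (the `OpaqueContext` struct is a made-up stand-in for the opaque refs produced by `_make_opaque_ref`):

```python
import ctypes

class OpaqueContext(ctypes.Structure):
    # Hypothetical opaque struct, standing in for _make_opaque_ref's output.
    _fields_ = [("_opaque", ctypes.c_byte)]

ctx = OpaqueContext()

# Two distinct pointer objects referring to the same underlying storage.
p1 = ctypes.pointer(ctx)
p2 = ctypes.pointer(ctx)
assert p1 is not p2
assert ctypes.addressof(p1[0]) == ctypes.addressof(p2[0])  # same referent

# Equal contents but separate storage: the addresses differ.
p3 = ctypes.pointer(OpaqueContext())
assert ctypes.addressof(p1[0]) != ctypes.addressof(p3[0])
```

Comparing `ctypes.addressof(self._ptr[0])` rather than the pointer objects sidesteps the fact that ctypes creates a fresh Python wrapper on every FFI call.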
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/llvmlite-0.25.0/llvmlite/binding/module.py new/llvmlite-0.26.0/llvmlite/binding/module.py
--- old/llvmlite-0.25.0/llvmlite/binding/module.py 2018-09-21 21:31:30.000000000 +0200
+++ new/llvmlite-0.26.0/llvmlite/binding/module.py 2018-11-28 15:58:46.000000000 +0100
@@ -5,36 +5,43 @@
from . import ffi
from .linker import link_modules
from .common import _decode_string, _encode_string
-from .value import ValueRef
+from .value import ValueRef, TypeRef
+from .context import get_global_context
-def parse_assembly(llvmir):
+def parse_assembly(llvmir, context=None):
"""
Create Module from a LLVM IR string
"""
- context = ffi.lib.LLVMPY_GetGlobalContext()
+ if context is None:
+ context = get_global_context()
llvmir = _encode_string(llvmir)
strbuf = c_char_p(llvmir)
with ffi.OutputString() as errmsg:
- mod = ModuleRef(ffi.lib.LLVMPY_ParseAssembly(context, strbuf, errmsg))
+ mod = ModuleRef(
+ ffi.lib.LLVMPY_ParseAssembly(context, strbuf, errmsg),
+ context)
if errmsg:
mod.close()
raise RuntimeError("LLVM IR parsing error\n{0}".format(errmsg))
return mod
-def parse_bitcode(bitcode):
+def parse_bitcode(bitcode, context=None):
"""
Create Module from a LLVM *bitcode* (a bytes object).
"""
- context = ffi.lib.LLVMPY_GetGlobalContext()
+ if context is None:
+ context = get_global_context()
buf = c_char_p(bitcode)
bufsize = len(bitcode)
with ffi.OutputString() as errmsg:
- mod = ModuleRef(ffi.lib.LLVMPY_ParseBitcode(context, buf, bufsize, errmsg))
+ mod = ModuleRef(ffi.lib.LLVMPY_ParseBitcode(
+ context, buf, bufsize, errmsg), context)
if errmsg:
mod.close()
- raise RuntimeError("LLVM bitcode parsing error\n{0}".format(errmsg))
+ raise RuntimeError(
+ "LLVM bitcode parsing error\n{0}".format(errmsg))
return mod
@@ -43,6 +50,10 @@
A reference to a LLVM module.
"""
+ def __init__(self, module_ptr, context):
+ super(ModuleRef, self).__init__(module_ptr)
+ self._context = context
+
def __str__(self):
with ffi.OutputString() as outstr:
ffi.lib.LLVMPY_PrintModuleToString(self, outstr)
@@ -86,6 +97,16 @@
raise NameError(name)
return ValueRef(p, module=self)
+ def get_struct_type(self, name):
+ """
+ Get a TypeRef pointing to a structure type named *name*.
+ NameError is raised if the struct type isn't found.
+ """
+ p = ffi.lib.LLVMPY_GetNamedStructType(self, _encode_string(name))
+ if not p:
+ raise NameError(name)
+ return TypeRef(p)
+
def verify(self):
"""
Verify the module IR's correctness. RuntimeError is raised on error.
@@ -168,8 +189,17 @@
it = ffi.lib.LLVMPY_ModuleFunctionsIter(self)
return _FunctionsIterator(it, module=self)
+ @property
+ def struct_types(self):
+ """
+ Return an iterator over the struct types defined in
+ the module. The iterator will yield a TypeRef.
+ """
+ it = ffi.lib.LLVMPY_ModuleTypesIter(self)
+ return _TypesIterator(it, module=self)
+
def clone(self):
- return ModuleRef(ffi.lib.LLVMPY_CloneModule(self))
+ return ModuleRef(ffi.lib.LLVMPY_CloneModule(self), self._context)
class _Iterator(ffi.ObjectRef):
@@ -210,6 +240,24 @@
return ffi.lib.LLVMPY_FunctionsIterNext(self)
+class _TypesIterator(_Iterator):
+
+ def _dispose(self):
+ self._capi.LLVMPY_DisposeTypesIter(self)
+
+ def __next__(self):
+ vp = self._next()
+ if vp:
+ return TypeRef(vp)
+ else:
+ raise StopIteration
+
+ def _next(self):
+ return ffi.lib.LLVMPY_TypesIterNext(self)
+
+ next = __next__
+
+
# =============================================================================
# Set function FFI
@@ -223,8 +271,6 @@
POINTER(c_char_p)]
ffi.lib.LLVMPY_ParseBitcode.restype = ffi.LLVMModuleRef
-ffi.lib.LLVMPY_GetGlobalContext.restype = ffi.LLVMContextRef
-
ffi.lib.LLVMPY_DisposeModule.argtypes = [ffi.LLVMModuleRef]
ffi.lib.LLVMPY_PrintModuleToString.argtypes = [ffi.LLVMModuleRef,
@@ -250,6 +296,9 @@
ffi.lib.LLVMPY_GetNamedGlobalVariable.argtypes = [ffi.LLVMModuleRef, c_char_p]
ffi.lib.LLVMPY_GetNamedGlobalVariable.restype = ffi.LLVMValueRef
+ffi.lib.LLVMPY_GetNamedStructType.argtypes = [ffi.LLVMModuleRef, c_char_p]
+ffi.lib.LLVMPY_GetNamedStructType.restype = ffi.LLVMTypeRef
+
ffi.lib.LLVMPY_ModuleGlobalsIter.argtypes = [ffi.LLVMModuleRef]
ffi.lib.LLVMPY_ModuleGlobalsIter.restype = ffi.LLVMGlobalsIterator
@@ -261,11 +310,19 @@
ffi.lib.LLVMPY_ModuleFunctionsIter.argtypes = [ffi.LLVMModuleRef]
ffi.lib.LLVMPY_ModuleFunctionsIter.restype = ffi.LLVMFunctionsIterator
+ffi.lib.LLVMPY_ModuleTypesIter.argtypes = [ffi.LLVMModuleRef]
+ffi.lib.LLVMPY_ModuleTypesIter.restype = ffi.LLVMTypesIterator
+
ffi.lib.LLVMPY_DisposeFunctionsIter.argtypes = [ffi.LLVMFunctionsIterator]
+ffi.lib.LLVMPY_DisposeTypesIter.argtypes = [ffi.LLVMTypesIterator]
+
ffi.lib.LLVMPY_FunctionsIterNext.argtypes = [ffi.LLVMFunctionsIterator]
ffi.lib.LLVMPY_FunctionsIterNext.restype = ffi.LLVMValueRef
+ffi.lib.LLVMPY_TypesIterNext.argtypes = [ffi.LLVMTypesIterator]
+ffi.lib.LLVMPY_TypesIterNext.restype = ffi.LLVMTypeRef
+
ffi.lib.LLVMPY_CloneModule.argtypes = [ffi.LLVMModuleRef]
ffi.lib.LLVMPY_CloneModule.restype = ffi.LLVMModuleRef
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/llvmlite-0.25.0/llvmlite/binding/targets.py new/llvmlite-0.26.0/llvmlite/binding/targets.py
--- old/llvmlite-0.25.0/llvmlite/binding/targets.py 2018-09-21 21:31:30.000000000 +0200
+++ new/llvmlite-0.26.0/llvmlite/binding/targets.py 2018-11-28 15:58:46.000000000 +0100
@@ -133,6 +133,18 @@
"""
return ffi.lib.LLVMPY_ABISizeOfType(self, ty)
+ def get_element_offset(self, ty, position):
+ """
+ Get byte offset of type's ty element at the given position
+ """
+
+ offset = ffi.lib.LLVMPY_OffsetOfElement(self, ty, position)
+ if offset == -1:
+ raise ValueError("Could not determine offset of {}th "
+ "element of the type '{}'. Is it a struct type?".format(
+ position, str(ty)))
+ return offset
+
def get_pointee_abi_size(self, ty):
"""
Get ABI size of pointee type of LLVM pointer type *ty*.
@@ -343,6 +355,11 @@
ffi.LLVMTypeRef]
ffi.lib.LLVMPY_ABISizeOfType.restype = c_longlong
+ffi.lib.LLVMPY_OffsetOfElement.argtypes = [ffi.LLVMTargetDataRef,
+ ffi.LLVMTypeRef,
+ c_int]
+ffi.lib.LLVMPY_OffsetOfElement.restype = c_longlong
+
ffi.lib.LLVMPY_ABISizeOfElementType.argtypes = [ffi.LLVMTargetDataRef,
ffi.LLVMTypeRef]
ffi.lib.LLVMPY_ABISizeOfElementType.restype = c_longlong
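`get_element_offset` delegates to LLVM's `LLVMOffsetOfElement`, which accounts for ABI alignment padding between struct elements. For a non-packed struct with naturally aligned elements, the computation can be sketched as follows (an approximation only: real layout comes from the target's DataLayout, and this ignores packed and over-aligned types):

```python
def element_offsets(sizes, aligns):
    """Byte offsets of struct elements, given each element's size and
    ABI alignment: each offset is rounded up to the element's alignment."""
    offsets = []
    offset = 0
    for size, align in zip(sizes, aligns):
        offset = -(-offset // align) * align  # round up to a multiple of align
        offsets.append(offset)
        offset += size
    return offsets

# %struct.glob_type = type { i64, [2 x i64] } on a typical 64-bit target:
# i64 is 8 bytes / 8-aligned, [2 x i64] is 16 bytes / 8-aligned.
print(element_offsets([8, 16], [8, 8]))  # -> [0, 8]
```

The `[0, 8]` result matches the expectations in `test_get_struct_element_offset` below; a struct like `{ i8, i32 }` would instead give `[0, 4]` because the i32 must be padded to 4-byte alignment.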
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/llvmlite-0.25.0/llvmlite/ir/values.py new/llvmlite-0.26.0/llvmlite/ir/values.py
--- old/llvmlite-0.25.0/llvmlite/ir/values.py 2018-09-21 21:31:30.000000000 +0200
+++ new/llvmlite-0.26.0/llvmlite/ir/values.py 2018-11-28 15:58:46.000000000 +0100
@@ -161,6 +161,12 @@
tys = [el.type for el in elems]
return cls(types.LiteralStructType(tys), elems)
+ @property
+ def addrspace(self):
+ if not isinstance(self.type, types.PointerType):
+ raise TypeError("Only pointer constants have address spaces")
+ return self.type.addrspace
+
def __eq__(self, other):
if isinstance(other, Constant):
return str(self) == str(other)
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/llvmlite-0.25.0/llvmlite/tests/test_binding.py new/llvmlite-0.26.0/llvmlite/tests/test_binding.py
--- old/llvmlite-0.25.0/llvmlite/tests/test_binding.py 2018-09-21 21:31:30.000000000 +0200
+++ new/llvmlite-0.26.0/llvmlite/tests/test_binding.py 2018-11-28 15:58:46.000000000 +0100
@@ -174,9 +174,9 @@
# This will probably put any existing garbage in gc.garbage again
del self.old_garbage
- def module(self, asm=asm_sum):
+ def module(self, asm=asm_sum, context=None):
asm = asm.format(triple=llvm.get_default_triple())
- mod = llvm.parse_assembly(asm)
+ mod = llvm.parse_assembly(asm, context)
return mod
def glob(self, name='glob', mod=None):
@@ -244,6 +244,11 @@
self.assertIn("parsing error", s)
self.assertIn("invalid operand type", s)
+ def test_global_context(self):
+ gcontext1 = llvm.context.get_global_context()
+ gcontext2 = llvm.context.get_global_context()
+ assert gcontext1 == gcontext2
+
def test_dylib_symbols(self):
llvm.add_symbol("__xyzzy", 1234)
llvm.add_symbol("__xyzzy", 5678)
@@ -418,6 +423,18 @@
del mod
str(fn.module)
+ def test_get_struct_type(self):
+ mod = self.module()
+ st_ty = mod.get_struct_type("struct.glob_type")
+ self.assertEquals(st_ty.name, "struct.glob_type")
+ # also match struct names of form "%struct.glob_type.{some_index}"
+ self.assertIsNotNone(re.match(
+ r'%struct\.glob_type(\.[\d]+)? = type { i64, \[2 x i64\] }',
+ str(st_ty)))
+
+ with self.assertRaises(NameError):
+ mod.get_struct_type("struct.doesnt_exist")
+
def test_get_global_variable(self):
mod = self.module()
gv = mod.get_global_variable("glob")
@@ -447,6 +464,19 @@
funcs = list(it)
self.assertEqual(len(funcs), 1)
self.assertEqual(funcs[0].name, "sum")
+
+ def test_structs(self):
+ mod = self.module()
+ it = mod.struct_types
+ del mod
+ structs = list(it)
+ self.assertEqual(len(structs), 1)
+ self.assertIsNotNone(re.match(r'struct\.glob_type(\.[\d]+)?',
+ structs[0].name))
+ self.assertIsNotNone(re.match(
+ r'%struct\.glob_type(\.[\d]+)? = type { i64, \[2 x i64\] }',
+ str(structs[0])))
+
def test_link_in(self):
dest = self.module()
src = self.module(asm_mul)
@@ -494,12 +524,15 @@
self.assertIn("Invalid bitcode signature", str(cm.exception))
def test_bitcode_roundtrip(self):
- bc = self.module(asm=asm_mul).as_bitcode()
- mod = llvm.parse_bitcode(bc)
+ # create a new context to avoid struct renaming
+ context1 = llvm.create_context()
+ bc = self.module(context=context1).as_bitcode()
+ context2 = llvm.create_context()
+ mod = llvm.parse_bitcode(bc, context2)
self.assertEqual(mod.as_bitcode(), bc)
- mod.get_function("mul")
- mod.get_global_variable("mul_glob")
+ mod.get_function("sum")
+ mod.get_global_variable("glob")
def test_cloning(self):
m = self.module()
@@ -806,7 +839,7 @@
self.assertEqual(tp.name, "")
st = mod.get_global_variable("glob_struct")
self.assertIsNotNone(re.match(r"struct\.glob_type(\.[\d]+)?",
- st.type.element_type.name))
+ st.type.element_type.name))
def test_type_printing_variable(self):
mod = self.module()
@@ -824,7 +857,7 @@
st = mod.get_global_variable("glob_struct")
self.assertTrue(st.type.is_pointer)
self.assertIsNotNone(re.match(r'%struct\.glob_type(\.[\d]+)?\*',
- str(st.type)))
+ str(st.type)))
self.assertIsNotNone(re.match(
r"%struct\.glob_type(\.[\d]+)? = type { i64, \[2 x i64\] }",
str(st.type.element_type)))
@@ -891,6 +924,26 @@
glob = self.glob()
self.assertEqual(td.get_abi_size(glob.type), 8)
+ def test_get_pointee_abi_size(self):
+ td = self.target_data()
+
+ glob = self.glob()
+ self.assertEqual(td.get_pointee_abi_size(glob.type), 4)
+
+ glob = self.glob("glob_struct")
+ self.assertEqual(td.get_pointee_abi_size(glob.type), 24)
+
+ def test_get_struct_element_offset(self):
+ td = self.target_data()
+ glob = self.glob("glob_struct")
+
+ with self.assertRaises(ValueError):
+ td.get_element_offset(glob.type, 0)
+
+ struct_type = glob.type.element_type
+ self.assertEqual(td.get_element_offset(struct_type, 0), 0)
+ self.assertEqual(td.get_element_offset(struct_type, 1), 8)
+
class TestTargetMachine(BaseTest):
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/llvmlite-0.25.0/llvmlite/tests/test_ir.py new/llvmlite-0.26.0/llvmlite/tests/test_ir.py
--- old/llvmlite-0.25.0/llvmlite/tests/test_ir.py 2018-09-21 21:31:30.000000000 +0200
+++ new/llvmlite-0.26.0/llvmlite/tests/test_ir.py 2018-11-28 15:58:46.000000000 +0100
@@ -1908,6 +1908,17 @@
'getelementptr ({float, i1}, {float, i1}* @"myconstant", i32 0, i32 1)')
self.assertEqual(c.type, ir.PointerType(int1))
+ const = ir.Constant(tp, None)
+ with self.assertRaises(TypeError):
+ c_wrong = const.gep([ir.Constant(int32, 0)])
+
+ const_ptr = ir.Constant(tp.as_pointer(), None)
+ c2 = const_ptr.gep([ir.Constant(int32, 0)])
+ self.assertEqual(str(c2),
+ 'getelementptr ({float, i1}, {float, i1}* null, i32 0)')
+ self.assertEqual(c.type, ir.PointerType(int1))
+
+
def test_gep_addrspace_globalvar(self):
m = self.module()
tp = ir.LiteralStructType((flt, int1))
Hello community,
here is the log from the commit of package python-numba for openSUSE:Factory checked in at 2018-12-03 10:12:16
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/python-numba (Old)
and /work/SRC/openSUSE:Factory/.python-numba.new.19453 (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Package is "python-numba"
Mon Dec 3 10:12:16 2018 rev:15 rq:653423 version:0.41.0
Changes:
--------
--- /work/SRC/openSUSE:Factory/python-numba/python-numba.changes 2018-10-29 14:25:31.190213783 +0100
+++ /work/SRC/openSUSE:Factory/.python-numba.new.19453/python-numba.changes 2018-12-03 10:12:32.615589428 +0100
@@ -1,0 +2,108 @@
+Sat Dec 1 18:34:28 UTC 2018 - Arun Persaud <arun(a)gmx.de>
+
+- update to version 0.41.0:
+ * major features:
+ + Diagnostics showing the optimizations done by
+ ParallelAccelerator
+ + Support for profiling Numba-compiled functions in Intel VTune
+ + Additional NumPy functions: partition, nancumsum, nancumprod,
+ ediff1d, cov, conj, conjugate, tri, tril, triu
+ + Initial support for Python 3 Unicode strings
+ * General Enhancements:
+ + PR #1968: armv7 support
+ + PR #2983: invert mapping b/w binop operators and the operator
+ module #2297
+ + PR #3160: First attempt at parallel diagnostics
+ + PR #3307: Adding NUMBA_ENABLE_PROFILING envvar, enabling jit
+ event
+ + PR #3320: Support for np.partition
+ + PR #3324: Support for np.nancumsum and np.nancumprod
+ + PR #3325: Add location information to exceptions.
+ + PR #3337: Support for np.ediff1d
+ + PR #3345: Support for np.cov
+ + PR #3348: Support user pipeline class in with lifting
+ + PR #3363: string support
+ + PR #3373: Improve error message for empty imprecise lists.
+ + PR #3375: Enable overload(operator.getitem)
+ + PR #3402: Support negative indexing in tuple.
+ + PR #3414: Refactor Const type
+ + PR #3416: Optimized usage of alloca out of the loop
+ + PR #3424: Updates for llvmlite 0.26
+ + PR #3462: Add support for `np.conj/np.conjugate`.
+ + PR #3480: np.tri, np.tril, np.triu - default optional args
+ + PR #3481: Permit dtype argument as sole kwarg in np.eye
+ * CUDA Enhancements:
+ + PR #3399: Add max_registers Option to cuda.jit
+ * Continuous Integration / Testing:
+ + PR #3303: CI with Azure Pipelines
+ + PR #3309: Workaround race condition with apt
+ + PR #3371: Fix issues with Azure Pipelines
+ + PR #3362: Fix #3360: `RuntimeWarning: 'numba.runtests' found in
+ sys.modules`
+ + PR #3374: Disable openmp in wheel building
+ + PR #3404: Azure Pipelines templates
+ + PR #3419: Fix cuda tests and error reporting in test discovery
+ + PR #3491: Prevent faulthandler installation on armv7l
+ + PR #3493: Fix CUDA test that used negative indexing behaviour
+ that's fixed.
+ + PR #3495: Start Flake8 checking of Numba source
+ * Fixes:
+ + PR #2950: Fix dispatcher to only consider contiguous-ness.
+ + PR #3124: Fix 3119, raise for 0d arrays in reductions
+ + PR #3228: Reduce redundant module linking
+ + PR #3329: Fix AOT on windows.
+ + PR #3335: Fix memory management of __cuda_array_interface__
+ views.
+ + PR #3340: Fix typo in error name.
+ + PR #3365: Fix the default unboxing logic
+ + PR #3367: Allow non-global reference to objmode()
+ context-manager
+ + PR #3381: Fix global reference in objmode for dynamically
+ created function
+ + PR #3382: CUDA_ERROR_MISALIGNED_ADDRESS Using Multiple Const
+ Arrays
+ + PR #3384: Correctly handle very old versions of colorama
+ + PR #3394: Add 32bit package guard for non-32bit installs
+ + PR #3397: Fix with-objmode warning
+ + PR #3403 Fix label offset in call inline after parfor pass
+ + PR #3429: Fixes raising of user defined exceptions for
+ exec(<string>).
+ + PR #3432: Fix error due to function naming in CI in py2.7
+ + PR #3444: Fixed TBB's single thread execution and test added for
+ #3440
+ + PR #3449: Allow matching non-array objects in find_callname()
+ + PR #3455: Change getiter and iternext to not be pure. Resolves
+ #3425
+ + PR #3467: Make ir.UndefinedType singleton class.
+ + PR #3478: Fix np.random.shuffle sideeffect
+ + PR #3487: Raise unsupported for kwargs given to `print()`
+ + PR #3488: Remove dead script.
+ + PR #3498: Fix stencil support for boolean as return type
+ + PR #3511: Fix handling make_function literals (regression of
+ #3414)
+ + PR #3514: Add missing unicode != unicode
+ + PR #3527: Fix complex math sqrt implementation for large -ve
+ values
+ + PR #3530: This adds arg an check for the pattern supplied to
+ Parfors.
+ + PR #3536: Sets list dtor linkage to `linkonce_odr` to fix
+ visibility in AOT
+ * Documentation Updates:
+ + PR #3316: Update 0.40 changelog with additional PRs
+ + PR #3318: Tweak spacing to avoid search box wrapping onto second
+ line
+ + PR #3321: Add note about memory leaks with exceptions to
+ docs. Fixes #3263
+ + PR #3322: Add FAQ on CUDA + fork issue. Fixes #3315.
+ + PR #3343: Update docs for argsort, kind kwarg partially
+ supported.
+ + PR #3357: Added mention of njit in 5minguide.rst
+ + PR #3434: Fix parallel reduction example in docs.
+ + PR #3452: Fix broken link and mark up problem.
+ + PR #3484: Size Numba logo in docs in em units. Fixes #3313
+ + PR #3502: just two typos
+ + PR #3506: Document string support
+ + PR #3513: Documentation for parallel diagnostics.
+ + PR #3526: Fix 5 min guide with respect to @njit decl
+
+-------------------------------------------------------------------
Old:
----
numba-0.40.1.tar.gz
New:
----
numba-0.41.0.tar.gz
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Other differences:
------------------
++++++ python-numba.spec ++++++
--- /var/tmp/diff_new_pack.PhlrWY/_old 2018-12-03 10:12:33.155588927 +0100
+++ /var/tmp/diff_new_pack.PhlrWY/_new 2018-12-03 10:12:33.155588927 +0100
@@ -18,7 +18,7 @@
%{?!python_module:%define python_module() python-%{**} python3-%{**}}
Name: python-numba
-Version: 0.40.1
+Version: 0.41.0
Release: 0
Summary: NumPy-aware optimizing compiler for Python using LLVM
License: BSD-2-Clause
++++++ numba-0.40.1.tar.gz -> numba-0.41.0.tar.gz ++++++
++++ 19335 lines of diff (skipped)
Hello community,
here is the log from the commit of package python-distributed for openSUSE:Factory checked in at 2018-12-03 10:12:06
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/python-distributed (Old)
and /work/SRC/openSUSE:Factory/.python-distributed.new.19453 (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Package is "python-distributed"
Mon Dec 3 10:12:06 2018 rev:12 rq:653422 version:1.25.0
Changes:
--------
--- /work/SRC/openSUSE:Factory/python-distributed/python-distributed.changes 2018-11-26 10:29:45.061067052 +0100
+++ /work/SRC/openSUSE:Factory/.python-distributed.new.19453/python-distributed.changes 2018-12-03 10:12:30.943590979 +0100
@@ -1,0 +2,13 @@
+Sat Dec 1 18:37:39 UTC 2018 - Arun Persaud <arun(a)gmx.de>
+
+- update to version 1.25.0:
+ * Fixed the 404 error on the Scheduler Dashboard homepage (GH#2361)
+ Michael Wheeler
+ * Consolidate two Worker classes into one (GH#2363) Matthew Rocklin
+ * Avoid warnings in pyarrow and msgpack (GH#2364) Matthew Rocklin
+ * Avoid race condition in Actor’s Future (GH#2374) Matthew Rocklin
+ * Support missing packages keyword in Client.get_versions (GH#2379)
+ Matthew Rocklin
+ * Fixup serializing masked arrays (GH#2373) Jim Crist
+
+-------------------------------------------------------------------
Old:
----
distributed-1.24.2.tar.gz
New:
----
distributed-1.25.0.tar.gz
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Other differences:
------------------
++++++ python-distributed.spec ++++++
--- /var/tmp/diff_new_pack.PqyDLy/_old 2018-12-03 10:12:31.335590615 +0100
+++ /var/tmp/diff_new_pack.PqyDLy/_new 2018-12-03 10:12:31.339590612 +0100
@@ -20,7 +20,7 @@
# Test requires network connection
%bcond_with test
Name: python-distributed
-Version: 1.24.2
+Version: 1.25.0
Release: 0
Summary: Library for distributed computing with Python
License: BSD-3-Clause
++++++ distributed-1.24.2.tar.gz -> distributed-1.25.0.tar.gz ++++++
++++ 2223 lines of diff (skipped)
Hello community,
here is the log from the commit of package python-bokeh for openSUSE:Factory checked in at 2018-12-03 10:11:59
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/python-bokeh (Old)
and /work/SRC/openSUSE:Factory/.python-bokeh.new.19453 (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Package is "python-bokeh"
Mon Dec 3 10:11:59 2018 rev:12 rq:653421 version:1.0.2
Changes:
--------
--- /work/SRC/openSUSE:Factory/python-bokeh/python-bokeh.changes 2018-11-05 22:51:42.228335610 +0100
+++ /work/SRC/openSUSE:Factory/.python-bokeh.new.19453/python-bokeh.changes 2018-12-03 10:12:29.539592280 +0100
@@ -1,0 +2,40 @@
+Sat Dec 1 18:42:28 UTC 2018 - Arun Persaud <arun(a)gmx.de>
+
+- update to version 1.0.2:
+ * bugfixes:
+ + #5721 [component: bokehjs] [widgets] Text_align attribute in
+ numberformatter not doing anything
+ + #8395 [component: bokehjs] Legend breaks plot when plotting
+ empty scatter glyph
+ + #8396 [component: docs] Fix small typo [ci skip]
+ + #8398 Fix typo and grammar mistakes
+ + #8409 [component: docs] Typo in documentation of
+ io.export.create_webdriver
+ + #8415 Make components() preserve the type of dict
+ + #8418 [component: bokehjs] [component: build] Make bokehjs build
+ under node 10.x
+ + #8425 [component: docs] Apache documentation typo
+ + #8428 [component: bokehjs] [component: docs] Can't get gridplot
+ to work in bokehjs
+ + #8451 [component: bokehjs] [component: build] Run `npm install`
+ when `node make *` on fresh install
+ + #8457 [component: bokehjs] Embeds with json_item missing
+ toolbar/interactivity
+ + #8459 [component: bokehjs] Hovertool does not display fields
+ within jupyterlab's dark theme
+ + #8460 [component: examples] Fix a typo
+ * features:
+ + #8399 [component: bokehjs] Omit colon in hover tooltips if first
+ tuple entry is empty
+ + #8411 [widgets] Feature request: add support for setting the
+ datatable row height
+ * tasks:
+ + #8393 [component: docs] "customjs for selections" example in
+ docs broken
+ + #8405 [component: tests] Fix failing codebase tests
+ + #8413 [component: bokehjs] [typescript] Upgrade to typescript
+ 3.1
+ + #8438 [component: bokehjs] [typescript] Clean up semicolons
+ after transition to typescript
+
+-------------------------------------------------------------------
Old:
----
bokeh-1.0.1.tar.gz
New:
----
bokeh-1.0.2.tar.gz
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Other differences:
------------------
++++++ python-bokeh.spec ++++++
--- /var/tmp/diff_new_pack.8EGZB5/_old 2018-12-03 10:12:30.707591198 +0100
+++ /var/tmp/diff_new_pack.8EGZB5/_new 2018-12-03 10:12:30.711591194 +0100
@@ -20,7 +20,7 @@
# Tests fail due to missing git data
%bcond_with tests
Name: python-bokeh
-Version: 1.0.1
+Version: 1.0.2
Release: 0
Summary: Statistical interactive HTML plots for Python
License: BSD-3-Clause
++++++ bokeh-1.0.1.tar.gz -> bokeh-1.0.2.tar.gz ++++++
/work/SRC/openSUSE:Factory/python-bokeh/bokeh-1.0.1.tar.gz /work/SRC/openSUSE:Factory/.python-bokeh.new.19453/bokeh-1.0.2.tar.gz differ: char 5, line 1
Hello community,
here is the log from the commit of package python-alembic for openSUSE:Factory checked in at 2018-12-03 10:11:50
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/python-alembic (Old)
and /work/SRC/openSUSE:Factory/.python-alembic.new.19453 (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Package is "python-alembic"
Mon Dec 3 10:11:50 2018 rev:38 rq:653420 version:1.0.5
Changes:
--------
--- /work/SRC/openSUSE:Factory/python-alembic/python-alembic.changes 2018-11-26 10:29:38.549074689 +0100
+++ /work/SRC/openSUSE:Factory/.python-alembic.new.19453/python-alembic.changes 2018-12-03 10:12:21.367599858 +0100
@@ -1,0 +2,21 @@
+Sat Dec 1 18:27:49 UTC 2018 - Arun Persaud <arun(a)gmx.de>
+
+- specfile:
+ * update url
+
+- update to version 1.0.5:
+ * bug
+ + [bug] [py3k] Resolved remaining Python 3 deprecation warnings,
+ covering the use of inspect.formatargspec() with a vendored
+ version copied from the Python standard library, importing
+ collections.abc above Python 3.3 when testing against abstract
+ base classes, fixed one occurrence of log.warn(), as well as a
+ few invalid escape sequences. References: #507
+
+- changes from version 1.0.4:
+ * [change] Code hosting has been moved to GitHub, at
+ https://github.com/sqlalchemy/alembic. Additionally, the main
+ Alembic website documentation URL is now
+ https://alembic.sqlalchemy.org.
+
+-------------------------------------------------------------------
Old:
----
alembic-1.0.3.tar.gz
New:
----
alembic-1.0.5.tar.gz
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Other differences:
------------------
++++++ python-alembic.spec ++++++
--- /var/tmp/diff_new_pack.KszyxR/_old 2018-12-03 10:12:22.447598856 +0100
+++ /var/tmp/diff_new_pack.KszyxR/_new 2018-12-03 10:12:22.447598856 +0100
@@ -18,12 +18,12 @@
%{?!python_module:%define python_module() python-%{**} python3-%{**}}
Name: python-alembic
-Version: 1.0.3
+Version: 1.0.5
Release: 0
Summary: A database migration tool for SQLAlchemy
License: MIT
Group: Development/Languages/Python
-URL: http://bitbucket.org/zzzeek/alembic
+URL: https://github.com/sqlalchemy/alembic
Source0: https://files.pythonhosted.org/packages/source/a/alembic/alembic-%{version}…
Source1: python-alembic-rpmlintrc
# Test requirements:
++++++ alembic-1.0.3.tar.gz -> alembic-1.0.5.tar.gz ++++++
++++ 5032 lines of diff (skipped)
Hello community,
here is the log from the commit of package python-gTTS-token for openSUSE:Factory checked in at 2018-12-03 10:11:43
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/python-gTTS-token (Old)
and /work/SRC/openSUSE:Factory/.python-gTTS-token.new.19453 (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Package is "python-gTTS-token"
Mon Dec 3 10:11:43 2018 rev:3 rq:653417 version:1.1.3
Changes:
--------
--- /work/SRC/openSUSE:Factory/python-gTTS-token/python-gTTS-token.changes 2018-11-06 14:39:27.256590088 +0100
+++ /work/SRC/openSUSE:Factory/.python-gTTS-token.new.19453/python-gTTS-token.changes 2018-12-03 10:12:17.475603465 +0100
@@ -1,0 +2,6 @@
+Sat Dec 1 14:20:32 UTC 2018 - Adrian Schröter <adrian(a)suse.de>
+
+- update to version 1.1.3
+ * updating tkk extraction due to new Google Translate UI
+
+-------------------------------------------------------------------
Old:
----
gTTS-token-1.1.2.tar.gz
New:
----
gTTS-token-1.1.3.tar.gz
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Other differences:
------------------
++++++ python-gTTS-token.spec ++++++
--- /var/tmp/diff_new_pack.HLy7MF/_old 2018-12-03 10:12:18.671602357 +0100
+++ /var/tmp/diff_new_pack.HLy7MF/_new 2018-12-03 10:12:18.675602354 +0100
@@ -18,7 +18,7 @@
%{?!python_module:%define python_module() python-%{**} python3-%{**}}
Name: python-gTTS-token
-Version: 1.1.2
+Version: 1.1.3
Release: 0
Summary: Python module for calculating a token to run the Google text-to-speech engine
License: MIT
++++++ gTTS-token-1.1.2.tar.gz -> gTTS-token-1.1.3.tar.gz ++++++
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/gTTS-token-1.1.2/PKG-INFO new/gTTS-token-1.1.3/PKG-INFO
--- old/gTTS-token-1.1.2/PKG-INFO 2018-09-21 19:19:54.000000000 +0200
+++ new/gTTS-token-1.1.3/PKG-INFO 2018-11-29 17:53:48.000000000 +0100
@@ -1,6 +1,6 @@
Metadata-Version: 1.1
Name: gTTS-token
-Version: 1.1.2
+Version: 1.1.3
Summary: Calculates a token to run the Google Translate text to speech
Home-page: https://github.com/boudewijn26/gTTS-token
Author: Boudewijn van Groos
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/gTTS-token-1.1.2/gTTS_token.egg-info/PKG-INFO new/gTTS-token-1.1.3/gTTS_token.egg-info/PKG-INFO
--- old/gTTS-token-1.1.2/gTTS_token.egg-info/PKG-INFO 2018-09-21 19:19:54.000000000 +0200
+++ new/gTTS-token-1.1.3/gTTS_token.egg-info/PKG-INFO 2018-11-29 17:53:48.000000000 +0100
@@ -1,6 +1,6 @@
Metadata-Version: 1.1
Name: gTTS-token
-Version: 1.1.2
+Version: 1.1.3
Summary: Calculates a token to run the Google Translate text to speech
Home-page: https://github.com/boudewijn26/gTTS-token
Author: Boudewijn van Groos
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/gTTS-token-1.1.2/gtts_token/gtts_token.py new/gTTS-token-1.1.3/gtts_token/gtts_token.py
--- old/gTTS-token-1.1.2/gtts_token/gtts_token.py 2018-09-21 19:12:57.000000000 +0200
+++ new/gTTS-token-1.1.3/gtts_token/gtts_token.py 2018-11-29 17:49:00.000000000 +0100
@@ -53,9 +53,13 @@
return self.token_key
response = requests.get("https://translate.google.com/")
- line = response.text.split('\n')[-1]
- tkk_expr = re.search(".*?(TKK=.*?;)W.*?", line).group(1)
+ tkk_expr = re.search("(tkk:.*?),", response.text)
+ if not tkk_expr:
+ raise ValueError(
+ "Unable to find token seed! Did https://translate.google.com change?"
+ )
+ tkk_expr = tkk_expr.group(1)
try:
# Grab the token directly if already generated by function call
result = re.search("\d{6}\.[0-9]+", tkk_expr).group(0)
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/gTTS-token-1.1.2/gtts_token/version.py new/gTTS-token-1.1.3/gtts_token/version.py
--- old/gTTS-token-1.1.2/gtts_token/version.py 2018-09-21 19:17:55.000000000 +0200
+++ new/gTTS-token-1.1.3/gtts_token/version.py 2018-11-29 17:49:00.000000000 +0100
@@ -1 +1 @@
-__version__ = '1.1.2'
+__version__ = '1.1.3'
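The gtts_token.py hunk above replaces the old line-based `TKK=` search with a regex over the whole response body, raising an explicit error when the pattern is absent instead of failing with an `AttributeError`. A minimal standalone sketch of the new matching logic (the function name and the sample page fragment below are hypothetical illustrations, not the actual gTTS-token API or Google Translate markup):

```python
import re


def extract_tkk_seed(html):
    # Search the whole page body for the tkk expression, as 1.1.3 does,
    # rather than assuming it sits on the last line (the 1.1.2 approach).
    match = re.search(r"(tkk:.*?),", html)
    if not match:
        # Fail loudly, mirroring the ValueError added in the diff above.
        raise ValueError(
            "Unable to find token seed! Did https://translate.google.com change?"
        )
    return match.group(1)


# Hypothetical page fragment illustrating the pattern being matched.
sample = "var cfg = {tkk:'426151.3141811846',other:1};"
print(extract_tkk_seed(sample))  # tkk:'426151.3141811846'
```

The non-greedy `.*?` stops at the first comma after `tkk:`, so the seed survives even when the surrounding JavaScript object gains or reorders fields, which is why this is more robust against UI changes than anchoring on the last line of the response.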
here is the log from the commit of package gsequencer for openSUSE:Factory checked in at 2018-12-03 10:11:40
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/gsequencer (Old)
and /work/SRC/openSUSE:Factory/.gsequencer.new.19453 (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Package is "gsequencer"
Mon Dec 3 10:11:40 2018 rev:25 rq:653416 version:2.1.5
Changes:
--------
--- /work/SRC/openSUSE:Factory/gsequencer/gsequencer.changes 2018-10-29 14:59:02.734029827 +0100
+++ /work/SRC/openSUSE:Factory/.gsequencer.new.19453/gsequencer.changes 2018-12-03 10:12:06.451613683 +0100
@@ -1,0 +2,16 @@
+Sat Dec 1 22:09:37 UTC 2018 - Joël Krähemann <jkraehemann(a)gmail.com>
+
+- New upstream v2.1.5: minor fixes.
+
+-------------------------------------------------------------------
+Sat Dec 1 19:09:09 UTC 2018 - Joël Krähemann <jkraehemann(a)gmail.com>
+
+- New upstream v2.1.3: implemented OSC server and related content
+  format utilities.
+- Edited spec file to match the new server path.
+- Implemented configuration in place.
+- Increased minor version.
+- Removed gsequencer.1-fix-configure-ac.patch because it was applied
+  upstream.
+
+-------------------------------------------------------------------
Old:
----
gsequencer-2.0.37.tar.gz
gsequencer.1-fix-configure-ac.patch
New:
----
gsequencer-2.1.5.tar.gz
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Other differences:
------------------
++++++ gsequencer.spec ++++++
--- /var/tmp/diff_new_pack.riHXrE/_old 2018-12-03 10:12:07.451612757 +0100
+++ /var/tmp/diff_new_pack.riHXrE/_new 2018-12-03 10:12:07.451612757 +0100
@@ -18,16 +18,15 @@
%define libagssonumber 2
%define libgsequencersonumber 0
Name: gsequencer
-Version: 2.0.37
+Version: 2.1.5
Release: 0
Summary: Audio processing engine
License: GPL-3.0+ AND AGPL-3.0+ AND GFDL-1.3
Group: Productivity/Multimedia/Sound/Midi
Url: https://nongnu.org/gsequencer
-Source0: https://download.savannah.gnu.org/releases/gsequencer/2.0.x/%{name}-%{versi…
+Source0: https://download.savannah.gnu.org/releases/gsequencer/2.1.x/%{name}-%{versi…
# PATCH-FIX-OPENSUSE gsequencer.0-fix-makefile-am.patch -- fix opensuse specific locations
Patch0: gsequencer.0-fix-makefile-am.patch
-Patch1: gsequencer.1-fix-configure-ac.patch
BuildRequires: fluid-soundfont-gm
BuildRequires: hydrogen
BuildRequires: cunit-devel
@@ -67,7 +66,6 @@
%prep
%setup -q
%patch0
-%patch1
%build
autoreconf -fi
++++++ gsequencer-2.0.37.tar.gz -> gsequencer-2.1.5.tar.gz ++++++
++++ 42099 lines of diff (skipped)
++++++ gsequencer.0-fix-makefile-am.patch ++++++
--- /var/tmp/diff_new_pack.riHXrE/_old 2018-12-03 10:12:08.027612223 +0100
+++ /var/tmp/diff_new_pack.riHXrE/_new 2018-12-03 10:12:08.027612223 +0100
@@ -1,5 +1,5 @@
---- Makefile.am.orig 2018-10-27 16:08:41.000000000 +0200
-+++ Makefile.am 2018-10-27 21:41:22.922367876 +0200
+--- Makefile.am.orig 2018-11-30 22:06:05.000000000 +0100
++++ Makefile.am 2018-12-01 20:07:46.844538441 +0100
@@ -35,12 +35,12 @@
# this lists the binaries to produce, the (non-PHONY, binary) targets in
# the previous manual Makefile
@@ -16,7 +16,7 @@
EXTRA_DIST = config.rpath \
COPYING.server \
-@@ -114,7 +114,7 @@
+@@ -134,7 +134,7 @@
# pkg-config
pkgconfigdir = $(libdir)/pkgconfig
pkgconfig_DATA = libags.pc libags_audio.pc libags_gui.pc
@@ -25,7 +25,7 @@
EXTRA_DIST += libags.pc.in libags_audio.pc.in libags_gui.pc.in libgsequencer.pc.in
# EXTRA_DIST += libgsequencer.pc.in
-@@ -153,7 +153,7 @@
+@@ -173,7 +173,7 @@
# include
otherincludedir = $(includedir)/ags
nobase_include_HEADERS = $(libags_la_HEADERS_0) $(libags_thread_la_HEADERS_0) $(libags_server_la_HEADERS_0) $(libags_audio_la_HEADERS_0) $(libags_gui_la_HEADERS_0)
@@ -34,7 +34,7 @@
# doc
# docdir = $(datadir)/doc/gsequencer
-@@ -1959,8 +1959,8 @@
+@@ -2045,8 +2045,8 @@
html:
mkdir -p $(top_builddir)/html/
mkdir -p $(top_builddir)/html/{developer-docs,user-docs}
@@ -45,7 +45,7 @@
$(MAKE) -C $(top_srcdir)/docs/reference/libags
cd $(top_srcdir)
$(MAKE) -C $(top_srcdir)/docs/reference/libags-audio
-@@ -1993,18 +1993,18 @@
+@@ -2080,18 +2080,18 @@
gzip -9 -c $(top_srcdir)/ChangeLog > $(DESTDIR)/$(docdir)/changelog.gz
fix-local-html: html
@@ -76,7 +76,7 @@
fix-online-books-html: html
find $(top_srcdir)/html/ -name "*.html" -type f -exec sed -i 's/\/usr\/share\/icons\/Adwaita\/32x32\/actions/..\/images/g' {} \;
-@@ -2031,19 +2031,11 @@
+@@ -2118,19 +2118,11 @@
install -c -p -m 644 $(devdocimages) $(DESTDIR)/$(datadir)/doc/libags-audio-doc/images/
install -c -p -m 644 $(top_builddir)/html/user-docs/* $(DESTDIR)/$(docdir)/html/
install -c -p -m 644 $(top_builddir)/html/developer-docs/* $(DESTDIR)/$(datadir)/doc/libags-audio-doc/html/
here is the log from the commit of package ibus-pinyin for openSUSE:Factory checked in at 2018-12-03 10:11:36
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/ibus-pinyin (Old)
and /work/SRC/openSUSE:Factory/.ibus-pinyin.new.19453 (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Package is "ibus-pinyin"
Mon Dec 3 10:11:36 2018 rev:35 rq:653397 version:1.5.0
Changes:
--------
--- /work/SRC/openSUSE:Factory/ibus-pinyin/ibus-pinyin.changes 2018-11-26 10:28:56.941123493 +0100
+++ /work/SRC/openSUSE:Factory/.ibus-pinyin.new.19453/ibus-pinyin.changes 2018-12-03 10:12:04.943615081 +0100
@@ -1,0 +2,6 @@
+Fri Nov 30 13:22:56 UTC 2018 - qzhao(a)suse.com
+
+- Update ibus-pinyin.spec to adapt to Tumbleweed's update to
+  Python 3.
+
+-------------------------------------------------------------------
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Other differences:
------------------
++++++ ibus-pinyin.spec ++++++
--- /var/tmp/diff_new_pack.OcYD0i/_old 2018-12-03 10:12:05.647614428 +0100
+++ /var/tmp/diff_new_pack.OcYD0i/_new 2018-12-03 10:12:05.655614421 +0100
@@ -55,7 +55,7 @@
BuildRequires: pkgconfig(uuid)
Requires: ibus >= 1.4.99
Requires: opencc
-Requires: python-xdg
+Requires: python3-xdg
Requires: pyzy-db-android
Requires: pyzy-db-open-phrase