openSUSE Commits
September 2018
- 1 participant
- 1171 discussions
Hello community,
here is the log from the commit of package python-cmd2 for openSUSE:Factory checked in at 2018-09-27 09:50:24
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/python-cmd2 (Old)
and /work/SRC/openSUSE:Factory/.python-cmd2.new (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Package is "python-cmd2"
Thu Sep 27 09:50:24 2018 rev:19 rq: version:0.8.9
Changes:
--------
--- /work/SRC/openSUSE:Factory/python-cmd2/python-cmd2.changes 2018-09-26 16:12:19.399396344 +0200
+++ /work/SRC/openSUSE:Factory/.python-cmd2.new/python-cmd2.changes 2018-09-27 09:50:28.644465024 +0200
@@ -2,99 +1,0 @@
-Thu Sep 20 20:17:41 UTC 2018 - Todd R <toddrme2178(a)gmail.com>
-
-- Update to version 0.9.4
- + Bug Fixes
- * Fixed bug where ``preparse`` was not getting called
- * Fixed bug in parsing of multiline commands where matching quote is on another line
- + Enhancements
- * Improved implementation of lifecycle hooks to support a plugin
- framework, see ``docs/hooks.rst`` for details.
- * New dependency on ``attrs`` third party module
- * Added ``matches_sorted`` member to support custom sorting of tab-completion matches
- * Added [tab_autocomp_dynamic.py](https://github.com/python-cmd2/cmd2/blob/master/ex… example
- * Demonstrates updating the argparse object during init instead of during class construction
- + Deprecations
- * Deprecated the following hook methods, see ``hooks.rst`` for full details:
- * ``cmd2.Cmd.preparse()`` - equivalent functionality available
- via ``cmd2.Cmd.register_postparsing_hook()``
- * ``cmd2.Cmd.postparsing_precmd()`` - equivalent functionality available
- via ``cmd2.Cmd.register_postparsing_hook()``
- * ``cmd2.Cmd.postparsing_postcmd()`` - equivalent functionality available
- via ``cmd2.Cmd.register_postcmd_hook()``
-- Update to version 0.9.3
- + Bug Fixes
- * Fixed bug when StatementParser ``__init__()`` was called with ``terminators`` equal to ``None``
- * Fixed bug when ``Cmd.onecmd()`` was called with a raw ``str``
- + Enhancements
- * Added ``--clear`` flag to ``history`` command that clears both the command and readline history.
- + Deletions
- * The ``CmdResult`` helper class which was *deprecated* in the previous release has now been deleted
- * It has been replaced by the improved ``CommandResult`` class
-- Update to version 0.9.2
- + Bug Fixes
- * Fixed issue where piping and redirecting did not work correctly with paths that had spaces
- + Enhancements
- * Added ability to print a header above tab-completion suggestions using `completion_header` member
- * Added ``pager`` and ``pager_chop`` attributes to the ``cmd2.Cmd`` class
- * ``pager`` defaults to **less -RXF** on POSIX and **more** on Windows
- * ``pager_chop`` defaults to **less -SRXF** on POSIX and **more** on Windows
- * Added ``chop`` argument to ``cmd2.Cmd.ppaged()`` method for displaying output using a pager
- * If ``chop`` is ``False``, then ``self.pager`` is used as the pager
- * Otherwise ``self.pager_chop`` is used as the pager
- * Greatly improved the [table_display.py](https://github.com/python-cmd2/cmd2/blob/master/examples/… example
- * Now uses the new [tableformatter](https://github.com/python-tableformatter/tableformatter) module which looks better than ``tabulate``
- + Deprecations
- * The ``CmdResult`` helper class is *deprecated* and replaced by the improved ``CommandResult`` class
- * ``CommandResult`` has the following attributes: **stdout**, **stderr**, and **data**
- * ``CmdResult`` had attributes of: **out**, **err**, **war**
- * ``CmdResult`` will be deleted in the next release
-- Update to version 0.8.8
- + Bug Fixes
- * Prevent crashes that could occur attempting to open a file in non-existent directory or with very long filename
- + Enhancements
- * `display_matches` is no longer restricted to delimited strings
-- Update to version 0.9.1
- + Bug Fixes
-  * Fixed packaging error for 0.8.x versions (yes, we had to deploy a new version
-    of the 0.9.x series to fix a packaging error with the 0.8.x version)
-- Update to version 0.9.0
- + Bug Fixes
- * If self.default_to_shell is true, then redirection and piping are now properly passed to the shell. Previously it was truncated.
- * Submenus now call all hooks, it used to just call precmd and postcmd.
- + Enhancements
- * Automatic completion of ``argparse`` arguments via ``cmd2.argparse_completer.AutoCompleter``
- * See the [tab_autocompletion.py](https://github.com/python-cmd2/cmd2/blob/master/exam… example for a demonstration of how to use this feature
- * ``cmd2`` no longer depends on the ``six`` module
- * ``cmd2`` is now a multi-file Python package instead of a single-file module
- * New pyscript approach that provides a pythonic interface to commands in the cmd2 application.
- * Switch command parsing from pyparsing to custom code which utilizes shlex.
- * The object passed to do_* methods has changed. It no longer is the pyparsing object, it's a new Statement object, which is a subclass of ``str``. The statement object has many attributes which give you access to various components of the parsed input. If you were using anything but the string in your do_* methods, this change will require you to update your code.
-  * ``commentGrammars`` is no longer supported or available. Comments are C-style or Python-style.
- * Input redirection no longer supported. Use the load command instead.
-  * ``multilineCommand`` attribute is now ``multiline_command``
-  * ``identchars`` is now ignored. The standard library ``cmd`` uses those characters to split the first "word" of the input, but cmd2 hasn't used those for a while, and the new parsing logic parses on whitespace, which has the added benefit of full Unicode support, unlike cmd or prior versions of cmd2.
-  * ``set_posix_shlex`` function and ``POSIX_SHLEX`` variable have been removed. Parsing behavior is now always the more forgiving ``posix=False``.
- * ``set_strip_quotes`` function and ``STRIP_QUOTES_FOR_NON_POSIX`` have been removed. Quotes are stripped from arguments when presented as a list (a la ``sys.argv``), and present when arguments are presented as a string (like the string passed to do_*).
- + Changes
- * ``strip_ansi()`` and ``strip_quotes()`` functions have moved to new utils module
- * Several constants moved to new constants module
- * Submenu support has been moved to a new [cmd2-submenu](https://github.com/python-cmd2/cmd2-submenu) plugin. If you use submenus, you will need to update your dependencies and modify your imports.
- + Deletions (potentially breaking changes)
- * Deleted all ``optparse`` code which had previously been deprecated in release 0.8.0
- * The ``options`` decorator no longer exists
- * All ``cmd2`` code should be ported to use the new ``argparse``-based decorators
- * See the [Argument Processing](http://cmd2.readthedocs.io/en/latest/argument_processing.html) section of the documentation for more information on these decorators
- * Alternatively, see the [argparse_example.py](https://github.com/python-cmd2/cmd2/blob/master/exampl…
- * Deleted ``cmd_with_subs_completer``, ``get_subcommands``, and ``get_subcommand_completer``
- * Replaced by default AutoCompleter implementation for all commands using argparse
- * Deleted support for old method of calling application commands with ``cmd()`` and ``self``
- * ``cmd2.redirector`` is no longer supported. Output redirection can only be done with '>' or '>>'
- * Deleted ``postparse()`` hook since it was redundant with ``postparsing_precmd``
- + Python 2 no longer supported
- * ``cmd2`` now supports Python 3.4+
- + Known Issues
-  * Some developers have noted very slow performance when importing the ``cmd2`` module. The issue
-    is intermittent, and investigation of the root cause is ongoing.
-- Python 2 is no longer supported upstream,
- so don't build it.
-
--------------------------------------------------------------------
Old:
----
cmd2-0.9.4.tar.gz
New:
----
cmd2-0.8.9.tar.gz
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Other differences:
------------------
++++++ python-cmd2.spec ++++++
--- /var/tmp/diff_new_pack.OQs9WG/_old 2018-09-27 09:50:29.228464301 +0200
+++ /var/tmp/diff_new_pack.OQs9WG/_new 2018-09-27 09:50:29.228464301 +0200
@@ -17,32 +17,33 @@
%{?!python_module:%define python_module() python-%{**} python3-%{**}}
-%define skip_python2 1
Name: python-cmd2
-Version: 0.9.4
+Version: 0.8.9
Release: 0
Summary: Extra features for standard library's cmd module
License: MIT
Group: Development/Languages/Python
-Url: https://github.com/python-cmd2/cmd2
+Url: http://packages.python.org/cmd2/
Source: https://files.pythonhosted.org/packages/source/c/cmd2/cmd2-%{version}.tar.gz
-BuildRequires: %{python_module setuptools}
-BuildRequires: %{python_module setuptools_scm}
-BuildRequires: fdupes
-BuildRequires: python-rpm-macros
-# SECTION Test requirements
-BuildRequires: %{python_module attrs}
-BuildRequires: %{python_module colorama}
+BuildRequires: %{python_module contextlib2}
BuildRequires: %{python_module mock}
-BuildRequires: %{python_module pyperclip >= 1.5.27}
-BuildRequires: %{python_module pytest-mock}
-BuildRequires: %{python_module pytest}
+BuildRequires: %{python_module pyperclip}
+BuildRequires: %{python_module pytest-xdist}
+BuildRequires: %{python_module setuptools}
BuildRequires: %{python_module wcwidth}
-# /SECTION
-Requires: python-attrs
-Requires: python-colorama
-Requires: python-pyperclip >= 1.5.27
+BuildRequires: python-enum34
+BuildRequires: python-rpm-macros
+BuildRequires: python-subprocess32
+Requires: python-pyparsing >= 2.0.1
+Requires: python-pyperclip
+Requires: python-six
Requires: python-wcwidth
+%ifpython2
+Requires: python-contextlib2
+Requires: python-enum34
+Requires: python-subprocess32
+%endif
+BuildRoot: %{_tmppath}/%{name}-%{version}-build
BuildArch: noarch
%python_subpackages
@@ -68,24 +69,19 @@
%prep
%setup -q -n cmd2-%{version}
-# Fix non-executable-script
-sed -i -e '/^#!\//, 1d' cmd2/cmd2.py
-# Fix spurious-executable-perm
-chmod a-x README.md
%build
%python_build
%install
%python_install
-%python_expand %fdupes %{buildroot}%{$python_sitelib}
%check
%python_exec setup.py test
%files %{python_files}
%license LICENSE
-%doc CHANGELOG.md CODEOWNERS README.md
+%doc README.md
%{python_sitelib}/*
%changelog
++++++ cmd2-0.9.4.tar.gz -> cmd2-0.8.9.tar.gz ++++++
++++ 22261 lines of diff (skipped)
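For context on the deprecations listed in the changelog above: cmd2 0.9.x replaced overridable methods (``preparse``, ``postparsing_precmd``, ``postparsing_postcmd``) with registered hook functions. The registration mechanism can be sketched in isolation like this; note this is a standalone illustration of the pattern, not the actual cmd2 API, and the class and field names are invented:

```python
# Minimal sketch of a registration-based lifecycle-hook framework, the
# pattern cmd2 0.9.x adopted. Hooks are registered at init time and each
# one receives, may modify, and returns the shared parse data.
class MiniShell:
    def __init__(self):
        self._postparsing_hooks = []

    def register_postparsing_hook(self, hook):
        # Hooks run in registration order.
        self._postparsing_hooks.append(hook)

    def onecmd(self, line):
        data = {"statement": line, "stop": False}
        for hook in self._postparsing_hooks:
            data = hook(data)
            if data["stop"]:
                return "aborted"
        return f"ran: {data['statement']}"


def lowercase_hook(data):
    # Example hook: normalize the statement before dispatch.
    data["statement"] = data["statement"].lower()
    return data


shell = MiniShell()
shell.register_postparsing_hook(lowercase_hook)
print(shell.onecmd("HELP"))  # ran: help
```

The advantage over overridable methods is that several plugins can each register a hook without fighting over a single method override.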
Hello community,
here is the log from the commit of package 000product for openSUSE:Factory checked in at 2018-09-27 03:22:54
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/000product (Old)
and /work/SRC/openSUSE:Factory/.000product.new (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Package is "000product"
Thu Sep 27 03:22:54 2018 rev:551 rq: version:unknown
Changes:
--------
New Changes file:
NO CHANGES FILE!!!
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Other differences:
------------------
_service:product_converter:openSUSE-Addon-NonOss-ftp-ftp-i586_x86_64.kiwi: same change
_service:product_converter:openSUSE-Addon-NonOss-release.spec: same change
_service:product_converter:openSUSE-Tumbleweed-Kubic-dvd5-dvd-x86_64.kiwi: same change
_service:product_converter:openSUSE-Tumbleweed-Kubic-release.spec: same change
_service:product_converter:openSUSE-cd-mini-i586.kiwi: same change
_service:product_converter:openSUSE-cd-mini-x86_64.kiwi: same change
_service:product_converter:openSUSE-dvd5-dvd-i586.kiwi: same change
_service:product_converter:openSUSE-dvd5-dvd-x86_64.kiwi: same change
_service:product_converter:openSUSE-ftp-ftp-i586_x86_64.kiwi: same change
++++++ _service:product_converter:openSUSE-release.spec ++++++
--- /var/tmp/diff_new_pack.HsvnHC/_old 2018-09-27 03:23:08.006179751 +0200
+++ /var/tmp/diff_new_pack.HsvnHC/_new 2018-09-27 03:23:08.030179725 +0200
@@ -2882,10 +2882,13 @@
Provides: weakremover(python-libbtctl)
Provides: weakremover(python-libntrack)
Provides: weakremover(python-louis)
+Provides: weakremover(python-matplotlib)
+Provides: weakremover(python-matplotlib-tk)
Provides: weakremover(python-matplotlib-wx)
Provides: weakremover(python-metacity)
Provides: weakremover(python-ogg)
Provides: weakremover(python-psycopg2-doc)
+Provides: weakremover(python-pylint)
Provides: weakremover(python-pyparsing-doc)
Provides: weakremover(python-qscintilla-sip)
Provides: weakremover(python-rapi2)
@@ -4926,6 +4929,7 @@
Provides: weakremover(python-cinder)
Provides: weakremover(python-cinderclient-test)
Provides: weakremover(python-cliff-tablib)
+Provides: weakremover(python-cmd2)
Provides: weakremover(python-glanceclient-test)
Provides: weakremover(python-heatclient-test)
Provides: weakremover(python-horizon)
@@ -6672,10 +6676,18 @@
Provides: weakremover(python-CXX)
Provides: weakremover(python-CXX-devel)
Provides: weakremover(python-appindicator)
+Provides: weakremover(python-astroid)
Provides: weakremover(python-clang)
Provides: weakremover(python-ec2metadata)
Provides: weakremover(python-libsmdev)
+Provides: weakremover(python-matplotlib-cairo)
Provides: weakremover(python-matplotlib-gtk2)
+Provides: weakremover(python-matplotlib-gtk3)
+Provides: weakremover(python-matplotlib-latex)
+Provides: weakremover(python-matplotlib-qt-shared)
+Provides: weakremover(python-matplotlib-qt4)
+Provides: weakremover(python-matplotlib-qt5)
+Provides: weakremover(python-matplotlib-web)
Provides: weakremover(python-oslo.db-test)
Provides: weakremover(python-oslo.messaging-test)
Provides: weakremover(python-oslo.version)
@@ -8850,7 +8862,6 @@
Provides: weakremover(python-pyinotify)
Provides: weakremover(python-retrying)
Provides: weakremover(python-scripttest)
-Provides: weakremover(python-service_identity)
Provides: weakremover(python-simpleeval)
Provides: weakremover(python-slapdsock)
Provides: weakremover(python-socketpool)
@@ -14916,6 +14927,8 @@
Provides: weakremover(libf2fs_format0)
Provides: weakremover(libfaxutil5_5_9)
Provides: weakremover(libflif0)
+Provides: weakremover(libfluidsynth1)
+Provides: weakremover(libfluidsynth1-32bit)
Provides: weakremover(libfm-qt-lang)
Provides: weakremover(libfm-qt3)
Provides: weakremover(libfplll3)
@@ -17270,6 +17283,7 @@
Provides: weakremover(libstdc++6-gcc8-locale)
Provides: weakremover(libtsan0-gcc8)
Provides: weakremover(libdpdk-18_02-0)
+Provides: weakremover(libopenvswitch-2_9-0)
Provides: weakremover(libphobos2-0_79)
Provides: weakremover(libgnomecups-1_0-1)
Provides: weakremover(libgnomecups-1_0-1-32bit)
@@ -20466,10 +20480,13 @@
<obsoletepackage>python-libbtctl</obsoletepackage>
<obsoletepackage>python-libntrack</obsoletepackage>
<obsoletepackage>python-louis</obsoletepackage>
+ <obsoletepackage>python-matplotlib</obsoletepackage>
+ <obsoletepackage>python-matplotlib-tk</obsoletepackage>
<obsoletepackage>python-matplotlib-wx</obsoletepackage>
<obsoletepackage>python-metacity</obsoletepackage>
<obsoletepackage>python-ogg</obsoletepackage>
<obsoletepackage>python-psycopg2-doc</obsoletepackage>
+ <obsoletepackage>python-pylint</obsoletepackage>
<obsoletepackage>python-pyparsing-doc</obsoletepackage>
<obsoletepackage>python-qscintilla-sip</obsoletepackage>
<obsoletepackage>python-rapi2</obsoletepackage>
@@ -22510,6 +22527,7 @@
<obsoletepackage>python-cinder</obsoletepackage>
<obsoletepackage>python-cinderclient-test</obsoletepackage>
<obsoletepackage>python-cliff-tablib</obsoletepackage>
+ <obsoletepackage>python-cmd2</obsoletepackage>
<obsoletepackage>python-glanceclient-test</obsoletepackage>
<obsoletepackage>python-heatclient-test</obsoletepackage>
<obsoletepackage>python-horizon</obsoletepackage>
@@ -24256,10 +24274,18 @@
<obsoletepackage>python-CXX</obsoletepackage>
<obsoletepackage>python-CXX-devel</obsoletepackage>
<obsoletepackage>python-appindicator</obsoletepackage>
+ <obsoletepackage>python-astroid</obsoletepackage>
<obsoletepackage>python-clang</obsoletepackage>
<obsoletepackage>python-ec2metadata</obsoletepackage>
<obsoletepackage>python-libsmdev</obsoletepackage>
+ <obsoletepackage>python-matplotlib-cairo</obsoletepackage>
<obsoletepackage>python-matplotlib-gtk2</obsoletepackage>
+ <obsoletepackage>python-matplotlib-gtk3</obsoletepackage>
+ <obsoletepackage>python-matplotlib-latex</obsoletepackage>
+ <obsoletepackage>python-matplotlib-qt-shared</obsoletepackage>
+ <obsoletepackage>python-matplotlib-qt4</obsoletepackage>
+ <obsoletepackage>python-matplotlib-qt5</obsoletepackage>
+ <obsoletepackage>python-matplotlib-web</obsoletepackage>
<obsoletepackage>python-oslo.db-test</obsoletepackage>
<obsoletepackage>python-oslo.messaging-test</obsoletepackage>
<obsoletepackage>python-oslo.version</obsoletepackage>
@@ -26434,7 +26460,6 @@
<obsoletepackage>python-pyinotify</obsoletepackage>
<obsoletepackage>python-retrying</obsoletepackage>
<obsoletepackage>python-scripttest</obsoletepackage>
- <obsoletepackage>python-service_identity</obsoletepackage>
<obsoletepackage>python-simpleeval</obsoletepackage>
<obsoletepackage>python-slapdsock</obsoletepackage>
<obsoletepackage>python-socketpool</obsoletepackage>
@@ -32500,6 +32525,8 @@
<obsoletepackage>libf2fs_format0</obsoletepackage>
<obsoletepackage>libfaxutil5_5_9</obsoletepackage>
<obsoletepackage>libflif0</obsoletepackage>
+ <obsoletepackage>libfluidsynth1</obsoletepackage>
+ <obsoletepackage>libfluidsynth1-32bit</obsoletepackage>
<obsoletepackage>libfm-qt-lang</obsoletepackage>
<obsoletepackage>libfm-qt3</obsoletepackage>
<obsoletepackage>libfplll3</obsoletepackage>
@@ -34854,6 +34881,7 @@
<obsoletepackage>libstdc++6-gcc8-locale</obsoletepackage>
<obsoletepackage>libtsan0-gcc8</obsoletepackage>
<obsoletepackage>libdpdk-18_02-0</obsoletepackage>
+ <obsoletepackage>libopenvswitch-2_9-0</obsoletepackage>
<obsoletepackage>libphobos2-0_79</obsoletepackage>
<obsoletepackage>libgnomecups-1_0-1</obsoletepackage>
<obsoletepackage>libgnomecups-1_0-1-32bit</obsoletepackage>
openSUSE-release.spec: same change
++++++ obsoletepackages.inc ++++++
--- /var/tmp/diff_new_pack.HsvnHC/_old 2018-09-27 03:23:08.282179455 +0200
+++ /var/tmp/diff_new_pack.HsvnHC/_new 2018-09-27 03:23:08.286179450 +0200
@@ -2800,10 +2800,13 @@
<obsoletepackage>python-libbtctl</obsoletepackage>
<obsoletepackage>python-libntrack</obsoletepackage>
<obsoletepackage>python-louis</obsoletepackage>
+ <obsoletepackage>python-matplotlib</obsoletepackage>
+ <obsoletepackage>python-matplotlib-tk</obsoletepackage>
<obsoletepackage>python-matplotlib-wx</obsoletepackage>
<obsoletepackage>python-metacity</obsoletepackage>
<obsoletepackage>python-ogg</obsoletepackage>
<obsoletepackage>python-psycopg2-doc</obsoletepackage>
+ <obsoletepackage>python-pylint</obsoletepackage>
<obsoletepackage>python-pyparsing-doc</obsoletepackage>
<obsoletepackage>python-qscintilla-sip</obsoletepackage>
<obsoletepackage>python-rapi2</obsoletepackage>
@@ -4846,6 +4849,7 @@
<obsoletepackage>python-cinder</obsoletepackage>
<obsoletepackage>python-cinderclient-test</obsoletepackage>
<obsoletepackage>python-cliff-tablib</obsoletepackage>
+ <obsoletepackage>python-cmd2</obsoletepackage>
<obsoletepackage>python-glanceclient-test</obsoletepackage>
<obsoletepackage>python-heatclient-test</obsoletepackage>
<obsoletepackage>python-horizon</obsoletepackage>
@@ -6594,10 +6598,18 @@
<obsoletepackage>python-CXX</obsoletepackage>
<obsoletepackage>python-CXX-devel</obsoletepackage>
<obsoletepackage>python-appindicator</obsoletepackage>
+ <obsoletepackage>python-astroid</obsoletepackage>
<obsoletepackage>python-clang</obsoletepackage>
<obsoletepackage>python-ec2metadata</obsoletepackage>
<obsoletepackage>python-libsmdev</obsoletepackage>
+ <obsoletepackage>python-matplotlib-cairo</obsoletepackage>
<obsoletepackage>python-matplotlib-gtk2</obsoletepackage>
+ <obsoletepackage>python-matplotlib-gtk3</obsoletepackage>
+ <obsoletepackage>python-matplotlib-latex</obsoletepackage>
+ <obsoletepackage>python-matplotlib-qt-shared</obsoletepackage>
+ <obsoletepackage>python-matplotlib-qt4</obsoletepackage>
+ <obsoletepackage>python-matplotlib-qt5</obsoletepackage>
+ <obsoletepackage>python-matplotlib-web</obsoletepackage>
<obsoletepackage>python-oslo.db-test</obsoletepackage>
<obsoletepackage>python-oslo.messaging-test</obsoletepackage>
<obsoletepackage>python-oslo.version</obsoletepackage>
@@ -8775,7 +8787,6 @@
<obsoletepackage>python-pyinotify</obsoletepackage>
<obsoletepackage>python-retrying</obsoletepackage>
<obsoletepackage>python-scripttest</obsoletepackage>
- <obsoletepackage>python-service_identity</obsoletepackage>
<obsoletepackage>python-simpleeval</obsoletepackage>
<obsoletepackage>python-slapdsock</obsoletepackage>
<obsoletepackage>python-socketpool</obsoletepackage>
@@ -14842,6 +14853,8 @@
<obsoletepackage>libf2fs_format0</obsoletepackage>
<obsoletepackage>libfaxutil5_5_9</obsoletepackage>
<obsoletepackage>libflif0</obsoletepackage>
+ <obsoletepackage>libfluidsynth1</obsoletepackage>
+ <obsoletepackage>libfluidsynth1-32bit</obsoletepackage>
<obsoletepackage>libfm-qt-lang</obsoletepackage>
<obsoletepackage>libfm-qt3</obsoletepackage>
<obsoletepackage>libfplll3</obsoletepackage>
@@ -17226,6 +17239,7 @@
<obsoletepackage>libtsan0-gcc8</obsoletepackage>
<!-- Factory/openSUSE_20180307_i586_x86_64_Build0001 -->
<obsoletepackage>libdpdk-18_02-0</obsoletepackage>
+ <obsoletepackage>libopenvswitch-2_9-0</obsoletepackage>
<obsoletepackage>libphobos2-0_79</obsoletepackage>
<!-- Factory/openSUSE_20180309_i586_x86_64_Build0002 -->
<obsoletepackage>libgnomecups-1_0-1</obsoletepackage>
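For readers unfamiliar with the ``weakremover(...)`` provides added in the hunks above: they are the mechanism Tumbleweed's release package uses to tell the solver that a package dropped from the distribution may be cleanly removed during a distribution upgrade. Each dropped package name is declared as a virtual provide of the release package, in fragments of this form (taken from the same pattern as the diff above):

```
Provides:       weakremover(python-cmd2)
Provides:       weakremover(python-pylint)
```

The matching ``<obsoletepackage>`` entries carry the same list into the product metadata.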
Hello community,
here is the log from the commit of package 000product for openSUSE:Factory checked in at 2018-09-27 02:22:08
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/000product (Old)
and /work/SRC/openSUSE:Factory/.000product.new (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Package is "000product"
Thu Sep 27 02:22:08 2018 rev:550 rq: version:unknown
Changes:
--------
New Changes file:
NO CHANGES FILE!!!
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Other differences:
------------------
_service:product_converter:openSUSE-Addon-NonOss-ftp-ftp-i586_x86_64.kiwi: same change
_service:product_converter:openSUSE-Addon-NonOss-release.spec: same change
++++++ _service:product_converter:openSUSE-Tumbleweed-Kubic-dvd5-dvd-x86_64.kiwi ++++++
--- /var/tmp/diff_new_pack.kOeMal/_old 2018-09-27 02:22:22.986024329 +0200
+++ /var/tmp/diff_new_pack.kOeMal/_new 2018-09-27 02:22:22.986024329 +0200
@@ -627,8 +627,6 @@
<repopackage name="libsss_idmap0"/>
<repopackage name="libsss_nss_idmap0"/>
<repopackage name="libsss_nss_idmap0"/>
- <repopackage name="libsss_simpleifp0"/>
- <repopackage name="libsss_simpleifp0"/>
<repopackage name="libstdc++6"/>
<repopackage name="libstdc++6"/>
<repopackage name="libstdc++6"/>
_service:product_converter:openSUSE-cd-mini-i586.kiwi: same change
_service:product_converter:openSUSE-cd-mini-x86_64.kiwi: same change
++++++ _service:product_converter:openSUSE-dvd5-dvd-i586.kiwi ++++++
--- /var/tmp/diff_new_pack.kOeMal/_old 2018-09-27 02:22:23.030024281 +0200
+++ /var/tmp/diff_new_pack.kOeMal/_new 2018-09-27 02:22:23.034024277 +0200
@@ -2800,7 +2800,6 @@
<repopackage name="libsss_certmap0" arch="i586,i686"/>
<repopackage name="libsss_idmap0" arch="i586,i686"/>
<repopackage name="libsss_nss_idmap0" arch="i586,i686"/>
- <repopackage name="libsss_simpleifp0" arch="i586,i686"/>
<repopackage name="libstaroffice-0_0-0" arch="i586,i686"/>
<repopackage name="libstartup-notification-1-0" arch="i586,i686"/>
<repopackage name="libstdc++-devel" arch="i586,i686"/>
++++++ _service:product_converter:openSUSE-dvd5-dvd-x86_64.kiwi ++++++
--- /var/tmp/diff_new_pack.kOeMal/_old 2018-09-27 02:22:23.054024255 +0200
+++ /var/tmp/diff_new_pack.kOeMal/_new 2018-09-27 02:22:23.054024255 +0200
@@ -2898,7 +2898,6 @@
<repopackage name="libsss_certmap0" arch="x86_64"/>
<repopackage name="libsss_idmap0" arch="x86_64"/>
<repopackage name="libsss_nss_idmap0" arch="x86_64"/>
- <repopackage name="libsss_simpleifp0" arch="x86_64"/>
<repopackage name="libstaroffice-0_0-0" arch="x86_64"/>
<repopackage name="libstartup-notification-1-0" arch="x86_64"/>
<repopackage name="libstdc++-devel" arch="x86_64"/>
_service:product_converter:openSUSE-release.spec: same change
openSUSE-Tumbleweed-Kubic-release.spec: same change
openSUSE-release.spec: same change
++++++ DVD5-i586.group ++++++
--- /var/tmp/diff_new_pack.kOeMal/_old 2018-09-27 02:22:23.142024160 +0200
+++ /var/tmp/diff_new_pack.kOeMal/_new 2018-09-27 02:22:23.146024155 +0200
@@ -2837,7 +2837,6 @@
<package name="libsss_certmap0"/>
<package name="libsss_idmap0"/>
<package name="libsss_nss_idmap0"/>
- <package name="libsss_simpleifp0"/>
<package name="libstaroffice-0_0-0"/>
<package name="libstartup-notification-1-0"/>
<package name="libstdc++-devel"/>
++++++ DVD5-x86_64.group ++++++
--- /var/tmp/diff_new_pack.kOeMal/_old 2018-09-27 02:22:23.162024138 +0200
+++ /var/tmp/diff_new_pack.kOeMal/_new 2018-09-27 02:22:23.162024138 +0200
@@ -2951,7 +2951,6 @@
<package name="libsss_certmap0"/>
<package name="libsss_idmap0"/>
<package name="libsss_nss_idmap0"/>
- <package name="libsss_simpleifp0"/>
<package name="libstaroffice-0_0-0"/>
<package name="libstartup-notification-1-0"/>
<package name="libstdc++-devel"/>
++++++ openSUSE-Kubic-3.group ++++++
--- /var/tmp/diff_new_pack.kOeMal/_old 2018-09-27 02:22:23.350023933 +0200
+++ /var/tmp/diff_new_pack.kOeMal/_new 2018-09-27 02:22:23.354023929 +0200
@@ -167,7 +167,6 @@
<package name="libsss_certmap0"/>
<package name="libsss_idmap0"/>
<package name="libsss_nss_idmap0"/>
- <package name="libsss_simpleifp0"/>
<package name="libstdc++6"/>
<package name="libsystemd0"/>
<package name="libtalloc2"/>
++++++ openSUSE-Kubic-DVD.group ++++++
--- /var/tmp/diff_new_pack.kOeMal/_old 2018-09-27 02:22:23.366023916 +0200
+++ /var/tmp/diff_new_pack.kOeMal/_new 2018-09-27 02:22:23.366023916 +0200
@@ -177,7 +177,6 @@
<package name="libsss_certmap0"/>
<package name="libsss_idmap0"/>
<package name="libsss_nss_idmap0"/>
- <package name="libsss_simpleifp0"/>
<package name="libstdc++6"/>
<package name="libsystemd0"/>
<package name="libtalloc2"/>
Hello community,
here is the log from the commit of package python-pydocumentdb for openSUSE:Factory checked in at 2018-09-26 16:16:16
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/python-pydocumentdb (Old)
and /work/SRC/openSUSE:Factory/.python-pydocumentdb.new (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Package is "python-pydocumentdb"
Wed Sep 26 16:16:16 2018 rev:2 rq:638009 version:2.3.2
Changes:
--------
--- /work/SRC/openSUSE:Factory/python-pydocumentdb/python-pydocumentdb.changes 2018-05-13 16:05:16.625867742 +0200
+++ /work/SRC/openSUSE:Factory/.python-pydocumentdb.new/python-pydocumentdb.changes 2018-09-26 16:16:17.491001770 +0200
@@ -1,0 +2,7 @@
+Thu Sep 6 12:57:59 UTC 2018 - John Paul Adrian Glaubitz <adrian.glaubitz(a)suse.com>
+
+- New upstream release
+ + Version 2.3.2
+ + No upstream changelog provided
+
+-------------------------------------------------------------------
Old:
----
pydocumentdb-2.3.1.tar.gz
New:
----
pydocumentdb-2.3.2.tar.gz
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Other differences:
------------------
++++++ python-pydocumentdb.spec ++++++
--- /var/tmp/diff_new_pack.JjqgS5/_old 2018-09-26 16:16:18.051000843 +0200
+++ /var/tmp/diff_new_pack.JjqgS5/_new 2018-09-26 16:16:18.051000843 +0200
@@ -18,7 +18,7 @@
%{?!python_module:%define python_module() python-%{**} python3-%{**}}
Name: python-pydocumentdb
-Version: 2.3.1
+Version: 2.3.2
Release: 0
Summary: Azure DocumentDB Python SDK
License: MIT
++++++ pydocumentdb-2.3.1.tar.gz -> pydocumentdb-2.3.2.tar.gz ++++++
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/pydocumentdb-2.3.1/PKG-INFO new/pydocumentdb-2.3.2/PKG-INFO
--- old/pydocumentdb-2.3.1/PKG-INFO 2017-12-22 01:45:45.000000000 +0100
+++ new/pydocumentdb-2.3.2/PKG-INFO 2018-05-08 22:39:48.000000000 +0200
@@ -1,6 +1,6 @@
Metadata-Version: 1.1
Name: pydocumentdb
-Version: 2.3.1
+Version: 2.3.2
Summary: Azure DocumentDB Python SDK
Home-page: https://github.com/Azure/azure-documentdb-python
Author: Microsoft
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/pydocumentdb-2.3.1/README.md new/pydocumentdb-2.3.2/README.md
--- old/pydocumentdb-2.3.1/README.md 2017-12-22 00:52:28.000000000 +0100
+++ new/pydocumentdb-2.3.2/README.md 2018-05-08 00:09:38.000000000 +0200
@@ -1,19 +1,21 @@
-This is the README of the Python driver for Microsoft Azure DocumentDB database service.
+# Microsoft Azure Cosmos DB Python SDK
-Welcome to DocumentDB.
+Welcome to the repo containing all things Python for the Azure Cosmos DB API which is published with name [pydocumentdb](https://pypi.python.org/pypi/pydocumentdb/). For documentation please see the Microsoft Azure [link](https://docs.microsoft.com/en-us/azure/cosmos-db/sql-api-sdk-python).
-0) Pre-requirements:
+## Pre-requirements
- Python 2.7, Python 3.3, Python 3.4, or Python 3.5
- https://www.python.org/downloads/
+Python 2.7, Python 3.3, Python 3.4, or Python 3.5
+https://www.python.org/downloads/
- If you use Microsoft Visual Studio as IDE (we use 2015), please install the
- following extension for Python.
- http://microsoft.github.io/PTVS/
+If you use Microsoft Visual Studio as IDE (we use 2015), please install the
+following extension for Python.
+http://microsoft.github.io/PTVS/
+Install Cosmos DB emulator
+Follow instruction at https://docs.microsoft.com/en-us/azure/cosmos-db/local-emulator
-1) Installation:
+## Installation:
$ python setup.py install
@@ -22,7 +24,12 @@
$ pip install pydocumentdb
-2) Testing:
+## Running Testing
+Clone the repo
+```bash
+git clone https://github.com/Azure/azure-documentdb-python.git
+cd azure-documentdb-python
+```
Most of the test files under test sub-folder require you to enter your Azure DocumentDB master key and host endpoint:
@@ -40,7 +47,7 @@
Most of the test cases create collections in your DocumentDB account. Collections are billing entities. By running these test cases, you may incur monetary costs on your account.
-3) To generate documentations:
+## Documentation generation
Install Sphinx: http://sphinx-doc.org/install.html
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/pydocumentdb-2.3.1/doc/conf.py new/pydocumentdb-2.3.2/doc/conf.py
--- old/pydocumentdb-2.3.1/doc/conf.py 2017-12-22 00:52:28.000000000 +0100
+++ new/pydocumentdb-2.3.2/doc/conf.py 2018-05-08 00:13:56.000000000 +0200
@@ -52,9 +52,9 @@
# built documents.
#
# The short X.Y version.
-version = '2.3.1'
+version = '2.3.2'
# The full version, including alpha/beta/rc tags.
-release = '2.3.1'
+release = '2.3.2'
# The language for content autogenerated by Sphinx. Refer to documentation
# for a list of supported languages.
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/pydocumentdb-2.3.1/pydocumentdb/base.py new/pydocumentdb-2.3.2/pydocumentdb/base.py
--- old/pydocumentdb-2.3.1/pydocumentdb/base.py 2017-12-22 00:52:28.000000000 +0100
+++ new/pydocumentdb-2.3.2/pydocumentdb/base.py 2018-05-08 00:09:38.000000000 +0200
@@ -150,6 +150,9 @@
if options.get('enableCrossPartitionQuery'):
headers[http_constants.HttpHeaders.EnableCrossPartitionQuery] = options['enableCrossPartitionQuery']
+ if options.get('populateQueryMetrics'):
+ headers[http_constants.HttpHeaders.PopulateQueryMetrics] = options['populateQueryMetrics']
+
if document_client.master_key:
headers[http_constants.HttpHeaders.XDate] = (
datetime.datetime.utcnow().strftime('%a, %d %b %Y %H:%M:%S GMT'))
@@ -577,4 +580,4 @@
return tokens
-
\ No newline at end of file
+
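The base.py hunk above wires a new request option, `populateQueryMetrics`, onto the wire header the http_constants.py hunk later in this diff names `x-ms-documentdb-populatequerymetrics`. A minimal, self-contained sketch of that option-to-header translation (the helper name `build_feed_headers` and the cross-partition header value are illustrative assumptions, not SDK API):

```python
# Header values: PopulateQueryMetrics is taken verbatim from this diff;
# the cross-partition header value is assumed from the SDK's conventions.
POPULATE_QUERY_METRICS = 'x-ms-documentdb-populatequerymetrics'
ENABLE_CROSS_PARTITION = 'x-ms-documentdb-query-enablecrosspartition'

def build_feed_headers(options):
    """Translate caller-facing query options into HTTP request headers,
    mirroring the guarded dict lookups the base.py hunk adds."""
    headers = {}
    if options.get('enableCrossPartitionQuery'):
        headers[ENABLE_CROSS_PARTITION] = options['enableCrossPartitionQuery']
    if options.get('populateQueryMetrics'):
        headers[POPULATE_QUERY_METRICS] = options['populateQueryMetrics']
    return headers
```

Callers that never set the option get no extra header, which keeps the change backward compatible.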
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/pydocumentdb-2.3.1/pydocumentdb/default_retry_policy.py new/pydocumentdb-2.3.2/pydocumentdb/default_retry_policy.py
--- old/pydocumentdb-2.3.1/pydocumentdb/default_retry_policy.py 1970-01-01 01:00:00.000000000 +0100
+++ new/pydocumentdb-2.3.2/pydocumentdb/default_retry_policy.py 2018-05-08 00:09:38.000000000 +0200
@@ -0,0 +1,74 @@
+#The MIT License (MIT)
+#Copyright (c) 2017 Microsoft Corporation
+
+#Permission is hereby granted, free of charge, to any person obtaining a copy
+#of this software and associated documentation files (the "Software"), to deal
+#in the Software without restriction, including without limitation the rights
+#to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
+#copies of the Software, and to permit persons to whom the Software is
+#furnished to do so, subject to the following conditions:
+
+#The above copyright notice and this permission notice shall be included in all
+#copies or substantial portions of the Software.
+
+#THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+#IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+#FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
+#AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+#LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
+#OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
+#SOFTWARE.
+
+"""Internal class for connection reset retry policy implementation in the Azure Cosmos DB database service.
+"""
+import pydocumentdb.http_constants as http_constants
+
+class _DefaultRetryPolicy(object):
+
+ error_codes = http_constants._ErrorCodes;
+ CONNECTION_ERROR_CODES = [
+ error_codes.WindowsInterruptedFunctionCall,
+ error_codes.WindowsFileHandleNotValid,
+ error_codes.WindowsPermissionDenied,
+ error_codes.WindowsBadAddress,
+ error_codes.WindowsInvalidArgumnet,
+ error_codes.WindowsResourceTemporarilyUnavailable,
+ error_codes.WindowsOperationNowInProgress,
+ error_codes.WindowsAddressAlreadyInUse,
+ error_codes.WindowsConnectionResetByPeer,
+ error_codes.WindowsCannotSendAfterSocketShutdown,
+ error_codes.WindowsConnectionTimedOut,
+ error_codes.WindowsConnectionRefused,
+ error_codes.WindowsNameTooLong,
+ error_codes.WindowsHostIsDown,
+ error_codes.WindowsNoRouteTohost,
+ error_codes.LinuxConnectionReset
+ ]
+
+ def __init__(self, *args):
+ self._max_retry_attempt_count = 10
+ self.current_retry_attempt_count = 0
+ self.retry_after_in_milliseconds = 1000
+ self.args = args
+
+ def needsRetry(self, error_code):
+ if error_code in _DefaultRetryPolicy.CONNECTION_ERROR_CODES:
+ if (len(self.args) > 0):
+ if (self.args[3]['method'] == 'GET') or (http_constants.HttpHeaders.IsQuery in self.args[3]['headers']):
+ return True
+ return False
+ return True
+
+ def ShouldRetry(self, exception):
+ """Returns true if should retry based on the passed-in exception.
+
+ :param (errors.HTTPFailure instance) exception:
+
+ :rtype:
+ boolean
+
+ """
+ if (self.current_retry_attempt_count < self._max_retry_attempt_count) and self.needsRetry(exception.status_code):
+ self.current_retry_attempt_count += 1
+ return True
+ return False
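The core decision in the new `_DefaultRetryPolicy` is that connection-level socket errors are retried only for idempotent requests (a GET or a query), while other accepted errors retry up to the attempt cap. A simplified stand-in class demonstrating that decision (names and the reduced error-code set are illustrative, not the SDK's):

```python
# Subset of the connection error codes from the diff, for illustration:
# WSAECONNRESET, WSAETIMEDOUT, WSAECONNREFUSED, plus the Linux reset code.
CONNECTION_ERROR_CODES = {10054, 10060, 10061, 131}

class DefaultRetryPolicySketch:
    def __init__(self, method, is_query, max_attempts=10):
        self.method = method
        self.is_query = is_query
        self.max_attempts = max_attempts
        self.attempts = 0

    def needs_retry(self, error_code):
        if error_code in CONNECTION_ERROR_CODES:
            # Retrying a write after a connection reset could apply it twice,
            # so only reads and queries are retried on connection errors.
            return self.method == 'GET' or self.is_query
        return True

    def should_retry(self, error_code):
        if self.attempts < self.max_attempts and self.needs_retry(error_code):
            self.attempts += 1
            return True
        return False
```

For example, a POST hitting a connection reset (10054) is surfaced to the caller immediately, while the same error on a GET is retried.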
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/pydocumentdb-2.3.1/pydocumentdb/endpoint_discovery_retry_policy.py new/pydocumentdb-2.3.2/pydocumentdb/endpoint_discovery_retry_policy.py
--- old/pydocumentdb-2.3.1/pydocumentdb/endpoint_discovery_retry_policy.py 2017-12-22 00:52:28.000000000 +0100
+++ new/pydocumentdb-2.3.2/pydocumentdb/endpoint_discovery_retry_policy.py 2018-05-08 00:09:38.000000000 +0200
@@ -24,6 +24,14 @@
import logging
+logger = logging.getLogger(__name__)
+logger.setLevel(logging.INFO)
+log_formatter = logging.Formatter('%(levelname)s:%(message)s')
+log_handler = logging.StreamHandler()
+log_handler.setFormatter(log_formatter)
+logger.addHandler(log_handler)
+
+
class _EndpointDiscoveryRetryPolicy(object):
"""The endpoint discovery retry policy class used for geo-replicated database accounts
to handle the write forbidden exceptions due to writable/readable location changes
@@ -32,15 +40,12 @@
Max_retry_attempt_count = 120
Retry_after_in_milliseconds = 1000
- FORBIDDEN_STATUS_CODE = 403
- WRITE_FORBIDDEN_SUB_STATUS_CODE = 3
def __init__(self, global_endpoint_manager):
self.global_endpoint_manager = global_endpoint_manager
self._max_retry_attempt_count = _EndpointDiscoveryRetryPolicy.Max_retry_attempt_count
self.current_retry_attempt_count = 0
self.retry_after_in_milliseconds = _EndpointDiscoveryRetryPolicy.Retry_after_in_milliseconds
- logging.basicConfig(format='%(levelname)s:%(message)s', level=logging.INFO)
def ShouldRetry(self, exception):
"""Returns true if should retry based on the passed-in exception.
@@ -53,7 +58,7 @@
"""
if self.current_retry_attempt_count < self._max_retry_attempt_count and self.global_endpoint_manager.EnableEndpointDiscovery:
self.current_retry_attempt_count += 1
- logging.info('Write location was changed, refreshing the locations list from database account and will retry the request.')
+ logger.info('Write location was changed, refreshing the locations list from database account and will retry the request.')
# Refresh the endpoint list to refresh the new writable and readable locations
self.global_endpoint_manager.RefreshEndpointList()
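The endpoint_discovery_retry_policy.py hunk replaces `logging.basicConfig(...)`, which mutates the process-wide root logger, with a module-scoped logger, so importing the SDK no longer overrides the host application's logging configuration. The pattern in isolation:

```python
import logging

# A module-level logger configured in place of logging.basicConfig():
# handlers and levels attach to this logger only, leaving the root
# logger (and any application configuration) untouched.
logger = logging.getLogger(__name__)
logger.setLevel(logging.INFO)
handler = logging.StreamHandler()
handler.setFormatter(logging.Formatter('%(levelname)s:%(message)s'))
logger.addHandler(handler)

logger.info('Write location was changed, refreshing the locations list.')
```

Log calls then go through `logger.info(...)` rather than the module-level `logging.info(...)`, exactly as the hunk's last change shows.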
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/pydocumentdb-2.3.1/pydocumentdb/execution_context/execution_dispatcher.py new/pydocumentdb-2.3.2/pydocumentdb/execution_context/execution_dispatcher.py
--- old/pydocumentdb-2.3.1/pydocumentdb/execution_context/execution_dispatcher.py 2017-12-22 00:52:28.000000000 +0100
+++ new/pydocumentdb-2.3.2/pydocumentdb/execution_context/execution_dispatcher.py 2018-05-08 00:09:38.000000000 +0200
@@ -30,6 +30,7 @@
from pydocumentdb.execution_context.query_execution_info import _PartitionedQueryExecutionInfo
from pydocumentdb.execution_context import endpoint_component
from pydocumentdb.execution_context import multi_execution_aggregator
+from pydocumentdb.http_constants import StatusCodes, SubStatusCodes
class _ProxyQueryExecutionContext(_QueryExecutionContextBase):
'''
@@ -92,7 +93,7 @@
return self._execution_context.fetch_next_block()
def _is_partitioned_execution_info(self, e):
- return e.status_code == 400 and e.sub_status == 1004
+ return e.status_code == StatusCodes.BAD_REQUEST and e.sub_status == SubStatusCodes.CROSS_PARTITION_QUERY_NOT_SERVABLE
def _get_partitioned_execution_info(self, e):
error_msg = json.loads(e._http_error_message)
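This hunk swaps the magic numbers `400`/`1004` for named constants from the new `StatusCodes`/`SubStatusCodes` classes added to http_constants.py later in this diff. A self-contained sketch of the resulting comparison, with the constant values copied from the diff:

```python
class StatusCodes:
    BAD_REQUEST = 400

class SubStatusCodes:
    CROSS_PARTITION_QUERY_NOT_SERVABLE = 1004

def is_partitioned_execution_info(status_code, sub_status):
    # Formerly: status_code == 400 and sub_status == 1004
    return (status_code == StatusCodes.BAD_REQUEST
            and sub_status == SubStatusCodes.CROSS_PARTITION_QUERY_NOT_SERVABLE)
```

The behaviour is identical; the branch is now self-documenting and every comparison site stays consistent if a value ever changes.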
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/pydocumentdb-2.3.1/pydocumentdb/http_constants.py new/pydocumentdb-2.3.2/pydocumentdb/http_constants.py
--- old/pydocumentdb-2.3.1/pydocumentdb/http_constants.py 2017-12-22 00:52:28.000000000 +0100
+++ new/pydocumentdb-2.3.2/pydocumentdb/http_constants.py 2018-05-08 00:14:30.000000000 +0200
@@ -112,6 +112,7 @@
SubStatus = 'x-ms-substatus'
AlternateContentPath = 'x-ms-alt-content-path'
IsContinuationExpected = "x-ms-documentdb-query-iscontinuationexpected"
+ PopulateQueryMetrics = "x-ms-documentdb-populatequerymetrics"
# Quota Info
MaxEntityCount = 'x-ms-root-entity-max-count'
@@ -245,7 +246,7 @@
"""
CurrentVersion = '2017-11-15'
SDKName = 'documentdb-python-sdk'
- SDKVersion = '2.3.1'
+ SDKVersion = '2.3.2'
class Delimiters:
@@ -266,3 +267,95 @@
"""Constants of http context properties.
"""
SubscriptionId = 'SubscriptionId'
+
+
+class _ErrorCodes:
+ """Windows Socket Error Codes
+ """
+ WindowsInterruptedFunctionCall = 10004;
+ WindowsFileHandleNotValid = 10009;
+ WindowsPermissionDenied = 10013;
+ WindowsBadAddress = 10014;
+ WindowsInvalidArgumnet = 10022;
+ WindowsResourceTemporarilyUnavailable = 10035;
+ WindowsOperationNowInProgress = 10036;
+ WindowsAddressAlreadyInUse = 10048;
+ WindowsConnectionResetByPeer = 10054;
+ WindowsCannotSendAfterSocketShutdown = 10058;
+ WindowsConnectionTimedOut = 10060;
+ WindowsConnectionRefused = 10061;
+ WindowsNameTooLong = 10063;
+ WindowsHostIsDown = 10064;
+ WindowsNoRouteTohost = 10065;
+
+ """Linux Error Codes
+ """
+ LinuxConnectionReset = 131;
+
+
+class StatusCodes:
+ """HTTP status codes returned by the REST operations
+ """
+ # Success
+ OK = 200
+ CREATED = 201
+ ACCEPTED = 202
+ NO_CONTENT = 204
+
+ NOT_MODIFIED = 304
+
+ # Client Error
+ BAD_REQUEST = 400
+ UNAUTHORIZED = 401
+ FORBIDDEN = 403
+ NOT_FOUND = 404
+ METHOD_NOT_ALLOWED = 405
+ REQUEST_TIMEOUT = 408
+ CONFLICT = 409
+ GONE = 410
+ PRECONDITION_FAILED = 412
+ REQUEST_ENTITY_TOO_LARGE = 413
+ TOO_MANY_REQUESTS = 429
+ RETRY_WITH = 449
+
+ INTERNAL_SERVER_ERROR = 500
+ SERVICE_UNAVAILABLE = 503
+
+ # Operation pause and cancel. These are FAKE status codes for QOS logging purpose only.
+ OPERATION_PAUSED = 1200
+ OPERATION_CANCELLED = 1201
+
+
+class SubStatusCodes:
+ """Sub status codes returned by the REST operations specifying the details of the operation
+ """
+ UNKNOWN = 0
+
+ # 400: Bad Request Substatus
+ PARTITION_KEY_MISMATCH = 1001
+ CROSS_PARTITION_QUERY_NOT_SERVABLE = 1004
+
+ # 410: StatusCodeType_Gone: substatus
+ NAME_CACHE_IS_STALE = 1000
+ PARTITION_KEY_RANGE_GONE = 1002
+ COMPLETING_SPLIT = 1007
+ COMPLETING_PARTITION_MIGRATION = 1008
+
+ # 403: Forbidden Substatus.
+ WRITE_FORBIDDEN = 3
+ PROVISION_LIMIT_REACHED = 1005
+ DATABASE_ACCOUNT_NOT_FOUND = 1008
+ REDUNDANT_COLLECTION_PUT = 1009
+ SHARED_THROUGHPUT_DATABASE_QUOTA_EXCEEDED = 1010
+ SHARED_THROUGHPUT_OFFER_GROW_NOT_NEEDED = 1011
+
+ # 404: LSN in session token is higher
+ READ_SESSION_NOTAVAILABLE = 1002
+ OWNER_RESOURCE_NOT_FOUND = 1003
+
+ # 409: Conflict exception
+ CONFLICT_WITH_CONTROL_PLANE = 1006
+
+ # 503: Service Unavailable due to region being out of capacity for bindable partitions
+ INSUFFICIENT_BINDABLE_PARTITIONS = 1007
+
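One subtlety the new `SubStatusCodes` class makes visible is that a sub-status value is only meaningful relative to its primary status code: per the diff, `1002` means `PARTITION_KEY_RANGE_GONE` under 410 but `READ_SESSION_NOTAVAILABLE` under 404. A hypothetical lookup keyed on the pair (not part of the SDK) makes that scoping explicit:

```python
# (status_code, sub_status) -> name; values copied from the diff above.
SUB_STATUS_NAMES = {
    (410, 1002): 'PARTITION_KEY_RANGE_GONE',
    (404, 1002): 'READ_SESSION_NOTAVAILABLE',
    (403, 3):    'WRITE_FORBIDDEN',
    (400, 1004): 'CROSS_PARTITION_QUERY_NOT_SERVABLE',
}

def describe(status_code, sub_status):
    """Name an error pair, falling back to UNKNOWN (sub-status 0's name)."""
    return SUB_STATUS_NAMES.get((status_code, sub_status), 'UNKNOWN')
```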
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/pydocumentdb-2.3.1/pydocumentdb/resource_throttle_retry_policy.py new/pydocumentdb-2.3.2/pydocumentdb/resource_throttle_retry_policy.py
--- old/pydocumentdb-2.3.1/pydocumentdb/resource_throttle_retry_policy.py 2017-12-22 00:52:28.000000000 +0100
+++ new/pydocumentdb-2.3.2/pydocumentdb/resource_throttle_retry_policy.py 2018-05-08 00:09:38.000000000 +0200
@@ -25,7 +25,6 @@
import pydocumentdb.http_constants as http_constants
class _ResourceThrottleRetryPolicy(object):
- THROTTLE_STATUS_CODE = 429
def __init__(self, max_retry_attempt_count, fixed_retry_interval_in_milliseconds, max_wait_time_in_seconds):
self._max_retry_attempt_count = max_retry_attempt_count
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/pydocumentdb-2.3.1/pydocumentdb/retry_options.py new/pydocumentdb-2.3.2/pydocumentdb/retry_options.py
--- old/pydocumentdb-2.3.1/pydocumentdb/retry_options.py 2017-12-22 00:52:28.000000000 +0100
+++ new/pydocumentdb-2.3.2/pydocumentdb/retry_options.py 2018-05-08 00:09:38.000000000 +0200
@@ -29,7 +29,7 @@
Max number of retries to be performed for a request. Default value 9.
:ivar int FixedRetryIntervalInMilliseconds:
Fixed retry interval in milliseconds to wait between each retry ignoring the retryAfter returned as part of the response.
- :ivar int MaxRetryAttemptCount:
+ :ivar int MaxWaitTimeInSeconds:
Max wait time in seconds to wait for a request while the retries are happening. Default value 30 seconds.
"""
def __init__(self, max_retry_attempt_count = 9, fixed_retry_interval_in_milliseconds = None, max_wait_time_in_seconds = 30):
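The docstring fix above corrects the third attribute's name from `MaxRetryAttemptCount` (a copy-paste duplicate) to `MaxWaitTimeInSeconds`. A rough sketch of the options object with the defaults the hunk shows (9 retries, 30 s total wait); attribute storage here is simplified — the real class may expose these as properties:

```python
class RetryOptions:
    """Retry knobs: attempt cap, fixed per-retry interval, total wait cap."""
    def __init__(self, max_retry_attempt_count=9,
                 fixed_retry_interval_in_milliseconds=None,
                 max_wait_time_in_seconds=30):
        self.MaxRetryAttemptCount = max_retry_attempt_count
        self.FixedRetryIntervalInMilliseconds = fixed_retry_interval_in_milliseconds
        self.MaxWaitTimeInSeconds = max_wait_time_in_seconds
```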
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/pydocumentdb-2.3.1/pydocumentdb/retry_utility.py new/pydocumentdb-2.3.2/pydocumentdb/retry_utility.py
--- old/pydocumentdb-2.3.1/pydocumentdb/retry_utility.py 2017-12-22 00:52:28.000000000 +0100
+++ new/pydocumentdb-2.3.2/pydocumentdb/retry_utility.py 2018-05-08 00:09:38.000000000 +0200
@@ -27,7 +27,8 @@
import pydocumentdb.errors as errors
import pydocumentdb.endpoint_discovery_retry_policy as endpoint_discovery_retry_policy
import pydocumentdb.resource_throttle_retry_policy as resource_throttle_retry_policy
-import pydocumentdb.http_constants as http_constants
+import pydocumentdb.default_retry_policy as default_retry_policy
+from pydocumentdb.http_constants import HttpHeaders, StatusCodes, SubStatusCodes
def _Execute(client, global_endpoint_manager, function, *args, **kwargs):
"""Exectutes the function with passed parameters applying all retry policies
@@ -48,6 +49,7 @@
resourceThrottle_retry_policy = resource_throttle_retry_policy._ResourceThrottleRetryPolicy(client.connection_policy.RetryOptions.MaxRetryAttemptCount,
client.connection_policy.RetryOptions.FixedRetryIntervalInMilliseconds,
client.connection_policy.RetryOptions.MaxWaitTimeInSeconds)
+ defaultRetry_policy = default_retry_policy._DefaultRetryPolicy(*args)
while True:
try:
@@ -57,26 +59,28 @@
client.last_response_headers = {}
# setting the throttle related response headers before returning the result
- client.last_response_headers[http_constants.HttpHeaders.ThrottleRetryCount] = resourceThrottle_retry_policy.current_retry_attempt_count
- client.last_response_headers[http_constants.HttpHeaders.ThrottleRetryWaitTimeInMs] = resourceThrottle_retry_policy.cummulative_wait_time_in_milliseconds
+ client.last_response_headers[HttpHeaders.ThrottleRetryCount] = resourceThrottle_retry_policy.current_retry_attempt_count
+ client.last_response_headers[HttpHeaders.ThrottleRetryWaitTimeInMs] = resourceThrottle_retry_policy.cummulative_wait_time_in_milliseconds
return result
except errors.HTTPFailure as e:
retry_policy = None
- if (e.status_code == endpoint_discovery_retry_policy._EndpointDiscoveryRetryPolicy.FORBIDDEN_STATUS_CODE
- and e.sub_status == endpoint_discovery_retry_policy._EndpointDiscoveryRetryPolicy.WRITE_FORBIDDEN_SUB_STATUS_CODE):
+ if (e.status_code == StatusCodes.FORBIDDEN
+ and e.sub_status == SubStatusCodes.WRITE_FORBIDDEN):
retry_policy = endpointDiscovery_retry_policy
- elif e.status_code == resource_throttle_retry_policy._ResourceThrottleRetryPolicy.THROTTLE_STATUS_CODE:
+ elif e.status_code == StatusCodes.TOO_MANY_REQUESTS:
retry_policy = resourceThrottle_retry_policy
+ else:
+ retry_policy = defaultRetry_policy
# If none of the retry policies applies or there is no retry needed, set the throttle related response hedaers and
# re-throw the exception back
- if not (retry_policy and retry_policy.ShouldRetry(e)):
+ if not (retry_policy.ShouldRetry(e)):
if not client.last_response_headers:
client.last_response_headers = {}
- client.last_response_headers[http_constants.HttpHeaders.ThrottleRetryCount] = resourceThrottle_retry_policy.current_retry_attempt_count
- client.last_response_headers[http_constants.HttpHeaders.ThrottleRetryWaitTimeInMs] = resourceThrottle_retry_policy.cummulative_wait_time_in_milliseconds
+ client.last_response_headers[HttpHeaders.ThrottleRetryCount] = resourceThrottle_retry_policy.current_retry_attempt_count
+ client.last_response_headers[HttpHeaders.ThrottleRetryWaitTimeInMs] = resourceThrottle_retry_policy.cummulative_wait_time_in_milliseconds
raise
else:
# Wait for retry_after_in_milliseconds time before the next retry
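The retry_utility.py hunk changes `_Execute`'s policy selection: previously an `HTTPFailure` matching neither the endpoint-discovery case (403/write-forbidden) nor throttling (429) re-raised immediately; now it falls through to the new default retry policy. The selection logic in isolation (policy arguments are stand-ins; constant values are copied from the diff):

```python
class HTTPFailure(Exception):
    """Minimal stand-in for pydocumentdb.errors.HTTPFailure."""
    def __init__(self, status_code, sub_status=0):
        self.status_code = status_code
        self.sub_status = sub_status

def select_policy(e, endpoint_policy, throttle_policy, default_policy):
    if e.status_code == 403 and e.sub_status == 3:   # FORBIDDEN / WRITE_FORBIDDEN
        return endpoint_policy
    elif e.status_code == 429:                        # TOO_MANY_REQUESTS
        return throttle_policy
    else:
        return default_policy                         # new in 2.3.2
```

Because a policy is now always selected, the guard in `_Execute` drops from `if not (retry_policy and retry_policy.ShouldRetry(e))` to `if not retry_policy.ShouldRetry(e)`, as the hunk shows.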
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/pydocumentdb-2.3.1/pydocumentdb.egg-info/PKG-INFO new/pydocumentdb-2.3.2/pydocumentdb.egg-info/PKG-INFO
--- old/pydocumentdb-2.3.1/pydocumentdb.egg-info/PKG-INFO 2017-12-22 01:45:43.000000000 +0100
+++ new/pydocumentdb-2.3.2/pydocumentdb.egg-info/PKG-INFO 2018-05-08 22:39:46.000000000 +0200
@@ -1,6 +1,6 @@
Metadata-Version: 1.1
Name: pydocumentdb
-Version: 2.3.1
+Version: 2.3.2
Summary: Azure DocumentDB Python SDK
Home-page: https://github.com/Azure/azure-documentdb-python
Author: Microsoft
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/pydocumentdb-2.3.1/pydocumentdb.egg-info/SOURCES.txt new/pydocumentdb-2.3.2/pydocumentdb.egg-info/SOURCES.txt
--- old/pydocumentdb-2.3.1/pydocumentdb.egg-info/SOURCES.txt 2017-12-22 01:45:43.000000000 +0100
+++ new/pydocumentdb-2.3.2/pydocumentdb.egg-info/SOURCES.txt 2018-05-08 22:39:46.000000000 +0200
@@ -12,6 +12,7 @@
pydocumentdb/base.py
pydocumentdb/consistent_hash_ring.py
pydocumentdb/constants.py
+pydocumentdb/default_retry_policy.py
pydocumentdb/document_client.py
pydocumentdb/documents.py
pydocumentdb/endpoint_discovery_retry_policy.py
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/pydocumentdb-2.3.1/setup.py new/pydocumentdb-2.3.2/setup.py
--- old/pydocumentdb-2.3.1/setup.py 2017-12-22 00:52:29.000000000 +0100
+++ new/pydocumentdb-2.3.2/setup.py 2018-05-08 00:14:45.000000000 +0200
@@ -4,7 +4,7 @@
import setuptools
setup(name='pydocumentdb',
- version='2.3.1',
+ version='2.3.2',
description='Azure DocumentDB Python SDK',
author="Microsoft",
author_email="askdocdb(a)microsoft.com",
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/pydocumentdb-2.3.1/test/crud_tests.py new/pydocumentdb-2.3.2/test/crud_tests.py
--- old/pydocumentdb-2.3.1/test/crud_tests.py 2017-12-22 00:52:29.000000000 +0100
+++ new/pydocumentdb-2.3.2/test/crud_tests.py 2018-05-08 00:09:38.000000000 +0200
@@ -38,7 +38,7 @@
import pydocumentdb.document_client as document_client
import pydocumentdb.errors as errors
import pydocumentdb.hash_partition_resolver as hash_partition_resolver
-import pydocumentdb.http_constants as http_constants
+from pydocumentdb.http_constants import HttpHeaders, StatusCodes, SubStatusCodes
import pydocumentdb.murmur_hash as murmur_hash
import pydocumentdb.range_partition_resolver as range_partition_resolver
import pydocumentdb.range as partition_range
@@ -140,7 +140,7 @@
# delete database.
client.DeleteDatabase(self.GetDatabaseLink(created_db, is_name_based))
# read database after deletion
- self.__AssertHTTPFailureWithStatus(404,
+ self.__AssertHTTPFailureWithStatus(StatusCodes.NOT_FOUND,
client.ReadDatabase,
self.GetDatabaseLink(created_db, is_name_based))
@@ -213,7 +213,7 @@
# Replacing collection Id should fail.
change_collection = created_collection.copy()
change_collection['id'] = 'try_change_id'
- self.__AssertHTTPFailureWithStatus(400,
+ self.__AssertHTTPFailureWithStatus(StatusCodes.BAD_REQUEST,
client.ReplaceCollection,
self.GetDocumentCollectionLink(created_db, created_collection, is_name_based),
change_collection)
@@ -222,7 +222,7 @@
# delete collection
client.DeleteCollection(self.GetDocumentCollectionLink(created_db, created_collection, is_name_based))
# read collection after deletion
- self.__AssertHTTPFailureWithStatus(404,
+ self.__AssertHTTPFailureWithStatus(StatusCodes.NOT_FOUND,
client.ReadCollection,
self.GetDocumentCollectionLink(created_db, created_collection, is_name_based))
@@ -295,7 +295,7 @@
options = { 'partitionKey': 'NY' }
self.__AssertHTTPFailureWithStatus(
- 400,
+ StatusCodes.BAD_REQUEST,
client.CreateDocument,
self.GetDocumentCollectionLink(created_db, created_collection),
document_definition,
@@ -441,7 +441,7 @@
# For ReadDocument, we require to have the partitionKey to be specified as part of options otherwise we get BadRequest(status code 400)
self.__AssertHTTPFailureWithStatus(
- 400,
+ StatusCodes.BAD_REQUEST,
client.ReadDocument,
self.GetDocumentLink(created_db, created_collection, created_document))
@@ -484,7 +484,7 @@
# For DeleteDocument, we require to have the partitionKey to be specified as part of options otherwise we get BadRequest(status code 400)
self.__AssertHTTPFailureWithStatus(
- 400,
+ StatusCodes.BAD_REQUEST,
client.DeleteDocument,
self.GetDocumentLink(created_db, created_collection, upserted_document))
@@ -596,7 +596,7 @@
# Create document in read_collection should fail since it has only read permissions for this collection
self.__AssertHTTPFailureWithStatus(
- 403,
+ StatusCodes.FORBIDDEN,
restricted_client.CreateDocument,
self.GetDocumentCollectionLink(created_db, read_collection, False),
document_definition)
@@ -613,7 +613,7 @@
options = { 'partitionKey': document_definition.get('key') }
# Create document should fail since the partitionKey is 2 which is different that what is specified as resourcePartitionKey in permission object
self.__AssertHTTPFailureWithStatus(
- 403,
+ StatusCodes.FORBIDDEN,
restricted_client.CreateDocument,
self.GetDocumentCollectionLink(created_db, all_collection, False),
document_definition,
@@ -628,7 +628,7 @@
# Delete document in read_collection should fail since it has only read permissions for this collection
self.__AssertHTTPFailureWithStatus(
- 403,
+ StatusCodes.FORBIDDEN,
restricted_client.DeleteDocument,
self.GetDocumentCollectionLink(created_db, read_collection, False),
options)
@@ -673,7 +673,7 @@
# Partiton Key value different than what is specified in the stored procedure body will cause a bad request(400) error
self.__AssertHTTPFailureWithStatus(
- 400,
+ StatusCodes.BAD_REQUEST,
client.ExecuteStoredProcedure,
self.GetStoredProcedureLink(created_db, created_collection, created_sproc),
None,
@@ -746,7 +746,7 @@
# Currently, we require to have the partitionKey to be specified as part of options otherwise we get BadRequest(status code 400)
self.__AssertHTTPFailureWithStatus(
- 400,
+ StatusCodes.BAD_REQUEST,
client.CreateAttachmentAndUploadMedia,
self.GetDocumentLink(db, collection, document),
content_stream,
@@ -782,7 +782,7 @@
'contentType': 'application/text' }
# Currently, we require to have the partitionKey to be specified as part of options otherwise we get BadRequest(status code 400)
self.__AssertHTTPFailureWithStatus(
- 400,
+ StatusCodes.BAD_REQUEST,
client.UpsertAttachmentAndUploadMedia,
self.GetDocumentLink(db, collection, document),
content_stream,
@@ -817,7 +817,7 @@
# Currently, we require to have the partitionKey to be specified as part of options otherwise we get BadRequest(status code 400)
self.__AssertHTTPFailureWithStatus(
- 400,
+ StatusCodes.BAD_REQUEST,
client.CreateAttachment,
self.GetDocumentLink(db, collection, document),
dynamic_attachment)
@@ -838,7 +838,7 @@
# Currently, we require to have the partitionKey to be specified as part of options otherwise we get BadRequest(status code 400)
self.__AssertHTTPFailureWithStatus(
- 400,
+ StatusCodes.BAD_REQUEST,
client.ReadAttachment,
self.GetAttachmentLink(db, collection, document, attachment))
@@ -853,7 +853,7 @@
# Currently, we require to have the partitionKey to be specified as part of options otherwise we get BadRequest(status code 400)
self.__AssertHTTPFailureWithStatus(
- 400,
+ StatusCodes.BAD_REQUEST,
client.ReplaceAttachment,
self.GetAttachmentLink(db, collection, document, attachment),
attachment)
@@ -870,7 +870,7 @@
# Currently, we require to have the partitionKey to be specified as part of options otherwise we get BadRequest(status code 400)
self.__AssertHTTPFailureWithStatus(
- 400,
+ StatusCodes.BAD_REQUEST,
client.UpsertAttachment,
self.GetDocumentLink(db, collection, document),
attachment)
@@ -904,7 +904,7 @@
# Currently, we require to have the partitionKey to be specified as part of options otherwise we get BadRequest(status code 400)
self.__AssertHTTPFailureWithStatus(
- 400,
+ StatusCodes.BAD_REQUEST,
client.DeleteAttachment,
self.GetAttachmentLink(db, collection, document, attachment))
@@ -1009,14 +1009,14 @@
# Currently, we require to have the partitionKey to be specified as part of options otherwise we get BadRequest(status code 400)
self.__AssertHTTPFailureWithStatus(
- 400,
+ StatusCodes.BAD_REQUEST,
client.ReadConflict,
self.GetConflictLink(created_db, created_collection, conflict_definition))
# read conflict here will return resource not found(404) since there is no conflict here
options = { 'partitionKey': conflict_definition.get('id') }
self.__AssertHTTPFailureWithStatus(
- 404,
+ StatusCodes.NOT_FOUND,
client.ReadConflict,
self.GetConflictLink(created_db, created_collection, conflict_definition),
options)
@@ -1027,14 +1027,14 @@
# Currently, we require to have the partitionKey to be specified as part of options otherwise we get BadRequest(status code 400)
self.__AssertHTTPFailureWithStatus(
- 400,
+ StatusCodes.BAD_REQUEST,
client.DeleteConflict,
self.GetConflictLink(created_db, created_collection, conflict_definition))
# delete conflict here will return resource not found(404) since there is no conflict here
options = { 'partitionKey': conflict_definition.get('id') }
self.__AssertHTTPFailureWithStatus(
- 404,
+ StatusCodes.NOT_FOUND,
client.DeleteConflict,
self.GetConflictLink(created_db, created_collection, conflict_definition),
options)
@@ -1096,7 +1096,7 @@
'key': 'value'}
# Should throw an error because automatic id generation is disabled.
self.__AssertHTTPFailureWithStatus(
- 400,
+ StatusCodes.BAD_REQUEST,
client.CreateDocument,
self.GetDocumentCollectionLink(created_db, created_collection, is_name_based),
document_definition,
@@ -1120,7 +1120,7 @@
# duplicated documents are not allowed when 'id' is provided.
duplicated_definition_with_id = document_definition.copy()
duplicated_definition_with_id['id'] = created_document['id']
- self.__AssertHTTPFailureWithStatus(409,
+ self.__AssertHTTPFailureWithStatus(StatusCodes.CONFLICT,
client.CreateDocument,
self.GetDocumentCollectionLink(created_db, created_collection, is_name_based),
duplicated_definition_with_id)
@@ -1174,7 +1174,7 @@
# delete document
client.DeleteDocument(self.GetDocumentLink(created_db, created_collection, replaced_document, is_name_based))
# read documents after deletion
- self.__AssertHTTPFailureWithStatus(404,
+ self.__AssertHTTPFailureWithStatus(StatusCodes.NOT_FOUND,
client.ReadDocument,
self.GetDocumentLink(created_db, created_collection, replaced_document, is_name_based))
@@ -1833,7 +1833,7 @@
# create attachment with invalid content-type
content_stream = ReadableStream()
self.__AssertHTTPFailureWithStatus(
- 400,
+ StatusCodes.BAD_REQUEST,
client.CreateAttachmentAndUploadMedia,
self.GetDocumentLink(db, collection, document, is_name_based),
content_stream,
@@ -1849,7 +1849,7 @@
content_stream = ReadableStream()
# create colliding attachment
self.__AssertHTTPFailureWithStatus(
- 409,
+ StatusCodes.CONFLICT,
client.CreateAttachmentAndUploadMedia,
self.GetDocumentLink(db, collection, document, is_name_based),
content_stream,
@@ -2154,7 +2154,7 @@
# delete user
client.DeleteUser(self.GetUserLink(db, user, is_name_based))
# read user after deletion
- self.__AssertHTTPFailureWithStatus(404,
+ self.__AssertHTTPFailureWithStatus(StatusCodes.NOT_FOUND,
client.ReadUser,
self.GetUserLink(db, user, is_name_based))
@@ -2279,7 +2279,7 @@
# delete permission
client.DeletePermission(self.GetPermissionLink(db, user, replaced_permission, is_name_based))
# read permission after deletion
- self.__AssertHTTPFailureWithStatus(404,
+ self.__AssertHTTPFailureWithStatus(StatusCodes.NOT_FOUND,
client.ReadPermission,
self.GetPermissionLink(db, user, permission, is_name_based))
@@ -2462,7 +2462,7 @@
# Client without any authorization will fail.
client = document_client.DocumentClient(CRUDTests.host, {})
- self.__AssertHTTPFailureWithStatus(401,
+ self.__AssertHTTPFailureWithStatus(StatusCodes.UNAUTHORIZED,
list,
client.ReadDatabases())
# Client with master key.
@@ -2481,7 +2481,7 @@
success_coll1 = col1_client.ReadCollection(
entities['coll1']['_self'])
# 2. Failure-- Use Col1 Permission to delete
- self.__AssertHTTPFailureWithStatus(403,
+ self.__AssertHTTPFailureWithStatus(StatusCodes.FORBIDDEN,
col1_client.DeleteCollection,
success_coll1['_self'])
# 3. Success-- Use Col1 Permission to Read All Docs
@@ -2588,7 +2588,7 @@
# delete trigger
res = client.DeleteTrigger(self.GetTriggerLink(db, collection, replaced_trigger, is_name_based))
# read triggers after deletion
- self.__AssertHTTPFailureWithStatus(404,
+ self.__AssertHTTPFailureWithStatus(StatusCodes.NOT_FOUND,
client.ReadTrigger,
self.GetTriggerLink(db, collection, replaced_trigger, is_name_based))
@@ -2746,7 +2746,7 @@
# delete udf
res = client.DeleteUserDefinedFunction(self.GetUserDefinedFunctionLink(db, collection, replaced_udf, is_name_based))
# read udfs after deletion
- self.__AssertHTTPFailureWithStatus(404,
+ self.__AssertHTTPFailureWithStatus(StatusCodes.NOT_FOUND,
client.ReadUserDefinedFunction,
self.GetUserDefinedFunctionLink(db, collection, replaced_udf, is_name_based))
@@ -2910,7 +2910,7 @@
# delete sproc
res = client.DeleteStoredProcedure(self.GetStoredProcedureLink(db, collection, replaced_sproc, is_name_based))
# read sprocs after deletion
- self.__AssertHTTPFailureWithStatus(404,
+ self.__AssertHTTPFailureWithStatus(StatusCodes.NOT_FOUND,
client.ReadStoredProcedure,
self.GetStoredProcedureLink(db, collection, replaced_sproc, is_name_based))
@@ -3036,19 +3036,19 @@
result = client.ExecuteStoredProcedure(self.GetStoredProcedureLink(created_db, created_collection, created_sproc), None)
self.assertEqual(result, 'Success!')
- self.assertFalse(http_constants.HttpHeaders.ScriptLogResults in client.last_response_headers)
+ self.assertFalse(HttpHeaders.ScriptLogResults in client.last_response_headers)
options = { 'enableScriptLogging': True }
result = client.ExecuteStoredProcedure(self.GetStoredProcedureLink(created_db, created_collection, created_sproc), None, options)
self.assertEqual(result, 'Success!')
- self.assertEqual('The value of x is 1.', client.last_response_headers.get(http_constants.HttpHeaders.ScriptLogResults))
+ self.assertEqual('The value of x is 1.', client.last_response_headers.get(HttpHeaders.ScriptLogResults))
options = { 'enableScriptLogging': False }
result = client.ExecuteStoredProcedure(self.GetStoredProcedureLink(created_db, created_collection, created_sproc), None, options)
self.assertEqual(result, 'Success!')
- self.assertFalse(http_constants.HttpHeaders.ScriptLogResults in client.last_response_headers)
+ self.assertFalse(HttpHeaders.ScriptLogResults in client.last_response_headers)
client.DeleteCollection(self.GetDocumentCollectionLink(created_db, created_collection))
@@ -3588,11 +3588,11 @@
self.assertEqual(expected_offer.get('_self'), query_one_offer.get('_self'))
self.assertEqual(expected_offer.get('resource'), query_one_offer.get('resource'))
# Expects an exception when reading offer with bad offer link.
- self.__AssertHTTPFailureWithStatus(400, client.ReadOffer, expected_offer.get('_self')[:-1] + 'x')
+ self.__AssertHTTPFailureWithStatus(StatusCodes.BAD_REQUEST, client.ReadOffer, expected_offer.get('_self')[:-1] + 'x')
# Now delete the collection.
client.DeleteCollection(collection.get('_self'))
# Reading fails.
- self.__AssertHTTPFailureWithStatus(404, client.ReadOffer, expected_offer.get('_self'))
+ self.__AssertHTTPFailureWithStatus(StatusCodes.NOT_FOUND, client.ReadOffer, expected_offer.get('_self'))
# Read feed now returns 0 results.
offers = list(client.ReadOffers())
self.assertEqual(initial_count, len(offers))
@@ -3622,18 +3622,18 @@
offer_to_replace_bad_id = dict(offer_to_replace)
offer_to_replace_bad_id['_rid'] = 'NotAllowed'
self.__AssertHTTPFailureWithStatus(
- 400, client.ReplaceOffer, offer_to_replace_bad_id['_self'], offer_to_replace_bad_id)
+ StatusCodes.BAD_REQUEST, client.ReplaceOffer, offer_to_replace_bad_id['_self'], offer_to_replace_bad_id)
# Expects an exception when replacing an offer with bad rid.
offer_to_replace_bad_rid = dict(offer_to_replace)
offer_to_replace_bad_rid['_rid'] = 'InvalidRid'
self.__AssertHTTPFailureWithStatus(
- 400, client.ReplaceOffer, offer_to_replace_bad_rid['_self'], offer_to_replace_bad_rid)
+ StatusCodes.BAD_REQUEST, client.ReplaceOffer, offer_to_replace_bad_rid['_self'], offer_to_replace_bad_rid)
# Expects an exception when replaceing an offer with null id and rid.
offer_to_replace_null_ids = dict(offer_to_replace)
offer_to_replace_null_ids['id'] = None
offer_to_replace_null_ids['_rid'] = None
self.__AssertHTTPFailureWithStatus(
- 400, client.ReplaceOffer, offer_to_replace_null_ids['_self'], offer_to_replace_null_ids)
+ StatusCodes.BAD_REQUEST, client.ReplaceOffer, offer_to_replace_null_ids['_self'], offer_to_replace_null_ids)
def test_collection_with_offer_type(self):
client = document_client.DocumentClient(CRUDTests.host,
@@ -3672,18 +3672,18 @@
database_account = client.GetDatabaseAccount()
self.assertEqual(database_account.DatabasesLink, '/dbs/')
self.assertEqual(database_account.MediaLink, '/media/')
- if (http_constants.HttpHeaders.MaxMediaStorageUsageInMB in
+ if (HttpHeaders.MaxMediaStorageUsageInMB in
client.last_response_headers):
self.assertEqual(
database_account.MaxMediaStorageUsageInMB,
client.last_response_headers[
- http_constants.HttpHeaders.MaxMediaStorageUsageInMB])
- if (http_constants.HttpHeaders.CurrentMediaStorageUsageInMB in
+ HttpHeaders.MaxMediaStorageUsageInMB])
+ if (HttpHeaders.CurrentMediaStorageUsageInMB in
client.last_response_headers):
self.assertEqual(
database_account.CurrentMediaStorageUsageInMB,
client.last_response_headers[
- http_constants.HttpHeaders.
+ HttpHeaders.
CurrentMediaStorageUsageInMB])
self.assertTrue(
database_account.ConsistencyPolicy['defaultConsistencyLevel']
@@ -3700,24 +3700,24 @@
created_db = client.CreateDatabase({ 'id': CRUDTests.testDbName })
consistent_coll = client.CreateCollection(self.GetDatabaseLink(created_db, is_name_based), { 'id': 'consistent_coll' })
client.ReadCollection(self.GetDocumentCollectionLink(created_db, consistent_coll, is_name_based))
- self.assertFalse(http_constants.HttpHeaders.LazyIndexingProgress in client.last_response_headers)
- self.assertTrue(http_constants.HttpHeaders.IndexTransformationProgress in client.last_response_headers)
+ self.assertFalse(HttpHeaders.LazyIndexingProgress in client.last_response_headers)
+ self.assertTrue(HttpHeaders.IndexTransformationProgress in client.last_response_headers)
lazy_coll = client.CreateCollection(self.GetDatabaseLink(created_db, is_name_based),
{
'id': 'lazy_coll',
'indexingPolicy': { 'indexingMode' : documents.IndexingMode.Lazy }
})
client.ReadCollection(self.GetDocumentCollectionLink(created_db, lazy_coll, is_name_based))
- self.assertTrue(http_constants.HttpHeaders.LazyIndexingProgress in client.last_response_headers)
- self.assertTrue(http_constants.HttpHeaders.IndexTransformationProgress in client.last_response_headers)
+ self.assertTrue(HttpHeaders.LazyIndexingProgress in client.last_response_headers)
+ self.assertTrue(HttpHeaders.IndexTransformationProgress in client.last_response_headers)
none_coll = client.CreateCollection(self.GetDatabaseLink(created_db, is_name_based),
{
'id': 'none_coll',
'indexingPolicy': { 'indexingMode': documents.IndexingMode.NoIndex, 'automatic': False }
})
client.ReadCollection(self.GetDocumentCollectionLink(created_db, none_coll, is_name_based))
- self.assertFalse(http_constants.HttpHeaders.LazyIndexingProgress in client.last_response_headers)
- self.assertTrue(http_constants.HttpHeaders.IndexTransformationProgress in client.last_response_headers)
+ self.assertFalse(HttpHeaders.LazyIndexingProgress in client.last_response_headers)
+ self.assertTrue(HttpHeaders.IndexTransformationProgress in client.last_response_headers)
# To run this test, please provide your own CA certs file or download one from
# http://curl.haxx.se/docs/caextract.html
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/pydocumentdb-2.3.1/test/globaldb_mock_tests.py new/pydocumentdb-2.3.2/test/globaldb_mock_tests.py
--- old/pydocumentdb-2.3.1/test/globaldb_mock_tests.py 2017-12-22 00:52:29.000000000 +0100
+++ new/pydocumentdb-2.3.2/test/globaldb_mock_tests.py 2018-05-08 00:09:38.000000000 +0200
@@ -26,6 +26,7 @@
import pydocumentdb.documents as documents
import pydocumentdb.errors as errors
import pydocumentdb.constants as constants
+from pydocumentdb.http_constants import StatusCodes
import pydocumentdb.global_endpoint_manager as global_endpoint_manager
import pydocumentdb.retry_utility as retry_utility
import test.test_config as test_config
@@ -149,10 +150,10 @@
else:
self.endpoint_discovery_retry_count += 1
location_changed = True
- raise errors.HTTPFailure(403, "Forbidden", {'x-ms-substatus' : 3})
+ raise errors.HTTPFailure(StatusCodes.FORBIDDEN, "Forbidden", {'x-ms-substatus' : 3})
def MockGetDatabaseAccountStub(self, endpoint):
- raise errors.HTTPFailure(503, "Service unavailable")
+ raise errors.HTTPFailure(StatusCodes.SERVICE_UNAVAILABLE, "Service unavailable")
def MockCreateDatabase(self, client, database):
self.OriginalExecuteFunction = retry_utility._ExecuteFunction
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/pydocumentdb-2.3.1/test/globaldb_tests.py new/pydocumentdb-2.3.2/test/globaldb_tests.py
--- old/pydocumentdb-2.3.1/test/globaldb_tests.py 2017-12-22 00:52:29.000000000 +0100
+++ new/pydocumentdb-2.3.2/test/globaldb_tests.py 2018-05-08 00:09:38.000000000 +0200
@@ -31,7 +31,7 @@
import pydocumentdb.global_endpoint_manager as global_endpoint_manager
import pydocumentdb.endpoint_discovery_retry_policy as endpoint_discovery_retry_policy
import pydocumentdb.retry_utility as retry_utility
-import pydocumentdb.http_constants as http_constants
+from pydocumentdb.http_constants import HttpHeaders, StatusCodes, SubStatusCodes
import test.test_config as test_config
#IMPORTANT NOTES:
@@ -124,7 +124,7 @@
time.sleep(5)
client.ReadDocument(created_document['_self'])
- content_location = str(client.last_response_headers[http_constants.HttpHeaders.ContentLocation])
+ content_location = str(client.last_response_headers[HttpHeaders.ContentLocation])
content_location_url = urlparse(content_location)
host_url = urlparse(Test_globaldb_tests.host)
@@ -146,7 +146,7 @@
time.sleep(5)
client.ReadDocument(created_document['_self'])
- content_location = str(client.last_response_headers[http_constants.HttpHeaders.ContentLocation])
+ content_location = str(client.last_response_headers[HttpHeaders.ContentLocation])
content_location_url = urlparse(content_location)
write_location_url = urlparse(Test_globaldb_tests.write_location_host)
@@ -168,8 +168,8 @@
# Create Document will fail for the read location client since it has EnableEndpointDiscovery set to false, and hence the request will directly go to
# the endpoint that was used to create the client instance(which happens to be a read endpoint)
self.__AssertHTTPFailureWithStatus(
- 403,
- 3,
+ StatusCodes.FORBIDDEN,
+ SubStatusCodes.WRITE_FORBIDDEN,
read_location_client.CreateDocument,
self.test_coll['_self'],
document_definition)
@@ -206,7 +206,7 @@
time.sleep(5)
client.ReadDocument(created_document['_self'])
- content_location = str(client.last_response_headers[http_constants.HttpHeaders.ContentLocation])
+ content_location = str(client.last_response_headers[HttpHeaders.ContentLocation])
content_location_url = urlparse(content_location)
write_location_url = urlparse(Test_globaldb_tests.write_location_host)
@@ -225,7 +225,7 @@
time.sleep(5)
client.ReadDocument(created_document['_self'])
- content_location = str(client.last_response_headers[http_constants.HttpHeaders.ContentLocation])
+ content_location = str(client.last_response_headers[HttpHeaders.ContentLocation])
content_location_url = urlparse(content_location)
read_location2_url = urlparse(Test_globaldb_tests.read_location2_host)
@@ -372,8 +372,8 @@
'key': 'value'}
self.__AssertHTTPFailureWithStatus(
- 403,
- 3,
+ StatusCodes.FORBIDDEN,
+ SubStatusCodes.WRITE_FORBIDDEN,
client.CreateDocument,
self.test_coll['_self'],
document_definition)
@@ -381,7 +381,7 @@
retry_utility._ExecuteFunction = self.OriginalExecuteFunction
def _MockExecuteFunction(self, function, *args, **kwargs):
- raise errors.HTTPFailure(403, "Write Forbidden", {'x-ms-substatus' : 3})
+ raise errors.HTTPFailure(StatusCodes.FORBIDDEN, "Write Forbidden", {'x-ms-substatus' : SubStatusCodes.WRITE_FORBIDDEN})
def _MockGetDatabaseAccount(self, url_conection):
database_account = documents.DatabaseAccount()
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/pydocumentdb-2.3.1/test/query_tests.py new/pydocumentdb-2.3.2/test/query_tests.py
--- old/pydocumentdb-2.3.1/test/query_tests.py 2017-12-22 00:52:29.000000000 +0100
+++ new/pydocumentdb-2.3.2/test/query_tests.py 2018-05-08 00:09:38.000000000 +0200
@@ -21,7 +21,7 @@
test_db = next(it, None)
if test_db is not None:
- client.DeleteDatabase("/dbs/" + cls.testDbName + "/")
+ client.DeleteDatabase("/dbs/" + cls.testDbName + "/")
""" change """
@classmethod
@@ -38,10 +38,10 @@
collection_options = { 'offerThroughput': 10100 }
created_collection = client.CreateCollection(created_db['_self'], collection_definition, collection_options)
- def test_first_and_last_slashes_trimmed_for_query_string (self):
document_definition = {'pk': 'pk', 'id':'myId'}
created_doc = client.CreateDocument(created_collection['_self'], document_definition)
+ def test_first_and_last_slashes_trimmed_for_query_string (self):
query_options = {'partitionKey': 'pk'}
collectionLink = '/dbs/' + self.testDbName + '/colls/' + self.testCollectionName + '/'
query = 'SELECT * from ' + self.testCollectionName
@@ -50,3 +50,20 @@
iter_list = list(query_iterable)
self.assertEqual(iter_list[0]['id'], 'myId')
+ def test_populate_query_metrics (self):
+ query_options = {'partitionKey': 'pk',
+ 'populateQueryMetrics': True}
+ collectionLink = '/dbs/' + self.testDbName + '/colls/' + self.testCollectionName + '/'
+ query = 'SELECT * from ' + self.testCollectionName
+ query_iterable = client.QueryDocuments(collectionLink, query, query_options)
+
+ iter_list = list(query_iterable)
+ self.assertEqual(iter_list[0]['id'], 'myId')
+
+ METRICS_HEADER_NAME = 'x-ms-documentdb-query-metrics'
+ self.assertTrue(METRICS_HEADER_NAME in client.last_response_headers)
+ metrics_header = client.last_response_headers[METRICS_HEADER_NAME]
+ # Validate header is well-formed: "key1=value1;key2=value2;etc"
+ metrics = metrics_header.split(';')
+ self.assertTrue(len(metrics) > 1)
+ self.assertTrue(all(['=' in x for x in metrics]))
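The new `test_populate_query_metrics` above validates that the `x-ms-documentdb-query-metrics` header is a semicolon-separated list of `key=value` pairs. A minimal sketch of parsing that format into a dict — the helper name and the sample metric keys are illustrative, not part of pydocumentdb:

```python
def parse_query_metrics(header_value):
    """Parse a 'key1=value1;key2=value2' metrics header into a dict.

    Pairs without '=' (e.g. from trailing semicolons) are skipped,
    mirroring the well-formedness check in the test above.
    """
    metrics = {}
    for pair in header_value.split(';'):
        if '=' in pair:
            key, _, value = pair.partition('=')
            metrics[key] = value
    return metrics

# Hypothetical sample value in the shape the test asserts:
sample = 'totalExecutionTimeInMs=33.67;queryCompileTimeInMs=0.06'
parsed = parse_query_metrics(sample)
```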
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/pydocumentdb-2.3.1/test/retry_policy_tests.py new/pydocumentdb-2.3.2/test/retry_policy_tests.py
--- old/pydocumentdb-2.3.1/test/retry_policy_tests.py 2017-12-22 00:52:29.000000000 +0100
+++ new/pydocumentdb-2.3.2/test/retry_policy_tests.py 2018-05-08 00:09:38.000000000 +0200
@@ -25,7 +25,7 @@
import pydocumentdb.documents as documents
import pydocumentdb.errors as errors
import pydocumentdb.retry_options as retry_options
-import pydocumentdb.http_constants as http_constants
+from pydocumentdb.http_constants import HttpHeaders, StatusCodes, SubStatusCodes
import pydocumentdb.retry_utility as retry_utility
import test.test_config as test_config
@@ -44,6 +44,7 @@
masterKey = test_config._test_config.masterKey
test_db_name = 'sample database'
test_coll_name = 'sample collection'
+ counter = 0;
def __AssertHTTPFailureWithStatus(self, status_code, func, *args, **kwargs):
"""Assert HTTP failure with status.
@@ -115,9 +116,9 @@
try:
client.CreateDocument(self.created_collection['_self'], document_definition)
except errors.HTTPFailure as e:
- self.assertEqual(e.status_code, 429)
- self.assertEqual(connection_policy.RetryOptions.MaxRetryAttemptCount, client.last_response_headers[http_constants.HttpHeaders.ThrottleRetryCount])
- self.assertGreaterEqual(client.last_response_headers[http_constants.HttpHeaders.ThrottleRetryWaitTimeInMs], connection_policy.RetryOptions.MaxRetryAttemptCount * self.retry_after_in_milliseconds)
+ self.assertEqual(e.status_code, StatusCodes.TOO_MANY_REQUESTS)
+ self.assertEqual(connection_policy.RetryOptions.MaxRetryAttemptCount, client.last_response_headers[HttpHeaders.ThrottleRetryCount])
+ self.assertGreaterEqual(client.last_response_headers[HttpHeaders.ThrottleRetryWaitTimeInMs], connection_policy.RetryOptions.MaxRetryAttemptCount * self.retry_after_in_milliseconds)
retry_utility._ExecuteFunction = self.OriginalExecuteFunction
@@ -138,9 +139,9 @@
try:
client.CreateDocument(self.created_collection['_self'], document_definition)
except errors.HTTPFailure as e:
- self.assertEqual(e.status_code, 429)
- self.assertEqual(connection_policy.RetryOptions.MaxRetryAttemptCount, client.last_response_headers[http_constants.HttpHeaders.ThrottleRetryCount])
- self.assertGreaterEqual(client.last_response_headers[http_constants.HttpHeaders.ThrottleRetryWaitTimeInMs], connection_policy.RetryOptions.MaxRetryAttemptCount * connection_policy.RetryOptions.FixedRetryIntervalInMilliseconds)
+ self.assertEqual(e.status_code, StatusCodes.TOO_MANY_REQUESTS)
+ self.assertEqual(connection_policy.RetryOptions.MaxRetryAttemptCount, client.last_response_headers[HttpHeaders.ThrottleRetryCount])
+ self.assertGreaterEqual(client.last_response_headers[HttpHeaders.ThrottleRetryWaitTimeInMs], connection_policy.RetryOptions.MaxRetryAttemptCount * connection_policy.RetryOptions.FixedRetryIntervalInMilliseconds)
retry_utility._ExecuteFunction = self.OriginalExecuteFunction
@@ -161,8 +162,8 @@
try:
client.CreateDocument(self.created_collection['_self'], document_definition)
except errors.HTTPFailure as e:
- self.assertEqual(e.status_code, 429)
- self.assertGreaterEqual(client.last_response_headers[http_constants.HttpHeaders.ThrottleRetryWaitTimeInMs], connection_policy.RetryOptions.MaxWaitTimeInSeconds * 1000)
+ self.assertEqual(e.status_code, StatusCodes.TOO_MANY_REQUESTS)
+ self.assertGreaterEqual(client.last_response_headers[HttpHeaders.ThrottleRetryWaitTimeInMs], connection_policy.RetryOptions.MaxWaitTimeInSeconds * 1000)
retry_utility._ExecuteFunction = self.OriginalExecuteFunction
@@ -191,14 +192,99 @@
]
}))
except errors.HTTPFailure as e:
- self.assertEqual(e.status_code, 429)
- self.assertEqual(connection_policy.RetryOptions.MaxRetryAttemptCount, client.last_response_headers[http_constants.HttpHeaders.ThrottleRetryCount])
- self.assertGreaterEqual(client.last_response_headers[http_constants.HttpHeaders.ThrottleRetryWaitTimeInMs], connection_policy.RetryOptions.MaxRetryAttemptCount * self.retry_after_in_milliseconds)
+ self.assertEqual(e.status_code, StatusCodes.TOO_MANY_REQUESTS)
+ self.assertEqual(connection_policy.RetryOptions.MaxRetryAttemptCount, client.last_response_headers[HttpHeaders.ThrottleRetryCount])
+ self.assertGreaterEqual(client.last_response_headers[HttpHeaders.ThrottleRetryWaitTimeInMs], connection_policy.RetryOptions.MaxRetryAttemptCount * self.retry_after_in_milliseconds)
+
+ retry_utility._ExecuteFunction = self.OriginalExecuteFunction
+
+ def test_default_retry_policy_for_query(self):
+ connection_policy = documents.ConnectionPolicy()
+
+ client = document_client.DocumentClient(Test_retry_policy_tests.host, {'masterKey': Test_retry_policy_tests.masterKey}, connection_policy)
+
+ document_definition_1 = { 'id': 'doc1',
+ 'name': 'sample document',
+ 'key': 'value'}
+ document_definition_2 = { 'id': 'doc2',
+ 'name': 'sample document',
+ 'key': 'value'}
+
+ client.CreateDocument(self.created_collection['_self'], document_definition_1)
+ client.CreateDocument(self.created_collection['_self'], document_definition_2)
+
+ self.OriginalExecuteFunction = retry_utility._ExecuteFunction
+ retry_utility._ExecuteFunction = self._MockExecuteFunctionConnectionReset
+
+ docs = client.QueryDocuments(self.created_collection['_self'], "Select * from c", {'maxItemCount':1})
+
+ result_docs = list(docs)
+ self.assertEqual(result_docs[0]['id'], 'doc1')
+ self.assertEqual(result_docs[1]['id'], 'doc2')
+ self.assertEqual(self.counter, 12)
+
+ self.counter = 0;
+ retry_utility._ExecuteFunction = self.OriginalExecuteFunction
+
+ client.DeleteDocument(result_docs[0]['_self'])
+ client.DeleteDocument(result_docs[1]['_self'])
+
+ def test_default_retry_policy_for_read(self):
+ connection_policy = documents.ConnectionPolicy()
+
+ client = document_client.DocumentClient(Test_retry_policy_tests.host, {'masterKey': Test_retry_policy_tests.masterKey}, connection_policy)
+
+ document_definition = { 'id': 'doc',
+ 'name': 'sample document',
+ 'key': 'value'}
+
+ created_document = client.CreateDocument(self.created_collection['_self'], document_definition)
+
+ self.OriginalExecuteFunction = retry_utility._ExecuteFunction
+ retry_utility._ExecuteFunction = self._MockExecuteFunctionConnectionReset
+
+ doc = client.ReadDocument(created_document['_self'], {})
+ self.assertEqual(doc['id'], 'doc')
+ self.assertEqual(self.counter, 3)
+
+ self.counter = 0;
+ retry_utility._ExecuteFunction = self.OriginalExecuteFunction
+
+ client.DeleteDocument(doc['_self'])
+
+ def test_default_retry_policy_for_create(self):
+ connection_policy = documents.ConnectionPolicy()
+
+ client = document_client.DocumentClient(Test_retry_policy_tests.host, {'masterKey': Test_retry_policy_tests.masterKey}, connection_policy)
+
+ document_definition = { 'id': 'doc',
+ 'name': 'sample document',
+ 'key': 'value'}
+
+ self.OriginalExecuteFunction = retry_utility._ExecuteFunction
+ retry_utility._ExecuteFunction = self._MockExecuteFunctionConnectionReset
+
+ created_document = {}
+ try :
+ created_document = client.CreateDocument(self.created_collection['_self'], document_definition)
+ except Exception as err:
+ self.assertEqual(err.status_code, 10054)
+
+ self.assertDictEqual(created_document, {})
+ self.assertEqual(self.counter, 7)
retry_utility._ExecuteFunction = self.OriginalExecuteFunction
def _MockExecuteFunction(self, function, *args, **kwargs):
- raise errors.HTTPFailure(429, "Request rate is too large", {http_constants.HttpHeaders.RetryAfterInMilliseconds: self.retry_after_in_milliseconds})
+ raise errors.HTTPFailure(StatusCodes.TOO_MANY_REQUESTS, "Request rate is too large", {HttpHeaders.RetryAfterInMilliseconds: self.retry_after_in_milliseconds})
+
+ def _MockExecuteFunctionConnectionReset(self, function, *args, **kwargs):
+ self.counter += 1;
+
+ if self.counter % 3 == 0:
+ return self.OriginalExecuteFunction(function, *args, **kwargs)
+ else:
+ raise errors.HTTPFailure(10054, "Connection was reset", {})
if __name__ == '__main__':
unittest.main()
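The `_MockExecuteFunctionConnectionReset` hook added above raises error 10054 ("connection reset") on every call except each third one, which it passes through. That is why the new tests expect `counter == 3` for a single read (two resets plus one success) and `counter == 12` for a query making four round trips. A self-contained sketch of the same pattern, outside the pydocumentdb retry machinery:

```python
class ConnectionResetMock:
    """Fail twice, succeed on every third call -- the diff's mock pattern."""

    def __init__(self, real_function):
        self.counter = 0
        self.real_function = real_function

    def __call__(self, *args, **kwargs):
        self.counter += 1
        if self.counter % 3 == 0:
            return self.real_function(*args, **kwargs)
        # 10054 is the Windows WSAECONNRESET code used in the diff above.
        raise ConnectionResetError(10054, "Connection was reset")

mock = ConnectionResetMock(lambda: "ok")
results = []
for _ in range(3):
    try:
        results.append(mock())
    except ConnectionResetError:
        pass  # a retry policy would re-issue the call here
```

After three calls the mock has absorbed two resets and delivered one result, so one successful operation costs three invocations.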
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/pydocumentdb-2.3.1/test/ttl_tests.py new/pydocumentdb-2.3.2/test/ttl_tests.py
--- old/pydocumentdb-2.3.1/test/ttl_tests.py 2017-12-22 00:52:29.000000000 +0100
+++ new/pydocumentdb-2.3.2/test/ttl_tests.py 2018-05-08 00:09:38.000000000 +0200
@@ -24,6 +24,7 @@
import pydocumentdb.document_client as document_client
import pydocumentdb.errors as errors
+from pydocumentdb.http_constants import StatusCodes
import test.test_config as test_config
@@ -92,7 +93,7 @@
# None is an unsupported value for defaultTtl. Valid values are -1 or a non-zero positive 32-bit integer value
self.__AssertHTTPFailureWithStatus(
- 400,
+ StatusCodes.BAD_REQUEST,
client.CreateCollection,
created_db['_self'],
collection_definition)
@@ -102,7 +103,7 @@
# 0 is an unsupported value for defaultTtl. Valid values are -1 or a non-zero positive 32-bit integer value
self.__AssertHTTPFailureWithStatus(
- 400,
+ StatusCodes.BAD_REQUEST,
client.CreateCollection,
created_db['_self'],
collection_definition)
@@ -112,7 +113,7 @@
# -10 is an unsupported value for defaultTtl. Valid values are -1 or a non-zero positive 32-bit integer value
self.__AssertHTTPFailureWithStatus(
- 400,
+ StatusCodes.BAD_REQUEST,
client.CreateCollection,
created_db['_self'],
collection_definition)
@@ -124,7 +125,7 @@
# 0 is an unsupported value for ttl. Valid values are -1 or a non-zero positive 32-bit integer value
self.__AssertHTTPFailureWithStatus(
- 400,
+ StatusCodes.BAD_REQUEST,
client.CreateDocument,
created_collection['_self'],
document_definition)
@@ -134,7 +135,7 @@
# None is an unsupported value for ttl. Valid values are -1 or a non-zero positive 32-bit integer value
self.__AssertHTTPFailureWithStatus(
- 400,
+ StatusCodes.BAD_REQUEST,
client.CreateDocument,
created_collection['_self'],
document_definition)
@@ -144,7 +145,7 @@
# -10 is an unsupported value for ttl. Valid values are -1 or a non-zero positive 32-bit integer value
self.__AssertHTTPFailureWithStatus(
- 400,
+ StatusCodes.BAD_REQUEST,
client.CreateDocument,
created_collection['_self'],
document_definition)
@@ -170,7 +171,7 @@
# the created document should be gone now as it's ttl value would be same as defaultTtl value of the collection
self.__AssertHTTPFailureWithStatus(
- 404,
+ StatusCodes.NOT_FOUND,
client.ReadDocument,
created_document['_self'])
@@ -192,7 +193,7 @@
# the created document should be gone now as it's ttl value is set to 2 which overrides the collections's defaultTtl value(5)
self.__AssertHTTPFailureWithStatus(
- 404,
+ StatusCodes.NOT_FOUND,
client.ReadDocument,
created_document['_self'])
@@ -210,7 +211,7 @@
# the created document should be gone now as we have waited for (6+4) secs which is greater than documents's ttl value of 8
self.__AssertHTTPFailureWithStatus(
- 404,
+ StatusCodes.NOT_FOUND,
client.ReadDocument,
created_document['_self'])
@@ -245,7 +246,7 @@
# the created document should be gone now as it's ttl value is set to 2 which overrides the collections's defaultTtl value(-1)
self.__AssertHTTPFailureWithStatus(
- 404,
+ StatusCodes.NOT_FOUND,
client.ReadDocument,
created_document3['_self'])
@@ -299,7 +300,7 @@
# the created document cannot be deleted since it should already be gone now
self.__AssertHTTPFailureWithStatus(
- 404,
+ StatusCodes.NOT_FOUND,
client.DeleteDocument,
created_document['_self'])
@@ -323,7 +324,7 @@
# the upserted document should be gone now after 10 secs from the last write(upsert) of the document
self.__AssertHTTPFailureWithStatus(
- 404,
+ StatusCodes.NOT_FOUND,
client.ReadDocument,
upserted_docment['_self'])
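The single change running through all of the pydocumentdb 2.3.2 test diffs above is mechanical: literal HTTP status integers (400, 403, 404, 429, 503) are replaced by named constants imported from `pydocumentdb.http_constants`. A minimal sketch of that pattern — the classes below only mirror the names used in the diffs and are not the real module:

```python
class StatusCodes:
    """Named HTTP status constants, as pydocumentdb.http_constants exposes."""
    BAD_REQUEST = 400
    FORBIDDEN = 403
    NOT_FOUND = 404
    TOO_MANY_REQUESTS = 429
    SERVICE_UNAVAILABLE = 503

class HTTPFailure(Exception):
    """Simplified stand-in for pydocumentdb.errors.HTTPFailure."""
    def __init__(self, status_code, message=""):
        super().__init__(message)
        self.status_code = status_code

def read_missing_resource():
    # Raising with the named constant instead of a bare 404 makes both
    # the raise site and the assertion self-documenting.
    raise HTTPFailure(StatusCodes.NOT_FOUND, "Resource not found")
```

The constants are value-identical to the old literals, so the diffs change readability only, not behavior.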
Hello community,
here is the log from the commit of package python-msrestazure for openSUSE:Factory checked in at 2018-09-26 16:16:14
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/python-msrestazure (Old)
and /work/SRC/openSUSE:Factory/.python-msrestazure.new (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Package is "python-msrestazure"
Wed Sep 26 16:16:14 2018 rev:6 rq:638008 version:0.5.0
Changes:
--------
--- /work/SRC/openSUSE:Factory/python-msrestazure/python-msrestazure.changes 2018-05-13 16:03:42.677295231 +0200
+++ /work/SRC/openSUSE:Factory/.python-msrestazure.new/python-msrestazure.changes 2018-09-26 16:16:15.915004379 +0200
@@ -1,0 +2,11 @@
+Thu Sep 6 13:00:23 UTC 2018 - John Paul Adrian Glaubitz <adrian.glaubitz(a)suse.com>
+
+- New upstream release
+ + Version 0.5.0
+ + No upstream changelog provided
+- Drop obsolete patches
+ + m_drop-compatible-releases-operator.patch
+- Update Requires from setup.py
+- Update Summary from setup.py
+
+-------------------------------------------------------------------
Old:
----
m_drop-compatible-releases-operator.patch
msrestazure-0.4.28.tar.gz
New:
----
msrestazure-0.5.0.tar.gz
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Other differences:
------------------
++++++ python-msrestazure.spec ++++++
--- /var/tmp/diff_new_pack.G4B7Lk/_old 2018-09-26 16:16:16.323003704 +0200
+++ /var/tmp/diff_new_pack.G4B7Lk/_new 2018-09-26 16:16:16.327003697 +0200
@@ -18,21 +18,20 @@
%{?!python_module:%define python_module() python-%{**} python3-%{**}}
Name: python-msrestazure
-Version: 0.4.28
+Version: 0.5.0
Release: 0
-Summary: AutoRest swagger generator
+Summary: AutoRest swagger generator - Azure-specific module
License: MIT
Group: Development/Languages/Python
Url: https://pypi.python.org/pypi/msrestazure
Source: https://files.pythonhosted.org/packages/source/m/msrestazure/msrestazure-%{…
Source1: LICENSE.md
-Patch0: m_drop-compatible-releases-operator.patch
BuildRequires: %{python_module devel}
BuildRequires: %{python_module setuptools}
BuildRequires: fdupes
BuildRequires: python-rpm-macros
-Requires: python-adal < 1.0.0
-Requires: python-adal >= 0.5.0
+Requires: python-adal < 2.0.0
+Requires: python-adal >= 0.6.0
Requires: python-msrest < 2.0.0
Requires: python-msrest >= 0.4.28
BuildRoot: %{_tmppath}/%{name}-%{version}-build
@@ -45,7 +44,6 @@
%prep
%setup -q -n msrestazure-%{version}
-%patch0 -p1
cp %{SOURCE1} LICENSE.md
%build
++++++ msrestazure-0.4.28.tar.gz -> msrestazure-0.5.0.tar.gz ++++++
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/msrestazure-0.4.28/PKG-INFO new/msrestazure-0.5.0/PKG-INFO
--- old/msrestazure-0.4.28/PKG-INFO 2018-04-24 01:45:43.000000000 +0200
+++ new/msrestazure-0.5.0/PKG-INFO 2018-08-02 21:17:34.000000000 +0200
@@ -1,6 +1,6 @@
Metadata-Version: 1.1
Name: msrestazure
-Version: 0.4.28
+Version: 0.5.0
Summary: AutoRest swagger generator Python client runtime. Azure-specific module.
Home-page: https://github.com/Azure/msrestazure-for-python
Author: Microsoft Corporation
@@ -28,6 +28,87 @@
Release History
---------------
+ 2018-08-02 Version 0.5.0
+ ++++++++++++++++++++++++
+
+ **Features**
+
+ - Implementation is now using ADAL and not request-oauthlib. This allows more AD scenarios (like federated) #94
+ - Add additionalInfo parsing for CloudError #102
+
+ **Breaking changes**
+
+ These breaking changes applies to ServicePrincipalCredentials, UserPassCredentials, AADTokenCredentials
+
+ - Remove "auth_uri" attribute and parameter. This was unused.
+ - Remove "state" attribute. This was unused.
+ - Remove "client" attribute. This was exposed by mistake and should have been internal. No replacement is possible.
+ - Remove "token_uri" attribute and parameter. Use "cloud_environment" and "tenant" to impact the login url now.
+ - Remove token caching based on "keyring". Token caching should be implemented using ADAL now. This implies:
+
+ - Remove the "keyring" parameter
+ - Remove the "clear_cached_token" method
+ - Remove the "retrieve_session" method
+
+ 2018-07-03 Version 0.4.35
+ +++++++++++++++++++++++++
+
+ **Bugfixes**
+
+ - MSIAuthentication regression for KeyVault since IMDS support #109
+
+ 2018-07-02 Version 0.4.34
+ +++++++++++++++++++++++++
+
+ **Bugfixes**
+
+ - MSIAuthentication should initialize the token attribute on creation #106
+
+ 2018-06-21 Version 0.4.33
+ +++++++++++++++++++++++++
+
+ **Bugfixes**
+
+ - Fixes refreshToken in UserPassCredentials and AADTokenCredentials #103
+ - Fix US government cloud definition #104
+
+ Thanks to mjcaley for his contribution
+
+ 2018-06-13 Version 0.4.32
+ +++++++++++++++++++++++++
+
+ **Features**
+
+ - Implement new LRO options of Autorest #101
+
+ **Bug fixes**
+
+ - Reduce max MSI polling time for VM #100
+
+
+ 2018-05-17 Version 0.4.31
+ +++++++++++++++++++++++++
+
+ **Features**
+
+ - Improve MSI for VM token polling algorithm
+
+ 2018-05-16 Version 0.4.30
+ +++++++++++++++++++++++++
+
+ **Features**
+
+ - Allow ADAL 0.5.0 to 2.0.0 excluded as valid ADAL dependency
+
+ 2018-04-30 Version 0.4.29
+ +++++++++++++++++++++++++
+
+ **Bugfixes**
+
+ - Fix refresh Token on `AADTokenCredentials` (was broken in 0.4.27)
+ - Now `UserPasswordCredentials` correctly use the refreshToken, and not user/password to refresh the session (was broken in 0.4.27)
+ - Bring back `keyring`, with minimal dependency 12.0.2 that fixes the installation problem on old Python
+
2018-04-23 Version 0.4.28
+++++++++++++++++++++++++
@@ -35,7 +116,7 @@
Do to some stability issues with "keyring" dependency that highly change from one system to another,
this package is no longer a dependency of "msrestazure".
- If you were using the secured token cache of `ServicePrincipalCredentials` and `UserPassCredentials`,
+ If you were using the secured token cache of `ServicePrincipalCredentials` and `UserPassCredentials`,
the feature is still available, but you need to install manually "keyring". The functionnality will activate automatically.
2018-04-18 Version 0.4.27
@@ -76,7 +157,7 @@
**Bugfix**
- - Fix LRO result if POST uses AsyncOperation header (Autorest.Python 3.0 only) #79
+ - Fix LRO result if POST uses AsyncOperation header (Autorest.Python 3.0 only) #79
2018-02-27 Version 0.4.22
+++++++++++++++++++++++++
@@ -441,5 +522,6 @@
Classifier: Programming Language :: Python :: 3.4
Classifier: Programming Language :: Python :: 3.5
Classifier: Programming Language :: Python :: 3.6
+Classifier: Programming Language :: Python :: 3.7
Classifier: License :: OSI Approved :: MIT License
Classifier: Topic :: Software Development
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/msrestazure-0.4.28/README.rst new/msrestazure-0.5.0/README.rst
--- old/msrestazure-0.4.28/README.rst 2018-04-24 01:45:02.000000000 +0200
+++ new/msrestazure-0.5.0/README.rst 2018-08-02 21:16:52.000000000 +0200
@@ -20,6 +20,87 @@
Release History
---------------
+2018-08-02 Version 0.5.0
+++++++++++++++++++++++++
+
+**Features**
+
+- Implementation is now using ADAL and not request-oauthlib. This allows more AD scenarios (like federated) #94
+- Add additionalInfo parsing for CloudError #102
+
+**Breaking changes**
+
+These breaking changes applies to ServicePrincipalCredentials, UserPassCredentials, AADTokenCredentials
+
+- Remove "auth_uri" attribute and parameter. This was unused.
+- Remove "state" attribute. This was unused.
+- Remove "client" attribute. This was exposed by mistake and should have been internal. No replacement is possible.
+- Remove "token_uri" attribute and parameter. Use "cloud_environment" and "tenant" to impact the login url now.
+- Remove token caching based on "keyring". Token caching should be implemented using ADAL now. This implies:
+
+ - Remove the "keyring" parameter
+ - Remove the "clear_cached_token" method
+ - Remove the "retrieve_session" method
+
+2018-07-03 Version 0.4.35
++++++++++++++++++++++++++
+
+**Bugfixes**
+
+- MSIAuthentication regression for KeyVault since IMDS support #109
+
+2018-07-02 Version 0.4.34
++++++++++++++++++++++++++
+
+**Bugfixes**
+
+- MSIAuthentication should initialize the token attribute on creation #106
+
+2018-06-21 Version 0.4.33
++++++++++++++++++++++++++
+
+**Bugfixes**
+
+- Fix refreshToken in UserPassCredentials and AADTokenCredentials #103
+- Fix US government cloud definition #104
+
+Thanks to mjcaley for his contribution
+
+2018-06-13 Version 0.4.32
++++++++++++++++++++++++++
+
+**Features**
+
+- Implement new LRO options of Autorest #101
+
+**Bug fixes**
+
+- Reduce max MSI polling time for VM #100
+
+
+2018-05-17 Version 0.4.31
++++++++++++++++++++++++++
+
+**Features**
+
+- Improve MSI for VM token polling algorithm
+
+2018-05-16 Version 0.4.30
++++++++++++++++++++++++++
+
+**Features**
+
+- Allow ADAL from 0.5.0 up to (but excluding) 2.0.0 as a valid ADAL dependency
+
+2018-04-30 Version 0.4.29
++++++++++++++++++++++++++
+
+**Bugfixes**
+
+- Fix refresh token on `AADTokenCredentials` (was broken in 0.4.27)
+- `UserPassCredentials` now correctly uses the refresh token, rather than user/password, to refresh the session (was broken in 0.4.27)
+- Bring back `keyring`, with a minimum dependency of 12.0.2, which fixes the installation problem on old Python
+
2018-04-23 Version 0.4.28
+++++++++++++++++++++++++
@@ -27,7 +108,7 @@
Due to some stability issues with the "keyring" dependency, whose behavior varies widely from one system to another,
it is no longer a dependency of "msrestazure".
-If you were using the secured token cache of `ServicePrincipalCredentials` and `UserPassCredentials`,
+If you were using the secured token cache of `ServicePrincipalCredentials` and `UserPassCredentials`,
the feature is still available, but you need to install "keyring" manually. The functionality will activate automatically.
2018-04-18 Version 0.4.27
@@ -68,7 +149,7 @@
**Bugfix**
-- Fix LRO result if POST uses AsyncOperation header (Autorest.Python 3.0 only) #79
+- Fix LRO result if POST uses AsyncOperation header (Autorest.Python 3.0 only) #79
2018-02-27 Version 0.4.22
+++++++++++++++++++++++++
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/msrestazure-0.4.28/msrestazure/azure_active_directory.py new/msrestazure-0.5.0/msrestazure/azure_active_directory.py
--- old/msrestazure-0.4.28/msrestazure/azure_active_directory.py 2018-04-24 01:45:02.000000000 +0200
+++ new/msrestazure-0.5.0/msrestazure/azure_active_directory.py 2018-08-02 21:16:52.000000000 +0200
@@ -36,21 +36,8 @@
from urllib.parse import urlparse, parse_qs
import adal
-from oauthlib.oauth2 import BackendApplicationClient, LegacyApplicationClient
-from oauthlib.oauth2.rfc6749.errors import (
- InvalidGrantError,
- MismatchingStateError,
- OAuth2Error,
- TokenExpiredError)
from requests import RequestException, ConnectionError, HTTPError
import requests
-import requests_oauthlib as oauth
-
-try:
- import keyring
-except Exception as err:
- keyring = False
- KEYRING_EXCEPTION = err
from msrest.authentication import OAuthTokenAuthentication, Authentication, BasicTokenAuthentication
from msrest.exceptions import TokenExpiredError as Expired
@@ -61,65 +48,13 @@
_LOGGER = logging.getLogger(__name__)
-if not keyring:
- _LOGGER.warning("Cannot load 'keyring' on your system (either not installed, or not configured correctly): %s", KEYRING_EXCEPTION)
-
-def _build_url(uri, paths, scheme):
- """Combine URL parts.
-
- :param str uri: The base URL.
- :param list paths: List of strings that make up the URL.
- :param str scheme: The URL scheme, 'http' or 'https'.
- :rtype: str
- :return: Combined, formatted URL.
- """
- path = [str(p).strip('/') for p in paths]
- combined_path = '/'.join(path)
- parsed_url = urlparse(uri)
- replaced = parsed_url._replace(scheme=scheme)
- if combined_path:
- path = '/'.join([replaced.path, combined_path])
- replaced = replaced._replace(path=path)
-
- new_url = replaced.geturl()
- new_url = new_url.replace('///', '//')
- return new_url
-
-
-def _http(uri, *extra):
- """Convert https URL to http.
-
- :param str uri: The base URL.
- :param str extra: Additional URL paths (optional).
- :rtype: str
- :return: An HTTP URL.
- """
- return _build_url(uri, extra, 'http')
-
-
-def _https(uri, *extra):
- """Convert http URL to https.
-
- :param str uri: The base URL.
- :param str extra: Additional URL paths (optional).
- :rtype: str
- :return: An HTTPS URL.
- """
- return _build_url(uri, extra, 'https')
-
-
class AADMixin(OAuthTokenAuthentication):
"""Mixin for Authentication object.
Provides some AAD functionality:
- - State validation
- Token caching and retrieval
- Default AAD configuration
"""
- _token_uri = "/oauth2/token"
- _auth_uri = "/oauth2/authorize"
- _tenant = "common"
- _keyring = "AzureAAD"
_case = re.compile('([a-z0-9])([A-Z])')
def _configure(self, **kwargs):
@@ -131,54 +66,95 @@
- china (bool): Configure auth for China-based service,
default is 'False'.
- tenant (str): Alternative tenant, default is 'common'.
- - auth_uri (str): Alternative authentication endpoint.
- - token_uri (str): Alternative token retrieval endpoint.
- resource (str): Alternative authentication resource, default
is 'https://management.core.windows.net/'.
- verify (bool): Verify secure connection, default is 'True'.
- - keyring (str): Name of local token cache, default is 'AzureAAD'.
- timeout (int): Timeout of the request in seconds.
- - proxies (dict): Dictionary mapping protocol or protocol and
+ - proxies (dict): Dictionary mapping protocol or protocol and
hostname to the URL of the proxy.
+ - cache (adal.TokenCache): An adal.TokenCache; see the ADAL configuration
+ for details. This parameter is not used here and is passed directly to ADAL.
"""
if kwargs.get('china'):
err_msg = ("china parameter is deprecated, "
"please use "
"cloud_environment=msrestazure.azure_cloud.AZURE_CHINA_CLOUD")
warnings.warn(err_msg, DeprecationWarning)
- self.cloud_environment = AZURE_CHINA_CLOUD
+ self._cloud_environment = AZURE_CHINA_CLOUD
else:
- self.cloud_environment = AZURE_PUBLIC_CLOUD
- self.cloud_environment = kwargs.get('cloud_environment', self.cloud_environment)
+ self._cloud_environment = AZURE_PUBLIC_CLOUD
+ self._cloud_environment = kwargs.get('cloud_environment', self._cloud_environment)
- auth_endpoint = self.cloud_environment.endpoints.active_directory
- resource = self.cloud_environment.endpoints.active_directory_resource_id
+ auth_endpoint = self._cloud_environment.endpoints.active_directory
+ resource = self._cloud_environment.endpoints.active_directory_resource_id
- tenant = kwargs.get('tenant', self._tenant)
- self.auth_uri = kwargs.get('auth_uri', _https(
- auth_endpoint, tenant, self._auth_uri))
- self.token_uri = kwargs.get('token_uri', _https(
- auth_endpoint, tenant, self._token_uri))
- self.verify = kwargs.get('verify', True)
- self.cred_store = kwargs.get('keyring', self._keyring)
+ self._tenant = kwargs.get('tenant', "common")
+ self._verify = kwargs.get('verify') # 'None' will honor ADAL_PYTHON_SSL_NO_VERIFY
self.resource = kwargs.get('resource', resource)
- self.proxies = kwargs.get('proxies')
- self.timeout = kwargs.get('timeout')
- self.state = oauth.oauth2_session.generate_token()
+ self._proxies = kwargs.get('proxies')
+ self._timeout = kwargs.get('timeout')
+ self._cache = kwargs.get('cache')
self.store_key = "{}_{}".format(
auth_endpoint.strip('/'), self.store_key)
+ self.secret = None
+ self._context = None # Future ADAL context
- def _check_state(self, response):
- """Validate state returned by AAD server.
+ def _create_adal_context(self):
+ authority_url = self.cloud_environment.endpoints.active_directory
+ is_adfs = bool(re.match('.+(/adfs|/adfs/)$', authority_url, re.I))
+ if is_adfs:
+ authority_url = authority_url.rstrip('/') # workaround: ADAL is known to reject auth URLs with a trailing slash
+ else:
+ authority_url = authority_url + '/' + self._tenant
- :param str response: URL returned by server redirect.
- :raises: ValueError if state does not match that of the request.
- :rtype: None
- """
- query = parse_qs(urlparse(response).query)
- if self.state not in query.get('state', []):
- raise ValueError(
- "State received from server does not match that of request.")
+ self._context = adal.AuthenticationContext(
+ authority_url,
+ timeout=self._timeout,
+ verify_ssl=self._verify,
+ proxies=self._proxies,
+ validate_authority=not is_adfs,
+ cache=self._cache,
+ api_version=None
+ )
+
+ def _destroy_adal_context(self):
+ self._context = None
+
+ @property
+ def verify(self):
+ return self._verify
+
+ @verify.setter
+ def verify(self, value):
+ self._verify = value
+ self._destroy_adal_context()
+
+ @property
+ def proxies(self):
+ return self._proxies
+
+ @proxies.setter
+ def proxies(self, value):
+ self._proxies = value
+ self._destroy_adal_context()
+
+ @property
+ def timeout(self):
+ return self._timeout
+
+ @timeout.setter
+ def timeout(self, value):
+ self._timeout = value
+ self._destroy_adal_context()
+
+ @property
+ def cloud_environment(self):
+ return self._cloud_environment
+
+ @cloud_environment.setter
+ def cloud_environment(self, value):
+ self._cloud_environment = value
+ self._destroy_adal_context()
def _convert_token(self, token):
"""Convert token fields from camel case.
@@ -186,40 +162,29 @@
:param dict token: An authentication token.
:rtype: dict
"""
+ # Beware that ADAL returns a pointer to its own dict, do
+ # NOT change it in place
+ token = token.copy()
+
+ # If it's from ADAL, expiresOn will be in ISO form.
+ # Bring it back to float, using expiresIn
+ if "expiresOn" in token and "expiresIn" in token:
+ token["expiresOn"] = token['expiresIn'] + time.time()
return {self._case.sub(r'\1_\2', k).lower(): v
for k, v in token.items()}
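The normalization performed by `_convert_token` above can be sketched standalone (a minimal sketch: `convert_token` here is a free-function stand-in for illustration, not part of the library):

```python
import re
import time

# camelCase boundary: a lowercase letter or digit followed by an uppercase letter.
_case = re.compile('([a-z0-9])([A-Z])')

def convert_token(token):
    # Copy first: ADAL hands back a reference to its own dict.
    token = token.copy()
    # ADAL's ISO-formatted 'expiresOn' is replaced by an epoch float
    # derived from 'expiresIn', before key names are snake_cased.
    if "expiresOn" in token and "expiresIn" in token:
        token["expiresOn"] = token["expiresIn"] + time.time()
    return {_case.sub(r'\1_\2', k).lower(): v for k, v in token.items()}
```

So a token like `{'accessToken': ..., 'expiresIn': 3600, 'expiresOn': '2018-08-02T21:00:00'}` comes out with `access_token`, `expires_in` and a float `expires_on` key.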
def _parse_token(self):
- # TODO: We could also check expires_on and use to update expires_in
+ # AD answers 'expires_on', and Python oauthlib expects 'expires_at'
+ if 'expires_on' in self.token and 'expires_at' not in self.token:
+ self.token['expires_at'] = self.token['expires_on']
+
if self.token.get('expires_at'):
countdown = float(self.token['expires_at']) - time.time()
self.token['expires_in'] = countdown
- def _default_token_cache(self, token):
- """Store token for future sessions.
-
- :param dict token: An authentication token.
- :rtype: None
- """
- self.token = token
- if keyring:
- try:
- keyring.set_password(self.cred_store, self.store_key, str(token))
- except Exception as err:
- _LOGGER.warning("Keyring cache token has failed: %s", str(err))
-
- def _retrieve_stored_token(self):
- """Retrieve stored token for new session.
-
- :raises: ValueError if no cached token found.
- :rtype: dict
- :return: Retrieved token.
- """
- token = keyring.get_password(self.cred_store, self.store_key)
- if token is None:
- raise ValueError("No stored token found.")
- self.token = ast.literal_eval(str(token))
- self.signed_session()
+ def set_token(self):
+ if not self._context:
+ self._create_adal_context()
def signed_session(self, session=None):
"""Create token-friendly Requests session, using auto-refresh.
@@ -231,18 +196,33 @@
:param session: The session to configure for authentication
:type session: requests.Session
"""
+ self.set_token() # Adal does the caching.
self._parse_token()
return super(AADMixin, self).signed_session(session)
-
- def clear_cached_token(self):
- """Clear any stored tokens.
- :raises: KeyError if failed to clear token.
+ def refresh_session(self, session=None):
+ """Return updated session if token has expired, attempts to
+ refresh using newly acquired token.
+
+ If a session object is provided, configure it directly. Otherwise,
+ create a new session and return it.
+
+ :param session: The session to configure for authentication
+ :type session: requests.Session
+ :rtype: requests.Session.
"""
- try:
- keyring.delete_password(self.cred_store, self.store_key)
- except keyring.errors.PasswordDeleteError:
- raise_with_traceback(KeyError, "Unable to clear token.")
+ if 'refresh_token' in self.token:
+ try:
+ token = self._context.acquire_token_with_refresh_token(
+ self.token['refresh_token'],
+ self.id,
+ self.resource,
+ self.secret # This is needed when using Confidential Client
+ )
+ self.token = self._convert_token(token)
+ except adal.AdalError as err:
+ raise_with_traceback(AuthenticationError, "", err)
+ return self.signed_session(session)
class AADTokenCredentials(AADMixin):
@@ -250,20 +230,22 @@
Credentials objects for AAD token retrieved through external process
e.g. Python ADAL lib.
+ If you just provide "token", refresh will be done on Public Azure with
+ default public Azure "resource". You can set "cloud_environment",
+ "tenant", "resource" and "client_id" to change that behavior.
+
Optional kwargs may include:
- cloud_environment (msrestazure.azure_cloud.Cloud): A targeted cloud environment
- china (bool): Configure auth for China-based service,
default is 'False'.
- tenant (str): Alternative tenant, default is 'common'.
- - auth_uri (str): Alternative authentication endpoint.
- - token_uri (str): Alternative token retrieval endpoint.
- resource (str): Alternative authentication resource, default
is 'https://management.core.windows.net/'.
- verify (bool): Verify secure connection, default is 'True'.
- - keyring (str): Name of local token cache, default is 'AzureAAD'.
- - cached (bool): If true, will not attempt to collect a token,
- which can then be populated later from a cached token.
+ - cache (adal.TokenCache): An adal.TokenCache; see the ADAL configuration
+ for details. This parameter is not used here and is passed directly to ADAL.
+
:param dict token: Authentication token.
:param str client_id: Client ID, if not set, Xplat Client ID
@@ -276,18 +258,8 @@
client_id = '04b07795-8ddb-461a-bbee-02f9e1bf7b46'
super(AADTokenCredentials, self).__init__(client_id, None)
self._configure(**kwargs)
- if not kwargs.get('cached'):
- self.token = self._convert_token(token)
- self.signed_session()
-
- @classmethod
- def retrieve_session(cls, client_id=None):
- """Create AADTokenCredentials from a cached token if it has not
- yet expired.
- """
- session = cls(None, client_id=client_id, cached=True)
- session._retrieve_stored_token()
- return session
+ self.client = None
+ self.token = self._convert_token(token)
class UserPassCredentials(AADMixin):
@@ -304,17 +276,14 @@
- china (bool): Configure auth for China-based service,
default is 'False'.
- tenant (str): Alternative tenant, default is 'common'.
- - auth_uri (str): Alternative authentication endpoint.
- - token_uri (str): Alternative token retrieval endpoint.
- resource (str): Alternative authentication resource, default
is 'https://management.core.windows.net/'.
- verify (bool): Verify secure connection, default is 'True'.
- - keyring (str): Name of local token cache, default is 'AzureAAD'.
- timeout (int): Timeout of the request in seconds.
- - cached (bool): If true, will not attempt to collect a token,
- which can then be populated later from a cached token.
- proxies (dict): Dictionary mapping protocol or protocol and
hostname to the URL of the proxy.
+ - cache (adal.TokenCache): An adal.TokenCache; see the ADAL configuration
+ for details. This parameter is not used here and is passed directly to ADAL.
:param str username: Account username.
:param str password: Account password.
@@ -335,83 +304,25 @@
self.username = username
self.password = password
self.secret = secret
- self.client = LegacyApplicationClient(client_id=self.id)
- if not kwargs.get('cached'):
- self.set_token()
-
- @classmethod
- def retrieve_session(cls, username, client_id=None):
- """Create ServicePrincipalCredentials from a cached token if it has not
- yet expired.
- """
- session = cls(username, None, client_id=client_id, cached=True)
- session._retrieve_stored_token()
- return session
-
- def _setup_session(self):
- """Create token-friendly Requests session.
+ self.set_token()
- :rtype: requests_oauthlib.OAuth2Session
- """
- return oauth.OAuth2Session(client=self.client)
def set_token(self):
"""Get token using Username/Password credentials.
:raises: AuthenticationError if credentials invalid, or call fails.
"""
- with self._setup_session() as session:
- optional = {}
- if self.secret:
- optional['client_secret'] = self.secret
- try:
- token = session.fetch_token(self.token_uri,
- client_id=self.id,
- username=self.username,
- password=self.password,
- resource=self.resource,
- verify=self.verify,
- proxies=self.proxies,
- timeout=self.timeout,
- **optional)
- except (RequestException, OAuth2Error, InvalidGrantError) as err:
- raise_with_traceback(AuthenticationError, "", err)
-
- self.token = token
- self._default_token_cache(self.token)
-
- def refresh_session(self, session=None):
- """Return updated session if token has expired, attempts to
- refresh using newly acquired token.
-
- If a session object is provided, configure it directly. Otherwise,
- create a new session and return it.
-
- :param session: The session to configure for authentication
- :type session: requests.Session
- :rtype: requests.Session.
- """
- with self._setup_session() as session:
- optional = {}
- if self.secret:
- optional['client_secret'] = self.secret
- try:
- token = session.refresh_token(self.token_uri,
- client_id=self.id,
- username=self.username,
- password=self.password,
- resource=self.resource,
- verify=self.verify,
- proxies=self.proxies,
- timeout=self.timeout,
- **optional)
- except (RequestException, OAuth2Error, InvalidGrantError) as err:
- raise_with_traceback(AuthenticationError, "", err)
-
- self.token = token
- self._default_token_cache(self.token)
- return self.signed_session(session)
-
+ super(UserPassCredentials, self).set_token()
+ try:
+ token = self._context.acquire_token_with_username_password(
+ self.resource,
+ self.username,
+ self.password,
+ self.id
+ )
+ self.token = self._convert_token(token)
+ except adal.AdalError as err:
+ raise_with_traceback(AuthenticationError, "", err)
class ServicePrincipalCredentials(AADMixin):
"""Credentials object for Service Principle Authentication.
@@ -423,17 +334,14 @@
- china (bool): Configure auth for China-based service,
default is 'False'.
- tenant (str): Alternative tenant, default is 'common'.
- - auth_uri (str): Alternative authentication endpoint.
- - token_uri (str): Alternative token retrieval endpoint.
- resource (str): Alternative authentication resource, default
is 'https://management.core.windows.net/'.
- verify (bool): Verify secure connection, default is 'True'.
- - keyring (str): Name of local token cache, default is 'AzureAAD'.
- timeout (int): Timeout of the request in seconds.
- - cached (bool): If true, will not attempt to collect a token,
- which can then be populated later from a cached token.
- proxies (dict): Dictionary mapping protocol or protocol and
hostname to the URL of the proxy.
+ - cache (adal.TokenCache): An adal.TokenCache; see the ADAL configuration
+ for details. This parameter is not used here and is passed directly to ADAL.
:param str client_id: Client ID.
:param str secret: Client secret.
@@ -443,63 +351,23 @@
self._configure(**kwargs)
self.secret = secret
- self.client = BackendApplicationClient(self.id)
- if not kwargs.get('cached'):
- self.set_token()
-
- @classmethod
- def retrieve_session(cls, client_id):
- """Create ServicePrincipalCredentials from a cached token if it has not
- yet expired.
- """
- session = cls(client_id, None, cached=True)
- session._retrieve_stored_token()
- return session
-
- def _setup_session(self):
- """Create token-friendly Requests session.
-
- :rtype: requests_oauthlib.OAuth2Session
- """
- return oauth.OAuth2Session(self.id, client=self.client)
+ self.set_token()
def set_token(self):
"""Get token using Client ID/Secret credentials.
:raises: AuthenticationError if credentials invalid, or call fails.
"""
- with self._setup_session() as session:
- try:
- token = session.fetch_token(self.token_uri,
- client_id=self.id,
- resource=self.resource,
- client_secret=self.secret,
- response_type="client_credentials",
- verify=self.verify,
- timeout=self.timeout,
- proxies=self.proxies)
- except (RequestException, OAuth2Error, InvalidGrantError) as err:
- raise_with_traceback(AuthenticationError, "", err)
- else:
- self.token = token
- self._default_token_cache(self.token)
-
- def refresh_session(self, session=None):
- """Alias to signed_session().
-
- SP flow does not contain refresh_token, so this method is just asking a new
- token to AD.
-
- If a session object is provided, configure it directly. Otherwise,
- create a new session and return it.
-
- :param session: The session to configure for authentication
- :type session: requests.Session
- :rtype: requests.Session.
- """
- self.set_token()
- return self.signed_session(session)
-
+ super(ServicePrincipalCredentials, self).set_token()
+ try:
+ token = self._context.acquire_token_with_client_credentials(
+ self.resource,
+ self.id,
+ self.secret
+ )
+ self.token = self._convert_token(token)
+ except adal.AdalError as err:
+ raise_with_traceback(AuthenticationError, "", err)
# For backward compatibility of import, but I doubt someone uses that...
class InteractiveCredentials(object):
@@ -619,7 +487,7 @@
_LOGGER.warning("MSI: Failed to retrieve a token from '%s' with an error of '%s'. This could be caused "
"by the MSI extension not yet fullly provisioned.",
request_uri, ex)
- raise
+ raise
token_entry = result.json()
return token_entry['token_type'], token_entry['access_token'], token_entry
@@ -703,7 +571,9 @@
raise AuthenticationError("User Assigned Entity is not available on WebApp yet.")
elif "MSI_ENDPOINT" not in os.environ:
# Use IMDS if no MSI_ENDPOINT
- self._vm_msi = _ImdsTokenProvider(self.resource, self.msi_conf)
+ self._vm_msi = _ImdsTokenProvider(self.msi_conf)
+ # Follow the same convention as all Credentials class to check for the token at creation time #106
+ self.set_token()
def set_token(self):
if _is_app_service():
@@ -711,7 +581,7 @@
elif "MSI_ENDPOINT" in os.environ:
self.scheme, _, self.token = get_msi_token(self.resource, self.port, self.msi_conf)
else:
- token_entry = self._vm_msi.get_token()
+ token_entry = self._vm_msi.get_token(self.resource)
self.scheme, self.token = token_entry['token_type'], token_entry
def signed_session(self, session=None):
@@ -733,7 +603,7 @@
"""A help class handling token acquisitions through Azure IMDS plugin.
"""
- def __init__(self, resource, msi_conf=None):
+ def __init__(self, msi_conf=None):
self._user_agent = AzureConfiguration(None).user_agent
self.identity_type, self.identity_id = None, None
if msi_conf:
@@ -744,12 +614,11 @@
# default to system assigned identity on an empty configuration object
self.cache = {}
- self.resource = resource
- def get_token(self):
+ def get_token(self, resource):
import datetime
# let us hit the cache first
- token_entry = self.cache.get(self.resource, None)
+ token_entry = self.cache.get(resource, None)
if token_entry:
expires_on = int(token_entry['expires_on'])
expires_on_datetime = datetime.datetime.fromtimestamp(expires_on)
@@ -758,43 +627,52 @@
_LOGGER.info("MSI: token is found in cache.")
return token_entry
_LOGGER.info("MSI: cache is found but expired within %s minutes, so getting a new one.", expiration_margin)
- self.cache.pop(self.resource)
+ self.cache.pop(resource)
- token_entry = self._retrieve_token_from_imds_with_retry()
- self.cache[self.resource] = token_entry
+ token_entry = self._retrieve_token_from_imds_with_retry(resource)
+ self.cache[resource] = token_entry
return token_entry
- def _retrieve_token_from_imds_with_retry(self):
+ def _retrieve_token_from_imds_with_retry(self, resource):
import random
import json
# 169.254.169.254 is a well-known IP address hosting the web service that provides the Azure IMDS metadata
request_uri = 'http://169.254.169.254/metadata/identity/oauth2/token'
payload = {
- 'resource': self.resource,
+ 'resource': resource,
'api-version': '2018-02-01'
}
if self.identity_id:
payload[self.identity_type] = self.identity_id
- retry, max_retry = 1, 20
+ retry, max_retry, start_time = 1, 12, time.time()
# simplified version of https://en.wikipedia.org/wiki/Exponential_backoff
slots = [100 * ((2 << x) - 1) / 1000 for x in range(max_retry)]
- while retry <= max_retry:
+ while True:
result = requests.get(request_uri, params=payload, headers={'Metadata': 'true', 'User-Agent':self._user_agent})
_LOGGER.debug("MSI: Retrieving a token from %s, with payload %s", request_uri, payload)
- if result.status_code in [404, 429] or (499 < result.status_code < 600):
- wait = random.choice(slots[:retry])
- _LOGGER.warning("MSI: Wait: %ss and retry: %s", wait, retry)
- time.sleep(wait)
- retry += 1
+ if result.status_code in [404, 410, 429] or (499 < result.status_code < 600):
+ if retry <= max_retry:
+ wait = random.choice(slots[:retry])
+ _LOGGER.warning("MSI: wait: %ss and retry: %s", wait, retry)
+ time.sleep(wait)
+ retry += 1
+ else:
+ if result.status_code == 410: # For IMDS upgrading, we wait up to 70s
+ gap = 70 - (time.time() - start_time)
+ if gap > 0:
+ _LOGGER.warning("MSI: wait till 70 seconds when IMDS is upgrading")
+ time.sleep(gap)
+ continue
+ break
elif result.status_code != 200:
raise HTTPError(request=result.request, response=result.raw)
else:
break
- if retry > max_retry:
+ if result.status_code != 200:
raise TimeoutError('MSI: Failed to acquire tokens after {} times'.format(max_retry))
_LOGGER.debug('MSI: Token retrieved')
token_entry = json.loads(result.content.decode())
- return token_entry
\ No newline at end of file
+ return token_entry
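The retry pacing added above is a truncated exponential backoff; a minimal sketch of just the slot table and the random pick (constants copied from the diff, `pick_wait` is an illustrative stand-in name):

```python
import random

# Slot x corresponds to a wait of 100 * (2**(x+1) - 1) milliseconds.
# Each attempt picks a random wait from the first `retry` slots, so the
# maximum possible wait grows with the retry count.
max_retry = 12
slots = [100 * ((2 << x) - 1) / 1000 for x in range(max_retry)]

def pick_wait(retry):
    return random.choice(slots[:retry])
```

The first few slots are 0.1s, 0.3s and 0.7s, so early retries stay cheap while later ones back off aggressively.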
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/msrestazure-0.4.28/msrestazure/azure_cloud.py new/msrestazure-0.5.0/msrestazure/azure_cloud.py
--- old/msrestazure-0.4.28/msrestazure/azure_cloud.py 2018-04-24 01:45:02.000000000 +0200
+++ new/msrestazure-0.5.0/msrestazure/azure_cloud.py 2018-08-02 21:16:52.000000000 +0200
@@ -170,7 +170,7 @@
sql_management='https://management.core.usgovcloudapi.net:8443/',
batch_resource_id='https://batch.core.usgovcloudapi.net/',
gallery='https://gallery.usgovcloudapi.net/',
- active_directory='https://login-us.microsoftonline.com',
+ active_directory='https://login.microsoftonline.us',
active_directory_resource_id='https://management.core.usgovcloudapi.net/',
active_directory_graph_resource_id='https://graph.windows.net/'),
suffixes=CloudSuffixes(
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/msrestazure-0.4.28/msrestazure/azure_exceptions.py new/msrestazure-0.5.0/msrestazure/azure_exceptions.py
--- old/msrestazure-0.4.28/msrestazure/azure_exceptions.py 2018-04-24 01:45:02.000000000 +0200
+++ new/msrestazure-0.5.0/msrestazure/azure_exceptions.py 2018-08-02 21:16:52.000000000 +0200
@@ -54,6 +54,7 @@
'target': {'key': 'target', 'type': 'str'},
'details': {'key': 'details', 'type': '[CloudErrorData]'},
'innererror': {'key': 'innererror', 'type': 'object'},
+ 'additionalInfo': {'key': 'additionalInfo', 'type': '[TypedErrorInfo]'},
'data': {'key': 'values', 'type': '{str}'}
}
@@ -65,6 +66,7 @@
self.target = kwargs.get('target')
self.details = kwargs.get('details')
self.innererror = kwargs.get('innererror')
+ self.additionalInfo = kwargs.get('additionalInfo')
self.data = kwargs.get('data')
super(CloudErrorData, self).__init__(*args)
@@ -89,10 +91,18 @@
error_str += "\n\tMessage: {}".format(error_obj.message)
if error_obj.target:
error_str += "\n\tTarget: {}".format(error_obj.target)
- if error_obj.innererror:
- error_str += "\nInner error: {}".format(json.dumps(error_obj.innererror, indent=4))
- if self.innererror:
- error_str += "\nInner error: {}".format(json.dumps(self.innererror, indent=4))
+ if error_obj.innererror:
+ error_str += "\nInner error: {}".format(json.dumps(error_obj.innererror, indent=4))
+ if error_obj.additionalInfo:
+ error_str += "\n\tAdditional Information:"
+ for error_info in error_obj.additionalInfo:
+ error_str += "\n\t\t{}".format(str(error_info).replace("\n", "\n\t\t"))
+ if self.innererror:
+ error_str += "\nInner error: {}".format(json.dumps(self.innererror, indent=4))
+ if self.additionalInfo:
+ error_str += "\nAdditional Information:"
+ for error_info in self.additionalInfo:
+ error_str += "\n\t{}".format(str(error_info).replace("\n", "\n\t"))
error_bytes = error_str.encode()
return error_bytes.decode('ascii')
@@ -111,8 +121,9 @@
error data.
"""
try:
- value = eval(value)
- except (SyntaxError, TypeError):
+ import ast
+ value = ast.literal_eval(value)
+ except (SyntaxError, TypeError, ValueError):
pass
try:
value = value.get('value', value)
@@ -142,7 +153,8 @@
def __init__(self, response, error=None, *args, **kwargs):
self.deserializer = Deserializer({
'CloudErrorRoot': CloudErrorRoot,
- 'CloudErrorData': CloudErrorData
+ 'CloudErrorData': CloudErrorData,
+ 'TypedErrorInfo': TypedErrorInfo
})
self.error = None
self.message = None
@@ -216,4 +228,26 @@
if not self.message:
msg = "Operation failed with status: {!r}. Details: {}"
self.message = msg.format(
- response.status_code, message)
\ No newline at end of file
+ response.status_code, message)
+
+class TypedErrorInfo(object):
+ """Typed Error Info object, deserialized from error data returned
+ during a failed REST API call. Contains additional error information.
+ """
+
+ _validation = {}
+ _attribute_map = {
+ 'type': {'key': 'type', 'type': 'str'},
+ 'info': {'key': 'info', 'type': 'object'}
+ }
+
+ def __init__(self, type, info):
+ self.type = type
+ self.info = info
+
+ def __str__(self):
+ """Cloud error message."""
+ error_str = "Type: {}".format(self.type)
+ error_str += "\nInfo: {}".format(json.dumps(self.info, indent=4))
+ error_bytes = error_str.encode()
+ return error_bytes.decode('ascii')
\ No newline at end of file
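The new `TypedErrorInfo` renders as a small two-part block; a standalone copy (same fields and `__str__`, no msrest dependency, for illustration only) shows the format:

```python
import json

class TypedErrorInfo(object):
    """Standalone copy of the class added in the diff above."""

    def __init__(self, type, info):
        self.type = type
        self.info = info

    def __str__(self):
        # "Type:" line followed by the pretty-printed "info" payload,
        # round-tripped through ASCII as in the original.
        error_str = "Type: {}".format(self.type)
        error_str += "\nInfo: {}".format(json.dumps(self.info, indent=4))
        return error_str.encode().decode('ascii')
```

For example, `str(TypedErrorInfo("PolicyViolation", {"policy": "tag-policy"}))` starts with `Type: PolicyViolation` and then indents the JSON payload.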
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/msrestazure-0.4.28/msrestazure/polling/arm_polling.py new/msrestazure-0.5.0/msrestazure/polling/arm_polling.py
--- old/msrestazure-0.4.28/msrestazure/polling/arm_polling.py 2018-04-24 01:45:02.000000000 +0200
+++ new/msrestazure-0.5.0/msrestazure/polling/arm_polling.py 2018-08-02 21:16:52.000000000 +0200
@@ -40,6 +40,8 @@
FAILED = frozenset(['canceled', 'failed'])
SUCCEEDED = frozenset(['succeeded'])
+_AZURE_ASYNC_OPERATION_FINAL_STATE = "azure-async-operation"
+_LOCATION_FINAL_STATE = "location"
def finished(status):
if hasattr(status, 'value'):
@@ -102,9 +104,14 @@
"""LongRunningOperation
Provides default logic for interpreting operation responses
and status updates.
+
+ :param requests.Response response: The initial response.
+ :param callable deserialization_callback: The deserialization callback.
+ :param dict lro_options: LRO options.
+ :param kwargs: Unused for now
"""
- def __init__(self, response, deserialization_callback):
+ def __init__(self, response, deserialization_callback, lro_options=None, **kwargs):
self.method = response.request.method
self.initial_response = response
self.status = ""
@@ -112,6 +119,11 @@
self.deserialization_callback = deserialization_callback
self.async_url = None
self.location_url = None
+ if lro_options is None:
+ lro_options = {
+ 'final-state-via': _AZURE_ASYNC_OPERATION_FINAL_STATE
+ }
+ self.lro_options = lro_options
def _raise_if_bad_http_status_and_method(self, response):
"""Check response status code is valid for a Put or Patch
@@ -179,8 +191,8 @@
:param requests.Response response: latest REST call response.
:rtype: bool
"""
- return (self.async_url or not self.resource) and \
- self.method in {'PUT', 'PATCH'}
+ return ((self.async_url or not self.resource) and self.method in {'PUT', 'PATCH'}) \
+ or (self.lro_options['final-state-via'] == _LOCATION_FINAL_STATE and self.location_url and self.async_url and self.method == 'POST')
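The widened condition above can be read as a standalone predicate (a sketch: the real method reads these values from `self`, and the literal `'location'` stands in for `_LOCATION_FINAL_STATE`):

```python
def should_do_final_get(method, async_url, location_url, resource, final_state_via):
    # PUT/PATCH: do a final GET when polling went through the
    # Azure-AsyncOperation header, or when no resource body came back.
    # POST: do a final GET only when the operation asked to finish via
    # the Location header and both polling headers were present.
    return bool(((async_url or not resource) and method in {'PUT', 'PATCH'})
                or (final_state_via == 'location' and location_url
                    and async_url and method == 'POST'))
```

This makes the new POST behavior explicit: without `final-state-via: location`, a POST never triggers the extra final GET.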
def set_initial_status(self, response):
"""Process first response after initiating long running
@@ -279,7 +291,7 @@
try:
self.resource = self._deserialize(response)
except Exception:
- self.resource = None
+ self.resource = None
def set_async_url_if_present(self, response):
async_url = get_header_url(response, 'azure-asyncoperation')
@@ -302,11 +314,12 @@
class ARMPolling(PollingMethod):
- def __init__(self, timeout=30, **operation_config):
+ def __init__(self, timeout=30, lro_options=None, **operation_config):
self._timeout = timeout
self._operation = None # Will hold an instance of LongRunningOperation
self._response = None # Will hold latest received response
self._operation_config = operation_config
+ self._lro_options = lro_options
def status(self):
"""Return the current status as a string.
@@ -335,7 +348,7 @@
"""
self._client = client
self._response = initial_response
- self._operation = LongRunningOperation(initial_response, deserialization_callback)
+ self._operation = LongRunningOperation(initial_response, deserialization_callback, self._lro_options)
try:
self._operation.set_initial_status(initial_response)
except BadStatus:
@@ -380,8 +393,11 @@
raise OperationFailed("Operation failed or cancelled")
elif self._operation.should_do_final_get():
- initial_url = self._operation.initial_response.request.url
- self._response = self.request_status(initial_url)
+ if self._operation.method == 'POST' and self._operation.location_url:
+ final_get_url = self._operation.location_url
+ else:
+ final_get_url = self._operation.initial_response.request.url
+ self._response = self.request_status(final_get_url)
self._operation.get_status_from_resource(self._response)
def _delay(self):
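As context for the arm_polling.py hunk above: the new "final-state-via" behavior can be sketched in isolation as follows. This is a hypothetical simplification for illustration only, not the actual msrestazure code; the helper names `should_do_final_get` (as a free function) and `final_get_url` are invented here.

```python
# Hypothetical, simplified restatement of the 0.5.0 polling change:
# POST operations may now finish with a final GET on the Location
# header URL instead of re-fetching the initial request URL.

_AZURE_ASYNC_OPERATION_FINAL_STATE = "azure-async-operation"
_LOCATION_FINAL_STATE = "location"

def should_do_final_get(method, resource, async_url, location_url, lro_options):
    # Mirrors the updated predicate: PUT/PATCH need a final GET when an
    # async URL was used or no resource body came back; POST needs one
    # when Autorest requested "final-state-via: location" and both the
    # azure-asyncoperation and Location headers were present.
    return bool(((async_url or not resource) and method in {'PUT', 'PATCH'})
                or (lro_options['final-state-via'] == _LOCATION_FINAL_STATE
                    and location_url and async_url and method == 'POST'))

def final_get_url(method, initial_url, location_url):
    # Mirrors the URL selection added to the ARMPolling run loop.
    if method == 'POST' and location_url:
        return location_url
    return initial_url

opts = {'final-state-via': _LOCATION_FINAL_STATE}
print(should_do_final_get('POST', {}, 'https://x/async', 'https://x/loc', opts))  # True
print(final_get_url('POST', 'https://x/initial', 'https://x/loc'))  # https://x/loc
```

Note that with the default option value `azure-async-operation`, the POST branch never fires, so pre-0.5.0 behavior is preserved.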
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/msrestazure-0.4.28/msrestazure/version.py new/msrestazure-0.5.0/msrestazure/version.py
--- old/msrestazure-0.4.28/msrestazure/version.py 2018-04-24 01:45:02.000000000 +0200
+++ new/msrestazure-0.5.0/msrestazure/version.py 2018-08-02 21:16:52.000000000 +0200
@@ -25,4 +25,4 @@
# --------------------------------------------------------------------------
#: version of the package. Use msrestazure.__version__ instead.
-msrestazure_version = "0.4.28"
+msrestazure_version = "0.5.0"
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/msrestazure-0.4.28/msrestazure.egg-info/PKG-INFO new/msrestazure-0.5.0/msrestazure.egg-info/PKG-INFO
--- old/msrestazure-0.4.28/msrestazure.egg-info/PKG-INFO 2018-04-24 01:45:43.000000000 +0200
+++ new/msrestazure-0.5.0/msrestazure.egg-info/PKG-INFO 2018-08-02 21:17:34.000000000 +0200
@@ -1,6 +1,6 @@
Metadata-Version: 1.1
Name: msrestazure
-Version: 0.4.28
+Version: 0.5.0
Summary: AutoRest swagger generator Python client runtime. Azure-specific module.
Home-page: https://github.com/Azure/msrestazure-for-python
Author: Microsoft Corporation
@@ -28,6 +28,87 @@
Release History
---------------
+ 2018-08-02 Version 0.5.0
+ ++++++++++++++++++++++++
+
+ **Features**
+
+ - Implementation is now using ADAL and not request-oauthlib. This allows more AD scenarios (like federated) #94
+ - Add additionalInfo parsing for CloudError #102
+
+ **Breaking changes**
+
+ These breaking changes applies to ServicePrincipalCredentials, UserPassCredentials, AADTokenCredentials
+
+ - Remove "auth_uri" attribute and parameter. This was unused.
+ - Remove "state" attribute. This was unused.
+ - Remove "client" attribute. This was exposed by mistake and should have been internal. No replacement is possible.
+ - Remove "token_uri" attribute and parameter. Use "cloud_environment" and "tenant" to impact the login url now.
+ - Remove token caching based on "keyring". Token caching should be implemented using ADAL now. This implies:
+
+ - Remove the "keyring" parameter
+ - Remove the "clear_cached_token" method
+ - Remove the "retrieve_session" method
+
+ 2018-07-03 Version 0.4.35
+ +++++++++++++++++++++++++
+
+ **Bugfixes**
+
+ - MSIAuthentication regression for KeyVault since IMDS support #109
+
+ 2018-07-02 Version 0.4.34
+ +++++++++++++++++++++++++
+
+ **Bugfixes**
+
+ - MSIAuthentication should initialize the token attribute on creation #106
+
+ 2018-06-21 Version 0.4.33
+ +++++++++++++++++++++++++
+
+ **Bugfixes**
+
+ - Fixes refreshToken in UserPassCredentials and AADTokenCredentials #103
+ - Fix US government cloud definition #104
+
+ Thanks to mjcaley for his contribution
+
+ 2018-06-13 Version 0.4.32
+ +++++++++++++++++++++++++
+
+ **Features**
+
+ - Implement new LRO options of Autorest #101
+
+ **Bug fixes**
+
+ - Reduce max MSI polling time for VM #100
+
+
+ 2018-05-17 Version 0.4.31
+ +++++++++++++++++++++++++
+
+ **Features**
+
+ - Improve MSI for VM token polling algorithm
+
+ 2018-05-16 Version 0.4.30
+ +++++++++++++++++++++++++
+
+ **Features**
+
+ - Allow ADAL 0.5.0 to 2.0.0 excluded as valid ADAL dependency
+
+ 2018-04-30 Version 0.4.29
+ +++++++++++++++++++++++++
+
+ **Bugfixes**
+
+ - Fix refresh Token on `AADTokenCredentials` (was broken in 0.4.27)
+ - Now `UserPasswordCredentials` correctly use the refreshToken, and not user/password to refresh the session (was broken in 0.4.27)
+ - Bring back `keyring`, with minimal dependency 12.0.2 that fixes the installation problem on old Python
+
2018-04-23 Version 0.4.28
+++++++++++++++++++++++++
@@ -35,7 +116,7 @@
Do to some stability issues with "keyring" dependency that highly change from one system to another,
this package is no longer a dependency of "msrestazure".
- If you were using the secured token cache of `ServicePrincipalCredentials` and `UserPassCredentials`,
+ If you were using the secured token cache of `ServicePrincipalCredentials` and `UserPassCredentials`,
the feature is still available, but you need to install manually "keyring". The functionnality will activate automatically.
2018-04-18 Version 0.4.27
@@ -76,7 +157,7 @@
**Bugfix**
- - Fix LRO result if POST uses AsyncOperation header (Autorest.Python 3.0 only) #79
+ - Fix LRO result if POST uses AsyncOperation header (Autorest.Python 3.0 only) #79
2018-02-27 Version 0.4.22
+++++++++++++++++++++++++
@@ -441,5 +522,6 @@
Classifier: Programming Language :: Python :: 3.4
Classifier: Programming Language :: Python :: 3.5
Classifier: Programming Language :: Python :: 3.6
+Classifier: Programming Language :: Python :: 3.7
Classifier: License :: OSI Approved :: MIT License
Classifier: Topic :: Software Development
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/msrestazure-0.4.28/msrestazure.egg-info/requires.txt new/msrestazure-0.5.0/msrestazure.egg-info/requires.txt
--- old/msrestazure-0.4.28/msrestazure.egg-info/requires.txt 2018-04-24 01:45:43.000000000 +0200
+++ new/msrestazure-0.5.0/msrestazure.egg-info/requires.txt 2018-08-02 21:17:34.000000000 +0200
@@ -1,2 +1,2 @@
msrest<2.0.0,>=0.4.28
-adal~=0.5.0
+adal<2.0.0,>=0.6.0
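The requires.txt hunk above replaces the compatible-release pin `adal~=0.5.0` with an explicit range. As a rough illustration of what that widens (a hand-rolled tuple comparison for the sketch, not pip's actual resolver; per PEP 440, `~=0.5.0` means `>=0.5.0, ==0.5.*`):

```python
# Illustrative comparison of the old pin "adal~=0.5.0" and the new
# range "adal>=0.6.0,<2.0.0". Simple dotted versions only.

def parse(v):
    return tuple(int(p) for p in v.split('.'))

def old_pin_ok(v):
    # "~=0.5.0" is equivalent to ">=0.5.0, ==0.5.*"
    return parse(v) >= (0, 5, 0) and parse(v)[:2] == (0, 5)

def new_range_ok(v):
    return (0, 6, 0) <= parse(v) < (2, 0, 0)

print(old_pin_ok("0.5.2"), new_range_ok("0.5.2"))  # True False
print(old_pin_ok("1.1.0"), new_range_ok("1.1.0"))  # False True
```

So the new range admits every adal 0.6.x through 1.x release, matching the 0.4.30 changelog entry "Allow ADAL 0.5.0 to 2.0.0 excluded" while raising the floor to 0.6.0.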
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/msrestazure-0.4.28/setup.py new/msrestazure-0.5.0/setup.py
--- old/msrestazure-0.4.28/setup.py 2018-04-24 01:45:02.000000000 +0200
+++ new/msrestazure-0.5.0/setup.py 2018-08-02 21:16:52.000000000 +0200
@@ -28,7 +28,7 @@
setup(
name='msrestazure',
- version='0.4.28',
+ version='0.5.0',
author='Microsoft Corporation',
author_email='azpysdkhelp(a)microsoft.com',
packages=find_packages(exclude=["tests", "tests.*"]),
@@ -46,10 +46,11 @@
'Programming Language :: Python :: 3.4',
'Programming Language :: Python :: 3.5',
'Programming Language :: Python :: 3.6',
+ 'Programming Language :: Python :: 3.7',
'License :: OSI Approved :: MIT License',
'Topic :: Software Development'],
install_requires=[
"msrest>=0.4.28,<2.0.0",
- "adal~=0.5.0"
+ "adal>=0.6.0,<2.0.0",
],
)
Hello community,
here is the log from the commit of package python-msrest for openSUSE:Factory checked in at 2018-09-26 16:16:12
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/python-msrest (Old)
and /work/SRC/openSUSE:Factory/.python-msrest.new (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Package is "python-msrest"
Wed Sep 26 16:16:12 2018 rev:6 rq:638007 version:0.5.5
Changes:
--------
--- /work/SRC/openSUSE:Factory/python-msrest/python-msrest.changes 2018-05-13 16:03:43.821253496 +0200
+++ /work/SRC/openSUSE:Factory/.python-msrest.new/python-msrest.changes 2018-09-26 16:16:12.947009292 +0200
@@ -1,0 +2,14 @@
+Thu Sep 6 12:59:32 UTC 2018 - John Paul Adrian Glaubitz <adrian.glaubitz(a)suse.com>
+
+- New upstream release
+ + Version 0.5.0
+ + No upstream changelog provided
+- Add LICENSE.md and install into %license directory
+- Add python-enum34 to Requires for Python 2.x and Python 3.4
+- Add python-typing to Requires for Python Python 2x., 3.4 and 3.5
+- Refresh patches for new version
+ + m_drop-compatible-releases-operator.patch
+ + m_drop-extras-require.patch
+- Update Requires from setup.py
+
+-------------------------------------------------------------------
Old:
----
msrest-0.4.28.tar.gz
New:
----
LICENSE.md
msrest-0.5.5.tar.gz
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Other differences:
------------------
++++++ python-msrest.spec ++++++
--- /var/tmp/diff_new_pack.2ze7ys/_old 2018-09-26 16:16:13.443008471 +0200
+++ /var/tmp/diff_new_pack.2ze7ys/_new 2018-09-26 16:16:13.447008464 +0200
@@ -12,19 +12,20 @@
# license that conforms to the Open Source Definition (Version 1.9)
# published by the Open Source Initiative.
-# Please submit bugfixes or comments via http://bugs.opensuse.org/
+# Please submit bugfixes or comments via https://bugs.opensuse.org/
#
%{?!python_module:%define python_module() python-%{**} python3-%{**}}
Name: python-msrest
-Version: 0.4.28
+Version: 0.5.5
Release: 0
Summary: AutoRest swagger generator Python client runtime
License: MIT
Group: Development/Languages/Python
Url: https://pypi.python.org/pypi/msrest
Source: https://files.pythonhosted.org/packages/source/m/msrest/msrest-%{version}.t…
+Source1: LICENSE.md
Patch0: m_drop-compatible-releases-operator.patch
Patch1: m_drop-extras-require.patch
BuildRequires: %{python_module devel}
@@ -34,8 +35,14 @@
Requires: python-certifi >= 2017.4.17
Requires: python-isodate >= 0.6.0
Requires: python-requests < 3.00
-Requires: python-requests >= 2.14
+Requires: python-requests >= 2.16
Requires: python-requests-oauthlib >= 0.5.0
+%if "%{python_flavor}" == "python2" || %{python3_version_nodots} < 35
+Requires: python-typing
+%endif
+%if "%{python_flavor}" == "python2" || %{python3_version_nodots} < 34
+Requires: python-enum34 >= 1.0.4
+%endif
BuildRoot: %{_tmppath}/%{name}-%{version}-build
BuildArch: noarch
@@ -49,6 +56,7 @@
%setup -q -n msrest-%{version}
%patch0 -p1
%patch1 -p1
+cp %{SOURCE1} LICENSE.md
%build
%python_build
@@ -61,6 +69,7 @@
%files %{python_files}
%defattr(-,root,root,-)
%doc README.rst
+%license LICENSE.md
%{python_sitelib}/*
%changelog
++++++ LICENSE.md ++++++
MIT License
Copyright (c) 2016 Microsoft Azure
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
++++++ m_drop-compatible-releases-operator.patch ++++++
--- /var/tmp/diff_new_pack.2ze7ys/_old 2018-09-26 16:16:13.511008358 +0200
+++ /var/tmp/diff_new_pack.2ze7ys/_new 2018-09-26 16:16:13.511008358 +0200
@@ -1,12 +1,12 @@
-diff -Nru msrest-0.4.28.orig/setup.py msrest-0.4.28/setup.py
---- msrest-0.4.28.orig/setup.py 2018-04-18 23:08:15.000000000 +0200
-+++ msrest-0.4.28/setup.py 2018-04-24 12:46:33.099513431 +0200
+diff -Nru msrest-0.5.5.orig/setup.py msrest-0.5.5/setup.py
+--- msrest-0.5.5.orig/setup.py 2018-09-04 21:14:13.000000000 +0200
++++ msrest-0.5.5/setup.py 2018-09-06 14:25:16.473344949 +0200
@@ -47,7 +47,7 @@
'License :: OSI Approved :: MIT License',
'Topic :: Software Development'],
install_requires=[
-- "requests~=2.14",
-+ "requests>=2.14",
+- "requests~=2.16",
++ "requests>=2.16",
"requests_oauthlib>=0.5.0",
"isodate>=0.6.0",
"certifi>=2017.4.17",
++++++ m_drop-extras-require.patch ++++++
--- /var/tmp/diff_new_pack.2ze7ys/_old 2018-09-26 16:16:13.527008332 +0200
+++ /var/tmp/diff_new_pack.2ze7ys/_new 2018-09-26 16:16:13.527008332 +0200
@@ -1,11 +1,12 @@
-diff -Nru msrest-0.4.28.orig/setup.py msrest-0.4.28/setup.py
---- msrest-0.4.28.orig/setup.py 2018-04-24 12:46:33.099513431 +0200
-+++ msrest-0.4.28/setup.py 2018-04-24 12:47:31.784002289 +0200
-@@ -52,7 +52,4 @@
+diff -Nru msrest-0.5.5.orig/setup.py msrest-0.5.5/setup.py
+--- msrest-0.5.5.orig/setup.py 2018-09-06 14:25:16.473344949 +0200
++++ msrest-0.5.5/setup.py 2018-09-06 14:25:41.501568822 +0200
+@@ -52,8 +52,4 @@
"isodate>=0.6.0",
"certifi>=2017.4.17",
],
- extras_require={
- ":python_version<'3.4'": ['enum34>=1.0.4'],
+- ":python_version<'3.5'": ['typing'],
- }
)
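The extras_require block removed by the patch above used PEP 508 environment markers; the spec file instead expresses the same conditions as hard Requires guarded by `%if` on the Python flavor. A rough sketch of what those markers select (illustrative helper names, not the packaging library's API):

```python
# Sketch of the conditions behind ":python_version<'3.5'" (typing
# backport) and ":python_version<'3.4'" (enum34 backport) that the
# patch drops from setup.py and the spec re-adds as Requires.
import sys

def needs_typing_backport(version_info=sys.version_info):
    # Mirrors the marker python_version < '3.5'
    return version_info[:2] < (3, 5)

def needs_enum34_backport(version_info=sys.version_info):
    # Mirrors the marker python_version < '3.4'
    return version_info[:2] < (3, 4)

print(needs_typing_backport((2, 7, 15)))  # True
print(needs_typing_backport((3, 6, 0)))   # False
```

This matches the changelog entry adding python-typing for 2.x/3.4 and python-enum34 for 2.x, with distro packaging shifting the decision from pip's marker evaluation to RPM conditionals.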
++++++ msrest-0.4.28.tar.gz -> msrest-0.5.5.tar.gz ++++++
++++ 2665 lines of diff (skipped)
Hello community,
here is the log from the commit of package python-azure-storage for openSUSE:Factory checked in at 2018-09-26 16:16:10
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/python-azure-storage (Old)
and /work/SRC/openSUSE:Factory/.python-azure-storage.new (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Package is "python-azure-storage"
Wed Sep 26 16:16:10 2018 rev:2 rq:638006 version:0.36.0
Changes:
--------
--- /work/SRC/openSUSE:Factory/python-azure-storage/python-azure-storage.changes 2017-10-11 23:01:03.537676077 +0200
+++ /work/SRC/openSUSE:Factory/.python-azure-storage.new/python-azure-storage.changes 2018-09-26 16:16:11.627011477 +0200
@@ -1,0 +2,5 @@
+Thu Sep 6 12:19:34 UTC 2018 - John Paul Adrian Glaubitz <adrian.glaubitz(a)suse.com>
+
+- Move LICENSE.txt from %doc to %license section
+
+-------------------------------------------------------------------
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Other differences:
------------------
++++++ python-azure-storage.spec ++++++
--- /var/tmp/diff_new_pack.5QqjzH/_old 2018-09-26 16:16:12.615009842 +0200
+++ /var/tmp/diff_new_pack.5QqjzH/_new 2018-09-26 16:16:12.619009835 +0200
@@ -1,7 +1,7 @@
#
# spec file for package python-azure-storage
#
-# Copyright (c) 2017 SUSE LINUX GmbH, Nuernberg, Germany.
+# Copyright (c) 2018 SUSE LINUX GmbH, Nuernberg, Germany.
#
# All modifications and additions to the file contributed by third parties
# remain the property of their copyright owners, unless otherwise agreed
@@ -71,7 +71,8 @@
%files %{python_files}
%defattr(-,root,root,-)
-%doc LICENSE.txt README.rst
+%doc README.rst
+%license LICENSE.txt
%{python_sitelib}/azure/storage
%{python_sitelib}/azure_storage-*.egg-info
Hello community,
here is the log from the commit of package python-azure-servicemanagement-legacy for openSUSE:Factory checked in at 2018-09-26 16:16:07
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/python-azure-servicemanagement-legacy (Old)
and /work/SRC/openSUSE:Factory/.python-azure-servicemanagement-legacy.new (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Package is "python-azure-servicemanagement-legacy"
Wed Sep 26 16:16:07 2018 rev:3 rq:638005 version:0.20.6
Changes:
--------
--- /work/SRC/openSUSE:Factory/python-azure-servicemanagement-legacy/python-azure-servicemanagement-legacy.changes 2018-01-30 15:45:19.352031164 +0100
+++ /work/SRC/openSUSE:Factory/.python-azure-servicemanagement-legacy.new/python-azure-servicemanagement-legacy.changes 2018-09-26 16:16:07.583018172 +0200
@@ -1,0 +2,5 @@
+Thu Sep 6 12:17:22 UTC 2018 - John Paul Adrian Glaubitz <adrian.glaubitz(a)suse.com>
+
+- Move LICENSE.txt from %doc to %license section
+
+-------------------------------------------------------------------
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Other differences:
------------------
++++++ python-azure-servicemanagement-legacy.spec ++++++
--- /var/tmp/diff_new_pack.NxIjUd/_old 2018-09-26 16:16:08.031017430 +0200
+++ /var/tmp/diff_new_pack.NxIjUd/_new 2018-09-26 16:16:08.035017424 +0200
@@ -63,7 +63,8 @@
%files %{python_files}
%defattr(-,root,root,-)
-%doc HISTORY.rst LICENSE.txt README.rst
+%doc HISTORY.rst README.rst
+%license LICENSE.txt
%{python_sitelib}/azure/servicemanagement
%{python_sitelib}/azure_servicemanagement_legacy-*.egg-info
Hello community,
here is the log from the commit of package python-azure-servicefabric for openSUSE:Factory checked in at 2018-09-26 16:16:05
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/python-azure-servicefabric (Old)
and /work/SRC/openSUSE:Factory/.python-azure-servicefabric.new (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Package is "python-azure-servicefabric"
Wed Sep 26 16:16:05 2018 rev:4 rq:638004 version:6.3.0.0
Changes:
--------
--- /work/SRC/openSUSE:Factory/python-azure-servicefabric/python-azure-servicefabric.changes 2018-05-13 15:59:32.622418466 +0200
+++ /work/SRC/openSUSE:Factory/.python-azure-servicefabric.new/python-azure-servicefabric.changes 2018-09-26 16:16:06.939019238 +0200
@@ -1,0 +2,9 @@
+Thu Sep 6 12:13:33 UTC 2018 - John Paul Adrian Glaubitz <adrian.glaubitz(a)suse.com>
+
+- New upstream release
+ + Version 6.3.0.0
+ + For detailed information about changes see the
+ HISTORY.txt file provided with this package
+- Update %description from setup.py
+
+-------------------------------------------------------------------
Old:
----
azure-servicefabric-6.1.2.9.zip
New:
----
azure-servicefabric-6.3.0.0.zip
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Other differences:
------------------
++++++ python-azure-servicefabric.spec ++++++
--- /var/tmp/diff_new_pack.NY40jC/_old 2018-09-26 16:16:07.447018397 +0200
+++ /var/tmp/diff_new_pack.NY40jC/_new 2018-09-26 16:16:07.447018397 +0200
@@ -18,7 +18,7 @@
%{?!python_module:%define python_module() python-%{**} python3-%{**}}
Name: python-azure-servicefabric
-Version: 6.1.2.9
+Version: 6.3.0.0
Release: 0
Summary: Microsoft Azure Service Fabric Client Library
License: MIT
@@ -49,7 +49,7 @@
Azure Resource Manager (ARM) is the next generation of management APIs that
replace the old Azure Service Management (ASM).
-This package has been tested with Python 2.7, 3.4, 3.5 and 3.6.
+This package has been tested with Python 2.7, 3.4, 3.5, 3.6, and 3.7.
%prep
%setup -q -n azure-servicefabric-%{version}
Hello community,
here is the log from the commit of package python-azure-servicebus for openSUSE:Factory checked in at 2018-09-26 16:16:01
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/python-azure-servicebus (Old)
and /work/SRC/openSUSE:Factory/.python-azure-servicebus.new (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Package is "python-azure-servicebus"
Wed Sep 26 16:16:01 2018 rev:3 rq:638003 version:0.21.1
Changes:
--------
--- /work/SRC/openSUSE:Factory/python-azure-servicebus/python-azure-servicebus.changes 2018-01-30 15:45:24.363797195 +0100
+++ /work/SRC/openSUSE:Factory/.python-azure-servicebus.new/python-azure-servicebus.changes 2018-09-26 16:16:04.663023006 +0200
@@ -1,0 +2,5 @@
+Thu Sep 6 12:09:01 UTC 2018 - John Paul Adrian Glaubitz <adrian.glaubitz(a)suse.com>
+
+- Move LICENSE.txt from %doc to %license section
+
+-------------------------------------------------------------------
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Other differences:
------------------
++++++ python-azure-servicebus.spec ++++++
--- /var/tmp/diff_new_pack.ALlMr1/_old 2018-09-26 16:16:05.243022046 +0200
+++ /var/tmp/diff_new_pack.ALlMr1/_new 2018-09-26 16:16:05.247022039 +0200
@@ -61,7 +61,8 @@
%files %{python_files}
%defattr(-,root,root,-)
-%doc HISTORY.rst LICENSE.txt README.rst
+%doc HISTORY.rst README.rst
+%license LICENSE.txt
%{python_sitelib}/azure/servicebus
%{python_sitelib}/azure_servicebus-*.egg-info