openSUSE Commits
August 2017
Hello community,
here is the log from the commit of package ghc-megaparsec for openSUSE:Factory checked in at 2017-08-31 20:48:24
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/ghc-megaparsec (Old)
and /work/SRC/openSUSE:Factory/.ghc-megaparsec.new (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Package is "ghc-megaparsec"
Thu Aug 31 20:48:24 2017 rev:5 rq:513430 version:5.3.1
Changes:
--------
--- /work/SRC/openSUSE:Factory/ghc-megaparsec/ghc-megaparsec.changes 2017-07-05 23:59:10.569996700 +0200
+++ /work/SRC/openSUSE:Factory/.ghc-megaparsec.new/ghc-megaparsec.changes 2017-08-31 20:48:26.539003999 +0200
@@ -1,0 +2,5 @@
+Thu Jul 27 14:07:26 UTC 2017 - psimons(a)suse.com
+
+- Update to version 5.3.1.
+
+-------------------------------------------------------------------
Old:
----
megaparsec-5.2.0.tar.gz
megaparsec.cabal
New:
----
megaparsec-5.3.1.tar.gz
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Other differences:
------------------
++++++ ghc-megaparsec.spec ++++++
--- /var/tmp/diff_new_pack.SmXjMD/_old 2017-08-31 20:48:27.350890037 +0200
+++ /var/tmp/diff_new_pack.SmXjMD/_new 2017-08-31 20:48:27.350890037 +0200
@@ -19,14 +19,13 @@
%global pkg_name megaparsec
%bcond_with tests
Name: ghc-%{pkg_name}
-Version: 5.2.0
+Version: 5.3.1
Release: 0
Summary: Monadic parser combinators
License: BSD-2-Clause
Group: Development/Languages/Other
Url: https://hackage.haskell.org/package/%{pkg_name}
Source0: https://hackage.haskell.org/package/%{pkg_name}-%{version}/%{pkg_name}-%{ve…
-Source1: https://hackage.haskell.org/package/%{pkg_name}-%{version}/revision/1.cabal…
BuildRequires: ghc-Cabal-devel
BuildRequires: ghc-QuickCheck-devel
BuildRequires: ghc-bytestring-devel
@@ -61,7 +60,6 @@
%prep
%setup -q -n %{pkg_name}-%{version}
-cp -p %{SOURCE1} %{pkg_name}.cabal
%build
%ghc_lib_build
++++++ megaparsec-5.2.0.tar.gz -> megaparsec-5.3.1.tar.gz ++++++
++++ 3055 lines of diff (skipped)
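The package above, megaparsec, provides monadic parser combinators. As a base-only sketch of the underlying idea — not megaparsec's actual API; the `Parser` type and helper names below are hypothetical — a combinator library threads the remaining input through a function that may consume a prefix and produce a value:

```haskell
import Data.Char (isDigit)

-- A toy parser: consume a prefix of the input, maybe producing a value.
-- (megaparsec's real Parsec type additionally tracks position and errors.)
newtype Parser a = Parser { runParser :: String -> Maybe (a, String) }

instance Functor Parser where
  fmap f (Parser p) = Parser $ \s -> fmap (\(a, r) -> (f a, r)) (p s)

instance Applicative Parser where
  pure a = Parser $ \s -> Just (a, s)
  Parser pf <*> Parser pa = Parser $ \s -> do
    (f, s')  <- pf s
    (a, s'') <- pa s'
    pure (f a, s'')

-- Consume one character satisfying a predicate.
satisfy :: (Char -> Bool) -> Parser Char
satisfy ok = Parser $ \s -> case s of
  (c:rest) | ok c -> Just (c, rest)
  _               -> Nothing

-- One or more occurrences, in the style of megaparsec's `some`.
some1 :: Parser a -> Parser [a]
some1 p = Parser go
  where
    go s = case runParser p s of
      Nothing      -> Nothing
      Just (a, s') -> case go s' of
        Nothing      -> Just ([a], s')
        Just (as, r) -> Just (a:as, r)

-- Parse a run of digits as an Int.
int :: Parser Int
int = read <$> some1 (satisfy isDigit)

main :: IO ()
main = print (runParser int "123abc")  -- Just (123,"abc")
```

Small combinators like `satisfy` compose into larger grammars; that compositionality is what the package's "Monadic parser combinators" summary refers to.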
Hello community,
here is the log from the commit of package ghc-makefile for openSUSE:Factory checked in at 2017-08-31 20:48:22
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/ghc-makefile (Old)
and /work/SRC/openSUSE:Factory/.ghc-makefile.new (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Package is "ghc-makefile"
Thu Aug 31 20:48:22 2017 rev:2 rq:513428 version:1.0.0.4
Changes:
--------
--- /work/SRC/openSUSE:Factory/ghc-makefile/ghc-makefile.changes 2017-04-12 18:07:40.202463831 +0200
+++ /work/SRC/openSUSE:Factory/.ghc-makefile.new/ghc-makefile.changes 2017-08-31 20:48:23.415442444 +0200
@@ -1,0 +2,5 @@
+Thu Jul 27 14:08:10 UTC 2017 - psimons(a)suse.com
+
+- Update to version 1.0.0.4.
+
+-------------------------------------------------------------------
Old:
----
makefile-0.1.1.0.tar.gz
New:
----
makefile-1.0.0.4.tar.gz
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Other differences:
------------------
++++++ ghc-makefile.spec ++++++
--- /var/tmp/diff_new_pack.famTNi/_old 2017-08-31 20:48:24.395304904 +0200
+++ /var/tmp/diff_new_pack.famTNi/_new 2017-08-31 20:48:24.403303781 +0200
@@ -19,30 +19,33 @@
%global pkg_name makefile
%bcond_with tests
Name: ghc-%{pkg_name}
-Version: 0.1.1.0
+Version: 1.0.0.4
Release: 0
-Summary: Simple Makefile parser
+Summary: Simple Makefile parser and generator
License: MIT
Group: Development/Languages/Other
Url: https://hackage.haskell.org/package/%{pkg_name}
Source0: https://hackage.haskell.org/package/%{pkg_name}-%{version}/%{pkg_name}-%{ve…
BuildRequires: ghc-Cabal-devel
BuildRequires: ghc-attoparsec-devel
-BuildRequires: ghc-bytestring-devel
BuildRequires: ghc-rpm-macros
+BuildRequires: ghc-text-devel
BuildRoot: %{_tmppath}/%{name}-%{version}-build
%if %{with tests}
BuildRequires: ghc-Glob-devel
+BuildRequires: ghc-QuickCheck-devel
BuildRequires: ghc-doctest-devel
%endif
%description
This package provides a few 'Attoparser' parsers and convenience functions for
-parsing Makefiles. The datatypes used for describing Makefiles are located in
-'Data.Makefile'. The parsers and parsing functions are located in
-'Data.Makefile.Parse'. To parse a Makefile in the current folder, simply run
+parsing and generating Makefiles. The datatypes used for describing Makefiles
+are located in 'Data.Makefile'. The parsers and parsing functions are located
+in 'Data.Makefile.Parse'. The generating and encoding functions are located in
+'Data.Makefile.Render'. To parse a Makefile in the current folder, simply run
'parseMakefile'. To parse a Makefile located at 'path', run 'parseAsMakefile'
-'path'.
+'path'. To parse a Makefile from a Text 'txt', run 'parseMakefileContents txt`.
+To encode a 'Makefile', run 'encodeMakefile'.
%package devel
Summary: Haskell %{pkg_name} library development files
++++++ makefile-0.1.1.0.tar.gz -> makefile-1.0.0.4.tar.gz ++++++
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/makefile-0.1.1.0/makefile.cabal new/makefile-1.0.0.4/makefile.cabal
--- old/makefile-0.1.1.0/makefile.cabal 2017-02-17 22:20:44.000000000 +0100
+++ new/makefile-1.0.0.4/makefile.cabal 2017-06-16 09:50:45.000000000 +0200
@@ -1,63 +1,65 @@
-name: makefile
-version: 0.1.1.0
-synopsis: Simple Makefile parser
+name: makefile
+version: 1.0.0.4
+cabal-version: >=1.10
+build-type: Simple
+license: MIT
+license-file: LICENSE
+copyright: 2016-2017 Nicolas Mattia
+maintainer: nicolas(a)nmattia.com
+homepage: http://github.com/nmattia/mask
+synopsis: Simple Makefile parser and generator
description:
- This package provides a few @Attoparser@ parsers and convenience functions
- for parsing Makefiles.
-
- The datatypes used for describing Makefiles are located in 'Data.Makefile'.
- The parsers and parsing functions are located in 'Data.Makefile.Parse'.
-
- To parse a Makefile in the current folder, simply run 'parseMakefile'. To
- parse a Makefile located at @path@, run 'parseAsMakefile' @path@.
-
-homepage: http://github.com/nmattia/mask
-license: MIT
-license-file: LICENSE
-author: Nicolas Mattia
-maintainer: nicolas(a)nmattia.com
-copyright: 2016 Nicolas Mattia
-category: Parsing
-build-type: Simple
-cabal-version: >=1.10
-
+ This package provides a few @Attoparser@ parsers and convenience functions
+ for parsing and generating Makefiles.
+ The datatypes used for describing Makefiles are located in 'Data.Makefile'.
+ The parsers and parsing functions are located in 'Data.Makefile.Parse'.
+ The generating and encoding functions are located in 'Data.Makefile.Render'.
+ To parse a Makefile in the current folder, simply run 'parseMakefile'. To
+ parse a Makefile located at @path@, run 'parseAsMakefile' @path@. To parse a
+ Makefile from a Text @txt@, run 'parseMakefileContents txt`.
+ To encode a @Makefile@, run 'encodeMakefile'.
+category: Parsing
+author: Nicolas Mattia
extra-source-files:
- test-data/basic/Makefile1
- test-data/basic/Makefile2
- test-data/elfparse/Makefile
+ test-data/basic/Makefile1
+ test-data/basic/Makefile2
+ test-data/elfparse/Makefile
source-repository head
- type: git
- location: https://github.com/nmattia/mask.git
+ type: git
+ location: https://github.com/nmattia/mask.git
library
- hs-source-dirs: src
- default-language: Haskell2010
- build-depends: base >= 4.7 && < 5
- , attoparsec >= 0.12
- , bytestring >= 0.10
- exposed-modules:
- Data.Makefile
- , Data.Makefile.Parse
- , Data.Makefile.Parse.Internal
- , Data.Makefile.Render
- , Data.Makefile.Render.Internal
- ghc-options: -Wall
-
+ exposed-modules:
+ Data.Makefile
+ Data.Makefile.Parse
+ Data.Makefile.Parse.Internal
+ Data.Makefile.Render
+ Data.Makefile.Render.Internal
+ build-depends:
+ base >=4.7 && <5,
+ attoparsec >=0.12 && <0.14,
+ text >=1.1 && <1.3
+ default-language: Haskell2010
+ hs-source-dirs: src
+ ghc-options: -Wall
test-suite test
- hs-source-dirs: src
- default-language: Haskell2010
- type: exitcode-stdio-1.0
- main-is: Test.hs
- build-depends: base
- , attoparsec >= 0.12
- , bytestring >= 0.10
- , doctest >= 0.9
- , Glob >= 0.7
- , makefile
- other-modules: Data.Makefile
- , Data.Makefile.Parse
- , Data.Makefile.Parse.Internal
- , Data.Makefile.Render
- , Data.Makefile.Render.Internal
+ type: exitcode-stdio-1.0
+ main-is: Test.hs
+ build-depends:
+ base >=4.9.1.0 && <4.10,
+ attoparsec >=0.12 && <0.14,
+ text >=1.1 && <1.3,
+ doctest >=0.9 && <0.12,
+ Glob >=0.7 && <0.9,
+ QuickCheck >=2.9.2 && <2.11,
+ makefile >=1.0.0.4 && <1.1
+ default-language: Haskell2010
+ hs-source-dirs: src
+ other-modules:
+ Data.Makefile
+ Data.Makefile.Parse
+ Data.Makefile.Parse.Internal
+ Data.Makefile.Render
+ Data.Makefile.Render.Internal
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/makefile-0.1.1.0/src/Data/Makefile/Parse/Internal.hs new/makefile-1.0.0.4/src/Data/Makefile/Parse/Internal.hs
--- old/makefile-0.1.1.0/src/Data/Makefile/Parse/Internal.hs 2016-08-05 15:37:35.000000000 +0200
+++ new/makefile-1.0.0.4/src/Data/Makefile/Parse/Internal.hs 2017-04-29 12:29:28.000000000 +0200
@@ -1,13 +1,16 @@
{-# LANGUAGE OverloadedStrings #-}
+{-# LANGUAGE LambdaCase #-}
module Data.Makefile.Parse.Internal where
-import Control.Applicative ((<|>))
-import Data.Attoparsec.ByteString
+import Control.Monad
+import Control.Applicative
+import Data.Attoparsec.Text
import Data.Makefile
-import qualified Data.Attoparsec.ByteString.Char8 as Atto
-import qualified Data.ByteString as B
+import qualified Data.Attoparsec.Text as Atto
+import qualified Data.Text as T
+import qualified Data.Text.IO as T
-- $setup
-- >>> :set -XOverloadedStrings
@@ -16,11 +19,14 @@
--
-- Tries to open and parse a file name @Makefile@ in the current directory.
parseMakefile :: IO (Either String Makefile)
-parseMakefile = Atto.parseOnly makefile <$> B.readFile "Makefile"
+parseMakefile = Atto.parseOnly makefile <$> T.readFile "Makefile"
-- | Parse the specified file as a makefile.
parseAsMakefile :: FilePath -> IO (Either String Makefile)
-parseAsMakefile f = Atto.parseOnly makefile <$> B.readFile f
+parseAsMakefile f = Atto.parseOnly makefile <$> T.readFile f
+
+parseMakefileContents :: T.Text -> Either String Makefile
+parseMakefileContents = Atto.parseOnly makefile
--------------------------------------------------------------------------------
-- Parsers
@@ -34,56 +40,125 @@
entry :: Parser Entry
entry = many' emptyLine *> (assignment <|> rule)
--- | Parser of variable assignment
+-- | Parser of variable assignment (see 'Assignment'). Note that leading and
+-- trailing whitespaces will be stripped both from the variable name and
+-- assigned value.
+--
+-- Note that this tries to follow GNU make's (crazy) behavior when it comes to
+-- variable names and assignment operators.
+--
+-- >>> Atto.parseOnly assignment "foo = bar "
+-- Right (Assignment RecursiveAssign "foo" "bar")
+--
+-- >>> Atto.parseOnly assignment "foo := bar "
+-- Right (Assignment SimpleAssign "foo" "bar")
+--
+-- >>> Atto.parseOnly assignment "foo ::= bar "
+-- Right (Assignment SimplePosixAssign "foo" "bar")
+--
+-- >>> Atto.parseOnly assignment "foo?= bar "
+-- Right (Assignment ConditionalAssign "foo" "bar")
+--
+-- >>> Atto.parseOnly assignment "foo??= bar "
+-- Right (Assignment ConditionalAssign "foo?" "bar")
+--
+-- >>> Atto.parseOnly assignment "foo!?!= bar "
+-- Right (Assignment ShellAssign "foo!?" "bar")
assignment :: Parser Entry
-assignment = Assignment <$> (lazyVar <|> immVar)
- <*> toLineEnd1
+assignment = do
+ varName <- variableName
+ assType <- assignmentType
+ varVal <- toEscapedLineEnd
+ return (Assignment assType varName varVal)
+
+-- | Read chars while some ('Parser', monadic) predicate is 'True'.
+--
+-- XXX: extremely inefficient.
+takeWhileM :: (Char -> Parser Bool) -> Parser T.Text
+takeWhileM a = (T.pack . reverse) <$> go []
+ where
+ go cs = do
+ c <- Atto.anyChar
+ True <- a c
+ go (c:cs) <|> pure (c:cs)
+
+
+-- | Parse a variable name, not consuming any of the assignment operator. See
+-- also 'assignment'.
+--
+-- >>> Atto.parseOnly variableName "foo!?!= bar "
+-- Right "foo!?"
+variableName :: Parser T.Text
+variableName = stripped $ takeWhileM go
+ where
+ go '+' = Atto.peekChar' >>= \case
+ '=' -> return False
+ _c -> return True
+ go '?' = Atto.peekChar' >>= \case
+ '=' -> return False
+ _c -> return True
+ go '!' = Atto.peekChar' >>= \case
+ '=' -> return False
+ _c -> return True
+ -- those chars are not allowed in variable names
+ go ':' = return False
+ go '#' = return False
+ go '=' = return False
+ go _c = return True
+
+-- | Parse an assignment type, not consuming any of the assigned value. See
+-- also 'assignment'.
+--
+-- >>> Atto.parseOnly assignmentType "!= bar "
+-- Right ShellAssign
+assignmentType :: Parser AssignmentType
+assignmentType =
+ ("=" *> pure RecursiveAssign)
+ <|> ("+=" *> pure AppendAssign)
+ <|> ("?=" *> pure ConditionalAssign)
+ <|> ("!=" *> pure ShellAssign)
+ <|> (":=" *> pure SimpleAssign)
+ <|> ("::=" *> pure SimplePosixAssign)
-- | Parser for an entire rule
rule :: Parser Entry
-rule = Rule <$> target
- <*> (many' dependency <* nextLine)
- <*> many' command
+rule =
+ Rule
+ <$> target
+ <*> many' dependency
+ <*> many' (many' emptyLine *> command)
-- | Parser for a command
command :: Parser Command
-command = Command <$> (many' emptyLine *> Atto.char8 '\t'
- *> toLineEnd1
- <* nextLine)
+command = Command <$> (Atto.char '\t' *> toEscapedLineEnd)
-- | Parser for a (rule) target
target :: Parser Target
-target = Target <$> (Atto.takeWhile (/= ':') <* Atto.char8 ':')
+target = Target <$> stripped (Atto.takeWhile (/= ':') <* Atto.char ':')
-- | Parser for a (rule) dependency
dependency :: Parser Dependency
-dependency = Dependency <$> (Atto.takeWhile isSpaceChar
- *> Atto.takeWhile1 (`notElem` [' ', '\n', '#']))
-
--- | Parser for variable name in declaration (lazy set, @var = x@)
---
--- >>> Atto.parseOnly lazyVar "CFLAGS=-c -Wall"
--- Right "CFLAGS"
-lazyVar :: Parser B.ByteString
-lazyVar = Atto.takeWhile1 (`notElem` ['=', '\n', '#']) <* Atto.char8 '='
-
--- | Parser for variable name in declaration (immediate set, @var := x@)
---
--- >>> Atto.parseOnly immVar "CFLAGS:=-c -Wall"
--- Right "CFLAGS"
-immVar :: Parser B.ByteString
-immVar = Atto.takeWhile1 (`notElem` [':', '\n', '#']) <* Atto.string ":="
+dependency = Dependency <$> (sameLine <|> newLine)
+ where
+ sameLine =
+ Atto.takeWhile (== ' ')
+ *> Atto.takeWhile1 (`notElem` [' ', '\n', '#', '\\'])
+ newLine =
+ Atto.takeWhile (== ' ')
+ *> Atto.char '\\'
+ *> Atto.char '\n'
+ *> (sameLine <|> newLine)
-- | Parser for a comment (the comment starts with the hashtag)
--
-- >>> Atto.parseOnly comment "# I AM A COMMENT"
-- Right " I AM A COMMENT"
-comment :: Parser B.ByteString
-comment = Atto.char8 '#' *> Atto.takeWhile (/= '\n')
+comment :: Parser T.Text
+comment = Atto.char '#' *> Atto.takeWhile (/= '\n')
-- | Consume a newline character (@'\n'@)
nextLine :: Parser ()
-nextLine = Atto.takeWhile (/= '\n') *> Atto.char8 '\n' *> pure ()
+nextLine = Atto.takeWhile (/= '\n') *> Atto.char '\n' *> pure ()
-- | Consume an empty line (potentially containing spaces and/or tabs).
--
@@ -92,11 +167,35 @@
emptyLine :: Parser ()
emptyLine = Atto.takeWhile (`elem` ['\t', ' ']) *>
many' comment *>
- Atto.char8 '\n' *>
+ Atto.char '\n' *>
pure ()
-isSpaceChar :: Char -> Bool
-isSpaceChar c = c == ' '
+toLineEnd :: Parser T.Text
+toLineEnd = Atto.takeWhile (`notElem` ['\n', '#'])
+
+-- | Get the contents until the end of the (potentially multi) line. Multiple
+-- lines are separated by a @\\@ char and individual lines will be stripped and
+-- spaces will be interspersed.
+--
+-- The final @\n@ character is consumed.
+--
+-- >>> Atto.parseOnly toEscapedLineEnd "foo bar \\\n baz"
+-- Right "foo bar baz"
+--
+-- >>> Atto.parseOnly toEscapedLineEnd "foo \t\\\n bar \\\n baz \\\n \t"
+-- Right "foo bar baz"
+toEscapedLineEnd :: Parser T.Text
+toEscapedLineEnd = (T.unwords . filter (not . T.null)) <$> go
+ where
+ go = do
+ l <- toLineEnd <* (void (Atto.char '\n') <|> pure ())
+ case T.stripSuffix "\\" l of
+ Nothing -> return [T.strip l]
+ Just l' -> (T.strip l':) <$> go
+
+-------------------------------------------------------------------------------
+-- Helpers
+-------------------------------------------------------------------------------
-toLineEnd1 :: Parser B.ByteString
-toLineEnd1 = Atto.takeWhile1 (`notElem` ['\n', '#'])
+stripped :: Parser T.Text -> Parser T.Text
+stripped = fmap T.strip
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/makefile-0.1.1.0/src/Data/Makefile/Parse.hs new/makefile-1.0.0.4/src/Data/Makefile/Parse.hs
--- old/makefile-0.1.1.0/src/Data/Makefile/Parse.hs 2016-07-01 17:29:08.000000000 +0200
+++ new/makefile-1.0.0.4/src/Data/Makefile/Parse.hs 2017-04-29 12:29:28.000000000 +0200
@@ -3,15 +3,18 @@
module Data.Makefile.Parse
( I.parseMakefile
, I.parseAsMakefile
+ , I.parseMakefileContents
, I.makefile
, I.entry
, I.assignment
+ , I.variableName
+ , I.assignmentType
, I.rule
, I.command
, I.target
, I.dependency
- , I.lazyVar
- , I.immVar
- , I.comment) where
+ , I.comment
+ , I.toEscapedLineEnd
+ ) where
import qualified Data.Makefile.Parse.Internal as I
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/makefile-0.1.1.0/src/Data/Makefile/Render/Internal.hs new/makefile-1.0.0.4/src/Data/Makefile/Render/Internal.hs
--- old/makefile-0.1.1.0/src/Data/Makefile/Render/Internal.hs 2017-02-17 22:19:13.000000000 +0100
+++ new/makefile-1.0.0.4/src/Data/Makefile/Render/Internal.hs 2017-04-15 22:41:21.000000000 +0200
@@ -3,31 +3,42 @@
module Data.Makefile.Render.Internal where
import Data.Makefile
import Data.Monoid
-import qualified Data.ByteString.Lazy as B
-import Data.ByteString.Builder
-import qualified Data.ByteString.Lazy.Char8 as BL
+import qualified Data.Text.Lazy as TL
+import qualified Data.Text.Lazy.IO as TL
+import Data.Text.Lazy.Builder
writeMakefile :: FilePath -> Makefile -> IO ()
writeMakefile f m = do
let s = encodeMakefile m
- BL.writeFile f s
+ TL.writeFile f s
-encodeMakefile :: Makefile -> B.ByteString
-encodeMakefile = toLazyByteString . renderMakefile
+encodeMakefile :: Makefile -> TL.Text
+encodeMakefile = toLazyText . renderMakefile
renderMakefile :: Makefile -> Builder
-renderMakefile (Makefile es ) = mconcat [renderEntry e <> charUtf8 '\n' | e <- es]
+renderMakefile (Makefile es ) = mconcat [renderEntry e <> singleton '\n' | e <- es]
renderEntry :: Entry -> Builder
-renderEntry (Assignment key value ) = byteString key <> charUtf8 '=' <> byteString value
+renderEntry (Assignment RecursiveAssign key value ) =
+ fromText key <> singleton '=' <> fromText value
+renderEntry (Assignment SimpleAssign key value ) =
+ fromText key <> fromText ":=" <> fromText value
+renderEntry (Assignment SimplePosixAssign key value ) =
+ fromText key <> fromText "::=" <> fromText value
+renderEntry (Assignment ConditionalAssign key value ) =
+ fromText key <> fromText "?=" <> fromText value
+renderEntry (Assignment ShellAssign key value ) =
+ fromText key <> fromText "!=" <> fromText value
+renderEntry (Assignment AppendAssign key value ) =
+ fromText key <> fromText "+=" <> fromText value
renderEntry (Rule (Target t) ds cmds) =
- byteString t <> charUtf8 ':' <>
- mconcat [charUtf8 ' ' <> renderDep d | d <- ds] <>
- charUtf8 '\n' <>
- mconcat [renderCmd cmd <> charUtf8 '\n' | cmd <- cmds]
+ fromText t <> singleton ':' <>
+ mconcat [singleton ' ' <> renderDep d | d <- ds] <>
+ singleton '\n' <>
+ mconcat [renderCmd cmd <> singleton '\n' | cmd <- cmds]
renderDep :: Dependency -> Builder
-renderDep (Dependency dep ) = byteString dep
+renderDep (Dependency dep ) = fromText dep
renderCmd :: Command -> Builder
-renderCmd (Command cmd ) = charUtf8 '\t' <> byteString cmd
+renderCmd (Command cmd ) = singleton '\t' <> fromText cmd
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/makefile-0.1.1.0/src/Data/Makefile.hs new/makefile-1.0.0.4/src/Data/Makefile.hs
--- old/makefile-0.1.1.0/src/Data/Makefile.hs 2016-07-11 11:57:16.000000000 +0200
+++ new/makefile-1.0.0.4/src/Data/Makefile.hs 2017-04-29 12:31:28.000000000 +0200
@@ -22,9 +22,10 @@
@
Makefile {
entries =
- [ Assignment "hello " " world"
- , Rule (Target "foo") [Dependency "bar"] [Command "baz"] ]
- })
+ [ Assignment RecursiveAssign "hello" "world"
+ , Rule (Target "foo") [Dependency "bar"] [Command "baz"]
+ ]
+ }
@
-}
@@ -33,7 +34,7 @@
import Data.String (IsString)
-import qualified Data.ByteString as B
+import qualified Data.Text as T
-- | A Makefile object, a list of makefile entries
@@ -42,13 +43,29 @@
-- | A makefile entry, either a rule @(target: dep1 dep1; commands)@ or a
-- variable assignment (@hello = world@ or @hello := world@)
data Entry = Rule Target [Dependency] [Command]
- | Assignment B.ByteString B.ByteString deriving (Show, Eq)
+ | Assignment AssignmentType T.Text T.Text
+ deriving (Show, Eq)
+
+data AssignmentType
+ = RecursiveAssign
+ -- ^ foo = bar
+ | SimpleAssign
+ -- ^ foo := bar
+ | SimplePosixAssign
+ -- ^ foo ::= bar
+ | ConditionalAssign
+ -- ^ foo ?= bar
+ | ShellAssign
+ -- ^ foo != bar
+ | AppendAssign
+ -- ^ foo += bar
+ deriving (Show, Eq, Enum, Bounded)
-- | Makefile target (@foo@ in the example above)
-newtype Target = Target B.ByteString deriving (Show, Eq, IsString)
+newtype Target = Target T.Text deriving (Show, Eq, IsString)
-- | Target dependency (@bar@ in the example above)
-newtype Dependency = Dependency B.ByteString deriving (Show, Eq, IsString)
+newtype Dependency = Dependency T.Text deriving (Show, Eq, IsString)
-- | Command (@baz@ in the example above)
-newtype Command = Command B.ByteString deriving (Show, Eq, IsString)
+newtype Command = Command T.Text deriving (Show, Eq, IsString)
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/makefile-0.1.1.0/src/Test.hs new/makefile-1.0.0.4/src/Test.hs
--- old/makefile-0.1.1.0/src/Test.hs 2017-02-17 22:19:13.000000000 +0100
+++ new/makefile-1.0.0.4/src/Test.hs 2017-04-15 22:41:21.000000000 +0200
@@ -5,12 +5,39 @@
import "Glob" System.FilePath.Glob (glob)
import Test.DocTest (doctest)
+import Data.Monoid
import Control.Monad
-import Data.ByteString hiding (any)
import Data.Makefile
import Data.Makefile.Parse
import Data.Makefile.Render
+import Test.QuickCheck
+import qualified Data.Text as T
+import qualified Data.Text.Lazy as TL
+
+instance Arbitrary Target where
+ arbitrary = pure $ Target "foo"
+
+instance Arbitrary Dependency where
+ arbitrary = pure $ Dependency "bar"
+
+instance Arbitrary Command where
+ arbitrary = pure $ Command "baz"
+
+instance Arbitrary AssignmentType where
+ arbitrary =
+ elements [minBound..maxBound]
+
+instance Arbitrary Entry where
+ arbitrary =
+ oneof
+ [ Rule <$> arbitrary <*> arbitrary <*> arbitrary
+ , Assignment <$> arbitrary <*> pure "foo" <*> pure "bar"
+ ]
+
+instance Arbitrary Makefile where
+ arbitrary = Makefile <$> arbitrary
+
main :: IO ()
main = do
@@ -66,23 +93,150 @@
withMakefile "test-data/basic/Makefile2" $ \m -> do
writeMakefile "test-data/basic/_Makefile2" m
withMakefile "test-data/basic/_Makefile2" $ \mm -> assertMakefile m mm
+ withMakefileContents
+ "foo = bar"
+ (assertAssignments [("foo", "bar")])
+ withMakefileContents "foo: bar" (assertTargets ["foo"])
+ withMakefileContents
+ "foo : bar"
+ (assertTargets ["foo"])
+ withMakefileContents
+ (T.pack $ unlines
+ [ "var="
+ , "foo: bar"
+ ]
+ )
+ (assertMakefile
+ Makefile
+ { entries =
+ [ Assignment RecursiveAssign "var" ""
+ , Rule "foo" ["bar"] []
+ ]
+ }
+ )
+ withMakefileContents
+ (T.pack $ unlines
+ [ "var=foo bar" ]
+ )
+ (assertAssignments [("var", "foo bar")])
+ withMakefileContents
+ (T.pack $ unlines
+ [ "var=foo bar\\"
+ , "baz"
+ ]
+ )
+ (assertAssignments [("var", "foo bar baz")])
+ withMakefileContents
+ (T.pack $ unlines
+ [ "var=foo bar \\"
+ , "baz"
+ ]
+ )
+ (assertAssignments [("var", "foo bar baz")])
+ withMakefileContents
+ (T.pack $ unlines
+ [ "var=foo bar\\"
+ , " baz"
+ ]
+ )
+ (assertAssignments [("var", "foo bar baz")])
+ withMakefileContents
+ (T.pack $ unlines
+ [ "var=foo bar \\"
+ , " baz"
+ ]
+ )
+ (assertAssignments [("var", "foo bar baz")])
+ withMakefileContents
+ (T.pack $ unlines
+ [ "var=foo bar \\"
+ , "\tbaz"
+ ]
+ )
+ (assertAssignments [("var", "foo bar baz")])
+ withMakefileContents
+ (T.pack $ unlines
+ [ "var=foo bar \t \\"
+ , " \t baz"
+ ]
+ )
+ (assertAssignments [("var", "foo bar baz")])
+ withMakefileContents
+ (T.pack $ unlines
+ [ "SUBDIRS=anna bspt cacheprof \\"
+ , " compress compress2 fem"
+ ]
+ )
+ (assertAssignments
+ [("SUBDIRS", "anna bspt cacheprof compress compress2 fem")])
+ withMakefileContents
+ (T.pack $ unlines
+ [ "foo: anna bspt cacheprof \\"
+ , " compress compress2 fem"
+ ]
+ )
+ (assertMakefile
+ Makefile
+ { entries =
+ [ Rule
+ "foo"
+ [ "anna"
+ , "bspt"
+ , "cacheprof"
+ , "compress"
+ , "compress2"
+ , "fem"
+ ] []
+ ]
+ }
+ )
+ withMakefileContents
+ (T.pack $ unlines
+ [ "foo:"
+ , "\tcd dir/ && \\"
+ , " ls"
+ ]
+ )
+ (assertMakefile
+ Makefile
+ { entries =
+ [ Rule "foo" [] ["cd dir/ && ls"]
+ ]
+ }
+ )
+ Success{} <- quickCheckResult prop_encodeDecode
+ return ()
+
+prop_encodeDecode :: Makefile -> Bool
+prop_encodeDecode m =
+ (fromRight $ parseMakefileContents $ TL.toStrict $ encodeMakefile m) == m
+
+withMakefileContents :: T.Text -> (Makefile -> IO ()) -> IO ()
+withMakefileContents contents a =
+ a $ fromRight (parseMakefileContents contents)
withMakefile :: FilePath -> (Makefile -> IO ()) -> IO ()
withMakefile f a = fromRight <$> parseAsMakefile f >>= a
assertMakefile :: Makefile -> Makefile -> IO ()
-assertMakefile m1 m2 = if (m1 == m2) then return () else error "Makefiles mismatch!"
+assertMakefile m1 m2 =
+ unless (m1 == m2)
+ $ error $ unwords
+ [ "Makefiles mismatch!"
+ , "got " <> show m1
+ , "and " <> show m2
+ ]
assertTargets :: [Target] -> Makefile -> IO ()
assertTargets ts m = mapM_ (`assertTarget` m) ts
-assertAssignments :: [(ByteString, ByteString)] -> Makefile -> IO ()
+assertAssignments :: [(T.Text, T.Text)] -> Makefile -> IO ()
assertAssignments as m = mapM_ (`assertAssignment` m) as
-assertAssignment :: (ByteString, ByteString) -> Makefile -> IO ()
+assertAssignment :: (T.Text, T.Text) -> Makefile -> IO ()
assertAssignment (n, v) (Makefile m) = unless (any hasAssignment m) $
error ("Assignment " ++ show (n, v) ++ " wasn't found in Makefile " ++ show m)
- where hasAssignment (Assignment n' v') = n == n' && v == v'
+ where hasAssignment (Assignment _ n' v') = n == n' && v == v'
hasAssignment _ = False
assertTarget :: Target -> Makefile -> IO ()
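The new `toEscapedLineEnd` parser in this update folds backslash-continued lines into a single space-separated value, per its doctests. A base-only sketch of the same folding over plain `String` lines — the helper names are hypothetical, not the package's API, and the check for a trailing backslash here happens after stripping, a minor simplification:

```haskell
import Data.Char (isSpace)
import Data.List (dropWhileEnd)

-- Strip leading and trailing whitespace, like Data.Text's strip.
strip :: String -> String
strip = dropWhileEnd isSpace . dropWhile isSpace

-- Fold a backslash-continued value into one line: each physical line is
-- stripped, a trailing '\' pulls in the next line, and the stripped
-- pieces are joined with single spaces (empty pieces dropped).
foldEscapedLines :: [String] -> String
foldEscapedLines = unwords . filter (not . null) . go
  where
    go []     = []
    go (l:ls) = case stripBackslash (strip l) of
      Nothing -> [strip l]          -- no continuation: stop here
      Just l' -> strip l' : go ls   -- continuation: keep folding
    stripBackslash s
      | not (null s) && last s == '\\' = Just (init s)
      | otherwise                      = Nothing

main :: IO ()
main = putStrLn (foldEscapedLines ["foo bar \\", "  baz"])  -- prints "foo bar baz"
```

This reproduces the doctest behavior above, e.g. `"var=foo bar \\"` followed by an indented `"baz"` parsing to the assignment value `"foo bar baz"`.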
Hello community,
here is the log from the commit of package ghc-logging-facade for openSUSE:Factory checked in at 2017-08-31 20:48:19
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/ghc-logging-facade (Old)
and /work/SRC/openSUSE:Factory/.ghc-logging-facade.new (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Package is "ghc-logging-facade"
Thu Aug 31 20:48:19 2017 rev:3 rq:513426 version:0.3.0
Changes:
--------
--- /work/SRC/openSUSE:Factory/ghc-logging-facade/ghc-logging-facade.changes 2016-12-06 14:25:03.000000000 +0100
+++ /work/SRC/openSUSE:Factory/.ghc-logging-facade.new/ghc-logging-facade.changes 2017-08-31 20:48:20.431861240 +0200
@@ -1,0 +2,5 @@
+Thu Jul 27 14:05:34 UTC 2017 - psimons(a)suse.com
+
+- Update to version 0.3.0.
+
+-------------------------------------------------------------------
Old:
----
logging-facade-0.1.1.tar.gz
logging-facade.cabal
New:
----
logging-facade-0.3.0.tar.gz
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Other differences:
------------------
++++++ ghc-logging-facade.spec ++++++
--- /var/tmp/diff_new_pack.afI0dZ/_old 2017-08-31 20:48:21.291740541 +0200
+++ /var/tmp/diff_new_pack.afI0dZ/_new 2017-08-31 20:48:21.299739419 +0200
@@ -1,7 +1,7 @@
#
# spec file for package ghc-logging-facade
#
-# Copyright (c) 2016 SUSE LINUX GmbH, Nuernberg, Germany.
+# Copyright (c) 2017 SUSE LINUX GmbH, Nuernberg, Germany.
#
# All modifications and additions to the file contributed by third parties
# remain the property of their copyright owners, unless otherwise agreed
@@ -19,17 +19,16 @@
%global pkg_name logging-facade
%bcond_with tests
Name: ghc-%{pkg_name}
-Version: 0.1.1
+Version: 0.3.0
Release: 0
Summary: Simple logging abstraction that allows multiple back-ends
License: MIT
Group: Development/Languages/Other
Url: https://hackage.haskell.org/package/%{pkg_name}
Source0: https://hackage.haskell.org/package/%{pkg_name}-%{version}/%{pkg_name}-%{ve…
-Source1: https://hackage.haskell.org/package/%{pkg_name}-%{version}/revision/1.cabal…
BuildRequires: ghc-Cabal-devel
+BuildRequires: ghc-call-stack-devel
BuildRequires: ghc-rpm-macros
-BuildRequires: ghc-template-haskell-devel
BuildRequires: ghc-transformers-devel
BuildRoot: %{_tmppath}/%{name}-%{version}-build
%if %{with tests}
@@ -52,7 +51,6 @@
%prep
%setup -q -n %{pkg_name}-%{version}
-cp -p %{SOURCE1} %{pkg_name}.cabal
%build
%ghc_lib_build
++++++ logging-facade-0.1.1.tar.gz -> logging-facade-0.3.0.tar.gz ++++++
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/logging-facade-0.1.1/LICENSE new/logging-facade-0.3.0/LICENSE
--- old/logging-facade-0.1.1/LICENSE 2016-02-21 05:05:49.000000000 +0100
+++ new/logging-facade-0.3.0/LICENSE 2017-06-01 15:24:19.000000000 +0200
@@ -1,4 +1,4 @@
-Copyright (c) 2014 Simon Hengel <sol(a)typeful.net>
+Copyright (c) 2014-2017 Simon Hengel <sol(a)typeful.net>
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/logging-facade-0.1.1/logging-facade.cabal new/logging-facade-0.3.0/logging-facade.cabal
--- old/logging-facade-0.1.1/logging-facade.cabal 2016-02-21 05:05:49.000000000 +0100
+++ new/logging-facade-0.3.0/logging-facade.cabal 2017-06-01 15:24:19.000000000 +0200
@@ -1,10 +1,16 @@
+-- This file has been generated from package.yaml by hpack version 0.17.0.
+--
+-- see: https://github.com/sol/hpack
+
name: logging-facade
-version: 0.1.1
+version: 0.3.0
synopsis: Simple logging abstraction that allows multiple back-ends
description: Simple logging abstraction that allows multiple back-ends
+homepage: https://github.com/sol/logging-facade#readme
+bug-reports: https://github.com/sol/logging-facade/issues
license: MIT
license-file: LICENSE
-copyright: (c) 2014 Simon Hengel
+copyright: (c) 2014-2017 Simon Hengel
author: Simon Hengel <sol(a)typeful.net>
maintainer: Simon Hengel <sol(a)typeful.net>
build-type: Simple
@@ -17,24 +23,30 @@
library
ghc-options: -Wall
- hs-source-dirs: src
+ hs-source-dirs:
+ src
exposed-modules:
System.Logging.Facade
- System.Logging.Facade.Sink
System.Logging.Facade.Class
+ System.Logging.Facade.Sink
System.Logging.Facade.Types
+ other-modules:
+ Paths_logging_facade
build-depends:
base == 4.*
+ , call-stack
, transformers
- , template-haskell
default-language: Haskell2010
test-suite spec
type: exitcode-stdio-1.0
ghc-options: -Wall
- hs-source-dirs: test
+ hs-source-dirs:
+ test
main-is: Spec.hs
other-modules:
+ Helper
+ System.Logging.Facade.SinkSpec
System.Logging.FacadeSpec
build-depends:
base == 4.*
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/logging-facade-0.1.1/src/System/Logging/Facade/Sink.hs new/logging-facade-0.3.0/src/System/Logging/Facade/Sink.hs
--- old/logging-facade-0.1.1/src/System/Logging/Facade/Sink.hs 2016-02-21 05:05:49.000000000 +0100
+++ new/logging-facade-0.3.0/src/System/Logging/Facade/Sink.hs 2017-06-01 15:24:19.000000000 +0200
@@ -1,13 +1,18 @@
+{-# LANGUAGE CPP #-}
module System.Logging.Facade.Sink (
LogSink
, defaultLogSink
-, setLogSink
, getLogSink
+, setLogSink
+, swapLogSink
+, withLogSink
) where
+import Control.Concurrent
import Data.IORef
import System.IO
import System.IO.Unsafe (unsafePerformIO)
+import Control.Exception
import System.Logging.Facade.Types
@@ -16,7 +21,7 @@
-- use the unsafePerformIO hack to share one sink across a process
logSink :: IORef LogSink
-logSink = unsafePerformIO (newIORef defaultLogSink)
+logSink = unsafePerformIO (defaultLogSink >>= newIORef)
{-# NOINLINE logSink #-}
-- | Return the global log sink.
@@ -27,9 +32,22 @@
setLogSink :: LogSink -> IO ()
setLogSink = atomicWriteIORef logSink
--- | A log sink that writes log messages to `stderr`
-defaultLogSink :: LogSink
-defaultLogSink record = hPutStrLn stderr output
+-- | Return the global log sink and set it to a new value in one atomic
+-- operation.
+swapLogSink :: LogSink -> IO LogSink
+swapLogSink new = atomicModifyIORef logSink $ \old -> (new, old)
+
+-- | Set the global log sink to a specified value, run given action, and
+-- finally restore the global log sink to its previous value.
+withLogSink :: LogSink -> IO () -> IO ()
+withLogSink sink action = bracket (swapLogSink sink) setLogSink (const action)
+
+-- | A thread-safe log sink that writes log messages to `stderr`
+defaultLogSink :: IO LogSink
+defaultLogSink = defaultLogSink_ `fmap` newMVar ()
+
+defaultLogSink_ :: MVar () -> LogSink
+defaultLogSink_ mvar record = withMVar mvar (\() -> hPutStrLn stderr output)
where
level = logRecordLevel record
mLocation = logRecordLocation record
@@ -40,3 +58,10 @@
formatLocation :: Location -> ShowS
formatLocation loc = showString (locationFile loc) . colon . shows (locationLine loc) . colon . shows (locationColumn loc)
where colon = showString ":"
+
+#if !MIN_VERSION_base(4,6,0)
+atomicWriteIORef :: IORef a -> a -> IO ()
+atomicWriteIORef ref a = do
+ x <- atomicModifyIORef ref (\_ -> (a, ()))
+ x `seq` return ()
+#endif
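The Sink.hs hunks above move the global sink from a pure value to an IO-created one, and add `swapLogSink`/`withLogSink` built on `bracket`. A minimal standalone sketch of the same global-IORef pattern (hypothetical names, not the package's API; the `NOINLINE` pragma is what keeps GHC from duplicating the cell):

```haskell
import Control.Exception (bracket)
import Data.IORef
import System.IO.Unsafe (unsafePerformIO)

-- Global mutable cell shared across the whole process.  NOINLINE is
-- essential: without it GHC may inline the unsafePerformIO call and
-- create several independent cells.
globalSink :: IORef (String -> IO ())
globalSink = unsafePerformIO (newIORef putStrLn)
{-# NOINLINE globalSink #-}

-- Send a message through whatever sink is currently installed.
logMsg :: String -> IO ()
logMsg s = readIORef globalSink >>= \sink -> sink s

-- Swap in a new sink and return the old one in one atomic operation.
swapSink :: (String -> IO ()) -> IO (String -> IO ())
swapSink new = atomicModifyIORef globalSink (\old -> (new, old))

-- Run an action with a temporary sink, restoring the previous one even
-- if the action throws -- the same bracket shape as withLogSink above.
withSink :: (String -> IO ()) -> IO a -> IO a
withSink sink action =
  bracket (swapSink sink) (atomicWriteIORef globalSink) (const action)
```

The exception-safety is the point of the new `withLogSink`: a test that installs a spy sink can no longer leave it installed after a failure.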
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/logging-facade-0.1.1/src/System/Logging/Facade/Types.hs new/logging-facade-0.3.0/src/System/Logging/Facade/Types.hs
--- old/logging-facade-0.1.1/src/System/Logging/Facade/Types.hs 2016-02-21 05:05:49.000000000 +0100
+++ new/logging-facade-0.3.0/src/System/Logging/Facade/Types.hs 2017-06-01 15:24:19.000000000 +0200
@@ -1,7 +1,7 @@
module System.Logging.Facade.Types where
data LogLevel = TRACE | DEBUG | INFO | WARN | ERROR
- deriving (Eq, Show, Ord, Bounded, Enum)
+ deriving (Eq, Show, Read, Ord, Bounded, Enum)
data Location = Location {
locationPackage :: String
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/logging-facade-0.1.1/src/System/Logging/Facade.hs new/logging-facade-0.3.0/src/System/Logging/Facade.hs
--- old/logging-facade-0.1.1/src/System/Logging/Facade.hs 2016-02-21 05:05:49.000000000 +0100
+++ new/logging-facade-0.3.0/src/System/Logging/Facade.hs 2017-06-01 15:24:19.000000000 +0200
@@ -1,8 +1,5 @@
-{-# LANGUAGE CPP #-}
-#if MIN_VERSION_base(4,8,1)
-#define HAS_SOURCE_LOCATIONS
-{-# LANGUAGE ImplicitParams #-}
-#endif
+{-# LANGUAGE FlexibleContexts #-}
+{-# LANGUAGE ConstraintKinds #-}
-- |
-- This module is intended to be imported qualified:
--
@@ -22,49 +19,36 @@
) where
import Prelude hiding (log, error)
+import Data.CallStack
import System.Logging.Facade.Types
import System.Logging.Facade.Class
-#ifdef HAS_SOURCE_LOCATIONS
-#if ! MIN_VERSION_base(4,9,0)
-import GHC.SrcLoc
-#endif
-import GHC.Stack
-#define with_loc (?loc :: CallStack) =>
-#else
-#define with_loc
-#endif
-
-- | Produce a log message with specified log level.
-log :: with_loc Logging m => LogLevel -> String -> m ()
+log :: (HasCallStack, Logging m) => LogLevel -> String -> m ()
log level message = consumeLogRecord (LogRecord level location message)
- where
- location :: Maybe Location
-#ifdef HAS_SOURCE_LOCATIONS
- location = case reverse (getCallStack ?loc) of
- (_, loc) : _ -> Just $ Location (srcLocPackage loc) (srcLocModule loc) (srcLocFile loc) (srcLocStartLine loc) (srcLocStartCol loc)
- _ -> Nothing
-#else
- location = Nothing
-#endif
+
+location :: HasCallStack => Maybe Location
+location = case reverse callStack of
+ (_, loc) : _ -> Just $ Location (srcLocPackage loc) (srcLocModule loc) (srcLocFile loc) (srcLocStartLine loc) (srcLocStartCol loc)
+ _ -> Nothing
-- | Produce a log message with log level `TRACE`.
-trace :: with_loc Logging m => String -> m ()
+trace :: (HasCallStack, Logging m) => String -> m ()
trace = log TRACE
-- | Produce a log message with log level `DEBUG`.
-debug :: with_loc Logging m => String -> m ()
+debug :: (HasCallStack, Logging m) => String -> m ()
debug = log DEBUG
-- | Produce a log message with log level `INFO`.
-info :: with_loc Logging m => String -> m ()
+info :: (HasCallStack, Logging m) => String -> m ()
info = log INFO
-- | Produce a log message with log level `WARN`.
-warn :: with_loc Logging m => String -> m ()
+warn :: (HasCallStack, Logging m) => String -> m ()
warn = log WARN
-- | Produce a log message with log level `ERROR`.
-error :: with_loc Logging m => String -> m ()
+error :: (HasCallStack, Logging m) => String -> m ()
error = log ERROR
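The Facade.hs rewrite above drops the CPP/ImplicitParams machinery in favour of the `call-stack` package's `HasCallStack` constraint. A sketch of the "outermost caller" trick using GHC's own `GHC.Stack` (available since base 4.8.1; `Data.CallStack` offers a similar, more portable API). The helper name is hypothetical, not the package's API:

```haskell
import GHC.Stack (HasCallStack, SrcLoc (..), callStack, getCallStack)

-- Report the source position of the *outermost* caller, mirroring the
-- "reverse callStack" pattern in the diff above.
callerLocation :: HasCallStack => Maybe String
callerLocation = case reverse (getCallStack callStack) of
  (_, loc) : _ -> Just (srcLocFile loc ++ ":" ++ show (srcLocStartLine loc))
  []           -> Nothing
```

Reversing the stack matters: the head of `getCallStack` is the most recent frame (the call to `callerLocation` itself), while the log site the user cares about sits at the far end.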
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/logging-facade-0.1.1/test/Helper.hs new/logging-facade-0.3.0/test/Helper.hs
--- old/logging-facade-0.1.1/test/Helper.hs 1970-01-01 01:00:00.000000000 +0100
+++ new/logging-facade-0.3.0/test/Helper.hs 2017-06-01 15:24:19.000000000 +0200
@@ -0,0 +1,17 @@
+module Helper (
+ module Test.Hspec
+, logSinkSpy
+) where
+
+import Test.Hspec
+import Data.IORef
+
+import System.Logging.Facade.Types
+import System.Logging.Facade.Sink
+
+logSinkSpy :: IO (IO [LogRecord], LogSink)
+logSinkSpy = do
+ ref <- newIORef []
+ let spy :: LogSink
+ spy record = modifyIORef ref (record {logRecordLocation = Nothing} :)
+ return (readIORef ref, spy)
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/logging-facade-0.1.1/test/System/Logging/Facade/SinkSpec.hs new/logging-facade-0.3.0/test/System/Logging/Facade/SinkSpec.hs
--- old/logging-facade-0.1.1/test/System/Logging/Facade/SinkSpec.hs 1970-01-01 01:00:00.000000000 +0100
+++ new/logging-facade-0.3.0/test/System/Logging/Facade/SinkSpec.hs 2017-06-01 15:24:19.000000000 +0200
@@ -0,0 +1,25 @@
+module System.Logging.Facade.SinkSpec (main, spec) where
+
+import Helper
+
+import System.Logging.Facade
+import System.Logging.Facade.Types
+import System.Logging.Facade.Sink
+
+main :: IO ()
+main = hspec spec
+
+spec :: Spec
+spec = do
+ describe "withLogSink" $ do
+ it "sets the global log sink to specified value before running specified action" $ do
+ (logRecords, spy) <- logSinkSpy
+ withLogSink spy (info "some log message")
+ logRecords `shouldReturn` [LogRecord INFO Nothing "some log message"]
+
+ it "restores the original log sink when done" $ do
+ (logRecords, spy) <- logSinkSpy
+ setLogSink spy
+ withLogSink (\_ -> return ()) (return ())
+ info "some log message"
+ logRecords `shouldReturn` [LogRecord INFO Nothing "some log message"]
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/logging-facade-0.1.1/test/System/Logging/FacadeSpec.hs new/logging-facade-0.3.0/test/System/Logging/FacadeSpec.hs
--- old/logging-facade-0.1.1/test/System/Logging/FacadeSpec.hs 2016-02-21 05:05:49.000000000 +0100
+++ new/logging-facade-0.3.0/test/System/Logging/FacadeSpec.hs 2017-06-01 15:24:19.000000000 +0200
@@ -1,7 +1,6 @@
module System.Logging.FacadeSpec (main, spec) where
-import Test.Hspec
-import Data.IORef
+import Helper
import System.Logging.Facade.Types
import System.Logging.Facade.Sink
@@ -14,9 +13,6 @@
spec = do
describe "info" $ do
it "writes a log message with log level INFO" $ do
- ref <- newIORef []
- let captureLogMessage :: LogSink
- captureLogMessage record = modifyIORef ref (record {logRecordLocation = Nothing} :)
- setLogSink captureLogMessage
- info "some log message"
- readIORef ref `shouldReturn` [LogRecord INFO Nothing "some log message"]
+ (logRecords, spy) <- logSinkSpy
+ withLogSink spy (info "some log message")
+ logRecords `shouldReturn` [LogRecord INFO Nothing "some log message"]
Hello community,
here is the log from the commit of package ghc-log-elasticsearch for openSUSE:Factory checked in at 2017-08-31 20:48:16
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/ghc-log-elasticsearch (Old)
and /work/SRC/openSUSE:Factory/.ghc-log-elasticsearch.new (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Package is "ghc-log-elasticsearch"
Thu Aug 31 20:48:16 2017 rev:2 rq:513424 version:0.9.0.1
Changes:
--------
--- /work/SRC/openSUSE:Factory/ghc-log-elasticsearch/ghc-log-elasticsearch.changes 2017-05-10 20:45:26.257855554 +0200
+++ /work/SRC/openSUSE:Factory/.ghc-log-elasticsearch.new/ghc-log-elasticsearch.changes 2017-08-31 20:48:18.084190775 +0200
@@ -1,0 +2,5 @@
+Thu Jul 27 14:08:12 UTC 2017 - psimons(a)suse.com
+
+- Update to version 0.9.0.1.
+
+-------------------------------------------------------------------
Old:
----
log-elasticsearch-0.7.tar.gz
log-elasticsearch.cabal
New:
----
log-elasticsearch-0.9.0.1.tar.gz
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Other differences:
------------------
++++++ ghc-log-elasticsearch.spec ++++++
--- /var/tmp/diff_new_pack.ENWmEM/_old 2017-08-31 20:48:18.912074568 +0200
+++ /var/tmp/diff_new_pack.ENWmEM/_new 2017-08-31 20:48:18.924072884 +0200
@@ -18,14 +18,13 @@
%global pkg_name log-elasticsearch
Name: ghc-%{pkg_name}
-Version: 0.7
+Version: 0.9.0.1
Release: 0
Summary: Structured logging solution (Elasticsearch back end)
License: BSD-3-Clause
Group: Development/Languages/Other
Url: https://hackage.haskell.org/package/%{pkg_name}
Source0: https://hackage.haskell.org/package/%{pkg_name}-%{version}/%{pkg_name}-%{ve…
-Source1: https://hackage.haskell.org/package/%{pkg_name}-%{version}/revision/1.cabal…
BuildRequires: ghc-Cabal-devel
BuildRequires: ghc-aeson-devel
BuildRequires: ghc-aeson-pretty-devel
@@ -34,6 +33,7 @@
BuildRequires: ghc-bytestring-devel
BuildRequires: ghc-deepseq-devel
BuildRequires: ghc-http-client-devel
+BuildRequires: ghc-http-client-tls-devel
BuildRequires: ghc-log-base-devel
BuildRequires: ghc-rpm-macros
BuildRequires: ghc-semigroups-devel
@@ -62,7 +62,6 @@
%prep
%setup -q -n %{pkg_name}-%{version}
-cp -p %{SOURCE1} %{pkg_name}.cabal
%build
%ghc_lib_build
@@ -82,5 +81,6 @@
%files devel -f %{name}-devel.files
%defattr(-,root,root,-)
+%doc CHANGELOG.md README.md
%changelog
++++++ log-elasticsearch-0.7.tar.gz -> log-elasticsearch-0.9.0.1.tar.gz ++++++
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/log-elasticsearch-0.7/CHANGELOG.md new/log-elasticsearch-0.9.0.1/CHANGELOG.md
--- old/log-elasticsearch-0.7/CHANGELOG.md 1970-01-01 01:00:00.000000000 +0100
+++ new/log-elasticsearch-0.9.0.1/CHANGELOG.md 2017-06-20 18:17:46.000000000 +0200
@@ -0,0 +1,16 @@
+# log-elasticsearch-0.9.0.1 (2017-06-19)
+* 'withElasticSearchLogger' no longer fails when the Elasticsearch server is down.
+
+# log-elasticsearch-0.9.0.0 (2017-05-04)
+* Now works with bloodhound-0.14.0.0 (#30).
+
+# log-elasticsearch-0.8.1 (2017-03-27)
+* Log.Backend.ElasticSearch.Internal now exports 'EsUsername' and
+ 'EsPassword'.
+
+# log-elasticsearch-0.8 (2017-03-16)
+* Made ElasticSearchConfig an abstract type (#27).
+* Added support for HTTPS and basic auth (#26).
+
+# log-elasticsearch-0.7 (2016-11-25)
+* Initial release (split from the log package).
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/log-elasticsearch-0.7/README.md new/log-elasticsearch-0.9.0.1/README.md
--- old/log-elasticsearch-0.7/README.md 1970-01-01 01:00:00.000000000 +0100
+++ new/log-elasticsearch-0.9.0.1/README.md 2017-06-20 18:17:46.000000000 +0200
@@ -0,0 +1,3 @@
+# log-elasticsearch [![Hackage version](https://img.shields.io/hackage/v/log-elasticsearch.svg?label=Hacka… [![Build Status](https://secure.travis-ci.org/scrive/log.svg?branch=master)](http://…
+
+Elasticsearch back end for the `log` library.
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/log-elasticsearch-0.7/log-elasticsearch.cabal new/log-elasticsearch-0.9.0.1/log-elasticsearch.cabal
--- old/log-elasticsearch-0.7/log-elasticsearch.cabal 2016-11-25 10:08:48.000000000 +0100
+++ new/log-elasticsearch-0.9.0.1/log-elasticsearch.cabal 2017-06-20 18:17:46.000000000 +0200
@@ -1,5 +1,5 @@
name: log-elasticsearch
-version: 0.7
+version: 0.9.0.1
synopsis: Structured logging solution (Elasticsearch back end)
description: Elasticsearch back end for the 'log' library.
@@ -17,23 +17,33 @@
category: System
build-type: Simple
cabal-version: >=1.10
-tested-with: GHC == 7.8.4, GHC == 7.10.3, GHC == 8.0.1
+extra-source-files: CHANGELOG.md, README.md
+tested-with: GHC == 7.8.4, GHC == 7.10.3, GHC == 8.0.2
Source-repository head
Type: git
Location: https://github.com/scrive/log.git
library
- exposed-modules: Log.Backend.ElasticSearch
+ exposed-modules: Log.Backend.ElasticSearch.V1
+ Log.Backend.ElasticSearch.V1.Internal
+ Log.Backend.ElasticSearch.V1.Lens
+ Log.Backend.ElasticSearch.V5
+ Log.Backend.ElasticSearch.V5.Internal
+ Log.Backend.ElasticSearch.V5.Lens
+ Log.Backend.ElasticSearch
+ Log.Backend.ElasticSearch.Lens
+ Log.Backend.ElasticSearch.Internal
build-depends: base <5,
log-base >= 0.7,
aeson >=0.11.0.0,
aeson-pretty >=0.8.2,
bytestring,
base64-bytestring,
- bloodhound >= 0.11.1,
+ bloodhound >= 0.13 && < 0.15,
deepseq,
http-client,
+ http-client-tls,
semigroups,
text,
text-show >= 2,
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/log-elasticsearch-0.7/src/Log/Backend/ElasticSearch/Internal.hs new/log-elasticsearch-0.9.0.1/src/Log/Backend/ElasticSearch/Internal.hs
--- old/log-elasticsearch-0.7/src/Log/Backend/ElasticSearch/Internal.hs 1970-01-01 01:00:00.000000000 +0100
+++ new/log-elasticsearch-0.9.0.1/src/Log/Backend/ElasticSearch/Internal.hs 2017-06-20 18:17:46.000000000 +0200
@@ -0,0 +1,5 @@
+module Log.Backend.ElasticSearch.Internal
+ {-# DEPRECATED "Use directly Log.Backend.ElasticSearch.V1 or V5" #-}
+ ( module Log.Backend.ElasticSearch.V1.Internal ) where
+
+import Log.Backend.ElasticSearch.V1.Internal
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/log-elasticsearch-0.7/src/Log/Backend/ElasticSearch/Lens.hs new/log-elasticsearch-0.9.0.1/src/Log/Backend/ElasticSearch/Lens.hs
--- old/log-elasticsearch-0.7/src/Log/Backend/ElasticSearch/Lens.hs 1970-01-01 01:00:00.000000000 +0100
+++ new/log-elasticsearch-0.9.0.1/src/Log/Backend/ElasticSearch/Lens.hs 2017-06-20 18:17:46.000000000 +0200
@@ -0,0 +1,5 @@
+module Log.Backend.ElasticSearch.Lens
+ {-# DEPRECATED "Use directly Log.Backend.ElasticSearch.V1 or V5" #-}
+ ( module Log.Backend.ElasticSearch.V1.Lens ) where
+
+import Log.Backend.ElasticSearch.V1.Lens
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/log-elasticsearch-0.7/src/Log/Backend/ElasticSearch/V1/Internal.hs new/log-elasticsearch-0.9.0.1/src/Log/Backend/ElasticSearch/V1/Internal.hs
--- old/log-elasticsearch-0.7/src/Log/Backend/ElasticSearch/V1/Internal.hs 1970-01-01 01:00:00.000000000 +0100
+++ new/log-elasticsearch-0.9.0.1/src/Log/Backend/ElasticSearch/V1/Internal.hs 2017-06-20 18:17:46.000000000 +0200
@@ -0,0 +1,31 @@
+module Log.Backend.ElasticSearch.V1.Internal
+ (ElasticSearchConfig(..)
+ ,defaultElasticSearchConfig
+ ,EsUsername(..)
+ ,EsPassword(..))
+where
+
+import Database.V1.Bloodhound hiding (Status)
+import Prelude
+import qualified Data.Text as T
+
+-- | Configuration for the Elasticsearch 'Logger'. See
+-- <https://www.elastic.co/guide/en/elasticsearch/reference/current/glossary.ht…>
+-- for the explanation of terms.
+data ElasticSearchConfig = ElasticSearchConfig {
+ esServer :: !T.Text -- ^ Elasticsearch server address.
+ , esIndex :: !T.Text -- ^ Elasticsearch index name.
+ , esMapping :: !T.Text -- ^ Elasticsearch mapping name.
+ , esLogin :: Maybe (EsUsername, EsPassword) -- ^ Elasticsearch basic authentication username and password.
+ , esLoginInsecure :: !Bool -- ^ Allow basic authentication over non-TLS connections.
+ } deriving (Eq, Show)
+
+-- | Sensible defaults for 'ElasticSearchConfig'.
+defaultElasticSearchConfig :: ElasticSearchConfig
+defaultElasticSearchConfig = ElasticSearchConfig {
+ esServer = "http://localhost:9200",
+ esIndex = "logs",
+ esMapping = "log",
+ esLogin = Nothing,
+ esLoginInsecure = False
+ }
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/log-elasticsearch-0.7/src/Log/Backend/ElasticSearch/V1/Lens.hs new/log-elasticsearch-0.9.0.1/src/Log/Backend/ElasticSearch/V1/Lens.hs
--- old/log-elasticsearch-0.7/src/Log/Backend/ElasticSearch/V1/Lens.hs 1970-01-01 01:00:00.000000000 +0100
+++ new/log-elasticsearch-0.9.0.1/src/Log/Backend/ElasticSearch/V1/Lens.hs 2017-06-20 18:17:46.000000000 +0200
@@ -0,0 +1,40 @@
+{-# LANGUAGE RankNTypes #-}
+-- | Lensified version of "Log.Backend.ElasticSearch".
+module Log.Backend.ElasticSearch.V1.Lens (
+ I.ElasticSearchConfig
+ , esServer
+ , esIndex
+ , esMapping
+ , esLogin
+ , esLoginInsecure
+ , I.defaultElasticSearchConfig
+ , I.withElasticSearchLogger
+ ) where
+
+import Database.V1.Bloodhound hiding (Status)
+import Prelude
+import qualified Data.Text as T
+import qualified Log.Backend.ElasticSearch.V1 as I
+import qualified Log.Backend.ElasticSearch.V1.Internal ()
+
+type Lens' s a = forall f. Functor f => (a -> f a) -> s -> f s
+
+-- | Elasticsearch server address.
+esServer :: Lens' I.ElasticSearchConfig T.Text
+esServer f esc = fmap (\x -> esc { I.esServer = x }) $ f (I.esServer esc)
+
+-- | Elasticsearch index name.
+esIndex :: Lens' I.ElasticSearchConfig T.Text
+esIndex f esc = fmap (\x -> esc { I.esIndex = x }) $ f (I.esIndex esc)
+
+-- | Elasticsearch mapping name.
+esMapping :: Lens' I.ElasticSearchConfig T.Text
+esMapping f esc = fmap (\x -> esc { I.esMapping = x }) $ f (I.esMapping esc)
+
+-- | Elasticsearch basic authentication username and password.
+esLogin :: Lens' I.ElasticSearchConfig (Maybe (EsUsername, EsPassword))
+esLogin f esc = fmap (\x -> esc { I.esLogin = x }) $ f (I.esLogin esc)
+
+-- | Allow basic authentication over non-TLS connections.
+esLoginInsecure :: Lens' I.ElasticSearchConfig Bool
+esLoginInsecure f esc = fmap (\x -> esc { I.esLoginInsecure = x }) $ f (I.esLoginInsecure esc)
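The `Lens'` alias defined in these modules is a plain van Laarhoven lens, so `view` and `set` fall out of choosing the functor, with no lens-library dependency needed. A self-contained sketch with a toy record standing in for `ElasticSearchConfig` (names are illustrative, not the package's API):

```haskell
{-# LANGUAGE RankNTypes #-}
import Data.Functor.Const (Const (..))
import Data.Functor.Identity (Identity (..))

type Lens' s a = forall f. Functor f => (a -> f a) -> s -> f s

-- Toy stand-in for ElasticSearchConfig.
data Config = Config { server :: String, index :: String }
  deriving (Eq, Show)

-- Same shape as the esServer lens in the diff above.
serverL :: Lens' Config String
serverL f c = fmap (\x -> c { server = x }) (f (server c))

-- Reading: instantiate f to Const, which ignores the rebuild step.
view :: Lens' s a -> s -> a
view l = getConst . l Const

-- Writing: instantiate f to Identity, which ignores the old value.
set :: Lens' s a -> a -> s -> s
set l x = runIdentity . l (const (Identity x))
```

This is why the modules can export lenses over an abstract `ElasticSearchConfig` (made abstract in 0.8, per the changelog) without exposing its constructors.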
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/log-elasticsearch-0.7/src/Log/Backend/ElasticSearch/V1.hs new/log-elasticsearch-0.9.0.1/src/Log/Backend/ElasticSearch/V1.hs
--- old/log-elasticsearch-0.7/src/Log/Backend/ElasticSearch/V1.hs 1970-01-01 01:00:00.000000000 +0100
+++ new/log-elasticsearch-0.9.0.1/src/Log/Backend/ElasticSearch/V1.hs 2017-06-20 18:17:46.000000000 +0200
@@ -0,0 +1,254 @@
+-- | Elasticsearch logging back-end.
+module Log.Backend.ElasticSearch.V1 (
+ ElasticSearchConfig
+ , esServer
+ , esIndex
+ , esMapping
+ , esLogin
+ , esLoginInsecure
+ , defaultElasticSearchConfig
+ , withElasticSearchLogger
+ , elasticSearchLogger
+ ) where
+
+import Control.Applicative
+import Control.Arrow (second)
+import Control.Concurrent
+import Control.Exception
+import Control.Monad
+import Control.Monad.IO.Class
+import Data.Aeson
+import Data.Aeson.Encode.Pretty
+import Data.Bits
+import Data.IORef
+import Data.Maybe (isJust)
+import Data.Semigroup
+import Data.Time
+import Data.Time.Clock.POSIX
+import Data.Word
+import Database.V1.Bloodhound hiding (Status)
+import Log
+import Log.Internal.Logger
+import Network.HTTP.Client
+import Network.HTTP.Client.TLS (tlsManagerSettings)
+import Prelude
+import System.IO
+import TextShow
+import qualified Data.ByteString as BS
+import qualified Data.ByteString.Base64 as B64
+import qualified Data.ByteString.Lazy.Char8 as BSL
+import qualified Data.HashMap.Strict as H
+import qualified Data.Text as T
+import qualified Data.Text.Encoding as T
+import qualified Data.Traversable as F
+import qualified Data.Vector as V
+
+import Log.Backend.ElasticSearch.V1.Internal
+
+----------------------------------------
+-- | Create an 'elasticSearchLogger' for the duration of the given
+-- action, and shut it down afterwards, making sure that all buffered
+-- messages are actually written to the Elasticsearch store.
+withElasticSearchLogger :: ElasticSearchConfig -> IO Word32 -> (Logger -> IO r)
+ -> IO r
+withElasticSearchLogger conf randGen act = do
+ logger <- elasticSearchLogger conf randGen
+ withLogger logger act
+
+{-# DEPRECATED elasticSearchLogger "Use 'withElasticSearchLogger' instead!" #-}
+
+-- | Start an asynchronous logger thread that stores messages using
+-- Elasticsearch.
+--
+-- Please use 'withElasticSearchLogger' instead, which is more
+-- exception-safe (see the note attached to 'mkBulkLogger').
+elasticSearchLogger ::
+ ElasticSearchConfig -- ^ Configuration.
+ -> IO Word32 -- ^ Generate a random 32-bit word for use in
+ -- document IDs.
+ -> IO Logger
+elasticSearchLogger ElasticSearchConfig{..} genRandomWord = do
+ checkElasticSearchLogin
+ checkElasticSearchConnection
+ indexRef <- newIORef $ IndexName T.empty
+ mkBulkLogger "ElasticSearch" (\msgs -> do
+ now <- getCurrentTime
+ oldIndex <- readIORef indexRef
+ -- Bloodhound doesn't support letting ES autogenerate IDs, so let's generate
+ -- them ourselves. An ID of a log message is 12 bytes (4 bytes: random, 4
+ -- bytes: current time as epoch, 4 bytes: insertion order) encoded as
+ -- Base64. This makes eventual collisions practically impossible.
+ baseID <- (<>)
+ <$> (littleEndianRep <$> liftIO genRandomWord)
+ <*> pure (littleEndianRep . floor $ timeToDouble now)
+ retryOnException . runBH_ $ do
+ -- Elasticsearch index names are additionally indexed by date so that each
+ -- day is logged to a separate index to make log management easier.
+ let index = IndexName $ T.concat [
+ esIndex
+ , "-"
+ , T.pack $ formatTime defaultTimeLocale "%F" now
+ ]
+ when (oldIndex /= index) $ do
+ -- There is an obvious race condition in the presence of more than one
+ -- logger instance running, but it's irrelevant as attempting to create
+ -- index that already exists is harmless.
+ indexExists' <- indexExists index
+ unless indexExists' $ do
+ -- Bloodhound is weird and won't let us create index using default
+ -- settings, so pass these as the default ones.
+ let indexSettings = IndexSettings {
+ indexShards = ShardCount 4
+ , indexReplicas = ReplicaCount 1
+ }
+ void $ createIndex indexSettings index
+ reply <- putMapping index mapping LogsMapping
+ when (not $ isSuccess reply) $ do
+ error $ "ElasticSearch: error while creating mapping: "
+ <> T.unpack (T.decodeUtf8 . BSL.toStrict . jsonToBSL
+ $ decodeReply reply)
+ liftIO $ writeIORef indexRef index
+ let jsonMsgs = V.fromList $ map (toJsonMsg now) $ zip [1..] msgs
+ reply <- bulk $ V.map (toBulk index baseID) jsonMsgs
+ -- Try to parse parts of reply to get information about log messages that
+ -- failed to be inserted for some reason.
+ let replyBody = decodeReply reply
+ result = do
+ Object response <- return replyBody
+ Bool hasErrors <- "errors" `H.lookup` response
+ Array jsonItems <- "items" `H.lookup` response
+ items <- F.forM jsonItems $ \v -> do
+ Object item <- return v
+ Object index_ <- "index" `H.lookup` item
+ return index_
+ guard $ V.length items == V.length jsonMsgs
+ return (hasErrors, items)
+ case result of
+ Nothing -> liftIO . BSL.putStrLn
+ $ "ElasticSearch: unexpected response: " <> jsonToBSL replyBody
+ Just (hasErrors, items) -> when hasErrors $ do
+ -- If any message failed to be inserted because of type mismatch, go
+ -- back to them, replace their data with elastic search error and put
+ -- old data into its own namespace to work around insertion errors.
+ let failed = V.findIndices (H.member "error") items
+ dummyMsgs <- V.forM failed $ \n -> do
+ dataNamespace <- liftIO genRandomWord
+ let modifyData oldData = object [
+ "__es_error" .= H.lookup "error" (items V.! n)
+ , "__es_modified" .= True
+ , ("__data_" <> showt dataNamespace) .= oldData
+ ]
+ return . second (H.adjust modifyData "data") $ jsonMsgs V.! n
+ -- Attempt to put modified messages and ignore any further errors.
+ void $ bulk (V.map (toBulk index baseID) dummyMsgs))
+ (elasticSearchSync indexRef)
+ where
+ server = Server esServer
+ mapping = MappingName esMapping
+
+ elasticSearchSync :: IORef IndexName -> IO ()
+ elasticSearchSync indexRef = do
+ indexName <- readIORef indexRef
+ void . runBH_ $ refreshIndex indexName
+
+ checkElasticSearchLogin :: IO ()
+ checkElasticSearchLogin =
+ when (isJust esLogin
+ && not esLoginInsecure
+ && not ("https:" `T.isPrefixOf` esServer)) $
+ error $ "ElasticSearch: insecure login: "
+ <> "Attempting to send login credentials over an insecure connection. "
+ <> "Set esLoginInsecure = True to disable this check."
+
+ checkElasticSearchConnection :: IO ()
+ checkElasticSearchConnection = try (void $ runBH_ listIndices) >>= \case
+ Left (ex::HttpException) ->
+ hPutStrLn stderr $ "ElasticSearch: unexpected error: " <> show ex
+ <> " (is ElasticSearch server running?)"
+ Right () -> return ()
+
+ retryOnException :: forall r. IO r -> IO r
+ retryOnException m = try m >>= \case
+ Left (ex::SomeException) -> do
+ putStrLn $ "ElasticSearch: unexpected error: "
+ <> show ex <> ", retrying in 10 seconds"
+ threadDelay $ 10 * 1000000
+ retryOnException m
+ Right result -> return result
+
+ timeToDouble :: UTCTime -> Double
+ timeToDouble = realToFrac . utcTimeToPOSIXSeconds
+
+ runBH_ :: forall r. BH IO r -> IO r
+ runBH_ f = do
+ mgr <- newManager tlsManagerSettings
+ let hook = maybe return (uncurry basicAuthHook) esLogin
+ let env = (mkBHEnv server mgr) { bhRequestHook = hook }
+ runBH env f
+
+
+ jsonToBSL :: Value -> BSL.ByteString
+ jsonToBSL = encodePretty' defConfig { confIndent = Spaces 2 }
+
+ toJsonMsg :: UTCTime -> (Word32, LogMessage)
+ -> (Word32, H.HashMap T.Text Value)
+ toJsonMsg now (n, msg) = (n, H.union jMsg $ H.fromList [
+ ("insertion_order", toJSON n)
+ , ("insertion_time", toJSON now)
+ ])
+ where
+ Object jMsg = toJSON msg
+
+ mkDocId :: BS.ByteString -> Word32 -> DocId
+ mkDocId baseID insertionOrder = DocId . T.decodeUtf8
+ . B64.encode $ BS.concat [
+ baseID
+ , littleEndianRep insertionOrder
+ ]
+
+ toBulk :: IndexName -> BS.ByteString -> (Word32, H.HashMap T.Text Value)
+ -> BulkOperation
+ toBulk index baseID (n, obj) =
+ BulkIndex index mapping (mkDocId baseID n) $ Object obj
+
+data LogsMapping = LogsMapping
+instance ToJSON LogsMapping where
+ toJSON LogsMapping = object [
+ "properties" .= object [
+ "insertion_order" .= object [
+ "type" .= ("integer"::T.Text)
+ ]
+ , "insertion_time" .= object [
+ "type" .= ("date"::T.Text)
+ , "format" .= ("date_time"::T.Text)
+ ]
+ , "time" .= object [
+ "type" .= ("date"::T.Text)
+ , "format" .= ("date_time"::T.Text)
+ ]
+ , "domain" .= object [
+ "type" .= ("string"::T.Text)
+ ]
+ , "level" .= object [
+ "type" .= ("string"::T.Text)
+ ]
+ , "component" .= object [
+ "type" .= ("string"::T.Text)
+ ]
+ , "message" .= object [
+ "type" .= ("string"::T.Text)
+ ]
+ ]
+ ]
+
+----------------------------------------
+
+littleEndianRep :: Word32 -> BS.ByteString
+littleEndianRep = fst . BS.unfoldrN 4 step
+ where
+ step n = Just (fromIntegral $ n .&. 0xff, n `shiftR` 8)
+
+decodeReply :: Reply -> Value
+decodeReply reply = case eitherDecode' $ responseBody reply of
+ Right body -> body
+ Left err -> object ["decoding_error" .= err]
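The document-ID scheme described in the comments above (4 random bytes, 4 bytes of epoch time, 4 bytes of insertion order, Base64-encoded) rests on the `littleEndianRep` helper at the end of the file. A self-contained copy of that helper, runnable on its own, showing the byte order it produces:

```haskell
import Data.Bits (shiftR, (.&.))
import Data.Word (Word32)
import qualified Data.ByteString as BS

-- Four-byte little-endian encoding of a Word32, as in the module above:
-- unfoldrN peels off the low byte and shifts, four times.
littleEndianRep :: Word32 -> BS.ByteString
littleEndianRep = fst . BS.unfoldrN 4 step
  where
    step n = Just (fromIntegral (n .&. 0xff), n `shiftR` 8)
```

Concatenating three such 4-byte chunks and Base64-encoding them yields the 12-byte IDs the module generates because Bloodhound cannot ask Elasticsearch to autogenerate them.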
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/log-elasticsearch-0.7/src/Log/Backend/ElasticSearch/V5/Internal.hs new/log-elasticsearch-0.9.0.1/src/Log/Backend/ElasticSearch/V5/Internal.hs
--- old/log-elasticsearch-0.7/src/Log/Backend/ElasticSearch/V5/Internal.hs 1970-01-01 01:00:00.000000000 +0100
+++ new/log-elasticsearch-0.9.0.1/src/Log/Backend/ElasticSearch/V5/Internal.hs 2017-06-20 18:17:46.000000000 +0200
@@ -0,0 +1,31 @@
+module Log.Backend.ElasticSearch.V5.Internal
+ (ElasticSearchConfig(..)
+ ,defaultElasticSearchConfig
+ ,EsUsername(..)
+ ,EsPassword(..))
+where
+
+import Database.V5.Bloodhound hiding (Status)
+import Prelude
+import qualified Data.Text as T
+
+-- | Configuration for the Elasticsearch 'Logger'. See
+-- <https://www.elastic.co/guide/en/elasticsearch/reference/current/glossary.ht…>
+-- for the explanation of terms.
+data ElasticSearchConfig = ElasticSearchConfig {
+ esServer :: !T.Text -- ^ Elasticsearch server address.
+ , esIndex :: !T.Text -- ^ Elasticsearch index name.
+ , esMapping :: !T.Text -- ^ Elasticsearch mapping name.
+ , esLogin :: Maybe (EsUsername, EsPassword) -- ^ Elasticsearch basic authentication username and password.
+ , esLoginInsecure :: !Bool -- ^ Allow basic authentication over non-TLS connections.
+ } deriving (Eq, Show)
+
+-- | Sensible defaults for 'ElasticSearchConfig'.
+defaultElasticSearchConfig :: ElasticSearchConfig
+defaultElasticSearchConfig = ElasticSearchConfig {
+ esServer = "http://localhost:9200",
+ esIndex = "logs",
+ esMapping = "log",
+ esLogin = Nothing,
+ esLoginInsecure = False
+ }
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/log-elasticsearch-0.7/src/Log/Backend/ElasticSearch/V5/Lens.hs new/log-elasticsearch-0.9.0.1/src/Log/Backend/ElasticSearch/V5/Lens.hs
--- old/log-elasticsearch-0.7/src/Log/Backend/ElasticSearch/V5/Lens.hs 1970-01-01 01:00:00.000000000 +0100
+++ new/log-elasticsearch-0.9.0.1/src/Log/Backend/ElasticSearch/V5/Lens.hs 2017-06-20 18:17:46.000000000 +0200
@@ -0,0 +1,40 @@
+{-# LANGUAGE RankNTypes #-}
+-- | Lensified version of "Log.Backend.ElasticSearch".
+module Log.Backend.ElasticSearch.V5.Lens (
+ I.ElasticSearchConfig
+ , esServer
+ , esIndex
+ , esMapping
+ , esLogin
+ , esLoginInsecure
+ , I.defaultElasticSearchConfig
+ , I.withElasticSearchLogger
+ ) where
+
+import Database.V5.Bloodhound hiding (Status)
+import Prelude
+import qualified Data.Text as T
+import qualified Log.Backend.ElasticSearch.V5 as I
+import qualified Log.Backend.ElasticSearch.V5.Internal ()
+
+type Lens' s a = forall f. Functor f => (a -> f a) -> s -> f s
+
+-- | Elasticsearch server address.
+esServer :: Lens' I.ElasticSearchConfig T.Text
+esServer f esc = fmap (\x -> esc { I.esServer = x }) $ f (I.esServer esc)
+
+-- | Elasticsearch index name.
+esIndex :: Lens' I.ElasticSearchConfig T.Text
+esIndex f esc = fmap (\x -> esc { I.esIndex = x }) $ f (I.esIndex esc)
+
+-- | Elasticsearch mapping name.
+esMapping :: Lens' I.ElasticSearchConfig T.Text
+esMapping f esc = fmap (\x -> esc { I.esMapping = x }) $ f (I.esMapping esc)
+
+-- | Elasticsearch basic authentication username and password.
+esLogin :: Lens' I.ElasticSearchConfig (Maybe (EsUsername, EsPassword))
+esLogin f esc = fmap (\x -> esc { I.esLogin = x }) $ f (I.esLogin esc)
+
+-- | Allow basic authentication over non-TLS connections.
+esLoginInsecure :: Lens' I.ElasticSearchConfig Bool
+esLoginInsecure f esc = fmap (\x -> esc { I.esLoginInsecure = x }) $ f (I.esLoginInsecure esc)
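The lenses above use the standard van Laarhoven encoding, so they compose with any lens library's combinators without this module depending on one. A self-contained sketch of how such a lens is read and written, using a toy record in place of `ElasticSearchConfig` (the `Config`/`cServer` names are illustrative, not from the patch):

```haskell
{-# LANGUAGE RankNTypes #-}
module Main where

import Data.Functor.Const (Const(..))
import Data.Functor.Identity (Identity(..))

-- Same alias the patch defines.
type Lens' s a = forall f. Functor f => (a -> f a) -> s -> f s

-- Read through a lens by instantiating f to Const.
view :: Lens' s a -> s -> a
view l = getConst . l Const

-- Write through a lens by instantiating f to Identity.
set :: Lens' s a -> a -> s -> s
set l x = runIdentity . l (Identity . const x)

-- Toy record standing in for ElasticSearchConfig.
data Config = Config { cServer :: String } deriving (Eq, Show)

-- Built exactly like esServer in the patch: fmap a record update
-- over the functorial result of applying f to the current field.
cServerL :: Lens' Config String
cServerL f c = fmap (\x -> c { cServer = x }) (f (cServer c))

main :: IO ()
main = do
  let c = Config "http://localhost:9200"
  putStrLn (view cServerL c)
  print (set cServerL "https://es.example.com:9200" c)
```

Because `ElasticSearchConfig` became abstract in 0.8, these lenses are the supported way to tweak fields without importing the `Internal` module.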
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/log-elasticsearch-0.7/src/Log/Backend/ElasticSearch/V5.hs new/log-elasticsearch-0.9.0.1/src/Log/Backend/ElasticSearch/V5.hs
--- old/log-elasticsearch-0.7/src/Log/Backend/ElasticSearch/V5.hs 1970-01-01 01:00:00.000000000 +0100
+++ new/log-elasticsearch-0.9.0.1/src/Log/Backend/ElasticSearch/V5.hs 2017-06-20 18:17:46.000000000 +0200
@@ -0,0 +1,254 @@
+-- | Elasticsearch logging back-end.
+module Log.Backend.ElasticSearch.V5 (
+ ElasticSearchConfig
+ , esServer
+ , esIndex
+ , esMapping
+ , esLogin
+ , esLoginInsecure
+ , defaultElasticSearchConfig
+ , withElasticSearchLogger
+ , elasticSearchLogger
+ ) where
+
+import Control.Applicative
+import Control.Arrow (second)
+import Control.Concurrent
+import Control.Exception
+import Control.Monad
+import Control.Monad.IO.Class
+import Data.Aeson
+import Data.Aeson.Encode.Pretty
+import Data.Bits
+import Data.IORef
+import Data.Maybe (isJust)
+import Data.Semigroup
+import Data.Time
+import Data.Time.Clock.POSIX
+import Data.Word
+import Database.V5.Bloodhound hiding (Status)
+import Log
+import Log.Internal.Logger
+import Network.HTTP.Client
+import Network.HTTP.Client.TLS (tlsManagerSettings)
+import Prelude
+import System.IO
+import TextShow
+import qualified Data.ByteString as BS
+import qualified Data.ByteString.Base64 as B64
+import qualified Data.ByteString.Lazy.Char8 as BSL
+import qualified Data.HashMap.Strict as H
+import qualified Data.Text as T
+import qualified Data.Text.Encoding as T
+import qualified Data.Traversable as F
+import qualified Data.Vector as V
+
+import Log.Backend.ElasticSearch.V5.Internal
+
+----------------------------------------
+-- | Create an 'elasticSearchLogger' for the duration of the given
+-- action, and shut it down afterwards, making sure that all buffered
+-- messages are actually written to the Elasticsearch store.
+withElasticSearchLogger :: ElasticSearchConfig -> IO Word32 -> (Logger -> IO r)
+ -> IO r
+withElasticSearchLogger conf randGen act = do
+ logger <- elasticSearchLogger conf randGen
+ withLogger logger act
+
+{-# DEPRECATED elasticSearchLogger "Use 'withElasticSearchLogger' instead!" #-}
+
+-- | Start an asynchronous logger thread that stores messages using
+-- Elasticsearch.
+--
+-- Please use 'withElasticSearchLogger' instead, which is more
+-- exception-safe (see the note attached to 'mkBulkLogger').
+elasticSearchLogger ::
+ ElasticSearchConfig -- ^ Configuration.
+ -> IO Word32 -- ^ Generate a random 32-bit word for use in
+ -- document IDs.
+ -> IO Logger
+elasticSearchLogger ElasticSearchConfig{..} genRandomWord = do
+ checkElasticSearchLogin
+ checkElasticSearchConnection
+ indexRef <- newIORef $ IndexName T.empty
+ mkBulkLogger "ElasticSearch" (\msgs -> do
+ now <- getCurrentTime
+ oldIndex <- readIORef indexRef
+ -- Bloodhound doesn't support letting ES autogenerate IDs, so let's generate
+ -- them ourselves. An ID of a log message is 12 bytes (4 bytes: random, 4
+ -- bytes: current time as epoch, 4 bytes: insertion order) encoded as
+ -- Base64. This makes eventual collisions practically impossible.
+ baseID <- (<>)
+ <$> (littleEndianRep <$> liftIO genRandomWord)
+ <*> pure (littleEndianRep . floor $ timeToDouble now)
+ retryOnException . runBH_ $ do
+ -- Elasticsearch index names are additionally indexed by date so that each
+ -- day is logged to a separate index to make log management easier.
+ let index = IndexName $ T.concat [
+ esIndex
+ , "-"
+ , T.pack $ formatTime defaultTimeLocale "%F" now
+ ]
+ when (oldIndex /= index) $ do
+ -- There is an obvious race condition in the presence of more than one
+ -- logger instance running, but it's irrelevant as attempting to create
+ -- index that already exists is harmless.
+ indexExists' <- indexExists index
+ unless indexExists' $ do
+ -- Bloodhound is weird and won't let us create index using default
+ -- settings, so pass these as the default ones.
+ let indexSettings = IndexSettings {
+ indexShards = ShardCount 4
+ , indexReplicas = ReplicaCount 1
+ }
+ void $ createIndex indexSettings index
+ reply <- putMapping index mapping LogsMapping
+ when (not $ isSuccess reply) $ do
+ error $ "ElasticSearch: error while creating mapping: "
+ <> T.unpack (T.decodeUtf8 . BSL.toStrict . jsonToBSL
+ $ decodeReply reply)
+ liftIO $ writeIORef indexRef index
+ let jsonMsgs = V.fromList $ map (toJsonMsg now) $ zip [1..] msgs
+ reply <- bulk $ V.map (toBulk index baseID) jsonMsgs
+ -- Try to parse parts of reply to get information about log messages that
+ -- failed to be inserted for some reason.
+ let replyBody = decodeReply reply
+ result = do
+ Object response <- return replyBody
+ Bool hasErrors <- "errors" `H.lookup` response
+ Array jsonItems <- "items" `H.lookup` response
+ items <- F.forM jsonItems $ \v -> do
+ Object item <- return v
+ Object index_ <- "index" `H.lookup` item
+ return index_
+ guard $ V.length items == V.length jsonMsgs
+ return (hasErrors, items)
+ case result of
+ Nothing -> liftIO . BSL.putStrLn
+ $ "ElasticSearch: unexpected response: " <> jsonToBSL replyBody
+ Just (hasErrors, items) -> when hasErrors $ do
+ -- If any message failed to be inserted because of type mismatch, go
+ -- back to them, replace their data with elastic search error and put
+ -- old data into its own namespace to work around insertion errors.
+ let failed = V.findIndices (H.member "error") items
+ dummyMsgs <- V.forM failed $ \n -> do
+ dataNamespace <- liftIO genRandomWord
+ let modifyData oldData = object [
+ "__es_error" .= H.lookup "error" (items V.! n)
+ , "__es_modified" .= True
+ , ("__data_" <> showt dataNamespace) .= oldData
+ ]
+ return . second (H.adjust modifyData "data") $ jsonMsgs V.! n
+ -- Attempt to put modified messages and ignore any further errors.
+ void $ bulk (V.map (toBulk index baseID) dummyMsgs))
+ (elasticSearchSync indexRef)
+ where
+ server = Server esServer
+ mapping = MappingName esMapping
+
+ elasticSearchSync :: IORef IndexName -> IO ()
+ elasticSearchSync indexRef = do
+ indexName <- readIORef indexRef
+ void . runBH_ $ refreshIndex indexName
+
+ checkElasticSearchLogin :: IO ()
+ checkElasticSearchLogin =
+ when (isJust esLogin
+ && not esLoginInsecure
+ && not ("https:" `T.isPrefixOf` esServer)) $
+ error $ "ElasticSearch: insecure login: "
+ <> "Attempting to send login credentials over an insecure connection. "
+ <> "Set esLoginInsecure = True to disable this check."
+
+ checkElasticSearchConnection :: IO ()
+ checkElasticSearchConnection = try (void $ runBH_ listIndices) >>= \case
+ Left (ex::HttpException) ->
+ hPutStrLn stderr $ "ElasticSearch: unexpected error: " <> show ex
+ <> " (is ElasticSearch server running?)"
+ Right () -> return ()
+
+ retryOnException :: forall r. IO r -> IO r
+ retryOnException m = try m >>= \case
+ Left (ex::SomeException) -> do
+ putStrLn $ "ElasticSearch: unexpected error: "
+ <> show ex <> ", retrying in 10 seconds"
+ threadDelay $ 10 * 1000000
+ retryOnException m
+ Right result -> return result
+
+ timeToDouble :: UTCTime -> Double
+ timeToDouble = realToFrac . utcTimeToPOSIXSeconds
+
+ runBH_ :: forall r. BH IO r -> IO r
+ runBH_ f = do
+ mgr <- newManager tlsManagerSettings
+ let hook = maybe return (uncurry basicAuthHook) esLogin
+ let env = (mkBHEnv server mgr) { bhRequestHook = hook }
+ runBH env f
+
+
+ jsonToBSL :: Value -> BSL.ByteString
+ jsonToBSL = encodePretty' defConfig { confIndent = Spaces 2 }
+
+ toJsonMsg :: UTCTime -> (Word32, LogMessage)
+ -> (Word32, H.HashMap T.Text Value)
+ toJsonMsg now (n, msg) = (n, H.union jMsg $ H.fromList [
+ ("insertion_order", toJSON n)
+ , ("insertion_time", toJSON now)
+ ])
+ where
+ Object jMsg = toJSON msg
+
+ mkDocId :: BS.ByteString -> Word32 -> DocId
+ mkDocId baseID insertionOrder = DocId . T.decodeUtf8
+ . B64.encode $ BS.concat [
+ baseID
+ , littleEndianRep insertionOrder
+ ]
+
+ toBulk :: IndexName -> BS.ByteString -> (Word32, H.HashMap T.Text Value)
+ -> BulkOperation
+ toBulk index baseID (n, obj) =
+ BulkIndex index mapping (mkDocId baseID n) $ Object obj
+
+data LogsMapping = LogsMapping
+instance ToJSON LogsMapping where
+ toJSON LogsMapping = object [
+ "properties" .= object [
+ "insertion_order" .= object [
+ "type" .= ("integer"::T.Text)
+ ]
+ , "insertion_time" .= object [
+ "type" .= ("date"::T.Text)
+ , "format" .= ("date_time"::T.Text)
+ ]
+ , "time" .= object [
+ "type" .= ("date"::T.Text)
+ , "format" .= ("date_time"::T.Text)
+ ]
+ , "domain" .= object [
+ "type" .= ("string"::T.Text)
+ ]
+ , "level" .= object [
+ "type" .= ("string"::T.Text)
+ ]
+ , "component" .= object [
+ "type" .= ("string"::T.Text)
+ ]
+ , "message" .= object [
+ "type" .= ("string"::T.Text)
+ ]
+ ]
+ ]
+
+----------------------------------------
+
+littleEndianRep :: Word32 -> BS.ByteString
+littleEndianRep = fst . BS.unfoldrN 4 step
+ where
+ step n = Just (fromIntegral $ n .&. 0xff, n `shiftR` 8)
+
+decodeReply :: Reply -> Value
+decodeReply reply = case eitherDecode' $ responseBody reply of
+ Right body -> body
+ Left err -> object ["decoding_error" .= err]
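The document-ID scheme described in the comments above (4 random bytes, 4 bytes of epoch time, 4 bytes of insertion order, then Base64) can be sketched self-contained; this reimplements `littleEndianRep` from the patch and concatenates the raw 12 bytes, but skips the Base64 step to avoid the extra dependency (`mkRawDocId` is an illustrative name, not from the patch):

```haskell
module Main where

import Data.Bits ((.&.), shiftR)
import Data.Word (Word32)
import qualified Data.ByteString as BS

-- Same definition as in the patch: the 4 bytes of a Word32,
-- least-significant byte first.
littleEndianRep :: Word32 -> BS.ByteString
littleEndianRep = fst . BS.unfoldrN 4 step
  where
    step n = Just (fromIntegral $ n .&. 0xff, n `shiftR` 8)

-- The 12-byte payload that mkDocId Base64-encodes: random word,
-- epoch seconds, per-batch insertion order.
mkRawDocId :: Word32 -> Word32 -> Word32 -> BS.ByteString
mkRawDocId rnd epoch order =
  BS.concat [littleEndianRep rnd, littleEndianRep epoch, littleEndianRep order]

main :: IO ()
main = do
  print (BS.unpack (littleEndianRep 0x01020304))  -- [4,3,2,1]
  print (BS.length (mkRawDocId 1 2 3))            -- 12
```

The insertion-order suffix is what keeps IDs unique within one bulk request, while the random prefix makes collisions across logger instances practically impossible.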
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/log-elasticsearch-0.7/src/Log/Backend/ElasticSearch.hs new/log-elasticsearch-0.9.0.1/src/Log/Backend/ElasticSearch.hs
--- old/log-elasticsearch-0.7/src/Log/Backend/ElasticSearch.hs 2016-11-25 10:08:48.000000000 +0100
+++ new/log-elasticsearch-0.9.0.1/src/Log/Backend/ElasticSearch.hs 2017-06-20 18:17:46.000000000 +0200
@@ -1,248 +1,5 @@
--- | Elasticsearch logging back-end.
-module Log.Backend.ElasticSearch (
- ElasticSearchConfig(..)
- , defaultElasticSearchConfig
- , withElasticSearchLogger
- , elasticSearchLogger
+module Log.Backend.ElasticSearch
+ {-# DEPRECATED "Use directly Log.Backend.ElasticSearch.V1 or V5" #-}
+ ( module Log.Backend.ElasticSearch.V1 ) where
- ) where
-
-import Control.Applicative
-import Control.Arrow (second)
-import Control.Concurrent
-import Control.Exception
-import Control.Monad
-import Control.Monad.IO.Class
-import Data.Aeson
-import Data.Aeson.Encode.Pretty
-import Data.Bits
-import Data.IORef
-import Data.Semigroup
-import Data.Time
-import Data.Time.Clock.POSIX
-import Data.Word
-import Database.Bloodhound hiding (Status)
-import Log
-import Log.Internal.Logger
-import Network.HTTP.Client
-import Prelude
-import TextShow
-import qualified Data.ByteString as BS
-import qualified Data.ByteString.Base64 as B64
-import qualified Data.ByteString.Lazy.Char8 as BSL
-import qualified Data.HashMap.Strict as H
-import qualified Data.Text as T
-import qualified Data.Text.Encoding as T
-import qualified Data.Traversable as F
-import qualified Data.Vector as V
-
--- | Configuration for the Elasticsearch 'Logger'. See
--- <https://www.elastic.co/guide/en/elasticsearch/reference/current/glossary.ht…>
--- for the explanation of terms.
-data ElasticSearchConfig = ElasticSearchConfig {
- esServer :: !T.Text -- ^ Elasticsearch server address.
- , esIndex :: !T.Text -- ^ Elasticsearch index name.
- , esMapping :: !T.Text -- ^ Elasticsearch mapping name.
- } deriving (Eq, Show)
-
--- | Sensible defaults for 'ElasticSearchConfig'.
-defaultElasticSearchConfig :: ElasticSearchConfig
-defaultElasticSearchConfig = ElasticSearchConfig {
- esServer = "http://localhost:9200",
- esIndex = "logs",
- esMapping = "log"
- }
-
-
-----------------------------------------
--- | Create an 'elasticSearchLogger' for the duration of the given
--- action, and shut it down afterwards, making sure that all buffered
--- messages are actually written to the Elasticsearch store.
-withElasticSearchLogger :: ElasticSearchConfig -> IO Word32 -> (Logger -> IO r)
- -> IO r
-withElasticSearchLogger conf randGen act = do
- logger <- elasticSearchLogger conf randGen
- withLogger logger act
-
-{-# DEPRECATED elasticSearchLogger "Use 'withElasticSearchLogger' instead!" #-}
-
--- | Start an asynchronous logger thread that stores messages using
--- Elasticsearch.
---
--- Please use 'withElasticSearchLogger' instead, which is more
--- exception-safe (see the note attached to 'mkBulkLogger').
-elasticSearchLogger ::
- ElasticSearchConfig -- ^ Configuration.
- -> IO Word32 -- ^ Generate a random 32-bit word for use in
- -- document IDs.
- -> IO Logger
-elasticSearchLogger ElasticSearchConfig{..} genRandomWord = do
- checkElasticSearchConnection
- indexRef <- newIORef $ IndexName T.empty
- mkBulkLogger "ElasticSearch" (\msgs -> do
- now <- getCurrentTime
- oldIndex <- readIORef indexRef
- -- Bloodhound doesn't support letting ES autogenerate IDs, so let's generate
- -- them ourselves. An ID of a log message is 12 bytes (4 bytes: random, 4
- -- bytes: current time as epoch, 4 bytes: insertion order) encoded as
- -- Base64. This makes eventual collisions practically impossible.
- baseID <- (<>)
- <$> (littleEndianRep <$> liftIO genRandomWord)
- <*> pure (littleEndianRep . floor $ timeToDouble now)
- retryOnException . runBH_ $ do
- -- Elasticsearch index names are additionally indexed by date so that each
- -- day is logged to a separate index to make log management easier.
- let index = IndexName $ T.concat [
- esIndex
- , "-"
- , T.pack $ formatTime defaultTimeLocale "%F" now
- ]
- when (oldIndex /= index) $ do
- -- There is an obvious race condition in the presence of more than one
- -- logger instance running, but it's irrelevant as attempting to create
- -- index that already exists is harmless.
- indexExists' <- indexExists index
- unless indexExists' $ do
- -- Bloodhound is weird and won't let us create index using default
- -- settings, so pass these as the default ones.
- let indexSettings = IndexSettings {
- indexShards = ShardCount 4
- , indexReplicas = ReplicaCount 1
- }
- void $ createIndex indexSettings index
- reply <- putMapping index mapping LogsMapping
- when (not $ isSuccess reply) $ do
- error $ "ElasticSearch: error while creating mapping: "
- <> T.unpack (T.decodeUtf8 . BSL.toStrict . jsonToBSL
- $ decodeReply reply)
- liftIO $ writeIORef indexRef index
- let jsonMsgs = V.fromList $ map (toJsonMsg now) $ zip [1..] msgs
- reply <- bulk $ V.map (toBulk index baseID) jsonMsgs
- -- Try to parse parts of reply to get information about log messages that
- -- failed to be inserted for some reason.
- let replyBody = decodeReply reply
- result = do
- Object response <- return replyBody
- Bool hasErrors <- "errors" `H.lookup` response
- Array jsonItems <- "items" `H.lookup` response
- items <- F.forM jsonItems $ \v -> do
- Object item <- return v
- Object index_ <- "index" `H.lookup` item
- return index_
- guard $ V.length items == V.length jsonMsgs
- return (hasErrors, items)
- case result of
- Nothing -> liftIO . BSL.putStrLn
- $ "ElasticSearch: unexpected response: " <> jsonToBSL replyBody
- Just (hasErrors, items) -> when hasErrors $ do
- -- If any message failed to be inserted because of type mismatch, go
- -- back to them, replace their data with elastic search error and put
- -- old data into its own namespace to work around insertion errors.
- let failed = V.findIndices (H.member "error") items
- dummyMsgs <- V.forM failed $ \n -> do
- dataNamespace <- liftIO genRandomWord
- let modifyData oldData = object [
- "__es_error" .= H.lookup "error" (items V.! n)
- , "__es_modified" .= True
- , ("__data_" <> showt dataNamespace) .= oldData
- ]
- return . second (H.adjust modifyData "data") $ jsonMsgs V.! n
- -- Attempt to put modified messages and ignore any further errors.
- void $ bulk (V.map (toBulk index baseID) dummyMsgs))
- (elasticSearchSync indexRef)
- where
- server = Server esServer
- mapping = MappingName esMapping
-
- elasticSearchSync :: IORef IndexName -> IO ()
- elasticSearchSync indexRef = do
- indexName <- readIORef indexRef
- void . runBH_ $ refreshIndex indexName
-
- checkElasticSearchConnection :: IO ()
- checkElasticSearchConnection = try (void $ runBH_ listIndices) >>= \case
- Left (ex::HttpException) -> error $ "ElasticSearch: unexpected error: "
- <> show ex
- <> " (is ElasticSearch server running?)"
- Right () -> return ()
-
- retryOnException :: forall r. IO r -> IO r
- retryOnException m = try m >>= \case
- Left (ex::SomeException) -> do
- putStrLn $ "ElasticSearch: unexpected error: "
- <> show ex <> ", retrying in 10 seconds"
- threadDelay $ 10 * 1000000
- retryOnException m
- Right result -> return result
-
- timeToDouble :: UTCTime -> Double
- timeToDouble = realToFrac . utcTimeToPOSIXSeconds
-
- runBH_ :: forall r. BH IO r -> IO r
- runBH_ = withBH defaultManagerSettings server
-
- jsonToBSL :: Value -> BSL.ByteString
- jsonToBSL = encodePretty' defConfig { confIndent = Spaces 2 }
-
- toJsonMsg :: UTCTime -> (Word32, LogMessage)
- -> (Word32, H.HashMap T.Text Value)
- toJsonMsg now (n, msg) = (n, H.union jMsg $ H.fromList [
- ("insertion_order", toJSON n)
- , ("insertion_time", toJSON now)
- ])
- where
- Object jMsg = toJSON msg
-
- mkDocId :: BS.ByteString -> Word32 -> DocId
- mkDocId baseID insertionOrder = DocId . T.decodeUtf8
- . B64.encode $ BS.concat [
- baseID
- , littleEndianRep insertionOrder
- ]
-
- toBulk :: IndexName -> BS.ByteString -> (Word32, H.HashMap T.Text Value)
- -> BulkOperation
- toBulk index baseID (n, obj) =
- BulkIndex index mapping (mkDocId baseID n) $ Object obj
-
-data LogsMapping = LogsMapping
-instance ToJSON LogsMapping where
- toJSON LogsMapping = object [
- "properties" .= object [
- "insertion_order" .= object [
- "type" .= ("integer"::T.Text)
- ]
- , "insertion_time" .= object [
- "type" .= ("date"::T.Text)
- , "format" .= ("date_time"::T.Text)
- ]
- , "time" .= object [
- "type" .= ("date"::T.Text)
- , "format" .= ("date_time"::T.Text)
- ]
- , "domain" .= object [
- "type" .= ("string"::T.Text)
- ]
- , "level" .= object [
- "type" .= ("string"::T.Text)
- ]
- , "component" .= object [
- "type" .= ("string"::T.Text)
- ]
- , "message" .= object [
- "type" .= ("string"::T.Text)
- ]
- ]
- ]
-
-----------------------------------------
-
-littleEndianRep :: Word32 -> BS.ByteString
-littleEndianRep = fst . BS.unfoldrN 4 step
- where
- step n = Just (fromIntegral $ n .&. 0xff, n `shiftR` 8)
-
-decodeReply :: Reply -> Value
-decodeReply reply = case eitherDecode' $ responseBody reply of
- Right body -> body
- Left err -> object ["decoding_error" .= err]
+import Log.Backend.ElasticSearch.V1
Hello community,
here is the log from the commit of package ghc-log for openSUSE:Factory checked in at 2017-08-31 20:48:15
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/ghc-log (Old)
and /work/SRC/openSUSE:Factory/.ghc-log.new (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Package is "ghc-log"
Thu Aug 31 20:48:15 2017 rev:3 rq:513423 version:0.9.0.1
Changes:
--------
--- /work/SRC/openSUSE:Factory/ghc-log/ghc-log.changes 2017-05-18 20:50:53.489487369 +0200
+++ /work/SRC/openSUSE:Factory/.ghc-log.new/ghc-log.changes 2017-08-31 20:48:15.852504030 +0200
@@ -1,0 +2,5 @@
+Thu Jul 27 14:06:02 UTC 2017 - psimons(a)suse.com
+
+- Update to version 0.9.0.1.
+
+-------------------------------------------------------------------
Old:
----
log-0.7.tar.gz
log.cabal
New:
----
log-0.9.0.1.tar.gz
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Other differences:
------------------
++++++ ghc-log.spec ++++++
--- /var/tmp/diff_new_pack.mCGgeq/_old 2017-08-31 20:48:16.728381086 +0200
+++ /var/tmp/diff_new_pack.mCGgeq/_new 2017-08-31 20:48:16.732380524 +0200
@@ -19,14 +19,13 @@
%global pkg_name log
%bcond_with tests
Name: ghc-%{pkg_name}
-Version: 0.7
+Version: 0.9.0.1
Release: 0
Summary: Structured logging solution with multiple backends
License: BSD-3-Clause
Group: Development/Languages/Other
Url: https://hackage.haskell.org/package/%{pkg_name}
Source0: https://hackage.haskell.org/package/%{pkg_name}-%{version}/%{pkg_name}-%{ve…
-Source1: https://hackage.haskell.org/package/%{pkg_name}-%{version}/revision/1.cabal…
BuildRequires: ghc-Cabal-devel
BuildRequires: ghc-log-base-devel
BuildRequires: ghc-log-elasticsearch-devel
@@ -74,7 +73,6 @@
%prep
%setup -q -n %{pkg_name}-%{version}
-cp -p %{SOURCE1} %{pkg_name}.cabal
%build
%ghc_lib_build
@@ -97,5 +95,6 @@
%files devel -f %{name}-devel.files
%defattr(-,root,root,-)
+%doc CHANGELOG.md README.md
%changelog
++++++ log-0.7.tar.gz -> log-0.9.0.1.tar.gz ++++++
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/log-0.7/CHANGELOG.md new/log-0.9.0.1/CHANGELOG.md
--- old/log-0.7/CHANGELOG.md 1970-01-01 01:00:00.000000000 +0100
+++ new/log-0.9.0.1/CHANGELOG.md 2017-06-20 18:22:15.000000000 +0200
@@ -0,0 +1,40 @@
+# log-0.9.0.0 (2017-05-04)
+* Updated the Elasticsearch back-end to work with bloodhound-0.14.0.0 (#30).
+* The following modules are now deprecated: Log.Backend.ElasticSearch,
+ Log.Backend.ElasticSearch.Internal,
+ Log.Backend.ElasticSearch.Lens. Use V1/V5 variants directly (#30).
+
+# log-0.8 (2017-03-16)
+* Added a few MTL class instances (#28).
+* Made ElasticSearchConfig an abstract type (#27).
+* Added support for HTTPS and basic auth to log-elasticsearch (#26).
+
+# log-0.7 (2016-11-25)
+* Split into four libraries (log, log-base, log-postgres,
+ log-elasticsearch).
+* Improved documentation (#22).
+* Implement 'toEncoding' directly in 'ToJSON' instances (#21).
+
+# log-0.6 (2016-11-22)
+* Moved 'withLogger' to 'Log.Internal.Logger'.
+
+# log-0.5.7 (2016-11-22)
+* Remove the dependency on 'cond'.
+* Fix formatting in 'mkBulkLogger' haddocks (#16).
+* Generalise the types of 'logAttention', 'logInfo' and 'logTrace'
+ (#17).
+
+# log-0.5.5 (2016-11-16)
+* Add an in-memory logging backend for testing (#13).
+* Fix the deprecation message for stdout logger.
+
+# log-0.5.4 (2016-10-21)
+* New logger creation API, which is harder to misuse.
+* Remove the use of finalisers in favour of the new logger API.
+* Fix a JSON serialisation issue affecting the Elasticsearch back-end.
+* Make the Elasticsearch back-end compatible with Elasticsearch 1.x.
+* Fix a synchronisation issue affecting the Elasticsearch back-end.
+* Add a test suite and Travis-based CI.
+
+# log-0.1.0
+* Initial version.
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/log-0.7/README.md new/log-0.9.0.1/README.md
--- old/log-0.7/README.md 1970-01-01 01:00:00.000000000 +0100
+++ new/log-0.9.0.1/README.md 2017-06-20 18:22:15.000000000 +0200
@@ -0,0 +1,38 @@
+# log [![Hackage version](https://img.shields.io/hackage/v/log.svg?label=Hackage)](https://h… [![Build Status](https://secure.travis-ci.org/scrive/log.svg?branch=master)](http://…
+
+A library that provides a way to record structured log messages with
+multiple back ends.
+
+Supported back ends:
+
+* Standard output
+* Elasticsearch
+* PostgreSQL
+
+The `log` library provides Elasticsearch and PostgreSQL back ends. If
+you only need one of those, use `log-base` and `log-elasticsearch` or
+`log-postgres`.
+
+## Example
+
+```haskell
+{-# LANGUAGE OverloadedStrings #-}
+
+module Main where
+
+import Log
+import Log.Backend.ElasticSearch.V5
+
+import System.Random
+
+main :: IO ()
+main = do
+ let config = defaultElasticSearchConfig {
+ esServer = "http://localhost:9200",
+ esIndex = "logs",
+ esMapping = "log"
+ }
+ withElasticSearchLogger config randomIO $ \logger ->
+ runLogT "main" logger $ do
+ logTrace_ "foo"
+```
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/log-0.7/bench/Bench.hs new/log-0.9.0.1/bench/Bench.hs
--- old/log-0.7/bench/Bench.hs 1970-01-01 01:00:00.000000000 +0100
+++ new/log-0.9.0.1/bench/Bench.hs 2017-06-20 18:22:15.000000000 +0200
@@ -0,0 +1,65 @@
+module Main where
+
+import Log
+import Log.Backend.ElasticSearch.V5
+import Log.Backend.PostgreSQL
+
+import Control.Concurrent
+import Control.Monad
+import Control.Monad.Base
+import Control.Monad.IO.Class
+import Database.PostgreSQL.PQTypes
+import Data.Monoid
+import System.Environment
+import System.Random
+import TextShow
+
+import qualified Data.Text as T
+import qualified System.Remote.Monitoring as EKG
+
+-- Usage:
+--
+-- 1. (Optional) Start elasticsearch/postgres
+-- 2. Run the benchmark exe with the appropriate argument ('postgres'/'elastic').
+-- 3. Open htop and/or localhost:8000 in the browser
+-- 4. Stop elasticsearch/postgres
+-- 5. Observe the memory allocation behaviour.
+
+main :: IO ()
+main = do
+ void $ EKG.forkServer "localhost" 8000
+ args <- getArgs
+ case args of
+ ["postgres"] -> benchPostgres defConnString
+ ["postgres", connString] -> benchPostgres (T.pack connString)
+ ["elastic"] -> benchElastic
+ _ -> benchElastic
+
+defConnString :: T.Text
+defConnString = "postgresql://user:password@localhost/log-postgresql-bench"
+
+benchPostgres :: T.Text -> IO ()
+benchPostgres connString = do
+ putStrLn "postgres"
+ ConnectionSource connSource <- poolSource def { csConnInfo = connString } 1 10 1
+ withPgLogger connSource $
+ \logger -> forever $ benchLogger logger
+
+benchElastic :: IO ()
+benchElastic = do
+ putStrLn "elastic"
+ let config = defaultElasticSearchConfig
+ withElasticSearchLogger config randomIO $
+ \logger -> forever $ benchLogger logger
+
+
+benchLogger :: (MonadIO m, MonadTime m, MonadBase IO m) => Logger -> m ()
+benchLogger logger= do
+ liftIO $ putStrLn "writing 100 000 log messages..."
+
+ runLogT "log-bench-elasticsearch" logger $
+ forM_ [0..10000] $ \(i :: Int) ->
+ logTrace_ ("kaboozle kaboozle kaboozle kaboozle kaboozle " <> showt i)
+
+ liftIO $ putStrLn "sleeping for 1 s..."
+ liftIO $ threadDelay 1000000 {- 1 sec -}
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/log-0.7/log.cabal new/log-0.9.0.1/log.cabal
--- old/log-0.7/log.cabal 2016-11-25 10:09:31.000000000 +0100
+++ new/log-0.9.0.1/log.cabal 2017-06-20 18:22:15.000000000 +0200
@@ -1,5 +1,5 @@
name: log
-version: 0.7
+version: 0.9.0.1
synopsis: Structured logging solution with multiple backends
description: A library that provides a way to record structured
@@ -25,7 +25,8 @@
category: System
build-type: Simple
cabal-version: >=1.10
-tested-with: GHC == 7.8.4, GHC == 7.10.3, GHC == 8.0.1
+extra-source-files: CHANGELOG.md, README.md
+tested-with: GHC == 7.8.4, GHC == 7.10.3, GHC == 8.0.2
Source-repository head
Type: git
@@ -34,6 +35,14 @@
library
exposed-modules: Log,
Log.Backend.ElasticSearch,
+ Log.Backend.ElasticSearch.Internal,
+ Log.Backend.ElasticSearch.Lens,
+ Log.Backend.ElasticSearch.V1,
+ Log.Backend.ElasticSearch.V1.Internal,
+ Log.Backend.ElasticSearch.V1.Lens,
+ Log.Backend.ElasticSearch.V5,
+ Log.Backend.ElasticSearch.V5.Internal,
+ Log.Backend.ElasticSearch.V5.Lens,
Log.Backend.PostgreSQL,
Log.Backend.StandardOutput,
Log.Backend.StandardOutput.Bulk,
@@ -45,9 +54,9 @@
Log.Monad
build-depends: base <5,
- log-base >= 0.7,
- log-elasticsearch >= 0.7,
- log-postgres >= 0.7
+ log-base >= 0.7.1.1 && < 0.9,
+ log-elasticsearch >= 0.9.0.1 && < 0.10,
+ log-postgres >= 0.7.0.1 && < 0.9
hs-source-dirs: src
@@ -104,3 +113,22 @@
default-extensions: BangPatterns
, OverloadedStrings
, RecordWildCards
+
+benchmark log-bench
+ type: exitcode-stdio-1.0
+ build-depends: base,
+ ekg,
+ log,
+ hpqtypes,
+ random,
+ text,
+ text-show,
+ transformers,
+ transformers-base
+ hs-source-dirs: bench
+ main-is: Bench.hs
+ ghc-options: -Wall -threaded "-with-rtsopts=-T -sstderr"
+ default-language: Haskell2010
+ default-extensions: OverloadedStrings,
+ FlexibleContexts,
+ ScopedTypeVariables
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/log-0.7/src/Log/Backend/ElasticSearch/Internal.hs new/log-0.9.0.1/src/Log/Backend/ElasticSearch/Internal.hs
--- old/log-0.7/src/Log/Backend/ElasticSearch/Internal.hs 1970-01-01 01:00:00.000000000 +0100
+++ new/log-0.9.0.1/src/Log/Backend/ElasticSearch/Internal.hs 2017-06-20 18:22:15.000000000 +0200
@@ -0,0 +1,5 @@
+module Log.Backend.ElasticSearch.Internal
+ {-# DEPRECATED "Use directly Log.Backend.ElasticSearch.V1 or V5" #-}
+ ( module Log.Backend.ElasticSearch.V1.Internal ) where
+
+import Log.Backend.ElasticSearch.V1.Internal
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/log-0.7/src/Log/Backend/ElasticSearch/Lens.hs new/log-0.9.0.1/src/Log/Backend/ElasticSearch/Lens.hs
--- old/log-0.7/src/Log/Backend/ElasticSearch/Lens.hs 1970-01-01 01:00:00.000000000 +0100
+++ new/log-0.9.0.1/src/Log/Backend/ElasticSearch/Lens.hs 2017-06-20 18:22:15.000000000 +0200
@@ -0,0 +1,5 @@
+module Log.Backend.ElasticSearch.Lens
+ {-# DEPRECATED "Use directly Log.Backend.ElasticSearch.V1 or V5" #-}
+ ( module Log.Backend.ElasticSearch.V1.Lens ) where
+
+import Log.Backend.ElasticSearch.V1.Lens
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/log-0.7/src/Log/Backend/ElasticSearch/V1/Internal.hs new/log-0.9.0.1/src/Log/Backend/ElasticSearch/V1/Internal.hs
--- old/log-0.7/src/Log/Backend/ElasticSearch/V1/Internal.hs 1970-01-01 01:00:00.000000000 +0100
+++ new/log-0.9.0.1/src/Log/Backend/ElasticSearch/V1/Internal.hs 2017-06-20 18:22:15.000000000 +0200
@@ -0,0 +1,5 @@
+module Log.Backend.ElasticSearch.V1.Internal (
+ module Log.Backend.ElasticSearch.V1.Internal
+ ) where
+
+import "log-elasticsearch" Log.Backend.ElasticSearch.V1.Internal
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/log-0.7/src/Log/Backend/ElasticSearch/V1/Lens.hs new/log-0.9.0.1/src/Log/Backend/ElasticSearch/V1/Lens.hs
--- old/log-0.7/src/Log/Backend/ElasticSearch/V1/Lens.hs 1970-01-01 01:00:00.000000000 +0100
+++ new/log-0.9.0.1/src/Log/Backend/ElasticSearch/V1/Lens.hs 2017-06-20 18:22:15.000000000 +0200
@@ -0,0 +1,5 @@
+module Log.Backend.ElasticSearch.V1.Lens (
+ module Log.Backend.ElasticSearch.V1.Lens
+ ) where
+
+import "log-elasticsearch" Log.Backend.ElasticSearch.V1.Lens
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/log-0.7/src/Log/Backend/ElasticSearch/V1.hs new/log-0.9.0.1/src/Log/Backend/ElasticSearch/V1.hs
--- old/log-0.7/src/Log/Backend/ElasticSearch/V1.hs 1970-01-01 01:00:00.000000000 +0100
+++ new/log-0.9.0.1/src/Log/Backend/ElasticSearch/V1.hs 2017-06-20 18:22:15.000000000 +0200
@@ -0,0 +1,5 @@
+module Log.Backend.ElasticSearch.V1 (
+ module Log.Backend.ElasticSearch.V1
+ ) where
+
+import "log-elasticsearch" Log.Backend.ElasticSearch.V1
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/log-0.7/src/Log/Backend/ElasticSearch/V5/Internal.hs new/log-0.9.0.1/src/Log/Backend/ElasticSearch/V5/Internal.hs
--- old/log-0.7/src/Log/Backend/ElasticSearch/V5/Internal.hs 1970-01-01 01:00:00.000000000 +0100
+++ new/log-0.9.0.1/src/Log/Backend/ElasticSearch/V5/Internal.hs 2017-06-20 18:22:15.000000000 +0200
@@ -0,0 +1,5 @@
+module Log.Backend.ElasticSearch.V5.Internal (
+ module Log.Backend.ElasticSearch.V5.Internal
+ ) where
+
+import "log-elasticsearch" Log.Backend.ElasticSearch.V5.Internal
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/log-0.7/src/Log/Backend/ElasticSearch/V5/Lens.hs new/log-0.9.0.1/src/Log/Backend/ElasticSearch/V5/Lens.hs
--- old/log-0.7/src/Log/Backend/ElasticSearch/V5/Lens.hs 1970-01-01 01:00:00.000000000 +0100
+++ new/log-0.9.0.1/src/Log/Backend/ElasticSearch/V5/Lens.hs 2017-06-20 18:22:15.000000000 +0200
@@ -0,0 +1,5 @@
+module Log.Backend.ElasticSearch.V5.Lens (
+ module Log.Backend.ElasticSearch.V5.Lens
+ ) where
+
+import "log-elasticsearch" Log.Backend.ElasticSearch.V5.Lens
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/log-0.7/src/Log/Backend/ElasticSearch/V5.hs new/log-0.9.0.1/src/Log/Backend/ElasticSearch/V5.hs
--- old/log-0.7/src/Log/Backend/ElasticSearch/V5.hs 1970-01-01 01:00:00.000000000 +0100
+++ new/log-0.9.0.1/src/Log/Backend/ElasticSearch/V5.hs 2017-06-20 18:22:15.000000000 +0200
@@ -0,0 +1,5 @@
+module Log.Backend.ElasticSearch.V5 (
+ module Log.Backend.ElasticSearch.V5
+ ) where
+
+import "log-elasticsearch" Log.Backend.ElasticSearch.V5
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/log-0.7/src/Log/Backend/ElasticSearch.hs new/log-0.9.0.1/src/Log/Backend/ElasticSearch.hs
--- old/log-0.7/src/Log/Backend/ElasticSearch.hs 2016-11-25 10:09:31.000000000 +0100
+++ new/log-0.9.0.1/src/Log/Backend/ElasticSearch.hs 2017-06-20 18:22:15.000000000 +0200
@@ -1,5 +1,5 @@
-module Log.Backend.ElasticSearch (
- module Log.Backend.ElasticSearch
- ) where
+module Log.Backend.ElasticSearch
+ {-# DEPRECATED "Use directly Log.Backend.ElasticSearch.V1 or V5" #-}
+ ( module Log.Backend.ElasticSearch.V1 ) where
-import "log-elasticsearch" Log.Backend.ElasticSearch
+import Log.Backend.ElasticSearch.V1
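The shim modules in this patch use GHC's `PackageImports` extension: the string literal after `import` pins the module to a specific package. That is needed here because the compatibility package exposes modules under the very same names it re-exports from `log-elasticsearch`, so a plain import would be ambiguous or self-referential. A self-contained illustration of the mechanism (using `base`, since `log-elasticsearch` is not assumed installed):

```haskell
{-# LANGUAGE PackageImports #-}
-- PackageImports lets an import name the package that must provide the
-- module. The shims in the diff use this to re-export log-elasticsearch
-- modules under identical module names; here we just pin Data.List to
-- the base package to show the syntax.
module Main where

import "base" Data.List (sort)

main :: IO ()
main = print (sort [3, 1, 2 :: Int])
```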
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/log-0.7/test/IntegrationTest.hs new/log-0.9.0.1/test/IntegrationTest.hs
--- old/log-0.7/test/IntegrationTest.hs 2016-11-25 10:09:31.000000000 +0100
+++ new/log-0.9.0.1/test/IntegrationTest.hs 2017-06-20 18:22:15.000000000 +0200
@@ -3,7 +3,7 @@
import Log
import Log.Backend.StandardOutput
import Log.Backend.StandardOutput.Bulk
-import Log.Backend.ElasticSearch
+import Log.Backend.ElasticSearch.V5
import Test.ElasticSearch
import Data.List
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/log-0.7/test/Test/ElasticSearch.hs new/log-0.9.0.1/test/Test/ElasticSearch.hs
--- old/log-0.7/test/Test/ElasticSearch.hs 2016-11-25 10:09:31.000000000 +0100
+++ new/log-0.9.0.1/test/Test/ElasticSearch.hs 2017-06-20 18:22:15.000000000 +0200
@@ -6,13 +6,13 @@
,refreshTestIndex)
where
-import Log.Backend.ElasticSearch
+import Log.Backend.ElasticSearch.V5
import Control.Monad
import Data.Aeson
import Data.Text (Text)
import Data.Time
-import Database.Bloodhound
+import Database.V5.Bloodhound
import Network.HTTP.Client
import Test.Tasty.HUnit
@@ -25,11 +25,11 @@
defaultElasticSearchTestConfig :: ElasticSearchConfig
-> IO ElasticSearchTestConfig
-defaultElasticSearchTestConfig ElasticSearchConfig{..} = do
+defaultElasticSearchTestConfig esc = do
now <- getCurrentTime
- let testServer = Server esServer
+ let testServer = Server (esServer esc)
testIndex = IndexName $ T.concat
- [ esIndex
+ [ esIndex esc
, "-"
, T.pack $ formatTime defaultTimeLocale "%F" now
]
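The hunk above replaces a `RecordWildCards` match (`ElasticSearchConfig{..}`, which brings every field into scope as a bare name) with an ordinary binding `esc` plus explicit accessor applications (`esServer esc`). A plausible reason is that the config type's constructor is no longer exported in log 0.9, which makes the wildcard pattern unusable, though the diff itself does not say. A self-contained sketch contrasting the two styles, with a hypothetical stand-in record (the real `ElasticSearchConfig` lives in log-elasticsearch; only the field names `esServer`/`esIndex` are taken from the diff):

```haskell
{-# LANGUAGE RecordWildCards #-}
module Main where

-- Hypothetical stand-in for ElasticSearchConfig.
data Config = Config { esServer :: String, esIndex :: String }

-- Old style (as in log 0.7): RecordWildCards binds each field as a
-- bare name, shadowing the accessor functions inside this equation.
describeWild :: Config -> String
describeWild Config{..} = esServer ++ "/" ++ esIndex

-- New style (as in log 0.9): bind the whole record and apply the
-- accessors; this also works when the constructor is not exported.
describeAcc :: Config -> String
describeAcc c = esServer c ++ "/" ++ esIndex c

main :: IO ()
main = do
  let c = Config "http://localhost:9200" "logs"
  putStrLn (describeWild c)
  putStrLn (describeAcc c)
```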
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/log-0.7/test/Test.hs new/log-0.9.0.1/test/Test.hs
--- old/log-0.7/test/Test.hs 2016-11-25 10:09:31.000000000 +0100
+++ new/log-0.9.0.1/test/Test.hs 2017-06-20 18:22:15.000000000 +0200
@@ -1,7 +1,7 @@
module Main where
import Log
-import Log.Backend.ElasticSearch
+import Log.Backend.ElasticSearch.V5
import Test.ElasticSearch
import System.Random
Hello community,
here is the log from the commit of package ghc-linear-accelerate for openSUSE:Factory checked in at 2017-08-31 20:48:13
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/ghc-linear-accelerate (Old)
and /work/SRC/openSUSE:Factory/.ghc-linear-accelerate.new (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Package is "ghc-linear-accelerate"
Thu Aug 31 20:48:13 2017 rev:2 rq:513422 version:0.4
Changes:
--------
--- /work/SRC/openSUSE:Factory/ghc-linear-accelerate/ghc-linear-accelerate.changes 2016-11-02 12:33:52.000000000 +0100
+++ /work/SRC/openSUSE:Factory/.ghc-linear-accelerate.new/ghc-linear-accelerate.changes 2017-08-31 20:48:14.560685358 +0200
@@ -1,0 +2,5 @@
+Thu Jul 27 14:07:50 UTC 2017 - psimons(a)suse.com
+
+- Update to version 0.4.
+
+-------------------------------------------------------------------
Old:
----
linear-accelerate-0.2.tar.gz
New:
----
linear-accelerate-0.4.tar.gz
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Other differences:
------------------
++++++ ghc-linear-accelerate.spec ++++++
--- /var/tmp/diff_new_pack.52ZEOA/_old 2017-08-31 20:48:15.588541081 +0200
+++ /var/tmp/diff_new_pack.52ZEOA/_new 2017-08-31 20:48:15.600539397 +0200
@@ -1,7 +1,7 @@
#
# spec file for package ghc-linear-accelerate
#
-# Copyright (c) 2016 SUSE LINUX GmbH, Nuernberg, Germany.
+# Copyright (c) 2017 SUSE LINUX GmbH, Nuernberg, Germany.
#
# All modifications and additions to the file contributed by third parties
# remain the property of their copyright owners, unless otherwise agreed
@@ -17,25 +17,29 @@
%global pkg_name linear-accelerate
+%bcond_with tests
Name: ghc-%{pkg_name}
-Version: 0.2
+Version: 0.4
Release: 0
-Summary: Instances to use linear vector spaces on accelerate backends
+Summary: Lifting linear vector spaces into Accelerate
License: BSD-3-Clause
-Group: System/Libraries
+Group: Development/Languages/Other
Url: https://hackage.haskell.org/package/%{pkg_name}
Source0: https://hackage.haskell.org/package/%{pkg_name}-%{version}/%{pkg_name}-%{ve…
BuildRequires: ghc-Cabal-devel
-# Begin cabal-rpm deps:
BuildRequires: ghc-accelerate-devel
+BuildRequires: ghc-cabal-doctest-devel
+BuildRequires: ghc-distributive-devel
BuildRequires: ghc-lens-devel
BuildRequires: ghc-linear-devel
BuildRequires: ghc-rpm-macros
BuildRoot: %{_tmppath}/%{name}-%{version}-build
-# End cabal-rpm deps
+%if %{with tests}
+BuildRequires: ghc-doctest-devel
+%endif
%description
-Instances to use linear vector spaces on accelerate backends.
+Lifting linear vector spaces into Accelerate.
%package devel
Summary: Haskell %{pkg_name} library development files
@@ -52,14 +56,14 @@
%prep
%setup -q -n %{pkg_name}-%{version}
-
%build
%ghc_lib_build
-
%install
%ghc_lib_install
+%check
+%cabal_test
%post devel
%ghc_pkg_recache
++++++ linear-accelerate-0.2.tar.gz -> linear-accelerate-0.4.tar.gz ++++++
++++ 2419 lines of diff (skipped)
Hello community,
here is the log from the commit of package ghc-libsystemd-journal for openSUSE:Factory checked in at 2017-08-31 20:48:10
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/ghc-libsystemd-journal (Old)
and /work/SRC/openSUSE:Factory/.ghc-libsystemd-journal.new (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Package is "ghc-libsystemd-journal"
Thu Aug 31 20:48:10 2017 rev:2 rq:513419 version:1.4.2
Changes:
--------
--- /work/SRC/openSUSE:Factory/ghc-libsystemd-journal/ghc-libsystemd-journal.changes 2017-04-12 18:07:33.303439324 +0200
+++ /work/SRC/openSUSE:Factory/.ghc-libsystemd-journal.new/ghc-libsystemd-journal.changes 2017-08-31 20:48:12.089032296 +0200
@@ -1,0 +2,5 @@
+Thu Jul 27 14:06:30 UTC 2017 - psimons(a)suse.com
+
+- Update to version 1.4.2.
+
+-------------------------------------------------------------------
Old:
----
libsystemd-journal-1.4.1.tar.gz
libsystemd-journal.cabal
New:
----
libsystemd-journal-1.4.2.tar.gz
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Other differences:
------------------
++++++ ghc-libsystemd-journal.spec ++++++
--- /var/tmp/diff_new_pack.pOQyap/_old 2017-08-31 20:48:12.928914404 +0200
+++ /var/tmp/diff_new_pack.pOQyap/_new 2017-08-31 20:48:12.932913843 +0200
@@ -18,14 +18,13 @@
%global pkg_name libsystemd-journal
Name: ghc-%{pkg_name}
-Version: 1.4.1
+Version: 1.4.2
Release: 0
Summary: Haskell bindings to libsystemd-journal
License: BSD-3-Clause
Group: Development/Languages/Other
Url: https://hackage.haskell.org/package/%{pkg_name}
Source0: https://hackage.haskell.org/package/%{pkg_name}-%{version}/%{pkg_name}-%{ve…
-Source1: https://hackage.haskell.org/package/%{pkg_name}-%{version}/revision/1.cabal…
BuildRequires: ghc-Cabal-devel
BuildRequires: ghc-bytestring-devel
BuildRequires: ghc-hashable-devel
@@ -63,7 +62,6 @@
%prep
%setup -q -n %{pkg_name}-%{version}
-cp -p %{SOURCE1} %{pkg_name}.cabal
%build
%ghc_lib_build
++++++ libsystemd-journal-1.4.1.tar.gz -> libsystemd-journal-1.4.2.tar.gz ++++++
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/libsystemd-journal-1.4.1/Changelog.md new/libsystemd-journal-1.4.2/Changelog.md
--- old/libsystemd-journal-1.4.1/Changelog.md 2016-11-04 18:33:20.000000000 +0100
+++ new/libsystemd-journal-1.4.2/Changelog.md 2017-07-24 14:18:20.000000000 +0200
@@ -1,3 +1,7 @@
+# 1.4.2
+
+* Updated `base` upper bound
+
# 1.4.1
* Updated `base` upper bound
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/libsystemd-journal-1.4.1/libsystemd-journal.cabal new/libsystemd-journal-1.4.2/libsystemd-journal.cabal
--- old/libsystemd-journal-1.4.1/libsystemd-journal.cabal 2016-11-04 18:33:20.000000000 +0100
+++ new/libsystemd-journal-1.4.2/libsystemd-journal.cabal 2017-07-24 14:18:20.000000000 +0200
@@ -1,5 +1,5 @@
name: libsystemd-journal
-version: 1.4.1
+version: 1.4.2
synopsis: Haskell bindings to libsystemd-journal
homepage: http://github.com/ocharles/libsystemd-journal
license: BSD3
@@ -16,7 +16,19 @@
library
exposed-modules: Systemd.Journal
- build-depends: base >=4.6 && <4.10, bytestring >= 0.9.1, pipes >= 4.0, pipes-safe >= 2.0, text >= 0.1 && < 1.3, transformers >= 0.3, unix-bytestring >= 0.3.2 && < 0.4, vector >= 0.4 && < 0.12, uuid, unordered-containers >= 0.1 && < 0.3, hashable >= 1.1.2.5, hsyslog, uniplate >= 1.6
+ build-depends: base >=4.6 && <4.11
+ , bytestring >= 0.9.1
+ , pipes >= 4.0
+ , pipes-safe >= 2.0
+ , text >= 0.1 && < 1.3
+ , transformers >= 0.3
+ , unix-bytestring >= 0.3.2 && < 0.4
+ , vector >= 0.4 && < 0.13
+ , uuid
+ , unordered-containers >= 0.1 && < 0.3
+ , hashable >= 1.1.2.5
+ , hsyslog
+ , uniplate >= 1.6
hs-source-dirs: src
default-language: Haskell2010
pkgconfig-depends: libsystemd >= 209
Hello community,
here is the log from the commit of package ghc-lens for openSUSE:Factory checked in at 2017-08-31 20:48:08
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/ghc-lens (Old)
and /work/SRC/openSUSE:Factory/.ghc-lens.new (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Package is "ghc-lens"
Thu Aug 31 20:48:08 2017 rev:6 rq:513418 version:4.15.3
Changes:
--------
--- /work/SRC/openSUSE:Factory/ghc-lens/ghc-lens.changes 2017-03-20 17:07:33.230889989 +0100
+++ /work/SRC/openSUSE:Factory/.ghc-lens.new/ghc-lens.changes 2017-08-31 20:48:10.285285481 +0200
@@ -1,0 +2,5 @@
+Thu Jul 27 14:07:54 UTC 2017 - psimons(a)suse.com
+
+- Update to version 4.15.3.
+
+-------------------------------------------------------------------
Old:
----
lens-4.15.1.tar.gz
lens.cabal
New:
----
lens-4.15.3.tar.gz
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Other differences:
------------------
++++++ ghc-lens.spec ++++++
--- /var/tmp/diff_new_pack.9nKcwS/_old 2017-08-31 20:48:11.161162538 +0200
+++ /var/tmp/diff_new_pack.9nKcwS/_new 2017-08-31 20:48:11.165161976 +0200
@@ -19,19 +19,19 @@
%global pkg_name lens
%bcond_with tests
Name: ghc-%{pkg_name}
-Version: 4.15.1
+Version: 4.15.3
Release: 0
Summary: Lenses, Folds and Traversals
-License: BSD-3-Clause
+License: BSD-2-Clause
Group: Development/Languages/Other
Url: https://hackage.haskell.org/package/%{pkg_name}
Source0: https://hackage.haskell.org/package/%{pkg_name}-%{version}/%{pkg_name}-%{ve…
-Source1: https://hackage.haskell.org/package/%{pkg_name}-%{version}/revision/4.cabal…
BuildRequires: ghc-Cabal-devel
BuildRequires: ghc-array-devel
BuildRequires: ghc-base-orphans-devel
BuildRequires: ghc-bifunctors-devel
BuildRequires: ghc-bytestring-devel
+BuildRequires: ghc-cabal-doctest-devel
BuildRequires: ghc-comonad-devel
BuildRequires: ghc-containers-devel
BuildRequires: ghc-contravariant-devel
@@ -51,6 +51,7 @@
BuildRequires: ghc-tagged-devel
BuildRequires: ghc-template-haskell-devel
BuildRequires: ghc-text-devel
+BuildRequires: ghc-th-abstraction-devel
BuildRequires: ghc-transformers-compat-devel
BuildRequires: ghc-transformers-devel
BuildRequires: ghc-unordered-containers-devel
@@ -170,7 +171,6 @@
%prep
%setup -q -n %{pkg_name}-%{version}
-cp -p %{SOURCE1} %{pkg_name}.cabal
%build
%ghc_lib_build
++++++ lens-4.15.1.tar.gz -> lens-4.15.3.tar.gz ++++++
++++ 4181 lines of diff (skipped)
Hello community,
here is the log from the commit of package ghc-language-c-quote for openSUSE:Factory checked in at 2017-08-31 20:48:07
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/ghc-language-c-quote (Old)
and /work/SRC/openSUSE:Factory/.ghc-language-c-quote.new (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Package is "ghc-language-c-quote"
Thu Aug 31 20:48:07 2017 rev:2 rq:513415 version:0.12
Changes:
--------
--- /work/SRC/openSUSE:Factory/ghc-language-c-quote/ghc-language-c-quote.changes 2017-05-16 14:40:41.792717566 +0200
+++ /work/SRC/openSUSE:Factory/.ghc-language-c-quote.new/ghc-language-c-quote.changes 2017-08-31 20:48:07.829630174 +0200
@@ -1,0 +2,5 @@
+Thu Jul 27 14:08:14 UTC 2017 - psimons(a)suse.com
+
+- Update to version 0.12.
+
+-------------------------------------------------------------------
Old:
----
language-c-quote-0.11.7.3.tar.gz
language-c-quote.cabal
New:
----
language-c-quote-0.12.tar.gz
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Other differences:
------------------
++++++ ghc-language-c-quote.spec ++++++
--- /var/tmp/diff_new_pack.zj43V4/_old 2017-08-31 20:48:08.997466248 +0200
+++ /var/tmp/diff_new_pack.zj43V4/_new 2017-08-31 20:48:09.001465687 +0200
@@ -19,14 +19,13 @@
%global pkg_name language-c-quote
%bcond_with tests
Name: ghc-%{pkg_name}
-Version: 0.11.7.3
+Version: 0.12
Release: 0
Summary: C/CUDA/OpenCL/Objective-C quasiquoting library
License: BSD-3-Clause
Group: Development/Languages/Other
Url: https://hackage.haskell.org/package/%{pkg_name}
Source0: https://hackage.haskell.org/package/%{pkg_name}-%{version}/%{pkg_name}-%{ve…
-Source1: https://hackage.haskell.org/package/%{pkg_name}-%{version}/revision/1.cabal…
BuildRequires: alex
BuildRequires: ghc-Cabal-devel
BuildRequires: ghc-array-devel
@@ -70,7 +69,6 @@
%prep
%setup -q -n %{pkg_name}-%{version}
-cp -p %{SOURCE1} %{pkg_name}.cabal
%build
%ghc_lib_build
++++++ language-c-quote-0.11.7.3.tar.gz -> language-c-quote-0.12.tar.gz ++++++
++++ 13330 lines of diff (skipped)
Hello community,
here is the log from the commit of package ghc-language-c for openSUSE:Factory checked in at 2017-08-31 20:48:03
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/ghc-language-c (Old)
and /work/SRC/openSUSE:Factory/.ghc-language-c.new (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Package is "ghc-language-c"
Thu Aug 31 20:48:03 2017 rev:4 rq:513414 version:0.6.1
Changes:
--------
--- /work/SRC/openSUSE:Factory/ghc-language-c/ghc-language-c.changes 2016-07-20 09:21:58.000000000 +0200
+++ /work/SRC/openSUSE:Factory/.ghc-language-c.new/ghc-language-c.changes 2017-08-31 20:48:03.638218507 +0200
@@ -1,0 +2,5 @@
+Thu Jul 27 14:03:57 UTC 2017 - psimons(a)suse.com
+
+- Update to version 0.6.1.
+
+-------------------------------------------------------------------
Old:
----
language-c-0.5.0.tar.gz
New:
----
language-c-0.6.1.tar.gz
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Other differences:
------------------
++++++ ghc-language-c.spec ++++++
--- /var/tmp/diff_new_pack.4II13z/_old 2017-08-31 20:48:04.518095001 +0200
+++ /var/tmp/diff_new_pack.4II13z/_new 2017-08-31 20:48:04.522094441 +0200
@@ -1,7 +1,7 @@
#
# spec file for package ghc-language-c
#
-# Copyright (c) 2016 SUSE LINUX GmbH, Nuernberg, Germany.
+# Copyright (c) 2017 SUSE LINUX GmbH, Nuernberg, Germany.
#
# All modifications and additions to the file contributed by third parties
# remain the property of their copyright owners, unless otherwise agreed
@@ -17,15 +17,15 @@
%global pkg_name language-c
+%bcond_with tests
Name: ghc-%{pkg_name}
-Version: 0.5.0
+Version: 0.6.1
Release: 0
Summary: Analysis and generation of C code
License: BSD-3-Clause
-Group: System/Libraries
+Group: Development/Languages/Other
Url: https://hackage.haskell.org/package/%{pkg_name}
Source0: https://hackage.haskell.org/package/%{pkg_name}-%{version}/%{pkg_name}-%{ve…
-# Begin cabal-rpm deps:
BuildRequires: alex
BuildRequires: ghc-Cabal-devel
BuildRequires: ghc-array-devel
@@ -39,12 +39,11 @@
BuildRequires: ghc-syb-devel
BuildRequires: happy
BuildRoot: %{_tmppath}/%{name}-%{version}-build
-# End cabal-rpm deps
%description
Language C is a haskell library for the analysis and generation of C code.
It features a complete, well tested parser and pretty printer for all of C99
-and a large set of GNU extensions.
+and a large set of C11 and clang/GNU extensions.
%package devel
Summary: Haskell %{pkg_name} library development files
@@ -60,14 +59,14 @@
%prep
%setup -q -n %{pkg_name}-%{version}
-
%build
%ghc_lib_build
-
%install
%ghc_lib_install
+%check
+%cabal_test
%post devel
%ghc_pkg_recache
++++++ language-c-0.5.0.tar.gz -> language-c-0.6.1.tar.gz ++++++
++++ 11041 lines of diff (skipped)