Posts tagged "amazonka":
Getting Amazonka S3 to work with localstack
I'm writing this in case someone else runs into strange errors when trying to use amazonka-s3 with localstack. It took me rather too long to find the answer, and neither the errors I got from Amazonka nor the ones from localstack were very helpful.
The code I started with for setting up the connection looked like this
main = do
  awsEnv <- AWS.overrideService localEndpoint <$> AWS.newEnv AWS.discover
  -- do S3 stuff
  where
    localEndpoint = AWS.setEndpoint False "localhost" 4566
A few years ago, when I last wrote some Haskell to talk to S3, this was enough[1], but now I got some strange errors.
It turns out there are different ways to address buckets, and the default, which is used by AWS itself, isn't used by localstack. With the default virtual-hosted style the bucket name goes into the hostname (e.g. mybucket.s3.eu-west-1.amazonaws.com), while with path style it goes into the path (e.g. s3.eu-west-1.amazonaws.com/mybucket), which is what localstack expects. The documentation of S3AddressingStyle has more details.
So to get it to work I had to change the S3 addressing style as well and ended up with this code instead
main = do
  awsEnv <- AWS.overrideService (s3AddrStyle . localEndpoint) <$> AWS.newEnv AWS.discover
  -- do S3 stuff
  where
    localEndpoint = AWS.setEndpoint False "localhost" 4566
    s3AddrStyle svc = svc {AWS.s3AddressingStyle = AWS.S3AddressingStylePath}
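Just to make the "do S3 stuff" comment a bit more concrete, here's a minimal sketch, assuming amazonka-2.0 with the modules imported qualified as in the snippet above; listing the buckets is an easy way to check that the connection to localstack actually works.

import qualified Amazonka as AWS
import qualified Amazonka.S3 as S3

-- List all buckets and print the raw response, to verify that the
-- environment above really talks to localstack. runResourceT is
-- re-exported by Amazonka; otherwise it lives in Control.Monad.Trans.Resource.
listAllBuckets :: AWS.Env -> IO ()
listAllBuckets awsEnv = do
  resp <- AWS.runResourceT $ AWS.send awsEnv S3.newListBuckets
  print resp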
Footnotes:
[1] That was before version 2.0 of Amazonka, so it did look slightly different, but overriding the endpoint was all that was needed.
Combining Amazonka and Conduit
Combining amazonka and conduit turned out to be easier than I had expected.
Here's an SNS sink I put together today
snsSink :: (MonadAWS m, MonadIO m) => T.Text -> C.ConduitT Value C.Void m ()
snsSink topic = do
  C.await >>= \case
    Nothing -> pure ()
    Just msg -> do
      _ <- C.lift $ publishSNS topic (TL.toStrict $ TL.decodeUtf8 $ encode msg)
      snsSink topic
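If you prefer to avoid the explicit recursion, the same thing can be sketched with C.awaitForever, which loops over the input for you. This reuses the publishSNS helper referenced above; void comes from Control.Monad.

import Control.Monad (void)

-- Same behaviour as snsSink, but with the looping handled by awaitForever.
snsSink' :: (MonadAWS m, MonadIO m) => T.Text -> C.ConduitT Value C.Void m ()
snsSink' topic = C.awaitForever $ \msg ->
  C.lift $ void $ publishSNS topic (TL.toStrict $ TL.decodeUtf8 $ encode msg)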
Putting it to use can be done with something like
foo = do
  ...
  awsEnv <- newEnv Discover
  runAWSCond awsEnv $
    <source producing Value> .| snsSink topicArn
  where
    runAWSCond awsEnv = runResourceT . runAWS awsEnv . within Frankfurt . C.runConduit
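As a self-contained, hypothetical way of filling in the placeholders, the source could be a handful of hard-coded Values fed in with yieldMany from Data.Conduit.Combinators; demo and its topicArn argument are made-up names for this example.

import Data.Aeson (object, (.=))
import qualified Data.Conduit.Combinators as CC

-- Hypothetical demo: push three small JSON objects through snsSink.
demo :: T.Text -> IO ()
demo topicArn = do
  awsEnv <- newEnv Discover
  runAWSCond awsEnv $
    CC.yieldMany [object ["n" .= i] | i <- [1 .. 3 :: Int]] .| snsSink topicArn
  where
    runAWSCond awsEnv = runResourceT . runAWS awsEnv . within Frankfurt . C.runConduit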
Using stack to get around upstream bugs
Recently I bumped into a bug in amazonka[1]. I can't really sit around waiting for Amazon to fix it, and then for amazonka to use the fixed documentation to generate the code and make another release.
Luckily stack contains features that make it fairly simple to work around this bug until it's properly fixed. Here's how.
- Put the upstream code in a git repository of your own. In my case I simply forked the amazonka repository on GitHub (my fork is here).
- Fix the bug and commit the change. My change to amazonka-codepipeline was simply to remove the missing fields – it was easier than trying to make them optional, i.e. wrapping them in Maybe (see the sketch after this list for what the two options look like).
- Tell stack to use the code from your modified git repository. In my case I added the following to my stack.yaml:
extra-deps:
  - github: magthe/amazonka
    commit: 1543b65e3a8b692aa9038ada68aaed9967752983
    subdirs:
      - amazonka-codepipeline
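For illustration only, here's roughly what the two options from the second step amount to, using a made-up record rather than the real generated amazonka-codepipeline types:

import Data.Text (Text)

-- Made-up stand-in for a generated response type. A field the service never
-- actually returns can either be dropped from the record entirely (what I
-- did) or be wrapped in Maybe so that a missing value decodes to Nothing.
data PipelineSummary = PipelineSummary
  { psName :: Text
  , psVersion :: Maybe Int -- optional instead of required
  }
  deriving (Show)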
That's it!
Footnotes:
[1] The guilty party is Amazon, not amazonka, though I was a little surprised that there doesn't seem to be any established way to modify the Amazon API documentation before it's used to autogenerate the Haskell code.