amazon web services - AWS Kinesis and Lambda data versioning


I have created an AWS Firehose endpoint (which I might change to a plain Kinesis stream) that receives logs from producers and saves them to an S3 bucket. A Lambda function then consumes that data, processes it, and saves the output to a DB.

Everything works fine. I am now planning to create staging and development flows for the entire structure. When I release a new version I cannot replace all producers instantly, so I need to keep older production versions running until no producer is left on them, because new versions might make breaking protocol changes.

I am not sure of the best approach to building a versionable system with Kinesis and Lambda. Should I copy the entire structure for every new version (including dev and staging) and make producers write to a specific versioned stream?

Or should I create an intermediate Lambda function that inspects the packets (which contain version info) and writes the events to a specific S3 bucket with versioned folders? Then only the Lambda functions that consume the data need to know about those folders, and this lets me use the versioning support of Lambda functions.
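To make this second idea concrete, here is a minimal sketch of what such an intermediate "router" Lambda could look like, assuming it is triggered directly by a Kinesis stream and that each packet is JSON carrying a version field. The bucket name and the version field name are hypothetical:

    import base64
    import json
    import uuid

    import boto3

    s3 = boto3.client("s3")
    BUCKET = "my-logs-bucket"  # hypothetical bucket name

    def handler(event, context):
        """Intermediate 'router' Lambda: copies each record into a
        version-specific S3 prefix so that downstream consumers only
        see the protocol version they understand."""
        for record in event["Records"]:
            # Kinesis delivers the payload base64-encoded
            payload = json.loads(base64.b64decode(record["kinesis"]["data"]))
            version = payload.get("version", "v1")  # hypothetical field name
            key = f"{version}/{uuid.uuid4()}.json"
            s3.put_object(Bucket=BUCKET, Key=key, Body=json.dumps(payload).encode())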

Here is the structure image for the first idea:

Separate flows for each version

And here is the second structure:

A single common flow for all versions

I wonder which is the better solution, or whether there are better ways to accomplish this.

First, Lambdas can be triggered directly by Kinesis - there is no need for Kinesis Firehose or S3.
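For example, assuming the stream and the function already exist, the trigger is just an event source mapping; a boto3 sketch with placeholder names:

    import boto3

    lambda_client = boto3.client("lambda")

    # Wire the existing Kinesis stream to the existing Lambda function.
    # The stream ARN and function name below are placeholders.
    lambda_client.create_event_source_mapping(
        EventSourceArn="arn:aws:kinesis:us-east-1:123456789012:stream/my-log-stream",
        FunctionName="my-log-processor",
        StartingPosition="LATEST",
        BatchSize=100,
    )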

Second, your question boils down to whether you need a separate Kinesis+Lambda pipeline per version or not. I'd go with the following solution:

  • One Kinesis stream for all versions of the data.
  • One Lambda function on the stream. Internally, it handles the different versions separately. Crudely speaking, think of a handful of if-else checks on the version number (a sketch follows below).
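A minimal sketch of that single function, assuming each record is a JSON payload with a version field; the field names and the per-version logic are made up purely for illustration:

    import base64
    import json

    def handle_v1(payload):
        # hypothetical: original protocol
        return {"user": payload["user_id"], "msg": payload["message"]}

    def handle_v2(payload):
        # hypothetical: a breaking change renamed the fields
        return {"user": payload["uid"], "msg": payload["body"]}

    HANDLERS = {"1": handle_v1, "2": handle_v2}

    def handler(event, context):
        """Single Lambda on the shared Kinesis stream; dispatches on
        the 'version' field each producer includes in its payload."""
        results = []
        for record in event["Records"]:
            payload = json.loads(base64.b64decode(record["kinesis"]["data"]))
            version = str(payload.get("version", "1"))
            process = HANDLERS.get(version)
            if process is None:
                # Unknown version: log and skip (or route to a dead-letter queue)
                print(f"Skipping record with unsupported version {version}")
                continue
            results.append(process(payload))
        # save_to_db(results)  # hypothetical persistence step
        return {"processed": len(results)}

Retiring a version then just means deleting its branch (or dict entry) once no producer emits it any more.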

The advantages of the above approach vs. one Kinesis+Lambda pipeline per version:

  • The former is operationally simpler. With the latter, you'll need to set up a new pipeline every time a new version is introduced.
  • At any point in time you'd have only a small number of active versions, so a few if-else checks in the code should work fine.

Of course, keep the dev and prod pipelines separate to minimize the blast radius of bad code in the former.

