Spike on a new API for Kafka Streams support #34
base: 5.4.x
Conversation
Test failures seem unrelated and resolved by another PR
The changes seem OK to me; the only thing is that they are breaking changes. I wonder if there is a way to maintain some form of backwards compatibility. Either way, we need to alter the version in https://github.com/micronaut-projects/micronaut-kafka/blob/master/gradle.properties#L1 to
@graemerocher If we're backwards-breaking, wouldn't we bump to
Force-pushed from 413714f to 3148388
I've rebased with my correct email address in the commit author, to please the CLA bot.
@bobby Would there be any doc updates for your changes?
@ctoestreich Yes, there will need to be doc changes given the new API. This new API will also enable Kafka Streams testing, but I may add that plus testing docs in its own PR.
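For context on the testing point: Kafka Streams itself ships a `TopologyTestDriver` in `kafka-streams-test-utils` (2.4+) that exercises a topology without a running broker. A minimal, self-contained sketch, independent of whatever test support this PR's API would ultimately enable, and with made-up topic names:

```java
import java.util.Properties;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.TestInputTopic;
import org.apache.kafka.streams.TestOutputTopic;
import org.apache.kafka.streams.Topology;
import org.apache.kafka.streams.TopologyTestDriver;
import org.apache.kafka.streams.kstream.Consumed;
import org.apache.kafka.streams.kstream.Produced;

public class UppercaseTopologyTest {

    public static void main(String[] args) {
        // Trivial topology: read strings from "in", upper-case them, write to "out".
        StreamsBuilder builder = new StreamsBuilder();
        builder.stream("in", Consumed.with(Serdes.String(), Serdes.String()))
               .mapValues(v -> v.toUpperCase())
               .to("out", Produced.with(Serdes.String(), Serdes.String()));
        Topology topology = builder.build();

        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "test-app");
        // The test driver never contacts a broker; this value is a placeholder.
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "dummy:1234");

        try (TopologyTestDriver driver = new TopologyTestDriver(topology, props)) {
            TestInputTopic<String, String> in =
                driver.createInputTopic("in", Serdes.String().serializer(), Serdes.String().serializer());
            TestOutputTopic<String, String> out =
                driver.createOutputTopic("out", Serdes.String().deserializer(), Serdes.String().deserializer());

            in.pipeInput("k", "hello");
            System.out.println(out.readValue()); // prints HELLO
        }
    }
}
```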
@bobby could you rebase the changes and provide documentation? We do use semantic versioning, so I will need to split out a 1.1.x branch before merging this commit. Thanks.
Force-pushed from 3148388 to 5bed275
@bobby any more thoughts on documentation? I am thinking that maybe we should just bump the kafka streams module to 2.0 or even split it into a separate repo. That way the core library can continue to evolve. Thoughts?
Yes, I think at least the kafka-streams module will need to bump to 2.0.0 given the changes, but it can probably stay in the same repo?
I’m working on the docs now.
Cheers,
Bobby
While testing an application with several instances of
Force-pushed from 038628d to 0088ffc
Force-pushed from 0eba4fd to 703ca1b
Force-pushed from 950a04f to 5f65d00
@bobby I also ran into this a bit recently. I am available to help with your branch if you want to resurrect this PR. During my last PR, which added health checks, I spent some time trying to dissect the KafkaStreamsFactory, since I needed to relate a kstream to the configuration used to build it for metadata on the health endpoint.
@ctoestreich let me know if you plan to contribute this, since we plan to do a release soon. I have updated the module to Kafka 2.5.0 in the meantime.
Spending some time today revisiting @bobby's code. @bobby @tetriscode what were the remaining open issues?
Hi @bobby @graemerocher @ctoestreich and others, is there any chance to fix this PR and merge it? Or to make serious changes to Micronaut's Kafka Streams configuration/DSL?
We would love contributions in this area to advance this PR and make Micronaut with Kafka Streams a better experience.
@graemerocher I can't promise anything now. I have a bigger POC project (btw. a major part of it might be written in Micronaut) that I need to finish by autumn, but who knows after that. I'm also involved in a bigger system written from scratch, using mostly Kafka Streams.
My team ran into some issues when using `KafkaStreams` with Micronaut on a current project:

- `KStream` from factory methods seemed odd and arbitrary (what if my topology only uses `KTable` or `GlobalKTable`?), and the `KStream` bean itself is never used by the Factory.
- `ConfiguredStreamBuilder` and `KStream` weren't named or specified in the `KafkaStreams` factory method.

So we reworked the API a bit, and this new implementation has solved these issues. Now the API looks like:
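(The code block that originally followed here did not survive the page extraction. As a rough sketch of the kind of factory shape being described, not the actual API proposed in this PR, and with hypothetical class names, bean names, and topics, it might look something like:)

```java
import io.micronaut.context.annotation.Factory;
import javax.inject.Named;
import javax.inject.Singleton;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.Topology;
import org.apache.kafka.streams.kstream.Consumed;
import org.apache.kafka.streams.kstream.Grouped;
import org.apache.kafka.streams.kstream.Produced;

// Hypothetical sketch: the topology is built and returned as a single named bean,
// so no dangling KStream bean is registered and the stream has an explicit name
// that configuration could be keyed on.
@Factory
public class WordCountTopologyFactory {

    @Singleton
    @Named("word-count") // hypothetical stream name
    Topology wordCountTopology() {
        StreamsBuilder builder = new StreamsBuilder();
        builder.stream("words-input", Consumed.with(Serdes.String(), Serdes.String()))
               .groupBy((key, word) -> word, Grouped.with(Serdes.String(), Serdes.String()))
               .count()
               .toStream()
               .to("words-count-output", Produced.with(Serdes.String(), Serdes.Long()));
        return builder.build();
    }
}
```

Whether the kafka-streams module would consume a plain `Topology` bean like this is an assumption made purely for illustration; the actual proposal may instead keep `ConfiguredStreamBuilder` and add named qualifiers to it.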
If this API looks good to the maintainers, I'll be happy to update the documentation, etc.