
Feed Management Enhancements : READ and UPDATE lifecycleEvent #220

Open
csantanapr opened this issue Oct 30, 2017 · 6 comments
csantanapr commented Oct 30, 2017

Feed Management Enhancements for Kafka feed

This issue is an Epic collecting the related tasks needed to implement the final solution for the Kafka feed.

For some background on feeds and lifecycles refer to https://github.com/apache/incubator-openwhisk/blob/master/docs/feeds.md

These features are based on user feedback.

Feature Requests

  • Allow the user to retrieve trigger information from the feed provider
    • This allows the user to get back the original config
    • This allows the user to get the current status of the trigger
  • Allow the user to update the trigger configuration on the feed provider without the need to delete and re-create the trigger with a new configuration

Lifecycle Events

Today this feed implements the lifecycle events CREATE and DELETE.
To implement the features requested above, this feed would need to implement the lifecycle events READ and UPDATE.
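As a rough illustration (not the actual package code), the feed action could dispatch on the lifecycleEvent parameter once the two new events are added alongside the existing ones. Handler names here are illustrative stubs:

```javascript
// Hypothetical sketch: dispatch on the lifecycleEvent parameter.
// The handler functions are illustrative stubs, not the real package code.
function createTrigger(params) { return { handled: 'CREATE' }; }
function deleteTrigger(params) { return { handled: 'DELETE' }; }
function readTrigger(params)   { return { handled: 'READ' }; }
function updateTrigger(params) { return { handled: 'UPDATE' }; }

function handleFeedAction(params) {
    switch (params.lifecycleEvent) {
        case 'CREATE': return createTrigger(params);
        case 'DELETE': return deleteTrigger(params);
        case 'READ':   return readTrigger(params);   // new
        case 'UPDATE': return updateTrigger(params); // new
        default:
            return { error: 'unsupported lifecycleEvent: ' + params.lifecycleEvent };
    }
}
```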

READ lifecycle

The feed action should accept the parameter lifecycleEvent with the value of READ.

It should return the configuration and status for the trigger, for example:

{
  "config":{
    "triggerName": "_/myKafkaTrigger",
    "isMessageHub": true,
    "topic": "mytopic",
    "isJSONData": true,
    "isBinaryValue": false,
    "isBinaryKey": false,
    "kafka_brokers_sasl": [
      "kafka01-prod01.messagehub.services.us-south.bluemix.net:9093",
      "kafka02-prod01.messagehub.services.us-south.bluemix.net:9093",
      "kafka03-prod01.messagehub.services.us-south.bluemix.net:9093",
      "kafka04-prod01.messagehub.services.us-south.bluemix.net:9093",
      "kafka05-prod01.messagehub.services.us-south.bluemix.net:9093"
    ],
    "user": "admin",
    "password": "123",
    "kafka_admin_url": "https://kafka-admin-prod01.messagehub.services.us-south.bluemix.net:443"
  },
  "status":{
    "active": false,
    "dateChanged":1509321736699,
    "dateChangedISO":"2017-10-30T00:02:16.699Z",
    "reason":{
      "kind": "AUTO",
      "statusCode": 403,
      "message": "Automatically disabled after receiving a 403 status code when firing the trigger."
    }
  }
}

Notes:
The field status.reason is not present when status.active is true.
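A minimal sketch of how the READ handler could build this response from a stored trigger document. The triggerDoc field names (brokers, username) follow the DB schema discussed later in this thread; everything else here is an assumption for illustration:

```javascript
// Hypothetical sketch: build the user-facing READ response from the stored
// trigger document. DB field names (brokers, username) are assumptions
// based on the discussion below in this thread.
function buildReadResponse(triggerDoc) {
    const response = {
        config: {
            triggerName: triggerDoc.triggerName,
            isMessageHub: triggerDoc.isMessageHub,
            topic: triggerDoc.topic,
            isJSONData: triggerDoc.isJSONData,
            isBinaryValue: triggerDoc.isBinaryValue,
            isBinaryKey: triggerDoc.isBinaryKey,
            // massage DB names back to the names the user supplied on create
            kafka_brokers_sasl: triggerDoc.brokers,
            user: triggerDoc.username,
            password: triggerDoc.password,
            kafka_admin_url: triggerDoc.kafka_admin_url
        },
        status: {
            active: triggerDoc.active
        }
    };
    // status.reason (and the dateChanged fields) only appear when inactive
    if (!triggerDoc.active) {
        response.status.dateChanged = triggerDoc.dateChanged;
        response.status.dateChangedISO = new Date(triggerDoc.dateChanged).toISOString();
        response.status.reason = triggerDoc.reason;
    }
    return response;
}
```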

Clients would be able to invoke the feed action with this lifecycleEvent.
For example, using the CLI to invoke the feed action directly:

wsk action invoke /whisk.system/messaging/messageHubFeed -p triggerName _/myKafkaTrigger -p authKey guest:123 -p lifecycleEvent READ -r

The go-CLI would be able to call the feed action automatically on a trigger get.
For example:

wsk trigger get myKafkaTrigger
ok: got trigger myKafkaTrigger
{
    "namespace": "guest",
    "name": "myKafkaTrigger",
    "version": "0.0.1",
    "annotations": [
        {
            "key": "feed",
            "value": "/whisk.system/messaging/messageHubFeed"
        }
    ]
}
ok: invoked feed action /whisk.system/messaging/messageHubFeed
{
  "config": {
    "triggerName": "_/myKafkaTrigger",
    "isMessageHub": true,
    "topic": "mytopic",
    "isJSONData": true,
    "isBinaryValue": false,
    "isBinaryKey": false,
    "kafka_brokers_sasl": [
      "kafka01-prod01.messagehub.services.us-south.bluemix.net:9093",
      "kafka02-prod01.messagehub.services.us-south.bluemix.net:9093",
      "kafka03-prod01.messagehub.services.us-south.bluemix.net:9093",
      "kafka04-prod01.messagehub.services.us-south.bluemix.net:9093",
      "kafka05-prod01.messagehub.services.us-south.bluemix.net:9093"
    ],
    "user": "admin",
    "password": "123",
    "kafka_admin_url": "https://kafka-admin-prod01.messagehub.services.us-south.bluemix.net:443"
  },
  "status": {
    "active": true
  }
}

UPDATE lifecycle

The UPDATE lifecycle will allow users to update their feed without the need to delete and re-create their trigger.
The feed action should accept the parameter lifecycleEvent with the value of UPDATE.

The user should be able to pass a partial set of the configuration values, allowing them to update their trigger feed.

The feed will be paused while it is being updated.
For example, if the user wants to update the topic name, they will invoke the feed action with the topic parameter for the existing trigger.

For example, invoking the feed action directly:

wsk action invoke /whisk.system/messaging/messageHubFeed -p topic newtopic -p triggerName _/myKafkaTrigger -p authKey guest:123 -p lifecycleEvent UPDATE -r

Or using the integration embedded in the CLI:

wsk trigger update myKafkaTrigger -p topic newtopic
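The partial-update behavior above can be sketched as a simple merge: only the parameters the user passes override the stored configuration, and everything else is preserved. The whitelist of updatable fields below is an assumption for illustration:

```javascript
// Hypothetical sketch: merge a partial set of user-supplied parameters
// into the stored trigger configuration. The UPDATABLE whitelist is an
// assumption, not the package's actual field list.
const UPDATABLE = ['topic', 'isJSONData', 'isBinaryValue', 'isBinaryKey'];

function mergeTriggerUpdate(storedConfig, params) {
    const updated = Object.assign({}, storedConfig);
    for (const field of UPDATABLE) {
        if (params[field] !== undefined) {
            updated[field] = params[field]; // override only what the user passed
        }
    }
    return updated;
}
```

With this shape, `wsk trigger update myKafkaTrigger -p topic newtopic` changes only the topic while the brokers, credentials, and other settings stay untouched.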
@jberstler

One wrinkle here is that, for Message Hub triggers, the properties recorded in the DB can have different names than the properties specified when creating the trigger. For example, I specify my brokers using the kafka_brokers_sasl property, but this gets recorded as brokers in the DB.

I would argue that when retrieving the trigger details, the user should always see the property names the same as what they set when creating the trigger. This may mean massaging the property name back to what the user expects to see (e.g. brokers -> kafka_brokers_sasl)
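The massaging described here could be a small translation table applied on the way out of the DB. The brokers -> kafka_brokers_sasl pair is from this comment; the username -> user pair comes up later in the thread; unknown keys pass through unchanged:

```javascript
// Hypothetical sketch: translate DB field names back to the names the
// user supplied on create. Mapping entries are taken from this thread.
const DB_TO_USER = {
    brokers: 'kafka_brokers_sasl',
    username: 'user'
};

function massagePropertyNames(dbDoc) {
    const out = {};
    for (const key of Object.keys(dbDoc)) {
        out[DB_TO_USER[key] || key] = dbDoc[key]; // pass unknown keys through
    }
    return out;
}
```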

@csantanapr

Agree we should not expose the implementation details and keep the abstraction of the interface to the user the same so massaging the keys is a good thing.

@csantanapr

@abaruni Could you change READ config object to match the input parameters by user based on feedback from @jberstler above ^^ #220 (comment)

I would argue that when retrieving the trigger details, the user should always see the property names the same as what they set when creating the trigger. This may mean massaging the property name back to what the user expects to see (e.g. brokers -> kafka_brokers_sasl)

cc @jasonpet

@csantanapr

I did a quick look; it looks like there are 2 fields to change:
username -> user
brokers -> kafka_brokers_sasl

{
  "kafka_brokers_sasl": [
    "kafka01-prod01.messagehub.services.us-south.bluemix.net:9093",
    "kafka02-prod01.messagehub.services.us-south.bluemix.net:9093",
    "kafka03-prod01.messagehub.services.us-south.bluemix.net:9093",
    "kafka04-prod01.messagehub.services.us-south.bluemix.net:9093",
    "kafka05-prod01.messagehub.services.us-south.bluemix.net:9093"
  ],
  "user": "admin"
}

@abaruni I will update description above, can you double check?

@csantanapr

It's definitely user here in the MessageHubFeedWeb action https://github.com/apache/incubator-openwhisk-package-kafka/blob/master/action/messageHubFeedWeb.js#L145

But it is stored as username in the DB, so the get needs to change it back to user on the way out
https://github.com/apache/incubator-openwhisk-package-kafka/blob/master/action/messageHubFeedWeb.js#L84-L86
Currently it is:

                            kafka_brokers_sasl: triggerDoc.brokers,
                            user: triggerDoc.username,

it should be

                            brokers: triggerDoc.brokers,
                            username: triggerDoc.username,
