
Unable to create trigger from openwhisk kafka feed that listens to a Generic Kafka instance in a Local ubuntu installation #221

Open
prabhastechie opened this issue Nov 9, 2017 · 12 comments

Comments

@prabhastechie

I have openwhisk local installation on Ubuntu 16.04 desktop. Actions, triggers, rules and alarm triggers are working.

I cloned the git repository https://github.com/apache/incubator-openwhisk-package-kafka and ran the following in sequence: installCatalog.sh, gradlew :distDocker, installKafka.sh.

Then I tried to create a trigger:

bin/wsk trigger create MyKafkaTrigger -f /messaging/kafkaFeed -p brokers "[\"localhost:9092\", \"localhost:9093\"]" -p topic test -p isJSONData true --insecure

I am following this section of the README: "Creating a Trigger that listens to a Generic Kafka instance"

I am re-using the Kafka instance created as part of Openwhisk installation, and created a topic named 'test' - I am able to publish / consume to this topic using kafka command line tools.

The trigger creation fails (it deletes the trigger, saying resource does not exist).

One thing I observed is that the following packages were created under /guest:

bin/wsk package list /guest --insecure
packages
/guest/messagingWeb                                                    private
/guest/messaging

I also tried changing the feed name to /guest/messaging/kafkaFeed, just kafkaFeed, etc., but the results are only slightly different:

bin/wsk trigger create MyKafkaTrigger -f /guest/messaging/kafkaFeed -p brokers "["localhost:9092", "localhost:9093"]" -p topic test -p isJSONData true --insecure

gives a JSON output saying "error": "The requested resource does not exist."

bin/wsk trigger create MyKafkaTrigger -f /messaging/kafkaFeed -p brokers "["localhost:9092", "localhost:9093"]" -p topic test -p isJSONData true --insecure

gives:

ok: deleted trigger MyKafkaTrigger

error: Unable to create trigger 'MyKafkaTrigger': Unable to invoke trigger 'MyKafkaTrigger' feed action '/messaging/kafkaFeed'; feed is not configured: Unable to invoke action 'kafkaFeed': The supplied authentication is not authorized to access this resource. (code 186)

Note that I am just planning to use a generic Kafka instance without SASL: no username / password / Kafka admin (REST) URL.
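For reference, here is a minimal Python sketch of how that generic-Kafka parameter set can be assembled into a wsk invocation. The broker address is an assumption; the key point is that brokers must reach wsk as a JSON array inside a single shell argument, not as a bare "host:port" string.

```python
import json
import shlex

# Minimal parameter set for the generic kafkaFeed: no SASL user/password and
# no Kafka admin (REST) URL. The broker address below is a placeholder;
# substitute your Kafka host's routable IP.
brokers = ["172.17.0.6:9092"]

params = {
    "brokers": json.dumps(brokers),  # a JSON array, not a bare string
    "topic": "test",
    "isJSONData": "true",
}

cmd = ["wsk", "-i", "trigger", "create", "MyKafkaTrigger",
       "-f", "/guest/messaging/kafkaFeed"]
for key, value in params.items():
    cmd += ["-p", key, value]

# Quote each token so the JSON array survives shell tokenization.
cmd_str = " ".join(shlex.quote(part) for part in cmd)
print(cmd_str)
```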

Later, I was advised to follow the document:
https://github.com/apache/incubator-openwhisk-package-kafka/blob/master/devGuide.md#install-actions
I followed the docker build and create steps there too, but the trigger still cannot be created.

I do get expected results with the following commands:

bin/wsk -i package list /guest

packages
/guest/messagingWeb                                                    private
/guest/messaging

and

bin/wsk -i package get --summary /guest/messaging

package /guest/messaging: Returns a result based on parameter endpoint
   (parameters: *endpoint)
 action /guest/messaging/kafkaProduce: Produce a message to a Kafka cluster
   (parameters: base64DecodeKey, base64DecodeValue, brokers, key, topic, value)
 feed   /guest/messaging/kafkaFeed: Feed to listen to Kafka messages
   (parameters: brokers, endpoint, isBinaryKey, isBinaryValue, isJSONData, topic)

Also, I tried:

bin/wsk -i property get --namespace 
whisk namespace

Then I tried unsetting the namespace:

bin/wsk -i property unset --namespace

and ran trigger create again, but the same error remains.

On a related note, I am not sure how the docker creation in the devGuide will work, since the port mapping 80:5000 will conflict with port 80 used by nginx. Note that this is a local installation - all in the same machine. If we map it to a different port, how will the trigger know which (non-standard) port to connect to?

But creating the trigger should work even if the docker is not running, so the more fundamental question is how to create the trigger.

@prabhastechie
Author

It may be worth mentioning that I had the same problem with the alarms trigger when I tried installCatalog.sh, but was able to run those triggers smoothly by using the unofficial ansible script for the alarms package mentioned here:
apache/openwhisk-package-alarms#51
The ansible script was contributed by Jason Peterson.

@jberstler

One thing to note here is that the trigger service runs inside a Docker container and, as such, if your broker list contains "localhost", that will resolve to the Docker container of the trigger service, not the host machine where you presumably have Kafka running. Instead, you need to use your real IP address (and no, not 127.0.0.1, as that too will resolve to the container). For further clarity, the trigger service does not include a Kafka server.
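To make the distinction concrete, here is a small Python illustration (not part of the trigger service) of one common way to discover a host address that is routable from other machines or containers. The 8.8.8.8 address is arbitrary; connecting a UDP socket only selects a route, so no traffic is actually sent.

```python
import socket

def routable_host_ip() -> str:
    """Return an IP of this machine that network peers can reach.

    Inside the trigger-service container, "localhost" and 127.0.0.1 resolve
    to the container itself, so neither belongs in the brokers list.
    """
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        s.connect(("8.8.8.8", 80))  # UDP connect: picks a route, sends nothing
        return s.getsockname()[0]
    finally:
        s.close()
```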

@jberstler

@prabhastechie Please see #210 and especially all the comments, which should help you through the most common problems encountered when trying to set this up locally.

@prabhastechie
Author

Ok, I'll check it out. But I was reusing the Kafka container created by the OpenWhisk Ubuntu local setup. I thought that container exposes ports 9092 / 9093 to the docker host. Using localhost:9092, I was able to send messages to a test topic in that container from the docker host.

@jberstler

@prabhastechie Ah, great then! It still stands that localhost will not resolve to the host IP address when run from within the trigger service docker container. You'll need to use your host machine's actual IP address in the broker list for this to work.

@prabhastechie
Author

prabhastechie commented Nov 12, 2017

I found out the ip address of the docker running kafka on port 9092 and replaced the localhost:9092 with that ip address in the trigger create command. But the same error remains.

My trigger create command was:
bin/wsk -i trigger create MyKafkaTrigger -f /guest/messaging/kafkaFeed -p brokers "172.17.0.6:9092" -p topic test -p isJSONData true --insecure

I looked through the issue #210 and got the idea to scan the wsk logs. In the logs I am getting:

bin/wsk -i activation logs 0250dd4292c3427b90dd4292c3627b72
2017-11-11T18:07:30.465220155Z stdout: Error creating trigger: {
2017-11-11T18:07:30.465321917Z stdout: "name": "StatusCodeError",
2017-11-11T18:07:30.465330104Z stdout: "statusCode": 404,
2017-11-11T18:07:30.46533561Z  stdout: "message": "404 - {\"error\":\"The requested resource does not exist.\",\"code\":378}",
2017-11-11T18:07:30.465345409Z stdout: "error": {
2017-11-11T18:07:30.465350631Z stdout: "error": "The requested resource does not exist.",
2017-11-11T18:07:30.465355994Z stdout: "code": 378
2017-11-11T18:07:30.465360882Z stdout: },
2017-11-11T18:07:30.465365572Z stdout: "options": {
2017-11-11T18:07:30.465371622Z stdout: "method": "PUT",
2017-11-11T18:07:30.465378249Z stdout: "url": "https://172.17.0.1/api/v1/web/whisk.system/messagingWeb/kafkaFeedWeb.http",
2017-11-11T18:07:30.465383764Z stdout: "rejectUnauthorized": false,
2017-11-11T18:07:30.465388753Z stdout: "json": true,
2017-11-11T18:07:30.465393569Z stdout: "body": {
2017-11-11T18:07:30.465422611Z stdout: "authKey": "23bc46b1-71f6-4ed5-8c54-816aa4f8c502:123zO3xZCLrMN6v2BKK1dXYFpXlPkccOFqm12CdAsMgRU4VrNZ9lyGVCGuMDGIwP",
2017-11-11T18:07:30.465428612Z stdout: "isJSONData": true,
2017-11-11T18:07:30.465433576Z stdout: "brokers": "172.17.0.6:9092",
2017-11-11T18:07:30.465438729Z stdout: "topic": "test",
2017-11-11T18:07:30.465443762Z stdout: "triggerName": "/guest/MyKafkaTrigger"
2017-11-11T18:07:30.465448991Z stdout: },
2017-11-11T18:07:30.465453641Z stdout: "headers": {
2017-11-11T18:07:30.465458582Z stdout: "Content-Type": "application/json",
2017-11-11T18:07:30.465463711Z stdout: "Accept": "text/plain",
2017-11-11T18:07:30.465468781Z stdout: "User-Agent": "whisk"
2017-11-11T18:07:30.465473843Z stdout: },
2017-11-11T18:07:30.465478532Z stdout: "simple": true,
2017-11-11T18:07:30.465483446Z stdout: "resolveWithFullResponse": false,
2017-11-11T18:07:30.465488452Z stdout: "transform2xxOnly": false
2017-11-11T18:07:30.465493383Z stdout: },
2017-11-11T18:07:30.465498076Z stdout: "response": {
2017-11-11T18:07:30.465502992Z stdout: "statusCode": 404,
2017-11-11T18:07:30.465507869Z stdout: "body": {
2017-11-11T18:07:30.465512767Z stdout: "error": "The requested resource does not exist.",
2017-11-11T18:07:30.465518095Z stdout: "code": 378
2017-11-11T18:07:30.46552294Z  stdout: },
2017-11-11T18:07:30.465527654Z stdout: "headers": {
2017-11-11T18:07:30.465532536Z stdout: "server": "nginx/1.11.13",
2017-11-11T18:07:30.465537655Z stdout: "date": "Sat, 11 Nov 2017 18:07:30 GMT",
2017-11-11T18:07:30.465542848Z stdout: "content-type": "application/json",
2017-11-11T18:07:30.465548026Z stdout: "content-length": "70",
2017-11-11T18:07:30.465560008Z stdout: "connection": "close"
2017-11-11T18:07:30.465566005Z stdout: },
2017-11-11T18:07:30.465570782Z stdout: "request": {
2017-11-11T18:07:30.465575761Z stdout: "uri": {
2017-11-11T18:07:30.465580669Z stdout: "protocol": "https:",
2017-11-11T18:07:30.465585902Z stdout: "slashes": true,
2017-11-11T18:07:30.465590931Z stdout: "auth": null,
2017-11-11T18:07:30.465595848Z stdout: "host": "172.17.0.1",
2017-11-11T18:07:30.465601004Z stdout: "port": 443,
2017-11-11T18:07:30.465605965Z stdout: "hostname": "172.17.0.1",
2017-11-11T18:07:30.465611131Z stdout: "hash": null,
2017-11-11T18:07:30.46561604Z  stdout: "search": null,
2017-11-11T18:07:30.465621016Z stdout: "query": null,
2017-11-11T18:07:30.465625996Z stdout: "pathname": "/api/v1/web/whisk.system/messagingWeb/kafkaFeedWeb.http",
2017-11-11T18:07:30.465631461Z stdout: "path": "/api/v1/web/whisk.system/messagingWeb/kafkaFeedWeb.http",
2017-11-11T18:07:30.465636884Z stdout: "href": "https://172.17.0.1/api/v1/web/whisk.system/messagingWeb/kafkaFeedWeb.http"
2017-11-11T18:07:30.465642324Z stdout: },
2017-11-11T18:07:30.465647068Z stdout: "method": "PUT",
2017-11-11T18:07:30.465652784Z stdout: "headers": {
2017-11-11T18:07:30.465657956Z stdout: "Content-Type": "application/json",
2017-11-11T18:07:30.465663304Z stdout: "Accept": "text/plain",
2017-11-11T18:07:30.465668504Z stdout: "User-Agent": "whisk",
2017-11-11T18:07:30.465673635Z stdout: "content-length": 214
2017-11-11T18:07:30.46567863Z  stdout: }
2017-11-11T18:07:30.465683354Z stdout: }
2017-11-11T18:07:30.465688084Z stdout: }
2017-11-11T18:07:30.465692796Z stdout: }

So it seems trigger create is expecting the resource https://172.17.0.1/api/v1/web/whisk.system/messagingWeb/kafkaFeedWeb.http, but I have the following resources:

bin/wsk -i package get --summary /guest/messaging --insecure
package /guest/messaging: Returns a result based on parameters endpoint, isBinaryKey, isBinaryValue, isJSONData, kafka_admin_url, kafka_brokers_sasl, password, topic and user
   (parameters: *endpoint, isBinaryKey, isBinaryValue, isJSONData, kafka_admin_url, kafka_brokers_sasl, password, topic, user)
 action /guest/messaging/kafkaProduce: Produce a message to a Kafka cluster
   (parameters: base64DecodeKey, base64DecodeValue, brokers, key, topic, value)
 action /guest/messaging/messageHubProduce: Produce a message to Message Hub
   (parameters: base64DecodeKey, base64DecodeValue, kafka_brokers_sasl, key, password, topic, user, value)
 feed   /guest/messaging/kafkaFeed: Feed to listen to Kafka messages
   (parameters: brokers, endpoint, isBinaryKey, isBinaryValue, isJSONData, topic)
 feed   /guest/messaging/messageHubFeed: Feed to list to Message Hub messages
   (parameters: endpoint, isBinaryKey, isBinaryValue, isJSONData, kafka_admin_url, kafka_brokers_sasl, password, topic, user)

So there are two issues:
i. My trigger create refers to kafkaFeed, but internally it searches for 'kafkaFeedWeb'.
ii. Even for 'kafkaFeed', my deployment has deployed it as /guest/messaging/kafkaFeed. How can I deploy it under /whisk.system/messaging?

In case it's relevant, I was able to deploy the alarms feed under whisk.system by following the ansible scripts mentioned in apache/openwhisk-package-alarms#51 (comment).
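As an aside, the activation log above interleaves timestamps and "stdout:" prefixes with the JSON error object. A small Python helper (illustrative only; it assumes the lines form one JSON document once the prefixes are stripped) can recover the JSON for closer inspection:

```python
import json

def extract_json(log_lines):
    """Strip '<timestamp> stdout: ' prefixes and parse the remaining JSON."""
    payload = []
    for line in log_lines:
        # Keep everything after the "stdout:" marker on each line.
        _, _, rest = line.partition("stdout:")
        payload.append(rest.strip())
    return json.loads("\n".join(payload))

# A tiny sample in the same shape as the wsk activation log output.
sample = [
    "2017-11-11T18:07:30.465Z stdout: {",
    '2017-11-11T18:07:30.466Z stdout: "statusCode": 404',
    "2017-11-11T18:07:30.467Z stdout: }",
]
result = extract_json(sample)
```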

@caesterlein

@prabhastechie Did you get this working? I am seeing the exact same behavior.

One thing I did notice is that your activation logs show the brokers field in the post body as a string instead of a JSON array. Although, I don't think that would cause this error.
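For what it's worth, here is a hedged Python sketch (not the provider's actual code) of how a brokers value could be normalized whether it arrives as a JSON array, a JSON-encoded string, or a bare host list:

```python
import json

def normalize_brokers(value):
    """Return the brokers parameter as a list of "host:port" strings."""
    if isinstance(value, list):
        return value
    try:
        parsed = json.loads(value)
        if isinstance(parsed, list):
            return parsed
    except (TypeError, ValueError):
        pass
    # Fall back to treating the value as a comma-separated host list.
    return [b.strip() for b in str(value).split(",")]

print(normalize_brokers('["172.17.0.6:9092"]'))  # -> ['172.17.0.6:9092']
print(normalize_brokers("172.17.0.6:9092"))      # -> ['172.17.0.6:9092']
```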

@SchuhMichael

I had similar problems and finally found that the kafkafeedprovider dropped incoming requests because I forgot to set LOCAL_DEV=true. When I run it as in the following example, everything works as expected.

APIHOST=$LOCAL_ADDRESS
EDGEHOST=$LOCAL_ADDRESS
LOCAL_DEV=true
DB_PREFIX=whisk_local_
DB_USER=whisk_admin
DB_PASS=some_passw0rd
DB_URL=http://$LOCAL_ADDRESS:5984
DB_URL_FULL=http://$DB_USER:$DB_PASS@$LOCAL_ADDRESS:5984
docker run -id -e DB_PREFIX=$DB_PREFIX -e DB_URL=$DB_URL -e DB_USER=$DB_USER -e DB_PASS=$DB_PASS -e LOCAL_DEV=$LOCAL_DEV -p 81:5000 kafkafeedprovider

This is from the working example https://github.com/SchuhMichael/dCache-FaaS-Tutorial

@caesterlein

Thanks @SchuhMichael, but that does not seem to be my issue. I have LOCAL_DEV set to true.

I'm using the incubator-openwhisk-deploy-kube helm charts to install everything from OpenWhisk to the kafka packages.

@axelrose

Hello all,

I might have a similar issue but not sure. This is what I'm doing:

$ ./installKafka.sh $AUTHSYS $EDGEHOST $DB_URL_FULL $DB_PREFIX $APIHOST
$ docker run -id -e DB_PREFIX=$DB_PREFIX -e DB_URL=$DB_URL -e DB_USER=$DB_USER -e DB_PASS=$DB_PASS -e LOCAL_DEV=true -p 81:5000 kafkafeedprovider

docker logs of the running container look healthy

$ wsk -i --auth $AUTH trigger create my-trigger -f /whisk.system/messaging/kafkaFeed -p brokers $WSK_IP:9099 -p topic billing -p isJSONData True
{
    "activationId": "443a9c11125d47f8ba9c11125d87f8ec",
    "annotations": [
        {
            "key": "path",
            "value": "whisk.system/messaging/kafkaFeed"
        },
        {
            "key": "waitTime",
            "value": 76
        },
        {
            "key": "kind",
            "value": "nodejs:6"
        },
        {
            "key": "timeout",
            "value": false
        },
        {
            "key": "limits",
            "value": {
                "concurrency": 1,
                "logs": 10,
                "memory": 256,
                "timeout": 60000
            }
        },
        {
            "key": "initTime",
            "value": 401
        }
    ],
    "duration": 1368,
    "end": 1555324127661,
    "logs": [],
    "name": "kafkaFeed",
    "namespace": "guest",
    "publish": false,
    "response": {
        "result": {
            "error": "TypeError: Cannot read property 'split' of null"
        },
        "status": "application error",
        "success": false
    },
    "start": 1555324126293,
    "subject": "guest",
    "version": "0.0.1"
}
{
    "activationId": "2c42d29ec43746aa82d29ec43706aa4a",
    "annotations": [
        {
            "key": "path",
            "value": "whisk.system/messaging/kafkaFeed"
        },
        {
            "key": "waitTime",
            "value": 157
        },
        {
            "key": "kind",
            "value": "nodejs:6"
        },
        {
            "key": "timeout",
            "value": false
        },
        {
            "key": "limits",
            "value": {
                "concurrency": 1,
                "logs": 10,
                "memory": 256,
                "timeout": 60000
            }
        }
    ],
    "duration": 277,
    "end": 1555324128135,
    "logs": [],
    "name": "kafkaFeed",
    "namespace": "guest",
    "publish": false,
    "response": {
        "result": {
            "error": "TypeError: Cannot read property 'split' of null"
        },
        "status": "application error",
        "success": false
    },
    "start": 1555324127858,
    "subject": "guest",
    "version": "0.0.1"
}
ok: deleted trigger my-trigger

"error": "TypeError: Cannot read property 'split' of null"

sounds like some JavaScript error. What could I do?

Thanks for your time,
Axel.

@dubee
Member

dubee commented Apr 15, 2019

@axelrose, try using an array for your brokers value. For example, -p brokers "[\"broker1:9099\"]" or -p brokers "[broker1:9099]" or -p brokers '["broker1:9099"]'. Which quoting pattern to use sometimes depends on your OS.
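As an illustration of why the escaped and single-quoted variants end up equivalent on a POSIX shell (Windows shells tokenize differently), here is a quick check with Python's shlex, which follows POSIX quoting rules:

```python
import shlex

# Two of the quoting variants above; on a POSIX shell both deliver the same
# single argument to wsk.
escaped = 'wsk trigger create t -p brokers "[\\"broker1:9099\\"]"'
single_quoted = "wsk trigger create t -p brokers '[\"broker1:9099\"]'"

arg_a = shlex.split(escaped)[-1]        # the argument wsk actually receives
arg_b = shlex.split(single_quoted)[-1]
print(arg_a)
```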

Here you can see all the split calls in the project: https://github.com/apache/incubator-openwhisk-package-kafka/search?utf8=%E2%9C%93&q=split&type=.

@axelrose

Thanks @dubee for your suggestion!

It turned out to be a misleading error message since the same command line now works, also with just a single broker. My env vars weren't correctly set. My fault.
