
When will dinky's flink sql support flink sql cep? #3845

Closed
2 of 3 tasks
jxchanghe opened this issue Sep 27, 2024 · 4 comments
Labels
FAQ Frequently Asked Questions

Comments

@jxchanghe

Search before asking

  • I had searched in the issues and found no similar feature requirement.

Description

No response

Use case

No response

Related issues

No response

Are you willing to submit a PR?

  • Yes I am willing to submit a PR!

Code of Conduct

@jxchanghe jxchanghe added the New Feature and Waiting for reply labels Sep 27, 2024
@github-actions github-actions bot changed the title from 请问dinky的flink sql什么时候能够支持flink sql cep (the original Chinese) to When will dinky's flink sql support flink sql cep? Sep 27, 2024
@Zzm0809
Contributor

Zzm0809 commented Sep 27, 2024

You can write it directly now

@Zzm0809 Zzm0809 added the FAQ label and removed the Waiting for reply and New Feature labels Sep 27, 2024
@jxchanghe
Author

You can write it directly now

In Dinky's Flink SQL job development window, functions such as LAST and FIRST fail the syntax check with an error saying Calcite cannot parse them. Running the same SQL directly in sql-client gives the same error. I am using the latest version of Dinky, and Flink 1.18.
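
A minimal sketch of what usually triggers this Calcite error, assuming the LAST/FIRST calls were written outside a MATCH_RECOGNIZE block (the query is hypothetical and reuses the Ticker table defined in the demo below): LAST and FIRST are pattern-navigation functions, so Flink SQL only accepts them inside the MEASURES and DEFINE clauses of MATCH_RECOGNIZE and rejects them in an ordinary SELECT.

-- Hypothetical query: fails validation because LAST is not an ordinary
-- built-in function outside MATCH_RECOGNIZE.
SELECT symbol, LAST(price) AS last_price
FROM Ticker
GROUP BY symbol;

-- The same functions are accepted once they appear inside the MEASURES or
-- DEFINE clauses of MATCH_RECOGNIZE, as in the full demo below.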

@Zzm0809
Contributor

Zzm0809 commented Sep 28, 2024

You can write it directly now

In Dinky's Flink SQL job development window, functions such as LAST and FIRST fail the syntax check with an error saying Calcite cannot parse them. Running the same SQL directly in sql-client gives the same error. I am using the latest version of Dinky, and Flink 1.18.

Then what you wrote is wrong. If even the native sql-client fails the syntax check, your CEP SQL itself has a problem.

The following submits normally (this demo is adapted from the example on the Flink official website; the point here is the logic, not the result). Please compare it against your SQL to find the problem.
Link: Pattern Recognition

DROP TABLE IF EXISTS Ticker;
CREATE TABLE IF NOT EXISTS Ticker (
    symbol STRING,
    price BIGINT,
    tax BIGINT,
    rowtime AS PROCTIME()
) WITH (
    'connector' = 'datagen',
    'rows-per-second'='5',
    'fields.symbol.kind'='random',
    'fields.symbol.length'='5',
    'fields.price.min'='100',
    'fields.price.max'='500',
    'fields.tax.min'='0',
    'fields.tax.max'='50'
);



SELECT *
FROM Ticker
    MATCH_RECOGNIZE (
        PARTITION BY symbol
        ORDER BY rowtime
        MEASURES
            START_ROW.rowtime AS start_tstamp,
            LAST(PRICE_DOWN.rowtime) AS bottom_tstamp,
            LAST(PRICE_UP.rowtime) AS end_tstamp
        ONE ROW PER MATCH
        AFTER MATCH SKIP TO LAST PRICE_UP
        PATTERN (START_ROW PRICE_DOWN+ PRICE_UP)
        DEFINE
            PRICE_DOWN AS
                (LAST(PRICE_DOWN.price, 1) IS NULL AND PRICE_DOWN.price < START_ROW.price) OR
                    PRICE_DOWN.price < LAST(PRICE_DOWN.price, 1),
            PRICE_UP AS
                PRICE_UP.price > LAST(PRICE_DOWN.price, 1)
    ) MR;
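
For reference, a minimal variation of the demo above (hypothetical measure names, same Ticker table) that uses FIRST as well as LAST with an explicit offset inside MEASURES, since both functions were mentioned earlier. The pattern keeps PRICE_UP after the greedy PRICE_DOWN+ because Flink does not allow a greedy quantifier as the last element of a pattern.

SELECT *
FROM Ticker
    MATCH_RECOGNIZE (
        PARTITION BY symbol
        ORDER BY rowtime
        MEASURES
            -- price of the first row mapped to PRICE_DOWN
            FIRST(PRICE_DOWN.price) AS first_down_price,
            -- price of the second-to-last row mapped to PRICE_DOWN (offset 1)
            LAST(PRICE_DOWN.price, 1) AS previous_down_price
        ONE ROW PER MATCH
        AFTER MATCH SKIP PAST LAST ROW
        PATTERN (START_ROW PRICE_DOWN+ PRICE_UP)
        DEFINE
            PRICE_DOWN AS PRICE_DOWN.price < START_ROW.price,
            PRICE_UP AS PRICE_UP.price > LAST(PRICE_DOWN.price)
    ) MR;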





@jxchanghe
Author

It works now, thank you. The CEP SQL was indeed written incorrectly. I suggest adding a dedicated section on Flink CEP SQL to the documentation; some recent posts found online still claim that Dinky's Flink SQL does not support CEP.

@aiwenmo aiwenmo closed this as completed Oct 6, 2024