diff --git a/daprdocs/content/en/contributing/roadmap.md b/daprdocs/content/en/contributing/roadmap.md
index d3a7909357f..6c1093ecbd9 100644
--- a/daprdocs/content/en/contributing/roadmap.md
+++ b/daprdocs/content/en/contributing/roadmap.md
@@ -2,47 +2,9 @@
 type: docs
 title: "Dapr Roadmap"
 linkTitle: "Roadmap"
-description: "The Dapr Roadmap is a tool to help with visibility into investments across the Dapr project"
+description: "The Dapr Roadmap gives the community visibility into the different priorities of the project"
 weight: 30
 no_list: true
 ---
-
-Dapr encourages the community to help with prioritization. A GitHub project board is available to view and provide feedback on proposed issues and track them across development.
-
-[Screenshot of the Dapr Roadmap board](https://aka.ms/dapr/roadmap)
-
-{{< button text="View the backlog" link="https://aka.ms/dapr/roadmap" color="primary" >}}
-<br />
-
-Please vote by adding a 👍 on the GitHub issues for the feature capabilities you would most like to see Dapr support. This will help the Dapr maintainers understand which features will provide the most value.
-
-Contributions from the community is also welcomed. If there are features on the roadmap that you are interested in contributing to, please comment on the GitHub issue and include your solution proposal.
-
-{{% alert title="Note" color="primary" %}}
-The Dapr roadmap includes issues only from the v1.2 release and onwards. Issues closed and released prior to v1.2 are not included.
-{{% /alert %}}
-
-## Stages
-
-The Dapr Roadmap progresses through the following stages:
-
-{{< cardpane >}}
-{{< card title="**[📄 Backlog](https://github.com/orgs/dapr/projects/52#column-14691591)**" >}}
-  Issues (features) that need 👍 votes from the community to prioritize. Updated by Dapr maintainers.
-{{< /card >}}
-{{< card title="**[⏳ Planned (Committed)](https://github.com/orgs/dapr/projects/52#column-14561691)**" >}}
-  Issues with a proposal and/or targeted release milestone. This is where design proposals are discussed and designed.
-{{< /card >}}
-{{< card title="**[👩‍💻 In Progress (Development)](https://github.com/orgs/dapr/projects/52#column-14561696)**" >}}
-  Implementation specifics have been agreed upon and the feature is under active development.
-{{< /card >}}
-{{< /cardpane >}}
-{{< cardpane >}}
-{{< card title="**[☑ Done](https://github.com/orgs/dapr/projects/52#column-14561700)**" >}}
-  The feature capability has been completed and is scheduled for an upcoming release.
-{{< /card >}}
-{{< card title="**[✅ Released](https://github.com/orgs/dapr/projects/52#column-14659973)**" >}}
-  The feature is released and available for use.
-{{< /card >}}
-{{< /cardpane >}}
+See [this document](https://github.com/dapr/community/blob/master/roadmap.md) to view the Dapr project's roadmap.
diff --git a/daprdocs/content/en/developing-applications/integrations/Diagrid/diagrid-conductor.md b/daprdocs/content/en/developing-applications/integrations/Diagrid/diagrid-conductor.md
index 554ca118a23..c7504b56cc2 100644
--- a/daprdocs/content/en/developing-applications/integrations/Diagrid/diagrid-conductor.md
+++ b/daprdocs/content/en/developing-applications/integrations/Diagrid/diagrid-conductor.md
@@ -26,6 +26,4 @@ By studying past resource behavior, recommend application resource optimization
 
 The application graph facilitates collaboration between dev and ops by providing a dynamic overview of your services and infrastructure components.
 
-Try out [Conductor Free](https://www.diagrid.io/pricing), ideal for individual developers building and testing Dapr applications on Kubernetes.
-
 {{< button text="Learn more about Diagrid Conductor" link="https://www.diagrid.io/conductor" >}}
diff --git a/daprdocs/content/en/operations/support/support-release-policy.md b/daprdocs/content/en/operations/support/support-release-policy.md
index 55500ed6058..e3a7cc11956 100644
--- a/daprdocs/content/en/operations/support/support-release-policy.md
+++ b/daprdocs/content/en/operations/support/support-release-policy.md
@@ -45,6 +45,7 @@ The table below shows the versions of Dapr releases that have been tested togeth
 
 | Release date | Runtime | CLI | SDKs | Dashboard | Status | Release notes |
 |--------------------|:--------:|:--------|---------|---------|---------|------------|
+| August 14th 2024 | 1.14.1</br> | 1.14.1 | Java 1.12.0</br>Go 1.11.0</br>PHP 1.2.0</br>Python 1.14.0</br>.NET 1.14.0</br>JS 3.3.1 | 0.15.0 | Supported (current) | [v1.14.1 release notes](https://github.com/dapr/dapr/releases/tag/v1.14.1) |
 | August 14th 2024 | 1.14.0</br> | 1.14.0 | Java 1.12.0</br>Go 1.11.0</br>PHP 1.2.0</br>Python 1.14.0</br>.NET 1.14.0</br>JS 3.3.1 | 0.15.0 | Supported (current) | [v1.14.0 release notes](https://github.com/dapr/dapr/releases/tag/v1.14.0) |
 | May 29th 2024 | 1.13.4</br> | 1.13.0 | Java 1.11.0</br>Go 1.10.0</br>PHP 1.2.0</br>Python 1.13.0</br>.NET 1.13.0</br>JS 3.3.0 | 0.14.0 | Supported | [v1.13.4 release notes](https://github.com/dapr/dapr/releases/tag/v1.13.4) |
 | May 21st 2024 | 1.13.3</br> | 1.13.0 | Java 1.11.0</br>Go 1.10.0</br>PHP 1.2.0</br>Python 1.13.0</br>.NET 1.13.0</br>JS 3.3.0 | 0.14.0 | Supported | [v1.13.3 release notes](https://github.com/dapr/dapr/releases/tag/v1.13.3) |
@@ -139,7 +140,7 @@ General guidance on upgrading can be found for [self hosted mode]({{< ref self-h
 | 1.11.0 to 1.11.4 | N/A | 1.12.4 |
 | 1.12.0 to 1.12.4 | N/A | 1.13.5 |
 | 1.13.0 to 1.13.5 | N/A | 1.14.0 |
-| 1.14.0 | N/A | 1.14.0 |
+| 1.14.0 to 1.14.1 | N/A | 1.14.1 |
 
 ## Upgrade on Hosting platforms
diff --git a/daprdocs/content/en/operations/support/support-security-issues.md b/daprdocs/content/en/operations/support/support-security-issues.md
index 1ae3fce27c8..6e7b24a2d2b 100644
--- a/daprdocs/content/en/operations/support/support-security-issues.md
+++ b/daprdocs/content/en/operations/support/support-security-issues.md
@@ -52,7 +52,7 @@ The people who should have access to read your security report are listed in [`m
    code which allows the issue to be reproduced. Explain why you believe this to be a security issue in Dapr.
 2. Put that information into an email. Use a descriptive title.
-3. Send the email to [Dapr Maintainers (dapr@dapr.io)](mailto:dapr@dapr.io?subject=[Security%20Disclosure]:%20ISSUE%20TITLE)
+3. Send an email to [Security (security@dapr.io)](mailto:security@dapr.io?subject=[Security%20Disclosure]:%20ISSUE%20TITLE)
 
 ## Response
diff --git a/daprdocs/content/en/reference/api/jobs_api.md b/daprdocs/content/en/reference/api/jobs_api.md
index 3a04ed1a9d4..4a6a9c3683d 100644
--- a/daprdocs/content/en/reference/api/jobs_api.md
+++ b/daprdocs/content/en/reference/api/jobs_api.md
@@ -63,13 +63,11 @@ Entry | Description | Equivalent
 
 ```json
 {
-  "job": {
-    "data": {
+    "data": {
         "@type": "type.googleapis.com/google.protobuf.StringValue",
         "value": "\"someData\""
     },
     "dueTime": "30s"
-  }
 }
 ```
 
@@ -90,14 +88,12 @@ $ curl -X POST \
   http://localhost:3500/v1.0-alpha1/jobs/jobforjabba \
   -H "Content-Type: application/json"
   -d '{
-      "job": {
-        "data": {
+        "data": {
             "@type": "type.googleapis.com/google.protobuf.StringValue",
             "value": "Running spice"
         },
-        "schedule": "@every 1m",
-        "repeats": 5
-      }
+        "schedule": "@every 1m",
+        "repeats": 5
     }'
 ```
diff --git a/daprdocs/content/en/reference/components-reference/supported-bindings/kafka.md b/daprdocs/content/en/reference/components-reference/supported-bindings/kafka.md
index addfba98a8c..413e1893fe6 100644
--- a/daprdocs/content/en/reference/components-reference/supported-bindings/kafka.md
+++ b/daprdocs/content/en/reference/components-reference/supported-bindings/kafka.md
@@ -63,6 +63,8 @@ spec:
     value: true
   - name: schemaLatestVersionCacheTTL # Optional. When using Schema Registry Avro serialization/deserialization. The TTL for schema caching when publishing a message with latest schema available.
     value: 5m
+  - name: escapeHeaders # Optional.
+    value: false
 ```
 
 ## Spec metadata fields
@@ -99,6 +101,7 @@ spec:
 | `consumerFetchDefault` | N | Input/Output | The default number of message bytes to fetch from the broker in each request. Default is `"1048576"` bytes. | `"2097152"` |
 | `heartbeatInterval` | N | Input | The interval between heartbeats to the consumer coordinator. At most, the value should be set to a 1/3 of the `sessionTimeout` value. Defaults to `"3s"`. | `"5s"` |
 | `sessionTimeout` | N | Input | The timeout used to detect client failures when using Kafka’s group management facility. If the broker fails to receive any heartbeats from the consumer before the expiration of this session timeout, then the consumer is removed and initiates a rebalance. Defaults to `"10s"`. | `"20s"` |
+| `escapeHeaders` | N | Input | Enables URL escaping of the message header values received by the consumer. Allows receiving content with special characters that are usually not allowed in HTTP headers. Default is `false`. | `true` |
 
 #### Note
 The metadata `version` must be set to `1.0.0` when using Azure EventHubs with Kafka.
diff --git a/daprdocs/content/en/reference/components-reference/supported-pubsub/setup-apache-kafka.md b/daprdocs/content/en/reference/components-reference/supported-pubsub/setup-apache-kafka.md
index cafcee537fe..e6091d87e29 100644
--- a/daprdocs/content/en/reference/components-reference/supported-pubsub/setup-apache-kafka.md
+++ b/daprdocs/content/en/reference/components-reference/supported-pubsub/setup-apache-kafka.md
@@ -63,6 +63,8 @@ spec:
     value: true
   - name: schemaLatestVersionCacheTTL # Optional. When using Schema Registry Avro serialization/deserialization. The TTL for schema caching when publishing a message with latest schema available.
     value: 5m
+  - name: escapeHeaders # Optional.
+    value: false
 ```
 
 
@@ -112,6 +114,7 @@ spec:
 | consumerFetchDefault | N | The default number of message bytes to fetch from the broker in each request. Default is `"1048576"` bytes. | `"2097152"` |
 | heartbeatInterval | N | The interval between heartbeats to the consumer coordinator. At most, the value should be set to a 1/3 of the `sessionTimeout` value. Defaults to "3s". | `"5s"` |
 | sessionTimeout | N | The timeout used to detect client failures when using Kafka’s group management facility. If the broker fails to receive any heartbeats from the consumer before the expiration of this session timeout, then the consumer is removed and initiates a rebalance. Defaults to "10s". | `"20s"` |
+| escapeHeaders | N | Enables URL escaping of the message header values received by the consumer. Allows receiving content with special characters that are usually not allowed in HTTP headers. Default is `false`. | `true` |
 
 The `secretKeyRef` above is referencing a [kubernetes secrets store]({{< ref kubernetes-secret-store.md >}}) to access the tls information. Visit [here]({{< ref setup-secret-store.md >}}) to learn more about how to configure a secret store component.
 
@@ -485,6 +488,39 @@ curl -X POST http://localhost:3500/v1.0/publish/myKafka/myTopic?metadata.correla
 }'
 ```
 
+## Receiving message headers with special characters
+
+The consumer application may be required to receive message headers that include special characters, which may cause HTTP protocol validation errors.
+HTTP header values must follow specifications, making some characters not allowed. [Learn more about the protocols](https://www.w3.org/Protocols/rfc2616/rfc2616-sec4.html#sec4.2).
+In this case, you can enable the `escapeHeaders` configuration setting, which uses URL escaping to encode header values on the consumer side.
+
+{{% alert title="Note" color="primary" %}}
+When using this setting, the received message headers are URL escaped, and you need to URL "un-escape" them to get the original values.
+{{% /alert %}}
+
+Set `escapeHeaders` to `true` to enable URL escaping.
+
+```yaml
+apiVersion: dapr.io/v1alpha1
+kind: Component
+metadata:
+  name: kafka-pubsub-escape-headers
+spec:
+  type: pubsub.kafka
+  version: v1
+  metadata:
+  - name: brokers # Required. Kafka broker connection setting
+    value: "dapr-kafka.myapp.svc.cluster.local:9092"
+  - name: consumerGroup # Optional. Used for input bindings.
+    value: "group1"
+  - name: clientID # Optional. Used as client tracing ID by Kafka brokers.
+ value: "my-dapr-app-id" + - name: authType # Required. + value: "none" + - name: escapeHeaders + value: "true" +``` + ## Avro Schema Registry serialization/deserialization You can configure pub/sub to publish or consume data encoded using [Avro binary serialization](https://avro.apache.org/docs/), leveraging an [Apache Schema Registry](https://developer.confluent.io/courses/apache-kafka/schema-registry/) (for example, [Confluent Schema Registry](https://developer.confluent.io/courses/apache-kafka/schema-registry/), [Apicurio](https://www.apicur.io/registry/)). @@ -597,6 +633,7 @@ To run Kafka on Kubernetes, you can use any Kafka operator, such as [Strimzi](ht {{< /tabs >}} + ## Related links - [Basic schema for a Dapr component]({{< ref component-schema >}}) - Read [this guide]({{< ref "howto-publish-subscribe.md##step-1-setup-the-pubsub-component" >}}) for instructions on configuring pub/sub components diff --git a/daprdocs/layouts/shortcodes/dapr-latest-version.html b/daprdocs/layouts/shortcodes/dapr-latest-version.html index c64a87827be..868372aa489 100644 --- a/daprdocs/layouts/shortcodes/dapr-latest-version.html +++ b/daprdocs/layouts/shortcodes/dapr-latest-version.html @@ -1 +1 @@ -{{- if .Get "short" }}1.14{{ else if .Get "long" }}1.14.0{{ else if .Get "cli" }}1.14.0{{ else }}1.14.0{{ end -}} +{{- if .Get "short" }}1.14{{ else if .Get "long" }}1.14.1{{ else if .Get "cli" }}1.14.1{{ else }}1.14.1{{ end -}}