chore: Release version 0.0.2

Mcdostone committed Dec 15, 2024
1 parent 5454236 commit bed74bd
Showing 8 changed files with 68 additions and 61 deletions.
12 changes: 6 additions & 6 deletions Cargo.lock

Some generated files are not rendered by default.

18 changes: 9 additions & 9 deletions Cargo.toml
@@ -22,7 +22,7 @@ default-members = [
resolver = "2"

[workspace.package]
version = "0.0.1"
version = "0.0.2"
edition = "2021"
authors = ["Yann Prono <[email protected]>"]
readme = "README.md"
@@ -32,12 +32,12 @@ license = "Apache-2.0"


[workspace.dependencies]
lib = { package = "yozefu-lib", path = "crates/lib/", version = "0.0.1" }
app = { package = "yozefu-app", path = "crates/app/", version = "0.0.1" }
command = { package = "yozefu-command", path = "crates/command/", version = "0.0.1" }
yozefu = { package = "yozefu", path = "crates/bin/", version = "0.0.1" }
tui = { package = "yozefu-tui", path = "crates/tui/", version = "0.0.1" }
wasm-types = { package = "wasm-types", path = "crates/wasm-types/", version = "0.0.1" }
lib = { package = "yozefu-lib", path = "crates/lib/", version = "0.0.2" }
app = { package = "yozefu-app", path = "crates/app/", version = "0.0.2" }
command = { package = "yozefu-command", path = "crates/command/", version = "0.0.2" }
yozefu = { package = "yozefu", path = "crates/bin/", version = "0.0.2" }
tui = { package = "yozefu-tui", path = "crates/tui/", version = "0.0.2" }
wasm-types = { package = "wasm-types", path = "crates/wasm-types/", version = "0.0.2" }

[profile.release]
opt-level = 3
@@ -51,7 +51,7 @@ incremental = false

[workspace.metadata.release]
shared-version = true
tag-message = "chore: Release version {{version}}"
pre-release-commit-message = "chore: Release version {{version}}"
tag-message = "chore: Release version v{{version}}"
pre-release-commit-message = "chore: Release version v{{version}}"
tag-name = "{{version}}"

3 changes: 1 addition & 2 deletions README.md
@@ -3,7 +3,6 @@
<!--
<a href="https://github.com/MAIF/yozefu/releases"><img src="https://img.shields.io/github/v/release/MAIF/yozefu?style=flatd&color=f8be75&logo=GitHub"></a>-->
<a href="https://crates.io/crates/yozefu/"><img src="https://img.shields.io/crates/v/yozefu?logo=Rust"></a>
<a href="https://github.com/MAIF/yozefu/actions/workflows/build.yml"><img src="https://github.com/MAIF/yozefu/actions/workflows/build.yml/badge.svg" alt="Build status"/></a>
<a href="https://www.rust-lang.org/"><img src="https://img.shields.io/badge/MSRV-1.80.1+-lightgray.svg?logo=rust" alt="Minimum supported Rust version: 1.80.1 or plus"/></a>
<a href="https://github.com/MAIF/yozefu/blob/main/LICENSE"><img src="https://img.shields.io/badge/License-Apache_2.0-blue.svg" alt="Licence"/></a>

@@ -96,7 +95,7 @@ yozf -c localhost

- [The query language.](./docs/query-language/README.md)
- [Creating a search filter.](./docs/search-filter/README.md)
- [Configuring the tool with TLS.](./docs/tls/README.md)
- [TLS encryption and authentication](./docs/tls/README.md).
- [URL templates to switch to web applications.](./docs/url-templates/README.md)
- [Schema registry.](./docs/schema-registry/README.md)
- [Themes.](./docs/themes/README.md)
2 changes: 1 addition & 1 deletion crates/bin/Cargo.toml
@@ -31,4 +31,4 @@ ssl-vendored = [
]
gssapi-vendored = [
"command/gssapi-vendored"
]
]
2 changes: 1 addition & 1 deletion crates/command/Cargo.toml
@@ -62,4 +62,4 @@ gssapi-vendored = [
"rdkafka/gssapi-vendored",
"tui/gssapi-vendored",
"app/gssapi-vendored"
]
]
47 changes: 28 additions & 19 deletions docs/keybindings/README.md
@@ -5,13 +5,13 @@ So far, keybindings are hardcoded.



**General keybindings**
**General**
| Keybinding | Description |
| --------------------------------- | :------------------------------------- |
| <kbd>tab</kbd> | Next panel |
| <kbd>shift</kbd> + <kbd>tab</kbd> | previous panel |
| <kbd>Tab</kbd> | Next panel |
| <kbd>Shift</kbd> + <kbd>Tab</kbd> | Previous panel |
| <kbd>/</kbd> | Go to search bar |
| <kbd>escape</kbd> | Close the view or exit the app |
| <kbd>Escape</kbd> | Close the last visible dialog |
| <kbd>ctrl</kbd> + <kbd>H</kbd> | Show/Hide help |
| <kbd>ctrl</kbd> + <kbd>O</kbd> | Show/Hide topics |
| <kbd>[</kbd> | Scroll to top |
@@ -20,17 +20,17 @@ So far, keybindings are hardcoded.
| <kbd>K</kbd> | Move to downward direction by one line |



<br />

**Topics**
| Keybinding | Description |
| ------------------------------ | :----------------- |
| <kbd>tab</kbd> | Next panel |
| <kbd>ctrl</kbd> + <kbd>P</kbd> | Show topic details |
| <kbd>ctrl</kbd> + <kbd>U</kbd> | Unselect topics |
| <kbd>enter</kbd> | Select the topic |
| Keybinding | Description |
| ------------------------------ | :------------------ |
| <kbd>ctrl</kbd> + <kbd>P</kbd> | Show topic details |
| <kbd>ctrl</kbd> + <kbd>U</kbd> | Unselect all topics |
| <kbd>Enter</kbd> | Select the topic |


<br />

**Records list**
| Keybinding | Description |
@@ -39,10 +39,12 @@ So far, keybindings are hardcoded.
| <kbd>O</kbd> | Open the kafka record in the web browser |
| <kbd>E</kbd> | Export kafka record to the file |
| <kbd>F</kbd> | Keep selecting the last consumed kafka record |
| <kbd>enter</kbd> | Open the selected record |
| <kbd>Enter</kbd> | Open the selected record |
| <kbd>↑</kbd> or <kbd>↓</kbd> | Previous/next record |


<br />

**Record**

| Keybinding | Description |
@@ -55,12 +57,19 @@

<br />

**Schemas**

| Keybinding | Description |
| ---------------------------- | :------------------------ |
| <kbd>C</kbd> | Copy schemas to clipboard |
| <kbd>↑</kbd> or <kbd>↓</kbd> | Scroll |

<br />



**Search**
| Keybinding | Description |
| ---------------------------- | :--------------------------------------- |
| <kbd>↓</kbd> or <kbd>↑</kbd> | Browse history |
| <kbd>C</kbd> | Copy kafka record to clipboard |
| <kbd>O</kbd> | Open the kafka record in the web browser |
| <kbd>E</kbd> | Export kafka record to the file |
| <kbd>enter</kbd> | Search kafka records |
| Keybinding | Description |
| ---------------------------- | :------------------- |
| <kbd>↓</kbd> or <kbd>↑</kbd> | Browse history |
| <kbd>Enter</kbd> | Search kafka records |
23 changes: 11 additions & 12 deletions docs/schemas/MyProducer.java
@@ -170,7 +170,7 @@ public Properties kafkaProperties() {
return props;
}

public static <K, V> void produce(KafkaProducer<K, V> producer, Into<K, V> mapper, List<String> addresses, String topic) throws Exception {
public static <K, V> void produce(final KafkaProducer<K, V> producer, final Into<K, V> mapper, final List<String> addresses, final String topic) throws Exception {
for (var address : addresses) {
var record = mapper.into(address, topic);
producer.send(record, onSend());
@@ -189,7 +189,7 @@ private static Callback onSend() {
};
}

private static List<String> get(String apiUrl, String query) throws IOException, InterruptedException {
private static List<String> get(final String apiUrl, String query) throws IOException, InterruptedException {
System.err.printf(" 🏡 Searching french addresses matching the query '%s'\n", query);
var url = String.format(apiUrl, query.trim().toLowerCase());

@@ -229,7 +229,7 @@ public static void main(String[] args) {


interface Into<K, V> {
ProducerRecord<K, V> into(String value, String topic) throws Exception;
ProducerRecord<K, V> into(final String value, final String topic) throws Exception;

default String generateKey() {
return UUID.randomUUID().toString();
@@ -243,21 +243,21 @@ default String readResource(String path) throws Exception {
}

class IntoText implements Into<String, String> {
public ProducerRecord<String, String> into(String value, String topic) throws JsonProcessingException {
public ProducerRecord<String, String> into(final String value, final String topic) throws JsonProcessingException {
var objectMapper = new ObjectMapper();
var object = objectMapper.readTree(value);
return new ProducerRecord<>(topic, this.generateKey(), object.get("properties").get("label").asText());
}
}

class IntoJson implements Into<String, String> {
public ProducerRecord<String, String> into(String value, String topic) {
public ProducerRecord<String, String> into(final String value, final String topic) {
return new ProducerRecord<>(topic, generateKey(), value);
}
}

class IntoJsonSchema implements Into<JsonNode, JsonNode> {
public ProducerRecord<JsonNode, JsonNode> into(String input, String topic) throws Exception {
public ProducerRecord<JsonNode, JsonNode> into(final String input, final String topic) throws Exception {
var objectMapper = new ObjectMapper();
var keySchemaString = readResource("/json-schema/key-schema.json");
var valueSchemaString = readResource("/json-schema/value-schema.json");
@@ -275,7 +275,7 @@ public ProducerRecord<JsonNode, JsonNode> into(String input, String topic) throw
}

class IntoAvro implements Into<GenericRecord, GenericRecord> {
public ProducerRecord<GenericRecord, GenericRecord> into(String input, String topic) throws Exception {
public ProducerRecord<GenericRecord, GenericRecord> into(final String input, final String topic) throws Exception {
var keySchemaString = readResource("/avro/key-schema.json");
var valueSchemaString = readResource("/avro/value-schema.json");

@@ -293,7 +293,7 @@

// TODO work in progress
class IntoProtobuf implements Into<Object, Object> {
public ProducerRecord<Object, Object> into(String input, String topic) throws Exception {
public ProducerRecord<Object, Object> into(final String input, final String topic) throws Exception {
var keySchemaString = readResource("/protobuf/key-schema.proto");
var valueSchemaString = readResource("/protobuf/value-schema.proto");

Expand All @@ -309,7 +309,7 @@ public ProducerRecord<Object, Object> into(String input, String topic) throws Ex
}

class IntoMalformed implements Into<byte[], byte[]> {
public ProducerRecord<byte[], byte[]> into(String input, String topic) throws Exception {
public ProducerRecord<byte[], byte[]> into(final String input, final String topic) throws Exception {
byte randomSchemaId = (byte) ((Math.random() * (127 - 1)) + 1);
var header = new byte[]{0, 0, 0, 0, randomSchemaId};

Expand All @@ -329,9 +329,8 @@ public ProducerRecord<byte[], byte[]> into(String input, String topic) throws Ex
}
}


class IntoInvalidJson implements Into<JsonNode, JsonNode> {
public ProducerRecord<JsonNode, JsonNode> into(String input, String topic) throws Exception {
public ProducerRecord<JsonNode, JsonNode> into(final String input, final String topic) throws Exception {
var objectMapper = new ObjectMapper();
var keySchemaString = readResource("/json-schema/key-schema.json");
var valueSchemaString = readResource("/json-schema/value-schema.json");
Expand All @@ -350,7 +349,7 @@ public ProducerRecord<JsonNode, JsonNode> into(String input, String topic) throw
}

class IntoXml implements Into<String, String> {
public ProducerRecord<String, String> into(String input, String topic) throws Exception {
public ProducerRecord<String, String> into(final String input, final String topic) throws Exception {
var objectMapper = new ObjectMapper();
var xmlMapper = new XmlMapper();
var value = objectMapper.readTree(input);
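
Note on the producer file touched above: the hunks only show `final` being added to the mapper parameters, so the sketch below shows, under stated assumptions, how the `produce` helper and one `Into` implementation could be wired together. It assumes `produce` and `IntoJson` are reachable from a class named `MyProducer` in the same package, that a broker listens on `localhost:9092`, and that the topic name and sample address value are made up for illustration; none of these values come from this commit.

import java.util.List;
import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.common.serialization.StringSerializer;

class ProduceSketch {
    public static void main(String[] args) throws Exception {
        // Assumed broker address and String (de)serialization; adjust to your setup.
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());

        // IntoJson forwards the raw JSON value and generates a random UUID key,
        // as shown in the diff above.
        List<String> addresses = List.of("{\"properties\":{\"label\":\"1 rue de la Paix Paris\"}}");

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // Hypothetical topic name, only for illustration.
            MyProducer.produce(producer, new IntoJson(), addresses, "french-addresses");
        }
    }
}

Closing the producer via try-with-resources flushes any records still buffered after `produce` returns.
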
22 changes: 11 additions & 11 deletions docs/tls/README.md
@@ -1,20 +1,17 @@
# TLS Support
# TLS encryption and authentication
<p>
<a href="https://github.com/confluentinc/librdkafka/blob/master/CONFIGURATION.md">
<img src="https://img.shields.io/badge/librdkafka-Global_configuration_properties-black.svg?logo=github"></a> <a href="https://github.com/confluentinc/librdkafka/wiki/Using-SSL-with-librdkafka#configure-librdkafka-client">
<img src="https://img.shields.io/badge/librdkafka-Configure_librdkafka_client-black.svg?logo=github"></a>
</p>



This page helps you configure TLS settings for different providers.
The steps are always the same:
1. Open the configuration with `yozf configure`
2. Edit the configuration file by adding a new cluster.
3. Save the file and start the tool: `yozf -c my-cluster`

If you use any of the following properties: `ssl.ca.location`, `ssl.certificate.location`, or `ssl.key.location`, make sure to provide an absolute path; using `~` in the path doesn't work.

> [!WARNING]
> `SASL_SSL` security protocol is not available for `aarch64-unknown-linux-gnu` and `windows` targets. I'm facing some compilation issues.
@@ -138,10 +135,13 @@ Please note that, according to [the documentation](https://github.com/confluenti
[Contributions are welcomed](https://github.com/MAIF/yozefu/edit/main/docs/tls.md) to improve this page.


| Provider | Tested | Documentation |
| --------------------- | ------- | ----------------------------------------------------------------------------------------------------------------------------- |
| Google Cloud Platform | `false` | https://cloud.google.com/managed-service-for-apache-kafka/docs/quickstart#cloud-shell |
| Amazon Web Services | `false` | https://docs.aws.amazon.com/msk/latest/developerguide/produce-consume.html |
| Microsoft Azure | `false` | https://learn.microsoft.com/fr-fr/azure/event-hubs/azure-event-hubs-kafka-overview |
| DigitalOcean | `false` | https://docs.digitalocean.com/products/databases/kafka/how-to/connect/ |
| OVH | `false` | https://help.ovhcloud.com/csm/en-ie-public-cloud-databases-kafka-getting-started?id=kb_article_view&sysparm_article=KB0048944 |
| Provider | Compatible | Documentation |
| ----------------------- | ---------- | ----------------------------------------------------------------------------------------------------------------------------- |
| Google Cloud Platform | ? | https://cloud.google.com/managed-service-for-apache-kafka/docs/quickstart#cloud-shell |
| Amazon Web Services | ? | https://docs.aws.amazon.com/msk/latest/developerguide/produce-consume.html |
| Microsoft Azure | ? | https://learn.microsoft.com/fr-fr/azure/event-hubs/azure-event-hubs-kafka-overview |
| DigitalOcean | ? | https://docs.digitalocean.com/products/databases/kafka/how-to/connect/ |
| OVH | ? | https://help.ovhcloud.com/csm/en-ie-public-cloud-databases-kafka-getting-started?id=kb_article_view&sysparm_article=KB0048944 |
| Aiven for Apache Kafka® | `true` | https://aiven.io/docs/products/kafka/howto/list-code-samples |
| Confluent Cloud | `true` | https://confluent.cloud/environments |
| Redpanda | `true` | https://cloud.redpanda.com/clusters |
