
Feature: Data stream flow control #67

Open
AjaniBilby opened this issue Aug 26, 2022 · 1 comment
Labels: enhancement (New feature or request), Low Priority
Milestone: Version 0.3.0

Comments

@AjaniBilby
Member

This language will avoid the archaic for-loop style, as the more commonly used for..of/in pattern of other languages has the same behaviour as a map, just with different syntactic sugar.
However, in most languages map/filter produces a new container after each operation, which is a waste of allocations; instead these operations should be streamlined into a single execution unit producing one final result, rather than multiple intermediate results.

I.e. the below snippet should only produce one final array, and ideally in this case, since arr is being consumed and there is a 1:1 relationship between inputs and outputs, the container which held the input values should be reused for the output, resulting in zero allocations or data shifting.

arr.map(x => x*2).map(x => x-3).map(x => x % 4)
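
For comparison, here is a rough sketch of the same chain in Rust (used only as a stand-in for the idea; the function name fused is made up). Rust's iterator adapters are lazy, so the three closures run back-to-back over each element in a single pass, and collecting from an owned Vec can reuse the input allocation when the element layout matches.

// Single-pass, fused version of the three maps above.
fn fused(arr: Vec<i32>) -> Vec<i32> {
    arr.into_iter()      // consume the input; no intermediate arrays are built
        .map(|x| x * 2)
        .map(|x| x - 3)
        .map(|x| x % 4)
        .collect()       // one output buffer; may reuse the input allocation
}

fn main() {
    println!("{:?}", fused(vec![1, 2, 3, 4]));
}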

Initially there were going to be six operators creating symmetrical control:

  • Unfold: 1:m
  • Expand: n:m where n<=m
  • Map: 1:1
  • Filter: n:m where n>=m
  • Reduce: n:m where n>=m
  • Fold: n:1

However, expand is simply an unfold nested inside a map, and expressing it that way adds little syntactic complexity, so it does not need its own operator.
Removing filter and instead relying on reduce, on the other hand, would require a runtime either-type between the streamed value and void, meaning values would need to be packed into a struct and then read back out to determine whether they should be dropped. Keeping filter as its own operation therefore saves extra type syntax, compilation complexity, and possible runtime side effects at lower levels of optimisation (see the sketch below).
Fold can also be eliminated, as it is simply a reduce which has been designed to only yield once.
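
To make the trade-off concrete, here is a small Rust sketch (Rust's standard iterator methods, not the proposed operators; the function names are made up). Option plays the role of the either-type between the streamed value and void: the dedicated filter only needs a boolean predicate, whereas expressing the same thing as a map forces every element to be packed and read back out just to decide whether it is dropped.

fn keep_even_with_predicate(xs: &[i32]) -> Vec<i32> {
    // Dedicated filter: the closure returns a plain bool, nothing is packed.
    xs.iter().copied().filter(|x| x % 2 == 0).collect()
}

fn keep_even_with_option(xs: &[i32]) -> Vec<i32> {
    // Filter expressed as a map into Option: every element is packed into an
    // Option and read back out to decide whether it should be dropped.
    xs.iter()
        .copied()
        .filter_map(|x| if x % 2 == 0 { Some(x) } else { None })
        .collect()
}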

Unfold:

let val = 64;
@val unfold (x) => {
  if (x > 2) {
    x = x/2;
    yield 2;
  } else {
    finish;
  }
} map (x) => {
  printf("Divided by %i", x);
}
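
For reference, a rough Rust analogue of the unfold above (assuming the intent is to yield one value per halving step until the seed reaches 2; std::iter::from_fn merely stands in for the unfold operator):

fn main() {
    let mut val = 64;
    let halvings = std::iter::from_fn(|| {
        if val > 2 {
            val /= 2;
            Some(2)      // corresponds to `yield 2;`
        } else {
            None         // corresponds to `finish;`
        }
    });
    for x in halvings {
        println!("Divided by {}", x);
    }
}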

Filter:

&arr filter (x) => {
  ret x % 2 == 0;
} map (x) => {
  printf("Even numbers %i", x);
}
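
A rough Rust analogue of the same stream, assuming the predicate is meant to keep even values:

fn main() {
    let arr = [1, 2, 3, 4, 5, 6];
    arr.iter()
        .copied()
        .filter(|x| x % 2 == 0)                        // keep even values only
        .for_each(|x| println!("Even numbers {}", x));
}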

Reduce:

// Squash the array so each element sums with its neighbour and length /= 2
let buf = Vec#[int].new();
concat { arr, buf } reduce (x) => {
  if (buf->length() > 0) {
    yield buf->pop() + x;
  } else {
    buf->push(x);
  }
  // reached end of block, load next elm and repeat
} map (x) => { printf("%i", x) }
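
A rough Rust analogue of the pairwise squash described in the comment, using slice chunks instead of a scratch buffer (an odd trailing element is passed through unchanged):

fn main() {
    let arr = [1, 2, 3, 4, 5];
    arr.chunks(2)                                // pair each element with its neighbour
        .map(|pair| pair.iter().sum::<i32>())    // sum the pair, halving the length
        .for_each(|x| println!("{}", x));
}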

Concat:

Allows multiple sources of the same data type; when one source stops, feeding starts from the next source.
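
A rough Rust analogue using iterator chaining, which has the same behaviour of draining one source before feeding from the next:

fn main() {
    let arr = [1, 2, 3];
    let buf = vec![4, 5, 6];
    arr.iter()
        .chain(buf.iter())                 // once arr is exhausted, feed from buf
        .for_each(|x| println!("{}", x));
}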

@AjaniBilby AjaniBilby added the enhancement New feature or request label Aug 26, 2022
@AjaniBilby
Member Author

This fulfills issues #9, #8, #7, and #6, and thus makes them irrelevant.

@AjaniBilby AjaniBilby added this to the Version 0.3.0 milestone Aug 26, 2022