flatten shallow by default #9

Closed
jdalton opened this issue Mar 5, 2016 · 28 comments

@jdalton (Member) commented Mar 5, 2016

Hey Brian. I dig these proposals!

I think Array#flatten should be shallow by default because it makes sense to do less work by default, it aligns with existing APIs like the DOM's Node#cloneNode, which is shallow by default, and it would align with the existing ES3 pattern of using Array#concat for a shallow flatten. Shallow by default would also align with flatMap.
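For reference, the ES3-era pattern mentioned above flattens exactly one level, because concat splices array arguments in shallowly; a minimal sketch:

// Shallow (one-level) flatten via Array#concat: array arguments are
// spread one level, non-array elements are kept as-is.
var flattenedOnce = [].concat.apply([], [[1, 2], [3, [4]], 5]);
// => [1, 2, 3, [4], 5]  (the inner [4] is untouched)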

@bterlson (Member) commented Apr 6, 2016

I have discussed with others and there seems to be a weak consensus for 1-level-by-default. I was basing the semantics off of what I was familiar with (Ruby) but the arguments in favor of level-1-by-default are strong. Will update.

@leobalter (Member)

Any updates so far? I agree with the sense of less work by default, and the depth argument can escalate it to the necessary level when needed.

This would also make flatten similar to flatMap and make both more intuitive to use.
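To spell out the flatMap alignment: flatMap maps and then flattens its results by exactly one level, so a depth-1 default for flatten keeps the two methods consistent. A rough sketch, assuming the proposed flatMap:

// flatMap = map + one level of flattening, so with a depth-1 default
// for flatten these two produce the same result:
const viaFlatMap = [1, 2, 3].flatMap(x => [x, x * 10]);                       // [1, 10, 2, 20, 3, 30]
const viaMapAndConcat = [].concat.apply([], [1, 2, 3].map(x => [x, x * 10])); // same result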

@bterlson (Member)

Haven't updated spec text, adding to backlog again due to renewed interest. 1-level deep is the obvious way to go.

michaelficarra added a commit to michaelficarra/proposal-flatMap that referenced this issue Jul 6, 2017
bterlson added a commit that referenced this issue Jul 6, 2017
fixes #9: flatten shallow by default
@dcporter commented Jan 4, 2018

the sense of less work by default

A highly unscientific survey of my cohorts has suggested that:

  • Single-layer flattening is surprising; full flattening is the expectation.
  • Use cases for not flattening all the way were not immediately obvious (although the use cases discussed were in general contrived anyway).

If shallow flattening is the undesired behavior, then the performance benefit is illusory. If there's only one layer of arrays, the performance is the same, and if there are more, it's the wrong outcome.
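A small example of that point, assuming the proposed Array#flatten(depth) with a default depth of 1:

const nested = [1, [2, [3, [4]]]];
nested.flatten();         // [1, 2, [3, [4]]] -- still not flat
nested.flatten(Infinity); // [1, 2, 3, 4]     -- fully flat
// With only one layer of nesting the two calls do the same amount of work;
// with deeper nesting the default just produces a different (still nested) result.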

Are we wrong about real-life use cases?

@ridgeworks

This may have been closed, but I must disagree with this decision. Complete flattening is a common use case, so what value of depth should be used when that's desired: Number.MAX_SAFE_INTEGER? Or does the array have to be traversed to calculate the correct depth? Hardly a performant solution.

If you want a depth of 1, specify it. At the very least (though hardly ideal), allow a more convenient depth value of 0 or -1 to specify complete flattening.

Or provide a flatten_deep() method to do this job.

@ljharb (Member) commented Feb 27, 2018

If you want infinite flattening, you can pass Infinity.
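A depth of Infinity terminates fine, because descent only continues into elements that are actually arrays, so the required depth never has to be computed up front. A minimal userland sketch of the depth semantics (flattenTo is just an illustrative helper, not the proposal's spec text):

function flattenTo(arr, depth) {
	// Recurse only into array elements while the depth budget lasts;
	// Infinity - 1 is still Infinity, so the recursion simply bottoms
	// out when no nested arrays remain.
	return arr.reduce(
	  (acc, item) =>
	    Array.isArray(item) && depth > 0
	      ? acc.concat(flattenTo(item, depth - 1))
	      : acc.concat(item),
	  []);
}

flattenTo([1, [2, [3]]], Infinity); // [1, 2, 3]
flattenTo([1, [2, [3]]], 1);        // [1, 2, [3]]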

@ridgeworks

Yes, I can (see Number.MAX_SAFE_INTEGER), but isn't it much easier for those who want a depth of 1 to specify 1? And isn't the resulting code semantically clearer? By the way, I don't want "infinite" flattening, just sufficient flattening (instance dependent).

For other usage see (Node module):
https://www.npmjs.com/package/array-flatten

And see "popular" answer for flatten(arr) at:
https://stackoverflow.com/questions/10865025/merge-flatten-an-array-of-arrays-in-javascript

amongst others. It just seems this proposal is making a common use case unnecessarily complicated in order to make a simple use case even simpler, for no particular reason IMO.

@ljharb (Member) commented Feb 27, 2018

"Sufficient flattening" depends on your use case; you might want to flatten an array like [[[[1]]], 2, [[3]]] - you could specify a depth of 3, if you know to do that, but in practice you'd specify Infinity and the algorithm will know when to stop.
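Concretely, for that example (again assuming the proposed flatten(depth)):

const a = [[[[1]]], 2, [[3]]];
a.flatten(3);        // [1, 2, 3] -- works, but only if you know the maximum nesting depth
a.flatten(Infinity); // [1, 2, 3] -- same result, without inspecting the structure first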

@dcporter

You’re still answering @ridgeworks’ “how do I do it” question from two comments ago, dodging the follow-up “common case” point. I for one understand that Infinity will do the right thing here, but I agree with @ridgeworks that Infinity is a better default than 1. Flattening to Infinity covers nearly every case, including most depth-1 cases; the only cases I’ve heard where you want to stop short of Infinity involve tuples, which are not unheard-of but are not very common either.

@dcporter

(Followup to pre-apologize for “dodging”, which suggests intent. I should have said “missing”.)

@ljharb (Member) commented Feb 27, 2018

@dcporter fair :-) I think that 1 is the common case, and that "I don't know exactly how many levels I want to flatten" is not the common case. I often have an array of groups of arrays, and I want one level of flattening down to "an array of arrays".

"Common case" is obviously quite subjective, though - but the OP to me quite convincingly indicates that the default should be 1.

@ridgeworks

I must disagree that 1 is the obviously common case. I think it's quite a common case to dynamically build a nested array structure, e.g., from a parser, and flatten it to some output. I'm not arguing it's the only case; just that when you want it, providing some arbitrarily large integer value seems contrived to say the least.

Some more prior art in the form of JS libraries (they're not hard to find) that at least recognize it's a common situation:

underscore.js - flatten() defaults to deep, but takes an extra boolean parameter for shallow (1 level)

ramda.js - flatten() flattens deep, use unnest() for shallow(?)

lodash - flatten() flattens 1 level, but flattenDeep() is also provided

collections.js - flatten() flattens deep (I think)
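For comparison, the defaults in those libraries look roughly like this (as of versions current around this discussion; worth double-checking each library's docs):

// underscore: deep by default, optional boolean for shallow
_.flatten([1, [2, [3]]]);       // [1, 2, 3]
_.flatten([1, [2, [3]]], true); // [1, 2, [3]]

// lodash: shallow by default, with explicit deep variants
_.flatten([1, [2, [3]]]);       // [1, 2, [3]]
_.flattenDeep([1, [2, [3]]]);   // [1, 2, 3]

// Ramda: deep by default, unnest() for one level
R.flatten([1, [2, [3]]]);       // [1, 2, 3]
R.unnest([1, [2, [3]]]);        // [1, 2, [3]]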

@bakkot (Contributor) commented Feb 27, 2018

@ridgeworks,

providing some arbitrarily large integer value seems contrived to say the least.

You shouldn't provide "some arbitrarily large integer value". You should provide Infinity, to indicate that you want to flatten arbitrarily deep structures: array.flatten(Infinity).

That doesn't seem contrived to me.

@ridgeworks

I concede that array.flatten(Infinity) isn't that bad and does the job. But I also listed in previous posts 5 JavaScript libraries where either flatten() was a deep flatten or a separate flattenDeep() method/function was provided (1 case). Is prior art not a consideration? I suspect users of those libraries would find array.flatten(Infinity) somewhat idiosyncratic. The same would apply to array.flatten(1) for that matter, but at least it can be written as array.flatten(). So maybe my real problem is with the depth parameter for these "common" cases. Maybe this proposal is trying to do too many things with a single method. Even providing an additional array.flattenDeep() method would be an improvement IMO.

Or maybe I'm missing something about the rationale behind this proposal. A function that does a shallow flatten using the "slightly quirky" (IMO) behaviour of Array.concat is pretty simple to write:

function flattenShallow(arr) {
	return arr.reduce(
	  (acc, item) => acc.concat(item),
	  [])
}

although to be fair, deep flatten isn't much more complicated:

function flattenDeep(arr) {
	return arr.reduce(
	  (acc, item) => acc.concat(Array.isArray(item) ? item.flattenDeep() : item),
	  [])
}

So maybe it's about performance: all those intermediate concats and the garbage they create that has to be collected. Fortunately, shallow flatten has an existing solution (using the same quirky behaviour of concat):

function flattenShallow(arr) {
	return [].concat.apply([], arr)
}

Can an implementation based on the Array.flatten(depth) proposal do better than this?

Unfortunately I don't think the deep flatten case has a similarly performant solution - correct me if I'm wrong. And it's also recursive, so, absent TRO (tail recursion optimization), it can get quite expensive in relative terms. So it could really benefit from an optimized built-in, although that seems to have lower priority in this discussion.
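For what it's worth, a userland deep flatten doesn't have to be recursive or build intermediate arrays via concat; a minimal stack-based sketch (flattenDeepIterative is just an illustrative name):

function flattenDeepIterative(arr) {
	const result = [];
	// Each stack frame is [array, nextIndex], so traversal can resume
	// without recursion and without allocating intermediate arrays.
	const stack = [[arr, 0]];
	while (stack.length > 0) {
	  const frame = stack[stack.length - 1];
	  if (frame[1] >= frame[0].length) {
	    stack.pop();
	    continue;
	  }
	  const item = frame[0][frame[1]++];
	  if (Array.isArray(item)) {
	    stack.push([item, 0]);
	  } else {
	    result.push(item);
	  }
	}
	return result;
}

flattenDeepIterative([1, [2, [3, [4]]], 5]); // [1, 2, 3, 4, 5]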

Apologies if this comes off as a rant. It's just the default we're talking about and it's not a huge deal; more of a missed opportunity (IMO).

@ridgeworks

Sorry, I mixed my metaphors in the previous post. Deep flatten should be:

function flattenDeep(arr) {
	return arr.reduce(
	  (acc, item) => acc.concat(Array.isArray(item) ? flattenDeep(item) : item),
	  [])
}

@appsforartists

In light of the MooTools debacle, can we consider reopening this? Infinity seems like a reasonable expectation for the default depth of something called flatten, and avoids the choice between web compat problems and grossness like [].smoosh() (#56).

@js-choi commented Mar 7, 2018

For what it’s worth, if @timwienk’s comment in mootools/mootools-core#2797 (comment) is correct, then flatten-with-infinite-depth vs. rename-flatten-but-keep-shallow-depth might be a false dilemma:

It should work in such a way - but I'll have to double check if that goes for all versions (do you know off the top of your head @cpojer ?) - that old versions of MooTools Core just overwrite a native method if it exists.

If this is true, then flatten-with-shallow-depth may still be viable. Don’t know.

@bakkot (Contributor) commented Mar 7, 2018

It turns out the difference in default depth is not the issue with MooTools; see here. So I don't think this needs to be revisited.

@appsforartists

😢

@authenticsebastian

I would be in favour of deep flattening, too. Seems more intuitive.

@littledan (Member)

@bakkot Thanks for your detailed investigation into the issue with MooTools. I'd give a 😱 react if I could.

Would we solve the compatibility problem if we made flatten both deep and enumerable? It's a bit of a hack, but it's much smaller than @@unscopables. Plus, many embedding environments (e.g., the web platform) have tons of enumerable methods, so it's not so unusual in the ecosystem.

@domenic (Member) commented Mar 8, 2018

Enumerable methods on Array.prototype cause for-in loops to get confused, so unless we add @@unforinables at the same time, that doesn't quite work.
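To illustrate the for-in problem (flattenDemo is a hypothetical name used only for this sketch):

// Built-in Array.prototype methods are non-enumerable, so for-in skips them.
// An enumerable one would leak into every for-in over any array:
Object.defineProperty(Array.prototype, 'flattenDemo', {
	value: function () { /* ... */ },
	writable: true,
	configurable: true,
	enumerable: true // the hack under discussion
});

for (const key in [10, 20]) {
	console.log(key); // "0", "1", and then "flattenDemo"
}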

@Jerczu commented Mar 8, 2018

In light of the MooTools debacle, can we consider reopening this?

Shouldn't anyone who complains that this will break MooTools be filing an issue with MooTools (https://github.com/mootools/mootools-core) rather than asking the language to accommodate their library choices? If they are developers... staying on top of changes is kinda what we get paid for. Right?

@littledan (Member)

@domenic Oh, right, of course. Never mind.

@jgaskins commented Mar 9, 2018

The thing that I find confusing about a default depth of 1 is that the name flatten implies it returns a flat array, rather than merely removing a single layer of nesting. Some people really seem to prefer to keep the default of 1, and I get that, but then what about changing the method name to be more accurate/intuitive? Maybe unnest? unwrap? unpack?

@wycats commented Mar 9, 2018

@jgaskins I quite like unwrap for the one layer use case.

@dcporter

I would definitely get behind “flatten” defaulting to Infinity and “unwrap” defaulting to 1, or either one if we didn’t want to double the surface area.

@tc39 locked as resolved and limited conversation to collaborators Jun 11, 2019