# coroutines
a
A simple question, but I don't know how to solve this. I have a `Flow<List<Foo>>` where `Foo` also contains a `Flow`:
`interface Foo { val name: String; val flow: Flow<Long> }`
How can I transform `Flow<List<Foo>>` to something like `Flow<List<Pair<String, Long>>>`, where the pair is based on the name and the inner flow value?
s
I guess you’re looking for `flatMap`
a
I guess, but are these concatenated or merged? I have a hard time understanding how that works.
I made a typo
I think I need `combine`, probably?
s
Flow has two versions of `flatMap`: `flatMapMerge` and `flatMapConcat`. Which one you want is going to depend on your specific case.
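(Not from the original thread, but a minimal sketch of the difference, using made-up inner flows — the names and delays are purely illustrative.)

```kotlin
import kotlinx.coroutines.ExperimentalCoroutinesApi
import kotlinx.coroutines.delay
import kotlinx.coroutines.flow.flatMapConcat
import kotlinx.coroutines.flow.flatMapMerge
import kotlinx.coroutines.flow.flow
import kotlinx.coroutines.flow.flowOf
import kotlinx.coroutines.flow.toList
import kotlinx.coroutines.runBlocking

@OptIn(ExperimentalCoroutinesApi::class)
fun main() = runBlocking {
    // Each outer value expands into a small inner flow that takes some time to finish.
    fun inner(x: Int) = flow {
        emit("$x-first")
        delay(100)
        emit("$x-second")
    }

    // flatMapConcat collects each inner flow to completion before starting the next,
    // so the elements stay grouped: [1-first, 1-second, 2-first, 2-second]
    println(flowOf(1, 2).flatMapConcat { inner(it) }.toList())

    // flatMapMerge collects the inner flows concurrently, so emissions interleave,
    // typically: [1-first, 2-first, 1-second, 2-second]
    println(flowOf(1, 2).flatMapMerge { inner(it) }.toList())
}
```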
a
But it returns a `Flow<R>`, I need a `Flow<List<R>>`
s
Oh… I guess that was the typo that you fixed after I replied 😄
a
Yes 😄
s
Should the new flow emit just one value for each `Foo`, or should it emit a new value each time `Foo.flow` emits? 🤔
a
A new value each time a `Foo.flow` emits. So I want to receive a whole new `List`
s
When the `Flow<List<Foo>>` emits a new list, should that replace all of the previous `Foo` flows?
a
I have this now which seems to work:
```kotlin
val state = fooListFlow.flatMapLatest { fooFlows: List<Foo> ->
    combine(
        fooFlows.map { foo: Foo ->
            foo.flow.map { inner: Long ->
                Pair(foo.name, inner)
            }
        }
    ) {
        it.toList()
    }
}
```
Don't know if it is very efficient
s
That looks like about what I would expect, based on what you described 👍
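(For reference, a self-contained sketch of the same pattern. `CounterFoo` and the `MutableStateFlow` backing are hypothetical, just to make it runnable and show the shape of the output.)

```kotlin
import kotlinx.coroutines.ExperimentalCoroutinesApi
import kotlinx.coroutines.flow.Flow
import kotlinx.coroutines.flow.MutableStateFlow
import kotlinx.coroutines.flow.combine
import kotlinx.coroutines.flow.first
import kotlinx.coroutines.flow.flatMapLatest
import kotlinx.coroutines.flow.flowOf
import kotlinx.coroutines.flow.map
import kotlinx.coroutines.runBlocking

interface Foo { val name: String; val flow: Flow<Long> }

// Hypothetical Foo implementation backed by a MutableStateFlow.
class CounterFoo(override val name: String) : Foo {
    val backing = MutableStateFlow(0L)
    override val flow: Flow<Long> get() = backing
}

@OptIn(ExperimentalCoroutinesApi::class)
fun main() = runBlocking {
    val a = CounterFoo("a")
    val b = CounterFoo("b")
    val fooListFlow: Flow<List<Foo>> = flowOf(listOf(a, b))

    // Same shape as above: one combined List<Pair<String, Long>> per inner emission.
    val state: Flow<List<Pair<String, Long>>> = fooListFlow.flatMapLatest { foos ->
        combine(foos.map { foo -> foo.flow.map { value -> foo.name to value } }) { it.toList() }
    }

    println(state.first())   // [(a, 0), (b, 0)]
    a.backing.value = 5L
    println(state.first())   // [(a, 5), (b, 0)]
}
```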
a
`flatMapLatest` / `flatMapMerge` / `flatMapConcat` doesn't make any difference though
s
The difference would probably not be apparent unless one of the ‘old’ inner flows emits a new value after it has been superseded by a new emission from the outer flow. `flatMapLatest` will stop collecting the existing flows each time it gets new ones, but `flatMapMerge` and `flatMapConcat` will just keep on collecting all of them until they terminate.
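(Again not from the thread, but a small sketch of that cancellation behaviour, with made-up timings: the outer flow emits a second value while the first inner flow is still running.)

```kotlin
import kotlinx.coroutines.ExperimentalCoroutinesApi
import kotlinx.coroutines.delay
import kotlinx.coroutines.flow.flatMapLatest
import kotlinx.coroutines.flow.flatMapMerge
import kotlinx.coroutines.flow.flow
import kotlinx.coroutines.flow.toList
import kotlinx.coroutines.runBlocking

@OptIn(ExperimentalCoroutinesApi::class)
fun main() = runBlocking {
    // The outer flow emits 2 while inner(1) is still running.
    val outer = flow {
        emit(1)
        delay(50)
        emit(2)
    }

    fun inner(x: Int) = flow {
        emit("$x-early")
        delay(200)           // the "old" inner flow would emit again here
        emit("$x-late")
    }

    // flatMapLatest cancels inner(1) when 2 arrives, so "1-late" is never emitted:
    // [1-early, 2-early, 2-late]
    println(outer.flatMapLatest { inner(it) }.toList())

    // flatMapMerge keeps collecting the superseded inner(1), so "1-late" still shows up:
    // [1-early, 2-early, 1-late, 2-late]
    println(outer.flatMapMerge { inner(it) }.toList())
}
```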
a
`flatMapLatest` makes the most sense then I guess. Thanks!