# android-architecture
ursus

04/15/2019, 11:57 PM
and if having it single-threaded would not actually be better perf-wise
gildor

04/16/2019, 1:57 AM
Most probably it will be much better if your server is fast enough, because IO is the bottleneck in most cases
ursus

04/16/2019, 3:13 AM
Not sure if I understand. Are you saying a single-threaded queue will be better because the parallel downloads will compete for bandwidth?
i.e. 2 in parallel will both run at half the speed of 1 single?
gildor

04/16/2019, 3:17 AM
No, I'm not suggesting a single-threaded queue
ursus

04/16/2019, 3:18 AM
and by IO you mean reading/writing the socket?
gildor

04/16/2019, 3:18 AM
Yes, bandwidth may be a bottleneck. There is no simple answer, but in general, most probably, running multiple IO operations in parallel would be better than doing them one by one,
yes
Of course it depends on the particular use case and server, but in the general case it's a good strategy
ursus

04/16/2019, 3:19 AM
naively, do you think they would run at the same speed per download (parallel vs single)?
gildor

04/16/2019, 3:19 AM
It really depends on the network and the server
ursus

04/16/2019, 3:20 AM
I know, but in general, for some average Apache or something like that. Whether parallel downloads could noticeably degrade the speed is what I'm getting at
gildor

04/16/2019, 3:21 AM
In general it's better to do that in parallel, I believe, if your target goal is "download all files". If your use case is "make any file available as soon as possible", then sequential download is probably the better strategy
ursus

04/16/2019, 3:23 AM
well, that depends on the overhead, since with sequential you wait in the queue, which I suppose will be way longer than the parallel download time + overhead?
gildor

04/16/2019, 3:24 AM
as I said, it depends on your use case. Do you want to download everything as fast as possible? Then do it in parallel, with some reasonable limitations (not more than n downloads in parallel). If your goal is to provide access to some of the files even before all of them are downloaded, then use sequential downloading, or even more limited parallelism
This is not about any low-level stuff, it's about the user story and use case
There is no significant overhead to doing it in parallel, even with blocking threads (and even less if you use non-blocking networking); the networking overhead is much higher, so in most cases you can just ignore it
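A minimal sketch of that "parallel with a cap of n" strategy, using a plain JVM fixed thread pool; `downloadFile` and its return value are hypothetical placeholders, not a real networking call:

```kotlin
import java.util.concurrent.Executors
import java.util.concurrent.TimeUnit

// Hypothetical download step; a real implementation would stream the
// response body to local storage.
fun downloadFile(url: String): String = "saved:$url"

// Download all files with "not more than n downloads in parallel".
fun downloadAll(urls: List<String>, parallelism: Int = 4): List<String> {
    val pool = Executors.newFixedThreadPool(parallelism)
    try {
        // Submit every download; the pool caps how many run at once.
        val futures = urls.map { url -> pool.submit<String> { downloadFile(url) } }
        return futures.map { it.get() } // block until each download finishes
    } finally {
        pool.shutdown()
        pool.awaitTermination(1, TimeUnit.MINUTES)
    }
}
```

With `parallelism = 1` the same code degenerates into the sequential queue discussed earlier.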
ursus

04/16/2019, 3:26 AM
Ok ok, yes, the use case is a mix of both: it's downloading of files in a messenger, i.e. you click download on a file message
and it fetches the file to local storage, as all messengers do
gildor

04/16/2019, 3:28 AM
For the messenger use case I would download sequentially or with small parallelism; no need to download everything, the user sees only the last few files at the same time
ursus

04/16/2019, 3:28 AM
Ok thanks, I was just wondering whether, for n large downloads, the speed would drop to 1/n or something, i.e. whether there is some fixed total IO amount per OS or something

gildor

04/16/2019, 3:28 AM
It really depends on many factors

ursus

04/16/2019, 3:29 AM
yes sure, n is low like idk .. 10 maybe
gildor

04/16/2019, 3:29 AM
doing this in a perfectly optimized way is very tricky, so it's better to choose some "good enough for most cases" strategy
ursus

04/16/2019, 3:30 AM
so batches of, say, 5 in parallel sound reasonable for a generic user-driven messenger UI use case
gildor

04/16/2019, 3:30 AM
if you want to sync everything, yes, but I'm not sure that it's the best strategy for a messenger
you can just download everything starting from the newest messages
ursus

04/16/2019, 3:31 AM
Not really syncing, I'm thinking of something like Slack: you'd post a file and the other user clicks on it to download it locally
gildor

04/16/2019, 3:32 AM
so it's by request? if so, just do this sequentially or with small parallelism, like 3-5
ursus

04/16/2019, 3:34 AM
yes, by user request, not automatically (although there will be some automation when restarting the app and it recognizes some downloads didn't finish)
and what would be your reasoning for 3-5?
gildor

04/16/2019, 3:35 AM
User story
If the user presses download on a few files, they would probably like to open them as fast as possible
parallelism here is to improve overall download speed a bit if many downloads are requested
ursus

04/16/2019, 3:36 AM
I'm confused now 😄 so is there a time penalty per download with such parallelism vs single-threaded?
gildor

04/16/2019, 3:36 AM
No, it's about the user story
If I requested download of 10 files

ursus

04/16/2019, 3:37 AM
yes, and they all start downloading, each on its own thread
gildor

04/16/2019, 3:37 AM
I would probably like to open at least one of them as fast as possible
ursus

04/16/2019, 3:37 AM
and if each download eats like a 5% penalty for parallelism, then that's fine, as opposed to waiting in the queue for the 10th, no?
gildor

04/16/2019, 3:37 AM
It's not about the penalty
ursus

04/16/2019, 3:37 AM
should be way faster, and the first will be only 5% slower
it's 100, 200, 300 vs 105, 105, 105, no? (given the randomly chosen 5%)
gildor

04/16/2019, 3:38 AM
As a user I would like to get my first file faster even if the overall download speed is slower, do you see what I mean?
yes, like this
ursus

04/16/2019, 3:39 AM
true, but 5% is negligible .. 20-30% would not be
gildor

04/16/2019, 3:39 AM
but because in most cases bandwidth is the bottleneck, you will get a bigger difference
ursus

04/16/2019, 3:40 AM
I'm not sure I understand where the bottleneck is, do you mean like the phone's total bandwidth?
gildor

04/16/2019, 3:40 AM
so in perfect conditions, if you have exactly 1 MB per second, then to download two 1 MB files you need 2 seconds
Not only the phone, it may also be the server's per-connection bandwidth
ursus

04/16/2019, 3:41 AM
I'm a total noob at this, but naively I'm thinking: if the pipe is, let's say, 1 MB/s wide
then that means 1 MB/s speed for a single download
and 0.5 MB/s for 2 in parallel, etc, no?

gildor

04/16/2019, 3:41 AM
yes
correct
there are more factors involved, but yes
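The simple model being agreed on here (a fixed-width pipe split evenly across active downloads) can be written out as arithmetic; the numbers are illustrative only:

```kotlin
// Idealized model: a fixed-bandwidth pipe shared evenly by all active downloads.
// fileSeconds = time one file takes when it has the whole pipe to itself.

// Sequentially, file k finishes at k * fileSeconds: 100, 200, 300, ...
fun sequentialFinishTimes(n: Int, fileSeconds: Double): List<Double> =
    (1..n).map { it * fileSeconds }

// In parallel, each download gets 1/n of the pipe, so all n finish together.
fun parallelFinishTimes(n: Int, fileSeconds: Double): List<Double> =
    List(n) { n * fileSeconds }
```

For 3 files of 100 s each this gives 100/200/300 sequentially vs 300/300/300 in parallel: the total time is identical in the idealized model, but sequential delivers the first file much sooner, which is the user-story trade-off being discussed.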
ursus

04/16/2019, 3:42 AM
okay I see, so the total speed is constant and parallelism just gets fractions of it
gildor

04/16/2019, 3:42 AM
for the simple case it works like that
yes!
ursus

04/16/2019, 3:42 AM
okay, now I understand what you meant about wanting your first file
gildor

04/16/2019, 3:42 AM
but in practice parallel download is usually faster, because you also pay for local IO, file processing, connection timeouts, problems with particular files' download speed, etc

ursus

04/16/2019, 3:43 AM
since in theory the last file will download in exactly the same time
gildor

04/16/2019, 3:43 AM
so usually the parallel download result is better
but for such a use case it's not so important to download all the files; the first file is much more important
ursus

04/16/2019, 3:44 AM
you mean faster to download all the files, but speed per file is still going to be 1/n?
+-
Chrome seems to download in parallel, maybe you can leave it up to the user to only download a single file at a time for the fastest speed per file 😄
gildor

04/16/2019, 3:46 AM
Chrome has a pretty different use case
ursus

04/16/2019, 3:47 AM
not really, you click 3 different download links, you download 3 files in parallel, those 2 don't wait in the queue
gildor

04/16/2019, 3:47 AM
Chrome usually downloads files from multiple websites over your home wifi, where your local bandwidth is usually wider than the remote web service's
in your use case you probably download from your own server
ursus

04/16/2019, 3:48 AM
yea, does that make a difference if it's a different TCP connection, a different socket?
gildor

04/16/2019, 3:48 AM
sure
for example, by default a web browser limits the number of connections to the same host (about 5 parallel connections)
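A per-host cap like that can be sketched with a counting semaphore; the class name and the default limit of 5 are assumptions for illustration, not a real browser internal:

```kotlin
import java.util.concurrent.Semaphore

// Caps how many connections to one host may be open at the same time,
// similar to a browser's per-host connection limit.
class HostConnectionLimiter(maxPerHost: Int = 5) {
    private val permits = Semaphore(maxPerHost)

    fun <T> withConnection(block: () -> T): T {
        permits.acquire() // blocks once maxPerHost connections are in flight
        try {
            return block()
        } finally {
            permits.release()
        }
    }
}
```

On Android, OkHttp exposes a similar knob on its `Dispatcher` (`maxRequestsPerHost`), which also defaults to 5.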
ursus

04/16/2019, 3:49 AM
hm, but isn't it the same, in the server's eyes, for 1 phone to have 5 downloads/connections vs 5 phones each having a single connection?
it's 5 connections total and that's it, same load, no?
gildor

04/16/2019, 3:50 AM
no, because downloading 5 files in parallel is more efficient than one by one
Because the browser's goal is to download ALL web page resources as fast as possible to render the page
ursus

04/16/2019, 3:51 AM
that is what I'm saying, I mean a regular server can handle hundreds of parallel connections for sure, right?
gildor

04/16/2019, 3:51 AM
Yes
But this is just the default strategy
because a connection is actually a pretty heavyweight thing for a server
the browser just tries to be a good citizen
ursus

04/16/2019, 3:52 AM
so by that logic I'd expect it to be the same thing in terms of the backend, whether it's a single client making hundreds of connections vs hundreds of clients each making a single connection?
gildor

04/16/2019, 3:52 AM
there is no difference
but it's better to support hundreds of clients than 1 client
ursus

04/16/2019, 3:53 AM
hm, true
ok, thank you very much
One more thing I want to ask: how does upload play into this? Is it the same in terms of dividing bandwidth when parallel? i.e. a connection is a connection and that's it?
gildor

04/17/2019, 1:33 AM
Yes, but if the web server doesn't support resuming uploads, it would probably be safer to do this sequentially
ursus

04/17/2019, 4:06 AM
how so? I mean sure, but resuming can also be done on downloads
gildor

04/17/2019, 4:56 AM
In 99% of cases nobody supports resuming of downloads or uploads
Usually it's just a better strategy to do uploads one by one