How to perform a lengthy process sequentially with RxJava?

I have a large list of strings to check with the remote API.

```java
Observable.from(strings)    // `strings` is a List<String> with > 5000 items
    .buffer(50)             // split into 50-item chunks: emits Observable<List<String>> (fast)
    .flatMap(chunk -> {
        // checkPhoneNumbers is a network call using Retrofit's RxJava support (slow)
        return mSyncApi.checkPhoneNumbers(chunk);
    })
    .reduce( ... )          // aggregate all checking results
    .subscribe( ... );
```

The problem is that buffer() emits the List<String> chunks so quickly that all the .checkPhoneNumbers() calls execute almost simultaneously. What I would like to achieve is to serialize the .checkPhoneNumbers() calls, to better support devices with a slow connection.

Throttling the emitted List<String> at a predefined time interval does not make sense, since it would penalize devices with a lightning-fast connection. I tried RxJava's serialize() right after flatMap(), but that doesn't seem to make any difference (although I'm not sure serialize() is even the right operator to use here).

Any alternative approaches appreciated! Thank you

1 answer

As @zsxwing said, I think the maxConcurrent overload of flatMap is what you are looking for if you are trying to limit the concurrency that happens inside flatMap.

For example: https://gist.github.com/benjchristensen/a0350776a595fd6e3810#file-parallelexecution-java-L78

```java
private static void flatMapBufferedExampleAsync() {
    final AtomicInteger total = new AtomicInteger();
    Observable.range(0, 500000000)
        .doOnNext(i -> total.incrementAndGet())
        .buffer(100)
        .doOnNext(i -> System.out.println("emit " + i))
        .flatMap(i -> {
            return Observable.from(i).subscribeOn(Schedulers.computation()).map(item -> {
                // simulate computational work
                try {
                    Thread.sleep(10);
                } catch (Exception e) {
                }
                return item + " processed " + Thread.currentThread();
            });
        }, 2 /* limit concurrency to 2 */) // <--- note the maxConcurrent argument here
        .toBlocking().forEach(System.out::println);
    System.out.println("total emitted: " + total.get());
}
```
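Applied to the question's pipeline, the same technique would look roughly like the sketch below. This is a minimal, self-contained sketch assuming RxJava 1.x: `checkPhoneNumbers` here is a hypothetical stand-in that simulates the Retrofit network call with a short sleep, and the chunk size is shrunk for illustration. Passing 1 as the maxConcurrent argument means at most one chunk is being checked at any moment, so chunks are processed strictly one after another; a fast connection still runs at full speed because there is no artificial time-based throttling.

```java
import java.util.Arrays;
import java.util.List;

import rx.Observable;
import rx.schedulers.Schedulers;

public class SequentialCheck {

    // Hypothetical stand-in for mSyncApi.checkPhoneNumbers: simulates a slow
    // network call by sleeping briefly, then echoes the chunk back as the result.
    static Observable<List<String>> checkPhoneNumbers(List<String> chunk) {
        return Observable.fromCallable(() -> {
            Thread.sleep(50); // simulated network latency
            return chunk;
        }).subscribeOn(Schedulers.io());
    }

    static List<List<String>> runSequentially(List<String> numbers, int chunkSize) {
        return Observable.from(numbers)
            .buffer(chunkSize)
            // maxConcurrent = 1: only one checkPhoneNumbers call in flight at a time
            .flatMap(SequentialCheck::checkPhoneNumbers, 1)
            .toList()
            .toBlocking()
            .single();
    }

    public static void main(String[] args) {
        List<String> numbers = Arrays.asList("111", "222", "333", "444", "555");
        System.out.println(runSequentially(numbers, 2));
    }
}
```

Note that with maxConcurrent = 1 this behaves like concatMap, which is the other idiomatic way to express "strictly sequential" in RxJava; raising the value (e.g. to 2 or 3) trades some ordering for throughput on faster connections.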

Source: https://habr.com/ru/post/1275397/
