The concept of a scientific workflow makes it possible to link and control different tasks to carry out complex processing. Complicated workflows are generated by scientific distributed …
Task scheduling is a crucial component for the efficient execution of data-intensive applications in distributed environments, by which many machines must be coordinated to …
The extensive use of HPC infrastructures and frameworks for running data-intensive applications has led to a growing interest in data partitioning techniques and strategies. In …
High-level programming models can help application developers access and use resources without needing to manage low-level architectural entities, as a parallel …
Workflows are widely used to orchestrate the complex sets of operations required to handle and process huge amounts of data. Parallel processing is often vital to reduce execution time …
J Jin, Q An, W Zhou, J Tang, R Xiong - Applied Sciences, 2018 - mdpi.com
Featured Application This work is applicable to most state-of-the-art data-parallel frameworks, such as Hadoop, Spark, Pregel, and TensorFlow, to improve task-scheduling …
J Xu, J Wang, Q Qi, J Liao, H Sun, Z Han, T Li - Journal of Network and …, 2021 - Elsevier
A new metric that plays a vital role in evaluating cloud services is the multi-application makespan. There are usually multiple applications without deadlines in the cloud, while the …
Pharmacogenomics is an important research field that studies the impact of genetic variation of patients on drug responses, looking for correlations between single nucleotide …
J Carretero, D Exposito, A Cascajo… - … Conference on Parallel …, 2022 - Springer
The current static usage model of HPC systems is becoming increasingly inefficient due to the continuously growing complexity of system architectures, combined with the …