One of the core issues across computer and computational science today is adapting to, managing, and learning from the influx of "Big Data". In the commercial space, this problem …
Scientific simulations on high performance computing (HPC) platforms generate large quantities of data. To bridge the widening gap between compute and I/O, and enable data to …
As scientific workflows increasingly use extreme-scale resources, the imbalance between higher computational capabilities, generated data volumes, and available I/O bandwidth is …
Scientific visualization for exascale computing is very likely to require in situ processing. Traditional simulation checkpointing and post hoc visualization will likely be unsustainable …
As we continue toward exascale, scientific data volume is continuing to scale and becoming more burdensome to manage. In this paper, we lay out opportunities to enhance state of the …
With this work, we explore the feasibility of using in situ data binning techniques to achieve significant data reductions for particle data, and study the associated errors for several post …
In situ visualization is increasingly necessary to address I/O limitations on supercomputers. With the increasing heterogeneity of supercomputer design, efficient and cost-effective use …
In-situ and in-transit processing alleviate the gap between the computing and I/O capabilities by scheduling data analytics close to the data source. Hybrid in-situ processing splits data …