High Performance Tensor Computations in Scientific Computing and Data Science

High-Performance Scientific Computing

Author : Michael W. Berry
Publisher : Springer Science & Business Media
Page : 351 pages
File Size : 18,32 MB
Release : 2012-01-18
Category : Computers
ISBN : 1447124375

This book presents the state of the art in parallel numerical algorithms, applications, architectures, and system software. The book examines various solutions for issues of concurrency, scale, energy efficiency, and programmability, which are discussed in the context of a diverse range of applications. Features: includes contributions from an international selection of world-class authorities; examines parallel algorithm-architecture interaction through issues of computational capacity-based codesign and automatic restructuring of programs using compilation techniques; reviews emerging applications of numerical methods in information retrieval and data mining; discusses the latest issues in dense and sparse matrix computations for modern high-performance systems, multicores, manycores and GPUs, and several perspectives on the Spike family of algorithms for solving linear systems; presents outstanding challenges and developing technologies, and puts these in their historical context.
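
For orientation, the sparse matrix computations discussed in several of these chapters ultimately revolve around kernels like the compressed sparse row (CSR) matrix-vector product sketched below in Python. This is a generic, minimal illustration of the data structure, not code from the book.

```python
import numpy as np

def csr_matvec(data, indices, indptr, x):
    """y = A @ x for a matrix A stored in CSR (compressed sparse row) form."""
    n_rows = len(indptr) - 1
    y = np.zeros(n_rows)
    for i in range(n_rows):                          # one output entry per row
        for k in range(indptr[i], indptr[i + 1]):    # nonzeros of row i
            y[i] += data[k] * x[indices[k]]
    return y

# 3x3 example: A = [[4, 0, 1], [0, 2, 0], [3, 0, 5]]
data    = np.array([4.0, 1.0, 2.0, 3.0, 5.0])
indices = np.array([0, 2, 1, 0, 2])
indptr  = np.array([0, 2, 3, 5])
x = np.array([1.0, 1.0, 1.0])
print(csr_matvec(data, indices, indptr, x))          # [5. 2. 8.]
```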

User-Defined Tensor Data Analysis

Author : Bin Dong
Publisher : Springer Nature
Page : 111 pages
File Size : 11,58 MB
Release : 2021-09-29
Category : Computers
ISBN : 3030707504

This SpringerBrief introduces FasTensor, a powerful parallel data programming model developed for big data applications, and provides a user's guide for installing and using it. FasTensor enables users to easily express many data analysis operations, whether they come from neural networks, scientific computing, or queries in traditional database management systems (DBMS). It frees users from the underlying and tedious data management tasks, such as data partitioning, communication, and parallel execution. The book gives a high-level overview of the state of the art in parallel data programming models and motivates the design of FasTensor. It illustrates the FasTensor application programming interface (API) with an abundance of examples and two real use cases from cutting-edge scientific applications. FasTensor can achieve multiple orders of magnitude speedup over Spark and other peer systems in executing big data analysis operations, making large-scale data analysis programming on supercomputers as productive and efficient as possible. As a complete reference for FasTensor, the book covers its theoretical foundations, C++ implementation, and usage in applications. Scientists in domains such as the physical sciences and geosciences who analyze large amounts of data will want this SpringerBrief, and data engineers who design and develop data analysis software, as well as data scientists who use Spark or TensorFlow for analyses such as training deep neural networks, will also find it useful as a reference tool.
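
To make the "express the operation, let the framework handle the data management" idea concrete, here is a toy Python sketch of a user-defined function applied over a padded window of an array. The names (apply_stencil, halo) are hypothetical and chosen only for illustration of the programming style; they are not the actual FasTensor C++ API.

```python
import numpy as np

def apply_stencil(data, udf, halo):
    """Apply a user-defined cell function over a 1-D array.
    Hypothetical illustration of the chunk-and-apply style: a framework such as
    FasTensor would hide the partitioning, halo exchange, and parallel execution
    that this toy version does in plain NumPy."""
    padded = np.pad(data, halo, mode="edge")          # stand-in for halo exchange
    out = np.empty_like(data)
    for i in range(len(data)):
        # the UDF only ever sees a small window centred on cell i
        out[i] = udf(padded[i:i + 2 * halo + 1])
    return out

signal = np.arange(10, dtype=float)
print(apply_stencil(signal, udf=np.mean, halo=1))     # 3-point moving average
```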

Tensor Computation for Data Analysis

Author : Yipeng Liu
Publisher : Springer Nature
Page : 347 pages
File Size : 42,91 MB
Release : 2021-08-31
Category : Technology & Engineering
ISBN : 3030743861

Tensors are a natural representation for multi-dimensional data, and tensor computation can avoid the loss of multi-linear structure that can occur in classical matrix-computation-based data analysis. This book is intended to give non-specialists an overall understanding of tensor computation and its applications in data analysis, and to benefit researchers, engineers, and students with theoretical, computational, technical, and experimental details. It presents a systematic and up-to-date overview of tensor decompositions from the engineer's point of view, together with comprehensive coverage of tensor-computation-based data analysis techniques. In addition, practical examples in machine learning, signal processing, data mining, computer vision, remote sensing, and biomedical engineering are presented for easy understanding and implementation. These data analysis techniques may be further applied in other fields such as neuroscience, communication, psychometrics, chemometrics, biometrics, quantum physics, and quantum chemistry. The discussion begins with basic coverage of notation, preliminary operations in tensor computation, and the main tensor decompositions and their properties. Building on these, a series of tensor-based data analysis techniques are presented as the tensor extensions of their classical matrix counterparts, including tensor dictionary learning, low-rank tensor recovery, tensor completion, coupled tensor analysis, robust principal tensor component analysis, tensor regression, logistic tensor regression, support tensor machines, multilinear discriminant analysis, tensor subspace clustering, tensor-based deep learning, tensor graphical models, and tensor sketch. The discussion also includes a number of typical applications with experimental results, such as image reconstruction, image enhancement, data fusion, signal recovery, recommendation systems, knowledge graph acquisition, traffic flow prediction, link prediction, environmental prediction, weather forecasting, background extraction, human pose estimation, cognitive state classification from fMRI, infrared small target detection, heterogeneous information network clustering, multi-view image clustering, and deep neural network compression.
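
As a concrete entry point to the decompositions such techniques build on, the sketch below fits a canonical polyadic (CP) decomposition of a 3-way tensor by alternating least squares using only NumPy. It is a minimal illustration of the standard CP-ALS technique under its usual assumptions, not code from the book.

```python
import numpy as np

def khatri_rao(U, V):
    """Column-wise Kronecker product: row (i, j) of the result is U[i] * V[j]."""
    return np.einsum('ir,jr->ijr', U, V).reshape(-1, U.shape[1])

def unfold(X, mode):
    """Mode-n unfolding consistent with the khatri_rao ordering above."""
    return np.moveaxis(X, mode, 0).reshape(X.shape[mode], -1)

def cp_als(X, rank, n_iter=200, seed=0):
    """Rank-`rank` CP decomposition of a 3-way tensor by alternating least squares."""
    rng = np.random.default_rng(seed)
    A, B, C = [rng.standard_normal((n, rank)) for n in X.shape]
    for _ in range(n_iter):
        # each factor update is a linear least-squares solve with the others fixed
        A = unfold(X, 0) @ khatri_rao(B, C) @ np.linalg.pinv((B.T @ B) * (C.T @ C))
        B = unfold(X, 1) @ khatri_rao(A, C) @ np.linalg.pinv((A.T @ A) * (C.T @ C))
        C = unfold(X, 2) @ khatri_rao(A, B) @ np.linalg.pinv((A.T @ A) * (B.T @ B))
    return A, B, C

# sanity check on a synthetic, exactly rank-2 tensor
rng = np.random.default_rng(1)
A0, B0, C0 = [rng.standard_normal((n, 2)) for n in (4, 5, 6)]
X = np.einsum('ir,jr,kr->ijk', A0, B0, C0)
A, B, C = cp_als(X, rank=2)
print(np.linalg.norm(X - np.einsum('ir,jr,kr->ijk', A, B, C)))  # residual should be tiny here
```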

Computational Science – ICCS 2019

Author : João M. F. Rodrigues
Publisher : Springer
Page : 659 pages
File Size : 16,11 MB
Release : 2019-06-07
Category : Computers
ISBN : 3030227340

The five-volume set LNCS 11536, 11537, 11538, 11539, and 11540 constitutes the proceedings of the 19th International Conference on Computational Science, ICCS 2019, held in Faro, Portugal, in June 2019. The 65 full papers and 168 workshop papers presented in this book set were carefully reviewed and selected from 573 submissions (228 submissions to the main track and 345 to the workshops). The papers are organized in the following topical sections:
Part I: ICCS Main Track.
Part II: ICCS Main Track; Track of Advances in High-Performance Computational Earth Sciences: Applications and Frameworks; Track of Agent-Based Simulations, Adaptive Algorithms and Solvers; Track of Applications of Matrix Methods in Artificial Intelligence and Machine Learning; Track of Architecture, Languages, Compilation and Hardware Support for Emerging and Heterogeneous Systems.
Part III: Track of Biomedical and Bioinformatics Challenges for Computer Science; Track of Classifier Learning from Difficult Data; Track of Computational Finance and Business Intelligence; Track of Computational Optimization, Modelling and Simulation; Track of Computational Science in IoT and Smart Systems.
Part IV: Track of Data-Driven Computational Sciences; Track of Machine Learning and Data Assimilation for Dynamical Systems; Track of Marine Computing in the Interconnected World for the Benefit of the Society; Track of Multiscale Modelling and Simulation; Track of Simulations of Flow and Transport: Modeling, Algorithms and Computation.
Part V: Track of Smart Systems: Computer Vision, Sensor Networks and Machine Learning; Track of Solving Problems with Uncertainties; Track of Teaching Computational Science; Poster Track.
The chapter "Comparing Domain-decomposition Methods for the Parallelization of Distributed Land Surface Models" is available open access under a Creative Commons Attribution 4.0 International License via link.springer.com.

Tensor Numerical Methods in Quantum Chemistry

Author : Venera Khoromskaia
Publisher : Walter de Gruyter GmbH & Co KG
Page : 343 pages
File Size : 32,13 MB
Release : 2018-06-11
Category : Mathematics
ISBN : 3110391376

Conventional numerical methods, when applied to multidimensional problems, suffer from the so-called "curse of dimensionality", which cannot be eliminated by parallel architectures or high-performance computing alone. The novel tensor numerical methods are based on a "smart" rank-structured tensor representation of the multivariate functions and operators discretized on Cartesian grids, reducing the solution of multidimensional integral-differential equations to 1D calculations. We explain basic tensor formats and algorithms and show how the orthogonal Tucker tensor decomposition, originating from chemometrics, made a revolution in numerical analysis, relying on rigorous results from approximation theory. The benefits of the tensor approach are demonstrated in ab initio electronic structure calculations. Computation of the 3D convolution integrals for functions with multiple singularities is replaced by a sequence of 1D operations, enabling accurate MATLAB calculations on a laptop using 3D uniform tensor grids of size up to 10^15. A fast tensor-based Hartree-Fock solver, incorporating the grid-based low-rank factorization of the two-electron integrals, serves as a prerequisite for economical calculation of the excitation energies of molecules. The tensor approach also suggests efficient grid-based numerical treatment of long-range electrostatic potentials on large 3D finite lattices with defects. The novel range-separated tensor format applies to interaction potentials of multi-particle systems of general type, opening new prospects for tensor methods in scientific computing. This research monograph, presenting modern tensor techniques applied to problems in quantum chemistry, will be of interest to a wide audience of students and scientists working in computational chemistry, materials science, and scientific computing.
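
The reduction of a 3D convolution to 1D operations is easiest to see in the separable (rank-1) case: if both grid functions factor along the coordinate directions, their 3D convolution is the outer product of three 1D convolutions. The NumPy sketch below checks this identity on a tiny example; it illustrates the mechanism only and is not the book's actual algorithms, which work with low-rank Tucker and canonical representations rather than a single rank-1 term.

```python
import numpy as np

# Rank-1 (separable) grid functions F = f1 x f2 x f3 and G = g1 x g2 x g3 (outer products).
rng = np.random.default_rng(0)
f1, f2, f3 = [rng.standard_normal(n) for n in (4, 5, 6)]
g1, g2, g3 = [rng.standard_normal(n) for n in (3, 3, 3)]
F = np.einsum('i,j,k->ijk', f1, f2, f3)
G = np.einsum('i,j,k->ijk', g1, g2, g3)

# Tensor-structured route: three 1-D convolutions instead of one 3-D convolution.
H_1d = np.einsum('i,j,k->ijk',
                 np.convolve(f1, g1), np.convolve(f2, g2), np.convolve(f3, g3))

# Brute-force 3-D convolution for verification (tiny sizes only).
H_3d = np.zeros(tuple(a + b - 1 for a, b in zip(F.shape, G.shape)))
for i, j, k in np.ndindex(F.shape):
    for p, q, r in np.ndindex(G.shape):
        H_3d[i + p, j + q, k + r] += F[i, j, k] * G[p, q, r]

print(np.allclose(H_1d, H_3d))   # True: the 3-D convolution separates into 1-D ones
```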

Research on Tensor Computation and Its Application on Data Science

Author : Zequn Zheng
Publisher :
Page : 0 pages
File Size : 16,85 MB
Release : 2023
Category :
ISBN :

Tensors, or multidimensional arrays, are higher-order generalizations of matrices. They are natural structures for expressing data with inherent higher-order structure. Tensor decompositions and tensor approximations play an important role in learning those hidden structures, with many applications in machine learning, statistical learning, data science, signal processing, neuroscience, and more. The Canonical Polyadic Decomposition (CPD) expresses a tensor as a sum of the minimal number of rank-1 tensors, while Low-Rank Tensor Approximation (LRTA) aims at finding a new tensor of small rank that is close to a given one. We study generating polynomials for computing tensor decompositions and low-rank approximations of given tensors, and propose methods that can compute tensor decompositions for generic tensors under certain rank conditions. For low-rank tensor approximation, the proposed method guarantees that the constructed tensor is a good enough low-rank approximation provided the tensor to be approximated is close enough to a low-rank one; a proof built on perturbation analysis is presented. When the rank is higher than the second dimension, the common zeros of the generating polynomials cannot be found directly; in this case, we use the quadratic equations obtained from those generating polynomials. We show that, under certain conditions, the tensor decompositions can be found using standard linear algebra operations (i.e., solving linear systems, singular value decompositions, and QR decompositions). Numerical examples and comparisons are presented to show the performance of our algorithm. Multi-view learning is frequently used in data science. Pairwise correlation maximization is a classical approach for exploring the consensus of multiple views, but since pairwise correlation is inherently defined between two views, its extensions to more views are diverse and the intrinsic interconnections among views are generally lost. To address this issue, we propose to maximize a high-order tensor correlation, which can be formulated as a low-rank approximation problem for the high-order correlation tensor of multi-view data. We use the generating polynomial method to efficiently solve the high-order correlation maximization problem of tensor canonical correlation analysis for multi-view learning. Numerical results on simulated data and two real multi-view data sets demonstrate that the proposed method not only consistently outperforms existing methods but is also efficient for large-scale tensors.
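
To see how a CP decomposition can indeed be recovered with standard linear algebra, the sketch below implements the classical simultaneous-diagonalization (Jennrich-type) approach in NumPy: eigenvectors of a pair of slice combinations give one factor matrix, and a pseudoinverse plus per-column SVDs recover the other two. This is not the thesis's generating-polynomial method; it is a well-known baseline that works for exactly low-rank tensors whose first two factor matrices have full column rank.

```python
import numpy as np

def jennrich_cpd(X, rank, seed=0):
    """Recover a CP decomposition of an exactly rank-`rank` 3-way tensor using
    only standard linear algebra (eigendecomposition, pseudoinverse, SVD).
    Assumes the first two factor matrices have full column rank."""
    rng = np.random.default_rng(seed)
    I, J, K = X.shape
    # Two random linear combinations of the frontal slices X[:, :, k].
    w1, w2 = rng.standard_normal((2, K))
    M1 = np.einsum('ijk,k->ij', X, w1)
    M2 = np.einsum('ijk,k->ij', X, w2)
    # Eigenvectors of M1 * pinv(M2) (for the dominant eigenvalues) are columns of A.
    vals, vecs = np.linalg.eig(M1 @ np.linalg.pinv(M2))
    A = np.real(vecs[:, np.argsort(-np.abs(vals))[:rank]])
    # Solve X_(0) = A (B kr C)^T for the Khatri-Rao factor, then split each
    # column into a rank-1 matrix b_r c_r^T via a truncated SVD.
    KR = (np.linalg.pinv(A) @ X.reshape(I, -1)).T        # shape (J*K, rank)
    B, C = np.zeros((J, rank)), np.zeros((K, rank))
    for r in range(rank):
        U, s, Vt = np.linalg.svd(KR[:, r].reshape(J, K))
        B[:, r], C[:, r] = U[:, 0] * s[0], Vt[0]
    return A, B, C

# Sanity check on an exactly rank-3 synthetic tensor.
rng = np.random.default_rng(1)
A0, B0, C0 = [rng.standard_normal((n, 3)) for n in (5, 6, 7)]
X = np.einsum('ir,jr,kr->ijk', A0, B0, C0)
A, B, C = jennrich_cpd(X, rank=3)
print(np.linalg.norm(X - np.einsum('ir,jr,kr->ijk', A, B, C)))   # should be near zero
```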

High Performance Computing for Computational Science – VECPAR 2018

Author : Hermes Senger
Publisher : Springer
Page : 264 pages
File Size : 44,85 MB
Release : 2019-03-25
Category : Computers
ISBN : 3030159965

This book constitutes the thoroughly refereed post-conference proceedings of the 13th International Conference on High Performance Computing in Computational Science, VECPAR 2018, held in São Pedro, Brazil, in September 2018. The 17 full papers and one short paper included in this book were carefully reviewed and selected from 32 submissions presented at the conference. The papers cover the following topics: heterogeneous systems, shared memory systems and GPUs, and techniques including domain decomposition, scheduling and load balancing, with a strong focus on computational science applications.