Research activities

Selected recent work (Papers: http://www.math.purdue.edu/~xiaj/papers.html)

1.    Direct linear/eigenvalue solvers for both 2D and 3D problems on general meshes (or general sparse matrices), with high efficiency, flexibility, and applicability

a.     A randomized sparse direct solver

     http://www.math.purdue.edu/~xiaj/work/mfhssrs.pdf

(Revised with SIAM J. Matrix Anal. Appl.)

(1) This paper is the first to use randomization ideas for structured sparse factorizations; it replaces the complicated structured operations in direct solvers by operations on simple vectors (which are either random or products of matrices with random vectors). Earlier work concentrated on 2D, regular-mesh, and SPD problems, since the structured operations are too complicated otherwise. (For years, researchers had tried to perform an HSS extend-add operation without success.) I use a skinny vector extend-add operation (Figure 3.8, page 15) that conveniently avoids these difficulties and improves efficiency (see the sketch after this list).

(2) The method is thus applicable to 2D, 3D, irregular meshes, and nonsymmetric/indefinite problems.

(3) I showed some practical rank patterns, so that for 2D problems the factorization cost and memory are about O(n), and for 3D problems they range from about O(n) to O(n^{4/3}). Moreover, the SOLUTION costs and memory are O(n) in both 2D and 3D, as later confirmed by other researchers. This makes the solver suitable for preconditioning and for multiple right-hand sides.
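
As a rough illustration of the randomization idea (working with products of a matrix and random vectors instead of complicated structured forms), here is a minimal sketch of randomized low-rank compression in the style of a randomized range finder. The dense test matrix, the rank, and the oversampling amount are illustrative assumptions; this is not the paper's HSS or extend-add machinery.

    import numpy as np

    def randomized_lowrank(A, r, oversample=10, seed=0):
        # Sample the range of A with random vectors, then compress:
        # the only products with A are A @ Omega and Q.T @ A.
        rng = np.random.default_rng(seed)
        Omega = rng.standard_normal((A.shape[1], r + oversample))  # random test vectors
        Y = A @ Omega                          # products of A with random vectors
        Q, _ = np.linalg.qr(Y)                 # orthonormal basis of the sampled range
        B = Q.T @ A                            # small factor, so A ~= Q @ B
        return Q, B

    # Toy check on a matrix that is numerically low-rank
    rng = np.random.default_rng(1)
    X = rng.standard_normal((500, 15))
    A = X @ X.T
    Q, B = randomized_lowrank(A, 15)
    print(np.linalg.norm(A - Q @ B) / np.linalg.norm(A))   # small relative error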

b.    Two other schemes to avoid using HSS extend-add

(1)       A structured sparse solver for general meshes

http://www.math.purdue.edu/~xiaj/work/mfhsss.pdf

(Revised with SIAM J. Sci. Comput.)

The paper proposes a concept of reduced HSS matrices, which replaces HSS inversion/solution by simple inversions/solutions with small reduced matrices.

(2)       A robust version

http://www.springerlink.com/content/978-1-4471-2437-5#section=1019567

It is guaranteed to produce an SPD preconditioner for SPD sparse matrices at any compression accuracy.

c.      Structured sparse eigensolver

I prove that, under certain conditions, in the bisection method for computing selected eigenvalues via inertia counts, if only a few eigenvalues are desired, then we do NOT need small off-diagonal ranks, since a compact structured approximation gives the SAME inertia.
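
To make the role of inertia counts concrete, here is a minimal bisection sketch for a dense symmetric matrix: the number of eigenvalues below a shift equals the number of negative pivots in an LDL^T factorization of the shifted matrix. The dense LDL^T, the search interval, and the test matrix are illustrative assumptions; the point in the paper is that a compact structured factorization can supply the same counts.

    import numpy as np
    from scipy.linalg import ldl

    def count_eigs_below(A, sigma):
        # Inertia count: eigenvalues of symmetric A below sigma equal the
        # number of negative eigenvalues of D in (A - sigma*I) = L D L^T.
        n = A.shape[0]
        _, D, _ = ldl(A - sigma * np.eye(n))
        count, i = 0, 0
        while i < n:
            if i + 1 < n and D[i + 1, i] != 0:     # 2x2 pivot block
                count += int(np.sum(np.linalg.eigvalsh(D[i:i+2, i:i+2]) < 0))
                i += 2
            else:                                  # 1x1 pivot
                count += int(D[i, i] < 0)
                i += 1
        return count

    def bisection_eig(A, k, lo, hi, tol=1e-10):
        # Locate the k-th smallest eigenvalue of symmetric A inside [lo, hi].
        while hi - lo > tol:
            mid = 0.5 * (lo + hi)
            lo, hi = (lo, mid) if count_eigs_below(A, mid) >= k else (mid, hi)
        return 0.5 * (lo + hi)

    rng = np.random.default_rng(0)
    A = rng.standard_normal((50, 50)); A = A + A.T
    print(bisection_eig(A, 3, -30.0, 30.0), np.sort(np.linalg.eigvalsh(A))[2])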

 

d.     Matrix-free sparse direct factorization with fast update

We perform direct factorizations based on randomization and matrix-vector products alone, which was previously nearly impossible. This has two significant benefits: no explicit matrix is needed, and the factorization can be quickly updated for varying parameters (such as frequencies or shifts).
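
One simple ingredient behind such cheap updates, shown as a hedged sketch (the actual factorization update in the paper is more involved): products of a shifted matrix with random test vectors can be refreshed from the original products without touching A again, since (A - sI)Omega = A*Omega - s*Omega.

    import numpy as np

    rng = np.random.default_rng(0)
    n, k = 400, 30
    A = rng.standard_normal((n, n))      # stands in for an operator known only through products
    Omega = rng.standard_normal((n, k))

    Y = A @ Omega                        # matrix-vector products, computed once
    for s in [0.5, 1.0, 2.0]:            # varying shifts (e.g., frequencies)
        Y_s = Y - s * Omega              # updated samples, no new products with A
        assert np.allclose(Y_s, (A - s * np.eye(n)) @ Omega)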

e.    O(n) complexity method for finding the diagonal (and other entries) of  A^{-1} for a sparse A

The cost is O(n) for both 2D and 3D, under some conditions. This is very useful for electronic structure calculations, Green's function evaluation, and preconditioning for imaging problems, where the diagonal of the inverse has been shown to be very effective.
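
For small problems, the same quantity can be checked by brute force with a standard sparse LU (n separate solves, so nowhere near the O(n) total cost of the structured method); the tridiagonal test matrix below is just a placeholder.

    import numpy as np
    import scipy.sparse as sp
    from scipy.sparse.linalg import splu

    n = 200
    A = sp.diags([-1.0, 2.1, -1.0], [-1, 0, 1], shape=(n, n), format="csc")  # toy sparse matrix
    lu = splu(A)                         # one sparse LU factorization
    diag_inv = np.empty(n)
    for j in range(n):                   # n solves: the expensive reference way
        e = np.zeros(n); e[j] = 1.0
        diag_inv[j] = lu.solve(e)[j]     # j-th diagonal entry of A^{-1}
    print(diag_inv[:5])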

f.     One of the first works on O(n) complexity direct solvers for discretized PDEs

     http://dx.doi.org/10.1137/09074543X

 

2.    New multi-layer structures and structured algorithms

a.     Multi-layer sparse structures

Three layers of trees for solving 3D problems.

b.    Multi-layer dense structures

A two-layer structure, in which the HSS generators are themselves Cauchy-like matrices defined by vectors, is used to solve Toeplitz problems with extremely high efficiency.
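
For readers unfamiliar with the term, a Cauchy-like block is determined entirely by a few vectors, which is what keeps the second layer so compact. A minimal sketch follows; the generator vectors and nodes here are random placeholders, not the ones arising from the Toeplitz transformation used in the paper.

    import numpy as np

    def cauchy_like(u, v, x, y):
        # Cauchy-like matrix with entries u[i] * v[j] / (x[i] - y[j]):
        # an m x m block stored as four length-m vectors.
        return np.outer(u, v) / (x[:, None] - y[None, :])

    rng = np.random.default_rng(0)
    m = 6
    u, v = rng.standard_normal(m), rng.standard_normal(m)
    x, y = np.arange(m) + 0.5, np.arange(m) * 1.0   # interleaved nodes, so x[i] != y[j]
    C = cauchy_like(u, v, x, y)
    print(C.shape)                                  # 6 x 6 block rebuilt from 4 short vectors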

c.      Matrix-free/randomized algorithms

http://epubs.siam.org/doi/abs/10.1137/110831982

A matrix-free structured construction based on matrix-vector products, applied to Toeplitz solutions. Again, the generators have special structures.

d.    Structured least squares methods

http://www.math.purdue.edu/~xiaj/work/toepls.pdf

(To be submitted to SIAM J. Matrix Anal. Appl.)

A structured URV algorithm replaces the QR algorithm for least squares solutions. For Toeplitz least squares problems, the method is much more efficient and accurate than some recent methods.
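
To show how a URV-type factorization takes over the role of QR in least squares, here is a plain (dense, unstructured) sketch built from two ordinary QR factorizations, assuming full column rank; the structured URV algorithm in the paper exploits the Toeplitz/rank structure, which this toy does not.

    import numpy as np
    from scipy.linalg import qr, solve_triangular

    def urv_lstsq(A, b):
        # A = U T V^T with orthonormal U, orthogonal V, lower-triangular T,
        # built from two QRs; then min ||Ax - b|| has x = V T^{-1} U^T b.
        U, R = qr(A, mode='economic')        # A = U R
        V, S = qr(R.T, mode='economic')      # R^T = V S  =>  R = S^T V^T
        T = S.T                              # lower triangular
        y = solve_triangular(T, U.T @ b, lower=True)
        return V @ y

    rng = np.random.default_rng(0)
    A, b = rng.standard_normal((300, 40)), rng.standard_normal(300)
    print(np.linalg.norm(urv_lstsq(A, b) - np.linalg.lstsq(A, b, rcond=None)[0]))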

e.      Structured eigenvalue solutions

Structured inertia computation for general (nonstructured) matrices.

f.     Schur-complement-avoiding factorizations

http://epubs.siam.org/doi/abs/10.1137/110851110

 

3.    New systematic analysis for structured methods

a.     Off-diagonal rank property analysis

(1)       Rank property in spectral element methods

This helps a colleague working on spectral methods use my fast matrix-free direct solvers for PDEs instead of iterative solvers.

(2)       Off-diagonal rank analysis of other problems (Toeplitz, etc.)

b.    Error and stability analysis

http://www.math.purdue.edu/~xiaj/work/toepls.pdf

We give a systematic error and stability analysis for HSS construction and factorizations. We introduce the idea of a structured growth factor, which is much better than the worst-case bound in LU factorizations.
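
For context, the classical element growth factor that such bounds are compared against can be measured directly; this small illustration uses a random dense matrix, and the structured growth factor in the paper concerns the HSS factors, which this snippet does not construct.

    import numpy as np
    from scipy.linalg import lu

    rng = np.random.default_rng(0)
    A = rng.standard_normal((200, 200))
    P, L, U = lu(A)                            # LU with partial pivoting
    rho = np.abs(U).max() / np.abs(A).max()    # classical element growth factor
    print(rho)                                 # typically modest, far below the 2^(n-1) worst case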

c.      Strategy for structured preconditioning with guaranteed effectiveness WITHOUT low-rank property

http://link.aip.org/link/?SML/31/2899

(A more comprehensive version is to be submitted)

I proved that, by compressing SCALED off-diagonal blocks, the HSS preconditioner can give a condition number that decays much faster than the off-diagonal singular values do. Thus, following this strategy, we can get effective structured preconditioners for GENERAL matrices by dropping most of the off-diagonal singular values. An explicit criterion is given.
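
A toy 2x2-block illustration of the scaling idea, under assumed ingredients (dense SPD blocks, Cholesky factors for the scaling, an arbitrary truncation rank; the paper's scaling and criterion are more specific): compressing the scaled off-diagonal block keeps the preconditioner SPD and tightens the preconditioned spectrum.

    import numpy as np
    from scipy.linalg import eigh

    rng = np.random.default_rng(0)
    n = 80
    G = rng.standard_normal((2 * n, 2 * n))
    A = G @ G.T + 2 * n * np.eye(2 * n)                     # an SPD test matrix
    A11, A12, A22 = A[:n, :n], A[:n, n:], A[n:, n:]
    L1, L2 = np.linalg.cholesky(A11), np.linalg.cholesky(A22)

    S = np.linalg.solve(L1, np.linalg.solve(L2, A12.T).T)   # SCALED off-diagonal block, ||S|| < 1
    U, s, Vt = np.linalg.svd(S)
    k = 5                                                   # drop most singular values
    A12k = L1 @ ((U[:, :k] * s[:k]) @ Vt[:k, :]) @ L2.T     # compressed off-diagonal block
    M = np.block([[A11, A12k], [A12k.T, A22]])              # still SPD: truncation keeps ||S_k|| < 1

    w = eigh(A, M, eigvals_only=True)                       # spectrum of the preconditioned matrix
    print(w.min(), w.max())                                 # positive interval around 1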

d.    A nested/hierarchical rank relaxation idea

http://www.math.purdue.edu/~xiaj/work/mfhssrs.pdf

http://epubs.siam.org/simax/resource/1/sjmael/v33/i2/p388_s1

With multiple layers of trees in 3D sparse solutions, the inner layers can have relaxed complexity requirements. Thus, we do NOT need O(1) ranks to reach nearly O(n) global complexity. A small improvement in the inner layers is amplified in the outer layers.

This also holds for hierarchical matrices: the ranks at different levels can follow certain rank patterns, INSTEAD OF being O(1), and still give satisfactory complexity.

e.      Proof that structured approximations preserve inertia for general problems, as stated above

f.      Various aspects of complexity analysis

http://epubs.siam.org/simax/resource/1/sjmael/v33/i2/p388_s1

Graph techniques, such as Pascal triangle counting, are used in the proofs.

 

4.    Additional work (with multiple publications)

a.     Inner-outer preconditioners

b.    Inverse problem solution and optimization with full waveform inversion in the 3D frequency domain

c.     High performance techniques

d.    Statistical condition estimations

e.     Symplectic integrators for oscillatory systems

General overview (may be incomplete)