Purdue University Numerical Linear Algebra Group (PUNLAG)

PUNLAG is intended to be a student-led seminar connecting graduate students across disciplines who create and/or use numerical linear algebra tools. The idea is that each week a student will present something along these lines:
  • a new linear algebra tool you're developing
  • a problem in your research you're hoping someone else has a tool for
  • or even an introduction to well-known theory that you think will be useful to others.

Ideally, if you attend a seminar, you might
  • find applications in other fields that motivate your own research
  • find new computing tools that others are working on that you can use in your work
  • find new collaborators or new problems
  • learn useful, commonly known tools

If you are interested in receiving future seminar announcements or giving a talk, contact me at kkloste@purdue.edu.

Past talks:
  • Jimmy Vogel: Hierarchically Semiseparable Matrices and the Rank-Structured Eigenvalue Problem, 4/23/14
  • Abstract: In this talk, I will discuss the rank-structured eigenvalue problem and present my current research: an algorithm that computes all eigenvalues of any symmetric Hierarchically Semiseparable (HSS) matrix accurately and in nearly linear complexity. Unlike last week, where we learned how best to solve for the eigenvalues of a large sparse matrix, this week we focus on dense matrices whose information is clustered near the diagonal (i.e., “data-sparse” matrices). I will begin with a brief introduction to HSS matrices. From there I will give a short history of rank-structured eigensolvers, including Jan Cuppen’s famous “divide-and-conquer” algorithm from the 1980s, as well as Ming Gu’s stable and efficient formulation of divide-and-conquer from the 1990s, which can be thought of as a special case of my algorithm. Finally, I will discuss my algorithm, how it works, its applications, and where the field of rank-structured eigensolvers is headed.
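  As background for the divide-and-conquer idea mentioned above (a one-equation summary in my notation, not part of the abstract): Cuppen's algorithm tears a symmetric tridiagonal matrix into two independent halves plus a rank-one correction,

      T = \begin{pmatrix} T_1 & 0 \\ 0 & T_2 \end{pmatrix} + \beta\, v v^T,

  and, once T_1 and T_2 have been diagonalized recursively, recovers the eigenvalues of T as the roots of the secular equation

      1 + \beta \sum_{i=1}^{n} \frac{z_i^2}{d_i - \lambda} = 0,

  where the d_i are the subproblem eigenvalues and z is the rank-one vector expressed in the subproblem eigenbasis. Rank-structured eigensolvers like the one above generalize this tearing from tridiagonal to HSS matrices.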

  • Alicia Klinvex: TraceMin, part 2, 4/16/14
  • Abstract: Last week, I gave a basic introduction to TraceMin, but some of the steps were treated as magic. How do we obtain that mysterious "section"? How do we do load balancing? Is exascale computing realistic with TraceMin? (That is, if we have a whole bunch of computing resources [or mathematicians] available, does the algorithm I presented use them effectively?) This week, we will revisit those questions and conclude our exploration of why TraceMin is so cool. What to expect at this seminar:
    * the answers to those questions
    * a survey of saddle point solvers (the saddle-point step itself is written out after this list)
    * at least one analogy involving food
    * a brief description of other eigensolvers
    * scalability plots
    * Dr. Sameh's other big research project, SPIKE, and an explanation of why SPIKE and TraceMin are close buddies
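  Since the "section" and saddle point solvers both appear in the list above, here is the step written out (this is the standard TraceMin formulation in my notation, not taken from the slides). Each iteration minimizes the trace over corrections that stay B-orthogonal to the current basis Y,

      \min_{\Delta}\ \mathrm{tr}\!\left((Y-\Delta)^T A (Y-\Delta)\right) \quad \text{subject to} \quad Y^T B \Delta = 0,

  whose optimality conditions form the saddle-point system

      \begin{pmatrix} A & BY \\ Y^T B & 0 \end{pmatrix} \begin{pmatrix} \Delta \\ L \end{pmatrix} = \begin{pmatrix} AY \\ 0 \end{pmatrix},

  where L collects the Lagrange multipliers; the "section" is then a Rayleigh-Ritz projection built from Y - Δ.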

  • Alicia Klinvex: TraceMin: A Fast and Robust Eigensolver, 4/9/14
  • Abstract: In this talk, I will present what TraceMin is and why, even thirty years after its initial publication, it's still the coolest eigensolver around. What makes TraceMin so cool, you ask? First of all, many eigensolvers require linear systems to be solved very accurately at each iteration, but TraceMin does not. It's capable of finding interior eigenpairs, unlike some others. TraceMin is also one of the few eigensolvers currently capable of finding a large number of eigenvalues. We have compared TraceMin with the leading eigensolver packages (SLEPc, Trilinos' Anasazi, FEAST, PRIMME, and ARPACK) and found it to be faster and more reliable than the competition. What to expect at this seminar:
    * eigenvalues: where they come from and why we want to find them
    * the basic theory of TraceMin and its convergence properties (a toy sketch of the iteration follows this list)
    * how to solve the constrained minimization problem arising at each iteration
    * Ritz shifts: what they are, how they affect convergence, and when to use them
    * the three flavors of TraceMin: standard, sampling, and multisectioning
    * a brief description of other eigensolvers
    * parallel scalability plots... TONS of scalability plots
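  To make the basic iteration concrete, here is a minimal illustrative Python sketch of a TraceMin-style method for A x = λ B x. The function name and the simplifications are mine (exact dense solves, no Ritz shifts, no deflation, no parallelism); it is a toy, not the algorithm from the talk. It leans on the trace-minimization fact that minimizing tr(Y^T A Y) over Y^T B Y = I yields the sum of the p smallest eigenvalues.

      import numpy as np
      from scipy.linalg import eigh, solve

      def tracemin_sketch(A, B, p, iters=100):
          """Toy TraceMin-style iteration for the p smallest eigenpairs of
          A x = lambda B x, with A and B symmetric positive definite."""
          n = A.shape[0]
          Y = np.linalg.qr(np.random.randn(n, p))[0]  # random starting block
          for _ in range(iters):
              # The real algorithm solves a constrained (saddle-point) problem,
              # and only approximately; this toy applies A^{-1} B exactly.
              Z = solve(A, B @ Y, assume_a='pos')
              # Rayleigh-Ritz ("section"): theta holds the Ritz values, and the
              # generalized eigh call makes the new block B-orthonormal.
              theta, S = eigh(Z.T @ (A @ Z), Z.T @ (B @ Z))
              Y = Z @ S
          return theta, Y

      # Sanity check on a matrix with known spectrum:
      A = np.diag(np.arange(1.0, 101.0))
      theta, Y = tracemin_sketch(A, np.eye(100), p=4)
      print(theta)  # should approach [1, 2, 3, 4]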

  • Kyle Kloster: Computing Functions of Matrices: How, and Why, 4/2/14
  • Abstract: Functions of matrices, like exp(X), log(X), and sqrt(X), are useful tools in numerical PDEs, network analysis, and Markov analysis. We'll journey through an introduction to their basic theory (when does a matrix behave like a real number?), end with their use in the Netflix Prize (!), and pick up a few matrix analysis tools along the way (Jordan factorization ho!).
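  A taste of the "matrix behaves like a real number" idea (my illustration, not from the talk): for a symmetric matrix with eigendecomposition A = V D V^T, one can define f(A) = V f(D) V^T by applying f to each eigenvalue; for non-diagonalizable matrices, the Jordan form mentioned above takes over. The minimal Python check below compares this spectral definition of exp(A) against SciPy's general-purpose expm.

      import numpy as np
      from scipy.linalg import expm, eigh

      rng = np.random.default_rng(0)
      X = rng.standard_normal((5, 5))
      A = (X + X.T) / 2                  # a random symmetric matrix

      # Spectral definition: f(A) = V f(D) V^T, with A = V diag(d) V^T.
      d, V = eigh(A)
      expA_spectral = V @ np.diag(np.exp(d)) @ V.T

      # Compare with SciPy's general-purpose matrix exponential.
      print(np.allclose(expA_spectral, expm(A)))  # True, up to rounding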