Our activities


The BECOSE project, funded by the French National Research Agency (ANR), aims to develop and analyze sparsity-based algorithms dedicated to ill-posed inverse problems arising in signal and image processing.

Beginning of the project: January 1, 2016.

End of the project: December 31, 2019.


Short summary:

The past decade has witnessed a tremendous interest in the concept of sparse representations in signal and image processing. One of the main reasons for this enthusiasm lies in the discovery of compressive sensing, a new sampling paradigm defying the theoretical limits established sixty years earlier by Shannon. Compressive sensing led many researchers to focus on inverse problems involving fairly well-conditioned dictionaries, such as those arising from random, independent measurements. Yet in many applications, the dictionaries relating the observations to the sought sparse signal are deterministic and ill-conditioned. In these scenarios, many classical algorithms and theoretical analyses are likely to fail. The BECOSE project aims to extend the scope of sparsity techniques well beyond the academic setting of random and well-conditioned dictionaries.
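
To fix ideas, the generic sparse inverse problem underlying the project can be written schematically as follows; the notation (y, A, x, mutual coherence) is chosen here for illustration and is not taken from the project documents.

```latex
% Schematic sparse inverse problem (notation chosen for illustration):
%   y : observation vector,  A : dictionary (many more columns than rows),
%   x : sparse representation to be recovered,  n : noise / model error.
\[
  y = A x + n, \qquad \|x\|_0 := \#\{\, i : x_i \neq 0 \,\} \ll \dim(x).
\]
% One common indicator of dictionary conditioning is the mutual coherence
\[
  \mu(A) = \max_{i \neq j} \frac{|\langle a_i, a_j \rangle|}{\|a_i\|_2 \, \|a_j\|_2},
\]
% which is typically small for random dictionaries and close to 1 for the
% highly correlated, deterministic dictionaries targeted by the project.
```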

  1. Design of new algorithms. Inverse problems exploiting the sparse nature of the solution rely on the minimization of the counting function, referred to as the L0-“norm” (the standard formulations are recalled after this list). This problem being intractable in most practical settings, many suboptimal resolution schemes have been proposed. The design of algorithms dedicated to ill-conditioned inverse problems will revolve around three lines of thought. First, we will step back from the popular L1-convexification of the sparse representation problem and consider more involved nonconvex formulations. Recent work indeed demonstrates their relevance for difficult inverse problems; however, designing effective and computationally efficient algorithms remains a challenge for problems of large dimension. Second, we will study the benefit of working with continuous dictionaries, in contrast with the classical discrete approach. Third, we will investigate the exploitation of additional sources of structural information (on top of sparsity), such as non-negativity constraints.
  2. Theoretical analysis of algorithms. The theoretical analysis aims to characterize the performance of heuristic sparse algorithms. The traditional worst-case exact recovery guarantees are acknowledged to be rather pessimistic because they may not reflect the average behavior of algorithms. Notably, sharp worst-case exact recovery conditions are not even available for a number of popular L0 algorithms. We will focus on stepwise orthogonal greedy search algorithms, which are very well suited to the ill-conditioned context (a toy sketch of such an algorithm is given after this list). We foresee that they will enjoy much weaker recovery guarantees than simpler L0 algorithms. We further propose to develop an average-case analysis of greedy algorithms for deterministic dictionaries, which is a major open issue. To this end, several intermediate steps will be carried out, including a guaranteed-failure analysis and the derivation of weakened guarantees of success obtained by taking into account constraints beyond sparsity, such as prior knowledge of the signs, the coefficient values, and partial support information.
  3. From theory to practice. The proposed algorithms will be assessed in the context of tomographic Particle Image Velocimetry (PIV), a rapidly growing imaging technique in fluid mechanics with strong potential impact in several industrial sectors, including the environmental, automotive, and aeronautical industries. This flow measurement technique aims to determine the 3D displacement of tracer particles that passively follow the flow, based on the acquisition of a limited number of 2D camera images (a schematic reconstruction model is given after this list). The resulting inverse problem involves high-dimensional data, as a time sequence of highly resolved 3D volumes must be reconstructed. Presently available methods for 3D reconstruction and flow tracking are still restricted to small volumes, which is the main bottleneck, together with accuracy and resolution limits. The sparse approach is the key methodological tool to handle problems of larger dimension. The proposed solutions will be validated using both realistic simulators and real experimental data.
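
For concreteness, the formulations contrasted in item 1 can be written in their standard textbook form; these are generic statements, not necessarily the exact formulations retained by the project.

```latex
% L0 formulation (combinatorial, intractable in general):
\[
  \min_{x} \; \|y - A x\|_2^2 \quad \text{s.t.} \quad \|x\|_0 \le k.
\]
% Classical L1 convexification (LASSO / basis pursuit denoising form):
\[
  \min_{x} \; \tfrac{1}{2}\|y - A x\|_2^2 + \lambda \|x\|_1.
\]
% One example of a nonconvex surrogate lying "between" L1 and L0, here an
% Lp penalty with 0 < p < 1 (other choices exist, e.g. log-sum penalties):
\[
  \min_{x} \; \tfrac{1}{2}\|y - A x\|_2^2 + \lambda \sum_i |x_i|^p.
\]
```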
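
As an illustration of the algorithmic family mentioned in item 2, the sketch below implements Orthogonal Matching Pursuit (OMP), one standard stepwise orthogonal greedy algorithm. It is a minimal toy implementation written for this page, not the project's code.

```python
import numpy as np

def omp(A, y, k):
    """Toy Orthogonal Matching Pursuit: greedily select k atoms of A,
    refitting the coefficients by least squares after every selection."""
    n = A.shape[1]
    support = []
    coeffs = np.zeros(0)
    residual = y.copy()
    for _ in range(k):
        # Select the atom most correlated with the current residual.
        j = int(np.argmax(np.abs(A.T @ residual)))
        if j not in support:
            support.append(j)
        # Orthogonal projection of y onto the span of the selected atoms.
        coeffs, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coeffs
    x = np.zeros(n)
    x[support] = coeffs
    return x, support

# Toy usage: recover a 3-sparse vector from noiseless random measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 200))
A /= np.linalg.norm(A, axis=0)          # unit-norm atoms
x_true = np.zeros(200)
x_true[[5, 17, 42]] = [1.0, -2.0, 0.5]
y = A @ x_true
x_hat, support = omp(A, y, k=3)
print(sorted(support))                  # expected to find 5, 17, 42 in this easy case
```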
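
Finally, the tomographic PIV reconstruction of item 3 can be summarized by the usual discretized projection model; again this is a schematic description, with notation chosen for illustration.

```latex
% Schematic discretized tomo-PIV reconstruction model:
%   y : stacked pixel intensities of the 2D camera images,
%   W : projection (weighting) matrix encoding the camera geometry,
%   x : voxel intensities of the 3D volume, non-negative and sparse
%       since only a small fraction of voxels contain tracer particles.
\[
  y \approx W x, \qquad x \ge 0, \qquad \|x\|_0 \ll \dim(x).
\]
% This places the reconstruction in the sparse, non-negative,
% ill-conditioned inverse-problem setting addressed by the project.
```
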
Last updated: February 12, 2016, at 23:00.