
Thursday, May 28, 2015

Compressive Sensing at work: Three-dimensional coordinates of individual atoms in materials, Measurement of a 16.8 Million-Dimensional Entangled Probability Distribution, WiFi Fingerprinting in Indoor Environment, Airborne gravimetry data reconstruction

Every once in a while, people ask how compressive sensing is used. Here are four different examples. Most of them rely on the sparsity-seeking-solver side of things, while others take full advantage of the multiplexing ability of the measurement matrices. Without further ado:
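Before diving in, here is a minimal toy sketch (mine, not from any of the papers below) of those two ingredients: a random Gaussian matrix whose rows multiplex every entry of a sparse signal into each measurement, and a simple sparsity-seeking solver, ISTA for l1-regularized least squares, that recovers it. The dimensions and the regularization weight lam are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m, k = 400, 100, 8                      # signal length, measurements, sparsity

# A k-sparse signal and a random Gaussian measurement matrix: every row
# "multiplexes" all n entries into a single measurement.
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
A = rng.standard_normal((m, n)) / np.sqrt(m)
y = A @ x_true

# Sparsity-seeking solver: ISTA for min_x 0.5*||Ax - y||^2 + lam*||x||_1.
lam = 0.01
L = np.linalg.norm(A, 2) ** 2              # Lipschitz constant of the gradient
x = np.zeros(n)
for _ in range(2000):
    x = x - A.T @ (A @ x - y) / L                           # gradient step
    x = np.sign(x) * np.maximum(np.abs(x) - lam / L, 0.0)   # soft threshold

print("relative error:", np.linalg.norm(x - x_true) / np.linalg.norm(x_true))
```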


Three-dimensional coordinates of individual atoms in materials revealed by electron tomography  by Rui Xu, Chien-Chun Chen, Li Wu, M. C. Scott, W. Theis, Colin Ophus, Matthias Bartels, Yongsoo Yang, Hadi Ramezani-Dakhel, Michael R. Sawaya, Hendrik Heinz, Laurence D. Marks, Peter Ercius, Jianwei Miao
Crystallography, the primary method for determining the three-dimensional (3D) atomic positions in crystals, has been fundamental to the development of many fields of science. However, the atomic positions obtained from crystallography represent a global average of many unit cells in a crystal. Here, we report, for the first time, the determination of the 3D coordinates of thousands of individual atoms and a point defect in a material by electron tomography with a precision of ~19 picometers, where the crystallinity of the material is not assumed. From the coordinates of these individual atoms, we measure the atomic displacement field and the full strain tensor with a 3D resolution of ~1 nm^3 and a precision of ~10^-3, which are further verified by density functional theory calculations and molecular dynamics simulations. The ability to precisely localize the 3D coordinates of individual atoms in materials without assuming crystallinity is expected to find important applications in materials science, nanoscience, physics and chemistry.
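As an aside, the step from atomic coordinates to a strain tensor can be illustrated in a few lines. The sketch below is a generic small-strain recipe of my own, not the authors' pipeline: fit the deformation gradient F that best maps reference neighbor vectors onto the measured ones, then take eps = (F + F^T)/2 - I. The neighbor vectors and the 0.1% stretch are invented for the example.

```python
import numpy as np

def local_strain(ref_neighbors, meas_neighbors):
    """Small-strain tensor from a least-squares fit of the deformation gradient F
    that maps reference neighbor vectors (rows) onto measured ones."""
    # Solve ref_neighbors @ F.T ~= meas_neighbors in the least-squares sense.
    Ft, *_ = np.linalg.lstsq(ref_neighbors, meas_neighbors, rcond=None)
    F = Ft.T
    return 0.5 * (F + F.T) - np.eye(3)

# Toy data: six fcc-type neighbor vectors, then a uniform 0.1% stretch along x.
ref = np.array([[1., 1., 0.], [1., -1., 0.], [1., 0., 1.],
                [0., 1., 1.], [-1., 0., 1.], [0., -1., 1.]])
meas = ref.copy()
meas[:, 0] *= 1.001

print(local_strain(ref, meas))    # eps_xx comes out ~1e-3, the scale quoted above
```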



Millimeter Wave Beamforming Based on WiFi Fingerprinting in Indoor Environment by Ehab Mahmoud Mohamed, Kei Sakaguchi, Seiichi Sampei
(Submitted on 21 May 2015)
Millimeter Wave (mm-w), especially the 60 GHz band, has been receiving much attention as a key enabler for 5G cellular networks. Beamforming (BF) is widely used with mm-w transmissions to enhance link quality and overcome channel impairments. The current mm-w BF mechanism, proposed by the IEEE 802.11ad standard, is mainly based on exhaustively searching for the best transmit (TX) and receive (RX) antenna beams. This BF mechanism requires a very long setup time, which makes it difficult to coordinate multiple mm-w Access Points (APs) under the mobile channel conditions that 5G requires. In this paper, we propose a mm-w BF mechanism that enables a mm-w AP to estimate the best beam for communicating with a User Equipment (UE) using statistical learning. In this scheme, fingerprints of the UE WiFi signal and the best mm-w beam identification (ID) are collected in an offline phase on a grid of arbitrary learning points (LPs) in target environments. Then, by simply comparing the current UE WiFi signal with the pre-stored UE WiFi fingerprints, the mm-w AP can immediately estimate the best beam for communicating with the UE at its current position. The proposed mm-w BF mechanism estimates the best beam with a very small setup time and a performance comparable to that of exhaustive-search BF.
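As a concrete (and entirely illustrative) sketch of the offline/online split described above: the offline table pairs a WiFi RSSI fingerprint with the best mm-wave beam ID at each learning point, and the online step is a plain nearest-neighbor lookup. The numbers, array names and the 1-NN rule are my assumptions, not the authors' estimator.

```python
import numpy as np

# Offline database (illustrative numbers): one WiFi RSSI fingerprint, in dBm from
# three WiFi APs, and the best mm-wave beam ID recorded at each learning point.
fingerprints = np.array([[-40., -62., -75.],
                         [-55., -48., -70.],
                         [-70., -58., -45.],
                         [-65., -72., -50.]])
best_beam_id = np.array([3, 7, 12, 9])

def estimate_beam(current_rssi):
    """Online phase: return the beam ID stored for the closest WiFi fingerprint."""
    distances = np.linalg.norm(fingerprints - current_rssi, axis=1)
    return best_beam_id[np.argmin(distances)]

# A UE whose WiFi signature resembles the first learning point gets that point's beam.
print(estimate_beam(np.array([-44., -60., -73.])))   # -> 3
```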

We demonstrate how to implement extremely high-dimensional compressive imaging on a bi-photon probability distribution. When computationally reconstructing the two-party system, compressive imaging requires a sensing matrix that may drastically exceed practical limits for conventional computers. These limitations are virtually eliminated via fast-Hadamard transform Kronecker-based compressive sensing. We list, in detail, the operations necessary to implement this method and provide an experimental demonstration in which we measure and reconstruct a 16.8 million-dimensional bi-photon probability distribution. Instead of requiring over a year to raster scan or over 2 terabytes of computer memory to perform a reconstruction, we performed the experiment in approximately twenty hours, required between 8 and 32 gigabytes of computer memory, and reconstructed the full distribution in approximately twenty minutes.
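The memory argument is easy to see in code. In the sketch below (my own toy, not the authors' implementation), a Hadamard sensing operator is applied matrix-free with a fast Walsh-Hadamard butterfly, so only the signal vector is ever stored; forming the dense n x n matrix, which the tiny sanity check builds explicitly, is exactly what becomes impossible at 16.8 million dimensions.

```python
import numpy as np

def fwht(x):
    """Fast Walsh-Hadamard transform (Sylvester ordering, unnormalized).
    Computes H @ x in O(n log n) without ever forming the n x n matrix H."""
    x = np.asarray(x, dtype=float).copy()
    n, h = len(x), 1
    while h < n:
        for i in range(0, n, 2 * h):
            for j in range(i, i + h):
                a, b = x[j], x[j + h]
                x[j], x[j + h] = a + b, a - b
        h *= 2
    return x

# Sanity check on a tiny example: the butterfly matches the explicit Hadamard matrix.
n = 8
H = np.array([[(-1) ** bin(i & j).count("1") for j in range(n)] for i in range(n)])
x = np.arange(n, dtype=float)
print(np.allclose(fwht(x), H @ x))   # True
```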

Airborne gravimetry data sparse reconstruction via L1-norm convex quadratic programming by Ya-Peng Yang, Mei-Ping Wu, Gang Tang
In practice, airborne gravimetry is a sub-Nyquist sampling method because of the restrictions imposed by national boundaries, financial cost, and database size. In this study, we analyze the sparsity of airborne gravimetry data using the discrete Fourier transform and propose a reconstruction method based on the theory of compressed sensing for large-scale gravity anomaly data. The reconstruction of the gravity anomaly data is thereby transformed into an L1-norm convex quadratic programming problem. We combine the preconditioned conjugate gradient algorithm (PCG) and an improved interior-point method (IPM) to solve the convex quadratic programming problem. Furthermore, a flight test was carried out with the homegrown strapdown airborne gravimeter SGA-WZ. We then reconstructed the gravity anomaly data of the flight test and compared the proposed method with the linear interpolation method commonly used in airborne gravimetry. The test results show that the PCG-IPM algorithm reconstructs large-scale gravity anomaly data more accurately and effectively than the linear interpolation method.
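To make the formulation above concrete, here is a hedged, self-contained sketch: a synthetic profile that is sparse in the DFT domain is sampled at a quarter of its grid points and recovered by minimizing an l1-regularized least-squares objective. Everything here (the synthetic signal, the 25% sampling, the regularization weight) is invented for illustration, and a plain iterative soft-thresholding loop stands in for the paper's PCG-IPM solver.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1024
# Synthetic stand-in for a gravity anomaly profile: a few tones, so it is sparse
# in the DFT domain, observed at only 25% of the grid points (sub-Nyquist).
freqs = rng.choice(n // 2, 5, replace=False) + 1
signal = sum(np.cos(2 * np.pi * f * np.arange(n) / n) for f in freqs)
keep = np.sort(rng.choice(n, n // 4, replace=False))
y = signal[keep]

def A(c):                      # DFT coefficients -> sub-sampled signal samples
    return np.fft.ifft(c)[keep] * np.sqrt(n)

def At(r):                     # Hermitian adjoint of A
    z = np.zeros(n, dtype=complex)
    z[keep] = r
    return np.fft.fft(z) / np.sqrt(n)

# Iterative soft thresholding on min_c 0.5*||A(c) - y||^2 + lam*||c||_1
# (the paper instead reformulates the problem as a QP and solves it with PCG-IPM).
lam, c = 0.05, np.zeros(n, dtype=complex)
for _ in range(1000):
    c = c - At(A(c) - y)                                              # gradient step (||A|| = 1)
    c = np.exp(1j * np.angle(c)) * np.maximum(np.abs(c) - lam, 0.0)   # complex soft threshold

recon = np.fft.ifft(c).real * np.sqrt(n)
print("relative error:", np.linalg.norm(recon - signal) / np.linalg.norm(signal))
```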
 
 
Join the CompressiveSensing subreddit or the Google+ Community and post there!
Liked this entry ? subscribe to Nuit Blanche's feed, there's more where that came from. You can also subscribe to Nuit Blanche by Email, explore the Big Picture in Compressive Sensing or the Matrix Factorization Jungle and join the conversations on compressive sensing, advanced matrix factorization and calibration issues on Linkedin.
