Schedule for: 17w5055 - Mathematical Advances in Electron Microscopy

Beginning on Sunday, October 15 and ending Friday, October 20, 2017

All times in Oaxaca, Mexico time, CDT (UTC-5).

Sunday, October 15
14:00 - 23:59 Check-in begins (Front desk at your assigned hotel)
19:30 - 22:00 Dinner (Restaurant Hotel Hacienda Los Laureles)
20:30 - 21:30 Informal gathering (Hotel Hacienda Los Laureles)
Monday, October 16
07:30 - 08:45 Breakfast (Restaurant at your assigned hotel)
08:45 - 09:00 Introduction and Welcome (Conference Room San Felipe)
09:00 - 10:00 Joachim Mayer: Chromatic Aberration Corrected TEM: State of the Art and Future Challenges in Data Acquisition and Analysis (Conference Room San Felipe)
10:00 - 10:30 Joakim Anden: Structural Heterogeneity from 3D Covariance Estimation in Cryo-EM
Molecules imaged using cryo-electron microscopy (cryo-EM) often exhibit a significant amount of variability, be it in conformation or composition. The molecules are typically represented as 3D volume maps of the electric potential as a function of space. In order to characterize the variability of these maps, the authors propose a method for fast and accurate estimation of the 3D covariance matrix. The estimator is given by the least-squares solution to a linear inverse problem and is efficiently calculated by exploiting its 6D convolutional structure. Combining this with a circulant preconditioner, the solution is obtained using the conjugate gradient method. For $n$ images of size $N$-by-$N$, the computational complexity of the algorithm is $O(n N^4 + \sqrt{\kappa} N^6 \log N)$, where $\kappa$ is a condition number typically of the order $200$. The method is evaluated on simulated and experimental datasets, achieving results comparable to the state of the art at very short runtimes.
(Conference Room San Felipe)
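A minimal Python/NumPy sketch of the numerical ingredient mentioned in the abstract above: a conjugate gradient solver with a circulant (FFT-diagonalized) preconditioner, applied here to a toy 1D operator built from a circular convolution plus a small diagonal term. This is only an illustration of the preconditioning idea, not the authors' 6D covariance estimator; the kernel, diagonal term, and names apply_A / apply_Minv are arbitrary choices for the demo.

import numpy as np

def pcg(apply_A, b, apply_Minv, tol=1e-8, maxiter=200):
    # Preconditioned conjugate gradient for A x = b, A symmetric positive definite.
    x = np.zeros_like(b)
    r = b - apply_A(x)
    z = apply_Minv(r)
    p = z.copy()
    rz = np.vdot(r, z).real
    for _ in range(maxiter):
        Ap = apply_A(p)
        alpha = rz / np.vdot(p, Ap).real
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol * np.linalg.norm(b):
            break
        z = apply_Minv(r)
        rz_new = np.vdot(r, z).real
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x

# Toy SPD system: a circular convolution plus a small diagonal perturbation,
# preconditioned by inverting the circulant part alone via the FFT.
rng = np.random.default_rng(0)
n = 256
kernel = np.exp(-0.1 * np.minimum(np.arange(n), n - np.arange(n)))  # symmetric, decaying
k_hat = np.fft.fft(kernel).real                                     # positive spectrum
diag = 0.05 * (1.0 + rng.random(n))

apply_A = lambda x: np.fft.ifft(k_hat * np.fft.fft(x)).real + diag * x
apply_Minv = lambda r: np.fft.ifft(np.fft.fft(r) / k_hat).real      # circulant preconditioner

x_true = rng.standard_normal(n)
b = apply_A(x_true)
x = pcg(apply_A, b, apply_Minv)
print("relative error:", np.linalg.norm(x - x_true) / np.linalg.norm(x_true))

Because the preconditioner only inverts the circulant part, each application costs a pair of FFTs, which is the kind of per-iteration saving the abstract's complexity bound relies on.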
10:30 - 11:00 Coffee Break (Conference Room San Felipe)
11:00 - 11:55 Tamir Bendory: Invariants for multireference alignment and cryo-EM
This talk by Tamir Bendory will focus on invariants for the multireference alignment (MRA) and cryo-EM problems. MRA is a simplification and abstraction of the cryo-EM problem. In MRA, we aim to estimate a signal from its circularly-translated copies in high noise regimes. It will be shown that the optimal estimation rate for MRA can be achieved by exploiting features of the signal that are invariant under translation. In a similar manner, these invariant features are used for the heterogeneous MRA problem in which one aims to estimate multiple signals simultaneously. Then, the invariants of the cryo-EM problem will be discussed using the framework of Kam's method. We will show how an ab initio model of the molecule can be estimated directly from the data, without estimating the viewing directions. Finally, we will discuss extensions of Kam's method.
(Conference Room San Felipe)
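A minimal NumPy sketch of the simplest translation-invariant feature used in MRA: the power spectrum, which is unchanged by circular shifts and can therefore be averaged over the observations without knowing the shifts. This is a toy illustration only; recovering the signal itself additionally requires higher-order invariants such as the bispectrum (not shown), and the signal length, sample size, and noise level below are arbitrary choices.

import numpy as np

rng = np.random.default_rng(1)
L, n, sigma = 64, 5000, 2.0
x = rng.standard_normal(L)                               # unknown signal (for the demo)

# Observations: random circular shifts of x plus Gaussian noise.
shifts = rng.integers(0, L, size=n)
obs = np.stack([np.roll(x, s) for s in shifts]) + sigma * rng.standard_normal((n, L))

# |FFT(x)|^2 is invariant to circular shifts, so averaging it over the
# observations estimates |FFT(x)|^2 despite the unknown shifts.
ps_all = np.abs(np.fft.fft(obs, axis=1)) ** 2
ps_est = ps_all.mean(axis=0) - L * sigma ** 2            # subtract the known noise bias
ps_true = np.abs(np.fft.fft(x)) ** 2
print("relative error of power-spectrum estimate:",
      np.linalg.norm(ps_est - ps_true) / np.linalg.norm(ps_true))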
11:55 - 12:50 Yoel Shkolnisky: Manifold methods for cryo-EM image denoising
The goal in cryo-electron microscopy is to recover the three-dimensional structure of a molecule from many of its two-dimensional tomographic projection images. A key challenge in this imaging setup is that the two-dimensional images are extremely noisy. Thus, the process of reconstructing a three-dimensional model from the two-dimensional images typically requires some form of denoising, to improve the quality (signal-to-noise ratio) of the two-dimensional images. Such denoising is often implemented by one of the existing class averaging algorithms, or by other forms of statistical analysis, such as principal components analysis and its variants. We present an algorithm for denoising the entire set of two-dimensional images at once, by exploiting the geometrical property that all (unknown) clean images corresponding to the same underlying structure lie on a manifold of intrinsic dimension three. Thus, each image can be denoised by projecting it onto this (unknown) low-dimensional manifold of clean images. We show that all the quantities required to compute this projection can be estimated using only the two-dimensional projection images. In particular, no prior assumptions on the images or the three-dimensional volume are required.
(Conference Room San Felipe)
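As a toy analogue of the projection idea in the abstract above, the sketch below denoises vectors that lie near a low-dimensional linear subspace by projecting onto the top principal components estimated from the noisy data themselves. The actual method projects onto a nonlinear manifold estimated from the projection images; this linear stand-in only illustrates why projection onto a low-dimensional structure suppresses noise, and all dimensions, signal scales, and noise levels are arbitrary.

import numpy as np

rng = np.random.default_rng(2)
n_img, dim, d, sigma = 2000, 400, 3, 1.0

# Toy "clean" images lying in a 3-dimensional linear subspace (a crude stand-in
# for the intrinsically 3-dimensional manifold of clean projection images).
basis = np.linalg.qr(rng.standard_normal((dim, d)))[0]
clean = 5.0 * rng.standard_normal((n_img, d)) @ basis.T
noisy = clean + sigma * rng.standard_normal((n_img, dim))

# Denoise by projecting every image onto the top-d principal components
# estimated from the noisy images themselves.
mean = noisy.mean(axis=0)
U, S, Vt = np.linalg.svd(noisy - mean, full_matrices=False)
P = Vt[:d]
denoised = mean + (noisy - mean) @ P.T @ P

err_before = np.linalg.norm(noisy - clean) / np.linalg.norm(clean)
err_after = np.linalg.norm(denoised - clean) / np.linalg.norm(clean)
print(f"relative error before: {err_before:.3f}, after: {err_after:.3f}")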
12:50 - 13:20 Discussion on Cryo-EM (Conference Room San Felipe)
13:20 - 13:30 Group Photo (Hotel Hacienda Los Laureles)
13:30 - 15:00 Lunch (Restaurant Hotel Hacienda Los Laureles)
15:00 - 16:00 Thomas Vogt: Bright and Dark-Field Imaging of Complex Oxides (Conference Room San Felipe)
16:00 - 16:30 Coffee Break (Conference Room San Felipe)
16:30 - 17:30 Andrew Stevens: A Tutorial and Outline of Recent Advances in Computational S/TEM
Recently, combined hardware and software approaches have been proposed, developed, and used in (scanning) transmission electron microscopy (S/TEM) for dose and acquisition-time reduction. It has also been shown that different dose-locality gives rise to different damage mechanisms. Computational S/TEM includes methods such as compressive sensing and sub-sampled imaging, with the key factor being that the data collected is not necessarily useful without specific software processing. After reviewing the foundational principles, recent advances will be discussed with an outlook to the future of computational S/TEM.
(Conference Room San Felipe)
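One concrete instance of the "data not useful without software processing" idea is sub-sampled (inpainting-style) acquisition followed by sparse reconstruction. The sketch below recovers a toy image from 25% of its pixels using ISTA with an orthonormal DCT sparsity prior; it is a generic compressed-sensing illustration, not the specific algorithms used in computational S/TEM, and the sampling fraction, regularization weight, and test image are arbitrary assumptions.

import numpy as np
from scipy.fft import dctn, idctn

def ista_inpaint(y, mask, lam=0.05, n_iter=200):
    # Recover an image from sub-sampled pixels by iterative soft thresholding
    # with an orthonormal DCT sparsity prior (step size 1, since the mask
    # operator has spectral norm at most 1).
    x = y.copy()
    for _ in range(n_iter):
        z = x - mask * (mask * x - y)                        # gradient step on the data term
        c = dctn(z, norm='ortho')
        c = np.sign(c) * np.maximum(np.abs(c) - lam, 0.0)    # soft threshold (prox of the l1 term)
        x = idctn(c, norm='ortho')
    return x

# Toy example: a smooth test image sampled at 25% of its pixels.
rng = np.random.default_rng(3)
N = 64
xx, yy = np.meshgrid(np.linspace(0, 1, N), np.linspace(0, 1, N))
img = np.cos(6 * xx) + np.sin(4 * yy)                        # DCT-compressible stand-in image
mask = (rng.random((N, N)) < 0.25).astype(float)
y = mask * img
rec = ista_inpaint(y, mask)
print("relative error:", np.linalg.norm(rec - img) / np.linalg.norm(img))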
17:30 - 19:00 Nigel Browning: Discussion on "Faster, Better Resolution, Lower Dose – how far can math take us and do we need new hardware?" (Conference Room San Felipe)
19:00 - 21:00 Dinner (Restaurant Hotel Hacienda Los Laureles)
Tuesday, October 17
07:30 - 09:00 Breakfast (Restaurant at your assigned hotel)
09:00 - 10:00 Bryan Reed: Compressively Sensed Video Acquisition in Transmission Electron Microscopy
Transmission electron microscopy (TEM) is an extremely powerful tool for physical, material, and biological science at the nanoscale. This is especially true for higher-dimensional acquisition modes that add resolution in time, depth (via tomography), scattering angle (through STEM-diffraction, also called 4D STEM), and spectroscopy. Unfortunately, these higher-dimensional modes are extremely bandwidth-hungry. Every added dimension greatly multiplies the number of bytes that need to be captured, with associated increases in acquisition time, sample damage, and equipment expense. This includes, more and more, the equipment needed to transfer, store, and analyze the data. Modern kilohertz-scale cameras are starting to address the issue, but they are not a complete solution, and they can be quite expensive. Yet much of the raw-byte-count data so acquired is either redundant or practically meaningless. Standard analysis methods such as principal component analysis can often eliminate 90% or more of these bytes without losing any information of interest to the user. Even then, the data is usually highly compressible, with strong multi-length-scale spatial correlations allowing standard image compression schemes to reduce the byte counts still further. Compressive sensing (CS) addresses this problem by posing the question: Why acquire the redundant information in the first place? CS, by acquiring data in a nonstandard way, effectively performs data compression in the analog domain, before the analog-to-digital conversion bottleneck. This speeds up acquisition, reduces data transfer and storage costs, and potentially allows a given piece of hardware to achieve resolution (time, spectral, etc.) well beyond what it would normally be capable of. IDES has developed a new approach for CS video acquisition in a TEM that uses post-sample deflection to array multiple images over a large camera. This allows many images to be captured in a single camera acquisition, even in a sequential non-CS operating mode. This enables nanosecond-precision exposure control and, surprisingly, substantially improves the quality of conventional TEM images, most likely because of the limitations of the millisecond-scale beam blanking systems used in many TEMs. When the images are exposed in a known pseudorandom sequence, returning to each image multiple times in one camera exposure, this in effect performs data compression on the video sequence. CS techniques can then reconstruct ~50-100 frames of kilohertz-scale video in a single acquisition from a conventional slow-scan camera, as has been demonstrated on three different test systems at LLNL, SNL-Albuquerque, and the University of Strasbourg. Coupled with modern high-frame-rate cameras, the time resolution could be even faster, ultimately limited only by beam current and signal-to-noise ratio limitations and not by the data collection system itself. This kind of rapid acquisition of images has clear applications in in situ TEM experimentation, tomography, STEM-diffraction, and potentially spectrum imaging. Moreover, the system is simple and modular and the installation is quick and non-invasive, requiring no modification of the TEM itself. It also by its nature integrates a high-precision timing system into a conventional TEM, thus providing advantages in terms of exposure control, image quality, and facilitation of complex multidimensional acquisition modes.
(Conference Room San Felipe)
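A minimal per-pixel model of the pseudorandom temporal multiplexing described above: each camera readout integrates a pseudorandom subset of the fast frames, and the fast-frame values are recovered by regularized least squares. This is a toy forward model with a simple smoothness penalty standing in for the stronger video priors used in practice, not IDES's reconstruction pipeline; the matrix sizes, noise level, and regularization weight are arbitrary assumptions.

import numpy as np

rng = np.random.default_rng(4)
T, K = 20, 12                                         # 20 fast frames multiplexed into 12 readouts
S = rng.integers(0, 2, size=(K, T)).astype(float)     # pseudorandom exposure (shutter) matrix

# Per-pixel toy signal: slowly varying intensity over the T fast frames.
t = np.arange(T)
x_true = 1.0 + 0.5 * np.sin(2 * np.pi * t / T)

y = S @ x_true + 0.01 * rng.standard_normal(K)        # multiplexed, noisy camera readouts

# Regularized least-squares recovery with a second-difference smoothness penalty.
D = np.diff(np.eye(T), n=2, axis=0)
lam = 1.0
x_rec = np.linalg.solve(S.T @ S + lam * D.T @ D, S.T @ y)
print("relative error:", np.linalg.norm(x_rec - x_true) / np.linalg.norm(x_true))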
10:00 - 10:30 Maximilian März: Does (co-)sparsity characterize the success (or failure) of l1-analysis recovery in compressed sensing? (Conference Room San Felipe)
10:30 - 11:00 Coffee Break (Conference Room San Felipe)
11:00 - 11:55 Kevin Kelly: Optical Domain Compressive Imaging as a Model for Electron Microscopies
Our research group has been successful in translating the mathematics of compressive sensing into a variety of optical imaging systems. This talk will review the results with a specific focus on the tradeoffs between algorithmic restrictions and hardware limitations in these systems. Examples will include hyperspectral optical microscopy for both dark-field and sum-frequency generation imaging. Sparse manifold secants and compressive neural network measurements for machine-vision object recognition will also be discussed with an eye towards their utility in cryo-electron microscopy. Lastly, we will discuss a new compressive imaging algorithm that exploits the inherent redundancy in the temporal-spatio-spectral datacube; when combined with a unique optical system based on a single light modulator and a single detector, it is capable of acquiring hyperspectral video imaging at a compression ratio of nine hundred to one.
(Conference Room San Felipe)
11:55 - 12:50 Holger Rauhut: On mathematical aspects of compressive sensing (Conference Room San Felipe)
12:50 - 13:30 Discussion on compressive sensing (Conference Room San Felipe)
13:30 - 15:00 Lunch (Restaurant Hotel Hacienda Los Laureles)
15:00 - 16:00 Paul Voyles: Structure Optimization for Complex Materials Incorporating Microscopy (and Other) Data (Conference Room San Felipe)
16:00 - 16:30 Coffee Break (Conference Room San Felipe)
16:30 - 17:00 Lewys Jones: Using Bayesian Inference to Improve 3D Nanoparticle Reconstructions from Single-projection ADF Observations
The scanning transmission electron microscope (STEM) is now routinely able to probe the structure and chemistry of materials at the atomic scale. However, interpretation of the rich data obtained often remains somewhat qualitative. In recent years much progress has been made in putting such analysis on a more quantitative footing, and in understanding both the precision and accuracy of experimental annular dark-field images from the scanning transmission electron microscope (ADF-STEM). A particular interest for this type of data is atom counting within mono-metallic nanoparticles, where the number of atoms in individual atomic columns can be used to rebuild the 3D structure. However, by using Bayesian methods to consider the whole ensemble of column observations, we can retrieve more realistic nanostructures with an overall lower system energy. This presentation will include a brief recap of the image formation and contrast mechanism in ADF STEM, a discussion of the tools for counting individual atoms in nanostructures, and how we are now able to invert projected 2D images back to plausible 3D structures. A special example will be presented where the surface sites of Pt nanoparticle catalysts are evaluated for their suitability for use in catalysing the oxygen reduction reaction in fuel-cell cathodes.
(Conference Room San Felipe)
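A heavily simplified sketch of the atom-counting step discussed above: each column's scattered intensity is assigned a maximum a posteriori atom count under a Gaussian likelihood centred at integer multiples of a single-atom cross-section, with an independent prior over counts. The actual approach couples the columns through an ensemble/energy term, which is not modelled here; the cross-section, noise level, and flat prior are illustrative assumptions.

import numpy as np

def map_atom_counts(intensities, mu1, sigma, prior):
    # MAP atom count per column: Gaussian likelihood with mean n * mu1,
    # independent prior over candidate counts n = 0, 1, ..., len(prior) - 1.
    n_vals = np.arange(len(prior))
    log_lik = -0.5 * ((intensities[:, None] - n_vals[None, :] * mu1) / sigma) ** 2
    log_post = log_lik + np.log(prior)[None, :]
    return n_vals[np.argmax(log_post, axis=1)]

# Toy example: simulated column cross-sections for counts of 1 to 10 atoms.
rng = np.random.default_rng(5)
mu1, sigma = 1.0, 0.15                        # single-atom cross-section and measurement noise
true_counts = rng.integers(1, 11, size=200)
intensities = true_counts * mu1 + sigma * rng.standard_normal(200)
prior = np.ones(12) / 12                      # flat prior over counts 0..11
est = map_atom_counts(intensities, mu1, sigma, prior)
print("fraction of columns counted correctly:", np.mean(est == true_counts))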
17:00 - 18:00 Peter Nellist: Algorithms for processing the STEM 4D ptychography data-set: phase accuracy and dose robustness
The development of fast direct electron detectors has enabled the routine collection of the 4D STEM dataset (diffraction patterns as a function of probe position). One application of this 4D dataset is ptychography, which allows low-noise phase images to be formed. In this presentation, I will start by showing how electron ptychography can be used to solve real materials problems, and demonstrate some of its capabilities for correction of residual aberrations and optical sectioning to gain 3D specimen information. I will then go on to discuss in more detail the current approaches to reconstructing the phase image from the 4D data set, including simplifications that arise when using a weak phase object, the use of the Wigner distribution deconvolution approach, and a comparison with the ePIE iterative reconstruction method. Signal-to-noise ratios, the danger of noise amplification and other artefacts will be discussed.
(Conference Room San Felipe)
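For reference, a single update of the ePIE iteration mentioned above can be written compactly. The sketch below updates the object and probe at one scan position using the standard ePIE rules (Fourier-modulus replacement followed by the object and probe update steps); it assumes a single-slice model with known probe positions, and the smoke-test data at the end are synthetic.

import numpy as np

def epie_update(obj, probe, diff_intensity, pos, alpha=1.0, beta=1.0):
    # obj:            complex 2D object transmission function (updated in place)
    # probe:          complex 2D probe array, smaller than obj (updated in place)
    # diff_intensity: measured diffraction intensity at this probe position
    # pos:            (row, col) of the probe's top-left corner within obj
    r, c = pos
    h, w = probe.shape
    o_patch = obj[r:r + h, c:c + w].copy()

    psi = probe * o_patch                                         # exit wave
    Psi = np.fft.fft2(psi)
    Psi = np.sqrt(diff_intensity) * np.exp(1j * np.angle(Psi))    # enforce measured modulus
    psi_new = np.fft.ifft2(Psi)
    diff = psi_new - psi

    obj[r:r + h, c:c + w] += alpha * np.conj(probe) / np.max(np.abs(probe) ** 2) * diff
    probe += beta * np.conj(o_patch) / np.max(np.abs(o_patch) ** 2) * diff
    return obj, probe

# Smoke test on synthetic data for a single probe position.
rng = np.random.default_rng(6)
yy, xx = np.mgrid[0:32, 0:32]
probe = np.exp(-((xx - 16) ** 2 + (yy - 16) ** 2) / 50.0).astype(complex)
target = np.exp(1j * 0.3 * rng.standard_normal((64, 64)))         # "true" phase object
obj = np.ones((64, 64), dtype=complex)                            # initial object guess
I_meas = np.abs(np.fft.fft2(probe * target[10:42, 10:42])) ** 2
obj, probe = epie_update(obj, probe, I_meas, (10, 10))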
18:00 - 19:00 Discussion on incorporation of prior information through a Bayesian framework (Conference Room San Felipe)
19:00 - 21:00 Dinner (Restaurant Hotel Hacienda Los Laureles)
Wednesday, October 18
07:30 - 08:45 Breakfast (Restaurant at your assigned hotel)
08:45 - 09:30 Toby Sanders: Techniques for Data Alignment and Image Reconstruction for Electron Tomography
Electron tomography is a technique to obtain 3D nanoscale image reconstructions from 2D electron microscopy projection images, i.e. a tomographic tilt series. Due to the nature of imaging at the nanoscale, the data processing procedures leading from data acquisition to accurate approximation of the 3D scene are far from automated and rarely streamlined. These challenges include accurate alignment of the sinogram data and accurate reconstruction from a limited number of angles in the tilt series. This talk addresses some of the most recent advances towards resolving these issues and the concerns that still remain. We consider two new methods for accurate alignment of the sinogram data, which can be characterized as center-of-mass methods and phase-based autofocusing [1,2]. Next, we consider accurate 3D reconstruction techniques based on higher order total variation L1 regularization and multiscale generalizations [3]. [1] Sanders, Toby, et al. "Physically motivated global alignment method for electron tomography." Advanced Structural and Chemical Imaging 1.1 (2015): 4. [2] Sanders, Toby, Ilke Arslan. "Improved 3D resolution of electron tomograms using robust mathematical data processing techniques." Microscopy and Microanalysis. (accepted, 2017). [3] Sanders, Toby, et al. "Recovering fine details from under-resolved electron tomography data using higher order total variation L1 regularization." Ultramicroscopy 174 (2017): 97-105.
(Conference Room San Felipe)
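As an illustration of the centre-of-mass idea mentioned above, the sketch below shifts every projection in a sinogram so that its centre of mass sits at the detector centre, which amounts to placing the object's centre of mass on the rotation axis. The published method is more elaborate (it enforces the physically expected centre-of-mass behaviour across the whole tilt series); this per-projection version is only a simplified stand-in, and the toy data are arbitrary.

import numpy as np
from scipy.ndimage import shift as nd_shift

def com_align(sinogram):
    # Shift each 1D projection so its centre of mass lies at the detector centre.
    # sinogram: (n_angles, n_detector) array of non-negative projection data.
    n_det = sinogram.shape[1]
    centre = (n_det - 1) / 2.0
    coords = np.arange(n_det)
    aligned = np.empty_like(sinogram, dtype=float)
    for i, proj in enumerate(sinogram):
        com = np.sum(coords * proj) / np.sum(proj)
        aligned[i] = nd_shift(proj, centre - com, mode='nearest')   # sub-pixel shift
    return aligned

# Toy usage: projections of a Gaussian blob with random detector offsets.
rng = np.random.default_rng(9)
det = np.arange(128)
sino = np.stack([np.exp(-0.02 * (det - 64 - rng.normal(0, 5)) ** 2) for _ in range(30)])
aligned = com_align(sino)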
09:30 - 10:15 Doga Gursoy: Experiment Design and Data Analysis in Tomographic X-ray Imaging and Microscopy
As the sophistication of today's experiments grows at synchrotron light sources, collecting the most informative data has become greatly relevant, necessitating the development of methods and techniques that can provide good quality reconstructions from big data streams. Overcoming these challenges commonly requires developing better approximations of physical systems, and when these approximations are not available or are too costly to compute, approaches based on machine learning can help in analysis and/or automating the process. In this talk, I will first give a broad overview of the status of imaging and microscopy applications, and then describe how existing big data, compressed sensing, and machine learning methods can be adopted to enable faster and more reliable information extraction from complex measurement data. I will also highlight the need for an integration of hardware and software in building successful instruments of the future, especially after the realization of the next generation of x-ray sources providing orders of magnitude increased brilliance and coherence.
(Conference Room San Felipe)
10:15 - 10:45 Coffee Break (Conference Room San Felipe)
10:45 - 11:15 Colin Ophus: PRISM and Prismatic - a new algorithm and code for very fast scanning transmission electron microscopy (STEM) simulations
Image simulation for scanning transmission electron microscopy at atomic resolution for samples with realistic dimensions can require very large computation times using existing simulation algorithms. We present a new algorithm named PRISM that combines features of the two most commonly used algorithms, namely the Bloch wave and multislice methods. PRISM uses a Fourier interpolation factor f that has typical values of 4–20 for atomic resolution simulations. We show that in many cases PRISM can provide a speedup that scales with f^4 compared to multislice simulations, with a negligible loss of accuracy. We demonstrate the usefulness of this method with large-scale scanning transmission electron microscopy image simulations of a crystalline nanoparticle on an amorphous carbon substrate. We also present a software package called Prismatic for parallelized simulation of image formation in STEM using both the PRISM and multislice methods. By distributing the workload between multiple CUDA-enabled GPUs and multicore processors, accelerations as high as 1000x for PRISM and 15x for multislice are achieved relative to traditional multislice implementations using a single 4-GPU machine. Prismatic is freely available both as an open-source CUDA/C++ package with a graphical user interface for Windows and OSX, and as a Python package.
(Conference Room San Felipe)
11:15 - 12:00 Christian Dwyer: Phase measurement beyond the shot-noise limit (Conference Room San Felipe)
12:00 - 12:30 Discussion on Quantum Metrology applied to Electron Microscopy - can we dramatically improve the signal-to-dose ratio for electrons? (Conference Room San Felipe)
12:30 - 13:30 Lunch (Restaurant Hotel Hacienda Los Laureles)
13:30 - 19:00 Free Afternoon (Monte Alban - Oaxaca)
19:00 - 21:00 Dinner (Restaurant Hotel Hacienda Los Laureles)
Thursday, October 19
07:30 - 09:00 Breakfast (Restaurant at your assigned hotel)
09:00 - 09:45 Emilie Ringe: Hyperspectral optical and electron microscopy: correlation and deconvolution approaches
In this talk, approaches for the correlation of optical signals with electron-beam techniques are introduced and application examples are discussed. Recent developments in scanning spatial hyperspectral optical spectroscopy, allowing the interrogation of 100 x 100 micrometer areas in minutes, have revolutionized our ability to acquire statistically significant data on light-matter interactions. Coupled with spatially correlated electron microscopy and spectroscopy, supplying information on structure, composition and local fields, these approaches provide a new tool to quantify structure-property relationships across lengthscales. Systems of particular interest to the Ringe group include 1) nanoparticles (NPs) supporting localized surface plasmon resonances (LSPRs), in which variations in NP size, shape, and composition manipulate the near-field and far-field LSPR response, and 2) semiconducting 2D materials, where the optical bandgap can be tuned by the defect density, phase, and doping/oxidation.
(Conference Room San Felipe)
09:45 - 10:15 Discussion on Challenges in spectral deconvolution: blind source approaches vs fitting (Conference Room San Felipe)
10:15 - 10:30 Peter Binev: Processing of EDX tomography data
We present a procedure for processing EDX tomography data. The procedure consists of several steps: analysis of the combined spectrum of all the data; identification and extraction of the concentrations of the main elements for each of the frames; alignment of the frames; and tomographic reconstruction of each of the main elements. The analysis includes calculation of local centre-of-mass displacements, decomposition of the combined spectrum into a sum of Gaussians, and discrete filtering for extraction of the element concentrations. The alignment procedure estimates the frame displacements using the centres of mass of several element maps. The tomographic reconstruction uses a generalization of TV regularization employing a total variation term of order two. This is joint work with Kelsey Larkin (University of South Carolina), Zineb Saghi (CEA-LETI, Grenoble), and Toby Sanders (Arizona State University).
(Conference Room San Felipe)
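A minimal sketch of the "sum of Gaussians" decomposition step: a toy spectrum with two characteristic peaks is fitted with scipy.optimize.curve_fit. The peak positions, widths, and noise level are illustrative assumptions, and the real pipeline operates on the combined spectrum of all frames with many more lines and additional processing.

import numpy as np
from scipy.optimize import curve_fit

def gaussian_sum(E, *params):
    # Sum of Gaussians; params are flattened (amplitude, centre, width) triples.
    y = np.zeros_like(E)
    for a, mu, s in zip(params[0::3], params[1::3], params[2::3]):
        y += a * np.exp(-0.5 * ((E - mu) / s) ** 2)
    return y

# Toy EDX-like spectrum with two characteristic peaks plus noise.
rng = np.random.default_rng(7)
E = np.linspace(0.0, 10.0, 1000)                                   # energy axis, keV
spectrum = gaussian_sum(E, 100.0, 1.74, 0.07, 60.0, 6.40, 0.10)    # e.g. Si K and Fe K lines
spectrum = spectrum + rng.normal(0.0, 2.0, E.size)

p0 = [80.0, 1.7, 0.1, 50.0, 6.5, 0.1]                              # rough initial guesses
popt, _ = curve_fit(gaussian_sum, E, spectrum, p0=p0)
print("fitted (amplitude, centre, width) triples:", popt.reshape(-1, 3))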
10:30 - 11:00 Coffee Break (Conference Room San Felipe)
11:00 - 11:30 Aidan Rooney: Image processing to characterise 2D material heterostructures and graphitic folds
2D materials such as graphene and transition metal dichalcogenides (TMDCs) have demonstrated new properties underpinned by exotic charge carriers with enhanced conductivity and novel light-matter interactions. These properties can be harnessed by ‘stacking’ 2D crystals on top of one another to form a van der Waals heterostructure: a series of atomically thin sheets glued together by weak forces[1]. Heterostructures made in this way allow the new properties of 2D materials to be characterised and applied as transistors[2], optical modulators[3] and light emitting diodes[4]. We present cross-sectional STEM images of these heterostructures, showing how each 2D crystal interfaces with its neighbour at high resolution. Cross sections are fabricated in a dual-beam FIB-SEM instrument using the in situ lift-out method and polished with low energy ions to achieve electron transparency.[5] Cross sections were imaged in high resolution HAADF STEM using a probe-side aberration corrected FEI Titan G2 80-200 kV with an X-FEG electron source. The nature of these van der Waals interfaces not only determines the carrier injection between components, but also affects the bandstructure of the device and its ultimate functionality. However, measurements at these buried interfaces are only possible by cross-sectional STEM. Image processing and principal component analysis were used to determine the width of the interfaces, demonstrating that some 2D materials have cleaner interfaces than others and that fabrication in inert atmosphere produces better interfaces than in air.[6] This analysis is extended to bends and folds which underpin the structure of kink-bands in graphite and other bulk ‘van der Waals’ materials such as hexagonal boron nitride and MoSe2. Extensive image processing allows the bent basal planes to be fitted by functions to reveal their radius of curvature, angle and distance to the nearest neighbour plane. Three broad classes of bend emerge which depend on the bend angle and crystal thickness. [1] A. K. Geim and I. V. Grigorieva, ‘Van der Waals heterostructures’, Nature, vol. 499, no. 7459, pp. 419–425, Jul. 2013. [2] L. Britnell et al., ‘Field-Effect Tunneling Transistor Based on Vertical Graphene Heterostructures’, Science, vol. 335, no. 6071, pp. 947–950, Feb. 2012. [3] P. A. Thomas et al., ‘Nanomechanical electro-optical modulator based on atomic heterostructures’, Nat. Commun., vol. 7, p. ncomms13590, Nov. 2016. [4] F. Withers et al., ‘Light-emitting diodes by band-structure engineering in van der Waals heterostructures’, Nat Mater, vol. 14, no. 3, pp. 301–306, 2015. [5] M. Schaffer, B. Schaffer, and Q. Ramasse, ‘Sample preparation for atomic-resolution STEM at low voltages by FIB’, Ultramicroscopy, vol. 114, no. 0, pp. 62–71, 2012. [6] A. P. Rooney et al., ‘Observing Imperfection in Atomic Interfaces for van der Waals Heterostructures’, Nano Lett., Jul. 2017.
(Conference Room San Felipe)
11:30 - 12:30 James LeBeau: Quantifying local structure and chemistry with the scanning transmission electron microscope
In his talk, J. LeBeau presented his work on improving both the accuracy and precision of STEM imaging. While electron microscopy has been revolutionized by the aberration corrector, which dramatically improved spatial resolution, real-space distance measurements have remained semi-quantitative. In particular, accuracy and precision for scanning transmission electron microscopy (STEM) were significantly hampered by the presence of sample drift and scan distortion. Until recently, this limitation has obscured the ability to characterize minute changes to the atomic structure that can ultimately define material properties. He discussed his approach to resolve this problem, called revolving scanning transmission electron microscopy (RevSTEM). The method uses a series of fast-acquisition STEM images, but with the scan coordinates rotated between successive frames, which encodes the drift rate and direction into the resulting image distortion. Multiple case studies were presented to highlight the power of this new technique to characterize materials. For example, picometer-precise measurements were shown to enable the direct quantification of static atomic displacements within a complex oxide solid solution due to local chemistry. He also discussed recent work implementing deep convolutional neural networks to autonomously quantify electron diffraction data. The networks were shown to first calibrate the zero-order disk size, center position, and rotation without the need for pretreating the data for sample thickness and tilt measurements. The performance of the network was explored as a function of a variety of variables including thickness, tilt, and dose. The processing speed was also shown to outpace a least squares approach by orders of magnitude. He also discussed the generality of the method to other materials/orientations as well as a hybrid approach that combines the features of the neural network with least squares fitting for even more robust analysis.
(Conference Room San Felipe)
12:30 - 13:30 Sarah Haigh: Challenges for extracting quantitative data from in situ (S)TEM experiments - a talk followed by a discussion
The ability to control the growth, agglomeration and degradation of materials at the atomic scale is one of the key aims of modern material science. However, probing dynamic chemical processes occurring within a liquid or gas environment is highly challenging. In their Science review article, Tao and Salmeron comment, “the development of improved methods to synthesise semiconductor, metal and dielectric nanoparticles depends on a thorough understanding of the growth mechanisms that occur at solid-liquid or solid-gas interfaces. In situ studies relevant to energy conversion technology would be crucial for understanding reaction mechanisms and designing new and efficient materials for a wide range of energy conversion processes”.[1] Environmental-cell (e-cell) transmission electron microscopy (TEM) is the only technique with the potential to directly probe nanomaterial synthesis and degradation occurring in liquids and gases at atomic resolution [2]. Current state-of-the-art TEM and scanning TEM (STEM) instruments have enabled excellent spatial resolution (~0.05 nm) using coherent aberration corrected lenses, monochromated and higher brightness electron sources, and faster detectors with improved efficiency. A further advantage of STEM is its ability to combine atomic resolution imaging with local elemental analysis obtained via energy dispersive x-ray spectroscopy (EDXS) or electron energy loss spectroscopy (EELS). The vast majority of TEM experiments are performed under high vacuum conditions. Unfortunately several high profile studies have revealed that the structure of functional materials at room temperature in a vacuum may be significantly different from that in their operational environment.[3,4] Consequently our ability to understand the properties of the material under investigation may be severely decreased by the need for vacuum conditions, such that the results obtained may even be misleading. The problem is even more serious when considering dynamic in situ experiments. The majority of important nanomaterial reactions we would wish to study (such as the growth of nanoparticles, the corrosion of materials, or the action of catalysts) do not occur in high vacuum. The last 5 years have seen an explosion of interest in new silicon chip based STEM/TEM holder systems, which allow samples to be imaged in gas and liquid environments, with applied bias, and at elevated temperature.[5-9] The key component of all e-cell holders is a pair of silicon chips with electron-transparent silicon nitride windows. The liquid or gas environment is encapsulated between the two chips while a high vacuum is maintained in the rest of the microscope. Dedicated Environmental TEM (ETEM) instruments also allow samples to be imaged in gases.[10] Both e-cell holders and ETEM microscopes suffer from limitations associated with poor signal to noise and poor spectroscopy capabilities due to the presence of the imaging environment and/or the silicon nitride windows and surrounding e-cell. This session will begin with some example case studies illustrating the capabilities of current in situ imaging, including ETEM observation of Al oxidation and e-cell STEM imaging of particle dynamics and elemental analysis. The key problems of quantitative data interpretation and image/spectral analysis will be highlighted to facilitate discussions on how these could be improved. [1] F. Tao and M. Salmeron, Science 2011 331, 171; [2] M.J.Williamson et al. Nat. Mater. 2003, 2, 532; [3] P.L.Hansen, et al.
Science 2002, 295, (5562), 2053; [4] H.Yoshida, et al. Science 2012, 335, 317; [5] R.Boston et al., Science 2014, 334, (6184) 623; [6] F.Panciera, et al. Nat. Mater. 2015, 14, 8, 820 [7] S.B. Alam, Nano Lett., 2015, 15 (10), 6535; [8] M.E.Holtz, et al, Nano Lett., 2014, 14, 1453 [9] M.J.Dukes, et al. Chem. Commun., 2013, 49, 3007 [10] E.D.Boyes et al, Annalen der Physik 2013, 525, (6), 423;
(Conference Room San Felipe)
13:30 - 15:00 Lunch (Restaurant Hotel Hacienda Los Laureles)
15:00 - 16:00 Benjamin Berkels: Joint denoising and distortion correction of atomic scale scanning transmission electron microscopy images (Conference Room San Felipe)
16:00 - 16:30 Coffee Break (Conference Room San Felipe)
16:30 - 17:15 Andrew Yankovich: Exposing new atomic-scale information about nanomaterials by improving the quality and quantifiability of STEM data (Conference Room San Felipe)
17:15 - 18:15 Clayton Webster: Polynomial approximation via compressed sensing of high dimension complex-valued functions
In this talk, we present a compressed sensing approach to polynomial approximation of complex-valued functions in high dimensions. Of particular interest is the parameterized PDE setting, where the target function is smooth, characterized by a rapidly decaying orthonormal expansion, whose most important terms are captured by a lower (or downward closed) set. By exploiting this fact, we develop a novel weighted minimization procedure with a precise choice of weights, and a modification of the iterative hard thresholding method, for imposing the downward closed preference. We will also present theoretical results that reveal that our new computational approaches possess a provably reduced sample complexity compared to existing compressed sensing, least squares, and interpolation techniques. In addition, the recovery of the corresponding best approximation using our methods is established through an improved bound for the restricted isometry property. Numerical examples are provided to support the theoretical results and demonstrate the computational efficiency of the new weighted minimization method.
(Conference Room San Felipe)
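For orientation, the sketch below implements plain iterative hard thresholding on a toy sparse-recovery problem; the talk's method replaces the simple "keep the s largest coefficients" step with a weighted minimization and a downward-closed (lower-set) constraint, neither of which is modelled here. The matrix size, sparsity level, and scaling are arbitrary assumptions.

import numpy as np

def iht(A, y, s, n_iter=300):
    # Iterative hard thresholding for y = A x with an s-sparse x:
    # gradient step on the least-squares term, then keep the s largest entries.
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        x = x + A.T @ (y - A @ x)
        x[np.argsort(np.abs(x))[:-s]] = 0.0
    return x

# Toy example: recover a 10-sparse vector from 80 random measurements.
rng = np.random.default_rng(10)
m, N, s = 80, 256, 10
A = rng.standard_normal((m, N))
A /= np.linalg.norm(A, 2)                     # scale so the unit-step iteration is stable
x_true = np.zeros(N)
x_true[rng.choice(N, s, replace=False)] = rng.choice([-1.0, 1.0], size=s)
y = A @ x_true
x_rec = iht(A, y, s)
print("relative error:", np.linalg.norm(x_rec - x_true) / np.linalg.norm(x_true))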
18:15 - 19:00 Quentin Ramasse: Pushing the limits of electron energy loss spectroscopy: from phonons to core losses in real and momentum space
A new generation of beam monochromators has recently pushed the energy resolution of (scanning) transmission electron microscopes (S)TEMs deep into the sub-20 meV range [1]. In addition to the obvious increase in resolution, the flexibility of these instruments allows the energy resolution, beam current and optics to be adjusted seamlessly within a greatly increased range, thus enabling tantalising new modes of operation. This contribution will illustrate some of the new possibilities offered by these instruments through a number of examples, putting a specific emphasis on the need for advanced theoretical calculations to rationalise the experimental results. The increase in resolution has made it possible to explore the phonon region of the electron energy loss spectrum (EELS). Although the physical origin of vibrational signals in the STEM is very similar to that giving rise to low-energy phonon vibrations in neutron or inelastic X-ray scattering, differences in experimental geometries and selection rules, among other factors, have made their interpretation challenging. A better understanding of this phonon response can be achieved by observing the dependence of the phonon peaks under different optical conditions and mapping their energy in momentum space. A theoretical formalism based on that used by inelastic X-ray and neutron scattering [2] can be applied to obtain a good agreement with experimental data, e.g. from two polymorphs of boron nitride, across different directions in the Brillouin zone [3]. However, while this approach is successful, more complex and computationally demanding models taking into account finite momentum resolution and finite sample size are shown to be necessary if a truly quantitative match between experiment and theory is to be achieved [3]. Furthermore, the effect of atomic scale defects on the bonding of materials can now be fingerprinted through core and low loss spectroscopy with a greater precision and sensitivity than ever before. Here, 2-dimensional materials have provided an ideal ‘sandbox’: low energy excitons can be clearly distinguished in MoS2, while the introduction of a single B or N atom in graphene results in subtle localised modifications of its electronic structure [5,6]. However, for a reliable prediction of the loss function of these materials at finite momentum transfer within density functional theory (DFT), it is necessary to invoke corrections for local field effects, in addition to any excitonic modification to the optical absorption. Thus for a comprehensive understanding of the experimental results a theoretical treatment beyond classical dielectric theory is imperative. In conclusion, this contribution will aim to illustrate that while modern electron microscopes are now sufficiently advanced to push the validity of the approximations used in theoretical electronic structure calculations, there is evidently a growing need to increase the efficiency of more advanced computational schemes using the GW and BSE formalisms, among others, and to tackle increasingly large atomic models to provide realistic and quantitative simulations to validate the experiments. [1] O.L. Krivanek, T.C. Lovejoy, N. Dellby et al., Nature 514 (2014), pp. 209-212. [2] E. Burkel, J.Phys.: Cond. Mat. 13 (2001), 7627. [3] F.S. Hage, R. Nicholls, J. Yates et al., Submitted (2017). [4] H.C. Nerl, K.T. Winther, F.S. Hage et al., NPJ 2D Mat. Appl. 1 (2017), pp. 1-11. [5] T.P. Hardcastle, C.R. Seabourne, D.M. Kepaptsoglou et al., J.Phys.: Cond. Mat.
29 (2017), 225303. [6] F.S. Hage, T.P. Hardcastle, M.N. Gjerding et al., Submitted (2017).
(Conference Room San Felipe)
19:00 - 21:00 Dinner (Restaurant Hotel Hacienda Los Laureles)
Friday, October 20
07:30 - 09:00 Breakfast (Restaurant at your assigned hotel)
09:00 - 09:15 Huolin Xin: Artificially intelligent S/TEM
Deep learning introduces the potential for autonomous S/TEM characterization, a step towards unsupervised data acquisition and analysis through machine learning. Whole-image classification is the first step towards the long-term goal of autonomous image data acquisition and analysis. We have successfully retrained pre-existing deep learning convolutional neural networks (CNNs), namely AlexNet, b-FCN, and U-Net, for autonomous whole-image classification and analysis on TEM image datasets.
(Conference Room San Felipe)
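A minimal PyTorch sketch of the retraining (transfer-learning) idea described above: load an ImageNet-pretrained AlexNet, replace its final layer, and fine-tune it on a folder of labelled TEM images. The dataset path "tem_images/", the number of classes, and all hyperparameters are hypothetical placeholders, and the authors' actual pipeline (including the b-FCN and U-Net variants) is not reproduced here.

import torch
import torch.nn as nn
from torchvision import datasets, models, transforms

# Load an ImageNet-pretrained AlexNet and replace its final layer so it
# classifies TEM images into (here) two hypothetical classes.
num_classes = 2
model = models.alexnet(pretrained=True)
model.classifier[6] = nn.Linear(4096, num_classes)

# Hypothetical folder of labelled TEM images, one sub-folder per class.
tfm = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.Grayscale(num_output_channels=3),   # TEM images are single-channel
    transforms.ToTensor(),
])
data = datasets.ImageFolder("tem_images/", transform=tfm)
loader = torch.utils.data.DataLoader(data, batch_size=16, shuffle=True)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
criterion = nn.CrossEntropyLoss()

model.train()
for epoch in range(5):
    for images, labels in loader:
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()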
09:15 - 10:30 Discussions (Conference Room San Felipe)
10:30 - 11:00 Coffee Break (Conference Room San Felipe)
11:00 - 12:00 Discussions (Conference Room San Felipe)
12:00 - 14:00 Lunch (Restaurant Hotel Hacienda Los Laureles)