Plenary Speakers

  • Satellite Radar Earth Remote Sensing: Icebergs and Winds

    Microwave remote sensing extracts environmental information from what are often considered the undesired components of signals in surveillance radar: noise and clutter. Radiometers exploit noise, while remote sensing radars employ clutter to study the Earth. Satellite-based radar sensors, coupled with computer processing, offer unique perspectives and measurements of important geophysical processes beyond just imaging. In this talk, I consider applications of satellite radar measurements of the microwave scattering properties of the Earth’s surface. Synthetic aperture radar (SAR) systems make highly detailed backscatter images regardless of weather or solar illumination conditions, with both military and civilian applications. However, other types of satellite radar, such as altimeters, scatterometers, and weather radars, provide unique measurements and perspectives. For example, over the ocean, radar backscatter is related to wind-generated roughness and can be used to measure wind speed and direction. Radar backscatter is particularly sensitive to melt/freeze conditions and can thus be used to map and monitor sea ice and soil conditions. The contrast between ocean and ice scattering enables tracking of major icebergs in the Antarctic. Using precise range measurements, satellite altimeters measure ocean topography, from which ocean currents can be inferred. Satellite weather radars measure rain rates and cloud density. With existing and planned systems, we are in the golden age of satellite radar remote sensing.
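    The wind retrieval idea above can be sketched in a few lines: scatterometers relate backscatter to wind through a geophysical model function (GMF) and invert it by minimizing a cost over candidate winds. The power-law GMF and its coefficients below are purely illustrative stand-ins, not the operational CMOD family.

    ```python
    import numpy as np

    def gmf_sigma0(wind_speed, rel_azimuth_deg):
        """Toy geophysical model function: backscatter (linear units) as a
        power law in wind speed with a cosine dependence on the wind
        direction relative to the radar look. Coefficients are illustrative."""
        a0, gamma = 0.01, 1.6                     # hypothetical calibration constants
        chi = np.deg2rad(rel_azimuth_deg)
        return a0 * wind_speed**gamma * (1.0 + 0.4 * np.cos(2 * chi))

    def retrieve_wind_speed(sigma0_obs, rel_azimuth_deg):
        """Invert the toy GMF by brute-force search, mimicking how
        scatterometer retrievals minimize a cost over candidate winds."""
        candidates = np.linspace(0.5, 40.0, 4000)           # m/s search grid
        cost = (gmf_sigma0(candidates, rel_azimuth_deg) - sigma0_obs)**2
        return candidates[np.argmin(cost)]

    # Simulate a 10 m/s wind viewed 30 degrees off the wind direction,
    # then recover the speed from the "measured" backscatter.
    sigma0 = gmf_sigma0(10.0, 30.0)
    est = retrieve_wind_speed(sigma0, 30.0)
    ```

    Real retrievals combine several azimuth looks to resolve wind direction as well, which introduces the well-known directional ambiguities this single-look sketch ignores.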

  • The NASA-ISRO Synthetic Aperture Radar Mission – The Final Stretch Toward a New Capability for Earth Science and Applications

    The National Aeronautics and Space Administration (NASA) in the United States and the Indian Space Research Organisation (ISRO) are developing the NASA-ISRO Synthetic Aperture Radar (NISAR) mission, now planned for launch in early 2024. The mission will use synthetic aperture radar to map Earth’s solid surfaces every 12 days, persistently on ascending and descending portions of the orbit, over all land and ice. The mission’s primary objectives are to study Earth’s land and ice deformation, and its ecosystems, in areas of common interest to the US and Indian science communities. This single-observatory solution, with L-band (24 cm wavelength) and S-band (9.4 cm wavelength) imaging radars, has a swath of over 240 km at 5-10 m resolution, using full polarimetry where needed. To achieve these unprecedented capabilities, both radars use a reflector-feed system, whereby the feed aperture elements are individually sampled to enable a scan-on-receive capability at both L-band and S-band. The L-band and S-band electronics and feed apertures, provided by NASA and ISRO respectively, share a common 12-m diameter deployable reflector/boom system, provided by NASA. These two radars, which can operate simultaneously, produce prodigious amounts of data even with FPGA-based on-board digital beamforming and filtering to reduce data rates. Given the high data rates and ambitious coverage requirements, new technologies for high-rate Ka-band downlink complement these first-of-a-kind radar systems.
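    The digital beamforming at the heart of scan-on-receive can be illustrated with a minimal narrowband sketch: samples from individually digitized feed elements are combined with phase weights to steer the receive beam toward the echo, and sweeping those weights in time follows the echo across a wide swath. The geometry below is a generic uniform linear array with illustrative parameters, not the NISAR feed layout.

    ```python
    import numpy as np

    N = 12                      # feed elements (illustrative)
    d = 0.5                     # element spacing in wavelengths
    theta = np.deg2rad(20.0)    # echo direction of arrival

    n = np.arange(N)
    # Signal received across the array from direction theta (unit amplitude)
    x = np.exp(2j * np.pi * d * n * np.sin(theta))

    def beamform(x, steer_deg):
        """Apply conjugate phase weights steering the receive beam; returns
        the normalized array gain in the steered direction."""
        w = np.exp(2j * np.pi * d * n * np.sin(np.deg2rad(steer_deg)))
        return np.abs(np.vdot(w, x)) / N   # vdot conjugates the weights

    on_target = beamform(x, 20.0)    # steered at the echo: gain near 1
    off_target = beamform(x, -20.0)  # steered away: strongly suppressed
    ```

    Scan-on-receive simply re-evaluates these weights sample by sample as the echo's arrival angle sweeps across the feed during each pulse's return.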

    Slowed by the global pandemic, the mission is now approaching its final stage of integration and test. The radar electronics, GNSS unit, and solid-state recorder are mounted on an octagonal cylindrical radar instrument structure. This structure, as well as the Ka-band downlink system and associated control electronics, finished testing in early March 2023 and is being shipped to India for integration with the ISRO-provided spacecraft bus. The reflector/boom system completed testing in 2022 and will be shipped to India when needed in summer 2023. The integration and test period for the observatory is planned to complete in 2023, with the earliest possible launch date falling after the eclipse season for the planned NISAR orbit ends on January 30, 2024. The launch vehicle is ISRO’s GSLV Mark II.

    This talk will describe the mission, the measurements, and the technologies and techniques that will deliver over 40 Tbits of science and applications data per day to help us understand our ever-changing planet.
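    As a back-of-envelope check on why a high-rate Ka-band downlink is needed, the 40 Tbit/day figure implies a sustained average rate of several hundred megabits per second; the arithmetic below assumes continuous averaging over a day and is illustrative only.

    ```python
    # Average downlink rate implied by 40 Tbit of data per day.
    bits_per_day = 40e12
    seconds_per_day = 86400
    avg_rate_bps = bits_per_day / seconds_per_day   # roughly 463 Mbit/s sustained
    # Real ground-station contacts are intermittent, so the Ka-band link
    # must burst well above this average during each pass.
    ```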

  • Radar as an Enabling Technology for Next Generation Human Ambient Intelligence

    As technology advances and an increasing number of devices enter our homes and workplaces, humans have become an integral component of cyber-physical systems (CPS). One of the grand challenges of cyber-physical human systems (CPHS) is how to design autonomous systems in which human-system collaboration is optimized through improved understanding of human behavior. A new frontier within this landscape is afforded by the advent of low-cost, low-power millimeter-wave (mmWave) RF transceivers, which enable the deployment of RF sensors almost anywhere as part of the Internet-of-Things (IoT), smart environments, personal devices, and even wearables. RF sensors not only provide sensing capability when other sensors may be ineffective due to environmental factors, but also provide unique spatio-kinematic measurements that are complementary to those of other sensing modalities. Moreover, in indoor environments where privacy is a driving consideration, RF sensors offer relatively non-intrusive perception capabilities. Consequently, there have been exciting recent advancements in the use of RF sensing for human-computer interaction, remote health monitoring, and smart homes. Since the first research in radar-based human activity recognition over 15 years ago, when the technology was demonstrated in controlled lab settings, radar can now be found in many new devices hitting the market. These include the Google Soli sensor in cell phones for non-contact gesture recognition, as well as products under development by Amazon, Vayyar, and others for sleep monitoring, vital sign monitoring, and occupancy recognition. However, these applications only scratch the surface of the potential for radar-enabled CPHS. Future intelligent devices equipped with cognitive perception and learning will be able to decipher and respond to complex human behaviors much more effectively and robustly. This talk provides a detailed discussion of current sensing and machine learning challenges, as well as new perspectives that can help us overcome current limitations and pave the way for future radar-enabled interactive environments.
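    A common front end for radar-based activity recognition is the micro-Doppler spectrogram: the time-varying Doppler track of moving body parts, fed as an image to a classifier. The sketch below simulates the baseband return from a point scatterer whose range oscillates sinusoidally (a stand-in for a swinging limb) and forms a magnitude spectrogram with a sliding-window FFT; all parameters are illustrative and not tied to any specific sensor.

    ```python
    import numpy as np

    fs = 1000.0                         # slow-time sample rate (Hz)
    t = np.arange(0, 2.0, 1 / fs)
    wavelength = 0.004                  # ~77 GHz mmWave radar (m)
    r = 1.0 + 0.05 * np.sin(2 * np.pi * 1.5 * t)   # oscillating range (m)
    sig = np.exp(-4j * np.pi * r / wavelength)     # phase tracks two-way range

    def spectrogram(x, nfft=128, hop=32):
        """Magnitude STFT via a sliding Hann window: the usual first step
        before feeding micro-Doppler maps to a neural network."""
        win = np.hanning(nfft)
        frames = [np.fft.fftshift(np.fft.fft(win * x[i:i + nfft]))
                  for i in range(0, len(x) - nfft, hop)]
        return np.abs(np.array(frames)).T           # freq bins x time frames

    S = spectrogram(sig)    # sinusoidal Doppler track across the frames
    ```

    The peak amplitude and period of the resulting Doppler track encode the limb's swing speed and gait cadence, which is exactly the kinematic information activity classifiers exploit.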

  • Radar-Inspired Imaging for Breast Cancer Detection

    Biomedical applications at microwave and radio frequencies rely on differences in the permittivity and conductivity of biological tissues. The properties of healthy tissues span a wide range related to water content, while diseased tissues such as malignancies typically exhibit elevated properties. Leveraging these differences, microwave imaging has been investigated as an alternative method for breast cancer detection and treatment monitoring.
    Several approaches have been developed to map the properties of tissues and identify anomalies. Microwave tomography involves measuring signals transmitted through the tissues, then iteratively updating the properties of a model until simulations match these measurements. Radar-based approaches involve collecting reflections from tissues, then processing and focusing these reflections to identify anomalies. For both radar and tomography, the key challenges are designing a measurement system and interface that enable reliable and rapid data collection while operating close to the target tissues, developing imaging algorithms capable of detecting anomalies in a complex background, and reconciling the resulting microwave images with clinically available data.
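    The "processing and focusing" step in radar-based approaches is often a delay-and-sum (confocal) back-projection: for each candidate pixel, predicted round-trip delays to every antenna are compared against the measured echoes, and pixels whose delays match accumulate energy. The sketch below uses an idealized monostatic ring of antennas, a single point scatterer, and a constant wave speed; real systems must contend with heterogeneous tissue properties, which this toy ignores.

    ```python
    import numpy as np

    c = 2.0e8                                    # assumed in-tissue wave speed (m/s)
    target = np.array([0.01, 0.02])              # true scatterer position (m)
    angles = np.linspace(0, 2 * np.pi, 16, endpoint=False)
    antennas = 0.07 * np.c_[np.cos(angles), np.sin(angles)]  # 7 cm ring

    # Ideal measured round-trip delays from each monostatic antenna
    delays = 2 * np.linalg.norm(antennas - target, axis=1) / c

    def das_intensity(pixel):
        """Score how well a pixel's predicted delays match the measured
        ones, modeling the echo as a narrow Gaussian pulse (~0.1 ns)."""
        pred = 2 * np.linalg.norm(antennas - pixel, axis=1) / c
        return np.sum(np.exp(-((pred - delays) / 1e-10) ** 2))

    # Evaluate a coarse image grid; the brightest pixel is the scatterer.
    grid = [np.array([x, y]) for x in np.linspace(-0.03, 0.03, 31)
                             for y in np.linspace(-0.03, 0.03, 31)]
    scores = np.array([das_intensity(p) for p in grid])
    best = grid[int(np.argmax(scores))]
    ```

    In practice the coherent sum is taken over the recorded time-domain waveforms themselves, and the assumed wave speed (or a spatial map of it) strongly affects focus quality, which is one reason property estimation matters.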
    At the University of Calgary, we have developed several generations of prototype systems, focusing on demonstrating the consistency of images collected at different time points, as well as the feasibility of detecting tumors and treatment-related changes. Our most advanced radar-based system implements patient-specific capabilities, scanning the breast with four degrees of freedom in sensor positioning to enable consistent data collection. We leveraged the knowledge gained with this system to develop a novel approach that estimates locally averaged properties of tissues by detecting pulses traveling through the breast. With this approach, we have demonstrated a high degree of similarity between images captured at different time points, as well as symmetry between the properties of the right and left breasts. Comparing the images of the right and left breasts of cancer patients has also enabled tracking of treatment-related changes. Recently, our team began testing the next generation of this transmission system, which features improved resolution. The initial results obtained with this system add to the growing body of work illustrating the potential of microwave imaging to provide a unique breast imaging solution.
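    The core idea behind estimating locally averaged properties from transmitted pulses reduces to a time-of-flight calculation: a pulse crossing a known path length arrives later in higher-permittivity tissue, and the delay yields an effective refractive index and hence an averaged relative permittivity. The numbers below are illustrative, not measurements from the system described.

    ```python
    # Time-of-flight sketch: averaged relative permittivity along a
    # transmission path, assuming a single effective medium.
    c0 = 3.0e8                 # free-space speed of light (m/s)
    path_m = 0.10              # antenna separation through tissue (m, illustrative)
    transit_s = 1.0e-9         # measured pulse transit time (s, illustrative)

    n_eff = c0 * transit_s / path_m   # effective refractive index
    eps_avg = n_eff ** 2              # locally averaged relative permittivity
    ```

    Restricting this estimate to many narrow transmit-receive paths through the breast is what turns a single bulk number into a spatial map of averaged properties.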