
Monitoring Deforestation in the Amazon with Multimodal Learning

Posted Oct 19, 2022 | Views 561
# TransformX 2022
# Keynote
# Computer Vision
Miriam Cha
Research Scientist, Artificial Intelligence Technology Group @ MIT Lincoln Laboratory

Dr. Miriam Cha is a research scientist in the Artificial Intelligence Technology Group at MIT Lincoln Laboratory. Her research centers on multimodal representation learning and cross-modal synthesis. She is interested in developing artificial intelligence that can interpret and translate multimodal information, much as humans naturally process and relate inputs from different sensory modalities. She is currently investigating learning algorithms for multiple remote sensing modalities as well as medical modalities. Dr. Cha completed her PhD in computer science at Harvard University in 2019. She received BS and MS degrees in electrical and computer engineering from Carnegie Mellon University. She was a recipient of a National Science Foundation Graduate Research Fellowship, a National Defense Science and Engineering Graduate Fellowship, and a Lincoln Scholars Fellowship.

SUMMARY

Despite international efforts to reduce deforestation, the world loses an area of forest equivalent to 40 football fields every minute. Deforestation in the Amazon rainforest accounts for the largest share, contributing to reduced biodiversity, habitat loss for many of the world’s most threatened animals and insects, and more rapid climate change.

Satellite remote sensing offers a powerful tool to track changes in the Amazon, but the region’s frequently humid and cloudy weather makes it difficult to gather clear optical images. The Multimodal Learning for Earth and Environment Challenge (MultiEarth 2022) is the first competition aimed at monitoring and analyzing deforestation in the Amazon rainforest at any time and under any weather or lighting conditions.

In this workshop, learn from Miriam Cha, Research Scientist in the Artificial Intelligence Technology Group at MIT Lincoln Laboratory, about multimodal representation learning for the earth and environment.

Cha will discuss the benefits and challenges of synthetic aperture radar (SAR), a sensor that transmits microwave signals and receives back the signals returned from the earth’s surface. Because microwave signals penetrate clouds, SAR can image the Amazon in any weather, but SAR images can be difficult for humans to interpret due to speckle noise and a lack of the spatial correlations found in optical imagery. This is why MultiEarth 2022 combines SAR with optical sensors for maximum effect.
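
To make the SAR-plus-optical idea concrete, below is a minimal sketch of a late-fusion network that embeds patches from each modality separately and concatenates the embeddings for a forest-versus-deforested decision. The channel counts (VV/VH polarizations for SAR, RGB plus near-infrared for optical), patch size, and architecture are illustrative assumptions, not the MultiEarth 2022 baseline.

```python
# A minimal late-fusion sketch for SAR + optical imagery, in the spirit of
# the MultiEarth 2022 task. Channel counts, patch size, and architecture
# are illustrative assumptions, not the challenge's actual method.
import torch
import torch.nn as nn

class SAROpticalFusion(nn.Module):
    def __init__(self, num_classes: int = 2):
        super().__init__()
        # SAR branch: assumes 2 channels (VV and VH polarizations)
        self.sar_encoder = nn.Sequential(
            nn.Conv2d(2, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        # Optical branch: assumes 4 channels (RGB + near-infrared)
        self.opt_encoder = nn.Sequential(
            nn.Conv2d(4, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        # Concatenate the two embeddings and classify (forest vs. deforested)
        self.classifier = nn.Linear(64 + 64, num_classes)

    def forward(self, sar: torch.Tensor, optical: torch.Tensor) -> torch.Tensor:
        f_sar = self.sar_encoder(sar).flatten(1)      # (B, 64)
        f_opt = self.opt_encoder(optical).flatten(1)  # (B, 64)
        return self.classifier(torch.cat([f_sar, f_opt], dim=1))

# Example: one batch of 256x256 patches from each modality
model = SAROpticalFusion()
logits = model(torch.randn(8, 2, 256, 256), torch.randn(8, 4, 256, 256))
print(logits.shape)  # torch.Size([8, 2])
```

Late fusion like this lets each branch learn modality-specific features (speckle statistics for SAR, spectral signatures for optical) before they are combined, which is one common way to pair a noisy all-weather sensor with a cleaner but cloud-limited one.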

