NYU Session 3: Applications of Remote Sensing

Today’s session followed the usual macro/micro format, starting with a guest lecturer to provide big-picture context, and ending with hands-on labs for more in-depth skill development. Our guest lecturer, Lela Prashad, is Chief Data Scientist at NiJeL.org and technical adviser to my MIT thesis project, OpenIR. She specializes in geospatial data analysis, particularly with multi-spectral remote sensing, and she was the primary remote sensing instructor to ProPublica’s NewsApps team as they prepared to report on Louisiana’s eroding delta.

After each student quickly explained their experience with and/or understanding of remote sensing, Lela gave the class an overview of how the Earth is imaged by various satellite-based instruments; what the most commonly used datasets are and how to access them; how spectral, temporal, and spatial resolution factor into choosing the right data for a specific research question; and how remote sensing can be used not just to find answers, but also to provide background and context for the questions we will explore via our self-deployed in-situ sensors.

Lela then introduced MultiSpec, a free software application from Purdue University for analyzing and processing spectral data. Its functions include land-use classification, side-by-side comparisons, and derivation of quantitative products like NDVI (the normalized difference vegetation index). For practitioners who need, and have access to, more in-depth spectral analysis, the closed-source commercial package ENVI is an option.
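
For anyone curious what an index like NDVI actually computes: it is just a per-pixel ratio of the near-infrared and red bands, NDVI = (NIR - Red) / (NIR + Red). Below is a minimal sketch of that formula in Python with NumPy; this is not MultiSpec’s own implementation, and the array names and band choices are placeholders (for Landsat 8, for example, red and near-infrared are bands 4 and 5).

    import numpy as np

    def ndvi(red, nir):
        """Per-pixel NDVI in [-1, 1] from red and near-infrared arrays of equal shape."""
        red = red.astype(np.float64)
        nir = nir.astype(np.float64)
        denom = nir + red
        with np.errstate(divide="ignore", invalid="ignore"):
            index = (nir - red) / denom
        # Pixels where both bands are zero (e.g. fill / no-data) get 0 instead of NaN
        return np.where(denom == 0, 0.0, index)

Healthy vegetation reflects strongly in the near infrared while absorbing red light, so values near +1 indicate dense vegetation, values near 0 suggest bare soil or built surfaces, and negative values usually indicate water.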

Students then explained their previous class experiments with environmental and bio-sensors, as well as what their midterm projects might be. Spectral tools had a clear application for some of the more Earth-oriented ideas, including Will’s historical/archeological idea, Graham’s bumpiness tracking idea, and Varun’s turf/lawn analysis idea. There may also be applications to Changyeon’s noise pollution idea and Tong’s pace-tracking idea.

At this point, it doesn’t seem like the majority of student projects will employ spectral data for post-collection analysis, but I think the students appreciated learning about the existence of tools and datasets that may be useful for background research and future projects.

After class, it was good to read and comment on the latest student blog posts; they helped me get a stronger sense of each student’s background, preferences, experiments so far, and ideas going forward. I really like having such a diverse group in class, but that diversity makes it a challenge to plan relevant material, so the blog posts help me brief upcoming speakers and adjust hands-on activities for upcoming sessions. A few highlights from the past week’s posts:
– the work of Varun’s friend Ben Kreimer
– the pulse sensor experiment by Kania, Tanya, and Claudia
– Claudia’s class scribing
– Tong, Changyeon, and Justin’s circuit experiments
– Dimas’s observations on the evolution of microcontrollers and rapid prototyping

TODAY’S AGENDA

  • Announcements.
  • Guest Lecture: Remote sensing & in-situ data, by Lela Prashad, NiJeL.org
  • Q&A, MultiSpec exercise
  • Break
  • Open lab: continue with MultiSpec, do some lit review, and keep working on your sensors.

NEXT WEEK’S ASSIGNMENTS:

  1. Explore some of the data resources Lela Prashad suggested (see her presentation link here, and her data source PDF here: Remote_Sensing_Resource_Links). How do they provide context for your project idea? Write up a short lit review using Lela’s suggestions and your own resources.
  2. If you plan to participate in the iOS training on March 3: accept the Apple Developer invitation, send Marlon (cc me) your device UDID, and solder headers to your Bluetooth sensor, if you haven’t already done so.
  3. Download QGIS in preparation for next week’s analysis exercises (a quick sanity check for the install is sketched at the end of this post).
  4. LOOKING AHEAD (optional assignment, not required for this week): If you have a Bluefruit LE, try hooking it up to your sensor and download the iOS Bluefruit app to see the data readings on your phone.
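
Related to assignment 3: once QGIS is installed, a quick way to confirm it can read raster data is to paste the lines below into its built-in Python console (Plugins > Python Console). This is just a sanity-check sketch, not part of next week’s exercises; the file path is a placeholder for any GeoTIFF you have on disk.

    # Paste into the QGIS Python console; QgsRasterLayer is already in scope there.
    # '/path/to/any_geotiff.tif' is a placeholder: point it at any raster file you have.
    layer = QgsRasterLayer('/path/to/any_geotiff.tif', 'test scene')
    print(layer.isValid())  # True means QGIS (via GDAL) can read the file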