Digital Transformation of Development Traineeship Brings AI and Data Analytics to Under-Resourced Settings

Navigating around town, tailoring our workouts to our level of physical fitness, knowing when our packages will arrive: The boom in data collection and analysis has been a boon to our daily lives — and a new paradigm for businesses, organizations, and governments to optimize efficiency and improve services. It feels ubiquitous. Who hasn’t taken advantage of the digital revolution?

It turns out, many communities haven’t been able to. From marginalized neighborhoods nearby to many areas around the globe, the tools that increasingly govern and improve our lives are not available or not tailored to serving everyone, be it personal wellbeing, environmental health, or economic security. 

But under a new NSF-funded research program housed at the Blum Center, the Digital Transformation of Development (DToD) Traineeship, students are using their research skills to apply digital tools, such as machine learning and AI, to the issues and challenges of poverty alleviation, disaster relief, and more — in pursuit of digital and technological justice, equity, and empowerment.

“There’s a lot of recognition of the potential of rapidly emerging technologies like AI, new analytics, scalable cloud computing, and novel data sources,” says Matt Podolsky, DToD program coordinator. “But these advancements have not been particularly targeted to under-resourced communities and issues that pertain to them. We aim to address that shortfall.” 

After an initial planning year, the five-year program kicked off in Fall 2022 with its first cohort of nine master’s and 16 PhD students. Though their formal traineeships center on three DToD-themed courses taken over two or more semesters, the program aims to keep students involved for the entire duration of their graduate studies. A handful of fellows receive one-year awards covering tuition and fees, plus a stipend; the fellowship also offers the unique opportunity to apply for travel grants for self-arranged internships, where fellows conduct fieldwork and applied research with and within low-resourced communities. PhD students can additionally work toward the designated emphasis in Development Engineering, an official minor for doctoral students. The interdisciplinary skills developed in DToD range from technical writing to ethical data collection, all with the goal of producing fair and inclusive analysis that benefits underserved communities.

One of the key courses students take in the Development Engineering ecosystem, “Design, Evaluate, and Scale Development Technologies,” provides a hands-on opportunity to develop a tangible solution — say, a simple-to-use, easy-to-carry solar-powered water pump — to real-world problems — say, climate-impacted farmlands. The focus is always on incorporating the context and needs of people and their communities.

“It encourages students to develop a solution that involves the end users in the design process and could be scaled beyond just a small research prototype,” says Podolsky. 

Another class, “DToD Research & Practice,” is a seminar featuring guest speakers, from thought leaders and researchers on UC Berkeley’s faculty to industry experts applying AI and data technologies for social impact, whose work ranges from incorporating AI into development to finding early signs of eye disease with machine learning. The class also gives students the chance to present their own work to peers, exchanging feedback while learning about interesting research across campus. That model fosters collaboration, support, and skills in communicating research to those outside one’s discipline.

Rajiv Shah, former administrator of USAID and a former Blum Center trustee, approached Prof. Shankar Sastry during multiple board-of-trustees meetings about building on the Center’s mission.

“The brand of the Blum Center is really technology and mechanisms, incentive designs, and so on to lift people out of poverty,” says Sastry, the Center’s former faculty director, the College of Engineering’s former dean, and DToD’s principal investigator. “And Raj said, ‘Why don’t you take it to the extreme? And why don’t you see how you can combine these latest greatest technologies?’”

Thanks to Development Engineering pioneer Prof. Alice Agogino, the Blum Center already had a track record of success with National Science Foundation Research Traineeship (NRT) programs. Meanwhile, Dr. Yael Perez, director of the Center’s Development Engineering programs and coordinator of the InFEWS (Innovations at the Nexus of Food, Energy, and Water Systems) NRT, felt it would be a boon for students with digitally minded social-impact projects to receive the same kind of holistic support that InFEWS students had for their own social impact work. She hoped to see more social impact–minded students come out of programs like those of Electrical Engineering and Computer Sciences. And so Shah’s idea came back to Sastry.

“It became rapidly clear that technologies in AI and machine learning, as well as Internet of Things and cloud computing, all of that was booming,” he says. But what he found missing was “how you would bundle them into services that could then be offered for people to better themselves, and at scale.”

Around 2019 and 2020, “by the time we were putting this [program] together, most corporate boardrooms were talking about digital transformation. And they didn’t exactly know what they were talking about,” Sastry adds. “But we taught that by digital transformation, we really meant: How do you take these advances in AI, machine learning, IoT, cloud and edge computing to provide services — be they in healthcare, in energy, in distributed energy management — to be able to really enable economic development.”

“We could give our students an appreciation not only for these algorithms,” he says, “but also what happens when you use them.”

The first cohort’s fellows come from disciplines and departments across campus, from various branches of engineering to city and regional planning to the School of Information. Over 60 percent of the first cohort are women and nearly half are underrepresented minorities. 

Ritwik Gupta works with collaborators at the United States Geological Survey to measure soil water retention after the 2021 Dixie Fire. (Photo by Matt Podolsky)

Among this first class is Ritwik Gupta, a PhD student at Berkeley’s AI Research Lab, who focuses on computer vision for humanitarian assistance and disaster response, along with public policy for the effective and safe usage of such technology. Working with a diverse set of partners such as CAL FIRE, the Department of Defense’s Defense Innovation Unit, and the United Nations, his research on tasks such as assessing damage to buildings from space and detecting illegal fishing vessels in all weather conditions has been deployed worldwide.

The work of classmate Evan Patrick leverages geospatial science and ethnography to evaluate forest restoration efforts in Guatemala and to explore the ongoing impacts of the El Niño Southern Oscillation (ENSO) on landscapes and livelihoods in the country. The Environmental Science, Policy, and Management (ESPM) PhD student worked with two MDevEng capstone groups in the Potts Lab to model the carbon benefits of plantation expansion in Guatemala and to use social media audience estimates to investigate ENSO-driven internal migration in Central America.

Sarah Hartman, in her fifth year of an ESPM PhD program, first got involved with the Blum Center through its first NRT on innovations in food, energy, and water systems. She wanted to continue what had been an “absolutely wonderful experience,” and DToD just happened to be relevant to her work using technology, engineering, and science to improve water and agricultural conditions in low-resource settings.

“The way the program’s designed is really nice,” she says, “because it gives you exposure to people who might be outside of your discipline but are using methods that might be of use to your particular application.” For instance, in one DToD class, a visiting professor discussed how she translated the skills and technology involved in using AI to build others’ personal wealth into using AI to detect early health concerns in low-resource settings. 

The program’s benefits, however, extend beyond official class content.

“I really appreciate the balance they strike between professional opportunities and community-building opportunities,” Hartman says. “And sometimes it’s the small things.”

During a class presentation, Ritwik Gupta points to dark vessel detections generated from his xView3 AI model within the Department of the Navy’s SeaVision platform. (Photo by Matt Podolsky)

Like coffee and cookie breaks. Students like Hartman find inspiration and collaboration in the chances to socialize that class sessions offer. “There’s a lot of value in the informal conversations that we have around the structured lecture content,” she says.

Take, for instance, Gupta’s work using machine learning and satellite imagery to understand the toll Russia’s war in Ukraine is taking on cities. His methods, Hartman found, could add a new layer to her work on how the war is affecting Ukrainian agricultural resilience, such as better, faster insight into the status of key agricultural infrastructure like grain silos and ports. This cross-pollination improves her ability to conduct real-time analysis of a continuously evolving situation.

Going forward, Podolsky plans to implement more workshops on topics from data visualization to effective communication skills, such as op-ed writing. “We really just want to add to the research training aspect beyond just coursework,” he says. 

But in the big picture, Sastry says, the DToD program is about more than just the implementation of digital development solutions in accordance with the needs of people in under-resourced settings. It’s also about what comes before that process even starts.

“I think it’s great to do interventions, it’s great to think about clean water, it’s great to think about energy,” he says. “But what’s really emerging from this series is students getting a sense of empowerment to go and change the world. And that quite often transcends specific solutions. I think we’re giving them that, and we’re giving them some optimism.”

“Because at the end of the day, the technology is great, but it’s the starting point,” he adds. “It’s not the endpoint; the services are the middle point; and then the empowerment for people is really the endpoint. So it starts with empowering our students to empower the people to help themselves.”

In the AI Era, Blum Center Students and Alumni Find Ways to Apply the Technology for Social Good

Only about a year has passed since DALL-E, the AI text-to-image model, and ChatGPT, the chatbot built on a large language model, ushered in what feels like the age of AI: months of wall-to-wall news coverage of, and personal experimentation with, the most powerful publicly available artificial intelligence programs. But what started off as images of Darth Vader playing in the NBA and the near-instant generation of new bedtime stories has given way to concerns about how AI can contribute to disinformation campaigns, academic dishonesty, and even the end of whole classes of workers whose jobs can, theoretically, be done by machines.

Yet even before AI took center stage this past year, students and alumni of the Blum Center for Developing Economies were embracing the emerging technologies’ potential, specifically for social good. From detecting “deepfake” videos to analyzing agriculture changes and building understanding across communities, Blum Center folks share their experiences, inspirations, and the impact of their AI-driven projects and ventures. 

 

Shaping the battle against deepfakes

Raymond Lee (courtesy photo)

Back in 2019, Raymond Lee, a former Big Ideas Contest winner and a UC Berkeley alum, spotted the growing threat posed by deepfakes: media created using deep learning techniques that can swap faces, voices, and even entire bodies, making it appear as if someone is saying or doing something they never actually did.

While deepfakes were going viral and raising public concern, no technical solutions stood out at the time. Lee decided to launch FakeNetAI, a deepfake detection SaaS (Software as a Service) that aimed to “protect against economic, societal, and political threats.”

FakeNetAI began as Lee’s capstone project during his master’s program in data science and quickly evolved into a startup with the support of teammates from data science, electrical engineering and computer science, and the Haas MBA program. Through participating in the Big Ideas Contest, Lee connected with mentors, conducted market analysis, and developed a mature business plan.

To train their machine-learning model, Lee and his teammates curated a diverse dataset by drawing on open-source data sets, scraping raw data from YouTube, and collecting deepfake videos created using various methods. The team’s approach and its customized machine-learning architecture resulted in a detection accuracy of over 90 percent.
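In spirit, that kind of detection pipeline resembles fine-tuning a pretrained image classifier on frames labeled real or fake. The sketch below illustrates the general idea in PyTorch; the directory layout, backbone, and hyperparameters are invented for illustration and are not FakeNetAI’s actual data or architecture.

```python
# Hypothetical sketch: fine-tune a pretrained CNN to classify video frames as
# real or fake. Paths and hyperparameters are assumptions, not FakeNetAI's pipeline.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, transforms
from torchvision.models import resnet18, ResNet18_Weights

# Frames extracted from real and deepfake videos, stored as
# frames/real/*.jpg and frames/fake/*.jpg (hypothetical layout).
preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])
dataset = datasets.ImageFolder("frames", transform=preprocess)
loader = DataLoader(dataset, batch_size=32, shuffle=True)

# Start from an ImageNet-pretrained backbone and swap in a two-class head.
model = resnet18(weights=ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, 2)  # 0 = real, 1 = fake

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
criterion = nn.CrossEntropyLoss()

model.train()
for epoch in range(3):  # a few epochs, for illustration only
    for frames, labels in loader:
        optimizer.zero_grad()
        loss = criterion(model(frames), labels)
        loss.backward()
        optimizer.step()
```

A production detector would go further, aggregating frame-level scores across a whole video and coping with compression artifacts and varied resolutions, which is part of why generalizing across formats is so difficult.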

Since its victory in the Big Ideas Contest in 2019, the startup has attracted media outlets, banks, and social media firms as customers. For NewsMobile, an Indian fact-checking site, FakeNetAI helped identify viral deepfake videos featuring prominent figures like Tom Cruise and Donald Trump. It also helped another third-party fact-checker verify livestreamed videos during a critical moment of the Myanmar coup.

However, Lee acknowledges that the fight against deepfakes is far from over. The rapid advance of deepfake generation continues to outpace the development of detection expertise. Citing the research of Prof. Hany Farid of Berkeley’s School of Information, Lee emphasizes the urgent need for continued research into detecting these highly realistic manipulated videos.

“It’s like a cat and mouse game where you constantly have to outsmart the generation models with your detection model,” he says.

The other challenge is that media comes in a wide range of formats, and it takes a lot of work to train a machine-learning model that performs well across all of them. The more generalized a model is made to be, the less accurate it becomes.

FakeNetAI is still growing, and Lee is determined to stay at the forefront of innovation. He has also pivoted his career to do more general AI and data science consulting. “The image generation field has advanced at a rapid rate,” he says, “and there’s always a lot of interesting stuff going on in the video, image and text generation space.”

 

Understanding abandonment in Ukrainian agriculture amidst war

Sarah Hartman, a PhD student in Environmental Science, Policy, and Management, has been involved in the Digital Transformation of Development traineeship since last year and will continue as a funded fellow this coming year.

Before joining DToD, Hartman’s research interests in food and water led her to participate in the InFEWS program. What she enjoys the most about these programs is the opportunity to interact with individuals from diverse backgrounds and disciplines.

“I’ve gained a lot intellectually by chatting with people,” Hartman says. “There were things that I learned from the other fellows that have helped reframe how I think about my research or have made me think about potential tools and methods that I hadn’t otherwise stumbled upon.”

In her research on water and agriculture, Hartman utilizes machine learning and AI to map changes and quantify the extent of agricultural activities. Her recent work involves training machine-learning models to analyze satellite images of Ukrainian agriculture during its war with Russia, with a specific focus on abandonment and its underlying drivers. 

The idea stemmed from a teaching experience when Hartman was a graduate student instructor for a Principles of Natural Resource Management class. As the war in Ukraine began during the semester, she noticed the students’ desire to understand the global impact of such events on natural resources and supply chains. She decided to frame the course material on globalization around Ukraine. The country is one of the world’s breadbaskets, where fields of sunflowers, wheat, and corn feed some of the world’s most water-stressed and vulnerable countries, particularly in North Africa. The abandonment of Ukrainian fields could cause serious food instability for the countries that import those crops.

Inspired by the dynamic conversations in class, Hartman wanted to pursue a research project on the issue. “I took it personally,” Hartman says. “I’ve been learning how to use satellite images and machine learning to look at agriculture; what better place to do it than to help inform what’s happening in Ukraine in near real-time?”

Are fields abandoned because of physical destruction, such as tanks rolling through? Or because of supply chain issues, such as farmers being unable to get enough fertilizer or to sell their crops? Hartman’s research investigates these questions by analyzing satellite images, aiming to reveal what affects the overall resilience of Ukrainian agriculture. Specifically, she analyzes historical and near-real-time satellite imagery in Google Earth Engine to map and categorize fields and landscapes and compare changes over time. She then relates these changes to other timestamped events, such as damage to transportation routes or local military activity.
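As a rough illustration of that workflow, the hypothetical Earth Engine sketch below compares median growing-season NDVI, a standard vegetation index, before and after the 2022 invasion and flags pixels with a sharp drop. The region, dates, and threshold are assumptions chosen for illustration and do not reflect Hartman’s actual study design.

```python
# Minimal, hypothetical Google Earth Engine sketch: compare growing-season NDVI
# before and after the 2022 invasion to flag fields with sharply reduced vegetation.
import ee

ee.Initialize()

# Hypothetical area of interest in an agricultural oblast.
region = ee.Geometry.Rectangle([32.0, 47.0, 33.0, 48.0])

def season_ndvi(start, end):
    """Median growing-season NDVI from Sentinel-2 surface reflectance."""
    return (ee.ImageCollection("COPERNICUS/S2_SR_HARMONIZED")
            .filterBounds(region)
            .filterDate(start, end)
            .filter(ee.Filter.lt("CLOUDY_PIXEL_PERCENTAGE", 20))
            .map(lambda img: img.normalizedDifference(["B8", "B4"]).rename("NDVI"))
            .median())

before = season_ndvi("2021-05-01", "2021-09-01")  # pre-war growing season
after = season_ndvi("2022-05-01", "2022-09-01")   # wartime growing season

# A large NDVI drop is a crude proxy for fields left unplanted or unharvested.
possibly_abandoned = before.subtract(after).gt(0.2)

# Share of the region flagged, as a rough summary statistic.
stats = possibly_abandoned.reduceRegion(
    reducer=ee.Reducer.mean(), geometry=region, scale=100, maxPixels=1e9)
print(stats.getInfo())
```

Flagged areas could then be cross-referenced with timestamped reports of military activity or infrastructure damage, as described above.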

Beyond the technical analysis, Hartman hopes to do more outreach and engagement with the local communities in Ukraine that are directly affected, which has been challenging given the ongoing war. She says she would love to work with organizations that have access to those groups.

Hartman is captivated by the power of analyzing publicly available satellite images. She explains that researchers can access these images and gain insights into any part of the world, dating back to the 1980s or earlier. This accessibility proves particularly invaluable when collecting information on remote and resource-limited regions where data has historically been uncollected or lost. 

“What an incredible opportunity to pair images that have been collected for tens and tens of years and use that to analyze things in a way that provides data that may not necessarily be possible otherwise,” Hartman says. “That’s what I see as the potential.”

 

Paving the way for empathy in the digital age

Kenan Carames, a current Master of Development Engineering student in the AI & Data Analytics for Social Impact concentration, is developing platforms that help people in different communities grow closer together in an online space. Already having a background in data analysis, he joined the program to pursue his interest in social issues and international development.

Kenan Carames (courtesy photo)

“There’s so much exciting stuff happening in AI both from a technical and social point of view,” he says. “And coming from an engineering background, I do have the opportunity to provide a more technical perspective in the social conversations.”

Over his two semesters and counting, Carames has taken classes on development issues as well as intensive technical skills such as applied machine learning. A course called Politics of Information, taught by Prof. AnnaLee Saxenian, who is also Carames’ mentor, sparked his interest in political issues around data and the social media space. His passion found a nurturing ground during his ongoing internship with Search for Common Ground, a peacebuilding organization that works to mediate conflicts in violence-affected areas around the world.

Carames recently worked on developing a chatbot for the organization to guide individuals in countries including Sri Lanka, Nigeria, Kenya, Jordan, and Lebanon toward resources that promote empathy and mutual understanding. He primarily helped improve the chatbot’s user experience, which requires extensive data analysis and visualization of how users interact with the product and which materials get the most engagement. The chatbot’s current structure is more like a decision tree or a choose-your-own-adventure story, Carames explains, but his team plans to use more natural language processing (NLP) to make it more dynamic and engaging in the future.
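The decision-tree structure Carames describes can be pictured as a small graph of scripted nodes. The toy Python sketch below shows the general pattern; the node names, text, and options are invented for illustration rather than taken from the organization’s actual chatbot.

```python
# Toy sketch of a decision-tree ("choose your own adventure") chatbot flow.
# Content is invented; this is not Search for Common Ground's actual chatbot.
FLOW = {
    "start": {
        "text": "Hi! What would you like to explore today?",
        "options": {
            "1": ("Stories from other communities", "stories"),
            "2": ("Tips for difficult conversations", "dialogue"),
            "0": ("Exit", None),
        },
    },
    "stories": {
        "text": "Here's a short story about neighbors working through a dispute.",
        "options": {"1": ("Back to the main menu", "start"), "0": ("Exit", None)},
    },
    "dialogue": {
        "text": "Tip: restate the other person's view in your own words before replying.",
        "options": {"1": ("Back to the main menu", "start"), "0": ("Exit", None)},
    },
}

def run(node="start"):
    """Walk the tree, logging each choice so engagement can be analyzed later."""
    log = []
    while node is not None:
        step = FLOW[node]
        print(step["text"])
        for key, (label, _) in step["options"].items():
            print(f"  {key}. {label}")
        choice = input("> ").strip()
        if choice not in step["options"]:
            print("Please pick one of the listed options.")
            continue
        log.append((node, choice))
        node = step["options"][choice][1]
    return log  # e.g., [("start", "1"), ("stories", "0")]

if __name__ == "__main__":
    run()
```

Logging each node-and-choice pair is what makes the kind of engagement analysis he describes possible; a future NLP layer could replace the fixed menu options with free-text understanding.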

“It’s definitely different work than I’ve done previously. And it’s kind of cool to have these social goals that you’re working towards,” he says. “That feels very impactful.”

While Carames is still developing his MDevEng capstone project idea, he is determined to keep exploring efforts to build empathy in digital spaces. The concept of the project will be based on contact theory, he says. The theory holds that contact between two groups can promote tolerance and acceptance under appropriate conditions, and he wants to experiment with it in an online space. 

From an engineering perspective, Carames will explore the system design and algorithm decisions he could make to ensure that when people from different groups come in contact with each other, they’re fostering understanding of each other rather than developing prejudice and hate. He has been learning from a project where researchers use question-and-answer website Quora to facilitate conversations about Israel and Palestine for people living in and outside these areas. 

“It’s a very heated topic, and how do you set up those spaces and design conversations to get people to engage each other in a helpful way?” he says. “It’s very difficult and messy, but I think that’s why it’s interesting.”

Host and Fellow Responsibilities

Host Organizations

  • Identify staff supervisor to manage I&E Climate Action Fellow
  • Submit fellowship description and tasks
  • Engage in the matching process
  • Mentor and advise students
  • Communicate with Berkeley program director and give feedback on the program

Berkeley Program Director

  • Communicate with host organizations, students, and other university departments to ensure smooth program operations

Student Fellows

  • Complete application and cohort activities
  • Communicate with staff and host organizations
  • Successfully complete assignments from host organization during summer practicum
  • Summarize and report summer experience activities post-fellowship