In the AI Era, Blum Center Students and Alumni Find Ways to Apply the Technology for Social Good

Sarah Hartman, a PhD candidate in ESPM minoring in Development Engineering, analyzes historical and near-real-time satellite imagery in Google Earth Engine to map and categorize fields and landscapes to compare changes over time. She then relates these changes to other timestamped activities such as damage to transportation routes or local military activity. (Video by Sarah Hartman)

By Mengyuan Dong
Master of Journalism ’23

Only about a year has passed since DALL-E, the AI text-to-image model, and ChatGPT, the chatbot built on a large language model, ushered in what feels like the age of AI: months of wall-to-wall news coverage of, and personal experimentation with, the most powerful publicly available artificial intelligence programs. But what started off as images of Darth Vader playing in the NBA and the near-instant generation of new bedtime stories has given way to concerns about how AI can contribute to disinformation campaigns, academic dishonesty, and even the end of whole classes of jobs that can, theoretically, be done by machines.

Yet even before AI took center stage this past year, students and alumni of the Blum Center for Developing Economies were embracing these emerging technologies’ potential, specifically for social good. From detecting “deepfake” videos to analyzing agricultural change and building understanding across communities, members of the Blum Center community share their experiences, inspirations, and the impact of their AI-driven projects and ventures.

 

Shaping the battle against deepfakes

Raymond Lee (courtesy photo)

Back in 2019, Raymond Lee, a former Big Ideas Contest winner and a UC Berkeley alum, spotted the growing threat posed by deepfakes: media created using deep learning techniques that can swap faces, voices, and even entire bodies, making it appear as if someone is saying or doing something they never actually did.

While deepfakes went viral and raised public concern, no technical solutions stood out at the time. Lee decided to launch FakeNetAI, a deepfake detection SaaS (Software as a Service) that aimed to “protect against economic, societal, and political threats.”

FakeNetAI began as Lee’s capstone project during his master’s program in data science, and it quickly evolved into a startup with the support of teammates from data science, electrical engineering and computer science, and the Haas MBA program. Through the Big Ideas Contest, Lee connected with mentors, conducted market analysis, and developed a mature business plan.

To train their machine learning model, Lee and his teammates curated a diverse dataset: they drew on open-source datasets, scraped raw data from YouTube, and collected deepfake videos created with a variety of generation methods. The team’s approach and its customized machine-learning architecture resulted in a detection accuracy of over 90 percent.
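For readers curious about what this kind of detection looks like in code, here is a minimal sketch of frame-level binary classification in PyTorch. It is not FakeNetAI’s architecture, which is not public; the backbone, loss, and labeling scheme are illustrative assumptions only.

```python
# Hypothetical sketch: frame-level "real vs. fake" classifier.
# Assumes video frames have already been extracted and labeled.
import torch
import torch.nn as nn
from torchvision import models

class FrameClassifier(nn.Module):
    def __init__(self):
        super().__init__()
        # Generic CNN backbone; final layer outputs one "fake" logit per frame.
        self.backbone = models.resnet18(weights=None)
        self.backbone.fc = nn.Linear(self.backbone.fc.in_features, 1)

    def forward(self, frames):           # frames: (batch, 3, H, W)
        return self.backbone(frames)     # logits: (batch, 1)

def train_step(model, frames, labels, optimizer):
    """One gradient step on a labeled batch (1 = deepfake, 0 = real)."""
    criterion = nn.BCEWithLogitsLoss()
    optimizer.zero_grad()
    logits = model(frames).squeeze(1)
    loss = criterion(logits, labels.float())
    loss.backward()
    optimizer.step()
    return loss.item()
```

In practice, a video-level verdict would aggregate many per-frame scores, which is one reason a diverse training set across formats and generation methods matters so much.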

Since its victory in the Big Ideas Contest in 2019, the startup has successfully attracted media outlets, banks, and social media firms as customers. For NewsMobile, an Indian fact-checking site, FakeNetAI helped identify viral deepfake videos featuring prominent figures like Tom Cruise and Donald Trump. Moreover, it aided in verifying livestreaming videos during a critical moment of the Myanmar coup for another third-party fact-checker.

However, Lee acknowledges that the fight against deepfakes is far from over. The rapid proliferation of deepfake generation tools continues to outpace the development of detection expertise. Citing the research of Prof. Hany Farid from Berkeley’s School of Information, Lee emphasizes the urgent need for continued research into detecting these highly realistic manipulated videos.

“It’s like a cat and mouse game where you constantly have to outsmart the generation models with your detection model,” he says.

The other challenge is that media comes in a wide range of formats, and it takes a lot of work to train a machine-learning model that performs well across all of them: the more generalized a model is made to be, the less accurate it tends to become.

FakeNetAI is still growing, and Lee is determined to stay at the forefront of innovation. He has also pivoted his career to do more general AI and data science consulting. “The image generation field has advanced at a rapid rate,” he says, “and there’s always a lot of interesting stuff going on in the video, image and text generation space.”

 

Understanding abandonment in Ukrainian agriculture amidst war

Sarah Hartman, a PhD student in Environmental Science, Policy, and Management (ESPM), has been involved in the Digital Transformation of Development (DToD) traineeship since last year and will continue as a funded fellow this upcoming year.

Before joining DToD, Hartman’s research interests in food and water led her to participate in the InFEWS program. What she enjoys the most about these programs is the opportunity to interact with individuals from diverse backgrounds and disciplines.

“I’ve gained a lot intellectually by chatting with people,” Hartman says. “There were things that I learned from the other fellows that have helped reframe how I think about my research or have made me think about potential tools and methods that I hadn’t otherwise stumbled upon.”

In her research on water and agriculture, Hartman utilizes machine learning and AI to map changes and quantify the extent of agricultural activities. Her recent work involves training machine-learning models to analyze satellite images of Ukrainian agriculture during its war with Russia, with a specific focus on abandonment and its underlying drivers. 

The idea stemmed from a teaching experience when Hartman was a graduate student instructor for a Principles of Natural Resource Management class. As the war in Ukraine began during the semester, she noticed the students’ desire to understand the global impact of such events on natural resources and supply chains. She then decided to teach the unit on globalization through the lens of Ukraine. The country is one of the world’s breadbaskets, where fields of sunflowers, wheat, and corn feed some of the world’s most water-stressed and vulnerable countries, particularly many in North Africa. The abandonment of Ukrainian fields could cause serious food instability for the countries that import its crops.

Inspired by the dynamic conversations in class, Hartman wanted to pursue a research project on the issue. “I took it personally,” Hartman says. “I’ve been learning how to use satellite images and machine learning to look at agriculture; what better place to do it than to help inform what’s happening in Ukraine in near real-time?”

Are fields abandoned because of physical destruction, such as tanks rolling through? Or because of supply chain issues, such as farmers being unable to get sufficient fertilizer or to sell their crops? Hartman’s research looks into these questions by analyzing satellite images, aiming to reveal what affects the overall resilience of Ukrainian agriculture. Specifically, she analyzes historical and near-real-time satellite imagery in Google Earth Engine to map and categorize fields and landscapes and compare changes over time. She then relates these changes to other timestamped activities, such as damage to transportation routes or local military activity.
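As an illustration of the kind of workflow Hartman describes, here is a minimal sketch using the Google Earth Engine Python API. The Sentinel-2 collection, the bounding box, the dates, and the NDVI-drop threshold are placeholder assumptions, not her actual study parameters.

```python
# Hypothetical sketch: compare growing-season vegetation (NDVI) over a
# Ukrainian agricultural region between two years in Google Earth Engine.
import ee

ee.Initialize()

# Placeholder bounding box over an agricultural area; not the study region.
region = ee.Geometry.Rectangle([31.0, 47.0, 33.0, 48.0])

def season_ndvi(year):
    """Median growing-season NDVI from Sentinel-2 surface reflectance."""
    collection = (
        ee.ImageCollection('COPERNICUS/S2_SR_HARMONIZED')
        .filterBounds(region)
        .filterDate(f'{year}-05-01', f'{year}-08-31')
        .filter(ee.Filter.lt('CLOUDY_PIXEL_PERCENTAGE', 20))
    )
    return collection.map(
        lambda img: img.normalizedDifference(['B8', 'B4']).rename('NDVI')
    ).median()

# A large year-over-year NDVI drop can flag fields as possibly abandoned,
# to be cross-referenced with timestamped events such as damaged routes.
ndvi_change = season_ndvi(2021).subtract(season_ndvi(2022))
possibly_abandoned = ndvi_change.gt(0.3)
```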

Beyond the technical analysis, Hartman hopes to do more outreach and engagement with the local communities in Ukraine that are directly affected, which has been challenging given the ongoing war. She says she would love to work with organizations that have access to those groups.

Hartman is captivated by the power of analyzing publicly available satellite images. She explains that researchers can access these images and gain insights into any part of the world, dating back to the 1980s or earlier. This accessibility proves particularly invaluable when collecting information on remote and resource-limited regions where data has historically been uncollected or lost. 

“What an incredible opportunity to pair images that have been collected for tens and tens of years and use that to analyze things in a way that provides data that may not necessarily be possible otherwise,” Hartman says. “That’s what I see as the potential.”

 

Paving the way for empathy in the digital age

Kenan Carames, a current Master of Development Engineering student in the AI & Data Analytics for Social Impact concentration, is developing platforms that help people in different communities grow closer together in an online space. Already having a background in data analysis, he joined the program to pursue his interest in social issues and international development.

Kenan Carames (courtesy photo)

“There’s so much exciting stuff happening in AI both from a technical and social point of view,” he says. “And coming from an engineering background, I do have the opportunity to provide a more technical perspective in the social conversations.”

Over his two semesters and counting, Carames has taken classes on development issues and on intensive technical skills such as applied machine learning. A course called Politics of Information, taught by Prof. AnnaLee Saxenian, who is also Carames’ mentor, sparked his interest in political issues around data and the social media space. His passion found a nurturing ground during his ongoing internship with Search for Common Ground, a peacebuilding organization that works to mediate conflicts in violence-affected areas across the world.

Carames recently worked on developing a chatbot for the organization to guide individuals in countries including Sri Lanka, Nigeria, Kenya, Jordan, and Lebanon towards resources that promote empathy and mutual understanding. He primarily helped advance the user experience of the chatbot, which requires a lot of data analysis and visualization of how users interact with the product and which materials get the most engagement. The current structure of the chatbot is more like a decision tree or choose-your-own-adventure, Carames explains, but his team plans to use more natural language processing (NLP) to make it more dynamic and engaging in the future.
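To make the decision-tree structure concrete, here is a minimal sketch of a choose-your-own-adventure chatbot flow in Python. The node names, prompts, and linked resources are hypothetical and stand in for the actual bot’s content.

```python
# Hypothetical sketch: a decision-tree chatbot where each node offers fixed
# choices that route the user to the next node or to a resource.
TREE = {
    "start": {
        "prompt": "What would you like to explore?",
        "options": {
            "1": ("Stories from other communities", "stories"),
            "2": ("Tips for difficult conversations", "dialogue_tips"),
        },
    },
    "stories": {
        "prompt": "Here is a short story about finding common ground. [link]",
        "options": {},
    },
    "dialogue_tips": {
        "prompt": "Tip: ask open questions and listen before responding. [link]",
        "options": {},
    },
}

def run(node="start"):
    """Walk the tree until a leaf node (no options) is reached."""
    while True:
        current = TREE[node]
        print(current["prompt"])
        if not current["options"]:
            return node  # leaf reached; could be logged for engagement analysis
        for key, (label, _) in current["options"].items():
            print(f"  {key}. {label}")
        choice = input("> ").strip()
        node = current["options"].get(choice, ("", "start"))[1]
```

Swapping this fixed routing for natural language processing, as Carames’ team plans, would mean classifying free-text replies instead of matching numbered menu options.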

“It’s definitely different work than I’ve done previously. And it’s kind of cool to have these social goals that you’re working towards,” he says. “That feels very impactful.”

While Carames is still developing his MDevEng capstone project idea, he is determined to keep exploring efforts to build empathy in digital spaces. The concept of the project will be based on contact theory, he says. The theory holds that contact between two groups can promote tolerance and acceptance under appropriate conditions, and he wants to experiment with it in an online space. 

From an engineering perspective, Carames will explore the system design and algorithmic decisions he could make to ensure that when people from different groups come into contact with each other, they foster understanding rather than develop prejudice and hate. He has been learning from a project in which researchers use the question-and-answer website Quora to facilitate conversations about Israel and Palestine among people living in and outside those areas.

“It’s a very heated topic, and how do you set up those spaces and design conversations to get people to engage each other in a helpful way?” he asks. “It’s very difficult and messy, but I think that’s why it’s interesting.”

