Somali Chaterji named 2026 University Faculty Scholar for advancing AI in IoT, agriculture and health

Somali Chaterji, associate professor in Purdue University’s Department of Agricultural and Biological Engineering (ABE) and the Elmore Family School of Electrical and Computer Engineering (ECE), has been named a 2026 University Faculty Scholar for her research at the intersection of machine learning, digital agriculture and genomics.

Chaterji’s work develops machine learning models that “right-size” AI for the devices and domains that need it most.

A major focus of her research is computer vision on small, resource-constrained devices, enabling real-time perception and decision-making on embedded graphics processing units (GPUs) in autonomous vehicles, drones and farm equipment. Her lab also builds interpretable machine learning methods for single-cell genomics and high-performance databases for metagenomics data—the direct genetic analysis of microbial communities from environmental samples—with applications to Purdue’s One Health Initiative.

Chaterji’s research is part of the Purdue One Health Initiative

This initiative brings together research on human, animal and plant health.


Much of this work runs on the small, connected devices Chaterji calls the “Internet of Small Things.” She emphasizes the word "small" because these devices are already ubiquitous in our lives, operating under tight constraints in latency, energy and memory that demand a fundamentally different approach to AI. The same right-sizing philosophy also tames the data center, reducing the compute footprint of large, power-hungry workloads in the cloud.

To learn more about the ideas driving her research and the path behind becoming a University Faculty Scholar, we spoke to Chaterji about her work, career journey and vision for the future.

Q&A with Somali Chaterji

How would you define your research background? 

I earned my Ph.D. in biomedical engineering from Purdue University. After a postdoc in computer science, I started my lab, the Innovatory for Cells and Neural Machines (ICAN)—an applied machine learning lab that builds right-sized, adaptive AI for cutting-edge devices and data-intensive science, drawing on a foundation in both bioengineering and computer science. Early in my career, my biomedical engineering background drew me to computational genomics, where I built high-performance databases and scalable pipelines for large-scale genomics and metagenomics data.

This work kept surfacing the same question: how do you right-size computation for different tasks? Answering this question pulled me into cyber-physical systems (CPS) and computer vision, where the constraints are even tighter: computation must happen in real time, with no second chances. Much of the work I do now adapts and extends foundational AI toward use-inspired AI, whether the data is a single-cell experiment or a Light Detection and Ranging (LiDAR) sensor on a moving vehicle.

Use-inspired AI: research and development that advances AI methods by grounding them in specific, real-world problems, with an emphasis on tangible impact and societal relevance.

This grounding in machine learning, computer systems and computational genomics has allowed me to work across CPS and One Health, building systems that are rigorous enough for top-tier venues and deployable enough for the real world. My research centers on three areas: adaptive computing for real-time perception on small devices, learning with sparse and noisy data, and interpretable AI for One Health applications.

Which key projects have most shaped your research and its direction?

Some of the standout moments in my career have centered on developing adaptive, or “right-sized,” computing. We’re at a stage in computing where AI models are remarkably powerful, capable and impressive, but in many settings, that level of complexity isn’t necessary.

Right-sizing means matching computation precisely to the requirements of the application, including demanding AI applications. In practice, this often means on-device machine learning: running inference directly on the device rather than making round trips to the cloud. This matters in bandwidth-limited settings, such as rural farms, and for privacy-sensitive data, whether medical and financial records or confidential farm data. More broadly, right-sized computing enables decentralized learning, where intelligence is distributed across many devices rather than concentrated in a data center.

With a research group of 10 graduate students leading diverse projects, we’re developing a strong foundation in adaptive computing driven by three key components:

Content awareness

Using situational and environmental information about users, places and objects to adapt an application to fit a user’s need. For example, a self-driving car passing through a barren highway needs far less computation than one navigating a chaotic urban intersection.

Resource contention

Balancing workloads among multiple co-hosted virtual machines to ensure smooth operations. For example, on the embedded GPU inside a self-driving car, perception, tracking, planning and infotainment all share the same chip. When one workload spikes, the others can miss their real-time deadlines.

Perturbations

A temporary disruption or change in the regular motion, behavior or state of a system, an idea shaped through work with the NSF CHORUS Center. An example is sudden glare blinding a camera on a self-driving car.

Across my lab, we design end-to-end systems that are both robust and adaptive across these conditions: content, contention and perturbations. For example, in projects like Agile 3D, we built systems that adapt computation based on LiDAR sensor data, choosing the most appropriate model according to scene complexity, available compute resources and environmental noise.
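The selection logic described above can be sketched in a few lines. This is a minimal illustration, not the actual Agile 3D implementation: the model zoo, latency numbers and scoring rule are all hypothetical, chosen only to show how scene complexity and a latency budget together pick a right-sized model.

```python
# Illustrative sketch of content-aware model selection (hypothetical
# numbers, not the actual Agile 3D code): pick the model that best
# trades accuracy against latency for the current scene, subject to a
# real-time latency budget.

from dataclasses import dataclass


@dataclass
class Model:
    name: str
    latency_ms: float   # measured inference latency on the target GPU
    accuracy: float     # expected accuracy on a demanding scene

# Hypothetical model zoo, from lightweight to heavyweight.
ZOO = [
    Model("tiny", 8.0, 0.70),
    Model("medium", 20.0, 0.82),
    Model("large", 45.0, 0.90),
]


def select_model(scene_complexity: float, latency_budget_ms: float) -> Model:
    """Choose a model for the current frame.

    scene_complexity ranges from 0 (barren highway) to 1 (chaotic
    intersection). Complex scenes weight accuracy more heavily;
    simple scenes favor cheap, fast models.
    """
    feasible = [m for m in ZOO if m.latency_ms <= latency_budget_ms]
    if not feasible:
        # Degrade gracefully: never miss the deadline entirely.
        feasible = [min(ZOO, key=lambda m: m.latency_ms)]
    return max(
        feasible,
        key=lambda m: scene_complexity * m.accuracy
        - (1 - scene_complexity) * m.latency_ms / 100,
    )
```

Under this toy scoring rule, an empty highway with a generous budget selects the lightweight model, while a busy intersection selects the heaviest model that still fits the deadline.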

Another highlight has been advancing on-device machine learning, where models must run on Internet of Things (IoT) devices under tight resource and energy constraints while still delivering accurate, real-time results. This includes work on adaptive inference, efficient training with limited labeled data, and deploying AI on systems like autonomous vehicles, drones and wearable devices.

More recently, I’ve been excited about extending these ideas to foundation and generative AI models, making them more efficient, interpretable and tailored to specific applications. I've also been applying machine learning to single-cell genomics to better understand health, disease, and the biological foundations of fitness and vitality. The common thread across all of it: right-size the computation to the task, whether the device is a wearable on a wrist or a GPU on a robot.

What’s the most exciting aspect of your research?

I am really excited about my company, KeyByte, which grew directly out of two patents from my lab, Sophia and Optimus Cloud. It is the same right-sizing philosophy applied to the cloud, aiming to democratize cloud computing so that smaller companies can get the performance they need without overpaying for unused infrastructure. My students also play a big role in shaping the lab’s direction. Almost every summer, both my undergraduate and graduate students go out for internships across industry, and when they come back, they bring fresh perspectives that sharpen the overall vision of ICAN.

I’ve also been energized by my recent work on transformers.

The powerful AI architecture behind tools like ChatGPT and modern vision systems. Using a transformer involves two stages. First, training: the model studies massive amounts of data to learn patterns. Second, inference: the model uses what it has learned to respond to new inputs it has never seen before. Inference itself consists of two phases. The first, known as prefill, processes the input query using heavy computation. The second, decode, generates the output one token at a time, roughly one word per step, where each word depends on all the words before it. This phase is typically memory-bound, meaning the bottleneck is moving data around rather than computing on it.

The key idea is simple but powerful: different inputs require different amounts of computation. Simple queries take a faster, lower-cost path; complex ones receive the deeper processing they need. What makes this work especially exciting is that the problems are not fixed; they shift as the technology evolves. Whether it is adaptive systems in agriculture, large-scale learning from distributed sensors or next-generation AI models, the core challenge remains: how do we build systems that are intelligent, efficient and responsive to real-world complexities?

What motivates your work?

I love my work—it energizes me and never really feels separate from the rest of my life. I'm driven by seeing ideas go from a whiteboard sketch to a working system on a real device. My students have been a huge source of energy and motivation, bringing their own perspectives and positive energy into ICAN. My research team has channeled that energy into 14 papers for leading venues this semester alone.

A lot of what drives my research comes from my relentlessness. I read a quote years ago that stuck with me that essentially said, “I am not the strongest. I am not the fastest, but I’m really good at suffering.” In academia, it takes relentless grit—the discipline to keep showing up for the work you believe in, balancing research, teaching and mentoring, all while staying curious day after day. 

In my lab, we never stay stagnant. In tech, you must continuously learn, adapt and engage with the latest devices and ideas. That’s where my broader network of mentors, colleagues and industry collaborators plays an important role, both challenging and supporting my work as it evolves.

In what ways has Purdue College of Agriculture contributed to your growth and success?

Leaders like Dean Bernie Engel, Jerry Shively, Ron Turco and Nathan Mosier bring a grounded perspective and positive energy that makes collaboration feel natural.
 
Within ABE, my colleagues span a wide range of fields, and the overlap with my work shows up constantly. Sensing and computation come up in almost every corner of the department, from precision agriculture to environmental monitoring to biosystems, and that is what keeps my work sharp and grounded at the same time. The College has also backed my engagement with international networks and supported our most recent local high school hackathon, one of the ways I give back to the community.
 
Working across both ABE and ECE strengthens ICAN from both directions—grounding it in foundational algorithms and use-inspired applications. That dual footing has resulted in several industry funding opportunities, and the work keeps growing.

What’s next for you?

I’m now not only an associate professor, but I’m also a University Faculty Scholar. That comes with a lovely charge: training the next generation while continuing to grow as an innovator and professor myself. I think of myself as an academic ambassador, and that means open-source code and live demos, mentoring learners from high school to graduate levels, reaching broader audiences through podcasting and developing new courses to help working professionals learn machine learning.

I recently co-developed Applied Machine Learning: From Foundations to Latest Advances for Purdue's Master of Science in Data Science (MSDS) program, an online course for professionals. All my courses share the same fervor: teach the foundations rigorously, then fold in the latest advances with every offering so the material stays alive and meshes directly with ICAN. Even in an online format, students have told me they can feel the energy—and that added energy, in turn, comes right back to me.

 

When students graduate from my lab, I want them to feel pride in what they built and confidence in where they are headed, emboldened by their training and discoveries. Many of them go on to incredible companies, where they carry ICAN’s can-do spirit with them.

The field will keep evolving, but what stays constant is curiosity, adaptability and the grit to keep showing up. I love sharing knowledge widely but also want it to reach the understory—the students and learners who may not be in the spotlight yet and continue to grow quietly.

Lifelong learning is the canopy that connects all of us, and I am here for all of it because ICAN.
