Research

Advancing Efficient AI

Our research focuses on developing energy-efficient foundation models that can power the next generation of embodied AI systems.

Research Areas

Energy-Efficient Neural Architectures

Developing novel network architectures that achieve high performance with minimal computational overhead.

Embodied Foundation Models

Creating foundation models specifically designed for robots and other physical AI systems.

Sustainable Training Methods

Innovating training techniques that reduce energy consumption without sacrificing model quality.

Edge AI Optimization

Optimizing models for deployment on resource-constrained edge devices.

Publications

2025

Efficient Architectures for Embodied Foundation Models

J. Ortiz, M. Chen, E. Rodriguez

In Preparation

We present novel architectures for embodied AI that target competitive performance at a significantly reduced computational cost.

Sustainable Training Methods for Large-Scale AI Systems

M. Chen, E. Rodriguez, J. Ortiz

Working Paper

This paper explores new training methodologies designed to reduce the carbon footprint of training large AI models.

Towards Zero-Carbon Embodied Intelligence

E. Rodriguez, J. Ortiz

Position Paper

A proposed framework for developing embodied AI systems with minimal carbon emissions throughout their lifecycle.

Open Research Philosophy

We believe in sharing our research openly with the community. All our publications are freely accessible, and we actively contribute to open-source initiatives.

Open Access · Open Source · Community-Driven