Eahou Fest 2026: Update on AI
Purple Maiʻa shares their vision for a community-centered approach to AI — one that prioritizes ʻāina care, Indigenous data sovereignty, and human agency over extraction and scale. From their Maui-based computing cluster to environmental monitoring work rooted in Hawaiian epistemology, they're building toward an AI future designed around community constraints rather than corporate ones.

A lot of people are concerned about AI, and at Purple Maiʻa we share those concerns. Building out the datacenters needed for AI uses vast amounts of land, electricity, water and rare earth elements, while also impacting the frontline communities where new datacenters are built. AI is trained on human creative output and community data, often without consent, while replacing human workers. Having seen social media companies sell or otherwise misuse personal data, people are right to doubt that AI companies will act differently. This concern is heightened for Indigenous communities fighting for data sovereignty.
But there are other ways to approach AI than the path being pushed on us. We can build and use AI in ways that center care for ʻāina, data sovereignty, and human agency. We have the opportunity to explore a fundamentally different future: AI systems designed to live within environmental, energy and community constraints rather than overriding them.
At Purple Maiʻa, we’ve sought out partners like Dr. Josiah Hester, whose research focuses on sustainable computing for environmental monitoring, and Dr. Keolu Fox, who is creating hardware for Indigenous data sovereignty in science. Their work, along with others’, points toward this different future.
One example: In partnership with Hester, we’ve been working on an environmental monitoring project called KILO (Kilo ʻIke Laulima na ʻŌiwi) that is inspired by the Kanakaʻole Foundation’s kānāwai methodology. While environmental monitoring tools can be powerful, they are often disconnected from place, community, and Indigenous governance. Large-scale platforms frequently extract information without local ownership or context. Data becomes something taken and interpreted elsewhere, rather than held in relationship and used in service of stewardship.
In contrast, our system aspires to use environmentally aware edge computing and “locally grown” compute clusters built from recycled hardware that keep data on-site, reducing ecological footprint while strengthening community control. Crucially, KILO is built around Indigenous data sovereignty principles, ensuring communities control their own data, determine how it is stored and shared, and interpret it within local context.
In practice, this means we have our own “cluster” of mixed hardware (5-6 machines) at our Maui office that serves as the server for KILO and other projects. We own these machines, so the data on them is never routed to mainland data centers. The Maui Cluster runs open-source AI models that we’ve copied and modified to suit our own needs. It receives data from sensors at deployment sites (farms or other ʻāina we kilo), but only when necessary. Some of the processing can be done by hardware (like a Raspberry Pi) on-site at the farm, keeping data literally local to a place.
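To make the “only when necessary” idea concrete, here is a minimal sketch of what on-site edge processing could look like: raw sensor readings stay on the device, and only a small summary is transmitted to the cluster when a reporting condition is met. The class name, fields, and threshold are illustrative assumptions, not KILO’s actual implementation.

```python
# Hypothetical sketch: an edge device (e.g., a Raspberry Pi at a farm)
# buffers raw readings locally and emits a summary only when a
# reporting condition is met. All names and thresholds are
# illustrative, not Purple Maiʻa's actual code.

from dataclasses import dataclass, field
from statistics import mean
from typing import Optional

@dataclass
class EdgeNode:
    """Keeps raw readings on-site; sends only aggregate summaries."""
    report_threshold: float                      # e.g., soil-moisture % that warrants a report
    buffer: list = field(default_factory=list)   # raw data never leaves the device

    def ingest(self, reading: float) -> Optional[dict]:
        self.buffer.append(reading)
        # Only summarize and transmit when the latest reading drops
        # below the threshold; otherwise nothing is sent upstream.
        if reading < self.report_threshold:
            summary = {
                "n_readings": len(self.buffer),
                "mean": round(mean(self.buffer), 2),
                "latest": reading,
            }
            self.buffer.clear()  # raw values stay local, then are discarded
            return summary
        return None

node = EdgeNode(report_threshold=20.0)  # report if moisture drops below 20%
for r in [34.0, 31.5, 28.0, 19.2]:
    out = node.ingest(r)
print(out)  # summary emitted only on the final, below-threshold reading
```

The design choice this illustrates: the cluster sees a few numbers about a place, not a continuous stream of raw data from it, which is one way “keeping data literally local” can work in code.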
The Maui Cluster isn’t perfect. While it uses no water, it does not yet run on solar, and some of the hardware was bought new, not repurposed. The Maui Cluster runs slower than major AI platforms like Claude and OpenAI, so we use it for sensitive data only, while our organization uses Claude and other platforms in our day-to-day work. Running our org totally on our own compute is a future aspiration.
There are other ideas we’re working on. Some are in-the-weeds engineering to make things work, like smart routing of tasks between low-powered edge devices and servers in real time. And some are foundational, like training models that reason constitutionally from Hawaiian epistemology.
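The smart-routing idea can be sketched as a simple real-time decision: run a task on the low-powered edge device when it has the power and time to finish, otherwise offload it to the cluster. The fields and thresholds below are assumptions for illustration only.

```python
# Hypothetical sketch of edge-vs-server task routing. A task stays on
# the edge device (keeping its data on-site) unless the device is low
# on power or can't meet the latency budget. Illustrative only.

from dataclasses import dataclass

@dataclass
class Task:
    est_compute: float   # estimated work units
    deadline_s: float    # latency budget in seconds

@dataclass
class EdgeState:
    battery_pct: float   # battery level of the on-site device
    throughput: float    # work units per second the device can sustain

def route(task: Task, edge: EdgeState, min_battery: float = 30.0) -> str:
    """Return 'edge' to keep the task on-site, or 'cluster' to offload."""
    # Offload if the device is running low on power...
    if edge.battery_pct < min_battery:
        return "cluster"
    # ...or if it can't finish within the latency budget.
    if task.est_compute / edge.throughput > task.deadline_s:
        return "cluster"
    return "edge"

slow_device = EdgeState(battery_pct=80.0, throughput=1.0)
print(route(Task(est_compute=5, deadline_s=10), slow_device))    # edge
print(route(Task(est_compute=50, deadline_s=10), slow_device))   # cluster
```

A real router would fold in network conditions and data-sensitivity rules (sensitive data pinned to the edge regardless of speed), but the shape of the decision is the same.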
And there’s much more to talk about. What economies are necessary to implement decentralized AI approaches? Given resource constraints, in what sectors and for what purposes should AI be most strategically and impactfully used, and when is it not needed? What approach makes sense for AI in education and creative fields, and what practices should we resist? What policies should Hawaiʻi adopt to limit the impacts of datacenters, protect Native Hawaiian data sovereignty, and protect people from AI-generated misinformation and discrimination?
Eahou highlights how cultural frameworks can guide innovation with ethics, accountability, and long-term vision. We hope you’ll join us at Eahou Fest 2026 to be part of this collective learning and reimagining of Hawaiʻi’s AI future.