When Aishwarya Lahariya studied fiber chemistry in university, she was taught a standardized method for processing cotton. However, after co-founding the artisanal fashion brand Jiwya and collaborating with artisans across India, she quickly saw the limitations of this approach, which depends heavily on water and chemicals. The artisans’ method was faster and used fewer resources. They skipped scouring—removing waxes and oils with hot water and cleaning agents—and bleaching, since both steps are unnecessary when working with natural dyes, as Jiwya does. “A lot of Jiwya’s water savings happen because we do not do those two steps,” she explains. Yet this alternative process was never mentioned in her formal education.

Much Indigenous knowledge remains undocumented within mainstream Western frameworks and requires genuine outreach and inquiry to uncover. Now, with the rise of AI and its growing role in research, information gathering, and decision-making, the bias toward Western perspectives is being amplified. As fashion outsources more processes to technology, it risks drifting further from vital Indigenous and traditional knowledge.

“AI cannot replace the lived human experience,” says Virginia Keesee, senior director of global fashion and nature initiatives at Conservation International. “Indigenous peoples and local communities are a huge part of the fashion value chain as stewards of nature, biodiversity, and climate. Partnership with them and support for them is critically important, not only for people, but for the future of our planet.”

Fashion has a long history of marginalizing Indigenous peoples. In 2022, Textile Exchange reported that only 5% of 252 surveyed fashion and textile companies consulted Indigenous peoples and local communities when developing their sustainability strategies. To address this gap, Conservation International, Textile Exchange, and Kering collaborated on a guide to Indigenous partnership principles in 2024. The guide aims to protect Indigenous communities from exploitation—such as land encroachment, biodiversity loss, and the unauthorized use of traditional designs—while also encouraging the integration of Indigenous knowledge into sustainability strategies. This can include practices like wild rubber tapping that preserves tree health, using natural dyes from cassava bark, and land preservation.

The integration of AI into fashion—from personalized ads and virtual fittings to supply chain management and nature-based solutions—further complicates the issue. Fashion-related queries are more likely to draw on data from U.S. or European research bodies, industry standards, or brands than on Indigenous knowledge. Trained on human-generated data, AI absorbs and amplifies existing biases, heavily favoring dominant Western viewpoints.

When I asked ChatGPT for a list of experts in cotton and water stewardship, it provided only Western academics and climate NGOs. Another prompt asking where water-saving data comes from revealed: “The training data is not evenly distributed globally. Indigenous, local, or unpublished farmer knowledge is under-represented.” (OpenAI, the U.S. developer of ChatGPT, did not provide comment in time for publication.)

Engaging traditional communities is not as simple as just inviting them to the table. Indigenous peoples are often left out of the conversation, and many are wary of having their knowledge exploited by AI, even if invited to participate. If these underlying biases go unaddressed, they risk undermining progress in both sustainability and diversity and inclusion.

So, who actually benefits from AI?

Taylor Sparklingeyes, a senior data sovereignty specialist at the consulting firm Shared Value Solutions and a member of Goodfish Lake First Nation in Canada, has been exploring this question. After fielding inquiries from her community about AI, she joined the Indigenous Pathfinders in AI program run by the Montreal AI institute Mila. The program aims to empower First Nations, Inuit, and Métis participants with Indigenous-centered approaches to AI.

Sparklingeyes cautions that the breakneck speed of AI development—the fastest-spreading technology in history—threatens to overlook safety, security, and privacy concerns within Indigenous communities. “To be a true ally, you sometimes have to let go of strict timelines and expectations,” she says. “Building trusted relationships takes time, and that foundation is essential, whether we’re co-designing governance, managing data, or assessing AI’s impact on communities.”

Some experts go further, suggesting that bias in AI isn’t just accidental but intentional. Deepak Varuvel Dennison, an AI researcher at Cornell University, points out that AI platforms have a financial incentive to cater to the biases of their majority user base, which keeps people engaged by reinforcing their existing beliefs. With users concentrated in the Global North, this “silicon gaze” further marginalizes Indigenous knowledge. “What’s economically valuable to those in power gets promoted, and what isn’t gets sidelined,” Dennison explains.

The issue of access adds another layer of complexity. For many Indigenous communities, the question isn’t just about representation but whether they want AI to have access to their data and insights at all. While creators in the Global North are now grappling with data ownership, Indigenous communities have long fought for data sovereignty.

Sparklingeyes notes that many Indigenous groups have experienced historical harm through knowledge and data extraction, often without fair terms or consent. Data about them—from maps to artwork—may have been taken and used to train AI systems if it appears online, in journals, or in government databases. This information is often stripped of its original context and filtered through Western perspectives, as English-language research from high-income countries dominates AI training materials.

To prevent any effort to rebalance AI from repeating these patterns of inequitable extraction, organizations like the Indigenous-led non-profit Earth Daughters advocate for strong safeguards. These include community-defined protections such as free, prior, and informed consent; Indigenous governance over data and knowledge; fair compensation; and the genuine right to refuse participation. In an email to Vogue Business, the Earth Daughters team explains, “These safeguards must be established before any engagement begins and cannot be reduced to mere technical or checklist-based solutions.”

In practice, this could mean Indigenous communities refuse fashion or technology companies access to their data. Intellectual property lawyer Monica Boţa Moisin founded the Cultural Intellectual Property Rights Initiative (CIPRI) in 2018 to advocate for the recognition of cultural intellectual property rights related to traditional garments, designers, and manufacturing techniques.

A 2019 case involved the Oma, an ethnic minority group in northern Laos, who accused a high-end Italian fashion brand of selling clothes featuring copies of their traditional designs. In partnership with the Traditional Arts and Ethnology Center, CIPRI helped the Oma create a digital database to protect their traditional knowledge and cultural expressions, giving them control over how these are accessed and commercialized. When a researcher later requested to use this dataset to train an AI system designed to prevent cultural misappropriation in fashion, the Oma and their support team were able to thoroughly evaluate the proposal.

Ultimately, the Oma declined the request, feeling the benefits to their community were insufficient. While the researcher aimed to protect Oma designs from misuse, the community believed granting access would exclude them from future discussions. Once data is used for AI training, it might discourage further direct engagement from the fashion industry. “Technology is inevitable, but we must ask: Is this beneficial to the Oma? Do they have the necessary infrastructure to benefit? And how?” says Boţa Moisin.

Quinn Manson Buchwald, director of the Indigenous and Traditional Peoples program at Conservation International and a citizen of the Little Shell Tribe of Chippewa Indians of Montana and the Manitoba Métis Federation, emphasizes that “Free, prior, and informed consent is an ongoing process. It’s not a one-time engagement. You must maintain constant partnership with these communities, keeping them updated and informed.” One-time data access simply doesn’t meet these standards.

The Earth Daughters team adds that refusing to participate in AI training shouldn’t be seen as hindering progress, but rather as an act of sovereignty and care. “Instead of debating whether AI is inherently good or bad, we focus on who controls it, who benefits, and who is exposed to harm.” Similarly, Sparklingeyes cautions against simply feeding Indigenous knowledge into centralized tools. “When an institution approaches a community saying, ‘We have this system; help us by uploading your data,’ the imbalance persists,” she explains. “They need to go back to the co-design phase to truly understand if it’s what the communities want.”

An Indigenous-Centered Approach

The erasure of Indigenous perspectives on AI platforms, models, and tools reflects their broader exclusion from society, making education programs that bring these communities into the field essential. “In Indigenous contexts, we’re often playing catch-up after years of being excluded from these spaces,” says Lynnsey Chartrand, head of Indigenous initiatives at Mila, which runs the Pathfinders program launched in 2024. “What’s exciting about AI is that, for once, there’s an opportunity for Indigenous voices to shape the field from the ground up as it evolves.”

One project developed by Pathfinders is Green Circle, an AI tool that combines traditional agricultural knowledge with climate and soil data to provide tailored guidance on crop selection, planting, and trading. This could be valuable for brands working with natural fibers or seeking sustainable sourcing solutions. Chartrand, who is also a citizen of the federally recognized Manitoba Métis Federation, reflects: “What has really struck me since the first year—and still does—is how powerful it is to give Indigenous talent the time, resources, tools, and creative freedom to explore how AI could help their communities. It also highlights the value of having this technology developed by us, not just for us. The care put into these projects feels impossible to replicate by someone outside the community.”

While there is a concern that the burden of creating fairness might fall on Indigenous people themselves, Chartrand remains hopeful. “I believe there are genuine allies who are not Indigenous but are stepping up to support,” she says.

The potential for more balanced and equitable AI will grow as we improve secure, locally managed data storage; as grassroots Indigenous involvement and advocacy expand; as Indigenous-led frameworks take shape; and as underrepresented cultures and voices push back against systemic biases. However, achieving this will demand ongoing, adaptive efforts from the fashion industry, along with difficult self-reflection on issues of access, benefits, and purpose.

Dennison adds, “Whenever I consider how to make AI models more representative, I immediately wonder: what’s the goal? Is it so a U.S. corporation can create ads that resonate better in India? Who ultimately benefits? That’s the essential question of value I’m asking.”

Frequently Asked Questions: AI and Indigenous Knowledge

Beginner-Level Questions

What is Indigenous knowledge?
Indigenous knowledge refers to the unique understandings, skills, and philosophies developed by societies with long histories of interaction with their natural surroundings. It’s often passed down orally through generations and is deeply tied to culture, language, and place.

How could AI cause this knowledge to fall out of use?
If AI systems are primarily trained on dominant global datasets, they may not recognize, value, or accurately represent Indigenous ways of knowing. This could make local, oral, and culturally specific knowledge seem less relevant or less correct compared to AI-generated information, leading to its gradual neglect.

Isn’t AI just a tool? How can a tool make people forget things?
AI is a tool that shapes what information is easy to find, trusted, and used. If AI assistants, search engines, and educational tools don’t include or prioritize Indigenous knowledge, younger generations may rely solely on these digital sources, bypassing traditional learning from elders and community practices.

Are there any examples of this happening already?
Yes, in areas like agriculture, where AI-driven precision farming advice might override traditional crop rotation or land management practices. In language preservation, auto-translation tools often fail with Indigenous languages, pushing people toward dominant languages for convenience.

Can AI actually help preserve Indigenous knowledge instead?
Absolutely. When developed ethically and in partnership with communities, AI can help document languages, map ancestral lands using satellite data, or create digital archives of stories and practices. The key is who controls the process and the data.

Advanced-Level Questions

What’s the difference between digitizing knowledge and preserving it?
Digitizing means creating a digital record. Preserving means keeping the knowledge alive, dynamic, and integrated into community life. AI might help with the first but could harm the second if it turns living knowledge into a static artifact disconnected from its cultural context.

How do issues of data sovereignty come into play?
Data sovereignty is the right of Indigenous peoples to own, control, and govern the data collected from them or about their knowledge. AI systems often require large datasets. If Indigenous knowledge is extracted without consent or proper governance, it can be misused, commercialized, or distorted, further endangering its authentic use.

Could AI create biased or inaccurate versions of Indigenous knowledge?
Yes. Because training data skews toward English-language research from high-income countries, AI can strip Indigenous knowledge of its original context and reproduce distorted or inaccurate versions of it.