Chile develops Latam-GPT, a 50B-parameter AI model for Latin America

The Chilean National Center for Artificial Intelligence (CENIA) is developing Latam-GPT, an open-source large language model designed specifically for Latin America and trained on regional languages and contexts. The project aims to help the region achieve technological independence by creating AI that understands local dialects, cultural nuances, and historical contexts that global models often overlook.

What you should know: Latam-GPT represents a collaborative effort across Latin America to build regionally focused AI capabilities.

  • The model contains 50 billion parameters, making it comparable to GPT-3.5 in scale and complexity.
  • It’s trained on over 8 terabytes of text data from 20 Latin American countries plus Spain, totaling 2,645,500 documents.
  • Brazil leads the data contribution with 685,000 documents, followed by Mexico (385,000), Spain (325,000), Colombia (220,000), and Argentina (210,000).
  • The first version launches this year, with plans to expand into multimodal capabilities including image and video.

The big picture: This isn’t about competing with tech giants but addressing their blind spots in regional understanding.

  • “We’re not looking to compete with OpenAI, DeepSeek, or Google. We want a model specific to Latin America and the Caribbean, aware of the cultural requirements and challenges that this entails,” explains Álvaro Soto, CENIA’s director.
  • The project leverages 33 strategic partnerships across Latin America and the Caribbean.
  • Future versions will incorporate indigenous languages like Mapuche, Rapanui, and Guaraní.

Why this matters: Global AI models often fail to capture Latin American contexts, potentially limiting their effectiveness in regional applications.

  • “If you ask one of these models for an example, it would probably tell you about George Washington. We should be concerned about our own needs; we cannot wait for others to find the time to ask us what we need,” Soto notes.
  • The model will enable region-specific adaptations for sectors like education, healthcare, and agriculture.
  • It gives Latin American researchers unprecedented opportunities to experiment with large language models.

Technical infrastructure: The University of Tarapacá in Chile provides the computational backbone with a $10 million investment.

  • The facility features 12 nodes, each equipped with eight NVIDIA H200 GPUs (a rough capacity estimate follows this list).
  • This represents the first large-scale model training capacity in Chile and the broader region.
  • The infrastructure emphasizes decentralization and energy efficiency.
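
For a rough sense of the scale involved (a back-of-envelope sketch, not figures from CENIA), the snippet below multiplies the reported 12 nodes by eight H200 GPUs each, using the H200's published 141 GB of memory per card, and compares the result against a 50-billion-parameter model stored in bfloat16.

```python
# Back-of-envelope estimate based on the reported cluster size.
# Assumptions: 141 GB of HBM3e per NVIDIA H200 (published spec) and
# bfloat16 (2 bytes per parameter) for the model weights.

NODES = 12
GPUS_PER_NODE = 8
HBM_PER_GPU_GB = 141

PARAMS = 50e9            # Latam-GPT's reported parameter count
BYTES_PER_PARAM = 2      # bfloat16 weights

total_gpus = NODES * GPUS_PER_NODE            # 96 GPUs
total_hbm_gb = total_gpus * HBM_PER_GPU_GB    # ~13,500 GB aggregate memory
weights_gb = PARAMS * BYTES_PER_PARAM / 1e9   # ~100 GB for the weights alone

print(f"GPUs: {total_gpus}")
print(f"Aggregate GPU memory: {total_hbm_gb:,.0f} GB")
print(f"50B-parameter weights (bf16): {weights_gb:,.0f} GB")
```

Even allowing for optimizer state and activations, which can multiply the training-time footprint several times over, the aggregate figure leaves headroom for a 50B-parameter model under standard data- and tensor-parallel setups.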

What they’re saying: Regional leaders emphasize the collaborative and educational potential of the project.

  • “This work cannot be undertaken by just one group or one country in Latin America: It is a challenge that requires everyone’s participation,” Soto explains.
  • “If I had to choose where to apply these technologies, it would be in education, because it addresses the root cause of many of our problems.”
  • Success would mean “new generations of Latin Americans are better prepared because they had access to tools that spoke to them in their context, with their cultural references.”

Looking ahead: The open-source model is designed to spawn specialized versions for different sectors and countries.

  • Organizations across the region can adapt the base model for specific use cases, as sketched after this list.
  • Plans include expanding to indigenous language translation and cultural preservation.
  • The project aims to transform Latin America from an AI consumer into an active developer of the technology by 2030.
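
To make the adaptation path concrete, here is a minimal fine-tuning sketch using low-rank adapters (LoRA), one common way to specialize an open-weights model on modest hardware. It assumes the model ships as a standard Hugging Face checkpoint; the model ID "CENIA/latam-gpt-50b", the dataset file "agro_qa.jsonl", and the LoRA target module names are all placeholders, since none of these details have been announced.

```python
# Hypothetical sketch: adapting an open-weights base model to a regional
# use case (e.g., agricultural Q&A in Spanish) with LoRA adapters.
# The model ID, dataset file, and target module names are placeholders.

from datasets import load_dataset
from peft import LoraConfig, get_peft_model
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

BASE_MODEL = "CENIA/latam-gpt-50b"  # placeholder; real ID not yet published

tokenizer = AutoTokenizer.from_pretrained(BASE_MODEL)
model = AutoModelForCausalLM.from_pretrained(BASE_MODEL, device_map="auto")

# Train only small low-rank adapter matrices instead of all 50B weights.
lora_config = LoraConfig(
    r=16,
    lora_alpha=32,
    target_modules=["q_proj", "v_proj"],  # depends on the actual architecture
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)

# Load a (hypothetical) domain corpus and tokenize it.
dataset = load_dataset("json", data_files="agro_qa.jsonl")["train"]
dataset = dataset.map(
    lambda batch: tokenizer(batch["text"], truncation=True, max_length=1024),
    batched=True,
    remove_columns=dataset.column_names,
)

trainer = Trainer(
    model=model,
    args=TrainingArguments(
        output_dir="latam-gpt-agro-lora",
        per_device_train_batch_size=1,
        gradient_accumulation_steps=16,
        num_train_epochs=1,
        learning_rate=2e-4,
        bf16=True,
    ),
    train_dataset=dataset,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
model.save_pretrained("latam-gpt-agro-lora")
```

The appeal of adapter-style fine-tuning in this scenario is that a ministry or university would only need to store and share a few hundred megabytes of adapter weights rather than a full copy of the 50B-parameter model.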