LG’s K-Exaone breaks into the world’s top 10 AI models

The homegrown artificial intelligence (AI) foundational model, K-Exaone, developed by LG AI Research, has entered the global top 10, ranking seventh. The feat means the model is the only Korean presence in a ranking dominated by models developed by companies in the United States and China.

In a statement, LG said its latest AI model delivered the strongest performance among five teams in a government-led AI foundational model competition, topping 10 of 13 benchmark tests with an average score of 72. Internationally, the model ranked seventh on the Intelligence Index compiled by Artificial Analysis, making it the only Korean model to enter the top 10. China led with six models, while the US had three; Z.AI’s GLM-4.7 took first place.

LG’s foundational model ranks seventh in global rankings

LG released the foundational model with open weights on Hugging Face and saw it climb to second place on the platform’s global model trend chart, a sign of strong international interest. LG said it will offer free API access to K-Exaone through January 28, allowing developers and firms to use the model at no cost during the initial rollout period.
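For readers who want to try the open weights, the snippet below is a minimal sketch of loading a Hugging Face checkpoint with the transformers library. The repository ID is a placeholder, not K-Exaone’s confirmed repo name, so check LG AI Research’s Hugging Face organization for the actual listing.

```python
# Minimal sketch of loading an open-weight model from Hugging Face.
# The repository ID below is a placeholder, not the confirmed K-Exaone repo.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "LGAI-EXAONE/K-Exaone"  # hypothetical repo ID

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID,
    torch_dtype="auto",      # keep the precision stored in the checkpoint
    device_map="auto",       # spread layers across available GPUs
    trust_remote_code=True,  # custom architectures often ship their own code
)

prompt = "Summarize the benefits of mixture-of-experts models."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```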

Epoch AI, a US-based nonprofit, also recognized the model, adding it to its list of notable AI models. LG AI Research now has five models on the list, more than any other Korean company. “We established the development plan according to the time and infrastructure we were given, and we developed the first-phase K-Exaone using about half the data we have,” said Lee Jin-sik, head of Exaone Lab at LG AI Research.

According to LG, the model is the product of five years of in-house research and signals Korea’s entry into the global race for frontier-class AI systems. The LG division said that instead of relying on scale alone, it redesigned the architecture to boost performance while reducing training and operating costs. K-Exaone uses a mixture-of-experts (MoE) architecture with 236 billion parameters, of which about 23 billion are activated per inference.
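To illustrate why activating only a fraction of the parameters keeps inference cheap, the toy layer below sketches top-k expert routing, the general idea behind MoE models: each token is sent to only a few of the available expert networks. The expert count, dimensions, and routing scheme are illustrative assumptions, not K-Exaone’s actual design.

```python
# Toy mixture-of-experts layer: only a few experts run per token, so most
# parameters stay idle on any given inference step. Sizes are illustrative.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ToyMoE(nn.Module):
    def __init__(self, d_model=512, n_experts=16, top_k=2):
        super().__init__()
        self.top_k = top_k
        self.router = nn.Linear(d_model, n_experts)      # scores experts per token
        self.experts = nn.ModuleList([
            nn.Sequential(nn.Linear(d_model, 4 * d_model),
                          nn.GELU(),
                          nn.Linear(4 * d_model, d_model))
            for _ in range(n_experts)
        ])

    def forward(self, x):                                # x: (tokens, d_model)
        scores = self.router(x)                          # (tokens, n_experts)
        weights, idx = scores.topk(self.top_k, dim=-1)   # keep only top-k experts
        weights = F.softmax(weights, dim=-1)
        out = torch.zeros_like(x)
        for k in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = idx[:, k] == e                    # tokens routed to expert e
                if mask.any():
                    out[mask] += weights[mask, k:k + 1] * expert(x[mask])
        return out

layer = ToyMoE()
tokens = torch.randn(8, 512)
print(layer(tokens).shape)  # torch.Size([8, 512]); only 2 of 16 experts ran per token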

The K-Exaone model outperforms rival models on several metrics

The model’s core technology, hybrid attention, strengthens its ability to focus on important information during data processing while cutting resource requirements and computational load by 70% compared with previous models. The tokenizer was also upgraded by expanding its training vocabulary to 150,000 words and optimizing frequently used word combinations, which improves document-processing speed by a factor of 1.3.
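One common way a tokenizer optimizes frequent word combinations is to store them as single vocabulary entries, so the same text becomes fewer tokens. The toy function below illustrates that effect with a made-up merge list and a simplified word-level scheme; it is not a reproduction of K-Exaone’s tokenizer.

```python
# Toy illustration: merging frequent word combinations into single vocabulary
# entries shortens token sequences. The merge list here is made up.
def tokenize(text, merges):
    words = text.lower().split()
    tokens, i = [], 0
    while i < len(words):
        pair = " ".join(words[i:i + 2])
        if pair in merges:          # frequent combination stored as one token
            tokens.append(pair)
            i += 2
        else:
            tokens.append(words[i])
            i += 1
    return tokens

text = "large language models process natural language at scale"
baseline = tokenize(text, merges=set())
merged = tokenize(text, merges={"large language", "natural language"})
print(len(baseline), len(merged))                       # 8 vs 6 tokens
print(f"reduction factor: {len(baseline) / len(merged):.2f}x")
```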

In addition, the adoption of multi-token prediction boosted inference speed by 150%, improving overall efficiency. “K-Exaone is designed to maximize efficiency while reducing costs, allowing it to run on A100-class GPUs rather than requiring the most expensive infrastructure,” an LG AI Research official said. “This makes frontier-level AI more accessible to companies with limited computing resources and helps broaden Korea’s AI ecosystem.”
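The multi-token prediction idea mentioned above can be sketched roughly as follows: several output heads share one hidden state, each proposing a token further ahead, so a single forward pass yields more than one candidate token. The head count, dimensions, and decoding details are illustrative assumptions rather than LG’s implementation, and in practice such proposals are typically verified before being accepted.

```python
# Minimal sketch of multi-token prediction: several heads read the same
# hidden state and each proposes a token at a different future offset.
import torch
import torch.nn as nn

class MultiTokenHead(nn.Module):
    def __init__(self, d_model=512, vocab_size=32000, n_future=4):
        super().__init__()
        # one projection per future position (t+1, t+2, ..., t+n_future)
        self.heads = nn.ModuleList([
            nn.Linear(d_model, vocab_size) for _ in range(n_future)
        ])

    def forward(self, hidden):                     # hidden: (batch, d_model)
        # each head proposes the token id at its future offset
        return [head(hidden).argmax(dim=-1) for head in self.heads]

heads = MultiTokenHead()
hidden_state = torch.randn(1, 512)                # stand-in for a decoder output
proposals = heads(hidden_state)
print([int(t) for t in proposals])                # four proposed token ids per step
```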

Rather than relying on memorization, K-Exaone is trained to strengthen its reasoning and problem-solving capabilities. LG explained that during pre-training, the model was exposed to thinking-trajectory data that shows how problems are solved rather than just the final answer (a hypothetical example of such a record is sketched below). Safety and compliance were also priorities. LG said it carried out data compliance reviews across training datasets, removing material with potential copyright issues.
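As a loose illustration of what a thinking-trajectory record could look like, the example below pairs a problem with its intermediate reasoning steps and the final answer. The field names and serialization are assumptions for illustration, not LG’s actual data schema.

```python
# Hypothetical shape of a "thinking trajectory" training record: the sample
# carries the intermediate reasoning steps, not just the final answer.
sample = {
    "problem": "A train travels 180 km in 2 hours. What is its average speed?",
    "thinking_trajectory": [
        "Average speed is total distance divided by total time.",
        "180 km / 2 h = 90 km/h.",
    ],
    "final_answer": "90 km/h",
}

# During pre-training, the trajectory and answer would be serialized into the
# token stream so the model learns the solution path, not only the result.
training_text = (
    sample["problem"]
    + "\n" + "\n".join(sample["thinking_trajectory"])
    + "\nAnswer: " + sample["final_answer"]
)
print(training_text)
```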

The company runs an internal AI ethics committee that carries out risk assessments across four categories: social safety, Korea-specific considerations, future risks, and universal human values. On KGC-Safety, a benchmark developed by LG AI Research to measure safety in the Korean context, K-Exaone scored an average of 97.38 across the four categories, outperforming OpenAI’s GPT-OSS-120B and Alibaba’s Qwen-3-235B models.
