This article was automatically translated from the original Turkish version.
In August 2025, OpenAI unveiled its first open-weight large language models since the release of GPT-2 in 2019. The breaking of this six-year silence can be read not only as a technical advance but also as a strategic and political move.
On January 21, 2025, one of the largest artificial intelligence infrastructure initiatives in U.S. history was launched. 【1】 The Stargate Project, a partnership between OpenAI, Oracle, and SoftBank, aims to build a nationwide network of high-capacity data centers over the next four years with an investment of up to $500 billion. More than a technological endeavor, Stargate is framed as a strategic and geopolitical initiative: it seeks to secure American leadership in artificial intelligence, create hundreds of thousands of jobs for Americans, and contribute to the global economy.
Since assuming office, Donald Trump has sought to ease environmental regulations in energy-intensive sectors and to facilitate the sale of American AI technologies abroad. The document titled “Winning the Race: America’s AI Action Plan” defines artificial intelligence as a critical national-security and economic imperative, and states the objective plainly: “unquestioned and unchallenged global technological dominance.” 【2】
In this context, Trump supports this vision with concrete steps in foreign policy. His Gulf tour on May 15, 2025, concluded with trillions of dollars in investment and technology agreements. Among the most notable were an agreement facilitating the United Arab Emirates’ access to semiconductor chips for AI use and a $20 billion investment to establish an AI data center in Abu Dhabi. 【3】
These and similar initiatives demonstrate that artificial intelligence is no longer merely a technology but has become a matter of sovereignty, security, and economic superiority. The Trump administration has adopted a proactive strategy, building domestic infrastructure while strengthening international investment partnerships. In the coming years, competition between the United States and China in artificial intelligence is expected to evolve into a state-level race for technological supremacy.
These strategic orientations are not confined to political declarations; they were given concrete form in the open-weight GPT-OSS models OpenAI released in August 2025. The gpt-oss-120b and gpt-oss-20b models were presented as the technical embodiment of this vision, aiming both to strengthen U.S. AI leadership and to establish superiority over China in the open-model competition.
Open-weight models refer to the public release of the trained weights of an AI model. This means developers can download the model, run it on their own systems, and fine-tune it according to their specific applications. Although the models are not fully open-source, the release of weights for models of this scale provides substantial freedom for inspection and customization.
The models are available in two versions: gpt-oss-120b and gpt-oss-20b, both based on the Mixture-of-Experts (MoE) architecture.
Comparison of the two models (OpenAI)
OpenAI’s gpt-oss-120b model uses a technically sophisticated Mixture-of-Experts (MoE) architecture. Although it has 117 billion parameters in total, only 5.1 billion are activated per token, a design that enables high performance while containing computational cost. Developers can set the reasoning level (“low,” “medium,” or “high”) via the system prompt during inference, balancing speed against accuracy for each use case.
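Selecting a reasoning level amounts to adding a directive to the system message. A minimal sketch, assuming a chat-completions-style message list and the `Reasoning: <level>` wording described in OpenAI's model documentation (adjust if your serving stack formats prompts differently):

```python
# Minimal sketch: choosing a reasoning level through the system prompt.
# The exact directive wording is an assumption based on OpenAI's docs;
# the helper name build_messages is hypothetical.

def build_messages(user_prompt: str, effort: str = "medium") -> list[dict]:
    """Build a chat-style message list carrying a reasoning-effort directive."""
    if effort not in {"low", "medium", "high"}:
        raise ValueError(f"unknown reasoning level: {effort!r}")
    return [
        {"role": "system", "content": f"Reasoning: {effort}"},
        {"role": "user", "content": user_prompt},
    ]

messages = build_messages("Summarize the MoE architecture.", effort="high")
```

Raising the level tells the model to spend more tokens on its chain of thought before answering, so "high" trades latency for accuracy on hard problems.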
The model transparently presents chain-of-thought (CoT) reasoning, enabling debugging, confidence analysis, and improved model interpretability. The gpt-oss-120b model is quantized in MXFP4 format and optimized to run on a single NVIDIA H100 GPU. Released under the Apache 2.0 license, it is suitable for both commercial and research purposes.
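MXFP4 belongs to the microscaling (MX) family of formats: each small block of weights shares one power-of-two scale, and each individual weight is stored as a 4-bit (E2M1) float. A toy sketch of the idea, illustrative only and not OpenAI's actual kernel:

```python
import numpy as np

# Toy sketch of MXFP4-style block quantization (illustrative only):
# every block of 32 weights shares one power-of-two scale, and each
# weight is rounded to the nearest 4-bit E2M1 representable value.

FP4_GRID = np.array([0.0, 0.5, 1.0, 1.5, 2.0, 3.0, 4.0, 6.0])  # |E2M1| values

def quantize_block(block: np.ndarray) -> tuple[np.ndarray, float]:
    """Quantize one block; return the FP4 values and the shared scale."""
    amax = np.abs(block).max()
    if amax == 0.0:
        return np.zeros_like(block), 1.0
    # Smallest power-of-two scale that maps the block into [-6, 6]
    scale = float(2.0 ** np.ceil(np.log2(amax / FP4_GRID[-1])))
    scaled = np.abs(block) / scale
    idx = np.abs(scaled[:, None] - FP4_GRID[None, :]).argmin(axis=1)
    return np.sign(block) * FP4_GRID[idx], scale

def dequantize(codes: np.ndarray, scale: float) -> np.ndarray:
    return codes * scale

rng = np.random.default_rng(0)
w = rng.normal(size=32)           # one 32-weight block
codes, scale = quantize_block(w)
w_hat = dequantize(codes, scale)  # lossy reconstruction of w
```

The payoff is density: roughly 4 bits per weight plus a small shared scale per block, which is what lets a 117B-parameter checkpoint fit on a single H100.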
The gpt-oss-20b model is lighter and more accessible. With a total of 21 billion parameters, it activates only 3.6 billion parameters per token. Thanks to its MoE architecture, it maintains a balance of high accuracy and speed even on smaller devices.
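The active-parameter figures for both models come from top-k expert routing: a router scores every expert for each token, but only the few highest-scoring experts actually run. A minimal sketch with made-up sizes (the real models use far more and far larger experts):

```python
import numpy as np

# Minimal top-k MoE routing sketch (hypothetical sizes, not the real model):
# only TOP_K of NUM_EXPERTS run per token, so the active parameter count
# is a small fraction of the total.

NUM_EXPERTS, TOP_K, D = 32, 4, 16

rng = np.random.default_rng(0)
experts = rng.normal(size=(NUM_EXPERTS, D, D))  # one weight matrix per expert
router_w = rng.normal(size=(D, NUM_EXPERTS))

def moe_forward(x: np.ndarray) -> np.ndarray:
    """Route one token through its top-k experts, weighted by a softmax."""
    logits = x @ router_w
    top = np.argsort(logits)[-TOP_K:]      # indices of the chosen experts
    gates = np.exp(logits[top] - logits[top].max())
    gates /= gates.sum()                   # softmax over the chosen experts only
    return sum(g * (experts[i] @ x) for g, i in zip(gates, top))

x = rng.normal(size=D)
y = moe_forward(x)
active_fraction = TOP_K / NUM_EXPERTS      # here only 1/8 of expert weights run
```

This is why gpt-oss-120b can hold 117B parameters yet spend only 5.1B per token, and gpt-oss-20b 3.6B of its 21B: total capacity grows with the expert pool, while per-token compute grows only with k.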
The greatest advantage of this smaller model is its ability to run on systems with just 16 GB of RAM, enabling local inference on laptops. Like the larger model, the reasoning level can be adjusted in gpt-oss-20b. CoT is transparent, it is released under the Apache 2.0 license, and fine-tuning is supported.
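A back-of-envelope estimate shows why 16 GB suffices. Treating all 21B parameters as MXFP4 (4-bit values plus one 8-bit scale per 32-weight block) is a simplification, since in practice only the MoE weights are quantized this aggressively, but it gives the right order of magnitude:

```python
# Rough estimate of gpt-oss-20b's weight footprint under MXFP4-style
# storage: 4 bits per weight plus one shared 8-bit scale per 32-weight
# block, i.e. about 4.25 bits per parameter. Figures are approximate.

TOTAL_PARAMS = 21e9
BITS_PER_PARAM = 4 + 8 / 32               # 4.25 bits including block scales

weight_bytes = TOTAL_PARAMS * BITS_PER_PARAM / 8
weight_gb = weight_bytes / 1024**3        # roughly 10-11 GiB of weights
```

That leaves several gigabytes of headroom in a 16 GB system for activations, the KV cache, and the rest of the runtime, which is what makes laptop-class local inference plausible.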
When compared with other OpenAI reasoning models such as o3, o3-mini, and o4-mini, gpt-oss-120b outperforms o3-mini and achieves near parity with o4-mini in coding (Codeforces), general problem solving (MMLU and HLE), and tool use (TauBench). It surpasses o4-mini on health-related queries (HealthBench) and mathematics competitions (AIME 2024 and 2025). The gpt-oss-20b model performs close to, or slightly better than, o3-mini on the same benchmarks.
You can access the model’s GitHub repository here.
You can access the Hugging Face gpt-oss-120b repository here.
You can access the Hugging Face gpt-oss-20b repository here.
[1] Guldogan, Diyar. “Trump announces artificial intelligence infrastructure project”. Anadolu Agency. Accessed August 6, 2025. https://www.aa.com.tr/en/americas/trump-announces-artificial-intelligence-infrastructure-project/3458842#.
[2] Hernandez, Michael. “Trump signs 3 executive orders to bolster US standing in global AI race”. Anadolu Agency. Accessed August 6, 2025. https://www.aa.com.tr/en/americas/trump-signs-3-executive-orders-to-bolster-us-standing-in-global-ai-race/3640325.
[3] Okay, Dilara Zengin. “ABD ile BAE, 200 milyar doları aşan yeni ticaret anlaşmaları yaptı” [“US, UAE sign new trade deals exceeding $200 billion”]. Anadolu Agency. Accessed August 6, 2025. https://www.aa.com.tr/tr/ekonomi/abd-ile-bae-200-milyar-dolari-asan-yeni-ticaret-anlasmalari-yapti-/3569863.