Why your enterprise AI strategy needs both open and closed models: The TCO reality check

This article is part of VentureBeat's special issue, "The real cost of AI: Performance, efficiency and ROI at scale." Read more from this special issue.
Over the past two decades, enterprises have had a choice between open source and closed proprietary technologies.
The original enterprise choice centered largely on operating systems, with Linux offering an open source alternative to Microsoft Windows. In the developer world, open source languages like Python and JavaScript dominate, and open source technologies, including Kubernetes, are standards in the cloud.
The same type of open versus closed choice now faces enterprises for AI, with multiple options for both types of models. On the proprietary closed-model front are some of the largest and most widely used models on the planet, including those from OpenAI and Anthropic. On the open source side are models such as Meta's Llama, IBM Granite, Alibaba's Qwen and DeepSeek.
Understanding when to use an open or a closed model is a critical choice for enterprise decision makers in 2025 and beyond. Each option carries financial and customization implications that enterprises need to understand and weigh.
Understanding the difference between open and closed licenses
There is no shortage of hyperbole in the decades-old rivalry between open and closed licenses. But what does it all actually mean for enterprise users?
A closed proprietary technology, such as OpenAI's GPT-4o, does not make its model code, training data or model weights open or available for anyone to inspect. The model cannot be easily fine-tuned, and it is available for real enterprise use only at a cost (yes, ChatGPT has a free tier, but that won't cut it for genuine enterprise workloads).
An open technology, such as Meta's Llama, IBM Granite or DeepSeek, is publicly available. Enterprises can use the models freely, generally without restrictions, including fine-tuning and customization.
Rohan Gupta, a principal with Deloitte, told VentureBeat that the open versus closed source debate isn't unique or native to AI, nor is it likely to be resolved anytime soon.
Gupta explained that closed source providers typically offer several wrappers around their model that enable ease of use, simplified scaling, smoother upgrades, seamless integration and a continuous stream of improvements. They also provide substantial developer support, including documentation as well as hands-on advice, and often deliver tighter integration with both infrastructure and applications. In exchange, the enterprise pays a premium for these services.
"Open source models, on the other hand, can provide greater control, flexibility and customization options, and are supported by a vibrant developer ecosystem," Gupta said. "These models are increasingly accessible via fully managed APIs across cloud vendors, which broadens their distribution."
Making the choice between open and closed AI models
The question many enterprise users might ask is which is better: an open or a closed model? The answer, however, is not necessarily one or the other.
"We don't view this as a binary choice," David Guarrera, generative AI leader at EY Americas, told VentureBeat. "Open versus closed is increasingly a fluid design space, where models are selected, or even orchestrated automatically, based on tradeoffs between accuracy, latency, cost, interpretability and security at different points in a workflow."
Guarrera noted that closed models limit how deeply organizations can optimize or adapt their behavior. Proprietary model vendors often restrict fine-tuning, charge premium rates, or hide the process in black boxes. While API-based tools simplify integration, they abstract away much of the control, making it harder to build highly specific or interpretable systems.
By contrast, open source models allow for targeted fine-tuning, guardrail design and optimization for specific use cases. This matters more in an agentic future, where models are no longer monolithic general-purpose tools but interchangeable components within dynamic workflows. The ability to finely shape model behavior, at low cost and with full transparency, becomes a major competitive advantage when deploying task-specific or tightly regulated solutions.
"In practice, we anticipate an agentic future where model selection is abstracted away," Guarrera said.
For example, a user may draft an email with one AI tool, summarize legal documents with another, search enterprise documents with a fine-tuned open source model, and interact with AI locally through an on-device LLM, all without ever knowing which model is doing what.
"The real question becomes: what mixture of models best suits your workflow's demands?" Guarrera said.
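The "mixture of models" idea Guarrera describes can be sketched as a simple router that sends each task to an open or closed model class based on coarse tradeoff rules. This is a minimal illustrative sketch: the task attributes, routing priorities and model categories are assumptions for demonstration, not a recommendation or any vendor's actual API.

```python
# Hypothetical model router: pick an open or closed model class per task
# based on the tradeoffs discussed (security, quality, locality, cost).
from dataclasses import dataclass


@dataclass
class Task:
    name: str
    sensitive: bool       # involves regulated or customer data?
    needs_frontier: bool  # requires top-tier model quality?
    on_device: bool       # must run locally on the user's device?


def route(task: Task) -> str:
    """Route a task using coarse, ordered tradeoff rules (illustrative only)."""
    if task.on_device:
        return "local open model (on-device LLM)"
    if task.sensitive:
        return "self-hosted open model (data stays in-house)"
    if task.needs_frontier:
        return "closed frontier model via API"
    return "managed open model API (cheapest adequate option)"


workflow = [
    Task("draft email", sensitive=False, needs_frontier=False, on_device=False),
    Task("summarize contracts", sensitive=True, needs_frontier=True, on_device=False),
    Task("offline note search", sensitive=False, needs_frontier=False, on_device=True),
]

for t in workflow:
    print(f"{t.name} -> {route(t)}")
```

In a real orchestration platform the routing rules would be richer (latency budgets, per-token prices, evaluation scores), but the structure, tasks in, model choice out, is the same, and end users never see which model handled which step.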
Consider the total cost of ownership
With open models, the basic idea is that the model is freely available to use, whereas enterprises always pay for closed models.
The reality, when it comes to total cost of ownership (TCO), is more nuanced.
Praveen Akkiraju, managing director at Insight Partners, explained to VentureBeat that TCO has many different layers. Key considerations include infrastructure hosting and engineering costs: are the open source models hosted by the enterprise itself or by a cloud provider? How much engineering, including fine-tuning, guardrail work and security testing, is needed to operationalize the model safely?
Akkiraju noted that fine-tuning an open weights model can also be a very complex task. Closed frontier model companies spend enormous engineering effort to ensure performance across multiple tasks. In his view, unless enterprises deploy similar engineering expertise, they will face a complex balancing act when fine-tuning open source models. This creates cost implications when organizations choose their model deployment strategy: for example, whether to fine-tune multiple model versions for different tasks or to use one API for many tasks.
Ryan Gross, head of data and applications at cloud services firm Caylent, told VentureBeat that from his perspective, licensing terms don't matter much, except in edge case scenarios. The largest restrictions often relate to model availability when data residency requirements are in play; in that case, deploying an open model on infrastructure like Amazon SageMaker may be the only way to get a state-of-the-art model that still complies. When it comes to TCO, Gross noted that the tradeoff lies between per-token costs and hosting and maintenance costs.
"There is a clear break-even point where the economics shift from closed to open models being cheaper," Gross said.
In his view, for most organizations, closed models, with hosting and scaling handled on the organization's behalf, will have a lower TCO. However, for large enterprises, SaaS companies with very high LLM demand but simpler use cases that don't require frontier performance, or AI-centric product companies, distilled open models can be more cost-effective.
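The break-even point Gross describes can be illustrated with back-of-the-envelope arithmetic: API costs scale with token volume, while self-hosting costs are roughly fixed. All prices and volumes below are made-up assumptions for illustration, not quotes from any vendor.

```python
# Illustrative TCO break-even: per-token API pricing vs. fixed self-hosting.
# Every number here is an assumption chosen for demonstration.

def closed_model_monthly_cost(tokens: float, price_per_million: float) -> float:
    """Closed-model API cost scales linearly with token volume."""
    return tokens / 1_000_000 * price_per_million


def open_model_monthly_cost(gpu_hosting: float, engineering: float) -> float:
    """Self-hosted open-model cost is roughly fixed: GPUs plus engineering upkeep."""
    return gpu_hosting + engineering


API_PRICE = 10.0        # assumed $ per million tokens
HOSTING = 8_000.0       # assumed monthly GPU hosting cost
ENGINEERING = 12_000.0  # assumed monthly amortized engineering cost

# Break-even volume: fixed open-model cost divided by the variable API rate
break_even_tokens = (HOSTING + ENGINEERING) / API_PRICE * 1_000_000
print(f"Break-even at ~{break_even_tokens / 1e9:.1f}B tokens per month")

for tokens in (5e8, 2e9, 5e9):
    closed = closed_model_monthly_cost(tokens, API_PRICE)
    open_ = open_model_monthly_cost(HOSTING, ENGINEERING)
    cheaper = "closed" if closed < open_ else "open"
    print(f"{tokens / 1e9:>4.1f}B tokens/month: closed=${closed:,.0f} "
          f"open=${open_:,.0f} -> {cheaper} is cheaper")
```

Under these assumed numbers the economics flip at around two billion tokens per month, which matches the pattern Gross describes: low-volume organizations come out ahead with closed APIs, while very high-volume users can justify the fixed cost of self-hosting.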
How one enterprise software developer evaluated open versus closed models
Josh Bosquez, CTO at Second Front Systems, is among the many technology leaders who have had to consider and evaluate open versus closed models.
"We use both open and closed AI models, depending on the specific use case, security requirements and strategic objectives," Bosquez told VentureBeat.
Bosquez explained that open models let his company integrate advanced capabilities without the time or cost of training models from scratch. For internal experimentation or rapid prototyping, open models help his company iterate quickly and benefit from community-driven advancements.
"Closed models, on the other hand, are our choice when data sovereignty, enterprise-grade support and security guarantees are essential, particularly for customer-facing applications or deployments involving sensitive or regulated environments," he said. "These models often come from trusted vendors who offer strong performance, compliance support and self-hosting options."
Bosquez said that model selection is a cross-functional, risk-informed process, evaluating not only technical fit but also data handling policies, integration requirements and long-term scalability.
Looking at TCO, he said it varies significantly between open and closed models and that neither approach is universally cheaper.
"It depends on deployment scale and organizational maturity," Bosquez said. "Ultimately, we evaluate TCO not just in dollars spent, but in delivery speed, compliance risk and the ability to scale securely."
What this means for enterprise AI strategy
For tech-savvy decision makers evaluating AI investments in 2025, the open versus closed debate isn't about picking sides. It's about building a strategic portfolio approach that optimizes for different use cases within your organization.
The immediate action items are clear. First, audit your current AI workloads and map them against a decision framework, considering the accuracy requirements, latency needs, cost constraints, security demands and compliance obligations for each use case. Second, honestly assess your organization's engineering capabilities for model fine-tuning, hosting and maintenance, as this directly impacts your true total cost of ownership.
Third, begin experimenting with model orchestration platforms that can automatically route tasks to the most appropriate model, whether open or closed. This positions your organization for the agentic future that industry leaders like EY's Guarrera predict, where model selection becomes invisible to end users.
2025-06-27 20:00:00