
IEEE standard offers 6 steps for AI system procurement

For more than three years, the IEEE Standards Association has been refining a draft standard for the procurement of artificial intelligence and automated decision-making systems, IEEE 3119-2025. It is intended to help procurement teams identify and manage risk in high-risk domains. Such systems are used by government entities involved in education, health, employment, and many other public-sector areas. Last year, the working group partnered with a European Union agency to evaluate the draft standard's components and to gather feedback about users' needs and their views on the standard's value.

At that time, the standard included five processes to help users develop their solicitations and to identify, mitigate, and monitor the harms commonly associated with high-risk AI systems.

Those processes were problem definition, vendor evaluation, solution evaluation, contract negotiation, and contract monitoring.

The EU agency's feedback led the working group to reconsider the processes and to resequence several activities. The final draft now includes an additional process: solicitation preparation, which comes directly after the problem-definition process. The working group believes the added process addresses the challenges organizations face when preparing AI-specific solicitations, such as the need to include transparent and robust data requirements and to incorporate questions about the maturity of a vendor's AI governance.

The EU agency also confirmed that including solicitation preparation is essential, because it gives procurement teams additional opportunities to tailor their solicitations to technical requirements and to questions about responsible AI system choices. Leaving room for such adjustments is especially relevant when AI acquisitions take place in emerging and shifting regulatory environments.
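To make the sequencing concrete, here is a minimal sketch that models the six processes as an ordered workflow. It is an illustration only: the names paraphrase the processes described above and are not identifiers taken from the standard itself.

    from dataclasses import dataclass, field
    from enum import Enum

    # Illustrative only: the six IEEE 3119-2025 processes, in the order
    # described in this article. Names are paraphrased, not taken from
    # the standard's text.
    class ProcurementProcess(Enum):
        PROBLEM_DEFINITION = 1
        SOLICITATION_PREPARATION = 2  # the process added after the EU agency's feedback
        VENDOR_EVALUATION = 3
        SOLUTION_EVALUATION = 4
        CONTRACT_NEGOTIATION = 5
        CONTRACT_MONITORING = 6

    @dataclass
    class ProcurementTracker:
        """Tracks which processes a purchasing team has completed, in order."""
        completed: set = field(default_factory=set)

        def complete(self, process: ProcurementProcess) -> None:
            # Enforce the sequence: every earlier process must already be done.
            missing = [p.name for p in ProcurementProcess
                       if p.value < process.value and p not in self.completed]
            if missing:
                raise ValueError(f"Cannot complete {process.name}; still open: {missing}")
            self.completed.add(process)

    # Example: solicitation preparation can only follow problem definition.
    tracker = ProcurementTracker()
    tracker.complete(ProcurementProcess.PROBLEM_DEFINITION)
    tracker.complete(ProcurementProcess.SOLICITATION_PREPARATION)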

Gisele Waters

IEEE 3119 in the ecosystem of standards

There are currently several internationally accepted standards for AI management, AI ethics, and general software acquisition. Those from the IEEE Standards Association and the International Organization for Standardization (ISO) address the design, use, and life-cycle management of AI.

Until now, however, there has been no universally accepted standard focused on the procurement of AI tools that offers operational, process-level guidance for purchasing high-risk AI systems that serve the public interest.

The IEEE 3119 standard addresses that gap. Unlike ISO/IEC 42001 and other certifications concerned with general AI oversight and risk governance, the new IEEE standard offers a risk-based, operational approach that helps government agencies adapt their traditional procurement practices.

Governments have an important role to play in the responsible deployment of AI. However, market dynamics and the unequal AI expertise between industry and government can be obstacles to success.

One of the standard's core goals is to better inform procurement leaders about what they are buying before they make high-risk AI purchases. IEEE 3119 defines high-risk AI systems as those that make, or are a substantial factor in making, consequential decisions with significant effects on people, groups, or society. The definition is similar to the one used in Colorado's 2024 AI Act, the first state-level law in the United States to comprehensively address high-risk AI systems.

That said, the standard's processes complement ISO 42001 in several ways. The relationship between the two is outlined below.

International standards, often characterized as soft law, are used to shape AI development and to encourage international cooperation on its governance.

Hard laws for AI, meaning legally binding rules and obligations, are a work in progress around the world. In the United States, a patchwork of state legislation governs different aspects of AI, and the national approach to AI regulation is fragmented, with various federal agencies implementing their own guidelines.

Europe has led the way with the European Union's AI Act, which began governing AI systems according to risk levels when it came into effect last year.

But the world still lacks hard regulatory laws for AI with an international scope.

The IEEE 3119-2025 standard is designed to align with existing hard laws. Because of its focus on procurement, the standard supports the high-risk provisions in Chapter III of the EU AI Act and in Colorado's AI Act. The standard also aligns with the proposed Texas HB 1709 legislation, which would mandate reporting on the use of AI systems by certain business entities and state agencies.

Because most AI systems used in the public sector are purchased rather than built in-house, IEEE 3119 applies to commercial AI products and services that do not require substantial modifications or customization.

The target audience of the standard

The standard is intended for:

  • Mid-level procurement professionals and interdisciplinary team members with a moderate level of knowledge about AI governance and AI systems.
  • Public- and private-sector procurement professionals who serve as coordinators or buyers, or who hold equivalent roles, within their entities.
  • Non-procurement managers and supervisors who are either responsible for procurement or oversee staff who perform procurement functions.
  • Professionals working in governing entities involved in public education, utilities, transportation, and other publicly funded services who either manage or carry out procurements and want to adapt their purchasing to AI tools.
  • AI vendors seeking to understand the new transparency and disclosure requirements for their high-risk commercial products and solutions.

Procurement training program

The IEEE Standards Association has partnered with the AI Procurement Lab to offer the Responsible AI Procurement Training Program. The course covers how to apply the standard's core processes and how to adapt current practices for purchasing high-risk AI.

The standard includes more than 26 tools and templates across the six processes, and the training program explains how to use many of them. For example, the training includes instruction on how to conduct a risk analysis, how to apply the vendor evaluation scoring guide to analyze AI vendors' claims, and how to create an AI "risk register" that links identified use-case risks to potential mitigations. The training course is now available for purchase.
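As a rough illustration of what a risk register linking identified risks to mitigations might look like in practice, here is a minimal sketch. The field names and example entry are hypothetical and are not drawn from the standard's templates.

    from dataclasses import dataclass, field

    # Hypothetical, minimal risk-register entry: links an identified use-case
    # risk to candidate mitigations and a monitoring owner. Field names are
    # illustrative and do not come from the IEEE 3119 templates.
    @dataclass
    class RiskEntry:
        risk_id: str
        description: str
        affected_groups: list[str]
        severity: str  # e.g. "low", "medium", "high"
        mitigations: list[str] = field(default_factory=list)
        monitoring_owner: str = "unassigned"

    register = [
        RiskEntry(
            risk_id="R-001",
            description="Screening model may score applicants unevenly across demographic groups",
            affected_groups=["job applicants"],
            severity="high",
            mitigations=["require vendor bias-testing report", "schedule periodic disparity audits"],
            monitoring_owner="procurement lead",
        ),
    ]

    # A simple check a purchasing team might run before contract signing:
    # every high-severity risk should have at least one recorded mitigation.
    unmitigated = [r.risk_id for r in register if r.severity == "high" and not r.mitigations]
    print("High-severity risks without mitigations:", unmitigated)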

These are still the early days of AI. Decision-makers do not yet have much experience purchasing AI for high-risk domains or mitigating those risks. The IEEE 3119-2025 standard aims to help agencies build and strengthen their AI risk-mitigation muscles.
