Australia May Use AI to Assess Tenders and Expressions of Interest


EOIs and Tenders by Local Governments

A local government (LG) in Western Australia will frequently procure goods and services by way of an expression of interest (EOI) or a request for tender (RFT). 

An EOI or RFT may also form part of an LG’s processes to dispose of land, an interest in land (e.g. a lease), or to develop land.

RFTs and EOIs invariably contain evaluation criteria against which submissions are to be assessed. Proper application of the assessment criteria is very important, but it can be time-consuming and technically complicated, and it requires careful analysis and informed judgment.

There is no single source that codifies the legal obligations that apply to LGs when managing EOIs and RFTs. But when undertaking procurement processes, LGs do have obligations, including:

  • To adhere to procedural fairness;
  • To ensure the process is free of bias and unfair advantage;
  • To avoid and manage conflicts of interest;
  • To maintain confidentiality;
  • Not to engage in misleading or deceptive conduct;
  • To make legal, rational, and fair decisions;
  • To comply with their own terms and conditions (as applicable to the LG);
  • To comply with express legislative requirements, e.g. the Local Government (Functions and General) Regulations 1996 (WA) and privacy laws relating to automated decision-making (where applicable).

Artificial intelligence (AI) can assist with the EOI and RFT assessment process. But it cannot replace the need for human decision making.

How AI Can Assist

AI refers to technologies that enable computer systems to simulate aspects of human learning, comprehension, problem solving, decision making, and creativity. Generative AI tools do this by making use of a large language model to process natural language (as distinct from structured data, such as spreadsheets or financial statements).

Generative AI tools can assist by sorting and summarising large volumes of information, generating tables, or undertaking the preliminary assessment of EOIs and RFTs. AI can be utilised to identify gaps or inconsistencies and to highlight issues that may require further clarification with a bidder. 

Used appropriately, AI can be a valuable tool.
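As a purely illustrative sketch of the kind of preliminary screening described above, the following checks a bidder's submission for missing or empty responses against the stated evaluation criteria and flags them for human follow-up. The criterion names and the sample submission are hypothetical, not drawn from any real RFT, and the function deliberately makes no judgment about quality:

```python
# Hypothetical pre-screen: flag criteria that a bidder has left
# unanswered, so a human assessor can seek clarification.
# Criterion names and the sample bid are illustrative only.

CRITERIA = ["financial capacity", "relevant experience", "methodology", "price"]

def flag_gaps(submission: dict[str, str]) -> list[str]:
    """Return stated criteria that are missing or empty in a submission.

    Flagged items are issues for a human assessor to clarify with the
    bidder -- no assessment of quality is made here.
    """
    return [c for c in CRITERIA if not submission.get(c, "").strip()]

bid = {
    "financial capacity": "Audited statements attached.",
    "relevant experience": "",   # left blank by the bidder
    "price": "$1.2m fixed",      # "methodology" not addressed at all
}
print(flag_gaps(bid))  # → ['relevant experience', 'methodology']
```

A rules-based check like this stays strictly within the stated evaluation criteria; a generative AI tool performing the same triage would need the same constraint made explicit in its instructions.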

Practical and Legal Risks

AI use carries risks. 

These include:

  • Incorrect assessments;
  • Assessment by reference to unstated evaluation criteria;
  • Inadvertent disclosure of confidential information;
  • Abrogation of responsibility; and
  • The risk that AI-generated material is given undue weight. 

AI outputs may be inaccurate, biased, outdated, or entirely fabricated. 

Bidders may “curate” their responses to be assessed favourably by an AI, e.g., to make their financial capacity or relevant experience appear greater than it is.

Reliance on AI-generated outputs, particularly without adequate human oversight, can undermine the integrity of the assessment process and lead to difficulty in explaining or defending the final decision.

These issues may give rise to claims by unsuccessful bidders, including:

  • Claims for injunctive relief;
  • Allegations of misleading or deceptive conduct;
  • Applications for judicial review on the basis that the decision is one that no reasonable government decision maker could have made;
  • Claims based on a perceived lack of procedural fairness; and
  • Breach of express legislative requirements, contractual terms and conditions, or terms implied by law into the tender process.

What Needs to Be Done

If AI is used to assist with EOI and RFT assessments, LGs must be able to demonstrate that:

  • The assessment was undertaken strictly by reference to the stated evaluation criteria; and
  • Suitably qualified humans were entirely responsible for the assessment outcome.

Decision makers must understand the extent to which AI has been used by their LG in the assessment process. In choosing a suitable AI tool for this task, consideration should be given to the tool’s ability to explain its reasoning and the factors that led to a particular output.

If a decision is challenged, the reasoning must be capable of explanation independently of the output of any AI tools, and supported by records showing how the AI outputs informed, but did not determine, the outcome.
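One way to keep such records is a structured audit entry per criterion, capturing the AI output alongside the human assessor's independent score and reasons. The structure below is a hypothetical sketch of what an LG might record, not a prescribed form; all field names and the sample entry are assumptions for illustration:

```python
# Hypothetical audit record showing that AI output informed, but did
# not determine, the assessment outcome. Field names are illustrative.
from dataclasses import dataclass

@dataclass
class AssessmentRecord:
    bidder: str
    criterion: str
    ai_summary: str         # what the AI tool produced
    human_score: int        # score assigned by the human assessor
    human_reasons: str      # assessor's own reasoning, stated independently
    ai_adopted_as_is: bool = False  # whether the AI output was simply adopted

record = AssessmentRecord(
    bidder="Bidder A",
    criterion="relevant experience",
    ai_summary="AI noted two comparable projects in the last five years.",
    human_score=7,
    human_reasons="Verified both projects via referee checks; "
                  "scope comparable but contract values smaller.",
)
```

Keeping the human reasons as a separate field, rather than editing the AI summary in place, makes it easier to later demonstrate that the assessor's reasoning stands on its own.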

EOI and RFT Documents Should Address AI Use

Clear disclosure of AI use is critical to transparency, procedural fairness, and defensibility.

Your EOI and RFT terms and conditions should expressly address AI use. 

This should include:

  • Disclosure that AI tools may be used in the assessment process;
  • Disclosure of the associated risks;
  • A reservation of rights to disregard AI-generated outputs;
  • The bidder’s consent to the use of AI notwithstanding its risks and deficiencies;
  • An appropriate waiver of the right to bring claims arising from the use of AI; and
  • Confirmation that responsibility for the assessment remains with human decision makers.

When choosing a suitable AI tool, LGs may also wish to consider how the information they provide to the tool will be used and whether the provider of the AI tool acquires some rights in that information and any prompts (including whether they are permitted to use your data to train their models). This is more common in publicly available Generative AI tools. As with any software tool, consideration should be given to the terms of use and whether they are acceptable given the nature of the information that will be input into the system.

Where confidential information is involved, bidders should be required to acknowledge that the LG may not be able to verify how that information is handled once it is input into AI systems, including what is retained, reused, or used to train models. Put another way, the LG should not give assurances about how information input into AI will or will not be used (unless the LG is certain of the answer).


