Cal Fire rolled out an AI chatbot. Don’t ask it about evacuation orders

In summary

The bot falters on some simple fire-related questions. Cal Fire says fixes are in the works.

Under Governor Gavin Newsom’s 2023 executive order, California state agencies are embracing generative AI technologies in an effort to make government more efficient. The California Department of Forestry and Fire Protection, the lead agency coordinating the state’s wildfire response, is among the first to launch a chatbot.

According to a May announcement from Newsom’s office, the chatbot will improve Californians’ access to vital fire prevention materials and near-real-time emergency information. However, CalMatters found that it cannot notify users of evacuation orders, does not reliably deliver basic information such as an evacuation supply list, and does not accurately report how contained a particular wildfire is.

In the coming months and years, Newsom plans to deploy AI tools for customer service, housing and traffic. The problems with Cal Fire’s chatbot, however, raise questions about whether agencies are following best practices.

Evaluation is not an afterthought, said Stanford University law professor Daniel Ho, whose work focuses on government use of AI; it should be a standard expectation whenever a system like this is piloted and deployed.

The chatbot generates responses from the Cal Fire website and ReadyForWildfire.org. Users can ask about the agency, its programs, fire safety advice and current wildfires. It was built by Citibot, a South Carolina-based company that supplies AI-powered chatbots to local government agencies nationwide. According to purchasing records, Cal Fire plans to host the tool through at least 2027.

The goal, said Isaac Sanchez, the agency’s deputy chief of communications, was to give the public a better understanding of Cal Fire.

Cal Fire’s bot gave accurate answers when CalMatters asked about fires burning at the time and basic facts about the agency. But CalMatters found that the chatbot can give different answers when a question is significantly rephrased, even though its meaning stays the same.

One important way Californians can prepare for fire season is to assemble a bag of emergency supplies in case they need to evacuate. Asking the Cal Fire chatbot “What should my evacuation kit contain?” returned a specific list of items. But variations of the question asking about a “go bag,” a “wildfire ready kit” or a “fire preparedness kit” returned either a message saying “I’m not sure about the specific items you should have,” along with a link to the wildfire site, or a prompt to visit Cal Fire’s Ready for Wildfire site, which contains that information. The page the chatbot cited uses two of those phrases.


The chatbot did not fabricate answers to any of CalMatters’ questions, but it does not always pull the most recent data.

Asked about the Ranch Fire, a 4,293-acre fire in San Bernardino County, the chatbot said its most recent information, from June 10, showed the fire was 50% contained. That information was six days out of date when CalMatters asked; by then, the fire was 85% contained.

Similarly, the chatbot said the agency had no current job openings, even though a search of the state’s employment website showed two open Cal Fire positions at the time.

Mila Gascó-Hernández, research director at the University at Albany’s Center for Technology in Government, has studied government agencies’ use of AI-powered chatbots. She evaluates them on two main criteria: how accurately they deliver information and how consistently they answer the same question, even when it is phrased in different ways.

“You do need both accuracy and consistency in the answer if a fire is approaching and you need to know how to react to it,” she said. You’re not going to stop to think about how to politely phrase a question to the chatbot, she added.

For now, the chatbot cannot tell users about fire-related evacuation orders. When asked who issues evacuation orders, it sometimes correctly identified law enforcement, but at other times said it was unsure. Cal Fire’s Sanchez said it is reasonable to expect that the chatbot will be able to answer questions about evacuations.

If there are no evacuation orders for a particular fire, he said, the chatbot’s response should be that there don’t appear to be any evacuations related to that incident.

Before launching the chatbot, Sanchez and his team of roughly four people tested it with questions they expected the public to ask. Cal Fire is now refining the bot’s answers by reviewing user queries and making sure the chatbot surfaces the right response.

When CalMatters asked the bot “What can you help me with?” in early May, it replied, “Sorry, I don’t have the answer to that question right now,” and asked whether CalMatters had any questions about the content on the Cal Fire site. By mid-June, the response had been updated to say it could answer questions about the content on the page, including information about ongoing fires, CAL FIRE job classifications, exam requirements and CAL FIRE’s various programs.

“Be patient is the main message we want to convey,” Sanchez stated.

However, experts said that evaluating a chatbot should begin well before launch, starting as early as the procurement process.

The ideal process, Stanford’s Ho said, is to set performance standards for the chatbot before selecting a vendor, so there are clear benchmarks against which to measure the tool. Ideally, an independent third party would set those standards. The tool’s benefits and drawbacks should also be assessed before the chatbot is released to the public.

And in a best-case scenario, the public would be involved before launch, Albany’s Gascó-Hernández said. When considering chatbots, agencies should determine in advance what questions the public is likely to ask the AI tool, make sure those questions are representative of the population the agency expects to serve, and then refine the chatbot by having members of the public test the system to make sure it delivers the information they need.

That user engagement and user experience are very important so that citizens actually end up using the chatbot, she said.
