Commentary: Advice for Rural Counties on Managing AI

“The genie is out of the bottle, and there is little time to waste before county organizations are thrust into a full reactionary mode of dealing with this unavoidable AI-driven local policy topic,” writes Steve Monaghan, Nevada County’s agency director of Information and General Services.

Steve Monaghan, the former longtime chief information officer of Nevada County who was recently named director of the county Information and General Services Agency, published the following essay this week about artificial intelligence. He wrote it for Rural County Representatives of California, a service organization that advocates for policies on behalf of rural counties. It originally appeared on RCRC’s website.

There has been a lot of media press recently about the impact new artificial intelligence (AI) tools such as ChatGPT will have on our organizations and workforce. The first thing to know is that you already have employees in your county organization using these tools today. If you have a teenager, they probably have been using it too. There is no putting the genie back in the bottle on this one. As such, it is important for county policymakers and leaders to understand the potential and concerns these new AI tools present to our organizations.

A QUICK EXPLANATION


AI has been with us and working hard in our county organizations for many years, performing out-of-sight tasks like protecting our email inbox from spam and malware. This is called traditional, predictive, or prescriptive AI and can perform a wide range of tasks such as analyzing data and making decisions based on that analysis. The new type of AI is called “generative AI,” which is designed to generate new content, such as images, music or text.

ChatGPT is a popular product that uses generative AI to create new content from a natural language request, for example: “Write a 10-point outline on the major factors limiting rural broadband deployment, include references for each item.” Our employees now have access to dozens of generative AI tools that can create images, build presentations, write articles, fix grammatical errors, perform language translations, and much, much more. One way to gauge a technology’s popularity is how long it takes to reach 100 million users: the mobile phone took 16 years, the Internet seven years, Facebook 4.5 years, and TikTok nine months. ChatGPT beat them all, reaching 100 million users in just two months.
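
For the technically curious, here is a minimal sketch of how a county developer might send that same broadband-outline request to a generative AI service programmatically rather than through the chat interface. It assumes the OpenAI Python SDK, an API key stored in an environment variable, and an illustrative model name; the provider and model shown are examples, not an endorsement.

    # Minimal sketch: sending a natural-language request to a generative AI service.
    # Assumes the OpenAI Python SDK is installed and OPENAI_API_KEY is set in the
    # environment; the model name below is illustrative only.
    from openai import OpenAI

    client = OpenAI()  # reads the API key from the environment

    response = client.chat.completions.create(
        model="gpt-4o-mini",  # example model; substitute whatever your provider offers
        messages=[{
            "role": "user",
            "content": "Write a 10-point outline on the major factors limiting "
                       "rural broadband deployment, include references for each item.",
        }],
    )

    print(response.choices[0].message.content)  # the generated outline, as plain text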

5 WAYS TO LEVERAGE AI


Generative AI products like ChatGPT have the potential to transform the way local governments interact with their citizens. By using natural language processing and machine learning, AI can provide quick and personalized responses to citizen inquiries, freeing up government resources, reducing wait times, and improving overall citizen satisfaction. Here are five uses for generative AI that counties should consider:
  • Citizen Services: to provide a wide range of citizen services, such as answering common questions about government services, helping citizens navigate complex bureaucratic processes, and providing personalized recommendations based on individual needs.
  • Emergency Response: to provide quick and accurate information during emergency situations, such as natural disasters or public safety incidents. By using real-time data feeds, AI can provide citizens with up-to-date information and personalized recommendations on how to stay safe.
  • Planning and Development: to help citizens navigate planning and development processes, such as zoning regulations and building permits.
  • Social Services: to help citizens navigate social services, such as welfare programs and housing assistance.
  • Public Safety: to provide citizens with information on public safety issues, such as crime prevention tips and emergency response procedures.

AI can also help our employees be more productive, empowering them to take on higher-end tasks than they could manage on their own, or to complete their existing work in a fraction of the time:
  • Write press releases
  • Create graphics
  • Perform research
  • Generate interview questions
  • Write code
  • Create presentations
  • Perform complex data analysis

Increasing and improving citizen services and enhancing employee productivity will benefit all our rural counties. However, as with any new technology, there is a negative side: these tools can be misused to cause harm or to take advantage of others. Criminals are already exploiting AI for activities such as cyber attacks and more realistic phishing emails. They can even use it to create “deepfakes” that mimic a real person’s voice and image.

5 POLICY CONCERNS FOR RURAL COUNTIES


1. Data Privacy: Any generative AI product that interacts with citizens will need to collect data in order to function. It is essential that this data is collected and stored in a way that respects citizens’ privacy rights. Local governments must ensure that the data collected is used only for the intended purpose and not shared with third parties without consent.

2. Bias: Chatbots may perpetuate bias and discrimination in decision-making processes, based on factors such as race, gender, or socioeconomic status. Counties must ensure that chatbots are trained with unbiased and diverse data sets and regularly monitored for potential bias.

3. Accountability: Chatbots may make decisions that have significant impacts on citizens, such as denial of social services or emergency response. Counties must ensure that chatbots are transparent, accountable, and subject to oversight and audit.

4. Security: The use of generative AI products can present security risks, particularly if they are handling sensitive data. Counties must ensure that these tools are designed with security in mind and have appropriate measures in place to protect against cyber attacks.

5. Ethical Considerations: Counties must ensure that the use of generative AI products is aligned with their ethical principles and values, such as respect for human rights, non-discrimination, and the promotion of the common good.

Counties have many sensitive decision-making areas where they should think twice before ever implementing AI, such as in child protective services, behavioral health, treasury investments, hearing determinations, and more. Luckily, to date, AI is not quite sophisticated enough for these applications, but maybe someday soon.

The White House recently released what it calls the Blueprint for an AI Bill of Rights, which “is a set of five principles and associated practices to help guide the design, use, and deployment of automated systems to protect the rights of the American public in the age of artificial intelligence.”

There will be much AI policymaking taking place at the federal, state and local levels in the near future. County policymakers need to start thinking about the implications AI will have on their organizations and constituents, both positive and negative. The genie is out of the bottle, and there is little time to waste before county organizations are thrust into a full reactionary mode of dealing with this unavoidable AI-driven local policy topic.

Steve Monaghan was Nevada County’s chief information officer for almost 23 years before being named director of the county’s Information and General Services Agency. He is also the Nevada County Emergency Services chief and the county purchasing agent. Monaghan is a member and past president of the California County Information Services Directors Association (CCISDA), through which he created and helps lead training programs for current and emerging leaders. Monaghan also serves on the Rural County Representatives of California’s Broadband Advisory Committee and on the Cybersecurity Program Advisory Board at California State University, Chico, where he received his bachelor’s degree in computer science.