Their responses were part of the virtual CDT Vendor Forum on Oct. 5, which connected nearly 300 attendees with CDT’s director and top technology and procurement officials. Among the takeaways:
- The state uses a “challenge-based procurement approach where we are open to alternative ideas, innovative solutions, even open source technologies,” said Marlon Paulo, CDT’s deputy director of statewide technology procurement. His comment was in response to a question about whether California wants to create “open source solutions that can be easily enhanced and shared,” and how it might include a requirement that projects use open source technology.
- “If vendors have the solution to solve the problem that the state is facing and that involves open technology, we’re more than welcome to recognize those and evaluate those,” Paulo said. The state has set policy on open source and sharing data “where applicable,” said Manveer Bola, state deputy chief technology innovation officer. It maintains a code repository where state agencies may share open source code, most recently making COVID-19 modeling application code available via open source.
- The state is still using its pre-qualified vendor pool but wants to refresh it, Paulo said in response to a question on whether a state goal is still enabling smaller companies to participate in projects and how it will create the conditions necessary for them to win bids. The state, Paulo said, intends to seek feedback on how it can improve the pool of pre-qualified vendors. “Anticipate Q1 of 2021 that we are going through the refresh. And there’s also some interest to expand the pre-qualified vendor pool,” he said.
- CDT “absolutely” intends to include vendor input in the strategic plan it is now developing, said Director and state Chief Information Officer Amy Tong, noting that this is part of the engagement process. That includes “the various stakeholders”: city and state governments, the vendor community, legislative staffers and, of course, state departments. “Everybody’s going to get an opportunity to provide input,” Tong said.
- The state has done “a lot of work in analytics” and it’s using many more data visualization tools, Bola said, responding to a question seeking thoughts on the advent and use of predictive analytics in state government and the types of projects underway or planned, and challenges. “Our current governor is very big on dashboarding and boiling down big data into meaningful charts that can be used to … tell the story behind it but also to support decision-making. So, we’ve been implementing a lot of different tools and we have a very diverse strategy,” Bola said, highlighting the state’s use of Tableau visuals.
Pam Haase, CDT’s chief of statewide technology policy, addressed challenges, saying it’s crucial to examine the biases behind data models to “make sure that you’re not presenting the information in a skewed way.” That, she said, is what officials hope to address in the upcoming statewide artificial intelligence (AI) initiative: creating guidelines to ensure data is represented correctly.
- The California Department of Motor Vehicles is likely the state department most involved in adopting robotic process automation (RPA), Haase said in response to a question on whether CDT is involved in some of the ongoing RPA initiatives. DMV is “looking at it to address a lot of their internal processes,” Haase said, citing its use on motor carrier permits as well as “occupational licensing and accident reporting.” “So, it’s like their back-end office. And it’s really helping automate their data collection and processing,” she said, affirming that CDT is engaged as DMV leads the internal project effort. DMV is also helping drive the statewide AI initiative, Haase said.
- Back-end challenges agencies may face with data migration, “especially with COVID,” Bola said, could include insufficient data governance: identifying and classifying data and understanding its ownership. They could also include not having “a good handle” on data suppression requirements and other tools for making data more shareable and available. Another major limitation officials identified involves skill sets, specifically in data engineering. There’s a “huge, huge value” in implementing a data pipeline and ETL (extract, transform and load) tools, Bola said, but many departments don’t have those solutions available and lack the skill sets to build them out, apply logic to clean the data and make it available.
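To illustrate the kind of ETL logic Bola describes, here is a minimal sketch of an extract-transform-load step. The field names and data are hypothetical, invented for illustration; this is not code from any state system.

```python
import csv
import io
import sqlite3

# Hypothetical raw export with inconsistent formatting and a blank
# field (illustrative data only, not from any state system).
RAW = """department,cases,report_date
Public Health,1200,2020-10-05
Public Health ,,2020-10-06
DMV,340,2020-10-05
"""

def extract(text):
    """Extract: parse the raw CSV into a list of dicts."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Transform: trim stray whitespace, type the counts,
    and drop records that are missing a case count."""
    cleaned = []
    for row in rows:
        if not row["cases"]:
            continue  # skip incomplete records
        cleaned.append({
            "department": row["department"].strip(),
            "cases": int(row["cases"]),
            "report_date": row["report_date"],
        })
    return cleaned

def load(rows):
    """Load: write the cleaned rows into a queryable store."""
    db = sqlite3.connect(":memory:")
    db.execute(
        "CREATE TABLE reports (department TEXT, cases INTEGER, report_date TEXT)"
    )
    db.executemany(
        "INSERT INTO reports VALUES (:department, :cases, :report_date)", rows
    )
    return db

db = load(transform(extract(RAW)))
print(db.execute("SELECT department, cases FROM reports").fetchall())
# → [('Public Health', 1200), ('DMV', 340)]
```

Real department pipelines would pull from production databases and apply far richer cleaning and suppression rules, but the three-stage shape — pull raw data out, apply cleaning logic, land it somewhere shareable — is the pattern the ETL tools Bola mentions automate.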