State Bar Exam Plagued by Problems, Claims of Questions From AI

(TNS) — California's first bar exam to be prepared by a private contractor accomplished one of its main goals in February: lowering financial costs for the deficit-plagued state bar. And its passage rate of 55.9% was the highest for a February exam in many years, compared with rates in the low to mid-30s earlier in the decade.
But the performance reviews have not been favorable. Computer programs frequently stopped and started while transmitting questions. Some of the questions were mistakenly imported from the "Baby Bar," a lower-level exam given to first-year law students. Proctors assigned to monitor exam sites were reported to have interrupted test-takers. And there was evidence that some questions on the exam were the products of artificial intelligence.
"Everything went wrong. Virtually everyone had an interruption," said Mary Basick, an assistant law school dean at UC Irvine. "It may have hurt people who should have passed," and the results, she said, may not be reliable.
"We saw in February the disaster of having (an organization) that's never done it before draft bar questions," said Erwin Chemerinsky, the law school dean at UC Berkeley, one of 15 law deans who had urged the State Bar of California to postpone reassignment of the exam to a private contractor. "My hope is that the California Supreme Court and the bar have learned from it."
The bar, which licenses nearly 200,000 practicing lawyers in California, says it will resume using questions from the National Conference of Bar Examiners, used by more than 40 states, for its next exam in July. Most July examinees have just graduated from law school, and the exam generally has at least twice as many participants as the February test, whose takers include many who have failed previous exams.
The bar has not said whether next February's exam will resume using questions from Kaplan Exam Services, which signed a five-year, $8.25 million contract with the state last year.
Bar officials say the contract would enable them to erase an $8 million deficit in their lawyer admission fund, which pays for the exam.
The 200 multiple-choice questions involve issues of civil and criminal law. The exam also includes essay questions on legal topics submitted and graded by the State Bar, along with a "performance test" on a lawyer's task such as writing a case brief.
The state Supreme Court, which oversees the bar, has ordered the legal organization's Committee of Bar Examiners to review all questions for future exams and conduct "a cost-benefit analysis for any proposed changes to the bar exam."
That's not enough, said Chemerinsky, who advocates a permanent return to the exam used in most other states. "It takes tremendous expertise to successfully draft bar questions," he said, and the national format enables successful test-takers to practice law elsewhere, while still allowing California and other states to add their own questions based on state laws.
Rules for the national exam require it to be taken in person in large auditoriums. Under the Kaplan contract, applicants had the option of taking the exam remotely in small test centers or from home, with monitoring by a bar contractor.
Basick, of UC Irvine, said some of the promised test centers were unavailable and others were chaotic, with some test-takers shouting that their screens had frozen while others nearby were trying to answer the same questions.
Some at-home test-takers reported having to stay up until 1 a.m. because they had been removed several times from the platform. Others said they could no longer see the exam question while writing the answer, and there were even reports of exam screens flipping sideways by 90 degrees, Basick said.
And she said test-takers reported that some of the questions seemed oddly drafted and did not fit well with any of the multiple-choice answers. The explanation may have emerged in April, Basick said, when the bar disclosed that some questions on the exam were the products of artificial intelligence.
Russell Schaffer, a spokesperson for Kaplan, said Friday that the company was not responsible for the questions or technology that caused problems in the February exam, "and we did not use AI in any of the content we drafted."
"We support additional oversight measures and remain confident in the quality of the exam questions our experienced legal experts will provide for future administrations," Schaffer said.
Shortly after the exam, the State Bar issued an apology to test-takers "for the challenges many applicants faced," and said it recognized "the need to communicate more clearly with applicants and stakeholders." The bar said those who failed the February exam would be allowed to take it in July without paying a fee.
On Friday, Brandon Stallings, chair of the bar's Board of Trustees, said it would continue to work with the state Supreme Court and the Committee of Bar Examiners, which administers and grades the test, "toward an accessible, affordable, and fair bar exam."
© 2025 the San Francisco Chronicle. Distributed by Tribune Content Agency, LLC.