
On California's state bar exams, more questions than answers

Thousands of people took the new California bar exam in February to join the state's 195,000 lawyers.

But a series of missteps by the agency responsible for licensing lawyers has turned the start of thousands of new legal careers into a frustrating ordeal.

First, the testing software malfunctioned during the exam. Test takers had trouble logging in, and the software frequently crashed or lacked key features such as copy and paste, preventing many from completing their exams. The State Bar of California, which administers the exam, has had to consider score adjustments and other remedies for examinees.

Then came news that at least some of the multiple-choice questions had been drafted with the help of artificial intelligence. That was not shocking to many test takers; they had already suspected AI was involved, because several questions, they said, struck them with their odd wording or legal nonsense.

And now, future lawyers in California may have to wait even longer to find out whether they passed.

The State Bar said that, because of the questions about the exam, it will need more time to obtain approval from the California Supreme Court to adjust test scores. Results for the February exam were scheduled to be released on Friday but are likely to be postponed.

“I just want an opportunity to be a lawyer,” Edward Brickell, a 32-year-old graduate of Southwestern Law School in Los Angeles, said in an interview. “And it feels like there's another thing coming out every week and saying, 'We haven't given you a fair chance.'”

Mr. Brickell and others who took the exam have flooded Reddit and other social media sites with horror stories and plans to organize protests and demand accountability. A few examinees used the public comment period to express their dissatisfaction and frustration at a meeting of the state's Committee of Bar Examiners on Tuesday.

“You are the institution that determines whether we are able to make a living,” one test taker, Dan Molina, told the committee at a virtual meeting. “Finances are being destroyed. Lives are being destroyed, and will continue to be destroyed.”

California's bar exam has long been considered one of the most difficult in the country. In recent years, the passing threshold has been lowered.

In October, the State Bar received approval from the California Supreme Court to introduce a redesigned exam, with questions from a new test provider and an option to take the exam remotely. The State Bar made the changes to save money.

The State Bar had previously used exams developed by the National Conference of Bar Examiners, the organization behind the exam used by most states, which is considered the gold standard in the field. The NCBE does not allow remote testing.

Test takers in California were told that the new exam would not require any substantially different preparation, so many prepared the same way they would have for the NCBE version.

In November, the State Bar conducted an experimental exam meant to serve as a trial run. Those who took it reported technical difficulties. Then the study guide published by the new test provider, Kaplan, turned out to be riddled with errors. The guide was quietly corrected and reposted in the weeks leading up to the February exam.

Kaplan declined to comment.

In a sign that the State Bar expected difficulties, it offered the more than 5,000 registered examinees the option to postpone the exam to the next test date, in July.

After the February exam, the State Bar acknowledged widespread technical failures.

“We know and have said that these issues are, and continue to be, for those still being tested, unacceptable in terms of their scope and severity,” the State Bar of California said in a statement. “We apologize again, and we have no excuses for the failures that happened.”

The State Bar added that it would assess whether Meazure Learning, the vendor that provided technical and proctoring services for the exam, failed to meet its contractual obligations. It also said it would engage a psychometrician, an expert in measuring intangible qualities such as knowledge or intelligence, to propose score adjustments for test takers who faced difficulties.

The State Bar's proposed test score adjustment was announced last week. The proposal significantly lowers the original passing score.

The proposal, which requires approval by the state Supreme Court, was filed on Tuesday, three days before the results were to be published. Given the late filing, the State Bar told examinees that the release of test results could be delayed, extending the agonizing uncertainty for many.

Buried deep in the announcement about the scoring adjustments was a new development: some multiple-choice test questions were developed not by Kaplan but by ACS Ventures, the State Bar's psychometrics provider, with the help of artificial intelligence.

ACS Ventures did not respond to a request for comment.

The State Bar said that the Committee of Bar Examiners, the body that oversees the exam, had not been made aware of the use of AI. The committee had been directed by the state Supreme Court last year to explore changes to the exam, including the potential use of AI.

“But the court has not yet endorsed the broader use of AI,” Alex Chan, the chair of the Committee of Bar Examiners, said in a statement. “While AI may eventually play a role in the future of exam development, absent specific judicial guidance, the committee has not considered or approved its use to date.”

The Supreme Court said it had not been aware that the technology was used to develop the exam, and it called for an investigation.

The State Bar has not disclosed details about how ACS Ventures used the technology to help develop the test questions.

For Mr. Brickell and others, the disclosure of AI use seemed to explain some of their confusion. He and other test takers said some of the questions did not read as if they had been drafted by humans, and some appeared to list only wrong multiple-choice answers.

Ceren Aytekin, an aspiring entertainment lawyer, said she too had noticed peculiarities in some questions, but she initially refused to believe that AI had been used.

“I initially thought, ‘Maybe I’m wrong,’” Ms. Aytekin said. “Maybe I shouldn’t place blame on an organization that would never do this to its candidates,” she added. “All the issues I found make sense now that we know AI was involved. We just didn’t want to believe it.”

Two other large state bar associations, in New York and Illinois, said they had never used AI to develop test questions. The NCBE, which prepares exams for New York, Illinois and most other states, said it had never used AI for this purpose.

Using AI to develop test questions is not inherently a problem, said April Dawson, associate dean of the Technology Law and Policy Center at North Carolina Central University School of Law. The problem, she said, is that it was done without transparency.

“For a licensing agency to engage in this kind of irresponsible behavior is really troubling,” she said.

If he failed, Mr. Brickell could retake the exam in July; those who failed the February exam will be able to sit for it free of charge. The State Bar said it would not use any AI-drafted questions on the July exam.

But if he has to retake the exam in July, Mr. Brickell said, he might take the bar in another state instead.

“I don’t want to pay my attorney dues to them for the rest of my life,” Mr. Brickell said of the State Bar of California. “It hurts a lot.”
