Hallucinated cases
Some lawyers have already been caught using cases that did not exist and were made up by AI
More than 140 AI systems have already been deployed in Brazil's courts
People are already turning to chatbots for simple legal advice
These signals show that AI is not only already an issue in courts today, but that the trend is moving towards even deeper involvement. With this come a handful of risks, including hallucination and bias in the models themselves. Holborn and its judicial institutions become an industry no longer defined solely by knowledge and skill, but also by how well practitioners can leverage these AI tools.
This means that many of the big legal institutions in Holborn, such as the Inns of Court, will have to wrestle with how much AI they want in their workplaces.
1. The Trend Continues - old legal libraries become prompt-studying rooms, and the best firms become AI houses, winning on speed and data
2. The Trend Stops - AI is banned in the courtroom, but lawyers continue to use hallucinating models illegally
3. The Trend Mutates - lawyers themselves are replaced by AI systems that beat them in cases 100% of the time
The legal industry becomes increasingly reliant on AI technology, entrenching the biases in the models and bringing them into the courtroom. The enormous energy cost of these legal models also further harms the planet. Access to the best AI legal models is restricted to those with deep pockets. In the end, lawyers as a whole may see their jobs automated.
There is an AI lawyer crisis happening right now