Hallucinations: Should data make laws? Yes, and here’s why

Raymond Blyd

As a society, we evolve and make new laws. Laws are based on opinions we’ve formed after processing new information. What is new information?

TL;DR: On Thursday, 23 February, we presented our vision of the future of law. And our dream is data.

Hallucinations

We’ll focus less on numbers this outing. The numbers aren’t pretty, so let’s not pile on. On the bright side, it seems we’ve hit bottom. We calculated a recovery around February, a calculation we shared back in August 2022 with Law.com. Evidence in support: the US Federal Reserve raised interest rates by only 0.25 percentage points this February, after earlier hikes of 0.50 and 0.75. Don’t pop the champagne, though, because we are not economists. Worse, most economists are decidedly sober about what lies ahead.

Another bright spot we shared with Law360 in early 2023: an avalanche of A.I. legal apps focused on consumers, all running on OpenAI. Talk about building a house in your neighbor’s backyard. This follows two other insights we shared. First, OpenAI is an operating system powering a new paradigm of tools. Second, by their own admission, OpenAI, and most apps built on top of it, will struggle to turn a profit. Remember, ChatGPT is a killer app built by a non-profit receiving substantial backing from those with foresight. Realize that we lack these ingredients in Europe and most other places, except perhaps China.

Yet working on something as fuzzy as “artificial intelligence” feels delirious. Note that OpenAI’s training-data cut-off is 2021; you can ask ChatGPT to check this fact itself. So whenever GPT is faced with something after 2021, it will hallucinate. According to Wikipedia, an artificial hallucination is “a confident response by an AI that does not seem to be justified by its training data”. This brings us to the point of this analysis: can we distinguish fact from fiction, and will we care? Welcome to a world of weirdness, which is our reality.

Euphoria

Now, humans fake it until they make it all the time. Heck, the entire venture capital ecosystem is predicated on one principle: confidently project growth without proof of product-market fit. We closed 2022 with $496 billion in capital committed to 696 funds looking to invest. This boils down to the craziest stat we’ve ever calculated: $1.3 billion a day. That is how much is available for those looking for cash. Who’s getting a check? Here is a calculated answer. Contrast this with the reality that fewer tech companies are being founded, and the ones that already exist will struggle to grow.
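(A back-of-the-envelope check, assuming the committed capital could be deployed evenly across a calendar year: $496 billion ÷ 365 days ≈ $1.36 billion per day.)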

Now that we’ve established that neither computers nor capitalists need facts to make decisions, surely the rest of us do, right? Yes, but we rely on computers that are funded by capitalists. Since we’re banning TikTok everywhere in the West, it’s obvious we don’t trust computers from communists. While spying is often offered as the reason, the underlying fear is seldom articulated: influence. With Brexit and Cambridge Analytica still fresh in our minds, it’s clear what guides our decisions: algorithms. In “A.I. the struggle”, the mini-documentary shows the stats and stories that illustrate the challenge. The dollars behind the data we need for our decisions are too overwhelming to dampen.

Surely the law and lawyers can stop it, right? Actually, legislators created this problem in the first place, and lawyers are ill-equipped to handle any of it. Currently before the US Supreme Court are a few cases around a law known as Section 230, which shields those who create algorithms from liability for their impact. Pretty sure the lawyers confidently presented a legal argument, with no legal precedent, to sway the Supreme Court. The exchanges that followed revealed that those arguments weren’t convincing. As Justice Kagan admitted about the Supreme Court: “These are not like the nine greatest experts on the internet.”

Dreams

It’s likely that the courts will kick these cases back to Congress to legislate. Herein lies the paradox: neither legislation nor litigation solves these problems. In “A.I. the struggle”, we pitted the Video Assistant Referee (VAR) against Goal-line Technology. If you watch football every weekend, you know the drama with VAR. The VAR is nowhere near as consistent as Goal-line Technology. Ultimately, that is the dream: rational, data-driven legal decisions. More than a copilot, much like an autopilot. Call it Autolaw. Are we there yet? We promise we’ll let you know when we are.

Meanwhile, catch this clip, which we made six years ago to illustrate this legal industry journey. And promise us you’ll think about the path our society is on.
