There is so much going on. But none of it matters if we lose sight of securing the safety of the most vulnerable: our children.
Lately, Legalcomplex has had some successes that we would love to share. But they matter less than our mission: helping all legal technology succeed in creating a safer society for everyone. This includes making our world safe for children. We generally refrain from addressing social justice issues. We did not analyze Ukraine, George Floyd, or any of the many ESG challenges. However, we make one exception: children. Our first effort was LawKit, a legal framework to unlock smartphones inspired by Amber Alert. The second was Privacee, a video format to help children understand privacy in this complicated digital world.
This third analysis pales in comparison to the heartbreak of mass shootings in which children kill children. Some may see mass shootings as a typically American problem, but one also happened in Alphen aan den Rijn in 2011. I worked in Alphen at the time, and it shocked us all. The parallel is that both incidents involved young people with known mental health issues legally acquiring assault rifles.
This brings us to what we can do to prevent gun violence. Obviously, this is a complicated matter without any easy fixes. One partial fix is background screening. Here is a quick reminder of our current reality: we are actively screened on every aspect of our lives to determine purchase intent. Our actions are tracked, with or without consent, through smart devices. This data is used for commercial purposes because that is the most lucrative business model. The question: is there a business model to save children from gun violence?
This is a very insensitive formulation of the problem, yet it illustrates the solution. Legalcomplex also captures FinTech and SmartTech companies along with LegalTech, RiskTech, and CivicTech. This provides a broader perspective on how technology impacts the law. While tracking and tagging investments across these spaces, we noticed something. The companies that build SmartTech to process data are third overall in total capital raised. After Payments and Credit technology, processing data is the most lucrative problem to solve.
In a weird way, the answer is yes, there is a model, albeit a morbid one. If the majority of data processing serves to find customers with purchase intent, then let's agree it does not help if that purchase intent is to kill. We want more customers, not fewer. This natural progression toward sustainability is what drives industries like electric vehicles and ESG. One of the 21 examples, linked in the Spark Max pdf, is screening your Tinder date. As Uber and Airbnb quickly realized, it hurts business if a service is not safe for customers. That's why both deploy screening tech.
Let's reiterate this point: this is not an easy fix. So far, financial security has driven most successful legal technology advances. However, most data analytics solutions aim to save the economy, not society. They were built to protect companies, not citizens. Worse, when the government does the screening, the results can be pure evil. Clearview AI is a facial recognition company that sells mostly to law enforcement, as seen in the AI documentary. On May 23, 2022, the UK government fined it almost $10 million for privacy violations.
On May 30, 2022, the Dutch government acknowledged that it is institutionally racist. Since 2001, the Dutch IRS had used racial profiling technology to find fraud. What may surprise outsiders is that there will be no civil or criminal prosecution, since everyone operated according to the law and was backed by the courts. Ironically, this tragedy is known as the 'Childcare Benefits Scandal'.
Our lesson is that legal tech can be unconstitutional, even if it is based on legitimate legislation. Our message is that legal data analytics is designed to take care of our cash, not our kids. Our hope is that we build better screening tech to protect our future: our children.