RAISING AI

Software engineers have become parents, teaching and training their algorithms on stimuli of their choosing.


In collaboration with our friends at FNDR, we are excited to share FNDR’s Corner – a weekly series of business and cultural provocations that encourages everybody to think like a Founder.

In the last twenty years, AI has gone from silver-screen pipe dream to an unseen, misunderstood, and mundane part of everyday life. Dreams of Jarvis and HAL have turned into the reality of TikTok's recommendation engine and the iPhone's Portrait mode. It's everywhere, and as with most software breakthroughs, most end users understand some of what it does but little of how it works. This puts software engineers in a parental role, teaching and training their algorithms on stimuli of their choosing. With recent, and arguably inevitable, incidents like Twitter's racially biased photo cropping or Microsoft's chatbot turned Neo-Nazi, it's becoming increasingly apparent that the black-box approach to training machine learning algorithms is flawed. We need to do a better job raising AI.

AI is arguably humanity’s first child. Though humans have been raising children for millennia, this is the first time humanity has been faced with the task of raising something we’re all responsible for. It’s important to recognize the gravity of the situation. We are a teenage mom trying to raise a god. Pandora’s box is open, and we can no longer put Siri back in the bottle.

There’s already a cultural conversation occurring around developmental robotics, the idea that we can raise our robots from scratch, like infants. After all, humans have far more experience parenting children than we do training machine learning models. With access to GPT-3 being handed out to engineers across the globe, how do we take what we’ve learned from raising children and apply it to raising AI?

It took massive breakthroughs in STEM to create machine learning; now it’s going to take massive breakthroughs in ethics, philosophy, and humanity to raise it responsibly. Our AIs are reflections of the society that created them. Often, when an AI acts out, it is parroting the conscious or unconscious biases of its creators or learning environment. Like a child repeating an overheard curse word in front of its parents, AI can shine an ugly light on the shortcomings of the people who created it. When that same AI problem child profiles criminals, pilots drones, and runs economies, accidental influence has catastrophic consequences. It isn’t enough to raise AI that is human; it must be better.

FNDR works with the Founders of the world’s most transformative companies, bringing voice to Founders’ vision and defining culturally relevant, sustainable businesses. They are in direct conversation every week with the leaders who are building the next generation of business. They are fascinated by the shared themes and challenges seen across categories, and what it takes to lead a company intentionally.