June 23, 2023

Entrepreneurship and Ethics in the Age of AI with Ahmed Reza

In a world where technology is advancing at an exponential pace, we can already see that artificial intelligence (AI) will have a profound impact on our lives.

But AI is far from perfect. Too often, we end up grappling with a variety of problems when we bring AI into the real world, from increasing mental health issues in young girls and boys to anxiety for workers whose jobs are changing. And while developers don’t set out to design AI technologies to have these unsavory effects, they happen anyway.

In this episode of This Anthro Life, we explore these issues and more with Ahmed Reza, a serial entrepreneur and self-professed AI geek who has founded multiple successful companies. With over a decade’s experience in the marketing and AI space, Ahmed shares his unique perspective on AI development, tech entrepreneurship, and more.

Show Highlights:

  • [05:53] Why Ahmed has been nicknamed “the intelligent man’s Forrest Gump”
  • [08:12] Ahmed’s experience as a child actor in Bangladesh
  • [12:45] How Ahmed came to be an entrepreneur
  • [23:00] What Yobi does
  • [28:38] Why we should be careful when developing AI
  • [34:46] What AI can ultimately enable us to do
  • [36:47] How AI can help us create a better world
  • [45:36] How we can incentivize leaders to use AI technologies for good
  • [54:26] The moral responsibility of AI developers
  • [58:52] The story behind the development of TrepHub
  • [01:05:32] Ahmed’s hopes for the future


This show is part of the Spreaker Prime Network. If you are interested in advertising on this podcast, contact us at https://www.spreaker.com/show/5168968/advertisement

Transcript

[00:00:00] Adam Gamwell: Hello and welcome to This Anthro Life, a podcast about the little things we do as people that shape the course of humanity. I'm your host, Adam Gamwell. Have you ever considered what it means to lead with humanity in the age of AI? As people alive today, we are constantly learning how to navigate an increasingly complex world. And in a world where technology is advancing at such an exponential pace, it can be easy to forget the importance of human connection in both our personal and professional lives. Now, while that sounds a little bit strange to say, think about it. Apps are sold to consumers as ways to find dates and partners, as ways to keep up with friends and families at a distance, or as ways to find new like-minded communities. Now indeed, these things do happen through these technologies, but as you likely know or have heard, society is dealing with a lot of unintended consequences of these intense infusions of technology, through things like increasing mental health issues in young girls and boys, and a combination of euphoria and anxiety for workers whose jobs are changing around automation and artificial intelligence. This represents the consumer side of the equation, we might say.

[00:01:13] On the podcast, we recently did an episode that looked at the concept of modularity in software development. And modularity is the process of breaking down complex development tasks into individual pieces with individual responsibilities. One of the challenges that we found in this episode is that a sense of responsibility for how people or end-users use the technology is often divorced from the developers' perspective who create the technology because they're just working on one small part of the overall puzzle. So this represents, we might say, the developer side of the equation.

[00:01:46] Now, nobody in their right mind would set out to design technologies that actively harm people, polarize us, or make us feel worse. So we could think about these as unintended consequences of technology development. I understand that sounds a bit crazy to say in this day and age, but hear me out. How can we also keep an eye on the bigger picture of what's happening at the business and societal levels? So in light of these challenges, it seems particularly important to bring in another part of the equation. That is, to peel back the curtain and also speak with the entrepreneurs and leaders who are responsible for creating artificial intelligence technologies and deploying them in the world.

[00:02:22] So we're tackling some big questions on today's episode: what role do entrepreneurs and business people play in the development and deployment of artificial intelligence technologies in society today? And how can we create responsibility and accountability when we have unintended consequences? Further, what does it look like to create technology leaders and leadership who don't simply over-index on STEM fields — that is, science, technology, engineering, and mathematics — but who also keep the well-being of humanity firmly at the center of their ethical compass? Now, I don't mean to hit on STEM fields. We need them obviously very much. But the other question here is: how do we consistently widen the aperture to not just think about technology as an engineering problem or aspects of humanity as something to, quote unquote, solve for? In other words, how do we understand what motivates founders, investors, and business leaders to keep humanity at the center of their stories, not just technology?

[00:03:17] So in this episode of This Anthro Life, we explore these questions and more with special guest Ahmed Reza. Ahmed is a seasoned entrepreneur and investor who has founded multiple successful companies. He has a background working for NASA, the Department of Defense, and various startups, where he honed his skills in AI and entrepreneurship. This led him to create Yobi, an app that uses natural language understanding to take advanced business applications and break them down into simple conversations and automations. 

[00:03:44] But beyond his diverse business experiences, Ahmed draws on his life experiences of growing up as a child actor in Bangladesh, immigrating to the United States, and opening his first shop in a mall while homeless in an effort to fund his computer science education at Cornell. All of these experiences highlight for him the importance of humility and of recognizing change as a part of existence, that is, not assuming that things will always be the same. And as we'll see, this is a perspective that is relevant to both entrepreneurship and AI. Such experiences also importantly play into Reza's philosophy when it comes to what it means to be a leader in the AI and tech industry today. He emphasizes the importance of identity beyond one's profession as a way to reflect on the impact of technology on human existence, asking ultimately: what kind of world do we want to leave for our children? Ahmed also emphasizes the need for entrepreneurship to contribute to society in a meaningful way and for regulations to deal with those who may not be thinking straight or who use AI as a tool for harm. This represents a shift in business thinking for many, and Reza sees this adaptive mindset as a necessary avenue for all of us, but especially tech leaders in this case, to reframe success as doing something great for humanity and building a meaningful existence that elevates the human experience, not simply increases material success or technological output.

[00:05:05] So I'm really excited to share this conversation with you. There's a ton here. We'll jump right into Ahmed's story after these messages from today's sponsor.

[00:05:17] Awesome. Ahmed, super excited to have you on the podcast today. Thanks for joining me here on This Anthro Life.

[00:05:22] Ahmed Reza: Hey, thank you so much for having me on, Adam. 

[00:05:25] Adam Gamwell: Right on. And today, you know, we're excited because we're going to kind of get into the lovely weeds here around, I think, AI and ethics and kind of see where the future might be taking us. And I think you're in a really great position with the companies that you've founded, set up, and run, as well as your really interesting background that I want to dig into; I'm curious how it's shaped your approach to the world today. So, you know, between the global perspective that you're bringing to the table, your business acumen, and the technology you've worked on, there are a lot of really good angles that we could explore.

[00:05:53] Something that struck me, though, when you were first put on my radar, is that you've been known as "the intelligent man's Forrest Gump." I'm not sure what that means, so I'm curious to hear a little bit about that idea. What does that mean to you? How did that nickname come about?

[00:06:08] Ahmed Reza: That was kind of interesting because the way I look at it, you know, just the way I frame life, is I feel like I've been one of the luckiest people alive. I've had a lot of different journeys in life. Early on, I had a career as a child actor in Bangladesh and also as a recording artist, and that was a very interesting worldview. When I was born, I'm dating myself, right, growing up in the late eighties in Bangladesh, watching famine, and actually looking at the anxieties from back then, then becoming an immigrant coming to the United States, being pretty destitute while I was here. I was on the streets of New York, dropped out of school, had to be selling books on the street, and then I was able to make it back into school. And that gave me a kind of determination and grit that, in retrospect, was much needed. There's nothing quite as motivating as not wanting to clean up poop for the rest of your life. And some folks have to do that, and, you know, I understand that in life, circumstances happen that are beyond your control. So I'm one of those people who — I look at life, I look at the journey. And if I told you about my life, you probably wouldn't believe it. And I feel very fortunate to have been in these different circumstances, the hard ones, the great ones. You know, I kind of stumbled into this position at NASA when I was also in another difficult spot in my life, and that turned out to be one of the best experiences of my life. And then I went on, you know, just being a tech geek. I've always been a geek at heart. I've always been tinkering, always been figuring things out, really excited about technology and the promise that it holds. So I think it was during one of my earlier podcasts that they dubbed me the intelligent man's Forrest Gump, in that my worldview is one where I fully appreciate the luck that I've had along with the hard work, along with, you know, the privilege that I've had as well.

[00:08:12] Adam Gamwell: That's incredible, too. And I mean, just to hear that story in terms of moving across the world, you know, dealing with issues of homelessness, the challenges that come with that, and building an entrepreneurial mindset out of that. But also, I mean, I'm curious, too: you were a child actor, you know? That's an interesting role as well, and I guess I'm just curious what sparked that idea. I'm curious how acting might come up throughout our conversation, if it's something that's still on your mind. But how did that happen?

[00:08:37] Ahmed Reza: So I took singing lessons as a child. My mom was very adamant about, you know, having a very structured upbringing. So singing lessons, art, and, you know, I had to excel in math. So I had a relatively privileged upbringing, being in probably the upper crust of Bangladeshi society. And it was through a singing gig on TV. And for me, as a little kid, that was just exciting, right? One of the producers spotted me and asked my mom if I'd like to act. And my mom was like, "Oh, he has no acting experience." Like, they're kids. They're fine.

[00:09:13] Adam Gamwell: They're fine, yeah. 

[00:09:14] Ahmed Reza: That's all they could do. It was like, just tell them that they were going to play pretend. So I never had any formal acting lessons until well into my acting career there. And yeah, it just sort of happened. But what was really interesting was — of course, it was very fascinating as a kid. And what was different was, you know, when people start recognizing you on the streets and, you know, people want your autograph. That was a little odd. It made me realize, you know, fame isn't all that it's cracked up to be. It's nice, but it also has its downsides, right? People assume you are somebody before they really get to know you, right? And that's something I empathize with well-known actors about. It can actually make it difficult to make human connections. And human connections are really important in life, because subsequently, when I was in the United States, nobody knew who the heck I was. My command of English wasn't that great. So I ended up finding myself in this place where I'm asking, like, I don't deserve this. What's going on, right? Why do I have to do these odd jobs and these other things? And then, realizing that's the reality of the world. The reality of the world is, you know, it's survival of the fittest. Nobody deserves anything. And wherever you are, you gotta think fast on your feet, understand your situation, make the best out of it, and it's always important to stay humble. And I think all of those experiences have helped me become a better entrepreneur and a better leader: don't take things for granted, don't let things get to your head, and really get a good grasp of reality. Like, get reality right, because it's easy to lose yourself in the moment or in your society or in your city.
But all of those things that we think of as very, very permanent and stable and unchanging in the world aren't as permanent or as stable or as unchanging, which is why, initially, if you meet me and you work with me, you're like, Ahmed's a little paranoid, maybe, right? But, you know, during COVID, I think I ended up getting more people in my tribe because they realized that when the world moves, it shifts beneath your feet, you know? Guys like me have an unfair advantage for having had that happen to them before.

[00:11:35] Adam Gamwell: That's a really interesting perspective. And I think that oftentimes, yeah, people can get wrapped up in their own head, their own story, right, and not think outside of that. And so they get comfortable, right? And then something can blindside them that other people may have seen coming, because change was on the way or they were paying attention differently. It's also, in the way that you're describing this, recognizing that so much change is part of our existence, right? And so we shouldn't just assume things will be one way all the time.

[00:12:04] I think that's a really interesting perspective to bring into conversation with entrepreneurship, too, as you're building your own entrepreneurial journey, which is building us, you know, towards the question of AI. Which, again, is also this interesting question, because we're at this precipice of change, right, in terms of how humanity will adapt and what societies will look like in the next five, 10 years and beyond, obviously. And it's funny, too, because even thinking about the questions of where AI is today and, you know, where it will go, and then how we think about that question and the opportunities of change that are ahead of us. So I appreciate the way you've opened this conversation about what opportunities mean and how we think about them, and how we avoid being blind to just where we're sitting.

[00:12:45] And so even thinking about that in your own story, you know, you have worked in a lot of different kinds of technologies, a lot of different industries. I mean, you've mentioned NASA so far, the school work that you've done, and you've worked in multiple ventures also. So I'm kind of curious to get your perspective on this, too. I know hindsight's 20/20, but thinking about it as we're moving through this: what was the kind of vision that was evolving for you in terms of what you were looking for? What kind of technology were you drawn to, what kind of businesses were you drawn to, and how did that perspective come together across time for you to bring us to today?

[00:13:17] Ahmed Reza: It's been a winding journey. Being an immigrant, you know, believe it or not, startups were never a thing, right? Like, even imagining myself as an entrepreneur was not in my mental frame. I've always kind of been an entrepreneur at heart, I guess. So through college, I actually opened my very first store at the Ithaca Mall, and it was a result of just being really broke. And here I am going to this Ivy League college, full scholarship, you know, I should be really grateful. And I couldn't relate to most of my peers because their situation was very different from mine. I had to take care of my family. And how do I take care of my family and, you know, pay the bills and go to school? Even on a full scholarship, right, on average, you have to make around $30,000 a year to at least take care of your expenses. I was taking care of my mom at the time, and entrepreneurship was just the only thing that worked. So I opened a store in the mall, you know, learned about retail businesses, and I remember in three months I made more money than I would've made in the three jobs that I had before. So I was working three jobs while going to college. Cornell is known for being pretty difficult, especially in computer science, and I was struggling with that. And then I opened this little store at the mall at the recommendation of a friend who knew about the incense and oil business. Like, I knew nothing about incense, right? But, you know, now I look back and think about it: Ithaca, you know, lots of folks that like incense, right? And it started. It became a little bit of a phenomenon in town. So I got to know more people, made my first hire, learned how to make better hires, right? So all these things didn't come from a place of, "Hey, I want to be an entrepreneur." It came from a place of "I'd like to not be broke."

[00:15:08] And then, I also learned about failure. And I think that's really important to highlight is the failures really shape you into who you are, who you become, right? How do you handle those failures? They will eventually come, right? So everything's looking great. And then, 9/11 happens. And then right after that, just the stock market crashes, everything goes to, you know, hell in a handbasket, and the business ends up folding. And this was my first — I still have some incense holders left over in the garage. And I'm talking like, you know, from 20 years ago. Apparently, the incense holders are good quality. 

[00:15:44] Adam Gamwell: It's a good product, yeah.

[00:15:46] Ahmed Reza: So that was like my first entrepreneurial experience. And I didn't tell most people about it 'cause I was afraid of being judged, right? They'd be like, "Oh, I thought you were an engineer. I thought you were smart." 

[00:15:54] So the NASA job actually came about because I was going to drop out of school, because I couldn't figure out how I was going to still pay for home and, you know, go to school. So I was going to have to drop out and go get a job at, like, a regular grocery store. And I actually had a work-study job working for space sciences at Cornell. And when I went to quit the job, they were like, "Oh, you're quitting, but you're doing a great job," right? "You've done this thing that several PhDs gave up on," mainly because, you know, if you don't know what the heck you're walking into, you don't have preconceived notions of how things should be built. So I went back to first principles and, just as a programmer, tried attacking the problem differently. That ended up leading into my NASA job, where I was one of the youngest engineers on the Spitzer Space Telescope project and did the image processing pipeline. It was just so much fun, right? Like, you really geek out, get to work with particle accelerators that were bombarding the detectors, and figure out how cosmic rays work, right, how alpha rays affect the detectors.

[00:16:55] So really enjoying that. Then suddenly got an opportunity in Florida after that working for the DoD. And that's where I really got to work on AI stuff. And at Cornell, I was part of the DARPA Grand Challenge, which eventually led to self-driving cars. So I guess my first foray into the AI space, very nascent AI space, was with the DARPA Grand Challenge. And then, my professional experience was doing some work for the DoD, creating some pretty rudimentary AI for unmanned aerial vehicles to avoid them getting shot down. 

[00:17:27] And then, just decided that I wanted to do something a little less serious, you know? This was like pretty serious stuff, right? And I'm in Florida at this point and I joined a startup and it was just so much fun. It was just — I didn't want to leave work. I couldn't believe I got paid to do this, right? It's like this is the kind of stuff that I do for free. And then, you have like all the soda you can drink, like the best startups have this culture, right? Like you just want to stay there, right? People don't understand this. You know, they talk about Google giving you massages and laundry service on-site. There's a reason they do all of that, right? The reason they do that is if you want excellence, you want people to stay in the zone as long as possible, right? I remember the CEO used to have to come and kick us out of the building 'cause they were like, "You guys have worked too long," right? So nowadays, you read about all the people who just don't want to go into work. Like we didn't want to leave work.

[00:18:15] Adam Gamwell: It's the opposite, yeah. 

[00:18:16] Ahmed Reza: That startup did really well. And through the 2008 crash, I ended up joining another startup and another startup, and all of them did really well, you know, and I did really well. And that's where — you know, it wasn't like a preplanned thing, but I knew this is for me. And when I wanted to start my own thing, I built this company called Dental Web Now, which was also where I got super lucky. 'Cause here I am, you know, I can do space flight systems, I can do all this really fancy stuff. But it's like, who wants to buy AI by the pound, right? Nobody wants to buy AI for the — So I'm hanging out with a friend of mine who's a dentist, and he has a $3,000 check on his desk for the Yellow Pages. And I was just really offended as a geek. I was just like, "WTF, dude?" Right? And he's like, "Yeah, $3,000 a month. They give me patients, and I actually spend $6,000 a month on my marketing." So initially it started out with me wanting to prove to him that he was a, you know, completely horrible person for destroying the environment. And so I rolled out a call tracking system using Asterisk to figure out which of those marketing campaigns was getting him the calls and which of them was getting him the patients. After three months, we had recorded over a thousand calls. 56 of those calls were from the Yellow Pages. So it definitively answered the question: the Yellow Pages was actually not working, his AdWords were doing a much better job, and his Google organic was doing great. So I go back to him and I'm like, "Aha! Look at this. You can't say no to data." And he was a smart guy. He was like, "Yeah, I can't say no to data, and you'll be my marketer from now on," to which I was extremely offended, you know? I was like, "How dare you call me a marketer," right? "Do you know what I do?" Then he goes, "I'll give you the six grand a month."
So I guess the inner entrepreneur in me was like, what if I went online, hired somebody, showed them what I did, and just had them do this relatively, you know, brainless work of optimizing customer acquisition costs? And that turned out to be a multimillion-dollar business that I bootstrapped.

[00:20:19] Adam Gamwell: Smart move. Smart move. 

[00:20:21] Ahmed Reza: I didn't even name that company, right? And this is like really basic machine learning stuff, right? It was named by one of the folks that I hired, Judy, and Judy still works with me today in my new startup, right? So I've been very fortunate to have amazing people around me. So it was Chris and Judy. And Chris was the first one to point out to me, as I was looking for startup ideas, that other people might call a business with revenues and growing customers a company. I remember that exact conversation and feeling like, yeah, I think you're right. Maybe I'm letting my ego get in the way and not accepting the money the universe is throwing my way.

[00:21:03] So I ended up selling that company in 2018. And that exit turned out to be better than many Silicon Valley outcomes. I became a private equity investor for a little while. And, you know, I finally got to sit back and look at the drawing board. Here we were, helping dentists make an extra $400,000 a year on average, and the key insight was figuring out customer acquisition costs — applying really basic machine learning to a problem and using that as an advantage. And I thought to myself, "Man, we're just capturing a fraction of the conversation. If we could capture all the conversations and we could really build a brain in the cloud that had access to your business's data, that had access to your business's conversations, we could really supercharge businesses in ways that they haven't really fathomed."

[00:21:52] And by the end of 2018, the Watson era had come and gone, right? There was the hype cycle, right? Oh, my God! Watson's self-aware, you know, it just beat Jeopardy. And that's why I'm a little bit wary of hype cycles, because when hype cycles end, some people tend to throw the baby out with the bathwater. AI has real, serious substance. Even traditional AI has real, serious substance. The fact that transformers have been transformative — excuse the dad joke — so the transformers came out, and there's all this change happening. I look at that and I go, "This is going to be something." Like, this is the best time to build that brain in the cloud. And I think I know how I can deliver it. So the big challenge I've been working on was actually: how do I deliver an AI in a nonthreatening way? 'Cause whenever you say "AI," people automatically assume, like, Terminator, right? They don't assume humble things like your iRobot or other things that make your life better, right? Like, your car's transmission has AI in it, right? Self-driving cars have AI in them, right? So there's a lot of potential there.

[00:23:00] And then, luckily enough, ChatGPT comes along and makes that problem even easier for us. So now, instead of us trying to tell people, "Hey, you want to try to use this platform called Yobi, which brings together all of your communications and brings together your team members?" By the way, we have an additional team member named Yobi that can clone your voice, that can answer — you wish you could duplicate yourself as an entrepreneur? That's literally what I've done: I have a digital clone that sounds exactly like me and is able to interact at a superhuman level, right? So I could carry on 700 phone conversations at the same time if I wanted to, or text conversations. I can nurture relationships, right? Blind spots that a normal human would miss, that a normal team of humans would miss, are no longer blind spots. It's like having a superpower. And we've built this app. We invested in building an app that works on every platform, that makes it super easy and intuitive. And you now talk to the machine in natural language, which is actually a mind-boggling innovation, because the user interfaces in advanced business applications start to become really confusing. And the way that we went about solving it is by simplifying it with how people naturally talk to each other, or how you would naturally manage your business, where you would ask your assistant, "Hey, can you go schedule this interview for me or this podcast for me? You know, get me a summary of what happens before the podcast. Afterwards, you know, remember to thank Adam." Now, all of those things can happen, and I can do more of the in-depth conversations. The more human part of me can now be leveraged for my own fulfillment, for making better business decisions. So we're getting into this very exciting new world.

[00:24:49] Adam Gamwell: No, that is super exciting. I just want to check. I'm talking to a human right now, right? You're not that good as a clone, are you?

[00:24:57] Ahmed Reza: I don't know if you've been following me. So we actually did a fully AI-generated podcast with me and an AI host. 

[00:25:04] Adam Gamwell: I've not heard that yet. No, okay. Yeah. 

[00:25:06] Ahmed Reza: I sent it to my mom. And she goes, "That's a great podcast. When were you on?" I was like, "Mom, I expected you to know that that wasn't me." If you pay close attention, you can tell that it's not me, but it is surprisingly, surprisingly good. And I think it's just a new medium, just like videos were a new medium. And if you were the first person watching that train coming towards you in a theater, you'd be like, "Oh my God! That's an effing train coming towards me." But today, we watch movies all the time. They evoke emotions. They do all of these things, right? But you never see a picture or a video of your great grandfather, and you're not confused that they're alive. They're not. It's just a different medium. By the way, I'll just have my synthetic agent say hello to you: Hi, I'm Ahmed's Yobi, a synthetic agent trained on his personal data and communications. 

[00:25:57] Adam Gamwell: That sounds right on, yeah. 

[00:25:59] Ahmed Reza: So it's pretty close, right? And it allows me to interact and be more of the kind of CEO that I'd like to be, you know? A lot of times, you know, you probably find yourself repeating the same information, pitching again and again and again, right? All those instances where, in the back of your head, you thought, "Man, I really wish I could teach that next sales agent to just learn from what I have." Well, guess what? AI is that intern, that sales agent, that can actually help do that. Those memorized things, those things where you just go on autopilot, right — you can now hit the autopilot button on Yobi and let it take over a lot of those conversations. It doesn't mean you're not having the conversation; it's still on your behalf, right? That's totally possible, and it's live.

[00:26:46] Adam Gamwell: At a different kind of scale too, yeah. You can now, as you said, talk to 700 people at the same time, which is really, really interesting in terms of, like, moderated or unmoderated interviews, I suppose, or, you know, call center calls. I mean, I think there's something else you said that's really interesting there, too: there are a lot of mundane tasks that just take a bunch of time, right? And so the value there, too, is this notion that you mentioned of freeing up more of the human side of ourselves, right? I think this is an interesting question, because as AI gets more embedded in our businesses and in our everyday lives, you know, there is obviously some level of concern and fear that, you know, either I'll lose my job or I'll get replaced somehow. But I think this is an interesting question, too, in that on one level, there are things that AI can't replace, or at least can't replace for now. But then, I also question this narrative, because a lot of people talk about this replacement narrative, and I'm curious about your thoughts: why do we have that perspective, versus seeing that it frees us up to do other things, right? It kind of offers some different kinds of affordances of what we might be able to do. So I'm curious. We see that back and forth a lot. How do you see those sides of the coin?

[00:27:55] Ahmed Reza: So the fear of loss is a much greater motivator than looking at what you could gain. As an entrepreneur, that's been my unfair advantage. I just couldn't believe that there wouldn't be so many more people competing with me on the tech that I built at my last company. I was like, there's no way, right? Nobody built a competing company in years, and it was relatively trivial to build. And one of the reasons that happens is because a lot of people will second-guess. They're like, what if this and what if that, right? So oftentimes, yes, entrepreneurs take risks, but they're calculated risks. That's the difference between straight-up gambling and, you know, risk-taking.

[00:28:38] And in that mindset, yes, the fears are real. I don't want to dismiss them and say there are no fears, right? With every technology you can use it for good and you could use it for really — actually there's three kinds of things, right? There's the good. Then, there is like Terminator, which I think is a little bit overblown and it's more sensational. But then, there is the benign evil that I think requires deeper thought — things that unintentionally happen. And we've seen this with the advent of Web 2.0, where we never built social media to create divisiveness. Yet that became an emergent phenomenon of rather not-so-smart AI, right, where you just told it, "Hey, get me more attention. Get me more users." So okay, great. Here's the thing that induces dopamine hits. So your phone's dinging all the time. You'll notice that my phone's not dinging. I turn notifications off. My synthetic agent handles a lot of those for me. If it's really important, then it bubbles up to me, right? If it really requires my attention, it bubbles up to me. So just not living in a constant, you know, constant state of like getting hit in the head. It's kind of an unfair advantage in today's world. And that's not what any of the technologists set out to build. Like Jony Ive was giving an interview around the iPhone. When they were building the iPhone, they really wanted the technology to give access to many more people. And I think AI is going to allow us that step function increase in giving access to more people, right? It just removes one more step from getting to technology.

[00:30:16] But we have to be careful, and we have to be thoughtful about how do we build this thing out? What is the impact? What are the social impacts of what we're building? Because to just hand-wave it and say, "Well, that's not a big deal," you know, go ask the creators of some of the biggest companies that we have today. They're concerned because they're good human beings. I'm fortunate enough to know many of them, right? And at the end of the day, we are human. We want to build things that, you know, elevate, that increase for everyone, right? I haven't talked to a single entrepreneur who was just like, "Oh, it's just for the money." No. Almost every entrepreneur looks at their work as a creative act where it's part of their legacy, and nobody wants to build something that has negative effects on humanity. That is never the intention. But when that does end up happening in an unintended way, it's important to pause and reflect. And when you're looking forward, go, "What am I doing today that could have those kinds of unintended consequences?" 

[00:31:16] So at Yobi, from a very early stage, we took anti-spam very, very seriously. We go above and beyond for a company our size to make sure that our technology isn't abused, that the folks who use our technology use it the way that it was intended, and that it's extremely powerful. They understand it. They understand that you have to train it. They understand that it can empower businesses. That's what we're trying to do. We don't focus heavily on personal use, right? We discourage personal usage, actually, because this is not a conversation that you would want to be having, like where you're fighting with someone, your girlfriend, or whoever, right? Yobi is not the place to do it. That's not necessarily where you want your synthetic agent, you know, going, "Yes! I now know how to curse at people," right? It's like, no. If anything, in a business context, I want to come across as professional. And if I am getting really emotional, I would ideally like my partners, my colleagues, even my AI colleagues, to tell Ahmed to stop, take a deep breath. "Are you sure you want to say this? How about we rephrase it like this?" Right? That's the way that we're purposefully building out our app.

[00:32:30] Adam Gamwell: No, that's really helpful. And I was actually talking to some friends last night about this kind of question. I agree with you that the Terminator scenario is a bit overblown. And so, you know, why are we not telling more stories about this: if AI is trainable, right, and if we're providing data sets and training it with the material we're giving it, why not have it be able to step in and help talk us down from an overly divisive scenario or a fight, right? And say, well, actually, you know, I know you want to yell right now, but whatever psychological principles A, B, and C would suggest that it's better to do this, or like, you're going to feel bad about this tomorrow, I promise. So let's just not yell right now.

[00:33:07] Ahmed Reza: So it's funny. Alexis Ohanian, somebody I deeply respect, the founder of Reddit, pointed out that the algorithms don't necessarily promote nuanced discussion. They promote things that are extreme, right? And I understand this fully well. And that's why I say we're having this discussion about AI in a very AI-dominated world already, right? So I would have to say something extreme. But the algorithm doesn't actually understand that I'm saying something extreme. What it says is, "Oh, Ahmed made a funny face," or "He did something that elicited a reaction. Let's just optimize for that," right?

[00:33:53] Adam Gamwell: Interesting, yeah.

[00:33:54] Ahmed Reza: And that's why if I say "Terminator" a bunch of times, and if I make scary faces, you know, and if I say, "Oh my God, it's all going to end!" right, the algorithm will just zoom in and optimize on that. And as somebody who builds algorithms, I understand the silliness of it, right? But that's where I think it's important to know the other humans here in Silicon Valley and beyond who are building this, who are very human, who understand what's happening and who also understand the limitations of the current state of AI. And I think that's where the concern comes from for them: well, can we put our greed aside a bit and make sure we are taking responsible steps forward? I don't think I've heard any disagreement. I don't think I've heard anybody say, "Oh, forget responsibility. Let's just go full steam ahead and come what may."

[00:34:46] And the thing is, this is the time to really look inwards, 'cause AI should allow us to reflect on our humanity. Like, what's it done for you for the last 10 years, for 20 years? There's good stuff, and then there's stuff that you could do without, right? The constant dings? You can probably do without those. The good stuff, like self-driving? It's really amazing AI. My Roomba is my favorite robot. There's AI around us that's already making this call a little bit better, transcribing it, assisting us, right? So just extrapolate that forward. Now that you can talk to the machine, the issue going forward isn't the technology; it is the humans behind the technology. So it behooves us humans to think about what it means to be human. What is my legacy going to be? The saving grace of AI is the fact that the people building AI have kids, right? What is the world you want to leave your children? Yes, you have an incredible amount of power in your hands. And as they say in Spider-Man, with great power comes great responsibility.

[00:35:48] Adam Gamwell: I mean, I think that's a really fascinating perspective too, because oftentimes when we think about the development of AI tools, and even Silicon Valley, I think you're right on: most people will jump to Web 2.0 examples, right, with social media and the rise of divisiveness. But then we think about, okay, what could something like ChatGPT do as an easy access point for most end-consumers, or again, on the B2B side, if you're using Yobi or something else for, you know, call center data or the different ways of interacting with different businesses, who's making it on the other side? And that's a really interesting question. I love this idea of the saving grace 'cause developers have kids. You know, that the CEOs, the C-suite, the folks building the businesses have kids is important too, 'cause you're right. We sometimes forget in these conversations that it isn't, hopefully, the people versus the machines; it's sometimes us versus ourselves, and how do we then ask questions about what kind of world we want to build?

[00:36:47] So I'm curious about something you said in there: that AI ideally has the capacity to help us reflect on our humanity, on what it means to be human. How can we help people do that, whether through business, through consumer technology, through B2B technology? What are some of the ways we can do that? I mean, is this one of these "here's a great prompt for ChatGPT" things that we should put on YouTube? Or are there other ways we might plug in and think about it? Like, how do we get folks to do that so we're actually not yelling at each other with our dopamine hits from, you know, Facebook telling us that we're all super different and actually enemies?

[00:37:19] Ahmed Reza: Right. And that's the funny part: as I travel around the world, I realize there's a little bit of a monoculture, right? The world that I was born into is very different. I remember growing up, nobody asked each other what they did. That was one of the really odd things when I immigrated to the United States; there was this, you know, "What do you do?" And we also have to ask ourselves that. You're Adam, right? And if you really ask yourself, "Who's Adam? Are you an entrepreneur or a podcast host? Is that all you are?" The reality is that's not all you are. Ask anybody who loves you. Ask any family member, right? They love you despite the quirks, and maybe because of them, right, because of some little defect that they find adorable, that they find to be you. I have a lot of them. Lord knows I have a lot of them, and I'm so grateful to have friends and family who appreciate the nonperfect version of me. And I eat my own dog food, right? I've been testing this stuff since way before most people. So I've asked myself, well, here's this technology that's a better salesperson than me. It's more vigilant, doesn't have to sleep, doesn't have to, you know, eat. What does that make me? Well, that still makes me me, right? I'm still this ephemeral being with a very limited time on this earth, and my reality is very different from the reality of this thing that runs on GPUs. It's a great salesperson. It's a great tool, right?

[00:38:49] And yes, we end up being tools, like back in the day before the Industrial Revolution, when a lot of the work was manual. And maybe you got some sense of identity from that. When that changed and these machines came about, what did we do? We started building skyscrapers. We started building these cities. I mean, if you look at the world, you see human creativity on display. You go to a new city and look at it, and you realize, wow, this is so crazy. In the last hundred, 150 years, all these technological changes have come about so insanely rapidly, right? 'Cause our brain — like, you watch a movie, and the emotions you feel are very real, man. All of this stuff did not exist, right? Like being able to communicate in this way. If I told you, I don't know, 200 years ago that I'm talking to somebody on the other side of the world, I'd get institutionalized. Now, that's very real.

[00:39:48] But it's amplifying our voices, our dreams. And that's how I look at any new technology that's coming about. It requires us to really look inwards too, because in addition to amplifying the results of industrialization, you could also argue, you know, nothing as catastrophic as World War I or World War II ever happened in human history before, right? So we have to look at history. We have to understand us, and understand what it looks like for us going forward. There's a very interesting book called "The History of Us," where the author tries to weave together a history of the world as it is today, which is very multicultural. Borders are not quite like they used to be, right? Everybody's from somewhere and going somewhere. And we're realizing the common shared humanity of everybody. So that book made a really interesting impression on me, because we often look at the history of Europe and the history of Asia. But walk around New York City, man. This is the history of us. How do we go forward? Do we build something wonderful, something utopian like Star Trek? I'm a big Star Trek fan, right? That's what the green screen is. Like they come into the transporter room.

[00:41:10] Adam Gamwell: It's a holodeck over there, yeah.

[00:41:11] Ahmed Reza: Right. And you'd be surprised. You see a Star Trek fan from Korea or from the Middle East, and you realize you have more in common with that Star Trek fan than you realized, right? You can instantly bond and build human connections almost anywhere. So we could have this world that's objectively better, where people get to live more meaningful lives because the machines allow us to achieve more productivity than we ever thought possible. The cities are evidence of that. Our world currently is evidence of the productivity that we've been able to achieve. Even feeding eight billion people on Earth: people in the '80s did not think that was going to be possible. They thought there would be mass starvation, people dying off, right? But thanks to some geeks somewhere in a lab, right, working on food science — something not very sexy, right? When was the last time you were like, "Oh, food science, that's where it's at"? But that really was where it was at. It's literally feeding us all, so much so that we have a bit of an obesity epidemic, right? Imagine telling somebody like 30 years ago, 40 — oh my God, that's 50 years ago, okay — 50 years ago, "We're going to have an obesity epidemic, and we're going to have eight billion people on planet Earth." They'd be like, "That doesn't make any sense," right?

[00:42:26] So if you look forward, I think those are the things that we really should talk about: what do we do when we all have a little too much? How do we live meaningful lives? How do we make sure that we're healthy and happy and don't go back into the things that caused the great wars?

[00:42:44] Adam Gamwell: We're going to take a quick break. Just wanted to let you know that we're running ads to support the show now. We'll be right back.

[00:42:54] I think that's a really interesting point, that we're at this, again, interesting precipice, a challenging one. You're right: we have the capacity to feed eight billion people. We have the capacity to stop ourselves from getting into unnecessary wars. But part of it raises the question of incentives. How do we help those — I don't know if "help" is the right term — help those in power to make more, I want to say, ethical decisions? I mean, AI is an interesting fulcrum, I think: both the technology and a lever that certain people can pull. And the question is, who has access to both build and pull those levers of power, and what does AI afford in that case?

[00:43:33] And then, I think you're totally right that we need to be able to look at our history and ask these questions: who are we, and how did we get here? We narrowly avoided nuclear war. I mean, obviously Japan was not spared, you know, in terms of the U.S., but the point is that we didn't nuke the whole world, and let's hope we don't. That threat has hung over us since World War II and then since the Cold War, right? There's always been kind of a specter. And so I think there's something really interesting about the global nature of it, too. I really appreciated your point that when you're in New York, we are seeing a story of us as much as we are seeing New Yorkers, and folks from the Philippines and from Bangladesh and from Texas and, you know, from all over the world in one spot. And so we really do get this wonderful human melting pot and realize that these are our neighbors, these are our people, these are our homies, you know; this is who we live with and spend time with every day. And that humanity is fundamental, I think, to have in conversation with these other bigger questions of, you know, where can AI take us?

[00:44:30] And I appreciate your optimism too, because it is this interesting question: so often we just hear folks land on the fear question. But I'm curious about your perspective on this: how can we help incentivize leaders to do good with this technology? I think you're totally right that there is the benign-evil possibility, the unintended-consequences possibility. It's not a given, which is good. That would be chaotic evil, I suppose, right, if we did our chart of where folks live on the alignment chart. But yeah, this is a big hairy question, and I think it's an important one to ask if we want to allow ourselves to be optimistic. How do we make sure people are in the corner of humanity, and keep them there, if we're building tools that can totally and radically reshape opportunities and affordances and, you know, things like food distribution? We can feed eight billion people, but we don't do it effectively or efficiently, right? There's a great unequal distribution of who gets what, when, where, and how. And, you know, is that just market forces? Maybe. But how invisible is the invisible hand really, if we're being honest?

[00:45:36] And so, you know, again, big thinking, but I'm just curious about your perspective on this too. What kind of tools do we have, or how are you thinking about this? I mean, ethics is such an important question, so let's get into the weeds here. I'm curious what your perspective is on what we can do in this space.

[00:45:51] Ahmed Reza: Fortunately, we come built in with this compass, right? And it's actually tied very much to our happiness. And this is something that I think is really important to discuss: when I was growing up in New York City, when I was destitute, it was hard to really think about these philosophical things. I dismissed philosophical musings as just the musings of privileged people. To some degree, maybe I was right. But then reality hit me hard in the face: I made enough money, I had all of those material things, including a Rolls-Royce, and I became absolutely miserable. I wasn't happy. And I just looked back and said, "How come I felt so happy when I was broke on the streets and was sharing a sandwich with some friends?" And I realized what a huge blessing it was to be broke, because that's where you find the people that will share a sandwich with you, and you realize they are invaluable.

[00:46:56] So one of my first investors in Yobi was one of my friends who has been there for me since high school. He bought me my first CD writer for my computer. We were all just these broke kids, right? And he worked at McDonald's. He worked his butt off. He's the one who actually helped me build my first business venture. And I remember him giving me the CD writer. I was like, "I can't believe this," right? He's like, "You're going to do something good with it." He just bestowed these great expectations on me. And then I realized that those are the things that truly are meaningful. If you want to have a meaningful existence, yes, you need enough money for food, shelter, you know, and some creature comforts. But if you give yourself too much of that, that's actually not happiness. And unfortunately, in our Instagram world today, we often just perpetuate the material side.

[00:47:50] And the truth is, you talk to folks who are materially wealthy and you realize they deeply value these immaterial things and understand that those are blessings: the blessing of community, of friendship, of doing something good. And it's really important, in my opinion, not to be dismissive of philanthropists, right? Because you go, "Oh, they're so rich. They're just giving away their money." Yes, it's one thing to question philanthropy; I'm not going to say every single philanthropist is of good heart. But most of the folks that I have had the fortune of rubbing shoulders with, or being on the receiving end of that philanthropy, have been incredible. They're genuinely good human beings trying to be better. And if we don't nurture that among people, if we don't nurture that culture, that's going to come back to bite us, especially if we take a very materialist worldview where more numbers in your bank account is winning. That's not true. That's not a virtue. Yes, being able to take care of things, being able to create value — there's something great about that. But ask most entrepreneurs, right? They might put on their Superman shirt and go, "Yeah!" But really, ask them privately: "How many families do you end up feeding? How are you helping outside?" Just go and look at all the dentists. Yes, I'm going to bring up the dentists. Look at dentists, car wash owners, these folks who are entrepreneurs: you'll find them volunteering. You'll find them donating. You'll find them trying to move the needle. And that's something that's admirable. They're doing that to be successful, because success beyond the basic material level isn't about having more material things; it's about building things that you're proud of, that you can be happy with, for yourself and for others.

[00:49:38] And especially in a world that is increasingly rewarding to folks like myself, right? Like, if you're a tech geek, you absolutely have an unfair advantage. If you are a certain way, you have an unfair advantage in this current world that we live in. Understand that that is a position of privilege, and that if you exploit it without much thought, you will end up becoming unhappy. So the moral compass is there to guide you to being a better you that you can live with, right? I think that tends to get understated and underestimated. We often tend to assign malice. Assume ignorance, not malice. That's what I've found to be true in getting reality right. That's the reality of the world that I see. And most of the people that I've seen are trying to be better. They're on their journey to being better. So we should reframe success as not just being wealthy. Growing up, I looked at Albert Einstein as someone who was very successful, which is why I've always been a bit of a geek, right? We should reframe what success really means: not just making money, but actually doing something great for humanity, doing something great for your community, and being someone who at 70, 80 years old can still bring the awesomeness to barbecues and, you know, have people miss them when they're gone.

[00:51:04] If we optimize more for that and for leadership like that, I think we're going to end up with a very different outcome. Because let's be serious: if we didn't rob banks just because there were cops, if that was the only reason, man, we would descend into chaos so fast. And you can go into societies where they don't have that much policing, and people still just don't rob banks on a regular basis, right? So understand the reality of that. And yes, we do need police. Yes, we do need enforcement. Yes, we do need governments to help establish, you know, the regulations or whatever underlying infrastructure is needed for society to survive and thrive. But also understand that the people who are going to do harm, who are just not in a good place, maybe mentally or whatever it is, who are not thinking straight: that's what you build your police force and all these other things around, to make sure that when those anomalies happen, which they will, you're able to deal with them effectively. So similarly, in the AI space, when some teenager who doesn't quite have their prefrontal cortex fully built out yet is raging and using AI as a tool, we want to make sure that that gets controlled, that gets contained. So there's that balance between, you know, absolute totalitarianism and complete anarchy, where there's that happy medium — special emphasis on happy.

[00:52:33] Adam Gamwell: Yeah, right. No, I think that's brilliantly said. And it's interesting, because there's a connection, as you're saying this, that had me think you're totally right that there is this important piece for us to think about: the way we are defining success, and knowing what matters to us as humans, right? It reminded me that Rutger Bregman wrote a book called "Humankind," which is basically a history of human kindness; "A Hopeful History," I think, is the subtitle. But it's just this interesting point, because, as you said, we tend to assume malice, which is not great. Our brains are wired to hear the negative thing, and then we latch onto that. You can get a hundred good stories and one bad story, and your brain says, "Oh, something bad is happening," right?

[00:53:14] And this is an interesting point, that we remind ourselves that, by and large, we as humans are wired to care and to belong. It's one of our fundamental drives, you know, kind of as you're saying. And I love your point that we have this compass built into us, right, and that we kind of forget it when we're thinking across, you know, economic or power lines, or just when someone is different from me. But there's also a human there too, right? And these questions of, you know, how do we want to live and who do we want to live with? And also recognizing that there are these stories that are actually good, and that's actually been humanity's success: we learned to work together. I mean, cities are literally just an example of beings that are okay living near each other, right? Cats don't do that. I mean, they do it because we make them. But, you know, most other animals don't do that.

[00:54:00] Something else just popped into my head as you were saying this. You mentioned before that algorithms, like Alexis Ohanian was saying, tend to call out the extremes, you know, and kind of see those. And so is that just a weird reflection of how our brains actually work? Because on the bell curve, we tend to hear the extremes and say, "Oh, it's either super good or super bad," and we kind of ignore the middle, which is 98% of most things. It's weird. If there's a connection there, I'm not sure what it is.

[00:54:26] Ahmed Reza: That's actually absolutely on point, right? Evolutionarily speaking, if there is movement in the bushes, it's better to assume it's a tiger than to assume it's just a rabbit, right? Because if it's a tiger and you were right and you were alert and you ran away, you survived. So understand that, and realize that acting as if there's a tiger just because the door opened is irrational, right? It's not reflective of our current reality. And we need to make sure of that as technologists, as scientists. There are a lot of psychologists and neuroscientists who work very closely with startups, right, to try to make the apps more engaging. But understand that human beings are prone to super-releasers, just like any other creature that is biological in nature. We cannot avoid our nature. So if you make things that you know will addict, you have a moral responsibility to understand that, with the distribution strategy that we have, this is going to addict the next generation. This is going to addict a lot of people. There's this attitude of, "Oh, it's just a simple thing." Well, not really. It's not just a simple thing. As we build other machines, optimize for human happiness. Yes, make money, but at what point do you draw that line, right? And that's the problem with reductionism, because reductionism is where the benign evil seeps in, where you say, "Oh, I was just doing my job," right? There's nothing more dangerous than "I was just doing my job." Doing your job really well, to a fault, unthinkingly, is literally the thing that we are afraid of.

[00:56:10] Adam Gamwell: Yeah, that's a great point. Yeah, that's right. 

[00:56:13] Ahmed Reza: So that's us. That's human, man. That's the whole point of being human. We are nuanced. We have all of these things. We are imperfect. But there is some special sauce in all of that mix of being imperfect, that we can step back, right, which is why ethics is such an interesting discussion. For me, it's pretty obvious that you shouldn't just throw somebody onto the train tracks to save three more people, right? There's a moral dilemma there, and we're built with it. We come built in with these things, with these feelings, to preserve society, for the preservation of us. We're built to preserve "us," not "me." And that's what ends up happening: if you hyperfocus on "me," you'll realize that your happiness requires "us." And if we want to live in a world that's better, we should be hyperaware of that.

[00:57:10] Adam Gamwell: Yeah, no, it's well said. I mean, especially as the world is increasingly globally interconnected, right? The number of "us" is increasing, but there are also more ways we can interact and be engaged with one another. And to your point from before, being able to talk to somebody across the globe is both an incredible feat and a reminder, I think, of the "us" drive and the responsibility we have for each other.

[00:57:32] And so I guess my last pocket of questions, being respectful of time, is about your work with TrepHub in this space. This is one of the other parts of your work that I think is really interesting and worth digging into a bit: tech entrepreneurship from a global perspective, in terms of building more accountability and networks and support, especially for startups that are outside of the U.S. Even as you mentioned up top, startups weren't a big part of conversations in your world until you found your way a little past NASA, you know, in terms of these business areas. And so this is an interesting question for folks that are not near NASA, whether in Houston or Florida, you know, the two centers of the universe, right? Just the idea of, how else can we help more of the global population have access to technology and tools supporting more kinds of businesses? 'Cause I think this is also a fundamentally important part of leaning into our moral and ethical compass, but then also the "we" part, right? It just comes down to us. And when we realize that we have the power, the ability, to help others who normally may not get the same resources, what's our responsibility in that space? So I'd love to hear a bit about your work and your perspective on this side of things too, in terms of building more global support for startups around the world.

[00:58:52] Ahmed Reza: So TrepHub was like very near and dear to my heart. So I was in Melbourne, Florida, which didn't really have a startup ecosystem. And I was like, "How am I going to do this here? What am I going to do?" I started modeling something that I had learned at the Milken Institute Global Conference, where they bring smart minds together and they don't really set a crazy agenda, but amazing things just come out of it. So I thought, "What if I could bring like-minded geeks together?" So we started this thing, and we called it the Fight Club for hackers because we were worried that we'd lose our jobs if they found out that we were coding with some other people from, you know, other companies after hours. That eventually grew from five people at our first meetup to over a thousand members in less than six months. And this was like invite-only, right? We didn't even have a name for it.

[00:59:41] And it started spawning startups. And I remember the first startup that came in. We rented out the first floor of this building in Melbourne. We all chipped in a little bit, right? And this guy comes in and he's like, "Hey, without this place, our startup would have never taken off. We got funding. We want to contribute." So we quickly gathered together. And at that time, GitHub was really popular among geeks, and there was, you know, Entrepreneur magazine. "Entrepreneur" shortened is "Trep," so it was TrepHub, right? It's like open-source entrepreneurship. And that was probably one of the best things that I ever did in my life. So TrepHub helped spawn a whole bunch of bootstrapped companies, including mine. 

[01:00:23] And the advice that I got, the support that I got is invaluable. It truly is invaluable. Brought out the very best. I also learned about community, the power of community, because we were able to launch nonprofits that were incredible. So one of the nonprofits was trying to build a submarine that would clean up the Indian River. What a crazy idea! He comes to me and goes, "I need $5 million." And I was doing well by this time, but not that well, right? And I said, "Let's put it out to the community and see what happens." So his original plan was in several years, they'd have the first prototype of the sub. I check in with them three months later, they are on their third iteration of the sub. He said once he put the word out, somebody walked into the second meetup with a working sub, with a working autonomous submarine. And that's the power of community. That's the power of building things, right? And I realized most geeks, like we love building. Like the act of building, the act of solving problems — that's our legacy. That is what we leave to the world. It's a labor of love. 

[01:01:28] If you can bring those together, like, so I don't really care where you go. And as I travel to various cities, right, I noticed this to be true, right? Initially, people would puff their chests and be like, entrepreneur this, entrepreneur that. But then I'd be like, "Nah, dude. I really just love building," right? I don't know what else I would do, right? Like to me, there is no retirement. I'm already retired. This is what I do. And if I can live my life doing this thing that I love doing, that you can't take away from me, you know, the next thing is, what are the big problems that I want to solve in the world? So we actually launched another nonprofit out of there called Steadytown, which rehouses homeless people. And they've rehoused hundreds of families that have become homeless due to financial troubles. And when I first heard of it, I was like, "Alright, that's crazy." Like I'm all about solving problems, but the person who was solving this problem had solved really big problems before, and I saw it happen before my eyes. 

[01:02:24] So that spirit, that hacker spirit of just let's build it or what you call the creators of today, right? They're building things. They're building things in this new paradigm. If we understand that that's the reality that we actually live in — no matter where you are, if you're connected to the internet, you are living in some version of that. And that's why you'll see that 51% of Silicon Valley CEOs are immigrants. And I think you're going to see more and more of that. And you see countries in the Middle East like retooling themselves, where I went to the — recently, I was blown away by the multiculturalism. There's people from all over the world. Saudi Arabia is building this new city, trying to attract people from all over the world. You go to Istanbul and you're like, "Wow! Feels a lot like New York." So the world is recognizing that the power shift has happened to the builders, to the creators. And governments and others are kind of realigning themselves to make sure that they are part of this future. If you're in Africa somewhere, I think like the future is like in Africa, Sub-Saharan Africa is where we're going to see like a lot of amazing things come from. 

[01:03:33] Don't look at today and say this is how the world is always going to be. That is absolutely — like I can bet you that that's probably not true. The world is going to change. It is changing in a certain direction, probably towards leaning more towards you, the person who might appear disadvantaged today. But if you just go up and, you know, organize a coffee meetup, watch what it becomes, right? And you don't have ulterior motives. You just want to get some folks together and just want to see what you can build. You don't have to have the grand plan of world domination, but you might find yourself in a global stage able to impact the world in a positive way. And I find myself in that very humble position.

[01:04:15] Adam Gamwell: No, that's wonderful, and a ringing endorsement actually for winning more geeks in the world too, right, 'cause we need more builders and makers. 

[01:04:22] Ahmed Reza: Big fan of geeks here. 

[01:04:23] Adam Gamwell: You're in good company then, yeah, 'cause I think this is one of the pieces that's also just the most exciting: I mean, we have such tools now too, right? We have access to, you know, the internet in so many places. And then just even the ingenuity we see globally, in terms of, you know, cellphone usage in Sub-Saharan Africa, where flashing minutes was used as a kind of informal currency before online banking was available, and also for communities that don't want to get banked in that sense, too. This is interesting. Like we always will use technology in these unique ways to build, you know, scenarios that work for us locally. And so I think you're right. This idea of getting the hacker spirit going and the power of community building is such an important piece. And I feel good. It's a positive reminder that, like, let's get together and build. That's what we need more of: doing well together and building with this sense of optimizing for human happiness. I'm going to borrow that term because I like that phrasing, 'cause we can, you know, we can and we should. 

[01:05:15] And so I just want to, you know, so thanks, Ahmed, for sharing your wisdom, your stories with me in the pod today. I'm excited to share this with the audience. I guess, is there anything that's on your mind either that we didn't get to talk about so far or, you know, or even something that you're hopeful for for the future in terms of, you know, what we want folks to take away from this conversation?

[01:05:32] Ahmed Reza: Well, my hopes are really in people. And I really hope and pray that the world of the future includes better, happier people, because sometimes we look at all the plenty that we have. And one of the leading causes of death in America is suicide. And you have to ask yourself, why is that? Why is it that we have so much and, you know, life expectancies go in the wrong direction? In business, you look at KPIs. And those KPIs direct you when there are problems, right? When the customer count's going down, when your churn rate is going up, pay attention, right? And I think this is time for us to pay attention and say, "What's happening here?" What we really need to do, in my own humble opinion, and I think the opinion of many others actually now, is to be thoughtful about the future that we design for humans, because capitalism, the American dream, all of what we see around us, what we've aspired to, was actually to live more fulfilling lives — life, liberty and the pursuit of happiness. Not the pursuit of pleasure, but the pursuit of happiness. Like really understanding what is human happiness, contentment, versus just, you know, super-releasers that will pump you full of dopamine and other things and will ultimately leave you so unhappy that you don't want to live anymore. Like that's not a future that we want, you know? 

[01:07:01] And that's kind of the unintended consequence of when certain things go rogue, get overoptimized. We really should optimize for ourselves. We as people should think more deeply about the work that we do, the lives that we live. And instead of thinking, "Oh, I'm getting replaced," maybe we should ask, "Why is my civil identity 'programmer'?" Yes, this is what I do. I've put a lot of time into it. Yes, I'm a craftsman, but I am still me. And we have to start to become — as a nation, as a world, we have to start appreciating that even though there's eight billion of us, there's eight billion unique human experiences that are special. We know in our heart that that's true.

[01:07:43] Adam Gamwell: Yeah, well said. Ahmed, thank you so much. This has been a great enlightening conversation, and I really appreciate you jumping up and down in the rabbit holes with me. It's been fun and yeah, thanks so much. 

[01:07:52] Ahmed Reza: Thanks so much for having me, Adam. 

[01:07:55] Adam Gamwell: And that's a wrap for today's episode of This Anthro Life. A big thank you to our guest Ahmed Reza for sharing his insights on leading with humanity in the age of AI. Now, if you're a fan of our podcast and the content that we bring you, I'd love your help. Please share it with someone who needs to hear it and/or leave us a five-star review. Now, you can do that through your podcast app if you're on Apple Podcasts or Spotify or you can visit thisanthrolife.org/reviews — the link is in our show notes — to leave a review on the site or find links to Apple Podcasts and Spotify for reviewing there. As well, our team has launched the Anthrocurious Substack blog and newsletter, which is a companion project to our podcast that offers reflections, publications, and the latest from the social science community. The link is also in our show notes. So if you're a fan of our podcast, you'll love the content that we're sharing on Anthrocurious. Our blog and newsletter are packed with insights and stories that dig deeper into the topics that we discuss on the show and go beyond them. From interviews with leading anthropologists to reflections on the latest research, Anthrocurious is the perfect way to stay up to date with the latest developments in social science. But that's not all. We are also asking for your support. Our podcast and blog will always be free, but we rely on the support of our listeners to keep the community going. So if you're able to, please consider subscribing to our monthly or yearly subscription. Your support is the primary way that we're able to sustain the community and keep delivering high-quality content that you love. And if you're already a subscriber, hey, I want to thank you so much for your support. I really couldn't do this without you. 

[01:09:20] Alright. So here are three takeaways from today's conversation. First, AI has real substance, and it's important to consider the ethical implications of its development and deployment. As Ahmed Reza noted, unintended consequences are always a possibility, and it's important to mitigate risks through thoughtful consideration and regulation. Second, community and philanthropy are important values in shaping a meaningful existence, especially through business. Success shouldn't just be defined by material possessions, but also by doing something great for humanity, which doesn't have to involve making a new kind of technology. Third, change is part of existence, and recognizing and adapting to it is important. Entrepreneurs should embrace humility as part of their business and focus on building things that don't simply disrupt, but that foster opportunities for more equitable and meaningful futures.

[01:10:06] Now, after listening to this episode, I invite you to consider: 1) how can we encourage the development of AI in a way that prioritizes humanity? 2) how can we redefine success not only in terms of material possessions, but also through doing something meaningful for humanity? 

[01:10:21] As always, get connected with me over on social and email, and head on over to the Anthrocurious Substack and subscribe today. You won't regret it. Thank you so much for your time, attention, and energy once again. I'm your host, Adam Gamwell, and you're listening to This Anthro Life.


Ahmed Reza

Founder and CEO of Silicon Valley-based Synthropy AI and Yobi

Ahmed Reza is a tech entrepreneur with a story that reads like a blockbuster movie. He started as a child actor in Bangladesh before becoming a Milken Scholar at Cornell University while homeless, then working as an engineer at NASA. But his true calling was entrepreneurship: he founded successful small businesses before creating Yobi, a company that simplifies customer relationship management for businesses.
He's also known as "the intelligent man's Forrest Gump," having formed close relationships with Michael Milken and Dr. Edward Zuckerberg, among others. Ahmed's journey is a testament to the power of hard work and determination, making him an inspiring guest for any podcast.