Is AI Going to Take your Job? with Amanda Moriuchi

Is AI coming for your job? In this episode we sit down with Amanda Moriuchi, CEO of AppIt Ventures, to explore this pressing question. She shares her insights on how AI is disrupting industries, the vital human skills that technology can't replace, and why empathy and creativity are more important than ever. Together, we dive into what makes us uniquely human in a world increasingly dominated by machines. Packed with thought-provoking discussions and practical advice, this episode is a must-listen for anyone navigating the future of work in the AI era.

In this episode, we discuss the following:
1. The Impact of AI on Jobs and Leadership.
2. Essential Skills for Thriving in an AI-Driven World.
3. The Role of Empathy and Trust in the Age of AI.

CONNECT WITH AMANDA:
https://www.linkedin.com/in/amanda-moriuchi/

CONNECT WITH SUSIE:
https://www.linkedin.com/in/susietomenchok/

CONNECT WITH JAMES:
https://www.linkedin.com/in/capps/

[00:00:03] Welcome to the Quick Take Podcast, the show where you get targeted advice and coaching for executives by executives. I'm Susie Tomenchok.

[00:00:12] And I'm James Capps. Give us 15 minutes and we'll give you three secrets to address the complex issues that are challenging executives like you today.

[00:00:22] Hey, welcome to Quick Take. I'm Susie along with my co-host, James. How are you, James?

[00:00:27] I'm proud to be here. Proud to serve.

[00:00:29] And I'm just so glad that I don't have to share the stage with just you. I have a good friend of mine and now a good friend of yours, Amanda.

[00:00:37] Tell us about yourself so that everybody understands the context that you come to this conversation.

[00:00:42] Of course. Yeah, so I'm Amanda. I own AppIt Ventures. We build custom web and mobile applications.

[00:00:50] Love it. Love it. And we were just kind of talking about what are we going to talk about today?

[00:00:54] And we were saying, like, so many people think that AI may replace you as an executive leader.

[00:01:01] And you said that kind of hit home. Tell us why.

[00:01:05] Yeah, Susie, you were asking me kind of what keeps me up at night.

[00:01:08] And this recent conversation about AI is the thing that's keeping me up at night.

[00:01:13] Because I've been told many times that AI is going to replace me.

[00:01:18] It's going to replace the job that I do.

[00:01:21] It's going to replace the work that we do on behalf of our clients.

[00:01:24] And that's a scary thing to hear, right?

[00:01:27] And so it's something that I've been thinking a lot about and watching closely, especially lately.

[00:01:33] And I have a lot to say on the topic.

[00:01:36] It's a fun topic. I think that like many things, when people bring up a topic like this,

[00:01:41] it's easy to gloss over the details and say these broad generalizations that your job, whatever your job is,

[00:01:49] I'm not even going to be specific. I'm just going to say your job will be replaced.

[00:01:52] And I think, sure, I would argue that with or without AI, all of our jobs will be replaced by evolution and change.

[00:01:59] But it's an interesting discussion around what AI is.

[00:02:03] As I think you said it earlier, this is the most disruptive thing we've seen, certainly in our work careers.

[00:02:09] And it is worth having an intelligent dialogue around it.

[00:02:12] Absolutely.

[00:02:13] And I do have to say, I kind of roll my eyes about how AI is the hot new thing.

[00:02:20] Because I've been doing this since 2009.

[00:02:22] And I think I was joking with you, James.

[00:02:25] I matured in my career at the same time that the app stores have matured.

[00:02:29] And for those of us that have been doing software development for as long as we have,

[00:02:34] AI in various forms has been around forever.

[00:02:39] I think what we're talking about right now are these large language models and the OpenAIs of the world,

[00:02:46] the Claudes of the world, and really reflecting on how these models have progressed, especially recently.

[00:02:53] I mean, the advancements in that technology have been significant.

[00:02:56] And I do think it is worth pausing and thinking about, okay, can this technology really replace my job?

[00:03:03] And what does that mean?

[00:03:04] And what are the skills that I should start thinking about acquiring now in the face of this disruption?

[00:03:11] I think it's less about the technology and more about the accessibility and the commoditization.

[00:03:17] You know, we could certainly talk about machine learning or heavy automation.

[00:03:23] And even machine learning was not exactly something that could be leveraged by my mom.

[00:03:28] You know, and so to have, you know, an oil and gas company seriously considering using AI versus the IBM 3B2 that they're using to do their current technology lift,

[00:03:40] that's a pretty big deal.

[00:03:41] And even if it's not as good as it could be, or even if it's only what we were doing three or four years ago,

[00:03:46] the fact that it's so accessible really is a game changer.

[00:03:50] Absolutely.

[00:03:51] I fully agree.

[00:03:52] I think where I'm kind of being like, wait a minute, that's crossed a boundary for me,

[00:03:58] is this concept of somebody saying that AI can replace me.

[00:04:02] Oh, the panic.

[00:04:03] Yeah, I agree.

[00:04:04] Yeah, no, it does make me feel nervous mostly.

[00:04:07] And I think for me, I've learned to really pay attention to when I'm feeling nervous,

[00:04:12] that's a sign to me to learn more and study more and really wrap my head around the situation.

[00:04:19] And that's where my feedback came from, like thinking about really what makes us human in the face of these disruptive technologies

[00:04:27] and being brave in leaning into those human skills, right?

[00:04:33] Knowing that a machine can never be human.

[00:04:35] A machine can mimic our humanity, but mimicking is different than original thought or original type of behavior.

[00:04:45] And I think that's why, James, you and I were kind of going back and forth a little bit about AI not being able to create something new.

[00:04:54] And what does that mean?

[00:04:56] So I think that's kind of why I'm excited to be around you guys today.

[00:04:59] The first question we should poke at is really, you know, when people say it will replace your job, right?

[00:05:06] And I think it's worth talking about really understanding, you know, what is your job to your point where the humanity and the human element is so key.

[00:05:14] It's so easy to say that, you know, as a leader or technology leader, your job is to do technology.

[00:05:20] Or let's just be really specific, software development manager, your job is to write software.

[00:05:26] Where I would argue that person who's saying AI will take that person's job doesn't know what they do.

[00:05:32] Because writing code is the least important part of their job.

[00:05:35] And that's the thing we need to really take note of.

[00:05:39] Absolutely.

[00:05:40] And, you know, it's funny.

[00:05:41] Long ago, I was talking about the importance of empathy in software development process, right?

[00:05:48] So when you have a product owner that's meeting with an engineering team, that product owner must have empathy for the customer.

[00:05:56] Because otherwise, the product they're designing won't meet the needs of the customer and the product will fail.

[00:06:01] It doesn't matter if the code is pristine and beautiful and perfect and not a single bug to be found.

[00:06:08] If it doesn't meet the need of the customer, it will fail at start.

[00:06:14] And that is the importance of empathy.

[00:06:18] And, you know, I've heard some of the biggest AI enthusiasts say that AI can start to develop empathy.

[00:06:25] And that's something that I actually fundamentally disagree with.

[00:06:29] Because I think you have to have real world experience to have empathy.

[00:06:34] And you have to have, you know, it's kind of like if you come across a person that, as the kids these days say, you vibe, right?

[00:06:44] Like you need to do a vibe check with somebody to make sure that they're somebody that you want to be around.

[00:06:50] There is something unspoken, a vibe and energy and intuition that is an exchange between two people.

[00:06:57] And I think for a lot of managers in this space that don't know how to put words to it, I think that's one of the things that we've missed in the great movement towards remote work.

[00:07:09] There is something there that's missing, and empathy and intuition and connection simply cannot be manufactured.

[00:07:17] So that's why I think AI will never replace that one element of our humanity.

[00:07:24] Interesting.

[00:07:25] I mean, Susie, we've talked a lot about the importance of the connections.

[00:07:30] We've been talking about the loneliness factor.

[00:07:33] I mean, do you think that plays into this somehow?

[00:07:35] Well, it's interesting because I'm not a software developer and I don't play one on TV.

[00:07:39] But as a coach and I'm in front of people, I started thinking about, can you simulate really powerful questions that make people think?

[00:07:46] And so I was putting myself in your kind of narrative.

[00:07:51] And I thought some of the art to being a good coach is knowing when to kind of slow down.

[00:07:57] And I also think when you think about virtual connection, can we get work done?

[00:08:01] Yes.

[00:08:01] And it's hard to put an ROI on what we're missing.

[00:08:05] But you can hear people say it's just so much better in 3D.

[00:08:10] And I think that when I think about somebody feeling seen, sometimes it's me one on one with somebody.

[00:08:17] I just met a woman today for the first time.

[00:08:19] And she said, I just felt from the very beginning that you got me and I trusted you so that I knew I could be really vulnerable with you.

[00:08:29] Those aren't words that I create.

[00:08:32] It is kind of this vibe, this vibe that we do as professionals, in knowing what we do well.

[00:08:42] Does that make sense?

[00:08:43] Sure.

[00:08:43] Am I in line with what software developers do?

[00:08:46] Because, to me, I don't understand what the other things are that they do besides write code.

[00:08:51] Well, what they do is the things you just exactly described.

[00:08:53] It is creating that interpersonal relationship, having the relationship with the vendor or with the customer and understand where they're at.

[00:09:00] So, yeah, I think I think that's a very valid component of this discussion.

[00:09:05] You know, something that's really making me nervous for some of our clients is the thing that is most remarkable about today's version of AI.

[00:09:17] So today's GPTs, basically, right?

[00:09:20] So what's most remarkable about this is the speed in which it will give you an outcome or an answer based on a question or an outcome based on a data input.

[00:09:33] Right.

[00:09:34] And so we know to be true that if you put truth into the system, you'll get a truthful outcome.

[00:09:41] But if you put dirty data or inaccurate data or missing data into the GPT, you're going to get a mutated or a distorted answer or outcome.

[00:09:56] So the thing that makes me the most nervous for my clients is, yes, it will get you down the path faster.

[00:10:04] But how do you know that path is the right path?

[00:10:07] Because it either gets you on the path to the promised land or to total utter destruction in an instant if you're not careful.

[00:10:15] And, you know, in growing AppIt, I've really had to learn that.

[00:10:20] You have to experiment with who are your customers?

[00:10:24] What do they want?

[00:10:25] What happens if the market shifts?

[00:10:27] How are you going to respond to that?

[00:10:30] And making sure that you, whether you're gathering data through a piece of software or you're gathering data from your salespeople or your customer service reps or you yourself are talking to your customer, that you're getting the actual truth.

[00:10:46] Because if you put wrong data into a model like that, you're going to get the commensurate wrong answer.

[00:10:53] And how we get to the truth requires nuance and empathy and trustworthiness.

[00:11:01] And we know that most people still do not trust these AI agents, these large models, these GPTs, because they don't know how it works.

[00:11:13] It's so new.

[00:11:13] And so how are you as an executive or a manager or a leader, how are you getting the truth from your employees, from your customers, from the market in such a way that you can accurately respond?

[00:11:26] I think that's the piece that if you as a manager or leader aren't thinking about that, you really should be.

[00:11:33] I agree.

[00:11:33] But this is where I want to make an interesting distinction.

[00:11:37] I would argue that even a human has a hard time understanding what truth is.

[00:11:43] And I would argue humans are flawed in that we do distill and filter and take input and then mutate that input based on our own giant large language model.

[00:11:55] And so while we could argue that humans are the more correct version of what these large language models are, I would also say we too are flawed.

[00:12:06] And we too are equally capable of misinterpreting data.

[00:12:10] We are equally capable of having hallucinations, if you will, using the data term.

[00:12:16] And we also have hallucinations, which are super fun.

[00:12:18] I highly recommend them.

[00:12:20] But the truth is that this is exactly why I think it's also possible that AI can create.

[00:12:27] Because humans are simply deriving product from their own inputs.

[00:12:32] Everything is a derivation.

[00:12:34] There are very few things that are genuinely unique.

[00:12:36] I don't know that anything can be unique.

[00:12:38] It's almost impossible.

[00:12:40] And I would argue that by that same definition of a human, then therefore a computer is doing the exact same thing.

[00:12:45] And maybe that's a very academic point of view.

[00:12:48] But I do question how it is that we create any differently than a computer does.

[00:12:54] All right.

[00:12:54] I have to tell you guys this story.

[00:12:56] So I, early in my career, I went through sales training.

[00:12:59] And my trainer told this story that at the time I kind of rolled my eyes, but today I really appreciate.

[00:13:07] The story goes, it's around five o'clock.

[00:13:10] Johnny walks into the kitchen.

[00:13:12] His mom's making dinner and he says, mom, can I have a cookie?

[00:13:15] And any parent knows what the immediate response is.

[00:13:19] It's like, no, Johnny, dinner's in 30 minutes.

[00:13:21] Get out of here.

[00:13:22] Right?

[00:13:23] So then the next day he comes back and he asks the question, mom, what time is dinner?

[00:13:30] And the, I guess, power in that story is that Johnny's question isn't the real question.

[00:13:37] And as an adult or as a parent, you want to get to the bottom of why your child is asking the question.

[00:13:45] So the moral of the story is never answer a question unless you know why somebody is asking it.

[00:13:51] And I brought that with me into software design.

[00:13:56] Right?

[00:13:56] So another kind of interesting story, there is a cookie shop nearby the house.

[00:14:01] I love cookies.

[00:14:02] Crunchy chocolate chip cookies are my favorite.

[00:14:04] And so they say, Hey, download this app, create a profile, and we'll give you a free cookie.

[00:14:12] Right?

[00:14:12] So I download the free app, but I'm not giving them my real information because I don't want them to spam me.

[00:14:20] I just want the free cookie.

[00:14:21] From that frontline employee's perspective, they've done their job, right?

[00:14:26] They got the app.

[00:14:27] They don't care.

[00:14:27] They don't care.

[00:14:28] And I've done my job because I just wanted the free cookie.

[00:14:32] But if you look at the data scientist who's sitting in the office somewhere in corporate, like 20 states away from us, they're like, okay, I don't understand why our sales are increasing, because we have all of this rich data and the things we're doing aren't making sense.

[00:14:49] And so from a software perspective, I tie that back to you have got to understand why people are giving you the information they're giving you.

[00:15:00] And giving a free cookie is not trust building.

[00:15:03] It's transactional.

[00:15:05] And until you've earned my trust, you're not getting the real me.

[00:15:10] Right?

[00:15:10] And so I think that's a perfect example of you have got to earn trust.

[00:15:17] And the only way you earn trust is by serving before asking for something in return.

[00:15:23] So, for example, a better way in that same scenario is, hey, we'd love to send you a cookie for every time you have a birthday.

[00:15:31] We'll send it to you.

[00:15:32] All we need is your address and your birthday.

[00:15:34] Okay.

[00:15:35] That's when you get my real information because you're serving me in exchange.

[00:15:40] And it's a subtle difference, but that subtlety is understanding why people do what they do and then you get the truth.

[00:15:50] And so I think to your point, James, yes, we do the same thing all the time.

[00:15:55] Just like any parent who's tired and, in that Johnny example, would say dinner's at 5:30, and he wanders off knowing, okay, I just saved myself a lecture.

[00:16:06] I'm going to ask for a cookie after dinner.

[00:16:08] Or 45 minutes before dinner.

[00:16:10] Yeah, that's what I'm saying too.

[00:16:13] But the thing is, humans run slower than AI.

[00:16:17] And so we naturally have the ability to course correct when we're not seeing what we think we should be seeing, versus AI, where the power is the same as the danger when making business decisions.

[00:16:34] Trash in is trash out.

[00:16:39] But with a human, instead, you can course correct over time.

[00:16:39] That's how I view it.

[00:16:41] I understand your point of view.

[00:16:42] I think that at the end of the day, just as you described, there are people you vibe with because they understand how to digest data much more quickly.

[00:16:51] You can read a room, that person walks in and they're pretty snappy.

[00:16:54] Then there's people that have taken you 20 years to get to know.

[00:16:59] And so not all humans are the same.

[00:17:02] And I will argue that yes, generative AI will not replace every human, but I bet it does replace some.

[00:17:10] It does.

[00:17:11] And in fact, I was just listening.

[00:17:13] I'm obsessed with Freakonomics.

[00:17:15] I don't know if you guys have read the podcast.

[00:17:16] Yeah, I listened to that podcast.

[00:17:17] So they have a newer episode talking about how 60% of the jobs that exist today did not exist even.

[00:17:25] Right.

[00:17:26] Yeah, that's a great response to that statement saying AI is going to replace your job.

[00:17:32] Well, AI will or something else will.

[00:17:35] But yes, 60% of them will get replaced.

[00:17:38] So it is a harbinger of death kind of thing.

[00:17:41] And I 100% agree with you as it relates to that.

[00:17:44] Yeah.

[00:17:44] So what are the new jobs that arise out of this as some more manual tasks or repetitive tasks are absorbed by technology?

[00:17:55] Go ahead, James.

[00:17:55] I would argue, of the three of us, the one most likely to have a job in the future will be Susie.

[00:18:00] Well, absolutely.

[00:18:03] Absolutely.

[00:18:04] Because Susie, you make it your job to understand the nuance of people on an individual level and help them pursue their path.

[00:18:12] Oh, no.

[00:18:13] I was just saying because she's a snappy dresser.

[00:18:15] No, it had nothing.

[00:18:16] Yeah, that's a good.

[00:18:17] Nothing.

[00:18:18] That's another great point.

[00:18:19] I hadn't really considered that.

[00:18:20] Yeah.

[00:18:21] She's also a real people person.

[00:18:23] But in this disruption, it's being aware of what are the skills that I have today?

[00:18:30] What are the skills I'm gonna need on the other side?

[00:18:33] And how can I get a head start in cultivating the skills that I know no matter what will be valuable?

[00:18:40] And I think that's the piece that's allowed me to sleep better at night is reflecting on, okay, what does AI do?

[00:18:47] What does AI not do?

[00:18:49] And how do I bridge that gap?

[00:18:52] And that's where you kind of carve out a space for yourself during this disruption.

[00:18:56] What do you think the three skills that we need to have are?

[00:18:59] I would say creativity, which James, I think you and I can probably dive deep on that one.

[00:19:05] But I definitely think creativity is a big one.

[00:19:08] Critical thought is another one because you have to be able to review the output of some of these models and make sure you're filling in the gaps.

[00:19:18] And then social connection, which we were kind of chatting about before we hopped on.

[00:19:23] But those are the three that I think are most important.

[00:19:26] So if I had to summarize it, it's come up with new ideas that AI is not coming up with.

[00:19:34] Fill in the gaps that AI leaves and don't be a jerk.

[00:19:38] Like that's pretty much how I'd have to say it.

[00:19:42] I love it.

[00:19:44] You know, this whole discussion has made me also think about just this topic.

[00:19:49] It's interesting how it's made us think about our world so differently and asking questions.

[00:19:55] Like it'd be really interesting to know how much further we've evolved just in our thinking by having this as a nucleus for us to discuss.

[00:20:05] Yeah. In relief, we are forced to almost examine our own humanity.

[00:20:08] I mean, when was the last time you had a discussion about what creativity is? Maybe it had been a very uncommon discussion.

[00:20:15] You know, that was not the typical cocktail hour discussion.

[00:20:20] But now it's a real one and it's happening every day.

[00:20:23] So it is.

[00:20:23] It's a real interesting time.

[00:20:25] Yeah.

[00:20:26] Also, too, as a mother of young kids, well, I have older kids and I have a kindergartner as well.

[00:20:30] And thinking about how kindergartners play and learn and interact.

[00:20:37] And I wonder if our schools and our educational programs are going to have to adjust to really prioritize that.

[00:20:46] Because when I was coming through school, it was a lot about critical thought and discernment.

[00:20:53] Like deciding what was fact, what was opinion, what was truth, what was not.

[00:20:57] And I do think some of that will have to carry through.

[00:21:01] But I mean, gosh, watching younger kids, how they play and how they imagine and how they do create, and they create weird things that I don't think AI really could.

[00:21:14] Like, James, if you want to talk about like our debate on can AI create?

[00:21:19] I'm like, listen, talk to my five year old and let's see what real creation is.

[00:21:25] And it is fascinating to watch younger children and how they interact before they've been hardened in this way that we try to train them how to think.

[00:21:36] Yeah.

[00:21:36] Totally true.

[00:21:37] So true.

[00:21:38] All right.

[00:21:38] So, James, what is your big takeaway from today before we go?

[00:21:42] A big and very interesting conversation around the skills that really differentiate you in any role.

[00:21:47] And how do you, as we look down the gauntlet, down the gullet, down the barrel of AI, evaluate the true skills that we bring to the table?

[00:21:58] And, you know, this will be an outcome where there will be winners and losers.

[00:22:02] But that's no different than any other situation we have been in in the last 50 years.

[00:22:07] But it is good for an executive to take a look at their skills, take a look at what they really bring to the table and start to work on those.

[00:22:14] Because those are the ones that will keep you at the front and center.

[00:22:17] Yeah.

[00:22:18] Well, Amanda, thank you so much for being here.

[00:22:20] Tell people how to find you or what you want them to know about you.

[00:22:24] You guys, I think I'm a dinosaur now because email is my favorite way.

[00:22:30] Like, don't text me.

[00:22:31] I never respond to text messages.

[00:22:34] I am a victim of the AI bots on LinkedIn.

[00:22:37] So I don't go to LinkedIn anymore.

[00:22:40] But email is my thing.

[00:22:42] Like, I just, I really enjoy hearing from people and connecting that way.

[00:22:47] So, yeah, I'd love to hear from you.

[00:22:50] It's Amanda at appitventures.com.

[00:22:53] And if you spam me, I'm not going to respond.

[00:22:56] But if you send me a personal note, I will.

[00:22:58] And, you know, let's be human in the age of AI and email bots.

[00:23:03] I love it.

[00:23:04] So good.

[00:23:05] Thank you for joining us.

[00:23:07] It was such a pleasure.

[00:23:12] Susie, I have a question for you.

[00:23:14] Yes.

[00:23:14] What was your favorite pet and why?

[00:23:17] Oh, I feel bad naming one.

[00:23:19] It's like naming your favorite child.

[00:23:22] My favorite pet, if I had to pick one.

[00:23:25] You do.

[00:23:25] And it's not my favorite.

[00:23:27] And I don't think Charlie's in the room, so she can't hear me.

[00:23:30] My dog, Dixie, because you could have her off leash and she'd stay by you.

[00:23:36] And she always, like, was there.

[00:23:39] She didn't demand too much.

[00:23:41] But she was always right by me.

[00:23:43] And I remember one time I had to run up to get something and she was sound asleep next to me.

[00:23:48] And I just wanted to tell her, like, I'm going to be right back.

[00:23:51] Stay right here.

[00:23:52] Stay right there.

[00:23:52] Just stay asleep.

[00:23:53] I'm going to be right back.

[00:23:54] I'm going to be right back.

[00:23:55] I want you to run all the way up the stairs with me.

[00:23:57] And there she was right next to me.

[00:23:59] And I felt so bad.

[00:24:01] But she was really, like, her spirit was really sweet.

[00:24:05] What kind of dog was she?

[00:24:06] She's an Australian mix.

[00:24:09] We rescued her.

[00:24:10] She was a good one.

[00:24:11] I feel like I met that dog.

[00:24:13] Yeah, she was black.

[00:24:14] Do you remember?

[00:24:15] Really fuzzy.

[00:24:16] She's really pretty.

[00:24:17] Yeah.

[00:24:18] Yeah.

[00:24:21] Thanks for listening to this week's episode of Quick Take, where we talk about the questions

[00:24:25] that are on the minds of executives everywhere.

[00:24:27] Connect with us and share what's on your mind.

[00:24:30] You can find us on LinkedIn, YouTube, or whatever nerdy place on the internet you find your podcasts.

[00:24:35] All the links you really need are in the show notes.