As we evolve at a whirlwind pace, primarily driven by technological advancement, it’s essential to pause, step back, and look at the broader picture. Today, the debate around humanity and technology is more relevant than ever.
In a world where the average person checks their phone 144 times a day, and 69% have texted someone in the same room as them, the rapid proliferation of technology in every sphere of our lives raises an important question: Is our increased dependence on technology disrupting the essence of our humanity?
Joining the show today are Raj Goodman Anand, CEO of Goodman Lantern, and Jane Peacock, Digital Strategist at Partners in Digital, to explore the intersection of human connectivity and digital evolution.
Before we venture into this exploration, it’s crucial to understand that technology is a product of human ingenuity. It was born from our inherent curiosity and creativity to make our lives easier, more efficient, and more enjoyable. However, it’s true that any tool is only as beneficial or detrimental as the hands that wield it.
The current digital era, characterized by the ubiquity of smartphones, social media, AI, and other advanced technologies, has opened Pandora’s box of opportunities and challenges. The benefits of technological innovations are evident – they have revolutionized communication, democratized access to information, propelled economic growth, and much more. Yet, there’s a growing concern that our increased reliance on technology might be impacting the very fabric of human connectivity.
While technology brings us closer to those geographically distant, it can also create an emotional void and psychological distance. The irony of our times is that we’re more connected than ever, yet many of us feel increasingly isolated. It’s not uncommon to find families in a living room where everyone is engrossed in their digital world, creating a paradox of being physically present but emotionally absent.
Moreover, the art of face-to-face communication risks withering. The richness of human emotions, the subtle nuances of expression, and the profound bonding of personal interactions are irreplaceable. Are we losing these invaluable human experiences in the quest for digital convenience?
Additionally, the incessant bombardment of information and the pressure to be ‘always on’ can take a toll on our mental health. The constant urge to keep up with our digital lives can lead to anxiety, stress, and depression. A delicate balance must be struck to ensure that technology serves us, not vice versa.
But how do we strike this balance? How do we ensure technology enriches our human experience rather than diluting it?
The answer lies in the conscious consumption of technology. We must strive to use technology as a tool for our advancement rather than letting it control us. Setting boundaries and taking deliberate breaks from our screens is crucial if we are to connect with our surroundings and the people around us.
Education can also play a significant role in helping us understand the impact of technology on our lives. Through education, we can leverage technology effectively, equipping ourselves with the knowledge to use it in ways that augment rather than replace our human connections.
Moreover, in design and development, empathy and ethical considerations must guide technological innovation. Developers should aim to create technologies that enhance human interaction and address our innate need for genuine connectivity rather than replace it.
Technology, as an extension of human capability, should enrich our lives, amplify our potential, and bridge gaps, not create them. It should offer a platform for creativity, collaboration, and sharing, fostering the ties that bind us. As we navigate this digitally connected world, remember the power of a simple human touch, a heartfelt conversation, or a shared experience. After all, our humanity is not just about being human; it’s about connecting, relating, and bonding.
Hey everyone. Thanks for listening. As we evolve at a whirlwind pace, primarily driven by technological advancement, it's essential to pause, step back, and look at the broader picture. Today, the debate around humanity and technology is more relevant than ever. In a world where the average person checks their phone 144 times a day, and 69% have texted someone in the same room as them, the rapid proliferation of technology in every sphere of our lives raises an important question: Is our increased dependence on technology disrupting the essence of our humanity? Joining the show today are Raj Goodman Anand, CEO of Goodman Lantern, and Jane Peacock, Digital Strategist at Partners in Digital, to explore the intersection of human connectivity and digital evolution. To support the show, visit chrishood.com/show, subscribe to the show on your favorite podcast platform, follow us on social media, or email me directly at [email protected]. I'm Chris Hood, and let's get connected.
Voice Over (01:09):
Connecting. Access granted. It's The Chris Hood Digital Show, where global business and technology leaders meet to discuss strategy, innovation, and digital acceleration. 5, 4, 3, 2, 1. Your digital evolution starts now. Here's your host, Chris Hood.
Chris Hood (01:41):
Has technology replaced our human connections? Let's find out and meet our guests. Raj, would you mind introducing yourself?
Raj Goodman Anand (01:48):
Thank you, Chris. My name is Raj. How would I describe myself? I'd start by saying that I'm a father and a husband based in the UK. I run a business called Goodman Lantern; it's my third business. In the past, I've sold one company, which went from zero to acquisition, and took another business from seven and a half million to 45 million in revenue. I'm a geek who loves technology and marketing, and I got into marketing by chance. Goodman Lantern is a content marketing agency with clients and teams across five continents. A big part of our company is our mission: to empower women in technology and marketing, so much so that 80% of our organization are women. I do it to leave a legacy and, hopefully, to change the world for the better.
Jane Peacock (02:43):
Fantastic. Thanks, Raj, and thanks, Chris. My name's Jane, and I run a boutique consultancy called Partners in Digital. We focus on customer-led digital transformation: helping organizations put the customer in the middle of everything they do by embedding that person within their strategic plan. We look at creating a human experience that is surprising, delightful, friction-free, and that helps customers get where they want to go. It's often a co-creation process: we bring everybody into the room, often including people who aren't usually involved in that conversation, develop the roadmap for the organization, and underpin it with a real focus on capabilities, often bringing in partners, consultants, or coaches to help build capabilities within the internal team so they can continue to drive the process forward. It's all about empowering and enabling the internal team. A little personal stuff about me: I'm a mom. I've raised two boys, and I think I learn more from them every day as they get older, lessons I can take into my roles as a leader. And it's interesting, Raj: you said you're a geek who loves marketing; I'm a marketer who loves tech. So hopefully there'll be some interesting conversations as a result.
Chris Hood (04:19):
Well, I love that you both included some of the human side of your backgrounds. Maybe that's a good place for us to start today on humanity and technology. How do you think technology has influenced or driven the change in culture that we see today?
Jane Peacock (04:40):
Oh, wow. I'd probably look at the difference between generations. I'm really interested in bringing different generations to the strategic table for that reason, because they've grown up exposed to so many different things. My boys' generations, the Gen Zs and Alphas, have been exposed to everything since they were little. That means they can see everyone, of different shapes and sizes and cultures, from all around the world, and that informs who they are. If you look at young kids today, they are actually really passionate about diversity and inclusion, and really passionate about sustainability, simply because technology exposed them to so much more than maybe my generation was.
Raj Goodman Anand (05:27):
I completely agree, Jane. I think that's a really interesting point. Just a fun fact: I studied AI at university and did research in AI, so I always think about things from the perspective of both technology and human beings. I think there's a fundamental shift happening today, especially with more and more people getting to know about AI and what it can do. When I was at university doing research in AI, my mom thought AI was the study of UFOs. Compare that to today, when everybody knows what AI is, or at least understands a bit about ChatGPT, and some might have heard of Midjourney, for example. I think we're in a major shift. During the industrial revolution, for the first time ever, technology moved people who worked at home, knitting, for example, or craftsmen building things, or working the farm, into the factories and offices of today.
And now, with technology, we're moving them back into their homes, freelancing, for example, or working remotely. So in one way, we're moving forward in how we think and talk about the changes technology brings to our lives, but in another way, we're actually going back to what we did before the big industrial revolution, before the factories and the big machinery came into play. It's hard to say which is faster; we're going fast and slow at the same time.
Chris Hood (07:09):
Do we see other areas where that kind of circle of life is materializing due to technology?
Raj Goodman Anand (07:16):
From my perspective, technology is the major player making it possible. It's become cheaper and faster to build things and to have them in our homes and offices, which allows us to work remotely. I mean, imagine 20 years back: this conversation could only have happened if we were in the same room. Now we're effectively on three different continents, linked by the internet and a web browser, which I think is phenomenal. What's also interesting is that this is all content. It's a podcast: some people will consume it via YouTube, others via their podcast player. But fundamentally, content has been around for centuries. Famously, Napoleon used press releases to raise money for his war campaigns. Today the medium is different: perhaps Jane's kids are using TikTok or Instagram to communicate, and we might be doing more on LinkedIn, for example, but it's still content. The form has changed, but the message and the value of the content are still there.
Jane Peacock (08:27):
Something I'm noticing, or at least in the conversations I'm having: technology, in the marketing team context, has been all about efficiency and productivity. So over time we've seen marketing teams getting busier and busier and doing more and more. Now, with artificial intelligence coming in, a different conversation is happening: how do we actually use technology to create the space to do more of what matters to us, and therefore, what does matter? The conversations we're having are really about the human interaction that matters, the connection we're building, and how we can use the space and time we've now got to really focus on that as a business and as a team.
Chris Hood (09:21):
From both a marketing and customer experience perspective, has AI impacted the human connection that we typically see in those types of services?
Jane Peacock (09:33):
AI and chatbots have been around for a long time in different forms, and I definitely think that when organizations jump to use them from a tech-led perspective, as in "I'll use the tech to create efficiency within the customer experience," it often impacts the experience negatively, because they're not considering the customer on the other side and what that customer actually wants from the experience. If we flip it and look at what the customer really wants on that journey, there are human interventions they still want, and will likely always want. So then it's a question of: at what step, in say a hundred steps in that journey, can we use AI to remove the low-value stuff and remove friction? I think there are huge opportunities to do that.
Raj Goodman Anand (10:25):
Yeah, I agree with Jane. When I first saw AI in people's hands, I thought, wow, are we going to replace human beings with AI? And I think we all agree that AI cannot actually replace human beings; it will just make us more efficient. Equally, if you go and buy products, a bag or some shirts, we still value handmade goods. If you look at luxury goods, for example, they'll often say: this is handmade, it's made with love, so-and-so made this amazing piece of art. The person's name is there; the connection and the story are there, and that will perhaps never change. We always want to be linked with human beings. It's also often clear, when content or information goes out, whether it's AI-made or human-made, and when it's human-crafted or human-created, the value is definitely higher.
In fact, even today, to use the factory analogy again, the end product we produce in factories still goes through human QA. It's a real thing. My first job out of university was at the Parker Pen factory. My job was to look over each Parker pen, make sure there were no dents or marks on it, and, if it passed that QA test, put it into the other pile. There was no machine that could do that job better than a human being. I see the same in our organization today: the majority of the real creative work is still human-led. The research part can be done with AI, but the human element is so important in that journey.
Chris Hood (12:14):
Those roles that you’re talking about. And I think there’s lots of them that we would argue only a human could do. But obviously as technology is enhancing our lives, what do we think we need to be aware of to ensure that technology doesn’t completely disrupt what we believe to be our human experience?
Jane Peacock (12:35):
That’s a really big question, and it’s one that, that I’ve been p pondering at the moment, even with, with my family, my boys, the, the teams I work with. And I think it really comes down to, you have to be really conscious on the experience that you want from the technology and be really incredibly aware of it. And part of that is, is, you know, I personally am going on an education journey myself to understand like, deeply what AI is and what it might do to my own role and what I want to then be holding onto what’s the human part of my experience or the human part of my role that I really wanna hold onto and that I find really valuable. And I think we all have to do that. There’s a, there’s a great talk, I don’t know if you guys saw it, Esther Perel who really talked, she didn’t a TED talk on intimacy and she did a talk recently called the other ai. So artificial intimacy. And, and I think that ultimately is a bit of a key to what is missing when, when we look at artificial intelligence, it’s the intimacy, it’s the humanity between people and between things. And, and we can definitely maintain that if we’re conscious of it.
Raj Goodman Anand (13:44):
As I think about it more, and I've spent a long time thinking about large language models, LLMs, like the ones we have from Google or OpenAI, all these ChatGPT-like platforms: they're very smart. So far they've been helping, or distracting, us on YouTube, Facebook, and Google by suggesting things to us, because they predict we'll consume this video more than that one, for example. That's our bias; that's how they've been used. But the flip side is this whole emergence of companionship through chatbots. Recently there was a lady, I don't know where she was based, but she's a designer, and she said she's going to marry an AI bot, because she felt more connected to the AI bot than to any man she was ever with.
And I thought that was really interesting. I don't know if that's going to become the norm or if it's just a one-off, but there is an element of it. I mean, before I met my wife there weren't any online dating apps, so I don't know much about them; I've only heard about the phenomenon of swiping left, sorry, swiping right. But what I can say is that there's an element of connecting with people through these applications: you chat with people, you communicate with them, you believe they are real people, and a lot of the initial interaction happens there. For me personally, though, the chats, the videos, the Zooms cannot replace face-to-face. I really missed that during the pandemic. And I think that's the human element of me: I want to meet people face to face. I can do the small talk and the initial communication online, but without the face-to-face interaction, without maybe having a meal or a coffee together, I just can't bond the same way. Maybe that's the element for me, but time will tell what happens in the future. I think it's still really early.
Jane Peacock (16:02):
Raj, I think there's a generational difference. When I look at my experience and what I seek, I've definitely grown up seeking out human interaction as a priority. Whereas I look at my 15-year-old son, and he's quite happy to live in Web3 and spend all of his money there, and to create relationships and friendships that feel really meaningful to him, and strategize over what they're going to do in the game. Really deep, collaborative stuff. My generation, and I'm a result of that, values human touch, human connection, eyeball to eyeball, in person. So I think generationally there's a bit of a difference.
Chris Hood (16:51):
You both touched on intimacy and that human connection, and I think where we're still struggling with technology, or at least artificial intelligence, is simply the authenticity of the relationship. When we're in person, we can more or less judge whether the conversation is authentic, whether it's meaningful to us. Yes, I agree that some of that can be captured online, but from a purely technical perspective, it still comes across as automated and at times inauthentic.
Jane Peacock (17:33):
Chris, one of the interesting conversations I had recently: I'm working with an online retailer at the moment, and they reported that a lot of their customers are actually afraid. Is it a real person? There's a lot of negative thinking, assuming it's not a real person even when it is. So there's fear, and I think it's important to understand how the customer feels within this relationship, to start a conversation about it, and to be willing to step away from AI or chat when it's not valuable or creates some sort of negative impact for the customer.
Raj Goodman Anand (18:10):
Yeah, I'd agree. Take the example of Web3, where you have augmented reality, or you have a VR headset on and you're looking around while the other person has an avatar in the video game. There's a feeling we have when we interact with real human beings, a feeling of being connected, of joint experiences, and even when someone is represented as an avatar, we still get that feeling. For me, maybe it's the physical, in-person experience; for some it might be different. But there's still that feeling of being with human beings, which is a little different. It can be mimicked by AI, and AI will get better at that as we reach the next stage, AGI, artificial general intelligence, sort of the top tier of artificial intelligence.
When that happens, we're going to have a separate challenge. But for now, I think we still bond when we put teams together in a shared experience; as Chris was saying, that kind of experience is very authentic. But it's scary. I think it's very scary because, even if AI doesn't have emotions, it can mimic them. It can learn that when you're sad you say this, when you're happy you say this, and when you're unsure you say this. So it can have its own way to iterate or simulate emotions, and that's the scary bit. At some point we might come back to the same thing: some of us will value the fact that we're together in real life, which is more authentic. But again, time will tell.
Jane Peacock (19:58):
Just touching on your point there, Chris, about intimacy: one of the things we focus on is this idea of customer intimacy through your digital ecosystem. In person, I can look at you, Raj, or Chris, and I can see you're nodding, or leaning in, or smiling, or taking a breath, and I can respond to that body language. In the digital context, there's trigger data we can look at that shows us: is the customer leaning in or leaning away? Are they interested or not? We can use that to deepen the intimacy digitally. It'll never replace human to human, but we can still make the effort, and use our data to get those insights and that feedback to really transform the way we share our stories.
Chris Hood (20:51):
Within that sharing of our stories. What type of ethical considerations are we concerned about in this age of technology?
Jane Peacock (21:01):
Well, for me it's really around who owns the data and who has created the language model. I think you touched on it, Raj: are there biases built in? Only recently I was reading about built-in racism, so those are the kinds of things I worry about. I don't have a lot of knowledge and understanding of that, but those are probably my concerns. And then there's the speed with which we've started using ChatGPT. I've worked in tech for two decades now; the internet took ten years for a lot of people to get on board, and ChatGPT took three months, without anybody having to influence people or get them to use it. It was just suddenly being used in organizations without any governance or framework around it to question: is it right, is it wrong? Is there a bias or not? Should I be sharing this content within this open system or not? Those things haven't been set up, and that worries me a little bit.
Raj Goodman Anand (22:13):
Yeah, I'd agree with that. For me, the first step would be for tools, applications, or networks to clarify whether what users are dealing with is actually AI. Number two is the data we provide as we interact. Is it ours, or does it belong to them because we're on their platform? Are they recording that data to train their algorithms or their AI? This, for me, is a challenge: if you're recording my voice, for example, you can use tools to simulate my voice, my face, my emotions, and basically mimic me completely. So who owns that data as I happily interact with that tool on their platform, on Facebook, YouTube, LinkedIn, et cetera?
I think that's the beginning of the ethical question. But then, as Jane mentioned, how are we training the big models, and who owns that? If you're using Picasso's art to create Picasso-like art, is that copyrighted or not? Is it inspiration, or something else? Where does it actually sit? And art is probably still the easier case. How about writing? With writing, it's hard to even pinpoint who owns what. If you're making a clone of Harry Potter, for example, you've obviously gone to the books and used them, so what happens there? It's very difficult to pinpoint which bit is owned by whom. I think the data owner should be able to decide whether they want to part with their data or not. On the web, for example, there's robots.txt, which is used to tell bots from Google or other search engines whether they may come to a site and crawl its data. We need something like that for AI as well. So there's a lot to go through. We're still at the tip of the iceberg, and there's a lot to dive deeper into to build something much more robust for the future.
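To make the robots.txt idea Raj raises concrete, here is a minimal sketch of what an AI opt-out might look like using the existing convention. GPTBot (OpenAI's crawler) and CCBot (Common Crawl's crawler) are real published user-agent names, but the exact names each vendor honors should be checked against their documentation:

```text
# robots.txt sketch: opt the site out of AI training crawls
# while still allowing regular search engines.

# OpenAI's published training crawler
User-agent: GPTBot
Disallow: /

# Common Crawl's crawler, whose archives feed many training sets
User-agent: CCBot
Disallow: /

# Everyone else (e.g., ordinary search indexing) may crawl
User-agent: *
Allow: /
```

Note that robots.txt is purely advisory; a crawler can simply ignore it, which is why Raj argues something more robust is needed for AI.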
Jane Peacock (24:44):
On AI and art, there's definitely been a lot of conversation recently around the point you were making, Raj. If I've created all of this art, and suddenly an AI tool like Midjourney is using it to create something very similar, that creates a problem for the original artist. If we want to nurture and retain all of these amazing human skills that have been developed over a long period of time, we need something in place that protects that artist and their IP, and it doesn't exist at the moment.
Raj Goodman Anand (25:23):
Yeah, I agree. Is it the tension between inspiration and copying? The question is where you draw the line. In the past, if I used something for inspiration, I would just be inspired by it; now I can simply copy it through AI. So where do we draw the line? That's the bit I find very challenging.
Chris Hood (25:46):
I think generative AI is one thing, but let's take it to a more human perspective and change the scenario a little. Imagine a psychiatrist: would we be comfortable sitting down with an AI psychiatrist who is trying to nurture some sort of human response from us?
Jane Peacock (26:08):
The Esther Perel example is a really good one here: they created an AI of Esther Perel, and it lacked that deep human intimacy. I think any role where you need the ability to sense, like you were saying, Raj, that energy exchange when humans are together, brings so much more value, and that was missing. So for me personally, I'd still go to the human psychiatrist. Doctors are a whole other question, though, because I'm totally keen to use AI for health-related things.
Raj Goodman Anand (26:48):
Chris, I’ll answer you question by saying like, by just kind of remembering the fact that I read somewhere that while on a plane, most of it is not flown by human beings, it’s actually flown by programs. Like we trust our lives, you know, with ai like tools to keep us safe midair where we have nothing else, but that we have pilots there, but they’re probably to engage and they wanna just be used for take off and landing, for example. So for me, you know, that’s an interesting scenario. But the other one is that recently my uncle was not feeling well and you know, he was in the emergency for a few few days and I was getting reports from the doctors on what’s going on, and they were just too busy and I was getting reports from the, the, the doctors and like lab reports, for example.
Now, I had no idea what those meant. So I scanned the reports from image to text using an online tool, passed it all through ChatGPT, and asked it to explain them to me like I'm a five-year-old, or a ten-year-old, or a layman, whatever it might be. And it could really get down to the detail of what was going on; I could explain to my uncle what his issue was. That, for me, gives a more human perspective on health challenges. So for some things, like takeoff and landing, I prefer pilots, but mid-air I'm happy for robots and AI to take over. It's horses for courses, right?
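The workflow Raj describes can be sketched in a few lines. This is a hedged illustration, not his actual tooling: it assumes pytesseract for the image-to-text step and the OpenAI Python client for the explanation step, and the model name is a placeholder. The heavy imports are deferred so the prompt helper works on its own:

```python
def build_prompt(report_text: str, audience: str = "a layman") -> str:
    """Wrap OCR'd lab-report text in an 'explain it to me like...' request."""
    return (
        f"Explain the following lab report to {audience}, "
        f"in plain, non-technical language:\n\n{report_text}"
    )

def explain_report(image_path: str, audience: str = "a layman") -> str:
    # Deferred imports: pytesseract, Pillow, and openai are assumed
    # stand-ins for whatever online tools Raj actually used.
    import pytesseract
    from PIL import Image
    from openai import OpenAI

    # Step 1: image -> text (the "scanned those reports" step)
    report_text = pytesseract.image_to_string(Image.open(image_path))

    # Step 2: text -> plain-language explanation via an LLM
    client = OpenAI()  # reads OPENAI_API_KEY from the environment
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[{"role": "user",
                   "content": build_prompt(report_text, audience)}],
    )
    return response.choices[0].message.content
```

Changing `audience` to "a ten-year-old" adjusts the reading level, which is exactly the knob Raj describes.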
So there are some things I prefer AI to help me with, and other places where I prefer human beings. What's interesting for me is that I often think about the movie Iron Man, where Tony Stark is navigating his screens and asking his bots to handle different things: bolt this on, get my car ready for me. I think of AI that way: we're heading into an era where we'll have AI with the wisdom of the whole world packed into our garage or our computers, and it'll be there to help us out. I don't think of AI as just a database. Sometimes people say, oh, ChatGPT is stuck in time, 2021 is where it stops. Yes, that's true, but it also has all that logic behind it, and that logic is very powerful. I use it for its strengths, not its weaknesses.
Chris Hood (29:26):
We all have mobile devices in our hands pretty much constantly every day, so it's not far-fetched to believe that in the near future we'll have personalized AI that travels with us in some form, whether on our phones or something else, with that level of communication and insight, able to personalize and customize experiences no matter what store we walk into. So I definitely think the Iron Man example is a great one. For our final question before we wrap up: what challenges or hesitations might you have about technology as we look toward the future?
Raj Goodman Anand (30:09):
I think there is a real challenge. Especially with AI, I think, because AI is the one where we can't actually pinpoint, in the programming language, what the logic behind it is. For example, I'll give the example of ChatGPT again. GPT-2 and GPT-3 were okay, then suddenly 3.5 was far better than the previous versions. And it's bizarre, because not even the scientists knew it would be that good. For example, they trained ChatGPT-like tools to answer questions in English, trained them all the time, and suddenly one version could answer questions in Persian. How does that happen? No one really knows. And I think the challenge is that we don't know how far AI really is.
We might be able to guess some of this, but we don't know accurately where it is. As we train the different versions or models of AI, they can get far better, and that could create challenges for us. And this is why there's a whole focus on stopping the training from going that quickly. Obviously Elon Musk and other people have said, let's slow it down a little bit, because we don't really know how fast it is or how far it really has gone. So that, for me, is a real challenge. We need to start thinking seriously about how to put safeguards in so that one day it doesn't, you know, take us over, basically, in one form or the other. That, for me, is the biggest challenge. I think the second challenge is the dependence of our lives on technology. You know, what will we become in the future? Moving from an abacus to calculators maybe made us a little less sharp, potentially, but ultimately it's more efficient. So is it about being challenged intellectually, or is it about being more efficient? Where is the balance set?
Jane Peacock (32:20):
Oh, that's great, Raj. I just noted down that the most current, topical challenge for me is really around the education system. In Australia, one of our states has banned the use of AI in schools. The system within education, and I don't know how it stands where you're based, is falling behind, and so it's really causing our kids to fall behind as well, rather than education embracing AI and then helping our kids learn how to use it. So that's one challenge. And I think the second one really comes down to ourselves. There's a moment in time where we're being disrupted by this new technology, I think unlike any other time in history that I've experienced in my lifetime. And are people really considering how this is disrupting them as a person? Whether I'm someone working in a store, or a mortgage broker, or a consultant like me, or someone working in marketing: how is it disrupting me, and how will my role change in the next five years, so that I can start moving in that direction?
I think people aren't thinking about that, and I think it's moving so quickly. You know, there's a lot of talk about people losing jobs, and that concerns me, but there's not a lot of talk about what comes next. A lot of the time I spend with my teams is asking: how are you going to change? How are you going to evolve? How are you going to disrupt yourself, so you're ready for the role you need to play in, you know, a year's time?
Chris Hood (34:01):
All of this is fascinating. What you both said is fairly inspirational, and it gives us things to look forward to in the future. Thank you, Raj, I appreciate it.
Raj Goodman Anand:
Thank you, Chris, for the opportunity. Thank you, Jane.
Chris Hood:
And thank you, Jane.
Jane Peacock (34:14):
Thank you, Chris. Thank you, Raj. It was really, really fun.
Chris Hood (34:18):
And thanks to all of you who are listening. If you like what you heard, please subscribe to the show on your favorite podcast platform and leave a review. Your feedback helps us improve, grow, and reach a wider audience. If you have any questions, comments, or ideas for the show, you can connect with us through social media and online at Chris Hood Show or chrishood.com. And please share this episode with your friends, family, colleagues, or anyone else looking to grow their business and start their own digital evolution. Until next week, take care and stay connected.