Join Roxane Heaton, CIO, Macmillan Cancer Support, as she discusses digital inclusion, innovation and leadership.
Lee Rennick: Welcome to CIO Leadership Live UK. I'm Lee Rennick, Executive Director, CIO Communities, and I'm very excited to introduce and welcome Dr. Roxane Heaton, CIO, Macmillan Cancer Support. Roxane, please introduce yourself. And could you tell us a little bit about your current role?

Roxane Heaton: Thanks so much, Lee, and lovely to be here today. As mentioned, my name is Roxane Heaton, and I'm the Chief Information Officer of one of the UK's largest cancer charities, Macmillan Cancer Support. I'm responsible for technology, digital and data optimization. We are 98% funded by fundraising, and we're a team of 250, so it's really important we spend our money wisely for people living with cancer.

Lee Rennick: Well, thank you so much for sharing that, and I really appreciate you joining us here today. Roxane, we've created this series to support diversity in technology and to listen to women working in the sector who are building and supporting DE&I. This year the IWD theme was #EmbraceEquity. So the first question: can you please tell us a little bit about your own career path and provide some insights or tips from that road? As a woman especially, are there any lessons learned that you could share?

Roxane Heaton: It's a great question, and I think it's applicable to any sector. I went to an all-girls boarding school and then straight into university to do engineering, where there were only seven women on a course of nearly 100. So it was straight into a very different world. And, especially as someone who has a stammer, I'm always aware of different superpowers. Everyone's got superpowers, and it's about maximizing those, because we're all very different. I've had a squiggly career, which I think is hugely valuable, especially in discussion with people: we're all different, and we all bring different perspectives to the table. I started my career in the Royal Navy after a quick stint in banking, which got me really hungry for making sure we did great things with money. I spent 12 years in the Royal Navy, which really embedded my instinctive ambitions and behaviors around teamwork: being a team member first and a leader second. Because actually, leadership is everyone's responsibility. I was really lucky to jump from there to Morrisons, one of the UK's largest supermarkets, which is an end-to-end retailer — everything from farm to fork. Hugely interesting, and I hope I can touch on some insights from there later. Everything I learned through all of that showed me that whatever skill you bring, you can bring a different view. Finding allies and people who could champion you was, I think, hardest early on. But it's been super interesting, as I've gone on in my career, to find that the people I really learn from are those around my network. That's where I get my greatest learning, but they're my greatest allies too, and it probably took me a little while to realize that. So building your relationships across networks is the best thing you can do.

Lee Rennick: Well, I really appreciate you sharing that, and it segues really well into the next question. I was researching some data around women and technology in the UK, and some new data reveals that the proportion of female employees in UK tech has declined for the first time in five years, by over 2%. This has also happened in Canada, which is very interesting, so it looks like it might be a bit of a global phenomenon. The study also revealed that under 15% of IT directors in the sector in the UK are women. You are a big supporter of diversity and equity, so I was wondering if you could share what you believe organizations could do to support diverse workforces.

Roxane Heaton: Gosh, it's a really concerning stat, isn't it? Especially after so much work by so many people across the globe to increase that — female leaders, female CEOs and definitely women in technology. It's really concerning, and why it's happening is something to really unpack. But it's not only concerning for the workforce we have; it's concerning for the solutions we're designing. That's why it's so important that we address this. We must normalize the conversation. A big thing I try to do is talk in business outcomes and make sure the conversation is really open to non-techies, as I call them, because they add just as much value, if not more. If you talk about the user rather than the technologist — because everybody uses technology — you can hear from everybody. It's about getting in early as well, and making it relevant for people, back to normalizing: helping people understand what value they can bring to the technology sector. But that alone isn't good enough. So we've worked really hard on creating new routes — sideways routes through retraining — and on realizing that not everyone's a unicorn. As long as we can talk about clear development plans, and that's for everyone, not just women, we can support everyone to grow. We now have 40% women in our technology organization, which is fantastic, and 50% women in my leadership team; we've gone from almost one out of 12 to 50%. But we must keep pushing that and having those role models. There's an old phrase that you can't be what you can't see. So we need more diversity across lots of different demographics, at all the different stages of people's careers, and we need to understand the traps — that 2% decline you mentioned; let's work out why. So yes, it's really important.

Lee Rennick: Yeah, I think it is. And given that you started your career in engineering, I think there's that ability to look at how younger women are looking at STEAM versus STEM and bringing in the arts. You talked about the soft skills, making sure we recognize people's skills outside of technology; that seems to be trending a lot in these conversations I'm having. We want people to bring in skills, and a lot of the male allies I've been speaking to in the tech industry have been saying: look, we know that women don't apply for jobs if they only have 50 or 60% of the skills listed — they don't think they should, whereas men will. So male allies who look at the applications that come in and encourage women to apply — I think that's such an important part. And just as you said, having this dialogue is so impactful for me. Every time I speak to a woman working in the sector about her career and where she's come from, I gain some insights myself, so I really appreciate you sharing that.

You mentioned at the beginning that you've had a career in various sectors, and a lot of CIOs I speak with talk about knowledge gained by working across sectors and the key learnings in their careers through that. You've worked in both the private and public sectors, and now you're in the charitable sector. So could you please provide some insights on how you utilize your learnings between sectors, and perhaps why cross-sector learning is really important?

Roxane Heaton: When I think about this question, I think about how our user doesn't care. They don't deal only with the charitable sector, or the public sector, or the private sector; they don't experience just one sector. They bounce across each of them, and they expect the same sort of level of service. Not everyone expects the Amazon experience — if one doesn't shop at Amazon — but absolutely it's about keeping up with the Joneses. It's not good enough to think that because I'm a charity, it's okay to offer a less good experience than the experience in those other sectors. Keeping punchy is really important, because otherwise users will go elsewhere: they'll get the service elsewhere and donate their money elsewhere. From my perspective, that's why I said networking is so important — there are huge learnings with peers, and we're all hungry for the same thing. There's not enough money to go around to reinvent the wheel, and the wheel doesn't need to be reinvented in a lot of places. We can learn from each other on each journey into different uses of technology and soft skills, and share that diversity.

So whether it was packing boxes in a warehouse, understanding the fish canning line at Grimsby — looking inside fish to make sure the quality is good enough — or even the very high-tech potato sorter on the manufacturing site, just to give a few examples of the supermarket you see behind the shop front, I always think about how technology, the user, the journey and the efficiency for people are so valuable. So often, when I'm talking to different stakeholders and users, I think about the different experiences I've had across those sectors. I always think there's a different way to solve a problem, that there are always certain frictions, and I can always draw something from a sector I've been in that leads to new ways of solving problems.

So the charity sector is no different to any other sector — even more so, with the same stakeholder pressures. We've all got shareholders and daily profit-and-loss targets; it's just different things we're looking at, whether or not there are revenue implications. But we can keep up with the Joneses, we can really push ahead and look out at global trends as well, because someone out there could have done it already — so why not? My only advice is: always think differently, always keep hungry, so you can stay ahead of the game, looking across sectors and across the world.

Lee Rennick: I love that. And when you talk about across the world and the charitable sector — a few years back the UK introduced tap to pay at a lot of events, right? So instead of having to put your £5 into the donation box, you could just tap your card. That hadn't come out in Canada, but it was through charities here learning what was happening over there that I went, "Oh my God, this is amazing." The banking industry in the UK did a whole innovation around that, which again speaks to global learning. So I really appreciate you bringing that up, because it's such an important part of it. And obviously your background working at Morrisons, having that opportunity to look at how food is processed, and probably looking at blockchain and other things, just enhances the role you're in now, because you're bringing those outside learnings. A CIO said to me once — we were at a roundtable talking about cloud and all sorts of things — that people do expect, the end user does expect, that Amazon experience: you order one day and get it the next. We talked about the Ocado experience of food delivery, so I appreciate you sharing that with us.

And I wanted to really talk about tech for good. A lot of CIOs, especially in the UK, seem to be talking about this and being really intentional about it. The last time we spoke, you talked about your passion for tech for good, really to support the inclusion of underrepresented communities in accessing technology. And you made a really interesting point to me around building data models that can help inform that — really being intentional and working with data to understand where you can support underrepresented communities. So could you please discuss this, and perhaps some of the ways you believe leaders and organizations can support it?

Roxane Heaton: This is a huge question, and again, it's super exciting. I think about your last comment, actually, about different sectors and about being global, because I think about Blockbuster, the film rental company. I always say you don't want to turn into a Blockbuster: it saw the need to change, but it changed a little bit too late. I think that's really interesting for the global picture as well, and for doing tech for good. Some of the nations who are really performing are actually some of those who were furthest behind in terms of digital inclusion, and that's why I think it's really important not to be complacent. There have been some really groundbreaking activities in different areas of the world that are leapfrogging us — in tap to pay, which you mentioned — because they need to. That need is so critical.

So when I think about different data models, I have some different viewpoints on joining up data models built from different styles of data, and on the power of that, because we only ever look within the boundaries that we set, or at the datasets that we currently see. By thinking differently — by looking into space, as one of my peers would say, and searching for new stars — you can find insight. When I think about the charity sector I work in: we currently serve only a portion of people living with cancer, and that's not good enough. When I think about how I can solve that, I think about what joining things up could help. There's the experience from when I worked in central government at the height of COVID, when we formulated the discussions to join up different datasets from across government departments. What that showed is the power of aggregated, anonymized datasets that gave a nationwide view but localized impact. You could see the different effect of different datasets being joined up: with water data, for example, or the number of people on the road, you could see transmissions either increase or decrease.

I think the same thing can happen in other areas, using open datasets, or datasets from other organizations that can be shared — for example, linked to social care. At what point in a social ecosystem can you apply a certain lever so that an individual doesn't snap back into social care like a rubber band? It's about understanding those datasets, and understanding that if you apply that lever over and over again, you can move that child, that dependence, out of social care and break the trend. I think that's so important, because obviously the data is out there, but we in society aren't using it; we're trying to hold on to it so closely. We all know it's there, and different people are using it for different activities. We've actually been using it for years, even offline: a retailer, for example, would look out, see it's raining — or see what's showing at another cinema — and adjust their offering. People have naturally done that for years. But what can we do if we harness the power of the data sitting in systems — making sure it's safe and controlled, that people's trust and people's control of that data are really clear — and actually show the benefits with real-life applications? Some societies achieved this really well through COVID: they democratized the trust, and they democratized data, control and input into the discussions, and it really increased people's engagement with the activity and their trust in the entire system. From there, they could understand the motivators and detractors for using digital tools. So I think it's really powerful as one element of tech for good. There are so many foundations to put in place before we get on to shiny new things — not just AI, but elements of AI and other things — because people need to want to trust the system. So foundations are really important.

Lee Rennick: You're really inspiring me. Thank you so much for saying that, because it leads into our next question, which is about innovation and GenAI. What I'm hearing is that we have this opportunity right now, at this new inflection point in tech — and maybe quantum; I feel like quantum is really coming in behind — to look at how we're connecting this data. Imagine the speed at which we could do things, quicker and faster, to support individuals, and how we could share. If corporations were aligned to that — I think of organizations that donate food to food banks or other organizations; imagine if that was all connected, and how that would really change the world. So yeah, it's very inspirational. Thank you for sharing that.

And I did want to go on to our last question, which I'm asking everybody right now, and the theme is really around innovation and GenAI. Obviously GenAI and LLMs are very prevalent right now in discussions about innovation, and really in discussions about technology generally. So could you share your views on that, and perhaps some of the ways you're looking at deploying it, or what you're seeing in the market?

Roxane Heaton: When I think about this — I'm a big fan of shiny toys, but shiny toys that are bounded by their limitations in the discussion. There's definitely a place for keeping up, understanding what's out in the market and playing safely. But at the same time, there are some foundations to think about first. If I use an example from one of my wonderful medical team, who I'm really privileged to work with because they work directly with the frontline: we need to recontextualize. We're at a really critical, exciting point in a technological revolution — or evolution — where we can recontextualize, not just do as we've always done. Arthritis, for example, has been treated the same way since the 1950s, based on one dataset of white middle-aged men in the north of Manchester in the UK. Why do we still do that? It becomes pretty apparent that medical treatments are very different now, and the same old fix isn't working. In the same way, how can we use the opportunity of AI and large language models — and whatever comes next — to recontextualize? I go back to the bones: what is the user actually needing, and how can we solve that problem? Because no doubt it will create more discussions about dependencies, get rid of wasted effort in all those processes, and also expose second-order effects of using AI and LLMs. It isn't all done for you: if you make the hospital porter group more efficient, of course it's going to open up more bed spaces. If we map some of these examples out and think a bit more broadly, we can think about the different implications.

So for me it's about exciting people about the opportunities of data, really talking to people about ethics and biases, and helping people understand that excitement so we can bring everyone on the journey. Because at the end of the day, we're all looking to make the future better, but it must be secure, resilient, joined up and efficient, because all of these things are really expensive as well. I think some of us know some of the pricing models for new AI systems from the big vendors that teams are trying to take live, and frankly it's extortionate. I, as a charity, can't keep up. So how can we all get on this bus and support users in the same way, including those vendors? Obviously there will be people who are left behind, and I don't mind if it's us, but I'm really conscious of the person using the system, because if we're left behind, the users who need the services most will absolutely be left behind. Those are the people I'm worried about. So, how I answer the question in my mind: know what's out there; absolutely make sure that the foundations of data sharing, data ethics and data management are in place; and keep a real hunger for people to understand the opportunities, and excite people with that. Because at the same time, that will open up the motivators for others to get on a digital inclusion journey, which will only help the health, wealth and ultimate well-being of the community, if they are enabled and excited to make best use of the exciting technologies out there.

Lee Rennick: Dr. Roxane, you're very, very inspiring, and I really appreciate this conversation today. It has been just phenomenal for me to listen to you and hear your viewpoints on technology. So thank you so much.

Roxane Heaton: Thank you. Thank you very much. I really loved talking to you. Thank you.

Lee Rennick: And if you're interested in learning more, please don't hesitate to visit us at cio.com/uk. Thanks again.