CHAPTER 18
Promises and Perils of Digital Health Technologies
Elizabeth Peter, Charlene H. Chu, Rosalie Starzomski, Patricia Rodney, and Janet L. Storch
“[N]urses are a really, really important part of this picture, and there are ethical issues all along the technology pipeline. Nurses need to make their voices heard, from ideation, to implementation, to evaluation.”
IN THIS VIDEO INTERVIEW, the editors of Toward a Moral Horizon: Nursing Ethics for Leadership and Practice engage in discussion with two nursing scholars who have expertise in the field of digital health technologies. In this conversation, they talk about the role of nurses in the development, use, and evaluation of digital technologies in health care and nursing practice. A transcript of the conversation, which has been edited for clarity, is included below. Readers can view the video here: https://doi.org/10.18357/9781550587128.
ROSALIE STARZOMSKI: Welcome everyone! This is a wonderful opportunity for us to have some conversation about digital technologies. We’re being joined in this session by Elizabeth Peter and Charlene Chu from the University of Toronto, and you will also see on the screen the three editors of the book, Dr. Paddy Rodney, Dr. Janet Storch, and myself, Dr. Rosalie Starzomski.
The plan for this session is for us to have a conversation about some of the promises and pitfalls related to digital technologies, particularly as they relate to nurses and nursing ethics and also health ethics research. So what we’re going to do to start is just some general introductions, and then we have some questions that we’re going to cover, but mostly we’re just going to have a conversation about this topic. You’ll have the opportunity to listen to this video and to view it but we’ll also have on the book site the opportunity for you to read a transcript of this session.
So as I mentioned, I’m Rosalie Starzomski, and I’m one of the editors of the book. I’m a professor at the University of Victoria, and if you’re interested in reading more about the biographies of myself, Jan, or Paddy you can do that on the book site. There will also be some information about Elizabeth and Charlene as well, but let me turn it over to Jan to introduce herself and say hello to everyone.
JANET STORCH: Hi everyone, I’m really happy to be part of this session. I think it’s innovative, and it should be very authoritative, given the two experts we have the benefit of having. So welcome to both of you. Just a little note about me: I am a Professor Emeritus from the University of Victoria, and I taught at the University of Calgary, where I was also Dean. Basically, though, most of my life in academia has been at the University of Alberta.
ROSALIE: Thank you Jan. So Paddy—
PATRICIA (PADDY) RODNEY: I’m Paddy Rodney, and absolutely delighted to be here with these wonderful colleagues on a very interesting topic that’s clearly changing and evolving in our society, so a really nice opportunity. My background is also in nursing and I graduated from the University of Alberta with a basic baccalaureate degree, did my master’s there, and also went on to do a PhD. I had a chance to teach with the University of Victoria for a while, and had a number of years working at the University of British Columbia, so I’m very happy to have a chance to work with wonderful colleagues, the faces we see on this screen. I’m really pleased to have this chance to have this discussion. I have a longstanding interest in health care ethics, that’s something that most of us share, and I think this is a really nice topic to also take an ethics lens to, so welcome everyone who’s watching this, thank you.
ROSALIE: Thanks, Paddy. So, Elizabeth, would you like to say a little bit about yourself, please?
ELIZABETH PETER: Thank you so much for the opportunity to talk about a very much ever-evolving set of concerns and promises. My background is also in ethics. I’m a member of the Joint Centre for Bioethics at the University of Toronto, along with being a faculty member of the Bloomberg Faculty of Nursing, and I got my PhD through their collaborative program in bioethics a number of years ago. My interest in this area has been mainly on the research ethics side, especially as we explore new ways of doing research using AI and machine learning, but I also have an interest in technologies as a whole.
ROSALIE: Elizabeth, wonderful, and we’re just delighted that you’re able to join us. I’ll turn it over to you, Charlene.
CHARLENE CHU: Thank you to you and to Patricia and Jan for the invitation to be here. My name is Charlene Chu. I am an assistant professor at the Lawrence Bloomberg Faculty of Nursing alongside Dr. Peter, and my program of research is focused on designing and evaluating different technologies to improve the quality of life of older adults.
My PhD was at the Faculty of Nursing at U. of T., and it was focused on interventions for older adults, and then my postdoc was in a lab that was building AI and looking at different technologies, so my program of research melds these two areas. And so I’ve got a keen interest in looking at the ethical aspects of technology, throughout the implementation as well, and its implications for nursing practice and nursing education.
ROSALIE: Thanks everyone, wonderful. Well, it’s just as my colleagues have said, delightful to have you both here, and Jan, Paddy, and I have had different discussions about some of the issues related to digital technologies as we’ve been working on the whole book.
And as some of the students who will be using this book in their graduate studies have a very keen interest in health informatics, I think this material will be particularly relevant to them, but it will be of interest to everyone who’s involved in health care and working in nursing today, as the field is evolving so quickly. I think the fact that we can address some of the future challenges is a really good idea as well.
So let us start with a preliminary discussion: how do you think some of these digital health technologies are changing nursing practice, and do you have any thoughts about what some of the promises and pitfalls are? Why don’t I turn it to you first, Charlene?
CHARLENE: Sure, so that’s a great question. I think it’s an important question. I think it’s undeniable that digital health technologies are significantly transforming the practice of nursing and the health care industry as a whole. This includes how health care is accessed, how it’s delivered, and that of course has a direct impact on nursing practice. I think there’s lots of clear benefits and promises of digital health technologies.
As you know, the big five come to mind. The first is increased efficiencies: digital health technologies help streamline documentation and automate tasks, for example. So, for nurses who need to take a blood pressure every hour or transcribe paper orders from the physician, technology can help with these tasks so that nurses spend less of their shifts on things like paperwork.
Second, it can help improve patient care. Technology allows for real-time monitoring of a patient’s vital signs as well as other important information, which allows us to quickly identify any changes in health status so that we can intervene in a timely manner to address any potential health issues. Things like patient portals, remote patient monitoring, and telehealth visits, for example, can all help improve patient access to care, and improve patient care in general. Technology, of course, also has the potential to address equity issues.
Third, better collaboration, potentially. Technology allows nurses and other health care professionals to communicate in a timely manner. We have improved collaboration with better coordinated care; for example, an NP can have the most recent lab results or medical images, so everybody knows what we’re talking about, and the patient can get more expedient care. Fourth, from a structural perspective, we have improved data management in general, where digital health technologies enable the creation of comprehensive, centralized patient data that can be easily accessed by all members of the health care team. Again, this allows for more informed decision making in patient care.
And I think, lastly, it can enhance patient engagement and education. It allows for empowering patients, because now they’re able to take a more active role in their own care. They can use apps to track their progress, monitor symptoms, and communicate with their care team with just a text through an app. We’re also seeing these digital health technologies being used as prescriptions and standardized care now; for example, after cardiac surgery, downloading an app like Medly and using it at home is becoming standard practice.
So I think as technology continues to evolve we’re going to see more and more impact on nursing practice and on patient care. The more significant questions around how technology is implemented are important ones, and I think there are lots of perils. I think that was the word you used, Rosalie. The main ones that come to my mind immediately concern the extent to which nursing work is either represented or underrepresented in these technology systems, for example, EMRs [electronic medical records]. If nursing work is not appropriately recognized and represented in these systems, it can contribute to the under-valuing of nursing and the profession, and of course this can have negative implications for the recognition and compensation of nurses. So these systems really need to fully reflect the scope and complexity of nursing practice.
And then, of course, this also affects the quality of care for patients. A key understanding of digital health technology is that it can exacerbate inequity and result in more health data poverty. That’s a topic we’ll come to in a future question, but technology in itself can be a source of inequity-generating interventions. Depending on how you want to split the digital divide, whether that’s by age, geography, or digital literacy, there are many, many factors that make access to and the cost of technology an issue; for example, older adults living in long-term care may not have access to modern devices, and Indigenous populations may not have internet access. So I think there are a lot of issues when it comes to inequity.
And then the third is more general, but I think it’s the lack of user-centred design. When we think about the technologies and informatics systems that surround us, often they’re not created with the needs of the end users in mind, and this can result in systems that are difficult to use and time consuming. There are many consequences to that, such as burnout and moral distress. There’s a really wonderful article written by Atul Gawande in The New Yorker about “death by 1,000 clicks,” speaking to the fact of EMRs being poorly designed. So I think it’s a double-edged sword: we can have many, many benefits, of course, but we need to be aware of some of the perils as well. Maybe I’ll stop there.
ROSALIE: Well thank you, that’s a really comprehensive overview of a number of the issues. There’s just one thing I want to pick up on before we move on to another speaker, and that is when you were talking about the EMRs. I really appreciated your bringing to our attention Atul Gawande’s wonderful article, as you mentioned, “death by 1,000 clicks.” But one of the things that I hear nurses talking about, particularly when new technologies are introduced, is a lack of time to understand the technology, and a lack of time for education about how they might use this technology effectively. This has come up in particular in relation to EMRs. So the nursing voice and nursing content need to be included not only in the development, but also in the implementation. I just wondered if you could say a couple of words about that.
CHARLENE: I think that’s a great point. Many of these EMRs can be tailored to reflect nursing work, but sometimes young nurses aren’t at the table to be able to voice how that should be implemented or how it could potentially look. So because nurses make up the largest component of the health care workforce, as well as generate the most amount of health care data, it is really important for nurses to be at that table. And I think in one of the later questions we will talk about what are some of the roles of nursing and one of the roles is that of an implementer. So to me, I think being able to implement a system that can help support nurses in their workflows is really important, and it can have dire consequences if they are not supported in being able to use that effectively.
ROSALIE: Thank you. Elizabeth, do you have any thoughts about some of the promises and perils, or some of the issues that you think we haven’t addressed to this point?
ELIZABETH: Thank you. Charlene, you gave an excellent overview of so many of the issues. The one thing that occurs to me right away is what we will be talking about five years from now; that is something that fascinates me greatly, because things are ever evolving.
The one thing though that comes to mind immediately, and I think this came up in class today, is that technology of course is wonderful, potentially, it can do many, many things, but there are also so many distractions in terms of fundamental nursing care. It worries me a bit if too much of our focus goes away from some of the fundamental things around having relationships with people—their dignity, care of the body, and all of those things that are absolutely important. Hopefully the technology frees nurses up to provide that kind of care to patients. I know that in some of the research, for instance with virtual visits, many clients in the home very much enjoy them in the sense that without them there would be no link at all to health care, but it still is not a substitute for a person being there physically. So it’s good, but there is always for me that piece of wanting to hold on to some really important things in nursing, and there ought not to be a complete distraction. So hopefully one frees up the other to make that possible.
ROSALIE: Now those are some really great points. In my career as a nurse who worked extensively as an advanced practice nurse in dialysis and transplantation, I saw that often, when we introduced some new technology, the focus became very much on the technology itself, and even though those technological artifacts, if you will, were being used with the patient, the patient became lost in the application of the technology. So I think what you say is really important. Sorry—go ahead, Elizabeth.
ELIZABETH: Just one quick thing, it just reminded me of the work of Margarete Sandelowski, who focused on making sure we nurse the patient and not the technology, so I had forgotten about her marvelous work on this.
ROSALIE: That’s a great point to raise, and we will make sure we get a reference in to some of the work that she has done. The other piece of that, too, is the fact that, particularly in regard to nursing ethics, there’s quite a large focus on relational practice and relational development, if you will. So we want to make it very clear that the technology is not who we’re nursing; we’re nursing the patient. Thank you for that. Paddy, is there anything that you wanted to add?
PADDY: Yes, maybe to ask a question as well. One of the things, as I’m thinking and listening to what you’ve said, and it’s been tremendous already in this time that we’ve had together, is about what questions are getting asked or aren’t getting asked, and how might we continue to promote the kind of forward thinking and insightful thinking that just came up in this last discussion between the three of you. What changes might we consider to help to move the human side of this forward more, not just paying it lip service, but really enacting it and really paying attention to those, for example older adults, who can’t even hear what we’re saying, let alone if we come in explaining a really complicated tech, for instance. So just your thoughts about how progressive change might happen that moves in a really strong ethical direction?
ROSALIE: Paddy, I’m wondering if we could hold on to that question, because we’re going to talk about that a little bit later on, and so perhaps we can come back, circle around, and come back to that if that’s OK. So Jan, what about you, is there anything that you wanted to add or any thoughts?
JAN: Well, you’ll all laugh at this, but when I knew what the topic was, I thought, good grief, what do I know about that, and how did I get to know about that as a student? What came to mind immediately was more of the classical technology, like an iron lung. In my training days I had a chance to help nurse somebody in an iron lung on the sixth floor of the University of Alberta Hospital, and as I thought about that, I thought what a way technology advances on us; the knowledge to support a person even in the iron lung was pretty extensive, pretty intensive. Then I got busy, and I’ll not be long, but I was gathering books I had around here to just find those early days, and one of them is a text that some of you will have heard of, by Harmer, called The Principles and Practice of Nursing. It’s very old, it’s very big. What I did find in there was a picture of an iron lung for the students to see, and lots of write-up about it.
I guess the other question I would have is how do we get nurses who are in practice to be more conversant with these technologies, and how would we do that with students in their early learning? So I’ll say more later if there’s time, but I have just had a very interesting romp through that. The thing that finally came out in the end is more recent: I’ve known friends with artificial hands, and this [shows photo] is a very good friend of mine. I don’t know if you can see, I think you can, you can see his hands. He came to our wedding and he was delighted to be there. It was a big community hall, and somebody came to tell me, as the bride, that he needed something: he wanted to drink water, but the plastic glass disintegrated in his clip.
ROSALIE: Technology—well, those are both really super examples. Certainly with the iron lung example, many people who will be watching this video will never have had the chance to have nursed somebody in an iron lung. I know there are only a few people left in the world who are still using iron lungs. It was so complex in terms of some of the nursing care, the personal care that was required, and the relational care that was really necessary. So thank you for raising that.
So maybe what we can do is, based on just what you were saying, and what Paddy alluded to earlier, is move on to talking about, what do you think are some of the key skills needed to practice ethically, in these environments that we’re talking about where the development and use of technology is ever-increasing? So I’d be curious, maybe Elizabeth, you can start. What do you think some of the skills might be that we should be talking about in nursing and helping to develop?
ELIZABETH: Oh goodness. I think, for one, we need critical thinking skills, in terms of students really understanding what kind of data is feeding into these kinds of systems. And Charlene can speak to this better than I can. That it’s imperfect. And they need to understand that that data could be quite biased. It might not be relevant for certain populations. I think they need to understand that. Not to say that it’s useless, but to think about that critically. So that would be one thing.
I’d also want students, and also leaders in practice environments, to start to think about liability, if you will, in terms of responsibility. Who’s responsible, perhaps, for decisions being made? Where do we place that responsibility? That’s something for many people, regulators as well, to think about when these technologies become more prevalent. Those are some of the things that I would think about. A lot of critical thinking, I think, is important.
I really like what Charlene had to say. When these technologies are being introduced, to have nurses really have their perspectives given as to what is helpful, not helpful, in the practice environment. Having patients, clients also involved in the development of them so that they’re being used in a way that’s most meaningful and the least harmful, if you will, in very simplistic ethical terms.
ROSALIE: Really great points, and it brings to mind, as well, technology assessment, which is quite a large area in health care, including the evaluation of technology and even the development of the processes to do that. In many cases, nursing voices have not always been included, and so some of the parameters that should have been considered in the evaluation have been missed, because there isn’t that nursing input. I can think of a couple of examples in my own practice in the introduction of new equipment for dialysis, for example, where that occurred. I’ve also heard recently from colleagues working in critical care that, when new technology was introduced there during COVID-19, there wasn’t always the opportunity to do the evaluation that was needed. This increased nursing time and also impacted the level of patient care that could be provided.
So thank you for those comments. Charlene, did you want to add anything to that, about some of the skills that you think are needed?
CHARLENE: Yes, I recently authored a book chapter about this, on the impact of digital technologies and the new skills and knowledge that nurses need [The impact of digital technologies, data analytics and AI on nursing informatics: The new skills and knowledge nurses need for the 21st century].
ROSALIE: Wonderful!
CHARLENE: So this is fitting. My co-authors and I looked at the basic roles of nurses and nursing practice, and we reconceptualized core aspects of nursing practice into five different roles: those of an advocate, an explainer, an implementer, a creator, and an analyst. Nurses have always been advocates for patients. In this case, when it comes to digital health technologies, we’re talking about advocating for equity, advocating for access, thinking about the suitability of the technology to the patient population, and thinking about social determinants of health, privacy, confidentiality, and data ownership. And really being that advocate for the patient, the community, and the population.
And then the second is being an explainer. Nurses have always had the ethical duty to ensure that the basic principles of informed consent are being met, as a professional practice standard. The same applies with technologies that people are using. When we’re prescribing technologies for people to take home with them, they should know, “Where is this data being stored? Who has access to it? When can you look at it?”
Picking up on your earlier point there, Rosalie, about the nurse as an implementer, nurses have always led the way in moving evidence into practice, and this is the same for technologies as well. We can use principles and frameworks from information science and implementation science to think about the context, the enablers, and the barriers of moving technologies into various health care environments, whether that be acute care or the community.
Another role, and we’ve sort of touched on this several times now, but this is the core tenet of my own work, is that of co-creator. Again, nurses are the backbone of most health care systems. And so, because we are found in many different health care sectors, nurses have an in-depth knowledge about patients, about the illness trajectory, about the communities in which patients live, the clinical context, the workflow in many of these settings. And we are privileged to be able to see the care that is being delivered, where developers, engineers, and those who are outside of the health care system are not privy to [knowledge about this care].
So, this expertise is often ignored, and that is to the detriment of building successful technology, and implementation, and adoption of successful technology. So nurses really need to step forward, and make their narrative and expertise heard.
And then, lastly, picking up on Elizabeth’s point around the use of critical thinking when it comes to technology, there is the nurse as analyst: being able to analyze the vital signs of the patient in front of you, or, if you’re a manager, understanding the data from your unit and making decisions around resource allocation [are important].
But I think now, with the use and availability of big data, nurses can take a more systems approach to think about broader institutions and future directions, in terms of where we need to focus our energy and different resources, as well as health human resources, for the betterment of communities and populations. I’ll stop there.
ROSALIE: Yes, some really excellent points that you’ve made there. I’m thinking that perhaps Paddy or Jan might want to make some comments, because you’ve raised a variety of points. I know that in other chapters of the book we’ve touched on this as well, in terms of nursing skill development when it comes to ethics. It also brings to mind something we haven’t talked about so much: the idea that if we look at Patricia Benner’s work, for example, and look at that continuum from novice to expert, there are different challenges in skill development at different levels of overall nursing development. So Paddy, or Jan, do you have anything you’d like to add?
PADDY: There’s so much to think about and talk about with both of you. This is marvellous territory that you’re in, and you’re embarking on, and that you’re critically analyzing. It occurred to me, too, that the way we educate nurses from the very beginning is going to be enormously important in fostering the kind of capability, Charlene, that you and Elizabeth have been talking about in terms of people who are critically questioning, people who are not too afraid of the tech, people who are able to deal with the intricacies of [technology] and so forth. And [these people are] also facing what has always been the case at the bedside, and that is the grief and the tragedy that some families and patients face, and how to support them at the same time as navigating a technological imperative. And I’d be curious to hear what your thoughts are about how to move forward in the future for that. I know it’s a complicated question, but I’m sitting thinking about both the human aspect of it as well as the technological expertise that’s required. That is a big commitment.
ROSALIE: And we’ll come back to that, certainly, talking about some of the future developments as we finish up this session. So thanks for your comments, Paddy. Jan?
JAN: I was just thinking, then, too, not only about how the caregivers get to know and understand what the new invention or the new lab technique will be, but also wondering about those who have to think about budgets and about paying for some of these very novel instruments. I’d be really curious to know who is looking at how those payment schemes will go … with never enough dollars to go around. How will different areas be affected by their inability to fund some of these technical inventions? I know a few times we’ve lived through that. It happens in the university hospital, and maybe five or six years later it reaches some outpost nursing station. But the challenges are going to be greater, I think.
ROSALIE: Well that’s a really significant point because we’ve seen how technologies that had in the past only been available in hospitals are now moving into homes. And certainly, again in dialysis, we’ve been doing very complex treatments for a number of years in patients’ homes. So patients themselves have been doing them, and their family members have been involved in helping in some instances. That’s just one example. I mean, intravenous therapies … There’s many things that were once just the purview of the nurse in a highly technological unit that are now moving into the community, and into homes, and into long-term care. And I know that’s something, Elizabeth, that you’ve given a lot of thought to and have been involved in thinking about in regard to home care, and the escalation of some of these things, and the skills needed by home care nurses to continue to provide that care. So you might have some thoughts about that.
ELIZABETH: Oh yes, and no doubt there is a great escalation there, as in the hospital. The one challenge that home care nurses have, not only with respect to new technologies and so on, but just in general, is that they are often working alone. So some of the learning opportunities that one would have in a hospital situation might not be as readily available. Again, you have to think about educating the clients at home—they’re often older adults. Many of them are very comfortable with technology, but some are not. And when technology goes awry, again, in the home it can be more of a challenge, as it is for any of us when we have problems at home. To get someone in to repair, help, intervene, and so on can be easier in a hospital, because it is more [accessible]. But some of these [technological] things, yes, in the home, can revolutionize things.
At the other end of the spectrum, though, is still respecting people’s privacy. We have people being monitored for a number of things, for their own safety, perhaps for vital signs, but what the monitoring can entail can be much greater than that. So the privacy aspect is really important to think about, particularly on the more extreme end, when we start to monitor people, especially in their own homes, where they have a high assumption of privacy. That would probably be the most private place to most people, where they live and go to sleep, if they live in a home, an apartment, etcetera. So privacy is a big one that we need to think about. And we need to engage that population again, in terms of education and what’s okay. We need their input.
ROSALIE: Charlene, anything that you wanted to add to that?
CHARLENE: We have a couple of projects that are focused on technologies for the home and designing homes and sensor systems to help support older adults living at home. It’s a really interesting kind of area to explore, because there’s so many ethical issues when it comes to using sensor technologies in the home. The capabilities of sensors these days can detect the quality of sleep, how fast you’re breathing, your heartbeat, how many times you leave the house, how fast you’re walking, where you drive! The proximity of you to, perhaps, your loved one. How long you’re spending together. And so there’s many inferences somebody can make based on the data and being able to triangulate the data around somebody.
But there are several ethical issues when it comes to using this. Often, speaking about this from a health care perspective, there is a health care provider involved, potentially making the home more of a hospital kind of setting: you have these technologies in the home, but there’s the presence of a health care professional. In many cases, though, and in my own life personally, I’ve had so many people ask me about putting in these monitors and going to Best Buy and buying these home sensor systems for their parents. So now you have family caregivers who are also using these technologies, and they’re not fully aware of what to do with the data. When is something important, and when is something not important? And then the parent, on the other hand, who is being monitored, also doesn’t understand: what can my daughter or son see? How much are they actually following me here? Could this be interpreted as, “Now all of a sudden I can’t live by myself anymore”? So there is this instilling of fear: what’s going to happen if I fall now? It raises some really interesting questions around boundaries, changing dynamics, and changing relationships. It really changes the nature of different relationships, not just between nurses and patients, but also with family members who are part of that care team.
ROSALIE: Yes. You've brought up a number of really important points there. I was just thinking about a situation, not too long ago, that I was involved in, where I was talking with the parent of someone who had just switched to a continuous glucose monitor, which has been revolutionary in terms of diabetes care. No question, people get real-time data, fewer finger pricks, so many advantages. But they can also share that data with other people. So what's happened in the case that the parent was telling me about is that she now receives the data from the glucose monitor, and she feels responsible to deal with the data she receives. Yet her teenager would prefer her not to be involved in that. So that's the other piece of it. If you have data, what do you do with it? And how do you work with it? And where does that line fall between autonomy and people who think paternalistically that they ought to be intervening? So you've touched on some of those important points, for sure.
So before we leave this area about skills needed, I wondered if we could just have a couple of minutes of conversation about whether you think there are some specific skills that are needed by advanced practice nurses or advanced practice nurse leaders, because many of the people who are going to be watching this video and reading this book, Toward a Moral Horizon: Nursing Ethics for Leadership and Practice, are going to be advanced practice nurse leaders. So could we just maybe hear from you, Elizabeth, about what you think might be some of the skills that those advanced practice nurse leaders might need?
ELIZABETH: Well, the first thing that comes to mind is that many of them are educators, in various contexts. So they are going to need to become very knowledgeable about many of these things: to keep on top of them, to be able to use these technologies, and then to be able to teach about them. So knowledge, I think, is going to be absolutely huge for that group of nurses. That would be one of the really key things that comes to mind for me immediately. Charlene and others may have some other ideas about other types of advanced practice nurses, because there are many different kinds.
ROSALIE: Right. Well, that’s a really significant one, and you’re right, a number of nurse educators will in fact be watching this video. And within the book, we have a chapter about some of the ethical issues related to nursing education.
Charlene, what about you? Any thoughts about advanced practice nurse leaders?
CHARLENE: Yes, it’s a great question. APNs are such an important part of the bigger picture when it comes to digital technologies and the implementation, the co-creation. And from my perspective, technologies are really a critical enabler of APNs.
And if you look at the advanced practice nursing pan-Canadian framework from the CNA, the competencies include direct comprehensive care, optimizing health system competencies, educational competencies, and research competencies. All of those are supported by technologies. And when you look at the individual sub-competencies under these categories, for example, under optimizing health system competencies, one of the sub-competencies is being able to generate and incorporate new nursing knowledge and to develop standards of care. Well, if you have technology, you can look at the data and help generate the knowledge to support a standard of care. So all of the different competencies are supported if we can actually look to the technology, embrace it, and understand it, using critical thinking and a broader appreciation of the data being collected.
ROSALIE: Well said. I know in my own professional career, when working as a clinical ethicist, many of the referrals that I received about ethical challenges came from advanced practice nurse leaders, and some of them were related to just what you talked about. [Often] the nurses didn't believe they had the skills to provide the guidance, support, and leadership that were needed in those areas. So I'm glad to hear you bringing us back to the competencies. That's great. Sorry, go ahead, Charlene.
ELIZABETH: No, it was me, actually. One thing that also came to mind, Rosalie, is that competencies, obviously, are really important in terms of skill development, nursing practice, and so on. But the one thing I would be mindful of is that it's not just the "what we know and how we do it"; it's also who we are. So the moral identity of nurses is really important here. Nurse leaders often set the stage for that moral behaviour to be part of our identity, alongside the competencies and so on. Again, I always go back to distractions, and to what the distraction of the day is, if you will. That's one of them: we tend to throw out too many things when we adopt new things. So competencies, yes. But who we are, I think, is something that advanced practice nurses also have a big role to play in maintaining.
ROSALIE: Excellent point. Really excellent point. And that ties in to what we’ve been talking about around relational practice as well. And the “who we are” is so important in that, for sure.
And Paddy, I know that’s an area that you’re particularly interested in. So do you have any points that you’d like to add?
PADDY: Yes, just to say the whole discussion has been wonderful and enlightening. And Charlene and Elizabeth, we're really blessed to have your expertise here. And I know that the backgrounds everybody brings to this, Jan's and Rosalie's included, are so important.
And having done critical care, which was fairly high tech back then (this was a few decades ago), certainly it was easy to get caught up or distracted in the mechanics, the pathophysiology, how somebody's blood pressure is doing, and so on. That certainly was important material, something we had to pay attention to, but the harder questions even then, in what was a fairly highly technological environment for its day and age, were about "How's the family doing? How's the team doing? What does the family understand about whether this loved one will get better, or how they might proceed? How are we coaching them to help them with their grief? What resources do they have?"
And I would also add, the other thing we’ve learned about—and of course this is where the concept of moral distress has been completely invaluable—is to understand “How are the care providers being affected? How are they working together as a team?” And I think again, as I’ve listened to each of you, that those kinds of questions are flourishing in the kind of work that you’re talking about and doing.
I’m certainly not anti-technology, although I’ll never brag about being particularly good at it, but the danger in technology is that it can distract us away from some of those really deep human conundrums that walk into our hospitals, and that walk out of our hospitals, where we hope that they’ll get home care and other resources. Again, I just think that the insights that you’re bringing to this are so important. So thanks for a chance to talk about that.
ROSALIE: Thanks, Paddy. Some really important points that you’ve raised there, and things that we need to think about, always. And it goes back to some of the earlier points you made, Charlene, about the promises and perils, so I think that’s a really relevant set of points that you’ve brought us back to think about, Paddy. Thank you.
And Jan, anything that you wanted to add before we move on to talking a little bit more about data and research ethics?
JAN: I don’t think so, Rosalie. I think some of the main areas have been touched on. Going into depth on any one of them would be [interesting], but I would like to hear the “what’s next” that you’re pointing to. I think that’s most important.
ROSALIE: Thank you, Jan. So, in terms of thinking about this, we've tossed around some terms here so far: big data, and, although we haven't really talked about it, we've used the term AI (artificial intelligence), and we mentioned research ethics. But what I'm curious about, and what I think all of us would benefit from hearing more about, is what you think some of the implications of data disparity, and even pernicious bias, are for ethical clinical practice, particularly as it relates to the ethical conduct of research. And I know, Elizabeth, you've been involved in a number of activities in this area, so perhaps you might like to start.
ELIZABETH: Well, one of the things that has been part of the conversation is the need for researchers to really have a strong understanding of the databases they're using and of their limitations, because there are always limitations. Once those data sets get put together, what are the outcomes people are going to look at? Is what is coming forward reliable? That is one of the things that is strongly emphasized.
The other thing, however, is that it may seem low risk to use data where there is no contact per se with participants, but some of the ethical issues actually arise upon dissemination, not so much in those early stages. Someone may be bringing together data sets that have implications for groups in society, especially if they are based on things like race or socio-demographics, and those implications can be huge. And so some upfront thinking needs to go into this, with community engagement and so on, to make sure communities are involved in the kinds of research questions being asked. But it is really important to make sure they are involved in dissemination, in order to avoid some of the stigma and naive kinds of conclusions, and sometimes very painful conclusions for groups, if the data are misunderstood. Especially since the data are imperfect, and all data are imperfect. So that is something research ethics is currently really looking at: that point of dissemination, as well as community engagement.
ROSALIE: Well, thank you for raising those points, and I know that, in particular, there have been a number of challenges in those areas with Indigenous communities, both around collection of data, data ownership, dissemination, and I think what you bring up applies, certainly, to that community, but also to many other communities as well. So thank you for raising those points.
Charlene, did you want to add anything to that?
CHARLENE: Data disparity is a big issue when it comes to technology and the collection of data. We live in a digital world, and I think 2.5 quintillion bytes of data are being collected every single day. With all of that data being collected, and knowing that it is being used to build and train the algorithms that impact our day-to-day lives, it's really important for us to think about whose data that is, and who is being left out of it. And when we talk about health data disparities, there's an article that was in The Lancet, by Ibrahim, I think, in 2021, and they define health data disparities as systematic differences in the quality, the quantity, or both, of the health data that represent different individuals, groups, or populations. And that can be across demographics, disciplines, or diseases.
And so when you have these health data disparities, the result is health data poverty, which is defined as the inability of individuals, groups, or populations to benefit from discoveries or innovations because they are not represented in the data. When we think about this at a global scale, across high-, middle-, and low-income countries, it's likely that you're going to be leaving out a lot of people in low- and middle-income countries who don't have the technological infrastructure, who don't have access to sophisticated technologies on a day-to-day basis, and who may not even have the infrastructure, such as the internet, to be able to use some of these devices. And so we may end up exacerbating inequity gaps between the haves and have-nots.
So I think we really need to be cognizant that these data sets, as Elizabeth mentioned, are not perfect. They under-represent key segments of the overall population. And this data is being used to design digital health technologies that will be safe and effective for some, but maybe not as effective for others. And I think there are a lot of examples of that in the literature.
ROSALIE: I appreciate what you’re saying about some of the global injustices because those are not questions that we often reflect on, or certainly not for very long, anyway, and I think it’s important for us to think more about that. Sorry, go ahead, Charlene.
CHARLENE: I was just going to say, clinically, I think there was an example in 2019 by Tomašev and colleagues, who came up with an algorithm to predict acute kidney injury in adults. But in their data set, only 6% of the patients were women. So, of course, the algorithm was not as effective in women as it was in men. That's just an example of how gender-related health data disparities end up causing relative health data poverty in women.
And then even more recently, with COVID-19, the New England Journal of Medicine, as well as the British Medical Journal, had reports about pulse oximeters. Pulse oximeters don't work as well on darker skin, because they are calibrated on people with white skin. I think their study said that pulse oximetry readings missed three times as many cases of hypoxemia in Black patients as in White patients. Said another way, hypoxemia goes undetected three times as often in people with darker skin, which means that nurses may not be able to properly triage patients. This is a significant issue in COVID-19, which causes acute respiratory distress, and [nurses may not] apply critical supplemental oxygen in time. So there are a lot of clinical examples when it comes to data disparity and harm from algorithmic bias, and there needs to be a lot more consideration of, and attention to, this, for sure.
ROSALIE: Well great points, and I’m just going to ask Paddy to comment on this for a moment because I know that disparity has been relevant and evident in some of the work done in cardiac care, where women, for example, have not been included, or people from different ethnocultural communities, or socio-economic groups. And Paddy, I know that that’s an area of interest of yours, so do you have anything that you would like to add?
PADDY: Thank you. I guess a lot of what I've been posing are questions, because the information you've given us is so powerful. It makes me wonder, listening to what you said so eloquently, Charlene: we talked earlier, and Rosalie, you emphasized the importance of being relational, that is, respect, looking at how we're situated, how we work together, and so forth. And I was thinking that a heavy dose of relationality needs to go to some policymakers and some scientists, who are obviously incredibly well meaning, but who perhaps aren't asking the right questions, and may not have gone through educational [systems] or organizational [systems] where they're rewarded for that kind of question asking. Because again, Charlene, the kinds of points that you've raised, and that you've raised, Elizabeth, help us to dig beyond the surface and get closer to what's actually going to make a difference in people's lives. So I really want to thank you for that. I just think it's enormously important, thank you.
ROSALIE: Thanks, Paddy. And Jan, I know that in your career, research ethics has been very prominent. And you’ve been very involved in a variety of different research ethics committees and groups. Reflecting on some of these questions, is there anything that you want to add?
JAN: I want to thank you two so much for the light you've shed in various corners of this topic. You had so much to add, and so many things to say about the good, the bad, and maybe the ugly, and that is very meaningful and very important for the rest of us to hear and know. I think especially of some of the challenges of home care. In Montreal, for a couple of years, I was a home care nurse, and [home care] was just in its infancy in some ways, in terms of what could happen and how I could care for people in the way I wanted to. So I think the home, the hospitals, and long-term care settings, long-term care homes especially, are so often at risk in terms of what's happening. We need all of [your] advice, and we need to find a way to spread it around in whatever way we can. I thank you both so much for coming and being part of this discussion.
ROSALIE: Thanks for your comments, Jan. One of the things, before we wrap up and talk about the [nursing] roles and some of the future thinking that we might have about areas where we can continue to develop and improve—and where we can make changes in what we think about nursing ethics—I just want to give you, Elizabeth and Charlene, the opportunity to say something if you wish, about big data or research ethics. You’ve touched on a number of points, so don’t feel like there’s anything you have to add, but I don’t want to move on to the next section until I’m sure that you’ve covered all the areas you want to.
CHARLENE: Maybe I can add one more mention of bias. I gave examples of gender-related bias and race-related bias. My work in particular examines age-related bias: I look at digital ageism and the ways in which ageism, which is an implicit bias, becomes entrenched in the way we ideate technology, develop technology, build technology, and then implement and evaluate technology. In some of the work that we've done, we examined seven of the most commonly used publicly available facial image data sets. These are data sets that contain hundreds of thousands of pictures of people taken from the internet, and they are used for facial recognition, age estimation, and other AI applications. And when we looked at the seven most commonly used facial data sets, we found that many of these data sets of tens of thousands of images had fewer than one percent of pictures of older adults.
So the implications of that, and of data disparities more broadly, in terms of not being able to benefit as much from the technologies being built, are really applicable to age tech, gerontechnology, or technologies that are being built for older adults. I think that's a really important piece that we don't necessarily focus on when we talk about bias in AI. We tend to talk about it with respect to gender and race, but not so much with respect to age, and a lot of my work is trying to shine a light on that, to broaden the aperture a bit.
And then Jan, you mentioned the different settings. We've spoken about acute care, we've spoken about home care, and long-term care has come up a bit, but I think long-term care in particular, again, a place of older bodies, staffed largely by racialized women, is a really interesting cross-section of society, and certainly the way that technology is experienced is very different in a setting like long-term care. In the case of COVID-19, when long-term care homes were in lockdown, and residents were not able to see their families, and families were not able to see their loved ones, they became total institutions where the only way to have any contact was through technology, and a lot of them did not have the infrastructure to support even an iPad, or the staff to use the technology in place. I think this speaks to the broader context of how technology can change the nature of nursing work.
And I think Paddy had mentioned the idea of grief. Over COVID-19, for example, we had protected Code Blues. I mean, I know in nursing school we never ran Code Blues using baby monitors, right? That's not something anybody knew how to do from nursing school. But come COVID-19, that's what my colleagues had to do. We were running Code Blues in the ER using baby monitors.1 And people were saying goodbye to loved ones using iPads in the ICUs. And in long-term care, people were trying to have conversations using FaceTime, with not-so-good Wi-Fi. And so it really changed the nature of how we care, the way we care, the nature of care.
And so I think these are really big, important ethical issues that deserve examination, and we need to be raising these questions within nursing curricula. There are some nods to how we prepare our APNs and future generations of nurses to be stewards of technology, but I think we need to be raising some of these really important questions earlier.
ROSALIE: Thank you, Charlene. Some really significant points. And Elizabeth, anything you wanted to add?
ELIZABETH: One point. Also with respect to long-term care and home care. There is a very large group of unregulated workers in that area. And these people also need to be very much involved in the development of technologies. Even much more than nurses, they don’t have a seat at the table. And, they are often the biggest point of care for older adults and other people who live in these settings. So, I’m basically just advocating for that group not to be forgotten.
ROSALIE: Well, thank you for raising that. One of the things that, throughout our book and definitely in this discussion, we want to be clear about is that [even though] what we're talking about may relate to nurses, it is the actions we take with teams and other health care providers that are equally important in terms of ensuring that patients, families, and communities get the best care possible. So I appreciate what you're saying, and thank you too, Charlene, for raising some of the issues that COVID just made much, much more apparent, and much worse.
And it’s probably a good segue into the last section that we wanted to talk about today, which is to think about what roles nurses have in future developments, as they relate to digital health technologies, and other technologies. And we’re being faced on a day-to-day basis with an array of different technologies that are being introduced without a lot of consideration about the impacts, and certainly the ethical issues that they might present.
And one of them, which some of us were talking about recently, is this whole area of bots, like ChatGPT, for example. I was just made aware recently that although that technology has been in development for a while, it was just introduced in November of 2022. And while we're doing this session, just a few months past that, there are already a hundred million users. So, what are the implications of that technology? And it's just one form of technology that has recently been introduced and that has implications for health, for education, and a number of other areas. So the question I would really love for us to address for a few minutes is, what role do you think nurses have in the development and application of health technologies to ensure that these new technologies are developed and applied in accordance with ethical standards?
And maybe I’ll start with you Elizabeth, because you and I were having some discussion about ChatGPT, for example.
ELIZABETH: Gosh. Well, to your first question, when I think of nurses and ethics, for example, many of us have a hybrid background. It's very, very common for people in ethics to have a variety of backgrounds: philosophy, nursing, and so on. I think if we could promote nurses also gaining quite a solid education, perhaps in engineering, computer science, and so on, we could develop a cadre of nurse innovators, scientists, and leaders [with] a very untraditional hybrid background, much as has happened in bioethics. A lot of people in bioethics are hybrids, as I call us. So I can see the same thing being hugely beneficial in nursing with respect to these technologies as well. I know Charlene has a lot of expertise in these areas, which I could never claim to have, and probably has something to say about this as well.
ROSALIE: Thank you, Elizabeth. That’s an important point that you raise, I think, and really relevant to this discussion for certain. So Charlene, what are some of your thoughts?
CHARLENE: I think one of the roles that nurses could have within technology is as innovators. Nurses have always been very resourceful individuals, and nurses have a strong idea of what the problem is and of the potential to solve it. So I think nurses have a lot of ideas. And having the wherewithal to advocate for some of those solutions and to work collaboratively with others as innovators, I think, is a really big role.
As I've been banging the drum about, I also think nurses are co-creators. Alongside coming up with some of the ideas, nurses make space for patients, patients' families, and staff, ensuring that the narrative of the patient, the patient's story, person-centred care, and the humanistic aspect of technology are embedded into the technology itself, so that it reflects the needs and wants of the user.
And not only from a data perspective, but also from a hardware perspective: is the data easy for the user to use? Can somebody actually implement this and use it on a daily basis in their lives? Does it make your life better? Really asking some of those questions, and being the person who can help support and educate, I think is something that nurses can do really, really well! And then also, from an implementation perspective, nurses move evidence and innovation into practice. At one point in time the stethoscope was considered technology, and nurses were the ones who wore the stethoscope and used it on patients' bodies, and I don't see this as any different. I think there are many, many significant roles that nurses have.
Where we don't prepare them enough is in our curriculum because, in my own perspective, we don't emphasize informatics and digital health technologies enough in [the] curriculum, so that nurses can be prepared to ask some of these bigger questions related to advocacy, education, creation, and implementation. And then, going back to some of the things that we talked about: the changing nature of nursing work, and being aware of those things. How do you advocate to make sure that there is that humanistic touch, and that technology is not being introduced as a straw man, or as a Band-Aid solution, so that the underlying issue, which may be a lack of nurses, is addressed appropriately? One example that we've spoken about in the past was a robot greeting patients or doing assessments in waiting rooms, because the wait times are ten hours plus, and so there's a robot going around doing that now. So I think we really need to be aware of the reality that technology is being introduced into, as well as of nursing practice. And I think nurses are the voice of the patients, and of the families.
ROSALIE: Really significant points. And I’m thinking, as we come close to closing, one of the things that I’d be interested in hearing a bit more about, and I know my colleagues that are part of this discussion would be as well, is more about the implications, the ethical implications of some of these programs like ChatGPT, for example, on educators. Because a number of the people who will be looking at this video, and reading this book, will be nurse educators. So I just wondered, Charlene, if you could say more about that, because I know you’ve given it a little bit of thought, and might have some insights that you would like to share.
CHARLENE: Sure. ChatGPT has taken the world by storm, and I think almost every higher education institution around the world is talking about ChatGPT at the moment. There are significant concerns around ChatGPT being used for plagiarism, and how it can impact academic integrity, but it can also get at the heart of key practices that help develop critical thinking. A good example is being able to articulate ideas, write ideas, and synthesize ideas. But you can put those ideas into ChatGPT, or you can cite different articles, and ChatGPT can synthesize them for you. You can even specify what reference style you want, and it will provide the references in that style.
ChatGPT, though, is not perfect, so I think as educators we need to be aware of how to use it. I don't think it's something we can ignore or turn our backs on; it's almost something we need to embrace. So, potentially, when we pose a question to students, we generate something from ChatGPT and actually provide it to them, and say, "Validate this," or "How would you improve upon this?" Have ChatGPT do the basics, and expect more from our students: to think a bit more critically about what's being produced, or what's being given to them.
I think this also calls for educators to think about different assignments that require scaffolding in class, so that students can show that generative work over time. So I think this is an opportunity to rethink some of our approaches, especially when it comes to nursing education.
One story that I will share about ChatGPT: I typed in a question about nursing informatics, and it actually provided me points from the chapter that I had authored. So then I typed in the same prompt again and said, "Provide me references," and it provided the same points, but now at the end of each sentence it gave me a reference, so it said Chu et al., 2021. And at the very end, where it would provide the actual full-text citation in APA style, it made up a completely imaginary reference. It included my name, authors I had co-authored with, and a year. It made up a title, but cited a real journal, and provided a volume, an issue, and page numbers. And so I thought to myself, you know what, I'm going to see what is actually in this journal when I look up the citation, and it had nothing to do with informatics, or technology, or me, or any of my co-authors. It was on a totally random topic about nursing.
And so we really need to be careful about how we are using this type of technology. It's not perfect, and students really need to be aware of that. But, sorry to keep going on about this, ChatGPT also has implications for health care, because a lot of people are actually using it as a counsellor. Somebody can type in a question like, "I'm feeling down today; what are some things I can do, what are some things in the area I can do to raise my spirits?" and it will provide a really detailed answer about what someone can do. So people are actually turning to it for mental health issues, and it's actually being used in mental health care virtually. It has some interesting implications, for sure.
ROSALIE: Well, thank you for that. I think you've given a nice overview of some of the pitfalls, but also some of the promises, because one of the points that you've made that's really important in this whole discussion is that we should not take an adversarial approach to technology. It is with us, it is here, and it has always been involved in the development of different interventions within nursing, from the development of the thermometer onward. In all of the areas we've talked about today, we can see how things have progressed. So it's important for us to be cognizant of that, but also to understand where we, as nurses, need to be involved, to make sure that we're doing the kind of critical thinking that, Elizabeth, you pointed to earlier, and that we're involved with application and also with development and implementation. So those are really significant points.
I think we’re coming close to the end of our time. I know we could continue to talk about all of these areas for a long time. I wanted to ask if each of you could, perhaps, make a final point: either something you think we haven’t covered, or a message that you want to leave people with. And try and make that a pithy point, which is not always easy. So why don’t I start with you, Paddy?
PADDY: I’m chuckling, because I was just thinking about, now what would be a pithy point? Well, I think the whole session, I just can’t thank you both enough, and the insights, and the wisdom, and the mutual support. I think it also shows the relational nature of our work. We learn from, with, and about each other, and how we take those insights forward. And I think it’s incredibly important when we’re into high tech and high stakes areas, such as the development of the kind of technology that you so eloquently described, Charlene, particularly the example of running into it as well, and seeing what happens. So I think it shows that we have a great deal to learn, we have a great deal to contribute, and we do it best I think when we work with each other across disciplines, within nursing, and so forth. So that was just a long-winded thank you.
ROSALIE: Thanks, Paddy. That’s a really significant point. And Jan—what about you? Is there something that you would like to leave the listeners with here today?
JAN: I want to say just how much I have appreciated this time, and the things I’ve learned during it. What concerns me, first of all, is how we can get more access to this kind of understanding, and to the technology, and to what can be done. Especially, I think of people, again, who are maybe in home care, alone at home, or people looking after their partners. But more than that, people who need to know what is possibly available in terms of information they could find out about. And it seems to me that some of what you’ve described is absolutely amazing, and wonderful to consider in terms of how it could be helpful. And I’m just hoping there is going to be a way to get that information to people locally, to those who need it the most: how they can have access, and what they can do to help themselves. So those are my parting thoughts, because it’s such a rapidly developing area. I think the challenge for nurses, if we put [technology in the hands] of nurses, is to not be afraid to learn about it, and to be bold in using it.
ROSALIE: Well, really important, Jan, because as you say, we want to make sure that we try and level the playing field, and that there aren’t these inequities in terms of who has access to knowledge, and who has access to the technologies, etcetera. And that we emphasize this within nursing, and ensure that nurses are getting the education and the support they need to stay involved and to be current, and to also be innovators and change agents. I appreciate your comments. So Charlene, what about you, any pithy last comments you would like to make?
CHARLENE: I think I just want to thank the three of you for your invitation. I’ve really enjoyed this time that we’ve spent together, and it’s been really informative for me, and I hope it will be informative for the viewers in the future. My last ending message is nurses are a really, really important part of this picture, and there are ethical issues all along the technology pipeline. Nurses need to make their voices heard, from ideation, to implementation, to evaluation. I know that some nurses might not feel comfortable in any of those different aspects along the pipeline, but nurses are advocates, and I think we can be [people] who can put a small light on some of these really important ethical issues. So thank you again.
ROSALIE: Thanks, Charlene. And Elizabeth?
ELIZABETH: One pithy point, building on Charlene’s last point. I think we really, really need to hear from point-of-care nurses, in every context, from evaluating technology to developing technology, and so on. But we won’t hear from them if they’re not freed up from their caregiving responsibilities. So maybe this is more of a message for nurse leaders or others in organizations, to free nurses up to be involved in these kinds of activities. Otherwise, we won’t hear from them, because they’re so busy providing care. So it’s more of a message of “please consider freeing them up to be involved in this really important activity,” along with all kinds of other activities that they’re not free to be involved in. So that would be my biggest message for APNs, who may have some control over nurses’ work. That would be my pithy point. And also, thank you very much for inviting me. I’ve learned a lot from everybody in this group, so thank you. And it’s a wonderful book. I’m using it in my course, and I’m glad that there is an addition to [the book] involving technology.
ROSALIE: Well, thank you, Elizabeth, and you’re a big contributor to that book, and have been over time. It’s wonderful to have you now, Charlene, as well, making these contributions. As has been already said, I do want to extend my thanks to you [both], and to Jan, and to Paddy, for this session. I think we could continue this conversation for a long time, for sure. But lots of messages for people to think about. I’ve used this book, the previous editions, in teaching ethics courses at UVic at the graduate level, and I know that other people have done so as well, across Canada, and in other places in the world.
And I think one of the things that I really want to leave people thinking about is that we have a lot of information, as you mentioned, Charlene, earlier, in terms of the number of terabytes, I’ve forgotten how many you said—What was it again?
CHARLENE: A quintillion.
ROSALIE: Yes. So there you go. And a lot of what we know about this area is evolving rapidly, for sure. And we’ve got a lot of literature out there. I’m a very big science fiction fan, as I know others are in this session. But we can learn from mistakes, too, that we’ve seen in literature, in studies, in what’s been written. And I think we need to take that up as well, as we reflect on where we go in the future, and make very careful choices in terms of implementation. Just because we have a technology, doesn’t mean we ought to use it. And so I think that’s the other piece that we have to be very careful about. Because it’s often considered that something that’s new and innovative means it’s better, and that may not necessarily be the case. So we have to move with prudence, and with careful, critical reflection.
And then lastly, I just wanted to say that, in terms of some of the technologies that we haven’t talked about today, that are worrisome—big data, or artificial intelligence, or perhaps even the use of robotics, etcetera. [These] are areas where hopefully the viewers and readers of this session might want to explore a little bit more in some of their work. Because, again, there’s many, many promises, and we’ve seen the benefits of technology, but we also want to be cautious about the perils. So that’s the challenge that I throw out to everybody, to become as fully aware as you can. And, in aid of that, what we’ll do at the end of this session, is to add a number of references that will be useful for people to consider. We’re also going to be adding this video as a transcript to the overall digital book that we’re writing. The transcript will include some references as well. So that will give people an opportunity to view, listen, and read, so that will be accessible for, hopefully, everyone.
So with that, thank you very much again, and I’ll look forward to us as colleagues continuing these discussions, and maybe even doing some more writing about this in the future. So I wish everyone the best. And also, just to [let viewers know], I think [we will] include email addresses, if that’s acceptable to you, Charlene and Elizabeth, and I’ll talk more with Paddy and Jan about that. Because students and readers might want to connect with you in some way after this. So we’ll talk about how best to provide that information.
PADDY: And just before we close, Rosalie, I wanted to provide a huge thanks to you for helping to organize and guide this session. I think it’s gone incredibly well, and it’s been a vision. I know it’s a shared vision with all of us, but I think you’ve really understood what the necessity and the opportunities are here, so thank you so much for that.
ROSALIE: Well, that’s lovely for you to say, Paddy, thank you, and I’ve enjoyed this immensely. It’s hard to say that we have to close, but we do. So, I look forward to future discussions. Thanks very much, everyone.
Endnotes
1 “We now have a new protocol called a Protected Code Blue. A regular code blue is an emergency, when someone is in cardio-pulmonary arrest. They announce it over the speakers. In normal times, it’s all hands on deck: there are lots of people in the room, running in and out to coordinate and deliver medication and care. Now, whenever there’s someone who’s really, really sick, and we announce a Protected Code Blue, there’s almost no one in the room with them. We assume they have COVID-19. You just don’t know, so we’re trying to maximize precautions and not get inadvertently exposed to the virus. Lots of us are outside the room when we’d normally be in it. Just last night, we had a patient who came in sick and we called a Protected Code Blue. I communicated with the team inside—a doctor and two nurses—using a baby monitor, on the other side of the door. I can’t imagine what it’s like for the doctor in the room. You feel so alone as you’re resuscitating this patient. I’m there on the other side of the glass, giving advice into a baby monitor. It’s not the way we’re trained to do this job” (Choi, 2020, para. 5).
References
Abedi, A., Dayyani, F., Chu, C., & Khan, S. S. (2022, November 28–December 1). MAISON – Multimodal AI-based Sensor platform for Older Individuals [Conference presentation]. 2022 IEEE International Conference on Data Mining Workshops (ICDMW). IEEE. Orlando, FL, United States. https://doi.org/10.1109/ICDMW58026.2022.00040
Arries-Kleyenstüber, E. J., Davies, S., Luhanga, F., Chipanshi, M., & Cosford, K. (2021, August 4). Emerging digital technologies in virtual care in clinical nursing practice: An integrative review of ethical considerations and strategies. University of Regina. http://hdl.handle.net/10294/14451
Bajaj, S. (2022, July 27). Racial bias is built into the design of pulse oximeters. The Washington Post. https://www.washingtonpost.com/made-by-history/2022/07/27/racial-bias-is-built-into-design-pulse-oximeters/
Barnard, A., & Sandelowski, M. (2001). Technology and humane nursing care: (Ir)reconcilable or invented difference? Journal of Advanced Nursing, 34(3), 367–375. https://doi.org/10.1046/j.1365-2648.2001.01768.x
Choi, J. (2020, April 8). “You have minutes, maybe seconds, to get it right, or they’ll die”: A doctor describes what it’s like in hospital emergency rooms during COVID-19. Toronto Life. https://torontolife.com/life/you-have-minutes-maybe-seconds-to-get-it-right-or-theyll-die-a-doctor-describes-what-its-like-in-hospital-emergency-rooms/
Chu, C. H., Biss, R. K., Cooper, L., Quan, A. M. L., & Matulis, H. (2021). Exergaming platform for older adults residing in long-term care homes: User-centered design, development, and usability study. JMIR Serious Games, 9(1), Article e22370. https://doi.org/10.2196/22370
Chu, C. H., Ronquillo, C., Hung, L., Khan, S., & Boscart, V. (2021). Technology recommendations to support person-centered care in long-term care homes during the COVID-19 pandemic and beyond. Journal of Aging and Social Policy, 33(4–5), 1–16. https://doi.org/10.1080/08959420.2021.1927620
Chu, C. H., Conway, A., Jibb, L., & Ronquillo, C. E. (2022). The impact of digital technologies, data analytics and AI on nursing informatics: The new skills and knowledge nurses need for the 21st century. In C. Delaney, C. Weaver, J. Sensmeier, L. Pruinelli, & P. Weber (Eds.), Nursing and informatics for the 21st century – embracing a digital world, Book 4: Nursing in an integrated digital world that supports people, systems, and the planet (3rd ed., pp. 149–170). Productivity Press.
Chu, C. H., Nyrup, R., Donato-Woodger, S., Leslie, K., Khan, S., Bernett, C., & Grenier, A. (2022). Examining the technology-mediated cycles of injustice that contribute to digital ageism: Advancing the conceptualization of digital ageism: Evidence and implications. In Petra ’22: Proceedings of the 15th international conference on PErvasive technologies related to assistive environments (pp. 545–551). Association for Computing Machinery.
Chu, C. H., Nyrup, R., Leslie, K., Shi, J., Bianchi, A., Lyn, A., McNicholl, M., Khan, S., Rahimi, S., & Grenier, A. (2022). Digital ageism: Challenges and opportunities in artificial intelligence for older adults. The Gerontologist, 62(7), 947–955. https://doi.org/10.1093/geront/gnab167
Chu, C. H., Yee, A., & Stamatopoulos, V. (2022). Poor and lost connections: Essential family caregivers’ experiences using technology with family living in long-term care homes during COVID-19. Journal of Applied Gerontology, 41(6), 1547–1556. https://doi.org/10.1177/07334648221081850
Dykes, S., & Chu, C. H. (2021). Now more than ever, nurses need to be involved in technology design: Lessons from the COVID-19 pandemic. Journal of Clinical Nursing, 30(7–8), e25. https://doi.org/10.1111/jocn.15581
Fawzy, A., Wu, T. D., Wang, K., Robinson, M., Farha, J., Bradke, A., Golden, S., Xu, Y., & Garibaldi, B. (2022). Racial and ethnic discrepancy in pulse oximetry and delayed identification of treatment eligibility among patients with COVID-19. JAMA Internal Medicine, 182(7), 730–738. https://doi.org/10.1001/jamainternmed.2022.1906
Gawande, A. (2018). Why doctors hate their computers. The New Yorker. https://www.newyorker.com/magazine/2018/11/12/why-doctors-hate-their-computers
Gottlieb, E. R., Ziegler, J., Morley, K., Rush, B., & Celi, L. A. (2022). Assessment of racial and ethnic differences in oxygen supplementation among patients in the intensive care unit. JAMA Internal Medicine, 182(8), 849–858. https://doi.org/10.1001/jamainternmed.2022.2587
Haslam-Larmer, L., Shum, L., Chu, C. H., McGilton, K., McArthur, C., Flint, A., Khan, S., & Iaboni, A. (2022). Real-time location systems technology in the care of older adults with cognitive impairment living in residential care: A scoping review. Frontiers in Psychiatry, 13, Article 1038008. https://doi.org/10.3389/fpsyt.2022.1038008
Ibrahim, H., Liu, X., Zariffa, N., Morris, A. D., & Denniston, A. K. (2021). Health data poverty: An assailable barrier to equitable digital health care. The Lancet Digital Health, 3(4), e260–e265. https://doi.org/10.1016/S2589-7500(20)30317-4
Irani, C. S., & Chu, C. H. (2022). Evolving with technology: Machine learning as an opportunity for operating room nurses to improve surgical care—A commentary. Journal of Nursing Management, 30(8), 3802–3805. https://doi.org/10.1111/jonm.13736
Ishiguro, K. (2021). Klara and the sun. Alfred A. Knopf.
Kwon, J. Y., Karim, M. E., Topaz, M., & Currie, L. M. (2019). Nurses “seeing forest for the trees” in the age of machine learning: Using nursing knowledge to improve relevance and performance. CIN: Computers, Informatics, Nursing, 37(4), 203–212.
McBride, S., Tietze, M., Robichaux, C., Stokes, L., & Weber, E. (2018). Identifying and addressing ethical issues with use of electronic health records. Online Journal of Issues in Nursing, 23(1). https://doi.org/10.3912/OJIN.Vol23No01Man05
Nyrup, R., Chu, C. H., & Falco, E. (2023, in press). Digital ageism, algorithmic bias and feminist critical theory. In J. Brown, S. Cave, K. Mackereth, & E. Drage (Eds.), Feminist AI: Critical perspectives on data, algorithms and intelligent machines. Oxford University Press.
Peter, E. (2020). Feminist reflections on home, digital health technologies and ethics. In H. Kohlen & J. McCarthy (Eds), Nursing ethics: Feminist perspectives (pp. 137–148). Springer.
Racine, E., Boehlen, W., & Sample, M. (2019). Healthcare uses of artificial intelligence: Challenges and opportunities. Healthcare Management Forum, 32(5), 272–275. https://doi.org/10.1177/0840470419843831
Ronquillo, C. E., Peltonen, L.-M., Pruinelli, L., Chu, C. H., Bakken, S., Beduschi, A., Cato, K., Hardiker, N., Junger, A., Michalowski, M., Nyrup, R., Rahimi, S., Reed, N., Salakoski, T., Salanterä, S., Walton, N., Weber, P., Wiegand, T., & Topaz, M. (2021). Using artificial intelligence in nursing: Priorities, opportunities, and recommendations from an international invitational think-tank of the Nursing and Artificial Intelligence Leadership Collaborative. Journal of Advanced Nursing, 77(9), 3707–3717. https://doi.org/10.1111/jan.14855
Samrasweet, Y. (2023, March 31). How ‘compassionate ageism’ made its way into design of new technology. Spark. CBC Radio. https://www.cbc.ca/radio/spark/how-compassionate-ageism-made-its-way-into-design-of-new-technology-1.6782574
Sandusky, K. (2023, February 15). What will be the impact of AI-assisted robotics on humanity? CIFAR. https://cifar.ca/cifarnews/2023/02/15/what-will-be-the-impact-of-ai-assisted-robotics-on-humanity/
Schulte, F., & Fry, E. (2019, March 18). Death by 1,000 clicks: Where electronic health records went wrong. Fortune/Kaiser Health News. https://khn.org/news/death-by-a-thousand-clicks/
Sjoding, M. W., Dickson, R. P., Iwashyna, T. J., Gay, S. E., & Valley, T. S. (2020). Racial bias in pulse oximetry measurement. The New England Journal of Medicine, 383(25), 2477–2478.
Temsah, O., Khan, S. A., Chaiah, Y., Senjab, A., Alhasan, K., Jamal, A., Aljamaan, F., Malki, K. H., Halwani, R., Al-Tawfiq, J. A., Temsah, M.-H., & Al-Eyadhy, A. (2023). Overview of early ChatGPT’s presence in medical literature: Insights from a hybrid literature review by ChatGPT and human experts. Cureus, 15(4), Article e37281. https://doi.org/10.7759/cureus.37281
Tomašev, N., Glorot, X., Rae, J. W., Zielinski, M., Askham, H., Saraiva, A., Mottram, A., Meyer, C., Ravuri, S., Protsyuk, I., Connell, A., Hughes, C., Karthikesalingam, A., Cornebise, J., Montgomery, H., Rees, G., Laing, C., Baker, C. R., Peterson, K., … Mohamed, S. (2019). A clinically applicable approach to continuous prediction of future acute kidney injury. Nature, 572(7767), 116–119. https://doi.org/10.1038/s41586-019-1390-1
Valbuena, V. S., Merchant, R., & Hough, C. L. (2022). Racial and ethnic bias in pulse oximetry and clinical outcomes. JAMA Internal Medicine, 182(7), 699–700. https://doi.org/10.1001/jamainternmed.2022.1903
Valbuena, V. S., Seelye, S., Sjoding, M. W., Valley, T. S., Dickson, R. P., Gay, S. E., & Iwashyna, T. J. (2022). Racial bias and reproducibility in pulse oximetry among medical and surgical inpatients in general care in the Veterans Health Administration 2013–19: Multicenter, retrospective cohort study. BMJ, 378, Article e069775. https://doi.org/10.1136/bmj-2021-069775
Von Gerich, H., Moen, H., Block, L. J., Chu, C. H., DeForest, H., Hobensack, M., Michalowski, M., Mitchell, J., Nibber, R., Olalia, M. A., Pruinelli, L., Ronquillo, C., Topaz, M., & Peltonen, L.-M. (2022). Artificial intelligence-based technologies in nursing: A scoping literature review of the evidence. International Journal of Nursing Studies, 127, Article 104153. https://doi.org/10.1016/j.ijnurstu.2021.104153
Zhu, J., Shi, K., Yang, C., Niu, Y., Zeng, Y., Zhang, N., Liu, T., & Chu, C. H. (2022). Ethical issues of smart home‐based elderly care: A scoping review. Journal of Nursing Management, 30(8), 3686–3699. https://doi.org/10.1111/jonm.13521