Safety Services New Brunswick

Navigating Risk in a Rapidly Changing World: Insights from a Canadian Forces Flight Safety Officer - Capt. Mary-Frances Zielinski

Safety Services New Brunswick Season 3 Episode 28

Send us an e-mail to podcast@ssnb.ca

Mary-Frances Zielinski, Flight Safety Officer with the Canadian Army, joins host Perley Brewer to share powerful insights on risk, leadership, and the human side of safety in high-tech environments. From flying combat missions in Iraq to drone regulations, this episode offers a unique perspective on managing risk in a rapidly evolving world.

Perley Brewer   0:42
 Welcome to today's podcast. My name is Perley Brewer and I will be your host. Today's podcast guest is Mary-Frances Zielinski. Welcome, Mary-Frances.
Mary-Frances Zielinski   
0:53
 Hello, Perley, thank you for inviting me to your show.
Perley Brewer   
0:57
 So my colleagues here at Safety Services recently enjoyed a presentation that you did in Vancouver on safety and future risk analysis: how do we manage and advise on safety when tech outpaces organizational capacity? They came back and very highly recommended you for one of our future podcasts, which of course is today. So again, thank you very much for taking the time; obviously you're a very busy individual, speaking on a wide variety of topics. So today what I want to focus on really is two areas. One is the presentation that you did in Vancouver, just to give our listeners a little sense of some of the topics you cover. And then I see from your LinkedIn profile you also speak on a few other topics, and I'd like to delve into those just a tiny bit as well.
 So let's start, Mary-Frances, with you telling our listeners about your background.
Mary-Frances Zielinski   
1:54
 Yeah.
 Well, thanks, Perley. I basically think that safety's been in my repertoire since the very beginning. I joined the Navy in the 1980s as a diesel mechanic, and right from fighting shipboard fires to rescuing boats taking on water, it quickly became part of my life. I was a paramedic in the forces and also in the Coast Guard, so there were some other accident scenes that I attended, and eventually I went on an icebreaker through the Canadian Coast Guard. I worked for the Coast Guard for nine years and fell in love with helicopters in the Arctic. So after that I got my flying licence and flew oil and gas, mining, forestry, heli tours, and then eventually kind of got tired. At that time there were no women really flying in the industry, so I joined the Canadian Forces to get a fair shake in the cockpit; that would have been in 2001, when I went to the Air Force. From there I've done tours to Bosnia, the Arctic, and the United States, worked as an operations officer during the Afghanistan years, worked in operations for the Canadian Winter Olympics, and then most recently deployed to Iraq in 2019 and 2020. I became a flight safety officer, and currently I'm the flight safety officer for the Canadian Army in the west, 3rd Canadian Division, which is Western Canada. And that brings you kind of up to speed.
Perley Brewer   
3:39
 Very impressive background. As I say, you've obviously been around in a variety of situations. In your presentation, you talked about risk identification versus risk analysis versus risk assessment versus risk management. Do you want to talk a little bit about some of those terms?
Mary-Frances Zielinski   
3:59
 Sure. I think it's important for us to understand that we do risk analysis and risk identification every day in our regular lives. Those terms may seem overwhelming at first, but really it's something that we do unconsciously as humans.
 The identification piece is what we in the forces call the OODA loop: Observe, Orient, Decide, and Act. That OODA loop is part of the identification, analysis, and management piece. When you observe a risk, you're identifying it. It could be anything from your dog chewing on an electrical cord to something that you see at an accident scene that you are not sure of, but have a feeling about. The analysis piece comes when you've already oriented toward whatever the risk is and you're deciding how critical, or how low-priority, you're going to make it. There's risk that's "what's the closest alligator to my boat," as I like to say. And then there's risk where I must immediately decide that I'm going to deal with this, or create a plan for how to deal with this, the worst-case scenario.
 So that turns us into management of the risk, or control of the risk. I often say in my slides that you need to envision what your absolute worst-case scenario is, and it's not a matter of if it's going to happen, it's a matter of when. For us, later we'll talk about a case: Stalker 22 was a Cyclone helicopter with the Canadian Navy that crashed in the Aegean Sea, and there were a lot of factors in that accident that we'll discuss. But it's the worst-case scenario to lose five lives off of a ship's crew on a beautiful sunny day, and that type of thing is going to happen; it's a matter of when. So controlling that kind of risk comes down to your planning process, and in our OODA loop, when we get to the decide-act piece, that is where you're going to make your plan.
 So we analyze continually; I call it a wash, rinse, repeat cycle. We analyze continually, go back and decide if it's a priority, and then plan again for how we're going to act when and if that happens.
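The wash-rinse-repeat cycle Mary-Frances describes can be sketched as a small loop. Note that the severity and likelihood scales, the multiplicative score, and the action threshold below are illustrative assumptions for the sketch, not Canadian Forces doctrine:

```python
from dataclasses import dataclass

@dataclass
class Risk:
    name: str
    severity: int    # 1 (negligible) .. 5 (worst case)
    likelihood: int  # 1 (rare) .. 5 (almost certain)

    @property
    def priority(self) -> int:
        # Simple risk-matrix score: the "closest alligator" scores highest.
        return self.severity * self.likelihood

def ooda_pass(observed: list[Risk], act_threshold: int = 15) -> list[str]:
    """One wash-rinse-repeat pass: observe the risks, orient by sorting
    on priority, decide against the threshold, then act or re-queue."""
    decisions = []
    for risk in sorted(observed, key=lambda r: r.priority, reverse=True):
        if risk.priority >= act_threshold:
            decisions.append(f"ACT NOW: plan for worst case of '{risk.name}'")
        else:
            decisions.append(f"MONITOR: re-assess '{risk.name}' next pass")
    return decisions

risks = [
    Risk("dog chewing an electrical cord", severity=4, likelihood=4),
    Risk("cluttered hangar floor", severity=2, likelihood=3),
]
for decision in ooda_pass(risks):
    print(decision)
```

The point of the re-queue branch is that analysis never finishes: anything not acted on immediately goes around the loop again on the next pass.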
Perley Brewer   
6:41
 You mentioned that technology comes with a false sense of security when you're going through your process here. What do you mean by that?
Mary-Frances Zielinski   
6:51
 I think it's really easy to assume that people know what you know, and technology gives us the false sense of security that everything has been planned for. Again, Stalker 22's accident is a perfect example of that, so we'll talk about that later, and even one of the Chinook accidents we had two years ago in the Ottawa River, where two pilots lost their lives. I think that people get lulled into just assuming that all the factors are considered, all the factors of how solutions are presented to you and the type of information that goes into what the technology presents to you. But I like to use lessons from the battlefield in Ukraine as part of my process of explaining risk. Something that we can't analyze is human innovation, and this is where the connection between communication and human innovation comes to technology.
 When you take a UAS, an unmanned aerial system, which people call drones, and attach an incendiary device or a bomb, we have no way of predicting what the outcome of that bomb is going to be. How far-reaching is it going to be, 10 metres or 100 metres, when it explodes? How can you quantify risk with technology? That's part of the basis of what I try to explain. So we get lulled into thinking that technology is the answer, but really it's the human aspect that we need to consider.
Perley Brewer   
8:42
 Now I have a couple of questions that are, I guess, maybe not quite safety related. When you talk about drones, we watch the news channels a lot, and of course that's one of the key items that's talked about. You mentioned drones a second ago. How accurate are they, and how far can they go?
Mary-Frances Zielinski   
9:02
 Well, that is an extremely interesting question, and I try to stay abreast of daily developments, but really, truly, right now the war between Ukraine and Russia is daily pushing the bounds of human innovation and human technology.
 We have the precision of attaching a device to a drone that is as small as a cell phone or smaller, programming it to fly into a building through a window, or even going up a shaft inside the building, an elevator shaft or a ventilation shaft, exiting on a specific floor, and exploding to collapse the whole building. We have the capability to predict and pick out very particular targets and, you know, explode into the side of a moving supply truck with flechettes, which are razor-sharp, arrowhead-type projectiles. That kind of damage: how do we quantify that kind of risk? How do we quantify that kind of damage? So what we're looking for there is an outcome of an effect, and we call these things air effects.
 How far can they go? I just read an article yesterday about a conversion Sikorsky has done on the Black Hawk helicopter, which has been the mainstay of air-army integration across the battlefield for many years, and it is completely automated now. They've removed the entire cockpit and the pilot; in fact, they call it an optionally piloted variant. So that helicopter variant, now unmanned, or unpersoned, is going to be able to fly into a battlefield, land, open its nose up, and deliver containers of logistical supplies. It can, in fact, sling things under the airframe with no pilot. So it leads us to think more and more about remote applications and where the human interface is.
Perley Brewer   
11:17
 Now for years and years, of course, and maybe even decades, the military focused on tanks, on specialized airplanes, and so on. How is the military dealing with the new advances in drones that have certainly come out of the Ukraine-Russia war?
Mary-Frances Zielinski   
11:42
 That's an interesting question to any of us in the profession of arms, and even to the interested historian. The Canadian Army has successfully flown drones for quite a number of years in the armoured and the artillery corps; those drones were used for marking targets throughout the later part of the Afghanistan conflict. So we do have cadres or sections of the Canadian Army that have been using drones for a while, but now this full-on cultural change that's happening, this full-on rollout of drones for the Canadian Army, is challenging quite a few things, particularly for safety and for cultural awareness, where I think assumption biases come in. A lot of people think that because a drone is a small thing, and they can just throw it up in the air or launch it out of the palm of their hand if it's a micro drone, there's very little danger involved with that, very little risk. But in fact, and I'm saying this now as a pilot, as soon as you operate in the third dimension of the airspace, there's an entirely new set of regulations, planning, and conceptualizing about how you move across the space. We can see that with Amazon and some of the highways in the sky that they're developing to deliver parcels, where the drone is beyond visual line of sight of the operator. We can deliver anything from a pizza to a package to, again, a bomb. So there are a lot of growing pains currently in the forces adopting this, and in getting past the assumptions the Air Force has grown up with, to not make the mistake of thinking the Army knows what the Air Force is talking about.
 I have an example of that in a small flight safety investigation that we did. The Armoured Corps, which has been flying drones for a long time, as I said, had been keeping their logbooks in a particular way, and when we had a flight-safety-trained Warrant Officer go back from the Special Forces unit to the Armoured Corps, he realized that the logbooks had been kept incorrectly. The assumption bias there was: well, they know what they're doing, and they're flying. But nobody from the Air Force, anyway, ever sat down and taught them, whereas as a pilot and aircrew in a traditional aircraft, you know that, you breathe and live that. Something as small as that makes a big difference, because the drone operators for the Armoured Corps may not have been current or qualified to fly some of the missions they were flying; they have minimum approaches and landings, just like pilots do, to keep up. So it's a small example, but it goes to show you how assumption bias can make a big difference.
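The currency check that the Warrant Officer was effectively doing by hand can be sketched as a logbook scan. The 180-day window and the minimum counts below are hypothetical placeholders for illustration, not actual Canadian Armed Forces figures:

```python
from datetime import date, timedelta

def is_current(logbook: list[dict], today: date,
               window_days: int = 180,
               min_takeoffs: int = 6, min_landings: int = 6) -> bool:
    """Sum recent logbook entries and compare against recency minimums.

    Each entry is {"date": date, "takeoffs": int, "landings": int}.
    Window and minimums are illustrative, not real military limits.
    """
    cutoff = today - timedelta(days=window_days)
    recent = [e for e in logbook if e["date"] >= cutoff]
    takeoffs = sum(e["takeoffs"] for e in recent)
    landings = sum(e["landings"] for e in recent)
    return takeoffs >= min_takeoffs and landings >= min_landings

logbook = [
    {"date": date(2024, 1, 10), "takeoffs": 4, "landings": 4},  # too old
    {"date": date(2024, 11, 2), "takeoffs": 3, "landings": 3},
    {"date": date(2024, 12, 15), "takeoffs": 3, "landings": 3},
]
print(is_current(logbook, today=date(2024, 12, 31)))  # prints True
```

The bug she describes, logbooks "kept in a particular way," corresponds to entries recorded in a shape this check cannot read, which is how an operator drifts out of currency without anyone noticing.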
Perley Brewer   
14:48
 Now, one of the comments in your presentation that to me was a really good reminder was that regulations move slower than innovation.
 Do you want to talk just briefly about that?
Mary-Frances Zielinski   
14:59
 That is so true. I think when we have something brand new off the floor, for example this unmanned helicopter I just talked about, it took eight months for Sikorsky to conceptualize it and then to create the variant. The regulatory piece of that is always socialized, and in flight safety in the Canadian Armed Forces we do something called preventative measures. If you see a risk coming, or you have an accident that has a risk involved, we adopt and socialize between the various layers where the risk is. So yes, regulation follows more slowly. So are you going to be the safety person that moves first and asks questions later, I call it the tail wagging the dog, or are you going to be the type of safety person that alerts your higher headquarters or your higher office and says, hey, we've got to think about this, and waits for them to generate the direction and guidance that you might need on your work site?
 So I think that lends itself to the discussion about an individual as a safety expert.
Perley Brewer   
16:22
 Are there any regulations now anywhere in Canada when it comes to drones?
Mary-Frances Zielinski   
16:28
 Absolutely, yes. There's a whole set of regulations about drones. There's a very good drone selection tool out there: if you want to fly your drone in your backyard or in the local park, you can click on the map; I can send you a link for that if you'd like. Transport Canada has it. You click on the province you're in, you click on the region you're in, you click on the municipality you're in, and it shows you all the different airspaces around you and what type of airspace you're allowed to fly in. And you need to be regulated: you are now considered a pilot, so you have a minimum standard for your weight class of drone to get a pilot's licence, actually, as a drone pilot.
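The weight-class logic she alludes to can be sketched roughly as follows. The 250 g and 25 kg breakpoints mirror Transport Canada's commonly published classes, but the requirement strings returned are a simplification, a pointer rather than legal advice; always check the current regulations and the airspace map before flying:

```python
def drone_requirements(weight_g: float, controlled_airspace: bool) -> str:
    """Map a drone's weight class to the licensing it likely triggers.

    Breakpoints (250 g, 25 kg) follow Transport Canada's published
    classes; the returned requirements are simplified for illustration.
    """
    if weight_g < 250:
        return "micro drone: registration not required, fly responsibly"
    if weight_g <= 25_000:
        if controlled_airspace:
            return "register drone + advanced pilot certificate"
        return "register drone + basic pilot certificate"
    return "over 25 kg: special flight operations certificate (SFOC)"

print(drone_requirements(249, controlled_airspace=False))
print(drone_requirements(900, controlled_airspace=True))
```

Even the micro-drone branch is not a free pass: as she notes next, anything in the third dimension of the airspace is a hazard to crewed aircraft.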
Perley Brewer   
17:17
 As a helicopter pilot, how much danger is associated with drones when it comes to you as a pilot?
Mary-Frances Zielinski   
17:26
 Absolutely huge. The last thing that I need to be doing is worrying about something that could be as small as a crow or a magpie flying around in the sky, or as large as an eagle or bigger, potentially, and...
Perley Brewer   
17:30
 Cool.
Mary-Frances Zielinski   
17:45
 I am on a mission for a forest fire, or I'm on a mission for a flood in the local neighbourhood, and I'm trying to assist people, and somebody is there trying to get photos or just see what's happening, and I fly right into that. Then that gets ingested in my turbine engine and causes an engine failure, or smashes into my windshield and causes a major crack, and you can imagine the damage in the cockpit and the havoc that that would create. So it's a huge danger, Perley.
Perley Brewer   
18:19
 Now, we had an incident actually this summer where that happened with some of the folks fighting wildfires here in Atlantic Canada. So how do you manage that risk as a helicopter pilot?
Mary-Frances Zielinski   
18:27
 That's right.
 So communication is really the key. You need to learn the layers of organizations that you need to communicate with, and the media helps us quite a bit. First off, you're going to find out, provincially, where your operations centre has stood up. If there is any military component there, they're going to have what they call an air task force, an ATF, or a tactical operations centre. You want to talk to all the liaisons that exist for these agencies: the RCMP, anybody else that might be operating a drone there, and the media as well. There will often be a published no-fly zone over these emergency locations, and we do our best to make sure that everybody's talking, but that doesn't stop Jane Doe from flying her little micro drone to try and get pictures. So the public itself needs to be more aware and vigilant of how dangerous this is.
Perley Brewer   
19:40
 Yeah. So from a risk point of view, for a pilot like yourself, it's really a tremendous risk.
Mary-Frances Zielinski   
19:49
 It is. We watch out for birds constantly in a helicopter, because birds are just as dangerous as a drone, in theory. We've had aircraft hit things as big as an eagle or a goose and completely make a hole in the cockpit window. We've ingested a number of smaller birds into the engines, and then you need to land, shut down, and check out the engines. Of course, if you're in a battlefield situation, that's not the case. But anything that is a risk to flight, we're constantly looking out for: wires, antennas, anything like that.
Perley Brewer   
20:30
 So you've been in a variety of different situations. What's the most dangerous situation that you have found yourself in?
Mary-Frances Zielinski   
20:39
 I would say that my last tour in Iraq was probably the most combat I've seen, and we were under enemy fire pretty much every day there. It started out originally as predominantly night-time attacks on the camp, Camp Taji, which was about 25 kilometres just outside the city of Baghdad, and we were flying daytime and night-time missions there. We were constantly under threat from enemy fire, so tactically that would probably be the most risk that I've had. We had a number of deaths on the camp, we had explosions on the camp, and we had some other incidents on the camp, so that would probably be, personally, my most dangerous theatre of operations.
Perley Brewer   
21:39
 So when you talk about someone being exposed to this whole concept of risk and risk analysis and so on, you've certainly been there and you've certainly seen it.
Mary-Frances Zielinski   
21:48
 Absolutely first hand.
Perley Brewer   
21:51
 You talked in your presentation in Vancouver about investigations as part of the process, and looking at technology as well. Could you talk about one of the investigations you were involved with, and generally, what was the approach that you used?
Mary-Frances Zielinski   
22:07
 Well, I'd like to talk about two investigations that I was not involved with directly; one of them I was involved with peripherally, and the other is just a great example. So the first example I'd like to discuss is the Cyclone Stalker 22 accident that happened in 2020 in the Aegean Sea, just after I got back from Iraq. That investigation has tremendous impact for a number of reasons. I feel that as I elaborate, you will be able to see slowly how the faults came together and created this tragedy where five lives were lost.
 Essentially, it should never have happened. We had a brand new aircraft with very few hours on it, with an experienced, fully trained crew on board, on a beautiful-weather day, doing a routine type of mission on a calm sea just off their ship, so they weren't flying far away. And, unfortunately for your listeners, I'll do my best to explain this, because I talk with my hands a lot. The mission of a naval helicopter like that is to spot and locate submarines. This particular mission was to get photographs of that type of flight regime and the ship for that day, so they had extra people on board, and this is why the loss of life was so high.
 The aircraft departed the ship and was flying a mission where you fly low and fast over the surface of the ocean to pick out any enemy submarines. They locate the submarine and they fly up, like a circus ride, a roller-coaster ride. After they locate the submarine, they turn 180 degrees away from it so that they can get in a safe zone. They pick up more speed and a little more altitude, and then they dive down again towards the water and head over the ship one more time, so that they've marked that submarine twice in a short period of time. This information is sent back to their ship, and it gives the ship an idea of what type of submarine or enemy vessel it is and what the path and speed of that enemy vessel could be.
 Now, this aircraft was built with a four-axis autopilot. In other words, it can operate with no hands on the controls for a period of time; you control it as a pilot through the inputs that you put into your autopilot: what heading you're going to fly, what speed, what altitude, climbing or descending, and what orientation you'd like the aircraft to point in. The pilots were trained at the flight school to trust that autopilot, to trust the technology that had been developed, and because they were younger pilots that have only known autopilots that work, they had never had to fight off an autopilot. My generation of pilots developed with the tactical helicopter; I've flown the Griffon for my military career, which is a Bell 412 variant, essentially the Twin Huey from the Vietnam days but with different blades and a different transmission, a little bigger. The autopilot was developed during my generation, so we learned, when it fights us, what we need to do in our particular aircraft, how to kick it off.
 The other thing is that the collaboration between the government and the helicopter companies on manuals and training systems has gaps in it until things happen in industry, or happen on a mission, where you go back to the manufacturer and say, hey, this part didn't work the way it was supposed to, or there was a failure of this particular component of the aircraft, or why can't we make this input into the autopilot? Essentially, what happened here is what happened with the Boeing 737 MAX: the engineers that designed the autopilot didn't understand the tactical flying that the Navy had to do. They didn't know about this low, fast, gaining-altitude-and-speed, turning-around-really-quickly tactic that the Navy needs, so that wasn't even a consideration when they built the parameters of the autopilot. The next piece was that the schoolhouse didn't have an emergency procedure, in the manuals written by the manufacturer, for an emergency kick-off of the autopilot. And the other piece is that the pilots, as I said, were of a younger generation and were taught to rely upon that technology. So I'm not sure, Perley, if you've ever heard of the Swiss cheese model of accidents, but essentially you have a hole in one layer...
Perley Brewer   
27:43
 Yes.
 Mm-hmm.
Mary-Frances Zielinski   
27:48
 And you have a hole, a gap, in another layer, and all these gaps together lined up to create the perfect storm of tragedy for Cyclone Stalker 22. So the aircraft fought the pilots, the pilots fought the aircraft. It was in low, fast flight, trying to return to the ship for safety, and unfortunately it didn't make it.
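The Swiss cheese model lends itself to a short numeric sketch: an accident only gets through when the holes in every defensive layer line up. The layer names and probabilities below are invented purely for illustration:

```python
def accident_probability(hole_probs: list[float]) -> float:
    """Swiss cheese model: with independent layers, an accident needs
    the hole in every layer to line up, so the per-layer failure
    probabilities multiply."""
    p = 1.0
    for hole in hole_probs:
        p *= hole
    return p

# Four hypothetical layers (design, manuals, training, crew experience),
# each with a 10% chance of failing to catch the hazard on a flight:
aligned = accident_probability([0.1, 0.1, 0.1, 0.1])
print(f"{aligned:.6f}")  # each intact layer cuts the odds tenfold
```

The instructive part is the converse: close any one hole, say, publish the missing emergency kick-off procedure, and the product collapses, which is why single fixes can prevent "perfect storm" accidents.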
Perley Brewer   
28:18
 Now, yeah, the investigation of that, how long would it have taken?
Mary-Frances Zielinski   
28:23
 This is something that, as you can imagine, because of the loss of life and the media, is highly prioritized. It is the Directorate of Flight Safety for the Canadian Armed Forces that spearheads the investigation and sends the team out to the aircraft and the site. Recovery of the parts of the aircraft that were still accessible on the ocean floor was complex, challenging terrain, in order to recover any information and see if there were any bodies to recover. So essentially, there is the initial reporting that is circulated inside, and then the final report is produced on a different timeline. It was at least a year before the final report was published, but with the big interest, that is quite quick, to be honest. The initial reports were written sooner than that, but I can't speak to
Perley Brewer   
29:18
 Hmm.
Mary-Frances Zielinski   
29:22
 That part of it, because I wasn't directly involved.
Perley Brewer   
29:26
 One of my pet peeves is, whenever there's a crash, you turn on the TV and the channels, whether it's CNN or whoever, all of a sudden bring in these specialists to start speculating on what could have happened. And every time, I just shake my head and think, you know, a plane just went down, or a helicopter, whatever, and they have absolutely no idea whatsoever. And yet they will sit there for hours and speculate and try to suggest it could be this, it could be that, and so on.
Mary-Frances Zielinski   
30:00
 We've had a couple of accidents like the one in Washington, DC with the helicopter and the plane, and we had an aircraft off the runway, I believe it was in Ontario, last winter, where they speculated on runway conditions. And you're completely right: people that are speculating on the news actually...
Perley Brewer   
30:01
 Yeah.
Mary-Frances Zielinski   
30:20
 Are speaking outside their professional realm.
Perley Brewer   
30:24
 Well, you know, really, when it comes down to it, with any accident, if you're really going to do a proper investigation, one, you go into it with an open mind, because you have no concept of what the causes could be. And it always frustrates me that they jump almost to a conclusion instead of going through the process. So, you talk about technology: how are organizations being outpaced by technology, and how do you keep up with technology that's changing so rapidly, not only in your industry but in everything, from automated cars to every other kind of application coming down the pipe, as you mentioned, delivery companies trying to deliver your pizza by drone and so on? How do you keep up with that?
Mary-Frances Zielinski   
31:17
 Well, it's a challenge, obviously. I think the key, as I said, is the human factor. We are the link between our past with no technology, you know, when I was flying in Bosnia, we didn't have GPS, we didn't have cell phones, we didn't have any of those conveniences, the electronic tablet on your knee with all the pre-programmed flight approaches, that kind of thing. So the human factor is the connection between our past, our present, and our future. And as we innovate daily and develop new technology, you want to remember, culturally, inside your company or your organization, that people are your strongest asset. If you have someone in your organization that is keen that way, or perhaps you have to consider devoting resources to developing a position that just deals with technology, they become your subject matter expert for your drone, if a drone is going to be applied to whatever your operation does. You know, ranchers and farmers are using drones to survey their cattle and their properties; we have real estate agents using drones to capture photos. So if you're going to bring that on board and you're a small organization, you have to think about doing due diligence and looking out for those regulations. If you're a larger organization and can afford to have someone that is your, I'll make a joke here, your drone nerd, because they're into that technology, then perhaps that's part of your solution. But it's the communication that drives us; that's where your failure and your accident risk space opens up.
Perley Brewer   
33:02
 So, excuse me, you also do a lot of speaking on leadership, cultivating and developing leadership within the safety realm. What's your key message there when you talk to a group of people?
Mary-Frances Zielinski   
33:15
 Yeah.
 I think that people need to understand, and this is my personal take, that passiveness in safety is not a thing. In order to be a good safety professional, you need to understand that safety is an active role and that you need to be willing to speak up if you see something and recognize that there is risk there. It's using your social, your intellectual, and your professional currency, as I call it. You develop a relationship with your supervisor, your general, your chief, some administrative officer in your organization, so that they are willing to give you an open door. When you go and knock on the door, it's for a reason; whether they want to hear that reason or not is another conversation. But leadership is developing that relationship and knowing that you will have to stop operations at some point, and you might be incurring a cost to the company. My opposite point of view on that is: what is the cost? Like I said, envision your worst-case scenario. If your organization on the work site doesn't have an emergency response plan, if the crane tips over or the worker at the bottom of the pit hasn't had their safety addressed, then you're setting yourself up for failure.
Perley Brewer   
34:54
 The importance of cultivating and developing leadership: is it more critical today than it might have been even a decade ago, when it comes to the military?
Mary-Frances Zielinski   
35:06
 Absolutely, and not just for the military, Perley, for everybody.
 In order for us to get a grip on how fast things are changing day to day in our workplace, we need communication more than ever, and that cultivation of the team is part of my message. Part of some of the other discussions I have in the talks that I do is the functionality of a team, the culture of the workplace, and how to develop every single person's leadership. Really, connection truly is our key; that's how we build our relationships. So we're not going to do well with the changes that exist if we don't know one another and don't feel connected to one another, if we don't have that connection and that leadership and that teamwork.
Perley Brewer   
36:01
 Yeah, I know. If I go into a manufacturing operation or a wood-processing operation here in New Brunswick, they're certainly including technology more and more every day, and they tend to focus a lot, I find, on the individual, maybe operating one piece of equipment, one part of their process that's been updated and is now being technologically controlled. And I find in a lot of cases the supervisor's not really aware of how that technology works, and management's not either.
 What kind of problems does that create?
Mary-Frances Zielinski   
36:37
 Again, it creates a false sense of security. Part of risk management is addressing the unknown, and again, I'll go back to the drone example: when you're creating something new for your organization, your manufacturing, that kind of thing, has anybody sat back and made a risk assessment? You can't expect every single person to be a subject matter expert in the organization with that technology, but you do have to look at where things can go wrong. If we haven't looked at that, we don't have a plan.
 Part of the Chinook accident that happened in the Ottawa River, where we lost the two pilots, involved something very simple; this is what I mean when I talk about going back to basics. They had this accident basically at midnight, and the aircraft went down into the Ottawa River near the town of Petawawa, so you can imagine the call-out that ensued after that, all the people that were recalled to come in. The commanding officer of that squadron was on leave, official leave, and someone else had been designated to be the acting commanding officer. But as they tried to get hold of the commanding officer, his iPhone was on Do Not Disturb.
 So, do you know how to get hold of your boss if the phone is on Do Not Disturb? Apparently you can; I'm not an expert on this, but I believe if you call three or five times within a certain period, the phone will override the Do Not Disturb. The other thing that happened was the Defence Wide Area Network computer system crashed, because so many people were trying to log on to get the emergency response plan, get details of the aircrew and the personnel records, verify the fuel of the aircraft, verify the aircraft maintenance. The whole system crashed.
 So what do you do when we don't have technology? We go back to what we knew in the past, and that is going back to basics. So, back to your manufacturing question: the supervisor doesn't need to know how to manage the technology, but they have to have looked at what their weak fronts are, their gaps, their risk spaces.
Perley Brewer   
39:20
 Well, look, Mary-Frances, certainly a very eye-opening session today when it comes to this whole topic of risk and what your industry, what the military, is going through. Very, very informative.
 Folks that are listening that would like a copy of the presentation that Mary-Frances did in Vancouver, you can e-mail her at mfski212@gmail.com. That's mfski212@gmail.com.
 Thank you very much, Mary-Frances, for being with us.
Mary-Frances Zielinski   
40:00
 Thank you, Perley, and I really appreciate the time to talk to your listeners.