Plenary Session

Brief summary of the session topic: software and hardware failures as a critical problem.

  • Software failure / technology failure
  • Noted that it’s not just technology that fails, but humans too
  • Trust relationships are important. For AI, good data is critical. Better governance is required. Ethics are vital.
  • CEOs need to be responsible for a safety culture, not just a profit culture, and for inculcating that culture. A safety culture needs to be holistic, as failures are often complex and involve multiple components. People need to accept ownership and have a common language for communicating safety concerns and approaches.
  • Communication can be complex due to a lack of common frames of reference.
  • Do/should engineers understand the big picture? Differences between companies: some are hierarchical, some not, but things may still get used incorrectly or unethically. Systems are also now often very complex, but still need to be usable, reliable, and safe. Usability may often be compromised by poor user interfaces (e.g. Boeing or Airbus).
  • NASA was mentioned: too slow, or mindful of past failures, versus SpaceX – which approach is best?
  • Radiation dosage problem: the user interface led doctors to use a workaround, but patients were dosed inappropriately. Fitness for purpose, and cost of implementation versus cost of consequence, especially where human life is at risk.
  • Negligence law discussed. This is not yet very present in tech. Is this really true?
  • Smartwatches, sensors. Who is responsible? Easy to use, but reliable? Patients have little leverage over how their data is used? Personal health devices are easy to hack at the moment. Neural implants for emotion control – a huge issue for trust.
  • Proactive maintenance mentioned: cf. the DAME and BROADEN projects with Rolls-Royce in the UK. Data, insurance companies? (Will add links later)
  • Prism effect… the filter bubble, confirmation bias.
  • Big data. Still costs money to gather, sort, curate, etc.
  • Use of technology is important, e.g. in aircraft. Technology failure of drones and airspace concerns. Also for autonomous driving: what happens if humans have to take over but rarely drive?
  • Last 5% of development takes a lot of time?
  • Self-driving, and the need for accurate mapping (reminds me of an experience I had with sat navs)
  • Technology… we still need to maintain our humanity and not let the technology dominate

Audience comments/questions:

  • Virtual medical records system – pushback because physicians worried about being sued for not reading the records.
  • Insurers as the final and honest brokers. Is this true?
  • How can people stop unpopular technology? Facial recognition? It depends on the use case. Facial recognition bans in government or private use?
  • Virtual reality is sight-centric. Gaming merging with TV?