Boeing Crisis Management Case Study: A Detailed Analysis
In the fast-paced world of aerospace engineering, few companies have enjoyed the prestige and influence of Boeing.
Renowned for its innovative aircraft designs, Boeing has long been a symbol of excellence and reliability in the aviation industry.
However, even the most formidable aerospace giants can stumble, and Boeing faced a monumental crisis that shook its foundation.
This blog post delves into the Boeing crisis management case study, examining how the company navigated through a storm of unprecedented proportions.
From fatal crashes to regulatory scrutiny, we unravel the complexities of the crisis and analyze Boeing’s response, shedding light on the importance of crisis management in the corporate landscape.
Let’s learn more about the Boeing crisis management case study.
Boeing as a prominent aerospace company
Boeing, a globally recognized aerospace company, has played a pivotal role in shaping the aviation industry for over a century.
Founded in 1916, Boeing has consistently pushed the boundaries of innovation, engineering some of the most iconic and groundbreaking aircraft in history.
From the pioneering days of commercial aviation to the modern era of space exploration, Boeing’s contributions have been instrumental in revolutionizing air travel and shaping the course of human progress.
As one of the largest aerospace manufacturers in the world, Boeing operates across multiple sectors, including commercial airplanes, defense, space, and services.
The company’s commercial aircraft division is particularly noteworthy, boasting a diverse portfolio of aircraft models that cater to the varying needs of airlines and passengers worldwide.
With a steadfast commitment to excellence and a relentless pursuit of technological advancements, Boeing has firmly established itself as a trusted partner to airlines, governments, and customers across the globe. Its aircraft have become synonymous with reliability, efficiency, and cutting-edge innovation, setting industry standards and shaping the future of flight.
However, like any prominent organization, Boeing has faced its share of challenges and setbacks. In recent years, the company has been confronted with a crisis that has tested its resilience and called into question its reputation.
Background of the Boeing Crisis
The following are the key aspects of the Boeing crisis and the incidents that led the company into an unprecedented crisis.
Development of the Boeing 737 MAX aircraft
The Boeing 737 MAX, a narrow-body aircraft designed for fuel efficiency and enhanced performance, was a crucial addition to Boeing’s commercial aircraft lineup.
Developed as an upgrade to the highly successful Boeing 737 Next Generation (NG) series, the MAX promised increased fuel efficiency and operational cost savings, making it an attractive choice for airlines seeking to modernize their fleets.
The development of the 737 MAX began in 2011, with Boeing aiming to compete with rival Airbus’s A320neo aircraft. Key advancements included the incorporation of larger and more fuel-efficient engines, known as the LEAP-1B engines developed by CFM International, along with aerodynamic improvements and advanced avionics.
Boeing marketed the 737 MAX as a seamless transition for pilots already trained on the 737 NG, highlighting the aircraft’s commonality and familiarity. This offered airlines the opportunity to minimize training costs and streamline operations when introducing the new aircraft into their fleets.
To expedite the launch of the 737 MAX, Boeing pursued a strategy known as “minimum change, maximum benefit.” This involved making minimal alterations to the existing 737 design while maximizing performance gains through new engines and improved aerodynamics. However, this approach posed significant challenges in terms of maintaining the aircraft’s stability and handling characteristics.
As development progressed, Boeing faced pressures to bring the 737 MAX to market swiftly. The intense competition with Airbus and the demand for more fuel-efficient aircraft led to a compressed timeline, which put strain on the engineering and certification processes.
The Federal Aviation Administration (FAA) granted the 737 MAX its certification in March 2017, paving the way for deliveries to commence. Boeing anticipated that the 737 MAX would be a game-changer for the company, reaffirming its dominance in the narrow-body aircraft market.
Little did Boeing know that the development and subsequent introduction of the 737 MAX would soon be marred by a series of devastating events that would test the company’s crisis management capabilities to their limits.
Two fatal crashes involving the 737 MAX
The Boeing 737 MAX was thrust into the global spotlight following two tragic and highly publicized crashes that resulted in the loss of hundreds of lives. These crashes were:
Lion Air Flight 610 (October 29, 2018)
Lion Air Flight 610, a scheduled domestic flight in Indonesia, crashed into the Java Sea shortly after takeoff from Jakarta. The aircraft involved was a Boeing 737 MAX 8. All 189 passengers and crew on board perished in the accident. The investigation revealed that erroneous data from a malfunctioning angle of attack sensor triggered the aircraft’s Maneuvering Characteristics Augmentation System (MCAS), an automated flight control system designed to enhance pitch stability. The repeated activation of MCAS caused the aircraft’s nose to be pushed down, overpowering the pilots’ attempts to regain control. This tragic event raised concerns about the 737 MAX’s flight control system and its potential impact on flight safety.
Ethiopian Airlines Flight 302 (March 10, 2019)
Ethiopian Airlines Flight 302, a scheduled international flight from Ethiopia to Kenya, crashed shortly after takeoff from Addis Ababa. The aircraft involved was a Boeing 737 MAX 8, similar to the Lion Air aircraft. The crash claimed the lives of all 157 passengers and crew on board. Investigations into the accident indicated similar circumstances to the Lion Air crash, with the MCAS system being implicated once again. The data from the flight data recorder and cockpit voice recorder pointed to a faulty angle of attack sensor triggering the MCAS, leading to a nosedive that the pilots were unable to counteract.
Investigations and Their Results
These two devastating crashes prompted worldwide alarm and raised serious questions about the safety of the Boeing 737 MAX. As a result, regulatory authorities around the globe, including the Federal Aviation Administration (FAA) in the United States, grounded the entire 737 MAX fleet pending further investigation and the implementation of appropriate safety measures.
Multiple investigations were launched to determine the root causes of the accidents. These investigations involved aviation authorities, Boeing, airlines, and other industry experts. The primary focus was on understanding the design and functionality of the MCAS system, the training provided to pilots, the certification process, and potential lapses in safety oversight.
The investigations revealed critical issues, including shortcomings in the design and operation of the MCAS system, inadequate pilot training regarding the system’s functionality and potential failure modes, and concerns about the regulatory processes surrounding the certification of the 737 MAX. The findings of these investigations had far-reaching implications for Boeing, the aviation industry, and the future of the 737 MAX aircraft.
Media across the world reported widely on the Boeing crisis after the two crashes.
Analysis of Boeing’s Crisis Management Approach
Boeing’s initial handling of the 737 MAX crisis was met with widespread criticism and scrutiny. Several key aspects of their approach can be evaluated:
Delayed Acknowledgment
Boeing’s initial response was perceived by many as slow and lacking in transparency. It took several days for Boeing to issue a statement expressing condolences and acknowledging the tragedies. This delay eroded public trust and raised concerns about Boeing’s commitment to transparency and accountability.
Lack of Transparency
Boeing’s delayed acknowledgment of the accidents and limited transparency surrounding the issues with the MCAS system undermined public trust and raised concerns about the company’s commitment to safety. The perception of secrecy and withholding of critical information further eroded confidence in Boeing’s crisis management approach.
Boeing was criticized for not being forthcoming with information about the MCAS system and its potential risks. It was revealed that Boeing had not disclosed the existence of the MCAS system to pilots or airlines prior to the accidents. This lack of transparency raised concerns about the adequacy of the information provided to operators and the extent of their understanding of the system’s functionality and potential failure modes.
Confidence in the Aircraft
In the immediate aftermath of the accidents, Boeing maintained confidence in the safety of the 737 MAX. The company initially stated that the aircraft was airworthy and did not require any additional pilot training beyond what was already provided. This response created a perception that Boeing was downplaying the severity of the situation and prioritizing commercial interests over safety.
Minimal Engagement with Stakeholders
Boeing’s initial response seemed to lack proactive and open engagement with key stakeholders, including regulators, airlines, and the public. Insufficient communication and consultation with these parties created an impression of disconnection and a failure to prioritize their concerns and perspectives.
Inadequate Crisis Communication
Boeing’s communication strategy during the early stages of the crisis was deemed reactive and insufficient. The company’s messaging lacked empathy and failed to address the severity of the situation adequately. This approach fueled speculation and contributed to a perception that Boeing was more concerned with protecting its brand than addressing the safety concerns raised by the accidents.
Overemphasis on Commercial Interests
The initial response by Boeing was perceived by some as prioritizing commercial interests over safety. Maintaining confidence in the aircraft’s airworthiness without additional pilot training raised questions about Boeing’s commitment to putting safety first. This perception further eroded trust in the company’s crisis management efforts.
Regulatory Relations and Oversight
The crisis also shed light on concerns surrounding the relationship between Boeing and regulatory authorities, particularly the FAA. Questions were raised about the level of oversight and the certification process for the 737 MAX. The perception of a cozy relationship between Boeing and the FAA added to the public’s skepticism regarding the independence and objectivity of safety evaluations.
Decision to continue production and delivery of the 737 MAX
The decision by Boeing to continue production and delivery of the 737 MAX aircraft during the early stages of the crisis was a subject of intense scrutiny and debate. Analyzing this decision involves considering the factors and considerations that influenced Boeing’s stance:
- Financial Implications: Boeing faced significant financial implications due to the grounding of the 737 MAX fleet. The production and delivery of aircraft generate substantial revenue for the company, and halting production would have resulted in substantial losses. Boeing likely considered the potential impact on its financial performance, stock value, and relationships with suppliers and customers when deciding to continue production.
- Confidence in Remedial Measures: Boeing believed that the software updates and additional pilot training being implemented as part of the proposed fixes for the MCAS system would address the safety concerns. They may have felt confident that these measures, once implemented, would restore the airworthiness of the 737 MAX and enable its safe operation. This confidence likely influenced their decision to continue production and delivery.
- Regulatory and Certification Expectations: Boeing may have also considered the expectations of regulatory authorities, particularly the Federal Aviation Administration (FAA), regarding the steps required to recertify the 737 MAX. By continuing production, Boeing may have sought to demonstrate their commitment to addressing the identified issues promptly and efficiently. This approach may have been viewed as a proactive step toward meeting regulatory expectations and expediting the return of the aircraft to service.
- Supply Chain Considerations: Halting production would have had significant implications for Boeing’s extensive global supply chain. Numerous suppliers and manufacturing partners rely on the production and delivery of the 737 MAX for their own operations and revenue. Disruptions to the supply chain could have had cascading effects on multiple stakeholders. Considering these dependencies, Boeing may have determined that continuing production, albeit at a reduced rate, would minimize disruptions throughout the supply chain.
Impact of the crisis on Boeing’s reputation and financials
The crisis surrounding the 737 MAX had a profound impact on Boeing’s reputation and financials. Let’s examine the consequences in both areas:
Reputation Impact
The 737 MAX crisis severely damaged Boeing’s reputation and eroded trust among key stakeholders, including airlines, passengers, regulators, and the general public. The accidents and subsequent revelations about the aircraft’s design and certification processes raised questions about Boeing’s commitment to safety and transparency.
Financial Impact
Grounding and Production Halt
The grounding of the 737 MAX fleet resulted in a halt in deliveries and production, leading to significant financial losses for Boeing. The company had to store and maintain grounded aircraft, face cancellations and delays in orders, and adjust its production schedules.
Order Cancellations
Boeing experienced a substantial number of order cancellations for the 737 MAX from airlines and leasing companies. The loss of these orders translated into reduced revenue and affected the company’s long-term sales projections.
Boeing’s communication strategy during the crisis
The effectiveness of Boeing’s communication strategy during the 737 MAX crisis can be evaluated based on several key factors:
- Timeliness: Boeing’s initial response to the crisis was delayed, which had a negative impact on its effectiveness. The company took several days to issue public statements acknowledging the accidents and expressing condolences. This delay resulted in a perception of unresponsiveness and lack of transparency, eroding public trust.
- Transparency and Openness: Boeing’s communication strategy during the early stages of the crisis was criticized for lacking transparency. The company faced allegations of withholding critical information from regulators, airlines, and the public. The limited disclosure and perceived secrecy fueled speculation and further eroded trust in Boeing’s crisis management approach.
- Clarity of Messaging: The clarity of Boeing’s messaging during the crisis was also a concern. There were instances where the company downplayed the severity of the situation and maintained confidence in the airworthiness of the 737 MAX without acknowledging the need for additional pilot training or design changes. This approach created confusion and raised questions about Boeing’s commitment to safety.
- Stakeholder Engagement: Boeing’s communication strategy faced criticism for its limited engagement with key stakeholders, including regulators, airlines, and the families of the crash victims. Insufficient communication and consultation with these stakeholders created a perception of disconnection and a failure to address their concerns and needs adequately.
- Crisis Management Updates: Boeing’s efforts to provide regular updates and progress reports regarding the investigation, the proposed fixes, and the recertification process were essential. However, there were instances where the information provided was seen as incomplete or lacking in transparency, fueling skepticism and undermining the effectiveness of their communication strategy.
Legal and regulatory challenges faced by Boeing
Boeing faced significant legal and regulatory challenges as a result of the 737 MAX crisis. Let’s examine some of the key challenges:
- Legal Liability: Boeing faced numerous legal challenges, including lawsuits from the families of the crash victims, airlines seeking compensation for financial losses, and investors alleging securities fraud. The lawsuits alleged negligence, product liability, wrongful death, and other claims against Boeing. The company had to navigate complex legal proceedings, potentially leading to substantial financial settlements and damage awards.
- Regulatory Investigations: Multiple regulatory authorities conducted investigations into the design, certification, and safety of the 737 MAX. The primary focus was on the Federal Aviation Administration (FAA), which faced scrutiny for its oversight of Boeing and the certification process. Other countries’ aviation authorities, such as the European Union Aviation Safety Agency (EASA), also conducted independent reviews. These investigations aimed to determine the extent of any regulatory lapses and evaluate the adequacy of the aircraft’s design and certification.
- Certification and Reapproval Process: The grounding of the 737 MAX led to a lengthy recertification process. Boeing had to work closely with regulatory agencies to address the identified safety concerns, implement software updates, and enhance pilot training requirements. The process involved rigorous testing, inspections, and demonstration of compliance with regulatory standards before the aircraft could be cleared to fly again. The recertification process required coordination between Boeing, regulatory authorities, and international aviation bodies, adding complexity and scrutiny to the company’s operations.
- Regulatory Reforms: The crisis also prompted calls for regulatory reforms to improve safety oversight and the certification process. There were concerns about the level of independence and objectivity in the relationship between Boeing and the FAA. Governments and regulatory agencies around the world were under pressure to strengthen safety regulations, enhance oversight, and ensure transparency to prevent similar incidents in the future.
- Increased Regulatory Scrutiny: Boeing faced heightened regulatory scrutiny beyond the 737 MAX. Inspections and audits of other Boeing aircraft models, manufacturing facilities, and quality control processes were conducted to ensure compliance with safety standards. This broader scrutiny affected the company’s operations and required additional resources to address any identified issues.
Corrective measures implemented by Boeing to address the crisis
In response to the 737 MAX crisis, Boeing implemented several corrective measures aimed at addressing the identified issues and restoring confidence in the aircraft. Let’s analyze some of these measures:
- Software Updates: Boeing developed and implemented software updates to address the MCAS system’s design flaws, which were identified as a contributing factor in the accidents. The updates included changes to the system’s activation criteria, increased redundancy, and enhanced pilot control. These updates were intended to prevent the system from engaging erroneously and provide pilots with more control over the aircraft.
- Enhanced Pilot Training: Boeing recognized the need to improve pilot training on the 737 MAX, particularly regarding the MCAS system. The company revised the training materials and procedures to ensure that pilots were adequately trained to handle any potential issues related to the MCAS system. The training enhancements aimed to provide pilots with a better understanding of the system’s functionality, failure modes, and appropriate responses.
- Collaboration with Regulators: Boeing worked closely with regulatory authorities, primarily the FAA, throughout the crisis and the subsequent recertification process. The company collaborated with regulators to address safety concerns, share technical information, and seek approval for the proposed fixes. This collaboration was aimed at ensuring that the aircraft met all regulatory requirements and regained certification for safe operation.
- Independent Review and Oversight: Boeing initiated an independent review of its processes and practices related to aircraft design, development, and certification. The review was led by experts outside the company and focused on identifying areas for improvement and strengthening safety practices. The findings and recommendations from the review were used to enhance Boeing’s internal processes and ensure better adherence to safety standards.
- Cultural and Organizational Changes: The crisis prompted Boeing to reflect on its internal culture and decision-making processes. The company acknowledged the need for cultural and organizational changes to foster a stronger focus on safety, transparency, and accountability. Boeing aimed to address any shortcomings in its culture and decision-making frameworks to prevent similar issues in the future.
Final Words
The Boeing crisis management case study surrounding the 737 MAX serves as a powerful reminder of the importance of prioritizing safety, timely and transparent communication, strong regulatory relationships, rigorous risk assessment, independent oversight, continuous learning, and ethical decision-making.
Boeing’s initial response to the crisis faced significant challenges, including a lack of transparency and accountability. The decision to continue production and delivery of the 737 MAX while it was under investigation also raised concerns. These missteps led to a severe impact on Boeing’s reputation and financials, including loss of trust, order cancellations, legal liabilities, and financial losses.
However, Boeing took corrective measures to address the crisis, including software updates, enhanced pilot training, collaboration with regulators, independent reviews, and organizational changes. These steps were crucial in addressing the identified issues, rebuilding trust, and ensuring the safe return of the 737 MAX to service.
The Boeing 737 MAX: Lessons for Engineering Ethics
- Original Research/Scholarship
- Published: 10 July 2020
- Volume 26, pages 2957–2974 (2020)
Joseph Herkert, Jason Borenstein, and Keith Miller
The crash of two 737 MAX passenger aircraft in late 2018 and early 2019, and subsequent grounding of the entire fleet of 737 MAX jets, turned a global spotlight on Boeing’s practices and culture. Explanations for the crashes include: design flaws within the MAX’s new flight control software system designed to prevent stalls; internal pressure to keep pace with Boeing’s chief competitor, Airbus; Boeing’s lack of transparency about the new software; and the lack of adequate monitoring of Boeing by the FAA, especially during the certification of the MAX and following the first crash. While these and other factors have been the subject of numerous government reports and investigative journalism articles, little to date has been written on the ethical significance of the accidents, in particular the ethical responsibilities of the engineers at Boeing and the FAA involved in designing and certifying the MAX. Lessons learned from this case include the need to strengthen the voice of engineers within large organizations. There is also the need for greater involvement of professional engineering societies in ethics-related activities and for broader focus on moral courage in engineering ethics education.
Introduction
In October 2018 and March 2019, Boeing 737 MAX passenger jets crashed minutes after takeoff; these two accidents claimed nearly 350 lives. After the second incident, all 737 MAX planes were grounded worldwide. The 737 MAX was an updated version of the 737 workhorse that first began flying in the 1960s. The crashes were precipitated by a failure of an Angle of Attack (AOA) sensor and the subsequent activation of new flight control software, the Maneuvering Characteristics Augmentation System (MCAS). The MCAS software was intended to compensate for changes in the size and placement of the engines on the MAX as compared to prior versions of the 737. The existence of the software, designed to prevent a stall due to the reconfiguration of the engines, was not disclosed to pilots until after the first crash. Even after that tragic incident, pilots were not required to undergo simulation training on the 737 MAX.
In this paper, we examine several aspects of the case, including technical and other factors that led up to the crashes, especially Boeing’s design choices and organizational tensions internal to the company, and between Boeing and the U.S. Federal Aviation Administration (FAA). While the case is ongoing and at this writing, the 737 MAX has yet to be recertified for flight, our analysis is based on numerous government reports and detailed news accounts currently available. We conclude with a discussion of specific lessons for engineers and engineering educators regarding engineering ethics.
Overview of 737 MAX History and Crashes
In December 2010, Boeing’s primary competitor Airbus announced the A320neo family of jetliners, an update of their successful A320 narrow-body aircraft. The A320neo featured larger, more fuel-efficient engines. Boeing had been planning to introduce a totally new aircraft to replace its successful, but dated, 737 line of jets; yet to remain competitive with Airbus, Boeing instead announced in August 2011 the 737 MAX family, an update of the 737NG with similar engine upgrades to the A320neo and other improvements (Gelles et al. 2019 ). The 737 MAX, which entered service in May 2017, became Boeing’s fastest-selling airliner of all time with 5000 orders from over 100 airlines worldwide (Boeing n.d. a) (See Fig. 1 for timeline of 737 MAX key events).
Fig. 1 737 MAX timeline showing key events from 2010 to 2019
The 737 MAX had been in operation for over a year when on October 29, 2018, Lion Air flight JT610 crashed into the Java Sea 13 minutes after takeoff from Jakarta, Indonesia; all 189 passengers and crew on board died. Monitoring from the flight data recorder recovered from the wreckage indicated that MCAS, the software specifically designed for the MAX, forced the nose of the aircraft down 26 times in 10 minutes (Gates 2018 ). In October 2019, the Final Report of Indonesia’s Lion Air Accident Investigation was issued. The Report placed some of the blame on the pilots and maintenance crews but concluded that Boeing and the FAA were primarily responsible for the crash (Republic of Indonesia 2019 ).
MCAS was not identified in the original documentation/training for 737 MAX pilots (Glanz et al. 2019 ). But after the Lion Air crash, Boeing ( 2018 ) issued a Flight Crew Operations Manual Bulletin on November 6, 2018 containing procedures for responding to flight control problems due to possible erroneous AOA inputs. The next day the FAA ( 2018a ) issued an Emergency Airworthiness Directive on the same subject; however, the FAA did not ground the 737 MAX at that time. According to published reports, these notices were the first time that airline pilots learned of the existence of MCAS (e.g., Bushey 2019 ).
On March 10, 2019, about four months after the Lion Air crash, Ethiopian Airlines Flight ET302 crashed 6 minutes after takeoff in a field 39 miles from Addis Ababa Airport. The accident caused the deaths of all 157 passengers and crew. The Preliminary Report of the Ethiopian Airlines Accident Investigation (Federal Democratic Republic of Ethiopia 2019), issued in April 2019, indicated that the pilots followed the checklist from the Boeing Flight Crew Operations Manual Bulletin posted after the Lion Air crash but could not control the plane (Ahmed et al. 2019). This was followed by an Interim Report (Federal Democratic Republic of Ethiopia 2020), issued in March 2020, that exonerated the pilots and airline and placed blame for the accident on design flaws in the MAX (Marks and Dahir 2020). Following the second crash, the 737 MAX was grounded worldwide, with the U.S., through the FAA, being the last country to act on March 13, 2019 (Kaplan et al. 2019).
Design Choices that Led to the Crashes
As noted above, with its belief that it must keep up with its main competitor, Airbus, Boeing elected to modify the latest generation of the 737 family, the 737NG, rather than design an entirely new aircraft. Yet this raised a significant engineering challenge for Boeing. Mounting larger, more fuel-efficient engines, similar to those employed on the A320neo, on the existing 737 airframe posed a serious design problem, because the 737 family was built closer to the ground than the Airbus A320. In order to provide appropriate ground clearance, the larger engines had to be mounted higher and farther forward on the wings than previous models of the 737 (see Fig. 2 ). This significantly changed the aerodynamics of the aircraft and created the possibility of a nose-up stall under certain flight conditions (Travis 2019 ; Glanz et al. 2019 ).
Fig. 2 Boeing 737 MAX (left) compared to Boeing 737NG (right), showing the larger 737 MAX engines mounted higher and farther forward on the wing (image source: https://www.norebbo.com)
Boeing’s attempt to solve this problem involved incorporating MCAS as a software fix for the potential stall condition. The 737 was designed with two AOA sensors, one on each side of the aircraft. Yet Boeing decided that the 737 MAX would only use input from one of the plane’s two AOA sensors. If the single AOA sensor was triggered, MCAS would detect a dangerous nose-up condition and send a signal to the horizontal stabilizer located in the tail. Movement of the stabilizer would then force the plane’s tail up and the nose down (Travis 2019 ). In both the Lion Air and Ethiopian Air crashes, the AOA sensor malfunctioned, repeatedly activating MCAS (Gates 2018 ; Ahmed et al. 2019 ). Since the two crashes, Boeing has made adjustments to the MCAS, including that the system will rely on input from the two AOA sensors instead of just one. But still more problems with MCAS have been uncovered. For example, an indicator light that would alert pilots if the jet’s two AOA sensors disagreed, thought by Boeing to be standard on all MAX aircraft, would only operate as part of an optional equipment package that neither airline involved in the crashes purchased (Gelles and Kitroeff 2019a ).
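The fragility of that single-sensor dependence is easier to see in a simplified sketch. The code below is a deliberately hypothetical illustration of the behavior described above, not Boeing's actual flight-control software: the threshold, function names, and control-loop framing are all invented for the example. The point is only that when one of two available sensors is ignored, a single stuck reading can trigger nose-down trim again and again.

```python
# Purely illustrative sketch -- not avionics code. The threshold and function
# names (AOA_STALL_THRESHOLD_DEG, command_nose_down_trim) are invented.

AOA_STALL_THRESHOLD_DEG = 15.0  # hypothetical angle-of-attack limit, degrees

def command_nose_down_trim() -> None:
    print("commanding nose-down stabilizer trim")

def mcas_as_originally_described(aoa_left_deg: float, aoa_right_deg: float) -> bool:
    """Original behavior per the accident reports: MCAS acted on a single
    AOA input, so one faulty reading could repeatedly push the nose down."""
    aoa = aoa_left_deg  # only one of the two sensors is consulted
    if aoa > AOA_STALL_THRESHOLD_DEG:
        command_nose_down_trim()  # fires on every cycle while the bad value persists
        return True
    return False

# A stuck left sensor reporting 22 degrees triggers trim on every control cycle,
# even though the ignored right sensor reads a normal 4 degrees.
for _ in range(3):
    mcas_as_originally_described(aoa_left_deg=22.0, aoa_right_deg=4.0)
```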
Similar to its responses to previous accidents, Boeing has been reluctant to admit to a design flaw in its aircraft, instead blaming pilot error (Hall and Goelz 2019 ). In the 737 MAX case, the company pointed to the pilots’ alleged inability to control the planes under stall conditions (Economy 2019 ). Following the Ethiopian Airlines crash, Boeing acknowledged for the first time that MCAS played a primary role in the crashes, while continuing to highlight that other factors, such as pilot error, were also involved (Hall and Goelz 2019 ). For example, on April 29, 2019, more than a month after the second crash, then Boeing CEO Dennis Muilenburg defended MCAS by stating:
We've confirmed that [the MCAS system] was designed per our standards, certified per our standards, and we're confident in that process. So, it operated according to those design and certification standards. So, we haven't seen a technical slip or gap in terms of the fundamental design and certification of the approach. (Economy 2019 )
The view that MCAS was not primarily at fault was supported within an article written by noted journalist and pilot William Langewiesche ( 2019 ). While not denying Boeing made serious mistakes, he placed ultimate blame on the use of inexperienced pilots by the two airlines involved in the crashes. Langewiesche suggested that the accidents resulted from the cost-cutting practices of the airlines and the lax regulatory environments in which they operated. He argued that more experienced pilots, despite their lack of information on MCAS, should have been able to take corrective action to control the planes using customary stall prevention procedures. Langewiesche ( 2019 ) concludes in his article that:
What we had in the two downed airplanes was a textbook failure of airmanship. In broad daylight, these pilots couldn’t decipher a variant of a simple runaway trim, and they ended up flying too fast at low altitude, neglecting to throttle back and leading their passengers over an aerodynamic edge into oblivion. They were the deciding factor here — not the MCAS, not the Max.
Others have taken a more critical view of MCAS, Boeing, and the FAA. These critics prominently include Captain Chesley “Sully” Sullenberger, who famously crash-landed an A320 in the Hudson River after bird strikes had knocked out both of the plane’s engines. Sullenberger responded directly to Langewiesche in a letter to the Editor:
… Langewiesche draws the conclusion that the pilots are primarily to blame for the fatal crashes of Lion Air 610 and Ethiopian 302. In resurrecting this age-old aviation canard, Langewiesche minimizes the fatal design flaws and certification failures that precipitated those tragedies, and still pose a threat to the flying public. I have long stated, as he does note, that pilots must be capable of absolute mastery of the aircraft and the situation at all times, a concept pilots call airmanship. Inadequate pilot training and insufficient pilot experience are problems worldwide, but they do not excuse the fatally flawed design of the Maneuvering Characteristics Augmentation System (MCAS) that was a death trap.... (Sullenberger 2019 )
Noting that he is one of the few pilots to have encountered both accident sequences in a 737 MAX simulator, Sullenberger continued:
These emergencies did not present as a classic runaway stabilizer problem, but initially as ambiguous unreliable airspeed and altitude situations, masking MCAS. The MCAS design should never have been approved, not by Boeing, and not by the Federal Aviation Administration (FAA)…. (Sullenberger 2019 )
In June 2019, Sullenberger noted in Congressional Testimony that “These crashes are demonstrable evidence that our current system of aircraft design and certification has failed us. These accidents should never have happened” (Benning and DiFurio 2019 ).
Others have agreed with Sullenberger’s assessment. Software developer and pilot Gregory Travis (2019) argues that Boeing’s design for the 737 MAX violated industry norms and that the company unwisely used software to compensate for inadequacies in the hardware design. Travis also contends that the existence of MCAS was not disclosed to pilots in order to preserve the fiction that the 737 MAX was just an update of earlier 737 models, which served as a way to circumvent the more stringent FAA certification requirements for a new airplane. Reports from government agencies seem to support this assessment, emphasizing the chaotic cockpit conditions created by MCAS and poor certification practices. The U.S. National Transportation Safety Board (NTSB) (2019) Safety Recommendations to the FAA in September 2019 indicated that Boeing underestimated the effect MCAS malfunction would have on the cockpit environment (Kitroeff 2019a, b). The FAA Joint Authorities Technical Review (2019), which included international participation, issued its Final Report in October 2019. The Report faulted Boeing and the FAA in the certification of MCAS (Koenig 2019).
Despite Boeing’s attempts to downplay the role of MCAS, it began to work on a fix for the system shortly after the Lion Air crash (Gates 2019 ). MCAS operation will now be based on inputs from both AOA sensors, instead of just one sensor, with a cockpit indicator light when the sensors disagree. In addition, MCAS will only be activated once for an AOA warning rather than multiple times. What follows is that the system would only seek to prevent a stall once per AOA warning. Also, MCAS’s power will be limited in terms of how much it can move the stabilizer and manual override by the pilot will always be possible (Bellamy 2019 ; Boeing n.d. b; Gates 2019 ). For over a year after the Lion Air crash, Boeing held that pilot simulator training would not be required for the redesigned MCAS system. In January 2020, Boeing relented and recommended that pilot simulator training be required when the 737 MAX returns to service (Pasztor et al. 2020 ).
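The published descriptions of the revised MCAS logic can be summarized in a similarly hypothetical sketch. Again, the names and numeric limits below (such as MAX_DISAGREEMENT_DEG and MAX_TRIM_UNITS) are invented for illustration rather than taken from Boeing's software; the sketch only shows how cross-checking both sensors, activating at most once per high-AOA event, capping trim authority, and always yielding to the pilot address the failure modes described above.

```python
# Purely illustrative sketch of the *described* changes; all names and limits
# (MAX_DISAGREEMENT_DEG, MAX_TRIM_UNITS, thresholds) are invented assumptions.

AOA_STALL_THRESHOLD_DEG = 15.0   # hypothetical stall-warning threshold, degrees
MAX_DISAGREEMENT_DEG = 5.5       # hypothetical sensor cross-check tolerance
MAX_TRIM_UNITS = 2.5             # hypothetical cap on stabilizer authority

class RevisedMcasSketch:
    def __init__(self) -> None:
        self.activated_this_event = False

    def step(self, aoa_left: float, aoa_right: float, pilot_override: bool) -> str:
        # 1. Manual override by the pilot always wins.
        if pilot_override:
            return "pilot override: no automatic trim"
        # 2. Cross-check the two sensors; disagreement lights an indicator
        #    instead of commanding trim.
        if abs(aoa_left - aoa_right) > MAX_DISAGREEMENT_DEG:
            return "AOA DISAGREE indicator on: MCAS inhibited"
        # 3. Activate at most once per high-AOA event, with limited authority.
        if max(aoa_left, aoa_right) > AOA_STALL_THRESHOLD_DEG and not self.activated_this_event:
            self.activated_this_event = True
            return f"nose-down trim commanded, capped at {MAX_TRIM_UNITS} units"
        return "no action"

mcas = RevisedMcasSketch()
print(mcas.step(aoa_left=22.0, aoa_right=4.0, pilot_override=False))   # inhibited: sensors disagree
print(mcas.step(aoa_left=22.0, aoa_right=21.0, pilot_override=False))  # one limited activation
print(mcas.step(aoa_left=22.0, aoa_right=21.0, pilot_override=False))  # no repeated activation
```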
Boeing and the FAA
There is mounting evidence that Boeing, and the FAA as well, had warnings about the inadequacy of MCAS’s design, and about the lack of communication to pilots about its existence and functioning. In 2015, for example, an unnamed Boeing engineer raised in an email the issue of relying on a single AOA sensor (Bellamy 2019 ). In 2016, Mark Forkner, Boeing’s Chief Technical Pilot, in an email to a colleague flagged the erratic behavior of MCAS in a flight simulator noting: “It’s running rampant” (Gelles and Kitroeff 2019c ). Forkner subsequently came under federal investigation regarding whether he misled the FAA regarding MCAS (Kitroeff and Schmidt 2020 ).
In December 2018, following the Lion Air Crash, the FAA ( 2018b ) conducted a Risk Assessment that estimated that fifteen more 737 MAX crashes would occur in the expected fleet life of 45 years if the flight control issues were not addressed; this Risk Assessment was not publicly disclosed until Congressional hearings a year later in December 2019 (Arnold 2019 ). After the two crashes, a senior Boeing engineer, Curtis Ewbank, filed an internal ethics complaint in 2019 about management squelching of a system that might have uncovered errors in the AOA sensors. Ewbank has since publicly stated that “I was willing to stand up for safety and quality… Boeing management was more concerned with cost and schedule than safety or quality” (Kitroeff et al. 2019b ).
One factor in Boeing’s apparent reluctance to heed such warnings may be attributed to the seeming transformation of the company’s engineering and safety culture over time to a finance orientation beginning with Boeing’s merger with McDonnell–Douglas in 1997 (Tkacik 2019 ; Useem 2019 ). Critical changes after the merger included replacing many in Boeing’s top management, historically engineers, with business executives from McDonnell–Douglas and moving the corporate headquarters to Chicago, while leaving the engineering staff in Seattle (Useem 2019 ). According to Tkacik ( 2019 ), the new management even went so far as “maligning and marginalizing engineers as a class”.
Financial drivers thus began to place an inordinate amount of strain on Boeing employees, including engineers. During the development of the 737 MAX, significant production pressure to keep pace with the Airbus 320neo was ever-present. For example, Boeing management allegedly rejected any design changes that would prolong certification or require additional pilot training for the MAX (Gelles et al. 2019 ). As Adam Dickson, a former Boeing engineer, explained in a television documentary (BBC Panorama 2019 ): “There was a lot of interest and pressure on the certification and analysis engineers in particular, to look at any changes to the Max as minor changes”.
Production pressures were exacerbated by the “cozy relationship” between Boeing and the FAA (Kitroeff et al. 2019a ; see also Gelles and Kaplan 2019 ; Hall and Goelz 2019 ). Beginning in 2005, the FAA increased its reliance on manufacturers to certify their own planes. Self-certification became standard practice throughout the U.S. airline industry. By 2018, Boeing was certifying 96% of its own work (Kitroeff et al. 2019a ).
The serious drawbacks to self-certification became acutely apparent in this case. Of particular concern, the safety analysis for MCAS delegated to Boeing by the FAA was flawed in at least three respects: (1) the analysis underestimated the power of MCAS to move the plane’s horizontal tail and thus how difficult it would be for pilots to maintain control of the aircraft; (2) it did not account for the system deploying multiple times; and (3) it underestimated the risk level if MCAS failed, thus permitting a design feature—the single AOA sensor input to MCAS—that did not have built-in redundancy (Gates 2019). Related to these concerns, the ability of MCAS to move the horizontal tail was increased without properly updating the safety analysis or notifying the FAA about the change (Gates 2019). In addition, the FAA did not require pilot training for MCAS or simulator training for the 737 MAX (Gelles and Kaplan 2019). Since the MAX grounding, the FAA has become more independent during its assessments and certifications—for example, it will not use Boeing personnel when certifying approvals of new 737 MAX planes (Josephs 2019).
The role of the FAA has also been subject to political scrutiny. The report of a study of the FAA certification process commissioned by Secretary of Transportation Elaine Chao (DOT 2020 ), released January 16, 2020, concluded that the FAA certification process was “appropriate and effective,” and that certification of the MAX as a new airplane would not have made a difference in the plane’s safety. At the same time, the report recommended a number of measures to strengthen the process and augment FAA’s staff (Pasztor and Cameron 2020 ). In contrast, a report of preliminary investigative findings by the Democratic staff of the House Committee on Transportation and Infrastructure (House TI 2020 ), issued in March 2020, characterized FAA’s certification of the MAX as “grossly insufficient” and criticized Boeing’s design flaws and lack of transparency with the FAA, airlines, and pilots (Duncan and Laris 2020 ).
Boeing has incurred significant economic losses from the crashes and subsequent grounding of the MAX. In December 2019, Boeing CEO Dennis Muilenburg was fired and the corporation announced that 737 MAX production would be suspended in January 2020 (Rich 2019 ) (see Fig. 1 ). Boeing is facing numerous lawsuits and possible criminal investigations. Boeing estimates that its economic losses for the 737 MAX will exceed $18 billion (Gelles 2020 ). In addition to the need to fix MCAS, other issues have arisen in recertification of the aircraft, including wiring for controls of the tail stabilizer, possible weaknesses in the engine rotors, and vulnerabilities in lightning protection for the engines (Kitroeff and Gelles 2020 ). The FAA had planned to flight test the 737 MAX early in 2020, and it was supposed to return to service in summer 2020 (Gelles and Kitroeff 2020 ). Given the global impact of the COVID-19 pandemic and other factors, it is difficult to predict when MAX flights might resume. In addition, uncertainty of passenger demand has resulted in some airlines delaying or cancelling orders for the MAX (Bogaisky 2020 ). Even after obtaining flight approval, public resistance to flying in the 737 MAX will probably be considerable (Gelles 2019 ).
Lessons for Engineering Ethics
The 737 MAX case is still unfolding and will continue to do so for some time. Yet important lessons can already be learned (or relearned) from the case. Some of those lessons are straightforward, and others are more subtle. A key and clear lesson is that engineers may need reminders about prioritizing the public good, and more specifically, the public’s safety. A more subtle lesson pertains to the ways in which the problem of many hands may or may not apply here. Other lessons involve the need for corporations, engineering societies, and engineering educators to rise to the challenge of nurturing and supporting ethical behavior on the part of engineers, especially in light of the difficulties revealed in this case.
All contemporary codes of ethics promulgated by major engineering societies state that an engineer’s paramount responsibility is to protect the “safety, health, and welfare” of the public. The American Institute of Aeronautics and Astronautics Code of Ethics indicates that engineers must “[H]old paramount the safety, health, and welfare of the public in the performance of their duties” (AIAA 2013 ). The Institute of Electrical and Electronics Engineers (IEEE) Code of Ethics goes further, pledging its members: “…to hold paramount the safety, health, and welfare of the public, to strive to comply with ethical design and sustainable development practices, and to disclose promptly factors that might endanger the public or the environment” (IEEE 2017 ). The IEEE Computer Society (CS) cooperated with the Association for Computing Machinery (ACM) in developing a Software Engineering Code of Ethics ( 1997 ) which holds that software engineers shall: “Approve software only if they have a well-founded belief that it is safe, meets specifications, passes appropriate tests, and does not diminish quality of life, diminish privacy or harm the environment….” According to Gotterbarn and Miller ( 2009 ), the latter code is a useful guide when examining cases involving software design and underscores the fact that during design, as in all engineering practice, the well-being of the public should be the overriding concern. While engineering codes of ethics are plentiful in number, they differ in their source of moral authority (i.e., organizational codes vs. professional codes), are often unenforceable through the law, and formally apply to different groups of engineers (e.g., based on discipline or organizational membership). However, the codes are generally recognized as a statement of the values inherent to engineering and its ethical commitments (Davis 2015 ).
An engineer’s ethical responsibility does not preclude consideration of factors such as cost and schedule (Pinkus et al. 1997 ). Engineers always have to grapple with constraints, including time and resource limitations. The engineers working at Boeing did have legitimate concerns about their company losing contracts to its competitor Airbus. But being an engineer means that public safety and welfare must be the highest priority (Davis 1991 ). The aforementioned software and other design errors in the development of the 737 MAX, which resulted in hundreds of deaths, would thus seem to be clear violations of engineering codes of ethics. In addition to pointing to engineering codes, Peterson ( 2019 ) argues that Boeing engineers and managers violated widely accepted ethical norms such as informed consent and the precautionary principle.
From an engineering perspective, the central ethical issue in the MAX case arguably circulates around the decision to use software (i.e., MCAS) to “mask” a questionable hardware design—the repositioning of the engines that disrupted the aerodynamics of the airframe (Travis 2019 ). As Johnston and Harris ( 2019 ) argue: “To meet the design goals and avoid an expensive hardware change, Boeing created the MCAS as a software Band-Aid.” Though a reliance on software fixes often happens in this manner, it places a high burden of safety on such fixes that they may not be able to handle, as is illustrated by the case of the Therac-25 radiation therapy machine. In the Therac-25 case, hardware safety interlocks employed in earlier models of the machine were replaced by software safety controls. In addition, information about how the software might malfunction was lacking from the user manual for the Therac machine. Thus, when certain types of errors appeared on its interface, the machine’s operators did not know how to respond. Software flaws, among other factors, contributed to six patients being given massive radiation overdoses, resulting in deaths and serious injuries (Leveson and Turner 1993 ). A more recent case involves problems with the embedded software guiding the electronic throttle in Toyota vehicles. In 2013, “…a jury found Toyota responsible for two unintended acceleration deaths, with expert witnesses citing bugs in the software and throttle fail safe defects” (Cummings and Britton 2020 ).
Boeing’s use of MCAS to mask the significant change in hardware configuration of the MAX was compounded by not providing redundancy for components prone to failure (i.e., the AOA sensors) (Campbell 2019 ), and by failing to notify pilots about the new software. In such cases, it is especially crucial that pilots receive clear documentation and relevant training so that they know how to manage the hand-off with an automated system properly (Johnston and Harris 2019 ). Part of the necessity for such training is related to trust calibration (Borenstein et al. 2020 ; Borenstein et al. 2018 ), a factor that has contributed to previous airplane accidents (e.g., Carr 2014 ). For example, if pilots do not place enough trust in an automated system, they may add risk by intervening in system operation. Conversely, if pilots trust an automated system too much, they may lack sufficient time to act once they identify a problem. This is further complicated in the MAX case because pilots were not fully aware, if at all, of MCAS’s existence and how the system functioned.
In addition to engineering decision-making that failed to prioritize public safety, questionable management decisions were also made at both Boeing and the FAA. As noted earlier, Boeing managerial leadership ignored numerous warning signs that the 737 MAX was not safe. Also, FAA’s shift to greater reliance on self-regulation by Boeing was ill-advised; that lesson appears to have been learned at the expense of hundreds of lives (Duncan and Aratani 2019 ).
The Problem of Many Hands Revisited
Actions, or inaction, by large, complex organizations, in this case corporate and government entities, suggest that the “problem of many hands” may be relevant to the 737 MAX case. At a high level of abstraction, the problem of many hands involves the idea that accountability is difficult to assign in the face of collective action, especially in a computerized society (Thompson 1980 ; Nissenbaum 1994 ). According to Nissenbaum ( 1996 , 29), “Where a mishap is the work of ‘many hands,’ it may not be obvious who is to blame because frequently its most salient and immediate causal antecedents do not converge with its locus of decision-making. The conditions for blame, therefore, are not satisfied in a way normally satisfied when a single individual is held blameworthy for a harm”.
However, there is an alternative understanding of the problem of many hands. In this version of the problem, the lack of accountability is not merely because multiple people and multiple decisions figure into a final outcome. Instead, in order to “qualify” as the problem of many hands, the component decisions should be benign, or at least far less harmful, if examined in isolation; only when the individual decisions are collectively combined do we see the most harmful result. In this understanding, the individual decision-makers should not have the same moral culpability as they would if they made all the decisions by themselves (Noorman 2020 ).
Both of these understandings of the problem of many hands could shed light on the 737 MAX case. Yet we focus on the first version of the problem. We admit the possibility that some of the isolated decisions about the 737 MAX may have been made in part because of ignorance of a broader picture. While we do not stake a claim on whether this is what actually happened in the MAX case, we acknowledge that it may be true in some circumstances. However, we think the more important point is that some of the 737 MAX decisions were so clearly misguided that a competent engineer should have seen the implications, even if the engineer was not aware of all of the broader context. The problem then is to identify responsibility for the questionable decisions in a way that discourages bad judgments in the future, a task made more challenging by the complexities of the decision-making. Legal proceedings about this case are likely to explore those complexities in detail and are outside the scope of this article. But such complexities must be examined carefully so as not to act as an insulator to accountability.
When many individuals are involved in the design of a computing device, for example, and a serious failure occurs, each person might try to absolve themselves of responsibility by indicating that “too many people” and “too many decisions” were involved for any individual person to know that the problem was going to happen. This is a common, and often dubious, excuse in the attempt to abdicate responsibility for a harm. While it can have different levels of magnitude and severity, the problem of many hands often arises in large scale ethical failures in engineering such as in the Deepwater Horizon oil spill (Thompson 2014 ).
Possible examples in the 737 MAX case of the difficulty of assigning moral responsibility due to the problem of many hands include:
1. The decision to reposition the engines;
2. The decision to mask the jet’s subsequent dynamic instability with MCAS;
3. The decision to rely on only one AOA sensor in designing MCAS; and
4. The decision to not inform nor properly train pilots about the MCAS system.
While overall responsibility for each of these decisions may be difficult to allocate precisely, at least points 1–3 above arguably reflect fundamental errors in engineering judgement (Travis 2019 ). Boeing engineers and FAA engineers either participated in or were aware of these decisions (Kitroeff and Gelles 2019 ) and may have had opportunities to reconsider or redirect such decisions. As Davis has noted ( 2012 ), responsible engineering professionals make it their business to address problems even when they did not cause the problem, or, we would argue, solely cause it. As noted earlier, reports indicate that at least one Boeing engineer expressed reservations about the design of MCAS (Bellamy 2019 ). Since the two crashes, one Boeing engineer, Curtis Ewbank, filed an internal ethics complaint (Kitroeff et al. 2019b ) and several current and former Boeing engineers and other employees have gone public with various concerns about the 737 MAX (Pasztor 2019 ). And yet, as is often the case, the flawed design went forward with tragic results.
Enabling Ethical Engineers
The MAX case is eerily reminiscent of other well-known engineering ethics case studies such as the Ford Pinto (Birsch and Fielder 1994 ), Space Shuttle Challenger (Werhane 1991 ), and GM ignition switch (Jennings and Trautman 2016 ). In the Pinto case, Ford engineers were aware of the unsafe placement of the fuel tank well before the car was released to the public and signed off on the design even though crash tests showed the tank was vulnerable to rupture during low-speed rear-end collisions (Baura 2006 ). In the case of the GM ignition switch, engineers knew for at least four years about the faulty design, a flaw that resulted in at least a dozen fatal accidents (Stephan 2016 ). In the case of the well-documented Challenger accident, engineer Roger Boisjoly warned his supervisors at Morton Thiokol of potentially catastrophic flaws in the shuttle’s solid rocket boosters a full six months before the accident. He, along with other engineers, unsuccessfully argued on the eve of launch for a delay due to the effect that freezing temperatures could have on the boosters’ O-ring seals. Boisjoly was also one of a handful of engineers to describe these warnings to the Presidential commission investigating the accident (Boisjoly et al. 1989 ).
Returning to the 737 MAX case, could Ewbank or others with concerns about the safety of the airplane have done more than filing ethics complaints or offering public testimony only after the Lion Air and Ethiopian Airlines crashes? One might argue that requiring professional registration by all engineers in the U.S. would result in more ethical conduct (for example, by giving state licensing boards greater oversight authority). Yet the well-entrenched “industry exemption” from registration for most engineers working in large corporations has undermined such calls (Kline 2001 ).
Engineers with safety concerns would be better empowered if Boeing and other corporations strengthened internal ethics processes, including sincere and meaningful responsiveness to anonymous complaint channels. Schwartz (2013) outlines three core components of an ethical corporate culture: strong core ethical values, a formal ethics program (including an ethics hotline), and capable ethical leadership. Schwartz points to Siemens’ creation of an ethics and compliance department following a bribery scandal as an example of a good solution. Boeing has had a compliance department for quite some time (Schnebel and Bienert 2004) and has taken efforts in the past to evaluate its effectiveness (Boeing 2003). Yet it is clear that more robust measures are needed in response to ethics concerns and complaints. Since the MAX crashes, Boeing’s Board has implemented a number of changes, including establishing a corporate safety group and revising internal reporting procedures so that lead engineers primarily report to the chief engineer rather than to business managers (Gelles and Kitroeff 2019b; Boeing n.d. c). Whether these measures will be enough to restore Boeing’s former engineering-centered focus remains to be seen.
Professional engineering societies could play a stronger role in communicating and enforcing codes of ethics, in supporting ethical behavior of engineers, and in providing more educational opportunities for learning about ethics and about the ethical responsibilities of engineers. Some societies, including ACM and IEEE, have become increasingly engaged in ethics-related activities. Initially, ethics engagement by the societies consisted primarily of a focus on macroethical issues such as sustainable development (Herkert 2004). Recently, however, the societies have also turned to a greater focus on microethical issues (the behavior of individuals). The 2017 revision to the IEEE Code of Ethics, for example, highlights the importance of “ethical design” (Adamson and Herkert 2020). This parallels IEEE activities in the area of design of autonomous and intelligent systems (e.g., IEEE 2018). A promising outcome of this emphasis is a move toward implementing “ethical design” frameworks (Peters et al. 2020).
In terms of engineering education, educators need to place a greater emphasis on fostering moral courage, that is, the courage to act on one’s moral convictions, including adherence to codes of ethics. This is of particular significance in large organizations such as Boeing and the FAA, where the agency of engineers may be limited by factors such as organizational culture (Watts and Buckley 2017). In a study of twenty-six ethics interventions in engineering programs, Hess and Fore (2018) found that only twenty-seven percent had a learning goal of development of “ethical courage, confidence or commitment”. This goal could be operationalized in a number of ways, for example through a focus on virtue ethics (Harris 2008) or professional identity (Hashemian and Loui 2010). This need should not only be addressed within the engineering curriculum but during lifelong learning initiatives and other professional development opportunities as well (Miller 2019).
The circumstances surrounding the 737 MAX airplane could certainly serve as an informative case study for ethics or technical courses. The case can shed light on important lessons for engineers, including the complex interactions, and sometimes tensions, between engineering and managerial considerations. The case also tangibly displays that what seem to be relatively small-scale, and likely well-intended, decisions by individual engineers can combine collectively to result in large-scale tragedy. No individual person wanted to do harm, but it happened nonetheless. Thus, the case can serve as a reminder to current and future generations of engineers that public safety must be the first and foremost priority. A particularly useful pedagogical method for considering this case is to assign students to the roles of engineers, managers, and regulators, as well as the flying public, airline personnel, and representatives of engineering societies (Herkert 1997). In addition to illuminating the perspectives and responsibilities of each stakeholder group, role-playing can also shed light on the “macroethical” issues raised by the case (Martin et al. 2019), such as airline safety standards and the proper role for engineers and engineering societies in the regulation of the industry.
Conclusions and Recommendations
The case of the Boeing 737 MAX provides valuable lessons for engineers and engineering educators concerning the ethical responsibilities of the profession. Safety is not cheap, but careless engineering design in the name of minimizing costs and adhering to a delivery schedule is a symptom of ethical blight. Using almost any standard ethical analysis or framework, Boeing’s actions regarding the safety of the 737 MAX, particularly decisions regarding MCAS, fall short.
Boeing failed in its obligations to protect the public. At a minimum, the company had an obligation to inform airlines and pilots of significant design changes, especially the role of MCAS in compensating for repositioning of engines in the MAX from prior versions of the 737. Clearly, it was a “significant” change because it had a direct, and unfortunately tragic, impact on the public’s safety. The Boeing and FAA interaction underscores the fact that conflicts of interest are a serious concern in regulatory actions within the airline industry.
Internal and external organizational factors may have interfered with Boeing and FAA engineers’ fulfillment of their professional ethical responsibilities; this is an all too common problem that merits serious attention from industry leaders, regulators, professional societies, and educators. The lessons to be learned in this case are not new. After large scale tragedies involving engineering decision-making, calls for change often emerge. But such lessons apparently must be retaught and relearned by each generation of engineers.
References
ACM/IEEE-CS Joint Task Force. (1997). Software Engineering Code of Ethics and Professional Practice, https://ethics.acm.org/code-of-ethics/software-engineering-code/ .
Adamson, G., & Herkert, J. (2020). Addressing intelligent systems and ethical design in the IEEE Code of Ethics. In Codes of ethics and ethical guidelines: Emerging technologies, changing fields . New York: Springer ( in press ).
Ahmed, H., Glanz, J., & Beech, H. (2019). Ethiopian airlines pilots followed Boeing’s safety procedures before crash, Report Shows. The New York Times, April 4, https://www.nytimes.com/2019/04/04/world/asia/ethiopia-crash-boeing.html .
AIAA. (2013). Code of Ethics, https://www.aiaa.org/about/Governance/Code-of-Ethics .
Arnold, K. (2019). FAA report predicted there could be 15 more 737 MAX crashes. The Dallas Morning News, December 11, https://www.dallasnews.com/business/airlines/2019/12/11/faa-chief-says-boeings-737-max-wont-be-approved-in-2019/
Baura, G. (2006). Engineering ethics: an industrial perspective . Amsterdam: Elsevier.
BBC News. (2019). Work on production line of Boeing 737 MAX ‘Not Adequately Funded’. July 29, https://www.bbc.com/news/business-49142761 .
Bellamy, W. (2019). Boeing CEO outlines 737 MAX MCAS software fix in congressional hearings. Aviation Today, November 2, https://www.aviationtoday.com/2019/11/02/boeing-ceo-outlines-mcas-updates-congressional-hearings/ .
Benning, T., & DiFurio, D. (2019). American Airlines Pilots Union boss prods lawmakers to solve 'Crisis of Trust' over Boeing 737 MAX. The Dallas Morning News, June 19, https://www.dallasnews.com/business/airlines/2019/06/19/american-airlines-pilots-union-boss-prods-lawmakers-to-solve-crisis-of-trust-over-boeing-737-max/ .
Birsch, D., & Fielder, J. (Eds.). (1994). The ford pinto case: A study in applied ethics, business, and technology . New York: The State University of New York Press.
Boeing. (2003). Boeing Releases Independent Reviews of Company Ethics Program. December 18, https://boeing.mediaroom.com/2003-12-18-Boeing-Releases-Independent-Reviews-of-Company-Ethics-Program .
Boeing. (2018). Flight crew operations manual bulletin for the Boeing company. November 6, https://www.avioesemusicas.com/wp-content/uploads/2018/10/TBC-19-Uncommanded-Nose-Down-Stab-Trim-Due-to-AOA.pdf .
Boeing. (n.d. a). About the Boeing 737 MAX. https://www.boeing.com/commercial/737max/ .
Boeing. (n.d. b). 737 MAX Updates. https://www.boeing.com/737-max-updates/ .
Boeing. (n.d. c). Initial actions: sharpening our focus on safety. https://www.boeing.com/737-max-updates/resources/ .
Bogaisky, J. (2020). Boeing stock plunges as coronavirus imperils quick ramp up in 737 MAX deliveries. Forbes, March 11, https://www.forbes.com/sites/jeremybogaisky/2020/03/11/boeing-coronavirus-737-max/#1b9eb8955b5a .
Boisjoly, R. P., Curtis, E. F., & Mellican, E. (1989). Roger Boisjoly and the challenger disaster: The ethical dimensions. J Bus Ethics, 8 (4), 217–230.
Borenstein, J., Mahajan, H. P., Wagner, A. R., & Howard, A. (2020). Trust and pediatric exoskeletons: A comparative study of clinician and parental perspectives. IEEE Transactions on Technology and Society , 1 (2), 83–88.
Borenstein, J., Wagner, A. R., & Howard, A. (2018). Overtrust of pediatric health-care robots: A preliminary survey of parent perspectives. IEEE Robot Autom Mag, 25 (1), 46–54.
Bushey, C. (2019). The Tough Crowd Boeing Needs to Convince. Crain’s Chicago Business, October 25, https://www.chicagobusiness.com/manufacturing/tough-crowd-boeing-needs-convince .
Campbell, D. (2019). The many human errors that brought down the Boeing 737 MAX. The Verge, May 2, https://www.theverge.com/2019/5/2/18518176/boeing-737-max-crash-problems-human-error-mcas-faa .
Carr, N. (2014). The glass cage: Automation and us . Norton.
Cummings, M. L., & Britton, D. (2020). Regulating safety-critical autonomous systems: past, present, and future perspectives. In Living with robots (pp. 119–140). Academic Press, New York.
Davis, M. (1991). Thinking like an engineer: The place of a code of ethics in the practice of a profession. Philos Publ Affairs, 20 (2), 150–167.
Davis, M. (2012). “Ain’t no one here but us social forces”: Constructing the professional responsibility of engineers. Sci Eng Ethics, 18 (1), 13–34.
Davis, M. (2015). Engineering as profession: Some methodological problems in its study. In Engineering identities, epistemologies and values (pp. 65–79). Springer, New York.
Department of Transportation (DOT). (2020). Official report of the special committee to review the Federal Aviation Administration’s Aircraft Certification Process, January 16. https://www.transportation.gov/sites/dot.gov/files/2020-01/scc-final-report.pdf .
Duncan, I., & Aratani, L. (2019). FAA flexes its authority in final stages of Boeing 737 MAX safety review. The Washington Post, November 27, https://www.washingtonpost.com/transportation/2019/11/27/faa-flexes-its-authority-final-stages-boeing-max-safety-review/ .
Duncan, I., & Laris, M. (2020). House report on 737 Max crashes faults Boeing’s ‘culture of concealment’ and labels FAA ‘grossly insufficient’. The Washington Post, March 6, https://www.washingtonpost.com/local/trafficandcommuting/house-report-on-737-max-crashes-faults-boeings-culture-of-concealment-and-labels-faa-grossly-insufficient/2020/03/06/9e336b9e-5fce-11ea-b014-4fafa866bb81_story.html .
Economy, P. (2019). Boeing CEO Puts Partial Blame on Pilots of Crashed 737 MAX Aircraft for Not 'Completely' Following Procedures. Inc., April 30, https://www.inc.com/peter-economy/boeing-ceo-puts-partial-blame-on-pilots-of-crashed-737-max-aircraft-for-not-completely-following-procedures.html .
Federal Aviation Administration (FAA). (2018a). Airworthiness directives; the Boeing company airplanes. FR Doc No: R1-2018-26365. https://rgl.faa.gov/Regulatory_and_Guidance_Library/rgad.nsf/0/fe8237743be9b8968625835b004fc051/$FILE/2018-23-51_Correction.pdf .
Federal Aviation Administration (FAA). (2018b). Quantitative Risk Assessment. https://www.documentcloud.org/documents/6573544-Risk-Assessment-for-Release-1.html#document/p1 .
Federal Aviation Administration (FAA). (2019). Joint authorities technical review: observations, findings, and recommendations. October 11, https://www.faa.gov/news/media/attachments/Final_JATR_Submittal_to_FAA_Oct_2019.pdf .
Federal Democratic Republic of Ethiopia. (2019). Aircraft accident investigation preliminary report. Report No. AI-01/19, April 4, https://leehamnews.com/wp-content/uploads/2019/04/Preliminary-Report-B737-800MAX-ET-AVJ.pdf .
Federal Democratic Republic of Ethiopia. (2020). Aircraft Accident Investigation Interim Report. Report No. AI-01/19, March 20, https://www.aib.gov.et/wp-content/uploads/2020/documents/accident/ET-302%2520%2520Interim%2520Investigation%2520%2520Report%2520March%25209%25202020.pdf .
Gates, D. (2018). Pilots struggled against Boeing's 737 MAX control system on doomed Lion Air flight. The Seattle Times, November 27, https://www.seattletimes.com/business/boeing-aerospace/black-box-data-reveals-lion-air-pilots-struggle-against-boeings-737-max-flight-control-system/ .
Gates, D. (2019). Flawed analysis, failed oversight: how Boeing, FAA Certified the Suspect 737 MAX Flight Control System. The Seattle Times, March 17, https://www.seattletimes.com/business/boeing-aerospace/failed-certification-faa-missed-safety-issues-in-the-737-max-system-implicated-in-the-lion-air-crash/ .
Gelles, D. (2019). Boeing can’t fly its 737 MAX, but it’s ready to sell its safety. The New York Times, December 24 (updated February 10, 2020), https://www.nytimes.com/2019/12/24/business/boeing-737-max-survey.html .
Gelles, D. (2020). Boeing expects 737 MAX costs will surpass $18 Billion. The New York Times, January 29, https://www.nytimes.com/2020/01/29/business/boeing-737-max-costs.html .
Gelles, D., & Kaplan, T. (2019). F.A.A. Approval of Boeing jet involved in two crashes comes under scrutiny. The New York Times, March 19, https://www.nytimes.com/2019/03/19/business/boeing-elaine-chao.html .
Gelles, D., & Kitroeff, N. (2019a). Boeing Believed a 737 MAX warning light was standard. It wasn’t. New York: The New York Times. https://www.nytimes.com/2019/05/05/business/boeing-737-max-warning-light.html .
Gelles, D., & Kitroeff, N. (2019b). Boeing board to call for safety changes after 737 MAX Crashes. The New York Times, September 15, (updated October 2), https://www.nytimes.com/2019/09/15/business/boeing-safety-737-max.html .
Gelles, D., & Kitroeff, N. (2019c). Boeing pilot complained of ‘Egregious’ issue with 737 MAX in 2016. The New York Times, October 18, https://www.nytimes.com/2019/10/18/business/boeing-flight-simulator-text-message.html .
Gelles, D., & Kitroeff, N. (2020). What needs to happen to get Boeing’s 737 MAX flying again?. The New York Times, February 10, https://www.nytimes.com/2020/02/10/business/boeing-737-max-fly-again.html .
Gelles, D., Kitroeff, N., Nicas, J., & Ruiz, R. R. (2019). Boeing was ‘Go, Go, Go’ to beat airbus with the 737 MAX. The New York Times, March 23, https://www.nytimes.com/2019/03/23/business/boeing-737-max-crash.html .
Glanz, J., Creswell, J., Kaplan, T., & Wichter, Z. (2019). After a Lion Air 737 MAX Crashed in October, Questions About the Plane Arose. The New York Times, February 3, https://www.nytimes.com/2019/02/03/world/asia/lion-air-plane-crash-pilots.html .
Gotterbarn, D., & Miller, K. W. (2009). The public is the priority: Making decisions using the software engineering code of ethics. Computer, 42 (6), 66–73.
Hall, J., & Goelz, P. (2019). The Boeing 737 MAX Crisis Is a Leadership Failure, The New York Times, July 17, https://www.nytimes.com/2019/07/17/opinion/boeing-737-max.html .
Harris, C. E. (2008). The good engineer: Giving virtue its due in engineering ethics. Science and Engineering Ethics, 14 (2), 153–164.
Hashemian, G., & Loui, M. C. (2010). Can instruction in engineering ethics change students’ feelings about professional responsibility? Science and Engineering Ethics, 16 (1), 201–215.
Herkert, J. R. (1997). Collaborative learning in engineering ethics. Science and Engineering Ethics, 3 (4), 447–462.
Herkert, J. R. (2004). Microethics, macroethics, and professional engineering societies. In Emerging technologies and ethical issues in engineering: papers from a workshop (pp. 107–114). National Academies Press, New York.
Hess, J. L., & Fore, G. (2018). A systematic literature review of US engineering ethics interventions. Science and Engineering Ethics, 24 (2), 551–583.
House Committee on Transportation and Infrastructure (House TI). (2020). The Boeing 737 MAX Aircraft: Costs, Consequences, and Lessons from its Design, Development, and Certification-Preliminary Investigative Findings, March. https://transportation.house.gov/imo/media/doc/TI%2520Preliminary%2520Investigative%2520Findings%2520Boeing%2520737%2520MAX%2520March%25202020.pdf .
IEEE. (2017). IEEE Code of Ethics. https://www.ieee.org/about/corporate/governance/p7-8.html .
IEEE. (2018). Ethically Aligned Design: A Vision for Prioritizing Human Well-being with Autonomous and Intelligent Systems (version 2). https://standards.ieee.org/content/dam/ieee-standards/standards/web/documents/other/ead_v2.pdf .
Jennings, M., & Trautman, L. J. (2016). Ethical culture and legal liability: The GM switch crisis and lessons in governance. Boston University Journal of Science and Technology Law, 22 , 187.
Johnston, P., & Harris, R. (2019). The Boeing 737 MAX Saga: Lessons for software organizations. Software Quality Professional, 21 (3), 4–12.
Josephs, L. (2019). FAA tightens grip on Boeing with plan to individually review each new 737 MAX Jetliner. CNBC, November 27, https://www.cnbc.com/2019/11/27/faa-tightens-grip-on-boeing-with-plan-to-individually-inspect-max-jets.html .
Kaplan, T., Austen, I., & Gebrekidan, S. (2019). The New York Times, March 13. https://www.nytimes.com/2019/03/13/business/canada-737-max.html .
Kitroeff, N. (2019). Boeing underestimated cockpit chaos on 737 MAX, N.T.S.B. Says. The New York Times, September 26, https://www.nytimes.com/2019/09/26/business/boeing-737-max-ntsb-mcas.html .
Kitroeff, N., & Gelles, D. (2019). Legislators call on F.A.A. to say why it overruled its experts on 737 MAX. The New York Times, November 7 (updated December 11), https://www.nytimes.com/2019/11/07/business/boeing-737-max-faa.html .
Kitroeff, N., & Gelles, D. (2020). It’s not just software: New safety risks under scrutiny on Boeing’s 737 MAX. The New York Times, January 5, https://www.nytimes.com/2020/01/05/business/boeing-737-max.html .
Kitroeff, N., & Schmidt, M. S. (2020). Federal prosecutors investigating whether Boeing pilot lied to F.A.A. The New York Times, February 21, https://www.nytimes.com/2020/02/21/business/boeing-737-max-investigation.html .
Kitroeff, N., Gelles, D., & Nicas, J. (2019a). The roots of Boeing’s 737 MAX Crisis: A regulator relaxes its oversight. The New York Times, July 27, https://www.nytimes.com/2019/07/27/business/boeing-737-max-faa.html .
Kitroeff, N., Gelles, D., & Nicas, J. (2019b). Boeing 737 MAX safety system was vetoed, Engineer Says. The New York Times, October 2, https://www.nytimes.com/2019/10/02/business/boeing-737-max-crashes.html .
Kline, R. R. (2001). Using history and sociology to teach engineering ethics. IEEE Technology and Society Magazine, 20 (4), 13–20.
Koenig, D. (2019). Boeing, FAA both faulted in certification of the 737 MAX. AP, October 11, https://apnews.com/470abf326cdb4229bdc18c8ad8caa78a .
Langewiesche, W. (2019). What really brought down the Boeing 737 MAX? The New York Times, September 18, https://www.nytimes.com/2019/09/18/magazine/boeing-737-max-crashes.html .
Leveson, N. G., & Turner, C. S. (1993). An investigation of the Therac-25 accidents. Computer, 26 (7), 18–41.
Marks, S., & Dahir, A. L. (2020). Ethiopian report on 737 Max Crash Blames Boeing, March 9, https://www.nytimes.com/2020/03/09/world/africa/ethiopia-crash-boeing.html .
Martin, D. A., Conlon, E., & Bowe, B. (2019). The role of role-play in student awareness of the social dimension of the engineering profession. European Journal of Engineering Education, 44 (6), 882–905.
Miller, G. (2019). Toward lifelong excellence: navigating the engineering-business space. In The Engineering-Business Nexus (pp. 81–101). Springer, Cham.
National Transportation Safety Board (NTSB). (2019). Safety Recommendations Report, September 19, https://www.ntsb.gov/investigations/AccidentReports/Reports/ASR1901.pdf .
Nissenbaum, H. (1994). Computing and accountability. Communications of the ACM , January, https://dl.acm.org/doi/10.1145/175222.175228 .
Nissenbaum, H. (1996). Accountability in a computerized society. Science and Engineering Ethics, 2 (1), 25–42.
Noorman, M. (2020). Computing and moral responsibility. In Zalta, E. N. (Ed.). The Stanford Encyclopedia of Philosophy (Spring), https://plato.stanford.edu/archives/spr2020/entries/computing-responsibility .
Pasztor, A. (2019). More Whistleblower complaints emerge in Boeing 737 MAX Safety Inquiries. The Wall Street Journal, April 27, https://www.wsj.com/articles/more-whistleblower-complaints-emerge-in-boeing-737-max-safety-inquiries-11556418721 .
Pasztor, A., & Cameron, D. (2020). U.S. News: Panel Backs How FAA gave safety approval for 737 MAX. The Wall Street Journal, January 17, https://www.wsj.com/articles/panel-clears-737-maxs-safety-approval-process-at-faa-11579188086 .
Pasztor, A., Cameron.D., & Sider, A. (2020). Boeing backs MAX simulator training in reversal of stance. The Wall Street Journal, January 7, https://www.wsj.com/articles/boeing-recommends-fresh-max-simulator-training-11578423221 .
Peters, D., Vold, K., Robinson, D., & Calvo, R. A. (2020). Responsible AI—two frameworks for ethical design practice. IEEE Transactions on Technology and Society, 1 (1), 34–47.
Peterson, M. (2019). The ethical failures behind the Boeing disasters. Blog of the APA, April 8, https://blog.apaonline.org/2019/04/08/the-ethical-failures-behind-the-boeing-disasters/ .
Pinkus, R. L., Pinkus, R. L. B., Shuman, L. J., Hummon, N. P., & Wolfe, H. (1997). Engineering ethics: Balancing cost, schedule, and risk-lessons learned from the space shuttle . Cambridge: Cambridge University Press.
Republic of Indonesia. (2019). Final Aircraft Accident Investigation Report. KNKT.18.10.35.04, https://knkt.dephub.go.id/knkt/ntsc_aviation/baru/2018%2520-%2520035%2520-%2520PK-LQP%2520Final%2520Report.pdf .
Rich, G. (2019). Boeing 737 MAX should return in 2020 but the crisis won't be over. Investor's Business Daily, December 31, https://www.investors.com/news/boeing-737-max-service-return-2020-crisis-not-over/ .
Schnebel, E., & Bienert, M. A. (2004). Implementing ethics in business organizations. Journal of Business Ethics, 53 (1–2), 203–211.
Schwartz, M. S. (2013). Developing and sustaining an ethical corporate culture: The core elements. Business Horizons, 56 (1), 39–50.
Stephan, K. (2016). GM Ignition Switch Recall: Too Little Too Late? [Ethical Dilemmas]. IEEE Technology and Society Magazine, 35 (2), 34–35.
Sullenberger, S. (2019). My letter to the editor of New York Times Magazine, https://www.sullysullenberger.com/my-letter-to-the-editor-of-new-york-times-magazine/ .
Thompson, D. F. (1980). Moral responsibility of public officials: The problem of many hands. American Political Science Review, 74 (4), 905–916.
Thompson, D. F. (2014). Responsibility for failures of government: The problem of many hands. The American Review of Public Administration, 44 (3), 259–273.
Tkacik, M. (2019). Crash course: how Boeing’s managerial revolution created the 737 MAX Disaster. The New Republic, September 18, https://newrepublic.com/article/154944/boeing-737-max-investigation-indonesia-lion-air-ethiopian-airlines-managerial-revolution .
Travis, G. (2019). How the Boeing 737 MAX disaster looks to a software developer. IEEE Spectrum , April 18, https://spectrum.ieee.org/aerospace/aviation/how-the-boeing-737-max-disaster-looks-to-a-software-developer .
Useem, J. (2019). The long-forgotten flight that sent Boeing off course. The Atlantic, November 20, https://www.theatlantic.com/ideas/archive/2019/11/how-boeing-lost-its-bearings/602188/ .
Watts, L. L., & Buckley, M. R. (2017). A dual-processing model of moral whistleblowing in organizations. Journal of Business Ethics, 146 (3), 669–683.
Werhane, P. H. (1991). Engineers and management: The challenge of the Challenger incident. Journal of Business Ethics, 10 (8), 605–616.
Acknowledgement
The authors would like to thank the anonymous reviewers for their helpful comments.
Author information
Authors and affiliations
North Carolina State University, Raleigh, NC, USA
Joseph Herkert
Georgia Institute of Technology, Atlanta, GA, USA
Jason Borenstein
University of Missouri – St. Louis, St. Louis, MO, USA
Keith Miller
Corresponding author
Correspondence to Joseph Herkert.
About this article
Herkert, J., Borenstein, J. & Miller, K. The Boeing 737 MAX: Lessons for Engineering Ethics. Sci Eng Ethics 26, 2957–2974 (2020). https://doi.org/10.1007/s11948-020-00252-y
Keywords: Engineering ethics, Airline safety, Engineering design, Corporate culture, Software engineering
MIT Sloan Teaching Resources Library
Boeing's 737 MAX 8 Disasters
John D. Sterman
James Quinn
Jul 26, 2023
On October 29, 2018, Indonesia’s Lion Air flight 610, a nearly new Boeing 737 MAX 8 jet, plunged into the Java Sea at 400 miles per hour, killing all 189 people on board. Eight days later Boeing issued a bulletin to all 737 MAX 8 and 737 MAX 9 operators stating that “erroneous angle-of-attack data” could result in “uncommanded nose-down movement of the aircraft” and that this action could repeat until the related system was deactivated. The US Federal Aviation Administration (FAA) followed by issuing an Emergency Airworthiness Directive requiring Boeing to revise the operating procedures in its flight manual for 737 MAX aircraft, including the new Maneuvering Characteristics Augmentation System (MCAS), software designed to prevent the aircraft from stalling by automatically pushing the nose of the plane down when it detected a high angle of attack. Boeing promised the software would be fixed in a few weeks, but by March 2019 the revisions were still not completed. Then, on March 10, Ethiopian Airlines flight 302 from Addis Ababa to Nairobi, Kenya, the same model 737 MAX 8, crashed shortly after takeoff, killing all 157 on board.
Under pressure from Indonesia and Ethiopia, airlines, pilots, the public—and the families of the 346 dead—the President of the United States ordered the grounding of all 737 MAX aircraft on March 13, 2019. Other nations followed suit, grounding Boeing’s best-selling plane worldwide for 21 months. Hundreds of MAX 8 orders were canceled, Boeing suffered billions in losses, the CEO was replaced, and multiple civil and criminal investigations were launched. Evidence soon showed that the company’s own engineers and test pilots had known about the MCAS problem well before the crashes, but that knowledge was not shared with the FAA, airlines, or pilots.
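The bulletin’s warning that the nose-down action “can repeat until the related system is deactivated” can be illustrated with a small, hypothetical simulation. This is not flight software, and the trim increments, cycle counts, and function name are invented for clarity; the sketch only shows how an automatic command that re-arms on persistent bad sensor data keeps accumulating until the system is switched off.

```python
# Hypothetical illustration of "repeat until the related system is deactivated".
# NOT flight software: trim units, increments, and cycle counts are invented.

def simulate_repeating_trim(erroneous_aoa: bool, deactivate_at_cycle=None, cycles=10):
    """Return accumulated nose-down stabilizer trim (arbitrary units).

    Each cycle, another nose-down increment is commanded if the system still
    sees a (spuriously) high angle of attack and has not been deactivated.
    """
    trim = 0.0
    for cycle in range(1, cycles + 1):
        if deactivate_at_cycle is not None and cycle >= deactivate_at_cycle:
            break  # crew deactivates the automatic trim system
        if erroneous_aoa:
            trim -= 1.0  # one more increment of nose-down trim
    return trim


# Faulty sensor, system never deactivated: the nose-down trim keeps ratcheting.
print(simulate_repeating_trim(erroneous_aoa=True))                         # -10.0
# Faulty sensor, crew deactivates the system after three cycles: ratcheting stops.
print(simulate_repeating_trim(erroneous_aoa=True, deactivate_at_cycle=4))  # -3.0
```

Under these invented numbers, the accumulated nose-down trim grows with every cycle until the system is deactivated, which is the behavior the bulletin warned operators about.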
Learning Objectives
This case explores the causes of and responses to the accidents, from multiple perspectives, from the design of the MAX 8 and MCAS software to Boeing’s corporate culture to government oversight. Questions to consider include:
What role did company engineers play in causing the two catastrophes? What about their managers? Their test pilots?
What role did Boeing’s CEO, board, and other senior leaders play in shaping the processes, procedures, and corporate culture that may have set the stage for the disasters?
Why were the problems with MCAS covered up?
Why didn’t the F.A.A. detect the flaws in the design before allowing the 737 MAX 8 to enter service?
How could future disasters be prevented?
And what of the families of the 346 victims?
Appropriate for the Following Course(s)
operations management, industrial engineering, computer science, product design, organizational behavior, corporate strategy, public policy, government regulation, and ethics
Why Boeing’s Problems with the 737 MAX Began More Than 25 Years Ago
Once again, Boeing’s 737 MAX is in the headlines.
After two crashes that killed 346 people in 2018 and 2019, and five years of ensuing design changes and regulatory scrutiny, the 737 MAX 9 is grounded again after a mid-air blowout of a fuselage panel on January 5. After loose bolts were discovered on other MAX 9s, the Federal Aviation Administration (FAA) grounded the planes and opened an investigation into whether the MAX is safe to fly, accompanied by a stern warning: “This incident should have never happened, and it cannot happen again.”
Boeing has also experienced repeated problems in design and production with its newest wide-body jet, the 787 Dreamliner. Such frequent, repeated crises point to a deeper issue than isolated engineering mishaps. The underlying cause of these issues is a leadership failure that has allowed cultural drift away from Boeing’s once-vaunted engineering quality.
William Boeing created the commercial aviation industry. For the next century, Boeing was the leading producer, based on its excellence in aircraft design and safety. Boeing’s problems today date back to former CEO Philip Condit, who made two ill-fated decisions that dramatically changed Boeing’s culture. The first was acquiring archrival McDonnell Douglas in 1997, a leader in military aviation with its fighter jets and Boeing’s major competitor in commercial aviation. In contrast to Boeing’s culture of engineering excellence, McDonnell Douglas focused on cost-cutting and upgrading older airplane models at the expense of all-new aircraft.
Secondly, in 2001 Condit moved Boeing’s headquarters from its original home in Seattle to Chicago—all to gain $60 million in state and local tax credits over 20 years. With none of its businesses based in Chicago, the move separated Boeing’s corporate executives from its engineering and product decisions and alienated its Seattle-based engineers.
Leader turnover, ethical lapses, and buybacks
After Condit resigned in 2003 following an ethics scandal, Boeing’s board convinced former McDonnell Douglas executive Harry Stonecipher to come out of retirement to replace Condit. Stonecipher, a General Electric (GE) alum, immediately set out to change Boeing’s culture, proclaiming, “When people say I changed the culture of Boeing, that was the intent, so that it is run like a business rather than a great engineering firm.”
One of Stonecipher’s fateful decisions was to turn down the proposal from Boeing’s head of commercial aviation to design an all-new single-aisle aircraft to replace the Boeing 727 (FAA-certified in 1964), 737 (1968 certification) and 757 (1982). Instead of designing a new airplane incorporating all the advances in aviation technology from the past 30-40 years, Stonecipher elected to maximize profits from older models and use the cash to buy back Boeing stock.
Just two years after taking over, Stonecipher resigned after violating the company’s code of conduct. Rather than promoting the internal candidate, Alan Mulally, who headed up commercial aviation and led the development of the highly successful Boeing 777, the board recruited ex-GE executive Jim McNerney to succeed Stonecipher. At the time, McNerney was CEO of 3M, a tenure that lasted just four years. Meanwhile, Mulally became CEO of Ford, where he led one of the most important business turnarounds in history.
Maximizing earnings, but at what cost?
By 2011, Boeing found its 737 losing out to rival Airbus’s A320neo with major customers like American Airlines. Rather than designing a new aircraft to replace the 737, McNerney opted for a five-year program to upgrade the 737 into the 737 MAX. This decision required the redesign to stay within the FAA’s original type certification and retain the same flying characteristics. Boeing also agreed with customers like Southwest Airlines to avoid retraining pilots and updating its training manuals. These decisions minimized short-term cost to maximize short-term earnings.
In my experience with advanced technology products, quick fixes often lead to design compromises that create more problems. This happened with the 737 MAX in 2015 when it encountered stall problems. Rather than making further design changes that would have risked the 737’s original type certification, Boeing opted for a major software change that was not fully disclosed to the FAA or described in its pilots’ manual.
The flaws in the software design, which took flight control away from the pilots without their knowledge based on data from a single sensor, ultimately led to the two 737 MAX crashes in 2018 and 2019, causing the deaths of 346 people. After the first crash, Boeing issued a statement that offered pilots and passengers “our assurance that the 737 MAX is as safe as any airplane that has ever flown the skies.” That assurance came back to haunt Boeing four months later when the second MAX crashed.
When Boeing leadership failed to ground its fleet of 737 MAX planes, the FAA forced them to do so, pushing Boeing into a lengthy set of regulatory inspections, tests, and design changes lasting 20 months. During that time, its customers waited for aircraft they desperately needed in service. Then Congress held a damaging hearing that eventually forced Boeing’s board to relieve then-CEO Dennis Muilenburg. Today, ex-GE executive David Calhoun, a Boeing board member since 2009, has the onerous task of restoring Boeing’s business and its quality standards.
The cost of reputational damage
When we discuss the Boeing cases in my classes at Harvard Business School, I ask participants, “Are Boeing’s problems caused by individual leadership failures or a flawed culture?” The answer “both” eventually emerges. It was the actions of Condit and Stonecipher that turned Boeing’s culture from excellence in aviation design, quality, and safety to an emphasis on short-term profit and on distributing cash to shareholders via stock buybacks.
McNerney compounded the problem through his decision to launch a quick fix to the 737 rather than design a new airplane. Muilenburg was left with the flawed aircraft, but failed to ground the planes after the first crash or to pinpoint the root cause of the failure. The Boeing board, which is composed of exceptional individuals, failed to preserve Boeing’s culture and reputation.
Ironically, decisions made in the name of shareholder value over the past two decades have cost Boeing’s investors $87 billion since 2018. The long-term damage to Boeing’s reputation and market position is even greater, as Airbus has outsold Boeing in new aircraft orders in each of the last five years.
Now CEO Calhoun faces immediate pressure to fix the MAX fuselage panel and win back market share. However, his real task must be to restore Boeing’s culture of aviation excellence—a long-term project that is the only real solution to restore this once iconic company to aviation leadership.
Bill George is an Executive Fellow at Harvard Business School, and former Chair and Chief Executive Officer of Medtronic. He doesn't hold a financial stake in Boeing, its competitors, or any airlines.
You Might Also Like:
- What Went Wrong with the Boeing 737 Max?
- Courage: The Defining Characteristic of Great Leaders
- Why Leaders Lose Their Way
Feedback or ideas to share? Email the Working Knowledge team at [email protected] .
Image note: Illustration was created with artwork from AdobeStock/John Vlahidis.
- 25 Jun 2024
- Research & Ideas
Rapport: The Hidden Advantage That Women Managers Bring to Teams
- 11 Jun 2024
- In Practice
The Harvard Business School Faculty Summer Reader 2024
How transparency sped innovation in a $13 billion wireless sector.
- 24 Jan 2024
- 27 Jun 2016
These Management Practices, Like Certain Technologies, Boost Company Performance
- Leadership Development
- Research and Development
- Crisis Management
- Air Transportation
- Transportation
Sign up for our weekly newsletter
- SUGGESTED TOPICS
- The Magazine
- Newsletters
- Managing Yourself
- Managing Teams
- Work-life Balance
- The Big Idea
- Data & Visuals
- Case Selections
- HBR Learning
- Topic Feeds
- Account Settings
- Email Preferences
Share Podcast
What Went Wrong with the Boeing 737 Max?
Harvard Business School professor Bill George examines the Boeing 737 Max crashes through the lens of industry and corporate culture.
How did the evolution of Boeing’s organization and management lead to two tragic plane crashes within six months, in which a total of 346 people died?
Harvard Business School professor Bill George discusses the long roots that ultimately led to the crash of Lion Air flight 610 in October 2018 in Indonesia and the crash of Ethiopian Airlines flight 302 in March 2019 in Ethiopia. He discusses the role cost cutting, regulatory pressure, and CEO succession played in laying the foundation for these tragedies and examines how Boeing executives responded to the crises in his case “ What Went Wrong with Boeing’s 737 Max? ”
BRIAN KENNY: On January 17, 1967, aviation history was made when Boeing unveiled the very first Boeing 737 aircraft at King County International Airport, outside of Seattle. Baby Boeing, as it was called, had a greater seating capacity than any plane in its class and sported wing-mounted engines that provided a more balanced center of gravity. The team at Boeing thought they had a winner on their hands, and they were right. The Boeing 737 is the top-selling commercial aircraft of all time and holds its own entry in the Guinness Book of World Records. It’s operated by more than 5,000 airlines in over 200 countries. And at any given moment, there are roughly 1,200 737 airliners in the sky. In over three decades of operation, the plane was involved in 19 fatal accidents, which equates to about one accident for every four million departures. So when a brand new Boeing 737 Max fell from the sky in October 2018, killing all 189 souls on board, the aviation world took notice.
Today on Cold Call, we welcome Professor Bill George to discuss his case entitled “What Went Wrong with Boeing’s 737 Max?” I’m your host, Brian Kenny, and you’re listening to Cold Call on the HBR Presents network.
Bill George is an expert on leadership, a topic he teaches and writes about extensively, including numerous books, articles and business cases. He’s also the former chairman and chief executive officer of Medtronic. And he’s a repeat guest here on Cold Call . Bill, we’re so happy to have you back. Thanks for joining us today.
BILL GEORGE: Thanks for having me back, Brian.
BRIAN KENNY: We always love your cases. They’re super relatable. Sometimes they’re looking at historic figures, like Martin Luther King Jr. We had a conversation about that case, and then sometimes they’re more ripped from the headlines, and this falls into that category. I think these incidents are pretty fresh on people’s minds. We’re going to dig into some of the cultural issues at Boeing and I think people will really like to hear your thoughts on where there might’ve been lapses in judgment or leadership. Let me ask you to start by telling us: What’s your cold call to start the case when you walk into the classroom? Or when you tune in on Zoom, I guess these days, is how we’re doing it.
BILL GEORGE: Brian, here’s how I start. It’s now March nine, 2019. You’re Dennis Muilenburg, the CEO of Boeing. You’re asleep in your apartment in Chicago, while your family’s back at your home in St. Louis. You get a call from Boeing’s chief safety officer: “Dennis, we’ve had bad news. A second Boeing 737 Max has crashed outside Addis Ababa airport in Ethiopia, just a few minutes after takeoff. We just don’t know anything at this stage.” You’re Dennis Muilenburg. I want you to walk in his shoes. What are you going to do right now?
BRIAN KENNY: That’s a tough question, but it’s the kind of thing that these leaders have had to face. Tell me… You’ve written so many cases on leadership. Why did this one strike you as one that was worthwhile? And how does it relate to the things that you write about and teach as a scholar?
BILL GEORGE: I think this captures both the crucibles that the CEO and all the executives of Boeing felt about these two planes crashing from the air, and how they led through this crisis. And this really illustrates leading in crisis, which a lot of us are facing these days. But it’s an understanding of all the pressures that come to bear, because you have not only the internal pressure, you have the customer pressures, you have the government pressure of the FAA, you have the media pressures, you have investor pressures. You probably have pressures from your board of directors who want to know what’s happened. And all these things come to bear. And so the real test of a leader is how you respond in crisis. That’s what I was trying to capture and really examine: how this leader dealt with this crisis.
BRIAN KENNY: Give us a sense, before we really dive into the details: what are the key learning points that emerge out of these cases?
BILL GEORGE: I ask the students, after they describe how they would deal with this situation, what the root cause of this problem is. That’s at the start of the class. The other learning that comes out of it is that a lot of times crises have very long roots. And if you only deal in the immediacy of it, you don’t understand the depth and complexity of the problem. And the third learning point is to understand what pressures CEOs are under from external forces, as well as internal forces. And how they have to not only balance, but integrate and satisfy and deal with each of these forces. And the pressures are quite different from a supplier or a customer or the FAA. Or frankly, Muilenburg had to testify before Congress, and the pressures you get testifying before Congress. That was an extremely embarrassing situation because the families of the victims of these crashes came with pictures of their loved ones on a board. It was about 16 inches high and eight inches wide. And they held these pictures up behind him; at a certain point in time, a dramatic point, they all held them up. And the Congressman asked him to turn around and look, and they asked him, “How are you feeling?” So it’s a high pressure job, I’ll say that. And I think few people understand the multiple pressures CEOs are under, and they wonder, why do they behave in the ways they do?
BRIAN KENNY: You talked about the Ethiopian Airlines disaster. I teased in the intro the Lion Air flight that crashed in 2018. And that was the first 737 Max to go down. So I wonder if you could maybe talk a little bit about that flight, but I also want to talk about Boeing and its history. They’ve been around for a long time, a storied history in aviation. So tell us what happened with Lion Air first, maybe as a starting point.
BILL GEORGE: The interesting thing about that is how Boeing responded. There was no statement from the CEO. Instead, there was a statement from the public relations department saying that we are very sad this happened, and we are deferring all inquiries to the National Transportation Safety Board that’s going to investigate it. Basically not stepping up to any level of responsibility. And then about a week, 10 days later, Boeing published a statement saying the 737 Max is the safest aircraft that’s ever been flown. Boy, that’s a risky statement when you don’t have all the information. The pressures people are under. You don’t know what happened. There was a clear implication that they were trying to blame it on the pilots. But you really don’t know what happened. And you’re making these very strong statements and they aren’t coming from a person, the CEO. They’re coming from a press department. And I think that’s significant to examine. They did not ground the planes. They let the others do the investigation for them, and then eventually they started their own investigation, but the planes continued to fly.
BRIAN KENNY: Yes. So, in your opinion, is this consistent with the culture at Boeing? Talk a little bit about the origins of Boeing and the culture that was there, let’s say prior to the merger with McDonnell Douglas.
BILL GEORGE: Boeing, founded by William Boeing, was really the premier aviation company for the last 110 years. And it’s an amazing company. I’ve visited many times out in Seattle and it’s an amazing company. And they have these huge bays where they produce the planes in Renton, Washington. Of course, they’re the ones who produced the original 707. They produced the 727 and then they had a huge bonanza with the 747, which of course is a gargantuan airplane that holds up to 450 passengers. So they were the premier company and they were an engineer’s company. They were a company where every engineer would fight to get a job there. And you mentioned McDonnell Douglas. There are pretty clear indications in the case and in all my studies that when they acquired McDonnell Douglas, who was the premier fighter jet maker — I used to work with them when I was in the Pentagon in the sixties with the F4, a very, very popular aircraft — that the culture changed and it became much more, if you will, cost oriented and less safety and design oriented. We’ve seen this happen to a number of American companies. It happened to General Motors in the 20th century. General Electric, too, started focusing less on engineering, more on cost reduction. And so Boeing did that, and it also had a rather difficult succession of CEOs. Philip Condit was the CEO and had to resign over an ethical scandal. He actually moved the headquarters from Seattle to Chicago for something like $60 million in subsidies spread over 20 years. Personally, I felt at the time it was a terrible mistake. And I think history has proven it was indeed a very poor decision. Because you separate yourself now from the people running the company, from the engineers, the marketers, the people engaged every day in the business. And production people, you can’t walk out in the factory, they have no interest in Chicago. And then Harry Stonecipher came in. He actually was called out of retirement when Condit was terminated. And he actually did quite a good job reviving the commercial aircraft business because they were losing out to Airbus and always complaining about Airbus’s subsidies. But Stonecipher was a cost guy and he openly said, “We’re going to cut the costs. This has got to be a more business-oriented company, less of an engineer’s company.” And Stonecipher also, unfortunately, was caught in a sexual scandal and had to resign. And they had the obvious candidate to become Stonecipher’s successor, right there in place, who had revived the commercial aviation business. A man named Alan Mulally. But Mulally was passed over for Jim McNerney, who at the time was CEO of 3M. He had been at General Electric, a very famous executive, and had only been at 3M for three years. He even got a call from the President of the United States, urging him to take the Boeing job in the national interest. So, they’ve had quite a succession of CEOs. And Muilenburg comes in. He became COO in 2013, but he really didn’t become CEO or take charge until 2016, when McNerney retired.
BRIAN KENNY: So, let me just jump in for a second, because I’m hearing words like cost cutting, downsizing, whatever the right word is, whatever euphemisms we want to use. Cost-cutting in aircraft manufacturing: there seems to be a tension there. And I’m wondering if what we started to see was a change in emphasis in the way that they went about designing and developing aircraft that may have caused them to not be as thorough as they were in the earlier days when they first developed the 737. Is that fair to say?
BILL GEORGE: It’s always hard to prove that if you’re not working there, but that certainly is the impression one has.
BRIAN KENNY: So, there’s huge cost pressure in the industry, we know that. Why does Boeing make the decision to redesign… Or rather modify, I guess, might be the right word, the 737, rather than just designing an entirely new aircraft?
BILL GEORGE: Bingo, that’s the root cause of the case. It takes people a while to get there. You got there a little bit faster, Brian. That’s the key question. They were losing out to Airbus and American Airlines was about ready to shift away from them. So, they put together a crash program to develop a modification of the 737 that could hold the American Airlines business. Point of fact, it was late and it took five, five and a half years to get there, but they did everything to avoid a new type certification and stay within the original one that you referenced, back in 1968. So, think about all the advances in technology and aviation. They had to stay within this type certification, which meant they couldn’t make any changes. They couldn’t make it higher, they couldn’t add things, they couldn’t change their training manuals. And so they really couldn’t describe the differences to the pilots that were flying it, or even train them on those differences, to stay within this type certification. So that, I believe, was the start of everything in the case: it predetermined that every design decision was based on not designing a whole new aircraft, which might have taken five to nine years.
BRIAN KENNY: So, is that the key issue there then? Because I’m wondering what the benefits are of constraining yourself in this way, by using a modification approach rather than a completely new design. Are they saving money, or time, or both?
BILL GEORGE: Speed. Get there fast. You save money. Don’t spend the kind of money you did on the 787. Get there fast and hold the customer. And so they’re responding to the short-term competition and frankly made a lot of short-term decisions that proved to be fatal. I was in the safety business at Medtronic. And we knew that one failed defibrillator could cost someone their life. And if we had a software problem that went across all of our defibrillators, they’d all have to be pulled out of people’s bodies or changed, corrected. And so safety was paramount. And you’re talking about a plane that’s flying with 150, 200 people in the air and maybe 300. You can have no compromise on safety. There is no such thing as a trade-off between cost and safety. Safety has to be paramount in everything you do, because the cost of a failure… I’m not even going to talk about the cost of this, but just the cost of one plane going down so far exceeds any savings you get in designing the aircraft that you wouldn’t ever want to consider compromising. But these things are very subtle. See, there’s no one that goes out and says, “We’re putting cost over safety.” General Motors had that problem back in the nineties. No one’s saying that. It’s a slippery slope that people get to. And that’s important for students in the class to understand: how do you get on that slippery slope where all of a sudden decisions are being made that put cost over safety? And if you do that, you’re going to have a fatal crash, which Boeing did.
BRIAN KENNY: What were some of the design changes they made?
BILL GEORGE: They had to move the engines because it was a heavier plane. But then they couldn’t raise the height like they normally would. So the engines were going to be too low or the profile was too low. So they ran into real problems early on and they had to redesign part of the electronic systems. And so they put in a system called the MCAS system. And that was a software fix to correct this problem. And what MCAS did is, it had one sensor, and if that sensor sensed something was going wrong, it would take over flying the aircraft. It would actually take it away from the pilots. It’s like having a self-driving car and you can’t regain control. And so the pilots had lost control, but no one told them this. So if you’re trying to pitch it up and it’s going down, and you pitch it up and it goes down more because you don’t have control. And that’s exactly what happened in these two crashes. They had the same pattern of problems. But meanwhile, they couldn’t train the pilots on what this was all about. And so, as I call it in the classroom, it’s Murphy’s Law of Compound Error. When you have one thing go wrong, if you don’t fix the root cause, then you’re going to have a compounding series of problems. Because you make decisions based on that first flawed decision, and it can lead to a very negative situation.
Editor’s Note: MCAS is Maneuvering Characteristics Augmentation System
BRIAN KENNY: I would assume, Bill, that they must have tested these planes as they were building them. And the FAA plays some role, don’t they, in approving the planes when they’re done. Where was the FAA throughout this process?
BILL GEORGE: The FAA’s budgets were being cut back severely during the early stages of this. And so Boeing agreed to put some of its people into the FAA. So in effect, you had Boeing presenting the product for approval, and Boeing-paid engineers reviewing it, as FAA officials. There was no fraud or sham here. It was very out in the open. It was a huge mistake.
BRIAN KENNY: Yes. You talked a lot about the changes in CEOs that they went through. It sounds like it was a revolving door and maybe not with the most qualified people stepping into that role. What would you say, in your experience working with CEOs in all industries, what would the role of the CEO be in a situation like this? How should they behave? Should they be advocating for the engineers and their employees? Should they be advocating for the customer? What should they be doing?
BILL GEORGE: First of all, I would disagree with one thing. I think they were qualified. Jim McNerney was fully qualified to come in there. He had run the GE jet engine business. Dennis Muilenburg was fully qualified. He’d been with the company 22 years as an engineer, but he came in very late in the game. So what do you do when you’re CEO in 2016 and the planes are flying? You just don’t go shut your production line down and abandon your customers. It could’ve been done after the first Lion Air crash. In fact, it should have been done, with the benefit of 20/20 hindsight. When you have that first crash, that first product that fails, that’s a good place to shut everything down. Yet, in my experience, not saying something leads to real problems in a crisis. People want to hear from a human being. They want to hear from the CEO. So the role of the CEO, first of all, is to be a spokesman. You remember, Brian, the famous Tylenol case with Jim Burke?
And he was out there every day. He didn’t have the information. He didn’t know who’d put cyanide in the Tylenol. He had no information, but he went out and reassured people, “We’re doing everything, okay.” And he didn’t say it’s impossible this could happen again, because he was scared to death it might happen again. So, you’ve got to be out there on the firing line. I remember the case of British Petroleum, when they had that huge blowout in the Gulf of Mexico and it’s leaking, and it took over 90 days to shut it down. Tony Hayward, the CEO, didn’t go down there for six weeks. After six weeks, he goes there. And they ask, “Mr. Hayward, what are you going to say about this?” And he said, “Actually I just wish this whole situation to be over. I’d like my life back.” And someone said, “Mr. Hayward, you know that these 11 employees of yours, they’re not getting their life back.” But that kind of inability to respond to a crisis… I think many CEOs are not well-trained. You can be an engineer at Boeing for 25 years and get to be CEO, but it doesn’t mean you’re well-trained to be in the public eye. So I think CEOs today must be out in front on very important issues because you’re going to be challenged. That’s a big learning from the case, of how important it is to be prepared to do that.
BRIAN KENNY: Yes. Bill, this has been a fascinating conversation. It’s a great case and a disturbing one because you hope that these problems have been resolved, but I guess you don’t really know for sure. But tell us, if there’s one thing you want listeners to take away from this case, what would it be?
BILL GEORGE: The importance of leadership in a crisis, and why, as the leader, you need to be out in front. If there’s a problem, you should be out there, first of all, apologizing and saying how sorry you are on behalf of your organization for what happened, even if you don’t know whether you’re culpable. Express empathy, because in the media it’s not about facts, it’s about feelings. How do people feel? Do you care about them? And we had a great debate in the case discussion about whether it’s leadership or culture. Well, it’s both. But if the culture’s not right, the leader has to change that culture. And that’s a tough job, but it’s so important that leaders take that role on. So they have the internal role and the external role.
BRIAN KENNY: We’ve talked before about crucible moments and this is something you’ve written quite a bit about. And it sounds like this is exactly what a crucible moment is, right? By definition.
BILL GEORGE: It sure is. And this is the test because you have no preparation for it. You can go through all the crisis training you want, but no one is going to actually put you in that situation. So that’s why I think that Harvard Business School training is so important. We have a lot of people in this particular course, who become CEOs. And this is a great preparation for them to realize, “Oh my gosh, these are the kinds of things that could happen to me too.” So I’d better be emotionally and mentally prepared, even if I’m going to have to work on the fly and decide what to do, because the situation is constantly changing. The fact-based situation’s changing. But like I said, this is a real test.
BRIAN KENNY: Yes. Bill, it’s a great case. Thank you for writing it and thank you for coming on to discuss it with us.
BILL GEORGE: Anytime, Brian, thank you.
BRIAN KENNY: If you enjoy Cold Call, you should check out our other podcasts from Harvard Business School, including After Hours, Skydeck, and Managing the Future of Work. Find them on Apple Podcasts or wherever you listen. Thanks again for joining us. I’m your host Brian Kenny, and you’ve been listening to Cold Call, an official podcast of Harvard Business School brought to you by the HBR Presents Network.