The Downs of AI Technology in the Automobile Industry
People often associate technology with evolution and progress, but this is not true in every circumstance. The automotive industry in the USA is trying to compete with Japanese automakers, and in doing so it takes risks that can have dire consequences for people, the industry, and the nation at large. Pending regulations on emissions control and safety already burden the industry, leaving it with little choice but to push technology in pursuit of visible progress.
On 14 February 2016, one of Google's self-driving cars collided with a bus in Mountain View. No one was injured, as the car was traveling at 2 mph and the bus at 15 mph; Google stated that the car was trying to maneuver around sandbags in the street when the incident occurred. In another case in California, a Google autonomous car that was programmed to take its time moving through a lane when the lights changed could not cope with an unexpected situation on the road. Such cars are at times built with profit rather than quality in mind, because people fail to recognize that business has a narrow focus and profit is always a hidden motive (Antonelli 12). Automated cars sometimes fail because they are rushed to market without proper testing.
The program to retire old cars and replace them with new fuel-efficient, AI-equipped vehicles has been suggested and promoted by people who want to make money (Shanjun et al. 176). As a result, accidents have increased, and there is no proper regulation to guarantee the safety of customers and passengers riding in AI cars. For instance, a malfunction of the Autopilot system caused an accident on 8 May involving a Tesla. The crash claimed the life of a former Navy SEAL on a Florida highway when a truck crossed the car's path (Kerstetter). This fatal crash showed that the self-driving car still falls short and signaled that much remains to be done to improve AI vehicles (Kerstetter). Self-driving technology is not yet comprehensive, and its shortcomings can have dire consequences. In response, Tesla noted that Autopilot was designed in-house, combining its own technology with systems from suppliers. These cars are still not fully trustworthy.
AI technology is meant to help people, ease the burden on drivers, and support the development of safer automobiles. Nevertheless, there is always room for error, so it is hard for automotive companies to produce a car free of technical flaws. Automakers are usually blamed when such a car is involved in an accident or encounters even a slight problem. People expect these cars to work flawlessly and fail to understand that it is nearly impossible to create something perfect.
Automakers and technology companies are urging people to move toward an autonomous future. They argue that AI cars may not crash as often as human drivers do, but that does not mean they are infallible. There has also been a high-profile hacking of a car, which raises serious questions about these vehicles' security. Moreover, the cars can react differently from what a person expects or fail to carry out operations they are commanded to perform (Koo et al. 270). Thus, autonomous cars can still cause accidents, even if they are few. The cars may also fail to heed the sense of ethics seen among humans (Lin 70), because some decisions do not depend on fixed rules but require a person to think before acting, and that is not possible for these cars.
Antonelli, Cristiano. New Information Technology and Industrial Change: The Italian Case. Springer Science & Business Media, 1988.
Kerstetter, Jim. "Daily Report: Tesla Driver Dies in 'Autopilot' Accident." The New York Times, 1 July 2016, https://www.nytimes.com/2016/07/02/technology/daily-report-tesla-driver-dies-in-autopilot-accident.html?_r=0. Accessed 28 Mar. 2017.
Koo, Jeamin, et al. "Why Did My Car Just Do That? Explaining Semi-Autonomous Driving Actions to Improve Driver Understanding, Trust, and Performance." International Journal on Interactive Design and Manufacturing, vol. 9, no. 4, 2014, pp. 269-275.
Lin, Patrick. "Why Ethics Matters for Autonomous Cars." Autonomous Driving: Technical, Legal and Social Aspects, edited by Markus Maurer et al., Springer Berlin Heidelberg, 2016, pp. 69-85.
Shanjun, Linn, et al. "Evaluating 'Cash-for-Clunkers': Program Effects on Auto Sales and the Environment." Journal of Environmental Economics and Management, vol. 65, no. 2, 2013, pp. 175-193.