Assignment 1: Autonomous Cars

Should programmers be responsible for determining which ethics settings are used in autonomous cars?

As we advance in technology, we are becoming less and less dependent on people and more dependent on robots.

Autonomous cars are the future. A self-driving car is a vehicle controlled by an autopilot that allows passengers and other road users to travel safely to their destination with minimal or no human interaction. I believe autonomous cars are the way forward, but there is further development to be done. They offer a safer alternative to human-driven cars. Self-driving cars are designed to improve road safety and cause fewer injuries and fatalities on the roads. When a person driving a car does something to avoid an incident, it is considered a reaction.

Self-driving cars are programmed to calculate the best outcome in the event of an accident; the programmer has told the car in advance what to do. No system is perfect. There are always problems with new systems, and even with existing ones.

Manufacturers have to do their best to create a safe, secure system for both the owner and other road users. In the event of a crash, people don't have time to think about morals. A computer, however, has the time, and it can make decisions based on pre-programmed ethics. These ethics would not be chosen by the programmer but predetermined by extensive research projects. I don't think the programmer should be responsible for which ethics settings are used in autonomous cars.

Ninety-three percent of the nearly six million crashes in America were caused by human error (Ref 1). Ethics cannot be ignored because, if the car behaves badly, the car manufacturer could be held legally liable. The manufacturer could then pass this blame on to the programmers, but this is not right. I believe that the manufacturer of the car should be held responsible. If cars are to be autonomous, they need to be able to reduce human errors when making driving decisions. Even though this is a hard algorithm to design, semi-autonomous vehicles are already available to consumers.

With the development of semi-autonomous vehicles arises the issue of hacking. The software has to be secure, but if cars are hacked into, the manufacturer or programmers could be blamed for not making it secure enough. This is an issue with autonomous cars, but it is a security issue rather than an ethical one. If the car was set to protect other drivers and road users over its owner, then it should be programmed to prefer a collision with the heavier vehicle rather than the lighter one. If an autonomous car can identify the types of cars on the road, then it would make sense to collide with a safer vehicle rather than a car with an unknown safety rating. This could be legally and ethically better than just protecting the car's owner.
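
To make that rule concrete, below is a minimal, purely illustrative Python sketch of such a target-selection policy. Nothing here comes from any real vehicle's software: the Vehicle class, the expected_harm estimate, the choose_collision_target function, and the numbers are hypothetical names and values invented for this example.

    # Hypothetical sketch: when a crash is unavoidable, pick the collision
    # target whose occupants are estimated to be harmed least, preferring
    # heavier vehicles and vehicles with known safety ratings.
    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class Vehicle:
        mass_kg: float
        safety_rating: Optional[int]  # e.g. 1-5 stars, or None if unknown

    def expected_harm(v: Vehicle) -> float:
        # Assumption: lighter vehicles and vehicles with unknown ratings
        # expose their occupants to more harm in a collision.
        rating = v.safety_rating if v.safety_rating is not None else 1
        return (1.0 / v.mass_kg) * (6 - rating)

    def choose_collision_target(unavoidable_targets: list[Vehicle]) -> Vehicle:
        # Choose the target with the lowest estimated harm to its occupants.
        return min(unavoidable_targets, key=expected_harm)

    # Example: a heavy, well-rated SUV is preferred over a light, unrated hatchback.
    suv = Vehicle(mass_kg=2500, safety_rating=5)
    hatchback = Vehicle(mass_kg=1100, safety_rating=None)
    assert choose_collision_target([suv, hatchback]) is suv

Even this toy version makes the point: the weights inside expected_harm encode ethical choices about whose harm counts for how much, which is exactly the kind of setting this essay argues should not be left to an individual programmer.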

Because the driver is the one who introduced the risk to society by operating an autonomous vehicle on public roads, the driver may be legally obligated, or at least morally obligated, to absorb the brunt of any harm, at least when squared off against pedestrians, bicycles, and perhaps lighter vehicles (Ref 2). In a real accident, a driver usually does not have the time or the information to make the decision that is least harmful. A person who is startled by a small animal may react poorly. He might drive into oncoming traffic and kill a family, or oversteer into a ditch to his own death.

Neither of these results is likely to lead to criminal charges, since there was no negligence or bad intent in making a forced split-second decision. But the programmer and manufacturer make potentially life-and-death decisions under no truly urgent time constraint, and therefore incur the responsibility of making better decisions than human drivers reacting reflexively in surprise situations.

According to consequentialism, we should strive to minimise harm and maximise happiness. Even if consequentialism is the best ethical theory and the car's moral calculations are correct, the problem may not be with the ethics but with a lack of discussion about ethics. At times an autonomous car may be faced with a no-win scenario, putting the programmer in a difficult position. To mitigate this risk, industry may do well to set expectations not only with users but also with broader society, educating people that they could become victims even if they are not operating or riding in a robot car, and that perhaps this is justified by a greater public or overall good.
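
As a rough, hypothetical illustration of that consequentialist idea, the sketch below scores each available manoeuvre by an estimated total-harm value and picks the smallest; the manoeuvre names and numbers are invented for the example and are not real crash data.

    # Hypothetical sketch of a consequentialist choice rule: among the
    # manoeuvres still available, pick the one with the lowest estimated
    # total harm to everyone involved (occupants and other road users).
    def least_harmful(options: dict[str, float]) -> str:
        # options maps a manoeuvre name to an estimated total-harm score.
        return min(options, key=options.get)

    # Invented numbers, purely for illustration.
    options = {
        "brake_in_lane": 3.0,        # moderate harm to own occupants
        "swerve_into_traffic": 9.0,  # severe harm to an oncoming family
        "swerve_into_ditch": 5.0,    # serious harm to own occupant
    }
    print(least_harmful(options))  # -> "brake_in_lane"

The hard part is not the minimisation itself but agreeing on how harm is scored and who decides, which is why the lack of discussion about ethics is the real problem.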

But programming is only one of many areas to reflect upon as society begins to adopt autonomous driving technologies. Assigning legal and moral responsibility for crashes is already a popular topic. The larger challenge, though, isn't just about thinking through ethical dilemmas. It is also about setting accurate expectations with users and the general public, who might find themselves surprised in bad ways by autonomous cars; and expectations matter for market acceptance and adoption. Whatever answer to an ethical dilemma the industry leans towards will not be satisfying to everyone. Ethics and expectations are challenges common to all automotive manufacturers and tier-one suppliers who want to play in this emerging field, not just particular companies. Autonomous cars have clear benefits, as well as effects that are difficult to predict.

Change is inescapable and not necessarily a bad thing in itself, but major disruptions and new harms should be anticipated and avoided where possible. I don't think programmers should be responsible for the ethics settings in autonomous cars, as they are simply writing code and algorithms for the manufacturer based on data from the car. The owner should be responsible.

References
Ref 1: https://link-springer-com.ezproxy.waikato.ac.nz/article/10.1007/s11948-016-9806-x#CR8
Ref 2: https://link-springer-com.ezproxy.waikato.ac.nz/article/10.1007/s11948-016-9806-x#CR84
