Complex IP Challenges in Autonomous Vehicles

“Autonomous Vehicle technology is in a transition phase – advanced driver assistance technologies are being implemented on a wide scale, but we are not yet at full self-driving capability. As a result, consumers may be over-reliant on the technology.”

On August 16, 2021, the National Highway Traffic Safety Administration (NHTSA) announced that it had opened a probe into Tesla’s driver-assistance technologies after identifying 11 crashes since 2018 in which a Tesla vehicle had struck an emergency-response vehicle. All of the Teslas involved had been using the automaker’s Autopilot feature, which enables the vehicles to steer, accelerate, and brake automatically, at the time of the crashes. The crashes have drawn the scrutiny of lawmakers and regulators to Autopilot and similar technologies. With increased attention being paid to AV safety, AV companies are shifting their research and development and IP strategies toward technologies designed to address consumers’ real-world safety concerns.

The Basics of Autonomous Vehicle Technology

AVs rely on artificial intelligence (AI) to recognize and interact with their environments. They map their surroundings primarily using radar, sonar, and lidar technologies that allow the vehicles to “see” their environment. Radar and sonar emit radio and sound waves, respectively, in pulses that reflect off nearby objects and provide the sensors with data on each object’s location, distance, speed, and direction of movement. Lidar works similarly, emitting thousands of laser beams in all directions that reflect off surrounding objects, which the sensor registers as “point clouds.” Data points from the sensors are then fed into a centralized AI processor that synthesizes them to produce a full “picture” of the AV’s environment. Radar, sonar, and lidar are ideal for detecting moving objects such as other vehicles and pedestrians, while cameras and GPS provide information about fixed infrastructure such as roadways, buildings, and trees.
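
How that synthesis works in production systems is proprietary, but a minimal sketch in Python can illustrate the basic idea of fusing detections from different sensors into a single set of objects. The detection fields, clustering radius, and confidence weighting below are illustrative assumptions, not any manufacturer’s actual design.

```python
from dataclasses import dataclass
from typing import List

# Hypothetical detection record: each sensor reports objects it "sees" with a
# position estimate and a confidence score. Real systems use far richer
# representations (point clouds, velocity vectors, covariances).
@dataclass
class Detection:
    sensor: str        # "radar", "sonar", "lidar", "camera", ...
    x: float           # position relative to the vehicle, in meters
    y: float
    confidence: float  # 0.0 - 1.0

def fuse(detections: List[Detection], radius: float = 1.5) -> List[dict]:
    """Greedily cluster detections that fall within `radius` meters of each
    other and average their positions, weighted by confidence. This stands in
    for the centralized processor that synthesizes sensor data into one
    'picture' of the environment."""
    clusters: List[List[Detection]] = []
    for det in detections:
        for cluster in clusters:
            cx = sum(d.x for d in cluster) / len(cluster)
            cy = sum(d.y for d in cluster) / len(cluster)
            if (det.x - cx) ** 2 + (det.y - cy) ** 2 <= radius ** 2:
                cluster.append(det)
                break
        else:
            clusters.append([det])

    fused = []
    for cluster in clusters:
        total = sum(d.confidence for d in cluster)
        fused.append({
            "x": sum(d.x * d.confidence for d in cluster) / total,
            "y": sum(d.y * d.confidence for d in cluster) / total,
            "sensors": sorted({d.sensor for d in cluster}),
        })
    return fused

# Example: a pedestrian picked up by both lidar and radar fuses into one object.
print(fuse([Detection("lidar", 10.2, 3.1, 0.9),
            Detection("radar", 10.5, 3.0, 0.7),
            Detection("camera", -4.0, 0.0, 0.6)]))
```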

The AI systems that power AVs must be trained in advance, using machine learning, to correctly identify the objects and stimuli an AV may encounter. This training typically produces satisfactory performance under normal circumstances (e.g., driving on standard roadways in dry, sunny conditions), but problems can arise when an AV encounters an altered environment, unfamiliar stimuli, or conditions that challenge its detection technologies. For instance, AVs may have difficulty perceiving complex roadway scenes such as those involving road construction, stalled cars, and crashes, and may occasionally struggle to perceive infrastructure in dark conditions. To minimize the chance of an accident, the AI systems must be trained across a wide range of environmental conditions, including snow, rain, daylight, night, and twilight.
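
In practice, that means training datasets must be audited for coverage of those conditions. The following simplified sketch, with invented condition labels and thresholds, shows one way such a coverage check might look; it is not a description of any actual AV training pipeline.

```python
from collections import Counter
from itertools import product

# Hypothetical labels attached to training samples; real datasets tag far more
# dimensions (road type, traffic density, sensor degradation, etc.).
WEATHER = ["dry", "rain", "snow", "fog"]
LIGHTING = ["day", "twilight", "night"]

def coverage_report(samples, min_per_condition=1000):
    """Count training samples per (weather, lighting) combination and flag
    combinations that are under-represented, since a model is likely to
    perform worst in exactly those conditions."""
    counts = Counter((s["weather"], s["lighting"]) for s in samples)
    gaps = []
    for combo in product(WEATHER, LIGHTING):
        if counts[combo] < min_per_condition:
            gaps.append((combo, counts[combo]))
    return counts, gaps

# Example with a tiny synthetic dataset skewed toward dry daytime driving.
samples = ([{"weather": "dry", "lighting": "day"}] * 5000
           + [{"weather": "rain", "lighting": "night"}] * 200)
counts, gaps = coverage_report(samples)
for combo, n in gaps:
    print(f"under-represented: {combo} ({n} samples)")
```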

More than merely recognizing their environments, AVs also need to be able to interact and communicate with them. They achieve this through various connections to other elements of the transportation system, including infrastructure, vehicles, and pedestrians. There are five key types of AV connectivity:

  • V2I: Vehicle to infrastructure
  • V2V: Vehicle to vehicle
  • V2C: Vehicle to cloud
  • V2P: Vehicle to pedestrian
  • V2X: Vehicle to everything
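
These categories map naturally onto message types in software. The sketch below uses an invented enum and dispatcher purely for illustration; real V2X stacks rely on standardized message sets that this example does not attempt to reproduce.

```python
from dataclasses import dataclass
from enum import Enum, auto

class LinkType(Enum):
    """The five connectivity categories listed above."""
    V2I = auto()  # vehicle to infrastructure (traffic signals, road sensors)
    V2V = auto()  # vehicle to vehicle
    V2C = auto()  # vehicle to cloud
    V2P = auto()  # vehicle to pedestrian (e.g., smartphone beacons)
    V2X = auto()  # vehicle to everything (umbrella category)

@dataclass
class Message:
    link: LinkType
    payload: dict

def handle(msg: Message) -> str:
    """Toy dispatcher showing how a vehicle might route incoming messages."""
    if msg.link is LinkType.V2I:
        return f"signal phase ahead: {msg.payload.get('signal', 'unknown')}"
    if msg.link is LinkType.V2V:
        return f"nearby vehicle braking hard: {msg.payload.get('decel', 0)} m/s^2"
    if msg.link is LinkType.V2P:
        return "pedestrian beacon detected near crosswalk"
    return "forwarded to cloud/telematics backend"

print(handle(Message(LinkType.V2V, {"decel": 6.5})))
```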

A number of companies have begun implementing these technologies in AVs. Currently, two communications standards support V2X, a peer-to-peer technology that can warn vehicles about obstacles the AV’s sensors may not identify quickly enough. One, C-V2X, is based on a cellular network; the other, 802.11p, is based on dedicated short-range communications (DSRC). DSRC allows communication without a wireless network but is limited to short ranges, while C-V2X requires a cellular network but works over longer distances. Current wireless networks allow some V2X connectivity, and the rollout of 5G networks promises improved coverage, reliability, latency, data bandwidth, and geolocation services.
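
The trade-off between the two standards can be captured in a simple selection rule. The sketch below is hypothetical; the range threshold and interface are invented for illustration rather than drawn from either specification.

```python
def choose_radio(distance_m: float, cellular_available: bool,
                 dsrc_range_m: float = 300.0) -> str:
    """Pick a V2X transport using the trade-off described above:
    DSRC (802.11p) needs no network but only works at short range, while
    C-V2X needs cellular coverage but reaches farther. The range threshold
    is an illustrative assumption, not a value from either spec."""
    if distance_m <= dsrc_range_m:
        return "DSRC (802.11p)"
    if cellular_available:
        return "C-V2X"
    return "no V2X link available; rely on on-board sensors"

print(choose_radio(150, cellular_available=False))   # DSRC (802.11p)
print(choose_radio(1200, cellular_available=True))   # C-V2X
```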

When AVs need to make split-second decisions on the road (e.g., determining right of way, avoiding collisions, etc.), they rely on expert systems that are programmed to mimic the decision-making skills of a human driver. However, the expert systems are limited by their inputs—if the environmental recognition technologies do not accurately detect the environment, AV decision-making will be affected.
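
That “garbage in, garbage out” dependency is easy to see in a toy rule-based sketch of such an expert system. The perception fields, rules, and thresholds below are invented for illustration; production planners are far more sophisticated.

```python
from dataclasses import dataclass

@dataclass
class Perception:
    """What the environment-recognition layer reports to the planner.
    If these fields are wrong, every downstream decision is wrong too."""
    obstacle_ahead: bool
    obstacle_distance_m: float
    ego_speed_mps: float
    cross_traffic_has_right_of_way: bool

def decide(p: Perception) -> str:
    """Hand-coded rules approximating a human driver's choices."""
    # Rough stopping-distance heuristic: two seconds of travel at current speed.
    if p.obstacle_ahead and p.obstacle_distance_m < p.ego_speed_mps * 2.0:
        return "emergency_brake"
    if p.cross_traffic_has_right_of_way:
        return "yield"
    if p.obstacle_ahead:
        return "slow_and_follow"
    return "proceed"

# Garbage in, garbage out: if perception misses a stalled vehicle,
# the planner happily returns "proceed".
print(decide(Perception(False, 999.0, 25.0, False)))  # proceed
print(decide(Perception(True, 30.0, 25.0, False)))    # emergency_brake
```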

The NHTSA’s Tesla Investigation

Tesla’s Autopilot feature – the subject of the NHTSA’s investigation – is, according to the company’s website, “designed to assist you with the most burdensome parts of driving. Autopilot introduces new features and improves existing functionality to make your Tesla safer and more capable over time.” It includes the following driver assistance features:

  • Traffic-Aware Cruise Control: Matches the speed of the car to that of surrounding traffic
  • Autosteer: Assists in steering within clearly marked lanes and uses traffic-aware cruise control

The vehicles accomplish these tasks through the use of cameras, ultrasonic sensors, and an advanced neural network. All Tesla vehicles come standard with Autopilot, although buyers may opt to add the Full Self-Driving Capability package, which provides even more active driver assistance features such as automatic navigation, vehicle summoning, self-parking, and self-driving. Notably, Tesla includes the following disclaimer on its Autopilot support page: “Autopilot and Full Self-Driving Capability are intended for use with a fully attentive driver, who has their hands on the wheel and is prepared to take over at any moment [emphasis added]. While these features are designed to become more capable over time, the currently enabled features do not make the vehicle autonomous.”

In the summary of its preliminary investigation, the NHTSA states that it is investigating 11 crashes in which Tesla models of various configurations encountered first responder scenes and struck one or more vehicles involved with those scenes. Most of the accidents occurred after dark and the crash scenes included scene control measures such as vehicle lights, flares, an illuminated arrow board, and road cones. The NHTSA confirmed that all vehicles involved were engaged in either Autopilot or Traffic Aware Cruise Control during the approach to the crashes. The investigation “will assess the technologies and methods used to monitor, assist, and enforce the driver’s engagement with the dynamic driving task during Autopilot operation.” In other words, it seems that the NHTSA suspects that Tesla drivers are over-relying on driver assistance features, despite Tesla’s instructions to its drivers.

One of the 11 accidents being investigated – the only one resulting in a fatality – is illustrative. On December 29, 2019, a fire truck was responding to a single vehicle accident on Interstate 70 in Cloverdale, Indiana. Upon arriving on the scene, the fire department positioned its emergency vehicles in the passing lane of I-70 and activated their emergency lights. A passing Tesla failed to observe the emergency vehicle in the passing lane and collided with it. The driver and his passenger sustained serious injuries in the collision and had to be extracted from the vehicle. The passenger later died from her injuries. Although investigators confirmed that Autopilot was engaged at the time of the accident, it is not clear to what extent the driver was attentive and prepared to take over control of the vehicle.

Coming to Terms with AV Capability

There is always a consumer learning curve whenever a new technology enters the market. AV technology is in a transition phase – advanced driver assistance technologies are being implemented on a wide scale, but we are not yet at full self-driving capability. As a result, consumers may be over-reliant on the technology; they may not fully understand how it works and may assume that it is more capable than it really is. Authorities believe other drivers may be deliberately pushing the technology’s limits and getting hurt in the process.

This may be partially a marketing issue – when consumers see the phrases “Autopilot” and “Full Self-Driving Capability,” they take them at face value, Tesla’s disclaimers notwithstanding. However, this is also a driver responsibility and training issue. Many of the drivers involved in these accidents seem to have taken advantage of an emerging technology without regard for its inherent challenges. Drivers need to understand that AV technology is evolving and that they have the ultimate responsibility for their vehicles. Regulators may require more robust disclaimers from AV manufacturers and may even mandate additional licensing requirements for AV drivers.

Finally, traffic safety regulators may ultimately take a closer look at the environment detection and expert systems used by AVs. In other words, traffic safety regulators may seek to certify some minimum set of environmental detection training and associated decision-making for AVs before the advanced autonomous operation packages are allowed to be marketed to consumers. This may, in practice, be similar to the Federal Aviation Administration certifying autopilot instrumentation for aircraft.

The growing pains of the AV industry come with a silver lining: the more self-driving cars there are, the safer the roadways will get. More AVs means more communication between AVs about the environment. And because the AI systems that power AVs rely on machine learning, the more data that they process, the more “intelligent” and capable they will become. In the short term, automakers should work to implement systems that are capable of detecting stalled vehicles and accident scenes to avoid the types of collisions the NHTSA is investigating. In the long term, advancements in AV technology – and the more orderly roadways it will lead to – have the potential to significantly reduce the number of crashes and save thousands of lives every year.




Join the Discussion


  • Common Sense Squared
    September 17, 2021 07:47 pm

    Common Sense missed an important point. Emergency procedures should be adapted to accommodate AVs. After all, AVs are on the way in and, as a result, emergency vehicles are on the way out. This is not a joke, this is inevitable.

  • Amen
    September 16, 2021 09:58 pm

    Amen to Common Sense and in honor of Elon Musk, founder of Tesla Motors, who dared to pioneer AV technology and who is called on to explain why it is not perfect.

  • Common Sense
    September 16, 2021 07:36 pm

    The more that AVs are used, the more organized, structured, and safe will be the roadways and the less that dangerous circumstances will occur. For example, the more that AVs are used, the fewer the accidents and thus the fewer the accident-related secondary accidents as with the eleven Tesla crashes. Also, the more simplistic protections include inter-vehicular communications to warn upcoming AVs about crash scenes and either invoke special precautions such as slow-downs and special AI precautions or force human-driver takeovers. And human-related features such as bright flashing lights and lane-blocking vehicles that can confuse AV sensors will give way to inter-vehicular communications, on-board AV sensors, forced slow-downs, and other AV-non-intrusive measures.

    But this is only common sense. And what the future will bring will make the above seem cave-man primitive.

    Don’t hamper progress because it is not perfect at the start and don’t penalize the pioneers because they have not solved all of the problems at the start. AVs will make the world a better and safer place and the AV pioneers must be honored and rewarded, not vilified as the Wright Brothers were. See the Sept. 1, 2021, IPWatchdog article “Thomas Edison and the Consumer Welfare Benefits of Patent Enforcement.”