Practical lessons for regulating autonomous vehicles
As autonomous vehicle (AV) technology evolves, regulators must address growing concerns about the realities of having these vehicles on roads around the world. Four new policy papers from the Berkman Klein Center for Internet and Society at Harvard University outline the main issues and provide policy suggestions. ITU News caught up with Aida Joaquin Acosta, the papers’ author and Fellow at the center, about the key takeaways.
What do regulators need to understand about autonomous vehicles (AVs)?
Regulators need to understand the complexity of AV technology, including its capabilities and limitations, to avoid under- or overregulating AVs. Underregulating could endanger civil rights, while overregulating could hamper innovation and delay the spread of AV benefits across society.
AVs are complex: they are more than a vehicle with a camera, and their impacts go beyond traditional safety and security concerns.
‘Governments should consider the potential benefits and challenges of AVs holistically and from different perspectives.’ – Aida Joaquin Acosta, Fellow, Berkman Klein Center at Harvard University
AV technology comprises numerous systems, including sensors (e.g., cameras, radars, ultrasound sensors and LiDARs) that generate massive amounts of data, and an Artificial Intelligence (AI) system, which commonly uses deep learning and neural network techniques to analyze the data and make driving decisions.
The complexity of AVs is further increased by their interactions with traditional vehicles and pedestrians, as well as their communications with other AVs, infrastructure and other devices.
AVs may challenge cybersecurity, privacy, ethical norms, environmental and land-use schemes, mobility and accessibility paradigms, use of resources, and sectoral employment.
What are the main challenges to regulating AVs and how can they be addressed?
In the AV papers, I discuss three main challenges to regulating AVs:
- Defining guiding values, concepts and processes, which form a foundation for a coherent regulatory framework, such as the nature of the driver, the meaning of being in control of the vehicle, the impact of the technology on society, and the processes and actors that should be involved in deciding ethical questions about AVs;
- Deciding the most effective policy regime for issues such as liability, insurance or a safety threshold to place AVs on public roads; and
- Dealing with different technical parameters that can hamper interoperability, such as the minimum distance allowed between vehicles (e.g., 5 meters in France and 2 seconds in Germany), parameters that would restrict the flow of cross-border innovation.
To help with these issues, the paper suggests three practical tools relevant to each challenge:
- Structured dialogues with stakeholders to help define the guiding values and concepts, facilitating consensus on how to govern both the technology and its impacts;
- Law Labs to help decide effective policy regimes. Law Labs operate like regulatory sandboxes, but instead of testing innovations in a sandboxed environment, they test the regulations governing those innovations, enabling measurement of how the regulations perform; and
- Legal Interfaces, a new regulatory model we conceived for dealing with different technical parameters across jurisdictions while requiring less legal harmonization. The Legal Interfaces model separates a law into its parameters (e.g., speed limits, lane of driving, etc.) and procedures (e.g., overtaking a vehicle, parking, etc.), storing the parameters in an international database and offering an interface for AVs to download the parameters of the jurisdiction they seek to operate in. The model therefore does not require general customization of the AV code for each jurisdiction or harmonization of parameters across jurisdictions.
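The parameter/procedure split behind Legal Interfaces can be sketched in code. The following Python is a minimal illustrative sketch, not part of the papers: the names (`TrafficParameters`, `PARAMETER_DB`, `load_parameters`) and the sample values are hypothetical, loosely based on the France/Germany following-distance example above.

```python
from dataclasses import dataclass

# Hypothetical sketch of the Legal Interfaces model: jurisdiction-specific
# parameters live in a shared database, while the driving procedures in the
# AV software stay generic and unchanged across borders.

@dataclass
class TrafficParameters:
    speed_limit_kmh: float  # maximum highway speed
    min_gap: float          # minimum following distance
    gap_unit: str           # "meters" or "seconds" (jurisdictions differ)
    driving_side: str       # "right" or "left"

# Stand-in for the international parameter database the model proposes.
PARAMETER_DB = {
    "FR": TrafficParameters(130.0, 5.0, "meters", "right"),
    "DE": TrafficParameters(130.0, 2.0, "seconds", "right"),
}

def load_parameters(jurisdiction: str) -> TrafficParameters:
    """The 'legal interface': the AV downloads the parameters of the
    jurisdiction it is about to operate in."""
    return PARAMETER_DB[jurisdiction]

def required_gap_meters(params: TrafficParameters, speed_kmh: float) -> float:
    """Generic procedure: compute the following distance in meters,
    whatever unit the jurisdiction uses to express the parameter."""
    if params.gap_unit == "seconds":
        return params.min_gap * speed_kmh / 3.6  # time gap -> distance
    return params.min_gap

# The same procedure code runs in both jurisdictions; only the downloaded
# parameters differ.
fr_gap = required_gap_meters(load_parameters("FR"), 100.0)  # 5.0 m
de_gap = required_gap_meters(load_parameters("DE"), 100.0)  # ~55.6 m
```

The point of the sketch is that crossing a border only swaps the parameter set; the overtaking, parking and gap-keeping logic is written once, which is why the model needs neither per-jurisdiction code customization nor harmonized parameters.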
What does a “strengths, weaknesses, opportunities, and threats” (SWOT) analysis reveal about how governments can create effective policies for AVs?
Governments should consider the potential benefits and challenges of AVs holistically and from different perspectives, as the potential net impacts of AVs are not clear, and balanced public policies can help optimize net outcomes.
The SWOT analysis shows that potential benefits can easily transform into correlated challenges, and vice versa. For instance, AVs could reduce the environmental impact of road traffic by decreasing CO2 emissions via more efficient driving.
However, AVs could contribute to more CO2 emissions by adding more vehicles to the roads, for example, by allowing the use of vehicles by wider sectors of the population, or by increasing highway capacity via shortening the distance between vehicles. A comprehensive analysis of benefits and challenges will help to develop more effective policies.
Furthermore, the SWOT analysis reveals key assets that governments can use to increase the performance of public policies, as well as weaknesses that they should work to mitigate.
‘It is vital to maintain a fluent dialogue with industry and stakeholders during the whole regulatory cycle of emerging technologies such as AVs and, more generally, AI.’
For example, governments can promote technology that prioritizes social values or educate society on the use and risks of the technology; and governments can work to reduce their lack of specialized technical knowledge, improve interdepartmental coordination, or revise existing laws to reduce obstacles to innovations.
What is an example of a best practice for AV regulation?
To be successful, each best practice may need to be adapted to the specific context of its region; however, there are some mechanisms that can work well in many cases.
For example, in my opinion, it is vital to maintain a fluent dialogue with industry and stakeholders during the whole regulatory cycle of emerging technologies such as AVs and, more generally, AI.
It is hard for regulators to predict the impacts of these technologies before they reach the market. Working closely with the agents that develop and interact directly with the technology, from examining policy options to designing a clear regulatory framework to implementing and reviewing regulations, can help reduce some of this uncertainty and produce more effective regulations.
There are several examples of this best practice.
For instance, the European Commission (EC) created GEAR 2030, a High-Level Group which included governments, industry and stakeholders in order to make recommendations on European AV policy.
The German government created a multidisciplinary Ethics Commission on Automated Driving, bringing together academia and different stakeholders to develop ethical guidelines for AVs.
The government of Singapore has developed a clear regulatory framework and close collaborations with AV companies, tailoring AV testing to each project.
As the technology continues to advance, how can regulators best prepare?
In order to prepare, regulators can:
- Establish fluent communication channels with industry, academia and other stakeholders, through multi-stakeholder collaborations, to design more effective regulations. Public-private collaborations could help policy makers better understand the technology and its diverse impacts, facilitating informed decisions and regulations that are better adapted to the technology.
- Create a task force or committee to work across departments to coordinate policy actions and regulations of emerging technologies. Emerging technologies such as AI, and its applications, including IoT and AVs, are horizontal technologies, impacting many sectors. Improving coordination between governmental departments could help to develop comprehensive and coherent regulations of technology, avoiding excessive regulatory burdens or a lack of regulation where needed.
- Share best practices and cooperate internationally with other governments to accelerate the design of better AV policies. Governments should learn from one another what worked well and what did not, to build from best practices.
- Fund interdisciplinary research that promotes the development and use of technologies that aim to benefit society, as well as research on the impacts of technologies on society. In this way, regulators can both promote the development of technology for the social good and develop proactive policies to mitigate potential negative impacts of the technology on society (job losses, ethical and privacy concerns, bias and discrimination, etc.).
Views expressed in this article do not necessarily reflect those of ITU.