Advisory Council addendum to final report

We, the members of the Advisory Council, were asked to advise the director of the National Network for Critical Technology Assessment (NNCTA), provide feedback on its activities in its pilot year, and “write a summary of [our] assessment and recommendations as a separate report.”

We applaud the National Science Foundation for its leadership in supporting this important work and associated novel approaches.

Assessment

The objective of the NNCTA’s pilot year was to produce a vision for critical technology assessment that outlines (i) current capabilities (with demonstrations thereof) to help inform Congress and agency leaders on how to prioritize limited national resources—and in particular investments in research and innovation—to have the greatest impact on US societal, national, and geostrategic challenges; (ii) gaps in those capabilities; and (iii) the national investment and organizational form necessary to achieve that vision.

We find that the Network performers achieved the NNCTA objective, clearly demonstrating the value of analytics through the different use cases and providing value beyond the excellent individual work of each subgroup.

We are particularly impressed by the following important outcomes from the pilot year activities:

  • High-quality analytics together with tool development and use underpinning the research and conclusions of the five use cases
  • Novel approach, combining a collaboratory structure with focused and precise research
  • High level of discipline to achieve the Network’s mission within the project’s tight timeline
  • Analysis of equity considerations and impacts in the AI and semiconductor demonstration use cases
  • Focus on identifying and evaluating multiple metrics, which can serve as advance signals about changes in progress or as “feedback control” levers for modifying the direction of progress
  • Broad geographic and disciplinary ambit
  • Multistakeholder engagement throughout the year via multiple channels
  • Identification of gaps in global awareness, data availability and/or access, and broad national interdisciplinary capability and collaboration to analyze critical technology needs
  • High level of interdependence maintained in work and recommendations despite diverse input
  • Exceptional use of graphics to elucidate complex topics, enhancing the communication and likely impact of the work

Recommendation

Establish a critical technology assessment capability home

The nation’s economic prosperity, national security, and social well-being increasingly depend on technologies that have outsized impact (critical technologies). Given limited resources, it is essential to have the tools, skills, analytical capabilities, and diverse human capital to make decisions about prioritization of and investment in such technologies.

Technology assessment is not just about predicting how or when a technology will evolve. It is also about watching for technology developments, identifying control levers to accelerate (or decelerate) development, and implementing feedback mechanisms to allow effective management of technology investments.

To this end, we strongly support the creation and committed funding of an entity in NSF TIP that continues to mature the discipline of critical technology assessment (CTA) for the nation by investing in research and demonstrations. This entity should

  • create, curate, and make available tools, skills, and capabilities for federal, state, and local government agencies, industry, researchers, decision makers, and others to perform CTA, and ensure access to the data needed to execute these functions;
  • continuously (i) evaluate the quantitative and qualitative validity of metrics used in CTA and (ii) improve these metrics; and
  • communicate the nation’s CTA needs and capabilities, sponsor a broad community of practice of CTA, and provide expertise and tools to federal, state, and local government agencies, industry, researchers, and others as they execute CTA activities.

In addition to the essential role of creating the above competency, the CHIPS and Science Act requires the NSF to lead an annual critical technology assessment, and the above entity could be an appropriate home for this effort.

We further recommend that the home for a national CTA capability have the following essential features built into its work:

  • Engagement: Increase participation from other sectors—in particular, the private sector—not just as advisors but perhaps as stronger collaborators or even as full participants and independent researchers.
  • Commonality: Provide focus and vehicles to identify commonalities between the tools and approaches developed for diverse applications—it is important to show the value of the network approach to complex critical technology assessments.
  • Data: Ensure access to sufficient data both to produce accurate analyses and, for global competitiveness assessments, to enable international comparability of proposed analytical methods and data resources. Because publicly available datasets might not be adequate and valuable data may be held by private organizations, a key role for this entity will be to establish (i) public-private data partnerships to support CTA and (ii) new models for data sharing.
  • Organization: Foster creativity and integration—we find the general organization structure described in the report to be a very good starting point.
  • Equity: Incorporate equity throughout the demonstration projects, with (i) assessment of return on investments in this area, (ii) reference to relevant programs, and (iii) possible new methods (e.g., informed by NSF’s Eddie Bernice Johnson INCLUDES program or DARPA).
  • Independence and objectivity: Demonstrate the objectivity of the research process and recommendations by maintaining organizational and research independence from undue outside influence. This safeguards the integrity of the work and results and gives stakeholders confidence that the results can be trusted. Organizational and research independence must be carefully thought through and cannot simply be assumed.
  • Continuous evaluation: Foster a culture and practice of iteration, feedback, and validation, constantly evaluating how well the tools are working and iterating between policy, research, and innovation.
  • Communication: Develop a robust communication plan to ensure that the work is effectively conveyed to diverse stakeholders and to amplify impact for all Americans.

Finally, because we anticipate that critical technology assessment will be performed by an increasing variety of government and private sector stakeholders, it is important to recognize that the meaning of the term critical technology may vary by the intended use of the analysis. It could, for example, refer to a fast-evolving technology versus an older, more mature one; or one that, while still early in its life cycle, is projected to have important economic effects downstream; or a technology that has an outsized influence on economic prosperity, social well-being, and/or national security. In any case, we stress that it is essential for any critical technology assessment to start with both a precise definition of what is meant by the term and clear expectations of the desired form and content of the assessment.