Technology assessment: Can the GAO fulfill the OTA’s mission?


To anyone who has followed recent technology policy debates in Congress, such as the hearings with Mark Zuckerberg, it is obvious that the technology expertise gap between Silicon Valley and Washington is widening. At the same time, science and technology issues are only becoming more complex and pressing.

In any given week, we expect our lawmakers to competently answer policy questions across a number of technological domains – often in reaction to the news cycle. These are complicated issues that Members of Congress and their staff are ill-prepared to tackle alone given their limited resources, high staff turnover, and the fast-paced congressional calendar.

To address this problem, it is critical to strengthen Congress’ internal capacity to anticipate and understand emerging technology issues. One way to do this is to revive Congress’ Office of Technology Assessment (OTA).

Before being defunded in 1995, the OTA was an expert legislative support agency staffed with technical experts from various disciplines. When it was shut down, its mission was not taken up by the Congressional Research Service (CRS) as some had anticipated. Instead, various groups sought ways to fill the technology assessment gap. This included doing more with outside groups, like the National Research Council, the Potomac Institute, and the now-defunct Institute of Technology Assessment. Various unsuccessful attempts were also made to resurrect OTA itself. However, there was one successful effort by Congress that resulted in the creation of an experimental technology assessment program in the Government Accountability Office (GAO).

While there has been some recent interest in reviving the OTA and in strengthening technology assessment at the GAO, the institutional culture, report methodology, oversight structure, and functions of the two entities are quite different. With that in mind, this post examines those differences and the advantages and disadvantages of technology assessment within the GAO, in light of the OTA’s previous functions.

Office of Technology Assessment

The OTA existed from 1972 to 1995 as an independent legislative branch support agency. Its mission was to provide lawmakers with access to deep technical expertise necessary to confront an expanding field of complex science and technology challenges. When the OTA closed its doors, it had a staff of around 200 and a budget of $22 million (about $35 million in today’s dollars). During its existence, OTA published around 750 assessments, background papers, technical memoranda, and other reports. (See a full archive of them here).

Government Accountability Office

The GAO’s technology assessment program was first set up as an experimental function in the Fiscal Year 2002 appropriations bill. The function was made permanent in 2008 and continues to this day, producing a few reports each year. Currently, the technology assessment program sits in the GAO’s Center for Science, Technology and Engineering, part of the Applied Research and Methods mission team. It is a very small program, sharing resources and staff with other GAO functions. Like the OTA, the GAO’s technology assessment program is largely oriented to serving requests made by congressional committees. Recent reports have covered topics such as artificial intelligence and the Internet of Things. However, its work product does not seem to have had the impact of the OTA or the National Academies, and it does not have a significant roster of in-house technical experts available to consult with Congress as “shared staff.”

Technology assessment at the GAO

Given the past difficulty and political challenges around reviving the OTA directly, proponents of strengthening science and technology expertise in Congress may consider the GAO as a vehicle for reviving the OTA’s functions. Below are some additional considerations regarding this avenue (for differences between OTA and other entities such as CRS and NAS, see here).

The potential advantages of doing technology assessment in the GAO include:

●      Political viability: The GAO is already doing technology assessment. Politically, it would be much easier to make this program bigger and better than to revive the OTA. Indeed, there have been numerous failed attempts to revive the OTA over the past two decades, and there is no guarantee future efforts would fare any better.

●      More teeth: The GAO has broader statutory powers than the OTA, which may be helpful in investigating government use of technology (although the OTA was also authorized to contract with the GAO for its services).

●      Areas of mission overlap: Some of the OTA’s major successes involved finding cost savings in ill-conceived government programs related to science and technology, or helping government think through its adoption of new technologies. This aspect of the OTA’s mission is well-aligned with what the GAO does.

●      Lean oversight structure: The absence of the OTA’s Technology Assessment Board, which approved new projects and the public release of final reports, reduces the reporting time and could limit opportunities for any particular report to be politicized or censored. The GAO’s congressional protocols allow for any committee or subcommittee to request technology assessment work directly and work with GAO staff to clarify needs, interests, and project objectives.

●      Structural synergies: Given the variable requirements of different report topics, sharing staff between the technology assessment program and other GAO functions produces useful synergies.

●      Perceived neutrality: While this has not always been the case, the GAO today – with its mission as a government watchdog – generally has good credibility with both Republicans and Democrats. The OTA, on the other hand, struggled against the perception of having a left-wing bias.

There are also some disadvantages to using GAO as a vehicle, as well as obvious areas for improvement that Congress and the Comptroller General could fix:

●      Lack of resources: The GAO’s technology assessment program is still tiny relative to the size of the OTA. Its pilot program in 2002 allocated $1 million in funding, which expanded to $2.5 million in 2008. It produces a handful of reports each year, compared to around 50 reports produced each year by the OTA in its final years. If it is to cover the range of complex issues before Congress and deepen its technical expertise, the program would need to grow by an order of magnitude from its current resources (the GAO’s 2018 budget request also points to the need to expand this program but does not specify by how much).

●      Lack of structural independence: The GAO’s technology assessment program does not appear on the GAO’s organizational chart, suggesting that the program lacks structural independence and the ability to request and compete for resources within the organization. This could be addressed by giving the program clearer independence, perhaps by elevating it to a distinct mission team focused on science and technology issues with a dedicated, public-facing director.

●      No menu of options: In its assessments, the OTA emphasized evaluating the pros and cons of different policy options. It did not, however, issue recommendations, which involve value judgments and political considerations best left to Congress. This approach helped lawmakers weigh tradeoffs directly and make informed choices. That feature is largely absent from GAO’s technology assessments, which emphasize surveying the field to identify areas of policy interest.

●      Outsider-driven: The content of GAO’s technology assessment reports is largely driven by panels of outside experts from industry and academia. This is similar to the NAS approach, though without the emphasis on reaching a consensus view. This methodology may limit the depth of the reports, as well as opportunities for congressional staff to access and consult with in-house experts. OTA reports, by contrast, were driven more by in-house experts augmented with project-based contractors and contract agencies, making them more useful to Congress as “shared staff.” These concerns could be addressed through methodological tweaks, but doing so would ultimately require more resources and dedicated staff.

●      Different oversight structure: The OTA had two bodies that helped it determine strategic priorities and consider methodological concerns: the Technology Assessment Board (TAB) and the Technology Assessment Advisory Council (TAAC). The TAB was made up of 12 Members of Congress (six from each party), with the OTA Director as a nonvoting member. The TAAC included prominent leaders from industry and academia, as well as the Comptroller General and the CRS Director. Technology assessments at the GAO, on the other hand, follow the GAO’s general congressional protocols. It is unclear whether the OTA’s extra bureaucratic layer was a net help or a hindrance to the technology assessment mission, so this organizational structure deserves further consideration. One avenue might be to create a new advisory committee for the GAO’s technology assessment program, similar to the TAAC.

●      Different mission and culture: Institutional culture may impede the mission of technology assessment, which is inherently different from the GAO’s auditing, investigative, and legal functions.

●      Giving up on OTA: Advocates may see bolstering technology assessment at the GAO as giving up on the revival of the OTA. However, it would be a mistake to fetishize the OTA for its own sake. If the GAO can be made to fulfill even half of the OTA’s original mission, Congress would be much better equipped than it is today.

Even though the GAO has some structural disadvantages relative to the OTA that may never be fully resolved, it has a key advantage: it is already performing technology assessment. While some serious challenges and questions would need to be resolved before strengthening this part of the GAO, advocates for deepening Congress’ technical expertise should consider it as a potential vehicle going forward. For it to succeed, however, Congress would need to make substantial changes to both the GAO’s resourcing and structure.
