The first panel of presenters at the symposium examined how SBIR/STTR programs fit into the larger DOD research, development, and acquisition ecosystem. This broader context provides insights on several metrics that could be proposed to the Pentagon and Congress, said the panel’s moderator, Arun Seraphin, executive director of the Emerging Technologies Institute at the National Defense Industrial Association.
Jason Rathje, director of the Office of Strategic Capital within the Office of the Secretary of Defense, began by highlighting three papers that describe recent changes within the SBIR program, including how metrics can be used to measure the effectiveness and performance of SBIR programs more broadly and how the program relates specifically to the DOD's objectives.
The first of those papers, based on Rathje’s dissertation at Stanford University, looked at the impact of SBIR programs on technology startups.1 SBIR is focused on small business innovation, but there are multiple kinds of small businesses, Rathje observed. Some are recent, some have been around for decades. Some are small businesses that intend to grow quickly and are backed by sources of capital like venture capital, while others do not have growth as a goal. Rathje’s research asked whether working with the department created a positive, negative, or neutral impact on issues important to companies, such as growth. More broadly, it looked at working with mission-backed organizations like the DOD or the National Aeronautics and Space Administration, where the intent of research and development (R&D) funding was to generate technology innovations that supported the department’s or agency’s priorities. His research showed that working with the DOD through SBIR programs tended to lead to slower growth in companies. The message he took away from that finding is that “we had to find different ways to accommodate different types of small businesses.”
The second paper, written by New York University professor Sabrina Howell, London School of Economics economist John Van Reenen, University of Chicago economist Jun Wong, and Rathje, looked at the new and emerging model of SBIR funding called open topics funding.2 Typically, the DOD releases SBIR solicitations based on department requirements, such as a new lithium-ion battery for intercontinental ballistic missiles. Open topics allow companies that have little familiarity with the department to submit what they are working on without specific preexisting requirements. This approach, he said, “is more aligned with commercially oriented companies to offer ideas to the department and say, ‘Here’s something I’m working on. Is it of interest to the DOD?’” The open topics approach “transformed the way that at least the Air Force SBIR program worked with SBIR companies.” Open topics SBIR companies tended to have stronger positive growth in such areas as raising venture capital and received more follow-on DOD contracts than did conventional topic companies. “There was a stronger impact of SBIR financing on winning future defense contracts.” This has implications for the types of policies the DOD could direct toward SBIR, Rathje said. The SBIR statute is very broad, meaning that agencies can exercise the legislation in many different ways. For example, although SBIR traditionally has concentrated on R&D work, the statute permits testing and evaluation projects as well, extending beyond development into procurement.
___________________
1 Jason Rathje. 2019. Survive, but not thrive? The constraining influence of government funding on technology start-ups. Proceedings of the Sixteenth Annual Acquisition Research Symposium. https://dair.nps.edu/bitstream/123456789/1741/1/SYM-AM-19-052.pdf
2 Sabrina T. Howell, Jason Rathje, John Van Reenen, and Jun Wong. 2023. Opening up military innovation: Causal effects of reforms to U.S. defense research. National Bureau of Economic Research. Working Paper 28700. http://www.nber.org/papers/w28700
The last paper was a recent report by the Government Accountability Office that also compared open topics with conventional topics.3 One key finding in that paper was the impact on socially disadvantaged businesses. Open topics were more likely to be awarded to socially disadvantaged businesses than were conventional topics, though the paper does not comment on why that trend exists.
Metrics for these programs need to recognize the different ways of operationalizing policies, said Rathje, whether between open topics and conventional topics or alternative types of financing. The Office of Strategic Capital within the DOD has been thinking deeply about the use of metrics in transitioning component technologies. For example, how can programs like SBIR continue to increase investment in hardware-focused areas that are critical to supply chains and the industrial base? How can such programs affect defense capability areas such as aerospace, hypersonics, and geospatial analytics? Rathje's office is using new tools, such as loans and loan guarantees, to increase capital investment in such component industries as microelectronics, biotech, and semiconductors.
The DOD’s SBIR programs fund a wide variety of technologies that affect a wide variety of mission capabilities, and there are many different pathways by which technologies transition. “It’s important from a metrics perspective to adequately capture the intent of an SBIR topic or approach and also the outcome itself in its own unique way.”
In response to a question, Rathje pointed out that the defense industrial base has consolidated dramatically over the last 30 years. One consequence has been that the impact of patents from the defense industrial base has declined dramatically. As a result, “the criticality of small business innovation programs to the future of innovation in the DOD is only increasing,” to the point that even the Secretary of Defense has been talking about SBIR—which makes the outcomes of the discussion over metrics highly important to the future of the program inside the DOD.
Microelectronics underpin many critical technologies with military applications, from future-generation wireless technology to trusted artificial intelligence to biotechnology to autonomous vehicles. Within this context, roadmaps are critical, said Devanand Shenoy, principal director for microelectronics in the Office of the Under Secretary of Defense for Research and Engineering, both to technology insertion into defense systems and to commercial microelectronics.
Within the DOD, Shenoy explained, the Trusted and Assured Microelectronics program enables access to state-of-the-art technologies by supporting the design, manufacturing, assembly, packaging, and testing of microelectronics. In addition, a new initiative called the Microelectronics Commons, created by the CHIPS and Science Act, supports lab-to-fab prototyping and optimization for defense program demonstrators. In partnership with the Commerce Department and other federal agencies, the Microelectronics Commons is designed to expand the number of concepts and ideas that can transition from proof of concept to the market, in part through SBIR/STTR projects. In this way, he said, the Commons addresses the valley of death for technology commercialization and benefits not only the defense industrial base and defense programs but commercial industry as well.
___________________
3 U.S. Government Accountability Office. 2023. Small business research programs: Most agencies allow applicants to define needs and propose solutions. https://www.gao.gov/assets/d23106338.pdf
Shenoy focused specifically on the data and metrics that can be used to evaluate SBIR/STTR programs as part of the DOD's broader efforts in microelectronics.
Shenoy also pointed to the importance of coupling programs to leverage and capitalize on investments through SBIR/STTR projects. For instance, going through the Microelectronics Commons helps SBIR/STTR projects develop needed prototypes. “That’s got to be a critical objective, at least in microelectronics.”
Bruce Jette, president and chief executive officer of Innvistra LLC, contrasted the size of SBIR/STTR programs with overall defense programs. At one point in his previous responsibilities at the Department of Defense, Jette had about 150,000 people working for him, either directly or indirectly, and his budget was as much as $200 billion on contract, with $7.5 billion of that representing R&D funds annually. In contrast, the funds devoted to SBIR programs were about $250 million to $280 million a year—“0.1 percent of my budget.” The average Phase I contract was about $300,000, compared with a rifle procurement program that might amount to $7 billion. As a result, he said, an SBIR program doesn’t necessarily get the focus it should “because it takes a lot to do the process on the government side to get all of these things done.” Even for a Phase II program, which might be funded at the $2 million level, the amount of money being spent is relatively tiny compared with other programs, he said.
Jette’s answer to this mismatch was a program called xTechSearch, which brought together engineers, program managers, and budget officers to pick promising technologies and shepherd them through the system.4 “We were pretty successful at that,” he said. Part of the program involved setting up meetings around the country to talk with companies about their capabilities and about DOD needs. After such discussions, some companies expressed an interest while others did not. “We tried to understand why they said, ‘no thanks,’” Jette said.
That observation relates to the metric he suggested for the SBIR program. Why would companies turn down an opportunity to work with the DOD? "Is there something that you didn't understand? Or is it part of the programmatics?" Aside from the standard measurements, that is "probably one of the most important pieces" to evaluate.
In addition, Jette pointed out that metrics change as a program changes. Even if an initial set of metrics is appropriate, programs may need to abide by different sets of metrics as they evolve. Metrics such as cost, schedule, performance, and risk apply in different ways over time. This can limit the ability to apply SBIR funding to bridge the funding valley of death, and it may discourage companies from applying for funds to make that transition. The rate at which the government proceeds may not match the rate at which the small business needs to acquire funding.
In response to a question, Jette also raised the issue of who is going to gather new data and do new analyses. “Because SBIR dollars can’t be used to do the studies, I have to budget for it,” he said. “I’m the guy who gets all these good ideas and has to figure out how to weave that into a $7 billion program and actually deliver on time.”
___________________
4 Information about the program is available at https://www.xtech.army.mil
A major part of the work of the Institute for Defense Analyses’ System Evaluation Division, which Stephen Ouellette directs,5 is to evaluate the performance and effectiveness of U.S. military systems, including munitions, platforms and delivery vehicles, sensors, communications, and command and control systems for every military service and in every domain of warfare, except for the cyber domain, Ouellette reported. A core element of this evaluation is assessing the benefits, the challenges, and the risks associated with integration of new technology into the department’s programs of record.
DOD acquisitions involve big businesses with high barriers to entry, Ouellette observed. This favors the large DOD prime contractors, which understand the DOD acquisition system, have experience working with it, and have structures in place to manage burdensome government processes and requirements. As privileged entities in the defense industrial ecosystem, he suggested, the primes are interested in maintaining stability to protect their advantages and the market share of their existing product lines. The major contractors do invest in R&D to maintain their competitiveness, but they are incentivized to manage innovation in a way that is consistent with sustaining their profitability and controlling competitive risk.
In this environment, SBIR programs are a mechanism to stimulate innovation independent of established large contractors. Such innovations may be higher-risk, higher-reward approaches to improve performance or lower cost that are not deemed sufficiently beneficial to a larger business with an established technology base. SBIR innovation may also be disruptive and radically change what the department is seeking, putting existing product lines at competitive risk.
In this environment, SBIR projects must overcome two major barriers to succeed, said Ouellette. First, if the small business is not in a position to sell directly to the government, a major contractor might see value in adding an SBIR innovation to its product offerings and may purchase, lease, or acquire the technology from the small business to incorporate it into products for the department. However, if a major contractor sees a disruptive threat to its product lines, it may seek an acquisition or exclusive license simply for the purpose of burying the innovation to manage its risk. "At least anecdotally, this is reported to happen, [although] I believe it's an open question how pervasive this phenomenon might be," Ouellette said. Until more data are collected to understand this issue, metrics assuming that a license or acquisition actually resulted in the application of a technology should be viewed with skepticism, he said. The practice of acquiring technologies to prevent their potentially disruptive effects is clearly contrary to the intended purpose of the SBIR program and to the DOD's interest.
___________________
5 Since the time of the workshop, Stephen Ouellette has transitioned to a new role at the Institute for Defense Analyses and no longer directs the System Evaluation Division.
The second challenge is that SBIR awards only take technologies up to the edge of the valley of death, and many do not make the transition across that gap into programs of record. One factor in this problem is that DOD programs themselves are risk averse. A technology may not be deemed suitable if the cost per unit is too high, especially prior to wider commercialization and manufacturing scale-up. This is why some technologies, when they make it to programs, are put on the shelf for an extended period while a manufacturing, development, and commercialization effort proceeds to bring the unit cost down to where it becomes palatable for an acquisition program.
Ouellette indicated that another issue new technologies encounter is integration risk. For example, if the space, weight, power, robustness, or environmental suitability of a technology creates a risk for a program, the technology may be avoided until the risks can be brought down. In either of these cases, the program will forego the new technology in favor of known approaches with lower risk.
The DOD would benefit if it had better metrics for three specific issues, said Ouellette. The first is whether SBIR awardees are well positioned to cross the valley of death: because they have robust and mature plans for commercialization, because they are aligned with critical needs of current programs for which the programs have no alternative technological solution, or because they are willing to take risks despite the potential costs.
The second issue, he said, involves the alignment of SBIR project objectives with DOD science and technology priorities identified in the solicitations. In other words, are the SBIR innovations closing technology gaps and meeting the needs described in solicitations, or is there a duplication of effort? Are they only partially satisfying needs such that more investment is needed in other areas?
Finally, are SBIR technologies actually making it into acquisition programs or operational uses? This can be challenging to determine when needed information is classified or otherwise unavailable. Ouellette reported that a DOD report that looked at the national economic impacts of the SBIR/STTR program from 1995 to 2018 found that “many projects involved classified applications, and companies were reluctant to provide sales data or were expressly prohibited from doing so. Also, many companies or their SBIR-derived capabilities were acquired by large prime contractors and these small businesses were not allowed to disclose (or did not know) the resultant sales figures.”6 Were they simply burying the technology, or were they applying it in classified programs that might lack visibility?
Ouellette closed with two other thoughts about how the metrics for SBIR programs might be used or misused. First, metrics clearly have many purposes. They may be compared to threshold criteria, such as the benchmark requirement for commercialization applied to repeat SBIR awardees. Metrics can also measure outcomes relative to expectations. For example, are projects doing better than or are they on par with similar current projects or historical expectations?
___________________
6 TechLink, in collaboration with the Business Research Division, Leeds School of Business at the University of Colorado Boulder. 2019. National economic impacts from the DoD SBIR/STTR Program, 1995-2018. https://www.sbir.gov/impact/impact-reports
Second, perhaps the most powerful use of metrics is to drive improvements in outcomes by examining correlations between the outcome metrics and associated data on process inputs and controls. For instance, how do the robustness and maturity of the commercialization plan for Phase II SBIR proposals relate to the degree of success? Is there a way to establish how one impacts the other using the metrics and data available?
Metrics also can be used at different levels of oversight, he said. Metrics can be used by Congress to determine the effectiveness of the SBIR program. They can be used by the Small Business Administration. They can be used by the departments and agencies awarding the SBIR. They can be used by SBIR performers to judge their own success. “In every case, the metrics and the data they are based on should be focused on management and assessment by oversight authorities downward. If metrics or data can be manipulated by the overseen entity, there is an incentive to misuse them to upwardly manage the oversight authority. I’ve seen this personally, not as much in the case of SBIR in my experience but in the use of technology readiness levels to assess the maturity of technologies. There’s enough leeway in the definitions of readiness levels that the programs of record can inflate the apparent maturity of a technology and downplay technical risks that could result in program delays or eventual cancellation. And they have a tremendous incentive to do so if they can manipulate their metrics to influence the oversight. So it would be useful to think about how metrics and the data they’re based on can be verified or audited to ensure their trustworthiness. Do you know what you’re getting when you ask for things?”
In response to a question about using the transition of a technology into a program of record as a metric, Rathje pointed out that program of record has a very specific definition and not every entity meets that definition. “Plenty of commercial systems that our warfighters use every day will never be considered a program of record.” Ultimately, the DOD is interested in funding things that support the DOD. This can include commercial or defense technologies, but both fall under the banner of commercialization. In that regard, one measure of success is whether a technology is eventually procured by the department, whether directly or more deeply within the supply chain.
The department also thinks about SBIR programs as a way of bringing nontraditional vendors into the DOD, Rathje pointed out. That might involve bringing in more mature technologies from the commercial sector and integrating those into defense systems. In that regard, a metric beyond advancing technologies is understanding the commercial innovation ecosystem and how to integrate technologies into defense systems.
Shenoy added that a useful metric is not only whether the program advances technology but also whether it lowers costs for the DOD or lowers other barriers to use. “The value of the technology that we support through these SBIR/STTR companies is that it is critical in connecting to an end commercial product or different products. The more we do to try to figure out what that value is upfront, the better.” Other useful approaches, he said, are to tie evaluations to technology roadmaps and to evaluate the potential commercial impact of a technology.
Ouellette suggested three ways of thinking about the value to the department of SBIR programs. Did an SBIR investment make an impact for the DOD? If there is an impact, what is the net difference in terms of savings? And did the technology represent a novel innovation? “Sometimes there’s an alternative way of doing something, and what the SBIR innovation does is increase the competitive space.”
Asked what he would term the “top metric,” Rathje responded that it is value per dollar spent. Others exist, such as total value created, economic growth, or job creation, but the ultimate measure is how much a dollar spent generates in return. He cautioned against measuring the total number of projects. Even if a large number of Phase I projects are funded in the interests of experimentation, greater funds invested in relatively fewer Phase II projects may yield a greater return.
In response to a question about the feasibility of basing metrics on the careers of program officers, Shenoy said “there is a lot of value to understanding that metric.” Jette pointed out, however, that the current budgeting methodology and career path expectations present obstacles to that kind of tracking. Another possibility would be following the track records of scientists who are successful at getting patents, starting companies, and other measures of success within SBIR/STTR programs.
Finally, the panelists discussed the relative emphasis given to incremental advances in SBIR/STTR programs versus major and transformational advances. Rathje pointed out that the size of the awards tends to favor incremental advances more than exceptional ones. “There’s only so much money you can put on an SBIR, and because of that there’s only so much value that it generates for the department and also so much focus and time the department will spend on it.” But policy reforms could allow the program to take on bigger challenges, he added. Another factor, Shenoy pointed out, is that higher-risk projects typically take longer to mature, which is another reason that SBIR programs tend to focus on smaller components within larger systems. Beyond the size of the advances, Ouellette observed, successful programs tend to combine talented engineers and entrepreneurs who are all interested in making a technology successful. “They had a passion for it. They had the right type of background, the right expertise.”