The Next Wave | Vol. 19 | No. 4 | 2012

Funding research for a science
of cybersecurity: The Air Force
makes it a mission

The Air Force Office of Scientific Research (AFOSR) plans, coordinates, and executes the Air Force Research Laboratory's basic research program. AFOSR's technical experts identify and fund long-range technology options at Air Force, university, and industry research laboratories. This support ensures the timely transition of research results that lead to revolutionary scientific breakthroughs, enabling the Air Force and US industry to produce world-class, militarily significant, and commercially valuable products. Such research is inherently risky, sometimes outside of the mainstream, and often requires an extended period of support. This article describes several AFOSR initiatives that focus on the science of [cyber]security (SoS). The initiatives include a Multidisciplinary University Research Initiative (MURI), a Young Investigator Program (YIP) grant, and a Basic Research Initiative (BRI).

Multidisciplinary University Research Initiative (MURI)

In 2010, the deputy director for cybersecurity in the Information Systems and Cyber Security Directorate of the Assistant Secretary of Defense for Research and Engineering (ASD(R&E)) requested that AFOSR fund a MURI focused specifically on the science of [cyber]security (SoS). The MURI program is DoD-wide and complements other DoD programs that support university research through single-investigator awards. The MURI supports the research of teams of investigators whose backgrounds intersect multiple traditional science and engineering disciplines in order to accelerate research progress. The government team for this effort was led by Dr. Robert Herklotz, AFOSR, and included support from a number of research funding organizations including the Air Force Research Laboratory/Information Directorate; the Army Research Office; the Office of Naval Research; the National Science Foundation; the National Security Agency; the National Institute of Standards and Technology; and the Office of the Director, Defense Research and Engineering (now the ASD(R&E)).

The SoS MURI was prompted by the widely held belief in the security community that cybersecurity has been pursued largely as a reactive effort, with an endless cycle of new attacks and defensive responses. Many security experts have come to believe this cycle cannot be broken because today's information technology systems are too complex to ever be modeled with formally defined and verified security properties. In fact, no formal definition of cybersecurity described in terms of system properties has yet been produced, let alone metrics capable of measuring those properties.

The objectives of the SoS MURI, as presented in the proposal solicitation, are to begin the development of an architecture or first principle foundation to define cybersecurity for such systems, to discover and define basic system properties that comprise system security and other useful attributes, and to identify system properties that can be verified and validated through theoretical proof and/or experimentation. A primary goal is to answer the following questions through the discovery and analysis of basic system properties:

    Can the system enforce the desired security policies in each system component?

    Can the system enforce the desired security policies across all system components simultaneously? If so, what are the security properties of the whole system?

    Can the system capabilities defined in the first two bullets above defend against each class of attack, once classes of cybersecurity attacks have been defined?

    How can we formally define cybersecurity policies and mechanisms (including defense, monitoring, response, etc.) and assess their effectiveness against classes of attacks?

    Can an adversarial process model be formally defined that is capable of generating known classes of attacks?

    Can we define metrics for basic system properties and for the ability of a system to enforce a security policy that defends against a class of attacks?

    Can we define system properties and metrics dealing with system characteristics, such as scalability, adaptability, ease of use, etc., in order to compare alternative system designs?

The development of theoretical underpinnings (i.e., system properties and their relationship to policies) and of theories and metrics (i.e., relationships between attacks, defenses, and policies) will allow us to create systems engineering methodologies that support rigorous design trade-offs among cybersecurity properties, as well as other properties, in the development of complex systems. In addition, this research will:

    Enable the creation of new technologies and supporting tools grounded on sound principles,

    Establish a baseline for comparing technology capabilities among vendors,

    Encourage the creation of a new industry for security software engineering technologies, and

    Reduce development costs by providing scientifically supported evidence of security properties rather than applying exhaustive testing to look for evidence of insecurity.

The winning MURI proposal

The winning proposal, announced April 22, 2011, is entitled "Science of cybersecurity: Modeling, composition, and measurement." The work is to be performed by a multiuniversity team of researchers led by Professor John C. Mitchell of Stanford University.

Professor Mitchell's team proposed research to advance a science base for trustworthiness by developing concepts, relationships, and laws with predictive value. Their work will focus on problem areas amenable to rigorous treatment and generalizable solutions and is organized around the following three thrust areas:

  1. Security modeling. A uniform approach to security modeling will allow systematic approaches to be developed and applied to a broad range of richly connected systems, supporting analysis of resilience against graduated classes of clearly defined threat models.
  2. Secure composition. Principles of secure composition will be developed, analyzed, and evaluated for systematic and modular construction of trustworthy systems, relative to security properties that can be verified and validated through theoretical proof and/or experimentation.
  3. Security measurement. New security measurement concepts will be devised and used to determine relative strengths of defense mechanisms, whether security improves from one version of a system to another, and when additional security mechanisms are warranted, given incentives associated with system attackers and defenders.

Together, the advances anticipated for these three complementary thrusts will support a science base for future systems that proactively resist attacks through secure design, development, and implementation based on principled foundations.

Young Investigator Research Program

On January 11, 2012, the AFOSR announced it would award approximately $18 million in grants to 48 scientists and engineers who submitted research proposals through the Air Force's Young Investigator Research Program (YIP).

The YIP is open to scientists and engineers at research institutions across the US who received a PhD or an equivalent degree in the last five years and show exceptional ability and promise for conducting basic research. The objective of this program is to foster creative basic research in science and engineering, enhance early career development of outstanding young investigators, and increase opportunities for the young investigators.

Among the 2012 winners was Michael Clarkson, assistant professor in the Department of Computer Science at the George Washington University. His YIP proposal, "Making cybersecurity quantifiable," is focused on further development of his PhD thesis on hyperproperties, a very promising tool for security science.
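Hyperproperties generalize the classical notion of a trace property. As a brief sketch of the distinction (following Clarkson and Schneider's published formulation, not drawn from this article): if a system S is modeled as the set of execution traces it can produce, then a trace property P is a set of traces, while a hyperproperty H is a set of sets of traces:

```latex
% Trace property: satisfaction is checked trace by trace.
S \models P \iff S \subseteq P
% Hyperproperty: satisfaction is a claim about the whole trace set.
S \models H \iff S \in H
```

Security policies such as noninterference relate multiple traces to one another, so they can be expressed as hyperproperties but not as trace properties; this expressiveness is one reason hyperproperties are seen as a candidate foundation for formally defining security.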

Basic Research Initiative on cyber trust and suspicion

On March 27, 2012, the AFOSR announced a Basic Research Initiative (BRI) to build the foundational understanding of human trust and suspicion in the cyberspace domain. Cyberspace operations rely heavily on the degree to which users trust, or are suspicious of, their information technology systems. To date, there has been little or no work providing a unified, comprehensive treatment of the social, cultural, economic, political, and emotional factors (to name a few) underlying trust and suspicion, especially in complex systems.

The winning proposal, "A social, cultural, and emotional basis for trust and suspicion," led by Dr. Eunice E. Santos of the Institute of Defense and Security at the University of Texas, El Paso (UTEP), was funded on September 14, 2012. Her team, which includes UTEP, Syracuse University, the University of Tulsa, the University of Houston, and Assured Information Security, Inc., proposed research to develop a model of the behavior of system users, managers, and insiders that accounts for and explains the social, cultural, and emotional bases of trust and suspicion.

Among the questions their research will address are:

  1. How can different people be swayed (or sway others) based on trust or suspicion?
  2. How and why do group member sociocultural characteristics, group size, information sharing patterns, and events affect group cohesion?
  3. Is it possible to detect significant drops in situational awareness or when the level of trust is inappropriate in a given context?
  4. What are the critical interrelationships between information, emotional responses, situational awareness, influences on decision making, and associated changes in task performance?
  5. How do complex multiscale and multilevel factors affect insider threat detection?
  6. Lastly, and most importantly, can this research be unified into a single overarching framework of social, cultural, and emotional factors underlying trust and suspicion?

The end product of their project is a methodology for better understanding system users, system managers, and the insider threat by articulating the social, cultural, and emotional basis of human behavior in the cyber domain and the impact of trust and suspicion on cyberspace operations.

A legacy of research

The AFOSR was born out of the need to address a long-standing shortfall in military basic research. This deficiency became obvious during World War II, when massive civilian-led research and development efforts were required to create the technology needed for our nation to dominate warfare in a physical battle space. Today the AFOSR continues its original mission by investing in basic research to support domination of the emerging battle space in the cyber domain. Just as a well-understood scientific foundation is necessary for secure and safe physical systems, a science of cybersecurity is needed for safety and security in the cyber world. To learn more about AFOSR basic research funding opportunities, download the broad agency announcement (i.e., BAA-AFOSR-2012-0001).

About the author

Dr. Robert L. Herklotz is currently the program manager for the Information Operations and Security basic research program at the Air Force Office of Scientific Research in Arlington, Virginia. He invests in science to develop secure information systems for our warfighters and to deny the enemy such systems. His specific subareas of research include the science of cybersecurity, secure humans, secure networks, secure hardware, covert channels, secure execution on insecure systems, secure data, and secure systems-security policy. From 2000 to 2006, he managed the Air Force's basic research investment in three programs: software and systems, artificial intelligence, and external aerodynamics and hypersonics. Prior to that, he was a career Air Force officer, retiring in 1999.

Dr. Herklotz holds a PhD from the Massachusetts Institute of Technology, an MS from Purdue University, and a BS from the US Air Force Academy. His awards include two Silver Stars, four Distinguished Flying Crosses, eight Air Medals, and the Association for Computing Machinery 2012 Special Interest Group on Security, Audit and Control Outstanding Contributions Award.

