
Obama Administration Unveils $200M Big Data R&D Initiative

Throughout the 2008 hurricane season, the Texas Advanced Computing Center (TACC) was an active participant in a NOAA research effort to develop next-generation hurricane models. Teams of scientists relied on TACC's Ranger supercomputer to test high-resolution ensemble hurricane models and to track evacuation routes using data streams from the ground and from space. Using up to 40,000 processing cores at once, researchers simulated both global and regional weather models and received on-demand access to some of the most powerful hardware in the world, enabling real-time, high-resolution ensemble simulations of the storm. This visualization of Hurricane Ike shows the storm developing in the Gulf of Mexico and making landfall on the Texas coast [image courtesy Gregory P. Johnson, Romy Schneider, John Cazes, Karl Schulz, Bill Barth, The University of Texas at Austin; Frank Marks, NOAA; Fuqing Zhang, Pennsylvania State University; Yonghui Weng, Texas A&M; via NSF].

The Obama Administration this morning unveiled details about its Big Data R&D Initiative, committing more than $200 million in new funding through six agencies and departments to improve “our ability to extract knowledge and insights from large and complex collections of digital data.” The effort, spearheaded by the White House Office of Science and Technology Policy (OSTP) and the National Science Foundation (NSF), along with the National Institutes of Health (NIH), the Department of Defense (DoD), the Defense Advanced Research Projects Agency (DARPA), the Department of Energy (DoE) Office of Science, and the U.S. Geological Survey (USGS), seeks to “advance state-of-the-art core technologies needed to collect, store, preserve, manage, analyze, and share huge quantities of data; harness these technologies to accelerate the pace of discovery in science and engineering, strengthen our national security, and transform teaching and learning; and expand the workforce needed to develop and use Big Data technologies.”

The first wave of commitments to support the Big Data Initiative features a new joint solicitation of up to $25 million supported by NSF and NIH – Core Techniques and Technologies for Advancing Big Data Science and... – that will advance foundational research in Big Data. The solicitation aims to:

extract and use knowledge from collections of large data sets in order to accelerate progress in science and engineering research. Specifically, it will develop and evaluate new algorithms, statistical methods, technologies, and tools for improved data collection and management, data analytics, and e-science collaboration environments.

Farnam Jahanian, Assistant Director for NSF’s Directorate for Computer and Information Science and Engineering (C..., noted:

“The Big Data solicitation creates enormous opportunities for extracting knowledge from large-scale data across all disciplines. Foundational research advances in data management, analysis, and collaboration will change paradigms of research and education, and promise new approaches to addressing national priorities.”

For the solicitation, NIH is particularly interested in imaging, molecular, cellular, electrophysiological, chemical, behavioral, epidemiological, clinical, and other data sets related to human health and disease.

In addition to the BIGDATA solicitation, NSF is also issuing several new awards today in support of the initiative:

  • A $10 million Expeditions in Computing award to a team of University of California, Berkeley, researchers, to integrate “algorithms, machines, and people” (cloud computing, machine learning, and crowdsourcing) to generate new knowledge and insights from big data.
  • The first round of awards made under the Foundation’s Cyberinfrastructure Framework for 21st Century Science and Engineer.... Through a program called EarthCube, these awards will “support community-guided cyberinfrastructure to integrate big data across geosciences,” ultimately transforming how geoscientists access, analyze, and share information about our planet.
  • A $2 million award for a research training group in big data that will support training for undergraduate and graduate students and postdoctoral fellows using novel statistical, graphical, and visualization techniques to study complex data.
  • And a $1.4 million award for a focused research group that brings together statisticians and biologists to develop network models and automatic, scalable algorithms and tools to determine protein structures and biological pathways.

Meanwhile, DoD is “placing a big bet on big data,” launching “Data to Decisions” – an investment of $250 million annually, with $60 million available for new research projects – a series of programs that will:

  • harness and utilize massive data in new ways, and bring together sensing, perception, and decision support to make truly autonomous systems that can maneuver and make decisions on their own; and
  • improve situational awareness to help warfighters and analysts and provide increased support to operations.


[DoD] is seeking a 100-fold increase in the ability of analysts to extract information from texts in any language, and a similar increase in the number of objects, activities, and events that an analyst can observe.

To accelerate innovation in Big Data, DoD will initiate a series of open prize competitions in this space in the coming months.



