BERAC Meeting Minutes May 1-2, 2001

MINUTES

Biological and Environmental Research Advisory Committee (BERAC) Meeting
Office of Biological and Environmental Research
Office of Science
U.S. Department of Energy

 

DATE: May 1-2, 2001

LOCATION: American Geophysical Union, Washington, D.C. The meeting was announced in the Federal Register.

PARTICIPANTS: A list of BERAC members who were present is attached.  Approximately 65 other people were in attendance during the meeting. 

Tuesday, May 1, 2001

Warren Washington - BERAC Subcommittee Review of Global Change Research Program

  • Subcommittee meeting March 26-27, 2001
  • US Global Change Research Program (USGCRP) issues are very important today in light of current Bush Administration energy plans that are under development.
  • The USGCRP and the BER Global Change Research Program make key contributions to a wide range of issues, from forecasts with climate models to the global carbon cycle, the relationship between clouds and radiation in the climate system, and ecosystem responses to climate change.
  • The BER program is of very high quality.
  • The BER program is the third largest, behind the National Aeronautics and Space Administration (NASA) and National Science Foundation (NSF) programs.
  • BER program was a major contributor to the knowledge base underpinning the recently completed third assessment of the Intergovernmental Panel on Climate Change (IPCC) report and to the US national assessment aimed at understanding the impacts of regional climate change.
  • Subcommittee asked BER for a list of its accomplishments over the past 6-7 years. This is a very significant list. BER also provided a total listing of program publications, which is a very thick document.
  • General issues – concern about BER’s limited staffing levels and program management needs. The program is faced with a wide range of complexities due to needed interactions with other agencies. The program deals with the operation and maintenance of facilities that are a key component of the research.
  • Atmospheric Radiation Measurement (ARM) program – ongoing and unique to DOE. Getting at the difficult issue of radiation and clouds. DOE is the only agency with a well-coordinated effort addressing this challenge. The program received high marks from the review, and the report makes suggestions for improvement.
  • Atmospheric Science Program – oxidants, pollution in urbanized areas, field experiments, hydrocarbons, nitrogen oxides, and ozone. A separate research strategy is needed for each of these areas.
  • Carbon cycle research program – research to pin down aspects of the global carbon cycle, inter-annual variation in CO2 increase, terrestrial modeling very important, the AmeriFlux network is a key part. The report includes recommendations to support increased field and modeling work.
  • Ecological process modeling – program is expanding the frontiers of soils and microbial research, impacts of rising CO2 on forest ecosystems, surprising results in terms of large trees and the impacts of precipitation.
  • Integrated Assessment program - unique aspects not found in other agencies, consequences of rising greenhouse gases and surprising results, carbon sequestration options, complex factors overall and need to tease out individual impacts.
  • Climate modeling – DOE is innovative and creative in efforts to get a larger community of scientists to make climate models usable and efficient and to be more inclusive of the different tools that are available. These models can be used to inform policy decisions.
  • Terrestrial carbon research and ecological process including sequestration – working with partners in other agencies to determine where excess carbon goes.
  • Subcommittee asked for a more detailed outline of the Integrated Assessment program and how it could contribute to current issues. This has the interest of the Bush Administration. Areas of potential impact are identified in the back of the report.
  • Gene Bierly comments – IPCC had 3 working groups. We keep hearing about the science committee, but there was also an adaptation working group focused on adaptation to changing climate and a mitigation group focused on what governments can do. We know a fair amount about the science, but much less about the application of science to real life problems (especially adaptation). IPCC is undergoing many changes and this is an opportune time for the US to take a greater leadership role. DOE argued early on (and alone) for research on the role of clouds and began the ARM program. The Australians are now getting ready to set up and pay for an ARM site. The Atmospheric Sciences program, with its emphasis on local air pollution problems, focuses on the important issue of getting to the regional/local areas where people live. Tropospheric aerosols may be a reason why the atmosphere is not warming/behaving the way physics says it should, i.e., due to the cooling effect of these aerosols. We need to get this funded and going. Overall BER’s program is really outstanding with many unique aspects.
  • Lou Pitelka comments – BER program plays a unique role in terrestrial carbon and ecosystem research. DOE started work on the impacts of elevated CO2 long before most thought about climate change. At the time few appreciated the potential role of the forest ecosystem as a driver of climate change and CO2 balance.
  • Where does the report go next? Would be useful to package it for wider circulation. The recently completed Genomes to Life report is a useful “example.” There is a need for graphics. Need to include the members of the subcommittee. These types of things are very useful when talking about the roles of DOE science in Congress, OMB, OSTP, etc.
  • BERAC formally approved the subcommittee report.

 

Bob Ellingson – Director, University of Maryland Cooperative Institute for Climate Studies                     
Associate Director, Earth System Science Interdisciplinary Center
One of the original co-authors of the ARM program plan

Mission scientist for ARM Unmanned Aerial Vehicle program

  • Want to try to determine the net amount of radiation that either enters or leaves a volume of atmosphere. Can then determine how much is absorbed in the clouds.
  • Anomalous absorption – greater than predicted by models – a paper about 10 years ago really initiated current activities; models were off by several tens of percent.
  • Motivation for the ARM Enhanced Shortwave Experiment (ARESE) – Knowledge of the amount and location of solar radiation absorption is key to understanding the general circulation of the ocean and atmosphere and to our understanding and prediction of climate change. Many, though not all, studies show much more absorption than can be explained by theory. Remote sensing techniques need to change.
  • Two ARESE experiments (Sep-Nov 1995; Feb-Apr 2000) directly measured the absorption of radiation by clear and cloudy atmospheres.
  • 3 stacked aircraft platforms, satellites, ARM sites – measurements over several hundred kilometers, only 1 day in ARESE 1 with extensive cloudiness.
  • ARESE 2 experiment – unique sampling strategy, single aircraft only on overcast days over Southern Great Plains ARM (CART) site, multiple instruments making same measurements with different technologies, extensive pre and post experiment calibrations, long duration during a period of extensive overcast conditions, science team with considerably different pre-experiment views.
  • Data released to science community, March 17, 2001.
  • National Centers for Environmental Prediction (NCEP) forecasts are a must.
  • Apparent disagreement between models. Many of the parameters being used need to be updated. Not all causes of discrepancies identified yet, but have also not yet used spectral data.

 

Nora Volkow – Director, Life Sciences Research, Brookhaven National Laboratory

  • We have technologies that allow us to understand how the brain works, but do not have the capabilities to fully utilize or analyze all the information we have.
  • Stress facilitates drug addiction. Still many discrepancies about the definition of addiction.
  • Start in the dopamine system. Drugs of abuse increase concentration of dopamine in pleasure centers of the brain. Why does this lead to loss of control?
  • Loss of dopamine D2 receptors in the orbital frontal cortex is associated with obsessive-compulsive disorder, but had not previously been associated with drug addiction. Why do some become addicted and others not? Obviously can’t study people before and after addiction.
  • Individuals reporting Ritalin as pleasant versus unpleasant had fewer and more D2 receptors, respectively, though there was considerable overlap of individual levels, so more than one process is certainly involved. How do we even prove an association here?
  • Using human data to guide animal experiments by modifying receptor numbers with an adenovirus carrying the D2 receptor. Transient increase in receptor levels (up to 5-fold). Big reduction in self-administered alcohol, correlating directly over time with the increase and subsequent loss of the receptor.
  • Monoamine oxidase (MAO) B high in nonsmokers and low(er) in smokers – little/no overlap, former smokers return to higher levels, direct consequence of chronic smoking. MAO B breaks down dopamine leading to more dopamine available to enhance the reinforcing effect. Animals treated with L-deprenyl to inhibit MAO B have enhanced response to cocaine, i.e., like having less MAO/more dopamine.
  • Dopamine terminal damage in animals treated with methamphetamine. Possible long term result in neurodegenerative diseases – Parkinson’s-like effects? Reduction in dopamine transporters in users, though again with overlap with normal controls. Transporter level recovery in (5) former addicts 9 months post use, though no recovery in psychosocial behavior in the same individuals.
  • Need to engage pharmaceutical companies to help develop radiotracers for experimental/diagnostic use from the many drugs that never make it to market.

 

Ari Patrinos - Associate Director for Biology and the Environment, Office of Science

  • This is a time of transition and change. Currently on a 4-6 week detail to the old Executive Office Building to help the Administration develop a plan to deal with climate change issues. There are parallel “energy plan” activities underway.
  • BER personnel issues – sense that some movement is underway. One IPA has been hired and signs that other actions may be breaking through.
  • Bottom line of the budget - Lots of early anxiety about the Office of Science budget due to tax cuts and education needs. The FY 2002 request is much better than feared and better than those of other science agencies. Our bottom line is flat, in contrast to NSF and NASA, which saw reductions in their science budgets. We even did well compared to the rest of the Department, such as Science and Technology in EM, with whom we have ongoing relationships. Even within the flat budget we are still proud to be highlighting and growing Genomes to Life as our major initiative. We still have hopes for even greater growth in the Genomes to Life program, selling it on its potential applications in low dose radiation research, bioremediation, carbon sequestration, and clean energy.
  • Biotechnology promise of Genomes to Life – the health benefits of the proposed new research are easy to quantify and are relatively unchallenged. The overall benefits of Genomes to Life are as yet just promises and remain to be quantified/demonstrated. Clearly these benefits will not come “tomorrow” but will, hopefully, provide solutions by 2050 and beyond. This is an exercise that needs to be undertaken in the near future. We want to avoid giving the impression that there are simply boutique applications for many of the described biotechnology benefits.
  • New BERAC charge – The new National Institutes of Health / National Institute of General Medical Sciences (NIGMS) structural genomics initiative benefited from BER’s help in launching the structural biology and structural genomics programs, something that BER and BERAC take great pride and satisfaction in. BER continues its partnership with NIGMS. Most of our labs are involved in the NIGMS structural genomics initiative pilot projects. A continuing role for BER in this research is likely. BERAC is now being charged to provide advice on how BER and the Office of Science (including Basic Energy Sciences) should best position themselves for the future as structural genomics goes from the pilot phase to the full-blown phase.
  • In spite of the many distractions facing the new DOE Secretary at the beginning of his tenure, he was engaged very early on to realize and appreciate DOE’s central role in genomics and now in the new Genomes to Life program. In the FY 2002 rollout Genomes to Life was singled out. The Secretary considers Genomes to Life as one of his legacy programs.
  • Thanks to Warren and the subcommittee for a wonderful job and for the timing of the report, which took a hard look at our global change program.

 

Marv Cassman - Director, National Institute of General Medical Sciences, NIH

  • The NIGMS Structural Genomics Initiative is a high throughput program.
  • Why now? Progress in genomics. Technology breakthroughs – synchrotrons, protein crystallization robots, DNA sequence comparison capabilities.
  • Benefits of the initiative – Full coverage of protein space by homology modeling. Structure classes of proteins identified through genomic database modeling can have medical importance. Improvement of knowledge-based sequence-to-structure and structure-to-function predictions. Evolutionary links that can give clues to unknown structures. New technologies and reagents for broad use.
  • Will not do – membrane proteins initially; proteins in large assemblies; unfolded or partly folded proteins; post-translationally modified proteins; direct drug design.
  • First international conference on structural genomics at Hinxton, UK in 2000 – “…representing the entire range of structural diversity found in nature”
  • Again, this program is about the large scale determination and analysis of protein structure – high throughput!
  • Brief history of key meetings - DOE 1998 ANL workshop; NIGMS feasibility workshop April 1998; Rutgers October 1998; Structure targets workshop February 1999; Hinxton April 2000; Organization for Economic Cooperation and Development (OECD) workshop, Italy, June 2000; International Conference on Structural Genomics 2000, Japan, November 2000; Airlie House April 2001.
  • Program will focus on a representative set of structures; functional information; models to extend coverage of sequence space; high throughput structure determination. It will represent each family of proteins by sequence homology. Homologous families are groupings with 30-35% sequence homology. Current guess is that ~10,000 structures will be required (probably more).
  • 1,033 structures were deposited in the Protein Data Bank (PDB) in 1994 – 10% were novel. 1,977 structures were deposited in 1998 – 9.2% were novel. At this pace of structure acquisition it would take far too long to accumulate 10,000 novel structures (a rough back-of-the-envelope check is sketched after this list).
  • Scientific issues – target selection; high throughput methods; methods for homology based modeling; development of informatics systems with interconnectivity; structures of difficult proteins, e.g., membranes.
  • Related efforts are underway in Japan and Germany. The UK, and possibly the French, are talking about initiating such efforts. Several industrial efforts are also underway – new startups plus an industrial consortium like the Single Nucleotide Polymorphism (SNP) Consortium (Alan Williamson, lead). Most other efforts are focused on practical choices of targets.
  • Policy issues – approaches for tracking progress (not farming out/coordinating targets); timely release of structure information; what constitutes publication (at one structure per day, regular journal publication won’t work); interaction with industry; intellectual property protection versus rapid dissemination of information – a confusing challenge.
  • Tracking – each pilot project group being funded by NIGMS will maintain a public web site with targets, progress and multiple milestones tracked using “date stamps”.
  • Pilots – all components of the initiative are included in each pilot research effort including test target selection approaches, best strategy for scale-up, high throughput and management practices.
  • Requirements - Immediate deposition of coordinates (4-6 week delay/waiting period okay at this point; automatic deposition seems too problematic right now) – immediate deposition and release of coordinates now required for all NIGMS published structures; sharing of materials and samples; annual meetings to share progress; tracking web sites.
  • Initial NIGMS solicitation - 11 applications, 7 awards involving 41 institutions and $29.7M annually.
  • Rockefeller, ANL, UC Berkeley, Rutgers, LANL, U GA, Scripps – most of these efforts are much broader than this – may fund another one or two efforts this year following another review of new applications next month.
  • The organisms included for protein selection include yeast, human, C. elegans, fly, microbes.
  • International task forces have been put together to address issues on informatics, intellectual property, data release criteria and implementation.
  • Data release – ensuring high quality; releases may be accompanied by short peer reviewed papers; release 3-6 weeks following completion, up to 6 months in limited cases (needed to accommodate the IP needs of many different groups and countries). The value of pure structure information is still pretty murky, even after discussion with lawyers, judges, and the patent office; maybe only 8 pure structures have been patented so far. It is desirable to move as far as possible toward archiving raw data as data management technologies permit – what to archive still needs definition (archiving diffraction images may not be practical, but structure factors could be).
  • Degree of resolution that is likely from these pilot projects? Not defined yet but likely 3 Angstroms or better.
  • Intellectual property – the program encourages a limitation on the patenting of simple 3-dimensional coordinates and an increased emphasis on utility.
  • Bottlenecks – targets; automation of Nuclear Magnetic Resonance (NMR) structure determination; expression and growth of crystals; crystallization automation; data management.
  • Future of the program? These are 5-year awards, though continuation is not guaranteed. The end result should be 1-3 protein structure determination production lines. The program has a goal of 200 structures per year per group in years 3-4. This program is an adjunct to what is already going on in structural biology.
  • A number of companies working in this area have started up. Are they running yet? High throughput capacity is being developed. Automated crystallization capability looks very good. Some sharing is going on, e.g., the Novartis group. No sharing of structures at this point. At least one example of a company releasing structures but not target lists.
  • Hoping for a centralized point-and-click site so people can search all of the different sites.
  • Two of seven pilot projects have ancillary NMR components. There are some interesting “high throughput” NMR strategies being developed.
  • The insoluble protein problem is being addressed by some of the projects. Some even include membrane proteins to a limited extent. Approximately 40% of all proteins are considered not solvable by current methods. At present there is roughly a 20% likelihood that useful information will come out of any selected starting target.
  • An individualized grants program continues to be run in parallel to solve many of these challenges.
  • Hopefully current investments will be enough to meet projected demand at beamlines.
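To put the deposition and pilot numbers above in perspective, the following is a minimal back-of-the-envelope sketch (not part of the presentation): it multiplies the quoted PDB deposition counts by the quoted novelty fractions, then compares the ~10,000-structure target with both the 1998 pace and the stated pilot goal of 200 structures per year per group across the 7 funded pilots.

```python
# Rough pace check using the figures quoted above (illustrative only).
deposits_1994, novel_frac_1994 = 1033, 0.100   # PDB depositions in 1994, fraction novel
deposits_1998, novel_frac_1998 = 1977, 0.092   # PDB depositions in 1998, fraction novel

novel_per_year_1994 = deposits_1994 * novel_frac_1994   # ~103 novel structures
novel_per_year_1998 = deposits_1998 * novel_frac_1998   # ~182 novel structures

target = 10_000  # rough estimate of structures needed to cover sequence space
print(f"Novel structures/year, 1994: {novel_per_year_1994:.0f}")
print(f"Novel structures/year, 1998: {novel_per_year_1998:.0f}")
print(f"Years to {target:,} at the 1998 pace: {target / novel_per_year_1998:.0f}")

pilot_groups, per_group_goal = 7, 200  # funded pilots and the year 3-4 goal per group
print(f"Years to {target:,} at the pilot goal: {target / (pilot_groups * per_group_goal):.1f}")
```

At the 1998 rate the target would take over half a century; if all seven pilots reached the 200-structures-per-year goal it would take well under a decade, which is the case the initiative makes for high throughput.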

 

Keith Hodgson - Chair, BERAC
  • New BERAC charge. BER contributions – user beamline infrastructure, detectors, software, core technologies at synchrotron centers, long tradition in large facility operation, support of experienced science and engineering teams with range of capabilities.
  • Remarkable increase in biology users of DOE facilities over time – four-fold increase during the past 10 years. Projected to reach 11,000 users annually when all beam lines are fully instrumented (currently at 6000+ users annually).
  • In 2000, the US user facilities accounted for over 60% of structure publications in a sample of key journals.
  • Software control systems for system / beam line control and collaboratory development are needed.
  • Issues/concerns/opportunities – new dedicated beam lines require ~3 year lead time. Will there be enough beamlines? Centralized strategies for technology sharing – need adequate resources to make this happen and adequate motivation. Reinvention is costly. Center management is becoming increasingly important. Older, general user beam lines may fall behind without additional funding and focused integration programs to help them keep pace. Data mining tools and resources are required and are not currently covered by the NIGMS initiative.

General Discussion –

  • Need tools for the high throughput expression, screening, etc., phase; structures themselves can actually be solved fast.
  • The intent was to expand access by improving the productivity of individual beam lines. The new Advanced Photon Source (APS) beam lines may be the exception, especially if the double undulator goes in; a portion may then be dedicated just to high throughput structural genomics rather than keeping it completely open.
  • We are pretty inefficient in beam line use now. Most time is spent manipulating the sample rather than collecting data. How will this get propagated throughout the system?
  • Determining structures at one per day will simply require beam time.
  • Getting 900 MHz NMRs into the process could be a role for BER. The Japanese have a large investment in an NMR farm, with a potential for about twenty 600-900 MHz machines. There was a structural genomics meeting last summer at the Environmental Molecular Sciences Laboratory that demonstrated some of its potential in this area. There is considerable optimism that the rate of structure determination using NMR could speed up by as much as an order of magnitude.
  • Effective informatics systems that tie all the pieces together are key.
  • A subcommittee will be formed to hold a small workshop and draft report.
  • Science vs “infrastructure support” should be considered.

Keith Hodgson - comments on recent activities of the six Office of Science Advisory Committee Chairs

  • Informally organized to communicate with one another and discuss issues of common interest – how to raise the visibility of DOE science and the Federal R&D portfolio. Over the last 9 months, have tried to engage scientific societies and have been to Washington, DC twice. It seems like this has begun to be a relatively effective activity. All six chairs have visited the Office of Management and Budget. A focus of these discussions was on metrics of scientific success. The Office of Science has an effective mechanism for managing and reviewing its science portfolio. Discussions also focused on the opportunities for DOE science. Had a Congressional lunch with ~25 House and Senate staffers, which has already generated feedback. The chairs may be asked to testify before the House Science Committee in a couple of weeks.
  • Met with Kevin Kolevar, principal science advisor to Secretary Abraham.
  • DOE as a Science Agency – ranked third in overall basic research spending, ranked first in physical sciences funding, first in user facilities support, third in total research investment.
  • Minimal growth in US natural science and engineering bachelor’s degrees during the past 10+ years, versus growth in many other countries. There has been a rapid increase in submissions to physical sciences journals from other countries, while US submissions have been flat over the last 10+ years. Science PhDs are growing more rapidly in other countries than in the US.
  • There is considerable support for things that the Office of Science does so there is a need and opportunity to continue educational/informational efforts as the new Administration progresses.

No public comment.

Meeting adjourned at 4:45 PM.

 

Wednesday, May 2, 2001

Tom Terwilliger - Structural Biologist, Los Alamos National Laboratory

  • What are the biggest bottlenecks now and in the future in structural genomics – information management (integrated databases); process development lags behind production needs; protein purification (remains a one-by-one operation; parallel systems and large-scale implementation of purification facilities are essential – purification is not difficult in itself, only as a high throughput, automated process given protein-to-protein differences).
  • Walnut Creek Facility very well suited to large scale processes already. Possibility of using this facility as a purification facility plus for other applications.
  • Tomorrow’s bottlenecks – analysis of crystallization results (robots for making crystals but need to inspect automatically, still requires a person); x-ray data collection automation and staffing (synchrotrons are wonderful but need people, full staffing and automation will become critical as more data are collected); analysis (refinement of structures and analysis of structural information is time consuming, automation will become critical); long-term employee motivation to work on high throughput structure determination projects.
  • Barriers throughout the process
  • DOE pilot with Pyrobaculum aerophilum – expression (45% readily expressed in soluble form in 1 pass; 55% did not); crystallization (of the expressed proteins, 55% were readily crystallizable and about half of those gave good crystals – so down to roughly half of half of half, ~12%; x-ray structures were obtained for about half of those). A rough cumulative-yield calculation is sketched after this list.
  • Develop a process for structure determination with a high success rate for suitable proteins and modify the sequences of targeted genes to make them more soluble, e.g., by linking a folding reporter protein covalently to a protein of interest, the folding of the reporter will be enhanced or inhibited by the folding of the protein of interest. Green fluorescent protein (GFP) has been fused to proteins. GFP requires correct folding for the green color to be expressed. There was a good correlation of GFP expression with solubility. Can also do cyclic mutagenesis and fusion to select for newly soluble proteins – a type of directed evolution; examples of highly soluble fusion proteins were shown that started out insoluble. Still working to demonstrate that the wild type protein has the same structure as the mutant protein whose structure was actually solved. Can still test for maintenance or loss of function if you know it. This is amenable to high throughput.
  • Structure solution is not a limiting factor today – though we are not really to the point of automated structure solution yet. There is a separate, NIH-funded collaborative project (PHENIX) to automate structure solution.
  • Facility use today - Half or more of use time is wasted. More data sets are collected than needed. We need to spread automation technology across all beam lines to impact all users. Funds are needed to retrofit beam lines given that there are 20+ beam lines today going to 40+ beam lines in a few years.
  • Computational facilities/capacity are needed. These would enable calculations to be tried that wouldn’t otherwise be attempted. There is a need for real time analysis of collected data sets that would reduce the need for multiple data set collection (with MAD, for example), since solvent information coupled with an initial data set often has all of the information needed for solving a structure without the need for collection of two or more data sets as is currently done.
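As an illustration of how the pilot's stage yields compound, here is a minimal sketch using the approximate fractions quoted in the Pyrobaculum aerophilum bullet above (the stage labels are paraphrases; the pilot's actual accounting may differ).

```python
# Cumulative-yield sketch for the Pyrobaculum aerophilum pilot (illustrative only;
# fractions are the approximate values quoted in the bullet above).
stages = [
    ("soluble expression in one pass", 0.45),
    ("readily crystallizable",         0.55),
    ("good crystals",                  0.50),
    ("x-ray structure obtained",       0.50),
]

cumulative = 1.0
for name, fraction in stages:
    cumulative *= fraction
    print(f"{name:32s} stage {fraction:.0%}  cumulative {cumulative:.1%}")

# Through "good crystals" the cumulative yield is ~12% ("half of half of half"),
# and roughly 6% of the starting targets end up with an x-ray structure.
```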

General Discussion

  • The future look of user facilities will be very different with more technical and engineering staff versus postdocs, graduate students and scientific staff.
  • How much of this will be taken up and improved/marketed by companies? How big is the market for the different pieces?

Public comment

  • What is the government’s role in running biological factories, e.g., protein production facilities? Should these be privatized once they are up and running? Not clear how to make this happen effectively, e.g., isotopic labeling of proteins for NMR was discussed 15 years ago yet it is still done in individual labs.
  • How many of these types of “facilities” will end up being part of “core facilities” at universities that are run on overhead versus direct costs?
  • Question of special case proteins that today are not amenable to high throughput approaches – what fraction of proteins is represented by these cases? How much should we be investing in these special cases? Having lots of structures of the simple things will likely speed the solution of many of the hard problems by providing the structures of the component parts. The ribosome was a crystal quality problem for many years; once high quality crystals were obtained, the structure solution fell out very quickly. Structural genomics approaches would certainly have helped by enabling much more rapid development of crystals.

Adjourned 10:11 AM

BERAC Members in attendance: 

Dr. Gene Bierly, American Geophysical Union

Dr. Claire Fraser, The Institute for Genomic Research

Dr. Ray Gesteland, University of Utah

Dr. Jonathan Greer, Abbott Laboratories

Dr. Dick Hallgren, American Meteorological Society

Dr. Will Harrison, University of Florida

Dr. Fern Hunt, National Institute of Standards and Technology

Dr. Lou Pitelka, Appalachian Laboratory, University of Maryland

Dr. Alan Rabson, National Cancer Institute

Dr. Janet Smith, Purdue University

Dr. Warren Washington, National Center for Atmospheric Research