The Challenge of Ocean Drilling into the Earthquake Source
Frederick M. Chester, Ph.D.
Director, Center for Tectonophysics
David Bullock Harris Chair in Geology
Professor, Department of Geology & Geophysics
Texas A&M University
It is impossible to directly observe the source of a great earthquake because it occurs deep within the Earth's crust, is short-lived, and cannot be predicted. Nonetheless, several long-standing questions about the physics of earthquakes can be answered by rapid-response drilling after an earthquake. The Japan Trench Fast Drilling Project (Expedition 343/343T of the Integrated Ocean Drilling Program) was the first rapid-response expedition in the history of ocean drilling to successfully penetrate an earthquake rupture zone. The drilling target was the source of the March 2011 Mw 9.0 Tohoku-oki earthquake, which produced the giant tsunami that struck the coast of northern Japan. The goals of the expedition included collecting geophysical measurements and rock samples from within the rupture zone, as well as placing instruments across the zone to measure the frictional heat produced during the earthquake. The target was reached at a record-setting depth of nearly eight kilometers below sea level in extremely deep water near the Japan Trench. Dr. Chester will summarize the scientific motivation for the rapid-response expedition, the challenges that were overcome, and the truly unique findings that shed light on the Tohoku-oki event and other tsunamigenic subduction zones worldwide.
Translation of the Cancer Genome
Lynda Chin, M.D.
MD Anderson Institute for Applied Cancer Science
Chair, Genomic Medicine, Division of Cancer Medicine
The University of Texas MD Anderson Cancer Center
Cancer is the phenotypic endpoint of myriad genetic and epigenetic aberrations that collectively commandeer key cancer-relevant pathways. Comprehensive characterization of the cancer genome holds enormous potential to (i) provide penetrating insight into the genetic bases of cancer, (ii) identify promising candidate therapeutic targets and diagnostic biomarkers, and (iii) illuminate the path toward personalized cancer medicine. The Cancer Genome Atlas Research Network (TCGA) has generated a catalogue of somatic genomic abnormalities in glioblastoma multiforme (GBM), including whole genome, exome and RNA sequencing, as well as copy number, transcriptomic and targeted proteomic profiling of over 500 patients. Integrated analyses of this comprehensive reference will lead to new insights and hypotheses that can potentially improve management and outcome. Examples and approaches to translating this type of complex cancer genomic information into biological insights will be discussed.
Keppel Offshore & Marine (Keppel O&M), a Singapore company, has built a significant share of the world’s offshore drilling rigs since 2000: about half of the jack-up rigs and about a third of the semisubmersible rigs.
Established in 1968 as a ship repair yard, the company has grown to be one of the world’s largest offshore and marine groups with a global network of 20 yards and offices, including a construction shipyard in Brownsville, Texas, and business and engineering offices in Houston.
In the process, Keppel O&M has acquired and developed a suite of proprietary designs that have become industry benchmarks. Of the company’s new rig orders, more than 80% are for proprietary designed rigs.
The company has also pioneered specialized innovative rigs such as semisubmersible drilling tenders for niche markets. Through its R&D and engineering centers, Keppel O&M is now developing the next generation of rigs, including harsh-environment semisubmersibles for the North Sea and rigs that can withstand the extreme conditions of the Arctic Ocean.
Mr. Chow Yew Yuen will talk about how Keppel O&M has continually upgraded its competencies and developed its technology expertise to engineer award-winning offshore platforms and build “high-tech heavy metal” for oil and gas exploration and production.
Breakthroughs in Drilling Technology
Marketing and Technology Manager
Schlumberger Drilling Group
The International Energy Agency recently predicted that the United States’ oil production will surpass Saudi Arabia’s by 2020. This dramatic turnaround is based on new technology that has opened oil and natural gas plays in the deepwater offshore and in shale formations on land. Breakthroughs in drilling technology allow the industry to drill wells in three kilometers of water depth, to vertical depths of 12 kilometers, and with horizontal reaches of 11 kilometers. Such extreme wells cost hundreds of millions of dollars to drill and complete. The same drilling technologies have been successfully applied to drill horizontal shale oil and shale gas wells on land that cost only a few million dollars.
Modern drilling technologies can drill complex well trajectories, hit a targeted hydrocarbon zone and remain in it, measure the earth’s properties at the drill bit, and transmit information to the surface in real time. For example, sophisticated, autonomous instruments built into drill collars utilize electromagnetic fields, gamma-ray scattering, neutron scattering, nuclear spectroscopy, magnetic resonance, pressure sensors, and ultrasonic, sonic, and seismic measurements. These instruments must function accurately in the hostile drilling environment: at high temperatures, high pressures, and high shock levels.
Challenges of Petroleum Exploration in the Deepwater Gulf of Mexico
Senior Research Scientist
Department of Earth, Atmospheric and Planetary Sciences
Massachusetts Institute of Technology
Until the recent developments that have significantly improved our ability to extract gas and oil from tight rocks onshore, the Deepwater Gulf of Mexico was the largest and most promising region for increasing domestic petroleum production. Exploring for, and developing, the resources in this region presents enormous technical challenges if petroleum is to continue to be found and economically extracted while limiting impacts on the environment and worker safety. Geophysical methods such as seismic, electromagnetic, and gravity surveys provide the majority of the quantitative information about the subsurface used to identify the locations of resources. Many of these methods had undergone steady evolution through applications in other regions. However, significant improvements were required for the Deepwater Gulf of Mexico because of its complex geology, which includes large salt bodies that act as seals for reservoirs but had previously been opaque to geophysical signals. Geophysicists rose to the task, and exploration methodologies are greatly improved over what they were even five years ago. Techniques such as full-waveform inversion, 4D seismic, joint imaging using multiple geophysical techniques, and advances in acquisition have all played a role. Advanced simulation has also contributed by providing benchmark data for realistically complex earth models that can be used to validate and improve new methods. Simulations that were previously considered too large to conduct, and that generated datasets once thought too large to manage, are now feasible, particularly when done collaboratively in consortia such as the SEG Advanced Modeling Corporation (SEAM) projects. Improvements made using these new methods are now being applied both onshore and in other deepwater fields, such as those offshore Brazil and Africa.
The Deepwater Horizon Oil Spill NRDA: The Good, the Bad, and the Ugly
Robert Haddad, Ph.D.
Division Chief, Assessment and Restoration Division
NOAA’s Office of Response and Restoration
On April 20, 2010, an explosion and fire on the mobile offshore drilling unit Deepwater Horizon, which was being used to drill a well for BP in the Macondo prospect (Mississippi Canyon 252 – MC252), killed 11 men and injured 17 others. The rig sank and left the well spewing tens of thousands of barrels of oil per day into the Gulf of Mexico. It is estimated that five million barrels (210 million gallons) of oil were released from the Macondo wellhead from a depth of approximately 5,000 ft. beneath the surface of the ocean. Of that, approximately 4.1 million barrels (172 million gallons) of oil were released directly into the Gulf of Mexico over nearly three months. In battling the spill, unprecedented volumes of dispersants were used at the surface (1.7 million gallons) and at depth (771,000 gallons).
Under the Oil Pollution Act of 1990 (OPA), those responsible for an oil spill are financially responsible for a variety of costs, including spill cleanup costs, increased costs of public services related to the spill, property damage related to the spill, compensation for public and private economic losses, and restoration of injured natural resources. While the U.S. Coast Guard is directing federal efforts to contain and clean up the Deepwater Horizon oil spill, state and federal natural resource trustees are responsible for leading efforts to assess damage to natural resources and to restore those injured resources to the condition they would have been in had the spill not occurred.
The Natural Resource Damage Assessment (NRDA) regulations, under OPA, designate federal, state and tribal natural resource trustees to conduct NRDAs on behalf of the public. Ultimately, the trustees have a mandate to restore, rehabilitate, replace, or acquire the equivalent of the damaged natural resources. To meet this mandate, the trustees seek to restore injured resources and services to baseline (the condition they would have been in had the spill not occurred) and to compensate the public for interim losses (i.e., the losses that occur during the time it takes the resources to recover to baseline).
The Deepwater Horizon NRDA, given its geographic size, multi-dimensional nature and ecological complexity, may continue for years. State and federal trustees are working together to determine how the oil spill affected the Gulf of Mexico’s natural resources and the human use of those natural resources. With potential natural resource injury spanning five states and their waters, as well as federal waters, this is the largest damage assessment ever undertaken.
The natural resource trustees for this case include the National Oceanic and Atmospheric Administration (NOAA), the Department of the Interior (DOI), the Department of Agriculture (USDA), and the Environmental Protection Agency (USEPA) from the federal government and designated agencies within each of the five affected Gulf States: Florida, Alabama, Mississippi, Louisiana, and Texas. Immediately after the oil spill started, trustee scientists trained in emergency response to marine pollution mobilized and began collecting environmental information to quantify baseline conditions and potential impacts. In accordance with the OPA regulations, all potential responsible parties were invited to participate in the Deepwater Horizon NRDA. BP is the only responsible party participating in the cooperative NRDA process.
Dr. Haddad’s presentation will provide a background on the basics of NRDA from both a regulatory-framework and a real-world policy perspective. He will define NRDA, explain where it comes from and how it works, and describe its major challenges. He will then provide an overview of the NRDA framework being used to identify and quantify impacts to the Gulf of Mexico ecosystem caused by the Deepwater Horizon oil spill and to guide the development of a comprehensive restoration plan. Because of the challenges in quantifying ecosystem-level impacts at the level of causality (as required under OPA), this approach generally begins with quantifying impacts at either the resource (e.g., birds) or habitat (e.g., nearshore marsh) level. Recognizing that resources and habitats are interconnected through ecosystem services, he will briefly touch on the type of framework that will be required to examine and evaluate how these impacts combine to produce adverse effects throughout the larger Gulf of Mexico ecosystem.
The range of dull, dirty, and dangerous tasks carried out by unmanned systems continues to expand year by year as militaries and industries invest substantially in their development. The most famous challenge for unmanned systems is that of autonomy—detecting and handling changing situations capably as a “thinking human” would.
For ocean operations in particular, a more ordinary challenge has proven to be a more fundamental limitation: energy. Ocean distances are vast, and locomotion is highly energy-intensive. For small vehicles in particular, the need to carry fuel severely limits both functionality and range. Most unmanned marine vehicles (UMVs) thus require a manned ship for deployment and support, eliminating much of the desired benefit. Originally inspired by a desire to monitor humpback whale song, the Liquid Robotics Wave Glider® enables long-term, unmanned operations by continuously harvesting energy from waves and the sun. It does so at the ocean surface, an area that covers two-thirds of our planet. This is enabling rapid development of an unexpected range of new applications and solutions for ocean data collection, monitoring, and communications, supporting the needs of the ocean scientific, environmental, energy production, food production, and security sectors.
Understanding Neurodevelopmental Disorders and Autism: From Basic Neuroscience Research to Clinical Trials
Kimberly M. Huber, Ph.D.
Associate Professor of Neuroscience
The University of Texas Southwestern Medical Center
Fragile X Syndrome is the most common inherited cause of mental retardation and autism and is caused by loss of function in a single gene, called FMR1. Currently there are multiple clinical trials in Fragile X Syndrome and autism patients using therapeutics that block a neurotransmitter receptor called mGluR5 (metabotropic glutamate receptor 5). Early results in patients and animal models suggest that mGluR5 blockers are a promising therapeutic for Fragile X Syndrome. Dr. Huber was the first to discover that mGluR5 is overactive in a mouse model of Fragile X Syndrome. She will discuss the history of this discovery and how it originated from basic research questions about normal mGluR5 function in the brain. She will also discuss her latest research, which provides an understanding of how a defective FMR1 gene and other autism-linked genes lead to abnormal mGluR5 function.
Mechanism of Rapid Antidepressant Action
Lisa Monteggia, Ph.D.
Associate Professor of Psychiatry
The University of Texas Southwestern Medical Center
Clinical studies consistently demonstrate that a single sub-psychomimetic dose of ketamine, an ionotropic glutamatergic N-methyl-D-aspartate receptor (NMDAR) antagonist, produces fast-acting antidepressant responses in patients suffering from major depressive disorder (MDD), although the underlying mechanism is unclear. Depressed patients report alleviation of MDD symptoms within two hours of a single low-dose intravenous infusion of ketamine, with effects lasting up to two weeks, unlike traditional antidepressants (e.g., serotonin reuptake inhibitors), which take weeks to reach efficacy. This delay is a major drawback of current MDD therapies, leaving a need for faster-acting antidepressants, particularly for patients at risk of suicide.
Ketamine’s ability to produce rapidly acting, long-lasting antidepressant responses in depressed patients provides a unique opportunity to investigate underlying cellular mechanisms. Using a variety of techniques, we examined the ability of ketamine, and other NMDAR antagonists, to trigger a rapid antidepressant response. We find that ketamine, as well as other NMDAR antagonists, produce fast-acting behavioral antidepressant-like effects in mouse models that are dependent on rapid synthesis of brain-derived neurotrophic factor (BDNF). Furthermore, we find that mice lacking BDNF or trkB receptors that mediate BDNF signaling do not respond to ketamine treatment. Our findings suggest that BDNF synthesis regulation by NMDA receptors may serve as a viable therapeutic target for fast-acting antidepressant development.
Development of Spinal Cord Connections That Underlie Movement
Samuel L. Pfaff, Ph.D.
Howard Hughes Medical Institute
Gene Expression Laboratory
Benjamin H. Lewis Chair in Neuroscience
The Salk Institute for Biological Studies
The spinal cord contains much of the neural circuitry that controls simple movements, but the molecular cues that specify the pattern of neural connections remain poorly understood. The basic framework for neuronal connections is set during embryonic development, using guided axonal growth to form the underlying circuits for complex behaviors. Axon navigation is regulated by a relatively small number of guidance receptors and extracellular signals, yet it occurs with remarkable precision and generates immensely complex wiring patterns. The integration of guidance information to create interconnected signaling pathways is an important layer of the fidelity-control system that ensures precision in neuronal wiring. In this talk I will present our recent findings on the genetic pathways that control how spinal motor neurons extend axons to different muscles in the periphery.
Stem Cells Behaving Badly: Probing the Origins of Childhood Brain Tumors
Robert Wechsler-Reya, Ph.D.
Director of the Tumor Development Program
Sanford-Burnham Medical Research Institute
Stem cells have the remarkable ability to renew themselves and to generate multiple different cell types. This allows them to generate normal tissues during development and to repair tissues following injury, but at the same time, renders them highly susceptible to mutations that can result in cancer. Only by understanding the signals that control growth and differentiation of stem cells can we learn to harness their regenerative capacity and restrain their malignant potential. Our research focuses on the role of progenitors and stem cells in development of the cerebellum, and on the ability of these cells to give rise to medulloblastoma, the most common malignant brain tumor in children. This lecture will discuss our recent studies using cerebellar stem cells to model the most aggressive subtype of medulloblastoma and to test novel therapies for this disease.
Neurobiology of Rett Syndrome and Related Neuropsychiatric Disorders
Huda Y. Zoghbi, M.D.
Director, Jan and Dan Duncan Neurological Research Institute, Texas Children’s Hospital
Professor of Pediatrics, Molecular and Human Genetics, Neurology, and Neuroscience, Baylor College of Medicine
Investigator, Howard Hughes Medical Institute
Rett syndrome is a progressive neuropsychiatric disorder that is characterized by apparently normal development up to 12-24 months of age followed by loss of acquired skills (e.g., language and hand use) and the development of a plethora of neurobehavioral abnormalities including cognitive and social deficits, motor dysfunction and Parkinsonian features, stereotyped behaviors, anxiety and a variety of psychiatric symptoms, and autonomic dysfunction.
Rett syndrome is caused by loss-of-function mutations in the X-linked gene encoding Methyl-CpG-binding protein 2 (MeCP2). Curiously, doubling MeCP2 levels also causes Rett-like phenotypes and premature lethality in males, whereas females manifest psychiatric symptoms. Mouse models, either lacking or over-expressing MeCP2, reproduce the features of Rett and the MeCP2 duplication syndrome, respectively. Furthermore, mouse models carrying various mutant MeCP2 alleles have demonstrated that the brain is exquisitely sensitive to the levels of MeCP2 and that the loss and gain of the protein produce inverse effects on synapse number and synaptic function. Studying the role of the protein in various neuronal subtypes is revealing that MeCP2 is critical for the normal function of all the neuronal types tested. Moreover, these studies are providing insight into the contributions of various neurons and neuronal networks to various neuropsychiatric features and physiological behaviors.