Wednesday, October 17, 2007

Global warming is NOT the end of the world

Hoax For Sale: The Green Revolution That Never Happened Is About To Not Happen Again.

Who best fits the following description: Wealthy white American male born to political royalty. Claims to have unique insight that reduces complex issues inherent to the modern world to simple action items that fit his personal and political agenda. Relentlessly pushes his ideas in every possible public forum, but remains secure in the knowledge that he will never have to make any of the personal sacrifices necessary to make his vision a reality. Sound like anyone we know? Of course George W. Bush immediately comes to mind. But this description could just as easily be applied to Al Gore.

With the awarding of the Peace Prize to Al Gore et al., an act of political correctness so grotesque that the Prize may never fully recover, Mr. Nobel’s creation has reached a new low. Final proof, if such proof were required, that in the age of the image the medium has finally become the entire message. The worst part of this fiasco is that Mr. Gore’s warm fuzzy message will set the stage for a second great green revolution hoax. And this time humanity may not have another century to get it right.

Global warming will undoubtedly change the face of our planet (far from an anomaly in geological history)… it is not the bang or the whimper upon which our world will end. And even if turning up Earth’s thermostat were about to terminate both humans and polar bears, ‘biofuels’, the most popular technical solution being offered to the public, would be worse than a red herring. The biofuel solution wraps itself in the glittering cloak of ‘sustainability’ so that (to quote another southern politician) it shines and stinks like a rotten mackerel by moonlight.

But here’s the hard truth… biofuels are not sustainable, they are not green, and they most certainly will not stop global warming. The public is being shamelessly scammed by the same old crew. The only difference is that this time they have some high-profile but seriously misguided liberals out front to shill for them. To paraphrase the old Clinton campaign mantra… it’s the Military-Industrial Complex, stupid!

In order to understand why biofuels are not a sustainable technology, it is necessary to recognize that the term ‘Green’, when used in conjunction with biofuels, has nothing, repeat nothing, to do with restoring balance to an ecosystem. The ecology movement’s use of the term ‘Green’ ultimately derives from the color of most terrestrial plants. Plants and microbes such as phytoplankton are natural sources of energy because they are capable of conducting photosynthesis. During photosynthesis, plants convert solar photons – the cleanest energy source of all – into sugar: packets of light energy are captured and transmuted into chemical bond energy. Quite a trick to be sure! The green machine works something like this:

Inputs = sunlight + carbon dioxide + water
Outputs = carbohydrates (sugars, cellulose, etc.) + oxygen
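Written as the standard balanced equation of photosynthesis, the same bookkeeping reads:

```latex
6\,\mathrm{CO_2} + 6\,\mathrm{H_2O} + \text{light energy} \;\longrightarrow\; \mathrm{C_6H_{12}O_6} + 6\,\mathrm{O_2}
```

Six molecules of carbon dioxide and six of water, powered by photons, yield one molecule of glucose plus six molecules of oxygen.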

Rich in carbohydrate-based chemical energy, photosynthetic biomass fuels the rest of biological life on the planet. So if plants can fuel humans, why can’t they fuel our cars and planes? The answer, of course, is that it takes far more than sunlight to grow a crop. Modern agriculture is a form of manufacturing that is driven by the intensive application of fertilizers, pesticides, groundwater, soil cultivation and more. All of these inputs have one thing in common… they require energy! Growing corn to make ethanol is based on the same logic (and agricultural ecology) that rationalizes feeding ten pounds of balanced vegetable protein to an animal in order to produce one pound of balanced animal protein to feed a human. But how is this pseudologic perpetuated?
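To see why those energy inputs matter, here is a toy energy-balance (EROEI) calculation. Every number below is an assumption invented for illustration; real corn-ethanol energy budgets are hotly debated, but the structure of the argument is the same:

```python
# Toy energy-balance sketch for corn ethanol (illustrative numbers only;
# every figure below is an assumption, not a measurement).
def net_energy_ratio(energy_out, energy_inputs):
    """Energy returned per unit of energy invested (EROEI)."""
    return energy_out / sum(energy_inputs.values())

# Hypothetical per-acre energy inputs, in arbitrary units:
inputs = {
    "fertilizer": 40,
    "pesticides": 10,
    "irrigation": 15,
    "cultivation_and_harvest": 20,
    "distillation": 35,
}
ethanol_energy = 130  # hypothetical energy content of the ethanol produced

ratio = net_energy_ratio(ethanol_energy, inputs)
print(f"EROEI: {ratio:.2f}")  # barely above 1: most of the input energy just passes through
```

When the ratio hovers near 1, the "green" fuel is mostly recycled fossil energy wearing a corn husk.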

In the age of semiotics ‘branding’ is king, and no brand name in science has more recognition and credibility than the Nobel Prize. So, when Norman Borlaug, Ph.D. won the Nobel Peace Prize in 1970 as the father of the ‘Green Revolution’, the public assumed that a Green Revolution had, in fact, occurred. And therein lies the first hoax. Norman Borlaug won the Nobel Prize for his work as a plant breeder. But, in fact, his main contribution was the development of crop plants that required enormous amounts of energy. Ergo, the first Green Revolution was an exercise in reverse sustainability!

Wheat and corn are the staple food crops of the western world. Until Dr. Borlaug came along, the yield per acre of wheat was seriously limited by the fact that these plants had tall, thin stalks. When farmers tried to apply too much fertilizer to a field of wheat, the spindly-stemmed plants would collapse under the weight of all that extra grain on top—a trait called lodging. To prevent lodging, Borlaug crossed in a Japanese dwarf variety of wheat (called Norin 10) and ultimately created shorter, stronger stalks that could better support larger seed heads. As a result, farmers could pour on the enormous amount of fertilizer necessary to create big-headed plants and dramatic increases in yield were achieved. Borlaug went on to breed in additional traits such as disease resistance. He deserves great credit for helping create the modern mechanized agricultural system of the mid-20th century that allowed developing countries to feed millions who otherwise would have starved.

But was this a Green Revolution? Dr. Norman Borlaug was a product of his time, and the mandate for agronomists after World War 2 was: increase yield per acre. More bushels of wheat and rice, more ears of corn. The explicit operative assumption was that energy was not limiting… any sustainability alarms going off yet? One could fertilize at will. There was no shortage of fuel to run the tractor back and forth across the field as often as necessary to apply pesticides or cultivate the soil. If the climate was arid, it was assumed that fuel was plentiful to pump water out of the ground or ship it in via artificial aqueducts (to be constructed as necessary by more fuel-gobbling heavy equipment). In terms of food production, the results were spectacularly successful. But the first ‘Green Revolution’ had absolutely nothing to do with sustainability.

Much of Borlaug’s seminal work was done when he was a Program Director at the International Maize and Wheat Improvement Center (CIMMYT, pronounced simiyat). CIMMYT was established by the Rockefeller and Ford Foundations in ‘cooperation’ with the Mexican government. I mention this because the Nobel Prize was extremely important to successfully branding the first Green Revolution. Now the Rockefellers and Fords of the 21st century, variants on the people who brought us mechanized monoculture (and more recently GMOs), want to sell us a second Green Revolution with the brand name Biofuels. It’s a snappy moniker for using energy-intensive mechanized agriculture to create a product that, when introduced into an internal combustion engine, will still pour CO2 into the atmosphere. That’s a real inconvenient truth.

Sunday, April 22, 2007

Cassandra And The Bell Curve: The Chaotic Prophesy of Genomic Medicine

When I was a child in the 1950s, we did not know what caused cancer. Popular theories ranged from the medical to pure folklore. Some people held to the idea that cancer was a disease carried by a virus or bacteria. Some believed that cancer entered our bodies from the environment; by drinking polluted water, by breathing fallout from nuclear bomb tests, or from eating red meat. Assuming that cancer was inherited, others kept a close eye on the fate of parents and relatives. But mostly, cancer appeared to be caused by bad luck. There was no pattern or logic. We all knew that obese people were candidates for heart attacks and that old people had strokes or became palsied. Cancer, on the other hand, appeared almost whimsical; like the God of quantum mechanics, it played dice with the universe to choose its victims and even its symptoms. A child here, an athlete there; different organs, different symptoms, different prognoses.

On April 4, 1971, two years after America put a man on the moon, President Richard Nixon declared war on cancer. Thirty-three years and hundreds of billions of dollars later, cancer remains our deadly and implacable foe. This disease has more than held its own against the greatest medical armada ever assembled. Given this reality, it is reasonable to suppose that cancer has a secret weapon… and it does. Cancer is not a disease.

Fast forward to your doctor’s office in the not-too-distant future, say 2010. You have recently been diagnosed with Stage III melanoma, which means that the cancer has metastasized throughout your body. Just six years ago, in 2004, the choice of treatment would have been based on the type of primary cancer, the size and location of the metastasis, your age and general health, and your treatment history. But that was back in 2004, before the development of molecular diagnostics, the cutting edge of personalized medicine. In 2010, molecular diagnostics will mean gene scans conducted with the latest DNA microarray technology. Introduced commercially in the late 20th century, these gene microarrays or chips are capable of simultaneous and instantaneous analysis of expression patterns composed of thousands of individual genes. Every type of cell has a unique gene expression pattern or profile. If a cell becomes cancerous, this profile will change. Your Stage III melanoma cells display a schizoid gene expression pattern reflecting both their skin cell heritage and their newly-acquired outlaw metabolism.

In the age of personalized medicine the doctor must explain that, while your cancer has a great deal in common with other Stage III melanomas, cells from your body can never be exactly like any other. Your doctor knows this because for the past decade DNA from melanoma cells has been routinely extracted and scanned with arrays. In 2010 this accumulated gene-scan data, once restricted to research, has been gathered and deposited in a National Medical Genomics Database administered by the Department of Health and Human Services.

Like any technology with a truly global market, genomics is driven by a law of inverse trivialization. This law dictates that, the more useful a technique the more rapidly it is simplified and packaged as a foolproof product. Take the methods developed in the 1970s for producing recombinant DNA molecules. Within 10 years this set of Nobel Prize-winning techniques had been packaged into prefab kits suitable for a high school biology exercise. In 2010, similar marketplace compression and simplification will be on display when you visit a clinical molecular diagnostics laboratory. On an ordinary bench, in a nondescript room the intellectual fruits of an entire scientific revolution will be on display, distilled into a delivery system the size of a small home entertainment unit. The need to take a blood sample will provide the lone, almost ephemeral, tether between the parallel worlds of human biology and computerized bioinformatics.

The molecular diagnostics clinic of 2010 is filled with a rainbow of genomic paraphernalia. Diaphanous pink microtubes sit in bubble packs like sets of false nails. Red motorized pipettes hang in translucent blue racks like designer tool kits from the Starship Enterprise. The shelves are filled with what appear to be large family-sized cereal boxes with very slick, very bright labeling. These boxes contain individually packaged, ready-to-use molecular diagnostic kits with exciting brand names; The DNA Warrior, Mighty Clone, or The Gene Catcher. There is a brief atavistic moment when a finger stick is required to obtain your blood. This small drop of your body’s fluid is injected into the DNA Warrior, which is a sponge-filled cylindrical cassette the size of a ballpoint pen cap. This cassette is slotted into the Sherlock Genomes molecular diagnostics system. From the outside, this ‘system’ appears considerably less complex than a Y2K-vintage message-fax machine. In the DNA Warrior cassette slotted into Sherlock Genomes, melanoma cells are purified from your blood via solid phase immunoaffinity chromatography. Twenty years ago cell sorting by immunoaffinity required a million dollar instrument the size of a 727 cockpit and a dedicated operator. Now this procedure is little more than routine blood-work.

Once purified, a few hundred cells are moved via electroosmotic microfluidic channels to a lab chip that, in another venue, could pass for a credit card. A microfluidic channel is a piece of plumbing the size of a human hair that empties into a reaction chamber no larger than the head of a pin. On the lab chip, DNA is rapidly purified and chemically tagged with fluorescent labels. The purified, labeled DNA is transferred to Sherlock’s microarray scanner and a gene expression profile is generated within minutes. The scanner module is no larger than the PDA of a few years before. In 2010, the entire molecular diagnostics procedure will appear only slightly more challenging than adjusting the chlorine level in a swimming pool.

Your gene scan is automatically converted to a standard electronic file format no larger than a high-quality image from a digital camera. Within 15 minutes of your arrival, your cancer is undergoing statistical analysis and classification within the melanoma sub-library of the National Medical Genomics Database. Molecular diagnostics involves matching the genetic profile of your cancer to a target profile derived from thousands of other melanomas as well as a baseline profile derived from healthy skin tissue. CPU time for completion of your analysis, including conversion to a user-friendly graphical spreadsheet, is less than a minute. In under five minutes, your molecular diagnostics are emailed to your doctor… and your insurance company. Of course, the data were recycled into the National Medical Genomics Database as supplementary raw data before either of these events occurred. You gave up all future rights to this rogue component of your genetic code when you signed the consent-for-treatment form; but that is another story. Based on these data, a personalized therapeutic regime is created and placed in your file. In addition to classic chemotherapy and radiation therapy, a new biopharmaceutical, manufactured by the Swiss company XtremeGen, is prescribed. Your doctor, trained to use every tool in her arsenal, schedules a complete series of treatments to begin within 24 hours. But before treatment can be initiated, a red flag goes up. Your data, like your cancer, has taken on a life of its own.
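Under the hood, the profile-matching step described above amounts to comparing the patient's expression vector against a library of reference profiles and picking the best fit. A minimal sketch, assuming correlation as the similarity measure; the cell-type labels and all numbers are invented for illustration:

```python
# Minimal sketch of expression-profile matching (hypothetical data and names).
# A real system would use thousands of genes and validated statistics.
from math import sqrt

def pearson(x, y):
    """Pearson correlation between two equal-length expression vectors."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical reference profiles for three melanoma cell types (per-gene values):
references = {
    "MM01": [1.0, 0.2, 3.1, 0.9],
    "MM07": [0.1, 2.8, 0.3, 1.5],
    "MM14": [2.9, 0.4, 0.2, 3.0],
}
patient = [3.0, 0.5, 0.1, 2.8]  # the patient's scanned profile

# Classify the patient's cancer as the reference type it correlates with best.
best = max(references, key=lambda k: pearson(patient, references[k]))
print(best)  # → MM14
```

The real statistical machinery is far richer, but the core operation, nearest-profile classification against a shared database, is exactly this shape.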

By the time you arrive at the treatment center the following day, your insurance company has re-scanned your gene scan using its own software. More specifically, it has drawn its own actuarial conclusions from the microarray data. The National Medical Genomics Database groups Stage III melanoma cells into 17 distinct types based on the variation in 273 genes whose expression differs from healthy controls. Clinical studies have demonstrated that only 24% of patients with cell type MM14, your cell type, show a significant clinical response to XtremeGen’s product. The bottom line is that your insurance company has declined to cover this part of your war on cancer.

The true and deceptively complex nature of cancer was just one of the many surprises unveiled along the path to a promised golden age of biotechnology. In the 1940s, many biochemists thought DNA was far too simple to contain our genetic code. After all, there are only 4 bases; A, T, G, and C. How could the novel of life be written with such a limited molecular alphabet? Proteins, with 20 unique letters, seemed much more likely to be the language of life. Ultimately the elegant simplicity of a DNA genome was established but the surprises continued. In the 1970s, when nucleic acid sequencing methods became available, we were shocked and alarmed to find out that individual genes did not exist as discrete units but were fragmented; sometimes into hundreds of pieces. Before a gene could carry out its function, these pieces had to be assembled into a single molecule with a precision and complexity that had previously been unimaginable.

The Human Genome Project (HGP) recently handed biologists the equivalent of severe reverse sticker shock. The bottom line was far too low. In fact, when all the parts had been assembled our genetic vehicle appeared to be hopelessly underpowered. Until the HGP, it was commonly assumed that our chromosomes carried between three hundred thousand and three million genes. Yet, when the tally was complete we learned that human beings, the most complex of biology’s children, only have around thirty thousand genes. This was considered especially humiliating because the humble bacterium E. coli has over three thousand genes. Could creatures with the sophisticated behaviors necessary to put a man on the moon or elect Arnold Schwarzenegger Governor of California really have only ten times the gene-power of a one-celled, one-trick creature whose repertoire consists of endless, boring cycles of self-duplication?

The answer is that extreme complexity can be generated by combinations of even the simplest signals. Computers are the ultimate example. Millions of tiny transistors provide the fundamental units for data storage on a computer chip. An individual transistor either passes current or it doesn’t; off or on states more commonly designated as 0 or 1. Linear strings composed of these two digits can create messages of extreme complexity. The letter A is represented in one computer ‘language’ by the unique string of binary digits 11000001. There are, in fact, exactly 256 ways to arrange 0 and 1 in a linear array eight numbers long. Throughout 4 billion years of evolution, biology has perfected the art of generating complexity from simple subunits. This explains the apparent disconnect between the genetic and behavioral complexities of H. sapiens and E. coli. Returning to the two states of the transistor, an eight letter word can be written 2^8, or 256, ways. When we increase the number of states by one order of magnitude, from 2 to 20, we can write 20^8, or 25,600,000,000, unique eight letter words. Genetic words are much longer than eight letters. So, with only ten times the genes, our metabolic vocabulary can easily be billions of times more complex than E. coli’s.
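The counting argument in the paragraph above is easy to verify directly:

```python
# Number of distinct "words" of length 8 over alphabets of different sizes.
binary_words = 2 ** 8     # words over a 2-letter alphabet (transistor states 0/1)
protein_words = 20 ** 8   # words over a 20-letter alphabet (amino acids)

print(binary_words)       # → 256
print(protein_words)      # → 25600000000
```

Moving from 2 symbols to 20 multiplies the eight-letter vocabulary a hundred-million-fold, which is the whole point: small increases in alphabet size (or gene count) buy enormous increases in combinatorial capacity.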

There are a trillion cells in the human body. With minor exceptions, every cell contains a full set of chromosomes with all 30,000 genes. Molecular biology has taught us that each gene has an essential role in the living process. Liver cells, for example, express liver-specific genes while brain cells express brain-specific genes. More correctly, stem cells in the human embryo develop into liver cells via a unique tissue-specific pattern of gene expression. Even in E. coli, different genes are expressed depending on its travels through its microbial world. Passing through a body, E. coli tunes its metabolic palette to enjoy our luxurious intestinal buffet. Different genes are required to find dinner in the soil or in pond water. A few genes, those that encode life’s most basic functions, are always on. The relationship between gene expression and cell function is somewhat analogous to that old saying about politics. In biology we use all of our genes some of the time, some of our genes all of the time, but no cell ever uses all of its genes at the same time. Tracking the permutations and combinations of gene expression throughout the life of a cell is the mission of the new and profoundly powerful branch of biotechnology called functional genomics.

To enter the world of functional genomics is to see our chromosomal machinery in action; a symphony that requires thousands of instruments. The performance is a superposition of classical and jazz, simultaneously structured and improvisational. Most importantly, this molecular music can only be properly understood when performed by the entire genomic orchestra. Discovery of the double helix unleashed the power of genetic engineering, but for the remainder of the 20th century genes were mostly cloned and studied one at a time. This approach brought insight, much as watching a lone musician perform on stage would demonstrate the workings of a clarinet or a violin. When enough musicians had performed we would have a fairly complete catalog of the types of instruments; woodwinds, strings, percussion. We would understand the materials from which they were made, their mechanical function, and their tonal range. But even if we recorded each instrument’s full performance and mixed them in a studio there is virtually no chance that, without having heard the original, we could accurately re-create, much less appreciate or coherently improvise on the symphony. Functional genomics may be likened to a DVD system capable of recording and playing back the complete fusion symphony of gene expression. This symphony, in turn, is the first act of the performance art piece called the living cell.

To see a functional genomics experiment, start by lying flat on your back in the middle of a room the size of a basketball court. The room is totally dark, not a single stray photon. Suddenly, an unseen hand throws a switch and tens of thousands of individual floodlights simultaneously illuminate the ceiling. The circular lights are arrayed in a perfect rectilinear grid of rows and columns, like a symmetric bed of electric tulips planted by a robot gardener. Each spot is one of three colors: red, green, or yellow. Within this grid there is no obvious color pattern; rather, the spots of light form an impressionistic mosaic. If you could count each one, there would be thirty thousand.

Micro-scale variations of this performance occur thousands of times a day, every day, inside the chambers of machines built to scan our genome. Each chamber is, in fact, only a few centimeters across. Each circular floodlight, in fact, a painted spot of fluorescent DNA dried down from the millionth part of a liter. Thirty thousand drops, thirty thousand individual microliter spots, each placed by a precision robot arm that ends in a tiny needle. The point of the needle is dipped into a DNA solution and then touched to a precise location on a thirty thousand point grid, leaving behind a drop of exactly one microliter. The drop dries to a spot, painting a DNA circle in a discrete region of space. The robot arm is washed, dips into another DNA solution containing a different gene, and returns to paint another micro-spot at a different, exact location. When all thirty thousand grid points have been painted we have a DNA microarray on a piece of glass smaller than a microscope slide. The illumination seen from below is caused by a short laser burst sequentially targeting and lighting up each of the thirty thousand spots. The laser scans and fires so rapidly that the entire grid shines out like a work of pointillist art or a visual fluorescent symphony.

A microarray chip representing the entire human genome will have around 30,000 unique spots. It is not necessary to understand the biochemistry of nucleic acids to appreciate the revolutionary potential of array technology. Rather, consider each gene-spot on the array surface as equivalent to a light-emitting phosphor dot, or pixel, found on the inside of a television screen. An extract of gene expression products from a specific type of cell is washed across the surface of the array. Expression products from this extract stick to their matching gene-spots on the array but not to spots representing unexpressed genes. Wherever they adhere, expression products alter the color-generating properties of that gene pixel. When the laser sweeps across the array, a novel pattern of colored spots is generated. To prepare the array for the next diagnosis, adherent material is removed by washing the surface with a powerful detergent solution. Robotic automation brings the additional aspects of high speed and high throughput to array imaging. Functional genomics data may be viewed as a series of ever-changing pixelated images. Call it genomics television; GTV. With intellectual sponsorship from the Human Genome Project and major biopharmaceutical companies, the GTV Network already has several big commercial projects in development. A pilot called Personalized Medicine is being test-marketed right now. But there may be a problem with truth in advertising.
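For the curious, the red/green/yellow spots described earlier follow the standard two-channel array convention: red for genes expressed more strongly in the sample than in the reference, green for less strongly, yellow for roughly equal. A minimal sketch of that mapping, where the two-fold change threshold is an arbitrary assumption:

```python
from math import log2

def spot_color(sample_signal, reference_signal, threshold=1.0):
    """Classify one gene-spot the way a two-channel array image is rendered:
    red = expression up vs. reference, green = down, yellow = roughly equal.
    The log-ratio threshold (a two-fold change) is an arbitrary assumption."""
    ratio = log2(sample_signal / reference_signal)
    if ratio > threshold:
        return "red"
    if ratio < -threshold:
        return "green"
    return "yellow"

print(spot_color(8.0, 1.0))   # strongly up → red
print(spot_color(1.0, 8.0))   # strongly down → green
print(spot_color(1.1, 1.0))   # essentially unchanged → yellow
```

Thirty thousand such decisions, one per spot, produce the pointillist mosaic on the ceiling.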

From robotic DNA extraction, to microfluidic lab chips, to targeting genes with lasers, functional genomics could not appear more techno-sexy. But the output of all this high-end instrumentation is an ocean of numerical data that can only be navigated by pure brute-force statistical number crunching. The race to sequence the human genome ended in a tie between a private company and a government consortium. But even as our chromosomes were being assembled in cyberspace, scientists were fully aware that elucidation of the complete genetic code was only the beginning. In order to avoid being thrown off of biomedical island, genome scientists must find a way to package and deliver all that data in a way that impacts the quality and length of our lives, i.e. manufacture products that we will buy.

As a result, the genome revolution has entered a phase we might call ‘Cassandra on a bell curve’. Over the next decade, the tools of functional genomics will allow us to predict both the metabolic state and the ultimate fate of cells and tissues with increasing precision. On a long enough timeline, this means a new arsenal of weapons for, among other things, the war on cancer. The promised golden age of biopharmaceuticals. But if every new discovery is an incremental victory for medical science, ‘Cassandra on a bell curve’ is a potential nightmare for the logistics of healthcare delivery in a free market society. Put simply, functional genomics will provide diagnostic tools long before biotechnology provides efficacious cures. The magic spreadsheet far in advance of the magic bullet.

This reality has not been lost on the biotechnology industry. In 2000 Celera Genomics made history as the private company that forced an international consortium of developed nations to share the glory of sequencing the human genome. Celera still markets the intellectual property created by this accomplishment, but the heavyweight champion of DNA sequencing is now vigorously pursuing a career in the ring of molecular diagnostics. Celera Diagnostics has focused its discovery efforts on “identifying genetic variations associated with common, complex diseases” and is “working to develop new diagnostic products and to improve human health through an approach we call Targeted Medicine.”
In theory targeted medicine (a.k.a. personalized medicine) sounds awesome, and when it is perfected most of us will want it. Unfortunately, prior to bearing fruit, the garden of functional genomics will require relentless weeding. Consider recent progress in the molecular diagnostics of breast cancer.

Breast cancer patients with the same stage of disease can have markedly different treatment responses. In practical terms this means that no woman with breast cancer, even from the same demographic, has exactly the same illness as any other. Each woman’s cancer has its own unique genotype. Currently, conventional medical treatment with chemotherapy can reduce the risk of metastases by approximately one-third. However, clinical data also show that 70-80% of patients receiving chemotherapy do not, in fact, benefit from it. Put simply, at least 7 of every 10 patients endure chemotherapy for nothing. The agonizing current dilemma for doctors and patients is that chemotherapy will prolong life for 3 of the 10 women, but we can’t determine which 3.
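The arithmetic behind those figures, spelled out. The cohort size and benefit fraction are the essay's round numbers, not clinical data:

```python
# Round-number illustration of the chemotherapy dilemma described above.
cohort = 10               # women treated
benefit_fraction = 0.3    # roughly 20-30% respond, per the figures in the text

helped = round(cohort * benefit_fraction)
treated_for_nothing = cohort - helped

print(helped, treated_for_nothing)  # → 3 7
```

Three women gain years of life; seven endure the full toxicity of treatment for no benefit, and nothing in conventional staging says which group any individual woman is in.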

The plan is to use gene-scan data to predict which patients will benefit from chemo. In 2002 workers in The Netherlands used a DNA microarray to develop a gene expression profile that outperformed all currently used clinical parameters in predicting disease outcome. They suggested that their findings provided a strategy to select patients who would benefit from adjuvant therapy.

This information, originally published as basic research, reached the public in articles with encouraging titles like “New Study Could Cut Breast Cancer Overtreatment”.
In this article a member of the research team was quoted as saying, “We have confirmed that we can predict with 90% certainty that a patient will remain free of breast cancer for at least five years.” The key concept here is “a patient”: the statistical patient, i.e. you or me, but not necessarily her. This is the world of personalized medicine via gene-scan-powered molecular diagnostics. Clinical research shows that at least 7 out of 10 patients can forgo chemotherapy and its serious side effects while remaining disease-free. In a perfect near-future world, microarray gene scans will tell us which 7 women can decline chemotherapy. In the real near-future world, microarray gene scans will only tell us the probability that a woman can safely decline treatment. This probability will get better every year, but when will microarray diagnostics be reliable enough?

Last year the plot thickened when another group of researchers developed a test that used novel DNA array technology to predict which breast cancer tumors would respond to a commercial product.

This study identified differences in the gene patterns from tumor samples that could predict which patients would respond to treatment with the specific chemotherapy agent, Taxotere ®. The workers were quoted as saying, "We may have a clinically useful predictive test for chemotherapy sensitivity that may allow us to prioritize breast cancer treatment strategies based on their likelihood of success. This research, if validated, may lead to important advances in the treatment of breast cancer including reducing unnecessary treatment for some women, while optimizing therapy for others."

This work on breast cancer is just the tip of the iceberg. In thousands of labs around the world, microarrays are creating profiles of people and populations. These profiles will be used to develop molecular diagnostic strategies for every major disease and disorder. Like cloning, this technology takes us to the very essence of what it means to be an individual. Unlike cloning, the field of molecular diagnostics is receiving almost universal acclaim as a worthy goal for the future of medicine.

“A little knowledge is a dangerous thing. So is a lot.” Once again, Einstein provides the appropriate homily. Functional genomics, like any other empirical science, will proceed incrementally. But unlike the next generation of semiconductor chips, the time to market for each new product in targeted medicine will be measured in human lives. Before we have the set of genetic profiles necessary to treat all breast cancers, we will know enough to modify the treatment regimes of a few breast cancers, then enough to help some breast cancers, then enough to help many. At what point is this knowledge allowed to enter the healthcare system, and who decides? No one questions the awesome diagnostic power of microarrays or their future role in medical science. But it is equally true that no one understands how this revolution in personal medicine will impact a healthcare delivery system that, for many, is already hopelessly complex and frustrating.

Molecular biology broke the code in the war on cancer and, in doing so, revealed cancer’s secret weapon. Cancer is not a disease but rather a progressive series of metabolic states; in effect, cellular evolution within the human body. Like all forms of evolution, cancer is based on variation followed by selection. It is a fact of biological life that the genetic material in our cells is constantly undergoing mutation from, among other things, horribly synthetic pollutants in our water and completely natural radiation in our sunlight. Our DNA takes thousands of chemical ‘hits’ a day. Almost all of these mutational events are corrected by enzymatic maintenance crews that patrol our chromosomes 24/7. At some time in our lives, probably many times, we all receive a potentially cancerous mutation. For over a million people a year, some combination of chance and genetics means that this initial mutation goes un-repaired. The cellular avalanche called cancer has begun.

The first genetic hit often involves a gene that controls cell division. The result is uncontrolled cell growth. After a few division cycles a new population of cells is created and evolution begins within the tissue. If no further mutations occur, uncontrolled cell division leads to a solid tumor which is limited in size by the ability of surrounding capillaries to supply nutrients and oxygen. Within the ecosystem of the tumor, a cellular subpopulation may acquire a further mutation that allows it to develop its own circulatory system. This trait, called angiogenesis, is often the next step in malignancy. Several pharmaceutical companies are developing anti-angiogenesis compounds as cancer treatments. To evolve to the lethal metastatic stage (Stage IV), cancer cells must pick up additional mutations in genes that control metastatic traits like migration through tissue and the ability to squeeze into a blood vessel in order to circulate through the body and find a new location. The multi-gene nature of cancer means that while cancers specific to certain tissues are similar, no two cancers are ever exactly the same. This explains the state of our knowledge in the 1950s. Bad luck was, in fact, part of the process.
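The multi-hit logic above can be made concrete with a toy simulation. Every rate below (hits per division, repair failure rate, three ‘driver’ mutations needed) is an illustrative placeholder I chose for the sketch, not measured biology:

```python
import math
import random

def poisson(lam, rng=random):
    """Draw from a Poisson distribution (Knuth's algorithm)."""
    L = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

def fraction_malignant(trials, divisions, p_hit, p_repair_fail, drivers_needed=3):
    """Fraction of simulated cell lineages that accumulate enough unrepaired
    'driver' mutations to complete the multi-hit path to malignancy."""
    # Expected unrepaired hits per lineage; each is a hit that slipped past
    # the enzymatic maintenance crews.
    lam = divisions * p_hit * p_repair_fail
    return sum(poisson(lam) >= drivers_needed for _ in range(trials)) / trials

random.seed(0)
frac = fraction_malignant(trials=100_000, divisions=10_000,
                          p_hit=1e-3, p_repair_fail=0.01)
print(f"lineages reaching malignancy: {frac:.5f}")
```

With these made-up numbers, only a tiny fraction of lineages ever gets past the repair crews, which is the point: cancer is chance plus repeated rounds of selection, not a single unlucky event.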

The potential of functional genomics to revolutionize medical care is also the measure of its potential to dangerously overload our national healthcare delivery system. New pharmaceuticals currently reach the marketplace by showing efficacy in clinical trials based on the average response of a patient population. Personalized medicine, by definition, means that there is no average response. In the world of molecular diagnostics and targeted medicine, each patient’s treatment regime is unique. The obvious answer is that we will add up all the outcomes of a study involving personalized medicine. If the targeted regime has a higher success rate than the standard treatment, the procedure is approved for the market. But in order to make customization the standard, personalized medicine will need to wage war on our current healthcare system as well as on cancer.
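The aggregation idea in this paragraph can be sketched as a toy simulation. Every number below (three regimes, the diagnostic accuracy, the success probabilities) is a hypothetical I invented for illustration, not data from any trial:

```python
import random

random.seed(42)
N = 10_000  # simulated patients

# Hypothetical setup: each patient responds best to exactly one of three
# regimes. Standard care gives everyone regime 'A'; personalized care uses
# a molecular diagnostic that identifies the right regime 90% of the time.
MATCHED, MISMATCHED = 0.80, 0.30   # assumed success probabilities
DIAGNOSTIC_ACCURACY = 0.90         # assumed accuracy of the gene-scan

def treated_successfully(best_regime, chosen_regime):
    p = MATCHED if chosen_regime == best_regime else MISMATCHED
    return random.random() < p

standard = personalized = 0
for _ in range(N):
    best = random.choice("ABC")
    standard += treated_successfully(best, "A")
    guess = best if random.random() < DIAGNOSTIC_ACCURACY else random.choice("ABC")
    personalized += treated_successfully(best, guess)

print(f"standard care success rate:     {standard / N:.1%}")
print(f"personalized care success rate: {personalized / N:.1%}")
```

Even though no two simulated patients necessarily receive the same regime, the aggregate success rates are still directly comparable, which is exactly the approval logic sketched above.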

Take our example of microarray-directed treatment of breast cancer. A gene expression profile is necessary but not sufficient to describe the state of a cancer cell. As a result, we will have a world of molecular diagnostics decades before we have a world of molecular cures. In the immediate future, gene-scans will mainly guide the use of conventional chemotherapies. In this world 7 of the 10 women who are diagnosed with breast cancer will be advised that post-operative chemotherapy will not extend their survival. But this advice will come with a statistical caveat. More correctly, each patient will get her own prognosis with her own statistical caveat. The woman, her doctor, her insurance company, and the government will all receive a statistically-weighted prediction about her future. For at least a generation, molecular diagnostics will be Cassandra and the bell curve. The first act of the drama called personalized medicine will still be written by nature, the second by biotechnology. But who or what will be the author of the finale?

In developed nations, the practice of medicine may form the most crucial and complex interface between science and society. In addition to the immeasurable cost in the quality and duration of human lives, the economics are staggering. U.S. healthcare spending in 2002 hit $1.6 trillion, fully 15% of our Gross Domestic Product (GDP). We were told from the outset that completion of the Human Genome Project (HGP) would signal the beginning of a new age of molecular medicine. Personalized medicine, enabled by functional genomics, is the dawn of this new age. Guided by microarray data, the first wave will mainly consist of advanced molecular diagnostics; both for the initial diagnosis of diseases and to fine-tune the administration of conventional pharmaceuticals and therapies.

A technology that can revolutionize human healthcare must have equally revolutionary implications for healthcare delivery. Just for starters, genomics-based treatment will require a complete inversion in the use of medical statistics. Once upon a time the enclosed literature might warn that a vaccine had several known side effects and that, for ‘some people’, there was a low probability of developing one or more of these complications. The age of personalized medicine will mean that there is no such thing as ‘some people’. You can find out if you are that one-in-a-thousand. Will you want to know? Will you be allowed to know? What will it cost to know, and who will pay? Our current system was never designed for, nor can it handle, the flood of molecular diagnostic data that will reach biblical proportions within a decade. Just when we thought the web of healthcare delivery couldn’t get any more tangled, patients, doctors, and HMOs are about to meet the world of functional genomics.

Cassandra, get your calculator.

Friday, March 16, 2007

Neuroethics And Neurolaw or How I Learned To Stop Worrying And Love Molecular Medicine

Dear SSDs:

Since, so far as I can tell, no one is reading this blog, I'll keep it brief. In the next few days my intrepid webmaster John Kingston will post a recent article from the NY Times on the emerging field of neurolaw. For those of you who can't wait, the article is called:

The Brain on the Stand
How advances in neuroscience could transform our legal system.
It is in the NY Times MAGAZINE | March 11, 2007, and may be read/downloaded for free.

Since no one reads this blog, it is a bit difficult to see how this information will be helpful... but who knows, a new trend may emerge. Neurolaw is a minor manifestation of the global impact of Molecular Engineering (my term for nanotechnology) on medicine and health care. As such, the implications may be deduced from the nanomedicine piece I published a year or two ago.

Because the impact of neurolaw will be relatively immediate and certainly makes for a fun discussion... we will cover Rosen's article in the April Science Salon. However, the global impact of Molecular Engineering on medicine and healthcare is not much fun to consider if one is a humanist. Simply put, the more we learn about the molecular operation of the human machine the more we can tinker. In the area of human consciousness, researchers tend to talk about circuits. So, if we isolate and map the neural circuit for a certain form of mental behavior, say depression, we can quantify it and begin to use these quantitative data for a number of purposes:

1. The good guys (and of course gals) can begin to develop therapies to alleviate the suffering caused by depression. Dr. Bill Marks, a definite good guy, will talk to us today about his work on the cutting edge of using deep brain electrical stimulation to treat diseases such as Parkinson's and possibly depression.

2. The bad guys (and gals) will use this information to develop strategies for manipulating the human emotional state for purposes other than the alleviation of pain and suffering. Using inverted medical symmetry, one could visualize the development of a bioweapon that plunges people into an irreversible suicidal depression. We could get 'our' enemies to kill themselves off without firing a shot. What a savings in terms of the defense budget! But then again, who gets to define the term enemy?

3. The post modern guys (and gals) won't worry about good and bad. They will just keep developing molecular tools to manipulate the various circuits. These tools will be used by atavistic modernists (and undoubtedly ultra-atavistic religious fanatics) to further various conditions of goodness or badness as they define these terms.

4. Finally, the lawyers will get hold of these data and use them to defend or prosecute people accused of various criminal activities. As Jeff Rosen describes in his article, brain tumors and other medical conditions affecting the functionality of our primary organ of consciousness are already being used successfully as evidence in trials. This is just a slightly more complex version of DNA 'fingerprinting'. Soon we will have the molecular 'fingerprints' for a wide range of medical conditions. If enough molecules are involved we will not call the data a fingerprint, we will call the data a circuit or pathway... but the principle is the same.

I warned in my piece that the advent of molecular medicine means the end of any functional form of medical privacy. Soon, brain scanners will not only know our innermost thoughts, they will be able to manipulate them and, if necessary, put them on the witness stand. Perhaps we need an appendix to the Fifth Amendment that says our own bodies can't be compelled to testify against us. But then again, what about DNA fingerprints... to say nothing of the fingerprints that come from the end of our hands. When one steps in a trillion molecules, one is indeed on a slippery slope!

Best of all, there is not a single thing you can do about it without becoming a scientifically informed citizen who participates actively in shaping the new laws that will emerge to regulate the products of molecular medicine. Since mind control techniques are the ultimate marketing tool, if you do decide to participate you will have to take on the MIRUC (Military-Industrial-Research University Complex... should be pronounced like murk). What do you think your chances are?

Cheers, AG

Sunday, March 11, 2007

Zen and the Art of Molecular Engineering

And now for something completely different.

It's always cool to start with quotes...

"It is a profound and necessary truth that the deep things in science are not found because they are useful; they are found because it was possible to find them." J. Robert Oppenheimer

“I'm afraid nanotechnology is one of those fields that, no matter how exciting it gets in real life, will be very difficult to turn into a successful nonfiction book.” An email from a Senior Editor, at Random House to yours truly.

The gentle reader will soon understand why I started with these quotes. The first originates from a monumental figure who profoundly altered the course of 20th century history via the application of radical new science to the ancient practice of war. The second originates from someone who selects books for a 21st century multimedia giant to perpetuate the ancient practice of profitable commerce.

Oppenheimer synthesized a transcendent principle that governs the origin of knowledge: what it is possible for humans to discover about their world. The editor was giving an informed opinion about what the reading public is willing to discover about its world. Both quotes may be true. But unless the paradox inherent in the second is overcome, we risk placing humanity, and even biology, in grave danger. To the reader I say it is a necessary truth that the profound implications of the nanotechnology revolution must be communicated to the general public.

The advent of the nanotechnology era (more correctly termed the ‘molecular engineering’ era) requires that philosophers of science elucidate a field whose unprecedented interdisciplinary nature spans practically the sum of all scientific discoveries and technical developments that preceded it. Philosophers of science (or someone!?) need to explain to Homo sapiens, the toolmaker, why the ability to build with molecules is not just another tool but the ultimate tool with which to shape our physical world. Those of us who work in this field have an obligation to explain to the rest of society why nanobiotechnology – whose explicit goal is the atomic and molecular integration of living and nonliving materials – is far more than just a synonym for cyborg.

In this brief posting, I can only warn you that if the implications of molecular engineering are not openly debated in public, particular manifestations will be thrust upon us by the rush of discoveries fueled by a worldwide ‘race to the bottom’ involving untold billions and some of the finest minds in the finest labs and corporate board rooms on the planet.

For those who think such consequences must be far in the future, it is instructive to keep in mind that the first genetically engineered bacteria were released into the environment on April 24, 1987. That was only 15 years after the first rDNA molecule was engineered in a test tube. Fortunately, there was no catastrophe that day; the probability had always been vanishingly small. But while the debate about when and how to use GM crops continues, the cutting edge of technology has moved far beyond cloning. In the United States, the National Nanotechnology Initiative (NNI) has catalyzed interdisciplinary creations that would have been unthinkable only a generation ago. Areas such as ‘Synthetic Biology’ and ‘Artificial Life’ are now bona fide academic disciplines. The Synthetic Biology Engineering Research Center (SynBERC), headquartered at UC Berkeley, explicitly states, “The defining goal of SynBERC is to make biology into an engineering discipline.” A recent publication in PNAS boldly states, “The implementation of the silicon-neuron-neuron-silicon circuit constitutes a proof-of-principle experiment for the development of neuroelectronic systems.” Nanobiotechnology has become the chemical crossroads where living and nonliving materials meet and fuse at the molecular level to create that which has never before existed. We are already fabricating hybrid devices that go far beyond genetic or any other known form of engineering. Protein to semiconductor, DNA to nanowire, we are building these ‘things’ right now. But what are these ‘things’ we are building?

Saturday, February 17, 2007

Who put the action in Action Potentials

Dear Science Salon Denizens (SSDs from now on... we scientists & engineers love our acronyms & jargon):

This post is the promised follow-up to yesterday's Salon. Mainly I am putting it here so you (the SSDs) can send me feedback wrt (with respect to) how you thought the session went.

The goal of my talk yesterday was to arm you with some concepts and terminology so that you can begin to read some of the posted articles. The concepts included:

1. An introduction to the neuron, one of the thousands of specialized cell types in the body. As we discussed, the neuron is the fundamental unit of signal transmission throughout the nervous system.

Nota bene: There are many exceptions to this last statement. But remember I warned you my teaching style is based on a series of benign lies whereby I tell you something that is not completely true in order to get a primary concept across... later I say "that wasn't exactly true". And then I tell you the next benign lie.

In fact, the neurons have an enormous amount of help in signal transmission. Even their primary function, the propagation of the action potential, cannot be accomplished without specific companion cells. But for now I won't burden you with the names of any more cell types.

2. An introduction to neural transmission which is part electrical and part chemical. Just for fun I will tell you that, in fact, BOTH the electrical and chemical signal transmission components are simply different manifestations of the same driving force... a force called the Gibbs Free Energy. The only thing of importance we really do as life forms is extract and store energy. This is not easy in a hostile, highly entropic universe and it is the true and only magic trick of life. Once we have this energy, we are free to use it to maintain and propagate ourselves as life forms. Hence, the concept of 'Free Energy': as in free to be applied to useful work... not free as in 'at no cost'. Both the electrical and chemical components of signal transmission in a neuron are powered by Free Energy.

The full name for the Free Energy that drives electrochemical reactions such as propagation of the action potential is Gibbs Free Energy, named after the great (and I do mean genius-level great) physical chemist J. Willard Gibbs. Gibbs was the classic mad genius who basically lived in a broom closet-sized office at Yale for his entire academic career. The famous (non-scientific) story about him is that although he attended every faculty meeting for something like 30 years he never uttered a word... except once. At one meeting a motion was made to eliminate the mathematics requirement in favor of a language requirement (or some such). As legend has it, Gibbs raised his hand, was recognized by the Chair, and said, "But gentlemen, mathematics is a language," then sat back down. End of story. But old J. Willard received the highest honor science has to offer (and it's not the Nobel Prize). He has a basic unit of physical measurement named after him. The unit of Gibbs Free Energy is right up there with an einstein of photons, a curie of radioactive disintegrations, and, of course, a newton of force.
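To make the electrochemical connection concrete: the Nernst equation gives the membrane voltage at which the electrical and chemical contributions to the Gibbs Free Energy exactly balance for a single ion species. A quick sketch using textbook-typical potassium concentrations (the values are standard illustrations, not measurements from the talk):

```python
import math

R = 8.314    # gas constant, J/(mol*K)
T = 310.0    # body temperature, K
F = 96485.0  # Faraday constant, C/mol
z = +1       # charge of the ion (K+)

def nernst_potential(c_out, c_in):
    """Equilibrium (Nernst) potential in volts: the membrane voltage at which
    the electrical and chemical gradients of one ion exactly cancel."""
    return (R * T) / (z * F) * math.log(c_out / c_in)

# Textbook-typical K+ concentrations (mM): ~5 outside the neuron, ~140 inside.
E_K = nernst_potential(5.0, 140.0)
print(f"K+ equilibrium potential: {E_K * 1000:.0f} mV")  # about -89 mV
```

That negative value near -90 mV is most of a neuron's resting potential, and it is the same Free Energy, in two guises, that the action potential spends.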

3. We also briefly reviewed the rudimentary structure of a neural network, i.e. head to tail, presynaptic to postsynaptic: dendrites on the input (receiving) side, axon on the output (transmitting) side.

4. We took a look at brain complexity and ran a few numbers to get an idea of how many neurons, how many synaptic connections, and how many signals per second per whole brain.

These calculations and all the rest of it led to a final generalization of how whole-brain scans work. If one area of the brain is signaling like crazy, lots of neurons will be sending lots of action potentials, which means they will need lots of Gibbs Free Energy, which means burning (metabolizing) lots of sugar, which means a requirement for lots of oxygen-carrying blood, which means the blood supply to that region will spike. If we can follow changes in blood supply or any related parameter in real time by MRI, PET, or other scan technology, we can begin to correlate regions of the brain with specific aspects of consciousness as well as other subconscious brain-controlled functions (musical genius, sociopathy... you fill in the blanks).

5. Finally we looked at a lot of cool and very beautiful images. Never a waste of time.
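Points 3 and 4 above can be pinned down in a few lines of code. The neuron below is a cartoon (real neurons are not simple threshold units, as the benign-lie disclaimer warns), and every figure in the back-of-envelope section is an order-of-magnitude assumption, not a measured count:

```python
# Point 3: a cartoon neuron. Dendrites collect weighted inputs from upstream
# (presynaptic) cells; if the sum crosses a threshold, the axon fires an
# all-or-nothing action potential toward downstream (postsynaptic) cells.
def neuron(dendrite_inputs, weights, threshold=1.0):
    total = sum(x * w for x, w in zip(dendrite_inputs, weights))
    return 1 if total >= threshold else 0  # 1 = spike sent down the axon

two_cells = [neuron([0.9, 0.8], [0.7, 0.6]),   # fires: 0.9*0.7 + 0.8*0.6 = 1.11
             neuron([0.2, 0.1], [0.5, 0.5])]   # silent: weighted sum = 0.15
downstream = neuron(two_cells, [1.0, 1.0])     # one upstream spike is enough here

# Point 4: back-of-envelope brain numbers (order-of-magnitude guesses).
NEURONS = 9e10              # roughly 90 billion neurons in a human brain
SYNAPSES_PER_NEURON = 1e4   # roughly 10,000 synaptic connections each
MEAN_RATE_HZ = 1.0          # a rough brain-wide average firing rate

synapses = NEURONS * SYNAPSES_PER_NEURON
spikes_per_second = NEURONS * MEAN_RATE_HZ
print(f"synapses: ~{synapses:.0e}, action potentials/second: ~{spikes_per_second:.0e}")
```

Run the arithmetic and you get numbers in the 10^14 synapse range, which is why a busy brain region's demand for Gibbs Free Energy (and therefore blood) is large enough for a scanner to see.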

Cheers, Dr. G

Thursday, February 8, 2007

Pacemakers for the brain!

Dear Science Salon members:

I promised we would still have outside speakers for some of the Salons, so I'm extremely pleased to inform you that William J. Marks, Jr., M.D., Associate Professor of Neurology, University of California, San Francisco, will join us for the March Science Salon to talk about his research.

Without giving too much away, I will say that I asked a number of friends in the bioengineering industry for the names of people doing extremely cool brain research who also worked in the San Francisco area. Professor Marks' name came up on more than one list.

The title of his talk will be: "Pacemakers for the Brain: Electrical Stimulation to Treat Neurological Disorders"

In the near future, Dr. Marks will pass along one or two articles for me to post on my Science Salon page at:

Neuroelectronic stimulation for the treatment of diseases and disorders is one of the hottest areas in all of bioengineering. When I was back in graduate school, the idea of using electricity to stimulate any kind of cell was still considered... well, pretty weird (as in "What do you think you're doing with that lightning rod, Dr. Frankenstein!"). I can still remember attending the first Gordon Conference on 'Bioelectrochemistry'. People were talking about using external electric fields to stimulate bone healing, to get DNA into cells for genetic engineering*, and a whole host of other things. It all sounded great... but only to the few of us willing to be labelled 'bioelectrochemists'. Thirty-some years later, bioelectrochemistry has merged with a bunch of other fields to become bioengineering, and no one considers this stuff strange anymore.

In fact, we are right in the mainstream.

More soon. AG
*PS: This technique, now called electroporation, is a backbone of the genetic engineering industry (a.k.a. Industrial Biology), and commercial devices are produced by several major biotech instrument companies!

Monday, January 29, 2007

Welcome to Dr. G's Science Salon

Well here we are... but where are we? Those of you who attended the Mechanics' Institute Science Salon know me as the host and moderator, Dr. Alan H Goldstein. And that's still who I am.

This year, the Science Salon will bifurcate (getting technical on you already), i.e. we split in two. The seminar-style Salon will continue at the Mechanics' Institute every third Friday of the month. Check us out by hitting the button at the Mechanics' Institute Library (MIL from now on) website:

As you will see, the general topic for this spring is -

The Brain: Biology, Consciousness, Memory and Language

That's a whole lot and, of course, we won't actually attempt to drill through the entire cranial mass on this first go-round. But I do guarantee two things: we will have fun, and we will learn amazing stuff!

The 'bi' in our bifurcation is the new online component of the Science Salon. By blogging along in real time, people can visit me here to continue the discussions we begin in the hallowed halls of the MIL. In addition, I will post articles on a special Science Salon web page at:

Hopefully, many of you will download and review some or all of these posts before our next meeting.

Science Salon FAQs:

1. Will there still be a presentation at the 3rd Friday-noon MIL Science Salon? YES!
2. Do I need to read and understand ALL the materials posted on the Science Salon page in order to attend and enjoy the Friday-noon MIL Science Salon? NO!!
3. Do I need to read and understand ANY of the materials posted on the Science Salon page in order to attend and enjoy the 3rd Friday-noon MIL Science Salon? NO!!

In other words, when I sit down with you at noon on Feb. 16th, I will provide an overview of our current understanding of human consciousness that will be entirely self-contained, i.e. I will assume we are starting from ground zero and build up from there. And yes, there will be cool pictures of brain cells, Magnetic Resonance Imaging (MRI) scans, and all the other things you have come to expect from the MIL Science Salon.

4. Will there be guest speakers? YES... but not at each Salon. I will give the February presentation and we have a speaker for March (more on that in a future post).

So there it is. Let me hear from you... both with respect to future topics of interest and just because I want this Blog to become extremely popular.

What's life without a dream?

Cheers, Dr. G