Sunday, December 27, 2009

Advances in robotics

Robotics can be defined as the integration of sensors, computation, and machine systems to manipulate matter. Some of the most important current applications are in military use, factory automation, telepresence, entertainment, and human interaction. Some contemporary trends include bipedal robots, autonomous robotics, and swarm computing. Walking, instead of navigating around on wheeled or multi-legged bases, could open up a variety of new applications for robotics. Similarly, autonomous robotics could handle tasks at a higher level of abstraction with less of a control burden. Swarm computing could allow the efforts of multiple robots to be coordinated, for example in warehouse automation or RoboCup soccer.

Military robotics
The U.S. military’s current deployment of robots includes 7,000 unmanned aerial vehicles (UAVs) such as the Predator drone and 12,000 unmanned ground vehicles (UGVs) such as the PackBot (P.W. Singer, Wired for War). Boston Dynamics has developed several interesting robots for military use. One is the BigDog, a quadruped robot that can walk, run and climb on rough terrain while carrying heavy loads. More recently, the company has been working on the PETMAN bipedal robot, which balances dynamically using a human-like walking motion and is to be used initially for testing chemical protection clothing by walking and climbing like a human. Another example of military robotics is the DARPA Grand Challenge, which has run three rounds of competition for unmanned navigation vehicles, most recently in an urban environment.

Industrial robotics
A second important area is industrial robotics, which extends automated machines by making them mobile. One leader in mobile robotic solutions for warehouse automation is Kiva Systems, whose robots organize, manage and move inventory. The robots cooperate using swarm behavior, reading barcodes on the floor and coordinating through other messaging systems. There are other examples of robots for potential use in corporate or health care environments. Willow Garage’s PR2 (Personal Robot 2) can autonomously open doors and locate and plug itself into power outlets. AnyBots offers a corporate telepresence robot and has a bipedal robot under development.
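The swarm coordination behind warehouse systems of this kind can be illustrated with a minimal task-allocation sketch: each inventory-move task is claimed by the nearest free robot on the floor grid. This is an illustrative toy, not Kiva’s actual algorithm; the robot names, coordinates and greedy nearest-robot rule are all assumptions.

```python
# Toy sketch of swarm-style task allocation: each inventory-move task
# is assigned greedily to the closest free robot on the floor grid.
# Coordinates stand in for the floor-barcode grid; all data is hypothetical.

def assign_tasks(robots, tasks):
    """Greedily assign each task to the closest available robot (Manhattan distance)."""
    assignments = {}
    free = dict(robots)                     # robot name -> (x, y), copy so we can remove
    for task, (tx, ty) in tasks.items():
        if not free:
            break                           # more tasks than robots; rest must wait
        name = min(free, key=lambda r: abs(free[r][0] - tx) + abs(free[r][1] - ty))
        assignments[task] = name
        del free[name]                      # robot is now busy
    return assignments

robots = {"R1": (0, 0), "R2": (5, 5)}
tasks = {"shelf_A": (1, 1), "shelf_B": (4, 6)}
print(assign_tasks(robots, tasks))          # → {'shelf_A': 'R1', 'shelf_B': 'R2'}
```

A real system would also handle contention, path planning along the barcode grid and re-queuing, but the greedy assignment above captures the basic swarm idea of decentralized, distance-aware task claiming.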

Personal robotics
There is also a research focus on creating robots with emotional intelligence for human interaction. Two notable examples are MIT’s Personal Robots Group and Hanson Robotics. MIT has robots such as Leonardo, which has 50 independently controlled servo motors creating a full range of facial expressions. Hanson Robotics’ Zeno and other robots have life-like skin created from Frubber, a nanoporous materials advance in elastic polymers. For consumer use, robots are starting to appear as small appliances, such as the Roomba and Neato Robotics units for home vacuuming and the Rovio for home security, and as toys such as the Furby, Aibo, and Kondo.

Sunday, December 20, 2009

Engineering life into technology

Information optimization, presently known as intelligence, is a centerpiece phenomenon in the universe. It arises from simplicity, then continuously breaks symmetry and cycles through instability on its progression to increasingly dense nodes of complexity and diversity.

A contemporary imbalance has arisen: exponentially growing technology is potentially poised to become the sole successor to human intelligence. A complex dynamical system is emerging in response: the engineering of life into technology. Numerous macroscopic and microscopic elements are under development which could together stimulate advancement to the next node of symmetry and stability, creating a phase transition in intelligence that could broadly include many varieties of sentience.

The macroscopic and microscopic network elements that comprise the complex adaptive system, engineering life into technology, are illustrated in Figure 1.

Figure 1: Elements of engineering life into technology

Sunday, December 13, 2009

Progress in Aging: Secretome, mRNA and Nutrients

The U.S. National Institute on Aging held a Systems Biology of Human Aging conference in Baltimore, MD on December 8-9, 2009. Several interesting topics were considered including the complexities of modeling the process of aging, the role of RNA in gene regulation, neurodegenerative disease and vascular compromise, and gene expression and signaling networks.

Aging: break-down in signaling networks
Aging is a systems biology problem in which signaling networks break down. As part of this breakdown, senescent cells secrete inflammatory proteins which together can be thought of as the ‘secretome.’ Judy Campisi has found that the secretome, the senescence-associated secretory phenotype (SASP), can provide a common biological explanation for the related phenomena of aging, degenerative disease and cancer. Senescent cells produce the SASP, essentially inflammation, which can then trigger degenerative disease (aging) and hyper-proliferative disease (cancer). A potential solution is to remove the 10-15% of senescent cells that are not naturally killed by the immune system. Some secretome research has been applied specifically to vascular smooth muscle cells, which tend to proliferate and migrate unhealthily with aging in a process called the pro-inflammatory age-associated arterial secretory phenotype (AAASP).

RNA and gene regulation
With mRNA analysis it is possible to obtain the transcriptome, the complement of RNA transcripts present in a cell at any given time. This is starting to allow findings that transcription and translation are probably more tightly coordinated than previously thought, and that translational control could be a dominant force in gene expression. The emerging norm is that RNA-binding protein and non-coding RNA expression should be identified in analyses too, not just protein expression. Generally, DNA is much more active than initially thought, with perhaps 90% of the human genome being actively expressed in some cell of the body. The level of certain mRNAs can be an upstream pathway indicator of aging, as mRNAs may increase or decrease with aging, which can cause the level of damaging proteins to rise. For example, MKK4 increases with the overexpression of four mRNAs.

Alternate day fasting and nutrients
Alternate day fasting may potentially confer the same benefits as calorie restriction in animals and humans, in both physical and neurological health. Neurodegenerative disease and neurological decline are part of the pathologies of aging. A countermeasure may be to increase the levels of certain proteins, especially brain-derived neurotrophic factor (BDNF), which is neuroprotective, neurogenerative and important in plasticity and synaptic activity. Some nutrients that may help to increase BDNF levels are sulforaphane (broccoli), curcumin (turmeric), catechins (green tea), allicin (garlic), hypericin (St. John’s Wort) and plumbagin.

Sunday, December 06, 2009

Digital personas

There are more machines than humans on the internet, and more machine-to-machine traffic than human-to-human traffic despite the trillions of text messages sent every year. Perhaps the most interesting category of messaging is machine-to-human. There are many mundane examples of this such as RSS feeds, automated email notifications and status updates from entities (groups, companies and other organizations). Coming innovations in machine-human communication could be quite fun and life-enhancing.

There are already fan-run Twitter accounts, Facebook parodies and other interaction sites for fictional characters, often contemporary television characters. The next step could be creating digital emulations that could automatically respond in character. For example, subscribing to the Ben Franklin feed – “ooh-zapped the heck out of myself with my kite last night.”

There could be many uses for digital personas in addition to entertaining status updates, for example, having kids hang out with Marie Curie and the Wright Brothers as role models. It could be interesting to have a society where dead or fictional characters become part of the conversation, having a voice and a lasting, ongoing presence. Digital personas could be managed with sliding parameters (e.g., amp up Churchill’s humor) and have add-on modules (get the early-adopter technophile package for the great-grandmother persona...“I’m off to text in my response to Dancing with the Stars.”)

Digital personas would not need to be exclusively reserved for dead or fictional characters; anyone could create one as a facsimile, with some degree of fidelity, from current digital content, data and other artifacts. Celebrities could possibly earn greater remuneration by renting out their emulations rather than through live engagements involving their actual physical persona.

As robotics continues to advance, digital persona overlays could be applied so that Frank Lloyd Wright or Frank Gehry could walk around and discuss home renovation plans with you, Lady Gaga could be at your next soiree or Einstein and Feynman could join a scientific brainstorming session. A new field of productive and entertainment endeavor could emerge to create and bring together the digital personas of historical figures for problem-solving and fun. Would ‘Lost’ be better with Genghis Khan, Ella Fitzgerald, Moctezuma, Rosalind Franklin and Sherlock Holmes in the cast? Lawmakers could obtain measured input by running the Thomas Jefferson and John Adams personas simultaneously. The world’s great scientific and intellectual minds could be assembled to focus on current problems.

Sunday, November 29, 2009

Genomics – The Global Opportunity

Genomics is particularly interesting as a candidate area for possibly making the most difference the most quickly to the most people worldwide by contributing to developments in energy, food and public health.

A full understanding of genomics, the instruction set for life, could mean a more comprehensive ability to manipulate both the world around us and the world within us. Biology evolved to be just good enough to survive and genomics provides the critical next-generation toolkit for its greater exploitation. With the possibility of a complete understanding of biology and the ability to engineer life to be optimum, traditional limits can be overcome, moving from the gene therapies of today (replacing or silencing one gene) to working with whole genomes and possibly creating new ones.

The global challenge and opportunity is for humanity to move safely and expediently into the genomic era of biological manipulation.
The agricultural applications of genomics have been underway for some time in the form of genetically-modified crops. Energy applications of genomics are in development, using synthetic biology to generate fossil fuel replacements, and are estimated to be ready for commercial launch in 2011. The public health application of genomics is especially promising: using genomics to further understand and eradicate disease. Genetic information is already starting to be medically actionable and is likely to become increasingly useful over time. Its two main current uses are pharmacogenomics (personalized therapeutics, categorizing drug responders and non-responders for tailored treatment) and routing higher-risk individuals to earlier screenings for chronic diseases such as prostate cancer and breast cancer. It is estimated that each individual is in the upper 5% risk tier for at least one chronic disease and that $100,000 per person per condition could be saved as a result of earlier detection. By 2010, according to a World Health Organization (WHO) report, cancer will surpass heart disease as the world’s greatest killer, and in fact, developing countries could be at the highest risk due to smoking and high-fat diets.
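The claim that essentially everyone sits in the top 5% risk tier for at least one condition can be checked with a back-of-envelope calculation. Under the simplifying (and only illustrative) assumption that risk tiers are independent across conditions, the probability of landing in the top tier for at least one of n conditions is 1 − 0.95ⁿ:

```python
# Back-of-envelope check: probability of falling in the top-5% risk tier
# for at least one of n chronic conditions.
# Assumption (illustrative only): risk tiers are independent across conditions.

def p_at_least_one_top_tier(n_conditions, tier=0.05):
    """Probability of being in the top risk tier for at least one of n conditions."""
    return 1 - (1 - tier) ** n_conditions

for n in (10, 30, 60):
    print(f"{n} conditions: {p_at_least_one_top_tier(n):.0%}")
# 10 conditions: 40%
# 30 conditions: 79%
# 60 conditions: 95%
```

So if genomic risk is reported for several dozen independent conditions, roughly 95% of individuals would indeed be high-risk for at least one, consistent with the estimate above.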

As our molecular understanding of disease progresses and genomic technologies continue to decrease in cost and become increasingly medically relevant, the use of genomics could become quite widespread. Physicians could start to see the precise, additive information conferred by genomics as a means of improving the care now delivered, finding themselves initially encouraged and eventually regulated into incorporating genomics in care regimens. Pharmaceutical companies are already using genomics as a means of improving efficacy in drug discovery and delivery, providing much-needed assistance to their ailing cost structures. Individuals worldwide could have unprecedented access to their health information which could prompt a much greater level of responsibility-taking and health self-management.

Sunday, November 22, 2009

Humanity: hedgehog or fox?

Isaiah Berlin discusses an interesting paradigm for understanding different kinds of thinkers, the hedgehog and the fox. The hedgehog operates under a single vision while the fox incorporates many ideas into a worldview. Philip Tetlock applies this framework to an analysis of political predictors and finds that while all expert predictors are bad, foxes are not as bad as hedgehogs. The success of the fox is perhaps partly due to Bayesian updates, adjusting the synthesis-oriented worldview per new information, as opposed to the hedgehog being stuck trying to fit all new developments into the same model.
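The Bayesian updating credited to the fox can be made concrete with a small numerical sketch. All the probabilities below are invented for illustration; the point is only the mechanism of revising a belief as each piece of evidence arrives:

```python
# Illustrative Bayesian update, the mechanism attributed to the 'fox' style:
# revise the probability of a hypothesis as each new piece of evidence arrives.
# All numbers are made up for illustration.

def update(prior, p_evidence_if_true, p_evidence_if_false):
    """Return P(hypothesis | evidence) via Bayes' rule."""
    numerator = prior * p_evidence_if_true
    return numerator / (numerator + (1 - prior) * p_evidence_if_false)

belief = 0.50                                   # the fox starts uncommitted
for likelihoods in [(0.8, 0.3), (0.6, 0.5), (0.2, 0.7)]:
    belief = update(belief, *likelihoods)       # evidence for, weak, then against
    print(f"updated belief: {belief:.2f}")
```

The hedgehog, by contrast, behaves as if the prior were fixed near 1: each new development is reinterpreted to fit the model rather than used to revise it.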

Humanity: hedgehog or fox?
It could be argued that so far all of human history has been organized around certain grand hedgehog visions such as mastery over matter, immortality or evolution, to name a few.

Mastery over Matter
Over time, humans have continually demonstrated increasing mastery over matter. The current focus is on improving control of biology, and indeed reengineering it, with genomics and synthetic biology; of matter with 3D printing and eventually molecular nanotechnology; of the brain with fMRI technology and smart drugs; and of space with a next-generation understanding of physics. Perhaps the most intense version of mastery over matter is the present focus on the biomolecular interface, the integration of organic and inorganic matter. However, mastery over matter may not persist as a paradigm; a future redefinition could include mastery over information.

Immortality
Immortality is another grand vision, persisting from the time of the Pharaohs and earlier to long-established religious beliefs to the contemporary notions of life extension, uploading and cryonics. In some sense, immortality is just another kind of mastery over matter.

Evolution
Evolution is a strong paradigm, explaining many things, and connotes a higher order than just mastery over matter since not everything is matter. Evolution can examine more phenomena, including the progression of intelligence, possibly across substrates as is contemplated with artificial intelligence. However, despite myriad application attempts, it is not clear yet whether evolution can explain everything, for example the laws of physics and how the universe developed.

Conclusion
Since, even in a simple analysis, no one model can explain everything and there are multiple ideas contributing to a composite explanation of human activity, the conclusion would be that humanity bears more resemblance to the fox model than the hedgehog model. Tetlock’s finding that the multi-viewed, more adaptable foxes are better predictors could likely hold true for societies and humanity in general as well as for individuals.

Sunday, November 15, 2009

MMP inhibitor to kill senescent cells

Important work in the understanding and remedy of aging at the Buck Institute’s Systems Biology Symposium of Aging held November 10-13, 2009 was presented by Judith Campisi in a keynote talk, “The Four Horsemen – Damage, Inflammation, Cancer and Aging: Integrating Aging and Age-Related Research.”

Summary
Campisi has found a common biological explanation for the related phenomena of aging, degenerative disease and cancer: the senescence-associated secretory phenotype (SASP). Senescent cells produce the SASP, essentially inflammation, which can then trigger degenerative disease (aging) and hyper-prolific disease (cancer). A potential solution is to remove the 10-15% of senescent cells that are not naturally killed by the immune system by using matrix metalloproteinase (MMP) inhibitors.

Background

Humans are much longer-lived than other organisms such as flies because they have evolved cell-dividing mechanisms for tissue regeneration and repair. However, mistakes in the form of mitotic mutations occur during this process and build up cumulatively, which can cause cancer. To counter the build-up of mutations, tumor suppressor mechanisms evolved. One action of gate-keeper tumor suppressor mechanisms is to direct damaged cells to senesce, or lose function.

Senescent cells are not harmless; they amass at sites of inflammation and pre-cancer and secrete up to 40 different cytokines (immunoregulatory proteins), which together can be thought of as the SASP secretome. All major age-related diseases, including atherosclerosis, myocardial infarction, stroke and metabolic syndrome, share an inflammatory pathogenesis. The build-up of senescent cells can lead to both degenerative disease (aging) and hyper-proliferative disease (cancer).

The purpose of the cytokines is to repair tissue. In the SASP secretome, they are perhaps trying to summon the immune system, communicating to the rest of the tissue that there is a problem. The immune system does arrive and kill most senescent cells, but 10-15% survive, perhaps due to the over-expression of matrix metalloproteinases (MMPs) which can cleave the ligands off the cell surface where natural killer cells would bind, allowing the cell to escape the immune system.

Solution
Extending the existing research and application of matrix metalloproteinase (MMP) inhibitors, chemicals that mimic the binding site, Campisi’s lab has been able to drive senescent cell killing to 95%.

Sunday, November 08, 2009

Ubiquitous information technology fields

The broadest thematic point in futurist Ray Kurzweil’s opening keynote at Singularity University on November 6, 2009 was that once any area becomes an information technology, it starts conforming to the exponential curves of Moore’s Law progress that have defined the computing and communications industries since 1900 or earlier.
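The exponential claim can be sketched numerically. Assuming a constant doubling time (18 months is used here as a commonly cited Moore’s Law figure; actual doubling times vary by technology and era), the cumulative improvement compounds dramatically:

```python
# Sketch of the exponential price-performance curve described above.
# Assumption: a constant 18-month doubling time, a commonly cited
# Moore's Law figure; real doubling times vary by technology and era.

DOUBLING_MONTHS = 18

def improvement_factor(years):
    """Multiplicative gain in price-performance after `years` of doublings."""
    return 2 ** (years * 12 / DOUBLING_MONTHS)

for years in (3, 10, 25):
    print(f"{years:>2} years: {improvement_factor(years):,.0f}x")
#  3 years: 4x
# 10 years: 102x
# 25 years: 104,032x
```

This compounding is why a field's trajectory changes so sharply once it becomes an information technology: linear intuitions about progress systematically underestimate it.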

Health is well on its way to becoming an information science with genomic sequencing and synthesizing, bioinformatics and continuous automated biomarker capture. Energy is starting to be an information science with the smart grid, essentially an electron routing network allowing on-demand ingress and egress of diverse flows. Many other fields could behave in the networking and packet-routing metaphor, directing fungible quantized resources to where they are needed and requested like people in driverless cars, neurons in a brain, clean air and water molecules, disease management and health care delivery. Since demand varies, market principles could be used for unobtrusive resource allocation in automatic markets that meet and transact per digitally-inferred demand profiles and pre-specified permissions.

All science is in some phase of becoming or has already become an information science in the sense of using computational models, simulation and informatics.

With computation and communication becoming increasingly embedded in every manufactured object, it is obvious that many more, if not all, fields could become information technologies.
Intelligence, for example, is becoming an information science. With the exponential growth of computing, it is likely that at some future point machine intelligence could surpass that of humans. One path forward is to reengineer life into technology that can keep pace with technological advances. There are already three dimensions of progress towards this goal: understanding the existing example of the brain through neuroscience; simulating and building de novo intelligence in software and robotic forms; and integrating human and machine capabilities with brain-computer interfaces, creating the biomolecular interface between organic and inorganic material.

Social sciences
The question arises about how seemingly subjective and nuanced fields like politics could become information sciences. In the short term this is already happening with citizen journalism and collective organization through social networking (examples: flashmob protests and Twitter Iran election feedback). In the longer term, it is imaginable that political artificial intelligences, pleasantly absent the agency problem and special interests of human politicians, could start to perform low level political tasks and over time be used to a much larger degree in policy formation, public resource allocation and administration of nation state affairs.

Sunday, November 01, 2009

Synthetic biology enables green petroleum

The good news about the approximately 1 billion vehicles worldwide at present, a number expected to double in the next few decades, is the number of fossil fuel alternatives feverishly underway, many of which have established pilot projects and are expected to launch in selected commercial markets in 2011.

Synbio enables green petroleum

The current killer app of synthetic biology, the programming and engineering of biology, is green petroleum.
Several companies are developing improved versions of fossil fuels which can be easily substituted into the existing worldwide fuel infrastructure for autos, planes, etc. at approximately the same cost as fossil fuels (oil is presently $80 per barrel). Pilot plants are underway and commercial introduction is expected in 2011. Sapphire Energy and Synthetic Genomics are working with algal fuel, ramping up the highly efficient natural process by which algae create petroleum through photosynthesis.

Other companies such as Amyris Biotechnologies are using synthetic biology to generate ethanol, and LS9 is synthesizing carbohydrates into petroleum with designer microbes. In the farther future, late-generation biofuels are contentious but already being envisioned by companies like Craig Venter’s Synthetic Genomics, employing carbon dioxide (CO2) as a feedstock for bacteria to convert into methane using molecular hydrogen as the energy source.

Green petroleum vs. electric vehicles
There may be less of a competition between transportation fuel alternatives and more of a market suitability analysis governing which choices arise in which areas. Large markets like the U.S. are already showing signs of both, or all, alternatives arising. Markets and countries with other parameters, such as smaller size, increased government involvement and more stringent emissions regulations, may make a strategic commitment to certain choices, for example an interesting train and EV-sharing program announced in Denmark.

International electric vehicle leader Better Place notes that the ‘Goldilocks’ markets for greenfield electric vehicle networks are countries that are not too big to risk the introduction of such a disruptive solution and not too small such that economies of scale would not work. The poster child market for the company is Israel, which has networks of charging stations already installed in Tel Aviv and plans to build another 100 in Jerusalem for the mass availability of electric cars in 2011.

Sunday, October 25, 2009

Role of B.S. in Advanced Society

B.S. is a deeper philosophical topic than it might seem at first glance. Two interesting books contemplate the matter: B.S. and Philosophy (2006) and On B.S. (2005).

What is the role of B.S. in advanced society? Since it exists, it must have some role, possibly related to conflict reduction and social lubrication. A second reason for B.S. could be the complex values hierarchies in which individuals and societies operate. Social pressure and belongingness may trump truth as values. When someone is asked a question, the presupposition is that he or she may be able to answer, and the inclination of the person asked is to try to respond even if a misrepresentation, e.g., B.S., occurs.

These authors and others agree that B.S. has proliferated from the past to the present. Given that, what could be said about the future: is B.S. likely to increase or decrease? In the short term it will probably continue to increase, but it could then be reduced in the longer term with the advent of more advanced technology.

Personalized hypertargeted B.S.
On one hand, technology is increasing the detectability of B.S., suggesting that B.S. could go down in the future. On the other hand, information is continuing to explode, providing more potential venues for B.S. and suggesting that B.S. could go up. B.S. is like spam or commercials: it is growing, but control mechanisms to mediate interactions are growing simultaneously. B.S. could, however, become more insidious, less detectable and even desirable when it is highly personalized and hypertargeted, as marketing is starting to be now.

Politicians replaced by Artificial Intelligences
Considering fields ranging from science, with a zero-to-low tolerance for B.S., to politics, with a high tolerance for B.S., it is possible that in the future it would be desirable to replace people in high-B.S. professions with Artificial Intelligences. This would solve the agency problem and eliminate special-interest control overnight. Policy debates could be resolved by running a million different permutations via virtual simulation, varying every parameter of a given policy change such that overall utility is maximized.
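The brute-force policy search described above amounts to a parameter sweep over a utility surface. A minimal sketch, in which the policy parameters, their ranges and the utility function are all entirely hypothetical:

```python
# Toy sketch of the policy-permutation search described above: sweep a
# grid of policy parameter settings and keep the one with highest utility.
# The parameters and utility function are entirely hypothetical.
import itertools

def utility(tax_rate, spend_share):
    """Hypothetical smooth utility surface peaking at moderate settings."""
    return -((tax_rate - 0.3) ** 2) - ((spend_share - 0.5) ** 2)

grid = itertools.product(
    [round(0.05 * i, 2) for i in range(1, 10)],   # tax_rate 0.05..0.45
    [round(0.1 * i, 1) for i in range(1, 10)],    # spend_share 0.1..0.9
)
best = max(grid, key=lambda params: utility(*params))
print("best policy (tax_rate, spend_share):", best)
```

A real policy simulation would replace the toy utility function with an agent-based or econometric model and sweep millions of permutations, but the structure of the search, enumerate settings and maximize a utility measure, is the same.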

Sunday, October 18, 2009

Affinity Capital

A key concept in the 2.0 Economy is affinity capital. Deeper levels of information about every economic transaction are starting to be available such that individuals, businesses and communities can be very specific in directing and democratizing their capital. In many cases, products can be chosen that are organic, recyclable, fair trade, made from sustainable materials and made by companies with fair labor practices or whatever affinities or attributes the buyer cares about.

Affinity-directed capital can influence both cash inflows and outflows. Affinity inflows are the money earned. Earners can now be more selective by checking Corporate Social Responsibility reports if thinking of working for large companies; by being entrepreneurs and contractors, finding projects on website marketplaces like TopCoder (software programming), oDesk (professional services) and 99designs (graphic design); or by having clients seek them directly through their web activities and content. A taxonomy of affinity capital marketplace links is available here.

Affinity capital influences capital outflows too: investing, donating and purchasing. In investing, socially-responsible investing (SRI) mutual funds have been available for several years, and now peer-to-peer lending and social venture capital platforms allow investors to direct capital into these asset classes too. Philanthropy is merging with investing in cases like Kiva where investors find a lower or blended financial return is acceptable when social outcomes can also be achieved. The SocialCapitalMarkets conference has continued to draw several hundred worldwide social entrepreneurs to talk about how to bring social change with economic transactions at their annual September conference in San Francisco. The organization also sponsors The Hub, twelve worldwide physical spaces for social capital markets collaboration.

Affinity purchasing, voting with dollars based on product attributes, is another way of democratizing capital as consumers and businesses check websites like ClimateCooler, the Fair Trade Federation and others to see how socially and environmentally friendly products are before buying or purchasing directly from green product websites like GreenHome or other affinity-based marketplaces.

Socially-responsible and environmentally-friendly are some of the biggest affinity attributes but the key point is that deep attribute knowledge means that capital can be directed granularly to ANY affinity attribute.

Sunday, October 11, 2009

FutureThink: the Mindset of the Future

To think strategically about the future, it is necessary to realize that the mindset of today may be outdated for appropriately contemplating the future. The inadequacy of current human minds is sometimes given as a possible reason that humans may not be able to understand the full physics of the universe (multiple dimensions, multiple universes) or design artificial general intelligence.

One technique to improve the current mindset is to try deriving future intellectual norms from historically trending principles.

There are other examples of this concept. Successful athletes do not move to where the ball is now but where it is going to be. Ray Kurzweil exhorts not to invent a future object based on today’s technology, but rather where the technology will be in the future.

Three key principles and one meta-principle are discussed below.

1. Increase in humaneness: The first historical principle that can be identified is an increase in humaneness. Over time, there are new tiers of behavior that are deemed inhumane and become unacceptable. For example, slavery was acceptable in past eras but is not now, many societies have moved away from capital punishment and diminished discrimination is ongoing. Some contemporary issues are the rights of homosexuals, and noise and light pollution. In the future, it could be seen as inhumane to keep people waiting, or bored, or under-actualized, or without personalized-temperature control, or exposed to air pollution or disease toxins emitted by others.

Medicine and dentistry are obvious fields of increasing humaneness. Today, the state-of-the-art treatment of 100 years ago seems primitive and barbaric. Seen through the lens of the future, it is easy to contemplate a time when people would be shocked to be operated on with a knife; we are already starting to see this now, as da Vinci robotic surgery is 90% less invasive than traditional methods.

2. Increase in choices: A second historical principle is an increase in choices. For example, in music, with the advent of records, industry insiders were afraid that people would stop listening to the radio; instead, there was a boom in both, as they reinforced each other and expanded music-listening as a category. If a new concept provides value, it helps to refine, stratify and expand the whole market. Contemporary examples are TiVo, YouTube and free e-books spurring traditional sales.

3. Decrease in limitations: The third idea is related to the second: not only are there more choices, there are new choices. For example, population growth may be a problem if only certain areas of the Earth can be inhabited and if resources are constrained, but FutureTech may open up living in places that were formerly uninhabitable (for example, the Seasteading Institute is investigating the feasibility of water-based settlements). Resources could become more abundant, as is happening now with solar and wind energy and the possibility of repurposing cellulosic plant waste for fuel or food with synthetic biology. A significantly higher population could be supportable on Earth. Today’s constraints will not be tomorrow’s constraints.

Meta-principle: Abundance
These three principles, increase in humaneness, increase in choices and decrease in limitations, are all aspects of the overarching principle of abundance. With abundance, there is more in every dimension, not a world of either/or scarcity. For example, a classic futurist thought experiment is whether someone would forsake their embodied form by uploading their mind to a computer. This is a perfect example of the fallacy of applying current thinking, e.g., today’s resource-scarcity mindset, to future scenarios. A future seems much more likely where many options would be possible. People may have many digital backup copies of their mind files, possibly with multiple copies engaging in different activities, as well as one or more embodied forms, rather than an either/or choice of identity representation. An increase in options in existing and new possibility spaces, physical, intellectual, emotional and philosophical, is the hallmark of abundance thinking about future scenarios.

Sunday, October 04, 2009

Preventive Medicine and Docs vs. Genomics

Despite NIH Director Francis Collins’ strong support of personalized genomics (he claims he lost 15 pounds after direct-to-consumer genetic testing revealed that he is at higher risk for Type 2 Diabetes) and his observation that the only way to successfully transition to the genomic era is with a skilled professional workforce, doctors are reluctant to embrace genomics, and rarely try it even when it is made available to them and their patients for free (less than 5% uptake in a recent example in which El Camino Hospital and DNA Direct made genomic testing available to 1,000 physicians).

Top 10 reasons doctors will probably not be the ones implementing genomic data in patient care, in rank order. Physicians...

  1. think they have to be the domain experts of any health area they direct for patients and are too constrained, unwilling or unable to be a genomics domain expert
  2. do not see the clinical utility of genomics
  3. have the attitude that genomics is optional, not required
  4. have a precedent for non-adoption of preventive medicine tools as evidenced by slow uptake of molecular diagnostics
  5. are driven by liability and malpractice fears
  6. steer their practices away from services that insurance does not reimburse
  7. believe genomics overconsumes scarce medical resources
  8. are already cost, time, new knowledge acquisition constrained
  9. are resistant to change and enjoy autonomy in directing their own practices
  10. do not have specific tools for implementing genomics in their practices
Number one reason physicians would adopt genomics:
  1. if their peers did
Physicians are intelligent and could easily adopt genomics
In reality, the way that genomics adoption unfolds in the traditional health care system could be straightforward. Once regulated, physicians would have no choice but to adopt. Whole human genomes would be on file in patient Electronic Medical Records (EMRs), and genomic tests could be a few more items on the standard blood test menu, where primary care physicians interpret results within quantified ranges. Even though physicians spend on average only 12 minutes with each patient per year in the US, they are required to spend 100-200 hours per year on Continuing Medical Education and, being quite intelligent, could easily master the basics of delivering genomic medicine.
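
As a rough illustration of the "quantified ranges" idea, a genomic risk score could be flagged on a lab report exactly like any other out-of-range panel item. The function, condition names, values and reference ranges below are all hypothetical, a minimal sketch rather than any actual EMR system:

```python
# Hypothetical sketch: a genomic risk result reported like a standard
# blood-test item, flagged when it falls outside a reference range.
# Condition names, values and ranges are illustrative only.

def flag_result(name, value, low, high):
    """Return a lab-style flag for a quantified result."""
    if value < low:
        return (name, value, "LOW")
    if value > high:
        return (name, value, "HIGH")
    return (name, value, "NORMAL")

# A genomic relative-risk score sits alongside ordinary panel items.
panel = [
    ("LDL cholesterol (mg/dL)", 145, 0, 129),
    ("Fasting glucose (mg/dL)", 92, 70, 99),
    ("Type 2 diabetes relative risk", 1.4, 0.0, 1.2),  # genomic item
]

report = [flag_result(*item) for item in panel]
for name, value, flag in report:
    print(f"{name}: {value} [{flag}]")
```

The point of the sketch is that interpreting such a result requires no more specialist knowledge than reading any other flagged lab value.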

Best quotes from the September 2009 National Coalition for Professional Education in Genetics (NCHPEG) meeting:
  • “Not only is genomic data useless, educating physicians about genomic data is useless”
  • “Learning about genomics might be useful to my practice, so would speaking Spanish, but I’m not going to do it”
Solution: new care provider tier for Preventive Medicine
The disincentives to physician adoption of genomic medicine are really part of the bigger issue of how societies are going to shift to preventive medicine in general.
The traditional health care model of physicians and insurance companies is probably not going to deliver preventive health; a new tier of care providers, entrepreneurs, is.

Figure 1. Future Health
Image: MS Futures Group, Oct. 2009

A model for the future of health care is presented in Figure 1. The patient is at the center, increasingly taking responsibility for managing their own health. Easy-to-use tools, both devices and web-based software, could provide the first shell of actionable health information to individuals. Over time (decades), there is no reason that the primary care provider could not be superseded by automated health monitoring tools.

New Era Preventive Care Specialists: the Health Advisor
The next preventive medicine shell is the new tier of health care providers. When consumers say “I have my genomic data, now what?” traditional doctors say, “I have no idea what to do with that” or “That is not clinically useful,” but the New Era Preventive Care Specialists do not. They show what to do with personalized data by using genome-in-the-cloud browser tools to make genomic data intelligible and actionable. They incorporate genomic data, together with family history and current phenotype and biomarker data into an overall care plan (when is Keas finally going to launch? what about Omicia?), with a systemic approach (when will Entelos license their virtual patient technology to consumer-pointing applications?).

The Health Advisor (analogous to the Financial Advisor) could be one of the fastest growing new job areas. The business model may be traditionally trained experts in general medicine, genomics, nutrition and sports medicine coming together in private clinics to work in the new paradigm of exploding volumes of digitized health data (both health metrics collected daily and genomic, transcriptomic, etc. data) together with EMRs. One first service could be EMR assembly where patients own and control the data. Other services could include all manner of personalized health plan creation and monitoring. Anti-aging treatments would be another logical area for inclusion.

Health Savings Account (HSA) Dollars
Accustomed to the third-party pay model, consumers may object to paying for medical services (although they do shell out several billion dollars per year for weight-loss products), but instead of paying directly out-of-pocket, it is quite possible that preventive care services could be purchased with pre-tax HSA dollars, as more than half of U.S. large-company plans may be offering as an insurance option. This is a marketing point that should not be lost on the new era of preventive health providers.

Sunday, September 27, 2009

Status of Stem Cell Research

The World Stem Cell Summit in Baltimore, MD, held September 21-23, 2009, attracted several hundred professionals to discuss contemporary science, industry and societal perspectives on stem cells. Attendance was high, but down from last year, and, similar to cancer meetings, a key theme several keynote speakers acknowledged was

the overall lack of truly meaningful progress in stem cell research in the last twenty years.

Science Focus: Safe Stem Cell Generation

The science tracks featured current research in different stem cell areas, including the production of safe hESC (human embryonic stem cells) and iPS (induced pluripotent stem cells) for use in regenerative medicine, the research and therapeutic use of mesenchymal stem cells (MSCs) and hematopoietic stem cells (HSCs), and reports from specific sub-fields: cancer stem cells, cardiovascular stem cells and neural stem cells. Overall, the work presented was incremental and in many cases confirmed what was already known, such as the growing confirmation that cancer stem cells are probably responsible for triggering the resurgence of cancer but cannot at present be distinguished from other cells at the time of tumor removal.

Contract Research Demand: Cell Therapies and Recombinant Proteins
One stem cell area experiencing growth is contract research organizations, the outsourcing tool of choice for research labs and pharmaceutical companies in the production of biological materials. For large contract manufacturers such as Basel, Switzerland-based Lonza, the biggest demand area is cell therapies. Cell therapy denotes the introduction of any type of new cell into tissue for therapeutic purposes, but in the current context generally means any variety of stem cell-based therapy. Other large contract manufacturing organizations such as Morrisville, NC-based Diosynth (owned by Schering-Plough) lead in the production of biologics (antibodies, proteins), an important area for next-generation biotech where synthetic biology could have a big impact.

For smaller contract manufacturing organizations producing test compounds (e.g., 1 liter for $10,000) and scaling to Phase I and II clinical trial quantities, such as Baltimore, MD-based Paragon Bioservices, the biggest demand is for recombinant proteins. Recombinant proteins are created by inserting recombinant DNA into a plasmid of rapidly reproducing bacteria and can take many useful forms such as antibodies, antigens, hormones and enzymes.

Venture capital hot topics: zinc fingers, RT PCR, tech transfer
Zinc fingers (small protein domains that bind DNA, RNA, proteins and small molecules) have been surfacing in a variety of cutting-edge biotech innovations. In July 2009, St. Louis, MO-based biotechnology chemical producer Sigma-Aldrich (SIAL) announced the creation of the first genetically modified mammals using zinc finger nuclease (ZFN) technology to execute modifications such as taking away the tail of the zebrafish. A second example of recent landmark research involving zinc fingers is that of Carlos Barbas at Scripps who uses zinc finger proteins to reprogram serine recombinases as a more specific alternative to the homologous recombination method of genome modification. In addition, the Barbas lab has a useful web-based zinc finger protein design tool available for public use, Zinc Finger Tools.

Real-time PCR offerings continue to expand and flourish with declining prices as startup newcomer Helixis announced a $10,000 real-time PCR solution at the conference.

Bethesda, MD-based Toucan Capital, a leading investor in stem cells and regenerative medicine, discussed its sixteen interesting portfolio companies, such as San Diego, CA-based VetStem, which is conducting joint and tendon stem cell therapies for race horses.

Johns Hopkins has one of the country’s leading technology transfer programs, licensing a growing number of technologies each year (nearly 100 in the last fiscal year), and has a searchable, though not extremely user-friendly, website.

Sunday, September 20, 2009

Personalized genomics inflection point

One of the world’s fastest accelerating technologies is that of genomic sequencing. The first whole human genome (6 billion base pairs) was sequenced at a cost of $3b and was completed in 2003. The current cost is $20,000 for researchers (Complete Genomics) and $48,000 for consumers (with Illumina’s EveryGenome program). Leading third-generation sequencing company Pacific Biosciences affirmed at the Cold Spring Harbor Laboratory Personal Genomes meeting September 14-17, 2009 that the company has 12 prototype instruments in operation and continues to be on track for ~$100 (“the cost of a nice dinner”) whole human genome sequencing to be commercially available in the second half of 2010. NimbleGen indicated that they may have a $2,000 exome sequencer available in 2010.

In a challenging venture capital climate, Pacific Biosciences was able to close an additional $68m round in financing on August 12, 2009. Leading commercial sequencer Complete Genomics was also notable in closing a $45m D round on August 24, 2009. The company has sequenced 14 whole human genomes to date, and hopes to sequence exponentially more, 10,000, in 2010 at a minimum cost of $5,000 per genome.

Viability of DTC genomics sector
Where the genomics technology sector has rosy prognostications, the direct-to-consumer personalized genomics market has volatility. Events in the last several months have led to questions about the sector’s viability, with upheavals at the three leading companies: 23andme (“Avey Leaves 23andMe to Start Alzheimer's Research Foundation Using DTC Genomics Firm's Platform”), deCODEme (“deCODE close to broke,” August 11, 2009) and Navigenics (“Navigenics Names Jonathan Lord, MD to Serve as President and Chief Executive Officer,” April 7, 2009). Absent innovation, DTC genomics companies are a “window business” in the sense that the window for their current offerings may only be open for a short time with the advent of whole human genome sequencing and standardized public multi-SNP condition interpretation tools.

Figure 1. Direct-to-Consumer Genomics Offerings: ongoing price declines (Chart PDF)

As depicted in Figure 1, there are three types of Direct-to-Consumer (DTC) genomics offerings currently available directly to individuals: one-off SNP (single nucleotide polymorphism) tests for specific conditions and paternity tests, multi-SNP risk assessment tests mapping several SNPs to dozens of disease conditions and whole human genome sequencing assessing hundreds of disease risks. The five companies offering multi-SNP risk assessments are: 23andme ($399 for 111 conditions), deCODEme (42 conditions for $985), Navigenics (28 conditions for $999), Gene Essence (84 conditions for $1,195) and Pathway Genomics (77 conditions for $249). 23andme, deCODEme and Navigenics are the most transparent, disclosing the specific SNPs, research references and risk assessment methodologies for their tests, Gene Essence discloses SNPs and Pathway Genomics does not disclose anything. A detailed condition and SNP analysis is here.
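
Using the prices and condition counts quoted above (figures as of September 2009), a quick back-of-envelope comparison of cost per condition can be sketched; the rounding and ranking are mine:

```python
# Price-per-condition comparison of the five multi-SNP offerings listed
# above (prices and condition counts as quoted in September 2009).

offerings = {
    "23andme":          (399,  111),
    "deCODEme":         (985,  42),
    "Navigenics":       (999,  28),
    "Gene Essence":     (1195, 84),
    "Pathway Genomics": (249,  77),
}

per_condition = {
    name: round(price / conditions, 2)
    for name, (price, conditions) in offerings.items()
}

# Rank from cheapest to most expensive per condition covered.
for name, cost in sorted(per_condition.items(), key=lambda kv: kv[1]):
    print(f"{name}: ${cost}/condition")
```

Cost per condition is of course a crude metric, since the services differ in which SNPs they test and how transparent their methodologies are, but it makes the price dispersion concrete.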

Slow DTC genomics adoption
DTC genomics has had slow adoption so far for several reasons. First, there has been very little marketing; few consumers know of the availability and value proposition of DTC genomics services. Second, since automated tools are not yet available, many people are not interested in preventively managing their health, and may still perceive it to be the responsibility and domain of health care professionals. Third, the conventional but incorrect view is that genetic information is already known (from family history), negative and deterministic. Fourth, as initially pointed out by ExperimentalMan David Ewing Duncan, there are conflicting interpretations from DTC services for the same conditions, such as heart attack. This is because the scientific community has little knowledge and agreement yet regarding multi-SNP conditions. DTC companies are looking at different SNPs, assigning different quantitative risk values and employing differing estimates of overall population averages, all of which contribute to heterogeneous interpretations of risk for the same condition.
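
A minimal sketch of why the conflicting interpretations arise, assuming a simple multiplicative risk model (illustrative only, not any company's actual methodology): two services examining different SNP panels, with different effect sizes and different population averages, will report different risks for the same person and the same condition.

```python
# Illustrative sketch (not any company's actual method): why two services
# can report different risks for the same condition. Each scales a
# population-average risk by per-SNP relative risks -- but they choose
# different SNP panels, effect sizes and population averages.

def combined_risk(population_avg, snp_relative_risks):
    """Multiplicative model: average risk scaled by each SNP's relative risk."""
    risk = population_avg
    for rr in snp_relative_risks:
        risk *= rr
    return risk

# Hypothetical numbers for the same individual, same condition:
service_a = combined_risk(0.25, [1.3, 1.1])        # 2-SNP panel, 25% average
service_b = combined_risk(0.30, [1.3, 0.9, 1.2])   # 3-SNP panel, 30% average

print(f"Service A: {service_a:.1%}")
print(f"Service B: {service_b:.1%}")
```

Even with one SNP in common, the extra markers and the differing baseline push the two estimates apart, which is exactly the heterogeneity consumers encounter today.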

Sunday, September 13, 2009

VC guide to anti-aging biotechnology investing

Several promising startup companies focused on the nascent but obviously significant and growing anti-aging biotechnology space were present or discussed with interest at the recent SENS4 (Strategies for Engineered Negligible Senescence) conference in Cambridge, U.K., September 3rd – 7th, 2009 (program) (full conference report).

  1. Epeius Biotechnologies, San Marino, CA, USA: Rexin-G, a tumor-targeted injectable gene delivery system
  2. FoldRx, Cambridge, MA, USA: small molecule therapeutics to treat protein misfolding diseases, and bind and clear undesired molecules
  3. Gencia Corporation, Charlottesville, VA, USA: mitochondrial DNA rejuvenation using the rhTFAM (recombinant-human mitochondrial transcription factor A) protein
  4. Genscient, Fountain Valley, CA, USA: novel chronic disease therapeutics by combining genomics and selective screening (a large Alzheimer’s Disease genetic study is in progress with Kronos and TGen)
  5. Knome, Cambridge, MA, USA: whole human genome sequencing (consumer offering)
  6. Neotropix, Malvern, PA, USA: oncolytic viruses for the treatment of solid tumors
  7. Pentraxin Therapeutics Ltd, London, UK: small molecule drug CPHPC specifically targeting SAP (serum form of amyloid P) and removing it from the blood and brains of patients with Alzheimer’s Disease
  8. Repeat Diagnostics, Vancouver, BC, Canada: telomere length measurement for total lymphocyte and granulocyte populations (consumer offering)
  9. Retrotope, Los Altos Hills, CA, USA: using isotope effect to slow down damage pathways and control metabolic processes associated with oxidative stress
  10. StemCor Systems, Inc., Menlo Park, CA, USA: bone marrow harvesting system
  11. T.A. Sciences, New York, NY, USA: telomerase activation via the single molecule TA-65, licensed from Geron Corporation (consumer offering)
  12. TriStem Corporation, London, UK: retrodifferentiation technology to create stem cells from mature adult cells

Sunday, September 06, 2009

So discontinuous a discontinuity

A key aspect of thinking systemically about the future is being able to see how rapidly advancing technologies across many fields interrelate. Whatever next Internet-like discontinuity or singularity occurs will influence whatever comes thereafter. It is likely that some high percentage of what are now thought to be expected future advances will recede or be reshaped at minimum, for example:

  • If all chronic disease and aging becomes controllable and there is effective immortality, does uploading matter as much?
  • If artificial general intelligence is achieved, how does that change the exigency and requirements of molecular nanotechnology?
  • If affordable space launch and space-based solar power is achieved, what happens to ethanol, electrical and other terrestrial alternative vehicle and transportation infrastructure solutions?
  • If immersive virtual reality and post-material scarcity are achieved, does molecular nanotechnology matter and what happens to global political systems?
  • If whole human genome testing is available, do single SNP tests go away? If there are home health monitors and nanodiagnostics, do primary care physicians go away?

Sunday, August 30, 2009

Real-time unified search

It is surprising that a unified search application across different types of web content does not yet exist.

According to OneRiot, 40% of web searches at present are for real-time content such as that from Twitter, Facebook, PeopleBrowsr, digg, bookmarking sites, and blogging and microblogging sites (friendfeed, etc.).

Megafeed
One vision of a unified search 'megafeed' app would be a customizable HTML page with search across many types of web content, automatically updated and delivered together or organized into categories such as events, articles, comments, people, etc. The types of web content to search would be:

  • Traditional web search: Google, Bing, etc., which could be more richly granularized with content-tagging per a variety of parameters such as information type (news, blog, video, book, event, etc.), time (time added to web, time of occurrence), original vs. subsequent post and other distinctions.
  • Real-time web search: The emerging real-time content search engines should be extended and unified into one digital social interaction feed for Twitter, Facebook, LinkedIn, bookmarking, email, blogging, microblogging and possibly IM/SMS notification AND response. User-permissioned credentials can be browser-stored for such a unified action platform. In addition to usability, the fast-growing real-time web search companies are also focused on monetization, reinventing or generating AdSense-like models.
  • Local search: New restaurant and retail notifications, events, craigslist and other commercial postings of interest, friends traveling to the area (links to feeds from GeckoGo, Dopplr and other travel social networking and public calendaring websites).
  • Academic search: notification of new papers, articles or news. Federated PubMed- and arXiv-like journal portals are needed for all academic fields, including economics and the liberal arts.
  • Multimedia search: notification of non-text postings of photo, music, podcast and video content.
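
A minimal sketch of how such a megafeed might normalize and merge results from these backends; the sources, categories and items below are hypothetical placeholders, and a real implementation would pull from live APIs:

```python
# Minimal 'megafeed' sketch: items from different search backends are
# normalized into one structure, then merged chronologically and grouped
# into category buckets. All sources and items are placeholders.
from collections import defaultdict

def normalize(source, category, title, timestamp):
    """Reduce a backend-specific result to one common record shape."""
    return {"source": source, "category": category,
            "title": title, "time": timestamp}

items = [
    normalize("web",       "articles", "Smart grid overview",   1251000000),
    normalize("real-time", "comments", "Tweet about RT search", 1251000500),
    normalize("local",     "events",   "Robotics meetup",       1251000200),
    normalize("academic",  "articles", "New arXiv preprint",    1251000300),
]

# Unified feed: newest first, then grouped into the category view.
feed = sorted(items, key=lambda it: it["time"], reverse=True)
by_category = defaultdict(list)
for item in feed:
    by_category[item["category"]].append(item["title"])
```

The design choice here is a common record shape: once every backend's results are reduced to the same fields, both the single chronological feed and the categorized view fall out of the same data.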

Content demand mechanisms: ambience to supersede keywords
Content could be searched by the usual user-entered keywords or a deeper variety of content demand-interaction mechanisms could be developed, for example, permissioning-in by users such that ambient profiles from hard-drive content and previous web interactions automatically form and evolve (a precursor to pre-AI web interactions).

Sunday, August 23, 2009

Automatic Markets

At Singularity University, one of the most pervasive memes was the “routing packets” metaphor: that many current activities are just like routing packets on the Internet. This includes areas such as people in driverless cars, electrons in electric vehicle charging, and power entry, load-balancing, routing and delivery on smart-grid electricity networks.

Fungible resources and quantized packet-routing
The packet-routing concept could be extended to neurons (routed in humans or AIs), clean water, clean air, food, disease management, health care system access and navigation, and in the farther future, information (neurally-summoned) and emotional support (automatically-summoned per human dopamine levels from nearby people or robots). It is all routing…directing quantized fungible resources to where they are needed and requested.

Automatic Markets
Since these various resources are not uniformly demanded, the idea of markets as a resource allocation mechanism is immediately obvious.

Further, automated or automatic markets with pre-specified user preferences, analogous to limit orders, could be optimal. Markets could meet in equilibrium and transact, buying, selling and adjusting automatically per evolving conditions and pre-programmed user profiles, permissions and bidding functions.

Truly smart grids would have automatic bidding functions (as a precursor to more intelligence-like utility functions) that would indicate preferences and bid and equalize resource allocation, the truly invisible digital hand.

The key parameters of a working market, liquidity, price discovery and ease of exchange would seem to be present in these cases with large numbers of participants and market monitoring and bidding via web or SMS interfaces. The next layer, secondary markets and futures and options could also evolve as an improvement to market efficiency, if designed with appropriate incentives.
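
A toy sketch of the automatic-market idea, assuming pre-specified limit orders and a simple greedy matching rule; the participant names, prices and midpoint-pricing rule are all illustrative choices, not a prescription for real market design:

```python
# Toy 'automatic market' sketch: participants pre-specify limit orders
# (standing preferences) and the market clears automatically wherever a
# bid meets or exceeds an ask. All names and prices are illustrative.

def match_orders(bids, asks):
    """Greedily match highest bids against lowest asks.
    Each order is (participant, limit_price); trade price is the midpoint."""
    bids = sorted(bids, key=lambda o: o[1], reverse=True)
    asks = sorted(asks, key=lambda o: o[1])
    trades = []
    while bids and asks and bids[0][1] >= asks[0][1]:
        buyer, bid = bids.pop(0)
        seller, ask = asks.pop(0)
        trades.append((buyer, seller, (bid + ask) / 2))
    return trades

# e.g. hypothetical smart-grid participants bidding for a kWh block:
trades = match_orders(
    bids=[("home_a", 0.12), ("home_b", 0.10)],
    asks=[("solar_1", 0.09), ("wind_1", 0.11)],
)
```

Here home_a's standing bid of $0.12 clears against solar_1's $0.09 ask at the $0.105 midpoint, while home_b's $0.10 bid sits unmatched below wind_1's $0.11 ask, exactly the limit-order behavior described above, executed with no human in the loop.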

Automatic markets are not without flaw; they exist now in traditional financial markets, causing occasional but volatile disruptions in the form of quantitative program trading (blamed for exacerbating the 1987 Black Monday stock market crash) and flash trading. Speculative aspects are not trivial and would be a critical area for market designers to watch, particularly managing for high liquidity and equal access (e.g., ensuring that faster Internet connections do not confer an advantage).

Markets to grow as a digitized resource allocation tool
At present, markets are not pervasive in life. The most notable examples are traditional financial markets, eBay, peer-to-peer finance websites and prediction markets. Being in a global digital era with the ability to use resources in a more fungible and transferable way could further promulgate the use of markets as a resource allocation tool.

A focus on preference rather than monetary value, and other currencies such as attention, authority, trust, etc. could vastly extend the range of implementation of market principles.

Sunday, August 16, 2009

iPhone Biodefense App

Right now it would be nice for people to be able to perform a detailed inspection of whatever environment they are in, and of themselves internally. As the future evolves, it could become an exigency. Portable personal biosensing devices for biothreat defense and medical self-diagnosis could become de rigueur, most logically as an extension of current mobile device platforms.

Hardware Requirements:

  • Integrated Lab-on-a-chip module with flow cytometer, real-time PCR, microarray and sequencing unit (genome, proteome, metabolome, lipidome, etc.)
  • Disposable finger-prick lancets
Software Requirements:
  • Data is collected and perhaps digitized locally, then transmitted for processing and interpretation via web services
What is the current status of the iPhone Biodefense App?
  • A. Order online
  • B. DIY with components from Fry’s
  • C. Have a roadmap, getting supplies and building tools
  • D. Homesteads and landgrab available to pioneers
  • E. “Ahead of the science,” aka it’s always 20 years out!
Answer: C. Have a roadmap, getting supplies and building tools
Single-cell identification, extraction and genotyping is starting to be possible from a research perspective (e.g., the Love Lab at MIT). Lab-on-a-chip functionality has been miniaturized (e.g., small flow cytometers, small PCR machines). Now the trick is to integrate and add features to these systems, extend their functionality, shrink them further and reduce constraints. Microarrays and sequencing also have several innovation cycles ahead.

Key constraint: time
In addition to moving down the cost curve (most relevant for sequencing), performance time is the key constraint. Substances, expressed genes, blood biomarkers, etc. can be detected but it is taking hours and days when it needs to be immediate.

Declassify custom biodefense microarrays
Lawrence Livermore National Laboratory has one of the most advanced biodefense labs in the country. Custom microarrays have been developed for government agencies that the lab would now like to transfer into the public health domain. This could revolutionize and hasten commercial biosensing applications much like the declassification of adaptive optics revolutionized astronomy. At least three custom microarrays have been developed:
  • Microbial Detection Array: identify what a substance is
  • Virulence Array: identify how much damage a substance could do
  • Microbial Defense Genotyping Array: identify SNPs, indels

Sunday, August 09, 2009

Open Global Courseware

The U.K., long an adopter of surveillance technology, announced recently that high-definition CCTV cameras from Classwatch have been installed in 94 schools. The result has been improved classroom management and there are plans to install hundreds more cameras nationwide in primary and secondary schools.

Free global education resource
With minimal effort, this internal surveillance initiative could be expanded into a worldwide sousveillance victory. A global education resource could be generated by broadcasting and archiving the live feeds to the web for access by teachers and students worldwide in their own classrooms and via cell phones. This is essentially an extension of MIT’s open courseware concept.

Language imperialism and the return of the British Empire?
The U.K. might briefly enjoy the notion of re-establishing the British Empire by exporting English-language education, but

language is becoming more fungible over time
The issue of language imperialism could be avoided with the use of audio translation tools (Google Translate – audio version?) and by opting in CCTV broadcasts from schools in other countries. The pilot project phases could be U.K. transmissions targeted at India, and Beijing transmissions targeted at rural Chinese schools.

PenPal 2.0 flattens the world
Classroom broadcasts could quickly become interactive with commenting and messaging on the streams. Students worldwide could get to know each other and work on team projects together in virtual world classrooms like Second Life’s Teen Grid; a multi-dimensional PenPal 2.0. Students in India could come up with ideas to work on problems in the U.K. by interviewing British students and vice versa. Teacher and student exchange programs could arise. Students could vote on the curriculum.
The real way to raise test scores would be to have live head-to-head competitions between different schools in a district, country or around the world (“The class in Chennai did 5% better….”).

Local community engagement tool
Internet broadcast could also enable the local community. Parents could tune in to their children’s classrooms (“Mom, did you see what I did around 10:30?”…”What happened at school today?” “Mom, just watch the feed archive…”). The social networking dimension could deepen student, teacher and parent interaction as many are already managing homework assignments collaboratively on the web.

American Idol Teacher: injecting abundance
Classroom broadcast could bring more abundance to teaching by providing acknowledgement (whuffie) for good teachers. Innovative and engaging teachers could reach a global audience and become YouTube celebrities. There could be competitions for the Best Teacher of the Pythagorean theorem, Best Teacher in Swindon, etc. as nominated through video clips. Videos could be linked to teacher ranking websites. From a policy perspective, education could become easier to evaluate and standardize. Countrywide best practices could be culled to train new teachers.

Conclusion: inevitability of full-life recording
It seems inevitable that video surveillance/sousveillance will increasingly penetrate public and private areas for a variety of reasons ranging from safety and crime control to life-logging. One classic opposition argument is that recording inhibits ‘natural’ behavior; however, most people quickly forget and adjust, and the ongoing recording of society is likely to advance without much opposition as long as there is a balance between surveillance and sousveillance (e.g., popular access to the technologies and streams).

Sunday, August 02, 2009

Bio-design automation and synbio tools

The ability to write DNA could have an even greater impact than the ability to read it. Synthetic biologists are developing standardized methodologies and tools to engineer biology into new and improved forms, and presented their progress at the first-of-its-kind Bio-Design Automation workshop (agenda, proceedings) in San Francisco, CA on July 27, 2009, co-located with the computing industry’s annual Design Automation Conference. As with many areas of technological advancement, the requisite focus is on tools, tools, tools! (A PDF of this article is available here.)


Experimental evidence has helped to solidify the mindset that biology is an engineering substrate like any other, and the work is now centered on creating standardized tools that are useful and reliable in an experimental setting. The metaphor is very much that of computing: just as most contemporary software developers work at high levels of abstraction and need not concern themselves with the 1s and 0s of machine language, in the future, synthetic biology programmers would not need to work directly with the As, Cs, Gs and Ts of DNA or understand the architecture of promoters, terminators, open reading frames and such. However, with synthetic biology in its early stages, laying the groundwork to define and assemble these abstraction layers is the current task.

Status of DNA synthesis
At present, the DNA synthesis process is relatively unautomated, unstandardized and expensive ($0.50-$1.00 per base pair (bp)); it would cost $1.5-3 billion to synthesize a full human genome. Synthesized DNA, which can be ordered from numerous contract labs such as DNA 2.0 in Menlo Park, CA and Tech Dragon in Hong Kong, has been following Moore’s Law (actually faster than Moore’s Law: the Carlson curves show a 2x/yr improvement vs. 1.5x/yr), but is still slow compared to what is needed. Right now short oligos, oligonucleotide sequences up to 200 bp, can be reliably synthesized, but a low-cost repeatable basis for genes and genomes extending into the millions of bp is needed. Further, design capability lags synthesis capability, being about 400-800-fold less capable and allowing only 10,000-20,000 bp systems to be fully forward-engineered at present.
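
A back-of-envelope projection of this cost curve can make the trend concrete. The sketch below assumes a $0.75/bp midpoint of the quoted range, a 3 billion bp genome (consistent with the $1.5-3 billion figure above) and a steady 2x/yr improvement rate; all three are simplifying assumptions:

```python
import math

# Back-of-envelope sketch of the synthesis cost trend described above:
# ~$0.75/bp today (midpoint of the $0.50-$1.00 range), with cost per
# base halving roughly once a year (the ~2x/yr Carlson-curve rate,
# vs ~1.5x/yr for Moore's law). All assumptions, not measurements.

COST_PER_BP = 0.75          # midpoint of the quoted $0.50-$1.00 range
GENOME_BP = 3_000_000_000   # consistent with the $1.5-3 billion figure
IMPROVEMENT_PER_YEAR = 2.0  # assumed steady 2x/yr cost improvement

genome_cost_now = COST_PER_BP * GENOME_BP  # ~$2.25 billion

def cost_after(years):
    """Projected cost of synthesizing a genome after the given years."""
    return genome_cost_now / (IMPROVEMENT_PER_YEAR ** years)

# Years until synthesizing a whole genome costs ~$1 million:
years_to_1m = math.log(genome_cost_now / 1e6, IMPROVEMENT_PER_YEAR)
```

Under these assumptions, genome-scale synthesis at a $1 million price point is roughly eleven doublings away, which is the sense in which the curve is fast but "still slow compared to what is needed."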

So far, practitioners have organized the design and construction of DNA into four hierarchical tiers: DNA, parts, devices and systems. The status is that the first two tiers, DNA and parts (simple modules such as toggle switches and oscillators), are starting to be consistently identified, characterized and produced. This is allowing more of an upstream focus on the next two tiers, complex devices and systems, and the methodologies that are needed to assemble components together into large-scale structures, for example those containing 10 million bp of DNA.

Standardizing the manipulation of biology
A variety of applied research techniques for standardizing, simulating, predicting, modulating and controlling biology with computational chemistry, quantitative modeling, languages and software tools are under development and were presented at the workshop.

Models and algorithms
In the models and algorithms session, there were some examples of the use of biochemical reactions for computation and optimization, performing arithmetic computation essentially the same way a digital computer would. Basic mathematical models such as the CME (Chemical Master Equation) and SSA (Stochastic Simulation Algorithm) were applied and extended to model, predict and optimize pathways and describe and design networks of reactions.
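
To make the SSA concrete, here is a minimal Gillespie-style simulation of the simplest possible system, a single degradation reaction A → ∅. The rate constant, molecule count and seed are illustrative; real workshop models extend this to networks of coupled reactions:

```python
import math
import random

# Minimal Gillespie SSA (Stochastic Simulation Algorithm) sketch for a
# single degradation reaction A -> 0 with rate constant k: repeatedly
# draw an exponential waiting time from the total propensity, then fire
# one reaction event. Parameters are illustrative.

def ssa_decay(n_a, k, t_end, seed=42):
    """Simulate stochastic decay of n_a molecules; return (remaining, event times)."""
    rng = random.Random(seed)
    t, times = 0.0, [0.0]
    while n_a > 0:
        propensity = k * n_a                        # total reaction propensity
        t += -math.log(1.0 - rng.random()) / propensity  # exponential waiting time
        if t > t_end:
            break
        n_a -= 1                                    # one degradation event fires
        times.append(t)
    return n_a, times

remaining, event_times = ssa_decay(n_a=100, k=0.1, t_end=50.0)
```

Each pass through the loop is one stochastic event, so the trajectory is exact for the underlying Chemical Master Equation rather than an ODE approximation, which is precisely the appeal of the SSA for small-copy-number genetic circuits.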

Experimental biology
The experimental biology session considered some potential applications of synthetic biology. First was the automated design of synthetic ribosome binding sites to make protein production faster or slower, with the finding that the translation rate can be predicted if the Gibbs free energy (delta G) of ribosome binding can be predicted. Second, an in-cell disease protection mechanism was presented in which synthetic genetic controllers prevent the lysis that normally occurs when the lysis-lysogeny switch is turned on in the disease process (lysogeny being the no-harm state and lysis the death state).
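The ribosome-binding-site result rests on a thermodynamic relationship: initiation rate falls off exponentially as the free energy of ribosome binding rises. A rough sketch of that relationship is below; the proportionality and the beta value are illustrative assumptions here, not the fitted parameters of any published model.

```python
import math

def relative_translation_rate(delta_g_kcal_mol, beta=0.45):
    """Relative translation initiation rate ~ exp(-beta * delta G).
    More negative delta G (tighter ribosome binding) -> faster initiation.
    `beta` (1/(kcal/mol)) is an assumed illustrative slope."""
    return math.exp(-beta * delta_g_kcal_mol)

strong = relative_translation_rate(-4.0)   # favorable binding site
weak = relative_translation_rate(+2.0)     # unfavorable binding site
print(strong / weak)   # the stronger site initiates ~15x faster here
```

The practical upshot is forward design: choose a target rate, then search sequence space for a binding site whose predicted delta G hits it.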

Tools and parts
In the tools and parts session, several software-based frameworks and design tools were presented, many of which are listed in the software tools section below.

Languages and standardization
The languages and standardization session had discussions of language standardization projects such as the BioStream language, PoBol (Provisional BioBrick Language) and the BioBrick Open Language (BOL).

Software tools: a SynBio CrunchUp
Several rigorous computer-aided design and validation software tools and platforms are emerging for applied synthetic biology, many of which are freely available and open-source.

  • Clotho: An interoperable design framework supporting symbol, data model and data structure standardization; a toolset designed in a platform-based paradigm to consolidate existing synthetic biology tools into one working, integrated toolbox
  • SynBioSS - Synthetic Biology Software Suite: A computer-aided synthetic biology tool for the design of synthetic gene regulatory networks; computational synthetic biology
  • RBS Calculator: A biological engineering tool that predicts the translation initiation rate of a protein in bacteria; it may be used in Reverse Engineering or Forward Engineering modes
  • SeEd - Sequence Editor (work in progress): A tool for designing coding sequence alterations, a system conceptually built around constraints instead of sequences
  • Cellucidate: A web-based workspace for investigating the causal and dynamic properties of biological systems; a framework for modeling modular DNA parts for the predictable design of synthetic systems
  • iBioSim: A design automation software for analyzing biochemical reaction network models including genetic circuits, models representing metabolic networks, cell-signaling pathways, and other biological and chemical systems
  • GenoCAD: An experimental tool for building and verifying complex genetic constructs derived from a library of standard genetic parts
  • TinkerCell: A computer-aided design software for synthetic biology

Future of BioCAD
One of the most encouraging aspects of the current evolution of synthetic biology is the integration the field is forging with other disciplines, particularly electronics design and manufacture, DNA nanotechnology and bioinformatics.

Scientists are meticulously applying engineering principles to synthetic biology, while recognizing that novel innovations are also required because some issues are specific to engineering biological systems. These technical issues include device characterization, impedance matching, rules of composition, noise, cellular context, environmental conditions, rational design vs. directed evolution, persistence, mutations, crosstalk, cell death, chemical diffusion, motility and incomplete biological models.

As happened in computing, and is happening now in biology, the broader benefit to humanity of being able to develop and standardize abstraction layers in any field can be envisioned.
Clearly there will be ongoing efforts to manipulate and create all manner of biology and matter at ever finer granularity. Some of the subsequent areas where standards and abstraction hierarchies could be useful, though not immediately, are the next generations of computing and communications, molecular nanotechnology (atomically precise matter construction from the bottom up), climate, weather and atmosphere management, planet terraforming and space colony construction.

(Image credits: www.3dscience.com, www.biodesignautomation.org)

Sunday, July 26, 2009

Ethics of brainless humans

As a thought experiment, if it were possible, would it be ethical to make humans without brains for research purposes?

The idea arises because a more accurate model of humans for drug testing would be quite helpful. Drugs may work in mice, rats and monkeys but not in humans, or in some humans but not others. Human biology is more complex, and the detailed pathways and mechanisms are not yet understood.

Of course by definition, a brainless human is not really a human; a human form without a brain would be more equivalent to a test culture of liver cells than a cognitive agent.

Tissue culturing, regenerative medicine and 3D organ printing
The less contentious versions of the idea of growing brainless humans are currently under initial exploration: taking tissue from a human, growing it up in culture, and testing drugs or other therapies on it. A further step up is regenerative medicine, producing artificial organs from a person's own cells, such as the Wake Forest bladder and Gabor Forgacs's 3D organ printing work.

Brain as executive agent may be required
The next steps for testing would be creating systems of interoperating tissues and organs (e.g., how would this person's heart and liver respond to this heart drug?) and possibly a complete collection of human biological systems sans brain. One obvious issue is that this might not work at all: the brain is a critical component of a human, and some sort of executive organizing system like the brain may be required, so a brainless human perhaps could not be built. Also, medical testing would need to include the drug's impact on the brain, and the brain's role in and interaction with the other biological systems.

Ethical but impractical
While it is quite clear that generating a full living human for research purposes would be unethical, it is hard to argue that generating a brainless human, a complex collection of human biological systems without a brain, which is not really human and does not have consciousness or personhood, would be unethical. Certainly some arguments could be made to the contrary, based on the lack of specific knowledge about consciousness and concepts of personhood, but these would seem to be outweighed.

Unlikely to arise
It is extremely unlikely that the situation of manufacturing brainless humans for research purposes would ever arise, first because a lot of testing and therapy may be possible with personalized tissue cultures and regenerative medicine, informed by genomic and proteomic sequencing. Also, in an eventual era in which it might be possible to construct a brainless human or a collection of live interacting tissues and organ systems, it would probably be more expedient to model the whole biological system digitally.