Sunday, February 01, 2015

Machine Cognition and AI Ethics Percolate at AAAI 2015

The AAAI's Twenty-Ninth Conference on Artificial Intelligence was held January 25-30, 2015 in Austin, Texas. Machine cognition was an important focal area, covered in two workshops, AI and Ethics and Beyond the Turing Test, and in a special track on Cognitive Systems. Some of the most interesting emergent themes are discussed below.

Computational Ethics Systems
One main research activity in machine ethics is developing computational ethics systems. Several such systems now exist, but there is a paucity of overall standards bodies, general ethics modules, and articulations of the universal principles that might be included, such as human dignity, informed consent, privacy, and benefit-harm analysis. Standards bodies starting to address these ideas include the IEEE's Technical Committee on Robot Ethics and the European committees involved in RoboLaw and Roboethics.

One required feature of computational ethics systems could be the ability to flexibly apply different systems of ethics, more accurately reflecting the ways that human intelligent agents approach real-life situations. For example, it is known from early programming efforts that simple models like Bentham and Mill's utilitarianism are not robust enough as ethics models: they do not incorporate comprehensive human notions of justice that extend beyond the immediate decision-making situation. Helpfully, machine systems have on their own evolved models more expansive than utilitarianism, such as the prima facie duty approach. In the prima facie duty approach, there is a more complex conceptualization of intuitive duties, reputation, and the goal of increasing benefit and decreasing harm in the world. This is more analogous to real-life situations where multiple ethical obligations compete to determine the right action. GenEth is a machine ethics sandbox for Mac OS that is available to explore these kinds of systems, with details discussed in this conference paper.
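To make the prima facie duty idea concrete, here is a minimal sketch (not GenEth itself; the duty names, weights, actions, and scores are hypothetical) of how competing duties might be weighed to select an action: each candidate action is scored by how strongly it satisfies or violates each duty, and the weighted sum determines the choice.

```python
# Minimal illustrative sketch of a prima facie duty calculus; the duties,
# weights, and actions are hypothetical, not the actual GenEth system.

# Relative importance of each prima facie duty.
DUTIES = {"non_maleficence": 3.0, "beneficence": 2.0, "honesty": 1.5, "respect_autonomy": 2.5}

# Each action is rated on how well it satisfies (+) or violates (-) each duty.
ACTIONS = {
    "notify_caregiver": {"non_maleficence": +1, "beneficence": +2, "honesty": +1, "respect_autonomy": -1},
    "stay_silent":      {"non_maleficence": -2, "beneficence": -1, "honesty":  0, "respect_autonomy": +1},
}

def score(action_profile):
    """Weighted sum of duty satisfactions; higher means more ethically preferred."""
    return sum(DUTIES[d] * v for d, v in action_profile.items())

best = max(ACTIONS, key=lambda a: score(ACTIONS[a]))
print({a: score(p) for a, p in ACTIONS.items()}, "->", best)
```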

Beyond the flexible application of different ethics systems, there are also integrated ethics systems. As in philosophy, computational ethics modules connote the idea of metaethics, a means of evaluating and integrating multiple ethical frameworks. These computational frameworks differ by ethical parameters and machine type; for example, an integrated system is needed to enable a connected car to interface with a smart highway. The French ETHICAA (Ethics and Autonomous Agents) project seeks to develop embedded and integrated metaethics systems.

An ongoing debate is whether machine ethics should be a separate module or part of regular decision-making. Even though ethics might ultimately be best as a feature of any kind of decision-making, ethics is easiest to implement now, in the early stages of development, as a standalone module. Another point is that ethics models may vary significantly by culture; consider for example collectivist versus individualist societies, and how these ideals might be captured in code-based computational ethics modules. Happily for implementation, however, the initial tier of required functionality might be easy to achieve: obtaining ethicist consensus on how, overall, we want robots to treat us as humans. QA'ing computational ethics modules and machine behavior might be accomplished through some sort of 'Ethical Turing Test,' metaphorically rather than literally evaluating the degree to which machine responses match human ethicist responses.
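The 'Ethical Turing Test' notion can be pictured as a simple agreement rate between a machine's judgments and an ethicist consensus over a set of dilemmas; the dilemma identifiers and judgments below are hypothetical placeholders.

```python
# Toy sketch of an 'Ethical Turing Test' style evaluation: the fraction of
# dilemmas on which the machine's judgment matches the ethicist consensus.
# Dilemma IDs and judgments are hypothetical placeholders.

ethicist_consensus = {"dilemma_1": "act", "dilemma_2": "refrain", "dilemma_3": "act"}
machine_judgments  = {"dilemma_1": "act", "dilemma_2": "act",     "dilemma_3": "act"}

matches = sum(machine_judgments[d] == v for d, v in ethicist_consensus.items())
agreement = matches / len(ethicist_consensus)
print(f"Agreement with ethicist consensus: {agreement:.0%}")  # 67%
```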

Computational Ethics Systems: Enumerated, Evolved, or Corrigible
There are different approaches to computational ethics systems. Some involve the attempted enumeration of all involved principles and processes, reminiscent of Cyc. Others attempt to evolve ethical behavioral systems like the prima facie duty approach, possibly by running machine learning algorithms over large data corpora. Still others attempt to instill values-based thinking in ways like corrigibility. Corrigibility is the idea of building AI agents that reason as if they are incomplete and potentially flawed in dangerous ways. Because the AI agent apprehends that it is incomplete, it is encouraged to maintain a collaborative rather than deceptive relationship with its programmers, who may be able to provide more complete information, even while both parties maintain different ethics systems. Thus a highly-advanced AI agent might be built that is open to online value learning, modification, correction, and ongoing interaction with humans. Corrigibility is proposed as a reasoning-based alternative to enumerated and evolved computational ethics systems, and also as an important 'escape velocity' project. Escape velocity refers to bridging the competence gap between the current situation, in which human moral concepts are not yet reliably instantiated in AI systems, and a potential future of true moral superintelligences indispensably orchestrating many complex societal activities.

Lethal Autonomous Weapons
Machine cognition features prominently in lethal autonomous weapons, where weapon systems are increasingly autonomous, making their own decisions in target selection and engagement without human input. The banning of autonomous weapons systems is currently under debate. On one side, detractors argue that full autonomy is too much: these weapons no longer have 'meaningful human control' as a positive obligation, and they do not comply with the Geneva Convention's Martens Clause, which requires that fully autonomous weapons comply with principles of humanity and conscience. On the other side, supporters argue that machine morality might exceed human morality and be applied more accurately and precisely. Ethically, it is not clear whether weapons systems should be considered differently from other machine systems. For example, the Nationwide Kidney Exchange automatically allocates two transplant kidneys per week, where the lack of human involvement has been seen positively as a response to the agency problem.

Future of Work and Leisure
The automation economy is one of the great promises of machine cognition, where humans are able to offload more and more physical tasks, and also cognitive activities to AI systems. The Keynesian prediction of the leisure society by 2030 is becoming more imminent. This is the idea that leisure time, rather than work, will characterize national lifestyles. However, several thinkers are raising the need to redefine what is meant by work. The automation economy, possibly coupled with Guaranteed Basic Income initiatives, and an anti-scarcity mindset, could render obligation-based labor a thing of the past. There is ample room for redefining ‘work’ as productive activity that is meaningful to one’s sense of identity and self-worth for fulfillment, self-actualization, social-belonging, status-garnering, mate-seeking, cooperation, collaboration, and meeting other needs. The ‘end of work’ might just mean the ‘end of obligated work.’ 

Persuasion and Multispecies Sensibility
As humans, we still mostly conceive and employ the three modes of persuasion outlined centuries ago by Aristotle: ethos, relying on the speaker's qualities like charisma; pathos, using emotion or passion to cast the audience into a certain frame of mind; and logos, employing the words of the oration as the argument. However, human-machine interaction might cause these modes of persuasion to be rethought and expanded, in both the human and machine contexts. Given that machine value systems and character may be different, so too might the most effective persuasion systems, both those employed on and those deployed by machines. The ethics of human-machine persuasion is an area of open debate; for example, researchers are undecided on questions such as "Is it morally acceptable for a system to lie to persuade a human?" There is a rising necessity to consider ethics and reality issues from a thinking machine's point of view, in an overall future world system that might comprise multiple post-biological and other intelligent entities interacting together in digital societies.

Tuesday, June 03, 2014

EmergingTechs Nanotechnology, Synthetic Biology, and Geoengineering in the Governance Eye

The second annual Governance of Emerging Technologies conference held in Phoenix AZ May 27-29, 2014 discussed a variety of governance (regulation), legal, and ethical aspects of three areas of emerging technology: nanotechnology, synthetic biology, and geoengineering (climate management).

The prevailing attitude in nanotechnology is much like that in artificial intelligence: 'no new news,' some degree of weariness after a few hype-bust cycles, and the invisibility frontier. The invisibility frontier is when an exciting emerging technology becomes so pervasive and widely deployed that it becomes invisible. There are numerous nanotechnology implementations in a range of fields including materials, computing, structures, nanoparticles, and new methods, similar to the way artificial intelligence deployments are also widely in use but 'invisible' in fraud detection, ATM operation, data management algorithms, and traffic coordination.

Perhaps the biggest moment of clarity was that different groups of people with different value systems, cultures, and ideals are coming together with more frequency than historically to solve problems. The locus of international interaction is no longer primarily geopolitics, but shifting to be much more one of collaboration between smaller groups in specific contexts who are inventing models for sharing knowledge that simultaneously reconfigure and extend it to different perspectives and value systems.

Sunday, January 12, 2014

Integrated Information as a Measure of Consciousness

The fourth FQXi international conference was held in Vieques Puerto Rico January 6-10, 2014 on the Physics of Information.

The first and primary focus was on information in the quantitative physical sense, as opposed to the epistemic sense, particularly as information is used in quantum mechanics. There are several objective measurable definitions of information such as Shannon information. Objective information and other mathematical and physics theories were also used to formalize definitions and distinctions between determinism, free will, and predictability, and intelligence versus consciousness.

Many talks and debates helped to sharpen thinking regarding consciousness, where we have been stuck with crude explanatory heuristics like 'consciousness may be an emergent property of any sufficiently complex system.' Interesting and provocative research was presented by Giulio Tononi and Larissa Albantakis from the Center for Sleep and Consciousness at the University of Wisconsin. They have an objective measure called 'integrated information,' which is meant to capture the compositional character of experience (including subjective experience) and to represent the causal relations amongst macro-level elements within a system. There could be systems that are complex at the macro level but have low integrated information if there are not extensive mechanisms with causal relations within the system. In other words, complexity does not necessarily confer consciousness, and the relevant factors to look for could be causality and experience.
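As a rough illustration of that distinction (a toy stand-in, not Tononi's actual phi calculation), the sketch below contrasts a system of many independent elements, which has high joint entropy ('complexity') but essentially no statistical integration, with a fully coupled system, using total correlation as a crude proxy for integration.

```python
# Toy illustration (not IIT's phi): "total correlation" as a crude stand-in
# for integration. Many independent elements give high joint entropy
# ("complexity") but near-zero integration; fully coupled elements give
# high integration. Requires only numpy.
import numpy as np

def entropy(samples):
    """Shannon entropy (bits) of the rows of a sample array treated as joint states."""
    _, counts = np.unique(samples, axis=0, return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

def total_correlation(samples):
    """Sum of per-element entropies minus joint entropy; zero iff elements are independent."""
    marginal = sum(entropy(samples[:, [i]]) for i in range(samples.shape[1]))
    return marginal - entropy(samples)

rng = np.random.default_rng(0)
independent = rng.integers(0, 2, size=(10000, 8))                     # 8 independent bits
coupled = np.repeat(rng.integers(0, 2, size=(10000, 1)), 8, axis=1)   # 8 copies of one bit

print(total_correlation(independent))  # ~0 bits: complex but not integrated
print(total_correlation(coupled))      # ~7 bits: highly integrated
```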

Sunday, March 31, 2013

What's new in AI? Trust, Creativity, and Shikake

The AAAI spring symposia held at Stanford University in March provide a nice look at the potpourri of innovative projects in process around the world by academic researchers in the artificial intelligence field. This year’s eight tracks can be grouped into two overall categories: those that focus on computer-self interaction or computer-computer interaction, and those that focus on human-computer interaction or human sociological phenomena as listed below.

Computer self-interaction or computer-computer interaction (Link to details)
  • Designing Intelligent Robots: Reintegrating AI II 
  • Lifelong Machine Learning 
  • Trust and Autonomous Systems 
  • Weakly Supervised Learning from Multimedia
Human-computer interaction or human sociological phenomena (Link to details)
  • Analyzing Microtext 
  • Creativity and (Early) Cognitive Development 
  • Data Driven Wellness: From Self-Tracking to Behavior Change 
  • Shikakeology: Designing Triggers for Behavior Change 
This last topic, Shikakeology, is an interesting new category that is completely on-trend with the growing smart matter, Internet-of-things, Quantified Self, Habit Design, and Continuous Monitoring movements. Shikake is a Japanese concept, where physical objects are embedded with sensors to trigger a physical or psychological behavior change. An example would be a trash can playing an appreciative sound to encourage litter to be deposited.
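A Shikake-style trigger can be sketched as a tiny sense-and-respond loop: an embedded sensor detects the target physical event and the object answers with a cue meant to nudge behavior. The sensor and speaker functions below are hypothetical hardware stand-ins.

```python
# Minimal sketch of a Shikake-style trigger: a trash can that plays an
# appreciative sound when its lid sensor detects a deposit.
# read_lid_sensor and play_sound are hypothetical hardware stand-ins.
import random
import time

def read_lid_sensor() -> bool:
    """Stand-in for an embedded sensor; returns True when litter is deposited."""
    return random.random() < 0.2

def play_sound(clip: str) -> None:
    """Stand-in for a speaker embedded in the object."""
    print(f"[playing sound: {clip}]")

for _ in range(10):          # poll the sensor a few times
    if read_lid_sensor():
        play_sound("thank_you_chime.wav")   # the behavioral nudge
    time.sleep(0.1)
```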

Sunday, February 17, 2013

Polyamory – an Anti-Scarcity Relationship Model for the Future

The first International Academic Polyamory Conference was held in Berkeley CA February 15-17, 2013 with approximately 100 attendees. Polyamory is the practice, desire, or acceptance of having more than one intimate relationship at a time with the knowledge and consent of everyone involved. It is not new or revolutionary that individuals may be involved with more than one other party; what is new is the openness, acknowledgement, and support and encouragement of the situation.

A number of academic studies were presented by researchers from around the world regarding the practice of polyamory. Polyamory is a niche but increasingly well-defined field of sociology research. Theory papers and discussion drew on social movement theory, queer theory, intimacy theory, performance theory, and other aspects of philosophy and sociology. Other conference tracks discussed public education, experiential aspects, and legal and political issues. Some common themes were the notion of plurality and choice in relationship models, and a high level of communication mastery and emotional intelligence.

Plurality of Relationship Models 
There may be many relationship models aside from traditional normative monogamy. One only has to look at the fluidity and nuance in the reality of lived existence to see different kinds of relationships. There is the notion of bringing other relationship models into the light for greater legitimacy, under an umbrella that would include monogamies and non-monogamies. Any individual may have an almost endless stream of potential demographic self-identifications to make in the categories of Race, Ethnicity, Religion, Gender, Sexuality, Relationship Model, and others. Each category has a plurality and range of possible answers, and it would be considered discriminatory and inappropriately normative to privilege any specific identification.

Emotional Mastery 
There may be a belief that polyamorous individuals have a silver bullet and do not experience jealousy and other challenging emotions that often occur in relationships. This is not at all the case. The polyamory community has not mastered jealousy, but individuals are often trying, on an ongoing basis, to master communication with the self and others. Individuals in polyamorous relationships may have more experience, permission, and tools at their disposal for recognizing, acknowledging, and managing jealousy and other emotions that arise in human relationships; poly people may simply be more skilled at dealing with these situations.

Sunday, October 21, 2012

Singularity Summit 2012: Image Recognition, Analogy, Big Health Data, and Bias Reduction

The seventh Singularity Summit was held in San Francisco, California on October 13-14, 2012. As in other years, there were about 600 attendees, although this year's conference program included both general-interest science and singularity-related topics. Singularity in this sense denotes a technological singularity, a potential future moment when smarter-than-human intelligence may arise. The conference was organized by the Singularity Institute, which focuses on researching safe artificial intelligence architectures. The key themes of the conference are summarized below. Overall the conference material could be characterized as incrementalism within the space of traditional singularity-related work, with faster-moving advances coming from other fields such as image recognition, big health data, synthetic biology, crowdsourcing, and biosensors.

Key Themes:
  • Singularity Thought Leadership
  • Big Data Artificial Intelligence: Image Recognition
  • Era of Big Health Data
  • Improving Cognition: Bias Reduction and Analogies
  • Singularity Predictions
Singularity Thought Leadership
Singularity thought leader Vernor Vinge, who coined the term technological singularity, provided an interesting perspective. Since at least 2000, he has been referring to the idea of computing-enabled matter and the wireless Internet-of-things as Digital Gaia. He noted that 5% of objects worldwide are already embedded with microprocessors, and that it could be scary as reality 'wakes up' further, especially as we are unable to control other phenomena we have created such as financial markets. He was pessimistic regarding privacy, suggesting that Brin's traditional counterproposal to surveillance, sousveillance, is not necessarily better. More positively, he discussed the framing of computers as a neo-neocortex for the brain, extreme UIs to provide convenient and unobtrusive cognitive support, other intelligence amplification techniques, and how we have been unconsciously prepping many of our environments for robotic operations. Crowdsourcing has also risen as an important resource, as the network (the Internet plus potentially 7 billion Turing-test-passing agents) matches optimal resources to specific cognitive tasks (like protein-folding analysis).

Big Data Artificial Intelligence: Image Recognition
Peter Norvig continued in his usual vein of discussing what has been important in resolving contemporary problems in artificial intelligence. In machine translation (interestingly, a Searlean Chinese room), the key was using large online data corpora and straightforward machine learning algorithms (The Unreasonable Effectiveness of Data). In more recent work, his lab at Google has been able to recognize pictures of cats. In this digital vision processing advance, announced in June 2012 (article, paper), the key was creating neural networks for machine learning that used hierarchical representation and problem solving, and again large online data corpora (10 million images scanned by 16,000 computers) and straightforward learning algorithms.
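As a scale-model illustration of 'straightforward learning algorithm plus hierarchical representation' (nothing like Google's actual 16,000-machine system, and using synthetic data rather than web images), a small multi-layer network can be trained with scikit-learn as follows.

```python
# Toy illustration only: a small multi-layer network on synthetic data.
# The actual Google result used unsupervised feature learning over
# 10 million images on 16,000 cores; this merely shows the flavor of
# "simple learner + hierarchical layers + plenty of data".
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

X, y = make_classification(n_samples=5000, n_features=64, n_informative=16, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Two hidden layers: the lower layer learns simple features,
# the upper layer composes them into more abstract ones.
clf = MLPClassifier(hidden_layer_sizes=(64, 16), max_iter=300, random_state=0)
clf.fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))
```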

Era of Big Health Data 
Three speakers presented innovations in the era of big health data, a sector which is generating data faster than any other and starting to use more sophisticated artificial intelligence techniques. Carl Zimmer pointed out that new viruses are continuing to develop and spread, and that this is expected to persist. Encouragingly, new viruses are genetically sequenced increasingly rapidly, but it still takes time to breed up vaccines. A faster means of vaccine production could possibly come from newer techniques in synthetic biology and nanotechnology such as those from Angela Belcher's lab. Linda Avey discussed Curious, Inc., a personal data discovery platform in beta launch that looks for correlations across big health data streams (more information). John Wilbanks discussed the pyrrhic notion of privacy provided by traditional models as we move to a cloud-based big health data era (for example, only a few data points are needed to identify an individual, and medical records may have ~500,000). Some health regulatory innovations include an updated version of HIPAA privacy policies, a portable consent for granting the use of personalized genomic data, and a network where patients may connect directly with researchers.

Improving Cognition: Bias Reduction and Analogies (QS’ing Your Thinking) 
A perennial theme in the singularity community is improving thinking and cognition, for example through bias reduction. Nobel Prize winner Daniel Kahneman spoke remotely on his work regarding fast and slow thinking. We have two thinking modes: fast (blink intuitions) and slow (more deliberative, logical) thinking, both of which are indispensable and potentially problematic. Across all thinking is a strong inherent loss aversion, and this helps to generate a bias towards optimism. Steven Pinker also spoke, indirectly, to the theme of bias. In recent work, he found that there has been a persistent decline in violence over multiple centuries, possibly mostly due to increases in affluence and literacy/knowledge. This may seem counter to popular media accounts, which, guided by short-term interests, help to create an area of societal cognitive bias. Other research regarding cognitive enhancement and the processes of intelligence included Melanie Mitchell's claim that analogies are a key attribute of intelligence. The practice of using analogies in new and appropriate ways could be a means of identifying intelligence, perhaps superior to traditional proxies such as general-purpose problem solving, question-answering, or Turing test-passing.

Singularity Predictions 
Another persistent theme in the singularity community is sharpening analysis, predictions, and context around the moment when there might be greater-than-human intelligence. Singularity movement leader Ray Kurzweil made his usual optimistic remarks accompanied by slides with exponentiating curves of technology cost/functionality improvements, but did not confirm or update his long-standing prediction of a technological singularity circa 2045 [1]. Stuart Armstrong pointed out how predictions are usually 15-25 years out, and that this is true every year. In an analysis of the Singularity Institute's database of 257 singularity predictions from 1950 forward, there is no convergence in time estimates, which range from 2020-2080. Vernor Vinge encourages the consideration of a wide range of scenarios and methods, including 'What if the Singularity Doesn't Happen.' The singularity prediction problem might be improved by widening the possibility space; for example, perhaps it is less useful to focus on intelligence as the exclusive element for the moment of innovation, speciation, or progress beyond the human level, and other dimensions such as emotional intelligence, empathy, creativity, or a composite thereof could be considered.

Reference
1. Kurzweil, R. The Singularity is Near; Penguin Group: New York, NY, USA, 2006; pp. 299-367.

Sunday, September 23, 2012

The Quantified Self becomes the Qualified Self and the Exoself



Quotable quotes from the third Quantified Self conference held at Stanford University September 15-16, 2012. 

  • Can I query my shirt or am I limited to consuming the querying that comes packaged in my shirt?
  • Quantified Self enables the constant creation of this thing called the self
  • We think more about our cats/dogs than we do our real pets, our microbiome
  • Information conveyance, not dataviz
  • Quantified emotion and data sensation (haptics)
  • Display of numerical data and graphs are the interface
  • Quantifying is the intermediary step...exosenses (haptics, wearable electronic senses) is really what we want
  • Perpetual data explosion
  • Our mission as Quantified Selves is to discover our mission
Figure 1. Word Cloud Visualization of Agenda Topics at the third Quantified Self Conference.


Sunday, September 16, 2012

Sensor Mania! TechCrunch Disrupt Hardware Day!

Taking advantage of Sensor Mania! – the exploding wireless internet of things – TechCrunch Disrupt featured a special Hardware day on September 12, 2012 at its annual conference held in San Francisco, CA. 29 companies in a wide range of areas presented their hardware products, as 26 had at a similarly successful event in New York in May 2012. If underwhelming in ‘new new thing’ innovation, the startups at least appeared commercializable for the most part. Perhaps the biggest similarity in the eclectic mix was the high number of Kickstarter projects (the new alpha customer sales and financing platform).

The most interesting category of startups was biosensors, including a consumer EEG company (InteraXon), a sports sensor training program (GolfSense), an integrated biosensor platform for personalized pain management that delivers the equivalent of white-noise stimulus to nerve endings (Thimble Bioelectronics, integrating the Somaxis muscle sensor), and a pedometer watch and social gaming fitness app for kids (Sqord).

Another interesting category of startups was internet-of-things companies. Ninja Blocks (‘the API for atoms’) was providing a standard offering of an internet-enabled console block plus five home sensor units (with distance, temperature, motion, camera, etc. sensor capabilities) for $200. Similarly, knut was providing a small, battery powered, Wi-Fi enabled sensor hub for real-time monitoring in the home environment for $80.

Figure 1: Electric Skateboard from Boosted Boards



The other companies were a mix including electric skateboards (Boosted Boards as shown in Figure 1), kitchen products (sous vide cooking (Nomiku) and high-end coffee brewing (Blossom Coffee)), standing desks, flashlights 2.0 (HexBright), rear-view cycling camera unit (Cerevellum), iPad kiosks (Lilitab), and the expected photo, audio (Vers), and gaming-related apps.

Sunday, June 19, 2011

Conference report: interventional anti-aging

The focus of the 40th annual meeting of the American Aging Association held June 3-6, 2011 in Raleigh NC USA was emerging concepts in the mechanisms of aging.

Many usual topics in aging were covered such as dietary restriction (DR), inflammation, stress resistance, homeostasis and proteasome activity, sarcopenia, and neural degeneration.

Newer methods like microRNAs and genome sequencing were employed to investigate gene expression variance with aging and genetic signatures of longevity.

Aging as a field continues to mature including by using a systems approach to tracing conserved pathways across organisms, sharpening definitions of sarcopenia, frailty, and healthspan, and distinguishing interventions by age-tier (early-onset versus late-onset).

A pre-conference session on late-onset intervention concluded that there are numerous benefits to deriving such interventions.

Conference talks applied the biology of aging in a translational manner to intervention development.

  • Using an individual’s own stem cells to regenerate organs for transplantation and as a cell source for cellular therapies could be a powerful near-term solution to disease.
  • Several proposed interventions were pharmaceutical: myostatin inhibition, losartan, JAK pathway inhibitors, and enalapril for frailty and sarcopenia, and metformin to promote the Nrf2 anti-inflammation response.
  • In dietary restriction, protein restriction was found to be better than general calorie restriction. Short-term fasting may be helpful in chemotherapy, surgery, and acute stress, simultaneously increasing the killing of cancer cells by chemotherapy, while improving the survival of normal cells.
  • Immune system interventions remain elusive, although statins may help to counter bacterial infection promoted by cellular senescence.
  • Engineered enzymes may be useful in lysosomal catabolism.
  • Dietary restriction mimetics, most promisingly involving TOR (TORC1 inhibition and rapamycin), may be more feasible than dietary restriction.
More details: Meeting Summary preprint.

Sunday, December 05, 2010

Bay area aging meeting summary

In the second Bay Area Aging Meeting, held at Stanford on December 4, 2010, research was presented regarding attempts to further elucidate and characterize the processes of aging, primarily in model organisms such as yeast, C. elegans (worms), and mice. A detailed summary of the sessions is available here. The work spanned some repeating themes in aging research:

Theme: processes work in younger organisms but not in older organisms
A common theme in aging is that processes function well in the first half of an organism's life, then break down in the second half, particularly in the last 20% of the lifespan. In one example, visualizations and animations were created from the 3D tissue-sectioning of the intestine of young (4 days old) and old (20 days old) C. elegans. In the younger worms, nuclei and cells were homogeneous and regularly spaced over the course of the intestine running down the length of the worm. In older worms, nuclei disappeared (an initial 30 sometimes ultimately dropped to 10), and the intestine became twisted and alternately shrunken and convoluted due to DNA accumulation and bacterial build-up.

Theme: metabolism and oxidation critically influence aging processes
Two interesting talks concerned UCP2 (mitochondrial uncoupling protein 2), an enzyme which reduces the rate of ATP synthesis and regulates bioenergy balance. UCP2 and UCP3 have an important but not yet fully understood role in regulating ROS (reactive oxygen species) and overall metabolic function, possibly by allowing protons to enter the mitochondria without oxidative phosphorylation. The mechanism was explored in experiments showing that worm lifespan was extended by inserting zebrafish UCP2 genes (not natively present in the worm).

Theme: immune system becomes compromised in older organisms
Two talks addressed the issue of immune system compromise. One team created a predictive analysis that could be used to assess an individual's immune profile and potential response to vaccines by evaluating demographics, chronic infection status, gene expression data, cytokine levels, and cell subset function. Other work looked into the specific mechanisms that may degrade immune systems in older organisms. SIRT1 (an enzyme related to cell regulation) levels decline with age. This leads to unstable acetylation of the transcription factor FoxP3 (a gene involved in immune system response), which suppresses the immune system by reducing regulatory T cell (Treg) differentiation in response to pathogens.
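The predictive immune-profile analysis can be pictured generically as a supervised model over heterogeneous features (demographics, chronic infection status, expression markers, cytokine levels); the sketch below uses synthetic data and a plain logistic regression, and is an illustration of the general approach rather than the presented method.

```python
# Generic sketch of a predictive immune-profile model: logistic regression
# over heterogeneous features (age, chronic infection status, gene
# expression, cytokine levels). Synthetic data; illustrative only.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 500
age = rng.uniform(20, 90, n)
cmv_positive = rng.integers(0, 2, n)       # chronic infection status (assumed feature)
gene_expr = rng.normal(size=(n, 5))        # a few expression markers
cytokines = rng.lognormal(size=(n, 3))     # cytokine levels

X = np.column_stack([age, cmv_positive, gene_expr, cytokines])
# Synthetic label: older, infection-positive individuals respond less well to a vaccine.
logit = -0.05 * (age - 50) - 1.0 * cmv_positive + gene_expr[:, 0]
y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

model = LogisticRegression(max_iter=1000)
print("cross-validated accuracy:", cross_val_score(model, X, y, cv=5).mean())
```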

Theme: systems-level understanding of aging processes
Many aging processes are systemic in nature, with complex branching pathways and unclear causality. Research was presented regarding two areas: p53 pathway initiation and amyloid beta plaque generation. p53 is a critical tumor suppressor protein controlling many processes related to aging and cell maintenance, including cell division, apoptosis, and senescence, and is estimated to be mutated in 50% of cancers. Research suggested that more clues for understanding the multifactorial p53 pathway could come from SnoN, which may be an alternative mechanism for activating p53 as part of the cellular stress response. Neurodegenerative pathologies such as Alzheimer's disease remain unsolved problems in aging. For example, it is not known whether the amyloid beta plaques that arise are causal, or a protection mechanism in response to other causal agents. Some research looked at where amyloid beta is produced in cells, finding that after the amyloid precursor protein (APP) leaves the endosome, both the Golgi and a related recycling complex may be involved in the generation of amyloid beta.

Theme: lack of conservation progressing up the model organism chain
Aging and other biological processes become more complicated with progression up the chain of model organisms. What works in yeast and worms may not work in mice, and what works in mice and rats may not work in humans. Some interesting research looked at ribosomal proteins, whose deletion is known to extend lifespan in model organisms. The key points were, first, that there was fairly little (perhaps less than 20%) overlap in lifespan-extending ribosomal protein deletions conserved between yeast and worms. Second, an examination of some of the shared deletions in mice (especially RPL19, 22, and 29) found some conservation (e.g., RPL29), and also underlined the systemic nature of biology, finding that other homologous genes (e.g., RPL22L ('-like')) may compensate for the deletion and thereby not extend lifespan.

Theme: trade-offs are a key dynamic of aging processes
The idea of trade-offs is another common theme in aging: trade-offs between processes, resource consumption, and selection. Exemplary of this was research showing that the deletion of a single gene involved in lipid synthesis, DGAT1, is beneficial and promotes longevity in mice when calories are abundant, but is also crucial for survival in calorie-restricted situations. This supports the use of directed methylation to turn genes on and off in different situations. More details were presented in a second area of trade-offs: reproduction versus lifespan. It is known that reproduction is costly and that organisms without reproductive mechanisms may have extended lifespans. Research examined the specific pathways, finding that Wnt and steroid hormone signaling in germline and somatic reproductive tissues influenced worm longevity, particularly through non-canonical (i.e., not the usual) pathways involving the signaling components MOM-2/Wnt and WRM-1/beta-catenin.

Conclusion
Academic aging research is continually making progress in the painstaking characterization of specific biological phenomena in model organisms; however, the question naturally arises as to when and how the findings may be applied in humans for improving lifespan and healthspan. In fact, there is a fair degree of activity in applied human aging research. Just as more individuals are starting to include genomic medicine, preventive medicine, and baseline wellness marker measurement in health self-management, so too are they consulting with longevity doctors. One challenge is that at present it is incumbent on individuals to independently research doctors and treatments. Hopefully in the future there could be a standard list of the anti-aging therapies that longevity doctors would typically offer. Meanwhile, one significant way for an individual to start taking action is by self-tracking: measuring a variety of biomarkers, for example annual blood tests, and tracking exercise, weight, nutritional intake, supplements, and sleep on a more frequent basis.

Sunday, September 12, 2010

Personal genome: data analysis challenge

Five themes emerged from the material presented at the 3rd annual personal genomes meeting at Cold Spring Harbor Laboratory held September 10-12, 2010.


First was the trend of family sequencing becoming more of a norm: looking at genetic disease as it is represented in trios, quartets, or other family groups.

Second was the trend of increasingly common multi-level analysis, investigating traditional genotype data together with structural variation, expression data, pathways, and cell lines.

Third was the trend of greater breadth and sophistication in cancer genome analysis: the fledgling field of a few years ago now includes dozens of sequenced cancer tumor genomes, the first cancer methylome, and cancer transcriptome analysis.

Fourth was the oft-heard challenge of scaling personal genome interpretation: making it automated, affordable, and actionable.

The fifth theme was the continued improvement in genome sequencing technology through new approaches such as quantum dot nanocrystal sequencing and strobe sequencing.

Sunday, June 13, 2010

Dollar Van Demos

Easily the phreshest idea from New York’s Internet Week, held in Manhattan June 7-14, was Dollar Van Demos! Dollar Van Demos are a showcase of musicians, rappers, and comedians performing inside a dollar van with real passengers. Numerous Brooklyn-based artists are featured in a collection of videos filmed inside dollar vans as they travel along their usual transportation routes.

The site has new videos from Cocoa Sarai, Tah Phrum Duh Bush, Grey Matter, Zuzuka Poderosa, Kid Lucky, Atlas, BIMB Family, I-John, Top Dolla Raz, Hasan Salaam, illSpokinn, Joya Bravo, Bacardiiiii and Jarel soon!

Figure 1. Example videos from Cocoa Sarai and Top $ Raz

Sunday, June 06, 2010

Rational growth in consumer genomics

The overall tone of the Consumer Genetics Show, held June 2-4, 2010 in Boston MA, was a pragmatic focus on the issues at hand, as compared with the enthusiasm and optimism that had marked the conference's inaugural event last year. One of the biggest shifts was the new programs that some top-tier health service providers have been developing to include genetic testing and interpretation in their organizations. It also became clear that the few widely agreed-upon success stories for genomics in disease diagnosis and drug response (i.e., warfarin dosing) have been costly to achieve and will not scale to all diseases and all drugs. Cancer continues to be a key killer app for genomics in diagnosis, treatment, prognosis, cancer tumor sequencing, and risk prediction. Appropriate approaches to multigenic risk assessment for health risk and drug response remain untackled. Greater state and federal regulation seems inevitable. Faster-than-Moore's-law improvements in sequencing costs continue, as Illumina dropped the price of whole human genome sequencing for the retail market from $48,000 to $19,500.

There was generally wide agreement that from a public health perspective, personalized genomics is scientifically valid, clinically useful, and reimbursable in specific situations, but not universally; at this time, better information should be obtained for some people, not more information for all people. The focus should be on medical genetics strongly linked to disease, and on pharmacogenomics in treatment. For example, genomic analysis is required for some drugs by the FDA (maraviroc, cetuximab, trastuzumab, and dasatinib), and recommended for several others (warfarin, rasburicase, carbamazepine, abacavir, azathioprine, and irinotecan). A key point is to integrate the drug test with the guidance for drug dosage.

Even when genomic tests are inexpensive enough to be routine, interpretation may be a bottleneck, as each individual's situation is different when taking into account family history, personal medical history, and environmental and other factors. One idea was that the 20,000 pathologists in the US could be a resource for genomic test interpretation; pathologists are already involved, as they must certify genetic test data in CLIA labs. Genomic tests and their interpretation would likely need to be standardized and certified in order to be reimbursed in routine medical care. A challenge is that health service payers are neither interested in nor able to drive genomic test product design.

Key science findings

1. The Regulome and Structural Variation

Michael Snyder presented important research indicating that the regulome, the parts of the genome located around the exome (the 1-2% of the genome that codes for protein), may be critical in understanding disease genesis and biological processes. The complexities of RNA are just beginning to be understood. It is known that there is more than just the simple transcription of DNA to RNA involved in controlling gene expression. For example, there is also tight regulation in splicing newly synthesized RNA molecules into the final RNA molecule and in translating messenger RNA at ribosomes to create proteins. Research findings indicate a global/local model of gene regulation: there are master regulators with universal reach and local regulators operating on a local range of 200 or so genes.

Snyder also presented updates on his lab’s ongoing research into the structural variation of the human genome. A high-resolution sequencing study has been conducted regarding the amount of structural variation in humans, finding that there are ~1,500 structural variations per person that are over 3 kilobases long and that the majority of the structural variations are 3-10 kilobases long with a few extending to 50-100 kilobases (application of this research: Kasowski M, Science, 2010 Apr 9).
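The reported per-person size summary (counts of variants over 3 kilobases and how their lengths distribute) is straightforward to compute once variant intervals are in hand; the variant coordinates in this sketch are made-up placeholders.

```python
# Sketch of summarizing structural variants by size, in the spirit of the
# reported per-person counts (>3 kb, 3-10 kb, 50-100 kb). Variant
# coordinates here are made-up placeholders.
variants = [  # (chromosome, start, end) in base pairs
    ("chr1", 100_000, 104_500),
    ("chr2", 250_000, 258_000),
    ("chr7", 1_000_000, 1_075_000),
    ("chr9", 5_000, 5_800),
]

def length(v):
    return v[2] - v[1]

over_3kb = [v for v in variants if length(v) > 3_000]
bins = {
    "3-10 kb": sum(3_000 < length(v) <= 10_000 for v in variants),
    "10-50 kb": sum(10_000 < length(v) <= 50_000 for v in variants),
    "50-100 kb": sum(50_000 < length(v) <= 100_000 for v in variants),
}
print(len(over_3kb), "variants over 3 kb;", bins)
```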

2. Reaching beyond the genome to the diseasome, proteome, and microbiome

Several scientists addressed the ways in which science is quickly reaching beyond the single point mutations and structural variation of the genome to other layers of information. There is a need for the digital quantification of the epigenome, the methylome, the transcriptome, the proteome, the metabolome, and the diseasome/VDJome. For example, the immune system is one of the best monitors of disease state and progression. The strength of individual immune systems can be evaluated through the VDJome (the repertoire of recombined V-D-J regions in immune cells; cumulative immunoglobulin and T-cell receptor antigen exposure).

There are many areas of interest in proteomics including protein profiling, protein-protein interactions, and post-translational modification. A large-scale digital approach to proteomics was presented by Michael Weiner of Affomix. A key focal area is post-translational modification. At least one hundred post-translational modifications have been found, and two are being investigated in particular: phosphorylation (the signal transduction can possibly indicate tumor formation) and glycosylation (possibly indicating tumor progression).

The microbiome (human microbial bacteria) and host-bacteria interactions are an important area for understanding human disease and drug response, and for Procter & Gamble in creating consumer products. The company has basic research and publications underlying products such as the ProX anti-wrinkle skin cream (Hillebrand, Brit Jrl Derm, 2010), as well as work on rhinovirus and gingivitis. The company has a substantial vested interest in understanding the microbiome, with its variety of nasal, oral, scalp, respiratory, skin, and GI tract-related products.

Sunday, May 30, 2010

Microbubbles and photoacoustic probes energize cancer researchers

The Canary Foundation’s eighth year of activities was marked with a symposium held at Stanford University May 25-27, 2010. The Canary Foundation focuses on the early detection of cancer, specifically lung, ovarian, pancreatic, and prostate cancer, in a three-step process of blood tests, imaging, and targeted treatment.

Imaging advances: microbubbles and photoacoustic probes
Imaging is an area that continues to make advances. One exciting development is the integration of multiple technologies, for example superimposing molecular information onto traditional CT scans. Contemporary scans may show that certain genes are over-expressed in the heart, for example, but obscure the specific nodule (tumor) location. Using integrins to bind to cancerous areas may allow their specific location to be detected (4 mm nodules now, and perhaps 2-3 mm nodules as scanning technologies continue to improve).

Other examples of integrated imaging technologies include microbubbles, which are gas-filled and can be detected with an ultrasound probe as they are triggered to vibrate. Similarly, photoacoustic probes use light to perturb cancerous tissue, and then sound detection tools transmit the vibrations. Smart probes are being explored to detect a variety of metalloproteases on the surface of cancer cells, breaking apart and entering cancer cells where they can be detected with an ultrasound probe.

Systems biology approaches to cancer
Similar to aging research, some of the most promising progress points in cancer research are due to a more systemic understanding of disease, and the increasing ability to use tools like gene expression analysis to trace processes across time. One example is being able to identify and model not just one, but whole collections of genes that may be expressed differentially in cancers, seeing that whole pathways are disrupted, and the downstream implications of this.

Cancer causality
Also as in aging research, the 'chicken or the egg' problem arises: multiple things that go wrong are identified, but which happens first, and causality, is still unknown. For example, in ovarian cancer there are often mutations in the p53 gene, as well as gene rearrangements and CNV (copy number variation; different numbers of copies of certain genes), but which occurs first and what causes both is unknown.

Predictive disease modeling
There continues to be a need for models that predict clinical outcome and serve as accurate representations of disease. DNA and gene expression data, integrated with traits and other phenotypic data in global coherent datasets, could enable the building of probabilistic causal models of disease. It also may be appropriate to shift to physics/accelerator-type models to manage the scale of data now being generated and reviewed in biomedicine.

Sunday, March 07, 2010

Genomics: progress in exomes and structural variance

The fast rate of progress in many areas of genomics was the most salient dynamic of the Future of Genomic Medicine III conference at Scripps in San Diego CA, March 5-6, 2010. Cancer genomics and pharmacogenomics continue to blossom as wide-ranging fields of applied genomics. Aging and genomics, and the role of genetics in studying disease and the microbiome, are nascent and growing. Importantly, coming to the forefront for the first time are structural analysis and exome analysis.

Structural analysis of genomes concerns copy number variation (multiple copies of genes), inserted genes, deleted genes, inverted genes, and other structural changes, and is found in all classes of traits and disease. There is thought to be 12% structural variation between humans, as opposed to 0.1% SNP variation. SNP variations are the 'typos' at specific genetic locations where the normal nucleotide combination is 'AA' and some people have the risk alleles 'AT' or 'TT.'
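To make the SNP 'typo' picture concrete, a genotype at a risk locus can be scored by counting copies of the risk allele (0 for 'AA', 1 for 'AT', 2 for 'TT' when 'T' is the risk allele); the rsIDs and genotypes below are hypothetical.

```python
# Counting risk alleles at SNP loci: 'AA' = 0 copies, 'AT' = 1, 'TT' = 2
# when 'T' is the risk allele. The rsIDs and genotypes are hypothetical.
risk_alleles = {"rs0000001": "T", "rs0000002": "C"}
genotypes    = {"rs0000001": "AT", "rs0000002": "CC"}

risk_counts = {snp: genotypes[snp].count(allele) for snp, allele in risk_alleles.items()}
print(risk_counts)  # {'rs0000001': 1, 'rs0000002': 2}
```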

Using exomes (the 1-2% of the genome that contains protein coding regions) as a cheaper alternative to whole human genome sequencing, and conducting basic SNP analysis together with more complex structural variation analysis, and possibly methylation analysis (which genes are blocked from expression), and RNA transcriptome analysis (levels of DNA expression), could bring more sophistication to DNA analysis for myriad purposes including pharmacogenomics and disease analysis.

Some interesting startup companies are starting to realize these new aspects of genomic medicine:

Sunday, February 28, 2010

Human microbiome and personalized medicine

In genomics, the eleventh annual meeting of Advances in Genome Biology and Technology (AGBT) was held February 24-27, 2010, and featured an eclectic mix of new research and bioinformatics tools. Genomic research was presented in a diversity of areas including human, animal, plant, and bacteria. Many research advances are coming from partnerships between one or more academic research teams together with commercial entities. The biggest buzz was around Pacific Biosciences, the 3rd-generation sequencing darling, with its single-molecule real-time (SMRT) platform, which is still on track for an estimated launch later this year. The platform could deliver a 30,000-fold improvement over current methods, and ultimately achieve sub-$100 whole human genome sequencing. Attendees were also wowed by 454 Roche's benchtop GS Junior System (initially announced in late 2009), making sequencing much quicker and easier, and priced at only $98,000 (a milestone for sequencing equipment, which usually runs in the several hundreds of thousands of dollars).


Sequencing data storage and transfer costs continue to increase with the computing industry still not cognizant of the whole new era of data processing and communications transfer that is necessary for Very Large Datasets. The NIH 1000 Genomes project, for example, is transferring many terabyte-sized files per day.

From a research standpoint, some of the most active work is in cancer genomics. A recent NIH study generated 100TB of data sequencing a melanoma sample and a normal blood sample, and has been refining the Most Probable Variant (MPV) Bayesian analysis method used to identify genetic mutations. Perhaps the most innovative new research activity is in RNA sequencing. Other specific findings of note are in the areas of the microbiome and genetic variation:

Human microbiome
The complex interactions between individual humans and their microbiomes could have a substantial impact on personalized medicine. In some classes of infectious disease in humans (e.g., respiratory disease, skin disease), the pathogenesis may be unknown 40-60% of the time. Even rudimentary issues remain unsolved; for example, from a simple blood draw showing staph infection, it may be undetectable whether the bacteria were on the skin surface or in the blood. Microbiome sequencing is allowing the identification of novel pathogens, and could also be useful at the human population level to assess the spread and mutation trajectory of pathogens.

Genetic variation: human and otherwise
The populations analyzed in human genome wide association studies are being expanded, with important findings for both ancestry reconstruction and medical genomics. Research was presented on African-American, Mexican-American, Bushmen, and Bantu genome studies. A deeper understanding of genetic variation is also being used to facilitate the selection of desirable qualities in agriculture and animal livestock. For example, a chicken sequencing project found 7 million unique SNPs, 5 million of which were novel, and several of which were useful in translational application.

Sunday, September 27, 2009

Status of Stem Cell Research

The World Stem Cell Summit in Baltimore MD held September 21-23, 2009 attracted several hundred professionals to discuss contemporary science, industry and societal perspectives on stem cells. Attendance was high, but down from last year and, similar to cancer meetings, a key theme several keynote speakers acknowledged was

the overall lack of truly meaningful progress in stem cell research in the last twenty years.

Science Focus: Safe Stem Cell Generation

The science tracks featured current research in different stem cell areas, including the production of safe hESC (human embryonic stem cells) and iPS (induced pluripotent stem cells) for use in regenerative medicine, the research and therapeutic use of mesenchymal stem cells (MSCs) and hematopoietic stem cells (HSCs), and reports from specific sub-fields: cancer stem cells, cardiovascular stem cells, and neural stem cells. Overall, the work presented was incremental and in many cases confirmed what is already known, for example the growing consensus that cancer stem cells are probably responsible for triggering the resurgence of cancer but cannot at present be distinguished from other cells at the time of tumor removal.

Contract Research Demand: Cell Therapies and Recombinant Proteins
One stem cell area experiencing growth is contract research organizations, the outsourcing tool of choice for research labs and pharmaceutical companies in the production of biological materials. For large contract research manufacturers such as Basel, Switzerland-based Lonza, the biggest demand area is cell therapies. Cell therapies denote the introduction of any type of new cell into other tissue for therapeutic purposes, but in the current case generally mean any variety of stem cell-based therapies. Other large contract research manufacturing organizations, such as Morrisville, NC-based Diosynth (owned by Schering Plough), lead in the production of biologics (antibodies, proteins), an important area for next-generation biotech where synthetic biology could have a big impact.

For smaller contract research manufacturing organizations producing test compounds (e.g.; 1 liter for $10,000) and scaling to Phase I and II clinical trial quantities such as Baltimore MD-based Paragon Bioservices, the biggest demand is for recombinant proteins. Recombinant proteins are created by inserting recombinant DNA into a plasmid of rapidly reproducing bacteria and can take many useful forms such as antibodies, antigens, hormones and enzymes.

Venture capital hot topics: zinc fingers, RT PCR, tech transfer
Zinc fingers (small protein domains that bind DNA, RNA, proteins and small molecules) have been surfacing in a variety of cutting-edge biotech innovations. In July 2009, St. Louis, MO-based biotechnology chemical producer Sigma-Aldrich (SIAL) announced the creation of the first genetically modified mammals using zinc finger nuclease (ZFN) technology to execute modifications such as taking away the tail of the zebrafish. A second example of recent landmark research involving zinc fingers is that of Carlos Barbas at Scripps who uses zinc finger proteins to reprogram serine recombinases as a more specific alternative to the homologous recombination method of genome modification. In addition, the Barbas lab has a useful web-based zinc finger protein design tool available for public use, Zinc Finger Tools.

Real-time PCR offerings continue to expand and flourish with declining prices as startup newcomer Helixis announced a $10,000 real-time PCR solution at the conference.

Bethesda, MD-based Toucan Capital, a leading investor in stem cells and regenerative medicine, discussed their sixteen interesting portfolio companies, such as San Diego CA-based VetStem, which is conducting joint and tendon stem cell therapies for race horses.

Johns Hopkins has one of the country’s leading technology transfer programs, licensing a growing number of technologies each year (nearly 100 in the last fiscal year), and has a searchable, though not extremely user-friendly, website.