Sunday, September 27, 2009

Status of Stem Cell Research

The World Stem Cell Summit, held in Baltimore, MD, September 21-23, 2009, attracted several hundred professionals to discuss contemporary science, industry and societal perspectives on stem cells. Attendance was high, but down from last year, and, similar to cancer meetings, a key theme several keynote speakers acknowledged was

the overall lack of truly meaningful progress in stem cell research in the last twenty years.

Science Focus: Safe Stem Cell Generation

The science tracks featured current research in different stem cell areas, including the production of safe hESCs (human embryonic stem cells) and iPS cells (induced pluripotent stem cells) for use in regenerative medicine, the research and therapeutic use of mesenchymal stem cells (MSCs) and hematopoietic stem cells (HSCs), and reports from specific sub-fields: cancer stem cells, cardiovascular stem cells and neural stem cells. Overall, the work presented was incremental and in many cases confirmed what was already known, for example, that cancer stem cells are probably responsible for triggering the resurgence of cancer but cannot at present be distinguished from other cells at the time of tumor removal.

Contract Research Demand: Cell Therapies and Recombinant Proteins
One stem cell area experiencing growth is contract research organizations, the outsourcing tool of choice for research labs and pharmaceutical companies in the production of biological materials. For large contract manufacturing organizations such as Basel, Switzerland-based Lonza, the biggest demand area is cell therapies. Cell therapy denotes the introduction of any type of new cell into other tissue for therapeutic purposes, but in the current case generally means any variety of stem cell-based therapy. Other large contract manufacturing organizations such as Morrisville, NC-based Diosynth (owned by Schering-Plough) lead in the production of biologics (antibodies, proteins), an important area for next-generation biotech where synthetic biology could have a big impact.

For smaller contract manufacturing organizations such as Baltimore, MD-based Paragon Bioservices, which produce test compounds (e.g., 1 liter for $10,000) and scale up to Phase I and II clinical trial quantities, the biggest demand is for recombinant proteins. Recombinant proteins are created by inserting recombinant DNA, carried on a plasmid, into rapidly reproducing bacteria, and can take many useful forms such as antibodies, antigens, hormones and enzymes.

Venture capital hot topics: zinc fingers, real-time PCR, tech transfer
Zinc fingers (small protein domains that bind DNA, RNA, proteins and small molecules) have been surfacing in a variety of cutting-edge biotech innovations. In July 2009, St. Louis, MO-based biotechnology chemical producer Sigma-Aldrich (SIAL) announced the creation of the first genetically modified mammals made with zinc finger nuclease (ZFN) technology; ZFNs execute targeted modifications such as removing the tail of the zebrafish. A second example of recent landmark research involving zinc fingers is that of Carlos Barbas at Scripps, who uses zinc finger proteins to reprogram serine recombinases as a more specific alternative to the homologous recombination method of genome modification. In addition, the Barbas lab has a useful web-based zinc finger protein design tool available for public use, Zinc Finger Tools.

Real-time PCR offerings continue to expand and flourish as prices decline; startup newcomer Helixis announced a $10,000 real-time PCR solution at the conference.

Bethesda, MD-based Toucan Capital, a leading investor in stem cells and regenerative medicine, discussed its sixteen portfolio companies, among them San Diego, CA-based VetStem, which is conducting joint and tendon stem cell therapies for race horses.

Johns Hopkins has one of the country’s leading technology transfer programs, licensing a growing number of technologies each year (nearly 100 in the last fiscal year), and has a searchable, though not extremely user-friendly, website.

Sunday, September 20, 2009

Personalized genomics inflection point

One of the world’s fastest accelerating technologies is genomic sequencing. The first whole human genome (6 billion base pairs) was sequenced at a cost of $3 billion and completed in 2003. The current cost is $20,000 for researchers (Complete Genomics) and $48,000 for consumers (with Illumina’s EveryGenome program). Leading third-generation sequencing company Pacific Biosciences affirmed at the Cold Spring Harbor Laboratory Personal Genomes meeting, September 14-17, 2009, that the company has 12 prototype instruments in operation and continues to be on track for ~$100 (“the cost of a nice dinner”) whole human genome sequencing to be commercially available in the second half of 2010. NimbleGen indicated that it may have a $2,000 exome sequencer available in 2010.
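The pace of this decline can be made concrete with a little arithmetic on the cost points above; the sketch below (an illustration from the quoted figures, not company data) computes the implied cost-halving time between 2003 and 2009.

```python
from math import log

# Cost points from the text: first genome (2003) vs. research price (2009)
cost_2003, cost_2009 = 3e9, 20_000
years = 2009 - 2003

fold_drop = cost_2003 / cost_2009        # 150,000x decline in six years
halvings = log(fold_drop, 2)             # ~17 cost halvings
print(f"{fold_drop:,.0f}x decline = {halvings:.1f} halvings")
print(f"Implied halving time: {years * 12 / halvings:.1f} months")

# Halvings still needed to reach the ~$100 genome target
print(f"From $20,000 to $100: {log(20_000 / 100, 2):.1f} more halvings")
```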

In a challenging venture capital climate, Pacific Biosciences was able to close an additional $68M financing round on August 12, 2009. Leading commercial sequencing company Complete Genomics was also notable in closing a $45M Series D round on August 24, 2009. The company has sequenced 14 whole human genomes to date and hopes to sequence far more, 10,000, in 2010 at a minimum cost of $5,000 per genome.

Viability of DTC genomics sector
Where the genomics technology sector has rosy prognostications, the direct-to-consumer personalized genomics market has volatility. Events in the last several months have led to questions about the sector’s viability, with upheavals at the three leading companies: 23andme (“Avey Leaves 23andMe to Start Alzheimer's Research Foundation Using DTC Genomics Firm's Platform”), deCODEme (“deCODE close to broke” – August 11, 2009) and Navigenics (“Navigenics Names Jonathan Lord, MD to Serve as President and Chief Executive Officer” – April 7, 2009). Absent innovation, DTC genomics companies are a “window business” in the sense that the window for their current offerings may only be open for a short time given the advent of whole human genome sequencing and standardized public multi-SNP condition interpretation tools.

Figure 1. Direct-to-Consumer Genomics Offerings: ongoing price declines (Chart PDF)

As depicted in Figure 1, there are three types of Direct-to-Consumer (DTC) genomics offerings currently available to individuals: one-off SNP (single nucleotide polymorphism) tests for specific conditions and paternity, multi-SNP risk assessment tests mapping several SNPs to dozens of disease conditions, and whole human genome sequencing assessing hundreds of disease risks. The five companies offering multi-SNP risk assessments are: 23andme (111 conditions for $399), deCODEme (42 conditions for $985), Navigenics (28 conditions for $999), Gene Essence (84 conditions for $1,195) and Pathway Genomics (77 conditions for $249). 23andme, deCODEme and Navigenics are the most transparent, disclosing the specific SNPs, research references and risk assessment methodologies for their tests; Gene Essence discloses SNPs; and Pathway Genomics does not disclose anything. A detailed condition and SNP analysis is here.

Slow DTC genomics adoption
DTC genomics has seen slow adoption so far for several reasons. First, there has been very little marketing; few consumers know of the availability and value proposition of DTC genomics services. Second, many people are not interested in preventively managing their health, particularly since automated tools are not yet available, and may still perceive health management to be the responsibility and domain of health care professionals. Third, the conventional but incorrect view is that genetic information is already known (from family history), negative and deterministic. Fourth, as initially pointed out by ExperimentalMan David Ewing Duncan, there are conflicting interpretations from DTC services for the same conditions, such as heart attack. This is because the scientific community has little knowledge of and agreement yet regarding multi-SNP conditions. DTC companies look at different SNPs, assign different quantitative risk values and employ differing estimates of overall population averages, all of which contribute to heterogeneous interpretations of risk for the same condition.
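A small sketch makes the mechanics of this disagreement concrete. Under a common simplifying assumption (multiplicative combination of per-SNP relative risks), two hypothetical services examining different SNP panels, risk values and baseline population risks arrive at different numbers for the same person and condition; every identifier and value below is invented.

```python
# Hypothetical multi-SNP risk estimation: two services, same person,
# same condition, conflicting answers. All SNP IDs, risk multipliers
# and baseline risks are invented for illustration.

def lifetime_risk(baseline, person_genotype, panel):
    """Combine per-SNP relative risks multiplicatively over a service's panel."""
    relative = 1.0
    for snp, risk_multiplier in panel.items():
        if person_genotype.get(snp):        # person carries the risk allele
            relative *= risk_multiplier
    return baseline * relative

person = {"rs1001": True, "rs2002": True, "rs3003": False}

# Each service uses a different SNP panel, different risk values and a
# different estimate of the population-average (baseline) risk.
service_a = {"baseline": 0.42, "panel": {"rs1001": 1.3, "rs2002": 1.1}}
service_b = {"baseline": 0.38, "panel": {"rs2002": 1.25, "rs3003": 0.8}}

for name, svc in (("A", service_a), ("B", service_b)):
    risk = lifetime_risk(svc["baseline"], person, svc["panel"])
    print(f"Service {name}: estimated lifetime risk {risk:.0%}")
# Prints two different risk estimates for the same person and condition.
```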

Sunday, September 13, 2009

VC guide to anti-aging biotechnology investing

Several promising startup companies focused on the nascent but obviously significant and growing anti-aging biotechnology space were present or discussed with interest at the recent SENS4 (Strategies for Engineered Negligible Senescence) conference in Cambridge, U.K., September 3rd – 7th, 2009 (program) (full conference report).

  1. Epeius Biotechnologies, San Marino, CA, USA: Rexin-G, a tumor-targeted injectable gene delivery system
  2. FoldRx, Cambridge, MA, USA: small molecule therapeutics to treat protein misfolding diseases, and bind and clear undesired molecules
  3. Gencia Corporation, Charlottesville, VA, USA: mitochondrial DNA rejuvenation using the rhTFAM (recombinant-human mitochondrial transcription factor A) protein
  4. Genescient, Fountain Valley, CA, USA: novel chronic disease therapeutics by combining genomics and selective screening (a large Alzheimer’s Disease genetic study is in progress with Kronos and TGen)
  5. Knome, Cambridge, MA, USA: whole human genome sequencing (consumer offering)
  6. Neotropix, Malvern, PA, USA: oncolytic viruses for the treatment of solid tumors
  7. Pentraxin Therapeutics Ltd, London, UK: small molecule drug CPHPC specifically targeting SAP (serum form of amyloid P) and removing it from the blood and brains of patients with Alzheimer’s Disease
  8. Repeat Diagnostics, Vancouver, BC, Canada: telomere length measurement for total lymphocyte and granulocyte populations (consumer offering)
  9. Retrotope, Los Altos Hills, CA, USA: using isotope effect to slow down damage pathways and control metabolic processes associated with oxidative stress
  10. StemCor Systems, Inc., Menlo Park, CA, USA: bone marrow harvesting system
  11. T.A. Sciences, New York, NY, USA: telomerase activation via the single molecule TA-65, licensed from Geron Corporation (consumer offering)
  12. TriStem Corporation, London, UK: retrodifferentiation technology to create stem cells from mature adult cells

Sunday, September 06, 2009

So discontinuous a discontinuity

A key aspect of thinking systemically about the future is being able to see how rapidly advancing technologies across many fields interrelate. Whatever next Internet-like discontinuity or singularity occurs will influence whatever comes thereafter. It is likely that some high percentage of what are now thought to be expected future advances will recede in importance, or at minimum be reshaped, for example:

  • If all chronic disease and aging becomes controllable and there is effective immortality, does uploading matter as much?
  • If artificial general intelligence is achieved, how does that change the exigency and requirements of molecular nanotechnology?
  • If affordable space launch and space-based solar power are achieved, what happens to ethanol, electric and other terrestrial alternative vehicle and transportation infrastructure solutions?
  • If immersive virtual reality and post-material scarcity are achieved, does molecular nanotechnology matter and what happens to global political systems?
  • If whole human genome testing is available, do single SNP tests go away? If there are home health monitors and nanodiagnostics, do primary care physicians go away?

Sunday, August 30, 2009

Real-time unified search

It is surprising that a unified search application across different types of web content does not yet exist.

According to OneRiot, 40% of web searches at present are for real-time content such as that from Twitter, Facebook, PeopleBrowsr, Digg, bookmarking sites, and blogging and microblogging sites (FriendFeed, etc.).

Megafeed
One vision of a unified search 'megafeed' app would be a customizable HTML page with search across many types of web content, automatically updated and delivered together or organized into categories such as events, articles, comments, people, etc. The types of web content to search would be the following (a minimal aggregation sketch follows the list):

  • Traditional web search: Google, Bing, etc., which could be more richly granularized with content-tagging per a variety of parameters such as information type (news, blog, video, book, event, etc.), time (time added to web, time of occurrence), original vs. subsequent post and other distinctions.
  • Real-time web search: The emerging real-time content search engines should be extended and unified into one digital social interaction feed for Twitter, Facebook, LinkedIn, bookmarking, email, blogging, microblogging and possibly IM/SMS notification AND response. User-permissioned credentials could be browser-stored for such a unified action platform. In addition to usability, the fast-growing real-time web search companies are also focused on monetization, developing AdSense-like models.
  • Local search: New restaurant and retail notifications, events, craigslist and other commercial postings of interest, friends traveling to the area (links to feeds from GeckoGo, Dopplr and other travel social networking and public calendaring websites).
  • Academic search: notification of new papers, articles or news. Federated PubMed- and arXiv-like journal portals are needed for all academic fields, including economics and the liberal arts.
  • Multimedia search: notification of non-text postings of photo, music, podcast and video content.
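As a sketch of the megafeed idea: fan a query out to per-source search functions in parallel and merge the results into categories. The fetch_* functions below are hypothetical stand-ins for the real APIs each source would require.

```python
# Minimal 'megafeed' sketch: query several source types in parallel and
# group the results by content category. The fetch_* functions are
# hypothetical placeholders for real search APIs.
from concurrent.futures import ThreadPoolExecutor

def fetch_web(q):      return [{"type": "article", "title": f"Web article on {q}"}]
def fetch_realtime(q): return [{"type": "comment", "title": f"Tweet about {q}"}]
def fetch_local(q):    return [{"type": "event",   "title": f"{q} event nearby"}]
def fetch_academic(q): return [{"type": "paper",   "title": f"New paper on {q}"}]

SOURCES = [fetch_web, fetch_realtime, fetch_local, fetch_academic]

def megafeed(query):
    """Fan the query out to all sources and categorize the merged results."""
    feed = {}
    with ThreadPoolExecutor() as pool:
        for results in pool.map(lambda fetch: fetch(query), SOURCES):
            for item in results:
                feed.setdefault(item["type"], []).append(item["title"])
    return feed

print(megafeed("synthetic biology"))
```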

Content demand mechanisms: ambience to supersede keywords
Content could be searched via the usual user-entered keywords, or a deeper variety of content demand-interaction mechanisms could be developed: for example, users could permission-in ambient profiles that automatically form and evolve from hard-drive content and previous web interactions (a precursor to pre-AI web interactions).
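One deliberately simple reading of an ambient profile is a word-frequency digest of permissioned local documents that seeds searches in place of typed keywords; the folder path and stopword list below are illustrative only.

```python
# Illustrative ambient profile: derive recurring terms from permissioned
# local text files and use them as implicit search keywords.
import re
from collections import Counter
from pathlib import Path

STOPWORDS = {"the", "and", "that", "with", "from", "this", "have", "were"}

def ambient_profile(folder, top_n=10):
    """Count word frequencies across the user's permissioned text files."""
    counts = Counter()
    for path in Path(folder).expanduser().glob("*.txt"):
        words = re.findall(r"[a-z]{4,}", path.read_text(errors="ignore").lower())
        counts.update(w for w in words if w not in STOPWORDS)
    return [word for word, _ in counts.most_common(top_n)]

# The resulting implicit keywords could seed the megafeed automatically,
# alongside or instead of user-entered queries.
print(ambient_profile("~/Documents"))
```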

Sunday, August 23, 2009

Automatic Markets

At Singularity University, one of the most pervasive memes was the “routing packets” metaphor: the idea that many current activities are just like routing packets on the Internet. This includes areas such as people in driverless cars, electrons in electric vehicle charging, and power entry, load-balancing, routing and delivery on smart-grid electricity networks.

Fungible resources and quantized packet-routing
The packet-routing concept could be extended to neurons (routed in humans or AIs), clean water, clean air, food, disease management, health care system access and navigation, and in the farther future, information (neurally-summoned) and emotional support (automatically-summoned per human dopamine levels from nearby people or robots). It is all routing…directing quantized fungible resources to where they are needed and requested.

Automatic Markets
Since these various resources are not uniformly demanded, the idea of markets as a resource allocation mechanism is immediately obvious.

Further, automated or automatic markets with pre-specified user preferences, analogous to limit orders, could be optimal. Markets could meet in equilibrium and transact, buying, selling and adjusting automatically per evolving conditions and pre-programmed user profiles, permissions and bidding functions.

Truly smart grids would have automatic bidding functions (as a precursor to more intelligence-like utility functions) that would indicate preferences, bid, and equalize resource allocation: the truly invisible digital hand.
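A minimal sketch of such a market: pre-registered, limit-order-style bids clear against offers with no further human input. The agents, prices and quantities below are invented, and a real smart grid would run this matching continuously rather than once.

```python
# Minimal 'automatic market' sketch: pre-specified bids and offers
# (limit-order style) are matched without further human input.
# All agents, prices and quantities are invented for illustration.

def clear_market(buy_bids, sell_offers):
    """Match highest-price buyers against lowest-price sellers."""
    buy_bids.sort(key=lambda b: -b["price"])     # most eager buyers first
    sell_offers.sort(key=lambda s: s["price"])   # cheapest sellers first
    trades = []
    while buy_bids and sell_offers and buy_bids[0]["price"] >= sell_offers[0]["price"]:
        buy, sell = buy_bids[0], sell_offers[0]
        kwh = min(buy["kwh"], sell["kwh"])
        trades.append((buy["who"], sell["who"], kwh, (buy["price"] + sell["price"]) / 2))
        buy["kwh"] -= kwh
        sell["kwh"] -= kwh
        if buy["kwh"] == 0:
            buy_bids.pop(0)
        if sell["kwh"] == 0:
            sell_offers.pop(0)
    return trades

# Pre-programmed preferences, e.g. "charge my car below $0.12/kWh"
bids = [{"who": "car", "price": 0.12, "kwh": 20},
        {"who": "home", "price": 0.10, "kwh": 5}]
offers = [{"who": "solar_roof", "price": 0.09, "kwh": 15},
          {"who": "utility", "price": 0.11, "kwh": 50}]

for buyer, seller, kwh, price in clear_market(bids, offers):
    print(f"{buyer} buys {kwh} kWh from {seller} at ${price:.3f}/kWh")
```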

The key parameters of a working market (liquidity, price discovery and ease of exchange) would seem to be present in these cases, given large numbers of participants and market monitoring and bidding via web or SMS interfaces. The next layer, secondary markets and futures and options, could also evolve as an improvement to market efficiency, if designed with appropriate incentives.

Automatic markets are not without flaw; they exist now in traditional financial markets, causing occasional but volatile disruptions in the form of quantitative program trading (blamed for exacerbating the 1987 Black Monday stock market crash) and flash trading. Speculative aspects are not trivial and would be a critical area for market designers to watch, particularly managing for high liquidity and equal access (e.g., ensuring that faster Internet connections do not confer an advantage).

Markets to grow as a digitized resource allocation tool
At present, markets are not pervasive in life. The most notable examples are traditional financial markets, eBay, peer-to-peer finance websites and prediction markets. A global digital era, in which resources can be used in a more fungible and transferable way, could further promulgate the use of markets as a resource allocation tool.

A focus on preference rather than monetary value, and other currencies such as attention, authority, trust, etc. could vastly extend the range of implementation of market principles.

Sunday, August 16, 2009

iPhone Biodefense App

Right now it would be nice for people to be able to perform a detailed inspection of whatever environment they are in, and of themselves internally. As the future evolves, it could become an exigency. Portable personal biosensing devices for biothreat defense and medical self-diagnosis could become de rigueur, most logically as an extension of current mobile device platforms.

Hardware Requirements:

  • Integrated Lab-on-a-chip module with flow cytometer, real-time PCR, microarray and sequencing unit (genome, proteome, metabolome, lipidome, etc.)
  • Disposable finger-prick lancets
Software Requirements:
  • Data is collected and perhaps digitized locally, then transmitted for processing and interpretation via web services (a minimal data flow is sketched below)
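A hedged sketch of that collect-then-transmit flow, assuming a hypothetical interpretation web service; the endpoint URL, payload fields and the hardware driver call are placeholders, not a real API.

```python
# Hypothetical biosensing data flow: digitize a reading locally, then
# POST it to a remote interpretation service. The endpoint and payload
# fields are placeholders for illustration only.
import json
import urllib.request

def submit_sample(raw_signal, device_id):
    """Digitize locally, then send to a (hypothetical) web service."""
    payload = {
        "device": device_id,
        "signal": [round(x, 3) for x in raw_signal],   # local digitization
    }
    request = urllib.request.Request(
        "https://example.org/biosense/v1/interpret",   # placeholder endpoint
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(request) as response:
        return json.load(response)   # e.g. pathogen calls, risk flags

# reading = lab_on_chip.acquire()   # hypothetical hardware driver call
# print(submit_sample(reading, device_id="iphone-bio-001"))
```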
What is the current status of the iPhone Biodefense App?
  • A. Order online
  • B. DIY with components from Fry’s
  • C. Have a roadmap, getting supplies and building tools
  • D. Homesteads and landgrab available to pioneers
  • E. “Ahead of the science,” aka it’s always 20 years out!
Answer: C. Have a roadmap, getting supplies and building tools
Single-cell identification, extraction and genotyping are starting to be possible from a research perspective (e.g., the Love Lab at MIT). Lab-on-a-chip functionality has been miniaturized (e.g., small flow cytometers, small PCR machines). Now the trick is to integrate and add features to these systems, extend the functionality, shrink them further and reduce constraints. Microarrays and sequencing also have several innovation cycles ahead.

Key constraint: time
In addition to moving down the cost curve (most relevant for sequencing), performance time is the key constraint. Substances, expressed genes, blood biomarkers, etc. can be detected, but detection takes hours or days when it needs to be immediate.

Declassify custom biodefense microarrays
Lawrence Livermore National Laboratory has one of the most advanced biodefense labs in the country. The lab has developed custom microarrays for government agencies that it would now like to transfer into the public health domain. This could revolutionize and hasten commercial biosensing applications, much as the declassification of adaptive optics revolutionized astronomy. At least three custom microarrays have been developed:
  • Microbial Detection Array: identify what a substance is
  • Virulence Array: identify how much damage a substance could do
  • Microbial Defense Genotyping Array: identify SNPs, indels

Sunday, August 09, 2009

Open Global Courseware

The U.K., long an adopter of surveillance technology, announced recently that high-definition CCTV cameras from Classwatch have been installed in 94 schools. The result has been improved classroom management and there are plans to install hundreds more cameras nationwide in primary and secondary schools.

Free global education resource
With minimal effort, this internal surveillance initiative could be expanded into a worldwide sousveillance victory. A global education resource could be generated by broadcasting and archiving the live feeds to the web for access by teachers and students worldwide in their own classrooms and via cell phones. This is essentially an extension of MIT’s open courseware concept.

Language imperialism and the return of the British Empire?
The U.K. might briefly enjoy the notion of re-establishing the British Empire by exporting English-language education, but

language is becoming more fungible over time
The issue of language imperialism could be avoided with the use of audio translation tools (Google Translate – audio version?) and by opting-in CCTV broadcasts from schools in other countries. The pilot project phases could be U.K. transmissions targeted at India, and Beijing transmissions targeted at rural Chinese schools.

PenPal 2.0 flattens the world
Classroom broadcasts could quickly become interactive with commenting and messaging on the streams. Students worldwide could get to know each other and work on team projects together in virtual world classrooms like Second Life’s Teen Grid; a multi-dimensional PenPal 2.0. Students in India could come up with ideas to work on problems in the U.K. by interviewing British students and vice versa. Teacher and student exchange programs could arise. Students could vote on the curriculum.
The real way to raise test scores would be to have live head-to-head competitions between different schools in a district, country or around the world (“The class in Chennai did 5% better….”).

Local community engagement tool
Internet broadcast could also enable the local community. Parents could tune in to their children’s classrooms (“Mom, did you see what I did around 10:30?”…”What happened at school today?” “Mom, just watch the feed archive…”). The social networking dimension could deepen student, teacher and parent interaction, as many are already managing homework assignments collaboratively on the web.

American Idol Teacher: injecting abundance
Classroom broadcast could bring more abundance to teaching by providing acknowledgement (whuffie) for good teachers. Innovative and engaging teachers could reach a global audience and become YouTube celebrities. There could be competitions for the Best Teacher of the Pythagorean theorem, Best Teacher in Swindon, etc. as nominated through video clips. Videos could be linked to teacher ranking websites. From a policy perspective, education could become easier to evaluate and standardize. Countrywide best practices could be culled to train new teachers.

Conclusion: inevitability of full-life recording
It seems inevitable that video surveillance/sousveillance will increasingly penetrate public and private areas for a variety of reasons, ranging from safety and crime control to life-logging. One classic opposition argument is that recording inhibits ‘natural’ behavior; however, most people quickly forget and adjust, and the ongoing recording of society will likely advance without much opposition as long as there is a balance between surveillance and sousveillance (e.g., popular access to the technologies and streams).

Sunday, August 02, 2009

Bio-design automation and synbio tools

The ability to write DNA could have an even greater impact than the ability to read it. Synthetic biologists are developing standardized methodologies and tools to engineer biology into new and improved forms, and presented their progress at the first-of-its-kind Bio-Design Automation workshop (agenda, proceedings) in San Francisco, CA on July 27, 2009, co-located with the computing industry’s annual Design Automation Conference. As with many areas of technological advancement, the requisite focus is on tools, tools, tools! (A PDF of this article is available here.)

Experimental evidence has helped to solidify the mindset that biology is an engineering substrate like any other, and the work is now centered on creating standardized tools that are useful and reliable in an experimental setting. The metaphor is very much that of computing: just as most contemporary software developers work at high levels of abstraction and need not concern themselves with the 1s and 0s of machine language, in the future, synthetic biology programmers would not need to work directly with the As, Cs, Gs and Ts of DNA or understand the architecture of promoters, terminators, open reading frames and such. However, with synthetic biology in its early stages, the groundwork to define and assemble these abstraction layers is the task currently at hand.

Status of DNA synthesis
At present, the DNA synthesis process is relatively unautomated, unstandardized and expensive ($0.50-$1.00 per base pair (bp)); it would cost $1.5-3 billion to synthesize a full human genome. Synthesized DNA, which can be ordered from numerous contract labs such as DNA 2.0 in Menlo Park, CA and Tech Dragon in Hong Kong, has been following Moore’s Law (actually faster than Moore’s Law: Carlson Curves double at 2x/yr vs. 1.5x/yr), but is still slow compared to what is needed. Right now short oligos (oligonucleotide sequences up to 200 bp) can be reliably synthesized, but a low-cost, repeatable basis for synthesizing genes and genomes extending into the millions of bp is needed. Further, design capability lags synthesis capability, being about 400-800-fold less capable and allowing only 10,000-20,000 bp systems to be fully forward-engineered at present.
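The headline genome figure follows directly from the per-base price; the sketch below reproduces it and adds an illustrative Carlson-curve projection (the yearly cost-halving assumption is mine, for illustration, not a forecast).

```python
from math import log

BP_HUMAN_GENOME = 3e9            # human genome, base pairs
price_range = (0.50, 1.00)       # current synthesis cost per bp, from the text

for price in price_range:
    total = price * BP_HUMAN_GENOME
    print(f"At ${price:.2f}/bp: ${total / 1e9:.1f} billion per genome")

# Illustrative projection: if synthesis productivity doubles ~2x/yr,
# cost per bp halves roughly yearly (an assumption for illustration).
target = 0.0001                  # $0.0001/bp, an arbitrary illustrative target
print(f"Halvings from $0.50/bp to ${target}/bp: {log(0.50 / target, 2):.1f}")
```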

So far, practitioners have organized the design and construction of DNA into four hierarchical tiers: DNA, parts, devices and systems. The status is that the first two tiers, DNA and parts (simple modules such as toggle switches and oscillators), are starting to be consistently identified, characterized and produced. This is allowing more of an upstream focus on the next two tiers, complex devices and systems, and the methodologies that are needed to assemble components together into large-scale structures, for example those containing 10 million bp of DNA.

Standardizing the manipulation of biology
A variety of applied research techniques for standardizing, simulating, predicting, modulating and controlling biology with computational chemistry, quantitative modeling, languages and software tools are under development and were presented at the workshop.

Models and algorithms
In the models and algorithms session, there were some examples of the use of biochemical reactions for computation and optimization, performing arithmetic computation essentially the same way a digital computer would. Basic mathematical models such as the CME (Chemical Master Equation) and SSA (Stochastic Simulation Algorithm) were applied and extended to model, predict and optimize pathways and describe and design networks of reactions.
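For readers unfamiliar with the SSA, a minimal Gillespie-style simulation of a two-reaction gene-expression network (constitutive protein production plus first-order decay) looks like the following; the rate constants, counts and time horizon are illustrative.

```python
# Minimal Gillespie SSA for a two-reaction network:
#   production: 0 -> P   at rate k_make
#   decay:      P -> 0   at rate k_decay * P
import math
import random

def ssa(k_make=10.0, k_decay=0.1, protein=0, t_end=50.0):
    """Exact stochastic simulation of the birth-death network above."""
    t, trajectory = 0.0, [(0.0, protein)]
    while t < t_end:
        a_make, a_decay = k_make, k_decay * protein      # reaction propensities
        a_total = a_make + a_decay
        t += -math.log(1.0 - random.random()) / a_total  # exponential wait
        if random.random() < a_make / a_total:           # pick which reaction fires
            protein += 1
        else:
            protein -= 1
        trajectory.append((t, protein))
    return trajectory

final_t, final_p = ssa()[-1]
print(f"t = {final_t:.1f}, protein count = {final_p}")
# Counts fluctuate around the deterministic steady state k_make/k_decay = 100.
```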

Experimental biology
The experimental biology session considered some potential applications of synthetic biology. The first was the automated design of synthetic ribosome binding sites to make protein production faster or slower, with the finding that the translation rate can be predicted if the Gibbs free energy (delta G) can be predicted. The second was an in-cell disease protection mechanism in which synthetic genetic controllers were used to prevent the lysis normally occurring when the lysis-lysogeny switch is turned on in the disease process (lysogeny is the no-harm state and lysis is the death state).
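The ribosome binding site result can be sketched as a simple exponential law: translation initiation rate falls off exponentially in the predicted binding free energy. The beta constant and the proportionality below are illustrative assumptions, not the published model’s fitted parameters.

```python
# Illustrative thermodynamic sketch: translation initiation rate as an
# exponential function of predicted binding free energy dG (kcal/mol).
# The beta constant here is an assumed value for illustration.
import math

def relative_translation_rate(delta_g, beta=0.45):
    """Rate proportional to exp(-beta * dG); more negative dG -> faster."""
    return math.exp(-beta * delta_g)

for dG in (-10.0, -5.0, 0.0):
    rate = relative_translation_rate(dG)
    print(f"dG = {dG:+.1f} kcal/mol -> relative rate {rate:.1f}")
```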

Tools and parts
In the tools and parts session, several software-based frameworks and design tools were presented, many of which are listed in the software tools section below.

Languages and standardization
The languages and standardization session had discussions of language standardization projects such as the BioStream language, PoBol (Provisional BioBrick Language) and the BioBrick Open Language (BOL).

Software tools: a SynBio CrunchUp
Several rigorous computer-aided design and validation software tools and platforms are emerging for applied synthetic biology, many of which are freely available and open-source.

  • Clotho: An interoperable design framework supporting symbol, data model and data structure standardization; a toolset designed in a platform-based paradigm to consolidate existing synthetic biology tools into one working, integrated toolbox
  • SynBioSS - Synthetic Biology Software Suite: A computer-aided synthetic biology tool for the design of synthetic gene regulatory networks; computational synthetic biology
  • RBS Calculator: A biological engineering tool that predicts the translation initiation rate of a protein in bacteria; it may be used in Reverse Engineering or Forward Engineering modes
  • SeEd - Sequence Editor (work in progress): A tool for designing coding sequence alterations, a system conceptually built around constraints instead of sequences
  • Cellucidate: A web-based workspace for investigating the causal and dynamic properties of biological systems; a framework for modeling modular DNA parts for the predictable design of synthetic systems
  • iBioSim: Design automation software for analyzing biochemical reaction network models, including genetic circuits and models representing metabolic networks, cell-signaling pathways, and other biological and chemical systems
  • GenoCAD: An experimental tool for building and verifying complex genetic constructs derived from a library of standard genetic parts
  • TinkerCell: Computer-aided design software for synthetic biology

Future of BioCAD
One of the most encouraging aspects of the current evolution of synthetic biology is the integration the field is forging with other disciplines, particularly electronics design and manufacture, DNA nanotechnology and bioinformatics.

Scientists are meticulously applying engineering principles to synthetic biology and realize that novel innovations are also required, since there are issues specific to engineering biological systems. Some of these technical issues include device characterization, impedance matching, rules of composition, noise, cellular context, environmental conditions, rational design vs. directed evolution, persistence, mutations, crosstalk, cell death, chemical diffusion, motility and incomplete biological models.

As happened in computing, and is happening now in biology, the broader benefit to humanity of being able to develop and standardize abstraction layers in any field can be envisioned.
Clearly there will be ongoing efforts to more granularly manipulate and create all manner of biology and matter. Some of the subsequent areas where standards and abstraction hierarchies could be useful, though not immediate, are the next generations of computing and communications, molecular nanotechnology (atomically precise matter construction from the bottom up), climate, weather and atmosphere management, planet terraforming and space colony construction.

(Image credits: www.3dscience.com, www.biodesignautomation.org)

Sunday, July 26, 2009

Ethics of brainless humans

As a thought experiment, if it were possible, would it be ethical to make humans without brains for research purposes?

The idea arises since a more accurate model of humans for drug testing would be quite helpful. Drugs may work in mice, rats and monkeys but not in humans, or in some humans but not others. Human biology is more complex, and the detailed pathways and mechanisms are not yet understood.

Of course by definition, a brainless human is not really a human; a human form without a brain would be more equivalent to a test culture of liver cells than a cognitive agent.

Tissue culturing, regenerative medicine and 3D organ printing
A less contentious version of the idea of growing brainless humans is currently under initial exploration: taking tissue from a human, growing it up in culture and testing drugs or other therapies on it. A further step up is regenerative medicine, producing artificial organs from a person’s cells, such as the Wake Forest bladder and Gabor Forgacs’ 3D organ printing work.

Brain as executive agent may be required
The next steps for testing would be creating systems of interoperating tissues and organs (e.g., how would this person’s heart and liver respond to this heart drug?) and possibly a complete collection of human biological systems sans brain. One obvious issue is that this might not even work: the brain is such a critical component that a brainless human might not be buildable without some sort of executive organizing system in its place. Also, medical testing would need to include the impact on the brain and the brain’s role and interaction with the other biological systems and the drug.

Ethical but impractical
Where it is quite clear that generating a full living human for research purposes would be unethical, it is hard to argue that generating a brainless human, a complex collection of human biological systems without a brain, which is not really human and does not have consciousness or personhood, would be unethical. Certainly some arguments could be made to the contrary regarding the lack of specific knowledge about consciousness and concepts of personhood, but these would seem to be outweighed.

Unlikely to arise
It is extremely unlikely that the situation of manufacturing brainless humans for research purposes would ever arise: first, because much testing and therapy may be possible with personalized tissue cultures and regenerative medicine, informed by genomic and proteomic sequencing; and second, because in an eventual era when it might be possible to construct a brainless human or a collection of live interacting tissues and organ systems, it would probably be more expedient to model the whole biological system digitally.