Overviews: Thirty-Five Years of Cell Biology


The society was established in and is one of the nation's oldest and most respected medical honor societies. Dachuan Zhang, a Postdoctoral Fellow in the laboratory of Dr.

Welcome New Cell Biology Faculty! Britta Will, Ph.D., received her Ph.D. Prior to this appointment, Britta did outstanding research on the regulation of normal hematopoietic and malignant stem cell function under Dr.

Her research program dissects how age-related changes in gene-regulatory mechanisms contribute to hematopoietic stem cell failure and malignant transformation; it exploits a combination of genetic mouse models, primary human specimens, and cutting-edge molecular and cell biological assay systems. The grant was made to support her research on the roles of PI3 kinase in myeloid leukemia cells and their bone marrow niche.

The Department of Cell Biology would like to extend a very warm welcome to our newest faculty member, Associate Professor Dr. Matthew Gamble. Dr. Gamble has been a member of the Department of Molecular Pharmacology faculty, where he was recently promoted to Associate Professor of Molecular Pharmacology; he will join our department with a secondary appointment as Associate Professor of Cell Biology.



Several labs in our department have already had very fruitful interactions with Dr. Gamble and his lab members. His laboratory is located in the Golding Building. She will use the funding to characterize molecular abnormalities in HSCs of patients with myelodysplastic syndrome in order to develop mechanism-based therapeutic approaches. The honor recognizes an individual who has made exemplary research contributions to the field of B cell biology.

Scharff is world-renowned as a pioneer in the development and application of monoclonal antibodies, which have become a cornerstone of biomedical research. Congratulations to Dr. Travis Bernardo and Dr. Barnali Biswas for winning Postdoctoral Fellowship awards! Shulman Award for Excellence in Teaching: the recipient of this award is nominated and selected by the graduate students as a faculty member who has demonstrated exemplary skill in teaching and mentoring.

Of special note: this is the second time that Barbara has received this award! Congratulations to Barbara on this very appropriate recognition, by the graduate students, of her dedication and her teaching and mentoring skills. In , AAAS members were awarded this honor because of their scientifically or socially distinguished efforts to advance science or its applications.

Margaret Kielian, Ph.D., Golding Chair in Microbiology. Richard Kitsis, M.D. Robert Singer, Ph.D.

Research in the Department of Cell Biology is focused on understanding molecular mechanisms of gene regulation in eukaryotic cells.

The human population would lack immunological protection against such viruses, existing antiviral drugs would not afford any protection, and these viruses could be spread simply by the release of an aerosol spray in several crowded areas.

A more holistic understanding of complex biological systems makes it possible to identify their critical components. Critical components can then serve as targets for therapeutic and preventive intervention or manipulation; they can also serve as targets for malevolent manipulation and as the basis for novel kinds of biological attack. Examples of the tools that could be used to manipulate complex biological systems include gene silencing and novel binding reagents such as the aptamers described below.

In many ways this category of technologies opens up entirely novel aspects of the future biodefense and biothreat agent landscapes and changes the fundamental paradigm for future discussions on this topic.

The phenomenon now known as RNA interference is recognized to be a common antiviral defense mechanism in plants and a common phenomenon in many other organisms, including mammals.

This field is exploding with new discoveries almost daily concerning the role of miRNAs in regulating gene expression during development and beyond. The interaction of endogenous miRNAs with cellular mRNAs encoding specific proteins leads to suppression of protein expression, either by impairing the stability of the mRNA or by suppressing its translation into protein. The fact that small, largely double-stranded RNAs of this type, about 21 nucleotides in length, could play such an apparently broad and fundamental role in development and in the control of cellular homeostasis was not at all appreciated just a few years ago, and it highlights the sudden, unpredictable paradigm shifts and sharp turns in scientific thinking that are possible as the life sciences advance. The basic molecular mechanism of RNAi is as follows: long double-stranded RNA is cleaved by the enzyme Dicer into short interfering RNAs (siRNAs) of roughly 21 nucleotides, which are loaded into the RNA-induced silencing complex (RISC); the guide strand then directs RISC to cleave mRNAs bearing a complementary sequence. Endogenous miRNAs are processed through a closely related pathway.

They are capable of similarly silencing gene expression but can also direct post-transcriptional silencing by blocking translation of a targeted host mRNA. RNAi is highly specific and remarkably potent (only a few dsRNA molecules per cell are required for effective interference), and the interfering activity can occur in cells and tissues far removed from the site of introduction. The technology is expected to prove particularly valuable in cases where the targeted RNA encodes genes and protein products inaccessible to conventional drugs.
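To make the degree of sequence specificity concrete, the short sketch below (Python; both sequences are invented for illustration) finds the single site in a toy mRNA that is perfectly complementary to a 21-nucleotide siRNA guide strand. Real siRNA design also weighs strand-loading rules, thermodynamics, and off-target matches.

```python
# Minimal sketch: locate the target site of a 21-nt siRNA guide strand in an mRNA.
# Both sequences below are invented for illustration.
COMPLEMENT = str.maketrans("AUGC", "UACG")

def target_site(mrna: str, guide: str) -> int:
    """Return the start index of the site perfectly complementary to the guide, or -1."""
    # The guide pairs antiparallel with its target, so the target site is the
    # reverse complement of the guide sequence.
    target = guide.translate(COMPLEMENT)[::-1]
    return mrna.find(target)

mrna = "AGGCUUACGGAUCCGAUUACGCUAGCUAGGCAUCGAUCGGAUACGUUAGC"
guide = "GCCUAGCUAGCGUAAUCGGAU"  # 21 nt

print(target_site(mrna, guide))  # 10: the guide has one perfect-match site in this toy mRNA
```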

However, clinical delivery poses a significant challenge, as does the likelihood of undesirable silencing of nontargeted genes. In , a German research team announced success with a lentivirus vector. Substantial progress is being made toward this aim, however, using liposome and lipid nanoparticle formulations of chemically modified, and hence stabilized, siRNAs. Scientists at Sirna, a small biotech company working for well over a decade on nucleic-acid-based therapies, have recently described a 1,fold reduction in the amount of hepatitis B virus present in the blood of mice replicating this virus in the liver, following a series of three separate intravenous inoculations of a lipid-nanoparticle-formulated, chemically modified siRNA.

In November , researchers from Alnylam Pharmaceuticals used chemically modified siRNAs to silence genes encoding apolipoprotein B (ApoB) in mice, resulting in decreased plasma levels of ApoB protein and reduced total cholesterol. Importantly, the delivery did not inadvertently impact nontargeted genes. Still, there are questions about the specificity of the siRNA, given that the investigators did not evaluate all proteins and given that they collected measurements over a relatively short period of time.

Observations that RNAi works in vivo in mammals have not only created opportunities for the development of new therapeutic tools but also spawned a new generation of genetic research in mammals. One could, for example, temporarily switch off a tumor suppressor gene suspected of providing genome protection. It is reasonable to expect significant additional advances in the formulation of siRNAs for use as pharmacological agents, particularly with contributions from the field of nanotechnology.

As with so many of the technologies outlined in this chapter, just as RNAi promises new therapeutic options for cancer and other diseases, it could also be used to manipulate gene expression with the intent to do harm.

Aptamers are short, single-stranded nucleic acid or peptidic ligands that fold into well-defined three-dimensional shapes, allowing them to inhibit or modulate their protein targets with high affinity and specificity. Since their discovery in the early s, aptamers have been used for target validation, as detection reagents, and as functional proteomic tools.


One of the first aptamers tested in an animal model was an antithrombin agent that blocks the proteolytic activity of thrombin, a protein involved in thrombosis (blood clot formation in a blood vessel); companies including one based in Cambridge, MA, and Nuvelo, Inc. have been involved in its development.


    A New York, NY-based company is testing Macugen, an aptamer that targets VEGF (vascular endothelial growth factor), as a treatment for age-related macular degeneration and diabetic macular edema. So-called tadpoles are two-part detection reagents: the head has an affinity for a specific target molecule, while the tail contains a DNA region that can be amplified by PCR. Their sensitivity, dynamic range, and, in the case of tadpoles, precise quantification make these high-affinity binding molecules potentially very useful tools for disease diagnosis and environmental detection, including pathogen and other biological agent detection in the event of a naturally occurring or deliberate biological attack.
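    Because the readout for tadpole-style reagents is a PCR signal, quantification reduces to standard real-time PCR arithmetic. The sketch below (Python; all Ct values are invented for illustration) fits a standard curve and back-calculates the copy number of an unknown sample:

```python
# Sketch: absolute quantification from real-time PCR threshold cycles (Ct).
# All Ct values are invented for illustration.
import numpy as np

copies = np.array([1e3, 1e4, 1e5, 1e6, 1e7])       # known standards (copies)
ct     = np.array([30.1, 26.8, 23.5, 20.1, 16.8])  # measured threshold cycles

slope, intercept = np.polyfit(np.log10(copies), ct, 1)  # Ct = slope*log10(N) + intercept
efficiency = 10 ** (-1.0 / slope) - 1                    # ~1.0 means perfect doubling per cycle

unknown_ct = 22.4
estimated_copies = 10 ** ((unknown_ct - intercept) / slope)
print(f"amplification efficiency ~ {efficiency:.2f}")
print(f"estimated copies in unknown ~ {estimated_copies:.2e}")
```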

    Despite their promise as therapeutic agents, aptamers are very expensive to synthesize and are still a largely unknown entity with respect to administration, formulation, adverse effects, etc.


    So although several compounds have entered clinical trials, their future as biopharmaceuticals is unclear.

    Life scientists have exploited computing for many years in some form or another. But what is different today—and will be increasingly so in the future—is that the knowledge of computing and mathematical theory needed to address many of the most challenging biological problems can no longer be easily acquired; it requires instead a fusion of the disciplines of biology, computation, and informatics.

    A National Research Council (NRC) report entitled Catalyzing Inquiry at the Interface of Computing and Biology has pointed out that the kinds and levels of expertise needed to address the most challenging problems of contemporary biology stretch the current state of knowledge of the field. The report identifies four distinct but interrelated roles of computing for biology. Computational tools are artifacts—usually implemented as software but sometimes hardware—that enable biologists to solve very specific and precisely defined problems.

    Such biologically oriented tools acquire, store, manage, query, and analyze biological data in a myriad of forms, in enormous volumes, and of considerable complexity. These tools allow biologists to work with data at scales and complexities that would otherwise be unmanageable.

    Computational models are abstractions of biological phenomena implemented as artifacts that can be used to test insight, to make quantitative predictions, and to help interpret experimental data. These models enable biological scientists to understand many types of biological data in context, and even at very large volumes, and to make model-based predictions that can then be tested empirically.

    Such models allow biological scientists to tackle harder problems that could not readily be posed without visualization, rich databases, and new methods for making quantitative predictions. Biological modeling itself has become possible because data are available in unprecedented richness and because computing itself has matured enough to support the analysis of such complexity.
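    As a toy example of such a model (all parameter values invented for illustration), a few lines of code can simulate a minimal constitutive transcription-translation circuit and produce a quantitative, testable prediction of steady-state abundances:

```python
# Minimal computational model: constitutive transcription and translation with
# first-order decay, integrated by forward Euler. Parameter values are invented.
import numpy as np

k_m, d_m = 2.0, 0.2    # mRNA synthesis (molecules/min) and decay (1/min)
k_p, d_p = 10.0, 0.05  # protein synthesis per mRNA (1/min) and decay (1/min)

def simulate(t_end=120.0, dt=0.01):
    m = p = 0.0
    for _ in np.arange(0.0, t_end, dt):
        m += (k_m - d_m * m) * dt
        p += (k_p * m - d_p * p) * dt
    return m, p

m_end, p_end = simulate()
# Prediction to test against data: steady states m* = k_m/d_m and p* = k_p*m*/d_p.
print(m_end, k_m / d_m)                # ~10 vs 10
print(p_end, k_p * (k_m / d_m) / d_p)  # ~2000 vs 2000
```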

    A computational perspective or metaphor on biology applies the intellectual constructs of computer science and information technology as ways of coming to grips with the complexity of biological phenomena that can be regarded as performing information processing in different ways. This perspective is a source for information and computing abstractions that can be used to interpret and understand biological mechanisms and function. Because both computing and biology are concerned with function, information and computing abstractions can provide well-understood constructs that can be used to characterize the biological function of interest.

    Further, they may well provide an alternative and more appropriate language and set of abstractions for representing biological interactions, describing biological phenomena, or conceptualizing some characteristics of biological systems. Cyberinfrastructure and data acquisition are enabling support technologies for 21st century biology. Cyberinfrastructure—high-end general-purpose computing centers that provide supercomputing capabilities to the community at large; well-curated data repositories that store and make available to all researchers large volumes and many types of biological data; digital libraries that contain the intellectual legacy of biological researchers and provide mechanisms for sharing, annotating, reviewing, and disseminating knowledge in a collaborative context; and high-speed networks that connect geographically distributed computing resources—will become an enabling mechanism for large-scale, data-intensive biological research that is distributed over multiple laboratories and investigators around the world.

    New data acquisition technologies such as genomic sequencers will enable researchers to obtain larger amounts of data of different types and at different scales, and advances in information technology will be needed to manage and interpret them. A new level of sophistication in computing and informatics is required for interpretation of much of the data generated today in the life sciences. These data are highly heterogeneous in content and format, multimodal in collection, multidimensional, multidisciplinary in creation and analysis, multiscale in organization, and international in collaborations, sharing, and relevance.

    These data may well be of very high dimension, since data points that might be associated with the behavior of an individual unit must be collected for thousands or tens of thousands of comparable units. The size and complexity of the data sets being generated require novel methods of analysis, which are being provided by computational biologists. The application of such methods means that large-scale problems—such as the analysis of an organism—can be solved in minutes rather than weeks.

    The NRC report notes that these data are windows into structures of immense complexity. Biological entities and systems consisting of multiple entities are sufficiently complex that it may well be impossible for any human being to keep all of the essential elements in his or her head at once. Thus, advances in computational biology will be driven by the need to understand how complex biological systems operate and are controlled and will contribute fundamentally to the development of a systems view in biology. In some ways, computing and information will have a relationship to the language of 21st century biology that is similar to the relationship mathematics has long had to the physical sciences.

    Computing itself can provide biologists with an alternative and possibly more appropriate language and sets of intellectual abstractions for creating models and data representations of higher-order interactions, describing biological phenomena, and conceptualizing some characteristics of biological systems.

    Systems biology—also known as integrative biology—uses high-throughput, genome-wide tools (e.g., genomic and proteomic profiling) to study biological systems as integrated wholes rather than as collections of parts.



    It is, in a sense, classical physiology taken to a new level of complexity and detail. A systems biologist seeks to quantify all of the molecular elements that make up a biological system and then integrate that information into graphical network models that can serve as predictive hypotheses. A growing number of researchers in the life sciences community are recognizing the usefulness of systems biology tools for analyzing complex regulatory networks (both those inside the cell and those that integrate and control the functions of distinctly different cell types in multicellular organisms such as humans) and for making sense of the vast genomic and proteomic data sets that are so rapidly accumulating.
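    To make "graphical network model" concrete, the sketch below (Python with the networkx library; the interactions are invented for illustration) assembles a small directed regulatory graph and asks two typical systems-level questions: which node is the most connected hub, and does the network contain feedback loops?

```python
# Sketch of a regulatory network model. Edges are invented for illustration;
# in practice they are inferred from genome-wide expression and interaction data.
import networkx as nx

g = nx.DiGraph()
g.add_edges_from([
    ("signal_X", "TF_A"),
    ("TF_A", "gene1"), ("TF_A", "gene2"), ("TF_A", "TF_B"),
    ("TF_B", "gene3"), ("TF_B", "TF_A"),   # TF_A and TF_B regulate each other
])

hub = max(g.nodes, key=g.degree)            # most highly connected node
feedback_loops = list(nx.simple_cycles(g))  # cycles correspond to feedback
print(hub)             # TF_A
print(feedback_loops)  # e.g. [['TF_A', 'TF_B']]
```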

    Systems biology is being seen as a valuable addition to the drug discovery toolbox.


    This field is rapidly evolving, with the computational tools still in an immature state and inadequate for handling the reams of data derived from microarray assays and their functional correlates. Unconventional means of recording experimental results and conveying them rapidly to others in the field using an Internet-based approach are being pursued in an effort to manage the scale of data collection and analysis required for this effort.

    Researchers are coming to realize that many novel molecular mechanisms are involved in controlling these signaling pathways: not only phosphorylation and kinase activation, as classically recognized in signal transduction, but also specific protein conformational changes, the translocation of proteins to different cellular compartments, proteolytic cleavage of signaling partners and latent transcription factors, and the binding and release of modulatory proteins from key signaling intermediates.

    A similar multiplicity of mechanisms exists within the extracellular regulatory networks, which must ultimately take their cues from intracellular events. In all of these signaling networks, tremendous specificity of responses stems from the timing, duration, amplitude, and type of signal generated and the pathways from which it emanates.


    At present, the magnitude and nature of the challenge posed by systems biology are increasingly well recognized, but it remains unclear exactly how these challenges will be met or how successful attempts to do so will be. The rise of systems biology is expected to have profound implications for research, clinical practice, education, intellectual property, and industrial competitiveness.

    As computational technologies advance, simulation of complex biological systems will have more predictive accuracy, aspects of laboratory experimentation will be replaced by more cost-effective computational approaches, and physicians will have new decision support tools to help them identify the best preventive and therapeutic approaches for individual genotypes and phenotypes. Just as systems biology will profoundly alter the way scientists and physicians conduct their analyses, the same global problem-solving approach is expected to reshape other areas of the life sciences and medicine.

    Genomic, or personalized, medicine refers to potential patient-tailored therapies made possible by improved molecular characterization of disease, technologies that allow for rapid genomic and proteomic analyses of individual patients, and advances in information technology that allow practitioners to access this information in meaningful ways. Scientists have known for a long time that human genetic variation is associated with many diseases. With recent advances in technology that allow for quick, affordable genotypic assessments, such patient-tailored approaches are becoming increasingly practical.

    Recent accomplishments in the field include the use of an epidermal growth factor receptor (EGFR) tyrosine kinase inhibitor, gefitinib, in lung cancer patients whose tumors carry particular EGFR mutations. Moreover, in one study the mutations and benefits of treatment were more prevalent among Japanese patients than U.S. patients. Herceptin provides another, more publicized example of the potential for genomic medicine.

    In , this drug became the first gene-based therapeutic licensed and marketed for use against breast cancer. Understanding and harnessing genomic variation are expected to contribute significantly to improving the health of people worldwide, including the developing world. The Mexican government and medical and biomedical research communities view the present time as a window of opportunity for investing in this emerging technological trend, so as to minimize the likelihood of needing to depend on foreign aid and sources in the future. Already, high-tech manufacturing and financial services serve as the fulcrum of the Singaporean economy.

    Strengthening biotechnological capacity, including genomic medicine capacity, is viewed as the next high-tech step forward to accelerated economic growth. Integrating personalized, or genomic, medicine into regular health care in any country will require overcoming two major challenges. The first is technological; some experts believe it will require entirely new technology. The second and arguably more significant challenge will be making the philosophical jump from the highly interventional, British-style school of medicine to a preventive, predictive health care paradigm.

    Genomic medicine is expected to revolutionize human medicine by altering the nature of diagnosis, treatment, and prevention. In traditional medicine, diagnosis is based on clinical criteria, treatment is population-based, and prevention is based on late-stage identification of disease. In genomic medicine, diagnosis is based on molecular criteria, treatment is tailored to the individual, and prevention is based on early identification of molecular risk.

    Knowledge generated from genomic medicine could potentially be used to target specific ethnic, racial, or other population characteristics. While knowledge spreading from the various genome projects has fueled speculation in this area, two points should be kept in mind when considering this topic.

    First, the huge number of point mutations and other polymorphisms within the genome is not likely to enable any selective targeting in the near future. Although techniques such as RNAi, as discussed previously, certainly have the capability to inhibit the expression of key genes with relevant single nucleotide polymorphisms (SNPs) within them, the proportion of such mutations lying in functionally important areas of the genome is small and the technical difficulties associated with exploiting them are real.

    The technology to construct such weapons exists. For almost two decades, researchers have been using adenoviruses to target tumor cells in individuals and steadily refining their techniques for directing viral entry into cells. For example, it is now possible to modify through genetic approaches the fibers used by the virus for cellular attachment so that the virus attaches to particular cell types.

    Interestingly, while the availability of the complete human genome sequence has revealed numerous SNPs and other polymorphic elements—and has consequently raised greater concern about the possibility of using biological weapons to target specific racial or ethnic populations—the ability to identify and exploit genetic differences among such populations does not require this new information. Adenoviruses could be used to deliver antibodies that target distinct ethnic groups with characteristic cell surface molecules, without needing to identify population-specific SNPs.

    The stability and integrity (homeostasis) of the molecular circuits, pathways, and networks responsible for diverse body functions are altered by disease and by exposure to noxious environmental pollutants and toxins (xenobiotics). The quest to identify the molecular circuits and control systems in each specialized cell type in the body, and to understand the perturbations that give rise to disease, is a dominant research theme in contemporary biology. Analysis of the disease-induced perturbations in biocircuits also provides the intellectual foundation of modern drug discovery, which is based increasingly on the rational design of therapeutic agents directed against the specific molecular lesions responsible for disease etiology.

    These insights hold great promise for future advances in medicine, agriculture, ecology, and the environmental sciences. But the very same knowledge about the homeostatic control of body biocircuits can be usurped for less beneficent intentions. The rapid pace of research progress in revealing the detailed molecular circuit diagrams and control processes for every body function dictates that the risk of evolution of new threats will escalate in parallel. The commercial availability of large libraries of bioactive chemical compounds, together with automated high-throughput screening methods, makes it increasingly easy to identify compounds that act on particular biological targets.

    Combinatorial chemistry and the directed evolution methods described earlier in this chapter are now used routinely to generate chemical libraries containing 10^4 to 10^7 compounds at relatively low cost (tens of thousands of dollars). The emerging field of toxicogenomics involves profiling the changes in gene and protein expression induced by chemicals found in the industrial workplace to assess potential risk from exposure to occupational and environmental hazards. The pharmaceutical industry and drug regulatory agencies such as the FDA and their international counterparts have also recognized the value of toxicogenomic profiling as a new tool to detect how investigational drugs might adversely affect genes important in drug metabolism or affect homeostatic genes that may lead to acute or chronic side effects.
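    Operationally, such a profile is built by comparing expression in exposed versus control samples, gene by gene. The sketch below (Python; the gene names are real but every value is invented) shows the core fold-change calculation:

```python
# Minimal toxicogenomic comparison: log2 fold change between exposed and control
# samples. Expression values are invented for illustration.
import numpy as np

genes   = ["CYP1A1", "GSTP1", "HSPA1A", "ACTB"]
control = np.array([120.0,  80.0, 200.0, 1500.0])  # mean normalized expression
exposed = np.array([960.0, 150.0, 820.0, 1480.0])

log2_fc = np.log2(exposed / control)
for gene, fc in zip(genes, log2_fc):
    call = "induced" if fc > 1 else ("repressed" if fc < -1 else "unchanged")
    print(f"{gene}: log2 fold change = {fc:+.2f} ({call})")
```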

    The current heightened public and legislative concern over drug safety will likely intensify pressures for the adoption of toxicogenomics as a routine part of the drug approval process. The benefits of toxicogenomics are self-evident. Once again, however, research that reveals structure-activity relationship (SAR) correlations between chemical structures and specific toxicity events provides useful grist for the design of biological circuit disruptors in malevolent hands.

    More robust correlations between chemical structure, therapeutic activity, and absorption, distribution, metabolism, excretion, and toxicology (ADMET) properties will also come from research in the new field of chemical genomics (also referred to as chemogenomics or chemical biology). This emerging area of research seeks to establish the SAR rules of how chemical structure defines the selective interaction of different structural classes of molecules with various families of cellular proteins.

    The center will be part of a consortium of chemical genomics screening centers to be located across the country whose purpose will be to identify small-molecule inhibitors of every important human cellular protein or signaling pathway. Part of the rationale for the chemical genomics initiative s is that, in contrast to researchers in the pharmaceutical industry, many academic and government scientists do not have easy access to large libraries. The database will give academic and government researchers an opportunity to identify useful biological targets and thereby contribute more vigorously to the early stages of drug development.

    With plans to screen more than , small-molecule compounds in its first year of operation, one of the goals of the Chemical Genomics Center network is to explore the areas of the human genome for which small-molecule chemical probes have yet to be identified. Data generated by the network will be deposited in a comprehensive database of chemical structures and their biological activities. The database, known as PubChem, will be freely available to the entire scientific community. In addition to screening and probe data, it will list compound information from the scientific literature.
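    For orientation, the sketch below shows one way a researcher might pull basic compound data from PubChem programmatically. It assumes the PUG REST web service and the property names shown; check the current PubChem documentation before relying on either.

```python
# Sketch: querying PubChem's PUG REST service for basic compound properties.
# URL pattern and property names are assumptions to verify against current docs.
import json
import urllib.request

compound = "aspirin"
url = (
    "https://pubchem.ncbi.nlm.nih.gov/rest/pug/compound/name/"
    f"{compound}/property/MolecularFormula,MolecularWeight,CanonicalSMILES/JSON"
)

with urllib.request.urlopen(url, timeout=30) as resp:
    record = json.load(resp)

props = record["PropertyTable"]["Properties"][0]
print(props["MolecularFormula"], props["MolecularWeight"], props["CanonicalSMILES"])
```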

    However, the availability of information and reagents that enable one to disrupt critical human physiological systems has profound implications for the nature of the future biological and chemical threat spectrum. The difference between the NIH and industrial efforts resides in the fate of the information produced from these large-scale screening programs. Companies view their screening data and the accompanying SARs to be proprietary assets. Their data are viewed as a source of corporate competitive advantage and are not typically placed in the public domain.

    In contrast, the NIH data will be placed in the public domain, with the unavoidable accompanying complication of creating a rich source of SAR information that could potentially be exploited for malevolent use.

    In the past, the dual-use risk of bioregulators was considered minimal because of their lack of suitability for aerosolization (unless microencapsulated), their limited shelf life after atmospheric release, the fact that proteins denature at very high temperatures and lose activity at low temperatures, and their high purchase costs.

    However, new knowledge and advancing technologies, particularly encapsulation technologies (as discussed elsewhere in this chapter), have raised concerns about the dual-use risk of bioregulators. A greater understanding of how small molecules and naturally occurring bioregulatory peptides function in higher organisms will open up novel opportunities to design agents—for good or bad—that target particular physiological pathways or systems. Technologies that allow for such production and delivery are evolving very quickly, driven by the goals and needs of the pharmaceutical, agricultural, and healthcare sectors.

    Some of these technologies, which clearly have immense potential future impact on biology, have not been traditionally viewed as biotechnologies or as having relevance to future biological threats. A prime example is the potential now offered by developments in nanoparticle science for the creation of novel and highly efficient delivery systems for previously difficult-to-deliver biologically active compounds. These technologies can be subdivided into those concerned with production, packaging, and delivery. Examples of production technologies with relevance to biology include microreactor technology (as used in the chemical engineering industrial sector) and microfluidics and microfabrication technologies.

    Examples of packaging technologies with relevance to biology include microencapsulation and nanotechnology. Examples of delivery technologies with relevance to biology include aerosol technology, gene therapy, and gene vector technology.

    Biopharming, the production of therapeutic proteins in transgenic crops, differs from bioprospecting in that the latter is sourced from wild populations. A novel advantage of biopharming is the crop-based production of vaccines and antibodies otherwise not possible or too expensive to produce using conventional methods.

    As described in Chapter 1, many different genetically engineered crop varieties with genes for therapeutic products have been developed: transgenic rice (beta carotene, human milk proteins, higher iron content, higher zinc content, low phytic acid, high phytase); transgenic potato (a gene from grain amaranth for high protein content, antigens of cholera and diarrheal pathogens, and hepatitis B vaccine); transgenic maize (AIDS antigens, higher content of lysine and tryptophan, nutritive value equivalent to that of milk); transgenic fruits and vegetables (bananas, melons, brinjals [Solanum melongena], and tomatoes with subunit vaccines against rabies; AIDS antigens in tomatoes; and human glycoprotein in tomatoes to inhibit Helicobacter pylori, against ulcers and stomach cancer); transgenic tobacco (human hemoglobin, human antibody against hepatitis B virus, and 50 percent lower nicotine); and genetically engineered coffee (decaffeinated by gene splicing).

    However, despite the existence of functional prototypes and evidence that the technology works, there are some technical, delivery, and regulatory challenges that are slowing progress in the field. Plant manufacturing platforms may provide a cost-effective means to produce vaccines, offering the ability to address some of the problems associated with global vaccine manufacture and delivery. However, transgenic plants could also be engineered to produce large quantities of bioregulatory or otherwise toxic proteins, which could either be purified from plant cells or used directly as biological agents.

    As with legitimate production, using transgenic plants as bioreactors would eliminate the need for the mechanical equipment normally associated with the process. The technology would be limited to producing protein-based agents.

    Microfluidics and microfabrication are rapidly growing technologies in which a wide variety of processes and manipulations are carried out at miniaturized scales (e.g., in "lab-on-a-chip" devices). The most sophisticated systems are completely integrated, with sample introduction, preprocessing, and detection combined in a single device.

    But most systems are bulkier and rely on external detectors and other devices. Limitations of the current technology include reagent instability and the need for liquid reagent reservoirs. Microelectromechanical systems (MEMS) are a similar miniaturized technology. Unlike microfluidic systems, MEMS devices are self-contained and do not require reagents. Swallowed-capsule technology is a popular example of MEMS: patients swallow a capsule containing all of the miniaturized equipment necessary for taking images in the gastrointestinal tract.

    Nanotechnological advances are decreasing the size of microfluidic and other miniature diagnostic systems even further. For example, Biotrove, Inc. (Waltham, MA) has developed a nanoliter-sample-size real-time PCR machine that, when commercially available, will allow users to analyze thousands of samples simultaneously and at a much lower per-sample cost than with currently available high-throughput microarray systems. Other sampling problems come into play at smaller volumes.

    For example, there have been several recent advances in convenient sampling methods, including breath and saliva sampling, that would be necessary before personalized diagnostic devices become a widely accepted component of personal health care.

    Nanotechnology, which was defined in Chapter 2, started off as little more than a clever means of making incredibly small things. Novelties though they were, these feats proved that, with new tools in hand, scientists could arrange atoms as methodically as masons arrange bricks—and in doing so build materials never made in nature.

    Last year alone, hundreds of tons of nanomaterials were made in the United States.

    Microscopically thin sheets of tightly woven carbon atoms are being wrapped around the cores of tennis balls to keep air from escaping; new fabrics have been endowed with nanofibers that keep stains from setting; some sunscreens have ultraviolet-absorbing nanoparticles so small they cannot reflect light, making them invisible; and tennis rackets and airplane bodies are being made with nanomaterials whose atoms have been carefully arranged to make them especially strong.

    An intriguing feature of the nanoscale is that it is the scale on which biological systems build their structural components, like microtubules, microfilaments, and chromatin. Even more intriguingly, a key property of these biological structures is that they assemble themselves. In their quest to emulate these biological phenomena, scientists have created the field of DNA nanotechnology and the closely related field of DNA-based computation by algorithmic self-assembly.

    Some of the most interesting nanotech research being conducted today falls within the realm of so-called DNA nanotechnology. DNA nanotechnology is the design and development of objects, lattices, and devices made of synthetic DNA. Since the DNA helix is naturally linear, branched motifs must be designed and synthesized; the latter, which laboratories worldwide are working with, can be used in many ways to organize large complexes. Although only about a dozen labs worldwide are involved in high-structural-resolution DNA nanotech, the potential applications—which are many and varied—include architectural control and scaffolding, among others.
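    Designs of this kind rest on programmed Watson-Crick pairing of short single-stranded overhangs ("sticky ends"). A toy complementarity check (Python; sequences invented) looks like this; real designs also screen against unintended partial matches:

```python
# Toy sketch of the sticky-end complementarity check behind DNA self-assembly.
# Sequences are invented for illustration.
COMP = str.maketrans("ATGC", "TACG")

def anneals(end_a: str, end_b: str) -> bool:
    """True if two single-stranded overhangs are exact Watson-Crick partners."""
    return end_a == end_b.translate(COMP)[::-1]   # reverse-complement comparison

tile_1_overhang = "ACGTTAGC"
tile_2_overhang = "GCTAACGT"
print(anneals(tile_1_overhang, tile_2_overhang))  # True: these two tiles can join
```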

    Self-assembling systems are completely autonomous devices that do not require the input of a person or a robot in order to function. Last year an investigator at Purdue University made one of the first self-assembling nano-devices, in this case a DNAzyme, which can bind and cleave RNA molecules one by one. The future trajectory of the field, particularly the convergence of nanotechnology and molecular biology, is unclear, although it will almost certainly have multiple medical applications, including therapeutic delivery by nanoparticles.

    Just as nanotubes and other nanodevices promise novel, advantageous means of drug delivery, there is considerable concern that the very same devices and particles could have inadvertent dangerous (i.e., toxic) effects. Several recent studies have examined the possible toxicity of nanotechnology-derived products.

    In biomedical research, aerosol science revolves around the study of the use of inhaled particulate matter as a means to treat human disease. Although its current widespread use is for local treatment of asthma and chronic obstructive pulmonary disease, direct administration of drugs to the respiratory tract has been effectively used or is being tested to treat bacterial lung infections, cystic fibrosis, and lung carcinoma.

    The effectiveness of aerosol delivery for systemic action is also being explored as a novel, injection-free way to control pain and to deliver various therapeutics for the treatment of diabetes, human growth hormone deficiency in children, prostate cancer, and endometriosis.

    In the drug delivery industry, the three most common types of aerosol delivery devices currently in medical use are propellant metered-dose inhalers (pMDIs), dry powder inhalers (DPIs), and nebulizers. In a pMDI, the aerosol is drawn into a metering chamber, and the solution is then propelled as droplets into the lung. Chlorofluorocarbon (CFC) propellants are being phased out in favor of non-ozone-depleting hydrofluoroalkane inhalers, but the latter require device components and formulations that are different from those of CFC pMDIs, which has necessarily led to the creation of novel delivery means.

    Many new DPI devices and technologies have been developed and patented since the first one was introduced in the s. Most powder products are mixtures of drug particles and large lactose carrier particles. The smaller particles are delivered to the lungs while the larger particles, which help with dispersion, are deposited on the mouthpiece.

    A variety of different technologies have been used in the development of DPIs, and performance varies widely among different types of inhalers. Attempts to improve the delivery of respirable dry products to the lower airways and lungs remain an active area of research. Although air-jet nebulizers are inconvenient devices, owing to their use of compressed gas (and thus the need for an air compressor) and their comparatively long aerosolization time, their capability to deliver a high dose over an extended period is widely considered an advantage over pMDIs and DPIs.

    In addition to their inconvenience, other limitations of the technology include the partial loss of drug dose during exhalation (since nebulizers generate aerosol continuously) and the large size of some of the devices. In addition to the quality and features of the delivery device, critical to the delivery of drugs to the lungs is the preparation of particles of the correct size and shape for incorporation into aerosol products. Advances in powder technology and particle engineering play a significant role in this effort.

    For example, supercritical fluid (SCF) processing has recently emerged as an alternative technology for designing particles for use in metered-dose and dry powder inhalers. Biomedical advances in aerosol delivery technology are expected to improve drug delivery and patient adherence. Several companies are pursuing aerosolized insulin delivery as a non-invasive alternative to injectable insulin.
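    The engineering target in this particle work is usually expressed as an aerodynamic diameter, which for a roughly spherical particle scales with geometric diameter and the square root of particle density (shape factors neglected here). A short worked example, with illustrative values:

```python
# Sketch: aerodynamic diameter of a roughly spherical particle,
# d_ae = d_geometric * sqrt(particle density / unit density), shape factor neglected.
from math import sqrt

def aerodynamic_diameter_um(d_geo_um: float, rho_g_per_cm3: float) -> float:
    return d_geo_um * sqrt(rho_g_per_cm3 / 1.0)   # unit density = 1 g/cm^3

# A porous 10-micrometer particle of density 0.1 g/cm^3 behaves aerodynamically
# like a ~3.2-micrometer particle, i.e. within the ~1-5 micrometer range usually
# cited for deposition in the deep lung.
print(round(aerodynamic_diameter_um(10.0, 0.1), 1))  # 3.2
```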

    It is widely believed that, once proven safe for prolonged use, aerosolized insulin delivery will stimulate further activity in this already very active field. Aerosol delivery is also being explored as a means of gene therapy. Advances in drug delivery technology, including aerosol delivery, have raised concerns about the use of bioregulators for nefarious purposes. In the past, bioregulators have not generally been viewed as potential dual-use agents, largely because of the lack of effective delivery technology.


    Potential delivery platforms include the use of bacterial plasmids or viral vectors to clone the genes that encode bioregulators and the use of transgenic insects. Moreover, transgenic plants could be put to dual use as bioregulator-production factories.

    Microencapsulation is the envelopment of small solid particles, liquid droplets, or gas bubbles with a protective coating derived from any of a number of compounds (organic polymer, hydrocolloid, sugar, wax, fat, metal, or inorganic oxide).

    The capsules are basically miniature containers that protect their contents from evaporation, oxidation, and contamination and can be engineered with any of a variety of release mechanisms. Microencapsulation is not a new technology. Between the late s and early s, the concept of chemical microencapsulation generated interest in the pharmaceutical industry as an alternative mode of drug delivery that could offer sustained controlled release.
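    As a generic illustration of what "sustained controlled release" means quantitatively, the sketch below evaluates a simple first-order release model, f(t) = 1 - exp(-kt); the model choice and rate constant are illustrative assumptions, not a description of any particular product:

```python
# Illustrative first-order release from a microcapsule: f(t) = 1 - exp(-k*t).
# The rate constant is an assumption chosen for illustration only.
from math import exp

k = 0.1  # nominal release rate constant (1/hour)

for t in (1, 4, 8, 24, 48):  # hours
    fraction_released = 1 - exp(-k * t)
    print(f"t = {t:>2} h: {fraction_released:.0%} of payload released")
```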

    In fact, it was partly in response to potential agrochemical applications of encapsulation technology that the Controlled Release Society, an international organization with 3, members from more than 50 countries, was formed in the mid-s (microencapsulation is the most common, but not the only, form of controlled release). According to data provided by the Southwest Research Institute, the number of U.S. patents in this area has grown substantially. There are two general categories of microencapsulation processes: physical and chemical. Between and , polymerization was the most commonly used process, based on U.S. patent filings.

    It has also been used as a way to manage mercury-contaminated and other hazardous wastes. Examples of recent use and exploration of this technology include an investigation by University of Saskatchewan researchers into the use of microencapsulated engineered cells as an alternative approach to cancer treatment. Implantation of encapsulated cells into a mouse model system led to tumor regression and slower tumor growth. In another study, researchers from the Netherlands tested the release, upon chewing, of flavored microencapsulates in Gouda cheese (the microencapsulates contained sunflower oil, lemon, and orange oil flavors).

    An exciting future application is the transplantation of encapsulated live cells for therapeutic purposes. In January , for example, Northrop Grumman (San Diego, CA) announced the development of new encapsulation technology that allows non-marinized weapons and vehicles to be released by submarines.

    In gene therapy, the most commonly used vectors are currently viruses (including retroviruses, adenoviruses, adeno-associated viruses, and herpes simplex viruses) that have been genetically altered to carry normal human DNA. Nonviral options for gene delivery include the direct introduction of therapeutic DNA into target cells, although direct administration can only be used with certain tissues and requires large amounts of DNA. Gene therapy is still experimental, and most of the research performed to date has been conducted in animals, from rodents to primates.

    For example, in a study that appeared in Nature Medicine in March , using a guinea pig model system, researchers from the University of Michigan and Kansai Medical University, Japan, reported that they had used gene therapy to restore hearing in mature deaf animals.

    The introduced gene, Atoh1 (also known as Math1), encodes a basic helix-loop-helix transcription factor and a key regulator of hair cell development. Upon its delivery, hearing was substantially improved. The few human clinical trials that have been conducted have not been as successful as originally hoped. When gene therapy does become a clinical reality, it will be used to correct faulty or defective disease-causing genes.

    The efficacy and safety of medical drugs, imaging agents, and vaccines depend on the ability to deliver these agents to the right location in the body and, ideally, with precision targeting only to the cells of interest. Selectivity in drug delivery reduces the exposure of nontarget tissues to the drug, thereby reducing the risk of unwanted drug actions and adverse events. However, this obvious therapeutic need is far from easy to achieve in practice.

    Selective targeting of bioactive molecules remains a largely unfulfilled objective in clinical therapeutics. The pharmaceutical and biotechnology industries, including companies that specialize only in the design of ways to optimize drug delivery, are investing substantial sums in research and development to achieve this attractive, yet elusive, goal.

    These range from efforts to deliver materials to specific zones in the body to efforts aimed at particular cell types. A broad repertoire of targeting vehicles has been examined in this research effort. These include carrier particles containing encapsulated drugs (e.g., liposomes and other nanoparticles). Additional cognate molecular interaction systems can be designed to enhance the efficiency of drug uptake by cells once selective targeting has occurred and to direct the delivered drug or gene to the correct location inside the cell.

    Two different technical approaches underpin strategies for targeted drug delivery. In the first, the recognition properties are built into the drug molecule itself; in the second approach, the cognate properties required for recognition and binding to target cells are engineered into a drug carrier rather than the drug itself. Drugs are associated with the carrier, for example via passive encapsulation. Both approaches exploit cognate molecular interactions as a common design principle.

    If an author would prefer to have figures printed in colour in hard copies of the journal, a fee will be charged by the Publisher, unless the Open Access Article Publication Charge is paid, in which case colour print charges will be waived.

    The Table of Contents entry must include the article title, the authors' names (with the Corresponding Author indicated by an asterisk), no more than 80 words or 3 sentences of text summarising the key findings presented in the paper, and a figure that best represents the scope of the paper (see the section on Abstract writing for more guidance). The image supplied should fit within the dimensions of 50 mm x 60 mm and be fully legible at this size.

    Supporting information is information that is not essential to the article, but provides greater depth and background. It is hosted online and appears without editing or typesetting. It may include tables, figures, videos, datasets, etc.


    Note: if data, scripts, or other artefacts used to generate the analyses presented in the paper are available via a publicly available data repository, authors should include a reference to the location of the material within their paper. Manuscript Preparation Tips: Wiley has a range of resources for authors preparing manuscripts for submission available here. Editing, Translation, and Formatting Support: Wiley Editing Services can greatly improve the chances of a manuscript being accepted. Offering expert help in English language editing, translation, manuscript formatting, and figure preparation, Wiley Editing Services ensures that the manuscript is ready for submission.

    The acceptance criteria for all papers are the quality and originality of the research and its significance to journal readership. Except where otherwise stated, manuscripts are single-blind peer reviewed.

    Papers will only be sent for review if the Editor-in-Chief determines that they meet the appropriate quality and relevance requirements. After review, the Editor-in-Chief makes a decision guided by the reviewers' evaluations. Wiley's policy on the confidentiality of the review process is available here.

    Authors who wish to appeal a decision must submit their appeal in writing to the Editor-in-Chief. Appeals are unlikely to override earlier decisions unless new information becomes available to support the appeal. Authors who wish to comment on the editorial process should contact Wiley. Authors publishing in the journal are encouraged to make their data, scripts, and other artefacts used to generate the analyses presented in the paper available via a publicly available data repository; however, this is not mandatory.

    If the study includes original data, at least one author must confirm that he or she had full access to all the data in the study and takes responsibility for the integrity of the data and the accuracy of the data analysis. For manuscripts reporting medical studies that involve human participants, a statement identifying the ethics committee that approved the study and confirmation that the study conforms to recognized standards are required, for example: the Declaration of Helsinki; the US Federal Policy for the Protection of Human Subjects; or the European Medicines Agency Guidelines for Good Clinical Practice.

    Images and information from individual participants will only be published where the authors have obtained the individual's free prior informed consent. Authors do not need to provide a copy of the consent form to the publisher; however, in signing the author license to publish, authors are required to confirm that consent has been obtained.

    Wiley has a standard patient consent form available for use. A statement indicating that the protocol and procedures employed were ethically reviewed and approved, as well as the name of the body giving approval, must be included in the Methods section of the manuscript. Authors are encouraged to adhere to animal research reporting standards, for example the ARRIVE guidelines, which cover study design and statistical analysis, experimental procedures, experimental animals, and housing and husbandry.

    Authors should also state whether experiments were performed in accordance with relevant institutional and national guidelines for the care and use of laboratory animals. The journal requires that clinical trials be prospectively registered in a publicly accessible database, and clinical trial registration numbers should be included in all papers that report their results. Authors are asked to include the name of the trial register and the clinical trial registration number at the end of the Abstract. If the trial is not registered, or was registered retrospectively, the reasons for this should be explained.

    Accurate and complete reporting enables readers to fully appraise research, replicate it, and use it. Authors are encouraged to adhere to recognized research reporting standards. Upon its first use in the title, Abstract, and text, the common name of a species should be followed by the scientific name (genus, species, and authority) in parentheses. For well-known species, however, scientific names may be omitted from article titles. If no common name exists in English, only the scientific name should be used.

    Sequence variants should be described in the text and tables using both DNA and protein designations whenever appropriate. Sequence variant nomenclature must follow the current HGVS guidelines; see varnomen. The journal requires that all authors disclose any potential sources of Conflict of Interest. Any interest or relationship, financial or otherwise, that might be perceived as influencing an author's objectivity is considered a potential source of Conflict of Interest.

    These must be disclosed when directly relevant or directly related to the work that the authors describe in their manuscript. Potential sources of Conflict of Interest include, but are not limited to: patent or stock ownership, membership of a company board of directors, membership of an advisory board or committee for a company, and consultancy for or receipt of speaker's fees from a company.

    The existence of a Conflict of Interest does not preclude publication. If the authors have no Conflict of Interest to declare, they must also state this at submission. It is the responsibility of the Corresponding Author to review this policy with all authors and collectively to disclose with the submission ALL pertinent commercial and other relationships. Authors should list all funding sources in the Acknowledgments section.

    Authors are responsible for the accuracy of their funder designation. The list of authors should accurately illustrate who contributed to the work and how. All those listed as authors should qualify for authorship according to the journal's authorship criteria. Contributions from anyone who does not meet the criteria for authorship should be listed, with permission from the contributor, in an Acknowledgments section (for example, to recognize contributions from people who provided technical help, collation of data, writing assistance, or acquisition of funding, or from a department chairperson who provided general support).

    Prior to submitting the article, all authors should agree on the order in which their names will be listed in the manuscript. Additional Authorship Options: Joint first or senior authorship: in the case of joint first authorship, a footnote should be added to the author listing (e.g., "These authors contributed equally to this work."). If a paper is accepted for publication, the author identified as the formal Corresponding Author will receive an email prompting them to log in to Author Services, where, via the Wiley Author Licensing Service (WALS), they will be required to complete a copyright license agreement on behalf of all authors of the paper.

    General information regarding licensing and copyright is available here. Note that certain funders mandate a particular type of CC license be used; to check this please click here. Please click here for more detailed information about self-archiving definitions and policies. Open Access fees: Authors who choose to publish using OnlineOpen will be charged a fee. A list of Article Publication Charges for other Wiley journals is also available here. The author will be asked to sign a publication license at this point. Accepted Articles are published online a few days after final acceptance and appear in PDF format only.

    After the final version of the article is published (the article of record), the DOI remains valid and can still be used to cite and access the article.