Aug 31, 2016

A brief overview of adult neurogenesis: the most fascinating brain discovery of the last 50 years

Probably one of the most exciting scientific findings of the last 50 years is the discovery that discrete brain regions generate new neurons throughout life, a concept known as adult neurogenesis. Despite the importance neurogenesis has acquired today, this emerging concept remained obscure until neurogenesis was found to occur in the brains of adult humans (Eriksson et al., 1998). In fact, until the 1990s the neuroscience field operated under a dogma established in the late 19th and early 20th centuries by Santiago Ramón y Cajal (1852-1934), the most famous Spanish scientist and the father of modern neuroscience. Despite his successful work (he was the author of the ‘neuron doctrine’), Cajal’s observations led to the belief that the nervous system was fixed and unchangeable. Joseph Altman was responsible for overturning this view, publishing a series of studies in the 1960s showing evidence for neurogenesis in adult rats and cats (for review see Gross, 2000). Last year, the neuroscientific community celebrated the 50th anniversary of the discovery of adult neurogenesis.

Adult neurogenesis involves cell proliferation, survival, migration, and differentiation. The process depends on cells with proliferative capacity, known as neural stem cells (NSCs). Neurogenesis, the generation of new neurons, was traditionally understood to be mainly an embryonic phenomenon, but research in the field has shown that new neural cells are generated in several areas of the adult brain. These areas include the dentate gyrus (DG) of the hippocampus, the subventricular zone (SVZ) around the lateral ventricles, and the olfactory bulb (OB), as well as regions where neurogenesis was described more recently, such as the cortex, amygdala, and hypothalamus. In these latter regions, however, the function of neurogenesis remains unclear (Gould, 2007; Ortega-Martinez, 2015) (Figure 1):

It was the use of a thymidine analogue, bromodeoxyuridine (BrdU), as a proliferation marker that allowed scientists to make this exciting discovery. BrdU can easily be detected with immunohistochemical methods and visualized with bright-field and fluorescence microscopy. In addition, specific antibodies against neuronal and glial markers were developed, providing easy methods to distinguish neurons from glia. Among the most relevant markers in the study of neurogenesis are Sox2/GFAP, widely used to label precursor or neural stem cells (type 1 cells); MCM2, for the analysis of amplifying progenitors (type 2 cells); DCX, to label neuroblasts/immature neurons (type 3 cells); and NeuN or Prox1, to identify mature neurons. With the help of these methods, adult neurogenesis has been shown to persist until senescence in numerous mammalian species, including humans (Eriksson et al., 1998). Finally, the neuronal behavior of these new cells and their integration into the network were confirmed by experiments testing long-term potentiation (LTP), synapse formation, and the expression of immediate early genes after stimulation of the hippocampal network (Kempermann et al., 2003; Song et al., 2002; van Praag et al., 1999).

The study of adult neurogenesis has been carried out at different levels. In vivo models aimed at removing neurogenesis have been used, based on X-ray irradiation (Saxe et al., 2006) or transgenic ablation (Jin et al., 2010). In addition, the process has been analyzed through in vitro culture of NSCs. More recently, specific techniques to label these nascent neurons have emerged. In this regard, the neurogenesis field has advanced greatly thanks to the use of retroviruses (Zhao et al., 2006), which specifically infect dividing cells and allow these nascent cells to be visualized along the route to neurogenesis (Figure 2):

The essential area of neurogenesis research is the dentate gyrus of the hippocampus, where the process is termed adult hippocampal neurogenesis (AHN). Since the general recognition and acceptance of AHN by the scientific community, many studies have investigated how neurogenesis is regulated. It is now known that AHN is a process tightly regulated by multiple factors, such as hormones, growth factors, and neurotransmitters (see Ortega-Martinez, 2015). AHN is extensively studied because of its important brain functions. It has been widely described as occurring in the subgranular zone (SGZ) and has been shown to contribute to learning and memory (Cameron et al., 2015; Fanselow et al., 2010), mainly in the dorsal hippocampus. AHN involves not only the formation of new neurons but also their integration into functional networks, making the process a functional one. From this perspective, AHN facilitates memory consolidation via the formation of networks (Deng et al., 2010; Weisz and Argibay, 2012; Ortega-Martinez, 2015). Furthermore, AHN provides the plasticity required in memory processes and enables ‘‘pattern separation’’, which is crucial for memory consolidation (Bruel-Jungerman et al., 2007; Sahay et al., 2011; Bekinschtein et al., 2011; Yassa and Reagh, 2013; Ortega-Martinez, 2015).
In contrast, the ventral hippocampus has been related to emotional functions (Fanselow et al., 2010), being involved in processes such as stress, depression, and anxiety. A great number of studies have demonstrated the relationship between hippocampal neurogenesis, stress, and depressive disorders (McEwen et al., 2001; Sapolsky, 2003). Consequently, AHN has been postulated as a key factor in these processes (Jun et al., 2012; Petrik et al., 2012) (Figure 3):

Basic and clinical studies demonstrate that depression is associated with a reduced size of certain brain regions that regulate mood and cognition, including the prefrontal cortex and the hippocampus, and with a decreased number of neuronal synapses in these areas (Duman et al., 2012). In the 1990s it was discovered that stress and stress hormones robustly decrease the generation of hippocampal neurons and increase cell death (Gould et al., 1992). It was also found that depletion of serotonin inhibits adult neurogenesis, and that chronic, but not acute, antidepressant treatment increases SGZ proliferation and neurogenesis (Brezun et al., 1999; Malberg et al., 2000).

Thus, the ‘neurogenesis hypothesis of depression’ was coined, according to which the remission of depressive symptoms is associated with an increase in AHN (Eisch et al., 2012). Later work has confirmed the association between hippocampal plasticity and stress (McEwen et al., 2001; Hirschfeld, 2000; Mahar et al., 2014). Even though the role of AHN in stress and anxiety is to some extent accepted, the molecular mechanisms involved in the regulation of these processes remain poorly understood.

In addition, there is extensive evidence for altered neurogenesis in neurodegenerative diseases such as Alzheimer’s disease (Mu et al., 2011). The memory loss of Alzheimer’s disease patients is related to disturbed neurogenesis in the hippocampus, and hippocampal neurogenesis therefore constitutes a potential target for therapy. Compensating for neuron loss in the hippocampus by controlled stimulation of endogenous neurogenesis might restore some of the lost hippocampal function (Mu et al., 2011).

Summary: adult hippocampal neurogenesis is widely considered essential to memory and emotional processes. The number of papers relating neurogenesis to learning has increased exponentially since the discovery of AHN 50 years ago (see Ortega-Martinez, 2015). Even though adult neurogenesis has recently been described in additional brain areas, the link between memory and neurogenesis primarily concerns the DG of the hippocampus and the OB. On the other hand, the adult hippocampus has been widely studied for its implications in numerous neurodegenerative and neuropsychiatric disorders, such as depression, anxiety, and Alzheimer’s disease. All of these brain disorders also share, among other physiological features, decreased AHN. Because hippocampus-dependent memory processes are altered in such diseases, understanding the underlying molecular mechanisms is important for restoring normal brain function. Indeed, some recent drug discovery efforts have focused on increasing AHN (Ortega-Martinez, 2015). Achieving this functional in vivo increase in hippocampal neurogenesis in the human brain is now a key target in neuroscience for the development of new treatments for brain disorders, with all the societal and economic implications that this entails.

Bekinschtein, P., Oomen, C.A., Saksida, L.M., and Bussey, T.J. (2011). Effects of environmental enrichment and voluntary exercise on neurogenesis, learning and memory and pattern separation: BDNF as a critical variable? Semin. Cell Dev. Biol. 22: 536–542. doi:10.1016/j.semcdb.2011.07.002

Brezun, J.M.; Daszuta, A. Depletion in serotonin decreases neurogenesis in the dentate gyrus and the subventricular zone of adult rats. Neuroscience 89, 999-1002 (1999). 

Bruel-Jungerman, E., Rampon, C., and Laroche, S. (2007). Adult hippocampal neurogenesis, synaptic plasticity and memory: facts and hypotheses. Rev. Neurosci. 18: 93–114. 

Zhao, C., Teng, E.M., Summers, R.G., Jr., Ming, G.L., and Gage, F.H. (2006). Distinct Morphological Stages of Dentate Granule Neuron Maturation in the Adult Mouse Hippocampus. J Neurosci, 26: 3-11. 

Cameron, H.A.; Glover, L.R. Adult neurogenesis: beyond learning and memory. Annual review of psychology 66, 53-81 (2015). 

Deng, W., Aimone, J.B., and Gage, F.H. (2010). New Neurons and New Memories: How Does Adult Hippocampal Neurogenesis Affect Learning and Memory? Nat Rev Neurosci, 11: 339-50. 

Duman, R.S.; Aghajanian, G.K. Synaptic dysfunction in depression: potential therapeutic targets. Science 338, 68-72 (2012). 

Eisch, A.J.; Petrik, D. Depression and hippocampal neurogenesis: a road to remission? Science 338, 72-75 (2012). 

Eriksson, P.S., Perfilieva, E., Bjork-Eriksson, T., Alborn, A.M., Nordborg, C., Peterson, D.A., and Gage, F.H (1998). Neurogenesis in the Adult Human Hippocampus. Nat Med, 4: 1313-7. 

Fanselow, M.S.; Dong, H.W. Are the dorsal and ventral hippocampus functionally distinct structures? Neuron 65, 7-19 (2010). 

Gould, E. (2007). How Widespread Is Adult Neurogenesis in Mammals? Nat Rev Neurosci, 8: 481-8. 

Gould, E., Cameron, H.A., Daniels, D.C., Woolley, C.S.; McEwen, B.S. Adrenal hormones suppress cell division in the adult rat dentate gyrus. The Journal of neuroscience : the official journal of the Society for Neuroscience 12, 3642-3650 (1992). 

Gross, C.G. (2000). Neurogenesis in the Adult Brain: Death of a Dogma. Nat Rev Neurosci, 1: 67-73. 

Hirschfeld, R.M. Antidepressants in long-term therapy: a review of tricyclic antidepressants and selective serotonin reuptake inhibitors. Acta psychiatrica Scandinavica. Supplementum 403, 35-38 (2000). 

Jun, H., Mohammed Qasim Hussaini, S., Rigby, M.J. & Jang, M.H. Functional role of adult hippocampal neurogenesis as a therapeutic strategy for mental disorders. Neural plasticity 2012, 854285 (2012). 

Jin, K., Wang, X., Xie, L., Mao, X.O., and Greenberg, D.A. (2010). Transgenic Ablation of Doublecortin-Expressing Cells Suppresses Adult Neurogenesis and Worsens Stroke Outcome in Mice. Proc Natl Acad Sci U S A, 107: 7993-8. 

Kempermann, G., and Jessberger, S. (2003). Adult-Born Hippocampal Neurons Mature into Activity-Dependent Responsiveness. Eur J Neurosci, 18: 2707-12. 

Kempermann, G., Gast, D., Kronenberg, G., Yamaguchi, M., and Gage, F.H. (2003). Early Determination and Long-Term Persistence of Adult-Generated New Neurons in the Hippocampus of Mice. Development, 130: 391-9. 

Saxe, M.D., Battaglia, F., Wang, J.W., Malleret, G., David, D.J., Monckton, J.E., Garcia, A.D., Sofroniew, M.V., Kandel, E.R., Santarelli, L., Hen, R., and Drew, M.R. (2006). Ablation of Hippocampal Neurogenesis Impairs Contextual Fear Conditioning and Synaptic Plasticity in the Dentate Gyrus. Proc Natl Acad Sci U S A, 103: 17501-6. 

Mahar, I., Bambico, F.R., Mechawar, N.; Nobrega, J.N. Stress, serotonin, and hippocampal neurogenesis in relation to depression and antidepressant effects. Neuroscience and biobehavioral reviews 38, 173-192 (2014). 

Malberg, J.E., Eisch, A.J., Nestler, E.J. & Duman, R.S. Chronic antidepressant treatment increases neurogenesis in adult rat hippocampus. The Journal of neuroscience : the official journal of the Society for Neuroscience 20, 9104-9110 (2000).

McEwen, B.S. & Magarinos, A.M. Stress and hippocampal plasticity: implications for the pathophysiology of affective disorders. Human psychopharmacology 16, S7-S19 (2001). 

Mu, Y. ; Gage, F.H. Adult hippocampal neurogenesis and its role in Alzheimer's disease. Molecular neurodegeneration 6, 85 (2011). 

Ortega-Martinez, S. (2015). A new perspective on the role of the CREB family of transcription factors in memory consolidation via adult hippocampal neurogenesis. Front. Mol. Neurosci. 8(46). DOI: 10.3389/fnmol.2015.00046 

Ortega-Martinez, S. (2015). Influences of prenatal and postnatal stress on adult hippocampal neurogenesis: The double neurogenic niche hypothesis. Behav. Brain Res. 281: 309–317. 

Petrik, D., Lagace, D.C. & Eisch, A.J. The neurogenesis hypothesis of affective and anxiety disorders: are we mistaking the scaffolding for the building? Neuropharmacology 62, 21-34 (2012). 

Sahay, A., Wilson, D.A., and Hen, R (2011). Pattern Separation: A Common Function for New Neurons in Hippocampus and Olfactory Bulb. Neuron, 70: 582-8.

Sahay, A., Scobie, K.N., Hill, A.S., O’Carroll, C.M., Kheirbek, M.A., Burghardt, N.S., et al. (2011). Increasing adult hippocampal neurogenesis is sufficient to improve pattern separation. Nature 472: 466–470.

Sapolsky, R. Taming stress. Scientific American 289, 86-95 (2003). 

Song, H., Stevens, C.F., and Gage, F.H (2002). Astroglia Induce Neurogenesis from Adult Neural Stem Cells. Nature, 417: 39-44. 

Van Praag, H., Kempermann, G., and Gage, F.H (1999). Running Increases Cell Proliferation and Neurogenesis in the Adult Mouse Dentate Gyrus. Nat Neurosci, 2: 266-70.

Weisz, V.I., and Argibay, P.F. (2012). Neurogenesis interferes with the retrieval of remote memories: forgetting in neurocomputational terms. Cognition 125(1): 13-25.

Yassa, M.A., and Reagh, Z.M. (2013). Competitive trace theory: a role for the hippocampus in contextual interference during retrieval. Front. Behav. Neurosci. 7:107.

Written by Sylvia Ortega-Martínez for The All Results Journals. The author thanks the Marie Curie fellowship programme and Åbo Akademi University for their support. Dr. Sylvia Ortega-Martínez received a postdoctoral fellowship from the FP7 Marie Curie ITN r’BIRTH.

Aug 29, 2016

Satellite cells do not mediate zebrafish extraocular muscle regeneration: when an initial negative result guides you to a more scientifically relevant finding

Degenerative and atrophic muscle diseases such as the muscular dystrophies, as well as extensive muscle damage or loss related to trauma, tumor resections, and other conditions, represent one of the most important public health concerns. It is estimated that the combined cost of the relatively uncommon disorders amyotrophic lateral sclerosis, Duchenne muscular dystrophy, and myotonic muscular dystrophy is $1.07 to $1.37 billion per year in the USA. Although global estimates are difficult, the total cost of degenerative and traumatic muscle conditions is likely to be substantially higher. At present, the only available treatments for these pathologies are palliative therapies that do not address the underlying cause of the disease.

In contrast to mammals, salamanders and fish have the capacity to regenerate complex tissues. The zebrafish has thus become a popular model for studying tissue regeneration, because it is highly regenerative and very amenable to both forward and reverse genetic approaches. To provide foundational, transferable knowledge for developing new pharmacological and therapeutic targets, we have described a new model of muscle regeneration using adult zebrafish extraocular muscles. We found that after amputation of 50% of the lateral rectus (the extraocular muscle chosen as a proof of principle), zebrafish regenerate an anatomically correct and functional muscle in 7 to 10 days (Fig. 1):

Mammalian muscles are considered post-mitotic, meaning that they have completely differentiated into mature muscle cells and will never enter the cell cycle again. Muscle growth and repair are therefore achieved by myogenic progenitor cells termed satellite cells. So, as a first step toward understanding the mechanisms that control the zebrafish muscle regenerative response, we looked for satellite cells, the usual suspects in these matters. We used Pax7, a transcription factor expressed only in muscle satellite cells, to identify them in regenerating muscles (Fig. 2). We were able to find Pax7-positive cells (satellite cells) in somatic muscle, both embryonic (Fig. 2A) and adult (Fig. 2D), and in embryonic extraocular muscles (Fig. 2B, C). Surprisingly, we could not detect satellite cells in the extraocular muscles of adult zebrafish, either uninjured (Fig. 2E, F) or regenerating after amputation (Fig. 2G, H):

Because extraocular muscle satellite cells do not express Pax7 in some species, we tried to identify these cells using a different method. Classic satellite cells localize between the basal lamina and the sarcolemma of muscle fibers, so we used electron microscopy to search for putative satellite cells based on this morphological criterion. Again, classic satellite cells could not be detected in adult zebrafish extraocular muscles, but we identified rare cells that might represent a type of ‘‘post-satellite cell’’: muscle progenitor cells fully encapsulated by the basal lamina, as described in newts.

Combining the facts that we could not detect satellite cells using their most typical marker, that even the so-called ‘‘post’’ satellite cells of newts express Pax7, and that newt limb regeneration also appears to be mostly independent of Pax7-positive cells, we concluded that the regeneration of adult extraocular muscles is not mediated by classic satellite cells.

This initial negative result forced us to develop a multidisciplinary research approach (including whole-mount fluorescence imaging, cell tracing, and molecular and histological techniques) to decipher how zebrafish extraocular muscles regenerate after injury. We found a mechanism that goes against the accepted dogma in the muscle biology field, which postulates that muscle fibers are post-mitotic and will never re-enter the cell cycle.

More details about these results and the subsequent investigations of zebrafish extraocular muscle regeneration can be found in Dr. Saera-Vila’s scientific papers.

Written by Dr. Alfonso Saera for The All Results Journals.

Jun 16, 2016

Could we change the way we design our clinical trials to minimize failures?

We all hear from time to time that clinical trials sometimes fail, but the question today is: why do they fail? And why should everyone know when something fails? Could we change the way we design our studies to minimize failures if we knew why they fail?

Regulations governing clinical trials share a common goal: to harmonize clinical trial procedures while ensuring the safety of trial volunteers, the ethics of the trials, and the reliability and usefulness of the data derived from them. They advocate increasing confidence in clinical trial results and data. For life science companies, clinical trial information is a highly sensitive matter, and sharing it requires a wide cultural shift, given that they operate in a very systematic and competitive environment with high risks and high profits.

Over the last few years, an increasingly noticeable number of life science organizations have adopted a more open and transparent policy with regard to the results of their clinical trials, regardless of whether those results are positive or negative.

In the recent past (March 2016), Celldex Therapeutics conducted a clinical trial of a brain tumor vaccine, Rintega, but the results were fruitless, crushing its stock. The organization announced the failure of the Rintega phase III study, known as ACT IV, after an interim analysis conducted by independent data monitors. The study enrolled patients with a certain type of glioblastoma multiforme (GBM), an aggressive brain tumor. Rintega was found to reduce the risk of death by just 1% compared to the control arm. Moreover, at the median, Rintega-treated patients fared worse, surviving 20.4 months compared to 21.1 months for the control arm. Celldex is discontinuing clinical development of Rintega, and the company's plans to seek approval for the product in the U.S. or Europe are also being shelved. Celldex has two other drugs being tested in different clinical trials.

Cancer vaccines, particularly those targeting GBM, have a dismal track record. The failure of Rintega follows negative study results for ImmunoCellular Therapeutics' GBM vaccine ICT-107. Northwest Biotherapeutics is developing a GBM vaccine known as DCVax, but a phase III study has been stalled since August due to an unexplained halt in patient enrollment. There have been other recent clinical trial failures too: Chimerix's stock also plunged after the phase III failure of an anti-infection drug (December 2015), and a trial of a drug from the company Bial left one volunteer brain dead and two others with permanent damage; there was no known antidote, as the drug had never been used in humans before this trial. Overall, roughly half of clinical trials fail.

There are three root causes of clinical trial failures:

A. Molecule issues
A molecule may fail because it doesn't have sufficient biological activity or doesn't have manageable toxicity. A well-designed clinical program can pick up secondary activity, and the program can be redirected. But if the molecule has no biological activity when it enters clinical development, little can be done to salvage it. Just as important are the predictability and pervasiveness of any unmanageable toxicity. Every molecule has some toxicity, so it is important to design the clinical trial around it.

B. Logistic issues
Half of published preclinical experiments may be irreproducible. In the hurry to get a clinical trial started, sponsors sometimes neglect to triple-check the randomization algorithm. Even with validation, mistakes in data analysis programming can occur. Many companies don't double-program (use two sets of independent programmers, or program two full sets of analyses independently), and in that case it is almost inevitable that there will be mistakes somewhere.
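The double-programming idea mentioned above can be sketched in a few lines: two analysts implement the same pre-specified analysis independently, and the outputs are reconciled before results are released. This is a hypothetical minimal illustration, not any sponsor's actual validation pipeline; the function names, the toy analysis (mean change from baseline), and the tolerance are all assumptions.

```python
# Hypothetical sketch of "double programming": two independent
# implementations of the same pre-specified analysis, compared at the end.

def mean_change_analyst_a(baseline, followup):
    # Analyst A: mean change from baseline, written with an explicit loop.
    total = 0.0
    for b, f in zip(baseline, followup):
        total += f - b
    return total / len(baseline)

def mean_change_analyst_b(baseline, followup):
    # Analyst B: same estimand, implemented independently.
    diffs = [f - b for b, f in zip(baseline, followup)]
    return sum(diffs) / len(diffs)

def reconcile(baseline, followup, tol=1e-9):
    # The two programs must agree within tolerance before results are released.
    a = mean_change_analyst_a(baseline, followup)
    b = mean_change_analyst_b(baseline, followup)
    if abs(a - b) > tol:
        raise ValueError(f"Double-programming mismatch: {a} vs {b}")
    return a

if __name__ == "__main__":
    baseline = [10.0, 12.5, 9.0, 11.0]
    followup = [9.0, 11.0, 8.5, 10.0]
    print(reconcile(baseline, followup))  # both implementations agree: -1.0
```

In real trials the two programs are far more complex (and written by different people against the statistical analysis plan), but the principle is the same: discrepancies are caught mechanically, not by hoping a single programmer made no mistakes.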

C. Study design issues
Apart from the two causes above, errors in clinical trial design are perhaps the most common reason trials fail. There are many variables in clinical trial design, but three are the most important when it comes to ensuring a successful clinical trial:
  • Selecting the right patients
    • Patient selection can go awry if it blindly follows conventional disease categories and definitions. There are many ways to define patient populations and diseases, and it is not always optimal to define the patient population by a previously recognized disease category, because disease categories are intellectual constructs.
  • Selecting the right dosing
    • Dosing covers all the characteristics of the dose: the amount of the intervention administered, the route of administration (e.g., oral, IV, SC), the dosing interval, and the rate and duration of administration.
    • A frequently seen error is using a dosing regimen that is too undifferentiated, such as a flat dose. When there is great heterogeneity in patient response or a narrow therapeutic window, the dose may have to be customized.
  • Selecting the right endpoint. A good endpoint is:
    • Clinically relevant
    • Closely and comprehensively reflective of the overall disease being treated
    • Rich in information
    • Responsive (sensitive, discriminating, and well distributed)
    • Reliable (precise, low variability, and reproducible), even across studies
    • Robust to dropouts and missing data
    • Without influence on treatment response or biological effect in and of itself
    • Practical (implementable at different sites, measurable in all patients, economical, and reasonably noninvasive)
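For teams that want to make the endpoint checklist above operational during protocol review, it can be encoded as a simple review aid. This is only an illustrative sketch: the criterion keys paraphrase the list above, while the function names and the idea of a yes/no scoring pass are assumptions, not an established tool.

```python
# Hypothetical review aid: score a candidate endpoint against the
# design criteria listed above. Each criterion is answered True/False
# by the study team during protocol review.
ENDPOINT_CRITERIA = [
    "clinically_relevant",
    "reflects_overall_disease",
    "rich_in_information",
    "responsive",
    "reliable_across_studies",
    "robust_to_missing_data",
    "no_biological_effect_itself",
    "practical",
]

def review_endpoint(answers):
    """Return (met, unmet) criteria lists for a candidate endpoint."""
    met = [c for c in ENDPOINT_CRITERIA if answers.get(c, False)]
    unmet = [c for c in ENDPOINT_CRITERIA if c not in met]
    return met, unmet

if __name__ == "__main__":
    answers = {c: True for c in ENDPOINT_CRITERIA}
    answers["robust_to_missing_data"] = False
    met, unmet = review_endpoint(answers)
    print(unmet)  # ['robust_to_missing_data']
```

The value of writing the checklist down this way is not the code itself but that every unmet criterion is surfaced explicitly before the protocol is finalized, rather than discovered after the trial fails.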

Often, a drug has biological activity but is tested for the wrong indication. Or it is tested in the right indication but in the wrong sub-population. In other instances, the wrong dose or dosing interval is selected. It should therefore be kept in mind that even the most expensive studies sometimes fail for these reasons, leaving long-lasting scientific and commercial losses.


Written by Shalini P. Burra for The All Results Journals. 

Oct 1, 2015

Negative results on cold fusion experiments

What is Cold Fusion?

The dictionary definition of cold fusion is nuclear fusion occurring at or close to room temperature. It is only a theory at this point in time; it has never actually been proven. Some people have claimed to achieve it, but those claims have all been debunked. We'll delve into that later; this first section is just to get your feet wet, an introduction to familiarize you with what this energy source is and the benefits we could reap from it if the theory were ever successfully proven. 

Imagine if you didn't need to worry about overheating cores at nuclear facilities. If cold fusion were ever actually attained, we could essentially have the power of a star at room temperature. The main reason it has never been put to practical use comes down to two factors: radiation leaks and long-term environmental effects. 

We know that radiation could quite possibly leak out, but we have no way of measuring how much or how it would affect us. It would certainly affect those who work closely with fusion reactors, and who's to say they would be willing to risk their health for an alternative that could potentially provide cheap energy to the world? Some of the proposed environmental effects would be longer spring seasons: plants would grow longer and become more prevalent, which would increase pollen and make life miserable for those who suffer from allergies. And since the reactors would be a cheap source of energy, the influx of toxic waste dumping could skyrocket. On the other hand, these devices have the potential to create better ways of living in third world countries. They could help provide drinking water, although a desalinator would need to be involved, as heavy metals, salts, and minerals could form in the water provided.


Negative Results

The hunt to prove this energy source outside of pure theory started in 1989. The masterminds behind the tests were Stanley Pons and Martin Fleischmann. They claimed to have proven the theory in a lone test tube with none other than a palladium electrode and heavy water. Physicists everywhere took to their labs and began attempting to replicate the experiment, but they all received negative results. Repeatedly, driven by the hype over the supposedly successful experiment and the excitement created by the possibility that it might actually be proven, physicists kept trying their hand, bending over backwards to produce another successful experiment. After so many experiments ended in failure, Pons and Fleischmann were called into question by the scientific community. Eventually their experiment, the only "successful" one, was deemed unsubstantiated: no one else was able to replicate what they had done, and so they were judged to have raised everyone's false hopes. 

Research into this scientific version of the holy grail still continues today; in about eight countries, the search is not over and done with. Dr. Alexander Borisovich Karabut was one of the scientists who didn't give up hope when Pons and Fleischmann's claims were declared false in the 1990s. Pons and Fleischmann's approach revolved around electrolysis, while Dr. Karabut thought the theory might be proven by going down the path of using plasma. His entire career became a contradiction: his education had ingrained in him the simple fact that nuclear fusion would not work at low energies, yet he was determined to prove that it was indeed possible. However, this tenacity would be his downfall. In some of his papers he confessed to having reported neutrons when they weren't there; it was simply a mistake of measurement. Still, his tedious work was not all for naught: he did come to some conclusions and made steps, however small, toward making cold fusion a reality. He had a breakthrough when he observed X-ray emission through a lead screen. By all laws of physics, the X-rays should not have passed through the lead screen, but they did. He had his work double-checked by a fellow scientist at a Russian university, who confirmed what Karabut saw but did not want to be associated with the experiment whatsoever. Dr. Karabut died of a stroke in March 2015, but his work has been continued by Professor Peter Hagelstein and Dr. Fran Tanzella. 

While the majority of experiments have yielded little to no results, we are still inching closer to solving the seemingly unattainable. Perhaps success will only come when our great-grandchildren have grandchildren, but it is a great hope that we succeed.



Resources on cold fusion: Experimental Evidence - Cold-Fusion (IE Magazine).
Resources on Dr. Alexander Borisovich Karabut: IE Magazine - Karabut; Continuing Karabut's experiments - Hagelstein and Tanzella.
Resources on Pons and Fleischmann: Berkley.EDU - Pons and Fleischmann; NY Times - Pons and Fleischmann.

Written by Dr. David Alcantara for The All Results Journals 

Sep 14, 2015

Increasing clinical trials reliability

The new Clinical Trial Regulation (EU Regulation No. 536/2014), adopted in April 2014, is under full evaluation until the end of May 2016. Its goal is to harmonize the procedures for clinical trials across the EU while ensuring the safety of clinical trial volunteers, the ethics of the trials, and the reliability and usefulness of the data derived from them. It advocates increasing confidence in clinical trial results and data.

Indeed, widespread access to the internet, mobile devices, and social media has opened up a broad desire among researchers, patients, and consumers alike for access to clinical trial information and results. For life science companies, clinical trial information is a highly sensitive matter, and sharing it requires a wide cultural shift, given that they operate in a very systematic and competitive environment with high risks and high profits.

Safety first 

The new rules aim to replace the EU Clinical Trials Directive 2001/20/EC, which the European Commission acknowledged contributed to an important decrease in the number of clinical trials conducted across Europe, and increased the accumulated administrative load and the time needed to launch new trials by 90 per cent. The new EU regulation applies throughout the EU member states, providing a level of consistency that has been lacking over the past decade. The goal is to create a friendly environment for conducting clinical trials while upholding the highest standards of patient safety across the EU. The current rules are simplified as follows:
  • Streamlining the process for evaluating and authorising meaningful clinical trials, avoiding duplication of data and removing the staggeringly slow procedures for launching new clinical studies.
  • Implementing a lighter regulatory regime for trials of medicines that are already authorised and which pose only minimal risk compared with normal clinical practice.
  • Simplifying reporting requirements, sparing researchers from submitting largely duplicated data on the study in various forms.
  • Recognising that a trial can be implemented by more than one organisation, by formally introducing the idea of co-sponsorship. Both researchers and patients should be given access to past trials and their results, avoiding endless repetition of the same studies, by requiring publication of clinical trial outcomes, whether good or bad, and developing transparency.

Going open

In the last few years, a growing number of life science organizations have adopted a more open and transparent policy towards the results of their clinical trials, whether those results are positive or negative. This is a far-reaching policy that demands good leadership and significant investment of financial and human resources, but it rewards companies with stronger analyses, a clearer understanding of treatment outcomes, and help in identifying other uses for their products.

Clinical trials are an essential part of the development of new medicines and, more broadly, of the advancement of medical care. The conventional drug development pathway comprises five key study phases: preclinical, clinical trial phases I, II and III, and the post-marketing phase IV. In the first three clinical trial phases, where study subjects are most at risk, disclosing clinical trial results reduces risk and prevents duplication of effort by other companies attempting to run the same kind of studies, freeing research resources to be allocated to other clinical trials. Over time, the published results will also help physicians to better match products to patients. Crucially, the new framework introduces procedures requiring sponsors to submit a publicly available summary of the results of a trial within one year of its end, regardless of the outcome (positive or negative). Moreover, for clinical trials that lead to a request for marketing authorisation, a full clinical study report will be submitted for publication in the EU database.

To prevent unfair commercial use of the data, the European Medicines Agency has defined a structured approach to the publication of clinical reports: any user can view reports on screen after a simple registration process, while downloadable clinical reports are reserved for identified users. Both modes of access are governed by strict terms of use. This measure will allow researchers, patients and the public to learn about both past trials and their outcomes. Users registered for general information purposes may read the results but may not download or save them, whereas users registered for academic and non-commercial research purposes may download clinical study reports and transcribe and copy the data.

A moral duty

A 2014 survey by the Faculty of Pharmaceutical Medicine of the Royal College of Physicians found, strikingly, that the majority of doctors fully support the principle of clinical trial data transparency and believe that such information should be freely and easily accessible. 81 per cent of the respondents surveyed agreed that drug manufacturers have a "moral duty" to make the complete available data accessible to trial participants, the general public and the research community, and 87 per cent agreed that rigorous procedures and scrutiny of data lead to better science and research.

There are several steps that pharma companies will have to consider when planning to make their clinical trial data publicly available:
  • Establishing a clinical trial disclosure governance committee.
  • Devising a company-wide transparency framework and policy.
  • Implementing a procedure for handling requests for clinical trial results.
  • Adopting redaction technology, especially to protect study subjects' personal information and commercially sensitive information.
  • Setting up a structured process for producing study summaries and clinical study reports.
  • Selecting an IT solution for the publication and submission of data to regulatory authorities.
  • Allocating resources to monitor the clinical trial transparency landscape, including regulatory requirements and industry trends.
The major benefit of the growing demand for clinical trial transparency and disclosure is that patients will ultimately gain access to better prescribed products and a higher quality of life. By disclosing clinical trial results, life science companies will use resources more rationally and avoid duplicating studies. Ultimately, the more accessible the information, the better the innovations it can enable.

Sep 9, 2015

Winter is coming: an overview on ice-nucleating bacteria

At temperatures between 0 °C and −40 °C, below its melting point, water can remain liquid without freezing spontaneously; in this state, water is said to be supercooled. For it to crystallize, nuclei must form around which crystal structures can grow. The origin of these nuclei, or catalysts of ice formation, is very diverse, ranging from mineral particles, including airborne dust or sand, to biological material such as organic aerosols and bacteria.

The phenomenon of bacterial ice nucleation in supercooled water can occur even at relatively high temperatures of −2 °C. It was observed for the first time in Pseudomonas syringae, but other bacteria such as Erwinia herbicola, Pseudomonas fluorescens, Pseudomonas viridiflava and Xanthomonas campestris also have this property. The property resides in ice-nucleation activity (INA) proteins, which are located in the bacterial membrane and have a high affinity for water molecules. These proteins have a particular structure with repetitive regions containing amino acid residues capable of forming hydrogen bonds at regular intervals. In this way, water molecules are aligned in a specific orientation that promotes the formation of ordered structures: ice crystals.

These bacteria usually live on the surface of many plants and, given their ubiquity, the phenomenon of ice nucleation is quite common in nature and has important effects. The most frost-sensitive plants are severely damaged and eventually die because the ice crystals that form inside them rupture cell membranes, allowing the bacteria to feed on the damaged plant cells. This causes significant economic losses in agriculture during cold spells.

To fight frost damage, genetic engineering has been used since the 1980s to remove the gene coding for the ice-nucleation protein. The resulting strains were expected to compete with and displace the natural populations of INA bacteria, thus lowering the critical temperature at which freeze damage begins to occur. The first field test was conducted in 1987 in the United States using a recombinant strain of P. syringae, which was sprayed on strawberry plants. It was the first Genetically Modified Organism (GMO) to be released into the environment, which created some controversy; still, it was approved by the Environmental Protection Agency. The results did not clearly prove that the released bacteria had prevented frost damage, partly because many of the study plants may have been destroyed by environmentalists. The strategy was tested further in the following years.

The second experiment was carried out on potato crops and, in this case, a positive result was observed. However, studies outside the laboratory have been hampered by numerous protests from environmental groups. Even so, preparations consisting of mixtures of naturally occurring, non-modified ice-nucleation-minus bacteria are now commercially available. They are used on certain crops to control frost damage, although better results could probably be achieved with recombinant bacteria.

Bacteria with INA proteins have other striking applications. For example, certain strains of P. syringae are commercially available as additives for the water used in large-scale production of artificial snow. They significantly increase the amount of snow produced per unit of time, and at temperatures even higher than 1 °C.

Current studies are also considering the use of these bacteria in the upper atmosphere to initiate ice formation at relatively high, near-melting temperatures in order to enhance precipitation. This approach is based on "bioprecipitation", a cycle that starts when bacteria form colonies on plants. The wind then carries them into the atmosphere, up to the clouds in the troposphere, where they begin to nucleate ice crystals. Water molecules are incorporated into the crystals, which grow larger and larger and eventually fall as rain or snow. With the rainfall, the bacteria return to the ground, colonize other plants and start the cycle again. Nevertheless, changing weather conditions would require a large number of cells and a very effective method of dispersing them in the air beneath the clouds.

On the other hand, freezing certain foods often alters their texture and organoleptic properties unless the process is carried out at temperatures around −40 °C, which is very costly. To improve the quality of frozen foods, ice-nucleating bacteria can be added. They promote the formation of small crystals that do not damage food cell membranes, so the texture of the final product is barely affected. At the same time, the freezing temperature is raised and the freezing time shortened, reducing the energy cost of the process.

As summarized here, the applications of ice-nucleating bacteria are numerous. However, many of them involve the creation of transgenic organisms that must be released into the field. Since it is not precisely known how these GMOs might affect the environment, many tests remain to be done to assess their safety. Once this issue is resolved, ice-nucleating bacteria may become a very useful tool, especially in agricultural biotechnology.

The Potential Impact of Ice-Minus Bacteria as a Frost Protectant in New York Tree Fruit Production. John Love and William Lesser. NJARE, April 1989.

Applications in biotechnology: Field Testing Genetically Engineered Plants and Microorganisms. C. Hagedorn and S.A. Hagedorn. Virginia Journal of Science, volume 42, number 1, 1991.

Bacterial ice nucleation: significance and molecular basis. Douglas Gurian-Sherman and Steven E. Lindow. The FASEB Journal, vol. 7, November 1993.

Ice crystallization by Pseudomonas syringae. N. Cochet, P. Widehem. Appl Microbiol Biotechnol, 54:153-161; 2000.

Ubiquity of biological ice nucleators in snowfall. Brent C. Christner, Cindy E. Morris, Christine M. Foreman, Rongman Cai, David C. Sands. Science, February 2008, Vol. 319.

Bacteria in the Leaf Ecosystem with Emphasis on Pseudomonas syringae—a Pathogen, Ice Nucleus, and Epiphyte. Susan S. Hirano and Christen D. Upper. Microbiology and molecular biology reviews. Sept. 2000, p. 624–653.

Written by Dr. Manuel García Moreno for The All Results Journals

Jun 26, 2015

Making social media research more reliable and reproducible

The social media revolution provides a rich data source for scientists interested in human behavior.

Juergen Pfeffer and Derek Ruths, in a scientific paper based on the shortcomings of studies about social media, wrote, “Powerful computational resources combined with the availability of massive social media data sets has given rise to a growing body of work that [measures] population structure and human behavior at unprecedented scale.”

This huge amount of data, providing detailed information about human attitudes, is valuable, but according to Pfeffer and Ruths, "mounting evidence suggests that many of the forecasts and analyses being produced misrepresent the real world." These analyses suffer from several problems, most of which are shared with other research areas and are not easy to overcome.

Population bias is one of the main problems with social media data. In social science research, large samples are necessary to reduce sampling bias: the risk that the sampled group is not chosen at random and is therefore not representative of the target population.

This is far more likely to happen with a group of 20 people than with a group of 20,000.
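The effect of sample size on the reliability of an estimate is easy to see in a small simulation. The sketch below is purely illustrative (the population, the 30% attitude rate, and the function names are hypothetical, not taken from Pfeffer and Ruths' paper): it repeatedly surveys 20 people versus 20,000 people from the same population and compares how much the estimates wobble.

```python
import random

random.seed(0)

# Hypothetical population of 100,000 people, 30% of whom hold some attitude.
TRUE_RATE = 0.30
population = [1] * 30_000 + [0] * 70_000

def sample_estimate(n):
    """Estimate the attitude rate from a random sample of size n."""
    return sum(random.sample(population, n)) / n

# Repeat each "survey" 200 times to see how much the estimate varies.
small = [sample_estimate(20) for _ in range(200)]
large = [sample_estimate(20_000) for _ in range(200)]

def spread(estimates):
    """Range between the highest and lowest estimate across repeats."""
    return max(estimates) - min(estimates)

print(f"n=20:     estimates of the 30% rate range over {spread(small):.2f}")
print(f"n=20,000: estimates of the 30% rate range over {spread(large):.3f}")
```

With only 20 respondents, individual surveys can report anywhere from roughly 0% to over 60% for a true rate of 30%, while the 20,000-person surveys all land within a couple of percentage points of the truth. Note this only addresses random sampling error; a large but non-random sample (e.g. the users of one platform) can still be badly biased.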

Because of that, sampling biases are rarely corrected for, and often are not even acknowledged at all, write Pfeffer and Ruths. Yet the groups of people who use each platform can have distinctive characteristics.

The lack of unrestricted access to data is an even more important problem. The study notes that social media companies use algorithms to sample and filter their results without giving detailed information about the process. Moreover, replication is restricted by privacy limitations.

Last but not least, the biggest problem in social media studies is publication bias. Many failed studies are never published, while studies with positive results almost always are. If negative results are not published, Pfeffer and Ruths argue, it is impossible to tell how much chance is involved in the positive findings. Getting researchers to publish negative results, and getting journals to accept them, would be a huge step toward solving this problem.
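The role of chance can be illustrated with a minimal, hypothetical simulation (not from the paper): 1,000 studies each compare two groups drawn from exactly the same distribution, so every "effect" they find is pure noise. If only the studies that cross the conventional significance threshold get published, the literature still fills up with confident-looking findings.

```python
import random
import statistics

random.seed(1)

N_STUDIES = 1000   # hypothetical studies, all testing a true NULL effect
N_PER_GROUP = 50
Z_CRIT = 1.96      # conventional two-sided threshold for p < 0.05

def run_null_study():
    """Compare two groups sampled from the SAME distribution and return
    a crude z-like statistic; any apparent effect is pure chance."""
    a = [random.gauss(0, 1) for _ in range(N_PER_GROUP)]
    b = [random.gauss(0, 1) for _ in range(N_PER_GROUP)]
    diff = statistics.mean(a) - statistics.mean(b)
    se = (statistics.stdev(a) ** 2 / N_PER_GROUP
          + statistics.stdev(b) ** 2 / N_PER_GROUP) ** 0.5
    return diff / se

results = [run_null_study() for _ in range(N_STUDIES)]
published = [z for z in results if abs(z) > Z_CRIT]  # only "significant" ones

print(f"{len(published)} of {N_STUDIES} null studies look 'significant'")
```

Around 5% of the null studies look "significant" by chance alone. A reader who sees only those published studies, and never the roughly 95% of negative results left in file drawers, has no way to tell that every published effect is noise, which is exactly why publishing negative results matters.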

All of these problems (publication bias, sampling problems, restricted access to data, and inaccurate self-reporting) affect many fields.

Ultimately, these problems are not impossible to fix. However, it is essential to resolve these methodological limits in order to identify reliable and consistent results. In the meantime, we should probably take analyses of social media data with a truckload of salt.

Science, 2014. DOI: 10.1126/science.1257756

Written by Dr. David Alcantara and Paula Ruíz for The All Results Journals.

Jun 12, 2015

Negative results of CETUXIMAB

The Journal of Clinical Oncology temporarily removed an editorial about a clinical study that analysed the efficacy of cetuximab (Erbitux; Merck in Europe, Bristol-Myers Squibb/Eli Lilly in the United States) in the treatment of a particular type of metastatic colorectal cancer (CRC).

The article had been available online since November 17th in the journal's Comments and Controversies section before it was withdrawn. It was reposted on November 28th without any changes, accompanied by a note from the editor explaining why no corrections were needed.

The article was written by three international academics: Bernard Nordlinger (France), Graeme J. Poston (United Kingdom) and Richard M. Goldberg (Ohio).

They stated that the results of the New EPOC (Eloxatin Peri-Operative Chemotherapy) study, which were published earlier this year (Lancet Oncology. 2014;15:601-611), should not change clinical practice.

New EPOC, a trial of treatment for metastatic CRC, was limited to operable colorectal liver metastases, and it demonstrated that adding cetuximab to surgery and chemotherapy shortened progression-free survival.

Although the results were negative, the editorial's authors argued that this is not a reason to discourage clinicians from using cetuximab in this situation.

In addition, the authors said that "three other published studies examining preoperative cetuximab in this setting" have all "found benefit with the addition of cetuximab."

"We believe that confirmatory clinical trials are needed before clinicians should act on these findings," the authors concluded.

Multiple Factual Errors; Repost Next Week 

The article contained multiple errors, according to an official from Cancer Research UK, which sponsored the New EPOC study.

"We have been advised that the commentary contains factual errors and speculation. The [New EPOC] authors will produce a rebuttal to JCO in due course, preferably when the commentary is revised to remove the errors of fact," said Kate Law, Cancer Research UK's director of clinical trials, in an email to Medscape Medical News.

A member of the American Society of Clinical Oncology, which publishes the Journal of Clinical Oncology, stated that the article would be reposted next week, including the reasons for removing it, as well as its corrections.

Cetuximab has been approved by the US Food and Drug Administration for use in patients with metastatic CRC without the KRAS mutation (wild-type) and whose tumors express the epidermal growth factor receptor (EGFR) protein. Moreover, the drug can be used in combination with chemotherapy.

In Europe, the drug is approved for EGFR-expressing RAS wild-type metastatic CRC.

The New EPOC study focused on patients with CRC metastases limited to the liver.

The use of cetuximab in this setting cannot be recommended. 

In the study, patients were randomly assigned to chemotherapy with or without cetuximab for 12 weeks, followed by surgery and then a further 12 weeks of chemotherapy with or without cetuximab.

The chemotherapy backbone was either oxaliplatin and fluorouracil or oxaliplatin and capecitabine.

Progression-free survival was significantly shorter in the chemotherapy-plus-cetuximab group than in the chemotherapy-alone group.

The researchers conclude that "the use of cetuximab in this setting cannot be recommended."

Written by Dr. David Alcantara and Paula Ruíz for The All Results Journals.