Collective sensing by communicating cells

Speaker: Andrew Mugler
Organising Department: Bionanoscience Department
Subject: Collective sensing by communicating cells
Location: Building 58, TU Delft
Date: 17-12-2016

Author:  Nemo Andrea

Andrew Mugler obtained his PhD at Columbia University for his research on biological networks and models of gene expression. He has also worked at AMOLF, a well-known research institute in the Netherlands. In this seminar, he gave a broad talk on environmental sensing by both independent cells and collective cell groups.

The start of the seminar focused on introducing the topic and defining the research questions. The speaker covered chemotaxis in E. coli, in which the bacterium senses a concentration gradient by means of a run-and-tumble strategy. This was followed by a brief recap of diffusion and the introduction of a concept not yet covered in the nanobiology course, namely the Berg-Purcell limit. This limit describes how long a cell must count particles before it can be reasonably certain that an observed concentration is not just low-number noise, which might otherwise set a wasteful process in motion in the cell. The limit is based on the assumption that the membrane has no influence and that the particles diffuse freely.
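To get a feel for the scale of this limit, the sketch below evaluates the Berg-Purcell relative error, σ_c/c ≈ 1/√(D·a·c·T), for a cell of linear size a averaging over a time T. The parameter values are rough assumptions of my own (not numbers quoted in the talk), chosen only to be loosely E. coli-like.

```python
import numpy as np

def berg_purcell_error(D, a, c, T):
    """Relative concentration error sigma_c/c ~ 1/sqrt(D*a*c*T)
    for an idealised sensor of linear size a that averages its
    count over a time T (Berg & Purcell scaling)."""
    return 1.0 / np.sqrt(D * a * c * T)

# Illustrative (assumed) numbers, roughly E. coli-like:
D = 500.0   # ligand diffusion constant, um^2/s
a = 1.0     # cell size, um
c = 10.0    # concentration, molecules/um^3 (~17 nM)
for T in [0.1, 1.0, 10.0]:   # averaging times in seconds
    print(f"T = {T:5.1f} s -> relative error ~ {berg_purcell_error(D, a, c, T):.3f}")
```

Longer averaging times buy precision only as the square root of T, which is why the limit translates into a minimum measurement time for a given required accuracy.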

Naturally, such simplifications can be valuable in certain applications, but in order to test whether this limit is a good reflection of biological reality, a more realistic approach is required. Research by Bialek & Setayeshgar (PNAS, 2005) incorporated membrane interaction using the fluctuation-dissipation theorem, while Kaizu et al. (Biophysical Journal, 2014) used reaction-diffusion theory. Both models are closer to the biological system, and in both, the Berg-Purcell term sets the noise floor. In other words, while these models did contain additional noise terms, the minimal noise was still determined by the Berg-Purcell limit. The Berg-Purcell limit has been applied to E. coli chemotaxis, where its estimate of the minimum time a cell needs to make a reasonable assessment of the surrounding concentration matches the parameters observed in experiments. A similar predictive power of the limit was observed in experiments with amoebae.

The question Mugler is interested in is whether cells could surpass this limit by communicating. They created a model of cells that use short-range communication to exchange information and solved it using differential equations. These equations also included noise terms, and solutions were obtained by working in Fourier space. The solutions showed that whereas a single cell has a prefactor of 1/2 in front of the noise term, two cells have a factor of 3/8. This shows that cells can indeed reduce noise by communicating, but the gain in precision decreases as more cells are added. The reason is that a cell may count a particle that another cell has already counted, so that observation carries no truly new information. This was repeated for long-range communication between cells. In this scenario an optimisation problem arises: on the one hand the cells want to be far apart to minimise double counting of the same particle, but on the other hand they want to be close together to minimise noise in their internal communication. This long-range model may explain why cells in some tissues maintain a particular spacing, although it should be mentioned that this spacing may also be due to other factors.
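The intuition behind the 1/2 versus 3/8 prefactors can be illustrated with elementary statistics: averaging two independent measurements would halve the prefactor from 1/2 to 1/4, but double counting makes the measurements correlated and the reduction smaller. The toy calculation below is purely my own illustration; the correlation value ρ = 1/2 is chosen to reproduce the quoted 3/8, whereas in the actual model it follows from the reaction-diffusion calculation.

```python
import numpy as np

def two_cell_prefactor(rho, single_cell=0.5):
    """Noise prefactor for the average of two equally noisy measurements
    with correlation rho: prefactor = single_cell * (1 + rho) / 2."""
    return single_cell * (1.0 + rho) / 2.0

print(two_cell_prefactor(0.0))   # fully independent cells -> 0.25
print(two_cell_prefactor(0.5))   # partial double counting -> 0.375 = 3/8

# Quick Monte Carlo sanity check that Var(mean of two) = (1 + rho) / 2
rng = np.random.default_rng(0)
rho, n = 0.5, 200_000
shared = rng.standard_normal(n)
m1 = np.sqrt(rho) * shared + np.sqrt(1 - rho) * rng.standard_normal(n)
m2 = np.sqrt(rho) * shared + np.sqrt(1 - rho) * rng.standard_normal(n)
print(np.var((m1 + m2) / 2))     # ~ (1 + 0.5) / 2 = 0.75
```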

[1] Phase plot of results in 3D

Lastly, they also set out to test concentration gradient sensing experimentally. They built a custom setup in which organoids (clusters of multiple cells) and single cells were placed in a container with a tunable concentration gradient. By setting the right steepness of the gradient, it was observed that single cells could not sense the gradient, whereas the organoids could. This could be explained by the cells communicating and thereby surpassing the limit that the single cells could not. They modelled this communication as a local excitation – global inhibition (LEGI) system. This revised model suggests that beyond a certain number of cells, there is no further advantage in sensory precision. The speaker illustrated this with a rather elegant analogy: the children’s game Telephone. As the signal passes through more and more cells it gets increasingly distorted by noise, until eventually it no longer contains the original information.
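As a rough illustration of the LEGI idea (not the specific model from the talk), the sketch below gives each cell in a small group a local species that tracks its own concentration and a global species that relaxes toward the group average, standing in for exchange through gap junctions; the readout, local minus global, then responds to the gradient rather than to the absolute level. All parameter values are arbitrary.

```python
import numpy as np

def legi_readout(conc, steps=200, alpha=0.05):
    """Minimal local-excitation/global-inhibition (LEGI) cartoon.

    Each cell has a 'local' species tracking its own concentration and a
    'global' species that relaxes toward the average over the communicating
    group (a stand-in for exchange through gap junctions). The readout
    local - global responds to the gradient, not the mean level.
    """
    local = conc.copy()
    glob = conc.copy()
    for _ in range(steps):
        glob += alpha * (glob.mean() - glob)   # relax toward the group average
    return local - glob

conc = 1.0 + 0.05 * np.arange(10)   # shallow linear gradient over 10 cells
print(np.round(legi_readout(conc), 3))
# low-concentration end reads negative, high end positive -> the group senses the gradient
```

In this cartoon every cell exchanges with the whole group equally; in reality only neighbours communicate, which is exactly the telephone-game degradation that caps the useful group size.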

[2] Precision of gradient sensing with temporal integration

I really enjoyed how this seminar covered various models related to cell sensing, starting with the very naive theoretical models and moving on to models of increasing complexity. Seeing how even the rather naive theoretical models could predict some of the parameters observed in living systems really suggested that these models capture some aspects of the real systems. I think that studying the exchange of information between cells could teach us a lot about the rise of multicellularity and the specialisation of cell types.

Besides the points mentioned above, it was valuable to see how various topics covered in the nanobiology course are used in current-day research. We have extensively covered and even naively modelled bacterial chemotaxis, and we have studied Fourier transforms and their applications. Our courses also covered key concepts related to this research, such as diffusion, making this seminar a good example of the kind of research nanobiology students could consider pursuing in their research careers.

[1] Fancher, Mugler, arXiv:1603.04108
[2] Mugler, Levchenko, Nemenman, PNAS, 2016


 

Diversity of immune repertoires

Speaker: Aleksandra Walczak
Department: Bionanoscience Department
Subject: Diversity of immune repertoires        
Location: Building 58, TU Delft        
Date: 09-12-2016 

Author: Nemo Andrea

Aleksandra Walczak is a researcher at the Ecole Normale Supérieure (ENS) and the Centre National de la Recherche Scientifique (CNRS). She is known for applying statistical dynamics to cells in cases where traditional physics concepts such as forces and energy are not a suitable approach. In her seminar, she covered a wide range of concepts and experiments concerning the immune system, with a particular focus on T cells.

The adaptive immune system consists of B and T cells. These cells should be able to detect and react to foreign pathogens, which is mediated through a great diversity of receptors on these cells’ membranes. It is estimated that there are around 10^9 distinct receptors in the human body in healthy individuals. If all these different receptors were hardcoded in the DNA, the genome would have to be impossibly large. The way this incredible variety is obtained is through recombination of gene segments known as V, D and J. Various combinations of these segments can be joined together, creating a significantly more diverse set of receptors than if every receptor were encoded by its own gene. Additionally, random insertions and deletions at the junctions where the V, D and J segments are joined allow for even greater diversity in the repertoire of receptors. So the staggering diversity in receptors comes not from the size of the DNA but from the combined effect of combinatorics and randomness.
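A back-of-the-envelope calculation shows why combinatorics and junctional randomness beat genome size. The segment counts and insertion statistics below are illustrative assumptions (not numbers given in the talk), but the conclusion is the same: the number of possible receptors far exceeds what could be stored gene-by-gene in the genome.

```python
# Rough, illustrative numbers for one receptor chain (assumed, not from the talk)
n_V, n_D, n_J = 50, 2, 13        # gene segments available for recombination
combinatorial = n_V * n_D * n_J  # choices from segment combinations alone

# Junctional diversity: suppose up to ~6 random nucleotides are inserted
# at each of the two junctions (V-D and D-J); 4 bases per position.
max_insertions_per_junction = 6
junctional = sum(4 ** k for k in range(max_insertions_per_junction + 1)) ** 2

total = combinatorial * junctional
print(f"segment combinations      : {combinatorial:,}")
print(f"junctional variants       : {junctional:,}")
print(f"total (order of magnitude): {total:.1e}")
```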

[1] Flow chart of the analysis pipeline of the model

They generated a probabilistic, self-learning model of receptor generation that can calculate the probability that a certain receptor is generated. Furthermore, this model can also infer the specific recombination and insertion events that must have taken place (inference of the cause given an outcome). Additionally, they found that repertoires at the level of generation (the receptors generated in the immune system before any selection takes place) are very different between people. Their model predicted that you share as many generated receptors with family members as with complete strangers, with the exception of identical twins. The exception can be explained by the fact that identical twins share blood in the womb, thereby bringing their immune systems together to some extent.
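The structure of such a generative model can be sketched as a probability that factorises over the independent recombination choices and is summed over all scenarios that could have produced the observed receptor. The toy version below, with made-up probabilities and only a handful of scenarios, is my own illustration of that structure and not the group’s actual model.

```python
from itertools import product

# Made-up choice probabilities for a tiny toy repertoire
p_V = {"V1": 0.6, "V2": 0.4}
p_J = {"J1": 0.7, "J2": 0.3}
p_ins = {0: 0.5, 1: 0.3, 2: 0.2}   # number of inserted junction nucleotides

def generation_probability(compatible):
    """P(receptor) = sum over all generation scenarios that could have
    produced it of P(V) * P(J) * P(#insertions). `compatible` decides
    whether a scenario (V, J, n_ins) yields the observed receptor."""
    return sum(p_V[v] * p_J[j] * p_ins[n]
               for v, j, n in product(p_V, p_J, p_ins)
               if compatible(v, j, n))

# Example: a receptor that could only have come from V1 with 0 or 1 insertions
p = generation_probability(lambda v, j, n: v == "V1" and n <= 1)
print(p)   # 0.6 * (0.7 + 0.3) * (0.5 + 0.3) = 0.48
```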

Another central research question was what the optimal distribution of membrane receptors would be. One can imagine wanting to cover the widest possible range of antigens, while also producing more of the receptor types that bind the most common antigens. They modelled this by giving each receptor a certain cross-reactivity, meaning each receptor can also bind related antigens (albeit less strongly). Their model predicted that the optimal receptor repertoire is a peaked distribution whose coverage follows the antigen distribution. In practical terms this means that receptors binding common antigens are the most abundant, while the less common antigens are covered by cross-reactivity. Such a setup still provides adequate antigen coverage, as can be seen in the image below.

[2] Optimal receptor distribution (1D)
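The notion of coverage through cross-reactivity can be pictured as a convolution: each receptor protects not only against its best-matching antigen but, more weakly, against nearby ones in shape space. The 1D sketch below is a simplified illustration of that picture with arbitrary numbers, not the optimisation presented in the talk.

```python
import numpy as np

x = np.linspace(-5, 5, 201)                  # 1D "antigen shape space"
antigen_freq = np.exp(-x**2 / 2)             # common antigens near the centre
antigen_freq /= antigen_freq.sum()

# A peaked receptor repertoire concentrated where antigens are common
receptor_dist = np.exp(-x**2 / 0.5)
receptor_dist /= receptor_dist.sum()

# Cross-reactivity kernel: a receptor also binds antigens within ~1 unit, more weakly
kernel = np.exp(-x**2 / 2.0)

# Coverage = receptor distribution convolved with the cross-reactivity kernel
coverage = np.convolve(receptor_dist, kernel, mode="same")
print("minimum relative coverage over the common antigens:",
      round((coverage / coverage.max())[np.abs(x) < 2].min(), 2))
```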

Another experiment conducted by her research group related to mutation and receptor effectiveness. Here, they mutated receptors and tested them against a single antigen; this way they could study the effect of mutations on receptor evolution. To assess the difference in affinity of the mutated receptor for the antigen, they used a new experimental approach called Tite-Seq [1]. This approach determines the affinity by means of something analogous to a titration curve (the antigen concentration is gradually increased and fluorescence is measured), rather than a traditional measurement done at a single concentration. This method gives a more accurate assessment of affinity, as a measurement at a single concentration can easily be deceiving. These experiments showed that most mutations were detrimental, and only a small fraction of mutations actually improved affinity.
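The principle behind measuring affinity from a full titration rather than a single point can be sketched with a simple 1:1 equilibrium binding model: the bound fraction follows c/(c + Kd), and fitting that curve over a concentration series recovers Kd far more robustly than any single measurement, which cannot separate Kd from the maximum signal. The snippet below uses simulated data and generic fitting; it is a conceptual illustration, not the actual Tite-Seq analysis pipeline.

```python
import numpy as np
from scipy.optimize import curve_fit

def bound_fraction(c, kd, fmax):
    """Simple 1:1 equilibrium binding: signal = fmax * c / (c + Kd)."""
    return fmax * c / (c + kd)

rng = np.random.default_rng(1)
true_kd, true_fmax = 50e-9, 1000.0            # assumed 'true' values
conc = np.logspace(-10, -5, 12)               # titration series, M
signal = bound_fraction(conc, true_kd, true_fmax)
signal_noisy = signal * (1 + 0.1 * rng.standard_normal(conc.size))

(kd_fit, fmax_fit), _ = curve_fit(bound_fraction, conc, signal_noisy,
                                  p0=(1e-8, 500.0))
print(f"fitted Kd ~ {kd_fit:.2e} M (true {true_kd:.2e} M)")

# A single measurement at one concentration is compatible with many
# different (Kd, fmax) pairs, which is why titrating is more reliable.
```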

[3] Effects of mutations on receptor affinity and expression

I found this seminar to be very enlightening, as it covered such a wide range of experiments and disciplines. Seeing how statistical dynamics, modelling, cell biology and evolutionary dynamics were all covered in this short seminar, I will certainly read more of her research group’s work. It was also fantastic to learn more about the immune system, which is something we haven’t covered in any significant detail in the nanobiology course, showing that these seminars can be complementary to the course material. Added to that, I was intrigued by the probabilistic model they created, as it seems to be a form of Bayesian network, which is something I’m currently trying to code as a pastime project. Lastly, it surprised me that Tite-Seq is such a new technology, since the arguments she made in favour of this method seemed particularly convincing to me. Maybe there are drawbacks of this method that I don’t see with my limited practical experience, but it may be prudent for researchers working with binding affinity, even in the bionanoscience department, to consider using it.

[1] https://arxiv.org/pdf/1208.3925v1.pdf
[2] https://arxiv.org/pdf/1407.6888v1.pdf
[3] https://arxiv.org/abs/1601.02160

Optical Tweezers: gene regulation, studied one molecule at a time

Speaker:  Steven M. Block
Speaker Institute: Applied Physics and Biological Sciences, Stanford University
Organising Department: Kavli Institute
Subject:  Optical Tweezers: gene regulation, studied one molecule at a time    
Location: Faculty of Industrial Design, TU Delft
Date: 01-12-16

Author: Nemo Andrea

Professor Block is a well-known figure in the bionanoscience department. He is considered to be one of the founders of the single-molecule biophysics field. He currently holds the Ascherman Chair in the Departments of Applied Physics and Biology at Stanford University. One of his former students is Elio Abbondanzieri, who is currently a member of the bionanoscience department at TU Delft. He gave an intriguing seminar on optical tweezers and their applications, and discussed riboswitches.

In the introductory section, Professor Block discussed the importance of the single-molecule biophysics field and how bulk analysis can have its shortcomings. This was followed by a short section on one of the workhorses of this field: the optical trap. Having been introduced to the working principle and applications of optical traps in our course, it was interesting to hear from someone who pioneered their use.

The more contemporary discussion started with the versatility of RNA polymerase (RNAP), which coincidentally was also the focus of the preceding week’s BN seminar. This RNAP molecule, as he explained, is a very well documented and versatile protein complex which is regulated in part by RNA hairpins. To demonstrate both the applications of optical tweezers and the properties of DNA, he explained how they could determine how many DNA molecules were tethered in their optical trap by measuring force-distance curves and comparing them to the known mechanical properties of DNA. This is not a new experiment by any stretch of the imagination, but it is key in illustrating the use of the optical trap.
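The tether-counting trick can be illustrated with the standard worm-like chain (WLC) interpolation formula for double-stranded DNA: at a given relative extension, N identical tethers share the load, so the measured force is N times the single-molecule value. The sketch below uses the common Marko-Siggia interpolation with typical dsDNA parameters; it is a generic illustration, not the exact analysis used in the talk.

```python
import numpy as np

kBT = 4.11   # thermal energy at room temperature, pN*nm
P = 50.0     # dsDNA persistence length, nm (typical value)

def wlc_force(x_over_L):
    """Marko-Siggia interpolation for the worm-like chain:
    F = (kBT/P) * [1/(4(1 - x/L)^2) - 1/4 + x/L]."""
    r = np.asarray(x_over_L)
    return (kBT / P) * (1.0 / (4.0 * (1.0 - r) ** 2) - 0.25 + r)

rel_ext = np.array([0.5, 0.8, 0.9])    # relative extensions x/L
f_single = wlc_force(rel_ext)
for n_tethers in (1, 2):
    # N tethers at the same extension pull with N times the force
    print(n_tethers, "tether(s):", np.round(n_tethers * f_single, 2), "pN")
```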

The second part of his seminar revolved around the research done in part by Elio, namely how RNAP moves along the DNA, focusing in particular on testing the validity of the inchworm model. He explained the circumstances that made that experiment particularly challenging. Hearing about the complexity of environmental noise and about finding new ways to minimise noise in experiments was refreshing and educational, as we often only see elegant, low-noise data and never hear about the problem-solving steps taken in the actual experiment that ultimately led to those results.

The final and main focus of his seminar revolved around RNA hairpins and riboswitches. He explained how various theoretical formulas can be used to determine many properties of RNA hairpins. They studied riboswitches, with a particular focus on a riboswitch that has adenine as its substrate. The idea behind this type of riboswitch is that when the substrate binds to the RNA molecule, it affects the transcription of the gene itself. In this way, RNA molecules themselves can become part of complex gene regulation networks. This concept also ties in nicely with the RNA world hypothesis.

While this seems feasible in theory, it is not enough to just describe a model qualitatively in order for it to be accepted. It must be mentioned that this is still an area of active research and only three riboswitches have currently been studied in any detail. Steven Block and his lab and others set out to characterise the mechanical properties and kinetic behaviour of the adenine riboswitch, with the goal of being able to describe it more quantitatively.

They studied force-extension curves of this specific riboswitch (which ultimately is just RNA being transcribed) using optical tweezers. They used a framework (Dudko, 2006) [3] to find the free energy of activation and to characterise various other aspects of the conformations of this RNA molecule as it would be transcribed. This ultimately led to a five-state model, as described in a paper by Greenleaf et al. (2008). This group created an energy landscape that elegantly describes the behaviour of the system and the effect of adenine on the energy levels.
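The type of framework referred to (Dudko, Hummer & Szabo, 2006) expresses how the rupture or unfolding rate of a molecule under load depends on the applied force through the activation free energy ΔG‡ and the distance to the transition state x‡. The sketch below evaluates that force-dependent rate for illustrative parameter values; the numbers are my own assumptions, not values reported for the adenine riboswitch.

```python
import numpy as np

kBT = 4.11   # pN*nm at room temperature

def dudko_rate(F, k0, x_dag, dG_dag, nu=2/3):
    """Force-dependent escape rate in the Dudko-Hummer-Szabo (2006) form:
    k(F) = k0 * (1 - nu*F*x/dG)^(1/nu - 1)
              * exp{ (dG/kBT) * [1 - (1 - nu*F*x/dG)^(1/nu)] }
    with nu = 2/3 (linear-cubic), 1/2 (cusp), or 1 (Bell limit)."""
    u = 1.0 - nu * F * x_dag / dG_dag
    return k0 * u ** (1.0 / nu - 1.0) * np.exp((dG_dag / kBT) * (1.0 - u ** (1.0 / nu)))

# Illustrative (assumed) hairpin parameters
k0 = 0.01      # zero-force unfolding rate, 1/s
x_dag = 2.0    # distance to transition state, nm
dG_dag = 40.0  # activation free energy, pN*nm (~10 kBT)
for F in (5.0, 10.0, 14.0):   # forces in pN, kept below dG/(nu*x)
    print(f"F = {F:4.1f} pN -> k = {dudko_rate(F, k0, x_dag, dG_dag):.3g} 1/s")
```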

Left: the adenine riboswitch as depicted in [1], right: the TPP riboswitch as seen in [2]

They also ran experiments on a different riboswitch, one that is found in eukaryotes and functions with TPP as a substrate. Structural analysis of this molecule had led to the proposal of a model wherein two hairpins (blue and red in the image above) would come together in order for the riboswitch to be switched on. In order to verify this hypothesis, they combined the optical trap with FRET. They labelled each arm and observed the intensity and wavelengths of the emitted light. This allowed them to build a model which suggests that there are various conformations in which this riboswitch can reside. More can be read about this in reference [2].

It was intriguing to see the results and applications of optical tweezers, a topic we have covered in physics courses and which is heralded as one of the prime examples of the nanobiology field and of the new technologies created when the worlds of molecular biology and physics meet. Additionally, I had never explored riboswitches in any great detail, making this topic a completely new area for me. While they may be an uncommon way to regulate gene expression, the mechanisms through which they regulate the genome are unique and certainly require further study. Above all, the thing I enjoyed most was seeing how theory led to experiments, which then led to new theories and models.

[1]https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4692365/
[2]https://elifesciences.org/content/4/e12362
[3]http://journals.aps.org/prl/abstract/10.1103/PhysRevLett.96.108101

New Principles of transcription coupled DNA repair

Speaker:  Evgeny Nudler
Speaker Institute: Howard Hughes Medical Institute
Department: Bionanoscience Department
Subject: New Principles of transcription coupled DNA repair
Location: Building 58 (TU Delft)
Date: 24-11-2016

Author: Nemo Andrea  

Evgeny Nudler is a researcher working at the Howard Hughes Medical Institute. He has a PhD in biochemistry and does active research in the areas of Molecular Biology and Biochemistry. He gave a talk on Transcription Coupled Repair (TCR), a process by which the transcription of genes facilitates detection and removal of DNA damage.

The first part of the seminar covered the current and past understanding of Nucleotide Excision Repair (NER) and the factors that recruit the NER system towards sites of DNA damage. This made the lecture quite accessible, as it meant everyone had a good overview of the current understanding of these mechanisms in vivo. This brief section was, for me personally, quite enlightening. We had covered NER in various courses, but this short recap greatly expanded on the interactions of NER with RNAP and various other factors. It was also fascinating to see how components we had studied in our courses, such as magnesium ions in proteins, are critical to the functioning of these mechanisms.

The main focus of his research was how transcription, and in particular the stalling of RNAP at sites of DNA damage, results in recruitment of the NER complex. He introduced various systems related to RNAP and its stalling that were not covered in the nanobiology courses, such as GreA and GreB. He touched upon the well-known concept that the stalling of RNAP results in the recruitment of repair factors and subsequent alleviation of DNA damage, but also stressed how other elements moving along the DNA, such as the replisome, can cause double-stranded breaks when they collide with a stalled RNAP complex. He stressed that double-stranded breaks have strongly detrimental effects on the cell and that stalled RNAP complexes therefore also need a way to transition from a stalled back to an active state. The old model, proposed by Selby and Sancar (Science, 1993), was centered around a protein called Mfd and featured a system in which the RNAP is pushed forward over the site of DNA damage, after which the NER complex would be recruited. In other words, this model was based on forward dislocation of the stalled RNAP.

The new model which Nudler and his lab propose differs significantly from the old model, but still features RNAP stalling at sites of DNA damage as the event that recruits the NER complex. This new model is centrally based on two proteins, NusA and UvrD. These proteins would push the RNAP backwards rather than over the site of DNA damage, exposing the damaged site through this backward dislocation. He presented evidence for this from various experiments showing that if UvrD was removed, cells became significantly more sensitive to DNA damage. Additionally, if factors such as GreA/GreB and Mfd were increased in cells, the cells also became more sensitive to DNA damage. This is because, as he explained, these factors are counter-backtracking factors – they prevent the RNAP from dislocating backwards, a fact that has been confirmed by other studies.

They then set out to figure out exactly how these factors could facilitate the backtracking of RNAP. To answer this question, they employed what Nudler called a ‘power technology’ known as XLMS (cross-linking mass spectrometry) to determine where proteins such as UvrD can bind to RNAP. Analysis by this method revealed a probable binding site for UvrD on the ‘backside’ of the RNAP molecule (the side furthest from the direction of movement). They also determined that a single UvrD molecule is not sufficient to explain the backtracking of RNAP, suggesting that dimerisation may be required for this model to reflect the system. They found that in dimer form, a complex consisting of two UvrD molecules and NusA could indeed facilitate the backtracking of RNAP, but also determined that this dimer form is unlikely to persist for long due to the low concentration of UvrD in the cell.

They also realised that this system must be very flexible, as the activity of TCR varies greatly depending on the conditions of the cell (e.g. in the case of cytotoxic stress the system is very active), so they set out to find some of the regulating factors. They found that the bacterial alarmone ppGpp plays an important role in regulating this repair system, which certainly seems feasible given ppGpp’s function as an alarmone. After that, they uncovered the mechanism by which the ribosome trailing the RNAP is removed from the complex. This removal is critical, as this ribosome is closely coupled to the RNAP and could prevent backtracking. Lastly, they uncovered mechanisms by which the stalled complex can be reactivated, preventing the DNA damage that results when the replisome collides with a stalled complex. They found that hydrolysis of ppGpp and the introduction of counter-backtracking factors such as Mfd, GreA and GreB could restart the stalled RNAP.

[1] Left: Experimental Data and the new model; [2] Right: on the effect of ppGpp

It was very interesting to see various theoretical concepts being used to decipher important mechanisms inside the cell, with powerful concepts such as dimerisation, catalytic sites and NER all coming together in a new model that more accurately reflects reality. It was also interesting to see how new technologies such as XLMS greatly facilitate new discoveries and open up new possibilities.

If you wish to read his full article on (part of) this topic, visit:
[1]https://www.ncbi.nlm.nih.gov/pubmed/24402227
[2]https://www.ncbi.nlm.nih.gov/pubmed/27199428

From Stem Cells to Organs: Exploiting the organ niche for interspecies organogenesis

Speaker: Hiro Nakauchi
Speaker Institute: Stanford University
Organising Department: Developmental Biology
Subject:  From Stem Cells to Organs: Exploiting the organ niche for interspecies organogenesis
Location: Erasmus MC
Date:  24-11-2016

Author: Nemo Andrea          

Hiro Nakauchi is Professor of Genetics at Stanford University. He received his Master’s degree from Yokohama City University School of Medicine and obtained a PhD in immunology from the University of Tokyo Graduate School of Medicine, after which he did a postdoc at Stanford University. In this seminar, he gave an overview of his research into induced pluripotent stem cells (iPSC) and their applications in (interspecies) organogenesis.

His lecture started out by focusing on the societal relevance of advances in organogenesis techniques. It is well known that there is an ever-increasing shortage of organ donors. This shortage not only leaves many patients unable to get a transplant they so critically need, but also fuels a black market for illegal organ trafficking, causing even more people to suffer as a result of the shortage. He explained how organs grown from a patient’s own cells could resolve these problems. This shows that his research is not just theoretical, but also of great societal relevance.

The discovery of iPSC, as he described, was a major breakthrough in the fields of medicine and biology. It allowed for the generation of stem cells from (theoretically) any somatic cell in the body. While iPSC can be used in cell therapy to treat specific diseases, they cannot simply be grown into organs, which are highly complex three-dimensional structures. While in theory it should be possible to grow organs entirely from iPSC, this is incredibly difficult in practice and may never become a practically attainable option. Hiro and his research group had a solution to this problem that had to be verified experimentally.

The first experiments aimed to verify the viability of chimeras created by placing iPSC from a donor mouse into the blastocyst of a different mouse. This experiment was successful: the resulting mouse was a mosaic of the two cell types, with some cells from the original mouse and others from the donor mouse. They then took this one step further and one step closer to organogenesis. They made a knockout mouse blastocyst that could not grow a kidney, and once again added iPSC from a different mouse to the developing blastocyst. The resulting mouse was completely healthy, and upon closer inspection they found that the entire kidney was made of the donor mouse’s cells. Other knockout mice were also successfully rescued from their artificially induced deficiencies.

While these experiments were very promising, one could not conclude whether this would also work for iPSC across a xenogeneic barrier (between species). This was also investigated by Hiro’s research group: the same experiment as above, now carried out with mouse and rat cells, still produced viable chimeras and organs. They found that the size of the organism and its organs was determined by the species of origin of the blastocyst. They were also able to grow mouse islets in a rat, transplant those cells into a diabetic mouse, and thereby make that mouse recover from its diabetic phenotype.

They were also able to create pig chimeras using the same approach. One big barrier remained: the creation of suitable human iPSC. While human iPSC can be generated, the cells obtained resemble a later stage of differentiation than pig and mouse iPSC, which made chimera formation impossible. At the same time, he mentioned another problem that had to be resolved: the humanisation of the host animal. It would be ethically hard to justify growing organs in a half human, half pig chimera. He explained that this could be resolved if it were possible to have the iPSC contribute only to the specific organ desired. Further experiments found a way to allow chimera formation even from the (more differentiated) human iPSC. They were even able to achieve chimera formation from progenitor cells (cells that can only become one cell type or group of cell types), which meant they could now overcome both the problem of humanisation – since progenitor cells can only make a small subset of all body cells (e.g. only liver-related cells) – and the problem of human chimeras.

This brings us to the current state of the research. Research into human-sheep chimeras is currently underway. Additionally, one of the reasons Professor Nakauchi was giving this talk was that he was also here to speak with lawmakers and ethics specialists about how the law should reflect the current state and potential of this technology.

[1] Left: diagram of the main concept; [2] Right: graphical representation of xenogeneic organ generation

I found it very exciting to hear about this research, as it is truly remarkable how well this all works considering our relatively limited knowledge of exact organ formation and development. I think there is a lot of room for further research into the exact mechanisms governing this interspecies organogenesis. I intend to do part of my Honours Programme on the topic of iPSC, so this subject is of great interest to me. I hope this research will eventually result in something that can save the many patients waiting for a suitable donor.

[1] http://www.sciencemag.org/news/2015/10/major-grant-limbo-nih-revisits-ethics-animal-human-chimeras
[2] http://www.cell.com/cell/abstract/S0092-8674(10)00843-3

A re-appraisal of brain strategies for constructing the visual image

Speaker:  Semir Zeki
Department: Dept. of Neurobiology, Univ. College London, UK
Subject: A re-appraisal of brain strategies for constructing the visual image
Location: Erasmus MC
Date: 19-09-2016

Author: Nemo Andrea          

Professor Zeki has a proven track record in neurobiology, and his lecture was fairly accessible for someone with limited knowledge of the neurology/neurobiology field. Semir Zeki is professor of Neuroesthetics at University College London. He has received various prestigious awards, such as the Golden Brain Award (1985), the King Faisal International Prize in Biology (2004), the Erasmus Medal (Academia Europaea, 2008), and the Aristotle Gold Medal (2011). He has given over 60 lectures in his career and published three books. His experience was reflected in the seminar: it was clear that he was comfortable in the role of lecturer, which made his message that much more coherent. In this seminar he challenged current ideas – or what he deemed misconceptions – regarding image processing in the brain and touched upon the topic of parallelism.

As mentioned before, Professor Zeki believes that current beliefs regarding how image processing in the brain is handled are inaccurate. He stressed how, in light of contrary evidence, many people in the neurobiology field still hold on to outdated beliefs regarding image processing in the brain. He outlined why he considers the evidence convincing that his main point is indeed the correct interpretation given the current understanding of the brain.

The main point of contention was the question of whether all visual information is passed through the primary visual cortex (V1) before being passed on to other visual areas or processing regions in the brain (the hierarchical model). This would mean that the processing done in V1 is critical for the other visual processing areas in the brain. Professor Zeki drove home the point that this is simply not the case and that visual signals are fed into V1 and the other visual processing regions (V2, V3, V4, V5) in parallel, allowing for asynchronous and parallel processing of visual data in the brain. He stressed how this asynchronous nature is often overlooked and not properly reflected in computer-based models of human vision.

One of the main arguments against the hierarchical model discussed by the speaker was the fact that there are known cases in which a patient has tragically damaged the V1 area of the visual cortex, yet retains the ability to see fast motion, something attributed to the V5 area. If the hierarchical model held, the signals from the eye would pass through V1 first; but since V1 is damaged in these patients, the signals would never make it to V5 and the motion should not be observable by the patient. The fact that the patient is able to see the motion attributed to this region of the brain suggests the model is not accurate.

In place of the rejected model, he proposes a more complex model of interactions between the visual cortical regions. This model relies heavily on the idea that image processing is a highly parallelised process. While parallelisation of processes in the brain is not a controversial or even new concept, the speaker stressed how image processing must be a parallel and asynchronous process. His main arguments were that 1) in the new model for image processing, the same signals arrive at the different visual areas at different times, requiring some form of asynchrony in order for the visual cortex to produce a meaningful image out of all the areas, and 2) given the short response time and general efficiency of image analysis in the human brain, the visual cortex must exploit the vast ‘performance’ benefits provided by parallel processing.

This new insight into the functioning of human image processing is interesting, as this view of the system would allow for more dynamic behaviour depending on the type of image. It would be complex but worthwhile to quantify this further and to develop accurate simulations of the system. I believe nanobiologists could play a relevant role in this process, as they should have a good mastery of the mathematics required to model such a system and of the physics needed to consider its physical limitations. While the neuron itself is already a highly dynamic system, the brain is a beautiful example of how (roughly speaking) local interactions can lead to the highly dynamic and complex behaviour known as consciousness. These new insights into the brain can help us understand it better, and if we pair them with a more nanoscale understanding of neurons, it may be possible to either improve current computational image analysis algorithms or gain fundamental insight into the inner workings of the brain.

If you wish to read his full article on this topic, visit: http://journal.frontiersin.org/article/10.3389/fnint.2015.00021/full

Chronic myelomonocytic leukemia: Recent insights in pathogenesis

Speaker: Eric Solary
Department: Hematology
Subject: Chronic myelomonocytic leukemia: Recent insights in pathogenesis
Location: Erasmus MC
Date: 19-12-2016          

Eric Solary talked to us about his research in chronic myelomonocytic leukemia (CMML). He discussed new and improved ways to diagnose and treat the disease, as well as a new way to make a better prognosis for patients.

The first way to make diagnosis easier is to look at the percentages of the different peripheral blood monocytes a patient has. It turns out that in CMML patients more than 94% of the monocytes are classical (CD14+, CD16-), while this fraction is lower in healthy individuals. It was shown that demethylating agents could restore the normal distribution in CMML patients. Moreover, monocyte phenotype could detect MDS, a disease that can evolve into CMML.

Secondly, looking at the genetic mutations in patients may also help to diagnose them properly. Three genomic signatures for CMML have been found, of which TET2 was the most frequent mutation, occurring in 60% of the cases. Aside from these three, two TET3 mutations have been found that are associated with the TET2 mutation.

 

Figure 1: A chart showing the different mutations in CMML patients

 

Solary’s lab also tried to find a better prognostic method for CMML, but unfortunately they were not fully successful. There are currently many factors that can be used to make a prognosis, and Solary added a new one: the ASXL1 mutation, which is also associated with the disease. Unfortunately, this method does not outperform the prognostic methods already in use. However, it also did not do any worse, so it is still useful.

Lastly, new ways of treatment were discussed. Hypomethylating agents help restore the healthy phenotype in patients. However, it turned out that this does not rid the patients of the mutations that they have. Furthermore, hypomethylating agents increase miR-150, a microRNA which has a higher expression in non-classical monocytes. Unfortunately, not all patients responded to this treatment. Looking at the baseline DNA methylation of patients could help distinguish responders from non-responders beforehand, making for a more effective treatment.

I found this seminar quite difficult to follow, but I still found it interesting. I was inspired by the fact that there are still new ways of treatment being found for cancer, even very specific types of cancer like in this case. From listening to this seminar I realized it is important to not only look at new ways of treatment, but also for a better diagnosis and prognosis. I think I might someday like to do research in the field of medicine, so it was very interesting to hear what kind of discoveries are made there.

Chronic Myelomonocytic Leukemia: Recent Insights in Pathogenesis

Speaker: Eric Solary
Department: Department of Hematology
Subject: Chronic Myelomonocytic Leukemia: Recent Insights in Pathogenesis
Location: Erasmus Medical Center
Date: 19 December 2016
Author: Romano van Genderen

Professor Eric Solary, director of the prestigious Gustave Roussy Institute of Oncology, gave his talk on chronic myelomonocytic leukemia (CMML). This disease has traits of both myeloproliferative neoplasms and myelodysplastic syndromes: there is both excessive cell production and improper differentiation of cells into their final state, leaving too many undifferentiated cells. Currently, one first checks whether the patient has had monocytosis for over three months. Diagnosis is then made by exclusion: three common diseases are checked for, and if the patient has none of those three, the diagnosis is CMML.

Firstly, he showed that there are three distinct kinds of monocytes, namely classical, intermediate and non-classical monocytes. When a patient has CMML, more than 94% of the monocytes are of the classical subtype, while other, related diseases show an overproduction (dysplasia) of all three subtypes. These three subtypes can be distinguished from each other using FACS. Using this technology it has been shown that treating CMML patients with demethylating agents makes the original distribution return.

Fig 1. Left: normal healthy patient, showing a normal distribution. Right: diseased patient with mostly classical monocytes (green)

Secondly, he analyzed the genetic causes of CMML. He first noted that juvenile myelomonocytic leukemia (JMML) is often caused by over-activation of the RAS pathway, making the blood stem cells more likely to differentiate into common myeloid progenitor cells. In CMML, by contrast, many genes are related to the disease, but these genes have in common that they regulate epigenetic marks, splicing and signaling molecules.

Next he investigated the clonal architecture, that is, which cell clones are present where and when. He showed that the mutations that cause CMML give the cells a selective advantage, since the mutated stem cells quickly outcompete the non-mutated cells. It was also seen that mutations most often occur in a particular order: first a change in epigenetic regulation, then splicing and finally signaling. This is an example of the multi-step model of cancer. A mutation in the TET2 gene also makes the cell more likely to become a myeloblast, the cell right before the monocyte. There are also mutations that make the cell more sensitive to signals that promote differentiation.

Treatment of this disease is very hard and the prognosis is not always clear. One of the ways to treat the disease is with hypomethylating agents, which remove methyl groups from DNA, but only about 40% of patients respond to this treatment. This medicine does not remove the mutations or lower the risk of new mutations occurring, but it stops the monocytes from differentiating to the classical subtype. The difference between responders and non-responders lies in the baseline DNA methylation.

Next, he showed that epigenetic regulation plays a large role in this disease. They found that the gene TRIM33 is downregulated by an epigenetic mechanism. A large clue was recently found, namely a decrease of the microRNA miR-150. This miRNA is found in higher concentrations in non-classical monocytes. It does not directly cause the disease, but it shapes the monocyte distribution. The fact that hypomethylating drugs work can now also be explained: they demethylate a certain promoter, which causes the miR-150 concentration to rise and more cells to become non-classical monocytes.

Finally, he talked about the role that the mature cells play. The granulocytes produce IL-6, which drives further granulocyte and monocyte production, creating a positive feedback loop. The granulocytes also seem to produce a factor that stops monocytes from differentiating into macrophages.

I did find this talk very interesting, although a bit difficult, since I did not know the process of hematopoiesis by heart. It was quite confronting how little we know about the disease and how complicated it is. Most confronting of all was that if you do not belong to the 40% of patients who happen to be responders, you are completely out of options for treatment.

Collective Sensing by Communicating Cells

Speaker: Andrew Mugler
Department: Department of Physics and Astronomy, Purdue University
Subject: Collective Sensing by Communicating Cells
Location: Delft University of Technology

Date: 16 December 2016
Author: Romano van Genderen

This week’s talk at the department of Bionanoscience was given by Prof. Mugler of Purdue University. He started by showing us a video of a small cluster of epithelial cells, called an organoid, growing tentacle-shaped protrusions in the direction of a chemical gradient, showing that cells can sense gradients and use this information. He told us that he has developed a model capable of explaining this behavior. His talk was split into two sections, one dealing with the fundamental limits to sensing and the other with how cells surpass these limits by communicating.

He started with one of the simplest models, namely E. coli chemotaxis. These bacteria can move in two ways: either run forward or change direction by tumbling. The chance that a cell tumbles depends on the concentration gradient. But how accurately can a cell measure its local concentration? To answer this question he modeled the cell very simply, namely as a permeable cube with edge length a. The mean number of molecules in the cell (μ) is then equal to a^3*c. Because the molecule count follows a Poisson distribution, the variance equals the mean. If the measurement is averaged over a time T, the cell effectively makes T/(a^2/D) independent measurements, since the molecules in the volume refresh roughly every a^2/D (with D the diffusion constant). This leads to the Berg-Purcell limit: σ/μ ≈ (D*a*c*T)^-0.5. In real E. coli cells this relationship holds true; the cells are almost optimal measurement devices. Next he went on to a bigger cell, namely the amoeba. The amoeba has two detectors, one on the head and one on the tail. This cell also behaves in such a way as to make an optimal measurement with a high signal-to-noise ratio.

Next he went on to show models involving cells talking to each other through autocrine signaling. He first showed some models for a uniform concentration. This model had receptors, ligands and a messenger molecule. For this model he showed some scary-looking rate equations with noise terms, which he then solved using a method completely unknown to me. This led to an expression for the mean error. It had a prefactor of 1/2 for one cell; for two adjacent cells this became 3/8, not 1/4, because the cells sometimes measure the same molecule twice. When he made the cells no longer adjacent, the error became a function of the distance between them. The ideal distance according to this model was 8/3 times the cell radius: a balance between sampling many different points and losing information through communication. Comparing this to real-life cell packing, cells do in fact pack as predicted. This sparse packing is one of the two ways to minimize noise; cells can also be directly attached through gap junctions.

Fig 1. The model used for long-range signaling. Ligand in blue, bound ligand in purple and the communication molecule in green. The communication molecule concentration drops off as a power law.

Finally, he showed a setup for measuring a gradient. For this setup to work, the gradient must be relatively shallow. He then placed a single epidermal cell in the gradient; this cell has no preferred growing direction. But an organoid of communicating cells does have a bias, and when the gap junctions are blocked, that bias disappears. So there is juxtacrine signaling through the gap junctions. A new model was made in which the signaling is done through two molecules: a local signaling molecule that does not diffuse and a global one that does. The difference between the local and the global molecule is then a measure of the gradient. Using this model he showed that the optimal distance for a good signal is about four cells. But as in the game of telephone, the longer the signal is passed on from cell to cell, the more noise you get.

This was a very interesting talk, but some of the mathematics used was slightly too advanced for me to understand the talk completely. I did like seeing that cells are far closer to optimal than I usually consider them to be. They behave as “optimal noise reduction machines”. This is most likely something that is strongly selected for in evolution, because problems in signaling lead to swift death.

Recent insights in the pathophysiology of chronic myelomonocytic leukemia

Author: Aïsha Mientjes
aishamientjes@gmail.com
4460960

Speaker: Eric Solary       

Department: Hematology

Subject: Recent insights in the pathophysiology of chronic myelomonocytic leukemia. 

Location: Erasmus MC  

Date: 19-12-2016           

The presentation was about CMML (chronic myelomonocytic leukemia), the genetic background of this disease and its treatments. The speaker began by explaining a little about the different monocytes and how CMML is defined: you need a certain level of white blood cells and monocytosis for a period of three months. In patients who suffer from this disease, monocyte levels are irregular, with an increased fraction of classical monocytes. This means that the monocyte phenotype can be used to detect the disease or an early ‘form’ of it, called MDS. The speaker then went on to describe the genetic background of the disease. Several mutations occur in patients; 40 recurrently mutated genes were found in 49 patients. In CMML patients, the majority of the hematopoietic genes are mutated, and there is a linear accumulation of mutations. The most mutated cells have a growth advantage, and branching events can occur. CMML is a very severe disease, and the main method of treatment is currently the use of hypomethylating agents. Baseline DNA methylation distinguishes responders from non-responders; using this, you can predict CMML resistance to hypomethylating agents. After a conclusion, the presentation came to an end.

Actually, most of the presentation was new to me. I knew nothing about CMML before coming to this presentation. Moreover, this was my first HP seminar, so I was excited to find out how these would take place. I learned a lot about the disease (how it is diagnosed, what happens inside the body), but I also learned a lot about genetics and how several mutations can lead to a disease like this. Additionally, I found out a lot about different types of treatment and how the correct treatment for a patient is determined. The speaker had a very interesting flowchart describing how they determine which treatment is best suited. I really enjoyed the topic of the lecture; however, it was often hard to follow. As someone with practically no background in the field, a lot of terms were new to me and remained vague throughout the entirety of the seminar. The seminar was also given at quite a high speed. Despite this, I learned a lot. I really enjoyed finding out how different types of treatment are tested and implemented, given that I had very little knowledge about this.


This picture shows the survival rate of CMML patients, showing that CMML is truly a very severe disease.

In conclusion, this was a difficult seminar to follow but the topic was very interesting and educational. Even though medicine is maybe not the field I would really like to pursue, I would like to get to know more about this topic.