Molecular and functional heterogeneity in the human haematopoietic stem cell compartment

Speaker: Elisa Laurenti
Department: Cambridge Stem Cell Institute, University of Cambridge, Cambridge
Location: Erasmus MC Rotterdam
Date: June 12, 2017
Author: Teun Huijben

Elisa Laurenti has been interested in stem cells during her entire academic career. After a PhD and postdoc in this field, she now has her own lab in Cambridge, where she studies the heterogeneity of the human haematopoietic stem cell (HSC) compartment. In this one-hour talk she introduced us to the field and explained the research she has performed over the last years.

The main point of Elisa’s talk was that, although we tend to think of stem cells as a single uniform population, there is actually large heterogeneity among them. By quantifying the differences between individual HSCs, she hopes to define distinct subsets with different functions, characteristics and detectable markers within the broad HSC pool.

Using single-cell analysis, Elisa found two distinct subsets of haematopoietic stem cells: long-term HSCs (LT-HSCs) and short-term HSCs (ST-HSCs). They can be distinguished by just two surface markers: LT-HSCs express high levels of CD49f and low levels of CD90, whereas ST-HSCs show the opposite expression pattern. The functional difference between them is that LT-HSCs divide very rarely, while ST-HSCs divide more often.

Transcriptional analysis of LT-HSCs and ST-HSCs initially did not reveal any differences: both showed the same expression landscape. One explanation could be that both cell types are highly quiescent and therefore hardly transcriptionally active. The solution Elisa and her colleagues found was to activate the cells and then analyse their transcriptomes. The cells start out in quiescence, a ’sleeping’ state, and exit it upon activation. ST-HSCs exit quiescence earlier than LT-HSCs, which is another functional difference between them.

To activate the HSCs, they were transplanted into living mice or stimulated in in vitro culture. The activated cells were analysed by single-cell RNA-seq and microarrays. In this way, at least 34 genes were found to be differentially expressed between the two subsets. CDK6 showed the most distinct difference in expression and was the best single gene to indicate whether a cell is an ST-HSC or an LT-HSC. Strikingly, manipulating CDK6 levels changed the behaviour of the cells: over-expression of CDK6 resulted in faster activation, and a CDK6 inhibitor resulted in slower activation.

However, besides this direct effect of changing CDK6 expression, long-term effects were also observed. When CDK6 was over-expressed, LT-HSCs gained a competitive advantage over ST-HSCs in the long term; in other words, they outnumbered the ST-HSCs. This can be explained by the fact that CDK6 stimulates activation. ST-HSCs already activate quickly, so stimulating activation pushes essentially all of them into activation; they start differentiating and few ST-HSCs remain. LT-HSCs, on the other hand, activate more slowly, remain in the HSC pool, and eventually dominate the ST-HSCs in number.
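
As a side note on the expression comparison described above, the sketch below shows, on simulated values and a made-up three-gene panel, how one might flag genes that differ between two HSC subsets in single-cell data. It is only an illustration, not the authors' actual analysis pipeline.

```python
# Toy differential-expression check between two HSC subsets (simulated data).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
genes = ["CDK6", "GENE_B", "GENE_C"]   # hypothetical gene panel
# 50 simulated cells per subset; only "CDK6" is given a real difference here
lt_hsc = rng.lognormal(mean=[0.2, 1.0, 0.5], sigma=0.3, size=(50, 3))
st_hsc = rng.lognormal(mean=[1.5, 1.0, 0.5], sigma=0.3, size=(50, 3))

for i, gene in enumerate(genes):
    p = stats.mannwhitneyu(lt_hsc[:, i], st_hsc[:, i]).pvalue
    flag = "differentially expressed" if p < 0.05 / len(genes) else ""
    print(f"{gene:7s} p = {p:.1e} {flag}")
```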

In the remainder of the time, Elisa spoke about her current research on further subdividing haematopoietic stem cells by finding new markers that characterize distinct groups. Her talk emphasized once more the difficulties we face when looking at stem cells, or molecular biology in general: tissues are very heterogeneous and we do not yet know much about all their differences. Her talk was very clear, and she is obviously an important figure in this field.

EMT-controlled phenotype switching drives malignant progression

Speaker: Geert Berx
Department: JNI Oncology
Location: Erasmus MC Rotterdam
Date: 24-5-2017
Author: Katja Slangewal

Dr. Geert Berx was one of the people who helped discover that E-cadherin is aberrantly regulated in cancers. E-cadherin is, for instance, lost or partially lost in invasive lobular breast cancers. However, this type of breast cancer represents only about 15% of all breast cancers. Dr. Berx went on to study the transcriptional regulation of the protein. He found that mutations in specific E-boxes just upstream of the transcriptional start site can lead to the loss of E-cadherin expression in epithelial cells. This brought him to a protein called ZEB2, an important transcription factor.

ZEB2 is, together with ZEB1, part of a small but evolutionarily conserved family. Both ZEB1 and ZEB2 have multiple interaction partners, one of which is E-cadherin: ZEB2 represses E-cadherin expression. This has been shown by knocking out ZEB2, which resulted in upregulation (mis-expression) of E-cadherin, whereas conditional upregulation of ZEB2 leads to loss of E-cadherin expression. The same is true for another transcription factor called Snail. Dr. Berx and his group have focused on ZEB1, ZEB2 and Snail as transcription factors regulating E-cadherin expression. Loss of E-cadherin promotes cancer: the cells acquire the hallmarks of invasive cancer cells. In other words, the cells undergo EMT (epithelial-mesenchymal transition, figure 1), which transforms them into invasive cells.

Figure 1: Epithelial to Mesenchymal transition. The main processes and related proteins are indicated in the picture. http://www.ijournalhs.org/article.asp?issn=2349-5006;year=2015;volume=8;issue=2;spage=77;epage=84;aulast=Angadi

EMT is a process controlled by four major interconnected regulatory networks:

  • Post-translational control
  • Transcriptional control
  • Differential splicing
  • Non-coding regulation

Until recently it was mainly thought that cells are either in a stable epithelial state, an unstable transition state or a stable mesenchymal state. However, Dr. Berx has shown that there are eight metastable transition states.

EMT is not only associated with invasive cancers. Dr. Berx also showed a connection to the stemness of cells. This can be explained by looking at the reprogramming of iPSCs (induced pluripotent stem cells). When fibroblasts are reprogrammed towards iPSCs, they go through an epithelial intermediate. This state is reached by a decrease of, for instance, Snail. Snail has also been shown to regulate stem cell maintenance in the intestine. ZEB2, on the other hand, plays an important role in embryonic haematopoiesis. When ZEB2 is knocked out, stem cells accumulate and leukocytes, erythrocytes and platelets are reduced. This means that ZEB2 is not necessary for stem cell maintenance, but it is necessary for fate determination.

In recent months, Dr. Berx has mainly focused on ZEB2 in melanoma. Melanomas are a product of EMT, which makes them interesting with respect to ZEB2 expression. A ZEB2 knock-out mouse has no pigment, indicating that ZEB2 functions in melanocyte differentiation. A very interesting observation was made during this experiment: when ZEB2 levels decrease, ZEB1 levels automatically increase. An immunostaining revealed that ZEB1 mainly localizes in the stem cell compartments, whereas ZEB2 is mainly found in the differentiated states of melanocytes. Also, overexpression of ZEB1 leads to a dedifferentiated gene signature in primary melanocytes. A double knock-out of both ZEB1 and ZEB2 leads to growth arrest, indicating that at least one of them is necessary for a cell to function.

In short, an increase of ZEB1 leads to a highly invasive signature and almost no proliferation, whereas an increase of ZEB2 leads to a low-invasive signature and high proliferation. This observation led to the following model:

Figure 2: The oscillating model proposed by Dr. Berx and his group.

In the primary tumor, ZEB2 expression is high. When invasion starts and EMT takes place, ZEB1 takes over. Then, in the metastasis, ZEB2 expression increases again. So ZEB2 drives melanoma proliferation and differentiation. Enhancing ZEB2 expression does not lead to an increased melanoma frequency, but it does lead to increased metastasis formation. For this to happen, an oscillation between ZEB2 and ZEB1 expression is necessary. This is regulated at the protein level by TNF, which probably marks one of the ZEB proteins for degradation. However, how this regulation exactly takes place is not known.

After reading this you will probably have noticed: an increase of ZEB2 leads to an increase in metastasis. So, shouldn’t there simply be a drug that inhibits ZEB2 expression? Unfortunately for drug design, biology is not that simple. ZEB2 does not only have an important role in E-cadherin regulation; it is also required for natural killer cell maturation. Inhibiting ZEB2 would therefore result in an immature immune system, leading to many more problems. Luckily, there are many more options to investigate. Recently, it has been shown in a different cell line that increased expression of ZEB2 marks cells that are sensitive to a specific drug. This observation is useful to prevent giving unnecessary medicine to patients. Dr. Berx and his group will focus on this feature in melanomas in the near future.

My final seminar was an interesting one. Dr. Berx had a clear story. Just like my very first seminar, there were a lot of technical terms embedded in the talk. I am very happy to see the difference in how much of this talk I could understand compared to two years ago. It makes the talks a lot more interesting if you can understand them properly.

The ‘Nano’ in the huge Universe

Speaker: Prof. Sir Vincent Icke
Department: Theoretical astrophysics, University of Leiden
Location: TU Delft
Date: 17-5-2017
Author: Katja Slangewal

The universe is immensely huge. So it might be a bit surprising that describing Nano-scale processes is necessary to describe our universe. Until a few years ago the field of Cosmo-chemistry, which focuses on the Nano-scale processes happening in the universe, seemed unnecessary. However, nothing could be further from the truth.

The universe consists for roughly 99% of hydrogen and helium; all the other elements we know fill the final percent. Most of this material is concentrated in stars and solar systems. The rest of the universe is quite empty compared to these dense regions. However, small dust grains, formed when a star dies, are ‘floating’ around in space. The surface of these small dust grains contains attractive and repulsive parts, which can for instance bind a glycolaldehyde molecule. This event might seem very unlikely considering the density at which dust grains and glycolaldehyde molecules are found in space, but one should not forget that we are talking about immense timescales. The reactions taking place at the surface of dust grains happen at approximately 50 kelvin. Also, there is no source of energy anywhere near. So, when a particle binds and the excess energy is released in the form of radiation, the chances are small that the particle will leave the dust grain anytime soon. This means that several molecules can spontaneously form at dust grains. This forms the basis of Cosmo-chemistry.
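
To get a feeling for why a bound molecule stays put on a cold grain, here is a rough back-of-the-envelope sketch using a simple Arrhenius-type desorption rate; the attempt frequency and binding energy are illustrative assumptions of mine, not numbers from the talk.

```python
# Rough residence time of a molecule on a 50 K dust grain (assumed parameters).
import math

k_B = 8.617e-5   # Boltzmann constant [eV/K]
nu0 = 1e12       # assumed attempt frequency [1/s]
E_b = 0.5        # assumed binding energy to the grain surface [eV]
T = 50.0         # grain temperature [K]

rate = nu0 * math.exp(-E_b / (k_B * T))   # desorption events per second
residence_time = 1.0 / rate               # expected time before the molecule leaves [s]
# For comparison, the age of the universe is only ~4e17 s.
print(f"expected residence time ~ {residence_time:.1e} s")
```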

Near a star, conditions are very different from the ‘empty’ universe. A star provides a reserve of energy and a source of water. When a star is just forming, a huge disk of dust surrounds it, held together by gravity. Because of the excess of water, the dust grains will be covered in ice. As soon as the star grows and starts radiating, the ice evaporates from dust grains close enough to the star. A so-called ‘ice line’ is formed. This line marks the difference between the nearby rocky planets that will form and the more distant gas giants. This has happened in our solar system (compare Mercury, Venus, Earth and Mars to Jupiter, Saturn, Neptune and Uranus), but the same process also holds for exoplanets. As mentioned before, as soon as molecules bind to the surface of dust grains there will be an excess of energy. This energy is turned into radiation, which can be measured. This has allowed us to image exoplanets and see the difference between the icy and rocky ones.
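
As an illustration of where such an ice line ends up, the sketch below estimates the distance from a Sun-like star at which a grain's blackbody equilibrium temperature drops below an assumed water sublimation temperature of about 170 K. The numbers are my own illustrative assumptions, not values given in the talk.

```python
# Back-of-the-envelope ice-line estimate for a Sun-like star.
import math

L_sun = 3.828e26   # stellar luminosity [W]
sigma = 5.670e-8   # Stefan-Boltzmann constant [W m^-2 K^-4]
AU = 1.496e11      # astronomical unit [m]
T_ice = 170.0      # assumed sublimation temperature of water ice [K]

# Blackbody equilibrium temperature: T(d) = (L / (16 pi sigma d^2))^(1/4),
# so the ice line sits where T(d) = T_ice:
d_ice = math.sqrt(L_sun / (16 * math.pi * sigma * T_ice**4))
print(f"ice line at roughly {d_ice / AU:.1f} AU")   # ~2.7 AU, between Mars and Jupiter
```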

The question is: can there be life on exoplanets? To go deeper into this question, Prof. Icke told us more about the Miller experiment (figure 1). In this experiment a primeval soup was made, which was shown to produce complex molecules (like amino acids) within a couple of weeks. The primeval soup contained substances that were present on the early Earth. The same conditions can be found at black smokers on the bottom of our oceans. Prof. Icke stressed that, according to physics, life is unbelievably simple. It contains only six elements and some relatively simple molecules, and the more complex structures found in a cell are just chains of these simple molecules. All the elements needed are synthesized by stars, so no complex processes on a planet are required. And even of the simple molecules we don’t need much: take for instance the amino acids, of which we use only 20, while over 500 are found in nature. So exo-life is very probable: planets are everywhere and the recipe for life is quite simple.

Figure 1: Miller’s experiment. http://www.smithlifescience.com/millersexp.htm

After this very interesting talk on the importance of Nano-processes happening on dust grains, the formation of solar systems and the probability of finding exo-life, Prof. Icke continued with Nano-pills, which he calls a spin-off of astrophysics. He thinks that DNA-specific medication in the future will read out the genetic code and determine which medicine to release in a patient. He thinks this will prevent misuse of medicine, since you can no longer buy pills and use them for someone else.

In my opinion, the last part of the talk really brought down the rest of it. I really liked the main part: it contained a new view on where Nano-processes can be interesting. Also, I think the question of how life originated is very interesting, and it gave a fresh look to consider life as something easy and simple. However, I thought it was a real pity how Prof. Icke ended his talk. The part about DNA-specific medicine was presented as if we had no idea about biology. The talk was given at the Nanobiology symposium, so I think he should have known what kind of study Nanobiology is. He showed us pictures of DNA with an attitude as if we were seeing them for the first time. Also, I wonder whether Prof. Icke is aware of DNA imprinting, translational regulation and other mechanisms that will prevent a quick DNA readout from deciding on medication. This was really a disappointment after such a good start of the talk.

Big data analysis for precision medicine in dementia

Speaker: Wiro Niessen
Department: Biomedical image analysis
Location: TU Delft
Date: 17-5-2017
Author: Katja Slangewal

Will doctors in the near future be replaced by technology? Approximately five years ago scientists were still laughing at people working on neural networks. At the moment, however, neural networks can process more and more data in a short amount of time. Together with huge progress in imaging, this revolution in machine intelligence makes the first question a reasonable one. According to Dr. Wiro Niessen artificial intelligence is still complementary to humans, and he sees a bright future in which human and computer will work together.

Artificial intelligence keeps surprising people. It is already clear that many (repetitive) kinds of jobs will soon be taken over by machines. At the moment, however, machines are also capable of making music and paintings (figure 1), things which were long considered to be limited to humans. In the late 1980s computers were already capable of beating humans in a game of chess. Challenges were then designed to come up with algorithms for other games. For instance, Go, a very complex Chinese board game, became a subject of focus. Scientists predicted that artificial intelligence would beat humans around the year 2025; however, this already happened last year (2016). One interesting thing Niessen mentioned during his talk is the difference between human and computer intelligence. During the game of Go the computer program, based on neural networks, learned through experience. So it was very unexpected when the computer made a move that had never been played by humans before. The move turned out to be truly smart 5 or 6 turns later, which means the computer was capable of looking ahead in the game. So computers are getting better and better. This is not only useful in games, but also in the medical field.

Figure 1: Machines can learn styles from various painters and apply them to any picture. The small images are paintings by famous painters; the large images show the top-left picture remade in those different styles. https://motherboard.vice.com/en_us/article/artificial-intelligence-can-now-paint-like-arts-greatest-masters

Niessen and his colleagues are focusing on early prognosis of dementia. They showed that several markers of the disease are already elevated long before its onset. Finding markers that are visible before the start of a disease is difficult. This is why a population study was started in Rotterdam in 1990. During this study, people regularly undergo various tests, and over the years it becomes clear how their health develops. Risk factors (genetic markers, lifestyle, smoking, etc.) are linked to outcomes (dementia, stroke, etc.). Next, the brain scans of participants can be traced back to look for markers in morphology, volume, lesions and more. Niessen and his group have found patterns in the brain images linked to a disease, and these patterns are visible long before other symptoms start. So, they use the data on several markers to write algorithms that predict the chance of getting dementia or a stroke. One example determined during the Rotterdam study is the shrinking rate of the brain and its standard deviation. Programs and information like this will help radiologists to diagnose patients.
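
A minimal sketch of the kind of risk model described above, assuming made-up feature names and fully simulated data (this is not the Rotterdam Study pipeline):

```python
# Toy risk prediction from imaging markers and lifestyle risk factors.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 500
# Hypothetical features: brain shrinkage rate, lesion volume, age, smoking
X = np.column_stack([
    rng.normal(-0.5, 0.3, n),   # brain volume change (% per year)
    rng.normal(5.0, 2.0, n),    # white-matter lesion volume (mL)
    rng.normal(70, 8, n),       # age (years)
    rng.integers(0, 2, n),      # smoking (0/1)
])
# Simulated outcome: faster shrinkage and more lesions raise the risk
logit = -2.0 - 2.0 * X[:, 0] + 0.3 * X[:, 1] + 0.05 * (X[:, 2] - 70)
y = rng.random(n) < 1 / (1 + np.exp(-logit))

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("held-out accuracy:", model.score(X_test, y_test))
```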

At the moment Niessen and his colleagues are connecting SNPs to imaging phenotypes (like brain morphology or volume). This way they found certain SNPs that are correlated with the volume of the hippocampus. These SNPs are being put in a library accessible online. So, to conclude, computers are getting smarter and smarter. They will take over repetitive jobs and they will also be useful in more complex kinds of work. In the medical field Niessen expects computers to shift the focus from curing diseases to preventing diseases.
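
As an illustration of this kind of SNP-to-imaging-phenotype association, the sketch below runs a per-SNP linear regression of a simulated hippocampal volume on genotype dosage; all data and effect sizes are invented for the example and do not come from the study.

```python
# Toy association of SNP genotypes with hippocampal volume (simulated data).
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n_subjects, n_snps = 1000, 200
genotypes = rng.integers(0, 3, size=(n_subjects, n_snps))   # allele dosage 0/1/2 per SNP
# Simulated hippocampal volume (mL); only SNP 0 is given a true effect
volume = 4.0 + 0.05 * genotypes[:, 0] + rng.normal(0, 0.3, n_subjects)

pvals = np.array([
    stats.linregress(genotypes[:, j], volume).pvalue for j in range(n_snps)
])
bonferroni = 0.05 / n_snps                      # simple multiple-testing correction
print("SNPs passing correction:", np.flatnonzero(pvals < bonferroni))
```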

Dr. Wiro Niessen gave an interesting talk. It keeps surprising me how much computers are already capable of; I had no idea computers could already paint quite well. I also liked hearing about the possibilities of combining biological knowledge about the brain with writing algorithms to predict diseases. A lot of different research areas are combined here, which is of course interesting for nanobiologists.

 

Induced Pluripotent Stem Cells for treatment of cardiovascular and respiratory diseases

Speaker: Ulrich Martin
Department: Cell biology
Location: Erasmus MC Rotterdam
Date: 15-5-2017
Author: Katja Slangewal

Endogenous heart regeneration after a myocardial infarction is far from sufficient in mammals. In fact, less than 50% of the cardiomyocytes in a mammalian heart are replaced during the entire life span. These facts immediately show the importance of developing stem cell based regenerative treatments. This is the main area of research for Ulrich Martin from the Hannover Medical School, centre for Regenerative Medicine. During his talk, he spoke not only about the use of induced pluripotent stem cells (iPSCs) as a therapeutic for heart repair, but also about the use of patient-specific iPSCs in the context of cystic fibrosis (a severe monogenic disease).

It has become more and more clear that a general type of disease quite often differs a lot from patient to patient, emphasizing the need for patient-specific treatments. Martin and his team work on patient-specific iPSCs for disease modelling and the search for new drugs. The production of iPSCs (figure 1) is becoming more and more efficient. The reprogramming step has become routine and it is even possible to use big tanks for the production of large numbers of iPSCs. This is in contrast to a few years ago, when the production of iPSCs was tedious, time-consuming, labour-intensive and yielded only low numbers of cells. The automation and upscaling of stem cell production, however, were needed for high throughput and industrialization. For the first time in history, even big pharma is interested in stem cells.

Figure 1: iPSCs are formed by taking somatic cells (for instance adult fibroblasts) and adding reprogramming factors (KLF4, SOX2, c-Myc, Nanog, Oct-3/4 and LIN-28). After culturing the iPSCs and, if desired, adding mutations, the iPSCs can be differentiated into various tissues. https://www.rndsystems.com/resources/articles/differentiation-potential-induced-pluripotent-stem-cells

Now back to one of the applications Martin and his team focus on: treatment of cystic fibrosis. Cystic fibrosis is a severe and quite common disease: about 1 in 2,000 newborns is diagnosed with CF. The disease is almost always caused by a single mutation in the CFTR gene, F508del-CFTR, which leads to a shorter, dysfunctional protein. Martin and his team want to investigate this mutation in iPSCs.

The workflow of Martin and his team goes as follows: first they generate CF-specific iPSCs from the peripheral blood of CF patients. For this research, patients with varying severity of CF are used. Next, the interesting clones have to be picked (those in which CFTR is expressed but not functional). Since the known antibodies against CFTR are not reliable, the lab uses a CFTR construct labelled with tdTomato. To screen for functionality, they use eYFP labelling of the cell. eYFP activity is regulated by the iodide concentration, and iodide travels in and out of the cell through the CFTR channel (figure 2). So when a colony of iPSCs is fluorescent for both the red tdTomato and the yellow eYFP, Martin can use it for further research. Next, a protocol is needed to differentiate the iPSCs into functional lung cells. Such a protocol was proposed by Kadzik and Morrisey (2012); however, Martin wants to use an upscaled version. At the moment the upscaled procedure is not as efficient as the original variant, but it is still work in progress.

Figure 2: YFP is sensitive to anions like iodide. It gets activated when iodide enters the cell, which happens when the CFTR channel is functional. Adapted from Vijftigschild, L.A.W., Ent, C.K. van der, Beekman, J.M. (2013) A novel fluorescent sensor for measurement of CFTR function by flow cytometry. Cytometry Part A 83A: 578, fig. 1A
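
A toy illustration of the dual-reporter selection described above, with hypothetical clone names, fluorescence values and thresholds (not data from the lab):

```python
# Keep only clones that express the tdTomato-tagged CFTR construct AND show a
# YFP response to iodide (i.e. a functional readout), on made-up numbers.
import pandas as pd

clones = pd.DataFrame({
    "clone":        ["c01", "c02", "c03", "c04"],
    "tdtomato":     [850, 120, 900, 700],      # red fluorescence (a.u.): construct expressed
    "yfp_response": [0.45, 0.05, 0.02, 0.38],  # relative YFP change on iodide addition
})

selected = clones[(clones["tdtomato"] > 500) & (clones["yfp_response"] > 0.2)]
print(selected["clone"].tolist())   # clones taken forward for differentiation
```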

Besides CF, Martin’s research also focuses on iPSC-derived cardiomyocytes as a cellular therapeutic for heart repair. For therapies like these to exist, cardiomyocytes need to be produced safely, efficiently and at large scale. In this case large-scale production is a real must, since humans lose 1-2 billion cardiomyocytes after a myocardial infarction. Martin and his team are testing these large-scale productions. One important finding is the relationship between the state of differentiation and the density at which the cells were located. This stresses the importance of an evenly distributed cell density throughout the entire tank.

It is now possible to form small amounts of differentiated tissue (a few centimetres in diameter) that can beat like heart tissue. However, the tissue is far from the strength of adult heart tissue. On the bright side, it can deal with higher forces than the heart tissue of newborns. At the moment, stem cell derived cardiomyocyte tissue is being implanted in monkeys that have suffered a myocardial infarction. This means it is time to prepare for clinical application. The start of ‘iPSCs for Clinically Applicable heart Repair’, or iCARE, will help in the realization of clinical application.

I thought this seminar was one of the most interesting ones I have attended. It made me realise how incredibly fast the progress in stem cell studies goes. I liked seeing the connection between research and clinic. Martin used many clear examples, which made his talk easy to follow. I also liked the enthusiasm with which he talks about his work, which mainly came out at the end when he got some interesting questions. All in all a good seminar.

 

Steroid hormone receptor profiling in human cancers: from biomarkers to novel therapeutics

  • Speaker: Wilbert Zwart
  • Department: Cell biology
  • Subject: Steroid hormone receptor profiling in human cancers
  • Location: Erasmus MC Rotterdam
  • Date: 30-03-2017
  • Author: Katja Slangewal

Breast and prostate cancer are the most frequently diagnosed cancers in women and men, respectively (figure 1). Both are the second-deadliest cancers, behind lung cancer. This makes breast and prostate cancer an important research topic.

Figure 1: The most frequently diagnosed cancers in males and females in the UK in 2014. According to Zwart, this graph also holds for the rest of the world. http://www.cancerresearchuk.org/health-professional/cancer-statistics/incidence/common-cancers-compared#heading-Zero

Approximately 75% of breast cancers and 100% of prostate cancers are related to hormonal defects. The main hormonal players are estrogen and androgen. Upon hormone binding, their receptors become enriched in the nucleus and act as transcription factors, driving the transcription of genes related to cell growth and division (figure 2). The estrogen and androgen receptors have a similar structure.

The relation between hormonal function and cancer is not a new concept; it was already described in 1896. The treatments at that time were a little less subtle than nowadays. Cases have been described in which the ovaries of female patients were removed entirely. This led to a reduction of breast cancer, due to the removal of the estrogen source.

Figure 2: The estrogen receptor pathway. Its stimulation eventually leads to the transcription of multiple genes and the synthesis of proteins important for cell growth and division. https://www.jci.org/articles/view/27987/figure/1

Nowadays more subtle treatments are used. The main treatment consists of tamoxifen, aromatase inhibitors or enzalutamide injections. These substances block the estrogen receptor or the nuclear import of the androgen receptor. The treatment works, but unfortunately not perfectly. This is where the research of Wilbert Zwart and his colleagues comes into play. They are looking for predictive biomarkers, which are able to tell whether the hormonal treatment will have an effect or whether it will be better to start immediately with chemotherapy.

Zwart has performed ChIP-seq experiments to identify the DNA binding sites of the estrogen and androgen receptors. This led to the identification of many thousands of binding sites. Only a few of these binding sites were located at promoters, although most were found near transcription start sites. Zwart also found the importance of FoxA1: this protein is necessary for the estrogen/androgen receptor to be functional.

A point of consideration relates to the cell lines used for research. All cell lines are quite old, and 90% of the research is done on one single cell line: MCF-7. This of course does not reflect the complexity and diversity of cancers. So Zwart decided to perform his research on newly derived tissues from recent patients. He compared the DNA binding sites of the receptors between the new tissue and the MCF-7 cell line. Most of the binding sites did overlap. However, the requirement for FoxA1 seen in the MCF-7 cell line did not show as clearly in newly derived tissues. Zwart showed that FoxA1 is mainly present in the primary tumor, but is quite often absent in metastases. This is an important observation, because the hormonal treatments only work in the presence of FoxA1. So this makes FoxA1 an important biomarker to check how promising hormonal treatment is.
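
A simple sketch of the kind of overlap comparison described above, on made-up peak coordinates (not the actual ChIP-seq data):

```python
# Count how many binding sites from a cell line overlap sites found in patient tissue.
def overlaps(a, b):
    """True if two (chrom, start, end) intervals share at least one base."""
    return a[0] == b[0] and a[1] < b[2] and b[1] < a[2]

cell_line_peaks = [("chr1", 1000, 1400), ("chr1", 5000, 5300), ("chr2", 800, 1100)]
tissue_peaks    = [("chr1", 1200, 1600), ("chr2", 900, 1000), ("chr3", 400, 700)]

shared = [p for p in cell_line_peaks if any(overlaps(p, q) for q in tissue_peaks)]
print(f"{len(shared)} of {len(cell_line_peaks)} cell-line peaks overlap tissue peaks")
```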

Next, Zwart went back to the thousands of DNA binding sites he found. The goal is to identify the sites that matter. Individual binding sites can have a direct causal effect on proliferation, so it is important to check them individually by making knock-outs and seeing what happens. The selection of specific binding sites is based on several factors; one example is whether CRISPR-Cas can reach the binding site easily.

Several genes were identified near binding sites of the estrogen/androgen receptors. One of them is FEN1. FEN1 is upregulated in breast cancer and it influences survival after hormonal therapy. The gene is required for hormone-induced gene expression, so it forms another good biomarker to test whether hormonal treatment will have an effect. In the future, more biomarkers should be identified and also be used in the clinic.

I thought the seminar was quite interesting. I really notice the difference between now and the beginning of last year in how much of a talk I actually understand. I liked the talk mainly because of its content. The presentation could have been a bit better: I often had trouble listening, mainly because Wilbert often turned around and talked to the slides instead of to the audience, which decreased the volume a lot. Also, he had a lot of images per slide, which made them quite small and hard to read. So sometimes the talk was hard to follow, but the main message was interesting. I had, for instance, no idea that most breast cancer research is based on one specific cell line. So I learned some new things.

Light Sheet Microscopy

  • Speaker: Malte Wachsmuth
  • Department: Cell biology
  • Location: Erasmus MC
  • Date: 23-02-2017
  • Author: Katja Slangewal

Light-sheet microscopy is a microscopy technique optimized by a start-up company called Luxendo (a spin-off of EMBL). This technique is special because it enables live-cell imaging for an extended amount of time with a resolution close to confocal resolution. The main idea behind light-sheet microscopy is the uncoupling of the illumination and detection light paths. The illuminating beam illuminates a thin sheet of the sample (2-8 μm), which lies in the focal plane of the detection lens. This enables full-area detection with a single light sheet (figure 1, right). This creates an advantage over confocal microscopes, which use point-wise raster scanning and thereby illuminate a quite large part of the sample over time (figure 1, left). So, light-sheet microscopy reduces the amount of light needed to image your sample, which reduces photobleaching. Also, by imaging with a single light sheet it becomes possible to capture your sample in one single shot. This makes light-sheet imaging ideal for imaging dynamic processes in live specimens. One disadvantage or possible problem could be the intensity drop of the illuminating beam across the sample. The amount of intensity drop depends on your sample and labeling, so it might not be a problem in most cases. However, there is a way to reduce it: by using dual illumination (simultaneously from the front and back of the sample), the intensity drop can be reduced.

Figure 1: Light-sheet microscopy (right) versus confocal microscopy (left). Light-sheet microscopy drastically reduces the amount of illumination of your sample, thereby decreasing photobleaching. Source: http://luxendo.eu/
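
To illustrate the dual-illumination point above, here is a small sketch, assuming simple exponential attenuation with a made-up attenuation length, of how illuminating from both sides evens out the intensity across the sample:

```python
# Compare single-sided vs dual-sided light-sheet illumination across a sample.
import numpy as np

depth = np.linspace(0, 200, 201)   # position across a 200 um sample [um]
atten_length = 100.0               # assumed attenuation length [um]

single = np.exp(-depth / atten_length)                          # sheet entering from one side
dual = 0.5 * (np.exp(-depth / atten_length)
              + np.exp(-(depth[-1] - depth) / atten_length))    # both sides, averaged

print(f"single-sided: min/max = {single.min():.2f}/{single.max():.2f}")
print(f"dual-sided:   min/max = {dual.min():.2f}/{dual.max():.2f}")
```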

The MuVi-SPIM (multiview selective-plane illumination microscope) is one of the microscopes using light-sheet microscopy. The MuVi-SPIM is based on four objective lenses: two are used for illumination of the sample and two for detection of the fluorescent signal. The four lenses give four different views of your sample, which can be fused into one optimized image (figure 2). This way you can image larger living specimens without rotating them, thereby reducing the acquisition time.

Figure 2: The MuVi-SPIM: by using four objective lenses and adding the four separate images an optimal image can be formed, thereby reducing noise and increasing the signal. Source: http://luxendo.eu/

Besides the MuVi-SPIM there are more possible geometries that use the principles of light-sheet microscopy. One example is the InVi-SPIM (inverted-view selective-plane illumination microscope). This geometry uses only two objective lenses, one for illumination and one for detection (figure 3). The InVi-SPIM has been developed for long-term 3D imaging of living specimens. It has an inverted microscope configuration, which makes it easier to access the sample chamber. According to Wachsmuth, your specimen will be able to stay alive for approximately 2 days in the chamber. The InVi-SPIM is, for instance, very useful for stem cell differentiation assays, because of the lower amount of photobleaching compared to a confocal microscope at similar resolution. A small disadvantage is the inability to use dual illumination (illuminating your sample from the front and back simultaneously).

Figure 3: The InVi-SPIM enables the imaging of living specimen for longer periods of time. Source: http://luxendo.eu/invi-spim

Light-sheet microscopy makes it possible to image two different parts of a sample at the same time. One could for instance image both the heart of a zebrafish and its blood flow in the tail. This is not possible with a confocal microscope, since the confocal microscope needs time to image by point-wise raster scanning. This together with the lower amount of photobleaching and higher imaging speed makes light-sheet microscopy a promising microscopy technique.

This seminar was different from the other seminars I have visited so far, mainly because Malte Wachsmuth was representing a company instead of a research group. This gave the talk a bit of the appearance of a sales pitch. However, it was still very interesting. I hadn’t heard about light-sheet microscopy before and I think it sounds like an interesting technique. Malte was a very enthusiastic speaker and he gave a very clear talk.

 

Evolution and Assembly of Eukaryotic Chromatin

Speaker: Francesca Mattiroli
Department: Luger Lab, University of Colorado Boulder
Subject: Evolution and assembly of eukaryotic chromatin
Location: TU Delft, Bionanoscience department
Date: 10-02-2017
Author: Mirte Golverdingen

 

Francesca Mattiroli’s research focuses on the DNA packaging units called nucleosomes. These structures organize DNA in the eukaryotic cell nucleus. Nucleosomes are formed by an octameric complex of folded histone dimers, the H3-H4 and H2A-H2B dimers. In mammals, the histones have tails that carry many post-translational modifications and stabilize the nucleosome. Nucleosomes constantly need to assemble and disassemble on genomic DNA. Histone modifications and variants are dynamic and can promote or inhibit certain interactions. Nucleosome dynamics and composition have a direct effect on transcription, replication and repair.

Mattiroli’s first main interest is the evolutionary origin of the nucleosome. Nucleosomes are very well conserved across species. Mattiroli focuses on the structural conservation of the histone dimers in Archaea. These, however, lack the tails that carry post-translational modifications. So how do these species organize their genome?

Archaeal histone binding to DNA is similar to eukaryotic histone binding. Archaeal histones, however, do not form octamers; they can instead form a much longer structure, called a nucleosomal ramp. This structure also forms in vivo; the longest ramp they found covered 90 bp. So they found a new way of arranging histone-DNA complexes.

Nucleosomes are formed on the DNA in two steps: first, two H3-H4 dimers form a tetrasome, then two H2A-H2B dimers attach to this tetrasome, forming a nucleosome. Histone chaperones shield the charges of the histones and facilitate their deposition on DNA. However, not much is known about how the chaperones actually contribute to this deposition step. Chromatin Assembly Factor 1, CAF-1, is Mattiroli’s main interest. CAF-1 mediates this histone deposition step and is essential in multicellular organisms. Mattiroli tried to understand how CAF-1 contributes to the deposition step.

Mattiroli’s first step was to investigate how CAF-1 binds the H3-H4 dimer. She used hydrogen-deuterium exchange mass spectrometry (HX-MS), which allows her to measure changes in mass and identify the regions with the largest changes in deuterium uptake; such a region could then be the binding site of CAF-1 on H3-H4. When CAF-1 binds the dimer, a stabilization of the dimer is observed. This result suggests the following hypothesis: only when an H3-H4 dimer is bound to CAF-1 can it go on to form a tetrasome.

A next step for Mattiroli was to test whether CAF-1 can form nucleosomes in vitro, in the absence of other proteins. To test this, Mattiroli mixed CAF-1, histones and DNA, treated the mixture with micrococcal nuclease to digest unprotected DNA, and then purified and quantified the lengths of DNA covered by histones. The result showed that CAF-1 is able to assemble tetrasomes, thereby enabling nucleosome formation in vitro.

So how is the H3-H4 tetrasome formed on the DNA? Mattiroli used DNA of increasing lengths to trap intermediates in the process. She showed that formation of the H3-H4 dimer activates its DNA binding. The key intermediate that mediates the DNA binding turned out to contain two CAF-1 units. This was the most interesting result so far, because it had never been shown before that two independent CAF-1 complexes are involved in depositing H3-H4 on DNA.

This interesting and clear seminar showed once again how complex the system of DNA and all the DNA-interacting molecules is. Mattiroli’s research gives a good foundation for further research on nucleosomes and their interaction with DNA, bringing us closer to fully understanding the biological system of DNA.

Figure 1. Canonical and variant nucleosomes

(A) Elements of the histone fold and structures of Xenopus laevis H2A–H2B, H3–H4 and (H3–H4)2 (PDB ID: 1KX5). (B) Structure of the canonical Xenopus laevis nucleosome (PDB ID: 1KX5). Other nucleosome structures, such as the human nucleosome, are structurally similar. (C) Structure of the CenH3CENP‐A‐containing nucleosome (PDB ID: 3AN2). (D) Zoomed view of the αN helix of CenH3CENP‐A (left) and H3 (right) involved in stabilizing the DNA ends. Histone H3 is blue, CenH3CENP‐A is cyan, H4 is green, H2A is yellow, H2B is red, and DNA is white.

Adapted From: Mattiroli, F., D’Arcy, S., & Luger, K. (2015). The right place at the right time: chaperoning core histone variants. EMBO reports, e201540840.

Crosstalk of Immune and Nervous Systems

Speaker: Frauke Zipp
Department: Neurology, Mainz (Germany)
Location: Erasmus MC Rotterdam
Date: February 6, 2017
Author: Teun Huijben

Frauke Zipp is the head of the neuro-immunology research group in the Department of Neurology of the Medical University of Mainz, Germany. The main focus of her research is gaining better understanding of the complex interplay between the immune system and the central nervous system.

Many diseases are caused by problems in the interaction between the immune system and the central nervous system (CNS); examples are multiple sclerosis (MS), acute disseminated encephalomyelitis (ADEM), meningitis, stroke, migraine, and Alzheimer’s, Parkinson’s and Huntington’s disease. Of all these diseases, MS is the best model disease for this interaction, since it is a chronic inflammation of the central nervous system.

The hypothesized cause of MS is an auto-immune response in which T-cells attack the central nervous system during young adolescence. These CNS-specific T-cells are activated in some way and cross the blood-brain barrier. Once in the brain, they are re-stimulated by encountering CNS antigens, and their activation induces a cascade of reactions ultimately resulting in demyelination of axons (figure 1). Myelin is a fatty substance wrapped around axons to increase their electrical conductivity and provide mechanical protection. When this myelin is removed, the axons get damaged and the resulting neural injury has all sorts of effects.

Figure 1: Demyelination of axons in multiple sclerosis. As an effect of the auto-immune attack of T-cells in the brain, the myelin around axons is removed and axon function is disrupted. [1]

How the demyelination is caused by the immune response is not exactly known. One idea is that the activated lymphocytes enter the CNS and attack oligodendrocytes, the cells that produce and maintain the myelin. Another idea is that cytotoxic (CD8+) and T-helper (CD4+ and Th17) cells directly attack the axons.

Remarkably, 40 percent of patients do not show extreme pathology and instead follow a pattern of relapses and remissions. This suggests some repair of the CNS during these remissions. Frauke Zipp and her group are interested in this repair, and also in better understanding MS in general in order to come up with new therapies.

To study MS they use an EAE (experimental autoimmune encephalomyelitis) mouse model that mimics the human disease. Using cultures of these cells it is possible to follow the immune response of the brain by imaging living cells. This live-cell imaging research led to multiple new concepts.

The first concept is counterbalancing of the inflammatory response. Microglia are the immune cells of the brain and can best be compared with macrophages. They found that microglia are able to catch Th17 cells with their long processes and engulf them by phagocytosis, thereby defending the CNS against the mistaken attack of the immune system. Another concept is the discovery that T-helper-2 (Th2) cells are able to repair the nervous system during periods of remission. They do this by inducing regeneration of axons damaged by the immune system.

Overall, it was a quite difficult seminar to follow, given our limited knowledge about the brain and the many medical terms used by Frauke Zipp. However, it is interesting and promising that her research has led to completely new concepts regarding the understanding of multiple sclerosis. Hopefully more research will result in newer and better therapies to defeat this disease.

[1]: http://medical-dictionary.thefreedictionary.com/demyelination

Towards Single-Molecule Protein Sequencing

Speaker: Chirlmin Joo
Department: Bionanoscience Department TU Delft
Location: TU Delft 
Date: January 12, 2017
Author: Teun Huijben

Chirlmin Joo started his talk with a graph showing the cost of whole-genome sequencing over the years. The cost was once around 100 million dollars, whereas sequencing is nowadays possible for less than 1,000 dollars. This decrease is enormous, but to improve medical care even further, the need for cheaper and easier methods is still high. New techniques like single-molecule fluorescence sequencing and nanopore detection have the potential to fulfill these needs.

It is clear that DNA sequencing is important, but Chirlmin raised the question: why not sequence proteins? Proteins are the working machinery of the cell; they perform all kinds of functions. To name a few: they replicate our DNA, generate ATP, transport all kinds of molecules and take care of muscle contraction. Having a fast and cheap method to sequence proteins would therefore be very interesting for both medical care and fundamental research. The technique used nowadays to sequence proteins is mass spectrometry. However, it is quite difficult to use and a lot of sample is needed for a proper measurement.

Chirlmin and his group are searching for a single-molecule protein sequencing technique. Sequencing proteins instead of DNA brings a problem: where DNA consists of only 4 subunits (the 4 nucleotides), proteins are built of 20 different amino acids. The first idea was to label each amino acid with a different fluorophore, but the spectrum of visible light is too small to distinguish 20 different colors. The brilliant idea they came up with was protein fingerprinting (figure 1), which means that only 2 types of amino acids of the protein are labeled. It was determined computationally that if the order of just these two amino acids (they chose lysine and cysteine) is known, the right protein can be determined from the database with some error. If, however, the relative distances between these amino acids are also known, the right protein can be determined with high precision. This means that in theory protein fingerprinting is able to identify proteins. The next step was to find a method to read out the labeled proteins.

Figure 1: Protein fingerprinting. The idea of protein fingerprinting is labelling only two amino acids, in this case cysteine and lysine. When the order and relative distances of only these two amino acids are known in the unknown protein, comparison with the protein database will result in determining the correct protein. [1]

Chirlmin came up with the idea of linearizing the protein and letting it be pulled through a chaperone protein. A chaperone is a cylindrical protein that grabs the C-terminus of a protein and pulls it through its cavity; in this way the protein is, in a sense, scanned. This idea was combined with FRET (Förster resonance energy transfer). The lysines and cysteines are labeled with different fluorophores (red and green), which have the same excitation wavelength but different emission wavelengths. At the end of the chaperone protein an extra fluorophore is attached, called the donor dye. The light emitted by this donor dye has exactly the wavelength that excites the fluorophores attached to the lysines and cysteines on the protein. So while the protein translocates through the chaperone, the labeled amino acids pass the donor dye and are excited. This results in a series of red and green light flashes, which indicate the number and order of lysines and cysteines in the protein. When this is compared with the database, the right protein can be determined.
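
A minimal sketch of that last database-comparison step, using made-up protein names and sequences, and ignoring the relative-distance information the real method also uses:

```python
# Match an observed lysine/cysteine flash order against a toy reference proteome.
def fingerprint(seq, residues=("K", "C")):
    """Reduce a protein sequence to the ordered pattern of labelled residues."""
    return "".join(aa for aa in seq if aa in residues)

reference = {          # hypothetical proteome
    "protA": "MKTAYICGHKLMC",
    "protB": "MACCKLLKY",
}

def match(observed, proteome):
    """Return all proteins whose K/C fingerprint equals the observed flash order."""
    return [name for name, seq in proteome.items() if fingerprint(seq) == observed]

# A red/green flash series translated into residue order, e.g. K, C, K, C:
print(match("KCKC", reference))   # -> ['protA']
```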

This technique has many advantages: it is relatively simple and only one copy of the protein is needed to sequence it. It therefore has the potential to be optimized into a very useful technique. Chirlmin’s group is now working on adding an extra fluorophore to the system, which will enable the detection of three different amino acids and make the protein identification more reliable.

[1] Yao Yao et al. Single-molecule protein sequencing through fingerprinting: computational assessment. Phys. Biol. 12 (2015).