Induced Pluripotent Stem Cells for treatment of cardiovascular and respiratory diseases

Speaker:             Ulrich Martin

Department:     Cell biology

Location:            Erasmus MC Rotterdam

Date:                    15-5-2017

Author:               Katja Slangewal

Endogenous heart regeneration after a myocardial infarction is far from sufficient in mammals: less than 50% of the cardiomyocytes in a mammalian heart are replaced during the entire life span. These facts immediately show the importance of developing stem cell based regenerative treatments. This is the main research area of Ulrich Martin of the Hannover Medical School, centre for Regenerative Medicine. During his talk, he spoke not only about the use of induced pluripotent stem cells (iPSCs) as a therapeutic for heart repair, but also about the use of patient-specific iPSCs in the context of cystic fibrosis (a severe monogenic disease).

It has become more and more clear that a given type of disease quite often differs a lot from patient to patient, emphasizing the need for patient-specific treatments. Martin and his team work on patient-specific iPSCs for disease modelling and the search for new drugs. The production of iPSCs (figure 1) is becoming more and more efficient: the reprogramming step has become routine, and it is even possible to use large tanks for the production of large numbers of iPSCs. This is in contrast to a few years ago, when the production of iPSCs was tedious, time consuming, labour intensive and yielded low cell numbers. This automation and upscaling of stem cell production was needed for high throughput and industrialization. For the first time, even big pharmaceutical companies are interested in stem cells.


Figure 1: iPSCs are formed by taking somatic cells (for instance adult fibroblasts) and adding reprogramming factors (KLF4, SOX2, c-Myc, Nanog, Oct-3/4 and LIN-28). After culturing the iPSCs, and if desired adding mutations, the iPSCs can be differentiated into various tissues.

Now back to one of the applications Martin and his team focus on: treatment of cystic fibrosis (CF). CF is a severe and quite common disease: 1 in 2,000 newborns is diagnosed with it. The disease is almost always caused by a single mutation in the CFTR gene, F508del-CFTR, which leads to a shortened, dysfunctional protein. Martin and his team want to investigate this mutation in iPSCs.

The workflow of Martin and his team goes as follows. First, they generate CF-specific iPSCs from the peripheral blood of CF patients, using patients whose CF varies in severity. Next, they pick the interesting clones: those in which CFTR is expressed but not functional. Since the known antibodies against CFTR are not reliable, the lab uses a CFTR construct labelled with tdTomato. To screen for functionality, they use eYFP labelling of the cell: eYFP activity is regulated by the iodide concentration, and iodide travels in and out of the cell through the CFTR channel (figure 2). So when a colony of iPSCs is fluorescent for both the red tdTomato and the yellow eYFP, Martin can use it for further research. What is then needed is a protocol to differentiate iPSCs into functional lung cells. Such a protocol was proposed by Kadzik and Morrisey (2012); however, Martin wants to use an upscaled version. At the moment the upscaled procedure is not as efficient as the original variant, but it is still work in progress.


Figure 2: YFP is sensitive to anions such as iodide: its fluorescence responds when iodide enters the cell, which happens when the CFTR channel is functional. Adapted from Vijftigschild, L.A.W., Ent, C.K. van der, Beekman, J.M. (2013) A novel fluorescent sensor for measurement of CFTR function by flow cytometry. Cytometry Part A 83A: 578, fig. 1A.

Besides CF, Martin's research also focuses on iPSC-derived cardiomyocytes as a cellular therapeutic for heart repair. For such therapies to exist, cardiomyocytes need to be produced safely, efficiently and at large scale. Large-scale production is a real must here, since humans lose 1-2 billion cardiomyocytes after a myocardial infarction. Martin and his team are testing such large-scale production. One important finding is the relationship between the state of differentiation and the density at which the cells are located, which stresses the importance of an equal distribution of cell density over the entire tank.

It is now possible to form small amounts of differentiated tissue (a few centimeters in diameter) that beats like heart tissue. The tissue is still far weaker than adult heart tissue; on the bright side, it can already deal with higher forces than the heart tissue of newborns. At the moment, stem cell derived cardiomyocyte tissue is being implanted in monkeys that have suffered a myocardial infarction, which means it is time to prepare for clinical application. The start of 'iPSCs for Clinically Applicable heart Repair', or iCARE, will help in the realization of clinical application.

I thought this seminar was one of the most interesting ones I have attended. It made me realise how incredibly fast the progress in stem cell studies goes. I liked seeing the connection between research and clinic. Martin used many clear examples, which made his talk easy to follow. I also liked the enthusiasm with which he talks about his work, which mainly came out at the end when he got some interesting questions. All in all a good seminar.


The Von Willebrand factor/ADAMTS13 axis in ischemic stroke

Speaker: Simon de Meyer
Department: Department of Thrombosis, KU Leuven, Belgium
Location: Erasmus MC Rotterdam
Date: May 15, 2017
Author: Teun Huijben

Simon de Meyer studied Industrial Engineering and Bio-engineering and obtained his PhD in Leuven with a focus on von Willebrand disease. He then did post-doctoral research for three years at Harvard University, after which he returned to Leuven to take up an associate professorship at the Faculty of Biomedical Science in the department of Thrombosis.

Simon starts his talk by explaining the basic principles of a stroke. During a stroke a small blood clot (thrombus) gets trapped in a small blood vessel, obstructing the blood flow. This obstruction can lead to tissue damage, which is especially dangerous when it happens in the brain or heart muscle. A typical stroke consists of two phases. The first is the acute phase, during which the thrombus is blocking the blood flow. The second is the diffusive phase, in which the thrombus has been resolved but the surrounding tissue still suffers from the period of reduced oxygen supply.

The problem with strokes is that they are unpredictable and there are almost no effective therapies. The best-known therapy is the drug tPA (tissue plasminogen activator), which helps dissolve the thrombus rapidly. The disadvantages of tPA are that it can cause dangerous bleeding elsewhere in the body, that patients can become tPA resistant, and that it can be neurotoxic. Because it has to be given within 4.5 hours after the stroke, only 10% of patients receive the drug in time, and only 50% of those react positively to it. All of this together makes tPA not the best option for treating strokes.

Another therapy is thrombectomy: removal of the thrombus with a mechanical device. Besides being a good alternative to tPA, it also gives us the advantage of collecting fresh human thrombi to study their composition. And this is exactly the part Simon is most interested in: which cells and components is the thrombus made of, and can this explain the different behaviour of thrombi during tPA treatment?

They study the thrombi by making sections, staining them with different dyes and examining them under the microscope. Besides the fact that all thrombi look very different, the percentages of fibrin, red blood cells, neutrophils and platelets also differ enormously between thrombi. It was known that von Willebrand factor (VWF) is very important in wound healing and thrombus formation, because VWF binds both collagen and platelets, providing a scaffold on which a blood clot can form. Using this knowledge they started looking at VWF in the thrombi and found that its amount also differed a lot between thrombi. They also found that the concentration of VWF in a thrombus negatively correlates with its percentage of red blood cells, strongly suggesting that VWF is important in the formation and composition of the thrombus.

It was known that the protein ADAMTS13 cleaves VWF, so the hypothesis of Simon and his colleagues was that ADAMTS13 promotes thrombus resolution in a stroke. To test this they used a mouse model in which thrombi were induced by opening the skull and applying a chemical that activates the blood clotting pathways. When the thrombi had formed and the mouse visibly suffered from a stroke, ADAMTS13 was administered. As expected, the thrombus was resolved, the blood flow increased and the damaging effects were reduced. Even when ADAMTS13 was given more than an hour after the stroke, it still helped to reduce the injuries.

However, when the thrombus is resolved and the blood flow is restored, the problems are not over. It is known that after a stroke the surrounding tissue and vessels continue to sustain damage for some time. Simon and his group showed that VWF knock-out mice experience less damage after a stroke, suggesting that VWF stimulates tissue damage. Here ADAMTS13 treatment has the same effect as in the acute phase: it reduces the injury.

In the future, Simon will try to further quantify the effects of tPA, ADAMTS13 and a nuclease (a promising drug he mentioned in the last part of the talk), hoping to find the perfect cocktail to treat patients suffering from strokes.

All in all, I really enjoyed Simon's talk; he explained all the experiments leading to his conclusions patiently and very clearly. Although this is not directly my field of interest, it was enjoyable and educational.

Short term plasticity and E/I balance combine to control Purkinje cell discharge in the cerebellum

Speaker: Philippe Isope
Subject: Short term plasticity and E/I balance combine to control Purkinje cell discharge in the cerebellum
Location: Erasmus MC
Date: 03-04-2017
Author: Renée van der Winden

Philippe Isope came to talk to us about his work on the workings of the cerebellum. He started by giving us a very brief overview of what the cerebellum does, namely motor coordination. He mentioned two mechanisms that were important for the rest of his talk: the fact that the cerebellum can predict the sensory input caused by a voluntary movement, and the fact that it adapts its feedback systems through plasticity. One of the questions Isope was concerned with was: 'Do different tasks of the cerebellum rely on the same processing mechanism?'
He then went on to talk about Purkinje cells, which provide the sole output of the cerebellar cortex. The firing of these cells can precede movement, which is linked to the predictive function of the cerebellum. The next topic was the different modules in the cerebellum. It turns out the cerebellum is physiologically heterogeneous and is divided into different modules, and the parallel processing this makes possible ensures precision. Moreover, communication between the different modules did not seem to be important. This led to the working hypothesis that the individual modules can be coordinated by parallel fibers. However, this raises the question of how they can be precise if information is spread between them.


Figure 1: A brief overview of the different ways a Purkinje cell is regulated

In order to test this, Isope and his group first identified the modules they wanted to work with. After that, they mapped granule cell inputs onto the Purkinje cells. They found that activity can quite easily tune these maps, so the maps are apparently not genetically determined, which shows that the cerebellum is capable of plasticity. The conclusion was that the E/I (excitation/inhibition) balance is spatially organized and that this leads to precision. In the end, these two findings were put together to show that short term plasticity and the E/I balance work together to control the discharge of the Purkinje cells.
I thought this talk was quite difficult to follow, in part because of the speaker's very thick accent, which made it less enjoyable to listen to. However, I am still curious about neuroscience, so the topic in itself interested me. I am sorry to say, though, that I just did not understand quite enough of it to find it truly interesting.

Steroid hormone receptor profiling in human cancers: from biomarkers to novel therapeutics

  • Speaker:             Wilbert Zwart
  • Department:     Cell biology
  • Subject:              Steroid hormone receptor profiling in human cancers
  • Location:            Erasmus MC Rotterdam
  • Date:                    30-03-2017
  • Author:               Katja Slangewal

Breast and prostate cancer are the most frequently diagnosed cancers in women and men respectively (figure 1), and each is the second deadliest cancer, behind lung cancer. This makes breast and prostate cancer an important research topic.


Figure 1: The most frequently diagnosed cancers in males and females in the UK in 2014. According to Zwart, this graph also holds for the rest of the world.

Approximately 75% of breast cancers and 100% of prostate cancers are related to hormonal defects. The main hormonal players are estrogen and androgen. These hormones activate receptors that become enriched in the nucleus and act as transcription factors, driving the transcription of genes related to cell growth and division (figure 2). The estrogen and androgen receptors have a similar structure.

The relation between hormonal function and cancer is not a new concept; it was already described in 1896. The treatments at that time were a little less subtle than they are nowadays: cases have been described in which the entire ovaries of female patients were removed. This led to a reduction of breast cancer, due to the removal of the estrogen source.


Figure 2: The estrogen receptor pathway. Its stimulation leads eventually to the transcription of multiple genes and the synthesis of proteins important for cell growth and division.

Nowadays more subtle treatments are used. The main treatment consists of tamoxifen, aromatase inhibitors or enzalutamide; these substances block the estrogen receptor or the nuclear import of the androgen receptor. The treatment works, but unfortunately not perfectly. This is where the research of Wilbert Zwart and his colleagues comes into play: they are looking for predictive biomarkers that can tell whether hormonal treatment will have an effect or whether it is better to start immediately with chemotherapy.

Zwart has performed ChIP-seq experiments to identify the DNA binding sites of the estrogen and androgen receptors, which led to the identification of many thousands of binding sites. Only a few of these binding sites were located at a promoter, although most were near a transcription start site. Zwart also discovered the importance of FoxA1: this protein is necessary for the estrogen/androgen receptor to be functional.

A point of consideration relates to the cell lines used for research. All cell lines are quite old, and 90% of the research is done on one single cell line, MCF7, which of course does not capture the complexity and diversity of cancers. So Zwart decided to perform his research on freshly derived tissue from recent patients. He compared the DNA binding sites of the receptors between the new tissue and the MCF7 cell line: most of the binding sites did overlap, but the necessity of FoxA1 seen in MCF7 did not show as clearly in the newly derived tissue. Zwart showed that FoxA1 is mainly present in the primary tumor but is quite often absent in metastases. This is an important observation, because the hormonal treatments only work in the presence of FoxA1. This makes FoxA1 an important biomarker for checking how promising hormonal treatment is.

Next, Zwart went back to the thousands of DNA binding sites he had found. The goal is to identify the sites that matter, since individual binding sites can have a direct causal effect on proliferation. It is therefore important to check them individually by making knock-outs and seeing what happens. The selection of specific binding sites is based on several factors, one example being whether CRISPR-Cas can reach the binding site easily.

Several genes were identified near binding sites of the estrogen/androgen receptors. One of them is FEN1, which is upregulated in breast cancer and influences survival after hormonal therapy. The gene is required for hormone induced gene expression, so it forms another good biomarker for testing whether hormonal treatment will have an effect. In the future, more biomarkers should be identified and also be used in the clinic.

I thought the seminar was quite interesting. I really notice the difference between now and the beginning of last year in how much of a talk I actually understand. I liked the talk mainly because of the content; the presenting could be a bit better. I often had trouble listening, mainly because Wilbert often turned around and talked to the slides instead of to the audience, which decreased the volume a lot. He also had a lot of images per slide, which made them quite small and hard to read. So at times the talk was hard to follow, but the main message was interesting. I had, for instance, no idea that most breast cancer research is based on one specific cell line. So I learned some new things.

Information processing in neural and gene regulatory networks

Speaker: Gašper Tkačik
Subject: Information processing in neural and gene regulatory networks
Location: TU Delft
Date: 22-03-2017
Author: Renée van der Winden

Gašper Tkačik came to talk to us about his research on information processing in biological networks. His main goal is to predict what biological networks do from first principles and to quantify their function in this sense. He first gave us a brief introduction to Shannon's information theory, which quantifies and optimizes information transmission. He also posed the question: 'How can we recover the input at the end of a process?'. To illustrate his points, Tkačik explained two examples to us.

The first example was about the retina and how it encodes information. Through measuring the information flow into and out of the retina, it was predicted what modification the neurons make to the incoming light: namely, that they perform center-surround filtering. After this prediction was made, it was confirmed by measurements, so in this case they succeeded in predicting the function of a network from first principles. Continuing with the retina, a different experiment examined the pattern of neurons firing while a movie was shown. Through measurements, the scientists found a probability distribution for these patterns. Looking at this distribution, they found that the neural output is actually not decorrelated, as was previously thought: each pair of neurons is weakly correlated. Moreover, they succeeded in decoding which movie had been shown by looking at the output information provided by the retina.
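The idea of weak pairwise correlations can be illustrated with a small simulation (a sketch with made-up numbers, not the actual retinal data): binary spike patterns that share a rare common drive look almost independent neuron by neuron, yet the average pairwise correlation is consistently positive.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for binarized retinal output: each row is one neuron,
# each column one time bin (real data would come from a multi-electrode array).
n_neurons, n_bins = 10, 5000
shared = rng.random(n_bins) < 0.002                    # rare common drive
spikes = (rng.random((n_neurons, n_bins)) < 0.05) | shared

# Pairwise Pearson correlations between all neurons
corr = np.corrcoef(spikes)

# Average over the distinct pairs: small but positive, i.e. the population
# is weakly correlated rather than fully decorrelated.
pairs = corr[np.triu_indices(n_neurons, k=1)]
print(round(float(pairs.mean()), 3))
```

The mean pairwise correlation comes out small but clearly above zero, which is the signature the experiment found in the retinal population.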


Figure 1: A shortened overview of how the movie was decoded from the retinal code

The second example was about how morphogen gradients convey information in early development. The question posed was: 'How much information is stored in the pattern?'. It turned out that the answer is approximately 2 bits per gene, while four genes together store 4.3 bits of information. By finding these numbers, Tkačik formalized the established concept of positional information.
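Counting positional information in bits can be sketched with a toy discretized gene (a hypothetical example, not Tkačik's actual gap-gene analysis): the mutual information between position and expression state of a gene whose on/off boundary perfectly splits the embryo in half is exactly 1 bit, and a noisy readout carries less.

```python
import numpy as np

def mutual_information(joint):
    """I(X;Y) in bits from a joint probability table p(x, y)."""
    joint = joint / joint.sum()
    px = joint.sum(axis=1, keepdims=True)
    py = joint.sum(axis=0, keepdims=True)
    nz = joint > 0
    return float((joint[nz] * np.log2(joint[nz] / (px @ py)[nz])).sum())

n_pos = 8  # 8 equally likely positions along the embryo axis

# A perfectly sharp half-embryo expression boundary: 1 bit of positional info.
sharp = np.zeros((n_pos, 2))
for x in range(n_pos):
    sharp[x, int(x >= n_pos // 2)] = 1 / n_pos

# The same boundary read out with 10% misclassification noise: less than 1 bit.
noisy = np.zeros((n_pos, 2))
for x in range(n_pos):
    on = int(x >= n_pos // 2)
    noisy[x, on], noisy[x, 1 - on] = 0.9 / n_pos, 0.1 / n_pos

print(mutual_information(sharp))            # 1.0
print(round(mutual_information(noisy), 3))  # 0.531
```

The real analysis works with continuous expression profiles rather than on/off states, but the bookkeeping is the same: entropy of position minus the uncertainty left after reading the expression levels.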

The idea of quantifying what happens in biological networks is very interesting to me. I am interested in how organisms work, but I also really like the certainty that mathematics and physics give you. This is a way to combine the two. The talk was relatively easy to understand, which is always nice. It was also the first seminar in which I recognized concepts that I have learned during my own courses.

Real-time observation of translation of single mRNA molecules in live cells.

Speaker: Marvin Tanenbaum
Department: Department of Bionanoscience
Subject: Real-time observation of translation of single mRNA molecules in live cells.
Location: Delft University of Technology

Date: 24 March 2017
Author: Romano van Genderen

The research of professor Tanenbaum is about the kinetics of translation. He started by talking about how many genes, about 20%, oscillate in expression during the cell cycle, setting off processes of division and differentiation. For this to happen, very strong gene regulation is necessary. This happens on many levels: not only at the scale of transcription, but also at that of translation. Translational regulation has been shown to be the most important process in setting how much of a certain protein is produced; this regulation happens through miRNAs and RNA binding proteins.

In order to study this regulation more carefully, one needs to see RNA translation in action. A relatively old method for visualizing RNA was developed by the Singer lab: they built a series of hairpins into the RNA. These hairpins are a binding site for a protein called MCP, which carries a GFP tag to allow for visualization.

Showing the translation product sounds like something that should not be too hard: just let the ribosome translate the RNA for GFP and it should be visible. One problem with this method, however, is that the free-floating GFP creates a lot of background noise. Another problem is that GFP needs time to mature, which takes longer than translation itself. So they used a free-floating GFP-antibody complex that binds the protein while it is being synthesized. This SunTag system they developed is very bright and allows for very good visualization.


Fig 1. An overview of the SunTag system, shown here being used to look at proteins that are being synthesized (Tanenbaum et al., A Protein-Tagging System for Signal Amplification in Gene Expression and Fluorescence Imaging, Cell 159).

Now the two aforementioned approaches can be combined. First, fuse a series of small peptides to your protein of interest and let the green antibody complexes bind these peptides. At the same time, attach mCherry to PP7, a coat protein similar to MCP, and let it bind its RNA hairpins, which sit in a non-coding region. Wherever you see yellow, active translation is going on. When PP7 is injected into the cytoplasm, it can be used to follow a piece of mRNA from its export into the cytoplasm until its eventual degradation.

Using this technique a few experiments were done.

Firstly, they were able to count the number of ribosomes on a single piece of mRNA, showing that the average number was around 20.

Next, they measured the translation speed. If translation were very slow, the ribosome signal would dim slowly after adding a drug that inhibits translation initiation; if translation were very fast, the signal would be lost almost immediately.
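The logic of such a run-off measurement can be sketched with a toy simulation (all numbers here are hypothetical, not values from the talk): once initiation is blocked, the ribosomes already on the transcript keep elongating until they reach the end, so the time it takes the fluorescent signal to vanish reveals the elongation rate.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical mRNA and elongation rate for the sketch.
length_codons = 1200           # ORF length in codons
rate = 3.0                     # elongation rate in codons per second
positions = rng.uniform(0, length_codons, size=20)  # ~20 ribosomes per mRNA

def signal(t):
    """Ribosomes still on the mRNA t seconds after initiation is blocked."""
    return int(np.sum(positions + rate * t < length_codons))

# The slowest possible run-off is a ribosome starting at codon 0, so after
# length / rate seconds the spot must be dark. Observing when the signal
# vanishes therefore gives the elongation rate: rate = length / t_dark.
t_dark = length_codons / rate
print(signal(0), signal(t_dark))  # 20 0
```

In the real experiment the decay of fluorescence over many mRNAs is fitted, but the principle is the same: decay time scales with transcript length divided by elongation rate.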

Other research was done on the regulation of translation. They found that specific RNA-cutting proteins only work when the RNA polymerase collides with them, "bumping them off". The cutting step of this protein happens automatically, but a collision with the polymerase is needed to release the protein and therefore the cut strand.

I really enjoyed the technical overview of the versatile SunTag procedure and its applications. I expect even more findings to come from this method, especially if it can be expanded to work on DNA as well, which would be a good improvement. But I continue to doubt whether understanding the kinetics of translation has any practical applications.

Light Sheet Microscopy

  • Speaker:             Malte Wachsmuth
  • Department:     Cell biology 
  • Location:            Erasmus MC
  • Date:                    23-02-2017
  • Author:               Katja Slangewal

Light-sheet microscopy is a microscopy technique optimized by Luxendo, a start-up company that emerged from EMBL. The technique is special because it enables live cell imaging for extended periods of time with a resolution close to confocal resolution. The main idea behind light-sheet microscopy is the uncoupling of the illumination and detection light paths: the illuminating beam illuminates a thin sheet of the sample (2-8 μm) that lies in the focal plane of the detection lens. This enables full-area detection with a single light sheet (figure 1, right), an advantage over confocal microscopes, which use point-wise raster scanning and therefore illuminate a rather large part of the sample over time (figure 1, left). Light-sheet microscopy thus reduces the amount of light needed to image a sample, which reduces photobleaching. Also, by imaging with a single light sheet it becomes possible to capture the sample in one single shot. This makes light-sheet imaging ideal for imaging dynamic processes in live specimens.

One disadvantage, or possible problem, is the intensity drop of the illuminating beam across the sample. How much the intensity drops depends on the sample and the labeling, so it may not be a problem in most cases. There is also a way to reduce it: with dual illumination (illuminating simultaneously from the front and the back of the sample), the intensity drop can be reduced.
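The intensity drop and the benefit of dual illumination can be illustrated with a toy Beer-Lambert attenuation model (the sample size and attenuation length below are assumed for the sketch; in practice they depend on the sample and labeling):

```python
import numpy as np

z = np.linspace(0, 200, 201)   # depth across a 200 um sample, in um
att = 100.0                    # assumed attenuation length, in um

single = np.exp(-z / att)                                   # one-sided sheet
dual = 0.5 * (np.exp(-z / att) + np.exp(-(200 - z) / att))  # front + back

# Ratio of brightest to dimmest point across the sheet: dual-sided
# illumination is far more uniform than single-sided.
print(round(float(single.max() / single.min()), 2),
      round(float(dual.max() / dual.min()), 2))
```

With these assumed numbers, single-sided illumination varies by roughly a factor of 7 across the sample, while dual illumination varies only by about 1.5, which is why illuminating from both sides evens out the image.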


Figure 1: Light-sheet microscopy (right) versus confocal microscopy (left). Light-sheet microscopy drastically reduces the illumination of the sample, thereby decreasing photobleaching.

The MuVi-SPIM (multiview selective-plane illumination microscope) is one of the microscopes built on light-sheet microscopy. It is based on four objective lenses: two are used for illumination of the sample and two for detection of the fluorescent signal. The four lenses give four different views of the sample, which can be fused into one optimized image (figure 2). This way, larger living specimens can be imaged without rotating them, which reduces the acquisition time.


Figure 2: The MuVi-SPIM: by using four objective lenses and fusing the four separate images, an optimal image can be formed, reducing noise and increasing the signal.

Besides the MuVi-SPIM there are other possible geometries that use the principles of light-sheet microscopy. One example is the InVi-SPIM (inverted-view selective-plane illumination microscope). This geometry uses only two objective lenses, one for illumination and one for detection (figure 3). The InVi-SPIM has been developed for long-term 3D imaging of living specimens. It has an inverted microscope configuration, which makes it easier to access the sample chamber; according to Wachsmuth, a specimen can stay alive for approximately two days in the chamber. The InVi-SPIM is, for instance, very useful for stem cell differentiation assays, because of the lower amount of photobleaching compared to a confocal microscope at similar resolution. A small disadvantage is the inability to use dual illumination (illuminating the sample from the front and back simultaneously).


Figure 3: The InVi-SPIM enables imaging of living specimens for longer periods of time.

Light-sheet microscopy also makes it possible to image two different parts of a sample at the same time: one could for instance image both the heart of a zebrafish and the blood flow in its tail. This is not possible with a confocal microscope, since a confocal microscope needs time to image by point-wise raster scanning. Together with the lower amount of photobleaching and the higher imaging speed, this makes light-sheet microscopy a promising technique.

This seminar was different from the others I have visited so far, mainly because Malte Wachsmuth was representing a company instead of a research group. This gave the talk a bit of the appearance of a sales pitch, but it was still very interesting. I hadn't heard about light-sheet microscopy before, and it sounds like an interesting technique. Malte was a very enthusiastic speaker and gave a very clear talk.


Information processing in the neural and gene regulatory networks

Aïsha Mientjes


Seminar 4:

Speaker:  Gasper Tkacik             

Department: Bionanoscience

Subject: Information processing in the neural and gene regulatory networks

Location: TU Delft          

Date: 22-03       

Dr. Tkacik divided his lecture into several parts. The first part dealt with history: he explained a little bit about Shannon's information theory. Shannon stated that communication has three parts: the source, the channel and the receiver. Dr. Tkacik then explained a little bit about mutual information, a measure of the ability to send and recover signals through a noisy channel. Shannon's theory provides a framework for understanding biological processing.
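Shannon's source-channel-receiver picture can be made concrete with a minimal example (my own illustration, not from the talk): for a binary symmetric channel with a fair 0/1 source, the mutual information between input and output is 1 - H2(p), where p is the probability that the channel flips a bit.

```python
from math import log2

def h2(p):
    """Binary entropy H2(p) in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * log2(p) - (1 - p) * log2(1 - p)

def bsc_mutual_information(flip_prob):
    """I(X;Y) for a binary symmetric channel with a uniform 0/1 source.
    The output is then also uniform, so H(Y) = 1 bit, and the channel
    noise costs H(Y|X) = H2(p) bits: I = 1 - H2(p)."""
    return 1.0 - h2(flip_prob)

print(bsc_mutual_information(0.0))            # noiseless: 1.0 bit recovered
print(bsc_mutual_information(0.5))            # pure noise: 0.0 bits
print(round(bsc_mutual_information(0.1), 3))  # 10% flips: 0.531 bits
```

The same bookkeeping (how many bits of the source survive the noise) is what gets applied to neurons and gene regulatory networks in the rest of the lecture.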

Part two dealt with the retina as a coding device, going beyond single neurons to neural populations. Dr. Tkacik showed us a movie of fish during which he could show the response of neurons. The brain receives a binary signal, but there are still many questions relating to these signals. Dr. Tkacik concerns himself with whether the pattern can be converted back into the initial movie. He can actually do this, and there is a pretty good correspondence.

Part 3 of the lecture was about positional information. In many cellular processes, cell specification is guided by positional information. In this lecture, some questions were asked about this:

  • How much is needed?
  • Are some patterns better than others?
  • How much information do patterns give?
  • How do you read the positional code?

This image shows the French flag model, an important model in positional information.

Information theory can be used to quantify many biological processes, such as positional information.

The final part of the lecture dealt with perspectives for the future.

  1. New data allows us to observe networks in action.
  2. Quantitative measurements can be done for networks.
  3. Quantifying can help with efficient coding.
  4. Computation will play a big role.
  5. Evolutionary dynamics can be studied.

This concluded the lecture.

This was the second seminar I attended at TU Delft. I found this topic slightly easier to follow than the previous TU Delft seminar, mainly because I had some knowledge of the topic from evolutionary developmental biology. I found it very interesting to see that many biological processes can be quantified, which will make studying them a lot easier. The lecturer was very passionate and could tell us a lot.

All in all, I found it a very interesting and educational seminar.

Quantum optomechanics – exploring mechanical motion in the quantum regime

Aïsha Mientjes


Seminar 3:

Speaker:  Markus Aspelmeyer  

Department: Nanoscience/Physics

Subject: Quantum Optomechanics – exploring mechanical motion in the quantum regime.

Location: TU Delft          

Date: 02-03       

Dr. Aspelmeyer started the lecture by explaining what quantum optomechanics is: the combination of optomechanics and quantum optics. In quantum optics, fluctuations interfere with the measurements; this is referred to as the standard quantum limit. Quantum optomechanics is the full quantum toolbox to prepare and control mechanical quantum states via photonic quantum states.

He went on to list several quantum states:

  1. Quantum ground state of motion
  2. Quantum states of motion
  3. Non-Gaussian quantum states of motion
  4. Quantum entanglement

The next part of the lecture dealt with applications of quantum optomechanics, some examples are:

  • On chip quantum information processing
  • Quantum hybrid devices
  • New coating techniques




This image shows a quantum nanodevice, one of the applications of quantum optomechanics.


The precision of many quantum measurements is limited by the coating that is used. Poorly performing coatings are the most important limitation in our measurements of space and time, because they introduce noise such as Brownian motion and thermal fluctuations.
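The scale of such thermal fluctuations can be estimated from the equipartition theorem: a mechanical mode of stiffness k at temperature T has an rms displacement of sqrt(kB·T/k). A rough back-of-the-envelope sketch with illustrative numbers of my own choosing (not from the lecture):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def thermal_rms_displacement(stiffness, temperature):
    """rms thermal displacement of a harmonic mode via equipartition:
    (1/2) k <x^2> = (1/2) kB T  =>  x_rms = sqrt(kB T / k)."""
    return math.sqrt(K_B * temperature / stiffness)

# Example: a 10 N/m cantilever-like mode at room temperature (300 K).
x_room = thermal_rms_displacement(10.0, 300.0)  # on the order of 1e-11 m
# Cooling to 10 mK suppresses the motion by sqrt(300 / 0.01), roughly 170x.
x_cold = thermal_rms_displacement(10.0, 0.010)
```

Even this crude estimate shows why thermal motion of the coating, tens of picometres at room temperature, can dominate displacement measurements that aim for much higher precision.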

The team of Dr. Aspelmeyer mainly concerns itself with two questions:

  1. How small can a source mass be?
  2. How massive can a quantum system be?

This concluded his lecture.

This was the first seminar I attended at the TU Delft. The previous two had both been at the Erasmus MC, and therefore this one had a very different character. It dealt much more with the applications of physics and mathematics and was not focussed on human health. Despite this seminar being different from the past two I attended, I enjoyed it very much. The topic is quite abstract and difficult to grasp, but Dr. Aspelmeyer explained everything very well. He was very enthusiastic about the work currently being done in this field, which made the seminar very enjoyable to listen to.

I found it very interesting to learn how something as simple as a coating can have such major effects on the measurements being done. It amazed me how something so simple can be so limiting. All in all, I found it a very interesting and enjoyable seminar.

Evolution and Assembly of Eukaryotic Chromatin

Speaker: Francesca Mattiroli

Department: Luger Lab, University of Colorado Boulder

Subject: Evolution and Assembly of Eukaryotic Chromatin

Location: TU Delft, Bionanoscience department

Date: 10-02-2017

Author: Mirte Golverdingen


Francesca Mattiroli’s research is focussed on the DNA packaging units called nucleosomes. These structures organize DNA in the eukaryotic cell nucleus. Nucleosomes are formed by an octameric complex of folded histone dimers, the H3-H4 and H2A-H2B dimers. In mammals, the histones have tails that carry many post-translational modifications and that stabilize the nucleosome. Nucleosomes need to assemble and disassemble on the genomic DNA. Histone modifications and variants are dynamic and can promote or inhibit certain interactions. Nucleosome dynamics and composition have a direct effect on transcription, replication and repair.

The first main interest of Mattiroli is the evolutionary origin of the nucleosome. Nucleosomes are very well conserved across species. Mattiroli focusses on the structural conservation of the histone dimers in Archaea. Archaeal histones, however, lack the tails that carry post-translational modifications. So how do these species organize their archaeal genome?

The archaeal histone binding to DNA is similar to eukaryotic histone binding. Archaeal histones, however, do not form octamers; they can instead form a much longer structure, called a nucleosomal ramp. This structure also forms in vivo; the longest ramp they found was 90 bp long. So they found a new way of arranging histone-DNA complexes.

Histones are deposited on the DNA in two steps: first, two H3-H4 dimers form a tetrasome; then two H2A-H2B dimers attach to this tetrasome, forming a nucleosome. Histone chaperones shield the charges of the histones and facilitate their deposition on DNA. However, not much is known about how the chaperones actually contribute to this deposition step. Chromatin Assembly Factor 1 (CAF-1) is Mattiroli’s main interest. CAF-1 mediates this histone deposition step and is essential in multicellular organisms. Mattiroli tried to understand how CAF-1 contributes to the deposition step.

Mattiroli’s first step was to investigate how CAF-1 binds the H3-H4 dimer. She used hydrogen-deuterium exchange mass spectrometry (HX-MS), which allowed her to measure the change in mass and to identify which regions show the largest changes in deuterium uptake; such a region could be the binding site of CAF-1 on H3-H4. When CAF-1 binds the dimer, a stabilization of the dimer is seen. This result suggests the following hypothesis: only when an H3-H4 dimer is bound to CAF-1 can it form a tetrasome.

The next step for Mattiroli was to test whether CAF-1 can form nucleosomes in vitro, in the absence of other proteins. To test this, she mixed CAF-1, histones and DNA, treated the mixture with micrococcal nuclease to digest unprotected DNA, and then purified and quantified the length of DNA covered by histones. The result showed that CAF-1 is able to assemble tetrasomes, thereby enabling nucleosome formation in vitro.

So, how is the H3-H4 tetrasome formed on the DNA? Mattiroli used DNA of increasing lengths to trap any intermediates in the process. She showed that formation of the H3-H4 dimer activates the DNA binding of the dimer. The key intermediate that mediates this DNA binding turned out to involve two CAF-1 units. This was the most interesting result so far, because it had never been shown before that two independent CAF-1 complexes were involved in H3-H4 DNA binding.

This interesting and clear seminar showed once again how complex the system of DNA and all its interacting molecules is. Mattiroli’s research provides a good foundation for further research into nucleosomes and their interaction with DNA, bringing us closer to fully understanding the biological system of DNA.


Figure 1. Canonical and variant nucleosomes

(A) Elements of the histone fold and structures of Xenopus laevis H2A–H2B, H3–H4 and (H3–H4)2 (PDB ID: 1KX5). (B) Structure of the canonical Xenopus laevis nucleosome (PDB ID: 1KX5). Other nucleosome structures, such as the human nucleosome, are structurally similar. (C) Structure of the CenH3CENP‐A‐containing nucleosome (PDB ID: 3AN2). (D) Zoomed view of the αN helix of CenH3CENP‐A (left) and H3 (right) involved in stabilizing the DNA ends. Histone H3 is blue, CenH3CENP‐A is cyan, H4 is green, H2A is yellow, H2B is red, and DNA is white.

Adapted from: Mattiroli, F., D’Arcy, S., & Luger, K. (2015). The right place at the right time: chaperoning core histone variants. EMBO Reports, e201540840.